
The use of edge networks to serve latency-sensitive use cases is growing. IoT. Streaming video. Cloud gaming. All of them require servers as close to the end user as possible to reduce round-trip time and mitigate latency.
But how do you ensure that those servers are operating as efficiently as possible? What if, by optimizing the server itself to maximize throughput and performance, you could reduce your physical server needs by 50% and improve end-user satisfaction?
Most businesses would jump at the chance but don't know how to optimize their servers. The solution? Engaging a cloud partner who understands your use cases and is willing to build balanced, optimized servers for your unique needs.
In many edge-based use cases, such as streaming video and gaming, just a few seconds of latency can destroy the end-user experience. But latency can come from a variety of places within the delivery chain. The table below covers a few of the more common causes and how they might be mitigated.
But hardware selection and configuration aren't the only ways to optimize server performance. In many cases, the software employed to address the specific requirements of a use case can be tuned as well (in the test environment, NGINX acted as a reverse proxy for CDN services).
This tuning must be done application by application, and it extends to the operating system itself. For example, tuning the TCP stack affects how each user session is handled and shaped, as well as how the server responds to requests.
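To make that concrete, the sketch below shows how a handful of Linux TCP parameters might be set on a reverse-proxy cache node like the NGINX setup described above. The sysctl names are real Linux settings, but the values are placeholders rather than recommendations; the right numbers depend on the workload, NIC, and kernel version.

    #!/usr/bin/env python3
    """Sketch: apply a handful of TCP settings on a Linux edge cache node.

    The sysctl names are real Linux parameters; the values are placeholders.
    """
    from pathlib import Path

    # Hypothetical starting point for a reverse-proxy cache node.
    TCP_TUNING = {
        "net/core/somaxconn": "4096",               # deeper accept queue for connection bursts
        "net/ipv4/tcp_congestion_control": "bbr",   # latency-oriented congestion control, if the kernel offers it
        "net/ipv4/tcp_rmem": "4096 131072 6291456", # min/default/max receive buffer (bytes)
        "net/ipv4/tcp_wmem": "4096 131072 6291456", # min/default/max send buffer (bytes)
        "net/ipv4/tcp_slow_start_after_idle": "0",  # keep the congestion window warm between bursts
    }

    def apply_tcp_tuning(settings: dict, dry_run: bool = True) -> None:
        """Write each value under /proc/sys; requires root when dry_run is False."""
        for key, value in settings.items():
            path = Path("/proc/sys") / key
            if dry_run:
                print(f"would set {path} = {value}")
            else:
                path.write_text(value)

    if __name__ == "__main__":
        apply_tcp_tuning(TCP_TUNING, dry_run=True)

With dry_run left on, the script only reports what it would change, which makes it easy to review a proposed tuning profile before it touches a production cache.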
Memory, disk, bus, CPU, and the OS/applications all work together to balance an edge server’s Input/Output (I/O). The problem is there’s no “one-size-fits-all” approach.
Imagine an application that let the user select the server's purpose (such as a streaming video edge cache node) and optimized everything accordingly. No such shortcut exists today, and that's the fundamental issue: each server must be built around its use case, tailored to the specific demands it places on memory, storage access, throughput, and performance.
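The idea can be approximated, though, with use-case profiles worked out in advance. The sketch below is purely illustrative: the profile names, knob names, and values are hypothetical placeholders for the concrete sysctls, block-device settings, and application directives a real build would validate in a lab.

    #!/usr/bin/env python3
    """Sketch of a profile-driven tuner: pick the server's purpose, get back
    a coherent set of tuning targets. Names and values are hypothetical."""

    PROFILES = {
        # Large sequential reads and many long-lived concurrent sessions.
        "video_edge_cache": {
            "tcp_accept_backlog": 8192,
            "disk_read_ahead_kb": 1024,   # favour big sequential reads from the cache volume
            "worker_connections": 65536,  # reverse-proxy concurrency target
        },
        # Small, jitter-sensitive packets; raw throughput matters less.
        "cloud_gaming": {
            "tcp_accept_backlog": 1024,
            "disk_read_ahead_kb": 128,    # mostly random access, keep read-ahead small
            "worker_connections": 16384,
        },
    }

    def plan(profile_name: str) -> dict:
        """Return the tuning targets for a profile without applying anything."""
        if profile_name not in PROFILES:
            raise SystemExit(f"unknown profile: {profile_name!r}")
        return PROFILES[profile_name]

    if __name__ == "__main__":
        for name in PROFILES:
            print(name, "->", plan(name))

The point of the exercise is that the two profiles disagree on nearly every knob, which is exactly why a single stock build cannot serve both workloads well.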
CDNs need high-performing edge servers to mitigate latency when replying to device requests. But an imbalanced server does more than just add milliseconds to the round-trip time.
Long-term operation in an imbalanced state forces a server to work harder to achieve the same results as an optimized server, and that extra work means the hardware wears out faster. For that reason, fixing imbalanced I/O not only improves the overall performance of the edge caches but also prolongs the life of the hardware.
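Detecting that kind of imbalance early is largely a matter of watching where CPU time goes. The sketch below, which assumes a Linux host, samples /proc/stat twice and compares time spent doing useful work against time spent waiting on storage; the thresholds are arbitrary placeholders, and a production health check would also watch per-disk utilization, NIC drops, and memory pressure.

    #!/usr/bin/env python3
    """Sketch: a coarse check for an I/O-imbalanced Linux node.

    Samples /proc/stat twice and compares CPU time spent on useful work
    (user + system) with time spent waiting on storage (iowait).
    Thresholds are placeholders.
    """
    import time

    def cpu_times() -> list:
        """Return the aggregate CPU counters from the first line of /proc/stat."""
        with open("/proc/stat") as f:
            fields = f.readline().split()
        return [int(x) for x in fields[1:]]

    def sample(interval: float = 2.0) -> dict:
        before = cpu_times()
        time.sleep(interval)
        after = cpu_times()
        delta = [a2 - a1 for a1, a2 in zip(before, after)]
        total = sum(delta) or 1
        # Field order in /proc/stat: user nice system idle iowait irq softirq ...
        return {
            "busy": (delta[0] + delta[1] + delta[2]) / total,
            "idle": delta[3] / total,
            "iowait": delta[4] / total,
        }

    if __name__ == "__main__":
        s = sample()
        print(f"busy={s['busy']:.0%} idle={s['idle']:.0%} iowait={s['iowait']:.0%}")
        if s["iowait"] > 0.20 and s["busy"] < 0.50:   # placeholder thresholds
            print("CPUs are mostly waiting on storage: likely I/O imbalance")
        elif s["idle"] < 0.05:
            print("CPUs are saturated: compute, not I/O, is the bottleneck")
        else:
            print("no obvious imbalance in this sample")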
There is no silver bullet for I/O imbalance or optimization. Each use case, such as streaming video or cloud gaming, requires a specific understanding of how it will employ server resources and how it must adapt to the expectations of end users. If ultra-low latency is part of the user experience, then the server's CPU, memory, disk, bus architecture, and software all need to be balanced against that requirement.
Trying to solve edge server performance and throughput problems by building bigger boxes (more CPUs, more memory) or by swapping in faster NICs, storage devices, and memory doesn't address the underlying imbalance. Yes, the server may perform marginally better in certain use cases, but each of those components needs to be balanced against the resource demands actually being placed on the server. And once the hardware is balanced, the software needs to be optimized as well, ensuring requests for server resources are handled efficiently through the CPU and bus architecture.
Every Intequus edge solution is custom-engineered to your use case and lab-tested to ensure optimal performance. By designing for balanced I/O, Intequus engineering improves the end-user experience by reducing round-trip time. Optimized edge servers also have longer lifespans, reducing the long-term capital costs associated with server maintenance. Finally, by distributing data within the edge you can help mitigate downtime and improve overall resiliency, especially against cyberattacks.
Intequus is a leading provider of configurable compute and storage solutions, offering full-service edge hardware programs optimized for your application. Its expert engineering, high-quality manufacturing, experienced program management, and personalized technical support give CDNs the tools they need to be leaders in their field.