This is the third entry in a four-part Data Center Frontier special report series that explores edge computing from a data center perspective. This post covers the potential and benefits of true edge flexibility and how to tackle the challenges involved.
Download the full report.
Many organizations are now looking for better ways to deliver rich content to geographically distributed users. We're also seeing more companies push applications, desktops, and other services out to rural locations. A big challenge here revolves around performance and user experience. After all, just because an application can be delivered doesn't mean it's being delivered efficiently.
And this isn't limited to rural locations. Edge can live in urban environments where network resources are constrained or slow (i.e., where the number of peering hops is too high). For example, traffic from Boston may need to peer through New York City to reach a major interconnection point.
Either way, this was the ultimate challenge when it came to cloud computing. As more data and services hit traditional cloud systems, real inefficiencies emerged in trying to stream and work with all of that data from a centralized cloud ecosystem. Simply put, organizations needed a better way to process this data.
As more users connect to the cloud and request content that is heavy in size, utilizing the edge for fast delivery makes complete sense. Gartner recently stated that emerging technologies require revolutionizing the enabling foundations that provide the volume of data needed, advanced compute power, and ubiquity-enabling ecosystems. The shift from compartmentalized technical infrastructure to ecosystem-enabling platforms is laying the foundations for entirely new business models that form the bridge between humans and technology.
“When we view these themes together, we can see how the human-centric enabling technologies within transparently immersive experiences — such as smart workspace, connected home, augmented reality, virtual reality and the growing brain-computer interface — are becoming the edge technologies that are pulling the other trends along the Hype Cycle,” said Mike J. Walker, research director at Gartner.
Edge flexibility, design, and overcoming challenges
The entire concept of edge is to serve users and services based on proximity. So edge computing offers tremendous benefits in terms of how you deploy solutions and manage data.
Today, you can deliver modular edge data center infrastructure solutions which provide standardized deployment options. This gives you the flexibility and capability to meet the demands of compute today and beyond.
From the customer's perspective, edge computing can be any service or architecture that helps you simplify and localize the delivery of applications, data sets, and services. These services help you gain more control over your WAN, your bandwidth requirements, and how rich content is delivered. The future looks to be far more interconnected, with much greater user distribution. And with the influx of new data, edge will be even more important.
This means that edge design is flexible and specifically caters to high-performance or latency-sensitive applications. The really cool part is that you can control how data flows throughout your entire edge ecosystem, secure the processing of that data, and still positively impact the user experience.
That said, deploying edge can have its challenges. Remember, edge solutions aren't just 'another data center site.' They're smaller, use-case specific, and designed as dense environments that help you process more services and user data. With that in mind, here are three challenges to be aware of when working with edge design:
Understanding and managing the latency budget
To an end user, latency is the reason that downloading a movie "takes so long." To a content provider, the milliseconds it takes to complete a function are measured in customer dissatisfaction and cost. And to a business, latency can also mean lost revenue or a lost competitive edge.
Even at the speed of light, the round trip from a central data center (a facility located in a Tier I market, for example) means accumulating transmission costs. A study conducted by ACG Research estimated that caching content locally in a metro population can save approximately $110 million over a five-year period. Applying the same logic to a company running an IIoT parts-tracking application, the hard costs of transmission could be measured, but the associated cost of degraded application performance would be incalculable.
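To see why the latency budget matters even before queuing and processing delays enter the picture, consider a back-of-the-envelope sketch of round-trip propagation delay over fiber. The distances and the fiber slowdown factor below are illustrative assumptions, not figures from the ACG study:

```python
# Back-of-the-envelope latency budget: propagation delay alone,
# ignoring queuing, serialization, and peering-hop overhead.

SPEED_OF_LIGHT_KM_S = 299_792  # in a vacuum
FIBER_FACTOR = 0.67            # light travels roughly 2/3 as fast in fiber

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay over fiber, in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# Assumed example distances (fiber routes are longer than straight lines):
print(f"Boston to NYC (~300 km): {round_trip_ms(300):.1f} ms round trip")
print(f"User 2,500 km from a Tier I facility: {round_trip_ms(2500):.1f} ms round trip")
```

Those few milliseconds are the physical floor; every peering hop, congested link, and server-side operation only adds to them, which is exactly the budget that moving content closer to users protects.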
It’s important to have your data reside closer to your users as well as the applications or workloads which are being accessed.
Consider supply chains in physical systems, like Walmart or Amazon Prime Same Day Delivery, as an example. In any supply chain, as traffic increases, transportation costs go up. As a result, distribution moves closer to end users to decrease transport costs and increase throughput. The same concept applies to edge and data delivery.
Edge computing will take data everywhere, including the floor of the ocean, as is the case for Microsoft’s Project Natick deployment in Scotland. (Photo by Scott Eklund/Red Box Pictures for Microsoft).
With the increase of traffic moving through the edge comes greater demand for more bandwidth and lower latency. As discussed earlier, it's important to have your data reside closer to your users and to the applications or workloads being accessed. Where data demands may not have fluctuated much in the past, current demands are very different.
- Bandwidth Bursts. Many providers now offer something known as bandwidth bursts specifically for edge solutions. This allows the administrator to temporarily increase the amount of bandwidth available to the environment based on immediate demand, which is useful for seasonal or highly cyclical industries. There will be periods of business operation when more bandwidth is required to deliver the data. In those cases, look for partners who can dynamically increase that amount and then de-provision those resources when they are no longer being used.
- Network Testing. Always test your own network and the network of the edge provider. Examine their internal speeds and see how your data will behave on that network. This also means looking at the various ISP and connectivity providers offered by the colocation provider. Many times, a poor networking infrastructure won't be able to handle a large organization's 'Big Data' needs despite a fast Internet connection. Without good QoS and ISP segmentation, some edge data centers can actually become saturated. Look for partners with good, established connections providing guaranteed speeds.
- Know Your Applications. One of the best ways to gauge edge data requirements is to know and understand the underlying application or workload. Deployment best practices dictate that there must be a clear understanding of how an application functions, the resources it requires and how well it operates on a given platform. By designing the needs around the application, there is less chance that improper resources are assigned to that workload.
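One simple way to start the network testing described above is to time TCP handshakes against a candidate provider's endpoint and summarize the results. This is a minimal sketch; the hostname is a placeholder, and a real evaluation would also measure throughput, jitter, and behavior under load:

```python
import socket
import time

def tcp_connect_ms(host: str, port: int = 443, timeout: float = 3.0) -> float:
    """Time a single TCP handshake to host:port, in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass  # connection closes on exit; we only measure the handshake
    return (time.perf_counter() - start) * 1000

def sample_latency(host: str, port: int = 443, samples: int = 5) -> dict:
    """Take several handshake samples and summarize min/avg/max in ms."""
    times = [tcp_connect_ms(host, port) for _ in range(samples)]
    return {"min": min(times), "avg": sum(times) / len(times), "max": max(times)}

# 'edge.example.com' is a placeholder for a candidate provider endpoint.
# print(sample_latency("edge.example.com"))
```

Running a probe like this from each of your user locations against each candidate site gives you real numbers to compare rather than relying on advertised speeds alone.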
There are a lot of benefits and use-cases around edge and connected systems. Take the time to think about your own strategies and whether your current infrastructure is capable of supporting these initiatives.
This Data Center Frontier series, focused on edge computing, will also cover the following topics over the coming weeks:
- New Ways to Deploy Edge Capacity for Data Center Leaders
- Understanding the Edge and the World of ‘Connected Devices’
- Deploying Real-World Edge Solutions: A Lego-Inspired Design Approach
Download the full Data Center Frontier Special Report on Edge Computing, courtesy of BASELAYER.
Source: Data Center Frontier