By Chris Larkins
Hyper-converged infrastructure (HCI) is helping organisations in every sector provision applications at the network edge more effectively and efficiently. This, in turn, helps them better process the massive volumes of data generated by a wide range of Internet of Things (IoT) devices and smart sensors.
This has been particularly beneficial during the upheaval of the past few years, when entire workforces were sent home to work, accompanied by a rapid acceleration to the cloud and the uptake of video conferencing solutions. That shift drove an enormous increase in the amount of traffic generated and consumed at the network edge, which in turn triggered dramatic changes in network architecture.
During this period, with the majority of an organisation's data being created, and ultimately used, outside on-premises data centres and cloud environments, the cost and network capacity needed to consolidate, analyse, and redistribute that data grew increasingly prohibitive. This drove corporates to look for new ways to store and process their data closer to the source, and eventually closer to the users of their data-intensive applications.
Even in the aftermath of the pandemic, the world of work is a hybrid one, and the proliferation of IoT devices, sensors, remote workers, and mobile users is forcing organisations to rethink how they provision their applications and process data. For many enterprises, this skyrocketing volume of data being generated will fuel a shift towards moving data processing and analysis out of central facilities and cloud locations to localised systems that are specifically built for this purpose.
Many of these localised systems could consist of IoT sensors embedded in appliances paired with a local system on a chip, but for larger workloads this won't be enough. What is needed in such cases are stand-alone compute and storage systems, which is where the new generation of hyper-converged, edge-optimised infrastructure solutions comes into play.
The operational efficiency of any computing architecture can easily be undermined by the expense and resources needed to backhaul an ever-increasing flood of data. Adding several edge locations can therefore dramatically reduce an organisation's network and computing bottlenecks, because hyper-converged systems offer a way to process IoT data locally and deliver streaming content right at the edge of the network, while keeping service and application performance consistent and seamless.
Reduced latency is thus one of the greatest benefits of the edge, as it removes the 'traffic jams' that enormous quantities of data streaming to the cloud can create. The return time from the cloud to the entity using the information is also shortened, freeing the cloud-based platform for more crucial tasks such as analytics. In much the same way that edge computing reduces latency, it also reduces bandwidth consumption: when more data is processed, analysed, and stored locally, less data is sent to the cloud. Reducing this data flow lowers the cost to the user, as less bandwidth means less expense.
Data storage security in the cloud has also improved significantly over the last few years and will only continue to do so. Edge computing adds to this: with less data centralised in cloud storage, exposure is reduced. By processing and storing some data on an edge network, an organisation avoids putting "all its eggs in one basket". The edge filters out redundant, extraneous, and unneeded data, so only the company's most vital data is sent to the cloud.
Moreover, as edge devices become more commonplace, they are increasingly available to cloud-based platforms. And as original equipment manufacturers (OEMs) start to add edge capabilities to their solutions, systems become far easier to scale. This also enables local networks to remain operational even when upstream or downstream nodes are not.
It is for these and other reasons that edge computing has been on the minds of many IT practitioners, and hyper-convergence is coming to the fore as the game changer that will meet the technology requirements making edge computing possible.