With technology advancing as fast as it is, the world is a very different place than it was a decade ago. These days, the internet isn’t considered a luxury but a necessity, with billions of people relying on it in their daily lives. Forbes even predicted that more than 80 billion devices will be connected to the internet by 2025, reportedly generating about 180 trillion GB of data.
Currently, much of the data generated on the internet is handled by clouds, which provide computing power and storage over the internet. The cloud has many applications, from cloud computing in healthcare to manufacturing and beyond. The problem is that it is already showing its flaws, with delays growing alongside the number of connected devices, which is only set to increase.
Seconds matter when it comes to connections; there’s a reason latency is measured in milliseconds. Current technologies like gaming and virtual reality try to avoid any sort of delay in their operations. There, lag tends to cause only discomfort, but delays can be far more dangerous elsewhere: for autonomous vehicles, a 10 millisecond delay isn’t just bad, it can be outright lethal.
The limitations of cloud provision can’t be allowed to cause such issues in life-or-death situations. Even if worldwide networks could transmit data at the speed of light, a server would still need to be within roughly 93 miles of the user for the round trip alone to stay under one millisecond.
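The 93-mile figure can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes signals propagate at the speed of light in a vacuum (roughly 186 miles per millisecond); real fibre is about a third slower, so practical distances are even tighter.

```python
# Rough propagation-delay check for the ~93-mile claim.
# Assumption: signals travel at vacuum light speed (best case).

SPEED_OF_LIGHT_MILES_PER_MS = 186.282  # ~186,282 miles per second

def round_trip_latency_ms(distance_miles: float) -> float:
    """Best-case round-trip propagation delay for a request/response."""
    return 2 * distance_miles / SPEED_OF_LIGHT_MILES_PER_MS

print(round(round_trip_latency_ms(93), 2))    # ~1.0 ms: the article's limit
print(round(round_trip_latency_ms(1000), 2))  # ~10.74 ms: already dangerous for a car
```

Note this counts propagation alone; queueing, routing and processing on top of it make the real budget for a distant data centre even smaller.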
One possible solution being explored is edge technology, which processes data at the ‘edge’ of networks, geographically closer to users, such as on a home router or a mobile base station. Although still in its infancy, some are saying it might be the next step after cloud computing in healthcare and across technology at large.
Major companies like Cisco, Dell and Arm have already invested significantly in the technology, which is seen as valuable for its ability to minimise delays. Cost-effective use would require the edge to do much of the data pre-processing before passing anything along to the cloud. There have already been some proof-of-concept projects, and plenty of potential is being seen in areas such as online gaming, cloud computing in healthcare and military operations.
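The pre-processing split described above can be sketched in a few lines: an edge node condenses a batch of raw readings into a small summary and forwards only that to the cloud. This is a minimal illustration, not any vendor’s actual API; the function and field names are invented for the example.

```python
# Hypothetical edge pre-processing: aggregate raw sensor readings
# locally so only a compact summary travels to the cloud.

from statistics import mean

def summarise_at_edge(readings: list[float]) -> dict:
    """Reduce a batch of raw readings to a four-field summary payload."""
    return {
        "count": len(readings),
        "mean": mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# e.g. one second of temperature samples from a home sensor
raw = [21.3, 21.5, 22.0, 21.8, 21.4]
payload = summarise_at_edge(raw)
print(payload)  # five readings reduced to one small record for upload
```

The design choice is the point: bandwidth to the cloud scales with the summary size, not with how fast the sensors sample.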
However, this technology wouldn’t be a complete replacement for cloud technology, but rather a complementary addition. The cloud offers global processing and storage capabilities, while the edge, should it become commonplace, would handle time-sensitive processing close to the user.
The massive storage and easy scaling that cloud technology provides are irreplaceable, but the edge would make the cloud’s higher-latency processing obsolete, at least for real-time needs. The edge would essentially act as an extension of the cloud, delivering consistent processing power throughout the network.
Another point worth noting is that edge devices are not directly linked to one another through the cloud; should one edge node be compromised, the rest of the network stays safe, avoiding mass breaches. Using home router servers also leaves no data footprint outside the home, which helps prevent breaches. More mobile public edge devices will have far more traffic passing through them, so their security implications are still under study.