Regardless of where we live in the modern world, technology is, more often than not, reshaping our daily lives. The tech industry continues to thrive on emerging trends and rapid advancements, changing the way we interact, educate ourselves, and conduct business.
With the advent of cloud computing – the on-demand availability of computer system resources – organizations have been able to increase their network capacity, computing power, and ability to communicate. Moreover, the massive deployment of IoT (Internet of Things) devices and the ongoing rollout of 5G wireless technology have created the need for "Edge Computing", in which data processing occurs partly at the network edge rather than entirely in the cloud. To understand what this means, we must first define the term.
According to Mahadev Satyanarayanan, "Edge Computing is a new paradigm in which substantial computing and storage resources are placed at the Internet's edge in close proximity to mobile devices and sensors." Simply put, edge computing brings the cloud to you. With this distributed computing paradigm, information is computed and processed at or near the source of the data, instead of relying on cloud data centers kilometers away from the device.
Essentially, edge computing addresses concerns such as mobile devices' limited battery life, bandwidth costs, latency, security, and privacy.
On the privacy and security front, a perfect example of edge computing is the encryption and storage of biometric information – fingerprints or facial scans – directly on a smartphone. By doing so, tech companies have offloaded security concerns from the centralized cloud to individual devices. Without this technology, authenticating ourselves would mean waiting for the device to ask permission from cloud servers – a time-consuming and frustrating process. Moreover, an authentication exchange over the cloud would not only be subject to latency but could also be exposed to man-in-the-middle attacks.
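The architectural point can be sketched in a few lines of Python. This is a deliberately simplified, hypothetical example – real biometric matching uses fuzzy template comparison inside secure hardware, not exact hashes – but it illustrates why on-device verification avoids both the latency and the man-in-the-middle risk: the secret never crosses the network.

```python
import hashlib
import hmac
import os

def enroll(template: bytes) -> tuple[bytes, bytes]:
    """Store only a salted hash of the (simplified) biometric template
    on the device; the raw template is never sent to the cloud."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", template, salt, 100_000)
    return salt, digest

def authenticate(template: bytes, salt: bytes, digest: bytes) -> bool:
    """Verify locally: no network round trip, nothing to intercept."""
    candidate = hashlib.pbkdf2_hmac("sha256", template, salt, 100_000)
    # compare_digest is a constant-time comparison, resistant to timing attacks
    return hmac.compare_digest(candidate, digest)

salt, stored = enroll(b"fingerprint-scan-bytes")
print(authenticate(b"fingerprint-scan-bytes", salt, stored))  # True
print(authenticate(b"someone-else", salt, stored))            # False
```

Because both enrollment and verification run on the device, the comparison completes in milliseconds and works even without an internet connection.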
Latency is the time it takes for data or a request to travel from source to destination. Edge computing offers various techniques to mitigate both real and perceived delays between an action taken on a device and the response from the connected server. Typically, such a request would travel to cloud servers and take a noticeable amount of time before a response came back; because edge computing processes information locally, it significantly outperforms the traditional cloud-based approach. The proximity of analytical resources to end-users also allows AI tools to increase operational efficiency. In the case of facial recognition on smartphones, processing a facial scan and comparing it to data in the cloud would take more time and depend on an internet connection, whereas with edge computing the algorithm runs locally.

Automation systems, self-driving cars, virtual reality, and smart homes are all applications that require rapid processing and responses, which has compelled tech companies to invest in developing edge computing. Self-driving cars, for instance, are essentially data centers on wheels. With edge computing, they do not have to stream data from their numerous sensors to the cloud and wait for a response; the vast amounts of information they collect can be processed in real time. "The notion of real-time becomes a very important ingredient to edge computing, given the massive amount of real-world information being fed to network systems." – Peter Levine, investor & partner at Silicon Valley firm Andreessen Horowitz.
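A back-of-the-envelope model makes the latency argument concrete. All the numbers below are illustrative assumptions, not measurements: the point is that eliminating the network round trip dominates the total, even if the edge node processes the request slightly more slowly than a powerful cloud server would.

```python
def end_to_end_latency_ms(network_rtt_ms: float, processing_ms: float) -> float:
    """Total latency = network round trip + server-side processing time."""
    return network_rtt_ms + processing_ms

# Illustrative figures: a cloud data center hundreds of kilometers away
# versus an on-device / nearby edge node with essentially no network hop.
cloud = end_to_end_latency_ms(network_rtt_ms=80.0, processing_ms=20.0)
edge = end_to_end_latency_ms(network_rtt_ms=0.0, processing_ms=25.0)

print(f"cloud: {cloud:.0f} ms, edge: {edge:.0f} ms")  # cloud: 100 ms, edge: 25 ms
```

For a self-driving car reacting to an obstacle, that difference between tens of milliseconds and a tenth of a second is exactly why the processing must happen on board.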
The Internet of Things is steadily driving the virtual world as we know it into an incredible new dimension. We could once reasonably assume that the number of computers in the world would be bounded by the number of people on the planet. Now, with the variety of devices – from toasters and coffee machines to smartwatches and self-driving cars – that can connect to the Internet, the amount of information to manage in real time has grown exponentially. Combined with machine learning and data, edge computing could provide the future of agile processing and management of information and software.
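One common pattern behind that "agile processing" can be sketched briefly. In this hypothetical example, an edge node filters a stream of sensor readings locally and forwards only the anomalous ones upstream, cutting both the bandwidth consumed and the real-time load on the cloud; the function name and threshold are assumptions for illustration.

```python
from statistics import mean

def edge_filter(readings: list[float], threshold: float = 2.0) -> list[float]:
    """Forward only readings that deviate strongly from the local average;
    routine values are handled (and discarded) at the edge."""
    avg = mean(readings)
    return [r for r in readings if abs(r - avg) > threshold]

# A batch of temperature samples with one anomalous spike.
samples = [20.1, 20.3, 19.9, 27.5, 20.2, 20.0]
print(edge_filter(samples))  # [27.5] — only the spike is sent upstream
```

Instead of shipping every reading from every toaster and smartwatch to a data center, the network carries only what genuinely needs central attention.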