Automation, machine learning and artificial intelligence are already dramatically changing the way we live and interact with all our data, but all of these technologies will ultimately be dependent on advances in edge computing to realize their full potential.
Whether it’s virtual AI assistants like Alexa or Siri, smart watches that monitor heart rates, or printers that alert companies when they need maintenance, the proliferation of smart, connected devices puts increasing pressure on wireless networks to deliver real-time information. Latency, or lag, defeats the purpose of many of these IoT gizmos, and resolving this “last mile” roadblock is one of the many reasons edge computing has become a priority for so many industries.
In June, Hewlett Packard Enterprise announced it would invest more than $4 billion over the next four years to develop intelligent edge technologies and services. The company said this commitment will be focused on “helping customers turn all of their data, from every edge to any cloud, into intelligence that will drive seamless interactions between people and things … and employ AI and machine learning to continuously adapt to changes in real time.”
Gartner predicts that by 2022, 75 percent of enterprise-generated data will be created and processed outside the traditional, centralized data center or cloud – up from just 10 percent in 2018. This dramatic uptick illustrates why the technology is so important and why it informs the decision-making of so many device vendors.
Simply put, placing the requisite edge computing processors as close to consumers as possible will be perhaps the biggest competitive differentiator in automation, AI and machine learning.
“Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements,” Santhosh Rao, a senior research director at Gartner, said in a comprehensive edge computing report. “As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing.”
Speed is king when it comes to the internet of things. Having the ability to process data along the edges of a network – particularly when those devices are in motion – means the information doesn’t have to travel back to the cloud or the on-premises data center. And while network speeds continue to improve at an impressive rate, the difference to end users is substantial and often critical to the success or failure of their business.
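The round-trip savings can be sketched with a back-of-the-envelope latency budget. The numbers below are illustrative assumptions, not measurements – a WAN hop to a distant cloud region versus a hop to an on-site edge node – but they show why the difference matters for time-sensitive devices:

```python
# Hypothetical latency budget for acting on one sensor reading.
# All figures are illustrative assumptions, not benchmarks.

CLOUD_ROUND_TRIP_MS = 80.0  # network hop to a distant cloud region and back
EDGE_ROUND_TRIP_MS = 2.0    # network hop to an on-site edge node and back
INFERENCE_MS = 5.0          # the processing itself costs the same either way

def response_time_ms(round_trip_ms: float, inference_ms: float = INFERENCE_MS) -> float:
    """Total time from sensor event to actionable result."""
    return round_trip_ms + inference_ms

print(f"cloud: {response_time_ms(CLOUD_ROUND_TRIP_MS)} ms")  # 85.0 ms
print(f"edge:  {response_time_ms(EDGE_ROUND_TRIP_MS)} ms")   # 7.0 ms
```

For a moving vehicle or a medical monitor, that order-of-magnitude gap is the difference between reacting in time and reacting too late.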
Obviously, putting this kind of processing horsepower so close to where we live and work (or even on our bodies) and connecting these devices to the internet opens the door to all kinds of security issues. We’ve all heard tales of casino databases being hacked through a connected thermometer in the lobby fish tank, or printers being compromised to deliver mountains of spam for various nefarious purposes.
It’s a trade-off that most of us have begrudgingly accepted. For convenience and immediacy, the general consensus is that we’re all willing to go along with the program so long as the companies providing these devices and networks make an earnest effort to secure our most personal information. It obviously hasn’t always worked out that way, but as edge computing technologies mature, there are actually some reasons for optimism on the security front.
Security experts say that because traditional cloud architectures are inherently centralized, they’re exposed to distributed denial of service (DDoS) attacks and power outages that have plagued networks for years.
Because edge computing distributes processing, storage and running applications across a wide range of devices and data centers, it limits the damage an attack can do to the overall data pipeline. Security teams can also more easily section off portions of this data flow with security protocols that ensure only a limited amount of data is “vulnerable” at any given time. The distributed nature of edge computing means there’s much less chance of a catastrophic exposure of the entirety of the information gathered, analyzed and distributed.
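That blast-radius argument can be made concrete with a toy sketch. The function names and the shard-by-numeric-id scheme below are hypothetical simplifications (a real deployment would hash a stable device identifier), but they show how spreading a stream across nodes caps what any single breach exposes:

```python
from collections import defaultdict

def shard_readings(readings, num_nodes):
    """Partition (sensor_id, value) readings across edge nodes.

    A breach of one node exposes only that node's slice of the stream.
    Sharding on a numeric id here for simplicity; a real deployment
    would hash a stable device identifier instead.
    """
    nodes = defaultdict(list)
    for sensor_id, value in readings:
        nodes[sensor_id % num_nodes].append((sensor_id, value))
    return dict(nodes)

def worst_case_exposure(nodes, total_readings):
    """Fraction of the stream exposed if the largest single node is breached."""
    return max(len(batch) for batch in nodes.values()) / total_readings

readings = [(i, 20.0 + i) for i in range(10)]  # ten sensors, one reading each
nodes = shard_readings(readings, num_nodes=3)
print(worst_case_exposure(nodes, len(readings)))  # 0.4 -- node 0 holds ids 0, 3, 6, 9
```

With everything funneled to one central store, that exposure fraction would be 1.0; distribution caps it at the size of the largest slice.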
What’s even more exciting for service providers is how edge computing technologies will make it possible to offer new products and services faster and more cheaply, without having to overhaul their entire data center or network infrastructure.
Armed with immediate feedback and data analytics generated by these smart devices, companies can adjust their offerings on the fly – delivering enhanced or modified versions of their services tailored to exactly what end users need, without investing more time or money building new infrastructure for each specific use case.
Gartner’s Rao points out that these more complex edge computing solutions will eventually act as gateways – particularly as 5G technology moves to the fore. The ability, for example, to aggregate local data from traffic signals, GPS devices and other vehicles in near real time has the potential to revolutionize safety and navigation worldwide.
“Servers deployed in 5G cellular base stations will host applications and cache content for local subscribers, without having to send traffic through a congested backbone network,” he said.
Just as some people are starting to really understand what cloud computing means and how it works, we’re now heading into the so-called fog computing era where edge computing devices and technologies will be the A-list stars.
Fog computing basically describes the nebulous space between the cloud and the information created at the source – a company, a device or an individual – and the question of where that data is stored and what can be analyzed locally without the time-consuming round trip to the data center.
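That local-versus-cloud decision can be sketched as a simple triage step at the fog layer. The thresholds and function name below are illustrative assumptions: routine readings are handled (and discarded or summarized) locally, and only the ones worth the trip are forwarded upstream:

```python
def triage(readings, low=10.0, high=30.0):
    """Split sensor readings into (handled locally, forwarded to the cloud).

    Readings inside the normal band never leave the edge; out-of-band
    readings are escalated to the data center for deeper analysis.
    The band limits here are arbitrary illustrative values.
    """
    local, forward = [], []
    for reading in readings:
        (local if low <= reading <= high else forward).append(reading)
    return local, forward

readings = [21.5, 22.0, 98.6, 21.8, -5.0]
local, forward = triage(readings)
print(f"{len(local)} handled at the edge, {len(forward)} sent upstream")
```

Even this crude filter cuts the upstream traffic to the two anomalous readings – the essence of what a fog layer does at far greater scale and sophistication.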
According to IDC, IoT spending will eclipse the $1 trillion threshold by 2020, with most of that investment dedicated to hardware, sensors, modules, infrastructure and the connected devices hosting these edge processors.
As with any other emerging technology, there’s going to be an adjustment period as the hype subsides and the market matures and sorts itself out. But there’s nothing gray or foggy about how edge computing is going to reimagine data collection, management and transmission in the not-too-distant future. It’s already moved to center stage.
Ames is senior analyst for BPO Media, which publishes The Imaging Channel and Workflow magazines. As a market analyst and industry consultant, Ames has worked for prominent consulting firms including KPMG and has more than 10 years’ experience in the imaging industry covering technology and business sectors. Ames has lived and worked in the United States, Southeast Asia and Europe and enjoys being a part of a global industry and community.