Edge computing: buzzword or breakthrough?

Edge computing is a buzzword that’s been skulking around for a while now, waiting for its time in the spotlight. It’s one to keep an eye on, because it has all the qualities a tech trend needs to suddenly be on everyone’s lips. It’s sleek, sexy and dynamic. The headlines even write themselves: “Are you living on the edge?”, “Embrace the true cutting edge.”, “Get the edge on the competition.” You get the idea.

But what is it, anyway? Like its buzzword predecessors, cloud and IoT, it comes down to how companies collect and manage ever-increasing volumes of data.

Give it to me straight (edge)

In layperson’s terms, the current situation is this: you have an internet-enabled thing. You interact with it, and that data whistles away to a centralised cloud to be processed. For example, you shout at Alexa to play ‘Don’t Stop Me Now’. Alexa dutifully listens and sends what it hears to Amazon’s servers, where the words are parsed, the request is understood, and a command to play Queen comes back.

In essence, all the magic happens behind the scenes. Alexa itself doesn’t understand a thing – it’s a box with a microphone and a speaker, and that’s all. End-user devices like these don’t have a mind of their own (phew). Not yet, anyway.

This is what edge computing is positioned to change. It’s exactly what it sounds like – computing that takes place at the ‘edge’ of the network, on or near the device itself, rather than being routed back to a central location. In other words, the Alexa of the future will have the capacity to understand and action your requests all on its lonesome.
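For the technically curious, here’s a toy Python sketch of the difference. Everything in it is illustrative – the function names, the 100 ms round trip and the canned response are our assumptions, not how Alexa actually works:

import time

def parse_intent(audio_clip: bytes) -> str:
    # Hypothetical stand-in for speech-to-text plus intent recognition.
    return "play 'Don't Stop Me Now'"

def handle_request_cloud(audio_clip: bytes) -> str:
    # Today's model: ship the raw audio to a central cloud and wait.
    time.sleep(0.1)  # stand-in for a ~100 ms network round trip (illustrative)
    return parse_intent(audio_clip)  # the understanding happens in the data centre

def handle_request_edge(audio_clip: bytes) -> str:
    # The edge model: an on-device model parses the request locally. No network hop.
    return parse_intent(audio_clip)

print(handle_request_edge(b"raw microphone audio"))

Same request, same answer – the only thing that moves is where the thinking happens.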

How will it change things?

The primary advantages of edge computing are speed and cost. Distributing processing to the edge lowers the bandwidth burden and the costs that come with routing everything through a central cloud. Well, strictly speaking it shifts that cost into local hardware and infrastructure, but from the perspective of the tech manufacturers who will be pushing edge computing, that’s a saving on their end.

In terms of speed, it may not matter much in many areas that a device takes a second to “think” about a request by interfacing with its cloud; in others, the extremely low latency that comes with edge computing is a distinct advantage. Self-driving cars are an obvious example, where split-second decisions can’t afford to be shuttled a few hundred miles there and back. The trading world is another, where every millisecond of data counts.
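A quick back-of-the-envelope calculation shows why. The figures below are rough assumptions (300 miles one way to a hypothetical data centre, light travelling through fibre at about two-thirds of its vacuum speed), but even this best case eats several milliseconds before any processing starts:

FIBRE_SPEED_KM_PER_S = 200_000   # light in optical fibre, roughly 2/3 of c
distance_km = 480                # ~300 miles, one way, to a hypothetical data centre

round_trip_ms = (2 * distance_km / FIBRE_SPEED_KM_PER_S) * 1000
print(f"Best-case network round trip: {round_trip_ms:.1f} ms")  # 4.8 ms

And that’s physics alone. Real networks add routing, queuing and server time on top, so tens of milliseconds per request is more typical – a delay that processing on the device avoids entirely.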

Why will it matter to me?

It’s not just another cute gimmick for Tesla to show off, either. While it will be baby steps at first, edge computing is set to disrupt pretty much every industry. Currently, it’s projected to maintain 30% year-on-year growth, increasing five-fold in value to $10 billion by 2026.[1] And with the technologies that enable it, like 5G, set to become more advanced and widespread, there’s little reason to expect that growth to tail off.

From telecoms to manufacturing, pushing processing out to the edge has significant implications, enabling things like optimised real-time monitoring and automation. From a marketing perspective, too, devices on the edge will harvest far more in-depth customer data, allowing more personalised and timely messaging and effectively supplanting traditional forms of market research. Gartner predicts that 75% of all enterprise-generated data will be sourced from the edge by 2025, up from only 10% in 2018.[2] That’s one hell of a swing.

As tech trends go, this is one to watch. We’ll be on the edge of our seats.

Can we help you with marketing, branding or strategy? Or do you just want to chat? Either way, get in touch!