In recent years there's been a huge push to move everything to the cloud. In the process, we've found that for some applications a 100% cloud architecture doesn't always make sense. Recent technological developments such as autonomous cars, drones, IoT sensors and blockchain data distribution networks have shown us that not everything belongs on the cloud.
Guest blog by Mark Patrick, Mouser Electronics.
Indeed, for applications with strong security or privacy requirements, or where responsiveness is key, moving away from the cloud may be the answer.
Cloud computing can't do it all
For years, enterprises have discovered that offloading their data centre operations to the cloud can save incredible amounts of money, in terms of both Capex and Opex, while also providing high reliability and availability. With the advent of GMail, Google Docs and Google Drive, as well as services like Dropbox, cloud computing has also made huge inroads into the consumer space.
Today each one of us interacts on a daily basis with cloud-centric architecture, whether it's streaming a movie from Netflix onto our tablet, using Google Translate on our mobile phones, or simply checking our email. Our world now is more cloud-connected than ever.
At the same time, we've discovered that for all the benefits of the cloud, there are also many drawbacks. By centralising our data, cloud computing leverages efficiencies in the data centre; however, that same centralisation raises privacy concerns. Cloud computing puts massive computing power at our fingertips, but we have to wait for it as the data makes its way across the Internet. Having access to our email from anywhere in the world is convenient, yet at the same time it raises the risk of our sensitive data being compromised. We're slowly discovering that cloud computing is great, just not for everything.
The foggy IoT
The Internet of Things (IoT) was supposed to drive us towards a more cloud-centric world. Ironically, it has also done the opposite - by showing us the inherent limitations of cloud computing.
Autonomous driving is a prime example of the restrictions of cloud technology. While autonomous cars rely on cloud connections for software updates and machine learning improvements, the hundreds of slight steering, braking and acceleration adjustments they make every minute must be decided locally. The ultra-low latency needed means there is simply no time for data to be sent to the cloud and analysed.
IoT sensor deployments are another situation where computing at the network edge makes a lot of sense. Businesses, governments and other organisations may want to leverage IoT sensors to streamline their processes, and also use cloud services for centralised data storage and analytics. However, they may not want to expose sensitive organisational data to third-party cloud providers. Hackers, unscrupulous cloud service providers and other risks mean that these kinds of organisations may want to reduce the amount of sensitive information they send to the cloud.
Edge computing, where processing power is shifted to a network edge device such as a gateway, is perfect for these kinds of scenarios. Instead of connecting through a dumb router, IoT devices can send their data to a smart gateway server, which performs basic data processing and strips out sensitive information before sending it to the cloud.
This way, the organisation is able to reap the benefits of cloud computing without revealing any more information than it absolutely needs to. In addition, the server can perform some simple analytics locally. This could be useful for applications which require low latency, like machine control and automation. Besides stripping away sensitive information, the gateway could also reduce the volume of data being sent, or compress it, so that the cloud connection is used more efficiently and network latency is reduced.
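As a minimal sketch of this idea, the gateway logic described above might look something like the following Python snippet. The field names and the choice of zlib compression are illustrative assumptions, not a reference to any particular gateway product:

```python
import json
import zlib

# Hypothetical example: fields this gateway treats as sensitive
# and keeps on-premises rather than forwarding to the cloud.
SENSITIVE_FIELDS = {"serial_number", "location", "operator_id"}

def prepare_for_cloud(reading: dict) -> bytes:
    """Strip sensitive fields from a sensor reading, then compress
    what remains before it leaves the local network."""
    sanitised = {k: v for k, v in reading.items() if k not in SENSITIVE_FIELDS}
    return zlib.compress(json.dumps(sanitised).encode("utf-8"))

reading = {
    "sensor_id": "line-3-temp",
    "temperature_c": 71.4,
    "serial_number": "SN-009142",   # stays on-premises
    "location": "Plant 2, Bay 4",   # stays on-premises
}

payload = prepare_for_cloud(reading)

# What the cloud provider would actually receive after decompression:
restored = json.loads(zlib.decompress(payload))
```

Here the cloud side only ever sees the sanitised, compressed payload; the sensitive identifiers never leave the edge device.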
Blockchain will drive the edge
Besides the IoT, one of the biggest drivers of edge computing in the near term may well turn out to be blockchain technology. Pioneered by the digital currency Bitcoin, blockchain refers to a distributed, decentralised database where copies of data are stored across all nodes in a network.
The name blockchain comes from the original Bitcoin architecture. In Bitcoin's case, each transaction on the network is recorded in a shared ledger, grouped into timestamped 'blocks' of transactions. A block currently holds up to 1MB of transactions. Besides the current block's transactions, each block also contains a cryptographic hash of the previous block, thus forming a recursive chain.
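The chaining of blocks via hashes can be sketched in a few lines of Python. This is a toy illustration of the structure only; real Bitcoin blocks also carry a Merkle root of the transactions, a nonce and a difficulty target:

```python
import hashlib
import json

def make_block(transactions, prev_hash, timestamp):
    """A toy block: a timestamp, some transactions, and the hash of the
    previous block. The block's own hash covers all three fields."""
    block = {
        "timestamp": timestamp,
        "transactions": transactions,
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode("utf-8")
    ).hexdigest()
    return block

# The first ("genesis") block has no predecessor, so prev_hash is all zeros.
genesis = make_block(["coinbase: reward to alice"], "0" * 64, timestamp=1)
block1 = make_block(["alice pays bob"], genesis["hash"], timestamp=2)

# Because block1 stores genesis's hash, altering any historical
# transaction changes that hash and visibly breaks the chain.
```

This recursive linking is what makes tampering evident: changing an old block invalidates every block that follows it.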
The distributed nature of blockchain technology, along with its recursive recording of previous transactions, makes it extremely effective at authenticating transactions. Before Bitcoin and blockchain technology, authentication came from central, trusted authorities. Bitcoin's distributed ledger allows for decentralised, trusted transactions to happen.
Besides decentralising trust, blockchain technology also decentralises computing. Unlike the traditional model of a trusted central authority, blockchain networks rely on peer-to-peer computing nodes. These nodes listen in on the network, validating transactions and assembling them into blocks. This peer-to-peer architecture is the polar opposite of centralised cloud computing, so as blockchain technology is used for more and more applications, our computing power will shift increasingly to the network edge.
That doesn't mean that cloud computing will go away. Blockchain's distributed computing model is great for authentication and proving trust, but its massive redundancy makes it very inefficient at doing more complex tasks (such as running regular software applications or doing data mining). Therefore, computing systems that leverage blockchain technology in the future will probably have a mix of both cloud computing and blockchain elements for authentication at the network edge.
Beyond the Cloud
Cloud computing has changed the face of modern-day computing, but it must be recognised that its centralised computing model is not ideal for every application. In the future, as autonomous cars, IoT deployments and blockchain technology enter the mainstream, our computing systems will evolve into a mix of cloud and edge computing.