With IT shifting towards self-sufficient systems, containers and Cloud computing are becoming increasingly attractive to businesses, offering major benefits over virtual machines such as lower costs and ease of movement between infrastructures. Containers are simple, portable, and, being self-contained, particularly useful for testing and integrating new applications.

With DockerCon 2018 just around the corner (12-15 June in San Francisco), we’re taking a look at the innovations driving containers and Cloud computing.


Serverless Computing

Application development is often hampered by cross-platform performance issues and by dependence on the system’s underlying infrastructure, such as limited storage and server availability. Working in the Cloud outsources the hosting of servers to a provider (Infrastructure-as-a-Service, or IaaS).

Of course, there are still servers propping up the system, so “serverless” isn’t an entirely accurate name, but they are provisioned by a third party and so require no space or upkeep from the developer. Most providers, such as market giant AWS (Amazon Web Services) Lambda, charge by the compute time actually consumed (in Lambda’s case, billed in 100-millisecond increments) and in exchange they maintain the network. Scalability is therefore handled more effectively: users pay only for the traffic they actually receive, meaning lower costs for smaller businesses.
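The effect of this billing model can be sketched by rounding execution time up to the provider’s granularity; the price used below is a placeholder for illustration, not a real AWS rate:

```python
import math

def billed_increments(duration_ms: float) -> int:
    """Round execution time up to 100 ms increments, as per-100ms billing does."""
    return math.ceil(duration_ms / 100)

def billed_cost(duration_ms: float, price_per_100ms: float) -> float:
    """Price each 100 ms increment; price_per_100ms is a placeholder figure."""
    return billed_increments(duration_ms) * price_per_100ms

# A 230 ms invocation is billed as three 100 ms increments.
print(billed_increments(230))  # 3
```

The upshot is that an idle function costs nothing at all, which is where the savings for smaller businesses come from.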

Serverless code can be run as a PaaS (Platform-as-a-Service) but, more typically, as seen with Amazon, Microsoft, and Google, as FaaS (Functions-as-a-Service), where a piece of code executes in response to an event. These functions are held within containers until a request triggers them and are decommissioned once execution is complete.
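The FaaS pattern boils down to a stateless handler that receives an event payload and returns a response; the sketch below is generic, and the event shape (a `name` key) is purely illustrative rather than any one provider’s signature:

```python
def handler(event: dict) -> dict:
    """A stateless function: receives an event, returns a response, keeps no state.

    The 'name' key and the response shape here are illustrative only.
    """
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# The platform spins up a container, invokes the handler, then tears it down.
print(handler({"name": "DockerCon"}))  # {'statusCode': 200, 'body': 'Hello, DockerCon!'}
```

Because the handler holds no state between invocations, the provider is free to run zero, one, or a thousand copies of it at once.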

A number of frameworks, including Gestalt, Nuclio, Apache OpenWhisk, and OpenFaaS, build independent serverless platforms on top of Docker. They package functions as Docker images and run them as containers, which are triggered into executing their command either by an event from an event bus or by a call through an API gateway.
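The trigger flow can be sketched as a registry mapping event topics to functions; in a real deployment each function would run in its own container rather than in-process, and the topic and payload names below are invented for illustration:

```python
from typing import Callable, Dict, List

class EventBus:
    """Minimal in-process stand-in for the event bus that triggers functions."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[dict], dict]]] = {}

    def subscribe(self, topic: str, fn: Callable[[dict], dict]) -> None:
        """Register a function to run whenever an event on `topic` arrives."""
        self._subscribers.setdefault(topic, []).append(fn)

    def publish(self, topic: str, event: dict) -> list:
        # Each matching function is invoked once per event, then "decommissioned".
        return [fn(event) for fn in self._subscribers.get(topic, [])]

bus = EventBus()
bus.subscribe("image.uploaded", lambda e: {"thumbnail": e["file"] + ".thumb"})
print(bus.publish("image.uploaded", {"file": "photo.png"}))
```

An API gateway call follows the same shape: the gateway is simply another publisher, with the HTTP request as the event.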

Currently AWS Lambda holds around 40% of the market, with Microsoft Azure, IBM, and Google Cloud Platform taking a combined 23%. Lambda launched in 2014, and AWS itself has been running since 2006, giving it a head-start on the competition; but it won’t be long before the others catch up, narrowing the gap between what AWS has to offer and competitors’ own features.


Blockchain

Blockchain is no longer synonymous with cryptocurrency: we have seen distributed ledgers adopted by governments, international financial houses, healthcare providers, and logistics managers to provide an unalterable, validated means of retaining and exchanging information.

Blockchains offer increased accountability for third-party providers, which is reassuring for clients that use these services, particularly if they are storing and processing data outside of their own servers.

Since these applications can be either public (viewable and validated by anyone on the chain) or private (only those with permissions can perform specific functions), there are varying degrees of privacy and accountability. In fact, blockchains could be seen as the next logical step for containers. Both are immutable infrastructures: new instances are written rather than old ones edited. Both are decentralized, working across a multitude of platforms. And both are resistant to failure, blockchains by disseminating copies throughout the network for verification, containers by being one of many instances supporting an application. As more uses for blockchain are implemented beyond cryptocurrency, we will see other applications embrace it, particularly in open-source Cloud computing.
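The immutability parallel can be illustrated with a toy hash chain: each block embeds the hash of its predecessor, so editing an old block invalidates everything after it, and the only valid operation is appending a new block. This is a minimal sketch, not a real distributed ledger:

```python
import hashlib
import json

def block_hash(data: dict, prev_hash: str) -> str:
    """Hash a block's content together with the previous block's hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data: dict, prev_hash: str) -> dict:
    """Append-only: a new block is written rather than an old one edited."""
    return {"data": data, "prev_hash": prev_hash,
            "hash": block_hash(data, prev_hash)}

def verify(chain: list) -> bool:
    """Recompute every hash; any tampering with old data breaks the chain."""
    prev_hash = "0"
    for block in chain:
        if block["prev_hash"] != prev_hash:
            return False
        if block["hash"] != block_hash(block["data"], block["prev_hash"]):
            return False
        prev_hash = block["hash"]
    return True

genesis = make_block({"event": "genesis"}, prev_hash="0")
chain = [genesis, make_block({"event": "transfer"}, genesis["hash"])]
print(verify(chain))  # True
```

Editing `chain[0]["data"]` after the fact makes `verify` return False, which is exactly the write-new-rather-than-edit-old property the container comparison rests on.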

Machine Learning

Cloud computing is particularly interesting to data scientists and analysts, who can share data, develop research, work in tandem to produce an analysis, and also develop and integrate deep learning models.

Public Cloud providers, including Amazon (“More machine learning is built on AWS than anywhere else”) and Google, already offer Machine Learning as a service, typically in three categories: binary predictions (yes/no answers, such as whether a transaction is valid or whether a customer fits a demographic), category predictions (a broader range of labels, such as age bands or risk levels), and value predictions (a quantitative estimate, such as how many copies will be sold or whether stock will last to the end of the month).
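The three categories can be illustrated with deliberately simple stand-ins (a threshold, a binning rule, and a linear trend); a real service would train a model on data, but the shape of each answer is the point here, and all thresholds and figures below are invented:

```python
def binary_prediction(transaction_amount: float, limit: float = 1000.0) -> bool:
    """Binary: a yes/no answer, e.g. 'is this transaction within its limit?'"""
    return transaction_amount <= limit

def category_prediction(age: int) -> str:
    """Category: one label from a broader set, e.g. an age band."""
    if age < 18:
        return "under-18"
    if age < 65:
        return "18-64"
    return "65+"

def value_prediction(week: int, base: float = 100.0, trend: float = 12.5) -> float:
    """Value: a quantitative estimate, e.g. projected sales on a linear trend."""
    return base + trend * week

print(binary_prediction(250.0))   # True
print(category_prediction(42))    # 18-64
print(value_prediction(4))        # 150.0
```

Each function returns a different kind of object (a boolean, a label, a number), which is the real distinction between the three service tiers.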

A Cloud-based ML system is inexpensive to run, offers low-cost storage, and can be integrated directly into applications.

Service Meshes

A service mesh is a dedicated infrastructure layer that manages the exchanges between services in a microservices architecture.

Containers are of course the perfect vehicle for deploying and scaling microservices, but observability has been a drawback: inter-service communication is complex and difficult to trace, and most microservice errors occur during an exchange between services. However, open-source Cloud projects such as Envoy, Linkerd, and Istio are providing service meshes with enhanced inter-service communication and tracing for a more seamless experience.
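The core mesh idea, a proxy that sits beside each service and adds retries and tracing without touching application code, can be sketched as follows; the flaky service and its failure mode are invented for illustration:

```python
from typing import Callable, Optional

def sidecar_call(fn: Callable[[], str], retries: int = 2,
                 trace: Optional[list] = None) -> str:
    """Wrap a service call with retry and trace logging, as a mesh proxy would.

    `fn` stands in for a network call to another microservice.
    """
    trace = trace if trace is not None else []
    for attempt in range(retries + 1):
        trace.append(f"attempt {attempt}")
        try:
            return fn()
        except ConnectionError:
            if attempt == retries:
                raise
    raise RuntimeError("unreachable")

# A downstream service that fails once with a transient fault, then succeeds.
calls = {"count": 0}
def flaky_service() -> str:
    calls["count"] += 1
    if calls["count"] == 1:
        raise ConnectionError("transient network fault")
    return "ok"

trace: list = []
print(sidecar_call(flaky_service, trace=trace))  # ok
print(trace)  # ['attempt 0', 'attempt 1']
```

Because the proxy, not the service, records the trace, every exchange between services becomes observable in one place, which is precisely the gap the mesh projects are filling.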

The Growth of Cloud-based Working

It’s been predicted that public Cloud computing will be used by over half of global enterprises by the end of 2018, and that by 2025 we will need the capacity to store 163 zettabytes of data. Decentralization and third-party hosting will shoulder the bulk of that weight, leaving users with more resources to focus on building their business rather than maintaining a network.