The third cloud wave: The multi-cloud dilemma

The third cloud wave

While cloud computing is already used in most companies, the cloud is becoming ever more important for achieving digital transformation.

As industries move through successive waves, technology cycles rise and fall, and this naturally applies to cloud technology as well.

Companies are now updating their processes and upskilling their teams to maintain the necessary control as cloud technology enters a new wave.

During the first wave, companies focused on moving their systems to the cloud through IT-driven initiatives, often without business involvement. The focus was on cost savings and agility. When companies learned that there is no single cloud strategy, they started to differentiate towards vendors that are strong in particular services and offerings, rather than depending on one vendor.

As a consequence, multi-cloud strategies became more popular in the second wave. Because cloud providers had differentiated their offerings, companies started to choose the best cloud service for each of their IT systems and applications.

This pick-and-mix procurement approach, however, let costs spiral out of control, and CIOs had difficulty justifying the total return on their cloud technology investments.

The multi-cloud dilemma

The multi-cloud strategy reduced vendor dependency; however, application portability and integration still remain a challenge.

Companies have therefore started to withdraw from their various multi-cloud solutions, especially in the public cloud, and are moving more towards on-premises data centers and a private cloud approach.

Workloads in the private cloud are still business-critical, and companies are modernizing their private cloud infrastructure in parallel with their public cloud initiatives, because they need to understand how many, and what kind of, workloads will remain in the private cloud. Most private clouds are hosted, and modernization approaches revolve around virtualization, orchestration and the control plane.

One of the major drivers of the third wave is the return to cost control. On the one hand, companies want to maximize the return on their application investments. On the other hand, as digital transformation progresses, the cost of transforming those applications rises sharply with their volume, and the returns do not meet expectations, despite the economy of scale promised by public clouds. That promise is not fulfilled when applications are simply shifted to the cloud on top of a traditional on-premises architecture. A container approach, for example, can be seen as one answer to better cloud adoption.
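To make that concrete, here is a minimal sketch (my own illustration, with hypothetical names) of a small Python service prepared for containers: all configuration comes from the environment instead of files baked into a machine, which is exactly the kind of change a plain lift-and-shift skips.

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

# Deployment-specific settings come from the environment, so the same
# container image can run unchanged on any cloud or on premises.
PORT = int(os.environ.get("PORT", "8080"))

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # A health endpoint lets a container orchestrator detect failures
        # and restart or reschedule the service automatically.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    # Binding to 0.0.0.0 makes the service reachable through the
    # container's published port.
    HTTPServer(("0.0.0.0", PORT), HealthHandler).serve_forever()
```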

So the question comes back to designing the appropriate architecture: one that achieves better cost savings and uses modern cloud technologies to get the maximum out of the cloud service providers.

Examples described in my previous blog series about Rising Cloud Technologies include:

  • Distributed Cloud architecture
  • API-Centric SaaS
  • Cloudlets
  • Blockchain PaaS
  • Cloud-Native architecture
  • Containers
  • Site Reliability Engineering
  • Edge Computing
  • Service Mesh
  • Microservices

Over the next couple of years, legacy applications that were migrated to public cloud infrastructure as a service (IaaS) will require optimization to become more cost effective.

The focus is on small, discrete functions, scaling only those business functions that are in demand instead of scaling all functions as in previous waves. The homework, though, is to understand the current architecture, identify silos and redundant applications, and replace them, for example, with microservices, cloud-native architectures and other concepts from the rising cloud technologies to achieve reliability while reducing complexity and costs.
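As a sketch of what such a discrete, independently scalable function could look like, assume a hypothetical pricing function carved out of an order-processing monolith. Because it runs as its own small service, an orchestrator can add replicas of just this function when demand for it spikes:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative price rules; a real system would load these from a
# configuration service or database.
SHIPPING = {"standard": 10.0, "express": 25.0}

class PricingHandler(BaseHTTPRequestHandler):
    """One discrete business function: price an order, nothing else."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        order = json.loads(self.rfile.read(length) or b"{}")
        total = SHIPPING.get(order.get("shipping", "standard"), 0.0) + \
            sum(item.get("price", 0.0) for item in order.get("items", []))
        body = json.dumps({"total": total}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Running pricing as its own process is what allows it to be scaled
    # independently of the rest of the order system.
    HTTPServer(("0.0.0.0", 8081), PricingHandler).serve_forever()
```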

Coming from a traditional IT infrastructure with the burden of existing silo architectures, data is unstructured, stored in various file formats and distributed across different storage systems with different hierarchies. This makes it difficult to come up with a unified data model and to avoid inconsistent data in the cloud. This situation is common in regulated industries such as financial services, where companies depend on highly complex, old legacy systems that cannot simply be “lifted and shifted” to the cloud.
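A minimal sketch of what that unification involves, using hypothetical record shapes: customer data arriving from one silo as CSV and from another as JSON is mapped onto a single model before it ever reaches the cloud.

```python
import csv
import io
import json
from dataclasses import dataclass

@dataclass
class Customer:
    """Unified target model that every source format is mapped onto."""
    customer_id: str
    name: str

def from_csv(text: str) -> list[Customer]:
    # Legacy export: a header row with "id" and "name" columns.
    return [Customer(row["id"], row["name"])
            for row in csv.DictReader(io.StringIO(text))]

def from_json(text: str) -> list[Customer]:
    # Newer system: a list of objects with different field names.
    return [Customer(str(obj["customerId"]), obj["fullName"])
            for obj in json.loads(text)]

# Records from both silos end up in one consistent shape, so the data
# loaded into the cloud is no longer inconsistent across sources.
customers = from_csv("id,name\n42,Alice\n") + \
    from_json('[{"customerId": 7, "fullName": "Bob"}]')
print(customers)
```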

Companies that get the maximum from the cloud will challenge their multi-cloud environment by investing only in the cloud services that are best for the business. The leading cloud service providers will expand their portfolios by offering a subset of their services for low-latency application requirements.

The cloud in your own data center

Most regulated companies, such as banks, government agencies and pharmaceutical companies, still run their IT systems on premises instead of on a public cloud infrastructure, for security reasons: they want to keep their data in their own data centers.

These companies are missing out on the advantages of cloud technologies such as embedded machine learning, artificial intelligence and autonomous databases, which reduce costs and security risks by eliminating user errors through automation. The cloud simply does not work the same way a traditional data center does.

Cloud service providers are now arriving with a new business model that gives those companies the benefits of the cloud, such as pay-as-you-go and pay-per-use pricing, rapid elasticity and the latest patches, with an infrastructure that is run by the vendor but physically sits in the customer’s data center.

Vendors place their own cloud hardware and software in the customer’s data center, for example a cloud-based autonomous database. This lets system administrators, database administrators and developers focus on innovation instead of time-consuming maintenance, which would otherwise expose them to a higher risk of data breaches and failures.

Users pay only for what they use, and the infrastructure is in the customer’s data center, behind their firewall. The data does not travel between a public or private cloud over the outside internet to reach the user; it all stays in house.

The same business model can be leveraged for a public cloud inside an enterprise data center. Companies can then use a wider range of cloud services in their own data center, including new technologies from the cloud service providers’ portfolios such as machine learning, artificial intelligence, the Internet of Things and blockchain.

Conclusion

Companies using in-house cloud services from an external vendor have the advantage that the hardware, software and data remain in their own data center while the vendor manages the infrastructure, patching, security and technology updates through a remote connection. Using, for example, autonomous services provided by the vendor, the customer can benefit from machine learning out of the box instead of buying third-party tools and trying to integrate them into their systems. And if performance can be increased by running the infrastructure in house, the impact on cost savings is an additional benefit.

However, the vendor must not have access to sensitive data. The vendor’s role is the same as if the customer were using a public cloud, except that this cloud is physically inside the enterprise and the data does not travel over the outside internet.

It is important to understand how the cloud differs from traditional data centers. Companies need to pay attention to upskilling their IT data center staff, because the cloud requires a different approach, cost model and infrastructure management than building or replicating a data center in the cloud.

The focus should be on real value: transforming how you operate today and benefiting from data analytics, automation, machine learning and artificial intelligence to become more agile and efficient. Whether this is achieved with a public, private, hybrid or multi-cloud does not matter. If a company wants to survive in the future, it needs to transform its people and its culture.
