
Predictions for the future of cloud computing – 2023 edition

As someone who uses cloud services, invests time in learning new technology trends, and shares knowledge about cloud adoption, I would like to share my thoughts about the future of cloud computing.

This blog post was written at the end of the year, so I called it the 2023 edition; technology keeps evolving, and I will probably have to write new editions in the coming years.

I have decided to sort my predictions for the coming year into several topics:

  • Cloud adoption
  • Cloud-native applications
  • AI/ML
  • Cloud Security

Cloud Adoption

The public cloud has been with us for many years, but as the years go by, more and more traditional organizations are beginning to see its benefits.

For many years, traditional organizations were afraid of the public cloud, mostly due to security and cost concerns.

As the technology evolves, transparency from cloud providers and knowledge about using the public cloud have become much better than they were 10 years ago.

What do I mean by transparency from cloud providers?

Think about misconfiguration, for example.

For many years, the most common mistake was leaving object storage publicly accessible, making data leakage incidents very common.

Today, Amazon S3 (for example) makes sure all newly created buckets are private and encrypted by default.
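
As a quick illustration, these settings can be verified programmatically. Below is a minimal sketch using boto3, assuming AWS credentials are already configured; the bucket name is hypothetical:

```python
# Minimal sketch: verify that an S3 bucket blocks public access and has
# default encryption enabled. The bucket name is hypothetical.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "my-example-bucket"  # hypothetical bucket name

try:
    pab = s3.get_public_access_block(Bucket=bucket)["PublicAccessBlockConfiguration"]
    print("Public access fully blocked:", all(pab.values()))
except ClientError as err:
    print("No public access block configuration:", err.response["Error"]["Code"])

try:
    enc = s3.get_bucket_encryption(Bucket=bucket)
    rule = enc["ServerSideEncryptionConfiguration"]["Rules"][0]
    print("Default encryption:", rule["ApplyServerSideEncryptionByDefault"]["SSEAlgorithm"])
except ClientError as err:
    print("No default encryption configured:", err.response["Error"]["Code"])
```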

As organizations move to managed services in the cloud, they shift the burden of maintaining the security of the infrastructure layers to the cloud provider. When a vulnerability is discovered in the cloud provider's infrastructure, the cloud provider fixes it, without any additional change required on the customer's side (assuming the customer is using PaaS).

Cost has always been a major blocker for customers considering the cloud – the feeling that "everything costs a lot of money"…

Organizations adopting the public cloud should make sure they fully understand the pricing options for each of the cloud services they are using (or planning to use), adjust their architecture decisions to their actual use, and treat cost as an ongoing task, together with software maintenance and security controls.
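
One way to make cost an ongoing task is to query spend programmatically. The following is a minimal sketch using boto3 and the AWS Cost Explorer API (the account must have Cost Explorer enabled; the dates are placeholders), breaking down one month's spend by service:

```python
# Minimal sketch: break down one month's AWS spend by service with Cost Explorer.
# Assumes boto3 credentials are configured and Cost Explorer is enabled; dates are placeholders.
import boto3

ce = boto3.client("ce")  # Cost Explorer

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-11-01", "End": "2023-12-01"},  # placeholder dates
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if amount > 0:
        print(f"{service}: ${amount:,.2f}")
```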

Another trend I am beginning to see is a conversation about sustainability in the cloud.

More and more organizations are becoming aware of the carbon footprint they leave behind by running traditional data centers, and by consuming more and more resources in the cloud.

In the near future, I expect a lot of focus on sustainability, with organizations factoring it into decisions when developing new workloads in the cloud.

Cloud-Native Applications

Previously, we talked about traditional organizations beginning to embrace the public cloud, either by migrating workloads to the cloud or by building new ones in the cloud.

Now let us talk about start-ups, modern organizations, or what we used to call "born in the cloud".

Modern organizations are not tied to a physical data center.

They want to invent something new, consume services fast, and produce new products for the market as fast as they can.

Such organizations are already using the public cloud, and understand the benefits of the cloud.

Traditional development was based on three-tier applications (front-end, business logic, and back-end), built as a complex monolith.

Cloud-native applications use modern technologies: they communicate through APIs, are built from multiple small components in a microservice architecture, are wrapped inside containers, rely on event-driven architectures, and more.

Modern applications are built to support change at unpredictable scale, allowing organizations to release multiple versions of each component, even several update releases on the same day.

To support business demand, organizations have had to change their development lifecycle – from using CI/CD pipelines, automated testing, and strong use of open-source components, to deployment using declarative Infrastructure-as-Code.
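
To make "declarative Infrastructure-as-Code" concrete, here is a minimal sketch using the AWS CDK in Python (Terraform, Pulumi, or CloudFormation would work just as well); the stack and bucket names are hypothetical:

```python
# Minimal sketch: declarative Infrastructure-as-Code with the AWS CDK (v2).
# The stack declares a private, encrypted, versioned S3 bucket; names are hypothetical.
from aws_cdk import App, RemovalPolicy, Stack
from aws_cdk import aws_s3 as s3
from constructs import Construct


class StorageStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # The bucket is described as desired state; the CDK/CloudFormation
        # engine figures out how to create or update it.
        s3.Bucket(
            self,
            "DataBucket",
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            versioned=True,
            removal_policy=RemovalPolicy.RETAIN,
        )


app = App()
StorageStack(app, "ExampleStorageStack")
app.synth()
```

Running `cdk deploy` against this definition creates or updates the bucket to match the declared state – the essence of the declarative approach.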

Organizations are no longer bound to a specific programming language or costly database license.

Modern applications, built on a microservice architecture, allow each development team to choose the most suitable programming language, decide on the technology they wish to use (from Kubernetes for container orchestration to Function-as-a-Service), and pick the back-end data store (who says we must use a database? Perhaps object storage is better for our use case?).
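
For example, here is a minimal sketch of an event-driven, Function-as-a-Service-style handler (in the style of an AWS Lambda function) that persists incoming events to object storage instead of a relational database; the bucket name and event shape are hypothetical:

```python
# Minimal sketch: an event-driven, FaaS-style handler that persists incoming
# events to object storage instead of a relational database.
# The bucket name and event shape are hypothetical.
import json
import uuid
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
BUCKET = "orders-events-bucket"  # hypothetical bucket name


def handler(event, context):
    """Store the incoming event as a JSON object, keyed by date and a unique id."""
    now = datetime.now(timezone.utc)
    key = f"orders/{now:%Y/%m/%d}/{uuid.uuid4()}.json"

    s3.put_object(
        Bucket=BUCKET,
        Key=key,
        Body=json.dumps(event).encode("utf-8"),
        ContentType="application/json",
    )
    return {"statusCode": 200, "body": json.dumps({"stored_as": key})}
```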

AI/ML

Perhaps one of the biggest trends of 2023 is the use of AI/ML, and specifically generative AI.

It began with OpenAI and its ChatGPT chatbot engine, continued with the Midjourney image-generation engine, and reached the integration of Gen-AI capabilities into cloud provider services, such as Amazon Q and Microsoft Copilot for Azure.

Generative AI has huge potential for many aspects of our daily lives, such as:

  • Chatbots for analyzing conversations with customers to detect sentiment (see the sketch below)
  • Code generation services, such as Amazon CodeWhisperer or GitHub Copilot, which provide developers with code suggestions, saving hours of work
  • Automated report generation – the ability to analyze large amounts of written data and produce a summary for the readers
  • Drug discovery – the ability to create new drugs to make human lives better

The potential uses of generative AI are huge – every day new services come to the market, and Gen-AI models allow organizations to gain business value from their data.

Gen-AI requires a lot of compute power, mostly based on GPUs (or equivalent processors from the various cloud providers), and for this specific reason, I truly believe organizations will shift more of their workloads and new developments to the public cloud, leaving fewer and fewer workloads in their static, traditional data centers.
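
To illustrate the first use case above, here is a minimal sketch of sentiment detection on a customer message. It uses Amazon Comprehend's detect_sentiment call as a stand-in for whichever managed AI service you choose (Comprehend is a managed NLP service rather than a Gen-AI model), and the sample text is made up:

```python
# Minimal sketch: detect the sentiment of a customer message using a managed
# AI service (Amazon Comprehend here, as a stand-in). The sample text is made up.
import boto3

comprehend = boto3.client("comprehend")

message = "I have been waiting three days for a reply and still have no answer."

result = comprehend.detect_sentiment(Text=message, LanguageCode="en")
print("Sentiment:", result["Sentiment"])       # e.g. NEGATIVE
print("Scores:", result["SentimentScore"])     # confidence per sentiment class
```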

Cloud Security

As we head into the future and embrace cloud services, attacks on organizations' systems are becoming more sophisticated.

In the past, it was enough to deploy patches, close some firewall ports, and conduct vulnerability assessments.

In today's world, the old security controls are simply not enough.

As the development process becomes much faster, and business demand for new versions of applications rises, organizations must embrace a "shift left" mindset, embedding security controls in the early stages of the development process.

Organizations embracing CI/CD pipelines must invest in learning how to embed scanning tools as part of the process, making sure misconfigurations (such as failure to secure their secrets/credentials) are detected as early as possible, and integrate security posture services to gain ongoing insights into every bit of their applications and infrastructure.
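
As a toy illustration of such a "shift left" check (not a replacement for dedicated scanners such as git-secrets or trufflehog), the following sketch fails a pipeline step when it finds what looks like hard-coded credentials in the source tree; the patterns and file selection are deliberately simplified:

```python
# Toy illustration of a "shift left" check: fail the pipeline step if a file in
# the repository appears to contain hard-coded credentials. The patterns are
# simplified and not exhaustive; real pipelines should use dedicated secret scanners.
import re
import sys
from pathlib import Path

SECRET_PATTERNS = [
    re.compile(r"AKIA[0-9A-Z]{16}"),  # AWS access key id format
    re.compile(r"(?i)(password|secret|api_key)\s*=\s*['\"][^'\"]+['\"]"),
]


def scan(root: str = ".") -> list[str]:
    findings = []
    for path in Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        for pattern in SECRET_PATTERNS:
            if pattern.search(text):
                findings.append(f"{path}: matches {pattern.pattern}")
    return findings


if __name__ == "__main__":
    issues = scan()
    for issue in issues:
        print("Possible hard-coded secret:", issue)
    sys.exit(1 if issues else 0)  # non-zero exit fails the CI step
```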

I predict that investment in automated tools for detecting and blocking attacks will rise in the coming years.

Summary

The topics I wrote about in this blog post are my predictions for 2023, based on things I have read and conversations I have had with colleagues.

Do not take my word for granted – invest time in reading about new technologies and how your business can gain value from them.

I will keep an eye on new trends and write new predictions next year.

About the Author

Eyal Estrin is a cloud and information security architect, the owner of the blog Security & Cloud 24/7 and the author of the book Cloud Security Handbook, with more than 20 years in the IT industry.

Eyal has been an AWS Community Builder since 2020.

You can connect with him on Twitter

Opinions are his own and not the views of his employer.
