The shift to cloud repatriation: why organizations are making the change – part 1

Over the past decade, perhaps no IT trend has been as transformative as the widespread availability of public cloud. With hyperscalers promising near-infinite scalability and flexibility for workloads while reducing the need for organizations to spend on internal infrastructure, tooling and staff, businesses have rushed headlong into a new era.

But recently, as enterprise cloud strategies have matured, there has been a growing realization not only that the expected financial returns from public cloud investments may prove elusive, but also that organizations risk losing flexibility, security and control when they go “all-in” on the public cloud. As a result, we’ve seen a growing number of companies begin to rethink their cloud strategies and make wiser decisions about where their most critical workloads should be located. This rethink has led to a gradual migration of workloads from public cloud to private cloud environments – “repatriation” – and reflects a growing understanding of an undeniable truth: the public cloud is simply not the optimal choice for every type of workload.

So how should organizations think strategically about the types of workloads that could benefit from repatriation? The decision about which workloads go where really depends on a deep understanding of their nature and the specific needs of the organization. Regardless of a company’s specific IT architecture, successful repatriation requires a nuanced approach and an understanding of how you want to access your data, what you need to protect, and how much you are willing to spend.

In this first part of a two-part series, we look at two of the four key factors driving the current wave of repatriation: edge computing and data privacy/sovereignty.

Bryan Litchford, Vice President of Private Cloud at Rackspace.

Living on the edge: bringing workloads home

According to research from Virtana, most organizations are currently using some type of hybrid cloud strategy, with more than 80% operating across multiple clouds and approximately 75% using some form of private cloud. More recently, we have seen a shift, especially in sectors such as retail, industrial, public transportation and healthcare, towards edge computing, driven by the need for greater flexibility and control over computing resources. The development of the Internet of Things (IoT) has been critical here, as it has enabled the collection of a wide range of data at the network edge.

When the number of connected IoT devices at the edge was relatively small, it made sense for organizations to send the data they produced to the public cloud. But as these devices continue to proliferate, there are significant efficiencies to be gained from collecting and analyzing data at the edge, including near real-time response and increased reliability of critical infrastructure such as point-of-sale systems and assembly lines.
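To make the efficiency argument concrete, here is a minimal sketch of the pattern described above: an edge node reacts to sensor readings locally and forwards only periodic summaries upstream rather than streaming every raw reading to a distant cloud region. All names, thresholds and functions here are hypothetical illustrations, not any specific vendor's implementation.

```python
# Hypothetical edge-aggregation sketch: react locally, forward only summaries.
import random
import statistics
import time

TEMP_ALERT_C = 85.0        # assumed threshold for immediate local action
READINGS_PER_WINDOW = 5    # assumed batch size before reporting upstream


def read_sensor() -> float:
    """Stand-in for a real edge sensor read (e.g. a line-side temperature probe)."""
    return random.gauss(70.0, 10.0)


def act_locally(reading: float) -> None:
    """Near real-time response at the edge: no round trip to a remote region."""
    print(f"ALERT: {reading:.1f} C exceeds {TEMP_ALERT_C} C - slowing line locally")


def forward_summary(summary: dict) -> None:
    """Placeholder for sending a compact summary to a central or private cloud."""
    print(f"Forwarding summary upstream: {summary}")


def run_edge_loop(windows: int = 3) -> None:
    for _ in range(windows):
        readings = []
        for _ in range(READINGS_PER_WINDOW):
            reading = read_sensor()
            readings.append(reading)
            if reading > TEMP_ALERT_C:
                act_locally(reading)   # decision made at the edge
            time.sleep(0.1)            # stand-in for the sampling interval
        # Only the aggregate leaves the site, cutting bandwidth and egress cost.
        forward_summary({
            "count": len(readings),
            "mean_c": round(statistics.mean(readings), 2),
            "max_c": round(max(readings), 2),
        })


if __name__ == "__main__":
    run_edge_loop()
```

The design choice is the point: the latency-sensitive decision never leaves the site, and the cloud (public or private) receives only compact aggregates for longer-term analysis.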

Particularly in industries where uninterrupted operations are paramount, minimizing downtime is crucial to maintaining profitability and competitiveness. This shift to edge computing reflects a strategic reassessment of IT infrastructure deployment, prioritizing localized solutions over traditional public cloud services, and has led many organizations to pull workloads back from the public cloud.

Data sovereignty and privacy

As companies grapple with increasing concerns about privacy and ownership of information, there is growing recognition of the need to maintain greater control over sensitive data and establish parameters and policies for its use.

In industries such as healthcare and financial services, where vast amounts of sensitive, critical data are generated and exchanged, maintaining trust and control over this information is of paramount importance. By keeping this data in tightly controlled environments, organizations can effectively protect their assets and limit the risk of unauthorized access or breaches.

Additionally, heightened attention from key stakeholders such as CIOs, CTOs and boards has elevated the importance of data sovereignty and privacy, resulting in notably closer scrutiny of third-party cloud solutions. While public clouds may be suitable for workloads not covered by data sovereignty laws, a private solution is often required to meet compliance thresholds. Key factors to consider when deciding whether a public or private cloud solution is more suitable include how much control, visibility, portability, and customization the workload requires.

Of course, trust and privacy are not the only data factors driving repatriation. There are additional operational and strategic benefits to be gained from keeping data within trusted environments, such as greater control over how information is accessed, used and shared.

In part two of this series, we’ll look at two other key factors playing a role in repatriation: the rise of Kubernetes and the flexibility of containers.


This article was produced as part of TechRadar Pro’s Expert Insights channel, where we profile the best and brightest minds in today’s technology industry. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, you can read more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
