When cloud computing began to take over the IT industry a decade ago, many organizations jumped in with both feet, migrating their operations entirely into the cloud or starting up new companies built around cloud services. While public cloud architecture has delivered many benefits, experience has demonstrated that it isn’t always the right solution for every business. As the range of options for cloud deployments continues to grow, some companies are beginning to question whether a public cloud makes sense for them.
This is not to say public clouds are going anywhere. If anything, they’re expected to grow significantly, with a recent IDC study projecting that public cloud adoption will be three times that of private clouds by 2019. Indeed, lift-and-shift strategies that migrate existing assets into the public cloud are more effective than ever. But despite this trend, a sizable number of companies are engaging in repatriation, shifting some portion of their data and computing workloads from a public environment to a private one. Another survey found that more than a third of companies made that transition in 2017, with 36 percent of cloud-based organizations hosting their computing platforms in a private cloud.
Perhaps the most prominent example of such a repatriation is Dropbox, the popular storage service that got its start on Amazon Web Services (AWS). In the two years after repatriating most of its operations from the public cloud to colocation data centers, Dropbox trimmed its operational expenses by $74.6 million. While the company still stores about 10 percent of its data in the AWS cloud, the move made Dropbox a poster child for the benefits of data repatriation.
Routinely cited as the leading reason for companies leaving the public cloud, security is a serious matter for today’s organizations. With several high-profile data leaks involving cloud providers (an August 2018 AWS error, for example, exposed business-critical information involving over 31,000 systems for the company GoDaddy), their concerns are not without merit. Public cloud providers do offer extensive security measures, but those defenses can be difficult to configure and use correctly. The very nature of the public cloud allows data to be accessed from anywhere, and data often has to travel exposed over the open internet, making it vulnerable to theft or infection by malware. Cloud providers also manage data for many clients in the same cloud environment, increasing the likelihood of the wrong people being granted access to what is supposed to be secure data. By shifting to a private cloud solution, companies can implement whatever security protocols they prefer to ensure the protection of their data.
For today’s companies, few assets are more valuable than their data. From customer information to the unstructured mass of data collected at the network edge by a variety of devices, successful organizations subject their data to a never-ending barrage of analysis to extract valuable insights that can give them an advantage in increasingly competitive markets. But in order to use that data, they need to have control over where it’s located, how it’s managed, and where it goes. That level of control often isn’t possible in a public cloud, where the provider has the ability to interfere with data or even restrict how it’s used in some cases. There’s also the matter of getting data back when a company decides to relocate its assets. Some cloud providers are under no legal obligation to return data in a usable format, effectively locking companies into a specific vendor.
Aside from ease of use, one of the key benefits of the public cloud is that it’s relatively inexpensive. For companies that are just starting out and can’t afford the up-front expenses of servers, storage, and other physical IT infrastructure, the public cloud offers a simple and quick solution that won’t break their budget. At a certain point, however, the public cloud ceases to be a cost-effective solution. Many companies eventually get to a point where it simply makes more financial sense to make an up-front investment in infrastructure (either physical or virtual) and manage it themselves.
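To make that crossover point concrete, here is a minimal back-of-the-envelope sketch. All dollar figures are hypothetical placeholders chosen for illustration, not real cloud or hardware pricing:

```python
def months_to_break_even(cloud_monthly: float,
                         capex: float,
                         private_monthly: float) -> float:
    """Months until cumulative public cloud spend exceeds an up-front
    infrastructure investment plus its lower ongoing operating costs."""
    savings_per_month = cloud_monthly - private_monthly
    if savings_per_month <= 0:
        return float("inf")  # the up-front investment never pays off
    return capex / savings_per_month

# Hypothetical example: $50k/month in public cloud spend vs. a $600k
# up-front investment that costs $20k/month to operate.
print(months_to_break_even(50_000, 600_000, 20_000))  # 20.0 months
```

The exact numbers will vary enormously by workload, but the shape of the calculation is why steady, predictable workloads are the usual candidates for repatriation, while spiky or experimental ones tend to stay in the public cloud.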
Data availability is crucial to the success of any organization. Server downtime can cripple business operations, leading to reduced productivity, lost opportunities, and lasting brand damage. While public cloud providers regularly tout the benefits of being able to access data from anywhere at any time, that access doesn’t amount to much if the provider is experiencing system downtime. Even established cloud providers like AWS only commit to 99.99 percent uptime in their SLAs, which translates to nearly an hour of downtime each year. By shifting to a private cloud environment facilitated through a colocation or software-defined data center, companies can secure much higher levels of uptime. This ensures that their data and assets will be available when they need them most.
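The arithmetic behind that downtime figure is straightforward: an uptime SLA caps annual downtime at a fraction of the roughly 525,600 minutes in a year. A quick sketch:

```python
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def max_downtime_minutes(sla_percent: float) -> float:
    """Annual downtime permitted by an uptime SLA, in minutes."""
    return MINUTES_PER_YEAR * (1 - sla_percent / 100)

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% uptime -> {max_downtime_minutes(sla):.1f} minutes/year")
```

A 99.99 percent SLA works out to about 53 minutes of permitted downtime per year, and each additional "nine" cuts that allowance by a factor of ten.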
While many companies have made the decision to repatriate their assets from the public cloud to a private cloud, the decision isn’t an entirely “either/or” proposition. In many cases, the ideal solution is a hybrid cloud deployment, which combines the security and control of a private cloud with the scalability and resources of a public cloud. Multi-cloud architectures also provide a great deal of flexibility to meet a variety of computing needs. By working with the rich connectivity options and versatile network environment of a data center, companies can build customized solutions that address their biggest concerns and meet their most pressing needs.