Today’s companies have more flexibility than ever before when it comes to deciding where to house their valuable data and IT assets. When public cloud services emerged as a viable alternative to on-premises data solutions, many companies jumped at the opportunity to rid themselves of legacy infrastructure and embrace the potential cost savings and versatility of the cloud.
After a decade of practice, however, some of them have discovered that the public cloud isn’t the “one size fits all” solution many experts believed it to be. These companies have begun to explore repatriation: shifting data and workloads off the public cloud and into a private cloud, an on-premises solution, or a colocation facility.
Things to Consider When Repatriating Your Data
Why Make the Move?
While a public cloud environment makes sense for many types of organizations, there are several reasons why it may not be a good fit for others. Security concerns ranging from data breaches and distributed denial of service (DDoS) attacks to malware pose a distinct threat to purely cloud-based operations. Public cloud providers may offer a number of security assurances, but they are ultimately external service providers that companies must trust to follow security best practices and respond to threats promptly. And as many companies have learned the hard way, should the public cloud provider’s data center suffer downtime for any reason, the services that depend on it will go down as well, potentially inflicting severe financial losses.

The unexpectedly high cost of cloud services is another factor driving many companies to reconsider where they store their data and application workloads. Companies often end up paying for computing resources they don’t use, especially in the case of enterprise-level organizations, making repatriation an appealing option. In many cases, managed service providers can offer bundled services through a data center partner that are better suited to a company’s actual needs.
Can You Get Your Data Back?
Of course, a repatriation strategy assumes that a company can get its data back in the first place. Most public cloud providers charge an egress fee to move data out of their environment. Amazon Web Services, for instance, charges the following per-gigabyte fees:
1GB to 10TB: $0.09
10TB to 50TB: $0.085
50TB to 150TB: $0.07
150TB to 500TB: $0.05
500TB or more: Contact Amazon
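To see how these fees accumulate, the tiers above can be applied to a planned transfer. The sketch below assumes the rates apply marginally (each tier priced only on the data that falls within it, as with AWS’s graduated pricing) and uses 1 TB = 1,000 GB for simplicity; actual billing conventions and current rates should be confirmed with the provider.

```python
def egress_cost_usd(total_tb, gb_per_tb=1000):
    """Estimate one-time egress cost under the graduated per-GB
    tiers listed above. Assumes marginal (graduated) pricing and a
    decimal TB; both are simplifying assumptions, not AWS's exact
    billing rules."""
    if total_tb > 500:
        raise ValueError("Above 500 TB: contact the provider for a quote")
    # (tier width in TB, price per GB): first 10 TB, next 40 TB,
    # next 100 TB, next 350 TB -- matching the brackets above.
    tiers = [(10, 0.09), (40, 0.085), (100, 0.07), (350, 0.05)]
    remaining = total_tb
    cost = 0.0
    for width_tb, rate in tiers:
        chunk = min(remaining, width_tb)
        cost += chunk * gb_per_tb * rate
        remaining -= chunk
        if remaining <= 0:
            break
    return cost

# Example: moving 100 TB out of the cloud
# 10 TB at $0.09 + 40 TB at $0.085 + 50 TB at $0.07 per GB ≈ $7,800
print(f"${egress_cost_usd(100):,.2f}")
```

Even at these fractions of a cent per gigabyte, a 100 TB migration runs to roughly $7,800 in egress fees alone, before accounting for any data conversion or downtime costs.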
For companies with large amounts of data, these costs can add up quickly. A bigger problem, however, is the prevalence of proprietary data formats. Some public cloud vendors (such as Salesforce) store data using a proprietary compression algorithm, which makes the data all but unusable outside its native cloud environment unless it is converted before being removed or reverse engineered afterward. Either process can cost far more than the egress fees themselves. It’s critically important that any company storing data in a public cloud familiarize itself with the provider’s terms and conditions of service to avoid getting “stuck” in a data prison.
Does Everything Need to Move?
Repatriating data doesn’t need to be an all-or-nothing proposition. Depending upon the organization’s needs, some data and application workloads may be better off remaining in a public cloud environment. Companies should therefore undertake a thorough audit of their existing resources to determine which assets should be repatriated to a data center. Notably, Dropbox, one of the most frequently cited examples of repatriation, didn’t migrate completely off Amazon Web Services and still uses the cloud provider to serve select European markets. For the average company, it may likewise be beneficial to keep some applications and data in a public cloud environment as part of a hybrid cloud or multi-cloud solution that takes advantage of the public cloud’s scalable computing power.
Where to Go?
In some ways, the very term “repatriation” is a bit misleading because it implies that data is returning to its original location. Most of the time, this won’t be the case. Public cloud data was often migrated from outdated legacy systems in an on-premises location. While some companies have repatriated to a new on-premises solution with updated hardware, others will elect to colocate with a third-party data center or set up a private cloud environment using the virtualized resources of a software-defined data center (SDDC). Many of these private IT solutions will be set up by managed service providers (MSPs) offering bundled services customized around a company’s specific needs. Hybrid cloud and multi-cloud environments are common destinations for repatriated data, as these architectures suit companies that want the security and control of a private network along with the scalability of the public cloud.
As the cloud computing market matures and managed service providers become more versatile, more companies will begin to reassess their existing network infrastructure. Organizations that embraced the public cloud in the early days of cloud migration may be looking to regain some of the control they surrendered when they originally made the move, while also saving on costs. Carrier-neutral data centers will play a key role in facilitating repatriation strategies thanks to their ability to provide colocation and other bundled services to customers in search of the optimal IT solutions for their business.
About Tom Banta
Tom is the Senior Vice President of Product Management & Development at vXchnge. Tom is responsible for the company’s product strategy and development.