How to Master Multi-Cloud Data Complexities

Ajinkya | May 20, 2020


Current patterns of cloud migration range from simple “lift and shift,” which moves applications and data with as little change as possible, to refactoring or rebuilding the applications and data so they work more efficiently on a cloud-based platform. More and more migrations also span multiple clouds, which brings new data complexity issues with it. When leveraging multi-cloud architectures, IT leaders and cloud professionals need to rethink how they deal with data complexity.
 

If businesses are to get on top of this massive and growing data management problem, they need to get their IT house in order across a vast heterogeneity of systems, deployments, and data types. That is what it takes to master the data equation for line-of-business applications and services.
 

Table of Contents

What is multi-cloud?
Why use multiple clouds?
How to manage multi-cloud data complexities?
 

What is multi-cloud?

Multi-cloud is the use of two or more cloud computing services, including any combination of public, private, and hybrid clouds. The end result is the ability to orchestrate resources across multiple private or public cloud platforms spanning different vendors, accounts, availability zones, regions, or on-premises environments.
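To make the definition concrete, here is a minimal Python sketch, assuming the boto3 and google-cloud-storage packages are installed and credentials are already configured, that treats two public clouds as one pool of resources by listing object-storage buckets from AWS and Google Cloud behind a single call. The function names are illustrative; only the documented list_buckets calls come from the provider SDKs.

```python
# Minimal multi-cloud sketch: surface storage resources from two providers
# behind one function. Assumes boto3 and google-cloud-storage are installed
# and credentials are configured locally.
import boto3
from google.cloud import storage


def list_buckets_aws() -> list[str]:
    # boto3 reads credentials from the environment or ~/.aws/credentials.
    s3 = boto3.client("s3")
    return [b["Name"] for b in s3.list_buckets()["Buckets"]]


def list_buckets_gcp() -> list[str]:
    # google-cloud-storage uses Application Default Credentials.
    client = storage.Client()
    return [bucket.name for bucket in client.list_buckets()]


def multi_cloud_inventory() -> dict[str, list[str]]:
    # One call returns an inventory spanning both providers.
    return {"aws": list_buckets_aws(), "gcp": list_buckets_gcp()}


if __name__ == "__main__":
    print(multi_cloud_inventory())
```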
 

 

Why use multiple clouds?

The most important benefits of using multiple clouds are:

High availability – Multi-cloud protects an organization’s data and applications against outages and other threats. If one cloud is unavailable, the other clouds remain online to run applications.

Flexibility – Multi-cloud gives businesses the option to select the “best” of each cloud to suit their particular needs based on economics, location, and timing.

Avoiding vendor lock-in – Applications, workloads, and data can be run in whichever cloud best fits business or technical requirements at any given time.

Cost effectiveness – Multi-cloud enables businesses to control their costs by optimizing public cloud usage and choosing infrastructure vendors based on price. Public cloud services deliver functionality without the need to hire additional personnel.
 

Multi-cloud allows you to choose the right platform for each application and its customers while using the best features from each cloud service provider, rather than being limited to what a single provider offers.
 

Multi-cloud also provides protection against the failure of a single cloud platform. Large enterprises may also be able to maximize the benefits of different infrastructure vendors that are competing on price for their business (smaller companies won’t have this luxury).
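As a toy illustration of that price-driven placement decision, the sketch below picks the cheapest provider offering a required region. The provider names and hourly rates are purely illustrative placeholders, not real price data.

```python
# Illustrative workload placement by price and region; rates are made-up placeholders.
from dataclasses import dataclass


@dataclass
class Offer:
    provider: str
    region: str
    hourly_rate_usd: float  # illustrative, not a quoted price


def cheapest_offer(offers: list[Offer], required_region: str) -> Offer:
    # Keep only providers that serve the required region, then take the lowest rate.
    candidates = [o for o in offers if o.region == required_region]
    if not candidates:
        raise ValueError(f"no provider offers region {required_region}")
    return min(candidates, key=lambda o: o.hourly_rate_usd)


offers = [
    Offer("cloud-a", "eu-west", 0.12),
    Offer("cloud-b", "eu-west", 0.10),
    Offer("cloud-b", "us-east", 0.09),
]
print(cheapest_offer(offers, "eu-west"))  # -> cloud-b in eu-west
```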
 

Cloud is very different from your internal IT stuff — the way you program it, the way you develop applications. It has a wonderful cost proposition, at least initially. But now, of course, these companies have to deal with all of this complexity.

- Martin Hingley, President and Market Analyst, IT Candor Limited



 

 

How to manage multi-cloud data complexities?

The reasons for the rising data complexity issues are fairly well known and include the following:

  • The rising use of unstructured data that has no native schema; a schema is typically defined only at access time (see the schema-on-read sketch after this list).
  • The rising use of streaming data that many businesses employ to gather information as it happens and then process it in flight.
  • The rise of IoT devices that spin off massive amounts of data.
  • The changing nature of transactional databases, moving to NoSQL and other non-relational models.
  • The continued practice of binding single-purpose databases to applications.
  • Finally, and most importantly, the rise of as-a-service, cloud-based and cloud-only databases, such as those now offered by all major cloud providers, which are emerging as the preferred databases for applications built both inside and outside the public clouds. Moreover, heterogeneous distributed databases are increasingly the preferred pattern within multi-cloud architectures.
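The schema-on-read point from the first bullet can be made concrete with a short sketch: the stored records carry no declared schema, and a schema is imposed only when the data is read. The record fields and class names are illustrative.

```python
# Schema-on-read sketch: raw JSON lines have no declared schema; the reader
# defines one at access time. Fields and values are illustrative.
import json
from dataclasses import dataclass

RAW_LINES = [
    '{"device": "sensor-7", "temp": 21.4, "note": "extra fields are simply ignored"}',
    '{"device": "sensor-9", "temp": 19.8}',
]


@dataclass
class SensorReading:  # the schema, defined by the reader, not by the store
    device_id: str
    temperature_c: float


def read_with_schema(lines: list[str]) -> list[SensorReading]:
    readings = []
    for line in lines:
        raw = json.loads(line)  # nothing was enforced on write; structure is imposed here
        readings.append(SensorReading(device_id=str(raw["device"]),
                                      temperature_c=float(raw["temp"])))
    return readings


print(read_with_schema(RAW_LINES))
```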
     

Challenge of multi-cloud

For the most part, those who build today’s data systems just try to keep up rather than get ahead of data complexity issues. The migration of data to net-new systems in multi-clouds is more about tossing money and database technology at the problem than solving it. Missing is core thinking about how data complexity should be managed, along with data governance and data security. We’re clearly missing the use of new approaches and helpful enabling technology within multi-cloud deployments that will remove the core drawbacks of data complexity.
 

The challenge is that you need a single version of the truth. Lots of IT organizations don’t have that. Data governance is hugely important; it’s not nice to have, it’s essential to have.

- Martin Hingley, President and Market Analyst, IT Candor Limited



The core remedy is to move toward application architectures that decouple the database from the applications, or even move toward collections of data services, so you can deal with the data at another layer of abstraction. The use of abstraction is not new, but we haven’t had the required capabilities until the last few years. These capabilities include master data management (MDM), data service enablement, and the ability to deal with the physical databases through a configuration mechanism that confines volatility and complexity to a single domain.
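As a minimal sketch of that decoupling, the snippet below codes the application against an abstract data service rather than a physical database. The interface and class names are illustrative assumptions, not a specific product's API, and the stores are stubbed to keep the example self-contained.

```python
# Decoupling sketch: the application depends on an abstraction, so the
# physical database behind it can change without touching application code.
from abc import ABC, abstractmethod


class CustomerStore(ABC):
    """The abstraction the application codes against."""

    @abstractmethod
    def get_customer(self, customer_id: str) -> dict: ...


class PostgresCustomerStore(CustomerStore):
    def get_customer(self, customer_id: str) -> dict:
        # A real implementation would query a relational database; stubbed here.
        return {"id": customer_id, "source": "postgres"}


class DocumentCustomerStore(CustomerStore):
    def get_customer(self, customer_id: str) -> dict:
        # A real implementation would query a document/NoSQL store; stubbed here.
        return {"id": customer_id, "source": "document-db"}


def application_logic(store: CustomerStore, customer_id: str) -> dict:
    # Only the abstraction is visible here.
    return store.get_customer(customer_id)


print(application_logic(PostgresCustomerStore(), "c-42"))
print(application_logic(DocumentCustomerStore(), "c-42"))  # same caller, different physical store
```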
 

Virtual databases are a feature of the database middleware services that technology suppliers provide. They place a configurable structure and management layer over existing physical databases where such a layer is required. This means you can alter the way the databases are accessed: you can create common access mechanisms that are changeable within the middleware and do not require risky and expensive changes to the underlying physical databases.
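A toy version of that idea: access goes through a configurable mapping from logical dataset names to physical backends, so swapping a backend is a configuration change in the middleware rather than an application change. The backend names and connection strings below are placeholders.

```python
# "Virtual database" sketch: a config-driven mapping from logical names to
# physical backends. Backends and connection strings are illustrative.
VIRTUAL_DB_CONFIG = {
    "orders":    {"backend": "postgres", "dsn": "postgresql://orders-primary/..."},
    "telemetry": {"backend": "warehouse", "dsn": "warehouse://analytics-project/..."},
}


def resolve(logical_name: str) -> dict:
    # Applications ask for "orders" or "telemetry"; this layer decides which
    # physical database actually serves the request.
    if logical_name not in VIRTUAL_DB_CONFIG:
        raise KeyError(f"no physical backend configured for '{logical_name}'")
    return VIRTUAL_DB_CONFIG[logical_name]


print(resolve("orders")["backend"])  # -> postgres, swappable by editing the config only
```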
 

Moving up the stack, we have data orchestration and data management. These layers give enterprise data management the ability to deliver services such as MDM, recovery, access management, and performance management as core services that sit on top of the physical or virtual databases, whether in the cloud or on premises.
 

Moving up to the next layer, we have the externalization and management of core data services or microservices. These are managed, governed, and secured under common governance and security layers that can track, provision, control, and provide access to any number of requesting applications or users.
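A minimal sketch of such a governed data service, assuming a simple role-based policy: a common layer tracks and checks every request before any data is returned. The roles, dataset names, and policy are illustrative.

```python
# Governed data-service sketch: access is checked and logged by a common
# policy layer before data is served. Roles and datasets are illustrative.
import logging

logging.basicConfig(level=logging.INFO)

ALLOWED_ROLES = {"customer_profile": {"support", "billing"}}  # illustrative policy


def get_dataset(dataset: str, requesting_role: str) -> dict:
    allowed = requesting_role in ALLOWED_ROLES.get(dataset, set())
    logging.info("access dataset=%s role=%s allowed=%s", dataset, requesting_role, allowed)
    if not allowed:
        raise PermissionError(f"role '{requesting_role}' may not read '{dataset}'")
    return {"dataset": dataset, "rows": []}  # stubbed payload


get_dataset("customer_profile", "support")        # permitted, and logged
# get_dataset("customer_profile", "marketing")    # would raise PermissionError
```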
 

Act now

Most enterprises are ignoring the rapid increase of data, as well as that of data complexity. Many hope that something magical will happen that will solve the problem for them, such as standards. The rapid rise in the use of multi-cloud means that your data complexity issues will be multiplied by the number of public cloud providers that end up being part of your multi-cloud. So, we’ll see complexity evolve from a core concern into a major hindrance to making multi-cloud deployment work effectively for the business.
 

What’s needed now is to understand that a problem exists, and then think through potential solutions and approaches. Once you do that, the technology to employ is rather easy to figure out.
 

Don’t make the mistake of tossing tools at the problem. Tools alone won’t deal with the core issues of complexity. Considering the discussion above, you can accomplish this in two steps. First, define a logical data access layer that can front any type of back-end database or storage system. Second, define metadata management with systemic use of both security and governance.
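For the second step, a metadata registry can be sketched as follows: each logical dataset records where it physically lives, its security classification, and its governance owner. The field names and entries are illustrative assumptions, not a particular catalog product.

```python
# Metadata-management sketch: a registry tying each logical dataset to its
# physical location, security classification, and governance owner.
from dataclasses import dataclass


@dataclass
class DatasetMetadata:
    logical_name: str
    physical_location: str  # which cloud / database actually holds it
    classification: str     # e.g. "public", "internal", "restricted"
    owner: str              # accountable data owner for governance


CATALOG = {
    "orders": DatasetMetadata("orders", "cloud-a/postgres", "restricted", "sales-ops"),
    "telemetry": DatasetMetadata("telemetry", "cloud-b/object-store", "internal", "platform"),
}


def datasets_requiring_review(catalog: dict[str, DatasetMetadata]) -> list[str]:
    # Governance hook: surface restricted datasets for periodic access review.
    return [name for name, md in catalog.items() if md.classification == "restricted"]


print(datasets_requiring_review(CATALOG))  # -> ['orders']
```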
 

The solution occurs at the conceptual level, not with the introduction of another complex array of technology on top of already complex arrays of technology. It’s time to realize that we’re already in a hole. Stop digging.
 

