Not Just Another Trend: The Postmodern Cloud Computing Imperative

Cloud computing has come to occupy a central place in the data management ecosystem, far more so than it did even a couple of months ago.

In the wake of patent economic instability, global health concerns, and an unparalleled need for remote access, many organizations are struggling simply to keep the lights on and retain what customers they still have.

Advanced analytics only helps so much with the twin necessities of reducing costs and provisioning IT resources across immensely distributed settings, which is the crux of maintaining operations in such an arduous business climate.

Although Artificial Intelligence will likely always be considered “cool”, the cloud—and not AI—is the indisputably pragmatic means of staying in business in an era in which budget slashing and layoffs (even of IT personnel) are a disturbingly familiar reality.


The cloud is the single most effective way to address contemporary concerns of:

Overhead: Most cloud manifestations dramatically decrease costs for securing and accessing IT resources, which for most organizations is simply “an enabler, it’s always overhead,” divulged Denodo CMO Ravi Shankar.

“A CEO of any reasonable company will try to cut the overhead as much as possible.”

Remote Access: The cloud’s collaboration benefits are critical to working in decentralized settings (including from home or anywhere else) and support the escalating need for services like mobile banking or telemedicine.

IT: Cloud architecture enables organizations to outsource the difficulty of modern IT needs to specialists, so companies can focus on mission-critical activities central to revenue generation.

What were once incentives for cloud migration are rapidly becoming mandates for contemporary IT needs.

However, “the cloud… is fairly complex, [there are] a lot of cloud services, a lot of moving parts,” reflected Privacera CEO Balaji Ganesan.

By understanding the options available for overcoming the inherent complexities of the cloud—involving data governance and security, integration, and data orchestration—organizations can master this paradigm and thrive despite ongoing economic uncertainty.

Public Cloud Strengths

Most companies know the three main public cloud providers: Amazon Web Services, Microsoft Azure, and Google Cloud. Fewer realize these provide the foundation for serverless computing, and only a select few realize each has the following respective strengths, which can be determinative when selecting a provider.

Azure: According to Shankar, many larger enterprises gravitate towards Microsoft Azure, which excels in “office productivity applications and BI.” Oracle is also increasing market share among larger organizations, particularly those investing in its applications.

Google: For organizations in which machine learning and cognitive computing applications are core to their business, Google Cloud—which focuses on these areas—is a natural fit.

Amazon: Amazon Web Services is the incumbent among public cloud providers. It resonates with small and mid-sized businesses because of perceived pricing advantages, its capacity to help smaller retailers reach global audiences, and because its “marketplace is bigger than Microsoft’s or other marketplaces simply for those reasons: people can find and use these services,” Shankar commented.

Cloud Orchestration

The hybrid and multi-cloud idiom typifies today’s cloud architecture by allowing organizations to position—and access—resources where they’re cheapest and work best: whether in data centers, central clouds, or at the edge.

The downside of this flexibility is its inherent complications, which is why “there needs to be a way to manage all this complexity and this challenge of different locations,” explained Kubermatic CEO Sebastian Scheele.

Cloud native approaches with orchestration platforms like Kubernetes or Docker are foundational to automating the management of “a thousand or even ten thousand clusters in a scalable way, from a single glass, and having a central management control over your whole organization,” Scheele noted—which encompasses all cloud resources.
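At the heart of this central management model is the declarative reconcile loop that platforms like Kubernetes are built on: operators declare a desired state, and controllers continually compare it with observed state and act on the difference. A minimal sketch of that idea, with purely illustrative names rather than any real orchestration API:

```python
# Illustrative sketch of a declarative reconcile loop: compare the state we
# declared with the state we observe, and emit the actions that close the gap.
# Cluster names and action labels are hypothetical, not a real platform API.

def reconcile(desired: dict, observed: dict) -> list:
    """Return the actions needed to move observed state toward desired state."""
    actions = []
    for cluster, want in desired.items():
        have = observed.get(cluster, 0)
        if have < want:
            actions.append((cluster, "scale_up", want - have))
        elif have > want:
            actions.append((cluster, "scale_down", have - want))
    # Clusters that are running but no longer declared are torn down.
    for cluster, have in observed.items():
        if cluster not in desired:
            actions.append((cluster, "delete", have))
    return actions

desired = {"eu-west": 3, "us-east": 5}                  # replica counts we declared
observed = {"eu-west": 3, "us-east": 2, "staging": 1}   # what is actually running

print(reconcile(desired, observed))
# [('us-east', 'scale_up', 3), ('staging', 'delete', 1)]
```

Running this loop continuously, rather than once, is what lets a single control plane keep thousands of clusters converged on their declared configuration.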

Orchestrating the containers widely used to build and deploy cloud applications is influential for implementing common use cases that make the cloud an efficient cost-saver, including:

Horizontal Scalability: Containers are critical to capitalizing on the cloud’s elasticity to scale horizontally for workload surges and “getting rid of inflexible infrastructure for scaling,” Scheele said.

Many organizations are replacing in-person conferences with virtual ones; with containers, they can rapidly scale to accommodate tens of thousands of attendees for a couple of days, then spin those resources back down when they are no longer needed.

Modernizing Legacy Apps: Containers are the de facto environment for running applications with microservices.

They also grant organizations the ability to “use modern infrastructure and move their existing infrastructure in a modern way,” Scheele revealed.

Finally, they enable developers to “cut more functions and capabilities out of existing applications and rewrite them in a modern way,” Scheele noted.

Automating Operations: Cloud native technologies and orchestration platforms underpin operational automation for cloud deployments (which possibly trumps that of analytics) that reduces the complexities of positioning IT resources in multi-cloud deployments “so you don’t have this traffic-like, ticket space process when you create a firewall or things like this; you can automate this,” Scheele indicated.
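The horizontal scaling described above follows a simple proportional rule in Kubernetes’ Horizontal Pod Autoscaler: desiredReplicas = ceil(currentReplicas × currentMetricValue / targetMetricValue). A minimal sketch of that rule (the min/max bounds and the conference numbers are illustrative):

```python
import math

def desired_replicas(current_replicas: int,
                     current_metric: float,
                     target_metric: float,
                     min_replicas: int = 1,
                     max_replicas: int = 100) -> int:
    """Kubernetes HPA-style rule: scale in proportion to metric pressure,
    clamped to configured bounds."""
    desired = math.ceil(current_replicas * current_metric / target_metric)
    return max(min_replicas, min(max_replicas, desired))

# A virtual conference pushes average CPU to 240% against an 80% target,
# so 10 replicas grow to 30...
print(desired_replicas(10, 240.0, 80.0))  # 30
# ...and once load subsides, the replica count is reclaimed.
print(desired_replicas(30, 10.0, 80.0))   # 4
```

The same proportional logic works for any metric the autoscaler can observe, such as requests per second, which is what makes short-lived surges like a two-day virtual event affordable.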

Data Integration

A fundamental cloud architecture need that orchestration solutions don’t resolve—and possibly exacerbate with their dynamic portability—is data integration.

In fact, the most pressing consequence of the increasing distribution of the data landscape the cloud supports is the need to integrate IT resources, because “the cloud alone doesn’t provide integration,” Shankar cautioned.

Instead, the many varieties of clouds, their hybrids, and their innumerable connectors simply reinforce the need to integrate data for almost any singular use case.

Additionally, there’s what Shankar termed “third party data.” For example, various forms of commerce involve “retailers and wholesalers and so on, things are being returned, things need to be recalled,” Shankar mentioned.

“All these need to go back to people that supplied them. So it’s a huge connected network and you need the ability to bring them all together.”

Data virtualization has emerged as a consistently credible means of integrating data into a single pane of glass while simultaneously rectifying differences in schema, format, and structure.

Moreover, it provides an abstraction layer to view and access these resources so the data themselves don’t actually move for enterprises to integrate them.

In this respect, any form of the cloud simply becomes another source to be virtualized alongside others in a comprehensive data fabric.
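The abstraction-layer idea can be sketched in a few lines: per-source connectors answer queries in place, and the virtual layer normalizes their differing schemas into one common format without copying data into a central store. Everything here (the connector classes, field names, and normalization rule) is hypothetical, not any vendor’s actual API:

```python
# Hypothetical sketch of a virtual (logical) data layer. Each connector
# stands in for a real source queried in place; the layer only reconciles
# schema differences, so no data is relocated to integrate it.

class Connector:
    def query(self, entity: str) -> list:
        raise NotImplementedError

class WarehouseConnector(Connector):
    def query(self, entity):
        # Stand-in for an on-premises SQL source.
        return [{"id": 1, "name": "Acme", "source": "warehouse"}]

class CloudApiConnector(Connector):
    def query(self, entity):
        # Stand-in for a cloud/SaaS source with a different schema.
        return [{"customer_id": 2, "customer_name": "Globex"}]

class VirtualLayer:
    def __init__(self, connectors):
        self.connectors = connectors

    def query(self, entity: str) -> list:
        # Fan the query out to every source, normalizing as results arrive.
        rows = []
        for connector in self.connectors:
            for row in connector.query(entity):
                rows.append(self._normalize(row))
        return rows

    @staticmethod
    def _normalize(row: dict) -> dict:
        # Rectify schema differences into one common format.
        return {
            "id": row.get("id", row.get("customer_id")),
            "name": row.get("name", row.get("customer_name")),
        }

layer = VirtualLayer([WarehouseConnector(), CloudApiConnector()])
print(layer.query("customers"))
```

Adding another cloud, then, means adding another connector behind the same logical layer; callers of `query` never change.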

“You come to this one logical layer and then you ask for the data,” Shankar specified.

“It then goes and figures out where the data resides, whether on-premises, in the cloud, data at rest, data in motion, structured data, or unstructured data, and gives it all back to you in one single format that you can use.”

Data Governance

Virtualization technologies have multiple means of facilitating data governance standards like data quality.

Other solutions expressly designed for the cloud use different methods for ensuring governance protocols are preserved, regardless of where data are.

Options functioning as a central interface between organizations and their cloud resources “can point to these sources, understand data, and enable a central way of managing policies and enforcing them locally,” Ganesan disclosed.

Although this approach works best with data at rest for analytics and statistical AI, it obviates the need to move data.

Many of the cloud’s governance considerations involve sensitive data or personally identifiable information.

Formidable solutions in this space rectify these issues across all cloud deployments in three ways:

Data Profiling: Data profiling leverages statistics to reveal values in datasets, including their likelihood of containing PII.

With a confluence of rules—which may stem from governance councils or specific regulations like the California Consumer Privacy Act—and machine learning, organizations can profile data “automatically as soon as any data comes into the cloud,” Ganesan said.

Data Cataloging: Data profiling results form the basis of a sensitive data catalog that pertains to data governance policies.

Organizations can access this information from one place where they have “phone numbers, social, and all these files,” Ganesan commented, so they “can make sure it’s not being accessed by anybody or anonymized.”

Policy Enforcement: The final step is to inform and enforce data governance policy with various measures that restrict access, some of which involve software plug-ins and the translation of governance policies into the Access Control Lists of native databases (such as Athena, Redshift, or others).

This approach lets users “get a single pane of glass where they can specify these compliance and access policies, and seamlessly keep enforcing them at scale as they add more services,” Ganesan observed.
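The three steps above—profile, catalog, enforce—can be sketched end to end. This is a toy illustration under stated assumptions: the regex rules, the 0.8 likelihood threshold, and the `privacy_officer` role are all hypothetical, not how Privacera or any particular product implements them:

```python
import re

# Step 1 inputs: illustrative PII detection rules (a real system would also
# apply ML classifiers and regulation-specific rules, e.g. for the CCPA).
PII_RULES = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "phone": re.compile(r"^\+?\d{10,15}$"),
}

def profile(column: list) -> dict:
    """Step 1: score each PII rule by the share of values it matches."""
    scores = {}
    for tag, rx in PII_RULES.items():
        hits = sum(1 for value in column if rx.match(value))
        scores[tag] = hits / len(column) if column else 0.0
    return scores

def catalog(dataset: dict, threshold: float = 0.8) -> dict:
    """Step 2: record columns whose PII likelihood crosses the threshold."""
    entries = {}
    for col, values in dataset.items():
        tags = [t for t, s in profile(values).items() if s >= threshold]
        if tags:
            entries[col] = tags
    return entries

def allowed(user_roles: set, col: str, pii_catalog: dict) -> bool:
    """Step 3: deny access to cataloged PII unless the user is privileged."""
    if col in pii_catalog:
        return "privacy_officer" in user_roles
    return True

data = {"ssn": ["123-45-6789", "987-65-4321"], "city": ["Oslo", "Lyon"]}
pii = catalog(data)
print(pii)                                  # {'ssn': ['ssn']}
print(allowed({"analyst"}, "ssn", pii))     # False
print(allowed({"analyst"}, "city", pii))    # True
```

Because enforcement consults the catalog rather than each source directly, the same policy check keeps working as new cloud services are added—the “single pane of glass” Ganesan describes.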

Chief Value Proposition

The cloud’s worth to modern enterprises is remarkably simple, yet undeniable: it enables them to do more with less.

It empowers them to scale better, store data more cheaply, and access IT resources much faster than they could if everything were on-premises.

Moreover, they effectuate these boons with considerably less overhead and technical expertise, and with fewer impediments to agility, than they’d otherwise have with on-premises deployments.

Still, the cloud’s greatest boon is likely its remote, ubiquitous access to IT services—which is at a premium today.

By solving its requirements for governance and security, data integration, and streamlined orchestration, organizations have the most efficient means of meeting their IT needs while significantly decreasing their associated risks.

About the Author

Jelani Harper is an editorial consultant servicing the information technology market.

He specializes in data-driven applications focused on semantic technologies, data governance and analytics.
