Eliminating Cloud Data Management and IT Operations Security Woes

By Don Boxley.

Key elements of Data Management and IT operations continue to move to the cloud due to its numerous benefits, such as cheap storage, pay-per-use pricing, disaster recovery (DR), and on-demand resources.

This trend shows no signs of slowing and will undoubtedly continue for some time.

Unfortunately, while the benefits of moving to the cloud are impossible to ignore, there are also seemingly unavoidable problems involved in doing so.

Security breaches are becoming an almost daily, well-publicized occurrence.

Likewise, the severity of each attack appears to be increasing.

It is disconcerting that the growing regularity and severity of cloud data breaches could diminish the utility of what is arguably the most innovative technology advancement of our time.

What’s necessary now to secure the cloud’s value proposition is a security paradigm as flexible and as low-latency as the very opportunities cloud computing affords.

It should minimize the surface area for attacks while evading the notice of intruders; it should be deeply embedded within an organization to safeguard its applications and information as the enterprise “family jewels” that they are.

A software defined perimeter is an advanced security model delivering these benefits and others.

When correctly implemented, it secures gateways at the application layer both to and between clouds, providing unassailable security with cloaked micro-tunnels hackers can’t detect.

The best of these implementations rely on rarely used protocols, offer micro-tunnel failover for continuous application connectivity between clouds and on-premises settings, and can be dynamically positioned wherever resources are.

With encryption capabilities to ensure even third-party software providers aren’t privy to transmissions, they deliver an impenetrable deep segmentation perimeter, purposefully designed for hybrid and multi-cloud deployments.

VPNs Multiply Risks

Hybrid and multi-cloud deployments are becoming increasingly necessary to reduce organizational costs and boost productivity.

In fact, according to 451 Research’s Voice of the Enterprise: Cloud Hosting and Managed Services, Budgets and Outlook survey of 644 enterprise IT decision-makers, 58% of organizations are pursuing a hybrid strategy involving integrated on-premises systems and off-premises cloud/hosted resources.

Moving datacenters or specific applications to the cloud to enable uniform access for distributed locations is a common use case; establishing different nodes in the major public cloud providers for various pricing options, failovers, or burst performance needs is another.

Typical perimeter security measures in these examples and others involve establishing Virtual Private Networks (VPNs), which actually multiply risk in numerous ways.

VPNs were designed for traditional on-premises security; they’re less effective in the cloud because they expand network surface area, enabling more room for lateral movement attacks.

This credential-based security method is also difficult to manage with messy access control lists and the continual reconfiguration of firewalls.

Competitive software defined perimeter solutions overcome these limitations in several ways.

They effectively implement segmented micro-tunnels between applications or servers—in different clouds and on-premises—creating micro-perimeters to decrease network attack surface, not expand it.

The lack of network expansion means users are simply connected at the application layer via a micro-tunnel gateway that effectively cloaks this conduit so intruders have nothing to scan.

In comparison, VPNs leave ports open for hackers to detect.
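The detectability gap is easy to demonstrate. The following sketch (illustrative only, not taken from any particular SDP product) shows how trivially a port scanner confirms an open TCP listener, which is exactly what a VPN endpoint exposes:

```python
import socket

def tcp_port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Probe a TCP port the way a scanner would: a completed handshake
    (connect_ex returning 0) reveals that something is listening."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as probe:
        probe.settimeout(timeout)
        return probe.connect_ex((host, port)) == 0

# A listening TCP socket -- a stand-in for a VPN concentrator's
# exposed port -- is immediately visible to the probe.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
listener.listen(1)
open_port = listener.getsockname()[1]

print(tcp_port_is_open("127.0.0.1", open_port))   # True: the scanner sees it
listener.close()
print(tcp_port_is_open("127.0.0.1", open_port))   # False once nothing listens
```

A cloaked micro-tunnel gateway gives such a probe nothing to handshake with, so the same scan comes back empty.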

All the access control lists, firewall concerns, costs and risks of standard VPN measures are obsolete with software defined perimeter security.

Dynamic Deployment of Perimeter Security

Because software defined perimeter options establish these invisible security ports directly between applications or servers, they’re highly transferable between settings.

They result in a dynamic deployment of perimeter security wherever it’s needed, isolating specific services while keeping them accessible to authorized users.

Certain implementations of these solutions, however, offer more protection than others do.

Most platforms create micro-tunnels with Transmission Control Protocol (TCP), which is widely used and well known to malignant actors.

More competitive approaches involve User Datagram Protocol (UDP), which is much less frequently used and therefore less familiar to potential hackers.

One reason TCP is more commonly used than UDP is its innate error-correction capability, which keeps data in order.

By supplementing UDP with error-correction capabilities similar to TCP’s, competitive software defined perimeter solutions keep data packets in order while relying on a lesser-known protocol for improved security and lower data transmission latency.
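The core of that supplementation is TCP-style sequencing over unordered datagrams. This is a minimal sketch of the idea only (the class and method names are illustrative, not drawn from any vendor's implementation): each datagram carries a sequence number, and a buffer releases payloads strictly in order regardless of arrival order.

```python
from typing import Dict, List

class ReorderBuffer:
    """Toy version of the TCP-style sequencing an SDP solution can
    layer on top of UDP: datagrams may arrive in any order, but
    payloads are released strictly in sequence."""

    def __init__(self) -> None:
        self._pending: Dict[int, bytes] = {}
        self._next_seq = 0

    def receive(self, seq: int, payload: bytes) -> List[bytes]:
        """Accept one numbered datagram; return every payload that is
        now deliverable in order (possibly none)."""
        self._pending[seq] = payload
        ready: List[bytes] = []
        while self._next_seq in self._pending:
            ready.append(self._pending.pop(self._next_seq))
            self._next_seq += 1
        return ready

# Datagrams 0..3 arrive out of order, as UDP permits:
buf = ReorderBuffer()
delivered: List[bytes] = []
for seq, data in [(1, b"B"), (0, b"A"), (3, b"D"), (2, b"C")]:
    delivered.extend(buf.receive(seq, data))

print(b"".join(delivered))  # b'ABCD' -- in order despite arrival order
```

A production tunnel would add retransmission timers and acknowledgments on top of this, but the ordering guarantee is the piece that makes UDP viable for application traffic.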

Thus, when distributed, on-premises Oracle client applications are using such a solution to simultaneously talk to an application server in the Azure cloud for a financial services use case, for example, one of the first things to transpire is the opening of randomly generated UDP ports between the on-premises micro-tunnel gateway and the Azure micro-tunnel gateway.

Security is enhanced by the random generation of the port (whereas many applications rely on standard ports known to all users) and by the fact that most algorithms are trained to home in on TCP ports, not UDP ports.

Once the micro-tunnels are in place, the client application and cloud server application hosts communicate only via their respective micro-tunnel gateways.

Their ports are never exposed to the internet, effectively cloaking them from everyone.
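The random-port step described above can be sketched in a few lines. This is a hedged illustration of the concept, not any product's actual gateway code; the function name and retry count are assumptions:

```python
import random
import socket

def open_random_udp_port(host: str = "127.0.0.1",
                         attempts: int = 32) -> socket.socket:
    """Bind a UDP socket to a randomly chosen ephemeral port, retrying
    on collision -- mimicking how a micro-tunnel gateway might avoid
    the well-known ports scanners probe first."""
    for _ in range(attempts):
        port = random.randint(49152, 65535)   # IANA ephemeral port range
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            sock.bind((host, port))
            return sock
        except OSError:          # port already taken; pick another
            sock.close()
    raise RuntimeError("no free ephemeral UDP port found")

gateway = open_random_udp_port()
print(gateway.getsockname())     # e.g. ('127.0.0.1', 58317)
gateway.close()
```

Because the peer gateway learns the chosen port through the tunnel setup exchange rather than convention, an outside scanner has no fixed target to probe.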

Software Defined Perimeters Offer Unparalleled Advantages

The most robust software defined perimeter implementations offer a pair of advantages competitors don’t.

The first is application level encryption and Public Key Authentication.

Even if attackers did manage to find and access these invisible ports, they’d only get encrypted data.

The second boon, unique to this implementation, is that the gateways themselves are highly available.

All users have to do is implement multiple gateways between sites.

If the micro-tunnel between an on-premises application and AWS, for example, failed for any reason, the tunnel could failover to a secondary gateway for uninterrupted networking connectivity to the AWS service.

Additionally, if there were a failure on the AWS server or site, the data could automatically fail over to the Azure cloud, for instance, for availability at the workload level.
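The failover logic described above reduces to trying gateways in priority order. This is a deliberately simplified sketch under stated assumptions: the gateway callables here are hypothetical stand-ins for real tunnel setup, not an actual SDP API.

```python
from typing import Callable, Sequence

def connect_with_failover(gateways: Sequence[Callable[[], str]]) -> str:
    """Try each micro-tunnel gateway in priority order, falling over to
    the next on failure -- the multi-gateway pattern described above."""
    errors = []
    for connect in gateways:
        try:
            return connect()
        except ConnectionError as exc:
            errors.append(exc)       # record the failure, try the next one
    raise ConnectionError(f"all gateways failed: {errors}")

# Hypothetical gateways: the AWS-side tunnel is down, the Azure one works.
def aws_gateway() -> str:
    raise ConnectionError("primary micro-tunnel unreachable")

def azure_gateway() -> str:
    return "connected via Azure gateway"

print(connect_with_failover([aws_gateway, azure_gateway]))
# connected via Azure gateway
```

The same pattern covers both cases in the text: a secondary gateway at the same site, or a gateway fronting a different cloud entirely.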

Another use case for multi-cloud deployments involves burst performance.

If users had, for example, a three-node OLTP cluster spanning on-premises, Azure, and AWS, they could rely on this implementation of software defined perimeter to burst to large nodes in the cloud for end-of-week or end-of-month tallying, which would otherwise tax their on-premises resources.

If one provider failed for any reason, users could securely go to the other to continue operating.

Complete Security Assurance and Deployment Flexibility

Not only do such software defined perimeter implementations exceed traditional security measures for hybrid and multi-cloud access, but their protocols, encryption, and high availability surpass those of other implementations.

They’re also cloud agnostic for complete flexibility between clouds, enabling users to eschew vendor lock-in with the most effective security for multi-cloud and hybrid usage.

