3 data-driven strategies to secure the atomized network



This article was written by Martin Roesch, CEO of Netography.

In October 1969, the first computer message was successfully sent from a computer at UCLA to another computer at Stanford, announcing the birth of ARPANET. The original design intent of ARPANET was to establish a decentralized communications network that would remain operational even if part of it was destroyed.

As the internet has evolved over the past five decades, we have seen sweeping changes in the way networking and computing models are delivered: from centralized mainframes to the distributed desktop revolution, and then back to centralized data centers, where security controls and data governance could be effectively consolidated.

Now we’re seeing that pendulum swing again – but this time around, the decentralized computing model is an altogether different beast. Today, applications and data are both dispersed and automatically replicated across multiple public cloud environments. They can also live in an on-premises data center or in a dedicated private cloud. And now, two years into a global pandemic, any notion of a network perimeter has been all but obliterated by employee demands to work from anywhere.

Welcome to the atomized network, a fluid computing environment where applications, data, and even system resources are in constant motion. And perhaps no one has benefited more from the emergence of the atomized network than a new generation of opportunistic threat actors who now have a large and fragmented surface at their disposal on which to conduct their attacks.

The security challenges of the atomized network

Security has always been a challenge, but securing the atomized network ups the ante significantly. In 2021, organizations were using an average of 110 cloud-based apps while supporting hundreds of custom apps that ran in the cloud.

The versatility of being able to mix and match different clouds was a boon to IT managers who needed a more responsive and flexible infrastructure. However, as with so many IT decisions, there are tradeoffs to consider. Each cloud environment that a network or application connects to adds more complexity to the equation. And of course, the more distributed your network becomes, the harder it is to see everything in it and everything connected to it.

And then, of course, there’s the most important asset of all: your data. In the atomized network, data now moves seamlessly across these distributed environments through the cloud, as well as to and from a changing fleet of remote workplaces. Each of these unique environments brings with it its own security controls. However, these tools were never meant to work together, nor do they have a common interface to help security managers really understand what’s going on within their networks.

All the complexity created by the atomized network is one of the reasons why companies can take months or even years to realize that their network has been compromised in the first place. IBM estimates that it takes an average of 280 days to identify and contain a breach. And with each day that passes without an attacker being detected, they have the luxury of taking the time to observe, learn, and isolate weak points in their victim’s infrastructure, which can make the difference between a minor incident and a massive breach.

Three data strategies to defend the atomized network

While no one knows what the network of the future will look like, it’s likely to only decentralize as enterprises seek to further offload their applications and workloads to the cloud. So, given these challenges, what steps should security teams take to protect the atomized network? Consider the following:

1. Leverage network metadata as a primary source of threat intelligence

Detecting and responding to conventional network threats has always required deep packet inspection appliances deployed in all network environments. The rapid adoption of zero-trust initiatives, with their pervasive encryption of network traffic, is increasingly rendering deep packet inspection blind. As zero trust becomes the norm, the usefulness and practicality of deep packet inspection will drastically decrease. After all, you can’t inspect traffic that you can’t decipher, and there is no longer an obvious place to deploy these “middle box” appliances; in the atomized network, there is no more middle.

However, this does not mean that the enterprise is unable to analyze the encrypted traffic it sees on the network. As NIST points out, “The enterprise can collect metadata about the encrypted traffic and use that to detect possible malware communicating on the network or an active attacker. Machine learning techniques… can be used to analyze traffic that cannot be decrypted and examined.” It should also be noted that any attacker who has successfully penetrated a network must use that same network to move around and elevate privileges, and despite their best efforts, they will invariably leave a trail in the form of network metadata. The ability to collect and analyze network metadata in real time will therefore become a critical capability for modern security teams.
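To make this concrete, below is a minimal sketch, in Python, of what metadata-only detection can look like. It scans hypothetical flow records (the kind of metadata that NetFlow, IPFIX, or cloud flow logs produce) for two coarse signals an intruder might leave without any payload ever being decrypted: a host talking to an unusually large number of peers, and a host sending an unusually large volume of data outbound. The record fields, thresholds, and function names are assumptions made for illustration, not any vendor’s actual implementation.

```python
# Minimal sketch: flagging suspicious behavior from flow metadata alone
# (no payload inspection). Record fields and thresholds are illustrative
# assumptions, not a specific product's schema.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class FlowRecord:
    src_ip: str    # source address
    dst_ip: str    # destination address
    dst_port: int  # destination port
    bytes_out: int # bytes sent by the source


def detect_anomalies(flows, fanout_threshold=50, exfil_bytes=50_000_000):
    """Return simple alerts derived purely from flow metadata."""
    peers_per_host = defaultdict(set)
    bytes_per_host = defaultdict(int)
    alerts = []

    for f in flows:
        peers_per_host[f.src_ip].add(f.dst_ip)
        bytes_per_host[f.src_ip] += f.bytes_out

    for host, peers in peers_per_host.items():
        # A host suddenly talking to many peers can indicate
        # lateral movement or scanning.
        if len(peers) > fanout_threshold:
            alerts.append((host, f"high fan-out: {len(peers)} unique peers"))

    for host, total in bytes_per_host.items():
        # Unusually large outbound volume can indicate staging or exfiltration.
        if total > exfil_bytes:
            alerts.append((host, f"high outbound volume: {total} bytes"))

    return alerts


if __name__ == "__main__":
    # A couple of synthetic records: one scanning host, one heavy sender.
    sample = [
        FlowRecord("10.0.0.5", f"10.0.1.{i}", 445, 1_000) for i in range(60)
    ] + [FlowRecord("10.0.0.9", "203.0.113.7", 443, 80_000_000)]
    for host, reason in detect_anomalies(sample):
        print(host, "->", reason)
```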

2. Go beyond binary security checks

For the past two decades, whitelisting and blacklisting applications and discrete entities have served as a practical first line of defense. However, maintaining these lists is not only tedious, it also fails to address how threat actors have evolved their tactics. Just as sophisticated hackers have quickly adapted to evade signature-based detection tools, they have also found new ways to circumvent these methods. This problem becomes especially pronounced in the atomized network, where entry and exit points abound and every minute counts. While these types of methods and tools will likely continue to serve a function in the security team’s toolkit, defending the atomized network will require the ability to interpret and act decisively on a far wider array of data.
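As a hypothetical illustration of the difference, the snippet below contrasts a binary list lookup with a simple weighted score that combines several behavioral signals. The signals, weights, and threshold are invented for the example; real systems would draw on far richer telemetry.

```python
# Illustrative contrast between a binary list lookup and a weighted,
# multi-signal score. All signals, weights, and the threshold are
# invented for the example.
BLOCKLIST = {"198.51.100.23", "192.0.2.77"}


def binary_verdict(ip: str) -> bool:
    """Old model: the entity is either on the list or it isn't."""
    return ip in BLOCKLIST


def risk_score(signals: dict) -> float:
    """Newer model: combine several weak signals into a single score."""
    weights = {
        "new_destination": 0.3,   # host never contacted this destination before
        "off_hours": 0.2,         # activity outside normal working hours
        "rare_port": 0.2,         # destination port rarely seen on this network
        "threat_intel_hit": 0.5,  # destination appears in a threat feed
    }
    return sum(weights[name] for name, present in signals.items() if present)


observed = {"new_destination": True, "off_hours": True,
            "rare_port": False, "threat_intel_hit": False}

print(binary_verdict("203.0.113.7"))  # False: not on the list, so it passes
print(risk_score(observed) >= 0.5)    # True: combined behavior still warrants review
```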

3. Enrich your data sources to provide behavioral context

Data enrichment strategies should be considered a critical factor in effective threat detection, threat investigation, and remediation. Enrichment adds event and non-event contextual information to security event data, turning raw data into meaningful insights. It’s also important to be able to enrich the data in real time and supplement it with business and threat intelligence.
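The sketch below shows one simple form this can take: a raw event is decorated with assumed business context (asset owner, environment) and a threat-intelligence match, and a priority is derived from the combination. The lookup tables are placeholders for whatever asset inventory, geolocation, or intelligence sources an organization actually uses.

```python
# Minimal enrichment sketch: raw event in, contextualized event out.
# The lookup tables are placeholders for real asset inventories,
# GeoIP databases, and threat-intelligence feeds.
ASSET_CONTEXT = {
    "10.0.0.5": {"owner": "finance-team", "environment": "production"},
}
THREAT_INTEL = {
    "203.0.113.7": {"listed": True, "category": "known C2 infrastructure"},
}


def enrich(event: dict) -> dict:
    """Attach business and threat context to a raw security event."""
    enriched = dict(event)
    enriched["asset"] = ASSET_CONTEXT.get(event["src_ip"], {"owner": "unknown"})
    enriched["intel"] = THREAT_INTEL.get(event["dst_ip"], {"listed": False})
    # Derived field: a production asset talking to listed infrastructure
    # is far more interesting than either fact on its own.
    enriched["priority"] = (
        "high"
        if enriched["intel"]["listed"]
        and enriched["asset"].get("environment") == "production"
        else "normal"
    )
    return enriched


raw = {"src_ip": "10.0.0.5", "dst_ip": "203.0.113.7", "dst_port": 443}
print(enrich(raw)["priority"])  # "high"
```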

While many security managers struggle to gain full visibility into their atomized networks, collecting vital metadata from all of the disparate systems on those networks is the best way to protect them. It provides attack visibility and detection, along with reusable integrations that eliminate blind spots, block threats, and alert on malicious traffic.

Martin Roesch is the CEO of Netography, the security company for the atomized network.


