Active Combat in the Era of Continuous Compromise

The question is no longer if a breach will occur, only when. We have entered the era of ‘continuous compromise’, and organizations must shift from a purely defensive posture to one of active hunting and remediation.

When a breach is detected, it’s important to have instant visibility from multiple viewpoints in order to understand the breach, scope the damage, and succeed at evicting an attacker who likely has multiple backdoors and other less obvious points of entry, such as VPN access using stolen user credentials. When investigating, incident responders need access to log files from multiple sources, endpoint forensic data, endpoint memory, network traffic information, and, if at all possible, full packet capture at the internet connections. If any of these sources are missing, a compromised company should enable them as soon as it believes it has a problem.
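
As a rough illustration of the packet-capture piece, here is a minimal Python sketch using the scapy library (an assumption; the article names no tooling) that writes rotating pcap files from a capture interface. In practice, a dedicated capture appliance or tcpdump with file rotation at the internet egress is the more common approach.

```python
# Minimal packet-capture sketch using scapy (assumed available: pip install scapy).
# Capturing requires elevated privileges; the interface name is hypothetical.
from datetime import datetime

from scapy.all import sniff, wrpcap

IFACE = "eth0"          # hypothetical egress capture interface
BATCH_SIZE = 10_000     # packets per capture file

def capture_batch() -> str:
    """Capture one batch of packets and write it to a timestamped pcap file."""
    packets = sniff(iface=IFACE, count=BATCH_SIZE)
    filename = f"capture-{datetime.utcnow():%Y%m%d-%H%M%S}.pcap"
    wrpcap(filename, packets)
    return filename

if __name__ == "__main__":
    while True:  # run until interrupted; rotate into a new file per batch
        print(f"wrote {capture_batch()}")
```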

Time is of the essence as well. The longer the attacker’s digital footprints go undetected, the harder it becomes to piece together the chronology and root cause of the breach. The ability to look back in time should reach to the beginning of the attack lifecycle, which can be years before the incident is discovered, and even to the initial failed probes for weaknesses in defenses that preceded the successful break-in.

Additionally, it’s important to scale visibility. Here’s a common example: if you know the attacker is using the Windows Help folder to copy files to and from the systems they access, you can scope a breach far more rapidly and accurately if you can reach out to all your endpoints at once, looking for anomalous files in that folder. You also need a mechanism to reach systems that are off the corporate network when you conduct that search. If at all possible, integrate all points of visibility into a common interface so analysts get a holistic view instead of bouncing between multiple one-trick point solutions. Incident responders are far more effective when they don’t have to keep a mental tab on multiple windows and copy and paste text back and forth from product to product.
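
To make the Help-folder example concrete, here is a minimal single-host sketch in Python. The known-good baseline is hypothetical (in practice it would be built from a clean reference system), and at scale this logic would be fanned out to every endpoint through an EDR or remote-execution layer, including hosts off the corporate network.

```python
# Minimal sketch: flag unexpected files in the Windows Help folder on one host.
import os

HELP_DIR = r"C:\Windows\Help"

# Hypothetical baseline of lowercase filenames known to belong in the Help
# folder, e.g. collected from a clean reference build.
KNOWN_GOOD = {"desktop.ini", "windows.hlp"}

def find_anomalies(root: str = HELP_DIR) -> list[str]:
    """Return paths under root whose filenames are not in the known-good baseline."""
    anomalies = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower() not in KNOWN_GOOD:
                anomalies.append(os.path.join(dirpath, name))
    return anomalies

if __name__ == "__main__":
    for path in find_anomalies():
        print(path)
```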

Without that scalable, holistic visibility, security analysts and incident responders will have a very difficult time reconstructing the attack and providing accurate information on exactly what was stolen or sabotaged. Enter ‘data decay’: as evidence erodes, the attack effectively becomes more efficient, and gaps open up in the knowledge of what happened and just how bad the breach was. Kicking the attacker out becomes more difficult and may require multiple attempts. In some cases, the attacker still has access and is simply waiting for things to settle down before resuming activity. This kind of uncertainty can be devastating. Public perception of the breached company is destroyed because people feel it is withholding information that affects them personally, when the reality is that there are many unknowns despite the tremendous effort expended to halt the breach and remediate compromised systems.

As a general guideline, organizations should retain one year of logged data from the following sources, with three months online and ready to search at a moment’s notice: server event logs; web server logs and logs from other critical services; and proxy, firewall, DHCP, DNS, VPN, antivirus, and anti-malware logs. Both successful and failed events should be logged; many companies log only failed attempts and end up unable to figure out which systems an attacker successfully connected to. If you can afford the space, add workstation logs and increase retention beyond one year, since attackers thrive on performing their activity from seemingly random workstations. Also consider any regulatory compliance requirements that call for more than one year.
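
As a sketch of what that retention tiering might look like for flat log files, assuming a hypothetical directory layout (most deployments would instead rely on their SIEM or log platform’s built-in retention tiers):

```python
# Retention sketch: keep 90 days of logs online and searchable, archive up to
# one year, delete anything older. Paths and thresholds are hypothetical.
import shutil
import time
from pathlib import Path

ONLINE_DIR = Path("/var/log/collected")    # hot, searchable storage
ARCHIVE_DIR = Path("/mnt/archive/logs")    # cold storage
ONLINE_DAYS = 90
RETAIN_DAYS = 365

def tier_logs() -> None:
    now = time.time()
    ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
    for logfile in ONLINE_DIR.glob("*.log"):
        age_days = (now - logfile.stat().st_mtime) / 86400
        if age_days > ONLINE_DAYS:
            # Past the online window: move from hot storage to the archive.
            shutil.move(str(logfile), ARCHIVE_DIR / logfile.name)
    for archived in ARCHIVE_DIR.glob("*.log"):
        if (now - archived.stat().st_mtime) / 86400 > RETAIN_DAYS:
            archived.unlink()  # past the one-year retention window

if __name__ == "__main__":
    tier_logs()
```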

After the investigation is complete, organizations should secure and archive all collected data in case it’s needed in the future, such as when an attacker lies dormant for a few months after a failed remediation and then resumes activity. If remediation failed, something was obviously missed, and it could be grievous not to have access to the original data set. Another reason to retain the data is the possibility of legal action. Lastly, elements of the data make for powerful threat intelligence: observed attacker habits such as file naming conventions and the hidden folders they copy files to and from, information about their hacking tools, and forensic artifacts left behind when tools are deleted, such as registry modifications. Anything you see that’s unique to the attacker in your environment should be used as threat intelligence to monitor proactively going forward, both to ensure remediation worked and to detect the next break-in attempt. Apply that threat intelligence at the endpoint, in network traffic, and in log files. That kind of feedback loop from all points of visibility is the future of security; it is what’s required to combat modern threats in the era of continuous compromise.
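
As a final sketch, here is what a simple indicator-matching loop over log data might look like in Python. The indicator patterns and the log filename are hypothetical placeholders for the attacker-specific habits described above.

```python
# Minimal indicator-matching sketch: scan a stream of log lines for
# attacker-specific indicators observed during a past investigation.
import re
from typing import Iterable

# Hypothetical indicators: a file naming habit, a staging folder, and a
# registry key left behind by a deleted tool.
INDICATORS = [
    re.compile(r"\b[a-z]{2}\d{4}\.rar\b", re.IGNORECASE),   # archive naming habit
    re.compile(r"\\Windows\\Help\\", re.IGNORECASE),        # staging folder
    re.compile(r"HKLM\\SOFTWARE\\badsvc", re.IGNORECASE),   # registry artifact
]

def match_indicators(lines: Iterable[str]) -> Iterable[tuple[int, str]]:
    """Yield (line_number, line) for every log line that hits an indicator."""
    for lineno, line in enumerate(lines, start=1):
        if any(pattern.search(line) for pattern in INDICATORS):
            yield lineno, line

if __name__ == "__main__":
    with open("proxy.log", encoding="utf-8", errors="replace") as fh:
        for lineno, line in match_indicators(fh):
            print(f"{lineno}: {line.rstrip()}")
```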

Lucas Zaichkowsky

Lucas Zaichkowsky is the Enterprise Defense Architect at AccessData, responsible for providing expert guidance on the topic of cybersecurity. Prior to joining AccessData, Lucas was a Technical Engineer at Mandiant, where he worked with Fortune 500 organizations, the Defense Industrial Base, and government institutions to deploy measures designed to defend against the world’s most sophisticated attack groups.
