Here we are again – our fourth installment of the DBIR series (sixth if you count the ’08 and ’09 mid-year supplementals). To our readers, it may seem like the 2010 DBIR was published ages ago. To us, it feels more like yesterday. The expanding scope and increasing depth of the report make it almost one continuous effort throughout the year. It is, however, a labor of love, and we’re very glad to be sharing our research into the world of data breaches with you once again.
While it isn’t exactly groundbreaking news that attackers are using social media to control their malware botnets, certain aspects are notable:
In each of these cases, the attackers’ remote activity looked like normal SSL-encrypted traffic to popular Internet sites, making it nearly impossible for packet inspection and netflow anomaly analysis tools to differentiate malicious from benign activity.
If you can’t read the packet because it’s encrypted, it is very difficult to detect what it is doing.
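Since the payload itself is unreadable, detection has to lean on traffic metadata instead. One often-cited heuristic (a toy sketch with invented numbers, not a production detector): encrypted C2 tends to "beacon" home at regular intervals, and that regularity shows up in connection timing even when the content does not.

```python
# Toy beaconing heuristic: low variation in inter-arrival times to one
# destination is suspicious, because human browsing is bursty.
from statistics import mean, pstdev

def beaconing_score(timestamps):
    """Coefficient of variation of gaps between connections.
    Near 0 means suspiciously machine-regular timing."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) < 2 or mean(gaps) == 0:
        return None  # not enough data to judge
    return pstdev(gaps) / mean(gaps)

# Invented example: malware checking in every ~60s vs. a person browsing
bot = [0, 60, 121, 180, 241, 300]
human = [0, 4, 90, 95, 400, 460]
assert beaconing_score(bot) < beaconing_score(human)
```

This is exactly the kind of analysis that still works when the packets are opaque – it never looks at a byte of payload.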
Prevention efforts will typically not work against APT, Mandiant said. Instead of trying to stop APT intruders from using legitimate sites to compromise their networks, organizations should make it difficult for the APT intruders to stay in the breached network, ultimately making them “too expensive” to attack, according to Mandiant.
This is achieved when the security team can determine what the attacker is doing and anticipate what the attacker will do next, Mandiant said. Organizations need to increase visibility across the enterprise by incorporating specialized monitoring systems that provide host- and network-based visibility, increased logging, and log aggregation, Mandiant said.
Host-based detection tools look for indicators that the host has been compromised as well as signs of the tools, tactics and procedures used by the attacker. These tools can find unknown malware because they aren’t looking for signatures the way traditional anti-virus does, the researchers said. Network-based tools do the same search on network traffic. Mandiant researchers listed nine different logs security managers should be looking at regularly, including internal DNS server logs, DHCP logs, internal Web proxy logs, firewall logs with ingress/egress TCP header information, and external Webmail access logs. Log aggregation tools help managers correlate information from numerous sources, highlight critical information and index all information for easy searching. The security team can use all the information to effectively detect and remove the compromised host, repeatedly forcing the attacker to start over to regain control, Mandiant said.
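The aggregate-and-index idea above is simple enough to sketch. In this toy example (the log formats and the indicator domain are invented for illustration), entries from two of the sources Mandiant mentions get indexed by the hostnames they contain, so one lookup correlates an indicator across sources:

```python
import re
from collections import defaultdict

# Invented sample lines from two of the log sources Mandiant lists
dns_log = ["2011-04-19 10:02 host-17 lookup evil-c2.example.net"]
proxy_log = ["2011-04-19 10:03 host-17 GET https://evil-c2.example.net/check"]

# Index every domain-shaped token: indicator -> list of (source, raw line)
index = defaultdict(list)
for source, lines in [("dns", dns_log), ("proxy", proxy_log)]:
    for line in lines:
        for token in re.findall(r"[\w.-]+\.\w+", line):
            index[token].append((source, line))

# One search now surfaces the same indicator in every source at once
hits = index["evil-c2.example.net"]
assert {src for src, _ in hits} == {"dns", "proxy"}
```

Real aggregators do vastly more (normalization, time correlation, retention), but the core win is the same: search once, see every source.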
Anton Chuvakin was right about logs all along!
Details on an old DoD break-in. I wonder if the activity of the malware was logged anywhere?
In an article for Foreign Affairs, Deputy Defense Secretary William J. Lynn III writes that in 2008, a flash drive believed to have been infected by a foreign intelligence agency uploaded malicious code onto a network run by the military’s Central Command.
“That code spread undetected on both classified and unclassified systems, establishing what amounted to a digital beachhead, from which data could be transferred to servers under foreign control,” Lynn writes. “It was a network administrator’s worst fear: a rogue program operating silently, poised to deliver operational plans into the hands of an unknown adversary.”
Some years ago, I read that the next big shift in web development would be from hand-coding HTML (and everything that goes with webpages) to WYSIWYG editors doing all the coding behind the scenes.
Although that hasn’t fully been realized, can WYSIWYG-easy log analysis tools be on the horizon?
Further, yesterday I was trying to explain the state of the art of log analysis to a client (who wants to use his cool new technology for log analysis and SIEM), and I felt embarrassed to admit that, yes, “search” and “rules” are indeed the state of the art.
In other words, most of the analysis burden is on the tool USER’s BRAIN, not on the TOOL. They looked at me like I just wasted 10 years of my life, writing regexes and otherwise being a stupid monkey. Even things like profiling/baselining (example) or simple – and I mean SIMPLE – data mining (example, details) mostly stay on research drawing boards for ages.
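To be clear about how SIMPLE the simple stuff is: here is a frequency-baselining sketch (toy event stream, invented threshold) that flags rare or never-before-seen event types without the analyst writing a rule for each case.

```python
from collections import Counter

# Toy historical event stream standing in for weeks of logs
history = ["login_ok"] * 500 + ["login_fail"] * 20 + ["config_change"] * 3
baseline = Counter(history)
total = sum(baseline.values())

def is_anomalous(event, threshold=0.01):
    """Flag events whose historical frequency falls below threshold.
    Never-seen events count as frequency 0 and are always flagged."""
    return baseline[event] / total < threshold

assert not is_anomalous("login_ok")
assert is_anomalous("config_change")   # 3/523, below 1%
assert is_anomalous("new_admin_user")  # never seen before
```

That’s a dozen lines, and yet shipping this kind of profiling as a first-class tool feature – rather than leaving it to the user’s brain – is exactly what stays on the drawing board.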
For Leslie Lambert, former CISO at Sun Microsystems who recently joined Juniper Networks Inc. in the same role, assuming that the bad actors behind cybersecurity threats are already inside the network raises the issue of how sensitive data is secured. Juniper has acknowledged that it was among the victims of Operation Aurora.
“If they’re already in, how have you applied the principles of data protection?” she asked.
Historically, the reason key management worked for stored data was that the key could be stored in a secure location: the human brain. People would remember keys and, barring physical and emotional attacks on the people themselves, would not divulge them. In a sense, the keys were stored in a “computer” that was not attached to any network. And there they were safe.
This whole model falls apart on the Internet. Much of the data stored on the Internet is only peripherally intended for use by people; it’s primarily intended for use by other computers. And therein lies the problem. Keys can no longer be stored in people’s brains. They need to be stored on the same computer, or at least the network, that the data resides on. And that is much riskier.
Let’s take a concrete example: credit card databases associated with websites. Those databases are not encrypted because it doesn’t make any sense. The whole point of storing credit card numbers on a website is so it’s accessible — so each time I buy something, I don’t have to type it in again. The website needs to dynamically query the database and retrieve the numbers, millions of times a day. If the database were encrypted, the website would need the key. But if the key were on the same network as the data, what would be the point of encrypting it? Access to the website equals access to the database in either case. Security is achieved by good access control on the website and database, not by encrypting the data.
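Schneier’s point can be made concrete in a few lines. This sketch uses XOR as stand-in “encryption” (deliberately not real crypto) to show the structural problem: the web app has to hold the key to serve every purchase, so an attacker who owns the app gets the plaintext regardless of the encryption at rest.

```python
# Toy illustration only: XOR is NOT real encryption. The point is the
# key's location, not the cipher. The card number is a made-up test value.
KEY = b"app-server-key"   # necessarily lives on the same box as the data

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

stored = xor(b"4111-1111-1111-1111", KEY)   # "encrypted" at rest

def handle_checkout():
    # The app decrypts on every request -- and so can anyone who
    # compromises the app, because the key is right here with it.
    return xor(stored, KEY).decode()

assert handle_checkout() == "4111-1111-1111-1111"
```

Swapping XOR for AES changes nothing about the argument: access to the website still equals access to the card numbers, which is why the real control is access control, not encryption.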
If you harden the interior of your organization, where do you store the keys? How difficult are they to find? (And that’s a rhetorical question, do not tell me in the comments where your keys are.)