Threats to corporate data are pervasive, and no industry is immune to data breaches. Organizations as varied as Sony, Target and Anthem have seen millions of records and multiple terabytes of data leave their networks unseen. The fact that these large enterprises have the resources to invest in any cybersecurity solutions they choose hasn't seemed to matter. The preventive measures they had in place didn't stop them from getting infected, nor did they keep massive amounts of data from being stolen. Anthem lost an estimated 80 million records, Target lost 110 million, and Sony had more than a terabyte of data leave its network. The chance that this could happen to any organization, anywhere, is what keeps CIOs awake at night, and it's the reason data security has become a boardroom-level discussion. This whitepaper discusses the challenges of protecting corporate data and why a new approach is needed: one that puts equal emphasis on stopping inbound threats and finding evasive infections to dramatically reduce data loss.
The Impact of Cloud Computing
The unabated growth of cloud computing is another factor that has increased data security challenges. Studies show that businesses are moving to the cloud in greater numbers every year, driven by the efficiency, economies of scale and cost savings it offers. SaaS platforms from CRM to payroll are becoming the standard, while newer cloud services such as PaaS (platform-as-a-service), DBaaS (database-as-a-service) and others gain ground. Unfortunately, the benefits of cloud computing come with risks, and security is frequently cited as the primary reason organizations resist embracing the cloud in full. This is understandable. The explosion of mobile devices, thousands of mobile apps and the nature of today's widely distributed office locations and roaming workforce have introduced new threat vectors that organizations can't ignore. Anthem's records, for instance, were reportedly being siphoned to a public cloud service over an extended period of time before being discovered. Despite this evidence, the move to the cloud will continue, and until organizations adopt a change in security strategy, so will data loss.
A Traditional Approach Isn’t Working
After a major breach, commentary frequently focuses on the failure of technology to stop the offending malware. High-profile breaches quickly spawn blogs and articles addressing the failure of security vendors to develop adequate preventive technologies. This is compounded by analysts who warn organizations that they "will be infected, so deal with it." And while security vendors continue to tout their ability to stop even the most sophisticated APTs and targeted threats, a non-stop procession of high-profile breaches seems to indicate otherwise. This doesn't mean the drive to improve malware detection isn't important, but organizations must go beyond preventive measures and face the fact that even though they may not have lost any data yet, their networks are infected now.
Dwell Time and the Security Gap
Though your data security problem begins with incoming APTs and malware, the real damage occurs when unauthorized data leaves the network. The massive numbers of records and even terabytes of data lost raise the question, "How could that much data leave without being noticed?" The average time between when a network gets infected and when that infection is detected is called dwell time. The average dwell time window was most recently estimated at 205 days. Imagine how much data could leave the network in two days, let alone 205. Adding insult to injury is the fact that a majority of data breaches are actually reported by a third party, someone outside the organization. Clearly, a critical factor in minimizing losses is to close this gap, reduce dwell time and keep the data from leaving. If this process can occur in as close to real time as possible, organizations are in a good position to minimize their losses.
The Consequences of a Data Breach
There are good reasons for organizations to worry about their data because the damage from a data breach can be devastating with far-reaching effects, including:
Sales loss: Organizations can suffer significant revenue loss as sales plummet following a publicized data breach. The infamous Target data breach, which involved the loss of 40 million credit card numbers, resulted in a 16% decline in sales compared with the same period the previous year.
Cost of remediating the data breach: According to Ponemon Research, the cost of investigating and remediating a data breach is approximately $200 per record. This figure includes factors such as hiring forensic investigative experts, outsourcing additional hotline support, customer notifications and discounts for damage control, legal work and other expenses.
Regulatory fines and lawsuits: The fines resulting from regulatory violations and the cost of litigation can add up quickly. Protracted lawsuits can deplete resources and cost millions to adjudicate. Anthem, for example, is dealing with multiple class action and individual lawsuits resulting from their breach.
Interruption of operations: The cost of unplanned downtime, which is often a consequence of a major data breach, can be significant. A survey by IT Trust Curve found that the average cost of business interruptions due to a data breach was almost $500K per incident.
Brand and Reputation Damage: These consequences are more difficult to quantify, but no one doubts their existence. Damage control after your brand has been tarnished is not only costly, it can take many months, if not years, to rebuild your brand equity after a significant data loss.
Standard Security Solutions Aren’t Enough to Stop APTs and Data Loss
With the vast array of new security tools and features continuously being released, if all malware could be stopped at the gateway, it would have happened by now. This is not to say that preventive defense isn't important. We need to continue to focus on inbound detection as a critical defense strategy, but clearly inbound protection alone is not going to do the job.
Signature and Heuristic AV Technology
Although having a robust signature/AV database is an essential weapon in your security arsenal, using it as your only protection is risky. This is because it can only block known malware, and today's sophisticated hackers are continuously creating new exploits that leverage polymorphic viruses, encrypted malware or other malicious code that is signatureless and unknown, designed to evade signature/heuristic-based detection.
Sandboxing is Important but it has Weaknesses
Sandboxes provide virtual environments where suspicious files can be isolated and executed, allowing IT security to analyze them and issue alerts based on their findings. Because the value of sandboxing has been recognized, it is now integrated into operating systems, applications and browser technologies. And while it has become an important part of APT and zero-day protection, relying on sandboxing technology alone won't provide the protection you need.
Encrypted malware and polymorphic viruses – Sandboxes have to spot malware in your inbound communications; if the malware is encrypted, or the virus can morph into something that appears benign, the sandbox won't detect it.
Windows OS – Sandboxes are designed to analyze only Windows OS files, so malware built for other operating systems won’t be recognized. In addition, some malware is designed to exploit Windows flaws by doing version checks and executing commands when they find a version with a vulnerability.
Malware designed to evade – Hackers create APTs that can sniff out whether or not they are in a sandbox and the malware goes dormant, appearing benign to the sandbox. Because the typical sandbox only analyzes malware for a specific time period, this malware can be executed based on a variety of evasive techniques and triggers including:
Triggered by human action – Hackers are using malware that can be triggered by a mouse click or opening a dialog box, resulting in a C&C callback or other malicious action.
Malware that detects the sandbox – If the malware detects any of a number of factors indicating it’s in a sandbox, it goes to sleep until it can execute from within the network.
Hiding in non-executable files – Another tactic for evasion is to hide executable malware by embedding it in innocuous files such as GIFs or Adobe Flash files, where the sandbox will miss it.
Malware downloaded in stages – Malware is typically delivered via a dropper file that is downloaded to the network. Sandboxing is effective at detecting these downloads and isolating the files, but if the exploit downloads files in stages, the sandbox can easily miss subsequent downloads, thinking it has caught the malware.
Today's Advanced Threats Require a New Focus
The following security approaches are must-haves for minimizing the consequences of a criminal data breach.
A comprehensive security strategy for today’s threat environment demands a new approach that doesn’t ignore prevention, but balances it with technologies that detect and respond to the evasive malware that gets past even the most robust inbound gateway defense.
New Technologies can Shorten Dwell Time, Contain Data and Minimize Loss
Even though today's cybersecurity cannot stop 100% of malware, there are new technologies that can boost your security profile without straining your budget. These technologies can help organizations broaden their perspective by implementing security strategies that focus equally on preventive, pre-infection measures to keep threats out, and post-infection detection and containment to find active infections and contain unauthorized data transfers. Solutions that can find evasive threats, and more importantly contain data exfiltration even before an infection is detected, will be crucial to effective cybersecurity going forward.
Comprehensive Visibility and Continuous Monitoring
Given the reality that even the best preventive technology will not stop 100% of malware, organizations must be prepared to deal with the fact that there will be infections on their networks. Whether it is a bot that got past your security and is hiding until it can execute a C&C callback, or malware from a BYOD user logging into the network after being offsite, you must acknowledge the strong possibility that a sophisticated piece of malware will get through your inbound defenses. The point-in-time audits and compliance checks of the past are no longer sufficient to protect against today's fast-moving threat cycle.
Organizations need to continuously monitor all traffic and make sure their solution has visibility over the full Web stream. When evaluating your security solution, be aware that many legacy solutions focus only on ports 80 and 443 and lack the capacity to monitor all 131K inbound and outbound data channels. This is a critical capability, since sophisticated exploits increasingly use high ports and hidden protocols on streaming UDP channels to deliver malware.
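The difference between watching only ports 80/443 and watching every channel can be illustrated with a minimal sketch. The flow records, port set and byte threshold below are all hypothetical, standing in for whatever a NetFlow/IPFIX export or packet capture would actually supply; no vendor's implementation is implied.

```python
# A monitor limited to ports 80/443 would never see traffic on other
# channels; inspecting every port and protocol surfaces it.
# Flow records are hypothetical tuples: (protocol, dst_port, bytes_out).
WELL_KNOWN = {25, 53, 80, 443}  # illustrative "expected" port set

def flag_unusual_channels(flows, byte_threshold=1_000_000):
    """Flag large outbound flows on ports outside the expected set."""
    suspicious = []
    for proto, port, nbytes in flows:
        if port not in WELL_KNOWN and nbytes >= byte_threshold:
            suspicious.append((proto, port, nbytes))
    return suspicious

flows = [
    ("tcp", 443, 120_000),        # ordinary HTTPS session
    ("udp", 53, 4_000),           # ordinary DNS lookups
    ("udp", 61500, 250_000_000),  # large transfer on a high UDP port
]
print(flag_unusual_channels(flows))
```

A port-80/443-only monitor would report nothing for this traffic; the full-channel view flags the high-port UDP transfer immediately.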
Network Baselining for Anomaly Detection
Network anomaly detection leverages outbound traffic visibility and continuous monitoring to detect anomalies in data transfers that can signify compromise. The beauty of this technology is that it can stop data loss even before an active infection is detected. Imagine if Sony or Anthem IT personnel had been aware of suspicious data transfers early on, when only a few hundred kilobytes had left the network.
The first step in implementing this technology is to establish baselines for normal network traffic. To be most effective and accurate, baselining should be granular. For example, you may need the ability to distinguish weekday from weekend traffic, or measure different times of day. The next consideration will be what you measure to detect anomalies. If you have certain servers that hold sensitive information, you would want to set narrower parameters around traffic leaving them. Geo-location is another essential factor, since many damaging exploits emanate from the same high-risk countries or regions. Once this solution is functioning, the initial settings you establish should be reviewed and adjusted as your organization evolves, in order to maintain the effectiveness of this technology.
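The baselining steps above can be sketched in a few lines. This is a simplified illustration, assuming mean-plus-k-standard-deviations thresholds and weekday/hour bucketing; the class name, bucket keys and multiplier are all invented for the example, and a production system would use richer statistics and more dimensions (server, geo-location, user).

```python
import statistics
from collections import defaultdict

def bucket(weekday, hour):
    # Granular baseline key: weekday vs. weekend, plus hour of day.
    return ("weekend" if weekday >= 5 else "weekday", hour)

class Baseline:
    """Learn normal outbound volume per time bucket; flag outliers."""

    def __init__(self, k=3.0):
        self.samples = defaultdict(list)
        self.k = k  # how many standard deviations counts as anomalous

    def observe(self, weekday, hour, bytes_out):
        self.samples[bucket(weekday, hour)].append(bytes_out)

    def is_anomalous(self, weekday, hour, bytes_out):
        hist = self.samples.get(bucket(weekday, hour))
        if not hist or len(hist) < 2:
            return False  # not enough history to judge yet
        mean = statistics.mean(hist)
        stdev = statistics.pstdev(hist)
        return bytes_out > mean + self.k * max(stdev, 1.0)
```

During the learning phase the baseline only observes; once enough history accumulates, a transfer far above the bucket's norm (say, hundreds of megabytes from a server that normally sends kilobytes) is flagged even though no malware signature ever matched.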
Automatic Data Exfiltration Containment
In addition to continuous monitoring and anomaly detection, an advanced cybersecurity solution must also be able to leverage packet-level visibility and control to automatically contain data transfers when compromised traffic is detected. Technology that can find active malware infections and then contain the data the malware is trying to exfiltrate, in real time, will transform a potentially catastrophic data loss into a minor data glitch.
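The containment idea reduces to a simple policy: once a host's outbound traffic is flagged, drop its subsequent packets instead of forwarding them. The sketch below is purely illustrative; the class name, the detector callback and the string verdicts are invented for the example, and real containment would act at the firewall or inline-gateway layer.

```python
class ExfiltrationGuard:
    """Toy containment policy: block a host at first detection."""

    def __init__(self, detector):
        # detector is any callable(host, nbytes) -> bool that decides
        # whether an outbound transfer looks like exfiltration.
        self.detector = detector
        self.blocked = set()

    def handle_packet(self, host, nbytes):
        if host in self.blocked:
            return "drop"                # already contained
        if self.detector(host, nbytes):
            self.blocked.add(host)       # contain at first detection
            return "drop"
        return "forward"
```

The detector could be the anomaly baseline described earlier; the point is that containment happens automatically on the first flagged packet, so only kilobytes leave the network rather than the terabytes lost during a 205-day dwell window.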
One of the issues cited in the Target breach, where 110M sensitive records were exposed, was the problem of noise. Reportedly, the IT team was inundated with alerts to the point where resolving each one individually, in a timeframe that might have reduced the data loss, was impossible. This has become a pervasive problem, with many vendors claiming to have solutions that reduce noise and false positives. If your organization isn't prepared to hire an army of IT pros to investigate each alert as it happens, you will want to find a solution that prioritizes threats to deliver actionable intelligence. Otherwise, one bot issuing continuous call-back attempts may also spark continuous alerts.
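The noise problem can be made concrete with a small deduplication-and-ranking sketch. This is a minimal illustration, not any vendor's actual pipeline; the alert tuples, severity scale and sort key are all assumptions made for the example.

```python
from collections import defaultdict

def prioritize(alerts):
    """Collapse repeated alerts into one incident per (host, type),
    ranked by severity and then by repeat count, so one chatty bot
    doesn't flood the queue with thousands of identical entries.

    alerts: iterable of (host, alert_type, severity) tuples.
    """
    incidents = defaultdict(lambda: {"count": 0, "severity": 0})
    for host, alert_type, severity in alerts:
        inc = incidents[(host, alert_type)]
        inc["count"] += 1
        inc["severity"] = max(inc["severity"], severity)
    return sorted(incidents.items(),
                  key=lambda kv: (kv[1]["severity"], kv[1]["count"]),
                  reverse=True)
```

A thousand identical C&C-callback alerts from one bot collapse into a single incident with a repeat count, while a lone high-severity exfiltration alert rises to the top of the queue instead of drowning in the noise.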
Delivery Through the Cloud Can Increase Efficiency and Lower Costs
With more business processes being conducted direct-to-cloud, it makes sense that your cybersecurity solution should be cloud-based as well. Choosing a solution that can deliver advanced threat defense via a pure-cloud configuration will allow you to take advantage of cloud benefits:
A pure cloud solution should eliminate the need for costly hardware and software investments and simplify integration into even the most complex, distributed environments
It can lower TCO by eliminating expensive hardware purchases and upgrades and by reducing the IT resources required to manage a complex on-premises solution
Cloud cybersecurity should encompass all your locations and mobile users in your security policies and compliance without having to backhaul data through corporate headquarters
It should provide multitenancy, infinite scalability, and rapid elastic load balancing so that you can expand to accommodate growth instantly, without jeopardizing network availability
The right cloud solution will offer multiple options for traffic redirection so that encompassing all your users and locations is efficient, including GRE, IPSEC, WCCP, agents and native integration for iOS and Android
Because standard cloud-based solutions typically mix organizations' data in the public cloud, look for a solution with tools and safeguards that can ensure the safety of your data in the public cloud
If you're reluctant to move your cybersecurity to the cloud, look for a solution that provides private cloud and hybrid options. This will allow you to customize your configuration to suit your organization's exact requirements.