This article was originally posted on DarkReading.

Bandwidth, boredom and cognitive bias are three weak spots that prevent analysts from identifying threats. Here's how to compensate.

Even if you have dozens of point security products, security analysts are still your final line of defense. You've tasked them with evaluating the thousands of events your security products generate to determine whether something harmful is lurking in your environment. This is a daunting responsibility in the face of expanding data volumes.

To put it into perspective, a recent Ponemon study shows that in a typical week, an organization may receive 17,000 malware alerts. If the company has three to five dedicated security analysts, each would have to review roughly 3,400 to 5,700 alerts per week.
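
To see how little time that leaves for each alert, consider a quick back-of-the-envelope calculation. The 40-hour week and the even split of alerts below are simplifying assumptions, not figures from the study:

```python
# Back-of-the-envelope: how much time does each analyst get per alert?
# Assumes a 40-hour week and an even split of alerts across analysts.
WEEKLY_ALERTS = 17_000
MINUTES_PER_WEEK = 40 * 60

for analysts in (3, 5):
    per_analyst = WEEKLY_ALERTS / analysts
    print(f"{analysts} analysts: {per_analyst:,.0f} alerts each, "
          f"~{MINUTES_PER_WEEK / per_analyst:.1f} minutes per alert")

# 3 analysts: 5,667 alerts each, ~0.4 minutes per alert
# 5 analysts: 3,400 alerts each, ~0.7 minutes per alert
```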

Analysts, being human, have three weak spots, and both they and their managers must be aware of them to avoid missed threats.

Bandwidth

Investigating each security alert is tedious, and the volume of alerts continues to grow at an unprecedented rate. Hiring to keep up isn't a viable option because of skill-set and budget constraints. As a result, analysts are overwhelmed by the number of alerts they must process every day. This fatigue leads to individuals rushing through investigations, with a strong tendency to skip key steps, increasing the probability of missed breaches.

Boredom

The nature of security operations (SecOps) is that the system evaluates millions or billions of events each day, and only a tiny percentage are suspect. Of those, analysts review thousands, and only a few merit further escalation. Boredom leads to complacency, which leads to low job satisfaction, contributing to lower performance and higher attrition. The key is to automate much of the routine workflow so that analysts stay focused on investigating real problems.
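
As a rough illustration of what that automation can look like, here is a minimal sketch of a triage rule that auto-closes alerts matching verified-benign patterns so analysts only see the remainder. The alert fields and suppression rules are hypothetical stand-ins for a real SIEM or SOAR feed:

```python
# Minimal auto-triage sketch: close alerts that match known-benign
# patterns so analysts only review the remainder. The alert fields and
# suppression rules are hypothetical stand-ins for a real SIEM feed.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str     # producing system, e.g. "ids" or "edr"
    signature: str  # detection rule that fired
    asset: str      # host or account involved

# Predicates describing alerts we have verified as benign noise.
KNOWN_BENIGN = [
    lambda a: a.signature == "port-scan" and a.asset.startswith("scanner-"),
    lambda a: a.signature == "failed-login" and a.asset == "svc-backup",
]

def triage(alerts):
    """Split alerts into (needs_human, auto_closed)."""
    needs_human, auto_closed = [], []
    for alert in alerts:
        bucket = auto_closed if any(r(alert) for r in KNOWN_BENIGN) else needs_human
        bucket.append(alert)
    return needs_human, auto_closed
```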

Cognitive Biases

The third weakness is micro in nature: the cognitive biases that all humans struggle with when making diagnoses and prescribing solutions. Cognitive bias is an area of study that often arises in the context of financial trading and medical diagnosis. It is relevant to cybersecurity because it affects not only how many evaluations an analyst can make in a given amount of time, but also the quality of those evaluations. Security analysts face the following cognitive biases:

  1. Anchoring is the tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions (usually the first piece of information acquired on a subject). It's not uncommon for SecOps teams to inadvertently have a narrow focus on daily activities. Hence, they may miss intrusions because they anchored on the likely source of a given pattern in the data and didn’t consider every alternative.
  2. Availability heuristics refers to the tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be. One of the issues we return to often is that there is so much data to evaluate that a holistic view of the threat landscape is impossible for a single person to hold in his or her head. Another issue is that analysts will make inferences about the entirety of the data set based only on the events they've reviewed.
  3. Confirmation bias is the tendency to search for, interpret, focus on, and remember information in a way that confirms one's preconceptions. An example of this is in the most boring data set anyone could imagine: VPC Flow Logs. I recently challenged one of our teams to find intrusion patterns in a data set of VPC logs and immediately got the response, "Of course there won’t be anything in there — there never is." When we looked, we found some servers that were wide open to public scanning, as well as some other problems. It’s critical to always check and check again; a minimal sketch of that kind of check follows this list.
  4. Clustering illusion is the tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns). It's hard to get people to think in terms of statistical significance, even with the aid of powerful tools, so it's not surprising when SecOps teams become convinced there is something there when there isn't. Other biases lead to false negatives; the clustering illusion leads to false positives. A quick statistical sanity check, sketched after this list, is one antidote.
  5. Inattentional blindness is the failure to notice something in plain sight because of cognitive overload. For security analysts, the excessive stimulus is the volume of data to sift through. During the alert triage process, there is a tendency to rely on mental shortcuts that effectively cause analysts to miss obvious critical signals. 
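
To make the confirmation-bias anecdote in item 3 concrete, here is a minimal sketch of the kind of check that can surface wide-open servers in VPC Flow Logs. It assumes the default version 2 space-separated log format; the internal 10.0.0.0/8 range and the 25-source threshold are illustrative assumptions, not values from our investigation:

```python
# Hunting sketch for VPC Flow Logs: count how many distinct external
# sources each destination/port pair ACCEPTs traffic from. Assumes the
# default v2 space-separated format; the internal CIDR and the flagging
# threshold are illustrative assumptions.
from collections import defaultdict
from ipaddress import ip_address, ip_network

INTERNAL = ip_network("10.0.0.0/8")  # assumed internal range
THRESHOLD = 25  # distinct external sources before a port looks "open"

def find_exposed(lines):
    sources = defaultdict(set)  # (dst_ip, dst_port) -> external src IPs
    for line in lines:
        f = line.split()
        if len(f) < 14 or f[13] != "OK":  # skip NODATA/SKIPDATA records
            continue
        src, dst, dst_port, action = f[3], f[4], f[6], f[12]
        if action == "ACCEPT" and ip_address(src) not in INTERNAL:
            sources[(dst, dst_port)].add(src)
    return {key: len(srcs) for key, srcs in sources.items()
            if len(srcs) >= THRESHOLD}
```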

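For the clustering illusion in item 4, a simple statistical sanity check helps separate real streaks from phantom ones. The sketch below asks how likely an observed spike would be under a Poisson model of the historical baseline; the rates and counts are made-up numbers for illustration:

```python
# Sanity-check a "spike": under a Poisson model with the historical
# baseline rate, how likely is a count at least this large by chance?
# Baseline and observed values below are made up for illustration.
from math import exp, factorial

def poisson_tail(observed, baseline):
    """P(X >= observed) for X ~ Poisson(baseline)."""
    below = sum(exp(-baseline) * baseline**k / factorial(k)
                for k in range(observed))
    return 1.0 - below

# 12 failed logins in an hour against a baseline of 6 per hour:
p = poisson_tail(12, 6.0)
print(f"P(>=12 | rate=6) = {p:.3f}")  # ~0.020 -- rare for one host, but
# across thousands of hosts you'd expect dozens of such "spikes" daily.
```
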
Overcoming Boredom, Bandwidth, & Biases

Here are some items every SecOps leader should consider to mitigate the tendencies above:

  • Make jobs more interesting by assigning meaningful projects that go beyond the routine — for example, researching and implementing a new solution. Empower analysts with greater decision-making authority.
  • Assign each analyst an area of expertise, such as the Web, networking, etc., with collaboration across analysts during investigations. This mitigates the "availability heuristic" because no one analyst feels the need to be an expert across all systems.
  • Free up bandwidth by automating every process that can be automated. This doesn't mean replacing analysts but, rather, empowering them to do more of what they do best while the routine work is handled automatically.
  • Create regular open forums with internal and external teams, as well as peer reviews, to discuss actions and results, what worked, and what didn’t. This helps avoid several biases, including confirmation bias and inattentional blindness.
  • Have junior analysts shadow senior analysts for a few hours a week to grow expertise and contextual awareness, as well as to avoid the clustering illusion.
