This article was originally posted on DarkReading.

Bandwidth, boredom, and cognitive bias are three weak spots that prevent analysts from identifying threats. Here's how to compensate.

Even if you have dozens of point security products, security analysts are still your final line of defense. You've tasked them with evaluating the thousands of events your security products generate to determine whether something harmful is lurking in your environment. This is a daunting responsibility in the face of expanding data volumes.

To put it into perspective, a recent Ponemon study shows that in a typical week, an organization may receive 17,000 malware alerts. If the company has three to five dedicated security analysts, each would have to review roughly 3,400 to 5,700 alerts per week.
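The arithmetic is worth making explicit. A quick back-of-the-envelope calculation, using the Ponemon figure above and assuming (purely for illustration) a 40-hour work week:

```python
# Back-of-the-envelope alert load per analyst, using the Ponemon
# figure of 17,000 malware alerts in a typical week.
ALERTS_PER_WEEK = 17_000
HOURS_PER_WEEK = 40  # assumption for the sketch

for team_size in (3, 4, 5):
    per_analyst = ALERTS_PER_WEEK / team_size
    per_hour = per_analyst / HOURS_PER_WEEK
    print(f"{team_size} analysts: ~{per_analyst:,.0f} alerts/week each "
          f"(~{per_hour:.0f} per hour)")
```

That works out to roughly 85 to 142 alerts per analyst per hour, before meetings, escalations, or lunch. Thorough investigation of each one is simply not possible at that rate.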

Analysts, being human, have three weak spots, and both they and their managers must be aware of them to avoid missing threats.

Bandwidth

The process of investigating each security alert is tedious, and the volume of such events continues to increase at an unprecedented rate. Hiring to keep up isn't a viable option because of skill-set and budget constraints. As a result, analysts are overwhelmed by the number of alerts they must process every day. That fatigue leads individuals to rush through investigations and skip key steps, increasing the probability of missed breaches.

Boredom

The nature of security operations (SecOps) is that the system evaluates millions or billions of events each day, and only a tiny percentage are suspect. Of those, analysts review thousands, and only a few merit further escalation. Boredom leads to complacency, which leads to low job satisfaction, lower performance, and higher attrition. The key is to automate much of the routine workflow so that analysts stay focused on investigating real problems.
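As a rough illustration of what "automate the routine workflow" can mean in practice, here is a minimal, hypothetical triage filter: alerts matching patterns the team has repeatedly confirmed as benign are auto-closed with an audit trail, and only the remainder reach an analyst's queue. The Alert fields, the PORT_SCAN rule, and the scanner address are all assumptions for the sketch, not a reference to any particular product.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    signature: str
    src_ip: str

# Hypothetical allow-list: signatures that are routine when they come
# from these hosts (e.g., the internal vulnerability scanner).
KNOWN_BENIGN = {
    "PORT_SCAN": {"10.0.0.9"},
}

def triage(alerts):
    """Auto-close known-benign alerts; queue everything else for a human."""
    queue, auto_closed = [], []
    for alert in alerts:
        if alert.src_ip in KNOWN_BENIGN.get(alert.signature, set()):
            auto_closed.append(alert)  # closed with an audit note, not deleted
        else:
            queue.append(alert)
    return queue, auto_closed

alerts = [Alert("PORT_SCAN", "10.0.0.9"), Alert("PORT_SCAN", "203.0.113.7")]
queue, closed = triage(alerts)
print(f"{len(closed)} auto-closed, {len(queue)} left for analysts")
```

Even a crude rule like this removes high-volume repeats from the queue. The important design choice is that auto-closed alerts remain queryable, so the rule itself can be reviewed and audited.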

Cognitive Biases

The third weakness is micro in nature: the cognitive biases that all humans struggle with when making diagnoses and prescribing solutions. Cognitive bias is an area of study that often arises in the context of financial trading and medical diagnosis. It is relevant to cybersecurity because it affects not only how many evaluations an analyst can make in a given amount of time, but also the quality of those evaluations. Security analysts face the following cognitive biases:

  1. Anchoring is the tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions (usually the first piece of information acquired on a subject). It's not uncommon for SecOps teams to inadvertently have a narrow focus on daily activities. Hence, they may miss intrusions because they anchored on the likely source of a given pattern in the data and didn’t consider every alternative.
  2. The availability heuristic refers to the tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be. One of the issues we return to often is that there is so much data to evaluate that a holistic view of the threat landscape is impossible for a single person to hold in his or her head. Another issue is that analysts will make inferences about the entire data set based only on the events they've reviewed.
  3. Confirmation bias is the tendency to search for, interpret, focus on, and remember information in a way that confirms one's preconceptions. An example of this is in the most boring data set anyone could imagine: VPC Flow Logs. I recently challenged one of our teams to find intrusion patterns in a data set of VPC Flow Logs and immediately got the response, "Of course there won't be anything in there — there never is." When we looked, we found some servers that were wide open to public scanning, as well as some other problems. It's critical to always check and check again; the first sketch after this list shows what that kind of check can look like.
  4. Clustering illusion is the tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns). It's hard to get people to think in terms of statistical significance, even with the aid of powerful tools, so it's not surprising when SecOps teams become convinced there is something there when there isn't; the second sketch after this list shows a quick significance check. While the other biases lead to false negatives, the clustering illusion leads to false positives.
  5. Inattentional blindness is the failure to notice something in plain sight because of cognitive overload. For security analysts, the excessive stimulus is the volume of data to sift through. During the alert triage process, there is a tendency to rely on mental shortcuts that effectively cause analysts to miss obvious critical signals. 
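To make the VPC Flow Logs anecdote in item 3 concrete, here is a minimal sketch of the kind of check that can surface servers wide open to public scanning: count the distinct public source addresses whose traffic each server and port accepted. It assumes the default space-separated VPC flow log record format; the file path and the threshold of 50 distinct sources are placeholders.

```python
import ipaddress
from collections import defaultdict

# Distinct public source IPs observed hitting each (server, port).
external_sources = defaultdict(set)

def is_public(addr: str) -> bool:
    return ipaddress.ip_address(addr).is_global

with open("vpc_flow.log") as log:  # placeholder path
    for line in log:
        fields = line.split()
        # Default VPC flow log fields (space-separated): version,
        # account-id, interface-id, srcaddr, dstaddr, srcport, dstport,
        # protocol, packets, bytes, start, end, action, log-status.
        if len(fields) < 14:
            continue  # skip short/malformed records
        srcaddr, dstaddr, dstport = fields[3], fields[4], fields[6]
        action = fields[12]
        if action == "ACCEPT" and is_public(srcaddr):
            external_sources[(dstaddr, dstport)].add(srcaddr)

for (server, port), sources in sorted(external_sources.items()):
    if len(sources) > 50:  # arbitrary threshold for the sketch
        print(f"{server}:{port} accepted traffic from "
              f"{len(sources)} distinct public IPs")
```

The point isn't this particular query; it's that a ten-minute scan like this beats the assumption that "there's never anything in there."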
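And for the clustering illusion in item 4, "thinking in terms of statistical significance" can be as lightweight as a binomial tail test: given the base rate of an event, how likely is the streak you just noticed to occur by chance? All the numbers below are hypothetical.

```python
from math import comb

def binom_tail(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of seeing at least
    k hits in n trials if hits occur at base rate p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical: failed logins occur on 2% of all events, and an analyst
# spots a "streak" of 3 failures in a window of 40 events.
p_base, window, hits = 0.02, 40, 3
p_value = binom_tail(hits, window, p_base)
print(f"P(>= {hits} failures in {window} events by chance) = {p_value:.3f}")
# Prints ~0.046: about one window in twenty shows such a "streak"
# from pure noise.
```

Scan a few hundred windows a day and one-in-twenty streaks appear constantly; the clustering illusion is mistaking them for signal.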

Overcoming Boredom, Bandwidth, & Biases

Here are some items every SecOps leader should consider to mitigate the tendencies above:

  • Make jobs more interesting by assigning meaningful projects that go beyond the routine — for example, researching and implementing a new solution. Empower analysts with greater decision-making authority.
  • Assign each analyst an area of expertise, such as the web, networking, etc., and encourage collaboration across analysts during investigations. This mitigates the availability heuristic because no single analyst feels the need to be an expert across all systems.
  • Free up bandwidth by automating every process that can be automated. This doesn't mean replacing analysts but, rather, empowering them to do more of what they do best while automating areas in need of support.
  • Create regular open forums with internal and external teams, as well as peer reviews, to discuss actions and results, what worked, and what didn’t. This helps avoid several biases, including confirmation bias and inattentional blindness.
  • Have junior analysts shadow senior analysts for a few hours a week to grow expertise and contextual awareness, as well as to avoid the clustering illusion.
