Several months ago we started conducting Blue Team Training Sessions with a group of security analysts averaging 4+ years of experience. We had several goals in doing so. First, we wanted to get an up-close and personal view of how easy or hard threat hunting really is. Second, we wanted to gauge how well analysts can use the tools available to them on the market today. Third, we thought it would be a good way to engage with our community and provide value.

We did not set out to create a representative sample, nor do we believe that what happened during the training is representative of the skills and knowledge of the industry as a whole. That said… we were surprised, so we thought the experience might be of interest to other security folks.

The Format

The training spanned four days, with a two-hour session each day, followed by a Cyber Hunt Challenge.

This was a timed, tournament-style competition. We ran it twice, with 24 total participants. Participants ranged from individual analysts who responded to email and advertising outreach to several groups of analysts who participated under sponsorship from their employers.

The goal of the tournament was to find as many attacks as possible in a set of data we provided the participants. There were eight attacks in the data, all of which were known attacks in the sense that they had been identified and documented before. They were generated by Metasploit and included:

  1. Brute force login attempts
  2. Compromising vulnerable apps  
  3. Privilege escalation
  4. Accessing existing backdoors

As a side note, going in, our security experts did not think these were particularly complex or difficult threats to uncover, and participants were allowed to use any tools they wished to analyze the data set. The main tools for the exercise were Splunk and standard command-line utilities.
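
To give a sense of what hunting the first attack type looks like in practice, here is a minimal command-line sketch for surfacing brute-force login attempts. The log path and message format here are assumptions (a stock Linux sshd auth log), not the actual dataset from the exercise:

    # Count failed SSH logins per source IP. In the standard sshd
    # "Failed password ... from <ip> port <port> ssh2" message, the
    # source IP is the fourth field from the end of the line.
    # (Log path and format are assumptions; adjust for your data.)
    grep "Failed password" /var/log/auth.log \
      | awk '{print $(NF-3)}' \
      | sort | uniq -c | sort -rn | head

A few source IPs with hundreds of failures stand out immediately, and the equivalent stats-by-source search in Splunk is just as short. The query itself is not the hard part - knowing which question to ask of which log is where the expertise lies.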

What Happened

It is very likely that there was serious self-selection bias in a group of participants who chose to take this type of class. That said, the Cyber Hunt Challenge turned out to be harder than even we anticipated going in. We thought many analysts would be able to uncover most of the attacks. However, only 17% were able to detect more than half of the attacks, and only one analyst found 7 out of 8 attacks very quickly. He was the most experienced of the group, and it showed. And he won a trip to Hawaii.

The first challenging aspect was the time allotment - we were too optimistic about how much time would be sufficient for such a challenge. The original allotment was 2 hours; however, we decided to extend it to 24 hours, turning the event into more of a hackathon format. As expected, there were varying skill levels, and the scores ranged between 1 and 7 out of a total of 8 available points. Four analysts clearly excelled, sharing traits that most analysts can develop as they hone their skills through training and experience. Even among these more successful analysts, it still took 4-6 hours of slicing and dicing the data to find the attacks they did.

It drove home the point for us that even with today’s tooling, finding threats requires a lot of expertise and time. Several conversations with CISOs have confirmed this - one of them, I remember, quipped "Red Team always wins!"

Revisiting our Goals

So how did we do against our goals?
 
Evaluating Skills
 
Although the training was billed as a “master class” kind of session, there was a lot of variance in skill levels within the group. We have heard a lot about the cybersecurity skills shortage and how hard it is to build and scale secops teams, but this really brought it home. For a lot of folks in the class, training focused more on basic Unix skills and tool usage would be a good start. We heard from multiple participants that we should consider offering a more basic, “101”-level class for those with less experience who are new to threat hunting.
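
As a concrete example of the kind of baseline skill we mean: simple frequency analysis - stacking a field with sort and uniq and eyeballing the outliers - came up constantly during the hunt. A quick sketch, again assuming a generic syslog-format file rather than the exercise dataset:

    # Rank log entries by the program that produced them; in the standard
    # syslog layout the program tag is the fifth field. Rare or unexpected
    # programs at the bottom of the stack are worth a closer look.
    # (Log path and format are assumptions.)
    awk '{print $5}' /var/log/syslog | sort | uniq -c | sort -rn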
 
Evaluating Knowledge

We also learned that familiarity with the toolset available out there is not as widespread as we expected. There were some clear experts in the class, but several people were using some of the tools covered in the class for the first time. On the plus side, almost everyone liked the hands-on aspect of the class. It became clear to us that a lot remains to be done in building, simplifying, and training on the cybersecurity toolset.
 
Evaluating the Potential for Community

Last, in terms of community, every single one of our participants saw great value in this type of exercise. We wonder what would happen if the expertise of the best in the class could be more easily and broadly shared. Would that not ease the cyber skills shortage a bit?

Some Thoughts on the Sessions

First off, any secops person will tell you that just knowing a dataset has an attack in it is about as different from real life as you can possibly get. In fact, deciding whether or not an attack has happened at all is one of the biggest challenges in threat detection. Given that, we thought that once the participants knew exactly how many attacks there were to find, a large hurdle had already been overcome. Not so.

If anything, this experience showed us that if you are running a secops team, you would very likely benefit from carefully evaluating how well your current setup lets you detect and address threats. If you are not running “Red Team/Blue Team” exercises, you might be totally off in judging your ability to detect threats - this despite the fact that we all pride ourselves on being paranoid, and most of us try to assume we have already been breached.