“14 Jobs That Will Soon Be Obsolete.” “Can A Robot Do Your Job?” “These Seven Careers Will Fall Victim to Automation.” For each incremental advance in automation technology, it seems there’s an accompanying piece of alarmist clickbait, warning of a future in which robots will be able to do everything we can, only better, cheaper, and for longer. Proponents of AI and automation view this as the harbinger of a golden age, ushering in a future free from all the paper-pushing, the drudgery, the mundane and repetitive things we have to do in our lives. We will work shorter hours, focus on more meaningful work, and actually spend our leisure time on, well, leisure.

But while it’s one thing to enjoy having a robot zipping across the floor picking up your 3-year-old’s wayward Cheerios, it’s quite another to imagine automation coming to our workplace. For those of us in cybersecurity, however, it has become a foregone conclusion: Now that criminals have begun adopting automation and AI as part of their attack strategies, it’s become something of an arms race, with businesses and individuals racing to stay one step ahead of increasingly sophisticated bad actors that human analysts will no longer be able to fend off on their own.

Spurred by growth in both the number of companies deploying automation and the sophistication of threats, automated processes are closing in on and even surpassing human analysts in some tasks—which is making some cybersecurity professionals uneasy. “When robots are better threat hunters, will there still be a place for me? What if someday, they can do everything I can do, and more?”

According to the “2019 SANS Automation and Integration Survey,” however, human-powered SecOps aren’t going away anytime soon. “Automation doesn’t appear to negatively affect staffing,” the authors concluded, after surveying more than 200 cybersecurity professionals from companies of all sizes over a wide cross-section of industries. What they found, in fact, suggested the opposite: Companies with medium or greater levels of automation actually have higher staffing levels than companies with little automation. When asked directly about whether they anticipated job elimination due to automation, most of those surveyed said they felt there would be no change in staffing levels. “Respondents do not appear concerned about automation taking away jobs,” the paper concludes.

There are many reasons for this, but perhaps the most basic is that, in order to see any sort of loss in the number of cybersecurity jobs, we’d first need to get to parity—and we’re currently about 3 million short of that.

Phrased another way, automation could theoretically eliminate 3 million jobs before a single analyst had to contemplate a career change. That's an oversimplification, to be sure, and it also presupposes that AI and automation will live up to all their promises; as we've seen with a number of “revolutionary” cybersecurity technologies, many fall short of the hype, at least in the early days.

Automation currently faces some fundamental shortcomings. First, it cannot deploy itself: experts are needed to tailor the solution to the business's needs and ensure it is set up and functioning correctly. And once they're in place, automated systems cannot reliably cover all the security needs of an enterprise. Lacking human judgment, they surface a great many false positives, and failing to put an analyst in charge of filtering and investigating these alerts would create a huge burden on the IT staff responsible for remediation.
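To make the false-positive problem concrete, here is a toy sketch (all rules, thresholds, and data are hypothetical, not drawn from any real product): a naive automated rule flags every burst of failed logins as malicious, so a human-style triage step is needed before anything reaches the remediation team.

```python
# Hypothetical alert records: (source_ip, failed_logins_in_5_min, known_internal_user)
alerts = [
    ("203.0.113.7", 40, False),   # likely credential stuffing
    ("10.0.0.12", 12, True),      # employee who forgot a new password
    ("10.0.0.33", 9, True),       # misconfigured password manager retrying
]

THRESHOLD = 8  # the automated rule: anything at or above this is flagged

# The machine flags everything that crosses the threshold, benign or not.
flagged = [a for a in alerts if a[1] >= THRESHOLD]

# An analyst-style triage step: bursts from known internal users are
# usually benign, so a human would dismiss those before escalating.
escalated = [a for a in flagged if not a[2]]

print(len(flagged))    # 3 raw alerts from the automated rule
print(len(escalated))  # 1 after human-style filtering
```

Without that second step, every one of the raw alerts would land on the remediation team's desk.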

There’s also the issue of false negatives. AI is great at spotting what it’s programmed to spot; it is far less reliable at catching threats it hasn’t been specifically instructed to look for. Machine learning is beginning to overcome this hurdle, but the operative word is still “machine”: when a significant threat is surfaced, the AI has no way of knowing what it means for the business, because it lacks both the context to grasp the threat’s impact on its parent company and the ability to weigh everything a person would. Humans will still be needed at the helm to analyze risks and potential breaches, and make intuition-driven, business-critical decisions.
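The false-negative problem can be sketched in a few lines (the signatures and payloads below are invented for illustration): signature matching catches exactly what it was told to look for, and nothing else, which is how a novel variant of a known attack slips through.

```python
# Hypothetical signature list for a toy detector.
known_signatures = ["' OR 1=1 --", "<script>alert("]

def flags(payload: str) -> bool:
    """Return True if the payload matches any known signature."""
    return any(sig in payload for sig in known_signatures)

known_attack = "name=' OR 1=1 --"      # textbook SQL injection: caught
novel_variant = "name=' OR 'a'='a"     # same idea, new spelling: missed

print(flags(known_attack))   # True
print(flags(novel_variant))  # False, a false negative
```

Both payloads attempt the same attack; only the one the detector was explicitly taught about is flagged.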

As effective as these automated systems are, once they’ve been programmed, their education begins to become obsolete almost immediately as new types of attack are created and deployed. Automated systems cannot continue to learn and evolve effectively without the guiding hand of humans. Humans are also needed as a check on this learning, to test and attempt to penetrate the defenses the system has developed.
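The decay described above can be illustrated with a toy metric (all strings and names here are hypothetical): a detector built from last quarter's attack tooling still scores perfectly on the attacks it was built for, but its detection rate drops against the next wave, which is why humans must keep supplying fresh examples and probing the defenses.

```python
# Hypothetical indicators the detector was "trained" on last quarter.
trained_on = {"powershell -enc", "mimikatz", "certutil -urlcache"}

def detection_rate(observed_attacks):
    """Fraction of observed attack strings matching a known indicator."""
    caught = sum(1 for a in observed_attacks
                 if any(sig in a for sig in trained_on))
    return caught / len(observed_attacks)

last_quarter = ["powershell -enc aGk=", "mimikatz.exe dump",
                "certutil -urlcache -f http://x/y"]
this_quarter = ["pwsh -e aGk=",                 # renamed binary: missed
                "rundll32 comsvcs.dll, MiniDump", # new technique: missed
                "certutil -urlcache -f http://x/y"]

print(detection_rate(last_quarter))  # 1.0 on the attacks it was built for
print(detection_rate(this_quarter))  # only a third of the new wave
```

Nothing in the detector itself signals the decay; someone has to measure it against current attacks and retrain.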

Then there are the things that can never be automated: hiring and training people; selecting vendors; any task that requires creativity or “thinking outside the box”; making presentations and eliciting buy-in from the board of directors and upper management—and, of course, compliance. No automated system, no matter how sophisticated, is going to know when new laws, company regulations, and rules are passed, and no system will be able to adjust to such changes without human intervention. Even if the work of compliance could be completely automated, the responsibility for compliance cannot be outsourced, and rare would be the individual who could sleep easy letting a machine handle such tasks singlehandedly.

But for the sake of argument, let’s assume for a moment we could fully automate the SOC. While the loss of jobs is certainly a serious matter, we’d soon find the stakes to be much higher than even that. Hackers have already demonstrated an ability to hack into automated systems. If they were able to retrain your AI to ignore critical threats, and there was no human present to realize what was happening and respond swiftly and appropriately, sensitive data could be compromised enterprise-wide—or worse.

In short, automation won’t eliminate the demand for human cybersecurity expertise, at least in the short to medium term. But it will certainly redefine roles. According to SANS, implementing effective automation often requires an initial surge in staff to work out the kinks, but it is almost invariably accompanied by a redirection, not a reduction, of the existing workforce. Once in place, automated systems serve two functions. The first is as a tool: by allowing analysts to shift their focus to more critical cybersecurity functions, improving efficiency, reducing incident response time, and reducing fatigue, automation multiplies the effectiveness of cybersecurity professionals.

But their most valuable role may be as a partner. Automation may be powerful, but automation closely directed and honed by humans is more powerful. Rather than taking the place of humans, robots will take their place alongside humans. Automation, then, should be thought of as a way not to replace SecOps teams, but rather to complement and complete them in a way that will allow them to handle both the monotonous and mundane (yet necessary) tasks in the SOC, and also attend to the true mission-critical tasks rapidly and without distraction.

For more on misconceptions surrounding automation, read the “2019 SANS Automation and Integration Survey.”
