

6 ways AI helps SecOps punch back

While AI is mostly seen as opening a new front in the threat landscape, it will also be tapped to fight back with advanced threat hunting and more.

John P. Mello Jr., Freelance technology writer


From the moment OpenAI let ChatGPT out of the box, the potential for generative AI and large language models (LLMs) to cause harm has dominated conversations about the emerging technology.

Less talked about has been how AI can be a formidable weapon in the hands of the good guys. AI decision makers believe that among departments in the enterprise, IT operations will be affected the most by generative AI — more than security — according to data gathered by Forrester Research.

Forrester principal analyst Allie Mellen said in a new report on the use of generative AI in security tools that, while the tools are not yet widely available, they are coming. 

"Security leaders need to be prepared for this new technology to affect how their teams operate."
Allie Mellen

Here are six ways AI can be used by security teams to punch back at adversaries.

[ See related: The AI executive order: What AppSec teams need to know | See Webinar: Secure by Design: Why Trust Matters for Risk Management ]

1. AI can be used to develop more secure code

Using AI for securing code is still in its infancy and has attracted skepticism from some security practitioners, but its potential can't be denied. In a presentation at InfoSec World 24, Mark Sherman, technical director of the Cyber Security Foundations group in the CERT Division at Carnegie Mellon University's Software Engineering Institute, said early experiments show promise but also have limitations. (See his slides from the talk in PDF form).

When using AI to secure code, Sherman cautioned, the output must be reviewed by knowledgeable users. Unfortunately, that often rules out the programmers themselves, because most are not very good at reading and evaluating code.
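
As a rough illustration of the review-gated workflow Sherman describes, the sketch below asks a model to flag possible weaknesses in a code snippet and then requires a human sign-off before anything is accepted. The `llm_complete` function, the prompt wording, and the input file name are hypothetical placeholders, not part of any specific product.

```python
"""Minimal sketch: LLM-assisted code review with a mandatory human gate.

`llm_complete` is a hypothetical stand-in for whichever LLM client your team uses.
"""

REVIEW_PROMPT = """You are a security code reviewer.
List potential vulnerabilities in the following code as bullet points,
each with a CWE identifier if one applies. If there are none, say "No findings".

Code:
{code}
"""


def llm_complete(prompt: str) -> str:
    # Placeholder: call your actual LLM provider here and return its text output.
    raise NotImplementedError("wire this up to your LLM provider")


def review_code(code: str) -> str:
    """Ask the model for candidate findings. These are hints, not verdicts."""
    return llm_complete(REVIEW_PROMPT.format(code=code))


def human_signoff(findings: str) -> bool:
    """Sherman's caveat in code form: a knowledgeable reviewer must approve."""
    print("=== Candidate findings (unverified) ===")
    print(findings)
    answer = input("Accept these findings after manual review? [y/N] ")
    return answer.strip().lower() == "y"


if __name__ == "__main__":
    snippet = open("patch_under_review.py").read()  # assumed input file
    findings = review_code(snippet)
    if human_signoff(findings):
        print("Findings accepted; file them in your tracker.")
    else:
        print("Findings rejected or need rework; nothing recorded.")
```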

2. AI can help address security staffing issues

Forrester's Mellen said that many implementations of generative AI in security tools today rely on chatbot-style features built into a separate view in an application. “As unique as this is right now, it ultimately does not naturally fit into the analyst workflow and is little more than a novelty,” she said.

The real value in generative AI is in addressing tasks automatically that were previously part of the analyst workflow. One example: writing draft incident-response reports. Mellen advised security leaders to look for generative AI implementations that fit into the analyst experience, to help analysts make decisions faster — not just force them to use another view or tab.
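
A sketch of what that could look like in practice: a draft-report helper that takes the case data an analyst already has open and returns a first-pass incident summary for them to edit, rather than sending them to a separate chat tab. The incident field names and the `llm_complete` wrapper are illustrative assumptions, not any vendor's actual API.

```python
"""Sketch: generate a draft incident-response report from existing case data.

`llm_complete` is a hypothetical LLM-client wrapper; the incident schema is invented.
"""
import json

DRAFT_PROMPT = """Write a concise draft incident-response report with sections:
Summary, Timeline, Impact, Actions Taken, Recommended Next Steps.
Base it only on this case data (JSON):
{case_json}
Mark anything uncertain as 'TO CONFIRM'.
"""


def llm_complete(prompt: str) -> str:
    raise NotImplementedError("replace with your LLM provider call")


def draft_incident_report(case: dict) -> str:
    """Return a first draft for the analyst to review and edit, never to auto-file."""
    return llm_complete(DRAFT_PROMPT.format(case_json=json.dumps(case, indent=2)))


# Example case record an analyst might already have open in the console.
example_case = {
    "id": "IR-2481",
    "detection": "Suspicious OAuth consent grant",
    "assets": ["user: j.doe", "app: unknown-mail-sync"],
    "timeline": ["09:12 consent granted", "09:15 mass mailbox access"],
    "status": "containment",
}

if __name__ == "__main__":
    print(draft_incident_report(example_case))
```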

Generative AI can also improve the performance of less experienced security team members. At the recent Ignite conference, Vasu Jakkal, Microsoft corporate vice president for security, compliance, identity, and management, cited a Microsoft study that found that “new in career” analysts using AI-enabled tools produced responses to security events that were 44% more accurate and responded 26% faster across all tasks.

In addition, a large majority of the analysts said the tools helped improve the quality of their work (86%), reduced the effort needed to complete a task (83%), and made them more productive (86%).

3. AI can aid security teams in report creation

Generative AI can be used effectively for tasks such as summarizing incidents for reporting purposes and creating human-readable case descriptions. It can also convey information in a human-friendly way, which is valuable when responding to customer service requests and producing better product documentation.
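
For instance, a small helper like the sketch below could turn a terse case record into a plain-language note for a customer-facing reply. The record fields and the `llm_complete` wrapper are assumptions for illustration only.

```python
"""Sketch: turn a terse case record into a customer-friendly summary.

`llm_complete` is a hypothetical wrapper around whichever LLM your stack uses.
"""
import json


def llm_complete(prompt: str) -> str:
    raise NotImplementedError("replace with your LLM provider call")


def customer_summary(case: dict) -> str:
    """Draft a non-technical explanation of an incident for a support reply."""
    prompt = (
        "Rewrite this security case as a short, calm, non-technical update "
        "for an affected customer. Avoid jargon and internal tool names.\n"
        + json.dumps(case)
    )
    return llm_complete(prompt)


if __name__ == "__main__":
    case = {"id": "IR-2481", "impact": "single mailbox accessed",
            "status": "contained", "customer_action": "password reset required"}
    print(customer_summary(case))
```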

4. AI can assist security teams in analyzing pattern behavior

With its predictive abilities, generative AI can help identify privacy risks, attacker activity, and risk scenarios, and it can suggest remediation actions.
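
One way to picture this: flag a behavioral outlier against a simple statistical baseline, then hand the context to a model for a risk narrative and suggested remediation. The threshold, event schema, and `llm_complete` helper below are illustrative assumptions rather than a prescribed method.

```python
"""Sketch: baseline a user's daily login count, flag outliers, and ask an LLM
for a risk assessment and remediation suggestions. All names are illustrative.
"""
from statistics import mean, pstdev


def llm_complete(prompt: str) -> str:
    raise NotImplementedError("replace with your LLM provider call")


def is_outlier(history: list[int], today: int, z_threshold: float = 3.0) -> bool:
    """Crude z-score check against the user's own recent history."""
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return today > mu  # flat history: any increase stands out
    return (today - mu) / sigma > z_threshold


def assess(user: str, history: list[int], today: int) -> str | None:
    """Return a model-drafted risk assessment only when the baseline is exceeded."""
    if not is_outlier(history, today):
        return None
    prompt = (
        f"User {user} normally logs in about {mean(history):.1f} times per day "
        f"but logged in {today} times today. Describe plausible risk scenarios "
        "and suggest prioritized remediation actions."
    )
    return llm_complete(prompt)


if __name__ == "__main__":
    report = assess("j.doe", history=[3, 4, 2, 5, 3, 4, 3], today=41)
    print(report or "No anomaly detected.")
```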

5. AI can improve threat hunting

Generative AI can add speed and scale for scenarios such as security-posture management, incident investigation and response, and security reporting.
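
A rough sketch of how that might look: ask the model to turn a hunting hypothesis into concrete indicator patterns, then sweep the logs for them with plain code. The log file, the `llm_complete` wrapper, and the expectation that the model returns clean JSON are all assumptions for illustration.

```python
"""Sketch: hypothesis-driven hunt. The LLM proposes indicator patterns,
plain Python does the sweep. Everything here is illustrative.
"""
import json
import re
from pathlib import Path


def llm_complete(prompt: str) -> str:
    raise NotImplementedError("replace with your LLM provider call")


def propose_patterns(hypothesis: str) -> list[str]:
    """Ask the model for regexes that would support or refute the hypothesis."""
    prompt = (
        "Given this threat-hunting hypothesis, return a JSON array of regex "
        f"patterns to search for in process-execution logs:\n{hypothesis}"
    )
    return json.loads(llm_complete(prompt))  # assumes the model returns clean JSON


def sweep(log_path: str, patterns: list[str]) -> list[str]:
    """Return log lines matching any proposed pattern."""
    compiled = [re.compile(p, re.IGNORECASE) for p in patterns]
    return [
        line
        for line in Path(log_path).read_text().splitlines()
        if any(rx.search(line) for rx in compiled)
    ]


if __name__ == "__main__":
    hypothesis = "An attacker is using living-off-the-land binaries as download cradles."
    patterns = propose_patterns(hypothesis)
    for hit in sweep("process_events.log", patterns):  # assumed log file
        print(hit)
```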

6. AI can be used to unify numerous security solutions

The number of security tools organizations have in use can be overwhelming for teams. Generative AI, for one, can bring together all the security signals and threat intelligence siloed in disconnected tools. That allows security teams to streamline, triage, and obtain a complete end-to-end view of threats across the digital estate, making response easier and quicker for analysts of every level.
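
As a rough sketch of that consolidation step: normalize alerts from a few disconnected tools into one minimal schema, group them by asset, and hand each merged timeline to a model for an end-to-end summary. The tool names, field mappings, and `llm_complete` helper are invented for illustration.

```python
"""Sketch: fold alerts from separate tools into one schema, group by asset,
and summarize each asset's combined activity. Field names are illustrative.
"""
import json
from collections import defaultdict


def llm_complete(prompt: str) -> str:
    raise NotImplementedError("replace with your LLM provider call")


def normalize(source: str, raw: dict) -> dict:
    """Map each tool's alert format onto one minimal common schema."""
    mappings = {
        "edr": lambda a: {"asset": a["hostname"], "event": a["detection"], "time": a["ts"]},
        "email": lambda a: {"asset": a["recipient"], "event": a["verdict"], "time": a["received"]},
        "cloud": lambda a: {"asset": a["account"], "event": a["finding"], "time": a["observed_at"]},
    }
    record = mappings[source](raw)
    record["source"] = source
    return record


def unified_view(alerts: list[tuple[str, dict]]) -> dict[str, list[dict]]:
    """Group normalized alerts by the asset they concern."""
    grouped = defaultdict(list)
    for source, raw in alerts:
        record = normalize(source, raw)
        grouped[record["asset"]].append(record)
    return dict(grouped)


def summarize(asset: str, events: list[dict]) -> str:
    prompt = ("Summarize this cross-tool activity for one asset as a short "
              "incident narrative with a suggested priority:\n" + json.dumps(events))
    return llm_complete(prompt)


if __name__ == "__main__":
    alerts = [
        ("edr", {"hostname": "wks-42", "detection": "credential dumping", "ts": "10:02"}),
        ("email", {"recipient": "j.doe", "verdict": "phishing link clicked", "received": "09:40"}),
    ]
    for asset, events in unified_view(alerts).items():
        print(summarize(asset, events))
```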

The potential is there. It's time to put AI to work

“Generative AI tech has the potential to greatly enhance our ability to detect and respond to cyber threats,” said Joseph Thacker, a security researcher with AppOmni.

“There are already so many companies building AI security analyst agents. It’s going to be vital to use AI for securing digital assets in the future.”
Joseph Thacker

However, Forrester’s Mellen said patience is in order, because despite vendor promises, “this technology is currently available only to a select set of customers, if at all.”

“Every vendor we mentioned in the report (and have spoken with) has at least a press release related to the generative AI offering that it’s building, but none are generally available, or likely to be generally available, before the first half of 2024.”
Allie Mellen

