While generative AI can create vulnerabilities, it also helps to combat cyberthreats. Microsoft’s Azure AI Foundry helps AI developers detect, measure and manage the risk of jailbreaks and indirect attacks. “We launched Microsoft Security Copilot in April 2024, which helps level the playing field,” says Jakkal. It does this by using Microsoft’s security data and OpenAI’s GPT models to simplify tasks. “That’s why I love generative AI – because I think this tool is going to make it easy for everyone to become a defender.”

Organisations using Security Copilot experienced a 30 per cent reduction in average time to resolve security incidents, according to Microsoft’s Generative AI and Security Operations Center Productivity: Evidence from Live Operations report. And IT admins using Copilot in the Microsoft Entra admin centre spent 45 per cent less time troubleshooting sign-ins and increased accuracy by 47 per cent.

Oregon State University, for example, is using Security Copilot alongside Microsoft Sentinel and Defender to automatically process security incident tickets. Automation allows the university to handle incoming tickets more efficiently, so security analysts can concentrate on higher-priority incidents such as system intrusions or potential data breaches instead of routine administrative tasks.

Meanwhile, materials manufacturer Eastman is using the solution to upskill its security analysts with step-by-step guidance for response and faster threat remediation. The team also uses Defender solutions to protect workloads in its apps across its Microsoft 365 E5 licence. “We work in a world where every second matters,” says David Yates, senior cybersecurity analyst at Eastman. “Attackers can move very quickly, so we need to understand how the attack is being deployed and where. Efficiency is crucial.”

Security Copilot uses AI to trace security incidents to specific IP addresses, connecting clues that may at first seem random and unrelated to expose a larger threat. “AI gives the asymmetric advantage to the defender over the attacker,” says Charlie Bell, executive vice president of Microsoft Security. “We don’t have access to all the data, and we can’t see the whole environment. [This means that we] can’t see how an attacker would get from one thing to another. The beauty of AI is that it can see everything, and we can finally reason across a vast space that no human can get to because we can’t perceive everything within our surface area.”

Organisations need to monitor AI models designed to improve customer experiences, such as those used in online banking, to prevent cybercriminals from manipulating the technology. Photo: Adobe Stock/lordn