How Microsoft Approaches AI Red Teaming | BRK223

Estimated read time 1 min read

Post Content

​ AI Red Team (AIRT) serves as the independent red team for high-risk AI across Microsoft, identifying vulnerabilities in pre-ship products and driving research to scale insights across the company. This session will cover the processes, techniques, and tools that AIRT uses in red teaming, including PyRIT AIRT’s open-source automation framework, and in AI security research.

To learn more, please check out these resources:
* https://learn.microsoft.com/collections/z2x4hk48rn2em1
* https://aka.ms/Build24CollectionsSecurity-SecuringAI
* https://learn.microsoft.com/en-us/azure/defender-for-cloud/defender-for-devops-introduction
* https://github.com/Azure/PyRIT
* https://www.microsoft.com/en-us/security/blog/2024/02/22/announcing-microsofts-open-automation-framework-to-red-team-generative-ai-systems/
* https://www.microsoft.com/en-us/security/blog/2023/08/07/microsoft-ai-red-team-building-future-of-safer-ai/

????????:
* Pete Bryan
* Raja Sekhar Rao Dheekonda
* Gary Lopez
* Roman Lutz
* Tori Westerhoff

??????? ???????????:
This video is one of many sessions delivered for the Microsoft Build 2024 event. View the full session schedule and learn more about Microsoft Build at https://build.microsoft.com

BRK223 | English (US) | Security

#MSBuild   Read More Microsoft Developer 

You May Also Like

More From Author

+ There are no comments

Add yours