1. Garak Probing Engine: New Red-Team Tool Targets LLM Vulnerabilities, Including Jailbreaks, Prompt Injection, and Data Exfiltration
A new open-source red-teaming tool, dubbed the Garak probing engine, has been introduced on GitHub with the explicit purpose of systematically scanning Large Language Models (LLMs) for critical security vulnerabilities. The tool's release signals a growing, proactive effort within the security community to pressure-test these systems.