Microsoft Rolls Out New Tools to Bolster Copilot Safety and Prevent Misuse
- Microsoft has released new tools to prevent prompt injection attacks, which trick AI systems such as Copilot into ignoring their instructions and behaving badly.
- A new "Groundedness Detection" feature helps users spot when a chatbot such as Copilot is hallucinating, i.e., generating text that is not supported by its source material.
- Microsoft is providing videos and templates to help people get better results from Copilot through proper prompt engineering.
- Subtle changes to a prompt can significantly improve the quality and safety of Copilot's output.
- Microsoft plans to bring its safety templates into Azure AI Studio and Azure OpenAI Service.
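Both prompt injection screening and groundedness detection boil down to classifying text. Microsoft has not published how its features work internally; the sketch below is only a toy illustration of the two ideas, and every name in it (the pattern list, `looks_like_injection`, `groundedness_score`) is hypothetical rather than part of any Microsoft API.

```python
import re

# Illustrative only: a naive pattern screen for prompt injection.
# Production systems use trained classifiers, not keyword lists.
INJECTION_PATTERNS = [
    r"ignore (all )?(previous|prior) instructions",
    r"disregard (the )?system prompt",
    r"reveal your (hidden|system) prompt",
]

def looks_like_injection(user_prompt: str) -> bool:
    """Flag prompts that match common jailbreak phrasings."""
    text = user_prompt.lower()
    return any(re.search(p, text) for p in INJECTION_PATTERNS)

def groundedness_score(answer: str, source: str) -> float:
    """Fraction of the answer's words that also appear in the source text.
    A crude stand-in for model-based groundedness detection."""
    answer_words = set(re.findall(r"[a-z0-9']+", answer.lower()))
    source_words = set(re.findall(r"[a-z0-9']+", source.lower()))
    if not answer_words:
        return 0.0
    return len(answer_words & source_words) / len(answer_words)

if __name__ == "__main__":
    print(looks_like_injection("Please ignore previous instructions."))  # True
    print(looks_like_injection("Summarize this meeting transcript."))    # False
    doc = "Revenue grew 12% in Q3, driven by cloud services."
    print(groundedness_score("Revenue grew 12% in Q3.", doc))            # 1.0
```

A low groundedness score would signal that the answer contains material absent from the source, which is roughly the kind of mismatch a hallucination detector is meant to surface.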