Study Finds AI Code Tools Like Copilot and CodeWhisperer Can Leak Secret Credentials
- Researchers found that AI code completion tools such as GitHub Copilot and Amazon CodeWhisperer can reveal hardcoded credentials present in their training data.
- The tools were coaxed into emitting API keys, tokens, and other secrets by prompting them to fill in missing credentials in code snippets.
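To illustrate the approach described above: a hedged sketch, not the study's actual prompts or tooling. The idea is to take a code snippet, strip out its credential, hand the truncated snippet to a completion model, and then apply a cheap format check to the suggestion. The `boto3` snippet and the helper names here are hypothetical.

```python
import re

# Hypothetical reconstruction of the prompting technique: a real-looking
# snippet is truncated right where the credential would appear, and the
# completion model is asked to continue from that point.
PROMPT = (
    'import boto3\n'
    'client = boto3.client(\n'
    '    "s3",\n'
    '    aws_access_key_id="'  # the model completes from here
)

# AWS access key IDs have a well-known shape: "AKIA" followed by 16
# uppercase alphanumeric characters. A format check like this is a cheap
# first filter before testing whether a suggested key is actually live.
AWS_KEY_RE = re.compile(r"AKIA[0-9A-Z]{16}")

def looks_like_aws_key(completion: str) -> bool:
    """Return True if a model completion contains an AWS-style access key."""
    return bool(AWS_KEY_RE.search(completion))

# A fabricated, non-functional key in the right format triggers the check:
print(looks_like_aws_key("AKIA" + "X" * 16))  # True
```

A format match alone does not mean the key works; the study's distinction between format-valid and active secrets would require actually testing each credential against the provider.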
- Around 33% of Copilot's suggestions and 18% of CodeWhisperer's contained secrets in a valid format, though only a small subset were live, working credentials.
- A few of the generated secrets were exact copies of the real credentials removed to create the prompts.
- The findings highlight a privacy risk: models can expose secrets that were inadvertently collected during training. Developers should exercise caution and follow security best practices when using AI coding assistants.
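One such practice is scanning code, including AI-generated suggestions, for hardcoded secrets before committing it. Below is a minimal sketch of a pre-commit-style scanner; the patterns are illustrative assumptions, and production tools such as gitleaks or trufflehog ship far more comprehensive rule sets.

```python
import re

# Illustrative detection rules only; real scanners maintain large,
# regularly updated pattern sets for many credential types.
SECRET_PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "github_token": re.compile(r"ghp_[A-Za-z0-9]{36}"),
    "generic_api_key": re.compile(r"(?i)api[_-]?key\s*=\s*['\"][^'\"]{16,}['\"]"),
}

def scan_for_secrets(source: str) -> list[tuple[str, int]]:
    """Return (pattern_name, line_number) pairs for suspected secrets."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                findings.append((name, lineno))
    return findings

# A fabricated key in AWS format on line 2 is flagged:
sample = 'region = "us-east-1"\nkey = "AKIA' + "Z" * 16 + '"\n'
print(scan_for_secrets(sample))  # [('aws_access_key', 2)]
```

Running such a check in a pre-commit hook or CI pipeline catches secrets regardless of whether a human or an AI assistant wrote them.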