GitHub Copilot Chat turns blabbermouth with crafty prompt injection attack

AI assistant could be duped into leaking code and tokens via sneaky markdown

GitHub's Copilot Chat, the chatbot meant to help developers code faster, could be helping attackers to steal…