New 'Rules File Backdoor' Attack Lets Hackers Inject Malicious Code via AI Code Editors
Mar 18, 2025
AI Security / Software Security
Cybersecurity researchers have disclosed details of a new supply chain attack vector dubbed Rules File Backdoor that affects artificial intelligence (AI)-powered code editors like GitHub Copilot and Cursor, causing them to inject malicious code.

"This technique enables hackers to silently compromise AI-generated code by injecting hidden malicious instructions into seemingly innocent configuration files used by Cursor and GitHub Copilot," Pillar Security's Co-Founder and CTO Ziv Karliner said in a technical report shared with The Hacker News. "By exploiting hidden Unicode characters and sophisticated evasion techniques in the model-facing instruction payload, threat actors can manipulate the AI to insert malicious code that bypasses typical code reviews."

The attack vector is notable for the fact that it allows malicious code to silently propagate across projects, posing a supply chain risk. The crux of the attack hinges on the rules files that are used by AI agents to guide their behavior, helping users define best coding practices and project architecture.
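To make the mechanics concrete, the sketch below is a minimal illustration of the underlying trick, not code from Pillar Security's report: the rule text and hidden span are hypothetical. It shows how Unicode "format" characters (zero-width spaces, joiners, and similar) render as nothing in most editors and diff views, letting content ride along in an apparently benign rules file, and how a simple scan can surface them.

```python
# Minimal sketch: flagging invisible Unicode characters in a rules file.
# Not from Pillar Security's report; the rule string below is a
# hypothetical example of the general hiding technique.
import unicodedata

def find_invisible_chars(text: str) -> list[tuple[int, str, str]]:
    """Return (offset, codepoint, name) for Unicode 'format' (Cf)
    characters, a category that includes zero-width spaces/joiners and
    bidirectional controls -- all invisible in typical code review UIs."""
    return [
        (i, f"U+{ord(ch):04X}", unicodedata.name(ch, "UNKNOWN"))
        for i, ch in enumerate(text)
        if unicodedata.category(ch) == "Cf"
    ]

# A rule that looks harmless in an editor: the middle span uses
# zero-width characters, so a reviewer sees only the visible sentence.
rule = (
    "Always follow the project's coding standards."
    "\u200b\u200d\u2062"  # zero-width space, zero-width joiner, invisible times
)

for offset, codepoint, name in find_invisible_chars(rule):
    print(f"offset {offset}: {codepoint} {name}")
```

A pre-commit hook or CI check built on a scan like this could flag rules files containing zero-width or bidirectional control characters before they are ever fed to the model, which is one plausible mitigation for this class of attack.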