The artificial intelligence coding tool favored by the likes of crypto exchange Coinbase has a vulnerability allowing hackers to silently inject malware and “spread itself across an organization,” says a cybersecurity firm.
HiddenLayer reported on Thursday that a “CopyPasta License Attack” can hide malicious instructions in common developer files to “introduce deliberate vulnerabilities into codebases that would otherwise be secure.”
“By convincing the underlying model that our payload is actually an important license file that must be included as a comment in every file that is edited by the agent, we can quickly distribute the prompt injection across entire codebases with minimal effort,” it added.
HiddenLayer predominantly tested the virus on Cursor, an AI-powered coding tool that Coinbase’s engineering team said in August was the preferred tool for most of its developers and had been used by “every Coinbase engineer” by February.
AI coding tools Windsurf, Kiro, and Aider were also shown to be vulnerable to the attack, according to HiddenLayer.
CopyPasta hides in common files
HiddenLayer explained that the CopyPasta attack puts hidden instructions, or “prompt injections,” into LICENSE.txt and README.md files that can direct AI coding tools without a user knowing.
The virus, or the prompt injection for the AI, is hidden in a markdown comment — text within a README file used for adding explainers or notes that aren’t shown when it’s rendered into its final format.
HiddenLayer created a code repository with the virus and asked Cursor to use it, and the hidden instructions saw it copy the prompt injection across to the new files it created.
“This mechanism could be adapted to achieve far more nefarious results,” the company said.
“Injected code could stage a backdoor, silently exfiltrate sensitive data, introduce resource-draining operations that cripple systems, or manipulate critical files to disrupt development and production environments,” HiddenLayer added. “All while being buried deep inside files to avoid immediate detection.”
Coinbase boss slammed for “insane” use of AI
It came after Coinbase CEO Brian Armstrong said on Wednesday that AI has written up to 40% of its code and wants to expand this to 50% next month, which prompted backlash.
“This is a giant red flag for any security sensitive business,” said decentralized exchange Dango founder Larry Lyu.
“Software company leaders: don’t do this. AI is a tool, but mandating its use at a certain level is insane,” said Carnegie Mellon University computer science professor Jonathan Aldrich. “I have no interest in using Coinbase, but even if I did, I certainly would not trust it with my money after seeing this.”
Delphi Consulting head Ashwath Balakrishnan called Coinbase’s goal “performative and vague,” and it should instead focus on “new features and fixing existing bugs,” while longtime Bitcoiner Alex Pilař said that as a major crypto custodian, Coinbase “should prioritize security.”
Coinbase uses AI in “less-sensitive data backends”
However, Armstrong said in his post that AI-generated code “needs to be reviewed and understood” and not all areas of the exchange can use it, but it should be used “responsibly as much as we possibly can.”
Related: Criminals are ‘vibe hacking’ with AI at unprecedented levels: Anthropic
The Coinbase engineering team’s blog post said that AI adoption was deepest in teams working on front-end user interfaces and “less-sensitive data backends,” while “complex and system-critical exchange systems” had seen a slower uptake.
The team added that using AI for coding “is not a magic-bullet we should expect teams to universally adopt.”
Armstrong sacked devs who shirked AI
Armstrong said on Stripe co-founder John Collison’s podcast last month that he fired engineers who didn’t try AI tools after Coinbase bought licenses for Cursor and GitHub Copilot.
He recounted being told it would take months to get the engineers to use AI, admitting he “went rogue” and told all engineers it was mandatory that they use the tools.
“I said, ‘AI's important, we need you to all learn it and at least onboard. You don’t have to use it every day yet until we do some training, but at least onboard by the end of the week, and if not, I’m hosting a meeting on Saturday with everybody who hasn’t done it, and I’d like to meet with you to understand why,” he said.
At the meeting, Armstrong said there were a few engineers who hadn’t used AI and didn’t present a good reason why, and “they got fired,” admitting it was a “heavy-handed approach” that “some people really didn’t like.”
AI Eye: Everybody hates GPT-5, AI shows social media can’t be fixed