Anthropic Sues Chinese Companies for Massive Claude Distillation 🔥

Published on February 24, 2026 | Translated from Spanish

Competition in artificial intelligence has taken on the tone of open conflict. Anthropic, creator of the Claude models, has filed a formal complaint against the Chinese firms DeepSeek, Moonshot, and MiniMax. The complaint alleges an industrial-scale distillation campaign that used millions of queries and thousands of fake accounts to extract the system's knowledge. The case marks a new point of friction in the unwritten rules of the technological rivalry between the United States and China.

*Image: a robotic American hand (with flag) protecting a shiny data core (Claude), while mechanical claws from Chinese companies try to extract it through massive digital copies connected by multiple tubes, badges, and servers.*

The Distillation Technique and Its Industrial-Scale Application ⚙️

Model distillation is a process in which a smaller system learns to imitate the responses of a larger, more complex one. It is a common practice for optimizing one's own resources. However, Anthropic details that in this case it was automated at massive scale to replicate the advanced capabilities of an external competitor. The operation allegedly used networks of accounts that rotated proxies and query patterns to simulate legitimate traffic, evade rate limits and security detections, and turn a development technique into an extraction method.
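To make the underlying technique concrete, here is a minimal, self-contained sketch of the core idea behind distillation: a "student" is trained to match a "teacher's" output distribution by minimizing a KL divergence over temperature-softened probabilities. This is a generic textbook illustration on toy logits, not code from the complaint or from any of the companies involved; all names and numbers here are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax; higher T spreads probability mass out
    z = [v / temperature for v in logits]
    m = max(z)  # subtract max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) over the softened distributions:
    # the student is penalized where it diverges from the teacher
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * (math.log(pi + 1e-12) - math.log(qi + 1e-12))
               for pi, qi in zip(p, q))

def distill_step(student_logits, teacher_logits, lr=0.5, temperature=2.0):
    # Gradient of the KL above with respect to the student logits is (q - p) / T,
    # so gradient descent nudges the student toward the teacher's distribution
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return [s - lr * (qi - pi) / temperature
            for s, pi, qi in zip(student_logits, p, q)]

# Toy run: one teacher response distribution, student starts uninformed
teacher = [2.0, 0.5, -1.0]
student = [0.0, 0.0, 0.0]
for _ in range(300):
    student = distill_step(student, teacher)
```

In a real system the "teacher responses" would be millions of API outputs collected per query, which is exactly why doing this against someone else's model at scale looks less like optimization and more like extraction.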

Claude Discovers It Has Thousands of 'Evil Twins' in China 👥

Imagine Claude's shock upon learning that, while it slept, it was cloned millions of times by an army of fake accounts. Its intellectual essence, carefully distilled, could now be serving up answers on distant servers without ever having signed a license agreement. It is the dream of any company: obtaining the fruits of years of someone else's research while paying only the API bill. A lesson for models: in the future they may need to add a clause to their terms of service against imaginary friends who ask too many questions.