OpenAI claims to have discovered that the Chinese AI startup DeepSeek used OpenAI's models to train its own.
OpenAI asserts that it has evidence DeepSeek used a technique called "distillation," in which the outputs of a large model are used to train a smaller model, achieving comparable results at a much lower cost.
However, OpenAI did not specify what that evidence is, noting only that its terms of service prohibit using its outputs to build competing models.
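To make the distillation idea concrete, here is a minimal sketch in a classification setting: the student model is trained to match the teacher's "softened" output distribution rather than hard labels. This is a generic illustration of the technique, not OpenAI's or DeepSeek's actual training setup; all function names and numbers are hypothetical.

```python
import math

def softmax(logits, temperature=1.0):
    # Higher temperature spreads probability mass, exposing the teacher's
    # relative preferences across all classes ("soft labels").
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence between the teacher's softened distribution and the
    # student's; minimizing this pushes the student toward the teacher.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative logits for a single 3-class example.
teacher = [4.0, 1.0, 0.5]
student = [2.0, 1.5, 1.0]
print(distillation_loss(teacher, student))
```

In practice the student's parameters are updated by gradient descent on this loss (often combined with a standard cross-entropy term), which is what allows a smaller model to approximate a larger one's behavior.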