Distillation is the process of training smaller AI models using output from larger, more expensive ones, as part of an effort to lower the costs of training a powerful new AI tool. [File]
| Photo Credit: AP

The U.S. State Department has ordered a global push to bring attention to what it says are widespread efforts by Chinese companies, including AI startup DeepSeek, to steal intellectual property from U.S. artificial intelligence labs, according to a diplomatic cable seen by Reuters.

The cable, dated Friday and sent to diplomatic and consular posts around the world, instructs diplomatic staff to speak to their foreign counterparts about “concerns over adversaries’ extraction and distillation of U.S. A.I. models.”
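For readers unfamiliar with the term in the cable, distillation trains a smaller "student" model to imitate the output distribution of a larger "teacher" model, so the teacher's behaviour can be reproduced without access to its weights or training data. A minimal sketch of the core loss follows; all model names, logits, and numbers here are illustrative, not drawn from any real system.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution.

    A temperature above 1 'softens' the distribution, which is how
    distillation typically exposes the teacher's relative preferences.
    """
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence from the teacher's softened distribution to the student's.

    Training drives this toward zero, making the student mimic the teacher.
    """
    p = softmax(teacher_logits, temperature)  # teacher's "soft labels"
    q = softmax(student_logits, temperature)  # student's current guess
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical logits over a tiny 3-token vocabulary.
teacher = [4.0, 1.0, 0.5]
student = [2.0, 1.5, 1.0]
loss = distillation_loss(student, teacher)
# An optimizer would update the student's parameters to reduce this loss,
# transferring the teacher's behaviour query by query.
```

This is why the practice is hard to police: the student needs only the teacher model's outputs, which any API customer can collect.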

