According to a report from the public relations firm Chainstory, nearly half of cryptocurrency news coverage in major publications now carries a disclosure that some form of artificial intelligence tool was used. AI tools are having a significant impact on the information space, and journalism is no exception.

For its new report, Chainstory analyzed 80,000 articles from five major cryptocurrency news websites and found a notable increase in AI usage through 2025. During this period, 48% of articles from Investing.com, The Defiant, Benzinga, CoinDesk, and Bitcoin News included disclosures about AI usage, with Investing.com and The Defiant showing the highest proportion of AI-generated or AI-assisted content. Notably, these publications were selected because they have explicit disclosure policies for AI usage; Chainstory acknowledges that the actual figures at these five sites, and across the broader cryptocurrency news industry, could be higher or lower.

Chainstory also spoke with several editors about their AI policies, among them our own crypto.news editor Jayson Derrick. He told Chainstory that AI has its place in the newsroom, but that writing entire articles with AI defeats the purpose of the technology.

"AI can be an excellent research assistant, but it is a poor storyteller. It is very helpful for accelerating background tasks, such as summarizing long research reports, extracting key points from documents, or finding statistics that support a particular assertion. But in terms of actual content output, the end result is often poor."

Derrick explained that AI currently cannot replicate a true human voice; articles written by AI often come off as mechanical, which can feel insincere to readers. "I believe the audience can recognize whether the content is written by a real person, regardless of whether it's labeled," he said.

In response to the report, Chainstory co-CEO Afik Rechler stated: "AI has somehow become an important part of cryptocurrency news, and everyone is using it today. But it hasn't replaced human reporting. It can't, at least at this stage. Large language models cannot handle stories that require depth, deeper context, nuances, and so on. Using AI is not necessarily bad, but a certain balance must be maintained to uphold trust."