Copyright concerns create need for a fair alternative in AI

When future generations look back at the rise of artificial intelligence, 2025 may be remembered as a major turning point: the year the industry took concrete steps towards greater inclusion and embraced decentralised frameworks that recognise and fairly compensate every stakeholder.

The growth of AI has already transformed multiple industries, but the pace of uptake has raised concerns around data ownership, privacy and copyright infringement. Because AI development is centralised, with the most powerful models controlled by a handful of corporations, content creators have largely been sidelined. OpenAI, the world's most prominent AI company, has admitted as much. In January 2024, it told the UK's House of Lords Communications and Digital Select Committee that it could not have created its iconic chatbot, ChatGPT, without training it on copyrighted material. OpenAI trained ChatGPT on virtually everything posted on the public internet before 2023, yet the people who created that content, much of it copyrighted, have received no compensation: a major source of contention.

This creates an opportunity for decentralised AI projects such as the one proposed by the ASI Alliance to offer an alternative approach to AI model development. The Alliance is building a framework that lets content creators retain control over their data, along with mechanisms for fair reward should they choose to share their material with AI model makers. It's a more ethical basis for AI development, and 2025 could be the year it gets more attention.