In a recent consultation, the UK government outlined its "preferred option": amending copyright law to allow AI companies to train commercial models on publicly available content unless rights holders explicitly opt out. The change would be paired with stronger transparency obligations for AI firms. According to POLITICO, officials plan to publish assessments of technical solutions for meeting these requirements, in an effort to blunt backlash against the plan.

In its consultation response, OpenAI argued that experience elsewhere, notably in the European Union, shows that opt-out regimes face serious implementation difficulties, and warned that transparency requirements could discourage developers from engaging with the market. The company said the UK is at a pivotal moment, with a unique opportunity to establish itself as Europe's leading hub for artificial intelligence, and advocated a broad exemption from copyright restrictions to promote policy certainty, stimulate innovation, and support economic growth.

Google similarly argued that rights holders already have adequate tools to control their content, such as blocking web crawlers from scraping it. It noted, however, that those who opt out of AI training would not necessarily be entitled to compensation if their content inadvertently remains in a model's training dataset.
Read the full article on the original webpage: https://www.politico.eu/article/openai-google-reject-uks-ai-copyright-plan/