Publishing Groups Call for EU Transparency Laws on AI Training Data to Protect Books and Democracy
- European publishing bodies are calling for EU action on transparency in generative AI to protect democracy and the book industry.
- The statement comes after revelations that works by authors such as Zadie Smith were used without permission to train AI models.
- The groups argue the EU's AI Act should ensure transparency about the data sources used to train the foundation models that underpin generative AI.
- They say transparency obligations are easy for AI developers to comply with, as they rely on data the developers already hold.
- The groups argue transparency is needed now because existing AI models have long used works without consent or compensation.