New Bill Aims to Protect Artists from Unauthorized AI Deepfakes
- A new bipartisan Senate bill, the NO FAKES Act, aims to protect artists from unauthorized AI deepfakes of their likeness
- It would allow artists to pursue civil claims if their image or voice is used in an AI deepfake without permission
- The bill comes as AI-generated music featuring cloned voices of artists like Drake and The Weeknd has gone viral
- Artists and actors have also raised concerns about AI use during the Hollywood strikes
- The bill is supported by music industry groups like the RIAA, which says unauthorized AI likenesses are "instruments of theft" of artists' creative rights