NO FAKES Act Targets AI-Generated Deepfake Vocals

In a bold move, U.S. senators have proposed the NO FAKES Act to counter the proliferation of AI-generated deepfake vocals, epitomized by the notorious “Fake Drake” phenomenon. The legislation aims to empower artists and creators by establishing a federal right to sue those who forge digital replicas of their likenesses without consent.

Safeguarding Artistic Integrity

Music industry groups have eagerly welcomed the NO FAKES Act, highlighting its potential to safeguard fundamental rights and combat the misappropriation of creative work. The Recording Industry Association of America (RIAA) embraces technological advancement while advocating for measures that protect against infringement on artists’ rights. Similarly, the American Association of Independent Music underscores the need for balance, ensuring the legislation serves both established artists and emerging talent and keeps the creative landscape equitable and sustainable.

The NO FAKES Act: A Paradigm Shift

The proposed NO FAKES Act would transform the legal landscape by establishing a nationwide property right, granting individuals the ability to take legal action over unauthorized AI-generated replicas of their voice, image, or visual likeness. With a focus on longevity, this right extends beyond an individual’s lifetime, giving heirs control for up to 70 years after death. However, the delicate balance between protecting individual rights and preserving the First Amendment remains a central concern, prompting the inclusion of specific carveouts for replicas used in certain contexts, such as news coverage and historical works.

[H/T] – Billboard