Ex-NPR Host Alleges Google Stole Voice for AI Product
David Greene, a former NPR host, has taken legal action against Google. He alleges that the tech giant used his voice without permission to build the AI voice technology behind NotebookLM's recently introduced Audio Overviews feature.
Background of the Allegation
NotebookLM's Audio Overviews feature, introduced in 2024, lets users convert written notes and documents into AI-generated, podcast-style audio. These audio discussions typically feature two co-hosts, one with a male voice and one with a female voice. Greene claims that the male co-host's voice closely mimics his own and was trained on extensive recordings of his broadcast work.
Details of the Lawsuit
- Greene argues that Google replicated his unique vocal qualities without consent.
- The lawsuit states that Google has misappropriated his career, identity, and livelihood for profit.
- Greene was alerted to the similarities by colleagues and sought confirmation from an AI forensic firm.
The forensic analysis reported a confidence level of 53% to 60% that the AI voice mirrored Greene's, and the firm's CEO expressed strong confidence that Google's podcast model was trained on Greene's voice.
Google’s Response
Google disputes the allegations, saying the male voice in NotebookLM's Audio Overviews was created using a paid professional voice actor, not Greene's voice. José Castañeda, a spokesperson for Google, called the allegations "baseless."
The Broader Implications
This lawsuit highlights ongoing disputes over how AI systems use intellectual property. The rapid development of AI technologies has raised questions about what data these systems are trained on and whether that use was authorized.
AI Voice and Image Usage Concerns
A central concern is autonomy and control: individuals can lose say over how their voices and likenesses are reproduced by AI systems. Greene's case is one of several high-profile disputes over creators' rights in the age of AI.
Industry Context
Similar controversies have arisen recently. Notably, actress Scarlett Johansson criticized OpenAI after it released a ChatGPT voice that she said sounded strikingly similar to her own, despite her having declined to lend her voice, further underscoring the industry's need for clearer regulatory guidelines.
As the dialogue around AI and intellectual property continues, the outcomes of lawsuits like Greene’s could shape future practices in content creation and technology.