Google is denying claims that one of its AI tools copied the voice of a well-known public radio host. The company responded after a lawsuit accused its software of replicating his distinctive voice without permission.
Dispute centers on AI podcast feature
David Greene, former host of NPR’s Morning Edition and current host of KCRW’s Left, Right & Center, is suing Google over the male voice in its NotebookLM tool, which can turn written material into podcast-style audio. Greene said the AI voice closely resembles his own, including his cadence, tone, and vocal mannerisms.
According to a report from TechCrunch, Greene became aware of the similarity after colleagues and friends asked whether he had licensed his voice to Google. After listening to the tool himself, he concluded the AI-generated host sounded strikingly like him, prompting the legal action.
NotebookLM includes an “Audio Overviews” feature that generates podcast-style summaries read by virtual hosts. The lawsuit alleges that the male voice replicates Greene’s speaking style and vocal identity, which he says is central to his broadcasting career.
Google rejects the allegation
Google has denied using Greene’s voice, saying the audio in NotebookLM is based on a paid professional voice actor and has no connection to the radio host.
Google’s position is that any resemblance is coincidental and not the result of the AI being trained on Greene’s voice.
Lawsuit highlights growing AI voice concerns
The Washington Post reported that Greene’s legal team submitted a third-party audio analysis suggesting a 53% to 60% confidence match between the AI voice and Greene’s, a range they describe as significant. Greene argues the resemblance could harm his reputation if the AI voice is associated with content he does not control.
The case raises broader questions about synthetic voices and consent, especially as AI tools become increasingly capable of producing human-like speech. Other public figures have previously raised concerns about AI voices that sound like them without their permission.
Personal identity at the center of the dispute
Greene has said his voice is a core part of his professional identity and that hearing a similar-sounding AI host was unsettling. According to reporting, he is not opposed to AI technology itself but argues that safeguards should be in place when a digital voice closely resembles a real person.
The lawsuit adds to a growing list of legal challenges tied to AI-generated likenesses, including voice and image replication.
As generative tools expand, the outcome could help shape how companies approach voice modeling, licensing, and identity protection in future AI products.