Lehrman v. Lovo, Inc.
On July 10, 2025, the U.S. District Court for the Southern District of New York issued an order granting in part and denying in part a motion to dismiss a putative class action that Paul Lehrman and Linnea Sage commenced against Lovo, Inc. The lawsuit, Lehrman v. Lovo, Inc., alleges that Lovo used artificial intelligence to make and sell unauthorized “clones” of their voices.
Specifically, the complaint alleges that the plaintiffs are voice-over actors who, for a fee, read and record scripts for their clients. Lovo allegedly sells a text-to-speech subscription service that allows its customers to generate voice-over narrations. The service is described as one that uses “AI-driven software known as ‘Generator’ or ‘Genny,’” which was “created using ‘1000s of voices.’” Genny allegedly creates voice clones, i.e., copies of real people’s voices. Lovo allegedly granted its customers “commercial rights for all content generated,” including “any monetized, business-related uses such as videos, audio books, advertising promotion, web page vlogging, or product integration.” (Quoting Lovo’s terms of service.) The complaint alleges that Lovo hired the plaintiffs to provide voice recordings for “research purposes only,” but that Lovo proceeded to exploit the recordings commercially by licensing their use to Lovo subscribers.
This lawsuit ensued.
The complaint sets out claims for:
- Copyright infringement
- Trademark infringement
- Breach of contract
- Fraud
- Conversion
- Unjust enrichment
- Unfair competition
- Violation of New York civil rights law
- Violation of New York consumer protection law
The defendant moved to dismiss the complaint for failure to state a claim.
The copyright claims
Sage alleged that Lovo infringed the copyright in one of her voice recordings by reproducing it in presentations and YouTube videos. The court allowed this claim to proceed.
Plaintiffs also claimed that Lovo’s unauthorized use of their voice recordings in training its generative-AI product infringed their copyrights in the sound recordings. The court ruled that the complaint did not contain enough factual detail about how the training process infringed one of the exclusive rights of copyright ownership. Therefore, it dismissed this claim with leave to amend.
The court dismissed the plaintiffs’ claims of output infringement, i.e., claims that the “cloned” voices the AI tool generated infringed copyrights in the original sound recordings.
Copyright protection in a sound recording extends only to the actual sounds fixed in that recording. Under 17 U.S.C. sec. 114(b), an independent fixation of other sounds that merely imitate or simulate the sounds captured in the original recording does not infringe the copyright in the sound recording.
This issue often comes up in connection with copyrights in music recordings. If Chuck Berry writes a song called “Johnny B. Goode” and records himself performing it, he owns two copyrights: one in the musical composition and one in the sound recording. If a second person then records himself performing the same song without a license (compulsory or otherwise), that person infringes the copyright in the musical composition but not the copyright in the sound recording. This is true even if he is very good at imitating Berry’s voice and guitar work. For a claim of sound recording infringement to succeed, it must be shown that the actual recording itself was copied.
Plaintiffs did not allege that Lovo used Genny to output AI-generated reproductions of their original recordings. Rather, they alleged that Genny is able to create new recordings that mimic attributes of their voices.
The court added that the sound of a voice is not copyrightable expression, and even if it were, the plaintiffs had registered claims of copyright in their recordings, not in their voices.
The trademark claims
In addition to infringement, the Lanham Act creates two other potential bases of trademark liability: (1) false association and (2) false advertising. 15 U.S.C. sec. 1125(a)(1)(A) and (B). The plaintiffs asserted both kinds of claims, and the court dismissed both.
False association
The Second Circuit Court of Appeals recently held, in Electra v. 59 Murray Enter., Inc. and Souza v. Exotic Island Enters., Inc., that using a person’s likeness, without permission, to suggest an endorsement can constitute a “false association” violation. In other words, there is a federally protected, trademark-like interest in one’s image, likeness, personality and identity. (See, e.g., Jackson v. Odenat.)
While acknowledging that this right extends to one’s voice, the judge ruled that the voices in this case did not function as trademarks. They did not identify the source of a product or service; rather, they were themselves the product or service. For this reason, the judge ruled that the plaintiffs had failed to show that their voices, as such, are protectable trademarks under Section 43(a)(1)(A) of the Lanham Act.
False advertising
Section 43(a)(1)(B) of the Lanham Act (codified at 15 U.S.C. sec. 1125(a)(1)(B)) prohibits misrepresentations about “the nature, characteristics, qualities, or geographic origin of . . . goods, services, or commercial activities.” The plaintiffs claimed that Lovo marketed their voices under different names (“Kyle Snow” and “Sally Coleman”). The court determined, however, that this was not false or misleading, because Lovo marketed the voices as what they were, namely, synthetic clones of the actors’ voices, not their actual voices.
Plaintiffs also claimed that Lovo’s marketing materials falsely stated that the cloned voices “came with all commercial rights.” They asserted that they had not granted those rights to Lovo. The court ruled, however, that even if Lovo was guilty of misrepresentation, it was not the kind of misrepresentation that comes within Section 43(a)(1)(B), as it did not concern the nature, characteristics, qualities, or geographic origin of the voices.
State law claims
Although the court dismissed the copyright and trademark claims, it allowed some state law claims to proceed. Specifically, the court denied the motion to dismiss claims for breach of contract, violations of sections 50 and 51 of the New York Civil Rights Law, and violations of New York consumer protection law.
Both the common law and the New York Civil Rights Law prohibit the commercial use of a living person’s name, likeness or voice without consent. Known as “misappropriation of personality” or violation of publicity or privacy rights, this is emerging as one of the leading issues in AI law.
The court also allowed state law claims of false advertising and deceptive trade practices to proceed. The New York laws are not subject to the “nature, characteristics, qualities, or geographic origin” limitation set out in Section 43(a) of the Lanham Act.
Conclusion
I expect this case will come to be cited for the rule that copyright cannot be claimed in a voice. Copyright law protects only expression, not a person’s corporeal attributes. The lack of copyright protection for a person’s voice, however, does not mean that voice cloning is “legal.” Depending on the particular facts and circumstances, it may violate one or more other laws.
It should also be noted that after the 2024 incident in which an AI-generated clone of President Joe Biden’s voice was used in robocalls to New Hampshire voters, states have been enacting statutes regulating the creation and distribution of voice clones. Even where a specific statute is not applicable, though, a broader statute (such as the FTC Act or a similar state law) might cover the situation.
Images and references in this blog post are for illustrative purposes only. No endorsement, sponsorship or affiliation with any person, organization, company, brand, product or service is intended, implied, or exists.
