A telecom provider has agreed to pay the FCC a $1 million fine for transmitting deceptive robocalls that used AI to mimic President Joe Biden’s voice.

What’s the deal: Lingo Telecom, a voice service provider, transmitted AI-generated robocalls to thousands of New Hampshire voters that mimicked President Biden’s voice. The calls falsely claimed that voting in the state’s presidential primary would block participation in the November general election. The FCC argued this was intended to mislead voters.

What the call said: In a voice that sounded like President Biden, the AI-generated message said, “It’s important that you save your vote for the November election… Voting this Tuesday only enables the Republicans in their quest to elect Donald Trump again.”

What happened today: Lingo Telecom agreed to settle with the FCC, paying a $1 million fine for transmitting the deceptive robocalls. The FCC initially sought a $2 million penalty, but the amount was reduced in the settlement. Steve Kramer, the political consultant behind the calls, still faces a proposed $6 million FCC fine and state criminal charges, including voter suppression and impersonating a candidate, which could lead to prison time if he is convicted.

FCC reacts: In a statement, FCC Chairwoman Jessica Rosenworcel said, “Every one of us deserves to know that the voice on the line is exactly who they claim to be. If AI is being used, that should be made clear to any consumer, citizen, and voter who encounters it. The FCC will act when trust in our communications networks is on the line.”
