FCC Chair Proposes Making AI-Generated Robocalls Illegal

Robocalls that use AI-generated voices, including those of celebrities and politicians, would be deemed illegal under a new proposal from FCC Chairwoman Jessica Rosenworcel.

The agency would be able to restrict the calls under the Telephone Consumer Protection Act, which limits the use of artificial or pre-recorded voice messages without the recipient's consent. The law was passed more than 30 years ago to try to curb robocalls, but it does not apply to calls made for non-commercial purposes.

The issue of AI robocalls gained new attention just last week, when New Hampshire residents received calls that appeared to use AI voice technology to make it seem as if they were coming from Joe Biden. There also have been concerns over the use of deepfakes of celebrities like Taylor Swift. In the Biden call, the voice tried to discourage voters from heading to the polls, just as the president's allies were in the midst of a write-in campaign to ensure that he won the state's primary. He did.

In a statement, Rosenworcel said, “AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate. No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible that we could be a target of these faked calls.”

The proposal would recognize AI-generated voices as artificial under the Telephone Consumer Protection Act, meaning they would be illegal under existing law. State attorneys general could then "crack down on these scams and protect consumers."

The FCC launched a notice of inquiry in November to examine how the agency could curb robocalls and address the use of AI technology.
