Longshot Democratic presidential candidate Rep. Dean Phillips, D-Minn., is distancing himself from a report that one of his campaign's former consultants hired a magician to create a deepfake of President Biden urging New Hampshire voters not to participate in last month’s primary.
Paul Carpenter, a magician from New Orleans, came forward and said he had created the deepfake and that Democratic consultant Steve Kramer had paid him $150 to do it, according to an NBC report. Kramer is a get-out-the-vote specialist who worked on ballot access for the Phillips campaign and also worked on Kanye West's unsuccessful 2020 presidential campaign.
"I’m disgusted that a consultant hired to assist my campaign [with] ballot access is alleged to have faked a robocall impersonating Joe Biden," Phillips wrote on X on Friday.
"While I don’t know the person, such behavior is despicable and I trust will be investigated by authorities. It’s also despicable that the Party actively limits access to state ballots and blackballs reputable consultants who would otherwise work with challengers like me. The corruption in politics is pervasive and must be exposed and addressed."
The Phillips campaign told NBC that its relationship with Kramer ended weeks ago, once his signature-gathering work to get Phillips on the ballot in certain states was complete.
"If it is true that Mr. Kramer had any involvement in the creation of deepfake robocalls, he did so of his own volition which had nothing to do with our campaign," Phillips' press secretary Katie Dolan told NBC in a statement.
"The fundamental notion of our campaign is the importance of competition, choice, and democracy. We are disgusted to learn that Mr. Kramer is allegedly behind this call, and if the allegations are true, we absolutely denounce his actions."
Kramer, a longtime political operative, meanwhile, told NBC that he would give his side of the story in a Saturday op-ed.
Carpenter shared text messages, call logs and Venmo transactions with NBC to back up his claim about a scheme that is now at the center of a multi-state law enforcement investigation.
"I created the audio used in the robocall. I did not distribute it," Carpenter told NBC. "I was in a situation where someone offered me some money to do something and I did it. There was no malicious intent. I didn’t know how it was going to be distributed."
The date New Hampshire set for its primary was out of compliance with the DNC's 2024 presidential nominating calendar. Holding an unsanctioned primary meant President Biden was not on the New Hampshire ballot, but Granite State Democrats launched a write-in campaign in an attempt to prevent an electoral embarrassment for the president as he runs for a second term in the White House.
"What a bunch of malarkey. You know the value of voting Democratic when our votes count. It’s important that you save your vote for the November election," the voice says in a recording of the message obtained by NBC News.
"We will need your help in electing Democrats up and down the ticket. Voting this Tuesday only enables Republicans in their quest to elect Donald Trump again. Your vote makes a difference in November, not this Tuesday."
The fake Biden calls reached an estimated 5,000 to 25,000 people, according to authorities cited in NBC's investigation.
The caller IDs made the calls appear to come from the former chairperson of the New Hampshire Democratic Party, who was running a pro-Biden write-in campaign at the time.
Following the revelations, New Hampshire Attorney General John Formella announced an investigation into the calls, and the Federal Communications Commission has since made it illegal to use AI-generated robocalls that mimic the voices of political candidates to mislead voters.
Formella said investigators had identified the Texas-based Life Corp. as the source of the calls and that the calls were transmitted by another Texas-based company, Lingo Telecom.
Carpenter told NBC News he was coming forward about the deepfake because he regrets his involvement and wants to warn people about how easy it is to use AI to mislead.
"It’s so scary that it’s this easy to do," Carpenter said. "People aren’t ready for it."
He said it cost him $1 to make the deepfake and that he was paid $150, according to Venmo payments he shared with the outlet. He also provided NBC with the original audio file.
Carpenter said he also created two deepfakes of Sen. Lindsey Graham, R-S.C., asking GOP presidential primary voters which candidate they supported, and said he believed all three calls had been authorized by the respective campaigns. He said he had not heard of Phillips before making the deepfakes.
Carpenter told NBC that he had been introduced to Kramer through a mutual acquaintance and that Kramer had taken an interest in his experience with AI.
Fox News’ Danielle Wallace and Greg Norman contributed to this report.