The governor of Arizona sure loves a veto. Since taking office over a year ago, Democrat Katie Hobbs has already broken the Grand Canyon State’s veto record, saying the legislation she’s been vetoing is “extreme.” Among the many measures Hobbs has vetoed so far in 2024 is Senate Bill 1336, which would have criminalized the posting of deepfakes without the subject’s consent.

S.B. 1336 would have established criminal penalties for making or distributing “deepfakes.” As you might know, deepfakes are videos or images that use artificial intelligence to realistically replace one person’s likeness with another’s. While some deepfakes are made for entertainment, they can be misused to spread misinformation or damage someone’s reputation – or worse. Much, much worse.

Deepfake Dilemmas

A recent story in The New Yorker offers an extreme example of deepfakes put to dark use. A Brooklyn couple got a middle-of-the-night phone call from what sounded like the husband’s elderly parents, joined by a third voice: a man claiming to be holding them hostage. The call imitated the real parents’ voices and appeared to come from the elderly mother’s number as saved in her daughter-in-law’s caller ID. The captor demanded an oddly low ransom of about $500.

The terrified younger couple paid via Venmo and immediately dialed the elderly father’s phone, frantically asking whether the parents were okay and had been released. The elderly couple had no idea what their son and his wife were talking about. The whole thing, it turned out, had been staged with deepfake technology. But the money transferred? That, of course, was real.

The technology for synthesized voices, images, and videos has become astoundingly good. In the past couple of years, companies have started creating business models around it. Unfortunately, much of it is being used for nefarious purposes.

The problem is hardly limited to the U.S. A finance worker in China’s Shanxi province was tricked into transferring hundreds of thousands of dollars to a fraudster who reached her on a video call using a deepfake of her boss. And in Chandler, Arizona, a scammer used a deepfake of a woman’s daughter’s voice over the phone to convince her that the girl had been kidnapped for ransom. The daughter was found safe and sound at her workplace soon after the call.

Arizona Republicans Concerned

In May, Arizona Republicans shared stories of incidents like these. One of them, Representative David Cook, voted yes on S.B. 1336, saying, “These things are not OK, and we have to take action today.” Republicans in both the House and the Senate unanimously supported the measure.

The legislation would have made it a crime to disseminate deepfake images of a person without consent under certain parameters, namely if the images depicted the person nude or engaging in sex acts. The offense would have been classified as either a class 4 or class 6 felony, depending on factors such as the perpetrator’s intent, the method of dissemination, and the impact on the victim.

Dems, ACLU Find Bill Problematic

Despite unanimous support from the Right, the Democratic vote split down the middle in both the state Senate and House. Among Democrats’ concerns was the bill’s impact on freedom of expression.

The ACLU spoke out against the bill for similar reasons, calling it overly broad. The organization’s policy director, Darrell Hill, said: “Our concern is that it criminalizes speech that’s protected by the First Amendment of the Constitution, particularly political speech that’s satire. We would have loved to see explicit language with S.B. 1336 holding that those depictions are criminal when they are done with the intent to defraud or harass.”

Other Anti-Deepfake Bills Already Signed

Governor Hobbs hasn’t said whether she agrees with the ACLU. But despite vetoing S.B. 1336, she certainly does not seem to be okay with the misappropriation of deepfakes in her state. She denies accusations that she simply refuses to work with Republicans, and she’s been supportive of other bills targeting deepfakes. In fact, she has already signed several such bills into law.

For example, she signed House Bill 2394, which allows people to get a court order to take down “digital impersonations” published without their consent. She also signed Senate Bill 1359, which regulates deepfakes of candidates running for election. That law requires a disclosure when media includes AI-generated content and imposes civil penalties on those who don’t comply, though it allows exceptions for satire and parody.

New Alternative Bill Still to Come

Part of Governor Hobbs’s reason for vetoing S.B. 1336 was that she considered it duplicative of bills she has already signed and very similar to another bill she says she plans to sign: Senate Bill 1078. That bill would establish penalties similar to those in the vetoed measure but includes the language the ACLU urged regarding the intent to defraud or harass. Hobbs says this alternative bill would accomplish the same goals while addressing the concerns raised by critics of the vetoed bill.

S.B. 1078 would establish a class 5 felony for the use of fake audio recordings, videos, or images with the intent to harass or defraud the subject of the fake media. It passed both chambers of the Arizona Legislature with a good deal of support, though it hasn’t yet made its way to Governor Hobbs’s desk for a signature.

All in all, Arizona is just one example of a broader challenge that legislators elsewhere in the country (and the world) are likely to face when it comes to regulating deepfakes and other online content.
