WATCH: Sen. Kelly Highlights Danger of AI Scams in Senate Hearing
Testimony features Scottsdale scam victim
Today, during a Senate Aging Committee hearing, Arizona Senator Mark Kelly addressed the dangers of artificial intelligence being used to scam Arizonans. In his remarks, Kelly highlighted the video testimony of a Scottsdale mother who received a fake phone call from scammers using a voice clone of her daughter saying she had been kidnapped and demanding ransom.
Kelly called on Congress to address legal gaps that make it harder to prosecute attempted but ultimately unsuccessful scams.
Sen. Kelly: Chairman Casey, I want to thank you for the video you showed at the top of the hearing that told the story of Jennifer DeStefano.
Jennifer is a mom of four from Scottsdale. As we heard, earlier this year she got a call from an unknown number. When she picked up, it was her 15-year-old daughter.
Her daughter was crying and calling out for her. A man got on the phone and threatened to harm her kid unless Jennifer paid $50,000. The man said he needed the money in cash and would be coming to her in a van.
Folks nearby called 911, as well as calling Jennifer’s husband. It turned out that her daughter was just at home—not with kidnappers.
The call wasn’t real. Scammers had, in this case, used AI to create a voice that sounded like her daughter’s, and she couldn’t tell the difference.
And even though that moment of extreme, horrific terror probably shook Jennifer to her core, the police said there was nothing they could do. No money was transferred. No crime, they said, had been committed.
This feels to me, and I imagine to many other Americans, like a huge blind spot in the law. We’ve got a couple of lawyers on the panel here.
Mr. Schildhorn and Mr. Weisman, how should we in Congress be looking at filling these gaps in the law?
Gary Schildhorn (attorney and AI-powered scam victim): Thank you, Senator.
As I think you’ve just mentioned as part of your question, it is a fundamental principle of our system that there is a remedy if you are harmed. With crypto and AI, law enforcement does not have a remedy and neither does the judicial system. You can’t find anyone to sue.
So, my answer is that there needs to be some legislation that allows these people to be identified or where the money has gone to be identified so that there is a remedy for the harm that’s being caused.
Currently, there is a hole in the system. There is no remedy that I’m aware of.
Sen. Kelly: How about the issue in this specific case?
By the way, this has happened to somebody I’m rather close with. Almost the exact same thing: in that case it was a grandparent who got the same kind of call about a grandkid. No money was transferred, but it was incredibly shocking.
To me, that still seems like a crime—attempting to rip somebody off even if they weren’t successful.
Do you feel that we should make that a crime with criminal penalties?
Gary Schildhorn: Well, Senator Kelly, I’m not an expert on this but I am a lawyer, so that’s never stopped me from giving an opinion.
In this instance, there are analogies in the law to intentional infliction of emotional distress. I believe that’s a cause of action in many states. There might be a way to enhance that type of law when someone uses this technology.
Even if you do not lose money, if someone causes that kind of shock and distress, the law could allow you to recover a sum of money calculated not by how much you actually lost but by how much pain and suffering you have incurred because you’ve been subjected to that type of extortion.
Sen. Kelly: Mr. Weisman.
Steve Weisman (law professor and creator of Scamicide.com): I think Gary hit on the key word there. Extortion is a crime. Attempted extortion is a crime.
I do think it is already a criminal violation, but I agree with you: some federal legislation addressing this particular medium of delivering the extortion could be done.
The other thing, something I tell my students when we talk about white-collar crime, is that it’s about the money.
Here, as has been said before, it is very difficult to trace. They’re using burner phones. Who knows where they may be. They may be using voice cloning technology. They could be in a foreign country where, thanks to that technology, their accents can no longer be heard.
But what we can do, as the Senior Safe Act does, is go after the gift cards, because scammers get paid by gift cards. Go after the wire transfers. Go after the banking, so you stop it there: when people in the rush of emotion go to pay by gift card, the seller or the bank can ask where the money is going and what it’s for. They can recognize the scam and stop it before it actually occurs.
Sen. Kelly: Thank you.