submitted 15 Jun 2023 by 0x815@feddit.de to c/technology@beehaw.org

After being scammed into thinking her daughter had been kidnapped, an Arizona woman testified before the US Senate about the dangers of artificial intelligence technology in the hands of criminals.

[-] animist@lemmy.one 18 points 2 years ago

This is gonna become waaaaay more common. I am already working on code words with my family members just in case.

[-] loklan@kbin.social 6 points 2 years ago

This is an excellent idea; I'm going to discuss it with the family. My dad gets phished constantly. He's pretty savvy, but he also has early dementia, so between that and AI, it's going to get harder in the future.

[-] carnha@lemmy.one 12 points 2 years ago

I'd only been thinking about the implications of faking a celebrity's voice - personalizing it like this makes me sick to my stomach. I had no idea it was already that easy. I don't think the voice would even have to be that realistic - if they're faking a life-threatening situation, my first thought isn't going to be "Hey, their voice sounded a little off." Absolutely horrifying.

[-] arcticpiecitylights@beehaw.org 12 points 2 years ago

Jesus fucking Christ, man.

[-] DarkThoughts@kbin.social 10 points 2 years ago

Is no one questioning how the alleged kidnappers managed to build a voice profile of a random 15-year-old girl convincing enough to pull this off? The only source claiming this was potentially an AI scam was, in fact, just another parent:

But another parent with her informed her police were aware of AI scams like these.

Isn't it more likely that dad & daughter did this and it backfired?

[-] Stumblinbear@pawb.social 4 points 2 years ago

It's pretty easy to create voice clones now. As long as you tailor the speech you want it to produce and don't have it speak for too long, the result can get pretty convincing even with very little input.
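
For a sense of how little this takes: below is a minimal sketch of few-shot voice cloning using the open-source Coqui TTS library and its XTTS v2 model, one publicly documented way to do this. The file names and the spoken text are placeholders, not anything from the article.

```python
# Minimal sketch: few-shot voice cloning with the open-source Coqui TTS
# library (XTTS v2). File names and text are placeholders for illustration.
from TTS.api import TTS

# Load a multilingual model that supports cloning from a short reference clip.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Synthesize arbitrary text in the voice of the reference recording.
tts.tts_to_file(
    text="Hi, it's me. I need you to call me back right away.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

The point of the sketch is that the whole pipeline is two calls: load a pretrained model, then hand it a short audio clip and a string.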

[-] davidhun@lemmy.sdf.org 3 points 2 years ago

Given the prevalence of social media platforms where people post videos of themselves, it seems pretty easy to collect enough voice samples to generate a convincing clone. And depending on how much personal info she and her family put out on social media, it's trivial to connect the dots and concoct a plausible scenario to scam someone.

Now whether or not it was "just a prank, bro" from family or whomever, I don't know.

[-] PlantJam@beehaw.org 2 points 2 years ago

All it takes is a three-second sample, according to "The AI Dilemma" on YouTube. It's about an hour long, but it has a lot of really good information.

[-] scrubbles@poptalk.scrubbles.tech 1 point 2 years ago

Correct, it doesn't take much anymore to train up a voice model, especially a hysterical-sounding one that would trick a mother. Teens post enough on social media for this to be done.

[-] WorseDoughnut@kbin.social 8 points 2 years ago

When DeStefano tried to file a police report after the ordeal, she was dismissed and told this was a “prank call”.

Why am I not surprised.

[-] Gray@lemmy.ca 8 points 2 years ago

My grandma fell for a scammer who was pretending to be one of her grandchildren stuck in a jail in Mexico over a mixup. No AI voice or anything, just an actor and a vulnerable 90+ year old woman. She sent the scammer $10,000. I cannot fucking begin to imagine what AI is going to do to the scamming industry.

[-] Entropywins@kbin.social 7 points 2 years ago

What an amazing grandma and what absolute pieces of shit to take advantage of such a nice person.

[-] IllegallyBlonde@kbin.social 4 points 2 years ago

A similar thing happened to me and my husband. We were out of state on vacation, and got a call and a visit from the police at 3 am saying that my husband had been kidnapped for ransom. Of course he was lying right next to me, but for half a second I was terrified. It was a complete scam. They think the scammers found my husband's information in a data breach at his school.

[-] pokexpert30@lemmy.pussthecat.org 7 points 2 years ago

Teach your grandparents about these scams. Insist on the point that "nowadays a computer can replicate a voice over the phone well enough that it will really sound like the real person," as well as "the number that calls you may look like it's mine. Just hang up and call me back if you receive a call like that."

[-] Plume@beehaw.org 7 points 2 years ago

I hate everything about AI, and this is not helping. It feels like we threw wide open a door that we never should have touched.

[-] hoshikarakitaridia@lemmy.fmhy.ml 5 points 2 years ago

Yeah that's the bad stuff.

[-] Pluto_Is_A_Planet@kbin.social 2 points 2 years ago

That's terrifying and likely going to be way more common going forward.

[-] TruthButtCharioteer@kbin.social 2 points 2 years ago* (last edited 2 years ago)

Soooo... hold up.

  1. Take out kidnapping insurance
  2. Go to Mexico
  3. Get "kidnapped"
  4. Run the scam with your fancy-schmancy AI
  5. Get the payout and get "rescued"

[-] ActuallyRuben@actuallyruben.nl 5 points 2 years ago

If you're running the scam yourself, why would you even use an AI to mimic yourself?

[-] somniumx@feddit.de 3 points 2 years ago

Your scammers were so preoccupied with whether they could, they didn’t stop to think if they should!
