submitted 10 months ago by yogthos@lemmy.ml to c/technology@lemmy.ml
[-] TowardsTheFuture@lemmy.zip 4 points 10 months ago

What? All of the photo is real. The camera just takes something like 50 frames over a second or two and picks the best one. Since it sees 3 “people,” it chooses the best frame for each “person,” which leads to those 3 not being from the exact same moment, but still close enough.

I can’t imagine very many photos in evidence being like “well, we can see you were holding the knife, actively stabbing toward them, and also that they were stabbed to death by that knife, but I mean who knows WHAT could’ve happened in the one second the phone took the frames it stitched together to create this.”

[-] nicky7@lemmy.ml -1 points 10 months ago* (last edited 10 months ago)

edit: since it wasn't obvious to readers, this is a hypothetical about a techno-dystopian future...

Imagine taking a selfie only to see an image of you holding a knife. But there are no knives in your hands. Another snap. The same image displays on the screen, but now there's a person of particular importance in the background. You turn your head, but you're all alone. Nobody is around. You're starting to freak out. Are you being pranked? Maybe your phone has been hacked. Another shutter sound effect, and you see an image of yourself standing over a victim. You frantically open your camera's gallery, thinking your eyes are fooling you, but the photos are the same. And they've already been sent to the cloud. Deleting isn't allowed; the AI detected felonious imagery. You've been reported to multiple agencies. You are alone. There are no knives in your hands.

[-] TowardsTheFuture@lemmy.zip 2 points 10 months ago

What? Again, the images are real. They just take 50 shots and use the best frame, so movement can happen in between. They’re not using AI to make the image into what they think it should be. They’re just stitching together a bunch of frames taken at basically the same time (within 1-2 seconds) into one picture.
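For anyone curious what that “best frame per subject” idea might look like, here’s a rough, purely illustrative sketch (the function and variable names are made up, and real phone pipelines also do frame alignment, seam blending, and ML-based selection): score each subject’s crop for sharpness across the burst, then paste the sharpest version of each subject into a base frame.

```python
# Illustrative sketch only -- not any vendor's actual pipeline.
import cv2

def sharpness(crop):
    """Score a crop by variance of the Laplacian (higher = sharper)."""
    gray = cv2.cvtColor(crop, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def composite_best_take(frames, subject_boxes):
    """frames: list of same-size BGR images from a short burst.
    subject_boxes: list of (x, y, w, h) regions, one per detected subject.
    Returns a copy of the first frame with each subject region replaced by
    the sharpest version of that region found anywhere in the burst."""
    result = frames[0].copy()
    for (x, y, w, h) in subject_boxes:
        crops = [f[y:y + h, x:x + w] for f in frames]
        best = max(crops, key=sharpness)
        result[y:y + h, x:x + w] = best
    return result
```

Every pixel in the composite comes from a real captured frame; the only thing “chosen” is which instant each subject is taken from.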

[-] nicky7@lemmy.ml 1 points 10 months ago* (last edited 10 months ago)

I was taking the comment thread (about how dangerous this could be for photographic evidence) a step further by imagining a hypothetical techno-dystopian future where corporate-controlled AI alters photos to make them look better, but in reality creates a back door through which incriminating evidence can be fabricated.
