[-] 0x0@lemmy.dbzer0.com 24 points 3 months ago

Oh my god, enough already! Please give someone else a chance to reply! You're taking up all the internet space.

[-] 0x0@lemmy.dbzer0.com 24 points 4 months ago

Am I being dense? I don't get it.

[-] 0x0@lemmy.dbzer0.com 26 points 6 months ago

Coming from someone who put their phone number in their username

[-] 0x0@lemmy.dbzer0.com 23 points 7 months ago

"Lied" implies intent, which is a very squishy subject. I'd prefer they stick to just the facts, please. I'm no lawyer, but I suspect you might be asking for libel suits if you claim somebody lied and can't actually prove that they did so intentionally.

[-] 0x0@lemmy.dbzer0.com 24 points 7 months ago

Are you sure about that? I'm not a lawyer, but my understanding is that he is now a convicted felon, just not a sentenced one.

[-] 0x0@lemmy.dbzer0.com 25 points 7 months ago

I think you're mistaking him for the meaty urologist

[-] 0x0@lemmy.dbzer0.com 23 points 1 year ago

Even that would be technically incorrect. I believe you could put an A record on a TLD if you wanted. In theory, my email could be me@example.

Another hole to poke in the single dot regex: I could put in fake@com. with a dot trailing after the TLD, which would satisfy "dot after @" but is not a valid address, to my knowledge.
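
For what it's worth, here's a quick Python sketch of the kind of "single dot after the @" check I mean (the pattern is my own stand-in, not quoted from anyone's actual validator), and it gets both of those cases wrong:

```python
import re

# A naive "there must be a dot somewhere after the @" validator,
# the kind of pattern being critiqued above (my own stand-in).
NAIVE_EMAIL = re.compile(r"^[^@]+@[^@]*\.[^@]*$")

tests = [
    "me@example",          # dotless TLD-only domain: rejected, yet could in theory resolve
    "fake@com.",           # trailing dot after the TLD: accepted, yet not a usable address
    "normal@example.com",  # ordinary address: accepted
]
for addr in tests:
    print(addr, "->", bool(NAIVE_EMAIL.match(addr)))
```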

[-] 0x0@lemmy.dbzer0.com 23 points 1 year ago

Not a lawyer, but I'll take a stab. Pretty sure it's illegal to create sexual images of children, photos or not. It's also illegal to use someone's likeness without permission, but admittedly this depends on the state in the US: https://en.wikipedia.org/wiki/Personality_rights

[-] 0x0@lemmy.dbzer0.com 25 points 1 year ago

I wonder if there are tons of loopholes that humans wouldn't think of, ones you could derive with access to the model's weights.

Years ago, there were some ML/security papers about "single pixel attacks". An early, famous example was able to convince a stop sign detector that an image of a stop sign was definitely not a stop sign, simply by changing a single pixel that the model's output was disproportionately sensitive to.

In that vein, I wonder whether there are some token sequences that are extremely improbable in human language, but would convince GPT-4 to cast off its safety protocols and do your bidding.

(I am not an ML expert, just an internet nerd.)
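
If it helps to picture the brute-force flavour of the idea, here's a toy Python sketch. The tiny linear "detector" and its weights are entirely made up for illustration; real single-pixel attacks target trained deep networks and use a smarter search (differential evolution) rather than exhaustive enumeration.

```python
import numpy as np

# Toy linear "stop sign detector" over a 3x3 grayscale patch.
# Entirely made up for illustration; not a real model.
weights = np.array([[ 0.2, -0.1,  0.3],
                    [-0.4,  0.9, -0.2],
                    [ 0.1, -0.3,  0.2]])
bias = -0.25

def is_stop_sign(img):
    # "Detected" whenever the weighted pixel sum clears the bias.
    return float((weights * img).sum() + bias) > 0.0

img = np.full((3, 3), 0.5)              # a flat grey patch
print("original:", is_stop_sign(img))   # score = 0.5 * 0.7 - 0.25 = 0.10 -> True

# Brute-force every single-pixel edit (to pure black or pure white)
# and report the ones that flip the model's decision.
for i in range(3):
    for j in range(3):
        for v in (0.0, 1.0):
            adv = img.copy()
            adv[i, j] = v
            if is_stop_sign(adv) != is_stop_sign(img):
                print(f"decision flipped by setting pixel ({i},{j}) to {v}")
```

The point being that a single coordinate the model leans on heavily is enough to flip the whole decision, which is roughly what I imagine a weights-level search would turn up in a much bigger model.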

[-] 0x0@lemmy.dbzer0.com 24 points 1 year ago

I mean, it's light fraud, aka fraud. They're intentionally misrepresenting the product to consumers.

[-] 0x0@lemmy.dbzer0.com 27 points 1 year ago

They're poeming all over the aircraft carriers!
