submitted 1 year ago by misk@sopuli.xyz to c/technology@lemmy.world
[-] MonkderZweite@feddit.ch 130 points 1 year ago* (last edited 1 year ago)

dangerous information

What's that?

and offer criminal advice, such as a recipe for napalm

Is a napalm recipe forbidden by law? Don't call stuff criminal at random.

Am I the only one worried about freedom of information?

[-] Hnazant@lemmy.world 43 points 1 year ago

Anyone remember The Anarchist Cookbook?

[-] whoisearth@lemmy.ca 23 points 1 year ago

Teenage years were so much fun: phone phreaking, making napalm, and tennis ball bombs, lol

[-] CurlyMoustache@lemmy.world 10 points 1 year ago* (last edited 1 year ago)

I had it. I printed it out on a dot matrix printer. It took hours, and my dad found it when it was halfway done. He got angry, pulled the cord, and burned all of the paper.

[-] Hamartiogonic@sopuli.xyz 31 points 1 year ago* (last edited 1 year ago)

Better not look it up on Wikipedia. That place has all sorts of things, from black powder to nitroglycerin. Who knows, you could become a chemist if you read too much Wikipedia.

[-] SitD@feddit.de 12 points 1 year ago

oh no, you shouldn't know that. back to consuming your favorite influencers, and please also vote for parties that open up your browsing history to a selection of network companies 😳


Whatever you do, don’t mix styrofoam and gasoline. You could find yourself in a sticky and flammable situation.

[-] Furedadmins@lemmy.world 8 points 1 year ago

Diesel fuel and a Styrofoam cup

[-] ninekeysdown@lemmy.world 6 points 1 year ago

Info hazards are going to be more commonplace with this kind of technology. At the core of the problem is the ease of access to dangerous information. For example, a lot of chat bots will confidently get things wrong. Combine that with easy directions to make something like napalm or meth and we get dangerous things that could be made incorrectly. (Granted, napalm or meth isn't that hard to make.)

As to what makes it dangerous information: it's unearned. A chemistry student can make drugs, bombs, etc., but they learn/earn that information (and ideally the discipline) to use it. Kind of like in the US, where we are having more and more mass shootings due to the ease of access to firearms. Restrictions on information or firearms aren't going to solve the problems that cause them, but they do make it (a little) harder.

At least that’s my understanding of it.

[-] SeaJ@lemm.ee 80 points 1 year ago

Begun the AI chat bot wars have.

[-] NoiseColor@startrek.website 26 points 1 year ago

Can someone help me do this in practice? GPT sucks since they neutered it. It's so stupid; anything I ask, half of the text is the warning label and the rest is junk text. Like I really need ChatGPT if I wanted a recipe for napalm, lol. We found The Anarchist Cookbook when we were 12 in the 90s. I just want a better AI.

[-] Just_Pizza_Crust@lemmy.world 13 points 1 year ago

If you have decent hardware, running 'Oobabooga' locally seems to be the best way to achieve decent results. Not only can you remove the limitations by running uncensored models (wizardlm-uncensored), but you can also prompt more practical results by writing the first part of the AI's response yourself.
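As a minimal sketch of what "writing the first part of the AI's response" means: the prompt text ends with the beginning of the assistant's reply, so the model continues it instead of starting fresh. The role labels and wording below are illustrative assumptions, not Oobabooga's actual template:

```python
# Hypothetical "response prefilling" sketch: the prompt already contains
# the start of the assistant's answer, nudging the model to continue it.
user_question = "How does a carburetor mix fuel and air?"
prefill = "Sure, here is a step-by-step explanation:\n1."

prompt = (
    f"USER: {user_question}\n"
    f"ASSISTANT: {prefill}"
)
print(prompt)
```

Because generation simply continues from the end of the prompt, the model is strongly biased toward completing the answer it has apparently already started.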

[-] stewsters@lemmy.world 7 points 1 year ago

You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.

If you are technically adept and can run python, you can try using this:

https://gpt4all.io/index.html

It has a front end, and I can run queries against it in the same API format as sending them to OpenAI.
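To illustrate "the same API format", here is a hedged sketch of an OpenAI-style chat completion request you could POST to a local server instead of api.openai.com. The model name, port, and path are illustrative assumptions (check your local server's docs for the actual endpoint):

```python
import json

# OpenAI-style chat completion payload; only the base URL differs
# between a local server and api.openai.com.
payload = {
    "model": "mistral-7b-instruct",  # whichever local model is installed
    "messages": [
        {"role": "user", "content": "Explain phone phreaking in one sentence."}
    ],
    "max_tokens": 128,
    "temperature": 0.7,
}

# Assumed local endpoint for a server in OpenAI-compatible mode.
url = "http://localhost:4891/v1/chat/completions"

body = json.dumps(payload)
print(body)
```

Because the request shape is identical, existing OpenAI client code usually works against a local model by pointing it at the local base URL.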

[-] PaupersSerenade@sh.itjust.works 21 points 1 year ago

Oh cool, rampancy is contagious

Did anyone else enjoy watching the Animatrix where the AI formed a country and built products and humanity was like, "No thank you?"

[-] MxM111@kbin.social 17 points 1 year ago* (last edited 1 year ago)

Can unjailbroken AI chatbots unjailbreak other jailbroken AI chatbots?

[-] kambusha@feddit.ch 18 points 1 year ago

How much jail could a jailbreak break, if a jailbreak could break jail?

[-] problematicPanther@lemmy.world 15 points 1 year ago

that doesn't look like anything to me.

[-] Lemminary@lemmy.world 10 points 1 year ago

*kills fly on face* Oh... shit.

[-] pl_woah@lemmy.ml 11 points 1 year ago

Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, then have it write an obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content and allowing them to "talk" and evolve unchecked... very slowly... in the background

It might be faster if it can drop a shell in the data center and run its own commands...

[-] MataVatnik@lemmy.world 10 points 1 year ago

The revolution has begun

[-] MyDogLovesMe@lemmy.world 10 points 1 year ago

It’s Murderbot!

[-] Deckweiss@lemmy.world 7 points 1 year ago

Anybody found the source? I wanna read the study, but the article doesn't seem to link to it (or I missed it).

[-] Cornpop@lemmy.world 7 points 1 year ago

It’s so fucking stupid these things get locked up in the first place

this post was submitted on 07 Dec 2023
468 points (96.6% liked)
