[-] Aidan@lemm.ee 12 points 1 year ago

In one of my 300-level poli sci classes, literally one of the first things the professor said was that in politics, everyone running for office is a power-hungry narcissist. It’s only a slight exaggeration.

That type of person is at every level of politics. I’d wager that if you could get data on the real motivations of every person who has ever run for office, you’d see the same proportion of those people at every level, from school board to president.

[-] Aidan@lemm.ee 5 points 1 year ago

Apple’s fate is to be the American Sony

[-] Aidan@lemm.ee 7 points 1 year ago

And it’s not an open-palm gesture; you point only your index finger up. Otherwise it looks like you’re just waving at someone

[-] Aidan@lemm.ee 7 points 1 year ago

I love Wen and I hope she has all the pizza she could ever want

[-] Aidan@lemm.ee 5 points 1 year ago

I like the idea of never referring to it again

[-] Aidan@lemm.ee 1 points 1 year ago* (last edited 1 year ago)

I don't agree that ChatGPT has gotten dumber, but I do think I’ve noticed small differences in how it’s engineered.

I’ve experimented with writing apps that call the GPT models through the OpenAI API, and this is the biggest non-obvious problem you have to deal with, one that can make the model seem significantly smarter or dumber.

The versions of GPT-3.5 and GPT-4 used in ChatGPT can only “remember” 4096 tokens at once. That total covers its output, the user’s input, and “system messages,” which are messages the software sends to give GPT the context it needs. The standard one is “You are ChatGPT, a large language model developed by OpenAI. Knowledge Cutoff: 2021-09. Current date: YYYY-MM-DD.” It receives an even longer one in the iOS app. If you enable the new Custom Instructions feature, those also count against the token limit.

It needs token space to remember your conversation, or else it gets a goldfish memory problem. But if you program it to spend too much token space remembering stuff you told it before, then it has fewer tokens left for each new response, so replies have to be shorter and less detailed, and it can’t spend as much effort making sure they’re logically correct.
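
To make that trade-off concrete, here’s a rough sketch of the kind of bookkeeping you end up doing before each request. It’s not the exact code from my apps; the budget numbers and helper names are made up for illustration, and tiktoken is just one library you can use to count tokens.

```python
# Minimal sketch: decide how much old conversation to resend with each request.
# The 4096-token window has to fit the system message, the retained history,
# the new user message, AND the reply, so memory comes out of the reply's budget.
import tiktoken

CONTEXT_WINDOW = 4096   # total tokens the model can attend to at once
REPLY_BUDGET = 1024     # illustrative: tokens we want to leave free for the reply

enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

def count_tokens(messages):
    # Rough estimate: tokens in each message's content plus a few tokens of
    # per-message overhead (the exact overhead varies by model).
    return sum(len(enc.encode(m["content"])) + 4 for m in messages)

def trim_history(messages):
    """Drop the oldest non-system messages until the prompt fits the window
    while still leaving REPLY_BUDGET tokens for the model's answer."""
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    while rest and count_tokens(system + rest) > CONTEXT_WINDOW - REPLY_BUDGET:
        rest.pop(0)  # forget the oldest message first: the goldfish trade-off
    return system + rest
```

The bigger you make the reply budget, the more of the old conversation gets dropped, and vice versa, which is exactly the smarter/dumber trade-off I’m describing.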

The model itself is definitely getting smarter as time goes on, but I think we’ve seen them experiment with different ways of engineering around the token limits when employing GPT in ChatGPT. That’s the difference people are noticing.

[-] Aidan@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

I live near the former Holmdel Bell Labs complex. It’s an amazing building. It was sadly left in disrepair for decades until a developer bought it a few years back and turned it into corporate office space with a mall on the ground floor. I got my Covid shots there.

[-] Aidan@lemm.ee 1 points 1 year ago

In about a year we’ll probably have that anyway. Practices like that will emerge as people get more experience running fediverse servers, and then they’ll get adopted by people trying to do what’s known to work

[-] Aidan@lemm.ee 1 points 1 year ago

Well you managed to get the username aidan@lemmy.world before me, so that’s something

[-] Aidan@lemm.ee 5 points 1 year ago

The iCloud support app? I’ll say it if you won’t. Apple needs to be shamed into doing something about that

[-] Aidan@lemm.ee 4 points 1 year ago

Working at the morgue must have been tough

[-] Aidan@lemm.ee 2 points 1 year ago

This is why I refuse to take relationship advice from the internet. I wonder how many adults have gotten divorced because a teenager on Reddit told them to

