[-] chicken@lemmy.dbzer0.com 0 points 11 months ago

> ChatGPT regularly makes up methods or entire libraries

I think when it does that, it's usually a sign that what you're asking for doesn't exist and you're on the wrong track.

> ChatGPT cannot explain, because it doesn’t understand

I often get good explanations that seem to reflect understanding, and that would often be difficult to look up otherwise. For example, when I asked about generated code containing `{myVariable}` and how that could be a valid function parameter in JavaScript, it responded that it's equivalent to `{"myVariable": myVariable}`, and that "When using object literal property value shorthand, if you're setting a property value to a variable of the same name, you can simply use the variable name."
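
For anyone unfamiliar with that shorthand, here's a minimal sketch of what it means (the function and variable names are made up for illustration, not taken from the thread):

```javascript
// Hypothetical names for illustration.
function logOptions(options) {
  console.log(options);
}

const myVariable = 42;

// Shorthand: the property name is taken from the variable name.
logOptions({ myVariable });             // logs { myVariable: 42 }

// Equivalent longhand form.
logOptions({ myVariable: myVariable }); // logs { myVariable: 42 }
```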

[-] state_electrician@discuss.tchncs.de 3 points 11 months ago

If ChatGPT gives you correct information, you're either lucky or just didn't realize it was making shit up. That's a simple fact. LLMs absolutely have their uses, but facts ain't one of them.
