submitted 1 year ago by alyaza@beehaw.org to c/technology@beehaw.org

Outside of English, ChatGPT makes up words, fails logic tests, and can't do basic information retrieval.

all 12 comments
[-] GenderNeutralBro@lemmy.sdf.org 42 points 1 year ago

ChatGPT is not only made for information gathering, though

I'd argue that it is not made for information gathering at all, and it is largely coincidental that it performs as well as it does even in English.

[-] Kazumara@feddit.de 11 points 1 year ago

Our CIO at work posted a warning about using ChatGPT on sensitive data. The shocking part was that, among his examples of how we might already be using ChatGPT, he mentioned "for performing a quick fact check", which is insane to me. Who would use a system that is known to just generate likely-sounding answers, even when they are untrue, for a fact check of all things?!

[-] dudewitbow@lemmy.ml 8 points 1 year ago

Machine learning is only as good as its dataset, and given that English has a HUGE dataset on the internet, it's okay at English, but it makes sense that it's likely not ideal for other languages.

An example is art. Compare a model trained on a smaller dataset (e.g. a fully legal one where all training data had artist permission) with one trained on the larger dataset where legality wasn't a concern. Night and day difference.

[-] FaceDeer@kbin.social 5 points 1 year ago

ChatGPT is actually able to translate the information it learns in one language into other languages, so if it's having trouble speaking Bengali and such it must simply not know the language very well. I recall a study being done where an LLM was trained up on some new information using English training data and then was asked about it in French, and it was able to talk about what it had learned in French.

[-] lloram239@feddit.de 6 points 1 year ago

Another thing worth keeping in mind is that LLMs are feedforward networks, meaning they do everything in a fixed amount of time, no looping or second turns allowed. So even if the network has enough information to translate Bengali to English, that still doesn't necessarily give it the power to answer questions in Bengali directly, as by the time it has figured out what the Bengali means, it's already out of layers to solve the problem.
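(A toy illustration of that "fixed compute" point, entirely my own and not from the comment: a feedforward pass applies exactly the same number of layers to every input, with no way to spend longer on a harder question.)

```python
# Minimal sketch: a 4-layer feedforward pass in numpy.
# Easy or hard input, the work done is identical: four matmuls, then done.
import numpy as np

rng = np.random.default_rng(0)
layers = [rng.standard_normal((16, 16)) for _ in range(4)]  # fixed depth

def forward(x: np.ndarray) -> np.ndarray:
    for w in layers:              # always exactly len(layers) steps
        x = np.maximum(w @ x, 0)  # linear layer + ReLU
    return x

print(forward(rng.standard_normal(16)).shape)  # (16,)
```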

Proper prompting can sometimes overcome those limitations, e.g. translate the problem into English first, feed the English text back in to solve the problem, and then translate that output back to Bengali. The LLM itself can't do all of that in one step, but if the human sends the proper prompts to walk it through step by step, it can sometimes solve complex problems that it couldn't do in one turn.
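A minimal sketch of that step-by-step prompting idea, assuming the openai Python SDK (v1+), an API key in the environment, and an illustrative model name (all my assumptions, not from the comment):

```python
# Sketch: translate the question into English, solve it there,
# then translate the answer back. Each call is one fixed forward pass.
from openai import OpenAI

client = OpenAI()          # reads OPENAI_API_KEY from the environment
MODEL = "gpt-4o-mini"      # hypothetical choice; any chat model would do

def ask(prompt: str) -> str:
    """One chat turn with a single user message."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def answer_in_bengali(question_bn: str) -> str:
    # Step 1: translate the Bengali question into English.
    question_en = ask(f"Translate this Bengali text to English:\n{question_bn}")
    # Step 2: solve the problem in English, where the model is strongest.
    answer_en = ask(f"Answer this question:\n{question_en}")
    # Step 3: translate the English answer back to Bengali.
    return ask(f"Translate this English text to Bengali:\n{answer_en}")
```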

[-] GenderNeutralBro@lemmy.sdf.org 3 points 1 year ago

That's an important point you raise. I feel like a big problem with the LLM projects we see today, including ChatGPT, Bard, etc., is that the developers have tunnel vision. Rather than using the LLM as one component of a system with many well-researched traditional algorithms doing what they do best, they want to do everything within the network.

This makes sense from a research perspective. It doesn't make sense from an end-product perspective.

The more I play with LLMs, the more I feel like their true value is as something like "regular expressions on crack".

[-] JWBananas@startrek.website 3 points 1 year ago

regular expressions on crack

How does it go?

"I know! I'll use an LLM!"

"Now you have three problems."

[-] GenderNeutralBro@lemmy.sdf.org 1 points 1 year ago

True. When you have a hammer, everything looks like a nail.

I haven't actually implemented this yet, but I've been thinking about making a local file search program using an LLM. It would enable me to search for things in ways that are absolutely impossible with language-naive tools.

Here are a few examples of tasks I have in the past wanted to do, but was not able to:

  • Find references to the board game 'go' but please for the love of god do not return every case of the verb 'go'. Also include passages that refer to the game but don't mention it by name.

  • Find all references to foods, eating, or meals.

  • Find all the dialog lines of a specific character.

You can't do any of that with simple search, or even with regular expressions. You need general language awareness.

I know that LLMs will not be perfect at these tasks, either (at least not the current ones), but I think they could be quite effective.
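A rough sketch of the kind of tool described above, using an LLM as a language-aware yes/no filter over local text files. Everything here (model name, prompt wording, chunk size) is illustrative guesswork, not an existing program:

```python
# Sketch: walk a directory, split files into passages, and ask an LLM
# whether each passage matches a natural-language query.
from pathlib import Path
from openai import OpenAI

client = OpenAI()
MODEL = "gpt-4o-mini"  # assumed model; a local model could work the same way

def chunks(text: str, size: int = 1200):
    """Split a file into rough fixed-size passages."""
    for i in range(0, len(text), size):
        yield text[i:i + size]

def matches(passage: str, query: str) -> bool:
    """Ask the model whether a passage is relevant to the query."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[{
            "role": "user",
            "content": f"Query: {query}\n\nPassage:\n{passage}\n\n"
                       "Does the passage match the query? Answer only yes or no.",
        }],
    )
    return resp.choices[0].message.content.strip().lower().startswith("yes")

def search(root: str, query: str):
    for path in Path(root).expanduser().rglob("*.txt"):
        for passage in chunks(path.read_text(errors="ignore")):
            if matches(passage, query):
                yield path, passage

# Example, using the first task from the list above:
# for path, hit in search("~/notes",
#         "references to the board game 'go', not the verb 'go'"):
#     print(path, hit[:80])
```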

[-] dudewitbow@lemmy.ml 1 points 1 year ago

Of course. But with translation come mistranslations, especially for tier 3+ languages that are hard for an English speaker to learn. The data is subject to the accuracy of the translation, and ChatGPT's translation is still pretty far from perfect.

[-] FaceDeer@kbin.social 1 points 1 year ago

Ah, I had interpreted your comment to mean that you thought ChatGPT wouldn't know how to answer a question in Bengali unless the information it needed to solve the problem had been part of its Bengali training set. My bad.
