this post was submitted on 26 Jun 2024
845 points (97.8% liked)
Technology
I expect it's likely going to be used to train some Chinese AI model. The race to AGI is underway. IMO, "ideas" (code included) should be freely usable by anyone, including people I might disagree with. But I understand the fear that an authoritarian government might get access to AGI before a democratic one does. That said, I'm not entirely convinced the US has a democratic government...
PS: I'm French, and my government will soon be controlled by fascist pigs, if it isn't already, so I'm not judging...
Even if they do that, open source licenses don't disallow it.
It certainly can. Most licences require derivative works to be released under the same or a similar licence, and an AI model based on FOSS would likely not respect those terms. It's the same issue as AI training on music, images, and text: a likely violation of copyright, and thus a violation of open source licensing terms.
Training on it is probably fine, but generating code from the model is likely a whole host of licence violations.
Some licenses do, but probably not most. This is mostly an issue with "viral" copyleft licenses like the GPL, which restrict the license of derivative works. Permissive licenses like the MIT license are very common and don't impose that restriction.
MIT does say that "all copies or substantial portions of the Software" need to come with the license attached, but code generated by an AI is arguably not a "substantial portion" of the software.
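To make the clause concrete: a minimal sketch of the kind of notice check it implies, using only Python's stdlib. The file contents here are hypothetical; the notice string is the clause quoted above.

```python
# Sketch: verify that redistributed code still carries the MIT notice clause.
NOTICE = ("The above copyright notice and this permission notice shall be "
          "included in all copies or substantial portions of the Software.")

def carries_notice(file_text: str) -> bool:
    """True if the redistributed file still contains the MIT notice clause."""
    return NOTICE in file_text

# Hypothetical redistributed files: one keeps the notice, one strips it.
compliant = f"# {NOTICE}\ndef f():\n    return 42\n"
stripped = "def f():\n    return 42\n"

assert carries_notice(compliant)
assert not carries_notice(stripped)
```

Of course, whether AI-generated output counts as a "copy or substantial portion" is exactly the open legal question, so a check like this only covers the straightforward redistribution case.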
How do you verify that though?
And does the model need to include all of the licenses? Surely the "all copies or substantial portions" would apply to LLMs, since they literally include the source in the model as a derivative work. That's fine if it's for personal use (fair use laws apply), but if you're going to distribute it (e.g. as a centralized LLM), then you need to be very careful about how licenses are used, applied, and distributed.
So I absolutely do believe that building a broadly used model is a violation of copyright, and that's true whether it's under an open source license or not.
I agree with you, and don't really have any answers :)
By comparing it to the original work.
And how will you know what original work(s) to compare it to?
How do you know anything about anything an LLM generates? Presumably if you're the author you would recognize your own work?
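The "compare it to the original work" approach above can at least be sketched mechanically. This is a naive illustration with hypothetical snippets, using Python's stdlib difflib; real clone detection would need token- or AST-level comparison.

```python
import difflib

def similarity(original: str, generated: str) -> float:
    """Return a 0..1 ratio of character-level similarity between two snippets."""
    return difflib.SequenceMatcher(None, original, generated).ratio()

# Hypothetical original code and two candidate AI outputs.
original = "def gcd(a, b):\n    while b:\n        a, b = b, a % b\n    return a\n"
verbatim = original
unrelated = "print('hello world')\n"

assert similarity(original, verbatim) == 1.0
assert similarity(original, unrelated) < 0.5
```

A high ratio would only flag a candidate for human review; it can't prove copying, and it requires already knowing which original to compare against, which is the problem raised above.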
I'm not going to be monitoring Chinese code projects. They don't seem to care much about copyright, so they'll probably just yoink the code into proprietary projects and not care about the licenses.
What am I going to do, sue someone in China? And decompile everything that comes from China to check if my code was likely in it? That's ridiculous. If it's domestic, I probably have a chance, but not if it's in another country, and especially not one like China that doesn't seem to care about copyright.