submitted on 28 Oct 2024 by JRepin@lemmy.ml to c/programming@lemmy.ml

The Open Source Initiative (OSI) released the RC1 (“Release Candidate 1”, meaning: this thing is basically done and will be released as such unless something catastrophic happens) of the “Open Source AI Definition”.

Some people might wonder why that matters. Somebody comes up with a bit of writing on AI, what else is new? That’s basically LinkedIn’s whole existence currently. But the OSI has a very special role in the Open Source software ecosystem, because Open Source isn’t just about whether you can see the code; it’s also about the license that code is covered under: you might get code that you can see but are not allowed to touch (think of the recent WinAmp release debate). The OSI basically took on the role of deciding which of the many licenses in use actually are “Open Source” and which come with restrictions that undermine the idea.

This is very important: picking a license is a political act with strong consequences. It can allow or forbid different modes of interaction with an object, or impose certain requirements on its use.

top 2 comments
[-] NuraShiny@hexbear.net 3 points 2 months ago

I hope all generative AI will cease to exist. How deep can the money pit go?!

[-] hedgehog@ttrpg.network 2 points 2 months ago

> “But tante, then we will never have Open Source AI”. Exactly. That’s how reality works. If you can’t fulfil the criteria of a category you are not in that category. The fix is not to change the criteria. That’s playing pigeon chess.

This is a bad take. If your criteria aren’t grounded in reality, they aren’t useful, so of course you should change the criteria.

It’s also a missed opportunity to point to an AI model that did things right and that would qualify as “open source AI” even if that definition were not watered down. For example, OLMo (which I just learned about) says that they provide full insight into the training data as well as “full model weights, training code, training logs, training metrics in the form of Weights & Biases logs, and inference code.” Their most complex models are 7B models, which is enough to be relevant.

Saying “Meta and Alphabet will never release Open Source AI that meets the proposed definition” is fine. Saying “we’ll never have Open Source AI, period, that meets the proposed definition” means your proposed definition needs to be rewritten.
