[-] ylai@lemmy.ml 1 points 7 months ago* (last edited 7 months ago)

How does this analogy work at all? LoRA is chosen by the modifier to be low-rank to accommodate some desktop/workstation memory constraint, not because the other weights are “very hard” to modify if you happen to have the necessary compute and I/O. Development of LoRA has also been largely driven by storage reduction (hence not too many layers being modified) and by preserving generalizability (since training generalizable models is hard). The Kronecker-product variants, in particular, were first developed in the context of federated learning, not desktop/workstation fine-tuning (also, LoRA is fully capable of modifying all weights; it is rather a technique to do so in a correlated fashion so as to shrink the gradient update). And much of the development of LoRA happened in the context of otherwise fully open datasets (e.g. LAION), which are simply not manageable in desktop/workstation settings.
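To make the “low rank” part concrete, here is a minimal NumPy sketch of the idea, purely for illustration (the dimensions, rank and scaling are arbitrary values picked for the example, not taken from any real model):

```python
import numpy as np

# Illustrative sizes only: a single frozen weight matrix plus a rank-r update.
d_out, d_in, r, alpha = 1024, 1024, 8, 16

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))   # frozen pretrained weight, never updated

# Trainable low-rank factors; B starts at zero so the initial update is a no-op.
A = rng.standard_normal((r, d_in)) * 0.01
B = np.zeros((d_out, r))

def forward(x):
    # Effective weight is W + (alpha / r) * B @ A; only A and B would receive gradients.
    return (W + (alpha / r) * (B @ A)) @ x

y = forward(rng.standard_normal(d_in))

# The storage point: a full update needs d_out * d_in values,
# the low-rank factors only r * (d_out + d_in).
print(W.size, A.size + B.size)
```

Note that the update B @ A still touches every entry of W, which is exactly the “correlated modification of all weights” point above; the rank r only bounds how independent those changes can be and how much needs to be stored or shipped.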

This narrow perspective of “source” takes away from the actual usefulness of compute/training here. Datasets from LAION to Common Crawl have been available for some time, along with training code (sometimes independently reproduced) for the Imagen diffusion model or GPT. It was only when e.g. GPT-J came along, and somebody invested in the compute (including working out how to scale it to their specific cluster), that the result became useful.

[-] ylai@lemmy.ml 2 points 7 months ago* (last edited 7 months ago)

This is a very shallow analogy. Fine-tuning is rather the standard technical approach to reduce compute, even if you have access to the code and all the training data. Hence there has always been a rich and established ecosystem for fine-tuning, regardless of “source.” Patching closed-source binaries is not the standard approach, since compilation is far less computationally intensive than today’s large-scale training.

Java bytecode is a far-fetched example. The JVM assumes a specific architecture particular to the CPU-dominant world in which it was developed, and Java bytecode cannot be trivially (or efficiently) executed on a GPU or FPGA, for instance.

And by the way, the issue of weight portability is far more relevant than the forced comparison to (simple) code can capture. Today’s large-scale training code is usually very specific to a particular cluster (or TPU, WSE), as opposed to the resulting weights. Even if you got hold of somebody’s training code, you often have to reinvent the wheel to scale it to your own particular compute hardware, interconnect, I/O pipeline, etc. This is not commodity open source on your home PC or workstation.

[-] ylai@lemmy.ml 2 points 9 months ago

So this is the U.S. Wired catching up to the idea years later?

[-] ylai@lemmy.ml 1 points 9 months ago* (last edited 9 months ago)

The PS Vita side of Sony’s customer base got a deep taste of Sony’s habit of catering everything to a single console. Same with PSVR2: of course it had to be PS5-exclusive, because everything is an adornment for their shiny console, and then it went on not to sell a lot of PS5s.

[-] ylai@lemmy.ml 0 points 11 months ago

Yes. If by “CLI” you mean e.g. a pacman install: it is a GUI (Electron) application, so I expect it will install straight from e.g. KDE Discover and then run without you ever touching the shell.

[-] ylai@lemmy.ml 1 points 11 months ago* (last edited 11 months ago)

Installing podman-compose on the immutable filesystem is fairly straightforward, since it is just a single Python file (https://github.com/containers/podman-compose/blob/devel/podman_compose.py), which you can basically place anywhere in your PATH. You can also first bootstrap pip (python3 get-pip.py --user, with get-pip.py from https://github.com/pypa/get-pip) and then do pip3 install --user podman-compose.

[-] ylai@lemmy.ml 1 points 1 year ago

This is absolutely not true, certainly not back when Microsoft acquired Bungie and made Halo Xbox-exclusive: https://arstechnica.com/gaming/2010/10/jobs-turned-down-bungie-at-first-how-microsoft-burned-apple/

[-] ylai@lemmy.ml 0 points 1 year ago

As a user of an ecosystem that I care about, I totally do not. Why should the health of an ecosystem be dictated by my usage patterns, or those of the people I know? A bit self-centered, too?

Also, today’s Apple fans and their “Apple-no-gaming” fiction are too quick to “forget” Bungie and how upset Steve Jobs was when Halo became Microsoft-exclusive. https://arstechnica.com/gaming/2010/10/jobs-turned-down-bungie-at-first-how-microsoft-burned-apple/

[-] ylai@lemmy.ml 1 points 1 year ago

See: https://en.wikipedia.org/wiki/English-language_spelling_reform

English has been the total outlier among (originally) European languages, with no authoritative body governing its spelling. Even the “reform” by Noah Webster never really caught on outside North America, nearly 100 years later. And even more curiously, the somewhat authoritative Oxford English Dictionary disagrees with everybody else in its spelling (https://en.wikipedia.org/wiki/Oxford_spelling).

[-] ylai@lemmy.ml 2 points 1 year ago* (last edited 1 year ago)

> Nearly every single word in English that starts with a g followed by a soft ih/eh vowel is pronounced as a soft g, just a few:

That is patently not true and blatant cherry-picking, e.g. already contradicted by the lexically matching word “gift” (and there are “giggle”, “gild”, “girl”, “git”, “give”, “gizmo”, etc.). See Wikipedia, which references linguists studying this:

> An analysis of 269 words by linguist Michael Dow found near-tied results on whether a hard or soft g was more appropriate based on other English words; the results varied somewhat depending on what parameters were used.[11] Of the 105 words that contained gi somewhere in the word, 68 used the soft g while only 37 employed its counterpart. However, the hard g words were found to be significantly more common in everyday English; […]

https://en.wikipedia.org/wiki/Pronunciation_of_GIF#Cause

Michael Dow is an associate professor of linguistics specializing in phonology, by the way.

> and if you’re confused why others pronounce it with a soft G, they would seem to be simply more familiar with the English language 🤷‍♂️

Well, clearly you are already not as “familiar with the English language” as you might think.

