submitted 5 months ago* (last edited 5 months ago) by Andromxda@lemmy.dbzer0.com to c/technology@beehaw.org
[-] AlexWIWA@lemmy.ml 44 points 5 months ago

I'm willing to bet we'll see something that trains language models on the user's hardware soon enough. Folding@home, but instead of helping science, Google steals your electricity.
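The closest existing technique to what this comment describes is federated learning: each device trains on its own data locally and only parameter updates leave the machine. A toy sketch of federated averaging on a one-parameter model (all names and numbers here are illustrative, not any vendor's actual system):

```python
# Toy federated averaging (FedAvg) sketch: each "device" runs local
# gradient descent on its private data; the server only averages weights.
# Purely illustrative -- not any real product's training pipeline.

def local_step(w, data, lr=0.01):
    # One gradient-descent step on a 1-D least-squares model:
    # loss = mean((w*x - y)^2), so grad = mean(2*x*(w*x - y))
    grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(global_w, client_datasets, rounds=100):
    for _ in range(rounds):
        # Each device trains locally on data that never leaves it...
        client_ws = [local_step(global_w, d) for d in client_datasets]
        # ...and the server averages the resulting weights.
        global_w = sum(client_ws) / len(client_ws)
    return global_w

# Three devices, each privately holding samples of y = 3*x
clients = [[(1, 3), (2, 6)], [(3, 9)], [(4, 12), (5, 15)]]
w = fed_avg(0.0, clients)  # converges toward 3.0
```

Google has publicly described using this pattern for things like Gboard, so the speculation isn't far-fetched; the open question in the thread is consent and who pays for the electricity.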

[-] vvv@programming.dev 14 points 5 months ago

I really think that's the secret end game behind all the AI stuff in both Windows and macOS. MS account required to use it. (anyone know if you need to be signed in to an Apple ID for Apple Intelligence?) "on device" inference that sometimes reaches out to the cloud, when it feels like it. maybe sometimes the cloud will reach out to you and ask your cpu to help out with training.

that, and better local content analysis. "no, we aren't sending everything the microphone picks up to our servers, of course not. just the transcript that your local stt model made of it. you won't even notice the bandwidth!"
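The "you won't even notice the bandwidth" line is plausible on back-of-envelope numbers: a plain-text transcript is orders of magnitude smaller than raw audio. A quick sanity check (the rates below are common defaults and rough averages, not measurements of any actual product):

```python
# Rough comparison: raw 16 kHz 16-bit mono PCM audio vs. a text
# transcript of the same speech. Illustrative averages only.

SAMPLE_RATE = 16_000      # Hz, a typical speech-recognition input rate
BYTES_PER_SAMPLE = 2      # 16-bit PCM
audio_bps = SAMPLE_RATE * BYTES_PER_SAMPLE       # bytes per second of audio

WORDS_PER_MIN = 150       # rough conversational speaking rate
AVG_WORD_BYTES = 6        # ~5 letters plus a space
text_bps = WORDS_PER_MIN * AVG_WORD_BYTES / 60   # bytes per second of text

ratio = audio_bps / text_bps   # transcript is ~2000x smaller
```

At roughly 15 bytes/s, a transcript of continuous speech would indeed be invisible next to normal background traffic, which is exactly why the commenter finds it worrying.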

this post was submitted on 09 Jul 2024
202 points (100.0% liked)

Technology

37805 readers
96 users here now

A nice place to discuss rumors, happenings, innovations, and challenges in the technology sphere. We also welcome discussions on the intersections of technology and society. If it’s technological news or discussion of technology, it probably belongs here.

Remember the overriding ethos on Beehaw: Be(e) Nice. Each user you encounter here is a person, and should be treated with kindness (even if they’re wrong, or use a Linux distro you don’t like). Personal attacks will not be tolerated.

Subcommunities on Beehaw:


This community's icon was made by Aaron Schneider, under the CC-BY-NC-SA 4.0 license.

founded 2 years ago
MODERATORS