[-] DavidGarcia@feddit.nl 32 points 5 months ago

BlackberryPi Flipper Zero with LoRa and a thermal camera (because why not) would be cool

[-] DavidGarcia@feddit.nl 31 points 6 months ago

there will be a time when LLMs are small enough to be distributed as viruses, and then you'll have some semi-conscious entity in your computer messing with you

[-] DavidGarcia@feddit.nl 30 points 8 months ago

in our hearts we're all just bros hanging out. some of us might murder more people than others, but like that's ever mattered!

[-] DavidGarcia@feddit.nl 33 points 11 months ago

"see you later"

[-] DavidGarcia@feddit.nl 30 points 1 year ago

here, I made it worse

[-] DavidGarcia@feddit.nl 31 points 1 year ago

if you behave exactly like everyone else, it shows you are well adjusted and your genes and brain function normally, thus your babies are more likely to survive. It's arbitrary, but so are words. There is no reason why the word "word" has to be "word", but by showing you understand and speak the language you show your fitness. Just think of non-verbal communication as a language that everyone speaks: biting your lip the right way is the same as saying "word" the right way. You'd think I'm having a stroke if I said "gooyallodsiu" instead of "word", and it's the same for body language.

As an avid autism enjoyer myself, I'd say some of us are just mute and deaf in this language.

[-] DavidGarcia@feddit.nl 31 points 1 year ago

At least game engines provide massive value. Yeah, they take a cut, but more money would ultimately have been spent producing a vastly inferior in-house engine. Yeah, Unity's recent move is douchey, but it's still miles better than any of the extortion by app stores. No one can tell me Apple's curation is worth a 30% cut. Ridiculous.

[-] DavidGarcia@feddit.nl 33 points 1 year ago

why does everything have to be relatable to this miserable prison planet existence

[-] DavidGarcia@feddit.nl 36 points 1 year ago

the simulation ran out of processing power for a moment

[-] DavidGarcia@feddit.nl 32 points 1 year ago

Putting any other issues aside for a moment (I'm not saying they aren't also true): cameras need light to make photos, and the more light they get, the better the image quality. Just look at astronomy: we don't find the dark asteroids/planets/stars first, we find the brightest ones, and we know more about them than about a planet with lower albedo/light intensity. So it is literally, physically harder to collect information about anything black, and that includes black people. If you have a person with a skin albedo of 0.2 vs one with 0.6, you get 3x less information in the same amount of time, all things being equal.

Also consider that cameras have a limited dynamic range, and white skin is often much closer in brightness to most objects around us than black skin. So the facial features of a black person can fall outside the camera's dynamic range and be lost.
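To put rough numbers on that, here's a toy sketch, not anything measured; the only inputs taken from above are the 0.2 vs 0.6 albedo figures, and the 8-bit sensor is my assumption:

```python
# Toy model (assumed, not from the comment): reflected signal is proportional to albedo.
albedo_dark, albedo_light = 0.2, 0.6
print(albedo_light / albedo_dark)  # 3.0 -> ~3x fewer photons per exposure from the darker face

# Assume an 8-bit sensor exposed so the lighter face almost fills its range.
full_scale = 255
levels_light = full_scale                               # ~255 usable brightness levels
levels_dark = full_scale * albedo_dark / albedo_light   # ~85 usable levels
print(levels_light, round(levels_dark))

# Facial detail on the darker face is squeezed into roughly a third of the
# quantization levels, before sensor noise even enters the picture.
```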

The real issue with these AIs is that they aren't well calibrated, meaning the output confidence should mirror how often the predictions are correct: if the model reports a confidence of 0.3, then out of 100 such predictions about 30 should be correct. Then any prediction below 90% confidence or so should be illegal for the police to act on, or something like that. Basically the model should tell you when it doesn't have enough information, and the police should act accordingly.
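As a minimal sketch of what "well calibrated" means here (the bin count and the simulated data are my own illustration, not from any real system), you can bucket predictions by confidence and check that each bucket's accuracy matches its average confidence:

```python
import numpy as np

def calibration_report(confidences, correct, n_bins=10):
    """Compare mean confidence to empirical accuracy per confidence bin.
    A well-calibrated model has the two roughly equal in every bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=bool)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0  # expected calibration error: weighted gap between confidence and accuracy
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        avg_conf, accuracy = confidences[mask].mean(), correct[mask].mean()
        ece += mask.mean() * abs(avg_conf - accuracy)
        print(f"({lo:.1f}, {hi:.1f}]: confidence {avg_conf:.2f} vs accuracy {accuracy:.2f}")
    print(f"expected calibration error: {ece:.3f}")

# Simulated, perfectly calibrated predictions: a 0.3-confidence guess is right ~30% of the time.
rng = np.random.default_rng(0)
conf = rng.uniform(0.05, 0.99, size=10_000)
hit = rng.uniform(size=10_000) < conf
calibration_report(conf, hit)

# A policy like the one above would then only let matches through when conf >= 0.9.
```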

I mean, really, facial recognition should be illegal for the police to use, but that's beside the point.

