Not every conversation, just statements following a detected wake word.
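For context, the gating being described works roughly like this; a minimal sketch in Python, where `detect_wake_word` and `stream_to_cloud` are hypothetical placeholders, not Amazon's actual code:

```python
# Hypothetical sketch of wake-word-gated capture: audio is processed locally
# frame by frame, and nothing is sent upstream until a wake word is detected.
from collections import deque

FRAME_MS = 20          # length of one audio frame in milliseconds
POST_WAKE_SECONDS = 8  # how long to buffer/stream after the wake word

def detect_wake_word(frame: bytes) -> bool:
    """Placeholder for an on-device keyword-spotting model."""
    return False  # stub

def stream_to_cloud(frames: list) -> None:
    """Placeholder for the upstream speech-recognition request."""
    pass  # stub

def run(mic_frames) -> None:
    buffered = deque(maxlen=POST_WAKE_SECONDS * 1000 // FRAME_MS)
    listening = False
    for frame in mic_frames:
        if not listening:
            # Frames are inspected locally and then discarded.
            listening = detect_wake_word(frame)
        else:
            buffered.append(frame)
            if len(buffered) == buffered.maxlen:
                # Only the utterance after the wake word leaves the device.
                stream_to_cloud(list(buffered))
                buffered.clear()
                listening = False
```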
You trust that?
Considering I set up one of the content types that relates to wakeword and utterance text analysis for Alexa, I trust it completely.
But can I trust you? Are you willing to share the source code?
Edit: Tell me why I'm supposed to trust an internet rando?
You're right to be distrustful, but there's a fine line between a healthy distrust of a closed ecosystem and blind worry/cynicism.
Obviously I'm not going to share proprietary source code. Even if I did, it would mean very little without knowing the upstream and downstream services. What I will say is that Amazon is at least honest about what its services do, even if it's in the fine print. Customers can delete their data when they choose to, and there are serious (internal) consequences when things like data deletion and DSARs (data subject access requests) aren't handled properly.
Also, it would mean very little without also inspecting every chip on the board. You could have easily written safe code, but the audio signal could also be intercepted before it ever gets to that point.
Alexa doesn't solve any problems and only exists to make consumption easier. It's not something I need to trust because it's not something I or anyone else needs.
There's this study for those interested in knowing more about how often these devices mistakenly record conversations:
https://moniotrlab.khoury.northeastern.edu/publications/smart-speakers-study-pets20/