If you’ve tucked yourself under the sheets of ignorant bliss when it comes to your privacy while speaking to your favorite personal device, this is yet another wake-up call. A Reddit user recently shared a firsthand account of starting a new job that required her to listen to recordings of voice commands people gave to their smartphones to check for accuracy.
The user, posting as FallenMyst on Reddit, shared her account in a thread called “Everything you’ve ever said to Siri/Cortana has been recorded…and I get to listen to it.” (To be fair, she also talks about listening to what people said to their Samsung Galaxy phones.) In her post, the user said she began working for a hard-to-track company called Walk N’Talk Technologies. In that role, she was asked to listen to audio recordings of people talking to their smartphones and rate how closely the resulting text matched what was actually said.
“Guys, I’m telling you, if you’ve said it to your phone, it’s been recorded…and there’s a damn good chance a 3rd party is going to hear it,” FallenMyst wrote. “I heard everything from kiddos asking innocent things like ‘Siri, do you like me?’ to some guy asking Galaxy to lick his butthole. I wish I was kidding.”
The anonymous Reddit user got this job via CrowdFlower, a data mining company. A post from Motherboard delves deeper into what this specific job entailed and even shares some of the recordings that were posted for vetting.
Apple’s iOS Software License Agreement includes this bolded note about using Siri: “By using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and dictation functionality in other Apple products and services.”