Voice recognition technology often violates human rights, and it's popping up in more and more places. Recently we called out Spotify for developing voice recognition tech that claims to detect, among other things, a person's gender and emotional state.
But it’s not just Spotify. Some of the most powerful companies in the world are deploying similar abusive tech because harvesting data about you is profitable. The market for voice recognition is growing, expected to be worth a whopping $26.8 billion by 2025.
This is not an exhaustive list, but below are some dangerous examples:
All of this is alarming, unwarranted, and, in certain jurisdictions, illegal. Using voice recognition tech to make inferences about us invades our private lives and reinforces harmful, regressive stereotypes.
We’re keeping an eye on this emerging tech, and calling out companies to hold them accountable. You deserve respect, not exploitation.
And we’re not the only ones paying attention. With our partners from around the world, we launched a campaign to ban biometric surveillance and a call to outlaw automated recognition of gender and sexual orientation.
Spread the word. RT us about the proliferation of this dangerous tech here.
We shouldn’t have to worry about our smart refrigerators, voice assistants, and apps with a microphone listening to us, profiling us, and trying to read our minds.