24 September 19, 07:32
(This post was last modified: 24 September 19, 07:33 by harlan4096.)
Artificial intelligence is listening, and that is unstoppable – but we must all safeguard the privacy of children
“There are just too many things we have to think about every day, too many new things we have to learn. New styles, new information, new technology, new terminology … But still, no matter how much time passes, no matter what takes place in the interim, there are some things we can never assign to oblivion, memories we can never rub away. They remain with us forever, like a touchstone.” – Haruki Murakami, “Kafka on the Shore”
According to a recent Microsoft report, users of digital assistants such as Amazon’s Alexa and Apple’s Siri continue to weigh convenience over potential privacy concerns. Eighty percent report being satisfied with the utility these devices provide; only about half that share (41%) are concerned about the safety of the data the devices acquire. As I have written previously, this is a tradeoff every one of us must weigh in the digital age, and there are no right or wrong answers. But there are informed and uninformed decisions, and I suspect these survey respondents did not fully consider how the data they feed into their virtual assistants could be used. Once your data is fed into algorithms, the chain of ownership is broken and you lose control – for children, that is an unfair burden as well as a potential security risk.
Even if you trust the companies that collect your data and the algorithms that analyze and apply it, there is still the risk of hackers gaining access to it. We may understand what a criminal hacking group wants with our credit card numbers and identity information, but no one can be sure what harm could result from the exploitation of AI analysis of our behavior, our biometric data, and other sensitive information. State actors could use these details to steal secrets, interfere in elections, or manipulate or blackmail officials. Repressive states are already using advanced hacking tools to target dissidents and other groups.
Your personal data feeding the AI machines
There are also more immediate reasons to be wary of digital assistants. The business model behind them depends on continual improvements in their accuracy and intelligence. Incoming data is assimilated into the machine-learning algorithms, helping the AI avoid past mistakes and make better predictions. In a recurring loop, the consumer gleans the benefits of the technology while continuously providing material for its improvement.
Around the time of the Microsoft report cited above, it was revealed that Amazon has a dedicated team of employees who listen to voice recordings made by Alexa in order to train the software to perform better. Amazon’s response was to emphasize that it reviews only “an extremely small sample” of recordings to improve the consumer experience. Disturbingly, however, even users who opt out of having their recordings fed back into the system could still be subject to this manual review process. This month, Facebook also admitted paying contractors to listen to and transcribe Messenger conversations. Then Apple confessed that human employees were listening to Siri recordings, reportedly including users having sex. In other words, the information you give to a digital assistant isn’t just going into the black box of an AI. It could very well be replayed by other human beings, and in a way that is traceable back to you.
...