3 Aug, 2018 03:20

Smart tech voice assistants are a threat to news pluralism, Reporters Without Borders warn

If instead of rummaging through search results and newspaper headlines, you leave it up to your Alexa to feed you the news, you could be locking yourself in an echo chamber, Reporters Without Borders (RSF) warn.

The press-freedom NGO is sounding the alarm over the impact that smart tech is having on the way we consume news. In a Thursday report, it says the development of voice assistants – such as Amazon's Alexa, Apple's Siri, or Google Assistant – "raises the question of guarantees for pluralism in news and information."

RSF is concerned that when you ask "What's the news?" and submit to the mercy of your virtual assistant, you could be allowing someone else to cherry-pick your sources for you. "Voice assistants are liable to reinforce the opaque and often pay-based methods of media content distribution that exist already," says RSF's Journalism and Technology head Elodie Vialle. And while the media is happy to report on the futuristic technology itself, such coverage diverts attention from its potential implications.

Selective news-feeding is not the only thing worrying RSF about virtual assistants. There is also their constant monitoring of their surroundings and owners – and, of course, the annoying bugs.

A combination of those last two factors caused one of Alexa's biggest fumbles of late: in May, one US couple's digital helper recorded their private conversation about hardwood floors and sent it to one of the husband's employees. Amazon explained it away by saying Alexa had misheard parts of the conversation as a weirdly specific series of commands. Still, the couple said they would never plug the device in again.
