Alexa suggests lethal TikTok challenge to 10-year-old

Alexa uses the Bing search engine for all of her search queries by default. Credit: Unsplash

A mother was shocked when her Amazon Echo Dot gave a potentially deadly suggestion when her 10-year-old daughter asked Alexa for ‘a challenge’ to do.

On Sunday, Kristin Livdahl shared a screenshot on Twitter of the message Alexa allegedly replied with.

‘Here’s something I found on the web. According to ourcommunitynow.com: The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs,’ said Alexa.

Ironically, the article Alexa pulled this response from warns parents against letting their children attempt the ‘outlet challenge’ that went viral on TikTok in January.

The ‘outlet challenge’ spread when teenagers started taking part in the dangerous trend, and TikTok has since banned searches for it. The challenge involves plugging a phone charger halfway into a wall outlet and touching a penny to the exposed prongs, which creates sparks and a serious fire hazard.

Alexa uses the Bing search engine for all of her search queries by default, so the message was likely automated rather than a conscious decision by anyone working at Amazon.

‘Our customers want Alexa to get smarter and more helpful to them every day. To do that, we use your requests to Alexa to train our speech recognition and natural language understanding systems using machine learning,’ Amazon states on its website.

According to the tech giant, Alexa ‘relies in part on supervised machine learning, an industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future’.

While one Twitter user asked why the child was asking Alexa for challenges in the first place, Livdahl responded saying, ‘We have been doing some physical challenges from a Phy Ed teacher on YouTube as the weather gets colder and she just wanted another one. I was right there. The Echo was a gift and is mostly used as a timer and to play songs and podcasts.’

Amazon apologised in the Twitter comments on Monday. Credit: Twitter

Some people pointed out that allowing unfiltered results to be pulled by Alexa was potentially problematic.

‘Companies should really know better than to put unfiltered question answering systems live on the internet. Google does this in their summaries too, and they similarly have shocking/dangerous results like this periodically,’ said one Twitter user.

On Monday, Amazon spotted Livdahl’s Twitter post and took to the comments to apologise, asking her to reach out directly via a link so ‘they could look into this further’.

Metro.co.uk has reached out to Amazon for comment.




