All he wanted was to hear a nursery song.
A YouTube video posted last Dec. 29 has gone viral, racking up 1.2 million views as of this writing. The video shows a little boy talking to an Amazon Echo Dot, asking Alexa to play a nursery song.
What he got was a mouthful of NSFW words.
In the video, the young boy asks Alexa to play a nursery rhyme or song that sounds like "Digger, Digger". My best guess is that he wants to listen to "Hickory Dickory Dock". Just looking at the title of that nursery rhyme will give you an idea why Alexa thought she heard a word that rhymes with "quick". As in, "Quick, turn that darn thing off!"
Now, Alexa isn't completely to blame here. She's just being a good soldier, doing what she's told... or what she thought she was told. Of course, it's hard to point an accusing finger at a toddler who's just learning to speak. Like anyone else his age, his speech was a little slurred. Even adults who've had a little too much to drink can't talk straight, so the kid in the video gets a pass this time.
Go through the comment section of the post and you'll find plenty of amusing reactions and accusations. At least one person hilariously blamed the boy's father for the booboo.
You see, Alexa records what she hears and uses that data to personalize her responses to the user. This is the same reason the police are trying to get Amazon to hand over the data from a certain Echo device found at a crime scene.
Now, if Alexa spouts sexually explicit words, it's easy to conclude it has something to do with the data she has already gathered. In this case, "it was his dad's surfing history," as one commenter put it.
But they just got the device, you say?
Another commenter explained that the Echo Dot is connected to the family's Amazon account, so despite coming out of the box only recently, it already had data gathered through that account.
The lesson here is to be careful with technology, especially if there are children in the house. While this incident can be chalked up as something to laugh at, there can be repercussions. To prevent any untoward incidents, like a child accidentally watching porn, use filters and take other necessary steps to shield children from the darker side of technology.
What this proves is that Alexa does know a lot of things. Let's give her that.
Now that that's settled...
Did you know you can teach Alexa to say anything you want?
Amazon recently informed owners of its smart speakers that they can make Alexa repeat what they say. The owner only needs to say "Alexa, Simon says" followed by the word or words of their choosing. Fortunately, Alexa is smart enough to bleep out bad words.
Alexa also now has limited voice command support for Spotify, Pandora, and iTunes. This is in addition to the built-in support for Amazon Music, iHeartRadio, and TuneIn.