These Toys Don't Just Listen To Your Kid; They Send What They Hear To A Defense Contractor
Kids say a lot of random, unsolicited, or just plain personal things to their toys while playing. When that toy is stuffed with just fluff and beans, it doesn't matter what the kid says: their toy is a safe sounding board. When their playtime companion is an internet-connected recording device that ships off audio files to a remote server without even notifying parents, that's a whole other kind of problem.
According to a coalition of consumer-interest organizations, the makers of two smart kids' toys, the My Friend Cayla doll and the i-Que Intelligent Robot, are allegedly violating laws in the U.S. and overseas by collecting this sort of voice data without obtaining consent.
In a complaint [PDF] filed this morning with the Federal Trade Commission, the coalition, made up of the Electronic Privacy Information Center (EPIC), the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and our colleagues at Consumers Union, argues that Genesis Toys, a company that manufactures interactive and robotic toys, and Nuance Communications, which supplies the voice-parsing services for these toys, are running afoul of rules that protect children's privacy and prohibit unfair and deceptive practices.
These particular toys (basically a girl theme and a boy theme on the same core idea) both use voice recognition tech to listen to the kids who play with them.
They connect via Bluetooth to a mobile phone app, usually belonging to a parent, and then from there access the internet in order to interact with kids and answer their questions. To accomplish that feat, the apps record and collect conversations between the toys and the kids, and use speech-to-text protocols to turn kids' questions into searchable queries.
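The flow described above can be sketched roughly as follows. This is a hypothetical illustration, not the toys' actual code: the function names, the canned transcript, and the query format are all made up, and the real apps would send the audio to a remote speech-to-text server rather than transcribe locally.

```python
# Hypothetical sketch of the pipeline the complaint describes:
# recorded audio -> speech-to-text -> searchable query.
# All names and payloads here are illustrative, not the real apps' code.

def speech_to_text(audio_bytes: bytes) -> str:
    # Stand-in for a remote speech-to-text call (in reality, the audio file
    # would be shipped off to a vendor's server). We return canned text.
    return "what is the capital of norway"

def build_search_query(transcript: str) -> str:
    # Turn the transcript into a searchable query string.
    return "search?q=" + transcript.replace(" ", "+")

audio = b"..."  # pretend this is a clip recorded from the toy's microphone
transcript = speech_to_text(audio)
query = build_search_query(transcript)
print(query)  # search?q=what+is+the+capital+of+norway
```

The privacy issue flagged in the complaint sits in the first step: the raw audio, whatever the child happened to say, leaves the home before any of this processing happens.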
When users first set up the app for their toy, they may be sharing data they don't want shared. Cayla in particular asks for multiple pieces of personal information (the child's name, their parents' names, their school's name, and their hometown, among other things) so it can converse more naturally. The app also allows for location setting, and both the Cayla and i-Que apps collect users' IP addresses.
So far this is pretty straightforward. The Terms of Service for both toys say that they collect data in order to improve the way the toys work, and for other services and products.
Researchers studied the way the toys work, the complaint continues, and it turns out that they send audio files to a third party: Nuance Communications' servers at the company's headquarters in Massachusetts.
Nuance is a giant company best known to consumers, at least, for its Dragon-branded suite of speech-to-text dictation software. The company also has a significant presence in healthcare dictation, and is, like more large corporations than you'd think, a defense contractor that sells products, including voice biometric solutions, to military, intelligence, and law enforcement agencies.
And here's where it starts to get more complicated: both toys are also governed under Nuance's general privacy policy, which says, "We may use the information that we collect for our internal purposes to develop, tune, enhance, and improve our products and services, and for advertising and marketing consistent with this Privacy Policy."
It continues, "If you are under 18 or otherwise would be required to have parent or guardian consent to share information with Nuance, you should not send any information about yourself to us."
There's a lot more about this investigation and the filed complaint at the link. Additionally:
And on top of all that, the toys' connections are just plain insecure. Basically anyone searching for nearby Bluetooth devices can easily connect to them, as this video and report [PDF] from the Norwegian Consumer Council shows:
The Norwegian researchers' technical report [PDF, in English] also found that some queries, like ones that made the toys connect to Weather Underground, were using insecure HTTP connections that can easily be subjected to a man-in-the-middle attack if someone were so inclined.
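Why does plain HTTP matter here? Because the request travels as readable text, anyone relaying the connection can see exactly what was asked. The minimal local demonstration below stands a throwaway socket server in for the on-path attacker; the "weather" hostname and query are invented for illustration and have nothing to do with the toys' actual traffic.

```python
# A local demonstration of why plain HTTP is exposed to a man-in-the-middle:
# an on-path relay can read the request verbatim. The server thread below
# plays the attacker; the hostname and query string are made up.
import socket
import threading

captured = []

def eavesdropper(server_sock: socket.socket) -> None:
    conn, _ = server_sock.accept()
    captured.append(conn.recv(4096))  # the attacker sees the raw request
    conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n")
    conn.close()

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=eavesdropper, args=(server,), daemon=True).start()

# The "toy app" sends an ordinary HTTP request through the attacker's relay.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"GET /weather?city=oslo HTTP/1.1\r\nHost: weather.example\r\n\r\n")
client.recv(1024)
client.close()

print(b"city=oslo" in captured[0])  # True: the query is readable in transit
```

With HTTPS, the relay would see only encrypted bytes; with HTTP, it sees (and could alter) the whole exchange, which is the attack the report warns about.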
The researchers were able to use free apps to turn both the doll and the robot into recording devices, and were able to use the toys as a two-way handset by calling the phone to which they were connected. "This is very easy and requires little technical know-how," the researchers added.