
Consumer interest groups allege two children's "smart toys" violate COPPA


Dalek

Member
These Toys Don’t Just Listen To Your Kid; They Send What They Hear To A Defense Contractor



Kids say a lot of random, unsolicited, or just plain personal things to their toys while playing. When that toy is stuffed with just fluff and beans, it doesn’t matter what the kid says: their toy is a safe sounding board. When their playtime companion is an internet-connected recording device that ships off audio files to a remote server without even notifying parents — that’s a whole other kind of problem.

According to a coalition of consumer-interest organizations, the makers of two “smart” kids toys — the My Friend Cayla doll and the i-Que Intelligent Robot — are allegedly violating laws in the U.S. and overseas by collecting this sort of voice data without obtaining consent.

In a complaint [PDF] filed this morning with the Federal Trade Commission, the coalition — made up of the Electronic Privacy Information Center (EPIC), the Campaign for a Commercial-Free Childhood (CCFC), the Center for Digital Democracy (CDD), and our colleagues at Consumers Union — argues that Genesis Toys, a company that manufactures interactive and robotic toys, and Nuance Communications, which supplies the voice-parsing services for these toys, are running afoul of rules that protect children’s privacy and prohibit unfair and deceptive practices.

These particular toys — basically a “girl” and “boy” theme on the same core idea — both use voice recognition tech to “listen” to the kids that play with them.

They connect via Bluetooth to a mobile phone app, usually belonging to a parent, and then from there access the internet in order to interact with kids and answer their questions. To accomplish that feat, the apps record and collect conversations between the toys and the kids, and use speech-to-text protocols to turn kids’ questions into searchable queries.
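
For a rough sense of what that pipeline looks like in code, here is a minimal Python sketch of the record, upload, transcribe, and answer loop described above. The endpoint URLs, API key, and response field names are hypothetical placeholders, not the actual Genesis or Nuance interfaces.

```python
# Hypothetical sketch of the record -> transcribe -> answer loop described in
# the complaint. The service URLs, API key, and response fields are
# placeholders, NOT the real Genesis/Nuance interfaces.
import requests

SPEECH_API = "https://speech.example.com/v1/recognize"  # placeholder endpoint
ANSWER_API = "https://answers.example.com/v1/ask"       # placeholder endpoint
API_KEY = "demo-key"                                     # placeholder credential


def transcribe(audio_wav: bytes) -> str:
    """Upload a recorded audio clip and get back a text transcript."""
    resp = requests.post(
        SPEECH_API,
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"audio": ("clip.wav", audio_wav, "audio/wav")},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["transcript"]


def answer(question: str) -> str:
    """Turn the transcript into a searchable query and fetch an answer."""
    resp = requests.get(ANSWER_API, params={"q": question}, timeout=10)
    resp.raise_for_status()
    return resp.json()["answer"]


if __name__ == "__main__":
    with open("recording.wav", "rb") as f:
        text = transcribe(f.read())  # the child's speech leaves the device here
    print(answer(text))              # the toy then speaks the answer back
```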

When users first set up the app for their toy, they may be sharing data they don’t want shared. Cayla in particular asks for multiple pieces of personal information — the child’s name, their parents’ names, their school name, their hometown, among other details — so it can converse more naturally. The app also allows for location setting, and both the Cayla and i-Que apps collect users’ IP addresses.
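
As a purely hypothetical illustration of why those fields matter in combination, the record below uses only the categories listed above; every name and value is invented.

```python
# Hypothetical example of the profile a setup flow like this could assemble.
# Field names and values are invented for illustration only.
child_profile = {
    "child_name": "Emma",
    "parent_names": ["Alex", "Sam"],
    "school": "Maple Elementary",
    "hometown": "Springfield",
    "location_enabled": True,      # the app allows a location setting
    "client_ip": "203.0.113.42",   # both apps reportedly collect IP addresses
}

# Individually these fields look harmless; together they are enough to
# identify and roughly locate a specific child, which is why COPPA requires
# verifiable parental consent before collecting them from under-13s.
```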

So far this is pretty straightforward. The Terms of Service for both toys say that they collect data in order to improve the way the toys work, and for “other services and products.”

Researchers studied the way the toys work, the complaint continues, and it turns out that they send audio files to a third party: Nuance Communications’ servers at the company’s headquarters in Massachusetts.

Nuance is a giant company best-known — to consumers, at least — for its Dragon-branded suite of speech-to-text dictation software. The company also has a significant presence in healthcare dictation, and is — like more large corporations than you’d think — a defense contractor that sells products, including “voice biometric solutions,” to military, intelligence, and law enforcement agencies.

And here’s where it starts to get more complicated: both toys are also governed under Nuance’s general privacy policy, which says, “We may use the information that we collect for our internal purposes to develop, tune, enhance, and improve our products and services, and for advertising and marketing consistent with this Privacy Policy.”

It continues, “If you are under 18 or otherwise would be required to have parent or guardian consent to share information with Nuance, you should not send any information about yourself to us.”

There's a lot more about this investigation and the filed complaint at the link. Additionally:

And on top of all that, the toys’ connections are just plain insecure. Basically anyone searching for nearby Bluetooth devices can easily connect to them, as a video and report [PDF] from the Norwegian Consumer Council show.

The Norwegian researchers’ technical report [PDF, in English] also found that some queries, like ones that made the toys connect to Weather Underground, were using insecure HTTP connections that can easily be subjected to a man-in-the-middle attack if someone were so inclined.
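
The distinction the researchers are drawing is easy to see in a short Python sketch: a plain-HTTP request travels in cleartext and can be read or rewritten by anyone on the network path, while an HTTPS request with certificate verification cannot. The URL and parameters below are placeholders, not the toys’ actual Weather Underground calls.

```python
# Why the plain-HTTP weather lookups matter. The URL and parameters are
# placeholders, not the toys' real Weather Underground requests.
import requests

# Over plain HTTP the full request and response travel unencrypted, so anyone
# on the network path (a hostile Wi-Fi access point, for example) can read
# them or rewrite the "answer" before it reaches the toy.
insecure = requests.get("http://weather.example.com/forecast",
                        params={"city": "Oslo"}, timeout=10)

# Over HTTPS with certificate verification (the default in requests), a
# man-in-the-middle can no longer read or silently alter the response.
secure = requests.get("https://weather.example.com/forecast",
                      params={"city": "Oslo"}, timeout=10, verify=True)
```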

The researchers were able to use free apps to turn both the doll and the robot into recording devices, and could use the toys as a two-way handset by calling the phone to which they were connected. “This is very easy and requires little technical know-how,” the researchers added.
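
“Little technical know-how” is no exaggeration: listing nearby discoverable Bluetooth devices takes only a few lines. The sketch below uses the PyBluez library as one way to run a classic Bluetooth scan; it is illustrative only and not the researchers’ actual tooling.

```python
# Minimal sketch of Bluetooth device discovery using the PyBluez library
# (pip install pybluez). Illustrative only; not the Norwegian Consumer
# Council's actual test setup.
import bluetooth

# Scan for ~8 seconds and resolve the friendly names of discoverable devices.
devices = bluetooth.discover_devices(duration=8, lookup_names=True)

for addr, name in devices:
    print(f"{addr}  {name}")
    # A toy that accepts any pairing request without a PIN would show up in
    # this list and could then be connected to like an ordinary headset.
```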
 

Meatfist

Member
People need to think long and hard about the Internet of Things, and what it means for their own security and privacy. I'm currently reworking my home network to section off my Echo, Nest cam, etc., but even then I'm very paranoid about how much private information these devices are able to collect.
 

Matty77

Member
I have been saying this could eventually become an issue since the original Furby, but everyone told me I was being silly or needed a tinfoil hat.

Well, it took a couple of decades, but I actually feel vindicated.
 
NBN can sell its data to other corporations, and also provide precision-targeted advertising to that same subscriber list. NBN-produced advertising uses psychographic profiling and the latest neuroscience and braintaping techniques to promote message penetration and brand retention.

Oh baby! We are close to living in the Android Netrunner universe.
Someone give me my sexbot!
 
As someone in advertising, it would be great to find a way to use that information about the kids' interests and likes so we can personalize the advertising fed to them. Hopefully we can program the doll to ask questions like "What are your favorite cereals, and who is your favorite Disney Princess?" Then use the answers the kids give to tailor their marketing message.
 
It sounds more like they're just collecting massive amounts of recorded voice data to try and feed their learning machines for speech-to-text/text-to-speech rendering.
 

Acinixys

Member
As someone in advertising, it would be great to find a way to use that information about the kids' interests and likes so we can personalize the advertising fed to them. Hopefully we can program the doll to ask questions like "What are your favorite cereals, and who is your favorite Disney Princess?" Then use the answers the kids give to tailor their marketing message.

I dislike this but know it's already happening

Although targeted ads are garbage at this stage

All of mine are for Russian mail order brides and cars

I would hate to see what a kid would be subjected to
 