In December 2017, Facebook launched Messenger Kids, a version of its Messenger platform specifically created with children in mind.
Loren Cheng, Product Management Director for Facebook, wrote the following about the new app:
To give kids and parents a fun, safer solution, we built Messenger Kids, a standalone app that lives on kids’ tablets or smartphones but can be controlled from a parent’s Facebook account. Whether it’s using video chat to talk to grandparents, staying in touch with cousins who live far away, or sending mom a decorated photo while she’s working late to say hi, Messenger Kids opens up a new world of online communication to families.
Sounds like a neat idea! What parent doesn’t love the thought of their kids having a safe app through which they can send mom and dad fun photos while they’re at work? Sounds so cute and idealistic!
What About Safety and Privacy?
Safety on such an app would obviously be a big concern. Ms. Cheng wrote in 2017:
Messenger Kids gives parents more control. Parents fully control the contact list and kids can’t connect with contacts that their parent does not approve. Parents control kids accounts and contacts through the Messenger Kids Controls panel in their main Facebook app.
Just this week, Facebook announced some new features for the app to give parents even more oversight into how their kids are using the app. Jon Porter writes for The Verge:
Parents will now be able to see more details about who their children are messaging with, whether they’re video calling them, and a history of anyone they’ve blocked in the app. They’ll also be able to see a log of recent images their child has sent and received (with the option to remove and report it if it’s inappropriate), and can log them out of devices remotely at any time. A new option also now allows parents to download all of their child’s information, similar to the data-download feature available in the main Facebook app.
Further, Facebook has said from the start that the app does not serve ads or collect data from children to be used for advertising. Ms. Cheng wrote at the product launch in 2017:
There are no ads in Messenger Kids and your child’s information isn’t used for ads. It is free to download and there are no in-app purchases. Messenger Kids is also designed to be compliant with the Children’s Online Privacy Protection Act (COPPA).
It sounds like Facebook is working hard to provide a safe, secure environment for children to chat with other trusted contacts, continually improving the app to better serve parents.
Facebook Cannot Be Trusted
Facebook is the least trustworthy social media company in existence today. A 2019 NBC News/Wall Street Journal poll found that 60% of Americans do not trust Facebook with their data; 37% said the same of Google, 35% of the federal government, and 28% of Amazon.
This is for good reason! Facebook has proven time and time again that it cannot be trusted with user data or with protecting the interests of its users…yet all the while its stock price continues to rise. Investors don’t care whether user data is protected, as long as companies keep spending billions on Facebook ads.
We aren’t going to explore all the reasons Facebook can’t be trusted here (the list is too long), but Facebook has a knack for making “mistakes” that happen to serve its bottom line rather than its users.
Here are a few ways Facebook has tried to sneakily harvest user data over the years, according to The Next Web:
- Paying teens $20 to learn how they use their phones
- Offering a “free VPN” that protected against harmful websites…and tracked phone usage outside the app
- Sponsored stories that collected data without the chance to opt out
- Tracking phone call logs and SMS conversations on Android phones
- Shadow profiles and unwanted data gathering from non-users
The saying goes, “If you’re not paying for it, you are the product.” Such is the case with Facebook. And somehow, the kinds of mistakes Facebook makes tend to breach user privacy for the sake of advertisers. None of Facebook’s mistakes seem to benefit users to the detriment of advertisers.
Given that the shady data collection tactics above were tried on the main Facebook platform, we would be foolish to assume similar “mistakes” won’t be tried on the Messenger Kids platform.
Messenger Kids Will Fail Children and Families…It Already Has
The design of Messenger Kids is evidently meant to put parents at ease. It’s full of parental controls; kids must get a parent’s authorization (via the parent’s own Facebook account) to sign up and to add each new contact.
However, the app also has some of the very grown-up features you find on Messenger. For instance, if you send a contact a message on Messenger Kids, it lets you know if the person is online or how long it’s been since he or she was active. It will also tell you whether the person you’ve sent a message to has viewed it already and if so, for recently sent messages, when.
That kind of information can cause anxiety even in adults who’ve already spent years using apps. And habituating kids to always-on communication concerns Tristan Harris, a former design ethicist at Google who cofounded and runs the Center for Humane Technology. “It’s like Coca-Cola inventing a kids’ soda product,” he says. “It still has to sell sugar; it can’t really be genuinely concerned with the well-being of kids.”
Practically speaking, Facebook has already failed children and parents with Messenger Kids. It made a mistake that was potentially harmful to its users. The bug was discovered last summer. Russell Brandom of The Verge wrote about it:
The bug arose from the way Messenger Kids’ unique permissions were applied in group chats. In a standard one-on-one chat, children can only initiate conversations with users who have been approved by the child’s parents. But those permissions became more complex when applied to a group chat because of the multiple users involved. Whoever launched the group could invite any user who was authorized to chat with them, even if that user wasn’t authorized to chat with the other children in the group. As a result, thousands of children were left in chats with unauthorized users, a violation of the core promise of Messenger Kids.
It’s unclear how long the bug was present in the app, which launched with group features in December 2017.
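For readers who want to see why the bug was so easy to miss, the flaw Brandom describes can be sketched as a missing pairwise authorization check. This is a hypothetical reconstruction in Python, not Facebook’s actual code; all names and data structures here are invented for illustration:

```python
# Hypothetical sketch of the group-chat permissions flaw described above.
# Each child's parent approves a set of contacts for one-on-one chats.
approved = {
    "alice": {"bob", "carol"},   # Alice may chat with Bob and Carol
    "bob":   {"alice"},          # Bob may chat only with Alice
    "carol": {"alice", "dave"},  # Carol may chat with Alice and Dave
    "dave":  {"carol"},          # Dave may chat only with Carol
}

def can_chat(a: str, b: str) -> bool:
    """One-on-one rule: each child must be approved for the other."""
    return b in approved.get(a, set()) and a in approved.get(b, set())

def buggy_group_invite(creator: str, invitee: str, members: set) -> bool:
    """The flawed logic: only the creator<->invitee pair is checked."""
    return can_chat(creator, invitee)

def fixed_group_invite(creator: str, invitee: str, members: set) -> bool:
    """The fix: the invitee must be approved for EVERY group member."""
    return all(can_chat(invitee, m) for m in members | {creator})

# Carol creates a group with Alice, then invites Dave.
members = {"carol", "alice"}
print(buggy_group_invite("carol", "dave", members))  # True: Dave joins,
# even though Alice's parents never approved Dave.
print(fixed_group_invite("carol", "dave", members))  # False: blocked.
```

The sketch shows how a check that is sound for two-person chats silently breaks once a third party is in the room, which matches The Verge’s description of children ending up in chats with unapproved users.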
I don’t know how much clearer I can make this: Facebook plays fast and loose with its users’ privacy, data, and overall security. Its early motto, “Move fast and break things,” may not be an official company motto anymore, but it still seems to influence product development.
Facebook promises it isn’t doing anything nefarious with kids’ data and that its platform is secure. But I wouldn’t be surprised if it made a “mistake” and accidentally exposed all of that data to advertisers. It has already compromised security in the app, and accidentally making data available to advertisers could prove profitable.
I don’t trust Facebook with my own data, and I wouldn’t advise you to trust Facebook with your kids’ data or their eyes. Find a better way to keep in contact with your kids.