On July 1, 2013, the latest set of FTC rules restricting what technology companies can do with children's private information goes into effect. Parents should care: these changes are sweeping, and they focus on mobile applications in particular.
COPPA (the Children's Online Privacy Protection Act of 1998) has been around for a long while. The law itself is clear in its intent: to protect kids from falling prey to bad actors who want their information. The rules around the law, which tell application developers how to comply, have been around for a long time, too. Too long, as it turns out; they have proven ineffective in the era of mobile devices. So the Federal Trade Commission (FTC) has drawn up brand-new rules in its COPPA working group. Those rules go into effect, as I mentioned, on July 1. For a brief and excellent description of the new rules, you might read this post by Scott Weiner.
As the CTO of Playrific, I’m pretty proud of our commitment to protecting kids online. That is our company charter, after all. Because I’m a dad, I have a personal interest in making things safer for kids to explore. I spent this week in DC with a handful of other mobile app developers talking to the FTC about COPPA (and other things as well). We met with the entire team working on the regulations, as well as two of the four commissioners at the FTC.
Playrific is a very active member of the Association for Competitive Technology (ACT). On the right is a snapshot of a few of the ACT members meeting with the whole COPPA team this week. The COPPA team is focused on what data gets collected without parental consent. And that is a laudable goal.
Here's why parents should care: this whole debate has become centered on advertising, not safety. The advertising lobby got in on the game early, since it saw its lifeblood being pinched off. Advertisers want to be able to target kids and to adapt their advertising to kids' behavior. App developers are not thrilled about pushing ads to kids, but do so in order to avoid something worse: limiting their apps to kids who can afford to buy them. Providing free, ad-supported versions of apps, especially educational apps, is the best way we currently have to reach the broadest audience while still paying our staff. Playrific resists this model and has found other ways to fund its operations besides advertising to kids, but it's a reality of life in 2013 that kids are going to be targeted by advertisers online. So it's best to focus some attention on controlling it.
However, the focus on advertising has completely eclipsed the real problem, which we believe is data protection. Having advertisements jammed in front of kids is annoying, especially ads tailored to a specific kid's behavior patterns online. Annoying, no question; that is why we at Playrific don't do it. But having data that was collected for legitimate reasons leaked or hacked by bad actors is far worse than annoying; it's downright dangerous. I have spent the past 30 years creating and selling software for businesses big and small. Businesses care about what happens to their data and take serious steps to protect it from hacking and intrusion. As a self-serving promotional statement, we at Playrific are extremely proud of the "back-end" data security measures we have taken to protect any data we hold about our users. Most app developers take no such measures; they simply are too new to software development, and often come from the entertainment side, where business software practices are foreign. It's not their fault, but it's also not in a parent's best interests.
Consider this question: imagine that you are an offender who preys on kids. Where is the best spot to find a list of kids with concentrated information about each one, including behavior patterns and perhaps even a photo or two? Online is certainly the answer, and a very scary answer for parents. Now imagine you are that same bad actor, but clever; what do you do then? Get a job at any of the 50,000+ app developers who gather this information in databases. Or hack into any of those databases, especially the lightly guarded ones. Our conclusion is that the real danger to kids revolves around data protection, not around parental consent to advertising. As a parent, the choice is clear between protecting my kid from an ad and protecting my kid from unwittingly disclosing their photo and current location (accurate to within 2 meters) to a bad actor.
Playrific gathers children's behavior information for what we consider legitimate reasons: to help provide relevant educational and entertaining content to young kids, without ads or the risk of navigating out into dangerous territory. A large number of parents seem to agree with us that this is legitimate, if recent download numbers are an indicator. But we don't view download or usage numbers as an advertising windfall; we view them as a mandate to keep that data protected. We believe it should receive more protection than healthcare or financial data. We take the trust parents give us extremely seriously. One misstep and all is lost (in our view). You will notice that I am not describing how we keep things safe and secure; that discretion is itself part of keeping the data safe and secure. I can say that we run hourly, weekly, and monthly penetration testing to ensure that those who need to stay out are kept out. We have very serious policies about who inside our company can see what data, we expose as little data to as few people as possible, and we research the background of each person with access.
We applaud the FTC in its efforts to protect kids. We sympathize with the COPPA team because we both face the same issues surrounding keeping kids safe. What we are discouraged by is the intense focus on whether advertisers can get access to kids' behavior, and the near-complete disregard for protecting whatever data is ultimately collected.
What can parents do about all this? First, if you agree that protecting concentrated databases of information on children is paramount, tell everybody you know, and then join us in engaging the US Congress, the US Senate, and the FTC to shift their focus from advertising to data protection. Second, be on the lookout for industry-wide changes between now and July 1, when the new COPPA regulations go into effect. You should see dramatic changes in how apps try to protect kids from advertising. There are a hundred buzzwords surrounding this topic, like Verifiable Parental Consent and Age-Gating. Forget the buzzwords and ask yourself two questions after you install and set up an app: (1) do I know what information this app collects about my child, and why it collects it, and (2) do I have any idea whether it protects the information it gathers from exposure to bad actors, either employees or outside hackers? Third, tell us how we can inform you better. We're parents, but we are also kind of nerdy. It is certain that we can do a better job of informing you and of keeping you involved in protecting your kids. If you tell us how to improve, you may be assured that we'll act on it.