Facebook can’t be trusted to protect users’ data on its own. It’s time for Congress to step in.


While the details of this week’s Facebook scandal — that the company had allowed partners to view supposedly private user data — were new, the arc of this story already follows a predictable script that, unfortunately, Americans are seeing over and over and over again.

Every few weeks, a new story breaks about some new corporate innovation in unauthorized sharing, selling or losing control of Americans’ data; the company will apologize and promise to always put customers first; it will make a cosmetic, largely meaningless change to its policies; and then it all happens again a few months later. Wash, rinse, repeat.

After years of this, Americans are understandably fed up with empty corporate promises. Some of the biggest tech companies have repeatedly shown that profits come first — and then we'll see about the best interests of users.

Sheryl Sandberg personally told me that personal privacy is a matter of national security, and yet we now know that Facebook shared users’ personal information with Russian and Chinese technology companies with strong links to their governments. (Yandex has been credibly accused of funneling user data to the Russian government, while U.S. intelligence officials have been sounding the alarm about Huawei’s government links for years now.)

I personally asked Facebook about its data-sharing partnerships with other companies earlier this year, and my staff reviewed the privacy audits it filed with the Federal Trade Commission. Those audits revealed, much like the New York Times’ reporting, that, once Facebook had handed users’ personal information to other companies, it did nearly nothing to make sure that outside companies protected that information.


So why does this latest confirmation that users’ data may well be shared or traded away without their permission matter, beyond proving once again that too often corporations will say nearly anything to further their march to higher profits and stock prices?

For one, it makes America less secure, by making it much easier for foreign adversaries to scoop up Americans’ deeply personal information to then use as ammunition against the United States.

Two recent reports written for the Senate Intelligence Committee are a stark illustration of how these campaigns work. Russia tried hard to keep African-Americans away from the polls by targeting them on social media in 2016. These kinds of foreign influence operations will only get more sophisticated in the future.

When hostile regimes have ready access to Americans’ data — by obtaining it from friendly companies or stealing it — it makes it even easier for these governments to micro-target us with divisive messages and false content designed to undermine American democracy.

Companies have repeatedly lied to Congress and the American people about what they do with our information. It’s my view that CEOs who lie to the government about protecting your privacy shouldn’t get off with a slap-on-the-wrist fine: They should face serious financial penalties and even the possibility of prison time — and, under my bill, they will.

After watching the privacy spin cycle far too many times, it’s clear to me that corporate CEOs need some skin in the game to actually take Americans’ privacy seriously.


Google hit with FTC complaint over ‘inappropriate’ kids apps

The Federal Trade Commission is being asked to investigate how apps that may violate federal privacy laws — laws that restrict the data that can be collected on children — ended up in the family section of the Google Play store.

A group of 22 consumer advocates, led by the Institute for Public Representation at Georgetown University Law School, filed a formal complaint against Google on Wednesday and asked the Federal Trade Commission to investigate whether the company misled parents by promoting children’s apps that may violate the Children’s Online Privacy Protection Act (COPPA) and Google’s own policies.

“The business model for the Play Store’s Family section benefits advertisers, developers and Google at the expense of children and parents,” Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, said in a statement. “Google puts its seal of approval on apps that break the law, manipulate kids into watching ads and making purchases.”

Among the examples cited in the complaint are a “Preschool Education Center” app and a “Top 28 Nursery Rhymes and Song” app that access the device’s location, according to an analysis by privacy research collective AppCensus. Other apps, including “Baby Panda’s Carnival” and “Design It Girl – Fashion Salon,” were among those listed that sent device identification data to advertising technology companies, allowing them to build a profile of the user.

The complaint also spotlights several apps that may not be age appropriate, including “Dentist Game for Kids,” which lets the player give the virtual patient shots in the back of their throat. Another game, “Doctor X & the Urban Heroes,” requires players to cut clothing off of a patient.

“Parents want their children to be safe online and we work hard to protect them. Apps in our Designed for Families program have to comply with strict policies on content, privacy, and advertising, and we take action on any policy violations that we find,” a Google spokesperson said in a statement.

Google marks apps that are suitable for children with a star and the recommended age group. Google said it removed thousands of apps this year from its family program after it found policy violations. In addition, Google said one-third of applicants to the program were rejected in 2018.


The complaint is just the latest scrutiny of the Google Play store. Earlier this year, researchers analyzed 6,000 free children’s Android apps and found that more than half shared details with outside companies in ways that could violate COPPA. A study from the University of Michigan looked at 135 apps marketed by Google to children under the age of 5 and found that 95 percent of the apps had some kind of advertising. Additionally, more than half had pop-up ads that were difficult for a young child to close, according to the study.

And in September, Google was named in a lawsuit filed by New Mexico’s attorney general that accused app maker Tiny Lab Productions of sending the location data of its young users to other companies.

In 2016, the FTC settled a case against the advertising company InMobi for $950,000 for tracking the locations of app users, including children, without first getting parental consent.

Google removed an app based on the show “Blaze and the Monster Machines” in January, after a sinister voice recording in the app that threatened children with a knife went viral, prompting parents in the U.K. to complain.
