Facebook is not going to Like this: Brit watchdog proposes crackdown on hoovering up kids’ info


In the UK, it seems, someone is trying to think of the children

Analysis The famous “Like” button may be on the way out if a new code for social media companies, published by the UK’s Information Commissioner’s Office (ICO), has its way.

Among the 16 rules in the consultation document [PDF] is a proposed ban on the use of so-called "nudge techniques" – user interfaces and software specifically designed to encourage frequent, daily use and to gather information that can then be sold on.

“Do not use nudge techniques to lead or encourage children to provide unnecessary personal data, weaken or turn off their privacy protections, or extend their use,” the code states, among a range of other measures that social media giants like Facebook, Twitter and Snapchat are going to hate.

The code is specifically all about the children, with the head of the ICO, Elizabeth Denham, saying in a statement: "We shouldn't have to prevent our children from being able to use [the internet], but we must demand that they are protected when they do. This code does that."

Many of the changes would require companies like Facebook to adjust their software and back-end systems. And some would directly hit social media companies' bottom line by cutting off access to vast quantities of personal information, which the companies repackage and sell to advertisers.

The code is just the latest push by the UK government – also reflected across Europe – to bring social media companies in line with what have been long-established norms and make them more responsible for removing damaging and illegal content, as well as limit the amount of personal data they compile.

Big push

It comes a week after the UK government published a White Paper on “Online harms” that argued for new, restrictive laws on social media and amid a global sense among lawmakers that the era of self-regulation in the internet space is over. The code has also been published just before a new law that requires adult content websites to verify the age of UK consumers before providing them with access to their material comes into effect.

Film director and children's rights campaigner Baroness Beeban Kidron, one of the key drivers behind the new code, said in a statement: "For too long we have failed to recognize children's rights and needs online, with tragic outcomes."

She went on: “I firmly believe in the power of technology to transform lives, be a force for good and rise to the challenge of promoting the rights and safety of our children. But in order to fulfill that role it must consider the best interests of children, not simply its own commercial interests.”

Some of the rules are general to the point of vagueness – such as the first requirement that a social media company make “the best interests of the child a primary consideration.”

But others are firm and threaten to have a significant impact on not just the design but also the business model used by such companies. The code makes it plain that unless the companies enact age-verification systems, the UK government expects them to extend all the changes to all users, regardless of age.

One key change is for default settings to be set to “high privacy” – something that Facebook famously gets around by constantly changing its own systems and forcing users to rediscover and reapply content controls. A high-privacy default would significantly limit the amount of personal information that can be automatically gathered through such a service.

Another is the key concept of “data minimization” – which is present in Europe’s GDPR data privacy legislation – where companies are expected to only gather the information they need to provide their service and no more.

And the code says that location tracking should be turned off by default and there should be “an obvious sign” if it is turned on. It also says that making user location visible to others “must default back to off at the end of each session.”

Clear and concise? What madness is this?

And in a clear poke in the eye to Facebook, the code insists that users are provided with "'bite-sized' explanations about how you use personal data at the point that use is activated" and that those explanations be "concise, prominent and in clear language suited to the age of the child."

The code is quite clearly aimed at banning all the questionable practices that social media companies have introduced in order to gain access to as much personal data as possible, and uses the fact that different laws exist around the protection of children and their information to push the changes.


Somewhat predictably, those companies are not happy, although they are currently treading a diplomatic line – in public at least. In its response [PDF] to the ICO's initial call for feedback, Facebook made it plain that it was unhappy with the direction the regulator was taking and basically claimed that it was already doing enough.

It even strongly implied that the regulator was patronizing kids by insisting on such controls when "we know that teenagers are some of the most safety and privacy [conscious] users of the internet." It adds that "age is an imperfect measure of maturity" and the proposals risk "dumbing down" controls for children "who are often highly capable in using digital services."

And in a word-perfect summary of Facebook and its culture, the organization notes that when it comes to its systems “the design journey is never over” and that it is “highly committed to improving people’s experience of its own services.”

The code is out for public review and comment until May 31. ®


April 15, 2019