
Facebook child abuse detection hit by new EU rules


Bangladeshpost
Published: 22 Dec 2020 08:48 PM

Facebook has switched off some of its child abuse detection tools in Europe in response to new rules from the EU.

The company said it had no choice but to do so, since the new privacy directive bans the automatic scanning of private messages, the BBC reports.

The change only applies to messaging services rather than all content uploaded to Facebook.

However, there is no change in the UK, where measures are "consistent with applicable laws", Facebook said.

The problem has emerged despite warnings from child protection advocates that the new privacy rules would effectively ban automated scanning for child sexual abuse images and other illegal content.

Some other firms, including Microsoft, have not made such changes, arguing the most responsible approach is to keep the technology functioning.

But few companies handle private messages on the scale that Facebook does: it runs its own Messenger service and owns Instagram.

Last week, it was revealed that implementing the directive would restrict some features of messaging apps. The impact on child protection tools has now kicked in as well.

"This train crash has been approaching since the summer," said John Carr of the Children's Charities' Coalition on Internet Safety.

"Neither the EU nor one of the world's largest and most powerful tech companies could find a way of avoiding it. 

It is a very sad day for the children of Europe.

"We are heading for a very strange world if privacy laws can be used to make it easier for paedophiles to contact children, or for people to circulate or store pictures of children being raped."

Spotting exploitation

The new rules update long-standing privacy protocols so that they now cover additional forms of messaging - such as email and online chat. This has had the unintended effect of prohibiting advanced tools designed to spot:

Newly created violent and exploitative images not yet logged by investigators

Online conversations that have the hallmarks of abusers grooming victims

Anna Edmundson, head of policy at the NSPCC, said tech firms' ability to scan for such content was "fundamental" to protection efforts.

In October, the Council of the European Union said it was aware of the problem the new rules posed for the detection of child abuse-related content, because no exception allowing such scanning existed in law.

"Providers will not be able to continue to take such action unless a legislative measure is adopted with urgency and brought into effect by 21 December 2020," the October press release warned.

However, no such exemption arrived in time.

As a result, "the European Commission and child safety experts have said that the directive does not provide a legal basis for these tools", Facebook said in a blog post explaining the issue.

"The safety of our community is paramount, and we are advocating for changes that will allow us to resume our efforts to identify this type of material."