What happened to Facebook’s 'revenge porn' prevention pilot?


Nearly six months since Australian launch, controversial scheme remains grounded

Facebook’s controversial 'revenge porn' prevention pilot has still not launched close to six months after it was announced.

The pilot scheme – which requires that victims send their intimate images to Facebook to be reviewed by a “specially trained representative” – was revealed in November last year, but is yet to start.

The Australian Government’s Office of the eSafety Commissioner, which is facilitating the pilot as the first of four government agencies worldwide touted to do so, could not say exactly why it has been delayed.

“Facebook received some feedback about the proposed pilot, which we understand they are considering before going live,” a spokesperson for the commissioner said.

Facebook refused to respond to any questions relating to the pilot.

Soon after the initial announcement – in which eSafety Commissioner Julie Inman Grant said the agency was “proud to partner with Facebook” – the scheme faced criticism on multiple fronts.

The social media company issued a number of brisk responses to the criticism, including a blog post by Facebook Global Head of Safety Antigone Davis entitled ‘The Facts’.

That was at a time before the social media site – already considered a key enabler of revenge porn and ‘sextortion’ – became embroiled in a global scandal in which an ever-rising number of users had personal information improperly shared with political consultancy Cambridge Analytica.

Despite the scandal the eSafety Commissioner told CIO Australia that it “already had some people express interest in participating in the pilot once it goes live”.

More questions than answers

In the proposed pilot, to get an image removed or blocked from Facebook, a user is first required to complete an online form on the eSafety Commissioner’s website. The commissioner then alerts Facebook.

The user then sends the image in question to themselves over Facebook Messenger.

At this point Facebook says a “specially trained representative from our Community Operations team” reviews the image and creates an image hash from it.

Image hashing essentially turns the image into a numerical digital fingerprint, which is stored in a Facebook database so the platform can detect and block matching images from being uploaded.
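Facebook has not published the hashing algorithm it uses, so as an illustration only, here is a minimal “average hash” sketch in Python – one of the simplest perceptual hashing techniques – showing how an image can be reduced to a 64-bit numerical fingerprint. The 2D-list input format and the choice of algorithm are assumptions for demonstration, not Facebook’s actual method.

```python
# Illustrative only: a simple perceptual "average hash". Facebook's real
# algorithm is not public; this just shows the general idea of turning
# an image into a compact numerical fingerprint.

def average_hash(pixels, size=8):
    """Hash a grayscale image (2D list of 0-255 intensities).

    Downscales to size x size by block averaging, then emits one bit
    per cell: 1 if the cell is brighter than the overall mean.
    Returns a (size*size)-bit integer fingerprint.
    """
    h, w = len(pixels), len(pixels[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return sum((cell > mean) << i for i, cell in enumerate(cells))
```

Because the hash is computed from coarse brightness structure rather than raw bytes, visually similar images – for example, a resized copy – tend to produce the same or a very close fingerprint, which is what allows a database lookup to catch re-uploads.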

Once the image is ‘hashed’ Facebook tells the user, and asks them to delete the photo from the Messenger thread on their device. “Once they delete the image from the thread, we will delete the image from our servers,” Davis said at the time.

The process has been roundly criticised for its security pitfalls and the burden it places on users.

“Why is the image hashing being performed on a Facebook server and not directly on the client device, like the user’s smartphone?” asked Dr Chris Culnane and colleagues from the University of Melbourne.

Culnane et al’s question, raised by others too, drew a Twitter response from Facebook chief security officer Alex Stamos.

“Mr Stamos’s tweet raises more questions than it answers,” the Melbourne researchers responded, explaining that the algorithm could realistically be revealed and circumvented by observing its use and output.

“While there is insufficient security justification for not performing the image hashing on the client, however, there would be a commercial motivation for not revealing the algorithm. High-quality image hashing is a valuable asset for Facebook and its commercial partners, potentially allowing it to recognise and classify similar images,” Culnane et al wrote.

Although Stamos said the hashing technique would be “resilient to simple transforms like resizing”, it is unclear whether images that have had a filter added or been scribbled on would also be flagged.
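One common way perceptual-hash systems tolerate small transforms is to compare fingerprints by Hamming distance – the number of differing bits – and flag anything under a threshold. The sketch below is hypothetical: the threshold value and the bit-flip patterns standing in for “a resize” and “a heavy edit” are illustrative assumptions, not details of Facebook’s system.

```python
# Hypothetical sketch of near-duplicate matching via Hamming distance.
# The threshold and the example bit changes are assumptions; Facebook's
# actual matching behaviour is not public.

def hamming(h1, h2):
    """Number of differing bits between two integer fingerprints."""
    return bin(h1 ^ h2).count("1")

def is_match(stored, candidate, threshold=10):
    """Flag a candidate whose fingerprint is 'close enough' to a stored one."""
    return hamming(stored, candidate) <= threshold

stored = 0xF0F0F0F0F0F0F0F0          # fingerprint of the original image
resized = stored ^ 0b101             # a resize might flip only a few bits
filtered = stored ^ 0xFFFF00000000   # a heavy filter might flip many bits
```

Under this model, `is_match(stored, resized)` would succeed while `is_match(stored, filtered)` would not – which is precisely the uncertainty critics raised: whether filtered or scribbled-on images fall inside or outside the matching threshold.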

The hashing issue isn’t the only point of concern with the pilot. Critics have also taken aim at the fact Facebook employees would have access to the images submitted, questioning why they couldn’t instead review images attempting to be posted that were flagged by a matching hash.

According to a Motherboard article, Facebook initially said nude images would be blurred when reviewed by employees, but later clarified that this was not the case.

Some have praised Facebook’s effort to tackle the growing problem of image-based sexual abuse. A 2015 study found that one in ten Australians had had a nude or semi-nude image of themselves posted online or sent to others without their permission.

Stamos defended his company and hit back at critics who have “trouble talking about imperfect solutions to serious problems”.

Ultimately, the success of the pilot, when and if it is launched, will come down to victims’ trust in the platform.

“There is the enormous issue of how far we can trust Facebook and its staff,” wrote Amy Binns, senior lecturer of digital communications at the UK’s University of Central Lancashire.

“Can Facebook guarantee that these photos, trustingly uploaded by desperate people trying to break free from damaging relationships, will only be seen by responsible staff? Or will they, over time, be farmed out to subcontractors, trainees and people who are themselves damaged by constant exposure to violence and sex online?” Binns added.

Individuals who think their intimate images may have been shared online without their consent are encouraged to visit the eSafety Office’s image-based abuse portal at www.esafety.gov.au for further information, or to report the abuse if an image is not removed.
