Would You Trust Facebook With Your Most Intimate Photos?
May 30, 2018

Once you put something on the Internet, it's extremely difficult to remove it. Facebook is launching a program to deal with this problem, specifically to tackle the issue of revenge pornography.
All you have to do to participate is share the compromising photos of yourself with the social media giant.
A Risqué Compromise
Sharing revenge porn is illegal in 38 states, but it remains a common problem that can cause extreme humiliation. Now a certain San Francisco-based social network thinks it has the solution to keeping your private pictures off the Internet.
Antigone Davis, Facebook’s Global Head of Safety, recently revealed that the tech company would soon be releasing a pilot program to stop anyone from sharing your nudes on the platform. Essentially, it would create a digital fingerprint of each intimate photo you want on lockdown. The catch? You have to share each photo with Facebook for the company to create the fingerprint.
It’s okay to be skeptical of this news; in fact, skepticism is to be expected. Recent months have not been kind to Facebook’s reputation. Between the Cambridge Analytica scandal and fake accounts, the tech company doesn’t exactly have the best standing with the public right now.
A Human Touch Over AI
The new program seems to be built upon Facebook’s current protocol for handling sensitive photographs. Basically, if you find intimate photos uploaded to the platform without your permission, you contact the company to take them down. Facebook will then create a mathematical fingerprint of that picture to block any future uploading of it.
In the pilot program, you beat would-be abusers to the punch by uploading the photos yourself. Then, a “specially trained member of [Facebook’s] Community Operations Safety Team” reviews each one and creates a unique fingerprint for it to prevent it from being uploaded elsewhere on the platform.
Yes, you read that right. Basically, one person from a small team of reviewers will examine your content. What about recent AI developments, you ask? Can’t they circumvent the need for a human eye? Apparently not. Luckily, though, Facebook’s image-recognition software doesn’t remember your picture pixel for pixel. Your photo gets deleted a week after being uploaded; only the digital fingerprint is retained to block future uploads.
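Facebook hasn’t said publicly exactly how those fingerprints are computed, but the general idea resembles perceptual hashing: a compact signature of the image’s structure that can be stored and compared long after the photo itself is deleted. Here’s a minimal sketch, assuming a simple “difference hash” (dHash); the file name and hash size are illustrative, not anything Facebook has confirmed.

```python
# Illustrative only: Facebook has not published its fingerprinting algorithm.
# This sketch uses a "difference hash" (dHash), a common perceptual-hash
# technique, to show how a compact fingerprint can stand in for the photo.
from PIL import Image

def dhash(image_path: str, hash_size: int = 8) -> int:
    """Return a 64-bit perceptual fingerprint of the image at image_path."""
    # Shrink and grayscale the image so the hash reflects structure, not raw pixels.
    img = Image.open(image_path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())

    # Encode whether each pixel is brighter than its right-hand neighbor.
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

# Once the fingerprint is stored, the original file can be deleted;
# the 64-bit integer alone is enough to check future uploads against.
fingerprint = dhash("reported_photo.jpg")  # hypothetical file name
print(f"{fingerprint:016x}")
```

The point of the sketch is that the fingerprint is a few bytes of structural information, not a copy of the picture, which is why deleting the photo after a week still leaves enough behind to catch re-uploads.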
Peace of Mind?
It’s not exactly clear how sophisticated the software behind this program will be. Will modified versions of a photo slip past the fingerprint comparison? Is Photoshop all it takes to get around it? There’s no telling, yet.
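To make that question concrete, here is how matching might work in the hypothetical dHash sketch above: fingerprints are compared by Hamming distance, so light edits like re-compression or a small resize usually still match, while heavy crops or aggressive Photoshop work may push an image past the threshold. The threshold value below is an arbitrary assumption for illustration.

```python
# Hypothetical matching step, continuing the dHash sketch above.
def hamming_distance(a: int, b: int) -> int:
    """Count the bits that differ between two 64-bit fingerprints."""
    return bin(a ^ b).count("1")

def is_blocked(upload_hash: int, stored_hashes: list[int], threshold: int = 10) -> bool:
    """Return True if the upload matches any stored fingerprint closely enough.

    The threshold of 10 bits is an illustrative guess, not a known Facebook setting:
    lower values mean fewer false positives but let more edited copies through.
    """
    return any(hamming_distance(upload_hash, h) <= threshold for h in stored_hashes)
```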
The whole concept is, frankly, an uneasy one to accept. After all, we are talking about uploading intimate photos of ourselves to a company that makes the bulk of its profit from monetizing user data. But if successful, it could prove to be a viable way of stopping revenge porn.
Regardless, the question remains — would you trust Facebook with your most intimate photos?