
Sorry Vine, your porn problem isn’t going anywhere

The Internet used to be for porn (at least, more so than it is now), and then the safety nets went up to keep young and pure eyes (and ears) away from such content. Now it looks like the same thing is happening with social media. The recent fallout over Vine and its NSFW issue, however, has made us realize one thing: Porn just follows one of the primary rules of the Internet – adapt.

When Vine first appeared, it was set into the wild with an encouraging push from Twitter, all bright-eyed and bushy-tailed – and unregulated. You could search for tags as obvious as “NSFW” or the more blatant “boobs” and find results… lots of results. And notoriously, due to a supposed human error, a vine called “Dildoplay” made it onto the Editor’s Picks list in the app (you can’t help but wonder how the resulting meeting went down).

It looked like Vine was at risk of going the way of the PostSecret app, which is to say out of the iOS App Store swiftly after its launch, thanks to pornographic, gruesome, and even threatening images that a small team of moderators (in PostSecret’s case, largely made up of volunteers) just couldn’t keep up with. The App Store rules say that apps which “contain user generated content that is frequently pornographic” will get the boot, and mention ChatRoulette by name among the examples of things that don’t fly.

Vine’s terms of service didn’t even prohibit explicit imagery or nudity – and still don’t (though they do note that threats are prohibited). While the company maintains the right to “respond to user support requests,” nowhere does it explicitly say that the explicit is not OK. That hasn’t changed, but the ability to use tags to find porn has been dropped.

Since Vine’s early days, hashtags like #boobs and #smut have been reduced to sad-face “nothing to see here” results. #Sex101 brings up a strangely adorable vine of stuffed animals posed in various coital positions, but no actual human body parts.

But there are still loopholes. For example, the SuicideGirls account shows off full-frontal shots of tattooed models. Scrolling through, the first few Vines play a series of very adult-themed images… but get about four or five videos in, and suddenly you see warnings about sensitive content you need to tap to view. Clearly, the Vine team is having trouble keeping up with what’s rolling in. While some of the more obvious hashtags aren’t pulling up the desired Vines, others are still rife with very adult videos (want to see for yourself? Search for #xxxvine or #xxxxxx and be prepared for what shows up).

When it comes to depending on the App Store while relying on a relatively small staff to rifle through a flood of visual content – content with a tendency to attract lewd images – Instagram sets the standard for flagging, finding, and pulling adult-themed pictures.

Today, Instagram has a 12+ rating in the iTunes store for, among other things, “Infrequent/Mild Sexual Content or Nudity.” Tags not allowed: #NSFW, #porn, and the like. And even the porn_stars and Suicidegirls accounts are more Victoria’s Secret than Penthouse. Instead of relying on an army of moderators, Instagram took out the obvious tags and then relies on the “Flag for Review” button.


Still, the masses are curious, so one user tried to see what he could get away with, posting a pic of boobs onto Instagram as @thebreastsofayounglady and reporting the results. Within a day and a half, Instagram came calling with a warning:

It has come to our attention that one or more of the photos you’ve shared on Instagram violates our Community Guidelines, which can be found here.

In short, we ask that you:

  • Don’t share photos that aren’t yours.
  • Don’t share photos that show nudity or mature content.
  • Don’t share photos of illegal content.
  • Don’t share photos that attack an individual or group, or violate our Terms of Use.

Any violating images flagged by members of the Instagram community have been automatically removed, and we strongly suggest deleting any additional content on your account that may not fall in line with the above guidelines or our Terms of Use.

It seems while plenty of users don’t mind a little T&A, others are policing pretty effectively.

Instagram has done an admirable job of cleaning itself up – but the point is that users will continue to post adult content, even if it keeps getting pulled. There are too many loopholes and too many people to keep porn off social networks entirely. Vine will have to learn from its predecessor’s experience if it wants to stay partnered with Twitter and maybe even fix that 17+ rating from the App Store. 

As more and more visual-sharing apps take over social networking (and buddy up with big platforms like Facebook and Twitter), self-censorship is going to be incredibly important. And as we adapt and find ways to slip our dirty minds into these image-heavy feeds, they’ll need to keep coming up with ways to warn us or block the content entirely. The Internet – and by extension, social media – will always be for porn, but now it’s also going to come with child locks attached.

Jenny An
Former Digital Trends Contributor
Jenny writes about technology, food, travel, and culture. She lives in Brooklyn with a MacBook that is like a pet. She has…