On Instagram, people share a bevy of photos: food, places they visit, and a never-ending flurry of selfies. The platform has a darker side, too, where some users share lewd, violent, and sexually explicit content. But users will see fewer of these posts going forward as Instagram cracks down on borderline content that doesn't technically violate its community guidelines.
"We have begun reducing the spread of posts that are inappropriate but do not go against Instagram's Community Guidelines," Instagram said. If a post is sexually suggestive, even though it doesn't feature sexual activity or nudity, it could still get demoted.
Instagram Begins Crackdown On Explicit Posts
Similarly, if a post is considered in bad taste, lewd, violent, or hurtful — even if it doesn't necessarily constitute hate speech or harassment — it could get less visibility, as well. Posts of this nature "may not appear for the broader community in Explore or hashtag pages," specifies Instagram.
The company said it has begun using machine learning to determine whether a post is "eligible to be recommended to our community." Examples of non-recommendable content include violent, graphic, shocking, and sexually suggestive photos or videos.
Can We Trust Machine Learning?
The problem is that machine learning isn't perfect, at least not yet. When Tumblr rolled out a similar update to combat its growing pornography problem, users complained that posts that didn't violate its policies were still flagged, which suggests it's not enough to rely on machine learning alone to clean up an entire platform.
That's why Instagram is now training content moderators to label borderline content as they hunt down policy violations; Instagram then uses those labels to train its algorithm. Furthermore, the company won't remove these posts from the feed entirely, and it says the new policy won't affect Stories just yet.
Borderline Content
But in November, Facebook CEO Mark Zuckerberg published a manifesto calling for broadly reducing the reach of "borderline content," which on Facebook means such posts are shown less often. Such a policy could easily extend to Instagram at some point.
Perhaps the group most impacted by this is creators, who use Instagram and a number of other platforms to reach fans and earn money. They have no idea yet what constitutes borderline content, and Instagram's rules and terms of service don't mention it, either. Its Help Center does contain a brief, recently published note about borderline content, but it offers no visual examples, indicators, or specifics.