Instagram Reveals AI-Powered Tools to Combat Bullying and Restrict Problematic Followers

Instagram is now using artificial intelligence to identify potential bullying, along with a new tool that lets users restrict problematic followers…

Social network Instagram today announced two new features aimed at preventing bullying. The first uses artificial intelligence (AI) to warn users when content they are about to post might be considered offensive. The other gives people the ability to “restrict” problematic followers.

New Instagram Anti-Bullying Tools Announced

The AI anti-bullying technology essentially reviews content before it is published, helping to spot material that might be considered bullying, such as certain comments, images, and videos. Adam Mosseri, Head of Instagram, writes:

“In the last few days, we started rolling out a new feature powered by AI that notifies people when their comment may be considered offensive before it’s posted. This intervention gives people a chance to reflect and undo their comment and prevents the recipient from receiving the harmful comment notification. From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect.”
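To make the flow Mosseri describes a little more concrete, here is a rough, hypothetical Python sketch of a “check before you post” step. The keyword list, function names, and warning text below are purely illustrative assumptions; Instagram’s actual system relies on machine-learning classifiers whose details are not public.

```python
# Hypothetical sketch of a "check before you post" flow.
# The keyword check below is a trivial stand-in for Instagram's real
# (non-public) machine-learning classifier.

OFFENSIVE_TERMS = {"idiot", "loser", "ugly"}  # illustrative placeholder list


def looks_offensive(comment: str) -> bool:
    """Return True if the comment contains terms flagged as potentially hurtful."""
    words = {w.strip(".,!?").lower() for w in comment.split()}
    return bool(words & OFFENSIVE_TERMS)


def submit_comment(comment: str, confirmed: bool = False) -> str:
    """Warn the author before posting a potentially offensive comment.

    If the comment is flagged and the author has not confirmed it,
    they get a chance to reflect and undo instead of publishing.
    """
    if looks_offensive(comment) and not confirmed:
        return "WARNING: this comment may be considered offensive. Edit it or confirm to post."
    return f"Posted: {comment}"


if __name__ == "__main__":
    print(submit_comment("You're such a loser"))     # triggers the warning
    print(submit_comment("Great photo, love it!"))   # posts normally
```

The key design point is that the warning happens before publication, so the recipient never receives a notification for a comment the author decides to take back.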

The “restrict” option keeps comments from restricted users from showing up publicly unless the account holder approves them. Restricted users also can’t see when the person who restricted them is active on Instagram or has read their direct messages.
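For readers curious how that gating might be modeled, here is a minimal, hypothetical sketch of the restrict behavior described above. The class and method names are assumptions for illustration only and do not reflect Instagram’s internal implementation.

```python
from dataclasses import dataclass, field

# Hypothetical data model for the "restrict" behavior: comments from
# restricted users are held for approval, and restricted users can't
# see activity status or read receipts.


@dataclass
class Account:
    username: str
    restricted: set[str] = field(default_factory=set)
    pending_comments: list[tuple[str, str]] = field(default_factory=list)
    visible_comments: list[tuple[str, str]] = field(default_factory=list)

    def restrict(self, other: str) -> None:
        self.restricted.add(other)

    def receive_comment(self, author: str, text: str) -> None:
        # Comments from restricted users wait for approval instead of showing up.
        if author in self.restricted:
            self.pending_comments.append((author, text))
        else:
            self.visible_comments.append((author, text))

    def approve(self, author: str, text: str) -> None:
        self.pending_comments.remove((author, text))
        self.visible_comments.append((author, text))

    def shows_activity_to(self, viewer: str) -> bool:
        # Restricted users can't see when this account is active or has read DMs.
        return viewer not in self.restricted


if __name__ == "__main__":
    me = Account("example_user")
    me.restrict("troll_account")
    me.receive_comment("troll_account", "mean comment")
    me.receive_comment("friend", "nice post!")
    print(me.visible_comments)                     # only the friend's comment is visible
    print(me.pending_comments)                     # the restricted user's comment awaits approval
    print(me.shows_activity_to("troll_account"))   # False
```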

Savannah Marie

Savannah Marie loves writing and all things social media. She writes on a variety of topics, from social media to health and wellness to travel and all points in between! She is the lead writer and creator of Mixios and blogs with style and a one-of-a-kind voice.