Instagram Starts Issuing Alerts Over Offensive Captions
Instagram has announced that its platform will start warning users when it detects that they’re about to post a potentially offensive caption on a photo or video. This new feature marks the expansion of the anti-bullying system Instagram introduced earlier this year.
In July, Instagram rolled out an AI-powered system that warns users when they attempt to publish a ‘harmful’ comment. The same technology is now being used to monitor captions for potentially offensive content as well, Instagram announced on Monday.
If an Instagram user posts something that the service’s AI-powered tools think could be hurtful, the app will generate a notification to say that the caption “looks similar to others that have been reported.” It will then encourage the user to edit the caption, but it will also give them the option of posting it unchanged.
The difference from Instagram’s other moderation tools is that this one relies on users themselves to recognize when a caption crosses the line. It’s unlikely to stop the platform’s more determined bullies, but it has a shot at protecting people from thoughtless insults.
Instagram says the new feature is rolling out in “select countries” for now, but it will expand globally in the coming months.
Via: The Verge