Photo sharing site Instagram is looking to strengthen its moderation policies and is adding a new alert that will warn people who violate its rules when their account is close to being deleted. The warning will appear as an alert showing users a history of the posts, comments and stories that Instagram has had to remove from their account, along with the reasons they were removed.
Instagram's page reads, "If you post something that goes against our guidelines again, your account may be deleted." However, the photo sharing site will give users a chance to appeal its moderation decisions directly through the alert, instead of having to go through its help page on the web. Only some types of content will be appealable at first (such as pictures removed for nudity or hate speech), but Instagram plans to expand the range of appealable content over time.
The new measures will also give users clarity on why they are in trouble and should remove the shock of suddenly finding that their accounts have been deactivated or have vanished. It is also likely that a large number of banned accounts are removed for rule violations. Like its parent company Facebook, Instagram has regularly had moderation issues around nudity and related content; users have had photos removed for posting pictures of breastfeeding or period blood. Such an update will not prevent these mistakes, but it will make appealing the decisions easier.
Apart from the new alert, Instagram is also giving its moderation team more freedom to ban bad actors. Instagram's current policy is to ban users who post a certain percentage of violating content, but it will now also ban people who repeatedly violate its policies within a window of time.
Photo Credits: Instagram