Sometimes you should save it as a draft, right? Well, Twitter is testing a new tool that will hopefully help users think twice before they send out a nasty tweet.

This is Twitter's newest attempt at stopping harassment on its platform, or at least limiting it.

When users include "harmful" language in a tweet, a prompt will appear asking them to edit the tweet before posting it.

According to WILX, there is no word yet on what the social media company will consider "harmful" language.

The prompt won't appear for everyone yet. For now, it will only show up for iOS users while the idea is being tested.

The idea came about as some companies face staffing shortages for moderating social media during the coronavirus pandemic. This new update will rely more heavily on AI technology.
