Facebook has also published political ads and streamlined its privacy controls after coming under fire for its lax approach to protecting consumer data. The standards have evolved over time, so it is always worth checking them for changes.
Talking to Indian media over a video conference, Monika Bickert, Facebook's vice-president of product policy and counter-terrorism, said the company is for the first time making its internal set of community guidelines public to offer more transparency.
The previous public-facing version of Facebook's community standards gave a broad-strokes outline of the rules, but the specifics were shrouded in secrecy for most of Facebook's 2.2 billion users. So on Tuesday, the company decided to reveal the internal community standards and policies it uses to determine which posted content is acceptable and which should be removed.
She said Facebook chose to publish the internal guidelines for two reasons.
Hate speech, which Facebook defines as a direct attack on people based on race, religion, sexual orientation and other categories protected by law, is not allowed.
In the document, Facebook also lays out what it can do to help affected users.
"We are trying to strike the line between safety and giving people the ability to really express themselves", Bickert said.
Facebook Inc disclosed on Monday that it had deleted or put a warning label on 1.9 million pieces of extremist content related to ISIS or al-Qaeda in the first three months of the year, and also published its internal definition of terrorism for the first time. Moderators have deleted posts from activists and journalists in Burma and in disputed areas such as the Palestinian territories and Kashmir (in one case, a deleted photo was restored after protests from news organizations), and have told pro-Trump activists Diamond and Silk they were "unsafe to the community".
In response, the company has said it will double its 10,000-person safety and security team by the end of this year. The guidelines themselves have ballooned from a single page in 2008 to 27 pages today.
Facebook, the world's largest social network, has become a dominant source of information in many countries around the world. A far-flung team of 7,500 reviewers, in places like Austin, Dublin, and the Philippines, assesses posts 24 hours a day, seven days a week, in more than 40 languages. In addition, users whose posts are removed will now have an option to appeal.
Facebook is still in hot water, and it may be a while before some people get over the scandal. "We will make mistakes and it is important to take a second look if we have removed a post or a photo", she said, adding that these appeals will be sent to another person for review. "We make mistakes because our processes involve people, and people are not infallible".
Along with the new standards, Facebook has released a brand-new appeals process that will initially apply only to content removed for nudity or sexual activity, hate speech, or graphic violence.
"We've promised to do better and we hope that sharing these details will serve as the basis for increased dialogue and input", Bickert said.
The new guidelines give users a way to appeal a decision if they believe content was unfairly removed, or if content they flagged was not removed. Facebook uses a combination of artificial intelligence and user reports to identify content that violates its standards. The document details what counts as sexual exploitation of adults or minors, but leaves room to ban more forms of abuse, should they arise.