NEW YORK, N.Y. -
Content creators have long criticized Facebook and Instagram for their content moderation policies relating to photos that show partial nudity, arguing that their practices are inconsistent and often biased against women and LGBTQ people.
This week, the oversight board for Meta, the platforms' parent company, strongly recommended that it clarify its guidelines on such photos after Instagram took down two posts depicting nonbinary and transgender people with bare chests.
The posts were quickly reinstated after the couple appealed, and Meta's oversight board overturned the original decision to remove them. It was the board's first case directly involving gender-nonconforming users.
"The restrictions and exceptions to the rules on female nipples are extensive and confusing, particularly as they apply to transgender and nonbinary people," Meta's Oversight Board said in its case summary on Tuesday. "The lack of clarity inherent in this policy creates uncertainty for users and reviewers, and makes it unworkable in practice."
The issue arose when a transgender and nonbinary couple posted photos in 2021 and 2022 of their bare chests with their nipples covered. Captions included details about a fundraiser for one member of the couple to have top surgery, a gender-affirming procedure to flatten a person's chest. Instagram removed the photos after other users reported them, saying their depiction of breasts violated the site's Sexual Solicitation Community Standard. The couple appealed the decision, and the photos were subsequently reinstated.
The couple's back-and-forth with Instagram underscored criticism that the platform's guidelines for adult content are unclear. According to its community guidelines, Instagram bars nude photos but makes some exceptions for a range of content types, including mental health awareness posts, depictions of breastfeeding and other health-related situations, parameters that Meta's board described as "convoluted and poorly defined" in its summary.
How to decide what depictions of people's chests should be allowed on social media platforms has long been a source of debate. Scores of artists and activists contend that there is a double standard under which posts of women's chests are more likely to be deleted than those of men. Such is also the case for transgender and nonbinary people, advocates say.
Meta's oversight board, a body of 22 academics, journalists and human rights advocates, is funded by Meta but operates independently of the company and makes binding decisions for it. The group recommended that the platforms further clarify the Adult Nudity and Sexual Activity Community Standard, "so that all people are treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender."
It also called for a comprehensive human rights impact assessment of such a change, engaging diverse stakeholders, and a plan to address any harms identified.
Meta has 60 days to review the oversight board's summary, and a spokesperson for the company said it would publicly respond to each of the board's recommendations by mid-March.
This article originally appeared in The New York Times