
Meta removed two Israel-Hamas videos unnecessarily, Oversight Board says
CNN
Facebook-parent Meta’s automated tools to police potentially harmful content unnecessarily removed two videos related to the Israel-Hamas war, the Meta Oversight Board said in a statement Tuesday. The moderation technology may have prevented users from viewing content related to human suffering on both sides of the conflict, it said.

The comments are the result of the Oversight Board’s first “expedited review,” highlighting the intense scrutiny facing social media companies over their handling of content related to the conflict. The board overturned Meta’s original decision to remove two pieces of content. As part of the decision, the group urged Meta to respect users’ rights to “freedom of expression … and their ability to communicate in this crisis.”

“The Board focused on protecting the right to the freedom of expression of people on all sides about these horrific events, while ensuring that none of the testimonies incited violence or hatred,” Michael McConnell, a co-chair of the board, said in a statement. “These testimonies are important not just for the speakers, but for users around the world who are seeking timely and diverse information about ground-breaking events.”

In response to the board’s decision, Meta said that because it had already reinstated the two pieces of content prior to the board’s decision, it would take no further action. “Both expression and safety are important to us and the people who use our services,” the company said in a blog post.

Meta’s Oversight Board is an entity made up of experts in areas such as freedom of expression and human rights. It is often described as a kind of Supreme Court for Meta, as it allows users to appeal content decisions on the company’s platforms. The board makes recommendations to the company about how to handle certain content moderation decisions, as well as broader policy suggestions.