Facebook’s parent company, Meta, has introduced new capabilities for Groups aimed at reducing the spread of misinformation among group members. One option lets Group admins automatically decline posts that third-party fact-checkers have identified as containing misleading information, preventing those posts from being seen by other members of the Group.
This has been a major issue for Facebook, as many Groups are private, allowing damaging or erroneous material to spread quickly and without scrutiny. Groups have been criticized for spreading COVID-19 falsehoods and other conspiracy theories, as well as providing a platform for criminal actors to organize a plot to kidnap Michigan’s governor and to coordinate elements of the January 6th insurrection.
Facebook has made several efforts to rein in members who break Group rules, as well as to punish Groups that break them. Last year, Facebook also provided tools for Group admins, enabling them to limit how frequently specific members may post and alerting them to conversations that may be “contentious or unhealthy dialogues” (though it wasn’t clear how its AI would achieve this). However, as with most of Facebook’s attempts to rein in Groups that disseminate disinformation or otherwise violate its standards, most of the company’s remedies have been late to the party, frequently arriving well after the offending content has spread widely.
In addition to allowing Group admins to prevent certain types of information from being shared, Facebook renamed its “mute” tool “suspend,” allowing admins and moderators to temporarily suspend users. The new capabilities will let admins manage Groups more effectively, according to the company, and will give them further insights into how to expand their Groups with “relevant audiences.”