Facebook parent company Meta recently announced changes to the way it tries to identify misinformation and harmful material published on its social media services.
Meta chief Mark Zuckerberg explained in a video that the company had decided to make the changes because the old system had produced “too many mistakes and too much censorship.”
Zuckerberg said the moderation system Meta had built needed to be “complex” to examine huge amounts of content in search of material that violated company policies.
However, he noted that the problem with such systems is that they can make a lot of errors. Speaking of those systems, the Meta chief added, “Even if they accidentally censor just one percent of posts, that’s millions of people.”
So, he said the company had decided to move to a new system centered on “reducing mistakes, simplifying our policies, and restoring free expression.”
The new method turns over content moderation duties to a “Community Notes” system. The company said this system aims to “empower the community” to decide whether content is acceptable or needs further examination.
The changes will apply to Meta’s Facebook, Instagram and Threads services. Meta said the new system would become available first to U.S. users in the coming months.
Meta’s former moderation system involved the use of independent, third-party fact-checking organizations. Many of these were large media companies or news agencies. The efforts included digital tools as well as human workers to fact-check content and identify false, inappropriate or harmful material.
Meta said the third-party moderation method ended up flagging too much content for fact-checking. On closer examination, the company said, much of that content should have been considered “legitimate political speech and debate.”
Another problem, the company said, was that the decisions made by content moderators could be affected by their personal beliefs, opinions and biases. One result was that “a program intended to inform too often became a tool to censor.”