Facebook’s (FB) troubles continue to mount amid concerns over the security of its platform. A recent New York Times article brought to light irregularities in the company’s content standards.
The article states that the moderators responsible for regulating content on the platform rely on outdated and incorrect information. Third-party moderators are required to review a huge amount of content produced by Facebook’s 2.6 billion users within a very limited time.
Facebook’s one-size-fits-all approach is not working, as the guidelines for identifying and monitoring hate speech are imprecise and vague. This has led to wrongful bans on appropriate content while problematic content remains on the platform.
A shortage of moderators, their limited knowledge of key policies and the broader impact on communities are major issues. Although Facebook has employed a few local moderators, this is not enough to support its vast presence and the volume of content the platform generates.
Facebook appears to be pursuing temporary fixes instead of relying on and partnering with subject-matter experts to solve its security issues.
We believe that the task of regulating and monitoring content requires tremendous effort and cannot be handled through a few boardroom discussions. The affected communities and the relevant policies need to be considered and monitored closely to prevent such lapses.
[Chart: Facebook, Inc. Revenue (TTM)]