Facebook: Offensive Content Versus Free Speech
The social networking site Facebook has over 200 million active members and is available in 40 languages. Seventy percent of Facebook users live outside the United States, less than a third are college students, and the fastest-growing demographic is individuals thirty-five years and older.[1]
With this kind of diversity in membership, there are often opposing viewpoints regarding acceptable content. Facebook has rules prohibiting hateful, threatening, or pornographic content, or content that contains nudity or graphic or gratuitous violence.[2] Some content, however, resides in the gray area between natural and obscene, between inflammatory and hateful. When an issue falls in this gray area, Facebook has to make a judgment call based on its own ethics and values, which may be at odds with those of its users.
In January 2009, Facebook removed pictures of nursing women from personal pages, claiming that these pictures violated the policy against nudity.[3] In 2008, Facebook was criticized for not removing content, specifically the pages of anti-Gypsy groups. Members of the UK Parliament condemned Facebook for hosting pages that included images of the Ku Klux Klan.[4] And in 2009, Facebook was pressured to remove groups that denied the Holocaust.[5]
No doubt Facebook is in a tough position. The company would like to maintain freedom of speech on the Web, but critics argue that Facebook has the responsibility to decide what is appropriate for users within its terms of service and that hate groups should not be tolerated.[6] Issues like these are likely to continue as Facebook adds users with diverse backgrounds and viewpoints. What steps has Facebook taken to deal with these ethical dilemmas?
Currently, Facebook does not actively search for content that does not adhere to its policies. Instead, it relies on users to flag such content. Questionable items go before a team that