Facebook Checks Its Bias

When Facebook recently said it would allow outside reviewers inside its platform to look for signs of racial or political bias, civil liberties and human rights activists politely applauded.

For years, activists have called on tech companies to undergo assessments of how their policies affect people, both in the U.S. and globally. The companies have long rejected those audits as unnecessary.

But now Facebook is inviting outsiders in to look at allegations of racial and political bias.

“It’s better than nothing,” Rebecca MacKinnon said of the Facebook audits. She is director of Ranking Digital Rights, a project that evaluates 22 tech and telecommunications firms annually in areas such as privacy, expression and governance.

“There’s increasing pressure on them to do this kind of thing,” MacKinnon added.

Facebook has faced criticism that it has allowed advertisers to use racial and ethnic profiles to target job and housing ads. American political conservatives have complained that Facebook has taken down legitimate content because of a liberal bias, something the company has denied.

Both issues came under scrutiny following the 2016 U.S. election, but activists say the company’s focus on issues mainly concerning American users is overshadowing Facebook’s bigger problems with the platform abroad.

“The audits that Facebook is doing in the U.S., while welcomed, are very U.S.-centered,” said Arvind Ganesan, director of Human Rights Watch’s business and human rights division. “That’s really a response to domestic pressure.”

Call for global assessments

Critics say Facebook’s bias problems do not stop at the U.S. border. They point to the role that the platform is alleged to have played in incidents of mass violence, such as the persecution of ethnic Rohingya in Myanmar in recent years or sectarian violence in Sri Lanka.

The United Nations reported that in the case of violence in Myanmar, Facebook “substantively” contributed to the level of conflict.

Facebook’s News Feed, which highlights content of interest to a user based on the person’s friends and preferences, has also been accused of reinforcing false or inflammatory stories that go viral. That can help extreme viewpoints get in front of a mainstream audience.

Critics say the company is only starting to come to grips with the issue.

“There needs to be an honest, candid, comprehensive assessment,” said HRW’s Ganesan. “What is the panoply of Facebook’s impact?”

Transparency as industry trend

Self-assessments are nothing new for tech firms. Starting with Google in 2010, tech companies began publishing transparency reports that provide snapshots of how governments have turned to firms for user data or issued takedown notices because of copyright infringement or other reasons.

More than 60 companies regularly file transparency reports, according to Access Now, a digital rights group in New York.

Eleven companies, including Google and Facebook, undergo outside assessments every two years by the Global Network Initiative, a nongovernmental organization that looks at how companies are responding to government requests.

In its recent assessment, Ranking Digital Rights, which is a nonprofit research initiative affiliated with the nonpartisan New America Foundation think tank, gave low marks to Facebook for disclosing less information than other tech firms about how it handles data that can be used to identify, profile or track users.

Apple earned the greatest year-over-year score improvement of any company because it “strengthened its public commitment to protecting users’ privacy as a human right,” the report said.

How effective these assessments are in spurring companies to change is unclear. But company-run reports and outside audits can help find and measure problems, human rights advocates say.

“We call on Facebook to engage with stakeholders wherever it impacts human rights — the burden extends globally,” said Peter Micek with Access Now. “It doesn’t make sense from a human rights perspective to treat the U.S. exceptionally.”