Release Date: May 20, 2016
BUFFALO, N.Y. – Algorithms determine what users see on Facebook. But these algorithms are constructed by humans and, therefore, reflect human bias, says Mark Bartholomew, University at Buffalo professor of law.
“Facebook is this very powerful social force, but we don’t hold it to the same standards as traditional media,” says Bartholomew, who studies the intersection of privacy and social media. “With Facebook, there’s a tendency to say it’s just technology, so it’s neutral. There’s a tendency to believe Facebook just tallies votes of what is most popular. But neither of those things is true.
“Facebook is steering things in directions to make its interface more appealing, to make itself more desirable for advertisers, and to make you look at things that Facebook wants you to look at.”
The discussion of Facebook and alleged bias started after former Facebook workers claimed that the site’s “trending topics” section largely suppressed news stories of interest to conservative readers.
These former employees also said they were told to add selected stories into the trending topics section, even if they were not popular enough to warrant such placement.
Facebook defended itself by saying the trending topics process is neutral – a process “surfaced by an algorithm.”
But herein lies a problem, according to Bartholomew.
Algorithms may be carried out by computers, but they are designed by humans, he says, and therefore reflect human biases.
More than that, as Facebook becomes a source of information of all kinds, including political reporting, it operates on a very different model than traditional news sources, yet it still needs to be transparent.
“It’s almost impossible to tell how particular stories are selected and why they end up in the feeds of our social media accounts,” Bartholomew says. “There isn’t one editor at Facebook to point to as the decision-maker. But there is no doubt that these are intentional choices made by Facebook’s very human architects and should be held up to public scrutiny.
“Facebook is not a neutral force that simply tallies up user votes. Instead, it is a business that makes decisions that will maximize profits by crafting content to flow in ways that please both users and advertisers. The more we think about Facebook as a business run by people with their own biases and motivations, and not simply a neutral conduit for information, the better.”
To find UB faculty experts on other topics – including issues trending in the news – visit UB’s Faculty Experts website and follow us @UBexperts.