
'Like' it or not

An Indiana philosophy professor addresses the ethics of Facebook in a Cal Poly talk

by Anna Weltner


WITH FRIENDS LIKE THESE: Anthony Beavers critiques Facebook from within as a user.

When Evansville, Indiana-based philosophy professor Dr. Anthony Beavers spoke at Cal Poly in 2009, it was to share his optimistic view of what social media—and Facebook in particular—could mean for the next generation. Hailing it as “a wonderfully open world of information exchange,” Beavers saw it as a platform to immediately exchange news, opinions, and ideas.

But in the two years since that Cal Poly talk, the professor has grown wary of the way a few corporations control the flow of information on the Internet: Google, Yahoo, Microsoft.

Facebook.

And when Facebook began to direct more web traffic than Google, Beavers became alarmed. In a May 13 talk on the college campus, part of the school’s Ethics and Emerging Sciences lecture series hosted by Patrick Lin, Beavers will address the topic from a more critical angle: How is Facebook challenging our sense of right and wrong, and what does it intend to do with the information garnered from its 700 million users?

“I’ve become patently aware of agendas in the background,” Beavers told New Times in a phone interview. “From the user’s perspective, people interact, and Facebook looks like a social networking site. From the business perspective, it’s the largest social science database that’s ever been compiled in the history of the world.”

But while social scientists are bound by ethical codes concerning the use of that information, Beavers went on, the corporate sector is not.

Measured by population, the social network would rank as the third largest country in the world. But it’s a country without geographical borders, and the power it has amassed doesn’t fall under any one set of political laws.

Last August, Beavers attended a UCLA event at which representatives from Facebook spoke candidly about how their data was used. The company, for example, is able to divine the political leanings of users in its data set, even if those users don’t provide that information outright.

“They can do ghost-node identification, which means that if you put a couple of your favorite movies and your favorite books and a few of your friends, I can find out what political party you belong to,” Beavers said. “You don’t have to tell me that.”

Statistical reliability, he said, is upward of 90 percent.
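How could a site divine something a user never disclosed? Here is a minimal sketch of the idea, assuming nothing about Facebook’s actual method (the function and data below are invented for illustration): infer a hidden attribute from what a user’s friends have declared.

```python
from collections import Counter

def infer_hidden_label(friends_labels):
    """Toy 'ghost-node'-style inference: guess a user's unstated attribute
    (say, political leaning) by majority vote over friends' declared labels.
    Real network analysis is far more sophisticated; this only shows the gist.
    """
    known = [label for label in friends_labels.values() if label is not None]
    if not known:
        return None  # no friends with declared labels, nothing to infer
    label, count = Counter(known).most_common(1)[0]
    return label, count / len(known)  # best guess plus a crude confidence

# Hypothetical user who never states a party, but whose friends mostly have:
friends = {"ann": "party_a", "bob": "party_a", "cal": "party_b", "dee": None}
print(infer_hidden_label(friends))  # ('party_a', 0.666...), about 67 percent
```

The systems Beavers describes presumably combine many more signals (favorite movies, books, friend networks) to reach the reliability he cites.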

Using this information, Facebook can have a significant impact on political elections: simply identify which users are on the fence, politically speaking, and sell advertising spots to political candidates, guaranteed to reach swing voters. And since Facebook is different for every user, it’s hard for anyone to tell if he or she is being singled out by advertisers—and impossible for those paying for advertising to tell whether their ads are actually being delivered. This is where the ethical problems begin.

“Every FB user sees what they see, and nobody sees it all—except for Facebook,” Beavers said. “So this is the interesting thing. They can promise you privacy from other people, but they’re watching everything. I mean, they’ve got these massive network analysis computers that are calculating demographics and profiles and pitching advertisements.”

That’s Facebook on a macro scale. But the platform poses ethical questions on a micro level as well, by becoming a breeding ground for interpersonal misunderstandings.

EXPLORE THE MANY FACES OF FACEBOOK: Dr. Anthony Beavers’ talk “The Ethics of Facebook” is free and open to the public. Check it out on May 13 from 1 to 2:30 p.m. at Philips Hall (building 6, room 124). The talk is part of a series hosted by the Ethics and Emerging Sciences Group, directed by Patrick Lin.

Beavers saw this play out in his own family: “Some people don’t realize that when they’re mad and they say something, it has a ripple effect to create drama. My mom is 68; my dad is in his 70s. And I’m reading the news feed, and my mother has changed her relationship status to my dad from ‘married’ to ‘it’s complicated.’”

When the professor asked what had happened, his mother replied that they’d just had a fight. But now she had unwittingly told everyone on Facebook that she was having marital difficulties. And without the proper context to explain her change in relationship status, friends and relatives were immediately alarmed and assumed the worst.

It’s a classic example of the half-information problem Facebook often perpetuates: the site lets users broadcast one part of a story without the context that explains it, leading to misunderstandings, unnecessary worry, and hurt feelings.

If there’s one common thread in all of Beavers’ issues with the site, it’s the politics of information flow. Your news feed can’t show you everything, so it shows you posts from the friends you interact with most—thus surrounding users with thoughts and ideas they like and agree with, leading them to believe there are more people like them than there actually are. (This doesn’t sound terribly harmful, until you ask yourself, as Beavers did, what if a pedophile surrounds himself with only pedophiles? Wouldn’t he begin to feel that his actions were more acceptable?)
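To make that filtering concrete, here is a deliberately simplified sketch, assuming only the behavior described above (the names are hypothetical; Facebook’s real ranking is proprietary and far more complex): a feed that favors whichever friends a user already engages with.

```python
def rank_feed(posts, interaction_counts):
    """Order posts so that friends the user interacts with most appear first.
    interaction_counts maps a friend's name to the user's past likes/comments.
    A toy stand-in for a proprietary ranking system."""
    return sorted(posts,
                  key=lambda post: interaction_counts.get(post["author"], 0),
                  reverse=True)

posts = [{"author": "ann", "text": "a story you might disagree with"},
         {"author": "bob", "text": "an opinion you already share"}]
print(rank_feed(posts, {"ann": 2, "bob": 40}))  # bob's post surfaces first
```

The more a user engages with like-minded friends, the higher those friends rank, which is the feedback loop behind the echo-chamber worry Beavers raises.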

In a less extreme way, The New York Times and Washington Post websites have begun to catch on to Facebook’s logic, showing readers first those news stories their friends have “liked” on Facebook—as though individuals need only be exposed to news they like and agree with.

Of course, the meaning of the word “like” has been redefined in the land of Facebook—another issue Beavers finds deserving of exploration.

The professor himself, though he quit the site at one point, now regularly interacts on Facebook. He believes today’s Facebook users are part of something of great historical significance, but that we’re too deep into it to understand what that something is.

In fact, he said, “I don’t think we’re going to understand what’s going on for another hundred years. … We’re going to look back and say, ‘What a crazy mess that was.’”

But right now, according to Beavers, Facebook is too large a power to try to slow down. People often ask him whether they ought to delete their accounts, a small way to starve the beast.

But he only shrugs: “Quit or don’t quit. There’s nothing we can do about it.”

Arts Editor Anna Weltner can be reached at [email protected].
