Facebook's other user experiment: conflict resolution

Facebook says it wants to understand "how unspoken rules of human interaction apply to attitudes and behavior online."

Here's a Facebook user experiment that most people can get behind.

Facebook (FB) has been working with social scientists for the past few years to better understand conflicts among users, including bullying and antisocial behavior. Last year, the social network began offering tools for users to resolve those disputes.

Facebook has long offered users the option to report content that violates its terms of service -- things like pornography, threats and graphic violence. But when it comes to things like insults or embarrassing photos, the site's administrators won't step in.

Those are the kinds of situations Facebook developers say they want users to address among themselves. Instead of simply encouraging users to flag posts for review by Facebook, the site has deployed a series of new message templates through which people can explain to others why they find a particular post upsetting.

The company consulted academic research on compassionate communication as it experimented with different templates. Rather than facing a blank text box, users messaging their friends about questionable content are given options like "It's embarrassing," "It shows inappropriate behavior," or "It's a bad photo of me" to express their requests.

Related: Facebook workforce is 69% male and mostly white

Facebook also asks users how the post in question makes them feel -- for example, "afraid," "angry," "sad" or "embarrassed" -- and tailors the message templates further based on the intensity of the emotion expressed. There's also the option to un-friend, block or un-follow the person who made the post.

Facebook tracks the results of the interactions via optional follow-up surveys. The site says the changes, which it has been implementing since the start of 2013, have already borne fruit.

"What we've seen is that by giving people better language to have these conversations, it actually turns out that the person who receives the message is actually much more likely to respond," Facebook product manager Jake Brill says.


Facebook claims it's seen a ten-fold increase in the likelihood that users will send messages to someone who posts a status update they don't like; for shared links, there's been a five-fold increase. In all, the company says the tools are aiding in 3.9 million conversations a week.

Related: Facebook treats you like a lab rat

The problem of cyberbullying is particularly urgent for young people, who use social media in far greater numbers than older groups. In a study released last month by cybersecurity firm McAfee, 87% of surveyed kids aged 10 to 18 said they had witnessed cruel behavior online, a huge jump from 27% the year before. Half of those surveyed said they'd been involved in an argument offline over something posted on social media, and 4% said the posting had led to a physical fight.

There's only so much Facebook can do to counter harassment on its platform; McAfee says the most important way to address cyberbullying is for parents to communicate with their kids about their online activities.

For researchers at Facebook and elsewhere, meanwhile, the work continues. The company held its fourth annual "Compassion Research Day" in December, bringing together developers, academics and teens from California's Bay Area to discuss online conflict resolution.

"Conflict and other challenges in relationships are impossible to avoid, both online and off," Facebook said following the event. "While these are realities of life, scientists are only beginning to understand how unspoken rules of human interaction apply to attitudes and behavior online."
