A recent study of the personality types of Internet commenters revealed that “trolls” (people who bully, call names, and generally antagonize others in comment threads) disproportionately fit the psychological profiles of the narcissist, the sadist, or the psychopath. Unlike the majority of human beings, these people get off on being mean. They like to hurt others.
This is why things like Gamergate still happen. Throughout most of 2014, a rogue group of men in the gaming community “cyberbullied” women who publicly criticized video game culture. The chief target was Anita Sarkeesian, creator of the online media program Feminist Frequency, whose media deconstructions focus on the portrayal of women in popular games.
Sarkeesian ignited a media storm with her episodes about women as background decoration (Part 1 and Part 2). Displaying a series of images and animated sequences in which violence against women is commonplace, she built a case that the gaming industry needs to change.
Among the images in her program was this advertisement for the 2006 first-person shooter game, Hitman: Blood Money:
This image hit a nerve. The visual itself was emotionally potent, provocative, and jarring—all ingredients for something that goes viral on the Internet. It angered a group of misogynistic “cyber warriors,” who responded with death threats, rape threats, and attempts to hack her personal information and post it online. “One detractor created a game in which players can click their mouse to punch an image of her face,” according to a piece in the New York Times.
This bullying behavior routinely crops up on the Internet and persists there unchallenged. We could just as easily note the hate-filled vitriol on political blogs every time a controversial issue is brought up for discussion. It seems the anonymity (or some other feature) of the Internet enables those who behave badly to do so without consequence.
An evolutionary challenge presents itself. Humans are profoundly social—a massive body of anthropological research makes clear that cooperation and altruistic behavior are the norm. (See, for example, Moral Origins: The Evolution of Virtue, Altruism, and Shame by Christopher Boehm or this article by Dirk Helbing that uses computer simulations to show how cooperative behavior spreads in social systems.) And yet cyberbullying on the web is an example of anti-social behavior. How do we reconcile this conflicting evidence? What are the evolutionary mechanisms involved in selecting for cooperation in some settings? How do they differ from those at work in the breakdown of group cohesion in others?
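The kind of simulation Helbing describes can be sketched in a few dozen lines. The toy model below (a spatial prisoner's dilemma in the spirit of such work; the payoff values, grid size, and imitation rule are illustrative assumptions, not taken from the article) places cooperators and defectors on a grid, has each agent play its neighbors, and lets everyone copy the best-scoring strategy nearby—enough to watch cooperation persist in clusters rather than collapse:

```python
import random

# Toy spatial prisoner's dilemma: agents on a wrap-around grid play
# their eight neighbors, then imitate the highest-scoring strategy
# in their neighborhood. All numbers here are illustrative choices.
SIZE = 20
R, S, T, P = 1.0, 0.0, 1.3, 0.0  # reward, sucker, temptation, punishment

def neighbors(i, j):
    """The eight surrounding cells, with wrap-around edges."""
    return [((i + di) % SIZE, (j + dj) % SIZE)
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0)]

def payoff(me, other):
    """Payoff for one pairwise game; True = cooperate, False = defect."""
    if me and other:
        return R
    if me and not other:
        return S
    if not me and other:
        return T
    return P

def step(grid):
    """One round: everyone plays, then imitates the best neighbor."""
    scores = {(i, j): sum(payoff(grid[i][j], grid[x][y])
                          for x, y in neighbors(i, j))
              for i in range(SIZE) for j in range(SIZE)}
    new = [row[:] for row in grid]
    for i in range(SIZE):
        for j in range(SIZE):
            best = max(neighbors(i, j) + [(i, j)], key=scores.get)
            new[i][j] = grid[best[0]][best[1]]
    return new

random.seed(1)
grid = [[random.random() < 0.5 for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(30):
    grid = step(grid)
frac = sum(sum(row) for row in grid) / SIZE ** 2
print(f"fraction cooperating after 30 rounds: {frac:.2f}")
```

The point is not the specific numbers but the mechanism: when interactions are local and repeated, cooperators who cluster together can out-earn defectors, which is one way cooperation spreads without any central enforcer.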
Every hunter-gatherer society in the world has highly ritualized practices for group sharing and the cultivation of bonds among its members. These groups select for cooperative behavior, succeeding even though roughly 1% of their members are psychopaths.
How did they do it? Stated simply, there was great advantage in hunting cooperatively and sharing the spoils. Free riders and would-be dictators threatened the cohesion of these tribal bands, each a few dozen members strong, in which everyone knew everyone else personally. So it was quite natural for those who broke this cohesion to be shamed for their bad deeds. If shaming didn’t work, they were ostracized. And in rare circumstances (when a serial killer was in their midst, for example) the troublemaker was executed.
In other words, the group sanctioned bad behavior because it had enough social transparency to hold bullies accountable through collective action. Behave badly toward others and there will be consequences. To paraphrase the theme song of the ’80s sitcom Cheers: you can only go where everybody knows your name.
Flash forward to the present. We now live in a world of more than 7 billion people. That’s roughly 70 million psychopaths if the 1% ratio holds. We no longer live in small tribal communities where everybody knows your name. What’s more, the advent of digital avatars on the Internet enables people to jump from one virtual community to another wearing a mask that hides their identities.
The moral sanctioning that would have operated in a hunter-gatherer society is not working here. So what can be done? This is where the influential researcher Elinor Ostrom comes into play.
A political scientist at Indiana University (now deceased), Ostrom studied tried-and-true approaches to managing common pools of finite resources, such as grazing lands, forests, and irrigation waters, in societies around the world. What she found was a set of eight design principles at play in every successful culture: define clear group boundaries; ensure the rules for decision-making align with local needs; make sure those affected by the rules have the ability to change them; develop a system for monitoring members’ behavior; use graduated sanctions for rule violators; and so on.
Even a brief survey reveals that these principles have not been put into practice to address cyber aggression. The “commons” in question here is a safe space for community dialogue where people can express their views honestly and openly without fear of reprisal. What we see in the case of Gamergate is that none of the principles has been applied.
This observation is supported by a recent Pew Research Center study of nearly 3,000 Internet users, which found that two people in five have been harassed while contributing to discussions on the Internet. Among the respondents was a woman from the gaming industry who described how commonplace sexism is in her field:
“I work in gaming and I play competitively. Sexism is more apparent when playing as most people can hide behind a monitor and spout off their sexism. I once outplayed a competitor who promptly told me he was going to find out where I lived and rape me because I deserved it.”
Anti-social behaviors routinely go unchecked, making it easy for cyberbullies to use stealth and anonymity to wreak emotional havoc on the lives of strangers.
How can we deal with bullies on the web?
Start by establishing clear boundaries between acceptable and unacceptable behavior. Blogs and online news sites should create standard rules for participation that visitors can review and amend through a facilitated process. Let these become the new social norms for online engagement. Make sure participants get to discuss and debate the rules that will enforce those norms. Then establish protocols for correcting undesirable behavior.
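One of Ostrom's principles, graduated sanctions, translates naturally into a moderation policy: repeat violations escalate from a warning to a mute to a ban. A minimal sketch of that idea (the sanction names, thresholds, and `sanction` function are hypothetical, not an existing moderation API):

```python
from collections import defaultdict

# Hypothetical graduated-sanctions policy for a comment community.
# Each confirmed violation moves the user one step up the ladder;
# the final step repeats for any further offenses.
SANCTIONS = ["warning", "24h_mute", "7d_suspension", "permanent_ban"]

violations = defaultdict(int)  # user -> count of confirmed violations

def sanction(user):
    """Escalate the response each time a user breaks the community rules."""
    violations[user] += 1
    level = min(violations[user], len(SANCTIONS)) - 1
    return SANCTIONS[level]

print(sanction("troll42"))  # first offense  -> "warning"
print(sanction("troll42"))  # second offense -> "24h_mute"
```

The escalation matters: a first-time offender gets a chance to learn the norms, while persistent bad actors face consequences that mirror the shaming-then-ostracism sequence of the tribal bands described above.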
I realize that this is easier said than done. My hope in this essay is to stimulate dialogue about how we might manage online debate in a respectful way. I don’t have all the answers. Perhaps together we can create a new set of rules for fair play.
What is obvious at this point is that no such rules are enforceable now.
Bullies among us can threaten, intimidate, and lie with no consequences to constrain their anti-social behavior. This is unacceptable. With evolutionary principles as our guides—those discovered in Christopher Boehm’s work on group processes for sanctioning moral behavior and those uncovered by Elinor Ostrom’s work on collective governance—we can begin the process of unpacking the design elements of prospective solutions.
Then we can test and refine them until we get to a framework that works.