
How to Reduce Antisocial Behavior in Online Game Communities

Published: 2012-11-06 15:17:57

Author: Travis Ross


This article was compiled and translated by Gamerboom (gamerboom.com). Reproduction without retaining this copyright notice is prohibited; to reprint, please contact Gamerboom.

How to reduce antisocial behavior in your game.

by Travis Ross

Recently Bungie (Halo 4) and NCsoft (Guild Wars 2) have both taken a very aggressive stance on antisocial behavior in chat channels. Both companies have begun to ban players who use hate speech. As far as I can tell, banning such behavior is difficult, and I haven’t heard reports of how well it is going. Trolls and griefers can be crafty and may find ways outside chat to reduce the enjoyment of the community. Since these companies have taken some excellent steps toward getting rid of bad behavior, I wanted to help. To do so, I’ve compiled some suggestions for making online communities more prosocial, based on my own research and that of others in the field of social policy. Also, for more writing at the intersection of social science and games, check out my blog Motivate. Play. (shameless plug).

1: The overall goal should be to build community norms.

My dissertation focuses specifically on how game developers can use norms to reduce antisocial behavior in the online communities of games. Norms are powerful in that they emerge from community behavior. When a norm exists, the community can reduce the cost of surveillance. If norms exist, then a developer can rely on community members to report or even sanction (punish) players who are behaving badly. This can reduce costs for developers and even empower community members, possibly increasing their feeling of self-determination.

2: There are two types of norms. Both are important.

That’s right, there are two types of norms. The first are known as descriptive norms – these communicate what other players are doing. In other words, descriptive norms can be observed in the behavior of others (in chat channels they are broadcast). This information is extremely important for developers because humans have a propensity to copy others and to use relatively small amounts of social information to inform decisions and form scripts about their environments. What this means is that antisocial behavior can spread through a community – especially if it is intrinsically motivating – and that a small amount of bad behavior can set off a chain reaction of more bad behavior. Think flame wars, white-knighting, etc. When people get mad at trolls and griefers, revenge or even good intentions can lead to more antisocial behavior.

The second type of norm is called a social or injunctive norm. These are important because they put social pressure on individuals: they communicate what others expect. Yet games often lack clear communication of these norms, which draw their power from shame and social expectation. In addition, players in online environments seem less responsive to shame, probably because they recognize there are no lasting reputational implications. In the real world, social norms are generally also accompanied by sanctions for transgression.

I want to make an important note about these two types of norms. They should be treated as separate motivational forces. In addition, research indicates that descriptive norms out-rank social norms. What I mean is that if descriptive norms communicate that antisocial behavior is common, then social norms will not hold up: if enough people are behaving antisocially, no one will believe that other people expect them to be prosocial. Interestingly, existing social norms can collapse unexpectedly if the public perception of descriptive norms changes – Scott Page talks about how this can happen.
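To make the copying dynamic concrete, here is a toy threshold-imitation simulation of my own (an illustration under assumed parameters, not something from the article or from Scott Page’s work): each player misbehaves in a round if the share of misbehavior they observed in the previous round meets their personal tolerance, and a few unconditional griefers seed the visible descriptive norm.

import random

# Toy model: players copy misbehavior once the visible share of it
# reaches their personal tolerance threshold.
def simulate(n_players=1000, n_griefers=20, rounds=30, seed=42):
    rng = random.Random(seed)
    # Griefers misbehave unconditionally (threshold 0); everyone else
    # needs to see at least some misbehavior before copying it.
    thresholds = [0.0] * n_griefers + [
        rng.uniform(0.01, 0.6) for _ in range(n_players - n_griefers)
    ]
    misbehaving = [t == 0.0 for t in thresholds]
    for _ in range(rounds):
        share = sum(misbehaving) / n_players  # the visible descriptive norm
        misbehaving = [t <= share for t in thresholds]
    return sum(misbehaving) / n_players

if __name__ == "__main__":
    print(f"with a few griefers: {simulate(n_griefers=20):.0%} misbehaving")
    print(f"with no griefers:    {simulate(n_griefers=0):.0%} misbehaving")

In this toy model, twenty unconditional griefers are enough to tip essentially the whole population within a few rounds, while with none of them nobody misbehaves at all – which is the intuition behind treating the descriptive norm as the stronger of the two forces.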

3: Sanctions

Given that the worst trolls and griefers actually like it when others respond to their behavior (see flame wars), social norms (the expectations of others) are not enough to stop them and may sometimes even encourage them. To stop the worst trolls and griefers you must punish them by taking away what they enjoy. One option for punishment is for developers to pay people to police the community and sanction deviants. This can work, but when there are a lot of players or a lot of games to cover, policing can be expensive – telemetry and machine learning can certainly help.
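As a purely hypothetical illustration of how telemetry could lower that cost, even a crude heuristic – ranking players for human review by the rate of reports they receive per hour played – narrows what paid moderators have to look at. The function below, its field names, and its thresholds are my own assumptions, not anything from the article:

from collections import Counter

def moderation_queue(report_events, hours_played, min_hours=2.0, top_n=20):
    """Rank players for human review by reports received per hour played.

    report_events: iterable of player ids, one entry per report received.
    hours_played:  dict mapping player id -> hours played in the same window.
    """
    counts = Counter(report_events)
    rates = {
        player: counts[player] / hours_played[player]
        for player in counts
        if hours_played.get(player, 0) >= min_hours  # skip tiny samples
    }
    return sorted(rates, key=rates.get, reverse=True)[:top_n]

A trained classifier over chat logs could stand in for the raw report count, but the division of labor is the same: telemetry triages, humans decide.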

Another option is for the players to do the sanctioning. After all, policing griefers and trolls isn’t dangerous, and if norms of prosocial behavior are in place, players should feel an expectation to sanction others’ bad behavior. Sanctioning systems can be complex and difficult to implement. Why? If they are too powerful and easy to use, they become a tool for griefing (ironic). If they are too weak or too costly, they won’t be effective or won’t be employed by community members. I won’t go into all of the details of how to make a great sanctioning system – in fact there are still many questions for community designers and researchers to address. However, I will give an example (a rough code sketch follows the list) – like any game design, it probably needs some iterative testing:

Points for antisocial behavior:

1. Players earn points for bad behavior.

2. Other players can assign points.

3. Points expire after an amount of time.

4. Points are multiplied when multiple players in one session report a transgression.

5. Players who cross certain point thresholds are hit with graduated sanctions (sanctions increase with the multiplier): first take away voice communication, then take away the game.

6. Players can file an appeal within x days.

7. Other players are given tools to research an appeal (telemetry data) and are paid in virtual currency for answering appeals (three random players must review the appeal and come to a consensus; if they do not, it is passed to an actual customer service agent). Players earn trust ratings for arbitration.

8. Players who lose arbitration hearings earn additional points.

9. Players who sanction a player who then wins the arbitration earn points that reduce their own ability to sanction – for a long period of time.
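Here is a minimal sketch of how such a system might be wired together, assuming the design above; the class name, the specific thresholds, and the expiry window are invented for the example and would need the same iterative testing as any other piece of the design.

import time
from collections import defaultdict

POINT_TTL = 7 * 24 * 3600      # points expire after a week (assumed window)
VOICE_BAN_THRESHOLD = 10       # first graduated sanction: lose voice chat
GAME_BAN_THRESHOLD = 25        # final graduated sanction: lose the game

class SanctionSystem:
    def __init__(self):
        # offender id -> list of (timestamp, points, who reported)
        self.reports = defaultdict(list)

    def report(self, offender, reporters, base_points=1, now=None):
        """Other players assign points; points are multiplied when several
        players in the same session report the same transgression."""
        now = now if now is not None else time.time()
        reporters = set(reporters)
        points = base_points * len(reporters)  # multi-reporter multiplier
        self.reports[offender].append((now, points, reporters))
        return self.current_points(offender, now)

    def current_points(self, player, now=None):
        """Only count points that have not yet expired."""
        now = now if now is not None else time.time()
        return sum(p for ts, p, _ in self.reports[player] if now - ts < POINT_TTL)

    def sanction_for(self, player, now=None):
        """Graduated sanctions: voice chat goes first, then the game itself."""
        pts = self.current_points(player, now)
        if pts >= GAME_BAN_THRESHOLD:
            return "game_ban"
        if pts >= VOICE_BAN_THRESHOLD:
            return "voice_ban"
        return None

    def resolve_appeal(self, player, reviewer_votes, now=None):
        """Three reviewers must reach a consensus; anything else is escalated
        to a customer service agent. Penalizing the reporters when an appeal
        succeeds (item 9) is omitted here for brevity."""
        if len(reviewer_votes) != 3 or len(set(reviewer_votes)) != 1:
            return "escalate_to_customer_service"
        if reviewer_votes[0] == "uphold":
            now = now if now is not None else time.time()
            self.reports[player].append((now, 2, {"arbitration"}))  # item 8
            return "upheld"
        self.reports[player].clear()  # successful appeal wipes the points
        return "overturned"

For example, three teammates reporting the same slur in one match would add three points at once; if the offender’s unexpired total later crosses ten, voice chat is revoked, and at twenty-five the account is locked out pending appeal.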

4: Should sanctions be graduated?

One thing that is still uncertain in community management and research is whether players should simply be banned or whether sanctions should be graduated. There are arguments for both.

Ban Hammer

First, it seems that descriptive norms for antisocial behavior already exist in the chat channels of many online games. “That’s just gamers being gamers.” Or “Antisocial behavior is normal in these games.” Creating norms for players to sanction bad behavior may be difficult. And why would players sanction unless there are expectations that they should? There is also the second-order free-rider problem, where group members don’t sanction because sanctioning has a cost. To get rid of the perception that antisocial behavior is normative or OK, the ban hammer may be required. In addition, sexism, racism, and foul language deserve severe punishment.

Graduated Sanctions

However, what if people can be rehabilitated? What if individuals are simply following the status quo for FPS games? In competitive environments it can be difficult to control one’s emotions, and sometimes people get frustrated. Could this be a teaching moment? Do people deserve a warning? Could this actually help people be more prosocial in real life? One of the findings of research in social policy is that very severe sanctions can actually be detrimental to a community: they don’t allow for second chances and can frustrate players or turn them into enemies. This is especially the case when descriptive and social norms for a certain behavior don’t exist or are not clearly communicated – in other words, when players feel like they didn’t get a warning or didn’t understand what was expected of them, but still get the ban hammer. After all, antisocial behavior has been normative in these environments for some time.

Concluding

In conclusion, a significant amount of research exists that could help community managers build and sustain communities where prosocial behavior is normative. What I’ve talked about is really only the tip of the iceberg and there are still a lot of questions about behavior in online communities that need to be answered. It is up to game developers and researchers to keep trying to figure out how to construct societies that promote prosocial behavior.

If anyone thinks this is interesting and would like to apply it to their communities, I’d be happy to talk about it in more detail – just leave a comment, tweet me, or shoot me an email. (source: gamasutra)

