
Community Success Factors 6: Boundaries and Coercion

Posted on: October 18, 2009

This is Item 6 of my 8 Critical Success Factors for Virtual Communities post.

Every community has boundaries, however they are formed or constructed. If you feel it necessary to take an active hand in shaping them, there are many ways to do so without cutting off the community's lifeblood (freedom of expression) any more than you have to.

Virtual communities have boundaries in the form of system architecture, stated rules (terms of use and community guidelines) and/or community norms and expectations (Lawrence Lessig, 2006). The shape of these determines what kind of place the community is to hang out in, so a decision about the regulation of any virtual community must be made, even if that decision is to rely on norms and interfere as little as possible.

Norms are social rules developed collectively by the community. There is no easy answer to whether a virtual community will be able to rely on norms alone. USENET was found to attract “both anti-social and opportunistic behaviour” (Mansell & Steinmueller, 2000), and in Code 2.0 Lessig (2006) recounts the shocking incident of a virtual rape in LambdaMOO. These experiences indicate regulation is a necessary consideration. Whatever course is taken, the chosen structure should be regularly reviewed: Wikipedia (the user-generated encyclopedia) has in the past relied “on social norms to secure the dedication of project participants to objective writing” (Benkler, 2006), but even its open-editing policy has changed over recent years.

Where there are community guidelines these should be clearly documented for members to read; it is also a good idea to get users to tick a box to agree to them when they join up. In terms of developing your own, this online database of social media policies includes an array of community guidelines (Flickr also have a good example). Ensuring regulation is congruent with community sensibilities is important: Shirky (2008) tells the story of Sanger, one of Wikipedia’s founders, who was forced out after annoying participants with his authoritarian style.

The most likely need in terms of regulating activity is moderation to deal with offensive or illegal user-generated content. Common models include pre-moderation (content is reviewed before it appears), post-moderation (content appears immediately and is reviewed afterwards) and/or peer-moderation, or you could experiment with hybrids: for example, post-moderating during office hours but switching to pre-moderation when no-one is around to deal with any inappropriateness.
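
As a concrete illustration, here is a minimal Python sketch of such a hybrid. The office hours, function names and queue structures are my own assumptions for the example, not any particular platform’s workings.

from datetime import datetime

OFFICE_HOURS = range(9, 18)  # 09:00-17:59, when staff are around to post-moderate

def requires_pre_moderation(now):
    # Hold posts for review whenever no moderator is on duty
    return now.hour not in OFFICE_HOURS

def submit_post(post, now, review_queue, live_posts):
    if requires_pre_moderation(now):
        review_queue.append(post)   # held until a moderator approves it
    else:
        live_posts.append(post)     # published now, reviewed after the fact

live, queue = [], []
submit_post({"text": "Hello"}, datetime(2009, 10, 18, 14, 0), queue, live)
print(len(live), len(queue))  # 1 0: a 2pm post goes straight to the live list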

Slashdot has a sophisticated peer-moderation system. In summary, moderators are chosen from among the users, and each is given a small amount of power. User comments are rated for qualities such as relevance and funniness, and these ratings determine how prominently posts are displayed; moderators’ decisions are in turn peer-reviewed (Benkler, 2006). Troublesome users can be handled in a number of ways: their posts can be edited or deleted, they can be warned privately or in front of the community, or their username can be deleted or blocked.
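
For illustration, here is a rough Python sketch of peer-moderation in this style. The class names and numbers are my own inventions (the point budget and the -1 to +5 score range roughly mirror Slashdot’s), and the peer review of moderators is left out for brevity.

class Comment:
    def __init__(self, text):
        self.text = text
        self.score = 1  # comments start with a modest default score

class Moderator:
    def __init__(self, points=5):
        self.points = points  # each moderator gets only a little power

    def rate(self, comment, delta):
        # Spend one point to nudge a comment up or down, capped at -1..5
        if self.points > 0 and delta in (-1, 1):
            comment.score = max(-1, min(5, comment.score + delta))
            self.points -= 1

def visible(comments, threshold=1):
    # Readers browse at a threshold; low-scored comments drop out of view
    return [c for c in comments if c.score >= threshold]

mod = Moderator()
c = Comment("an insightful point")
mod.rate(c, +1)
print(c.score, len(visible([c])))  # 2 1: the upmoderated comment stays visible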

Another interesting method is described by Jeremiah Owyang: the ‘Bozo feature’ means that when a user posts a message it appears as expected to her, but is invisible to the rest of the community. This robs the difficult member, or ‘troll’, of the attention she seeks.
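
A minimal sketch of the Bozo idea, assuming a simple set of flagged usernames; the data and function names here are hypothetical.

bozo_list = {"troll42"}  # usernames flagged as trolls

def visible_posts(posts, viewer):
    # A bozo's posts only appear in her own view of the feed
    return [p for p in posts
            if p["author"] not in bozo_list or p["author"] == viewer]

posts = [{"author": "alice", "text": "Welcome!"},
         {"author": "troll42", "text": "Flamebait"}]

print(visible_posts(posts, viewer="troll42"))  # the troll sees both posts
print(visible_posts(posts, viewer="alice"))    # everyone else sees only one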

A system can also be designed to encourage behaviour beneficial to the community. Rewarding good behaviour is the other side of the social punishment outlined above. This can be as simple as counting user votes – as is the case with Digg, the user-generated news site. A visible reward boosts the status of a member: another successful implementation is Slashdot’s ‘karma.’ “Karma is a number assigned to a user that primarily reflects whether he or she has posted good or bad comments” (Benkler, 2006). Conversely, users who post comments without logging in are designated ‘anonymous coward.’
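
One plausible reading of karma is a running total of how a user’s comments have been moderated, sketched below; this is an illustration, not Slashdot’s actual formula.

from collections import defaultdict

karma = defaultdict(int)  # defaults each user's karma to zero

def record_moderation(author, delta):
    # Each up- or down-moderation of a comment feeds the author's karma
    karma[author] += delta

record_moderation("alice", +1)
record_moderation("alice", +1)
record_moderation("bob", -1)

print(karma["alice"])  # 2: good comments raise a member's status
print(karma["bob"])    # -1: bad comments lower it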

Joshua Porter combines psychology and web design practice to optimize interfaces for social interaction. He proposes ‘leveraging cognitive bias’ in social design. For example, he argues that displaying the activity of a very active user will lead others to unconsciously assume this is typical (representation bias) and to want to be equally involved (the bandwagon effect) (Porter, 2008). This type of implicit coercion can be used to encourage sign-up, increase participation and modulate user focus and interest.
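
As a toy example of the idea, the snippet below picks the most active member to feature; all the data and names are invented.

activity = {"carol": 120, "dave": 3, "erin": 7}  # posts per month

def featured_member(activity_counts):
    # Showcase the busiest member so visitors read her activity as the norm
    return max(activity_counts, key=activity_counts.get)

print("See what", featured_member(activity), "has been up to this week")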

In summary, the question is not whether your community should or will have boundaries, but whether you want to influence what they are and how you should go about it.
