Sorry if I jump around a bit in this blog post, but by reading these points and listening to the video you'll get a better idea of how social science can help you design a successful community using a specific kind of moderation approach. At the very least, you'll see how the difference between a theory-driven and a design-driven approach to community building can help you respond better to new customer needs.
OK, I am paraphrasing here (I'm taking notes from Robert Kraut's Stanford presentation above), so bear with me. My aim is to show how social science can inform good online community design. The first point that Kraut makes that I want to highlight is that real community design is "highly multidimensional". This is at odds with the logic of social science, which seeks to understand the effect of one variable at a time, with all other variables held constant, in order to discover causality. OK, so that's some of the fundamentals sorted. Skip to this section of the video to hear the explanation.
This social science approach is at odds with design (i.e. online community design), where you are trying to figure out the configuration of all possible variables that produces the effect you want. Kraut says that with design you don't want one variable at a time; you want 'kitchen sink' experiments: theory-based experiments that you can try out in a relatively cheap way.
Instead, they use agent-based modelling, which allows a theory to be tested as a model of a community environment: member behaviour changes the environment, and the environment in turn changes member behaviour (see 1:12:56). In this model, the 'Identity Benefit' is greater when an agent's interests are similar to the group's interests:
Here's a simple way to capture that 'Identity Benefit':

Identity Benefit = (# viewed messages that match the member's interests) / (# viewed messages)
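That ratio is easy to compute directly. Here's a minimal sketch in Python (my own illustration, not code from Kraut's paper; the function and parameter names are assumptions), treating each viewed message as a topic tag and the member's interests as a set of topics:

```python
def identity_benefit(viewed_messages, member_interests):
    """Fraction of viewed messages that match the member's interests.

    viewed_messages: list of topic tags, one per message the member viewed.
    member_interests: set of topics the member cares about.
    """
    if not viewed_messages:
        return 0.0  # no views yet, so no identity benefit to measure
    matches = sum(1 for topic in viewed_messages if topic in member_interests)
    return matches / len(viewed_messages)
```

So a member interested only in `"python"` who views four messages, two of them tagged `"python"`, gets an identity benefit of 0.5.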
In comparison, the other principal type of community benefit to members that Kraut identifies, the 'Bond-based benefit', is greater when there is repeated interaction between the same members. Kind of obvious I guess, but this is social science, so still worth stating!
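To make the contrast concrete, here's a toy sketch of a bond-based benefit (again my own illustration, not Kraut's actual model): it rewards repeat interactions with the same partners, capped per partner so the metric reflects repetition rather than sheer volume:

```python
from collections import Counter

def bond_benefit(interaction_log, member, saturation=5):
    """Toy bond-based benefit for one member.

    interaction_log: list of (member_a, member_b) interaction pairs.
    Each partner contributes up to `saturation` repeat interactions.
    """
    partners = Counter()
    for a, b in interaction_log:
        if a == member:
            partners[b] += 1
        elif b == member:
            partners[a] += 1
    # Benefit grows with repeated interaction, saturating per partner
    return sum(min(count, saturation) for count in partners.values())
```

Under this sketch, many one-off exchanges with strangers score no better than the same number of exchanges with a small circle of regulars, which is the intuition behind bond-based communities.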
Agent-based modelling and simulated communities results
And from simulated communities, what Kraut found is that the simulated agent models (taking the place of community members) produced results very similar to those observed in real Usenet groups.
So the next step is that if we have a working agent model that shows how a community works, we can test out different types of moderation techniques in this simulated community.
From this Kraut found that 'Personalised moderation' outperforms 'Community-level moderation', though the difference really matters when dealing with a large volume of content, or diverse content. In other words, 'Personalised moderation' works well in large, complex communities.
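The distinction between the two moderation styles can be sketched in a few lines (a simplified illustration under my own assumptions, not the rules from Kraut's simulation; the field names are made up): community-level moderation applies one global quality threshold for everyone, while personalised moderation additionally filters by each member's interests:

```python
def community_moderation(messages, min_score):
    """Community-level moderation: one global quality bar for all members."""
    return [m for m in messages if m["score"] >= min_score]

def personalised_moderation(messages, interests, min_score):
    """Personalised moderation: the same quality bar, plus a per-member
    interest filter, so each member only sees on-topic messages."""
    return [m for m in messages
            if m["score"] >= min_score and m["topic"] in interests]
```

With a handful of messages the two behave almost identically; the gap only opens up as the volume and topical diversity of content grows, which matches Kraut's finding that personalisation pays off in large, complex communities.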
As an example, this kind of personalised moderation functionality appears to be available in the latest version of community platform Telligent's analytics, which sounds useful. It would be good to know which other major community platforms, like Lithium, offer such functionality, and how well it really works day-to-day:
Your community can now offer its participants dynamic and personalized recommendations of both people and content. Telligent Analytics looks at your community’s data, compares it with each member’s unique interests, and then delivers personalized recommendations to that member. Telligent Analytics doesn’t just tell you how your community’s doing; it applies the analytics to improve your community members’ experience.
If you want to see this study applied in more practical detail, here's Robert Kraut's paper (pdf) with the graphs and stats:
A Simulation for Designing Online Community: Member Motivation, Contribution, and Discussion Moderation – (pdf: 10.1.1.141.6657)
Or maybe you'd like to read the chapters of Kraut's 2012 book, Building successful online communities: Evidence-based social design:
- Resnick, P. & Kraut, R. Introduction [PDF]
- Kraut, R. E. & Resnick, P. Encouraging contributions to online communities [PDF]
- Ren, Y., Kraut, R. E. & Kiesler, S. Encouraging commitment in online communities [PDF]
- Kraut, R. E., Burke, M. & Riedl, J. Dealing with newcomers [PDF]
- Kiesler, S., Kittur, A., Kraut, R. & Resnick, P. Regulating behavior in online communities [PDF]
- Resnick, P., Konstan, J. & Chen, Y. Starting a community [PDF]
Also worth reading is IBM’s empirical study of the long-term deployment of their Community Insights (CI) platform to leaders of 470 communities over 10 months: “Our results suggest the need to develop new metrics focused around achieving specific actions within the community, rather than focusing on overall growth measures”.