Viral loops as a subset of feedback loops in social media

I like Jeremiah Owyang’s matrix, with its embedded point about the value of the viral loop for driving engagement once you reach the advanced stage of integrating your corporate site with your social media strategy.

This accords with my views on how to grow online communities (where, in my experience, viral loops are a subset of feedback loops) – so such a strategy works both in the viral sense above, with users at large, and in terms of establishing feedback loops with top contributors.

Are there any circuit-design-style simulators out there that could plug into your web analytics data and let you test out viral campaigns?
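I haven’t found one, but the core of such a simulator is simple enough to sketch. Here’s a rough Python sketch of a viral loop model – the invites-per-user and conversion-rate figures are hypothetical parameters you would pull from your own analytics, not real data:

```python
# Rough sketch of a viral-loop simulator. The parameters are hypothetical -
# in practice you'd feed in numbers from your web analytics data.

def simulate_viral_loop(seed_users, invites_per_user, conversion_rate, cycles):
    """Simulate user growth over a number of viral cycles.

    K-factor = invites_per_user * conversion_rate; K > 1 means the
    campaign is self-sustaining, K < 1 means it decays over time.
    """
    total_users = seed_users
    new_users = seed_users
    history = [total_users]
    for _ in range(cycles):
        invites_sent = new_users * invites_per_user
        new_users = invites_sent * conversion_rate
        total_users += new_users
        history.append(round(total_users))
    return history

# Example: 1,000 seed users, each sending 5 invites at a 15% conversion rate
print(simulate_viral_loop(1000, 5, 0.15, cycles=10))
```

With those made-up numbers the K-factor is 0.75, so growth tails off – exactly the kind of what-if you’d want a simulator to show before launching a campaign.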

I sketched this out a bit more on a recent slideshare: http://www.slideshare.net/stuartgh/feedback-loo…

It’s also worth reading the comment from Bert DuMars on the value of consumer-generated product reviews published on one’s own site as a powerful feedback loop for driving performance:

When you integrate CGP reviews into your branded website you are inviting additional conversation about your products and services. You are opening up to your consumers and allowing them to begin a conversation with you about what they like and do not like about your products.

If you are open and honest (showing both positive and negative reviews) you not only learn how to improve your products and services, you are given the opportunity to show that you care about your consumers. We have seen culture change at our Rubbermaid and Dymo brands based on CGP reviews.

We can respond faster to feedback, especially negative, and reach out to consumers to learn what went wrong. We can then adjust the product or service based on that feedback. Think of it as an ongoing, near real-time, feedback loop and a gift from your consumers.

Who believes in the 90-9-1 rule?

A second question on LinkedIn from Dr Michael Wu, Principal Scientist at Lithium Technologies:

Is there something more accurate and precise than the 90-9-1 rule out there? IMHO, the Lorenz Curve and Gini Coefficient. Do you know anything else? (from his post, The Economics of 90-9-1)
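(For anyone who wants to play with the metrics Dr Wu mentions, here’s a minimal Python sketch of a Gini coefficient computed over per-member contribution counts – the counts below are invented purely for illustration.)

```python
# Minimal sketch: Gini coefficient of member participation.
# The contribution counts are made up purely for illustration.

def gini(contributions):
    """Gini coefficient: 0 = everyone contributes equally,
    values near 1 = contribution concentrated in a few members."""
    values = sorted(contributions)
    n = len(values)
    total = sum(values)
    if total == 0:
        return 0.0
    # Cumulative weighted-sum formulation of the Gini coefficient
    weighted_sum = sum((i + 1) * v for i, v in enumerate(values))
    return (2 * weighted_sum) / (n * total) - (n + 1) / n

# A toy community: many lurkers, a few light posters, one heavy contributor
counts = [0] * 90 + [2] * 9 + [150]
print(round(gini(counts), 2))  # ~0.98: participation is highly concentrated
```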

My answer, as part of yesterday’s Online Community Manager group discussion, kind of sums up where I’ve got to after reading Dr Wu’s previous blog post and this latest one:

I like your approach of using economics-based models. I’ve come at it from a more participant-observer, sociological point of view, so what I’d like to see is for your analysis to return a new ‘rule of thumb’ based on your in-depth data analysis.

The 90-9-1 rule is useful to community managers because it provides a starting point for understanding, as Arantza says above. For example, it would be useful to know, from a practical point of view, whether 90-9-1 is a helpful tool for launching a new, more open community (as opposed to a niche market research or project-based community).

It’s partly about creating a social dashboard that can explain to a member of senior management why a certain kind of community activity may help or hinder greater participation.

I did this kind of work previously in the National Health Service, creating simple reports on the success of a national public health initiative, which worked well for senior managers (government ministers in that case).

So I come back to the challenge, the age-old relationship between lab and fieldwork if you like: what would the new rule (or rules) of thumb be?

I’ve chosen to highlight multiple feedback loops as a useful tool, to help drive top contributors for example (taken from the HP Labs research), but I take your point that more precision is required for commercial ROI purposes. To put it another way: in such a dynamic social context, how does precision help you create heuristics for day-to-day community management?
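As a rough illustration of the kind of day-to-day heuristic I have in mind, here’s a sketch that buckets members into lurkers, contributors and creators from activity counts and compares the split against the 90-9-1 baseline – the thresholds and sample data are my assumptions, not anything from Dr Wu’s analysis:

```python
# Hypothetical day-to-day heuristic: bucket members by activity level
# and compare the split against the 90-9-1 baseline. Thresholds and
# sample data are assumptions for illustration only.

def participation_split(posts_per_member, contributor_min=1, creator_min=20):
    """Return the percentage of lurkers, contributors and creators."""
    n = len(posts_per_member)
    creators = sum(1 for p in posts_per_member if p >= creator_min)
    contributors = sum(1 for p in posts_per_member
                       if contributor_min <= p < creator_min)
    lurkers = n - creators - contributors
    return tuple(round(100 * x / n, 1) for x in (lurkers, contributors, creators))

# A toy month of activity data for a 200-member community
activity = [0] * 170 + [3] * 25 + [40] * 5
split = participation_split(activity)
print("lurkers/contributors/creators: %s%% / %s%% / %s%%" % split)
print("baseline (90-9-1):             90% / 9% / 1%")
```

Something that simple would never satisfy the economists, but it is the sort of report a community manager can glance at daily and explain to senior management in one line.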