Awesome presenter Derek Powazek (of www.fray.com and www.powazek.com) speaks today on: how do we apply James Surowiecki's The Wisdom of Crowds to our online design and creation? While the book was written about economics, the concept is also great for designers who want user feedback and participation on their sites, which is pretty much everyone.
Incredibly misunderstood concept – it does NOT mean that any crowd of people in a room is automatically smart. You have to work at it. The idea really started with Francis Galton, who ran a weight-guessing contest for an ox at a fair. The animal weighed 1,198 pounds, and no one guessed it exactly. But when he collected the answers and tallied them up, the median guess was 1,128 pounds.
So how was it that individually everyone was wrong, but collectively everyone was right? That is the phenomenon of the wisdom of crowds. Any news site with a 'most emailed stories' page works the same way: each person makes a single decision about who might like a story, but collectively you see what the community thinks is most relevant. The same thing happens in a P2P service – a pile of people sharing MP3s; the more something is shared, the more popular it must be. When sorted and aggregated, you can see what the most popular content is.
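Galton's aggregation step can be sketched in a few lines. The individual guesses below are invented for illustration; only the 1,198-pound true weight comes from the talk:

```python
import statistics

# Hypothetical crowd guesses (invented for illustration; only the
# 1,198-pound true weight comes from the talk).
guesses = [900, 1050, 1120, 1130, 1180, 1250, 1400]
true_weight = 1198

# No individual guess is exactly right...
assert true_weight not in guesses

# ...but the aggregate lands near the truth.
median_guess = statistics.median(guesses)
mean_guess = statistics.mean(guesses)
```

The median is the usual choice here because it shrugs off wild outlier guesses in a way the mean does not.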
4 elements of 'wise crowds':
- Diversity – background, opinion, wide variety of inputs.
- Independence – each person must be able to contribute in their own way, for their own reasons to avoid group think.
- Decentralization – there's no one in charge, it just happens.
- Aggregation – take all data into a central area to condense it down.
Bringing the 4 elements online:
- Small simple tasks – you can't ask someone to do too much. A comment form, for instance, is an open petri dish where anything can grow. There needs to be an answer: letting people speak so they feel good is okay, but let them work towards an answer.
- Very simple: Hot or Not / Threadless design submissions (a little more complicated: rating by number and saying whether you'd buy it)
- Less simple: Assignment Zero (www.newsassignment.net) – magazine-quality content from the crowd. Initially the task was too broad – write stories! No one did it, so they broke it down: first people nominated folks to be interviewed, then people were asked to conduct the interviews, then editors helped shape the stories. Much easier.
- Large diverse groups – Group think happens when participants in the crowd put themselves ahead of the crowd; they limit their input out of fear. The antidote is not to avoid groups, but to design communities that encourage participation. Don't raise barriers to entry over time; lower them.
- Chevy Tahoe campaign – crowdsourced their next television commercial and got a flood of parody ads attacking the SUV. Someone should've spoken their doubts.
- Design for selfishness – People want/need to get something out of doing something for you: attention, a sense of release, personal satisfaction. You have to design the interface to take into account people's selfish motives for participation. Is it worth their time? What do THEY get for participating? This selfishness is very important; when you lose it, things go off the rails much more quickly. There is wisdom in the data.
- Result aggregation – How to take the aggregate data without turning it into a game? When it turns into a game or contest, things can take a different turn. People like to break the rules and win, not be themselves and share.
Favrd is a good example of this: www.favrd.com. It tallies up 'favorite' activity from within Twitter on the site. No one votes; it just pops up.
The Heisenberg Problem – Flickr's Interestingness, an algorithm that ranks photos from most to least interesting based on views, comments, favorites, etc. By publishing a ranked list, they created a game: to be the winner, you have to be number one on the list. That gave an incentive for bad behavior (spamming groups, bugging friends for comments, etc.) – all to reverse engineer the algorithm. After it had been up for a bit, they de-emphasized that page and moved to www.flickr.com/explore/interesting/7days – a random 'smoosh' of interesting photos on Flickr. No more ranked list; a far more random display of well-ranked photos. It is now less of a game – the interface can change how it all works.
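The pattern described above can be sketched as follows. Flickr's real algorithm is unpublished, so the scoring function, weights, and sample data here are all invented for illustration:

```python
import random

def interestingness(views, comments, favorites,
                    w_views=1.0, w_comments=5.0, w_favorites=10.0):
    """Toy 'interestingness' score: a weighted sum of engagement
    signals. Flickr's real algorithm is unpublished; the weights
    here are invented."""
    return w_views * views + w_comments * comments + w_favorites * favorites

# Invented sample photos: (views, comments, favorites)
photos = {"sunset": (1200, 4, 10), "alleycat": (300, 30, 25)}
ranked = sorted(photos, key=lambda p: interestingness(*photos[p]), reverse=True)

# The 'smoosh': instead of publishing the ranked list (which invites
# gaming), show a shuffled sample of the well-scoring pool.
pool = ranked[:500]
random.shuffle(pool)
```

Shuffling the display while keeping the score internal is exactly the interface change the talk describes: the ranking still drives what appears, but there is no visible number one to compete for.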
Threadless has avoided this by not displaying the ranking of a submission to avoid group think. The majority of online polls separate data input from results to avoid this as well.
Popularity does NOT have to rule. Popular doesn't always mean best, and the most votes doesn't have to win. Amazon's product review page shows both a favorable and a critical review to surface the disagreement – not just the best or worst review, but one from each angle presented side by side. It also focuses on the 'most helpful' review rather than the most recent one.
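The selection logic above is simple to sketch. The review data and field layout here are invented; this is just one way to pick a most-helpful review from each camp:

```python
# Invented review data: (stars, helpful_votes, text)
reviews = [
    (5, 12, "Great product"),
    (4, 40, "Solid, minor flaws"),
    (2, 35, "Broke after a week"),
    (1, 8,  "Terrible"),
]

favorable = [r for r in reviews if r[0] >= 4]
critical = [r for r in reviews if r[0] <= 2]

# Side by side: the most *helpful* review from each camp,
# not the single most popular or most extreme one.
best_favorable = max(favorable, key=lambda r: r[1])
best_critical = max(critical, key=lambda r: r[1])
```

Note that neither pick is the most extreme rating: sorting by helpfulness inside each camp surfaces the substantive disagreement rather than the loudest opinion.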
Implicit vs. Explicit Feedback
Neat or Not's smiley faces / Threadless's numbered ranking / Amazon's 'was this review helpful? yes or no' / the million star-rating variants. Different interfaces work for different projects, but never use more than you need. Never ask people to do more thinking than they have to – it's hard to think of a really good reason NOT to use a simple thumbs up or thumbs down. Most 5-star ratings average out to 2.5–3.5 stars anyway.
- Page views
- Interestingness – catch all for algorithms to monitor all of this 'stuff'.
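The star-vs-thumbs point above can be sketched with numbers. The ratings here are invented to illustrate how a five-star average drifts to the middle while a thumbs split keeps the signal:

```python
# Invented ratings illustrating the point: five-star averages
# tend to land in the 2.5-3.5 band.
star_ratings = [5, 4, 1, 3, 2, 5, 1, 4]
avg = sum(star_ratings) / len(star_ratings)

# The same opinions as thumbs: treat 4-5 stars as up, 1-2 as down
# (a 3 is neutral and counts for neither).
ups = sum(1 for s in star_ratings if s >= 4)
downs = sum(1 for s in star_ratings if s <= 2)
```

The average collapses eight distinct opinions into one mid-band number, while the up/down split still shows which way the crowd leans.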
How you ask questions changes the answers you get. The interface you collect feedback with changes the results on your site. Example: Kvetch (Yiddish for complaining), a site for complaining about random topics: www.kvetch.com. It was meant to be light and funny, but ended up very, very dark. The initial design was black with orange and red; after a redesign (light background, rounded corners and knobs), the responses were dramatically different.
Cognitive performance is affected by the colors people see. Red creates a fear response – a motivator to avoid mistakes and get detail-oriented. Blue creates a calmer, more emotional stance where people are freer to be creative.
Putting it all together
Who is doing this right?
- Brooklyn Museum's Click! gallery (www.brooklynmuseum.org/exhibitions/click) on the changing face of Brooklyn. 400 photos were submitted, and voters had to rank themselves on what kind of 'expert' they were and how into art they were. There was an interesting correlation between expert status and which photos were selected. The photos that won were printed and displayed, with their size dependent on the number of votes received.
- Get Satisfaction – allows users to perform a small simple task ('I like this idea!' button) and also bump up the most well liked post to the top. Other implicit feedback data is also aggregated by sharing replies, number of people talking on thread, etc. Also, great visual in 'the mood in here' by smiley face bar chart. These are all in addition to the long lists of posts – find bits to plug in to create a better experience.
In online spaces, we are deprived of real-world human interaction data – facial expressions, tone of voice, etc. are gone. So our brains work twice as hard to fill in that missing data. When we feel out of control, our brains make up stories. With no social data, we fill it in ourselves. This is why people seem insane on the Internet – they fill in the gaps with completely off-base ideas.
Using smart community settings will help you see patterns in the chaos instead of filling them in with stories. Helping people get in touch with something they feel in control of, and can participate in, is the best thing you can do for your community.