Why Crowdsourcing is No Magic Bullet

A few years ago in an English newspaper, there was a story about a near-fatal stabbing. The victim was rushed to hospital, where, during an X-ray to determine the extent of the damage, a small cancerous growth was found. It was removed and the man made a full recovery, remarking that were it not for the stabbing he might have died from the hidden cancer that was quietly causing more damage than the knife.

In 1906, British scientist Francis Galton went to the West of England Fat Stock and Poultry Exhibition, where he witnessed a weight-judging competition. A crowd of farmers and local people had gathered, each paying sixpence to enter a guess of how much an ox would weigh ‘slaughtered and dressed’.

Galton had little faith in the intelligence of the average man, so decided to conduct an experiment to back up his argument. When the competition was over, he collected all 787 guesses written on discarded tickets and analyzed the data. What he found shocked him: the average guess was 1,197 pounds; the correct weight was 1,198. The ‘dumb’ crowd’s judgment was almost perfect.

Groups of farmers aren’t always right. Stabbing does not always cure cancer.

It seems absurd to point out that a solution doesn’t have to work either all the time or not at all. Penicillin is not a failed drug because it doesn’t cure cancer. Some things aren’t supposed to be universally applied and when we stretch an idea way beyond its effective plasticity, cracks begin to show.

Popularized by New Yorker staff writer James Surowiecki’s 2004 book The Wisdom of Crowds, ideas such as crowdsourcing and the hive mind began to seep into the mainstream, thanks to fully functioning embodiments of group-think theory like Wikipedia and Linux.

Seemingly blowing the economic model of the rational man (one who parts with his time only for hard cash) out of the water, the founders of such experiments became the pin-ups of a post-expert infosystem: a digital nirvana where the crowd was king yet the monarchy was mocked.

But what do we actually have to show for the crowd’s toil, years later? As recovering digital evangelist Jaron Lanier points out in his book You Are Not A Gadget, if 15 years ago he’d told people that all we’d have to show for this revolutionary approach to problem solving would be a new type of encyclopedia (Wikipedia) and an adapted operating system (Linux), people wouldn’t have been too impressed. As fascinating as the Wikipedia model is, we already had an encyclopedia model that worked. We already had Unix.

Web 2.0 experiments have achieved some amazing goals. But like the wisdom of crowds theory that inspired/intellectualized them, they are limited.

Wikipedia isn’t always wrong. But it also isn’t always right. The same is true of crowdsourcing as a tool.

Ask a crowd to calculate how many balls are in a jar, how much an ox weighs or the correct price for a company share, and the crowd will often outperform its most accurate member.
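The averaging effect behind Galton’s result is easy to demonstrate. Below is a minimal sketch using hypothetical guesses (not Galton’s actual data): the crowd’s estimate is the simple mean, and by the triangle inequality its error can never exceed the average individual error, even though a lucky individual may still beat it.

```python
# Toy illustration of the "wisdom of crowds" averaging effect.
# The guesses below are hypothetical, not Galton's actual 787 entries.

def crowd_estimate(guesses):
    """The crowd's collective estimate: the simple mean of all guesses."""
    return sum(guesses) / len(guesses)

true_weight = 1198  # pounds, the actual weight of Galton's ox

# Hypothetical individual guesses scattered around the truth
guesses = [1150, 1250, 1180, 1220, 1100, 1290, 1205, 1190]

crowd = crowd_estimate(guesses)
crowd_error = abs(crowd - true_weight)
individual_errors = [abs(g - true_weight) for g in guesses]
mean_individual_error = sum(individual_errors) / len(individual_errors)

print(f"crowd estimate: {crowd:.1f} (error {crowd_error:.1f} lb)")
print(f"mean individual error: {mean_individual_error:.1f} lb")

# Guaranteed by the triangle inequality: averaging cancels
# symmetric errors, so the crowd beats its average member.
assert crowd_error <= mean_individual_error
```

With these numbers the crowd lands within a fraction of a pound of the truth, while the typical individual is off by tens of pounds; note this only works because the question has a single countable answer.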

Where there is a right answer, the crowd will flourish. A good rule of thumb (while not 100% accurate, of course) is: if you can count it, you can crowd it. But when it comes to more qualitative issues, crowdsourcing isn’t supposed to work. Look at the gulf between the crowdsourced music charts and the opinion of experts or critics. One is no more accurate than the other in essence, but if I wanted to find the best new music, I’d ask the expert, not buy the number one single.

Likewise, crowds can find the weight of an ox, but not the prettiest ox. And, as Jeff Howe, author of Crowdsourcing: How the Power of the Crowd is Driving the Future of Business, told me, ‘I find the crowd adept at weeding out the ugliest oxen, but they’ll often miss the strange beauty.’

Applying a tool as if it is a toolbox only leads to mistakes. As the old saying goes, to a man with a hammer, everything looks like a nail.

Galton learned the folly of assuming the crowd is always dumb. But we need to remember that it’s not always inherently intelligent, and even if it is, it may not be the right way to solve your problem. Forgetting this creates perceived failures and damages the reputation of a good theory that was never meant to be universally applied.

It’s about selecting the right tool for the job. Use a spanner as a hammer and all you’ll do is break the tool and the nut.