The wisdom of crowds in financial investments

The wisdom of crowds as a concept has been around for some time now, and it has gradually grown to underpin many of the crowd-based models that have flourished in recent years. It has also fostered a good deal of discussion around the merits of experts, and just how valuable they actually are.

One of the more famous examples is of course the 2005 study conducted by Philip Tetlock, who tracked 284 experts over a 20-year period to see just how many of their 28,000 or so predictions actually came to fruition. The results were infamously abysmal, with the so-called experts performing scarcely better than chance.

A slightly more recent study suggests things might not be quite as simple as that. It saw 1,500 forecasts made by intelligence analysts pored over to determine just how accurate the spooks were at predicting the future. This study painted an altogether different picture. Not only were the analysts rather accurate (right 75% of the time), but they also became more accurate the more experienced they were.

The suggested explanation for the difference in outcome between the two studies is that the analysts were likely to be held accountable for their opinions far more than the ‘experts’ in the Tetlock study. That accountability in turn encourages a more conservative approach, which tends to make predictions more accurate.

So it was with great interest that I read a recent study comparing the relative performance of the crowd and a group of experts in funding arts projects.

The testing ground for the research was the crowdfunding platform Kickstarter. The researchers compiled data on a random sample of theatrical projects posted on the site between May 2009 and June 2012. These 120 projects were then split into 20 groups of six, with each group containing at least three projects that failed to meet their funding target. Of the three remaining successful projects in each group, one exceeded its goal by at least 110%.

The researchers then asked a team of 30 expert judges, armed with the same kind of presentation videos and photos as the Kickstarter crowd, to recommend a funding amount for each project.  Would the crowd and the experts come to similar conclusions?

The decisions of the judges and the crowd were “remarkably similar,” researcher Ethan Mollick says, adding that there was 57% to 62% agreement between them.

“The judges seem to consistently rate the successful projects higher than those that did not achieve their funding goal,” he continues. “Projects that were funded by the crowds were twice as likely to be ranked as the best project, while those that were unsuccessful were more than two times as likely to be ranked as the worst project by judges.” Judges also gave, on average, 1.5 times more money to projects that had succeeded in reaching their real-life Kickstarter goal than to projects that had not.
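The post doesn’t spell out how that agreement figure was computed, but one simple way to picture it is as the share of projects on which an expert’s implicit fund/no-fund call matches the crowd’s actual outcome. The sketch below is purely illustrative: the decision rule and all of the numbers are assumptions made up for the example, not data from the study.

```python
# Illustrative only: a toy "agreement rate" between expert funding calls and
# crowd outcomes. The data and the decision rule here are invented for the
# example; the study's actual methodology may differ.

def agreement_rate(expert_awards, goals, crowd_funded):
    """Share of projects where the expert's implicit fund/no-fund call
    (recommended award >= goal) matches whether the crowd funded it."""
    matches = sum(
        (award >= goal) == funded
        for award, goal, funded in zip(expert_awards, goals, crowd_funded)
    )
    return matches / len(goals)

# Hypothetical numbers for five projects: funding goal, the amount an expert
# judge would award, and whether the Kickstarter crowd hit the goal.
goals         = [5000, 3000, 8000, 2000, 10000]
expert_awards = [6000, 1000, 9000, 2500,  4000]
crowd_funded  = [True, False, True, False, True]

print(f"Agreement: {agreement_rate(expert_awards, goals, crowd_funded):.0%}")
# -> Agreement: 60%
```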

The main divergence between the crowd and the experts occurred when the experts declined to back a project that the crowd had supported. The researchers suggested this was because those projects were highly proficient at marketing themselves in ways that appeal on sites such as Kickstarter: they may have offered some great rewards for backers, for instance, or used highly appealing videos.

The researchers suggest that their study could provide yet more support for the value of opening funding up to the crowd, as there was a reasonable amount of agreement between the projects the crowd chose and those the experts chose. Of course, we’re already seeing this kind of approach in action in crowd-based talent shows such as X Factor.

As with the earlier studies, however, the key here would seem to be the accountability of both sets of investors. With a financial stake in their decision, there is a tangible incentive to predict wisely. That alone seems to be enough to ensure strong decisions are made.
