The famous New Yorker cartoon suggested that on the Internet, no one knows you're a dog. Whilst the ability to hide behind a username is a long and often cherished Internet tradition, it does have a darker side. Astroturfing is probably the most common example: posting positive (or negative) reviews of your own products online under fake names. Laws prohibiting astroturfing have been in place since 2009, but they don't seem to have put people off.
A new paper from researchers at the University of California, Santa Barbara looks at so-called crowdturfing, a concept they define as a combination of crowdsourcing and astroturfing. Their research found that creating fake profiles on social networks is a rapidly growing industry, and that the quality has improved from the previously obvious, poorly crafted machine-generated profiles.
Ben Zhao, who led the research, analysed a large set of blocked accounts from the Chinese social network RenRen and found many of them were convincing-looking shill accounts created automatically. Zhao uncovered a thriving spam business in China devoted to generating such accounts. Many companies offer astroturfing services for sale, and whilst prices are remarkably low, these companies nevertheless make a very good living from astroturfing, a sure sign of the level of demand.
The website Zhubaijie, for instance, was found to have advertised as many as 100 such campaigns per month way back in 2007, with more recent analysis putting the figure in the tens of thousands per month. It seems clear that whilst laws exist to prevent this in Europe and America, outsourcing much of the work to India and China is proving very effective at getting round those laws.
Zhao reveals the extent of the problem in this MIT Technology Review article. “This industry is millions of dollars per year already and [shows] roughly exponential growth,” he says. “I think we’re still in the early stages of this phenomenon.”
To be honest, I already thought this was pretty common. There's such a big industry in hiring someone for peanuts to post spam for you that it doesn't seem a stretch to see people hiring those same people to share social content.
Exactly. You can buy all sorts of things on Mechanical Turk, for instance. This is nothing new.
A startup is now claiming that 80% of the clicks on their Facebook ads were from bots. Pretty damning if true.
http://techcrunch.com/2012/07/30/startup-claims-8…
Worrying, isn't it? Especially as this has only really come to light because the company built its own tracking script. That's beyond many of the advertisers on Facebook, I suspect, so for most it simply flies under the radar. With Facebook scrambling to earn money, though, this must be a real concern. They don't make much from mobile, and advertisers are increasingly disgruntled about advertising on the full site.
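For anyone curious what a tracking script like that might check, here's a minimal sketch of the kind of server-side heuristics a small advertiser could apply to its own click logs. This is purely illustrative and hypothetical, not the startup's actual script or anything Facebook provides; the field names and thresholds are assumptions.

```python
import re
from datetime import datetime, timedelta

# Illustrative bot signatures in User-Agent strings (not exhaustive).
BOT_UA_PATTERN = re.compile(r"bot|crawler|spider|curl|python-requests", re.IGNORECASE)

def is_suspicious_click(click, previous_click_time=None):
    """Flag a click as likely automated, using simple heuristics.

    `click` is a dict with hypothetical fields:
      'user_agent'  - the browser's User-Agent header
      'timestamp'   - a datetime for when the click landed
      'js_enabled'  - whether the page's JavaScript beacon fired
    """
    # 1. Obvious automated user agents.
    if BOT_UA_PATTERN.search(click.get("user_agent", "")):
        return True
    # 2. No JavaScript executed: many ad-click bots fetch the landing
    #    page without ever running the on-page tracking script.
    if not click.get("js_enabled", False):
        return True
    # 3. Implausibly rapid repeat clicks from the same visitor.
    if previous_click_time is not None:
        if click["timestamp"] - previous_click_time < timedelta(seconds=2):
            return True
    return False
```

Real fraud detection is far more involved (IP reputation, mouse movement, conversion tracking), but even crude checks like these would reveal a large gap between platform-reported clicks and plausibly human ones.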
Whilst I'm not surprised, I suspect you'll have this problem with whatever advertising system you use. I have advertised on just about all the major self-serve advertising platforms, and no platform's reported clicks match the clicks we can verify ourselves.
Having said that, though, it does appear that Facebook's rate of fake clicks is much higher than those of the other platforms at the moment.
What is the difference between crowdsourcing and crowdturfing? Is this a new model or a set of protocols to follow?