How AI Can Accurately Spot Fake Restaurant Reviews

It’s well known that user reviews are a fundamental part of the web economy, with consumers tending to trust the word of their fellow customers above any other form of marketing. Making sure those reviews are truthful and authentic is therefore key, especially as unscrupulous vendors are happy to produce fake reviews to puff up their service. A recent study from Aalto University highlighted how it’s increasingly possible to create fake reviews automatically.

The authors state that around 40% of us make decisions based upon feedback received from other people, whilst a good review can boost sales by around 30%. In other words, reviews are important to the success of a product, which creates an incentive to produce fake ones to boost your ratings. So are AI-driven technologies increasingly capable of producing convincing fakes?

“Misbehaving companies can either try to boost their sales by creating a positive brand image artificially or by generating fake negative reviews about a competitor. The motivation is, of course, money: online reviews are a big business for travel destinations, hotels, service providers and consumer products,” the authors say.

Faking success

Work in this area has been ongoing for a little while, with a team from the University of Chicago previously developing a machine learning system trained on a few million real reviews, which was then able to generate fake restaurant reviews. Whilst reasonably impressive, this system had a tendency to drift off topic, making the reviews fairly easy to spot as fakes.

The Aalto team attempted to overcome this by using neural machine translation to give the system a better sense of context. The generator works from a text sequence of review rating > restaurant name > city > state > food tags, and with this context it began to produce more believable reviews.
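To make that input format concrete, here is a minimal, hypothetical sketch (not the researchers’ actual code): it simply flattens the review metadata into the kind of conditioning sequence described above, with a stand-in stub where a trained sequence-to-sequence model would sit.

# Hypothetical sketch of the conditioning step described above.
# The real system feeds such a sequence to a trained neural machine
# translation / seq2seq model; here the generator is only a stub.

def build_context(rating: int, name: str, city: str, state: str, tags: list[str]) -> str:
    """Flatten the review metadata into one conditioning string:
    rating > restaurant name > city > state > food tags."""
    return f"{rating} > {name} > {city} > {state} > {', '.join(tags)}"

def generate_review(context: str) -> str:
    """Stand-in for a trained seq2seq generator: context in, review text out."""
    return f"[review text conditioned on: {context}]"

if __name__ == "__main__":
    ctx = build_context(5, "Mario's Trattoria", "Chicago", "IL", ["italian", "pasta"])
    print(ctx)
    print(generate_review(ctx))

The point of conditioning on this metadata is that the generated text stays tied to a specific rating, venue and cuisine, which is what kept the Aalto reviews from drifting off topic the way the earlier Chicago system did.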

“In the user study we conducted, we showed participants real reviews written by humans and fake machine-generated reviews and asked them to identify the fakes. Up to 60 percent of the fake reviews were mistakenly thought to be real,” the team explain.

Not content with creating such a nefarious system, the team then set about building a detector that could spot these fakes. Their technology was effective, especially in cases where human evaluators had been fooled by the fake reviews.

Given the power of reviews over our buying behaviors, it’s inevitable that such technology will be developed, so it’s good to see researchers attempting to understand the latest approaches with ethics in mind.
