I was doing some research this summer into the new Google+ Local platform, and in particular its new Zagat review system. It brought home the importance many small businesses are placing on getting great reviews.
Whilst it’s always been ‘known’ that word of mouth is critical, and that a positive (or negative) review carries much more weight than any other form of marketing, a new study by the University of California, Berkeley now puts some hard evidence behind it.
They wanted to test the relationship between online ratings and the purchasing decisions of actual customers. They used 300 restaurants in the San Francisco area and measured their online popularity via the Yelp.com site.
The results should be essential reading for any good online marketer. They found that if ratings for a restaurant improved by just half a point (on a 1–5 scale), that restaurant was full during peak hours much more often. Indeed, this extra half point meant the restaurant sold out its peak-hour tables between 30% and 49% of the time.
Of course, you might argue that this could have been caused by anything; a special offer, perhaps, could have produced the boom. The researchers made sure to rule out such ‘outside’ influences, however, checking that the changes in demand were not caused by price changes or by the quality of the food and service.
The researchers said: “The findings of this study demonstrate that – although social media sites and forums may not generate the financial returns for which investors yearn – they play an increasingly important role in how consumers judge the quality of goods and services.”
OK, you may then argue that cause and effect might be difficult to establish. After all, a better restaurant is more likely both to sell out and to get better reviews than a poor one, so it’s hard to pin down the good reviews as the factor that fills a restaurant’s tables each evening.
The researchers do believe their findings are robust, however, and cite Yelp’s protocol of rounding displayed ratings to the nearest half star. This means two restaurants with very similar average ratings can appear to be of very different quality to online viewers. For example, a restaurant with an average rating of 3.74 displays a 3.5-star rating, while a restaurant with an average rating of 3.76 displays a 4-star rating.
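That rounding step is simple enough to sketch in a few lines. This is a hypothetical illustration of round-to-the-nearest-half-star, not Yelp’s actual code:

```python
def displayed_rating(average: float) -> float:
    """Round a raw average rating to the nearest half star,
    as a review site might display it (illustrative sketch only)."""
    # Doubling maps half-star steps onto whole numbers,
    # so a standard round() then lands on the nearest half star.
    return round(average * 2) / 2

# Two near-identical restaurants end up half a star apart online:
print(displayed_rating(3.74))  # 3.5
print(displayed_rating(3.76))  # 4.0
```

A tiny gap in the underlying average (0.02 of a point here) crosses the rounding boundary and produces a visibly different star rating, which is exactly the discontinuity the researchers exploited.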
This makes two very similar restaurants appear very different in the online world. With access to the raw data (the actual scores rather than the rounded ratings), the researchers could compare restaurants with very similar scores but different displayed ratings, and they found that the extra half star does make a real difference to a restaurant’s popularity.
“Differences in customer flows between such restaurants can therefore be attributed to the ratings themselves rather than differences in the quality of food or service.”
So how big a difference are we talking here?
Well, it turns out, quite a bit. They found that moving from a 3-star rating to a 3.5-star rating increased the chances of selling out from just 13% to a respectable 34%. Going from 3.5 to 4 stars made selling out a further 19 percentage points more likely.
Of course, when the returns are so clear, there is an obvious incentive for restaurant owners to create sock puppet accounts and leave fake reviews to boost their fortunes.
“These returns suggest that restaurateurs face incentives to leave fake reviews, but a rich set of robustness checks confirm that restaurants do not manipulate ratings in a confounding, discontinuous manner.”
Sites like Trip Advisor have long had a flagging system in place so that travellers are aware when a hotel is suspected of suspicious activity. With these findings, it would seem time for restaurant review sites to ensure their own services are as bona fide as possible.
In the meantime, if you’re a restaurant, now’s the time to start gathering reviews from diners. A good strategy might be to offer diners a discount on their meal if they leave a review via their smartphone between requesting the bill and paying.