Why We Believe Mobile Reviews

Tuck marketing professor Lauren Grewal studies the consumer implications of reviews posted from mobile devices.

Early in 2012, TripAdvisor made a slight change to its website. The company, which bills itself as the world’s largest travel site, features user-generated reviews of hotels and restaurants.

Since its inception in the early aughts, the site had published reviews written from desktop computers and mobile devices without showing readers which type of device was used. But in 2012, TripAdvisor began differentiating mobile reviews by displaying the words “via mobile” above the body of the review.

Tuck assistant professor Lauren Grewal found this curious. As a marketing professor, she focuses on how consumers use and process digital and social media. Academics have long studied the impact of user-generated content (UGC)—such as reviews—on consumers’ purchasing decisions. But we know very little about how consumers react to UGC written from mobile devices. Now that some websites identify mobile reviews, there is an opportunity to examine how, if at all, that information influences consumers.

Grewal and her co-author Andrew T. Stephen of Saïd Business School at the University of Oxford address that question in their new paper, “In Mobile We Trust: The Effects of Mobile Versus Non-mobile Reviews on Consumer Purchase Intentions,” forthcoming in the Journal of Marketing Research. In it, they analyze roughly 1.5 million public reviews posted on TripAdvisor between 2012 and 2015, testing whether reviews written on mobile devices affect how many “helpful” votes the reviews receive. They also perform five experiments to better understand how mobile reviews might influence purchase intentions. They find that when consumers know a review was written on a mobile device, they are more likely to purchase the reviewed product or service. And they link this behavior to the belief that writing a review on a mobile device takes more effort, which endows the review with greater credibility.

Consumers struggle to know which reviews to believe, and which ones to discount. So we use cognitive shortcuts to separate the good from the bad.

When Grewal and Stephen first began seeing the influence of mobile reviews, they surmised that the effect was due to mobile reviews being seen as more recent than those posted from desktop computers. But the data didn’t match that story. Consistently, they saw no perceived differences in recency across devices. Instead, “we kept finding that there was something about the effort it takes to write a mobile review,” Grewal says, “so we wanted to know what it was about effort in particular that led to a review being more helpful and increasing a consumer’s purchase intentions.”

The answer is related to a major weakness of online reviews: uncertainty about credibility. There have been many scandals involving fake reviews posted by people working directly for or against a particular product or service. Consumers struggle to know which reviews to believe, and which ones to discount. So we use cognitive shortcuts to separate the good from the bad. We look for reviewers who are “verified” by the platform, or reviews that are well written and intelligent. Another shortcut comes via the effort heuristic, a cognitive bias that makes us value something we perceive as having required a lot of effort to produce—even if the underlying product is no different. For example, consumers are willing to pay more for the same product at a retail store with a well-designed display window than at a store with a less organized display. This effort heuristic explains why “information is seen as more credible if more effort is believed to have gone into it,” Grewal says.

Consumers implicitly make this judgment of effort when they see a mobile review, and then they subconsciously appraise that effortful review as being more credible.

Why do consumers believe it takes more effort to write a review on a mobile phone? It’s all about the device’s limitations. The small screen. The mini-keyboard. The auto-correct feature that misrepresents what you’re trying to say. Writing on a mobile device is simply believed to be physically more difficult than writing on a desktop computer. Consumers implicitly make this judgment of effort when they see a mobile review, and then they subconsciously appraise that effortful review as more credible. A more credible review is then deemed more helpful and, if the review is positive, more persuasive in boosting purchase intentions.

Interestingly, the researchers didn’t find the same effect with negative reviews written from mobile devices. They connect this result to prior research showing that people weigh negative information more heavily than positive information. “With negative reviews, as consumers are placing more weight on the information provided in the review, they are less likely to use heuristic cues (such as the mobile effort heuristic) as part of their decision-making process,” the authors write.

One strong implication of this paper is that “seemingly innocuous contextual factors can be persuasive,” Grewal and Stephen say. For online review sites where people worry about fake reviews, differentiating mobile from desktop reviews can give mobile reviews more weight without hurting the credibility of desktop reviews. And for businesses relying on reviews as part of their marketing efforts, “they might encourage people to use their mobile device to write their review on sites that identify mobile reviews,” Grewal says.