by Marco Francesco Mazzù, Matteo De Angelis, Alberto Andria, and Angelo Baccelloni
The rise of artificial intelligence (AI) has transformed the way consumers interact with brands and make purchasing decisions. One significant development has been the integration of AI as a recommendation tool, which companies like Amazon, Netflix, and Spotify use to deliver personalized product and service suggestions. However, while AI-driven recommendations provide personalized insights based on shopping behaviors, consumers often demonstrate a preference for human recommendations, especially when subjective judgment is involved (Longoni et al., 2019). The research into “algorithm aversion” highlights that consumers tend to favor human experts over AI in subjective or emotional decision contexts (Dietvorst et al., 2015) due to the algorithms’ inability to address consumers’ unique characteristics, referred to as “uniqueness neglect” (Longoni et al., 2019). In contrast, in more objective decision-making situations, such as those requiring numerical precision or logical reasoning, consumers may prefer AI over human recommendations (Castelo et al., 2019). As a result, AI may be better suited to products that require detailed, fact-based evaluations (i.e., search products) than to products that rely on sensory perceptions or emotional responses (i.e., experience products) (Gray & Wegner, 2012).
Search products, such as electronics, are characterized by standardized attributes that can be easily compared and assessed before purchase. AI, with its ability to process large amounts of data and provide accurate, detailed recommendations, would perform well in this domain. Experience products, on the other hand, require the consumer to experience the product first-hand—examples include spa services or gourmet meals. For such products, human recommendations are often seen as more credible (Franke et al., 2004).
Recent work provides further insight into this distinction, suggesting that AI voice-based recommendations outperform text-based online reviews for search products due to their perceived credibility (Flavián et al., 2023); AI can also serve as a “companion” in relationships and decision journeys (Chaturvedi et al., 2024).
However, when it comes to experience products, the distinction between AI and human recommendations becomes less clear, especially when the human recommender is an expert. In these cases, consumers may trust the human more, valuing the recommender’s lived experience and reputation.
Two factors that might act as underlying mechanisms explaining why consumers react differently to human versus AI recommendations across product types are transparency and credibility. Transparency refers to the perceived ability of the recommendation source to provide complete and reliable information (Parris et al., 2016), while credibility is the perception that the source possesses the relevant expertise and can be trusted to offer an objective opinion (Ohanian, 1990; Mazzù et al., 2023).
In our recent research we showed that transparency is particularly crucial when comparing AI and human recommendation sources because consumers tend to perceive them as different in how they generate and present information. AI may be viewed as providing more reliable and complete information for search products, while human experts are perceived as more transparent and credible for experience products.
Our study suggests a clear interaction between product type, recommendation source, and consumer response. For search products, AI recommendations are generally preferred due to the perceived transparency and credibility of AI in providing detailed, fact-based evaluations. However, for experience products, the preference shifts when the human recommender is highly experienced. Consumers are more likely to follow the advice of a well-qualified human expert over AI in these cases, as they attribute higher transparency and credibility to the expert’s lived experience.
Moreover, transparency and credibility together shape consumer intentions. Consumers’ perception of source transparency leads to higher credibility, which in turn increases their intention to follow the recommendation. This sequential effect is particularly important when considering how companies might optimize their recommendation strategies based on product type.
For businesses offering search products, leveraging AI-driven recommendation systems is essential, as consumers rely on AI’s thorough, fact-based evaluations. In contrast, for experience products, companies should focus on knowledgeable human recommenders, whose expertise boosts perceived credibility and transparency. It is therefore crucial for businesses to manage their recommendation sources in ways that foster trust in both AI and human recommendations, in order to shape consumer preferences, reconfigure online platforms (Sanchita & Gupta, 2023), and ultimately influence purchasing decisions across product types.
References
Castelo, N., Bos, M. W., & Lehmann, D. R. (2019). Task-dependent algorithm aversion. Journal of Marketing Research, 56(5), 809–825.
Chaturvedi, R., Verma, S., & Srivastava, V. (2024). Empowering AI Companions for Enhanced Relationship Marketing. California Management Review, 66(2), 65–90.
Dietvorst, B. J., Simmons, J. P., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114–126.
Flavián, C., Akdim, K., & Casaló, L. V. (2023). Effects of voice assistant recommendations on consumer behavior. Psychology & Marketing, 40(2), 328–346.
Franke, G. R., Huhmann, B. A., & Mothersbaugh, D. L. (2004). Information content and consumer readership of print ads: A comparison of search and experience products. Journal of the Academy of Marketing Science, 32(1), 20–31.
Gray, K., & Wegner, D. M. (2012). Feeling robots and human zombies: Mind perception and the uncanny valley. Cognition, 125(1), 125–130.
Longoni, C., Bonezzi, A., & Morewedge, C. K. (2019). Resistance to medical artificial intelligence. Journal of Consumer Research, 46(4), 629–650.
Mazzù, M. F., Pozharliev, R., Andria, A., & Baccelloni, A. (2023). Overcoming the blockchain technology credibility gap. Psychology & Marketing, 40(9), 1791–1807.
Ohanian, R. (1990). Construction and validation of a scale to measure celebrity endorsers’ perceived expertise, trustworthiness, and attractiveness. Journal of Advertising, 19(3), 39–52.
Parris, D. L., Dapko, J. L., Arnold, R. W., & Arnold, D. (2016). Exploring transparency: A new framework for responsible business management. Management Decision, 54(1), 222–247.
Sanchita, K., & Gupta, S. (2023). Strategies for Value Reconfiguration in Online Platforms. California Management Review, 66(1), 72–95.