AI-fueled Sensory: Democratizing Sensory Power to Understand Product Experiences

In the dynamic world of product research, Product Hub's Sensory Portrait tool leverages AI to revolutionize sensory research. By rapidly generating and analyzing sensory data, this tool makes high-quality sensory insights more accessible and actionable, enabling brands to optimize their products based on detailed consumer preferences.

Simon Harris
14 Mar 2024 · 9 min read

The Power of Sensory Data

In the dynamic world of product research, agility is becoming fundamental. No, not becoming: it is fundamental. And yet the dilemma can be, what do you have to lose to gain it? Surely the lighter you are, the faster you fly? That doesn't have to be the case, particularly when we find ourselves in the midst of a technology and AI revolution that offers innovative ways of capturing and integrating data. This is exactly what Product Hub does: combining consumer and sensory data with expert standardization and UX-centric technology, so that the resulting outputs are greater than the sum of their individual parts.

Yet questions may arise: what exactly is sensory data? How does it help us understand consumers or build products? Great questions.

Sensory data is generated by trained panelists – people with heightened sensorial abilities who are proficient in articulating their experience. They deconstruct product experiences into all the sensory touchpoints that feed into that experience. Consider chocolate (yes please): different chocolates will have different levels of sweetness and different types of fruit notes; the sensation of biting them will differ; some may be harder than others; and how they melt in the mouth and coat the mouth will also differ, making each product experience distinct. These sensory touchpoints are fundamentally important in shaping the consumer experience of your products, because even though the average consumer could not articulately explain how one chocolate differs from another, their preferences and purchase intent may still differ. Within a sensory research session, trained panelists decode all of those touchpoints, and linking the resulting sensory data with direct consumer data helps us understand how these sensory touchpoints influence perception and enjoyment, at a more nuanced level than consumers are able to articulate alone.
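To make that concrete, here is a minimal, purely hypothetical sketch of what panel-generated sensory data can look like: each product scored on a handful of attributes. The attribute names, products, and numbers below are invented for illustration and are not real panel output.

```python
# Hypothetical sensory profiles: mean panel scores (0-100 intensity scale)
# for a few chocolate products. All names and values are illustrative only.
sensory_profiles = {
    "Chocolate A": {"sweetness": 62, "fruity_notes": 18, "hardness": 71, "mouth_coating": 45},
    "Chocolate B": {"sweetness": 48, "fruity_notes": 35, "hardness": 55, "mouth_coating": 60},
    "Chocolate C": {"sweetness": 70, "fruity_notes": 10, "hardness": 40, "mouth_coating": 52},
}

# A simple comparison: which attribute separates two products the most?
def biggest_difference(p1: str, p2: str) -> str:
    diffs = {
        attr: abs(sensory_profiles[p1][attr] - sensory_profiles[p2][attr])
        for attr in sensory_profiles[p1]
    }
    return max(diffs, key=diffs.get)

print(biggest_difference("Chocolate A", "Chocolate C"))  # "hardness" in this toy data
```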

Introducing Sensory Portrait

From this description, you can perhaps appreciate that this type of research can be complex and potentially time-consuming, but this is where technology enablement and AI come in! At Product Hub, powered by MMR, we have been working on a groundbreaking sensory tool that promises to transform the landscape of sensory research: Sensory Portrait. This tool is designed to provide brands with rapid, objective sensory data, harnessing the power of AI to deliver quantitative diagnostics on the 12 to 15 key sensory attributes that discriminate amongst the product set.

Sensory Portrait is a leap forward in redefining sensory research. By training a Large Language Model with our extensive sensory expertise, we've been able to streamline the process of rapidly capturing the rich language of sensory experiences, empowering trained sensory panels to refine and finalize lexicons more efficiently. This frees our panel leaders to spend more time on research design and on selecting those all-important key touchpoints. The AI technology gives them space to apply their sensory expertise while expediting data collection through AI-powered lexicon generation and sorting. The result is quick and accurate sensory testing sessions that can be used to understand products in more depth and get a better view of consumer drivers. Through this novel approach, we have slashed the timeline for generating sensory data from weeks to mere days or even hours, marking a significant advancement in the field.
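As a purely illustrative sketch (not the actual Sensory Portrait implementation), AI-assisted lexicon generation might look something like the following. Here `call_llm` is a placeholder for whichever language-model API is used, and the prompt, function names, and clean-up logic are invented for this example.

```python
# Illustrative only: a hypothetical helper that drafts candidate lexicon terms
# for a product category, which trained panelists would then refine and finalize.

def call_llm(prompt: str) -> str:
    # Stand-in for a real language-model API call.
    raise NotImplementedError("Replace with your language-model client of choice.")

def draft_lexicon(category: str, modality: str, n_terms: int = 15) -> list[str]:
    prompt = (
        f"List {n_terms} distinct {modality} attributes a trained sensory panel "
        f"might use to describe {category} products. Return one term per line."
    )
    response = call_llm(prompt)
    # Keep non-empty lines, trimmed and deduplicated while preserving order.
    seen, terms = set(), []
    for line in response.splitlines():
        term = line.strip(" -•\t").lower()
        if term and term not in seen:
            seen.add(term)
            terms.append(term)
    return terms[:n_terms]

# Example usage (the trained panel, not the model, finalizes the lexicon):
# candidate_terms = draft_lexicon("dark chocolate", "texture")
```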

Revolutionizing Product Development

However, the primary driver wasn't just to make sensory research faster; it was to open up the world of sensory to more consumer testing projects. We know from 25+ years of conducting preference-mapping studies that having data from a trained panel alongside consumer data ALWAYS leads to more accurate sensory drivers and more actionable optimization guidance. But not every brief warrants this type of design, and not every budget can stretch to it! We therefore wanted a method that reduced the barriers to running sensory alongside consumer testing.

With as few as 4 products and 100 consumers, we can combine sensory data with consumer liking data in our Sensory Portrait bolt-on. This tool allows you to identify whether each of the discriminating attributes the panel has identified is having an overall positive or negative impact on consumer liking, and the proportion of the sample for which it shows that trend, i.e. which things you should prioritize increasing or decreasing to satisfy the majority. This is a substantial piece of information to have: you not only understand how your product is different and how much it appeals to consumers, but also which objective sensory attributes drive that consumer preference.

The Sensory Portrait analysis uses respondent-level data to ensure that personal preferences are taken into account and that we aren't chasing some elusive "average" consumer, but it doesn't involve segmentation or modelling and therefore doesn't require the large base sizes or product numbers associated with preference-mapping approaches. Our analysis shows this method predicts the attributes identified through sensory driver approaches better than simple correlations, which are often the default when the base size and product set are too small for formal modelling. Sensory Portrait, integrated with consumer data in an accessible and intuitive platform, offers a more agile solution for linking sensory and consumer data, democratizing access to sensory insights for product optimization.
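To illustrate the general idea (a simplified sketch, not the actual Sensory Portrait analysis), a respondent-level approach might ask, for each attribute, how many individual consumers show a positive versus negative relationship between that attribute's panel intensity and their own liking scores across the products. The data, thresholds, and rule below are invented for illustration.

```python
# Simplified illustration of a respondent-level directional analysis.
# For each respondent, check whether liking moves in the same direction as an
# attribute's panel intensity across products, then report the share of
# respondents showing a positive vs negative trend. All data are made up.
import numpy as np

products = ["A", "B", "C", "D"]
sweetness = np.array([62, 48, 70, 55])   # panel intensity per product
liking = np.array([                      # rows = respondents, cols = products
    [7, 5, 8, 6],
    [4, 6, 3, 5],
    [6, 6, 7, 6],
])

def trend_direction(attribute: np.ndarray, respondent_liking: np.ndarray) -> int:
    """Return +1 if liking rises with the attribute, -1 if it falls, 0 if flat."""
    r = np.corrcoef(attribute, respondent_liking)[0, 1]
    if np.isnan(r) or abs(r) < 1e-9:
        return 0
    return 1 if r > 0 else -1

directions = [trend_direction(sweetness, row) for row in liking]
positive = sum(d == 1 for d in directions) / len(directions)
negative = sum(d == -1 for d in directions) / len(directions)
print(f"sweetness: positive for {positive:.0%} of respondents, negative for {negative:.0%}")
```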

This approach perfectly embodies the spirit of Product Hub: deep scientific understanding, expert standardization, and using tech (AI) as a team member. Sensory research has not changed much in the past 20 years, and in the few cases where technology has been applied in the domain, the attempts, largely unsuccessful, were to replace people and their expertise with technology. We have taken a different approach. Treating AI and technology as a member of our team, we are using it to amplify the expertise of our sensory teams, aiming to enhance the quality, speed, and user experience of conducting sensory studies to inform and support product development processes.

As we launch Sensory Portrait, we invite you to join us on this journey. Together, we’ll unlock the secrets of sensory experiences, providing a deeper understanding of consumer preferences and revolutionizing product development. Welcome to the future of sensory research — expert, standardized, tech-enabled, and accessible.