Reflections From ESOMAR Reimagine 2025

From projects to knowledge platforms and the future of synthetic data, this year’s ESOMAR made one thing clear: the future of insights lies in systems, not snapshots. Brands are no longer satisfied with research projects that end in a slide deck and one-off insights. They’re demanding platforms that embed insights into the day-to-day of their organizations, ensuring that data is not only collected, but continuously applied.

Alex Dobromir
06 Oct 2025 · 4 min read

Platforms as the New Differentiator 

One of the strongest signals was how leading brands are using platforms to scale insights. These are not passive dashboards. They are living ecosystems that clients return to repeatedly, each time deriving new value. 

What makes them powerful is their ability to: 

  • Integrate across the innovation funnel, from early ideas to finished campaigns 
  • Keep data accessible long after a project has ended 
  • Provide tools that help users interpret information instantly, increasingly powered by AI 

Across the competitive landscape, platforms are now the real differentiator. They create the "stickiness" that secures long-term partnerships rather than one-off projects. And this applies across categories and stakeholders, from Ikea exploring storage solutions to Sony Music exploring how different artists relate to different people's psychological and cognitive profiles.

 

Accessibility and Adoption 

But technology alone doesn't create impact. As Joel from Asahi pointed out, the critical factor is internal adoption. Platforms only succeed when stakeholders across different levels of the business understand both the benefits and their own part in using them.

Elaine Rodrigo (Reckitt) showed concrete figures on how embedding insights across every stage of development led to measurable results: a 70% increase in quality and a doubling of speed. More importantly, she emphasized choosing the right partners to embark on the journey with.

Tony Costella (Heineken) wrapped up the conference by emphasizing that the aim isn't just to deliver polished reports to the boardroom, or to be heard once by the right stakeholders, but to build systems that support decision-making everywhere in the organization, whether or not the insights team is present, through true digital transformation. These systems should automate simple decisions, augment more complex low-risk decisions, and only support complex, high-risk decisions.

Together, these perspectives highlight that the winners will be those who make insights accessible, actionable, and indispensable across all layers of their organization. 

 

Synthetic Data and the Future of Personas & Twins 

A second theme was the shift from describing "what was" to anticipating "what's next." Here, synthetic data, digital twins, and personas are becoming central.

Google described three ways of using synthetic data:

  1. Augmenting niche samples - filling in gaps where real-world reach is limited. 
  2. Prediction - modeling how audiences might respond to new scenarios. 
  3. Personas and digital twins - enabling brands to test concepts repeatedly with synthetic models that behave like real segments. 
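To make the first of these uses concrete, here is a minimal, illustrative sketch of "boosting" a niche sample. It uses naive bootstrap resampling purely for illustration; production approaches described at the conference would instead condition a generative model on respondent profiles. All names and data in the snippet are hypothetical.

```python
import random

def boost_segment(rows, target_n, seed=7):
    """Bootstrap-resample a small segment up to target_n rows.

    Illustrative only: this simply redraws existing respondents with
    replacement, the crudest form of data boosting. Real synthetic-data
    pipelines would generate new, profile-conditioned records instead.
    """
    rng = random.Random(seed)
    boosted = list(rows)
    while len(boosted) < target_n:
        boosted.append(rng.choice(rows))
    return boosted

# Hypothetical example: only 12 respondents in a niche segment answered,
# but downstream modeling wants 100 rows.
niche = [{"segment": "rural Gen Z", "intent_to_buy": i % 5} for i in range(12)]
augmented = boost_segment(niche, target_n=100)
print(len(augmented))  # 100
```

The design point the sketch makes is that any boosting method, however sophisticated, can only amplify the signal already present in the niche sample, which is why the risk discussion later in this piece matters.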

This progression moves research from static insight to dynamic foresight. Instead of waiting weeks to see how a concept performs in the field, organizations can simulate reactions instantly, refine ideas, and only then invest in real-world testing. 

However, Heineken reminded everyone to exercise caution and suggested that different levels of synthetic data come with different levels of risk: 

Step 0: Personas that describe what happened in the dataset -> No risk 
Step 1: Data imputation -> Low risk 
Step 2: Data boosting -> Low risk 
Step 3: Digital twins -> High risk 
Step 4: Fully synthetic respondents -> Unacceptable risk   

Panoplai also delivered a great digital twins workshop, showing that only continuous, thorough testing can provide high confidence levels for Step 3 and perhaps, at times, Step 4. However, they warned that most companies do not conduct enough parallel testing, and do not spend enough time on data cleaning and selection, to even understand their models' limitations.
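As one way to picture what "parallel testing" could score, here is a hedged sketch that compares the answer distribution of a human panel against a synthetic one using total variation distance. This metric choice is the author's illustration, not a method any vendor at the conference endorsed, and the data is invented.

```python
from collections import Counter

def tv_distance(human_answers, synthetic_answers):
    """Total variation distance between two categorical answer
    distributions: 0.0 means identical shares, 1.0 means fully
    disjoint. One simple per-question score for a parallel test
    that runs the same survey on humans and on synthetic twins.
    """
    h = Counter(human_answers)
    s = Counter(synthetic_answers)
    n_h, n_s = len(human_answers), len(synthetic_answers)
    categories = set(h) | set(s)
    return 0.5 * sum(abs(h[c] / n_h - s[c] / n_s) for c in categories)

# Hypothetical parallel test on one yes/no question.
human = ["yes"] * 60 + ["no"] * 40
synthetic = ["yes"] * 55 + ["no"] * 45
print(round(tv_distance(human, synthetic), 2))  # 0.05
```

A real parallel-testing program would run this kind of comparison across many questions, segments, and repeated waves, since a single low score on one question says little about a twin's overall reliability.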

While this topic shows great promise, and the industry is rapidly progressing in this direction, it appears that there are still some issues to address, mainly around adoption and education. 

 

Implications for Product Hub 

For us at Product Hub, the implications are clear and the direction is set. Within the next year, we are focusing on:

  • Building platforms that are more than repositories: living ecosystems clients continually return to. 
  • Designing for adoption, ensuring the value of insights is visible to everyone, from frontline teams to senior leadership. 
  • Expanding the offering into AI augmentation and twins, enabling brands to test, predict, and simulate at scale. 

Ultimately, the goal is to truly 'extend the value of the underlying data and put it in the relevant context', helping clients make the best product decisions.

The line between agency and technology provider is already blurring, and the brands moving fastest are those turning insights into systems of intelligence. ESOMAR 2025 made it evident: the future isn’t just about providing answers. It’s about designing platforms that help organizations think better, faster, and smarter every single day.