The importance of human research in an AI-obsessed world

By Anna Scandella
Director of User Experience, Pocketworks
February 20, 2025

Our Director of User Experience, Anna Scandella, recently spoke at the GreenTech Gathering in Leeds about the drawbacks of using AI to make User Experience decisions. The following article is a summary of her talk.

Artificial intelligence is transforming the User Experience (UX) landscape, and as researchers it is important that we learn to work with new technologies. Whilst AI tools can be incredibly helpful with certain tasks, understanding their limitations is key.

To ensure we are designing products and services that effectively meet the needs of all audiences, we have to gather robust data that captures a wide range of opinions and experiences.

UX is complex because people are complex. Real users interact unpredictably with digital tools. The challenge is not just understanding what users do, but why they do it. Despite its sophistication, AI struggles with this critical aspect.

Why Relying on AI for UX Research Is a Problem

Shortcutting the research process by using AI is a real temptation, especially when you consider the top three reasons that research gets overlooked in any project:

  • User research is too expensive
  • We don’t have time for user research, we need to launch quickly
  • We know our users well enough already

AI can feel like a perfect solution: an infallible tool that will speed up decision-making and help you jump straight into developing the next shiny feature. However, this is a dangerous approach, and one that is likely to result in you prioritising the wrong things and wasting valuable time and money.

The main reason to conduct user research is to be surprised. You don’t want to recruit “average humans”, and an average is all an AI model can really give you; the surprises come from real people.

The Assumption vs The Reality

Take the example of one of our clients, Carbs & Cals, a diabetes management app.

As part of our ongoing product and roadmap development, we’ve recently been exploring new features that will encourage users to log and track their food intake.

We wanted to test the difference between AI-generated predictions and researching with actual users. To do this, we built and trained a custom GPT on our key target audience profile.
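
For context, the custom GPT itself was built in ChatGPT, but the same idea can be approximated programmatically. The sketch below uses the OpenAI Python SDK with an invented persona prompt, model name, and question; it illustrates the general setup, not our actual configuration.

```python
# Rough sketch of persona-conditioned prediction via the OpenAI Python SDK.
# The persona text, model name, and question are illustrative only.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

persona = (
    "You are simulating the core audience of a diabetes management app: "
    "adults managing type 2 diabetes who are asked to log the food they "
    "eat each day."  # hypothetical audience profile, not our real one
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": persona},
        {
            "role": "user",
            "content": "What percentage of these users would actively engage "
                       "with logging the food they have consumed? Answer with "
                       "a single percentage and a short rationale.",
        },
    ],
)

print(response.choices[0].message.content)
```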

We asked the AI model to predict what percentage of users would actively engage with logging the food they had consumed. The model predicted that 29% would, when our product analytics show that only 18% of this core audience actually do.

That 11-percentage-point gap was significant enough to reveal how easily AI models can misread behavioural intent.
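
One way to make “significant enough” concrete is to put a confidence interval around the observed rate and check whether the prediction falls inside it. The sketch below does exactly that; the 29% and 18% figures are the ones above, while the analytics sample size is hypothetical.

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Normal-approximation 95% confidence interval for an observed proportion."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p - margin, p + margin

# Rates from the example above; the analytics sample size is hypothetical.
predicted_rate = 0.29      # the custom GPT's prediction
observed_users = 2_000     # hypothetical number of users in the analytics sample
observed_loggers = 360     # 18% of 2,000 who actively log food

low, high = proportion_ci(observed_loggers, observed_users)
print(f"Observed logging rate: {observed_loggers / observed_users:.1%} "
      f"(95% CI {low:.1%} to {high:.1%})")
print("AI prediction inside 95% CI:", low <= predicted_rate <= high)
```

At any realistic sample size the interval around 18% is narrow, and the 29% prediction sits well outside it: the kind of gap that should trigger a conversation with real users rather than a roadmap decision.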

Why AI Falls Short in UX Research

AI is highly effective at recognising patterns in large datasets, making it invaluable for tasks such as processing medical records or predicting disease progression. However, its limitations become more apparent when it comes to human behaviour.

One of the core issues is the training data. Many AI models are built using historical datasets that fail to represent diverse populations. For example, women have been significantly underrepresented in clinical trials for decades. Research shows that only 33% of participants in cardiovascular trials are female, despite heart disease being a leading cause of death among women.

When AI models misinterpret user behaviour, they reinforce biases and drive flawed product decisions that could have serious consequences for patient care.

Gartner predicts that by 2026, AI models from organisations that operationalise AI transparency, trust, and security will achieve a 50% improvement in terms of adoption, business goals, and user acceptance.

AI’s Role in User Research

This doesn’t mean AI has no place in UX research. It can be a valuable tool for drafting surveys, identifying common themes, and analysing large datasets. It speeds up the research process and can highlight patterns that might otherwise go unnoticed. However, AI should never replace direct user research.
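
As a flavour of that kind of support, here is a minimal, hypothetical sketch of theme-spotting with TF-IDF and k-means (the feedback snippets and the number of clusters are invented). The grouping is the easy part; deciding what the themes mean, and why they matter, is still a researcher’s job.

```python
# Clustering open-text feedback into rough themes; snippets are invented.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

feedback = [
    "Logging every meal takes too long",
    "I forget to log when I'm out with friends",
    "The barcode scanner saves me loads of time",
    "Too many taps just to record breakfast",
    "Scanning packets is the only reason I keep logging",
    "I stop logging at weekends and feel guilty",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(feedback)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster in sorted(set(labels)):
    print(f"Theme {cluster}:")
    for text, label in zip(feedback, labels):
        if label == cluster:
            print("  -", text)
```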

In healthcare, user behaviour is often influenced by deeply personal factors that AI cannot predict. A person may avoid logging food because of guilt or anxiety around eating habits, not because they dislike the idea. AI is unable to interpret these emotions accurately.

AI often struggles with sentiment analysis, frequently misreading sarcasm. A user might say, “This app is great at suggesting the wrong foods”, and an AI might register that as positive feedback (this actually happened to us when evaluating the sentiment of an app review). A human researcher would instantly recognise the sarcasm.
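
To illustrate, here is a minimal sketch using NLTK’s off-the-shelf VADER analyser (just one example of a lexicon-based scorer, not necessarily what any given team uses). Because the scorer weighs individual words and has no model of sarcasm, the word “great” drags the score towards positive.

```python
# A lexicon-based scorer has no concept of sarcasm; "great" outweighs "wrong".
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off download of the word lexicon

sia = SentimentIntensityAnalyzer()
review = "This app is great at suggesting the wrong foods"

# 'compound' runs from -1 (most negative) to +1 (most positive); the positive
# weight of "great" pulls it upwards even though the reviewer is clearly unhappy.
print(sia.polarity_scores(review))
```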

The Ethical and Sustainable Cost of AI Bias

Flawed AI models also present ethical and sustainability concerns. A product built on incorrect assumptions will eventually require rework. That means collecting new data, retraining models, and revising entire product strategies. The financial and environmental impact of this is concerning.

Training large AI models consumes enormous amounts of energy; training a single large model has been estimated to generate carbon emissions comparable to the lifetime emissions of five cars. Rebuilding models due to bias or flaws multiplies that cost, making AI inefficiency both a sustainability issue and an ethical one.

Responsible AI development minimises waste and ensures effective technology from the start.

A Hybrid Approach: AI and Human Expertise

Effective UX requires AI and human insight to work together, with AI as a tool, not a decision-maker. Key principles for AI-supported UX research:

  1. Diversify Training Data: Ensure datasets represent diverse demographics.
  2. Validate with Real Users: Test AI insights against actual behaviour before major decisions.
  3. Use AI for Patterns, Not Interpretation: AI detects trends; humans analyse causes.
  4. Monitor Bias Continuously: Regularly review models and their training data to catch bias early (a simple representation check is sketched after this list).
  5. Prioritise Ethical Development: Consider AI’s sustainability impact to minimise waste.
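
To ground principles 1 and 4, here is a minimal sketch of a recurring representation check. The field, groups, and benchmark figures are all hypothetical; the point is that the check runs every time the dataset changes, not once at kick-off.

```python
# Hypothetical representation check for a research or training dataset.
from collections import Counter

# e.g. the gender recorded for each participant (invented sample)
participants = ["female", "male", "male", "female", "male",
                "male", "male", "female", "male", "male"]

# Rough benchmark for the population the product serves (hypothetical figures).
benchmark = {"female": 0.52, "male": 0.48}

counts = Counter(participants)
total = sum(counts.values())

for group, target in benchmark.items():
    share = counts.get(group, 0) / total
    flag = "UNDER-REPRESENTED" if share < 0.8 * target else "ok"
    print(f"{group}: {share:.0%} of sample vs {target:.0%} benchmark -> {flag}")
```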

The Future of AI in UX

AI is a game-changer in user experience design, but it’s not a silver bullet. It has the potential to streamline research, identify trends, and process vast amounts of data. But it cannot replace real user insights. When AI predictions misalign with actual behaviour, the consequences go beyond engagement metrics. They affect real outcomes.

The best approach is a balanced one. AI should enhance, not replace, human research. By combining AI’s efficiency with human expertise, we can design experiences that are not only smart but truly user-centred.

Final Thoughts

Technology is advancing rapidly, but human behaviour remains as complex as ever. The real challenge in UX is building AI-driven solutions that align with real needs. By recognising AI’s limitations and using it responsibly, we can create digital products that are effective, ethical and sustainable.

