
“Give me less analysis, make me do more work”

Said no-one. Ever. (To me anyway.)

That’s why I don’t believe that clients are suddenly finding themselves with more time, or have a desire to do all of the analysis themselves. I believe they want opinions and recommendations based on high quality, truthful insights from their peers. 

Yet there is this constant drive towards DIY platforms, the automation of everything, and the rise of tech companies who are out to replace research people with so-called AI. It worries me.

What is the goal of doing research? 

Well, there are many reasons, but generally it’s to:

  • Help make a decision. 
  • Help make a company become more successful. 
  • Give a go/no go on a product. 
  • See what changes could be made to a product to help sell more of it, etc. 

The list goes on, but the general principle is to solve a business problem. 

What I fail to see is how those needs can be adequately satisfied by platforms that create more work for their commissioners, topped with “insights” suggested by machines, summed up most commonly with a near useless word-cloud.

It’s all wrong. 

I truly believe that the gold at the end of the research rainbow is where you have a combination of people and machines. The best of people, the best of tech. 

Where do we begin? 

I think we can all agree that we need to start with analyses that are based on truths. 

Not a diluted version of the truth. Not a claimed version of the truth. Not what truth someone thinks you want to hear. The truth – and nothing but the truth. 

How do we get there? 

Well, this is the hard bit. 

To get to the truth you need to create/allow the conditions best suited to obtaining veritas. But there are a lot of things that might stop you from achieving that state of openness, including, but not limited to:

  • Asking questions in an unnatural environment. 
  • Asking the same people the same questions again and again. 
  • Asking poor questions.
  • Actually, just asking questions.

Getting to the truth in an efficient manner is about, initially, lowering our levels of interference. Observe then ask (more on that here, if you’re interested).

And once we’re there?

Once we have semblances of truth, we can turn to the respective strengths of organic and metallic circuits in order to make the most of the effort (and patience and bravery) that we put in to gather it. 

To get good quality analysis, with the (current) utmost efficiency, here’s where I think the relative strengths lie…

People – Deciding on a piece of research

When a business problem is being discussed it often isn’t a binary question. Interaction with stakeholders will drive questions around approach, around design, around recruitment. It will call upon years of experience and knowledge. It will help you design a path to getting you to the answers you seek. 

This is why agencies exist: to help people set up the best possible research approach; to get the best chance of arriving at truths and answers. Machines cannot remotely hope to do this (your job is safe). 

People & tech – Recruitment

Where technology can excel in recruitment is in the systems built to find the right participants, to avoid overusing others. To rate them, pay them, structure their data – all of which massively aids subsequent analysis. 

But you still need people to sense-check the machine and set up the questions in a way that removes people who aren’t doing what they say they’re doing.

People – Briefing respondents

Get people to record with no researcher present. In the environment they would normally be in. With the people who would normally be around. It will make them more comfortable and more likely to show you or tell you the truth. 

That’s why bespoke guidance for the respondents needs to be given: filming guidelines, questions/activities to do, occasions to record. This then needs to be tested and monitored to see if it all works, if it will lead to the answers you seek. People need to do this. To judge this. 

We have to give respondents the best chance to show us (or tell us) the truth. Machines can’t tease that out.

People – Managing respondents

In research, the respondent reigns supreme. 

In the work Watch Me Think does, for example, it is crucial that there is a link between the project manager and the respondent. Why? Well, it increases the quality of response as someone is there to explain how a video will be used, what is needed from the participant, how they should film, even their privacy rights etc. It increases the quality of what you get back. 


Because it increases trust. Trust means more honesty. Understanding means better quality. Machines can’t build that. Ever tried building empathy with a chatbot? 

Tech – Getting back the data

Smartphones. 360 cameras. GoPros. Technology is used so that people can record themselves doing what we’ve been brave enough to ask them to do. 

These videos are then uploaded via apps that compress them and assign metadata, alerting a Project Manager that they’ve arrived and delivering them into a production/QC environment. 

Done. With only a few touches of buttons.

People & tech – Quality checking

Each upload is quality checked by people, rated using the tech (the tech can even be used to check for certain aspects of quality, e.g. sound) and feedback is given via the system. 
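To make the sound-check idea concrete, here’s a minimal sketch of the kind of automated quality check the tech could run – flagging uploads whose audio is near silent so a person knows to look at them first. The threshold and function names are illustrative assumptions, not a description of any real QC system.

```python
import math

# Assumed threshold on normalised audio samples (range -1.0 to 1.0);
# a real system would tune this against known-good uploads.
SILENCE_RMS_THRESHOLD = 0.01

def flag_quiet_audio(samples, threshold=SILENCE_RMS_THRESHOLD):
    """Flag an upload whose audio is near silent (RMS below threshold)."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms < threshold

# A silent clip gets flagged; a clip with normal speech levels does not.
quiet = flag_quiet_audio([0.0] * 100)        # True
audible = flag_quiet_audio([0.5, -0.5] * 50) # False
```

The machine does the tireless part (checking every upload); the person still decides what “good enough” means and handles the edge cases.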

People – Transcriptions

The “best” automated transcription service has a 90% success rate. Which means 1 in 10 words is wrong – 1 in 10 words that a machine could analyse and use to give the client the wrong indication, the wrong inference, and point them in the wrong direction. 

Machine translation is even worse. Often gobbledegook at best. 

Only a human can transcribe the nuances of unrehearsed, natural speech. Plus a human transcriber can also add in actions that are not visible in speech alone (although machines are getting close to being able to do that).

People – Behavioural analysis

The icing on the cake. You need real people to watch the videos and assess what they are hearing based on their experience, their knowledge, their training. Real people to look at the behaviours and words to uncover the insights and observations.

If you’re happy with a word cloud as the sum analysis of a study, then you’re probably not that interested in research. 

People & tech – Word grouping

Using a synonym library built by humans, a machine can then automatically pull out key aspects from transcription texts and display them to the client, grouped along navigable themes. 

Want to see everything that people have said on pack design? Easy. But it’s all based on accurate, human transcriptions and human generated language connections. 
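As a rough illustration of how a human-built synonym library can drive that grouping, here’s a minimal sketch. The theme names and synonym lists are invented for the example, not anyone’s actual library; real transcripts would also need proper sentence splitting rather than a naive split on full stops.

```python
# Human-curated synonym library: theme -> words that signal it.
# Themes and synonyms here are illustrative assumptions only.
SYNONYM_LIBRARY = {
    "pack design": ["pack", "packaging", "label", "bottle", "box"],
    "taste": ["taste", "flavour", "sweet", "bitter"],
    "price": ["price", "cost", "expensive", "cheap", "value"],
}

def group_by_theme(transcripts):
    """Return each theme with the transcript sentences that mention it."""
    themes = {theme: [] for theme in SYNONYM_LIBRARY}
    for text in transcripts:
        for sentence in text.split("."):
            words = sentence.lower().split()
            for theme, synonyms in SYNONYM_LIBRARY.items():
                if any(syn in words for syn in synonyms):
                    themes[theme].append(sentence.strip())
    return themes

transcripts = [
    "I love the new label. It feels expensive though.",
    "The flavour is too sweet for me",
]
grouped = group_by_theme(transcripts)
```

The machine does the sifting, but both inputs are human: accurate transcriptions and the language connections in the library itself.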

People & tech – Storyboarding

Researchers can also create storyboards. These can be passed to a video editor who will make what you’ve collected tell the story. 

Of course you, as a person, can also create your own reels, using on or offline tech editing tools. But give this task to a machine? Here’s what one did in trying to create a storyboard for Burger King:


Scriptwriters of the world, you can rest easy. 

Respect your work

Technology should be used to make certain elements of research more efficient, not to cut corners or make insights worse. 

Do not downgrade the human element. 

Just because something can be automated doesn’t mean it should be. Think Mickey Mouse and his spell on the mop and bucket.

We all want high quality, truthful insights. The combination of people and technology is where, I believe, the future of research lives. 

Not simply SaaS. But TaPas (Technology and People as a service). Or whatever you want to call it.

So when people start talking about AI, or automation, or DIY, ask them what their objective is. If it’s just faster and cheaper, then ask yourself what the real point is. 

If you’re serious about getting truthful quality insights, if you consider your time precious and not to be wasted, look at what machines do best, look at what people do best, and look for where and how they best intersect.