The Founders and Leaders Series

Episode 14: Lewis Reeves, Walr

Mike Stevens Season 1 Episode 14


Duration: 34:07

Episode Overview

Lewis Reeves, founder and CEO of Walr, on building a data infrastructure platform that powers market research agencies at scale.

Episode Highlights

  • Agentic workflows in practice: Walr has built over one hundred AI agents; one example is a two-agent, one-human pipeline for survey programming where one agent builds the survey, a second verifies it, and a human reviews only the flagged exceptions. Project volumes managed per person have roughly doubled in six months.
  • Democratising AI beyond engineering: Over half of Walr's 100+ internal agents were built by operations staff, not the product or engineering team, a sign that AI tooling has moved well beyond technical teams.
  • Synthetic data and digital twins: Lewis shares a measured view on synthetic audiences. Walr is running internal trials, but sees customer demand as still cautious, particularly where validation cycles are long, such as political polling.
  • Data quality reframed: Rather than treating data quality as a single issue, Lewis argues for breaking it into distinct categories (survey design, respondent experience, fraud, and inattention), each requiring different solutions.
  • Lessons from scaling: Lewis reflects on the challenges of rapid growth, the importance of following your instincts on change, maintaining close attention to cash flow, and the irony of a market research business that had to learn to listen more carefully to its own customers.

About the Guest

Lewis Reeves is the founder and CEO of Walr, an enterprise platform providing end-to-end online quantitative data collection — covering survey building, audience access across 100+ markets, and data structuring. He co-founded his first research business at 23, which became Savanta, before launching Walr at 28. Under his leadership, Walr has processed over 1.5 billion survey questions and appeared on multiple fast-growth business lists.

Learn more about the impact of technology and AI on research, insights & analytics at Insight Platforms.

Mike

Hello, everyone, and welcome to another episode of the Founders and Leaders series. My name's Mike. I am the founder of Insight Platforms, and I'll be your host. I'm joined today by Lewis Reeves, who is the founder and CEO of Walr. Lewis, welcome to the show. Great to have you here.

Lewis Reeves

Thanks for having me, Mike. Looking forward to this one.

Mike

Yeah, me too. And do you know what? I realized in the run-up to this that there's a part of me that's been a little bit envious of you over the years, because I started my first business in my early 40s, and at that stage I wished I'd done it a lot younger. Now, when did you first start your own business in the research space? How old were you, if I can ask that?

Lewis Reeves

Great question. Well, for the journey that became Savanta, I would have been twenty-three at the point we started, myself and three others. A super exciting journey that we had there. And then I started Walr on my own at the age of twenty-eight. But I would say, Mike, I'm quite envious of your hairline, so, you know, maybe there are some trade-offs that come with starting businesses young.

Mike

There you go. So tell us, what is Walr? Tell us a bit about the background. How did it come to be built, and what is it now?

Lewis Reeves

Yeah, totally. So Walr exists to power the world's most successful market researchers, and we do that by providing a full end-to-end solution for their entire enterprise online quantitative data collection needs, all in one place. Where that came from is spending my entire career working with market research agencies, and buying quite a few as well on the Savanta journey. I was able to see what yielded success in research agencies.

Mike

Okay.

Lewis Reeves

The businesses that spent the most time understanding their brands and their customers, and focusing their energy there, grew the fastest. Those that spent their time trying to plow through some quite fragmented supply chains struggled to build any momentum. And so the vision behind Walr was actually: how do we take that pain away from market research businesses? How do we give them the technology and the solutions so that what's become a very fragmented landscape can be a lot more seamless? So that on a Sunday night, a researcher is not worried about which quotas are filling; they're worried about how excited they're going to get their brand on a Monday morning with the revolutionary new insights they're going to provide. That's been the thesis behind Walr since day one. The proposition's evolved a lot, but that is really what sits at our heart: how do we drive the success of market researchers?

Mike

Yeah. Okay. So you are effectively a data operations and analytics engine. There are multiple layers: the agencies or the researchers that sit on top of that will have the expertise in, like you say, innovation, brand strategy, whatever it is, but you're the partner that's delivering all of the infrastructure underneath. Is that right?

Lewis Reeves

Absolutely. And we stay very clearly in our lane.

Mike

Yep.

Lewis Reeves

We provide data, and our customers provide insight. And I think that's a really important delineation.

Mike

Yeah. Okay. That's clear. So there's obviously a lot going on in the industry broadly. How is your business changing with all of this?

Lewis Reeves

Yeah, it's evolving a huge amount. AI has been fantastic for our organization. We say in our business that market researchers never complain about getting their data faster, so long as all of the critical hygiene factors, such as data quality, are met. And I think what's amazing, certainly with our use of agents and as we've built an agentic layer on top of our platform, is just the speed. Tasks that historically took multiple days can now be delivered in seconds. Projects can be managed at a much more granular level, in much higher volumes, constantly, twenty-four seven. So our ability to deliver consistency, speed, and confidence to our clients is greater than it's ever been. Which is really exciting: improving your product using technology, and using, if anything, less manpower, is always the dream as an entrepreneur.

Mike

Yeah. Okay. Well, let's unpack that, because "agentic" is probably the buzzword of the year, and a lot of people will hear it and hear different things. What does it mean in the context of your business, and why is it a game changer? Can you put some flesh on the bones?

Lewis Reeves

Sure, absolutely. So Walr's business is really in two layers. We've got our platform layer, which is an end-to-end platform allowing the building of surveys to as great an enterprise and complexity level as you can dream of; the access of audiences, delivering in over a hundred markets, twenty million completes a year; and then the structuring of data. The platform allows you to do all of that. One of the interesting learnings on the Walr journey is actually how many of our customers want the efficiency but want us to push the buttons for them. They don't necessarily want to learn another platform; they just want their data accurately and quickly. So the agentic layer has really been about how we get to the finish line faster and more accurately: taking tasks that humans historically did in that operational layer and using agents instead. Very simple: building surveys, managing and running audiences and varying incentives, and then structuring data. But also, on the data side, allowing clients access themselves. So if they want to run their own model over the raw data we create for them, we allow them to hook up their LLM and they can have at it. They can compare all the datasets that exist within our platform, all in one place. So really it's about operational efficiency and operational scale.

Mike

Okay. So you talked about the speed benefits of agentic approaches, but you also talked about increased accuracy. Now, some people will see a bit of a trade-off here, or they've maybe been bitten by trying to implement something as an agent and it's gone rogue, or it's hallucinated or made decisions it shouldn't have. Can you talk a little bit about the accuracy side of things? How do you balance that tension between obviously wanting the speed benefits and getting it all right?

Lewis Reeves

Yeah. So look, I think the first principle is building our own agents on top of our own proprietary platform; that way you've got a lot more control. Essentially, if I look at survey programming, we've got three layers: two agentic, one human. The first layer is the agent that takes the questionnaire and transforms it into the script that is ingested into our survey-building platform. The second agent is a verification agent; essentially, it's marking the homework of the first agent, asking, "How confident am I in the delivery of going from point A to point B?" The outcome of the second agent is to alert the human being to where they need to spend their time. So I think what this allows us to do is have the same level of human oversight, the same level of trust within that process, just much sped up. If there are any elements within that survey that are relatively complex, and the agents aren't quite perfect just yet, the human is able to get straight to them; they don't need to go through forty questions before getting to the three they need to assist on. They can go directly to that point. What I would say to people who maybe have a slight concern around agents is: if you set up an agent correctly, and think about this slightly more from a pure automation perspective, rather than letting an agent run free using its own mind, the task is repetitive. And we human beings, when it comes to repetitive tasks, are not always perfect; we certainly do make mistakes. So if you correctly align your agent and your automation task, it is going to do the same thing each and every time. Get it right the first time, and those benefits flow through. But at Walr, we are huge believers in the value of a human being, and closing the loop with a human is critical to knowing we are putting the right product out for our customers.
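The two-agent, one-human pipeline described above can be sketched roughly as follows. This is a hedged illustration only, not Walr's actual system: the names (`build_survey`, `verify_survey`), the confidence threshold, and the keyword heuristic standing in for the LLM agents are all invented for the example.

```python
# Minimal sketch of a build-agent / verify-agent / human-review pipeline.
# In the real system both agents would be LLM-backed; here agent 1 is
# stubbed with a deterministic confidence score so the flow is visible.

from dataclasses import dataclass


@dataclass
class ScriptedQuestion:
    text: str
    confidence: float  # verifier's confidence the script matches the source


def build_survey(source_questions):
    """Agent 1: turn questionnaire text into scripted questions.
    Stub: questions with complex-logic keywords get lower confidence."""
    survey = []
    for q in source_questions:
        conf = 0.70 if any(k in q.lower() for k in ("loop", "pipe", "mask")) else 0.99
        survey.append(ScriptedQuestion(text=q, confidence=conf))
    return survey


def verify_survey(survey, threshold=0.95):
    """Agent 2: 'mark the homework' of agent 1, flagging low-confidence items."""
    return [q for q in survey if q.confidence < threshold]


def run_pipeline(source_questions):
    """The human reviews only the flagged exceptions, not the whole survey."""
    survey = build_survey(source_questions)
    return survey, verify_survey(survey)
```

The design point is that human workload scales with the number of exceptions, not the number of questions, which is how the same oversight survives a doubling of project volume.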

Mike

Okay. Really interesting. So it feels like there's a three-into-one in roles there. You've taken a survey scripter, that expertise, and agentified it. You've taken a reviewer and made that role an agentic solution. And then you've got the human who focuses on the shortlist of outputs that come out of that. So presumably, per team member, you're starting to amplify efficiencies at this stage, or is it too early to really see that?

Lewis Reeves

No, no, we totally are. When we look at project volumes managed by individuals, they've pretty much doubled in the last six months.

Mike

Right.

Lewis Reeves

It's not directly related to market research, but we wrote seventy percent more lines of code in Q1 compared to Q4. And those sorts of efficiency gains are not only true of our engineering team, but also across the operational board. That's the sort of benefit we're seeing.

Mike

Yeah. Okay. You see a lot of headlines about the impact the coding tools are having on, like you say, development and engineering teams, but it sounds like that's actually flowing through into operations as well. It's not just an academic thing.

Lewis Reeves

One hundred percent. And I think what's been amazing to see in our organization is how many people have been willing to pick up that baton and go and create tools. Yes, we've got three simple areas where we use agents, but we've probably got well over a hundred different agents for different use cases, and maybe less than fifty percent of them were actually built by our innovation team, which is our product and engineering business. So we've got people within our operations team who have taken it upon themselves to ask, "Can I automate this task for myself, to make myself more scalable?" And that's really exciting. It comes with a requirement for a certain amount of governance, to make sure that everything is behaving correctly and data is looked after properly. But it's exciting that we are now in an era where it's not just engineering teams that can move the dial from a technology perspective.

Mike

Yeah. I'm interested: your use of the word "automation" suggests that some of these are actually quite tightly defined, deterministic workflows, where obviously you don't have the same risk of creativity, hallucination, going off-piste. Maybe this is an unanswerable question, but I'm intrigued by the blending of the generative components of an agentic reasoning model with some of those deterministic rules, so that you get the best of both worlds. Have you got areas where you're able to blend those two things together?

Lewis Reeves

I think a lot of that comes through the training and learning. We're very fortunate at Walr that we've now asked in excess of one and a half billion questions through our platform. That means we've explored quite a few different question types. And when we train our agents on that data, it means that more than ninety-nine point nine percent of the time, they are seeing something they've seen before: something we've signed off, something the checking agent has verified. We re-educate in the event that we do the wrong thing from an agentic perspective. So actually it's more about that sort of memory recognition than taking on a task from scratch. And I think we're in a very fortunate component of the market research flow, where it's very clear what "right" is. We're not in the world of insight, where I think there is a lot more of a gray area around exactly what a great insight is, what the insight for this data is, what the relevant context is, and what my stakeholders want to know; there are so many more variables. Whereas: I have a Word document with questions on it, and I need an online survey that represents exactly the same questions in the structure the Word document intended. It's much easier to validate exactly what is required of that task.
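That "memory recognition" idea, that almost every incoming question type matches something already validated, could look something like this minimal sketch. The template store, names, and mappings are invented for illustration; they are not Walr's data model.

```python
# Illustrative only: a store of question types whose scripting has
# already been human-approved. Unseen types get a best-effort script
# and are flagged for review; once approved, they would be added to
# the store so the next occurrence is handled automatically.

VALIDATED_TEMPLATES = {
    "single_choice": "radio",
    "multi_choice": "checkbox",
    "grid": "matrix",
    "open_end": "textbox",
}


def script_question(question_type):
    template = VALIDATED_TEMPLATES.get(question_type)
    if template is not None:
        # Seen before: reuse the validated mapping, no review needed.
        return {"widget": template, "needs_review": False}
    # Novel structure: route to a human for sign-off.
    return {"widget": "custom", "needs_review": True}
```

The appeal of framing it this way is that the review rate falls as the validated store grows, which matches the "99.9% seen before" figure quoted above.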

Mike

Yeah. Okay. It's fascinating how quickly the world is evolving. I don't know if you saw a few weeks ago that Kantar's Global Head of People is now Global Head of People and Agents, so you're having a kind of unified oversight of all of this. And the way that you're framing it, it does sound like there's a kind of side-by-side, hand-in-hand in the workflow process. Yeah.

Lewis Reeves

Mike, that's a great point, and what is so exciting to me is how this is really growing beyond technology.

Mike

Hmm.

Lewis Reeves

It's their human resources department that owns agents. It's the same for each of our departments: we have a lot of AI going on, and agentic workflows within our sales team covering how work gets booked in through to how it gets pushed into our systems. Those agents are run by those teams. It really allows us to expand and release the shackles of historic IT and technology restraints, which is very exciting.

Mike

Yeah. So I was just trying to do some quick maths in my head. I'm normally not very impressed by big numbers in isolation, but one and a half billion questions over, did you say, six years since you founded the business?

Lewis Reeves

Just over five, yeah.

Mike

Right. Okay. So three hundred million questions a year, six million questions a week. It's a fairly healthy clip. And you're right, that builds quite a substantial training body of expertise to go and build those agents on.

Lewis Reeves

Yes. And needless to say, Mike, we didn't do three hundred million questions in our first year, so we're running at a bit faster than that now. But look, I think what's absolutely key is how we keep our agents as relevant as possible, and training is key. We're very fortunate that, as a business that collects data, we create a huge amount of data. So that is a great resource for us to make those tools better and faster; particularly the sort of metadata that sits behind every type of question that we ask.

Mike

Yeah. Yep. Okay. We've talked a lot about the agentic changes coming to the industry, and we're obviously still in the early stages of that. It feels like that's almost the mega trend driving substantial change across all parts of it. But are there other trends that sit either beneath that or feed into it? Any other trends you see across the research and insight space that you think are noteworthy or are having an impact on your business?

Lewis Reeves

Totally. Look, a huge amount of innovation has gone on in the quant space over the last, gosh, twenty years since I've been in the industry, that's for sure. What I think is intriguing now is the amount of investment and innovation taking place in qual. Qual at scale has really caught fire in the last twelve to eighteen months. If anyone's been on the conference scene lately, there are a lot of new logos around, a lot of well-backed logos as well. And I think that's a fascinating space, to see how the two worlds collide, because qual at scale requires lots of people, lots of people paying attention, lots of people turning their cameras on. It's really intriguing to see how that mass of unstructured data can create value for brands. So an intriguing space, with lots of players today. It will be interesting to see what that market looks like in two or three years' time. But that's been fascinating.

Mike

Yep. Yep.

Lewis Reeves

I think the worlds are colliding. We're a business that provides a huge amount of audiences for participation in market research, and we've certainly seen those organizations demand audiences at scale all across the world. But it's not really an output and a product that we lean into ourselves.

Mike

Yeah. Okay. We see on Insight Platforms a lot of the startups, the innovation, the storytelling around it. My sense is that there's been about a quarter of a billion, maybe three hundred million dollars of investment flow into that particular category of AI moderation, unstructured interviews, qual at scale, whatever people call it. So there's obviously a lot of expectation there. The other part where there's been an explosion of new solutions is in the polarizing space of synthetic data: digital twins, AI personas; there's obviously lots of different language that hasn't quite settled down around that yet. So on the one hand, we're turning the researcher or the interviewer into an AI agent, and on the other, we're creating agents that represent the participants in research. How are you approaching synthetic? Are you starting to incorporate it? Are you treating it as one part of the workflow? How does that work for you at Walr?

Lewis Reeves

We've leant into it pretty heavily. I would say the main starting point has been more from a digital twins perspective, where we've done a lot more of our own internal research and preparation than customer demand has led us to be delivering. From a synthetic panel perspective, with some of the numbers we mentioned earlier, we're incredibly well placed to play in that market, and we are certainly running some of our own trials in that space. What I would once again say is that we've had some really great conversations with our customers, but we've not seen too many of them take that leap of faith just yet. And actually, when you've got the reach that a business like ours has got, the ability, thanks to our agentic layer, to very quickly generate high-quality online data is already there. But we always have to be agile as a business. You know, when I started this company during COVID, we certainly didn't have agentic layers going on; we had a little bit of automation. So maybe if we're talking in another five years' time, Mike, we might be predominantly providing synthetic audiences rather than human-led audiences. I think market research moves at its own speed, and I've seen many companies try to move it faster and struggle. Actually, I think it's about getting that balance right: being ready for the change, but not trying to force the market to change, because sometimes that can be a bit difficult.

Mike

Why do you think your customers have been reticent? What are some of the arguments you're hearing about why it's not appropriate, or why they don't want to go there?

Lewis Reeves

A lot of political polling, for example.

Mike

Right. Yep.

Lewis Reeves

I think that when accuracy of those opinions is so key, it feels like quite a leap of faith to move to a model with perhaps slightly less statistical accuracy. We all know that human beings can be flawed in their responses; does everyone tell us exactly who they're going to vote for in the next election? I'm not sure. But I think that feels like a bit of a leap of faith in that area. Certainly, clients with more stable, highly repetitive data sets have been more keen to explore, where there's a lot of training data available, we can really build in the accuracy, and we can very quickly test the accuracy. With an election, you only get to test every four or five years whether your model is correct, and if you get it wrong once, you're a decade out from being able to check back again. So I think there's a bit of our work where it's not so easy to measure so quickly; the work that is more measurable is probably where the demand seems highest.

Mike

Yeah. I think last week or the week before, Nate Silver came out strongly against using any synthetically generated data for polling purposes. I've yet to see the paper and the work behind it, but it's certainly contentious, and I think the settling out of the language around it, the terminology, the methodologies, and the validation has a long way to go, in that people don't quite know what they're describing. You

Lewis Reeves

Yeah.

Mike

know, you're talking about digital twins; some people would call the same thing a persona, while for others it's very much a one-to-one mapping. So it's not straightforward yet, I think, for people to get a handle on where it is appropriate, how it works, and when it should be avoided. So, yeah.

Lewis Reeves

One hundred percent, yeah. But look, as researchers we're very curious, and we have that sort of academic mindset.

Mike

Yeah.

Lewis Reeves

I'm certain we'll figure it out, and I'm certain we'll have some white papers to validate our approaches. We're just not quite there yet. I think it's a really exciting frontier, and one where I'm intrigued to see where we get to.

Mike

Yeah. You mentioned the pace of change, and that things may be different in a few years' time. This is always a hard question to frame, but if you could forecast out three years from now, what will have stayed the same, and what do you think will be different, in the ecosystem, in the way that research gets done, or in the types of businesses that are more or less successful at that point?

Lewis Reeves

Yeah. Look, I think the one thing a long tenure in market research teaches you is that things never change as quickly as we might all predict. I was there right at the start of the end of CATI and the beginning of online research, and actually we as a business still do some telephone interviewing; that world may not be growing rapidly, but it's still thriving and still successful. There's been big data, there's been social media data, there's been the move to mobile, and actually I still see far too many surveys that are not mobile-ready yet. So as much as we might predict a certain speed of change, and I do think AI is the most disruptive technology the world has ever seen, let alone market research, which will definitely inspire faster change, I do think that at the end of the day, market research is about really high-quality human opinions, and that will remain. The speed of getting to the answers has to get faster, and the organizations that facilitate getting to that answer faster will be the winners, the ones that continue to thrive and succeed. I do think some will fall by the wayside. Those that are not digitally enabled, that are not able to get to that answer faster, will fall over, because "I can get an immediate answer from my LLM" versus "I can wait eight weeks for a market research answer" will lose. Now, three days versus instant: maybe that is good enough right now. I think it needs to be three hours versus an instant response from an unvalidated methodology. So I think speed will be key, and speed requires digital infrastructure to execute upon. But I don't think we'll see a cliff; I think we'll see a steady decline of one and a steady increase of the other.

Mike

Yeah. It's very heartening to hear what you say about the agentic workflows, the creativity, what's happening around the core of data collection and connecting humans to businesses and organizations. Because an awful lot of the pitches and press releases that come out from some of the VCs who've invested in startups in the space are all about how research these days takes three months, and it's very slow, backward-looking, and obsessed with methodology. And the reality is that there's actually an awful lot of innovation at the frontier to do some of those implementations, like you say: the agentic workflows, the speeding things up, recognizing in-house, as well as within the agencies and supply chain, that speed is a primary driver of decision-making.

Lewis Reeves

I totally and utterly agree, but never, Mike, at the cost of respondent experience, and I do fear that we get further and further away from that component. You know, we've got a good half an hour into this conversation and have not spoken about data quality, which has obviously been very topical for the last few years. It's really intriguing when you've got billions of data points: you can understand the impact a poorly designed survey has on a respondent versus, you know, the sheer volume of, let's call them, bad actors coming into surveys, and the design piece has a much bigger impact. And this is where I'd love us to lean into AI and innovation: how do we create incredible experiences? We've seen the rest of the world change. Matt Damon came out speaking about Netflix very recently, saying, "You now have to describe what's going on in this film four times in the first twenty minutes to garner people's attention." Yet our surveys just aren't changing. I think there is so much we can do, with these new technologies we can deliver faster, to garner richer data, richer-quality responses, from the human beings who are prepared to give us their time. But we should never do that at the cost of the quality and the experience, which I think is so key to delivering the value in research.

Mike

You're right. Designed for the people who are actually half watching Netflix as they're completing the survey on

Lewis Reeves

Got it. Yeah, absolutely.

Mike

their phone. Yeah. A very similar point was made, in fact, I've heard it in multiple places, but a couple of weeks ago we had a webinar with Dig Insights and DQC, who were talking about this kind of industry standard for data quality. But the most important point that emerged was the one you're describing: design for engagement, design for good attention, and the hygiene factors, like surveys not even being mobile-friendly enough to complete. Those types of things are quite astonishing in this day and age. So, yeah.

Lewis Reeves

Yeah, look. When you see the data within our organization, you can look at what an attention span truly is: as surveys go over twenty minutes, the dropout rates, and also the post-survey disqualification rates, grow exponentially. Now, did someone suddenly decide to become a bad respondent after twenty minutes, or were they just bored? I think that's a real battle we've got. Akin to what you mentioned with synthetic data, we've not quite nailed the terminology. I, for one, think "data quality" has been far too broad-brush, and we really need to break it down into true categorizations of why we're not getting the accuracy we desire in the data; that will give us more levers to pull to solve the problem. One of those is certainly respondent experience, and being totally joined up from survey through audience through data at Walr, we're able to see that firsthand and really help our clients design experiences that yield the highest quality of data.
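Breaking "data quality" into distinct categories rather than one bucket could be sketched like this. The field names, thresholds, and ordering are made up for illustration; real panel-quality systems use far richer signals.

```python
# Minimal sketch: classify each excluded interview into one category,
# so each category gets its own lever (fraud blocking, shorter surveys,
# mobile optimisation, better questionnaire design).

from collections import Counter


def classify_exclusion(record):
    """Assign one excluded interview to a single category.
    Ordered so the most clear-cut signals (fraud) win first."""
    if record.get("duplicate_device") or record.get("known_fraud_ip"):
        return "fraud"
    if record.get("straightlined") and record.get("length_minutes", 0) > 20:
        return "inattention"  # plausibly boredom on an over-long survey
    if record.get("dropped_out") and not record.get("mobile_optimised", True):
        return "respondent_experience"
    return "survey_design"


def quality_breakdown(records):
    """Count exclusions per category, giving a lever-by-lever view."""
    return Counter(classify_exclusion(r) for r in records)
```

A breakdown like this makes the point in the conversation concrete: a spike in "inattention" on twenty-minute-plus surveys calls for a design fix, not a fraud fix.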

Mike

Yeah. It's a good point, actually. I hadn't really thought about it in those terms, but the same way that synthetic needs breaking down into the different methodologies, data quality does segregate into better-designed experiences, outright fraud, error rates, inattention. Those are the buckets, aren't they?

Lewis Reeves

The biggest thing also, as someone who owns and builds and invests a lot of money in proprietary panels, is longevity of respondents.

Mike

Right.

Lewis Reeves

Respondents come into this flow, but if we upset them and give them terrible experiences, they don't come back. And I tell you, the ones who will constantly come back are the fraudsters; they will run through brick walls. Great respondents won't. So we have to keep a mind on how we make those experiences pretty seamless, to ensure the good guys stay around, and we can use great technology to block the bad guys from getting in.

Mike

Yeah, great. Okay, good. Well, we started with me referencing the fact that you were comparatively young when you founded your first business. So you've got quite a track record of building businesses, integrating with others, acquiring others, scaling globally. What are some of the lessons that you've picked up along the way? I guess, the big learnings in building these businesses over the years.

Lewis Reeves

Yeah, definitely. The one thing I think you can never underestimate is the impact of change at scale. If you want to move quickly, your business will evolve rapidly, and the ways of working and the talent requirements along that journey do change. And we've certainly got stuck at some of the traditional hurdles. You get to fifty people and things don't work. You get to a hundred people, then two hundred people, and I think these are all points where you have to lean into evolution. Sometimes when you're growing really quickly, they might be six months apart, and you go, "It was working perfectly six months ago. We did all those changes and it was great, so why is it not working now?" That's something you can never lean into enough: change, evolution, and evolving with the business.

I also think, for all of these journeys, you know, Walr has been featured in a number of awards and fast-growth lists, and that's all great. From the outside that can look quite intimidating. From the inside, entrepreneurs will all tell you it is not a smooth path. The curve does not start in the bottom left and end in the top right. It is a rollercoaster of a ride to get there. So don't make the highs too high, don't make the lows too low. Keep moving forward. There is always tomorrow, and that's a really critical piece of advice from some people super close to me: at times when the next step has seemed very hard, they've encouraged me along the way, and that makes the step after that easier, and the step after that easier again. So keep moving, and even if it looks like everyone else is heading straight to the top right, they probably aren't in reality, and you're probably doing all right.

Mike

They're not really crushing it, so yeah. Yeah, a good bit of Kipling for modern business leaders, you know, treat those two imposters just the same.

Lewis Reeves

Great.

Mike

Are there any mistakes, any missteps that you'd like to own up to, or to help others anticipate and avoid?

Lewis Reeves

Definitely. I think embracing change not fast enough has been something where, whenever we've then made the change, you've gone, "Pff, absolutely the right thing to do. Why didn't we do that six months ago?" And frankly, your gut is telling you six months prior that you need to make that decision, but emotional factors probably hold you back. So the advice from that will definitely be: follow your gut. I think, as a first-time entrepreneur, lots of people say cash flow is absolutely king, and stay incredibly close to that. I'm sure we're not the only organization that's had a hairy moment or two in our earlier years, and that can be incredibly stressful and really detract from building great products and providing great service to customers. So keeping hotly on top of cash flow and having fantastic finance teams is really, really key. The other area I would say is, in our world, be better market researchers. We've definitely fallen foul of thinking we know what people want without asking them, and we've built products that we think are brilliant, spent a lot of time and a lot of money building them, and they've been used very sparingly. Every single time that we listen to customers, maybe get a representative base and listen to a few of them, we've built products that have been loved and used time and time again. So yes, we're in the world of market research. We should know better: we should ask people questions, do our research, and we'll at least get better outcomes.

Mike

I love that. What a great point at which to finish. Physician, heal thyself. Every culture has a different variation of this, but my favorite, I think, is the Egyptian phrase. Obviously, I don't really speak Arabic, but the translated version is something like, "The carpenter's door is all knackered and kicked in," or something like that. So

Lewis Reeves

Yeah.

Mike

right, researchers need to get better at actually asking customers, at ensuring they understand what their needs are. Wonderful advice for budding entrepreneurs and for people starting out in the industry, and a great overview of some of the big trends. I love the framing that agents can originate in different parts of the business, not just out of engineering and development, and that these are actually living as part of what would previously have been human workflows. And the different takes on the role of synthetic data and qualitative AI-moderated interviews, and how those two things fit together. We've covered a lot of ground in just over half an hour. So Lewis, thank you very much indeed for your time and for sharing your insights. We have a bunch of other great entrepreneurs, leaders, and founders of insights businesses and research tech businesses coming up in this series, so do check back for future episodes. We have plenty more on the slate. So Lewis Reeves, founder and CEO of Walr and still a seasoned entrepreneur at a tender age in this industry, thank you very much for your insights, and enjoy the rest of your day.

Lewis Reeves

Brilliant, Mike. Really appreciate the time. Thank you.

Mike

Thanks, mate. See you soon. Bye-bye.