ReThink Productivity Podcast

The ReThink Way

March 10, 2024 ReThink Productivity Season 1 Episode 147

James Bolle from ReThink is back to reveal how transforming numbers into narratives can revolutionize decision-making processes. Kicking things off with an apology from Simon for the recent video mishap, we quickly pivot to the meaty part, discussing the fidelity of data inputs and the finesse required in data analysis. Picture an optician meticulously calibrating lenses; that's the level of precision we dive into, exploring the craftsmanship of data collection that ensures our insights are as clear as perfect vision. Throughout this discussion, we grapple with the industry's relentless pace and the challenge of nurturing a skilled analytical workforce that can keep up.

Moving beyond mere numbers, we unwrap the method behind tailoring data analysis and visualization to resonate with specific client needs. It's all about context—like how a pinch of salt can transform a dish, nuances in data can yield profound insights. With ReThink's tiered approach, we ensure information integrity, turning dry stats into compelling stories that drive organizational action. This episode explores how to encapsulate data in a narrative form, complete with anecdotes and vivid visual elements, ensuring that the recommendations we provide aren't just heard but felt and acted upon.

#theproductivityexperts
Register for the Productivity Forum 2024
Follow us on Twitter @Rethinkp
Connect to Simon on LinkedIn
Follow ReThink on LinkedIn

Speaker 1:

Welcome to the Productivity Podcast. James joins me and we're going to start. Well, I'm going to start with an apology. So after telling you on the last episode, when we talked about data and James talked about his new role, that we were recording audio and video, I then saw the setting on the recording suite that was audio only. So welcome to our first video and audio podcast. James is back. Do you want to just give us a bit of a recap on the previous episode, Jake? Jake? Why am I calling you Jake? Because it's my son's name. James, before we get into this episode, I would like you to.

Speaker 2:

I mean, you hurt my feelings already by not recording me last time. Now you're calling me the wrong name.

Speaker 1:

I know well and you did your hair and everything and I didn't bother.

Speaker 2:

No, it's okay. So last time we were together we talked about my career a little bit and my new role at Rethink, but more importantly we talked about data. I mean we touched on ideas that data is everywhere, that the idea that data is the new oil.

Speaker 2:

But like oil, data is dirty and messy and actually not that valuable unless you know how to refine it and so we talked a bit about how you drive insights from data and how you can drive action from insights and data, and surfacing insights is absolutely key to what Rethink is all about.

Speaker 2:

So our purpose statement is to collect great data so we can surface insights to create better decisions and discover opportunities for positive change. So data is at the heart of that, but data itself is only the enabler of great decisions. You need to turn the data into insights in order to do that, and we talked about how, in personal life and in business life, often nothing changes without corroborative data backing up opinions. But how difficult it is to actually change people's minds with data, because you need to tell them stories, and if those stories don't align with what they already believe about the world, they may ignore them. So we had a really interesting discussion about that. If you've not listened to it, go back and do so. I really enjoyed the discussion, but I think today we're going to talk a bit more about how Rethink approaches data and makes sure that we turn it into valuable insights rather than just letting it reside as data. Is that right?

Speaker 1:

Yeah, so, yeah, go back and listen to the other one, because, as you say, you won't be able to watch it because we didn't have the right settings. So I'm now holding up a copy of our book, Every Second Counts. So we've talked about this a lot. Some of the stuff we're going to talk about today is in the book, certainly some of the benchmarking bits, but the first bit we're going to talk about isn't particularly. I think it's interesting, though, for people that have worked with us before, or are thinking of working with us, to understand what process our data goes through, kind of the Rethink way, if you like. So do you want to talk us through that, James?

Speaker 2:

Yeah, I mean it starts with ensuring that you have the best inputs possible. Now, I started my career as a market researcher and worked for 10 years in market research, and in market research you could be the best data analyst in the world or the best storyteller, but if the data that you start with is garbage, then you don't really have much of a chance to succeed. And so I spent a lot of time during those 10 years learning how to write questions and questionnaires and how to build representative samples and such like, because it's really important, and I think Rethink is the same. Clearly, we're not talking about collecting market research data. We're talking about going into businesses and measuring what's actually happening, and I was surprised by this. So, again, go back and listen to the last episode if you want to hear about my role at Rethink, but I've been full-time at Rethink now for about four months, and it's really struck me over that time how much work goes into ensuring that the analysts that we employ, and we do employ them, we don't use contractors or freelancers or part-time people, how much effort goes into making sure that those analysts are trained and know exactly what they're doing and understand the processes that they're measuring. And so step one is about having the right people, who are trained, motivated, experienced and trusted, in the locations in order to measure the data. And to give you an idea of how important that is...

Speaker 2:

So I went for my first day shadowing an analyst recently, where I was in an optician's and I was using the ReTime app on one of our tablets to measure a load of processes, and I thought I was doing a great job. I was absolutely heartbroken when the analyst I was with pointed out that, of course, my data would never be used, because I hadn't been fully trained and hadn't done multiple shadow studies and had all my data verified beforehand. So there I was, merrily measuring how long it took to hand over a customer from optician to salesperson, and I thought I was doing a great job. But even though I'm a very experienced person, I've got 25 years' experience at work, none of my data could be used because, unfortunately, I've not been trained. So, you know, it gives you an idea.

Speaker 2:

We have to collect great data to begin with. So that's the first part of the Rethink way, and I don't know if you've talked about that before on this podcast, Simon, but it struck me as something that was quite unique. I mean, obviously it creates additional costs for our business, because we might end up with people on downtime between projects or traveling from place to place, but there's a reason for it and it aligns with our purpose, which is that we want to collect great data to surface insights to create better decisions. If you want great data, you need to have great people collecting it.

Speaker 1:

Yeah, agreed, and we've got some brilliant people on the team. I think we're also conscious that it's an industry where a new population isn't coming through in terms of youngsters, so graduates, people looking for a career change. You know, it's an interesting life, you see lots of things, but ultimately you can be away from home a lot of the time. So I think, as well as the data angle, there's the people angle, making sure we're working in a consistent way in terms of capturing. There was also, in the back of our minds, what does it look like in five or ten years, when there might not be new people coming into the industry? Because that's a risk in itself, because, you know, AI, robots, whatever, are going to do lots of things. Maybe they can do this in the future, who knows, but you're going to have to get very sophisticated to make some of the calls that the team are making in terms of how hard somebody's working from pace of work, or was it this task or that task, where there's real nuance in what it looks like.

Speaker 2:

Yeah, but also, you know, they've worked in similar businesses in the past. They've measured in comparable businesses in different industries, and it means that, while I think AI will at some point, probably from CCTV, be able to do several of the things that analysts do, it's not going to be able to say straight away, you know, "I was working somewhere three weeks ago and I saw this little trick that could really make a difference for this business", in a completely different context. And yeah, I found that surprising when I joined, but also really reassuring and really interesting.

Speaker 1:

And you're on your feet at the end of the day, no doubt.

Speaker 2:

What was that, sorry?

Speaker 1:

Your feet, at the end of the day. Your feet, not your head.

Speaker 2:

Well, that's the other thing, right? Because, you know, we go out and we work in really interesting businesses, but whether it's making a coffee, making a burger, packaging up a prescription, working on a production line or picking things in a warehouse, those processes that people learn and get their heads around are actually really complicated. You can have hundreds of steps in a process, and you need really clever, really motivated, engaged people to measure that properly. So that's the first stage, and it's actually got nothing to do with me, because my role is to turn that data into insights, but it's something that really impressed me and I thought was worth mentioning.

Speaker 1:

Yes, and they check their own data, so they do a sense check after the day's finished: clear their heads, check the data for any anomalies or things that they want to query, and make any updates. That then goes to the Program Manager.

Speaker 2:

Yeah, that's right. So there are three layers of data checks, actually. So obviously, the analyst is out measuring it day to day. The Program Manager will review every study that's submitted, and then it gets reviewed by me or somebody working on the data, along with a director at Rethink, once it's gone through the Program Manager review. And that's three layers of checking. It sounds maybe like overkill, but it's not, because we're each looking for slightly different things. And when it's all been collated in the app, it's all been reviewed and all been signed off, what we're actually looking for is: how can we organize the data, and how do you make it consistent with studies that have been done before, with other benchmarks? How do you turn the data into something that is valuable? And I think the three levels of checks firstly make sure there aren't errors in the data, but secondly, and more importantly, they set the data up in the best way for it to be useful for the people we've collected it for.
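As a purely illustrative aside, that three-layer sign-off can be sketched in code. Everything below is hypothetical: the stage names, the checks and the data fields are invented for the example, not ReThink's actual system.

```python
# Illustrative three-layer sign-off for a submitted study.
# Stage names, checks and fields are hypothetical, not ReThink's real pipeline.

def analyst_check(study):
    # Layer 1: the analyst sense-checks their own day's data for anomalies.
    return all(t >= 0 for t in study["timings"])

def program_manager_check(study):
    # Layer 2: the Program Manager reviews every submitted study for completeness.
    return len(study["timings"]) >= study["expected_observations"]

def director_check(study):
    # Layer 3: a final review checks consistency with earlier benchmarks.
    mean = sum(study["timings"]) / len(study["timings"])
    lo, hi = study["benchmark_range"]
    return lo <= mean <= hi

def sign_off(study):
    # A study only counts as signed off once all three layers pass.
    checks = [analyst_check, program_manager_check, director_check]
    return all(check(study) for check in checks)

study = {
    "timings": [42.0, 38.5, 45.2, 40.1],  # seconds per observation
    "expected_observations": 4,
    "benchmark_range": (30.0, 60.0),      # plausible mean, from past studies
}
print(sign_off(study))  # True: passes all three layers
```

The point of layering is exactly what the conversation describes: each check looks for something different, raw anomalies, completeness, and consistency with past benchmarks, and a study only counts once all three agree.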

Speaker 1:

Yeah, and just when you said director's check, just to be clear, that's not me, that's not one of my strengths checking data. So for those that are worried now about the data that we've got before them, it isn't me.

Speaker 2:

So don't worry, don't worry, it isn't you. But I think the commitment to having somebody who understands the needs of the business having sight of the data before it even goes out, you know, even if we were just reporting raw data or giving raw data files back to the client, the fact that somebody who knows the client, knows what they're trying to achieve from the study and knows what the brand is about sees it before it gets to them, must be reassuring to people, even if they were slightly worried it would be you. You know, that idea of "no, this is what we need to know about, and so this is how it needs to be organized". That's, I think, really reassuring for people. And, you know, however the data gets reported, however it gets turned into insight and then into action, you know it's sound when it gets in there. So, once it's gone through those three layers of data checks, it's about organizing it and tabulating it in the right way for analysis, and about collating key information about the data captured.

Speaker 2:

So, for example, again, data on its own is not necessarily valuable; adding context to it makes it much more valuable. So making sure that the data is all aligned with, for example, what day of the week it was. Is there enough data to look at individual weekdays, or are we going to group it up into weekday versus weekend? All types of things about context and information about the data. What was the weather like, even? That's all kind of standard now. So we can add that context of what were people doing, and how productive were they at different times.

Speaker 2:

So it's not just, well, we know that your team spends 25% of its time doing non-value-adding things. It's where in the week does that happen? Which stores does it happen in? Why does it happen? You know, we're trying to get to that level rather than just provide the data.

Speaker 2:

So there are three levels of checks, and then there's contextualization. So we make sure that we add useful context on top of it: store format, day of week, all that kind of stuff. What time of day it was is one of my least favorite parts of the job, that is, working with the team to make sure that we know, if we're doing measurements at 7.30 in the morning, that in one store that might be before the store opened, while in another it might be half an hour after it opened. So actually, sometimes it's not useful on its own to know that between seven and eight we have 10% wait time, but if you do add a bit more context about whether the location was open or not, it can help you understand it even better. So, you know, we do go the extra mile, I think, to make sure we've added some contextual information on top of that.
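That kind of contextualisation can be sketched as a small, purely illustrative enrichment step. The field names and opening hours below are invented for the example; real context would come from the client's own data.

```python
from datetime import datetime

# Hypothetical per-store opening hours; real ones would come from the client.
OPENING_HOURS = {"store_a": (8, 18), "store_b": (7, 17)}  # (open, close), 24h clock

def add_context(record):
    """Enrich one raw measurement with day-of-week and open/closed context."""
    ts = datetime.fromisoformat(record["timestamp"])
    open_h, close_h = OPENING_HOURS[record["store"]]
    return {
        **record,
        "day_of_week": ts.strftime("%A"),
        "day_group": "weekend" if ts.weekday() >= 5 else "weekday",
        "store_open": open_h <= ts.hour < close_h,
    }

raw = {"store": "store_a", "timestamp": "2024-03-04T07:30:00", "wait_pct": 10}
print(add_context(raw))
# 7.30 in store_a is before the 8 o'clock open, so store_open is False:
# 10% wait time before opening means something very different from mid-morning.
```

The open/closed flag is the example from the conversation: the same 10% wait figure between seven and eight reads completely differently once you know whether the location was open.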

Speaker 1:

Yeah, I'd agree. So lots of thought goes into the data, lots of checks, and then what kind of starts to come out the other end?

Speaker 2:

Yeah. So it does depend on what the organization wants from the data, how that data is going to be consumed and how it's going to live in the organization. But typically that data will be turned into standard visualizations, visualizations that we've used before and that we know have told stories. So, built on years of organizing and presenting this data, we have data visualizations that highlight the issues that businesses need to know about, need to care about and can take action on. So, for example, yes, what proportion of your time is spent on different things, but also, where are your greatest pace opportunities? So I know we talked a little bit about pace rating. If you have elements or tasks in your business that are generally done at a lower pace rating, why is that and what does that mean? Is it a filler task? If so, what time of week is it being done? Who's it being done by? Which part of your business is it being done in? Let's see if we can dive into that and understand it a bit further. All of that sounds interesting, but you need it visualized in the right way in order to take action from it. So we'll take the data and build the visualizations, and then what's really important about those visualizations is this: left to my own devices, I could create two hundred, three hundred, five hundred slides for a presentation for a client, but that's not useful, because it's too much. So you then work with the Program Manager, and with Sue or one of the directors, with yourself, somebody who knows the client well, to distill our experience into a results presentation and tell a story that can galvanize action in the business.

Speaker 2:

And if you listened to the previous episode, you'll have heard us talking about what types of data there are in the world, whether you collect it on purpose or by accident, whether it's quantitative or anecdotal. You know, our analysts are in the store collecting useful anecdotes that get woven into the story that we tell with the data. So it's not just data. There are videos or pictures. There are anecdotes in there as well.

Speaker 2:

To bring it all to life. We talked in the last episode about, you know, needing to hear a story from several sources before we can believe it and take action on it. I was in a presentation recently where there was a video of somebody trying to log into a till, and there was just a spinning wheel for forty-five seconds before this person could scan a jar of coffee and serve the customer. And let me tell you, we can show them how much time they're spending in their business logging into tills, and it's good and it's compelling. But a video of somebody standing there for forty-five seconds while the wheel spins, just to serve a customer one jar of coffee? That little video, I think, told the story better than we could have in a hundred slides about that process. So it's about bringing that all together, and then we contextualize it, we summarize it, we interpret it and we make recommendations. We make recommendations because businesses often want our experience, our expertise from the other organizations we work with, to help them make the right decisions, and that will be facilitated by face-to-face or virtual discussion.

Speaker 2:

So the data and insights that we discover are jumping-off points to have discussions around, and that's really interesting. So I worked with Rethink's data for five years on a freelance basis before working full-time with the company for the last four months, and those kinds of presentations have really shown me the power of getting people together in a room, or on a Zoom, to discuss that kind of stuff. And then there might be additional work that needs to be done, follow-ups, as always. And actually, you know what, that's kind of the gold-standard process, what we think of as the Rethink way. There is a platinum standard, which is adding even more contextual data to it. So, for example, the tenure of the people working in the location where we're measuring, how many people were there at different times of day, how many people were on break, what was the footfall in those stores on those days, what were the sales, and, you know, all this type of stuff: how many items have you managed to get out the door, how many packages have you received and how many items have you got out on different vans?

Speaker 2:

You know, all of this stuff can help us to tell an amazing story. So it goes: collected in the right way, checked thoroughly, organized, contextualized, summarized and presented along with recommendations, and then discussed. Those are the key steps in the Rethink way, and we can enhance that data all the way through the process with additional information. And it's not just numbers, it's not just times and counts; it's pictures and videos and feelings and observations, and it's employee feedback as well, because our analysts take time speaking to the teams, getting to know them and asking them how they feel about their jobs and their processes and how we can improve them. So we bring together lots of different ideas.

Speaker 1:

Yeah, 45 seconds is a long time when you watch the video. You don't realize, do you?

Speaker 2:

It's a long time when a customer is standing in front of you waiting for one jar of coffee. Is there anything else they would buy?

Speaker 1:

The conversation about the weather starts, or that kind of stuff, and it's you kind of trying to just tap frantically while it's logging in, in the hope that it doesn't say you've got the wrong password or you can't get in. To benchmarking, then. I know that's a big part of some of the stuff that you look at. I think we've got 67 million benchmarking data points that go back to the formation of the company, though some of those are past the tail of relevance now, and there was a big thing called the pandemic in the middle of that, in near history rather than ancient history. So that's our data, and we talked in the last one about data being powerful and this, that and the other. So there's lots of richness in there, which I know you're exploring as well in terms of systematising it, building more of a tool around it. But also, within our projects, you use that data to help clients understand where they sit.

Speaker 2:

Yeah, that's right, and it makes a real difference. So that's 12, 13 years of data from a range of businesses, and benchmarking the data we collect is really important. I talked in the last episode about how we're not trying to make everybody deliver exactly the same experience to their customers or have exactly the same process. That would be ridiculous. All these businesses are unique.

Speaker 2:

But benchmarking the data we collect is really useful because it's aspirational. It helps businesses understand what's achievable in terms of the amount of time you spend on different processes and how well you can do them. It enables some really useful performance-gap analysis: where should we be doing better, and which areas are really falling behind what we should expect? And it helps us identify best practices within industries and across industries. So, you know, knowing that you spend this much time on these processes, or that they happen in this way, and comparing that with other businesses, is just really useful for figuring out what works best in different settings. And, you know, we help our clients to set realistic targets for what they're trying to achieve. You know, if we put more resource here, it means we can achieve X, Y or Z. It's just really powerful for helping understand the business that we're working in and helping them to move forward effectively.
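As a toy illustration of that performance-gap analysis, the sketch below compares invented client figures against invented benchmark figures and ranks where the gap is largest; none of the numbers or process names are real.

```python
# Hypothetical minutes-per-day figures; real benchmarks come from years of
# studies. These are invented purely for illustration.
client = {"replenishment": 95, "till_service": 60, "admin": 50}
benchmark = {"replenishment": 80, "till_service": 58, "admin": 30}

def performance_gaps(client, benchmark):
    """Rank processes by how far the client exceeds the benchmark."""
    gaps = {p: client[p] - benchmark[p] for p in client}
    return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

for process, gap in performance_gaps(client, benchmark):
    print(f"{process}: {gap:+d} minutes vs benchmark")
# admin shows the biggest gap (+20 minutes), so that's the first place to dig.
```

Ranking the gaps is the "where are we really falling behind" question from the conversation: the benchmark tells you what comparable businesses achieve, and the sorted list tells you where to look first.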

Speaker 2:

And sometimes, you know, it's not only about setting realistic targets but about having a conversation about what's unrealistic as well, and aiming for the fringes of what's realistic, or reining in instincts within organizations to do things that just aren't achievable. And I love the benchmarking system. So, I mean, you talked about it: at the moment the data is obviously all in one place and accessible, and we can kind of pull it manually. But we're working on a benchmarking tool that will productize the benchmarking information, so that people other than the analysts working on the project can see, OK, how long do other organizations spend doing X, Y and Z, and how efficiently do they do it? And I think that information will be powerful and useful if we can democratize it in a way that people can consume.

Speaker 1:

Yeah, no, I agree, and we've kind of stepped into the future of our data there. Now, don't share all the secrets, clearly. But are there any other bits that you want to tell people about that we're thinking about and working on, or potentially going to be working on?

Speaker 2:

Well, I mean, when you think about the future of data and the future of our product, I actually find it quite difficult to predict, because, you know, you might be merrily thinking about, for example, an AI copilot to help you interpret the data, and then it turns out that AI can produce 30-second film trailers from just a few words. The technology is progressing so quickly and so radically at the moment that saying I have a clear view of what the future of the data is going to look like, I think, would be a lie. But they do say that the best way to predict the future is to create it. I suspect that using AI to help us get around the data more effectively, and then help clients, if they want, to get around the data more effectively and drive more insights, is going to be part of it. I would be surprised if data structures don't change over time to enable more unstructured feedback to be incorporated in data models, and the power to connect data sets is undoubtedly going to magnify. So connecting productivity measurement and historic benchmarking information to sales and to other performance metrics that people collect within a business: the power to join those together and understand them will definitely magnify.

Speaker 2:

And, you know, in five years' time, ten years' time, I'm well aware that my job might actually be obsolete, right? Because having a person who oversees the production of insights by saying, well, this is what the brand is about, this is what they're trying to understand from this data, this is the presentation we need to pull together: there's no reason, in a few years' time, why the client can't just ask an AI chatbot for that, and that chatbot helps pull the presentation together without any human interaction at all. And, you know, if I can make myself redundant and it can still create great insights and better decisions, then that's probably a future we should embrace.

Speaker 1:

We'll find something else for you to do, don't you worry, you're not getting out that easily.

Speaker 2:

Well, somebody needs to. This is a different podcast, but somebody needs to look out for AI hallucinations that seem real but are actually just completely made up, and somebody needs to train the systems and all that kind of stuff. So I'm hoping. You're into the deep-fake world at the moment, aren't you?

Speaker 1:

Yeah. I mean the video, the stuff we were looking at, just for reference, it's Sora, so S-O-R-A. If you Google it, it's been released by the team behind ChatGPT, and it's not for public use yet, because I think there are loads of legalities: you know, the whole thing that kicked off in America with the Actors Guild and the use of it. But it shows you the power of the future. So, you know, the one example is they type in "build me a film trailer with an astronaut wearing a red knitted helmet", and ten seconds later you've got what looks like a Hollywood-produced film trailer with an actor in a red knitted helmet. So it's here. How far it gets, and how much people get scared of the whole Terminator thing, whatever it's called. Was it Net 2000 or something in Terminator? SkyNet? Yeah, SkyNet, that's the one. So there'll be a bit of that skepticism but, like anything, there'll be people that use it for good and people that try and use it for something more devious. But I mean, the stuff we looked at, yeah, I was amazed, absolutely amazed.

Speaker 2:

It's amazing. But I mean, the underlying principle of technology taking huge leaps forward and doing things that we thought took a lot of human brain power and a lot of human experience to do is a principle that we should all be aware of. And realistically, why should you have to talk to a person and explain a brief to them when you can just go straight into a chatbot that understands it, and use that? And yeah, that's one of the principles that I'm using when I'm thinking about what our next product might look like. You know, it's about understanding our clients, and if anybody listening to this wants to chat to me about how they use data and the types of questions they try to answer with it, I'd be really interested in having that conversation. Because we clearly have clients who will employ us because they want an answer to a question and they just want us to tell them the answer to that question. There are some who will be employing us because they're not sure what the answer is, or even what the question they need to be asking is.

Speaker 2:

But we can have conversations with them and, over time, AI will be able to have those conversations with them and help them, firstly, define their briefs; secondly, define what data they need; and then actually help them produce the insights from the data without, you know, too much human intervention in between. And we can focus on making that a reality for them if we have good enough data, which goes right back to the beginning of the conversation about why data is important. In this AI revolution, all of these models, all of these tools, need great data to be trained on and to work effectively, and that's something we've got a lot of. And I'd be really interested in chatting to people to hear more.

Speaker 1:

Good. So, you know, we'll put James's LinkedIn in the show notes so you can reach out to him. So we'll finish on a story, because you know I love a story. So I was listening to a podcast earlier in the week, I think I told you. It's called The Rest Is Entertainment, so check it out.

Speaker 1:

But basically they were talking about the future iterations of AI and machine learning, ChatGPT in this instance, and we're on version four. I think they were saying five is not far away. By the time we get to version seven, it potentially will need all the computing power in the world, plus a small nuclear reactor to run it, and it will have run out of source material. That's how far and how quickly these things are moving, so it gives you a sense of scale. It also brings you back to that piece, I think it was the Writers Guild in America, wasn't it, around how, if it's taking source material, in effect it's just a big plagiarism tool that's rewriting stuff that already exists. It's not thinking of new content, it's recreating what exists, and to some degree we're a bit blind to that, because that is what it's doing. It's scouring the internet at speed and piecing together stories, articles and documents to come up with answers. So yeah, check it out. Interesting podcast, but that's the scale we're heading to very, very quickly.

Speaker 1:

Absolutely. So we'll pause there. I am 100% confident you'll be back on again. Whether we've got video or not depends on which buttons I press on the recording tool, but I think we've got video on this one. Time will tell when I edit it, and if you want to watch it, we'll put it on the YouTube channel. Thanks for listening, and thanks for coming on again, James.

Speaker 2:

No, thanks for having me Speak to you soon.

The Rethink Approach to Data
Data Analysis and Visualization for Action
Enhancing Data With Benchmarking Analysis
Future of Data and AI Technology
