Jonas Christensen 2:29
Ben Jarvis, welcome to Leaders of Analytics. It is so good to have you on the show today, and I have been looking forward to this episode for quite a while since we first met a few months ago, because you work at one of the companies that pretty much everyone else in the world knows, which is Google. We're not really going to learn about Google today, but more about data driven decision making in general, because you have lots of experience both as a producer and a consumer of analytical content. So that's what we're going to talk about today. Ben, welcome to the show. Could you please start by telling us a bit about yourself, your career background and what you do?
Ben Jarvis 3:12
Yeah, for sure. And, you know, thanks so much for having me on the podcast. I think I've had a bit of an interesting career, just given I've not really wound up where I thought I would. For me, it was really an interest in journalism and law that kicked everything off, and I did work as a lawyer for a couple of years before changing careers, wanting to get into the tech and media space. That was about 15 years ago now. Since that time I've evolved through roles in a career within Google Sydney, winding up in an analytics manager position, but that was built over a period of time. So it was interesting to witness the rise of data, or the rise of analytics, over that period and to be in a position to capitalise on some of it. That brings us up to now, where I've shifted my focus to more of an operational general manager role. But that is very recent, and I think it's still data driven. In my opinion, there are few roles today that aren't touching elements of data, so all that experience is definitely helpful.
Jonas Christensen 4:34
Yeah, so you were originally a lawyer. That's a very different background to analytics and operations, in my opinion. I am head of data science at a big law firm, so I work with lawyers every day, and I consider them to have a different way of thinking, generally speaking. So you have really used almost all parts of your brain in your career, different parts and different thinking styles. How did you end up in the world of analytics from law? What was the thing that moved you across?
Ben Jarvis 5:05
Yeah, it is very different. I think that law as a degree and as a practice is very research based. It's not analytics based, but you're still trying to find pieces of information that are meaningful, and you're still sifting through a lot of data in a way, right? So I think you're right to say the thinking and approach is quite different. Initially it wasn't a jump for me from law into analytics; it was a jump into online advertising in general. And that's because online advertising in its earlier days was not as data driven, because it was new. People were still trying to work out: what are the levers? What are the touch points? How do I understand performance? It was somewhat basic back then. But I found myself, to some extent, on the cutting edge of how we were understanding digital ad performance, and that, of course, is a massive part of analytics today. And so it was really on that theme that I then evolved a career. For me it was very much account manager and sales focused initially, and then leading more into analytics.
Jonas Christensen 6:22
Yeah. And I think it's important to mention here that you are in Sydney, Australia, not in Silicon Valley or Palo Alto, or wherever Google's headquarters are. You're in a regional office, and you can tell us about the specifics, but those offices are typically more sales and operations focused. Could you describe your role to us and what you and the team do day to day?
Ben Jarvis 6:43
Yeah. So what I can zero in on is the last role I held, which was a head of analytics role. And it's true, as you say, that for an Asia Pacific office within Google it's very much a business operations focus. There is a large engineering base in Australia for Google, which is awesome, but it's a very different part of the organisation. The role that I was in was very much geared towards partnering with large Australian advertisers, and it could be international too, but a lot of it was our biggest brands in Australia: how do they get the best ROI for their investment with Google? How do they understand their performance? What recommendations could I and my team make for them to be able to generate even greater returns? The ad solutions business that Google really pioneered online is still pretty complex. There is a drive towards more automation, and that's going to continue, but there are still hundreds of levers to play with, right? And so we would offer consulting-like services to the bigger advertisers as a way for them to get the best use of our services.

A huge part of that was analytics. What I always liked about Google's approach, which is still true, and I think other organisations would be doing this too, is embedding analysts within sales teams and within business teams. That gives people immediate access to an analyst who can help them deliver insights that are super meaningful and move the business forward. So that's the way that the team was structured and how we'd grown it in Sydney. Google has similar teams throughout the world that do very similar work, but they're dealing with local brands and businesses.
Jonas Christensen 8:34
Yeah. Okay. So the point around having embedded analytical resources is interesting, because in my experience it can be done well, but it can also be done very poorly. These analysts can become very effective and highly impactful directly in the business, but they can also become deserted islands that get asked to do all and sundry without really using their full analytical skill set, because those who ask for the analytical output are not well enough established in the discipline to really frame up good analytical questions. How did you deal with that situation? Because I assume that was something you wanted to avoid. And how does one do that more generally? How does one ensure that there is a connection back to analytics as a discipline and that the skill set is used optimally?
Ben Jarvis 9:28
I think I can speak to a lot of that in a few different ways. It's important to acknowledge that we use the word analytics very loosely; oftentimes people are referring to products and not people. In the context that I'm speaking to, it's very much about people. And so that brings us to: what is an analyst? If you think of the full spectrum of analytical capabilities, at one end you might put a machine learning engineer or data scientist type of role, and at the other end you might have business analysts and commercial analysts. They're very different skill sets, and they're actually different profiles. My team would fall more towards the business analyst end, and that means there's a real requirement that they are well rounded candidates in terms of their capability. I think this is true for anyone in a matrix organisation: it gets very complicated very quickly, and stakeholder management becomes a huge part of determining how successful you are in role.

So we were very much a commercial, business focused analytical team. That meant that when I'm embedding analysts in a sales team, they need to be able to have conversations with their salespeople about prioritisation and about the briefing: why does this piece of work need to be developed in such a way at this point in time? And so we spent a lot of time on frameworks and prioritisation, so that people have clarity. I think a trap for a lot of businesses is that they're burning a lot of time and energy on analytical projects that maybe don't have really clear ROI metrics attached. It's a challenge because it is a specialisation, but I would say, certainly for the leadership in analytics but also for the individuals, having a commercial lens is really, really helpful. So we would hire towards that and try to make sure that individuals could operate in that world.

Now, having said that, it didn't mean it was always easy or smooth sailing. The analyst role is an outlier within a sales org; it's a different profile, a different mentality potentially, depending on the individual. What was important was that an analyst manager understood the concerns and realities of those analysts on the ground, because the other option is that you have analysts reporting into non-analyst managers. That can definitely work; I'm not saying that system doesn't work. I think it's a question of how much you want to elevate analytics within the org and how much of a senior executive voice you want to give that part of the organisation, and that's obviously how those structural decisions get made. But what we were really focused on was making those analysts feel like they were part of the team. So they're part of an analyst team, but they're also part of a business focused sales team, and actually most of their time is not spent with other analysts; it's spent with those salespeople. But we would need to come together for things like capability training, performance reviews, and understanding how we evolved and progressed the role, so that connection into management was an important piece of it.
Jonas Christensen 12:56
So these analysts would report to you, but also in a dotted line sense, perhaps, to the sales manager. Correct? Is that sort of how it worked? Yeah. Exactly. So for an individual like that, who's reporting into more than one person, what are some of the pitfalls or things to be aware of? How do you make that work well when there are two people who want to guide and direct, all in good faith, but who might sometimes be at odds about where things should be going: one person thinks it should go one way, the other thinks it should go in another direction, and this individual is sort of in the middle of that. How do you deal with that? How does one deal with that?
Ben Jarvis 13:40
Yeah. To answer the specific question, and then I might zoom out a little bit because I think it's a really important point: the question I would pose to analysts in that environment is, are you saying no to requests? If you're not saying no, there's a high probability that you're not necessarily working on the right objectives. We operate in a world where there is more work than we can get to, right? There are more requests for our time and energy than we can possibly get to, and I think this applies to a whole lot of people in a modern workforce, and no more so than for this analyst team that I was managing. So be prepared to push back, be prepared to prioritise, and keep people accountable. It's important to know what the metrics are that you're dealing with, because that becomes a source of truth. So if a salesperson says this opportunity is worth 500,000 in their estimation, but you have something else that you're working on that's worth significantly more than that, you need to be able to have that conversation and say: hey, I can't justify pivoting my energy, because you're not coming to the table with enough upside for me.

So it is a change in, I guess, the degree of accountability for the analysts too. It's not for the most junior analysts, mind you; I think this comes with time and progression in role. But for the most senior analysts, if they can't justify why they're doing the work beyond "I was asked to do it", that's simply not good enough in a matrix environment where people all have to be responsible for how they're driving those outcomes, because we're incentivised differently. The salespeople have their own book; they all think it's the most important thing in the world, and to them it is. So how do we then support that in a way that makes sense for the total business?

Zooming out slightly, the risk with analytical projects, and I've fallen into this trap, is that the data looks like it's going to be really interesting. And so I want to know more, I want to spend time on it, I want to spend analyst resource on understanding those numbers, whatever it happens to be. And it turns out that it is really interesting, and it goes absolutely nowhere: it doesn't lead to any improved decisions, doesn't really move the conversation forward, it's just really interesting. I think that's what it's important to make sure we avoid. And so it's a real-time conversation often, and it can be confrontational, to the point where not everyone gets what they want. And that's what I've said: we don't live in a world where everyone gets what they want, because we have to segment, prioritise and look at ROI. It's the same way that, if you're a consumer business, there are some customers that are more profitable than others, right? In an ideal world you can service all of them, but if you can't, then you're only going to want to chase the customers that are more value to you. So I think that's part of the thinking that we would be applying as part of our operations as well.
Jonas Christensen 16:49
Very good advice. And I am really drawn to your words, because I feel like my day is mostly spent saying no to people. It's not actually a fun, enjoyable part of the job, but it's such an important part, because if you don't, it's those who yell the most and have the loudest voices who typically get their way. And I'll add another one, and I have to be careful here not to generalise, but I'm kind of going to do it anyway: I think a common trait for analysts is that they like supplying to demand, they like fulfilling those requests, helping people and generally making people a success. So it can actually feel a little bit like a failure when you have to say no to someone. It's hard; it's emotionally hard to some extent. So it's a skill that really has to be practised over time, and your advice there is really good for all listeners to take on.

Ben, I'm interested in your background here, because you actually have a master's degree in coaching psychology, and you're obviously then equipped to coach and help people and understand the psychological dynamics of the situation as well. Could you tell us a bit about what coaching psychology is, what skills you've learned from doing that degree, and how it helps you in your role, and perhaps in life more broadly?
Ben Jarvis 18:11
Yeah, I guess it's definitely a continuation of the career journey for me. I am a big fan of trying to simplify things down to almost a binary idea. Now, this is oversimplifying, but take the idea that in any role you should be either earning or learning; you need to have one of the two, and if you have both, you've got no worries at all, right? There's also the idea that as individuals, depending on our makeup and areas of interest, we might be more interested in people or in things. And I think analytics is often a world of things: it's a world of data points, and there's immense value in what can be done in that world with the work that we do. For me, coaching psychology was my attempt to bring it back to people. And what I've found constantly is that with all the data points in the world, it's still a human being talking to a human being, and all the overlay of their bias, their history, their knowledge, their cultural understandings, their attitude, it trumps all the data. So it's partly understanding what drives human beings and having good stakeholder management and influence. But with coaching it's also trying to understand what motivates and drives us, what motivates and drives analysts, and, more than that, people in general. So I love the idea of trying to combine an analytical approach with a coaching approach. I don't think I've been successful yet, but the degree was my way of bringing it back to a lot of those human factors and a lot of those not so much societal, but certainly attitudinal and mental frameworks that get used in the world of coaching and psychology.
Jonas Christensen 20:06
Yeah, and I think what often makes a good analyst a great analyst is not necessarily just outstanding technical skills. It's the ability to tell a story with the data, the ability to convince stakeholders that what we found is important, and the ability to lead and provide thought leadership to a process or a habitual way of doing things in an organisation that has always been like that, and challenge the status quo with the information. So we have fact and emotion combined, basically, which is an element of what you're saying, and that is really, really powerful. That is the analyst who gets invited into the decision making rooms, the boardrooms, the executive tables, when you can do those things. So it is so important for any analyst to practise those interpersonal skill sets. And you can call it coaching, you can call it psychology, you can call it stakeholder management, and all those sorts of words you can attach to it, but the ability to lead other people is really critical if you want to have an impact in an analytics role.

I'm interested, then, in the way you coach and mentor leadership staff in sales teams and operations teams to make data driven decisions. That, I imagine, is a really interesting challenge, and you will encounter very different levels of appreciation and understanding of how data can help. How do you do that in your role? Where do you see success, and where do you see more challenging types of mindsets, and so on?
Ben Jarvis 21:43
I would say there is a real role as an analytical leader within a business; I think part of the responsibility is to act as an evangelist for truthful insights in data. Sometimes you've got to begin that journey with: we just need to be able to track what we currently can't track. We've got to be able to put systems and platforms in place that give us the insights we need. But assuming that work is being done, or has been done, the challenge then becomes how we interpret this information in a way that's meaningful for our business, and it never ceases to amaze me just how difficult that challenge is. Because we can track more now than we've ever done and we have access to more information, it presents its own challenge, and that's one of information overload and, I guess, data curation. That is a constant challenge, especially for senior business leaders who have to have a summarised view of reality, an executive level view. There's a human being involved in making that up, and they're making active decisions about what data is included, how it is structured, and what insights they're drawing out. There are so many variables involved in that. And so the more we can educate that senior business leader to ask the right questions, to not necessarily always take it at face value, the better.

So I have run a number of presentations about data literacy, and this is done at a pretty high level, right? It is much more about guidelines than about how you craft an Excel formula. The datasets we deal with are so large that we've got to be able to bring them down into something that's meaningful and impacts an actual decision. That's an analyst skill on the one side, but it's also very much a business leader skill on the other. And there are a couple of things to watch out for, and this is not exhaustive by any means, but these are just the ones that come to mind thinking back to some of that work I did.

Too often we display metrics with no benchmark. The question is: what does good look like? Now, it's fine to have targets, but targets can often be set somewhat arbitrarily. So finding the right benchmark to put a metric in context is vital, not just showing the individual metric on its own. The other thing to be mindful of is that we have such short memories; we're moving so quickly in business that we're often not looking backwards. I'm a huge fan of the trend, so trending data is critical. I don't think that's a surprise to anyone, but you can pull any metric out you want and tell any story you want if you just handpick the timeline or an individual moment in time. So keep it relative with benchmarks, and be really focused on the time series that you're using in order to make your point. They're just little details that have come to mind, but teach business leaders to be mindful of these things.

The other one is something like exclusion bias; I haven't got the term exactly right because my mind's gone blank, but whenever we take a data set, we're almost always excluding some of the data in what we pull. That's just a natural consequence of having to draw boundaries with the data that we're working with. There's a great example from World War Two involving bombing runs over Europe, and how a data scientist at that time, in the 40s, understood what was included versus excluded. It's a really illustrative example of how this works in practice.
So to just quickly give you that example: the US military had bombers coming back, and their data set was the returning aircraft, right? They wanted to protect the crew and reduce the risk that they would come to harm or that the plane would be shot down. Their data set was returning aircraft. And so they analysed it. They looked at where the bullet holes were; they understood where they were attracting fire from the ground anti-aircraft guns. And their conclusion was: we need to reinforce the parts of the plane that are the most impacted, with the most bullet holes. And this data scientist said, you're coming at it the wrong way; the data set is not complete, because we're missing all the aircraft that didn't come back. So for the aircraft that did come back, if they have bullet holes, it means they can still fly when they take that fire. We need to reinforce the parts of the plane that don't have bullet holes in them. And we do this all the time with the conclusions we're making, because we're not zooming out far enough and looking at the bigger picture. That's really easy to say as a conclusion, but it's amazing how often this happens, because we have this kind of cognitive bias where we're only focused on what's in front of us: I have this spreadsheet to work with, let me draw conclusions from this. I'd say always ask the question of what we might be missing from this data set in order to make a decision.
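To make that exclusion point concrete, here is a minimal, hypothetical Python sketch. The numbers and aircraft sections are invented for illustration, not from the episode: it shows how analysing only the observable subset, the returning aircraft, points at the wrong parts to reinforce, while comparing against the full fleet reveals where hits are rarely survived.

```python
# Hypothetical illustration of survivorship / exclusion bias, in the spirit of
# the WWII bomber example above. All numbers are invented for this sketch.
from collections import Counter

# Each record: (section hit, returned to base). In reality we could only ever
# observe the rows where the aircraft returned.
sorties = (
    [("fuselage", True)] * 35 + [("wings", True)] * 30 + [("tail", True)] * 20 +
    [("engine", True)] * 5 +
    [("engine", False)] * 25 + [("cockpit", False)] * 15 + [("fuselage", False)] * 5
)

observed = Counter(section for section, returned in sorties if returned)
full_fleet = Counter(section for section, _ in sorties)

print("Hits seen on returning aircraft only:", observed.most_common())
print("Hits across the whole fleet:         ", full_fleet.most_common())

# Naive reading: reinforce the fuselage and wings (most holes on survivors).
# Corrected reading: sections that are rare among survivors but common in the
# full fleet (engine, cockpit) are where hits tend to be fatal, so those are
# the parts to reinforce. The data set we can observe is not the population.
missing_share = {
    section: 1 - observed.get(section, 0) / total
    for section, total in full_fleet.items()
}
print("Share of hits we never get to see, by section:",
      sorted(missing_share.items(), key=lambda kv: -kv[1]))
```

Running it, the fuselage and wings dominate the observed counts, while the engine and cockpit hits sit almost entirely in the part of the data that never comes back, which is exactly the "what might we be missing" question Ben describes.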
Jonas Christensen 27:13
What a great story, and it really illustrates the point, so thank you for sharing that. Ben, how do you hire for data and analytics literacy and aptitude? And how do you train people on the job? I'm not necessarily talking about analysts here, because for them it's sort of expected. In your sales and operations roles, do you actually look for this in people who need to consume data? Not necessarily produce data and analytics output, but their ability to consume it?
Ben Jarvis 27:43
I think it's a great question. And, to be honest with you, I don't think we have done an amazing job. The challenge of hiring and developing is a big one, because the space is moving quickly and because these can be very difficult things to test for. We sometimes feel like we're looking for the unicorn, because the analysts that we want can stakeholder manage, can engage with business leaders, can work independently, can stand in for the salesperson to a certain extent, and can engage with customers, as well as do everything else analytical that we might require. So it's a lot.

In terms of specific questions, I think one of the biggest telltale factors for maybe any analyst, but certainly a business analyst, would be their ability to deal with ambiguity and with unknowns, because you're not always going to have a framework or an understanding for everything. So I would ask them how they've dealt with a data set they'd never seen before, and how they understood it and tried to draw value out of it. I'm also a big fan of very general questions that invite a longer answer and a thought process: how strong is their cognition under pressure? And then you can get into some of your classic problem statements, you know, how many devices have been sold over this period of time? I don't really care about the specific number; what I care about is the thought process, how they reason with the challenge, and how articulate they are in that situation. Because it's hard. It's hard in an interview, but it's the sort of pressure they might experience in role.

For more technical capabilities, I think we could do a much better job. A startup like Alooba, for example, comes to mind, because there's an opportunity there, I think, to level the playing field and actually do proper testing for analytics. I'm sure some companies do this, but I think you have that piece of the puzzle, and then you have the interview itself, which should be devoted to soft skills as much as possible.

And then training is tough, because a lot of analysts find that they need to specialise; they can't remain a generalist in the world of analytics because there's too much depth. So how does that play out for them? I'm a big advocate for self-paced learning. There's so much available online that if you have the motivation and the interest, you can upskill in a pretty meaningful way, certainly on the technical stuff. But I'm also a big fan of some of the platforms we have that simplify things, like Data Studio. I think the value of the analyst in that world becomes less about how do I crunch the numbers in a way that brings us to a conclusion, and more about how do I visualise and tease out the comparisons in the data that will lead to improved business decision making. I think a lot of the automation will do the heavy lifting, and the real magic in the world of analytics will be the translation into commercial insights. Of course, that's true today too, but I think it will be more true in the future.
Jonas Christensen 31:03
Yeah, absolutely. You see it already in some of the big BI tools, which are starting to have not just the reports you build, but functionality to dig deeper into the data with automated insights and so on. They can take you somewhere, and they definitely do a lot of the simple Q&A on data that some types of analysts have had as their bread and butter for the first 20 years or so of analytics being a big part of business. So I agree with you; I can see that becoming more and more automated, and it's really the commercial mindset that takes over.

And speaking of that, Ben, you at Google, the business, go out and help clients scale with your products. Presumably that involves training, upskilling and coaching of those clients in elements of analytics so they can use your products, because the products are often about data and data driven decision making to some extent. How do you do that? How do you train a third party like that?
Ben Jarvis 32:10
You mean, on the use of certain platforms?
Jonas Christensen 32:14
Yeah, I suppose there is the technical aspect of how you use it. But there's also a mindset element to it, I imagine. Or is that not actually what's going on?
Ben Jarvis 32:24
Well, I think you're spot on. The number of times I've seen customers, or businesses in general, get really excited to improve their tracking, their data, their business intelligence, and then either not know what to do with it, or make decisions that are not necessarily generating the outcomes they're looking for. And that brings us back to things like data literacy and understanding what it is you're trying to prove, because a lot of data can, in some ways, be used to show anything you want if you know how to manipulate it. I used to say that a good analyst could provide you more or less what you want, whereas a great analyst can provide you almost anything you can dream up as a conclusion from the data, because it's so easy to manipulate what it looks like.

I don't think there's an easy answer. I think the mindset is a bigger challenge than the infrastructure and the pipes, or signing the contract with the newest third party data platform; that takes time and money, but sometimes it's the easy stuff. So invest time in understanding what an executive level view should look like, and how we level up the data and insights so they're actionable and don't get bogged down in detail. I mentioned this earlier, and it's also not new, but it's as true as it's ever been: we are drowning in data. We are absolutely overwhelmed by data every day of our lives, even if you're just an average consumer, let alone if you're working in a complex business, and cutting through that with insights and actionability is absolutely vital. So there isn't an easy answer to it. It just takes time and energy and attention to really drill into what our philosophy is around the use of data, and how we get the right people involved to bring a lot of that to life.
Jonas Christensen 34:24
I can personally attest to the drowning in data, whether it's phone apps or my inbox or other places. And I can also see, in a business sense, that that's often what we're actually saving our stakeholders from: I have too many reports, I have too many places to go and look for a number, but it's not giving me the insight. That's right: you've got to zoom in and out to the appropriate level, as you said, to get to the insight, which is not necessarily just more data.

Hi there, dear listener. I just want to quickly let you know that I have recently published a book with six other authors, called Demystifying AI for the Enterprise: A Playbook for Digital Transformation. If you'd like to learn more about the book, then head over to www.leadersofanalytics.com/ai. Now back to the show.

Ben, you're touching on something here that I think will be close to many listeners' hearts. A lot of them won't be working in a so-called new age tech company like you; they'll be working in maybe a more traditional business, or an industry that's been around for a long time and not necessarily born out of the internet era. How do those businesses that have been around for a long time transform themselves into data driven, analytics literate organisations?
Ben Jarvis 35:48
It's a huge question. I think often these sorts of initiatives are multi-quarter, multi-year, and they need to be sustained over that period of time to make a meaningful difference. And of course it starts with leadership, right, leadership prioritising this as a way to move forward. But, you know, it comes back to the fact that any business has got to make trade-offs, because there's money, time and resourcing involved in this. The right data approach can absolutely be a competitive edge, and something that will ensure increased profitability and viability over the short, medium and long term. So understanding it in that context, I think, is important, but so is understanding its limitations. Again, it's possible to use data in a way that it's not necessarily designed for. I can see this in my current world as much as anywhere else: data is only as good as the inputs, so when you have imperfect inputs, you get imperfect outputs, right?

And sometimes we can get away with saying data is directional. I think this is a big point, which I'm in support of: not haggling over the decimal points, but understanding the biggest trend that sits behind them. Whether it's increased two points or 20 points is less important than the fact that it is increasing, you see what I mean? So don't burn time and energy trying to understand the two versus the 20; take the data and run with the conclusion that it's increasing. What's our strategy to either reverse that or maintain it, whatever it happens to be?

So that's one big call-out I would make, but as I said, it's a huge question and I don't think I could do the answer justice here. There are a lot of consultants and individuals out there with great experience in how to come in and shake these things up for organisations that want a step change. But they've got to be prepared to put the time and money into it and have clarity around what the win is for them as an organisation. Often it's centred on advertising, because online ads are so hugely data driven, but at the same time you want to understand your audience, be able to segment your customer base, and understand their profitability in an ideal world, depending on the products that you're putting into market. The companies that do that well are the ones that have that competitive edge.
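As a toy illustration of that "directional, not decimal points" idea, here is a short, hypothetical Python sketch; the metric, numbers and window size are all made up for the example and are not from Google or the episode. The point is that the check only reports the direction of the change against a benchmark window, not its exact magnitude.

```python
# Hypothetical sketch: read a metric directionally rather than arguing over
# the exact size of the movement. Numbers are invented for illustration.
from statistics import mean

weekly_conversion_rate = [2.1, 2.0, 2.3, 2.4, 2.6, 2.5, 2.8, 3.0]  # last 8 weeks

def direction(series, baseline_weeks=4):
    """Compare the recent window with an earlier benchmark window and
    return only the direction of the change, not its size."""
    benchmark = mean(series[:baseline_weeks])
    recent = mean(series[-baseline_weeks:])
    if recent > benchmark:
        return "increasing"
    if recent < benchmark:
        return "decreasing"
    return "flat"

# The strategic conversation starts from "it's increasing", not from whether
# the lift is 0.2 or 2.0 points.
print(direction(weekly_conversion_rate))  # prints "increasing"
```

Whether the lift is two points or twenty is a second-order question; the first decision is what the strategy should be given the direction.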
Jonas Christensen 38:18
Right. I think you did a very good job of summarising the key themes in three to four minutes, Ben, so imagine what you could do with more time. Look, we're close to the end here, and I have a couple of questions left for you. One is a question that I always ask our guests, which is to pay it forward. So I'm interested: who would you like to see as the next guest on Leaders of Analytics, and why?
Ben Jarvis 38:44
I would say, for me, I don't have a specific individual, but I can give you a business. I think one of the most interesting datasets in the world is actually LinkedIn's data set, and specifically how they use that data to advise governments and industries on the flow of talent locally and internationally. It's so powerful that they can literally go down to the level of saying: your ambition as a government is to become an engineering powerhouse for the next decade; your proportion of engineers relative to your workforce is this; the best regarded engineers live in this part of the world. They can see the flows of human labour and migration between markets, and they can categorise it. I think anyone connected to that would be super interesting to hear from, depending on what they can share. But I've not come across a data set that I thought was as compelling and interesting as that one.
Jonas Christensen 39:47
Great suggestion and great topic. That is definitely something that will go on my to do list. So thank you for that. Ben. Lastly, where can people find out more about you and get hold of your content?
Ben Jarvis 39:58
Yes, so I've not really released content publicly; I think everything I do sort of sits within the walls of Google. But you can, of course, find me on LinkedIn and see the career trajectory. I think, if anything, it just shows that it's possible to make big switches in your areas of focus in your career and continue to evolve. If you're curious more specifically about coaching, or the world of coaching and data, feel free to message me. But yeah, my info is online.
Jonas Christensen 40:36
Fantastic. And Ben, I do think we can learn a lot from your general approach to life, which is don't be afraid to step out of your comfort zone and try different things. You've done it in your career, but also in your studies. And I can see how you're trying to combine the full spectrum of human ability into one versatile superhuman, and so I'm sure the colleagues at Google are very lucky to have you, Ben Jarvis. Thank you so much for joining us today on Leaders of Analytics and all the best with your journey and we really appreciate you taking the time to share the knowledge that you have with us today.
Ben Jarvis 41:13
Thank you so much. It was a real pleasure.