Kate O’Neill
“Tech Humanist” | Global Keynote Speaker | Author, What Matters Next (Wiley, 2025) | Executive Advisor: AI Ethics, Responsible Innovation, Human-Centric Transformation | Future-Ready Tech Decision-Making

The Leadership that Matters Next

Episode Summary
When Technology Meets Humanity: A Conversation with Kate O’Neill

Get ready for an eye-opening discussion with one of the world’s most influential voices in human-centered technology!

Join host Eddie Turner as he sits down with Kate O’Neill, the renowned “Tech Humanist” and CEO of KO Insights, for a robust conversation about navigating our AI-driven future without losing our humanity.

Keep Leading!® Live

Keep Leading!® Video Shorts

About Kate O’Neill
Kate O’Neill is a digital innovator, chief executive, business writer, and keynote speaker.

As the founder and CEO of KO Insights, she leads a strategic advisory firm that improves human experience at scale, particularly in data-driven, AI-led, and algorithmically optimized interactions.

Before founding KO Insights, Kate was among the first 100 employees at Netflix, a technologist at Toshiba, and the founder of the groundbreaking analytics firm, [meta]marketer.

Her work has received widespread recognition. She was named “Technology Entrepreneur of the Year,” a “Power Leader in Technology,” and a “Woman of Influence.” Google featured her in its global campaign for women in entrepreneurship, and Thinkers50, the premier global ranking of management thinkers, included her on its list of the World’s Management Thinkers to Watch and shortlisted her for its Distinguished Award for Digital Thinking.

Her insights have been featured in the New York Times, The Wall Street Journal, and WIRED, and she has appeared as an expert tech commentator on the BBC and NPR.

She has authored six books, including four on business strategy and technology: Tech Humanist, Pixels and Place, A Future So Bright, and What Matters Next.

Website
http://koinsights.com/

LinkedIn
http://linkedin.com/in/kateoneill

Instagram
https://www.instagram.com/kateoneill/

Leadership Quote
“The important thing is not to stop questioning. Curiosity has its own reason for existing. One cannot help but be in awe when one contemplates the mysteries of eternity, of life, of the marvelous structure of reality. It is enough if one tries to comprehend only a little of this mystery every day.” — Einstein

Subscribe, share, and review on Apple Podcasts!
https://bit.ly/4kPBcZo

Subscribe, share, and review on Spotify
https://spoti.fi/4iOFzlB

Full Episode Transcripts and Detailed Guest Information
www.KeepLeadingPodcast.com

Keep Leading LIVE (Live Recordings of the Keep Leading!® Podcast)
www.KeepLeadingLive.com

Connect with Eddie Turner
Website: https://www.eddieturnerllc.com
LinkedIn: https://www.linkedin.com/in/eddieturner

About the Keep Leading!® Podcast
The Keep Leading!® podcast is for people passionate about leadership. It is dedicated to leadership development and insights. Join your host, Eddie Turner, The Leadership Excelerator®, as he speaks with accomplished leaders and people of influence across the globe about their journeys to leadership excellence. Listen as they share leadership strategies, techniques, and insights.

Share the Inspiration
Inspired by what you hear? Share the episode with your network and help spread the message of empowerment and leadership. Use the hashtag #KeepLeadingPodcast and join the community of listeners who are dedicated to continuous growth and leadership excellence.

Transcript

Eddie Turner:
Hello everyone. Welcome to the Keep Leading podcast. Another live broadcast of the Keep Leading podcast. The Keep Leading podcast is dedicated to leadership development and insights. I’m Eddie Turner, your host, The Leadership Excelerator®. I work with leaders to accelerate performance and drive impact. I work with experienced and emerging leaders who want to have an exponential impact on the places, people, and processes where they work. If you’re joining us today on LinkedIn, Facebook, or YouTube, let us know. If we see your name come in, we’ll make you part of the conversation. You can feel free to ask a question of my guest. You can just share your reaction. We want to hear from you. And certainly hit the share button so that your friends and colleagues who may not be able to join us live will be able to benefit from the replay later on. And as always, we will make this available not just as a live stream; it’ll be available as a video podcast and an audio podcast in a few days on Spotify or wherever you get your podcasts.

I am super excited that today I get to sit down with one of the world’s most influential voices in human-centered technology. My guest today is Kate O’Neill. Kate O’Neill is a digital innovator, a chief executive, business writer, and keynote speaker. She founded KO Insights. Before she did that, though, she was among the first 100 employees at Netflix. Think about that. That’s a pretty cool thing to say. And she was a technologist at Toshiba and the founder of the groundbreaking analytics firm [meta]marketer. She’s received worldwide recognition, and the list is exhaustive. It’s the who’s who that has recognized Kate, and we’ll talk about a couple of the big ones as we go through our discussion. You have seen her work in top business and academic journals, including the New York Times and the Wall Street Journal. She’s appeared as an expert tech commentator on the BBC and NPR. She’s authored six books. And today, we’re going to talk about the leadership that matters next, based on her latest book, What Matters Next. Kate, welcome to the Keep Leading podcast. There it is.

Kate O’Neill:
Thank you, Eddie. Thanks for having me on your show, and thanks for bringing me in front of such an incredibly illustrious audience that I’m sure we’re talking to today and over time. There I am and there’s the book.

Eddie Turner:
There you are and there’s the book. You flashed it briefly, but I want to make sure everybody gets a chance to look at it so they know what to go search for. Yes, we’re going to make sure people know what matters next. Tell us the full title there.

Kate O’Neill:
What Matters Next: A Leader’s Guide to Making Human-Friendly Tech Decisions in a World That’s Moving Too Fast.

Eddie Turner:
And yes, I got a chance to hear some of this when I last saw you in person. You and I were... oh, that didn’t come out right. I normally test these things first; it hit me at the last minute. Kate and I were together for Thinkers50 in London. That’s right. Tell people who don’t know what Thinkers50 is why that’s a big deal.

Kate O’Neill:
I love that some write-up of Thinkers50 some years ago referred to it as the Oscars of management thinking. I think they fell in love with that description, and everybody associated with the organization loves it too. So that’s the easiest way to describe it: it’s the Oscars of management thinking. Every two years they host a gala. First of all, there’s a ranked list of the 50 top thinkers in the world in management thinking. The last list included Amy Edmondson at the top, whose work on psychological safety has been incredibly important, and then a bunch of people down the list who are very familiar names to anyone who does a lot of business book reading and pays attention to the thoughts coming out of management circles. Then they also have a radar, paying attention to who’s up and coming, and then some distinguished thinking awards. I have been named to the radar, and my second-to-last book, A Future So Bright, was shortlisted for the Distinguished Award for Digital Thinking. So good stuff. And you and I met there, or saw each other there. I think we actually met through NSA.

Eddie Turner:
Yeah, we met before that, yes, in our National Speakers Association circles. And we were part of the same chapter there in New York for a little while. So we hadn’t seen each other in probably five years, and then we were able to see each other and spend some time there. You delivered just a terrific presentation. And then yes, you were included on their list of the World’s Management Thinkers to Watch, and you were shortlisted for the Distinguished Award for Digital Thinking. That’s right. That’s the radar.

Kate O’Neill:
That’s a huge honor. Thank you. It’s a huge honor. I have a lot of respect, and I’m sure you do too, for the folks who make that list and the folks whose work is recognized in any format. It’s truly some of the best people to know and benchmark yourself against, because these people just aren’t satisfied. It’s not, “I’m sitting pretty on some idea I had 20 years ago.” It’s, “I’m building on it, I’m building on it.” And that’s the kind of people I really enjoy being around.

Eddie Turner:
Yeah, that was basically a big nerd prom, we were saying to each other in jest, right? And boy, we all loved it. It was just phenomenal. So great to see you again and have you here to talk about what I heard there. I wanted my audience to get a chance to hear this from you directly. There’s a lot being said about AI around us, and there’s a lot happening with AI, and your book is so thorough in its coverage of this. I have a lot of questions to ask you, and I’m almost not even sure where to start. So let me start here. You’re a big advocate of humanizing technology, and there was a big brouhaha last week when the CEO of Anthropic, the company behind Claude, brought up what happens with technology: that there’s a bloodbath coming in the world because of AI and what it’s going to do to white-collar jobs, and that, by the way, it might even expose some inappropriate behavior. What’s your reaction to that?

Kate O’Neill:
Well, I think he’s obviously someone who’s very much on the inside of what’s going on in the AI space. He isn’t necessarily, and I think this is true for many of the CEOs and leaders in Silicon Valley, geographically as well as Silicon Valley as a metonym for the big tech industry. Those leaders don’t necessarily tend to stay as plugged into the leadership and decision-making that happens in other industries across the board. And I think there’s some truth to the reality of displacement from automation, and particularly intelligent automation. We’ve certainly seen robotic manufacturing in decades past displace jobs, but what it also did is make things safer for human workers. Some of those jobs that were replaced or displaced by automated robotic arms actually made the workplace a safer place for humans. We aren’t necessarily in a place where we can say that about AI taking away newsroom jobs, for example. But I would say there’s another facet to this consideration, which is that some of what’s happening isn’t the fault of the AI. It’s the fault of management that’s not thinking far enough down the road. So we see some examples.
Newsrooms are a good place to start, where too-rash decisions have been made when new models of AI come about, large language models, famously ChatGPT in November 2022. Even since then, every unfolding of the new emerging frontier models has caught leadership attention, management attention. Then many times you’ll see overcorrecting, overreacting, jumping the gun, so to speak, on firing whole swaths of editors or whole swaths of writers, and then having really embarrassing misprints when something like ChatGPT generates a misleading description of what the debt ceiling is, and then CNET runs it and has to print a retraction, and they’re like, well, maybe we jumped the gun a little bit here. And I think that’s where it behooves us as leaders and as managers to think about not just what the technology affords us, but what we really need to be thinking about over time. Maybe we could actually cut some jobs and use ChatGPT or other tools to streamline our efficiency now, but that doesn’t necessarily play out so well over the many cycles of evolution that our business needs to go through. There’s a process here, and I describe it in my book. It has to do with thinking about these things on a longer time horizon. I talk about it as the now-next continuum. On the now-next continuum, I talk about the harms of action and the harms of inaction. The harms of action are invoked when we move too fast to know what the consequences of that action are likely to be. The harms of inaction are invoked when we already know what we should be doing and we fail to act on it. That is the case when we look at something like climate change, for example.
There are many, many instances where organizations and industries know they should be doing more than they’re doing, and not acting on the information they have, not making more proactive decisions, is hurting everyone. It’s hurting their industry too. It’s causing a lack of trust. But the harms of action are ones we often see as it relates to technology, where people move too quickly and try to implement what they think are innovative solutions, but those solutions haven’t been thought about in a human-centered context. They haven’t been thought about as they relate to the whole ecosystem of that industry, or of that business as it relates to its marketplace. And those are considerations that I want organizations to think about.

Eddie Turner:
Yeah, and you describe this tension between future and preservation in your book in a number of ways. As I thought about it, I thought about what happened in the 2008 world, where you had the evolution of this company called Uber. People were saying, well, hey, this is great technology, this is wonderful. But then others were saying, no, wait a minute, this is a threat to our industry, so let’s put some legislation in place and get these people out of here. But really what happened is it elevated the ride experience for everybody and gave birth to other industries. And you talk about it in terms of what you call whack-a-mole legislation, where sometimes people try to use legislation to constrict the growth of AI, the growth of new technologies. Give us your thoughts on how you describe it in your book and the implications this has specifically for AI.

Kate O’Neill:
Yeah, I will back up just a little bit there, because I wouldn’t necessarily say that Uber, as an example of disruption, lifted the game or changed the game for everyone, or made the experience better for everyone. I think we still have an opportunity to learn from the ways that Uber moved too rashly and did things in ways that weren’t necessarily ethical. The taxi industry, though, also dug in its heels in ways that weren’t particularly helpful. Both industries had a lot to learn from one another. And this is where I think we have more of an opportunity through regulation: as a way to protect the end user, the end consumer, the person at the end of a series of market actions and things that happen in the real world that government is best suited to put guardrails around. But sometimes you’ll see industry try to insert itself into that process and have regulation crafted that suits it particularly, that carves out its space. They call this regulatory capture. OpenAI tried to do this. Sam Altman was lobbying Congress for particular kinds of AI controls that were going to allow them to operate the way they wanted to operate but would keep barriers in place for new entrants to that market. So we have to be careful. I definitely see regulation as a public good when it’s done well, when it’s informed, and when it has the public in mind. But I don’t see it as a public good when corporations insert themselves into that process and try to shape it in ways that are going to benefit just the business and not the people who are meant to be protected by legislation and regulations.

Eddie Turner:
Indeed. And that’s the example that you cited in the book that I thought was so apropos. And thank you for the clarification on that. What I meant was that it enhanced the ride experience in that cab drivers were forced to clean up their cars a little bit more. Oh yeah. To match the experience of the Ubers, because of the legislation that was passed. Yeah. So there’s a need to put on a bit of a harness, but not so much that you’re constricting the growth and constricting the innovation, and that’s the point you get at there. I wanted to highlight that because it’s impacting how businesses are structuring what this looks like as we go forward.

Kate O’Neill:
Yeah, and I think one of the things we’re seeing right now that’s so relevant is that we’re in a very deregulatory moment. Obviously with our current government administration in the US in particular, but in some ways around the world, there’s this tendency, this pendulum swinging toward the deregulation side. One of the things I’ve written about a lot and spoken about a lot in the last few months, since the US president took office, for example, has been that leaders need to recognize that that isn’t a free pass toward operating as if there are no guardrails in place, particularly when we’re talking about AI and other emerging technology. There are still going to be downstream consequences. Some of that may have to do with emerging regulations that come years from now. Some of it may have to do with operating in other international markets where there are different regulations in place. But some of it may just have to do with the fallout of public trust, if you overstep what people understand to be the right way to handle data, privacy, and other things like that. So even in a deregulatory environment like the one we’re in right now, I think leaders need to recognize that what that does is actually shift the responsibility onto the shoulders of CEOs, boards of directors, and other leaders to think ahead and put their own guardrails in place, to make sure they’re thinking in a responsible way about the future and about how they actually build and deploy their technology solutions so that they’re future-ready.

Eddie Turner:
Yeah, and you give a good example in your book about, I forget the company, that had the facial recognition technology and how that got them into a little bit of hot water.

Kate O’Neill:
Yeah, Clearview.

Eddie Turner:
Clearview, there we go. And how that got them into a little bit of hot water. But yeah, it’s that balance, you know: how do we give people what they need in the future, when part of having technological advancements means that in some ways we sacrifice some amount of trust in some areas?

Kate O’Neill:
Yeah, that’s true. I just want to jump in quickly on that, because if you pay attention to the Edelman Trust Barometer, you know that for the last four or five years, out of all the entities they survey (corporations, government, media, and NGOs, I think, are the types of entities they survey), corporations hold the highest level of public trust. If you think about that as a CEO or as a leader inside a corporation, as many of our listeners I assume are, there is an incredible opportunity inherent in that: if you are the entity the public looks to with trust, you have the opportunity to shape the discourse, to move things forward in ways that are aligned with human outcomes, and to really establish yourselves as the people who are thinking ahead and developing innovative solutions that are very aligned with where humans want to be.

Eddie Turner:
So say that again, the Edelman Trust Barometer?

Kate O’Neill:
Yeah, the Edelman Trust Barometer. If folks who are listening aren’t familiar, E D E L M A N is the name of the PR agency. They run a survey every year; they study trust and look across these different types of entities. And for these last four or five years or so, the type of entity the public most trusts is corporations. It’s not government, not media, and so on. Yeah, it is, right? And I think that’s just been such an incredible opportunity for corporate leaders to really own that leadership space and say, look, if we’re the ones people are looking to with trust and saying, we trust what you have to say about the emerging topics, we trust what you’re offering into this whole marketplace of ideas, that’s an incredibly important place to be. And when we think about that relative to the deregulatory moment we’re in, relative to the striking down of lots of DEI legislation and recommendations, relative to all these different things, it’s such a moment for leaders to step into an ethical, moral, responsible leadership role and really set the stage for what should be happening in the public discourse.

Eddie Turner:
That’s a powerful statistic and I really appreciate you sharing that and bringing that resource to to light for us, Kate.

Kate O’Neill:
Absolutely, Eddie.

Eddie Turner:
Well, I’m going to pause right now in the midst of this conversation I’m having with Kate O’Neill, the Tech Humanist and CEO of KO Insights, to acknowledge the sponsor of the Keep Leading podcast.

If a single employee’s indecision can cost an organization $10,000 to a million dollars, imagine the potential financial impact when more individuals are added to this indecision equation. It can spiral out of control quickly. What’s the solution? Decision X. It’s a bespoke, on-demand service designed to help your leaders overcome indecision and move forward with their work. Visit PapillonMDC and discover how you can help your team get unstuck, shift perspective, and advance today. That’s p a p i l l o n m d c dot ca. And the Keep Leading podcast is part of the C-Suite Radio Network. Together we’re turning up the volume on business. Visit c-suiteradio.com. In addition to the video podcast, you can get the audio podcast there or anywhere you download your podcasts. So, I’m enjoying my conversation with Kate O’Neill, and we’re discussing her latest book, What’s Next?

Kate O’Neill:
What matters next.

Eddie Turner:
Sorry, What Matters Next, and we’re talking about the leadership that matters next. If I can get my screens right. Sorry, there we go. What Matters Next: Making Human-Friendly Tech Decisions in a World That’s Moving Too Fast. So Kate, one of the questions I had for you is about integrating AI in an organization, which all leaders are trying to do right now. How can leaders integrate AI while preserving human agency and creativity?

Kate O’Neill:
Yeah, I think it’s really not an overstatement or exaggeration to say that we’re living through the greatest leadership challenge of our time. Leaders who are leading companies that are increasingly becoming tech companies, companies that maybe weren’t tech companies before, are having to make decisions about technology that’s moving faster than human wisdom. That is the problem we face right now. The “moving too fast” part of the subtitle of my book is no accident, because as I speak to audiences of CEOs and leaders across industries, the thing I hear so often is that it’s just so much to keep up with. People feel overwhelmed, daunted by the rate of change, and AI is one of the big drivers of that rate of change. So one of the things I encourage people to do is to think in a human-centered way about their organization, and the way you can think in a human-centered way is to think about human values and human contribution. One of the most key human experiences is that of meaning and purpose. Now, I know we talk a lot about purpose as it relates to business strategy; it’s been a very common topic these last few years. But the common discourse, the way we often talk about purpose, sounds a little esoteric, a little flighty, a little soft and fuzzy. The way I mean purpose is as organizational strategy. I actually mean it’s part of the origin story of the business, part of the foundational operating model of the business. It’s really that three-to-five-word, very crisp distillation of what the company exists to do and is trying to do at scale.
And the clearer an organization can get on that, the easier it is for everybody within the organization to be aligned in the work that they do, and for technology to be deployed in ways that actually accelerate that organizational purpose, ways that would be impossible without that clarity. So what we’re looking for is strategic clarity, and we want that to be a clear articulation. Having done that, then we can begin to look at technology as disruption, as transformation, as innovation. But we definitely want to be thinking about that across all the human stakeholders in our ecosystem before we ever start thinking, oh, that’s a shiny thing, let’s see how that shiny thing fits into our business right now.

Eddie Turner:
That’s why you say move fast but don’t break humanity.

Kate O’Neill:
Yeah, yeah, because I think one of the flaws, one of the tendencies people have, and it’s a completely understandable tendency, is to grasp for the shiny thing, right? There’s a new AI model, and now the CEO wants to know, what’s our AI strategy? But we don’t have an AI strategy. Nobody has an AI strategy except maybe OpenAI or Anthropic, the companies that have AI as part of their core underpinnings. If you are anything other than an AI company, you as an organization have a strategy, which is related to that organizational purpose articulation, and that strategy may be helped along by AI. So I think the important work is really the discipline of moving the technology out of the driver’s seat and having it be something that’s actually helping you deliver on what it is you’re trying to deliver in the first place.

Eddie Turner:
Well said. And I just have to ask you: what is Kate O’Neill’s favorite AI tool?

Kate O’Neill:
You know, I use Notion a lot; my team and I all collaborate in Notion. If you’re not familiar with Notion, it’s like a wiki sort of knowledge system. I used to use Evernote all the time, and I transitioned to Notion to be able to collaborate with my team. And Notion has an API connection to ChatGPT, so it’s a really useful integration. But then of course I’m always flipping over to Perplexity for research and to Claude for help with long-form drafting and things like that. So I’ve got a whole suite of tools, just like many people at this point, that I’m using and relying on. But I think it’s really important to own publicly that these tools are not the enemy in and of themselves. The AI slop that gets published out there, the kind of stuff that everyone recoils from and loves to hate, is the product of not letting yourself be in the driver’s seat. I always say we can’t allow machines to make meaning. We have to be the ones who make meaning ourselves. If we are trying to craft something meaningful and we use AI tools as accelerants, as brainstorming partners, that sort of thing, that’s going to result in meaningful work, as long as we stay in the driver’s seat. It’s the moment when we say, “draft a blog post about leadership,” and then just take whatever slop comes out of ChatGPT and post it to our blog, that’s when you start really handing over the keys, and that is not valuable for anybody.

Kate O’Neill:
So what about you, Eddie? What’s your favorite AI tool?

Eddie Turner:
I like the ones you mentioned. Yeah. It depends on what task I’m trying to accomplish. But clearly, the point you made about Perplexity and Claude: those are formidable tools that allow you to accomplish so much. And I like your point about Notion. I’m a big Evernote fan as well, but I haven’t migrated over to Notion, so I’ll have to do a little bit of research on that one.

Kate O’Neill:
Yeah, sending out the love to the Evernote team, because I was a long-time loyal user of that platform, and I still am a paid user, but it just hasn’t had my usage in a while. I’m very sorry, Evernote team. I still love you. But Notion has really done such a bang-up job of bringing itself up to the level of a competitor and then, I think, really surpassing whatever Evernote was accomplishing. At least for us.

Eddie Turner:
Thank you.

Kate O’Neill:
Yeah.

Eddie Turner:
Thank you for sharing that so that we’ll know. There you go. That’s a great idea. We can talk to them about that.

That’s why I said this is the leadership that matters next, right? To your point, even people who are not concerned about technology now have to be aware of technology, and as they’re integrating it, it’s touching every aspect of the employee experience. That’s right. Leaders need to know about this, but keep humanity. So thank you for sharing this good information with us. And what is the main point you want those who are hearing this podcast or watching us live to take away from our conversation?

Kate O’Neill:
I think it’s a couple of things. Leaders don’t need to be tech experts, but they do need to be human-centered as they make tech decisions, and I think that’s going to be increasingly true. The more we find tech accelerating us and moving things in faster and faster ways, the more we need to be intentional and aligned with meaning and purpose in our leadership, through technology and beyond. So that’s the real opportunity.

Eddie Turner:
That’s wonderful. Thank you. And is there a piece of advice that helps you to keep leading, or a quote that you like?

Kate O’Neill:
I have a lot of quotes that I rely on, but one distillation of this discussion that may be useful for folks is to just remember that every technology decision is really a values decision in disguise. That might help in your next round of decisions with your executive team, your board, your peers. However you do it, bring the values up, be overt and explicit about them, and make sure you’re having that conversation, because somewhere or other, someone is encoding values into your technology, so you might as well do it explicitly. And I think that’s going to produce much better results for your organization.

Eddie Turner:
Every technology decision is a values decision. That’s right. Powerful. Thank you, Kate.

Eddie Turner:
Well, folks, I want you to go out and visit KOinsights.com to learn more about Kate’s work. Kate is just phenomenal, as I mentioned at the beginning, and hopefully the last 30 minutes have given you a chance to see why. Kate, thank you for helping us to understand what matters next.

Kate O’Neill:
Thank you, Eddie. I appreciate it. It’s been a joy to be on your show.

Eddie Turner:
Thank you. And thank you for listening. That concludes this episode of the Keep Leading podcast, everyone. I’m Eddie Turner, reminding you that leadership is not about our position. It’s not about our authority. Leadership is an activity. It’s action. Leadership is not a garment that we put on or take off. We must be a leader at our core and allow it to emanate in all that we do. So whatever you’re doing, always keep leading.