
Sell Me This Podcast
Sell Me This Podcast is a deep dive into the intricate world of enterprise technology sales and procurement. Hosted by Keith Daser, each episode unravels the strategies, tactics, and human psychology behind how business-oriented technology solutions are bought and sold. Designed for corporate buyers, technology sales professionals, and business leaders, the podcast provides actionable insights to help maximize the value of tech investments. Expect engaging interviews with industry experts, real-world case studies, and practical advice. Tune in to demystify the tech sales process and gain invaluable tips for navigating your next big purchase.
Sell Me This Podcast
Turning Frustration into Growth & Leading with Outcomes with Matthew Nielsen
On this episode of the Sell Me This Podcast, Keith Daser talks with Matthew Nielsen, founder and CEO of Ethicrithm and former CEO of Fishbone Analytics. With decades of experience in IT outsourcing, enterprise analytics, and digital transformation, Matthew shares how turning customer frustrations into solutions helped him build and scale a thriving business.
They explore the key decisions that fueled Fishbone’s growth and eventual sale, from early-stage pivots to leadership lessons that shaped the company’s trajectory. Matthew also explains why measuring service performance goes beyond checking off SLA boxes and how real value comes from aligning with business outcomes.
The conversation wraps with a look at his latest venture, Ethicrithm, and how it's helping organizations harness AI in practical and responsible ways. Whether you're scaling a tech company, managing client expectations, or exploring how AI fits into your strategy, this episode is packed with insight and real-world experience.
-----------------------------------------------------------------------------------------------------------------------------
If you believe you deserve more from your technology partnerships – connect with the team at:
https://www.deliverdigital.ca/?utm_source=videodescription&utm_id=youtube
Sell Me This Podcast is brought to you by the team at Deliver Digital, a Calgary-based consulting organization that guides progressive companies through the selection, implementation, and governance of key technology partnerships. Their work is transforming the technology solution and software provider landscape by helping organizations reduce costs and duplication, enhance vendor alignment, and establish sustainable operating models that empower digital progress.
This episode of the Sell Me This Podcast was expertly edited, filmed, and produced by Laila Hobbs and Bretten Roissl of Social Launch Labs, who deliver top-tier storytelling and technical excellence. A special thanks to the entire team for their dedication to crafting compelling content that engages, connects, and inspires.
Find the team at Social Launch Labs at:
www.sociallaunchlabs.com
There's so much hype and there's so much investment that is going in and they're wildly overvalued companies and, again, I'm not sure that I have the wherewithal to pick the winners and the losers out of this.
Speaker 2:Welcome to Sell Me This Podcast. Today we're joined by Matthew Nielsen, the founder of Ethicrithm and the former CEO of Fishbone Analytics. He has decades of experience in IT outsourcing, analytics, and enterprise transformation. Matthew shares how he turned customer frustrations into a thriving business and why measuring service performance is more than just hitting SLAs. We dive into the pivotal decisions defining Fishbone's success and sale, and how his new venture, Ethicrithm, is helping organizations harness AI through a partnership with Palantir. If you've ever wondered how to scale AI in a way that's both ethical and impactful, this one's for you. Let's dive in. Matthew, we are so excited to have you on the show here today. Thanks so much for taking the time. I'm going to dive right into things. I know you have an incredible story and an incredible journey around how you arrived where you are today. You have a startup background, and you have a passion for building businesses. I would love to start off by diving right into your journey in building Fishbone and how that brought you to where you are today.
Speaker 1:Thanks, Keith, it's great to be here. My story started a long time ago in IT outsourcing. I worked for some very large outsourcing companies and with a lot of organizations across Canada, and I saw systemic failures in outsourcing. I was part of some outsourcing engagements that worked really well, but there were more and more that just did not, and I spent a lot of time contemplating why that was and, if I could do something different, what it would be. I felt that if I could measure the performance of outsourcing better and differently than what was being done, then I could help clients extract more value from those relationships and basically create an accountability engine over those relationships.
Speaker 1:It really crystallized for me when there was a large energy company in town with tens of thousands of employees, and the CIO, who was new, surveyed their entire community and asked them how IT was performing for them. They had four major outsourcers delivering services, and in any given month they were all green on their SLAs. But 60% of the population of that company ranged from not very happy to very unhappy with the service that was being provided. It led me to this aha moment that there's a customer experience chasm between what a service level agreement says and what a customer actually experiences through service delivery. And so I gathered some executives that I could trust with a secret, that I was thinking about doing something different, and I gave them beer and pizza over a series of sessions.
Speaker 1:I called them my customer advisory board, and I walked them through what I thought the problem was, and when we agreed on the problem statement, I walked them through what I thought the solution was. It was very iterative, and they had some really valuable feedback for me through that process. So when we landed on the solution statement, I turned to them and said, that's fantastic, thank you for your help, now I need your money. And with that I launched Fishbone Analytics as a customer-funded company.
Speaker 1:We walked into a couple of contracts right away, and our mission was really to build an analytics engine that would take all the data created through service management and service delivery and apply ITIL- and Six Sigma-style deep analytical views, so that we could measure performance differently and better than what was being done previously.
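(As a rough illustration of the gap Matthew described earlier between green SLA scorecards and what employees actually report, here is a minimal Python sketch with entirely made-up numbers; the data structures are hypothetical, not Fishbone's actual model.)

# Hypothetical monthly scorecard: every outsourcer reports green SLAs,
# yet most surveyed employees are unhappy with the service they receive.
sla_results = {  # percent of SLA targets met, per outsourcer (made up)
    "Outsourcer A": 99.2,
    "Outsourcer B": 98.7,
    "Outsourcer C": 99.5,
    "Outsourcer D": 98.9,
}
survey = {"very happy": 10, "happy": 30, "not very happy": 40, "very unhappy": 20}  # % of respondents

all_green = all(score >= 98.0 for score in sla_results.values())
unhappy_pct = survey["not very happy"] + survey["very unhappy"]

print(f"All SLAs green: {all_green}")                 # True
print(f"Employees unhappy with IT: {unhappy_pct}%")   # 60 - the experience chasm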
Speaker 1:And that worked very well until oil went to $20 a barrel. The analytics we were providing was really a discretionary spend, and I was looking at an oil recession in Calgary and realized that, A, I needed to diversify my client base and geography and, B, I needed to really listen to what our customers were asking us to do. I had dinner with one of my clients, and he was talking about his cost of managing a ServiceNow application, and I more or less said, tongue in cheek, why don't you let us do that? We're taking all the data from it anyway. And he said okay. So then I hustled, like entrepreneurs do, and I hired a very intelligent guy that I used to work with, Mark Raychuck, and asked him to take a course, and he did, and then we took on managing this application. That was the beginning of a pivot into what eventually became a full ServiceNow practice at Fishbone Analytics, and that's really where we grew and scaled our business, which led to a successful exit to a private equity company in California.
Speaker 2:I love that story, and I think about that problem you described at the very beginning. I come from the service industry, and there are so many times when you're pointing at this board of all green and saying all of the checkmarks are saying the right things, even NPS could be strong, your SLAs could be strong, but there's something hiding behind the curtain that just doesn't feel right, and so the ability to measure that is, I can see, incredibly powerful. How did you find that pivot from the data analytics company around customer success to really honing in on ServiceNow? Was that a hard pivot for you? Was it easy because your customers took you there? What did that feel like?
Speaker 1:Yeah, I think the advantage of being a customer-funded company is that, every month, your customers are voting with their dollars, and so you really have to follow the thing that is most important to them. So when we were pivoting, I wouldn't even characterize it as a pivot; I was just following the revenue and following our customers, with a focus on making our customers really happy. At the beginning I didn't wake up and go, we're going to become a ServiceNow partner. We just grew into that and realized, based on good fortune and good timing, that it was a market that probably had more legs than what we were trying to do previously. It was really only in hindsight that I looked back on it and thought, oh, we actually pivoted the company; we started doing A and ended doing B, and thank goodness we did.
Speaker 2:Yeah, and I feel like that ability to take those leaps and take those steps is really important. I was on a hike last summer with someone just as we were starting our business, and her advice to me was: don't fall in love with your idea, fall in love with the problem. If you chase the problems, your business is going to change and evolve a hundred different times in little iterative steps. But if you get stuck on really proving out your initial idea, sometimes you can fall into a bit of a trap.
Speaker 1:Absolutely, yeah. You have idea bias: my idea is fantastic, why don't you understand that?
Speaker 2:Yeah, so then I'm assuming you gained a tremendous amount of experience learning from some of these organizations through the analytics work and the ServiceNow work. What are some of the insights you learned from those initial engagements that are really shaping what you're building with Ethicrithm?
Speaker 1:Yeah. So Fishbone was, I guess, a 10-year journey for me, including the time I spent with the company that acquired us, and I couldn't have asked for a better education through the whole process, everything from starting a company and contemplating cash flow and the implications it has on scale, or vice versa, to bringing structure to how you operate a company and how you deliver services. I've always been very passionate about creating what I call raving fans, and I've always subscribed to this notion that if you keep your clients really happy, A, it's good for long-term revenue, it's just good business fundamentals, but also they become the best marketing engine that you could have. Going through this process, I maintained my passion for creating raving fans, and I really got schooled on the fundamentals of business, of finance, of contracting, and of just managing an operation. I've taken a lot of those lessons forward, and I can give you a few examples of things I learned that I am absolutely taking forward in my new company, Ethicrithm.
Speaker 1:One is that Fishbone grew organically. I had a plan, but it wasn't very structured, and I didn't think about things like equity for employees, so we never had a corporate structure set up that we could easily allocate equity to our team through. When it came time for an exit, we ended up bonusing people that worked for us, and I was very happy to do it, but it was a terribly inefficient use of our capital, because everyone takes that as employment income and half of the money goes to the government. In Ethicrithm we're structuring it so that all employees are equity participants, because, A, we want to be able to deliver cash in the most tax-efficient manner possible, but also because I think it changes the discussions you can have with your team. If everyone has an owner's mindset, then there's permission to have difficult conversations about the things you have to do as an organization at times, to react to situations like the one we're finding ourselves in now with tariffs. So that was a big lesson: allocating equity because it's the right thing to do and because it helps you drive your business forward with everyone in the same boat, rowing in the same direction.
Speaker 1:Something else I learned: Fishbone, for the 60-ish people we were when we exited the business, was, I feel, quite a well-run organization. We had plans, we had structure, we had KPIs and benchmarks. We knew what we were doing, but we were not built for scale. We were not capable of going from 60 people to 500 people, and that had implications for us after the acquisition, where all of our processes changed to another operating standard and our tools changed. It created, candidly, a lot of frustration for our team, and I would not like to be in that position again.
Speaker 1:So, as I build Ethicrithm, before we hired anyone external into the organization, before we landed on our value proposition, before we started looking for customers, we spent four months just building the infrastructure of the organization: putting in our financial system, our professional services automation platform, and our HR management system.
Speaker 1:So when we were ready to start, we weren't having to think about these things, we weren't having to build on the fly. It was already established, and it sounds ridiculous, but it was a big achievement that our first employee that we hired, Jamie, had a laptop before he started. It was secure, it was in our security model, his email accounts were provisioned, and he had a system to log into to enter his payroll details; it was just done. So just onboarding the first employee and not really screwing it up was a big accomplishment. And now we're at the point where we're starting to look at scaling. We're hiring more people, we're in customer acquisition mode, and I just don't need to think about how they're going to request their T4 slip, or how they're going to request time off, or how we're going to account for that in our scheduling software.
Speaker 2:Yeah, so you can really focus on growing the business and building the business, rather than putting in that infrastructure.
Speaker 1:Growing the business, building the business, but also becoming a standards-based delivery organization. I talk a lot about raving fans, which I've mentioned, and I feel like one of the secrets to creating raving fans is to be very consistent in what you deliver. I had a hypothesis years ago that you can deliver consistently bad service and your customers might be okay with you, and you can deliver consistently good service and your clients will be really happy with you, but if you deliver inconsistent service and they don't know what to expect from you, that is a recipe for disaster. We're really focused on becoming a standards-based delivery organization, so that we have scheduled checkpoints with our clients, that we are annotating code in a particular way, that everything is structured, and we have accountability down to the individual that's delivering a service for our client, but also so we can share accountability with our clients on the things that they need to do to make us successful in their environment.
Speaker 2:Of course, and I think that's a really interesting jumping-off point. I've had interesting discussions with folks, even CTOs out of Silicon Valley at 200-person organizations, where the deployment strategy for an employee's first day is, well, go grab a MacBook out of the closet. The ability to have that infrastructure in place, I think, is a huge advantage, especially in a space like AI, which is so in flux right now; there's so much opportunity but also a lot of uncertainty about what it really means. So, in that AI space, and I do want to spend some time diving into your thoughts and perspective around AI, but let's first start with Ethicrithm. Tell me about the organization. I know we've talked about the structure, and I know we've talked about some of the foundations you put in place, but what is it that you're delivering to customers? What are you looking to build, and what's the journey that you're on?
Speaker 1:Yeah, so our journey is to become the most trusted Palantir partner in the ecosystem.
Speaker 2:Sorry, for our listeners, who's Palantir?
Speaker 1:Yeah, Palantir is a US-based enterprise intelligent workflow and AI organization. They are the antithesis of SaaS. Where you've got workflows that need intelligence, that you can apply AI to, and that you don't want to manipulate your business process to fit into, Palantir is a great option, because it's a very flexible platform where you can consume enormous amounts of data from as many sources as you can contemplate, have that data in a central location, apply workflow to it, make that workflow intelligent, and then bring AI to bear. We talk internally about this concept of a self-optimizing enterprise, and for me, that's an enterprise that will leverage technology to the best of its ability, and at some point in the future that technology is going to start to identify inefficiencies in processes and start to self-correct, at first with humans in the loop and then potentially autonomously. When an organization gets to this vision of the self-optimizing enterprise, I feel like they're going to have an unfair competitive advantage and they're going to return more shareholder value.
Speaker 2:Amazing. So what role does your organization play in helping customers get there? Because obviously that is a really interesting vision and I think that there's lots of discussion around the role that AI is going to play in this new world of business. But in taking those first steps, what's the role that you play with companies and organizations?
Speaker 1:Yeah, I think step number one is helping them identify use cases where there's a return on the investment.
Speaker 1:Like all good platforms, there is a cost to the platform and there's a cost to implement the platform.
Speaker 1:Compared to the typical SI route, where you will subscribe to many platforms and then stitch them together, there's an opportunity for considerable technical debt, and also reliance on SIs who have high consulting rates and are there for years, not months. Notwithstanding that, there is a cost to implement, and if there's no real return, then it's not a project that should be contemplated. So I think step one is asking: what are the biggest problems you've got in your business, and if you could solve them, what does that mean for you as an organization? Are you improving your employee retention so you're reducing your HR costs? Are you creating new revenue opportunities? Are you becoming more efficient? Are you satisfying a compliance requirement that could stop you from being fined however many millions of dollars? Once we land on those use cases, then it's really about proving out the model: consuming data, working with the clients as subject matter experts of their own business, and applying the workflows and AI on top of that so they can learn and drive the outcome they're looking for.
Speaker 2:Interesting. So is most of the work then coaching leaders on how to identify those things? Do you go in and help them find them? Because a lot of the business leaders I talk to are very curious about AI. There's a degree of healthy optimism, and there's an expectation from their board, from their shareholders, from their executive leadership teams to do something with it. But I feel like where people are getting stuck is those first steps of what do I practically do? How would you advise a leader to identify some of those workflows, and what makes a really good candidate for initial AI jobs?
Speaker 1:Yeah, so I think it comes down to what are the problems that you face as a business. We're talking to an organization in Western Canada that is involved in natural resources, and they have a business unit that is underperforming by about 60% compared to industry, and so that becomes the basis of the conversation. Why is that happening? What is unique about what you're doing that is leading to these inefficiencies? For them, we've discovered that it has to do with resource and asset management and utilization. They are distributed over a couple of provinces, and they have these big, heavy assets that get stranded out of sight and not used, or that someone thinks are not available, so they go rent a heavy asset to work on a project while this idle asset sits over here. Then there's just being able to schedule people and stitch projects together in a way that is most efficient. Our thesis with that client right now is that we can help them cut their inefficiency in half by the end of this year. They'll still be trailing industry, but by the end of next year they expect to be at the industry standard for gross margin on that particular business unit.
Speaker 1:AI comes into it by helping schedule and plan out asset utilization and asset location, as well as the individuals that are going to be operating those machines and completing the work. Just to give you an example, we can consume data from things like Caterpillar, and we can see when assets are being used and when they're sitting idle. We can see where they are, what locations they're at, and how long they've been sitting there. We can pull all that data in with other metrics from the business and ask: what is the utilization of this asset? How much time has it sat running idle?
Speaker 1:Because someone is taking their lunch break in the air-conditioned cabin, which runs down the warranty, which consumes fuel, which erodes the resale value of that machine; there are lots of implications to someone just having lunch in an air-conditioned cab. We can take all these data points and then start to map out: this is the order in which you should be completing projects, these assets should be allocated to those projects, and these are the teams that should be involved. For a human to do all of that, it would be a lot of analysis, and I'm not sure they could actually get to the end of the task without the use of AI.
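(A minimal Python sketch of the kind of idle-time analysis described above; the telemetry records, field names, and thresholds are hypothetical, not taken from Caterpillar's or Palantir's actual interfaces.)

from datetime import datetime

# Hypothetical telemetry rows: (asset_id, timestamp, engine_on, ground_speed_kmh, site)
telemetry = [
    ("EX-204", datetime(2025, 5, 1, 11, 55), True, 0.0, "Site 7"),
    ("EX-204", datetime(2025, 5, 1, 12, 25), True, 0.0, "Site 7"),   # idling through lunch
    ("EX-204", datetime(2025, 5, 1, 12, 55), True, 4.2, "Site 7"),
    ("DZ-118", datetime(2025, 5, 1, 11, 55), False, 0.0, "Yard"),    # stranded, engine off
]

def idle_minutes(rows, asset_id, sample_minutes=30):
    """Count minutes an asset spent with the engine running but not moving."""
    return sum(
        sample_minutes
        for a, _, engine_on, speed, _ in rows
        if a == asset_id and engine_on and speed == 0.0
    )

print(idle_minutes(telemetry, "EX-204"))  # 60 minutes of fuel burn with no work done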
Speaker 2:Well, that's the question I was going to ask. Is it something they just don't have the bandwidth for, the capacity, the capability? Because it sounds like a huge financial problem for them. It sounds like it's costing them a ton of money. Have they just not been able to apply people to these challenges, or where does AI come in to solve those things?
Speaker 1:So this particular organization has multiple business units. We've got one business unit that's underperforming, but what you create can be applied equally to the other business units. And I think there's just not enough capacity within a human or a team to be very efficient at this, because there's so much data that gets consumed, so many different ways you can correlate it, and so many trends that it's just difficult for our brains to comprehend. How do you take data that's over here and data that's over here, that you may not have total visibility into or understand the context of, and bring those together to learn something new that you didn't know yesterday?
Speaker 2:I think that's very difficult to do, and AI is a perfect use case for that. So I'm going to open up a tiny bit of a can of worms here with a conversation I know we've had previously around SaaS applications. You talked about Palantir being a little bit of a SaaS killer, and there have been some big statements out there saying, hey, you know what? SaaS is dead. What is the problem that organizations have with their SaaS platforms right now, and how does AI help them solve that?
Speaker 1:I think I'll talk about SaaS, and then I'll talk about AI. Generally, SaaS requires a business to conform to the standards that have been established in the SaaS product. If you choose to go outside of that, you will often create technical debt and a lot of complexity when it comes to managing the currency of that application, but also all the interdependencies and integrations that surround it. I think we're at a point now where platforms like Palantir are so capable that you can develop many applications within one platform that work well together, that eliminate the interdependencies and technical debt that get created, and you build them to align to the business process you've been creating over decades in your organization, whereas with SaaS you're having to change some of those business processes to avoid technical debt. So I think, as a business user, you don't want to conform to what someone else has prescribed in industry. Sometimes those industry standards are better, for sure, but sometimes they're not.
Speaker 2:That makes sense, because in a lot of those conversations (and I'm sure you've had these before, even in your previous life) it's cheaper, more efficient, more practical to adjust your business process rather than try to customize a software platform to do the thing you want to do. Maybe you have 30 people doing something a certain way, and instead of automating it you just add an extra step over here, and that actually starts to make way more sense versus trying to write a whole bunch of code or customizations around that ecosystem. What you're saying is that in this new world, that challenge goes away, and you can essentially start to map exactly how people work and solve it with technical amplification, rather than having to fragment it to fit certain molds that are precast.
Speaker 1:Yes. I think one of the advantages of Palantir, and I'll just take a step back here: before we landed on Palantir, we evaluated a lot of different platforms and applications. We contemplated being AI consulting generalists, but I honestly just couldn't get my head around that, and I had a fear about working with a client and recommending a platform that may not exist in two years, because there's so much hype and so much investment into AI today that 90% of those companies could well not exist in the next 24 to 36 months, just because they run out of funding or they get acquired by one of the big players.
Speaker 1:And it was really important to me that we landed on a platform that we felt very comfortable with. So we looked for a platform that was enduring, that would be around for the long haul, that provided deep capabilities around enterprise intelligent workflow, that could help our clients realize a vision of their self-optimizing enterprise, and that was very AI-forward. Through our process of elimination, we got down to Palantir, which is a 19-year-old company. They are very high growth, they are profitable, they're debt-free, and they're investing significant amounts of their capital in developing the Palantir Foundry and AIP platforms.
Speaker 2:A lot of the work people do is evaluating these platforms, and their mindset is going to go from evaluating a SaaS platform that does one specific thing to making a much bigger decision around evaluating a singular platform that can deliver an incredible amount of enterprise value. But what I've also seen is that as the value of those decisions goes up, so does the decision paralysis. If you were to lend some of the experience you had in evaluating over the last four months, what are some of the considerations you had? You talked about the business, you talked about the stability. Was technology part of that, or is it more of a business conversation at that point?
Speaker 1:So I think it's both, and thank you, you triggered where I was going with my last point, which was that it doesn't have to be a fundamental shift to start.
Speaker 1:All companies have platforms that they have invested hundreds of thousands to millions to hundreds of millions of dollars in, and one of the things I like about Palantir is they don't necessarily have to walk away from that investment. Palantir has the flexibility to pull data from many sources. There's a use case for AT&T where they have 995 integrations pulling data into Palantir's Foundry platform, and then they operate on that data and those workflows inside of Palantir. They're not requiring people to work in those 995 applications; the work moves into Palantir and it writes back to those applications. If you're not ready to rip and replace and rebuild your ERP or your data warehouse, Palantir can sit over top of that. It can consume the data and it can write back. So, regardless of where a customer is in their journey of trying to create a more intelligent operating infrastructure, Palantir fits really well on top of that, and over time you can deprecate those platforms and rebuild those workflows in a more intelligent way inside of Palantir.
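(A loose, hypothetical sketch of the integration pattern being described: pull data from many source systems into one central view, operate on it there, and write decisions back. The connector classes below are stand-ins, not Palantir's actual APIs.)

# Hypothetical connectors standing in for an ERP, a telematics feed, etc.
class SourceSystem:
    def __init__(self, name, records):
        self.name, self.records = name, records

    def extract(self):
        # Tag each record with its origin so the central view stays traceable.
        return [dict(r, source=self.name) for r in self.records]

    def write_back(self, updates):
        print(f"{self.name}: applied {len(updates)} updates")

sources = [
    SourceSystem("ERP", [{"asset": "EX-204", "status": "available"}]),
    SourceSystem("Telematics", [{"asset": "EX-204", "idle_hours": 14}]),
]

# 1. Consume data from every source into one central view.
central = [row for s in sources for row in s.extract()]

# 2. Operate on the unified data (here: flag heavily idle assets).
flagged = [r for r in central if r.get("idle_hours", 0) > 8]

# 3. Write decisions back to the systems of record instead of working in each one.
for s in sources:
    s.write_back([{"asset": r["asset"], "status": "reassign"} for r in flagged])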
Speaker 2:Interesting. So one of the things you brought up, which has popped into my head, is this idea that SaaS is going away; you're really saying there's a platform that people can build on. Are there entry gates to that? Is there a size of organization that works for it? If I'm a large oil and gas company, obviously I think that makes a lot of sense. If I'm a $10 million construction company that has a couple of folks and doesn't have a ton of technical depth, is this something that's in my wheelhouse, or is this something that is solely going to serve the big guys out there?
Speaker 1:It used to be for the big guys. I think Palantir has recognized that there is a downstream market opportunity for them, so they created what they call Palantir for Builders, or Foundry for Builders, so it scales down to startups. We are a startup, we're 10 or 12 employees, and we own Foundry and we're building some of our core workflows inside of Foundry.
Speaker 2:So this is, I think, where we can lean on some really practical examples for people that are building their businesses. What are some of the practical examples, if you're okay sharing this, of how you're using AI and those Foundry workflows within Ethicrithm?
Speaker 1:Yeah, every week is a new adventure for me, because the team is doing more things. The use case we're working on right now is around helping us identify candidates to go through our hiring process. So I more or less got clickbaited into creating a job opening on LinkedIn, which was free and then ended up costing me $500.
Speaker 2:I have the same story. We can share that one afterwards.
Speaker 1:Yeah, and by the end of the first day we had parameters: a minimum of three years of Python and TypeScript skills, must be in Canada, and a few other variables. And I really didn't expect that we would get very many resumes from this.
Speaker 1:In the first day we had 120 resumes. By the end of the first week we had 400 resumes, and when you're a small organization, taking the time to manually review all these resumes is very difficult. We created a workflow where we can ingest all the applications, and then AI will parse through the resumes, provide a summary of the experience, a summary paragraph, and also make a recommendation on whether this is someone we should be talking to or not. There are a couple of things I like about that. One, it's highly efficient, and two, it takes away the bias that someone would have in reading a resume. They're not looking at a name and introducing bias. They're not looking at the company the person used to work for and introducing bias. It's based just on the skills and experience they've expressed in their resume, and then providing a summary of that and making a recommendation.
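(A minimal sketch of the kind of screening workflow described here, with the bias-reduction step made explicit. The field names, skill list, and scoring rule are hypothetical, and the summarization that a real implementation would hand to a language model is reduced to a simple string.)

import re

REQUIRED_SKILLS = {"python", "typescript"}
MIN_YEARS = 3

def anonymize(resume_text, candidate_name, last_employer):
    """Strip the fields most likely to introduce bias before review."""
    text = resume_text.replace(candidate_name, "[CANDIDATE]")
    return text.replace(last_employer, "[EMPLOYER]")

def screen(resume_text, years_experience, country):
    """Return a recommendation based only on declared skills and experience."""
    skills_found = {s for s in REQUIRED_SKILLS if re.search(s, resume_text, re.IGNORECASE)}
    qualified = (
        skills_found == REQUIRED_SKILLS
        and years_experience >= MIN_YEARS
        and country == "Canada"
    )
    summary = f"Skills matched: {sorted(skills_found)}; {years_experience} yrs; {country}"
    return {"summary": summary, "recommend_interview": qualified}

resume = anonymize("Jane Doe, 5 years Python and TypeScript at Acme Corp", "Jane Doe", "Acme Corp")
print(screen(resume, years_experience=5, country="Canada"))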
Speaker 2:Are there any other examples you can think of?
Speaker 1:By the end of the year, I would venture that most of our business will be run on platform.
Speaker 2:So I'm going to pivot a tiny bit into a more general AI conversation. You talked about the use case of your HR hiring, and there are a ton of productized AI elements out there, not dissimilar to SaaS applications, but AI for very specific use cases. How are you viewing those in the ecosystem? Is that something companies should be exploring for niche elements, or should they be looking at the broader AI conversation before investing in some of those individual elements?
Speaker 1:I think they should do both. There are some really innovative companies out there, and you can learn from them. My fear about the innovation happening in the AI market in general right now is that there's so much hype and so much investment going in, and there are wildly overvalued companies, and again, I'm not sure I have the wherewithal to pick the winners and the losers out of this. As an operator, when I'm looking at platforms, that's going to be part of my decision criteria: is this niche platform, this niche capability, going to exist in two years, or is it going to lose funding or get acquired by a competitive organization who sunsets it? There are many things that can happen, and this happened during the dot-com boom, right? So I would be fearful of picking niche players.
Speaker 2:Yeah, it's almost gambling at this point, right? You find these individual organizations, and we saw it during the dot-com boom and we're seeing it even with the SaaS boom: the barriers to entry become so low, the valuations become crazy, which increases the marketing budget, and essentially all of these organizations contribute to an enormous amount of vendor sprawl. I think we're starting to see some of that same thing happen on the AI front, where all of these individual business units are picking these niche players without an overall strategy.
Speaker 1:I also think your risk exposure goes up quite a bit when you've got sprawl and you don't understand how your data is being used or how it's feeding large language model training and so on.
Speaker 1:I've got fear about that.
Speaker 1:I do think there's a lot of consumerization around AI, which is infiltrating boards and executives. They're looking at the ChatGPTs and the Geminis of the world, which do amazing things, and they're trying to figure out how to incorporate that into their business.
Speaker 1:So I think looking at the overall strategy and finding enduring companies is really important. I had a conversation with the CEO of a fintech-related company in the US earlier this week, and we talked about how the promise of AI is not being realized inside the enterprise yet. That's the opportunity, and the challenge for a large enterprise today is to figure out how they actually leverage AI so that they can drive intelligent workflow, so that they can bring AI to bear, so that they can build trust with the AI platforms, so that they feel like they can make autonomous decisions at some point in the future. There are some great use cases that have a human in the loop today, with Palantir specifically, where they're really renovating supply chain management systems and transportation systems, but it'll be a while before decisions are made autonomously. They have to have humans in the loop.
Speaker 2:That makes a lot of sense. There's no shortage of bold claims in the AI space. What are some of the red flags you're seeing when salespeople, marketers, and others in the industry make these big, bold claims around what AI can do for organizations? Are there any red flags that stand out to you that buyers should be looking out for?
Speaker 1:I'm not sure I can comment on how sellers are selling their platforms. I do think that buyers and enterprises need to really contemplate the implications of bringing in AI. It's not easy; it requires integration, and it requires data to drive the decisions. It also affects how people work, and there's a big organizational change management component required when you're introducing these intelligent workflows into your environment. I don't think that should be underestimated, and I think executives need to contemplate the fortitude they have to embrace change, because it isn't easy. It's very disruptive, and if it's not well thought out, it's not well executed, and people don't understand why it's important to do what you're doing, then, like many technology initiatives, it's destined for failure.
Speaker 2:Yeah, so does that go back to the original comments you made around really finding those practical workflows, those practical use cases that their teams and people can connect with and are consulted on first? Is that why that step is so important?
Speaker 1:Yeah, absolutely. I think, A, is there an ROI? That should be the first question. Can we actually do something to change our earnings per share and our shareholder value? And if there is, then how do we approach this in a thoughtful way that's actually going to lead to the outcome we're intending?
Speaker 2:Yeah, so the concept of ROI in AI is a really interesting one, and when you shared the example earlier of the utilization of different worksite equipment, it became really clear to me, from an ROI perspective, how you can measure that. I think a lot of leaders have had this exposure to ChatGPT or Copilot or Gemini or some of these platforms, where they are big investments but they're having a really hard time quantifying that return and output when it comes down to individual employees. Is there a way, or a secret sauce you can think of, to best measure ROI for some of these AI initiatives and to tell whether or not they're actually working?
Speaker 1:Yeah. So I think that is a complicated question. Even in the example I gave you, with the resource company, we've been talking with their executives about how we are going to definitively measure the benefit against the cost of this project, because there are factors outside of our control. Will people take the cue given to them by the platform that says someone is having lunch in their big heavy excavator and sends them a message saying, hey, why don't you go into the work shack and stop idling the machine? If they don't do that, then there's not going to be a benefit for that little tiny slice, right? So there are lots of variables there that are out of our control.
Speaker 1:I think when you are looking at overall programs, there's a use case for the supply chain organization that supports Wendy's. That's not our client, it's a Palantir initiative, but they've been able to quantify the value of improving their supply chain using data and AI on the Palantir platforms, and they have generated huge savings and also dramatically improved the time to provide supplies to their restaurants. I think they've got about 4,200 restaurants across North America. So when they run a sale event, for example, if they put something like a Frosty on sale, I think it's Frostys.
Speaker 1:If there's a particular flavor, they can start to understand the sales metrics around that particular sale event and whether there are inventory shortages in particular regions. If so, how do they load-balance that inventory? How do they short-ship stores that have a surplus of inventory to feed stores that have a shortage? And how do they engage their suppliers to create more syrup, or whatever the special ingredients are, to be able to support that? They've been able to get dialed in on their ROI in terms of time to deliver and the impact on sales. I can't speak to all industries, but it's a needle that has to be threaded in order to substantiate the value of the investment to the board.
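(A toy Python sketch of the load-balancing idea in that example: ship product from surplus locations to shortage locations during a promotion. The restaurants and quantities are entirely made up.)

# Hypothetical on-hand vs. forecast demand for a promoted item, per restaurant.
inventory = {"Store 101": 900, "Store 202": 150, "Store 303": 700}
forecast = {"Store 101": 400, "Store 202": 600, "Store 303": 500}

surplus = {s: inventory[s] - forecast[s] for s in inventory if inventory[s] > forecast[s]}
shortage = {s: forecast[s] - inventory[s] for s in inventory if inventory[s] < forecast[s]}

# Fill each shortage from the largest surpluses first.
transfers = []
for short_store, need in shortage.items():
    for surplus_store in sorted(surplus, key=surplus.get, reverse=True):
        if need <= 0:
            break
        qty = min(need, surplus[surplus_store])
        if qty > 0:
            transfers.append((surplus_store, short_store, qty))
            surplus[surplus_store] -= qty
            need -= qty

print(transfers)  # e.g. [('Store 101', 'Store 202', 450)]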
Speaker 2:Totally. Do you think that executives are prepared right now to have those conversations, or what steps in education do you think need to take place at the board and executive level to even understand the implications of some of these technologies?
Speaker 1:Yeah, I think boards need to understand the difference between consumer AI and enterprise AI. ChatGPT in and of itself may bring some user productivity, but it's not going to change the scale of your supply chain organization. It can support that, and it can integrate with some platforms like Palantir that can do that, but it's not ChatGPT in and of itself that's going to accomplish that for you.
Speaker 2:Yeah, is that one of the biggest misconceptions you see with AI? Because I feel like a lot of people think AI is how do I apply ChatGPT to my business, or how do I add a chatbot; it still feels very interaction-based, versus, I think, some of those truly meaningful steps forward, which come when you start to apply AI to decision-making and to enterprise data.
Speaker 1:Yes, I think there's a consumerization effect. I think it's like when the iPhone was introduced in 2007. It took years for applications to catch up to what the iPhone did, but there was an expectation. ServiceNow used to call it the Sunday-to-Monday effect: you were on your phone on Sunday and you've got all these beautiful applications and visual displays, and then you show up to work on Monday and you've got green screens or whatever.
Speaker 1:So I think we're going through a bit of that with AI today, and the challenge is to figure out how we incorporate it into the enterprise. Instead of asking a question and getting a contract reviewed by ChatGPT, as an example, which I've done, and it was remarkably fast, it probably saved me two hours of effort, which is fantastic, there's a logistics organization in town that uses Palantir to understand the implications of disruptions to their distribution network. If there is an event in a particular part of the continent that is going to affect their distribution network, AI, through Foundry, comes back and makes recommendations on alternate routing based on the service level agreements they've got and the inventory they're shipping, and it says: here are the three best options for you to reroute these products. Which would you like to choose? They choose, and then it executes some of those orders. That's enterprise AI. That's not ChatGPT saying hey, you should be doing this, or yes, I looked at this and it looks fantastic.
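(A small sketch of how ranking alternate routes against a service-level window might look, with a human choosing from the top options before anything executes; the routes, costs, and SLA figure are hypothetical.)

# Hypothetical alternate routes after a disruption, scored against an SLA window.
routes = [
    {"name": "Route A", "transit_hours": 30, "cost": 12000},
    {"name": "Route B", "transit_hours": 22, "cost": 18500},
    {"name": "Route C", "transit_hours": 26, "cost": 15000},
    {"name": "Route D", "transit_hours": 40, "cost": 9000},
]
SLA_HOURS = 36

def recommend(routes, top_n=3):
    """Keep routes that meet the SLA window, then rank the cheapest first."""
    feasible = [r for r in routes if r["transit_hours"] <= SLA_HOURS]
    return sorted(feasible, key=lambda r: r["cost"])[:top_n]

options = recommend(routes)
for i, r in enumerate(options, 1):
    print(f"{i}. {r['name']}: {r['transit_hours']}h, ${r['cost']:,}")
# A human picks one of the top options; only then is the change executed.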
Speaker 2:So that level of data, that level of integration, brings up some really interesting questions around ethics. And I know that Ethicrithm, the name itself, is a nod to ethics. What does ethical AI mean to you, very practically, when you're embarking on some of these changes with your customers?
Speaker 1:I think it's using the opportunity to remove bias from the information that's presented and ensuring that the outcomes aren't going to have unintended consequences. This is such a large, difficult area. AI is going to lead to ethical quandaries no matter what, just by virtue of what it is capable of doing, or the promise of what it's going to be capable of doing over the next few years. I'm not totally sure how we grapple with that as a society. Inside of an organization, I think it comes back to organizational change management and understanding what the implications of this AI are: how does it affect the way people work, or how many people do the work, and then supporting your population through that change.
Speaker 2:Yeah, and you talked about the ability to manage that change. I think there's always going to be this trade-off of move fast and break things versus move slowly and meticulously and figure out the steps along the way. Someone I talked to alluded to the idea that some of these enterprise organizations are like tanker ships, and you need the little speedboats to orbit them, take their cargo off, and bring it to shore. Is there a way to practically move fast in AI while not breaking some of these ethical considerations?
Speaker 1:I would like to think so. I think move fast and break things shouldn't apply to ethics. Personally, I think when it comes to ethical standards, everyone in an organization has to stand up for ethics, whether that's the bias that things create or the bias that you can eliminate. So I don't feel like that should be part of the move fast and break things mantra. There are other opportunities to move quickly. To be candid, I don't really subscribe to move fast and break things, but I also don't subscribe to moving really slowly and getting into analysis paralysis.
Speaker 1:I think there is a happy medium there, but I've never really contemplated that in the context of ethics. I personally wouldn't bend on my ethical perspectives in favor of speed. That's not my character.
Speaker 2:I love it. For leaders that are evaluating AI this year, what are a couple of things they can focus on to avoid falling into rabbit holes or wasting time or money? What are the things they should be looking at as they build out their strategies?
Speaker 1:I think it comes down to what the biggest problems are in their environment and in their operations. Is there data available, or data that can be created, that would lead to a different outcome, and how will that affect their business? I don't think they should be looking at the platform. I don't think they should be looking at capabilities, which is how solutions have been sold in the past, right? I have this capability, it will solve this problem for you, and it puts you in a bit of a box. I think the opportunity now is for executives to eliminate the box and just think about their core business problems. What is it that they're telling their shareholders they need to solve to be more competitive, to reduce costs, to respond to existential challenges like tariffs? What are those challenges they need to solve for? And then think about how they can approach them differently.
Speaker 2:Matthew, thank you so much for this amazing conversation. You're clearly very knowledgeable in this space. If someone wanted to connect with you and learn more, what's the best way for them to reach out?
Speaker 1:Yeah, I think going to our website, ethicrithm.com, which is like "ethical algorithms", and also on LinkedIn, Matt Nielsen.
Speaker 2:Perfect, and I'll make sure to link you in the show notes here. I really appreciate your time today, and thank you so much for coming on the show.
Speaker 1:Thanks for having me.