EP 01: Talking AI and Next Gen Manufacturing

00:00:05.511	     Am I on?
00:00:06.932	     Oh, there's no three, two, one.
00:00:09.974	     Can you remove, delete this?
00:00:11.455	     You can edit this, right?
00:00:14.035	     All right.
00:00:14.916	     Okay.
00:00:15.455	     We are live, everybody.
00:00:17.417	     Welcome to the show.
00:00:18.658     My name is Yassen Horanszky.
00:00:20.097     I'm the founder and CEO of
00:00:21.699	     Luniko and your host for today.
00:00:23.739	     And I've helped digitally
00:00:24.780	     transform businesses of all
00:00:26.420	     shapes and sizes from publicly traded
00:00:28.942	     multi-billion dollar
00:00:29.922	     international organizations
00:00:31.943	     to local nonprofits and
00:00:33.645	     everything else in between.
00:00:35.106	     And just like you,
00:00:36.445	I'm wondering where artificial intelligence and the disruptive technology that we've seen dominate headlines this last year fit in, and how.
00:00:46.171	     So that's why we've created the show,
00:00:47.853	     to bring on guests of all
00:00:48.933	     levels and all types of
00:00:50.173	     businesses to talk about AI
00:00:52.595	     and its business adoption,
00:00:53.936	     but strictly from a non-technical lens.
00:00:57.518	     We'll be touching on topics
00:00:58.598	     such as strategy, process, people,
00:01:01.500	     emotions,
00:01:02.600	     and anything else that we find
00:01:03.902	     relevant to the guests that
00:01:04.902	     we're bringing on.
00:01:06.322	     We're going to drop as much
00:01:07.584	     of the corporate jargon as
00:01:08.623	     possible and keep the
00:01:09.584	     discussion as simple as we can.
00:01:11.986	     We're going to be as casual
00:01:13.027	     as possible and have fun.
00:01:14.908	Now, the name that we wanted for the show was taken.
00:01:18.649	     So we still don't have a name, guys.
00:01:20.388	     Please continue to give your
00:01:21.450	     name suggestions.
00:01:22.230	     We highly appreciate it.
00:01:23.590	     We're not going to let a
00:01:24.611	     nameless show stop us.
00:01:26.430	     So I'm going to introduce
00:01:27.572	     you to our first guest today.
00:01:29.671	     He's coming from sunny San Diego,
00:01:31.433	     California.
00:01:32.472	And we've got Daron Giles, who's been fortunate enough to have worked on some of the most interesting advanced engineering projects of our lifetime.
00:01:41.575	     Most of his experience has
00:01:42.736	     been in aerospace manufacturing,
00:01:44.477	where he's contributed to building parts that put people in space, in airplanes, or in defense products, highly strategic and life-changing assets,
00:01:54.382	     or in his latest role where
00:01:55.841	     he's improving the
00:01:56.563	     efficiency in biotech
00:01:57.783	     manufacturing so that more
00:01:59.524	     medical solutions can
00:02:01.045	     continue to save and
00:02:02.525	     improve our daily lives.
00:02:04.426	     Now,
00:02:05.085	     Daron is not your typical IT director.
00:02:07.888	     He started out as a
00:02:08.807	     manufacturing engineer
00:02:10.068	     before becoming an
00:02:11.049	     operations manager and
00:02:12.610	     eventually taking the leap
00:02:14.390	     towards managing technology
00:02:16.431	     for a business.
00:02:17.632	     And like many engineers,
00:02:18.752	     he's a beer brewer.
00:02:21.433	And he and I met when we
00:02:23.354	     worked down in California
00:02:24.735	     on a number of digital
00:02:25.816	     transformation initiatives
00:02:27.396	     for an aerospace manufacturing company.
00:02:29.957	     So, Daron,
00:02:30.957	     thank you for being on the show today.
00:02:34.177	     Yes, thanks for having me on the show.
00:02:35.961	     It's great to talk to you.
00:02:37.304	     I always enjoy spending time
00:02:38.706	     with you and the team from Luniko.
00:02:41.979	     Yeah, no, that's awesome, Daron.
00:02:43.661	     So let's get to the questions, right?
00:02:45.622	     Just so we can kind of start on here,
00:02:47.084	     right?
00:02:47.625	     And for those that are
00:02:48.566	     watching the first time,
00:02:49.466	     we'll be asking some
00:02:50.287	     questions and kind of chatting about it.
00:02:51.927	     We might pull up ChatGPT as we need to,
00:02:55.411	     right?
00:02:55.570	     This is an AI-related show.
00:02:57.432	     But let's start with just the basics first,
00:02:59.794	     right?
00:02:59.995	     And I am going to be asking
00:03:01.015	     this to everybody, Daron.
00:03:02.197	     So what does AI mean to you?
00:03:08.143	So AI is really synthetic human intelligence, right? Human intelligence so far has only existed in humans. And now that technology has advanced to the point that it has, we're able to create artificial intelligence through computing processes that today come close to simulating human intelligence.
00:03:32.668	And I think the expectation is that in the future, it will exceed and then far exceed the intellect and capacity and capability of the human mind.
00:03:42.782	     All right.
00:03:45.604	     That makes sense.
00:03:46.865	     I mean,
00:03:47.104	     I look at it a little bit similar to you,
00:03:49.006     Daron.
00:03:49.265	     I think it's about giving
00:03:50.686     the computer the ability to think, learn,
00:03:52.667	     and adapt, right?
00:03:53.987	If I can kind of simplify it, that's a lot of the technology that we're seeing. Obviously, when people talk about AI, I think everybody thinks about ChatGPT.
00:04:02.931	     But the truth is that AI
00:04:05.312	     technology really has been
00:04:06.614	     around for a lot longer than that.
00:04:08.155	     And some of it is integrated
00:04:09.836     into our daily life and we
00:04:11.057	     don't really know it.
00:04:12.359	But this last year, right? We saw a lot of claims in terms of what AI can do.
00:04:19.564	     We saw a lot of marketing.
00:04:21.266	     We saw just a lot of things
00:04:23.086	     being pushed out.
00:04:24.548	     Now let's break out what's
00:04:26.088     marketing versus what's real.
00:04:27.348	     And it's really hard to tell
00:04:29.149	     nowadays with content being
00:04:30.790	     so curated and done so well.
00:04:32.911	But I'm just going to cut to the chase there.
00:04:34.892	     What do you think is
00:04:35.692	     bullshit when you look at
00:04:37.574	     AI and its application?
00:04:42.896	     You know,
00:04:43.476	     I think fear is the bullshit about AI,
00:04:46.997	     right?
00:04:47.338	I think you mentioned the media coverage over the generative AIs, ChatGPT and Bard,
00:04:55.771	     and it seems like
00:04:57.012	     everybody's trying to catch
00:04:57.971	     up now or be the leader.
00:05:01.514	But I think there's a lot of fear being pushed, probably a little bit by the media, right? Everybody understands that fear sells newspapers, or ads now.
00:05:17.891	And I really think people are worried, like, oh, AI is going to take my job, and AI is going to take over the world. And that's really fear, which is by nature not rational.
00:05:29.656	     I'm not so worried about that.
00:05:33.598	     So what do you think about
00:05:34.699	     the application?
00:05:35.379	     Do you think the
00:05:36.139	     applications that we're seeing that, hey,
00:05:38.661	     AI can solve this for you,
00:05:40.401	     AI can solve that for you,
00:05:41.862	     do you think that they're feasible?
00:05:44.244	     Absolutely, yeah.
00:05:45.204	I think we're starting to see some really valuable use cases. Right now it's generative AI, with both language model AI and even just generative images.
00:06:00.660	And I think another bit of the fear is generative videos, spoof videos that would make someone like Joe Biden or Donald Trump appear to be saying things or doing things that they did not actually do, yet in a way that's indistinguishable to a human.
00:06:20.206     Is that really Joe Biden doing that?
00:06:22.267	     Or is that an AI?
00:06:26.050     Yeah, I've seen some of those.
00:06:27.290	     I've seen like some of the
00:06:28.031	     Joe Rogan and things like that.
00:06:29.451	     And you can tell right now
00:06:30.531	     that they're not real.
00:06:31.992	     But I feel like, you know,
00:06:33.773	     how long is it going to be
00:06:34.872	     before you actually can't
00:06:35.894	     tell whether they're real or not, right?
00:06:40.288     So let's shift over to kind
00:06:41.889	     of an area that you're very familiar with,
00:06:44.069     right?
00:06:44.230	So you've obviously been on the shop floor, from working as a manufacturing engineer to supporting the shop floor as an IT director in your role today.
00:06:53.072	     How do you think that this
00:06:54.213	     technology is going to make
00:06:56.374	     its way to the shop floor?
00:06:59.514	You know, I think it is going to, in the long term, right?
00:07:03.596	     It's going to have a really
00:07:05.055	     significant impact on the shop floor.
00:07:08.223	     And the way I think we're
00:07:09.365	     going to see that is
00:07:10.706	     through enhancing and
00:07:13.767	     improving the work that the
00:07:16.689	     workers are doing on the
00:07:17.610	     shop floor to make
00:07:20.151	     processes and products more consistent,
00:07:22.752	     more reliable, higher quality.
00:07:26.435	     And those attributes are
00:07:28.697	     going to drive the other two goals,
00:07:32.838	     faster and lower cost.
00:07:35.204	So I'd say,
00:07:37.045	     I think artificial
00:07:39.987	     intelligence is going to
00:07:41.166	     enhance the operator,
00:07:43.487	     giving them the ability to do more,
00:07:46.290	     more reliably, more consistently.
00:07:48.389	And in these first phases of digitalization in industry, we've always used technology to hit the goals, right? Better, faster, and cheaper.
00:08:05.208     And I think it's going to continue.
00:08:06.249     We're going to see new use
00:08:07.668	     cases that continue to
00:08:09.670     evolve manufacturing processes.
00:08:14.093	     And I think an exciting thing about it now,
00:08:16.735	     part of why I think it's coming of age,
00:08:20.817	     is the younger generations,
00:08:24.238	     the generations behind us
00:08:26.500     have grown up on computers.
00:08:29.302	     And so they're entering the
00:08:30.802	     workforce with a lot more tech savvy than
00:08:35.423	     the previous generations.
00:08:37.625	     Right.
00:08:38.505	I think in the recent past, you'd find manufacturing technicians and machine operators that have that understanding, that have the hands-on manufacturing experience and know how things work, but they're afraid of computers because they're new and they represent change.
00:08:59.756	But nowadays, people entering the workforce grew up on computers. They can't imagine or relate to a time when computers didn't fulfill so many different functions in a pervasive way in our lives.
00:09:18.519	     Right.
00:09:18.818	And right now you've got a child going through university, right?
00:09:22.380	     So do you think that the
00:09:23.980	     universities right now are
00:09:25.761	     preparing for that kind of
00:09:27.903	     new wave of what the
00:09:29.244	     generational worker is
00:09:30.344	     going to look like and
00:09:31.865	     pushing some of these
00:09:32.845	     technologies so that
00:09:33.826	     they're not just ready for
00:09:35.746	     them when they kind of hit the shop floor,
00:09:37.827	     but they're the ones who
00:09:38.768	     are in a way pushing for them?
00:09:41.990	     Yeah, absolutely.
00:09:42.750	You know, I think the younger generation, with a big lead from universities, they're pushing the technology, they're pushing the envelope. I mean, a lot of the funded research for AI began in academia and is still being pursued in academia.
00:10:02.668	And I really feel like it's not just from university, because people become tech savvy from an even earlier age, right? Without going to college, everyone has cell phones and is learning a lot of new technology that was just unfathomable 20 or 30 years ago.
00:10:19.794	So I do feel like universities are preparing people in a lot of different ways, but universities aren't solely responsible for that evolution that's happening, right? There are lots of factors around the world today that are driving the adoption of technology.
00:10:42.258	     Right.
00:10:42.859	     Yeah, no, you're right on that there.
00:10:44.460	     So you've got experience in aerospace,
00:10:47.621	     you've got experience in biotech.
00:10:49.823	     Out of those two industries, right,
00:10:51.365	     and maybe some of the other
00:10:52.485	     manufacturing industries
00:10:54.166	     that you can think of, right,
00:10:55.326	     where do you think that AI
00:10:56.628	     is going to pick up faster
00:10:58.068	     and where do you think it's
00:10:58.850	     going to be taking a little
00:11:00.150	     bit more time?
00:11:02.610	     You know,
00:11:02.910	     I think what we're seeing right
00:11:05.212	     now with generative AI is
00:11:07.894	     going to see use cases in
00:11:09.815	     communications a lot, right?
00:11:11.216	     The smart chat agents are, I think,
00:11:15.239	     going to be a simpler use case, right?
00:11:18.860	I think we're probably already seeing that implemented at large scale at large organizations
00:11:24.845	     where they're able to not
00:11:27.005	     just replace human
00:11:28.927	     intellect with machine intellect,
00:11:30.927	     but enhance
00:11:32.525	     the human work and let the
00:11:37.208	     computer do the hard part,
00:11:39.171	     I guess is the way I think of that.
00:11:41.011	     So generative AI responding
00:11:46.336	     to inquiries from customers
00:11:48.076	     and clients is where I
00:11:49.957	     think we'll start to see
00:11:51.359	     initial use cases for business.
00:11:54.902	I think there are a lot of other entertainment-type use cases that will drive adoption, like the fake videos and fake imagery, fake audio.
00:12:04.971	     That has no purpose that I
00:12:06.831	     can think of in business today.
00:12:10.413	     But that doesn't mean that
00:12:11.113	     it won't turn into a useful
00:12:15.014	     tool in the future.
00:12:17.235	     We just don't know what that use is yet.
00:12:18.715	     Okay.
00:12:21.263	And then, I mean, generative AI and just other uses of AI, it's not something that you just turn on, right?
00:12:29.307	     Like what are some of the
00:12:30.227	     things that you think
00:12:31.107	     businesses need to be doing?
00:12:32.389	     Some of the key milestones
00:12:33.669	     or objectives that they
00:12:35.211	     need to meet before they're
00:12:36.431	     seriously considering going
00:12:37.751	     down this path?
00:12:39.932	     Yeah,
00:12:41.053	     so I think of AI as kind of the top
00:12:44.034	     of the technology pyramid.
00:12:46.030	Or maybe, if you think of it like a skyscraper, it's the top floors that are still under construction as the skyscraper grows closer to the sun.
00:13:00.721	I think a company has to have a solid foundation, and then the layers of technology infrastructure and capability, before trying to go for the end goal, right? Rome wasn't built in a day.
00:13:17.350	Technology platforms and technology infrastructure for organizations are a lot of work, both to design and then build, and, I think often overlooked, the maintenance activity also.
00:13:32.774	A company has to have the infrastructure to know what the use cases are or should be, to really understand the unmet need that the tool or the solution will solve. They need to have the basic infrastructure to manage that.
00:13:54.765	And basic infrastructure used to be, you have a server rack and batteries and network switches and all the other stuff. But with the cloud, that's not so necessary anymore.
00:14:07.389	Now we're dependent on the internet, right? Without an internet connection, if you don't have everything on site, then you can't do anything.
00:14:14.197	So the basic IT infrastructure, the nuts and bolts, has to be in place, and it has to be supported and maintained.
00:14:23.003	And then after that, the enterprise applications have to be in place. In the ISA-95 pyramid, which is referenced by MESA, the Manufacturing Enterprise Solutions Association, they talk about the different layers of the tech stack for manufacturing and manufacturing execution.
00:14:50.807	     You have to have each layer
00:14:55.208	     as a foundation for the next layer.
00:14:58.931	We can't just build the top without building the support structure underneath it to develop and maintain the applications as they're created.
00:15:11.556	Yeah, otherwise everything's just going to come crumbling down, which makes a lot of sense, right? So I mean, you've got businesses where 2023 was all the hype about AI.
00:15:22.759	     And if you weren't throwing AI around,
00:15:24.941	     it seems like you were
00:15:25.600	     getting left behind.
00:15:27.422	     But should businesses really
00:15:30.722	     take a pause and kind of
00:15:32.283	     assess some of those
00:15:34.004	     foundational areas that
00:15:35.205	     you're kind of talking about?
00:15:36.846	     Do they need to redesign
00:15:38.866	     their business processes with AI in mind?
00:15:41.988	     Or is this just something that's like,
00:15:43.488	     okay,
00:15:43.729	we're going to chip away a little bit at this AI piece and
00:15:46.710	     see what we can kind of
00:15:47.610	     play around with while we
00:15:50.350	     work on some of these
00:15:51.371	     foundational things?
00:15:52.711	     What do you think?
00:15:54.753	     You know, I mean, really,
00:15:58.956	     I think it's both, right?
00:16:00.956	     You have to have the foundation,
00:16:02.076	     but you also have to have the vision.
00:16:05.979	And you talked about, should companies just start working on AI? I think definitely not, right?
00:16:15.424	     Companies should always
00:16:16.184	     think about use cases and
00:16:18.570	     always have a good solid
00:16:20.270	     understanding of their business,
00:16:21.572	     their business objectives,
00:16:23.974	     the performance,
00:16:24.695	     the metrics against those objectives,
00:16:27.736	     and understand where there
00:16:28.518	     are deficiencies and opportunities.
00:16:29.859	     And then instead of thinking
00:16:31.419	     about a tool that needs to
00:16:32.520	     be adopted because it's a
00:16:34.883	     nifty thing that
00:16:35.524	     everybody's talking about,
00:16:37.504	     you have to start with a use case,
00:16:38.745	     start with a why.
00:16:41.327	     Why do we wanna do this?
00:16:42.229	     What do we wanna get out of it?
00:16:44.650	And by asking and answering that question of why, we'll get to the what and the how.
00:16:54.498	     But it's really important to start with,
00:16:57.059	     this is an unmet need in
00:16:58.080	     the business that we've identified,
00:17:00.081	     and this is how we're going
00:17:02.764	     to plug that with a technology.
00:17:08.929	     If the foundation's not there,
00:17:12.092	     the use case won't be successful.
00:17:14.113	     That's just my experience.
00:17:16.505	I've seen it too many times, where a project, an implementation, a big expensive project, will fail, and it's because the infrastructure wasn't in place, the idea itself, the use case, wasn't well thought out, the overall business process wasn't thought out. It was an initiative of, here's a technology or a tool, we want to use this tool, but not in the context of the business and the business process.
00:17:44.857	It's just, we want this tool because it's the newest, shiniest bauble that everyone's talking about.
00:17:50.799	     So you're basically saying
00:17:51.859	     it's gotta be fit for purpose, right?
00:17:53.721	We start with the why, we start with the purpose. And then we kind of work backwards from there to define how and what AI applications really fit in with that purpose.
00:18:05.084	     Yeah.
00:18:05.904	And in a lot of cases, I think a lot of organizations, when they really take the time to look deeply and reflect deeply on their current technology stack, will find that the next solution that's needed isn't always going to be an AI solution.
00:18:20.614	The next solution that's needed might be something a lot simpler,
00:18:24.554	     like some level of factory
00:18:27.914	     automation or an enterprise
00:18:30.914	     resource planning platform
00:18:32.395	     or a planning and
00:18:33.056	     scheduling platform that
00:18:36.496	     solves a more immediate
00:18:37.497	     problem or challenge for the business.
00:18:40.805	     Yeah, no, I agree there, man.
00:18:45.455	     What do you think are going
00:18:46.777	     to be some of the
00:18:49.057	     challenges that kind of come with that,
00:18:51.778	     right?
00:18:52.038	     If you've got the executive that says, hey,
00:18:54.619	     we need to do AI because
00:18:56.339	     that's what the
00:18:56.740	     shareholders want or that's
00:18:57.901	     what the market is
00:18:58.580	     demanding or that's what
00:18:59.580	     everybody else is doing, right?
00:19:01.902	     That's obviously gonna
00:19:02.623	     create some challenges for
00:19:04.063	     people like you who have to
00:19:05.844	     grab that vision and try to
00:19:08.045	     fit it in where maybe that
00:19:09.384	     purpose isn't quite clear.
00:19:12.546	     You know, I think it's...
00:19:15.018	     I think the two biggest
00:19:15.719	     challenges are fairly generic.
00:19:18.902	They're not unique to AI or technology. It's time and money.
00:19:24.126	     I think that when companies
00:19:29.071	     are considering a
00:19:30.673	     technology and planning a project,
00:19:34.057	it's got to fit in the budget, whether it's capital budget or operational budget. It's got to fit.
00:19:41.296	     I've seen too many times
00:19:42.876	     where a project will fail
00:19:44.136	     because it's not adequately budgeted.
00:19:47.998	     We think, oh,
00:19:48.478	     we're just going to do this
00:19:49.317	     project and there's going
00:19:50.178	     to be one guy sitting in
00:19:50.698	     the corner doing this thing
00:19:51.738	     for six months and then
00:19:53.659	     magically we have a system
00:19:54.798	     implemented for an
00:19:55.679	     enterprise or for a site
00:19:57.979	     within an enterprise.
00:20:01.680	     That's not a recipe for success.
00:20:04.880	And I think that companies have to be really objective about it. Technology is not cheap.
00:20:12.238	Computer systems, if they're poorly designed and implemented, can be money pits.
00:20:18.359	     And the difference between a
00:20:21.320	     solution or a system that's
00:20:22.621	     a money pit versus
00:20:23.741	     something that's really
00:20:24.622	     truly transformative and
00:20:27.123	     value added to a business
00:20:30.923	     can start with the idea or the funding.
00:20:36.846	The best idea in the world with insufficient funding can't happen. And in the same way, if a project is set out with an unrealistic timeline, that can also lead to a failure.
00:20:56.367	     Yeah.
00:20:56.928	     So who do you think should
00:20:58.208	     be leading these AI initiatives?
00:21:00.989	That's a good one, right? It's kind of a hot potato that gets batted around in different organizations.
00:21:08.457	     Where I think I've seen it
00:21:09.517	     be most successful is when
00:21:11.018	     organizations have an
00:21:12.657	     operations technology group
00:21:15.719	     that is not necessarily the
00:21:17.999	     IT organization and not
00:21:19.398	     necessarily operations organization,
00:21:21.859	     but the bridge between the two.
00:21:23.380	Like you mentioned in my background, that's where I've managed to find my niche in companies: bridging the gap between IT and operations, even engineering and operations.
00:21:35.663	     What does the business need?
00:21:38.031	     Versus what is the art of
00:21:40.212	     the possible today?
00:21:41.953	     What can we really do?
00:21:42.755	     What can we really pull off?
00:21:44.455	     And what can we accomplish
00:21:46.817	     within the time and cost
00:21:49.499	     constraints that are set for us?
00:21:52.594	     Now,
00:21:52.773	     that's who you think should be leading
00:21:55.275	     that, right?
00:21:55.734	     And, you know,
00:21:56.556	     maybe a lot of organizations
00:21:57.935	     don't have that
00:21:59.196	     organization that you're referring to.
00:22:00.856	     Who do you think is actually
00:22:01.857	     going to be leading most of
00:22:03.857	     the AI implementation projects?
00:22:10.221	     You know,
00:22:11.300	     it feels like AI is so new that
00:22:15.163	     it's more of an IT
00:22:16.782	     challenge than anything else, right?
00:22:21.909	You know, so much of the effort is going to
00:22:23.450	     be first understanding what
00:22:24.671	     you need and being able to
00:22:27.551	     define your requirements in a way that,
00:22:29.854	     you know,
00:22:30.094	     a system developer can actually
00:22:31.775	     turn that into a solution.
00:22:34.355	And all the way through actually designing and building, or working with a third party, an outside organization, to work the design and build activity.
00:22:48.003	     Every system,
00:22:48.564	     every project is going to
00:22:49.464	     have a champion.
00:22:50.528	     Right.
00:22:50.769	And for the champion, it matters less what part of the organization they're from than that they're a strong champion and that they're well supported by the business.
00:23:00.575	     Right.
00:23:00.795	You know, projects where the champion or the owner is doing this in their spare time, where it's not a primary role or responsibility, that essentially means the project is deprioritized at the leadership level.
00:23:21.183	     And that's, you know,
00:23:21.984	     that's another recipe for a failure.
00:23:26.287	     Kind of coming back to that
00:23:27.228	     time thing that you're referring to,
00:23:29.048	     right?
00:23:29.189	     The person has to be able to
00:23:30.210	     kind of have the time.
00:23:31.150	     They've got to be able to
00:23:31.851	     facilitate the requirements
00:23:34.071	     from different parts of the organization,
00:23:36.232	     right?
00:23:36.432	     Because we talk a lot about
00:23:37.513	     operations and I know
00:23:38.434	     that's where your focus is,
00:23:39.994	     but a lot of the use cases
00:23:41.737	     that we're seeing where AI
00:23:42.957	     can add a lot of value is
00:23:44.917	     outside of operations, right?
00:23:46.378	     On some of the more business
00:23:47.859	     supporting functions, right?
00:23:49.480	     So you probably want
00:23:51.463	     somebody that can
00:23:52.224	     facilitate not just their needs,
00:23:54.667	     but the needs of the entire business.
00:23:56.310	     And I feel like that, you know,
00:23:57.792	     is going to be a tough
00:23:59.134	     thing to figure out,
00:24:00.455	     especially when you're
00:24:01.356	     trying to figure out
00:24:02.157	     something that is
00:24:03.119	     relatively new to most people.
00:24:04.541	     Right.
00:24:08.410	     So let's talk about some of
00:24:11.290	     the skill sets then, right?
00:24:13.811	     That some of the people here,
00:24:15.053	     whether you're the one
00:24:15.952	     leading the initiative or
00:24:17.473	     whether you're the one kind
00:24:18.394	     of playing a role in the initiative,
00:24:19.994	     right?
00:24:20.214	     Like,
00:24:20.795	     and you talk about the universities
00:24:23.655	     and the next generation and
00:24:24.676	     kind of workforce.
00:24:25.876	     So what are the things that
00:24:26.897	     you would be looking for
00:24:29.358	     from these next generational workers?
00:24:34.218	     Just thinking about my own
00:24:35.438	     background and what I think
00:24:37.378	     has helped me progress and
00:24:41.460	     become what I am today is a
00:24:44.661	     really broad and general understanding.
00:24:48.781	     I think universities do this fairly well.
00:24:54.343	You can become a specialist in a narrow niche field, or you can be generalized in your education, learning, and even really your capacity in the workforce.
00:25:07.996	     I fit in the latter, right?
00:25:09.817	     I started as an engineer and
00:25:12.117	     had the opportunity to go
00:25:12.897	     back to school and get an
00:25:13.759	     MBA and learned about
00:25:15.019	     operations and finance and
00:25:16.900	     accounting and basically
00:25:20.080	     all the different aspects
00:25:21.682	     of running a business.
00:25:25.083	I think that there's no one skill set or one talent that's always going to be required.
00:25:34.067	     I think it's the tech savvy,
00:25:36.930	     the background,
00:25:37.490	     that foundation layer of
00:25:38.652	     not being afraid of technology.
00:25:39.992	     I think we can take that for
00:25:40.913	     granted in the next 10 to
00:25:43.695	     20 years because everyone
00:25:44.916	     will have a baseline of
00:25:46.377	     technology savvy that far
00:25:48.579	     exceeds what most in my
00:25:51.541	     generation and certainly
00:25:52.603	     most in our parents'
00:25:54.183	     generation had brought to the workforce.
00:25:58.430	     But I think it's really
00:25:59.150	     going to be an
00:26:00.250	     understanding of what is the business?
00:26:03.373	     What is the business model?
00:26:04.794	You know, what is our product?
00:26:07.576	     What is our service?
00:26:08.476	     How do we make money?
00:26:09.636	     And being able to see the big picture,
00:26:12.999	     right?
00:26:13.398	     Seeing the forest and the trees.
00:26:17.981	I think it's very important to be able to work at, you know, what we talk about as a 30,000-foot level, a very high-level overview of an organization and what's going on, but also in the weeds, right? At sea level.
00:26:31.106	And I think a really important attribute in successful employees in technology in the future is going to be that diversity and broadness of understanding.
00:26:45.117	You know, one of the things that I heard, it was my undergraduate advisor talking about PhDs. He wasn't disparaging PhDs, right? He was a professor, he obviously had one of his own. But he said, you know, when you get a PhD,
00:26:59.201	     you're learning more and
00:27:00.182	     more about less and less
00:27:01.563	     until you know everything
00:27:02.482	     there is to know about nothing at all.
00:27:06.044	And so I think that has a place, right? For the person who's going to develop the next AI technology that's going to change the world, you've got to be that focused. But the underlying technology isn't the solution, right? The underlying technology is the enabler, and the solution is putting together the unmet need with the technology that solves it, in a profitable use case.
00:27:39.912	     That's a very good point, Daron.
00:27:41.492	     I think we're going to have to clip that,
00:27:42.634	     man.
00:27:42.834	     I really like the way that
00:27:43.755	     you phrased that.
00:27:45.576	     So it's really having understanding,
00:27:47.738	     being able to look at the
00:27:48.598	     big picture and really
00:27:50.661	     being able to connect
00:27:52.402	     different elements to what
00:27:54.183	     you're doing and everything
00:27:55.984	     tying back to that purpose, right?
00:27:57.625	     Making sure that what you're
00:27:58.666	     delivering and what you're
00:27:59.548	     working on is connected to that purpose.
00:28:03.460	     You know,
00:28:03.921	     as I'm sitting here thinking
00:28:04.701	     about it and hearing you
00:28:05.701	     kind of repeat it back to me,
00:28:07.262	     I think that there's an
00:28:08.223	     intellectual curiosity
00:28:09.904	     that's also a really
00:28:10.925	     important attribute in technology, right?
00:28:17.029	     I think that in IT in general,
00:28:19.011	     we say that if you're not
00:28:20.152	     always learning something new,
00:28:21.353	     then you're falling behind
00:28:23.154	because things evolve so rapidly. And what's necessary for someone to be willing and able to continue learning, to be a lifelong learner, is curiosity, right?
00:28:36.787	I'm thinking about how we operate from the 30,000-foot level down to sea level. And I'm thinking the real tech leader needs to have that curiosity where he sees the forest, he sees the trees, and he's willing to get down on his hands and knees and dig through the dirt and the undergrowth to see what's going on underground, right?
00:28:59.204	     Because you never know, right,
00:29:00.665	     where the next solution is
00:29:02.106	     going to come from.
00:29:03.248	     It's probably not in an area
00:29:05.369	     that you're already
00:29:05.950	     thinking about because
00:29:07.951	     there's so much capability
00:29:10.292	     and businesses have so many
00:29:11.994	     unmet needs that the
00:29:14.977	     challenge isn't finding a
00:29:16.637	     problem that needs solving.
00:29:17.558	     It's finding the best
00:29:19.079	     problems or the biggest
00:29:20.401	     problems that need solving
00:29:21.701	     first and then taking the time
00:29:25.828	To understand that, understand the problem, even before you start working on the solution, really understand the problem. And then to start thinking, okay, the curiosity again: researching what's the art of the possible, and what's the art of the possible in the near future given the pace and rate of change, and then being able to take all that and structure it into a project or a plan that really builds this new enabling feature.
00:30:00.740	     That's very interesting that
00:30:01.980	     you kind of frame it that way, right?
00:30:03.221	Because that's one of the things that we've been struggling with as we look to integrate this technology into our own offerings and kind of what we do, right?
00:30:13.163	     You've got these paradigms
00:30:14.743	     in terms of what you're
00:30:16.384	     used to and how you're used
00:30:17.565	     to solving things.
00:30:19.224	And now you've got this new access to this technology that can completely shift them.
00:30:25.146	     But you're going to miss it
00:30:26.969	     if you don't really
00:30:28.250	     challenge yourself and even
00:30:29.853	     just be curious about it.
00:30:32.355	     And it's not even like a
00:30:33.217	     malicious thing or you're
00:30:34.419	     trying to consciously omit it.
00:30:35.980	     But it's just you don't know
00:30:37.582	     what you don't know until
00:30:38.743	     you start playing around with it.
00:30:40.766	     And, you know,
00:30:41.267	     the joke I make with the
00:30:42.166	     team all the time is like, you know,
00:30:44.409	     you got to fuck around to learn.
00:30:46.650	     Right.
00:30:46.970	     And it's and it's true.
00:30:48.991	     And it boils down to kind of
00:30:50.893	     a lot of the stuff that you're saying.
00:30:52.653	You got to play around with it. That's the easiest way. You got to see where it's good, where it's not good.
00:30:57.617	     And I think then you can
00:30:59.318	     really carve out a path forward.
00:31:02.540	     Yeah.
00:31:03.422	     And it really goes back to curiosity.
00:31:04.942	     Yeah.
00:31:05.717	     Right.
00:31:06.397	And, you know, not everyone has that curiosity, that intellectual curiosity to say, well, there's something I'm interested in. It seems to have no relevance to my role or my occupation today, but I'm curious.
00:31:18.750	     And that curiosity is going
00:31:20.613	     to enable learning.
00:31:22.317	     that may or may not in the
00:31:23.898	     end enable a new solution
00:31:26.479	     to be developed.
00:31:27.720	     And I want to touch on that may or may not,
00:31:30.019	     right?
00:31:30.381	     Because that's where
00:31:31.621	     organizations need to be
00:31:32.941	     open to experimenting.
00:31:35.501	     And we're talking about the
00:31:37.282	     context of time and money
00:31:39.042	     and all these kind of
00:31:39.803	     factors that play a role in it.
00:31:41.943	     Experimenting isn't really a
00:31:43.404	     word that people like to
00:31:45.145	     throw around when time and
00:31:46.267	     money is concerned,
00:31:47.106	     especially from a business perspective.
00:31:49.087	     But the truth is to really
00:31:50.489	     maximize and understand as well,
00:31:52.269	     to align to a lot of the
00:31:53.170	     messaging that you're saying,
00:31:54.431	     you're going to need to experiment.
00:31:57.232	     What do you think about that?
00:31:59.054	     Yeah, you know,
00:32:00.294	     when you were talking about that,
00:32:01.295	     it really reminded me of, you know,
00:32:03.615	the difference between the old waterfall approach to projects versus the agile approach, the agile methodology, right?
00:32:11.682	     Where in a waterfall approach,
00:32:13.483	     you tend to start with a
00:32:15.726	     lot of effort on requirements,
00:32:17.748	     trying to capture all the
00:32:18.688	requirements, and then spend a large amount of time building the one monolithic solution that meets all the requirements. Which, in a lot of cases, if it's just not thought out quite well enough, or if something happens or changes, or your thought leader wasn't quite curious enough and didn't know what was really possible or what the real need was, you can end up with a failure, right?
00:32:44.664	     And to add to that too,
00:32:46.705	     a lot of this technology
00:32:47.846	     has very heavy dependency
00:32:51.730	     on things like data quality.
00:32:55.092	     And especially the higher up
00:32:57.252	     the chain that you go,
00:32:58.134	     the more disconnected that
00:32:59.835	     some of the executive team
00:33:01.296	     members are from the state
00:33:03.517	     of quality of their data,
00:33:05.137	     of their systems, of their processes,
00:33:07.480	     and whatnot.
00:33:08.380	     And that's just natural with
00:33:09.681	     any hierarchy that you've
00:33:11.923	     got in the business.
00:33:14.304	     And so, you know, AI,
00:33:16.464	     machine learning and all
00:33:17.664	     these kinds of tools will
00:33:18.684	     need to really build on
00:33:21.306	     data and the quality of it.
00:33:24.066	So what kind of role do you think that's going to play with experimenting with things, and running into, we thought we could do this, and on paper we can.
00:33:32.008	But when we see what we've got to work with, we have this one ingredient
00:33:36.888	     that we just won't be able
00:33:38.528	     to make the dish with
00:33:39.470	     because it's a rotten
00:33:40.450	     ingredient or it's not sour
00:33:42.130	     enough or it doesn't have
00:33:43.250	     the right acidity that's
00:33:44.891	     needed to really turn this
00:33:46.631	     dish into a five-star meal.
00:33:50.752	     Yeah,
00:33:51.353	     I think specifically data quality is
00:33:54.452	     a very important part of
00:33:58.054	     the evolution of technology systems.
00:34:00.694	     And I think you're absolutely right.
00:34:03.036	Data is inherently the lowest level, you know, the sea-level detail, and executives and leadership tend not to be as closely connected to the data as you need to be to objectively assess: is this high-quality data?
00:34:18.804	     Do I even have the data that
00:34:20.246	     I need to enable a use case
00:34:22.666	     that I'm thinking about?
00:34:26.969	     So to go back to, sorry about that,
00:34:30.490	     just had a distraction for a moment.
00:34:32.711	     Going back to the idea of data quality,
00:34:34.172	     right?
00:34:36.717	     I always think of a pilot,
00:34:38.057	     a pilot for a project, right?
00:34:40.480	     An experiment to say, okay,
00:34:41.840	     this is our objective.
00:34:42.782	     We want to try to do something this way.
00:34:44.402	You know, we can design a waterfall project, spend a million dollars or more, and try to do something. Or we could do it quick and dirty, like, let's hire an intern, give them six months or three months, set them loose with a really general objective, and see what they come up with.
00:35:08.235	I think that's going to create a couple of outcomes, right? First is, you'll get some objective results that tell us about the quality of the underlying foundational systems and the data that they contain, and also tell us about the quality of the processes that are responsible for creating and maintaining that essential business information.
00:35:35.152	You know, I forgot who it was that said data is the new oil, right? Data is the new most valuable thing in business because it can be monetized in a bunch of different ways.
00:35:50.681	And just like there are different qualities of crude, I think there's light sweet crude, and there's the oil sand stuff that we're digging up out of the ground that is maybe abundant, but not quite as easy to dig up, right? It's not just oil, it's high quality or low quality, and oil companies know all about that, right?
00:36:14.077	The same way, organizations, enterprises, need to understand their core value, their underlying value, the data in their systems: what it is, how valuable it is, how to increase that value, how to derive value from it.
00:36:32.974	     Right.
00:36:33.416	     And I think when that
00:36:34.637	     context was thrown out
00:36:36.237	     there about data being the oil,
00:36:38.039	a lot of it had to do with the value not being in its raw form, but more so in it being processed and refined and everything else so that it can turn into energy, right?
00:36:50.907	     Oil to you is not valuable in any way.
00:36:53.268	     But what comes from it is valuable to you,
00:36:57.934	     right?
00:36:58.096	     It powers your house,
00:36:59.416	     it powers your vehicles,
00:37:00.579	     it creates your clothes,
00:37:02.181	     it does a number of things.
00:37:05.445	     Let's talk about not just
00:37:07.967	     the data element of it,
00:37:08.987	     but more so shifting towards learnings.
00:37:12.150	     Because when you look at AI
00:37:13.311	     integration for a business,
00:37:17.152	     it's a similar type project
00:37:18.753	     to previous digital
00:37:20.034	     transformations or previous
00:37:21.855	     system implementations.
00:37:23.396	     You've got a current state.
00:37:24.996	     that is going to change into
00:37:26.597	     a future state,
00:37:27.679	     and technology is going to
00:37:29.059	     be the driver behind that, or the enabler,
00:37:32.643	     as you kind of say, right?
00:37:34.224	     But it's still a people business.
00:37:36.126	     It's still driven by the
00:37:37.786	     people that work within
00:37:38.827	     there and operate it.
00:37:40.510	     It's still dependent on the
00:37:41.630	     data and everything else there.
00:37:44.913	     Let's talk about some of the
00:37:45.853	     things that went really bad
00:37:47.313	     or really good that you've
00:37:49.275	     been a part of throughout
00:37:51.155	     your career on similar
00:37:53.317	     projects and anything that
00:37:54.456	     you can share here with the audience.
00:37:57.257	     Sure.
00:37:57.617	     Yeah.
00:37:58.619	     I've always said the best
00:37:59.478	     example is the example of what not to do.
00:38:02.199	     And my understanding,
00:38:03.960	     I think it's reported that
00:38:05.501	     something like two thirds
00:38:06.782	     or three quarters of all
00:38:08.443	     ERP system implementation projects
00:38:12.045	will fail by one measure or another, right?
00:38:14.547	     A failure might mean just
00:38:15.768	     not achieving all of the
00:38:17.228	     objectives of the initial project, or,
00:38:20.512	you know, failure could also be being late or over budget.
00:38:26.416	But, you know, examples of what not to do are some of the most useful cases to learn from.
00:38:36.204	     I think that, you know,
00:38:39.215	     time and budget again.
00:38:40.315	     And then I think also just
00:38:42.315	     the idea of respecting reality.
00:38:44.775	     An organization needs to be
00:38:46.476	     very objective and
00:38:47.476	     realistic about where
00:38:49.277	     we are today.
00:38:50.458	     And given what we have in
00:38:53.217	     our wheelhouse today,
00:38:54.978	     are we capable of designing, building,
00:38:59.380	     implementing and
00:39:00.599	     maintaining an application, a tool,
00:39:03.320	     a use case to do whatever
00:39:04.561	     it is the company wants to
00:39:05.661	     do next, and making sure that that's
00:39:09.081	     objectively estimated and
00:39:13.222	     adequately resourced,
00:39:15.422	     both with time and money?
00:39:17.923	     So organizational
00:39:18.844	     honesty is a must,
00:39:21.983	     is kind of what you're
00:39:22.764	     saying, when you're starting
00:39:24.644	     one of these journeys.
00:39:26.164	     Yeah.
00:39:27.085	     Respect for reality.
00:39:28.065	     Got to honor reality,
00:39:29.045	     respect reality and objective reality.
00:39:33.565	     Not the world according to Daron,
00:39:34.947	     but the world according to,
00:39:37.079	     you know, Daron and all the
00:39:38.099	     other objective observers
00:39:39.882	     that are seeing
00:39:42.425	     the same thing, right?
00:39:45.568	     So what else would you say is
00:39:46.889	     kind of a good learning to
00:39:48.592	     make sure that the AI
00:39:50.213	     project doesn't become a
00:39:51.295	     money pit, but actually
00:39:52.856	     becomes a transformative,
00:39:54.619	     value-added implementation?
00:39:57.844	     You know,
00:39:58.804	     I think a key success
00:39:59.885	     factor in any sort of
00:40:01.266	     technology implementation is,
00:40:03.646	     again,
00:40:04.106	     going back to the leadership: having
00:40:06.208	     that solid core ownership,
00:40:09.668	     having it celebrated,
00:40:16.110	     and having that
00:40:17.211	     responsibility bestowed on
00:40:20.012	     the leaders of the organization
00:40:22.152	     that really have the capability
00:40:28.117	     and the charter to go
00:40:29.958	     accomplish the objective of the project.
00:40:32.239	     You know, I'm thinking back to, you know,
00:40:34.579	     in the intro,
00:40:35.360	     you mentioned our past
00:40:36.681	     experience working together
00:40:37.760	     at an aerospace company in
00:40:38.882	     the Los Angeles area.
00:40:40.902	     We had three individuals.
00:40:42.623	     If you remember,
00:40:44.063	     you came up with the term MIT, right?
00:40:46.204	     Because I think it was Michael, Ian,
00:40:48.505	     and Tamara.
00:40:49.625	     And MIT, right?
00:40:51.606	     I mean, it sounds good.
00:40:53.485	     You know, it rolls off the tongue.
00:40:56.724	     And MIT were empowered.
00:40:58.784	     They were taken out of their day jobs for,
00:41:01.766	     I think it was six or nine
00:41:02.826	     months or so to focus on the project.
00:41:05.746	     You know,
00:41:06.007	     for an ERP implementation project,
00:41:08.327	     three full-time resources
00:41:10.088	     from within the business
00:41:11.289	     was the right amount of resource.
00:41:16.210	     I think the organization was
00:41:17.210	     about 300 people at that time.
00:41:18.771	     So, you know,
00:41:20.532	     about 1% of the organization
00:41:22.052	     was dedicated to the project.
00:41:24.449	     And that project was really
00:41:25.590	     a wild success in large
00:41:28.630	     part attributable to the
00:41:30.911	     focus of that really strong
00:41:33.913	     leadership team.
00:41:35.952	     And that was enabled because
00:41:39.253	     the business leadership was
00:41:41.775	     very objective about what
00:41:44.371	     we needed to do
00:41:45.313	     to be successful in this project.
00:41:47.253	     Now you probably
00:41:47.594	     remember better than I,
00:41:48.775	     because I joined the company
00:41:50.215	     in the middle of the project.
00:41:52.358	     I think that
00:41:53.297	     that reality was realized
00:41:55.420	     after failing a couple of times, right?
00:41:57.061	     There was an initial
00:41:57.802	     implementation consulting
00:41:58.981	     group that failed because,
00:42:01.244	     you know,
00:42:02.005	     they were probably the
00:42:02.505	     lowest bidder and didn't provide
00:42:04.365	     the time and support to the project.
00:42:07.487	     And then there was, you know, a second
00:42:09.369	     consultant that took over the project
00:42:10.911	     leadership.
00:42:13.460	     I think, you know,
00:42:15.601	     it was a one-woman show and, um, you know,
00:42:18.125	     she wasn't able to
00:42:21.327	     form the consensus and buy-in with,
00:42:24.030	     you know,
00:42:25.572	     the leadership at the site or the
00:42:26.952	     more executive
00:42:27.893	     leadership of the company that really
00:42:29.815	     had to pay the bills. And after, you know,
00:42:31.757	     the first two attempts failed,
00:42:34.550	     when Luniko was brought
00:42:35.751	     into the project, you know,
00:42:37.132	     to really lead and
00:42:39.996	     finish out the project that
00:42:41.217	     had been started and, you know,
00:42:42.579	     sort of been floundering for,
00:42:43.739	     I don't know,
00:42:45.081	     I think it was close to a
00:42:45.942	     year or so by the time you
00:42:47.364	     got involved in the lead role,
00:42:51.768	     the business had learned very objectively,
00:42:53.929	     it's not going to work if
00:42:54.891	     we don't adequately resource it.
00:42:57.371	     Yeah,
00:42:57.592	     and I do recall the initial
00:42:58.972	     conversations.
00:43:00.815	     They were involved early on in the project,
00:43:02.697	     but they weren't committed, right?
00:43:04.657	     And I think there's a big
00:43:05.498	     difference between
00:43:07.161	     involving people and committing people.
00:43:09.342	     And I do recall the
00:43:10.163	     conversation with the VP of
00:43:11.565	     operations at the time.
00:43:13.545	     Um, Ryan. And he was like,
00:43:16.266	     these are my best people.
00:43:17.507	     Like, you know, you're asking
00:43:19.608	     for the best people that we
00:43:21.849	     have here, right, in our
00:43:23.530	     shop. And they're
00:43:24.070	     representing engineering,
00:43:25.271	     they're representing
00:43:25.951	     operations, they're
00:43:27.072	     representing quality.
00:43:28.552	     And it's like, we really need
00:43:29.853	     them here to deliver
00:43:31.293	     operations. But he saw
00:43:34.074	     strategically how important
00:43:36.576	     it was to commit those three people.
00:43:39.717	     And that it'd be better for
00:43:41.097	     the longevity and the
00:43:43.099	     success of the business to
00:43:45.141	     commit them to the project
00:43:47.101	     because he realized it was
00:43:48.382	     transformational and
00:43:50.605	     could lead to better results.
00:43:51.945	     And yeah,
00:43:52.405	     it was a six-month commitment and
00:43:54.347	     it was going to be hard
00:43:55.427	     to transition them in and all
00:43:56.949	     of that kind of stuff.
00:43:58.550	     But in the end,
00:43:59.369	     it would be worth it
00:44:00.130	     because it would have a
00:44:00.831	     much stronger foundation,
00:44:02.811	     better business processes,
00:44:04.652	     and it would be something
00:44:05.492	     that's aligned to their
00:44:06.853	     business in an
00:44:08.255	     objective and fit-for-purpose way.
00:44:10.536	     And it worked out.
00:44:11.735	     It did work out in the end there, right?
00:44:13.657	     And I think that that's probably
00:44:15.777	     the frame of mind that a lot
00:44:17.619	     of executives need to put
00:44:19.501	     themselves in. It's not just,
00:44:21.742	     who can we hire that's easy,
00:44:23.443	     like you said, the
00:44:24.545	     interns that can come in
00:44:25.585	     and kind of execute the
00:44:27.206	     work on paper because it
00:44:28.807	     says that you must
00:44:30.349	     gather requirements and, you
00:44:31.951	     know, this person can gather
00:44:33.172	     requirements. It's more about
00:44:35.634	     thinking: who are the people
00:44:37.335	     that need to be committed to the project
00:44:40.257	     so that this can be
00:44:41.137	     successful, and so that we
00:44:42.900	     can get the value that
00:44:44.141	     we're looking for from this
00:44:46.523	     disruptive technology?
00:44:48.005	     Otherwise, like you said,
00:44:48.925	     it's just going to become
00:44:49.786	     another money pit project
00:44:51.427	     that companies are going to
00:44:52.409	     be doing over and over and over again.
00:44:56.371	     And, you know, yes, listening to you,
00:44:58.554	     I think there's another
00:44:59.355	     benefit that came out of
00:45:00.155	     the project that actually
00:45:00.896      really had almost nothing
00:45:01.998	     to do with the project itself.
00:45:03.994	     By taking the three MVPs of
00:45:06.155	     the organization, the best quality person,
00:45:08.416	     the best engineering person,
00:45:09.617	     I think the best planner,
00:45:11.978	     and committing them to not their day job,
00:45:14.039	     but to another project
00:45:15.159	     that's going to transform the business,
00:45:16.661	     it created an opportunity
00:45:18.902	     for other people in the
00:45:19.822	     organization who were, I think,
00:45:21.682	     more than prepared to step
00:45:23.583	     up to fill those open roles
00:45:26.885	     during the course of the project.
00:45:29.146	     It actually grew the talent
00:45:31.547	     base in the company.
00:45:32.547	     I forget the
00:45:33.809	     planner's name, but
00:45:36.510	     he had to print all the travelers, right?
00:45:39.731	     He stepped up in a huge way
00:45:41.672	     and he was given a huge
00:45:42.492	     opportunity that he wouldn't have had,
00:45:45.452	     you know, if the VP of operations, Ryan,
00:45:46.753	     had not taken the very
00:45:49.173	     difficult decision to pull
00:45:50.414	     his best resources out of
00:45:51.855	     operations and put them on this project.
00:45:54.115	     He actually improved the
00:45:55.456	     operational team because he
00:45:57.556	     gave some people some new
00:45:58.498	     experience that they might
00:45:59.297	     not have had an opportunity
00:46:00.597	     to ever have in that business.
00:46:04.050	     And that
00:46:04.431	     grew the overall talent
00:46:05.992	     pool in the business that helped evolve,
00:46:09.072	     you know, not the systems, but
00:46:11.873	     the employees.
00:46:13.693	     And that,
00:46:14.693	     I think, is part of the, you know,
00:46:16.634	     the forest and the trees, right?
00:46:18.313	     I mean, we took people who were
00:46:20.135	     maybe operating at a 500-foot
00:46:21.715	     level and moved them
00:46:23.335	     up to a 1,000- or 5,000-foot level.
00:46:26.255	     So they had a bigger picture,
00:46:28.516	     bigger responsibility, bigger roles,
00:46:30.516	     bigger shoes to fill.
00:46:31.356	     And man,
00:46:32.398	     they were just so successful.
00:46:35.409	     Yeah, that's a good point there.
00:46:36.791	     And I think midterm and long term,
00:46:38.210	     it makes sense in almost
00:46:40.552	     every single way that you
00:46:41.452	     can kind of think about it.
00:46:42.594	     It's that short term fear
00:46:44.655	     that I think holds you back a little bit,
00:46:46.876	     right?
00:46:47.076	     Because you're like, man,
00:46:47.896	     what am I going to do?
00:46:49.280	     without Ian running the
00:46:51.543	     travelers and being out
00:46:52.704	     there on the shop floor and
00:46:53.784	     coordinating things, right?
00:46:55.206	     I won't be able to rely on him to do that.
00:46:57.829	     Or if there are quality issues,
00:46:59.070	     Tamara's not gonna be involved in that,
00:47:01.172	     right?
00:47:01.331	     Because she's gotta be
00:47:02.072	     involved in this project,
00:47:03.353	     and things like that scare people.
00:47:06.315	     Um, I think in retrospect, right,
00:47:08.416	     everybody can look at that
00:47:09.657	     project and say that that
00:47:10.719	     was the right choice.
00:47:11.639	     And I think that Ryan looks at
00:47:13.699	     his operation,
00:47:15.360	     and he was able to get a
00:47:16.422	     lot of benefits from being
00:47:18.003	     able to commit those people.
00:47:18.802	     But in the moment, it's scary,
00:47:21.023	     man.
00:47:22.344	     Yeah.
00:47:23.164	     It feels like a big risk, right?
00:47:24.646	     You know, it's a big threat to
00:47:27.347	     the current operation, right,
00:47:29.108	     to take your best people out of their
00:47:30.989	     day jobs.
00:47:32.454	     But it's, you know,
00:47:34.115	     part of the strategy.
00:47:35.297	     You know,
00:47:36.597	     that was enabled in large
00:47:38.298	     part because they had the foundation;
00:47:40.159	     they had, you know,
00:47:41.641	     no single points of failure,
00:47:43.822	     as one idea that always comes up, right?
00:47:45.463	     You know, if you take your
00:47:48.284	     engineering manager
00:47:49.806	     and put them on a project,
00:47:51.027	     you have to have
00:47:51.786	     somebody else that's able to step up,
00:47:53.387	     even in a temporary
00:47:54.289	     capacity, to manage the
00:47:56.329	     engineering function or
00:47:57.309	     engineering department.
00:47:59.268	     In a smaller,
00:48:00.588	     less prepared organization,
00:48:02.170	     you're not going to be able to do that.
00:48:04.449	     So let me ask you, for these
00:48:06.610	     types of projects, right?
00:48:08.130	     Um, do you think you need to do that?
00:48:10.030	     Do you think you need to be
00:48:10.952	     grabbing your best people,
00:48:12.632	     taking them out, putting them into this?
00:48:14.891	     Do they need to be fully
00:48:16.512	     committed, or is this
00:48:17.592	     something that they can be
00:48:18.552	     more involved in
00:48:19.413	     and somewhat committed to?
00:48:21.713	     Well, you know,
00:48:24.554	     I wouldn't
00:48:25.235	     say it has to be your best people.
00:48:26.275	     It has to be the right people.
00:48:27.856	     You know, in this case, right,
00:48:29.016	     it happened that the right
00:48:30.617	     people for the task were
00:48:32.376	     the best people in the organization.
00:48:34.936	     But, you know,
00:48:35.398	     when we talked about your
00:48:36.637	     idea earlier about, you know,
00:48:37.938	     do you start with a huge
00:48:39.177	     waterfall project or do you
00:48:40.378	     start with experimentation?
00:48:42.099	     You know,
00:48:42.998	     the best person to run a
00:48:44.099	     waterfall project is not
00:48:45.519	     going to be the best person
00:48:46.679	     to experiment and piddle
00:48:48.219	     around with a technology to
00:48:49.960	     find what is the art of the
00:48:51.541	     possible and how can we
00:48:52.501	     apply this to solve our problem?
00:48:54.601	     So I think it's, you know,
00:48:57.541	     it's not the best person,
00:48:58.422	     it's the right person.
00:49:00.302	     And often you don't know
00:49:02.023	     that until you're looking back on it.
00:49:05.465	     But I think that's part of
00:49:06.786	     the vision that a leader
00:49:08.648	     has to have: objectively,
00:49:12.650	     where are we as an organization?
00:49:14.110	     Can I afford to take the
00:49:15.972	     right people for this
00:49:16.893	     project out of their day jobs?
00:49:20.675	     And will
00:49:21.873	     the negative short-term
00:49:23.237	     impact on the operation
00:49:24.842	     outweigh the long-term
00:49:26.226	     benefit that we expect to
00:49:27.851	     receive from the new solution?
00:49:30.396	     Yeah, that makes sense.
00:49:32.117	     So let's have one more question here,
00:49:34.858	     Daron, before we kind of wrap things up.
00:49:37.398	     So what advice would you
00:49:38.759	     give to people in similar roles?
00:49:42.360	     Or maybe it's somebody who's
00:49:44.221	     going from operations,
00:49:45.460	     trying to transition into technology.
00:49:48.061	     What advice would you give
00:49:49.202	     them that could enable and
00:49:50.742	     empower them in the future?
00:49:54.083	     Yeah, I guess I'd take that in two parts:
00:49:59.177	     what advice would I give to a business,
00:50:00.778	     and then what advice would I
00:50:01.659	     give to an individual or, you know,
00:50:04.300	     to an employee who's thinking about
00:50:05.641	     their career.
00:50:06.262	     So for business,
00:50:07.302	     I think it's really
00:50:10.025	     important to be objective,
00:50:12.025	     to objectively understand
00:50:14.307	     the strengths
00:50:15.869	     and weaknesses of the
00:50:16.648	     business and the
00:50:17.750	     opportunities that come with them.
00:50:20.972	     And so it's
00:50:23.954	     very important to have that
00:50:27.215	     solid,
00:50:28.762	     well-founded vision that's
00:50:30.744	     based in reality and
00:50:32.164	     respects and honors reality, you know,
00:50:34.565	     the good and the bad parts of it,
00:50:39.847	     and to take that and
00:50:41.028	     to always be making the best,
00:50:46.871	     you know, most realistic
00:50:48.652	     choices for the organization.
00:50:51.213	     And then for the individual,
00:50:53.315	     I think, you know,
00:50:55.936	     I had this conversation with my kid
00:50:59.021	     last year.
00:51:01.523	     Luca graduated in June, right?
00:51:03.103	     So, you know,
00:51:04.724	     we were talking about career
00:51:05.965	     and career choices.
00:51:07.246	     And, you know, my advice to Luca was,
00:51:09.228	     you know,
00:51:10.608	     the most important thing about what
00:51:12.648	     you're doing is that you enjoy it, right?
00:51:15.371	     So don't take a job because
00:51:17.092	     you think you need a
00:51:18.092	     certain income or because you think,
00:51:19.632	     you know,
00:51:19.893	     this title or this paycheck or
00:51:21.873	     this company is, you know,
00:51:25.295	     what you want to do for the
00:51:26.097	     rest of your life.
00:51:28.150	     Think about what you want to do now,
00:51:29.652	     what your short-term goals
00:51:31.775	     are and your long-term goals,
00:51:34.277	     and know that if you start
00:51:36.980	     doing one thing today,
00:51:38.021	     it does not mean that
00:51:38.762	     you're going to do that for
00:51:39.563	     the rest of your life.
00:51:41.945	     Be open, be curious,
00:51:45.148	     commit yourself to a
00:51:46.990	     lifetime of learning.
00:51:49.164	     And I think that's a recipe,
00:51:51.748	     you know,
00:51:53.371	     that will help any individual to grow,
00:51:54.512	     to evolve, and to prosper both
00:51:57.157	     personally and professionally.
00:51:59.237	     Right.
00:51:59.777	     That's good advice.
00:52:00.918	     Very good advice there.
00:52:02.239	     So thanks a lot for coming on the show,
00:52:03.900	     man.
00:52:04.121	     It's been great chatting with you.
00:52:05.782	     For all the listeners, right,
00:52:07.342	     if you're here at this point, thank you.
00:52:10.483	     You know, we're still looking for a name,
00:52:12.704	     right?
00:52:12.985	     So if anybody has
00:52:14.425	     any feedback in terms of
00:52:16.608	     how the production went or whatnot,
00:52:18.469	     we'll welcome it.
00:52:19.369	     But we are looking for a name.
00:52:20.630	     So please email me or reach
00:52:22.472	     out to me on LinkedIn or
00:52:24.052	     wherever you want to help
00:52:25.313	     us title our show.
00:52:27.956	     And if you know anybody that
00:52:29.838	     you think would be a great
00:52:30.697	     fit for the show and you
00:52:32.300	     want to get them in touch with me, please,
00:52:33.840	     please let us know.
00:52:35.081	     Daron, awesome to have you as always,
00:52:38.585	     man.
00:52:38.804	     I hope to be back in
00:52:40.106	     California with you at some point, right?
00:52:41.806	     Drinking some beers on the beach.
00:52:43.869	     Hopefully in the near future.
00:52:46.175	     Yeah, heck yeah.
00:52:47.255	     Thank you for having me on.
00:52:48.536	     It's always a pleasure talking with you.
00:52:49.996	     I really enjoy time with you
00:52:52.518	     and the time preparing in
00:52:54.179	     advance with Gar and Sophie
00:52:56.119	     and some of the other
00:52:58.240	     consultants that you have
00:52:59.161	     at Luniko that I've known now for,
00:53:01.661	     I guess, going on like five years or so.
00:53:04.402	     So thank you for having me.
00:53:05.603	     It's always great to talk.
00:53:07.585	     Call me anytime.
00:53:10.085	     All right, guys.
00:53:12.126	     Thanks, everybody.
00:53:13.688	     See you next time.
00:53:15.681	     Cheers.