EP 00: Unraveling the Podcast Purpose

00:00:00.171	     All right, we are live, Gar.
00:00:02.293	     Welcome to the show,
00:00:03.074	     everybody, the show that
00:00:04.595	     does not have a name yet,
00:00:06.217	     because the
00:00:07.238	     name that we wanted to pick
00:00:08.519	     was taken. My name is
00:00:11.001	     Yassen Horanszky. I'm the
00:00:12.463	     founder and CEO of Luniko
00:00:13.844	     and your host here for
00:00:14.725	     today, and I've helped
00:00:16.226	     digitally transform
00:00:17.247	     businesses of all shapes
00:00:18.288	     and sizes, from publicly
00:00:19.790	     traded multi-billion dollar
00:00:21.411	     organizations to local
00:00:23.073	     non-profits and everything in between.
00:00:25.655	     And just like you,
00:00:26.837	     I'm wondering where
00:00:27.657	     artificial intelligence, and
00:00:29.119	     the whole wave of disruptive
00:00:30.100	     technology that comes with
00:00:31.201	     it, which has been dominating
00:00:33.103	     headlines over the last 12 months,
00:00:35.366	     fits in, and how.
00:00:37.832	     So that's why we've created this show,
00:00:40.473	     right?
00:00:40.733	     To bring on guests of all
00:00:41.793	     levels and all types of
00:00:42.754	     businesses to talk about AI
00:00:45.135	     and its business adoption,
00:00:46.335	     but strictly from a non-technical lens.
00:00:50.336	     So that means we're gonna be
00:00:51.677	     touching on topics of strategy, process,
00:00:54.638	     people, we'll get into the emotions,
00:00:56.839	     all of that side of things.
00:00:58.379	     And if you're curious about
00:00:59.499	     AI and its business impacts
00:01:01.360	     and challenges,
00:01:02.080	     then this show is for you.
00:01:04.101	     So we're going to drop the jargon,
00:01:06.364	     the corporate jargon as much as possible.
00:01:08.646	     We're going to keep the
00:01:09.387	     conversation as simple as we can.
00:01:11.009	     And welcome to episode zero, everybody.
00:01:15.754	     Now, as you guys know,
00:01:17.255	     we don't have a name yet.
00:01:18.837	     Thank you, Garlon,
00:01:19.538	     for putting that up there.
00:01:21.159	     Naming has never been our forte.
00:01:23.440	     So we're calling on anybody
00:01:25.580	     that thinks of a good name.
00:01:27.541	     If you can kind of help us name this show,
00:01:30.362	     right?
00:01:30.742	     Please drop us any
00:01:31.482	     suggestions by email or on LinkedIn,
00:01:33.403	     however you choose to do that.
00:01:35.203	     But not having a name isn't
00:01:37.004	     gonna slow us down.
00:01:40.144	     And we're just gonna get on
00:01:41.765	     with this until we can kind
00:01:42.885	     of figure out what that's
00:01:43.525	     gonna look like.
00:01:44.086	     So episode zero is gonna be
00:01:46.466	     a little bit different than
00:01:47.407	     the rest of the episodes.
00:01:49.288	     And this is episode zero.
00:01:51.132	     This is episode zero.
00:01:52.135	     That's right.
00:01:52.576	     Yeah.
00:01:53.686	     Thank you, Gar.
00:01:54.526	     So it's going to be a little
00:01:55.627	     bit different, everybody.
00:01:57.188	     In future episodes,
00:01:58.028	     we'll have some guests who
00:01:59.429	     are not Luniko related, of course.
00:02:01.810	     But for today's episode,
00:02:03.631	     we're going to talk about
00:02:04.711	     what the podcast is about,
00:02:05.912	     kind of why we started the podcast.
00:02:08.033	     I've got Garlon in here,
00:02:09.073	     who's my co-founder.
00:02:11.214	     Garlon's typically going to
00:02:12.415	     be in the background here.
00:02:14.156	     So you guys are not really
00:02:14.976	     going to see him as much.
00:02:17.237	     But today I am joined by him.
00:02:19.098	     He is my business partner
00:02:20.259	     for anything and everything,
00:02:21.499	     all things Luniko.
00:02:23.140	     He is an entrepreneur at
00:02:24.622	     heart, having been in different
00:02:26.524	     types of industries and different roles.
00:02:28.987	     He's leveraged technology
00:02:30.208	     basically in all his
00:02:31.029	     startups from a VR startup
00:02:33.192	     before VR was cool, I will say,
00:02:35.815	     to business intelligence,
00:02:37.417	     to using crowdfunding
00:02:38.638	     platforms like Kickstarter
00:02:39.980	     to successfully launch five products,
00:02:43.303	     and to the latest venture in
00:02:44.524	     AI with myself here at Luniko.
00:02:47.366	     Now,
00:02:47.587	     innovation has always been a constant
00:02:49.328	     in Garlon's life.
00:02:50.429	     And so he and I share those similarities.
00:02:53.171	     And because of that,
00:02:53.932	     we've worked together on
00:02:55.353	     many different projects,
00:02:56.434	     in different capacities,
00:02:57.494	     various startups and
00:02:59.116	     corporate jobs as well,
00:03:00.597	     since 2011,
00:03:02.739	     when we met working at an
00:03:04.260	     energy firm; he was a co-op student,
00:03:06.321	     I was a summer student.
00:03:07.222	     So
00:03:10.908	     I'm excited because we've
00:03:12.954	     been talking about doing
00:03:13.776	     this for about three months.
00:03:15.080	     So let's see how it goes.
00:03:18.171	     That is true, right?
00:03:20.093	     So look, if you know Garlon and I,
00:03:21.914	     and I think this is the
00:03:22.575	     best place to start, Garlon.
00:03:23.556	     If you know Garlon and I,
00:03:24.536	     you know that we like to
00:03:25.517	     start with the why, right?
00:03:26.518	     We're big Simon Sinek fans.
00:03:28.600     So we're going to talk about
00:03:29.921	     why we started the show
00:03:31.562	     from a personal standpoint
00:03:33.744	     and a professional standpoint.
00:03:35.986	     And then we'll just talk
00:03:37.387	     about our experience kind
00:03:38.988	     of going through, experiencing,
00:03:40.649	     understanding,
00:03:41.870	     and learning the
00:03:42.491	     intricacies of the
00:03:44.032	     different AI kind of models.
00:03:46.654	     Just so you can kind of
00:03:47.794	     get a little bit of insight
00:03:48.975	     into some of our journey
00:03:50.015	     there, right? So let's talk
00:03:51.215	     about why we started this
00:03:52.495	     show, and I will
00:03:53.676	     promise to stop talking
00:03:54.596	     here and focus on you here,
00:03:56.176	     Gar. But really, the start of
00:03:57.996	     it is, this has
00:03:59.657	     been a passion project that
00:04:00.877	     we've gone back and forth on
00:04:02.657	     between different years.
00:04:03.718	     Some years we're going to
00:04:04.458	     do it, some years we're not
00:04:05.358	     going to do it. We talked
00:04:06.178	     about doing it with a
00:04:06.898	     partner a couple years ago.
00:04:08.519	     But really, we just love to talk to people.
00:04:11.392	     We love to connect.
00:04:12.233	     We love to learn.
00:04:12.934	     If you've ever seen me at a tech event,
00:04:14.696	     you'll see me trying to
00:04:16.738	     talk to as many people as possible.
00:04:18.840	     So we figured that this
00:04:19.741	     would be a good vehicle for that.
00:04:23.144	     We both grew up around
00:04:24.806	     technology and it's really
00:04:25.927	     been at the center of our
00:04:27.969	     world for quite a bit,
00:04:29.851	     both personally and professionally.
00:04:32.093	     And, you know, Gar,
00:04:32.853	     why don't you tell us a
00:04:33.714	     little bit more about you
00:04:34.855	     and what's in this podcast for you?
00:04:38.236	     Yeah, for sure.
00:04:40.158	     I mean, as you mentioned,
00:04:41.718	     I've always been interested in technology,
00:04:44.540	     right?
00:04:45.961	     Growing up,
00:04:47.021	     I would always get the newest
00:04:47.982	     cell phones.
00:04:48.822	     Before that, as a little kid,
00:04:50.043	     I would always get the
00:04:50.863	     newest Game Boys to play games and stuff.
00:04:54.265	     So when it came to having an
00:04:57.367	     opportunity to talk about AI,
00:04:59.388	     especially something that is so
00:05:01.550	     disruptive, something that is
00:05:04.011	     so, I mean, at least right
00:05:05.592	     now, super talked about. And
00:05:07.634	     especially with, you know,
00:05:09.214	     you and I being in the
00:05:10.695	     business world, there's a
00:05:12.957	     lot of corporate impact
00:05:15.918	     from that sense. So having
00:05:18.380	     this podcast, being able to
00:05:21.281	     have conversations with
00:05:22.702	     people that are in that
00:05:24.483	     realm, crossed with how this
00:05:27.625	     disruptive technology might
00:05:28.986	     impact them on a day-to-day, I think is
00:05:31.191	     a fantastic opportunity to
00:05:32.711	     kind of just understand and
00:05:35.512	     dig into this unknown realm
00:05:37.513	     a little bit more together.
00:05:40.093	     Right.
00:05:40.793	     And it's kind of like, you know, AI.
00:05:42.934	     I mean,
00:05:43.194	     a lot of people are saying and I
00:05:44.954	     think we feel under that
00:05:46.555	     same category that it's
00:05:48.435	     going to be very disruptive
00:05:49.715	     and it's going to be
00:05:50.276	     changing the way that we
00:05:52.076	     interact both in the workplace, of course,
00:05:55.917	     with operational
00:05:56.657	     improvements and things like that,
00:05:58.097	     but also on a personal side.
00:06:00.718	     with, I don't know, man,
00:06:01.738	     just many different applications there,
00:06:03.499	     right?
00:06:03.979	     So it's cool because, you know,
00:06:06.280	     the last time
00:06:07.200	     something this
00:06:08.120	     revolutionary from a tech
00:06:09.320	     perspective happened,
00:06:10.081	     it was the internet.
00:06:11.721	     I was pretty young, right?
00:06:14.302	     I was pretty young.
00:06:15.022	     I didn't really understand the, like,
00:06:16.942	     I just grew up with the internet.
00:06:18.283	     But I do remember my dad,
00:06:20.003      who was a database
00:06:20.843	     administrator in the 90s, right?
00:06:23.744	     Working with computers.
00:06:24.804	     We always had kind of
00:06:25.625	     computers growing up.
00:06:27.125	     I love playing computer games, right?
00:06:29.586	     Game Boys too, and whatnot.
00:06:30.926	     Strategy games were my thing
00:06:32.187	     when I was like, you know, six, seven,
00:06:34.167	     eight years old.
00:06:35.547	     Um,
00:06:35.788	     but I didn't really get as much
00:06:37.188	     computer time from my dad as I wanted.
00:06:39.569	     Right.
00:06:39.829	     Like, which was good,
00:06:41.889	     right,
00:06:42.109	     now, looking back on it.
00:06:43.470	     My dad was, you know,
00:06:44.730	     he was kind of fucking
00:06:45.450	     around with different
00:06:46.191	     things on the internet.
00:06:48.031	     He was building
00:06:49.033	     some leather keychains, and,
00:06:50.894	     you know, he started to
00:06:51.595	     build some websites. So he
00:06:52.957	     showed me, like, what he was
00:06:54.318	     doing there on the website,
00:06:55.579	     and I had no understanding
00:06:56.941	     of it, right? But I did end
00:06:58.943	     up building two websites
00:07:00.785	     that were basically cheat codes.
00:07:02.586	     It was StarCraft and
00:07:04.829	     Warcraft games back then.
00:07:07.291	     Video game cheat codes, yeah.
00:07:09.393	     You're sharing that with everybody?
00:07:11.015	     I was sharing the cheat
00:07:11.957	     codes with everybody.
00:07:12.758	     That's right.
00:07:13.919	     And they were public.
00:07:14.941	     But actually,
00:07:16.022	     you had to buy magazines back then.
00:07:18.446	     So I had a magazine of cheat codes.
00:07:20.773	     Yeah, so I built a website
00:07:22.514	     around it and whatnot. I
00:07:24.536	     don't know what happened
00:07:25.356	     and kind of how it went
00:07:27.098	     after that. But I mean,
00:07:29.259	     really, technology, same as
00:07:30.420	     you, Gar, has always been a
00:07:31.501	     key part of my life from a
00:07:33.322	     personal perspective. Even
00:07:34.523	     with my kids now, we talk
00:07:36.484	     about AI. You know, they're
00:07:38.365	     growing up with AI. They
00:07:40.007	     really don't know
00:07:40.947	     anything different. For them,
00:07:41.928	     things like Siri are just a
00:07:43.589	     way to get answers to their questions.
00:07:46.271	     They're talking about Star Wars right now.
00:07:48.012	     They're really into that.
00:07:48.953	     So it's like, hey,
00:07:50.054	     who is Baby Yoda's father?
00:07:52.856	     And does he have a brother?
00:07:55.057	     So they'll ask Siri those
00:07:56.158	     kinds of questions.
00:07:56.819	     But we do a lot of things
00:07:57.759	     with DALL-E in terms of
00:07:59.761	     creating visuals and
00:08:01.122	     creating coloring pages,
00:08:02.483	     things like that that they can do.
00:08:03.984	     And it's cool to see how they interact.
00:08:07.166	     Anything else you want to
00:08:08.027	     say on the personal side, Gar, for tech?
00:08:10.068	     Or should we move on?
00:08:11.589	     Yeah, let's keep going.
00:08:13.413	     Okay,
00:08:13.673	     so let's talk about the professional side,
00:08:15.275	     right?
00:08:15.415	     Because obviously we're both entrepreneurs,
00:08:17.056	     right?
00:08:17.256	     We've been involved in the
00:08:18.557	     management consultant space
00:08:19.518	     for a number of years in
00:08:21.080	     digital improvements, transformations,
00:08:22.941	     that sort of thing.
00:08:24.542	     AI really just dominated headlines in 2023,
00:08:29.447	     right?
00:08:30.648     And I think it's going to
00:08:32.309	     continue to do so, right?
00:08:34.811	     Yeah.
00:08:36.012	     I think that it's obviously
00:08:37.313	     going to be a big
00:08:38.014	     capability that
00:08:38.954	     organizations are going to
00:08:39.995	     look to try to leverage.
00:08:42.658	     And it doesn't matter what
00:08:44.779	     the technological revolution is,
00:08:46.701	     whether it's the internet
00:08:47.762	     or whether it's basic
00:08:48.622	     systems or AI that we're
00:08:51.004	     talking about now.
00:08:53.006	     The truth is that people
00:08:54.367	     will always be at the center.
00:08:56.929	     of it, right? And they're
00:08:58.650	     going to be either the
00:08:59.511	     reason for its effective
00:09:00.972	     integration, or they're
00:09:02.513	     going to be the reason for
00:09:03.713	     its demise. And, you know,
00:09:07.236	     when we talk about things
00:09:08.236	     from a professional
00:09:08.957	     perspective, it's like we've
00:09:09.777	     been having a lot of
00:09:10.498	     conversations with various
00:09:12.479	     people and different
00:09:13.560	     stakeholders: what do
00:09:14.801	     you think, what do you think
00:09:17.042	     is gonna be good,
00:09:17.922	     what do you think is bullshit?
00:09:19.203	     What do you think is more than hype?
00:09:21.864	     What do you think doesn't
00:09:22.725	     get talked about?
00:09:23.805	     Just to really build our curiosity.
00:09:26.186	     And we tend to do that over
00:09:27.627	     a beer or a networking
00:09:29.668     event or anything like that.
00:09:32.069	     And we thought that like,
00:09:33.270	     why have those conversations
00:09:35.671	     where nobody else can kind
00:09:36.832	     of see them except for us, right?
00:09:38.873	     They tend to be pretty dynamic.
00:09:40.154	     They tend to be interesting.
00:09:41.715	     Everybody has kind of
00:09:42.596	     different perspectives on
00:09:44.097	     the things that they think about,
00:09:45.898	     the fears or the risks or
00:09:47.479	     the challenges or the
00:09:48.360	     opportunities, so we just thought, like,
00:09:50.721	     let's just get everybody
00:09:51.782	     here together one by one.
00:09:53.904	     We can chat with them.
00:09:54.924	     We can hear their perspectives.
00:09:57.986	     And, you know,
00:09:58.367	     maybe different people will
00:09:59.387	     kind of learn from that, right?
00:10:02.069	     Mm-hmm.
00:10:03.990	     For me, I think, you know,
00:10:07.011	     obviously I won't get too much into it,
00:10:09.091	     but Luniko as a business,
00:10:11.131	     we're definitely dabbling
00:10:12.592	     in AI and we're pivoting
00:10:13.992	     our business to integrate AI,
00:10:16.753	     not just for us, but for our clients.
00:10:19.553	     But what, for me, I thought that, again,
00:10:26.234	     going back to the love for technology,
00:10:28.275	     right?
00:10:29.075	     If I have an opportunity to
00:10:31.395	     marry my love for technology,
00:10:34.431	     with also just business, what
00:10:37.253	     I get to do on a day-to-day.
00:10:39.055	     I mean, we spend 40 hours at
00:10:41.216	     work a week, if not more,
00:10:43.318	     right? So if you can marry
00:10:44.499	     those two worlds together,
00:10:45.700	     it's a win-win in my books.
00:10:47.861	     And just, I know you briefly
00:10:49.202	     talked about it before, but
00:10:50.764	     coming straight out of
00:10:51.504	     university, I've had that
00:10:52.965	     opportunity to dabble in VR
00:10:54.987	     before VR was even cool. This
00:10:56.848	     was like when the first
00:10:57.709	     Oculus Rift came out. I had
00:10:59.690	     a little startup, or
00:11:02.132	     I was a co-founder of a
00:11:03.133	     little startup, that
00:11:04.394	     said, hey, there's this brand
00:11:06.014	     new technology out there,
00:11:07.715	     it's going to be the wild
00:11:08.535	     west, there are so many
00:11:09.615	     different applications that
00:11:10.736	     could be done, what should
00:11:12.756	     we do? And we went from an
00:11:14.237	     idea to an actual execution
00:11:17.558	     of creating that product
00:11:19.618	     and actually selling it to
00:11:20.759	     a couple customers. And so
00:11:22.419	     that was a really cool
00:11:23.419	     experience. Like, you know,
00:11:25.000	     obviously there's a lot of
00:11:27.031	     learnings that I'm taking from that.
00:11:29.395	     But just kind of remembering
00:11:32.059	     how interesting and dynamic
00:11:34.663	     that work environment was like, hey,
00:11:36.546	     we're doing things that
00:11:37.367	     nobody's done before.
00:11:39.233	     And kind of coming back full circle,
00:11:43.315	     Luniko, this new version of it,
00:11:45.316	     or this new chapter of it,
00:11:48.298	     definitely feels like that for me.
00:11:50.359	     So same thing,
00:11:51.539	     I think even just in this
00:11:53.060	     short period of time where
00:11:54.160	     we started dabbling from a
00:11:56.381	     professional standpoint in AI,
00:11:58.763	     I've had to go through a
00:11:59.963	     few paradigm shifts,
00:12:01.244	     a few new ways of thinking about AI,
00:12:04.645	     because I wasn't like, bam, oh yeah,
00:12:06.466	     this is the next big thing.
00:12:08.067	     I was also caught in the, oh,
00:12:10.329	     this might just be hype.
00:12:12.231	     I'll dabble in it and kind
00:12:13.512	     of make my own opinions.
00:12:14.853	     And throughout that journey,
00:12:16.135	     I think I have a few things
00:12:17.316	     that might be interesting
00:12:18.777	     for me to share on this podcast.
00:12:20.839	     And from a podcast standpoint,
00:12:22.580	     I think it will be cool for us to
00:12:26.061	     take a look in about a year, two years,
00:12:28.723	     or at regular frequencies
00:12:31.445	     to use this podcast as a
00:12:33.746	     bit of a journal.
00:12:34.807	     It'll be interesting to see, you know,
00:12:36.829	     if we keep going three, four,
00:12:38.610	     five years down the line,
00:12:40.691	     going back to episode number one, two,
00:12:43.253	     three, and be like, oh,
00:12:44.354	     this is what people were thinking.
00:12:45.615	     Think about how much AI has
00:12:46.996	     evolved since those conversations, right?
00:12:50.318	     So that's kind of why I'm
00:12:51.799	     doing this professionally, I think.
00:12:53.813	     Yeah,
00:12:54.573	     that'll be cool to see because like
00:12:56.274	     the internet, man,
00:12:57.075	     I don't think anybody back
00:12:58.976	     in 1995 when it was really
00:13:00.877	     picking up steam thought
00:13:02.158	     that you'd be able to pick
00:13:03.819	     up a cell phone and, you know,
00:13:06.481	     have McDonald's delivered
00:13:07.761	     to you in 15 minutes, right?
00:13:10.963	     20 minutes at your door, right?
00:13:12.884	     Like it's crazy how far we've come.
00:13:14.746	     That is cool.
00:13:15.486	     Yeah.
00:13:16.046	     I mean,
00:13:16.367	     think about all these new technologies.
00:13:19.188	     Whenever a new platform
00:13:20.869	     comes out or a new disruptive technology,
00:13:24.047	     It takes time for all the
00:13:25.989	     different permutations and
00:13:27.351	     ideas to come out of it.
00:13:29.414	     When you think about virtual reality,
00:13:31.596	     it's kind of the same thing, right?
00:13:32.918	     At first, everyone's like, oh,
00:13:33.819	     there's so many business applications.
00:13:35.441	     There's so many video game applications.
00:13:37.664	     But what's interesting
00:13:38.745	     is that that one
00:13:40.187	     didn't quite live up to the
00:13:41.789	     hype that it had.
00:13:43.992	     And obviously AI,
00:13:45.233	     we're having this hype cycle.
00:13:47.054	     I think you can see it from the stocks.
00:13:48.735	     But what I believe is that
00:13:50.677	     there will be value
00:13:51.818	     generated from an AI standpoint,
00:13:53.880	     regardless of what the
00:13:54.900	     stock price is saying of AI companies.
00:14:01.045	     In the long run,
00:14:03.327	     there will be business
00:14:05.268	     value that's generated.
00:14:06.429	     And I really firmly believe
00:14:08.431	     that that's undeniable.
00:14:10.500	     Yeah, I think so too, man.
00:14:12.161	     It's kind of like we see systems nowadays.
00:14:13.962	     It's not really a thing you do.
00:14:15.603	     It's just a core of your
00:14:17.624	     organizational DNA, right?
00:14:19.566	     Just like the dot-com boom too, right?
00:14:21.667	     There was the big hype and
00:14:23.428	     then it all crashed.
00:14:24.829	     But then in the longer run,
00:14:26.090	     you've got your Amazons, your Googles,
00:14:27.911	     your Netflixes and stuff like that,
00:14:30.052	     right?
00:14:31.313	     From the ashes, they arose.
00:14:32.854	     Yeah, definitely.
00:14:36.335	     The ones that made it, right?
00:14:37.916	     So let's talk about,
00:14:39.916	     let's hone in a little bit
00:14:41.437	     on large language models
00:14:43.177	     and specifically ChatGPT.
00:14:45.378	     ChatGPT was the catalyst, I think,
00:14:48.538	     for the hype last year, right?
00:14:51.239	     I mean, geez, just seeing the capability,
00:14:55.200	     what could be possible at
00:14:56.760	     the very beginning stages
00:14:59.181	     of it, right? And seeing
00:15:00.362	     videos of, oh, in, you know, 30
00:15:02.784	     seconds I created this
00:15:03.945	     website and this business,
00:15:05.066	     and all this kind of stuff
00:15:06.347	     that kind of came up from
00:15:07.327	     it. You know, obviously
00:15:08.308	     a lot of it was not very
00:15:10.370	     good now that we kind of
00:15:11.711	     see it laid out and the
00:15:13.812	     dust has settled. But it was,
00:15:16.794	     and I think it's probably
00:15:17.675	     going to be, for a lot of people,
00:15:18.596	     it's either going to be
00:15:19.897	     ChatGPT or some sort of
00:15:21.498	     similar large language model variation
00:15:24.340	     that people are either
00:15:25.361	     exposed to soon if they
00:15:27.724	     haven't already been exposed to it,
00:15:29.246	     or something like Copilot,
00:15:31.129	     where their organization is
00:15:32.390	     going to be purchasing Copilot.
00:15:34.193	     Or I think some people are
00:15:35.134	     already starting to get it
00:15:35.935	     as a preview without having
00:15:37.317	     to purchase it.
00:15:38.700	     Right.
00:15:39.080	     So you've gone through a lot of,
00:15:42.262	     let's talk about your experience in AI,
00:15:45.504	     right?
00:15:45.785	     Because, you know,
00:15:47.165	     you're obviously a big tech advocate,
00:15:49.147	     but anytime there's new tech,
00:15:51.028	     you go through this, like, motion of,
00:15:53.509	     you know, your initial journey.
00:15:55.331	     So why don't we talk about
00:15:56.912	     what your journey was like
00:15:59.093	     Gar, and maybe provide the
00:16:00.494	     audience with a little bit
00:16:01.294	     of context as to how that came to be.
00:16:05.817	     Sure.
00:16:06.838	     Um,
00:16:09.055	     Well,
00:16:09.395	     when we decided to do our business pivot,
00:16:11.396	     Yas came to me and said, hey,
00:16:13.578	     you ever heard of this thing called AI?
00:16:15.219	     Like, yeah, I have.
00:16:17.700	     He's like, well,
00:16:18.021	     I'm thinking of building a
00:16:18.981	     business around it, right?
00:16:20.002	     And I was like, at that point,
00:16:22.664	     I hadn't really dabbled with ChatGPT too,
00:16:26.286	     too much.
00:16:27.387	     I think I had seen some
00:16:29.248	     videos and I went in there
00:16:30.869	     and I specifically remember that
00:16:34.424	     My fiancée is a vet and she
00:16:36.645	     had this really challenging case.
00:16:38.185	     And I said, hey,
00:16:38.806	     why don't we ask ChatGPT
00:16:40.427	     and see what ChatGPT could come up with?
00:16:43.588	     And so we typed in all the
00:16:46.129	     parameters of the case and
00:16:47.710	     ChatGPT kind of gave a
00:16:49.011	     generic answer saying like, okay,
00:16:50.631	     it could be any of these things.
00:16:52.492	     From my perspective, I was like, oh,
00:16:54.853	     is this pretty good?
00:16:56.954	     From my fiancée's perspective, it was like,
00:16:58.875	     well, it's lacking a few things.
00:17:00.796	     It's not bad.
00:17:02.844	     But it is, you know,
00:17:05.727	     in my professional opinion,
00:17:07.709	     there are some other things
00:17:08.650	     that it hasn't really brought up.
00:17:10.011	     So that was my kind of initial experience.
00:17:12.894	     And, you know, working with Yas,
00:17:14.937	     Yassen had to convince me a
00:17:16.558	     little bit about, hey,
00:17:18.000	     there really is something here.
00:17:19.822	     We should dig into this a bit more.
00:17:22.745	     Right.
00:17:23.899	     And I think I definitely had
00:17:27.041	     my own personal skepticisms
00:17:29.922	     of AI as well.
00:17:31.924	     I think one of the mental
00:17:33.765	     traps that people can get
00:17:35.426	     caught in with AI is when
00:17:39.068	     you first hop into ChatGPT
00:17:40.829	     and ask it a question, you kind of go, oh,
00:17:43.610	     that answer wasn't very good.
00:17:45.751	     But I think there was a bit
00:17:47.933	     of a mental gap or a jump
00:17:50.134	     that people had that
00:17:53.017	     I don't know if people talk about it much,
00:17:54.358	     but ChatGPT, you kind of have to,
00:18:02.782	     or just AI in general,
00:18:04.123	     you kind of have to feed it
00:18:05.403	     certain things for it to
00:18:06.624	     give you the proper responses, right?
00:18:09.565	     For ChatGPT specifically,
00:18:11.606	     because that's the AI that
00:18:13.447	     everybody knows,
00:18:15.276	     lots of people say, hey,
00:18:16.357	     give me a recipe for X, right?
00:18:18.798	     And it's a very basic recipe for things.
00:18:22.200	     And then you look at the
00:18:23.901	     recipe it gives you, and you're like,
00:18:25.642	     well, that doesn't look very good.
00:18:27.263	     It doesn't have very specific things,
00:18:29.264	     right?
00:18:30.405	     And yet we have given it a simple,
00:18:34.727	     simple prompt and expected
00:18:37.049	     a perfect answer, right?
00:18:40.311	     When in reality, even if you're,
00:18:42.232	     let's say I'm asking Gordon
00:18:43.632	     Ramsay for something,
00:18:46.064	     I probably would give him a
00:18:47.885	     lot more context on, hey,
00:18:50.586	     I'm looking for this type
00:18:51.787	     of recipe that can fit
00:18:53.548	     within this time constraint.
00:18:55.048	     Oh, I'm allergic to X, Y, Z,
00:18:56.709	     so on and so forth.
00:18:58.090	     Can you give me a recipe for this?
00:19:00.771	     Right.
00:19:01.111	     And he probably would have
00:19:02.452	     with his experience and
00:19:03.432	     understanding of cooking.
00:19:04.960	     give me something instead. So
00:19:07.741	     my mental flaw, I think, at
00:19:09.301	     first was to give it the
00:19:10.461	     simplest prompt and
00:19:11.702	     expect the world from it,
00:19:14.062	     when in reality expecting it,
00:19:16.143	     basically, to read my mind. But,
00:19:17.583	     you know, we
00:19:19.704	     expect so much of this
00:19:20.584	     technology, I think we are
00:19:21.645	     taking some of it for
00:19:22.585	     granted. But the truth, or
00:19:24.565	     the reality of it, is that you
00:19:25.666	     have to
00:19:27.446	     have a little bit of give
00:19:28.346	     and take. It's a
00:19:29.127	     conversation. It's about
00:19:30.587	     elaborating on what your
00:19:31.967	     initial idea is.
00:19:35.369	     That was one of the first
00:19:36.850	     mental traps that I had about like, okay,
00:19:39.031	     I'm kind of skeptical about
00:19:41.032	     AI until I had that first
00:19:44.114	     paradigm shift of I
00:19:45.795	     actually need to give it
00:19:46.976	     more information.
00:19:48.297	     I can't give it a simple
00:19:49.438	     prompt and expect the best answer.
00:19:53.000	     And it should be treated as
00:19:57.923	     a back and forth, right?
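For anyone who wants to see what that shift looks like in practice, here is a minimal sketch, assuming the OpenAI Python client and an illustrative model name rather than anything specific mentioned on the show, comparing a bare prompt with a context-rich one:

```python
# Minimal sketch: a bare prompt vs. a context-rich prompt.
# Assumes the OpenAI Python SDK (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    # Send a single user message and return the model's text reply.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# The "mental trap": a one-line prompt, expecting a perfect answer.
bare = ask("Give me a recipe for pasta.")

# The shift: spell out constraints the way you would for a real chef.
detailed = ask(
    "Give me a weeknight pasta recipe that takes under 30 minutes, "
    "serves two, avoids nuts and shellfish (allergies), and uses only "
    "a stovetop. List ingredients with quantities, then numbered steps."
)

print(bare)
print(detailed)
```

The second call does nothing clever; it simply front-loads the constraints you would give a human expert, which is the whole point of the paradigm shift being described.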
00:20:00.744	     Now, my second paradigm shift was really
00:20:04.522	     assuming that AI wasn't
00:20:08.127	     going to produce anything good, right?
00:20:10.450	     It would help me get a lot
00:20:11.412	     of the legwork there,
00:20:13.495	     but it would still fall
00:20:14.997	     short for a lot of things.
00:20:16.759	     Now, the way that I kind of see it,
00:20:21.126	     or the best example of this
00:20:22.647	     was I remember you
00:20:24.588	     specifically asking me like, okay, well,
00:20:26.529	     why,
00:20:27.730	     why are you scraping information from
00:20:29.871	     this document manually?
00:20:31.973	     Right.
00:20:32.513	     And I said, well,
00:20:35.715	     I can't assume that
00:20:36.996	     ChatGPT would be able to scrape
00:20:39.017	     these different fields that
00:20:40.218	     I need perfectly.
00:20:41.819	     Right.
00:20:42.019	     I'm sure it'll give me some
00:20:43.059	     flaws, let alone when
00:20:45.121	     I'm trying to scrape 50 documents.
00:20:47.042	     Right.
00:20:48.162	     So you said, well, why don't we just try?
00:20:51.754	     And that was a turning
00:20:53.655	     moment for me because it
00:20:56.897	     was kind of like that's the rule of, hey,
00:20:59.479	     innocent until proven guilty.
00:21:01.040	     I think from a change
00:21:03.201	     management standpoint,
00:21:04.762	     a lot of people are resistant to change.
00:21:07.383	     But the thing is, you know,
00:21:09.405	     the truth was like when you
00:21:11.106	     entered those documents,
00:21:12.366	     it actually did a really,
00:21:13.367	     really good job of scraping
00:21:14.568	     that information.
00:21:15.208	     Like I would say for that
00:21:16.849	     specific use case,
00:21:18.010	     it was like 95% accurate.
00:21:20.873	     And so that saved me a lot of time.
00:21:23.154	     And if I hadn't gone through and said,
00:21:25.935	     okay, fine,
00:21:26.536	     I'll try and I'll appease you
00:21:28.557	     just by throwing all these
00:21:29.797	     documents in there and not
00:21:31.198	     doing it manually,
00:21:32.078	     I would have still been
00:21:33.739	     doing it manually.
00:21:34.679	     I wouldn't have given the AI a chance,
00:21:37.581	     I guess.
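As a rough illustration of that "just throw the documents at it" experiment, here is a hedged sketch; the field names, sample text, and model name are made up for the example, and the JSON-mode request assumes the OpenAI Python client:

```python
# Hedged sketch of AI-assisted field extraction from documents.
# The field names and model name are illustrative assumptions, not the
# actual fields from the episode. Assumes the OpenAI Python SDK.
import json
from openai import OpenAI

client = OpenAI()

FIELDS = ["vendor_name", "invoice_number", "invoice_date", "total_amount"]

def extract_fields(document_text: str) -> dict:
    # Ask the model to return only a JSON object with the requested fields,
    # using null where a field cannot be found in the text.
    prompt = (
        "Extract the following fields from the document below and reply "
        f"with a JSON object only (no prose): {', '.join(FIELDS)}. "
        "Use null for anything you cannot find.\n\n"
        f"Document:\n{document_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # ask for strict JSON
    )
    return json.loads(response.choices[0].message.content)

# A human still spot-checks the output: the roughly 95% accuracy mentioned
# above is a reason to review the result, not a reason to skip review.
sample = "Invoice #10492 from Acme Tools, dated 2024-01-15, total $1,240.50."
print(extract_fields(sample))
```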
00:21:39.222	     So now I have this philosophy of,
00:21:43.484	     I'm just going to throw
00:21:44.305	     everything I can at this
00:21:46.127	     tool until it proves me wrong and it says,
00:21:49.770	     yeah, you know,
00:21:52.452	     this is the threshold
00:21:53.954	     of where I use ChatGPT, or
00:21:56.216	     this is where I use AI and
00:21:57.497	     this is where I don't, right?
00:21:58.958	     I'm still on a personal
00:22:00.680     journey to discover where
00:22:02.261	     that threshold is.
00:22:03.643	     And I'll, you know,
00:22:04.724	     as new versions come out,
00:22:06.165	     I'll continue to push that boundary.
00:22:08.714	     so to speak.
00:22:09.254	     Yeah,
00:22:09.475	     and as different models come out and
00:22:11.576	     whatnot, they change.
00:22:12.717	     Sometimes they get better,
00:22:13.598	     sometimes they get worse.
00:22:15.139	     Exactly.
00:22:16.500	     And I think that that's a
00:22:17.321	     paradigm shift that a lot
00:22:18.362	     of people need to be able
00:22:19.302	     to go through with not just AI,
00:22:21.964	     but technology in general, right?
00:22:24.366	     And you have this technology that is very,
00:22:27.629	     very capable
00:22:29.557	     And it can do really amazing things.
00:22:33.060	     But you've got to know what
00:22:34.781	     it's good at and what it isn't.
00:22:36.122	     And how else will you know
00:22:37.303	     if you're not trying?
00:22:39.044	     Yeah.
00:22:39.325	     Right.
00:22:39.505	     And that was the
00:22:39.985	     conversation that we were having.
00:22:41.867	     That's like, you got to try.
00:22:42.927	     It's like, I don't know if it can do it.
00:22:44.909	     You know, it's like,
00:22:45.529	     I don't know if it can read
00:22:46.290	     a business process flow and tell me,
00:22:47.771	     you know,
00:22:48.031	     who the roles responsible for it are.
00:22:50.413	     Yeah.
00:22:51.214	     Right.
00:22:52.301	     I don't know.
00:22:52.761	     Let's try it out.
00:22:53.602	     I never gave it the benefit of the doubt,
00:22:56.764	     essentially.
00:22:59.006	     It's like if you're a
00:22:59.787	     teacher and you just assume
00:23:01.008	     your student can't do it,
00:23:02.369	     or if you're a parent and
00:23:03.109	     you just assume your kid can't do it,
00:23:05.071	     you're never going to let
00:23:05.951	     them try and do it.
00:23:07.953	     What I want to see is I want
00:23:09.494	     to see the AI fail,
00:23:11.095	     and I want to understand where it fails,
00:23:13.857	     why it fails.
00:23:15.330	     If I can.
00:23:16.131	     Right.
00:23:16.812	     And then as a leader or
00:23:19.395	     someone that has tasks to do,
00:23:21.157	     I will know whether this
00:23:23.720	     person or this student is up for the job.
00:23:27.885	     Right.
00:23:29.365	     Yeah,
00:23:29.825	     and that learning piece and having an
00:23:32.647	     understanding,
00:23:35.589	     it's like why things work
00:23:37.391	     the way that they work and
00:23:39.112	     where it was successful and
00:23:40.493	     where it wasn't successful.
00:23:42.254	     Because we've seen this many times, right?
00:23:44.596	     One person will do something,
00:23:46.477	     they'll get completely
00:23:47.518	     different results than the other person.
00:23:49.119	     And then one will just think like, oh,
00:23:51.641	     tech's not working well.
00:23:53.882	     But sometimes it is the tech.
00:23:56.246	     The tech's not quite there.
00:23:57.168	     It's obviously not perfect.
00:23:58.491	     It's got its challenges.
00:23:59.673	     But oftentimes,
00:24:01.456	     we just don't know how to
00:24:02.217	     interact with it.
00:24:04.059	     Right. And the CEO of
00:24:06.100	     NVIDIA came out recently,
00:24:07.840	     right? And he basically
00:24:10.582	     said that the next
00:24:13.202	     generation of workers, they
00:24:15.183	     shouldn't be learning how
00:24:16.324	     to program, they shouldn't
00:24:17.364	     be learning how to code.
00:24:18.504	     Like, that thinking is kind
00:24:20.385	     of gone, right? Before, it was
00:24:21.786	     like, you got to learn how
00:24:22.566	     to do that because systems
00:24:23.726	     are everything, and, you know,
00:24:25.467	     you want to be able to
00:24:26.367	     operate within various
00:24:27.608	     systems and be efficient
00:24:29.108	     and effective there. Now
00:24:30.529	     it's like, you got to understand how this
00:24:33.524	     you know,
00:24:33.844	     technology works and what's a
00:24:35.846	     good way for you to use it.
00:24:37.247	     And I think about
00:24:38.047	     Iron Man as an example, right?
00:24:40.349	     I don't know what it's called.
00:24:41.289	     Like what, you know, Marvel, right?
00:24:45.112	     Gar, what's his AI called?
00:24:47.293	     Jarvis.
00:24:48.014	     Jarvis, right?
00:24:48.774	     There you go, Jarvis, right?
00:24:49.675	     Like he basically, you know,
00:24:51.616	     he's just talking to it.
00:24:53.257	     He's doing things and it's kind of,
00:24:55.239	     he understands its
00:24:56.019	     strengths and its
00:24:56.660	     limitations and its weaknesses.
00:24:58.381	     And because he has such an
00:24:59.341	     amazing understanding,
00:25:01.082	     he's basically become one with tech.
00:25:04.626	     Quite literally, I guess,
00:25:05.487	     as part of the movie, right?
00:25:07.069	     And I would challenge people
00:25:10.253	     to start to do that, right?
00:25:12.235	     And I would challenge people
00:25:13.377	     to start to play around with it and see,
00:25:15.860	     can this be done or not?
00:25:17.642	     And if it can't be done,
00:25:19.104	     then you've learned that, okay,
00:25:20.165	     you know what?
00:25:20.506	     This is not...
00:25:21.667	     an application that is good
00:25:23.068	     for AI to, well, it's
00:25:25.230	     not good for us to leverage
00:25:26.751	     AI to effectively
00:25:29.052	     execute it. But then you're
00:25:30.433	     going to be surprised, and I
00:25:32.114	     have been surprised many
00:25:34.256	     times, with the things that
00:25:35.577	     it can do and that I can
00:25:37.778	     get better at it. And I
00:25:39.099	     think generative AI has
00:25:40.500	     such a huge potential in
00:25:42.501	     the business world to do
00:25:43.602	     good and to also do some bad,
00:25:46.917	     because it isn't great at everything.
00:25:49.258	     And just like you said, hey,
00:25:50.419	     innocent until proven guilty.
00:25:52.060	     You've also got to look at
00:25:53.001	     it from the other way around, right?
00:25:54.381	     Like you can't just assume
00:25:56.042	     that it's all good and that
00:25:57.783	     the quality is there
00:25:58.904	     because it provides you
00:26:00.165	     some very logical answers
00:26:02.686	     and it sometimes makes shit up.
00:26:04.607	     Yeah.
00:26:05.548	     I think that the role of the human,
00:26:08.790	     at least right now or in
00:26:10.791	     the near future, is that
00:26:15.665	     the human almost evolves
00:26:16.746	     into a quality assurance type of role.
00:26:21.830	     Speaking from personal experience,
00:26:23.392	     I've tried to get ChatGPT
00:26:24.773	     to build me code for certain things.
00:26:28.136	     And it gets it to about 80%
00:26:30.778	     of the way there.
00:26:32.439	     And then you put the code into, let's say,
00:26:34.761	     Power BI.
00:26:35.362	     I'm using DAX or whatever, Python.
00:26:38.544	     And Python comes back and
00:26:40.526	     spits back a bunch of errors.
00:26:43.368	     And where the human has to come in,
00:26:46.091	     or sorry, let me backtrack for a second.
00:26:49.113	     I've used the tool, or told ChatGPT,
00:26:52.356	     hey, your code has returned this error.
00:26:56.500	     It will say, oh, sorry.
00:26:59.142	     You're right.
00:26:59.763	     Let me fix that code for you.
00:27:01.204	     It gives me a new code.
00:27:03.766	     But I put it back in.
00:27:05.368	     Either new errors occur or
00:27:07.289	     the same error continues.
00:27:09.932	     And
00:27:11.558	     where the human's role
00:27:12.659	     shifts is that I don't have
00:27:14.401	     to do the heavy lifting for
00:27:15.643	     coding anymore.
00:27:16.504	     I have to become a very good debugger.
00:27:19.267	     I have to be very good at
00:27:20.869	     ensuring the quality of that code.
00:27:23.652	     So rather, you learn to
00:27:28.293	     program still, to understand
00:27:31.855	     how to debug those things,
00:27:33.455	     or alternatively, you know,
00:27:35.716	     maybe another idea is to be
00:27:37.817	     very good at figuring out
00:27:39.578	     how to debug specific or
00:27:41.439	     common problems that the AI
00:27:42.899	     tool comes up with. So we're
00:27:44.780	     in a little bit of that
00:27:45.660	     transitional period where
00:27:48.401	     AI can do a lot of the heavy lifting.
00:27:49.921	     You don't have to do things from scratch.
00:27:51.422	     It's going to make you
00:27:53.843	     more efficient and
00:27:55.283	     save you time at work.
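The "paste the error back in" loop described above has a simple structure. Here is a loose sketch of it, not the actual workflow from the episode; the prompt, the model name, and the way candidate code gets executed are all illustrative assumptions:

```python
# Loose sketch of the error-feedback loop: generate code, run it, and when it
# fails, append the error message to the conversation and ask again.
# Assumes the OpenAI Python SDK; model name and task are illustrative.
import subprocess
import sys
from openai import OpenAI

client = OpenAI()

def strip_fences(text: str) -> str:
    # Drop markdown fence lines (anything starting with a backtick) if the
    # model wraps its reply in them.
    lines = [l for l in text.strip().splitlines() if not l.strip().startswith("`")]
    return "\n".join(lines)

messages = [{"role": "user",
             "content": "Write a Python script that prints the first 10 "
                        "Fibonacci numbers. Reply with code only."}]

for attempt in range(3):  # a few rounds, then the human debugger steps in
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=messages,
    ).choices[0].message.content
    messages.append({"role": "assistant", "content": reply})

    result = subprocess.run([sys.executable, "-c", strip_fences(reply)],
                            capture_output=True, text=True)
    if result.returncode == 0:
        print(f"Worked on attempt {attempt + 1}:\n{result.stdout}")
        break
    # Same move as telling ChatGPT "your code has returned this error".
    messages.append({"role": "user",
                     "content": "Your code returned this error, please fix it:\n"
                                + result.stderr})
else:
    print("Still failing after 3 tries; time for the human to debug it directly.")
```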
00:27:57.504	     Right.
00:27:58.404	     Yeah.
00:27:59.104	     And I think that like
00:27:59.904	     obviously quality assurance
00:28:00.985	     is a big component of it.
00:28:02.485	     But I would say that before
00:28:04.066	     quality assurance,
00:28:04.946	     because obviously quality
00:28:05.786	     assurance comes much later
00:28:06.867	     in the process,
00:28:08.607	     you've got to be able
00:28:09.467	     to guide it correctly.
00:28:11.108	     And I think that that's
00:28:11.829	     where the majority of the
00:28:12.729	     people struggle right now.
00:28:14.730	     Like it's a new thing.
00:28:15.670	     And I think that most people
00:28:17.991	     haven't really grasped
00:28:20.394	     like a good enough
00:28:21.155	     understanding of what it is,
00:28:23.696	     let alone like how it
00:28:24.716	     functions and why it
00:28:25.677	     functions the way that it functions.
00:28:27.998	     That if you're not guiding
00:28:29.559	     it correctly and you're not
00:28:31.500	     structuring things well,
00:28:33.380	     and we won't get into too many details,
00:28:34.801	     right?
00:28:35.001	     But obviously like you've
00:28:36.082	     got to think about, you know,
00:28:39.083	     how you lay out the instructions,
00:28:40.804	     the things you say,
00:28:41.624	     the things you don't say,
00:28:43.185	     when you say certain things,
00:28:44.486	     how often you say certain things, right?
00:28:46.447	     Like,
00:28:47.327	     It's a very powerful request.
00:28:50.389	     Yeah, yeah. All right, there are
00:28:52.130	     so many different factors
00:28:53.171	     that you have to be able to
00:28:54.052	     consider, and it
00:28:57.054	     starts with guiding it. And
00:28:58.315	     if you guide it well from
00:28:59.595	     the first time, and let's
00:29:00.596	     say that you're coming up
00:29:01.357	     with this project that
00:29:02.798	     requires a whole bunch of
00:29:03.798	     different prompts, and, you
00:29:05.339	     know, it's a very complex
00:29:06.580	     thing to kind of build out,
00:29:09.482	     sure,
00:29:09.762	     the quality assurance at the
00:29:10.742	     end of it is going to be important,
00:29:11.943	     but like you need to guide
00:29:13.243	     it well from the beginning
00:29:14.883	     and guide it well throughout the process.
00:29:17.504	     And you've got to know where
00:29:18.765	     to put the human in and
00:29:20.725	     what role the human needs
00:29:22.005	     to play to make sure that
00:29:23.726	     you get something good at the end of it.
00:29:25.086	     Otherwise it's just going to
00:29:25.946	     be garbage in garbage out.
00:29:28.267	     Yeah.
00:29:28.507	     And we've seen lots of
00:29:29.367	     stories and news about that.
00:29:32.128	     Right.
00:29:33.268	     Have you heard of the, um,
00:29:37.150	     question that helps people
00:29:39.571	     determine your work style,
00:29:41.933	     where it's like, hey,
00:29:42.713	     would you rather fight a
00:29:47.616	     horse-sized duck or fight a
00:29:52.739	     hundred little duck-sized horses?
00:30:04.848	     Yeah, basically it's like,
00:30:06.990     would you rather take one
00:30:08.652	     big task and just tackle
00:30:11.094	     the whole task or chop up
00:30:13.095	     the little task into
00:30:14.196	     individual bits and then
00:30:15.818	     tackle every single one and
00:30:17.359	     kind of divide and conquer, right?
00:30:19.181	     I'm not sure if I got that analogy right,
00:30:21.423	     but you get the point, right?
00:30:24.565	     And what we've discovered,
00:30:25.806	     at least from my standpoint,
00:30:28.609	     is that the current state of
00:30:32.062	     these large language models
00:30:33.544	     works well with
00:30:34.905	     these smaller series of
00:30:37.768	     prompts, and with having you, as a
00:30:40.551	     human being, figure out what
00:30:41.752	     those little questions are
00:30:43.154	     going to be to help direct
00:30:45.556	     and create what your end
00:30:46.677	     product with the language model will be.
00:30:49.580	     Right.
00:30:50.060     So there is a specific approach.
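A small sketch of that divide-and-conquer pattern might chain prompts like this; the topic and sub-steps are hypothetical, and it reuses the same illustrative OpenAI client as the earlier sketches:

```python
# Sketch of the divide-and-conquer prompting pattern: the human decides the
# small questions, the model answers each, and earlier answers feed later steps.
# Sub-questions and model name are hypothetical illustrations.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

topic = "an internal FAQ page about our expense-reporting process"

# Duck-sized horses: small, directed prompts instead of one giant ask.
outline = ask(f"List the 5 most important questions to cover in {topic}.")
draft = ask(f"Write short, plain-language answers for these questions:\n{outline}")
polished = ask("Tighten the wording of this draft and remove jargon:\n" + draft)

print(polished)
```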
00:30:53.281	     Let's move on to the psychology of it.
00:30:55.722	     Right.
00:30:55.842	     Because one of the things
00:30:56.642	     that I know you kind of went through,
00:30:58.463	     Gar,
00:30:59.923	     in your experience with it,
00:31:01.823	     was this feeling of guilt.
00:31:04.984	     Right.
00:31:06.245	     So you want to talk us through that?
00:31:08.424	     Yeah, sure.
00:31:10.565	     I think it won't just be me.
00:31:12.346	     And this really circles back
00:31:13.927	     to some potential
00:31:15.728	     resistance to using AI tools.
00:31:18.989	     I think some people in
00:31:21.731	     essence think it's like
00:31:23.092	     cheating because the AI
00:31:26.033	     tool is doing so much of
00:31:27.394	     the heavy lifting for you
00:31:28.835	     and doing all the thinking.
00:31:29.855	     Well, you know,
00:31:31.176	     what am I good for if it's doing all this,
00:31:33.537	     right?
00:31:34.097	     Something that used to take
00:31:35.078	     me an hour now takes me, you know,
00:31:38.603	     two minutes, am I still as valuable,
00:31:42.345	     right?
00:31:43.346	     Now,
00:31:44.006	     I want to bring up a story that one of
00:31:46.908	     our colleagues, Catherine,
00:31:48.909	     brought up, and I've got to
00:31:49.770	     give her credit for the story,
00:31:50.691	     but I think it's extremely
00:31:52.492	     relevant to this scenario
00:31:54.373	     or this particular topic
00:31:55.774	     we're talking about.
00:31:57.700	     And the story is that in 1947, Betty Crocker,
00:32:00.441	     you're familiar with Betty Crocker,
00:32:01.561	     right?
00:32:01.802	     The cakes.
00:32:03.082	     Yeah.
00:32:03.842	     The cakes and the different
00:32:06.523	     mixes that they make that
00:32:08.544	     you can buy in grocery stores.
00:32:10.044	     They essentially created
00:32:13.385	     this new product that was a cake mix.
00:32:17.506	     And all you had to do was just add water.
00:32:20.810	     They figured out how to dehydrate eggs.
00:32:24.053	     They figured out how to
00:32:25.053	     dehydrate the milk so that
00:32:26.935	     everything could be on a shelf.
00:32:29.137	     And really you just needed
00:32:30.337	     the water to kind of combine everything.
00:32:33.220	     The baking powder or soda or whatnot,
00:32:35.682	     it's all in there.
00:32:36.542	     So, you know,
00:32:39.084	     they thought that they hit it
00:32:40.345	     out of the park for your
00:32:41.827	     typical busy homemaker.
00:32:44.610	     All they have to do is add water,
00:32:46.691	     put it in a pan, throw it in the oven,
00:32:48.593	     and bam, homemade cake, right?
00:32:53.376	     Don't have to buy eggs.
00:32:54.437	     You don't have to buy all
00:32:55.277	     these other ingredients and
00:32:56.778	     figure out the ratios.
00:32:57.819	     Like, man, it saves you so much time.
00:33:00.321	     But then what was
00:33:02.662	     interesting was that they
00:33:03.683	     had a really hard time
00:33:05.004	     convincing people to use
00:33:06.765	     the product because people
00:33:09.447	     started to think, well,
00:33:11.028	     if I'm just adding water, you know,
00:33:14.870	     I feel like I'm not really doing the work.
00:33:18.192	     They brought in a bunch of
00:33:19.093	     psychologists to analyze
00:33:22.195	     and ask the homemakers or
00:33:23.876	     the people that would
00:33:25.037	     potentially buy this product,
00:33:26.458	     why aren't you buying the product?
00:33:27.618	     And that's what they found,
00:33:28.679	     that there's this element of guilt.
00:33:31.781	     They felt like there should
00:33:32.942	     have been more work that
00:33:34.223	     they were supposed to do.
00:33:37.412	     It was really clever what they did.
00:33:39.456	     I think they changed
00:33:42.543	     the box and changed
00:33:44.106	     their formula to say: add eggs,
00:33:47.032	     add milk,
00:33:49.368	     so that the homemaker or
00:33:51.289	     whoever's buying the
00:33:51.969	     product felt like they had
00:33:53.090	     to do a little bit more,
00:33:54.071	     and then the sales just
00:33:55.592	     took off after that.
00:33:57.853	     So it's interesting to think
00:33:59.494	     that there's a bit of a
00:34:00.914	     human psychology standpoint
00:34:03.696	     to this as well,
00:34:04.897	     that there's this perception of, oh,
00:34:07.218	     I'm using this AI tool and
00:34:11.781	     it's just not gonna be as good, or,
00:34:15.247	     am I really, like,
00:34:17.729	     should I be putting in more
00:34:18.870	     work and more of my time
00:34:21.612	     doing this work instead of
00:34:23.894	     getting AI to do all the heavy lifting,
00:34:26.016	     right?
00:34:26.977	     And I think that we need to
00:34:29.659	     take a step back and be like, okay,
00:34:31.320	     what is the task that I
00:34:32.841	     have at hand and what is my
00:34:34.262	     end deliverable?
00:34:36.304	     How can I use AI or whatever
00:34:39.527	     tool to help me do the job faster,
00:34:45.822	     while at the same time
00:34:48.148	     applying whatever I need
00:34:50.975	     to do to ensure that the
00:34:52.138	     quality is still there?
00:34:54.638	     So that's just kind of the
00:34:56.019	     interesting story about the guilt part,
00:34:57.500	     right?
00:34:57.700	     So I definitely went through
00:34:59.200	     that experience of like, you know,
00:35:00.981	     I'm supposed to do all this research.
00:35:03.283	     I'm supposed to comb through
00:35:04.383	     all these reports to figure out X, Y,
00:35:06.344	     and Z. If I just ask ChatGPT,
00:35:08.746	     I'm sure it's going to miss
00:35:10.546	     the X in the X, Y, and Z, right?
00:35:13.428	     So, but you just don't know until you ask.
00:35:16.730	     And then once it does spit
00:35:18.171	     something out for you,
00:35:19.031	     my brain shifts to thinking, okay,
00:35:21.357	     is there anything missing here?
00:35:22.618	     Do I need to go and ask it
00:35:25.340	     more questions or just supplement it?
00:35:27.902	     Right.
00:35:28.983	     Right.
00:35:30.023	     And it's funny because decades later,
00:35:31.524	     you see all these bakeries
00:35:32.885	     that are basically just, you know,
00:35:37.198	     creating the products of a Betty Crocker,
00:35:39.459	     right?
00:35:40.539	     Putting some decoration on
00:35:41.739	     top and selling it on their own, right?
00:35:44.840	     It's interesting to see how
00:35:45.860	     perception has changed so much.
00:35:47.000	     I didn't realize that people
00:35:48.061	     were so hesitant about it before.
00:35:51.482	     That's interesting.
00:35:52.002	     Imagine, yeah,
00:35:52.682	     you could have Betty
00:35:53.542	     Crockers that are just add water.
00:35:56.143	     It could have been simpler.
00:35:57.912	     But now we have to add egg
00:35:59.253	     and milk just because of a
00:36:02.515	     perception thing.
00:36:03.675	     That's cool, man.
00:36:05.016	     So look,
00:36:05.977	     we are about almost 40 minutes into this,
00:36:08.358	     right?
00:36:09.119	     What do you think about just AI in general,
00:36:11.400	     Gar?
00:36:11.740	     Is this a good thing?
00:36:12.541	     Is this a bad thing?
00:36:13.481	     And what would you say to
00:36:15.583	     the listeners here who are
00:36:18.865	     thinking about AI and what
00:36:21.887	     would be a good way for
00:36:22.847	     them to start to understand
00:36:25.249	     it or continue to
00:36:26.329	     understand it a little bit better?
00:36:30.254	     I think that if you're relying on it,
00:36:34.840	     not just for entertainment purposes,
00:36:37.543	     you should be skeptical
00:36:38.785	     because it's kind of your
00:36:40.107	     name on the line at the end
00:36:41.388	     of the day for whatever
00:36:42.750	     work product you're delivering.
00:36:45.874	     But, you know,
00:36:49.300	     challenge your original
00:36:51.302	     thought of, like, oh,
00:36:52.163	     everything has to be done
00:36:53.165	     by myself. Use these tools
00:36:56.609	     that are available for you
00:36:58.090	     to see how they can change
00:37:00.253	     your workflow, right? It
00:37:02.716	     could be, I know for me
00:37:04.418	     specifically, I work better
00:37:05.840	     with a draft one that
00:37:07.922	     someone on our team may have
00:37:09.721	     brought up, and I'm really
00:37:11.102	     good at building on that,
00:37:12.664	     right? That's why you and I
00:37:14.145	     work so well together,
00:37:15.186	     because you come up with the
00:37:17.047	     first draft and I kind of
00:37:18.869	     give it the polish, if
00:37:20.450	     you will. So if you're
00:37:23.393	     relying on it, be skeptical,
00:37:25.635	     but at the same time, push
00:37:27.617	     those boundaries. Again,
00:37:31.000	     innocent until proven
00:37:31.921	     guilty: assume that AI can
00:37:33.602	     do it until it tells you that it can't.
00:37:36.906	     But don't assume so much
00:37:38.327	     that you don't check the quality.
00:37:40.007	     That you don't check your work.
00:37:41.448	     Otherwise you'll get yourself in trouble, right?
00:37:43.969	     Still, yeah.
00:37:45.569	     Right.
00:37:46.150	     That's good, man.
00:37:46.750	     Good advice, Gar.
00:37:47.430	     I hope our listeners take that on board.
00:37:49.611	     So guys, everybody,
00:37:50.971	     thank you so much for listening.
00:37:52.332	     If you stayed here till the
00:37:53.432	     end of the show,
00:37:54.913	     we're very excited about it.
00:37:56.354	     We've got a lot of exciting
00:37:57.454	     guests that we'll be having
00:37:59.035	     on from different walks of life,
00:38:02.316	     different careers, different industries,
00:38:04.136	     different roles.
00:38:05.917	     different levels of the
00:38:07.137	     hierarchy and everything else in there.
00:38:10.478	     And I think it's going to be
00:38:11.319	     very interesting to see different inputs.
00:38:14.140	     The next episode that we will have,
00:38:15.420	     it's going to be an IT
00:38:16.260	     director of a biotech
00:38:17.460	     company that's based out of California,
00:38:20.021	     but he has spent the
00:38:20.921	     majority of his career in
00:38:23.322	     aerospace and aviation manufacturing.
00:38:26.243	     So it's going to be
00:38:26.743	     interesting to kind of hear
00:38:27.764	     from his perspective, right?
00:38:29.084	     Where does he see manufacturing going?
00:38:31.185	     Where does he see tech going?
00:38:32.825	     He started his career in
00:38:33.926	     operations before moving
00:38:35.446	     into the more IT and
00:38:37.647	     digital transformational kind of world,
00:38:40.128	     right?
00:38:40.348	     So he's got that perspective
00:38:41.828	     of process and technology
00:38:43.129	     and kind of working together.
00:38:44.149	     So it'll be interesting to
00:38:45.270	     see what his input is on
00:38:47.510	     some of these topics.
00:38:48.671	     We'll be a lot more structured, right,
00:38:50.051	     in terms of some of the
00:38:50.731	     questions we might want to ask him.
00:38:52.672	     about there. We're always
00:38:54.453	     looking for feedback, right?
00:38:55.633	     This is episode zero. You
00:38:57.433	     know, we're always
00:38:59.154	     looking to kind of improve
00:39:00.234	     as well. If you've got any
00:39:01.134	     suggestions, any thoughts,
00:39:02.855	     any things you want us to
00:39:03.775	     talk about, anybody that you
00:39:04.935	     think should be on the show
00:39:05.936	     or we should chat with,
00:39:07.556	     please let us know, right?
00:39:10.837	     Reach out to me on LinkedIn,
00:39:12.298	     or to Garlon on LinkedIn,
00:39:14.018	     right? Or send me an email,
00:39:15.218	     comment on wherever this
00:39:16.619	     video is posted. Yeah,
00:39:18.139	     comment whatever you want there, right?
00:39:21.380	     Let us know.
00:39:22.681	     And we still need a name, right?
00:39:24.682	     As you guys can see here, you know,
00:39:26.303	     placeholder for a cool podcast name,
00:39:28.404	     right?
00:39:28.564	     So give us a cool podcast name.
00:39:31.105	     We're looking for them.
00:39:31.805	     We're not very good at naming things.
00:39:33.886	     Remember that.
00:39:34.667	     So please help us out on that.
00:39:36.688	     We're going to be publishing
00:39:37.508	     this every couple of weeks.
00:39:39.129	     Two to three weeks is kind
00:39:40.130	     of what we're aiming for
00:39:40.950	     each of the episodes.
00:39:42.271	     Follow us on LinkedIn to get
00:39:44.152	     more information on when
00:39:46.594	     the next episode is going to be.
00:39:48.235	     So, Gar,
00:39:48.856	     thanks a lot for being on the show, man.
00:39:50.276	     I think we'll bring you back
00:39:51.237	     every now and then to kind of see.
00:39:52.858	     I like that whole thing that
00:39:53.899	     you're talking about,
00:39:54.639	     kind of like documenting
00:39:56.401	     things and how things evolve.
00:39:57.761	     I think it'll be cool to see later on,
00:40:00.443	     like, hey,
00:40:00.964     some of the things that we were
00:40:01.964	     talking about,
00:40:02.625	     whether they came to life
00:40:03.665	     or they didn't come to life or whatnot,
00:40:06.387	     man.
00:40:06.607     So thanks for coming on.
00:40:08.128	     Yeah, thanks for having me.
00:40:09.871	     All right, guys.
00:40:11.034	     See you next time.
00:40:12.337	     See ya.