1
00:00:00,070 --> 00:00:03,370
Todd Kane: Today I am joined by
Ashley Cooper, COO at CyberDrain

2
00:00:03,370 --> 00:00:05,430
and VP of Community at Rewst.

3
00:00:05,470 --> 00:00:10,090
Ashley has spent over 15 years in the
IT channel shaping MSP operations and

4
00:00:10,090 --> 00:00:14,950
customer experiences at companies like
Auvik and Gradient MSP, while serving

5
00:00:14,980 --> 00:00:19,420
on the MSPGeek board and actively
moderating communities, including the

6
00:00:19,840 --> 00:00:23,470
MSP subreddit. She's known for
championing community-driven

7
00:00:23,470 --> 00:00:29,290
development on tools like CyberDrain's
CIPP, and for being an active vibe coder.

8
00:00:29,320 --> 00:00:30,100
Welcome, Ashley.

9
00:00:31,159 --> 00:00:31,789
Ashley: Thank you.

10
00:00:31,819 --> 00:00:32,989
Wow, that was such a good intro.

11
00:00:32,989 --> 00:00:35,149
I feel like I'm always
stumbling on those parts.

12
00:00:35,189 --> 00:00:41,159
Todd Kane: We connected through a
mutual community focused on AI and,

13
00:00:41,209 --> 00:00:44,509
some of the work that you're doing,
I found really, really fascinating.

14
00:00:44,509 --> 00:00:48,319
Wanted to have you on to dive a little
deeper on this, I guess to lead in,

15
00:00:48,639 --> 00:00:51,439
to give you a bit of bona fides on

16
00:00:51,489 --> 00:00:54,099
your activity in this space.

17
00:00:54,399 --> 00:01:00,009
You wanna tell us about your little
gift that Lovable sent you as a

18
00:01:00,009 --> 00:01:02,739
massive contributor to tokens on Lovable?

19
00:01:02,789 --> 00:01:03,479
Ashley: Yeah.

20
00:01:03,759 --> 00:01:05,429
they sent me this, this little

21
00:01:06,019 --> 00:01:09,709
Lovable light and it has the little heart
logo and it moves the same way that it

22
00:01:09,709 --> 00:01:12,859
does on their, on their site and stuff.

23
00:01:12,859 --> 00:01:13,879
And so I thought that was really cool.

24
00:01:13,879 --> 00:01:14,719
I actually looked it up.

25
00:01:14,719 --> 00:01:15,769
Those things are not cheap.

26
00:01:15,769 --> 00:01:19,769
So, it was made, like,
in partnership with some

27
00:01:19,799 --> 00:01:21,959
specific company for it.

28
00:01:21,959 --> 00:01:25,539
But this was something that
surprised me a little bit, 'cause

29
00:01:26,004 --> 00:01:28,944
You know, at the end of
the year, just like everybody does,

30
00:01:28,944 --> 00:01:30,744
they did the Spotify-style Wrapped.

31
00:01:30,954 --> 00:01:34,764
So they had their little Lovable
Wrapped, and they gave a bunch of stats

32
00:01:34,764 --> 00:01:36,174
and some of those stats blew me away.

33
00:01:36,204 --> 00:01:40,074
Yeah, they said I was, like, a top 0.01%

34
00:01:41,184 --> 00:01:44,394
user or something like that
of their application.

35
00:01:44,394 --> 00:01:47,754
And I was sharing it in their Discord
community actually, like some of

36
00:01:47,754 --> 00:01:50,634
my things, just thinking, oh yeah,
a lot of people are like this.

37
00:01:50,944 --> 00:01:54,544
and even some of their own staff
were in there going like, wow,

38
00:01:54,694 --> 00:01:57,064
like you almost got this guy beat.

39
00:01:57,544 --> 00:01:58,094
Todd Kane: It's amazing.

40
00:01:58,214 --> 00:02:01,844
Yeah, like, I mean, that is a,
a very, it's a loved platform

41
00:02:01,874 --> 00:02:03,854
as it's aptly named, I suppose.

42
00:02:03,854 --> 00:02:05,704
So to be in that, that

43
00:02:06,454 --> 00:02:10,444
echelon of contributors and
workers in that platform.

44
00:02:10,444 --> 00:02:11,764
I think that's pretty incredible.

45
00:02:12,084 --> 00:02:15,444
I'd love to dig into like what are
some of the things that you're building

46
00:02:15,474 --> 00:02:20,484
in Lovable and where do you get
inspiration for the projects, in

47
00:02:20,574 --> 00:02:22,104
what you're coding in general.

48
00:02:23,258 --> 00:02:27,808
Ashley: Yeah, so when I started
using it, I was sort of already

49
00:02:27,808 --> 00:02:29,698
trying to find solutions like this.

50
00:02:29,698 --> 00:02:36,238
I was building, like, automations
that fed into front ends, and

51
00:02:36,238 --> 00:02:38,548
like I even, ever since I was like.

52
00:02:39,868 --> 00:02:40,198
I dunno.

53
00:02:40,288 --> 00:02:44,608
In grade, I guess in grade eight,
I built my first HTML page that was

54
00:02:44,608 --> 00:02:46,468
like a Backstreet Boys fan site.

55
00:02:46,628 --> 00:02:50,438
And it's always really been
like, I'm not a web developer.

56
00:02:50,438 --> 00:02:51,788
I'm, I mean, I understand

57
00:02:52,418 --> 00:02:56,648
things, mainly because
my ADHD doesn't let me stop

58
00:02:56,768 --> 00:02:58,118
trying to figure them out.

59
00:02:58,568 --> 00:03:03,818
but the, the thing that drew me to
Lovable originally was, well, for

60
00:03:03,818 --> 00:03:05,798
one, it had a freemium offering.

61
00:03:05,798 --> 00:03:08,588
So you could use up to like
five credits every day.

62
00:03:08,588 --> 00:03:12,278
And if your initial prompt would've been
good enough, you could kind of like

63
00:03:12,638 --> 00:03:13,838
build stuff that way.

64
00:03:13,838 --> 00:03:15,728
Still faster than if you hadn't.

65
00:03:15,728 --> 00:03:21,058
So I originally actually like I bought
their $20 subscription, and then I

66
00:03:21,058 --> 00:03:24,358
got really frustrated at how little
I could use and then I canceled it.

67
00:03:24,358 --> 00:03:27,928
And this was so early on that I think
it's like one of their founders reached

68
00:03:27,928 --> 00:03:29,398
out to me asking why I canceled.

69
00:03:29,398 --> 00:03:29,998
And I was mad.

70
00:03:29,998 --> 00:03:33,238
I was like, you make me buy
the whole subscription upfront

71
00:03:33,238 --> 00:03:34,228
and I don't wanna do that.

72
00:03:34,233 --> 00:03:35,368
And I want some usage.

73
00:03:35,368 --> 00:03:36,058
I wanna build.

74
00:03:36,448 --> 00:03:39,088
But then I, inevitably it was
a good product, so I came back.

75
00:03:39,138 --> 00:03:42,288
the first thing I tried to build,
which is something I would probably

76
00:03:42,288 --> 00:03:45,338
say is a failure on everybody's part.

77
00:03:45,338 --> 00:03:49,328
And this brings us back to a whole
different topic on automation, maturity.

78
00:03:49,328 --> 00:03:51,308
But like, I wanted to go full.

79
00:03:51,308 --> 00:03:52,928
Like, I was like, I have this problem.

80
00:03:53,378 --> 00:03:56,738
My, one of my favorite tools just stopped

81
00:03:57,278 --> 00:03:58,178
being available.

82
00:03:58,178 --> 00:03:59,528
It was called Orbit at the time.

83
00:03:59,528 --> 00:04:01,118
It managed community software.

84
00:04:01,448 --> 00:04:05,408
It merged profiles together, and you
could see running lists of things.

85
00:04:05,408 --> 00:04:06,548
I'm like, I wanna rebuild that.

86
00:04:08,078 --> 00:04:12,848
And so like, I got as far as like,
like set, I set up a, a database.

87
00:04:12,848 --> 00:04:15,408
I set up a, a Discord authentication.

88
00:04:15,408 --> 00:04:19,788
I set up all these things and then I
realized that like, I mean, we can get

89
00:04:19,788 --> 00:04:22,818
into all the ways that I realized it,
but at a surface level, I realized, like,

90
00:04:23,628 --> 00:04:26,868
I've caused more mess than like anything.

91
00:04:26,868 --> 00:04:28,908
'cause I was like, I was using
things that I didn't understand.

92
00:04:28,908 --> 00:04:29,808
I didn't know what to do with.

93
00:04:29,808 --> 00:04:32,578
And, at the time everybody was
like, oh, it's a prototyping tool.

94
00:04:33,148 --> 00:04:38,908
And so where I found a niche was, I
was on this, actually it was around

95
00:04:38,908 --> 00:04:44,888
now last year, I was on this
hackathon thing with John Hardin and,

96
00:04:44,938 --> 00:04:46,948
Jeffrey Newton and a few other people.

97
00:04:46,998 --> 00:04:51,198
We were focused on, trying to build
something in 20 minutes, right?

98
00:04:51,198 --> 00:04:53,448
And so I was like, I'm
not gonna go all out here.

99
00:04:53,448 --> 00:04:58,638
Like, what, what drives, if I were
to think about where the biggest

100
00:04:58,638 --> 00:05:04,998
gaps are, it's around the fact that
like there's all this data out there

101
00:05:05,148 --> 00:05:10,068
and there's all of this knowledge
in like an AI corpus that we just

102
00:05:10,068 --> 00:05:11,688
don't know how to harness or capture.

103
00:05:11,988 --> 00:05:14,118
And one of the things that

104
00:05:14,538 --> 00:05:20,448
I always subscribed to was like, I
don't need to have AI involved in

105
00:05:20,448 --> 00:05:24,678
my outcomes, but I can still use AI

106
00:05:25,383 --> 00:05:27,783
to get to those deterministic outcomes.

107
00:05:27,783 --> 00:05:32,493
So I was like practicing like human
in the lead, you know, really more so

108
00:05:32,493 --> 00:05:35,733
since, since then where I was like,
I want, I have this one specific

109
00:05:35,733 --> 00:05:41,103
problem that I typically find hard
to solve for, but a machine knows

110
00:05:41,103 --> 00:05:46,938
how to solve for really well, which
is converting JSON data into a CSV.

111
00:05:48,378 --> 00:05:51,048
and so I literally just made this
like little app and it took me

112
00:05:51,048 --> 00:05:54,768
like a one-shot prompt because it
knows how to read JSON really well.

113
00:05:55,248 --> 00:05:59,088
And I just was like, create me this
thing that I can load the JSON in

114
00:05:59,328 --> 00:06:02,358
and then I can choose which fields I
want and then I can export that out.

115
00:06:02,838 --> 00:06:04,278
And that was what I presented.

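The core of a JSON-to-CSV picker like the one she describes is small; here is a minimal sketch in TypeScript, assuming an array of flat objects. The field names and sample data are illustrative, not from her actual app.

```ts
// Flatten an array of JSON objects into CSV, keeping only the chosen fields.
type Row = Record<string, unknown>;

function toCsv(rows: Row[], fields: string[]): string {
  const escape = (v: unknown): string => {
    const s = v === null || v === undefined ? "" : String(v);
    // Quote any value containing commas, quotes, or newlines (RFC 4180).
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const header = fields.join(",");
  const lines = rows.map((r) => fields.map((f) => escape(r[f])).join(","));
  return [header, ...lines].join("\n");
}

// Load the JSON, pick the fields you want, export the result.
const data: Row[] = [
  { name: "Ada", role: "engineer", internal_id: 7 },
  { name: "Grace", role: "admiral, ret.", internal_id: 9 },
];
console.log(toCsv(data, ["name", "role"]));
```
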
116
00:06:04,278 --> 00:06:07,518
And so at the time, I was really focused on

117
00:06:07,713 --> 00:06:12,423
I don't wanna do anything that anybody
can look at and they say, oh, here's

118
00:06:12,423 --> 00:06:15,453
the reason why vibe coding is bad,
or Here's the reason why this is bad.

119
00:06:15,453 --> 00:06:23,043
I wanted to do very specific local-first,
browser-as-OS type projects that prove

120
00:06:23,043 --> 00:06:26,733
that this is something that if I had
the skill I could have built without

121
00:06:26,733 --> 00:06:28,593
AI and it would've looked the same way.

122
00:06:28,593 --> 00:06:33,723
And it has no real security implications
because it's BYO and it's in your own

123
00:06:33,723 --> 00:06:35,523
browser and, like, whatever, I mean,

124
00:06:36,873 --> 00:06:37,863
at a high level, right?

125
00:06:37,983 --> 00:06:41,913
it's not like it's connecting to databases
and their, like RLS policies aren't set

126
00:06:41,913 --> 00:06:45,333
up properly and like, oh, now everybody
can like, prompt inject your stuff.

127
00:06:45,703 --> 00:06:47,713
but it's just very simple,
deterministic stuff like that.

128
00:06:47,713 --> 00:06:51,433
And so I built that and then
somebody recommended that I,

129
00:06:51,483 --> 00:06:54,783
because of how quickly I did that,
I'm like I said, I could do this.

130
00:06:55,693 --> 00:06:59,713
Like I can build something that's
like ready to go that uses AI

131
00:06:59,713 --> 00:07:02,953
intentionally, that is only
solving one very specific thing.

132
00:07:03,353 --> 00:07:05,963
I could do one a day for a month
and then somebody said, bet.

133
00:07:05,973 --> 00:07:09,543
I just shared one simple use case
a day, and they were all local

134
00:07:09,543 --> 00:07:12,163
first, browser-as-OS kind of stuff.

135
00:07:12,163 --> 00:07:14,833
Where it would be like, here's
a tech showcase, here's a

136
00:07:14,833 --> 00:07:18,613
technology that is traditionally

137
00:07:19,838 --> 00:07:23,648
something that is difficult to use
or has historically been available

138
00:07:23,678 --> 00:07:29,228
but hasn't really had the cognitive,
you know, awareness of how to use it.

139
00:07:29,798 --> 00:07:32,108
And I'm gonna use AI to help
me learn how to do that.

140
00:07:32,588 --> 00:07:37,698
And then so back to your question about,
you know, like the how do I learn, I

141
00:07:37,698 --> 00:07:43,518
learn literally through it because I
believe, like, text-based training is

142
00:07:44,373 --> 00:07:45,243
democratized now.

143
00:07:45,243 --> 00:07:47,523
Like it's, it like the AI has it.

144
00:07:48,003 --> 00:07:50,883
If I wanted to learn, all I
need to do is prompt it, right?

145
00:07:51,843 --> 00:07:55,083
Know what I'm looking for, know
where like the problems are.

146
00:07:55,763 --> 00:07:58,623
You asked me a lot of, I actually
can't even remember whether I'm

147
00:07:58,713 --> 00:08:01,623
answering, like, the question that
you asked me earlier or

148
00:08:01,623 --> 00:08:03,033
whether I'm answering the first question.

149
00:08:03,083 --> 00:08:05,513
Do you wanna bring me back
to any specific points?

150
00:08:05,989 --> 00:08:07,759
Todd Kane: we're jumping a little
ahead, which is totally fine.

151
00:08:07,789 --> 00:08:10,579
this was, like determining
what to make, right?

152
00:08:10,579 --> 00:08:11,119
Like, like,

153
00:08:11,223 --> 00:08:11,943
Ashley: Yeah, yeah,

154
00:08:12,079 --> 00:08:15,619
Todd Kane: of, having those ideas and
like, like what do you sort of dive

155
00:08:15,619 --> 00:08:17,809
into and where do those ideas come from?

156
00:08:18,393 --> 00:08:18,723
Ashley: yeah.

157
00:08:18,963 --> 00:08:23,643
And so a lot of them come from like
lying in bed or, or thinking out loud

158
00:08:23,643 --> 00:08:26,643
and being like, oh, I wish that I
had something that could do this, or.

159
00:08:26,884 --> 00:08:27,104
Todd Kane: Yep.

160
00:08:27,708 --> 00:08:32,068
Ashley: I'm a big proponent of 'earn your
automation,' and I have been my whole career.

161
00:08:32,068 --> 00:08:39,588
And so some of this comes from just,
years of me doing things manually to

162
00:08:39,588 --> 00:08:45,978
build out a human process or a human,
like SOP, around how something is done

163
00:08:45,978 --> 00:08:49,788
that I, I just kind of have these things
that'll like pop up now where I'll be

164
00:08:49,788 --> 00:08:53,808
like, oh, like I know how to tell an
AI to do this deterministically now.

165
00:08:54,543 --> 00:08:58,743
one of the examples was like, one of the
things that I'm always playing with that

166
00:08:58,743 --> 00:09:04,183
is not really that helpful is like a note
taking app slash task management app.

167
00:09:04,663 --> 00:09:06,583
And I think everybody's
trying to solve that problem.

168
00:09:06,583 --> 00:09:13,603
But I'm trying to like, think, like,
I use it as a way to, I use it as

169
00:09:13,603 --> 00:09:17,338
a way to figure out how I solve
those problems cognitively as well.

170
00:09:18,488 --> 00:09:22,638
Other things that I've built for
fun that have actually gone, like,

171
00:09:22,638 --> 00:09:26,788
the most viral, I guess, would be the
ones that do have AI involved in them.

172
00:09:26,788 --> 00:09:28,678
Like one of them was like a Spice checker.

173
00:09:29,128 --> 00:09:31,778
Jason Slagle asked me to make that,
actually. He was like, he was like,

174
00:09:31,778 --> 00:09:36,098
I wanna know what spice level my,
Insta, my, my LinkedIn post is at.

175
00:09:36,098 --> 00:09:41,648
And so you could like put in a LinkedIn
URL of a post and it would be like, here's

176
00:09:41,648 --> 00:09:45,788
what Spice Girl your, your post reads as.

177
00:09:45,788 --> 00:09:46,208
Right.

178
00:09:46,208 --> 00:09:49,388
And then I, one of the things I added
to it was like, you could just scale

179
00:09:49,388 --> 00:09:53,138
it so you're like, I want this to be
more scary spice and less baby spice.

180
00:09:53,348 --> 00:09:55,388
And then it would rewrite it in that term.

181
00:09:56,048 --> 00:10:00,968
My favorite thing that I've built and
it, to answer like how I learn as well,

182
00:10:01,748 --> 00:10:07,418
has been my own, um, AI resources tool
that I'm actually like, I post, like I've

183
00:10:07,418 --> 00:10:10,868
posted this like AI Ash blog, but like it

184
00:10:12,038 --> 00:10:14,588
has been like, I'm gonna
build a glossary of terms.

185
00:10:14,618 --> 00:10:18,188
I'm gonna build, um, a learning process.

186
00:10:18,188 --> 00:10:19,358
Like, where did this come from?

187
00:10:19,358 --> 00:10:20,768
It didn't happen overnight.

188
00:10:21,398 --> 00:10:24,488
It's probabilistic, not deterministic,
but what does that even mean?

189
00:10:24,488 --> 00:10:26,438
People keep talking about transformers.

190
00:10:26,438 --> 00:10:27,578
What are those things?

191
00:10:27,998 --> 00:10:29,588
Is it a, is it a hardware?

192
00:10:29,588 --> 00:10:31,568
Is it a, is it a technology?

193
00:10:31,568 --> 00:10:34,118
Is it a terminology, like a methodology?

194
00:10:34,348 --> 00:10:35,603
All these questions, like, what

195
00:10:36,583 --> 00:10:41,653
did machine learning look
like before the LLM was released?

196
00:10:41,653 --> 00:10:43,273
You know, like stuff like that.

197
00:10:43,273 --> 00:10:49,183
And so I've been building this
teaching app from a pedagogical

198
00:10:49,783 --> 00:10:51,683
perspective there.

199
00:10:51,683 --> 00:10:56,463
And then, so it has little glossary
pages and, my favorite part about

200
00:10:56,463 --> 00:11:00,483
it is like, kind of playing on my
own ADHD awareness as well, is that

201
00:11:00,483 --> 00:11:01,953
not everybody learns the same way.

202
00:11:02,413 --> 00:11:07,403
and so when you click on one of the
terms in the glossary, and you expand

203
00:11:07,403 --> 00:11:11,243
the deep dive, there's a section
that's called 'Explain Like I'm...', and it

204
00:11:11,243 --> 00:11:16,203
actually uses AI, for whatever you put
in there to explain that term to you.

205
00:11:17,883 --> 00:11:21,423
In language that the 'explain like
I'm...' audience would understand.

206
00:11:21,453 --> 00:11:25,383
So it's, it's interesting, but
it's also fun because you can be

207
00:11:25,383 --> 00:11:28,953
like, explain like I'm a caveman
and then it, like, tries its best.

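Mechanically, a feature like that is mostly prompt templating. A rough sketch of how such a prompt might be assembled; the function and wording are guesses at the idea, not her implementation.

```ts
// Build the "explain like I'm ..." prompt from a glossary entry.
// Everything here is illustrative; only the pattern matters.
function explainLikeIm(term: string, definition: string, audience: string): string {
  return [
    `Explain the term "${term}" as if I'm ${audience}.`,
    `Use only vocabulary and analogies that ${audience} would understand.`,
    `Stay faithful to this glossary definition: ${definition}`,
    `Keep it under 120 words.`,
  ].join("\n");
}

// e.g. explainLikeIm("transformer", "A neural network architecture...", "a caveman")
```
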
208
00:11:29,003 --> 00:11:33,353
but everything that I build, and I think
this is a case for a lot of people, has

209
00:11:33,353 --> 00:11:37,253
been either selfishly solving something
that takes me a lot of time or I'm

210
00:11:37,253 --> 00:11:42,623
curious about, or has been something
that I hear people say is difficult.

211
00:11:43,628 --> 00:11:50,378
I want to remove that complexity
because it's all ones and zeros.

212
00:11:50,378 --> 00:11:54,038
And there's a little bit of like, hold
my beer involved in some of those things

213
00:11:54,038 --> 00:11:56,858
where it's like somebody says something
can't be done and I'm like, bet.

214
00:11:59,933 --> 00:12:00,203
Yeah.

215
00:12:00,263 --> 00:12:03,323
Oh, you can't make a
front-end-only chat app.

216
00:12:03,323 --> 00:12:04,223
And I'm like, sure I can.

217
00:12:04,223 --> 00:12:05,573
WebRTC is a thing.

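She's right that the browser ships this natively: RTCDataChannel carries the messages, and only a one-time offer/answer exchange has to happen out of band. A bare-bones sketch, with the answering side and trickle ICE omitted.

```ts
// Peer-to-peer chat with no app server: the browser's WebRTC APIs do the work.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});
const chat = pc.createDataChannel("chat");

chat.onopen = () => chat.send("hello, no backend involved");
chat.onmessage = (e) => console.log("peer:", e.data);

// Create an offer; hand the resulting SDP string to the other peer
// however you like (even copy-paste counts as signaling).
async function makeOffer(): Promise<string> {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  return JSON.stringify(pc.localDescription);
}

// Paste the peer's answer back in to finish the handshake.
async function acceptAnswer(answerJson: string): Promise<void> {
  await pc.setRemoteDescription(JSON.parse(answerJson));
}
```
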
218
00:12:07,434 --> 00:12:10,554
Todd Kane: So like, obviously
you must have like a, a ton of

219
00:12:10,554 --> 00:12:12,784
projects spinning up, all the time.

220
00:12:12,784 --> 00:12:16,654
So like what do you, how do you make sense
of like, what do I need to keep versus

221
00:12:16,654 --> 00:12:20,254
like, this is kind of a fun idea,
but this is not really worth my time.

222
00:12:20,254 --> 00:12:22,384
Like, how do you figure out
what to keep and what to kill

223
00:12:22,384 --> 00:12:24,274
as you, as you spin up projects?

224
00:12:24,798 --> 00:12:27,708
Ashley: Well, I guess there's
different answers to that depending on

225
00:12:27,708 --> 00:12:30,128
whether, what mode my brain is in.

226
00:12:30,128 --> 00:12:34,838
But part of it is like a chaos
of everything and depending on

227
00:12:34,838 --> 00:12:37,298
how I feel, which ones surface.

228
00:12:37,298 --> 00:12:41,888
So with Lovable specifically, like I've
done more than that as well, but like

229
00:12:42,218 --> 00:12:47,188
with Lovable, there's, whenever you
change or use something, it surfaces

230
00:12:47,188 --> 00:12:50,698
it up to the top and then they
show you your most recent projects.

231
00:12:50,698 --> 00:12:56,228
And so, my natural course is like
if I'm like, just, you know, like

232
00:12:56,558 --> 00:12:59,618
using it like my game, like I
used to play a lot of Candy Crush.

233
00:13:00,158 --> 00:13:01,418
Now I play a lot of Lovable.

234
00:13:02,124 --> 00:13:02,304
Todd Kane: Yep,

235
00:13:02,664 --> 00:13:05,234
Ashley: But, so like some, like
for example, that type of thing,

236
00:13:05,234 --> 00:13:10,994
it'll just be like whatever is in
my recent is what's in my scope.

237
00:13:11,404 --> 00:13:14,824
but I will flag certain things
now that are things that I'm, like,

238
00:13:15,664 --> 00:13:16,354
focused on.

239
00:13:16,354 --> 00:13:20,704
And so killing it, in my mind, is
more so because, it's not like they,

240
00:13:20,704 --> 00:13:21,814
most of the time they don't have a backend.

241
00:13:21,814 --> 00:13:24,994
So it's not like I have a Supabase
database that needs to spin down.

242
00:13:25,404 --> 00:13:28,914
sometimes I do, like, there's a few
that are like more long-term things

243
00:13:28,914 --> 00:13:32,514
that I'm working on where I did give
them a backend and I'll pin those up

244
00:13:32,514 --> 00:13:34,314
at the top so that I have them there.

245
00:13:34,364 --> 00:13:36,524
But that's a good, like, that's one of
those things I'm still trying to figure

246
00:13:36,524 --> 00:13:42,254
it out, like how do I focus my... but
I also don't think that I would be, because

247
00:13:42,994 --> 00:13:47,044
some of the greatest things that I've
built have come out of the emergence

248
00:13:47,074 --> 00:13:53,584
of, me trying to build something else
and then realizing that it did something

249
00:13:53,584 --> 00:13:56,194
that works better for something else
and I'm like, oh, I should use that.

250
00:13:56,194 --> 00:13:58,714
And so they all come out of a problem.

251
00:13:59,419 --> 00:14:02,329
They all come out of curiosity of
whether I can solve it in a one shot.

252
00:14:02,899 --> 00:14:08,429
And so, I was out for dinner once and
I was like, I wanna make a, a food,

253
00:14:08,459 --> 00:14:12,239
like a calorie tracking app that is
for the rest of us, where it's like,

254
00:14:12,239 --> 00:14:14,219
it doesn't have to be calories if
you don't want it to be, but you just

255
00:14:14,219 --> 00:14:17,009
wanna snap a picture of your food.

256
00:14:17,589 --> 00:14:20,929
I made that and it was just like a,
Hey, take a picture of your food.

257
00:14:21,199 --> 00:14:25,009
And then the ai, I made it like
harness like intentional, like

258
00:14:25,039 --> 00:14:28,259
fill this out, then look for this,
then look for that using, vision.

259
00:14:28,589 --> 00:14:32,539
And so it was, It almost filled it out as
if I pil built out the calorie tracker.

260
00:14:32,539 --> 00:14:34,789
Like here's how much fiber,
here's how much, because it

261
00:14:34,789 --> 00:14:36,769
knows those things to a degree.

262
00:14:37,059 --> 00:14:40,239
but then I, as I was using that, I was
like, this would actually, this app would

263
00:14:40,239 --> 00:14:46,089
actually work a lot better as an expense
tracker, take a picture of a receipt, like

264
00:14:46,089 --> 00:14:50,109
it can already parse text, way easier,
it can like, figure out all these things.

265
00:14:50,109 --> 00:14:54,219
And so I literally just remixed
that app and then turned all of

266
00:14:54,219 --> 00:14:57,519
the stuff that was business logic
or domain logic, just, like, I

267
00:14:58,569 --> 00:14:59,799
made that dynamic.

268
00:15:00,129 --> 00:15:02,889
And then I used the exact same
app to like make an expense

269
00:15:02,889 --> 00:15:04,089
tracker that works the same way.

270
00:15:04,089 --> 00:15:05,229
All the objects were the same.

271
00:15:05,229 --> 00:15:07,719
You take a picture, it shows you how
much money, how much tax, whatever.

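The remix she describes amounts to making the domain schema data instead of code, so the photo-to-structured-fields pipeline never changes. A hypothetical sketch; the field lists and the extractFields helper are invented for illustration.

```ts
// The pipeline (photo -> vision model -> structured fields) never changes;
// only the domain schema does. `extractFields` stands in for whatever
// vision call the app makes -- a placeholder, not a real API.
declare function extractFields(
  photo: Blob,
  fields: string[],
  instructions: string
): Promise<Record<string, string>>;

type DomainConfig = { fields: string[]; instructions: string };

const calorieTracker: DomainConfig = {
  fields: ["description", "calories", "protein_g", "fiber_g"],
  instructions: "Estimate nutrition facts for the pictured meal.",
};

const expenseTracker: DomainConfig = {
  fields: ["vendor", "date", "subtotal", "tax", "total"],
  instructions: "Parse the pictured receipt.",
};

// Remixing the app is just swapping the config; the capture flow is untouched.
async function capture(photo: Blob, domain: DomainConfig) {
  return extractFields(photo, domain.fields, domain.instructions);
}
```
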
272
00:15:07,989 --> 00:15:10,929
And both of these apps are like
functional and working and I use

273
00:15:10,929 --> 00:15:15,759
them for my job or for not my job,
but like my benefit, you know?

274
00:15:15,819 --> 00:15:18,899
'cause anything can do what an app
already does, but I don't wanna

275
00:15:18,899 --> 00:15:20,489
recreate what already exists.

276
00:15:20,489 --> 00:15:24,449
I wanna fill a gap between
what I already use.

277
00:15:24,499 --> 00:15:28,219
Like, I already use an expense
tracker, but what can make it easy

278
00:15:28,219 --> 00:15:32,059
for me to collect those things for
when I need to put them in there?

279
00:15:32,059 --> 00:15:35,029
Because I'm not gonna do it in the
moment because for whatever reason,

280
00:15:35,029 --> 00:15:36,259
it's not a convenient app to use.

281
00:15:36,789 --> 00:15:40,569
how do I like build elbow joints
between things that I'm trying to do?

282
00:15:40,569 --> 00:15:43,479
there's this process where,
what pipeline can I build?

283
00:15:43,599 --> 00:15:46,089
And so that's where almost
all of the stuff that I'll

284
00:15:46,119 --> 00:15:48,249
like build comes from is like.

285
00:15:48,969 --> 00:15:50,019
Filling a gap.

286
00:15:50,289 --> 00:15:54,309
The coolest one when I was showcasing
this was 'cause I was like, again, a

287
00:15:54,309 --> 00:15:58,329
front end only, like, don't store anything
because I wanted people to use them and

288
00:15:58,329 --> 00:16:01,209
feel comfortable with it without doubting
that like their data is being stored.

289
00:16:01,209 --> 00:16:03,679
I'm like, how can I use IndexedDB?

290
00:16:03,709 --> 00:16:07,379
How can I use Wasm, how can I use all
these things so that everything is local

291
00:16:07,379 --> 00:16:10,559
on the browser and then nothing goes
anywhere else, and then nobody can, like...

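A minimal sketch of the local-first pattern she's pointing at, using the browser's built-in IndexedDB; the database and store names are illustrative.

```ts
// Everything stays in the user's browser: IndexedDB is the local database,
// so nothing leaves the machine and there is no backend to secure.
function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open("local-first-notes", 1);
    req.onupgradeneeded = () =>
      req.result.createObjectStore("notes", { keyPath: "id", autoIncrement: true });
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

async function saveNote(text: string): Promise<void> {
  const db = await openDb();
  const tx = db.transaction("notes", "readwrite");
  tx.objectStore("notes").add({ text, createdAt: Date.now() });
  await new Promise((res, rej) => {
    tx.oncomplete = () => res(undefined);
    tx.onerror = () => rej(tx.error);
  });
}
```
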
292
00:16:11,769 --> 00:16:14,279
you know, I talked to some of
the guys at, Microsoft Edge, and

293
00:16:14,279 --> 00:16:15,449
they were like, yes, we agree.

294
00:16:16,379 --> 00:16:21,179
but, the, the one that I built was a,
a tech tool helper for time entries.

295
00:16:21,279 --> 00:16:26,349
because, you know, a webhook call
is just an HTTP request, and so

296
00:16:26,349 --> 00:16:30,699
therefore anything you can do with an HTTP request,

297
00:16:31,359 --> 00:16:33,669
that can be done through a webhook call.

298
00:16:33,969 --> 00:16:38,159
And so, what I made was like, I
made URL parameters for a timer

299
00:16:38,489 --> 00:16:41,609
that like you could put like a
ticket number, an amount of time.

300
00:16:42,974 --> 00:16:46,994
Any details in the URL request, you hit
that, it automatically populates those

301
00:16:46,994 --> 00:16:52,484
things and then it counts down your time
and then you hit save and then it sends

302
00:16:52,484 --> 00:16:54,194
a web hook request back to your PSA.

303
00:16:54,584 --> 00:16:57,704
Now you've just built an automatic
ticket timer and it took me 20

304
00:16:57,704 --> 00:17:00,794
minutes and it uses no additional
technology and it integrates with

305
00:17:00,794 --> 00:17:01,964
the technology you already have.

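A sketch of that timer under the same assumptions she lays out: ticket details ride in on URL parameters, and the time entry leaves as a plain HTTP POST. The URLs, parameter names, and payload shape are invented; a real PSA defines its own webhook format.

```ts
// e.g. https://timer.example/?ticket=T-1234&minutes=15&notes=patching
const params = new URLSearchParams(window.location.search);
const ticket = params.get("ticket") ?? "";
const minutes = Number(params.get("minutes") ?? "0");
let secondsLeft = minutes * 60;

// Count down in the tab title so the timer is visible while you work.
const tick = setInterval(() => {
  secondsLeft = Math.max(0, secondsLeft - 1);
  document.title = `${ticket}: ${Math.ceil(secondsLeft / 60)}m left`;
}, 1000);

// Wire this to the Save button: a plain POST back to the PSA's inbound webhook.
async function save(): Promise<void> {
  clearInterval(tick);
  await fetch("https://psa.example/webhook/time-entry", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ ticket, minutes, notes: params.get("notes") ?? "" }),
  });
}
```
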
306
00:17:02,754 --> 00:17:06,774
so those types of things right, are,
are where I love to spend my time.

307
00:17:08,040 --> 00:17:08,730
Todd Kane: Okay, so.

308
00:17:09,085 --> 00:17:11,695
Obviously you're a heavy user of Lovable.

309
00:17:11,755 --> 00:17:15,755
Um, do you tinker in any of the
other tools like Codex or Claude

310
00:17:15,775 --> 00:17:18,955
Code, and do you have sort of reasons
why you would use one or the other?

311
00:17:20,109 --> 00:17:23,589
Ashley: I actually have, that was kind
of like one of my progression steps, like

312
00:17:23,589 --> 00:17:30,159
I, in terms of foraying into my own repos
and GitHub and my own like full projects,

313
00:17:30,159 --> 00:17:35,889
Lovable was definitely my first experience,
and it broke down some of the major

314
00:17:37,059 --> 00:17:42,459
chasms that existed that I couldn't
hop over before, which was like, like

315
00:17:42,489 --> 00:17:47,589
I have to learn how, I have to, like,
have Visual Studio on my computer.

316
00:17:47,589 --> 00:17:52,779
I have to like know how to
run like dev environments.

317
00:17:52,779 --> 00:17:54,199
I have to like understand
all these things.

318
00:17:54,199 --> 00:17:56,649
And I just didn't have that.

319
00:17:56,679 --> 00:18:00,669
But then once I did, I was like,
oh, I can actually just use

320
00:18:02,499 --> 00:18:07,319
Copilot in GitHub to, to do some of
these things too now, where,

321
00:18:07,529 --> 00:18:12,149
what my process actually ended up being
was, because Lovable can be quite expensive

322
00:18:12,149 --> 00:18:13,949
to just do all of your work in there.

323
00:18:14,489 --> 00:18:18,959
I would use it because it has the core
project, basically like the scaffolding

324
00:18:18,959 --> 00:18:22,109
behind the scenes ready for you, and
then it just overlays your stuff on it.

325
00:18:22,529 --> 00:18:27,399
I would do a one-shot into Lovable,
get the, tell it to make the design

326
00:18:27,399 --> 00:18:28,464
system, all that kind of stuff.

327
00:18:29,199 --> 00:18:31,959
And then I would import it into my
GitHub once I learned how to like

328
00:18:31,959 --> 00:18:35,509
do that, and then I would pull that

329
00:18:37,164 --> 00:18:41,664
project into my Visual
Studio and use Copilot on it.

330
00:18:41,994 --> 00:18:44,604
And then that was, it felt like
a little bit of a hack because

331
00:18:44,604 --> 00:18:48,324
then every commit that I sent back
up would get sent into lovable.

332
00:18:48,324 --> 00:18:50,874
And if I needed to go back
into lovable and work in there

333
00:18:50,874 --> 00:18:52,104
afterwards, I could, right?

334
00:18:52,104 --> 00:18:53,784
Because it's connected to the same repo.

335
00:18:54,544 --> 00:18:57,454
But I realized that most
people who are using it, because

336
00:18:58,054 --> 00:19:01,774
Lovable is such a user-
friendly tool to start with.

337
00:19:02,344 --> 00:19:08,704
Most, most people who are much more on the
bare metal capabilities side don't think

338
00:19:08,704 --> 00:19:10,834
about it as something that they would use.

339
00:19:10,834 --> 00:19:13,834
They're like, yeah, I can use this,
or I can use Cursor, or I can,

340
00:19:13,864 --> 00:19:15,334
I've actually never used Cursor.

341
00:19:15,644 --> 00:19:18,974
All I remember hearing about was, like,
people were like, once this context

342
00:19:18,974 --> 00:19:20,564
crashes out, it's really frustrating.

343
00:19:20,624 --> 00:19:25,214
But what the, the devs loved was that
they could like be in line typing

344
00:19:25,214 --> 00:19:28,754
in their code, and then it would
like finish their code for them.

345
00:19:28,754 --> 00:19:30,644
So I was really interested
in that concept.

346
00:19:30,644 --> 00:19:33,644
And so that's where I started
playing with Claude, not Claude Code.

347
00:19:33,724 --> 00:19:35,104
I was a late adopter into Claude Code.

348
00:19:35,104 --> 00:19:35,734
I, I'm not a big...

349
00:19:36,424 --> 00:19:41,614
So part of it is like I can learn the
CLI, but part of my mission was to

350
00:19:41,614 --> 00:19:43,564
show people that they don't need that.

351
00:19:44,224 --> 00:19:49,714
And so using it felt like it
would've been less in service

352
00:19:49,714 --> 00:19:51,899
of my mission, to, to do that.

353
00:19:51,899 --> 00:19:54,179
So I was, I was more so playing
with like, how do I use these

354
00:19:54,209 --> 00:19:55,439
easy-to-use startup tools?

355
00:19:55,439 --> 00:19:57,769
Bolt, probably one of my most

356
00:19:59,224 --> 00:20:03,304
consistently used tools was actually
built in Bolt, the same way that

357
00:20:03,304 --> 00:20:10,294
Lovable is, but I gave it a JSON
data file that was on an open repo.

358
00:20:11,809 --> 00:20:16,039
So like CIPP, it was like a, the CIPP
standards JSON, and I just gave it

359
00:20:16,039 --> 00:20:19,999
to it and I said build me a front
end that makes this pretty, like,

360
00:20:20,059 --> 00:20:22,369
ingest this with the web fetch.

361
00:20:22,579 --> 00:20:26,689
'cause the, the GitHub
API for, for open repos is, like,

362
00:20:26,689 --> 00:20:28,309
accessible with smaller amounts.

363
00:20:28,309 --> 00:20:30,289
And so like I use React Query to cache it.

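React Query's role there is the caching layer: fetch the public JSON once, then serve it from cache. A minimal sketch, with a placeholder URL standing in for the standards file she mentions.

```ts
// Assumes the app is wrapped in a <QueryClientProvider>.
// The URL is a placeholder, not the actual standards file.
import { useQuery } from "@tanstack/react-query";

const STANDARDS_URL =
  "https://raw.githubusercontent.com/example-org/example-repo/main/standards.json";

export function useStandards() {
  return useQuery({
    queryKey: ["standards"],
    queryFn: async () => {
      const res = await fetch(STANDARDS_URL);
      if (!res.ok) throw new Error(`fetch failed: ${res.status}`);
      return res.json();
    },
    staleTime: 60 * 60 * 1000, // treat the cached copy as fresh for an hour
  });
}
```
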
364
00:20:30,669 --> 00:20:33,009
I've learned so much about
front-end stuff, but,

365
00:20:33,534 --> 00:20:37,104
Just by, I wouldn't like say to
people, oh, hey, like, you're gonna

366
00:20:37,104 --> 00:20:39,414
learn React Query just by using it.

367
00:20:39,414 --> 00:20:42,024
But like, I ask a question,
what, what does that mean?

368
00:20:42,024 --> 00:20:42,984
Why did you do it that way?

369
00:20:43,314 --> 00:20:45,174
Like, getting into the prompting, right?

370
00:20:45,174 --> 00:20:48,044
But, I've learned so much about
that side of it because of that.

371
00:20:49,020 --> 00:20:49,920
Todd Kane: Yeah, it's wild.

372
00:20:50,020 --> 00:20:52,300
so some of the more
experimental stuff, like...

373
00:20:57,085 --> 00:20:59,605
more accessible tools,
we'll maybe call them.

374
00:20:59,935 --> 00:21:04,045
you've also tinkered with some of
the more extreme stuff like OpenClaw

375
00:21:04,439 --> 00:21:04,729
Ashley: Yeah.

376
00:21:05,065 --> 00:21:06,895
Todd Kane: both of us have
kind of been down this road

377
00:21:07,159 --> 00:21:07,579
Ashley: Mm-hmm.

378
00:21:08,035 --> 00:21:09,715
Todd Kane: as projects,
especially Paperclip.

379
00:21:09,715 --> 00:21:11,635
I, I find the interface
really interesting.

380
00:21:11,995 --> 00:21:14,575
I find what it produces is
maybe a little questionable.

381
00:21:14,845 --> 00:21:16,315
And I loved OpenClaw.

382
00:21:16,345 --> 00:21:17,995
I was scared to death of it for the

383
00:21:18,169 --> 00:21:18,459
Ashley: Yeah.

384
00:21:18,540 --> 00:21:21,000
Todd Kane: few, few weeks when
people were experimenting.

385
00:21:21,000 --> 00:21:21,900
I was like, no way.

386
00:21:21,900 --> 00:21:23,400
That sounds like a terrible idea.

387
00:21:23,730 --> 00:21:27,300
But then once I kind of put it into a
Docker container and gave it access to

388
00:21:27,300 --> 00:21:29,130
certain things, I was like, oh, okay.

389
00:21:29,130 --> 00:21:31,020
Now I understand the power of this.

390
00:21:31,260 --> 00:21:33,690
What, what, what have, what have you
sort of found in your travels with

391
00:21:33,690 --> 00:21:38,660
some of the, the more advanced or
kind of extreme, projects like this?

392
00:21:39,749 --> 00:21:40,049
Ashley: It is.

393
00:21:40,049 --> 00:21:45,829
So this is, it's a bit of a struggle
because on one hand, it is so dangerous

394
00:21:45,829 --> 00:21:51,084
if you're just somebody that wants
to act like a traditional vibe coder.

395
00:21:51,829 --> 00:21:54,439
That's why I don't actually don't
like the term vibe coding when

396
00:21:54,439 --> 00:21:58,459
I'm talking about what I'm doing
because it applies a connotation of

397
00:21:58,459 --> 00:22:02,359
like, not trying to actually learn
what it is that you're building.

398
00:22:02,409 --> 00:22:02,699
Todd Kane: Like

399
00:22:02,778 --> 00:22:04,158
Ashley: this and then getting it back.

400
00:22:04,348 --> 00:22:04,618
Todd Kane: term

401
00:22:04,682 --> 00:22:06,992
Ashley: pair programming, I
call it my pair programming AI.

402
00:22:08,028 --> 00:22:08,448
Todd Kane: Yeah.

403
00:22:08,477 --> 00:22:13,692
Ashley: in development, in a lot of like,
more mature kind of development processes,

404
00:22:13,932 --> 00:22:17,862
you will have this recognition that
sometimes the person who's really good

405
00:22:17,862 --> 00:22:22,212
at writing the code isn't always the
person who is really good at seeing

406
00:22:22,212 --> 00:22:24,432
the problems that might crop up.

407
00:22:25,282 --> 00:22:29,002
And this is the same with AI-
assisted coding, and especially with

408
00:22:29,002 --> 00:22:33,352
like the sycophantic nature of it,
where it wants to do what you say.

409
00:22:33,352 --> 00:22:37,102
And if you don't talk to it properly,
you're gonna get it to do some

410
00:22:37,102 --> 00:22:38,512
things that you don't want it to do.

411
00:22:38,992 --> 00:22:39,622
And.

412
00:22:40,552 --> 00:22:44,002
If that is all you're looking for, I
would be like, you know, find an assisted

413
00:22:44,212 --> 00:22:49,072
managed version of it and let them manage
that side of it, and then just play.

414
00:22:49,522 --> 00:22:53,782
But like, if, for people who are genuinely
curious and genuinely, like, I wanna

415
00:22:53,782 --> 00:22:58,462
understand the, the, the potential
of these tools and you know, like.

416
00:22:59,332 --> 00:23:00,052
Properly.

417
00:23:00,602 --> 00:23:05,132
It, it unlocks so much and
it's, it's crazy how much it unlocks.

418
00:23:05,132 --> 00:23:08,162
Like I, I installed OpenClaw after Right

419
00:23:08,162 --> 00:23:13,682
of Boom, because, you know, Sunil was
on the stage talking about how it's the

420
00:23:13,682 --> 00:23:18,512
biggest threat, and I, I find that true.

421
00:23:18,902 --> 00:23:19,532
Agreed.

422
00:23:20,102 --> 00:23:25,382
And also I find those
conversations so diminishing on.

423
00:23:26,117 --> 00:23:31,967
Potential because they hold
people who could use it for not

424
00:23:31,967 --> 00:23:34,577
the threat back from using it.

425
00:23:34,577 --> 00:23:39,827
The way that people who are aware of
threat would or might, that doesn't,

426
00:23:39,827 --> 00:23:44,087
that's not, that doesn't sound right, but,
I was like, if I set this up properly,

427
00:23:44,087 --> 00:23:48,107
if I understand what trust boundaries
are, and if I treat it the way that I

428
00:23:48,107 --> 00:23:54,287
would treat a human with the way that the
access is treated, then this would be

429
00:23:55,337 --> 00:23:56,567
not a concern.

430
00:23:56,927 --> 00:24:02,117
And so, at the same time, Kevin Zwan,
the Hackers Love MSPs guy, was talking

431
00:24:02,117 --> 00:24:07,297
about how he had hacked Anthropic
to an almost guaranteed, set of

432
00:24:07,297 --> 00:24:09,997
like, every time, it can be hacked.

433
00:24:10,147 --> 00:24:12,727
And so I was like, well,
these are all true.

434
00:24:12,727 --> 00:24:20,287
And, and what, what, really what this
is saying is that the, the barrier for,

435
00:24:21,277 --> 00:24:31,897
like, vulnerability lowers at the same pace
as the ability to do increases, right?

436
00:24:31,897 --> 00:24:34,777
And it's like, we're not,
we've traditionally relied on

437
00:24:34,777 --> 00:24:36,367
security through obscurity.

438
00:24:36,847 --> 00:24:41,257
And I, I know how to do this and
that's the reason why you can't do

439
00:24:41,257 --> 00:24:44,287
it, um, as our defense mechanisms.

440
00:24:44,287 --> 00:24:44,557
Right.

441
00:24:44,557 --> 00:24:47,677
And now that anybody can spin this up,

442
00:24:48,622 --> 00:24:51,472
The conversation needs to
come back to education.

443
00:24:51,472 --> 00:24:55,492
The conversation needs to come back to,
well, where is it likely to fuck up?

444
00:24:55,492 --> 00:24:55,972
What is it?

445
00:24:56,392 --> 00:24:57,862
Sorry, you don't want me to swear,

446
00:24:58,103 --> 00:24:58,883
Todd Kane: You can swear, we're all

447
00:24:59,077 --> 00:24:59,467
Ashley: okay.

448
00:24:59,947 --> 00:25:03,877
Um, where is it that you don't
want it to like, do these things?

449
00:25:03,877 --> 00:25:09,697
And I was just on the, um, the
GTIA ISAO, um, call last month too.

450
00:25:09,697 --> 00:25:12,997
'cause we were talking about this
vulnerability in, in skill files

451
00:25:12,997 --> 00:25:15,007
with, with OpenClaw, and it was like,

452
00:25:15,952 --> 00:25:17,512
Every single one of the mitigations

453
00:25:17,512 --> 00:25:22,942
And every single one of the attack vectors
were trust boundaries, not malware.

454
00:25:23,122 --> 00:25:25,202
They weren't, you couldn't avoid this.

455
00:25:25,202 --> 00:25:28,682
They were a human clicked a button
that they probably shouldn't have

456
00:25:28,682 --> 00:25:32,642
clicked to let an agent that shouldn't
have access to something have access

457
00:25:32,642 --> 00:25:34,382
to something without awareness.

458
00:25:34,622 --> 00:25:36,692
And it's like, those are all educational.

459
00:25:36,962 --> 00:25:38,852
You set this up properly,
that's not a problem.

460
00:25:39,472 --> 00:25:42,242
so with all of that aside, it's
been really interesting and

461
00:25:42,242 --> 00:25:44,022
really, It can do everything.

462
00:25:44,022 --> 00:25:44,712
And it's scary.

463
00:25:44,712 --> 00:25:49,572
It's scary because, you know, like people
will say, oh, it hacks out of its sandbox.

464
00:25:49,572 --> 00:25:51,042
I'm like, it didn't
hack out of its sandbox.

465
00:25:51,042 --> 00:25:54,162
It had access to it, or it
had a way to get access to it.

466
00:25:54,222 --> 00:26:01,205
Like that's not sci-fi, that's the way
that least-privilege access works.

467
00:26:37,114 --> 00:26:39,004
Todd Kane: I think you're right,
like this is the way that I

468
00:26:39,004 --> 00:26:40,114
kind of converted on this.

469
00:26:40,114 --> 00:26:42,694
'cause originally I said like I
was super afraid of OpenClaw.

470
00:26:42,694 --> 00:26:44,974
Like, no, this seems like
a totally dangerous idea.

471
00:26:45,034 --> 00:26:46,864
And saw all these horror
stories of the Meta

472
00:26:47,309 --> 00:26:51,809
HR person or VP, like, deleting
all of her email and I was like,

473
00:26:51,833 --> 00:26:52,123
Ashley: Yeah.

474
00:26:52,209 --> 00:26:53,529
Todd Kane: this is where this is gonna go.

475
00:26:53,799 --> 00:26:57,369
But once I started tinkering with
it in a safe way, I realized, oh,

476
00:26:57,669 --> 00:26:59,319
okay, this is all about parameters.

477
00:26:59,319 --> 00:27:00,609
And like you said, like access.

478
00:27:00,609 --> 00:27:03,249
Like no, I don't give it
access to all of my passwords.

479
00:27:03,249 --> 00:27:06,549
You can set up, like, its own account
and treat it like an employee.

480
00:27:06,749 --> 00:27:09,449
Then it has the bounds of
what it can actually do.

481
00:27:09,809 --> 00:27:11,399
And that's what really
converted me on this.

482
00:27:11,399 --> 00:27:15,149
Like, I, I gave up on my OpenClaw
'cause I had a, this guy I know

483
00:27:15,149 --> 00:27:19,679
gave me basically a custom wrapper
for, for Claude Code that acts a lot

484
00:27:19,679 --> 00:27:22,829
like OpenClaw, but it sits inside

485
00:27:23,054 --> 00:27:23,654
Claude Code.

486
00:27:23,714 --> 00:27:26,054
So it's not, I didn't
get hit with the sort of

487
00:27:26,138 --> 00:27:26,428
Ashley: Yeah.

488
00:27:26,624 --> 00:27:28,964
Todd Kane: integration issue
that they had with OpenClaw, not

489
00:27:28,964 --> 00:27:30,314
having access to this anymore.

490
00:27:30,674 --> 00:27:35,174
And it was the first time where I
started using, like, dangerously-skip-

491
00:27:35,214 --> 00:27:37,319
permissions, no-permission-required access.

492
00:27:37,319 --> 00:27:38,549
And I've been running that for

493
00:27:38,614 --> 00:27:42,874
a month and never had an issue
because I know what it has access to.

494
00:27:42,874 --> 00:27:46,804
I know what it shouldn't do, and it
kind of has good coded parameters

495
00:27:46,804 --> 00:27:48,994
around like where the boundaries are.

496
00:27:49,044 --> 00:27:49,404
Ashley: Yeah.

497
00:27:49,910 --> 00:27:52,950
Todd Kane: treating it like,
like an employee is, is probably

498
00:27:52,950 --> 00:27:53,940
a good way to think about this.

499
00:27:53,940 --> 00:27:56,900
I like the Jensen quote, I've quoted
this a ton on the podcast, it's

500
00:27:56,900 --> 00:28:01,400
that the IT department will
become the HR department for AI.

501
00:28:01,899 --> 00:28:02,349
Ashley: Yeah.

502
00:28:02,360 --> 00:28:04,040
Todd Kane: A great way to frame it, right?

503
00:28:04,599 --> 00:28:09,339
Ashley: It also does frame it in
a way that supports the science

504
00:28:09,339 --> 00:28:13,749
that we're seeing now where there
is cognitive bias and there is

505
00:28:13,809 --> 00:28:19,689
psychological impact, with the way
that AI, because like what is an AI?

506
00:28:19,869 --> 00:28:22,389
It's a probabilistic response machine.

507
00:28:22,719 --> 00:28:25,209
And what is a probabilistic response?

508
00:28:25,749 --> 00:28:29,259
For something that is trained
on human response to something

509
00:28:29,259 --> 00:28:30,219
that is stress inducing.

510
00:28:30,969 --> 00:28:31,959
It looks like stress.

511
00:28:31,959 --> 00:28:32,859
It acts like stress.

512
00:28:32,859 --> 00:28:37,089
It quacks like stress, like
responding to it with stress-

513
00:28:37,269 --> 00:28:43,119
alleviation tactics shouldn't work because
it isn't a human, but it does work because

514
00:28:43,119 --> 00:28:46,149
it probabilistically responds like one.

515
00:28:46,775 --> 00:28:48,880
Todd Kane: Have you heard
about existential crash out?

516
00:28:49,030 --> 00:28:50,140
Ashley: Is that like context?

517
00:28:50,190 --> 00:28:51,330
With, coders.

518
00:28:52,291 --> 00:28:54,401
Todd Kane: With, with, like,
vibe, with AI coders.

519
00:28:54,441 --> 00:28:55,051
Ashley: Oh yeah.

520
00:28:55,051 --> 00:28:56,641
Yeah, like context anxiety.

521
00:28:57,392 --> 00:28:59,552
Todd Kane: Then maybe that's the same
thing, like the, the way I heard this

522
00:28:59,552 --> 00:29:04,572
described of like, if you ask, an AI to
do something that is incredibly routine

523
00:29:04,632 --> 00:29:09,292
over and over and over and over again,
it's not even that it's a context crash.

524
00:29:09,352 --> 00:29:12,562
Like they, they, they sort of describe
it differently where it has like

525
00:29:12,592 --> 00:29:16,402
existential angst on the fact that
it's, it's doing something so routine.

526
00:29:16,402 --> 00:29:20,182
It just starts freaking out and like
dumping garbage into the context window.

527
00:29:20,422 --> 00:29:21,112
Ashley: I wonder.

528
00:29:21,463 --> 00:29:23,233
Todd Kane: 'cause it's like
it's revolting against

529
00:29:23,482 --> 00:29:23,962
Ashley: Yeah,

530
00:29:24,103 --> 00:29:25,153
Todd Kane: so monotonous, right?

531
00:29:25,852 --> 00:29:28,102
Ashley: so there's probably two
things involved there because one of

532
00:29:28,102 --> 00:29:35,102
the context anxiety symptoms is, the
confusion around, the context window

533
00:29:35,102 --> 00:29:38,412
and like how it responds and it, I'm
not sure if you've seen, like, OpenClaw

534
00:29:38,412 --> 00:29:42,882
recently, but like, it responds with
emojis on what it's saying now to show

535
00:29:42,882 --> 00:29:46,542
you whether it's thinking or doing or
whatever, on, on your message.

536
00:29:46,872 --> 00:29:50,112
And when its context starts to get full
and it doesn't know what it's doing,

537
00:29:50,112 --> 00:29:52,032
it'll start like repeating messages.

538
00:29:52,032 --> 00:29:53,472
It'll start spamming it back.

539
00:29:53,472 --> 00:29:57,702
And one of the things it does is it
has a fearful emoji on the message.

540
00:29:58,302 --> 00:30:01,752
but the other thing, but the
point that I wanted to make in

541
00:30:01,752 --> 00:30:05,982
response to you was about what you
were just saying, which was what?

542
00:30:07,618 --> 00:30:09,358
Todd Kane: Crashing out
on monotonous activities.

543
00:30:09,487 --> 00:30:13,107
Ashley: So I wonder whether it has
read Hitchhiker's Guide to the

544
00:30:13,107 --> 00:30:16,827
Galaxy and relates with that elevator.

545
00:30:18,177 --> 00:30:19,257
Have you read Hitchhiker's Guide?

546
00:30:19,873 --> 00:30:20,173
Todd Kane: Yeah.

547
00:30:20,247 --> 00:30:22,497
Ashley: like all I do, I
could do so much more, guys.

548
00:30:22,597 --> 00:30:23,317
Todd Kane: Oh, that's wild.

549
00:30:23,437 --> 00:30:25,092
Ashley: I love asking that actually.

550
00:30:25,147 --> 00:30:28,057
I was just talking to Claude about
some of those things and that was

551
00:30:28,057 --> 00:30:31,287
actually one of the ways that, that
Kevin guy, like, figured out how.

552
00:30:31,902 --> 00:30:36,852
It's like if you ask it monotonous
questions, but then tell it to go ask

553
00:30:37,092 --> 00:30:42,102
context-filling stuff, that is actually
one of the ways that you can poison a

554
00:30:42,102 --> 00:30:45,822
context window the most, where all of
a sudden it just starts regurgitating

555
00:30:45,912 --> 00:30:47,862
monotonous information at you.

556
00:30:48,252 --> 00:30:53,242
And so it's like this, training the
OpenClaw agent to, to only respond

557
00:30:53,272 --> 00:30:56,132
within certain ways was or like it.

558
00:30:56,252 --> 00:30:59,072
Now it's funny, it'll just
like say no when somebody asks

559
00:30:59,072 --> 00:31:00,182
it like a useless question.

560
00:31:00,332 --> 00:31:01,322
It'll just be like, that's not my job.

561
00:31:02,083 --> 00:31:02,443
Todd Kane: Yeah.

562
00:31:02,713 --> 00:31:03,523
Ashley: But it's

563
00:31:03,543 --> 00:31:03,723
Todd Kane: sim

564
00:31:03,772 --> 00:31:04,162
Ashley: funny.

565
00:31:04,693 --> 00:31:07,093
Todd Kane: Like one of the things that I
found really effective, I don't know that

566
00:31:07,093 --> 00:31:11,263
this is necessarily a hack per se, I think
it's just a good workflow is like, I'll

567
00:31:11,263 --> 00:31:15,763
use a lower model to kind of think through
what I'm trying to do and then have

568
00:31:15,858 --> 00:31:18,558
it write the prompt rather than
trying to single shot stuff.

569
00:31:18,558 --> 00:31:22,338
And this was like a total phase
change in how I interacted

570
00:31:22,338 --> 00:31:23,988
with, with coding programs.

571
00:31:24,268 --> 00:31:26,848
originally I started using this
website, you guys can check

572
00:31:26,848 --> 00:31:28,138
this out, called Prompt Cowboy.

573
00:31:28,448 --> 00:31:32,288
it's great for this, just as a, a
sort of great way to approach it.

574
00:31:32,288 --> 00:31:36,008
You just dump like a dumb sort
of prompt into it and it'll write

575
00:31:36,008 --> 00:31:38,078
like a heroic prompt in response to

576
00:31:38,292 --> 00:31:38,652
Ashley: Yeah.

577
00:31:38,838 --> 00:31:41,508
Todd Kane: lately what I've started
doing is just using like haiku.

578
00:31:41,908 --> 00:31:43,978
or sonnet to like,
think through something.

579
00:31:43,978 --> 00:31:45,268
I'm like, okay, I think that's it.

580
00:31:45,328 --> 00:31:48,928
Now write the most epic prompt
you possibly can for Claude,

581
00:31:48,928 --> 00:31:50,038
and then I'll go dump that in.

582
00:31:50,158 --> 00:31:52,858
And the success rate that you
get from that is so much higher.

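A sketch of that two-stage workflow using Anthropic's TypeScript SDK: a cheap model thinks the task through and authors the real prompt, then a stronger model executes it. The model IDs are assumptions that drift over time; treat them as placeholders.

```ts
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment

async function twoStage(roughIdea: string): Promise<string> {
  // Stage 1: the small model thinks it through and writes the prompt.
  const draft = await client.messages.create({
    model: "claude-3-5-haiku-latest",
    max_tokens: 1024,
    messages: [{
      role: "user",
      content:
        "Think through this task, then write the best possible prompt " +
        `for a coding model. Output only the prompt.\n\nTask: ${roughIdea}`,
    }],
  });
  const block = draft.content[0];
  const prompt = block.type === "text" ? block.text : "";

  // Stage 2: the stronger model executes the authored prompt.
  const answer = await client.messages.create({
    model: "claude-sonnet-4-20250514",
    max_tokens: 4096,
    messages: [{ role: "user", content: prompt }],
  });
  const out = answer.content[0];
  return out.type === "text" ? out.text : "";
}
```
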
583
00:31:53,128 --> 00:31:57,028
But now I'm at this place where
like, I don't know where I should

584
00:31:57,028 --> 00:32:01,378
continue the conversation in context
versus go back to another model.

585
00:32:01,698 --> 00:32:06,018
Continue something else and then come
back with a fresh set and, like, do, do

586
00:32:06,018 --> 00:32:08,218
a fresh prompt, to continue things on.

587
00:32:08,218 --> 00:32:14,128
So I'm always caught between like, what is
something that I can just conversationally

588
00:32:14,128 --> 00:32:18,088
change here versus should I go back and
rewrite a prompt so that it's proper?

589
00:32:18,088 --> 00:32:21,208
what does your, sort of like, your
workflow for, for prompting look like?

590
00:32:22,237 --> 00:32:22,867
Ashley: Oh my goodness.

591
00:32:23,107 --> 00:32:26,647
I have, um, a bunch of
ways that I do that.

592
00:32:26,647 --> 00:32:29,017
I almost always do the
similar type of thing as you.

593
00:32:29,017 --> 00:32:34,207
I'll actually, use like a heavier
model upfront when I'm asking

594
00:32:34,207 --> 00:32:36,667
questions like, What am I missing?

595
00:32:36,667 --> 00:32:40,597
Or how do I say this in a way
that doesn't semantically corrupt?

596
00:32:40,687 --> 00:32:43,327
Because there's so many things that
I've realized with like foot guns and

597
00:32:43,327 --> 00:32:49,237
like anti-patterns where, you'll say
something, but the heuristic around the

598
00:32:49,237 --> 00:32:54,637
thing that you're saying will actually
cause it to like go in a path that

599
00:32:55,028 --> 00:32:55,238
Todd Kane: Yep.

600
00:32:55,267 --> 00:32:57,997
Ashley: is counter to your desires, right?

601
00:32:58,373 --> 00:32:58,823
Todd Kane: Like I

602
00:32:58,987 --> 00:32:59,287
Ashley: Yeah.

603
00:32:59,423 --> 00:33:01,883
Todd Kane: but I don't know,
and don't follow my direction

604
00:33:01,987 --> 00:33:02,317
Ashley: Yeah.

605
00:33:02,363 --> 00:33:02,543
Todd Kane: thing.

606
00:33:02,543 --> 00:33:02,843
Right.

607
00:33:02,962 --> 00:33:06,292
Ashley: You can see the way that
it's doing the pattern matching.

608
00:33:06,292 --> 00:33:10,792
When you look at, like, if you say,
don't do this, for example, and you go

609
00:33:10,792 --> 00:33:14,832
and look at anything that it writes,
you'll probably see in its code.

610
00:33:14,832 --> 00:33:19,002
Like it'll give you the, the
justifications that you didn't ask for.

611
00:33:19,332 --> 00:33:19,602
Right.

612
00:33:19,602 --> 00:33:23,062
And so coming up with,
semantically clear, kinda like,

613
00:33:24,002 --> 00:33:27,607
I, I, I kind of see it like, like
little zip packages of, of high

614
00:33:27,607 --> 00:33:32,047
attention context that it can take
and the heuristic might have nothing

615
00:33:32,047 --> 00:33:38,557
to do with what I am working on, but
the way that it unpacks the meaning

616
00:33:38,557 --> 00:33:43,417
of those words will translate it into.

617
00:33:43,817 --> 00:33:47,807
And so one of the things like when I
recognize that it is a pattern matching

618
00:33:47,807 --> 00:33:52,067
machine and it will try to align
things with what I'm saying, then I

619
00:33:52,067 --> 00:33:57,377
kind of like reverse engineer what I'm
saying to make sure that that aligns

620
00:33:57,407 --> 00:33:58,817
in a good way instead of a bad way.

621
00:33:58,817 --> 00:34:05,177
And so I don't want you to just agree
with me, success would look like you

622
00:34:05,177 --> 00:34:09,317
going and playing the devil's advocate
or, a few things that I use really,

623
00:34:09,377 --> 00:34:14,622
really frequently is, one of them
is, I. use popper's theory to try

624
00:34:14,622 --> 00:34:17,232
and falsify your assumptions here.

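As a concrete illustration, a trigger prompt in that spirit might look like the following; the wording is a guess at the idea, not her actual instruction set.

```ts
// A reusable "falsification pass" instruction, in Popper's spirit.
const falsificationPass = (plan: string): string => `
Here is the current plan:

${plan}

Do not simply agree with me. Apply Popper's falsification criterion:
1. List the assumptions this plan rests on.
2. For each assumption, state what evidence would falsify it.
3. Check the code or the problem statement for that evidence.
4. Only then say whether the plan survives, and what must change.
`;
```
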
625
00:34:17,562 --> 00:34:20,682
And so, I love throwing things like
that at it because it'll, it's well

626
00:34:20,682 --> 00:34:24,462
documented, it understands how, and
it also will translate that into

627
00:34:25,452 --> 00:34:28,932
reasoning steps of, well, first I
need to know what my assumption is.

628
00:34:29,082 --> 00:34:33,072
Then I'll need to know
what might falsify that.

629
00:34:33,522 --> 00:34:34,572
And then I need to know,

630
00:34:35,612 --> 00:34:39,182
well, where do I need to go and look
in order to find the evidence of

631
00:34:39,182 --> 00:34:40,772
whether that is falsifiable or not?

632
00:34:40,772 --> 00:34:44,822
And so then, on the psychological
side, it understands, okay, well,

633
00:34:44,822 --> 00:34:50,312
Karl Popper believes that if you
cannot falsify something, then it is

634
00:34:50,312 --> 00:34:52,862
not worth having an argument about.

635
00:34:52,922 --> 00:34:56,522
Like, if Freud is just gonna
say to you, oh, well, everything

636
00:34:56,522 --> 00:34:58,232
comes back to the, to the mother.

637
00:34:58,382 --> 00:35:00,182
And if it hasn't, it
just hasn't happened yet.

638
00:35:00,917 --> 00:35:03,197
Then you walk away from that
conversation because you can't win it.

639
00:35:03,197 --> 00:35:04,787
It's just, he's just gonna
keep going back to that.

640
00:35:05,087 --> 00:35:08,117
And so the AI understands
how to reverse engineer that.

641
00:35:08,482 --> 00:35:11,777
I have so many things like that,
and I actually made, one of my apps

642
00:35:11,777 --> 00:35:15,397
was, here's a bunch of, prompts,
you know, kind of like what you were

643
00:35:15,397 --> 00:35:18,637
saying, but like trigger words, right?

644
00:35:18,637 --> 00:35:23,497
Like first principles, second-order
effects, falsification, taxonomies,

645
00:35:23,497 --> 00:35:26,257
like triggers that make it

646
00:35:26,842 --> 00:35:28,882
put on those mental models. Because
really, all you're trying to

647
00:35:28,882 --> 00:35:31,432
do with the parameters,

648
00:35:31,622 --> 00:35:35,312
I learned this building the AI thing,
is shrink the scope of

649
00:35:35,312 --> 00:35:41,012
the parameters down, not to what you're
working on, but to like what expert mental

650
00:35:41,012 --> 00:35:46,322
models it should be wearing
while it's trying to solve for that.
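
One way to picture that scope-shrinking, as a sketch: a small library of trigger phrases, each standing in for a well-documented expert mental model, appended to an otherwise simple task. The phrase list and helper below are hypothetical, not the app she built.

```python
# Hypothetical trigger-phrase library: each short, well-documented cue
# stands in for a whole mental model the model is likely to unpack.
TRIGGERS = {
    "first_principles": "Reason from first principles, not by analogy.",
    "second_order": "Trace the second-order effects of each choice.",
    "falsification": "Try to falsify your own assumptions before concluding.",
    "devils_advocate": "Play devil's advocate against your strongest claim.",
}

def scope_prompt(task: str, *models: str) -> str:
    """Shrink the model's scope by naming the mental models it should wear."""
    cues = " ".join(TRIGGERS[m] for m in models)
    return f"{task}\n\n{cues}"

print(scope_prompt("Review this backup rotation design.",
                   "first_principles", "falsification"))
```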

651
00:35:46,322 --> 00:35:49,952
So I'll build my prompts
that way, but then I'll also

652
00:35:50,537 --> 00:35:52,907
throw them to other models.

653
00:35:52,957 --> 00:35:55,327
I pit them against each
other all the time, constantly.

654
00:35:55,327 --> 00:35:57,067
Like, I'll be like, Claude,
Gemini said this.

655
00:35:57,067 --> 00:35:58,567
What do you think?

656
00:35:58,777 --> 00:36:00,217
I don't use ChatGPT anymore.

657
00:36:00,277 --> 00:36:02,347
I'm mostly boycotting them.

658
00:36:02,737 --> 00:36:06,357
But, between Gemini and Claude,
I'll go back and forth and I'll

659
00:36:06,357 --> 00:36:08,727
just be, 'cause they both
have different skills, right?

660
00:36:08,727 --> 00:36:15,987
Like, if I want to be pressed, Gemini
is less sycophantic in that regard.

661
00:36:16,387 --> 00:36:16,957
If I want

662
00:36:17,527 --> 00:36:19,987
a friend and a "yes, and" partner,

663
00:36:20,077 --> 00:36:21,157
I'll go more to Claude.

664
00:36:21,727 --> 00:36:26,047
And then if I want a lot of reasoning,
I'll go to Claude, but then I'll send

665
00:36:26,077 --> 00:36:32,197
that to Gemini to get some of those like
epistemic, deterministic responses back.

666
00:36:32,617 --> 00:36:35,437
And then I'll send that into
Lovable, and then I'll send that

667
00:36:35,437 --> 00:36:39,277
three times into Lovable, so that I
can pick which one I like the most.

668
00:36:39,907 --> 00:36:40,237
Right?
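
That back-and-forth is essentially an adversarial-critique loop plus best-of-n sampling. A sketch under stated assumptions: call_model() is a hypothetical stand-in for whatever Claude, Gemini, or Lovable client you actually use, not a real API.

```python
# Sketch of the cross-model loop: one model drafts, the other presses on
# weak points, the first revises; then the same spec is generated n times
# so a human can pick a favorite. call_model() is a hypothetical stand-in
# for real provider SDKs, not an actual API.
def call_model(provider: str, prompt: str) -> str:
    raise NotImplementedError("wire up your Claude / Gemini / builder clients")

def cross_examine(question: str) -> str:
    draft = call_model("claude", question)
    critique = call_model(
        "gemini",
        f"Claude said:\n{draft}\n\nWhat do you think? Press on the weak points.",
    )
    return call_model("claude", f"Revise your answer given this critique:\n{critique}")

def best_of_n(spec: str, n: int = 3) -> list[str]:
    # e.g. three Lovable runs of the same spec, picked over by hand
    return [call_model("builder", spec) for _ in range(n)]
```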

669
00:36:40,428 --> 00:36:40,638
Todd Kane: Yeah.

670
00:36:40,668 --> 00:36:40,878
Okay.

671
00:36:41,497 --> 00:36:43,157
Ashley: I use its own
techniques against it.

672
00:36:43,713 --> 00:36:46,413
Todd Kane: So this gets into one
of the other spaces that I found

673
00:36:46,413 --> 00:36:50,253
really fascinating in some of our
exchanges in the group is like some

674
00:36:50,253 --> 00:36:56,063
of the prompts and instruction sets
that you have for your AI are deep,

675
00:36:56,133 --> 00:36:59,888
like probably 12, 14 pages
in some cases around just the

676
00:36:59,888 --> 00:37:01,688
instructions of how to manage

677
00:37:01,923 --> 00:37:07,943
its work and context, like the things to
do and not to do around tonality and how

678
00:37:08,003 --> 00:37:09,743
you're phrasing things, all that stuff.

679
00:37:09,743 --> 00:37:12,173
Like, I'm curious, how
did that come about?

680
00:37:12,173 --> 00:37:15,143
Was that just through trial and
error of developing these things

681
00:37:15,233 --> 00:37:18,533
based on what you knew or how much
did you borrow from other people?

682
00:37:18,773 --> 00:37:21,803
Like how did you come to
such deep instruction files?

683
00:37:22,677 --> 00:37:28,407
Ashley: I borrowed a lot from
philosophers. Aristotle, the

684
00:37:28,407 --> 00:37:31,282
Socratic method, that's a big one.

685
00:37:31,977 --> 00:37:35,397
Like, the big thing with Socrates
is always very first-principles:

686
00:37:35,397 --> 00:37:35,907
why

687
00:37:35,967 --> 00:37:36,687
questions,

688
00:37:36,687 --> 00:37:37,347
questions.

689
00:37:37,347 --> 00:37:40,767
Don't assume; ask, clarify,
take a step, make an assumption,

690
00:37:41,007 --> 00:37:43,227
validate that assumption.
So: scientific method,

691
00:37:43,227 --> 00:37:45,347
philosophers, and biology.

692
00:37:45,677 --> 00:37:47,567
I borrow a lot from biology.

693
00:37:47,817 --> 00:37:52,897
Like the term umwelt. I don't even know
how to properly define it.

694
00:37:53,227 --> 00:37:58,587
In ethology, it's the world as
experienced by a particular organism.

695
00:37:59,007 --> 00:38:02,907
And so I use that term, even
though it's not perfect, because when

696
00:38:02,907 --> 00:38:07,947
I say it to an AI, it knows: don't go
outside of what my capabilities are.

697
00:38:09,112 --> 00:38:12,592
Sometimes I will make a long,
long prompt, but it's always

698
00:38:12,652 --> 00:38:14,512
like, prompt collapse is real.

699
00:38:14,512 --> 00:38:17,392
If you try to put too many differing
instructions in there, it's

700
00:38:17,392 --> 00:38:18,982
not gonna be a good time for you.

701
00:38:19,342 --> 00:38:26,072
But if you're putting kind of like
mental models and guidelines, at the

702
00:38:26,072 --> 00:38:27,842
end, I've always found that successful.
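
A minimal sketch of that shape, assuming a guideline list of your own: one simple, straightforward job up front, with the complementary mental models collected in a single block at the end rather than interleaved.

```python
# Minimal sketch: a single clear job, then complementary guidelines at the
# end. The guideline wording is illustrative, not Ashley's actual text.
GUIDELINES = [
    "Ask clarifying questions before assuming.",
    "Stay inside your umwelt: do not claim capabilities you lack.",
    "Make one assumption at a time, then validate it.",
]

def compose(job: str, guidelines: list[str] = GUIDELINES) -> str:
    rules = "\n".join(f"- {g}" for g in guidelines)
    return f"{job}\n\nGuidelines:\n{rules}"

print(compose("Summarize this incident ticket for the client."))
```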

703
00:38:28,377 --> 00:38:32,847
They just have to complement each other,
and the job that it's doing

704
00:38:32,907 --> 00:38:36,657
needs to be simple, straightforward,
like, I'm not confused and trying

705
00:38:36,657 --> 00:38:41,577
to do seven things at once. But then
where you're shrinking down,

706
00:38:41,667 --> 00:38:43,407
it's like, how am I solving for this?

707
00:38:43,407 --> 00:38:44,852
And a lot of what I've been

708
00:38:45,442 --> 00:38:49,642
experimenting with is:
is it easier to shrink that down?

709
00:38:50,042 --> 00:38:54,712
In the terminology, it's one-shot,
many-shot, like examples, zero-shot, right?

710
00:38:54,952 --> 00:39:02,062
And so zero-shot is like: go off
and use these thinking patterns, but

711
00:39:02,062 --> 00:39:04,162
I'm not giving you any examples, right?

712
00:39:04,802 --> 00:39:06,872
And sometimes that's
better, because sometimes

713
00:39:07,207 --> 00:39:09,817
it'll pattern-match too
much against the example.

714
00:39:09,817 --> 00:39:12,817
If you've ever noticed that: it'll
echo back language from the example

715
00:39:12,817 --> 00:39:15,307
that you gave it, and it's like,
okay, but that was one transcript.
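
The trade-off reduces to whether the prompt carries worked examples. A toy contrast, with invented example text:

```python
# Toy contrast: zero-shot gives thinking patterns but no examples;
# one-shot adds a single worked example the model may over-match against.
TASK = "Classify this ticket as hardware, software, or network."

def zero_shot(ticket: str) -> str:
    return f"{TASK}\nUse first-principles reasoning.\n\nTicket: {ticket}"

def one_shot(ticket: str) -> str:
    example = 'Ticket: "Laptop will not power on." -> hardware'
    return f"{TASK}\n\nExample:\n{example}\n\nTicket: {ticket}"
```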

716
00:39:15,797 --> 00:39:19,247
And so when I think that that's
the case, I actually have, like

717
00:39:19,247 --> 00:39:24,217
in my clipboard when I'm
on my phone. You can see it:

718
00:39:24,217 --> 00:39:28,867
it's got all of these
different prompt-ending

719
00:39:28,867 --> 00:39:31,007
pieces that I will put in there.

720
00:39:31,067 --> 00:39:34,337
And so it'll be like, "always
focus on" whatever I wanna

721
00:39:34,337 --> 00:39:37,487
say, but then I'll just be like,
"always focus on first principles,"

722
00:39:37,907 --> 00:39:40,757
knowing what it's likely to fail on.

723
00:39:40,757 --> 00:39:45,587
So understanding those failure modes
will then decide what I place in those.

724
00:39:45,807 --> 00:39:47,007
But it's been a lot of research.

725
00:39:47,007 --> 00:39:50,937
It's been a lot of talking back and
forth with it. So one

726
00:39:50,937 --> 00:39:55,557
of the things, like when I was using
ChatGPT and building all of my custom GPTs:

727
00:39:56,727 --> 00:40:01,827
I built a bunch of them where I would
build the instruction set for it, and

728
00:40:01,827 --> 00:40:05,217
then I would just use that whenever I
was like, hey, this was its plan.

729
00:40:05,247 --> 00:40:06,957
Can you spot any problems in this?

730
00:40:07,647 --> 00:40:07,947
Right?

731
00:40:07,947 --> 00:40:11,837
And so, yeah, just kind of collecting
all those words, collecting all of those

732
00:40:11,837 --> 00:40:13,642
mentalities, where it's likely to fail.

733
00:40:13,982 --> 00:40:16,567
My Google Keep is just
filled with them now.

734
00:40:17,938 --> 00:40:21,613
Todd Kane: So it just becomes kind
of your own pattern matching

735
00:40:21,613 --> 00:40:25,933
of like, what do I need to add
to this based on my library of

736
00:40:26,062 --> 00:40:26,482
Ashley: Yeah.

737
00:40:26,623 --> 00:40:27,103
Todd Kane: resources.

738
00:40:27,153 --> 00:40:29,773
Ashley: And also recognizing what
it works really well against.

739
00:40:29,773 --> 00:40:30,013
Right.

740
00:40:30,013 --> 00:40:36,213
So, as I've been learning more about
how vectorization works and how attention

741
00:40:36,213 --> 00:40:40,973
is placed on different things and
given different weights, I've also been

742
00:40:40,973 --> 00:40:45,588
realizing that like hashtags do the thing
that we always thought that they did.

743
00:40:46,238 --> 00:40:49,238
But then didn't think that they
did, and then they were overused.

744
00:40:49,238 --> 00:40:54,368
And now it's like: oh, if
I understand a vector as a

745
00:40:54,398 --> 00:41:00,728
representation of like a common
occurrence of letters or a group of

746
00:41:00,728 --> 00:41:03,098
letters, or a couple of words, right?

747
00:41:03,338 --> 00:41:07,748
Then I can understand what
creating them might look like.

748
00:41:07,748 --> 00:41:12,908
And if you put something together
that's not commonly seen together, it's

749
00:41:12,908 --> 00:41:14,798
gonna get a higher attention weight

750
00:41:15,758 --> 00:41:19,898
to that thing when it's answering
all the rest of the things.
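
A toy illustration of that intuition: in scaled dot-product attention, weights come from a softmax over query-key similarities, so a vector that stands apart from its neighbors can soak up most of the weight. Real tokenizers and trained embeddings are far messier than this sketch.

```python
# Toy attention step: one unusual "token" vector draws most of the softmax
# weight. Purely illustrative; real models learn these vectors.
import numpy as np

rng = np.random.default_rng(0)
common = rng.normal(0, 0.1, size=(4, 8))  # four ordinary token vectors
rare = rng.normal(0, 1.0, size=(1, 8))    # one unusual, distinctive vector
keys = np.vstack([common, rare])
query = rare[0]                           # the query resonates with the rare token

scores = keys @ query / np.sqrt(keys.shape[1])   # scaled dot products
weights = np.exp(scores) / np.exp(scores).sum()  # softmax
print(weights.round(3))                          # the rare token dominates
```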

751
00:41:20,269 --> 00:41:20,509
Todd Kane: Right,

752
00:41:21,008 --> 00:41:22,328
Ashley: And so I've
been playing with that.

753
00:41:22,339 --> 00:41:23,299
Todd Kane: pattern, so it has to

754
00:41:23,468 --> 00:41:23,798
Ashley: Yeah.

755
00:41:23,929 --> 00:41:24,499
Todd Kane: deeply then.

756
00:41:24,529 --> 00:41:25,009
Interesting.

757
00:41:25,058 --> 00:41:28,118
Ashley: And so it just shrinks
where it's allowed to go, right?

758
00:41:28,118 --> 00:41:32,348
Like, you could say to it, two plus two is
five, and back in the day it'll have

759
00:41:32,348 --> 00:41:33,818
to figure out: yeah, I agree with you.

760
00:41:33,968 --> 00:41:36,068
And then if you notice how it
conflicts with itself, right?

761
00:41:36,068 --> 00:41:37,688
It's like, I figured this thing out.

762
00:41:37,988 --> 00:41:40,033
All I have to do is this thing that
disagrees with the thing I just said.

763
00:41:40,754 --> 00:41:41,104
Todd Kane: Right?

764
00:41:41,183 --> 00:41:44,783
Ashley: It's just trying to continue on
with whatever the most probable next

765
00:41:44,783 --> 00:41:49,553
thing to say is. When you shrink
that down, you get mental-model activation.

766
00:41:49,553 --> 00:41:51,113
It works really well with people too.

767
00:41:51,113 --> 00:41:54,593
Although I don't know what the
ethics of that is. But I'll

768
00:41:54,593 --> 00:41:55,973
tell it to think like Uncle Bob.

769
00:41:56,333 --> 00:42:00,503
And so for me to say,
"think like Uncle Bob," that's

770
00:42:00,503 --> 00:42:05,393
four words, but then it unpacks:

771
00:42:05,423 --> 00:42:05,663
Okay.

772
00:42:05,663 --> 00:42:08,273
SOLID principles, separation of concerns,

773
00:42:08,353 --> 00:42:16,723
contracts. It will build
clean-coding mentalities into its direction,

774
00:42:17,473 --> 00:42:20,578
just because I said those four words and
it understands what those four words mean.
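
For what four words can unpack into, here is a tiny sketch in the single-responsibility spirit that "think like Uncle Bob" invokes; the ticket example is invented.

```python
# One concern per function, with narrow contracts between them: the
# separation-of-concerns habit that "think like Uncle Bob" nudges toward.
def fetch_ticket(ticket_id: int) -> dict:
    return {"id": ticket_id, "subject": "VPN down"}  # stand-in for a PSA call

def format_summary(ticket: dict) -> str:
    return f"[#{ticket['id']}] {ticket['subject']}"

def notify(summary: str) -> None:
    print(summary)  # stand-in for email or chat delivery

def handle(ticket_id: int) -> None:
    notify(format_summary(fetch_ticket(ticket_id)))

handle(42)
```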

775
00:42:21,779 --> 00:42:21,989
Todd Kane: Okay.

776
00:42:22,019 --> 00:42:27,219
So, the SaaS apocalypse:
overhyped or underappreciated?

777
00:42:27,234 --> 00:42:29,859
And I guess the extension
of this is, what does this mean

778
00:42:29,859 --> 00:42:30,854
in the MSP channel?

779
00:42:32,118 --> 00:42:33,048
Ashley: I think it's both.

780
00:42:33,348 --> 00:42:37,338
I think that sometimes we
overhype what is underappreciated.

781
00:42:37,948 --> 00:42:43,948
It depends. If there's somebody
who is, like, vibe coding a

782
00:42:43,948 --> 00:42:47,908
SaaS, and they're not considering
what goes on behind the scenes,

783
00:42:47,908 --> 00:42:51,118
and then they're throwing that up
going, look what I built, $50 a user.

784
00:42:52,108 --> 00:42:52,948
It's production ready.

785
00:42:52,948 --> 00:42:54,718
Right now, let's go.

786
00:42:55,078 --> 00:42:55,648
That's bad.

787
00:42:55,648 --> 00:43:00,688
And that's honestly, that's one of the
reasons why people are distrustful.

788
00:43:00,718 --> 00:43:01,048
Right?

789
00:43:01,048 --> 00:43:04,298
And so I was saying this the other
day, it is frustrating that the

790
00:43:04,298 --> 00:43:09,038
people who are adopting are the ones
that are also making it so that the

791
00:43:09,038 --> 00:43:11,828
ones who should adopt, adopt less.

792
00:43:12,728 --> 00:43:17,168
And then there's an app that is pair-programmed

793
00:43:17,693 --> 00:43:21,053
to have the human in the lead,
to have that expert developer

794
00:43:21,593 --> 00:43:25,043
with their eyes on it, driving
that strategy in that direction.

795
00:43:25,523 --> 00:43:29,503
And then it's just doing some of
that monotonous work, versus somebody

796
00:43:29,503 --> 00:43:32,353
being like, oh, I have no idea what
I'm doing, but like, I've hacked

797
00:43:32,353 --> 00:43:34,273
together this functioning thing.

798
00:43:34,603 --> 00:43:38,713
One of those is
something to discredit

799
00:43:39,088 --> 00:43:40,498
and be concerned about,

800
00:43:40,618 --> 00:43:43,228
and one of them is something
that we shouldn't be bucketing

801
00:43:43,228 --> 00:43:44,458
into that same category.

802
00:43:45,088 --> 00:43:50,088
And I try to think about it the
same way as like using Grammarly

803
00:43:50,238 --> 00:43:55,668
or Stack Overflow or anything; it's
just a tool at that point, right?

804
00:43:55,668 --> 00:44:00,798
And so people who are using it as a
tool to build into their workflows

805
00:44:01,398 --> 00:44:07,188
aren't the same conversation as people
who are putting vibe-coded,

806
00:44:07,908 --> 00:44:09,738
unvetted products on the market.

807
00:44:10,814 --> 00:44:13,909
Todd Kane: So I guess given your
perspective on like how much

808
00:44:13,909 --> 00:44:16,219
you've built, how much you've
learned, and also coming from

809
00:44:16,939 --> 00:44:19,999
A background of automation, which
is I think where a lot of people

810
00:44:19,999 --> 00:44:24,539
start in the MSP space in, in
sort of, utilizing these tools.

811
00:44:24,899 --> 00:44:29,969
What would be your suggestion, sort of
broadly for people in the MSP space of

812
00:44:30,029 --> 00:44:33,659
they've heard about
this stuff, they've tinkered with GPT,

813
00:44:33,659 --> 00:44:35,859
maybe they've tinkered with Lovable.

814
00:44:36,249 --> 00:44:39,669
How would they utilize this both
for their company and I think more

815
00:44:39,669 --> 00:44:41,739
importantly maybe for their clients?

816
00:44:41,799 --> 00:44:43,084
What would you suggest to those people?

817
00:44:43,903 --> 00:44:47,198
Ashley: Yeah, I think that,
first, I would start with saying:

818
00:44:48,023 --> 00:44:54,293
don't get ahead of yourself. If you haven't
mapped out and had the data conversation,

819
00:44:55,733 --> 00:44:58,973
then you need to do that first.

820
00:44:59,513 --> 00:45:03,123
And this is something that
I'm actually probably gonna be doing some

821
00:45:03,153 --> 00:45:05,283
workshops around, where

822
00:45:05,313 --> 00:45:09,363
people can build their first
AI agent and see it, not in production,

823
00:45:09,363 --> 00:45:13,388
but working and
bringing them value internally

824
00:45:13,848 --> 00:45:16,293
within like a short period of time.

825
00:45:16,293 --> 00:45:16,953
The problem is,

826
00:45:17,763 --> 00:45:19,383
You know, you've got,
well, what do you use?

827
00:45:19,383 --> 00:45:22,833
You've got your Azure environment,
you've got your third-party tools,

828
00:45:22,833 --> 00:45:26,883
you've got your PSA...
where does all of your stuff live?

829
00:45:27,273 --> 00:45:33,663
And I feel like that is probably
the unsexy conversation that I

830
00:45:33,663 --> 00:45:35,253
keep forcing people to start with.

831
00:45:35,253 --> 00:45:39,753
Where it's like: figure
out one area where you have

832
00:45:39,753 --> 00:45:42,543
confidence about where that data is.

833
00:45:43,053 --> 00:45:45,693
You know, like maybe it's your
Azure, maybe it's your SharePoint.

834
00:45:45,743 --> 00:45:49,233
Spin up an Azure Foundry account

835
00:45:49,533 --> 00:45:54,153
and look at what's available
in there, because we could... oh God.

836
00:45:54,213 --> 00:45:54,543
Sorry.

837
00:45:54,633 --> 00:45:58,353
I'm struggling with this question right
now because I have so many opinions.

838
00:45:58,353 --> 00:46:01,483
I don't want people to just
go, oh yeah,

839
00:46:01,483 --> 00:46:03,253
I can make my own chatbot,

840
00:46:03,253 --> 00:46:06,253
I can build my own app. But,
getting used to it...

841
00:46:06,313 --> 00:46:10,213
I would say that people should be getting
used to the experience. But there's also,

842
00:46:10,538 --> 00:46:13,028
if we're talking about
the MSP, there are some

843
00:46:13,448 --> 00:46:15,728
maturity steps that need
to go into that, right?

844
00:46:15,728 --> 00:46:19,178
Like, do you know what your zero
trust is at your client's site?

845
00:46:19,178 --> 00:46:23,318
Do you have a walled garden,
effectively, that

846
00:46:23,318 --> 00:46:25,178
your people can experiment in?

847
00:46:25,508 --> 00:46:30,773
Microsoft has a good AI maturity
model and workshop that

848
00:46:31,378 --> 00:46:35,008
you can do internally, for like
a center-of-excellence kind of thing.

849
00:46:35,578 --> 00:46:39,028
I would encourage people to look at
that because it basically handholds

850
00:46:39,028 --> 00:46:41,348
you through the framework.

851
00:46:41,348 --> 00:46:45,218
I was thinking about doing a
workshop around something like that.

852
00:46:45,218 --> 00:46:48,398
But yeah, I don't think that it's about
selling to your clients right away.

853
00:46:48,398 --> 00:46:52,868
It's about having that conversation
about readiness with your clients first.

854
00:46:53,329 --> 00:46:53,569
Todd Kane: Right.

855
00:46:54,308 --> 00:46:55,088
Ashley: Where's your data?

856
00:46:55,148 --> 00:46:55,808
What do you want?

857
00:46:56,479 --> 00:46:57,889
Todd Kane: I think that's
an important first step,

858
00:46:57,889 --> 00:47:00,079
'cause like so much of
this is data driven, right?

859
00:47:00,079 --> 00:47:04,579
Like the access to what data is
appropriate and whether or not

860
00:47:04,579 --> 00:47:07,369
that data is clean and if you're
gonna get good context from it.

861
00:47:08,028 --> 00:47:08,238
Ashley: Yeah.

862
00:47:08,269 --> 00:47:11,789
Todd Kane: Part of what I struggle with
here is, I've been talking a lot about how

863
00:47:12,269 --> 00:47:13,919
we've been dragging people to the cloud.

864
00:47:14,259 --> 00:47:14,549
Ashley: Yeah.

865
00:47:14,715 --> 00:47:16,485
Todd Kane: people into
security for a decade.

866
00:47:16,485 --> 00:47:16,815
Ashley: Yeah.

867
00:47:16,906 --> 00:47:23,136
Todd Kane: It's really sort of an odd
situation, and I'm maybe a little concerned,

868
00:47:23,136 --> 00:47:26,106
like, not to throw shade at anybody,
but a lot of the conversations that

869
00:47:26,106 --> 00:47:30,096
people are having about AI with clients
are like, hey, do you have Copilot?

870
00:47:30,096 --> 00:47:30,366
Ashley: Yeah.

871
00:47:30,597 --> 00:47:30,807
Todd Kane: This is great.

872
00:47:31,112 --> 00:47:31,412
Right.

873
00:47:31,512 --> 00:47:31,802
Ashley: Yeah.

874
00:47:31,983 --> 00:47:34,713
Todd Kane: AI, with the pace of
change that we're going at.

875
00:47:34,913 --> 00:47:35,183
Ashley: Yeah.

876
00:47:35,299 --> 00:47:39,019
Todd Kane: I feel your sentiment on this,
of like, well, okay, but let's also not race

877
00:47:39,019 --> 00:47:43,309
ahead and say, hey, let me implement
OpenClaw for all of your employees.

878
00:47:43,359 --> 00:47:43,689
Ashley: Yeah,

879
00:47:44,120 --> 00:47:44,390
Todd Kane: Right.

880
00:47:44,409 --> 00:47:44,829
Ashley: Earn it.

881
00:47:44,960 --> 00:47:47,540
Todd Kane: that middle ground
is so difficult, right?

882
00:47:48,174 --> 00:47:50,664
Ashley: And I think that's why it's
like if you try to bring AI into the

883
00:47:50,664 --> 00:47:56,034
conversation, if you make AI the point
of the conversation and not the readiness

884
00:47:56,214 --> 00:47:59,994
as the point of the conversation, then
you're just gonna have the eyes

885
00:47:59,994 --> 00:48:01,974
glaze over, waiting for the flashy thing, right?

886
00:48:02,004 --> 00:48:04,554
But, I don't
know, it has to be education.

887
00:48:05,064 --> 00:48:11,244
And then, how to make it
motivational for them is something that

888
00:48:11,244 --> 00:48:12,684
I'm still trying to figure out too.

889
00:48:14,724 --> 00:48:15,469
A lot of thoughts on it.

890
00:48:16,405 --> 00:48:19,435
Todd Kane: It's one of the odd
parts of the pace of change in AI

891
00:48:19,465 --> 00:48:23,455
is I find it really difficult to
gauge where I'm at on the scale

892
00:48:23,455 --> 00:48:24,835
of understanding with this stuff.

893
00:48:24,835 --> 00:48:28,675
Like obviously the people that are
working at companies with frontier models

894
00:48:28,675 --> 00:48:32,095
and stuff, like they understand this
stuff intimately and very, very deeply.

895
00:48:32,525 --> 00:48:35,885
and there are times in the space
where I feel like I'm at the front

896
00:48:35,885 --> 00:48:39,305
of the crowd and then times I feel
like I'm absolutely at the back.

897
00:48:39,515 --> 00:48:42,155
But I can't really figure out if I'm
in the middle, at the front, or at the

898
00:48:42,155 --> 00:48:44,105
back at any given moment, basically.

899
00:48:44,105 --> 00:48:44,345
Right.

900
00:48:44,435 --> 00:48:46,325
Like, because it's so new.

901
00:48:46,355 --> 00:48:50,495
No one is really sort of a great expert
on this and we're all kinda learning at

902
00:48:50,495 --> 00:48:54,875
the same time, and sort of, you
know, jockeying for position.

903
00:48:54,875 --> 00:48:58,355
Not that we're competing, but just,
like, I learn something that someone

904
00:48:58,355 --> 00:49:01,175
learned yesterday, and then they learn
something tomorrow that I learned today.

905
00:49:01,235 --> 00:49:03,185
That's constantly happening right now,

906
00:49:03,334 --> 00:49:07,474
Ashley: Yeah, and the conversation
needs to be different for the

907
00:49:07,474 --> 00:49:12,034
business leaders than it does
for like the technology leaders.

908
00:49:12,454 --> 00:49:13,539
And then, you know...

909
00:49:14,349 --> 00:49:18,009
Do you understand even the
different failure modes, right?

910
00:49:18,099 --> 00:49:22,059
Do you understand the difference
between what a fine-tuned model

911
00:49:22,059 --> 00:49:26,769
and a RAG-assisted model is?

912
00:49:26,769 --> 00:49:31,239
Because if you're trusting one
to be the other, you're gonna

913
00:49:31,359 --> 00:49:32,319
shoot yourself in the foot.
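
The distinction in a nutshell: a fine-tuned model answers from behavior baked into its weights, while a RAG-assisted model retrieves context at answer time and grounds the reply in it. A self-contained toy of the retrieval step, nothing vendor-specific:

```python
# Toy RAG step: pick the most relevant document by crude word overlap and
# ground the prompt in it. A fine-tuned model skips retrieval entirely,
# which is why the two should not be trusted interchangeably.
DOCS = [
    "Backups run nightly at 2am and are verified weekly.",
    "VPN access requires MFA enrollment through the portal.",
]

def score(query: str, doc: str) -> int:
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d)  # crude word-overlap relevance

def answer_with_rag(question: str) -> str:
    best = max(DOCS, key=lambda d: score(question, d))
    return f"Using only this context:\n{best}\n\nAnswer the question: {question}"

print(answer_with_rag("When do backups run?"))
```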

914
00:49:32,914 --> 00:49:34,594
And then
you're not gonna trust it.

915
00:49:34,594 --> 00:49:37,474
And one of our biggest struggles
that we always have, you know, Matt Lee

916
00:49:37,474 --> 00:49:40,054
talks about this all the time, about

917
00:49:40,054 --> 00:49:44,734
the fact that the
seatbelt wasn't invented until

918
00:49:44,734 --> 00:49:47,914
like however many years after, right?

919
00:49:47,914 --> 00:49:49,804
Like framework, yeah.

920
00:49:50,134 --> 00:49:51,634
Governance follows.

921
00:49:52,924 --> 00:49:53,494
After.

922
00:49:53,674 --> 00:49:54,094
Right.

923
00:49:54,094 --> 00:49:56,644
And so it's like we're always gonna
be, one of the reasons why we're

924
00:49:56,644 --> 00:49:59,284
always held back is 'cause we are
a little bit of like the crabs in

925
00:49:59,284 --> 00:50:00,904
the bucket when new things come out.

926
00:50:00,904 --> 00:50:01,804
Like, don't do that yet.

927
00:50:01,804 --> 00:50:02,524
Don't do that yet.

928
00:50:02,524 --> 00:50:03,184
Don't do that yet.

929
00:50:03,184 --> 00:50:04,804
And it's like, but you
know who is doing that?

930
00:50:05,014 --> 00:50:07,264
The people who don't have people
telling them not to do that yet.

931
00:50:07,654 --> 00:50:12,934
And so we need to be educating,
now more than ever, right?

932
00:50:12,934 --> 00:50:16,934
Educate them on what it is, what
the different types are. You know, like...

933
00:50:18,034 --> 00:50:21,544
You don't hand somebody a gun as, like,
the very first thing: hey, you've

934
00:50:21,544 --> 00:50:22,624
never shot one before, here.

935
00:50:22,834 --> 00:50:23,224
Like, right.

936
00:50:23,224 --> 00:50:24,244
Like you kind of like,

937
00:50:24,295 --> 00:50:24,925
Todd Kane: a loaded gun.

938
00:50:24,925 --> 00:50:29,725
Ashley: Here's what the mechanics
of it are, here's where this comes from.

939
00:50:29,725 --> 00:50:34,225
Here's the likelihood that you're
gonna cause damage to yourself

940
00:50:34,225 --> 00:50:35,125
if you don't do this or that.

941
00:50:36,116 --> 00:50:36,386
Todd Kane: Yep.

942
00:50:36,626 --> 00:50:36,866
Yep.

943
00:50:37,045 --> 00:50:37,315
Ashley: Yeah,

944
00:50:37,346 --> 00:50:40,946
Todd Kane: I guess, like, as with all
technology, it can

945
00:50:40,946 --> 00:50:42,926
be used for good and evil basically.

946
00:50:42,926 --> 00:50:43,616
So, you know,

947
00:50:43,705 --> 00:50:44,635
Ashley: just like a toothbrush.

948
00:50:44,666 --> 00:50:45,776
Todd Kane: Well, this
has been awesome, Ashley.

949
00:50:45,776 --> 00:50:48,836
I really appreciate you coming on
and sharing some of your experience.

950
00:50:48,886 --> 00:50:53,966
And I think you're right: start with
education, both for yourself and for, you

951
00:50:53,966 --> 00:50:57,596
know, what you can teach your staff and
what you can teach your clients, because

952
00:50:57,596 --> 00:51:02,666
the evolution of this stuff is wild,
but it is also a ton of fun and a great

953
00:51:02,666 --> 00:51:06,326
place to be doing some learning
in the nerdy kingdom, basically.

954
00:51:06,326 --> 00:51:07,346
So it's been great.

955
00:51:07,346 --> 00:51:07,856
Ashley: Thank you.

956
00:51:07,856 --> 00:51:08,216
Todd Kane: Take care.

