1
00:00:05,768 --> 00:00:07,793
[Announcer]: Welcome to the Analytics Power Hour.

2
00:00:08,474 --> 00:00:12,724
[Announcer]: Analytics topics covered conversationally and sometimes with explicit language.

3
00:00:15,190 --> 00:00:16,192
[Michael Helbling]: Hey everybody, welcome.

4
00:00:16,453 --> 00:00:20,402
[Michael Helbling]: It's the Analytics Power Hour and this is episode 287.

5
00:00:21,158 --> 00:00:24,582
[Michael Helbling]: Ho, ho, ho, holy shit.

6
00:00:24,602 --> 00:00:26,224
[Michael Helbling]: Another year is basically over.

7
00:00:26,244 --> 00:00:32,171
[Michael Helbling]: 2025, I mean, it never even had a chance to slow down and decompress, it feels like.

8
00:00:32,751 --> 00:00:39,139
[Michael Helbling]: I mean, we're just running at a breakneck pace, finding out about AI, doing our work, trying to do everything.

9
00:00:39,239 --> 00:00:44,104
[Michael Helbling]: But regardless, we're going to try to take a look back and maybe a small peek forward.

10
00:00:44,144 --> 00:00:48,229
[Michael Helbling]: That's the Analytics Power Hour Year in Review episode.

11
00:00:48,209 --> 00:00:58,025
[Michael Helbling]: And so with no more ado, it's time to introduce my awesome co-hosts, Moe Kiss, Director of Data Science for Marketing at Canva.

12
00:00:58,526 --> 00:00:59,267
[Michael Helbling]: How you going?

13
00:01:00,549 --> 00:01:01,831
[Moe Kiss]: I'm going pretty good.

14
00:01:01,992 --> 00:01:04,336
[Moe Kiss]: But yeah, 2025, that was a time.

15
00:01:04,376 --> 00:01:06,018
[Michael Helbling]: It felt fast.

16
00:01:06,679 --> 00:01:07,200
[Moe Kiss]: Big year.

17
00:01:07,881 --> 00:01:09,204
[Michael Helbling]: Big year, I agree.

18
00:01:09,404 --> 00:01:12,529
[Michael Helbling]: Tim Wilson, Head of Solutions at facts & feelings.

19
00:01:12,627 --> 00:01:13,108
[Michael Helbling]: Do you agree?

20
00:01:13,368 --> 00:01:13,889
[Michael Helbling]: Hello.

21
00:01:13,909 --> 00:01:14,029
[Michael Helbling]: Hello.

22
00:01:14,970 --> 00:01:15,571
[Michael Helbling]: Hello.

23
00:01:15,751 --> 00:01:17,073
[Michael Helbling]: Hello.

24
00:01:17,674 --> 00:01:18,115
[Michael Helbling]: Hello.

25
00:01:18,756 --> 00:01:19,216
[Michael Helbling]: Quite a year.

26
00:01:20,017 --> 00:01:20,859
[Michael Helbling]: Yeah.

27
00:01:20,899 --> 00:01:23,562
[Michael Helbling]: Val Kroll, Head of Delivery at facts & feelings.

28
00:01:25,245 --> 00:01:26,286
[Michael Helbling]: How's your year going?

29
00:01:26,907 --> 00:01:27,187
[Michael Helbling]: Gone.

30
00:01:27,768 --> 00:01:28,830
[Val Kroll]: Lots of feelings.

31
00:01:28,850 --> 00:01:29,771
[Val Kroll]: There were lots of feelings.

32
00:01:29,791 --> 00:01:31,173
[Michael Helbling]: Yeah.

33
00:01:31,193 --> 00:01:31,834
[Michael Helbling]: I agree.

34
00:01:32,540 --> 00:01:37,405
[Michael Helbling]: And of course, we are missing Julie Hoyer as she enjoys some time off with her new baby.

35
00:01:37,445 --> 00:01:41,450
[Michael Helbling]: And so we look forward to her coming back next year.

36
00:01:42,291 --> 00:01:45,594
[Tim Wilson]: So her year is going sleep deprived, right?

37
00:01:45,615 --> 00:01:46,856
[Tim Wilson]: Yeah, that's right.

38
00:01:48,338 --> 00:01:57,808
[Michael Helbling]: And of course, as a special treat, we've got Josh Crowhurst, Growth Marketing Director at Immanuel Life as our special guest this episode.

39
00:01:58,168 --> 00:01:59,270
[Michael Helbling]: Welcome back, Josh.

40
00:02:01,212 --> 00:02:02,293
[Josh Crowhurst]: Hey, yes, great to be here.

41
00:02:02,813 --> 00:02:14,358
[Michael Helbling]: You know, Josh, I don't know if many of our listeners actually know the story of how you became involved with the podcast in the first place.

42
00:02:14,899 --> 00:02:20,451
[Michael Helbling]: So if you don't mind, I'd like to take a second and just tell people how that happened.

43
00:02:20,667 --> 00:02:23,732
[Tim Wilson]: I thought it was gonna be like the 2025 and how you stormed away.

44
00:02:23,752 --> 00:02:24,934
[Tim Wilson]: Like, keep it as the year in review.

45
00:02:24,954 --> 00:02:31,965
[Michael Helbling]: Well, I mean, it's part of the year in review that Josh finally had to step back from his role with the podcast.

46
00:02:31,985 --> 00:02:38,095
[Michael Helbling]: So we're actually really glad that you did rejoin for this one last episode for year in review, which is our tradition.

47
00:02:38,776 --> 00:02:41,079
[Michael Helbling]: And, you know, if you're up for it, come back next year.

48
00:02:41,119 --> 00:02:46,708
[Michael Helbling]: We don't care, but yeah, Josh stopped being involved.

49
00:02:46,928 --> 00:02:48,531
[Michael Helbling]: Nicely put, Michael.

50
00:02:48,747 --> 00:02:52,311
[Michael Helbling]: No, I'm just saying, it'll be fun.

51
00:02:52,371 --> 00:02:55,254
[Michael Helbling]: It's not pressure, it's up to you.

52
00:02:55,294 --> 00:02:56,735
[Michael Helbling]: You got a lot going on in life.

53
00:02:57,376 --> 00:03:07,867
[Michael Helbling]: But no, early 2019, Tim and I were working out how to make the show better, and we thought we needed some help.

54
00:03:08,187 --> 00:03:10,569
[Michael Helbling]: And so we put out a call for a producer.

55
00:03:11,410 --> 00:03:17,016
[Michael Helbling]: It was a poorly written job description, one that we did not fully understand.

56
00:03:17,266 --> 00:03:20,511
[Michael Helbling]: And then- Did Tim fully understand it, just to be clear?

57
00:03:21,693 --> 00:03:28,204
[Michael Helbling]: Well, in terms of like what it would take to do and what we were looking for and all those things, it was just very much like a shot in the dark.

58
00:03:29,426 --> 00:03:34,975
[Michael Helbling]: To our surprise and delight, we got a response from Josh Crowhurst.

59
00:03:34,995 --> 00:03:43,729
[Michael Helbling]: And after chatting with him a few months later, because I forgot about the email and didn't look at it for a while,

60
00:03:43,709 --> 00:03:48,556
[Michael Helbling]: Josh joined the show as our producer and was with us for, I believe, six years, which is incredible.

61
00:03:48,596 --> 00:03:49,337
[Michael Helbling]: It's so amazing.

62
00:03:49,758 --> 00:03:55,847
[Michael Helbling]: And so now that life has taken Josh in a new direction and he's growing, he's obviously stepping into bigger and bigger roles.

63
00:03:55,907 --> 00:04:00,975
[Michael Helbling]: And it's so cool to see how your life and career has just flourished.

64
00:04:01,516 --> 00:04:04,300
[Michael Helbling]: And I like to think maybe

65
00:04:04,280 --> 00:04:06,686
[Michael Helbling]: I mean, I don't think that, I don't know.

66
00:04:06,926 --> 00:04:07,407
[Michael Helbling]: Anyways.

67
00:04:07,447 --> 00:04:10,094
[Michael Helbling]: You didn't even do any audio production at all, yeah.

68
00:04:10,214 --> 00:04:11,978
[Josh Crowhurst]: I couldn't even... All thanks to you, Alves.

69
00:04:12,800 --> 00:04:13,842
[Josh Crowhurst]: All thanks to you.

70
00:04:14,123 --> 00:04:15,546
[Michael Helbling]: No, not me personally.

71
00:04:15,586 --> 00:04:19,335
[Michael Helbling]: Just the Analytics Power Hour generally

72
00:04:19,315 --> 00:04:21,718
[Michael Helbling]: benefited your career in some way.

73
00:04:21,818 --> 00:04:23,821
[Michael Helbling]: I'd love to think that, but probably it did.

74
00:04:23,841 --> 00:04:24,362
[Michael Helbling]: Absolutely.

75
00:04:25,363 --> 00:04:30,069
[Michael Helbling]: Anyway, we appreciate it and we're happy that you are able to join us for this episode.

76
00:04:30,169 --> 00:04:35,816
[Michael Helbling]: Okay, what we do on all these year-in-review episodes: we like to look back at the year that just went past.

77
00:04:35,836 --> 00:04:36,758
[Michael Helbling]: We did a lot of shows.

78
00:04:37,358 --> 00:04:38,600
[Michael Helbling]: We did a lot of interesting shows.

79
00:04:39,021 --> 00:04:43,927
[Michael Helbling]: We like to talk about some of them, highlight some of our favorite episodes, maybe chat about some of the things that happened this year.

80
00:04:44,468 --> 00:04:45,569
[Michael Helbling]: So who wants to kick us off?

81
00:04:45,629 --> 00:04:48,613
[Michael Helbling]: What's an episode that really stands out for you?

82
00:04:49,049 --> 00:04:52,344
[Val Kroll]: Well, obviously we started off our year strong.

83
00:04:53,268 --> 00:04:57,507
[Val Kroll]: No show would be complete without Tim Wilson kicking off our year.

84
00:05:00,254 --> 00:05:04,939
[Val Kroll]: with the announcement of Analytics the Right Way, episode 263.

85
00:05:05,800 --> 00:05:09,144
[Val Kroll]: So that was a big, we were all so excited to see that come to life.

86
00:05:09,184 --> 00:05:15,210
[Val Kroll]: And it was super fun to be a part of that episode, since I had the pleasure of working with Dr. Joe Sutherland.

87
00:05:15,891 --> 00:05:21,677
[Val Kroll]: And that was just a really fun, big moment, like diving into all of the big themes of the book.

88
00:05:22,438 --> 00:05:25,401
[Val Kroll]: But that was the first one of the year, was it?

89
00:05:25,381 --> 00:05:27,967
[Val Kroll]: It feels like if it was not for a second.

90
00:05:28,248 --> 00:05:30,313
[Val Kroll]: Yeah, starting strong.

91
00:05:30,494 --> 00:05:31,155
[Val Kroll]: That was a good one.

92
00:05:32,779 --> 00:05:35,847
[Moe Kiss]: I still have FOMO about missing that one.

93
00:05:36,198 --> 00:05:38,581
[Val Kroll]: We did fight over who would get to be on it.

94
00:05:38,961 --> 00:05:39,462
[Val Kroll]: Yeah.

95
00:05:40,083 --> 00:05:50,836
[Tim Wilson]: Well, as other people have released books this year, I realized what kind of a shit job I've done of ongoing, rolling thunder, you know, promotion of the book.

96
00:05:51,036 --> 00:05:55,241
[Tim Wilson]: But I was in it for the writing of the book, and I figured it was going to be downhill.

97
00:05:55,421 --> 00:06:01,969
[Tim Wilson]: Once he showed up on the Analytics Power Hour as a guest, why would there need to be any other promotion?

98
00:06:02,050 --> 00:06:04,212
[Tim Wilson]: The old APH bump, we like to call it.

99
00:06:04,373 --> 00:06:04,613
[Tim Wilson]: Yep.

100
00:06:04,793 --> 00:06:05,314
[Tim Wilson]: Clearly.

101
00:06:05,354 --> 00:06:09,458
[Tim Wilson]: Dozens, dozens of books flew off the shelf.

102
00:06:09,478 --> 00:06:14,404
[Moe Kiss]: I have bought six alone, so I am definitely helping the supplies go out the door.

103
00:06:15,084 --> 00:06:18,228
[Val Kroll]: Were those some of your stocking stuffers, Moe, for friends and family?

104
00:06:21,411 --> 00:06:22,573
[Tim Wilson]: Folks, it's not too late.

105
00:06:22,753 --> 00:06:27,518
[Tim Wilson]: If you're listening now, you can... That's right.

106
00:06:28,072 --> 00:06:29,954
[Michael Helbling]: for that special someone in your life.

107
00:06:30,615 --> 00:06:31,696
[Michael Helbling]: The e-book version.

108
00:06:35,099 --> 00:06:37,421
[Val Kroll]: Use code APHBump for 10% off.

109
00:06:39,764 --> 00:06:40,865
[Michael Helbling]: Don't say that.

110
00:06:45,930 --> 00:06:46,710
[Michael Helbling]: Oh, man.

111
00:06:47,531 --> 00:06:49,833
[Michael Helbling]: Well, I'm glad there's no other episodes to talk about.

112
00:06:50,014 --> 00:06:51,775
[Michael Helbling]: Yeah, that was really the one.

113
00:06:51,955 --> 00:06:54,518
[Michael Helbling]: Let's talk about that one more.

114
00:06:54,498 --> 00:06:55,840
[Michael Helbling]: That was the one.

115
00:06:55,880 --> 00:06:57,402
[Michael Helbling]: All right.

116
00:06:57,623 --> 00:06:59,105
[Michael Helbling]: Listen, I have an episode.

117
00:06:59,466 --> 00:07:00,147
[Michael Helbling]: Here's the thing.

118
00:07:00,247 --> 00:07:00,567
[Michael Helbling]: Okay.

119
00:07:01,208 --> 00:07:06,176
[Michael Helbling]: When we do this podcast, this is the thing I do with a lot of things.

120
00:07:07,598 --> 00:07:16,371
[Michael Helbling]: When I interview people, when I work with people, when I talk to people, I'm always looking for where their passion lies, what sparks.

121
00:07:16,351 --> 00:07:18,613
[Michael Helbling]: kind of what makes their eyes light up.

122
00:07:19,214 --> 00:07:30,646
[Michael Helbling]: And one of our episodes that I really enjoyed and it was a person I'd wanted to get on the show for a long time was Dan McCarthy, which we did episode 272 about calculated and complex metrics.

123
00:07:30,666 --> 00:07:43,219
[Michael Helbling]: It was a really fun conversation and Dan is so smart and so amazing in his role as a professor in studying these companies and the metrics they produce, especially for public reporting, for

124
00:07:43,199 --> 00:07:44,560
[Michael Helbling]: stock reporting purposes.

125
00:07:45,241 --> 00:07:52,628
[Michael Helbling]: But what was amazing was the passion he has for these topics through music.

126
00:07:52,768 --> 00:07:55,330
[Michael Helbling]: And he has a SoundCloud with all these songs on it.

127
00:07:55,811 --> 00:07:59,274
[Michael Helbling]: And it was sort of after the show was over that he kind of started in on it.

128
00:07:59,735 --> 00:08:06,981
[Michael Helbling]: But that was sort of where I saw the switch kind of flip into "this is fun," and a little bit of light came up in his eyes about that kind of thing.

129
00:08:07,001 --> 00:08:09,243
[Michael Helbling]: And I'm sure obviously he enjoys his other work too.

130
00:08:09,304 --> 00:08:12,987
[Michael Helbling]: But it was just really cool to kind of connect with

131
00:08:12,967 --> 00:08:19,659
[Michael Helbling]: In the coolest way possible, another data nerd about things they loved about their work and about data.

132
00:08:20,300 --> 00:08:22,985
[Michael Helbling]: Anyway, so that was just a moment that kind of stood out to me.

133
00:08:23,085 --> 00:08:31,060
[Michael Helbling]: As far as being a really educational and fun episode, it was just so cool to watch somebody's eyes light up about things they were passionate about.

134
00:08:31,867 --> 00:08:39,120
[Moe Kiss]: I learned so much on that episode and I even probably like a week ago sent it to someone to have a listen.

135
00:08:39,741 --> 00:08:50,299
[Moe Kiss]: The number of times I get questions about LTV to CAC and, like, why finance and public companies are so interested in that specific metric and how it's calculated.

136
00:08:50,339 --> 00:08:53,445
[Moe Kiss]: I'm just like, here is a show that I prepared earlier.

137
00:08:53,505 --> 00:08:55,849
[Moe Kiss]: Please peruse at your own leisure.

138
00:08:55,829 --> 00:09:18,951
[Moe Kiss]: And I just loved how he did such a wonderful job of really getting into the, I guess, the different perspectives and the complexities that we sometimes face as data folks in a metric that on its surface might seem really simple and obvious, but actually

139
00:09:18,931 --> 00:09:25,818
[Moe Kiss]: can really change a business decision or a perspective of a business by how it's calculated and how it's interpreted.

140
00:09:25,838 --> 00:09:35,609
[Moe Kiss]: And also just to say, like his SoundCloud, the number of data show and tells that I've opened with one of those songs, and people are always like, Moe, where do you get these data songs?

141
00:09:35,989 --> 00:09:36,690
[Moe Kiss]: I'm like, blah.

142
00:09:37,171 --> 00:09:38,332
[Moe Kiss]: I know people.

143
00:09:39,413 --> 00:09:40,314
[Moe Kiss]: I know people.

144
00:09:40,334 --> 00:09:45,900
[Moe Kiss]: So yeah, I definitely had that in my top couple of episode list as well.

145
00:09:46,522 --> 00:09:49,388
[Tim Wilson]: Well, that was like my finding him.

146
00:09:49,528 --> 00:09:52,314
[Tim Wilson]: So I now like see more of his stuff.

147
00:09:52,374 --> 00:10:04,480
[Tim Wilson]: And he made the point on that episode, and then he kind of continues to make it that when companies stop reporting stuff, it's not usually for... Sometimes that's informative.

148
00:10:04,460 --> 00:10:05,863
[Tim Wilson]: Yeah, that in and of itself.

149
00:10:06,224 --> 00:10:09,191
[Tim Wilson]: And there's some kind of hand-waving as to why.

150
00:10:09,211 --> 00:10:17,750
[Tim Wilson]: And he's like, but another way to look at it would be, here's this thing I wrote two years ago that indicated this might be problematic.

151
00:10:17,911 --> 00:10:21,118
[Tim Wilson]: So yeah, he was a fun one.

152
00:10:21,385 --> 00:10:38,314
[Josh Crowhurst]: So on the topic of things that people are passionate about, I think one of the episodes that I absolutely loved and maybe is a bit in line with something that I'm really passionate about was number 282, using and creating data to understand pop culture with Chris Dalla Riva.

153
00:10:38,775 --> 00:10:43,984
[Josh Crowhurst]: So for me, this was honestly probably my favorite episode ever.

154
00:10:43,964 --> 00:11:08,892
[Josh Crowhurst]: because it's so right up my alley, like it's in my backyard. He's talking about looking up writing credits and production credits on songs and tracking that, and this is something that I just do impulsively. Like, I'm always annoying my friends with pointless surprising facts about songs.

155
00:11:08,872 --> 00:11:19,165
[Josh Crowhurst]: Like, did you know Bruno Mars co-wrote Forget You or like, I don't know, Mark Ronson produced and wrote that song from A Star Is Born?

156
00:11:19,506 --> 00:11:20,927
[Josh Crowhurst]: Like, just like shit like that.

157
00:11:20,968 --> 00:11:24,652
[Josh Crowhurst]: I'm just always, I'm always looking behind and saying, like, who's involved in that song?

158
00:11:25,333 --> 00:11:31,521
[Josh Crowhurst]: And the idea that there are just people behind the scenes that maybe don't have mainstream name recognition in a lot of cases, but have

159
00:11:31,501 --> 00:11:39,789
[Josh Crowhurst]: really shaped what you're hearing on the radio or on Spotify for, you know, sometimes for decades.

160
00:11:40,230 --> 00:11:44,154
[Josh Crowhurst]: And so, yeah, Chris talks about tracking that and having that in a data set.

161
00:11:44,654 --> 00:11:50,100
[Josh Crowhurst]: And I wish I could get my hands on that data because I would absolutely just be poring over it.

162
00:11:50,260 --> 00:11:50,760
[Josh Crowhurst]: Oh, it's there.

163
00:11:50,800 --> 00:11:52,502
[Josh Crowhurst]: It's in the show notes.

164
00:11:52,902 --> 00:11:53,323
[Tim Wilson]: You can.

165
00:11:53,643 --> 00:11:55,325
[Tim Wilson]: It's on the show notes page.

166
00:11:55,445 --> 00:11:55,925
[Tim Wilson]: Oh, my God.

167
00:11:56,806 --> 00:12:01,491
[Josh Crowhurst]: Yeah, we found out.

168
00:12:02,028 --> 00:12:02,449
[Josh Crowhurst]: I'm out.

169
00:12:02,549 --> 00:12:03,030
[Josh Crowhurst]: Yeah.

170
00:12:03,050 --> 00:12:03,150
[Josh Crowhurst]: Okay.

171
00:12:03,330 --> 00:12:03,992
[Michael Helbling]: I'm dying.

172
00:12:05,174 --> 00:12:06,677
[Michael Helbling]: Josh "Liner Notes"

173
00:12:06,797 --> 00:12:10,524
[Michael Helbling]: Crowhurst.

174
00:12:10,544 --> 00:12:12,668
[Tim Wilson]: And Michael, you really enjoyed recording that show.

175
00:12:12,688 --> 00:12:13,890
[Tim Wilson]: Is that, is that right, Michael?

176
00:12:14,110 --> 00:12:14,872
[Michael Helbling]: You know what?

177
00:12:14,952 --> 00:12:18,298
[Michael Helbling]: Thank you so much, Tim, for bringing up a sore point.

178
00:12:20,455 --> 00:12:27,374
[Michael Helbling]: I just find it hilarious after 11 years of you basically being like, I don't know anything about pop culture.

179
00:12:27,414 --> 00:12:32,288
[Michael Helbling]: Like, you record that episode instead of me? Like, come on.

180
00:12:35,153 --> 00:12:36,194
[Tim Wilson]: I read his newsletter.

181
00:12:36,234 --> 00:12:38,417
[Tim Wilson]: No, that was a fair, fair.

182
00:12:38,437 --> 00:12:39,598
[Tim Wilson]: Anyways, it was.

183
00:12:39,718 --> 00:12:43,563
[Val Kroll]: We're gonna have to rename the show, Year in Review and Airing of Grievances.

184
00:12:44,324 --> 00:12:45,886
[Michael Helbling]: This is right.

185
00:12:45,906 --> 00:12:48,149
[Michael Helbling]: It's the Festivus Airing of Grievances.

186
00:12:48,970 --> 00:12:51,272
[Tim Wilson]: Which Chris's book is now out.

187
00:12:51,372 --> 00:12:56,899
[Tim Wilson]: It was not out when we recorded the episode, but it is now.

188
00:12:56,979 --> 00:13:02,646
[Tim Wilson]: So also, if you're like somebody, love someone so much that you want to get them Analytics the Right Way,

189
00:13:02,626 --> 00:13:11,886
[Tim Wilson]: and a second book, Uncharted Territory, is now available at booksellers near you.

190
00:13:11,906 --> 00:13:14,391
[Tim Wilson]: Still available by Boxing Day, probably.

191
00:13:15,890 --> 00:13:17,593
[Moe Kiss]: Did you guys have Boxing Day?

192
00:13:17,753 --> 00:13:23,702
[Michael Helbling]: No, but it's the day after Christmas, so you have one more day, so maybe it'll shift the time.

193
00:13:23,722 --> 00:13:24,043
[Tim Wilson]: I don't know.

194
00:13:24,203 --> 00:13:29,471
[Tim Wilson]: Everybody has some pretentious neighbor who celebrates Boxing Day, so they can explain to you what it is.

195
00:13:29,491 --> 00:13:30,292
[Tim Wilson]: Boxing Day is awesome.

196
00:13:30,312 --> 00:13:33,417
[Moe Kiss]: You have leftover food and none of the pressure of Christmas Day.

197
00:13:33,838 --> 00:13:34,138
[Tim Wilson]: Right.

198
00:13:34,178 --> 00:13:39,366
[Tim Wilson]: Now, imagine that coming out of an American who's just explaining how sophisticated they are.

199
00:13:39,565 --> 00:13:45,133
[Michael Helbling]: Well, I obviously, with these book recommendations, I would think you'd be talking about Jólabókaflóð.

200
00:13:45,153 --> 00:13:46,936
[Michael Helbling]: So maybe that's the holiday.

201
00:13:46,996 --> 00:13:48,418
[Michael Helbling]: What?

202
00:13:48,939 --> 00:13:49,500
[Michael Helbling]: Not familiar.

203
00:13:49,540 --> 00:13:50,221
[Michael Helbling]: Sorry.

204
00:13:50,842 --> 00:13:55,208
[Michael Helbling]: And it's an Icelandic holiday where you read books right before Christmas.

205
00:13:55,749 --> 00:13:56,570
[Michael Helbling]: So there you go.

206
00:13:56,831 --> 00:14:01,017
[Moe Kiss]: I was about to say, should I, like, pivot us in a totally different direction and talk about the elephant in the room?

207
00:14:02,228 --> 00:14:03,551
[Michael Helbling]: Oh, yeah.

208
00:14:04,272 --> 00:14:06,356
[Michael Helbling]: I mean... What?

209
00:14:06,416 --> 00:14:09,583
[Moe Kiss]: How many episodes you reckon AI came up in?

210
00:14:09,603 --> 00:14:11,506
[Moe Kiss]: Oh, damn it.

211
00:14:11,546 --> 00:14:13,270
[Moe Kiss]: I should have actually been prepared.

212
00:14:13,310 --> 00:14:13,871
[Moe Kiss]: Hold on.

213
00:14:13,891 --> 00:14:15,995
[Moe Kiss]: And kept transcripts or some shit.

214
00:14:16,015 --> 00:14:16,937
[Moe Kiss]: That would have been a good idea.

215
00:14:16,957 --> 00:14:18,921
[Val Kroll]: Yeah, use your librarian thing, Michael.

216
00:14:20,268 --> 00:14:22,892
[Michael Helbling]: Yeah, well, we don't have every episode uploaded yet.

217
00:14:22,912 --> 00:14:24,554
[Michael Helbling]: So it's still a work in progress.

218
00:14:24,574 --> 00:14:30,442
[Michael Helbling]: But thank you, Val, for bringing that up, because it's an AI project that Tim and I are working on.

219
00:14:30,482 --> 00:14:35,789
[Michael Helbling]: But I've got to say, Moe, it came up in probably 75% of our episodes.

220
00:14:37,031 --> 00:14:37,691
[Moe Kiss]: You reckon 75%?

221
00:14:37,832 --> 00:14:39,253
[Moe Kiss]: Everyone put in a guess.

222
00:14:40,735 --> 00:14:41,917
[Moe Kiss]: I would say maybe higher.

223
00:14:41,977 --> 00:14:45,922
[Tim Wilson]: No, I think I'd go 70.

224
00:14:46,123 --> 00:14:47,845
[Tim Wilson]: I mean, I'm counting.

225
00:14:48,922 --> 00:14:52,508
[Val Kroll]: Between whether it was the topic or it just came up.

226
00:14:52,728 --> 00:14:55,392
[Val Kroll]: If it just came up or last calls.

227
00:14:55,632 --> 00:14:56,654
[Josh Crowhurst]: Do last calls count?

228
00:14:56,694 --> 00:14:57,816
[Josh Crowhurst]: They do in my head.

229
00:14:57,836 --> 00:15:00,400
[Val Kroll]: That's why I got to my number.

230
00:15:00,860 --> 00:15:04,286
[Michael Helbling]: I mean, there's at least 10 episodes that have AI in the title.

231
00:15:05,307 --> 00:15:06,569
[Michael Helbling]: I'm going to say 90%.

232
00:15:07,190 --> 00:15:08,833
[Moe Kiss]: Yeah, it was a lot.

233
00:15:09,354 --> 00:15:12,799
[Moe Kiss]: Let's leave everyone hanging, and we can report back at a future date.

234
00:15:12,819 --> 00:15:13,380
[Michael Helbling]: That's right.

235
00:15:14,021 --> 00:15:16,785
[Michael Helbling]: Guess how many jelly beans are in the AI jar?

236
00:15:17,879 --> 00:15:22,569
[Tim Wilson]: So I'd like to go on record that I did not commit to it being reported out at some future date.

237
00:15:22,649 --> 00:15:25,575
[Tim Wilson]: So I think the likelihood of that happening is...

238
00:15:25,735 --> 00:15:30,926
[Val Kroll]: If any of our listeners want to figure it out, sound off in the comments.

239
00:15:30,946 --> 00:15:36,597
[Michael Helbling]: If only we had a producer who could go back through.

240
00:15:40,542 --> 00:15:48,630
[Michael Helbling]: You know, Tim, as we clink champagne glasses on another successful year of the podcast, I think our listeners would agree that you and I almost always agree on things.

241
00:15:50,151 --> 00:15:50,972
[Tim Wilson]: What?

242
00:15:50,992 --> 00:15:51,973
[Tim Wilson]: Absolutely not.

243
00:15:52,614 --> 00:15:57,158
[Tim Wilson]: I spend half or most of my time on this show, I think, just correcting your misguided thinking.

244
00:15:57,959 --> 00:15:59,020
[Michael Helbling]: Well, agree to disagree.

245
00:15:59,340 --> 00:16:01,923
[Michael Helbling]: But there is one thing we both agree on.

246
00:16:02,543 --> 00:16:04,686
[Michael Helbling]: AI is starting to reshape our industry.

247
00:16:05,186 --> 00:16:09,250
[Michael Helbling]: And I think we both call bullshit on nonsense like vibe analytics.

248
00:16:09,230 --> 00:16:10,852
[Michael Helbling]: Absolutely fucking right.

249
00:16:11,452 --> 00:16:12,534
[Michael Helbling]: But here's the flip side.

250
00:16:12,754 --> 00:16:14,956
[Michael Helbling]: Analysts do have to start using AI.

251
00:16:15,096 --> 00:16:19,201
[Michael Helbling]: Leveraging LLMs to multiply your capabilities isn't just interesting anymore.

252
00:16:19,761 --> 00:16:21,824
[Michael Helbling]: It's going to be table stakes in 2026.

253
00:16:22,304 --> 00:16:26,048
[Tim Wilson]: Which is why I'm actually excited about our new sponsor, Ask Why.

254
00:16:26,068 --> 00:16:30,773
[Tim Wilson]: Yes, it's an AI tool, but it's one where analysts can do real work.

255
00:16:31,274 --> 00:16:34,938
[Tim Wilson]: And critically, Ask Why is smart about data privacy.

256
00:16:35,078 --> 00:16:38,221
[Tim Wilson]: They do not send your raw data to the LLM.

257
00:16:38,622 --> 00:16:38,882
[Tim Wilson]: Right.

258
00:16:38,862 --> 00:16:47,512
[Michael Helbling]: Ask Why builds a semantic layer on top of your data and then uses that to generate SQL that answers your questions or helps you build reports on your own data set.

259
00:16:48,113 --> 00:16:54,320
[Michael Helbling]: It's currently in beta and it's evolving fast, but you get the upside of AI and the assurance that your data stays secure.

260
00:16:54,961 --> 00:17:00,027
[Michael Helbling]: You can actually start leveling up into being an AI analyst, starting with Ask Why.

261
00:17:00,007 --> 00:17:08,020
[Tim Wilson]: For a limited time, use the code APH when you join the waitlist, and our friends at Ask Why will move you right to the top of that list.

262
00:17:08,220 --> 00:17:11,485
[Tim Wilson]: The site is ask-y.ai.

263
00:17:11,866 --> 00:17:13,488
[Michael Helbling]: That's ask-y.ai.

264
00:17:15,792 --> 00:17:19,097
[Michael Helbling]: So go sign up for the waitlist using code APH.

265
00:17:19,618 --> 00:17:20,760
[Tim Wilson]: This isn't Vibe Analytics.

266
00:17:20,960 --> 00:17:23,384
[Tim Wilson]: This is the rise of the AI analyst.

267
00:17:23,825 --> 00:17:25,407
[Michael Helbling]: All right, let's get back to the show.

268
00:17:27,800 --> 00:17:37,433
[Michael Helbling]: Yeah, it is interesting because it certainly, I mean, Moe, I think the point you're making is like AI was everywhere and always here all year long in 2025.

269
00:17:38,755 --> 00:17:42,560
[Michael Helbling]: And it seemed to grow in speed and pace throughout the year.

270
00:17:44,462 --> 00:17:51,953
[Val Kroll]: Yeah, definitely a topic that came up in the listener survey is people wanting to, wanting it covered, wanting some topics covered there.

271
00:17:52,033 --> 00:17:56,018
[Val Kroll]: So I think that crept into our schedule and informed it.

272
00:17:56,234 --> 00:18:10,632
[Tim Wilson]: And as my other hat as the fielder of the inbound pitches for show topics, I can certainly say that that percentage was definitely north of 75%.

273
00:18:11,320 --> 00:18:34,729
[Tim Wilson]: But is it fair to say, and maybe this is my normally optimistic self that you guys are so familiar with, that at the start of the year, specifically in the world of data and analytics, the ratio was like north of 90% AI

274
00:18:34,709 --> 00:18:56,194
[Tim Wilson]: hype excitement to the "wait a minute, guys, it's not going to be everything," and that it's slowly gotten a little bit more in balance as the conversation in the zeitgeist around what AI can and can't do has shifted, as people have gotten their hands on it and realized limitations. Or is that just me?

275
00:18:56,782 --> 00:18:58,064
[Moe Kiss]: I think that's fair.

276
00:18:58,124 --> 00:19:00,448
[Michael Helbling]: It has come back a little.

277
00:19:00,468 --> 00:19:04,874
[Michael Helbling]: I still think we're a little out over our skis, though somewhat in terms of AI.

278
00:19:04,894 --> 00:19:08,259
[Michael Helbling]: I mean, just AI in general, like a lot of people think we're in a bubble.

279
00:19:08,980 --> 00:19:17,734
[Michael Helbling]: By the time this comes out, hopefully the stock market hasn't crashed or anything, but that's always a thing that people are talking about.

280
00:19:17,754 --> 00:19:19,036
[Michael Helbling]: It's like, oh, is this all a bubble?

281
00:19:19,677 --> 00:19:23,262
[Michael Helbling]: And like the dot-com boom and bust, kind of an idea.

282
00:19:23,900 --> 00:19:30,448
[Val Kroll]: I think it's like with any trendy thing, it's like cool to think of all the use cases and all the potential.

283
00:19:30,528 --> 00:19:34,393
[Val Kroll]: And then the cool thing is to be like, but you can't do this, can't do that.

284
00:19:34,453 --> 00:19:37,556
[Val Kroll]: So like, I feel like we're in that phase of like the LinkedIn.

285
00:19:37,737 --> 00:19:39,499
[Val Kroll]: Like I just get so tired, you know?

286
00:19:39,519 --> 00:19:42,823
[Michael Helbling]: It's like the 50th time you hear "they not like us."

287
00:19:42,883 --> 00:19:43,864
[Michael Helbling]: And you're like, no.

288
00:19:46,527 --> 00:19:48,650
[Tim Wilson]: I did see a thing where somebody- That was good, Michael.

289
00:19:49,971 --> 00:19:51,493
[Val Kroll]: That was good, Michael.

290
00:19:54,663 --> 00:20:02,117
[Tim Wilson]: I read a piece that was saying that instead of a bubble, think of it as a forest fire, though it actually has a lot of bubble tendencies.

291
00:20:02,598 --> 00:20:14,340
[Tim Wilson]: Well, but it talks about, even if you go back to the internet, the original, the 2000 internet bubble, that it was pointing out that it's like the bubble burst and it's not like you're back where you started.

292
00:20:14,360 --> 00:20:15,743
[Tim Wilson]: There are

293
00:20:15,723 --> 00:20:26,257
[Tim Wilson]: players that were sufficiently hardy and actually had a plan; they were like the big trees that managed to weather it.

294
00:20:26,277 --> 00:20:31,144
[Tim Wilson]: And they're like, yeah, Google, Apple, Microsoft, they're not going anywhere if the bubble bursts.

295
00:20:31,164 --> 00:20:43,300
[Tim Wilson]: And then it talked about the ones that are basically just a thin veneer of crap; those are just going to disappear. But when that correction comes, there will be

296
00:20:43,280 --> 00:20:53,031
[Tim Wilson]: a smarter universe out there, and there will be little shoots that come out of it, I think they referred to them as little green shoots, that will crop up.

297
00:20:53,071 --> 00:20:54,713
[Tim Wilson]: Once all that sort of gets cleared out.

298
00:20:55,353 --> 00:20:58,157
[Tim Wilson]: It seemed like a useful metaphor.

299
00:20:58,997 --> 00:21:01,200
[Tim Wilson]: Involved metaphor.

300
00:21:01,620 --> 00:21:13,033
[Tim Wilson]: But I also find it's crazy, just having conversations with normies, where the person who doesn't

301
00:21:13,013 --> 00:21:22,546
[Tim Wilson]: have any real responsibility to figure it out, how little depth of thought there is.

302
00:21:22,686 --> 00:21:29,014
[Tim Wilson]: They're just, I had a friend say, I just use ChatGPT instead of Google search now.

303
00:21:29,255 --> 00:21:39,949
[Tim Wilson]: And I was like, I don't have the energy to say you could just use Google search and it would be Gemini, if you just want plain text results. And that's kind of the extent of what they're doing.

304
00:21:40,891 --> 00:21:43,234
[Tim Wilson]: although I could also go on some rants as well.

305
00:21:44,555 --> 00:21:52,665
[Michael Helbling]: You know, it was interesting to me this year when I would go to different events and like conferences or things like that and see the pace.

306
00:21:53,205 --> 00:21:57,210
[Michael Helbling]: Like, I remember going to MeasureCamp New York in the spring.

307
00:21:58,151 --> 00:22:03,217
[Michael Helbling]: And of course, everyone was talking about AI this, AI that, and it was all kind of like, wow, look at all this cool stuff.

308
00:22:03,658 --> 00:22:09,885
[Michael Helbling]: And then literally from then to the fall and MeasureCamp Chicago, I felt like,

309
00:22:09,865 --> 00:22:17,353
[Michael Helbling]: we'd already gone through a maturity curve almost with the way we're discussing AI and some of its use cases.

310
00:22:17,673 --> 00:22:24,900
[Michael Helbling]: It just seemed like we're blasting through the cycle really fast sometimes.

311
00:22:25,060 --> 00:22:38,674
[Michael Helbling]: Some places, there's still quite a bit of hype, but I do think some people are getting their feet on the ground and starting to use it for actual things and starting to understand how to leverage or how to think through use cases effectively.

312
00:22:39,093 --> 00:22:44,318
[Moe Kiss]: So that was literally the thing that has been on my mind when I was looking at the episodes that were my favorite.

313
00:22:45,239 --> 00:22:49,222
[Moe Kiss]: It's probably recency bias, but they were definitely the ones towards the end of the year.

314
00:22:49,262 --> 00:23:05,537
[Moe Kiss]: Well, I suppose they weren't all the end of the year, but like the semantic layer episodes, I thought the topics on BI with Colin were really good and then also loved the one on Bayesian stats with Michael Kaminsky.

315
00:23:05,937 --> 00:23:09,100
[Moe Kiss]: But part of me wondered, I just felt like,

316
00:23:09,080 --> 00:23:12,285
[Moe Kiss]: There was this return to us discussing.

317
00:23:12,306 --> 00:23:15,892
[Moe Kiss]: I want to say quote unquote the basics, but it's not basics.

318
00:23:16,032 --> 00:23:17,935
[Moe Kiss]: It's the fundamentals of data stuff.

319
00:23:18,536 --> 00:23:23,264
[Moe Kiss]: And is the reason we were discussing that because everyone's trying to go so fast on AI?

320
00:23:23,725 --> 00:23:29,535
[Moe Kiss]: There was this like not reckoning, but like acknowledgement that to do that well.

321
00:23:29,954 --> 00:23:36,282
[Moe Kiss]: I don't want to be like the usual shit of data, bad data in, bad data out, blah, blah, blah, that sort of crap.

322
00:23:36,302 --> 00:23:40,146
[Moe Kiss]: But I felt like I've been giving a lot of thought and energy,

323
00:23:40,166 --> 00:23:46,995
[Moe Kiss]: and I feel like folks in the industry are too, about the quality and how we do things well and how we measure if the output is good.

324
00:23:47,155 --> 00:23:55,605
[Moe Kiss]: And that, by its nature, means we have to have more sophisticated conversations about fundamental data concepts.

325
00:23:56,046 --> 00:23:57,608
[Moe Kiss]: And I felt like there was a return to that.

326
00:23:57,688 --> 00:23:58,729
[Moe Kiss]: And maybe that's

327
00:23:59,435 --> 00:24:08,458
[Moe Kiss]: similar to what you're talking about, Michael, where there was kind of a bit of a rush, and then people are having more sophisticated discussions. That's probably a good summary.

328
00:24:09,180 --> 00:24:09,501
[Michael Helbling]: Yeah.

329
00:24:09,581 --> 00:24:13,150
[Michael Helbling]: No, I like that framing because I think that's exactly right.

330
00:24:13,170 --> 00:24:14,052
[Michael Helbling]: It's sort of like

331
00:24:15,162 --> 00:24:20,648
[Michael Helbling]: The early thing I saw was like, well, your own expertise drives results in AI all the time.

332
00:24:20,849 --> 00:24:39,330
[Michael Helbling]: But it's sort of like, OK, if you go down to some brass tacks about how to conduct analysis, how to think about data lineage, how to think about traceability, all the things that we were taught as analysts to be able to compose an analysis correctly, follow it through correctly, and deliver

333
00:24:39,310 --> 00:24:44,135
[Michael Helbling]: out the other side, those are all steps we learned as analysts.

334
00:24:44,155 --> 00:24:52,924
[Michael Helbling]: And so AI is a part of that process now, but we still have to maintain all of those parts along the way, it feels like.

335
00:24:53,665 --> 00:24:55,147
[Michael Helbling]: Does that, I don't know.

336
00:24:55,367 --> 00:25:09,081
[Michael Helbling]: And maybe AI will get so good, it can do all those steps for us at some point, but I just don't think there's gonna be, any time in the near future, an appropriate black-box approach to analysis.

337
00:25:09,297 --> 00:25:16,386
[Michael Helbling]: which, don't get me started on the topic of vibe analytics, which is the stupidest thing I've ever heard of in my life.

338
00:25:16,947 --> 00:25:19,671
[Moe Kiss]: Well, I think we need to do a spin-off episode on that because I disagree.

339
00:25:20,291 --> 00:25:28,302
[Michael Helbling]: Well, it's probably definitional or semantic; we're probably in agreement, but yeah, we can probably do a whole show on it.

340
00:25:29,584 --> 00:25:31,466
[Michael Helbling]: Well, what other?

341
00:25:32,087 --> 00:25:37,414
[Josh Crowhurst]: This is something that I've also noticed, I think is kind of related is that

342
00:25:38,356 --> 00:25:58,677
[Josh Crowhurst]: that using AI, it really drives home to me that, especially as a manager, you need to have your critical thinking skills switched on, because things will start to come up, produced especially by people more junior in their careers, who are

343
00:25:58,657 --> 00:25:59,941
[Josh Crowhurst]: I guess more AI

344
00:25:59,961 --> 00:26:12,598
[Josh Crowhurst]: native, will be using this and might sometimes skip some of the steps in producing an analysis, and they'll come up with something that sounds

345
00:26:13,405 --> 00:26:14,286
[Josh Crowhurst]: really logical.

346
00:26:14,786 --> 00:26:28,359
[Josh Crowhurst]: But maybe, you know, they had a conclusion in mind, punched it into ChatGPT, and worked backwards to arrive at some logic to present an idea that maybe hasn't been fully thought through.

347
00:26:28,399 --> 00:26:33,023
[Josh Crowhurst]: So this is something where I think we have to be super, super aware of it, right?

348
00:26:34,125 --> 00:26:39,970
[Josh Crowhurst]: That there's a lot of, I guess, convincing sounding bullshit, where if you

349
00:26:39,950 --> 00:27:06,210
[Josh Crowhurst]: go one layer deeper, the thinking just isn't there. So, coming back to the idea of having the fundamentals, but also just being aware that this is around us all the time, and trying to really focus on: is the logic sound?

350
00:27:06,460 --> 00:27:20,421
[Tim Wilson]: When it gets used as, this is something that I don't enjoy doing: AI gets put out there as being, oh, the grunt and tedious work that you do, AI can do that.

351
00:27:20,481 --> 00:27:33,902
[Tim Wilson]: Now, I think that's an overinflation, like how many people are literally sitting there saying, I do monotonous, tedious, repetitive work day in and day out, and no one has come out with a way to

352
00:27:33,882 --> 00:27:34,643
[Tim Wilson]: streamline that.

353
00:27:34,763 --> 00:27:45,233
[Tim Wilson]: So this monotonous, tedious work gets conflated with, this is work that I don't really enjoy, or I have to kind of think about it.

354
00:27:45,353 --> 00:27:51,439
[Tim Wilson]: I hate summarizing meetings that are all over the place.

355
00:27:51,639 --> 00:27:55,343
[Tim Wilson]: Oh, look, Zoom will just record and summarize for me.

356
00:27:55,683 --> 00:28:00,228
[Tim Wilson]: And it's like, well, you may hate doing that, but you're missing what sort of value you should be adding along the way.

357
00:28:00,248 --> 00:28:02,590
[Tim Wilson]: And I think the same thing goes for

358
00:28:02,570 --> 00:28:13,874
[Tim Wilson]: If you think that the goal is to get a slide deck produced that looks plausible, then you're missing what analysis is.

359
00:28:13,994 --> 00:28:22,232
[Tim Wilson]: There is stuff that is supposed to be hard and that you are having to think through it with that structure as you go.

360
00:28:22,212 --> 00:28:34,618
[Michael Helbling]: Yeah, I want to step aside for a quick second and take a quick break with our friend Michael Kaminsky from Recast, the media mix modeling and GeoLift platform helping teams forecast accurately and make better decisions.

361
00:28:35,259 --> 00:28:40,009
[Michael Helbling]: Michael's sharing bite-sized marketing science lessons over the coming months to help you measure smarter.

362
00:28:41,051 --> 00:28:41,993
[Michael Helbling]: Over to you, Michael.

363
00:28:45,820 --> 00:28:49,306
[Michael Kaminsky (Recast)]: Granger causality might be the worst-named concept in analytics.

364
00:28:49,726 --> 00:28:53,593
[Michael Kaminsky (Recast)]: What you need to know is that Granger causality does not demonstrate causality.

365
00:28:54,054 --> 00:28:59,483
[Michael Kaminsky (Recast)]: Just because some variable passes a Granger check does not mean that it causes some other variable.

366
00:28:59,823 --> 00:29:02,748
[Michael Kaminsky (Recast)]: What Granger causality actually shows is predictive ability.

367
00:29:03,149 --> 00:29:07,035
[Michael Kaminsky (Recast)]: Effectively, the check is looking to see if past values of x can predict y.

368
00:29:07,015 --> 00:29:08,797
[Michael Kaminsky (Recast)]: better than past values of y alone.

369
00:29:09,157 --> 00:29:11,399
[Michael Kaminsky (Recast)]: As an example, let's imagine we have two time series.

370
00:29:11,659 --> 00:29:15,363
[Michael Kaminsky (Recast)]: One is the time that a rooster crows every morning, and the second is the time of the sunrise.

371
00:29:15,763 --> 00:29:20,127
[Michael Kaminsky (Recast)]: By just eyeballing the data, we can see that the rooster crows consistently a bit before sunrise.

372
00:29:20,587 --> 00:29:26,733
[Michael Kaminsky (Recast)]: Yet, a Granger causality test would conclude that rooster crows Granger cause the sun to come up every morning.

373
00:29:27,093 --> 00:29:28,595
[Michael Kaminsky (Recast)]: The problem is really in the name.

374
00:29:28,895 --> 00:29:36,702
[Michael Kaminsky (Recast)]: It confuses analysts and especially business stakeholders who, understandably, assume that a Granger causality test actually checks for causality.

375
00:29:36,682 --> 00:29:41,927
[Michael Kaminsky (Recast)]: Here's what to remember: Granger causality only tests whether one variable precedes and helps predict another.

376
00:29:42,308 --> 00:29:45,211
[Michael Kaminsky (Recast)]: It says nothing about whether one actually causes the other.

377
00:29:45,811 --> 00:29:46,352
[Michael Helbling]: Thanks, Michael.

378
00:29:46,993 --> 00:29:53,700
[Michael Helbling]: And for those who haven't heard, our friends at Recast just launched their new incrementality testing platform, GeoLift by Recast.

379
00:29:53,780 --> 00:30:00,126
[Michael Helbling]: It's a simple, powerful way for marketing and data teams to measure the true impact of their advertising spend.

380
00:30:00,246 --> 00:30:05,592
[Michael Helbling]: And even better, you can use it completely free for six months, just visit

381
00:30:05,572 --> 00:30:10,781
[Michael Helbling]: getrecast.com slash geolift to start your trial today.

382
00:30:10,841 --> 00:30:20,698
[Michael Helbling]: Okay, well, let's talk about shows we liked that maybe didn't touch, or didn't touch fully, on AI.

383
00:30:20,738 --> 00:30:25,306
[Michael Helbling]: What are some topics we liked this year that weren't necessarily in the AI wheelhouse?

384
00:30:25,346 --> 00:30:30,415
[Michael Helbling]: And Moe, this kind of comes off of you talking about this fundamentals idea.

385
00:30:32,318 --> 00:30:39,491
[Val Kroll]: One of the ones that I had FOMO for not being on was the ANOVA, I Hardly Know Ya episode with Chelsea.

386
00:30:39,511 --> 00:30:40,092
[Val Kroll]: Oh, that was so good.

387
00:30:40,112 --> 00:30:42,216
[Val Kroll]: That one was so good.

388
00:30:42,236 --> 00:30:43,759
[Val Kroll]: I mean, she's just a joy.

389
00:30:43,799 --> 00:30:49,129
[Val Kroll]: But I don't know if you guys remember, one of the things that you guys started with on the episode is that she had a poem.

390
00:30:49,109 --> 00:30:54,436
[Val Kroll]: from pre-ChatGPT times, a Twitter feed poem about ANOVA, which I loved.

391
00:30:56,138 --> 00:31:05,630
[Val Kroll]: But she was just so thoughtful in the way that she was describing and getting into all the inner workings and the comparisons with ANCOVA and MANOVA.

392
00:31:06,331 --> 00:31:09,195
[Val Kroll]: She's like, at the end of the day, it's linear regression all the way down.
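That "linear regression all the way down" line can be verified in a few lines of numpy. Below is a minimal sketch (mine, not from the episode) showing that the one-way ANOVA F-statistic is exactly the F-test of an ordinary least squares fit on group dummy variables.

```python
# One-way ANOVA computed two ways: classic sums of squares, and as an
# OLS regression on group indicators. The F-statistics match exactly.
import numpy as np

rng = np.random.default_rng(0)
groups = [rng.normal(loc, 1.0, 30) for loc in (0.0, 0.5, 1.0)]  # 3 groups of 30
y = np.concatenate(groups)
labels = np.repeat([0, 1, 2], 30)

# Classic ANOVA: between-group vs within-group mean squares.
k, n = 3, y.size
grand = y.mean()
ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
f_anova = (ss_between / (k - 1)) / (ss_within / (n - k))

# Same thing as regression: intercept plus dummies for groups 1 and 2.
X = np.column_stack([np.ones(n)] + [(labels == j).astype(float) for j in (1, 2)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1 - (resid ** 2).sum() / ((y - grand) ** 2).sum()
f_ols = (r2 / (k - 1)) / ((1 - r2) / (n - k))

# f_anova and f_ols agree to floating-point precision:
# ANOVA really is linear regression with categorical predictors.
```

The same identity is why ANCOVA and MANOVA fall out of the general linear model as well: they just add continuous covariates or multiple outcomes to the regression.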

393
00:31:09,275 --> 00:31:17,445
[Val Kroll]: And I thought you guys did a really nice job probing with some really good questions, very thoughtful ones from real-life experiences, that I thought

394
00:31:17,847 --> 00:31:19,011
[Val Kroll]: made that episode really good.

395
00:31:19,894 --> 00:31:23,045
[Val Kroll]: I've definitely listened to that one more than once this year, but that was really fun.

396
00:31:23,627 --> 00:31:27,300
[Val Kroll]: It was an easy listen, even though it's a complex topic.

397
00:31:28,242 --> 00:31:30,586
[Tim Wilson]: I'll throw in the episode 268.

398
00:31:30,926 --> 00:31:45,771
[Tim Wilson]: You Get an Insight, and You Get an Insight, with Chris Kocek, which was, I would say, very not AI, because it was so much about a human being pulling things from different directions.

399
00:31:46,331 --> 00:31:47,193
[Tim Wilson]: And that wasn't the first.

400
00:31:47,273 --> 00:31:51,600
[Tim Wilson]: We had Rod Jacka on

401
00:31:52,475 --> 00:31:55,820
[Tim Wilson]: years ago to talk about what is an insight.

402
00:31:55,840 --> 00:31:59,586
[Tim Wilson]: So I feel like that's a perpetual question in our industry.

403
00:31:59,666 --> 00:32:05,475
[Tim Wilson]: And there are certainly a million AI-powered tools that are like, it'll find insights for you.

404
00:32:05,536 --> 00:32:09,542
[Tim Wilson]: And to me, that was like that episode, Chris is not an analytics person.

405
00:32:09,622 --> 00:32:17,875
[Tim Wilson]: He is coming from much more of a creative and messaging and branding background and getting his perspective on

406
00:32:18,091 --> 00:32:25,044
[Tim Wilson]: the many, many facets and the inherently human nature of trying to get some deeper understanding about something.

407
00:32:25,685 --> 00:32:31,236
[Tim Wilson]: I thought it was a pretty nice corrective to the AI hype.

408
00:32:31,316 --> 00:32:35,003
[Tim Wilson]: I really liked how he defined an insight.

409
00:32:36,586 --> 00:32:43,255
[Michael Helbling]: You know, another one of my favorite episodes, and Moe, you mentioned this one as well, was the one with Michael Kaminski about Bayesian statistics.

410
00:32:44,096 --> 00:33:01,058
[Michael Helbling]: I think throughout my career, I've learned things sort of just sort of by arriving at them, not necessarily being officially trained in them or those kinds of things, just because of how I started in analytics and how I kind of grew into the field.

411
00:33:01,038 --> 00:33:10,138
[Michael Helbling]: And it was sort of this really big light bulb moment to sort of realize like, wow, the way that I actually approached this stuff is literally what we talked about in that episode.

412
00:33:10,158 --> 00:33:18,456
[Michael Helbling]: And sort of, for the first time, kind of slammed together in my mind, like made the connection finally like, oh, that's Bayesian statistics.

413
00:33:18,436 --> 00:33:20,340
[Michael Helbling]: So it's just so funny.

414
00:33:20,400 --> 00:33:24,469
[Michael Helbling]: Yeah, I know what that is conceptually, like, oh, it's your priors, blah, blah, blah.

415
00:33:24,569 --> 00:33:32,445
[Michael Helbling]: But as a model for actually doing stuff in the real world, I hadn't really said, like, oh, I'm Bayesian in the way that I think about that.

416
00:33:32,678 --> 00:33:39,368
[Moe Kiss]: It's funny because I think one of my tendencies, and I always say this to my team, is that I oversimplify things.

417
00:33:40,249 --> 00:33:46,498
[Moe Kiss]: And I think that's just part of my role, right, is I'm often trying to communicate something really complex to a leadership team.

418
00:33:47,079 --> 00:33:59,057
[Moe Kiss]: But I think one of the things that I really loved about that episode is that, in my mind, I had perhaps oversimplified what I understood about Bayesian stats, and Michael brought a level of

419
00:33:59,037 --> 00:34:04,065
[Moe Kiss]: new depth to the topic that really added a lot of value for me personally.

420
00:34:04,866 --> 00:34:05,106
[Michael Helbling]: Yeah.

421
00:34:05,787 --> 00:34:06,729
[Michael Helbling]: I really liked it.

422
00:34:06,769 --> 00:34:08,451
[Michael Helbling]: It actually was super applicable.

423
00:34:08,632 --> 00:34:23,374
[Michael Helbling]: I was literally sitting down with a client not long after we recorded that, and I was able to walk them through a process they could follow where they were in a situation where a frequentist approach would not have worked well.

424
00:34:23,354 --> 00:34:26,978
[Michael Helbling]: in that context, and I was like, well, here's some other alternatives.

425
00:34:27,018 --> 00:34:30,682
[Michael Helbling]: We could actually do something like this, and it actually worked really well.

426
00:34:31,202 --> 00:34:40,032
[Michael Helbling]: But it's funny because I probably would have still suggested that, but now I could actually call it what it was, as opposed to being like, I've got an idea.

427
00:34:40,052 --> 00:34:41,173
[Michael Helbling]: Try this.

428
00:34:41,393 --> 00:34:42,234
[Michael Helbling]: It probably has a name.

429
00:34:42,274 --> 00:34:43,015
[Michael Helbling]: I just don't know it.

430
00:34:44,236 --> 00:34:50,863
[Michael Helbling]: Anyway, it was just really cool to connect the dots on that for me this year.

431
00:34:51,333 --> 00:35:02,591
[Val Kroll]: All right, one of the other ones that I'll throw out there, another recent one that we did was 268, the metrics layers, data dictionaries, maybe it's all semantic layers with Cindy Hausen.

432
00:35:03,272 --> 00:35:10,023
[Val Kroll]: So I have to admit, full transparency, when we were in our planning for that, I was like, is that really a whole episode?

433
00:35:10,083 --> 00:35:11,265
[Val Kroll]: I'm like, I don't know.

434
00:35:11,245 --> 00:35:15,371
[Val Kroll]: Okay, I wasn't on it, so I can say this, but holy shit.

435
00:35:15,391 --> 00:35:20,679
[Val Kroll]: Yes, it was a whole episode because it was with Cindy and it was really, really well done.

436
00:35:20,699 --> 00:35:21,801
[Val Kroll]: I love that one so much.

437
00:35:23,203 --> 00:35:29,613
[Michael Helbling]: Val, Tim and I will both tell you, we've gone into certain episodes over the years and been like, I don't know about this.

438
00:35:29,853 --> 00:35:31,176
[Michael Helbling]: And it turns out to be amazing.

439
00:35:31,576 --> 00:35:38,046
[Michael Helbling]: So a lot of times, a little bit of doubt is almost an indicator that something good might happen here.

440
00:35:38,026 --> 00:35:48,926
[Moe Kiss]: But also, I think the fact is that Cindy herself is such an experienced data practitioner, has such a depth of knowledge about the technologies and the topic we're talking about.

441
00:35:49,647 --> 00:35:55,497
[Moe Kiss]: I mean, I could talk about semantic layers for hours, which I have done with Cindy from time to time.

442
00:35:56,720 --> 00:36:00,827
[Moe Kiss]: But I think that episode was really strong.

443
00:36:00,807 --> 00:36:03,734
[Moe Kiss]: Yeah, semantic layers is a hot topic at the moment.

444
00:36:03,814 --> 00:36:05,578
[Moe Kiss]: Lots of folks are building things.

445
00:36:06,420 --> 00:36:08,325
[Moe Kiss]: There's a dbt product, a Snowflake product.

446
00:36:09,287 --> 00:36:13,016
[Moe Kiss]: There's a bunch of similar products that are built into BI tools.

447
00:36:13,276 --> 00:36:18,208
[Moe Kiss]: It's a very timely episode, as well, given how

448
00:36:18,188 --> 00:36:23,159
[Moe Kiss]: quickly things are moving in the industry or maybe, I don't know, maybe not quickly because we're like trying to catch up.

449
00:36:23,179 --> 00:36:31,236
[Moe Kiss]: But Cindy, I think Cindy was just such a wonderful guest for that specific episode and probably is one of my favorites as well.

450
00:36:31,435 --> 00:36:40,832
[Tim Wilson]: And the fact that she made the point that, one, they're not new, and, two, you shouldn't think of it as one monolithic thing, those were like two.

451
00:36:40,913 --> 00:36:41,534
[Tim Wilson]: That was big.

452
00:36:41,694 --> 00:36:49,308
[Tim Wilson]: Very like, ah, this has gotten the label of this is the grand new thing, just roll it out.

453
00:36:49,368 --> 00:36:54,658
[Tim Wilson]: And I was like, it is the fact that she is very, very politely

454
00:36:54,638 --> 00:37:02,969
[Tim Wilson]: really fucking annoyed with the cycle of the latest shiny thing being treated as like, this is the thing, the answer.

455
00:37:03,389 --> 00:37:06,233
[Tim Wilson]: So Josh, you were going to say something.

456
00:37:06,433 --> 00:37:14,483
[Josh Crowhurst]: Yeah, a recent one that I particularly enjoyed was 281, Analytics: The View from the Corner Office, with Annalie.

457
00:37:14,944 --> 00:37:16,205
[Josh Crowhurst]: Yeah, great episode.

458
00:37:16,245 --> 00:37:22,333
[Josh Crowhurst]: And I think we were talking about trying to find the right guest for this idea for

459
00:37:22,988 --> 00:37:24,429
[Josh Crowhurst]: years, maybe?

460
00:37:24,750 --> 00:37:25,671
[Josh Crowhurst]: It was a long time.

461
00:37:25,831 --> 00:37:30,515
[Josh Crowhurst]: Yeah, that was one I think we were trying to put together for a long time.

462
00:37:30,536 --> 00:37:35,701
[Josh Crowhurst]: So when I saw that on my Spotify feed, I was like, oh, I have to listen to this right away.

463
00:37:36,001 --> 00:37:39,705
[Josh Crowhurst]: And it was worth the wait, for sure.

464
00:37:40,145 --> 00:37:51,156
[Josh Crowhurst]: And for me, it really resonated, maybe partly due to some slightly traumatic recent experiences in my previous company, where I had exposure to senior leadership.

465
00:37:51,136 --> 00:37:57,242
[Josh Crowhurst]: A few of the things that she talked about were really sharp.

466
00:37:57,342 --> 00:38:01,205
[Josh Crowhurst]: I thought the bit about setting a culture of productive curiosity.

467
00:38:02,206 --> 00:38:08,532
[Josh Crowhurst]: I love the term because yeah, I did see it first hand, you know, you'd be in a meeting and the CEO would make an offhand comment.

468
00:38:09,093 --> 00:38:19,422
[Josh Crowhurst]: And then people would just spend an inordinate amount of time digging into whatever the ask was, because, you know, the CEO said it, like, I have to

469
00:38:19,402 --> 00:38:20,023
[Josh Crowhurst]: do this.

470
00:38:20,083 --> 00:38:27,639
[Josh Crowhurst]: It might not be something that's worth spending hours or days looking into.

471
00:38:27,840 --> 00:38:35,115
[Josh Crowhurst]: We would come back in the next meeting and the CEO wouldn't necessarily even remember making the comment.

472
00:38:35,095 --> 00:38:40,266
[Josh Crowhurst]: So I kind of learned to level set in the meeting before going and saying, hey, we're going to look into this.

473
00:38:40,386 --> 00:38:45,076
[Josh Crowhurst]: This is the amount of time we're probably going to spend on it and just sort of set that.

474
00:38:45,878 --> 00:38:48,323
[Josh Crowhurst]: Just get that out there before leaving the room.

475
00:38:48,904 --> 00:38:52,632
[Josh Crowhurst]: But what Anna said was she

476
00:38:52,612 --> 00:38:59,882
[Josh Crowhurst]: talks about having a level of precision that's necessary and sufficient for the importance of the decision that's being made.

477
00:38:59,902 --> 00:39:03,547
[Josh Crowhurst]: And then having the self-awareness as a leader to specify that.

478
00:39:04,188 --> 00:39:07,132
[Josh Crowhurst]: And then save the team some of the bandwidth.

479
00:39:07,152 --> 00:39:13,481
[Josh Crowhurst]: So as an analyst, when you're in there, if it's not clear, just state it and get it out in the open and get the alignment.

480
00:39:13,541 --> 00:39:18,788
[Josh Crowhurst]: But I love Anna's perspective that taking ownership as a leader

481
00:39:18,768 --> 00:39:25,258
[Josh Crowhurst]: means realizing that what you say, people might just take and run with, and spend a ton of time, and you didn't necessarily intend it that way.

482
00:39:25,278 --> 00:39:26,800
[Josh Crowhurst]: So I loved that framing.

483
00:39:27,661 --> 00:39:38,818
[Josh Crowhurst]: And then I just thought, yeah, a really thoughtful perspective on what a data-driven culture can look like and how it can be established and driven from the executive level.

484
00:39:39,052 --> 00:39:40,114
[Josh Crowhurst]: And just one last.

485
00:39:40,454 --> 00:39:56,301
[Moe Kiss]: The specific bit that really sang to me was how much responsibility she took as the leader for that culture, versus assuming that your data scientists alone are responsible for the data culture.

486
00:39:56,661 --> 00:39:58,444
[Moe Kiss]: That was one that really stood out.

487
00:39:59,306 --> 00:39:59,406
[Josh Crowhurst]: Yeah.

488
00:39:59,386 --> 00:40:03,291
[Josh Crowhurst]: No, it made me, I was like, I want to work there.

489
00:40:03,751 --> 00:40:14,445
[Josh Crowhurst]: She has such a great way of framing it and thinking about it and communicating her vision on how data can be used and should be used and then setting the example.

490
00:40:15,266 --> 00:40:19,251
[Josh Crowhurst]: It was really inspiring, honestly.

491
00:40:19,231 --> 00:40:24,481
[Josh Crowhurst]: And then one last thing that resonated, again, going back to my PTSD.

492
00:40:24,541 --> 00:40:26,945
[Josh Crowhurst]: But yeah, brief your analysts, right?

493
00:40:26,985 --> 00:40:33,898
[Josh Crowhurst]: If you want them to be set up to succeed the first time they're presenting to the CEO, I'll say that maybe didn't happen for me.

494
00:40:33,918 --> 00:40:39,308
[Josh Crowhurst]: I might have been pantsed in front of the whole company group executive committee as a result of that.

495
00:40:39,348 --> 00:40:40,530
[Josh Crowhurst]: So please,

496
00:40:41,287 --> 00:40:44,396
[Josh Crowhurst]: if you have analysts, do that.

497
00:40:44,436 --> 00:40:45,299
[Josh Crowhurst]: That's great advice.

498
00:40:46,181 --> 00:40:53,482
[Josh Crowhurst]: Prevent any traumatic pantsings of your team when they're in a room with the big dogs.

499
00:40:53,502 --> 00:40:54,385
[Val Kroll]: Poor Josh.

500
00:40:55,867 --> 00:40:56,207
[Val Kroll]: Yeah.

501
00:40:56,367 --> 00:41:04,615
[Val Kroll]: The thing that also struck me about that conversation was, I don't think she realizes how novel her perspective is.

502
00:41:04,735 --> 00:41:08,518
[Val Kroll]: Like, she was like, of course, that's what leaders do.

503
00:41:08,538 --> 00:41:12,302
[Val Kroll]: I was like, can you say that in some of your circles?

504
00:41:12,362 --> 00:41:16,866
[Val Kroll]: Like, I was like, where's the link to your job posting?

505
00:41:16,886 --> 00:41:18,007
[Val Kroll]: I think I even said that, Josh.

506
00:41:18,027 --> 00:41:20,089
[Val Kroll]: I was like, hopefully your last call is that you're hiring.

507
00:41:21,230 --> 00:41:22,131
[Val Kroll]: This is awesome.

508
00:41:22,311 --> 00:41:23,612
[Val Kroll]: But yeah, no, that was a good one.

509
00:41:23,592 --> 00:41:35,291
[Michael Helbling]: That was one of my favorites too, Josh, because it was in a way so affirming of something I've really come to believe more and more: that leadership drives data culture more than the data team does.

510
00:41:36,733 --> 00:41:44,706
[Michael Helbling]: And the only way to really drive a data-rich or data-informed culture in a company is if

511
00:41:44,686 --> 00:41:46,089
[Michael Helbling]: the leadership is doing it.

512
00:41:46,630 --> 00:41:53,843
[Michael Helbling]: Because even when you take on the role as a data leader in your company, you can't force people to become data-driven.

513
00:41:53,903 --> 00:41:56,388
[Michael Helbling]: They either are or they aren't.

514
00:41:56,749 --> 00:42:01,117
[Michael Helbling]: But if the CEO is saying it, well, that makes it a different thing altogether.

515
00:42:01,858 --> 00:42:05,445
[Michael Helbling]: But yeah, that was a great episode.

516
00:42:05,425 --> 00:42:07,287
[Michael Helbling]: And yeah, it was a long time coming.

517
00:42:07,327 --> 00:42:14,594
[Michael Helbling]: Every year we'd have that in our list of, like, yeah, we've got to find somebody who could do justice to this topic.

518
00:42:14,634 --> 00:42:23,602
[Michael Helbling]: And as analytics people were always thinking like, yeah, what do they think about, you know, when they're sitting as the CEO, what's their perspective on data?

519
00:42:23,622 --> 00:42:24,323
[Michael Helbling]: Do they care?

520
00:42:24,363 --> 00:42:26,004
[Michael Helbling]: Do they look at these charts and graphs?

521
00:42:26,044 --> 00:42:29,487
[Michael Helbling]: Like that's a question I think our whole audience thinks about.

522
00:42:29,788 --> 00:42:31,709
[Michael Helbling]: Anyways, Anna was amazing.

523
00:42:31,850 --> 00:42:32,310
[Michael Helbling]: That was

524
00:42:32,290 --> 00:42:32,711
[Tim Wilson]: Yeah.

525
00:42:32,731 --> 00:42:37,597
[Tim Wilson]: Years ago, we had someone who agreed and was ready to come on and then ghosted us like completely.

526
00:42:37,757 --> 00:42:40,040
[Tim Wilson]: So it was like, yeah.

527
00:42:41,141 --> 00:42:42,543
[Tim Wilson]: Oh, I forgot all about that.

528
00:42:43,224 --> 00:42:43,684
[Tim Wilson]: Talked to him.

529
00:42:43,744 --> 00:42:44,445
[Tim Wilson]: Talked to him later.

530
00:42:44,505 --> 00:42:47,789
[Tim Wilson]: It was, it turned out his company was like in the midst of, it was about to get acquired.

531
00:42:47,849 --> 00:42:50,052
[Tim Wilson]: He was like, yeah, I really needed to go dark.

532
00:42:50,072 --> 00:42:59,023
[Tim Wilson]: I'm like, I don't know that an email response of like, hey, actually this isn't a great time, would have, you know, been too problematic, but I don't know.

533
00:42:59,924 --> 00:43:01,386
[Tim Wilson]: So yeah, I agree.

534
00:43:02,615 --> 00:43:06,226
[Michael Helbling]: Well, what trends are shaping the next year, Moe?

535
00:43:08,893 --> 00:43:12,765
[Michael Helbling]: God, I don't know.

536
00:43:12,785 --> 00:43:14,650
[Moe Kiss]: I think he's...

537
00:43:17,043 --> 00:43:22,795
[Moe Kiss]: I've just obviously gone through lots of 2026 planning and thinking about the year ahead.

538
00:43:23,477 --> 00:43:37,026
[Moe Kiss]: It sounds so boring, but if I had to boil it down to a couple of key things that I'm really thinking about, it is about consistency and making sure that

539
00:43:37,006 --> 00:43:43,796
[Moe Kiss]: We have really solid consistency in metric definitions and how metrics are calculated and all those sorts of things.

540
00:43:43,816 --> 00:43:47,321
[Moe Kiss]: It just sounds boring, but I feel like it's becoming more important than ever.

541
00:43:47,341 --> 00:43:58,838
[Moe Kiss]: I think the other thing that I'm spending a lot of time thinking about is, I don't know, we're all using AI just for internal efficiency gains and it just feels shit.

542
00:43:59,358 --> 00:44:04,566
[Moe Kiss]: If you're using it to write a better email or a Slack message, it doesn't feel like that is

543
00:44:05,558 --> 00:44:08,885
[Moe Kiss]: how we can be getting the best from some of these tools.

544
00:44:09,827 --> 00:44:18,083
[Moe Kiss]: And so thinking a lot more about specifically like the data products we make and how we can

545
00:44:18,316 --> 00:44:19,277
[Moe Kiss]: better automate.

546
00:44:19,298 --> 00:44:22,182
[Moe Kiss]: I'll give you a specific example, which is going to sound really lame.

547
00:44:22,783 --> 00:44:26,809
[Moe Kiss]: It's going to sound stupid and lame, but this is the exact thing.

548
00:44:27,230 --> 00:44:32,838
[Moe Kiss]: We used to keep a list of dashboards, like your top company dashboards.

549
00:44:32,878 --> 00:44:36,764
[Moe Kiss]: When someone onboards, you can be like, you want to know about this topic or this topic or this topic, you go here.

550
00:44:37,405 --> 00:44:40,931
[Moe Kiss]: It's a manual list, it's a pain in the ass, it always ends up outdated, not maintained.

551
00:44:41,612 --> 00:44:42,433
[Moe Kiss]: I was like,

552
00:44:42,413 --> 00:44:47,119
[Moe Kiss]: That is a problem we should be solving with technology, right?

553
00:44:47,880 --> 00:44:55,489
[Moe Kiss]: And I think that's probably why I'm so hyper-focused on consistency and all the fundamentals.

554
00:44:55,650 --> 00:45:03,059
[Moe Kiss]: Because if you want to throw technology at it: how do we maintain this list without needing someone to go manually update some spreadsheet or whatever it is?

555
00:45:03,359 --> 00:45:09,467
[Moe Kiss]: How do you understand which of your dashboards are being used, which are high value, which are going to answer the right question?

556
00:45:09,447 --> 00:45:15,076
[Moe Kiss]: To do that, the data that you're using to build a technological solution has to be very good quality.

557
00:45:15,217 --> 00:45:19,003
[Moe Kiss]: But yeah, those are just the things that are on my mind going into 2026.

558
00:45:19,925 --> 00:45:21,006
[Moe Kiss]: Oh, Tim's got a look for you.

559
00:45:22,809 --> 00:45:33,227
[Tim Wilson]: Well, I mean, I'd lean back on the fundamentals, that there still is...

560
00:45:34,591 --> 00:45:40,964
[Tim Wilson]: It is so easy to get caught up in measure, measure, measure, measure, measure, and the complexity kind of explodes.

561
00:45:41,224 --> 00:45:49,160
[Tim Wilson]: And Moe, you're at a very large, digital-native company with a massive amount of data.

562
00:45:49,180 --> 00:45:54,912
[Tim Wilson]: Even in the last two weeks, I've had an experience with a

563
00:45:56,073 --> 00:46:09,728
[Tim Wilson]: massive company that their issue was much more around internal alignment on what different teams were trying to accomplish.

564
00:46:09,961 --> 00:46:10,922
[Tim Wilson]: and not the data.

565
00:46:10,982 --> 00:46:17,830
[Tim Wilson]: Every time the data people would come in, it was just kind of like puking out charts of stuff.

566
00:46:18,731 --> 00:46:24,158
[Tim Wilson]: And you could see that that wasn't serving the business.

567
00:46:24,518 --> 00:46:29,504
[Tim Wilson]: I mean, there were some kind of comical ways in which the data people were very knowledgeable.

568
00:46:30,345 --> 00:46:32,388
[Tim Wilson]: The visualizations were fine.

569
00:46:32,408 --> 00:46:39,276
[Tim Wilson]: They could answer questions about the minutia, and that wasn't remotely what the organization

570
00:46:39,256 --> 00:46:39,676
[Tim Wilson]: needed.

571
00:46:39,717 --> 00:46:43,101
[Tim Wilson]: So I think that's not a direct response.

572
00:46:43,121 --> 00:46:59,460
[Tim Wilson]: I mean, I cringe a little bit at, well, let's look at which dashboards people are looking at and which metrics. That, to me, often winds up being, can AI come up with an engineering solution that's just going to tell me the insight?

573
00:46:59,480 --> 00:47:02,664
[Moe Kiss]: Like it's kind of like, well, let's just... No, I don't agree.

574
00:47:02,684 --> 00:47:03,525
[Moe Kiss]: I don't agree.

575
00:47:03,565 --> 00:47:05,567
[Moe Kiss]: I think the thing

576
00:47:06,374 --> 00:47:12,121
[Moe Kiss]: Fundamentally, you and I are very aligned that it's about the business question that you're trying to answer, right?

577
00:47:12,682 --> 00:47:17,788
[Moe Kiss]: Like I would say that that's, I don't know, I'm getting like a semi-nod.

578
00:47:21,813 --> 00:47:28,602
[Moe Kiss]: One of the concerns you have is like,

579
00:47:28,987 --> 00:47:36,256
[Moe Kiss]: Are people leveraging AI to answer a question that could be answered very easily with something that's already built?

580
00:47:36,336 --> 00:47:38,999
[Moe Kiss]: And then it comes down to like, this is more about cost efficiency, right?

581
00:47:39,019 --> 00:47:47,529
[Moe Kiss]: Like, I don't want someone continually asking a question every day that's costing us money to run that is sitting on a dashboard that can be easily looked at and interrogated if they just know where it is.

582
00:47:47,930 --> 00:47:49,952
[Moe Kiss]: It's about discoverability to answer that question.

583
00:47:50,573 --> 00:47:58,723
[Moe Kiss]: And so, like, there are multiple problems that you're trying to solve and it might just be another way to answer that business question.

584
00:47:59,480 --> 00:48:15,996
[Tim Wilson]: So I would say, I wish it was a trend of 2025, but I think the reason I was kind of having that reaction to answering business questions goes back to, and I'll momentarily mount the soapbox, the definition that if somebody in the business asks this question, it's a business question and therefore I need to answer it.

585
00:48:16,036 --> 00:48:19,760
[Tim Wilson]: How can I answer that efficiently and most effectively?

586
00:48:19,920 --> 00:48:26,787
[Tim Wilson]: And it becomes a volume play. And so if you instead totally shifted that, I think there's

587
00:48:26,767 --> 00:48:33,877
[Tim Wilson]: a crap ton of questions that are kind of fishing that actually point to a much more fundamental challenge.

588
00:48:34,438 --> 00:48:46,475
[Tim Wilson]: But so, trying to solve that, I mean, it goes to, and this does come from the AI companies that are like, imagine if you could just sit there with ChatGPT and just ask it questions and it would provide responses.

589
00:48:46,835 --> 00:48:50,040
[Tim Wilson]: And then the pushback winds up saying, ah,

590
00:48:50,020 --> 00:48:56,820
[Tim Wilson]: but the answers, they have hallucinations, or, ah, without this engineering, it can't provide accurate answers.

591
00:48:57,362 --> 00:48:59,187
[Tim Wilson]: And to me, I'm like,

592
00:48:59,285 --> 00:49:11,020
[Tim Wilson]: That is not the goal end state: to have people who aren't thinking rigorously about what they're trying to do and are prematurely jumping to the data.

593
00:49:11,040 --> 00:49:21,073
[Tim Wilson]: I deeply in my soul believe that that is heading down a path of just getting more people

594
00:49:21,053 --> 00:49:34,556
[Tim Wilson]: wandering through more data to have more meaningless arguments to produce more overly lengthy PowerPoint or Canva or Google Slides decks that aren't actually moving the business forward.

595
00:49:34,596 --> 00:49:44,513
[Tim Wilson]: So it's actually putting fuel on the fire of something that is broken in business horribly, horribly, horribly.

596
00:49:45,742 --> 00:49:47,685
[Michael Helbling]: Activity without outcome, maybe.

597
00:49:47,725 --> 00:49:59,401
[Michael Helbling]: So Tim, maybe the AI product you want to see built is the one that forces more rigorous questioning by guiding people through that process.

598
00:49:59,841 --> 00:50:01,604
[Michael Helbling]: So be like, why are you asking that question?

599
00:50:01,644 --> 00:50:02,906
[Michael Helbling]: Oh, interesting, refine that.

600
00:50:03,426 --> 00:50:08,673
[Michael Helbling]: OK, you don't really want that analysis because you wouldn't want to mistake this for this.

601
00:50:08,774 --> 00:50:11,277
[Michael Helbling]: So maybe you want this analysis.

602
00:50:11,257 --> 00:50:12,619
[Michael Helbling]: Like, something like that would be.

603
00:50:12,639 --> 00:50:20,232
[Val Kroll]: And then at the end, does it turn into like an intake for like an intake system that goes... Oh my God!

604
00:50:20,252 --> 00:50:20,312
[Val Kroll]: No!

605
00:50:20,332 --> 00:50:26,903
[Moe Kiss]: You and me, I was telepathically communicating with you being like, it kind of sounds like a Jira intake ticket.

606
00:50:27,980 --> 00:50:30,724
[Michael Helbling]: She's throwing gaslighting.

607
00:50:30,744 --> 00:50:42,499
[Michael Helbling]: That actually leads to something I was already going to say, which was, at the end of episode 279, The Process of Analytics: We Have Thoughts, we were talking about that.

608
00:50:42,559 --> 00:50:48,487
[Michael Helbling]: And at the end of that episode, I was like, now, because of AI, all these processes are going to take on even more importance.

609
00:50:48,527 --> 00:50:52,272
[Michael Helbling]: And Tim jumped down my throat and said, they've always been important.

610
00:50:52,973 --> 00:50:54,255
[Michael Helbling]: And he wasn't wrong.

611
00:50:54,636 --> 00:50:56,518
[Michael Helbling]: But the reality is, like,

612
00:50:57,292 --> 00:51:01,498
[Michael Helbling]: To get to leverage AI, you have to do those precursors.

613
00:51:02,079 --> 00:51:08,488
[Michael Helbling]: To Moe's point, that return to some of the fundamentals is the trend.

614
00:51:09,009 --> 00:51:12,013
[Michael Helbling]: Tim was wrong to do that to me on that episode.

615
00:51:12,033 --> 00:51:15,418
[Val Kroll]: That's really the point I'm making.

616
00:51:15,438 --> 00:51:17,942
[Val Kroll]: Here's another poll.

617
00:51:18,363 --> 00:51:25,493
[Val Kroll]: Do we think that AI was mentioned more this year or Tim's blood pressure raising happened more this year?

618
00:51:27,228 --> 00:51:28,309
[Tim Wilson]: Wait, what was the first option?

619
00:51:29,670 --> 00:51:30,311
[Tim Wilson]: What was the first option?

620
00:51:30,331 --> 00:51:32,493
[Val Kroll]: Mentions of AI versus Tim's blood pressure, right?

621
00:51:32,513 --> 00:51:34,174
[Tim Wilson]: Oh, blood pressure, yeah.

622
00:51:34,194 --> 00:51:36,996
[Tim Wilson]: Well, they're deeply correlated, and there is causation.

623
00:51:37,016 --> 00:51:40,179
[Michael Helbling]: Well, at least for Tim's blood pressure, there's medications for that.

624
00:51:40,459 --> 00:51:42,541
[Michael Helbling]: Oh, brother.

625
00:51:44,283 --> 00:51:49,708
[Michael Helbling]: Well, that's one trend that will probably continue is that Tim and I will tangle up a couple of times.

626
00:51:50,408 --> 00:51:51,389
[Michael Helbling]: No, it's fine.

627
00:51:52,690 --> 00:51:56,053
[Moe Kiss]: So I'm probably going to say something again fiery.

628
00:51:56,658 --> 00:52:06,569
[Moe Kiss]: Also, just to clarify, answering a business question does not mean that we should answer every question raised by the business, just to like caveat the former discussion before I move on to the next question.

629
00:52:06,609 --> 00:52:09,832
[Tim Wilson]: But if you're making it so that they can get to whatever the question, yeah.

630
00:52:10,153 --> 00:52:11,974
[Tim Wilson]: Okay.

631
00:52:11,995 --> 00:52:23,627
[Moe Kiss]: The next topic that I also think is coming up a lot, which very much ties back to the episode with Anna, is about decision velocity.

632
00:52:25,075 --> 00:52:28,199
[Moe Kiss]: I think that is something that is really, really interesting.

633
00:52:28,239 --> 00:52:38,553
[Moe Kiss]: And again, Tim makes the point, I work in a very unique position at a company that's probably not representative of what most companies are that data folks are working in.

634
00:52:39,834 --> 00:52:45,522
[Moe Kiss]: But it's very much, how do you use the right level of rigor for the decision that you're trying to make as a business?

635
00:52:46,363 --> 00:52:51,910
[Moe Kiss]: And sure, maybe there's some AI sprinkle salt on the top of that as well.

636
00:52:51,890 --> 00:53:07,722
[Tim Wilson]: So I think making the point of getting the business more sophisticated about what the stakes are behind the decision, and therefore whether a little bit of a signal, very quickly, is

637
00:53:09,085 --> 00:53:15,295
[Tim Wilson]: better and desirable than getting a complete answer, but way too late.

638
00:53:15,495 --> 00:53:27,214
[Tim Wilson]: I think there is starting to be some awareness on the business side that if you just are waiting for the inarguable truth, you'll just be waiting forever.

639
00:53:27,234 --> 00:53:29,999
[Tim Wilson]: Although I think there is still the tension between the teams.

640
00:53:30,019 --> 00:53:31,601
[Tim Wilson]: They can't get answers to me fast enough.

641
00:53:32,503 --> 00:53:34,005
[Tim Wilson]: Why can't I just have an AI?

642
00:53:34,045 --> 00:53:36,409
[Tim Wilson]: This was secondhand.

643
00:53:37,857 --> 00:53:44,745
[Tim Wilson]: Somebody said their CMO was like, I just want to have the AI just tell me, you know, give me insights while I'm in the shower in the morning.

644
00:53:44,825 --> 00:53:47,608
[Tim Wilson]: I just want to get up and have it have sifted through the data.

645
00:53:47,828 --> 00:53:56,958
[Tim Wilson]: And I'm like, okay, we still have a ways, a ways to go because that CMO is, but that's not the same thing as decision velocity.

646
00:53:56,998 --> 00:54:00,642
[Michael Helbling]: Cause I guarantee you that CMO is not, uh,

647
00:54:00,622 --> 00:54:05,191
[Michael Helbling]: making decisions effectively and at a good speed.

648
00:54:06,674 --> 00:54:13,547
[Michael Helbling]: Because they're doing the gathering of information incorrectly to go after decision velocity.

649
00:54:14,509 --> 00:54:20,200
[Michael Helbling]: One time, somebody told me that a CEO is just a decision engine.

650
00:54:20,180 --> 00:54:23,665
[Michael Helbling]: which I thought was actually a really cool way to think about that.

651
00:54:24,386 --> 00:54:29,272
[Michael Helbling]: We think about executive leadership generally, clearing obstacles for your team and all those things.

652
00:54:29,312 --> 00:54:33,258
[Michael Helbling]: Decision velocity is a huge part, like not getting yourself bogged down.

653
00:54:33,318 --> 00:54:40,788
[Michael Helbling]: There's lots of frameworks for that, like the old Bezos, two-way door versus one-way door, decision matrix, that kind of stuff.

654
00:54:41,349 --> 00:54:49,039
[Michael Helbling]: There's these things that can help, but I do think you could look at AI as an enabler of

655
00:54:49,019 --> 00:54:52,703
[Michael Helbling]: helping you frame or think about speed to decision.

656
00:54:53,364 --> 00:55:00,851
[Michael Helbling]: Because one of the things my old boss, my guest, used to do, he used to force us to write down decision journals.

657
00:55:01,052 --> 00:55:02,713
[Michael Helbling]: I don't know if you've ever done this before.

658
00:55:03,474 --> 00:55:06,277
[Michael Helbling]: Very time consuming and very annoying, and I was always super bad at it.

659
00:55:06,397 --> 00:55:08,940
[Michael Helbling]: It's probably why I'm not as good a decision maker as I should be.

660
00:55:10,021 --> 00:55:14,065
[Michael Helbling]: But it helps you then go back and look at previous decisions

661
00:55:14,045 --> 00:55:17,671
[Michael Helbling]: and what led up to them and go through that.

662
00:55:17,691 --> 00:55:19,814
[Michael Helbling]: So not everything is all data analysis.

663
00:55:20,034 --> 00:55:23,439
[Michael Helbling]: Data informs some decisions to a greater or lesser extent.

664
00:55:24,180 --> 00:55:28,207
[Michael Helbling]: But to the extent that, if I had used this data, I might have made a better decision.

665
00:55:28,347 --> 00:55:36,519
[Michael Helbling]: If you're evaluating your decision capabilities, I think AI is really well suited to helping you remember some of those things over time as well.

666
00:55:37,040 --> 00:55:40,305
[Michael Helbling]: So that could be another way to leverage AI in that context maybe.

667
00:55:44,572 --> 00:55:46,038
[Tim Wilson]: Go faster, be smarter.

668
00:55:47,906 --> 00:55:53,468
[Tim Wilson]: So I think this is your opportunity, Michael, to make a decision to bring the show to a close.

669
00:55:54,472 --> 00:55:55,155
[Tim Wilson]: You know,

670
00:55:56,013 --> 00:55:57,797
[Michael Helbling]: It's about that time, Tim.

671
00:55:58,478 --> 00:56:01,404
[Michael Helbling]: It's hard because I don't want to, for two reasons.

672
00:56:02,085 --> 00:56:04,731
[Michael Helbling]: Because we've got Josh on the show and I don't want it to end.

673
00:56:06,053 --> 00:56:08,559
[Michael Helbling]: And so that's one part.

674
00:56:08,579 --> 00:56:11,304
[Michael Helbling]: And then the second part is, it's the end of 2025.

675
00:56:11,324 --> 00:56:15,152
[Michael Helbling]: This is our last episode of the year.

676
00:56:15,132 --> 00:56:16,474
[Moe Kiss]: Let's get on with 2026.

677
00:56:16,674 --> 00:56:18,316
[Moe Kiss]: I am ready for it.

678
00:56:18,376 --> 00:56:19,057
[Michael Helbling]: Moe is ready.

679
00:56:19,457 --> 00:56:19,798
[Michael Helbling]: All right.

680
00:56:19,838 --> 00:56:20,759
[Michael Helbling]: Let's shut the door.

681
00:56:20,819 --> 00:56:22,441
[Michael Helbling]: So we're done.

682
00:56:22,802 --> 00:56:24,043
[Michael Helbling]: Thank you all.

683
00:56:24,063 --> 00:56:28,409
[Michael Helbling]: As you've been listening, maybe you have a memory of 2025 you want to share.

684
00:56:28,629 --> 00:56:29,790
[Michael Helbling]: We would love to hear from you.

685
00:56:29,950 --> 00:56:31,392
[Michael Helbling]: Or what are you looking forward to in 2026?

686
00:56:31,733 --> 00:56:33,074
[Michael Helbling]: Same thing.

687
00:56:33,435 --> 00:56:34,176
[Michael Helbling]: Reach out to us.

688
00:56:34,196 --> 00:56:40,043
[Michael Helbling]: You can comment to us at our LinkedIn page or on the Measure Slack chat group.

689
00:56:40,023 --> 00:56:43,767
[Michael Helbling]: or via email at contact at analyticshour.io.

690
00:56:44,368 --> 00:56:45,829
[Michael Helbling]: We'd love to hear from you.

691
00:56:46,570 --> 00:56:48,893
[Michael Helbling]: And obviously, thank you, Josh.

692
00:56:49,693 --> 00:56:56,080
[Michael Helbling]: No show would be complete without thanking you for coming back to be our special guest one more time.

693
00:56:56,120 --> 00:56:56,721
[Michael Helbling]: This is fun.

694
00:56:56,801 --> 00:56:57,942
[Josh Crowhurst]: Thanks for having me, guys.

695
00:56:58,983 --> 00:57:00,405
[Michael Helbling]: Yeah, it's awesome.

696
00:57:00,685 --> 00:57:01,266
[Michael Helbling]: It's awesome.

697
00:57:03,228 --> 00:57:03,628
[Michael Helbling]: We do.

698
00:57:03,949 --> 00:57:05,170
[Michael Helbling]: We do.

699
00:57:05,318 --> 00:57:06,842
[Michael Helbling]: I think you're still in our Slack.

700
00:57:06,882 --> 00:57:12,154
[Michael Helbling]: I don't know if you've just abandoned that Slack at all, or you're still kind of peeking from time to time.

701
00:57:12,174 --> 00:57:14,680
[Josh Crowhurst]: Oh, it's still Slack, but I do still get the emails.

702
00:57:14,700 --> 00:57:15,622
[Josh Crowhurst]: Oh, okay.

703
00:57:16,625 --> 00:57:18,369
[Josh Crowhurst]: I don't get analytics hours.

704
00:57:18,389 --> 00:57:18,790
[Josh Crowhurst]: Oh, gosh.

705
00:57:18,850 --> 00:57:19,031
[Josh Crowhurst]: Oh, yeah.

706
00:57:19,051 --> 00:57:21,336
[Josh Crowhurst]: I see those new ideas and suggestions coming through.

707
00:57:22,531 --> 00:57:29,287
[Michael Helbling]: I'll remove you from the email list, I guess, so that you don't keep getting those.

708
00:57:29,568 --> 00:57:29,808
[Michael Helbling]: Yeah.

709
00:57:30,390 --> 00:57:32,274
[Michael Helbling]: Well, we didn't really have a process for that.

710
00:57:32,495 --> 00:57:38,970
[Michael Helbling]: So under GDPR, you do have a right to be forgotten, but I don't want to.

711
00:57:40,334 --> 00:57:44,901
[Michael Helbling]: All right, and if you listen to the show, leave a rating and review.

712
00:57:44,981 --> 00:57:50,549
[Michael Helbling]: If you've been listening throughout 2025, go to your favorite platform, give us a review, rate the show.

713
00:57:50,829 --> 00:57:52,211
[Michael Helbling]: That helps other people discover it.

714
00:57:52,852 --> 00:57:57,639
[Michael Helbling]: And we've had a lot of audience growth this year, both on our regular channels and on our YouTube channel.

715
00:57:57,659 --> 00:58:00,904
[Michael Helbling]: So if you're ever on YouTube, subscribe to us there as well.

716
00:58:01,344 --> 00:58:08,755
[Michael Helbling]: We put every episode up on our YouTube channel, as well as some awesome shorts that the team puts together for each episode.

717
00:58:08,735 --> 00:58:12,385
[Michael Helbling]: I don't know what we're gonna put together out of this one, but we'll see.

718
00:58:12,405 --> 00:58:25,258
[Michael Helbling]: And then of course, for all of my co-hosts, I think I speak for everybody when I say, 2026 is gonna be an amazing year, but no matter what it brings,

719
00:58:25,880 --> 00:58:28,543
[Michael Helbling]: You know that you can always keep analyzing.

720
00:58:28,603 --> 00:58:29,925
[Announcer]: Thanks for listening.

721
00:58:30,445 --> 00:58:43,560
[Announcer]: Let's keep the conversation going with your comments, suggestions, and questions on Twitter at @analyticshour on the web at analyticshour.io, our LinkedIn group, and the Measure Chat Slack group.

722
00:58:43,580 --> 00:58:45,963
[Announcer]: Music for the podcast by Josh Crowhurst.

723
00:58:47,125 --> 00:58:49,107
[Announcer]: So smart guys want to fit in.

724
00:58:49,127 --> 00:58:51,349
[Announcer]: So they made up a term called analytics.

725
00:58:51,369 --> 00:58:52,731
[Announcer]: Analytics don't work.

726
00:58:53,707 --> 00:58:56,433
[Charles Barkley]: Do the analytics say go for it, no matter who's going for it?

727
00:58:56,814 --> 00:58:59,660
[Charles Barkley]: So if you and I were on the field, the analytics say go for it.

728
00:58:59,961 --> 00:59:06,315
[Charles Barkley]: It's the stupidest, laziest, lamest thing I've ever heard for reasoning in competition.

729
00:59:06,717 --> 00:59:12,562
[Michael Helbling]: I nearly, Josh, on the last episode, did a "no show would be complete without a huge thank you to Josh Norris."

730
00:59:13,743 --> 00:59:15,325
[Michael Helbling]: And I switched it.

731
00:59:15,865 --> 00:59:18,207
[Michael Helbling]: And I switched it at the last second.

732
00:59:18,948 --> 00:59:24,373
[Michael Helbling]: "No show would be complete without... keep analyzing."

733
00:59:25,734 --> 00:59:28,216
[Michael Helbling]: Yes, I know how I did it.

734
00:59:29,638 --> 00:59:31,840
[Michael Helbling]: I did a huge thank you door.

735
00:59:31,880 --> 00:59:33,621
[Michael Helbling]: Yeah.

736
00:59:33,701 --> 00:59:35,543
[Michael Helbling]: And it's just a hard cut.

737
00:59:37,548 --> 00:59:39,033
[Michael Helbling]: No show would be complete with that.

738
00:59:39,755 --> 00:59:40,518
[Michael Helbling]: He's analyzing.

739
00:59:43,126 --> 00:59:43,728
[Josh Crowhurst]: Anyway.

740
00:59:43,949 --> 00:59:48,042
[Josh Crowhurst]: Yeah, I still need to see music, so I feel like I can see the original setup.

741
00:59:48,845 --> 00:59:49,768
[Michael Helbling]: Yeah, you're in there.

742
00:59:57,933 --> 01:00:22,540
[Tim Wilson]: Rock flag and... Let's raise a glass with Tim and Moe, with Michael, Julie, Val, five hosts who guide us through the noise and make the numbers ten, oh, for all our Power Hour friends, for all

743
01:00:22,520 --> 01:00:25,024
[Tim Wilson]: our power hours.

744
01:00:25,625 --> 01:00:34,701
[Tim Wilson]: We'll toast the laughs and insight shared in all those power hours.

745
01:00:37,105 --> 01:00:39,069
[Tim Wilson]: Voice crack. Should have picked a different key on that one.

746
01:00:39,530 --> 01:00:40,551
[Tim Wilson]: That is awesome.

747
01:00:40,571 --> 01:00:41,573
[Tim Wilson]: That has to be it.

748
01:00:41,593 --> 01:00:42,415
[Tim Wilson]: That has to be it.

749
01:00:43,256 --> 01:00:44,919
[Tim Wilson]: That has to be it.

750
01:00:44,939 --> 01:00:46,722
[Tim Wilson]: That's the best one we've ever done.

