In this episode of the Confluence podcast, Christopher Parsons of Knowledge Architecture joins us to discuss bringing AI to the Synthesis intranet platform. He takes us behind the scenes of the product, showing activity streams, document libraries, and project directories, with a focus on the upcoming integration of AI into Synthesis's search and video transcriptions.
Chris also previews advancements in next-generation search, vector search, and future LMS capabilities. We explore AI's impact on firm-wide knowledge capture, improved search relevance, and the creation of AEC-specific models through what Knowledge Architecture calls “Community AI”.
Episode Links:
Watch this episode on YouTube or Spotify.
-----
The Confluence podcast is a collaboration between TRXL and AVAIL, and is produced by TRXL Media.
Welcome back to the Confluence podcast. We have Chris Parsons joining us again. Hopefully you've tuned in and heard the first episode that we did with Chris a few weeks ago, but if not, you might want to go back and listen to that. Happy to have you on again, Chris. I know we're going to dig more into the Synthesis platform, and really start to dig into not only what Synthesis does, which I'll get you to kick this off with, but ultimately how AI is affecting what you're doing and how you're thinking about that. So maybe you can introduce yourself and give us a good overview of what Synthesis is.

Christopher Parsons: Sure. So for folks who didn't hear the first episode, we started in: We are a hundred percent focused on AEC. Synthesis is an intranet platform for AEC firms, and we work with about 130 firms, mostly US, although we have some international. And we are deeply integrated with a lot of AEC software that people care about: Deltek, Unanet, OpenAsset, Newforma, AEC 360, Zendesk, and so on. So Synthesis is, I think, a pretty well named product in that it connects data from multiple places, giving you one searchable, central source of truth.
:And, um, I am just going to do, I'm
going to do like a high level seven
25
:screen kind of like look at Synthesis.
26
:So people get like a, a ball, a kind
of general idea of the kind of content
27
:that we have so that when we start
talking about what we're doing with
28
:AI, it'll, it'll make a lot more sense.
29
:So I'm going to go ahead
and share my web browser.
30
:So this is the homepage of Synthesis.
31
:This is a demo environment.
32
:So obviously you can brand it.
33
:Um, colors and fonts and logos and all
the things to make it feel like home.
34
:But this will give you a general
sense of what the product is.
35
:So most of our clients on the homepage
will have, um, an activity stream.
36
:So this will be posts and comments around
different things happening in the company.
37
:So it's very much an internal
communications platform.
38
:So whether it's a strategic plan update
or a project tour or a new piece of
39
:software has been added or some PR.
40
:you know, somebody's retiring,
you know, these are the kinds
41
:of things that we typically will
see on a Synthesis intranet.
42
:And if I jump into one of those
posts, um, you'll kind of see that
43
:it's, it's a very rich multimedia.
44
:You can have images and videos and
links, and then we've got comments
45
:and likes and the kind of things you'd
expect to see, um, on a social intranet.
46
:Um, if I go across the top, um, we've
got our mega menu navigation, and this
47
:is kind of just showing you the breadth
and depth of content that typically
48
:will end up in a Synthesis intranet.
49
:So kind of.
50
:Um, on home, you'll usually find like
a welcome to the company and stuff like
51
:mission values, the leadership team, the
different committees, um, different office
52
:stuff, um, in an HR community that we
call these communities across the top.
53
:So you'd find HR information, professional
development information, time and travel,
54
:accounting, different office pages.
55
:Projects in practice, things like
standards and codes, um, project
56
:management resources, different
directories to projects, um, in
57
:marketing, kind of brand templates,
marketing resources, technology, the
58
:different platforms that people use
and the technology policies, and then
59
:learning, you know, different, you know,
whether it's a company university or
60
:different things like software training.
61
:So this is what we call a
video library in Synthesis.
62
:Um, These have got different training
courses, depending on different
63
:platforms that the company's using.
64
:Um, video libraries have filters, so
you can add custom fields and kind
65
:of drill into those different videos.
66
:Um, and it's all searchable.
67
:So I'm, I'm going to skim across the top.
68
:So video is going to become important.
69
:We're going to come back to this
later and we'll come back to
70
:this idea of a video library for
training a little bit later as well.
71
:Um, I want to talk about another
key piece of content and Synthesis,
72
:which are document libraries.
73
:So if I go into our projects and practice
community and go to our standards library.
74
:You'll just see that we've got
some different standards here
75
:organized by CSI division.
76
:And again, you can add custom fields
into Synthesis so it can be on
77
:CSI division or standard type or
conditions or whatever the different
78
:things you want to apply are.
79
:So that's videos and documents.
80
:Um, from a page perspective, I'm
just going to go into technology
81
:and I'll go under our design
technology section and go to Revit.
82
:And this is just a landing page in
the company for all things Revit.
83
:So it's got links off to
other different resources.
84
:Um, we've got training videos on
Revit in the upper right, I've got
85
:kind of our key points of contact.
86
:So I've got, you know, Brianne and
Julie, and then I can kind of reach out.
87
:I can see their team status, et cetera.
88
:And then I can see any recent posts
that were written about Revit.
89
:So this is a pretty typical, like, so if
you imagine this kind of layout, but for
90
:a lot of things within the company, you're
trying to connect people to the right
91
:people and the right resources and helping
them know how the company approaches, you
92
:know, whether it's a piece of technology,
whether it's a policy, um, et cetera.
93
:If I go into projects and practice,
a big, uh, kind of content type
94
:for us is what we call guides.
95
:So I'll go to the
project management guide.
96
:And so this technology basically
replaces like the hundred page PDFs
97
:that your firm might use for your
employee handbook, or in this case,
98
:your project management manual, or a
CAD or BIM guide, or a brand guide.
99
:There's all these different like kind
of guides that we saw at our clients.
100
:And so we built a tool that
handles that content really well.
101
:So you've got kind of the different
people in project management.
102
:If I kind of go into contracts.
103
:You know, I can, you've seen, we've
broken what used to be a hundred page
104
:PDF is into a bunch of small chunks.
105
:And then these are all kind of like,
you know, I've got a table of content
106
:and I can kind of jump through our
different approach to contracts or
107
:billing rates or how we start projects or
financial controls, like all the things.
108
:Um, this is a nice place to
put it within the internet.
109
:Um, so three more things
I want to show you.
110
:Um, if I go to our project
directory, so this is, I mentioned
111
:some integrations that we have.
112
:This is where those integrations with.
113
:You know, Newforma, et cetera, come in.
114
:It helps us pull information from
those systems into one place.
115
:So I can do things like find all of our
projects by type, you know, or by state.
116
:Um, this is all searchable as well.
117
:Um, or I can put them on a map, right?
118
:And I can drill in and say like, well,
what's the work we've done in California.
119
:And I can see we've got 11 projects
in the Bay area, and then two of
120
:them are out in Marin, et cetera.
121
:So I've got a good way visually
to kind of explore the work,
122
:um, that the company has done.
123
:Um, and if I click on one of these
projects, I get a project profile.
124
:So this is data we're pulling
in from all those systems.
125
:So I've got imagery from OpenAsset,
project type, cost and size, location,
126
:descriptions, like a lot of rich data,
as well as who worked on the project,
127
:both inside and outside the company.
128
:If I click on one of these employees,
I get to an employee profile.
129
:So this is our chief
executive officer, Em Davison.
130
:And again, I'm pulling information
in from multiple systems.
131
:We've also got custom fields
and Synthesis you can use.
132
:And you can see things like professional
bio and education and skills and her
133
:posts and what projects she worked on.
134
:And then if I go all the way back up
to the employee directory, it's that
135
:same kind of idea with projects, right?
136
:I can filter employees on a
variety of different fields.
137
:Um, and search for them that way.
138
:And the last piece I want to show
you, because this is very pertinent
139
:to what we're doing today, is search.
140
:So I'm going to search
on something like Revit.
141
:You can see that I've got some
suggested search results, but if
142
:I hit return, I see everything.
143
:And so I see what we call best bets.
144
:So Rob Thompson's kind of
our go to person for Revit.
145
:But if I keep scrolling down, I see,
I'm, we're searching across pages and
146
:posts and documents and projects and
videos and all the stuff in Synthesis.
147
:So.
148
:So that's kind of Synthesis like at
a high level, um, the kind of main
149
:moving parts that make up the intranet.
150
:Evan Troxel: And Chris, the last time
that you were here, you showed us this.
151
:Really great graphic that you've come
up with this periodic table where
152
:you kind of talk about the different,
you, you can probably say it better
153
:Christopher Parsons: Sure.
154
:Evan Troxel: you have these categories
with different topics within each
155
:category that are kind of knowledge
156
:Christopher Parsons: Yeah.
157
:Yep.
158
:Evan Troxel: and that kind of shows that.
159
:There's a lot going on and you
just showed like how all of that
160
:manifests in an intranet but like
there's people responsible to get
161
:that content into the intranet.
162
:I always found that was one of the hardest
things to do was to get people to actually
163
:go do those things on the intranet to make
it available for everybody else, right?
164
:Um, it is a step way above
and I know Randall, you have
165
:a special place in your heart.
166
:from a content management standpoint
as well, of like, just making sure
167
:that the data is in there and that
people can find it and it's accessible,
168
:but it still relies, you know,
this is way better than files and
169
:folders on a, on a server somewhere.
170
:There's graphical aspect, there's layout,
there's, you can do all kinds of things.
171
:You can link between pages
much more easily than you can
172
:do between files on servers.
173
:You're still relying on people.
174
:And I think this is where we,
we, we start to talk about the
175
:future of Synthesis as well.
176
:Right?
177
:Which is.
178
:Like the elephant in the room with AI,
but up to this point, we've relied on
179
:people to do all of that work you're
showing this beautifully idealized version
180
:Christopher Parsons: Yeah.
181
:Evan Troxel: And then, and then I think
about like the reality of someone's
182
:SharePoint intranet or whatever they've
got going on and like half the data is
183
:on the right page and that person who
did it no longer works at the company
184
:and there's all kinds of other stuff.
185
:And I just want to talk about,
I just want to throw all of that
186
:out there because these are the
realities of working in a firm where.
187
:It's like, this is another job.
188
:Like we need content managers now to,
to make sure that this stuff stays up
189
:to date, that people are keeping an
eye on the community, that it actually
190
:is grease between the gears of the
firm, you know, and all these different
191
:departments and how they interconnect.
192
:And a lot going on there, but this is
potentially where AI and things like
193
:that can actually help because, you're
going to talk about some more specific
194
:examples, but you can throw data at it
and have it pull insights out of that
195
:in a much, I don't know, easier way.
196
:Let's just call it easier than it would
be for someone else to comb through a
197
:document, to split it up into all those
different topical, you know, the table
198
:of contents that you showed in the,
in the standards area, there's video.
199
:That, you know, has been typically if
somebody posts an hour long video, no one
200
:wants, wants to watch an hour long video.
201
:How can I find what
202
:Christopher Parsons: Right.
203
:Right.
204
:Evan Troxel: there's a lot of
challenges still in the content
205
:that even gets posted, let alone
just the management of all that
206
:Christopher Parsons: I think you're right.
207
:And I think we will dive into that.
208
:I think just at a high level, a
couple of ways I think about what
209
:you're saying are, one, making the
software easier to use has been a huge.
210
:level, easier to use and
then more beautiful the end
211
:product that you make with it.
212
:Right.
213
:That's, that's been really important,
especially we're a hundred percent
214
:AEC and that matters to the designers
and, and engineers in our community.
215
:Um, I think the second piece is raising
the ROI of having that knowledge captured
216
:in the platform in the first place.
217
:And so when you see what I'm about to
show you with where we're going with
218
:search and AI, I think people start
getting like, and this has happened
219
:already in our journey with Synthesis.
220
:It's like, oh, I get it.
221
:So if we put our data in Deltek,
that means I can get it out in
222
:Synthesis in this beautiful way.
223
:Or the same thing with
images from open assets.
224
:So by connecting systems, you raise the
return on investment of that information.
225
:Um, and so when you start seeing what
we're going to do with AI, I think
226
:it's going to create kind of like a
gravitational pull to get more content
227
:into the system because you'll realize,
Oh, if we have our best content, we're
228
:going to be able to do, and I know Randall
and the folks at Avail are investing
229
:in this as well, you know, the better
content that I get into my system,
230
:the further we can go as a company.
231
:And I think that's just going to be a
step change from where we are today.
232
:Randall Stevens: I was, I was going
to make a comment, Chris, that I,
233
:you know, I think you already Made
note that Synthesis is a great name.
234
:You know, those names usually
come about after you kind of
235
:figure out what you're doing.
236
:It's like, this could only be named
this if we, if we do it properly.
237
:But, um, you know, I've just made
a note that, you know, I think
238
:a challenge of these systems.
239
:of adding things like a Synthesis or
avail, uh, into an operation is to start
240
:trying to figure out how not to be yet
another thing that somebody has to do.
241
:But, but the by product of, of the main
things that you're doing and the workflows
242
:that you're doing should as automatically
as possible flow into these things.
243
:And, you know, I think about, you know,
when you show a Synthesis and I don't
244
:see how if you're a large firm that you
don't have, you That as your, you know,
245
:it's, I think of it as like, that's your
newspaper for the, that's your, that's
246
:your news source to keep everybody
contextually aware of what is going on.
247
:And as a firm scales, there's just no
way that everybody can kind of understand
248
:what everybody in the firm is doing.
249
:So it's a very, to me,
it's a very elegant way.
250
:And then the trick, like you said,
is it can't become, Somebody's job
251
:to go do all of those things because
it falls apart pretty quickly.
252
:So it's gotta be a, a by
product of the normal workflows.
253
:And I know that's what we were
always kind of, uh, trying to do
254
:is to figure out, uh, I don't want
to give you another job to do.
255
:I want to, I want you to
go do what you normally do.
256
:And then we'll try to, you
know, be a by product of that.
257
:Christopher Parsons: Totally.
258
:I do think though, like, yeah, as
a good example, like I showed that
259
:project management guide, like the
folks that put those things together.
260
:And I know you both work with firms that
have probably done things like that.
261
:Like those are labors of
262
:love done by a couple of key
individuals in the organization.
263
:Um, I don't think they were
generally happy about making
264
:hundred page long PDFs with them
because they're very hard to edit.
265
:Then you have to redistribute them.
266
:Like it's a terrible, like authoring
platform for something like that.
267
:So,
268
:and it's a dead document.
269
:Yeah,
270
:Evan Troxel: now I have to keep this
271
:Christopher Parsons: correct.
272
:Yep.
273
:Evan Troxel: the time.
274
:And so like that, the more Wikipedia
style knowledge base that is
275
:constantly updated with multiple
276
:Christopher Parsons: Right,
277
:Evan Troxel: who can do some
accountability with each other and
278
:they can hand it off and make, you
know, if they don't have, they're
279
:working on a project, somebody else can
come in and do it makes so much more
280
:Christopher Parsons: exactly.
281
:Evan Troxel: Just to keep
the life in these documents
282
:because they are constantly
283
:Christopher Parsons: you can put
short training videos in the middle
284
:of these, which is harder to do
with a document and you can add
285
:links to people and other documents.
286
:So yeah, it's a much better
experience, but to Randall's point,
287
:like this isn't a, somebody's job
288
:is doing this.
289
:Like, and so hopefully we're
giving them a better way to do it.
290
:Um, I, so that's kind of,
if I can share my screen,
291
:Evan Troxel: Mm-Hmm.
292
:Christopher Parsons: Um, so
I, I have shown you kind of
293
:where Synthesis is today.
294
:And what I want to be talking
about is where we're kind of going.
295
:And so our mission, like kind of
the way we'll describe the product
296
:in the future is around being an
integrated intranet LMS and enterprise
297
:search platform for AEC firms.
298
:So I touched on the intranet piece.
299
:I want to touch on Synthesis LMS.
300
:I've just got three slides on it because
again, it'll get a sense of how the
301
:intranet LMS and enterprise search will
all come together and the kind of content
302
:that will live in Synthesis in the future
and kind of like what we're building for.
303
:Um, so this is, and this is a mock, right?
304
:We're in design now.
305
:We're going to start construction
this summer on this project, but
306
:this is a course in Synthesis LMS
called Project Communications.
307
:And I'm in a lesson called
Client Communication Strategies.
308
:And in this example, there's six
lessons on the right hand side.
309
:You can see that I'm through one lesson.
310
:Um, I can see who the
instructors are on the right.
311
:And it's a, you know, an 18 minute
video plus a handout, right?
312
:On active listening fundamentals.
313
:So we have seen our clients
over the years, try and use the
314
:intranet platform to do this kind
of stuff and do reasonably well.
315
:And we kind of call it LMS Lite.
316
:And we had just come to the point
after finishing Synthesis 6 where we
317
:went out to our clients and had a big
listening tour and they're like, We want
318
:you to work on two things, AI and LMS.
319
:So this is the LMS piece and we're
going to talk about the AI piece.
320
:Um, all those courses can get
bundled up into a course catalog.
321
:So in the example I'm showing you here,
there's courses around onboarding and
322
:project management, but you know, our
clients create courses around things
323
:like design technology, sustainability,
leadership, design itself, marketing
324
:and business development, and so on.
325
:So this is meant to be one central place
people can have all of their educational
326
:Randall Stevens: Chris, do you, do you,
327
:are you hosting those videos or do you
support Vimeo and YouTube and other places
328
:that are, you know, traditional places
that people would host video content?
329
:Christopher Parsons: Uh, yes to both.
330
:So we have a feature called
Synthesis Native Video.
331
:We, uh, released in 2018.
332
:We are in the process of
doing a major overhaul.
333
:We'll actually talk about that toward
the end of the conversation around us
334
:using AI to generate, um, AEC specific
transcriptions for those videos.
335
:We're also adding things like
chapters and some other neat
336
:features, but no, the Synthesis
Native Video is a key part of that.
337
:of kind of how all of this
338
:comes together.
339
:And we're going to talk
about that in some detail.
340
:But we also do, you know, through embed
341
:codes, you know, you can drop in
Vimeo, YouTube, that kind of thing.
342
:and the last piece about the
Learning Center is the ability
343
:to bundle those courses up into
what we call Learning Paths.
344
:So for example, if you look
under onboarding, there's two
345
:learning paths on onboarding.
346
:There's one that's called
general onboarding.
347
:That's got like seven courses.
348
:And then there's another one
for project manager onboarding,
349
:which got, has got 10 courses.
350
:So you take that kind of core seven
courses for, you know, everyone
351
:that comes and joins the firm.
352
:And then you add three more to
like onboard project managers.
353
:So the way we do project
management here at the company.
354
:So, you know, it will also have
assignments and transcriptions and
355
:analytics and all the kind of basic
things you'd expect to find in an LMS.
356
:And this is scheduled to go
into beta next year in:
357
:So we have a 30 minute concept
video laying all this out on the
358
:roadmap, page on our website.
359
:Maybe we can drop it in the show
notes, but that's as far as I want to
360
:take it for what we're doing today.
361
So let's talk about AI. For us, it's three things that all work together. There's next-generation search, which we'll start with. There's video captions, which Randall, you just kind of teed us up for. And then we have a program we introduced called Community AI, which is going to help us build AEC-specific models to make this work even better.

So now that we've seen the content, I want to do a deep dive into how we're going to make it even more searchable. As we saw in that quick search example, today we're using keyword search. We've done pretty well with keyword search, but we've taken it about as far as we think we can. So I'm searching on a keyword like jury duty, and maybe what I really wanted to do was ask a question, right? How many hours does our company provide for jury duty service? Or what's our jury duty policy? And when I execute that search, what I get back are links, in this example to pages and to documents, which is how we've generally done search.

Where we're going is using generative AI and other advances in search technology to make answers that summarize useful information from multiple different sources automatically. So for example, for this question about what's our jury duty policy, we're giving part of an answer around allocation of hours (so how many hours do we give), documentation (so how do you prove you were on jury duty), and how to request time off, plus the ability to ask follow-up questions. And if I drill into one of these search summaries, I can link to that source; we're citing the source where we got that information.

Another example: I'm searching on worksharing, and what I really want to ask is, should I use Revit worksharing or central files? And then I want to get an answer back, right? So it's summarizing the benefits of worksharing, the benefits of central files, and then giving a recommendation. On the right, you can see the top sources that were used in generating that search summary. And again, I can drill in and find the underlying information that was used to generate this answer. And if I click on it, that will take me directly into that video, and directly into the part of the video that contains the passage that was used to generate that answer. So this is Synthesis Native Video, and I'll throw it to your earlier question: that's how we're making this work.
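To make that last step concrete, here is a minimal sketch of how a cited transcript passage could carry a video ID and timestamp so a citation can open the player at the right moment. The class, field names, and URL format are illustrative assumptions, not the actual Synthesis implementation.

```python
# Illustrative sketch only: a retrieved transcript passage carries enough
# metadata to deep-link a citation into the exact moment of a video.
# Names and the URL format are assumptions, not the real Synthesis schema.
from dataclasses import dataclass

@dataclass
class TranscriptChunk:
    video_id: str         # which video library item the passage came from
    start_seconds: float  # where the passage begins in the recording
    end_seconds: float
    text: str             # the transcript passage used to generate the answer

def citation_link(chunk: TranscriptChunk, base_url: str = "https://intranet.example.com") -> str:
    """Build a link that opens the video player seeked to the cited passage."""
    return f"{base_url}/videos/{chunk.video_id}?t={int(chunk.start_seconds)}"

# Example: a passage about Revit worksharing found about 14.5 minutes into a training video.
passage = TranscriptChunk("revit-worksharing-101", 872.0, 905.0,
                          "Use worksharing when several people need the model at once...")
print(citation_link(passage))  # https://intranet.example.com/videos/revit-worksharing-101?t=872
```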
:Evan Troxel: So that's interesting
because you're doing things that I think
405
:YouTube is trying to do as well, right?
406
:Where, where they're automatically kind
of looking, they're transcribing, they're
407
:looking at the conversation there.
408
:I know whenever I upload a YouTube video,
it's like, do you want to automatically
409
:it find moments or whatever they call it?
410
:They don't even call it chapters.
411
:Like you can put in your own
chapters in the description field
412
:to help people navigate to a certain
thing that I think you might want
413
:to navigate to, but they're also.
414
:looking at finding smarter ways to
find those key moments in videos.
415
:But then you're, you're locking kind of
a timestamp and what was said to a, to
416
:a question, which makes it much easier
to, like I was talking about earlier,
417
:no one wants to watch the hour long
video too long, didn't watch, right?
418
:I just want the key takeaways,
or I want to jump to a specific
419
:point in that video that is
regarding the question that I have.
420
:So that's what you guys are really
taking on here with your own
421
:Christopher Parsons: Right.
422
:Evan Troxel: You can then process that
in the background and, and then apply
423
:all of that metadata back into the, the
424
:Christopher Parsons: Correct.
425
:And I think that the other reason this
is a huge unlock is, you know, because
426
:we work 100 percent AEC, and I'm going
to touch on this a little bit later.
427
:There's so much AEC terminology
and lexicon acronyms and product
428
:names and material names and all the
stuff that we do in this industry.
429
:So much lexicon stuff.
430
:So,
431
:Evan Troxel: That's where YouTube
432
:Christopher Parsons: right, exactly.
433
:Evan Troxel: Because it, it says Rabbit
434
:Christopher Parsons: You got it.
435
:Evan Troxel: and and there's all,
you know, that's just one example,
436
:but there's so many acronyms.
437
:There's so much terminology and jargon
and vocabulary that, you know, it's,
438
:it's great when you're in an office,
you need to be using that because
439
:you're all working in the same field
and you're all doing the same things.
440
:Thanks.
441
:And it, you have to be able to then, like
you're saying, like, it's a huge advantage
442
:to, have that vocabulary be a part of the
443
:Christopher Parsons: Exactly.
444
:Evan Troxel: that it finds
exactly what you're looking for.
445
:And it's not it as a misinterpretation.
446
:And now I can't even find it
because I didn't search for rabbit.
447
:I searched for Revit,
448
:Christopher Parsons: right?
449
:Or I searched on standards
and we use guideline.
450
:Like there's so much stuff
451
:like that.
452
:And so, um, you're teaming up
really, really well, actually.
453
:So that's it for, I've got
for like of the front end.
454
:And this project I've been talking about
as an iceberg all along, hopefully not
455
:in the Titanic sense of the word, but
in the 10 percent of this project is
456
:UI above the waterline, 90 percent of
what we're doing to bring that search
457
:stuff to life is under the waterline.
458
:It's invisible.
459
:It's infrastructure behind the scenes.
460
:And what I want to share with you and
your community is kind of what's below
461
:the waterline and like how the generative
kind of search results actually work.
462
:Yeah, please.
463
:Totally.
464
:Evan Troxel: grown up with the
internet, And so we've all been trained
465
:to use the search tools that are on
the internet to our advantage, which
466
:was typically keyword based, right?
467
:So I'm a keyword searcher and it is
such a mental leap for me to ask a
468
:computer a well formulated question.
469
:It's like you go into the chat GPT, you
go into Claude, you go into whatever.
470
:And, and, It's funny because my, my
wife will, she, she adjusted and adopted
471
:this way of doing a search a while
ago, which is just ask it a question
472
:and actually put a question mark at
the end of it in the Google search.
473
:I still don't, I still can't even
do, I can't bring myself to do that
474
:because it's not my muscle memory.
475
:And so I'm wondering like
with firms and, and how you're
476
:creating an integration with that.
477
:Cause I, I would assume that.
478
:If I don't even know the right question to
ask, but I just know, okay, work sharing
479
:Christopher Parsons: Right.
480
:Evan Troxel: I can type in work
sharing and then maybe there's Maybe
481
:there's even other questions that
have been, that have come up from
482
:other people about work sharing.
483
:So it's like, Oh yeah, that's, that's
really what I want to ask because
484
:me, I'm, I'm just a work sharing.
485
:I would say Revit work sharing, right?
486
:Christopher Parsons: Totally.
487
:Yeah.
488
:Evan Troxel: see what comes back.
489
:I don't ask it the question.
490
:And now this is all, this is
very much a communication.
491
:This is a conversation with,
with the computer at this point.
492
:And I'm not conversational with computers.
493
:That's just not how I'm
494
:Christopher Parsons: Yeah, I think you're,
you're on to, and I think you're right
495
:for probably representing most users.
496
:Um, and so I think for us, we have to
do really well when people are still
497
:using keywords in Synthesis search.
498
:And I think, you know, for example, let's
take something really prosaic, like Wi Fi.
499
:Like if somebody's asking, if someone
types in Wi Fi, what are they asking?
500
:They're not asking which
Wi Fi technology do we use?
501
:Not asking, like, do we have Wi Fi?
502
:They're asking what's
503
:the, you know, the, Wi Fi network in
my office and what's the password.
504
:That's what they want to know.
505
:Right?
506
:So.
507
:Randall Stevens: Every
508
:firm needs that.
509
:Every conference room I go into,
they cannot get me on the Wi Fi.
510
:I
511
:Christopher Parsons: from a meeting
people where they are perspective,
512
:but also from just an efficiency, like
why type 10 words if I can, should be
513
:able to get what I need with one word.
514
:Um, and you know, the more
people type, the more the
515
:typos spike, you know, all the
516
:things, um,
517
:yeah,
518
:Randall Stevens: you know, you can,
you can begin as you're capturing, uh,
519
:You can begin to make those suggestions
as somebody starts to type something in.
520
:It's like, Hey, these are the
popular things that have been
521
:asked before, even for the user
to not have to retype it, right?
522
:It's like, okay,
523
:Evan Troxel: So useful, right?
524
:It's like auto complete for
what I was thinking, not even
525
:Randall Stevens: Now, you know, Google's,
uh, Google's gotten really good, right?
526
:It's like content, you know,
it's contextual awareness.
527
:Uh, and as you start to type
something, it's like, it knows
528
:what I'm getting ready to,
529
:Evan Troxel: They've got a
530
:Randall Stevens: Right.
531
:Christopher Parsons: They
got a pretty big user base.
532
:Yeah.
533
:I mean, I think that's one,
the, the tool that I've been
534
:super inspired by is Perplexity.
535
:I don't know if either of you have
used that, um, yet, but it's fantastic.
536
:And it like really handles that.
537
:from one word all the way
through a complete, you know,
538
:multi sentence question.
539
:Um, well, and so it really depends.
540
:And sometimes like one word does it,
or two words do it, whatever it works
541
:sharing would be fine, or if it's like,
yeah, I see what you answered, but like,
542
:really now I know what it is I'm asking.
543
:What I'm asking is, you know, X, Y, Z.
544
:So.
545
:Evan Troxel: That's usually
where it leads me, right?
546
:It's like, okay, now, okay, now
that I've seen a little bit more,
547
:I can ask a little bit better of a
question, and like five questions from
548
:Christopher Parsons: That's right.
549
:100%.
550
:Evan Troxel: wanted.
551
Christopher Parsons: Yeah. Back to the iceberg. So when we started approaching this, obviously when everything changed with ChatGPT, we started asking the question, like every tech company, well, every company writ large, not just tech companies: okay, what does this mean for us? What does that mean for our clients? And we quickly keyed in on search and video transcription as being two important things we wanted to build. They weren't little quick hits; we didn't rush something out. We really took a lot of time thinking about architecture.

One of the ways we could have built this was to build an end-user-facing large language model. Questions would come into Synthesis, we'd use an LLM that we had trained, and we'd generate answers, right? This is, overly simplified, basically what ChatGPT does. And for a host of reasons, this was the wrong architecture for us. The challenges are that it is prone to hallucinations, just because of its data set, which oftentimes goes across different companies; that can be a really big problem. It's got a problem with freshness, because you train the model and then you retrain it nine or twelve months later, so it starts getting a gap: either you're holding on to old information too long, or you're not recognizing new stuff fast enough. And then it's got a permission issue inside a company. If we have, for example, a principals or shareholders or management team community in Synthesis, with an end-user-facing LLM, who can see what it's used to generate? It's just too much. It's too complicated. It's not what we liked the sound of.

So it's like, all right, is there another way? And what we came across is this emerging approach called RAG, which stands for retrieval augmented generation. And we love it. This is how it works in a nutshell. A query comes in, like, what's our jury duty policy? We looked at this earlier. We execute a search, and specifically a vector search, which I'm going to come back to in a little bit. The vector search retrieves passages of key text from the relevant resources: the right passage out of a video, the right passage out of a document (you know, it's on page 20), the right passage out of a post or a page in the platform. It sends those passages to the large language model, which will then summarize them and generate those answers and citations we looked at. So we're not indexing all of the knowledge to make an oracle that knows everything. We're using search, and then we're just using the LLM as a summarization tool, because it's very good at doing that.
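For readers who want to see the shape of that flow, here is a minimal sketch of the retrieve-then-summarize loop. The `retrieve` and `summarize` callables are placeholders standing in for a vector search and an LLM call; none of this is Knowledge Architecture's actual code.

```python
# A minimal sketch of retrieval augmented generation as described above:
# retrieve a handful of relevant passages, then ask an LLM only to
# summarize them with citations. `retrieve` and `summarize` are assumed
# placeholders, not real services.
from typing import Callable, Dict, List

Passage = Dict[str, str]  # e.g. {"source": "Time Off page", "text": "..."}

def answer_question(
    query: str,
    retrieve: Callable[[str], List[Passage]],
    summarize: Callable[[str], str],
) -> Dict:
    # 1. Retrieval: pull the most relevant passages from the firm's own content.
    passages = retrieve(query)

    # 2. Generation: the LLM only summarizes what was retrieved, which keeps
    #    the answer grounded and lets every claim carry a citation.
    numbered = "\n\n".join(f"[{i + 1}] {p['text']}" for i, p in enumerate(passages))
    prompt = (
        "Using only the numbered passages below, answer the question and cite "
        "passage numbers. If the passages do not answer it, say so.\n\n"
        f"{numbered}\n\nQuestion: {query}"
    )
    answer = summarize(prompt)

    # 3. Hand back the sources so the UI can render clickable citations.
    return {"answer": answer, "sources": passages}
```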
Evan Troxel: Yeah, so would you call it more of a just-in-time kind of machine, or, I mean...

Christopher Parsons: Yeah, I think that's great. I mean, if I put a post up right now, by the time it's done indexing and it gets embedded, and we'll talk about that a little bit later, then the next time I do a search one minute from now, if it's related to that post I just added, it would be included in that search and fed into the search summary. So it very much is a...

Evan Troxel: This goes back to that dead document thing...

Christopher Parsons: Exactly. Yeah. The freshness issue.

Evan Troxel: ...that you brought up a minute ago.

Christopher Parsons: Yeah. So it was super important to get something that really tackled, well, let's talk about it, that really tackled hallucinations, freshness, and permissions. We counter hallucinations by grounding the search in your data. This only searches your company's data; it doesn't do anything across our different clients or from the internet. This is really your content. So again, that's that search that retrieves the passages, passes them to the LLM, and then generates the citations. The real-time search you just mentioned, that's a big part of it. And then permission search: if Randall is a principal and Evan, you're a job captain, you're going to get different passages retrieved based on your permission level that get fed into the LLM. So we can do this summarization, but we can do it in a way that honors permissions.
Evan Troxel: So how are the permissions handled? Are they handled during the upload of the new information, or is it based on places? I'll generically call it places in the...

Christopher Parsons: Yeah.

Evan Troxel: So if it's like HR, everyone in HR has access to everything in HR, but outside of that it's certain-eyes-only, right? How do you handle that when new information is constantly being added? Is it still going into documents, and then there's a permission on that document in that folder that the LLM is looking at when you ask it a question, or how do you handle it?

Christopher Parsons: Yeah, it's very much the latter. It's based on places. We call them communities, and we have public communities and private communities in Synthesis. Public communities are what they sound like: anyone can see anything in them. Within a public community you have different permissions for who can edit content versus who are end users, but everyone can see that content. And in a private community, you have to add users to it manually. That's as far as we took it. As you two may know, in something like SharePoint you can get so granular with permissions, down at the document level, that it makes things super complicated and people can't remember who can see what. So we launched Synthesis 6 with a very simple permission model, and it's stood the last couple of years of testing really well. Simplicity is always good if you can get it. So that's kind of the basics of retrieval augmented generation and how it improves search.
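As a rough illustration of how that community model might gate retrieval before anything reaches the LLM, here is a small sketch. The data shapes, community names, and the idea of filtering by similarity score are assumptions for illustration, not the actual Synthesis schema.

```python
# Sketch of permission-aware retrieval using the community model described
# above: everyone can read public communities, private communities require
# membership, and the filter is applied when passages are retrieved, before
# anything is sent to the LLM. All names and shapes are illustrative.
from dataclasses import dataclass
from typing import List, Set

@dataclass
class Chunk:
    text: str
    community: str   # the community (place) the source content lives in
    score: float     # similarity score from the vector search

PUBLIC_COMMUNITIES = {"home", "hr", "projects-and-practice", "technology"}

def visible_communities(memberships: Set[str]) -> Set[str]:
    """A user sees every public community plus the private ones they belong to."""
    return PUBLIC_COMMUNITIES | memberships

def permitted(chunks: List[Chunk], memberships: Set[str], top_k: int = 8) -> List[Chunk]:
    allowed = visible_communities(memberships)
    readable = [c for c in chunks if c.community in allowed]
    return sorted(readable, key=lambda c: c.score, reverse=True)[:top_k]

# Example: a principal in the private "shareholders" community gets passages
# a job captain would never see fed into the summary.
hits = [Chunk("2025 bonus pool guidance...", "shareholders", 0.92),
        Chunk("PTO accrues at...", "hr", 0.88)]
print(len(permitted(hits, {"shareholders"})))  # 2
print(len(permitted(hits, set())))             # 1
```

Because the filter runs at query time, a passage added to a private community a minute ago is both immediately searchable for its members and still hidden from everyone else.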
Christopher Parsons: So there are three other concepts that are important to how we're building this: vector search, which I lightly referenced earlier; our Community AI program, where we're building those AEC-specific models; and our approach to enterprise search.

Vector search is super interesting, and I'm going to overly simplify it. I'm sure there are some people in your audience and your community who know about vector search, and they're going to be like, dude, you're totally oversimplifying. Yes, I am acknowledging I am oversimplifying this. So think of it as a multi-dimensional space with thousands of possible vectors, right? Basically, what we're doing with vector search is turning text into numbers, coordinates really. So for example, programming might be near floor plan or space planning in one sense of the word. Programming may be near coding or software development in another sense of the word. And programming might be near scheduling or event planning in yet another sense of the word.
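As a toy illustration of "text becomes coordinates," here is a small cosine-similarity example. The three-dimensional vectors are made up purely for intuition; a real embeddings model produces vectors with hundreds or thousands of dimensions.

```python
# Toy illustration of measuring "nearness" between pieces of text once they
# have been turned into vectors. The vectors below are invented, not real
# embeddings model output.
import math
from typing import List

def cosine_similarity(a: List[float], b: List[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Pretend axes: [architectural planning, software, events]
vectors = {
    "programming (space planning sense)": [0.9, 0.1, 0.2],
    "floor plan":                          [0.8, 0.0, 0.1],
    "software development":                [0.1, 0.9, 0.0],
    "event scheduling":                    [0.1, 0.1, 0.9],
}

query = vectors["programming (space planning sense)"]
for label, vec in vectors.items():
    print(f"{label:38s} {cosine_similarity(query, vec):.2f}")
# "floor plan" scores highest among the other entries, which is why a search
# for healthcare programming templates can surface space-planning content
# rather than coding tutorials.
```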
So let's go back to RAG to see this in action. When that query comes in, this time around, do we have any healthcare programming templates? We execute a vector search, which goes out to a vector database. It has stored the locations of all of our content using those coordinates. It then retrieves the passages of key text from the most relevant resources, sends them to the large language model for summarization, and then it can do the citations.

So the question is, how does the vector database know which passages to retrieve? The answer to that gets into our vector search ingestion pipeline, which is as exciting as it sounds. Whenever anybody uploads or edits a Synthesis resource, that resource gets broken down into smaller chunks of text. For example, let's take a Synthesis page, a simple one on time off. We're going to break that down using the page headers into chunks: a chunk around bereavement time, a chunk around jury and witness duty, and a chunk around PTO. Or, this is a feature in Synthesis I didn't show, but these are collapsible sections, right? We've got all these different collapsible sections having to do with payroll: one on W-2s, one on how raises and bonuses are decided, et cetera. So we use those as a signal on where to break up a page, and then we put them into those smaller chunks.

We take those chunks and we send them to what's called an embeddings model. I'm going to go into this in a little bit of detail, but the embeddings model knows how to look at those chunks of text, and it embeds coordinates, it embeds vectors, into them based on their meaning. And they look like this: really just a string of numbers. It's complete nonsense to us, but the vector database and the embeddings model know what it means. And so we add those chunks in the right location using that vector, right? This will become a little clearer at the next step, but the key to this whole organization is the embeddings model, because it establishes the vector space in which meaning is assigned.
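Here is a compact sketch of that ingestion flow: split a page on its headers, embed each chunk, and store the vectors so retrieval can find them later. The helper names, markdown-style headers, and callables are assumptions for illustration, not the real pipeline.

```python
# Sketch of the ingestion pipeline described above. `embed` and `upsert`
# stand in for an embeddings model and a vector database; the header format
# is assumed to be markdown-style for the example.
import re
from typing import Callable, Dict, List

def chunk_by_headers(page_text: str) -> List[Dict]:
    """Split a page on its section headers (here, '## Header' lines)."""
    sections = re.split(r"\n(?=## )", page_text)
    chunks = []
    for section in sections:
        header, _, body = section.partition("\n")
        chunks.append({"header": header.lstrip("# ").strip(), "text": body.strip()})
    return chunks

def ingest_page(page_id: str, content: str,
                embed: Callable[[str], List[float]],
                upsert: Callable[[str, List[float], Dict], None]) -> None:
    # Re-run on every upload or edit so search stays fresh within minutes.
    for i, chunk in enumerate(chunk_by_headers(content)):
        vector = embed(chunk["text"])            # meaning -> coordinates
        upsert(f"{page_id}#{i}", vector, chunk)  # store vector plus payload

# Example: a simple time-off page yields one chunk per policy section.
page = "## Bereavement time\n...\n## Jury and witness duty\n...\n## PTO\n..."
print([c["header"] for c in chunk_by_headers(page)])
# ['Bereavement time', 'Jury and witness duty', 'PTO']
```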
729
:Um, And because this is
such a key, am I good?
730
:Should I pause or should I do, are there
any questions you guys want to ask?
731
:Or should I keep, keep me going?
732
:Yeah, please.
733
:Yeah,
734
:Evan Troxel: that I know we'll be
talking a little bit about, but
735
:it's just a run on conversation.
736
:And, and so there aren't these
Chunks that you're talking about
737
:Christopher Parsons: right.
738
:Evan Troxel: And I know, like, I think
this goes back to like the thing that
739
:we were talking about earlier with
YouTube kind of trying to identify
740
:Christopher Parsons: Yes.
741
:Evan Troxel: And so maybe there's
sentiment changes, maybe there's.
742
:There's pauses.
743
:I don't know how it's doing it.
744
:I mean, maybe you have
more insight into that.
745
:So my, my first question is, is
there, it's like when, when the
746
:document does not have any kind of
hierarchy to it, there's no structure.
747
:This is what AI seems to
be pretty good at, right?
748
:You can think of it.
749
:Throw like a giant thing and it
can, it can kind of figure that out.
750
:Like I use this all the time.
751
:I use
752
:Christopher Parsons: Yeah.
753
:Yeah.
754
:Evan Troxel: chat GPT too.
755
:You can, you can summarize documents.
756
:You can say, tell me the three key point,
key takeaways from this conversation.
757
:And it does a pretty good job at that.
758
:And so I'm just wondering, is that, is
that really helping you here as well?
759
:And is that, is that really
how you're attacking this
760
:Christopher Parsons: I think so.
761
:Um, I, the reason I say, I think
so is we're evaluating a couple of
762
:different approaches and we don't know,
today on April 25th, which one we're
763
:going to pick, but that is certainly
one of them is to, and you know, it
764
:doesn't have to be, it doesn't have
to be a hundred percent precise.
765
:It just has to be good enough,
you know, to break it down.
766
:Like this, we're going to be able to
find, you know, what we need anyway.
767
:But, but I think your point is right.
768
:That's, and that is, you can read,
I've actually read a lot about how
769
:the Google key moments things works.
770
:And it's a lot of what you're saying.
771
:It's like looking for
changes in inflection.
772
:It's looking for.
773
:breaks it in some cases it will even
OCR the slides if there's slides or
774
:the video and kind of get a sense
that something's changing there.
775
:Um, so yeah, there's a lot that
goes into it, but it's super
776
:interesting how they built that.
777
:So yeah, that's when we look at
like unstructured, like a video,
778
:which is, as you said, just
like this long rambling thing.
779
:Um, we're going to have to infer some,
put some structure onto it versus
780
:just like break up every 200 words.
781
:Like
782
:that's probably too
783
:Randall Stevens-1: Yeah, this
is where, you know, the context
784
:becomes everything, right?
785
:The better, the better context you can
put around all this information, the more
786
:accurate your results are going to be.
787
:Evan Troxel: I think, you know,
to the question or to the topic
788
:about, I mean, you're using a great
example here with programming, right?
789
:I think of architect programming, clients
don't even know what that means, but
790
:Christopher Parsons: Right.
791
:Evan Troxel: that word in front of them.
792
:Um, and, and they, they learn what
it means going through the process.
793
:But if you were to bring up programming,
they would never guess that it's about
794
:spaces and adjacency and size and scope
and all those things and square footage.
795
:Like, they wouldn't, they
wouldn't think about that.
796
:And my question is, is, The older
generations in our firm who are the
797
:ones with a lot of that encapsulated
798
:Christopher Parsons: Hmm.
799
:Evan Troxel: like how are
you getting that out of them?
800
:Because a lot of this just
happens through conversation.
801
:It's not recorded.
802
:It's, and so, I mean, do
you have advice for firms on
803
:Christopher Parsons: Hmm.
804
:Evan Troxel: capture this information
moving forward to get that?
805
:put into these models so that these
models are smarter because again,
806
:like a lot of this is just spoken.
807
:It's never recorded in
or maybe it's in an email
808
:Christopher Parsons: Sure.
809
:Evan Troxel: and, and, and trying to like
dig all that up and get it in there to
810
:train this my firm would be really hard.
811
:I mean, are you just doing that on behalf
of everybody with your opt in firms AEC
812
Christopher Parsons: Yeah, it's a great segue, right? So we are offering two different embeddings models to our clients. For the folks who opt out, which is the default, they'll just use the generic open source embeddings model. And for the people who opt in and contribute content, they'll use the AEC-specific model.
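A small sketch of what that choice might look like per firm. The model names and settings shape are invented for illustration; the one real constraint worth noting is that vectors from different embeddings models live in different spaces, so a firm's whole index has to be built (or rebuilt) with whichever model it uses.

```python
# Illustrative sketch of a per-firm embeddings model choice; names are assumptions.
from dataclasses import dataclass

@dataclass
class FirmSettings:
    firm_id: str
    community_ai_opt_in: bool = False  # opting out is the default

GENERIC_MODEL = "generic-open-source-embeddings"
AEC_MODEL = "community-ai-aec-embeddings"

def embeddings_model_for(firm: FirmSettings) -> str:
    # One model per firm: mixing models would make vectors incomparable.
    return AEC_MODEL if firm.community_ai_opt_in else GENERIC_MODEL

print(embeddings_model_for(FirmSettings("firm-001")))                             # generic
print(embeddings_model_for(FirmSettings("firm-002", community_ai_opt_in=True)))   # AEC-specific
```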
:Um, but I want to come back, go back
to your question a little bit, because
819
:I think it's a super important one.
820
:Yeah.
821
:It's one of the reasons
we're building an LMS.
822
:I mean, we're obviously primary reason
we're building an LMS is to help our
823
:clients grow and develop their people.
824
:and it turns out that, you know,
what's really important, isn't just
825
:volume of content for what we're trying
to do in the kind of search case.
826
:I showed, we want high quality content
and we want a lot of it, but we want
827
:the quality parts really important.
828
:And so.
829
:When people generally take the time to put
together, and you know, this could just
830
:be a simple hour recorded zoom on Zero
carbon buildings or something like that.
831
:Or, like how to work with healthcare
and like, here are the main phases and
832
:like, we're gonna talk about programming
is one of them or something like that,
833
:what I have observed in my career
and why watching our clients is that,
834
:you know, the people that put those
things together, put a lot of thought
835
:and care into them, and then there
are interesting discussions that kind
836
:of come out in the Q and a, and so.
837
:It's why video is such a core technology
for us, because it's easier for them
838
:to do that, you know, Zoom meeting
and kind of share their knowledge,
839
:then sit down and write a guide or
write a document or write a long post.
840
:Um, and then if they can, and this
is why the LMS, now it's like, okay,
841
:but then we can reuse that content.
842
:It's not just the Lunch and Learn
that happened on a Tuesday in March.
843
:This is now a course that can be for
future employees that come along that
844
:want to learn about, you know, content.
845
:You know, net zero projects, and
it can be assigned, or it can
846
:be recommended depending on your
career track or your learning path.
847
:And so we get more ROI out of those
videos in terms of educating folks, but
848
:then we get more high quality content
we can use to answer people's questions,
849
:but then also train these models to
850
:understand that AEC
851
:Randall Stevens-1: I wish both of you
guys had been able to be in New York
852
:with us last week for the Confluence
853
:event that we had.
854
:But we had, two of our speakers were from
Thornton, the core studio at Thornton
855
:Tomasetti, and, um, it actually wasn't
part of their, uh, Uh, presentation, but
856
:afterwards, as part of the discussion,
there was a, uh, a gentleman that
857
:worked at Thornton Tomasetti for 30
plus years who passed away recently.
858
:And, uh, they had built an AI engine.
859
:He had, uh, he had documented, like, uh, I
think they threw out the number, like 10,
860
:000 interactions that were through either
through email correspondence or Q and A.
861
:So he was literally the, you know,
862
:the, the, the person that knew the
863
:most.
864
:you know, it's a really beautiful
thing when you think about it as,
865
:as, you know, how do you capture that
knowledge and experience in such a way
866
:that can get to the next generation?
867
:Cause it's, you know, and I think the,
you know, what you're working on, Chris,
868
:the, the, this kind of interaction
with that kind of information is how
869
:can learnings pass on to the next
870
:gen and we don't lose, we
don't lose what was valuable.
871
:And we got into the
discussion about, you know.
872
:In the end, when you get a lot
of information, it's like, what
873
:might've been true 20 years ago
874
:may not be true today.
875
:So you have to fight that part of it.
876
:The other piece I'll throw out, and I'd
love to engage with your, uh, in this
877
:kind of thinking, but I've been throwing
out, it's like a lot of the work that
878
:everybody's doing on AI is, you know,
we have a lot of information in this
879
:industry, especially in the form of kind
of I'll just call it finished product.
880
:It's like, here's what we designed.
881
:Here's what this ended up, right?
882
:So, so you can easily go and look at that.
883
:What's, what seems to be missing
is the, how you got there.
884
:It's the, it's the why,
why did you make that
885
:decision?
886
:Yeah.
887
:And so I've been like, um, uh, kind
of throwing out this, like what you
888
:really want to start happening is, and
I actually this past weekend post that
889
:Confluence event, I actually went on
Amazon looking for, I want a red orb.
890
:I want this like device.
891
:That when we're back in person with each
other, I wanna like drop that on the
892
:table and hit record and it just be this,
it's just gathering the conversation
893
:and the why as part of the process.
894
:'cause it, I feel like that's part
of, you know, we're gonna, we're gonna
895
:get the end result of this stuff,
the polished end result, but we're
896
:missing the explanations of that.
897
:So anyway.
898
:I'd love to hear kinda what your thoughts
about where this is gonna end up.
899
:Where
900
:does this
901
:go?
902
:Christopher Parsons: Yeah, I'll
just, I'll give you, so the, in,
903
:in, in, I'll give you some KM speak
for what we're describing, which
904
:is critical knowledge transfer.
905
:And so critical knowledge transfer talks
about, you know, especially when you're
906
:talking about, in many cases, it's used
for senior people who were there at
907
:key moments, founding, launching new
services or new market types, but it's
908
:also when you've got, you know, a super.
909
:Rare subject matter expert, and you
want to just be able to like leverage
910
:their knowledge to other people.
911
:We, we ran an entire day at KA Connect,
uh, our annual conference a few years
912
:ago on critical knowledge transfer.
913
:We brought in an expert.
914
:She was a former professor at MIT
and Harvard and the business school.
915
:This was her career.
916
:She wrote a book called Deep
Smarts, which I highly recommend
917
:her name's Dorothy Leonard.
918
:And we ran a project with four of our
clients over the course of six to nine
919
:months where Dorothy kind of like used
her critical knowledge method methodology.
920
:And we just ran projects in those
companies to kind of pick out a couple
921
:of those experts and figure out how
to flesh this deep wise stuff out.
922
:Like it's the, it's the deep smarts.
923
:It's not like, you know,
simple, simple things like the
924
:project details and the square
925
:footage and whatever, it's
like, it's wisdom, it's
926
:wisdom.
927
:There was a story from
Dewberry that was amazing.
928
:Very senior, long tenured engineer.
929
:And the thing that came out that he had
that just people didn't understand is
930
:he had this list of seven priorities
when you're working on a project.
931
:And what he said is the important
thing is the order, right?
932
:Cause that's where values are determined.
933
:And so it's like, you know, for example,
safety, like it doesn't matter, cost,
934
:scope, all these different things.
935
:It's like, if this thing
is unsafe, We can't do it.
936
:And so it's kind of like,
that's a deep way of
937
:thinking about how to
938
:Randall Stevens-1: Yeah,
It's like a framework,
939
:Christopher Parsons: It's a framework,
940
:you're right.
941
:You want to excavate these frameworks
that a lot of times these experts
942
:don't even know that they have, right.
943
:They're implicit and then they're
in the back of their brain, but
944
:they don't know that they use them.
945
:And so that's what this kind of process
of critical knowledge transfer is
946
:about, is about excavating those things.
947
:And they're oftentimes done
through project stories is how
948
:you get them out of people.
949
:Right.
950
:You say like, okay, well,
you did this on this project.
951
:What did you consider?
952
:What alternatives did you look at?
953
:How did you make the
decision that you did?
954
:And through that process of giving
those exact examples, that's where you
955
:start finding out those kinds of hidden
frameworks that people are using and their
956
:heuristics that they're not
957
:even
958
:Randall Stevens-1: yeah, I
think about, um, you know,
959
:especially I'm a great example.
960
:I'm not right out of school,
but I never practiced.
961
:So it's like I've got, you
know, went to architecture
962
:school, but I never practiced.
963
:So I'm, you know, I don't have
all of that deep knowledge.
964
:You know, practical
knowledge about some of this.
965
:I kind of get it in theory.
966
:But, and I, I've used this example.
967
:I have a good buddy of mine that
I went to architecture school with
968
:who now owns a firm practicing.
969
:And I had this, I had this
PDF with details on it.
970
:To me, they're just details.
971
:As soon as I showed it to him, he
starts pick, you know, picking at them
972
:and explaining different parts of it.
973
:And I've used that as the example of like,
proverbially,
974
:I'm like perpetually a 24 year old in
my knowledge of architecture because
975
:I never did really progress from the, you
know, I understand conceptually what it
976
:is, but I don't know, I'm not intimate
with that, but what, uh, we're actually
977
:building some tools, uh, we're building
some annotation tools in AVAIL and I'm
978
:very, right now it's kind of redlining
and the traditional, but I'm very keen on
979
:uh, starting to capture video and
audio as part of the annotation
980
:process, because what I'd really
981
:like to see is,
982
:Evan Troxel: It's
983
:Randall Stevens-1: want,
when you're marking it up, if
984
:you'll just record this and
985
:explain it, in the
moment, why you're doing it.
986
:To me, it seems like
987
:that's, we've never really had that.
988
:So I'm kind of keen on starting
to experiment with will people
989
:be willing to just record an audio?
990
:And I think Chris, the work you're doing
is, is the right way to
991
:think about this, that people, they don't
want, they don't have to go type it up.
992
:I'm not going to go write a blog post
about it, but I'll, while I'm doing
993
:this, I'll talk and, um, that's the
easiest form of me getting this,
994
:uh, information out and then it
can be chewed on and perpetually
995
:available, you know, in perpetuity.
996
:It's like,
997
:Christopher Parsons: no, you, you, you,
so I mentioned there were four firms as
998
:part of this critical knowledge project.
999
:One of the other ones was Shepley
Bulfinch and Jim Martin and his
:
00:46:39,551 --> 00:46:41,111
team did a couple experiments.
:
00:46:41,111 --> 00:46:42,951
One of them was with kind of the senior,
:
00:46:43,351 --> 00:46:48,741
you know, quality CA kind of person who
is reviewing sets and they recorded him
:
00:46:48,741 --> 00:46:50,411
with a junior person reviewing sets.
:
00:46:50,411 --> 00:46:53,361
And as he's redlining, the role
of the junior person is to say
:
00:46:53,361 --> 00:46:54,661
like, well, what did you see there?
:
00:46:54,761 --> 00:46:55,781
Why did you circle that?
:
00:46:55,811 --> 00:46:57,171
You're like, why did you make that note?
:
00:46:57,431 --> 00:46:59,231
Because the expert
doesn't necessarily know.
:
00:46:59,231 --> 00:47:00,521
It's just so intuitive.
:
00:47:00,581 --> 00:47:02,241
Like it's tacit knowledge at this point
:
00:47:02,416 --> 00:47:02,736
Randall Stevens-1: right.
:
00:47:03,411 --> 00:47:03,601
Christopher Parsons: and They
:
00:47:03,626 --> 00:47:04,726
Randall Stevens-1: think
you should know it, right.
:
00:47:04,726 --> 00:47:05,146
Because they
:
00:47:05,146 --> 00:47:05,526
forgot.
:
00:47:06,206 --> 00:47:06,996
Christopher Parsons: I
think you should know it.
:
00:47:07,036 --> 00:47:08,606
So that was one of the
experiments they did.
:
00:47:08,606 --> 00:47:11,296
And then they did this other cool one
with one of their healthcare experts
:
00:47:11,296 --> 00:47:15,446
where they would show a picture of,
uh, uh, like an operating room or
:
00:47:15,446 --> 00:47:18,556
something like that, and they would
ask her to say like, what's wrong.
:
00:47:18,921 --> 00:47:20,981
You know, like what's broken
here, you know, and she'd say
:
00:47:20,981 --> 00:47:23,081
like, well, that thing's not
to code, that thing's not to code.
:
00:47:23,081 --> 00:47:23,761
I'm looking at that.
:
00:47:23,761 --> 00:47:24,741
Someone's going to trip on it.
:
00:47:24,741 --> 00:47:28,571
And like, just like the knowledge of
how she can just process one simple
:
00:47:28,581 --> 00:47:30,641
photo and just like tear it apart.
:
00:47:30,661 --> 00:47:32,871
Like that turned out to be super useful.
:
00:47:32,941 --> 00:47:36,131
And so they like turn that into some,
that's how you start excavating some of
:
00:47:36,131 --> 00:47:36,471
those things.
:
00:47:36,481 --> 00:47:36,601
So
:
00:47:36,601 --> 00:47:36,761
those
:
00:47:36,906 --> 00:47:37,706
Randall Stevens-1: I think, uh,
:
00:47:39,453 --> 00:47:42,003
Evan Troxel: exposing that
pattern recognition into
:
00:47:42,061 --> 00:47:42,431
Christopher Parsons: Yeah.
:
00:47:42,431 --> 00:47:44,251
Mm hmm.
:
00:47:44,251 --> 00:47:45,161
Mm
:
00:47:45,563 --> 00:47:45,913
Evan Troxel: deal.
:
00:47:46,043 --> 00:47:47,573
And I was just going to
say, Randall, about like the
:
00:47:47,583 --> 00:47:48,713
markup and talking out loud.
:
00:47:48,733 --> 00:47:51,663
This is actually something that you
could do in a zoom meeting and you
:
00:47:51,663 --> 00:47:54,063
don't even have to have somebody
else in the zoom meeting, right?
:
00:47:54,263 --> 00:47:58,953
You, because you have a whiteboard, you
can load up a PDF and you can record
:
00:47:58,953 --> 00:48:00,393
it to the cloud at the same time.
:
00:48:00,443 --> 00:48:03,183
And now with Zoom, I think
they're using Otter's engine to do
:
00:48:03,541 --> 00:48:03,841
Christopher Parsons: hmm.
:
00:48:04,573 --> 00:48:05,143
Evan Troxel: and.
:
00:48:05,458 --> 00:48:11,058
It just seems like a great way to create,
start creating a library of captures that
:
00:48:11,078 --> 00:48:17,168
are video plus markup, plus transcription,
takeaways, all those things that you could
:
00:48:17,528 --> 00:48:21,328
potentially use to train your staff in the
:
00:48:21,328 --> 00:48:22,378
future because
:
00:48:22,446 --> 00:48:22,656
Randall Stevens-1: Yeah.
:
00:48:22,656 --> 00:48:26,696
I think the trick to this is all
going to be making it, um,
:
00:48:26,736 --> 00:48:29,896
as part of the normal workflow, not
:
00:48:29,896 --> 00:48:32,486
be something that you go do afterwards.
:
00:48:33,131 --> 00:48:33,531
right?
:
00:48:33,561 --> 00:48:36,311
but, but it's not, but, but
maybe it's not too difficult.
:
00:48:36,351 --> 00:48:37,131
Like, hey, just
:
00:48:37,181 --> 00:48:38,811
verbalize what you're thinking and
:
00:48:38,811 --> 00:48:39,521
say it out loud,
:
00:48:39,601 --> 00:48:39,821
right?
:
00:48:39,981 --> 00:48:41,311
Christopher Parsons:
I think it's, I agree.
:
00:48:41,311 --> 00:48:44,361
I think it's not hard because I
think, you know, in, in kind of,
:
00:48:44,401 --> 00:48:47,971
you know, mindset is the most
important thing for behavior change.
:
00:48:47,971 --> 00:48:50,541
And I do believe in
general in this industry.
:
00:48:50,951 --> 00:48:54,791
We have a culture of mentorship and
wanting to advance the next generation.
:
00:48:54,811 --> 00:48:58,671
And so I think this is just a different
technique, but it fits into a through
:
00:48:58,671 --> 00:49:00,011
line of something people already
:
00:49:00,011 --> 00:49:00,561
want to do,
:
00:49:00,751 --> 00:49:01,111
which is
:
00:49:01,191 --> 00:49:01,851
Randall Stevens-1: altruistic.
:
00:49:01,851 --> 00:49:02,331
Yeah, it's a
:
00:49:02,331 --> 00:49:02,651
very
:
00:49:02,871 --> 00:49:02,991
you
:
00:49:02,991 --> 00:49:03,131
know,
:
00:49:03,431 --> 00:49:04,261
Christopher Parsons: it's very altruistic.
:
00:49:04,261 --> 00:49:05,081
It's a great profession.
:
00:49:05,228 --> 00:49:06,228
Let's talk about altruism.
:
00:49:06,228 --> 00:49:08,218
That's a great segue to community AI.
:
00:49:08,328 --> 00:49:11,468
Um, so let me tell you a
little bit about what we did.
:
00:49:11,478 --> 00:49:16,908
So when we saw this opportunity to
build these AEC specific models, um,
:
00:49:17,668 --> 00:49:18,548
we're like, how are we going to do this?
:
00:49:18,548 --> 00:49:19,588
It has to be opt in.
:
00:49:19,903 --> 00:49:21,923
You know, we're not just going to
start scraping everyone's data.
:
00:49:22,223 --> 00:49:25,463
Um, but we saw that there was this
great kind of quid pro quo, right?
:
00:49:25,463 --> 00:49:28,733
If you help share some data to help
us train these models for other folks
:
00:49:28,753 --> 00:49:32,063
in our community, you'll be able
to benefit from them.
:
00:49:32,413 --> 00:49:35,843
So the two models we're building are the
embeddings model, which we just talked
:
00:49:35,853 --> 00:49:39,812
about a little bit for more relevant
search results, and the transcription
:
00:49:39,813 --> 00:49:41,753
model for more accurate video captions.
:
00:49:42,463 --> 00:49:44,843
And so our clients are
opted out by default.
:
00:49:45,383 --> 00:49:47,463
Once they join it, they
get access to the models.
:
00:49:48,213 --> 00:49:53,683
So at a high level, we're extracting
what we're after, our AEC specific
:
00:49:53,683 --> 00:49:58,273
terms, phrases, and then the context
for those terms and phrases from
:
00:49:58,273 --> 00:50:00,023
content at the firms that participate.
:
00:50:00,513 --> 00:50:05,903
So, you know, here's a word soup example,
you know, anything from acronyms to
:
00:50:05,973 --> 00:50:08,703
products, to different kinds of systems.
:
00:50:08,703 --> 00:50:12,403
Like these are the things that,
that generally the embeddings model
:
00:50:12,403 --> 00:50:15,323
out of the box, the generic one
and a generic transcription, video
:
00:50:15,323 --> 00:50:16,803
transcription model don't do that well.
:
00:50:17,298 --> 00:50:20,588
So that's what we use to fine
tune both the embeddings model
:
00:50:20,638 --> 00:50:21,748
and the transcription model.
:
00:50:22,808 --> 00:50:26,738
At a high level, what we're really trying
to do is kind of build an AEC crowdsourced
:
00:50:26,748 --> 00:50:28,748
dictionary from our clients, right?
:
00:50:28,748 --> 00:50:30,648
So that we can do more smart things with
:
00:50:30,648 --> 00:50:30,988
AI.
:
00:50:31,268 --> 00:50:31,658
Randall Stevens-1: makes sense.
:
00:50:32,651 --> 00:50:32,931
Christopher Parsons: Yep.
:
00:50:33,083 --> 00:50:35,053
Evan Troxel: even people are going
to use different words for the same
:
00:50:35,211 --> 00:50:35,661
Christopher Parsons: Yes.
:
00:50:36,551 --> 00:50:36,821
Yeah.
:
00:50:36,973 --> 00:50:39,653
Evan Troxel: Different people
are going to call gypsum
:
00:50:39,891 --> 00:50:40,291
Christopher Parsons: Right.
:
00:50:40,713 --> 00:50:43,903
Evan Troxel: drywall, or gypsum wall
board, or, there's going to be an
:
00:50:43,903 --> 00:50:47,463
abbreviation for that, which ties
back into content management, like
:
00:50:47,463 --> 00:50:48,693
what Randall's tackling, right?
:
00:50:48,693 --> 00:50:51,893
It's like, you would abbreviate it on a
detail differently than you would even say
:
00:50:51,941 --> 00:50:52,441
Christopher Parsons: Right.
:
00:50:52,441 --> 00:50:53,031
Right.
:
00:50:53,183 --> 00:50:54,033
Evan Troxel: of these things have to
:
00:50:54,101 --> 00:50:54,941
Christopher Parsons: They
have to tie together.
:
00:50:54,941 --> 00:50:58,031
And that's the beauty of
a vector search versus keyword search,
:
00:50:58,041 --> 00:50:59,421
back to the original piece, right?
:
00:50:59,421 --> 00:51:04,111
Because vector, it's about relative
proximity, not exact match, right?
:
00:51:04,121 --> 00:51:07,081
So these terms are in
a neighborhood, right?
:
00:51:07,381 --> 00:51:09,931
These terms are all kind of
in a neighborhood together.
:
00:51:10,821 --> 00:51:13,151
And like these terms are all kind
of in a neighborhood together.
:
00:51:13,151 --> 00:51:15,531
They're not exactly
synonymous, but they're close.
:
00:51:15,531 --> 00:51:18,521
And like, that's how we start
understanding the relationship between
:
00:51:18,631 --> 00:51:19,981
different ways of saying the same thing.
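A minimal sketch of that neighborhood idea, using an off-the-shelf sentence-embeddings model rather than the fine-tuned AEC one described here; the term list is only illustrative.

import numpy as np
from sentence_transformers import SentenceTransformer

# Generic, publicly available model; stands in for the AEC-tuned embeddings model.
model = SentenceTransformer("all-MiniLM-L6-v2")

terms = ["gypsum wall board", "drywall", "GWB", "annual performance review"]
vectors = dict(zip(terms, model.encode(terms)))

def cosine(a, b):
    # 1.0 means the same direction; values near 0 mean unrelated.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The near-synonyms land close together; the unrelated phrase lands farther away.
for term in terms[1:]:
    print(term, round(cosine(vectors["gypsum wall board"], vectors[term]), 2))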
:
00:51:20,284 --> 00:51:22,004
Um, the last thing I want
to say on the search front.
:
00:51:22,329 --> 00:51:26,189
And if we still have time, it'd be
good to go into, uh, training the
:
00:51:26,199 --> 00:51:30,539
video transcription model a little bit
is, um, our vision for the product.
:
00:51:31,149 --> 00:51:34,529
Is to be able to answer questions
using your highest quality source
:
00:51:34,529 --> 00:51:37,609
of firm wide knowledge, whether
that lives in Synthesis or not.
:
00:51:37,959 --> 00:51:39,579
And so we do have a history of doing this.
:
00:51:39,579 --> 00:51:42,289
There's obviously Synthesis
content and the Synthesis LMS
:
00:51:42,299 --> 00:51:43,909
content when it's released.
:
00:51:44,199 --> 00:51:48,119
We've got integrations with ERP and
CRM systems from our directories
:
00:51:48,119 --> 00:51:49,379
and profiles that we saw.
:
00:51:50,059 --> 00:51:53,269
We integrate with OpenAsset for
imagery and Newforma for contacts.
:
00:51:53,299 --> 00:51:54,719
And we do Zendesk for Help Center.
:
00:51:54,719 --> 00:51:56,979
So that's, this is all stuff we do today.
:
00:51:57,359 --> 00:52:01,009
So if you were to search Synthesis
today on work sharing, you can see
:
00:52:01,009 --> 00:52:04,419
that one of the sources we pulled
in was from a Zendesk Help Center
:
00:52:04,429 --> 00:52:06,029
article on Revit work sharing.
:
00:52:06,779 --> 00:52:13,549
So we just have pretty ambitious plans
to expand and make more, um, more sources
:
00:52:13,549 --> 00:52:15,169
of that firm wide knowledge available.
:
00:52:15,559 --> 00:52:17,349
So that might be something like Teams.
:
00:52:17,654 --> 00:52:20,534
Like looking at kind of
messages and documents from
:
00:52:20,534 --> 00:52:22,374
public channels within Teams.
:
00:52:22,804 --> 00:52:25,774
That might be something like Freshworks,
which is a Zendesk competitor.
:
00:52:25,774 --> 00:52:28,294
And it's like outside of Zendesk,
it's the most common help center
:
00:52:28,294 --> 00:52:29,864
software we're seeing in our community.
:
00:52:30,634 --> 00:52:35,194
Um, we're likely going to build a search
connector API that will allow clients
:
00:52:35,194 --> 00:52:38,474
to build their own integrations, to
push content into Synthesis Search.
:
00:52:38,964 --> 00:52:41,814
And we're looking at this idea
of a public website indexer.
:
00:52:41,824 --> 00:52:44,334
So we've kind of gotten
two use cases for this.
:
00:52:44,344 --> 00:52:46,414
One is to index their own website.
:
00:52:46,974 --> 00:52:50,574
It's very common that clients will do
high quality content on their website
:
00:52:50,574 --> 00:52:52,064
that doesn't make its way into Synthesis.
:
00:52:52,679 --> 00:52:55,689
And then in the second use case,
it's to index the help centers
:
00:52:55,689 --> 00:52:56,979
of trusted software partners.
:
00:52:56,979 --> 00:53:00,789
So maybe a deck, uh, you know, indexing
the AVAIL help center or building
:
00:53:00,789 --> 00:53:04,069
some kind of integration with, uh,
the AVAIL help center, the OpenAsset
:
00:53:04,069 --> 00:53:06,509
help center, or the Autodesk help
center, or whatever it might be.
:
00:53:06,589 --> 00:53:08,579
So these are things we're exploring.
:
00:53:08,579 --> 00:53:11,529
We're definitely committed to building
the search piece I showed you.
:
00:53:11,529 --> 00:53:13,419
And then the teams and the
fresh work and the search
:
00:53:13,419 --> 00:53:14,879
connector API and public website.
:
00:53:14,879 --> 00:53:18,429
These are all future explorations
that we're trying to validate and
:
00:53:18,429 --> 00:53:19,509
see if this is worth building.
:
00:53:19,619 --> 00:53:20,149
so the.
:
00:53:20,560 --> 00:53:24,280
um, the idea for that is this is
going to go into public beta in
:
00:53:24,430 --> 00:53:25,100
So by the end of the
:
00:53:25,100 --> 00:53:25,290
year.
:
00:53:25,480 --> 00:53:26,050
Randall Stevens-1: That's great.
:
00:53:26,450 --> 00:53:30,260
Are you thinking, Chris, because we're
kind of, because we're experimenting
:
00:53:30,260 --> 00:53:33,510
with a bunch of these different kind
of search methodologies and changes in
:
00:53:33,800 --> 00:53:38,150
search methodology, do you envision,
like, with Synthesis, are you all working
:
00:53:38,150 --> 00:53:43,680
through that this can be a single search
box, like the Google search box, that
:
00:53:43,680 --> 00:53:45,390
you'll begin everything from that?
:
00:53:45,390 --> 00:53:51,830
Or do you have to, like, quickly,
quickly put people in proper
:
00:53:51,830 --> 00:53:52,840
directions to get to that?
:
00:53:53,180 --> 00:53:56,010
To the right info because if you do
connect all of those things that you're
:
00:53:56,010 --> 00:53:57,670
talking about it can get noisy again,
:
00:53:58,100 --> 00:53:58,470
right?
:
00:53:58,470 --> 00:54:01,690
Yeah.
:
00:54:01,710 --> 00:54:04,340
Christopher Parsons: a very good question
and I don't know all the answers yet.
:
00:54:04,340 --> 00:54:06,010
I'll just tell you a couple of
things we're thinking about.
:
00:54:06,010 --> 00:54:09,440
So on one level, look, there were some
systems that weren't mentioned there,
:
00:54:09,470 --> 00:54:13,290
like email, for example, or file shares.
:
00:54:14,190 --> 00:54:21,315
Um, what, what we think is definitely
important is kind of blessed,
:
00:54:21,335 --> 00:54:25,475
stable, finished, high quality,
evergreen knowledge, you know?
:
00:54:25,535 --> 00:54:29,305
So the kind of stuff we find in
Synthesis or in a help center, you know,
:
00:54:29,315 --> 00:54:31,615
that is like kind of blessed; it's for
:
00:54:31,615 --> 00:54:32,125
everybody.
:
00:54:32,895 --> 00:54:36,105
And this is kind of, you know, when
we look at email, that's like stuff
:
00:54:36,105 --> 00:54:40,255
that's in flight, you know. Teams, like,
uh, I think Teams is definitely like,
:
00:54:40,585 --> 00:54:41,915
it probably wouldn't be everything in
:
00:54:41,915 --> 00:54:42,345
teams,
:
00:54:42,605 --> 00:54:43,695
but Yeah,
:
00:54:43,695 --> 00:54:46,035
it's, it's, it's, it's just,
it moves in one direction.
:
00:54:46,035 --> 00:54:47,075
It's a fire hose.
:
00:54:47,105 --> 00:54:47,835
It's a lot of
:
00:54:48,445 --> 00:54:52,335
jokes and asides and like half
ventured opinions, which is great.
:
00:54:52,335 --> 00:54:53,205
We need all that stuff.
:
00:54:53,495 --> 00:54:55,855
I don't know if that's what you want
to put it into a firm wide search.
:
00:54:56,085 --> 00:54:59,025
So we're looking at every data source to
kind of say, like, I'm looking at what
:
00:54:59,035 --> 00:55:00,675
Microsoft's doing with Copilot, right.
:
00:55:00,895 --> 00:55:04,735
And what they're primarily overlaying
is like teams and email and file
:
00:55:04,735 --> 00:55:08,985
shares and not as much kind of like
firm wide, you know, sources of truth.
:
00:55:09,005 --> 00:55:11,925
And that gives you a very different thing,
you know, like, or I think if someone
:
00:55:11,925 --> 00:55:14,065
like a Newforma searching across email.
:
00:55:14,290 --> 00:55:16,830
Like there's just a lot of
email in there and I'm, and
:
00:55:16,830 --> 00:55:18,070
what they do is very valuable.
:
00:55:18,070 --> 00:55:21,220
And what TonicDM does is very
valuable in terms of email search.
:
00:55:21,610 --> 00:55:23,740
I don't know that that's the
kind of content we want, like
:
00:55:23,740 --> 00:55:27,030
in a Synthesis knowledge based
search, but to be determined,
:
00:55:27,150 --> 00:55:27,400
you know,
:
00:55:27,400 --> 00:55:28,120
this is early
:
00:55:28,150 --> 00:55:31,530
Randall Stevens-1: we're all kind of, you
know up against that like there's a lot
:
00:55:31,530 --> 00:55:35,445
of different sources You know, but even
you know in Google You You know, I use all
:
00:55:35,445 --> 00:55:39,915
the time, like, uh, like Google Drive, for
instance, you know, if I know what kind
:
00:55:39,915 --> 00:55:44,215
of document type, specifically, that's,
that's probably my most clicked thing.
:
00:55:44,405 --> 00:55:45,705
I know it's a spreadsheet.
:
00:55:45,765 --> 00:55:46,655
I know it's
:
00:55:46,655 --> 00:55:48,205
a, a, a Word doc.
:
00:55:49,195 --> 00:55:53,045
So then, you know, that lets
you, at least contextually, you
:
00:55:53,045 --> 00:55:54,645
know, weed out lots of noise.
:
00:55:54,655 --> 00:55:57,315
So, you know, I don't know,
there's just challenges with the
:
00:55:57,315 --> 00:55:59,095
UI and UX when you start to have,
:
00:55:59,615 --> 00:56:03,440
you know, when you bring back a lot of
search results and it can be very noisy.
:
00:56:03,440 --> 00:56:05,480
It's like, okay, what's
the second-level filtering?
:
00:56:05,480 --> 00:56:06,040
And then
:
00:56:06,040 --> 00:56:09,910
there's always the question of, do you,
do you preset it or do you post set it?
:
00:56:09,910 --> 00:56:10,190
Right.
:
00:56:10,200 --> 00:56:10,590
It's like
:
00:56:11,740 --> 00:56:12,330
Christopher Parsons: Exactly.
:
00:56:12,340 --> 00:56:15,360
No, I think, so, so this is the
second point I wanted to make.
:
00:56:15,360 --> 00:56:16,210
You took me right to it.
:
00:56:16,220 --> 00:56:16,610
So.
:
00:56:17,040 --> 00:56:20,530
I think we're going to end up with
something like, you know, I mean, an
:
00:56:20,640 --> 00:56:24,180
example, everybody would know it's like
on Amazon, you know, you kind of search
:
00:56:24,180 --> 00:56:27,210
everything or you search books or you
search prime video or whatever it is.
:
00:56:27,640 --> 00:56:31,770
Um, perplexity has an interesting
feature called focus where it
:
00:56:31,770 --> 00:56:34,610
can kind of, you can say, well, I
just want to search Reddit, right.
:
00:56:34,620 --> 00:56:37,970
Or I just want to search YouTube videos
because I just want to limit my
:
00:56:37,970 --> 00:56:41,310
search. And in the Synthesis application,
that could be, you know, look, we've
:
00:56:41,310 --> 00:56:45,390
got a, uh, a library of all of our
proposals for the last three years.
:
00:56:45,710 --> 00:56:48,050
I just want to execute this
search against those proposals.
:
00:56:48,050 --> 00:56:50,700
Cause I don't want other noise
kind of getting in here, or I just
:
00:56:50,700 --> 00:56:51,940
want to search against the LMS.
:
00:56:52,310 --> 00:56:55,300
So I do think it's probably a mix
of what you said, Randall, both
:
00:56:55,330 --> 00:56:57,180
pre filtering and post filtering.
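Roughly, that pre- versus post-filtering distinction can be sketched like this; the search() function, the source field, and the toy scoring are illustrative stand-ins, not how Synthesis is built.

def score(query, doc):
    # Toy relevance: word overlap between the query and the document text.
    return len(set(query.lower().split()) & set(doc["text"].lower().split()))

def search(query, docs, top_k=5):
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:top_k]

def pre_filtered_search(query, docs, allowed_sources):
    # Pre-filter: narrow the corpus first (e.g. "proposals only"), then search.
    return search(query, [d for d in docs if d["source"] in allowed_sources])

def post_filtered_search(query, docs, allowed_sources):
    # Post-filter: search everything, then let the user narrow the results.
    return [d for d in search(query, docs, top_k=20) if d["source"] in allowed_sources]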
:
00:56:57,640 --> 00:57:01,940
It's probably a mix of the company
will probably set up some, some
:
00:57:01,940 --> 00:57:05,170
filters in advance, but then you
personally probably can go in and
:
00:57:05,170 --> 00:57:06,610
set up some different filters so I
:
00:57:06,610 --> 00:57:09,410
can go in and do, I'm doing my
proposal search or I'm doing my,
:
00:57:09,710 --> 00:57:10,070
knowledge
:
00:57:10,190 --> 00:57:10,920
Randall Stevens-1: yeah, I found it
:
00:57:10,920 --> 00:57:11,550
interesting.
:
00:57:11,550 --> 00:57:14,060
Just, uh, you know, different
people think differently.
:
00:57:14,070 --> 00:57:15,650
Some people want to pre do
:
00:57:15,650 --> 00:57:17,330
things and
:
00:57:17,340 --> 00:57:20,880
some, you know, we've, we've
taken the approach of, um, you
:
00:57:20,880 --> 00:57:23,120
know, tell us what you want.
:
00:57:23,920 --> 00:57:28,060
We're going to go gather up all the
results and then our job is to give you
:
00:57:28,060 --> 00:57:32,580
enough context to, to, to make the next
two clicks, you know, narrow that down.
:
00:57:33,450 --> 00:57:36,400
There are, there are other solutions
that are out there that are, you know,
:
00:57:36,400 --> 00:57:39,570
you end up with a more complicated
interface because if you want to
:
00:57:39,580 --> 00:57:43,040
pre select all this stuff, now, now
I'm presenting all this stuff and
:
00:57:43,040 --> 00:57:46,910
I'm asking you to go make a bunch of
choices and then execute the search.
:
00:57:46,910 --> 00:57:49,160
So we've kind of taken
the former approach.
:
00:57:49,640 --> 00:57:55,280
Um, but I think it's, um, you know,
I'll give a, a, a slightly different
:
00:57:55,280 --> 00:57:58,900
example, but like we've, because
we've got a tight Revit integration.
:
00:57:59,780 --> 00:58:01,490
One of the things that
we've done with avail is.
:
00:58:02,220 --> 00:58:05,460
You know, you can find a piece of
content like a piece of Revit, of Revit
:
00:58:05,460 --> 00:58:07,230
family, and it's got type catalog.
:
00:58:07,720 --> 00:58:14,310
Well, what most firms want, um, BIM
managers want is, I want the user to only
:
00:58:14,320 --> 00:58:18,050
bring in the type that they're going to
use because I don't want to like preload
:
00:58:18,710 --> 00:58:22,240
all of this data into the model, into
the BIM model if it's not being used.
:
00:58:22,240 --> 00:58:23,560
I'd rather that you're just bringing it.
:
00:58:23,570 --> 00:58:25,720
So we've kind of designed
this approach that
:
00:58:25,720 --> 00:58:28,825
is, hey, when you, when you bring
the family in, we're going to
:
00:58:28,825 --> 00:58:32,235
tell you the types, and we want it
right there for you to pick that.
:
00:58:32,985 --> 00:58:38,955
It's interesting though, because people
will count the number of clicks it is and
:
00:58:38,955 --> 00:58:43,195
say, Oh, this is too complicated because
it took four clicks to get that in as
:
00:58:43,195 --> 00:58:46,055
opposed to drag the entire family in.
:
00:58:46,475 --> 00:58:47,125
Then.
:
00:58:47,745 --> 00:58:50,695
Then I've got to go open
the properties type catalog.
:
00:58:50,705 --> 00:58:51,615
Then I've got to do this.
:
00:58:51,645 --> 00:58:53,725
Then because I've changed the
width, I've got to go move.
:
00:58:53,745 --> 00:58:55,005
They don't count all those.
:
00:58:55,255 --> 00:58:55,415
They
:
00:58:55,415 --> 00:58:58,035
count, you know, only what they want
to count on the front end of it.
:
00:58:58,695 --> 00:58:59,035
Not the
:
00:58:59,035 --> 00:58:59,335
post
:
00:58:59,365 --> 00:58:59,915
processing.
:
00:59:00,035 --> 00:59:01,305
Christopher Parsons: You know,
what's interesting about that.
:
00:59:01,315 --> 00:59:04,945
There's a, there's an article by
Nielsen Norman group who is an awesome
:
00:59:04,945 --> 00:59:08,325
UX research organization, and they've
got an article, I think it's called
:
00:59:08,325 --> 00:59:09,735
like the myth of the three, the three
:
00:59:09,735 --> 00:59:10,595
click myth or
:
00:59:10,595 --> 00:59:11,325
something like that.
:
00:59:11,335 --> 00:59:14,615
And basically after doing a bunch of
research, what they kind of established
:
00:59:14,615 --> 00:59:16,635
is people will click more than three
:
00:59:16,635 --> 00:59:17,455
times if they
:
00:59:17,455 --> 00:59:17,975
feel like they're
:
00:59:17,975 --> 00:59:18,285
making
:
00:59:18,295 --> 00:59:19,155
Randall Stevens-1: Give them a
:
00:59:19,155 --> 00:59:19,745
reward.
:
00:59:19,745 --> 00:59:20,425
If there's a reward.
:
00:59:20,505 --> 00:59:21,225
Christopher Parsons: give them a reward.
:
00:59:21,685 --> 00:59:23,995
Randall Stevens-1: Yeah, it's just, and
I don't know what the answer is, I just
:
00:59:23,995 --> 00:59:27,835
know, I get frustrated sometimes when
I have people being like, Yeah, you
:
00:59:27,835 --> 00:59:30,255
know, we didn't really like it because
it took four clicks to get this in.
:
00:59:30,255 --> 00:59:31,975
I'm like, you forgot to go measure.
:
00:59:32,245 --> 00:59:32,895
What you had to do
:
00:59:32,905 --> 00:59:36,075
after, right? You were just
considering what we're doing on this up
:
00:59:36,075 --> 00:59:36,395
front.
:
00:59:36,395 --> 00:59:37,975
But anyways, just different approaches.
:
00:59:37,975 --> 00:59:40,735
And, you know, I do think people have
different mindsets about how they
:
00:59:40,745 --> 00:59:41,995
think about these kinds of things.
:
00:59:41,995 --> 00:59:43,165
So, and ultimately in
:
00:59:43,165 --> 00:59:45,105
a large firm, you have to support both
:
00:59:45,335 --> 00:59:46,305
in some, some
:
00:59:46,370 --> 00:59:46,810
Christopher Parsons: Correct.
:
00:59:47,190 --> 00:59:47,430
Yeah.
:
00:59:47,440 --> 00:59:49,370
Well, it's like browse
versus search too, Right.
:
00:59:49,370 --> 00:59:51,730
Like I showed you guys the
mega menu stuff on Synthesis.
:
00:59:51,730 --> 00:59:54,150
Like some people are going to browse to
stuff and some people will never do that.
:
00:59:54,150 --> 00:59:55,420
And they're only going
to search the stuff.
:
00:59:55,690 --> 00:59:56,070
So,
:
00:59:56,780 --> 00:59:57,060
yep.
:
00:59:57,167 --> 00:59:57,547
Evan Troxel: something in.
:
00:59:57,937 --> 01:00:01,937
It's interesting also, like the accrual
of, or the accumulation
:
01:00:01,967 --> 01:00:05,047
of all of the extra assets.
:
01:00:05,667 --> 01:00:08,367
And, and that is like a technical
debt at some level, right?
:
01:00:08,367 --> 01:00:10,267
It's like somebody has to clean
that out at some point.
:
01:00:10,277 --> 01:00:13,237
And they're not even thinking
about all the clicks somebody else
:
01:00:13,250 --> 01:00:13,840
Christopher Parsons: Down the line.
:
01:00:13,977 --> 01:00:14,927
Evan Troxel: BIM manager or
:
01:00:15,000 --> 01:00:15,200
Christopher Parsons: Yeah.
:
01:00:16,360 --> 01:00:18,260
Randall Stevens-1: Well, you
know my, you know, my soapbox
:
01:00:18,260 --> 01:00:19,970
on the, we call them assets.
:
01:00:20,000 --> 01:00:21,390
Maybe that's the wrong word,
:
01:00:22,980 --> 01:00:23,340
right?
:
01:00:23,520 --> 01:00:23,730
It's
:
01:00:23,730 --> 01:00:23,970
built.
:
01:00:23,970 --> 01:00:25,130
in, it's, it's built.
:
01:00:25,130 --> 01:00:27,770
Yeah, I've, I've always, I've
had this longstanding, uh,
:
01:00:27,800 --> 01:00:31,320
debate, like where, where is your
content on your balance sheet?
:
01:00:31,350 --> 01:00:31,640
Is it
:
01:00:31,640 --> 01:00:32,990
an asset or a liability?
:
01:00:32,990 --> 01:00:33,240
Right.
:
01:00:33,250 --> 01:00:33,380
it's
:
01:00:33,620 --> 01:00:33,650
like,
:
01:00:34,280 --> 01:00:34,900
Christopher Parsons: I like that.
:
01:00:35,350 --> 01:00:35,880
That's funny.
:
01:00:36,297 --> 01:00:38,697
Evan Troxel: And then where, where are
you in the project and who's doing what?
:
01:00:38,697 --> 01:00:43,552
Because as a designer early on a project,
I don't even, I can't be that specific
:
01:00:43,552 --> 01:00:46,752
yet, so I'm going to throw a bunch of
stuff in there, and I'm going to, going
:
01:00:46,752 --> 01:00:51,272
to slowly narrow it down to what I want,
because things are in flux all the time,
:
01:00:51,282 --> 01:00:54,272
you know, somebody says something in an
hour, and it's going to change what I just
:
01:00:54,290 --> 01:00:54,600
Christopher Parsons: Yep.
:
01:00:55,362 --> 01:00:58,092
Evan Troxel: so, I mean, that is also
:
01:00:58,122 --> 01:00:59,142
a reality
:
01:00:59,160 --> 01:00:59,940
Randall Stevens-1: get You
:
01:00:59,962 --> 01:01:00,602
Evan Troxel: talking about,
:
01:01:00,660 --> 01:01:04,130
Randall Stevens-1: two,
you two probably, uh, are much
:
01:01:04,150 --> 01:01:06,750
better, you know. I only know
:
01:01:07,865 --> 01:01:12,945
because I've spent my whole career in
it, but I get a sense that the amount and
:
01:01:12,945 --> 01:01:17,845
the breadth of information that has to be
dealt with and managed in this profession
:
01:01:18,105 --> 01:01:22,505
is maybe, you know, an order of magnitude
more than most other professions.
:
01:01:22,635 --> 01:01:22,785
Is
:
01:01:22,785 --> 01:01:23,465
That true?
:
01:01:24,020 --> 01:01:25,360
Christopher Parsons: That
seems right from, that seems
:
01:01:25,360 --> 01:01:26,150
right from what I've seen.
:
01:01:26,450 --> 01:01:27,570
I mean, yeah.
:
01:01:28,732 --> 01:01:29,822
Evan Troxel: It feels right, it
:
01:01:30,050 --> 01:01:30,740
Christopher Parsons: It seems right.
:
01:01:31,300 --> 01:01:31,660
Yeah.
:
01:01:32,030 --> 01:01:32,400
Yeah.
:
01:01:33,090 --> 01:01:35,460
Do you guys want to talk a little
bit about video transcription?
:
01:01:35,714 --> 01:01:37,370
Yep.
:
01:01:37,851 --> 01:01:38,231
Yeah.
:
01:01:38,392 --> 01:01:39,302
Evan Troxel: which is, okay.
:
01:01:39,302 --> 01:01:42,302
So, because I'm really
interested in like the idea of AI
:
01:01:42,561 --> 01:01:42,911
Christopher Parsons: Okay.
:
01:01:42,931 --> 01:01:43,241
Sure.
:
01:01:43,422 --> 01:01:45,402
Evan Troxel: because everybody
I think is going to be taxed.
:
01:01:45,432 --> 01:01:48,902
You, you have, you have all
these exterior partnerships that
:
01:01:49,001 --> 01:01:49,361
Christopher Parsons: Yep.
:
01:01:49,372 --> 01:01:51,752
Evan Troxel: about or connections
that you, you've got Deltek and,
:
01:01:52,132 --> 01:01:54,692
uh, you, you can list off all the,
:
01:01:54,831 --> 01:01:55,191
Christopher Parsons: Sure.
:
01:01:55,202 --> 01:01:56,232
Evan Troxel: ones, but like open asset,
:
01:01:56,261 --> 01:01:57,036
Christopher Parsons: Mm hmm.
:
01:01:57,036 --> 01:01:57,800
Mm
:
01:01:58,232 --> 01:02:00,072
Evan Troxel: is going to have their own AI
:
01:02:01,061 --> 01:02:01,321
Christopher Parsons: hmm.
:
01:02:01,502 --> 01:02:01,862
Evan Troxel: right?
:
01:02:01,902 --> 01:02:07,082
And so my question is, is like, are,
are you talking with them so that you
:
01:02:07,082 --> 01:02:11,952
don't have to duplicate any of their
work in your stuff so that your AI agent
:
01:02:11,952 --> 01:02:15,342
can talk to that AI agent, because I
assume like, they're going to be able
:
01:02:15,342 --> 01:02:19,942
to go much deeper into image search,
for example, not just metadata that's
:
01:02:20,402 --> 01:02:24,382
attached to the post that that
image is in, like the project and
:
01:02:24,382 --> 01:02:27,332
the location, all that stuff, but
like what's in the image, right?
:
01:02:27,352 --> 01:02:31,902
So I could, I could just see
it being this AI agent thing.
:
01:02:31,932 --> 01:02:36,022
It's like, it's like, tell me what,
what product was used in this image.
:
01:02:36,122 --> 01:02:37,692
And I can, I can just point
:
01:02:37,702 --> 01:02:37,822
at
:
01:02:37,851 --> 01:02:38,151
Christopher Parsons: Mm hmm.
:
01:02:38,151 --> 01:02:38,870
Mm hmm.
:
01:02:38,870 --> 01:02:39,230
Mm
:
01:02:39,392 --> 01:02:41,162
Evan Troxel: I found in Synthesis,
:
01:02:41,311 --> 01:02:41,641
Christopher Parsons: hmm.
:
01:02:42,752 --> 01:02:45,512
Evan Troxel: to, get me my answer,
because I want to know, because
:
01:02:45,512 --> 01:02:46,582
I need that answer right now.
:
01:02:46,582 --> 01:02:46,872
I'm on a
:
01:02:46,926 --> 01:02:47,136
Christopher Parsons: Yeah.
:
01:02:47,136 --> 01:02:49,756
I mean, I think agents will be,
we're not doing it right this minute.
:
01:02:49,756 --> 01:02:52,116
So it's all conjecture, but I do.
:
01:02:52,136 --> 01:02:52,396
Yeah.
:
01:02:52,396 --> 01:02:56,416
Like, so for example, OpenAsset
has the ability to automatically
:
01:02:56,416 --> 01:02:59,756
generate proposals and, you know,
qual sheets and that kind of stuff.
:
01:02:59,756 --> 01:03:01,086
So that's something
we're not going to build.
:
01:03:01,366 --> 01:03:04,375
And so it could be that like a user
finds five projects and they're
:
01:03:04,376 --> 01:03:06,030
like, okay, these five projects.
:
01:03:06,781 --> 01:03:09,961
Hey, OpenAsset agent, can you make me
a, you know, proposal out of this or
:
01:03:09,961 --> 01:03:13,371
make me some kind of RFQ or a brochure
or something like that out of it.
:
01:03:13,371 --> 01:03:15,701
So, yeah, I think agents
will be part of our future.
:
01:03:15,701 --> 01:03:17,421
I think it's just not, it's not
:
01:03:17,421 --> 01:03:18,091
here today, but
:
01:03:18,316 --> 01:03:19,016
Randall Stevens-1: Yeah, you'll either,
:
01:03:19,351 --> 01:03:20,091
Christopher Parsons: So this is all laying.
:
01:03:20,131 --> 01:03:21,921
This is laying a lot of
groundwork for it.
:
01:03:21,921 --> 01:03:22,181
Yeah.
:
01:03:22,357 --> 01:03:24,977
Evan Troxel: because then it takes, I
don't have to go to all these different
:
01:03:25,431 --> 01:03:25,911
Christopher Parsons: Potentially.
:
01:03:25,917 --> 01:03:27,497
Evan Troxel: all the different
things and then assemble
:
01:03:27,497 --> 01:03:27,617
it.
:
01:03:27,617 --> 01:03:28,297
it's like, it's like,
:
01:03:28,396 --> 01:03:30,526
Randall Stevens-1: yeah, you're
probably likely to have, um,
:
01:03:31,886 --> 01:03:37,456
an API that allows a third party
to query for the important things
:
01:03:37,456 --> 01:03:41,376
that your platform does, and
an ability to get that info back.
:
01:03:41,926 --> 01:03:47,766
There's probably also going to be the,
you know, when you talk about these chat
:
01:03:47,766 --> 01:03:52,976
bots, you're talking about an interface
behind that, the model there, you're
:
01:03:52,986 --> 01:03:58,571
probably going to start to have an ability
to just connect directly to
:
01:03:58,571 --> 01:04:00,771
exposed models that somebody may expose,
:
01:04:00,841 --> 01:04:03,181
you know, via some API or
some connection to that.
:
01:04:03,181 --> 01:04:08,671
So you may train a model and
then expose it so other bots
:
01:04:08,701 --> 01:04:10,461
can have access to that model.
:
01:04:10,511 --> 01:04:13,981
So then, you know, back to the question
I was asking earlier, Chris, like at
:
01:04:13,981 --> 01:04:18,721
some point, do you have an interface in
a box, but it's going to need to talk
:
01:04:18,721 --> 01:04:21,861
to potentially thousands of different,
you know, hundreds of different sources
:
01:04:21,861 --> 01:04:23,371
of where this information is coming
:
01:04:23,371 --> 01:04:23,791
from.
:
01:04:24,811 --> 01:04:24,851
It's a
:
01:04:24,876 --> 01:04:25,016
Christopher Parsons: Yeah.
:
01:04:25,016 --> 01:04:25,656
I mean, the iceberg
:
01:04:25,666 --> 01:04:26,216
gets bigger,
:
01:04:26,546 --> 01:04:27,066
right.
:
01:04:27,216 --> 01:04:30,816
Um, where the interface is still the
simplest piece of the whole thing.
:
01:04:31,056 --> 01:04:31,916
Yeah, and exactly.
:
01:04:31,916 --> 01:04:34,666
And so the stuff we're putting down
now, the infrastructure we're building
:
01:04:34,666 --> 01:04:37,996
today around vector search and
embeddings and all that kind of stuff
:
01:04:38,436 --> 01:04:41,776
is totally going to help us get to
that future that we're talking about.
:
01:04:41,776 --> 01:04:45,106
Like the UI I showed you was
much more of a search UI.
:
01:04:45,536 --> 01:04:47,825
Does Synthesis end up with a chat bot UI?
:
01:04:47,826 --> 01:04:48,306
Maybe.
:
01:04:48,931 --> 01:04:53,351
Is there some other paradigm from a UI UX
perspective that takes over in 18 months
:
01:04:53,611 --> 01:04:55,311
and replaces kind of the search and chat?
:
01:04:55,351 --> 01:05:00,256
It's very possible and so whatever
happens with that kind of UI UX, you know,
:
01:05:00,256 --> 01:05:04,976
advancements like that, the core under the
waterline stuff is going to be important.
:
01:05:05,086 --> 01:05:09,356
And I don't want to say it's just UI
in quotes, but like, to some degree,
:
01:05:09,356 --> 01:05:12,936
that's true that that's going to
be the easiest part to move around.
:
01:05:12,936 --> 01:05:17,526
It's that complexity of the integration,
the understanding, meaning coordination
:
01:05:17,526 --> 01:05:20,496
with other AI agents, like all of
that stuff behind the scenes is
:
01:05:20,496 --> 01:05:22,246
going to be really the differentiator,
:
01:05:22,456 --> 01:05:23,166
in my opinion, it's
:
01:05:23,181 --> 01:05:26,031
Randall Stevens-1: Yeah, there's also,
you know, with the, uh, you know, vector
:
01:05:26,031 --> 01:05:31,831
databases and the RAG models and, um,
you know, they're, they're very good
:
01:05:31,931 --> 01:05:37,251
at what they are being used for, but
they don't replace still some other
:
01:05:37,261 --> 01:05:39,461
valuable ways that data and databases,
:
01:05:39,511 --> 01:05:44,271
you know, databases are migrated into
one direction or another because they
:
01:05:44,271 --> 01:05:46,591
become super powerful at doing something.
:
01:05:46,991 --> 01:05:49,401
And so I think part of the
challenge is going to be.
:
01:05:49,981 --> 01:05:53,191
We're going to start mixing
search methodologies,
:
01:05:53,631 --> 01:05:56,851
which means the data's probably
stored in different kinds of
:
01:05:56,851 --> 01:05:58,201
databases behind the scenes.
:
01:05:58,201 --> 01:05:58,331
And
:
01:05:58,331 --> 01:06:00,421
then this is where the
UI UX challenge comes.
:
01:06:00,671 --> 01:06:00,971
How does
:
01:06:00,971 --> 01:06:01,051
a
:
01:06:01,051 --> 01:06:01,421
user
:
01:06:01,451 --> 01:06:01,811
keep this
:
01:06:01,896 --> 01:06:03,236
Christopher Parsons: I've
got a lot to say about that.
:
01:06:03,246 --> 01:06:04,636
So yes, So you're right.
:
01:06:04,646 --> 01:06:05,826
So database is part of it.
:
01:06:06,206 --> 01:06:09,236
You know, we, uh, the vector
database is another part of it.
:
01:06:09,556 --> 01:06:12,676
Like we'll probably end up building
a graph, you know, at some point.
:
01:06:12,906 --> 01:06:13,476
And so,
:
01:06:13,886 --> 01:06:17,946
so much of the magic of making all
this work is going to come down
:
01:06:17,946 --> 01:06:19,496
to understanding query intent.
:
01:06:19,821 --> 01:06:23,301
So for example, like some of the examples,
the Wi-Fi example I said before, the
:
01:06:23,301 --> 01:06:24,631
jury duty, like that's pretty clear.
:
01:06:24,921 --> 01:06:26,801
That's a, that's a knowledge based query.
:
01:06:27,181 --> 01:06:30,311
If someone asks a question in Synthesis,
how many healthcare projects have you
:
01:06:30,311 --> 01:06:31,931
done in the last five years in California?
:
01:06:32,286 --> 01:06:33,846
That is not a knowledge based query.
:
01:06:33,846 --> 01:06:36,726
That is more like a knowledge
graph database type query.
:
01:06:36,866 --> 01:06:40,576
So we have to really understand
query intent and then execute the
:
01:06:40,576 --> 01:06:45,146
query differently and we train that
using machine learning to be able to
:
01:06:45,146 --> 01:06:48,196
know when it's this kind of query,
here's the way that you use your
:
01:06:48,196 --> 01:06:49,616
resources that you have access to, to
:
01:06:49,616 --> 01:06:50,116
answer it.
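A rough sketch of that routing idea; the keyword heuristic stands in for the trained classifier Chris describes, and both back ends are stubs.

def classify_intent(query):
    # Stand-in heuristic; in practice this would be a trained intent classifier.
    analytical = ("how many", "count", "average", "total", "in the last")
    return "structured" if any(p in query.lower() for p in analytical) else "knowledge"

def run_knowledge_search(query):
    return f"[vector search over documents for: {query}]"    # stub

def run_structured_query(query):
    return f"[project database / graph query for: {query}]"  # stub

def answer(query):
    if classify_intent(query) == "structured":
        return run_structured_query(query)
    return run_knowledge_search(query)

print(answer("What's the guest Wi-Fi password?"))
print(answer("How many healthcare projects have we done in the last five years in California?"))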
:
01:06:50,376 --> 01:06:51,206
So all of that
:
01:06:51,206 --> 01:06:51,656
is like,
:
01:06:51,739 --> 01:06:53,809
Evan Troxel: how do you manage
rhetorical questions, Chris?
:
01:06:54,889 --> 01:06:55,579
It's going to be hard.
:
01:06:56,562 --> 01:06:57,172
Christopher Parsons: was that one?
:
01:06:57,692 --> 01:06:58,022
Yeah.
:
01:06:58,112 --> 01:06:58,532
Right.
:
01:06:58,639 --> 01:07:01,769
Evan Troxel: say something out loud
and then it's like, yeah, I really
:
01:07:01,769 --> 01:07:03,219
wasn't looking for the answer and you
:
01:07:03,412 --> 01:07:04,372
Christopher Parsons: Yeah, I know.
:
01:07:04,432 --> 01:07:05,122
That's interesting.
:
01:07:06,082 --> 01:07:06,532
humans are
:
01:07:06,532 --> 01:07:07,062
complicated.
:
01:07:07,197 --> 01:07:09,967
Yep,
:
01:07:09,990 --> 01:07:11,952
Evan Troxel: on that is below the
:
01:07:11,952 --> 01:07:12,344
surface.
:
01:07:12,344 --> 01:07:12,737
It's,
:
01:07:12,797 --> 01:07:16,607
Randall Stevens-1: think the, I think
what's, um, you know, exciting is
:
01:07:16,637 --> 01:07:20,837
that there's always these little pivot
points in technology development.
:
01:07:20,867 --> 01:07:24,967
So when these new technologies come
about, I know for me, it's like, it just
:
01:07:24,977 --> 01:07:28,837
reinvigorates you about, okay, you know,
things that we thought about or we're
:
01:07:28,867 --> 01:07:34,677
thinking about now kind of take on a new,
new era of possibility because we can
:
01:07:34,677 --> 01:07:36,467
apply those technologies in some new ways.
:
01:07:36,477 --> 01:07:38,517
So I think it's exciting times to be
:
01:07:38,517 --> 01:07:38,947
working on
:
01:07:38,947 --> 01:07:39,047
something.
:
01:07:39,727 --> 01:07:40,287
Christopher Parsons: 100%.
:
01:07:40,287 --> 01:07:42,867
I've been saying in the last six
months, this is the, you know, so
:
01:07:42,867 --> 01:07:44,067
far, this is the peak of my career.
:
01:07:44,077 --> 01:07:46,017
I'm having the most fun
now that I've ever had
:
01:07:46,217 --> 01:07:47,157
for that exact reason.
:
01:07:47,157 --> 01:07:49,527
That's a new wave of technology
that lets you build on all the
:
01:07:49,527 --> 01:07:50,467
stuff you've done before and
:
01:07:50,467 --> 01:07:50,987
do new things.
:
01:07:50,987 --> 01:07:52,067
It's super cool.
:
01:07:52,387 --> 01:07:54,297
Evan Troxel: Well, let's talk
about the AI transcriptions
:
01:07:54,366 --> 01:07:57,996
Christopher Parsons: Yeah, which is a, I
think a much more interesting topic than
:
01:07:57,996 --> 01:08:02,106
it sounds like on the surface and you
know, it's, it's a little bit like, well,
:
01:08:02,106 --> 01:08:04,436
why, why are you going all in on captions?
:
01:08:04,446 --> 01:08:05,446
Like, why is that such a thing?
:
01:08:05,446 --> 01:08:07,816
And I mentioned accessibility
is really important for us.
:
01:08:07,816 --> 01:08:11,016
So it's part of an accessibility
initiative, the LMS and the search stuff.
:
01:08:11,016 --> 01:08:12,836
So it seems like an important technology.
:
01:08:13,216 --> 01:08:17,446
Um, so it's pretty simple in terms of the
diagram for, you know, how we do this.
:
01:08:17,446 --> 01:08:21,536
Someone uploads a video, we're using
Azure Cognitive Services, speech to text.
:
01:08:22,031 --> 01:08:23,680
We have two different
transcription models.
:
01:08:23,680 --> 01:08:25,411
We have the generic one,
which is the default.
:
01:08:25,441 --> 01:08:29,220
We have the AEC specific one, which
is through community AI, and then we
:
01:08:29,220 --> 01:08:31,161
generate captions and transcripts.
:
01:08:31,270 --> 01:08:34,281
And so again, the opportunity is
to build something AEC specific.
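A minimal sketch of that pipeline step, assuming the Azure Speech SDK for Python; the endpoint id is a placeholder for however the AEC-tuned model is deployed, and omitting it falls back to the generic model.

import azure.cognitiveservices.speech as speechsdk

def transcribe(wav_path, aec_endpoint_id=None):
    config = speechsdk.SpeechConfig(subscription="SPEECH_KEY", region="eastus")
    if aec_endpoint_id:
        # Assumption: the AEC-specific model is reachable as a custom endpoint.
        config.endpoint_id = aec_endpoint_id
    audio = speechsdk.audio.AudioConfig(filename=wav_path)
    recognizer = speechsdk.SpeechRecognizer(speech_config=config, audio_config=audio)
    result = recognizer.recognize_once()  # one utterance; long videos would use continuous recognition
    return result.text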
:
01:08:34,741 --> 01:08:38,140
So I want to take you into the process
of doing that because that is, um,
:
01:08:38,171 --> 01:08:42,121
it's an alpha now and the way we use
alpha, that means it's internal, um,
:
01:08:42,121 --> 01:08:43,541
and we're about to take it into beta.
:
01:08:43,560 --> 01:08:46,241
And I want to talk through kind of
the steps of building that model.
:
01:08:46,371 --> 01:08:50,810
So step one is determining what
we call the AEC terms of interest.
:
01:08:51,416 --> 01:08:53,345
Um, those are the terms we're
going to end up caring about.
:
01:08:54,116 --> 01:08:57,416
Um, we then want to acquire context
for those terms of interest.
:
01:08:57,416 --> 01:09:01,636
So we understand what those terms mean
and how they're used in, in sentences.
:
01:09:02,286 --> 01:09:06,426
Um, we want to fine tune a generic
model with that AEC context.
:
01:09:07,056 --> 01:09:09,276
We then do alpha testing
with benchmark videos.
:
01:09:09,296 --> 01:09:11,676
We go through beta testing
with representative clients.
:
01:09:12,076 --> 01:09:14,926
We do deployment to all the people
that are participating in community
:
01:09:14,926 --> 01:09:16,986
AI and then continuous improvement.
:
01:09:17,296 --> 01:09:20,986
The lion's share of what I have to say is
really in steps one and two, and then the
:
01:09:20,986 --> 01:09:22,966
rest of them are pretty self explanatory.
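One way to read the alpha testing with benchmark videos is comparing each model's transcript against a human reference, for example with word error rate via the jiwer package; the sentences below are made up for illustration.

import jiwer

reference = "the revit family uses a type catalog for the curtain wall"
generic   = "the rabbit family uses a type catalog for the curtain wall"
aec_tuned = "the revit family uses a type catalog for the curtain wall"

# Lower WER on the benchmark clips is the signal that the fine-tune helped.
print("generic model WER:  ", jiwer.wer(reference, generic))
print("AEC-tuned model WER:", jiwer.wer(reference, aec_tuned))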
:
01:09:23,636 --> 01:09:27,395
So when it comes to determining
AEC terms of interest, we, uh, have
:
01:09:27,395 --> 01:09:28,666
three ways that we're doing that.
:
01:09:28,685 --> 01:09:31,256
One is to identify AEC specific terms.
:
01:09:31,841 --> 01:09:34,651
Um, and by, and I'll
show you how we do that.
:
01:09:35,031 --> 01:09:38,731
The other is to identify common
terms with AEC specific meanings.
:
01:09:38,731 --> 01:09:40,341
So we talked about programming before.
:
01:09:40,640 --> 01:09:42,591
Interview is another great example, right?
:
01:09:42,731 --> 01:09:44,871
I don't know of any other industry
that uses interview to talk
:
01:09:44,881 --> 01:09:46,591
about going to try and get a job.
:
01:09:46,600 --> 01:09:47,451
They say pitch.
:
01:09:47,711 --> 01:09:47,911
Yeah.
:
01:09:47,911 --> 01:09:48,431
Going after work.
:
01:09:48,481 --> 01:09:51,001
They say a pitch or we've got a
meeting or a presentation, whatever.
:
01:09:51,301 --> 01:09:51,571
Yeah.
:
01:09:51,801 --> 01:09:52,761
So many things.
:
01:09:53,341 --> 01:09:54,601
Um, and then there's acronyms.
:
01:09:54,611 --> 01:09:58,451
So those three things kind of
make up the AEC terms of interest.
:
01:09:59,121 --> 01:10:03,011
So we thought we were very clever and
we were only going to have to do step A.
:
01:10:03,031 --> 01:10:05,951
And as we got into it, we
realized B and C were necessary.
:
01:10:06,171 --> 01:10:07,621
So this is what we're doing in step A.
:
01:10:08,201 --> 01:10:12,061
We're extracting and de-identifying
lists of unigrams, bigrams, and
:
01:10:12,061 --> 01:10:13,721
trigrams from Synthesis content.
:
01:10:14,371 --> 01:10:17,101
Um, and a unigram is a single word.
:
01:10:17,151 --> 01:10:21,521
For example, accelerate, access,
acoustic, and a bigram is a two
:
01:10:21,521 --> 01:10:25,631
word pair, like account access,
action plan, and acoustic analysis.
:
01:10:26,091 --> 01:10:27,971
And then a trigram is a three word phrase.
:
01:10:28,441 --> 01:10:30,661
Annual Revenue Forecast,
Annual Performance Review,
:
01:10:30,711 --> 01:10:32,081
Acoustic Installation Material.
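A minimal sketch of pulling those unigram, bigram, and trigram counts out of a batch of text, before any de-identification or filtering; plain Python, nothing Synthesis-specific.

import re
from collections import Counter

def ngram_counts(texts, n):
    counts = Counter()
    for text in texts:
        words = re.findall(r"[a-z0-9']+", text.lower())
        counts.update(" ".join(words[i:i + n]) for i in range(len(words) - n + 1))
    return counts

posts = ["Acoustic analysis for the adaptive reuse project.",
         "Anchor bolt installation per the structural drawings."]
unigrams = ngram_counts(posts, 1)
bigrams  = ngram_counts(posts, 2)
trigrams = ngram_counts(posts, 3)
print(bigrams.most_common(3))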
:
01:10:32,751 --> 01:10:36,751
And so you'll see, this is a mix of
generic terms and AEC specific terms that
:
01:10:36,751 --> 01:10:38,461
we will pull out of Synthesis content.
:
01:10:39,051 --> 01:10:42,671
And you can also see that we're starting
with posts, documents, and pages.
:
01:10:42,751 --> 01:10:46,521
So, um, we're obviously going to do
video down the road, but like we need
:
01:10:46,521 --> 01:10:49,711
to do the video captioning before
we can extract content from video.
:
01:10:50,261 --> 01:10:53,511
Um, so the first step is a
two-step filtering process.
:
01:10:54,121 --> 01:10:55,911
So we're removing generic terms.
:
01:10:56,431 --> 01:11:00,041
using these lists of common
English unigrams, bigrams, and
:
01:11:00,041 --> 01:11:01,331
trigrams that we've assembled.
:
01:11:02,001 --> 01:11:08,481
Um, so what that gets you is, you know,
I'll strike out in the unigrams up top, I
:
01:11:08,491 --> 01:11:13,221
struck out accelerate and access, and now
I've got acoustic, abutment, and Autodesk.
:
01:11:14,201 --> 01:11:16,031
That's pretty good for a raw list.
:
01:11:16,471 --> 01:11:19,841
Um, in the bigrams,
I've got ACME strategy.
:
01:11:19,841 --> 01:11:22,461
So ACME is meant to be a
placeholder for a firm name.
:
01:11:22,631 --> 01:11:26,081
Um, so ACME strategy, acoustic
analysis, and adaptive reuse.
:
01:11:26,716 --> 01:11:29,956
And then we've got Acoustic Installation
Material, Air Quality Monitoring,
:
01:11:29,956 --> 01:11:31,176
and Anchor Bolt Installation.
:
01:11:31,946 --> 01:11:33,036
So that's our raw list.
:
01:11:33,036 --> 01:11:34,366
It's still not good enough yet.
:
01:11:34,936 --> 01:11:36,686
So we're going to do another filtering.
:
01:11:37,166 --> 01:11:42,286
So we are going to filter out terms which
don't appear at five companies or more.
:
01:11:42,936 --> 01:11:45,006
Um, so we're applying this
filter for two reasons.
:
01:11:45,016 --> 01:11:49,156
One is to help us kind of separate signal
from noise and prevent us from overfitting
:
01:11:49,166 --> 01:11:51,446
the model on just one firm's terminology.
:
01:11:52,026 --> 01:11:55,636
but it also has this side benefit
of alleviating concerns around
:
01:11:55,906 --> 01:11:57,956
confidentiality and intellectual property.
:
01:11:57,956 --> 01:12:02,476
So for example, that ACME strategy example
that we saw in the first example wouldn't
:
01:12:02,476 --> 01:12:04,006
make it through this filter, right?
:
01:12:04,066 --> 01:12:05,696
And it just gets knocked
out of the dataset.
:
01:12:06,168 --> 01:12:09,968
So that gets us a refined
list of AEC specific terms.
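The two filters can be sketched roughly like this; the data shapes and the common-English list are assumptions, not the actual pipeline.

COMMON_ENGLISH = {"accelerate", "access", "account access", "action plan",
                  "annual revenue forecast", "annual performance review"}

def refine_terms(term_to_firms, min_firms=5):
    # term_to_firms maps each extracted n-gram to the set of firm ids where it appears.
    refined = {}
    for term, firms in term_to_firms.items():
        if term in COMMON_ENGLISH:      # step 1: drop generic English n-grams
            continue
        if len(firms) < min_firms:      # step 2: must appear at five or more companies,
            continue                    # which also drops firm-specific terms like "acme strategy"
        refined[term] = len(firms)
    return refined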
:
01:12:09,968 --> 01:12:13,797
Um, any questions about this
or do you, should I keep going
:
01:12:13,797 --> 01:12:14,737
to the, kind of the next step?
:
01:12:14,999 --> 01:12:16,329
Evan Troxel: So, so you have two models.
:
01:12:16,329 --> 01:12:17,589
You have generic and then
:
01:12:17,657 --> 01:12:18,037
Christopher Parsons: Yep.
:
01:12:18,068 --> 01:12:18,559
Evan Troxel: specific.
:
01:12:18,579 --> 01:12:22,789
Are, do you eventually see it also
being like a third, firm-level one for,
:
01:12:22,809 --> 01:12:25,079
for like the ACME strategy version?
:
01:12:25,429 --> 01:12:27,429
Like, I want to be able
to find that stuff too.
:
01:12:27,439 --> 01:12:30,359
There's going to be repetition
in things like that, in all
:
01:12:30,359 --> 01:12:31,489
kinds of meetings that get
:
01:12:31,677 --> 01:12:31,997
Christopher Parsons: Yep.
:
01:12:32,749 --> 01:12:36,799
Evan Troxel: And it seems to me like if
I'm in a firm that could be really useful.
:
01:12:36,799 --> 01:12:40,499
That's not profession wide or
industry wide, but just firm
:
01:12:40,767 --> 01:12:42,437
Christopher Parsons: Yeah,
I, it's an open question.
:
01:12:42,437 --> 01:12:47,147
I mean, I think, I mean, to, to, it
may not end up being a third model.
:
01:12:47,527 --> 01:12:51,757
Um, but to kind of get at that,
how do we blend the kind of the AEC
:
01:12:51,757 --> 01:12:55,087
specific terminology with the firm
specific terminology, there are a couple
:
01:12:55,087 --> 01:12:56,447
of different techniques to do that.
:
01:12:56,707 --> 01:12:59,857
Um, we're, we're exploring a couple of
different ones, but in general, what
:
01:12:59,867 --> 01:13:03,209
we're seeing is, um, we're seeing that
:
01:13:03,209 --> 01:13:03,994
this is going to go pretty
:
01:13:03,999 --> 01:13:07,349
Randall Stevens-1: But I think, I think
Evan, uh, if, if that is specific to
:
01:13:07,349 --> 01:13:10,289
the firm and it's in their data, it'll
pick up in the, in the vector search
:
01:13:10,289 --> 01:13:10,609
anyway.
:
01:13:10,609 --> 01:13:10,709
It
:
01:13:10,709 --> 01:13:11,809
doesn't have
:
01:13:11,854 --> 01:13:12,494
Christopher Parsons: It'll pick up in the search.
:
01:13:12,494 --> 01:13:12,914
It'll pick up.
:
01:13:12,914 --> 01:13:13,074
Yeah.
:
01:13:13,074 --> 01:13:13,464
Sorry.
:
01:13:13,564 --> 01:13:16,384
The transcription one's
the trickier one, the transcription
:
01:13:16,384 --> 01:13:20,264
one is the one where, um, I was
trying to avoid doing this, but I'll
:
01:13:20,264 --> 01:13:21,434
just do the short version of it.
:
01:13:21,654 --> 01:13:27,094
There is an option to maybe send phrases,
which are specific to that firm along
:
01:13:27,094 --> 01:13:28,804
with the video to get it transcribed.
:
01:13:29,024 --> 01:13:33,193
Um, there's questions about building a
firm specific model as if there's enough
:
01:13:33,193 --> 01:13:34,904
volume to even make it worthwhile.
:
01:13:35,174 --> 01:13:36,244
Um, but.
:
01:13:36,654 --> 01:13:37,434
It's a good question.
:
01:13:37,514 --> 01:13:39,943
I, you know, and obviously
with anything AI there, yeah,
:
01:13:40,574 --> 01:13:40,814
yeah.
:
01:13:40,844 --> 01:13:41,654
Fix the context.
:
01:13:42,705 --> 01:13:42,885
Evan Troxel: that
:
01:13:42,954 --> 01:13:43,284
Christopher Parsons: Yep.
:
01:13:43,495 --> 01:13:43,695
Evan Troxel: right?
:
01:13:43,705 --> 01:13:45,415
It's like, yeah, yeah, I could see it
:
01:13:45,594 --> 01:13:45,914
Christopher Parsons: Yep.
:
01:13:46,734 --> 01:13:51,634
Um, so this is the thing that we missed,
but we started figuring out really
:
01:13:51,634 --> 01:13:55,534
quickly that we missed it: the common
terms that have AEC specific meaning.
:
01:13:56,054 --> 01:14:00,134
So, um, you know, when we applied
the filter on this slide, we
:
01:14:00,134 --> 01:14:05,434
knocked out all those terms, which
is fine, but we also knocked out terms
:
01:14:05,434 --> 01:14:07,144
like plan, section, and elevation.
:
01:14:07,709 --> 01:14:09,709
Which mean different things
in common English, right?
:
01:14:09,719 --> 01:14:15,849
Drawing, layer, model, core, flashing,
pitch, program, specification,
:
01:14:15,849 --> 01:14:18,879
transmittal, like these are,
these are words that exist on
:
01:14:18,879 --> 01:14:22,869
those common, um, unigram lists.
:
01:14:23,324 --> 01:14:26,064
But that we really want to
understand deeply what they
:
01:14:26,064 --> 01:14:28,164
mean in an AEC specific context.
:
01:14:28,364 --> 01:14:32,814
So we're using a variety of techniques,
both automated and manual, to assemble,
:
01:14:32,844 --> 01:14:36,174
you know, this list of the common
terms with AEC specific meaning.
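A minimal sketch of that filtering step, assuming a hypothetical common-English unigram list and a hand-curated AEC whitelist; the point is that words like plan or flashing survive the cut even though they sit on the common-word list.

    # Sketch: build a terms-of-interest list while protecting AEC homonyms.
    common_unigrams = {"the", "and", "about", "plan", "section", "elevation",
                       "model", "pitch", "program"}
    aec_whitelist = {"plan", "section", "elevation", "drawing", "layer", "model",
                     "core", "flashing", "pitch", "program", "specification",
                     "transmittal"}

    def terms_of_interest(candidates):
        # Keep a candidate if it is rare in general English, or if it is a
        # common word that carries an AEC-specific meaning.
        return [t for t in candidates
                if t.lower() not in common_unigrams or t.lower() in aec_whitelist]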
:
01:14:36,644 --> 01:14:38,544
And then the last piece is the acronyms.
:
01:14:38,714 --> 01:14:41,314
So, you know, this industry
is notorious for that, right?
:
01:14:41,314 --> 01:14:46,144
So we've got, uh, project phases, we've
got square footage and building codes.
:
01:14:46,144 --> 01:14:51,284
We've got all this, you know, kind of more
sustainable, uh, climate type acronyms.
:
01:14:51,284 --> 01:14:54,704
We've got, uh, kind of more contractual
or business-related acronyms, like,
:
01:14:54,704 --> 01:14:56,044
they're, this is just representative.
:
01:14:56,054 --> 01:14:56,864
This isn't all of them.
:
01:14:57,244 --> 01:15:02,184
Um, so we're, we're using a mix of
automated and manual processes to collect
:
01:15:02,193 --> 01:15:06,394
these, and this is helpful for both
transcriptions, but then also on the
:
01:15:06,394 --> 01:15:09,714
search side, this will allow us to
automatically expand search queries.
:
01:15:09,724 --> 01:15:13,594
So, for example, if someone
just searches on CD, we'll
:
01:15:13,604 --> 01:15:17,094
search on both CD and construction
documents. That's a simple example.
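A rough illustration of that query expansion, assuming a small hand-maintained acronym dictionary (the real list would be much larger):

    # Sketch: expand AEC acronyms in a search query before it hits the index.
    acronym_map = {
        "cd": "construction documents",
        "sd": "schematic design",
        "bim": "building information modeling",
    }

    def expand_query(query):
        # Search on the original query plus the expansion of any known acronym.
        expanded = [query]
        for word in query.lower().split():
            if word in acronym_map:
                expanded.append(acronym_map[word])
        return " OR ".join(expanded)

    # expand_query("CD set for the library")
    # -> "CD set for the library OR construction documents"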
:
01:15:18,107 --> 01:15:20,367
Evan Troxel: This is great because
I see this is where transcription
:
01:15:20,367 --> 01:15:21,827
services fail all the time.
:
01:15:21,837 --> 01:15:24,207
And even in the same
conversation that it's
:
01:15:24,425 --> 01:15:24,684
Christopher Parsons: Yeah.
:
01:15:24,967 --> 01:15:27,657
Evan Troxel: it'll do, it'll
transcribe something in multiple ways.
:
01:15:27,977 --> 01:15:29,697
Sometimes it'll actually write the number.
:
01:15:29,697 --> 01:15:30,542
Sometimes it'll write the text,
:
01:15:30,872 --> 01:15:32,782
The words that make up the number, for
:
01:15:32,895 --> 01:15:33,275
Christopher Parsons: Yep.
:
01:15:33,755 --> 01:15:34,095
Yep.
:
01:15:34,184 --> 01:15:37,434
I'm actually going to get to an example
where, yeah, it gets even worse than that.
:
01:15:37,677 --> 01:15:41,207
So now we're trying to acquire context.
:
01:15:41,567 --> 01:15:44,897
Um, so that's pretty, this
is a pretty simple step.
:
01:15:44,987 --> 01:15:50,877
So for each term that makes it into our
terms of interest list, we're extracting
:
01:15:50,877 --> 01:15:53,847
and de-identifying sentences from content.
:
01:15:54,177 --> 01:15:56,547
So for example, these are just
going to be Revit examples.
:
01:15:57,117 --> 01:15:58,757
You know, Revit's used in coordination.
:
01:15:58,767 --> 01:16:00,337
It allows the team to do blah, blah, blah.
:
01:16:00,737 --> 01:16:04,817
Or Revit can generate 3d models,
which helps us streamline design
:
01:16:04,817 --> 01:16:06,357
and create more visualizations.
:
01:16:07,417 --> 01:16:10,857
Or how we change and update
changes in the model, and so on.
:
01:16:10,887 --> 01:16:14,747
Like, we're just really
trying to understand how
:
01:16:14,747 --> 01:16:16,667
Revit gets used in sentences.
:
01:16:16,667 --> 01:16:18,977
And the reason that's important is.
:
01:16:19,317 --> 01:16:22,927
You know, I, I did a video a couple of
months ago, kind of talking about when
:
01:16:22,927 --> 01:16:25,827
we first started down this road, and
this is what we found in the early days.
:
01:16:25,837 --> 01:16:27,817
You already gave the example,
Evan, of Rabbit and Revit.
:
01:16:27,847 --> 01:16:28,877
So you've seen that too.
:
01:16:29,207 --> 01:16:33,707
Um, this, we saw the same thing with
Deltek, like two words, the software
:
01:16:33,707 --> 01:16:36,277
company in Austin versus the AEC Deltek.
:
01:16:36,277 --> 01:16:38,837
So, and there's a lot of
these kinds of things.
:
01:16:39,137 --> 01:16:42,077
So how did our transcription
model get this right?
:
01:16:42,107 --> 01:16:43,700
Where the generic model didn't.
:
01:16:43,980 --> 01:16:45,420
It came from ingesting.
:
01:16:45,895 --> 01:16:48,875
And being fine tuned with all of
these sentences so that when it
:
01:16:48,885 --> 01:16:51,705
sees a new sentence like this,
we're going to be offering a new
:
01:16:51,725 --> 01:16:55,515
course on coordinating blank models
in the summer, it knows to
:
01:16:55,515 --> 01:16:57,525
drop in Revit and not Rabbit, right?
:
01:16:57,525 --> 01:16:59,875
Because it understands in this context.
:
01:16:59,875 --> 01:17:03,405
Now, if it's about, I don't know, who's
eating the lettuce in your garden, like
:
01:17:03,405 --> 01:17:04,835
maybe Rabbit's the right choice for you.
:
01:17:04,835 --> 01:17:07,645
But like clearly in this context,
Revit is the superior answer.
:
01:17:07,847 --> 01:17:10,677
So that's kind of how we,
we determine the terms.
:
01:17:10,717 --> 01:17:12,257
We acquire context.
:
01:17:12,577 --> 01:17:15,527
Um, the fine tuning piece is super simple.
:
01:17:15,537 --> 01:17:19,377
We take all those sentences we've
collected and we send them into the
:
01:17:19,377 --> 01:17:23,257
model through a process
called fine tuning, and that gets
:
01:17:23,257 --> 01:17:24,757
us our AEC transcription model.
:
01:17:26,307 --> 01:17:26,997
So that's step three.
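On the data side, the fine-tuning input can be as simple as the collected sentences written out one per line; the actual upload and training call is vendor-specific, so it is only indicated in a comment here:

    def write_training_corpus(sentences, path="aec_related_text.txt"):
        # One de-identified sentence per line; plain text is a common input
        # format for adapting the language side of a speech model.
        with open(path, "w", encoding="utf-8") as f:
            for s in sentences:
                f.write(s.strip() + "\n")
        # The resulting file is then submitted to the speech service's
        # custom-model training job (vendor-specific call, omitted here).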
:
01:17:27,877 --> 01:17:30,237
So this is where we
are now in the process.
:
01:17:30,267 --> 01:17:33,177
And so what we're creating are what
we're calling benchmark videos.
:
01:17:33,687 --> 01:17:39,587
So these are special purpose videos
that we are making, um, to contain
:
01:17:39,587 --> 01:17:41,666
terms we're calling benchmark terms.
:
01:17:42,027 --> 01:17:45,497
And those are terms that we expect
a generic model will fail at.
:
01:17:46,017 --> 01:17:50,077
And we'll use those as criteria for
evaluating the success of our model.
:
01:17:50,527 --> 01:17:53,567
And these aren't all the benchmark
terms by any stretch of the
:
01:17:53,567 --> 01:17:56,937
imagination, but they are representative
of kind of two typologies.
:
01:17:57,547 --> 01:18:02,317
These acronyms on the left are acronyms
which you actually pronounce as words.
:
01:18:02,407 --> 01:18:05,392
So ASHRAE, BIM, CEQA, LEED, NEPA, NOMA.
:
01:18:05,912 --> 01:18:10,282
which are different than initialisms
like, I don't know, FBI or CD or SD.
:
01:18:11,132 --> 01:18:16,362
Generally, the generic model does okay
with SD and CD when you're enunciating
:
01:18:16,372 --> 01:18:19,692
the different letters, but it does
really poorly with something like BIM.
:
01:18:19,742 --> 01:18:20,472
It has no idea.
:
01:18:20,472 --> 01:18:22,392
It thinks it's BEN or it
thinks it's some other thing.
:
01:18:22,482 --> 01:18:23,902
So, um,
:
01:18:24,272 --> 01:18:25,622
the, the acronyms are important.
:
01:18:25,662 --> 01:18:29,152
And then on the right, when it's these
AEC-specific terms it's never seen
:
01:18:29,152 --> 01:18:31,342
before, then it really kind of struggles.
:
01:18:31,582 --> 01:18:33,652
And so those are what we're
really trying to isolate.
:
01:18:34,022 --> 01:18:39,052
It's these benchmark terms, to make
sure that as we, um, you know, upload
:
01:18:39,072 --> 01:18:44,102
those into the model and we generate
the AI generated AEC transcripts, we
:
01:18:44,102 --> 01:18:47,742
can compare them with human created
transcripts of those benchmark videos
:
01:18:47,762 --> 01:18:50,762
and we know the human transcripts
are a hundred percent accurate.
:
01:18:51,162 --> 01:18:51,422
Right.
:
01:18:51,422 --> 01:18:53,802
So then we kind of just try and find
the Delta and figure out where we're
:
01:18:53,802 --> 01:18:58,512
missing, which lets us then go and
improve the AEC transcription model.
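A minimal sketch of that benchmark comparison: count each benchmark term in the human transcript versus the model's transcript for the same video, so a miss like BIM coming out as BEN shows up as a delta:

    from collections import Counter

    def benchmark_deltas(human_text, model_text, benchmark_terms):
        """Return {term: (count in human transcript, count in model transcript)}.
        A term the model keeps missing shows up as, say, (5, 0)."""
        human = Counter(human_text.lower().split())
        model = Counter(model_text.lower().split())
        return {t: (human[t.lower()], model[t.lower()]) for t in benchmark_terms}

    # benchmark_deltas(human_txt, ai_txt, ["BIM", "ASHRAE", "transmittal"])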
:
01:18:58,522 --> 01:19:01,682
So, you know, in alpha, we'll
probably get to at least two or three
:
01:19:01,682 --> 01:19:05,541
generations of this model to get to a
good place, but that means fine tuning
:
01:19:05,541 --> 01:19:10,352
with more context, or in some cases,
fine tuning with custom speech clips.
:
01:19:10,642 --> 01:19:16,702
So it may be for ASHRAE, like it just
needs to hear ASHRAE said out loud in
:
01:19:16,712 --> 01:19:20,082
five different sentences with a known-good
transcription along with it.
:
01:19:20,392 --> 01:19:22,791
So it starts to understand
that phonetic combination in
:
01:19:22,791 --> 01:19:24,332
that text, if that makes sense.
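A minimal sketch of assembling those custom speech clips, pairing each short recording with its known-good transcript in a manifest; the exact file format a speech service expects is vendor-specific, so a plain CSV stands in here:

    import csv

    def write_speech_manifest(clips, path="aec_speech_clips.csv"):
        # `clips` is a list of (audio_file, known_good_transcript) pairs.
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(["audio_file", "transcript"])
            writer.writerows(clips)

    # write_speech_manifest([
    #     ("ashrae_01.wav", "We follow ASHRAE 90.1 for the building envelope."),
    #     ("ashrae_02.wav", "ASHRAE standards drive the mechanical design."),
    # ])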
:
01:19:25,632 --> 01:19:28,541
So, I'm, I'm assuming we're going
to have to do some, some additional
:
01:19:28,541 --> 01:19:32,072
fine tuning with custom speech, which
we then deploy a new version of the
:
01:19:32,082 --> 01:19:35,332
AEC transcription model, you know,
and then the circle is complete.
:
01:19:35,402 --> 01:19:38,402
And then we go through the upload
cycle and the comparison and we get
:
01:19:38,402 --> 01:19:39,902
to a place that we feel pretty good.
:
01:19:39,952 --> 01:19:42,502
And that, that will kind of
take us, you know, into beta.
:
01:19:42,916 --> 01:19:47,726
Um, just quickly, beta, we're going to get
qualitative feedback from beta testers.
:
01:19:47,776 --> 01:19:50,066
We're going to continue
doing qualitative reviews.
:
01:19:50,466 --> 01:19:53,536
We're capturing every transcript
edit that people are making.
:
01:19:53,536 --> 01:19:56,106
So our video transcripts, you
can edit them and improve them.
:
01:19:56,376 --> 01:19:59,666
And so we'll be able to look for
patterns and say like, look, we keep
:
01:19:59,675 --> 01:20:02,636
missing this word over and over and over
again, let's, let's improve the model.
:
01:20:03,226 --> 01:20:06,506
And again, doing that through more
context or that custom speech.
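A minimal sketch of that pattern-finding over transcript edits, assuming each edit is stored as an (original, corrected) pair; the most frequent corrections become the targets for the next fine-tuning pass (edit_log is a hypothetical variable):

    from collections import Counter

    def most_missed_terms(edits, top_n=20):
        """`edits` is an iterable of (original_word, corrected_word) pairs
        captured whenever someone fixes a transcript in the product."""
        corrections = Counter()
        for original, corrected in edits:
            if original.strip().lower() != corrected.strip().lower():
                corrections[corrected.strip()] += 1
        return corrections.most_common(top_n)

    # most_missed_terms(edit_log) -> [("Revit", 42), ("ASHRAE", 17), ...]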
:
01:20:06,870 --> 01:20:10,440
Evan Troxel: seems like there's an
opportunity here to present, like,
:
01:20:10,480 --> 01:20:14,710
during that QA process to actually
see if the transcription, it seems
:
01:20:14,710 --> 01:20:18,460
like it could ask a reviewer
:
01:20:18,793 --> 01:20:20,173
what's the right option
:
01:20:20,216 --> 01:20:20,776
Christopher Parsons: right.
:
01:20:20,777 --> 01:20:21,796
Yep.
:
01:20:21,796 --> 01:20:23,836
So, um,
:
01:20:24,093 --> 01:20:27,413
Evan Troxel: that, that whole fine
tuning thing is actually just, it's
:
01:20:27,423 --> 01:20:31,543
overseen by somebody who, who knows the
answer, right, to help it, train it.
:
01:20:32,496 --> 01:20:34,066
Christopher Parsons: Evan, are you
talking about the transcription piece
:
01:20:34,066 --> 01:20:34,996
or you're talking about the search?
:
01:20:35,234 --> 01:20:35,634
Evan Troxel: I'm talking about the
:
01:20:35,638 --> 01:20:36,038
Christopher Parsons: Okay.
:
01:20:36,218 --> 01:20:36,508
Yeah.
:
01:20:36,544 --> 01:20:36,704
Evan Troxel: right?
:
01:20:36,704 --> 01:20:41,023
Because if, if, if somebody says
ASHRAE and it's like, Ashtray
:
01:20:41,063 --> 01:20:41,493
Christopher Parsons: Right.
:
01:20:41,532 --> 01:20:41,963
Yeah, yeah,
:
01:20:41,994 --> 01:20:43,854
Evan Troxel: did you
mean, did you mean this?
:
01:20:43,854 --> 01:20:44,704
Or did you mean this?
:
01:20:44,734 --> 01:20:46,104
And, and all I have to do
:
01:20:46,404 --> 01:20:46,624
as
:
01:20:46,693 --> 01:20:47,173
Christopher Parsons: I see.
:
01:20:47,464 --> 01:20:49,224
Evan Troxel: you know, fine tuner is just
:
01:20:49,683 --> 01:20:50,093
Christopher Parsons: yeah, yeah,
:
01:20:50,093 --> 01:20:50,433
yeah,
:
01:20:50,914 --> 01:20:52,584
Evan Troxel: instead of going through and
:
01:20:52,673 --> 01:20:53,153
Christopher Parsons: I see.
:
01:20:53,374 --> 01:20:56,244
Evan Troxel: you know, just
trying to think of, of ways
:
01:20:56,452 --> 01:20:57,532
Christopher Parsons:
Streamline this process.
:
01:20:57,593 --> 01:20:57,863
Yeah,
:
01:20:58,354 --> 01:20:59,014
Evan Troxel: to streamline
:
01:20:59,014 --> 01:20:59,924
the process because
:
01:20:59,952 --> 01:21:00,383
Christopher Parsons: I get you.
:
01:21:00,773 --> 01:21:01,874
Evan Troxel: it also seems like it could
:
01:21:02,023 --> 01:21:02,593
Randall Stevens-1: One of our
:
01:21:02,784 --> 01:21:02,954
Evan Troxel: kinds
:
01:21:03,063 --> 01:21:03,333
Christopher Parsons: Got it.
:
01:21:03,363 --> 01:21:05,743
I thought you were talking about
this step where we're reviewing the
:
01:21:05,743 --> 01:21:07,913
transcript edits. I'm like, in
this step, they're telling us what
:
01:21:07,913 --> 01:21:08,863
they want it to be, but you were
:
01:21:08,863 --> 01:21:09,523
talking a couple of
:
01:21:09,657 --> 01:21:12,678
Randall Stevens-1: Yeah, along those,
along those lines, uh, we're hoping
:
01:21:12,678 --> 01:21:16,248
this summer to build a, uh, I'd really
like to, you know, I keep using the word
:
01:21:16,248 --> 01:21:21,548
gamify it internally, but it's like what
you really want is a quick way that, you
:
01:21:21,548 --> 01:21:26,298
know, if we, if we are making suggestions
for things, um, to give somebody say,
:
01:21:26,628 --> 01:21:30,368
here's what we thought this was, but
if this is something that's better,
:
01:21:30,378 --> 01:21:33,518
would you pick another word for it and
pop those up and just make it like,
:
01:21:34,223 --> 01:21:36,993
Okay, I didn't have to do a whole
lot of thinking and then put that
:
01:21:36,993 --> 01:21:38,473
back in the, in the feedback loop.
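A minimal sketch of that kind of lightweight review loop, with ask_reviewer and training_queue as hypothetical stand-ins for the UI prompt and the feedback store:

    def review_low_confidence(heard, candidates, ask_reviewer, training_queue):
        # Show the reviewer what the model heard plus likely alternatives,
        # record the pick, and queue it for the next fine-tuning pass.
        choice = ask_reviewer(
            prompt=f"We heard '{heard}'. Did you mean one of these?",
            options=[heard] + list(candidates),
        )
        training_queue.append({"heard": heard, "confirmed": choice})
        return choice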
:
01:21:38,863 --> 01:21:42,353
Where, where those interfaces are and
when people are willing to do that
:
01:21:42,353 --> 01:21:43,573
is an open question to
:
01:21:43,573 --> 01:21:44,173
figure that out,
:
01:21:44,173 --> 01:21:44,403
but
:
01:21:44,693 --> 01:21:45,153
Christopher Parsons: it is.
:
01:21:46,273 --> 01:21:46,813
It is.
:
01:21:46,913 --> 01:21:47,263
Yeah.
:
01:21:47,263 --> 01:21:49,273
We're definitely going to have
to do that next year on
:
01:21:49,273 --> 01:21:50,313
our search interface.
:
01:21:50,313 --> 01:21:50,532
Right.
:
01:21:50,532 --> 01:21:51,733
People need to give feedback, but
:
01:21:51,733 --> 01:21:52,443
like, is a thumbs
:
01:21:52,443 --> 01:21:53,093
down enough
:
01:21:53,393 --> 01:21:53,683
to help
:
01:21:53,708 --> 01:21:54,728
Randall Stevens-1: human in the loop,
:
01:21:54,798 --> 01:21:56,728
uh, of a lot of this early training,
:
01:21:57,228 --> 01:22:01,248
you know, I think to the conversation
earlier, Chris, it's, this is where, you
:
01:22:01,248 --> 01:22:05,048
know, I would, I'd like to think that
some of the more senior people in the
:
01:22:05,048 --> 01:22:07,698
industry who have the best knowledge
:
01:22:08,378 --> 01:22:10,638
would see this as a way of like, Hey, I
:
01:22:10,678 --> 01:22:12,388
can really help the next gen of
:
01:22:12,388 --> 01:22:12,618
this
:
01:22:12,713 --> 01:22:13,263
Christopher Parsons: Mm hmm.
:
01:22:13,448 --> 01:22:14,418
Randall Stevens-1: you know, cause they
:
01:22:14,418 --> 01:22:18,474
can, pass on a lot of knowledge
and information very quickly,
:
01:22:18,554 --> 01:22:18,904
right?
:
01:22:18,924 --> 01:22:20,464
It's like, boom, boom,
boom, boom, boom, boom,
:
01:22:20,684 --> 01:22:21,674
go knock some of this out,
:
01:22:21,859 --> 01:22:22,239
Christopher Parsons: Yes.
:
01:22:22,549 --> 01:22:25,209
Yeah, and I think that, you
know, it's interesting, like I
:
01:22:25,209 --> 01:22:27,309
got into this conversation with
our research council yesterday.
:
01:22:27,309 --> 01:22:32,749
There's a, there's an element of it, which
is kind of, well, in the very short term,
:
01:22:32,789 --> 01:22:34,949
there is like, you leverage yourself more.
:
01:22:35,284 --> 01:22:35,644
Right.
:
01:22:35,744 --> 01:22:40,664
Um, through making Synthesis, be able to
answer questions you would like in a very
:
01:22:40,664 --> 01:22:43,184
naive, naked self-interest way.
:
01:22:43,184 --> 01:22:45,824
Like you make it so that people aren't
coming to your desk and asking you the
:
01:22:45,824 --> 01:22:46,894
same question over and over again.
:
01:22:46,894 --> 01:22:47,724
And that's great.
:
01:22:48,054 --> 01:22:52,594
I think from a legacy perspective
though, it's kind of like this, uh, the
:
01:22:52,604 --> 01:22:54,574
Thornton Tomasetti story is like that.
:
01:22:54,584 --> 01:22:58,744
It's like, Oh, like the impact I
make now could, could still improve,
:
01:22:58,834 --> 01:23:00,034
could still help the company run
:
01:23:00,034 --> 01:23:02,284
well after I'm gone, like that's decades.
:
01:23:02,334 --> 01:23:02,554
Yeah.
:
01:23:02,554 --> 01:23:03,204
That's amazing.
:
01:23:03,324 --> 01:23:03,604
Randall Stevens-1: Yeah,
:
01:23:03,604 --> 01:23:07,193
Maybe there's a, uh, acknowledgement
piece that comes with it, right?
:
01:23:07,193 --> 01:23:09,394
That, you know, this stuff
was trained by these people,
:
01:23:09,443 --> 01:23:09,773
you know,
:
01:23:09,773 --> 01:23:10,104
it's like
:
01:23:10,584 --> 01:23:11,214
Christopher Parsons: interesting.
:
01:23:11,494 --> 01:23:12,004
I like that.
:
01:23:12,554 --> 01:23:13,023
That's cool.
:
01:23:13,418 --> 01:23:13,648
okay.
:
01:23:13,648 --> 01:23:15,398
Beta we then deploy.
:
01:23:15,628 --> 01:23:17,028
It looks pretty much the same.
:
01:23:17,208 --> 01:23:21,308
Um, so I'll keep going into
continuous improvement, which is
:
01:23:21,308 --> 01:23:22,838
really my last kind of section.
:
01:23:23,248 --> 01:23:28,878
So the, um, so we imagine like
during alpha, we're going to
:
01:23:28,878 --> 01:23:30,327
probably build two or three versions.
:
01:23:30,698 --> 01:23:33,418
We imagine during beta based on
feedback, we'll probably build two or
:
01:23:33,418 --> 01:23:34,668
three versions of the model, at least.
:
01:23:35,288 --> 01:23:38,278
Once we deploy, we imagine we'll build
two or three versions, like we'll be
:
01:23:38,278 --> 01:23:41,718
rapidly going, but at some point it
will kind of settle down, we think.
:
01:23:42,118 --> 01:23:47,788
And so then we get more into like an every
six months, plus or minus, we're probably
:
01:23:48,458 --> 01:23:49,778
building a new version of the model.
:
01:23:50,008 --> 01:23:52,058
Because we get improved
models from Microsoft.
:
01:23:52,068 --> 01:23:55,218
So, you know, they're going to
improve the underlying speech model.
:
01:23:55,218 --> 01:23:57,968
They're going to add new functionality
that lets us do different things.
:
01:23:58,838 --> 01:24:01,888
Uh, you know, English continues to
evolve and change as a language.
:
01:24:01,888 --> 01:24:05,228
And so it will bring in
new, um, new lexicon.
:
01:24:05,518 --> 01:24:07,228
So we'll want to do that.
:
01:24:07,718 --> 01:24:09,848
Um, we're going to add
more community AI members.
:
01:24:09,848 --> 01:24:14,428
So we launched the community AI program
about six weeks ago, and we have 56
:
01:24:14,458 --> 01:24:18,118
clients as of today already signed
up, which we are really happy about.
:
01:24:18,157 --> 01:24:22,958
So, you know, for context, KA, you
know, we're getting near 140 clients.
:
01:24:22,958 --> 01:24:26,368
So we're, we're, we're approaching
50 percent of our clients, which was
:
01:24:26,368 --> 01:24:27,858
our goal for the end of this year.
:
01:24:28,228 --> 01:24:31,318
So I kind of have every confidence
we're going to zoom right past that.
:
01:24:31,608 --> 01:24:34,718
So as we add more Community AI members, they bring more content.
:
01:24:35,088 --> 01:24:37,668
We're going to poke in additional
data sources like videos and
:
01:24:37,668 --> 01:24:40,518
profiles and potentially some of
those search connectors to bring
:
01:24:40,768 --> 01:24:42,228
more content to train the model.
:
01:24:43,443 --> 01:24:47,293
Um, people are going to be adding new
content and that means new terminology,
:
01:24:47,293 --> 01:24:48,903
new context, all that kind of stuff.
:
01:24:48,903 --> 01:24:51,623
And then, you know, we'll
continue doing fine tuning.
:
01:24:51,623 --> 01:24:58,073
So that's, that's really kind of the view
from late April.
01:24:58,123 --> 01:25:01,063
In October, I might have a lot different view
on how this is all playing out, but this
:
01:25:01,063 --> 01:25:03,373
is, this is the best I know right now of,
:
01:25:03,413 --> 01:25:04,032
of how we're doing
:
01:25:04,168 --> 01:25:06,558
Randall Stevens-1: uh, Chris, just
from a business standpoint, are
:
01:25:06,558 --> 01:25:07,708
you going to,
:
01:25:07,843 --> 01:25:08,173
Christopher Parsons: Yep.
:
01:25:08,743 --> 01:25:13,353
Randall Stevens-1: Differentiate what
people pay for the product between
:
01:25:13,353 --> 01:25:18,923
those two models, or it's basically, if
you want to opt in and participate, you
:
01:25:18,923 --> 01:25:21,543
get better information by opting in.
:
01:25:21,643 --> 01:25:22,702
It's a different way of paying, I
:
01:25:22,702 --> 01:25:23,263
guess, right?
:
01:25:23,263 --> 01:25:23,563
It's to
:
01:25:23,563 --> 01:25:23,903
contribute.
:
01:25:24,073 --> 01:25:25,123
Christopher Parsons: It's
definitely the latter.
:
01:25:25,243 --> 01:25:28,202
Um, yeah, so there is not a price
increase for people that want
:
01:25:28,202 --> 01:25:29,973
to use our AI functionality.
:
01:25:30,343 --> 01:25:34,883
I say that with like 95 percent
confidence and it wouldn't, the Delta
:
01:25:34,883 --> 01:25:39,613
wouldn't be because you were in the AI
program or not, you know, we think our
:
01:25:39,633 --> 01:25:41,243
inference costs are going to be fine.
:
01:25:41,253 --> 01:25:43,603
We, you know, like in terms of
like the infrastructure costs,
:
01:25:43,603 --> 01:25:44,773
like we think we're okay.
:
01:25:44,813 --> 01:25:47,573
It could turn out that we're,
that it, we have to kind of
:
01:25:47,593 --> 01:25:49,143
cover our cost change
:
01:25:49,143 --> 01:25:49,282
there.
:
01:25:49,282 --> 01:25:50,153
But I, I, don't
:
01:25:50,282 --> 01:25:53,293
Randall Stevens-1: it's the, I think it's
a healthy way to think about it that,
:
01:25:53,303 --> 01:25:57,228
you know, you're going to get, if you
contribute, it is the community aspect.
:
01:25:57,238 --> 01:25:58,058
You named it right.
:
01:25:58,098 --> 01:25:59,628
It's like it's the community aspect.
:
01:25:59,628 --> 01:26:03,888
If you contribute, then you're going
to get better info out of contributing.
:
01:26:03,888 --> 01:26:03,948
And
:
01:26:03,948 --> 01:26:06,308
if you don't want to contribute,
you don't get to participate in it.
:
01:26:06,498 --> 01:26:06,788
Right?
:
01:26:06,798 --> 01:26:07,068
It's like,
:
01:26:07,183 --> 01:26:07,393
Christopher Parsons: Yeah.
:
01:26:07,393 --> 01:26:11,673
And people can always opt in down the
line, you know, and I think one of the,
:
01:26:11,713 --> 01:26:15,593
you know, it's been a campaign, like
I have offered to meet with any one of
:
01:26:15,593 --> 01:26:20,123
our clients, we wrote FAQs and you can
imagine all the documentation we produced.
:
01:26:20,353 --> 01:26:22,963
I have met, I've offered to
meet with any client that has
:
01:26:22,983 --> 01:26:23,983
questions and wants to talk
:
01:26:24,473 --> 01:26:25,363
with them multiple times.
:
01:26:25,363 --> 01:26:27,913
If they need to go through it, I really
want people to understand what they're
:
01:26:27,913 --> 01:26:30,363
signing and to feel good about it.
:
01:26:30,463 --> 01:26:33,143
And if that means they wait
six months or two years
:
01:26:33,143 --> 01:26:36,998
or, you know, they do these high-end
embassy buildings and there's no
:
01:26:36,998 --> 01:26:40,157
way they feel okay signing the
contract because they're worried about
:
01:26:40,168 --> 01:26:41,548
their contracts with their clients.
:
01:26:42,458 --> 01:26:43,458
No judgment from us.
:
01:26:43,498 --> 01:26:45,728
Like we have an option
that you can use that
:
01:26:45,798 --> 01:26:46,178
Randall Stevens-1: Well, And
:
01:26:46,178 --> 01:26:46,548
I
:
01:26:46,588 --> 01:26:50,998
think it's, uh, you know, there's
other examples where this, um, this
:
01:26:50,998 --> 01:26:54,898
industry has tried to get cross firm
collaboration, but usually it was that
:
01:26:55,358 --> 01:26:59,188
they actually had to make some effort
and do things and then, and then
:
01:26:59,188 --> 01:27:02,327
it gets lopsided because one person
feels, you know, one firm feels like
:
01:27:02,327 --> 01:27:03,358
they've done more than the other.
:
01:27:03,608 --> 01:27:06,938
But I think with what you're talking
about, it's really, Hey, this is just a
:
01:27:06,938 --> 01:27:11,678
by product of making your info and data
available to help train these models on.
:
01:27:11,958 --> 01:27:15,988
So it's, uh, I, I could see that you
shouldn't run into those same kind
:
01:27:15,988 --> 01:27:21,583
of challenges with, uh, oh, I put
more in than, uh, you know,
:
01:27:22,423 --> 01:27:25,683
somebody, you know,
got more out of it than they put in.
:
01:27:25,683 --> 01:27:28,353
It's like, well, that's kind of hard,
really hard to measure in these kinds of
:
01:27:29,193 --> 01:27:29,563
situations.
:
01:27:29,638 --> 01:27:29,998
Christopher Parsons: Sure.
:
01:27:30,258 --> 01:27:30,698
Sure is.
:
01:27:32,014 --> 01:27:35,054
Evan Troxel: One thing that I keep
thinking of is, and you guys aren't
:
01:27:35,054 --> 01:27:39,564
guilty of this, you both have
products that have to be deployed
:
01:27:40,303 --> 01:27:40,702
Christopher Parsons: Mm
:
01:27:40,954 --> 01:27:45,144
Evan Troxel: from makes that
decision, leadership, it goes out to.
:
01:27:47,269 --> 01:27:51,409
There's a lot of other SaaS
and AI companies out there
:
01:27:52,039 --> 01:27:53,874
that are just going to Right.
:
01:27:53,894 --> 01:27:55,054
And they get into
:
01:27:55,353 --> 01:27:55,643
Christopher Parsons: hmm.
:
01:27:55,643 --> 01:27:57,373
Yeah.
:
01:27:57,654 --> 01:28:00,234
Evan Troxel: people use them at their
companies and maybe they should,
:
01:28:00,234 --> 01:28:04,064
or maybe they shouldn't like, this
is how Dropbox became a thing.
:
01:28:04,074 --> 01:28:04,294
Right.
:
01:28:04,294 --> 01:28:04,564
It was like
:
01:28:04,643 --> 01:28:05,013
Christopher Parsons: Shadow
:
01:28:05,273 --> 01:28:05,463
IT.
:
01:28:06,193 --> 01:28:06,563
Yeah.
:
01:28:07,303 --> 01:28:07,603
Yep.
:
01:28:07,704 --> 01:28:07,974
Evan Troxel: right.
:
01:28:07,994 --> 01:28:12,564
And, and it was, if I have any kind of
admin rights on my computer, I can use it.
:
01:28:12,564 --> 01:28:17,250
Well, these new SaaS programs, these
platforms don't require, all I need is a
:
01:28:17,264 --> 01:28:17,954
Christopher Parsons: And a credit card.
:
01:28:18,474 --> 01:28:18,724
Yep.
:
01:28:18,804 --> 01:28:20,224
And in some cases, it's
not even a credit card.
:
01:28:20,224 --> 01:28:21,304
In some cases, it's free.
:
01:28:21,504 --> 01:28:22,504
Up to 10 users.
:
01:28:22,504 --> 01:28:24,184
And so they get a, like, beachhead going.
:
01:28:24,244 --> 01:28:24,464
Yeah.
:
01:28:24,850 --> 01:28:26,690
Evan Troxel: or it's, or it's really in
:
01:28:26,693 --> 01:28:27,204
Christopher Parsons: They're free.
:
01:28:27,204 --> 01:28:35,464
Um,
:
01:28:36,140 --> 01:28:37,930
Evan Troxel: provide you the
answer, but also train the model.
:
01:28:37,930 --> 01:28:38,130
Right.
:
01:28:38,130 --> 01:28:40,330
So, so my question is like when the firms.
:
01:28:40,730 --> 01:28:42,850
Don't opt into your thing, Chris.
:
01:28:43,130 --> 01:28:48,770
And they don't move fast enough for the
users who are responsible for getting
:
01:28:48,770 --> 01:28:51,660
their job done on a day to day basis.
:
01:28:52,630 --> 01:28:55,470
And, and they, they want to take
this into their own hands, right?
:
01:28:55,470 --> 01:28:56,100
I like that.
:
01:28:56,110 --> 01:28:59,220
That is, I think, part of the
conversation for these firms as well.
:
01:28:59,220 --> 01:29:04,550
It's not like you have to do this, but
it's like, but understand like this is the
:
01:29:04,550 --> 01:29:06,910
trend in the industry that we're seeing.
:
01:29:07,309 --> 01:29:10,990
These, these companies are going
straight to the users and saying, like,
:
01:29:11,010 --> 01:29:15,370
look, you can just go to this address,
type in your question, get the answer.
:
01:29:15,820 --> 01:29:20,000
And, you know, this is something we've
talked about on this podcast, right?
:
01:29:20,000 --> 01:29:23,230
Which is governance and ethics and
all these things with AI and, and
:
01:29:23,230 --> 01:29:27,770
the companies may be even really
slow on the uptake of providing that
:
01:29:27,770 --> 01:29:31,330
information of why they've made the
decisions that they've made to their
:
01:29:31,330 --> 01:29:35,730
staff and what you can and cannot
use explicitly, no matter what.
:
01:29:35,740 --> 01:29:36,850
Here's why.
:
01:29:37,500 --> 01:29:40,820
I think that this is all part of that
conversation that needs to be happening
:
01:29:40,820 --> 01:29:46,870
because like firms that are intentionally
slow in deploying software that gives
:
01:29:46,870 --> 01:29:51,320
their users definite advantages on a
day to day basis, like the users are
:
01:29:51,320 --> 01:29:53,980
just gonna be like, okay, I'm just
going to take this into my own hands.
:
01:29:53,980 --> 01:29:55,680
Like this happens all the time, right?
:
01:29:55,710 --> 01:29:58,690
I'm, I'm just curious from
your standpoint, are you having
:
01:29:58,690 --> 01:30:00,400
those conversations with firms?
:
01:30:00,640 --> 01:30:02,890
Is that part of the conversation
I should, I should ask?
:
01:30:03,337 --> 01:30:04,147
Christopher Parsons:
I'll take a swing at it.
:
01:30:04,147 --> 01:30:06,097
I'm, I'm interested to
hear Randall's take.
:
01:30:06,147 --> 01:30:12,832
Um, I think a lot about like the
diffusion of innovation curve and
:
01:30:13,222 --> 01:30:17,692
recognizing that, um, you know, you're
going to have your early adopters.
:
01:30:17,692 --> 01:30:19,392
They're going to move very
quickly on this stuff.
:
01:30:19,392 --> 01:30:23,852
Like, um, when we announced the AI program,
like, I swear, we had people sign up
:
01:30:24,187 --> 01:30:27,317
so fast in some cases that I'm not
totally sure they could have read
:
01:30:27,317 --> 01:30:29,457
everything in order to sign up that fast.
:
01:30:30,057 --> 01:30:33,527
And I'm like, are we sure
you know, but like, I get it.
:
01:30:33,527 --> 01:30:36,277
Like, and you know, they, they,
and it's also trust in KA, right.
:
01:30:36,277 --> 01:30:37,057
Cause we have long term
:
01:30:37,057 --> 01:30:38,067
relationships with our clients.
:
01:30:38,666 --> 01:30:43,797
Um, you start getting into early majority,
late majority, laggard territory.
:
01:30:44,277 --> 01:30:47,567
Those folks are going to do their
homework, like the early majority and late
:
01:30:47,567 --> 01:30:50,997
majority are going to do their homework
and they're going to want to see value.
:
01:30:51,007 --> 01:30:54,017
Like people signed up and I'm
grateful for our community.
:
01:30:54,017 --> 01:30:56,627
People signed up on this pre-software.
:
01:30:56,697 --> 01:30:59,916
Like they haven't been able to like
see the generic version, understand
:
01:30:59,916 --> 01:31:01,467
what would be better if we have AEC.
:
01:31:01,477 --> 01:31:03,767
Like they can connect the
dots and they believe it.
:
01:31:04,057 --> 01:31:07,207
For people later in the diffusion of
innovation curve, they need to see it.
:
01:31:07,527 --> 01:31:09,987
And in some cases they need to
see other firms going first.
:
01:31:10,477 --> 01:31:11,437
To feel comfortable
:
01:31:11,707 --> 01:31:12,157
and they need,
:
01:31:12,457 --> 01:31:14,397
they need social proof and
whatever you want to call it.
:
01:31:14,937 --> 01:31:15,367
So
:
01:31:15,717 --> 01:31:16,937
I think that's one angle to it.
:
01:31:16,937 --> 01:31:20,797
The other angle is like, some of our
clients have pumped the brakes a little
:
01:31:20,797 --> 01:31:25,067
bit and are pulling back anything
AI related and kind of forming, you
:
01:31:25,067 --> 01:31:28,277
know, a governance committee or some
kind of policy or something like that.
:
01:31:28,287 --> 01:31:31,737
And so what we've heard back is
like, we like what you're doing, but
:
01:31:31,817 --> 01:31:34,537
in order to be equitable across the
company, because we said no to other
:
01:31:34,537 --> 01:31:38,147
AI projects, we have to have this
process in place first to evaluate this.
:
01:31:38,672 --> 01:31:40,012
Understand what we're evaluating.
:
01:31:40,392 --> 01:31:41,982
So I think that's really fair.
:
01:31:41,992 --> 01:31:44,692
I, I respect that, you know,
going slow and being fair.
:
01:31:44,692 --> 01:31:48,252
And I expect under, I also say like,
if you don't understand, some of them
:
01:31:48,252 --> 01:31:51,492
have been like, we like what you're
doing, but we don't have time right
:
01:31:51,492 --> 01:31:53,052
now to understand if we can sign up.
:
01:31:53,802 --> 01:31:55,142
And so we can't sign up.
:
01:31:55,272 --> 01:31:56,852
And so like, that's, that's cool.
:
01:31:56,912 --> 01:31:58,512
Like, I understand for
instance, I have a lot going on.
:
01:31:58,532 --> 01:32:00,791
So it's probably just a
mix of all that stuff.
:
01:32:00,791 --> 01:32:03,502
And I think what we're going to
do is keep building and releasing
:
01:32:03,512 --> 01:32:06,602
and showing value and iterating
and communicating how things work.
:
01:32:06,602 --> 01:32:10,262
Like that presentation I just shared
at the end around how we're actually
:
01:32:10,262 --> 01:32:11,412
building a transcription model.
:
01:32:11,712 --> 01:32:14,791
The reason I built that presentation
is in those meetings I was having with
:
01:32:14,791 --> 01:32:19,062
clients, I could feel this anxiety about
what they thought they were sharing.
:
01:32:19,072 --> 01:32:20,252
And then I showed them, and it's like,
:
01:32:20,742 --> 01:32:23,492
Acoustics, Abutment,
Autodesk, this kind of thing.
:
01:32:23,492 --> 01:32:25,182
And they're like, Oh my
God, it's totally innocuous.
:
01:32:25,212 --> 01:32:26,162
We can totally sign up.
:
01:32:26,182 --> 01:32:27,992
So like, they need to see that.
:
01:32:28,022 --> 01:32:29,072
And so there's another wave
:
01:32:29,082 --> 01:32:29,212
that
:
01:32:29,217 --> 01:32:30,067
Randall Stevens-1: you, if you look
:
01:32:30,067 --> 01:32:32,637
at my, uh, we do these
infographics every year, Chris,
:
01:32:32,666 --> 01:32:34,907
uh, which are like the
most searched terms right
:
01:32:34,927 --> 01:32:36,037
across the platform.
:
01:32:36,497 --> 01:32:39,247
And, uh, my joke is always, it's
like, if you ask a third grader how
:
01:32:39,247 --> 01:32:41,627
to build a house, it would be chair,
:
01:32:42,502 --> 01:32:42,842
Christopher Parsons: It would be
:
01:32:42,842 --> 01:32:43,041
those
:
01:32:43,102 --> 01:32:43,412
words.
:
01:32:43,737 --> 01:32:44,217
Randall Stevens-1: light.
:
01:32:44,577 --> 01:32:48,097
It's like, Hey, this is not
proprietary information.
:
01:32:48,117 --> 01:32:48,927
This is like,
:
01:32:49,997 --> 01:32:50,307
uh,
:
01:32:50,617 --> 01:32:51,027
Christopher Parsons: Yeah,
:
01:32:51,197 --> 01:32:54,977
so some of it's just building the trust
and showing your, showing your work, you
:
01:32:54,977 --> 01:32:56,637
know, which we're doing and, and, and
:
01:32:56,637 --> 01:32:56,867
that.
:
01:32:56,887 --> 01:32:57,337
So I don't
:
01:32:57,337 --> 01:32:57,497
know.
:
01:32:57,507 --> 01:32:57,727
What do
:
01:32:57,762 --> 01:33:01,132
Randall Stevens-1: I think, you know,
uh, forgetting about the AI component
:
01:33:01,132 --> 01:33:05,132
of it, you know, what we've seen,
um, you know, with firms deploying
:
01:33:05,132 --> 01:33:10,922
our technology in the firm, a lot of
times, um, it's deployed, you know,
:
01:33:10,922 --> 01:33:14,082
I always describe it as it's deployed
from a very command and control, right?
:
01:33:14,082 --> 01:33:18,992
The goal of the firm is we're, we're as
a firm going to do better and we're going
:
01:33:18,992 --> 01:33:20,492
to do, you know, we're going to do this.
:
01:33:20,492 --> 01:33:23,082
We're going to get organized, okay?
:
01:33:23,342 --> 01:33:28,262
And, you know, there's always this kind
of top down, uh, you know, kind of start
:
01:33:28,282 --> 01:33:32,302
of the project, but then, you know, the,
the healthiest customers that we have
:
01:33:32,302 --> 01:33:37,767
are the ones that, Also figure out that
really the best information sometimes
:
01:33:37,807 --> 01:33:39,467
isn't from the top down, it's both.
:
01:33:39,467 --> 01:33:41,277
You've got to have a
combination of grassroots,
:
01:33:41,577 --> 01:33:46,227
bottom up, and top down, and you've got
to have technology platforms that support
:
01:33:46,257 --> 01:33:47,977
both of those kind of coming together.
:
01:33:48,637 --> 01:33:53,997
And so, you know, we're, the work that
we're doing along the same lines, you
:
01:33:53,997 --> 01:33:57,407
know, the kind of technology that you're
implementing, Chris, we're doing similar
:
01:33:57,427 --> 01:33:59,007
experiments and things inside of Avail.
:
01:33:59,517 --> 01:34:03,187
One of the things I think that is,
um, you know, I'm trying to wrap my
:
01:34:03,187 --> 01:34:04,666
head around, like, where do we fit?
:
01:34:04,757 --> 01:34:06,666
And, and, you know, where should we fit?
:
01:34:06,666 --> 01:34:09,057
Because everybody's working
on these same kinds of things.
:
01:34:09,117 --> 01:34:11,697
And you're going to get this coming
from a hundred different angles.
:
01:34:11,697 --> 01:34:16,517
But one of the things I'm most excited
about is that in the end, you're not
:
01:34:16,517 --> 01:34:20,657
going to have a, I don't think you're
going to have a chatbot, I think you're
:
01:34:20,657 --> 01:34:27,297
going to have a thousand chatbots that
are tuned to very specific things, right?
:
01:34:29,277 --> 01:34:29,517
the
:
01:34:29,517 --> 01:34:31,347
agents become important, and.
:
01:34:31,474 --> 01:34:31,644
Evan Troxel: right?
:
01:34:31,654 --> 01:34:34,884
It's because I can't
even keep the Rolodex of
:
01:34:34,884 --> 01:34:35,134
all the
:
01:34:35,541 --> 01:34:35,952
Randall Stevens-1: No.
:
01:34:35,954 --> 01:34:36,193
Evan Troxel: that I
:
01:34:36,193 --> 01:34:36,674
need to
:
01:34:36,724 --> 01:34:37,443
talk with on
:
01:34:37,577 --> 01:34:40,337
Randall Stevens-1: You know, but, but,
but the models that are feeding those
:
01:34:40,337 --> 01:34:43,817
behind the scenes, the info that's
feeding those models, to me, that's what's
:
01:34:43,817 --> 01:34:48,217
interesting about where we sit is that
the people that have that information,
:
01:34:48,496 --> 01:34:51,197
we've got a content management
platform that lets them organize
:
01:34:51,197 --> 01:34:52,657
and manage that information, right.
:
01:34:52,657 --> 01:34:57,057
And curate it and, you know, keep the
model fed with the best information.
:
01:34:57,137 --> 01:35:00,827
And, uh, you know, we, we fight,
everybody fights the same stuff.
:
01:35:00,827 --> 01:35:06,937
I mean, it's like we use, um, Chris, we
use HubSpot for our, uh, you know, our CMS
:
01:35:06,967 --> 01:35:11,807
and, and ultimately we use their knowledge
base and it drives our, uh, you know, the
:
01:35:11,807 --> 01:35:15,277
knowledge base that our customers use to
find, you know, help documents on that.
:
01:35:15,277 --> 01:35:18,237
So we've been feeding our own chat
bot on all that info and I can go
:
01:35:18,237 --> 01:35:21,037
in there and ask it questions and
it's, you know, it's pretty good.
:
01:35:21,327 --> 01:35:24,897
And then I'll ask it another question,
I'll be like, eh, you know, that's
:
01:35:24,897 --> 01:35:27,416
not the best way to, I know that's not
the best way to answer that question.
:
01:35:27,416 --> 01:35:30,597
So then you gotta, now you gotta
have an initiative that says, we
:
01:35:30,597 --> 01:35:34,087
got to go back and start plugging
those holes with better information.
:
01:35:34,627 --> 01:35:38,117
Uh, I've, uh, you know,
become interested in this thinking
:
01:35:38,117 --> 01:35:42,077
about, you know, you, you want
to build in some kind of decay.
:
01:35:42,127 --> 01:35:44,947
Like some information
should decay over time.
:
01:35:44,947 --> 01:35:45,087
Its
:
01:35:45,087 --> 01:35:46,207
value should decay
:
01:35:46,367 --> 01:35:47,707
or assume it's going to decay over
:
01:35:47,707 --> 01:35:48,107
time.
:
01:35:48,927 --> 01:35:49,567
It's yeah.
:
01:35:49,567 --> 01:35:52,307
It's like, what's the expiration
date, and what drives that?
:
01:35:52,307 --> 01:35:53,437
Does time drive that?
:
01:35:53,837 --> 01:35:56,557
Does some other, you
know, shift or change?
:
01:35:56,557 --> 01:35:57,367
And how do you.
:
01:35:58,047 --> 01:36:01,467
How do you now tag that information
so that over time those things
:
01:36:01,477 --> 01:36:02,717
start to drop off at the right
:
01:36:02,717 --> 01:36:03,996
time and, you know, there's
:
01:36:04,047 --> 01:36:05,477
interesting, Yeah.
:
01:36:05,687 --> 01:36:07,227
interesting intellectual,
:
01:36:07,262 --> 01:36:07,912
Christopher Parsons: just cause it's old
:
01:36:08,017 --> 01:36:09,017
Randall Stevens-1: this
is, you know, this is
:
01:36:09,017 --> 01:36:12,257
why Chris, probably you and I are
the kind of people and Evan too that
:
01:36:12,257 --> 01:36:15,007
like, like these kinds of things
because it's a technology problem,
:
01:36:15,007 --> 01:36:16,397
it's an intellectually stimulating
:
01:36:16,657 --> 01:36:20,817
problem to try to figure out to say how,
how would you do this to make it evergreen
:
01:36:20,817 --> 01:36:22,827
and, and actually work over time.
:
01:36:22,827 --> 01:36:23,097
So,
:
01:36:23,437 --> 01:36:23,457
yeah.
:
01:36:24,212 --> 01:36:24,552
Christopher Parsons: Right.
:
01:36:24,552 --> 01:36:28,962
And from a, from a human perspective,
like one, one approach, like if it's
:
01:36:28,972 --> 01:36:31,922
true that like some things are evergreen,
some things expire in six weeks.
:
01:36:31,932 --> 01:36:33,302
Some things are good for two years.
:
01:36:33,371 --> 01:36:34,282
Some things you don't know.
:
01:36:34,282 --> 01:36:35,041
Some things, you know, we're good.
:
01:36:35,442 --> 01:36:39,871
Like the person in the, in the situation
of creating that thing, like what
:
01:36:39,882 --> 01:36:42,897
percentage of people are going to have
that kind of mindset? And that kind
:
01:36:42,897 --> 01:36:45,867
of like thinking in the future that
like, when I make this thing, I know
:
01:36:45,867 --> 01:36:48,787
what, what did, uh, Louis Kahn say,
imagine you're building in ruins.
:
01:36:48,787 --> 01:36:49,047
Right.
:
01:36:49,057 --> 01:36:52,227
So how many people creating
content are imagining their content
:
01:36:52,227 --> 01:36:52,797
in ruins?
:
01:36:52,827 --> 01:36:53,057
Like, I
:
01:36:53,077 --> 01:36:56,337
Randall Stevens-1: Yeah, I had a, I
had a friend, an engineer, uh, who
:
01:36:56,337 --> 01:37:00,827
was an engineer who worked for a, uh,
a company, a corporate, uh, company
:
01:37:00,837 --> 01:37:02,166
and was an engineer inside their team.
:
01:37:02,166 --> 01:37:06,007
And he told me the story, this goes
back years ago, but he said, you know,
:
01:37:06,067 --> 01:37:12,237
they started having, uh, they had to
invoke a rule that said anything that
:
01:37:12,237 --> 01:37:16,957
was like talked about in a meeting if
it was over 90 days ago is irrelevant,
:
01:37:16,967 --> 01:37:21,027
because they would have people say,
but nine months, a year ago in this
:
01:37:21,037 --> 01:37:22,337
meeting, you said this, and it's
:
01:37:22,337 --> 01:37:24,017
like, it doesn't matter.
:
01:37:24,087 --> 01:37:25,717
The context has completely changed since
:
01:37:25,717 --> 01:37:25,937
then.
:
01:37:25,947 --> 01:37:27,767
It's like, it doesn't matter.
:
01:37:27,847 --> 01:37:29,347
It's garbage, right?
:
01:37:29,717 --> 01:37:30,157
So.
:
01:37:30,777 --> 01:37:33,197
Christopher Parsons: I wonder if that gets
into, we talked a little bit about filters
:
01:37:33,197 --> 01:37:36,977
and kind of focus and certain, and I do
wonder if like, that is one of the aspects
:
01:37:36,977 --> 01:37:40,017
of how those filtering, you know, biases.
:
01:37:40,017 --> 01:37:41,827
And I mean, we do this
already in our search index.
:
01:37:41,827 --> 01:37:42,867
We weight very heavily.
:
01:37:42,867 --> 01:37:42,882
Right.
:
01:37:43,362 --> 01:37:44,121
I shouldn't say very heavily.
:
01:37:44,121 --> 01:37:46,962
We weight, we take, uh,
freshness into account
:
01:37:47,242 --> 01:37:48,412
when we return search results.
:
01:37:48,442 --> 01:37:52,202
It's not the only factor, but some
of this can be done algorithmically,
:
01:37:52,202 --> 01:37:56,112
but I think some of it is going
to be somebody who cares, you
:
01:37:56,112 --> 01:37:57,772
know, reviewing content, you know,
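A minimal sketch of folding freshness into ranking the way Chris describes, with an exponential decay on content age; the half-life and weights here are arbitrary illustrations, not Synthesis's actual tuning:

    import math

    def ranked_score(relevance, age_days, half_life_days=365.0):
        # Freshness decays exponentially with age; it adjusts the text-relevance
        # score rather than replacing it, since recency is only one factor.
        freshness = math.exp(-math.log(2) * age_days / half_life_days)
        return relevance * (0.8 + 0.2 * freshness)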
:
01:37:57,799 --> 01:38:00,429
Evan Troxel: it too is like, if,
if you're taking in data from
:
01:38:00,439 --> 01:38:01,739
all these different, you know,
:
01:38:01,871 --> 01:38:01,902
Christopher Parsons: right.
:
01:38:01,902 --> 01:38:07,115
Same signal.
:
01:38:07,257 --> 01:38:07,746
Randall Stevens-1: Patterns.
:
01:38:07,746 --> 01:38:08,272
You'll see patterns.
:
01:38:08,912 --> 01:38:08,982
Yep.
:
01:38:09,269 --> 01:38:12,849
Evan Troxel: in multiple, yeah,
signal from different sources, okay,
:
01:38:13,099 --> 01:38:14,898
there's still a freshness to that.
:
01:38:15,189 --> 01:38:15,809
Whereas,
:
01:38:17,009 --> 01:38:20,709
somebody said it differently,
something might be changing,
:
01:38:20,839 --> 01:38:23,413
or it might just be that person
doesn't understand it the way that
:
01:38:23,554 --> 01:38:24,443
these other people do.
:
01:38:24,674 --> 01:38:28,193
And so, yeah, all that, it, it really
complicates the situation potentially,
:
01:38:28,193 --> 01:38:31,844
but it also, I think there is a
way to actually do that, right?
:
01:38:31,844 --> 01:38:38,394
Where you can, because, because of this
idea of freshness and kind of signal,
:
01:38:38,394 --> 01:38:43,943
across mediums of, of communication and
capture, it does sound like it's actually
:
01:38:44,052 --> 01:38:46,152
Christopher Parsons: I think this is
a golden era for knowledge management.
:
01:38:46,162 --> 01:38:49,772
I mean, knowledge management's been
around, I mean, probably since like
:
01:38:49,772 --> 01:38:51,582
Hammurabi and like writing stuff down on
:
01:38:51,582 --> 01:38:55,062
tablets, but like, since the mid
nineties is when it really took off.
:
01:38:55,112 --> 01:38:56,322
And there were been a couple waves.
:
01:38:56,322 --> 01:38:58,192
I think this is the third
wave of knowledge management.
:
01:38:58,192 --> 01:38:59,362
I think it's going to be a golden era.
:
01:38:59,362 --> 01:39:03,142
Like, I think because of all the things
we talked about today and, you know,
:
01:39:03,412 --> 01:39:07,182
that value being so much higher in
this kind of gravity of like pulling,
:
01:39:07,192 --> 01:39:10,082
like you said, Randall, like you want
to, you want to plug those holes.
:
01:39:10,082 --> 01:39:12,632
Cause you know, if you do, this
is what you're going to unlock.
:
01:39:13,032 --> 01:39:14,612
Like, I just think that's
where we're headed.
:
01:39:14,612 --> 01:39:17,722
I think it's going to be a good run for
anyone that likes doing this kind of work.
:
01:39:17,722 --> 01:39:18,472
It's going to be interesting.
:
01:39:18,602 --> 01:39:19,312
Randall Stevens-1: Great stuff.
:
01:39:19,762 --> 01:39:21,592
Well, maybe that's a good, uh, place.
:
01:39:21,592 --> 01:39:24,172
I know we were, this, this
was a good two hours spent.
:
01:39:24,552 --> 01:39:25,702
It's a good deep conversation.
:
01:39:25,762 --> 01:39:26,871
We could probably spend two more.
:
01:39:27,102 --> 01:39:28,812
I love, uh, love talking about it.
:
01:39:28,882 --> 01:39:32,932
Uh, and, uh, thanks for sharing,
you know, uh, everything
:
01:39:33,012 --> 01:39:33,822
that you guys are working on.
:
01:39:33,822 --> 01:39:35,692
It's exciting, where that's all going.
:
01:39:35,692 --> 01:39:42,162
And, and, uh, I was, I teach this class,
uh, I had my last class of the semester
:
01:39:42,192 --> 01:39:45,492
in person yesterday and, you know, I
was just telling the class, they're not,
:
01:39:45,902 --> 01:39:50,282
they're engineering and business students,
but, uh, uh, at the university and, uh,
:
01:39:50,532 --> 01:39:53,862
but I was telling them, I'm like, you
know, a lot of the stuff that I work on.
:
01:39:54,432 --> 01:39:57,982
I get joy out of because I truly
hope that what I'm working on can
:
01:39:57,982 --> 01:39:59,291
make an impact on the industry.
:
01:39:59,302 --> 01:40:00,602
And I know you feel the same way.
:
01:40:00,632 --> 01:40:03,002
You know, both of you guys
feel the same way.
:
01:40:03,002 --> 01:40:06,482
And it's like, look, this is why we get
up every day and hopefully stay excited
:
01:40:06,482 --> 01:40:10,621
about it is there's great opportunities
to move the industry forward.
:
01:40:10,652 --> 01:40:14,902
Uh, you know, the comment made earlier
about that, that this industry is maybe
:
01:40:14,982 --> 01:40:19,212
swimming in an order of magnitude more
info and data than other industries.
:
01:40:19,212 --> 01:40:19,472
Yeah.
:
01:40:19,807 --> 01:40:23,496
Uh, it just means that these are
going to be vitally important
:
01:40:23,527 --> 01:40:25,147
technologies to get implemented here.
:
01:40:26,022 --> 01:40:27,072
Evan Troxel: Unstructured data
:
01:40:27,161 --> 01:40:28,001
Christopher Parsons: unstructured data.
:
01:40:28,630 --> 01:40:32,531
I just want to, I appreciate you
guys creating this forum to be able
:
01:40:32,531 --> 01:40:34,351
to go in at this level of depth.
:
01:40:35,376 --> 01:40:39,066
And honestly, this kind of like
level of technical like discussion.
:
01:40:39,116 --> 01:40:42,876
Um, it's not that, I mean, I
can't go to most conferences
:
01:40:42,876 --> 01:40:43,755
and talk about this stuff.
:
01:40:43,826 --> 01:40:48,456
So like, you know, it's nice to have
a space, and you guys are so
:
01:40:48,456 --> 01:40:49,076
good to talk to
:
01:40:49,226 --> 01:40:49,526
about it.
:
01:40:49,596 --> 01:40:50,505
Randall Stevens-1: We can, we can geek out.
:
01:40:50,505 --> 01:40:54,686
Maybe we need to start, uh, uh, doing
these as like the, uh, cocktail hour and
:
01:40:54,686 --> 01:40:57,306
we can just hang out forever and drink
a beer while we're doing it, right?
:
01:40:57,306 --> 01:40:59,146
Yeah, that.
:
01:40:59,146 --> 01:41:00,671
That's awesome.
:
01:41:01,242 --> 01:41:06,509
Evan Troxel: Thank you, Chris, for preparing
so much for this, because this truly
:
01:41:06,509 --> 01:41:10,749
does get to the mission of this podcast,
which is, you know, I characterize it
:
01:41:10,749 --> 01:41:12,519
as the director's commentary track.
:
01:41:12,529 --> 01:41:16,059
And look at us, we're, we're
a movie length episode here.
:
01:41:16,419 --> 01:41:19,109
We, we've got the director's
commentary track, you did the
:
01:41:19,109 --> 01:41:23,326
deep dive, it was conversational
and really rich with information.
:
01:41:23,326 --> 01:41:26,456
So I feel like it's kind of
checking all the boxes on what we
:
01:41:26,456 --> 01:41:28,056
wanted to do with this podcast.
:
01:41:28,056 --> 01:41:29,626
And it's been a fantastic
:
01:41:29,626 --> 01:41:30,226
episode and
:
01:41:30,634 --> 01:41:32,054
Randall Stevens-1: I have a feeling,
I have a feeling we're going to
:
01:41:32,064 --> 01:41:36,484
have you back on before too long
to see, see some of the fruits of
:
01:41:36,484 --> 01:41:36,634
all
:
01:41:36,634 --> 01:41:37,124
this, right?
:
01:41:37,609 --> 01:41:38,499
Christopher Parsons:
We're not going to stop.
:
01:41:38,519 --> 01:41:41,549
We're on a, we're on a tear right now,
so I can't wait to show you what we're
:
01:41:41,549 --> 01:41:42,409
doing in the future.
:
01:41:42,499 --> 01:41:43,299
Appreciate you guys again.
:
01:41:43,299 --> 01:41:43,559
Thank you so
:
01:41:43,589 --> 01:41:43,909
much.