Steve Germano of IMEG joins the show to talk about developing an in-house AI assistant named Meg, which functions as a conversational chatbot and search engine for company data running within Microsoft Teams.
We talk about the goals and capabilities of the first release and how the dev team is considering adding workflows to Meg's capabilities. We also discuss the platform they have chosen to build on, what the biggest challenges have been getting to this point, early feedback from staff and leadership, what the potential ROI on a project like this is, and more.
Episode Sponsor:
Chaos
Enscape is soon launching something special that will make your 3D workflow the best 3D workflow for a special price. In the meantime, you can experience it for yourself for free at https://chaos-enscape.com/trial-14 or simply by Googling “try Enscape”.
Episode Links:
Watch this episode on YouTube or Spotify.
-----
The Confluence podcast is a collaboration between TRXL and AVAIL, and is produced by TRXL Media.
Randall Stevens: Welcome to the Confluence podcast. This is a fun one today. I'll probably say that about every one, but I've ended up building a great network of people that I've met in the industry, and it's always a lot of fun to have them on here and talk about the exciting kinds of things that they're doing.

Today we've got Steve Germano. I've known Steve for over a decade. He was at UNIFI, and people probably always think it's weird. It's like, aren't they a competitor of yours, or weren't they a competitor before Autodesk? And it's like, yeah, but in this industry everybody's friends, and we all get along and do a lot of things together.

So I've known Steve for a long time. Most recently, he's at a large engineering firm called IMEG, and I just noticed, probably about a month ago, he posted on LinkedIn a little clip of an internal chatbot AI project that they were doing, kind of showing off some of the success of that. I just thought it was really cool, and obviously it's very topical: this year's Confluence, the three-day event that we did back in October, was all around AI and machine learning. So I immediately reached out to Steve and was like, oh, Steve, this is the coolest thing. Would you come on? Let's talk about it. Show us what you've done, and a little bit of the behind the scenes about how all this came about.

He agreed to come on and do it. I'm still trying to twist his arm to get him to come to the Confluence event that we're doing in New York in April, so hopefully we'll get him out for that as well. But I think everybody will enjoy hearing this episode, and hearing him dig into what's behind actually pulling this off: the technology, but more importantly, the people aspect of it and how it works inside of a firm like IMEG.
Evan Troxel: I think every firm is kind of thinking about doing this, and so for him to come on the show and show how they've done it so far, what things to look out for, what platform they're building on top of, all of that was really valuable. And I think people will get a lot out of this, because we all have experience seeing different firms and all the different departments, and how they could potentially benefit from ingesting their data and then having it answer questions in real time for the people who have questions, right?

So I think that it's really valuable, and IMEG is at the forefront of this, it seems like, because they're talking about it and showing it off, and it seems like a really valuable tool. And I can only imagine the alternative, right? It's like searching the intranet, going to the HR department, going to the graphics department, trying to find the right person. This actually cuts down on a lot of that, and it gives you that single source of truth.

But him also talking about accuracy and integrity of the data, and bias, and all of the things that are potentially in there as well, and what they're testing against, that's the other part of the story that everybody has questions about and isn't quite sure how to approach. And he's giving us some answers here today, so it's pretty...
Randall Stevens: Yeah. And when we kick into the episode, Steve will give a little bit more of his background. He was actually an engineer, but along the way he turned into someone who can write code, so now he's wearing both of those hats inside the industry. He's a smart guy, and as you'll see, his willingness to share what he's learned is just a huge benefit to the industry. So I appreciate that. Let's just dive right in.

Welcome, Steve, to the Confluence podcast. For those in the audience that don't know Steve, he's been an AEC industry guy for several years. Steve and I met each other, and have probably known each other, ten years. I was thinking back: Steve and his team, formerly when he was with UNIFI, were involved in helping to get the first Building Content Summit going, and that's the time I kind of remember. I ran him down on the floor at AU, maybe back in 2013 or '14, and we have known each other at least since then. So anyway, welcome to the podcast.

Part of this has been a series: we've been talking about AI and machine learning, and it caught my attention a couple of months ago when Steve posted something on LinkedIn that was showing off a new initiative he's got at IMEG, where he works now. So we thought we'd have you on, Steve, and talk through what's going on on that front. So welcome to the podcast.
Steve Germano: Yeah, thanks guys. Appreciate it.
Randall Stevens: Yeah. So I guess I'll just kick off and ask the question. You posted a video, as I said, over on LinkedIn, and you've got a chatbot that you've been developing, that your team's been developing, inside of IMEG, which is a large engineering firm. Maybe you can tell a little bit about what the firm does. But Meg, I think, is what you named your bot, which is interesting. We're doing some work here, and actually just earlier today somebody was asking, what are we gonna call this? Should we make it, you know, human? Should it be like it's a human, or are you talking generically to something? So we can dive into all these things and the backstory of what's going on there. Maybe you can just kick it off with a little bit more about yourself, how you ended up at IMEG, and the team that you're running there. And let's talk about Meg.
Steve Germano: Sure, sure. Yeah, there's a funny story behind Meg. But a little bit about me first. I'm originally a mechanical engineer by trade. I started in the AEC industry working for R.G. Vanderweil Engineers in Boston. I was actually hired as a mechanical engineer in their group, but I specialized on their CAD department, so we were doing CAD setup and all those things for every project across the whole company, while also doing mechanical design. And I really just didn't want to do the CAD side of things, so I naturally gravitated into programming. I had done some programming classes previously in school, I dove in, and six months or so later I had pretty much automated the CAD setup process within the company, and was able to focus more on the engineering side.

While I was at Vanderweil, I actually developed what is known today as UNIFI, though it was a very ugly version of UNIFI back then: a pretty simple UI and a SQL database, just to manage our BIM content amongst our offices, because that was a need we found we had. Fast-forwarding, that became a business. I had some great partners, and we all built up the UNIFI product and brand, and just recently, a year or so ago, it got acquired by Autodesk.

I had left UNIFI around 2020, 2021, and went back into the industry working for MSA Consulting Engineers, a smaller firm with about three offices. I got back into the industry as a director of design technology there and was helping revamp that business from a design technology perspective, IT, and all those things. And then, lo and behold, during COVID we end up getting acquired by IMEG. And I'm like, well, you guys already have a design technology director, so where do I go, right?

It turned out to be a really good fit. I had some great conversations with leadership, and at that point they had done some programming and had some automations going on, a lot of Dynamo, as every firm has, and they really wanted to dive in and structure a software development team. So I said, hey, I'm up for the challenge. I came on board as the head of the software development team, and we now have a team of six engineers today, and we're actively hiring more as we speak. So it's growing, it's growing rapidly, and it's been just an...
Randall Stevens: Steve, is that rare to see? I mean, a team of six is a pretty good team for whatever size firm you're at.
Steve Germano: It's actually larger than that if you count everyone in. There are six core developers today, but there are also product owners, and then there's also what we call PIMs, which are product innovation managers, kind of a new thing we developed at the beginning of this year. Each department now has a PIM, and all they do is think about workflows. That's their core competency: hey, how do we do X, Y, Z workflows? Where can we automate this? Where can we share data with other departments? And how do those workflows and interactions work with other departments? They're just thinking about innovation, which is awesome.

So they come up with use cases and ideas: oh, we need a widget for this, or a website for that. Those requirements flow down to a product owner, who gets more technical, specs things out, and works with me to quote and estimate them. And then we actually go and develop them. So there's kind of a big investment from IMEG, I would say.

And their growth has been amazing. They grow through acquisition, so they're very actively purchasing two to three firms a quarter, sometimes upwards of four or five depending on the size, filling in the dots across the nationwide map. When I started they were very much Midwest, where they had begun, and then they purchased our firm for the Southwest area. They had some West Coast presence and added more there. They then added the South and Southeast, and just recently they've added the Northeast. So we now have a good nationwide presence, and they keep growing through acquisitions, which just snowballs the number of apps and integrations and things people need across the organization. So it's been a really, really fun challenge, and a great place to work.
Randall Stevens: Yeah, that makes sense. So let's dive into this AI bot that you've been working on.

Steve Germano: Meg?

Randall Stevens: Yeah. So was this something that came through that process from the top down, or was this something you guys were experimenting with?

Steve Germano: No.

Randall Stevens: Yeah.
Steve Germano: Yeah, this was an interesting one. So, the original need: two years ago I was dabbling around with LLMs when they first came out, right? DaVinci-003 comes out, and instantly it got my interest. I'm like, we can now turn language into numbers that computers understand. And that was really the big thing about LLMs: hey, now this sentence means these 1,500 numbers, and now I can compare those numbers with other numbers, which are other sentences. It was awesome. I just dove in. I started messing with stuff in my own time, built my own little chatbot way back in the day. And no real use case at work, right? It was just dibbling and dabbling.
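What Steve is describing is text embedding: a model maps a sentence to a fixed-length vector (his "1,500 numbers"; OpenAI's text-embedding-3-small, for example, returns 1,536 dimensions), and comparing sentences becomes comparing vectors. A minimal sketch of that comparison, assuming the official OpenAI Python SDK; the sentences are illustrative:

```python
from openai import OpenAI  # assumes the openai SDK (v1+) is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def embed(texts):
    """Turn sentences into fixed-length vectors (lists of floats)."""
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return [d.embedding for d in resp.data]

def cosine(a, b):
    """Compare two embeddings; values near 1.0 mean 'semantically close'."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

query, doc1, doc2 = embed([
    "How do I submit an expense report?",
    "Expense reports are filed through the finance portal.",
    "Arc flash calculations for an entire building.",
])
print(cosine(query, doc1))  # high: semantically related
print(cosine(query, doc2))  # low: unrelated
```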
And then I had a product owner come to me, and he goes, you know, I'm going out, I'm doing these site visits, talking to our offices, and we're doing some trainings on our Revit apps. We have a Revit app suite of probably 60 different types of tools, from small push-a-button, execute-something apps up to some pretty large ones that do full electrical calculations for an entire building, arc flash calculations, some pretty cool stuff. And folks need training, right? So we have these guys go out and do trainings.

He comes to me and goes, hey, I got done with the training, and I'm talking to this team. It's a fairly new acquisition; I think they were about six months into the acquisition process. And they didn't know about half the tools. They're like, oh man, I had to do this whole workflow, and it would be great if we had a tool for that. And he's like, oh yeah, we have one, it's right here. And then another person was like, hey, it'd be great if I had a tool for this. Yeah, we have that, it's right here. So he was like, hey, can we just make a search engine, so that people can just say, hey, I wanna do X, Y, Z, and find the tool we already have? And so that's where it started.

I just went, oh, I could do that. I'll throw that into an LLM. We'll do some AI stuff, no big deal. And it turned out to be really easy. It was like a two or three thousand line app, really small. I think I did it over a weekend, just messing around to see if I could do it. I showed him the next week, and he was like, oh, that is cool. What else could we put in there? I'm like, oh, okay, here we go, right? So that was the beginning of it. It was just a prototype, and we weren't actively working on it as a project.
It just sat there for a couple of months, and then the tech started maturing, right? There was that big aha moment when GPT-3 came out, when DaVinci-003 came out, and all of a sudden everybody's like, oh my god, chatbots, chatbots, chatbots. OpenAI was blowing up, leadership's hearing about this stuff, and our CIO and I were having a conversation.

We have a whole data team that works at IMEG, and they do amazing work. They're basically doing a lot of ETL pipelines: transforming data, pulling it from one database, putting it into another place, merging data together, and building Power BI reports for all these different use cases and people throughout the business, which every AEC firm is doing these days. And they're extremely busy; they're always busy, right? There are always new reports and new insights to get from our data. So we have this great team that's really actively involved in our data, and one of the first things our CIO did when he came on board was say, hey, we gotta clean our data. We can't really get any insights from our information unless we have it structured and we know how to actually go and access it.

So I was having a conversation with him, and, you know, we've got content on SharePoint, we've got databases behind VantagePoint and Salesforce, and every firm has 15 different places where data lives. And there's no real consolidated location where we bring that stuff together and can then query it, right? Without having to go bother the data team to make me a new Power BI. And I was like, well, I think I can consume that data into a chatbot. I think there are new things coming out, and I was always reading white papers as soon as they launched. I'm a big white paper nerd. Yeah, I need to get a life. But I love reading those white papers, right?
So one comes out and it's like, oh, we just developed... I think it was Stanford that developed the first RAG pattern, RAG being retrieval-augmented generation. When LLMs first came out they had a lot of problems with hallucinations, making things up; they couldn't do math; they really weren't all that trustworthy, right? And so this RAG pattern came out. The idea was: hey, I wanna ask questions about this set of information, whether that's a paragraph or a book or whatever it is. If we take that information and put it inline with the user's prompt, the LLM will read the user's question, see that information, and answer from just that information, and not hallucinate. And there are some dials and knobs to turn on the LLM to make sure it doesn't hallucinate when you give it those types of RAG instructions.
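The RAG loop Steve describes reduces to: retrieve the passages relevant to the question, place them inline with the prompt, and instruct the model to answer only from that material (and to say so when it can't). A minimal sketch under those assumptions, reusing the OpenAI client from the embedding example; the model name is a placeholder, not what IMEG runs:

```python
SYSTEM = (
    "Answer ONLY from the provided context. "
    "If the context does not contain the answer, say you don't know "
    "and summarize what you did find."
)

def answer(question, chunks):
    """chunks: the top-ranked text passages retrieved for this question."""
    context = "\n\n".join(chunks)
    resp = client.chat.completions.create(
        model="gpt-4o-mini",   # assumption: any capable chat model works here
        temperature=0,         # turn the 'creativity dial' all the way down
        messages=[
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return resp.choices[0].message.content
```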
When that came out, I read that white paper and I was like, you know what, I think we can actually do this now. Because my primary concern was, I can't release a chatbot that's gonna, one, say bad things to people and get us in trouble, and two, make stuff up. We're talking building engineering, structural, right? I always used to make a joke as an HVAC guy: if I made a mistake, somebody's hot or cold. If a structural engineer makes a mistake, there are bigger liabilities. So we wanted to make sure the data it's retrieving is accurate, first and foremost, and that pattern opened up the opportunity. So in that conversation I was like, I think we've got an opportunity here where we can actually execute on this. And we're like, okay, let's go build a prototype. So we built that prototype, and it just kept going, and now here we are.
Randall Stevens: Really, the moment was that the data's there, and it's really just freeing up the front end and making it accessible without having to have a data scientist, or even somebody at the level that knows Power BI and how to pull all this together. So it sounds like there are exciting opportunities, I think, across the industry to begin tapping into these data sources like you guys are. What are all the different places the data being sourced now lives in?
Steve Germano: Yeah, that's a great question. There are a lot of them today, and there are really two buckets. If you think about data in an organization today, there's structured data, which lives in a database somewhere: that could be a SQL database, or it could be reached via APIs to something like VantagePoint or Salesforce, but the data still lives in an underlying database. Then there's unstructured data, which is our documents: I don't know how many engineering memos we have on SharePoint, plus PDFs, code documents, things of that nature. That data's very unstructured.

So how do you get the data from all these different sources and bring it in? That was a real challenge, probably the biggest challenge of this whole tech stack: not only how do we consume it, but as it updates, and updates live, I don't wanna ask a question and get outdated data. So we had to build a system and architect the infrastructure so that as things change live, it can update that data in real time, and you're never getting out-of-date information. That was probably the most critical piece to this actually working successfully in the organization.

That involved being able to get all of our information off SharePoint, which is mostly the unstructured data, as well as tapping into Salesforce databases and our data lake, where our data team ETLs a lot of API data together into a single location. So yeah, a lot of different places: VantagePoint today, the employee directory, which is structured data, all of the SharePoint documents, Salesforce, and I think that's about it right now. Oh, and the project database, which I think is a mix.

And this is another fun part: half of our data for projects lived in Salesforce and the other half lived in VantagePoint, and melding those two together, searching across both of them with a single question, meant very complicated data structures. So this whole thing with the structured data, we would never have been able to execute it if we didn't have our data team. One of the things we were able to do working with that team was say, hey, I know this data lives in two or three databases, but if we can make a single location for it and ETL that data in, it's much easier for us and for the LLM to consume. So we did a lot of work with those folks, and that really made the opportunity possible.
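The freshness requirement Steve calls out usually turns into an incremental indexing loop: watch each source for changes (modified timestamps, webhooks, change APIs) and re-embed only what changed, rather than rebuilding the whole index. A toy sketch of that idea, reusing the embed() helper from earlier; the fetch_changed_docs connector and the in-memory index are hypothetical stand-ins for a real connector and vector store:

```python
import time

# doc_id -> {"modified": ..., "embedding": [...], "text": ...}; a real system
# would use a vector database, this dict just shows the shape of the loop
index = {}

def sync(fetch_changed_docs, embed, since):
    """Re-embed only documents modified after `since`; drop deleted ones."""
    for doc in fetch_changed_docs(since):        # hypothetical source connector
        if doc.get("deleted"):
            index.pop(doc["id"], None)           # keep stale answers out
        else:
            index[doc["id"]] = {
                "modified": doc["modified"],
                "embedding": embed([doc["text"]])[0],
                "text": doc["text"],
            }
    return time.time()  # high-water mark for the next pass

# run periodically, or on webhook events, so answers never use stale content
```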
Randall Stevens: You were touching on this earlier: ultimately, with any new technology like this, if people don't trust it... you know, we were talking about the hallucinations and the challenges of overcoming those early-stage obstacles. And I'm the same. You only get one bite at the apple, one swing at it, right? As soon as somebody has a bad experience... So maybe you can dig in a little on that. You had to prove it out to yourself that it would work, but how did you start to show it to others? What were the data sources? How did you prove out and make sure that, one, you were putting something out there that could be trusted and wasn't hallucinating? And then I'm sure once people got a taste of that, they were asking to broaden the sources of that info. Maybe you can talk a little bit about that.
Steve Germano: Yeah. So Meg basically has access to these different locations for data. As you ask a question, it goes and retrieves the data, and then responds in a human-like way, right? And so it's about making sure, in how you do your prompt engineering, that you make the LLM really safe, so it doesn't hallucinate on that data, doesn't make up numbers. That's a prompt engineering thing, and prompt engineering is tricky. There's continual research being done on it right now, so almost every week there are new white papers coming out with new techniques. Telling the LLM very simple things, like "let's think step by step," or "no prose," meaning don't be verbose, just give me the information I'm asking for: those little things in the prompt engineering make a big difference.

But as you start to tweak some of the LLM settings, temperature and top-p and those types of technical settings, you can basically tell the inference engine, hey, don't be creative. There's like a creativity dial in there, if you will. And as you tweak these dials and do some of this prompt experimenting, you can find the right marriage. That is not an easy thing; I was prompt engineering something this morning. It's a continual thing. And now that we're into a company-wide rollout, we're getting a lot more analytics, so you can start to tweak even a little more and fine-tune.

But that's really the key: one, can you get your data in, and two, can you make sure the LLM responds correctly? And, most importantly, if it can't answer the question with the data you provided, it should say it doesn't know the answer.
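The "creativity dial" corresponds to two sampling settings exposed by most chat APIs: temperature flattens or sharpens the next-token distribution, and top-p caps how much probability mass is eligible at all. A hedged sketch of pinning both down for retrieval answers, with the refusal instruction folded into the system prompt; `client` carries over from the earlier sketches, and `prompt` is assumed to already contain the question plus retrieved context:

```python
resp = client.chat.completions.create(
    model="gpt-4o-mini",   # assumption: substitute the chat model you use
    temperature=0.0,       # 0 = always take the most likely next token
    top_p=0.1,             # sample only from the top 10% of probability mass
    messages=[
        {"role": "system", "content": (
            "Answer only from the provided context. No prose. "
            "If the context cannot answer the question, say you don't know "
            "and summarize what you did find."
        )},
        {"role": "user", "content": prompt},  # question + retrieved context
    ],
)
print(resp.choices[0].message.content)
```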
So we've taken the approach, and I'll show you in some examples when we get to it, that we might not have the information to answer your question; that may just be a knowledge gap. Our product owners are really focused on that. Now that we have good analytics coming in, it's like, oh, this person asked a question but Meg didn't have the answer. So that's a knowledge gap: let's go get our librarians and get that information added to the data sources, right? And making sure it just says, hey, I don't know the answer, but here's what I did find, let me summarize it, seemed to be really adequate for people.
Because what they ask may be very vague. And to get to our conversation before we went on: how do people search? That's a really ingrained thing from the past 20 years of working with Google. We search by keyword, a lexicon kind of search: one word, add another word, add another word. We don't typically search in a conversational way, but LLMs work with a conversation. So I always say to our product owner, kind of as a joke: ask dumb, get dumb. Ask smart, get smart. If you actually put more information into your text query, then as it gets vectorized there's more information to compare against the associated data, and you get more accurate results coming back. If you put in one or two vague words, you're not really asking a question, you're doing a search, and it's harder for the LLM to figure out what your intent is.

And there are lots of tricks out there. There's the one where, as soon as a question comes in, you pass it through an AI that just determines intent, and then pass it on to the other things, right? Lots of tricks; I've tried a lot of them. We've settled on a couple of different AIs that interact at the time you ask a question. And acronyms are a big hassle. Oh man, acronyms. Whew. That's giving me a lot of trouble. We live in acronym soup as architects and engineers.
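One common form of the intent trick Steve mentions is a cheap first LLM pass that rewrites the raw query, expanding acronyms and turning keyword fragments into an explicit question, before retrieval ever sees it. A sketch under those assumptions; the glossary entries are hypothetical examples, not IMEG's list:

```python
GLOSSARY = {
    "VDC": "virtual design and construction",
    "AHJ": "authority having jurisdiction",
}

def rewrite_query(raw: str) -> str:
    """One cheap LLM pass: expand acronyms, restate as a full question."""
    gloss = "; ".join(f"{k} = {v}" for k, v in GLOSSARY.items())
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: a small, fast model is enough here
        temperature=0,
        messages=[
            {"role": "system", "content": (
                "Rewrite the user's search text as one explicit question. "
                f"Expand acronyms using this glossary: {gloss}. "
                "Return only the rewritten question."
            )},
            {"role": "user", "content": raw},
        ],
    )
    return resp.choices[0].message.content

# "vdc wall hosting" might become
# "How does wall hosting work in virtual design and construction (VDC)?"
```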
Randall Stevens: It makes me think, and I hadn't even started thinking about this, but how do you write a test plan if you're developing software with this kind of a front-end engine? How do you write a test plan? You've now got to ask, what kinds of questions are people gonna ask? And that's basically infinite. So how do you really test this? Have you started thinking about that? What does a test plan look like for a bot?
Steve Germano: We've thought about it. I don't know that we have a great answer for it today. I've seen some examples from the Microsoft development team I'm part of, the one that does Semantic Kernel, which is kind of their AI engine, if you will. Those folks are trying to do the same things we're doing right now. Everyone's in the same boat, everyone's developing and getting into production at the same time, and everyone has the same question: how do we test LLMs?

There have been some good white papers put out, and OpenAI released an update to one of their models, I wanna say the November release last year, if I remember correctly, where you can now pass in a GUID, for all intents and purposes; it's basically an identification token. When this exact string is asked, and I use this token, here's the answer. So when you run your test, the same question with the same token should get the very same answer, or close to it. That was actually key, because of this exact question: their own developers, and OpenAI's third-party app developers, were asking for the same thing. How do we guarantee consistency from question to question?

And we still struggle with that. You can ask the same question twice and you're gonna get two different answers; the intent might be the same, but they'll be worded differently. So there are some struggles you're gonna deal with that I don't think the industry has a hundred percent worked out. But from a unit-testing perspective, using that key is supposed to help the inference engine stay consistent, if you will, going through the same neural network pass. But it's...
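What Steve describes matches the seed parameter OpenAI shipped in its November 2023 API update: send the same integer seed with the same request, and the service samples as deterministically as it can, returning a system_fingerprint that changes when the backend does. A sketch of a repeatability check built on it; best-effort only, as he notes:

```python
def ask(question: str, seed: int = 12345):
    """Same question + same seed should give (nearly) the same answer."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",   # assumption: use the model you actually test
        temperature=0,
        seed=seed,             # best-effort determinism, not a guarantee
        messages=[{"role": "user", "content": question}],
    )
    return resp.choices[0].message.content, resp.system_fingerprint

a1, fp1 = ask("How do I submit an expense report?")
a2, fp2 = ask("How do I submit an expense report?")
if fp1 == fp2 and a1 != a2:
    # same backend and same seed but a different answer: the residual
    # non-determinism the regression suite has to tolerate
    print("warning: answers diverged despite identical seed/backend")
```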
Randall Stevens: We ran some very similar kinds of tests with some of the work that we're doing here internally. One of my first questions was: if you feed the very same text string in, over and over and over, do you get the same answer back or not? And you don't. You might get the same thing nine times out of ten, and then all of a sudden something a little bit varied comes back. So it is.
Evan Troxel: There are some interesting expectations around that, right? Because we expect a computer to answer the same way every single time, but it's actually like you're asking nine different people the same question. It's just like sitting in architectural design studio: everybody has the same brief, and you get 16 different outcomes, because everybody is applying their own perspective. And it's kind of like this, right? It's searching across all of the possible information that could apply, and it's an LLM, so it's just looking for patterns in words and concepts. I see why it's coming back differently, but we have this expectation, when we're asking a computer something, that it will be like math and not like a conversation. And this has come up before, right? I don't know what the next word I'm gonna say is. I don't know what you're gonna say next, and you don't know what you're gonna say next, but you respond in real time. That's exactly what's going on with this. So it's not different; we just think about it differently.

Randall Stevens: Yep.
Steve Germano: Yeah. When this comes up, and it comes up quite often right now, the technical term is deterministic versus non-deterministic, and LLMs are non-deterministic. So every time somebody tries to give me a bug report, I have a meme I just send up to them: non-determinism. It's actually built from OpenAI's own explanation of what non-deterministic means. But it's challenging, and as programmers we're almost having to retrain our brains, because our job, right, is to get deterministic outputs. A hundred percent.
So how do you, as a programmer, retrain your brain to say, hey, now you're gonna start using AI, and not just chatbots? Everybody thinks chatbots, but you can use an AI planner, right, inside a program. I'll give you a great example of something we're working on right now. When I place an outlet, how do I know which hosting behavior it should go into in Revit, when there are 50 of them? Does it go on a wall? In the floor? In the ceiling? On the work plane? There are all these different scenarios. When you can't code all those branches, that's where AI has a great advantage, because it's like, hey, I can give you the inputs, give you instructions on how to determine your thought process, and then you give me what you think is the right output.

That is where I feel programming is changing right now. Everyone's enamored with chatbots, and that's the first stop. But moving forward, how do we use a small bit of AI that may never even be seen by the end user, doing things behind the scenes to just make decisions, from a very large space of potential outcomes? There's a lot of ML that can do that in a mathematical manner, when we start talking about ML algorithms and things like that. But when you're talking about reasoning, that's a little different, and LLMs are great at reasoning, especially if you give them good instructions on what to reason over, with set parameters coming in and set outputs going out.

And OpenAI's models are always on the forefront, right? They're always the best models out, with the newest features. They added something last summer called function calling. With function calling you can say, hey, you can make a decision and call functions based on input parameters, and those functions can give me back set parameters. So no longer are you trying to parse numbers out of text strings; it's actually structured data in and structured data out. That changed the game for programming, because now we can use this to make decisions, inside a deterministic kind of method, on what is actually a non-deterministic kind of problem, one where I can't code 50 different code paths.

So it becomes really interesting, and it is tough. I talk to even my dev team about this: you almost have to retrain your brain. We're not gonna use AI in this tool or this new app just for the sake of, hey, let's use AI. You gotta be smart; you gotta use the right tool for the job. It's not great for everything, right? It's the square peg in a round hole scenario: it just doesn't fit everywhere. But as the models get better, and as we get better, as we learn more and retrain ourselves on how to think, we're gonna find more applications for where an LLM can be used, where things like Stable Diffusion can be used, all these different AI models coming out these days. So it's a really exciting time, probably the biggest change in programming I've seen in my entire career. It's really exciting.
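Function calling (tool calling, in current API terms) is what makes the outlet example tractable: declare a function whose parameters enumerate the allowed outputs, and the model has to answer inside that structure instead of as free text. A sketch of the hosting decision; the categories are illustrative, not IMEG's actual Revit list:

```python
import json

TOOLS = [{
    "type": "function",
    "function": {
        "name": "set_hosting_behavior",
        "description": "Choose the Revit hosting behavior for a placed element.",
        "parameters": {
            "type": "object",
            "properties": {
                "host": {"type": "string",
                         "enum": ["wall", "floor", "ceiling", "work_plane"]},
                "reason": {"type": "string"},
            },
            "required": ["host"],
        },
    },
}]

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: substitute your model
    temperature=0,
    tools=TOOLS,
    tool_choice={"type": "function", "function": {"name": "set_hosting_behavior"}},
    messages=[{"role": "user",
               "content": "Placing a duplex outlet 18 inches above finished "
                          "floor on a partition. Which hosting behavior applies?"}],
)
args = json.loads(resp.choices[0].message.tool_calls[0].function.arguments)
# structured in, structured out: args["host"] is one of the four enum values
```

The enum is the important part: whatever the model "reasons," the only thing the surrounding program ever receives is one of the declared values, which is how a non-deterministic component slots into deterministic code.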
Evan Troxel: So is Meg built on top of OpenAI's platform?
Steve Germano: Yes. Today we have a couple of different models that we use, but we're primarily using OpenAI's models, and there are a couple of reasons for that. One, they're the most trusted. They have a bunch of filters: if you just go to OpenAI and chat today, it goes through a bunch of behind-the-scenes filters, for sexism, racism, all the isms. It has a political filter, never to make jokes about politics, and all these different things. It's got a lot of protections behind the scenes, because they've come under so much fire and heat since they started. They have a ton of protections.

You could use open-source models for free, for cheap, right? They can do about the same performance-wise, but you lose all of that. So from an enterprise perspective, money's kind of cheap in that regard: why not pay for the best product? And from what I've heard talking with other developers, most people are going that route, just because it's easier and safer today. That may change over time. There's a French model that came out, called Mistral, that is amazing. It's performing better than GPT-4 in some areas, definitely better than 3.5, and no one had been able to say that about an open-source model before. They also have a paid model that's comparable to GPT-4. But they fall a little short on the protections right now, and everyone's... there's gonna be a lot of competition in time.
Randall Stevens: It kind of goes back to those early stages. Obviously, with all of this, you don't want to do anything that's gonna break trust, because one misfire of some type, right, freaks people out, and you just wanna try to avoid that if you can.
Steve Germano: Yeah. I mean, especially with a company-wide chat, right? It makes one racial slur or something, and you're getting sued. You gotta be really careful with these things. That was my biggest concern. Two years ago I'm like, it's not ready, it's not ready, it's not ready. And then finally I'm like, okay, I think we're there. The tech's catching up.

And that's kind of the balance right now: you wanna be bleeding edge, but you don't wanna be too bleeding edge, because, for one, you can waste a lot of investment. We were thinking really intricately about exactly when the tech would catch up in different areas, so we could release different features. We have some new plugins on hold right now; we're waiting for some things to catch up. There's just a lot of movement going on.

Even Microsoft themselves, they're launching their Copilots across all their product stacks, and they've got issues too, right? Just yesterday I was watching someone demo the preview version of the Copilot in Excel. Someone in my firm had asked me, well, how are they doing that? We're not doing Excel stuff today, right? And I'm like, eh, I don't know that that's really feasible today. And I'm watching this video, and this guy's got a million rows of data in an Excel file, and the Copilot chokes, just completely locks up and crashes. Then he goes, okay, let me try it with less data. He goes down to half a million rows: crashes. A hundred thousand rows: crashes. Down to 5,000 rows: crash. He had to get all the way down to 400 rows before it could actually talk over that data.

So if Microsoft, which obviously has more resources, is still struggling with some of this tech and how to achieve it, it's probably good to put that on hold for a little while; we'll come back to Excel files later, right? It's just being smart about where you think the industry is, and staying tuned in, so you don't forge too far ahead and waste time and money and effort.
Randall Stevens: Yeah. Well, I think...
Evan Troxel: Maybe before you jump into whatever we're gonna do next, and I do want to take a look at this, I'm just wondering: how much has your team shifted from two years ago to what you're doing now? How much has it shifted to an AI focus?
Steve Germano: Not much, to be honest with you. Not much outside of the Meg product. That's really the only product where we've actively integrated AI today. Actually, no, I'm sorry, there is one more, but it's a legacy project our data team is working on, more natural language processing, for points in buildings for control systems. So those are really the only two projects I know of today actually utilizing any kind of AI with LLMs. We have other projects in active development that we'll be adding some elements to, but I don't think they'll be public-facing. They'll be more, like I was saying before, behind the scenes, making decisions and things. So yeah, not a ton today. Everyone's conscious of it, but it's gotta be the right tool for the job.
Evan Troxel: I appreciate you saying that, because when we're talking about this topic, the mind immediately goes to, this is everything, this is all that we should be focusing on. So it's important, I think, to hear the answer to that question.
Steve Germano: Yeah, certainly not. I mean, that's a common trap for development teams. They find the new shiny object and they're like, oh man, I want to go work on this so I can learn it, and it might just not be the right tool for the job. So you gotta be really smart with that.
Randall Stevens: So, can you show us, can you show us something? Yeah, let's see...
Steve Germano: Yeah, absolutely. So this is Meg, and you can see this is kind of my chat history; I've been asking her some questions here and there. Let's ask... we've got some questions I'll just copy and paste in here. So one of the things...
Evan Troxel: You say her. I just wanna point out that you are personifying this, and I'm just wondering, is that natural? Was that important? Is that something that comes up in conversation, like, culturally in the firm?
Steve Germano: It's a funny thing, so I'll tell you the backstory. There was no name, and somehow in my early version of it I had named it Megatron, right? Just from being a nerd. Then, when we sent off some information to marketing, like, can you give us a logo, there was no persona, there was no Meg name. One of our marketing people came up with it and was just like, hey, what do you think of this? It was a first iteration, only one iteration, and we're like, it's better than Megatron. And then it just took on a life of its own. Now people personify it: hi, thanks Meg. They're talking to it. And it's really funny to watch the analytics, because you can see people have conversations, which is really cool. They're appreciative when they get an answer, and they say...
Randall Stevens: We're nice to it.
Steve Germano: And it says, you're welcome, let me know. Yeah, it's really...
Evan Troxel: They're not the ones who are gonna go kick the robot, because they know the robot will come back. Yeah.
Steve Germano: It's a smart move. I'm very polite to Meg.
Evan Troxel: Right.
Steve Germano: Yeah. So you can just start typing and ask a question. There are two ways to interact with it. You can just type, like you can see here: can you explain ventilation 62.1, right? Or you can call a plugin directly. What we did was build a plugin architecture, and those plugins serve different purposes, like our corporate directory and our corporate policies: that's our bucket of data for corporate information, health benefits, and all those things. The directory is the employee directory, those types of things. Project databases. And TOKEN is really interesting; TOKEN and VDC are probably two of my favorite plugins, because there's so much data there: gigs and gigs and gigs of unstructured data.

VDC is just all of our Revit knowledge. It all currently lives in a massive OneNote notebook that all the VDC folks use across the company, and they search in there. Search in OneNote is okay, not too bad, right? But this is just much faster, much easier, because that OneNote got so big, with hundreds and hundreds of sections and pages, that it can be complicated to find stuff.

And then TOKEN is what we call our Tech Ops Knowledge Network. It's a location on SharePoint with thousands and thousands of design memos, code change announcements, all the things our engineering leads need to get out to our engineers. This could be scenario situations: hey, I know the local jurisdiction code calls for this and this and this, but we recommend upsizing because of X, Y, Z. Lessons learned from building engineering designs, things of that nature. So there's a plethora of our most senior engineers' brains focused in that area of unstructured data. It's really, really exciting to have that level of information available to anybody.
Randall Stevens: Steve, a quick...

Steve Germano: So you could fire...

Randall Stevens: Oh, I'll just ask a quick question along those lines. I'm sure, inevitably, there's gonna be some data that's either out of date or potentially wrong, right? There's always bad information somewhere. Have you put in any kind of feedback loop where, if that information is identified, it can be flagged and the system can learn from that?
Steve Germano: That's a great question. That's a great question. So, for all of our software, we use a feedback board, where folks can go in and report bugs and submit feature requests. We don't have something live and iterative in here, with the exception of, excuse me, anyone can get a good answer and then hit thumbs up or love; those are the first two reactions that pop up. And those positive sentiments are what we'll use in the future to do some fine-tuning of the model: like, hey, you've answered some great things, here's more of this. We don't have a negative sentiment in there today; they have to go to our feedback board and actually submit that. But this is still in beta, and that's something we're considering: can I tell Meg, hey, that previous answer was actually out of date, and have it submitted as a bug report or something like that? Those are all certainly possible.
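Using the thumbs-up signal for future fine-tuning, as Steve describes, usually means exporting the positively rated exchanges in the chat-format JSONL that fine-tuning endpoints expect. A sketch of that export step, assuming rated exchanges are already stored as (question, answer, rating) records; the field names are hypothetical:

```python
import json

def export_finetune_set(rated_chats, path="finetune.jsonl"):
    """Keep only positively rated exchanges as training examples."""
    with open(path, "w") as f:
        for chat in rated_chats:
            if chat["rating"] not in ("thumbs_up", "love"):
                continue  # positive sentiment only, as described above
            f.write(json.dumps({"messages": [
                {"role": "user", "content": chat["question"]},
                {"role": "assistant", "content": chat["answer"]},
            ]}) + "\n")
```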
So I'll show you a really easy plugin. We just typed "help". This is our initial help plugin, and it comes in really fast because it's static text; it's not actually using any AI or anything. The first thing people always ask is how to use the thing, right? So it gives them some examples of what to search and how to search, those types of things. Pretty standard stuff.
If I come in here and ask, say, how do I submit an expense report, this will go and hopefully get to the right location for the data. That's the first decision point: which location bucket is the right one to go to, based on the user's question? It'll say, hey, I'm answering from this plugin. And then you can see it's streaming that information, and it's coming from our data. This is not an LLM making anything up; this is not an LLM saying what it thinks. It's being provided data and regurgitating it, maybe massaging it based on how the LLM predicts the next word with its neural network. And then you have links to the source documents.

What's kind of cool here, and what was really complicated for us to figure out, was: what if there are five answers to a question, or additional data across all these different documents? I wanna get as much information to you as possible to answer your question, but I wanna separate that information out, and how do I let the user know? So we provide all the additional related links that may be relevant to the user's question, but the LLM has the capability, through instruction, to add a consolidated answer. So it's not one answer to rule them all, where I pick the top search result and that's what I use. You can make a more concerted effort to be a bit more cohesive, right? And a bit more...
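That "first decision point," picking which plugin's data bucket should answer, can itself be one constrained LLM call that classifies the question before any retrieval happens. A sketch under that assumption, using the plugin names from the demo; as Steve shows later, this routing step is also where misfires happen:

```python
PLUGINS = ["corporate", "directory", "projects", "token", "vdc", "help"]

def route(question: str) -> str:
    """Pick the data bucket most likely to contain the answer."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: a small router model is typical
        temperature=0,
        messages=[
            {"role": "system", "content": (
                "Classify the question into exactly one plugin: "
                + ", ".join(PLUGINS)
                + ". Reply with the plugin name only."
            )},
            {"role": "user", "content": question},
        ],
    )
    choice = resp.choices[0].message.content.strip().lower()
    return choice if choice in PLUGINS else "help"  # fall back when unsure

# route("How do I submit an expense report?") should return "corporate"
```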
Randall Stevens: I think referencing back to the sources of the information is an important piece of building that trust, too. In these early stages, when you get an answer back, it's like, here's the answer, but here's the source. You're citing the sources.
Steve Germano: A hundred percent. A hundred percent. We didn't do this in the early alpha testing, and with the first version that saw public users, that was the first thing they said: hey, that's cool, but I need to see where it's from, and I need to go validate. Trust, yet verify. Exactly. And it's funny, because I've had that same conversation with other developers doing the same thing in the enterprise, for different enterprises, not just the AEC sector. That's the first thing they have to tackle: people wanna know where it's coming from. They wanna validate that information.
Evan Troxel: That's coming from, like, huge OneNote documents or a SQL database, though. So when you click on a link, where does it actually take you?
Steve Germano: They could be various locations. This one is SharePoint; it's in some SharePoint folder somewhere that looks like finance and accounting. These can all be in different locations across different sectors of the business, but for this particular plugin, most of its data is unstructured content in SharePoint.
Evan Troxel: So will it take you to a specific location in a huge OneNote, or will it just take you to the general folder, and then you have to search?

Steve Germano: No, we have it set so it'll actually open the file directly.

Evan Troxel: Nice.

Randall Stevens: And the link... you can build deep links into these documents and stuff, right? Anchor links.

Steve Germano: Yeah, correct. Most of the documents aren't built like that, because they were just Word docs that people made. But you can do that, correct.

Evan Troxel: Hmm.
:Steve Germano: So I'm gonna, I'm gonna
show you a little bit different one.
989
:Um, so this one is actually going
to look through structured data.
990
:So we just talked about unstructured.
991
:Now we're gonna go through structured
structure is a lot tougher to get right.
992
:And, um, this one is actually
doing some database searching.
993
:It's, you know, AI writing SQL queries
and answering SQL queries and all
994
:these really complicated things.
995
:That's really, really tough to
get consistently accurate because
996
:while AI can code really well,
they don't understand your data
997
:structure and there's a lot of tough.
998
:Uh, things to kind of, uh, uh,
there's a lot of challenges in
999
:there that a lot of developers are
having, um, to really kind of think
:
00:44:48,850 --> 00:44:50,500
through and how to solve right now.
:
00:44:50,800 --> 00:44:53,050
And, uh, we are really lucky.
:
00:44:53,140 --> 00:44:54,730
Like this is a very hard thing.
:
00:44:54,730 --> 00:44:59,170
If I didn't have the collaboration
with our data team to say, Hey, let's
:
00:44:59,170 --> 00:45:05,740
massage the data side to work better
with our, our product, it would've
:
00:45:05,740 --> 00:45:07,270
been almost impossible, right?
:
00:45:07,390 --> 00:45:10,780
So if somebody said, Hey, just go build
this thing for, uh, you know, Johnson
:
00:45:10,780 --> 00:45:15,220
Controls and here's their database with,
uh, you know, 5,000 columns of data,
:
00:45:16,180 --> 00:45:18,010
that's gonna be really difficult, right?
:
00:45:18,100 --> 00:45:22,480
LLMs get confused really easily, especially
when it comes to structured data.
:
00:45:22,750 --> 00:45:26,260
So we were able to kind of get a good
marriage there to give it just the
:
00:45:26,260 --> 00:45:27,940
right amount of data, not too much data.
:
00:45:28,450 --> 00:45:32,740
And keep it really focused on the
data we want it to answer from.
:
00:45:32,770 --> 00:45:35,380
So it doesn't even have access
to a whole bunch of other stuff.
:
00:45:35,440 --> 00:45:35,860
Right.
:
00:45:36,130 --> 00:45:38,440
And it only has access to
things that we want it to query.
:
00:45:38,590 --> 00:45:40,210
There's no social
security numbers in here.
:
00:45:40,210 --> 00:45:41,890
There's none of that kind
of silly stuff, right?
:
00:45:42,160 --> 00:45:46,450
Um, so it's very, very, um, you know,
restricted as to what it can have and,
:
00:45:46,455 --> 00:45:51,460
and, you know, you don't want it to have,
uh, to go crazy and delete tables, right?
:
00:45:51,460 --> 00:45:53,560
So it's read-only access. All
those types of protections you
:
00:45:53,560 --> 00:45:54,820
just gotta kind of think through.
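A hedged sketch of the guardrails described here, assuming a hypothetical ask_llm helper and using sqlite3 as a stand-in for whatever read-only database connection is actually in play:

```python
# Hedged sketch: expose only an allow-listed slice of schema to the model,
# and refuse to run anything that isn't a single SELECT statement.
import re
import sqlite3

ALLOWED_SCHEMA = "employees(name TEXT, office TEXT, discipline TEXT, license_state TEXT)"

def generate_sql(question: str, ask_llm) -> str:
    # Keep the model focused: it only ever sees the approved columns.
    prompt = (
        "Write one SQLite SELECT statement answering the question, "
        f"using only this schema:\n{ALLOWED_SCHEMA}\nQuestion: {question}"
    )
    return ask_llm(prompt).strip().rstrip(";")

def run_readonly(sql: str, db_path: str):
    # Reject anything that isn't a single SELECT before touching the database.
    if not re.fullmatch(r"(?is)\s*select\b[^;]*", sql):
        raise ValueError("only single SELECT statements are allowed")
    # Open read-only, so even a bad query can't delete tables.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.close()
```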
:
00:45:55,150 --> 00:45:58,210
But you can see here it's, it's
actually doing inquiries to the point
:
00:45:58,215 --> 00:46:00,430
where it can summarize and quantify.
:
00:46:00,640 --> 00:46:02,920
So it can tell you, Hey,
we've got 71 people in the
:
00:46:02,920 --> 00:46:04,900
Las Vegas office today, right?
:
00:46:05,110 --> 00:46:09,130
Uh, one I really like, uh, we just added
this functionality through analytics.
:
00:46:09,130 --> 00:46:14,800
We saw people were asking, um,
let's see, how many, uh, how many,
:
00:46:14,800 --> 00:46:21,155
let's see, licensed mechanical
engineers, uh, are in the state.
:
00:46:22,810 --> 00:46:24,610
Uh, Nevada.
:
00:46:25,090 --> 00:46:26,530
Hopefully this works; it should work.
:
00:46:27,010 --> 00:46:31,960
So this was something, we didn't really
have this level of data exposed before, and
:
00:46:31,990 --> 00:46:34,960
oh, so you see that one's actually
answering, I think from the wrong plugin.
:
00:46:34,960 --> 00:46:37,180
It should be answering from
DURs, so I'll do that again.
:
00:46:37,510 --> 00:46:42,340
Um, but it's, it should be
searching through a new set of
:
00:46:42,340 --> 00:46:46,030
data that we added in and then
answering and quantifying from that.
:
00:46:46,330 --> 00:46:50,290
So, so this is one of the challenges
right now is picking the right plugin.
:
00:46:50,290 --> 00:46:53,590
So you'll see here it's actually
answering from our token library when
:
00:46:53,590 --> 00:46:56,260
it really shouldn't be, should be
answering from our directory library.
:
00:46:56,650 --> 00:46:59,740
So since it stopped responding, I'll
go ahead and ask that again.
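A minimal sketch of that plugin-selection step, assuming hypothetical plugin names and an ask_llm helper; real routers often use native function calling instead:

```python
# Sketch of routing a question to one plugin based on short descriptions,
# with a safe default when the model picks something invalid.
PLUGINS = {
    "directory": "people, offices, headcounts, licenses, contact info",
    "knowledge": "engineering guides, standards, HR policies (unstructured docs)",
    "tokens": "software license and token usage",
}

def route(question: str, ask_llm) -> str:
    menu = "\n".join(f"- {name}: {desc}" for name, desc in PLUGINS.items())
    choice = ask_llm(
        f"Pick exactly one plugin name for this question.\n{menu}\n"
        f"Question: {question}\nAnswer with the name only."
    ).strip().lower()
    # Fall back to the general knowledge plugin on a bad pick.
    return choice if choice in PLUGINS else "knowledge"
```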
:
00:46:59,960 --> 00:47:03,845
Evan Troxel: While it's thinking about
this, uh, can you talk a little bit about,
:
00:47:03,845 --> 00:47:08,945
you talked about, uh, IMEG does a lot
of acquisition and so you're acquiring
:
00:47:08,975 --> 00:47:10,775
these other firms that, I'm sure, have,
:
00:47:11,080 --> 00:47:12,280
it's all over the map,
:
00:47:12,280 --> 00:47:16,930
the level of sophistication of their
databases and where their data is
:
00:47:16,930 --> 00:47:19,660
stored and how it's stored and if
it's structured or if it's not.
:
00:47:19,665 --> 00:47:24,040
So is your data team responsible for
ingesting that and figuring out the
:
00:47:24,040 --> 00:47:25,900
best way to bring all this together?
:
00:47:26,320 --> 00:47:26,590
How?
:
00:47:26,590 --> 00:47:29,050
How are you dealing with
that on an ongoing basis?
:
00:47:29,770 --> 00:47:31,030
Steve Germano: Yeah,
that's a great question.
:
00:47:31,030 --> 00:47:36,730
So we have a, um, a full team that
does the, um, integration of new
:
00:47:36,730 --> 00:47:41,350
acquisitions and they have kind of, um,
a graduation process, which can take,
:
00:47:41,355 --> 00:47:43,060
I believe around a year to two years.
:
00:47:43,420 --> 00:47:47,770
And so throughout that process, right,
they're transforming or they're getting
:
00:47:47,770 --> 00:47:51,550
onto our networks, they're getting their
data transformed into our data sets.
:
00:47:51,790 --> 00:47:54,040
They may have software that's
really useful for us that
:
00:47:54,045 --> 00:47:55,480
we may take and integrate.
:
00:47:55,660 --> 00:47:58,030
Um, they're getting trained on our
existing software, so there's a
:
00:47:58,030 --> 00:47:59,920
full team that does that process.
:
00:48:00,250 --> 00:48:02,980
And throughout that process is
when we'll do that evaluation.
:
00:48:03,010 --> 00:48:06,520
And we haven't had to do too much of
that because this is such a new product.
:
00:48:06,805 --> 00:48:10,975
But as we consume their information,
the data team typically figures
:
00:48:10,975 --> 00:48:15,505
out how to get them onto our
Vantagepoint, our Salesforce.
:
00:48:15,505 --> 00:48:19,525
So we shouldn't have to, um, they, they
do have to consume it and ingest their,
:
00:48:19,525 --> 00:48:23,245
their previous projects and all that data,
but that shouldn't really affect, like,
:
00:48:23,250 --> 00:48:27,505
I wouldn't have to do any coding changes
on our team side because this knows
:
00:48:27,505 --> 00:48:29,575
how to talk to that data set already.
:
00:48:29,875 --> 00:48:32,275
So it becomes a little bit
simpler from that perspective.
:
00:48:32,425 --> 00:48:37,015
But I will say on the acquisition side,
this is a really valuable tool for that.
:
00:48:37,465 --> 00:48:42,325
And the reason is, um, we, we've
been seeing people searching just,
:
00:48:42,565 --> 00:48:45,595
hey, gimme contact information
for this, this, and this person.
:
00:48:45,595 --> 00:48:47,905
And they don't know where they
are located, but they need that
:
00:48:47,905 --> 00:48:50,185
information to go put into some
other workflow they're doing.
:
00:48:50,185 --> 00:48:50,635
And we had this.
:
00:48:51,150 --> 00:48:53,635
Just this week we had somebody who
was doing this over and over and over.
:
00:48:53,635 --> 00:48:58,225
So our PO reached out to 'em, Hey, why do
you keep searching for A, B and C people?
:
00:48:58,435 --> 00:49:01,255
And you know, and well she's like,
oh, I'm doing this and I'm copy
:
00:49:01,255 --> 00:49:03,505
and pasting it into here 'cause
it has to go in this report.
:
00:49:03,835 --> 00:49:05,635
And we're like, oh, interesting.
:
00:49:05,640 --> 00:49:10,525
So these emergent features kind of come
out and we don't know all the workflows
:
00:49:10,525 --> 00:49:12,115
people are gonna use today, right?
:
00:49:12,115 --> 00:49:14,005
We don't know all the
questions people are gonna use.
:
00:49:14,185 --> 00:49:18,805
So having that non-deterministic method
for them to search whatever they want
:
00:49:19,075 --> 00:49:23,575
becomes a really big value, especially
for new acquisition folks, right?
:
00:49:23,575 --> 00:49:27,115
Someone comes into the company,
who do I talk to for HR issues?
:
00:49:27,145 --> 00:49:28,345
Who do I talk to for this?
:
00:49:28,375 --> 00:49:30,115
Hey, who's the CE in this office?
:
00:49:30,115 --> 00:49:31,375
Or who's the licensed engineer here?
:
00:49:31,375 --> 00:49:34,285
'cause I think we got a bid on this
project and I don't even know if we
:
00:49:34,290 --> 00:49:35,605
have licensed engineers in the state.
:
00:49:35,935 --> 00:49:40,435
Those are the types of things that we
try to make this system help them with.
:
00:49:40,975 --> 00:49:45,085
Um, so, so this question finished, which
was not the right plugin, but um, and then
:
00:49:45,085 --> 00:49:48,685
this one here actually fired the directory
plugin directly and said we have 13, you
:
00:49:48,685 --> 00:49:50,455
know, engineers in that, in that state.
:
00:49:50,455 --> 00:49:53,695
So, and I could list them out, but I don't
wanna put people's information out here.
:
00:49:53,695 --> 00:49:56,245
But, um, so here's another really fun one.
:
00:49:56,245 --> 00:49:59,575
So let's do, let's do a more
engineering question, right?
:
00:50:00,895 --> 00:50:06,235
So here's one: how do I size
a wire for a 50 horsepower
:
00:50:06,235 --> 00:50:08,335
motor, um, on mechanical.
:
00:50:08,335 --> 00:50:11,875
So I wouldn't know off the top of
my head, but, uh, there's a lot of
:
00:50:11,875 --> 00:50:13,675
electrical engineering data.
:
00:50:13,675 --> 00:50:15,775
There's some NEC code data.
:
00:50:15,955 --> 00:50:20,125
So a lot of this information can be kind
of put together in a cohesive answer.
:
00:50:20,130 --> 00:50:22,735
So you'll see it's drawing from
a couple different sources.
:
00:50:22,735 --> 00:50:25,855
Elevator design guide, pump,
you know, fire pump design
:
00:50:25,855 --> 00:50:28,165
guides, VFD bearing pitting.
:
00:50:29,365 --> 00:50:32,785
I don't personally know what all these
documents are, but the information
:
00:50:32,785 --> 00:50:36,325
lives in that document and it's able
to regurgitate it and find what's
:
00:50:36,330 --> 00:50:38,755
relevant in all those documents
and bring it to the forefront.
:
00:50:39,175 --> 00:50:40,570
So it's, it's really cool the way
:
00:50:40,660 --> 00:50:43,390
Randall Stevens: that kind of makes me
think, Steve, uh, you know, some of the
:
00:50:43,390 --> 00:50:47,245
first experiments that I was just using
ChatGPT, you know, a year or so ago,
:
00:50:47,595 --> 00:50:50,045
experimenting with, and I would ask.
:
00:50:50,665 --> 00:50:53,605
Uh, I would ask, you know, I was
kind of testing what you were doing
:
00:50:53,605 --> 00:50:57,445
there, some technical questions,
and it would give me answers back.
:
00:50:57,445 --> 00:51:00,235
And I was like, well, I don't even,
I, I don't even know if that's the
:
00:51:00,235 --> 00:51:01,615
right answer or not the right answer.
:
00:51:01,615 --> 00:51:02,575
So it was back to that,
:
00:51:02,575 --> 00:51:04,105
you know, hallucinations.
:
00:51:04,705 --> 00:51:05,965
And I, I think you're
:
00:51:05,965 --> 00:51:06,265
Right,
:
00:51:06,270 --> 00:51:09,775
about the, you know, if you can,
you can say, don't be verbose.
:
00:51:09,775 --> 00:51:13,525
You know, give me, give me
just the short version of this, and
:
00:51:13,525 --> 00:51:16,765
then if you don't know, tell me you
don't know, instead of just making
:
00:51:16,765 --> 00:51:17,455
something up.
:
00:51:17,455 --> 00:51:17,905
And, uh,
:
00:51:18,295 --> 00:51:20,935
some of the experiments that we've been
running, you know, we've been using
:
00:51:20,940 --> 00:51:25,255
those kinds of approaches to it, and
it's like, just get me, especially
:
00:51:25,255 --> 00:51:30,055
if you're asking it technical, not
asking you to write a, uh, a flowery,
:
00:51:30,115 --> 00:51:35,305
uh, uh, you know, cv, you know,
description of me for some, uh, you know,
:
00:51:36,295 --> 00:51:36,925
for, for some
:
00:51:36,925 --> 00:51:38,665
description of my past histories.
:
00:51:39,235 --> 00:51:42,625
So I, I want real data, real answers.
:
00:51:42,630 --> 00:51:44,275
I don't want any fluff in there.
:
00:51:44,275 --> 00:51:48,475
And to make sure that these, because
I think this is an audience, you know.
:
00:51:48,790 --> 00:51:53,680
That, that ultimately, I bet, I bet
within IMEG, a lot of the engineering,
:
00:51:53,950 --> 00:51:57,370
you know, people with an engineering hat
on, it's like, it's kind of fun to play
:
00:51:57,370 --> 00:51:59,920
with, but I want real data, real answers.
:
00:52:00,040 --> 00:52:00,700
No fluff.
:
00:52:01,210 --> 00:52:01,510
Let's go.
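The kind of instruction being described might look something like this; the wording is illustrative, not a quoted prompt:

```python
# Illustrative system prompt: concise answers, and an explicit instruction
# to admit uncertainty instead of inventing facts.
SYSTEM_PROMPT = (
    "You answer technical questions for engineers. Be concise, no filler. "
    "If the provided sources do not contain the answer, reply exactly: "
    "\"I don't know based on the available data.\" Never guess."
)
```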
:
00:52:01,970 --> 00:52:02,250
Steve Germano: A hundred percent.
:
00:52:02,250 --> 00:52:02,930
Hundred percent.
:
00:52:03,440 --> 00:52:04,290
Like we've already,
:
00:52:05,190 --> 00:52:05,770
oh, sorry.
:
00:52:05,770 --> 00:52:06,410
Go ahead, Evan.
:
00:52:06,475 --> 00:52:08,785
Evan Troxel: I was just gonna say
that the prompt engineering is
:
00:52:08,785 --> 00:52:10,405
changing all the time as well.
:
00:52:10,405 --> 00:52:14,515
So if you teach somebody how to do
it today, it's gonna be different a
:
00:52:14,520 --> 00:52:17,455
year from now or, and probably in a
lot shorter amount of time than that.
:
00:52:17,455 --> 00:52:22,435
But like for example, Midjourney five
to Midjourney six changed
:
00:52:22,435 --> 00:52:23,935
prompt structure significantly.
:
00:52:23,935 --> 00:52:25,225
They made it a lot simpler.
:
00:52:25,810 --> 00:52:30,730
You can enter a much simpler prompt
and still get really amazing results
:
00:52:30,730 --> 00:52:32,920
now with, with six versus five.
:
00:52:33,340 --> 00:52:37,900
And, and so I think that's another
interesting point to make about this,
:
00:52:37,900 --> 00:52:40,900
is like there might be things you need
to include in your prompts today that
:
00:52:40,900 --> 00:52:42,670
you maybe won't need to include later.
:
00:52:42,670 --> 00:52:48,220
And just keeping everybody educated
on the best way to prompt these systems
:
00:52:48,370 --> 00:52:50,320
is a moving target all the time.
:
00:52:51,250 --> 00:52:51,540
Steve Germano: Yeah.
:
00:52:51,655 --> 00:52:55,465
And you know, the prompt engineering
happens behind the scenes, right?
:
00:52:55,465 --> 00:52:57,775
In our plugins and all these things.
:
00:52:58,015 --> 00:53:02,245
And we noticed, we just went to, um, you
know, the newest version of GPT-4
:
00:53:02,245 --> 00:53:08,485
Turbo, and from GPT-4 to GPT-4 Turbo,
the prompts change. From 3.5 to 4,
:
00:53:08,515 --> 00:53:09,355
the prompts change.
:
00:53:09,360 --> 00:53:10,705
So we were like, all right, we're ready.
:
00:53:10,705 --> 00:53:11,845
We're gonna go with the faster model.
:
00:53:11,845 --> 00:53:13,180
We upgrade this thing,
and all of a sudden.
:
00:53:13,750 --> 00:53:15,640
Whoa, it's not responding the right way
:
00:53:15,640 --> 00:53:16,900
it was before. Like, what's going on?
:
00:53:17,080 --> 00:53:19,930
So you have to expect that
because the inference engines are
:
00:53:19,930 --> 00:53:21,250
just natively different, right?
:
00:53:21,250 --> 00:53:22,120
With these LLMs.
:
00:53:22,155 --> 00:53:22,375
so
:
00:53:22,660 --> 00:53:24,610
so that's, you know, a
little bit of a risk.
:
00:53:24,610 --> 00:53:27,220
It's like, hey, let's go to this faster,
better, smarter, cheaper model.
:
00:53:27,250 --> 00:53:31,000
But you do have to put some investment in
there, and it gets back to the unit testing.
:
00:53:31,000 --> 00:53:32,170
Well how do we batch test it?
:
00:53:32,200 --> 00:53:35,230
How do we, you know, so there,
there is a lot to consider with
:
00:53:35,230 --> 00:53:37,540
that, um, as the technology changes.
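A minimal sketch of that batch-testing idea: replay a fixed question set against the new model and flag regressions. The golden questions and substring checks are placeholder assumptions; real suites often grade with embeddings or a judge model instead:

```python
# Sketch of a prompt regression harness run before a model upgrade ships.
GOLDEN = [
    ("Who do I contact for HR issues?", ["HR"]),
    ("How many licensed engineers are in Nevada?", ["Nevada"]),
]

def regression(ask_llm) -> list[str]:
    failures = []
    for question, must_contain in GOLDEN:
        answer = ask_llm(question)
        # Crude check: the answer must mention each required phrase.
        if not all(s.lower() in answer.lower() for s in must_contain):
            failures.append(question)
    return failures  # an empty list means the new model behaves acceptably
```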
:
00:53:37,930 --> 00:53:41,200
And also if you wanna start
exploring things like, hey, let me
:
00:53:41,200 --> 00:53:44,140
go test a open source model, right?
:
00:53:44,410 --> 00:53:48,130
Hey, I can host my own model and I
can cut out these open AI costs and I
:
00:53:48,130 --> 00:53:51,280
can go, you know, put a Mistral model
in here, or a Llama 2 model and fine-
:
00:53:51,280 --> 00:53:52,300
tune or whatever the case may be.
:
00:53:52,600 --> 00:53:53,650
Yeah, you can do that.
:
00:53:54,025 --> 00:53:55,525
What do you lose with that process?
:
00:53:55,585 --> 00:53:55,795
Right?
:
00:53:55,795 --> 00:53:58,855
You might save some money, but
you lose all those protections.
:
00:53:58,855 --> 00:54:01,615
You open yourself up to a liability,
potentially, of this thing going off
:
00:54:01,615 --> 00:54:03,475
the rails and Meg having a bad day.
:
00:54:03,775 --> 00:54:07,255
Or maybe you get hallucinations now
because it doesn't know how to handle
:
00:54:07,255 --> 00:54:09,475
the same prompting to keep it in line.
:
00:54:09,655 --> 00:54:11,065
So there's a lot to consider there.
:
00:54:11,065 --> 00:54:14,575
And it could be, it's kind of,
prompt engineering is more fine art
:
00:54:14,575 --> 00:54:15,655
than science at this point.
:
00:54:15,755 --> 00:54:16,475
Randall Stevens: Steve, we've
:
00:54:16,475 --> 00:54:19,685
had a, uh, a couple of,
uh, episodes before this.
:
00:54:19,685 --> 00:54:24,545
We've had, uh, people on talking
about the governance, you know, around AI.
:
00:54:24,545 --> 00:54:27,455
What's the, what's the govern,
governance, what, what are people
:
00:54:27,455 --> 00:54:29,015
thinking about from the ethics side?
:
00:54:29,015 --> 00:54:31,985
You made a couple of comments, but maybe
you can talk a little bit more about
:
00:54:32,675 --> 00:54:36,985
what, either, either you're starting
to think about and driving within IMEG
:
00:54:36,985 --> 00:54:40,085
or are there others within IMEG that
are trying to kind of put a framework
:
00:54:40,090 --> 00:54:41,615
in place to think these things through?
:
00:54:41,615 --> 00:54:43,506
But can you tell us what's
going on on that front?
:
00:54:44,860 --> 00:54:46,600
Steve Germano: Yeah, that,
that's an important topic.
:
00:54:46,720 --> 00:54:49,750
Um, you know, I've talked and
touched a lot on, you know, the
:
00:54:49,780 --> 00:54:51,310
ethical side of things, right?
:
00:54:51,340 --> 00:54:54,550
Um, and, and then you have to
think about the data side, right?
:
00:54:54,550 --> 00:54:55,450
Where's the data going?
:
00:54:55,600 --> 00:54:59,950
And so I think Microsoft and
OpenAI's marriage and their partnership
:
00:54:59,955 --> 00:55:02,710
that they have right now is
really, really well put together.
:
00:55:03,100 --> 00:55:06,790
And, you know, um, Satya and
Microsoft and their leadership really
:
00:55:06,790 --> 00:55:08,440
had good vision on this, right?
:
00:55:08,590 --> 00:55:10,960
I mean, they invested a lot of
money and they're gonna make a
:
00:55:10,960 --> 00:55:13,960
lot of money on copilots, but they
really thought about the enterprise.
:
00:55:13,990 --> 00:55:15,340
'cause that's their customer, right?
:
00:55:15,370 --> 00:55:15,910
Their customer's
:
00:55:15,925 --> 00:55:17,395
Randall Stevens: and
intellectual property, right,
:
00:55:17,395 --> 00:55:18,355
that these are intellectual
:
00:55:18,615 --> 00:55:18,910
Steve Germano: They, they,
:
00:55:18,960 --> 00:55:18,970
Randall Stevens: They,
:
00:55:19,505 --> 00:55:24,430
Steve Germano: it doesn't matter if I'm
not surfacing IP information here or not.
:
00:55:24,895 --> 00:55:27,385
But if it's user information, any
of that, they don't want it to leave.
:
00:55:27,385 --> 00:55:33,295
So what what, um, you can do with, with
Microsoft's, uh, framework is you can
:
00:55:33,385 --> 00:55:38,275
host your own version of those OpenAI
models within your own ecosystem, right?
:
00:55:38,275 --> 00:55:41,095
Or your, your cloud, if you
will, on your Azure stack.
:
00:55:41,305 --> 00:55:42,415
And that data stays in there.
:
00:55:42,445 --> 00:55:46,195
So the residency of that data doesn't
leave, um, which is really important
:
00:55:46,195 --> 00:55:47,395
for a lot of enterprise customers.
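Concretely, with the current openai Python SDK, that hosting model looks roughly like this; the endpoint, key, API version, and deployment name below are placeholders:

```python
# Hedged sketch: calling a model deployed in your own Azure subscription,
# so prompts and data stay inside your tenant.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",
    api_key="YOUR-KEY",
    api_version="2024-02-01",
)

resp = client.chat.completions.create(
    model="your-gpt4-deployment",  # the deployment name, not the raw model name
    messages=[{"role": "user", "content": "Who is in the Las Vegas office?"}],
)
print(resp.choices[0].message.content)
```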
:
00:55:47,925 --> 00:55:50,650
Uh, and then the other piece to that
is, you know, all those protections
:
00:55:50,650 --> 00:55:54,790
we talked about, OpenAI is the
farthest along, but there is also an
:
00:55:54,790 --> 00:55:59,950
additional layer for security that
happens at the time of inference
:
00:56:00,220 --> 00:56:04,960
that open, uh, I'm sorry, that, uh,
Microsoft has with their OpenAI studio.
:
00:56:05,230 --> 00:56:10,420
So you can actually go in there and
every LLM call or every call to, you
:
00:56:10,420 --> 00:56:11,950
know, whatever model you've got hosted.
:
00:56:12,310 --> 00:56:16,750
It can run through a series of
checks and it has a severity warning.
:
00:56:16,990 --> 00:56:20,080
And one could be for, you know,
whatever ism you can think of,
:
00:56:20,080 --> 00:56:20,920
there's a whole bunch of 'em.
:
00:56:21,220 --> 00:56:24,220
And you can say, Hey, you know what,
um, I'm okay with a medium on this.
:
00:56:24,675 --> 00:56:27,075
I want high strict on this, high
strict on this, high strict on this,
:
00:56:27,075 --> 00:56:31,215
you can really lock it down or expand
it depending on your use cases.
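Those per-category severity settings are configured per deployment in the Azure portal; this illustrative snippet only models the shape of such a policy and is not a real API call:

```python
# Illustrative only: "pick a severity threshold per category" as data.
CONTENT_FILTER_POLICY = {
    "hate": "medium",   # allow low, block medium and above
    "sexual": "low",    # strictest: block low severity and above
    "violence": "low",
    "self_harm": "low",
}

SEVERITY_ORDER = ["safe", "low", "medium", "high"]

def blocked(category: str, severity: str) -> bool:
    # A message is blocked when its severity meets or exceeds the threshold.
    threshold = CONTENT_FILTER_POLICY[category]
    return SEVERITY_ORDER.index(severity) >= SEVERITY_ORDER.index(threshold)
```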
:
00:56:31,455 --> 00:56:34,275
And so they've really, and there's
more they're doing there, but they have
:
00:56:34,275 --> 00:56:39,045
a whole team that's literally doing
nothing but solving that problem and
:
00:56:39,045 --> 00:56:43,185
making sure their models are safe to use,
especially in enterprise environment.
:
00:56:43,190 --> 00:56:46,545
So, you know, if I'm giving any advice
to anybody who's gonna try to accomplish
:
00:56:46,545 --> 00:56:50,475
something like this for, uh, you
know, their firms, the
:
00:56:50,475 --> 00:56:54,645
Microsoft stack is probably the best
place to start with, in my opinion.
:
00:56:54,645 --> 00:56:56,265
The safest place to start, for sure.
:
00:56:56,800 --> 00:56:59,440
Evan Troxel: In enterprise, there are
so many different departments, right?
:
00:56:59,445 --> 00:57:03,280
You have an HR department and a graphics
department, and a marketing department
:
00:57:03,285 --> 00:57:05,830
and a, you know, you've got all
these different, so how do you handle
:
00:57:05,830 --> 00:57:07,300
permissions when it comes to this thing?
:
00:57:07,300 --> 00:57:10,960
Because like you said, like there's no
social security numbers in there, right?
:
00:57:10,960 --> 00:57:16,720
But I would assume HR might have
access to an HR version of Meg
:
00:57:16,720 --> 00:57:19,390
or something where, where it
maybe has things like that. How
:
00:57:19,390 --> 00:57:20,470
are you handling things like that?
:
00:57:20,470 --> 00:57:23,500
Because I know firms are gonna
be asking questions like that.
:
00:57:23,500 --> 00:57:25,330
Well, we've got all these departments.
:
00:57:25,750 --> 00:57:29,140
You need some kind of
ability to lock things down.
:
00:57:29,140 --> 00:57:30,610
Some information is for everybody.
:
00:57:30,610 --> 00:57:31,900
Some of it's only for some eyes.
:
00:57:31,900 --> 00:57:32,140
So
:
00:57:32,290 --> 00:57:33,040
how's that work?
:
00:57:33,705 --> 00:57:34,360
Steve Germano: That's a great question.
:
00:57:34,365 --> 00:57:39,490
So we've basically solved that by just
saying Meg is only going to have access
:
00:57:39,490 --> 00:57:42,175
to the data that you want everyone
in the company to have access to.
:
00:57:43,075 --> 00:57:45,085
We just drew a line, we just
drew a straight line there.
:
00:57:45,655 --> 00:57:45,835
Yeah.
:
00:57:45,925 --> 00:57:50,245
Let's, let's get that right first and
then we'll start thinking about, well,
:
00:57:50,245 --> 00:57:52,375
I want an HR version of Meg, right?
:
00:57:52,375 --> 00:57:54,685
Or I want an electrical
engineering only version of Meg.
:
00:57:54,685 --> 00:57:57,055
I don't ever need to see this,
you know, mechanical stuff.
:
00:57:57,205 --> 00:57:58,735
And we've had those requests too, right?
:
00:57:58,735 --> 00:58:02,845
So we had to start somewhere, and we wanted to
be safe with what we were starting with.
:
00:58:02,845 --> 00:58:05,365
So let's just not give it
access to anything that's
:
00:58:05,365 --> 00:58:06,710
sensitive right off the bat.
:
00:58:06,710 --> 00:58:07,030
Mm-Hmm.
:
00:58:07,110 --> 00:58:12,535
And so in order to do that with our
ingestion, um, tech stack, our parsing
:
00:58:12,535 --> 00:58:16,015
tech stack, which was, you know, consuming
all this data from all these places in
:
00:58:16,015 --> 00:58:21,985
real time, uh, we had to build a whole
admin center for that where our product
:
00:58:21,985 --> 00:58:25,255
owner can work with our librarians to
know, Hey, this is a safe location.
:
00:58:25,345 --> 00:58:28,855
Subscribe. Not a safe location?
Don't subscribe, block, whatever.
:
00:58:29,155 --> 00:58:31,855
And also work with a data team
to say, Hey, these columns can
:
00:58:31,860 --> 00:58:33,475
come in, these columns cannot.
:
00:58:33,565 --> 00:58:33,865
Right?
:
00:58:33,865 --> 00:58:35,845
So security column, eh, get rid of that.
:
00:58:35,850 --> 00:58:36,205
Right.
:
00:58:36,445 --> 00:58:39,325
Um, so those are the types of
things we had to do in that process
:
00:58:39,325 --> 00:58:40,135
just to make sure we're safe.
:
00:58:40,420 --> 00:58:43,840
Just make sure we don't give any
avenue for any information that's
:
00:58:43,840 --> 00:58:45,700
sensitive to even make it in here.
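A hypothetical sketch of those subscribe/block and column rules; the source names and columns are made up:

```python
# Sketch: ingest only librarian-approved locations, and strip any column
# that isn't explicitly allow-listed.
APPROVED_SOURCES = {
    "sharepoint:/sites/finance/policies",
    "sharepoint:/sites/hr/public-handbook",
}
ALLOWED_COLUMNS = {"name", "office", "discipline", "license_state"}

def should_ingest(source: str) -> bool:
    return source in APPROVED_SOURCES  # librarians maintain this list

def scrub_row(row: dict) -> dict:
    # Sensitive columns (SSNs, salaries, ...) never reach the index at all.
    return {k: v for k, v in row.items() if k in ALLOWED_COLUMNS}
```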
:
00:58:45,835 --> 00:58:48,265
Evan Troxel: I think there's another
kind of way to build on top of that,
:
00:58:48,265 --> 00:58:53,965
because I've had this experience too,
which is there, there's a, there is,
:
00:58:54,595 --> 00:59:00,895
there are safe things to be shared in
safe ways in, say, safe channels, but
:
00:59:00,895 --> 00:59:02,395
not everybody knows what those are.
:
00:59:02,425 --> 00:59:07,525
So a lot of private information is shared
via email, which is completely open.
:
00:59:07,705 --> 00:59:09,625
But everybody thinks it's my email.
:
00:59:10,045 --> 00:59:11,035
It's totally private.
:
00:59:11,035 --> 00:59:14,725
Like there's just this weird
feeling about, about these things.
:
00:59:14,725 --> 00:59:15,985
They're, they're not truths, right?
:
00:59:15,985 --> 00:59:18,715
They're just myths that, that
have been embedded in people.
:
00:59:19,015 --> 00:59:23,185
So when it comes to that kind of
stuff, I mean, have you had to broach
:
00:59:23,185 --> 00:59:24,835
that subject with, with anybody yet?
:
00:59:24,985 --> 00:59:26,005
Or, or how are you dealing with that?
:
00:59:26,785 --> 00:59:27,445
Steve Germano: Not too much.
:
00:59:27,505 --> 00:59:31,735
Um, you know, we have, we basically
have librarians and this, this,
:
00:59:31,825 --> 00:59:36,325
um, librarian level of, uh, what
we call Meg librarians, right?
:
00:59:36,325 --> 00:59:40,435
But those could be a person
in HR, a person in, uh, our
:
00:59:40,705 --> 00:59:42,175
tech ops engineering team.
:
00:59:42,415 --> 00:59:46,375
They're kind of the, um, the
main points of truth for what
:
00:59:46,375 --> 00:59:48,025
goes in and what goes out, right?
:
00:59:48,175 --> 00:59:51,925
And so they're the ones that really
are focused on the data sources.
:
00:59:52,225 --> 00:59:54,505
And so, you know, I
can't monitor that stuff.
:
00:59:54,505 --> 00:59:56,035
Our product owner can't
monitor that stuff.
:
00:59:56,040 --> 00:59:58,735
There's just too much happening
in a larger organization.
:
00:59:58,945 --> 01:00:01,315
And we're only a couple thousand
employees today, and not to mention
:
01:00:01,315 --> 01:00:02,665
a couple hundred in India, right?
:
01:00:02,665 --> 01:00:05,785
So, and who knows
about who we're acquiring and
:
01:00:05,785 --> 01:00:06,835
adding to the team tomorrow.
:
01:00:06,835 --> 01:00:11,305
So we need people with that distinct
role to have that as their job.
:
01:00:11,695 --> 01:00:14,665
Um, and they need to be, be trusting.
:
01:00:14,695 --> 01:00:19,135
They need to have the trust in the
system that, Hey, when I put things here.
:
01:00:19,525 --> 01:00:22,855
I know they go to Meg when I put
things here, they never go to Meg.
:
01:00:23,125 --> 01:00:26,365
And so we had to make that
really clear for them and give
:
01:00:26,365 --> 01:00:27,595
them analytics so they can
:
01:00:27,655 --> 01:00:28,735
Evan Troxel: It's like a flow chart.
:
01:00:29,005 --> 01:00:29,365
Yeah.
:
01:00:29,965 --> 01:00:33,235
Steve Germano: Yeah, and we, we,
we have some cool tools, which I'm
:
01:00:33,235 --> 01:00:36,385
unfortunately not at liberty to show
today, but, um, you know, we have
:
01:00:36,385 --> 01:00:39,355
this admin center that makes it really
easy to do that process for them.
:
01:00:39,415 --> 01:00:42,505
Uh, and then we also have, uh, you
know, really intricate analytics
:
01:00:42,715 --> 01:00:45,475
and BI reports of every single file.
:
01:00:45,625 --> 01:00:48,925
And not every single file is gonna
be able to even be parsed, right?
:
01:00:48,930 --> 01:00:53,695
Like, let's say you get a PDF as a static
image, we're not parsing images today.
:
01:00:53,700 --> 01:00:57,925
So, you know, well, I expected
to have that data in Meg, as a librarian.
:
01:01:00,175 --> 01:01:03,085
Well, how do I know that one
image didn't make it, right?
:
01:01:00,175 --> 01:01:03,085
So we have to have those analytics
for them and, and, and dashboards
:
01:01:03,085 --> 01:01:06,445
for them to be able to see that so
that it can be a, we're trying to
:
01:01:06,445 --> 01:01:09,055
get to be a fully self-serve model.
:
01:01:09,085 --> 01:01:09,625
Right.
:
01:01:09,895 --> 01:01:12,775
Um, so you don't go to development
'cause development's expensive.
:
01:01:12,895 --> 01:01:13,315
Right?
:
01:01:13,315 --> 01:01:14,365
It's always the slowest thing.
:
01:01:14,395 --> 01:01:17,845
So we, we always, we always say like,
the process for our teams internally is:
:
01:01:18,295 --> 01:01:21,955
If you need to visualize data in some
way, you go to the data team first,
:
01:01:22,315 --> 01:01:25,525
they'll go query your data, they'll get
a power bi, and if they can accomplish
:
01:01:25,525 --> 01:01:27,025
what you want, they'll do that.
:
01:01:27,295 --> 01:01:29,125
If they can't, they'll
kick it to the dev team.
:
01:01:29,125 --> 01:01:31,825
We'll build you a custom dashboard
or a custom, you know, project
:
01:01:31,825 --> 01:01:33,445
management tool or something like that.
:
01:01:33,445 --> 01:01:36,535
We've done, we've done a variety of those
for different departments as well, so
:
01:01:36,535 --> 01:01:39,535
it's just kind of a, that's kind of
how we do that delineation and figuring
:
01:01:39,535 --> 01:01:41,635
out where this project should go.
:
01:01:41,695 --> 01:01:41,905
So.
:
01:01:42,375 --> 01:01:46,310
Randall Stevens: We, uh, we, uh,
didn't talk about this, Steve, but is the primary
:
01:01:46,310 --> 01:01:51,230
interface through Teams, where
the bot interface lives?
:
01:01:52,345 --> 01:01:53,125
Steve Germano: It is today.
:
01:01:53,185 --> 01:01:53,905
Yep, it is.
:
01:01:54,175 --> 01:01:58,915
Um, one thing we do have, um, we
are having conversations potentially
:
01:01:58,920 --> 01:02:00,445
about surfacing it in other places.
:
01:02:01,405 --> 01:02:06,775
Um, this kind of falls in line with
a kind of wait-and-see approach right now.
:
01:02:06,835 --> 01:02:12,775
Um, all of our plugins, we call them, um,
all of those functions or plugins, um,
:
01:02:12,985 --> 01:02:18,085
they are, uh, built to a certain standard
where they could plug into Chat
:
01:02:18,090 --> 01:02:18,385
GPT, or
:
01:02:19,345 --> 01:02:22,465
be run serverless in the cloud or
something like that, each individual one.
:
01:02:22,765 --> 01:02:26,515
So we're kind of, I just kind of
wanna wait and see right now. Teams
:
01:02:26,515 --> 01:02:29,305
was our decision because
this is where we all communicate.
:
01:02:29,305 --> 01:02:32,275
Like this is, 99% of our
company communication
:
01:02:32,425 --> 01:02:34,255
that is internal and informal
:
01:02:34,255 --> 01:02:35,305
all happens here.
:
01:02:35,395 --> 01:02:38,305
And then obviously, you know, stuff for
projects that goes external goes on
:
01:02:38,305 --> 01:02:40,165
email and things of that nature.
:
01:02:40,225 --> 01:02:43,165
Um, but yeah, we really thought
Teams was the right place.
:
01:02:43,465 --> 01:02:44,245
It was harder.
:
01:02:44,305 --> 01:02:45,355
It was much harder.
:
01:02:45,360 --> 01:02:48,235
I, I would've much rather
just built my own UI.
:
01:02:48,505 --> 01:02:49,945
'cause then you have
full flexibility, right?
:
01:02:49,945 --> 01:02:51,505
You can have your own chats going on.
:
01:02:51,505 --> 01:02:52,585
You do all sorts of different things.
:
01:02:52,795 --> 01:02:54,625
UI wise, you're really
restricted with Teams.
:
01:02:54,895 --> 01:02:57,895
You can see in the text streaming
here, they actually don't even
:
01:02:57,895 --> 01:02:59,185
have a text streaming feature.
:
01:02:59,365 --> 01:03:02,035
So we actually had to do
something to get it to do that.
:
01:03:02,035 --> 01:03:03,655
It's not even natively built in.
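One workaround (a hedged sketch, assuming the botbuilder-core SDK) is to post a placeholder message and repeatedly edit it as tokens arrive; Teams rate limiting is glossed over here:

```python
# Sketch: fake token streaming in Teams by editing one message in place.
from botbuilder.core import MessageFactory, TurnContext

async def stream_reply(turn: TurnContext, chunks) -> None:
    sent = await turn.send_activity(MessageFactory.text("..."))  # placeholder
    text = ""
    async for chunk in chunks:        # e.g. tokens from the model
        text += chunk
        update = MessageFactory.text(text)
        update.id = sent.id           # edit the same Teams message in place
        await turn.update_activity(update)
```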
:
01:03:03,865 --> 01:03:06,535
And talking with the Microsoft guys,
they're trying to figure that out, right?
:
01:03:06,535 --> 01:03:08,815
So, 'cause they're trying to
build their own copilot, so we're
:
01:03:08,820 --> 01:03:09,655
all kind of in the same boat.
:
01:03:09,655 --> 01:03:11,545
So I'm like, this is where we wanna be.
:
01:03:11,545 --> 01:03:15,805
We know Meg will probably live here for the long
term, but it may also surface in maybe
:
01:03:15,805 --> 01:03:18,145
another webpage. Maybe we have a tool.
:
01:03:18,495 --> 01:03:23,865
That's a project management tool where
we may have a version of Meg that plugs
:
01:03:23,865 --> 01:03:26,055
into just that data on that screen.
:
01:03:26,235 --> 01:03:26,625
Right.
:
01:03:26,655 --> 01:03:31,125
So we can kind of, uh, reuse that code
base in different places because of the
:
01:03:31,125 --> 01:03:33,255
nature of how it's built on being open.
:
01:03:33,995 --> 01:03:34,355
Randall Stevens: Great.
:
01:03:35,255 --> 01:03:35,645
Well, we,
:
01:03:35,645 --> 01:03:38,705
uh, uh, we, I think covered, uh.
:
01:03:39,085 --> 01:03:41,545
where you think this is going to go next.
:
01:03:41,725 --> 01:03:45,025
You know, when, when you were talking
about, um, you know, really the kind
:
01:03:45,025 --> 01:03:48,745
of behind the scenes of some of these
workflows in the apps and, uh, some
:
01:03:48,745 --> 01:03:50,485
things that, that might be manifested.
:
01:03:50,485 --> 01:03:54,175
But, uh, what, uh, for those out
there that are listening to this
:
01:03:54,175 --> 01:03:57,295
and, and beginning to think either
working on things themselves or
:
01:03:57,300 --> 01:04:01,075
thinking about it, what's kind of
your advice for getting started?
:
01:04:01,135 --> 01:04:04,255
What are some of the first things
that, where did you skin your knees
:
01:04:04,255 --> 01:04:07,525
that you can maybe help somebody
else, uh, avoid some of those pitfalls
:
01:04:08,665 --> 01:04:09,775
Steve Germano: Yeah,
that's a great question.
:
01:04:09,775 --> 01:04:12,805
There's, there's just so much
knowledge to consume on this topic.
:
01:04:12,805 --> 01:04:13,915
And it's ever-changing,
:
01:04:14,095 --> 01:04:14,365
Randall Stevens: moving
:
01:04:14,370 --> 01:04:14,785
quickly?
:
01:04:15,025 --> 01:04:15,235
Yeah.
:
01:04:16,045 --> 01:04:16,885
Steve Germano: Yeah, daily.
:
01:04:16,975 --> 01:04:21,445
Um, I think you gotta be a hobbyist
first before you feel comfortable
:
01:04:21,450 --> 01:04:26,035
enough to go and execute for, for, you
know, for work, uh, for your project.
:
01:04:26,395 --> 01:04:30,295
And, um, I would say just really
start understanding, you know,
:
01:04:30,295 --> 01:04:31,495
dive into prompt engineering.
:
01:04:31,495 --> 01:04:35,215
That's something you can just do in Chat
GPT's playground with a free OpenAI key.
:
01:04:35,605 --> 01:04:37,135
And, uh, start playing with that.
:
01:04:37,140 --> 01:04:38,515
Understand those principles.
:
01:04:38,935 --> 01:04:42,175
And, you know, it's really hard
to try to say, Hey, go understand
:
01:04:42,175 --> 01:04:43,165
how a neural network works.
:
01:04:43,165 --> 01:04:44,815
Nobody understands how
those things work, right?
:
01:04:44,820 --> 01:04:47,875
Like, unless you're a data scientist at
OpenAI, and even they'll tell you they
:
01:04:47,875 --> 01:04:50,904
don't understand how half their emergent,
you know, patterns are happening.
:
01:04:50,904 --> 01:04:56,695
But, um, and then I would say, you
know, the data part is the hardest part.
:
01:04:57,415 --> 01:05:01,495
Like, like building a chat bot is
actually one of the easiest pieces.
:
01:05:01,915 --> 01:05:05,485
Solving the data is the hardest
thing in a company, right?
:
01:05:05,575 --> 01:05:09,055
Because you need buy-in,
you need collaboration.
:
01:05:09,055 --> 01:05:10,825
You can't just go and dev that, right?
:
01:05:10,825 --> 01:05:15,265
You've gotta have folks want to clean
their data, want to organize their
:
01:05:15,265 --> 01:05:18,985
data, want to get it into places
that are consumable, wanna sort it.
:
01:05:19,315 --> 01:05:20,425
That's the hardest part.
:
01:05:20,425 --> 01:05:23,605
So I would say really work with
relationships that you can, that
:
01:05:23,610 --> 01:05:27,115
you have in your company with IT and
all these business unit leaders, and
:
01:05:27,115 --> 01:05:29,305
start having that conversation today.
:
01:05:29,605 --> 01:05:32,995
'cause it may take six months for
that data to be consumable or in a
:
01:05:33,000 --> 01:05:35,065
consumable fashion, a consumable state.
:
01:05:35,395 --> 01:05:37,075
Um, it takes a long time to clean up data.
:
01:05:37,285 --> 01:05:40,945
Uh, we've been, we've been live
company wide with a beta release,
:
01:05:41,245 --> 01:05:42,895
uh, since the beginning of January.
:
01:05:42,895 --> 01:05:45,025
So, so we're just going on one full month.
:
01:05:45,295 --> 01:05:48,565
Uh, we had alpha release for a couple
months before that, for a limited group.
:
01:05:48,925 --> 01:05:51,775
And, you know, we're probably
gonna continuously work with
:
01:05:51,775 --> 01:05:56,185
our librarians and filling data,
knowledge holes and cleaning data and
:
01:05:56,185 --> 01:05:59,185
outdated data, probably forever. It may
:
01:05:59,185 --> 01:06:00,745
never stop, honestly.
:
01:06:01,165 --> 01:06:02,725
Yeah, it just never really stops,
:
01:06:02,875 --> 01:06:03,145
Randall Stevens: Yeah.
:
01:06:03,145 --> 01:06:04,580
It's gonna require constant attention.
:
01:06:05,640 --> 01:06:06,060
Steve Germano: Mm-Hmm.
:
01:06:06,410 --> 01:06:06,531
Evan Troxel: I, I
:
01:06:06,685 --> 01:06:07,555
have a question about
:
01:06:07,555 --> 01:06:14,005
quantifying costs and, and investment
and, and I know that may is probably
:
01:06:14,005 --> 01:06:15,265
difficult because you have so many.
:
01:06:15,910 --> 01:06:21,640
People involved across different teams
in different roles, but ballpark, what
:
01:06:21,640 --> 01:06:27,340
is, what has IMEG invested in this, and
then over your one month of beta, like
:
01:06:27,345 --> 01:06:31,660
what kind of return on that investment
or satisfaction even have you been
:
01:06:31,665 --> 01:06:33,310
getting for feedback from people?
:
01:06:33,310 --> 01:06:36,160
Because I assume that
this is an investment
:
01:06:36,430 --> 01:06:37,270
that will cut
:
01:06:37,270 --> 01:06:39,850
down on how much time it
takes to find stuff, right?
:
01:06:39,850 --> 01:06:44,050
Like the whole idea of exposing and
bringing this data to the surface quickly
:
01:06:44,350 --> 01:06:49,779
and accurately is a huge, has a huge
ROI, it has huge implications across the,
:
01:06:49,840 --> 01:06:51,400
uh, organizations as large as yours.
:
01:06:51,400 --> 01:06:56,050
So just to give other firms an idea
who are, who are not going down
:
01:06:56,050 --> 01:06:58,690
this road yet, but may want to.
:
01:06:58,690 --> 01:07:00,310
Like, what, what are
we talking about here?
:
01:07:01,480 --> 01:07:04,300
Steve Germano: Uh, it's a great question
and you know, I don't have hard numbers
:
01:07:04,300 --> 01:07:08,380
today 'cause we're so early, but we're
actively tracking analytics and looking at
:
01:07:08,380 --> 01:07:09,730
those numbers and where they're trending.
:
01:07:09,910 --> 01:07:14,050
But I can tell you, um, the
conversations have been, you know,
:
01:07:14,050 --> 01:07:18,370
really gotta look at what's the
bar today and where's that bar set.
:
01:07:18,700 --> 01:07:23,080
And for us, and a lot of the AEC
industry, where the bar's set, let's just
:
01:07:23,080 --> 01:07:24,430
talk unstructured data today.
:
01:07:24,880 --> 01:07:25,900
SharePoint search.
:
01:07:26,725 --> 01:07:27,025
Evan Troxel: Yeah.
:
01:07:27,160 --> 01:07:28,660
Steve Germano: Hey, I'm not
trying to rag on SharePoint.
:
01:07:28,665 --> 01:07:33,010
It's come a long way over the years, but
it's not what you're seeing here, right?
:
01:07:33,010 --> 01:07:35,740
So that bar's kind of
low as far as that goes.
:
01:07:36,220 --> 01:07:41,590
Um, so the feedback and the sentiment
has been overwhelmingly positive
:
01:07:41,740 --> 01:07:44,380
from users, from directors, all
the way up through the food chain.
:
01:07:44,740 --> 01:07:46,150
Uh, CEO all the way down.
:
01:07:46,150 --> 01:07:48,610
Everyone really loves this, right?
:
01:07:48,610 --> 01:07:50,590
They really love having access to data.
:
01:07:50,950 --> 01:07:54,010
Um, I will say the
structured data side, right?
:
01:07:54,279 --> 01:07:58,120
Um, we have wonderful reports,
BIs all over the place, and
:
01:07:58,120 --> 01:07:59,680
that's accessible today.
:
01:08:00,310 --> 01:08:03,970
And that bar is graphically
better than this bar.
:
01:08:03,970 --> 01:08:04,810
I would, I would say, right?
:
01:08:04,810 --> 01:08:06,700
Like you're getting graphics
and slicing and dicing.
:
01:08:06,700 --> 01:08:11,920
It's actually better visuals,
but the speed is still better
:
01:08:11,980 --> 01:08:13,510
with a chat bot, right?
:
01:08:13,510 --> 01:08:17,410
Because now I don't need to go and
slice and dice 15 different filters in
:
01:08:17,410 --> 01:08:19,120
my Power BI report to get the exact hit.
:
01:08:19,510 --> 01:08:21,010
I just type a sentence, I'm, I get it.
:
01:08:21,354 --> 01:08:22,045
Five seconds.
:
01:08:22,345 --> 01:08:27,715
So while we're quantifying that, we expect
it to be significant, and the larger the scale
:
01:07:27,715 --> 01:07:31,915
of the company, the higher the
head count, we expect that, that ROI to
:
01:07:31,915 --> 01:07:36,535
scale, you know, uh, literally
with the size of the firm.
:
01:08:36,895 --> 01:08:39,865
Um, I will also say there's a
lot of intrinsic values we have
:
01:08:39,865 --> 01:08:42,715
not even identified yet, and
we're hearing about 'em daily.
:
01:08:42,745 --> 01:08:45,325
Like that use case I told you earlier,
somebody was copy and pasting it
:
01:08:45,325 --> 01:08:46,795
from Meg to do some other workflow.
:
01:08:47,305 --> 01:08:49,375
We don't even know half
of those today.
:
01:08:49,750 --> 01:08:55,149
And where Meg sits today is very much
a really good search engine, right?
:
01:08:55,149 --> 01:08:57,069
Like a conversational search engine.
:
01:08:57,279 --> 01:08:58,930
We haven't even built workflows in yet.
:
01:08:59,109 --> 01:09:03,609
And so that's where, Hey, can I go get
a, can I just ask Meg for a pre-qual
:
01:09:03,609 --> 01:09:07,630
for a project, a hospital project in
the city of Las Vegas, and let it go,
:
01:09:07,635 --> 01:09:10,870
make a pre-qual report for me? Does that
save somebody three hours, four hours?
:
01:09:10,870 --> 01:09:12,190
I, I, we don't know yet.
:
01:09:12,460 --> 01:09:16,210
So when, as we mature workflows
this year, it's going to, the
:
01:09:16,210 --> 01:09:17,770
ROI is gonna continue to go up.
:
01:09:18,130 --> 01:09:20,170
Um, so sorry, I kind of skated around
:
01:09:20,170 --> 01:09:23,649
that one. I don't really know
today, but we feel, and the numbers
:
01:09:23,649 --> 01:09:24,729
show it's, it's gonna be pretty
:
01:09:24,729 --> 01:09:25,330
significant,
:
01:09:26,529 --> 01:09:28,734
Evan Troxel: I think another big part
of that investment is just having a
:
01:09:28,734 --> 01:09:33,354
team dedicated to development, not
necessarily AI development, but you
:
01:09:33,354 --> 01:09:37,495
do have a team of six people that is
dedicated to development and solving
:
01:09:37,825 --> 01:09:42,265
problems by creating software across
the, everything that that includes.
:
01:09:42,505 --> 01:09:46,915
And so the company is already set
up to go down this road in, in
:
01:09:46,920 --> 01:09:48,295
some respects, right, where other
:
01:09:48,295 --> 01:09:49,135
companies may not
:
01:09:49,255 --> 01:09:49,675
have those
:
01:09:49,690 --> 01:09:51,100
Randall Stevens: It's a good, uh, a good,
:
01:09:51,359 --> 01:09:51,890
Evan Troxel: they haven't
:
01:09:51,940 --> 01:09:52,479
Randall Stevens: oh, I'm sorry.
:
01:09:52,660 --> 01:09:53,859
Uh, it's a good setup.
:
01:09:53,859 --> 01:09:57,820
Evan, I was gonna, uh, I saw
Steve, that you're looking to
:
01:09:57,820 --> 01:09:58,960
add some people to the team.
:
01:09:59,020 --> 01:10:01,150
You know, here's your,
here's your chance to,
:
01:10:01,150 --> 01:10:02,350
uh, reach
:
01:10:02,410 --> 01:10:06,309
our global audience of the Confluence
podcast to kind of put out there
:
01:10:06,340 --> 01:10:09,155
some of the people that, uh, you're
looking to add to your team.
:
01:10:10,735 --> 01:10:14,875
Steve Germano: Yeah, we, uh, right
now we're interviewing, um, actively
:
01:10:14,875 --> 01:10:17,065
for folks with Revit API experience.
:
01:10:17,095 --> 01:10:21,025
Uh, we have a lot of full stack engineers
that are not from the industry, that
:
01:10:21,025 --> 01:10:23,785
have a lot of, you know, traditional
full stack experience, and we're looking
:
01:10:23,785 --> 01:10:25,585
to add in more Revit API experience.
:
01:10:25,885 --> 01:10:29,035
Um, and so we're looking for
that, that person right now.
:
01:10:29,035 --> 01:10:32,365
And then, uh, we're continually adding,
you know, we've added a couple devs
:
01:10:32,365 --> 01:10:34,105
last year, adding more devs as we go.
:
01:10:34,375 --> 01:10:37,615
Um, and so I would say if
you're interested, please
:
01:10:37,675 --> 01:10:39,205
reach out to me on LinkedIn and
:
01:10:39,475 --> 01:10:39,985
I'd be happy to have a
:
01:10:39,985 --> 01:10:40,375
conversation.
:
01:10:40,434 --> 01:10:40,945
Randall Stevens: that's great.
:
01:10:41,215 --> 01:10:46,405
And, uh, as, as you know, because I've
been, uh, trying to twist your arm,
:
01:10:46,405 --> 01:10:52,165
we're doing this one day Confluence event
around AI and machine learning in April.
:
01:10:52,170 --> 01:10:56,905
It's April 17th, it's gonna be in,
uh, uh, actually at the, uh, Brooklyn
:
01:10:56,910 --> 01:10:58,615
Navy Yard, uh, New York City.
:
01:10:59,095 --> 01:11:00,385
Uh, so that's underway.
:
01:11:00,445 --> 01:11:04,465
Uh, hopefully we'll figure out how to
get Steve there to be part of this.
:
01:11:04,495 --> 01:11:08,245
Uh, but, uh, it should be a, a
great day for those of you that
:
01:11:08,245 --> 01:11:12,295
are interested in, in this topic,
diving in a little bit more.
:
01:11:12,355 --> 01:11:17,005
Um, um, so anyway, you can, uh,
Evan, I guess we'll put in the show
:
01:11:17,010 --> 01:11:21,055
notes a link so that people can get to
that and sign up for more info.
:
01:11:21,493 --> 01:11:21,673
Evan Troxel: Yep.
:
01:11:21,702 --> 01:11:25,393
We'll put a link to that and we'll
put a link to Steve's LinkedIn page
:
01:11:25,393 --> 01:11:29,053
so you can get in contact if you feel
like, uh, this is a path you want
:
01:11:29,053 --> 01:11:31,663
to explore with IMEG and his team.
:
01:11:31,693 --> 01:11:31,933
So.
:
01:11:32,623 --> 01:11:33,702
Steve, thank you so much.
:
01:11:33,708 --> 01:11:35,383
This has been a super fun conversation.
:
01:11:35,383 --> 01:11:39,223
Like I love, like the whole purpose of
this podcast is to go behind the scenes
:
01:11:39,253 --> 01:11:41,833
and we officially achieved that today.
:
01:11:41,833 --> 01:11:45,493
And thank you for, for your
willingness to share with the audience.
:
01:11:45,493 --> 01:11:47,653
It's been a fantastic learning experience.
:
01:11:47,698 --> 01:11:48,208
Randall Stevens: feeling we
:
01:11:48,448 --> 01:11:49,033
Steve Germano: Yeah, thanks for
:
01:11:49,113 --> 01:11:50,728
Randall Stevens: I have a feeling
we can have you back on here
:
01:11:50,733 --> 01:11:52,048
like every six months and have
:
01:11:52,648 --> 01:11:53,908
this much or more, right,
:
01:11:53,908 --> 01:11:55,348
to, to talk about.
:
01:11:55,408 --> 01:11:56,908
But, uh, yeah, it's really cool.
:
01:11:56,908 --> 01:11:58,288
It'll, it'll be fun, right?
:
01:11:58,318 --> 01:12:01,948
We will try to, we'll give you maybe a
little bit more than six months to get,
:
01:12:02,038 --> 01:12:03,838
get some of these results under your belt.
:
01:12:03,838 --> 01:12:07,768
But we'll have you back on and, and,
um, you can report on, you know,
:
01:12:07,773 --> 01:12:11,518
just how well this is, uh, this has
been, uh, in the rollout of, ah, Meg.
:
01:12:11,518 --> 01:12:12,598
And, uh, it's exciting.
:
01:12:12,603 --> 01:12:16,558
I think as you said, Steve,
it's like an exciting time.
:
01:12:17,008 --> 01:12:19,138
We're at this pivotal point,
right, where it's like.
:
01:12:20,188 --> 01:12:20,758
You know, I think a
:
01:12:20,758 --> 01:12:24,688
lot of the things that we've probably
people our age, uh, I'm, I'm not
:
01:12:24,688 --> 01:12:25,708
throwing you under the bus, Steve.
:
01:12:25,708 --> 01:12:29,458
I'm, I'm a few years older, got a
few years on you, but, um, but it's
:
01:12:29,458 --> 01:12:32,158
like, you know, things that you've
always kind of imagined and it's just
:
01:12:32,158 --> 01:12:35,968
like, ah, you know, you're not gonna
write code, you know, for all this.
:
01:12:35,968 --> 01:12:40,738
So I think it's just opening up this new
thinking about just what the interfaces
:
01:12:40,738 --> 01:12:42,808
to all this information can look like.
:
01:12:42,808 --> 01:12:43,738
And just a
:
01:12:43,738 --> 01:12:44,698
really exciting time.
:
01:12:44,698 --> 01:12:47,458
So I appreciate your coming on
and, and sharing this with everyone
:
01:12:48,243 --> 01:12:49,888
Evan Troxel: E even how
you write code, right?
:
01:12:49,918 --> 01:12:52,018
GitHub Copilot now.
:
01:12:52,048 --> 01:12:53,038
Like these are, these are,
:
01:12:53,428 --> 01:12:54,327
it's complete.
:
01:12:54,808 --> 01:12:55,738
It's crazy.
:
01:12:55,827 --> 01:12:56,068
It's
:
01:12:56,068 --> 01:12:57,118
absolutely incredible.
:
01:12:57,508 --> 01:13:01,318
Steve Germano: I, I've seen developers
on my team that, you know, were,
:
01:13:01,318 --> 01:13:04,258
were, you know, really would,
would not have the speed aspect.
:
01:13:04,263 --> 01:13:06,658
They were really thorough, but they
weren't, weren't very fast.
:
01:13:06,838 --> 01:13:11,218
But having that AI copilot with
them to shoot ideas against, it's
:
01:13:11,218 --> 01:13:12,688
like, oh, hey, have you tried this?
:
01:13:12,778 --> 01:13:16,228
And they may massage and tweak
it, but their velocity has gone up
:
01:13:16,228 --> 01:13:18,448
over 50% for some of these guys.
:
01:13:18,448 --> 01:13:19,048
It's, it's amazing.
:
01:13:19,048 --> 01:13:22,168
And then your newer guys, it's a
little dangerous with the newer guys.
:
01:13:22,168 --> 01:13:25,738
I'm a little concerned about that
when they don't know enough of what
:
01:13:25,978 --> 01:13:29,188
AI's doing and, you know, so there
could be some concerns there, but.
:
01:13:29,488 --> 01:13:33,628
Your ability to learn as a newer guy
now by seeing code written live for
:
01:13:33,628 --> 01:13:36,538
you and then learning from it and
then it can explain it to you, right,
:
01:13:36,538 --> 01:13:37,618
in your IDE.
:
01:13:37,827 --> 01:13:38,038
Yeah.
:
01:13:38,038 --> 01:13:38,668
It's amazing.
:
01:13:38,668 --> 01:13:38,818
Yeah.
:
01:13:38,823 --> 01:13:41,848
It's really, really helped, uh,
developers across the globe.
:
01:13:41,848 --> 01:13:42,838
It's been a game changer.
:
01:13:43,468 --> 01:13:43,818
Randall Stevens: Great.
:
01:13:44,168 --> 01:13:44,458
Well,
:
01:13:44,458 --> 01:13:45,138
thanks again Steve.
:
01:13:45,158 --> 01:13:46,458
We appreciate your coming on.
:
01:13:46,458 --> 01:13:47,178
Looking forward to
:
01:13:47,178 --> 01:13:48,138
seeing what you're doing next.
:
01:13:48,638 --> 01:13:49,238
Steve Germano: Yeah.
:
01:13:49,238 --> 01:13:49,718
Appreciate it.
:
01:13:49,838 --> 01:13:50,528
Thanks everyone.