SUMMARY KEYWORDS
device, sensors, great, Brian, spatial computing, product, AR, Nima, work, space, building, Argo, hardware, enterprise, cases, waveguide, cameras, industrial, Apple, spatial awareness
00:02
I'm Karen Quatromoni, the Director of Public Relations for Object Management Group (OMG). Welcome to our OMG podcast series. At OMG, we're known for driving industry standards and building tech communities. Today, we're focusing on the Augmented Reality for Enterprise Alliance (AREA), which is an OMG program. The AREA accelerates AR adoption by creating a comprehensive ecosystem for enterprises, providers, and research institutions. This Q&A session will be led by Christine Perey from PEREY Research & Consulting.
00:45
We're … in the blog post about …
01:16
Thank you, Christine, and thank you to the AREA for all your support; you're actually doing a lot of really important work in the AR space. I'm Brian Hamilton, Head of Marketing and Sales for DigiLens. I've been a pioneer and entrepreneur in the space for almost a decade now, was the founder of the RealWear company, and was involved in some of the first smart helmet products, pre-HoloLens; we deployed those in healthcare and in over 50 countries around the world. So I'm honored to be here, and a special shout-out to the AREA and the XR community.
01:47
I've been in the space since …
02:21
Great, great. Thank you. Well, you've both been in fields that focus on the optical see-through form of augmented reality. You know, to me, that's the only kind of augmented reality there is. It's a position that I have that people need to see the real world. That's what I mean: not a video of the real world, but the real world itself. Well, with the launch of the Apple Vision Pro and the activities and launches of many other products that are video pass-through, there are some who have proposed that video pass-through is appropriate for enterprise and industrial use cases. Do you agree with that? Or what's your position? Brian, first with you: what do you think about video pass-through in an industrial environment?
03:18
Absolutely. So, you know, one firm belief I have is that one size does not fit all. There are so many vast use cases in industrial and enterprise; there's a place at the table for monocular, binocular, video pass-through, non-video-pass-through AR, VR, all of the above. There's a lot of space in the world. I want to thank Apple for coming out with the product, because it's really highlighted the space. You probably saw Mark Zuckerberg countering the Apple Vision Pro in a video just recently. It's really brought a great narrative to the space for people to discuss this important topic, and they're
03:53
talking about not just the displays; they're talking about spatial computing. They're introducing a whole new vocabulary and, yeah, people's narratives.
04:02
Absolutely. And, you know, our perspective is this: the Apple Vision Pro and video pass-through have great applicability to the desktop kind of approach. If you're doing something in training, or you're in a controlled environment, in a control room or something like that, where you need to have lots of monitors, great. But if you're out in the real world, you've got to be able to see the user's eyes, and you can't; you've got to look at things like eye glow, the transparency, and the ability to see correctly and not have the data compromised. So we believe that video pass-through has its place, but it's in more of a controlled environment. When you get out in the real world, for real-world use cases across industrial, enterprise, military, and government, true AR is on our doorstep, and we believe the next evolution is coming quickly. We do see a lot of value in pass-through, but we believe that true AR will eventually win in the enterprise and industrial space.
04:59
Video see-through is what the Apple Vision Pro proposes. And what we're talking about as having value in the workplace, where there are a lot of risks, and bright lights and dim lights and all kinds of different qualities, that's the optical see-through, using the lenses that you guys have worked on and have a lot of intellectual property in, right?
05:30
Christine, if I may add to what Brian said: we've been using the term "synthetic AR," right? What Apple and Meta are doing isn't true AR; it isn't organic AR. You don't see the world through your eyes; you see it through a camera and a monitor, right? And like Brian said, we see the same analogy we saw in computing. In computing, you have a desktop computer, you have a mobile phone, and you have a pager. Spatial computing is going to follow the same trend, right? What's the advantage of a desktop computer? Extremely powerful, massive field of view, not very mobile, right? Use cases: CAD, entertainment, which is everything Apple is speaking to. So Apple's product is the desktop computer of spatial computing. On the pager end, you have monocular devices: think Google Glass, think maybe the Meta Ray-Bans. Super sexy, really nice form factor, but not much capability yet. DigiLens wants a place in the center; we call that the mobile computer of spatial computing, the cell phone of spatial computing. And what's really important in the industrial space, which you asked about, is spatial awareness. Our default baseline is your reality; it's your eyes. If the device fails for any reason, you see your surroundings, you see the tool, you see the equipment. If the Apple Vision Pro fails or the Meta device fails, you're literally blind. And so we think that limits its use case on the shop floor, or outdoors, or in areas where your spatial awareness and perception are extremely critical, and where sometimes your life is in your hands if you're around equipment or something that may harm you. So, like Brian said, three different categories, three different use cases. We love that Apple started the conversation; everyone's talking about spatial computing, so that's great for us. But use cases are very dependent on what you're trying to do.
07:30
…started in augmented reality, in …
08:59
Yeah, that's a great question. So a spatial computer sits between you and your reality, right? And so in many ways, it needs to almost become smarter than the user. It needs to know when to present information at the right time. If you're driving, or you have equipment with you that you need to be focused on, you don't want an Android pop-up asking you to update your OS; the system needs to be ambiently aware of what's happening. So in Argo we've packed in a lot of sensors: IMUs, multiple cameras, a GNSS radio so it knows where you are. We've even put in a pressure sensor so it can detect the delta in height, so you know which floor of the building you're on or how high up you are. Each device has five microphones and works in over 90 dB of noise, because voice commands are one way that people interact with a device like this, and we need to be able to pick up the user's voice even when they can't hear their own voice. So it's really important that a device that sits between you and your reality is aware of when to present the right information to you, right? And so the fundamental, most important aspect, which we talked about, is the waveguide, right? We are not distracting you from reality. You're aware, and the sensors that sit behind that only give you information at the right time.
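[Editor's note] For readers curious how the pressure-sensor trick Nima mentions could work in practice, here is a minimal sketch based on the standard barometric formula. The constants, function names, and the 3.5 m floor height are illustrative assumptions, not details of the Argo firmware.

```python
# Minimal sketch: estimating height (and floor) from barometric pressure deltas.
# All constants and names are illustrative assumptions, not DigiLens code.

SEA_LEVEL_PRESSURE_HPA = 1013.25   # standard-atmosphere reference pressure
SCALE_M = 44330.0                  # from the international barometric formula
EXPONENT = 0.1903                  # ~1 / 5.255

def pressure_to_altitude_m(pressure_hpa: float,
                           reference_hpa: float = SEA_LEVEL_PRESSURE_HPA) -> float:
    """Convert a pressure reading (hPa) to altitude (m) above the reference pressure."""
    return SCALE_M * (1.0 - (pressure_hpa / reference_hpa) ** EXPONENT)

def floor_estimate(pressure_at_entry_hpa: float, pressure_now_hpa: float,
                   floor_height_m: float = 3.5) -> int:
    """Estimate which floor the wearer is on from the pressure delta since entering."""
    delta_m = (pressure_to_altitude_m(pressure_now_hpa)
               - pressure_to_altitude_m(pressure_at_entry_hpa))
    return round(delta_m / floor_height_m)

# Near sea level, pressure drops roughly 0.12 hPa per metre of altitude gained,
# so a ~1.3 hPa drop corresponds to about three 3.5 m floors.
print(floor_estimate(1013.2, 1011.9))  # -> 3
```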
10:25
Dual cameras? You've got two cameras, or how many cameras do you have? Do you have LiDAR as well? And what is the depth-sensing technology we need to have?
10:35
Yeah, great question. We're kind of following the Elon Musk idea of not putting in LiDAR and using cameras instead, because this product is meant to work indoors and outdoors, right? So on both sides we have global-shutter cameras with, I believe, a 170-degree field-of-view overlap. These are global shutter, monochrome, and sensitive to IR light. What they do is what's referred to as visual odometry: they look at features in your environment, and as you move, they map that environment and place you in space; that's SLAM. In the center we have a 48-megapixel camera. No one needs 48 megapixels; the reason for 48 megapixels is that it does four-to-one pixel binning. So if you're in a bright environment, you get very high resolution. But let's say you walk into a factory that's dimly lit: those 48 megapixels act like 12 megapixels, but with four times the light per pixel, so the camera can detect lower light better. It also has OIS, optical image stabilization, and EIS, electronic image stabilization, because once it's on your head and you're moving around, whoever is seeing through your camera, or whatever you're capturing, needs a stable image. So those are the camera sensors we've built. We know one size doesn't fit all, so we've introduced a USB-C port and a mounting system so you can attach different types of sensors on top. Maybe you do need a time-of-flight sensor, right? Maybe an IR sensor. We've built that capability into the device.
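[Editor's note] To make the binning arithmetic concrete: combining each 2x2 block of pixels into one turns a 48 MP frame into a 12 MP frame while gathering roughly four times the light per output pixel. A minimal NumPy sketch follows; the array sizes are scaled-down stand-ins and not the actual Argo sensor pipeline.

```python
import numpy as np

def bin_four_to_one(raw: np.ndarray) -> np.ndarray:
    """Sum each 2x2 block of pixels into one output pixel.

    This trades resolution for light sensitivity: a 48 MP frame becomes 12 MP,
    but each output pixel gathers roughly four times the signal in dim scenes.
    """
    h, w = raw.shape
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even for 2x2 binning"
    return raw.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))

# Toy frame: a scaled-down stand-in for an 8000x6000 (48 MP) monochrome sensor.
frame = np.random.poisson(lam=2.0, size=(800, 600)).astype(np.uint16)
binned = bin_four_to_one(frame)
print(frame.shape, "->", binned.shape)    # (800, 600) -> (400, 300)
print(frame.mean(), "->", binned.mean())  # per-pixel signal roughly quadruples
```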
12:13
That's really cool. That's very, very thoughtful, yeah.
12:18
One thing on sensors, just to add to what Nima said, that I think is important: sensors are very use-case specific, so when you think about the integrations, it's really vertically focused. One of the challenges, I think, that has happened in the space, and why mass adoption has been slower than we'd all like, is that people are delivering a great piece of software or a great piece of hardware, but they're not thinking about the total solution. And so a sensor package for a government is very different from a sensor package for construction or medical, for example. So whether it's being able to capture diagnostic information, or, in medical, being able to identify a patient in real time, or, if you're actually out in the field, needing to turn the soldier into a sensor based on location, sensor packages are vast and really need to be looked at with a vertical and geographical kind of approach.
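[Editor's note] Brian's point about vertical-specific sensor packages maps naturally onto a modular configuration. A purely hypothetical sketch of how such packages might be described and validated follows; the vertical names, sensor names, and API are illustrative, not anything DigiLens ships.

```python
from dataclasses import dataclass, field
from typing import List, Set

@dataclass
class SensorPackage:
    """Hypothetical bundle of add-on sensors for one vertical, attached via the USB-C mount."""
    vertical: str
    required: List[str]
    optional: List[str] = field(default_factory=list)

# Illustrative examples only; real packages would be specified per customer and region.
PACKAGES = {
    "construction": SensorPackage("construction", ["thermal_camera", "time_of_flight"]),
    "medical": SensorPackage("medical", ["high_res_rgb"], ["barcode_scanner"]),
    "defense": SensorPackage("defense", ["ir_sensor", "gnss_augmentation"], ["night_vision"]),
}

def missing_sensors(attached: Set[str], vertical: str) -> List[str]:
    """Return the required sensors for a vertical that are not currently attached."""
    return [s for s in PACKAGES[vertical].required if s not in attached]

print(missing_sensors({"thermal_camera"}, "construction"))  # -> ['time_of_flight']
```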
13:13
That's perfect, Brian, because my third topic for today's fireside chat is the diversity of devices and displays that we should have, but don't quite have yet. Today, it seems that people are thinking in very narrow boxes: it's got to be binocular, it's got to sit on my nose, it's got to have a computer on my head. And yet I'm convinced that we need to have much, much more variation. So are you working with your distributor partners to develop alternatives to the Argo platform, and even different modules of Argo?
14:05
Right. So, Nima's got a lot more on this in terms of the product roadmap, but let me give you my perspective. You know, we have three lines of business within DigiLens. Number one, we train and help others figure out how to manufacture and bring up lines for waveguide manufacturing. Number two, we work with different partners that have their own specifications, where they want to use DigiLens components: the waveguide, the projectors, the boards, all of that. So there's a lot of diversity in that. And then we've got the third line, obviously, as we've turned into a product company. The challenge that I think we see is that everybody's trying to build one purpose-built device that's the catch-all device for every single market and every
14:46
single use case. A nightmare.
14:49
And then the other challenge is, I think, you know, hardware is hard. Being entrepreneurs like Nima and I, you know, go try to raise money for a hardware company; good luck. It is super hard. So innovation is very difficult, especially in the United States, for development of hardware companies, and we're seeing new trends here now with AI and spatial computing that are maybe changing that. But the reality of the situation is that diversity of devices is really important. A monocular device like the RealWear HMD can be great for specific use cases, or the new Navigator they have. You need to look at compliance; you need to look at all the different factors. How do you fit it? How do you wear it? What's the personal protective equipment?
15:29
Some people need to wear them all day long, and other people only for a few hours. So much variety.
15:36
That's right. Yeah, to build on what Brian said, which is really key: DigiLens is very vertically integrated, right? Even though we're still a startup, we're a startup that's been around for 15 years working on the hardest physics problems, which have been the optics. And so we have the capability of making the waveguide technology, making the projector technology, and making the boards that are needed to go into that computer. And so Argo is our initial product platform, but like you said, it doesn't necessarily fit everyone's use case. So what we've done is build an ecosystem around Argo. We've created the capability to allow for hinge adjustment and for ear-horn removal. Some people want to wear it for a very long time, but they don't necessarily want it on their nose, like you mentioned. So we've even come out with halo head straps that take the weight off and keep it out of the way, right? So we're building an ecosystem around Argo, but at the same time we understand one size doesn't fit all. So we're innovating through iterating: we're working with our partners, figuring out what problem they're trying to solve, and then taking all of the core components and building a unique solution bespoke to their needs. And that's what Brian really touched on, which is key: it's the solution, right? The hardware is the barrier to entry, the software is the magic, but it's what are you trying to solve, and how are you solving it? And so we're uniquely poised to be able to solve different needs for different markets.
17:15
…point to raise about hardware in …
18:28
Maybe I can start, and Brian can close. You touched on a really interesting point: there have been some false starts in the past. I think the challenge you see with those false starts is that they all initially came out as consumer devices that kind of failed and pivoted to become industrial devices, and then they weren't very good, because the DNA of the product wasn't industrial, right? Google Glass: the first example of it was some guy skydiving, right? Magic Leap: the first use cases were a whale in an auditorium and Minecraft, right? Those were consumer products that didn't really capture the consumer's interest, so then they quickly had to pivot and claim they were enterprise products. What we're trying to do with DigiLens and Argo is be an enterprise product from the beginning, right? It's by design. By design, it's MIL-SPEC, it's ANSI-rated, peripheral vision is not occluded, right? Spatial awareness is key. So by design, we're building it for the enterprise and industrial market, and therefore we hope, from the traction we're seeing so far, that it is a usable tool, right? This is a solid product; try doing that with the Apple Vision Pro, right? This is designed to be an enterprise tool that enhances the work of the worker. I don't know, Brian, if you want to add anything to that.
19:54
If you take a look back, the …
20:44
privacy, all of these ethics. I'm not saying that those are bad topics; I'm saying that the conversations have to go on at all these levels in parallel. Absolutely,
20:56
most entrepreneurs don't want to sit down and start their business around compliance, but that is
21:02
important. That's probably
21:05
where you need to start, honestly,
21:09
Great, great conversation. Thank you so much for spending the time with us during this fireside chat. Thank you for being AREA members and supporting our community and our activities. Thanks a lot, guys.
21:21
thank you.
21:22
Bye, thank you.