
E09 REALWEAR FOUNDER ANDY LOWERY ON HIS COMPANY'S ADVANCED KNOWLEDGE TRANSFER DEVICE



Andy Lowery, RealWear with Zack Parnell, ITI

MANDY HENRY: Welcome to the LITES Podcast. It’s Leaders in Industrial Technology, Education, and Safety, where we talk to thought leaders in construction and heavy industry during this exciting time of innovation. I’m Mandy Henry, Communities Manager at ITI, and in this episode, Zack Parnell, President of ITI, had the opportunity to talk with Andy Lowery, CEO of RealWear, about how his company has created a technologically advanced knowledge transfer device for heavy industry. Thanks for tuning in.

ZACK PARNELL: Let's understand who you are, who RealWear is, and go from there.

ANDY LOWERY: Okay. I'm Andy Lowery. I founded RealWear with a few other folks, a couple of inventors of the system. I'm currently the CEO, and what we do at RealWear is specialize in industrial workforce training, or knowledge transfer. Today we make a computing platform that's worn on the head. You mount it with a baseball cap, a bump cap, or a hard helmet, and you wear the device all day long as a companion, if you will. This computer companion can do things like show videos, take videos, take photos, show you photos, and walk you step by step through procedures. But what we get the most use out of it for currently is connecting with an expert who might be a thousand miles away and doing a sort of FaceTime with that expert.

Now it's much more robust than FaceTime on your phone. There are a lot of other features: you can circle things, you can point things out to the person, that sort of thing. If you've ever had an IT guy log into your computer and take over your mouse and you're like, “Oh man, he's taken over my computer,” well, you can do that same thing in the real world. So, I can be looking at a big switch box, not sure which switches I'm supposed to switch and in what order, and I can call somebody back at headquarters. The expert for that switch box comes on, circles the switch up to the right and the switch up to the left, says, “This one's first, this one's second,” and walks me in real time through the procedure. What I'm doing is ensuring that I do it right the first time and don't have a bunch of rework afterwards.

Second, I'm getting a sort of on-the-spot training, if you think about it. A lot of people shove us into a technology box: “Oh, you're an AR company, you're a wearable computer company.” And I go, “No, we're a knowledge transfer company.” Quite frankly, think of The Matrix, if the audience has seen that movie: Neo is sitting on top of a roof, he has just dodged some bullets, quite literally I think, and he looks over at a helicopter and says, “Trinity, can you fly that helicopter?” She goes, “Give me a second.” She uploads the B-212 helicopter program, and the next thing you know she jumps in and she's flying it. In 50 or 100 years, maybe that's RealWear, where we don't use augmented reality or a heads-up display to convey the information, we just put the knowledge into the brain. Because that's what we're trying to do, and we're trying to do it with the state-of-the-art technology we have out there today.

ZP: You mentioned Raytheon earlier, and I know you were in the Navy before that. Thank you for your service. Could you help us understand how you got here, and maybe take us back to the baby days?

AL: I was born on a small farm… no, hahaha. So, I grew up in the St. Louis area, literally in a small town on the Illinois side of the Mississippi River, and moved up to Chicago in middle school. I went to high school in the western suburbs of Chicago and then enlisted in the Navy. My first job was literally doing the stuff that we're selling to now: maintenance, repair, and operations jobs in the Navy, in particular around the nuclear power plants that power our submarines and surface ships. I went through nuclear training, became a nuclear reactor operator, did well and got picked up, went to college, then served out the rest of my 11 years of active service as an officer. My last job was on the John C. Stennis, which is a great big nuclear aircraft carrier, where I was the Reactor Electrical Assistant. Then I got out of active service, stayed in the reserves, and retired a few years ago out of the reserves with 20 years of combined service.

Then I moved into Tyco. At Tyco, I sold fluid systems, and that gave me a broad brush of a lot of different processes out there, ranging from oil and gas to construction to a whole host of others. And of course power, which is where I kind of grew up and really first got most of my systems education, with the power systems in the Navy. I did that for a while, got promoted to a general manager position at Tyco, and ran an electronics factory for about 5 years. Then that business got bought out of Tyco and I went to Raytheon. At Raytheon, I spent about six years as a chief engineer. So, I was the top guy on a lot of these great big, huge programs, building systems. I had programs ranging all the way from concept and bidding at a very, very early stage, clear through retirement and obsolescence, and I ran the gamut of everything in between, from manufacturing to servicing. So, I got a very, very good aerospace education in that role.

In parallel to that, in 2010, I founded a company called DAQRI. In fact, it was called Augmented Dynamics when we originally founded it, me and Brian Mullins, and we eventually changed the name to DAQRI because it was a little bit catchier. We were doing a lot of consumer work back in 2011 because there wasn't really broad awareness of augmented reality wearables or any of that. It was very nascent days. The best you could do was take an iPhone 4 or whatnot, do video streaming, and see the augmented reality appear on the screen of your phone. So, I did that with Brian for a couple of years but stayed at Raytheon in the first stages because we didn't have a lot of funding and those sorts of things.

In 2013 we were kind of blessed with a very strong benefactor, a big funding group out of Newport Beach. They decided to weigh in very heavily with DAQRI. It was at that time they approached me about the opportunity to come out of Raytheon full-time and run DAQRI. So, for 2 years, from 2014 to 2016, I built DAQRI, or helped to build DAQRI with the team, I guess, which is a better way of saying that, and did lots of fun stuff with a lot of technology. For a while we had an EEG company that we bought, and we experimented with concentration and thought as the control paradigm for augmented reality. We also bought a very cool wave company that does wave cancellation to build light waves, they call them light fields sometimes, and you can move these light fields in and out of different focal distances. And we did some prototype work around the smart helmet concept, where you build the different augmented reality systems into an actual hard helmet. All in all, it was such a learning experience for me, both from running a startup and from the technology itself. It was during those years that I met almost everybody in the space, and there's hardly a person in the augmented world that I don't know, or know of and know their history and stories, because of that time.

Then in 2016, we had a mutual parting of ways. It wasn't anything contentious or negative. I was looking for something that I could build. I'm a builder by nature. I felt that although DAQRI was doing great work, they were doing a lot of work around technology and not enough around product design, getting things out, and getting revenue going, that sort of stuff. So, I left, and I formed RealWear with a fellow I had met during my DAQRI years who had worked at a company called Kopin. They had done a project that started with Motorola back in 2007-2008 to build a wearable computer. It was a Windows CE computer, and they had prototyped it, and I think in the end Motorola built about 1,000 or so and fielded them. The system wasn't ready. The operating system was poorly done. Back then it was their first take at it. It was a very heavy system; the electronics hadn't been miniaturized enough yet, and I think it weighed 2 pounds or something like that when they fielded it. So, it didn't get the traction they were hoping for, but they went back to the drawing board, and over a course of about 5 or 6 years they put about $35 or $40 million into research and development. They called it the Golden Eye Program. This was at Kopin.

Then in 2014, Chris basically got John's final word on it. John said, “Look, I got this big Google Glass order. I don't think we need to do our own system now that Glass is coming out into the consumer market.” We all know the history, it didn't survive very long out there, but John at the time decided to unplug from this idea of an industrial wearable. And that's when Chris, the inventor of the system, Dr. Chris Parkinson, now the CTO at my company, decided to move out and ask for a license. He got all that and got a deal done. Then shortly thereafter I left DAQRI. He called me and said, “Do you want to join forces?” And so we did.

I brought in one last team that I had become aware of, called Sonim. Sonim makes industrial wearables, I'm sorry, industrial Android phones, not the wearable variety like ours, more the handheld, traditional variety. That company had built up over about a decade, going from feature phones to smartphones, and had done intrinsically safe ones for oil and gas, very, very rugged ones, and all of this. That team had recently had a management shakeup, and I was able to bring in their founding CEO, their chief product guy, and both of their major engineers. So, in essence, I got a commercialization team in a box. By integrating the pieces, and it's kind of a theme of my life as a chief engineer, integrating mature pieces, I was able not to start up a company but to integrate a company. And with that we were able to get into the market extremely quickly. We went from the formation of the company to mass production in about a year and a half; we hit mass production towards the end of Q4 of last year ('17). We've had three full quarters of shipping in mass production. Suffice it to say, I think if we're not the #1 wearable computer company out there, we're certainly in the top 3.

ZP: I also want to get back to specifically what problems you're solving. For instance, for Colgate, or outside of that one customer, could you define the three to five major problems you would identify that you're solving?

AL: Sure. I'll start with the most basic and then work my way up to the most complex. Okay?

ZP: Sure.

AL: So, the most basic problem is that we have a generation of YouTube millennials hitting the workforce, right? If you look at how we educate and how we train in industry, it's still very much circa the industrial age. Then you take a look at how they do everything else in their life: they get a new present for Christmas, they don't know how to put it together, so what do they do? They go to YouTube, or they go to Snapchat and call their friend up and say, “Hey, you had one of these last week, can you help me with it?” So they are a connected, IoT generation when you look outside the school classroom or outside the work that they're going to do. It's ironic that the consumer side of society has advanced them into a very connected generation, and then we bring them into industry and those connected assets, those digital assets, aren't there. So the first thing the RealWear HMT can solve for is building a repository of videos and data and digital artifacts that you can use to train these millennials, either in situ or before they go out. It's a GoPro with wings, if you want to think about it that way: an industrial GoPro with a great big 16-megapixel camera that can take beautiful videos. Use them for training, use them for that in-situ sort of knowledge, that quick microlearning that they need. That's the most basic version of what this thing can do. It can do that with no software, right out of the box, with nothing else to hook up. You don't even have to have it connected, because we have a microSD slot in there, and you can put in up to 256 gigabytes of bulk storage. You can get a lot of videos on 256 gigabytes. So that's the first thing we solve for.
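[Editor's note: as a rough, illustrative back-of-envelope, not a RealWear specification, here's how much footage 256 gigabytes could hold at an assumed 1080p bitrate of about 8 Mbps.]

```python
# Back-of-envelope: hours of video that fit on a 256 GB microSD card.
# The ~8 Mbps bitrate for 1080p footage is an assumption for illustration only.
card_gb = 256
bitrate_mbps = 8

card_megabits = card_gb * 8 * 1000            # decimal GB -> megabits
seconds_of_video = card_megabits / bitrate_mbps
hours_of_video = seconds_of_video / 3600

print(f"~{hours_of_video:.0f} hours of video")  # ≈ 71 hours at these assumptions
```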

ZP: So, in practice, the user is able to be in front of their work face, whatever that may be, and quickly dial up a video showing them and reminding them how to perform that work. Is that what you're saying?

AL: Exactly. Or creating those videos, if you've got the expert who's been out there for 30 years doing it. Now you're clipping this onto his hard helmet and saying, “Hey Jim, just go do your job, and if you'd be so kind, talk to the system while you're doing it to give people a step-by-step of what you're doing.” Now you've got these YouTube-style videos that you're creating for a construction site or a refinery.

The second thing needs connectivity and a little bit of third-party software, and there are a lot of third-party applications out there, like Librestream Onsight, that do this. Let's call it FaceTime for industry, and most people are familiar with the idea: I'm looking through the eyes of the person in the field, and I'm seeing what they're doing on my laptop back at home or back at the office. I might be the engineer or the designer or someone very familiar with the system, and the person out in the field is not familiar at all. They don't know what's going on, they don't know how to perform the procedure, or they just don't remember. So, I can dial in an expert to look over their shoulder and walk that individual through it, because on the laptop the expert is seeing exactly what the person out in the field is seeing. Think of it like a video game, like a first-person shooter: you see through the eyes of that person, and you're having a voice conversation with them, because of course we have speakers and an audio jack, so you can put a headset on or just hear what the person's saying. You can talk through our microphones, which dampen the sound all around you so the person can hear you very clearly, even in noisy environments. So you're having this very easy conversation, but you're also sharing a visual: they see on their laptop screen, and you see on your micro screen above your eye, your little heads-up display, the same scene. Now I can mark it up on my laptop: “this switch, or this valve, this is what I'm talking about.” This is probably 80 percent of all of our use cases right now, and it ranges across almost every single industry vertical: construction, transportation, automotive, food and beverage, I could go on and on. All of these places have machines or equipment that need to be troubleshot when they go down, and especially when those machines go down and start affecting revenue, unplanned downtime is a killer. If you can get those machines up and running, nobody really cares whether they have to fly a guy out or not; what they actually care about is how fast they can get the conveyor moving boxes again at an Amazon or Walmart warehouse, or get the refinery that stopped pumping oil back online. If they stop pumping oil, they stop pumping revenue. Getting those back up and running faster is the second use case, and probably the biggest one we have in all dimensions right now.

ZP: Sure. Could you share a little bit more about what the best deployment strategy would be for a plant to have a few headsets? Imagine a steel company that's got 200 divisions around the country. You know, the bottleneck for a lot of this, when we're looking at HoloLens or other devices that cost $5,000 or so, is that people run into this idea that they're never going to utilize the device. So, how would you recommend a company roll out to multiple factories or refineries or steel mills? Is it device sharing? Or is it one or two devices on site? Because otherwise it's faster just to fly somebody out if the device isn't there. So how do you solve that?

AL: Yeah, so generally we've seen two modalities for what you're talking about. The first is onsite, in one location, with shared devices. Say they have 100 people at a Colgate factory. They're going to buy three to five devices for the whole factory, because they're not fixing things every day. So hey, bring out the HMT-1, clip it to your hard helmet, all right, go out there and do this maintenance procedure. Or we've also seen companies like Palfinger, for example, in Germany, actually shipping a device with an asset. They might sell a crane system, like the one I see parked outside of your office here, and they'll ship an HMT-1 with it. In that case, they can offer a higher level of support service: “Hey, you've got gold, platinum, and all of these tiers; we've got the RealWear level of support, because we're literally going to have your maintenance technician dialed in with our maintenance technician, and if anything breaks on this system or this crane out there, we're going to help you through it.” So I've seen both: the company that supplies the equipment sending an HMT-1 with each system, and folks in the factory getting a half dozen or so.

Now, as we move into some of the other use cases, a sharing modality actually doesn't work. You have to have a device per person. The one I'll talk about next is workflow. Workflow is procedures. It's checklists. It's preventive maintenance type operations, but it's also just operations, like operating a plant. In a lot of areas like nuclear power, where I came from, you have to have everything written down. You can't do any procedure purely from memory. You have to know what you're doing, and you're required to spend your whole day with either a tablet, paper instructions, or, in our case, the head-mounted computer, the HMT-1, and have those procedures brought up as you're doing the actual operation. Then, on the flip side, you're entering data as you go: 40 PSI is the reading on such-and-such gauge, the fuel level is full, the fuel level is half full. You're actually taking care of those checklists on the fly, because the procedure asks you, “What's the temperature?” and you say, “The temperature is 30 degrees Celsius.” It logs that into the workflow procedure of one of our software partners and creates a report that gets pushed to the cloud once you have connectivity; it just sweeps it up. So there are no additional steps, if you will. That's the power of that, and now you're getting into something where every individual with responsibility for maintenance and repair would need their own HMT-1 to perform their day-to-day tasks.
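[Editor's note: a minimal sketch of the voice-driven checklist capture Andy describes here; the procedure names, readings, and serialization below are hypothetical illustrations, not RealWear's or its software partners' actual products.]

```python
# Minimal sketch of a voice-driven checklist: each dictated reading is logged as a
# structured entry, and the completed checklist is serialized so it can be pushed
# to the cloud once the headset regains connectivity. All names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
import json

@dataclass
class ChecklistEntry:
    step: str           # e.g. "Discharge pressure"
    value: str          # e.g. "40 PSI", as captured from speech-to-text
    recorded_at: str    # UTC timestamp

@dataclass
class Checklist:
    procedure_id: str
    entries: list = field(default_factory=list)

    def log_reading(self, step: str, spoken_value: str) -> None:
        """Record a reading the technician dictated while working hands-free."""
        self.entries.append(ChecklistEntry(
            step=step,
            value=spoken_value,
            recorded_at=datetime.now(timezone.utc).isoformat(),
        ))

    def to_report(self) -> str:
        """Serialize the completed checklist for upload when connectivity returns."""
        return json.dumps(
            {"procedure": self.procedure_id,
             "entries": [vars(e) for e in self.entries]},
            indent=2,
        )

# Usage: the technician works through the procedure, dictating values as they go.
pm = Checklist(procedure_id="pump-37-preventive-maintenance")
pm.log_reading("Discharge pressure", "40 PSI")
pm.log_reading("Coolant temperature", "30 degrees Celsius")
print(pm.to_report())
```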

And then the final, and maybe the most complex, solution: in a lot of process areas you've maybe seen the control rooms with 100 screens, IoT data coming at you from every direction on all these flat screens. You're looking at them, you're seeing all this stuff. Then you've got the guy in the control room who's about to go out and perform the operation, and you say, “Okay, go out there, and here's your Motorola walkie-talkie. We're going to convey all this information to you through this little walkie-talkie as you're performing your procedure.” Instead of that, what the HMT-1 can do is actually tap into those same, what they call, operational technology streams.
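[Editor's note: a minimal, illustrative sketch of what "tapping into an operational technology stream" could look like, using a plain publish/subscribe pattern; the tag names and display hook are hypothetical, not RealWear's software or any specific OT protocol.]

```python
# Sketch: instead of relaying control-room readings over a walkie-talkie, subscribe
# the headset's display to the same operational-technology (OT) data stream the
# control room sees. A plain publish/subscribe class stands in for a real OT gateway.
from typing import Callable, List

class TelemetryStream:
    """Stand-in for a plant data source (historian, OPC UA or MQTT gateway, etc.)."""
    def __init__(self) -> None:
        self._subscribers: List[Callable[[str, float], None]] = []

    def subscribe(self, callback: Callable[[str, float], None]) -> None:
        self._subscribers.append(callback)

    def publish(self, tag: str, value: float) -> None:
        for callback in self._subscribers:
            callback(tag, value)

def headset_display(tag: str, value: float) -> None:
    # In a real deployment this would render on the headset's heads-up display.
    print(f"[HMT-1 HUD] {tag} = {value}")

stream = TelemetryStream()
stream.subscribe(headset_display)              # control-room screens could subscribe too
stream.publish("pump-37/discharge-psi", 40.0)
stream.publish("boiler-2/temp-celsius", 30.0)
```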

ZP: Is there anything else you'd want to talk about?

AL: Yeah, there's one thing I'd like to talk about. It's a thought-leadership thing that's kind of neat. It has to do with this word I've discovered. I wasn't the one that made it up, but it's called a centaur. It's a cool story, if you want to hear it.

All right, you know, a lot of people talk about man versus machine, right? You see a lot of folks worried about jobs being sucked up by automation and robots and things like that. There's a story that maybe the listeners have heard about a guy named Garry Kasparov. He was a grandmaster chess champion back in the mid-90s, and he was pitted against an IBM computer called Deep Blue. In 1997, Garry lost to the computer, and the whole world rippled with this idea that computers are now smarter than people. What are we going to do? But not Garry. Garry sat there, as this genius would do, looked at the other side of the table, and said, “Hmm, yes, I lost to the computer by a game, but the computer didn't play chess anything like the way I play chess; there were two different games being played here. So what if I were not to be afraid of the computer, but swing around to the other side of the table and partner with the computer? Now it's not man versus machine, it's man with machine.” And so he did. He created a new form of chess called advanced chess, and it's played today. There are folks who play it everywhere, and it's the highest form of chess ever played on earth, because you get error-proofing and a vast number of tactical moves that the computer generates, and then you have the strategy, the emotion, the cleverness, and the creativity of the human brain, which even the best computers today can't come close to producing. So you have both elements there in a sort of harmony, and what he called that was a centaur chess player. A centaur, for those who don't know, is an old mythical creature from back in the days of the Greeks and Romans that was half horse, half human. He used that word because it was organic: two things in harmony with one another, like an organic creature. The horse had power and tactics and speed, and the man portion of the body was strategy and thought and intelligence. That was a centaur. So he began to call the players of this type of chess centaurs, or centaur chess players.

What I see us doing in this arena of wearable computing is exactly the same thing. I'm trying to harmonize the best of what computers do, error-proofing, procedures, memory of different tactics and things like that, with the intelligence, the abstract thought, and the creativity of a person. By blending those together, we're creating, in essence, a centaur workforce. It's all about that knowledge transfer. But it's a fleet of centaurs out there versus a fleet of your old-school industrial-age workers, who just had to have it all in their brains.

ZP: Yeah. It's funny. Elon Musk puts it pretty plainly that we've been centaurs for a long time. The second you started keeping stored images and documents on a computer, you created this tertiary brain.

AL: Exactly.

ZP: Right? So this idea that we utilize computing power and storage outside of our brain is that harmony. Right?

AL: That's right.

ZP: So, it'll be very interesting to see how we train workers and how workers in the field adapt their skills, what we're good at as humans, as opposed to what computational power and storage do for us over time, with devices like RealWear's HMT.

AL: Absolutely.

ZP: I mean, computer vision will be a big part of the device over time, object detection and recognition, even gas detection and heat and radiation and that sort of stuff. So, it's a fascinating space to be in. Thanks for sharing.

AL: Yeah.

MH: Thanks for listening to LITES. It's Leaders in Industrial Technology, Education, and Safety. To learn more, visit lites.org. LITES is a production of ITI. If you enjoyed the show, please remember to rate, review, and subscribe. Our producer is Michael Montaine, and today's guest was Andy Lowery with RealWear.

I’m your host, Mandy Henry, and we look forward to you tuning in next time.
