162: I Am a Boomerang Enthusiast

Transcript from 162: I Am a Boomerang Enthusiast with Alan Yates, Elecia White, and Christopher White.

EW (00:00:02):

You are listening to Embedded. I'm Elecia White. My co-host is Christopher White. We have finally reached virtual reality week. Yay! In fact, Valve's Alan Yates is going to be speaking with us about building the Vive VR system. Now last week, when I interviewed Kat Scott about computer vision, I read most of the books she coauthored. This week I'm really glad to be talking about VR, because all I had to do was spend several hours playing games. And if you don't have your own VR system, and want homework before listening to the show, iFixit did a teardown of the Vive system, showing what electronics are there, and how they're connected. Just so you know, we're going to get to some details here.

CW (00:00:50):

Hi Alan. It's great to have you here today.

AY (00:00:52):

Hello, Chris. It's great to be here. And you too, Elecia.

EW (00:00:56):

Could you tell us about yourself?

AY (00:00:59):

Sure. Well, I've been at Valve now for about four and a half years. I'm a hardware, slash software, slash anything goes kind of engineer. I've been working primarily in the VR group, and responsible largely for the tracking system on the Vive, and many other subsystems as well. Before Valve, I used to work as a network hardware person. And then before that, I used to do web stuff, websites, terascale kind of web, and long before that, going right back, I used to be an internet service provider, right out of uni. I actually left uni to do that for many years until it became kind of unprofitable and boring.

CW (00:01:42):

Wow. That's quite a diverse set of things.

EW (00:01:44):

Yeah.

CW (00:01:44):

That's great.

AY (00:01:45):

Yeah. I've been a lot of things over the years. It's been quite a journey to get here, and obviously the journey continues.

EW (00:01:53):

So we're going to go straight to lightning round where we ask you questions and want short answers. And then if we are behaving, we won't ask you for follow-up, but eh, that never works.

AY (00:02:03):

Okay. Sounds fun.

EW (00:02:05):

Chris, you want to start?

CW (00:02:06):

Sure. Favorite electrical component.

AY (00:02:09):

BJT.

CW (00:02:11):

Pretend I'm a software engineer.

EW (00:02:12):

[Laughter].

AY (00:02:14):

Plain old-fashioned bipolar transistor.

CW (00:02:16):

Oh, right. Okay. Least favorite.

AY (00:02:19):

Least favorite. Ooh. I've got to say Zener diode.

EW (00:02:26):

Okay. What is your favorite spacecraft?

AY (00:02:31):

Ooh, favorite spacecraft. I've always liked the Mercury capsules.

EW (00:02:37):

Yeah. Because you could understand the whole thing.

AY (00:02:40):

Well, exactly -

CW (00:02:40):

They seem a little cramped. [Laughter].

EW (00:02:41):

Well -

AY (00:02:41):

Oh, yeah, very cramped, but yeah, definitely built at a time when you could understand everything that was going on in them.

CW (00:02:47):

What is the most important tool for your job? A whiteboard, soldering iron, or keyboard and mouse?

AY (00:02:53):

Ooh, depends on which particular job I'm doing that day, which hat I'm wearing. Probably I'm going to say soldering iron.

EW (00:03:01):

8 or 32 bit?

AY (00:03:03):

Ooh, [laughter], 32 bit.

CW (00:03:09):

Favorite wave, ocean, sound, or electromagnetic.

EW (00:03:11):

[Laughter].

AY (00:03:15):

What? [Laughter]. Let's go with electromagnetic.

CW (00:03:17):

[Laughter].

EW (00:03:20):

Most exciting science fiction concept that is likely to become a reality in our lifetimes, not including virtual reality.

AY (00:03:29):

Ooh, mind uploading.

CW (00:03:33):

Terrifying.

EW (00:03:33):

Yeah. [Laughter].

AY (00:03:35):

Oh yeah. [Laughter].

CW (00:03:36):

Let's go to a favorite fictional robot after that and see if they're connected. [Laughter].

EW (00:03:42):

[Laughter].

AY (00:03:42):

[Laughter]. I'm thinking Twiki.

CW (00:03:44):

Twiki? [Laughter]. Bee-dee-bee-deep.

EW (00:03:47):

[Laughter].

AY (00:03:47):

Exactly. I used to love that when I was a kid.

CW (00:03:48):

[Laughter].

EW (00:03:50):

Speaking of being a kid, what did you want to be when you grew up?

AY (00:03:54):

Ooh, now that's a... I actually wanted for a long time to be a doctor, but I ended up, well, I ended up being a software engineer. But eventually ended up in electrical. So I wanted to be a lot of things.

CW (00:04:06):

Favorite physical constant.

AY (00:04:09):

Mmm. I was going to say the gyromagnetic ratio, but you know what, I think the fine-structure constant.

CW (00:04:17):

Alright, cool.

EW (00:04:20):

Science, technology, engineering or math?

AY (00:04:24):

All of the above. If I have to pick one, engineering.

EW (00:04:29):

Okay. And this one Chris tells me is a little too obscure, but I think it has to do with the show. And so I really want to have an honest answer here.

AY (00:04:40):

Okay.

EW (00:04:40):

Phil King or Jeff Keyzer?

AY (00:04:41):

Oh, no! [Laughter].

EW (00:04:41):

[Laughter].

AY (00:04:46):

Okay. You're going to edit this out, right? [Laughter].

EW (00:04:48):

[Laughter].

CW (00:04:48):

The show will never air.

AY (00:04:52):

It depends on what I want done, I guess.

EW (00:04:55):

[Laughter].

CW (00:04:58):

How diplomatic. Diplomat. [Laughter].

AY (00:04:58):

Is that good enough? Because [laughter] -

CW (00:05:01):

[Laughter].

AY (00:05:03):

I can just see them on Monday coming in and killing me.

EW (00:05:06):

Oh no, it would be Wednesday. This doesn't air until Tuesday. [Laughter].

AY (00:05:07):

[Laughter].

EW (00:05:09):

You have days to live yet. [Laughter].

AY (00:05:10):

Days and days.

EW (00:05:14):

Okay. What is a work day like for you typically?

AY (00:05:17):

For me? Ooh, it can vary. I'm not a morning person, so I tend to come in a little later, and then...hopefully won't have too many meetings, and I can actually get some work done. Lately we've been obviously working with a lot of partners, and it's been, that part of my day is not always fun. But then there's, try and get through some email, do some, whatever the problem of the day may be. And that can be incredibly varied.

AY (00:05:45):

That can be something that's completely outside my domain that I've never even thought of before. Or it can be something incredibly mundane, like calculate this dropper resistor for this LED, or it could be hiring new people. It could be assembling some furniture that's just been delivered. Yesterday, I had to assemble a chest of drawers that I want to put all components in so that I can do some rapid prototyping. It can be just about anything.

EW (00:06:09):

I like that. I like it when my job is that varied.

CW (00:06:12):

Does it still seem like startup life? You mentioned putting together furniture and that's something that, to me, is like, "Oh, that's what we did at startups because there was nobody else to do it."

AY (00:06:19):

Absolutely. Valve is very much like that...Everyone does everything. There's no real support people. I mean, we have obviously people that do HR and have more dedicated tasks, but most of us, whatever has to be done, someone has to do it...It's not glamorous you could say, but it's very equalizing, I think. Everyone has to do the hard work as well as the good stuff.

EW (00:06:46):

And sometimes when you have a hard technical problem, building a chest of drawers is exactly the right thing to get it done.

AY (00:06:53):

Oh yes. It can take your mind off things...Actually, yesterday when I was putting together that thing, I came up with a couple of ideas for some high-frequency sensor stuff that I want to play with next week.

EW (00:07:05):

Okay. So the Vive VR system,...isn't marked Valve, it's marked HTC, but I don't even want to talk about all of that. I just want to talk about the system.

AY (00:07:18):

Sure.

EW (00:07:18):

...Tell us how it works.

CW (00:07:21):

[Laughter].

AY (00:07:21):

[Laughter].

EW (00:07:21):

As though somebody had never seen it. Give us an introduction to the system -

AY (00:07:28):

Okay.

EW (00:07:28):

- for complete newbies. We'll get into more detail, because Chris and I have one, and I have way more detailed questions.

AY (00:07:34):

Okay. I guess for someone that's never seen it before, it's the closest thing we have to a holodeck, is probably how I would put it. You don't, the room doesn't magically change around you. I mean, obviously it's a display device that you put on your face, but then through that, it has a good tracking system, so that it knows precisely where your head is and you have two controls that you hold. So you know exactly where your hands are. And really, it opens up almost an unlimited number of experiences that content people can make to take you other places. That's kind of how I would describe it.

EW (00:08:08):

Well, and "the other places" is a nice thing to emphasize, because one of the technology demos is, you're under water, just sort of sitting on this pirate ship, and a whale comes by. And when you look at the comparison of your size versus the whale's, it's correct, but you don't really think about how huge whales are, until there's one standing next to you.

CW (00:08:31):

Well, I mean -

EW (00:08:31):

Swimming next to you?

CW (00:08:33):

Yeah, they don't tend to stand.

EW (00:08:33):

Being next to you.

AY (00:08:34):

Yeah, absolutely. I didn't really have an appreciation for how large, I think that's a blue whale, they really are until, I mean, I've seen their skeletons, but to see one swim up alongside you, it's like, "Wow." ...It's such a different place. I'm not much of a scuba diver. I don't know that I'd ever have that experience any other way.

EW (00:08:53):

Yeah. And that's just a tech demo that you don't even interact with. There are a couple others in that same theBlu set that, you can play with the fish that are swimming by, but it isn't really much.

AY (00:09:04):

Yeah.

EW (00:09:04):

But when we have, what is the Space Pirate game? Is that what it's called? And you shoot at little balls?

AY (00:09:16):

Yeah, Space Pirate Trainer.

EW (00:09:16):

So that one not only are you in a world, and experiencing that somewhat scary world as things shoot at you, you are also interacting back, and that's the controllers.

AY (00:09:32):

Yeah. The controllers really are what makes the system. I mean, the display is amazing and the tracking and everything else, there's a whole lot of really cool tech there, but getting your hands in it really is what makes it something else. Otherwise, as you said, it's a fairly passive experience.

AY (00:09:46):

You can sit there, you can look around, maybe you have a controller and you can move yourself around in the world, although that will cause nausea. We know that quite reliably...Having your own action in the world, your own freedom to move about and touch things, interact with them, is really what gives VR its power to take you to some other completely different experience that you've never had before.

CW (00:10:10):

And it causes you to experience extreme muscle soreness in muscles you've never experienced before.

AY (00:10:16):

[Laughter]. Yes, absolutely. There's a lot of people, I read the Reddit forums obviously, and there's a whole bunch of people saying, "Oh my God, my thighs," they've got their leg workout in only a couple of minutes.

EW (00:10:29):

Well, I don't think people who've never tried it understand that part, just how exhausting it is to hold up the controller as you're playing lightsabers.

CW (00:10:40):

Or crouching a lot, moving from crouch position.

EW (00:10:42):

There's a lot of crouching.

CW (00:10:44):

Yeah.

EW (00:10:44):

And I do this thing where there's balls that come at you, and you have to block them with shields, and it's to music. And it's like punching and dancing at the same time. It's exhausting.

AY (00:10:56):

Oh yeah. That's a fun workout, isn't it? I love that one.

EW (00:10:59):

So what were the most difficult and foreseeable challenges with building a system like this?

AY (00:11:06):

I think getting everything right. Throughout the years, right back from before the nineties when the first kind of consumer VR came about, everyone's had kind of bits and pieces of the problem. Display technology took a while to mature. And now obviously with MEMS, and other technologies, we've got much better tracking than we ever had before. But getting everything sort of together and getting the optics right. And getting the tracking right. Getting the latency low enough.

AY (00:11:34):

And then also GPU power being sufficient to actually render sort of photorealistic worlds. Obviously all that's going to improve in the future, but it's now at a point where it's good enough that it can convince your brain, your neural system, that you are actually in a different world, and a world that can range from cartoony, which is surprisingly compatible. People like Job Simulator, for example, the world's relatively cartoony, but it does seem very real to you when you're in it, to the almost photorealistic things like theBlu.

EW (00:12:10):

Have people talked to you about having this world, the real world, be weird after VR?

AY (00:12:20):

Yes. We don't have a name for that yet, but there does seem to be a post-VR kind of thing when you come out of it, particularly people who've...never used it before. Basically, the first time they come out, you see their eyes are wide open and they're like, "Wow, mind blown." And for some people, I guess it's like the Tetris effect, they kind of experience an aftereffect for a little while.

AY (00:12:39):

We did know from research into the vestibular system that people... We're actually stimulating the vestibular system more or less correctly. Because your motion is one-to-one in the world, but the optics aren't exactly how the real world behaves. So there is kind of an effect there where people develop a second slot, if you like, where they kind of interact with virtual reality a little bit differently than reality.

EW (00:13:04):

And the vestibular system is what's responsible for your equilibrium. That's like your inner ear and the way your body is, your proprioception, how you feel where things are. Is that right?

AY (00:13:16):

Yes, absolutely. Yeah. It's probably the most difficult sense to, well, it's the one that will make you sick if you get it wrong. Let's put it that way.

CW (00:13:24):

So did you find, did you have in the back of your mind when you guys started this kind of thing that you'd have to do kind of medical research, or at least a little bit of physiological research to understand how this affects the body? Were there new things that you kind of came across that you didn't expect?

AY (00:13:42):

Yeah, we went through many of the old papers. Obviously the military had done a lot of experiments, because they've wanted to do simulations for training soldiers for a long, long time. And there's much research out there. Unfortunately, a lot of it was done at a time when latencies were much higher, and the general fidelity of the systems was very poor.

AY (00:14:01):

And as a result, much of that research unfortunately is kind of tainted by that. A lot of it's actually quite valid though. Much of the stuff about simulator sickness was certainly borne out by our experiments. We had experiments where we tried to acclimatize people to undirected motion where they're using a keyboard or a mouse and/or a stick. And that makes most people quite ill, and we would expose people to that for a little period of time and then would back off.

AY (00:14:28):

And then once a day they'd come in, and play until they sort of felt unwell, and then they'd take it off immediately and not do anything else for the rest of the day. And by doing that, they could build up some tolerance to it. But some percentage of the candidates, the test people, just couldn't do it. They could never develop VR legs. So that's where we decided that we just shouldn't do it that way. And we should give people that autonomy in the world to move around, and for the motion to be one-to-one, so that we just sort of completely bypass that effect of causing nausea from vection in their visual field.

CW (00:15:03):

I don't want to ask how your test area was set up.

AY (00:15:06):

[Laughter].

EW (00:15:08):

Plastic and buckets.

CW (00:15:08):

[Laughter].

AY (00:15:11):

[Laughter]. No one actually got quite that sick, but there was definitely a lot of ginger consumed after some of those early experiments.

CW (00:15:17):

[Laughter].

EW (00:15:19):

So did you, through those experiments, come to a, "It has to be this fast, it has to be this accurate" specification stage?

AY (00:15:29):

Yes, we got a pretty good idea of the latency that we needed and the accuracy of tracking that we needed. And that's kind of what drove a lot of our development after that. It did give us a reasonable idea of what display quality was required. And unfortunately, as far as the displays go, the current ones, and any of the foreseeable ones for the next couple of years, are not quite good enough, but we know now where we need to be.

EW (00:15:58):

That was going to be one of my questions, is when you got those numbers, and you looked at them, and you thought about them, did you realize "Holy crap, there's no way we can do this?"

AY (00:16:06):

Pretty much. Yeah.

CW (00:16:07):

[Laughter].

AY (00:16:07):

[Laughter]. I mean the latency and things like that, I just thought, "Okay, obviously we can do that. That's just a matter of engineering, and the speed of light, be damned," kind of thing. But when I looked at the numbers and looked at where GPU performance was, it's like, "Okay." For many years, GPUs had kind of stalled out, and it was getting difficult for the GPU manufacturers to sell them, because people didn't need any more than 90 hertz or whatever they could run at. And now we've given them reason to have many orders of magnitude improvement.

CW (00:16:37):

Which is happening pretty much this year. I mean, the cards that have come out in the last month or so are just starting to be "Okay, this is really capable for doing this."

AY (00:16:47):

Absolutely. Yeah. So, we decided that 90 hertz was kind of the minimum that you'd need. And you have to render, in stereo, however many pixels you have. So two times that at 90 hertz and you can't drop a frame. So that's a pretty high bar, although yep, the silicon guys have certainly stepped up to it. And they're obviously going to enjoy the new tail of that as technology improves.
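
[Editor's note: to put rough numbers on "two times that at 90 hertz," here is a back-of-the-envelope sketch in Python. The 1080 by 1200 pixels-per-eye figure is the shipping Vive's published panel spec rather than a number quoted in this conversation, and the calculation ignores the extra supersampling headsets typically render with.]

    # Rough render-budget arithmetic for a 90 Hz stereo headset.
    # Per-eye resolution (1080x1200) is the published Vive panel spec,
    # not a figure from this conversation.
    eye_w, eye_h = 1080, 1200
    refresh_hz = 90

    pixels_per_frame = 2 * eye_w * eye_h            # both eyes, every frame
    pixels_per_second = pixels_per_frame * refresh_hz
    frame_budget_ms = 1000.0 / refresh_hz           # time allowed to render one frame

    print(pixels_per_second / 1e6)  # ~233 Mpix/s, before any supersampling
    print(frame_budget_ms)          # ~11.1 ms, and you can't drop a frame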

EW (00:17:15):

So motion sickness seems like one of those things that you knew you had to battle, and getting the GPUs fast enough to be able to handle the latency is another area. Did you ever sit down and think, "Well, how are we going to put this person in a room? How are we going to track their head?"

AY (00:17:36):

Absolutely. Yeah. Tracking was, tracking has always been the difficult problem in many ways. I mean, display is interesting, but you could see that it's, Moore's law is going to help you there. There wasn't anything particularly difficult about it, apart from just getting that many gigabits to the headset. In terms of tracking though, that was one of those things where we need submillimeter, kind of milliradian, pointing error, kind of tracking over whatever volume we're going to put the person within.

AY (00:18:07):

And for a long time, it seemed like that would be an expensive problem, or we could only do it to a certain amount of fidelity, and that's where kind of Lighthouse and other tracking systems like it came from. There were actually many other tracking systems that we developed too. Well, when I sort of first arrived at Valve, I sort of took on the tracking problem.

AY (00:18:27):

And I went through all of the physical ways that you can possibly track the position and orientation of something. In this universe, there's only so many different ways you can do it. You can either have it emit some radiation, or you can sort of throw some radiation at it and get a shadow or a reflection of that, or matter, or whatever. I mean, you could potentially do it with X-rays or a whole bunch of other different things, but you don't want to fry the person in the room.

EW (00:18:48):

[Laughter].

CW (00:18:48):

[Laughter]. That's usually frowned upon.

AY (00:18:51):

Yeah. There's some physical limits on what's possible. Many of the early tracking systems, like the Sword of Damocles system, were essentially mechanical. They had tendons coming off onto big mechanical things that hung the system from the ceiling. And that obviously was super low latency, but there was resistance there, and it was very, very expensive. You had to instrument the environment.

AY (00:19:11):

One of our tracking systems, the other one that you've probably seen, is with the fiducial markers on the wall. That system is actually very good, and simple to implement. You've got one camera, although there are some technical challenges associated with making it work well. It's not very deployable. It's not something you can ask people to do, to paper their walls with these fiducial markers. Although I'm sure some percentage of the people out there would totally do that.

EW (00:19:36):

Some people, yes. [Laughter].

AY (00:19:36):

Yeah, it's one of those things where we needed a much more fieldable solution. So we developed a couple of different tracking techs, and Lighthouse is kind of the last in a long line and probably not the last kind of tracking system we will develop.

EW (00:19:50):

Okay. I want to ask about Lighthouse in a second, but why couldn't you just use MEMS sensors?

AY (00:19:55):

Right. So MEMS sensors are accelerometers and gyros. They give you angular rates, or they give you accelerations, and then to get position, you obviously have to doubly integrate acceleration. So integrate once to get velocity, and then you integrate again, and all the noise adds up. And basically you can't, they drift, right? They have biases and the noise just kills you. If you free-integrate an IMU for more than a couple hundred milliseconds, it will literally fly out of the room.
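
[Editor's note: a minimal sketch of the double-integration drift Alan describes, with a made-up accelerometer bias and noise level chosen only to show the quadratic growth; the numbers are illustrative, not measurements of any real IMU.]

    import random

    # Free-integrating a single accelerometer axis that is actually at rest.
    # A small uncorrected bias plus white noise makes the doubly integrated
    # position drift quadratically; a real 3D IMU, with gyro error leaking
    # into gravity compensation, diverges even faster.
    dt = 0.001             # 1 kHz sample rate
    bias = 0.05            # m/s^2, hypothetical uncorrected bias
    noise_sigma = 0.02     # m/s^2, hypothetical per-sample noise

    velocity = 0.0
    position = 0.0
    for step in range(1, 5001):                  # five seconds of samples
        accel = bias + random.gauss(0.0, noise_sigma)
        velocity += accel * dt                   # first integration
        position += velocity * dt                # second integration
        if step % 1000 == 0:
            print(step * dt, position)           # ~2.5 cm at 1 s, ~0.6 m at 5 s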

EW (00:20:24):

Yep. Yep, yep. Sorry. I love inertial sensors, and trying to explain why they're never quite good enough for what you need... It's a hobby.

CW (00:20:34):

[Laughter].

AY (00:20:34):

Absolutely. Yeah. Currently, one of my little projects, I'm a boomerang enthusiast and I've been trying to track an instrument, a boomerang, in flight. And inertial sensors are good, but they're not good enough. So I'm probably going to put optical tracking on a boomerang at some point.

EW (00:20:50):

And GPS isn't fast enough so you can't do that.

AY (00:20:54):

Yeah.

EW (00:20:54):

Unless you get a really expensive IMU, you can't integrate through GPS.

CW (00:21:00):

Which you're not going to be able to put on a boomerang probably.

EW (00:21:00):

Right, well -

AY (00:21:02):

Yeah, absolutely.

EW (00:21:03):

The Lighthouse system, which is what Vive uses, is sort of the opposite of what Oculus Rift uses. Can you compare the two technologies for positioning people?

AY (00:21:17):

Sure. Yeah, they are kind of mathematical duals, but they're not quite, there's a couple of subtleties in the differences between them. The Oculus, let's start with that one, and similar tracking systems. There have been many blinking-LED, or synchronized LED and camera, systems, with either pixelated camera arrays or linear camera arrays. There are some commercial mocap systems that use linear cameras, but they're outside looking in, essentially.

AY (00:21:44):

They have camera sensors that have angular, angle of arrival measurement, basically, of some kind of emitter. In this case, LEDs, in the volume, and some constellation of those, that's probably why they call it Constellation. Some configuration of those points is rigid, and the cameras can see it from different angles. And basically they do a perspective-n-point kind of solve, an inversion of the perspective projection that the cameras see of those LEDs. Lighthouse kind of works the other way around, instead of - sorry.

EW (00:22:17):

Before we go there. I just want to clarify that with the Oculus, there are LEDs on that headset, and there are cameras around the room.

AY (00:22:28):

Correct. Yes.

EW (00:22:30):

Okay, I just wanted, I know you said that, but it was a little tough. And that ends up being a time difference of arrival problem if you have multiple cameras. As well as there's the fixed body, and so you can kind of see where the person is because they've got this LED here, and that LED there, and this other LED here. And it arrived at this camera then, and it arrived at that camera there. And so you can calculate which way the person is facing and where they are in the room based on the LEDs they're emitting in a particular pattern.

AY (00:23:02):

Yeah. Yeah. It's not so much time of arrival as angle of arrival. Basically a lens turns angle into spatial position on the sensor. So there's some mapping, kind of like a tan-theta mapping, between which pixel the image of that LED appears to arrive at and angle space. So then you know where the cameras are, or you can work out where the cameras are relative to each other, if you have more than one.

AY (00:23:28):

I mean, they do, their basic kit is very similar to ours. It'll track with a single camera, or we can track with a single Base Station, because you have multiple points. So the orientation, as well as position, of that object makes a pattern on the sensor that you can then fit a pose in three-space to, that six-dimensional number basically that says where that object is.
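
[Editor's note: a sketch of the pixel-to-angle mapping just described, assuming an idealized pinhole (tan-theta) camera model; the focal length, pixel pitch, and principal point are hypothetical example values, and a real camera model would add lens-distortion terms on top of this.]

    import math

    # Idealized pinhole model: a point at bearing angle theta lands at
    # x = f * tan(theta) on the sensor, so inverting that recovers the
    # angle of arrival from the pixel coordinate of an LED blob.
    focal_length_mm = 4.0        # hypothetical lens focal length
    pixel_pitch_mm = 0.003       # hypothetical 3 micron pixels
    principal_point_px = 640     # hypothetical image center (column)

    def pixel_to_angle(col_px):
        """Bearing angle (radians) of a blob centered at column col_px."""
        x_mm = (col_px - principal_point_px) * pixel_pitch_mm
        return math.atan2(x_mm, focal_length_mm)

    print(math.degrees(pixel_to_angle(900)))   # blob right of center, ~11 degrees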

EW (00:23:51):

Cool. Okay. Now Lighthouse works the other way. It puts lights on the outside and the detectors on the headset, right?

AY (00:24:01):

Yeah. Correct. Yeah. So it's kind of the other way around. So it is in a sense the mathematical dual, or you could call it the phase conjugate of the propagation of the radiation, but it's a little bit more complicated than that. It has, obviously there's some disadvantages to doing that. You have to put sensors now rather than just LEDs on the tracked object and it makes the tracked object a little bit more complicated, but it has other advantages.

AY (00:24:25):

...For example, cameras being built using glass and silicon, they have a finite field of view,...and finite depth of field. You're kind of limited by what you can do with optics. And that tends to scale in price quite aggressively. And also if you have multiple objects in the room, the camera is kind of where all of that data is being received. And if you want to do multiperson interactions, where you have multiple people or multiple tracked objects in the volume, you have to then get all that information out of your tracking system and send it somewhere else. So you either have to send it to the computers that are doing the rendering, or to the tracked objects themselves, if they want to be fully autonomous.

AY (00:25:09):

So Lighthouse kind of inverts that problem and makes the complex part, the solving part of the problem, the responsibility of the tracked object itself. So it's more like GPS. You have these, we call them Base Stations. They emit essentially a structured, time-based light output. And by looking out at the field of lights emitted by one or more Base Stations, you as the tracked object, you know where your sensors are, and you see these pulses coming in at different times, and you can work out where you are.

AY (00:25:39):

So in many ways it's actually quite similar to GPS. The complexity is mostly in the receiver, and the receiver has the ability to autonomously work out where it is. So it scales very well. You can have as many objects in the volume as you can put in there essentially, but they are a little bit more complicated.

EW (00:25:57):

And when I'm playing my Space Pirate Trainer game, I have three tracked objects. I have my head, which is also providing display for me, but that's not important for tracking. And I have the two controllers, which are acting as my guns or shields in the game. Is that right? I have three tracked objects, or does it not work that way?

AY (00:26:19):

Yeah, that is three tracked objects. Each one of those has its own independent, we call it a Watchman device, which is kind of the receiver that takes all those signals and correlates them. At the moment, it sends it back to your PC, because that's where all the rendering is happening. The advantage is if you want a backpack computer, or have a completely autonomous mobile device that was just, for example, built into the headset, it could do everything that it needed to do locally. It wouldn't have to go via a lead or radio or whatever to cameras that were mounted on the edges of the volume.

EW (00:26:53):

Right. If it was the other way, as with Oculus, then the cameras have the tracked object's information. And so they have to join up somewhere, and having the detectors on the headset means that, ideally when computers are small and even faster and more power efficient, this all can be on my back or on my head.

AY (00:27:20):

Correct. Yeah. So that's one of its primary advantages in terms of scalability. The cameras sort of suffer from a kind of Olbers' paradox kind of thing. The "Why is the sky not bright at night?" kind of problem. If there's a bunch of things in the room that are all trying to be tracked by cameras, they all have to run different codes. And obviously you have to deal with code space, so you have to track them in angle space. All of these problems are certainly solvable, but some of them, you sort of just get for free if you turn the system inside out.

AY (00:27:48):

There's also advantages in depth of field. The lasers,...they have very low divergence, so you essentially have infinite depth of field. And...because of various things about the way the system works, we can get pretty good angular resolution, probably better than what you can get with an equivalently priced camera, just by virtue of the fact that we have these low divergence beams. And we have very accurate spinning of the motors in the Base Stations.

EW (00:28:14):

Okay, now this we need to talk about more because we've been kind of saying cameras and detectors and LEDs, but what it actually is is the headset has a whole bunch of detectors, photodiodes. And then these Lighthouse things that you've made put out, you said structured information, but I was told it was like a grid and there are two in the room sort of at opposite ends, sort of diagonal was how we did it.

AY (00:28:47):

[Affirmative].

EW (00:28:47):

And they put out slightly different grids. Is that right?

AY (00:28:53):

Yeah. The current, the system that you're using at the moment, is one of many different ways you can set up Lighthouse. So you can actually track with a single Base Station, but when you turn your back on a Base Station, for example, you would shield the controllers. I mean, like all optical systems, it depends on line of sight. And like I was joking about before, we could use X-rays or some other kind of penetrating radiation and go right through bodies and occluding objects, but that would also probably cause a lot of cancer in the process, but -

CW (00:29:19):

Neutrinos are fine.

AY (00:29:21):

Yeah. Neutrinos are fine, the detectors might be a bit challenging. [Laughter].

CW (00:29:25):

[Laughter].

AY (00:29:25):

So we have one or more Base Stations to cover the volume. Now what they emit, there's a synchronization flash that's sort of omnidirectional. Well, not omnidirectional, but it covers 120 by 120 kind of degrees, pulses of light. It's a little bit more complicated than that. There's some modulation involved, but that's kind of "Start your clock now." And then for the axes, there's two orthogonal spinning mirrors in the Base Station, basically. And they sweep a beam of laser light across the world, alternately, so it kind of sweeps on the x-axis, we call them "j" and "k," but in one direction, and then orthogonally in the other direction.

AY (00:30:05):

And by timing from the flash to when you see the laser beam come past, you get an angle from the Base Station to where that sensor is in the world. And from those angles, you can then do a similar kind of problem. Although we actually use very different math to the traditional computer vision solution. We use a very different tracking system. We can talk more about that later, but it's sort of the same problem. You've got a bunch of projected angles from each Base Station, and you can work out where you are in space from that.
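
[Editor's note: a minimal sketch of the timing-to-angle conversion for one sweep axis, assuming an idealized Base Station spinning at exactly 60 hertz with the sweep at angle zero at the sync flash; the real system also folds in the modulation and the calibration offsets discussed later.]

    import math

    # One Lighthouse axis, idealized: the rotor spins at 60 Hz, the sync
    # flash marks angle zero, and the time until the laser sweep crosses a
    # photodiode is proportional to that sensor's bearing from the Base Station.
    ROTATION_HZ = 60.0
    ROTATION_PERIOD_S = 1.0 / ROTATION_HZ        # ~16.7 ms per revolution

    def sweep_time_to_angle(t_hit_s, t_sync_s):
        """Bearing angle (radians) of a sensor from one axis's sweep."""
        dt = t_hit_s - t_sync_s                  # time since the sync flash
        return 2.0 * math.pi * dt / ROTATION_PERIOD_S

    # A sensor hit 4.17 ms after the flash sits about 90 degrees into the sweep.
    print(math.degrees(sweep_time_to_angle(0.00417, 0.0)))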

EW (00:30:37):

Because I have 32 detectors on my head, right?

AY (00:30:41):

Correct. Yes. There's 32 on the headset, and the controllers I think have 26 at the moment? The 32 was chosen because, A, it's a nice round binary number. And also it's about what you need for something that's kind of that shape. We could get away with less. But having some redundancy is good. If you put up your hand and block half of the constellation of sensors, then it's good to have some redundancy.

EW (00:31:09):

Some redundancy. 32, that's a lot of angles.

AY (00:31:14):

It's a lot of angles. I mean, I think the camera systems typically use 30 to 40, maybe more dots. Many mocap systems use a lot less dots because they have a lot more cameras. So they've got more redundancy kind of the other way around, in view space, so you've got more ways of looking at the same dot. And they actually use slightly different math generally, because they can do triangulation. Because normally they can see each dot from at least two cameras.

CW (00:31:43):

So given this system, how well can you track something?

AY (00:31:49):

That depends on how far away you are. So, like most tracking systems, it degrades with range, and it sort of degrades with position in two directions, so if you're looking out from the Base Station, range is the worst measurement. Much like GPS, where height is the worst measurement. It's actually kind of similar, although...that's actually the mathematical dual of an angle system, because it's a trilaterating, multilaterating system rather than a triangulating one. Anyway.

EW (00:32:20):

No, no, no. You're going to have to explain those words. Sorry.

AY (00:32:22):

Okay, so -

EW (00:32:22):

Then we'll go back. I don't know what a trilaterating system is.

AY (00:32:26):

There's kind of two ways that, given measurements of something, you can either kind of measure how far away it is, or you can measure what bearing angle it has to you. So angular systems like Lighthouse, or cameras, measure relative angles, and use sort of the old-fashioned resection problem that's used in surveying. They use big triangular grids,...where you know all the angles, and you know some of the distances, and you can kind of solve for where all the vertices are in this triangulation net...

AY (00:32:55):

It's a very old problem. It's been known since surveying, kind of just after the Egyptians kind of thing. Then there's the other thing where you can measure distances. You only know distances, you have no idea about angle of arrival. So you can either do that by time difference of arrival or, you know, time of flight. And by doing that, you get a distance. GPS works this way, and in some ways it scales better to larger things, but also it means that you have to make measurements that are of significant resolution compared to the propagation speed of the radiation. For the speed of sound that actually works pretty well.

AY (00:33:37):

And there are many ultrasonic ranging systems out there that work in kind of room-scale volumes, because the speed of sound is pretty slow. But the speed of light is really fast. And at room-scale volumes, you're talking subpicosecond kind of resolution to get any kind of spatial resolution out of the system. In the case of Lighthouse, that could be done, but it would be pretty expensive. Let's put it that way. And for those kind of practical reasons, we measure angles instead of measuring distances. So that's the difference. Trilaterating or multilaterating systems use distances, and triangulating or multiangulating systems use angles.
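
[Editor's note: the arithmetic behind "subpicosecond": at the speed of light, a millimeter of range corresponds to only a few picoseconds of flight time, while at the speed of sound the same millimeter is a few microseconds, which is why ultrasonic trilateration is practical at room scale.]

    # Time-of-flight resolution needed for 1 mm of range resolution,
    # comparing light (laser/RF) with sound (ultrasonic ranging).
    C_LIGHT = 299_792_458.0      # m/s
    C_SOUND = 343.0              # m/s in air, roughly

    target_resolution_m = 0.001  # 1 mm

    print(target_resolution_m / C_LIGHT * 1e12)  # ~3.3 picoseconds
    print(target_resolution_m / C_SOUND * 1e6)   # ~2.9 microseconds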

EW (00:34:17):

Hmm. Okay. What are we talking about before that?

CW (00:34:19):

I was asking how well it tracked.

EW (00:34:20):

Alright. Right. Picoseconds. What do picoseconds turn into?

AY (00:34:26):

What do picoseconds turn into? Picoseconds, yeah, I mean, that's speed of light. What does that work out to? It's about, like Admiral Hopper used to say, the size of a grain of pepper, I think, a picosecond at the speed of light?

EW (00:34:37):

That sounds about right.

AY (00:34:37):

Yeah.

EW (00:34:37):

A nanosecond is about 11 inches.

AY (00:34:40):

That's right. Yeah. So we measure in the nanosecond kind of region for Lighthouse, because the sweep rates, the thing is spinning at 60 hertz. So it covers one turn in about 16.7 milliseconds. So if we have resolution down in the 40, 50 nanosecond range, we can get angular resolutions around 1.25 parts per million, somewhere around there is what we get.

CW (00:35:09):

Wow.

AY (00:35:10):

Then there's noise in the system. And noise in the system degrades that at about 5 to 25 microradians per meter. It works out anyway that at five meters, you're probably getting about a quarter millimeter, one sigma, in position.
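
[Editor's note: the small-angle arithmetic connecting those two figures; cross-range error is roughly range times angular error, so a quarter millimeter at five meters corresponds to a few tens of microradians of effective angular noise.]

    # Small-angle relationship: cross-range error ~= range * angular error.
    # Working backwards from the quarter-millimeter-at-five-meters figure:
    range_m = 5.0
    position_sigma_m = 0.25e-3                  # ~0.25 mm, one sigma (quoted above)
    angular_sigma_rad = position_sigma_m / range_m
    print(angular_sigma_rad * 1e6)              # ~50 microradians of effective noise

    # Going the other way, a 25 microradian error at the same range:
    print(range_m * 25e-6 * 1e3)                # ~0.125 mm of cross-range error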

CW (00:35:28):

Which is, on the face of it kind of incredible. [Laughter].

EW (00:35:31):

That's pretty darn incredible.

AY (00:35:33):

Yeah. It works pretty well. Distance, however, is computed by kind of the skinny triangle of the projection of all of those dots. So essentially, the angular size of the constellation as seen from the Base Station gives you an idea of how far away it is. And that means that you've got this long narrow triangle. So any kind of errors in the position that you've measured on those sensors get passed through an inverse tan, and it ends up blowing up. So the error in range can be quite a lot larger than that. It can be 50 times larger, potentially, at a distance.
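
[Editor's note: a sketch of that skinny-triangle effect with made-up numbers: a constellation of some physical width subtends an angle that shrinks with range, and a small error in that measured angle becomes a much larger error in the recovered range, roughly in line with the factor-of-50 figure above.]

    import math

    # Skinny-triangle range estimate: a rigid constellation of known width
    # subtends angle theta ~= width / range, so range ~= width / theta.
    # A tiny error in theta becomes a large error in range when theta is small.
    constellation_width_m = 0.10   # hypothetical 10 cm across
    true_range_m = 5.0
    angular_error_rad = 50e-6      # hypothetical 50 microradian measurement error

    theta_true = 2.0 * math.atan2(constellation_width_m / 2.0, true_range_m)
    theta_measured = theta_true - angular_error_rad
    range_estimate = (constellation_width_m / 2.0) / math.tan(theta_measured / 2.0)

    print((range_estimate - true_range_m) * 1e3)   # ~12 mm of range error
    print(true_range_m * angular_error_rad * 1e3)  # ~0.25 mm cross-range, ~50x smaller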

CW (00:36:08):

So, you said it's advantageous to have two, because if you turn your back, then certain sensors are shadowed, and can't see one of the Base Stations. Is there any advantage to more than two?

AY (00:36:19):

Oh yes. [Laughter]. As many as you can really, but then that gets into some different kinds of scalability where cameras and Lighthouse have different advantages. But first of all, let's talk about multiple Base Stations. I was only talking about one Base Station solution, or one projected angle, same for camera.

CW (00:36:38):

Okay.

AY (00:36:38):

In that case, you've got poor range data, but if you have two of these measurements, particularly if they're near 90 degrees from each other, then the angular fix of one Base Station corrects or compensates for the poor range of the other. So you get, in GPS terms,...geometrical dilution of precision. That's for a multilaterating system; for an angulating system -

AY (00:37:02):

It's kind of the same, but the math has kind of just inverted. And it means that if two things are at 90 degrees, you get pretty good fixes. And we certainly see that with Lighthouse. When you can see both Base Stations, you get sub-quarter millimeter, maybe down to a 10th of a millimeter fixes within the volume.
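
[Editor's note: a 2D toy version of why two Base Stations at roughly 90 degrees help; each station only measures a bearing, and intersecting the two bearing rays pins down position without leaning on either station's weak range estimate. The station positions and target here are made up.]

    import math

    # Intersect two bearing rays in 2D. Each Base Station i at (xi, yi)
    # measures only the bearing angle ai to the tracked point.
    def intersect_bearings(x1, y1, a1, x2, y2, a2):
        """Solve s1 + t1*d1 == s2 + t2*d2 for the 2D intersection point."""
        d1 = (math.cos(a1), math.sin(a1))
        d2 = (math.cos(a2), math.sin(a2))
        denom = d1[0] * d2[1] - d1[1] * d2[0]    # zero if the bearings are parallel
        t1 = ((x2 - x1) * d2[1] - (y2 - y1) * d2[0]) / denom
        return (x1 + t1 * d1[0], y1 + t1 * d1[1])

    # Two stations in opposite corners of a 4 m space, target at (2.0, 1.5):
    target = (2.0, 1.5)
    a1 = math.atan2(target[1] - 0.0, target[0] - 0.0)   # bearing from station at (0, 0)
    a2 = math.atan2(target[1] - 4.0, target[0] - 4.0)   # bearing from station at (4, 4)
    print(intersect_bearings(0.0, 0.0, a1, 4.0, 4.0, a2))  # ~(2.0, 1.5)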

EW (00:37:20):

Do humans really need that level of precision? I mean, do you notice that?

AY (00:37:27):

Yeah, you do notice it. If it was off by inches, in particular with your head, if it was off in the same way, or if it was smooth, if the error is smooth over the volume, it's not such a big problem. Because all of your senses are kind of relative as well, except your eyes. And obviously your vestibular system is kind of like MEMS IMUs, kind of the same thing. It's all relative. So your brain normally kind of fixes, we believe, fixes its position in space optically, most of the time.

AY (00:38:00):

So if the world's a little bit off, or there's a little bit of distortion in kind of the metric space that your tracking system provides, you probably won't notice it. And Lighthouse and all other tracking systems do have distortions in the volume that you can compensate for by calibration. We can talk more about that later, but if there were jumps or discontinuities, big non-monotonicities -

EW (00:38:25):

Oh, those are bad. Really bad.

AY (00:38:26):

Yeah. You'll notice that. So yeah, Lighthouse, like any system, will occasionally screw up, and you'll definitely notice.

EW (00:38:31):

[Laughter]. Yes.

CW (00:38:33):

Yes, yes. I've noticed.

AY (00:38:36):

[Laughter].

CW (00:38:36):

Oh, I'm on the Titanic. Everything's sinking. [Laughter].

AY (00:38:40):

Yeah. The particularly bad one, is when you lose optical, and it starts having to rely on the IMU and it flies off and that's not pleasant.

CW (00:38:47):

Right. But I've noticed that even if you sit very, very still, and I think there's some third-party program that measures the noise in the system, it'll tell you if you've got a good setup. And I think ours came back with submillimeter accuracy, or something like that. But if you sit extremely still, you can kind of see a little bit of noise in your motion. So I think our eyes and our brains are very good at noticing that sort of thing.

EW (00:39:13):

And yet I was really impressed. There's the drawing game, the Google Tilt Brush?

CW (00:39:18):

Yeah.

EW (00:39:18):

And you can draw or sculpt. It is sort of more accurate, since you're in 3D. And you put it -

CW (00:39:25):

Yeah.

EW (00:39:25):

- you put the controller in one spot, and then you go back, and you want to start a line from that same spot again. And it's really close.

CW (00:39:32):

Yeah, very repeatable.

EW (00:39:32):

I mean, it's just really repeatable and weird that way. I was surprised. I didn't expect that accuracy.

AY (00:39:39):

Yeah.

EW (00:39:39):

But now I do.

AY (00:39:41):

Part of that is obviously you're a human closing the loop as well. So you're looking at where the point of your tool is, and if there was a little bit of error in the tracking system, like the constellation wasn't quite isotropic, so when the system saw it from a different angle, it was off by a quarter millimeter or something, you would just move a quarter millimeter to compensate for that.

AY (00:40:00):

So humans have this kind of eye-hand coordination thing that obviously has been incredibly important in our evolution. And we're super good at compensating for those tiny little errors. And because you can't actually see your hands, obviously you're blind, and you're only seeing the tool that you're wielding. It just seems to be a feature of our brains, that we can really compensate for tiny errors quite effectively.

EW (00:40:23):

So the Lighthouse runs with, what you said, a flash, and then it goes x lines, and then y lines. But these are all infrared. Have you ever replaced it with an LED or laser you can see, so you could see the pattern it's making?

AY (00:40:41):

Oh yes. The early prototypes were done with red lasers. Not red blinkers, because that would probably give you a seizure, but the lasers have been red in the past and were in the early prototypes, just so we could really debug it and understand it. Because...there's no substitute for actually visualizing what's really going on. The sweep itself, you can't really see. We have used high-speed cameras to look at the beam hitting the wall, for example. When you look at a Base Station that is in visible light, or if you look at it with an infrared camera, you can kind of see two dots on the rotors where the beam passes over your eyeball basically, or passes over the camera lens.

AY (00:41:22):

So as you move around, the position of those dots on the rotor in the Base Station kind of follows you, like those weird paintings with the eyes following you. It's kind of the same thing. So yeah, with the early Base Stations, you'd have these two red dots and they'd kind of just follow you as you moved around, because you're essentially looking down the beam projection line all the way to the optics in the Base Station.

CW (00:41:42):

Crazy.

EW (00:41:42):

And these move, the lines move because you have motors in the system. Because you're actually physically moving spinning mirrors.

AY (00:41:53):

Yes. So mirrors, well, obviously I'd love to be able to steer the beam some other way. And we've certainly looked into that. Using a spinning mirror is actually the cheapest and most reliable thing you can do right now. The physics also gives you a bunch of stuff for free, because the rotor has inertia, and by virtue of that inertia, the spin rate is extremely stable. So they're a little bit like pulsars. You have this little sweep, in fact, I was going to call it pulsar, but I ended up using pulsar for a different kind of tracking system that has a time-differencing rather than an angular-differencing system.

AY (00:42:27):

But anyway. As the thing spins around, the angle that we determine, measure, by timing comes purely from how much angle per second, essentially, that rotor is spinning through. So that spin rate has to be extremely well-controlled. And that's been one of the challenges in building the Base Station, to make a control system that could hold the rotor to parts per million, and it's a mechanical system. And it does.

EW (00:42:55):

And it does, but -

AY (00:42:56):

[Laughter].

CW (00:42:58):

Is that building on, I mean, laser scanners, that's a technology that has been -

AY (00:43:02):

Yes.

CW (00:43:02):

- around for a long time and has, I've worked with them in the past and they get more and more sophisticated, but it's kind of well-known. Did you build on that or was there, "Oh my God, this isn't good enough. We have to do something else."

AY (00:43:14):

Many of the laser scanners, I mean, obviously they're normally quite expensive instruments. So we had to make something that was affordable. There are many different kinds of angle-measuring systems out there that use spinning mirrors or spinning lasers. Many of them actually put the laser optics on the spinning part of whatever's actually scanning, and doing that is obviously better, because you can align everything fairly precisely. You can potentially actively align everything in the factory. But that's prohibitively expensive for our application.

AY (00:43:46):

So what we had to do, and I guess kind of one of the main inventions that I came up with for Lighthouse, was to use a diverting mirror, a spinning diverting mirror, and a line-generating optic that were fairly cheap, off-the-shelf kind of things, and come up with ways of compensating for the non-idealities of the alignment of that system. So to build something at a consumer price, we had to do the math and do all the investigations into taking something that was very much a nonideal system, sort of built with some tolerances, and turning it into something that could give us very good data.

EW (00:44:21):

And how do you manufacture to get that sort of, I mean, that must be a pretty fine-tuned calibration. How do you even, how do you even?

AY (00:44:32):

[Laughter]. Yeah, there's a lot of subtlety in the calibration process. We try to build the Base Station as well as we can to begin with. And it's pretty good. Yeah, you probably get accuracy to about three or four inches, absolute, without calibration, because it's assembled reasonably well, and HTC does a particularly good job. They can actually build Base Stations better than we could build our early prototypes.

AY (00:44:55):

Many of the prototypes of course, assembled by myself or someone else, were just sort of thrown together, and we would kind of glue the lenses in at roughly the right angle. And the mirrors would be in roughly the right place. But the motors that we now use are very, very good motors. They're the same kind of motors that you have in hard drives. So they've got tolerances in the microns in terms of their runout and their axial play and things like that.

AY (00:45:17):

And on top of that, we have mostly polymer...and there's some metals involved, so there's some tolerances in the mechanical assembly of all of that. And that's kind of what we have to calibrate out. So for calibration, we actually use something similar to how you would calibrate other optical systems, essentially nonlinear regression, right? You collect a bunch of data over the volume, and you have some kind of mathematical model that says, "This is how we believe the Base Station should work."

AY (00:45:44):

And there's a bunch of parameters that say, "This parameter compensates for one non-ideality or the other," and then you solve for the values of those coefficients, basically the calibration coefficients, which minimize the error over the whole set of data that you've collected. And that's how you would calibrate most systems of any real type in the world. Newton's nonlinear least squares regression kind of stuff is kind of universal in that respect.

AY (00:46:15):

But the model itself is pretty straightforward...For example, one of the parameters is the exact angle when that flash happens, the flash that says, "Okay, I'm at angle zero." Basically there's a mark on the rotor, and there's an optical pickup that determines when that flash happens. And the mark is actually just a little piece of shiny aluminum tape that's put on the rotor. And the positioning of that is not accurate to within submillimeter kind of positioning, just from manufacture. So we have to calibrate that out, and it's one of the primary ones we actually calibrate out really, really well.

EW (00:46:52):

Yes, because once you,...you can talk to the motor subsystem and say, "Okay, tell me when you're gonna hit that zero angle," and you can have a camera that waits for that flash. And then you just take the time difference of those two. And now you're calibrated on that particular parameter.

AY (00:47:15):

Yeah, and that is kind of how it was done initially. But it's not how we do it now. We actually don't just take a singular measurement like that, because alignment of the Base Station and the camera or the sensor would be very, very difficult,...because the system, it can do five microradian kind of resolution. Lining up something to five microradian resolution would require a theodolite, and we'd need reference fiducials. And it would be almost impossible to do that in a production line. So we take a bunch of data over the whole volume, and we do that with a normal tracked object, just like a headset.

AY (00:47:50):

We sort of wave a headset around in the volume in front of the Base Station. And by collecting all of that data, we know that the headset didn't change shape significantly. Although you can actually tell that it changes shape from the fact that you're moving it around, and gravitational loading, but anyway, you get a whole bunch of that data, and you look for, you solve for, the numbers that make sense, basically. So...if those angles are wrong, the world is slightly distorted. The metric space is distorted. So you solve for the numbers that flatten that metric space.
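
[Editor's note: a toy version of the nonlinear least-squares calibration idea, fitting a single hypothetical parameter (a constant zero-angle offset) to synthetic data with scipy; the real model has many more coefficients and is fit over real sweep data collected by waving a tracked object around the volume.]

    import numpy as np
    from scipy.optimize import least_squares

    # Toy calibration: pretend the Base Station reports angles with one unknown
    # constant offset (say, where the sync mark really sits on the rotor).
    # Observe sensors at known reference bearings, then fit the offset that
    # minimizes the residuals, the same least-squares idea described above,
    # just with one parameter instead of a full Base Station model.
    rng = np.random.default_rng(0)
    true_offset_rad = 0.0031                         # what we want to recover
    reference_angles = np.linspace(-1.0, 1.0, 200)   # "known" bearings over the volume
    measured_angles = reference_angles + true_offset_rad + rng.normal(0.0, 50e-6, 200)

    def residuals(params):
        offset = params[0]
        return (measured_angles - offset) - reference_angles

    fit = least_squares(residuals, x0=[0.0])
    print(fit.x[0])   # ~0.0031, recovered to within the measurement noise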

CW (00:48:22):

When you were prototyping this, was it a bunch of free space stuff on optical benches and parts from Thorlabs? Or did you go straight to kind of a form factor, small device from off-the-shelf parts?

AY (00:48:38):

The very first version was actually a 3D printed plastic thing that I made.

CW (00:48:42):

Really?

AY (00:48:42):

Yeah,...you can actually make a very crude Lighthouse. Anyone could make a crude Lighthouse for a little robot that ran around on the floor. I've actually been thinking about publishing exactly how to do that, because it's a simple two-dimensional problem, and it would actually be super educational. I could imagine a bunch of kids building one for fun.

AY (00:49:00):

But the first version was just that. It was a laser pointer that I took apart that I got off eBay. And I took a vibration motor out of an Xbox controller, and I used that to spin a little mirror cell. It was all 3D-printed plastic. And I assembled it, and I built a receiver, and it gave me 0.1 millimeter, 0.1 degree, resolution, because of all the noise in the system. It wasn't closed-loop. It was just spinning as fast as it span. And it had cable sync. It didn't have optical sync at that point. But it was kind of the proof of concept. Then the next version after that was, we sawed up some hard drives.

AY (00:49:35):

We literally took some hard drives out of the dead hard drive bin, and we took the platters out, and we sawed the front off, and stuck some - at that point, we went to decent mirrors that we got from Thorlabs, I think. But we still used eBay line generator optics. And one of the first RF receivers, because there's some RF modulation on the laser beam, was one of those AM radio kits that Ben Krasnow took apart. He kind of plugged a photodiode into one end of that, and went through and changed a bunch of stuff, and used that to build the receiver.

AY (00:50:10):

And then from that, we started to build more serious ones that were actually machined aluminum, and machined brass for the rotors. And a couple of versions beyond that, we were at the point of, "Okay, we need to make this cheap and simple." So we had castings and post-machining. We tried to make it as cheap and simple as possible. And really the motors and the electronics and the lasers are probably the most expensive part of it.

EW (00:50:34):

That's just amazing. Okay. So if you have all of this really precise location ability, what do you need an inertial system for?

AY (00:50:48):

Well, it only spins at 60 hertz, right? Although you get two flashes out of that, so you get 120 half-updates per second per Base Station, and you're rendering at 90 hertz, and you need something essentially to fill in the gaps in between the optical updates. So the IMU, we can sample at a thousand hertz, right? We can get data out of it at a kilohertz. And we use that as kind of the primary high-frequency information in the system.

AY (00:51:15):

If you think about it in terms of frequency, then all of the really fine high frequency, short interval updates come from the inertial system, and the low frequency close to DC information, about where you actually physically are in space, that comes from the optical system. And they get fused together in a Kalman filter.
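
[Editor's note: a deliberately simplified, one-dimensional sketch of that high-frequency/low-frequency split, using a fixed-gain observer in place of the Kalman filter the actual system uses; all rates, gains, and noise levels here are illustrative only.]

    import random

    # 1D toy fusion: dead-reckon the IMU at 1 kHz for responsiveness, and use
    # each slower, absolute optical fix to correct the drifting position and
    # velocity estimates. The real filter also estimates IMU biases and works
    # on full 6-DOF pose; every number below is made up for illustration.
    IMU_DT = 0.001              # 1 kHz inertial samples
    OPTICAL_EVERY = 10          # an optical fix every 10 IMU steps (~100 Hz here)
    K_POS, K_VEL = 0.05, 2.5    # hand-picked correction gains

    position_est, velocity_est = 0.0, 0.0

    for step in range(1, 2001):                     # two seconds of data
        accel = 0.05 + random.gauss(0.0, 0.02)      # biased, noisy accel; truth is at rest
        velocity_est += accel * IMU_DT              # high-frequency prediction
        position_est += velocity_est * IMU_DT
        if step % OPTICAL_EVERY == 0:
            fix = random.gauss(0.0, 0.00025)        # ~0.25 mm optical noise around truth (0)
            innovation = fix - position_est
            position_est += K_POS * innovation      # low-frequency absolute correction
            velocity_est += K_VEL * innovation

    print(position_est)  # holds near zero; raw double integration would be ~10 cm off by now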

EW (00:51:35):

So it must be, I mean, kilohertz updates, that's more gyros than accelerometers at that point.

CW (00:51:42):

I feel like I should ring a bell any time somebody says Kalman filter.

EW (00:51:44):

[Laughter]. Yeah.

AY (00:51:47):

[Laughter]. Yeah. Unfortunately the chap died recently. I wonder if he appreciated just how important his contribution has been to people, to all of our tech really.

CW (00:51:55):

Yeah.

EW (00:51:56):

Yeah. I mean, it's been pretty amazing that he made this algorithm that was, it's not that complicated, but we do so much with it. We make it so complicated with...all of the ways we use it.

AY (00:52:11):

Yeah. It's so general. If anything that has, it's roughly normal, it's super applicable, and it's just so elegant.

EW (00:52:20):

Yeah. Okay. So you put in gyro at one kilohertz, you put in visual, light data at -

CW (00:52:32):

Gigahertz.

EW (00:52:32):

- 120 hertz, two Lighthouses at 60 hertz each. And then you put in the accelerometer data whenever it's available, probably closer to 500 hertz. Yeah, okay...You can go pretty fast with that. So when I'm whipping my head around, it does track.

AY (00:52:53):

Yep. It's actually pretty good at estimating. And there's been some people, like Doc Ok I think, who did a rather nice YouTube video, or set of YouTube videos, on Lighthouse, where he collected the raw data straight out of kind of the tracking API. He kind of bypassed a bunch of stuff in SteamVR, and he could show you the updates where it would correct in one direction, then would correct in the other.

AY (00:53:17):

And you could see the IMU drifted a little bit, and then the optical would push it in the right direction. And then that would push it in the opposite orthogonal direction. And he also showed you the data, kind of the noise cloud, when you had only a single Base Station versus two Base Stations. And you can kind of see a lot about how the system works from that, treating it as a black box.

EW (00:53:38):

We will get a link to that in the show notes. So does this Kalman filter run on the Cortex M0s in the headset, or is it on my computer, or is there an FPGA in there that does it?

CW (00:53:52):

It's in the cloud.

EW (00:53:52):

[Laughter].

AY (00:53:54):

[Laughter]. It's on the host at the moment. For the Kalman filter, there's a fair bit of math involved. There's an inversion of some pretty big matrices. It could, and probably will, eventually be baked down to run on the tracked object. It's particularly advantageous for things like the controllers that have a limited-bandwidth connection.

EW (00:54:12):

Yeah.

AY (00:54:13):

They currently send updates at about 250 hertz, whereas the headset runs at a kilohertz for the IMU. And a lot of that is purely because you need to ship that data over radio, and a kilohertz is a lot to ship. That said, tracking on the head has to be way better than tracking on the hands, because, again, your eyes are more sensitive to errors there than to hand positioning.

EW (00:54:38):

It has to be more sensitive because vomiting. [Laughter].

AY (00:54:42):

[Laughter]. Exactly. If your hand flies off, it's annoying, but if your head flies off, yeah. It's not pleasant.

EW (00:54:47):

But you do have some M0s on the headset and on the controllers, I think, although I'm not sure about that.

AY (00:54:53):

Yes.

EW (00:54:53):

What kind of math are you doing there? Is it just coordinating everything and communicating?

AY (00:55:01):

Yeah, primarily that's just the Watchman, basically. One of those M0s, in the headset and in the controllers, is primarily reading the IMU and getting the data out of the FPGA from the optical subsystem. So, yeah, you're right. There is an FPGA in there. The FPGA is doing the high-speed timing of the signals coming from the sensors. Each sensor outputs an LVTTL signal that comes back to the FPGA, and the FPGA has a high-speed clock that is the time base, basically, for all of the optical system.

AY (00:55:33):

The IMU also happens to feed into that FPGA. So we get the IMU data in the same time base because time is super important in any kind of tracking system. So everything comes back to the host over USB or over radio, and is in the individual time base of each one of those Watchman receivers. And then the host-side code uses that information to actually solve for the position of those objects.
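
As a rough sketch of that shared-time-base idea (the names and the ring buffer here are invented, and in the real system this job is done in the FPGA rather than in C), every photodiode edge and every IMU sample gets stamped from the same free-running counter and pushed into one event stream:

    #include <stdint.h>
    #include <stdio.h>

    /* Conceptual sketch only: stamp optical edges and IMU samples against
     * one common clock so they share a time base. */

    #define EVENT_RING_SIZE 256u

    typedef enum { EV_OPTICAL_EDGE, EV_IMU_SAMPLE } event_type_t;

    typedef struct {
        event_type_t type;
        uint8_t      sensor_id;   /* which photodiode, or 0 for the IMU     */
        uint32_t     timestamp;   /* ticks of the shared high-speed counter */
    } tracked_event_t;

    static tracked_event_t ring[EVENT_RING_SIZE];
    static uint32_t ring_head;

    /* Stand-in for the FPGA's free-running counter (hypothetical). */
    static uint32_t freerun_ticks;

    static void push_event(event_type_t type, uint8_t id)
    {
        tracked_event_t *e = &ring[ring_head++ % EVENT_RING_SIZE];
        e->type      = type;
        e->sensor_id = id;
        e->timestamp = freerun_ticks;   /* everything stamped from one clock */
    }

    int main(void)
    {
        /* Simulate interleaved activity: photodiode edges and IMU samples
         * arriving at different rates, all on the same time base. */
        for (int i = 0; i < 5; i++) {
            freerun_ticks += 480;            /* a pulse-width-ish interval */
            push_event(EV_OPTICAL_EDGE, (uint8_t)(i % 4));
            freerun_ticks += 48000;          /* ~1 ms at a 48 MHz clock    */
            push_event(EV_IMU_SAMPLE, 0u);
        }

        for (uint32_t i = 0; i < ring_head; i++)
            printf("%s sensor=%u t=%u ticks\n",
                   ring[i].type == EV_OPTICAL_EDGE ? "edge" : "imu ",
                   (unsigned)ring[i].sensor_id, (unsigned)ring[i].timestamp);
        return 0;
    }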

EW (00:56:00):

And I know that you use three ST Cortex M0s, and one NXP Cortex M0.

CW (00:56:08):

[Laughter].

AY (00:56:08):

[Laughter].

EW (00:56:08):

I have to ask you this. Do you hate your software engineers?

AY (00:56:13):

[Laughter]. Yeah. There's some history and some technical reasons why we went with NXP for the Watchman. The Steam Controller is based on an NXP, the 11U37, exactly the same. Although we started off with a slightly different chip, we ended up in the same kind of region. The nice thing about the NXP LPC11Uxx devices is that they have a USB bootloader, so they're essentially unbreakable. It's in the ROM.

AY (00:56:41):

And if you bridge and bring a pin down when the thing powers up, it will come up on USB as a mass storage device. Many other chips have this feature, but that was one nice feature that the Steam Controller team selected that thing for, because they could wire that up to one of the buttons on the controller, and no matter what the person did to the thing, or what terrible software update we might've accidentally shipped, we could always unbreak it.
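
There is also a software-triggered way back into the same ROM bootloader: the LPC11Uxx parts document an IAP "Reinvoke ISP" call. A hedged sketch of using it for a recovery path follows; the recovery-button check is invented for the example, and the entry address and command code are the ones I believe NXP documents for this family, not anything taken from Valve's firmware.

    #include <stdint.h>

    /* Sketch: fall back into the LPC11Uxx mask-ROM bootloader from
     * application code via the IAP "Reinvoke ISP" command, so the device
     * can always be recovered over USB. */

    #define IAP_LOCATION         0x1FFF1FF1u  /* ROM IAP entry point (LPC11Uxx) */
    #define IAP_CMD_REINVOKE_ISP 57u          /* documented IAP command code    */

    typedef void (*iap_fn_t)(uint32_t cmd[5], uint32_t result[5]);

    /* Board-specific stub: replace with a real GPIO read of whatever
     * button you wire up as the recovery button (invented here). */
    static int recovery_button_is_held(void) { return 0; }

    static void reinvoke_isp(void)
    {
        uint32_t cmd[5] = { IAP_CMD_REINVOKE_ISP, 0u, 0u, 0u, 0u };
        uint32_t result[5];
        iap_fn_t iap = (iap_fn_t)IAP_LOCATION;

        /* Disable interrupts before handing control to the ROM,
         * e.g. with CMSIS __disable_irq(); omitted to keep this portable. */
        iap(cmd, result);   /* on success this never returns: the ROM
                               enumerates as a USB mass-storage device */
    }

    void boot_check_recovery(void)
    {
        if (recovery_button_is_held())
            reinvoke_isp();
    }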

AY (00:57:06):

So that was one reason why we chose the NXP. The bad thing was that LPCOpen wasn't a thing at that point, and a lot of the example code that we got from NXP was really, really terrible. So when I first came here, I was looking at this stuff going, "Wow, were these test vectors? Was this some guy just writing some code to test the peripherals?"

EW (00:57:27):

Yeah.

AY (00:57:27):

It was terrible. There were hardcoded clock rates in some of the timing routines.

EW (00:57:34):

Yep.

AY (00:57:34):

And the USB stack is an abomination. But anyway -

EW (00:57:38):

It's not just me then.

AY (00:57:39):

No.

EW (00:57:40):

Okay.

AY (00:57:40):

It's not just you. As for the NXPs, I actually don't know about LPCOpen now. I know that they put a lot of effort into it. One of their reps was here, and the poor guy, the look on his face when I told him what I thought of it was actually priceless. But I think they've invested quite a lot of effort in making LPCOpen better. That said, I haven't used it, but I know that the controller team has, and I think they've actually migrated to it.

AY (00:58:05):

One of the first things I did when I came here was write the firmware for the Base Station and the original firmware for the Watchman. A lot of that time, a couple of weeks, I spent writing drivers for every single peripheral on that chip. So I know the 11Uxx pretty well, because I essentially wrote a driver for every piece of hardware it has.

EW (00:58:25):

But then you went to the ST because it's a little better supported?

AY (00:58:32):

No, the ST -

EW (00:58:32):

Cheaper?

AY (00:58:32):

The ST that's in the headset - I think there's an ST in the headset - yeah, I think that was HTC's selection. The display subsystem in the headset is based on our reference design, but they wanted to do some stuff themselves. Because it's always complicated when you have two parties trying to collaborate over a code base, they essentially picked their own microcontroller and did their own subsystem. So their code runs in their micro, and our code runs in our micro. And that's just why we ended up with two micros. That'll probably get cost-reduced away at some point.

EW (00:59:14):

So you mentioned USB and that takes data over to the computer, but USB, it's not real time. Does that cause problems or do you just have enough control and have spec'd -

CW (00:59:28):

Isn't there some sort of isochronous mode or something? I'm going way back into the depths of my memory now, but -

AY (00:59:34):

We're actually using, yeah, well, yeah. It does cause some problems, particularly because there are many USB implementations out there, particularly USB 3.0 xHCI drivers -

EW (00:59:44):

Yeah.

AY (00:59:44):

- that are just terrible. Terrible, terrible implementations. And some of them just have broken silicon as well, not just the drivers in Windows. But anyway,...the latency's a little bit unpredictable, but it's generally a few milliseconds and we can kind of deal with that. It's one of the parts of the system that we don't have great control over, because on the PC platform there are so many different variations, different chipsets and things out there...

AY (01:00:14):

In terms of time, everything sort of runs in Watchman time or something derived from Watchman time. That's kind of host time plus some bounded latency on the host. We don't use isochronous, we're actually using HID, believe it or not.

CW (01:00:34):

Wow.

AY (01:00:34):

Yeah, but HID is interrupt driven, and it -

CW (01:00:37):

Yeah.

AY (01:00:37):

- we can get one-kilohertz data out of it. Although Windows, at least previously, I think Windows 7 and below, had a bug where, if you had a one-kilohertz HID device plugged in, it would just refuse to boot.
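
The reason interrupt-driven HID can still deliver a kilohertz is just the endpoint's polling interval: on a full-speed device, bInterval for an interrupt endpoint is in milliseconds, so asking for 1 ms gets you 1000 polls per second. Here is a sketch of what such an endpoint descriptor looks like; the endpoint number and packet size are example values, not the Vive's actual descriptor.

    #include <stdint.h>

    /* USB endpoint descriptor for a full-speed interrupt IN endpoint
     * polled every millisecond, i.e. 1 kHz HID reports. */

    #pragma pack(push, 1)
    typedef struct {
        uint8_t  bLength;           /* 7 bytes                             */
        uint8_t  bDescriptorType;   /* 0x05 = ENDPOINT                     */
        uint8_t  bEndpointAddress;  /* 0x81 = endpoint 1, IN direction     */
        uint8_t  bmAttributes;      /* 0x03 = interrupt transfer type      */
        uint16_t wMaxPacketSize;    /* up to 64 bytes at full speed        */
        uint8_t  bInterval;         /* polling interval in ms (full speed) */
    } usb_endpoint_descriptor_t;
    #pragma pack(pop)

    static const usb_endpoint_descriptor_t hid_in_ep = {
        .bLength          = 7,
        .bDescriptorType  = 0x05,
        .bEndpointAddress = 0x81,
        .bmAttributes     = 0x03,
        .wMaxPacketSize   = 64,
        .bInterval        = 1,      /* 1 ms -> 1000 reports per second */
    };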

CW (01:00:52):

It's too fast. [Laughter].

AY (01:00:54):

Yeah. It's a super common problem. A bunch of mice in the world break this as well. And it's something strange to do with power management on the USB bus as well. We eventually worked around it, but it's a pretty common problem. USB is one of those things that you think, "Wow, you know, it's all nailed down. It's pretty simple, straightforward."

CW (01:01:11):

It's universal. [Laughter].

EW (01:01:12):

[Laughter].

AY (01:01:12):

Yeah. Well, no. [Laughter].

EW (01:01:15):

Everybody's making their own standard there.

AY (01:01:18):

Yeah. Yeah. USB has been, it's probably one of the most challenging parts of the system in terms of just support, because there are so many broken implementations out there.

EW (01:01:29):

Yeah. And so USB takes your sensor data over to the computer from the headset, and then HDMI sends the display data back over a different cable. Do you send data over USB as well, from the computer?

AY (01:01:50):

Yeah, there's a little bit of data sent over. There's actually a way to retrieve the calibration information. So, for example, the calibration information for the optical and tracking subsystem is stored in the headset. We can retrieve that via the USB. There are some other sensors, auxiliary sensors, like the proximity sensor and the IPD measurement sensor. There's also some kind of power and optical display control stuff as well that goes over USB. And of course -

CW (01:02:28):

Firmware.

AY (01:02:28):

- physically to upgrade -

CW (01:02:29):

Yeah.

AY (01:02:29):

- everything right. Firmware upgrade.

EW (01:02:32):

So there is also an unused USB slot in the headset.

AY (01:02:37):

Yes.

EW (01:02:37):

Which of course, you said that one of the science fiction concepts that might become real in our lifetime is the mind -

CW (01:02:45):

I don't want my brain going over USB.

AY (01:02:45):

[Laughter].

EW (01:02:46):

But, I mean, plug in somebody else's brain into my headset, it'll work out fine, I'm sure. What is that USB slot for?

AY (01:02:57):

That's an auxiliary USB for whatever you want to put in there. So one of the things we kind of do here at Valve is we try and build our stuff as hackable as possible. We try and make it very hacker friendly, and flexible, and expandable. Even the mechanical design of the headset - we asked HTC, when they were coming up with it, to make the strap removable so that you could potentially make your own strap.

AY (01:03:20):

The USB is there purely for whatever peripherals you might want to add to it. You can put different kinds of tracking systems on it. I know some people put the Leap Motion, the camera system that they have, on the front of the headset...So that's a general-purpose USB 2.0 port for whatever you want to try and hang off the thing. And we encourage people to play with it.

CW (01:03:45):

I have seen people put those USB-powered fans on there.

AY (01:03:49):

[Laughter]. Yes.

EW (01:03:49):

That'd be kind of nice sometimes. [Laughter].

AY (01:03:53):

Yeah, I love that. Some of them are just a bare spinning fan blade -

CW (01:03:56):

Yeah.

AY (01:03:56):

- which seems like it could be bad too.

EW (01:04:00):

So this is actually hubbed to the computer? It doesn't modify the local headset at all. It's just a USB to the computer.

AY (01:04:10):

Correct. There's a 7-port hub in there -

EW (01:04:13):

Okay.

AY (01:04:13):

- that many of the things hang off. And the spare port was basically brought out for auxiliary use.

EW (01:04:19):

I'm thinking about all of the things I have wanted added, more trackable objects. I really sometimes wish I knew where my feet were...But I don't think I want to wire them to USB.

CW (01:04:32):

I want to know where my drink is. [Laughter].

EW (01:04:34):

Yes. Can we have tags that will just go on the drink? And you'd find it in the world.

AY (01:04:41):

There are a couple of ways of doing that, one of which might actually be to use the camera, right? It has a camera on the headset, and the camera is very well-calibrated. We know exactly where it is, and it's tracked by the tracking system that's in the headset. So, one option: Lighthouse receivers are fine - we're obviously going to have more Lighthouse receivers in the world very soon. But for simple things like finding your drink, maybe you want to put a fiducial on the drink. You just stick a sticker on your mug, or have a mug that has a fidy on it.

AY (01:05:12):

And we use the camera to work out where it is relative to the headset, and we put it into the virtual world. There are obviously some software challenges. It's the same thing with extra trackers for your feet: the Vive was given to developers essentially as a complete system - this is the minimum spec. Everyone will have hand input, everyone will have tracked displays, and you can go and build your games, or build your experiences, based on that.
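
The geometry behind that is just composing two transforms: the tracking system gives the camera's pose in the room, the fiducial detector gives the tag's pose relative to the camera, and multiplying the two places the tag in room coordinates. A bare sketch with invented numbers:

    #include <stdio.h>

    /* Compose world_from_camera (from tracking) with camera_from_tag
     * (from fiducial detection) to get the tag's pose in the room.
     * 4x4 homogeneous transforms; all numbers are made up. */

    typedef struct { double m[4][4]; } mat4_t;

    static mat4_t mat4_mul(const mat4_t *a, const mat4_t *b)
    {
        mat4_t out;
        for (int r = 0; r < 4; r++)
            for (int c = 0; c < 4; c++) {
                double s = 0.0;
                for (int k = 0; k < 4; k++)
                    s += a->m[r][k] * b->m[k][c];
                out.m[r][c] = s;
            }
        return out;
    }

    int main(void)
    {
        /* Camera pose in the room (from the tracking system): identity
         * rotation, 1.5 m up and 0.2 m forward. */
        mat4_t world_from_cam = { { {1, 0, 0, 0.0},
                                    {0, 1, 0, 1.5},
                                    {0, 0, 1, 0.2},
                                    {0, 0, 0, 1.0} } };

        /* Tag pose relative to the camera (from fiducial detection):
         * 0.8 m straight ahead of the lens. */
        mat4_t cam_from_tag   = { { {1, 0, 0, 0.0},
                                    {0, 1, 0, 0.0},
                                    {0, 0, 1, 0.8},
                                    {0, 0, 0, 1.0} } };

        mat4_t world_from_tag = mat4_mul(&world_from_cam, &cam_from_tag);

        printf("tag position in the room: %.2f %.2f %.2f\n",
               world_from_tag.m[0][3], world_from_tag.m[1][3],
               world_from_tag.m[2][3]);
        return 0;
    }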

AY (01:05:39):

But when you start adding all these third-party peripherals, many of those - in terms of the market for third-party peripherals - will live or die based on whether people support them. So one of the challenges for us, and the challenge for all the developers out there when we get this community of different devices that people are going to make for the system, is how to expose them as an API so that people can actually use them, and how developers will work them into whatever kind of experience they're going to do. And that's a challenge that we're going to have a crack at solving very soon.

CW (01:06:15):

So you mentioned that you started out with a goal to have things be somewhat hackable, and you also talked about a simple example kind of thing for kids, a two-dimensional robot control. So I guess I'm curious how you would recommend somebody get into developing a tracked device, whether for hacking and do-it-yourself projects, or as a product. Is there a pathway to do that that's, "Okay, you need these parts, and you need this kind of software," or is it kind of, "On your own, figure it out, kid"?

AY (01:06:49):

So right now you're kind of on your own, and...I've actually fertilized the world a little bit. I posted on Twitter the circuit diagram for our sensor's circuit. There have been many people who've reverse-engineered pieces of it. And there are some YouTube videos out there, and...there's a website that this guy has made where he went and decompiled the firmware. I didn't blow the bits on any of the firmware, so you can suck the firmware off the things and reverse-engineer it.

AY (01:07:17):

That was deliberate. We want all of our devices to be super hackable. So all of the code is out there, essentially in the public domain - in binary anyway, not in source, obviously,...not now. And people have gone through and reverse-engineered it. I actually had some people send me some bug fixes for the Base Station that they found by reverse-engineering. So there's been a community of people out there that have been interested enough in the technology that they've already gone through and learnt enough that I'm actually surprised no one's built their own tracker with it yet.

AY (01:07:47):

Some people have come pretty close. They've got to the point where they understand the emissions of the Base Station well enough that they probably could build a tracker. They just haven't gone and done the solution for tracking. Now, in the near future, there is going to be a reference design and a development platform that will be available. And there will be support for people that want to do this commercially.

EW (01:08:18):

Chris and I are stunned silent because...we're now going to stop playing with Unity and trying to design games and start trying to play with what we can do with your hardware.

AY (01:08:25):

Yeah.

CW (01:08:27):

And we'll get as far with that as we did with the other part.

EW (01:08:29):

Well, that's true.

AY (01:08:31):

Well, stay tuned. There's going to be an announcement very soon about exactly how you might go about doing that.

EW (01:08:36):

So it sounded like I could already mark my drink by putting a sticker on it and using the camera, but then I still have to write a game so that I can get to my drink?

CW (01:08:47):

You'd have to write software that recognized the sticker.

EW (01:08:49):

But I couldn't put that, when I pause the game, I can't just find my fiducial -

CW (01:08:54):

No, the game would have to support it.

AY (01:08:56):

We could probably put it in that gray room that you go into - we call it the compositor, although...there's been a competition to see who can come up with a better name for that. But that room, we try to keep it as simple as possible. There's a whole bunch of things we would love to put in there, but it always has to run at frame rate no matter what, even for people that have poor machines. And we're a little bit hesitant to overload it with too much functionality. But if we do implement - and I know there are people working on this - CV, computer vision-style add-ons for SteamVR, then it's quite possible that we would add a generic "Go print this QR code or this fidy and go and stick it on whatever you want" and have that passed through.

AY (01:09:42):

Or at least punch a hole through. Because you know how you can go into the Tron mode kind of thing with the camera?

CW (01:09:46):

Right.

AY (01:09:47):

We could potentially just have some kind of recognizable object that just punches a hole through, and that would be fairly lightweight, and something that we could put in the composite, and it could run all the time.

CW (01:09:57):

It's like inverse augmented reality.

AY (01:10:00):

Yeah. It's like a mediated, some mixing between AR and VR.

CW (01:10:05):

Something real bleeding through to the virtual.

AY (01:10:07):

Yeah. Yeah. So I'm not quite sure how practical that would be. It's something I'd love to see. If we don't implement it, I'm sure someone in the community will. I know that someone here is working on publishing how to access the camera for computer vision applications. Because the Vive actually makes a very good pseudo-AR or CV dev platform, because you have a tracked camera.

EW (01:10:31):

That's amazing...I'm still sort of boggled by all the possibilities here...Do you actually get to play it very often? I've only, I've done a little tiny bit of Unity and now you're opening worlds, but for the most part I have been a consumer. But of course this was a work purchase, a business purchase.

CW (01:10:54):

I'll cut that.

EW (01:10:55):

So I do just play, but I wonder, thinking of products I've developed, I haven't really gotten to play with them. Because when I play with them, all I ever see are bugs. Do you ever get to play with VR?

AY (01:11:15):

I do get to play with VR, not as much as I would like sometimes, because I'm busy making VR, but -

EW (01:11:21):

Yeah.

AY (01:11:22):

Yeah, I have a Vive at home. I've had a couple of different generations of the Vive at home. I'm actually still using very old generation Base Stations. I've never swapped out my bases. I have the early prototype of the optical interbase sync - a circuit board hanging off one of my other Base Stations, just sort of soldered onto the back of it. It's kind of like the cobbler always has the worst shoes kind of thing. It's definitely that. And there's that tendency that you know exactly what to look for, so you see all the little problems. That's true.

AY (01:11:53):

But you can just relax and enjoy it. I like Space Pirate Trainer. You mentioned it before. I like Zen Blade. There's a lot of things that I'll just put on sometimes when I come home at some ungodly hour and just play for a few minutes before I go to bed. But I'm definitely one of the many people on the team here that doesn't play it that much. I'm more often in the lab trying to build something than playing too much with it. The software people, obviously, they have to play with it all the time to test it. So in many ways they have a better job.

EW (01:12:29):

I tend to use it for exercise. I didn't expect that. I didn't expect to really play it at all. Chris got it. And then he had it for like an hour before I was like, "Okay, we're moving this to the living room." But I've been talking to people about exercise lately, how Fitbit's made me want to complete the next milestone. It would get me to walk an extra 10% because it would then throw me a little party on my wrist.

AY (01:12:59):

[Affirmative].

EW (01:12:59):

And then Pokemon Go, which I have to admit is a little fun. It gets me moving. It gets me started. And that's really great. Sometimes you're just, it's the end of the workday and you're tired. You don't want to do anything and, "Fine, I will go capture a stupid little creature." And I like that. And the VR system has some of that, but it's also got a, when I am in the system, I am 100% there. I'm not thinking about anything, but whatever is in this game. It's more of a Zen thing than just about anything else I do.

AY (01:13:39):

[Affirmative].

EW (01:13:39):

Have you heard that from other people or is this just me?

AY (01:13:43):

No, it's not just you. There are a lot of people that enjoy it as a recreation and exercise tool. Certainly what I read on the forums, you are not alone. It's funny what you say about other things like the AR games that are coming out, like Pokemon Go, and that kind of mixed or mediated reality...so you can have either the complete blinder experience where everything is absolutely virtual, like with the Vive, or you have things where you bring artificial elements into the real world. And I think that as the technology improves, we'll just see more and more of that, right? You'll have, you'll be able to dial it from completely artificial to completely real and anything in between.

EW (01:14:27):

It's amazing that it used to be, your parents would tell you to go out and play, and not sit in front of your computer and play games, and now playing games may be how we exercise.

AY (01:14:38):

Yeah, absolutely. It's strange, isn't it? Because you've always got the couch potato kind of thing - a gamer, he just sits there, never gets a tan or anything - but that stereotype may have to change very soon. It already is changing, I think.

EW (01:14:54):

So I don't want to keep you too long because you have fiducials to start printing so I can find my drink.

AY (01:15:01):

[Laughter].

EW (01:15:01):

And lots of other neat things coming up. Chris, do you have any last questions?

CW (01:15:06):

I did want to ask, when you started all this, did you already have a background in kind of tracking stuff, or did you learn all this on the fly? And I mean, you talked about the triangulation and trilateration system. Did you have familiarity with GPS already?

AY (01:15:22):

Not particularly.

CW (01:15:24):

Okay.

AY (01:15:24):

I mean, I've been an engineer for many years, and I've been obviously curious about everything, and I kind of read Wikipedia religiously and learn everything I can. But no, I didn't really have a background in state estimation, or any kind of thing that would involve navigation. No is the simple answer.

CW (01:15:44):

There you go, kids. Read a book.

AY (01:15:46):

Working at Valve is, yeah. Working at Valve is very much like that, right? And on a daily basis, you will do very little of what is your core competency.

CW (01:15:53):

Yeah.

AY (01:15:53):

So that's why when we try and hire people, we try and just pick smart people who know their stuff. We call it the T-shaped person: someone who has some technical depth that is their own - whatever they're really good at, that they know all the way down to the atoms or whatever - and then just general breadth, because you're making new stuff, you're making up things that don't exist in the world, right?

AY (01:16:15):

There is no course that says Lighthouse 101, although there probably will be in the future...You just have to go from first principles, and any reasonable engineer should really be able to do that. They should have their thing that they're really good at. And they should have enough competence in everything to be able to invent the future.

EW (01:16:35):

That's yes, I agree. Oh, and it looks like I do have one more question. Gravitational loading, you can see gravitational loading on these things, really?

AY (01:16:45):

Oh yes. [Laughter]. Yeah. There's a long story of the many discoveries we've made with Lighthouse about how matter that looks solid is not that stiff. So yeah, if you put a Lighthouse on the corner of the table and you lean on the table, the table deflects enough that you can measure it. For one of our calibration things, we have a robot move a tracked object around in front of the Base Station to take a bunch of measurements. And we had a little rubber Sorbothane pad that we put the Base Station on, and we'd have the robot execute a move, and it was doing a circle, right?

AY (01:17:19):

And it didn't quite meet up at the top when it returned back to its starting place. And for a long time we were going, "Why does it not meet up?" And it turned out that the Base Station was sinking ever so slightly, 20 microns or something, into the rubber during the course of this test. So we can resolve all kinds of tiny, tiny motions, because Lighthouse, yeah, it's noisy, but it's zero mean. So if you keep integrating it forever, basically the longer you integrate, the better it gets.
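
A back-of-the-envelope way to see the "integrate longer, get better" effect, assuming the per-sweep error is roughly independent and zero-mean: the standard error of the average of N measurements falls as the square root of N. With made-up but plausible numbers:

    \[
    \sigma_{\bar{x}} = \frac{\sigma}{\sqrt{N}}, \qquad
    \sigma = 0.5\ \text{mm},\quad N = 600\ \text{sweeps (5 s at 120 Hz)}
    \;\Rightarrow\; \sigma_{\bar{x}} \approx 20\ \mu\text{m}.
    \]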

EW (01:17:49):

Like GPS.

AY (01:17:50):

Yeah.

EW (01:17:52):

And so how often did you attribute these things like that to errors in the software, errors in the hardware, before you realized, "No, no, it's sort of an error in the world?"

AY (01:18:08):

It happens a lot. There are many times where we've spent weeks chasing down something that turns out to be some unexpected thing, some unexpected feature. There have been some interesting ones - optics is a crazy difficult discipline. There are so many compromises, and some of them are things you didn't even think of. We've chased down stray light in lenses where we didn't even imagine the path could exist. But the path existed.

AY (01:18:36):

The universe is one of those things that will remind you that it's in charge. Your models and everything else, and your perceptions about how it should work, are often dead wrong. So we're very much about making empirical measurements and studying these systems. And we still have many, many things to learn - this is all brand new, right? It hasn't been around in the world very long. So as a first-generation thing, we keep finding interesting features, interesting nonidealities, that we have to learn how to correct. Every day.

EW (01:19:09):

And they said physics and calculus wouldn't be important in our careers.

AY (01:19:16):

[Laughter]. Yeah. That's when you asked me that question much earlier about whether science, engineering, maths, what was the other one?

EW (01:19:21):

Technology.

AY (01:19:23):

Technology, right. All of the above, right? Science gets you the information so that you can do engineering, and everything is talked about in terms of math. So you need all of them. That's what I meant by all of the above.

EW (01:19:36):

Alright. I'll accept that answer now. Do you have any last thoughts you'd like to leave us with?

AY (01:19:45):

Well, yeah. [Laughter]. You said this would be a hard question, and you weren't wrong. I think the future is super exciting for all this stuff. This is kind of a generation 0.9, really. It was the minimum thing that could possibly be made for the consumer. And I'm just super excited to see where it goes...Every day when I see new experiences, or new games that people have created - the creativity of humans, and what you can do when you give them a new medium, is just mind-blowing. And I'm super excited.

EW (01:20:23):

Me too. Me too. Very much. Well, my guest has been Alan Yates, hardware engineer at Valve. Thank you so much for being here.

AY (01:20:34):

Thank you very much. It's been a lot of fun.

EW (01:20:36):

Thank you also to iFixit for their very informative teardown. You didn't think I just knew all this stuff, did you? Also thank you to Andrei Chichak for giving me some great questions, and thank you to Christopher for producing and co-hosting. Finally, of course, thank you for listening. Hit the contact link or email show@embedded.fm if you'd like to say hello.

EW (01:20:59):

As usual, I do have a final thought to leave you with, this one from R. Buckminster Fuller: "You never change things by fighting the existing reality. To change something, build a new model that makes the existing model obsolete."

EW (01:21:16):

Embedded FM is an independently produced radio show that focuses on the many aspects of engineering. It is a production of Logical Elegance, an embedded software consulting company in California. If there are advertisements in the show, we did not put them there and do not receive any revenue from them. At this time, our sole sponsor remains Logical Elegance.