371: All Martian Things Considered
Transcript from 371: All Martian Things Considered with Doug Ellison, Elecia White, and Christopher White.
EW (00:00:06):
Welcome to Embedded. I'm Elecia White, alongside Christopher White. We're on Earth. And this week we are talking to the Curiosity rover from Mars.
CW (00:00:15):
I don't think that's right.
EW (00:00:17):
Well, the time lag is going to make it tedious. So let's talk to Doug Ellison, the Mars Science Laboratory Engineering Camera Team Lead from NASA's Jet Propulsion Laboratory, while we wait to see if Curiosity is on the line and ready to chat.
DE (00:00:33):
Curiosity will be here in about five minutes time.
CW (00:00:37):
Hey Doug, how are you?
DE (00:00:38):
I am very well. How are you doing?
CW (00:00:40):
Good.
EW (00:00:41):
Could you tell us about yourself as if we saw you on a panel about Martian things?
DE (00:00:49):
...All Martian things considered. So, I am the Mars Science Lab Engineering Camera Team Lead, which means I get to take pictures on Mars with the Curiosity Mars rover, and also look after the team of people who...operate our engineering cameras.
DE (00:01:06):
They are the cameras we use to have a look around, check the lay of the land, see where we want to go and point our other cameras. They are how we don't get lost when we're traveling around Gale Crater with the Curiosity Mars rover.
EW (00:01:20):
We do a lightning round, where we want to ask you short questions, and we want short answers. And if we're behaving ourselves, we won't ask you for all the technical details. Are you ready?
DE (00:01:29):
I am very ready. Let's go.
CW (00:01:31):
Marvin the Martian or Martian Manhunter?
DE (00:01:34):
Marvin the Martian.
EW (00:01:36):
Mars or Europa?
DE (00:01:39):
Why not both?
CW (00:01:40):
Favorite fictional robot?
DE (00:01:42):
WALL-E.
EW (00:01:44):
Is faster than light travel possible?
DE (00:01:47):
Unlikely.
CW (00:01:48):
Color or black and white?
DE (00:01:50):
Black and white.
EW (00:01:52):
Which is harder, designing waterproof things for terrestrial deep sea exploration or regolith-proof things for Mars?
DE (00:02:01):
Regolith-proof things for Mars. Mars wants to ruin a lot of stuff very quickly.
CW (00:02:08):
Do you like to complete one project or start a dozen?
DE (00:02:12):
I like to have a dozen irons in the fire, all dedicated to one big project.
EW (00:02:17):
Do you have a tip everyone should know?
DE (00:02:22):
Try and surround yourself with people who are cleverer than you are. It's never a bad idea.
EW (00:02:27):
I totally agree with that. Okay, so -
CW (00:02:30):
That's easy for me.
DE (00:02:31):
[Laughter].
EW (00:02:31):
[Laughter].
EW (00:02:34):
This was going to be a lightning round question, but since you said what you said about the regolith, I think maybe I want more detail. What is harder to deal with on Mars for the electronics, the ESD, the radiation, the cold, or the dust? Or something else?
DE (00:02:50):
Yes, it's all of those. It starts even on Earth. You've got kind of the tangible shake, rattle, and roll of a rocket launch. Go and watch the footage from on board old space shuttle rocket launches, and those astronauts are getting rattled around inside there.
DE (00:03:08):
You can see their heads shaking. They are huge g-forces to survive. And so all of the electronic boards in and of themselves have to survive that ride into space in the first place. And then you've got a nine-month cruise to Mars in deep space.
DE (00:03:24):
And so you're outside any magnetosphere. You've got cosmic rays hitting you every now and again. Then you've got the landing at the other end. That's typically not as harsh as the launch was, but the g-forces can peak pretty high.
DE (00:03:36):
And then finally, you're where your electronics were designed to do their job in the first place. You're on the surface, and now you've got the daily temperature swings...The coldest we're allowed to use our cameras is minus 55 centigrade.
DE (00:03:50):
We will heat them up regularly in the morning to get them that warm. Mars is an incredibly cold place. And so we spend a lot of energy just making stuff hot enough to not break. The radiation is not too bad. We don't tend to see a lot of our pictures getting damaged that way.
DE (00:04:09):
We do see occasional cosmic ray hits to some of our pictures and things like that. But for us, the thing that really bites us is the freezing cold temperature.
DE (00:04:15):
It's caught us once or twice where our cameras have not been warm enough first thing in the morning when taking some pictures, and the rover said, "Meh, I tried heating them up. Not warm enough. No pictures for you. Try again tomorrow." And so I'd say it's the cold.
CW (00:04:33):
Listening to you talk about the vibration impacts brings to mind, I've used telescopes in the past for amateur astronomy, and they're very sensitive just to being shocked. And you have to recollimate things if you bump it the wrong way or look at it the wrong way.
CW (00:04:46):
It seems like the optics would be really sensitive to being jostled. Is that a major problem that you have to do something special for?
DE (00:04:55):
Not really the optics themselves, but one thing we have had to take a look at on a couple of occasions is exactly where our stereo pairs of cameras are in relation to one another.
DE (00:05:06):
We do a thing called a thermal characterization stare test, where we will stare at the same patch of ground from the earliest time we can in the morning to the latest time we can in the afternoon. Because the bracket that all our cameras are bolted to, it can move.
DE (00:05:23):
Things can creep as the temperatures change during the day. And if those cameras aren't exactly in the same position relative to one another, then when we generate stereo data from those cameras, it can change. And we actually saw quite significant changes.
DE (00:05:37):
And so we've had to develop not just camera models for the cameras, but temperature-dependent camera models that change ever so slightly as the temperatures go from minus 50, minus 20, I think the hottest one is plus 10 degrees centigrade.
DE (00:05:51):
Just because a chunk of metal getting exposed to a hundred degree C of temperature range is going to change. It's going to move, and our cameras move with it, which makes things pretty tricky.
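In rough pseudo-code terms, a temperature-dependent camera model like the one Doug describes might interpolate calibrated parameters between the characterized temperature points. This is only an illustrative sketch; the function, the stereo-baseline parameter, and all values here are invented, not the real MSL calibration.

```python
# Hypothetical sketch: interpolate a stereo-camera model parameter between
# calibration temperatures (-50, -20, +10 C). Values are invented.

def interpolate_model(temp_c, calibrations):
    """Linearly interpolate a camera parameter from (temperature, value) pairs."""
    pts = sorted(calibrations)
    if temp_c <= pts[0][0]:
        return pts[0][1]
    if temp_c >= pts[-1][0]:
        return pts[-1][1]
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)

# Imaginary stereo-baseline values (millimeters) at the three calibration points.
baseline_mm = [(-50, 424.90), (-20, 425.00), (10, 425.10)]
print(interpolate_model(-35.0, baseline_mm))  # halfway between the -50 and -20 points
```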
EW (00:06:02):
Because the cameras that you've worked on are the engineering cameras that are on the top of the mast, right?
DE (00:06:06):
Yeah. So we've got two kinds of cameras. We've got navigation cameras that are up on top of the mast. And then we have hazard cameras, front hazard cameras, and rear hazard cameras. Think of them as the backup camera on a car.
DE (00:06:17):
We've got a set at the front, and we've got a set at the back. But we actually have, of all those three kinds of cameras, NavCams, front HazCams and rear HazCams, we have stereo pairs of each.
DE (00:06:28):
And of those stereo pairs, we have one complete set tied to one of our flight computers and one complete set tied to our spare flight computer. So in total, we've actually got four cameras up on the mast, four at the front and four at the back.
DE (00:06:41):
And then up on top of the mast, we've also got our friends from the science teams. We have the color MastCam cameras and the ChemCam laser spectrometer as well.
EW (00:06:50):
So you have your backup computer, and you have some cameras going to your backup computer. They don't all go to everything, right? Only half the cameras at a time are useful.
DE (00:07:01):
Yeah, that's right. So we landed using, imaginatively enough, the A-side computer. About 200 days in we had some trouble with it. So we swapped to the B-side computer. And in doing so, you swap cameras with the engineering cameras. You swap from the A cameras to the B cameras.
DE (00:07:15):
If you look at one of our selfies, that's actually taken with a microscope, which is called MAHLI, on the end of our robotic arm. If you look at the selfies we take, you'll actually see two pairs of engineering cameras, kind of hanging off the bottom of that camera mast.
DE (00:07:28):
The top pair are on the A computer. The bottom pair are on the B computer. And so we've been on the B computer for the majority of the mission. And so the A-side engineering cameras don't get an awful lot of love. They don't get used very often.
EW (00:07:40):
And it really is just a hard switchover, one or the other, and -
DE (00:07:44):
Yeah.
EW (00:07:44):
- only done a couple times per mission.
DE (00:07:46):
Yeah, it's done only when you absolutely need to. So we had some A-side computer flash memory issues. We swapped over to the B-side. And then we had basically kind of a file system problem with our B-side computer a couple of years ago.
DE (00:08:01):
So temporarily we swapped back with the A-side again. But when we did, we discovered actually the A-side was really not in great shape and probably only useful as a lifeboat in the future. So from the A-side, we can hook over to the B-side.
DE (00:08:14):
We did some diagnostics, and then we swapped back to the B-side. And we've now since done a nice flight software update called "Our Hope", which basically turns our CEA, our A computer, into a lifeboat, should we need to go and live on there for a while if something happens to B in the future.
EW (00:08:30):
So Percival, nope. Perseverance. Percy, sorry. Percy is the rover that just landed recently and it's got the helicopter Ingenuity. And Curiosity is a very similar sort of rover, right?
DE (00:08:50):
Very, very similar. I mean, if you didn't know them well enough, and you saw them parked side by side, you could be forgiven for thinking they're twins. They are very, very similar. Their flight computers are the same. Their landing system is largely the same. Their mobility system is largely the same.
DE (00:09:04):
One's just the slightly younger sibling. And so it comes with a few new toys. And so, at long last, this kind of tradition of one megapixel black-and-white engineering cameras has finally been superseded with the enhanced engineering cameras on Perseverance.
DE (00:09:22):
And so, as you can imagine, the team operating those is the team that used to operate the engineering cameras on Curiosity, and holding the fort back with Curiosity. But we like to mock each other.
DE (00:09:34):
And so there'd be kind of the bougie Technicolor, Jezero Crater crowd versus the more Ansel Adams style, one black and white megapixel at a time, down at Gale Crater crowd. But,...it's fascinating to see. It's about time we got an upgrade.
DE (00:09:51):
But we still do great stuff with our one megapixel cameras. We got to enjoy a twilight cloud season with Curiosity this year, whereas Perseverance wasn't quite ready, so they didn't get to enjoy the twilight clouds like we have. So there's some friendly rivalry, but we're doing great stuff on both sides of the planet.
CW (00:10:08):
The cloud pictures were just incredible, but the animations -
DE (00:10:10):
Yeah.
CW (00:10:10):
It just blew my mind more than almost anything else. I mean, everything blows my mind from Mars, but for some reason seeing those high cirrus clouds was like, "Oh, that's very Earth-like. It's a planet. Oh, okay. It's a planet."
EW (00:10:21):
It's a real planet.
CW (00:10:22):
Yeah.
DE (00:10:23):
Yeah, it's easy to forget it has an atmosphere. I mean, it's not a great atmosphere.
CW (00:10:25):
Right.
DE (00:10:25):
Frankly, it sucks. But it has an atmosphere. It has a water cycle. It has a carbon dioxide cycle.
DE (00:10:32):
And so at one particular time of year, and it's kind of late fall going into early winter, we get these twilight clouds. And what's amazing is how quickly they show up, how spectacular they are for a month or two, and then they're just gone. They vanish.
DE (00:10:50):
And so while they're around, we try and put into the plan these little blocks of "Let's go and take a look at the twilight clouds. You might learn something." And then one day we put them in and just nothing shows up. Just clear, empty skies. Just gone.
DE (00:11:04):
And so, for this Martian year at least, twilight cloud season is over. But we had more luck this year, I think, than we have in any previous year.
DE (00:11:12):
We'd taken the time to take the lessons learned from last Martian year and redesigned some of our observations, made them more efficient, made them quicker, used them more often. We got some spectacular results. So we were all like, "Okay, this is the best cloud season we've had yet."
EW (00:11:29):
And when you say last Martian winter, that's two years ago.
DE (00:11:33):
Two Earth years ago. Roughly speaking, a Mars year is two Earth years...You get 360 Martian days in one Earth year and something like 660 Mars days in a Mars year. It's about twice as long. So when we talk about seasons, we're talking about seasons as we experience them on Earth, but they all last twice as long.
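As a quick sanity check on the numbers Doug quotes from memory, a Martian sol is about 24.66 hours and a Mars year is about 687 Earth days; the figures below are standard approximations, not from the transcript.

```python
# Rough check of the sol arithmetic: a Martian sol is about 24.66 hours,
# and a Mars year is about 687 Earth days.
SOL_HOURS = 24.66            # approximate length of a Martian sol in hours
MARS_YEAR_EARTH_DAYS = 687   # approximate Mars year in Earth days

sols_per_earth_year = 365.25 * 24 / SOL_HOURS
sols_per_mars_year = MARS_YEAR_EARTH_DAYS * 24 / SOL_HOURS

print(round(sols_per_earth_year))  # in the ballpark of the "360" quoted
print(round(sols_per_mars_year))   # a bit above the "660" quoted
```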
EW (00:11:58):
And Curiosity landed on Mars in 2012.
DE (00:12:05):
2012. Yes. It was August, 2012.
EW (00:12:09):
And the cameras that it has, because I'm trying to focus on that instead of everything, because I just want to say, "Mars. Everything. Tell me everything!" But focusing on the cameras, those had a lot in common with the previous ones, Spirit and Opportunity. Is that right?
DE (00:12:26):
Yeah. And in turn, the cameras on Spirit and Opportunity, largely speaking, were inherited from what would have been the 2001 lander that was canceled in the wake of the twin failures in 1999 of Mars Climate Orbiter and Mars Polar Lander.
DE (00:12:42):
There was going to be a 2001 lander that had an instrument called PanCam, that was going to have these one megapixel cameras that then would have color filters in front of them, to build up color mosaics using red, green, and blue filters, and a variety of other filters to do kind of cool science stuff.
DE (00:12:57):
And when that mission was canceled, they went "Well, what's next?" And those PanCam sensors were inherited by what became PanCam on Spirit and Opportunity as part of their payload.
DE (00:13:08):
And for simplicity, for ease of use, for commonality, for ease of testing, and design, and stuff like that, and frankly, to save some money, they went "Well, let's use the same sensor and the same electronics on all of the cameras."
DE (00:13:20):
So the high-resolution color science cameras, the navigation cameras, the hazard cameras, the microscope, they all use the same one megapixel black-and-white sensor. And when Curiosity came along, when it was being designed, its design really was starting not long after Spirit and Opportunity hit the ground.
DE (00:13:36):
And so they went, "Well, what have we got that works? Ah, these engineering cameras will be fine. They worked with Spirit and Opportunity, whatever. Sure we'll have some new shiny color science cameras up on top, but engineering cameras, you get the same stuff."
DE (00:13:46):
We did some mild upgrades. There's some extra heaters in there to make them a little easier to heat up in the morning when we need to. But they're, largely speaking, the same. And so, weirdly, I actually certified to operate the cameras on Curiosity first.
DE (00:14:01):
And then I went back to Opportunity and did the last kind of 18 months of Opportunity's mission operating those engineering cameras. Because they're so very, very similar. And when it came to Perseverance, the cameras got a big upgrade for a couple of reasons.
DE (00:14:17):
One is we can get more data home from Mars on a given day these days. And the other is, we'd kind of run out of the one megapixel black-and-white style.
CW (00:14:27):
Yeah, yeah.
DE (00:14:27):
But I am delighted. There is one of the retro MER, MSL fan club, black-and-white sensors left. And it's on an upward-facing sky camera on the deck of Perseverance. So we're not dead yet. We're still spitting out these one megapixel sensors when we can.
EW (00:14:45):
Why has the bandwidth gotten so much better, and how much bandwidth are we talking?
DE (00:14:50):
So, yeah, it's easy to think in terms of data rates, but actually it's more data volumes. So we very rarely, very, very rarely in fact, send data back from our rovers direct to Earth. And we have a high gain antenna that can send data straight back to the Earth, but it takes an awful lot of power, and its data rate isn't very high.
DE (00:15:09):
So what we do is we use that antenna to send the commands to the rover, but to get the data back home, we use the fleet of Mars orbiters. Now when Spirit and Opportunity landed, that was just two old orbiters. It was Mars Global Surveyor and Mars Odyssey.
DE (00:15:25):
When Curiosity landed in 2012, we had the Mars Reconnaissance Orbiter, a much bigger, more powerful spacecraft. And since then we've had more orbiters arrive. We've had the NASA MAVEN mission. We've had the European ExoMars TGO mission arrive as well.
DE (00:15:41):
And they've all been equipped with these UHF relay radio sets on board to receive signals from the rovers and then send them back down to Earth. So with Spirit and Opportunity, we were lucky to get 50 to a hundred megabits, right? 12 and a half megabytes per day, right?
DE (00:16:02):
A hundred megabits would have been a huge day for Opportunity. With Curiosity, when we landed, we had Mars Odyssey, the old one, still going from 2001. Hence the name. And we had the Mars Reconnaissance Orbiter, this newer, shinier, better radio, and could get maybe three or four hundred megabits once or twice a day.
DE (00:16:20):
And so our daily...data volume has gone up from maybe 50, to a 100, to maybe kind of three hundred to six, seven hundred, maybe? Something like that...Three or four times more.
EW (00:16:32):
But you have to share it between the different rovers.
DE (00:16:34):
Yeah...We have to share it with all the instruments on board our rover, right? We have to share it with the color science cameras, and the instruments inside, and stuff like that. And engineering, housekeeping data, things like that.
DE (00:16:45):
By the time Perseverance landed, you've also got the European ExoMars TGO spacecraft. You've also got MAVEN, and they have again, newer, shiny, better radios. And so now, I mean, we're getting more data home from Curiosity every day now than we have ever before.
DE (00:17:01):
Because we have more orbiters that can do relay. So we have more relay passes per day, which means we can get more data home. And so nowadays we can get a thousand megabits a day, which is 125 megabytes a day, roughly.
DE (00:17:14):
That's kind of average, and it will come in lumps. It comes in these 12 minute passes where the orbiters fly overhead. And anything between fifty and sometimes a thousand megabits in a single pass when we're lucky.
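The bookkeeping Doug describes is simple: relay passes arrive in lumps measured in megabits, and dividing by eight gives megabytes. A minimal sketch, with invented per-pass sizes:

```python
# Sketch of the daily data-volume arithmetic: relay passes arrive in lumps
# of megabits; divide by 8 to get megabytes. Pass sizes are invented.

def daily_volume_megabytes(passes_megabits):
    """Total downlink for a sol, in megabytes, from per-pass megabit totals."""
    return sum(passes_megabits) / 8

passes = [250, 600, 150]  # three relay passes, in megabits
print(daily_volume_megabytes(passes))  # 1000 megabits -> 125 megabytes
```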
DE (00:17:27):
And so the thing we've often struggled with, which is the amount of data we can get home for Curiosity, isn't so much of an issue anymore. We don't tend to worry about data volume on the day-to-day anymore.
EW (00:17:39):
Well, that's kind of cool. You said you operate the cameras. What does that mean? I assume you don't have a very long stick that you push the button for.
DE (00:17:50):
It's a gigantic selfie stick.
CW (00:17:52):
Yeah.
EW (00:17:52):
Yeah.
DE (00:17:52):
It's 120 million miles long, right? And it's got a button at one end, a camera at the other. It's easy. So...we'd love to have a joystick and a button, right? That'd be great.
DE (00:18:00):
But because of the speed of light, at the very closest, it's four minutes for a signal to get from Earth to Mars and then four minutes for it to get back, right? And when they're on opposite sides of the solar system, it's more like 20 plus minutes each way.
DE (00:18:14):
So we don't operate them in real time. We basically send the rover a to-do list when it wakes up in the morning. And then we get the data home from it, doing those things, in these lumps, through these relay passes through the orbiters.
DE (00:18:29):
And then we take the last one of those passes that happens before we go in to do the next planning cycle. And we go, "Okay, that's where we've got to. What are we going to do next?" And on the engineering camera side, what we're looking at is, we have some science observations we do.
DE (00:18:46):
We do things like looking for clouds, looking for dust devils. We also do documentation of whatever we're doing with the robotic arm on the rover. So every time we get a microscope out, or our spectrometer on the end of the arm out, each time we put that on a new target, we'll document it with the engineering cameras.
DE (00:19:05):
And then whenever we drive, we take pictures as we're going, because the rover actually uses those to measure its own progress versus how far it thinks it's got. Kind of like a very, very slow, gigantic optical mouse, basically. It's called visual odometry.
DE (00:19:20):
And then at the end of every drive,...in a series of different chunks, we take a complete 360 of where we've got to, so that the next shift we can go, "Okay, this is where we are now. Where do we want to drive next? What do we want to reach out and touch with the robotic arm?"
DE (00:19:35):
"What do you want to go and look at with the high-resolution color science cameras up on top of the mast?" And taking those pictures is basically little snippets of code. They are human readable...It's like a scripting language. It's actually been abstracted by the flight software.
DE (00:19:50):
So it's pretty easy to write these commands to tell the rover to do stuff, but it's "Take a picture. It's this important. Use these cameras. Point in this direction. Crop it or shrink it, then compress it, and we're done." And it might be just one image. It might be a mosaic of five or seven or twelve images.
DE (00:20:09):
And a planning cycle is basically ingesting all the requirements for the engineering cameras for that given plan, turning those into commands, making sure they're good, submitting them to the system...
DE (00:20:21):
They're modeled together in one big giant simulation together. If that simulation ends the day right side up, we've seen it's good, and we send it to the robot.
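To give a feel for the kind of human-readable, abstracted command Doug describes ("take a picture, it's this important, use these cameras, point in this direction, compress it"), here is a toy stand-in. The field names, camera identifier, and command syntax are entirely invented, not the real MSL command dictionary.

```python
# Toy illustration of an abstracted imaging request; all names are invented.
from dataclasses import dataclass

@dataclass
class ImageRequest:
    camera: str            # e.g. a hypothetical "NAVCAM_LEFT_B"
    priority: int          # downlink importance
    az_deg: float          # pointing azimuth, degrees
    el_deg: float          # pointing elevation, degrees
    bits_per_pixel: float  # compression budget

    def to_command(self):
        """Render the request as a human-readable command line."""
        return (f"IMAGE {self.camera} PRI={self.priority} "
                f"AZ={self.az_deg} EL={self.el_deg} BPP={self.bits_per_pixel}")

req = ImageRequest("NAVCAM_LEFT_B", priority=2, az_deg=135.0, el_deg=-10.0,
                   bits_per_pixel=4.0)
print(req.to_command())
```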
CW (00:20:31):
You said one of the commands is to compress it. If you're doing imagery for science, I imagine you want the absolute raw pixel data. There's lossless compression I assume you apply, but do you ever...say, "Give me a 50% JPEG real quick?"
DE (00:20:46):
So we have two modes. The color science cameras actually have a percentage of JPEG compression.
CW (00:20:52):
Oh, okay.
DE (00:20:52):
And if they want to uncompress, the command they use is 101. But with the engineering cameras, we have two modes of compression. One is called LOCO. People think it stands for lossless compression. It's actually low-complexity compression. It's a very, very quick algorithm.
DE (00:21:09):
Typically speaking, it will return lossless data, but it's shrunk by about 20, 30, 40%. But we don't know exactly how well it's going to compress before we take the picture, right? We don't know exactly...what is the ground going to look like just over that hill.
DE (00:21:25):
Is it going to be full of lots of high-contrast, high-frequency detail that's not going to compress great, or is it going to be fairly bland, and it will compress really well? You just don't know in advance. So we guesstimate something like eight bits per pixel, roughly, but we don't do that very often, actually.
DE (00:21:40):
We tend to use ICER. It's a different compression algorithm. It's distantly related to JPEG 2000. And we assign the number of bits per pixel. We say, "Okay, typically we'll use three to four bits per pixel." And that's, to the human eye, indiscernible from the lossless compression.
DE (00:21:59):
You can't tell they're more heavily-compressed, but we get a lot more pictures home by doing it that way. And they generate good 3D data on the ground. They're good enough for doing things like dust devil surveys, and cloud surveys, and things like that.
DE (00:22:12):
And we will literally assign between two and four bits per pixel for most of the images we take. And then the flight software will shrink it enough to fit under that limit. But if for some reason it can actually return it losslessly, spending less bits than that, it'll do it.
DE (00:22:27):
But that very, very rarely happens. And so two to four bits per pixel. And so, because it's a one megapixel camera, we can go, "Okay, great,...a one megapixel camera, four bits per pixel, in stereo. Well, now you've just spent eight megabits of data. That's one megabyte for stereo there."
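Doug's back-of-the-envelope calculation can be written out directly: one megapixel, times the bits-per-pixel budget, times two cameras for stereo, divided by eight to convert megabits to megabytes.

```python
# The stereo-pair data budget Doug walks through: a one-megapixel image
# at a given bits-per-pixel budget, doubled for stereo.

def stereo_cost_megabytes(megapixels, bits_per_pixel, cameras=2):
    """Downlink cost of a stereo image set, in megabytes."""
    megabits = megapixels * bits_per_pixel * cameras
    return megabits / 8

print(stereo_cost_megabytes(1, 4))  # 8 megabits -> 1 megabyte
```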
EW (00:22:46):
Have you ever sent the command and then gotten back bad pictures? Maybe you left your finger over the -
DE (00:22:56):
[Laughter]...So in the best way that the rover can leave its finger over the lens, we've done that. So we have a robotic arm. You can think of it as our arm, and it has fingers on the end of it. It has different instruments on the end.
DE (00:23:10):
And one of the observations we do quite frequently is called the dust devil survey...We do eight different pointings to get a complete 360-degree view. And each of those pointings, we will take three images in rapid succession.
DE (00:23:24):
And then the scientists can look at those three images, see if anything changes. If it changes, it's things like clouds, dust devils, and so on and so forth. And if the robotic arm has been used to do something, and it's left parked out in free space somewhere, it's going to be in the way in one of those.
DE (00:23:38):
And that's just the hit we take. It happens. We also do things like, we've driven, it's fairly late in the afternoon, and we've sequenced taking our post-drive imaging. And we'll get the sun in our eyes. We'll get lens flare, we'll get lens glare.
DE (00:23:54):
We have on occasion tried to take pictures after dark, and as long as it's not too long after sunset, it's okay. But sometimes if it's a little dark, the exposure times can kind of explode and suddenly an image that takes 30 seconds, it's still trying after five minutes.
DE (00:24:09):
And the rover goes, "Uh-uh, you're done. Good night." And so we're not perfect. We have occasionally left our finger over the lens. But we have things in place to ideally prevent us from doing that kind of stuff.
EW (00:24:23):
Have any of those ones that were sort of an accident turned out to be scientifically interesting, because they were off the normal path?
DE (00:24:32):
One of our observations we do is called our NavCam sky flats. Every month or so, about an hour before sunset, we actually take a picture of what should be an empty patch of sky directly opposite sunset.
EW (00:24:44):
Should be.
DE (00:24:46):
Should be. And we take a lossless set of images in the middle, and then we shrink-wrap that around the outside with some shrunk images. And basically you generate an artificial gradient from the shrunk images around the outside.
DE (00:24:59):
You've got your big image in the middle. The difference between them is your sky flat, right? It's characterizing the optical response of the entire stack from dust on the lens, through to sensor sensitivity and stuff like that.
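On the ground, applying a flat field is essentially a division: each pixel of a raw image is divided by the normalized response of the optical stack at that pixel. A minimal sketch, using tiny made-up 2x2 "images" in place of the one-megapixel frames:

```python
# Minimal flat-field correction sketch: divide a raw image by a
# mean-normalized flat field. The 2x2 arrays are invented examples.

def apply_flat(raw, flat):
    """Divide a raw image by a mean-normalized flat field."""
    vals = [v for row in flat for v in row]
    mean = sum(vals) / len(vals)
    return [[raw[r][c] / (flat[r][c] / mean)
             for c in range(len(raw[0]))] for r in range(len(raw))]

# A corner with dust (response 0.8) darkens the raw image; the flat undoes it.
flat = [[1.0, 1.0], [1.0, 0.8]]
raw = [[100.0, 100.0], [100.0, 80.0]]
print(apply_flat(raw, flat))  # every pixel comes back to about 95
```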
DE (00:25:11):
And at about the same time that the twilight clouds vanished, we did our sky flats, and our flat field was full of clouds. We're like, "It's the worst sky flats ever. They're beautiful. And they are absolutely useless." And so we've literally said, "Okay, we can't do sky flats until the cloud season's finished."
DE (00:25:32):
And so we're going to try again in a couple of weeks. We may get them actually kind of beginning of May, I think. But yeah, we've had things like that. It's like, "We wanted an empty patch of sky, and Mars just stuck some clouds in it for fun."
CW (00:25:43):
Do you do dark frames as well?
DE (00:25:45):
So that happens as we take the pictures. So we don't have a mechanical shutter.
CW (00:25:50):
Yeah, okay.
DE (00:25:50):
What happens is, the sensor gets flushed. We expose, we read that out, and then we read out a zero-length exposure. And so you're getting kind of dark field and the readout smear from a zero-length exposure.
DE (00:26:04):
That gets subtracted from the image you just took, and that's what gets sent home. We have actually sometimes done zero-length exposures to do things like take an image of the sun with our navigation cameras.
DE (00:26:15):
And just the readout smear is enough for the science team to figure out how bright was the sun, how bright should it have been, and the difference between them is how much dust is in the atmosphere.
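The onboard dark subtraction Doug describes is a per-pixel difference: read out the exposed frame, read out a zero-length exposure, and subtract. A sketch with invented one-dimensional "rows" standing in for image frames:

```python
# Sketch of zero-length-exposure dark subtraction: subtract the
# zero-exposure readout (dark signal plus smear) from the exposed frame.

def dark_subtract(exposed, zero_exposure):
    """Subtract the zero-length-exposure readout from the exposed frame."""
    return [e - z for e, z in zip(exposed, zero_exposure)]

exposed = [120, 130, 250, 128]       # invented: scene signal plus smear/dark
zero = [20, 20, 150, 20]             # invented: smear column under a bright source
print(dark_subtract(exposed, zero))  # [100, 110, 100, 108]
```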
CW (00:26:25):
Oh, wow.
DE (00:26:27):
Yeah. It's called taking a tau. It's a measure of atmospheric opacity.
EW (00:26:31):
You've mentioned dust, and you've mentioned dust devils. Is it true that the dust devils are Martians that come and clean your lenses for you?
DE (00:26:41):
I wish they would. We've tried leaving tips. Nothing. So, our cameras, generally speaking over eight years, have just been getting ever so slightly, ever so slowly, dirtier.
DE (00:26:54):
The front optics have a little coating of dust. And it's at its worst, really, when we have sunlight actually on the front of the lenses, because then the brightness of that dust is contributing to the image signal that ends up at the sensor.
DE (00:27:10):
And we actually have little lens hoods that stick out from the front of the cameras. And if there's a shadow of that lens hood cast across the front of the camera, then half of it has illuminated dust contributing to the picture, and half of it doesn't.
DE (00:27:23):
And so you end up with this weird bright patch on the image...It can be really tricky. Towards the end of Opportunity's mission, when those cameras were 5,100 days old, that was a real problem, if you're taking images late in the day towards the sun. With Curiosity, it's not so bad right now. It's something we're kind of tracking.
DE (00:27:43):
We [inaudible] at home to have guidance of, "Hey, if you want to take pictures that are important in terms of planning the next drive, try not to take them too late in the day. Because you're going to have this issue of the front of the lens being half-illuminated and half not, and getting these bright patches in the images."
DE (00:28:00):
The good thing is it happens to both the left and the right eye together about the same. And so it still makes good stereo data. So it's not the end of the world, but that's why we take those sky flats.
DE (00:28:09):
We actually look to see, "Are we accumulating big chunks of dust? Are we losing big chunks of dust?" And we see it come and go, but generally over time, they're just getting dustier and dustier.
CW (00:28:19):
And those flats are divided at home, not locally on the rover. So you have to send those back.
DE (00:28:25):
Yeah, they get sent home. And then we have a science team member called Mark Lamb, and he's kind of our dust devil and cloud guru. But he's also the genius behind making our sky flats. And we basically build up a sky flat and then kind of migrate over the last couple of sky flats, and kind of keep slowly improving over time.
DE (00:28:44):
But of course, if you take a sky flat now, it has no value whatsoever for the images you took the day you landed. And so we kind of slowly migrate the sky flats as we move forward. When there was a dust storm that killed the Opportunity rover a couple of years ago, it hit Curiosity too.
DE (00:29:00):
It got murky. It got dark. It was pretty horrific. And at the end of it, we saw there was a bunch of crud on our lenses, more than usual, but a lot of that has actually blown away since. Kind of the worst of it splattered onto the lenses and then slowly blew away in the following days.
CW (00:29:17):
So this isn't, I mean, people might think, "Oh, you have cameras. You take pictures, and then you download them, and you look at them, and you post them on the internet." But there's a lot of post-processing that has to be done on Earth to generate something that...can be looked at or analyzed, right?
DE (00:29:31):
So the amazing thing is, and people may not realize this, is that if we were to get a downlink pass from one of the orbiters tonight, there are no humans in the loop when it comes to getting those images processed and on the web for the whole wide world to see.
CW (00:29:44):
Oh, wow. Okay.
DE (00:29:44):
It's completely automated. And in fact, if we were to do a drive with the rover, let's say we drove Saturday night, for example, by Sunday morning, if those images were on the ground, they'd have been processed through our pipelines.
DE (00:29:56):
They've been thrown onto our servers for us to look at. They've been thrown onto the web for the entire public to look at. And we have another script that actually generates kind of the mosaic of the 360.
DE (00:30:05):
And we have it throw that on the web as well for the public to enjoy as well. And in fact, that's kind of a PNG file. It doesn't have any compression to it. There is a lot of processing involved in generating kind of radiometrically-calibrated images for our science team and things like that.
DE (00:30:20):
And then one of the things we do when we push kind of raw JPEGs to the public is we stretch them so at least you can see something, right? We just do a basic stretch on them so that you're not looking at something that's way too dark or way too bright.
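The "basic stretch" he describes is a standard contrast stretch: clip off the extreme percentiles and rescale what remains to the full display range. A minimal sketch, assuming a simple percentile stretch (the exact stretch JPL uses is not specified here):

```python
import numpy as np

def percentile_stretch(img, lo=0.5, hi=99.5):
    """Map the lo..hi percentile range of the image to 0..255, so an
    under- or over-exposed raw frame becomes viewable."""
    p_lo, p_hi = np.percentile(img, [lo, hi])
    scaled = (img - p_lo) / max(p_hi - p_lo, 1e-9)   # avoid divide-by-zero
    return (np.clip(scaled, 0.0, 1.0) * 255.0).astype(np.uint8)

# A very dark frame: raw values only span 10..30 out of a 0..255 range.
dark = np.linspace(10, 30, 10000).reshape(100, 100)
stretched = percentile_stretch(dark)   # now spans the full 0..255 range
```

After the stretch, rocks, sky, and bits of rover are distinguishable even though the raw frame used only a sliver of the sensor's dynamic range.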
DE (00:30:32):
You can actually see there's rocks, there's sky, there's bits of rover, and stuff like that. But it's entirely hands-off. Where the humans get involved is when we do a downlink assessment. Every time we do a planning shift, in the morning there's also a downlink assessment.
DE (00:30:47):
The team called OPGS, the Operational Product Generation System, I jokingly refer to it as overlaying pictures with glue and scissors, generates a bunch of specific products, specific mosaics, that help with planning for that day.
DE (00:31:01):
So there's one that shows the ChemCam team where they can shoot their laser. There's another that shows the color MastCam team what pictures they've already taken, and so what's left to have a look at with the color cameras, and things like that.
DE (00:31:14):
And then meshes get generated to help the rover planners, where they're going to drive next. And that's kind of the hands-on processing that happens on the ground.
DE (00:31:23):
But pictures received on Earth, processed through the pipeline, thrown onto our servers, thrown out to the public. A quick mosaic made and thrown out. Completely hands-off. Completely automated.
EW (00:31:34):
With the meshes, and you said left, right, you do distance calculations. Do you do VR systems? I mean, I just, I want to walk through it.
DE (00:31:46):
So, funny you should mention that. I started at JPL doing education and communications. Then I moved into a group at JPL called the Ops Lab. And the Ops Lab is like a little skunkworks embedded within JPL that does kind of emerging tech and how can that be applied to mission operations and things like that.
DE (00:32:06):
And one of the things we did about six years ago was team up with Microsoft. And they said, "Hey, we've got this cool thing called the HoloLens we're developing."
EW (00:32:14):
[Affirmative].
DE (00:32:14):
"Got something cool you could do with it?" And we went, "We could use it to walk on Mars." And so a project was born that ended up being called OnSight. And OnSight lets our science team put on a HoloLens and walk into our 3D data.
DE (00:32:31):
And the real genius behind OnSight is actually merging lots of different data sets to make the best possible 3D mesh. And so it starts with a kind of Google Earth-ish quality orbital data that then overlays the 3D data from the navigation cameras, our engineering cameras, on top of that.
DE (00:32:49):
And then when we have color data, it overlays that on top of that as well. And so most of the time it's kind of a little black-and-white patch of terrain, but then sometimes if we hang around somewhere for a while, we'll have accumulated enough color images for it to be a complete color 360.
DE (00:33:02):
Now, that experience is one the science teams still use, the science team members with HoloLenses in their offices. And they use it to walk on Mars, wrapped in bunny rabbit ears, but walk on Mars.
DE (00:33:13):
But we also made a spin-off of that that was called Destination Mars, where we actually took three of our best terrain models. We holographically captured actual, in-real-life Buzz Aldrin to narrate these three terrain models.
DE (00:33:26):
And one of our rover drivers, Erisa Hines, we have her captured to talk about the rover, and what we do, and where we go, and stuff like that. And that was turned into an experience that was at the Kennedy Space Center Visitor Complex for about three months back in late 2016.
DE (00:33:39):
And that was really, really fantastic. And then a spin-off of that is now available to the public. So you can go to mars.nasa.gov, and in a menu somewhere, you'll find a surface experience where you can look at those same terrain models in your browser.
DE (00:33:51):
And then Google took that same data, and they turned it into kind of a web VR thing that works in more affordable headsets as well. So yeah, we take that 3D data, and sometimes we walk into it.
EW (00:34:06):
What camera capabilities do you wish Curiosity or Perseverance had that they don't?
DE (00:34:14):
So assuming I have carte blanche for data volume, I'd like to take more videos. It would be nice to take video as we're driving. I mean, a lot of people say, "I want live 4K video from Mars. Oh my God, my cell phone can take 12 megapixel pictures, why can't NASA on Mars, blah, blah, blah."
DE (00:34:32):
And the problem is, 99% of the time, if you had a RED 8K camera parked on a Mars rover filming the Martian terrain, and then you sent all that data home, you would see nothing happen, right? There's nothing going on out there. Nine times out of 10.
DE (00:34:50):
The only thing that really happens is we drive around, we use the robotic arm, we take pictures with the mast. And so it'd be nice to see some of that as video. It would be nice to be able to take pictures more quickly with our engineering cameras.
DE (00:35:06):
If we're really lucky, and we tweak a few dials and press a few buttons, we can take pictures about once every 15 seconds or so with our NavCams and our HazCams. It would be nice to be able to do a little bit quicker than that, particularly with things like dust devil searches.
DE (00:35:20):
But in many respects, Perseverance has it very, very good. Because they have 20 megapixel color engineering cameras.
EW (00:35:27):
Ooo.
DE (00:35:27):
But because of the fact that the rest of the avionics are heritage from Curiosity, they take those 20 megapixel color pictures, but then they have to read them out one megapixel at a time and save them to the flash memory. And they get sent home one megapixel at a time.
DE (00:35:45):
Because they were designed with our old cameras in mind, not these new shiny cameras in mind. And so you can go and see those pictures online as well. And you'll sometimes see the whole thing as a one megapixel color picture.
DE (00:35:56):
And sometimes you'll see it as 16 tiles of kind of 1280 by 720 images, but you've got to try and assemble them yourself to get the full glorious image. But a little higher frame rate, maybe a few more videos would be cool.
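Reassembling those tiles is mechanical once you know the grid layout. A toy sketch of the stitching step, assuming the tiles arrive in row-major order and are all the same size (`assemble_tiles` is an invented name, not JPL tooling):

```python
import numpy as np

def assemble_tiles(tiles, grid_rows, grid_cols):
    """Stitch a row-major list of equally sized tiles into one frame:
    hstack each row of tiles, then vstack the rows."""
    rows = [np.hstack(tiles[r * grid_cols:(r + 1) * grid_cols])
            for r in range(grid_rows)]
    return np.vstack(rows)

# Toy 2x2 grid of 3x4 tiles, each filled with its own index.
tiles = [np.full((3, 4), i) for i in range(4)]
full = assemble_tiles(tiles, 2, 2)   # one 6x8 frame
```

For the real downlinked products you would first have to sort the tiles by their position metadata; here the ordering is simply assumed.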
DE (00:36:08):
But really some more data volume. And honestly, just more time in the day, more power to use the cameras we've got more often would be lovely.
EW (00:36:17):
The rover can do a lot with its own images. It doesn't need the humans to tell it what to do for a lot of things. How much can it do autonomously?
DE (00:36:28):
So we're very careful about what we let the rover do on its own, because let's be honest, it's a nuclear-powered rover which shoots lasers out of its mast. And so we don't want it to go rogue.
EW (00:36:40):
We've all seen that movie.
DE (00:36:41):
It could be terrifying. And so,...the driving is the thing where the imagery gets used autonomously most often. So we can drive in different ways. We can say, "Hey, turn all six drive motors on for the next 37 meters of revolutions, and then call it a day." We call that blind driving.
DE (00:36:59):
It's literally just hands over your eyes, drive. But then we can do things like visual odometry where the rover takes pictures as it's going. And it analyzes those pictures. It looks for features.
DE (00:37:09):
It matches those features from one set of images to the next, and establishes exactly where it has got to, compared to where it thought it was, and then drives in little steps, one-meter steps. Drive a meter, take some pictures, see what's moved, drive another meter, and so on and so forth.
DE (00:37:24):
And doing that, we can drive actually very accurately. We can say, "Hey, we're going to drive to a point that's 37 meters east and 14 meters south of where we are right now." And we can do that drive to within, if we're lucky, a couple of centimeters. It's pretty remarkable.
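The step-measure-correct loop he describes can be caricatured in a few lines. This is a deliberately crude sketch: a constant slip factor stands in for the actual feature matching, and `drive_to` is an invented name, not flight software:

```python
import math

def drive_to(target, step=1.0, tol=0.05, slip=0.9):
    """Toy visual-odometry loop: command a short step toward the target,
    'measure' how far we actually moved (a fixed slip factor here), and
    re-aim after every step. Because each step is measured and the heading
    is recomputed, slip never accumulates into a large position error."""
    x, y = 0.0, 0.0
    for _ in range(1000):
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy)
        if dist < tol:
            break
        s = min(step, dist)
        moved = s * slip              # commanded motion, degraded by slip
        x += moved * dx / dist
        y += moved * dy / dist
    return x, y
```

Contrast this with "blind driving": with the same slip but no per-step measurement, the 10% error would simply accumulate over the whole drive.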
DE (00:37:38):
And then the ultimate version of that is where we do autonav, where we say to the rover, "Hey, go as far as you can over there. You've got an hour and 45 minutes. Get to it."
DE (00:37:47):
And we'll typically do a bunch of kind of targeted visual odometry driving to begin with, but we'll set it loose. And it will start taking pictures of the terrain in front of it on its own, figure out which parts are safe and which aren't, and drive along the safe parts that get it towards where it wants to go.
DE (00:38:05):
Kind of like a mouse down a maze. And again,...it's doing little 3D meshes of the terrain in front of it, checking for hazards. And we get all those pictures home eventually as well. And kind of little shrunk versions. We shrink them down to 256 by 256.
DE (00:38:20):
Otherwise the flight computer would take weeks to try and process one stereo pair. And the other thing we let the rover do on its own is, often on the second day of a two day plan, we'll have driven on the first day. And so we won't know where we are for the second day.
DE (00:38:35):
So we'll turn on a thing called AEGIS, A-E-G-I-S. You can look up loads of papers about it by my colleague, Raymond Francis. And AEGIS basically lets the rover take a pair of images off to its kind of forward and right, kind of off its right shoulder, of the ground, whatever the ground is right in front of it, off to the right.
DE (00:38:51):
Take a picture, look at that picture, analyze it for jaggedy white rocks, or rounded, dark-colored rocks, or some variation in between, identify the middle of one of those rocks and then shoot it with the laser beam. And actually do what's called LIBS, Laser Induced Breakdown Spectroscopy.
DE (00:39:10):
It's remote elemental composition using a laser spectrometer. And AEGIS will take pictures, shoot stuff, and then save all that data and send it home to us. So it kind of keeps a human out of the loop basically. And we've done a comparison of, "Okay, if we had to have a human in the loop, what would they have shot in that picture?"
DE (00:39:29):
And it overlaps with what AEGIS has done most of the time. And so AEGIS is like our little onboard geological gnome that goes, "Okay, let's find a cool rock to shoot. I'm going to shoot that one," and shoots it. Meanwhile, we're planning what's going to happen the day after that.
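The "find a jaggedy white rock, aim at its middle" idea can be illustrated with a crude threshold-and-centroid pass. Real AEGIS does far more sophisticated detection (see the Raymond Francis papers he mentions); this sketch, with the invented name `pick_bright_target`, only conveys the flavor:

```python
import numpy as np

def pick_bright_target(img, threshold=200):
    """Crude stand-in for autonomous target selection: find pixels
    brighter than a cutoff and return the centroid (row, col) of that
    mask, i.e. 'the middle of the bright rock' to aim the laser at."""
    mask = img >= threshold
    if not mask.any():
        return None                  # nothing worth shooting
    ys, xs = np.nonzero(mask)
    return int(round(ys.mean())), int(round(xs.mean()))

# Dark terrain with one bright 'rock' centered near row 12, col 20.
terrain = np.full((32, 32), 40, dtype=np.uint8)
terrain[11:14, 19:22] = 230
target = pick_bright_target(terrain)
```

The real system also scores candidates (size, shape, albedo) against what the science team would have chosen, which is the comparison Doug describes.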
DE (00:39:43):
And we were not even aware of what the rover was actually doing. It's fantastic. But a lot of the time for the bigger decisions, you really do want humans in the loop. We tend to make the rover quite cowardly. We set pretty conservative limits on pitch and roll.
DE (00:40:01):
We set conservative limits on suspension. We set conservative limits on wheel slip and things like that, because we'd rather avoid the rover getting itself into trouble that we then have to go and get out of.
CW (00:40:12):
I have a dumb question. As you're navigating things, there's no GPS on Mars.
DE (00:40:19):
No.
CW (00:40:19):
...You talked about merging the data sets from the orbiter and then...from the stereo cam to make the VR stuff. How do you locate stuff to within a high degree of precision without GPS?
DE (00:40:36):
The rover's ability to know its own progress based on, "Okay, starting here, go over there," is really, really good by combining this kind of visual odometry as it goes. And we have an inertial measurement unit on board, a gyro basically.
DE (00:40:54):
And combining those two things, it's really good at measuring its own rate of progress. And often when it's done a 50, 60, 70 meter drive at the end of it, it's within a couple of centimeters of where we told it to go. We get all that data home, and then we do kind of a bundle adjustment.
DE (00:41:12):
We basically compare the images we've taken to the orbital imagery and go, "Okay, the rover thinks it's here. Actually it's a meter and a half off to one side, so let's tweak that position." And we will reset where the rover thinks it is, but in something called a SAP sun update.
DE (00:41:30):
We will literally take a picture of the sun, use it like a kind of old-fashioned mariner, checking the sun for his lat and lon. We will do the same thing to recalibrate our pitch, roll, and yaw that then resets our IMU.
DE (00:41:42):
And we do that every 500 meters of driving or so. And then it's humans in the loop on the ground who take...kind of each leg of the driving, and the resets we do when we do those SAP sun updates every 500 meters or so, and turn that into a geo-referenced route map of where the rover has gone.
DE (00:42:02):
So Fred Calef, and we call them Team LOCO, the localization team, they do an amazing job of merging all the different datasets, kind of ground truthing against orbital imagery, where are we, where do we think we are, into a giant database called PLACES.
DE (00:42:16):
That gets published to the Planetary Data System, so anyone can pull that data. And we end up with kind of centimeter to millimeter accuracy of, "Okay, we did a microscopic image of that particular rock 475 sols ago. And we can give you the lat and lon of exactly where that rock is."
DE (00:42:34):
To six decimal places, even. It's doing it the old-fashioned way. We don't have communication satellites, but we can still correlate what we know from orbit, what we've seen with the rover, and make really, really accurate maps of where we're going, where we've been, and where all our data is.
EW (00:42:50):
Do you ever get to take pictures just because you want to know what's over there or you think that will be a nice composition?
DE (00:42:59):
We'd love to get a little more Ansel Adams than we do. That's for sure...Rarely do we have time in a given plan, and time really means power, the longer we stay awake, the more power we're using, or data volume, to take stuff just because it's pretty.
DE (00:43:14):
I will happily confess that I will sometimes coerce the geology team into requesting what's known as an upper tier, which is, we're parked next to a mountain that's taller than Mount Whitney. And so if you're driving up 395 in Southern California, you need to crane your neck upwards to see the top of Mount Whitney, right?
DE (00:43:33):
It's above you. And so we sometimes need to take pictures above our normal 360-degree panorama to get Mount Sharp, which is this mountain right next to us, and kind of the full horizon.
DE (00:43:45):
And so I'll say to the geology team, "Hey, do you guys think you need an upper tier to target some remote stuff in the next plan?" And they'll go, "Well, I mean, not really." I'm like, "Look,...today, we've got loads of data volume. We're not tight for time. How about we just do a five by one upper tier.
DE (00:44:00):
"I'll shrink it so we don't take more data than we really need." And they're like, "Oh, okay then." So sometimes I'll coerce them just a little bit, and it depends...who's on from the geology team that day. It depends which of the engineering camera team we have on. But I normally get my way.
DE (00:44:18):
I'll try and find an excuse for it, but sometimes you'll go, "You know what? We're tight up against the communications pass, or we're short on power, and it's just not worth pushing for." The other thing we started doing, which I came up with last year actually, is a thing called SPENDI. It's a NASA thing.
DE (00:44:34):
It has to have an acronym. That's just the rules. I didn't come up with the acronym, but I did come up with the idea. It's called Shunt Prevention ENV NavCam Drop-In. SPENDI. There are situations, and it doesn't happen often, where...we have our RTG, the radioisotope thermoelectric generator, in the back.
DE (00:44:53):
And then we have a chemical battery inside with which we drive the rover, and we kind of trickle charge that with the RTG in the back. And batteries like to be between 40 and 80% charge. They don't like being fully charged. They don't like being flat. And if you can keep them in that 40 to 80% range, they're nice and happy.
DE (00:45:12):
If in a given plan, it looks like we're going to be fully charged and we're going to be shunting power, we will drop in a SPENDI. And the SPENDI is like this omnibus edition of all of our regular environmental science observations.
DE (00:45:25):
And so we will get cloud movies, dust devil movies, scattered around the whole horizon, to kind of spend 20 minutes just taking pictures and smelling the roses. And it's kind of, "If we're going to stay awake just to avoid fully charging the battery, why not have a look around while we're doing it?"
DE (00:45:43):
And so, the occasional extra bit of imaging to see Mount Sharp, to see the top of the hills nearby, and then our occasional, "We've got a full battery, let's have a look around anyway." But beyond that, it's pretty rare for us to have much in the way of artistic discretion unfortunately.
EW (00:46:03):
The cameras can't take nearly as much power as moving the rover.
DE (00:46:10):
The thing that actually takes the most power is being awake.
EW (00:46:13):
Okay.
DE (00:46:13):
So, our power supply, the little RTG in the back, generates something like 90 watts or so. So if you do the math, we get about two and a half kilowatt-hours per day of energy. Half that is kind of lost to survival heating, background activities.
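His "do the math" is straightforward to check. Using the round numbers quoted (about 90 watts steady output, roughly half lost to survival heating) and a Martian sol of about 24 hours 40 minutes, you land in the ballpark he gives, a couple of kilowatt-hours per sol:

```python
SOL_HOURS = 24.66            # one Martian sol is about 24 h 39.6 min
RTG_WATTS = 90.0             # quoted steady RTG output

daily_wh = RTG_WATTS * SOL_HOURS      # energy generated per sol, in Wh
survival_wh = daily_wh / 2.0          # roughly half: heating, background
science_wh = daily_wh - survival_wh   # left for driving, imaging, comms
```

90 W over a sol gives about 2.2 kWh; with the output varying between roughly 80 and 100 W as he notes later, the "about two and a half kilowatt-hours" figure is consistent. With maybe 1.1 kWh for everything else, staying awake (which draws more than the RTG produces) forces the cat-nap scheduling he describes.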
DE (00:46:33):
And so when we're awake, we're burning more than the RTG generates. And so if we were to wake the rover up at nine o'clock in the morning and leave it awake all day long, by nine o'clock at night the battery would be flat and it would brown out and die.
DE (00:46:45):
And so we have to take cat naps. We have to say, "Okay, the rover's going to wake up. We're going to use the robotic arm. We'll take a brief nap. We'll then do some science. We'll go driving. We'll do a communications pass." At that point it's six o'clock at night. We'll go back to sleep again.
DE (00:47:01):
And then the rover will nap through the night. It's waking up for communication passes. Data volume, we can always say, "Okay, this is just a pretty picture. It's not a very high priority. Leave it in the trash in the flash memory, and we'll get to it when we get to it in terms of sending the data home."
DE (00:47:15):
But most of the time it's power or time that's the killer. It's, "We don't have enough power to stay up longer and do cool stuff." Or, "We don't have time between starting the commands in the morning and that decisional communications pass in the afternoon to fit any more stuff in."
DE (00:47:32):
And honestly it's one of the most rewarding challenges of operating a rover that's eight-and-a-half years old, is trying to get it to do more, trying to fit more stuff in, what are the tricks we can use to squeeze stuff in, parallelize stuff as best we can to make our slightly arthritic, old rover as productive as it possibly can be.
DE (00:47:54):
It's...really a rewarding part of the project.
EW (00:47:57):
I always find it really fun to try to optimize things until you can get every little thing out of it.
DE (00:48:02):
Yeah. And I've recently gone back and done a deep dive into exactly how long a bunch of our kind of common recurring observations have taken, compared that to how long we give those when we're planning it, and where we can tighten our belts.
DE (00:48:17):
And you know what, we've always said this takes six minutes. It's actually more like five minutes and forty seconds. Let's take 20 seconds off the plan time for that so we can fit more in or go back to sleep earlier, and get more power, and stuff like that.
DE (00:48:28):
It's weird...The past 12 months, a lot of my focus has been, "How can we just take some cool stuff when the battery is fully charged?" But also, "How can we penny-pinch down to five seconds here, twenty seconds there, to also save power?"
DE (00:48:42):
Because sometimes we're good on power. Sometimes we're not. And I'm looking forward to when we're kind of really low on power, and we're really having to use all of the tricks with our decrepit old RTG that's really, really not doing great. That's actually gonna be a huge amount of fun.
CW (00:48:57):
When does that happen?
DE (00:49:00):
Not this week, fortunately. So we are 3,100 days into our mission. I would expect us to be able to keep going at a fairly regular, good, scientifically-productive rate of progress for another four years, maybe.
CW (00:49:17):
Wow.
DE (00:49:18):
After that the RTG is going to be getting a little tired. This is the same power supply that operates the Voyager spacecraft that has been going since 1977, right? But...we don't have as much power as Voyager has ever had actually. We only have about 80 watts, between 80 and 100.
DE (00:49:36):
When it's cold we get a bit more, when it's warmer it's a bit less, stuff like that. But, about four or five years from now, we'll have to start really tightening our belt.
DE (00:49:46):
But I can imagine us ten years from now just reducing the cadence of how often we operate the vehicle, and saying, "Okay, we're just going to have three days of busy stuff per week. The rest of it's just going to be recharging the battery."
DE (00:50:00):
And we could keep that going for years and years and years. There's loads of tricks we can pull. We can say, "Hey, you know what? Waking up to do a communications pass takes a whole bunch of power. Let's do that less often. Let's just stay asleep. More catnaps, less stuff."
DE (00:50:15):
We could keep it going for a long, long time. Just a case of having the budget to keep operating it. Then as long as the budget keeps flowing, we will bring every single thing we can out of this old rover.
CW (00:50:26):
At that point,...it will have been there 20, this is incredible. 20 years, driving around.
DE (00:50:31):
Yeah. I mean, we're on sol 3,100. Opportunity, that was solar-powered, lest we forget, operated for 5,111 days before -
CW (00:50:39):
Right.
DE (00:50:39):
- succumbing to a dust storm. We could easily beat that. I think that's a very, very achievable goal is to break that record.
EW (00:50:47):
You said 9:00 AM for wake-up call for the rover and afternoon. Do you live on sol days or do you live on Earth days?
DE (00:50:58):
So the rover does Mars time and very early on in the mission the engineers do Mars time. In fact, right now I have colleagues operating on Mars time that are on the Perseverance mission. But when you get into two or three months of that, you start to kind of get a slow detachment from reality. It's not easy.
DE (00:51:18):
And for those who don't know the challenges, a Mars day is about 40 minutes longer than an Earth day. So let's say your data comes back from the rover at nine o'clock tomorrow morning. That's great. You can start planning at nine o'clock in the morning.
DE (00:51:30):
Well, tomorrow that's 9:40, then it's 10:20, then it's eleven o'clock, then it's 11:40. Two weeks from now that's now nine o'clock at night. And another two weeks from now, it's back to nine o'clock in the morning, and you've lost an entire day.
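The drift he's walking through is easy to compute: a sol runs about 39.6 minutes longer than an Earth day, so a fixed Mars-clock event (like the downlink that starts planning) slips about 40 minutes later on the Earth clock every day, and laps the clock in roughly 36 days. A small sketch (`planning_start` is an invented helper):

```python
MARS_SOL_MIN = 24 * 60 + 39.6    # a sol in minutes: ~39.6 min longer
EARTH_DAY_MIN = 24.0 * 60        # an Earth day in minutes

def planning_start(days_later, start_hour=9.0):
    """Earth-clock hour (0..24) at which the same Mars-clock moment
    falls, `days_later` Earth days on. It slips ~39.6 minutes per day."""
    drift_hours = days_later * (MARS_SOL_MIN - EARTH_DAY_MIN) / 60.0
    return (start_hour + drift_hours) % 24.0
```

Two weeks out, a 9:00 AM start has slipped past 6:00 PM; a few days more and it's late evening, and after about 36 days it has wrapped all the way around and you have "lost" a day, just as he describes.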
DE (00:51:42):
And you can keep that up for a while, but people have things like families, and obligations, and their sanity to maintain. And so after three months, you really can't do it. So what we do is, with Curiosity, we operate the rover typically three days a week. We plan on Mondays, Wednesdays, and Fridays.
DE (00:51:59):
But bear in mind, the rovers are scattered around the planet, so now you're talking about multiple Martian time zones...Basically if the Martian night shift for Curiosity lines up with Pacific office hours, then we'll also sneak in an extra plan on a Tuesday.
DE (00:52:17):
So we'll do Monday, Tuesday, Wednesday, Friday. Fridays we always end up giving the rover three days of stuff to do so we can go home and have a weekend. And so about half the weeks, it's Monday, Tuesday, Wednesday, Friday, and the rest, it's just Monday, Wednesday, Friday.
DE (00:52:31):
And so that means you can only drive three times in a given week, when you're doing that restricted planning. But kind of managing those schedules of, "When is the data going to come home? When do we need to be ready to send commands?" There's a whole task in and of itself.
DE (00:52:44):
And it's probably one of the biggest challenges early on in the mission when you're trying to manage the humans as well. It can be really, really tricky to get that Mars time and Earth time to play nice with each other.
EW (00:52:54):
And once they do, they just go out of sync again.
DE (00:52:57):
Yeah, yeah. The way to think of it is, you're traveling about one time zone west every single day. It's a constant state of moderate jet lag. We've joked that what we need is a very, very fast cruise ship that can circle the planet once every cycle, so that we're always operating during daylight hours.
DE (00:53:19):
And so we'd have the Mars ops cruise ship constantly circling the planet. The science team could join us, we could live in reasonable time zones, but operate the rover every single day. But actually there isn't a cruise ship in the world fast enough to get it done. So we gave up. It was a nice idea.
EW (00:53:38):
I have a couple of listener questions I want to get to, although I think we've gotten most of them. Kevin asks, "How do you validate the hardware for Mars while on Earth? How high fidelity are the tests?"
DE (00:53:52):
So there are kind of three kinds of testing, actually four kinds of testing we ought to do. One is the shake, rattle, and roll of surviving launch and landing.
DE (00:54:01):
And for that, we'd actually bolt it to a shake table that can shake the heck out of it and sweeps through a whole bunch of amplitudes and frequencies that replicate what it's going to go through during launch. And you do that at the box level. You do that at the vehicle level. Then we have something called thermal vac.
DE (00:54:17):
We have a 25-foot wide space simulator up at the back of JPL, and it's a big vacuum chamber. And so we can suck all the air out, and then we can flow liquid nitrogen up and down the walls to make the walls feel cold like space.
DE (00:54:32):
But then we have a huge bank of arc lamps that reflect through a mirror at the top of the space simulator to behave like the sun. And we can turn that up or down to make a spacecraft think it's orbiting Venus, or has landed on Mars, or is orbiting Jupiter, anything in between kind of thing.
DE (00:54:47):
And so we suck all the air out, put a little bit of carbon dioxide back in again, and then turn the sun down to Martian levels. And we can put it through day/night cycles in our space simulator. We also have electromagnetic compatibility testing.
DE (00:55:03):
So we will literally, in an RF chamber, we will turn all the various bits of the spacecraft on and measure if they're impacting other bits of the spacecraft. And then there are mechanical things we can do, like drop tests, things like testing the landing system, testing the wheels, driving over rocks, and things like that.
DE (00:55:21):
So you never get to test the whole thing in an environment that's exactly like Mars, but you can test bits of it in ways that are as close as you can. And then you just have to tie all those tests together into kind of one amorphous set of V&V tests that say, "You know what, we think this will work when it gets to Mars."
EW (00:55:41):
That's a little scary. I really want it to all work.
DE (00:55:44):
I mean, if you think of our crazy landing system, it would basically be impossible.
EW (00:55:50):
Crazy landing system. [Laughter].
CW (00:55:50):
Yeah, how are you going to test that?
DE (00:55:50):
It's nuts. It's nuts. It would basically be impossible to fully run that on Earth. Earth has too much air, too much gravity.
EW (00:55:56):
Yep.
DE (00:55:56):
The first time the whole sky crane process was done end to end was August, 2012, when it successfully landed Curiosity on Mars.
EW (00:56:04):
Did you really think that was going to work?
CW (00:56:06):
Geez. [Laughter].
EW (00:56:06):
I mean -
DE (00:56:09):
So I spent nine-ish months as the technical director for the landing animation of Curiosity, long before I got into mission operations. And I was looking at some of the engineering simulations going, "Yeah, you've done a real good job to convince people that this is going to work," but you're like, "No. No way."
DE (00:56:26):
But then you sit down with these people, and you hear them walk through why it makes sense and the measure they've gone to, and you're like, "You know what? You crazy fools. You might just have got the right idea here." And let's be honest. It worked twice now, right?
DE (00:56:38):
It worked with Curiosity, and they even upgraded it, and it worked for Perseverance as well. Yeah, "The Right Kind of Crazy" is a book written by Adam Steltzner, who was the Landing Team Lead for Curiosity. And that's the perfect name for a book, "The Right Kind of Crazy."
EW (00:56:53):
Let's see, from jakeypoo, "Do you have to worry about water ingress at all for Martian electronics?"
DE (00:57:04):
Good news. Mars is incredibly dry. If you took all of the water that's in our atmosphere and kind of froze it out, you'd get a layer that's just a couple of microns thick. Mars is incredibly cold, incredibly dry.
DE (00:57:20):
And so things like water ingress, rust, stuff like that is not really something we have to worry about. Dust ingress definitely is. But fortunately water isn't.
EW (00:57:32):
I wanted to ask, I'm thinking about all the times that I've gotten devices back that were supposed to be hardened for outdoor use, and gotten them back, and had them be half full of water.
CW (00:57:43):
Or spiders.
EW (00:57:44):
Or spiders. Yes. That happens.
DE (00:57:45):
We don't have the spider problem.
CW (00:57:48):
Yes, well, you don't know.
EW (00:57:48):
I wanted to ask him, have you ever gotten one back to make sure the dust ingress protection works? And my brain went, "You can't ask that...No, he hasn't ever seen that."
CW (00:57:58):
They haven't gotten any back, no.
DE (00:57:59):
This is the thing is that, I mean, I got to see Curiosity in the clean room before it launched. I got to see Perseverance as well. I operated the cameras on the Opportunity rover. I have never seen Opportunity with my own eyes. Never seen it.
DE (00:58:12):
And that was part of the motivation when, back on sol 5,000 with Opportunity, which was 2017 or 2018, we decided for her 5,000th sol anniversary to take a selfie. Now Curiosity has taken amazing color, high-definition selfies.
DE (00:58:32):
Opportunity has a one megapixel black-and-white microscope that cannot focus, right? It's designed to literally be positioned six centimeters or so above a rock. And that's where it focuses. That's what you do if you want to focus it, you move it. That's how it works. But we thought, "You know what, let's give it a try."
DE (00:58:47):
And...we shrank the images because we knew they'd be blurry and out-of-focus. We did get to see our rover for the first time in, at that point, 13 years or so. And...it was a little dusty in the room when we got to see those pictures for the first time, that's for sure.
DE (00:59:03):
But one thing that you have to think about with any spacecraft, think about you've got a camera with a lens. You've got electronics boxes...The Curiosity rover is the size of a Mini Cooper. It's a beast of a thing.
DE (00:59:16):
But think about that ... that sat on the launch pad. A couple of minutes from now, it's going to be in a vacuum that is stronger and harder than any vacuum chamber on Earth, because that's what space is like. So every single molecule of air that is inside any part of that spacecraft has about three minutes to get out.
DE (00:59:34):
It's got to get out from the rover into the nose cone, from the nose cone out into the atmosphere as we head up into the vacuum of space. And so you have to have a means for the air to get out everywhere. If you've got a way for the air to get out, you've got a way for the dust to get back in again.
DE (00:59:50):
And so what you end up doing is making these quite convoluted little channels for the air to get out and hope that dust doesn't find its way back again. Dust did eventually find its way back inside the dust cover for the Opportunity microscope, but not an awful lot.
DE (01:00:08):
But it's a weird thing to think about. These spacecraft start full of air, and by the time they're in space, it's all got to have got out somewhere. So you have to design a way for all that air to get out in about three minutes.
EW (01:00:20):
Never would've thought of that.
CW (01:00:22):
And that's why your rover exploded on launch.
EW (01:00:25):
Not on launch. Once it was in space.
CW (01:00:28):
Yeah.
DE (01:00:28):
It arrived in space five times its natural size.
EW (01:00:34):
And one more question from Andrei From The Great White North, "Is dust very abrasive, the dust on Mars?"
DE (01:00:40):
So Mars has had nothing to do for several billion years apart from turn rocks into ever so slightly smaller rocks. And it's really good at it. So Mars dirt, it's easy to think of it as abrasive sand. There are sandy places on Mars, but the dust that's in the atmosphere and blowing around all over the place is way, way finer than that.
DE (01:01:01):
Way, way finer. It's more like the consistency of corn flour or talc or something like that. It's incredibly fine. Even down to kind of the cigarette smoke kind of levels of fine. It's incredibly fine dust. And so it doesn't tend to be abrasive.
DE (01:01:20):
It can get stuck in places. It can coat things, but it doesn't tend to really erode things. The Moon on the other hand -
CW (01:01:28):
Oh, right. Right, okay.
DE (01:01:31):
Its sand grains, its dust grains are very abrasive, because they've been made through impacts and cratering. They have been made through rocks getting smashed into ever smaller rocks, not wind eroding them into ever smaller rocks and rolling them around.
DE (01:01:47):
And so if you compare kind of grains of what you think Mars dirt might look like, they're round, they're maybe a little jagged here and there, but generally speaking, it's like a very fine, powdery dust. Moon dirt is like someone's gone and designed the most horrific abrasive you can possibly imagine.
DE (01:02:05):
And so in the space of three days of walking around, one of the astronauts wore the rubber grip off a geology hammer just through dust. Because that Moon dust is horrific. It wants to rip everything to shreds. Mars is a holiday compared to dealing with Moon dirt.
EW (01:02:28):
If you were stuck on Mars, and had some potatoes, could you, in fact, grow things? Potatoes and water? Could you grow things, or is it too sandy?
DE (01:02:42):
So you could hypothetically do some sort of, if you had a little tent thing, go full Watney and have a little tent, you could do something hydroponically, for sure. A lot of the dirt has dissolved compounds in it that are not great.
DE (01:02:57):
And we've seen perchlorates around a lot that are kind of a bit of a bleach. They're a chlorine compound that's not particularly nice for life. So, if you just erected a tent over a patch of Mars and stuck some potatoes in the ground, they're not going to have a nice day.
DE (01:03:14):
But if you were to take some of the sand, maybe wash it out a bit, right? With some fresh, free-range, organic melted Martian ice from the polar regions.
DE (01:03:24):
And then put some nutrients back in that you brought with you, or that you had made, Mark Watney style, you could conceivably kind of do the Watney thing. But you would need to essentially decontaminate that soil of the really nasty stuff before you got started.
EW (01:03:42):
Alright. Well, I have so many more questions, but we are about out of time. Doug, do you have any thoughts you'd like to leave us with?
DE (01:03:51):
The one thing I think is worth saying is that we were incredibly lucky to be involved in an amazing mission like Curiosity. I know my friends on Perseverance feel exactly the same. It is a privilege to do this on behalf of the country and indeed the planet.
DE (01:04:05):
The forefront of the human experience is trundling around on Mars, learning about it, so that one day we can send humans to go and do science there as well. But everyone can come along on that journey with us. Every single picture we ever take goes online the second it reaches the ground.
DE (01:04:22):
And so you can be there right alongside us, every shift, every day. There are times when, just because of how Mars time and Earth time have lined up, someone following along from Brazil, or Australia, or the UK, could be seeing the pictures before we do. Just because that's the way the time zones are falling.
DE (01:04:42):
This adventure is for everyone to be a part of. And we see people doing that. We see people taking the pictures, and making mosaics, and movies, and maps, and animations.
DE (01:04:51):
And it is wonderful to be able to plan stuff with the rover, knowing the public are going to be along with you for the ride, and they're going to be enjoying this stuff as well. This adventure is not just for us. It's for absolutely everyone to come along and enjoy.
EW (01:05:06):
Well, thank you. Thank you for the work you do. And thank you for being with us.
DE (01:05:10):
It's an absolute pleasure. It's always fun to talk about the fun stuff we get up to with Curiosity.
EW (01:05:15):
Our guest has been Doug Ellison, Mars Science Laboratory Engineering Camera Team Lead at NASA's Jet Propulsion Laboratory.
CW (01:05:26):
Thanks, Doug.
DE (01:05:26):
A pleasure.
EW (01:05:26):
Thank you to Christopher for producing and co-hosting. Thank you to TwinkleTwinkie for pointing me in the direction of Doug, and to our Patreon listener Slack group for many questions. And of course thank you for listening. You can always contact us at show@embedded.fm or hit the contact link on embedded.fm.
EW (01:05:44):
And now a quote to leave you with, from a very, very good set of books. Mary Robinette Kowal wrote "The Lady Astronaut of Mars." "It's a hard thing to look at something you want and to know the right choice is to turn it down."