
375: Hiding in Your Roomba

Transcript from 375: Hiding in Your Roomba with Brittany Postnikoff (Straithe), Elecia White, and Christopher White.

EW (00:07):

Welcome to Embedded. I am Elecia White, here with Christopher White. Our guest is Brittany Postnikoff, also known as Straithe. We'll be talking about robot social engineering. And other things.

CW (00:21):

Hi, Straithe.

BP (00:21):

Hi.

EW (00:23):

Tell us about yourself as though we saw you on a panel about robots.

BP (00:29):

Sure. So right now I am a researcher and community manager for Great Scott Gadgets. And otherwise I am a troublemaker. I primarily just look at how things can be changed, or broken, and adjusted using robots, specifically the emotional side of robots.

EW (00:53):

Do robots have emotions?

BP (00:57):

That is a very big philosophical question, and it depends on who you are and what your perceptions of emotions are. But I'm inclined to be somewhere in the middle.

EW (01:09):

Alright. Well, we're going to ask more about that and other things. But first we want to do lightning round, where we'll ask you short questions, and we want short answers. And we'll try not to ask for more details, but who knows.

BP (01:23):

Sure.

CW (01:23):

If you were to make a Gibsonian Cyberdeck, what would you put in it?

BP (01:29):

I keep thinking food.

EW (01:33):

Should we bring back the dinosaurs?

BP (01:35):

Yes.

CW (01:37):

Preferred listening when programming?

BP (01:40):

Electro swing.

CW (01:41):

That's a genre I've never heard of. I'm going to go look it up later.

EW (01:45):

Are fake tattoos a type of sticker?

BP (01:48):

Yes.

CW (01:50):

Googly eyes or puffy animal stickers?

BP (01:53):

Googly eyes.

EW (01:54):

If you could be a sticker, what kind of sticker would you be?

BP (01:58):

Holographic.

CW (01:59):

Oh, right. Holographic stickers.

CW (02:02):

Do you like to complete one project or start a dozen?

BP (02:05):

Start a dozen.

EW (02:06):

If you could teach a college course, just one, what would you want to teach?

BP (02:10):

Ethics.

CW (02:12):

What is your favorite fictional robot?

BP (02:14):

Bender.

EW (02:17):

Do you have a tip everyone should know?

BP (02:20):

Turn your clothes inside-out before washing them.

CW (02:25):

See, but I hate that, because then I have to turn them outside-in after drying them. That's so much work for me.

EW (02:31):

So much work.

BP (02:32):

It pills less, and it looks better longer.

CW (02:34):

Alright.

EW (02:37):

Okay. So we've touched on stickers, and robots and emotions. Which one do you want to talk about?

BP (02:44):

Let's start with stickers.

EW (02:49):

Okay. So you're into stickers. What does that mean? You have a Twitch stream where you talk about stickers. I don't understand.

BP (02:57):

So stickers are a big part of hacker culture, and I love seeing the stickers on the backs of people's laptops, or on the other devices they carry. And I had this question about, "What does this sticker mean? What does that sticker mean?"

BP (03:12):

And after talking to people, you realize that stickers have so much history and story behind them that kind of gets lost if you don't archive it somewhere.

BP (03:22):

So I've been working on this idea of a sticker archive for the stories about stickers, so we don't lose that information as people leave the community, or move on, or other things like that.

CW (03:34):

Okay. That's genius.

EW (03:36):

So...there are many different Hackaday stickers, and some are older than others. And some you could only get at certain conferences. Is it that sort of information, or is it the emotional attachment people get to their stickers?

BP (03:51):

Both. I think all stories are worth writing down, or collecting, or having on a stream and sharing. And for me, it's just this idea that it's a big part of our culture as people in tech. And it's nice to collect that culture somewhere.

BP (04:08):

So some of the stories are just, "I saw this sticker. I wanted to be friends with someone because of this sticker and now we are." Or sometimes you get the stories behind the, "This is not a camera sticker." And...why did somebody make that?

BP (04:23):

And you get to hear that story that led up to somebody doing that creative design. And I think both are really exciting just to learn the thoughts that people are having around these cultural symbols.

CW (04:34):

Okay. So you're probably somebody who knows the answer to this question I've had. When I started in tech, it was when Friends was on television. The early seasons. And...we had laptops and stuff. They were garbage compared to now, but I don't remember having stickers.

CW (04:52):

I would have put stickers all over everything, but I don't remember doing that. And I don't really have a good sense of when that kind of started. I remember when I started putting stickers all over my laptop, but it was only 10 or 15 years ago maybe. Do you have a notion of where this got started?

BP (05:08):

No, but now I'm going to go and research it.

CW (05:10):

Okay, good. Report back.

BP (05:13):

I will. This is great. Thanks.

EW (05:16):

So we recently re-aired a show with Sarah Petkus, who then did our logo and stickers. And one of the things that I like best about our stickers is that...some people see it as a radio, an old-timey radio, and other people see it as a robot head. And I love that ambiguity. Are there things you look for that make for good stickers?

BP (05:43):

A few things like color, design, the quality of stickers. I have some stickers I've put through the dishwasher, and they're fine. There are some stickers you leave on a laptop for a week, and they're gone. So the longevity is important to me, and I also try and make sure messages are positive.

BP (06:03):

I have a whole stack of stickers that I will not interview people on, that I will never put on my laptop, just because they contain messages I'm not comfortable with. So just being positive, having that good design, and also using ethical creators as well.

BP (06:20):

So there are some sticker companies that I, again, won't use, and there are other ones that put a lot back into the company. So there are a number of things I look for.

EW (06:29):

What are some that you will use? Because I think I'm ready to switch sticker manufacturers.

BP (06:35):

I've heard great things about Sticker Ninja, and I've heard great things about, I think it's StickerGiant, is another one. But right now there are so many people looking for a better place to go, and I have to try out more myself still.

EW (06:54):

I understand. I'm looking at Chris's laptop.

CW (06:57):

I know. I keep wanting to turn it around, but it's being used to record this podcast, and that would be dangerous.

EW (07:01):

And I know that Ben Krasnow's wrench in a beaker for his YouTube, what is the name of that?

CW (07:12):

Channel?

EW (07:12):

Yeah, his YouTube channel. What is the name of - ?

CW (07:14):

Oh, oh, Applied -

EW (07:15):

Applied Science?

CW (07:17):

Applied Science.

EW (07:17):

And Matt Godbolt's Compiler Explorer are both logos that have no words on them. And I wouldn't be able to tell what they were without knowing...What's your opinion on that kind of sticker? The "you have to be in the know" to recognize it.

BP (07:39):

Those are personally my favorite stickers, because I don't put stickers on my laptops that have words. Because I think it's more about just the picture for me. And also it's a great conversation starter.

BP (07:55):

If you don't know what a sticker is, going up to somebody and asking, "Hey, what's that sticker?" is a great way to make new friends, and one of my favorite things about sticker culture.

EW (08:06):

How did you get into it? I mean, was it one of these conversations where you went up to someone and said, "What's the sticker?"

BP (08:16):

I mean, I do that all the time, because I'm just very curious. But I think a big part of it for me was just, my first DEF CON I saw these stickers. I'm like, "Wow, there's so much color and vibrancy to this community." And I mean, of course the LEDs helped, but the stickers were a good introduction.

BP (08:36):

And people are just like, "Here, take my sticker." I'm like, "You're just going to give me a sticker?" And at this time I was a student. Anything free made me happy. So of course I'm going to throw stickers on my journals and stuff. And people were like, "No, you have to put it on your laptop." And then all of a sudden it's this big choice.

BP (08:54):

And I was like, "Well, now I just need to collect all of the stickers, and I need multiples of all the stickers so I can stick them on things and be happy." And so now I have a huge sticker collection, and of course I use some of them. But some of them I have just as an archive, and it just has kept going from there.

EW (09:11):

I get other people's stickers to send out when I send out our stickers, because I don't want our stickers to be lonely. That's normal, right?

BP (09:19):

Absolutely normal. I have so many friends you would love to spend time with if that's your mentality around stickers.

EW (09:26):

I don't put them on my laptop, but my toolbox is covered in stickers.

CW (09:30):

There's a sticker on your laptop.

EW (09:31):

There's one. There's one Embedded sticker on my laptop.

CW (09:33):

Yeah, okay.

EW (09:33):

I don't think that counts. That's more of a, "Here's our company property."

CW (09:40):

Do you think there's a crossover between the sticker kind of ethos and the badge thing?

BP (09:46):

Very much. There are a lot of people in both communities that share artwork back and forth. And for me, stickers were kind of step one before I started getting into badges.

BP (09:58):

Because so many of the cool conference badges have great PCBs that have fun designs, and...some of that uses the same skills as creating stickers. So it's kind of a natural progression to me.

EW (10:14):

So going back to robots, tell me more about robot social engineering. I mean, it's not really about tricking the robots to give you information like human social engineering, is it?

BP (10:28):

So one aspect of it is, but for my master's thesis, it was more about using robots to social engineer people.

CW (10:37):

Okay.

EW (10:38):

I mean, if a robot asks me for my password, I might give it to him, because it's a robot. It doesn't care.

CW (10:44):

What?

EW (10:45):

I don't know. I just can imagine being in a situation where I unwisely trusted a robot, because it wasn't a person.

CW (10:54):

Define robot. Is R2-D2 coming up to you and beeping, "Give me your password?"

EW (10:58):

Pretty much. Yes. That was what was in my head.

CW (11:01):

Is it C-3PO? Or -

EW (11:01):

No, no, no. It was definitely R2-D2.

CW (11:02):

Or is it just something on your computer where a fake robot comes up?

EW (11:05):

No, those are clearly people.

CW (11:07):

Oh, okay.

BP (11:10):

I love this. This is exactly why I did this research is, there are so many definitions for robot. And when I talk about robot social engineering, I specifically mean robots that are in a body, and are able to interact physically with their environment, move around in that environment, and still have some form of artificial intelligence.

BP (11:34):

So the things that pop up on a TV screen would not be a robot to me. So it's kind of the interesting thing about how everyone has these different definitions.

EW (11:46):

I mean, we gave our Roomba the password to our internet.

CW (11:50):

What? Well, yes. But it didn't ask. The app asked. The robot itself did not ask. It was a proxy. It's not the same kind of thing.

EW (12:01):

What kinds of social engineering can the robots do other than get our Wi-Fi password?

BP (12:07):

So there are some things where, how do I describe this? So robot social engineering has a number of parts,...depending on the level of artificial intelligence of the robot.

BP (12:23):

So some things, like a Roomba, would probably just be a proxy for a human social engineer that has gone into your robot, and uses the emotional connection you have with your robot to make you do things.

BP (12:37):

So for example,...people actually remodel their entire house so it's more accessible for the Roomba, and people get these emotional attachments where they name them, where they pet them -

CW (12:55):

What? People name their Roombas? That's ridiculous. Who would do such a thing?

EW (12:58):

We have googly eyes on our Roomba, and we've named it.

BP (13:00):

Right? So you're right in the perfect market for these. But there are things where those robots, you start getting used to them, and you get used to them moving around your house without any interaction necessarily, especially if you put it on a schedule.

BP (13:18):

And you get used to the noises, kind of like when you get a pet....You at first are like, "What's that noise?" when they're moving around, but after a while you get used to the noises. Now say that somebody is able to RDP into your Roomba and all of a sudden start looking around your house.

BP (13:37):

Because there are cameras on Roombas. There is Lidar on some of the Roombas. There's so many different features that can collect so many different types of data. And say these robots go throughout your house, and you have one of the ones with the camera.

BP (13:51):

All of a sudden a person can use the social comfort you have with your Roomba to go around your home and case your entire house, see everything that's in it, where it is, and also see or hear whether you're home.

BP (14:06):

And so all of a sudden, say you go on a vacation for a week, somebody has been hiding out in your Roomba for a month, RDPing, watching, and they notice you're gone for three days in a row, which is really weird for you.

BP (14:17):

Well, they know that's the perfect time to come in and rob you, and where everything is, where the alarm systems are. They might've seen you arm them or disarm them through the Roomba's camera. There are all sorts of privacy and security considerations with the technology you let in your home.
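
[Editor's note: the scenario above turns on how little software stands between remote access and a roving camera. Below is a minimal, hypothetical Python sketch of watching a networked camera with the OpenCV library; the RTSP address is invented for illustration, and Roombas do not document any such endpoint.]

    import cv2

    # Hypothetical stream address, invented for illustration only.
    stream = cv2.VideoCapture("rtsp://192.168.1.50:554/live")

    while stream.isOpened():
        ok, frame = stream.read()                # each frame is whatever the robot sees
        if not ok:
            break
        cv2.imshow("robot's-eye view", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):    # press q to stop watching
            break

    stream.release()
    cv2.destroyAllWindows()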

EW (14:36):

There definitely are. Lately our Amazon Echos have been irritating me with not only their insistence on offering me things I don't want, but also -

CW (14:47):

Amazon's new programs to make mesh networks, and share your network, and do weird things I didn't ask it to do.

EW (14:57):

Do you think of those as robots? I mean, because that does have a social engineering aspect as well. They don't move though.

BP (15:04):

I don't, because they don't move. To me, they're just a machine, or they're an artificial agent.

CW (15:10):

What if I tape it to the Roomba? Sorry, sorry.

EW (15:14):

Why does its ability to move make it more interesting to you?

BP (15:21):

Because all of a sudden you have a walking, talking vulnerability. It's not just a thing on your table that you talk to. And that physicality is a big component of how we socialize with other people, with animals, and in this case, robots.

BP (15:40):

It's that physicality that really makes this a unique piece of research compared to seeing how an artificial intelligence online affects people. That is a different area that doesn't consider physicality. It's highly explored, but that doesn't apply to when you have a robot in front of you in a body.

BP (16:01):

It's a different scenario. You interact with it differently, and there is research on how different those two things are. And so I was like, "Well, I want to look at this specifically." And so that's why I define robots so narrowly, is because the physicality's really cool.

CW (16:19):

...Does it plug into something deep in our brains that says this is alive, which is a difference from, say, the Echo tube, that is just a monolith?

BP (16:31):

Yeah. So humans use things like anthropomorphism to connect with different things in their environment as if they were humans or other humanoids. And then we have zoomorphism, which is when people treat things in their environment like animals.

BP (16:47):

And robots can benefit from one, the other, or both at the same time, depending on how you interact with them. And I think that's really special and cool.

EW (16:58):

Do we trust them more? Because we -

CW (17:01):

I mean, I don't trust people that much, but I think I might trust a robot more. I don't know. Yeah.

EW (17:08):

We don't give our Wi-Fi password to very many people.

CW (17:12):

Well, yeah. Finish your question. I'm sorry.

EW (17:15):

It was more like, is it the fact that we do this zoomorphism and anthropomorphism that causes us to be more susceptible to social engineering attacks? Or is it just that we're so stupidly susceptible to social engineering attacks that this is just one more path?

BP (17:35):

Both. And some of it has to do with context as well. If you have a robot coming up to you in a hospital, because there are some hospitals that have these robots that will come and deliver your medication to you. Well, it's a machine in an authoritative role in an environment where most people don't feel they have much authority.

BP (17:56):

So if a robot comes in with a cup of pills and says, "Take this," you might be more inclined to trust it, even though we, again, have the issue of you don't know who's programmed those pills, if they're the correct pills, if your pills got switched with someone else.

BP (18:11):

There are trust things to think about, but because of the context and authority the robot holds, some people might be more inclined to trust them.

EW (18:23):

I remember talking to Professor Ayanna Howard about this some. That even if the robot led you, in a psych experiment, to the incorrect room, and so you knew that it was fallible, when there was a fake fire alarm, you still followed the robot, even if you kind of knew how to get out of the building.

BP (18:46):

[Affirmative].

EW (18:46):

What's wrong with us as a species?

BP (18:50):

That is one of my favorite papers. And I did cite that one in my thesis, because it just...blew my mind that people could see an exit sign clearly pointing "just go left," but the robot was pointing right, so they went right. And there's so much to think about there.

BP (19:08):

And again, it's context, where people freeze when there is a fire. There's panic. And just like when we're in public, and we see someone getting hurt, and maybe somebody should call the cops, or should intervene, or help out, no one does it.

BP (19:27):

And there's a bunch of papers on this, that no one wants to be the first to step up. So a robot coming in in this case and being like, "Hey, follow me to safety." You're like, "Okay, I don't have to think about it. Somebody else will think about it. Great. Tell me what to do."

BP (19:42):

And so I think that's, again, part of the robot slipping into an authoritative position, and taking the pressure off of you kind of gives you more of an inclination to trust it.

CW (19:57):

It's interesting, because a lot of science fiction has a theme of "don't trust the robots," right? And the Alien series and whatever. There's plenty of examples where the robot turns out to be an enemy for some reason, and you shouldn't trust it.

CW (20:12):

And I feel like that should have been subsumed into our culture over the last 50 years...That seems like more of a reflection of our desire to trust them rather than a reflection of our distrust.

BP (20:30):

Well, yeah. And...for almost every bad robot there is, there's a good robot, like C-3PO or R2-D2, or in my case, Bender isn't exactly moral, but I would love him as a best friend. So yeah, that's the thing, is we always kind of take the good with the bad. And it comes down to robots are as varied as humans.

BP (20:53):

They come in so many shapes, so many different types, and different thought processes, different skills. And when we make a decision on a particular robot, it comes down to, again, context, environment, what the robot does, why we think it does what it does, and all sorts of these complex variables that go into one value of trust.

EW (21:20):

With respect to your example about people being hesitant to intervene in emergencies, there is also research to show that if you have any training at all, if people are trained to be responders, not even formally trained, but even the community emergency response teams, where it's a low level of training, they do tend to step up.

EW (21:46):

Do we need to train people to be like, "No, don't follow the robot?" ...How do we get out of this? Because I don't really want to lose the personability of my Roomba. I like its googly eyes.

BP (22:00):

Yeah. I'm happy to hear that. That gives me a lot of joy. And I think a lot of what we need to do is just be aware. Be aware that a robot could be collecting information on you and a lot of information that you don't necessarily know.

BP (22:18):

A lot of the manuals on robots don't actually give you all of the details about what a robot is doing, or what it's collecting, or where its information goes, or how its information is stored. So it's kind of exhausting, but having the awareness and vigilance to think about robots in depth when you interact with them is what we need to do.

BP (22:41):

And yeah, training could help with that. I'm hoping that as I write more papers and give more talks, it kind of gives people that low level of training. But yeah, it's a really hard problem to defend against robot social engineering, just as it is to defend against regular social engineering.

EW (23:02):

You said write more papers. You have a master's degree, but you are starting a PhD this fall?

BP (23:09):

Yeah. I'm actually switching from CS to electrical and computer engineering, where I will be doing a PhD with a bunch of other great robotics people. And one of them actually started working on robot social engineering around the same time I did, but they were living in Italy.

BP (23:29):

So we will actually be in the same place, at the same time, researching the same thing. So I'm real excited about the research that's going to come out of our lab.

EW (23:38):

And is it going to be on robotic social engineering, or robots as a whole, or do you have an area of concentration?

BP (23:51):

Yeah, I've been really inspired by [Whitney Merrill] to focus a lot more on privacy. And so I've always had this question in my head of, "What are people's public perceptions of robots in public spaces?" And I have a story on that actually.

BP (24:10):

So my partner and I were in an airport, and we saw a robot around. And I was like, "Take my bags. Check us in. I must go and look at this robot." And so I grab my phone. I start recording. Because I'd never seen this robot before. And it had the airport symbol on the side of it.

BP (24:29):

And you could scan your boarding pass, and it would tell you where to go, and where to check in, or where stores were. Or you could search for restaurants. Or it would turn its head around, and take a selfie for you, and email you the selfie. And I'm like, "Wait, that robot is collecting a ton of information."

CW (24:48):

[Affirmative].

BP (24:48):

And I'm like, "And the only reason people are trusting it is because it has a sticker on the side of it." And so I'm like, "I...want to see if I could just drop a robot somewhere, throw a sticker on it, and see if people give me their information?"

BP (25:04):

So,...a lot of what I want to look at in my PhD is "How little context do I need to give on the robot for people to give it high levels of trust?"

EW (25:18):

Are you going to put stickers that people recognize, or things that just look like something that is trusted?

EW (25:26):

All you need are googly eyes. I am telling you. That is the lowest bar.

BP (25:32):

Well, I'm thinking things like, I'll be at the University of Waterloo, so can I throw University of Waterloo -

CW (25:39):

Right.

BP (25:39):

- stickers on a robot, have it wander around, and ask people to enter a sweepstakes, and get their personal information. Or if I could drop it at a restaurant,...how long would it take for the restaurant to kick my robot out?

EW (25:57):

Creating a robot is expensive, still, and likely to remain expensive for a while. So this level of interaction usually is related to a company, iRobot, the airport, places...that are big enough. Even Boston Dynamics, they have funding.

EW (26:25):

And so if their robot is wandering around, doing nefarious things with people's data, you can make a fuss, I guess, -

BP (26:35):

If you know that it's-

EW (26:35):

- like anybody really makes a fuss, but -

BP (26:37):

Yeah, if you know that it's collecting the data, and if you know there's something to make a fuss about. And I think people don't know that they should be making a fuss about some of these robots. There are a few, I will absolutely walk the other direction if I ever come into contact with them.

EW (26:57):

Like what?

BP (26:58):

Like the Knightscope robot, personally.

CW (27:00):

Right.

EW (27:01):

What's -

CW (27:02):

They're a security robot. They look kind of like Daleks.

BP (27:07):

They do. Oh my gosh. Do they ever.

EW (27:12):

I mean, is it just because it has a bunch of cameras? Does it have a laser?...Why are these bad? Does it have googly eyes? Sorry.

BP (27:22):

No. I'm kind of happy these ones don't have googly eyes. Because I think it's bad to personify these ones, because I don't think they're good for the public.

BP (27:32):

But if you go through their documentation, you can see that they have thermal imaging, that they have license plate cameras, that they collect the MAC addresses of any devices near them...I mean, they have wireless access points on them that collect whatever access points your devices are trying to connect to.

BP (27:55):

So it knows what networks you usually try to connect to. They have cameras, they have audio, and they're collecting all of this data. And even if a company is the one that has set this out in their parking lot, you don't have notice that it's collecting all of this data, and you don't know where it's going.
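
[Editor's note: the Wi-Fi collection described here works because phones broadcast "probe requests" naming networks they have joined before. Below is a minimal Python sketch of passively logging them with the scapy library, assuming a Linux machine with a wireless card already in monitor mode on an interface named wlan0mon; the interface name and setup are the editor's assumptions, not anything from Knightscope's documentation.]

    from scapy.all import sniff, Dot11, Dot11Elt, Dot11ProbeReq

    def handle(pkt):
        # A probe request carries the sender's MAC address and, often,
        # the SSID of a network the device has previously joined.
        if pkt.haslayer(Dot11ProbeReq):
            mac = pkt[Dot11].addr2              # device's MAC address
            elt = pkt.getlayer(Dot11Elt, ID=0)  # element ID 0 holds the SSID
            ssid = elt.info.decode(errors="replace") if elt else ""
            if ssid:
                print(f"{mac} is looking for {ssid!r}")

    # Listen passively; store=False keeps packets from piling up in memory.
    sniff(iface="wlan0mon", prn=handle, store=False)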

BP (28:14):

And based on the documentation I've read, it looks like even if you think it's just going to the company, all of the data is going to Knightscope as well. So all of a sudden you have two companies using your data.

BP (28:26):

And Knightscope has done things like partnered with various law enforcement agencies that people don't really respect, and that make a lot of people's lives worse. And it makes me uncomfortable to think that these robots are breaking apart families and things like that.

EW (28:48):

Wow. I mean, there's so much here. Elevated body temperature. That is not something I want people to be able to know, because that indicates that I'm stressed.

BP (28:59):

[Affirmative].

EW (28:59):

Although, on the other hand, part of me is like, "Oh yeah, you put that in an airport, and if anybody has a fever, maybe we don't spread disease quite as fast." But the balance there is just impossible. And these robots,...if there isn't already a cell tower in them in order to collect the base cell data, there will be in about a month.

BP (29:26):

Right.

EW (29:26):

These are the robots. I didn't realize. I spend so much time thinking about the good robots that I don't really think about these other kinds, because I don't see them very often. And they're mostly used in larger cities and airports.

BP (29:44):

[Affirmative].

EW (29:46):

Do you think we'll see more as time goes on, or do you think that there has already been some public backlash, and there will continue to be?

BP (29:56):

Right now, I'm of the opinion that yes, we are getting more and more robots in public. Especially over the pandemic, there was kind of an explosion of robot use.

BP (30:07):

People putting robots in public spaces to take your temperature was a thing we saw a lot of over the last few months, and also robots that would be in public spaces asking people to step apart if you're too close.

BP (30:22):

There are some public parks, I think in Korea, that used these robots to say, "You are standing too close together. Please separate." But the thing is, the robot couldn't tell whether you were living together or not, and whether it was okay for you to be walking together.

BP (30:36):

So it was making these crude judgements on what should be allowed in public without actually knowing the context, or the situation, or finding out more. But because it was a big robot, the Spot robot, and they are clunky. They are big. They seem like they have authority.

BP (30:54):

Again, they were given vests and stickers that...made people know that they were part of the park. And so all of a sudden these robots have authority to tell people living together, that share germs anyway, to separate, which seems excessive and incorrect, and like we're sliding down some sort of bad path.

BP (31:16):

So they're around. And I think they'll continue to be around. Especially as big corporations that have money continue to push the robots without thinking about the privacy and security, and without the average person raising their voice against the contexts in which robots are used.

CW (31:36):

It sounds like a lot of things are being delegated to them that should not be delegated to them, because they're idiots. I mean, they're not AIs.

EW (31:44):

But they say "artificial intelligence" on the side.

CW (31:47):

Do they?

EW (31:47):

Some of them, yeah, actually. They really do. Bizarre.

CW (31:50):

That's ridiculous, but okay.

CW (31:52):

But yeah, I mean, they're heuristic things that just go out there and say, "Okay, here's a bunch of people. Tell them to separate." That's all it knows. So...yeah. I mean, delegating human decisions to a robot seems very fraught.

EW (32:09):

And yet, part of me -

CW (32:10):

...Security is one of the big areas where you want human judgment.

EW (32:16):

But if we could make ethical AI more of a thing, then maybe the robots would be more fair than humans who have biases.

CW (32:26):

Okay.

EW (32:27):

Go ahead.

BP (32:29):

AIs will always have biases though, because they're made by people.

EW (32:33):

But if we start working on that piece, which is a totally separate piece, that we need our AIs to go through some path of certification towards equity, and I think we will get there eventually. Eventually.

EW (32:51):

But there are benefits to having someone who is less likely to be cranky because they're hungry do some of the enforcement of traffic, -

CW (33:05):

I don't know.

EW (33:05):

- like parking enforcement.

CW (33:07):

Yeah. But yeah, I don't know. I like having a human who can be responsible for misdeeds.

BP (33:15):

And there are also some times humans let things slide. Like, "I could give this person a parking ticket, but [eh]." And it's those positive counter-scenarios, opposite from those cranky scenarios, that are also human. But if you have a robot, it's never going to let someone slide.

BP (33:38):

If you're a minute late getting back to your car, and it's normally not when the parking meter people come by, you usually slide by a minute. But robots and AIs uphold the rules, because that's all they can do. And they don't know how to let things slide for positive reasons.

BP (33:59):

And I think that's a show of humanity as well, and a show of compassion, and something we don't want to lose by delegating things.

EW (34:08):

I'm still on the fence about that, but I'm willing to go either way. So, as Chris said, one of these looks like a Dalek. Another one of these looks like the kind of cool robot, I think from Interstellar?

CW (34:27):

The square ones?

EW (34:28):

Yeah. But as you said, Straithe, they are intimidating. They look authoritative, and they are intimidating. It seems like if you really wanted people to be more interactive, you would put a fluffy bunny sticker on it instead of making it look scary.

BP (34:55):

[Affirmative]. And that's a big part of robot social engineering.

EW (34:58):

Do you think that they look scary, because they wanted them to look scary, or that they look scary, because nobody realized that they would be friendlier if you made them look friendlier?

BP (35:12):

100% both. I think the Knightscope robot's definitely intended to look scary. And it does look exactly like a Dalek, and I've seen pictures of people taping a plunger and a whisk to them. So they are definitely meant to look scary. But a number of other robots, it's just people being like, "[Ah], I think this looks cool."

BP (35:33):

And then they make it, and people are scared of it, or don't understand why people are unhappy with it. And like, "Why is my robot failing?" And I'm like, "Because it looks like it's going to cut my knees off?" So it's a little difficult.

BP (35:49):

And that's one thing that a lot of robot designers could do better, is hiring human-robot interaction specialists who've done research into the shape, outfits, heights, and all sorts of other interaction variables that could help them design their robots better. But so far that's not a common thing.

CW (36:12):

I mean, one of the first bullet points on your robot is "force-multiplying physical deterrence." I don't think that your goal is to make something that looks fun.

BP (36:19):

[Affirmative].

EW (36:23):

I mean, that's going to be a pretty interesting research project. Do you think you'll be able to get one of these scaryish robots and see just how friendly you have to make it before people will interact?

BP (36:37):

So there are things. I'd probably throw an apron on it, a frilly pink apron with some flowers on it. And all of a sudden it would look more maybe like Rosie from the Jetsons.

CW (36:47):

[Affirmative].

EW (36:47):

Absolutely.

BP (36:48):

And so there are these things like outfits that can make things a lot easier, or just colors. Stop choosing scary colors, and maybe use more yellows, or purples, or things like that too.

BP (37:02):

But yeah, that's not necessarily my specific area of research. And I would love to talk more with people who do really focus on that in depth.

CW (37:14):

We don't put enough fur on robots.

EW (37:16):

I was thinking that if you put a pink leather collar on one of the Boston Dynamics terrors -

CW (37:24):

Oh, God.

EW (37:24):

- it would be pretty cool. People would be like, "Oh yeah."

CW (37:26):

...Okay. Yeah. It's another...company...Are they doing this on purpose? Because these look like hell hounds, I mean, [sigh].

EW (37:37):

So you want to see more about people interacting with robots and try to figure out how kind, how nice, how personable you need it to look before people begin to fall for social engineering. Is that right?

BP (37:59):

No, my focus is going to be on whether people understand what sensors and abilities robots have, how those sensors and abilities can be used to collect data, and where their privacy and security might come at risk. And I want to demonstrate that through using robots to social engineer people.

EW (38:21):

Do you think that robots that interact with us take more data than just walking through with a cell phone that isn't well-protected?

BP (38:33):

Yes and no. Because,...I'm not a hundred percent certain on this, but you could always say someone else had my cell phone that day. But if a robot is there, and it's also got video and audio of you, [eh], you're a little bit more in trouble.

BP (38:48):

Or if it has your body temperature, or other things like that, there are just so many other pieces of data that maybe your phone isn't collecting that robots are definitely collecting.

CW (39:00):

Well, and people, I mean, at least theoretically, are in some control of their cell phone, right? You could leave it at home. You could turn it off. You can adjust the privacy settings and the location tracking settings.

EW (39:13):

Lead box.

CW (39:14):

You could throw it in the ocean. But a robot is not under your control. It's in an environment that you happen to wander into, and it's going to collect stuff passively. It's the same issue with facial tracking, right? And things like that, where you don't get to opt out.

CW (39:29):

Even if it's hard to opt out -

BP (39:30):

[Affirmative].

CW (39:30):

- ...with a cell phone, sometimes, there's no option to opt out with something that's just ambiently in the environment.

BP (39:38):

Yeah. Especially when it's walking, and moving, and could follow you. If you go around a corner to not be in its cameras, and it follows you, because it thinks that's suspicious, that adds a lot of stress to your interactions and walking around and just existing.

BP (39:55):

...If you try to get away from a robot, and it won't leave you alone, how uncomfortable.

EW (40:03):

Okay. So it's really about the robots being able to gather data about us that we don't want, as opposed to the robots that we find attractive? That's not the right word, but I'm going to go with it. Attractive enough to engage in a social manner in which we give up our data purposefully.

CW (40:30):

Or get tricked into.

BP (40:32):

Yeah. Privacy is just being able to control who you give your data to, when, and why. And so, like was said, like with a cell phone, you control that. But with these robots, especially in public spaces, they are someone else's property. And there are questions about laws.

BP (40:52):

If you put a sticker on a robot to take away some of its abilities, what laws are you breaking? So even if you try and do small fixes to increase your own privacy and increase the privacy of others, what laws are you breaking?

EW (41:08):

Well, and I have removable vinyl. And I could imagine using that to disable cameras, but that, even if it's not permanent, is that some sort of misdemeanor, because I am disabling their ability to track me? But I never agreed to be tracked. So...there are some pretty gnarly legal aspects here, aren't there?

BP (41:37):

[Affirmative].

EW (41:37):

So you're not starting your PhD program until fall. But you are working now at Great Scott Gadgets?

BP (41:47):

Yep. That's correct.

EW (41:48):

I believe we talked to Kate Temkin from there about USB things earlier in the year, or possibly last year, possibly a decade ago. I don't remember. What do you do there?

BP (42:01):

I am the community manager. So I have the fun job of dealing with all of the GitHub tickets,...being first point of contact for anyone who wants to talk to Great Scott Gadgets, or get customer support help.

BP (42:22):

I also will eventually help run events when we're out of the pandemic, and give talks, and focus on giveaways. And I'll also be making swag, including stickers.

EW (42:35):

What are you going to do with your stickers? What are the things that you find most important that you're like, "Okay. On a sticker I do, it's going to have these things?"

BP (42:46):

Well, so it's a little bit different for things I would create in my own time versus things I would make for a company. But I really love a lot of the Great Scott Gadgets ethos, which is making everything as transparent and open as possible.

BP (43:03):

Even the tagline for the company is "making open source tools for innovative people." And...I love how freely the company shares knowledge. And one of my favorite things is looking at the different layouts for all the different pieces of hardware.

BP (43:22):

And I really want to make some stickers, and t-shirts, and stuff that really show that hardware since it's open. It's available. But I think throwing that on a black t-shirt looks really cool.

EW (43:35):

As part of being community manager, you deal with GitHub issues. Do you also work with software engineers who are contributing to the open source parts of Great Scott Gadgets?

BP (43:49):

Yeah, absolutely. So I help review pull requests. I try and ask questions to make sure we're talking about the same thing. And a lot of the GitHub issues end up turning into, "Maybe you could try and fix this," and people submitting their first pull requests.

BP (44:06):

So I definitely love when people open issues and pull requests, because it gives me another opportunity to interact with and support people, and see new ways to use the things that we've already made.

EW (44:20):

Getting people to do open source, it's still hard. How do you get them to engage?

BP (44:30):

One of the things that I've been trying to do in the company is respond to issues quicker. I've tried to put a service-level agreement in place that I'm going to respond quickly, so people know that we care, and that we want to hear their feedback. We want to hear what they're having issues with so we can make everything better.

BP (44:48):

So I would count issues as contributing to open source, because it does affect how we think about what we're making, what products we want to make in the future. So when people think about contributing to open source, they normally think of only contributing code.

BP (45:04):

But contributing documentation, writing down the issues, joining the Discord, and interacting with us, and telling us what you want are all ways to contribute to open source. And I really want people to focus on some of these other things other than contributing just software, because there are so many ways to be involved.

CW (45:30):

It seems like there's a real mix in the quality of various open source projects and how welcoming they are. What do you think are some of the keys to getting people to feel comfortable submitting a first PR, or an issue, or something, without fearing that they're going to be yelled at, or made fun of? Because that happens sometimes.

BP (45:52):

Yeah, I've definitely been in that scenario. And it's what turned me off of open source for so long, was people just being outright rude, or saying, "This is too simple of an issue to contribute. We're not going to merge someone else's stuff when we can do this in a line." It was just antagonistic and rude.

BP (46:11):

But with Great Scott Gadgets, the fact that the company constantly went to conferences, that I saw everyone constantly, all over the world, at all these places, and that they were giving back through talks, through giving away hardware, through making sure everything was open source, and really touting that.

BP (46:33):

And having tons of videos, and write-ups, and stuff out there for people to learn from it, it was obvious that it was more than just a pet project. It was kind of like a labor of love. And so that's one of the reasons I joined the company, was just how positive all of these things were.

BP (46:55):

And so that's something I look for when I try to contribute to open source now, is how much are they giving back to the community and accepting the community?

EW (47:08):

I was talking to a person who wanted to contribute to an open source project, a big one, not Linux, but a big one, and didn't understand why no one would help set up the computer in the right way. And I was like, "Okay, they have a getting started guide. I don't understand what your problem is."

EW (47:36):

And the problem was that it took a long time to compile, and they wanted help understanding how to get faster compiles.

EW (47:42):

And I tried to explain that they have a lot of people who want to do a small change, not a lot of people who actually do a pull request, but too many people to help every single one just get set up.

BP (48:02):

[Affirmative].

EW (48:02):

Do you see that problem?...I'm not sure I handled it well, other than trying to explain from the other side. Is there some way to say, "Yes, we really want you to contribute, but could you please not waste our time?"

BP (48:21):

That is something honestly I've been struggling with now that I am first point of contact for the GitHub issues, is that before me there wasn't anyone dedicated to this. It was software engineers who had a bit of spare time, or other things like that, but it slows down the open source project every time we need to help someone.

BP (48:48):

And so, I'm really happy that Great Scott Gadgets grew enough that they were able to pay and hire me, because the company is doing well enough. So now I can go in and do that, but not every open source company can afford to hire someone.

BP (49:05):

And so, another way to contribute to open source and help other people, is to be part of the community, and watch projects that you like for how they do try to solve issues, and try to help other people.

BP (49:17):

And so right now I'm doing that by looking at old issues and how they were solved, seeing if new issues match up, and being like, "Okay, well, we've tried these sort of solutions in the past on these issues. Let's try that again over here."

BP (49:30):

And using those issues and going back and forth is one way people can contribute to open source, help other people, free up open source developers' time to do other things. So it's definitely hard. And every time somebody opens an issue, it is taking up time.

BP (49:48):

But hopefully, contributing to projects, either with money or buying the products they're related to, can help open source companies hire more people that can do this type of work.

EW (50:03):

I guess, probably because of being around with the dinosaurs, I still don't understand how open source companies can make money. Do you have any insight into that?

BP (50:16):

Yeah. So one of the things with Great Scott Gadgets is that we do sell products, like the HackRF, or the Ubertooth, or soon we'll be coming out with LUNA, which is something that Kate Temkin's been putting a lot of time into. And it's that hardware that really supports the company.

BP (50:35):

And so anytime anyone buys one of the actual Great Scott Gadgets pieces of hardware, it is funding things like creating new hardware, or me helping with these pull requests, or GitHub issues. And so that's the number one way people can support us.

BP (50:53):

And we hope that our hardware is what people buy instead of the knockoffs, because of the customer support we provide and some of the guarantees that our resellers provide as well.

BP (51:09):

And so, it's part of that ecosystem and giving back to the community. And if people want to give back to our company, buy our hardware. It's really helpful.

EW (51:21):

Well, and this LUNA board, looking at it, it does protocol analysis for USB. It lets you create your own USB devices. It has an FPGA to help all of this. I wouldn't want to create that board. I mean, if I wanted to use that board, I would not first want to build it. I would just want to use it.
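
[Editor's note: below is a minimal Python sketch of the lowest rung of USB tinkering, enumerating attached devices with the pyusb library. This is the editor's illustration, not part of the LUNA toolchain, which goes much further by sitting between a host and a device to analyze and emulate USB traffic.]

    import usb.core  # pyusb: pip install pyusb

    # List every USB device the host can see, with vendor and product IDs,
    # similar to the output of lsusb on Linux.
    for dev in usb.core.find(find_all=True):
        print(f"Bus {dev.bus:03d} Device {dev.address:03d}: "
              f"ID {dev.idVendor:04x}:{dev.idProduct:04x}")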

BP (51:44):

Yeah.

EW (51:45):

So I understand why people would buy the hardware instead of making it, and then that pays partially for the software as well as the hardware.

BP (51:56):

[Affirmative].

EW (51:56):

Cool.

BP (51:58):

Yeah.

EW (51:58):

I can understand that.

BP (52:00):

Yeah. And hopefully, another part of my job since I've come on to the company is creating interesting types of swag, which hopefully, we will give some away at conferences, but maybe some people might be interested in buying.

BP (52:13):

So, hopefully I will get all of that up and running soon, and people will have an option to support us by buying cool things that aren't just hardware.

EW (52:24):

Well, as long as you're going to say that, I should point out that we have new swag and new merch in our Zazzle store. I did this talk with the map file that talks about map files.

CW (52:39):

What? [Laughter].

EW (52:39):

And now,...somebody actually printed it as a poster and sent me a picture of it on their wall. And somebody asked for mouse pads. And so I went ahead and made mouse pads, and new mugs...People wanting to buy this stuff is really weird. I thought you had to give this away. I didn't realize people would buy it.

BP (53:03):

[Affirmative]. It's a great opportunity to support the things you love.

EW (53:07):

Yeah, it is. Even if you're not trying to make a lot of money off of it, it's also a way, as you said, with stickers, somebody will come in and say, "What's that?" And if you've got a neat mug, or an Embedded sticker, or poster, I can totally see it. Kind of cool.

EW (53:26):

Well, I think it's about time to get back to our weekend. Do you have any thoughts you'd like to leave us with?

BP (53:37):

On the topic of open source, contribute. I've been doing so much for the Great Scott Gadgets repositories, and I would love to see people open more issues, open more pull requests, or even reach out to me if you want to talk to me about anything Great Scott Gadgets. I am always here and happy to hear from new people.

EW (53:58):

Our guest has been Straithe, Brittany Postnikoff. You can find her at straithe.com. That is S-T-R-A-I-T-H-E dot com. And of course, there'll be a link in the show notes.

CW (54:12):

Thanks, Straithe.

BP (54:13):

Thank you.

EW (54:15):

Thank you to Christopher for producing and co-hosting. And thank you for listening. You can always contact us at show@embedded.fm, or hit the contact link on embedded.fm.

EW (54:25):

And now a quote to leave you with. This is from William Gibson. "Time moves in one direction, memory another. We are that strange species that constructs artifacts intended to counter the natural flow of forgetting."