427: No Fisticuffs or Casting of Spells

Transcript from 427: No Fisticuffs or Casting of Spells with Elizabeth Wharton, Chris White, and Elecia White.

EW (00:00:06):

Welcome to Embedded. I am Elecia White here with Christopher White. Our guest is Liz Wharton, and we're gonna talk about technology, law, public policy and cybersecurity.

CW (00:00:17):

Hi, Liz. Welcome.

Liz (00:00:18):

Hi, thank you. Thank you so much for having me.

EW (00:00:24):

Could you tell us about yourself as if we met at a random diner at DefCon?

Liz (00:00:32):

I wish I had a much better superhero origin story. If we were sitting together at DefCon, I would say I try my best to keep folks out of trouble, and enable the doers to do what they do, and do that from a legal as well as a policy, and just a business operation, standpoint.

EW (00:01:04):

Okay. Now, if we met in the boardroom at Scythe, how would you introduce yourself?

Liz (00:01:19):

I am the adult in the room. I have almost 20 years of working with the hackers, the makers and the breakers, and really just enabling. By that, I mean, I have done everything throughout my legal career from, I started off as a business lawyer, and I worked in banking and real estate and just helping companies build and grow. This is what happens when you have friends that like to ask why, and like to take things apart. They would start to do things, for example, doing research with unmanned systems and asking me to help them legally do that, or draft the contract and it went downhill from there.

EW (00:02:11):

All right. And that downhill is a large part of what we want to talk about.

Liz (00:02:17):

<laugh> Right. It's the fun part. The lawyer stuff tends to be the boring part.

EW (00:02:21):

Well, as you said, keeping us out of trouble is pretty important.

Liz (00:02:27):

It is. And also helping everyone, from other members of the board and other C-suite executives, down to the legislators and the regulators, and being that transition, that bridge and translator between: "This is what people are doing. And this is where you would like to see things go. Here's how we can do that. And it may not be how you think."

EW (00:02:55):

We want to do a lightning round where we ask you short questions and we want short answers. If we're behaving ourselves, we won't ask how and why, and who's your favorite superhero. No, that one we will ask. Let's start with that. Who's your favorite superhero?

Liz (00:03:10):

I like Black Widow. Probably because of the red hair.

CW (00:03:14):

But She-Hulk is a lawyer. Sorry.

Liz (00:03:17):

She is, but I-

CW (00:03:19):

I'm breaking the rule already.

Liz (00:03:20):

Right, but it looks like it requires a lot of gym time to stay quite that fit and shopping for clothes seems to be- Admittedly, I still have to catch up on both reading the novel, as well as watching the show. So, no spoiler alert.

CW (00:03:38):

Do you have a favorite video game?

Liz (00:03:41):

This is where my downfall is and it's embarrassing. I like the Mario Kart, the Pokemon GO, the things where basically you can pick it up and it's fairly intuitive, and then you can drop it.

EW (00:03:57):

Do you have a favorite historical case file?

Liz (00:04:02):

I really don't have one. One of the fun ones to look up is the state of Maine sued the Declaration of Independence. I cannot make this up. It is a fascinating story. When you get down into it, it's really about lawyers getting creative on how to establish who owned, or the proper chain of title for, a document. But just the fact that the state of Maine had to style a case that they were suing the Declaration of Independence. I just love that.

CW (00:04:41):

I always like those federal cases where it's like, the United States versus $5,263. <laugh> Because they're trying to reclaim money from someone. It's really weird.

Liz (00:04:52):

Or 20 John Does. Or yes, any inanimate object. But it's the Declaration of Independence! How can you sue this? Maine, what is wrong with you? Spoiler alert: they lost.

CW (00:05:09):

Do you have a favorite law?

Liz (00:05:12):

That's a good one because it depends. Is it the favorite law that I like to make fun of? That's the CFAA. I think it's terrible; let's start from scratch almost. Do I have a favorite one that I think is fantastic? There are too many to choose from.

EW (00:05:35):

Do you have a tip everyone should know?

Liz (00:05:39):

First of all, break it down to basics. With everything, don't assume we're all speaking the same acronyms, the same language. Break it down.

EW (00:05:53):

Okay. I'm gonna go back to a previous question. You said the CFAA, the Computer Fraud and Abuse Act?

Liz (00:06:00):

Yes! Thank you. I'm so glad you circled back to that because as soon as I said it, I was like, "Oh gosh, not everyone might know." Yes. The CFAA, Computer Fraud and Abuse Act, we're seeing it get reined in a little bit, but not truly. And we're seeing some states try to get creative, but that's frequently a law that's used to tamp down research. Even companies will try to use it sometimes to say, "Employees stole our stuff! They shouldn't have been doing it." It's like, "No, stop it. Be adults."

EW (00:06:43):

Isn't that what law really comes down to? Someone saying, "Stop it. Be adults."

Liz (00:06:49):

Right! And, "Sit in that corner and think about what you just did, because that is not what you were supposed to do." Yeah! Frequently.

CW (00:06:57):

That was the law that was used against Aaron Swartz. Right? That was, I think, where it came on a lot of people's radars as, "What's going on here?"

Liz (00:07:08):

Yeah. Not only that. You had, for example, the state of Georgia had- A hot topic is always election security. The state of Georgia's voter databases had some security flaws within them. Researchers alerted the Secretary of State and the proper folks about it. It put a little bit of egg on the state's face. In this case, it was just stuff was not protected as it should have been. So of course, the natural reaction to that was, "You know what we should do? Instead of spending time, effort, energy, hardening our security around the voter rolls and these election systems, we should pass a law that extends the Computer Fraud and Abuse Act to cover this." Luckily, EFF helped bring people together to write to the governor, and the governor at the time did veto it. Again, "Be adults! This is not what you should be doing." As with Aaron, this is not what this was intended to do. You should be sparking creativity and the free discussion of knowledge rather than just crushing it.

EW (00:08:41):

It does seem like the response to being informed of security holes or such things, privacy issues, is not to say, "Thank you, I'll fix that." But, "Here's a lawsuit. You can't tell anyone and I'm suing you for criminal acts against this."

Liz (00:09:01):

Right. It's absolutely fascinating that when you have researchers who are going about it the right way. By that, I mean, they're being responsible, where they're providing the company or the maker of such and saying, "Here is this flaw. I've done this research for you and I'm trying to make it better." And instead it ends up, "Now we're gonna sue you."

CW (00:09:41):

The CFAA was passed in, I think, 1986?

EW (00:09:46):

Wait, wait, I have information on this. I pulled up the Wiki page. "The original 1984 bill was enacted in response ... that computer-related crimes might go unpunished" from...

Liz (00:09:59):

Drum roll please.

EW (00:10:00):

"The original crime bill characterized in the 1983 techno film WarGames."

Liz (00:10:07):

That is absolutely correct.

CW (00:10:11):

What? WarGames inspired this law.

EW (00:10:12):

Yes. 1986 is when it went into effect. This is the WarGames law. I didn't know that. That's so cool.

Liz (00:10:19):

Yes! Who knew the power Matthew Broderick had? He inspired!

EW (00:10:30):

"A realistic representation of the automatic dialing and access capabilities of the personal computer." <laugh>

Liz (00:10:38):

Yeah!

CW (00:10:38):

Very relevant to today's world.

Liz (00:10:42):

You can't make this stuff up.

CW (00:10:43):

Right. That was gonna be my question though, is how can a law written, oh God, almost 40 years ago...

EW (00:10:50):

<laugh> Wow.

CW (00:10:50):

Since I saw WarGames in the theater, I'm really depressed. How can a law that old, when computers were basically what our toasters do now and internet connectivity was limited to three computers at Stanford, how can that apply?

EW (00:11:08):

That can't even be relevant.

Liz (00:11:09):

<laugh> This is where frequently, you have to go to legislators and regulators and help them shift the focus. Stop focusing on the shiny object aspects of something. Stop trying to be reactive, and instead focus on what is the actual harm you were trying to prevent. Not the how, but the what, and break it down like that. About five years ago, it was drones. "They're gonna be taking over the skies. They're gonna be Peeping Tom, spying." All these things. I had a lot of fun, at the state and local level even, working out, going, "Quit panicking." We've had unmanned systems in operation since before World War II. Marilyn Monroe worked at a factory that built radio-controlled aircraft.

Liz (00:12:28):

So stop trying to focus on this aspect of drones. Instead, what are you trying to get at? "Wardriving being legal, we're afraid they're gonna do this, this or this." That we can work into legislation. But if you're worried about this, give researchers ten minutes and they will have found a different way to do that. So your actual legislation is now obsolete and all you did was add one more law to the books that really doesn't work.

EW (00:13:04):

There are a lot of laws on the books.

Liz (00:13:07):

<laugh> Indeed, indeed.

EW (00:13:11):

Is anyone ever going to say, "You know what we need to do? We need to just get as many of these off the books."

CW (00:13:17):

Do you know how hard it is to pass a law? Imagine how hard it is to get rid of one.

Liz (00:13:21):

<laugh>

EW (00:13:22):

They should all just expire over time. Invisible ink.

Liz (00:13:26):

That could be fun!

EW (00:13:28):

Okay. So that's unrealistic.

CW (00:13:29):

Then you have to pass ones to replace them, even good ones that you want to keep.

Liz (00:13:34):

Exactly. I work with a lot of startups now. Shameless plug, Scythe, if I'm permitted to give plugs. Scythe has grown; we've become an adult company, in that we actually have to have employee handbooks and all these policies. When I was with the city of Atlanta, we were going through and updating our policies. A microcosm, an example of that: at one point we had three different policies that dealt with beepers, pagers <laugh>. But we didn't have something that dealt with laptops when you travel. All right, let's get a home run. So we did. But when you take that on a broader scale, that's a lot harder to do.

EW (00:14:34):

Scythe, what do they do?

Liz (00:14:37):

We are an adversary emulation platform. We enable that big picture. What's working. What's not. What's the latest and greatest when threats are coming. We are post-breach, but we allow companies to see what's going on in their systems. What's working. What's not. I like that, having survived being with the city of Atlanta when we had our ransomware attack, and having to rebuild all of our systems and our processes. I'd like to know, did we configure this correctly? Is it gonna catch all these other things? What is it catching? What is it not? That's where platforms like Scythe come in and really just are a force multiplier.
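
(To make the adversary emulation idea concrete, here is a minimal sketch; it is not Scythe's actual product or API, and every function, file, and variable name is invented for illustration. The pattern: plant a harmless stand-in for an attacker technique, then check whether the defenses raised an alert that mentions it.)

```python
# Toy adversary-emulation drill: run a harmless stand-in for an
# attacker technique, then report whether detection noticed it.
# All names here are hypothetical; real platforms do far more.
import os
import tempfile

def emulate_suspicious_file_drop(directory: str) -> str:
    """Stand-in for a technique: drop a file named like a
    ransomware note, with harmless contents."""
    path = os.path.join(directory, "README_DECRYPT.txt")
    with open(path, "w") as f:
        f.write("benign test content - emulation drill\n")
    return path

def detection_fired(alert_log: list[str], indicator: str) -> bool:
    """Did any alert mention the indicator we planted?"""
    return any(indicator in line for line in alert_log)

with tempfile.TemporaryDirectory() as workdir:
    planted = emulate_suspicious_file_drop(workdir)
    # In a real exercise these alerts would come from the EDR or
    # SIEM; an empty list here stands in for "nothing was caught."
    alerts: list[str] = []
    verdict = "caught" if detection_fired(alerts, os.path.basename(planted)) else "missed"
    print(f"Technique 'suspicious file drop': {verdict}")
```

(The useful output is the caught-or-missed report at the end; scaled up across many techniques, that is the red-team-to-blue-team feedback loop discussed next.)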

EW (00:15:27):

So you're a red team.

Liz (00:15:29):

Think bigger. What happens when the red team comes in and says, "It's broken." And gets to walk away? No, you want the blue team aspect-

EW (00:15:38):

That's the best part.

Liz (00:15:41):

It kind of is. You get to be like, "Chaos bomb!"

EW (00:15:44):

Exactly.

Liz (00:15:44):

"Now you get to deal with it!"

CW (00:15:48):

Everyone wants to be the first scene in Sneakers.

Liz (00:15:51):

Right! Unfortunately, I have been the one sitting in the boardroom going, "Well great, now what?" So, what Scythe does is we help the red team and the blue team work together better.

EW (00:16:09):

Can we define those for people who don't know?

Liz (00:16:13):

You think of the red team as the ones that are going in and doing the testing. They're poking and finding the holes. Finding what's working, what isn't. Either a one shot, or over time, "This is what we see. We were able to get in this. We were able to do this, this and this." Blue teams, I'd say, have the harder part. They're actually defending. They're the ones maintaining the systems, and really running the SOC (security operations center) and keeping the lights on. Then you have the concept of the purple team, which is what happens if the red team and the blue team are sitting together in the same room and working together in real time, or as close to real time as you can, getting that feedback. Instead of having to wait six months for the report, like, "Okay, great. So you're telling me for the past six months, somebody has been sitting in my network. Ah, helpful!" From a lawyer's summary, I hope that made sense.

EW (00:17:18):

It totally did. But these are very different skills. One is breaking things and one is building and, more importantly, maintaining them well.

CW (00:17:29):

Monitoring, too.

EW (00:17:30):

And monitoring.

Liz (00:17:32):

Yeah! Right! The blue team has a much longer to-do list.

EW (00:17:38):

They also have a much larger liability.

Liz (00:17:41):

That, and that's where folks like me can stay employed. Yes, there are risks, there are liabilities. Part of that is bridging, going back to breaking it down, bridging that gap. How do you translate? I cut my teeth on security working with the red team type folks, doing the offensive security, and that is the fun! It's, how did you figure out how to jailbreak an iPhone, put an extra battery pack on it, add in some software so that it wouldn't go into sleep mode, do Wi-Fi sniffing in the mail rooms of companies, and find out just how unsecure their networks are? That's fun! But when you go and deliver that report, how realistic is it, if we're gonna prioritize, from a company going, "I have a million dollar budget." Wouldn't that be nice! "I have a million dollar budget. How should I spend it? Should I, obviously, add passwords to stuff, lock down the networks, have the network segmentation so that if they do get in one way, they're not... yeah. Doing all of that. But beyond that, how do you prioritize?" That's a show hack. How do we do that? That's where that dialogue needs to happen, where you translate and you break it down and you help people prioritize.

EW (00:19:28):

That does make sense. But then when I think about some of the hacking that happens, being able to open car doors and enter houses that are secured with cyber locks...

Liz (00:19:45):

<laugh> Right! All the cybers!

EW (00:19:50):

So the first person, the researcher, I totally think they should be protected. That makes sense. We need people who are trying to break these things to be the red team, to be the penetration tester. So we know where the errors are and we can fix them. As an engineer, I want to fix those bugs. But where does the responsibility lie when these hacks are published and other people use them for nefarious purposes?

Liz (00:20:19):

Such a multifaceted question. Now my brain is like, let me talk about this, and let me talk about that. So one aspect, being with the researchers: you're coming in and you're presenting the findings and saying, "This is how I did it. This is how you could replicate it." And doing it in such a manner, with a level of respect. We have organizations like disclose.io, a nonprofit that really helps bridge that conversation. If you were turning this episode into a drinking game, I feel bad that I have said "bridge" so many times. You want to bring that conversation in, in a productive manner, so that it's not the, "Ha ha, your engineers are idiots! Look what they didn't know how to do."

Liz (00:21:18):

And you're thinking, "Oh, you mean our team of five people that are dealing with insane amounts of tech debt and we're having to basically recreate everything each time we do a new feature or iteration of the product? No, they're idiots!" Huh! No, that's not how this works. And, also providing the information, but running against a clock of, if one person's found it, there's a good likelihood it's already been found by somebody else. They just happen to be the first person to talk about it. Think of how many people have different pocket research they just keep to themselves, "This is cool. I don't know what I'm gonna do with this yet. I don't feel like dealing with a responsible- but it enables me to do this, this or this." There are a lot of people who haven't disclosed.

Liz (00:22:18):

It's reminding companies that someone came and just did your research for you. They just did this and they're willing to work with you. That's a good thing. Now your engineering team doesn't have to find it themselves. Somebody else did. Work collaboratively together.

EW (00:22:46):

In a shiny, happy world, that's what should happen.

Liz (00:22:49):

Oh yeah. But this is not shiny and this is not happy. Look at it from the perspective of, you just walked into a room and said, "Oh, hey." Especially if you look at what you have coming out of not only the US, but other countries as well. Saying, "Oh, wow. Now that you know about this vulnerability..." In some cases there aren't even the caveats of, is it significant? You had to patch. You had to patch now.

Liz (00:23:20):

And if you don't patch now, or if you don't patch within 48 hours, 72 hours, et cetera. And if you don't publicly disclose that you had a breach or vulnerability, et cetera, then you better open that wallet, because you're about to get hammered. Wow! There's a lot that goes into it and having to explain, "Well, sure, we can patch. We can fix. But in some cases we're a hospital and some of our critical systems are running on operating systems that were introduced, uh, WarGames-era, a lifetime ago. These were introduced before some of our security team was even born! <laugh> By patching it, we lose use of this machine. Now what do you want us to do, genius?"

CW (00:24:14):

There's a lot of Windows in medical devices. It's sort of depressing.

Liz (00:24:20):

I've represented a hospital that was the only medical trauma center within a hundred mile radius of a very economically depressed area. I think they had one and a half CAT scan machines, because the half was one that was broken, which they just used to pull parts off of. If you took them offline, you took away that first line of medical care for an area where some people did not have the luxury of being able to travel to a different hospital.

CW (00:25:00):

Are bug bounties something that's useful, that actually accomplishes the goal? I know certain companies resisted them for a while. I think Apple has only recently started paying them. Is there an argument that they do encourage the kind of disclosure, or the kind of testing and disclosure, that we want? Or is it that people can't compete with the black hats?

Liz (00:25:22):

Bug bounties done well, I think are great. By that, I mean, going into it with the right spirit. I'm gonna use that loosely in the sense of, I have seen researchers who have shared with me, "I've disclosed this and they sent me this NDA (nondisclosure agreement). Can I sign it?" I looked at it. I was like, "No. I'm sorry, this is a conversation you would need to have with your partner. Yes, if you signed this, they would certainly give you, I think it was 30 grand in cash. That's a sum that I should not be the one making the decision for you. But on the flip side, if you signed this NDA, it means you're agreeing to walk away from an entire line of research. That, let's be honest, that's your passion. And that's your day job. So I would tell the company what they could do with this NDA."

Liz (00:26:33):

<laugh> But I can say that with a luxury of, I'm not gonna collect on this. I am armchair quarterbacking this like nobody's business. "You do you, but I wouldn't suggest you sign it." That's where you have to have that dialogue. Understandably, some companies are just gonna be absolute, because they panic, they don't know what to do. You have companies like Luta Security, Katie Moussouris, who essentially created some of these concepts of "Hack the Pentagon". There is a way to do it.

Liz (00:27:18):

There is a way. You go to DefCon and you can hack a satellite, because we've created an environment where everyone understands, we've put the parameters in place. We've opened up systems that ordinarily you or I wouldn't be able to have access to. Well, maybe not, depending on your security clearance. You have the car hacking village, you have the vote hacking, the biohacking villages. Those are great examples of how you can do bug bounties, or even just that collaborative environment. But it took years and lots of yelling and lots of threatened lawsuits and intimidation for researchers. And some researchers having to learn, "Maybe not presenting it that way. Let's present it somewhere different, or in a different fashion, so that you can still get heard. They will still respond, but let's build that level of trust."

EW (00:28:25):

There is something about saying, "Your engineers are stupid" that makes people not want to listen to you.

Liz (00:28:31):

Absolutely. And also it's wrong. If you're starting from that premise, why am I gonna listen to what comes next? Because your initial statement is just wrong. It's reminding people to give each other grace.

EW (00:28:49):

Yes, and empathy. And to understand that many big companies have small teams that do this, that do the blue team research and maintenance. And they're against however many people want to be against them. Some companies definitely draw more ire and therefore probably more penetration testing.

Liz (00:29:18):

<laugh> Yeah. You have some companies, at first blush, and I haven't even had a chance to look into it, like Patreon, who let go their entire security team.

EW (00:29:32):

Yeah, what's with that?

Liz (00:29:34):

I don't know. Wow, you are not gonna draw any sympathy for anything that follows. Pulling from my experience, where you have city and state governments, look at the city of Atlanta. Their blue teams are defending not only a municipality, so you have the city government, but also they were defending the networks for the Department of Public Works that is providing water to, and sewer treatment facilities for, over 6 million people and businesses. They also were defending the networks for the world's busiest airport. So if you think it's bad when Southwest goes down. There was, a couple of years ago, was it Delta? Someone had a glitch in their system, and everyone trying to get home from Black Hat and DefCon, if they were flying that airline, there was an issue. That's not the world's busiest airport. You have the world's busiest airport and the ripple effect. They have a handful of people getting paid city government salaries, who aren't necessarily being offered the latest and greatest. Some of the training courses are not priced such that they can easily access them, and everything's stacked against them. And yet they keep it running on a day to day basis. Sorry, I'll get off my soapbox.

EW (00:31:21):

No, no, it's a good soapbox because it's such an impossible problem. I want to tell those people you're doing a good job, but then I remember being in college. I remember how fun it was trying to break somebody else's system, just because. I mean, come on, their password file, it was open!

CW (00:31:44):

It was the nineties. Things were different.

EW (00:31:45):

I mean, they didn't expect anybody to be messing around.

CW (00:31:49):

They'd just started salting and hashing passwords back then.
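
(For readers who haven't seen it, here is a minimal sketch of salting and hashing in Python, using only the standard library. The names and iteration count are illustrative; production code should use a vetted library such as bcrypt or argon2 rather than hand-rolling this.)

```python
# Minimal sketch of salted password hashing with the standard
# library. Store (salt, digest), never the password itself.
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest). A fresh random salt per password
    means identical passwords produce different digests, which
    defeats precomputed (rainbow-table) attacks."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("guess", salt, digest))                         # False
```

(The salt is stored alongside the digest; it isn't secret, it just makes precomputed lookup tables useless, which is roughly what those open nineties password files were missing.)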

Liz (00:31:52):

Right. It is bananas to think of that. One of the ways that I try to make a difference is, it was something that hopefully left kind of an impression with my colleagues at the Department of Law, but also building up that level of trust within the technology department. Making a point to go down and say, "Help me understand, help bridge that gap, because I'm sitting there in the C-suite with the Chief Operating Officer, with the CIO, with the Mayor's Office, helping to do this stuff. So rather than me sitting in my ivory tower, tell me what you actually need. Tell me where your pain points are. When I'm looking at these contracts and when I'm helping prioritize stuff, do you need more training?"

Liz (00:33:03):

"And if you need more training, what can I bring in? Do we need to create a partnership with someone like Cybrary, where we can get you hands-on stuff? Do you need access to your peers? Do you need data sharing? How about CISA and the alerts that they'll put out? What do you need as the blue team? Where are the sticking points for you?" If you get more dialogue between the C-suite, the legal department, coming in and telling the engineering teams, "Tell us. Break it down into digestible, building blocks. What's gonna make your job easier? What can we do to empower you?" I think that's a part of the conversation that it's a two-way street, but we need to be having more of.

EW (00:34:03):

How do we make sure those people, the blue team folks, aren't being held responsible or liable for the attacks that happen? You mentioned ransomware. Was it negligence? Maybe not in that case, but there are times where it is pretty negligent to leave some holes in your firewall, or in your operating system. And then ransomware happens, then the company has to pay a lot, then does somebody lose their job? Or is this a, "I couldn't have done what you wanted me to do" sort of event?

Liz (00:34:44):

Well, as long as we're not blaming the interns. If you have your ransomware or data breach bingo card, always have a square for "blaming the interns", because somebody will. It's making sure that the right people are being blamed, or is it the decision makers? And what is the point of the blame? Is it to change a behavior? Or is it we need somebody to scapegoat, drag in front of...

EW (00:35:22):

Draw and quarter.

CW (00:35:24):

Justice! Justice!

Liz (00:35:24):

Right! Right! Shame! Shame! Look at how you see this with airplanes and the NTSB and FAA, the post-accident conversations. If they're determining it was pilot error, or if it was aircraft malfunction, or was it a security system? Boeing, what? What is the goal? What is the end game for why we're doing that? Is it because we want to learn from the mistakes? You are seeing a bigger push between the SEC, the Securities and Exchange Commission, the Federal Trade Commission and all these different government agencies, at least on the US side, that are coming in and saying, "All right, is the blame because somebody in the boardroom knew about this and did not properly prioritize it?"

Liz (00:36:40):

They knew that there might be gaps in our firewall, or they were not spending enough money or were not properly protecting different data. And we made a decision, but instead we're gonna spend the money elsewhere. They are gonna be held accountable. So, it's not the person on the front line. It's not the blue team. Or it's not the person sitting in the SOC or the NOC who are sitting there going, "Oh, we're gonna get blamed." But instead it's, "Who was in the position to make that decision?" And then we need to hit 'em where it hurts.

EW (00:37:26):

You mentioned opening wallets, if things don't get patched in a reasonable amount of time. I have had my data stolen and I've never gotten any money. I've gotten some free, quasi-free credit locking from people who lost my data. That was useful.

Liz (00:37:52):

At least you got that. I always wish we would go back to, you open a checking account and you get a free toaster or a set of steak knives. My kitchen would be all set.

EW (00:38:05):

Exactly. Who are they paying? It's not me, and I'm the victim here.

Liz (00:38:13):

<laugh> That's part of the problem. We are trying to shift some of the dialogue of, "Oh boohoo." Illinois is a great example. A lot of the focus tends to go on California, and even Massachusetts, for some of their data protection and their privacy laws, but Illinois has BIPA, the Biometric Information Privacy Act. If you are sharing biometric or other, for example, think facial recognition- Instagram just got hammered by Illinois. They said, "You were scanning all these pictures and tracking the facial recognition and the biometric information from all these people without telling them. And that violates the BIPA statute in Illinois."

Liz (00:39:30):

I can't remember, I want to say it was $65 million, or something ridiculous. That money will go back into the state. I don't know how they're gonna distribute it per se, because, keep in mind, they smack companies with- Facebook has also run afoul of BIPA. The money gets collected, but they have lawyers. They'll appeal. If we're gonna bash on different companies, whether it's deserved or not, Twitter has actually been under a consent agreement with the Federal Trade Commission since, what, 2011? They paid a fine and stuff and they all negotiated down, but that's part of the problem. For them, it's the cost of doing business. Insurance. We'll make it up here. We'll make it up there. Anytime there's those class action lawsuits, the lawyers get money, and somebody gets a little bit, and the rest of us get like a 50 cent check or...

CW (00:40:48):

Or coupon.

Liz (00:40:48):

I forgot when I received... Let me just get this straight. Y'all paid more to print out and mail this, than I'm going to receive. This is interesting.

EW (00:41:07):

And broken.

Liz (00:41:08):

Yeah. Everything's awful. <laugh> Everything's on fire. <laugh>.

EW (00:41:17):

Cybersecurity topics, DefCon to some extent, are viewed from the outside as confrontational and ego-driven. Is that a reasonable assessment?

Liz (00:41:36):

Prefacing that with the fact that I've been going to DefCon for 15, 16 years now, and I'm on the CFP review board and one of the goons. I don't think it's confrontational, and that hasn't been my experience. Do I think there are some of, especially when you look at early days, "Is this being presented in a productive manner?" Is this a "Ha ha, you all are idiots"? Is it being received as Spot the Fed games? Now, we have an entire- My joke is that, and I've helped work on it, you have, it's not really a track per se, but you have a policy department. It's not a village. We're actually a part of the main DefCon, but our focus is to bridge and encourage those conversations between regulators and policy makers and researchers, and put them in the same room. Believe it or not, nobody came to blows this year or last year, or the- I think we've been doing it three or four years.

Liz (00:43:02):

<laugh> There have been no fisticuffs. There have been no casting of spells, at least out loud that we have seen. No voodoo doll. Nothing has been burned in effigy, at least within the official DefCon space, which I think is great. Think of what it took to get the car hacking village, and the voting village, off the ground and to where we are now. With biohacking, you had researchers that were hacking their own insulin pumps. Then having to get in arguments with the manufacturers, say, "I'm not researching this to be snarky, to be rude, to be mean. This is the insulin pump that's attached to my body. Can you please secure it so I don't die?"

EW (00:44:05):

That seems like a reasonable request. It really does.

Liz (00:44:09):

You would think! It's bananas, but I'm proud of how far we've come.

EW (00:44:20):

You won an award this year at DefCon. Could you talk about that?

Liz (00:44:24):

Yes. Thank you! There are the Cybersecurity Woman of the Year awards that go to women professionals, and they break it down into different topics. My focus is on both the law and public policy surrounding privacy, in particular. I was recognized as the Woman of the Year in law and cybersecurity policy. So, privacy. I even got a trophy and everything, a little statue. I was like, "Look, mom, I'm not a complete raving lunatic. Occasionally people care about these topics."

EW (00:45:16):

Part of the award was about outreach, and being a good example for others. Is that important to you?

Liz (00:45:23):

Absolutely. Thank you so much for the softball that's gonna let me really share about some of my passions. Of course I love Scythe and I love the work I'm doing there, but I have the privilege of being on the board of advisors for the Rural Tech Fund, which Chris Sanders started on his own. I forget how many years it's been around. What RTF is doing is, we connect students and teachers to different- It's really bridging that access gap, and they have provided grants and support to students and teachers in all 50 states. One of the things I got to do with them was go talk to an elementary school class of students in North Georgia, and share with them about the research I was doing and what I was doing to help integrate drones into the airfield at Hartsfield-Jackson, at the airport.

Liz (00:46:35):

Rural Tech Fund had also donated a 3D printer, and training and materials, to the same class. So it's really like, "What do y'all need? What can we do?" and really help students. They worked with a native tribe up in Alaska, where the teacher reached out and said, "From elementary school to high school, we have the same building, one trailer, that we do..." Chris said, "What do you need? Do you need access to online labs? Do you need software? Do you need training materials?" And got them up and running. That kind of thing of, how can we inspire the next generation and really spark that curiosity that got us here? Think of old-school researchers. They didn't have a guide. They didn't have someone to show them how to do it, they just had to figure it out. How do we take that curiosity and grow it and encourage it? So, I work with the Rural Tech Fund, as well as try to do my best with Black Girls in Cyber, serving as a mentor there. This was the first year DefCon had a village for really encouraging both women and girls, but also women of color, to engage more and learn more and grow in cybersecurity.

EW (00:48:18):

There's so many things I want to unpack there. Let's start with Rural Tech Fund, because we have an organization, a nonprofit, near us called Digital Nest that we've talked about on the show, that provides technology access resources to people in our local agriculture area. They serve Watsonville, which is extraordinarily close to the Silicon Valley in terms of miles, but so far in terms of everything else. Is it about showing people what's possible? Is it about giving them the tools? Is it just about telling them that technology exists? What are the main goals when you talk about Rural Tech Fund?

Liz (00:49:09):

All of the above. It's not only just showing them the world of possibilities, but enabling students that wouldn't ordinarily have that access. They may not even know that it's possible. Now that they know it's possible, they or their teachers, and sometimes it's providing the training tools to the teachers and connecting them with the experts. "Now that you know it's out there, here's how you do it." Everything from assistive technology in classrooms, and providing that. Really, where Rural Tech Fund, to me, strikes a chord is it's not dictating, "This is a program. This is what you need." It's more about going to the teachers and the schools and the students and saying, "You tell us. What do you need? What can we do to help you and your students? Is it connectivity? Is it just access to this? Is it training on this?"

Liz (00:50:26):

In my case, I took a bunch of micro drones down to the school. Of the, I think it was a third grade class, 30 students in the class, only about six of them had ever been to an airport. So, educating them. I had a topographical map, an aerial map of the airport. I said, "This is what we're talking about, guys. Every minute a plane is taking off and a plane is landing. What do you think are some of the things we should need to consider?" Oh, my word, those children have the most, like, graphic sense. They're like, "Well, what happens if FedEx explodes and there's fire, fireballs everywhere?"

Liz (00:51:23):

"Well, since you mentioned it, kids, we have special foam that is loaded into our fire engines that is designed just to fight jet fuel fires. But wouldn't it be great if we could use drones to get aerial imagery of the fire and fight it better?" But they were just like "Limbs! Flames!" "What have y'all been watching or reading? Wow, you're not wrong. We do have to think of all of these things, but wow! Please don't tell your parents I'm the one that told y'all about that stuff. <laugh>

EW (00:52:10):

Yes, I remember one of the people who worked for me when I was a manager, went to go talk to a classroom and came back and said, "Well, that didn't go as I expected. Halfway through, they asked me to define what 'homicide' meant." Whoops!

Liz (00:52:34):

But it's where it starts. "Yes, children, get creative. You are only limited by your imagination. Let's do it! Do you just need a check? Or do you need security experts? The researchers who are hacking into the cars? Y'all have seen autonomous vehicles. You've seen the Jetsons." I hope, can people still watch the Jetsons? Does it...?

EW (00:53:08):

I don't know, George was born recently.

Liz (00:53:19):

I love knowing that there are more and more of these organizations out there doing this.

EW (00:53:25):

You personally, you've mentioned some time donated. Do you usually give money or time? What is it like being on an advisory board of a non-profit?

Liz (00:53:36):

I started off, I did put my money where my mouth is. I try to, when I'm making charitable donations, find organizations where the money's going to go to something I'm passionate about. It might be a dog rescue. In this case, Rural Tech Fund. I got introduced to them, in part, through BSidesAugusta. With being on the advisory board, it's not a requirement, I just happened to like to send money every year, and had long told Chris, "Whatever you need. Just point me in the direction. I have passion and energy and resources available. I just need you to channel this energy for good. How do you need this?" Chris just started the advisory board for Rural Tech Fund. With that, it's an ask of, and it just depends on the organization, sometimes just ideas and time. In my case, I've worked with different nonprofits and been part of their fundraising.

Liz (00:54:56):

Also, through my work with Scythe on the business side, I have access to other VC funds and other companies. It's like, "We want to be able to tap into your network and your ideas of how do we structure, how do we grow these programs?" Senior Online Safety is another nonprofit board that I'm on. That one, it really is an occasional thing. It's, "We're gonna do this. Liz, you've helped, from a legal side, set up nonprofits. Point us in the right direction." So it just depends on the skill set.

Liz (00:55:37):

If someone wants to get involved, I highly encourage everyone to look around; there are organizations. They're constantly in need of volunteers, advisors, supporters. People say, "I don't have money. I don't really have time." "But do you have social media? Do you have a network that you can connect them to? Or are you a creative? Maybe you can help them grow their next project, because they can pick your brain. Or that $20 that you donate, which is also a tax write-off for you, can grow and support some stuff. You'd be surprised. Do you have two hours of time on a Saturday? Because they're running a booth at a local cybersecurity conference and you can help them do this." There are many ways to get involved, and it doesn't always require one thing or the other.

EW (00:56:44):

You mentioned Chris, could you say his name again?

Liz (00:56:48):

Certainly, Chris Sanders. He started Rural Tech Fund, ruraltechfund.org, or if you're on the Twitters, @ruraltechfund. Check it out. He started this as a passion project, being from a rural area and growing up without perhaps all the access that you might have if you were in a major city.

EW (00:57:17):

I just wanted to make sure that we identified the person, because usually when there is a Chris mentioned, everyone believes it's my Chris.

CW (00:57:25):

No, they believe it's some other person Chris.

Liz (00:57:26):

Ah.

EW (00:57:27):

My Chris does not work for iRobot and does not run the Rural Tech Fund.

CW (00:57:32):

No, I'm far too lazy to do all those things.

EW (00:57:35):

I have a couple listener questions. One of them is from Nick, "Are there any ways that privacy legislation can get ahead of advances in technology, and its uses in new spaces, rather than constantly trying to play catch-up?" (WarGames?!) "For example, limits on facial recognition tech by law enforcement."

Liz (00:58:00):

Unfortunately, part of that is already the cat's out of the bag, the horse is out of the barn, whatever the sayings are. Really, it's raising awareness at this point. You do have some states that are doing a great job. Do we need national legislation? Yes. But the other downside is it's not just- Think of TikTok. That's not a US company. We can put all the restrictions we want, and they may or may not follow. When it comes to local law enforcement, part of it is education and helping them understand. This goes back to some of the engineering stuff of, "Just because you can build it, do you really need to build it?"

Liz (00:58:53):

That was one of the things we worked on, and really tried to focus in on, with smart city stuff. When I would sit in the meetings and we would get the request of, "We want to do this." I'm like, "All right. Do you know what else is gonna happen with that? Do you know, now we're gonna have all this data, we're gonna become even more of a target-rich environment, and they're gonna come after it. And the opportunities for misuse. Are you ready? Are you willing to have all this? Do you understand?" It was like, "Oh no, I just wanted to make this part of my job easier." "Let's find a different way then." Having those conversations and bridging that of, "Do you understand how this technology is gonna be used and is this what you really want? And is this what we really need to get the job done?"

EW (00:59:56):

One more question from Nick, "What should I look out for, in the services' terms and conditions, if I want to protect my privacy and data?" I'm gonna add on that, "How much do I believe those terms and conditions?"

Liz (01:00:13):

How often do you actually read all through them? Are they accurate? How much time and thought has been put into them? When you're talking about these big corporations, that's what we see with Instagram, where they just got hammered, over the terms and conditions. I mean, you don't get to negotiate them. They are what they are. Are you just not going to use it? And I do know some people who don't use certain devices or don't use certain things, because they're like, "Wow, I've just signed away everything. This is a crock." When you're looking at that, it really boils down to, some of the states and some of the policy decisions that are getting made are actually holding them accountable. But again, when we're talking about billion dollar companies, how much does a, what, $65 million judgment really do, other than raise awareness and remind everyone, "These companies, their terms and conditions say, or in some cases don't say, what they're actually doing."

EW (01:01:30):

Exactly. It goes back to, I don't really believe them. And every single one of them says, "We may change this at any time." And I don't know how many times I've given away my firstborn child.

Liz (01:01:46):

There are days that most people I know would gladly give away their firstborn child. <laugh>

EW (01:01:51):

I don't have any.

CW (01:01:53):

That's why! They've all been given away.

Liz (01:01:59):

You go through, and some of it is not necessarily an evil intent. It's just the byproduct of, "We didn't realize that this would tie into this, to tie into this, to tie into this." It wasn't the design, or that just wasn't the focus. That's why really the conversation, and I think where change is gonna come from, is building in that privacy by design. So that privacy is no longer an afterthought, that we've moved it ahead on the prioritization list, and where the default isn't, "Oh, we're gonna put this little tracker here and this little cookie here." No, let's work on de-identifying that information by design and by default. That is where we go first. If we are tracking people across apps, across devices, across different things, that was an intentional choice by the company, by the engineers, by everyone involved. Instead, not only do you have to opt out, but you have to jump through so many hoops. At that point, is it really effective? So it's kind of flipping the script.

EW (01:03:27):

You said, "Privacy by design and de-identification by default?"

Liz (01:03:33):

By default. Again, let's build the privacy into the initial designs. When we design how a system's gonna work or what the software's gonna do, that privacy isn't an afterthought. How are we protecting this information? How are we doing this? Instead of the default being, let's use these cookies, trackers, different identifiers. Instead, let's build it so that the default is that it doesn't track or identify. Instead of having to opt out, you have to opt in.
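
(A tiny sketch of what "privacy by design, de-identification by default" could look like in code. Everything here is illustrative, not from any product discussed: the stored record keeps a one-way pseudonym instead of the raw identifier, and tracking defaults to off, opt-in rather than opt-out.)

```python
# Sketch: de-identification by default, tracking off by default.
# All names are hypothetical, for illustration only.
import hashlib
import secrets
from dataclasses import dataclass

PEPPER = secrets.token_bytes(16)  # per-deployment secret, never stored with the data

def pseudonymize(identifier: str) -> str:
    """One-way pseudonym: stable for the same input, but the raw
    identifier itself is never written to storage."""
    return hashlib.sha256(PEPPER + identifier.encode()).hexdigest()[:16]

@dataclass
class UserRecord:
    pseudonym: str
    tracking_opt_in: bool = False  # privacy-preserving default: don't track

def create_user(email: str) -> UserRecord:
    # The raw email is used once to derive the pseudonym, then dropped.
    return UserRecord(pseudonym=pseudonymize(email))

user = create_user("alice@example.com")
print(user.pseudonym)        # a stable token, not the email
print(user.tracking_opt_in)  # False until the user explicitly opts in
```

(The point is where the defaults sit: tracking anyone requires an explicit, deliberate change to `tracking_opt_in`, rather than privacy requiring an explicit opt-out.)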

EW (01:04:21):

I wish everything was like that. Yes, that would be nice.

Liz (01:04:26):

Yes. It's really starting to shout it from the rooftop, and raising awareness of that, I think, is the first step. Then you have people, like we saw with some of the researchers that were over at Amazon and Google, who were raising these concerns and raising the alarm about this. Again, it is a utopian idea, but one that I think, if we shout loudly enough about and then start implementing, we will see some movement on the needle.

EW (01:05:07):

Do you have any questions Christopher?

CW (01:05:08):

I did have one and I like to ask it of people whose careers seem cool. Do you have advice for people who want to get into the InfoSec world and maybe cross that with the legal world? Who are in college or are thinking about their careers early on?

Liz (01:05:28):

Yeah. Granted, I fell backwards into it. If you had told me 20 years ago, I would be sitting here talking about stuff with y'all now, working at a cybersecurity startup, I would've been like, "You're obviously day-drinking, and whatever it is, it's amazing, share it with the rest of us." It's being willing to ask, follow the curiosity, persist and pivot. If you see something that strikes your fancy and you want to do, go for it and keep going for it. If an interesting opportunity comes along-

Liz (01:06:19):

I mean, I worked in capital markets, commercial mortgage-backed securities, right before the crash. The comment from my father was, after we crashed world economies with CMBS (commercial mortgage-backed securities), he's like, "Well, I finally understand what you do." <laugh> "Because Wall Street Journal says y'all are now responsible for this." That was the moment I was able to take the side stuff that I was doing and say, "Well, my main gig, area of practice, just crashed. Maybe I start doing the stuff that I think is fun." So don't be afraid to follow that and do the fun. Do the stuff that you find curiosity and sparks of joy in.

CW (01:07:09):

Cool.

EW (01:07:11):

Liz, it's been good to talk to you. Do you have any thoughts you'd like to leave us with?

Liz (01:07:15):

Thank you so much. I appreciate and have enjoyed the conversation and cannot shout it enough. First of all, I'd be remiss if I didn't say, "Thank you, Scythe. Thank you to all of our customers, et cetera, et cetera, Rural Tech Fund." Cannot shout it from the rooftop enough, especially with an audience of engineers, privacy by design, de-identification by default, in what you're doing at work, as well as what you're doing as a consumer and a user of different technologies.

EW (01:07:52):

Our guest has been Liz Wharton, Vice President of Operations at Scythe. She is also on the advisory board of the Rural Tech Fund. And, of course, links will be in the show notes.

CW (01:08:03):

Thanks Liz.

Liz (01:08:05):

Thank you.

EW (01:08:07):

Thank you to Christopher for producing and co-hosting. Thank you to our Patreon listener Slack group for questions. And, of course, thank you for listening. You can always contact us at show@embedded.fm or hit the contact link on embedded.fm.

EW (01:08:20):

Now a quote to leave you with. I was gonna do WarGames, but I think Calvin and Hobbes is the way to go, from Bill Watterson. Mrs. Wormwood says, "Calvin, can you tell us what Lewis and Clark did?" Calvin says, "No, but I can recite the secret superhero origin of each member of Captain Napalm's Thermonuclear League of Liberty." Mrs. Wormwood, "See me after class, Calvin." Calvin, "I'm not dumb, I just have a command of thoroughly useless information."