460: I Don’t Care What Your Math Says
Transcript from 460: I Don’t Care What Your Math Says with Greg Wilson and Elecia White.
EW (00:00:07):
Welcome to Embedded. I am Elecia White. On my own this week, so my thoughts will be drifting around like dandelion seeds. Our guest this week is Greg Wilson. We will be talking about teaching and learning and technology. Maybe. Greg, thank you for being on the show with us. With me, with us.
GW (00:00:31):
Thank you very much for having me, whether you are one or many.
EW (00:00:35):
Could you tell us about yourself, as if we met at an embedded systems conference?
GW (00:00:42):
Sure. I actually started my career as an electrical engineer back in the early 1980s. I still have a scar on my right hand from picking up a soldering iron the wrong way around twice in one afternoon. The second time I had my hand under the cold water tap, the lab tech came over and put his arm around my shoulders and actually did say, "Son, have you considered a career in software?"
EW (00:01:06):
<laugh>
GW (00:01:06):
I think it was self-interest on his part, because the very next semester I would have had to do the course on power transmission lines with 50,000 volt transformers and so forth. I honestly think he encouraged me to go into software, to save himself the paperwork.
EW (00:01:30):
<laugh> And has that worked out for you?
GW (00:01:32):
I think it is too early to tell. It has only been 40 years. I have worked as a programmer at everything from startups to IBM and Hewlett-Packard, and at the US Department of Energy. I have exited academia three times. I was a professor for three and a half years.
(00:01:52):
I build software for fun, but mostly for my own use. I think it has worked out, but I still do not know if I have had a career in the traditional sense. And I am still trying to figure out what theme, if any, binds together all of the things I have been doing, other than the fact that apparently I really do like the sound of my own voice.
EW (00:02:28):
That actually leads us to the first of the lightning round questions, where I will ask you short questions and I want short answers. And if I am behaving myself, I will not ask for more detail until later. Are you ready?
GW (00:02:40):
I am ready.
EW (00:02:41):
Author, manager, engineer, professor or teacher?
GW (00:02:46):
I do not think you can be an engineer or a manager, unless you are a teacher.
EW (00:02:52):
That is still not an answer.
GW (00:02:55):
All of the above, but the one I like most is the teaching.
EW (00:02:59):
Reading or writing?
GW (00:03:02):
Reading, because I do not actually enjoy writing. I like having written, which is a very different thing.
EW (00:03:09):
Yes. <laugh> Oh yes. Articles or books?
GW (00:03:14):
Either, or. Different audiences, different purposes.
EW (00:03:17):
Academia or industry?
GW (00:03:19):
Industry.
EW (00:03:21):
Favorite way to learn new things? Lectures, reading, trying?
GW (00:03:28):
Working with somebody who is also trying to learn it at the same time.
EW (00:03:34):
Complete one project or start a dozen?
GW (00:03:35):
<laugh> Um.
EW (00:03:35):
That is the response of someone who is at "start a dozen." That is always the response.
GW (00:03:45):
Yeah. I actually have a "to-don't" list.
EW (00:03:48):
<laugh>
GW (00:03:48):
One of the things Dave Dockendorf taught me in my second industry job was that as well as having a list of things you are supposed to do, you write down a list of things that you would like to do, that would be useful, that would contribute to society, and that you are not going to start, because life is short and you have got too many things on the go. I actually have a written to-don't list, and I recommend everybody else get one too.
EW (00:04:14):
I could see you putting some stuff on a to-don't list.
GW (00:04:16):
Mm-hmm.
EW (00:04:16):
Do you have a favorite fictional robot?
GW (00:04:22):
That is an interesting question. None comes to mind.
EW (00:04:28):
All right. It is funny, because one of our guests answered the next one with an important thing you have already discussed, but do you have a tip everyone should know?
GW (00:04:40):
Other than the to-don't list?
EW (00:04:42):
I mean, that is a good one.
GW (00:04:44):
When you are being interviewed, always have a cup of tea or a glass of water or something handy. Whenever you are asked a complicated question, pause and take a sip. Those three seconds might save your career.
EW (00:04:56):
Oh yeah, that is a good one.
GW (00:04:58):
My wife would tell you that if I had learned to pause before answering questions, I would have been married several years sooner than I was. But we do not have time to go into that on the podcast.
EW (00:05:09):
<laugh> One of our past guests said, "Never catch a falling soldering iron."
GW (00:05:13):
I wish I had known that 42 years ago.
EW (00:05:18):
<laugh> Okay, so I have this outline. I have all these questions for you. You have done amazing things, written several books, co-written several books, and you won an Influential Educator Award. I have a bunch of questions about that, but then you sent me this link about a project. It was sort of a request that somebody do this. Could you talk to me about "The Cost of Change" project?
GW (00:05:49):
Sure. For about 11 or 12 years now, sporadically, I have been helping out with a project called "It Will Never Work in Theory." The name comes from the saying that, "It will work in practice, but it will never work in theory."
EW (00:06:04):
Where did that come from? I mean, it is hilarious.
GW (00:06:08):
Engineers have been saying this to mathematicians, for as long as there have been engineers and mathematicians. Right? "I do not care what your math says. I have actually built it, and we are riding in it right now. So fix your theory."
(00:06:22):
Historically it is interesting that in almost all cases, practice precedes theory. The number of cases in which mathematicians and scientists come up with something or prove something, that is then translated into practice is surprisingly small. In most cases, from radio to aircraft to drugs, we find something that works, and then we go and figure out after the fact why it works.
(00:06:51):
But not wanting to get off track here. Like most of your listeners, I learned software engineering as a craft, by sitting beside other people who showed me what they did, or at least did not stop me from watching, which is not quite the same thing. I picked it up in bits and pieces, in the way that stonemasons learned how to make blocks in Egyptian times or Roman times.
(00:07:21):
It was not until the late nineties that I discovered that there were actually people studying programs and programmers in empirical ways. Steve McConnell's book "Rapid Development" and Bob Glass's "Facts and Fallacies of Software Engineering" pointed me at a literature where people had actually gone and watched programmers, recorded what they did, and seen what worked and what did not.
(00:07:52):
Open source, and particularly things like SourceForge and later GitHub and Stack Overflow, led to an explosion in that work because so much more data suddenly became available. So by the early 2010s, we knew a whole lot about what works and what does not.
(00:08:11):
For example, you will find people in industry who have very strong opinions about whether strong typing in programming languages is worthwhile or not. All right, how do you tell? Well, one experiment that was done was to go back and look at several hundred bugs in medium-sized JavaScript projects on GitHub, and ask: if you had added the strong typing of TypeScript, how many of those bugs would it have caught?
(00:08:41):
The answer is about 15%. All right. 15% is about one in seven. That does not sound like a lot, but on the other hand, 15% is sales tax in Ontario. And if you told businesses here they could stop paying sales tax, they would think you were a genius. What it does is give us a handle on the kind of scale that we can expect from adopting strong typing. Is it a definitive experiment? Absolutely not. Is it proof that we can go and find things out, by applying the scientific method to software development? Absolutely.
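A minimal sketch of the class of bug that experiment counted, transposed from TypeScript-over-JavaScript to Python with type hints. The function, the values, and the checker output shown are illustrative assumptions, not taken from the study itself.

```python
# Sketch: the kind of bug static types catch before the code runs.
# With the annotations below, a checker such as mypy flags the bad call;
# without them, the mistake only surfaces when the line finally executes.

def total_price(quantity: int, unit_price: float) -> float:
    return quantity * unit_price

# A value scraped from a web form arrives as a string. Running this raises
# TypeError at runtime; mypy reports it at check time instead:
#   error: Argument 1 to "total_price" has incompatible type "str"; expected "int"
total_price("3", 19.99)
```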
(00:09:18):
Catherine Hicks and her group at Pluralsight have been looking at what actually affects developer thriving. What actually makes developers productive and secure and happy, and what does not? It turns out there are a lot of myths out there, that are pretty easy to disprove. The problem is most programmers have never done science.
(00:09:45):
When I was teaching computer science at the University of Toronto, I did a bit of digging. At the time, for an undergraduate biology program to be certified, students had to spend an average of six hours a week in the lab over four years. Right? They have to learn how to actually do science, set up the experiment, collect the data, do the analysis, and so forth. So by the end of their degree, depending on what path they have taken, they have done anywhere from 20 to 40 or even 50 real experiments, so that they learn how this works, how to do the science.
(00:10:22):
The average undergrad in computer science did one experiment in four years, and that was if they took the human-computer interaction course. So they come out of undergrad not knowing what science looks like. Yeah, they did science courses in high school, but they have never actually set up and run an experiment, collected data, analyzed it.
(00:10:43):
So they do not expect to have any of that when they go to industry. So they are not looking to the research to say, "What do we actually know?" This leads us down into a lot of quagmires. You will find a lot of people in industry who swear by test-driven development. They think it makes them more productive and that it leads to better code. Every well-done study of TDD has found no effect. It does not make a difference.
EW (00:11:17):
But.
GW (00:11:19):
The best one that I have seen was by Davide Fucci and colleagues, published in 2016. It was a replication study of work that had been done earlier in lab settings. They are using professional developers, they are looking at it over the course of months. So all of the spurious arguments the programmers come up with, where they say, "Well, of course if you are only looking at students for an afternoon, you are not going to see a signal"? None of those apply, and Davide is meticulous about his statistics. There is no signal there.
(00:11:45):
One hypothesis which has not yet been tested by a follow-up study, is that, yeah, there are differences in productivity. There are actually very significant differences in productivity, but what it seems to depend on is how quickly the programmer alternates between coding and testing, not the order in which they do those two things.
(00:12:08):
In the early 2000s, as Agile started to spread, people switched to a more rapid iteration, a more rapid cycle. Ten minutes of coding, ten minutes of testing, rather than two days of coding, two days of testing. And at the same time adopted TDD. And then attributed whatever increase in productivity they felt they were seeing, to the wrong thing.
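That alternation is, at least in principle, measurable from nothing fancier than a labeled activity log. A minimal sketch with invented data, comparing a ten-minute cycle against a two-day cycle:

```python
# Sketch: how often does a developer switch between coding and testing?
# Each log entry is (activity, duration_in_minutes); the data is invented.

def switches_per_hour(log):
    switches = sum(1 for a, b in zip(log, log[1:]) if a[0] != b[0])
    hours = sum(minutes for _, minutes in log) / 60
    return switches / hours

fast_cycle = [("code", 10), ("test", 10)] * 12  # ten-minute alternation
slow_cycle = [("code", 960), ("test", 960)]     # roughly two days of each

print(f"fast: {switches_per_hour(fast_cycle):.2f} switches/hour")  # ~5.75
print(f"slow: {switches_per_hour(slow_cycle):.2f} switches/hour")  # ~0.03
```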
EW (00:12:36):
I have always thought the benefit of TDD was more in the "How do I test this? How do I prove that it is correct?"
GW (00:12:43):
Right.
EW (00:12:43):
More than in the "I am going to mechanically write a test that fails, and then write code that makes it succeed."
GW (00:12:55):
I would be happy to believe that was true. I used to preach the gospel of TDD myself. The fun thing, and I use the word "fun" very loosely here, is that you and I are both smart people. The people we work with are smart people, and none of us are saying, "I should not be making a claim like that, without at least the same kind of evidence that I would expect a pharmaceutical company to have, if they are selling a new skin cream to treat poison ivy rash."
(00:13:26):
We do not expect in our field, that people will back up claims with evidence. We want anecdotes, and we want a tall man with a deep voice and a confident manner on the stage of a TED Talk, and that is what we take as proof.
(00:13:43):
I think this is fixable. I think our field would start to advance more rapidly, if we who practice were paying attention to the research. And if the researchers were paying more attention to what we already know, and to the problems we have. I am critiquing the practitioners right now. You got to understand, I am equally harsh on the researchers. Most of what they study is completely irrelevant to practicing programmers.
EW (00:14:19):
Yes. So yes.
GW (00:14:20):
Absolutely. But we throw out the good with the bad.
EW (00:14:26):
What about the studies that have been done? Like "Mythical Man-Month." That, okay-
GW (00:14:31):
So that was not a study. Fred Brooks did not collect any data to back up his claim that adding more programmers to a late project makes it later. It was plausible. It is simple to understand, but that does not mean it is right. As my father pointed out to me in a slightly different context many years ago, you go back two generations, and you will find a lot of people who could rather indulgently explain to you why women were simply not suited for positions of power, because they were too temperamental.
(00:15:02):
We are supposed to collect evidence and base our decisions on the evidence, right? Not on what everybody knows, or what everybody is repeating because they heard it from somebody else who also did not have evidence. I think we would advance more rapidly, and we could shed a lot of damaging misconceptions, if we actually turned the scientific method on what we are doing.
(00:15:32):
So the proposal I sent to you was, let us get rid of that undergrad course where the students work in teams and pretend to be developing a product, because I know from firsthand experience that it does not really make much of a difference to them.
(00:15:49):
They learn some tools. They might, for example, learn how to use a continuous integration tool, but a lot of them are picking that up on internships these days. They do not learn how to elicit requirements, or how to run meetings, or how to do the human side of software development, because it is the blind leading the blind. The instructor cannot be in all of their team meetings, and show them how to actually do that side of their job.
(00:16:14):
So let us take that course and put it away. And instead give them the opportunity to go out and collect some data. Go and mine GitHub and look at, for example, whether long methods in JavaScript and Python are more or less likely to be buggy than short methods. Okay, in order to answer that question, you are going to have to build some sort of a statistical model.
(00:16:39):
You are going to have to sharpen the question itself. Are we asking more buggy per line, or more buggy per statement? How are we going to relate bugs back to particular methods? All of a sudden, the students will have to come to grips with the fact that the answer you get depends on the precise question that you ask. But you can still ask meaningful questions, and get useful answers.
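A minimal sketch of that assignment, assuming the students have already mined a repository into one record per method. The file name, the column names, and the 20-line cutoff are invented for illustration.

```python
# Sketch: are long methods buggier than short ones? Assumes methods.csv,
# mined from a repository, with one row per method: its length in lines
# and the number of bug-fix commits that touched it.
import csv
from statistics import mean

with open("methods.csv") as f:
    rows = [(int(r["length_lines"]), int(r["bugfix_commits"]))
            for r in csv.DictReader(f)]

short = [(n, b) for n, b in rows if n <= 20]  # cutoff is arbitrary
long_ = [(n, b) for n, b in rows if n > 20]

# Operationalize the question two ways; the answers can disagree.
print("bug fixes per method:",
      mean(b for _, b in short), "vs", mean(b for _, b in long_))
print("bug fixes per line:  ",
      mean(b / n for n, b in short), "vs", mean(b / n for n, b in long_))
```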
(00:17:02):
Have them go and collect some performance data from the university website, and see what kinds of patterns they are seeing there. So that when they go out into industry, they can start to apply those techniques to the problems they themselves are solving. But also engage in conversations with the researchers, who might be able to answer other bigger questions for them.
(00:17:28):
This is how other sciences have progressed. I have made the suggestion for this course, multiple times in multiple venues over multiple years. And been met with either blank stares, or the embarrassed shuffling of the feet, that one gets when one suggests that a child should eat more broccoli.
(00:17:47):
So it is a dream. I do not think the course would be very hard to put together. I think that programmers who have gone through that, would be better positioned to deal with the kinds of systems and issues that are coming up today, that go beyond simply writing code. But I do not have any evidence for that. I just have a strong personal belief. I wish I had the chance to try that experiment.
EW (00:18:21):
The goal would be for the students to design an experiment, related to software development.
GW (00:18:31):
Absolutely.
EW (00:18:32):
And collect and analyze the data, and then figure out what, if anything, they have proven.
GW (00:18:39):
For example, an experiment that could be done by undergrads, with the resources they have got, the time they have got, and the background knowledge they have got, would be to see how closely different code reviewers align with each other.
(00:18:55):
Get five of your friends. Give them two pages of mediocre Python or JavaScript or whatever your favorite language is. Have them do code reviews. And then collate those and see how well they agreed, where they differed, and why. My experience, not just in my present job but previously, is that three different people will provide three very different code reviews, some of which are more useful than others.
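A minimal sketch of how that agreement could be scored, assuming each reviewer marks each line of the sample as problematic or not. The verdicts are invented; Cohen's kappa is a standard correction for the agreement that would happen by chance.

```python
# Sketch: how well do two reviewers agree on which lines are problematic?
# One verdict per line of the review sample (1 = flagged); data invented.

reviewer_a = [1, 0, 0, 1, 1, 0, 0, 0, 1, 0]
reviewer_b = [1, 0, 1, 1, 0, 0, 0, 0, 1, 1]

def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    # Chance agreement: both flag a line, plus both leave it alone.
    pa, pb = sum(a) / n, sum(b) / n
    expected = pa * pb + (1 - pa) * (1 - pb)
    return (observed - expected) / (1 - expected)

print(f"kappa = {cohens_kappa(reviewer_a, reviewer_b):.2f}")  # 0.40 here
```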
EW (00:19:26):
I will provide three different code reviews, depending on what you ask me to look for.
GW (00:19:31):
Absolutely. Now, in the early 1980s, there was an experiment done. I want to say Johns Hopkins, but I am probably wrong, where they took X-ray images, chest X-rays and put them in front of different radiologists and said, "Do you see a tumor or not?" The answer was basically a coin toss. There was almost no consistency. The only thing that seemed to matter, was which school the radiologist had gone to, because different schools and different courses were training them in different ways.
(00:20:04):
The response of the radiology community was to say, "Let us put together a canonical set of examples, and come to some sort of consensus about what you are supposed to see here. It does not mean this is the right answer. It means that now we have got something we can drive. Now we have got enough consistency that we can start to steer this car. If everybody is all over the map, we are herding cats." Or in this case, herding radiologists.
(00:20:30):
From your work in embedded systems, you expect the devices that you release into the wild to meet certain standards. We can argue about whether those are the right standards, but we have already accepted that there are standards, and that there are standards bodies and somebody is checking this. So that if we want to make things better, we have got a knob that we can go and try to turn. That seems like a better world.
EW (00:21:06):
Sure, there are FCC standards, and power standards, and there are security standards that always seem to be coming soon instead of being here.
GW (00:21:22):
<laugh> Yep.
EW (00:21:23):
Medical has its own special set of standards, as do other safety critical systems. But when you are talking about consumers and privacy, that is all kind of left up to the developer.
GW (00:21:41):
At the moment, yes. I think we would all be happier, if there was a minimum basic standard that we could rely on for that. And if the enforcement of that standard had some teeth. For example, my grandmother lived through the period from the mid 1890s until the early 1930s, when the Western world decided that there were actually going to be standards for medicine.
(00:22:17):
Prior to the 1890s you could sell whatever you wanted, and label it however you wanted. That was going badly enough that, against fierce opposition and not just from the pharmaceutical industry, we decided as a society that there ought to be some rules here. Can I ask you, do you know why mechanical engineering exists as a discipline?
EW (00:22:46):
I assume so bridges will not fall down.
GW (00:22:49):
In the 1870s, the steam boilers on locomotives in the United States were blowing up with such regularity that people were not riding trains. And so the men who owned and operated the rail networks, people we would today call billionaires or oligarchs, went and said, "We are going to create a standard college program to train people, who will then be the only ones allowed to design these boilers. So that they stop blowing up, so that people will ride our trains again."
(00:23:23):
And 30 years later in Germany, they invented the profession of chemical engineer, by explicitly emulating the invention of mechanical engineering. They said, "It is in our own interests to have a floor, to have a lower bar that everybody has to get over. And to do it in a public fashion, so that people have confidence in what we are producing."
EW (00:23:51):
But does this standard- With software, is it MISRA? Is it compiler warnings?
GW (00:24:01):
I do not know.
EW (00:24:02):
I know IEEE had this exam that they wanted to push on everyone.
GW (00:24:07):
Yeah. Yep. I do not know if you have ever been active in politics.
EW (00:24:14):
I have not, because I hate them with a deep abiding passion.
GW (00:24:18):
Okay. Well, I have been involved in several political campaigns over the years. There is a saying that I think goes all the way back to Saul Alinsky, which is that "Nothing gets done in the United States, until you have got a busload of dead children." There has to be- The plane has to fall out of the sky, before there will be the public will to enforce safety standards.
(00:24:44):
I am too young to remember Ralph Nader needing bodyguards, when he started to complain about the fact that some automobiles made by US manufacturers would blow up if they got rear-ended. But these days we have won that fight. We no longer fight about whether there should be safety standards for automobiles. We fight about what they should be.
(00:25:08):
I do not think I know enough. I do not think anybody knows enough yet, to decide exactly what the standards should be. I think people like Lawrence Lessig have made some very interesting proposals around actionable standards for data privacy.
(00:25:23):
I think of Cory Doctorow's recent talk, where he talks about the enshittification of the internet, the way that things start so well and always seem to go downhill from there. He points at Google Search, for example, and puts forward some reasonable proposals to try to stop that from happening and to try to roll it back.
(00:25:49):
I think that we are going to need more, I hate to say this, I hate to be cynical about human nature, but I think we need disasters closer to home. It is clear, for example, at this point that Facebook played a major role in the genocidal slaughter of the Rohingya in Myanmar. That they could have stepped in to squelch the sites that were stirring up that kind of hate, and they chose not to because: engagement numbers. This is not a personal opinion, there have been lots of hearings about this. The company has sort of acknowledged complicity in this. But Myanmar is a long way away, at least from me and probably from you.
(00:26:32):
If something like that happens in the 2024 election cycle, if we see something like January 6th, but ten times or a hundred times as large, and it has very clearly been fueled by irresponsible behavior by our social media, or if we see an enormous data breach that directly affects the 1% and our elected representatives, I suspect we will suddenly start to see these laws being made. I suspect a lot of them will be bad laws, because they will have been made in haste.
EW (00:27:13):
And by people who do not understand the underlying technology and its constraints.
GW (00:27:19):
Absolutely. Or, as we are seeing with Sam Altman and co going and testifying in front of Congress, the lawmakers will turn to people who absolutely do understand the technology, and how much money they can make from it, and who want to perform an act of regulatory capture. They want to make sure that the foxes are designing the henhouse.
(00:27:40):
There is a history of that, with pharma companies trying to influence safety legislation, and with oil companies essentially writing the bills around environmental protection. I would hate to see, in a field like AI, a handful of large companies hoping to make trillions of dollars be the ones to dictate to us what they are allowed to mine, how they will be checked for bias, what they have to tell us about the data they are collecting, and so forth.
(00:28:10):
We have wandered a long way away from the idea of teaching empirical methods to undergrads in computer science, and I apologize for that. There is an xkcd cartoon from many years ago, of somebody up on stage giving a speech, and in the audience somebody is holding up a sign saying "citation needed," the Wikipedian Protester. I want more of that in our industry. I want more people saying, "What is the evidence? How did you collect it? How reliable is it? Why are you trying to convince me that this is true?"
EW (00:28:47):
About the processes, about how we learn things, how we teach things?
GW (00:28:55):
Mm-hmm. About how we develop software. Oh, about everything.
EW (00:29:03):
I do not have time to read all of those.
GW (00:29:06):
I beg your pardon?
EW (00:29:07):
I do not have time to read all of those.
GW (00:29:08):
<laugh>
EW (00:29:12):
I read more technical books than most people I know. I know only maybe two or three people who read more technical books. I do not tend to read journals at all, and I know people who do.
GW (00:29:29):
Absolutely.
EW (00:29:30):
I try to stay up to date in my field. I try to learn new things. Often it is along my interests, like learning more about Python. And it is not in the area of what I would call "squishier things."
GW (00:29:45):
Okay.
EW (00:29:45):
Things that could easily revert to the norm, experimental things that have low R values.
GW (00:30:01):
Okay, so very few doctors read the research literature. What they do read is the summaries that are put together for them by the Canadian Medical Association here in Canada, and by equivalent organizations in other countries, to say, "Here is what we have learned in the last 12 months about hypertension that you need to know, and here are the citations. If you are curious, if you do not believe this, or if you think you have got an oddball case and you really want to follow it up, or if you have somebody who is an outlier that you think might be worth further study, here is a pointer to the paper. Reach out to the scientist."
(00:30:46):
Whatever it is, there is less of a gap in some fields between the people who are studying the thing and the people who are building, than there is in our field. And I think that is what I would like to see.
EW (00:31:05):
Is that because our field is so large? I mean, I can barely talk to cloud people, sometimes.
GW (00:31:12):
I am willing to bet that the medical profession is both larger and more diverse than the software profession.
EW (00:31:18):
I agree.
GW (00:31:20):
And absolutely there is a lot of tension within that, and varying standards. The people who do cardiology are usually not polite about the people doing psychotherapy. Absolutely. Different standards of evidence, because you are dealing with an entirely different category of problem. But that does not mean that progress is impossible.
(00:31:47):
I think we know a lot more about public health, than we did 50 years ago. I think we know a lot more about genetic influences on disease, than we did 50 years ago. I know that we know a lot more about batteries and solar cells than we did even 20 years ago. Do you think our understanding of how people build software has progressed to anything like the same extent in your working lifetime?
EW (00:32:17):
It has improved. It has improved significantly. I mean, version control. My EE uses version control and he likes it. And bug tracking. Maybe we have not proven that those are important, but let me tell you, I am never living without version control again. If I could version control my whole life, I would.
GW (00:32:40):
Now let us talk about the human aspects of it. Do you think that programmers today are able to produce more or better code in a fixed unit of time, than they were 40 years ago? I would argue not. Do you think we are any better at eliciting requirements, and making sure that the thing we are building is the thing we were supposed to build? I would argue not.
EW (00:33:07):
In some industries. I would argue yes, for both of those.
GW (00:33:10):
Okay.
EW (00:33:12):
There are huge libraries of code that are accepted as good, and you can build upon those. You do not have to reinvent the wheel.
GW (00:33:20):
Excellent.
EW (00:33:20):
There are ways of wireframing your application, that are so much simpler than going in and coding them. And those are accepted processes that are part of building things.
GW (00:33:32):
This is wonderful. You and I disagree in an empirically testable way.
EW (00:33:36):
Yes.
GW (00:33:36):
Okay.
EW (00:33:39):
Now somebody get on testing it.
GW (00:33:41):
And that is what I would like. But our profession does not take that next step. If I tell you that high salt intake increases the likelihood of heart attack in men over 65, and you say, "No, it does not," we both accept that there is an answer to that question. The answer is probably, "It is more complicated than that," but there is an answer to the question and that answer is findable.
(00:34:06):
I would like us to be in that place. I would like to be wrong less often, or at least I would like to be wrong about more interesting things.
EW (00:34:17):
Are there digests, journals that- The ones that I know about seem very biased towards making their own profits. The ones that I do see often- I do not want to spend 40 bucks, if the article is going to be stupid.
GW (00:34:37):
Oh yeah. We started neverworkintheory.org to try to close this gap, to try to provide the kind of potted summary of recent research, that a practitioner would actually read. It mostly has not worked. There is not a large enough receptive audience of practitioners, who believe that the research might be useful.
(00:35:06):
And equally, most people doing the research are pursuing a problem, that grew out of a problem, that grew out of a problem, that grew out of a problem that was interesting to somebody three academic generations ago. Or, as we have seen over the last ten months, they are chasing taillights. A good third of the papers in software engineering that are getting posted to preprint servers like arXiv.org right now are basically, "I threw ChatGPT at the wall and some of it stuck."
EW (00:35:40):
Uhh, burf!
GW (00:35:40):
Now, on the one hand, I believe people like Jon Udell, Simon Willison and Ian Bicking, when they say that this is going to change programming at least as much as Stack Overflow did. That it is going to be a game changer. On the other hand, I do not think that grabbing a bunch of source code, running it through one of these tools, and doing a t-test is moving anybody forward, and it is an embarrassing distraction.
EW (00:36:19):
That seems testable, and I know which way I would bet.
GW (00:36:25):
<laugh> Well-
EW (00:36:26):
That is what we need. What we need is really a gambling system. I say we should use version control. Nobody is going to give me bad odds on that. But somebody will say, "Okay, I disagree." And they will go out and they will test it and we will figure it out. Maybe some of the other things we disagree about, we will give odds and people will put on which side they are on. And even that voting, interested voting, leads to some interesting data about how things work now, versus how they should work.
GW (00:37:10):
You make many interesting comments. One of them is the idea that nobody is going to argue against version control. I have actually proposed that we should go and run a large-scale controlled study, to see whether version control is worthwhile or not. For the same reason that years ago, when I was an engineer, the first thing I would do with a voltmeter was to hook it up to a known voltage source and make sure that the voltmeter was properly calibrated.
(00:37:36):
If my study method comes back and says, "There is no benefit to version control," I am going to believe that my study method is wrong. I need to validate my instruments, because I cannot apply them to more complicated things like test-driven development or a long list of other things, until I have got confidence in my voltmeter and my thermometer and my other tools.
(00:38:03):
I have tried three times in two different countries, to get funding to go and do a study to see whether or not version control actually makes things better. And three different research agencies have said it is not worth doing.
EW (00:38:18):
Well, I agree it is worth doing, because you have to calibrate your questions. When you say, "Is software development better?" that is not an answerable question.
GW (00:38:29):
Exactly.
EW (00:38:29):
Are there fewer bugs? Are there fewer late worked hours? Are there...
GW (00:38:36):
What do we mean by-
EW (00:38:37):
Did we complete sooner?
GW (00:38:39):
Yes. And this is it. How do you operationalize that vague question? Right?
EW (00:38:46):
And so doing version control, is not about testing version control. It is about testing your methodology.
GW (00:38:52):
Right.
EW (00:38:52):
Yeah.
GW (00:38:52):
And it is good practice for seeing how you translate a vague but comprehensible question like "Does version control make things better?" into a specific experiment, that is probably answering something much more precise and much less general. Does having friends make you live longer? There is this claim these days that having lots of friends increases lifespan. It is plausible, it is being challenged, and it turns out that one of the reasons there is disagreement is that people do not agree on what exactly they mean by "friends."
EW (00:39:30):
Yes.
GW (00:39:32):
Okay. So if I am measuring that differently, of course I am going to get different answers. So now let us go and do what scientists and engineers have been doing for centuries, which is refine the question, refine the instruments, learn more about what it was we were asking in the first place.
EW (00:39:50):
Measure in the volume of tears from the developers.
GW (00:39:55):
<laugh> Yeah. Sure. You have seen the xkcd code review cartoon, right? WTF per minute.
EW (00:40:04):
Yes.
GW (00:40:05):
Right. Okay. Sure. I think that is valid. I think it is a valid measure of code quality. How quickly can the next person get to the point where they can make a change? If I say that as a sentence, it makes sense. Now you think about the hundred things we would have to do, to actually do that study. And the hundred other questions that we would not be answering, because we narrowed our focus.
(00:40:32):
I do not think this is ever going to happen, at least not in my working lifetime. There is no appetite for doing this, even among the researchers who are doing this kind of research. I know because I have asked. There is certainly no interest in this in companies. I know because I have asked.
(00:40:48):
We are at the state that medicine was in prior to the 1920s. I do not know if you know this, but in 1920, none of the entrants to medical school at Harvard had a science background. Medicine was something that gentlemen did. It was not seen as applied science, and it took a generation to change that. Lewis Thomas has a really good book called "The Youngest Science," in which he talks about how it is that medicine came to see itself as a branch of applied science. I think we are all better off for it. But I am 60 years old. I do not have a generation to wait. I do apologize if I have been going on too far and too long about this one.
EW (00:41:32):
Oh, we are totally off topic, and we are going to have to do like a speedrun through your career, but that is okay.
GW (00:41:37):
Okay. Let us do the speedrun.
EW (00:41:39):
Well first, a little bit more about It Will Never Work in Theory. There are talks from April 2023, "Emotion awareness in software engineering." Is that emotion of the person, emotion of the computer, or emotion of the developer?
GW (00:42:01):
The first and the third, not the emotion of the computer. You asked me earlier if I had a favorite robot, and I just do not think I am there yet.
EW (00:42:10):
There is also "How novice testers perceive and perform unit testing."
GW (00:42:15):
Yep, and how that differs from the way that professionals in industry do it. This is, I think, a useful prequel to how do we close that gap? How do we get young programmers to think the way we want them to, about the actual purpose of testing, and how to go about doing testing well?
EW (00:42:34):
And there is a talk here, "Crafting strong identifier naming practices." And this is going to be science-based?
GW (00:42:41):
Sure, because it is actually pretty straightforward to study how recognizable different naming conventions in code are. For example, camelCase is harder for people to read, if English is not their first language.
EW (00:42:56):
Okay.
GW (00:42:56):
If you think, for example, of somebody coming from a non-alphabetic language like Chinese, there is the notion that the capitalization of a letter matters, and then we hit acronyms, and then we hit all of the other cases. There is some evidence that pothole_case is actually easier for non-native speakers of English to read, because it does not require as much implicit knowledge.
(00:43:23):
Now, what about short identifiers versus long identifiers? Well, it turns out that that relates to the scope of the variable. It is perfectly okay to say "for i = 0; i < n" in a three line loop. It becomes more problematic as the number of lines over which that variable is in scope increases, which might seem obvious in retrospect.
(00:43:48):
But as far as I know, nobody writing programming books had actually said that, before this team in Ireland did the study. I would have to go and look up the names and the dates for the study. I apologize, I do not have it at my fingertips. But when you think about your favorite linting tools, they are not as nuanced as that.
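A minimal sketch of the kind of scope-aware rule Greg is describing, using Python's standard ast module. The five-line threshold is invented, and a real linter would resolve scopes properly rather than grouping uses by name.

```python
# Sketch: a one-letter name is fine in a three-line loop, suspect when its
# uses span many lines. Groups by name rather than true scope, for brevity.
import ast

SPAN_LIMIT = 5  # invented threshold: max lines a one-letter name may span

def check(source):
    lines_used = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Name) and len(node.id) == 1:
            lines_used.setdefault(node.id, []).append(node.lineno)
    for name, lines in sorted(lines_used.items()):
        span = max(lines) - min(lines) + 1
        if span > SPAN_LIMIT:
            print(f"'{name}' spans {span} lines; consider a longer name")

check("""
t = 0
for i in range(3):
    t += i
# ...imagine thirty more lines of unrelated work here...



print(t)
""")  # flags 't' (it spans 8 lines) but not the loop counter 'i'
```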
(00:44:07):
So we can go back and treat- I think code is a user interface. I think everything we know about HCI can be applied to source code. And everything we know about the readability of text, can be applied to source code. And we actually know a lot about that. Could we use that to design a more readable programming language?
(00:44:27):
Andy Stefik and his colleagues have a lot of evidence to show that we can. They actually A/B test every new feature of Quorum, the syntax of every new feature, before adding it to the language. They compare different possible syntaxes, to see which is going to be most comprehensible to novice programmers.
(00:44:45):
Do you not think your life would be better, if that kind of empirical HCI-style approach had been used for C++? The question of course is unanswerable, because if we had done that, we would not have built C++.
EW (00:45:01):
All right. Yes. All right. And there are years of these lectures.
GW (00:45:05):
Yep.
EW (00:45:05):
And they go back...
GW (00:45:10):
We have only done the live ones for, oh, a year and a half. We did three sets. So spring 2022, fall 2022, which was co-located with Strange Loop, and then spring 2023. Before that we were just doing written paper reviews.
EW (00:45:28):
"Negotiation and padding in software project estimates." And that is going to be science-based?
GW (00:45:34):
Absolutely. Because you can go and you can interview the developers and you can say, "Okay, what is your real estimate? Now, what did you put in front of management? Now talk to me about those differences."
EW (00:45:48):
<laugh> Well, you see, I multiplied by four and then I added two days, because I knew whatever it was-
GW (00:45:56):
Ahh. So that team, it is a Brazilian team, found that there were a significant number of cases where the developers would deliberately lowball their estimate.
EW (00:46:05):
What?
GW (00:46:06):
Because if I tell you how long it is going to take-
EW (00:46:08):
Oh, that is true.
GW (00:46:08):
You are going to say, "No."
EW (00:46:10):
No, yeah, that thing happens, too.
GW (00:46:10):
If I tell you it is only two days, and then once I am stuck in, I look at you with puppy dog eyes and say, "Well, we cannot quit now." Right? If I know it is going to be a month, and it is critical refactoring, and I know that the business is not going to let me take a month, I am going to lie. For the good of the company, and the good of my soul. Please do not tell my boss I have done this recently.
EW (00:46:33):
Wow!
GW (00:46:34):
Well, but, how many times have you said to a child, "We are almost there."
EW (00:46:41):
Never. I do not have kids, but I do understand the idea <laugh>.
GW (00:46:46):
Right. I am okay lying to other people's children as well as my own, so maybe I am an outlier. But this is fascinating. Here is where we get into the fact that not all rigorous empirical research has to be statistical. Qualitative methods can get us answers that we cannot get through mere numbers, if it is done right. The case study approach can in fact uncover a lot of really interesting actionable insight, as long as it is done carefully. I think one of the reasons that engineers like me and people in related disciplines, do not believe that, is that we have never seen it done properly.
EW (00:47:41):
Oh, no. I have never seen it done properly.
GW (00:47:44):
Have you worked with product managers?
EW (00:47:46):
Yes.
GW (00:47:47):
Okay. Do you believe that they are actually getting the right answers? Good insights, some of the time?
EW (00:47:55):
Can you take my silence as a lie for "Yes"?
GW (00:47:59):
<laugh> Okay. I am working right now with an absolutely outstanding product manager here in Toronto. She goes and she interviews the biologists working in the lab, and then she goes and consolidates what she has found, and takes it back to them. After a couple rounds of that, she can say, "Here are the problems we actually need to solve, and here are the things that would constitute solutions."
(00:48:21):
It is more of a case study style, than collecting statistics on how many visitors did the website have, how long did they stay on which pages, and so forth. Both are meaningful. I think one of them is closer to the kind of training I have had, and therefore more easily recognized and more likely to be accepted.
(00:48:48):
Again, one of the reasons I would like to have training like this for young software developers, is that stats is not the only way of knowing. I think we throw away or disregard a lot of really valuable insight, by insisting that if it cannot be quantified, it is not real.
EW (00:49:10):
Case studies are incredibly important, even though they can be a small amount of actual data, approaching anecdote. At least they are well-documented anecdotes with their precedents explicit.
GW (00:49:33):
And carefully analyzed, teasing apart the actual meaning, and comparing it to somebody else's actual meaning. As I said, it is not the bulk collection of data followed by a t-test, which is what I am most comfortable with. But I should not discount somebody else's insights, because they are using methods that I have not yet learned. I wish I had had the humility to understand that 20 or 30 years ago.
EW (00:50:13):
Where does it end? Where do we start with the t-tests? Where do we go from...
GW (00:50:21):
I have no idea.
EW (00:50:22):
From "I truly believe that source control is important," to test-driven development, to whether or not linters are truly helpful, or all of these things...
GW (00:50:40):
Mm-hmm. It is hard for me to know where we go or where we would stop, when we have not even really started. I believe that empirical study has served engineering and medicine very well. I would like the construction of software to be at least as rigorous as the construction of a highway or a footbridge.
EW (00:51:12):
How much is it my employer's job to give me time and money to learn, and how much is it my responsibility?
GW (00:51:23):
<laugh> That question comes up in pretty much every other profession as well. Again, I will come back to medicine. What we do not have, that the medical profession has, and the legal profession and accounting and many others, is professional associations that are worth a damn. The ACM and the IEEE- Yeah, right. I heard you snicker, and I agree.
(00:51:47):
The reason that there is space carved out for paramedics to maintain and improve their skills, is because they have got a professional body that goes and negotiates in bulk, rather than trying to do it case by case. The reason that lawyers, for example, have time carved out, but also have a requirement to recertify, is because there is a professional body that has teeth that says, "We as a profession have privileges that are not granted to the average citizen. The societal bargain for that is we try to ensure a certain minimum standard."
(00:52:33):
If I go and get an engineer, for example, to do the drawings to replace the wall at the back of our house, and that engineer does drawings that turn out to be faulty, and we build a wall and it collapses, I can sue the engineer. I have a reasonable expectation of getting my money back and damages and so forth.
(00:52:58):
The same is not yet true of software, and I think that is largely by design. I think programmers have worked very, very hard to make sure that they cannot ever be blamed for anything in any meaningful way. I think we have outgrown that. Can I give you an example?
EW (00:53:18):
Sure.
GW (00:53:19):
Okay. Do you remember Waze? It was an early piece of wayfinding software, a predecessor to Google Maps?
EW (00:53:24):
Yes.
GW (00:53:24):
Okay. Mike Hoye, who used to be at Mozilla, uses this as an example. For the first couple of years that Waze was out there and in use, you could ask it to find a route from A to B that avoided police checkpoints.
EW (00:53:40):
Okay.
GW (00:53:42):
Do you think the person who implemented that feature, had ever lost a loved one to a drunk driver?
EW (00:53:47):
No.
GW (00:53:48):
Okay. Do you think the person who implemented that, or the person who deployed it, or the person who authorized construction of that feature, that somebody, maybe the company as a whole, should have been liable for deaths that resulted from drunk drivers avoiding police checkpoints? I absolutely do. I am not sure who and at what level. Is it the individual programmer? Is it the company as a whole? Is it both of them?
(00:54:16):
If it was a piece of hardware going into a car that proved to be faulty and led to a crash, we have the legal and societal structures to say, "Here is who you can sue. Here is whether they go to jail or just give you money." It is not a perfect system by any means, but at least there is a system for doing it. As a result, I think your car is probably safer than a lot of the software that you use.
EW (00:54:45):
It is a measure of harm. With- I am not going to take the Waze example, because that one is really good. But the reason that cars get more checks, is because you can hurt people. The reason airplanes get more checks than cars, is because you can hurt a lot of people. The reason medical devices get certified, is because you can hurt lots of individuals. We have built some of these checks.
GW (00:55:22):
We have built social media, that explicitly and very effectively encourages racism, homophobia, misogyny.
EW (00:55:33):
I totally agree, and I do not know how to solve that one. Because you cannot- There is not one- How do you separate the content, from the content provider?
GW (00:55:44):
Okay. How do you do that with traditional publishing? How do you decide when a Ford Mustang bursts into flames, whether it is the mechanic, the engineer, the company executive or the company? Right. That is hard.
EW (00:56:00):
By the fifth one, you figure it out.
GW (00:56:02):
Right, and we are now a day late, and a dollar short, on figuring this out. There is no question at all that the internet has been a great force for good, and also great force for harm. There is no question that our entire industry has done harm, as well as good.
(00:56:19):
You may know that there are now dozens of cases in the United States of people being arrested because facial recognition software misidentified them. And that in almost every case, the person who is misidentified is an adult black male.
EW (00:56:36):
Right.
GW (00:56:37):
Okay. Somebody should be sued for that. There are cases working their way through the courts. The legislation needs to catch up. But also our expectations. As I said, the software industry has trained us to believe that we just shrug and go, "Oh, okay, I guess you leaked all my personal information again."
EW (00:57:00):
As developers, we know we cannot do it all. We do not have enough time, we do not have enough resources. We do not have enough input data, to know for sure that what we are doing is...
GW (00:57:14):
The same is true of the engineers who designed the car that you are- That I think is probably in your driveway right now. It takes literally tens of thousands of people to design and build and deliver an automobile.
EW (00:57:29):
And that is why they are more expensive than a Fitbit.
GW (00:57:34):
Absolutely. But we have decided with automobiles, who is to blame for what. There is a book by Amy Gajda called "Seek and Hide: The Tangled History of the Right to Privacy." It goes on a little bit, it could have been half the length. But it is a history of how this notion that we have a right to privacy in the United States came to exist, because the Founders absolutely had very different ideas.
(00:58:07):
People as recently as the 1920s and 1930s had very different ideas about what kind of things you as an individual had a right to keep private, than we do today. We negotiated that right over the course of generations, through one case after another. Through basically trial and error. I do not mean to pun, but we take cases to trial and we see, does this make society better? What are the unintended consequences, and so forth.
(00:58:35):
I think we are starting to see some of that now. The lawsuits that have been launched against various AI companies, over copyright infringement by people like George R. R. Martin. "You scraped my books. You did not have my consent. That is copyright infringement if you use it to train the model."
(00:58:56):
That is interesting, because there is very clearly not going to be a simple yes or no answer. We are going to negotiate a border and that border will shift over time. As I said, I would like to see more of that in our profession.
EW (00:59:13):
I am on the yes. How are we going to find the money to provide time for the software engineers to possibly get certification? Because, as you mentioned, there are other professions, like doctor and lawyer. Those are both certified professions with longer degree paths than most software engineers have.
GW (00:59:38):
I believe that Elon Musk could pay for every programmer working in North America today to go and get this certification, and still be one of the richest men in the world.
EW (00:59:52):
But we do not get to control him. We only get to control our actions.
GW (00:59:56):
We as voters can pass laws, elect people who pass laws, to change the rules of the game. In the Gilded Age, there was no sense at all that people should pay taxes proportional to their wealth. In my parents' time, there was absolutely no sense that you should not end your years in poverty, unless you had been lucky enough to have a middle class job and squirrel money away.
(01:00:22):
Here in Canada, there is the very strong sense that you should not go bankrupt, just because you have cancer or a broken back. All of those decisions we made as a society, and enforced them on people who were very, very strongly opposed.
EW (01:00:42):
You are in the area of things where I do not know how to effect a change. I mean, we are back in politics, where I just sit there and go, "This is nice. I am going to go build something that makes somebody smile, because I can do that."
GW (01:00:59):
And then you go and you vote. You vote for somebody who is, for example, going to introduce real liability legislation for banks and other companies that lose personal identifying information. Right now, the fines for that are a slap on the wrist. They are an operational expense.
EW (01:01:21):
I vote, and I feel like it is thrown away each and every time. But I do it, because I do believe it is important. But it is just- I live in California. I might as well be throwing flowers in the water, for all the effect that it is going to have.
GW (01:01:37):
I would disagree. I think the last couple of elections, your vote probably has mattered quite a bit. I think that at the state level, your vote matters even more.
EW (01:01:53):
I think we need a research on this.
GW (01:01:55):
I beg your pardon?
EW (01:01:56):
I think we need research on this.
GW (01:01:59):
We do.
EW (01:01:59):
Sorry.
GW (01:01:59):
No, no. There has been a lot. In the United States and Canada, and to a lesser extent Australia, the state or province level is often used as a laboratory for trying out ideas, that then move up to the federal level. Whether it is around environmental protection, or anti-discrimination laws, or tax laws. Proposition 13 in California, which crippled the state's finances.
EW (01:02:28):
<sigh>
GW (01:02:28):
Yep. But that was a tryout. That was an explicit tryout by right-wing libertarian politicians, for legislation that they then wanted to move up to the national level. When we take a look at things like legislation to prevent discrimination in hiring practices, some states, with legislators elected by people like you, take the lead on that and implement legislation. The rest of the country looks and says, "Oh, okay, that looks better than what we are doing right now." For example, I was amazed and very pleased when the law changed in California, so that companies are posting salary bands with jobs.
EW (01:03:11):
Yeah.
GW (01:03:11):
Right? Okay. That happened-
EW (01:03:16):
The band I saw recently was $20K to $300K, so that band is not as useful as it might have been.
GW (01:03:23):
Won the point.
EW (01:03:27):
Yeah. That was an outlier. Of the many I was looking at, that one stood out because it was like, "Oh, come on. Play by the rules."
GW (01:03:34):
Right. But here is the thing. We have now won the point. We have won the principle. Everything from here on is about the details. How wide is the band allowed to be? How closely does it have to reflect the pay of current employees or of the last five? Now we get into policy wonk country, but we have won the principle.
EW (01:03:57):
But I just want to learn a new microcontroller. I want to play with a robot. I want...
GW (01:04:05):
So have you ever read the English author Terry Pratchett?
EW (01:04:10):
Yes.
GW (01:04:11):
Okay. There is a scene in the book "Night Watch," where Captain Vimes goes out, and sets up a table and chair in front of the mob. Somebody says, "We want truth, justice, freedom." And Vimes says, "You know what I want? I want a cup of tea, piece of toast and a hard boiled egg." Do you remember that scene?
EW (01:04:37):
Not off the top of my head, but sort of. I do not remember what comes next.
GW (01:04:41):
Because he then goes on to say, "I want that, but I want to have that every morning when I wake up. And I cannot be sure of having my cup of tea, my piece of toast, and my hard boiled egg. Unless we take care of all this other stuff. I do not want to have to worry that I am not going to have that, because there is no food in the shops. I do not want to have to worry, that I cannot have that, because somebody kicked in my door in the middle of the night and dragged me off."
EW (01:05:06):
But not everybody can spend all of their time worrying about the politics of everything.
GW (01:05:12):
We do not have to.
EW (01:05:13):
I mean, for me, it is just too stressful.
GW (01:05:15):
Yep.
EW (01:05:15):
If I spend too much time thinking about politics, I am out. I just cannot do it.
GW (01:05:22):
Absolutely. And if I spend too much time thinking about software deployment, I have to go sit in a corner <laugh>. As with everything else, there are some people who thrive on it. For the rest of us, we volunteer a little bit here and there. We go out and vote. Maybe we donate some money.
(01:05:41):
Maybe we do not pay as much attention, as that guy on the bus who really, really wants us to understand what is going on in the Greek parliamentary elections this year. Sorry, that was today's experience. I know nothing about Greek politics and I do not want to.
(01:05:58):
But the same is true of how does the food get on the shelves in the supermarket? How does the sewage system work? There are people who take care of that, and the rest of us just have to support them to make sure that they can do their jobs. I was appalled at how many Americans chose not to vote in the last midterms. We know what is at stake.
EW (01:06:24):
We know what is at stake, and we have seen our elections messed with.
GW (01:06:28):
Yep.
EW (01:06:29):
And we know how difficult it is for some people to vote, by design.
GW (01:06:35):
Yep.
EW (01:06:35):
It does not really surprise me. There is a sense, not of apathy, but of depression.
GW (01:06:47):
Yep. And I can only imagine how the generation before us felt around race rights, around women's rights, around LGBT rights. They felt the same sense of weariness and hopelessness, and here we are today. Better is possible.
(01:07:07):
I realize we have wandered a long way away from embedded controllers. But one of the things that I want is for people to truly believe that better is possible. And that it is not actually that much hard work, as long as we do the work together. And that doing that work together is a lot of fun.
(01:07:33):
The thing I remember most about teaching with Software Carpentry, which is a nonprofit that teaches coding skills to researchers who did not get those skills earlier in their careers, was the feeling of teaching with other people, of building lessons with other people.
(01:07:52):
Yeah, interacting with the learners is really fun. Seeing that light bulb come on, that is magic. But being in the room with somebody else, who realizes that you have got the wrong slide deck up, and you look and you lock eyes and it is like, "Okay, it is a Tuesday. We will get through this." That feels pretty good too.
EW (01:08:16):
I totally agree with the "we can make it better." I think making it better is about doing things each day. To take it back to software-
GW (01:08:31):
<laugh> Oh, is that what we are talking about?
EW (01:08:33):
On "It Will Never Work in Theory," there are all of these well-researched, well-articulated ideas about software engineering. Some of it is the idea of software engineering. Some of it is nearly the typing part of software engineering, the tactical parts.
GW (01:08:53):
Sure.
EW (01:08:54):
You do not have to do it all in one day. You can be just a little better every day. Or every week!
GW (01:08:59):
Absolutely.
EW (01:08:59):
Watch a video, take a little time to think about it, and if it does not apply to you, go on. That is fine. You do not have to tackle everything at one time.
GW (01:09:12):
Absolutely. I think anybody who comes into a large codebase and says, "We are just going to rewrite this from scratch."
EW (01:09:20):
Oh, that person.
GW (01:09:21):
Yeah, right. That person. Yes, absolutely, the temptation to just burn it all down and start over is powerful. Particularly I think when you are looking at your own old code.
EW (01:09:36):
<laugh>
GW (01:09:36):
Here is a funny thing I learned when I was doing a dive into the teaching literature. Have you ever seen yourself on video?
EW (01:09:52):
Only in the last few years, and I hate it so much.
GW (01:09:57):
So here is the funny thing. We can make you watch a video of yourself, and we can measure stress levels. Put the electrodes on you, track eye movement, pupil response, things like that. Then we can get an actor and train them to imitate everything you are doing in that video. And then you get to watch them, and your stress levels will be far lower.
EW (01:10:20):
Oh yeah.
GW (01:10:23):
You are far more critical of your own tics and so forth. You have had practice your whole life ignoring other people's foibles, because it is not socially useful to notice when people stumble, or say "Um" and "Er," or tug their earlobe. But unless you spend a lot of time in front of a mirror, you have no practice at all subtracting out that static from yourself. Now you are confronted with it on video, which is something that evolution did not prepare us for.
(01:10:52):
I think the same is true of code. If I get a chunk of- Okay, here is another experiment. If I get a chunk of code written by one of my colleagues and I do a review, I will probably say, "Yeah, there are a couple of things you can clean up. But basically this is okay." If you were to have me look at exactly the same code and I was the author, I would say, "All right, rm -rf. No wait! Reformat the hard drive. Can we destroy all record of this?" Right? I am going to react very, very differently. And I-
EW (01:11:31):
That is funny. I would not- I mean, looking at my own code- I mean it depends on which code.
GW (01:11:40):
<laugh>
EW (01:11:40):
But for the last few years, so much of my code has been written almost for demonstration purposes. I work with clients who, yes, want me to implement a bunch of stuff, but at the end of the day, I know I am handing it off, and I am handing it off to a junior engineer. And I want the junior engineer to be well prepared.
(01:12:04):
So when I look at my code, I want it to be something that is almost amusing to read. If not amusing, then engaging or something. Like when I write a book, it is not just, "Here are all the facts." It is, "Here are all the facts in a way that I think will make it easier for you to understand them." So no, I do not have as much trouble with my own code. I cannot see my own typos, which is kind of a problem. But since I started writing code to be either public or for other people, I am not nearly as critical about it.
(01:12:45):
The video thing, yes. And by the way, I have a podcast if you do not know, and I do not like my own voice. I do not like the way I have to slur sentences together so that I do not stutter. And then people do not seem to notice that I stutter, but I do a lot. And then the whole slurry mushy thing is terrible. How can you even listen?
GW (01:13:10):
I realize we are coming up on time.
EW (01:13:13):
<laugh> Yeah.
GW (01:13:14):
But here is another thing I came across when I was learning about teaching. We have seen steady improvement in athletic performance throughout my entire lifetime. Records keep getting broken over and over again. And the question is why? Some of it, I am sure, is better pharmaceuticals.
EW (01:13:34):
Better teaching.
GW (01:13:37):
What specifically, when you say better teaching? Because there is an answer to this question.
EW (01:13:41):
The first person learns that if you start with your dominant foot forward and launch from there, you go a little faster. Second person does not have to learn that. They get taught that now. They get to learn that if you put your ankle out just a little bit, that you can go faster. And now the third person starts with both of those, and has time to learn the additional stuff on their own.
GW (01:14:05):
I agree with all of that. There is also one other thing that, depending on who you trust, might account for a quarter or more of the improvement in athletic performance in the last 45 years. And that is video.
EW (01:14:21):
Huh. All right.
GW (01:14:23):
Prior to the early 1980s, you had to be pretty far along to ever see yourself play the sport. And even then there would be a delay of at least a day between being on the field and getting to watch the film. Right? Today, every kid doing athletics can watch themselves on their phone as soon as they are done performing. "Here, can you record me doing this?"
(01:14:50):
We have made that feedback loop so tight and so ubiquitous that every serious athlete can watch themselves, in the same way that every serious musician, since the 1970s at least, has been able to get a tape recorder and listen to themselves. It was physically impossible to actually listen to your own performance prior to the Edison phonograph.
(01:15:15):
It was impossibly expensive until the invention of tape recording in the 1920s or thirties, depending on which version you care about. And it was impractically expensive for anybody except professionals until at least the late 1960s. Then all of a sudden in the 1970s, a ten-year-old kid could have a radio with a tape recorder at home, and they could play the saxophone and then listen to themselves.
(01:15:45):
If you think that has not had an impact, boy it has. I bring this up because we have seen something similar with performing arts. You might feel uncomfortable with the sound of your own voice, but every actor, every TV presenter, radio presenter, of course, they are listening to themselves and watching themselves now in the same way that athletes are.
(01:16:13):
There are companies like Athena that build products to help teachers do this. Yes, the first two or three times you see yourself in front of a class teaching, you are going to want to go and find some deep, dark cave to hide in for the rest of your life. But then you get used to it, in the same way that athletes and actors and others get used to it. That initial discomfort goes away, and it just becomes another Tuesday.
(01:16:40):
At that point, your performance accelerates, because now you can view yourself as if you were a stranger, and give yourself the feedback you would give the stranger. You know what the value of getting code reviews and other kinds of reviews is. We can do it for ourselves now, thanks to technology. Yeah, it took me a while to get used to it, but if the jocks can do it, I can do it.
EW (01:17:11):
Oh, Greg, we are out of time.
GW (01:17:13):
We are.
EW (01:17:14):
Do you have any thoughts you would like to leave us with?
GW (01:17:18):
I think I have shared far too many in the last 90 minutes, and I do apologize for that.
EW (01:17:24):
No apology needed.
GW (01:17:26):
It has been fun. I am very grateful to you for inviting me on the show.
EW (01:17:30):
Well, there are many other things we did not talk about, so expect another invitation.
GW (01:17:35):
I would be happy to come back.
EW (01:17:36):
Our guest has been Dr. Greg Wilson, a programmer, author, and educator based in Toronto. He co-founded and was the first executive director of Software Carpentry, which has taught basic software skills to tens of thousands of researchers worldwide.
(01:17:52):
Dr. Wilson has also authored or edited over a dozen books, including "Beautiful Code," which was awesome. "The Architecture of Open Source Applications," which was awesome. "Teaching Tech Together," which I have not read. Most recently, "Software Design by Example," which has a Python version partly available online. And that was pretty awesome too.
(01:18:17):
Greg is a member of the Python Software Foundation, and a recipient of the ACM SIGSOFT Influential Educator Award, to which he wrote a rebuttal instead of an acceptance. He currently works as a software engineering manager at Deep Genomics. As you can tell, Greg has been around and done a lot.
GW (01:18:43):
<laugh> And he has got the bite marks on his butt to prove it.
EW (01:18:46):
Thank you so much for being with us.
GW (01:18:48):
Thank you very much, Elecia.
EW (01:18:50):
Thank you to Christopher for producing and co-hosting. He would have broken in and said, "Getting the lead out of paint and gas is what has been the big improvement in the last 40 to 60 years." But since he is not here, he does not get to say that.
(01:19:04):
Thank you to our Patreon listener Slack group for some of their help with preparing this. And of course, thank you for listening. You can always contact us at show@embedded.fm or hit the contact link on embedded.fm.
(01:19:17):
And now a quote to leave you with. I think we will just go with Aristotle, it is a classic. "Those who know, do. Those who understand, teach."