Episode 130 - Walking on a Tightrope: How Politics Impacts the Scientific Community
Advances in science and technology have undoubtedly changed how we navigate our daily lives. We benefit from them immensely at home, at work, and at school. Even governments can take advantage of science and technology. However, placed in the wrong hands, these advancements and innovations can be used for harmful ends.
Today's discussion revolves around the issues in pursuing truth, knowledge, and engineering feats in an age of hyper-politics. Max and Aaron start by discussing Lawrence Krauss's article entitled “The Ideological Corruption of Science.” They explore whether it's possible to remain apolitical in the scientific community. They also tackle the ethical implications of research and an update about the use of “Fake Faces” in publishing news articles.
Tune in to the episode to learn more about what goes behind and beyond politicizing science!
Here are three reasons why you should listen to the full episode:
Find out whether the politicization of the sciences is becoming a problem.
Discover how dictatorships have historically used science and technology to harm people.
Max and Aaron share examples of the ethical implications of science, technology, and research.
Resources
The Ideological Corruption of Science by Lawrence Krauss in The Wall Street Journal
A Universe from Nothing by Lawrence Krauss
Right-Wing Media Outlets Duped by a Middle East Propaganda Campaign by Adam Rawnsley on The Daily Beast
Related Episodes
Episode 81 on Junk Science and the Reproducibility Crisis
Episode 56 on the Fake Faces in Generative Adversarial Networks
Episode 22 on P-Hacking
Episode Highlights
Politicization of Sciences
Lawrence Krauss is a theoretical physicist and an outspoken liberal atheist who wrote the book A Universe from Nothing.
He argues that the sciences are getting overly political.
Science requires a commitment to objective truth. Under a dictatorship, however, that commitment can easily be dismissed.
Non-scientists are pressuring scientists to fold activism into their work, insisting that research must deliver specific benefits, not merely the absence of a negative impact.
Inconvenient Truth
Science is dealing with an “inconvenient truth”: anything that isn't politically expedient is discouraged from being published or even researched.
When researchers allow external forces to shape the approach of their work, the results are inevitably affected.
If you're not allowed to examine a hypothesis scientifically, you can never show that it is actually incorrect.
In an ideal world, factual news should win against fake news.
Anti-Mendelian Thought
Trofim Lysenko promoted an “anti-Mendelian” theory of inheritance.
He rejected Mendelian genetics. Instead, he believed that an organism's life experiences could alter the traits it passes on to its offspring.
Lysenko was regarded as one of the Soviet Union's foremost scientists in botany and genetics. While running its agriculture programs, he recommended policies that had severe negative impacts.
He held back Soviet scientific advancement for decades, and his policies led to millions of deaths.
Ukraine at the Hands of the Soviet Union
Ukraine has endured horrors throughout its history, from the two World Wars to the Holocaust to the Holodomor famine. The famine, in many ways, was the effect of faulty science.
During Stalin's time, people weren't studying and making an effort to prove what was true or not.
It was also illegal to voice opinions or publish results that went against Stalin's theories. People were punished and sent to camps for doing so.
At that time, Ukraine was the breadbasket of the Soviet Union. Yet within a decade it suffered massive famines that were caused not by natural disaster but by the state's own policies.
Politicizing Mathematics
Nazi Germany dismissed certain scientific discoveries as “Jewish science” or “Jewish mathematics” and purged Jewish scientists and mathematicians from its science departments.
Edward Frenkel made the point that mathematics was the only subject in the Soviet Union that was hard to politicize.
The Cultural Revolution in China purged academia and labeled certain branches of science as deviant and Western.
Privately funded research tends to carry the taint of its funders. For some organizations, it may not be politicization per se, but it is certainly science with an agenda.
Scientists & Political Activism
Scientists now face strong expectations to be activists, even when they just want to do their jobs.
The discussion of ethics in science, technology, and engineering has been ongoing for decades.
The classic example is nuclear physics, which led to the Manhattan Project and the production of nuclear weapons. Not much later, there was Wernher von Braun's rocket research.
Technological Arms Race
Once a technology has been proven possible, you can't simply walk away from it: even if you judge that its costs outweigh its benefits, someone else can take it down a dangerous path.
The modern-day example of this is facial recognition, which has a massive potential impact on society, policing, and privacy.
Fearing misuse, numerous companies have halted the release of their most advanced facial recognition technology.
Listen to the full episode for Max and Aaron's example about wormholes.
The Issue with Scientific Method
Max doesn’t think the scientific method or Bayesian inference is biased or racist on its own.
Nonetheless, results can still be racist or biased, depending on the priors and methodologies researchers bring to a question.
For instance, some people have called facial recognition biased or racist because of its poor results with people of color.
However, it’s not a matter of a flawed method or approach. It's a matter of not asking all the right questions to improve the algorithm.
Scientists are being pushed to incorporate environmentalism and other issues that have no direct relevance to their research.
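Max's point about priors can be made concrete with a standard Beta-Binomial example: two analysts apply the same method to the same data but start from different priors. The sketch below is illustrative, not from the episode, and all the numbers are made up.

```python
# Bayes' rule with a Beta prior on an unknown success rate: two analysts
# update on the same data but start from different priors.

def posterior_mean(prior_a, prior_b, successes, failures):
    """Posterior mean of theta ~ Beta(prior_a, prior_b) after observing
    Bernoulli(theta) outcomes: the posterior is
    Beta(prior_a + successes, prior_b + failures)."""
    return (prior_a + successes) / (prior_a + prior_b + successes + failures)

skeptic = (1, 9)    # prior expecting roughly a 10% rate
optimist = (9, 1)   # prior expecting roughly a 90% rate

for n in (10, 10_000):          # small sample, then large sample
    s = f = n // 2              # data generated by a true rate of 50%
    print(n, posterior_mean(*skeptic, s, f), posterior_mean(*optimist, s, f))
```

With only 10 observations the two posterior means are 0.3 and 0.7; with 10,000 observations both land within a fraction of a percent of the true rate of 0.5. The method is identical in every case; the prior is what differs, and its influence washes out as evidence accumulates.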
Creation of Fake Faces
This Person Does Not Exist is a website that uses a generative adversarial network (GAN) to generate fake faces.
Relatedly, some published news articles were found to be written by “journalists” who don't exist: GAN-generated fake faces were used as profile photos for the supposed authors of agenda-driven articles.
The technology is far from a waste of time, but people can also use it for malicious purposes.
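To make the mechanics concrete: a GAN pits a generator, which turns random noise into candidate samples, against a discriminator, which tries to tell real samples from generated ones; each improves by playing against the other. The sketch below is a deliberately tiny, illustrative version on one-dimensional data. None of this code is from the episode, and real face generators use deep convolutional networks rather than the single linear functions shown here.

```python
# Minimal 1-D GAN sketch (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

w, b = 1.0, 0.0   # generator G(z) = w*z + b, starts as the identity
a, c = 0.1, 0.0   # discriminator D(x) = sigmoid(a*x + c)
lr, batch = 0.01, 64

for _ in range(2000):
    real = rng.normal(3.0, 1.0, batch)   # "real" data: N(3, 1)
    z = rng.normal(0.0, 1.0, batch)      # noise fed to the generator
    fake = w * z + b

    # Discriminator step: raise D(real), lower D(fake).
    d_real, d_fake = sigmoid(a * real + c), sigmoid(a * fake + c)
    a -= lr * np.mean(-(1 - d_real) * real + d_fake * fake)
    c -= lr * np.mean(-(1 - d_real) + d_fake)

    # Generator step: make D mistake fakes for real (non-saturating loss).
    d_fake = sigmoid(a * fake + c)
    grad_fake = -(1 - d_fake) * a        # d(-log D(fake)) / d(fake)
    w -= lr * np.mean(grad_fake * z)
    b -= lr * np.mean(grad_fake)

samples = w * rng.normal(0.0, 1.0, 1000) + b
print("mean of generated samples:", round(samples.mean(), 2))
```

The same adversarial loop, scaled up to millions of parameters and trained on photographs, is what produces the photorealistic fake faces on sites like This Person Does Not Exist.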
5 Powerful Quotes from This Episode
“Science communication and science policy is something that's very important, but that science is not necessarily good at.”
“When it gets so abstract that even, you know, the politicians can't grasp what's going on, then maybe you do have a little bit more freedom to pursue your field.”
“Whenever science has been corrupted by falling prey to ideology, scientific progress suffers.” —Lawrence Krauss
“To stem the slide, scientific leaders, scientific societies, and senior academic administrators must publicly stand up not only for free speech in science, but for quality, independent of political doctrine and divorced from the demands of political factions.” —Lawrence Krauss
“So, does this particular moment also end up in the dustbin of history? Or does it lead us to a brave new world? I think that is the story that we will get to personally witness in our lives—in your lifetime.”
Enjoy the Podcast?
Are you hungry to learn more about how machine learning, physics, cosmology, and Bayesian inference work? Do you want to expand your perspective further? Subscribe to this podcast to learn more about A.I., technology, and society.
Leave us a review! If you loved this episode, we want to hear from you! Help us reach more audiences to bring them fresh perspectives on society and technology.
Do you want more people to understand the relationship between science and politics? You can do it by simply sharing the takeaways you've learned from this episode on social media!
You can tune in to the show on Apple Podcasts, Soundcloud, and Stitcher. If you want to get in touch, visit the website, or find me on Twitter.
To expanding perspectives,
Max
Transcript
Max Sklar: You're listening to The Local Maximum Episode 130.
Time to expand your perspective. Welcome to The Local Maximum. Now here's your host, Max Sklar.
Max Sklar: You've reached another Local Maximum. Welcome to the show. I am now firmly moved into Manhattan. So for the next few months, at least, I will get a front-row seat into either the rebound or collapse of the city—one or the other is going to happen. I've got a great discussion to play for you today. This is the second part of the discussion that I had with Aaron a couple weeks ago. And I think the bulk of the discussion is about political activism in science. It's becoming more and more politicized. And is it even possible or desirable to be an apolitical scientist, you know, truth seeker in this environment, or do you need to either go actively with the established powers or against them? Because if you sort of stay on the sidelines and sort of automatically fall in one or the other, that binary decision is tough.
And Aaron and I did a pretty good job of talking around the intricacies there. And then we finish up with another example of, you know, science or technology that can be used for good or can be used for a kind of deception, and one is the generative adversarial networks—those fake faces that we saw back in Episode 56, I think it was.
So without further ado, I'm just going to play it. And here is Episode 130 discussion with Aaron.
Max: We've talked a lot about the state of science on the show in the past, and there were two points that I've made. First of all, we've made the point of the crisis in science, the crisis of reproducibility that has occurred, and all…
Aaron Bell: Does this tie into the discussion of P-hacking?
Max: Yes, and it ties into the discussion of P-hacking too, like people just keep on running experiments again and again until they get what they want, and then, yeah. And then also, in terms of, you know, Bayesian inference as one codification of the scientific method, and I'm actually trying to put out a PDF on that that will be available on the website. So yeah, I want to talk about this. We're pretty good at talking about these articles. So that's why I wanted to bring this one up.
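The pattern Max describes, running experiments again and again until one clears the significance bar, is easy to demonstrate with a short simulation. The sketch below is illustrative and not from the episode; every parameter is made up. A fair coin is tested repeatedly for bias, with the "hacked" version peeking at the p-value after every batch of flips and stopping as soon as it dips below 0.05:

```python
# Simulating p-hacking by optional stopping: with no real effect,
# peeking at a significance test after every batch of data and stopping
# at p < 0.05 inflates the false-positive rate well past the nominal 5%.
import math
import random

random.seed(1)

def p_value(heads, n):
    """Two-sided p-value for H0: fair coin, via a normal approximation."""
    if n == 0:
        return 1.0
    z = (heads - n / 2) / math.sqrt(n / 4)
    return math.erfc(abs(z) / math.sqrt(2))

def run_experiment(batches=20, batch_size=50, peek=True):
    heads = n = 0
    for _ in range(batches):
        heads += sum(random.random() < 0.5 for _ in range(batch_size))
        n += batch_size
        if peek and p_value(heads, n) < 0.05:
            return True              # stop early, declare "significance"
    return p_value(heads, n) < 0.05  # honest path: one test at the end

trials = 2000
hacked = sum(run_experiment(peek=True) for _ in range(trials)) / trials
honest = sum(run_experiment(peek=False) for _ in range(trials)) / trials
print(f"peeking after every batch: {hacked:.1%}; single final test: {honest:.1%}")
```

Even though the coin is fair, the peeking strategy declares a "significant" bias at a rate well above the nominal 5%, while the single-look test stays near 5%.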
And also Lawrence Krauss, I don't know if that name rings a bell for you—it certainly does for me. I read his book. He was a physicist at Yale. I don't think he was at Yale when I was there, but I may have seen him lecture once because when I first saw him on YouTube, he seems so familiar. But he's a theoretical physicist, and he wrote a book that I read once called "A Universe from Nothing," like how could the universe come from nothing. And he argues that the laws of physics says that it has to—nothing has to produce something. I don't know very deep, but he goes into the laws of physics. He's a very outspoken atheist, not just like, "Oh, you know, I happen to be an atheist, and I do science." No. He's like... I'm trying to have a good way of describing what kind of atheist he is. What kind of picture am I painting here—he's like a real, like, militant atheist. I don't know if militant is the right word, but I'm trying to find the right word here. But anyway, outspoken.
Aaron: He does not believe that a miracle occurred. If by miracle, you mean something that can't be explained by science,
Max: Sure, sure. But he also, you know, thinks there's something wrong with anyone who disagrees who does science. But anyway, the reason why I bring this up is I feel like this article is a big deal because if someone like Lawrence Krauss who's like an outspoken liberal atheist, is saying that the sciences are getting political, it's kind of hard to ignore. And I'm kind of wondering, are there still people out there who think that the politicization of the sciences aren't a problem? Like, is this something everyone's on board with? Or are there people who still think, "Oh, no, sciences are fine. Everything with science is going exactly how it should be with science." I don't know who has that opinion right now. But anyway, let me start with his introduction, and then we can go from there. Right.
Aaron: Okay.
Max: So he starts, "In the 1980s, when I was a young professor of Physics and Astronomy at Yale, deconstructionism was invoked in the English Department. We, in the science departments, would scoff at the lack of objective intellectual standards in the humanities, epitomized by a movement that argued against the existence of objective truth itself."
You kind of have to have objective truth in order to do science, I think. Anyway, that was my side. Let me continue.
"Arguing that all such claims," sorry, "epitomized by a movement that argued against the existence of objective truth itself, arguing that all such claims to knowledge were tainted by ideological biases due to race, sex or economic dominance. It could never happen in the hard sciences, except perhaps under dictatorships, such as the Nazi condemnation of "Jewish" science, or the Stalinist campaign against genetics led by Trofim Lysenko, in which literally thousands of mainstream geneticists were dismissed in the effort to suppress any opposition to the prevailing political view of the state. Or so we thought," as he says, "We thought it can't happen here."
Then he gives a list of examples of things like this happening today, where people are afraid to ask certain questions, or they're kind of being told by non-scientists to do science in a certain way. Or, you know, the entire... it's not just there are certain questions that you're not supposed to ask—like, you have to be asking a question that's furthering an activist goal. If you're kind of asking a question that's sort of disinterested, you know, pursuit of knowledge that it appears to be non-legitimate. So...
Aaron: So it's perhaps an extension of the distinction between not being racist and being anti-racist. That they're saying, you need to be pursuing, you need to be providing certain benefits, not just the absence of a negative impact.
Max: Right, right. Yeah. So if you're studying—wormholes is the example I give. If you're studying wormholes, what impact does that have on, you know, on the social revolution? Nothing. Then, you know, we're gonna say that it's illegitimate to study wormholes. It almost seems like that's the way we're going. Okay, so, the next bullet point's from you, Aaron. So why don't you take it away?
Aaron: Yeah. So I think another way of looking at this is that science is dealing with an "inconvenient truth." So apologies to Al Gore for repurposing his phraseology there. But anything that's not politically expedient is discouraged from being published or even researched. And I think there's a major synergy between this at a high level and the trend in science that ties in with what we said before about P-hacking: you have a kind of preordained result, and whether you're going to achieve it through something like P-hacking or just through the inherent design of your experiment, we all know what we're going to find here, and this is just a matter of "proving it out."
Max: Economists do that all the time.
Aaron: Yeah. So that if you're allowing...
Max: And corporate executives.
Aaron: If you're allowing external forces like this to shape your approach to your research and how your experiments are run, then you know, it's obviously going to have an impact on the results. And this isn't just because of…
Max: There's also like a…
Aaron: You know, a personal bias.
Max: Yeah.
Aaron: But also, you're going to be impacted by what others of influence are seeking. So whether that's your advisors when you're a grad student, or your department head, or the organization that's providing your funding and your grants, the journal that you want to be published in, or pressure from your colleagues. You know, people who you want to keep being invited to the right cocktail parties, and so you don't want to be doing research into the wrong types of things.
Max: Yeah. I feel like scientists who care about cocktail parties...
Aaron: And there's a highly cynical view.
Max: ...are no scientists at all. No, no, many of them do. Look, the same issue that concerns me here is the same as the issues that concern me about free speech. Like, if you're not—if a certain hypothesis is not allowed, well, that's an evil hypothesis. If you're not allowed to look at that scientifically, how are we ever going to know that that hypothesis is actually incorrect? We wouldn't. It's just sort of taken on faith. And I feel like that just puts society in a very shaky foundation where you're just saying, you know, "We're not looking into something because we're not allowed to look into it," and that you don't get the benefit of disproving it. Or if an "inconvenient truth" happens to be an "inconvenient truth" then you don't have the benefit of, "Okay, how do we rationally deal with this?"
Aaron: Yeah, it's in some way similar to the concern that, you know, in an ideal world—good news is the wrong phrase for it—but factual news should win out against fake news. But there's a problem of the volume being distributed and that it's kind of an uphill battle there. And if you're disincentivizing those who are pursuing the actual facts of the matter, then it makes it even harder to win that battle of ideals or facts even.
Max: Yeah. Okay. So I actually want to go through some of the references that…
Aaron: Those are pretty dense quotes, so there's some stuff to unpack.
Max: Yeah. So, first of all, Trofim Lysenko. I never heard of this guy in my life. So I looked into him, and it's very interesting. It's a very interesting example of science in the Soviet Union, and I'll get to the Nazi example too in a bit. But he was promoted to what, Agriculture Secretary or something like that? And basically, he was promoting the idea you call "anti-Mendelian." Is that what it is? Anti-Mendelian?
Aaron: Yeah.
Max: So basically, the way I understand that, tell me if I'm wrong, is it was the idea that genes could… When you pass along genes, so the next generation genetics, it can be affected by the life experiences that you've had. So if you learn something in life, or a plant had something happened to it in life, it could pass that learning on to its offspring. Obviously, humans can pass on like teachings, but in terms of our genetics, I think that's pretty much well set. I can't, you know, learn something in life and then have my genetics, you know, have it genetically passed on to my children; I'll have to teach it to them directly. But he believed that genetics was partially learned. Is that correct? Something like that.
Aaron: Yeah, yeah. So an element of his theory was the inheritance of acquired traits, which, to make an example that sounds kind of silly: if you have lab mice and you cut off some of the mice's tails, you would then expect that their children would be tailless.
Max: Right? Or maybe have smaller tails or maybe, yeah.
Aaron: Yeah. And I would assume that there was some subtlety to it so that it was not so easily shown to be ridiculous as that particular example.
Max: Yeah, there always is.
Aaron: But that was the direction it was heading in, and in some ways, his story is one of why there's a danger in the adherence to technocrats, that he was brought in as, you know, one of the greatest scientists in terms of botany and genetics in the Soviet Union, and so they put him in charge of running their Agriculture Department. And he recommended a lot of things that had serious negative impacts, not just on holding back science in the Soviet Union for, you know, decades. This isn't just a case of saying, "Whoo," you know, "Russians and Soviets—bad. Capitalists and Americans—good." He held back their science, and as a result of his policies, led to millions of deaths, and that's not hyperbole. That's not exaggeration.
Max: Yeah.
Aaron: I'll let you bring in the concrete example there, but real lives were lost because of this.
Max: Well, yeah. I mean, I've been to Ukraine, so I've seen it...a lot of bad things happened in Ukraine. You know, you go to every museum, and they'll have examples from the Holodomor; they'll have examples from the Holocaust, they'll have examples from the two World Wars. And so it's...but yeah, the starvation at the hands of the Soviet Union, whether it was purposeful, it seems like in many ways it was, but it was also led by this faulty science. And by the way, that question of whether traits—genetic traits—can be, you know, can be acquired over your lifetime, [it's a] perfectly legitimate scientific question for studying. The problem is they weren't studying whether that was true or not. They were saying, "We're just taking this as true." And any scientist that's asking the question as to whether it's false is going to be, what, demoted, fired, or even worse—I mean, this was Stalin's time.
Aaron: It was literally made illegal to hold opinions or publish results that went against his theories to the point that people were sent to the camps as a result, which is horrifying.
Max: It makes me think that it's not just...it makes me question the motivation. Like, it wasn't just like this guy believed so strongly in this, and Stalin believes so strongly in this. It's like, no, Stalin had an agenda here, you know.
Aaron: And it's mind-boggling that they took Ukraine, which was literally the breadbasket of the Soviet Union. You know, it produced far in excess of its needs and fed a large portion of a continent.
Max: Yeah.
Aaron: To within a decade, having massive famines and requiring the import of international aid in the form of grain. It wasn't a natural disaster that hit them; they did this to themselves. And that's horrifying on so many levels.
Max: Right. Okay. Another example of dismissing certain scientific discoveries, and the scientists themselves in science departments, is Nazi Germany's label of "Jewish science" or "Jewish mathematics." I know there's a quote where, you know, I think it was David Hilbert, who was a German mathematician. When a Nazi official came to his city and said—I should double-check this quote—but I think he said, "Hey, isn't it great that we no longer have Jewish mathematics?" And he's like, "We don't have any mathematics anymore in this city." He was one of the greatest mathematicians of the 19th and 20th centuries, and basically, he was really old by the time the late 30s came along. But, you know, the idea is like, science is based on your, you know, religion, ethnic background, race, or whatever—even though in science, you're supposed to be a little more objective on that. You're supposed to say, "Hey, I have a bunch of hypotheses, and that's where Bayes' rule comes in. I have a bunch of hypotheses, I'm going to gather data, and I'm going to try to draw conclusions. And it shouldn't matter what my," you know, "race, or background, or ethnicity, or religion, or any of that, my age, whatever."
Aaron: Yeah.
Max: I could go...I could list all the transmitted, all the characteristics, you know, and clearly, there's a group of people who think, and specifically a group of people in the totalitarian mindset or in the collectivist mindset, who think that that is not the case.
Aaron: This is maybe a little bit more of a rationalist talking point than one of science.
Max: Sure.
Aaron: Although they tend to walk hand-in-hand. But if you're not asking the uncomfortable questions, then you're not really doing worthwhile research. And a lot of the sensitivity here has come around questions of, well, you know, doing research that proves or that suggests and asks questions about differences between people of certain genetic backgrounds, which can then potentially be mapped on to things about race. Anything that has to do with differential IQs is becoming very taboo. And, yeah, this raises some uncomfortable questions. It runs up against borderlines with, you know, things that could potentially be used for making certain arguments in the area of, perhaps, eugenics. That doesn't mean that we shouldn't be asking the questions; it just means that we need to be very rigorous in the veracity of the answers that we're seeing and that we're actually doing the work here as opposed to just hand waving and saying, "It should be this. Let's not look too deep into that."
Max: Yeah. I wanted to add one more example from—because I read a book—I think, oh, "Love and Math," I have that on my... it's by Edward Frenkel, who was a mathematician in the Soviet Union, then later came to the United States. He made the point that mathematics in the Soviet Union was the only subject that was very difficult for them to politicize. Like, how do you politicize algebraic geometry? Or how does that make someone in power uncomfortable? You know. So, I don't know. I don't know what to take away from that other than then they start targeting the actual people who are involved in the math and science, and being like, "Oh, well, this person is of this background so we can dismiss what they're saying even if they have a mathematical proof."
Aaron: Yeah. I haven't done a lot of research on the impact in that particular field in the Soviet Union, but I know that kind of the sister example to that would be in China during their revolution that there were huge purges of...
Max: Are you talking about Cultural? The Cultural Revolution.
Aaron: Yeah, the Cultural Revolution. The huge purges of academia, much of which involved labeling certain branches of science, even those in pure theoretical research, as being, you know, deviant and Western. And I could totally see them coming along and saying something to the effect of, you know, "Nonlinear mathematics is a deviant, capitalist aberration. Anyone doing research in that needs to be re-educated. We only do," you know, "we only do linear mathematics here, or we only deal with Euclidean geometry." They were certainly drawing…
Max: That happens all the time throughout history.
Aaron: They had no problem drawing kind of arbitrary lines there and saying, not only you shouldn't draw outside of these lines, but those who do are going to be threatened with violence if they don't recant their views.
Max: Yeah. Well, look, I mean, someone's got to decide who gets the funding, and someone has to decide what questions are worth looking into. And maybe an individual could start looking into funding or look into a question on their own time, but at some point, you know, you got to make a living. So like, that raises the question, you know, "Who should decide?" And I guess, I would argue, you have to just have a variety of different sources. Versus now where it's like everyone who is dispersing the funding now has to think the same way. So that's gonna be horrible for scientific inquiry.
Aaron: Yeah. There's a huge discussion to be had, particularly on the side of basic research and government funding: is that model working, has it ever worked, and how did we get to where we are? But the flip side of that is privately funded research tends to come with a, you know, a taint of the funders.
Max: Sure.
Aaron: Not necessarily in the same way that, you know, there have been "research organizations" set up, for example, like the group set up by the tobacco lobby to publish studies showing that cigarettes are healthy, and you know, and "We'll cure all your ails" and whatnot and stuff that, yeah, that was maybe not politicization, but certainly science with an agenda. But even organizations or individuals funding science who may not have a clear bias like that, they're viewed as highly suspect when their funding is coming from somewhere other than, you know, like the NIH or the Department of Energy, or a somewhat unimpeachable group, which those groups are not above playing politics to a certain extent.
Max: Oh, not at all.
Aaron: I mean, they have limited funds, and they have to allocate them to meet some sort of mission, and that mission does not escape political influence.
Max: Not at all. Yeah. So, I mean, one question that I have is, is it legitimate for scientists to not want to be involved in political activism at all? I mean, now, and this is being said in companies across the country, maybe even across the world, a lot of people have told me they have experience with this, now, they're expected to be activists, even though they just want to be doing their job, and doing their job well and getting paid for it. But we're kind of entering the mindset of "All is political," and so is that the road that we're going down permanently?
Aaron: I mean, hard to answer, but it certainly seems that we've taken a detour down that path to some extent. And this opens a path to a huge, very active discussion that has been ongoing for decades but, I think, is starting to flare up again, about what the role of those working in science and technology and engineering is when it comes to ethical questions about the work that they're doing and how it's used. So the classic example would be...
Max: I agree with that. I think that people should be thinking about that.
Aaron: Yeah. And I don't know that the answer is to say, "Well, all STEM students must take an ethics course in college," and then, problem solved.
Max: No, no, no.
Aaron: Maybe that's a good step in the right direction, just to make sure that we're thinking about these things—but it's an ongoing process and dilemma.
Max: You know, I think...
Aaron: Now, the classic...go ahead.
Max: If the scientists and engineers working on something never question the ethics of it, then it won't be questioned. I mean..
Aaron: Yeah.
Max: You know, there's no, I mean, I guess, there could be some other lines of defense, but that's gonna be the main one.
Aaron: Yeah, well, so there's the classic example of nuclear physics and the unlocking of the powers of the atom, leading to the Manhattan Project. And in a similar vein, not much later than that, there's, you know, rocket research. There's a famous parody song about Wernher von Braun, who quite notoriously worked with the Nazis developing the V-2 rocket, and then went on to work for NASA developing the Saturn V rocket that took the first humans to the moon. But the lyric was something along the lines of "I just make the rockets go up; I don't care where they come down," because his rockets were used in the V-2 attacks on Great Britain, particularly the city of London.
Max: Yeah, it does make you think.
Aaron: Yeah, well, and there's an arms race aspect to this, both literally in the case of arms, but also in the case of non-militarized technology. Once it's been proven that you can do something, then even if you decide, "Well, I don't want to do it this way," you know, "I think the costs are greater than the benefits," you can't just walk away at that point because the door has been opened, and somebody else can take that technology down that dangerous path. So that brings us to the current-day example: facial recognition. The cat is out of the bag there, and a number of companies have decided to either get out of that market space, sort of put kind of a freeze on their work, or maybe not release the most advanced version of their technology because they're afraid it could be misused. But this is an area that has huge potential impacts on our society, on policing, on privacy across the board, and it's out there—you can't unring that bell, so to speak.
Max: Right, right. I just want to add: obviously, I don't support Wernher von Braun or Nazi technology. But what I meant to say was, you know, sometimes you develop something, and you say, "Hey, this could be used for good, this could be used for bad." I mean, when early humans first discovered fire (discovered, not invented), they could have said, "This is really gonna mess things up." But no civilization would have existed without it. So, sometimes you still go ahead with it, even if there are positives and negatives.
Aaron: And there's a very…
Max: And obviously, if you're being hired by a totalitarian government, you might want to rethink what you're doing, especially the Nazi government. But yeah, I mean…
Aaron: There's a very narrow window.
Max: Obviously, facial recognition is a very good example.
Aaron: There's a very narrow window with these technologies or these scientific concepts where maybe if, you know, Enrico Fermi and some of his research partners had gotten together in the late 30s, you know, along with, I don't know, Heisenberg and some of the people, and they'd sat down and said, "I'm really uncomfortable with what we could be unlocking here. I think we need to," you know, "kind of make a secret pact to stop doing all this research and never talk of it again."
Max: Yeah.
Aaron: And maybe when you have half a dozen people in the room who are capable of doing that, you could pull that kind of a conspiracy off. But even at best, that's unlikely to stop things; it could delay them for a decade or two. But it's very difficult to prevent, you know, if something is eventually going to be known.
Max: Yeah. Sometimes you're in an arms race.
Aaron: It's hard to stop it from becoming known.
Max: Sometimes, you're in an arms race with a competing power, and you want your side to be ahead, and people are gonna side and do that. And you know, but there are definitely things that you still think you can do.
Aaron: The competition is all that matters in the end.
Max: When you're in that situation, there's still things that you can do to, you know, push the industry, or I guess, the scientific field in a more ethical direction, even as an individual—even if you can't move the whole thing.
Aaron: Yeah, so this gets to kind of the opposite of what we were talking about before. Is it possible to be apolitical when working in the sciences?
Max: Right.
Aaron: Which, if that is your desire, I think that's completely legitimate.
Max: Right?
Aaron: If you just want to go work on your theory, that should be an option.
Max: Like, let's say, I'm studying wormholes, again.
Aaron: Yeah.
Max: Theoretical physics. We don't know if they exist. I'm just working with equations, maybe looking at some signals from space and trying to make sense of them. Why do I then need to be a political activist? If someone asks, like, "What are you doing to combat, you know, the ancient hierarchies that we're trying to get rid of these days?" I'm like, "What are you talking about? I have funding to do this." But they might be like, "No, this doesn't belong." Does this whole new mindset impede my ability to study wormholes in any meaningful way? I think a lot of people are gonna say, "Look, the administration is blabbering on and on about how I've got to read this book and donate to this and go to this event, but whatever, I'll just keep my mouth shut and I can still do my job."
Aaron: There was an argument made, and it's questionable, and I can't say I'm Godwin-ing the conversation because we already talked about the Nazis. But von Braun would have made pretty much that argument: yeah, he was in the Nazi Party, but that's because if you had a fairly high-ranking scientific research position, you kind of had to be.
Max: Okay, hold on.
Aaron: And he was doing whatever he had to to keep working on his rockets because his dream was to go to Mars.
Max: Okay.
Aaron: And so he was willing to, you know, play along with whatever the organization, the administration, the faculty required of him so he could keep doing his work in his lab. And that went some pretty dark places, maybe not overnight, but over the course of a decade.
Max: Right. I mean, now, some of the things that they're asking people to do are just silly. And also, what he was doing was allowing that government to produce weapons; he was giving weapons technology to the Nazis. Whereas I'm studying something theoretical, and I'm like, "Yeah, I'm here in America, all this crazy stuff going on," and, you know, "Okay, they asked me to give my pronouns. I don't really understand this, but whatever, I'll give them my pronouns." And I'll still study my theoretical thing. It doesn't give an evil government any benefit. What's wrong with that?
Aaron: Well…
Max: Was what I just said coherent? I'm trying to make it up on the fly.
Aaron: I'm trying to come up with a counterpoint to it without going completely off the deep end.
Max: Okay.
Aaron: The easy response is, "Yeah, that's all well and good until they find out a way to weaponize wormholes and that they can use it to," you know, "to get rid of people who think wrongly by dumping them into another universe," but that's a lot of sci-fi what-ifs.
Max: Yeah.
Aaron: I think you have a valid point there. The other side of that coin, though, is that one of the challenges science has had, particularly in the last decade, two decades is how they relate to policy.
Max: Yeah.
Aaron: In that, science communication and science policy is something that's very important, but that science is not necessarily good at. And as we've discussed before, science reporting is egregiously bad. And so if scientists are going to stay in, not to use the pejorative term, their ivory tower and do their research, and not be so concerned with how it's used and how the public perceives it and consumes it, then somebody who invariably gets everything wrong in their reporting is going to be informing the politicians and the policymakers on how to act on that information. And we're going to get dangerous outcomes as a result.
Max: Yeah. I think the answer is... well, I don't know if the idea of being an apolitical scientist is going to be accepted as good in our society going forward, but I think there's certainly a space for it. But you do have to be on alert about how your science is presented, how it's used, and what the ethical implications are. Because maybe right now there's no benefit to wormholes, but if something comes up, you'll probably be the first person to know if you're working on wormholes, so just keep considering what's happening there. And I think, again, that's why, you know, mathematics in the Soviet Union—well, at least the higher-level abstract mathematics. Obviously, the stuff that kids learn in school is going to be like, you know, 3 evil chocolates.
Aaron: Johnny has 6 apples, and Bobby has 2. How many—no, they both have 0 apples now, and the State has 8.
Max: I'm sure it was better than that, but yeah. No, but when it gets so abstract that even the, you know, the politicians can't grasp what's going on, then maybe you do have a little bit more freedom to pursue your field.
Aaron: And the danger there is that, if the politicians can't understand what you're doing, then why are they funding it?
Max: Yeah, I mean…
Aaron: You got to have a good story there.
Max: Yeah.
Aaron: There's certainly something to be aspired to in the scientist or the inventor, or whoever, who is above politics, who stays out of the fray. But is that really a realistic position? I think we've tried to craft a case with this theoretical wormhole researcher that seems to fit in there, but whether that can be held true remains to be seen.
Max: Alright. So is the scientific method itself coming under question? Or, to put it another way, am I going to be reading an article a few months or a few years hence, like, "Is Bayesian Inference Racist?" or something like that? I think that...
Aaron: It really depends on your priors.
Max: Yeah. I think... yeah, I'm willing to have a discussion about this. I don't think Bayesian inference is racist or biased in and of itself. I don't think the scientific method is biased in and of itself. And so if someone starts talking about that, I think we have to move the discussion to how we choose what questions to ask. And, you know, maybe we need a diversity of priors and methodologies. Also, there's the imagination part of inference, where you're trying to gather up all the hypotheses; maybe you could introduce bias that way, but…
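As an aside, Max's point that the inference machinery itself is neutral, while priors can differ, is easy to make concrete. Here's a toy beta-binomial sketch (all numbers are illustrative, not from the episode): two observers start with very different priors about a coin's bias, see the same evidence, and end up close together.

```python
from fractions import Fraction

# Toy beta-binomial update: the method is the same for both observers;
# only the priors differ, and shared evidence pulls them together.
# (Illustrative numbers only -- not from the episode.)

def posterior_mean(alpha, beta, heads, tails):
    # A Beta(alpha, beta) prior plus binomial data gives a
    # Beta(alpha + heads, beta + tails) posterior; its mean is below.
    return Fraction(alpha + heads, alpha + beta + heads + tails)

heads, tails = 70, 30  # the shared evidence

optimist = posterior_mean(9, 1, heads, tails)  # prior mean 0.9
skeptic = posterior_mean(1, 9, heads, tails)   # prior mean 0.1

print(float(optimist))  # 79/110, about 0.718
print(float(skeptic))   # 71/110, about 0.645
```

The gap between the two posteriors (about 0.07) is far smaller than the gap between the priors (0.8), which is the usual sense in which "it depends on your priors" washes out with enough data.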
Aaron: Yeah, I used to think we're…
Max: Once I hear somebody saying that the scientific method itself is problematic, that's when I know that their ideological framework is well off the deep end.
Aaron: Yeah, and I think we're in danger of running into an issue with semantics here, perhaps. I'm certainly willing to concede that you can have results from scientific inquiry, or results from Bayesian inference, which are themselves racist or biased, but that's not a factor of the method, whether or not it was a method developed or discovered by old dead white men.
Max: Yeah, it shouldn't matter.
Aaron: It's a matter of garbage in, garbage out. And we've seen that, for example, with a lot of the facial recognition stuff. We've heard talk about how facial recognition is biased, or it's racist. And that's because a lot of the algorithms we've seen and covered have had particularly poor results with people of color as compared to Caucasians. But that's not a matter of the method and the approach being flawed, necessarily, as much as it is of the way that the particular tool was constructed. It's a matter of not asking either the right questions or all of the right questions, of stopping short of, you know, completing the full spectrum there, and so you can use the tool to improve that. We don't want to throw out the baby with the bathwater here and say, "We got some bad results using the scientific method. It's because of the scientific method. Get rid of it."
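The "garbage in, garbage out" point can be sketched numerically. Below is a toy one-feature nearest-centroid classifier on synthetic data (all numbers are made up for illustration): the method is identical for everyone, but a training sample that underrepresents one group, whose features are slightly shifted, produces a noticeably higher error rate for that group.

```python
import random

random.seed(1)

def samples(n, mean, label):
    # n one-dimensional feature values around `mean`, with a class label
    return [(random.gauss(mean, 1.0), label) for _ in range(n)]

# Training set dominated by group A; group B's feature values are shifted.
train = (samples(500, 0.0, 0) + samples(500, 3.0, 1)     # group A
         + samples(10, 1.0, 0) + samples(10, 4.0, 1))    # group B

# Fit a nearest-centroid classifier on the pooled training data.
c0 = sum(x for x, y in train if y == 0) / sum(1 for _, y in train if y == 0)
c1 = sum(x for x, y in train if y == 1) / sum(1 for _, y in train if y == 1)

def predict(x):
    return 0 if abs(x - c0) < abs(x - c1) else 1

def error_rate(test):
    return sum(predict(x) != y for x, y in test) / len(test)

err_a = error_rate(samples(1000, 0.0, 0) + samples(1000, 3.0, 1))
err_b = error_rate(samples(1000, 1.0, 0) + samples(1000, 4.0, 1))
print(err_a, err_b)  # group B's error rate comes out markedly higher
```

Nothing about the classifier is "biased"; the skewed error comes entirely from the data it was handed, which is exactly the distinction Aaron is drawing between the method and the tool built from it.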
Max: Yeah. Alright. Let me continue reading from this article:
"As ideological encroachment corrupts scientific institutions, one might wonder why more scientists aren't defending the hard sciences from this intrusion. The answer is that many academics are afraid, and for good reason. They are hesitant to disagree with scientific leadership groups, and they see what has happened to scientists who do. They see how researchers lose funding if they can't justify how their research programs will explicitly combat claimed systemic racism or sexism, a requirement for scientific proposals now being applied by granting agencies."
That's crazy, right? So he's basically saying that yes, you have to have an activist agenda in order to do science. He mentioned the paper "Ten Simple Rules for Building an Anti-Racist Lab," and that's exactly what it says. So that seems to be his position on what's happening in the sciences right now. I don't know, I'm not in academia, but I have a problem with this. Even if combating systemic racism and sexism is important, does every single point of inquiry have to touch on that now?
Aaron: I think we shouldn't be so surprised that we're seeing this; we're just seeing the focus shift. I could have very easily seen something similar to this happen, and maybe it did to some extent, on the environmentalism and global warming front: even if you're doing research on wormholes, your research proposal has to have a climate sustainability section, saying what you're doing as part of this effort to help preserve and improve the environment of the planet. And even if that has little or no direct relevance to the research you're doing, it's got to be part of your proposal, because the institution has this as a guiding principle and everything needs to play into it.
Max: Man, scientific institutions are…
Aaron: We've shifted focus.
Max: ...so much more messed up than I expected. I'm glad I'm not part of it again.
Aaron: Politics is everywhere. You know, from your local rotary club to the Royal Academy of Sciences. So there's no escape.
Max: Okay. Let me finish with his quote at the end:
"Whenever science has been corrupted by falling prey to ideology, scientific progress suffers. This was the case in Nazi Germany, the Soviet Union—and in the U.S. in the 19th century when racist views dominated biology." I'm thinking eugenics here. "And during the McCarthy era, when prominent scientists like Robert Oppenheimer were ostracized for their political views. To stem the slide, scientific leaders, scientific societies and senior academic administrators must publicly stand up not only for free speech in science, but for quality, independent of political doctrine and divorced from the demands of political factions."
So yeah, that's what he's saying, and I have to agree with him. I know he seems to think that the leadership just has to go ahead and do this; I don't—leadership seems to be swayed by outside forces, no matter what happens. So I don't know, maybe this just has to run its course. All of these movements that he talked about, whether it's the race science in Nazi Germany and the dismissal of Jewish scientists, or the science in the Soviet Union, or eugenics, or McCarthyism—all of that is in the dustbin of history. So, does this particular moment also end up in the dustbin of history? Or does it lead us to a brave new world? I think that's the story we will get to personally witness in our lifetimes. So stay tuned for your entire life, because that's when we're gonna get the answer.
Aaron: Bringing to actualization everyday, the ancient Chinese curse, "May you live in interesting times."
Max: Yeah. You know, I know it's supposed to be a curse, but there's something interesting about living in interesting times, by definition. I don't know if I want to live in boring times. Anyway, we could go into a whole philosophical discussion about that. Alright, let's finish up with something a little lighter. Well, it's not exactly lighter, but it's kind of funny. You found something about AI in the news. We've talked about GANs, those generative adversarial networks that create fake faces, and you found they're finally being used for the propagandistic purposes that we speculated about.
Aaron: Very much in the vein of, "Anything that can be abused will."
Max: Yeah.
Aaron: So we talked on a previous episode, Episode 56, I guess, about not just GANs but specifically about This Person Does Not Exist, which is a website you can go to where they continuously generate, or at least continuously serve up, faces that are the result of a generative adversarial network. And I don't know if they've been getting better and better, or if they've always been this good, but when we were checking it pre-show today, there were some pretty good ones out there. And then there were some that were pretty good, but something just didn't seem quite right; you couldn't put your finger on it.
Max: Yeah, like I can't exactly pinpoint it, and maybe there is a person who looks like that, but there's just something that seems a little off. It's that uncanny valley idea, right, when something is so close to right but not exactly. I described it this way: it was like a picture of what could be a good-looking or attractive person, but for some reason you feel like they're unattractive, and you can't really figure out why. It's so weird.
Aaron: Yeah, well, that gets into a whole nother conversation of, you know, what defines beauty and attractiveness, and it's...
Max: I thought imperfections were a big part of that, but maybe some of these imperfections are just… out of line.
Aaron: It turns out we're not quite at the point where we can just put a formula into a computer that spits that out.
Max: No.
Aaron: But, so, they've uncovered that a number of what they refer to as "conservative news outlets" have been publishing articles from journalists who do not exist.
Max: Okay.
Aaron: And some of those journalists...
Max: And not just a pseudonym.
Aaron: Yeah. It's difficult to prove definitively, but these are being passed off as journalist identities for people who are not, in fact, real people or journalists. So presumably somebody's writing this, unless they've got a really good AI algorithm that's writing the articles. But these articles have been submitted by "journalists" who have somewhat of an Internet paper trail. It turns out it goes dead after a little while, and there's not a real person behind it. And for some of those, they were using the classic approach of using stock photos, or stealing somebody's photos off of Facebook and slightly modifying and flipping them.
Max: Right.
Aaron: But some of them were using facial imagery from the generative adversarial networks.
Max: Yes, this guy Raphael Badani, that's the person who doesn't exist. He even has a LinkedIn profile. And, right, usually the spammers would just take pictures from somebody else's profile that seems to fit the persona. But no, now they're just making up people outright.
Aaron: Yeah. And this was done to try and push articles that had some sort of a bias. It's being speculated that it was either an intelligence organization or perhaps some advocacy group based out of the Middle East somewhere, but nobody's, you know, claimed direct responsibility. But they basically created these, what you might refer to as sock puppets, built up a portfolio of actual published articles under them, and then started using these fake people to push their agenda-driven articles, which were getting published in legitimate news outlets.
Max: Well, I think, to circle back to our previous discussion: yeah, when you develop a new technology, maybe it makes sense to ask, you know, what are the different ways this could be used? Again, I'm not saying that these generative adversarial networks shouldn't have been developed—I happen to think that was a good use of someone's time to figure out this technology—but yeah, it can be used for nefarious or iffy purposes.
Aaron: Yeah, well, and to touch on that arms race topic we talked about a little bit in that context: now that there's this publicly shared article talking about how they identified the flaws in this approach, GANs are going to get better. They're going to be able to identify some of the things that gave them away last time, and then it's going to be even slicker next time.
Max: I mean, the whole idea of a GAN is an arms race of itself.
Aaron: It's continuously learning. Yeah.
Max: Right? No, it's two sides of the same coin.
Aaron: Oh, yeah, yeah. Internally, right?
Max: Yeah, it's one side that generates the face and the other side that tries to pick out the real faces from the fake ones. So it literally is progress through an arms race, which shows us how an arms race can sometimes be good for humanity as well, or for scientific progress, although it can be destructive at times. It depends on what the arms race is about.
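The two-sided arms race Max describes can be caricatured in a few lines. This is only a toy numeric sketch of the adversarial idea (real GANs use neural networks trained by gradient descent; every number here is made up): the "generator" improves only because the "discriminator" keeps catching its fakes, and vice versa.

```python
import random

# Caricature of a GAN's internal arms race: real "faces" are numbers
# near 10; the generator starts far away and is pushed toward realism
# only by the pressure of being caught by the discriminator.

random.seed(0)
REAL_MEAN = 10.0

gen_mean = 0.0   # generator's current idea of what "real" looks like
boundary = 5.0   # discriminator calls a sample "real" if it is > boundary

for _ in range(200):
    real = random.gauss(REAL_MEAN, 1.0)
    fake = random.gauss(gen_mean, 1.0)
    # Discriminator update: move the boundary toward the midpoint of the
    # real and fake samples it just saw, to separate them better.
    boundary += 0.05 * ((real + fake) / 2 - boundary)
    # Generator update: if the fake was caught (classified as fake),
    # nudge the generator past the boundary, toward "real" territory.
    if fake <= boundary:
        gen_mean += 0.1 * (boundary + 1.0 - gen_mean)

print(gen_mean)  # ends up far closer to REAL_MEAN than where it started
```

Neither side "wins": each update by one player sharpens the other, which is exactly the sense in which the GAN's internal competition drives progress.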
Aaron: Yeah. So, a hat tip to the folks over at the War College Podcast, where I heard about this; I guess The Daily Beast is where the original investigative reporting was done. It's interesting, since it loops back to something we've talked about before. And I always like seeing stuff in the news about AI that's not just, you know, talking about Skynet or the singularity.
Max: Alright. Well, with that, I think we're going to wrap it up. It's been a late night. Thanks for joining me tonight, Aaron.
Aaron: It's always fun.
Max: Alright. And yeah, we should come back in August. And, oh yeah, I'll have you back. We'll talk about a special topic coming up in a couple weeks, where you'll talk to me about a mathematical topic.
Aaron: A pointless topic, you might say.
Max: Yes, a pointless topic. Exactly. But I'm sitting here; I need to turn my AC back on. And I think we all need to get some sleep. So we'll call it a day.
Aaron: Okay, now we just got to find some fake guests to get on the show, and maybe we can have an adversarially generated podcast.
Max: Sounds good. Alright. Have a great week, everyone.
That's the show. Remember to check out the website at localmaxradio.com. If you want to contact me, the host, or ask a question that I can answer on the show, send an email to localmaxradio@gmail.com. The show is available on iTunes, SoundCloud, Stitcher and more. If you want to keep up, remember to subscribe to the Local Maximum on one of these platforms and to follow my Twitter account @maxsklar. Have a great week.