
Episode 282 - Meganets with David Auerbach

Max talks to author and software engineer David Auerbach about his new book Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities.

David Auerbach

David Auerbach: Website

Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities by David Auerbach

Links

CNET - The Fall of FTX and Sam Bankman-Fried: A Timeline

Transcript

Max Sklar: Welcome, everyone. Welcome. You have reached another Local Maximum. It may be a comforting thought to believe that someone or some small group of people is in charge, and when there are problems, it's their fault. But they could also fix things if they wanted to.

Well, what if no one is in charge? What if all of our systems take on a life of their own and no single entity can control how things turn out? 

Well, this is the idea explored by my next guest, who gives, in my view, a convincing case for why our so-called big tech companies, the ones controlling vast social and information networks, don't have as much control over the outcomes of their products as we would think.

I had David Auerbach on the show four years ago for his book Bitwise, and we discussed the issues surrounding all of our information being digitized and being interpreted by machines. 

This week I had a really fascinating discussion to follow up on that, and I look forward to sharing it with you. My next guest is a former software engineer at places like Google and Microsoft and has transitioned into a successful writing career as an author. He recently released the book Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities.

David Auerbach, you've reached The Local Maximum. Welcome to the show. 

David Auerbach: Hello, thanks for having me. 

Max: All right, so today we're talking about your new book. Pretty hot off the press, out probably a few weeks now. It's called Meganets: How Digital Forces Beyond Our Control Commandeer Our Daily Lives and Inner Realities. So, first of all, how long has this been out? I think I got an early copy, so that was kind of cool. Thank you for that. 

David: Came out in mid-March. 

Max: All right, I guess my first question is, how did you come up with or find this term Meganet? Did you come up with it? Was it something existing? And what is it that separates a Meganet from a simpler, controlled network? Like, how do you tell if you're dealing with a Meganet? 

David: Yeah, the implicit question here is why a new word? And, yeah, it's my word. I found out after I published it that apparently a book 25 years ago used it, but nothing is ever new under the sun. 

I'd spent a long time thinking about what exactly is going on with humanity and technology and what exactly network life is doing to us and why it is that so many of the problems we're confronting, as far as making online life habitable, productive, pleasant, you name it, or simply organized and non-chaotic, seem to run aground. Why people are so dissatisfied with the state of things, and yet why that dissatisfaction does not seem to produce positive change. 

Ultimately, I felt like we weren't looking at these systems in the right way, that we saw it way too much in terms of individual and corporate agency. And that actually you need to see these networks, whether these networks are social networks or cryptocurrency networks, or online games or AIs, that all these networks share certain characteristics to them in terms of sheer size and speed and their ability to produce feedback-driven effects that just happen spontaneously. 

Those, which I call the three Vs: volume, velocity, and virality, are what differentiate a Meganet from what we tend to think of as our networks today. The determining factor, I think, the real difference, has been the addition of hundreds of millions of users exerting actual influence on these networks. 

I used to work at Microsoft and Google, and I would say that we never foresaw a time when we would have less control over our systems, because we assumed that since we wrote the code and administered it, even if the systems were getting larger, we would still have a certain degree of control over what the algorithms were. But that's changed. 

What happens now, whether in AI, where human-produced training data that's certainly not authored by programmers is being dumped into it, or in the case of social networks, recommendation engines, you name it, is that those algorithms are constantly being shifted and influenced by the people who are interacting with them. And that does give every user a little bit of control. 

Not decisive control, but in sum, that does take away control from the people who build, program, and administer these networks. And it's that interaction of the human and the computational components at a speed and size we can't keep up with. We can't moderate it or get out ahead of it because there's just too much going on; it's happening too quickly. 

That's the defining factor of a Meganet, and it's why I feel that our existing terminology isn't good enough to capture it. Because we're looking at these pieces in isolation. We treated it as there's the tech company, or there's the hardware network, or there's the algorithms, or there's the user base. But in actuality, it's the interaction of these pieces that's producing unprecedented phenomena that I think we're having a difficult time getting a hold of. 

Max: Right. One of the senses that I got from your book is that it goes beyond just the individual users having an effect where it's uncertain. It's not something as simple as, oh, people like cat videos, so then we get cat videos; that, you could wrap your head around. But it's more like legitimate users and trolls and people trying to make money all kind of come together. And sometimes the effects are really surprising and hard to figure out where they're coming from. Would you say that's right? 

David: Historically we have not thought of software in this way. We think of software as you program it and you ship it and it does what you tell it to and you fix the bugs. But that's really not the case anymore. It's much closer to something like an economy or even something like the weather, where you're looking at systems where you can certainly exert influence, but you can't track it because there's simply too much going on that you're never going to catch up with it. 

In effect, you're playing whack-a-mole or you're solving the last crisis even while a new one is already being produced, and you don't even know about it until it's too late. But we aren't used to thinking of technology in those terms, obviously. 

That's why I feel that we really do need to change how we perceive and think of these systems, because as long as we think of them as controllable in the way that software traditionally has been controllable, we're just going to be smashing our heads up against the wall. And I don't hesitate to criticize tech companies, but I also think that asking them to do things that they literally can't do is not going to be helpful either. 

Max: Yeah, and this isn't a question I wrote down, it's just coming off the top of my head. I think this is something that the people on the front lines, like the product managers, the engineers, and the users themselves, sometimes understand more than the leadership at a company. It's like, no, the users are doing something with this thing that you never intended. This happened to us at Foursquare: this product is not what you think it is. 

I don't know, sometimes you see leadership being like, well, we want it to be this way, so we've got to kind of ban this behavior and that behavior. But then there's also kind of the strategy of, well, kind of let the users do what they want. And then, well, sometimes you go down a deep, dark place in that, but sometimes, you get beautiful results where people kind of self-organize and sort of figure out how to solve their problems that you weren't even trying to solve. 

David: It's tricky. I think there's a consensus that total anarchy does not lead to happy results. But the problem is that it's a question of who exerts control, how that control is exerted, and avoiding unintended consequences. It's clear if you look at the history of Twitter over the last year or so. Elon Musk obviously went in thinking he could sort of just smash his fist and get Twitter to be what he wanted it to be. And it seems pretty clear that that hasn't happened. 

Maybe you can do that as far as demanding that cars be made a certain way, but it's different when you literally don't have that degree of control over the system that you're running. And you see this in other instances too: in the run-up to the 2020 election, Facebook banned all political advertising. That's not the action of a company that can make fine-grained delineations of accurate versus misleading advertising. 

They limited it so that you couldn't forward any link to more than five people. This is, again, not the action of a company that can do content-based filtering. And I don't think the question comes down to a lack of will on their part. I don't think it even comes from a monetary incentive on their part. 

I think it literally comes from an incapacity, an incapacity to manage their systems to the degree that they could generate effective results. That doesn't stop them and other companies from doing these sorts of more individualized crackdowns, but I don't think anybody thinks those make a huge amount of difference. Again, it's a game of whack-a-mole.

Max: Right. And so it's interesting, you said that total anarchy doesn't lead to good results. But then there's also the other end of the spectrum, which is sort of a total dictatorship, where you somehow think you're going to have complete control. Not only does it not lead to good results, it sounds like it's just impossible. 

David: It's funny, I mean, I've had some people act as though they think that removing what they deem as problematic content from the internet is just a matter of will, and I just don't see how you can get to that. I think that there's a subset of content that the overwhelming majority of society deems to be unacceptable. Child pornography, for example. 

In that case, when you have that level of collective, unified will, yes, you can get somewhere. But that's a distinct minority of content. Thank God for that. But when it comes to something like misinformation, or even what one might deem toxicity, you're in much more of a gray area. Most people are not pedophiles, but I think everybody is a little bit toxic sometimes. 

Max: Everybody disagrees, yeah.

David: Good luck figuring out what truth is in today's age. So the thought that you could administer that in any sort of top-down way, and in a way that wouldn't cause some significant segment of the population to be very unhappy with it, feels like a non-starter to me. Once you get into the content realm, you're playing with fire, and you have both the problem of adjudication and simply the problem of actually implementing it. 

Even if you could actually identify a criterion for verbal content, saying, okay, look, we don't want people to express such and such a bigoted sentiment online, language is too rich and flexible for that to do a good job. And you see this on Facebook, where I think everybody has friends saying, oh, I just got banned for two weeks for being toxic or something, and I don't even know why. 

Whereas there are other people you can see posting incredibly vile stuff who seem to get a completely free pass. It's because you need human moderation to do it, and human moderation is imperfect to begin with, and they are using machine pre-filtering, which is that much more imperfect as well. 

Max: Yeah, it almost seems like, and again, off the top of my head, if you come up with these vague terms like toxic, oh, you're going to get banned if you're toxic. Then if I have an enemy out there who I want to get banned, I have an incentive to tell the benevolent rulers like, hey, this is toxic activity right over there, so get my person banned. So that kind of expands what's toxic. 

David: That gets to, I think, the issue of virality, which is that these systems are feedback-driven. You're never stepping into the same data stream twice. The humans aren't so consistent to begin with, but even the algorithms: if you have a dream of an algorithm that's identifying bad content, or any type of content, true content, false content, misinformational content, whatever, that algorithm is going to be tweaked and massaged as well in terms of how people interact with it. 

You're in this constant state of flux, and expecting anything to hold up to real scrutiny is just not going to happen. And I think we're seeing this in a more visible way with AI, where AI suddenly generates nonexistent court cases and people are like, why the heck did it do that? It's like, well, because it literally has no conception of what's true and what's false. It's just generating text probabilistically, and, well, that's the state-of-the-art AI. 

Indeed, the filters that are filtering content through AI or whatever aren't doing much better than that. 
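To make the "generating text probabilistically" point concrete, here is a deliberately tiny sketch of next-word sampling. The vocabulary and probabilities are invented purely for illustration; this is not any real model's code or output, just the bare mechanism of picking words by likelihood, with no notion of truth attached.

```python
# Toy sketch of probabilistic text generation (invented numbers, not a real model).
import random

# Hypothetical next-word probabilities after the prompt "The court ruled in".
next_word_probs = {
    "Smith": 0.40,     # looks like a real case name
    "Johnson": 0.35,   # also plausible-looking
    "Frobnitz": 0.25,  # corresponds to no real case at all
}

def sample_next_word(probs):
    """Pick a word in proportion to its probability; truth never enters into it."""
    words = list(probs)
    weights = [probs[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

# Some runs look citable, some are confident-sounding fabrications;
# the sampler has no way to tell the difference.
for _ in range(5):
    print("The court ruled in", sample_next_word(next_word_probs), "v. State")
```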

Max: Yeah, so I started this podcast about five years ago, and I remember reading a quote in one of the early episodes from Mark Zuckerberg saying, all this abuse and bullying and misinformation, we're going to use AI to solve the problem, and within five to ten years the technology will get there. 

I was very skeptical at the time, and it's not because I think Mark Zuckerberg is clueless and doesn't know how to figure things out. And I certainly don't think that of Elon Musk either. But I was skeptical, even if I maybe didn't have the language to describe why. You say that the use of AI and deep learning in order to tame the Meganet has actually made them even more predictable. 

Why is this? How do you think about this problem? 

David: Which problem, specifically? The AI problem?

Max: Yeah, making Meganets even more unpredictable than they otherwise were, rather than taming them and trying to get them under control. 

David: Wait, predictable or unpredictable?

Max: Unpredictable. 

David: Oh, unpredictable. Sorry. Well, I think that a lot of the problems come from the congealing of what I call narrative bunkers. The metric for what content is shown to people is that amorphous term, engagement, which means give people more of what they seem to like. And that's sort of a way of throwing the filtering problem back at the users. So the companies are saying, okay, well, we're just giving people what they want. We're not exercising any bias or judgment on that. 

There are certain effects that follow from that, and giving people lots of what they want isn't always a good thing, because you're going to start clustering people together who believe exactly the same thing and become that much more impatient with, or even uncomprehending of, people who believe anything else. 

The question is, if you're not going to reinforce that by constantly showing people more of what they've already been seeing, what do you replace it with? That's when it becomes prescriptive. You start saying, oh, okay, well, we'll show them the truth. No, that's bad, because now you've got a company in the business of adjudicating truth.

My recommendation is that instead you look at it in a more non-targeted way. You start showing people simply disparate aspects of content without trying to get into rating it, or at least you do less of that than we're trying to do now. You don't put up content warnings saying, this might be misleading, go to this website. Those don't seem to work. They seem to just make people feel that much more aggrieved, like, oh, they're really fighting the power, or what have you. 

It doesn't necessarily need to be more unpredictable, but I do think it should be less directed and less homogeneous. You're looking for a way to break up the feedback loops that keep pushing things in one direction. So if our problem is that these Meganets tend to accelerate out of control, then if you can intervene in ways that cause them to go in multiple opposite directions simultaneously, they may start to cancel each other out rather than producing nasty positive feedback effects. 

Now, you're going to need to do some experimentation to see exactly what works and what doesn't there. But just the fact that Facebook was trying non-targeted mechanisms, even if they were ham-fisted ones, you know, limit forwarding, stamp out all political advertising, well, that certainly does something, and it did something that didn't get them charged with bias. 

That says, okay, that's an area in which we can see that we can do some sorts of interventions that aren't going to be just banging our heads against the wall or throwing pebbles into a hole. 
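As a rough way to see the dynamic being described here, the following is a minimal simulation sketch. The topics, numbers, and blending rule are invented for illustration and are not any platform's actual ranking code; it just contrasts pure engagement-driven ranking, which keeps reinforcing whatever a user already clicks, with blending in a share of non-targeted items, which spreads exposure back out and damps the feedback loop.

```python
# Minimal feedback-loop sketch (invented topics and numbers, not a real recommender).
import random

TOPICS = ["A", "B", "C", "D"]

def show_one_item(prefs, blend):
    """Show one item: with probability `blend`, pick a random topic (non-targeted);
    otherwise pick the topic this user already engages with most."""
    if random.random() < blend:
        shown = random.choice(TOPICS)
    else:
        shown = max(prefs, key=prefs.get)
    prefs[shown] += 1.0  # engagement feeds back into future ranking
    return shown

def run(blend, steps=1000):
    prefs = {t: 1.0 for t in TOPICS}
    for _ in range(steps):
        show_one_item(prefs, blend)
    total = sum(prefs.values())
    return {t: round(p / total, 2) for t, p in prefs.items()}

print("engagement only: ", run(blend=0.0))  # one topic ends up dominating
print("30% non-targeted:", run(blend=0.3))  # exposure stays more spread out
```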

Max: Yeah, there's one part of the book that kind of blew my mind; I had to tell the person next to me when I read it. And I'm surprised I never heard of this story before, but it was the story about the 400,000 people in China who were farming gold in World of Warcraft. And I thought, what a waste of human effort. How did this happen, and what do you think is the takeaway from that story? And does it tell us anything about how digital currencies might evolve? 

David: Yeah, it's funny because that story is like 15, 20 years old now. Those things still go on, but the peak of it was in the first decade of the century, and Steve Bannon was in on the action. He was trying to broker a deal between, I want to say, Goldman Sachs and some gold farming companies. 

It's crazy, but in some ways I think it was good. In effect, you're getting labor turned into exchangeable currency, even if it's an illegitimate and limited currency. And yeah, that does anticipate some of what we see today with cryptocurrency, except it's being done with processing power or GPUs. 

What are the takeaways? Again, the sheer connectedness and the ability for even unofficial economies to be set up in ways that get around regulatory apparati. That's not going away. Right now there's, I think, a tremendous effort to get a grip on cryptocurrency and rein it in, especially after the FTX, Sam Bankman-Fried stuff last year. 

I do think that there will be some consolidation, especially since a lot of the smaller currencies are just sort of dying out at this point. But the nature of these systems, the point that I make is that they don't admit the level of control that you get with a central bank. 

Even if you set up central bank-like systems for cryptocurrencies, which I assume will come into being in spite of the intentions of cryptocurrency's founders, even if you do set those agencies up, there's going to be more anarchy implicit in it, simply from the degree of interconnectedness, the speed of interaction, and the absence of the need for a central hub controlling it. 

It's not like there aren't forces in the opposite direction. But I do think that it's going to be an ongoing issue in a way that you haven't seen with traditional currencies. You have seen it a little with traditional currencies and high-frequency trading and stuff; it's just that that was still more confined to high-profile institutional investors. 

If you could have a bunch of people join together and form the equivalent of some kind of investment bank and do the sorts of crazy things that investment banks have been doing, what happens then? Well, I think we're going to find out. 

Max: Yeah. Is that like the DAO in the early days of Ethereum? I remember that. Did you cover that story in the book? 

David: Yeah, I talk about that in the book. The DAO was an example of levels getting blurred, because it was such a mess, they had to change the baseline rules of Ethereum to clean it up. And you can't do that on a regular basis. So the mere fact that it had to be done is a really worrisome thing. 

Max: Yeah, I remember that happening and I remember it being quite shocking to the industry. And what's more is like, even though they reversed the hackers’ coins, the hacker still has their coins on Ethereum Classic, which still exists to this day. 

David: Yeah, I think it was like 90-10 or something, wasn't it? 

Max: Well, yeah, I think at some point it was actually a lot higher. But now Ethereum Classic has fallen quite a bit compared to Ethereum. But I think there was a point where it was almost like three to one, two to one. I could be off, but that's if you just look at the highs.

David: Not the price, but when the fork happened, how many users went with which side, what the vote was. I feel like it was 90-10 or something.

Max: I wouldn't be surprised if that's correct. 

David: Yeah, that sounds right. But yeah, those forks are problematic. And the thing is that the fix was really ugly: they were basically implementing a lower-level fix for something that had gone wrong at a higher abstraction level, which, yeah, is horrendous. You're never supposed to do that. 

I get why you would want to, under the circumstances. If you're faced with what seems like it could be an existential crisis in your cryptocurrency, what are you going to do? The issue, though, is that the underlying circumstances that allow for that sort of thing haven't gone away, and you still have the capability for forks to happen under increasingly stressful circumstances. And I mean, that gets into a lot of cryptocurrency stuff that we don't have to pursue right now. 

The point that I want to get back to is that these systems have a certain amount of chaos built into them because of what I call their velocity, volume, and virality, and that is going to remain untameable, especially once you get into the economic realm. You're looking at things that could impact the larger economy; that's what we saw last year. Cryptocurrency was sort of siloed off from the greater economy for a while, while it was a niche thing, but that's stopping. 

You're seeing more and more cryptocurrency merge with the traditional economy. And that means that whatever goes on in cryptocurrency is going to have a greater and greater effect on things that we wouldn't think have anything to do with cryptocurrency. Just because we don't spend it on a day-to-day basis doesn't mean that it can't tank our economy. 

Max: Yeah, a couple of things to say about that. Sometimes I wonder what would have happened, where would Ethereum be today if they just said, well, sorry, it's all Ethereum Classic, we're just not going to undo that? I don't know. I don't know if I could predict it. I don't know if I would advocate that that's what they should have done. It's kind of an interesting what if of history. 

David: Well, given that a large majority of their users did not want that, presumably they could have instituted a fork anyway. If you can get enough users on your side, you can force it. So there is that aspect of majority rule. 

Max: Yeah, it was almost unstoppable then. 

David: But if you think of a case where it's more 50-50, that's when you can get a lot of trouble. A fork's survival is dependent on that; you can survive a fork as long as most people come down on one side. And with the very early Bitcoin failures, it was almost unanimous. It was like, oh, there's a bug in the code. We've got to fork it in this way. Everybody do this. 

At that time, the community was small enough that it was like, okay, great, we'll just follow the leaders on this. But when you get into increasing gray areas, that's when it's like, okay, well, now you really do have to get a large number of people to commit, otherwise it does fall.

Max: That's analogous to what you said before about social networks, where there are certain things that virtually everyone agrees are objectively horrendous, and then that stuff we're not so worried about. 

David: Right. And so if you get into a scenario where there's some split down the line that greatly favors half of a cryptocurrency's users and greatly disfavors the other half, you're going to have to have some sort of uniform mechanism. And I suspect that as you get a build-up of institutionally administered cryptocurrencies, there probably will be those, but it's dicey. That's the thing, is that you're playing with fire. 

No matter how many institutional controls you try to build into it, you're still going to run the risk that you still don't control it to the degree that, say, a central bank controls a country's currency. 

Max: Yeah, I want to kind of dive into the idea that these Meganets kind of spill over into the general economy or cryptocurrencies as one example. I'm sort of remembering, before there was 2021 in crypto, there was 2017, and that was another high. And around that time, we were holding brainstorms about, okay, we have this game where you win virtual coins. 

Everyone's telling us you got to have an ICO and you got to turn this into a cryptocurrency. And we were looking at it and we're like, look, if we give people real money for playing this game, cheating, which is already a problem, is going to become an industry. And it's like, well, that's really scary and so I'm kind of glad we didn't do that. 

I guess the question that I have down here, maybe you comment on that as well, is what does it mean for Meganets to connect and consolidate? And why do they do this? Does this mean that we're destined to kind of live under some kind of government monopoly of our online services, or are we going a different direction?

David: You know, the issue is that you get gains in efficiency and functionality when you merge things together. This is why even though people aren't happy with Facebook, it's very hard to split off because of network effects. Likewise, that same inertia tends to drive networks to coalesce and once they've coalesced, it's hard to take them apart. Facebook has become sort of a clearinghouse for lots and lots of data about people that aren't even signed up with them. 

The government, I mean, you can draw hard lines; health information, for example, is still somewhat siloed off because of protections around it. You have to do that very consciously, though. And in India, for example, every user has a single identifying number called Aadhaar, and the government itself doesn't track much besides the number and some sort of biometric identification.

Non-governmental corporations and parties are piggybacking data associated with that number. So what you get is an increasingly large virtual network, so that you are building up a profile. Even if the government only holds a small piece of it, they are still acting as the unifying glue. And I think that that's probably what you're going to see. 

Does the government control it? Well, whenever something goes wrong with Aadhaar in India, the Indian government is fond of pointing out that, no, that was a private company that did that. And for the most part, I think that's true. But that doesn't change the problem that, since everything is hooked together, it feeds back into both the public and private aspects of it. 

Authoritarianism isn't quite the issue per se, so much as the ripple effects that you're going to get from all these systems being tied together, and that something going wrong locally can now spread through it. Think of it in terms of identity theft: from an efficiency standpoint, it would be great to combine my Social Security number, my driver's license, my benefits card, you name it, all the various identifying numbers I have with the government. 

You can see, however, what that means in cases of identity theft: it just makes any violation or compromise that much worse. And that's the sort of coalescing and danger that you're looking at. 

Max: That's the new world we've got to get used to. This last question, I don't know if it's going to have a good answer, but it's a little more provocative. So what I want to ask is, it might be more difficult because of the sheer scale of these things, but what if I don't just want to talk about Meganets or just be a user of one, or a victim of one, or a beneficiary of one, whichever side you fall on. 

Let's say I actually want to build a Meganet. Would that be something worth pursuing? Do you have any advice on that? 

David: I guess, well, it's sort of an after-the-fact term that describes a system within society. You can build a steam engine, but you can't necessarily build an industrial revolution or an industrial city. You can, but it takes other factors.

So if you want to build one, you have to build a network that accumulates a sufficiently large number of people acting on it in very tightly conjoined ways, at speed. So you don't build one so much as grow one. 

Some of it is luck, some of it is not necessarily under your control, being in the right place at the right time. Some of it is, if you look back at Facebook versus Friendster, Facebook definitely did things to expand their virality and attract and compel their users in a way that, say, MySpace and Friendster didn't. So I guess you could say what you want to do is increase the participation and virality of it. 

I don't know about the ethics of it because you're effectively doing things that are getting people hooked on it, but that's what you want to do. You do things that build growth at any cost. 

Max: Yeah, interesting. All right, well, thanks. Perfect timing here. Really interesting book, Meganets by David Auerbach. Is there anything else that you'd like to add to our discussion today, and where can we find you online? Where can people find out more about the book?

David: Yeah, you can find me. I have a Substack, auerstack, auer and then stack, auerstack.substack.com where I've been talking about these issues as well. I'm on Twitter and you can find me in the usual places, but thanks for having me, Max. 

Max: Yeah, fantastic. Thank you David, so much for coming on the show. 

David: Yeah, and maybe see you back in New York. Take care. 

Max: That's the show. To support The Local Maximum, sign up for exclusive content and our online community at maximum.locals.com. The Local Maximum is available wherever podcasts are found. If you want to keep up, remember to subscribe on your podcast app. 

Also check out the website with show notes and additional materials at localmaxradio.com. If you want to contact me, the host, send an email to localmaxradio@gmail.com. Have a great week.
