Digital Governance and the State of Democracy: Why Does it Matter?
Justin Hendrix / Oct 8, 2022

Audio of this conversation is available via your favorite podcast service.
On September 21, 2022, I moderated a panel discussion for the McCourt Institute at a pre-conference spotlight session on digital governance ahead of Unfinished Live, a conference on tech and society issues hosted at The Shed in New York City.
The topic given to us by the organizers was Digital Governance and the State of Democracy: Why Does it Matter?
Joining me for the discussion were:
- Erik Brynjolfsson, the Jerry Yang and Akiko Yamazaki Professor and Senior Fellow, Stanford Institute for Human-Centered AI (HAI) and Director of the Stanford Digital Economy Lab;
- Maggie Little, Director of the Ethics Lab at Georgetown University;
- Eli Pariser, Co-Director of New_Public, an initiative focused on developing better digital public spaces; and
- Eric Salobir, the Chair of the Executive Committee, Human Technology Foundation, a research and action network placing the human being at the heart of technology development.
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
I have the privilege of doing this first panel and hopefully kind of creating a little bit of the norms for the culture of what will happen on stage here today. I'm grateful to the organizers for having me. We're going to talk about some big issues. I'm going to go ahead and invite up my panel to make their way onto the stage. We'll have, as you can see-- their heads are here in front of you on the screen-- Erik Brynjolfsson of the Stanford Institute for Human-Centered AI and the Stanford Digital Economy Lab. Erik, thank you very much. Maggie Little, Director of the Ethics Lab at Georgetown University. Thank you for joining us. Eli Pariser, who has worn many hats but is now co-director of New_Public, which if you don't know, you should check out. Thank you. And Father Eric Salobir, who's chair of the executive committee of something called the Human Technology Foundation, which he'll tell us more about. So Father Eric, thank you for joining us.
Okay, so we have the opportunity, because we've been invited here into this beautiful space, to be a little bit subversive and maybe question the premise for the panel, and maybe even question some of the premises of the agenda today. Right? That'll be our job.
So, you know, we've had various ideas kind of put to us by the organizers. "The data surveillance economy is fueling inequity, eroding trust, threatening democracies worldwide. How did we get here? How do we chart a better path forward? Why and how should ground rules, safeguards, and ethical standards be integrated into the tech development process?"
Maggie Little:
In 34 minutes? (Laughs)
Justin Hendrix:
34 minutes. That's right. So we'll sort it all out. But perhaps I'll go around with each of you and give you an opportunity to sort of explain your perspective... When you were invited to this and you knew that was the premise, what did you think of that premise? How do you see this interconnected set of problems? Erik, I'll start with you.
Erik Brynjolfsson:
Well, thanks, Justin. Thanks so much for the kind introduction. And thank you to Frank and the whole team here. This is an amazing project that's underway here. I believe very sincerely that we're at a fundamental turning point in history. As Frank mentioned, the technologies today are really changing the way our economy and our politics work, and we have a chance to reinvent what we're doing. Let me just put a little historical perspective-- I'm a professor.
So back in 1776, the steam engine was improved by James Watt, and that triggered, as we know, the Industrial Revolution. But also that same year, Adam Smith published The Wealth of Nations, laying out the blueprint for a new economic system. And of course, the United States was founded-- a new political system.
And today, we have, I think, equally important, maybe more important technologies that are triggering a transformation-- not in the way we work with our muscles, in the world of atoms, but in the world of bits, with our minds. And the implications, I think, are even more profound. And so I'm-- working with Frank and others, inspired by the Federalist Papers-- developing a new set of papers, a new blueprint. We're calling it the Digital Society Papers, to lay out some of the changes that are happening. And there's already been some amazing research on some of these topics that we're beginning to bring together.
Let me just highlight two quick articles in Science. About five years ago, there was a paper published in Science called The Spread of True and False News Online. And it was really scary, because it showed what we all worried about: that false information-- misinformation-- spreads three times further, faster, and deeper online than it does in person. So it's not just anecdotes; they documented over 100,000 stories, some true, some false, and you could see that the false information spread faster.
Now, this was not because the social networks had some evil plan, as far as I know, to spread misinformation. It's because they were inadvertently designed to trigger the parts of our brains-- the reptilian system, quick thinking, as Daniel Kahneman calls it-- and that means you want to spread things that are amazing, things that you've never heard before, things that are unbelievable. Well, guess what? Lies are more unbelievable, so you get a lot of false information spread. And the platform itself is in many ways amplifying that. These platforms may be destroying society, destroying democracy, because one of the most important things in a society is having truth privileged over misinformation. We've done the opposite-- not by design, but in effect.
There's a more encouraging paper that was published this past week. I was a co-author, and I'm so proud of it. It was also in the journal Science, and it described what we call The Causal Strength of Weak Ties. We all connect to each other, increasingly digitally. (Thankfully we're in person now.) And maybe the most important paper in sociology of the past century was Granovetter's paper, The Strength of Weak Ties. But it was observational, not causal. Now we have data from millions of connections that people are making online. And what we found was that indeed, it's not just the people in your immediate friends circle who are important; the digital networks allow you to connect to people thousands of miles away.
People in different industries, different companies you hadn't connected with before. And millions of jobs were created that we could measure in this. So that's a much more hopeful use of the technology: to connect people to each other, to people who had not known each other before.
And the takeaway I have from those two pieces of research is not that technology is evil and it's going to destroy us, or that technology is our savior and is going to connect us. It can be used in both ways. It depends fundamentally on how we design it. And as we get more and more powerful tools-- I believe we have more powerful tools now than we've ever had in history-- we have more power to change the course of history going forward. So I feel like one of the most important things happening on the planet right now is thinking about how we're going to use these new digital platforms to reshape the economy, to reshape democracy, and to reshape the way we connect to each other.
Justin Hendrix:
Maggie, to you. You're running a center that's focused on ethics. How do you come to the broad premise of this conversation?
Maggie Little:
Well, you know, you asked at the start what I thought when I first saw the invitation. I'm being honest here-- it's not because Frank is in the audience. Hi, Frank. But literally, I thought, thank God. Finally. Because today is a day on governance, different models of governance, including regulation-- that government isn't always bad, that it has a role, an irreducible role. So when I think of ethics in this space, we agree, I think everybody in this room, that when tech goes awry, when its power is overcentralized, or the business models behind it are corrosive, right? That can be incredibly damaging to democracy. But the question is... Oh right, the Hamilton line, right? Revolution's easy, governing is harder.
So that's why I got excited about today. New technology does require new forms of governance. But I do worry sometimes in the Web3 discussions-- to which this audience is an exception, right?-- that decentralized tech is sort of equated with democracy, and that all the structures needed to ensure democratic values and democratic resilience could be found in the code, as it were. And technology is great, but it's always technology used by humans. And so the tech needs to be designed in light of, and regulated in light of, actual humans and actual societies and the historical precedents we have experience of.
So, one of the things that I find most important is just to keep the reminder going that while the current problem is well diagnosed as one of overcentralization-- no question in my mind-- that's not the same as saying that at every moment, at every decision point, full decentralization is the answer, right? There are a lot of ways to blow up the world. Right? No structure other than letting people have full user choice and maximizing liberty themselves, as important as that is, will ensure that we don't end up in the wild west again. So, ethically speaking, I would say meaningfully expanding and redistributing meaningful power also includes the need for human forms of governance, even with the advantages of the tech in mind.
Justin Hendrix:
Eli, I think of you as in the business of plan B, of trying to create alternatives at this point. How do you come to this question?
Eli Pariser:
Well, first I just think we should, as Maggie said, ground it in why governance matters, right? And it matters because it's about power. It's about who gets to speak. It's about which people matter, and what power we have together. Right? And I think there are lots of conversations happening right now about 'democracy under threat'-- I think that was somewhere in the tagline.
But I think the biggest threat to democracy is its own total incompleteness. We've never actually figured this thing out in a way that gives dignity and power and freedom to everyone in a society, especially in a multicultural society. Never happened. No state on earth has ever done that. And so there's this real project of continued invention that has to happen.
And that's before you get to the fact that we're governing with, like, 1700s technology, right? And so we're trying to invent something new with technology that's several centuries old in terms of its communication norms. And so my feeling, and people here may disagree, is that when I imagine-- and I'm a hopeful person, and I have faith that we're going to get there-- but when I imagine the multiracial, pluralistic democracy that's going to exist in 2050, I can't see how we get there by little incremental tweaks, right? I think we need to be thinking much bigger about what kinds of powerful data structures we need. And I think that's a generational movement, and it's a really broad movement. And it's so exciting. I mean, for me, being in this for 20 years-- right now, it's starting to feel like people are actually up to the scale of that movement, to the ambition of that movement. It's starting to actually feel like it's a thing that's happening all around the world. And so I'm very grateful to everyone who's here, who's part of really feeling like, "Okay, we've got to do something much bigger."
So New_Public, you know, we're really trying to tackle this one piece of it, which is: democracies have always needed public spaces. Public spaces have always been key to how we build cohesion, understanding across difference, weak ties. And we know what that looks like in the 20th century. It looks like parks, it looks like libraries. But as we move into a world that's mediated by digital experiences, what does it mean to have public experiences? What does it mean to have public life and public space? That's what we're trying to... That's our contribution to that broader movement.
I guess the last thing that I'll say, to start, is that I think there's a natural tendency to ask, you know, can we really build something different, something new? And I just want to turn that around and say I find it offensive for people to say that our current systems of governance are the best thing humans can do. It's not okay. We can do better than this. We have to do better than this, if we actually care about the dignity of human lives. So let's not be satisfied with what we've got; let's keep going. And I think the people who are pushing autocracy forward want us to doubt and want us to fear and want us to think it's impossible. And what I want us to do is to hold on to a kind of pragmatic hope that it is possible, and figure out how we get there.
Justin Hendrix:
Father Eric, you may have the broadest time horizon on these questions of anyone on this stage, if I'm correct.
Eric Salobir:
Not so sure! First of all, thank you very much for having me here. Good afternoon, everybody. So actually, at least I have a perspective from Europe, which is perhaps slightly different. But I would like to jump on what you said, Eli, because a couple of days ago it was revealed that a big country of Eastern Europe-- I will not mention the name-- has spent millions and millions just to, let's say, have a strong impact on our democracies and transform them, and probably be harmful to them. And my question was: why are we so sensitive to fake news? Part of the answer was given by Erik, and that was very relevant. But I would say that, for sure, we have devices; those technologies are very well designed by smart people, implemented by strong companies, and so on.
But for me, that's not the whole answer. I mean, the point is, if you trust someone, and if someone else comes and says bad things about the people you trust, you will not change your mind in a snap and say, "Oh, for sure, that's true." But why do people trust all those conspiracy theories? I mean, they are not dummies. I was recently at the headquarters of Orange in France, and some people were saying that prominent politicians from all around Europe come and ask, "What's the connection between 5G and COVID?"
Justin Hendrix:
Mm-hmm.
Eric Salobir:
My gosh, just even asking the question is strange. And so I would say that it probably means we have no strong narrative, no common narrative, to oppose to those little stories. And I think that if our democracy is endangered, it is also because, at some point, it's weak. And we need to identify the weaknesses of the democracy to fix them. Just in Europe, something interesting is that a lot of politicians were elected because they were anti-system. So they were elected to govern the system while being anti-system, which is a bit strange. But for me, the most important point is the way they phrase it. Democracy is seen as a system. It's not seen as a common project anymore. It's seen like an operating system. I don't want to be managed by an operating system. I understand that people feel out of the loop. And so how do we change that? For me, that's the point.
And probably we need to fix the symptoms, like the fake news, hate speech, and so on. But we also need to try to step back and see what's the real disease-- this kind of democracy fatigue-- and how do we cure that? And we've seen, unfortunately, that democracy as just a dream doesn't work like that. But a lot of things can be done, and I agree with you: technology has to be part of the solution. It is part of the solution. But how do we, for example, use civic tech to involve people in the society, in building a common project? And actually, that's what we work on in our Human Technology Foundation: working on inclusiveness, how technology can be part of inclusion; working on, let's say, the impact of the big companies, and so on. So I would say, yeah, for me, that has to be part of the conversation.
Maggie Little:
Can I just add on to what you said? I love that you brought in the issue of inclusion of the people who would be using the tech, or whom the tech serves. So if we're going to live up to Eli's dream-- I love your dream.
Eli Pariser:
It's gonna be real.
Maggie Little:
It's gonna be real, but one of the pathways has to include the people who are part of the democracy. So, diagnosing the problems that led us here: one of the big issues with tech is not just that it's big tech and over-consolidated, but that it's insular from stakeholders, right? The people who are going to be most affected. So part of, I think, your ideal, and what you're working on, is: what are models for what some people call design justice-- for designing anything, including a city, a government, or a tech platform? How do you make it inclusive in a radical way? That might be how you get a radical solution on democracy.
Eric Salobir:
Yeah, for me, this is key. But unfortunately, I'm afraid that too often, policymakers see inclusion in the tech-- like how to make it more accessible and so on. For me, this is only the first step; what matters is inclusion through technology. How can we help people, through technology, to be more involved in the society? It means that it has to empower people. Very often, we have the idea that once people have 5G, a decent computer or phone, and a little bit of literacy, that's fine. But that's absolutely wrong. It has to be designed to develop this democracy and to involve more people. And actually, we worked with the French presidency on something called Tech for Good. This initiative was partly about inclusiveness, and one of our goals was to design something like digital inclusiveness goals, just to be sure that technologies will follow a specific path and so will be more inclusive.
Justin Hendrix:
Erik?
Erik Brynjolfsson:
Yeah, and I think we want to be more inclusive, but ultimately we want to do what we can to amplify and harness the better natures in each of us. We can be inclusive in a way that leads to mob rule and destruction and polarization, and brings out the worst-- the lies I talked about being spread faster. Or we can try to bring out the better angels. A couple of months ago, Alena and I had dinner with Steve Pinker, and he was talking about his new book and trying to promote rationality. And there's a whole checklist of ways that people can be more constructive. But what I thought was the most interesting part of our conversation was when he talked about the institutions that we've created over the past few centuries that are designed to amplify truth versus falsehood. You know: the journal system of refereeing, the adversarial court system, good journalistic ethics, and some of the other things that are in place to try, on the margin, to make it more likely that imperfect people will lead to better outcomes.
As I was saying at the outset, unfortunately, I think many of our technologies and institutions have been inadvertently designed to amplify some of our worst instincts. Machine learning is really good at learning from clicks and quick information. It's not very good at learning from what people really care about over weeks or months or decades or lifetimes-- well, it can't. So we've built a system that, not by design but by effect, amplifies some of the, you know, more primitive parts of our brains. And unfortunately, that's not the part that created the civilization we want. So we need to reinvent it. And in my conversations with Frank, one of the things that I found most inspiring was not necessarily just, as Maggie was saying, what are the parts of Web3 that amplify democracy or not-- maybe some do, some don't, you can do some things. But more fundamentally, the more general thing is that it's a reset button.
We're in a position where new technology is allowing us to just create new institutions. And now we have a chance to be a little more conscious about how we go about doing that, and I think that's a big part of the agenda here.
Eli Pariser:
If I can just build on that: you know, I think with any institution, it's worth considering, how does this institution invite people to think about power? And what I think we live in currently is a centrally autocratic digital environment. Yes, you can participate, you can post a tweet, but if you want to change how Twitter works, you need to have $50 billion, and it's one person. You know, at the end of the day, there are like five guys who make all the decisions, ultimately, about how these systems work. And so I really believe democracy is not just a formal system of rules and regulation. It's a cultural system. And a lot of how people build faith in democracy historically, if you talk to sociologists and political scientists, isn't because they voted and their candidate won. It's through trade unions, or membership associations, or other pretty informal ways that they came to believe that coming together with other people who are different from them created some collective power.
And so those are the kinds of institutions I think you need in order to create a democratic culture, in order to sustain a formal democratic system. We have the farthest thing from that in our digital environment right now. And so I think there really is this question-- the design justice question is a fundamental question. Because if people don't actually feel like they are meaningfully powerful over their environment, then they start to shrink back or look for someone powerful enough to punch through, like Elon.
Justin Hendrix:
So you mentioned 2050, the sort of aperture of perhaps mid-century. How do we get there, right? Are we gonna have climate change bearing down on us in even more extreme ways? These autocracies that are presently putting pressure on us and attacking us, you know, both physically and perhaps in clandestine ways, will continue to do so. And lots more pressures, lots more conflict-- COVID variant 68 or something like that, I'm sure. It's a rough patch ahead. I think we'd all perhaps agree.
If we are to get to a point where this multiracial democracy is at least still possible, intact, or somehow realized, where we've got to the point where we're thinking more about equity, et cetera... What does this group of people have to do, starting now? This is a select group from the conference, right? These are individuals who are all leaders in their own right, who are doing things to address these issues. What would you say to them? What do they need to go out and do now?
Erik Brynjolfsson:
Well, you listed a litany of real problems, really challenging things. And I think it's very important to think about those. But in my career, what I ask my students to do is think about what could go right. And so by 2050-- I like that, that's a good goal there-- what were the things that went right? This is how Reid Hoffman likes to ask this question as well. And I think, actually, several of us on this panel have said, I'm also an optimist-- but I'm a mindful optimist. I think that if we do the right things, we can have unprecedented flourishing. We can empower people to an extent that they never could have been before. And we can have a level of wealth and abundance that really takes a lot of the problems off the table. Because of AI and other technologies, we can basically wipe out poverty and a lot of the other problems. We can address some of the environmental problems with much more success than we have in the past.
But the more wealth and technology we have, the easier it is to address climate change and the other things. So what we need to do is think about how we can use the technology to amplify humans, not to replace humans-- not to take humans out of the loop and concentrate all the power, which is a lot of how the technology is being used in business, and a lot of how it's being used in politics today. But rather to have a synergy of humans and machines working together on both of those fronts. And I think that's a design choice. I'm very much not a technological determinist who thinks that there's just one inevitable future. History has shown over and over that there are forks and crossroads where you can make choices.
And so probably the main thing is to just understand that we have those choices, and that if we want to have technology that fulfills the kinds of values that my fellow panelists have been describing, we have to be conscious about that.
Justin Hendrix:
Maggie?
Maggie Little:
Okay, I'm gonna respectfully push back a little bit-
Justin Hendrix:
Absolutely.
Maggie Little:
... 'cause we've done a lot of dreaming about 2050 and what's possible, which I completely sign on for. I'm going to say what I agree with before I say what I disagree with. I also love the idea of alternative futures and your prompt to your students of, you know, imagine a better future and then ask what went right. I like that prompt so much because it backs us into figuring out how to get there. So the part I am pushing back a little on is, I think utopian-based reasoning is incredibly dangerous. Humans for 10,000 years have thought, here's the utopia, so how do I get there? And I don't know pathways to go from a reality to a utopia, number one, that don't just break everything. And second, we don't have a good track record as humans with utopias being sustainable. And I don't see technology changing that, because you're still dealing with humans-- to build on what I said. So, yes to speculative ideation, fantastic. It's always good to take a pause and think of utopias and agency-- it's possible to do radically better.
But then, returning to how we would go from here to there without breaking everything-- one. Second, with inclusion, right? That was one of the things. So it can't just be, we think of the revolution and make it happen. And then third, I'm remembering, right, the panelists to come after... we've also got lots of regulators in the room, and I love regulators. My husband's a regulator, right? So we actually have an army of people who have high expertise, lots of experience in how actually to get stuff done. And I don't want to forget that army of people, who we need to empower and ask. So I really like incremental change too. Without incremental change, we're-- I'm sorry, we're just dreaming, and I know that's not what you meant. But so part of why I love the idea of today is we're also going to get really practical while being supercharged about making it massively better. Yay.
And things have been depressing lately, so dreaming a little is fantastic. But there are so many good, concrete things if we empower what people have already thought about for making forward motion.
Eli Pariser:
On the caution against speculative utopias as a North Star to build toward exclusively for everyone-- I'm with you. I don't think there is a solution, or even, like, a couple of solutions, to a bunch of these problems.
Maggie Little:
Mm-hmm.
Eli Pariser:
I've been thinking about-- I think it's Elinor Ostrom, an economist who studied commons management, who has the thing about, like, no panaceas. And what she meant by that is there's no magical solution that comes down that works for everyone. And the more you're talking about human communities, the more that's true. And when you look at institutions that are successful, they are situated in a particular place with a real understanding of who they serve. They're not trying to serve everyone. So I think, in terms of the path to the better place, you know, one piece is about breaking it apart, and saying you can't solve these problems with one algorithm that we're going to tweak to try to make it work for three billion people. There will never be an algorithm that works for three billion people, really.
Justin Hendrix:
A protocol or an AI system?
Eli Pariser:
Yeah. And so let's, like, think about this differently. And then let's do a lot of experimentation and a lot of learning to see-- you know, it's like the Gibson quote: the future is here, but it's not evenly distributed yet. Like, there are places where there are things that are working, and you could talk about Taiwan or Estonia or Iceland or Barcelona. There's a whole bunch of really interesting stuff that's bubbling up that is pragmatic, but it's also much more of a step forward than, you know, tweaking our vote-by-mail guidelines by X percent. And so, you know, I think that's where I would circle. And then the last thing I'll say is, like, you know, this pragmatic hope thing is tricky. On the one hand, like, totally, yes, pragmatic. But on the other hand, I just want us to remember that what everyone-- what they want-- is for us to be like, "Eh, it's too hard." You know?
And so I think it's that-- the hope, not in the sense of, you know, I know my dream is gonna come true, but as a habit of mind that is determined: I'm gonna keep pushing.
Eric Salobir:
Yeah. I'll tell you... I will say that I hope that we can make our dreams come true. And in a very pragmatic way, I would say that it has to be by redesigning some tools, some protocols, some patterns. Just to give you an example, we worked with the European Commission on data sharing. So when companies share data, they do that very efficiently, for money. And as you say, it's very centralized, and probably nobody has anything to say about the way they deal with our data. On the other side, open data is completely open and free and so on. But we see that it is pretty limited, because the data is not high quality, and some things can be done with it, but not everything. And so the European Commission was trying to find a third way in between, and, I mean, it was kind of a dream. They called it data altruism: sharing data for the common good. But it was not just a big declaration-- oh sure, a little bit like Teletubbies, we should share our data and whatever. No, it was like, "Practically speaking, how can we do that?" Should we have third parties managing our data? Should we have a way to finance it? Should we have protocols? Should we have a regulation, and so on?
So, just to be pragmatic, and to forge a new way to share data. Now, we don't know if people will really buy it, if people will reuse it. And I'll tell you, we're working with them on the governance and the testability of this process. But this is the kind of experiment that can, that should, be replicated in many ways, because if we build these kinds of tools, we can twist the system. We can change things.
Justin Hendrix:
Yes sir?
Erik Brynjolfsson:
Yeah, since panels are more fun when we have a little friendly disagreement, I'm going to pick up on incremental versus radical. I'll take the side of more radical...
Maggie Little:
Wait, I said utopian.
Erik Brynjolfsson:
Okay, I'll take the side of utopian then too.
Maggie Little:
Mm-hmm.
Erik Brynjolfsson:
So, I mean, utopia, of course, you know, literally means "no place." So that's maybe a little extreme. But it actually was one of my favorite books when I was a kid, because I was inspired by the idea that we could build a better society. And like everything in this room-- somebody imagined that monitor, and that camera, and these buildings, and then they built it and it got created. Our country was imagined by a group of people, and they got together. It has a ton of flaws, but they created something very new that hadn't existed before. I'd say it was pretty radical for its time. You know, Facebook and Twitter and MoveOn and lots of organizations-- somebody said, "Hey, let's create this." And that's how we have all this amazing stuff. That's why we're 50 times richer today than we were in the 1700s: because people consciously said, "We can build something that never existed before." Right now we have the most amazing tools ever. I moved from MIT out to Silicon Valley and Stanford, and every day I'm running into people who give me these utopian ideas.
One of my friends is building flying cars. Another one is building implantable chips. And when you hear them-- if I hadn't been in Silicon Valley, I'd be like, "That's a crazy utopian idea." But then I look at their track record, and they've done stuff like that before. So a lot of these crazy ideas fail, but some of them succeed. And I think we are right now at a point where we do have these more powerful tools than ever before. And so I distinguish two kinds of optimism. There's the kind of unconditional optimism that I might have had as a child-- you know, Christmas is going to come and presents are going to arrive, and I'm just like, "I just can't wait until that happens. I'm just gonna sit here and wait, and good stuff's gonna happen." That's a really bad, destructive kind of optimism. I think that's bad-
Maggie Little:
As an adult. It was good as a kid.
Erik Brynjolfsson:
As an adult. Okay, maybe okay for kids. But, yeah, there's another kind-- I call it mindful optimism, which is a word I used before. Which is like, "I want a tree house and I'm going to build that tree house. And here's my idea for it." And, you know, yeah, it has to be realistic, but it may be something that didn't exist before, and it may be the best one in the neighborhood. That is something that, if you put your effort into it, you can create. And so I would call for mindful optimism on this panel. And in our society, we have a lot better tools than the kids building tree houses did, and I think we have a chance to radically remake the way we're governed and the way our economy runs.
Maggie Little:
Can I?
Justin Hendrix:
Yes, please. And I just want to warn the audience, before Maggie goes ahead, that we are going to open up to some questions.
Maggie Little:
Yeah, one minute 45.
Justin Hendrix:
For those of you who joined us online, we'll also have some mechanism for you to ask questions as well. Maggie?
Maggie Little:
So I love everything that you just said. I just want to clarify what I find dangerous from a reasoning standpoint, which is not what you said. Radical? Yay. Innovation? Yay. Creative? Right. Thinking of ideas that have never been thought of before? Yay. The utopian reasoning-- I mean something very specific that sometimes bubbles up with Web3: we can have a society where there is no wealth disparity and everything is right. That's to imagine an end state that has no problems, and yet is still populated by humans. That's what I find dangerous. Okay? As opposed to-
Justin Hendrix:
And we hear those things from people like Sam Altman.
Maggie Little:
Absolutely. And that just... I'm going on too long about it, but that seems very dangerous to me. I don't think it's what the panel here is talking about, but I do want to caution against it, because, again, sometimes in Web3 it's, we can have it all, and since we don't have it all, now everything's broken. That's not how it works. But I love the idea of mindful hope and radical solutions.
Justin Hendrix:
Great. And that gives us... Go ahead, Eli. You've got 20 seconds before we are meant to hand it over to the audience.
Eli Pariser:
Yeah, I think the other piece is, like, who's invited to imagine this stuff? And what does it mean that we live in a society where, you know, billionaires can imagine whatever they want? Everyone's kind of like, "Okay, I guess."
Justin Hendrix:
But Jackson doesn't have water.
Eli Pariser:
But Jackson doesn't have water, and is not allowed to imagine a different way of doing water, or a different way of doing, you know... And so I think that's part of the muscle we need to build: a much more participatory and people-driven imagination about what it is that we want, rather than having it sort of trickle down from nowhere.
Erik Brynjolfsson:
I know we're out of time. One of the things that gives me the most hope and optimism is... Raj Chetty wrote this paper about the Lost Einsteins. There's a very small slice of people in our society who have the opportunity to invent and create new things. I feel like I can be near some of those people, which is great. But one of the things that's changing is not just that we've connected a billion-- seven billion-- people to access information, but that more and more of them are able to contribute. And one of the biggest parts of our agenda should be to widen that funnel, ultimately to almost everybody being able to contribute. That's going to make us all wealthier, the more people we have being the future Einsteins. So that would be at the top of the agenda of how we reinvent our society: just to address the challenge that Eli just brought up.
Justin Hendrix:
Okay, my microphone is ready, but Father?
Eric Salobir:
Yeah, just one second thing to [inaudible 00:37:20]. I would say that for me, the point is also to be sure that citizens have the mindset that they are really citizens. Because, I mean, we have in Europe, in some countries, this dream of the state as a platform. But if the state is a platform, the citizen is the customer. And very often we have people who behave as customers in their own society. And this is not the way you can change the society. The only way to change it is to be an active citizen-- that doesn't mean to be an activist, but to be part of the reflection. I mean, for me, technology is not about technology. Technology is about politics. And so people have to jump into that, and this is clearly a kind of shift of mindset. Yeah.
Justin Hendrix:
Let's go to a question in the audience. I've got the first hand up right here in the center.
Audience question:
Hello. I have a question that stems first from when Frank mentioned that this is the time for governance and the time to really do something big, and then, Maggie, you said something that still just keeps playing in my mind, which is that oftentimes technology is equated with democracy-- or at least now, decentralized is equated with democracy-- the idea that tech will ensure goodness, for example. And so, you know, we also talked about how there are these dreams of big, big things, and we have to dream the utopia world. But getting to that utopia-- you know, there were multiple versions of social media before the version we have now, right? There were multiple versions of all these things. So, in all of your experiences, Maggie, whether it's teaching or working with different people: what do you think are incremental ways for a builder in tech, or someone who's a policymaker right now, or someone who's doing enforcement, within the next few months, the next year-- so not 2050? Like, getting to 2050, what is it that we can do right now, recognizing the power dynamics in the room, right?
There are people who would never be at Unfinished who are deeply impacted by this. What do they do? So what are your thoughts for all the various groups of people, and what can we do now?
Justin Hendrix:
So some version of the question I was putting to you, which is: what can this group of people here, who all have some leverage in the world, do?
Audience question:
No, not just us though. People who aren't here. We're quite privileged.
Maggie Little:
Yeah, well, it's a long list. But first of all: people building community, so that the people who are doing the imagining are more inclusive, for sure-- and funding them, right? It's not just who gets to talk; it's who gets to do and build. Second, figuring out the regulations that are needed now and figuring out political pathways, right, for how you enact those. So even things down to state and local levels-- those are the people who set a lot of the actual regulations on tech and data-- vote for the people who are going to do it right. Third, work on tech companies' willingness to partner with people who care deeply about the ethics but are happy to get in the weeds on the technical and help sort of pathfind. Yeah, like bushwhack, right? The path is not figured out; the ethics that are needed here cantilever out over the systems and norms that are already in existence.
So you have to have partners to build this stuff-- citizens, community, regulators, voters. I actually do think that making voter access easier is massively important. Okay. And then tech companies finding folks who are interested in helping to develop capacity for an ethical mindset, while keeping in mind the bottom line.
Justin Hendrix:
It sounds like Maggie says, organize, organize, organize? Is that right?
Maggie Little:
Sure.
Justin Hendrix:
We've got time for one more question, because this thing's counting down. So maybe I'll come back to the first row here again.
Audience question:
Thank you. Great panel. António Guterres from the UN gave humanity a pretty bad report card yesterday. And you were talking about false news, or lies, spreading three times faster. I'm wondering about education-- for people believing false lies, false news, bad news versus what we might think is the truth-- and about places like Cambridge Analytica that are no doubt still working, you know, in deceptive and evil ways. So how do we deal with people who think they're believing the truth? Any thoughts on that?
Erik Brynjolfsson:
Yeah. I'm so glad you asked that question. I think it's one of the existential challenges to our society right now. The success of a civilization depends very much on privileging truth over lies, and we've been doing the opposite lately. And there's a set of categories of things that we need to look at. One is at the individual level: education. You can just teach people to look out for misinformation. I learned this new term a while back from Jared Cohen-- nut picking-- and now I see it everywhere. Nut picking is finding an idiot on the other side, some extremist, and making them famous. Somebody who would have been ranting to 12 other people at most suddenly becomes famous to three million or 300 million people. And then that becomes, you know, the symbol of the other side, and it generates all sorts of anger. So that's a tool that, now that I understand it, I see it and I try to avoid it. But there are a lot of individual things-- just, you know, basic education in civics and the scientific method, and how you distinguish truth-- and ethics too, I think, may make a difference.
But the other thing, which I was touching on, was institutional and organizational changes. And this is something where I think we know some of the techniques that work. But I think it's a research agenda to systematically look at what the levers are... There's a whole bunch of open platforms out there, from Wikipedia to Twitter, you know, Reddit, et cetera. If you look at them, some of them have a lot more misinformation than others. And they have dozens of design parameters. I was talking to Jimmy Wales about this: some of them are anonymous. Some of them have different revenue models. Some of them let you repeat things very quickly; some don't. I have hypotheses, but I'm trying to be a scientist about it. We want to look at that systematically and say, "Which ones are correlated with more truth? Which ones are correlated with more falsehoods? And how can we, over time, design systems and organizations?"
I think most of the people who run these organizations and these systems, whether they're individual billionaires or boards of directors-- I know some of the people on those boards-- want to do the right thing. Maybe there are some revenue model incentives that push them in the wrong way. Maybe there are some technology design incentives that push them the wrong way. I think an important agenda would be to understand that better, and then start implementing it-- just as, you know, the Federalist Papers laid out a set of principles that, imperfectly, moved us towards more democracy. We can do the same thing going forward. But I'm glad you asked that question.
Justin Hendrix:
Father, we're out of time, but-- 10, 15 seconds.
Eric Salobir:
Just 10 seconds on the two questions. Perhaps-- I think that each time there's a kind of riot or social movement or whatever, we allow people to complain, so to identify the problems, but not to be part of the design of the solution. And I think we should trust the collective smartness more. I mean, people can do a lot being connected, working together. And I think that technology can allow us to build a solution together. And we've seen that each time we try it: each time we trust people, we get a good surprise.
Justin Hendrix:
Trust people. Maybe a nice word from the man of the cloth on the stage to end us. I thank the audience. I thank you for your smart questions. And thank you to the organizers, again. I'm sorry we took a couple of extra minutes. Thanks to the panel.
Eric Salobir:
Thank you.