Audio of this conversation is available via your favorite podcast service.
Answers on how best to regulate technology differ depending on the values and politics of any particular jurisdiction. Yet it’s worth looking for points of consensus. In general these days, we in the United States have a lot to learn from lawmakers and regulators in Europe, who are further down the path in their regulatory experiments.
In this episode, I speak with one German lawmaker, Tobias Bacherle, who was elected to the Bundestag in 2021 representing Alliance 90/The Greens. The conversation touches on issues including encryption, the Digital Services Act, the US-EU Trade and Technology Council, and the relationship between tech and the environment.
What follows is a lightly edited transcript of the discussion.
Tobias, I’m very pleased to speak to you today. I think it might help my listeners who are largely in the US, although also perhaps a bit in Europe, to just have a sense of who you are, what your job is on a day-to-day basis and how it intersects with tech policy. So you’ve mentioned that you came out of local politics and of course now are serving in a national role there. But can you just give us your quick bio and how it intersects with tech policy?
As a member of the German Bundestag, I'm a member of the Foreign Affairs Committee. I'm also a member of, and coordinator for my group, the Greens, on the Committee on Digital Affairs. So obviously there's already the connection to tech and digital topics in general. But since I have this somewhat unique overlap between foreign policy and digital policy, I tend to look at many issues around digitalization and regulation from a very geopolitical, very international approach — often asking who might copy a given regulation, or who might be interested in using certain kinds of regulation or technology for targeting, in a century where digital authoritarianism is on the rise and human rights are basically negotiated, or decided on, in the digital realm.
Besides that, I'm from the southwest of Germany, which is probably more famous for its automotive industry. But I'm also from a district where you have not only a big Mercedes-Benz factory but also IBM and HP Enterprise. And IBM decided to build their first quantum computer in Europe on their old campus in Ehningen, which is also part of my district.
So there is also an overlap with where I'm from, and I believe the digital transition — or the twin transition, the decarbonization and digitalization of industry — is very, very important, not only for Germany but particularly for the southwest. That's one thing that drove me into this field in politics. But the other part is that I am passionate about human rights around the world, about freedom of speech, about privacy. I believe people need a certain degree of privacy to be able to develop their own thoughts, which is basically the basis of discussion and the basis of democracy. And I believe that Europe and the US, and the West, and democracies around the world, have a responsibility that goes beyond our borders. This is why I decided I don't only want to work classically on foreign policy, or on the data economy as I do as well, but to connect both fields.
I want to take up one of the issues at the intersection of US and EU relations around tech, which is the US-EU Trade and Technology Council, which is now, I think, working on a variety of geopolitical, geo-strategic challenges when it comes to questions of technology. How have you intersected with that entity and with those conversations?
Well, that's a very interesting point, because there's no parliamentarian track in the TTC, but I believe the TTC is a very useful and important tool in transatlantic relations. I know it was founded with the original idea of bringing both sides of the Atlantic together in times when, politically, there might be — or was — dissent and controversy. But I believe it's very important that the two big democratic entities in the world talk to each other about how they want to approach tech regulation in general, and trade regulation too. For me, though, the part on tech regulation is the main interest.
So from the very beginning we were very keen on supporting the TTC, but we also realized — and this is maybe a political approach to it — that no decision has been taken on whether it's going to be a high-level political meeting, as Paris last year, for example, was intended to be by the European Commission and, at least as anticipated, by the US side, or whether it's really only a technical track, where administrations and bureaucrats meet and exchange ideas. And why am I emphasizing that point? Because I believe if it's the latter, it's totally fine to have discussions behind closed doors, as long as we are watching what the outcome is and the political part is decided somewhere else — not the final agreements of the third or fifth TTC, but the high-level, politically controversial agreements being presented somewhere else,
and legitimized democratically in another way. Or we decide this is a forum where many negotiations, including controversial ones, are taking place — but then we need to open the doors. I mean, it's very good that the trade and technology dialogue has been established, but it's very important to shed light on the TTC, because I have no interest in a TTIP scenario, where at a certain point everyone believed it was, to a certain degree, transparent — we talked about it, it was no secret — but there was no publicity about it.
And then after years and years, suddenly, due to some controversy, there is a public debate around it and everyone is wondering: what the heck is the 15th TTC? What has been going on there for seven or eight years? I believe this is the turning point at which the TTC is currently standing. Both options would be important; both have their legitimacy and make sense, but to a certain degree a decision needs to be taken. Nonetheless, I believe it's a very important tool — and regardless of which approach you decide on, political or bureaucratic, I think we need to integrate the very controversial topics as well. For example, matters of data privacy.
Of course the US is a laggard on that particular topic, at least with regard to Europe. And I assume you’re suggesting that because, to some extent, the US’s kind of failure to move ahead puts it out of sync with Europe?
Yes, and of course I also see that, with the new agreement, there are huge steps forward, and there are negotiations currently going on in Congress on certain data privacy regulations. I also understand that there are certain controversial views on European jurisprudence itself. But in the end there is one clear line: if a US law enforcement entity believes it should have some data, and European courts — the highest European court, for example — are withholding it, then it's problematic if another entity overrules the highest European court on things regarding European citizens. And vice versa, it's the same. Let's be honest about that.
So I believe this somehow needs to be put on the table, and if there should be movement, I really believe there still needs to be movement on both sides, even if there was already movement on the US side. But we are far from the point where I would say we are on the same level. Because, as I said, if Europeans tried to access data of American citizens that happens to be stored in Europe, that would be highly controversial.
And of course, from a German perspective — this needs to be said very bluntly — we still remember Snowden, we still remember the leaks, and we still remember that the privacy violations that happened from the US side were not only regarding terrorism, or things where we might agree there is a certain reason for them, but actually targeted our own government. I think this is damage that has been done, and there needs to be more understanding that this damage needs to be, in some way — I'm lacking the correct English phrase, sorry for that — taken into account.
Let's talk a little bit about a related issue. You are an advocate for encryption and strengthening private communications. You have talked recently about the conversation that's happening both in the US and Europe, where one of the key constituencies interested in breaking encryption in some contexts, or in providing law enforcement with means to access messages, are advocates concerned about child sexual abuse material or sexually exploitative material. How do you think about that in the context of the debate in Europe, and in the conversation in the Trade and Technology Council? Has that come up at all? I noticed there's only one mention of encryption in the most recent White House release about the TTC, and it's really something to do with quantum. So quite a long ways out, theoretical.
I mean, the whole post-quantum encryption part is very important when it comes to data scraping and the long-term issue of data being stored now to be decrypted later, so I would not underrate it. But yes, you are right. When we look at Germany, the European Union, the US, and many other states, all of them signed the declaration for the freedom of the internet — for the future of the internet, sorry — and for a free, democratic, safe internet for everyone. And in it there is a clear understanding of how important encrypted communication is around the world — for opposition leaders, for human rights activists, for women, for feminist activists, and many, many other vulnerable groups.
So that's the first part: there is already a certain promise, made by the European Union, by Germany, and basically by all those states fighting for a free and open internet, that encryption is important. And now, with a very good cause and very good reason, we are having somehow a Trojan horse introduced — because no one can argue against the reason. It's not even a question of whether we want to do something against child sexual abuse and the content of those horrible crimes spreading online. Clearly, we need to do something against that.
But why am I saying this is a Trojan horse? Because, first of all, the answer is not mass surveillance. It's not that we need to surveil every communication that is now, or is supposed to be, private. We need to focus on better law enforcement: better quality, better-skilled workers in law enforcement, more tech resources in law enforcement, and investment in things like chain analysis, where you can actually detect where money goes and how it is distributed in relation to those crimes.
And in the end, when you look into it, basically every European state's law enforcement has a problem with material and content that has already been detected — meaning it was either posted publicly or detected by someone who was involved in a group chat, on a platform, on social media or whatever, and who reported it. And we have the files piling up. So we need to work on that, and we need to enable people to easily report any kind of crime — above all heavy crimes, and above all any crimes that involve child sexual abuse. But let's do that without violating private communication in general, because of the harm we would do worldwide by basically destroying technology that is bulletproof against authoritarian regimes — regimes that want the "glass citizen," as we say in Germany, a citizen you can simply see through, which is of course very useful for a Chinese social scoring system.
We cannot roll out the red carpet for those ideas. And that's what is currently being discussed in the proposal from the European Commission, with so-called client-side scanning — the "chat control" that would implement a technology in any messenger to scan any private communication before it is encrypted. That would mean we would develop and fund a technology that could easily be used to surveil the opposition, civil society, or anyone. And it literally could be anyone — it doesn't even have to fit our liberal-minded framework of who a legitimate target might be. This is something that would be so harmful for democracies around the world, because we're not talking about people who are targeted in some specific way. We are talking about a technology that looks for certain buzzwords, certain conversations, certain topics, and then goes in and surveils the people talking about those topics.
Even for the matter of child sexual abuse or grooming — that's the first view. On a second view, I would say even there it is dangerous, because you have people who, by law and by their profession, need to have protected communication even about those topics. For example, if a child, a victim, wants to talk in a safe environment to a lawyer, and for some reason in the current situation does not want law enforcement involved because they fear for their safety, it could be really dangerous if law enforcement gets involved in another way. So on second view, I would say even for victims it's important to have access to private communication.
But let's put that aside. At first view, the damage we would be doing with that kind of technology, client-side scanning, would be horrific, and it would also undermine everything we stand for internationally — because it's not only the coalition for the future of the internet, or the declaration, as the agreement is now. It's not a coalition — it wasn't a coalition in the end, but it was intended to be one.
But also, look at us Greens: our minister, Annalena Baerbock, is now the foreign minister. We want to have a values-based foreign policy. That means we want to fight for human rights, we want to fight for liberal spaces around the world. And the best way we can ensure them, even within authoritarian regimes, is through encrypted communication, VPNs, and other support for digital individual sovereignty. To undermine that by basically putting mass surveillance in place in Europe would cost us all of our credibility. And honestly, speaking solely from a foreign policy perspective, I don't know how to tell the Egyptian government that they should stop imprisoning people for what they've written in private messengers if they can point a finger back and basically say, "Well, the technology we are using is the same one you invented — we just stole the idea from you."
And this is just a very personal point — you see I could go on about this for a long time. Maybe a last point on why the German discussion around this is a bit louder than in other countries: to be frank, we had two dictatorships within the last 80 or 90 years, and both of them relied heavily on mass surveillance, on violating privacy wherever they could. I think there needs to be a lesson from that: do not enable governments with such powerful, dangerous tools.
I find you very persuasive on that and tend to agree on those points. I’m interested in, if you could quickly comment on the politics of this, perhaps not from your perspective, but what you expect might happen in Europe on this question.
This is very controversial because, as always, the Ministries of the Interior who are negotiating that file in the Council have an interest. I mean, the Ministries of the Interior in the European Union declared in a statement in 2020 that they actually want an encryption workaround. So they were basically saying: either we forbid encryption of private messages — but because we see that's not an option — we want an implemented backdoor. And everyone was basically asking: what do they mean by that? How is that going to work on a technical level? Is it going to be double communication, with everything sent to a server? Because of course with Telegram or another such messenger it's easy — there is a server where everything is stored, so it's easy to have a backdoor on the server — but how does that work with encrypted communication?
And this is important to understand, because in many Ministries of the Interior either the ministers are still the same, or a lot of the people working in the ministry also worked there in 2020.
Now they are about to negotiate that in the Council, and of course they keep coming back — we actually had that in Germany. They came back to us and said, "Yeah, we went there and we said Germany is critical about that, but we were basically the only ones." So there has been a strong message from civil society, and also from parliamentarians who connected all over Europe and basically did the math. We believe that if there is a clear position, above all from Germany — Austria already has a clear position that went through its parliament, basically forbidding its government from agreeing to the proposal if client-side scanning and some other critical parts are still in it — then those parts need to be cut out. Because if Germany says we are not going to vote for that, and Austria says the same, we expect a few more states to follow and reach the numbers needed to block it.
But we have the same situation, as I said: our Ministry of the Interior also has a certain interest. It's also mixing this up with the debate around data retention, which is currently very controversial. As a coalition, in the coalition agreement, we decided on different approaches — quick freeze and the login trap (Login-Falle) — so that you actually have to find a crime first and then pursue it. It's a bit complicated and not really my topic, so I won't get into it, but they are very good ideas as far as I have gone through them. So there is a German domestic debate currently going on, which is very political as well, because of course the Social Democrats need to back their minister, and she is slowly moving toward saying no client-side scanning, but maybe scanning of private data stored online.
And it is not clear what "private data stored online" means — what the definition is, and what counts as private communication that is not encrypted. So we are on a very clear line as Greens, but I would also say as a parliament: no private communication should be scanned.
Long story short, we are currently having the debate over whether we also want a vote in the German parliament on this. I can speak about that very freely because it got leaked last year. We proposed a text which leaves a lot of the negotiation open for the government in the process, but rules out some very hardcore problematic points, such as client-side scanning. And, well, that's currently stuck — politically stuck.
But then again — and this is European politics — you also have the European Parliament process, and I expect the European Parliament to vote strongly against client-side scanning. So I believe there are two ways: either the European Parliament votes strongly against it and we get a strong German position against it, and then the proposal is going to be stripped down and probably negotiated by the end of the year; or that's not going to happen, which is also a possibility, and then it's going to be a tough call to finish the vote before the European elections in May 2024.
I want to switch over to talk a little bit about the Digital Services Act. Every country has to now bring on its coordinator and put the pieces in place to begin to think about assessment and enforcement. What’s the situation in Germany on that at the moment? And are Americans who are putting their faith in the DSA to potentially address some of the harms of social media, correct to do so in your view?
I would say yes, they are, but it's not going to solve everything. Just yesterday we had a debate in our group with some experts on disinformation, where once again it became clear that tech regulation is always somehow lagging behind — and this is natural to the process. And there is a certain gap where the DSA is not going to work out, not only because there are parts where we believe the DSA could have been stronger, but also because new technologies are evolving. I mean, the whole debate around ChatGPT, now GPT-4 — that's a new vehicle, and deepfakes are another, as you experienced last year, where I believe the question of content authenticity is playing a bigger and bigger role. But where the hopes are best placed in the DSA, I believe, is on algorithms and transparency — forcing the VLOPs, the very large online platforms, to produce certain reports on their measures, but also on user numbers and so on.
I think that's a very, very good part — above all the fact that they are forbidden to use discriminatory algorithms, and if something is proven or comes up, they need to disprove it. They also need to let government institutions and — well, this is a bit of an unclear part — science and neutral organizations have access to their algorithms, at least partly. And I believe that's very, very important, although most of those control mechanisms work in hindsight: if something is going on, after a year or two you can prove it, and there can be very high fines. With Twitter, for example, I have strong doubts that they are going to comply with everything. It currently looks like they're not complying with everything, because the first reports they were supposed to present were not complete.
But the Twitter debate is, I would say, a bit of a different debate. Yes, it is a very important step. Now, with the German debate on enforcement, I think we learned quite a bit, and we are currently having quite a discussion on the Netzwerkdurchsetzungsgesetz (NetzDG). Disclaimer: I did not like the Netzwerkdurchsetzungsgesetz, but it is current law and it's not being properly enforced. Twitter should have gotten very high fines under it, which did not happen.
So there's the discussion around that, and then there is the discussion of who is going to be the digital services coordinator. It's probably going to be the Bundesnetzagentur, the Federal Network Agency, a body which currently sits underneath the Ministry of Economy and Climate and is in charge of not only digital infrastructure but is an oversight agency — to maybe translate it that way — for basically everything that once was a wire, to put it that way. So telecommunications, digital communication, but also — I'm not too sure about waterways — at least also train lines.
So they are already quite a substantial entity. I believe it would make sense to give the DSA coordinator role to them, but their independence is an issue, and they need a certain reform on that. But that's currently the direction it's going.
I want to finish up by asking you about another topic that I know you care about, which is the relationship between tech and the environment. You’ve been working on questions around digitalization, ways in which digital technologies might improve the way we live and work, and other aspects of questions at that intersection. What are you working on right now when it comes to tech and environment?
I mean, there are two big discussions right now, and when we are talking about green tech, we have two pathways. One is: how can we have sustainable tech — tech that runs on renewables, that is not one-way hardware, and so on? There are currently a few things going on; the European Union is discussing the right to repair. There is a rather controversial part of that: now every device needs to have USB-C. I laugh a bit at that, because I am in favor of a regulatory approach where every device has to be chargeable through one common port and standard. Whether USB-C was the best choice — well, I'm not going to say anything about that; we'll see. But in general, I would say this is a step forward when it comes to the question: can I keep using my cables, or do I have to change them for every device? So this is one part.
The other part — and this is a very German discussion right now — is how much renewable energy is actually used in data centers. Let me look up the proper English word: data processing service centers, I think, is the better term — or servers in general. And what happens with the heat? This is a huge discussion, because the heat produced in data and service centers is a huge amount, and Germany currently mainly heats its homes with gas and oil, which we want to get rid of. So we are looking for synergies: in the near future, where data centers are being built near newly built housing or office facilities, where can the heat be used to heat homes? Very simply put, this would work very well. And then the second, controversial step is that, on a certain time horizon, we would actually require every data center, every service center, to use its heat.
And of course data center operators are very critical of this, because they say, "Well, we have built that already; it's not that easy. We have cooling systems that are not set up to do that right now. It's going to be a huge debate." And it currently is a huge debate, but I think it's important we're having it, because on a time horizon to 2040 it is very important to use that "energy" — in quotes.
The other part is green tech in the actual sense — green tech as the term is used in the finance business: technology that helps us against climate change, that helps us use fewer resources. There, we are currently discussing the whole idea of smart meters, where Germany is lagging behind, and also smart thermostats, so that your heater recognizes, "Oh, the sun is shining," or "today it's going to be warmer than yesterday," and includes such very basic data points in your heating plan. We are really, really basic when it comes to that right now. So this is one thing we're emphasizing a lot.
And in general, I would say, there's the question of how we can share and use more data. Because in Germany we have a lot of small, family-owned businesses. And when I say small businesses — sometimes they're hidden champions, as we call them: number one in the world, but in a very specific part of a production chain, which is great. Honestly, also on a social scale, it's incredibly valuable for us as a society that we have so many small businesses, because it distributes wealth in our society very well. But the thing is, those small businesses are not used to sharing data, because they are sometimes working on the same product, and they are sometimes afraid that if they share data, the outcome will be that their little step in the value chain is no longer needed. Overall we might say, "Well, that's great, because it's more efficient" — but for a small business it would be the end.
And then, of course, some small businesses are run in a very conservative way and are not very digitalized — there are many reasons why we are sometimes lacking the data, and sometimes lacking the sharing of data.
And the last point is that people don't really know how they can legally share data. This is where we are currently trying to take a clarifying approach with the Data Act, which is quite controversial at some points. But I believe in the idea of finally defining that if I, as a user — someone who owns a product and is using it, and not only as a personal user but also as a legal entity, as a business — then the data produced on it somehow belongs to me. It's my data. Yes, of course, the company that produced the device has an interest in that data as well, and I understand that they cannot give me all of it, because then I could basically just steal their invention.
But the very basic idea — that if I use something, if I am the user, the data that is produced also belongs to me — is a thing that is not regulated yet. And if we have that, then in a second step we can have a much easier intermediary. In Germany we say Datentreuhänder, which would translate as data depository, data custodian, or trustee — probably trustee is the best word: a data trustee.
So someone — also when it comes to personal data — whom I trust to distribute the data in a way that represents my intent. For example, my very personal data on how I move around the world, what kinds of trains and cars I use to travel, and what medicine I'm taking: that's very personal, and I don't want it in just anyone's hands. But if a university is doing a study and I say, for example, that's fine — for that reason you can have my data — I can't be asked every single time; a data trustee can handle that for me. This is the kind of framework we are trying to create, to then have a working data economy and access to data, which gives us the possibility of a more efficient society that uses fewer resources.
I recently heard Susan Aaronson, at Johns Hopkins, say that this is, in her view, the biggest challenge in technology going forward: how we pool information in a privacy-protecting way to allow us to solve big problems for society. So I think that chimes well with what you're saying.
We’re almost out of time. I want to give you an opportunity to maybe offer a final word. You’ve spent some time in the United States trying to understand the dialogue here about tech issues. If you had a kind of message, perhaps if there are Silicon Valley tech policy executives that are listening to this or perhaps lawmakers in the United States or regulators, what do we most misunderstand about European approaches to these problems or perhaps German approaches to these problems?
Well, I think there is a huge wall around privacy. And on both sides of the Atlantic there's always this idea of a clash: the US is about free speech and Europe is about privacy. But I believe both are necessary for any human rights-based society, and both perspectives can be very useful. Because to develop anything I can speak out on freely, I need to rely on the fact that if I believe I'm saying something in private, it actually is private. If I'm in a room talking to one person, of course that person can tell it to another person. It's the same in the digital realm: if I'm messaging someone, he can take a screenshot, she can forward it to someone — but I rely on that person, and I know what my audience is, and therefore I'm able to think things through on my own and then take a stand in public.
But the other way around is just as important. I believe the idea of free speech is very important when it comes to how to regulate social media platforms. But I would really, really love the US to actually do it, and not have a debate around free speech that actually amounts to, "I don't want my opinion to be regulated, but I want to regulate the other opinions." If there is a true discussion around free speech and the question of how free speech is manipulated today on very large online platforms, that would be very helpful. And then there is another approach: on the one hand we need to understand how the algorithms work, and on the other hand we need to understand how distribution is manipulated, how content is manipulated. Because if I say something, I rely on people distributing not only three of my words, but the whole idea of what I said.
So the questions of context, of the authenticity of content, and of how content is distributed are, I would even say, more important than the question of what the content is. And this is a perspective that is held very strongly in Europe. So I would say both of those perspectives fit quite well together — if we would not pull those sides against each other, but rather pull toward the direction we believe is most important: the free world, the democratic sphere in this world, and all the people around the world relying on the free world, looking to it in hope, hoping to move in that direction as well and to profit from it.
Well, it's an optimistic vision, and hopefully we'll see that type of vision delivered on, perhaps in cooperation between the US and EU. I'm sure it'll take folks like you making more trips here, and perhaps us coming that way as well. So Tobias, thank you so much for speaking to me today.
Thank you very much for your time and having me.
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Innovation. He is an associate research scientist and adjunct professor at NYU Tandon School of Engineering. Opinions expressed here are his own.