A Conversation with Baroness Beeban Kidron on Child Online Safety
Justin Hendrix / Apr 27, 2023
Audio of this conversation is available via your favorite podcast service.
This week, a bipartisan group of US Senators introduced the Protecting Kids on Social Media Act, legislation that would require social media platforms to verify the age of users, prohibit the use of algorithmic recommendation systems for individuals under age 18, require parent or guardian consent for minors to use social media, and prohibit users who are under age 13 from accessing social media platforms altogether.
The bill is just the latest in a string of federal proposals aimed at protecting children, from the Kids Online Safety Act to the Stop CSAM Act and more. But while the House and Senate debate such measures, states are moving ahead with their own versions of bills ostensibly designed to protect children.
Chief among them is California’s Age Appropriate Design Code, which passed last year and is due to come into effect in July 2024. It seeks to limit the collection of data from users under 18 and to hold tech companies accountable for designing products with children’s privacy and safety in mind. The law faces a legal challenge from industry, and some legal experts say parts of it may violate the First Amendment.
How the courts decide the matter may have implications across the country, where dozens of bills have passed or are advancing in states such as Utah, Arkansas, Texas, Maryland, Connecticut, New Jersey, and New York. These bills vary in their composition and intent. As Tate Ryan-Mosley recently reported at MIT Technology Review, “While some aim to protect privacy, others risk eroding it. Some could have a chilling effect on free speech online.”
Efforts to introduce protections for children online are also underway in other countries, as evidence accrues of a variety of negative effects on children’s mental health and privacy. But what approaches best preserve freedom of expression, while requiring changes on platforms that protect children’s interests and address the worries of parents?
In this podcast, we’ll hear from one UK lawmaker and advocate who has been influential in the global push for more protections for children online. Baroness Beeban Kidron OBE is a Crossbench member of the House of Lords and sits on the Democracy and Digital Technologies Committee, and she’s a Commissioner for UNESCO's Broadband Commission for Sustainable Development, where she is a member of the Working Group on Child Online Safety. She’s the Founder and Chair of 5Rights Foundation, which seeks to ensure children and young people are afforded the right to participate in the digital world “creatively, knowledgeably and fearlessly.”
5Rights played a key role in advancing the UK Children’s Code, as well as the California Age Appropriate Design Code. I asked Baroness Kidron about the broad trajectory of efforts to address online child safety, what she thinks about the challenge to the California law and some of the harsher provisions of laws in other parts of the country, and where she believes the fight for child digital safety is headed in the future.
What follows is a lightly edited transcript of the discussion.
Justin Hendrix:
Baroness Kidron, could you just tell folks a little bit about 5Rights, how it came into being, where it operates and how it's organized?
Baroness Beeban Kidron:
So 5Rights started six, eight years ago, something like that. And really it was because people started trying to join my effort to put kids' safety, kids' rights on the map, and no one was really talking about the digital world in relation to children at the point at which I first got involved, in 2012, 2013. And as I started moving into that space, people started wanting to join, and I realized I needed an organization for them to join. So really it's grown out of that; we're a very collegiate organization. We work with all sorts of people all over the world, whether they are NGOs or policymakers, technicians, engineers, tech companies... it doesn't matter who they are, we're very, very open, but we're very clear that we have three things that we want.
One is to have children's existing rights manifest in the digital world, applied right across. The other thing is we want the system designed with them in mind. So: how would you do this differently if you knew your end user was a child? And the other thing is that we really work very closely with young people, with children and young people who don't have much agency in this space and don't have much of a voice. And so a lot of our advisors are actually also the people who we're talking about.
And so, providing that you buy into those three ideas, we actually work very, very collaboratively with people all over the world.
Justin Hendrix:
A 2019 profile of you by Natasha Singer in the New York Times found you recounting your plan. And the quote that stood out to me was, "It's little Timmy in his bedroom versus Mark Zuckerberg in his valley."
Now, four years on from that profile, the plan has borne some fruit. How would you characterize where you've got to at this moment? How has the plan changed? What has it delivered?
Baroness Beeban Kidron:
Yeah, I think that really the big change over those years is that maybe we've stopped arguing about what the problem is and we've started arguing about solutions. For the first period, people treated me like I was a middle-aged woman who didn't know what rock and roll was. And apart from the fact that it was gendered and boring, I started my life as a camera operator, ended up being a film director, and came into tech as an early adopter... so it was at the very least patronizing. The truth of the matter is that there's been a generational injustice: something that was sort of invented to connect academics who were very similar to each other has become the organizing technology of our world. No one thought about what it would be like for a child to grow up in that world, and it wasn't designed with children in mind.
Now I think everybody buys into that notion, which took me quite a long time to establish, which is: actually, if you treat all users equally, then you treat a child as if they're an adult, and that means you are taking no account of their emotional, physical, intellectual development, no account of the vulnerabilities of their age or agency, and no account of the normal, as I like to say, privileges and protections of childhood. It's not only about protection, it's also about being allowed to be someone different when you are eight and 18, being someone different when you are 11 and 21. There's a load of things that become very problematic if you think that from the moment kids get any kind of device, they're going to be treated as if they're adults. And one in five kids under the age of five have a device of their own in the UK -- I don't know what it is in the States, but I'm sure it's very similar. So we are going to start treating one in five under-fives as if they're adults. That's ridiculous.
Justin Hendrix:
So one of the crowning achievements, I assume, of your effort has been the UK Children's Code, which went into effect in, I believe, September of 2021. What's been the impact of the UK code so far? Has it faced any significant challenges?
Baroness Beeban Kidron:
Well, it faced quite a lot of challenges on the route to being adopted. There was all the lobbying in the world, there was a lot of disbelief, there was a lot of arguing about language, about approach, about whether it would make any difference. And it always amuses me that half of the people said it would make no difference and the other half said it would break the internet. And somewhere in that middle bit you think, well, maybe I'm onto something here between those two things. So when you come to now: it did become law, as you say, in September '21, but the transition period was a year long. So from September '20 it was clear what it was, and in fact even before September '20 it was clear what it was going to be. And since that time we've seen a lot of tech companies roll out hundreds and hundreds of design changes.
Now, in the beginning they used to pretend that they were just fantastic about child safety and it had nothing to do with the upcoming code. In the period that shortly followed, when they were being regularly hauled into Congress and Parliament and the European Union, they started saying, oh no, no, no, we're fantastic because we do the age appropriate design code. So they went from denial to adoption rather swiftly. But maybe the most interesting thing is to look at what they started to do. As I say, there are hundreds of small changes, so we can't go through them all here, but the eye-catching changes, if you like, are things like TikTok actually stopping notifications to under-15-year-olds at 9:00 PM and under-17-year-olds at 10:00 PM. You go, ooh, they can design for different groups, for evolving capacities. And actually it is a harm to keep kids awake and notified throughout the night; in fact, research from Harvard does show that teachers teach at the level that the tiredest child can manage, and so on.
So there's one. Meantime, Instagram and TikTok both took out direct messaging to under-18s from unknown adults. When I say that, mostly people gasp in horror and clutch their necks and go, "Why was that allowed in the first place?" Unknown adults messaging kids who are basically parading their wares from their bedrooms is not a good look. I think interestingly, Google took 18-plus apps out of their app store so that you couldn't see them if you went in as a kid. It wasn't about whether you can download them or can't download them; it was just, if they're 18-plus, forget it. And we all know what 18-plus means. We're not talking about meaning that they can't look at Fox News or the New York Times. We're talking about adult dating sites and pornography and the kinds of things most people don't want kids to have access to. And on it goes. A lot of really interesting changes, a lot of very subtle changes, and a hell of a lot of changes.
Justin Hendrix:
So perhaps some of those changes were more pronounced when you brought the code from the UK into a US context. I understand you and your foundation were heavily involved in the drafting of the California Age Appropriate Design Code. Are there differences between the countries that make it more challenging to implement the vision that you had for the UK code in the US?
Baroness Beeban Kidron:
If you don't mind I'm just going to slightly reframe that question. I will answer it. But here's the thing. I think it is fair to say I did not have a vision of the UK code being everybody's code. What I had was a vision of how you would design for children if you thought about them for half a second. And what has happened is people who have been struggling with this issue have suddenly gone, "Oh my god, the UK has a code. Why don't we have a code? Why shouldn't California's children have the same protections as the UK?" And they ring up and I go, yeah, you should. It's not that difficult. Do it. Right. And I do think that some of its professional detractors, if I might put it that way, like to treat it as if it's a personal mission of one person sitting in an unelected chamber in the UK.
I do have a personal vision, which is to build the digital world young people deserve. That is my vision and it's entirely personal to me. I do not control the way other people in other jurisdictions approach that. And evidence of that is that I also work with the IEEE, who are trying to create technical standards for children. I also work with the Committee on the Rights of the Child, who are trying to take a rights approach. I also work with the African Union, who are trying to make sure that the ways that we protect kids don't take so much infrastructure that their communities can't afford it. And I also work with a lot of enforcement bodies and specialists in the area of child sexual abuse and violence against children to make sure that's not happening.
And also, and we may get to this, I have a very particular interest in looking forward and not back. If I had a criticism of most of the efforts around the world, it's that they are actually trying to regulate what is already past, and they're missing the boat again on what's about to happen. And so in reframing the question, I go: I was absolutely delighted to be whatever help I could be in California and say, look, this is my experience, this is what I think works. And yes, if you align, then we can make the same case, and so indeed can other US states. But even before California, Ireland had done it. This is not a kind of one-stop shop. And I think that's really important to understand.
Justin Hendrix:
I do just want to press you a little bit on some of the context in the United States and see what your thoughts are on it. Of course, in California there is a substantial legal challenge to the bill. NetChoice is leading a challenge arguing the bill violates the First Amendment, giving the government unconstitutional control of online speech. You've got some of those experts that you refer to, like Santa Clara University School of Law's Eric Goldman, who raise similar concerns about online speech. What's your thinking on that at the moment? How does this law comport with the US First Amendment context?
Baroness Beeban Kidron:
Well, I am just really bewildered, if I might say, about their focus on speech, because a) this is a data protection code, and b) it's a design code, and c) the way you get to it is you do a risk assessment of your own services and ask, am I going to damage kids? It doesn't say anything about content. It's not a content moderation code. So what they're doing is they're going to their legal cabinet, they're pulling out their last legal attack on whoever did whatever they did, and they're shoving it at us. And honestly, I think it's flag-waving. It's saying, "Hey, we're going to come after you if you do this. Don't think about copying the California code. We're coming after you." There is nothing about the First Amendment in this code. There is nothing. There are other issues you could raise against the code, but it simply isn't about content; it's not about speech.
I have some very personal feelings about the way that the same community, I can say, came after me personally, and the kinds of things that they tried to get going around me. But to be honest, I'm not for taking things down. I'm not for blocking, I'm not for kicking out. I'm not even for excessive parental controls, which is probably the bigger cultural gap between me and the US than the First Amendment. I am absolutely determined not only that I should have free speech, which is one of my rights, but that children should have free speech. It has to be possible to participate in the conversation. And none of what we are doing here with the code undermines that. What the code does is say it's not a corporation's God-given right to exploit children, monetize children or harm children for commercial gain, and that what you've got to do is actually check whether you're doing any of those things before you do them and take some steps to make sure you don't.
And if I might make even one more point on that issue: there are various things popping up in states, which I have absolutely nothing to do with, that are very, very controlling, and we may get to those. A colleague of mine said this in a debate in the Lords; he basically said, "Are we going for an airplane kind of level of safety, where absolutely everything is about safety because you want zero accidents, or are we going for a driving kind of safety, where we accept that there is a level of accidents, but there are rules about the car, there are rules about the driver, there are rules about the road and there are rules about the pedestrians?" And that is what we are going for here. The code does not take out all risk. It is not a hundred percent secure. You don't get to have the code and then not have to worry about your kids. It's absolutely not that.
It is really much more like saying you would not ship a car without brakes, without a rearview mirror, without an airbag, without all of that, and then put a 12-year-old in charge. You just wouldn't do that. We don't do that. This is actually about normalizing online life, in the way the digital environment itself has become normalized. If we weren't all living in it 24/7, we wouldn't need to worry about how it affects kids. But since we are, why do we have different rules and this sort of era of exceptionalism? I think even the more nuanced free speech advocates are beginning to say free speech is not the same as free rein over every thought, behavior and action that we take 24/7.
Justin Hendrix:
I'll query you just a little further on this, because part of the argument is that age assurance creates barriers for children to engage with internet content; that it chills speech; that it potentially creates barriers to participation. Are there differences in the way that age assurance has worked in the children's code in the UK and the way that it's described or implemented in the California law that you think are important?
Baroness Beeban Kidron:
I actually think that there's a difference in the way it's spoken about in the media; I don't think there's actually much of a difference in the law, if you see what I mean. So on age assurance, I think the first thing to say is: can we stop pretending it doesn't happen already? When you log in using Facebook, one of the pieces of information they share is your age range. That's 42% of people on the earth. It's ridiculous; it's a sort of silly argument. And it's not only that they know your age: in fact, in gameplay you can work out the age of a child to 93.4% accuracy in 10 seconds from how they move their body.
And also, if you are one of the millions of people wearing one of those watches, they don't just know your age, they know when you last had sex. So I think we've got to put the knowing of age into the context of how the world is right now. Then if you go back and look carefully, it doesn't say we've got to know who you are -- age and identity are very separate -- and it doesn't actually say we've got to know exactly your age. What it suggests is that, depending on the risk, you do the minimum viable check. And there are many, many routes to that, whether it is behavior, whether it is biometrics, whether it's being told by a third party, or whether it's going via some of the fantastic things happening here around checking with schools and having tokens of age related to schools, and so on and so on.
But my point is, let me ask you the other way around. On the whole, a large variety of people, in fact a huge majority -- more of a majority on this issue than on possibly most social and political issues -- would like kids not to be routinely delivered pornography, would like kids not to be routinely contacted by people for sexual activity, would like kids not to have strangers coming and talking to them, et cetera, et cetera. So my point being that I think there's going to be age assurance in those areas that we are concerned about as a society. And I don't know what it is in America, but here it is, I think, something like 92% of parents who would like age assurance around those things. So on the one hand we get that. The question is not whether we have age assurance; it's: is it private? Does it work? Is it secure? Does it undermine the rights of adults? Does it kick kids out where they shouldn't be kicked out? And if you notice what's in the code, it says it must be proportionate to risk: low risk, low assurance.
And the last thing I want to say about age assurance is that I am kind of wryly interested, latterly, in a lot of the cases going against companies who have failed to uphold their own limit of 13. And you kind of go, I didn't invent 13, I don't even like 13. But why is it that they are bleating about age assurance when they're failing to uphold their own rules? And when you put it in that complexity on the one hand, and then you start to look at all the different ways in which age assurance is now being achieved, a lot of it is a lot less intrusive than it is sort of managerial. You ask someone their age in one way, they lie; you ask it in another way, they don't lie. If they lie, you don't let them back in again. You ask them later and they forget what name they gave.
And there are lots of layers of doing it, where you don't even need to know who they are, or why, or what, or any other piece of data about them, that are perfectly adequate in order to deliver their data rights or information in a way that they can understand, et cetera, et cetera. And then there are some very hardcore ones. But I would maintain, if you're buying a knife here, which is 18-plus, or a gun in the States, which is whatever it is in different states, you want to have a robust version. And if you're doing it to give them a video about their privacy rights, you really don't care; roughly speaking, knowing what their age is so that you can deliver something good to them is a good thing. So I actually think it's a little bit like the First Amendment debate: if you really, really want to make it so extreme that there is no nuance, it looks terrible, but when you get into the reality, it seems to me a very poorly held debate.
Justin Hendrix:
You've already pointed to the fact that there are other children's safety bills being advanced across the US, and I want to talk a little bit about those. But I guess I'll just -- as you have perhaps -- drop the veil of my personal considerations in this matter. I'm a parent; I have young children who use digital media, and I'm aware of the various challenges of that, the way it impacts their lives and moods and attention and even family dynamics, as I'm sure many of the parents listening to this may be. But I'm also sympathetic to some of the critics, particularly of some of the bills we're seeing pop up in other parts of the US, who are worried that things may be going a bit too far.
So there are these variants of these age appropriate design codes, children's safety bills, where there's clearly a kind of authoritarian intent. Often the lawmakers that are advancing them, in word and deed, represent a danger to marginalized youth, to LGBTQ+ youth. Utah, of course, has passed a law that prohibits tech platforms from allowing users under 18 to have accounts without the explicit consent of a parent or guardian. Are there child safety bills that you oppose at present, or versions of these design codes that you would disavow?
Baroness Beeban Kidron:
Well, I think the first thing I have to say, to be really clear, is I'm a member of a different legislature in a different country, so I have no position from which to disavow anything. So just to be really clear about that. But I think if you were to be unfortunate enough to trawl through everything I have ever said in public on this issue, you would know that I do not think that parental controls are the answer, because not all people have engaged parents. Not all people have parents, and not all parents do right by their kids. And also, I was a child; I didn't want my parents to know my everything. At the age of 15, 16, 17, I probably -- and I'm not prepared to say what age I am now -- wouldn't want my parents to know everything I thought right now either. So I think that's the truth of the matter. And there is a wonderful, wonderful episode of Black Mirror, the UK TV show, that actually brilliantly explains what's wrong with parents knowing everything.
I think the other thing, from a really serious point of view, is that I have supported the creation of a group of bereaved parents whose kids have either committed suicide or been murdered in ways very much to do with the digital world. And in a recent meeting in which each of five of the families told politicians what their experience was, four of the five had parental controls. So I think it's a false dawn, it's a false dawn. So my biggest objection is twofold. One, I guess, is that I am concerned that people think parental controls are some sort of silver bullet, and then they find out they're not. And the second thing is, if you look at all of my work -- and I did start by chairing a group that wrote the addendum to the Convention on the Rights of the Child, which is actually about children's right to participate, their right to education, their right to information.
And I actually feel that we infantilize children if we don't understand that they have evolving capacities. And I will go back to what I've already said to you, and I often have to say it because I think people slip to the content, slip to the control, slip to all that: I'm not talking about locking kids in their bedrooms, nor am I talking about wrapping them in cotton wool. I'm talking about whether it is okay for major corporations to systemically and routinely exploit them. So I'm not really looking at the relationship between children and parents here, although obviously, because of the work I do, I have quite a lot of knowledge about how it plays out. I'm looking, in terms of the legislative tool, at safety, product safety. And I am just bewildered, because literally -- and if any of your many listeners would like to write to me, my email address is on the parliamentary website -- please tell me another sector that doesn't have to make sure that its product is safe for the consumer to whom they sell it, engage them with it or give it away.
If someone can tell me another sector -- because it ain't transport and it ain't food and it ain't drugs and it ain't toys -- I literally cannot find one. So if someone can answer that, I probably can go home and stop doing this. But actually we're talking about product safety here, and I think all of these other things are, unwittingly or wittingly, diversionary. And the only people they serve are not the children, not the citizen, not the politician, but a company that's got a free card. And I don't think companies should have free cards on our kids.
Justin Hendrix:
Of course, there's a broader conversation about this than just what's happening here in the United States, although it is truly gathering steam here in the States as various legislation pops up around the country; I think there are a couple of dozen states, if not more, considering child safety legislation at present. Can you maybe sketch the global picture beyond the UK and beyond the US: where else are these types of laws or regulatory interventions being considered?
Baroness Beeban Kidron:
I think I can honestly say there is nowhere that is not considering this. Obviously there are places in the world dealing with conflict or natural disaster, or failed states; those countries are not considering it. But you can absolutely, without question, say that this is very high on the agenda. And I think there's one thing that may be interesting in considering the global picture. When we were doing what's called a general comment -- the addendum to the Convention on the Rights of the Child saying how children's rights apply in the digital world -- one of the things we did was run workshops in 28 different countries. Now, the workshops were not tick-box; they were either three hours or six hours, and they happened with children between the ages of seven and 19, I think. What absolutely confounded all the people who crunched the numbers, as it were, about the children's attitudes was that wherever they were in the world, probably four fifths of their concerns, 80% of their concerns, were absolutely identical.
So in some of the Global South there was more concern about the price of data, access to the internet, things like that. In some of the Middle Eastern countries there was more divergence around gender, about who had what access and on what basis, and so on. But for the most part, the kids were talking about the hostility, the addiction, the nudges, the social and personal sort of despair that they felt in the competition, et cetera, et cetera, all the things. And of course the sexual content and the violence and so on. The children raised the same things because they're all using the same services. And so what you're actually saying is that the development of a child, whose cultural envelope used to be school, family and state, is now more determined by Snap, TikTok, Instagram, YouTube, et cetera, than by those things.
And I think that that is something people really should think about when they start dancing on the head of a pin about whether this thing or that thing may shift the balance. I think they're all looking over there, thinking about tiny little pieces of things that we hold true, which I hold true too, but they are missing the car crash that is coming through the center of the frame. And that moment really made me understand that this by-design, by-default product safety was even more important than the rights. Not that the rights are less important. We must have a wellbeing mission for kids. We must have a positive vision of the digital world for kids, because this is where the future will be built. And I do think that this is ultimately, profoundly, a case of uses and abuses.
Yeah, it's not about tech. Everything that has come directly from my hand, or indeed from colleagues that we work with around the world, either in 5Rights or outside of 5Rights, always says the digital world, not social media; always says by design, not don't do this and don't do that. And again, I think I've said this already, but I can't stress it enough: my biggest fear is that we're all taking potshots backwards, when what we really should be looking at is AI, is immersive tech, is generative systems. And we really need to have a conversation -- I consider myself a privacy activist, I just do it on behalf of children -- about what privacy is, because the current conversation is as if it's a 2D world between us and the state, when actually the people who are making out like bandits from our lack of privacy are the commercial players. And I think if you don't put that third leg of the stool out there, you are just living in cloud cuckoo land.
Justin Hendrix:
Well, I want to ask you about the bandits. That 2019 profile situated you at a meeting in Silicon Valley where you were engaging with technology executives. Are you still welcome in Menlo Park? Are you still welcome in those offices, or, as the various tech firms fund legal challenges against your work, are you finding doors are often closed?
Baroness Beeban Kidron:
No, I'm not finding them closed. I do go and see people. I do speak to people in most of the companies. There are some who I know better than others, probably for historic reasons rather than any good reason. And I was in Washington and saw companies just very recently, two or three months ago. So we are adversaries, yeah, but the detailed conversations never leave the room. And the detailed conversations are not political, they're practical: this is what's difficult about what you're saying; have you thought about it like this; that's ridiculous to say that. They are very practical. But the reality of those conversations is me saying to them: don't resist the good stuff. If what we are saying didn't hurt, it wouldn't be making a difference and it wouldn't be dealing with the problem.
But if you don't go for the good stuff, you're going to get something much more negatively radical, if you like, and very often unimplementable. And so I come back to the beginning of our conversation. Again and again I've been told it's fluffy, it's woolly -- what does best interest mean? What does this mean? What does that mean? But you know, you start with a service, you start with a risk assessment, you get the principles written up on post-it notes next to you. You do know what to do at the end. It is a system of design keeping kids in mind. And that is not the most we should expect from the tech sector; it's the least. And I actually think that they push it away at their peril. And some of the things that you've alluded to are their peril.
And I am speaking as an individual here, because we're talking about the US; I'm not speaking as a politician. But I have spoken to a lot of parents' groups, I've spoken to bereaved families, I've spoken to families with addicted kids who are willing to do anything to play one more round of the game. This is not a happy place. This is a collective torture of a group of people to whom we owe a duty of care. And I think you've already said you are someone with at least one small child in the house. Surely you want better for your child than, at the age of eight, being bombarded through the night with such vacuous, empty bullshit that it's only there to keep you on; it's not even content.
Or, where it is content, they are willing to ignore the fact that, contrary to how it plays out in the media, most children first see porn not because they look for it, but because they're offered it up. Most girls get involved in self-harm not because they looked for it, but because someone recommended it to them. I think that is an abuse. Now, the code doesn't fix all of that, but I think it makes the companies think about what the end result of the design of their product is, and that's all I'm asking. It's not much to ask, and I think it is working better than most people expected.
Justin Hendrix:
We started the conversation talking about your plan, and I think we've, to some extent, covered where it's at in the moment: where you've got to, what your considerations are. You're clearly already looking ahead, thinking about AI, thinking about the quote-unquote metaverse, et cetera. What's next? Where are you going to take the fight next?
Baroness Beeban Kidron:
So I think that we've got a few fights ongoing around the code, and around making sure that we don't do with tech regulation what we did with baby milk. I'm not interested in getting it out of the Global North only to send it all to the Global South and poison their kids. So there's a bit of making sure that the advances we make, we make together, and that bit is very much done on the ground by friends and colleagues and other NGOs and so on. Here, right now, we've got both a data bill and an online safety bill coming through Parliament, so that preoccupies me in various ways. But if you want the plan: I have recently become a fellow in the computer science department at Oxford, and I plan to do some work on contestable AI, on verification of information -- not only age, but in general, what does that look like -- and on measurement of harms.
And also I'm looking at the legal language around the metaverse, to try and look at the taxonomy, so that the metaverse isn't a place where everything ghastly happens but nobody's touched, as it were. So those are my areas of concern, and obviously I look at everything through the lens of the under-18, through the child. But I am a citizen myself, and I'm interested in the fact that when I get attacked, I always get attacked for my gender; it starts with that. So I have personal interests in this area, and I'm very, very worried about the news, and the news cycle, and all of that.
There's plenty to worry about, but those four things: making sure that we don't do the same thing again. I spent 10 years trying to persuade people not only that there was a problem, but that there's an answer. I think for me, my great pleasure is in seeing other people use the language of keeping children in mind -- embedding their rights, best interests, by default and design, likely to be accessed. There's a whole lexicon of things left along the way, and other people will do something good with it, I believe. But for me it's just looking over the parapet and seeing what's coming up. And let's not forget the kids this time; why don't we build it in?
Justin Hendrix:
Baroness Kidron, thank you so much for speaking to me.
Baroness Beeban Kidron:
Pleasure.