The Future of You
Hello, welcome to the Future of You podcast.
Here we’re going to investigate and analyse all the ways emerging technologies are going to affect our identity. We used to argue about whether personal identity was in the mind or in the body; but now the psychology of the self and the biology of the self have been joined by a third dimension - the technology of the self.
In a digital, data-driven world, Facebook gets a say in verifying who we are, medical science can alter our genetic make-up, and AI will help us create our digital twin - just how many identities do we need in the multiverse? How is 21st Century technology changing who we are?
I set out to research exactly that when I wrote my book, The Future of You. That turned out to be only the start of my journey into our changing identity in a digital world. And I thought rather than continue to research this on my own, I could invite you along too.
I’m Tracey Follows, I’m a futurist and founder of Futuremade, the futures consultancy, and I’ve helped brands and businesses like Virgin, Google, Diageo and Farfetch spot trends, develop strategic foresight and fully prepare for what comes next.
I believe identity is the defining issue of our generation. So every fortnight I’ll be releasing new episodes, featuring panels of experts as we delve into topics such as how we use media to express our selves, the emergence of the extreme self, transhumanism and the augmented self; anonymity, genetics, biometrics, mind clones, and the digital afterlife - these are just some of the innovations, and controversies, we’ll cover.
Join the community, buy the book at thefutureofyou.co.uk or connect with me on X at @traceyfutures. Please come with me on this journey to explore identity in the digital world, as we investigate the future of me, and the future of you.
The Discussion: ‘Your Face Belongs To Us’ with Kashmir Hill #27
Increasingly we’re using facial recognition to do everything from unlock our phone to make payments. But what might the long-term impact of giving away our facial data be?
In this episode I talk to New York Times technology reporter and author of “Your Face Belongs To Us”, Kashmir Hill.
We talk about the impact of facial recognition technology, the potential for misuse of the tech and its limitations. We also discuss the importance of data privacy and what’s happening to the billions of photos millions of people share online every day.
‘Your Face Belongs To Us’ by Kashmir Hill
Amazon UK https://bit.ly/45DFs58
Amazon US https://bit.ly/3tAUQC2
The Future of You is a finalist in the Independent Podcast Awards 2023.
Tracey's book 'The Future of You: Can Your Identity Survive 21st Century Technology?' available in the UK https://bit.ly/44ObTha and US https://bit.ly/3OlDxgk
Tracey Follows 0:20
Welcome to The Future of You. This week, I'm joined by Kashmir Hill. Now, you probably know of Kashmir Hill's work. She's a technology reporter for the New York Times. But most recently, she's written a brilliant book called Your Face Belongs To Us. It's the gripping true story of Clearview AI. It illustrates our tortured relationship with technology, and it entertains us even as it tells us about an exploitative technology built on facial recognition data. But it also presents a powerful warning that in the absence of regulation, technologies like this could spell the end of anonymity. Now, there's much more to say about Kashmir. I'm sure you're well-versed in her brilliant work for the New York Times. So let's crack on and talk about Your Face Belongs To Us.
Welcome to The Future of You Kashmir, it is honestly a real privilege to get to speak to you. Thank you for making the time for us today.
Kashmir Hill 1:21
Thank you for inviting me.
Tracey Follows 1:22
Not at all. I was so excited when I saw your book, and it feels like ages ago now - I preordered it on something like the sixth of July. People in the US will be getting it next week, I think, but it comes out the week after in the UK? But obviously you've been talking a little bit about it and writing on this topic for quite a while. And I remember when I was writing my own book in 2020, your undercover work on Clearview was first coming out - I think it was at the beginning of that year, wasn't it? Because I remember reading it as I was researching identity and thinking, 'Who is this? What is this company Clearview?' And obviously it was all of your work. So I was heavily embroiled in reading it all then. But I'm fascinated. Let's start with the title, and I have two questions for you - the title of the book is Your Face Belongs To Us. And the subtitle of my book was Can Your Identity Survive 21st Century Technology? So my questions are: can it, and who is 'us'?
Kashmir Hill 2:24
So the 'us' is unclear for a reason. I mean, the book focuses on Clearview AI, the startup that kind of came out of nowhere and scraped billions of photos from the internet without people's consent. They now have 30 billion faces stored in their database. And yeah, they just decided they were going to make a business of selling a facial recognition app that could find people's photos anywhere on the internet, on social media sites, etc. But what I found researching the book is that, you know, facial recognition technology goes back decades - it goes back to the 1960s. The CIA was funding an engineer in California, who was in Silicon Valley before it was called Silicon Valley, to try to develop, you know, facial recognition with just the very early computers. And there have been lots of engineers, technologists, companies that have been trying to develop the app that Clearview AI developed. And they've been using all of our faces. Basically, the breakthrough in this technology was us putting our faces on the internet. And they have been used by so many different companies around the world. And so that is kind of what I was trying to get at with the title: your face belongs to lots and lots of people who want to get very good at tracking and identifying you.
Tracey Follows 3:51
Two questions: are people aware of it? And if they are aware of it, do they care?
Kashmir Hill 3:57
I always hesitate to say what people know about, you know. It is hard to say; I think it varies from person to person. I think a lot of people are aware of facial recognition technology in a kind of benevolent way - using it to open their smartphone, and maybe using it when they're at the airport as they're going through security and, you know, matching their face to their passport. When I first discovered Clearview AI, what it was doing was a secret. It was not known by the general public. People found it really shocking that their photos had been collected by this random company. And honestly, people were upset. They were angry. I remember the story was like a tidal wave across the internet - so many people talking about it on social media, on Twitter. All the other journalists around the world did stories about it. And yeah, I think it was really shocking for people; they did not realise something like that was out there.
Tracey Follows 4:56
I remember when you were writing about it in early 2020, I think it was about the same time that Facebook were doing lots of kind of amusing games around, you know, 'oh, why don't we do this artistic version of your face', or 'let's do an old version of your face', and encouraging people to use the app and then get these versions of their face back. And lots of people were sharing it around. And partly because I was aware of what you were writing about, and the research I was doing in general anyway, I didn't do it. I didn't look at it. I didn't use it. And I watched other people using it. And it's got that real kind of memetic feel to it, hasn't it? You know, you're doing something with your own face, and then reposting it out to your own audience constantly. So it kind of became a bit of a game. But what Clearview is doing is really not a game, is it?
Kashmir Hill 5:51
Yeah, I mean, what Clearview is doing is, you know, they built this database and they wanted to make money off of it. They would end up selling it to police departments, law enforcement officials - thousands of agencies - and it was tried by police departments around the world, including in the UK. But originally, I mean, they wanted to sell it to companies, you know - grocery stores, hotels, airports. And it's interesting to see already what is happening in the United States. There is an event venue here called Madison Square Garden. It's a really iconic venue for concerts, shows, sporting events.
Tracey Follows 6:32
I've been!

Kashmir Hill 6:32
You've been?

Tracey Follows 6:33
Yes, Rolling Stones.
Kashmir Hill 6:34
Oh nice (laughs). So yes, it is well known. And they started using facial recognition technology - not Clearview AI, a different company - a few years back to address security threats, supposedly. You know, it would keep out terrorists; more often it was used because if there was somebody that threw a bottle down on the stage or got in a fight in the stands, then they would get on the banned list and they wouldn't be allowed in anymore. But in the last year, the billionaire James Dolan, who owns that venue, decided to ban lawyers that worked at firms that have sued his company. And so he ended up making this ban list of thousands of lawyers. I mean, you might remember going in there - the security is pretty intense. You go through the metal detector before you've even shown a ticket. Now, if you're on this list - I went with a lawyer, I watched it happen - she got pulled aside. And they said, you're not allowed in here; you work at a firm that's banned. And she said, I'm not on that case, you know, I'm not working on anything related to this. They said, it doesn't matter - you're banned until your firm drops its lawsuit against us. And so that is just, you know, a very striking way that this technology can be used to discriminate against people in new ways, even based on the work that you do.
Tracey Follows 7:58
That's fascinating, because one of the things that's happening here in the UK - I mean, obviously, there's been a conversation around facial recognition cameras in general, not Clearview necessarily. And there's been lots of research - the Ada Lovelace Institute did some research on citizens' attitudes, and time and again it kind of comes back: 'Well, I don't mind it being used if it's for law enforcement, and only in cases where there's an obvious benefit or advantage' - because you're kind of trading security and privacy against civil liberties infringement and trying to get the balance right. But what's happened in the last, I would say, even this last six months, because of the increase in shoplifting and some of the abuse that's happening - I mean, there are a lot of stories today on the front page of the paper about some of the abuse of supermarket staff in stores - you know, there's this wave, this initiative, to use facial recognition in stores now. And there was a particular company who developed it themselves and have now commercialised it, called Facewatch. And so there are now two issues: once it used to be about government surveillance, or surveillance from security and public services like the police, or maybe at big events, as you say. And now there's this commercial use, and possible abuse, of it as well. And that feels very different, because, as you say, it's really about putting some access points in now and making some judgments about people. Is that the same in the States and elsewhere?
Kashmir Hill 9:28
Yeah, I mean, speaking to that point, right - I think a lot of people are comfortable with the idea of a store trying to keep out shoplifters, you know, people who have been there before and have stolen from them. I do think there's a group of people that object even to that - just the idea that every single time you walk into that store, you are having your face scanned and they're trying to figure out whether you're a criminal or not. You know, maybe you get misidentified because you look like somebody who has shoplifted there before - things can go wrong. But the other risk is just that you get slippage. It starts out with: we don't want you coming in here because you're a threat, you've been violent with our employees before, you've stolen. That's how Madison Square Garden started - it said this is about security threats. But then you realise there are other ways you can wield the superpower. Madison Square Garden said, hey, we can use this against lawyers to try to discourage people suing us all the time. What if a store said, you know what, let's not let in anybody who's left us a bad review online? Or maybe, you know, it's a store that has certain political leanings, and they say, oh, we don't really want anyone coming in here who has different political leanings. You could just basically have this new era of discrimination against people. And so that is the possibility: once you get this infrastructure in place, it might be used in a way that wasn't originally envisioned.
Tracey Follows 10:54
So now, going back to Clearview - they invited you in, didn't they? Didn't you sit in on meetings? Given the more you did around them and this story and this emergent but secretive business, did they invite you in somewhat, for you to find out a little more about them?
Kashmir Hill 11:10
They didn't at first. When I first began reporting on Clearview, I had gotten a tip from somebody who saw them mentioned in some police documents that they had gotten through a public records request. And when I first started looking into the company, they did not want to engage with me. They did not want me to write about them. They were not only, you know, ignoring all of my phone calls, emails, attempts to contact people connected to the company - they were actively surveilling me. When I couldn't get anyone from the company to talk to me, I found police officers who had access to the app and interviewed them about it. And they all raved about how amazing it was and how it worked so much better than the facial recognition technology they had used before. And I talked to a couple of officers who volunteered to submit my face to the app and give me the results, so I could see how it worked. The first time that happened, the officer immediately stopped talking to me and never sent me the results. The second officer I talked to said he uploaded my face, and he said, oh, it's really weird - you don't have any results. There are no photos of you turning up in the app, which is weird, because when I google your name, there are lots of photos. And I said, yeah, that seems strange. He said, maybe the servers are down. Then he stopped talking to me. And I finally, basically, recruited a police detective to work with me. And he got access to the app, because they were giving free trials to anybody with a police email address at that point. And he uploaded my photo, got no results, and then a couple of minutes later got a call from the company. And they said, are you talking to a reporter named Kashmir Hill at the New York Times? And so I realised what had happened: they had put some kind of alert on my face, and they knew every time somebody looked for me. Which I found alarming, because it meant that this random company had the ability to see who officers were looking for.
And the ability to control whether they could be found - they'd actually blocked my face so it didn't have results. But eventually, they kind of realised I was not going away. They hired a crisis communications specialist who had worked kind of around the Rudy Giuliani administration in the US. She was hired by a politician here named Eliot Spitzer after he had, like, a sex scandal. She's kind of the person that you call in an emergency. And she told them, look, this reporter is going to write a story; you need to talk to her. And so that's how I met Hoan Ton-That for the first time - this really young guy who's kind of the technical genius at Clearview AI. And when they found out I was writing a book, they felt like my original story about them at the New York Times had been a fair story, and they decided to talk to me about the origins of the company. And yeah, there were some disputes there. There are a lot of people around Clearview AI, and one person in particular that the people who control the company now are kind of embarrassed to be associated with - a real conservative provocateur, 'kind of an online troll' is what a lot of people call him, with kind of a distasteful online reputation. And so they really were trying to write him out of the story. And he was talking to me. So there was kind of this battle of the narratives between some of the different people that I was talking to for the book.
Tracey Follows 14:34
Wasn't Peter Thiel involved at some point as well?
Kashmir Hill 14:39
Yeah, so a lot of people that were around the company in its early days were from pretty conservative circles. And so before this app was even really an idea, those two main people - Hoan Ton-That and this person that they are kind of embarrassed of - travelled together to the Republican National Convention when Trump was becoming the candidate for the Republican Party. Peter Thiel was talking there, and they ended up meeting with him privately; Hoan Ton-That got to meet him for the first time. It was at this convention, actually, that they started talking about a tool that you might be able to use to judge strangers. And, you know, half a year later, when they started to develop the app - which was called Smartcheckr at the time - they went back to Peter Thiel and said, can we have some money for this? And he gave them their first $200,000. So yeah, without Thiel, this company probably wouldn't exist today.
Tracey Follows 15:33
Is there any crossover between what Clearview do and Palantir? Or nothing at all?
Kashmir Hill 15:38
I don't know of any explicit ties between the companies, but they're both in that same genre of mining data and making it easier to find. And, you know, what Clearview does is very radical. It's something that hasn't really existed before. Europe has said that it's illegal - you know, that the company should have gotten people's consent before including photos of European citizens in the database. But they say, you know, we're just Google for faces; we're just taking public photos that are on the internet, organising them by face, and making them easier to find. And that is what a lot of these kind of data mining companies sell. You know, they'll take public information, private information, and just make it easier to sort through.
Tracey Follows 16:27
I will say, I kind of don't buy that. But on the other hand, this week I've tried to claim the knowledge panel on Google search for my own name, because somebody said to me, oh, you must claim that, and I thought, I must actually, especially with all this work on identity. So I went in - I don't know if you've done this - but filled in the form, submitted it. And they came back and said, well, we can't identify or authenticate you as Tracey Follows. This is how my whole book started, when Facebook did this to me - when I couldn't log in, and it wasn't machine readable, and they told me I wasn't Tracey Follows. And, you know, however many years later, now Google are telling me the same thing. So I tried to give them a bit more information. Once again, they said I can't claim the knowledge panel, because they can't authenticate me, Tracey Follows, as the Tracey Follows that is in the search results - which I just, again, find strange. I mean, there are lots of photos of me - it's my face - but I can't claim it. Not that I'm suggesting anyone else would want to claim it. But there is something in that, isn't there: all these photos are there, and I can't get them. I can't even, in a sense, bring them back into my orbit. They're there now. Whether another company should be allowed to scrape them is, I guess, another issue, but they're kind of in this limbo land, I feel. So in a sense, they're up for grabs, aren't they?
Kashmir Hill 17:48
Yeah, I mean, this is a conundrum of our age: do you try not to put anything online, so that you're very private, so that you have more of a sense of control? Or do you try to have a sense of control by putting a lot out there, you know, creating the footprint that will be found for you? And this has been a problem with Google for, you know, a long time now. If you don't give Google, you know, information about yourself for them to list in your rankings, that means someone else can do it. And there are so many sites now with these creepy names that I don't know, like PQ. And yeah, they're just these data brokers that suck up all this information and just put it on there for you. So, you know, this is a problem we haven't solved yet. One nice thing about, you know, European privacy law - and there are a few states here where we have this power - is that you can go to Clearview AI and say, hey, I live in the UK, I don't want to be in your database. You do have to give them your face, but then they will delete your information from their database. But again, you do have to hand over your face; you have to give them an ID to prove, again, that you're Tracey Follows. There's one person I profiled in the book who says, I don't want to have to give them that information - because to keep me out of the database, they would have to keep my face, because they would create this block. And they're not necessarily deleting you from the database; they're putting you on a block list so that your photos don't surface.
Tracey Follows 19:17
Yes. Interesting. What is so special about the face?
Kashmir Hill 19:23
This is an interesting question. So, you know, for a long time, this was the quest of the government and various corporations. One thing corporations wanted to be able to do was put something in your television so that, you know, when you're sitting there watching a TV programme, your television is watching you back and knowing who's in front of it. You know, the face - it's our personhood. It's so unique to us, and yet it's public. We're walking around all the time, you know, showing it. It's how we express ourselves. And what made me want to write this book, what I found really alarming, is that the face is now becoming this key to link us up with that online dossier. And so all this information that we've accumulated online - that others have accumulated about us - can now be just directly attached to you via your face. And so, you know, you walk into Madison Square Garden, they know you're a lawyer. In London, these police vans are going out with facial recognition cameras on the roof, and they'll just scan a crowd. And if you're wanted by the courts for anything - from something very serious to even just a traffic infraction, as I found in one case - the police can come and arrest you. And so, yeah, people kind of ask me, well, what do you do - like, should I wear a mask? Even with a kind of Covid mask that covers your nose and your mouth, a lot of this software still works. So you would literally need to walk around with a ski mask on to be able to protect yourself. And I just think that's not a way that we can operate as human beings. We need to see each other's faces; it's just a powerful part of who we are and how we communicate with each other.
Tracey Follows 21:09
Yeah, it's feedback, isn't it? It's understanding each other through feedback - most of it, well, not all of it, but a lot of it comes through the face. I just wondered how much you'd looked at other biometrics, because obviously, you know, I spent a long time talking to people about whether it's iris recognition, palm gestures, veins in the palm of the hand, or, you know, your smile - like Smile to Pay in China - all these different things. And it keeps coming back to the face: the face is that place of emotion, and feedback, and meaning. And there's something very special about it in terms of it being that sort of communication interface, if you like, but also it being the best way of identifying or authenticating somebody. But once, you know, it stops being a body part and it becomes a part of media, then, as you say in the book, you've kind of lost control of it, haven't you?
Kashmir Hill 22:02
I mean, part of what was appealing about the face for the companies, as they've developed this technology, is that one thing we love to do is post faces online. You know, we love putting selfies online; we love sharing our loved ones online. And so we did hand over a bunch of information that helped them to make this technology better. You know, people uploaded their photos to Facebook and tagged all their friends in it, so their friends could see the photo - same thing with Google. And so we actually helped tell the computers: okay, well, you know, this is my sister in this photo; this is my sister when she's looking sideways; this is my sister when she's looking up. And that was really helpful to them. And we don't do that, necessarily, with other types of biometrics. We're not, you know, putting our handprints online, or scans of our irises - but our faces we're really giving up. And I wonder - I mean, I fear, honestly - that this is going to happen with other biometrics. So as we're talking now with voices, you could imagine a company - a Clearview AI for voices - where they go around and they just collect all these recordings and create voice prints from them. And so you could go there and take, you know, 30 seconds of your voice, Tracey, and upload it, and they'll say, here are all the places we found Tracey Follows talking on the internet. You can listen to, you know, anything she's ever said, and we've transcribed it for you - so here's a printout of everything she's ever been recorded saying on the internet. I mean, the AI is so powerful now that more things like that are coming.
Tracey Follows 23:35
And once you can track it back to one identification like that, then you've got a really powerful system there, actually. I don't know about this, but did you look at whether Clearview are going to follow us into the metaverse? Are they going to? Because actually, I was just thinking, as you were talking, about, you know, how public our faces are. What happens when we're in the era of spatial computing and everything's ambient? So let's say, you know, in an ‘Apple Vision Pro’ world, whatever, we're there but almost holographic. So we're online, but we're outside of the internet. So everybody's kind of seen, but they're also online at the same time. It's going to be interesting, isn't it?
Kashmir Hill 24:21
Yeah, one thing I thought about - I did this story for The New York Times about living in the metaverse, and I spent every hour of the night and day there, all 24 hours, over the period of a month. And, you know, I had to wear these heavy glasses. And right now, when you're kind of in the metaverse as this virtual avatar, there's not a lot of facial expressiveness, because the cameras aren't gathering enough information from the face yet. But in order to really make people feel like they're there, basically these companies have said, yeah, we want to collect more face information - we want more cameras and sensors looking at your face, so that when you smile in the real world you also smile in the metaverse. But that means they would be collecting, you know, this really granular data about your face and how it moves, alongside your voice. I wonder how powerful that would be for improving kind of emotion detection AI. Because you would have, you know, people that tend to like the metaverse - if they really get addicted to it, they just spend hours and hours and hours; they plug their glasses, you know, into the wall so that the two-hour battery life doesn't run out. And so I just wonder, you know, what these tech companies could do with that data, and with understanding exactly what your face does when you're happy, or when you're bored, or when you're sad about something, or when you're lying.
Tracey Follows 25:50
Like in the behaviour panel online? I mean, I think that's fascinating. I'm guessing, you know, they're going to use it to do hyper-targeting of products and promotions to us at some point, which would be the commercial use again, I guess.
Kashmir Hill 26:03
Yeah. I mean, it comes up a lot - policymakers have gathered a lot to talk about facial recognition technology and what to do about it in the US. And every single time, they play that clip from Minority Report where Tom Cruise is, you know, running down the hallway in the mall, and all these ads are calling out to him - basically, 'You look stressed out right now, would you like a Guinness?' - calling out to him by name. And it does feel like it's not quite as fictional a possibility as it once was.
Tracey Follows 26:33
Where do you stand on affective computing, and, you know, whether AI can really read our emotions fully, or where it might be able to go in the future? Because obviously, a few years ago, there was a lot of investment flowing into that, and then it seems to have gone a little bit quieter. I mean, you've obviously investigated it.
Kashmir Hill 26:51
Yeah, I mean, I think there's a lot of scepticism that you can really tell someone's emotional state from their face, because emotion is very complex, and, you know, our faces don't always reflect what we actually feel. That said, having looked at facial recognition technology really closely, and its development over decades - there was this period of time when scientists, engineers, the academic community thought: this is just never going to work well; this is an impossible challenge for computers. They are not going to be able to tell that someone's the same person when they're wearing a hat, or glasses, or when they're smiling, or when they're looking away from the camera, or when they have grown a beard. You know, how can a computer possibly do what humans are capable of? And yet, with enough data, you know, and enough computing power, good automated facial recognition is now often better than human beings at recognising the face. And so I do wonder what will be possible with, you know, other kinds of analysis, like emotion analysis, when they've gathered more data - and if we start wearing, you know, these glasses on our faces that are recording information all the time about our facial expressions.
Tracey Follows 28:15
What happens to super recognisers? You know, there are people who have done those tests - you must have done this testing - are you a super recogniser? Are they still going to be around? Or are we going to rely totally on technology in the end, and almost outsource some of the governance, and maybe even some of the relating, if you like, to technology? I mean, how much of it are we going to sort of end up relying on, do you think?
Kashmir Hill 28:42
They're actually still really important with this technology, because most of these programmes - especially in, you know, the criminal justice system, and when companies are deciding whether to ban somebody because there's been a hit on their face in the facial recognition system - ultimately have a human making a decision. You know, within police departments there are facial recognition analysts, and they say, yes, this person is the hit. And sometimes it's not even the first person - like, the way the programmes work, they'll list, you know, nine people, or, you know, over 100 photos, and the human ultimately decides which is the best match. And so I've been talking to researchers who say we need to better vet the people operating these systems, because, as you say, some people are really good at examining the face and others aren't. And so you should have the people that are kind of tested, who have kind of an innate ability to do this, be the ones that are working in concert with the system, because this is actually really hard. For most of us - think of it as people in a room instead of, you know, a bunch of photos on the screen: imagine being in a room with 20 people who look incredibly similar, and you're looking for, you know, just one person. That is so hard for the human brain to process; we're just not used to that. So they're actually, you know, still super important. Recently, I did a story about people who are blind, actually, who really would like this power - like, they need facial recognition; they would love to have an app like Clearview AI to be able to tell them who's around them. And so they're kind of frustrated that this technology is not more widely available to them. So there are a lot of questions. There are beneficial uses to facial recognition technology, which, you know, adds nuance and complexity to this debate.
Tracey Follows 30:43
So where do you think Clearview, and even the Silicon Valley tech companies, are versus China, for example, on this? Because I remember years ago I was looking at Face++ (this was quite a long time ago), and it was all out there. I went to their website and went through the drop-down menus, and not only were they offering face matching, face blending, face searching; further down it was skin status evaluation, beauty score. And that was the first time I started thinking, oh yes, there's a lot more that we could be doing here, and indeed they are doing it. I don't know where they've got to now, because that was probably about four or five years ago. Were you able to have a look at China? You must be reporting on this anyway in your day job. Are they more advanced, less advanced? Where are they on regulation and responsible AI versus the West?
Kashmir Hill 31:36
Yeah, so Russia and China are among the countries that are really far ahead on facial recognition technology. In Russia, in Moscow, they have rolled out facial recognition algorithms on surveillance cameras, so they can look for people in real time. I talked to a civil liberties group there that said they get complaints from people who are doppelgangers for wanted individuals, and they get pulled over or stopped all the time. They have to show their documents and say, no, I'm not that person, and it happens to them over and over and over again. There are reports that Russia used facial recognition technology to identify protesters who are against the war in Ukraine, and then gave them tickets and said they're guilty of unlawful assembly for gathering to protest the war. In China, my colleagues have actually reported on some of the surveillance companies there that have developed technology to send an alert when somebody who looks ethnically Uighur enters a public space, because there are a lot of security concerns in China about Uighur Muslims, which have really led to severe crackdowns on their human rights. And that's really troubling. China has also been using facial recognition technology in kind of absurd ways. They'll use it to keep out illegal tour guides. They've used it to police people wearing their pyjamas in public. There was a public restroom in Beijing where, to keep people from stealing toilet paper, you had to look at a facial recognition camera, and it would give you a certain amount of toilet paper, and then you couldn't get any more for another 10 minutes. I mean, this is what happens when you start rolling out this technology: you can find more and more uses for it, and it can become really ubiquitous in a way that I think is uncomfortable.
Tracey Follows 33:41
Yeah, I think so. Do you think during COVID, or the response to COVID more like, something changed about facial recognition? That in some ways it became, if maybe not more acceptable, just maybe more normalised somehow?
Kashmir Hill 33:58
I have to say, I mean, you started the interview by talking about when the original Clearview story came out in January 2020. Here in the United States, people were so upset. I was hearing there were going to be congressional hearings; a lot of attorneys general launched investigations into the use of the app. People were really incensed. It felt like people were calling for laws around facial recognition technology at the federal level. And then at the end of January 2020, Covid hit, and the conversation changed. I tried to kind of track the history of facial recognition technology in the book, and this just happens again and again: people get really worked up about their privacy, and then there's some event that makes them worry more about safety and security. During the pandemic, there was talk about basically tracking the faces of people who have Covid, trying to figure out who they came in contact with, so you can make a list of possible infections. So there really was this change where people became more comfortable with it, that it could be a touchless way to get access to somewhere, so you're not touching things with your hands. It's hard to say. And some people said, oh, we're wearing masks now, so we don't have to worry about facial recognition. Meanwhile, all the facial recognition companies started training their software to better recognise people wearing masks.
Tracey Follows 35:24
Yeah, I remember. What do you think about the age estimation technology that's obviously based on facial recognition? Here in the UK they're trialling it in supermarkets. I think it's a bit misunderstood at the minute, because people think it's identifying them and estimating them as a named person. Of course it's not; it's trawling through lots of facial recognition databases and estimating their age: are you over 18 to buy alcohol, etc., or 21 in the US, or whatever. What do you think about those sorts of uses? Do you think those are going to come in fairly soon? Well, maybe they already have, in some markets?
Kashmir Hill 36:03
Yeah. I mean, it's different in the UK, right? Here in the US, we don't really have that, as I understand it. Sometimes it's on vending machines; there are kind of vending machines for alcohol.
Tracey Follows 36:13
Yeah, like self-service checkouts in supermarkets and things.
Kashmir Hill 36:16
Self-service checkouts... Yeah, I mean, they really want to eliminate humans as much as possible, right? And if you're buying some wine or beer, that forces you to go interact with a human to verify your ID. I don't think the technology is quite as troubling when it's not about tracking you. But I do think you always have the slippage. Once you do start having the cameras there, maybe eventually it is about tracking individuals. Or instead of just estimating your age, it says, okay, scan your ID here so we can match your face to the photo on your ID, and while we're at it, we'll just keep this information from your ID. And now every time you come in, even if you pay in cash, we know your face print and your ID, and we can associate these purchases with you. That slippery slope, I think, is the most concerning thing, if we just start getting really used to interacting with cameras all of the time. But there are conveniences to this.
Tracey Follows 37:24
Well, this is it, isn't it? This is what makes it so complex, but also so interesting: this sort of conundrum. What are your feelings on digital identity? I know there are lots of different types, but are you concerned about digital identity? Or do you think, overall, it has advantages as we move into a sort of digitised world of commerce and everything?
Kashmir Hill 37:47
You know, I think it's mixed. There are so many benefits to the way that we live now: the access to information, our ability to connect with each other. But it's also frustrating sometimes how hard it is to control the data about yourself and what's available about you. Especially with facial recognition tools like this, it means that any photo or video of you that's ever gone on the internet would be findable, even if your name isn't attached to it. And I just find that kind of uncomfortable. On the convenience side, as we were talking about before: I went to London to do some research for the book, because there's so much happening with facial recognition technology there. I got to Heathrow and got off the plane and went to the little gate where you're entering the country. Usually when I go to a new country, there is this long line of people waiting to have their passports processed. But at Heathrow, I just took my passport, put it on the scanner, looked into a facial recognition camera, it made sure that my face print matched the biometric chip in my passport, and I just walked right in. And I just thought, wow, that's really nice. It's really convenient. And then I tried to contact the Home Office and find out: okay, you scanned my passport, you took a picture of my face, what do you do with that information? How long do you keep it? Do you delete it? Basically, how are you using it? Is it going into some database, and is it staying there forever?
Tracey Follows 39:29
What did they say?
Kashmir Hill 39:30
They wouldn't tell me.
Tracey Follows 39:31
(Gasp)
Kashmir Hill 39:32
And I tried many times. I just hate stuff like that, where your data's been collected and you don't know where it's going or how they're going to use it.
Tracey Follows 39:40
So where do you think this is all netting out, Kashmir? Where will we be in the next five to 10 years? I'm sure you go there in the book. Are you optimistic or pessimistic?
Kashmir Hill 39:51
So I loved looking at face recognition. It was so fascinating going into the history of it specifically, but it is an example of this larger trend we're dealing with right now: artificial intelligence, and all these AI companies that are gathering lots of data from the internet, often private data, and making tools with it. Clearview AI was making a facial recognition app; OpenAI is making ChatGPT and these image generators. Facial recognition technology has been one of the earliest AIs, really, and in the US we just haven't done much about it. People have talked about it over and over again, expressed concern and alarm, said we need laws to address this, and it just hasn't happened here. And that worries me, because with AI there are so many questions raised, and we're kind of relying on the benevolence of technology companies, hoping that they make good decisions. Part of why I wrote this book was to really focus in on a more radical actor like Clearview AI, which just decided to do what it wanted to do, which was kind of the most extreme version of this technology. And so I do wonder what's going to happen in other realms of AI, when the most radical actors are the ones that decide the way the tech is used.
Tracey Follows 41:19
See, I wonder if there's going to be a part of society that just decides they don't want any of this in the end, and actually opts out of digital as much as they can. I mean, it's very difficult now, because it's interwoven into the fabric of our society, but they try to opt out and actually get outside the system almost, because they just don't want any part of being surveilled. Do you think that's even possible in the future?
Kashmir Hill 41:43
It's hard, because there are kind of two different groups of people. But at least in my friend group, so many more people post privately now, as opposed to when the internet first came around, when everyone was just posting their photos publicly on Flickr and sending a link to friends to look at. Everyone was kind of okay with putting everything out there publicly. Facebook forced people to make their profiles public, and a lot of people kept it that way. And then I think harms have happened: people had photos come back to haunt them; people got access to their information who they didn't want to have it. These people who put their photos on Flickr, their photos got sucked up into databases that were used to train AI. So I think people are now making different decisions about what to put into the public commons, and that is only going to increase with these new tools like ChatGPT and Stable Diffusion that are just sucking up anything that's on the internet. I think people are going to be more careful about what they put there. The question is whether it's too late or not. I don't think it is; I don't think it's too late. But yeah, I think people are going to make maybe some different decisions about what they readily make available.
Tracey Follows 43:02
Yeah, exactly. So would you like to remind everybody about the book, where they can get it, and when it comes out? Because I know it's a terrific read. I think we've even got a different cover in the UK and Europe, haven't we? Tell us about that.
Kashmir Hill 43:17
So the title of the book is Your Face Belongs to Us. In the US, it's got a very pink cover, with two photos of faces that are super pixelated. And in the UK, my publisher, Simon & Schuster, went with this kind of cool reflective cover with, you know, a facial algorithm on it. And it's reflective, so that when you look at the book, you see your face staring back at you. It's very classy. I like it a lot.
Tracey Follows 43:49
I think it's such a good reminder, though, isn't it? That's a tool in action: the mirror looking back at you, reminding you how you are presenting yourself and what you're sharing. I think that's a brilliant, brilliant idea. Well, I'm sure everybody listening to this podcast is going to get it, because anybody listening to this is interested in identity in a digital world, which is absolutely what your book is about. And I can't thank you enough for coming on and chatting. I love all of your work. It's brilliant. And I know I'm going to love this book when I read it, because I've very much enjoyed talking to you. It's been a privilege to spend a bit of time with you. Thank you, Kashmir.
Kashmir Hill 44:26
Thank you so much for inviting me on, Tracey. This was lovely.
Tracey Follows 44:34
Thank you for listening to The Future of You, hosted by me, Tracey Follows. Check out the show notes for more info about the topics covered in this episode. Do like and subscribe wherever you listen to podcasts. And if you know someone you think will enjoy this episode, please do share it with them. Visit thefutureofyou.co.uk for more on the future of identity in a digital world, and futuremade.consulting for the future of everything else. The Future of You podcast is produced by Big Tent Media.