The Future of You

The Briefing: Brain data with special guest Dr Cody Rall #14

Tracey Follows Season 3 Episode 14

In this episode of The Future of You: The Briefing I’m joined by neurotech expert Dr Cody Rall to talk about Grimes getting a Neurosity Crown for her birthday. I also cover the latest news around motion prints and age verification, and speak with Dave Birch, author of The Currency Cold War, about the launch of FedNow, the new instant payment system being introduced in the US.

Episode 13 of The Future of You, ‘Brain Data and the Sovereign Self’ with Nita Farahany: https://pod.fo/e/1706b7
Jeff Delaney/Fireship, “I literally connected my brain to GPT-4 with JavaScript” on YouTube
Neurosity website
Neurosity on LinkedIn
AJ on Twitter @andrewjaykeller 

Cody Rall MD on YouTube 
Cody Rall on TikTok @codyrallmd


Business Insider, ‘Grimes said she got a brain gadget for her birthday from a company competing with Elon Musk’s Neuralink’


VentureBeat, ‘New research suggests that privacy in the metaverse might be impossible’ 


Guardian, ‘Utah bans under-18s from using social media unless parents consent’


Electronic Frontier Foundation, ‘Age Verification Mandates Would Undermine Anonymity Online’


Robin Tombs on LinkedIn


Computer Weekly, ‘UK Government Completes Trials of Age Estimation Technology’


Federal Reserve, ‘Federal Reserve Announces July Launch for the FedNow Service’


Tracey's book 'The Future of You: Can Your Identity Survive 21st Century Technology?' is available in the UK and US

  • 1 min 30: Audio from the @Fireship video on YouTube
  • 6 min: Conversation with neurotech expert and owner of the Tech for Psych YouTube channel, Dr Cody Rall
  • 12 min 30: Motion prints
  • 14 min 40: Social media age verification
  • 20 min: Conversation with payment and digital identity expert Dave Birch about FedNow

Tracey Follows  0:20  

In this episode, we're going to have a look at the news on identity. If you listened to the last episode, you'll know it included an interview with AJ, one of the co-founders of Neurosity, who make The Crown. That's the headset device that uses EEG to measure brain activity. Well, that episode was just about to go out when Grimes tweeted that she'd received a Crown headset of her own. "Getting a non-invasive brain computer interface for my birthday. Yay. It's a good time to be alive," she tweeted. Now, people are putting it to the test: as Business Insider reported, someone had tried to use it to drive a Tesla, and Grimes herself reportedly tweeted that she had used her Crown to try to move her computer mouse with her mind. I think she later deleted that tweet. But it has sparked the public's imagination about brain computer interfaces again.


Tracey Follows  1:15  

So much so that a YouTuber called Jeff Delaney, who's also a programmer and has a channel called Fireship, made a video where he connected his Neurosity Crown to GPT-4. And by all accounts, the orders for the Crown absolutely blew up as a result.


Audio from @Fireship video on YouTube  1:34  

"Today I'm going to put this weird thing on my head and communicate with GPT-4 using nothing but my thoughts. I was able to get my hands on this futuristic device called The Crown. A compact and stylish electroencephalogram that you could totally wear on a Tinder date without looking weird. It's the same basic idea as Elon's Neuralink, except you won't need to drill a hole into your skull. And best of all, it's got a JavaScript SDK that we can use to hack into our own brain waves. Naturally, my first thought was to connect my brain to GPT-4 to give myself superhuman intelligence, and that's what I did, which makes me the smartest man in the world. You're about to get a glimpse into the future, a transhuman, cyborg future where every human will, every rich human anyway will have all the knowledge in the world, not just at their fingertips, but at their brain tips. The poors will have these as well, but they'll be used by the CIA to make you dumber, like the moment you get thirsty, you'll be served an advertisement for overpriced sugar water that will slowly kill you. The Crown device has a price tag of about $1,000. It sits on the back of your head and has a bunch of tiny electrodes that measure your brainwaves. It hooks up to a mobile app via Bluetooth or Wi Fi then streams your brain's data into the ether. The app also provides helpful utilities like how to improve focus, but I don't care about that. I want access to the raw data, which we can easily get in JSON format via its JavaScript SDK Neurosity. The company that makes The Crown provides a dashboard where you can train algorithms to recognise your own custom thought patterns. Like currently, I'm training it to understand when I'm thinking about biting into a lemon. I just imagined biting into a sour ass lemon and then relax when it tells me to. After doing this about 30 times it will learn how to recognise that thought pattern. So currently, I am not thinking about biting into a lemon and the charts not moving. But now as I start thinking about biting into a lemon, the chart goes wild. That's pretty crazy. And it's accurate to the point of being somewhat frightening. It is somewhat time consuming, but you can train all kinds of different patterns. Like I also did right hand pinch and tone based on my wife's recommendations. 


Audio from @Fireship video on YouTube  3:23  

“Now that it knows how to understand my brain waves, let's write some code that can do something useful, like communicate with GPT-4. Here I have a basic Node.js project created with npm init. From there, I can install the Neurosity SDK, then in the code, I'll go ahead and import it. And the first step is to initialise it with your device ID. The device ID can be found in the mobile app. Next, we log in with our email and password, because a Crown can have multiple users. And now I can start reading my brain with JavaScript by simply calling Neurosity brainwaves raw, then subscribing to a stream of that data. Well, technically, it's an observable, because it uses RxJS under the hood. If we console log it, it spits out an overwhelming feed of data. It has a sampling rate of 256 hertz, which means we'll get 256 samples per second. They're batched into 16 samples that emit approximately every 62.5 milliseconds. In addition, the brainwaves are broken down into eight different channels, so you get these huge objects with tons of numbers that look like this. If you went to brain college, you might be able to take this data and analyse it yourself. But luckily, Neurosity provides a better way: it provides observables for specific states like calm and focus. That means we can subscribe to those mental states and then run code as a side effect. But the coolest thing is that we can also recognise those thought patterns that we trained earlier. Like if we want to listen to left hand pinch, we can call Neurosity kinesis followed by the name of the event, and it's this feature that allows me to turn myself into a cyborg. What I'm able to do now is install the OpenAI SDK, where I can authenticate and access GPT-4. The API is dead simple to use. All we need to do is call create chat completion, specify the model as GPT-4 and then send it an array of messages. The result is no different than what you would get from ChatGPT in the browser.”
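
For reference, here is a minimal sketch of the flow Jeff describes, written in JavaScript and assuming the Neurosity JavaScript SDK (@neurosity/sdk) and the OpenAI Node library of that era. It is not Fireship's actual code: the device ID, credentials, trained command name ("leftHandPinch") and prompt below are placeholders, and exact package versions may differ.

```javascript
// Minimal sketch (not Fireship's actual code): react to a trained
// "kinesis" thought pattern from the Neurosity Crown by sending a
// prompt to GPT-4. Device ID, credentials and prompt are placeholders.
import { Neurosity } from "@neurosity/sdk";
import { Configuration, OpenAIApi } from "openai";

const neurosity = new Neurosity({ deviceId: "YOUR_DEVICE_ID" });
const openai = new OpenAIApi(
  new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

async function main() {
  // A Crown can have multiple users, so we log in with email and password.
  await neurosity.login({
    email: process.env.NEUROSITY_EMAIL,
    password: process.env.NEUROSITY_PASSWORD,
  });

  // Raw brainwaves arrive as an RxJS observable: 8 channels sampled at
  // 256 Hz, batched into 16-sample chunks roughly every 62.5 ms.
  neurosity.brainwaves("raw").subscribe((brainwaves) => {
    console.log(brainwaves); // huge JSON objects, as described above
  });

  // Fire a GPT-4 request whenever the trained thought pattern is detected.
  neurosity.kinesis("leftHandPinch").subscribe(async () => {
    const response = await openai.createChatCompletion({
      model: "gpt-4",
      messages: [
        { role: "user", content: "Give me a plausible excuse for being late to stand-up." },
      ],
    });
    console.log(response.data.choices[0].message.content);
  });
}

main();
```

The key design point is that both the raw brainwave stream and the trained commands are exposed as observables, so the GPT-4 call is simply a side effect subscribed to one of those streams.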


Audio from @Fireship video on YouTube  5:01  

“Once I have that message, I can do whatever I want with it. Like, what I can do next is use a text-to-voice model to convert the text into voice as an audio file, 'This is your president speaking: subscribe to Fireship on YouTube before I drone strike your ass', and then transmit it directly to a Bluetooth earpiece. The implications of this are huge, because now when I show up late to my daily stand-up meeting, all I have to do is think about right hand pinch to ask ChatGPT to come up with an excuse for why I was late for work. Or if I'm taking a test like the bar exam, and there's a question I don't know the answer to, all I have to do is think about biting a lemon, which will transmit a signal to my camera-enabled glasses, which will then upload that image to GPT-4 and provide an answer that's guaranteed to get me in the top 80%. Or maybe I can become a super soldier, then stick out my tongue to blow up a village of innocent civilians. The possibilities are endless when you start thinking like a JavaScript programmer cyborg. Thanks for watching and I will see you in the next one.”


Tracey Follows  5:52  

I spoke to Dr Cody Rall, who has been investigating numerous different types of wearable neurotech on his YouTube channel Tech for Psych, to bring us up to date on how we can integrate neurotech, including with generative AI, not only into other platforms and devices, but into our own daily routines too.


Tracey Follows  6:14  

Now, Cody, you've really integrated these kinds of devices into your everyday routine, haven't you?


Dr Cody Rall  6:23  

More lately, I've been using the Mendi headband, which I could go into in more depth if you would like, but it actually uses a harmless red light to track the blood flow of your brain. And the company is working on some great research out of the University of Victoria right now about how the blood flow of your frontal lobe changes based on focus and cognitive fatigue. And you can use their neurofeedback training programme to enhance your focus. And I use that as a priming technique, actually, before my meditation sessions. So I find that my meditation sessions are much more productive if I've done 10 to 15 minutes of Mendi training before doing that. So I do a lot of neurofeedback. And that's because I'm a physician, a mental health physician, and I advocate meditation; that's a primary part of my focus. But I do do a lot of experiments for my YouTube videos, where I test reactions to light or focus during the day with these devices. And so there's a lot to it; it can definitely be overwhelming at times, all the different ways that you can integrate it into your life. But mostly for me right now, it's in the morning routine.


Tracey Follows  7:33  

We noticed over the last few days that Grimes has got herself a bit of kit. Tell us about that. 


Dr Cody Rall  7:40  

I was just talking to my wife about this. It's an interesting development where we might see neurotech become more trendy. One of the things I've been talking about on my channel is that one of the developments is this integration into headphones. And the idea is that people don't want to wear headbands around; they don't want to be perceived as odd in the workplace. But if you have devices that monitor your brainwaves through something as innocuous as headphones, that might lead to more adoption of these technologies, to get good feedback on your focus and your relaxation. But as different influential figures come in and start talking more about neurotech and showing things like The Crown on social media, I think people are getting more used to this, and I couldn't think of a better person than Grimes. It seems very on brand for her to be talking about the Neurosity Crown.


Tracey Follows  8:29  

What is the most surprising, or really out there, advancement or innovation you've seen in this field? 


Dr Cody Rall  8:39  

Well, the ones that have captured my imagination are these fMRI studies that have been done recently, where they can use generative AI to actually read your mind, in a way. They have their test subjects, of course, watch a couple of hours of YouTube videos and show them various images. But then they are able to feed that brain data into the machine learning algorithms, and then show them new images. And the AI is able to deconstruct, simply from their brain patterns, what they are seeing. And what's really exciting is, with this generative AI and those models, they can actually take that brain data and create new images right in front of you to show you what that person has seen, and also type out text of what podcasts people are listening to, with increasingly good accuracy. But that's on the high end, where you have these fMRI machines that cost several million dollars. And what I see is an integration and scaling down, where you have these technologies coming along that do track blood flow, similar to what fMRI does. It's mostly in the cortex; it doesn't penetrate as deep into the brain. But Kernel, Bryan Johnson's neuroscience company, is a good example of this, where we are getting increasingly good at tracking blood flow without the need for very, very expensive technologies. And I think that if you can integrate these AI and these generative AI capacities with the neuroscience that's coming out with wearables, that might not be as subtle as within a pair of headphones, but subtle and light enough to be sort of like a bike helmet, you really could be projecting your thoughts into images that would be comprehensible to the outside observer. And I don't see why that can't happen over the next couple of years. I mean, the science is there; we just need to integrate it into smaller, less expensive devices, and have demonstrations for the general public on what's possible, and then I think more money will flow into the space and it will create some pretty fantastic brain computer interface technologies.


Tracey Follows  10:42  

Do you see it as a very personalised set of products or devices? Or are you pro hive mind? Because this is one of the conversations I always, always have with people. Is this going to be very personal and personalised? Or is it going to be like a sharing of this creativity and this, for want of a better phrase, mind reading? What do you think?


Dr Cody Rall  11:06  

I think it has to be personalised to a certain degree, because what we are learning about the brain is that we all have different brains that have different signals. So I think that if you tried to just have products that were supposed to apply to everybody, they wouldn't work as well. What you need to do is actually have 3D-model-printed earbuds that fit in your ear well, so it gets a good signal. But then also, the software needs to be learning about your neural signals over the course of weeks, months, even years, to better understand you and your needs and how your brain signals present themselves. And with AI, we're increasingly able to do that: to customise the device to the individual person and give them individualised advice about how to manage their focus, get better sleep, and be more productive in their personal and work lives. So I think in that sense, they actually need to be very personalised. And then on the collaborative model, the big brain data will feed into the algorithms, and we'll understand more, on a large data scale, about how neuroscience presents itself at the population scale, and that will enhance the technology as well.


Tracey Follows  12:20  

Thanks to Cody, who you can find on his YouTube channel Tech for Psych, or indeed on TikTok. Links are in the show notes. And I'm sure we'll be back with Cody in the not too distant future, because everyday neurotech is really starting to take off now.


Tracey Follows  12:39  

Let's turn to motion prints. VentureBeat had an article, written by Louis Rosenberg of Unanimous AI, suggesting that privacy in the metaverse might be impossible. Citing a paper from the University of California, Berkeley, conducted at the Center for Responsible Decentralized Intelligence, they analysed user interactions in VR. Now, the study found that half of the users could be uniquely identified with only two seconds of motion data. So VR involves numerous bits of tech that monitor our facial features, vocal qualities, eye motions, and ambient information about our home or office environment. And many researchers, as the article points out, are even hypothesising that EEG sensors will be employed to detect brain activity too. Well, we covered this in the last episode, where I explored with Professor Farahany the likelihood of brain data becoming the most powerful biometric in the future. The point is that switching off these kinds of data streams, even then, might not provide anonymity, because simple motion data may be all that's needed to identify a user. As the piece points out, and I'm quoting now, "anytime a user puts on a mixed reality headset, grabs the two standard hand controllers and begins interacting in a virtual or augmented world, they are leaving behind a trail of digital fingerprints that can uniquely identify them."


Tracey Follows  14:11  

And that poses the conundrum that when, in VR, you do something like swing a virtual sabre at an object flying towards you, as the article posits, the motion data you create may be more uniquely identifiable than your actual real-world fingerprint. Hmm. These are the motion prints that shoppers might leave all over a virtual store. So there's clearly a lot more to investigate here, and I think, you know, we'll probably return to this in a future in-depth discussion episode.


Tracey Follows  14:43  

Now, social media age verification: that's also been a huge topic over the last month. The governor of Utah signed the US's most restrictive social media regulations, set to take effect on the first of March 2024, banning users under 18 years of age from opening a social media account without parental consent. Mandated age verification of existing account holders also applies. Now, age verification is an interesting issue. Yes, honestly, it is *laughs*. In March, the Electronic Frontier Foundation (EFF) published a piece describing age verification systems as surveillance systems, suggesting that it's going to lead to an internet where our private data is collected and sold by default, and that it will end anonymity on the web. They also point out that, of course, this doesn't just infringe young people's rights, but everyone's, given that everyone who wants access to online services will be asked to confirm their age. Then they go on to list multiple risks associated with verification methods.


Tracey Follows  15:52  

Yoti CEO Robin Tombs, I noticed, then took to LinkedIn to dispute these claims. Phone-owning individuals over 13 can use what Yoti calls their reusable digital ID app to share either their ID-doc-verified name, their ID-doc-verified date of birth, or an estimated over-18 age attribute, without businesses having to receive and store any passport, driving licence, national or state ID, or face details. Yoti claims to have pioneered privacy-preserving facial age estimation, which is by far the most popular choice for individuals across many countries to provide confidence to businesses that they're over 18 or over 13, they claim. And Robin then goes on to say that the EFF knows this: that Yoti age estimation detects a face but does not identify individuals, because it does not work by matching the face against databases of faces to try and get a match and then using the matched face's age, if known, to deliver an age result; and that Yoti deletes every image as soon as the age has been estimated.


Tracey Follows  17:04  

Now, I'm not sure how many people in the UK are really aware of how much age estimation like this is going on. I believe OnlyFans uses Yoti age estimation, and supermarkets in the UK are certainly looking at it, calling for digital age verification for alcohol sales, much like with other age-restricted products. And the UK Cinema Association, which includes the likes of Cineworld, Odeon and Showcase Cinemas, is also partnering with Yoti for age verification. This does, of course, mean that you will need your smartphone with your reusable digital ID when you go to the cinema. I'm not entirely up to date with where the UK Government is with its initiative to expand the use of digital identities in the UK economy, after it completed an initial trial phase which tested this age estimation technology and digital ID apps in a variety of retail environments in 2022. Led by the Home Office, there were nine trials that allowed supermarkets, bars and nightclubs to test out the tech and the take-up of digital ID across a variety of scenarios. According to Computer Weekly, four of the nine trials used age estimation technology developed by Yoti, whilst other trials included One Account, which trialled a digital identity app on mobile phones in the Camberley nightclub True; Fujitsu, which partnered with Nottingham Trent University to trial a mobile app for students using passport and biometric data; and MBJ Technology, which deployed a digital identity app in 13 night-time economy venues across Liverpool. Perhaps the jury's out on the results of those particular trials. But one thing's for sure: with the Online Harms Bill and the increasing clamour for more safety in online environments, age estimation technology, and potentially other solutions to age verification, are going to increase too.


Tracey Follows  19:08  

Finally, there's been a bit of chatter gathering about the launch of FedNow, the new instant payment infrastructure developed by the Federal Reserve that allows financial institutions of any size across the US to provide safe and efficient instant payment services. The first week of April saw the Fed begin the formal certification of participants, with an eye to launching in July. Businesses and individuals will be able to send and receive instant payments at any time of day, and recipients will have full access to funds immediately, giving them greater flexibility to manage their money but also make time-sensitive payments. However, I have noticed that quite a few people are now on the record suggesting that this is a precursor to CBDCs - Central Bank Digital Currencies. So I asked Dave Birch, payment and digital identity expert and author of The Currency Cold War, what FedNow is really all about.


Dave Birch  20:07  

So FedNow is the equivalent of Faster Payments. So in the UK, we've had this for God knows how many years, I can't even remember. So, you know, my son calls me up, 'can you lend us 100 quid until payday?', I just send it to him on my phone, it gets there instantly. We've forgotten that in the US they don't have this; they still write cheques and things like that. So the idea of an instant payments system is quite novel out there. So starting in July, there's going to be FedNow, which means you'll be able to send money instantly, essentially, from any bank account to any other bank account. And hopefully, they'll learn from some of our mistakes in that area, hopefully they will. Because, you know, we have massive authorised push payment fraud going on over Faster Payments at the moment, again because we don't have a digital identity infrastructure. So I hope they'll learn from that. But that's quite separate from the idea of a CBDC.


Tracey Follows  21:00  

People are worried. They're hearing that it's programmable, so they're worried that it's going to have confiscatory measures applied to it. You know, if you don't think or do or behave in the right way, it will be confiscatory.


Dave Birch  21:12  

But they can do that to bank accounts now. If you're worried that the government's going to come along and confiscate your bank account, right now you're perfectly free to draw out 50 pound notes and put them under the bed, you know, it's not a problem. The thing with all of these things is how you implement them. If you chose to implement a CBDC in a particular way which meant that all of my transactions were tracked and traced and monitored, I'd be against it. Do I want a CBDC that's completely anonymous? Of course not, that would be insane. I don't want to live in that society. That'd be crazy. So we've got to have society set the privacy dial in the right place. As technologists, we can implement whatever society chooses. You want completely anonymous? We can implement it. You want completely not anonymous? We can implement it. You want it somewhere in the middle, which is where it will end up? We can implement it.


Tracey Follows  22:09  

You can hear more from Dave in the next episode in two weeks' time, when I discuss in a lot more detail the technological solutions and the policy approaches, and basically give an all-round state of the nations on digital ID, looking at the UK, the US and Europe, with both Dave Birch and Cameron D'Ambrosi.


Tracey Follows  22:36  

Well, that's been The Future of You: The Briefing, where we've discussed neurotech and its integration with AI, including ChatGPT. We've discovered motion prints and how important they're going to be in virtual reality, and social media age verification, which is becoming a big issue over time. And we've also touched on real-time payments and FedNow. Until next time, on The Future of You, this has been The Briefing.


Tracey Follows  23:12  

Thank you for listening to The Future of You, hosted by me, Tracey Follows. Check out the show notes for more info about the topics covered in this episode. Do like and subscribe wherever you listen to podcasts. And if you know someone you think will enjoy this episode, please do share it with them. Visit thefutureofyou.co.uk for more on the future of identity in a digital world, and futuremade.consulting for the future of everything else. The Future of You podcast is produced by Big Tent Media.





