229 | Nita Farahany on Ethics, Law, and Neurotechnology

Every time our brain does some thinking, there are associated physical processes. In particular, electric currents and charged particles jump between neurons, creating associated electromagnetic fields. These fields can in principle be detected with proper technology, opening the possibility for reading your mind. That technology is currently primitive, but rapidly advancing, and it's not too early to start thinking about legal and ethical consequences when governments and corporations have access to your thoughts. Nita Farahany is a law professor and bioethicist who discusses these issues in her new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.


Support Mindscape on Patreon.

Nita Farahany received a J.D. and a Ph.D. in philosophy from Duke University. She is currently the Robinson O. Everett Distinguished Professor of Law & Philosophy at Duke, as well as Founding Director of the Duke Initiative for Science & Society. She has served on a number of government commissions, including the Presidential Commission for the Study of Bioethical Issues. She is a Fellow of the American Law Institute and of the American Association for the Advancement of Science, and was awarded the Duke Law School Distinguished Teaching Award.

0:00:00.0 Sean Carroll: Hello, everyone. Welcome to the Mindscape Podcast. I'm your host, Sean Carroll. You have the right to remain silent, we are told, here in the United States. This is part of the famous Miranda warning that is given to newly arrested suspects to remind them of some of their rights as enumerated in the US Constitution, especially the Fifth Amendment right against self-incrimination. And this is actually pretty common practice around the world, that suspects cannot be forced to testify against themselves. It's supposed to protect you against torture or forced confessions, anything like that. And there are other aspects of rights that we have in the US Constitution and elsewhere; for example, very famously in Roe v. Wade, the US Supreme Court said that there was implicitly a right to privacy in the US Constitution. Given all the enumerated rights, it was sensible to say that these added up to a right to privacy.

0:00:57.6 SC: Your opinions about that particular judgment may differ, but the idea that governments cannot simply see everything we're doing, read all of our emails, tap all of our phone conversations, et cetera, is pretty well entrenched, and for good reason. But what if the government could just read your mind? [chuckle] What then are the limits that protect our right to privacy if the government could somehow know what you are thinking? That might sound science fictiony, but it's less science fictiony than you might think, because the technology is advancing. Today's podcast is not going to be primarily about the technology, although we'll discuss that, it's going to be about the ethical and legal implications of neurotechnology, of the idea that we can have brain-scanning devices that tell a listener something about what a person is thinking at different levels of specificity, not even necessarily an implant, but in some cases, as we'll hear, earphones are being sold that you can use while gaming to help get signals directly from your brain without having to use your fingers, to use your mouse or your keyboard or anything like that.

0:02:16.4 SC: There are a lot of reasons why such technology might be very, very attractive to a lot of people. They're going to do it to themselves, and they're giving away information about their thoughts to a corporation or perhaps to the government. Today's guest is Nita Farahany, who is both a law professor and a bioethicist. And in her new book, The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, she's looking ahead to what are going to be the legal and ethical skirmishes over the collection and use of this information. As we've seen in much more benign circumstances, human beings in the modern age with social media, et cetera, are remarkably unconcerned about their privacy. They will give away a huge amount of data to corporations and to the government for a little bit of convenience. And the idea is that new technology is going to make this bargain even more attractive and even more pervasive.

0:03:15.2 SC: So what are we going to do about it? What are we going to do when, in principle, the government might be able to read your political inclinations, to know that you're planning a crime, to imagine or notice that you have hateful thoughts one way or the other? What about companies that want to hire you or fire you, or schools that want to give you grades or something like that on the basis of just scanning what's in your brain? It's a very different world coming up. Obviously right now, we can't do this with anything like this kind of precision, but it's not crazy to think that it'll be coming soon. And the time to think about it is now. So let's go.

[music]

0:04:10.4 SC: Nita Farahany, welcome to the Mindscape Podcast.

0:04:11.1 Nita Farahany: Thanks for having me.

0:04:13.9 SC: We're going to go a distance here through law and ethics and technology and all sorts of crazy futuristic things. So just to get people grounded right from the start, the problem seems to be that if governments or corporations could read our minds, that would lead to legal complications. Is that a fair way of getting at the point?

0:04:35.9 NF: I would say legal and social complications, really. So if governments and corporations can read our minds, and change our minds, not just read them, then we should be worried about what that means for the future of humanity.

0:04:49.8 SC: Change our minds. What are you referring to there?

0:04:53.3 NF: I mean the ability to manipulate and hack or even use weapons to disable minds on battlefields.

0:05:00.9 SC: Yep. That sounds important to understand. And, yeah, I guess I was going to ask this much later in the podcast. Let me just say it right now. The authors of the Constitution weren't worrying about this, right, in the United States?

0:05:14.7 NF: No, they were not. They were not. If you look back at the way laws, whether that's US-based laws or international human rights laws, were written, nobody really contemplated a world in which you could actually hack and track the human brain. Everybody assumed that it was your private diaries or your spoken words that we needed to be worried about, or your actions, not what you're thinking or feeling. And it just turns out that that world has arrived, and we need to really update how we think about laws and protections for humanity in order to ensure that it continues to be a world in which we want to live.

0:05:55.4 SC: Well, which raises the very basic question for people who would ever write laws or write the constitution. To what extent should we be enshrining the very basic principles versus attempting to foresee every possible application of them?

0:06:10.3 NF: Yeah, I think it's difficult, if not impossible, to foresee every possible application and misapplication of a technology. And oftentimes technology advances much more quickly than laws and regulations can. So the approach that I've advocated for is what I call a right to cognitive liberty, a broad update or understanding of liberty. And to do that at the international human rights level, to both create laws, but also norms about the right to self-determination over our brains and mental experiences. And I think if we start there, it changes the default rules. Our default rule is protection over our right to self-determination, over our brains and mental experiences, which means laws have to be created, or policies have to be created as exceptions to that for corporations to gain access to our brain data or for governments to gain access to it and to manipulate it, for that matter.

0:07:05.9 NF: So I think it changes the starting place of the conversation to start there. And then we're going to need context-specific laws and regulations over time. How are these devices used in employment settings? How are these devices used by governments, whether it's in the military or even for things like trying to use biometrics from the brain to authenticate people or using the technology for interrogating criminals, to try to figure out what they're concealing in their minds. But I think we have to start by changing the default rules in favor of individuals.

0:07:41.0 SC: Good. So let's step back then and consider the technological landscape here. Even before getting to the more science fictiony things, there are ways in which we have the technology to see or perceive what is in people's brains right now. I mean, at the dumbest level, we've invented a whole bunch of ways where people can just spew their unfettered thoughts out onto the internet in social media, which sort of gives people a lot of information already.

0:08:11.4 NF: That's true. And if you spend a lot of time on social media, you might think that those are unfiltered thoughts and that they just come straight from people's brains rather than some kind of filter gateway through which they make decisions about what to post. But in actuality, what's happening is people are choosing to write particular things and to share particular things and to search in search engines for particular words and phrases. And there are a lot of choices that we're making, and those choices that we're making are being used to make pretty precise profiles of who we are and how we think and how we feel.

0:08:43.7 NF: But they still don't go as far as actually decoding what's in your brain. There's still a space in which people have private contemplation. They have the ability to think thoughts that they never share with anyone, to have thoughts pop into their head about a fantasy that they might have, a new idea that they want to turn over in their mind before thinking about sharing it with other people, or even to work out their own self-identity, working on their own biases, on their developing understanding of who they are.

0:09:17.9 NF: And those things hopefully don't show up on social media, although sometimes they do. That's sometimes how people work things out. But there's a difference between what you choose to share and having information that you have not chosen to share be used. And there are ways that that's happening already. Facial recognition software can already pick up micro facial changes, and a precise dossier can be put together that triangulates your GPS location and your financial data and your healthcare information and a whole bunch of other pieces to give a pretty good and accurate insight into a person. But it's still not the same as what's in your mind. And so I believe that there is this last bastion of freedom or privacy that we have, and it's not just the last one that will fall, it is, I think, the most important one that never should.

0:10:10.6 SC: Good. I do remember, this is only slightly related, but a quantum information theorist who I know said he once figured out a way that you could have a search engine such that it couldn't figure out who was doing the searching. There was a quantum privacy protocol, and he pitched it to the people at Google, and they got very excited, but then they came back the next day and they said, "Wait a minute, this is exactly the opposite of our business model, so we do not want to let you do that actually." But about the micro expressions and lie detectors and things like that, I know there was a TV show, Lie to Me, with Tim Roth in it, that was a lot of fun, but my impression was the science there was a little bogus. I mean, what is the state of the art in that kind of non-invasive, you don't even know your thoughts are being detected kind of technology?

0:10:58.8 NF: Yeah. I think the science there is still somewhat bogus, as you just said, right? Which is, there are some things that we can tell, and there are some patterns of facial activity that give you some insights, at least in the aggregate, across the population. But is that accurate for any given person? What if I've had Botox and you try to read my micro facial changes, how accurate is it then, right? Or what if I'm trying to keep my face really still, or if I'm just different in how I express my face, I have a really good poker face and somebody else doesn't have as good of a poker face? So these are still inferences.

0:11:40.1 NF: And the truth is, when you're reading brain activity, there are also inferences based on patterns and activity, right? It's not literally that you are picking up thoughts in whole form. You're taking some biological measure, which is brain states and brain activity, changes in electrical activity, and using algorithms that are then trying to decode what that means. But the bigger the data sets of human brain activity are, the better and more precise it becomes. And with tools like generative AI that can be trained on any particular person's brain activity, it gets more and more precise over time at decoding what's actually in a particular person's brain.
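[To make the decoding idea concrete, here is a minimal sketch, in Python, of the kind of pattern-learning pipeline being described: spectral features are extracted from labeled EEG epochs and a classifier learns which patterns go with which mental state. The channel count, sampling rate, frequency bands, and random stand-in data are all hypothetical choices for illustration, not any vendor's actual system.]

```python
# Minimal sketch: learn to decode a labeled mental state from EEG-like data.
# All parameters and the data itself are hypothetical toy values.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

FS = 256  # assumed sampling rate in Hz, typical for consumer EEG
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(epoch):
    """epoch: (n_channels, n_samples) array -> log band-power feature vector."""
    feats = []
    for ch in epoch:
        freqs, psd = welch(ch, fs=FS, nperseg=FS)  # power spectral density
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[mask].sum())          # total power in the band
    return np.log(np.array(feats) + 1e-12)         # log-power is conventional

# Random noise standing in for "what the brain looked like when the person
# was looking at X": 200 epochs, 4 channels, 2 seconds each, binary labels.
rng = np.random.default_rng(0)
epochs = rng.standard_normal((200, 4, 2 * FS))
labels = rng.integers(0, 2, size=200)              # e.g., face vs. word

X = np.array([band_powers(e) for e in epochs])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
# On pure noise this just overfits; on real EEG the learned weights pick up
# genuine state-dependent spectral patterns.
print("training accuracy:", clf.score(X, labels))
```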

0:12:21.5 SC: So we have... We actually go out and spend money to buy devices that monitor our location and our physiological state, right, exercise bands and things like that. And presumably that is actually right now being used to build up precise profiles. You're the law expert here. Do the companies who gather all that data and collate it, do they promise not to keep track of which individual it is, or they're very interested in keeping track of which individual it is?

0:12:52.4 NF: I think they're very interested in keeping track of which individual it is. But more importantly, I think it is trying to figure out... So it may not be that efficient to market to a particular person, customized down to where you have one particular person that you're marketing to. But the more that you can segment the population, right? This is a person who lives in the following demographics, who has the following kinds of jobs, who has the following kinds of purchasing behavior, who has X number of kids of the following ages and spends this much time on social media. The more carefully you can segment the population, the more precisely you can target advertisements to it. And so that is what most big tech corporations have built their models on: not only aggregating data, but then selling to advertisers a promise of real precision in the targeting of advertisements.

0:13:50.3 NF: It's not very effective to send weight loss targeted advertisements to somebody who's trying to gain weight. You really need to find somebody who is trying to lose weight in order to send those types of advertisements. So the better you can target the advertisements, presumably, the better conversion you have to actually purchasing things. And so the question is, what does brain data add? Does it add something new? And there are a whole bunch of companies called neuromarketing companies and even major corporations now who have neuromarketing divisions. And these are divisions that recognize that what consumers say about what they're likely to do in response to advertisements or pricing of products, or what a product looks like, the trade dress of it, is not as accurate as what their brains say when shown those different things. And so self-reporting has long been a problem for marketers.

0:14:44.0 NF: And so if you can base it on behavioral data instead, and particularly on brain-based behavior, does the brain show engagement or fear or excitement or is it... Does a person pay attention actually throughout the entire advertisement that you play, or does their mind start to wander halfway through? So those are the kinds of insights that neuromarketing companies have been offering now to corporations for a long time by using neurotechnology, simple consumer-based wearable devices where while people are watching advertisements or seeing products or seeing different pricing of products, the companies are measuring how their brain reacts to those products, then using that to create insights about the next marketing campaign.

0:15:33.9 SC: But this is more like a focus group. This is not something they're able to do to every person who walks by a billboard or so forth.

0:15:41.3 NF: That's right. Because brain wearables are not ubiquitous, it's people who have opted into wearing them. And you don't have to do this in laboratories anymore. People have consumer-based devices at home and they can go onto these websites that will pay them to have their brainwaves monitored while they look at different advertisements, but they've opted in. And they've opted into sharing that data. They're getting paid to share that data. Neuromarketing firms are building it based on that data. They may not need to do so in a world in which there are wide-scale brain wearables in our everyday devices. So, we're wearing headphones right now. This coming spring, one of the major neuromarketing companies has partnered with one of the major headphone manufacturers to take the soft cups around our ears and fill those full of brain sensors that can pick up the electrical activity in the brain.

0:16:37.0 NF: And so, while you're recording a podcast or while you're taking a conference call or listening to music, your brain activity can be monitored at the same time. And somebody might say, "Well, why would anybody choose that headset instead of one that doesn't have those brain sensors?" And I think it's because of the promise that they will offer the individual, which is that they can interact with the rest of their technology seamlessly. They can track their focus and attention. They can track their brain health, just like they track their heart rate. They can eventually do things like replace their mouse or their keyboard and be able to think about moving the mouse around the screen in a much more seamless way, or be able to communicate much more closely at the rate of thought, rather than at typing speed on either a keyboard or, worse, with two thumbs on a smartphone; you could just think about it instead.

0:17:27.0 NF: And so, as those devices come to the market and we're all wearing these, and you're surfing social media and your brainwave activity is being recorded by the headset that you're wearing, the question is, are neuromarketers really going to need to continue to have these focus groups, or are companies going to have unfettered access to how your brain reacts? Whether they give you A/B testing, where person A sees their feed in the following way and person B sees their feed in that way: let's look at what their brain activity looks like in group A versus group B.

0:18:02.0 SC: Yeah. Wow. So I do want to ask a little bit more about the actual technology here. You talked about measuring brain activity, measuring brain waves. Do we know what it is we really want to measure and do we have good ways of doing it?

0:18:18.8 NF: So it really just depends on what you are trying to measure for. Even the most sophisticated consumer neurotechnology device can't literally read your mind. There aren't enough electrodes, and they're not deeply placed inside the brain. These devices use electroencephalography, which measures electrical activity in your brain: as you think, or do a math calculation, or anything you do, neurons are firing in your brain, which gives off tiny electrical discharges. And when hundreds of thousands of neurons are firing at the same time while you're doing something, those create patterns that can be decoded. So this is what it looks like when you're thinking about a word. This is what it looks like when you're thinking about a face. And then it can get much more precise from there, to kind of recreate an image that you're looking at.

0:19:07.3 NF: But unless you have a broad array of electrodes implanted deeply inside the brain, how much information you're getting out of the brain is not that much, and you have interference. And clinical-grade EEG, people can envision this: there's a fitted cap that you wear that has 128 electrodes on it, and each one has to be applied to the scalp precisely with gel in order to get good conductance. That's not what we're talking about. We're talking about earphones or earbuds or tiny tattoos worn behind your ear that have somewhere between 2 and maybe 30 electrodes, and they are making dry contact with the skin. They have good applications to support making sure that they're in the right place, but everybody might have them adjusted slightly differently from one another, so exactly what it's picking up is going to vary a little bit from person to person.

0:20:01.8 NF: But despite that, then the question is what can you pick up? Depending on where the electrode is attached, you're picking up electrical activity from the brain, from these firings of neurons, that rises to the level of the scalp and can be detected, and then you're usually averaging it over the number of electrodes that you have. And you can measure things already. Companies have developed the ability to track things like attention or focus or mind wandering. Basic emotional states, happy, sad. Even some things like numbers or shapes that you're seeing or imagining can be reconstructed. Some studies have shown that with EEG, as a person is looking at a face, it's possible to reconstruct the face that they're seeing.

0:20:52.6 NF: And so the precision is increasing every day, even with consumer-based electrodes, and basic health patterns can also be detected. An epileptic may have a signature change in the electrical activity in their brain an hour before they have an epileptic seizure. When a person develops a horrible brain tumor called a glioblastoma, the earliest stages of that show tiny changes in electrical activity that continuous use of a headset like these could pick up. So there's a lot that can be picked up. And in my book, The Battle for Your Brain, I go through so many examples of what's already happening, as well as the research using consumer neurotechnology. I'm not taking from the most sophisticated implanted electrodes to say that's what we can do in today's world. Instead, I'm drawing from study after study after study to show what's actually happening today, what can be decoded from the brain right now. And it's both terrific and terrifying how much can already be decoded from the human brain using simple surface-based electrodes.
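[As an illustration of the "attention or focus" tracking mentioned above: one heuristic that appears in the EEG literature is the engagement index beta / (alpha + theta). The sketch below, under the same hypothetical assumptions as before (sampling rate, synthetic single-channel signal), scores sliding two-second windows; commercial products' actual metrics are proprietary and may well differ.]

```python
# Sketch: score "engagement" over time from one EEG channel using the
# beta / (alpha + theta) index. The signal here is synthetic noise.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def engagement_index(window, fs=FS):
    """window: 1-D array of EEG samples -> scalar engagement score."""
    freqs, psd = welch(window, fs=fs, nperseg=fs)

    def band_power(lo, hi):
        m = (freqs >= lo) & (freqs < hi)
        return psd[m].sum()

    theta = band_power(4, 8)
    alpha = band_power(8, 13)
    beta = band_power(13, 30)
    return beta / (alpha + theta)

signal = np.random.default_rng(1).standard_normal(60 * FS)  # 60 s of "EEG"
# Score each 2-second window, sliding forward 1 second at a time.
scores = [engagement_index(signal[t : t + 2 * FS])
          for t in range(0, len(signal) - 2 * FS + 1, FS)]
print(f"windows scored: {len(scores)}, mean index: {np.mean(scores):.3f}")
```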

0:21:58.1 SC: It does seem like very fertile ground for exactly the kind of machine learning, deep learning techniques that help computers distinguish pictures of cats from pictures of dogs. Like you can get a set of neuro... Sorry, what are the connections?

0:22:18.1 NF: Yeah. The neurons, sorry?

0:22:19.3 SC: No. What are the electrodes, is it a number of electrodes that we're counting here? Yeah.

0:22:22.9 NF: Yeah. So it's electrodes on the surface of the scalp that we're using.

0:22:27.3 SC: Yeah, so the electrodes are getting some signal, and the human being doesn't know what it is, but if you just train it on a big enough data set, the computer can figure out exactly what it is, yeah.

0:22:36.6 NF: Yeah, that's part of why there have been such tremendous advances in consumer neurotechnology recently. It's really both the miniaturization and improvement of the electrodes and the sensors themselves, but also, and incredibly importantly, the advances in computational capabilities from algorithms and machine learning, which have made seismic advances. And the more brain data that we have while people are engaged in their everyday activities, you have that, right? So you say, okay, algorithm, here's what the brain activity looks like. This is what the person was looking at, or this is what the person was listening to, or this is what the person was thinking of typing. Those are huge patterns of data that machine learning algorithms can be trained on. And the extraordinary advances that have been made in AI and continue to be made in AI, I mean, generative AI is going to lead to another huge leap in this technology, mean that suddenly what seemed like science fiction a few years ago is science reality today, in terms of how much we can decode from the human brain.

0:23:40.0 SC: Do we have anything like a good picture of which kinds of thoughts can be read? Are there some kinds of thoughts or emotions that are going to be invisible to these electrodes no matter how hard we look?

0:23:54.9 NF: So, I think if we're talking about surface-based electrodes that are few in number, the likelihood of ever decoding unintentionally communicated complex thought is low. And so by that I mean you're sitting there and you're literally having an inner dialogue that you never intend to type or communicate, that really detailed inner dialogue, I don't see surface-based electrodes ever decoding. Now, check back with me in five years.

0:24:28.9 SC: Okay, sure. I get it.

0:24:29.9 NF: Maybe I'll eat my words, right?

0:24:32.6 SC: Yep.

0:24:32.9 NF: But it's just, it's hard to imagine getting good enough signal from deep enough in the brain, and distributed across the brain, to decode at that level of accuracy and precision. That being said, when people say, can these read my mind, I always say, well, what does reading your mind mean to you? You walk into your friend's house and they have a really hideous mustard-yellow new couch that they spent a lot of money to buy, and they say, do you like it? And your face, you keep as a poker face, and you're thinking, oh, my gosh, I hate that. And you're feeling disgusted. That disgust can be picked up from brain activity. And so your true sentiment can be decoded even if your precise words or your precise thoughts can't be. And so I think it just depends on what level of specificity we're talking about for whether or not we can read your mind. We can pick up a lot of your sentiments and then use other things about you to have a much, much more precise picture of what you're thinking or feeling, but I don't see literal thought bubbles above your head decoding the full inner dialogue with precision with consumer-based neurotechnology.

0:25:50.1 NF: And there are other consumer-based neurotechnologies that could hit the market within a few years. Bryan Johnson, for example, has been developing what is essentially a helmet that uses functional near-infrared spectroscopy. This uses beams of near-infrared light that bounce off the tissue in your brain, for lack of better specificity in how to describe it. That goes deeper into your brain and can decode a lot more, and it's meant to be a consumer wearable device too. And so I'm talking about the kind of existing technology we have today, but it's rapidly improving every day, and the modalities of reading brain activity are increasing every day.

0:26:30.4 SC: It sounds like there's almost a fortunate reality to the metaphor of like deeper thoughts being literally deeper in the brain that will make them harder to pick up with these technologies.

0:26:42.6 NF: Well, at least with EEG; none of that holds with near-infrared, right? So, if the bike helmet comes to market, all bets are off. And so, yeah, I think with surface-based EEG, your headphones and your earbuds, deeper thoughts may be protected until we have deeper-peering technology.

0:27:02.9 SC: I think, I don't want to give away too much of my own inner mental life, but I'm sure that all of us have thoughts that are not the thoughts we want to share with the world.

0:27:12.7 NF: Yeah, that's right. And we filter them, right? And you have a lot of thoughts that pop up and you just think like, oh.

0:27:18.6 SC: That was a terrible thought. I didn't want to have that.

0:27:18.7 NF: Yeah, that was a terrible thought. Like, I'm a horrible person. Of course, I'm not going to express that, right? And so you're sometimes actively at war with yourself, trying to be the best version of yourself, constantly editing to try to rise to the level and to the occasion, and to really be who you want to be in the world. And it's unclear how much of that filtering is going to be picked up by brain activity versus the base-level "I'm a horrible person" thought that just popped into my head. And so even your ability to cultivate your own personality and your identity and choose what you project to other people and choose to be the best version of yourself, if you're judged on your brain activity instead of judged based on your actions and your words and what you choose to express, it could really go sideways pretty quickly.

0:28:12.7 SC: It's a completely different human experience. We're not trained for this.

0:28:16.0 NF: It is, that's right. Yeah, I think what we're talking about is a transformation of what it means to be human and to interact with other people and to interact with the environment around us. And I just think people aren't really aware of what a seismic shift is already underway in what it means to be human. And they need to join the conversation. They need to learn and get educated about what's happening, particularly in order to have a voice and a say in how this all goes and what happens next. We're still at that moment where we can make those choices. The technology's here, there are already millions of these headsets worldwide, but it's not part of people's everyday lives yet. And it's just now launching into multifunctional devices like watches and headphones and earbuds, devices we already use, rather than new devices that are unlikely to take off because they're kind of niche applications.

0:29:16.0 NF: As that starts to take place, it will relatively quickly become too late to have the conversation. And so I feel like we're fortunate in the moment we have. I remember when I read Shoshana Zuboff's The Age of Surveillance Capitalism, I thought, well, okay, this is really sad, but it's just too late. It's already happened, right? It's a really sad historical fact now that we've all ended up in. We stand in a different place now. We stand at the forefront of technology that's going to transform the experience of what it means to be human, but it hasn't yet; we don't have the age of brain surveillance capitalism yet. What we have is the opportunity to put into place the right to cognitive liberty before that happens, to shift and flip the narrative and how this all goes from here.

0:30:09.4 SC: Say a little bit more... You already said something about what an individual's motivation might be for giving the rest of the world access to their thoughts. Gaming, I guess, is an obvious one, but I'll let you tell us what the main ones are.

0:30:26.7 NF: Well, let me first start by saying, I think people unwittingly give access to information without often thinking about it. And so I'll give one quick anecdote of that, which is IKEA. A few years ago, they decided to do a campaign where they were selling these limited-edition rugs. They had partnered with famous artists and they wanted to bring art to everyone. And so the idea was to create these limited editions of rugs that were based on art created in partnership with these artists, and then sell them at IKEA-level pricing to make it available to the average consumer. But what they found was that instead of democratizing art, people were coming in and buying the rugs, and then reselling them for much higher prices on eBay, right? And so they were bummed about this, and they wanted to figure out a way around it.

0:31:16.1 NF: So they launched this campaign where in their Brussels store, people would come into the store, they could look at the rugs, but they had to put on an EEG headset, a consumer EEG headset, and only if their brains registered love for the rugs, could they buy them. And everybody who came in, like happily put on one of these little headsets and looked at the rugs, and they have a little campaign commercial that they created on this, where they say everybody loved it. Okay, well, meaning everybody loved the experiment, not necessarily the rugs. And I look at that and I think, I get it. People are going into museums and they're putting on EEG headsets already in order to have their brain activity decoded as to whether or not they love particular art or not and then to have more art served up according to what their brain loves.

0:32:04.8 NF: So the fact that people are doing this already without thinking or questioning what is happening to my brain data, and what else are they picking up, means first they're not aware of what they're actually signing up for. And second, they're not thinking about the broader implications of what could it mean for this to be widespread and adopted in that way. It's already in classrooms, students are already using it for focus and concentration. Parents are already using it with their children to help them with their focus and concentration and neurofeedback and meditation. So I think part of it is that people adopt technology without recognizing or thinking about the broader implications of the technology.

0:32:41.9 NF: What are some of the applications people are adopting it for? One is gaming. Two, neural sensors, or brain sensors, are being embedded into a lot of virtual reality headsets, and I see the next generation entering into VR where the interface, the way that they move around VR, is seamless because it's picking up brain activity and the intention to move, without thinking about, oh, we've just crossed this final frontier of privacy. I think if you have the option of a smartwatch and a smartwatch with a heart sensor, many people choose the smartwatch with the heart sensor, because it gives them the benefit of tracking their health and their cardiac fitness and things like that. I think they're going to do the same thing with brain activity, where in order to track their brain health over time, it's just an add-on feature, right?

0:33:28.6 NF: It's like you can have the headphones and the earbuds, but then you can also track your performance and your focus over the course of the day and how stressed you are, and whether your brain health is in trouble, whether you have signs of cognitive decline and there are brain training games to improve it, to improve memory. People are fascinated with the ability to engage in cognitive enhancements and cognitive improvement, and these promise much more precise ways to do so. So I think both unwittingly and wittingly, people will adopt the technology. And then the ability to do things like replace your keyboard by typing, by thinking about typing, instead of speech to text, you have brain to text.

0:34:09.8 NF: And instead of typing with two thumbs while you're driving, which is incredibly unsafe, people think about sending a text message and they send it off while they're driving. I think the possibilities are really endless, and that the combination of not thinking about the implications and also being excited about the novelty and the possibility of opening up the black box of the brain from oneself, there will be widespread adoption of the technology.

0:34:42.1 SC: Well, as someone who's working very hard to finish a book draft, the idea that I wouldn't have to type, I could just sit in front of my computer and think about the sentences and the equations, that's pretty compelling. I kind of like that idea.

0:34:53.0 NF: Yeah, there were times, as I was writing this book, The Battle for Your Brain, where it'd be late at night and I had been working on a particular chapter and I just couldn't get the words out or couldn't get the ideas out, and then there I am lying in bed, I don't feel like getting up and going and typing on my computer, but it comes out in full sentences in my brain. And if I could just launch an app on my phone and translate that and capture it. Or you know how you wake up in the middle of the night and you're like, oh, I should keep a journal by my bed so I can write down this thought or something, or...

0:35:26.5 SC: But I don't.

0:35:26.6 NF: Or even capture what my dream was. Just imagine flipping open your phone and just being like, okay, brain to text, and you get it down and go back to sleep. I think people will be excited about that.

0:35:39.6 SC: And it might also be, I haven't thought about this very much, but it might also be the extra convenience that really lets the metaverse virtual reality kind of thing take off. If you can just sort of see yourself walking around in a completely different environment, interacting with people who are thousands of miles away without awkward mouse and keyboard kind of things, then all these dreams about the metaverse might become a lot more realizable.

0:36:04.7 NF: Well, I would say Mark Zuckerberg certainly thinks so, right. Meta acquired CTRL-Labs, which was one of the leading neurotechnology companies developing EMG, electromyography, which picks up activity as it travels from your brain down your arm to your wrist, signaling the intention to move or the intention to type. And they've invested billions of dollars in bringing the metaverse to reality, but also in taking that CTRL-Labs device and turning it into the neural interface to enable you to interact in virtual reality. The idea would be that you don't have these awkward and silly joysticks that you're using to navigate around; you literally are thinking about pointing and typing, or you're just moving your hand as you're in virtual reality, and it's taking that and decoding it into the actual movement in virtual reality.

0:36:55.2 NF: So I do think that that kind of technology as well as the neural sensors embedded into the headsets themselves will be transformational for VR. I also think that that will lead to normalization in the next generation of the technology, because the younger generation who will be the first to really grow up in the metaverse, if the metaverse comes to fruition, if it's built into their experience, it will just be normalized. Just like children today think that it is just normal for everybody to have a smartphone and to be navigating on it and learn how to swipe by the time they're six months old. Like a neural interface will just be part of how they interact with all of the rest of the technology and the environment.

0:37:38.9 SC: Well, and certainly it seems from our experience with various things that track our location, our preferences over the past couple of decades, that people are pretty darn willing to trade a lot of privacy for a little convenience. Is that just my impression, or is that [0:37:56.2] ____.

0:37:56.3 NF: Yeah, I think that they're willing to trade just about all of their privacy for their convenience. But it's interesting, because when I talk to people about brain sensors, it touches a nerve for people, so to speak, in a way that it just doesn't seem to in other spaces. Think of how freely and easily people give up their other forms of privacy. Yet when you suggest to them that what they're thinking or feeling can be decoded and used, it terrifies most people. Now, will that stay with them, or will it become just as normalized as carrying around a smartphone that has your GPS location? Some of my students at Duke have hundreds of friends in their friend-sharing circle that they share their GPS location with at all times, which is amazing to me, that you would just have a huge number of people who can track your every movement.

0:38:51.7 SC: It sounds horrible.

0:38:52.3 NF: And the fact that... It's weird. It's different, it's not how I would want to live my life, but they're very comfortable doing so. I could imagine something similar happening already. There are forums online where people are sharing their brain data with one another and comparing it and commenting on each other's brain data. If they're meditating and they're tracking brainwave data, they're commenting and posting what it looks like for a previous session and peering into it and sharing with one another. I think we're in for a whole new world with the sharing of brainwave data.

0:39:30.2 SC: Yeah. Okay. Good. And so let's get into the dark underbelly of this, or at least the set of things we should worry about, 'cause I think it goes without saying we should worry about them now rather than once it's too late, like the surveillance capitalism. So if you tell somebody that they can fly around in the metaverse without even twitching muscles, just by thinking about it, it sounds all fun and they're ready to do that. But then you tell them that literally all of their thoughts are being downloaded to Mark Zuckerberg's hard drive, and that sounds scarier. Now that's an exaggeration, but it's plausible; that's the direction in which they're moving.

0:40:05.8 NF: Yeah, it is the direction in which they're moving, right? They're moving into a direction in which the major tech companies who already have all of their data are also going to be the ones who are selling the same devices. And so, not only will Google have your search history, but also your brainwave data. That's extraordinary, and it's something we should worry about, because the commodification of brainwave data, it's not just about sending you very precise advertisements, that same data is sold to employers who can use it to make hiring and firing and discriminatory choices about consumers. It can be used to identify what you think neuroatypicality is and precisely discriminate against those people. It can be used by oppressive governments to test for things like political ideology and make choices about how they're going to oppress or repress demonstrations or the likelihood of people rising up.

0:41:08.4 NF: It can be used to interfere with people's brain activity, whether it's through marketing or trying to addict people to technology or, more frightening, using it to try to disable people's brains on the battlefield. Cognitive warfare is something that China has recently declared as the kind of next major battleground for humanity, and they're investing massive dollars into trying to develop the technology to figure out how to do that. Governments are already interrogating brains by bringing in criminal suspects and using their brain-based responses to images or sounds as a form of interrogation. I find all of this incredibly frightening.

0:41:52.4 NF: There is no safe space for people to think any thoughts, let alone dissident thoughts or divergent or kind of different thinking. And I think a lot of people are talking about the kind of chilling of freedom of speech and diversity of ideas, and that it's difficult to really have civil discourse anymore. I worry what that looks like when even children are afraid to think differently, and what that ultimately means for truth and for relationships and for misinformation and disinformation campaigns.

0:42:29.5 SC: I guess I have one more technology question that came up as you were saying this. You alluded to it before, but I didn't seize on it, which is, I guess, the cognitive warfare question, or, more broadly and more positively, enhancing the brain. So not just passively reading, but actively going in there and improving your brain, or crippling it. How worried should we be about that, or how optimistic should we be about it if we want to improve our SAT scores?

0:42:56.8 NF: So, those are good questions. I'd say the promise of enhancing the brain is huge. And one of the things that I talk about in The Battle for Your Brain isn't just neurotechnology, it's all of the drugs and devices and other ways in which people are able to enhance their brains, and already are. I raise the kind of complicated question with all of that here at Duke. Cognitive enhancers, if they're taken without a prescription, were made part of the cheating policy at Duke, rather than the drug policy, which is an interesting normative moral choice, right? It is to say that using a drug that enhances your performance is cheating. And they didn't really consult with any of the bioethicists on campus when they made that choice. But there was a call by students who felt that some students were getting access, and that there were distributional and equality issues about who had access to the drugs and who didn't, and that it gave some people an unfair advantage.

0:43:53.3 NF: We're seeing the same kind of conversations, I think, right now around generative AI, like ChatGPT, where people are worried both that the use of those technologies will not be distributed equally, and also that it's cheating to use them. I think it's a good and hard question whether or not using enhancing technologies or enhancing drugs is cheating, but I generally think that that's kind of what we're trying to do as human beings. We're trying to enhance ourselves. We're trying to create every possible way in which we can enhance our brains and our cognitive experiences and our affective experiences.

0:44:29.1 NF: And so I think part of what I'm advocating for as cognitive liberty is the right to self-determination, which includes the right to enhance or the right to break. For example, if you want to erase painful memories or you want to treat intractable depression, and that requires in some ways diminishing or breaking your brain to make that possible, I think the right to self-determination over our brains and mental experiences is the full spectrum, right? It's about being able to make choices about whether we enhance our brains or not, but also that to safeguard us from other people doing it to us without our consent or our willingness to undergo or experience that.

0:45:10.6 SC: There is a great philosophy question here, because I think we wouldn't call eating healthy food cheating academically, or exercising or even reading a book, but taking a certain drug counts, and that's probably just a relic of olden days, right? That's probably not the right way to think about it.

0:45:27.8 NF: I think so, but there is a stronghold of people who believe otherwise, but I think we draw really arbitrary lines. So if I'm using a neurofeedback device at home that decreases my stress level, that actually enhances my performance and wellbeing, am I cheating because I'm using a shortcut to do so? I'm not good at meditating without a neurofeedback headset. Am I short-cutting by doing so? I think these are in some ways resistance to technology, but also we're maybe privileging the wrong things. The idea that the genetic lottery is what we should rely on, rather than the work that people put in to actually enhancing their wellbeing and their cognitive states, I don't know why we would just privilege genetic lottery over everything else.

0:46:18.4 SC: That is a very good question. I did have a podcast that touched on similar things with Kathryn Paige Harden, but I don't think... We don't know, we don't have the answers to those questions yet, so I'm glad you brought them up. But let's get back to the dystopia that we're trying to avoid here.

0:46:34.3 NF: Yes.

0:46:34.6 SC: So I can very easily see that employers, maybe even universities seeking to let in students would want to just take a quick brain scan so we can make sure that there's no dirty skeletons in your closet.

0:46:47.1 NF: You say that as farfetched, but it's happening. It's happening, Sean. Yeah, not necessarily universities, but there are companies like Pymetrics and Mercer that have cognitive hiring-based tools, as well as companies like HireVue that use micro facial analysis and algorithms to try to study people during remote interviews. And so is it really so farfetched to think that the way in which we would try to make choices between people would be based on neurotechnology? I don't think so. I think that's coming; we already see it in large hiring platforms that are already trying to do cognitive testing between individuals.

0:47:28.5 NF: And there were already reports of schools throughout China that were using brainwave headsets on students while they were in the classroom to see if they were focused or paying attention, or their minds were wandering. So the idea that... This isn't science fiction, this is here. This is the kind of thing that I think is happening and going to continue happening.

0:47:51.4 SC: So I guess the right question to ask is how far can we go in doing these things in a completely benign and acceptable way. We wouldn't think that badly of a company that wanted to give some pencil-and-paper cognitive tests to prospective employees, so maybe this just cuts out the middleman and looks into their brains directly. Can you make the affirmative argument that there's room for this to be good and useful and safe?

0:48:18.7 NF: Yeah, maybe, if people consent to it. So I'll give you an example. There was a study that looked at surgeons and wanted to see whether the certification and proficiency tasks, the skills-based tasks, correlated well with brain activity while the surgeons were performing the same tasks they would perform as part of the skills-based assessment. And what they found was that you could very reliably, in fact, more reliably than the written test itself, see the difference between the novice surgeons and the proficient and more advanced ones, because the novice ones spent a whole lot more time thinking about what they were doing, with less of their brain activity being just the motor activity of performing the task. And the proficient surgeons spent far less time thinking about it because it was habitual. It was more automatic for them. And you would see more motor activity for those individuals when they were performing the different tasks.

0:49:20.4 NF: When you're going to select your next surgeon, are you going to ask them to put on a brainwave detector and be like, listen, I don't really want you spending all your time thinking about what you're doing. I want you to be really good at this and have this be like habitual and automatic. But the point is, are there ways to use the technology without being incredibly oppressive? There are, but it's not so easy with a technology like this that really gets at our innermost selves. And because of that, I think we have to be much, much more deliberate than we have been with the assimilation of other technology into our everyday lives.

0:50:00.5 NF: Again, I think we have to start by putting safeguards in place. We almost always catch up with safeguards later, or try to regulate technology later. And I just think we have to start with a basic set of rights for people that says, look, the starting place of all of this is that you have the basic right to self-determination over your brain and mental experiences. If you want to opt into this system, yeah, you can opt in; there are some great benefits that you may personally enjoy. But if you want to use it in the workplace to improve your focus and attention, and you don't want your employer peering over your shoulder and using it as bossware, you ought to have that right.

0:50:36.3 NF: But figuring out the balance is going to take time. This is going to take a process of democratic deliberation in society. And the first step to doing that is that people have to become much, much more aware, much more sophisticated consumers about technology that really crosses that final frontier of privacy.

0:50:52.4 SC: Well, and there is a set of things that I'm sure you're super duper aware of, conflicts between freedom and capitalism and privacy and the right to self-determination. So a company can say, well, you can choose voluntarily to take this cognitive test, but if you don't, I'm not going to hire you. So, what kind of right to consent is that, really?

0:51:16.8 NF: No, I agree with you, and one of the things I write about in The Battle for Your Brain is the narrative that people have often had about workplaces is, well, if somebody doesn't like a policy, they can just quit and go elsewhere. But that's not true when everywhere does it, and where it becomes the new baseline for what you have to do. But that's why I think if, again, we flip the narrative to start with the right to cognitive liberty, the really scary stuff comes in when there's broader things being inferred from our brain activity than kind of basic things, right? So if I'm a commercial driver and you're picking up whether I'm sleepy or awake at the wheel, and that is the only thing you're measuring and you're not taking the rest of my brainwave data to look and figure out how I think and feel about a whole bunch of other things that you have no business knowing about, is it that invasive to pick up just that one piece of information, and can we create assurances that that is the only piece of information that's being picked up?

0:52:14.8 NF: I think that's the kind of thing that a right to cognitive liberty would enable, which is to say you have to seek the exception, and the exception has to be very narrowly tailored to whatever the purpose is that you're seeking it for. So if you're doing a cognitive test, we do cognitive tests all the time. Back in the day when I worked at a strategy consulting company, it was oral strategy-based cognitive tasks, right? They'd give you a problem and they'd be like, okay, here's the problem. You have a thousand widgets and you want to figure out how to get them into this area, how would you go about doing it? And you have to lay out all of your mental thinking out loud in order for them to assess what your mental processes are like.

0:52:56.5 NF: If you were to do the same test with neurotechnology, what are the additional risks? Well, one of the additional risks is maybe they discover that the pattern by which you make those decisions shows that you're neuroatypical, and they use that to discriminate against you. If we know that, and there's no additional advantage that they get from the brainwave data that is relevant to hiring, maybe we ought to stick in that instance with an oral-based task so that they can't use other information, if it turns out that the only additional value for the business would be something that would have a discriminatory purpose.

0:53:32.1 SC: Discriminatory.

0:53:32.9 NF: So I think we've got to look at these things case by case and make a decision as a society, starting with a default right, which is: you can't use this data against me, and you can't gain access to it without my permission. And recognize that if every workplace really has obtained permission, it's done so on a really narrow and limited basis that doesn't give them access to broad-spectrum data from our brains.

0:53:58.9 SC: Yeah, that does sound like a very good starting point, but I just always worry that the technology moves faster than the law, right?

0:54:11.0 NF: I know. I know.

0:54:11.8 SC: You've already identified this. I'm not saying anything you don't know but...

0:54:13.2 NF: Look, could we put the genie back in the bottle? I think most people would be like, okay, well, that's all well and good, but let's be honest, it's going to get misused. Technology gets misused, right? I am not naive to that. I'm not ignorant of the fact that the technology can and will be misused. It can, and it will, and it already is being misused. I think the question is: given that it is so difficult, if not impossible, to put the genie back into the bottle, and that there are real benefits that people will enjoy from it, which will lead them to demand the technology and to continue to push its development forward, what's the next best thing, right? And I think the next best thing is to get out ahead of it by creating some rights that can at least give us the best chance of appropriate safeguards, at least in democratic societies like the United States.

0:55:09.5 SC: Well, yeah, you mentioned China a couple times, and I guess it's probably safe to say, to characterize China as the world's most technologically advanced autocracy, so they're a little bit more willing maybe than a democracy to experiment with all these sorts of things. I have not put in the work to try to separate the truth from the fiction in terms of stories about what is going on in China, 'cause there's a lot of political, cultural biases that push on this. But clearly they're doing more than we are. Is this kind of thing very prevalent in China right now or are there little test programs where they're trying things out? Or do we just not know?

0:55:43.8 NF: It's hard to know. It's hard to get accurate and fair information about exactly what's happening. What we do know is that there are hundreds of thousands of these headsets in use across China, and that at least some of the companies best known for commodifying brain data are based in China. What will they do with all of that data, how are they using it, will they misuse it, or are they just far ahead of the rest of us? Your guess is as good as mine, but I know that I personally would not want to be in China with a headset on. And as an Iranian-American, I personally would not want to be in Iran with a headset on whose data the government could access.

0:56:33.0 SC: Do we think that people are going to ask potential first dates that they meet on a dating site to sit in an EEG machine for a little while just to make sure they're compatible?

0:56:46.4 NF: I don't know. I'll tell you a little story: my husband and I, on maybe our second date, compared 23andMe profiles with one another. We're weird that way. We laugh about it, but it was an experiment that was fun. Are people likely to ask somebody to sit in an EEG headset? Not any time in the near future, I think. But will it become something people geek out about in the near future? Maybe: like, look at my brain session from last week and how sharp it is, or how's your meditation practice versus my meditation practice. I think it's going to be more innocent sharing like that in the beginning. It's the kind of thing you already see in online forums, where people are using the technology for their own benefit but then want to compare their brain metrics against another person's.

0:57:41.2 NF: I can also imagine as a lot of these focus and attention programs are launching that people will compete against one another, like what's your longest focus streak and what's your longest attention streak or things like that. So I can just see people gamifying it, and then that way sharing a lot of data with one another.

0:58:00.9 SC: I like that, competitive Buddhism. You're more and more one with the universe.

0:58:04.0 NF: It's happening already. It's like, oh, how's your gamma activity today?

0:58:08.7 SC: Yes, but I... Maybe I'm giving away a billion-dollar start-up idea here, but I love the idea of an app that would separately collect the brainwave data from two people and let you know ahead of time whether you're compatible. That's probably much more accurate than the little surveys you do on OkCupid or whatever.

0:58:27.1 NF: Maybe. We don't know, because I think you would have to go into the Gottman Love Lab and look at the brain activity of a whole bunch of people who have had successful marriages. But one thing that's interesting is that as people work together over time, you see a lot of synchronization of their brain activity. So you might not know whether the synchrony you're seeing is just the result of two people having been together for a long time, or whether it's what would actually lead them to stay in sync together over time. So I don't think we have an answer yet on what a neural match looks like, but it'd be interesting to see.
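[The inter-brain synchronization described here is often quantified as a simple correlation between two people's band-limited EEG power over time. Below is a minimal, purely illustrative sketch in Python with NumPy; the alpha band, the two-second windows, and all function names are assumptions made for this example, not the method of any particular lab or product.]

```python
import numpy as np

def alpha_power_series(eeg, fs=256, win_sec=2.0):
    """Alpha-band (8-13 Hz) power in consecutive windows of one person's
    single-channel EEG (1-D array of samples at fs Hz)."""
    win = int(fs * win_sec)
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    band = (freqs >= 8) & (freqs < 13)
    powers = []
    for start in range(0, len(eeg) - win + 1, win):
        spectrum = np.abs(np.fft.rfft(eeg[start:start + win])) ** 2
        powers.append(spectrum[band].sum())
    return np.asarray(powers)

def interbrain_sync(eeg_a, eeg_b, fs=256):
    """Crude synchrony index: Pearson correlation of the two people's
    alpha-power time series (closer to 1.0 means more 'in sync')."""
    a = alpha_power_series(eeg_a, fs)
    b = alpha_power_series(eeg_b, fs)
    n = min(len(a), len(b))
    return float(np.corrcoef(a[:n], b[:n])[0, 1])
```

[As Farahany notes, a high score on such an index cannot tell you the direction of causation: the correlation is silent about whether synchrony produced the compatibility or the other way around.]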

0:59:03.0 SC: Okay, I had to ask it, that was just a fun little diversion. But I guess the capstone here is: I'm entirely on board with your program of enshrining some kind of privacy rights to your thoughts; it should be by our consent that we give these away, rather than being forced to by jobs or economic pressure. So how would that work? How would we actually legislate or enshrine that in some document somewhere?

0:59:32.5 NF: Yeah, so I'm proposing an international human right to cognitive liberty. What would that look like? The Universal Declaration of Human Rights already recognizes the three rights that I think make up cognitive liberty, at least as a starting place; people may decide there's more that needs to go into it over time, which would be reasonable. But as a starting place: the right to privacy is already an international human right, recognized by the International Covenant on Civil and Political Rights and signed on to by all of these different countries, but it doesn't specify that it includes a right to mental privacy.

1:00:05.8 NF: That can simply be interpreted by the Human Rights Committee, for example, through something called a general comment, to update it and say: this obviously includes the right to mental privacy, and this is what that would entail. Freedom of thought, again already recognized by the Universal Declaration of Human Rights, has primarily been applied to religious freedom and hasn't been interpreted to include a much more robust protection of freedom of thought: the right not to have your thoughts used to punish you, or manipulated and used against you. The last Special Rapporteur on Freedom of Religion, Ahmed Shaheed, presented a report to the UN General Assembly in October of 2021 calling for an expanded understanding of freedom of thought. I wholeheartedly agree with him. We need to update freedom of thought to cover all of this emerging technology and the ability to peer into, decode, and misuse your thoughts against you.

1:01:00.6 NF: And then the last is that there is a collective right to self-determination under the Universal Declaration of Human Rights, but an individual right to self-determination has never been recognized. Again, just update our understanding of the collective right to self-determination to include what has already been recognized elsewhere: a right to informational self-access, as well as a right to self-determination over your body, to change it as you wish or to keep it from being changed in ways you don't want. That right to self-determination over your brain and mental experiences I think is critical. So all of that can happen pretty simply. The Human Rights Committee can write a general comment that says the right to cognitive liberty is obviously already enshrined in the Universal Declaration of Human Rights; it just requires updating these three existing rights, and everybody has already signed on to the ICCPR, the International Covenant on Civil and Political Rights.

1:01:52.7 NF: We don't even need them all to come together and vote on it, right? We can just update our understanding of those things and have a good starting place.

1:01:58.7 SC: Are you personally in contact with the people who are able to do this?

1:02:04.3 NF: I am, I am, yes. I've been presenting this idea across the UN, I've been presenting it to the Human Rights Committee, I've been advocating for it, and I have given quite a few presentations to anyone who would listen. I worked closely with the Special Rapporteur on Freedom of Religion as he was writing his report on freedom of thought, to make sure it reflected these ideas about neurotechnology. So it's not some pie-in-the-sky academic dream; it's actually something that's underway. I think it will require political will, a collective call showing that this really matters to people. So we're not just passive recipients of an updated right; each and every one of us should be actively saying: this is what we require for the coming future, and for the future that has already arrived.

1:02:56.4 SC: It might just be a sense of political exhaustion that sits deep within me over the past few years, but I kind of find it hard to think that the United States would enact vast new privacy rights, especially when they might get in the way of certain companies making money.

1:03:14.8 NF: That might be right, though the State of the Union speech that Biden gave suggested an increasing call for bipartisan support for privacy legislation. But again, if we start with human rights, which we have already signed on to and which already bind corporations, we don't have to rely on our divided Congress coming together and passing comprehensive privacy legislation. And maybe I'm naively optimistic here, but I believe that when it comes to people's brains, it's a bipartisan issue. It is hard to imagine any person on the left or the right who would think it's a good idea for corporations and governments to have unfettered access to what they're thinking. I don't think politicians could ever run successfully if we had unfettered access to what they're actually thinking and feeling.

1:04:11.1 SC: Right, but okay, here I'm going to play the role of the campaign manager for someone running for Congress, and they say: look, we have a new technology that will tell us with 99% confidence whether a person is going to be a terrorist or commit a murder, and they won't let us use it.

1:04:27.6 NF: Well, but that's not what a right to cognitive liberty would say. Mental privacy isn't absolute; any privacy right involves a balance between societal interests and individual interests. If you had incredibly compelling evidence about a particular person, for whom that technology would give you all of the information you need, and it would be the least intrusive means, and the alternatives you might employ would be far more barbaric ways to get the information, it's not impossible to imagine that in limited national security circumstances there would be a justification, aligned with human rights, to use that technology. So I don't think cognitive liberty is a ban on technology. What cognitive liberty is, is a respect for individuals that enables the responsible progress and use of the technology.

1:05:22.5 SC: Good, it's just my job to imagine the nightmare scenarios. It's your job to be optimistic, so that's...

1:05:27.1 NF: I'm glad, I'm glad. Well, yeah, I guess it's my job to be optimistic. I'm the person writing a book about the dystopia who's optimistic. It's a funny juxtaposition between the two, but yes.

1:05:36.8 SC: But that scenario makes me think... I should have asked this earlier, but are we going to have to do a brain scan when we check in through security at an airport sometime?

1:05:48.6 NF: Maybe. Maybe so. I write about the increasing use of biometrics in The Battle for Your Brain, and especially functional biometrics. It just turns out that passwords, as most people understand, are not particularly secure. I don't know about you, but I have a different password for basically every single site, so I never know what my password is, and I'm resetting them on a daily basis. Given all of those problems, we're increasingly moving to biometrics: you can unlock your iPhone with your face or with your thumb. And functional biometrics, that is, how you move your fingers across the screen or the patterns you engage in functionally, are better biometrics for security than static ones, like a face print, where somebody could take an image of your face and use it.

1:06:40.8 NF: And brain biometrics are quite powerful for authenticating individuals. That is, you could sing a little ditty in your head: pick your favorite song and record your brain activity while you sing it in your head. Then, the next time you want to unlock your computer, or your employer's site, or anything else, you would sing that same little song in your head, and your unique pattern of brainwave activity in singing that song is different from how I would sing the exact same song. Given that, brain biometrics may increasingly be used by governments as a way to authenticate individuals.
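[A minimal sketch of how such a "pass-thought" check could work, in Python with NumPy. Everything here is an assumption made for illustration: the band-power features, the cosine-similarity threshold, and the function names. Real EEG authentication systems use far more careful signal processing, but the enroll-then-match shape is the same idea described above.]

```python
import numpy as np

BANDS = [(4, 8), (8, 13), (13, 30), (30, 45)]  # theta, alpha, beta, gamma (Hz)

def brainprint(eeg, fs=256, n_segments=8):
    """Feature vector for one recording of the user imagining their song:
    power in each frequency band, for each time segment."""
    feats = []
    for seg in np.array_split(np.asarray(eeg, dtype=float), n_segments):
        spectrum = np.abs(np.fft.rfft(seg)) ** 2
        freqs = np.fft.rfftfreq(len(seg), d=1.0 / fs)
        feats.extend(spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS)
    return np.asarray(feats)

def enroll(recordings, fs=256):
    """Build the user's template by averaging features over several
    enrollment recordings of the imagined song."""
    return np.mean([brainprint(r, fs) for r in recordings], axis=0)

def authenticate(template, attempt, fs=256, threshold=0.9):
    """Accept the unlock attempt only if its features are close enough
    (cosine similarity) to the enrolled template."""
    f = brainprint(attempt, fs)
    sim = template @ f / (np.linalg.norm(template) * np.linalg.norm(f) + 1e-12)
    return sim >= threshold
```

[One appeal of this kind of functional biometric, per the discussion above: unlike a face print, the template is revocable. If it leaks, you can enroll again with a new song.]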

1:07:19.3 SC: But now you have a new dystopia, which is that I have to like hum Jingle Bells to myself every time I want to log on to a website.

1:07:24.8 NF: Well, pick a better song, like Rihanna's "Diamonds" or something. Pick a good song to hum in your head, so that you don't have a jingle like Jingle Bells stuck in your head all year long.

1:07:41.7 SC: Can you tell me a little bit about... So you said you've been talking to people, they're receptive, the people who have the ability to think about these things. What is the level of awareness of this problem on the part of the typical national politician?

1:08:00.9 NF: Well, for the typical national politician, probably not that high. But at the international human rights level, I think there's a lot of awareness of what's happening, and in a lot of international settings there's been a lot of awareness too. And I'm not the only person advocating for changes in this space; there are activists who have started to talk about the different set of rights we need as neurotechnologies and neuroscience progress. So there's been a lot of activity, for example, at the OECD, which has proposed regulation of neurotechnology; at the Council of Europe, which has taken up the issue; and in different committees within the UN that have taken up the issue as well.

1:08:39.3 NF: And so across the international human rights landscape, as well as increasingly from active folks across academia, from [1:08:51.5] ____ and neuroscience to philosophy to computer science, you see a lot of people starting to sound the alarm and to say: hey, we need to be on high alert, and we need to do something about it now. So I see activity coalescing around raising consciousness and advocating for different solutions to the problem, or set of problems, that we see, and that's promising. But the average politician? If I had gone to the State of the Union recently, picked my favorite Congresswoman, and said, hey, what do you think about the emerging trends in brain sensors, I think I would get a very blank stare.

1:09:35.1 SC: But maybe they have a staffer who is aware.

1:09:36.7 NF: Maybe they have a staffer who's on it, and I hope, I hope they do. Yeah.

1:09:40.3 SC: So I actually do like to end the podcast on an optimistic note, and it sounds like you're trying to do this; you've already been pushing the optimistic line, but I'll let you sum it up somehow. There is a future in which we take advantage of this technology, and it makes our lives more convenient and gives us access to things we wouldn't otherwise have, without selling the farm and giving away our innermost terrible thoughts to the whole world.

1:10:08.5 NF: There is. So I will sound cautiously optimistic and say that I believe it is difficult, if not impossible, to ban technologies, and that a ban isn't in the best interest of humanity. There can be great promise from neurotechnology for individuals to improve their own health and well-being, and when people are empowered to make choices for themselves about how they use the technology, it can be a force for good: it can help us transcend neurological diseases and disorders, and even improve our mental health at a time when there is a crisis of mental health across the world. It also has the risk of becoming the most dystopian technology we've ever unleashed on society. So we have choices we can make right now that help shape the direction the technology will take, and those choices begin by recognizing a right to cognitive liberty, the right to self-determination over our brains and mental experiences. There couldn't be a more urgent call to action, because truly what's at stake is what it means to be human. So it's a call to action for everybody to become aware, but also to advocate for those rights, for themselves and for society.

1:11:28.9 SC: I cannot improve upon that. Nita Farahany, thanks so much for being on the Mindscape podcast.

1:11:33.3 NF: Thank you for having me, it's been a pleasure.

[music]

3 thoughts on “229 | Nita Farahany on Ethics, Law, and Neurotechnology”

  1. Ms. Farahany is a legal scholar, and therefore I have a pragmatic legal question. Who becomes the recipient of the intellectual property rights (IPR) when neurotechnology is used for cognitive enhancement or similar benefits in research and creative activities? I can foresee a field ripe for conflict.

  2. As a counselor, I've always thought that there is a great deal of psychotic or pathological thinking within an individual. In normal settings, this thinking is not drawn out, not emphasized, and not part of character or identity, but a sub-part of thought, of the person. When a depressed person says, "people are staring at me on the bus," that can be paranoid thinking. In the hands of AI, creating ways to get into someone's head, this is a massive infringement of the person, the psyche.
    If a marketing company wants to sell me X, it could magnify these random thoughts, and I might buy things because the marketing agent amplified the paranoid thinking, or the lustful thinking, or the angry thinking.
    And a government agency looking for pathology, able to collect millions of people's thoughts from their wearables, could use the data to justify surveillance and to attribute the negative character habits of a "criminal," when if the analysts examined themselves, they would see the same traits if they looked for them. Bogus science and bad science will abound in the new marketing frontier, much like the evidence of a witch in the Middle Ages.

  3. Raymond Sivahop

    A wonderful podcast, thank you Sean! And Ms. Farahany is both very believable and a wealth of knowledge!
    But what will the government do with all of that information? Nothing good, I'm afraid.
    And the media even less so.
    It all reminds me of the last election, where the media could supposedly track all of the comments for or against the candidates based on people's phone numbers. But with that said, why do I still have to put up with scammers, and robocalls, and people wanting to buy my house when it isn't even listed? I guess they believe that they are a public service, or don't want to give away their trade secrets to improve our lives.
    They can obviously track all of the hate groups in the country, but won't lift a finger to do anything about them. Yet say something against the President, or even a Congressperson, and the FBI will be at your door in the morning.
    Have we all forgotten about 1984? I never would have believed that movie. But here it is.

