Can AI read your mind? The battle for your brain w/ Nita Farahany
Why this matters
Safety is not only about model behavior; this episode highlights second-order effects on people, institutions, and labor markets.
Summary
This conversation examines society and jobs through "Can AI read your mind? The battle for your brain w/ Nita Farahany," surfacing the assumptions, failure paths, and strategic choices that matter most for real-world deployment.
Perspective map
The amber marker shows the most Risk-forward score. The white marker shows the most Opportunity-forward score. The black marker shows the median perspective for this library item. Tap the band, a marker, or the track to open the transcript there.
An explanation of the Perspective Map framework can be found here.
Episode arc by segment
Early → late · height = spectrum position · colour = band
Risk-forward · Mixed · Opportunity-forward
Each bar is tinted by where its score sits on the same strip as above (amber → cyan midpoint → white). Same lexicon as the headline. Bars are evenly spaced in transcript order (not clock time).
Across 35 full-transcript segments: median 0 · mean −3 · spread −25 to 8 (p10–p90: −10 to 0) · 6% risk-forward, 94% mixed, 0% opportunity-forward slices.
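The summary statistics above (median, mean, min–max spread, p10–p90, and band percentages) can be reproduced from per-slice scores. A minimal sketch, assuming scores on a signed spectrum (negative = risk-forward, positive = opportunity-forward) and a hypothetical band cutoff of ±15 — the actual banding rule isn't specified on this page:

```python
import statistics

def summarize(scores, cutoff=15):
    """Summarize per-slice spectrum scores.

    `cutoff` is an assumed threshold separating the mixed band from the
    risk-forward (< -cutoff) and opportunity-forward (> cutoff) bands.
    """
    n = len(scores)
    # Deciles: quantiles(..., n=10) returns 9 cut points; first is p10, last is p90.
    deciles = statistics.quantiles(scores, n=10, method="inclusive")
    return {
        "median": statistics.median(scores),
        "mean": statistics.fmean(scores),
        "spread": (min(scores), max(scores)),
        "p10_p90": (deciles[0], deciles[-1]),
        "risk_pct": 100 * sum(s < -cutoff for s in scores) / n,
        "mixed_pct": 100 * sum(-cutoff <= s <= cutoff for s in scores) / n,
        "opp_pct": 100 * sum(s > cutoff for s in scores) / n,
    }

stats = summarize([0, -3, -25, 8, -10, 0, 2, -1])
```

This is illustrative only; the real pipeline presumably scores each of the 35 slices with a model before aggregating.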
Mixed leaning, primarily in the Society lens. Evidence mode: interview. Confidence: high.
- Emphasizes safety
- Emphasizes labor market
- Full transcript scored in 35 sequential slices (median slice 0).
Editor note
Useful mainstream bridge episode for teams that need a shared baseline quickly.
Play on sAIfe Hands
Episode transcript
YouTube captions (auto or uploaded) · video gMo58U7mk6A · stored Apr 2, 2026 · 1,024 caption segments
Captions are an imperfect primary: they can mis-hear names and technical terms. Use them alongside the audio and publisher materials when verifying claims.
No editorial assessment file yet. Add content/resources/transcript-assessments/can-ai-read-your-mind-the-battle-for-your-brain-w-nita-farahany.json when you have a listen-based summary.
Scene one: interior, an office space, fluorescent. The year is 2035, and neurotechnology is the new normal. An employee sits at their desk. They're wearing earbuds that their workplace has issued them, but these earbuds are doing much more than playing music, because they have brain sensors embedded inside them. It's also tracking their stress levels while they're interacting with their screen and typing a memo. And then they're sitting back, relaxing, and they're reviewing their brain data over the past couple of months, and they notice that there's something unusual going on when they're sleeping. And so they pull up a new email, type it through their brain-sensing earbuds, and send off a quick message to their doctor: hey, noticed that there's something unusual going on here, could you take a look and let me know what you think. There are no physical screens or keyboards here, only desks where the workers sit and stare straight ahead, clicking around office software with their minds. A worker is sitting with her arms folded at her desk. As she waits for her doctor to respond, her thoughts wander. She starts to fantasize about one of her colleagues, and then suddenly starts to worry, realizing that her employer has access to all of her brain data, and she notices that a little message pops up on her screen warning about interoffice romances. And, you know, she doesn't want to get in trouble. In that context, she's relieved when later in the day she gets an email from her boss that tells her that she's getting a performance bonus, and the reason is her brain metrics show that she's really just been on it. She leaves work. She's still jamming to the music with her brain-sensing earbuds in; playlists will become increasingly responsive to brain activity, that's kind of a given. She gets home, has her brain-sensing earbuds in while she sleeps at night. That allows her to track brain activity during the night. It also potentially could allow people to do things like market to her while
she's sleeping. And then she comes into work the next day. There's a somber cloud that's fallen over the office, and the reason is that one of her colleagues has been arrested under suspicion of engaging in some kind of fraud. And she's really worried, because she's been secretly working with that person on a startup. But it turns out that as you work more closely with someone, you can start to see synchronization of brain data between two people, and she's worried that that synchronization data is going to be used by the authorities to implicate her in his wrongdoings. A world overrun by neurotechnology: it all sounds very sci-fi, but our guest today, Nita Farahany, says that many of our devices already have these sensing capabilities, and this kind of workplace experience is just within reach. "There's already workplaces that are issuing these earbuds that have brain-sensing devices in them, that could really both benefit an individual and be used against them." I'm Bilawal Sidhu, and this is The TED AI Show, where we figure out how to live and thrive in a world where AI is changing everything. Hi, I'm Bilawal Sidhu, host of TED's newest podcast, The TED AI Show, where I talk with the world's leading experts, artists, and journalists to help you live and thrive in a world where AI is changing everything. I'm stoked to be working with IBM, our official sponsor for this episode. In a recent report published by the IBM Institute for Business Value, among those surveyed, one in three companies paused an AI use case after the pilot phase. And we've all been there, right? You get hyped about the possibilities of AI, spin up a bunch of these pilot projects, and then, crickets. Those pilots are trapped in silos, your resources are exhausted, and scaling feels daunting. What if instead of hundreds of pilots you had a holistic strategy that's built to scale? That's what IBM can help with. They have 655,000 consultants with generative AI expertise who can help you design, integrate, and optimize AI solutions. Learn
more at ibm.com/consulting, because using AI is cool, but scaling AI across your business? That's the next level. Boring docs with walls of text are bad for business. Our brains are wired to react to visuals, so make your docs visual with Canva Docs. Grab attention by adding any type of media to your Canva doc: photos, charts, graphics, videos, banners, and much more. Bye-bye, boring docs. You'll love the docs you can easily design with Canva, and your readers will too. Love your work with Canva Docs at canva.com. Teams with big ideas start in Jira, the only project management tool you need to plan and track work across any team. Jira even helps our team here at TED, keeping us in sync to deliver the big ideas our listeners love. And there's a lot more that teams will love about Jira. It keeps cross-functional tasks organized with a project's timeline; that's always really key so that we make our deadlines. And cross-functional teams like TED working in one tool gives leaders the important visibility they need to make better business decisions. Get started on your next big idea today in Jira. Hey listeners, I'm excited to share with you a podcast I think you'll love called The Next Wave. This show brings you fresh takes, industry insights, and a trustworthy perspective on how to implement AI to grow your business. Matt Wolfe and Nathan Lands, the hosts of the show, do a great job of democratizing the expertise that is often reserved for the boardrooms of the biggest corporations out there and bringing that directly to you. Whether you're seeking to adapt your company to the AI era or simply curious about the future, this podcast will equip you with the knowledge to thrive in the forthcoming wave of change. Plus, I think you're going to enjoy the episode where I joined as a guest, called "Why Google Search Isn't Going Anywhere Anytime Soon." Check out The Next Wave wherever you get your podcasts. Today we're going to dive into AI-enhanced neurotechnology with ethicist and legal scholar Nita Farahany. She is the author of
The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology, a book I found both fascinating and terrifying. It's about how neurotechnology, without regulation, has the power to infringe upon our last bastion of privacy: our privacy of thought. But is this kind of technology inevitable, and if so, how do we preserve our cognitive liberty, a term that Nita has used that I find myself thinking about a lot lately? I followed up with Nita because I had so many questions. Nita, welcome to The TED AI Show. Thanks for having me. First, I have to ask, can you start by giving us a brief overview of neurotechnology? Sure. What is it, how does it work at a high level, and what are some of the dominant use cases that companies are pursuing? Up until now, if somebody hears neurotechnology, they associate it with maybe something like what Elon Musk is doing with his Neuralink company: you know, drilling a small hole into the skull, implanting electrodes deep into the brain, and then enabling somebody who maybe has lost the ability to communicate or has lost the ability to move to regain some of that functionality. That is also neurotechnology, but it is not the neurotechnology that's going to impact most of us. What I'm focused on is the Internet of Things and wearable technology. So the fact that, you know, people are increasingly wearing devices like a smartwatch or earbuds or XR devices, like virtual reality and augmented reality glasses, that are packed with sensors, and those sensors are picking up different aspects of our bodily function. Anytime you think, anytime you do anything, neurons are firing in your brain. They give off tiny electrical discharges. Hundreds of thousands of neurons may be firing at any given moment that reflect different mental states. So when you're happy, when you're sad, when your mind is wandering, or when you think up, down, left, right and you want to navigate around a screen, those are electrical signals that can be picked up by
these sensors, and then AI enables the decoding of that into commands. Most people are used to, for example, heart rate or the number of steps that they're taking or even temperature being something that's tracked and what's already on the market. But starting to come in a much more widespread fashion is embedding brain sensors into those devices, mostly electroencephalography sensors, EEG sensors, and those sensors can be put into earbuds or headphones or even small wearable tattoos behind the ear that pick up electrical activity in the brain. And so really the way to think about neurotechnology is not as some separate device, although there are many of those that already exist on the marketplace, like a forehead band or sensors that can be put into a hard hat, but instead to think about the wearable devices we're already wearing, with new sensors that have the capability of picking up brain activity. Even recently, with the release of the Apple Vision Pro or the Meta Quest 3, you know, the way I've been thinking about it is: on one hand it's a VR headset, on the other hand it's a biometric recorder on your face. That's right. What is currently possible with this technology as far as mind reading goes, and what is not quite possible yet but is on the near horizon? Yeah, it's a good question. You know, XR is in many ways, from the ground up, new technology, right? So it's computing on your face, but it doesn't have to be constrained by the same old rules, the same old rules being something like a keyboard and a mouse or a joystick. Like, that doesn't have to be how we navigate through that. And so as you build a new class of technology, it's possible to reimagine what interacting with that technology looks like. And so that's what these companies have done: they've packed them full of biometric sensors, whether those are cameras or facial recognition or eye gaze or brain sensors, all of them being trained on the ability to make inferences about brain and mental
state. So: are you happy, are you sad, are you tired, are you awake, is your mind wandering or are you paying attention, some of these kind of basic emotional states. Also, most of them are pretty good at being trained on brain activity to figure out cursor activity, so up, down, left, right. So a lot of the things that you could do with your mouse, you could do with a computer interface device. And then they're getting better at being able to decode an intention to type. What's not there yet, from a true mind-reading capacity for wearable devices, is literally what you're thinking, or continuous language in your brain. So you can't quite, at this point, think, "Oh, I need to send a quick text message to my husband," and then have that somehow decoded from your brain, sent to your mobile device, and then sending off a message to him by thinking about doing so. But that's coming. So that's the kind of thing where the intention to communicate is something that can increasingly be decoded by these devices, and that's where generative AI really can be a game changer. Because generative AI, trained on natural language and conversations, becomes much more powerful at being able to predict the next word that it is that you want to type. And so that kind of autocomplete feature of generative AI, paired together with brain activity and the increasing capability of being able to put large language models on a device, means that the devices and the brain sensors can co-evolve with the individual, so that it becomes increasingly more precise at more and more mind reading. It's really powerful. So it sounds like this technology can kind of coarsely infer your brain states, not exactly read your internal monologue, but to your point, as you introduce new modalities, you're looking at what does the camera feed see, what is the user's eye gaze focused on, and then throw in generative AI to add a layer of prediction on top of that. Even with these very coarse sensing capabilities, you can actually start making
very accurate predictions. That's right. So I want to understand why this is something so many companies are racing towards and investing in heavily. What are some of the most positive use cases that you're excited about? Like, how is this actually going to help users? Yeah, I think there's a lot of ways it can help users, and I don't think about neurotechnology as just those brain sensors that are put into an earbud. I think about this as an entire class of cognitive biometrics, which allow predictions about what a person is thinking and feeling, and that can have a lot of really positive use cases, the more we can actually gain really accurate insights about what's happening inside of our brains. So the companies who are racing to this space are all of the major tech companies, right? Apple has all kinds of patents and investments in the space, and you see the same thing at Meta and Google and others. Because if XR technology really takes off, which I think eventually it will, it doesn't make sense to have it powered by, or have us tethered to, a keyboard and a mouse or a joystick. Like, there has to be a more natural and seamless way of interacting with these devices. The approach has not been, let's figure out how to commodify all the brain data; it's, how do we build a new class of technology from the ground up and have a new way of thinking about interacting with that technology? But then, you know, there's also a huge amount of investment that's been happening to the side, which is on mental health, and recognizing that the brain is an untapped potential of areas and products that could be targeted at mental health. And so, you know, this is things like seeing a huge number of apps that are focused on, whether it's AI-based mental health, or journaling or meditation apps. There you have, you know, a billions-to-trillions-of-dollars industry potentially around brain health and wellness. And then they start to converge, because suddenly, if you have this new class of devices that
have been focusing on neural interfaces, and you have this untapped potential of brain health and wellness, suddenly you have access to much better insights and much better ways of being able to actually interact with the brain, and to gather data that could be used for much more targeted products. So I think it's like the biggest untapped market, if you think of it that way. That's well said. I mean, I believe Zuckerberg called neural interfaces the holy grail of VR, right? That's right, yeah. The QWERTY keyboard doesn't make any sense when you're thinking about, like, XR, right? It just doesn't make sense, nor does the mouse, right? I mean, it's become second nature to us, but the idea that I have to work a mouse to navigate around my screen... like, touchscreen was a good innovation that way, but that's still awkward, and we're just inefficient in our interactions with technology. Support for the show comes from LinkedIn. If you are a B2B marketer, you know how noisy the ad space can be. A lot of noise. If your message isn't targeted to the right audience, it'll just disappear. But with LinkedIn Ads, you can be a lot more precise. You can reach the professionals who are more likely to find your ad relevant, because LinkedIn has targeting capabilities to help you reach folks by job title, industry, company, and more. You can stand out with LinkedIn Ads and start converting your B2B audience into high-quality leads right away. I've learned so much about the vastness of LinkedIn, because it does seem like everybody is on there, so it's helping me find some leads, or think of connections that I wouldn't have otherwise thought of without the technology. It's helping me stay informed and stay educated. Start converting your B2B audience into high-quality leads today. We will even give you a $100 credit on your next campaign. Go to linkedin.com/audio to claim your credit. That's linkedin.com/audio. Terms and conditions apply. LinkedIn: the place to be, to be. This world you painted sounds
very efficient, if that's the right word for it, but there is something uncanny about it, right? And I want to get into what makes it uncanny. In evaluating some of the risks of this technology, you brought a lot of awareness to this idea of cognitive liberty. So how do you define cognitive liberty, and why do you find it to be a useful term when you're looking at this new wave of neurotechnology? I came upon the term cognitive liberty around 2010 or something, and it resonates with me well, because what it reflects, from my perspective, is this right to self-determination over our brain and mental experiences as a fundamental matter. And, like, what does that even mean, to have self-determination? There's this huge amount of literature that has developed over the past couple decades around what self-determination means and why self-determination is fundamental to human self-actualization: you need the basic autonomy, the competence, and the capacity for relatedness to other people. And so for me it's about that. It's about those pillars of self-determination, the ways in which technology can both enable it but also has increasingly come to interfere with the capacity for self-determination, and how cognitive liberty would give us the goalposts to say, like, what is it that we're trying to achieve in technological design or in technological regulation? We're trying to preserve this space of self-determination for individuals, to enable them to form their identity, to have a space of mental privacy, to have the capacity of freedom of thought, the preconditions to be able to become a fully self-actualized human. So it resonates really well for me as a liberty interest, as a kind of fundamental right that individuals have as a precondition to being able to flourish. It's like our minds are this sanctum sanctorum, right? We feel like we have complete dominion over it, though already technology is influencing us at a very deep, visceral level without us even knowing, and now we're creating these higher-
bandwidth forms of sensing. And I'm kind of curious, is there an interesting analogy here where, you know, many folks got their DNA sequenced over the past decade or two without really thinking about a scenario where maybe a data breach would occur? And it made me wonder, from a medical data perspective, what are going to be the effects of increasing brain and neurological health transparency without adequate privacy regulations? And obviously America is very unique in that there is no federal law on internet privacy. Could this medical data kind of be weaponized against users? Yeah, I mean, I think very much so. So, you know, part of it is people went into direct-to-consumer genetic testing and could never have imagined it would eventually be used to solve cold cases, and, you know, for law enforcement agencies to collect all of that data and put us all into warrantless searches from here on out, right? And then some people are like, well, that's okay, I don't mind that. Like, I didn't commit a crime, and so if it helps, you know, find that person, that's great. But, you know, as you start to imagine every possible use case, there are so many use cases that people just don't even contemplate about the way in which data can be used or misused against them. I think with brain data it's even more fundamental than that. Like, I don't think it's just, let's point to how law enforcement might use it one day, or how others might use it one day. I think it's about the importance of having a space, like, you know, the inner sanctum, the mental privacy, the space we need to even just be us, and that's almost impossible for us to even grapple with, or imagine a world in which we have that. But that's where I spend a lot of my mental energy, is imagining that world where we don't have that. And, you know, think about, like, as a kid, all of the thoughts where you're like, maybe I'm weird, I, you know, have a different gender identity or preferences in sexual orientation, or, you know,
maybe I don't want to be a doctor, and my parents really want me to be a doctor, and all of these little thoughts that you have every day. If you don't have a space where you can do that, where you feel safe to just be that person that figures out who you are, what does that world look like for humans to become, right? This, like, act of becoming that we're constantly engaged in. That's the world that I think we're entering into without realizing it. It's a world where what we've taken for granted as this most fundamental aspect of being human suddenly may no longer exist, and we're not putting into place the protections to ensure this linchpin of humanity is still safe. This is what it has come to: we're talking about neurosurveillance. Let's go even further down the rabbit hole and, like, imagine this future a bit more. Do you think we'll get to a place where our brain states, our inner thoughts, are fully transparent to one another? How would this impact our personal relationships? How is this going to impact society? It kind of feels like Twitter on steroids, like, without the crude thumb-typing to express your thoughts, just knowing what everyone is thinking at all times. It just feels wild. So I taught a class at Duke called Let's Talk About Digital You, and it was a class for undergraduates where the goal was to have them think critically about digital technologies and how they interact with them, and then also think about what it means for them and who they are as a person. And one of the things I learned in that class, when we were talking about privacy, was how almost every kid in the class shared their location data on their phone with every one of their friends, like hundreds of friends who were tracking them at all times. I was shocked by this. And then I think about how, in the writing of the book, I was interviewing this person who leads a meditation class that uses neural devices, and how they've created Facebook groups where they're sharing readouts of their
meditation sessions. They're like, oh, look at my gamma activity there, look at what's happening with my alpha activity here. And I think, is this going to be the latest status update, where, you know, you're like, Nita's in a bad mood, don't talk to her right now, or, you know, Nita is thinking about food, and my friends reach out and say, like, hey, I'm thinking about food too, let's go get a bite to eat? Like, it just becomes something that, you know, we decide we're going to share all this data, and it's not hard to imagine that we get to that place where it becomes something that is much more transparent. Is that all bad? I don't know, right? There are some people who believe, you know, total transparency is better. I can buy into that argument up until you get to mental privacy, and then I really think this act of becoming requires that we have much greater control over what, if anything, is shared from our brain data. You're bringing up a point about this technology that is very much a consent issue, and sometimes I feel like utility with technology almost ends up being a Trojan horse. Like, we do get a lot of benefit from this stuff, but by consenting to these things, all these other things happen that we did not expect and may not even be aware of. Right, and I think that's what I'm trying to really highlight for people. There's been a lot that's happened over the past year, and part of it has been me worrying about normalizing neural surveillance, like, the risks become invisible to us, and we accept this new technology without even stopping to recognize all of the implications of what it is that we're adopting. And part of it is interesting: it's how the technology is introduced to us. A lot of times it's with, as you put it, utility, right? There's some utility that we buy into, or there's some experience where it's like, the only way you get this fun experience is by opting into this new technology, without then sensitizing us to all of the risks of doing so, and
really contemplating, like, we're crossing a new barrier here. We're crossing some threshold that we've never passed before, and it has profound implications. It is this space of what it means to be human. Part of how we define vulnerability and intimacy with each other is what we choose to share and what we choose to hold back, and I don't think that's just a consent issue, right? Because, like, at some point there's enough peer pressure and social pressure that if you're not sharing your location data, like, you're weird, you know? Why aren't you sharing it with everybody else? So there's this coercive pressure toward the norm, where we have this collective action problem. If each of us individually decides, like, oh, I'm okay with sharing my neural data, and I don't care if I don't have mental privacy, I don't have anything to hide, suddenly we have the collective having not recognized that they've given up this space of what it means to be human, without us ever having really talked about it, thought about it, recognized how profound a thing it is to actually cede to other people or to companies. Yeah, it's like, we like targeted ads. Oh, it's fine, I like using Instagram. And these algorithms have the coarsest, you know, kind of view into our hopes, wishes, and desires, and are still doing such a great job. It feels like we're going into this future where people are like, oh yeah, you know, I'll get that cheap Alexa, and then suddenly they're having dreams about Bud Light or Coors Light or whatever, but they're like, oh, that's cool with me, I'm asleep, I don't really care. Yeah, I mean, that example is a real one, right? It's like, um, I write about this in my book, where, you know, Coors was really frustrated. They'd been locked out of the halftime show for years for the Super Bowl. So they got together with dream researchers, figured out that it's possible to incubate in people's minds, when they're in their most suggestible state, because their conscious brain is basically checked out as
they're falling asleep, associations between Coors and mountains and lakes and streams, so that you have this idea that, like, Coors is crisp and refreshing. But then I imagine this dystopian future of: you're wearing your sleep earbuds to track your sleep activity, the entire economic system of these companies is based on targeted advertisement and real-time bidding of, you know, here's new real estate to be able to access Nita, whether she's on her search engine or the App Store, and it's like, hey, we've never targeted when Nita's asleep, but here's her most suggestible sleep state, she's got an Amazon Echo in her bedroom, like, prime time to advertise and to play a little jingle is right now. And so, like, what stops that? What prevents targeted advertisements to us while we are sleeping? Nothing, right? I mean, other than, like, the company shouldn't do that. Okay, what's going to stop them from doing that if that becomes the most effective way to advertise to us and becomes the most expensive ad real estate that they could sell to ad brokers? Oh, good lord. Oh, good lord. I mean, the study is basically... this targeted dream incubation demonstrates what companies and social media algorithms can do, and also what neurotechnology is capable of already, right? And so it's wild, because social algorithms not only extract or infer our way of looking at the world, but they are actively shaping how we perceive the world. That's right. They predict what we want, but as these platforms become such an intricate part of our lives, they can almost define what we want. So when it comes to this threat of neurosurveillance, how do you think the rise of this tech could influence what we feel safe to think on a conscious level, or even more subliminally, like, what we're capable of thinking? That's the question I struggle with the most these days, is, you know, as I try to unpack what self-determination means anymore in a world where we're being steered constantly. And, you know, the easiest way
people can really understand that is: if you sit down to watch a single episode of a show and you end up watching four, it's by design. You are less likely to get up and leave if it's just automatic, like, if you don't have to make a self-control choice. And we can see these differences. Like, if you look at the studies, there's a difference between when you are making a choice about what video to watch next versus an algorithm is deciding for you what video to watch next, in that the parts of your brain that are responsible for self-control are basically just turned off. They're silenced when you're just being fed content over and over again. Wow. And when you see that, right, self-control is turned off, you are being fed information, and that's changing how you feel, that's changing what you believe. We are being steered. And is there such a thing as self-determination in a world that's increasingly connected to technology, and a world where that steering will become much, much more precise? If it's not just the crude interpretation of how many seconds or milliseconds you spent on a video to try to make an interpretation of you, but literally a direct measurement of your reaction to information, and then you're in an immersive environment, you're not even in the real world anymore, where at least there's some things that are static, and that immersive environment can continuously change in response to your brain activity... It feels like we're approaching The Matrix pretty quickly. Um, and so then I struggle with, like, okay, well, is there such a thing as self-determination in that world? Of self-actualization? Of the world where there's true autonomy, even if it's relational autonomy, where there's true competence, where we're exercising self-control and critical thinking skills, where we're fostering relationships with each other and relatedness with ourselves and with other people? I still believe there is. I think if we have cognitive liberty as a guiding principle, that points us to
different ways of designing technology that align with self-determination rather than with human diminishment.

I love the phrasing you're using, that technology is steering us, and really the question is who is at the steering wheel, right? Who controls the objective function? This future starts sounding rather scary, so I want to zoom back out just a bit and ask you: you've been a major figure both in spreading awareness about these breaches of cognitive liberty and in trying to prepare us for the threats ahead. You've also been working to encode cognitive liberty as a legal right. I'm curious what that would look like in regulation, and how it would be enforced.

First, I'll just emphasize that cognitive liberty, to your point, is systemic change, right? It's in part about encoding it into law, but it's also about changing our incentives, changing economic alignment with cognitive liberty, commercial design and redesign. From a legal perspective, I think it starts with a human rights approach, so that there's a universal global norm around a right to cognitive liberty, recognizing it as an organizing principle for how we interpret existing human rights law. It's recognizing that privacy includes an explicit right to mental privacy, which also safeguards against interference with, and interception of, the automatic processes in our brains. So much of this technology is really about that: it's not about robust thought, it's about these automatic ways that our brain reacts, and interfering with them, hijacking them, manipulating them. And then there's freedom of thought. As long as it has been a right, it has been recognized as an absolute human right, but it's pretty narrowly constructed precisely because it is absolute. It's really about protecting against the interception, manipulation, and punishment of thought, and here we're really in the mind-reading realm, right, which is protecting the space of what we would really
commonly, in everyday language, think of as thought, and the images in our minds. That's the human rights perspective. Any good critic of human rights law will say, well, that's only as good as whether people adhere to it and how it's implemented at a national level. So I also think it's important to recognize what this looks like at the nation-state level. Part of that is, within privacy laws, creating robust rights around cognitive biometric data: employees have a right to mental privacy, and children and students in the educational system have a right not to be surveilled for their mental activity, giving them that space. So it's taking these high principles and then, context by context, creating robust laws that actually implement those high concepts at the national and sub-national levels.

So where are we currently at there? I'm curious if there's been any notable progress on this subject since your book came out.

A lot, yeah. In the US there's finally been some momentum. I'd say I'm not that excited about what's happened here yet, but I'm appreciative of the fact that there have been conversations. Colorado passed a new law that provides some protections around neural data when it's used for identification purposes; that's a very narrow subset, but it's something. California has a broader law currently pending that would make it a sensitive category of data. The Uniform Law Commission here in the United States, which has appointed commissioners from every state, has just agreed to create a study committee that might lead to a drafting committee for model legislation for all of the states on how to protect this category of data around cognitive biometrics, which is exciting; that launches this summer. UNESCO has had a major process underway: 194 member countries voted to move toward adopting a global standard around the ethics of neurotechnology, and the first draft of that was published
in April. I'm part of that process; I was appointed by the US to it, and I'm co-chair of the expert committee. The second draft will come out at the end of August. And then internationally there have been a lot of conversations happening around the concept of cognitive liberty and whether or not there need to be special rights, broadening the conversation beyond neurotechnology to understand this as a new class of rights moving forward. So it's encouraging, but what you find is that trying to claw back rights is much harder than having a set of rights in place from the get-go. So I think we need to move faster and use that momentum to actually lead to real change before these products are widescale across society.

This technology is coming fast and furious, and as you've alluded to, actually as you've outlined, it's being infused into tech that we use every single day. People are just going to go buy the next-gen AirPods. One question I have is, for the listeners who are perhaps just being exposed to this idea of neurosurveillance and neurotechnology, what advice would you have for them as they navigate the world and encounter these technologies, both in their personal lives and in the workplace?

We still have time to make choices, individually and collectively. We should demand that if Apple launches EEG sensors, they be very clear about what their data privacy policies are with respect to that data. And they've done that with the Apple Vision Pro, right? They've been incredibly specific: here's what's happening with the eye-tracking data, here's what lives on device, here are the inferences that leave the device. That kind of transparency is great, and I applaud them for it. We should only buy products that are transparent about how they're handling each stream of sensor data, and not buy products from companies that do not offer those same assurances. That's a space we
have a right to protect, and that we should demand as a set of protections. So advocate for the rights that are being called for: be part of the process of advocating for UNESCO to move forward, and for states and countries to move forward with a robust set of rights around cognitive liberty, and be part of the change that makes that happen.

That's beautifully put. Thank you so much for joining us.

Thanks for having me.

This tech is amazing, and it's not hyperbole to say that we're dealing with mind reading here. One of the central benefits of this technology is gaining visibility into our own minds, quantifying that endless stream of activity that influences our day-to-day perception. Through this new tech, what seems as opaque as emotional experience can be translated into measurable neurobiological patterns. This means we can better understand what's happening in our heads, and it also means we can more easily change what's happening in our heads. If you understand what's going on in a person's mind, you can create a stimulus to influence it, whether that's Orwellian workplace monitoring or the most persuasive advertisement you've ever seen. It's giving away read-write access to the core of who we are. So what happens to a society where there are no secrets, where we're all open books? Maybe this is an inevitable transition; we already share so much of ourselves via social media, and do so proactively. That said, we need to place a lot more value on our personal data, data that we currently view as innocuous and often sign away without even thinking about it. When it comes to tech that has both read and write access to our minds, we have to be more proactive than we have been about social media, and we have to value the data before it becomes table stakes to function in society. After all, the stakes here are high: we're talking about our hearts, our minds, and how we view the world. The TED AI Show is a part of the TED Audio
Collective and is produced by TED with Cosmic Standard. Our producers are Elah Feder and Sarah McCrea, our editors are Banban Cheng and Alejandra Salazar, our showrunner is Ivana Tucker, and our associate producer is Ben Montoya. Our engineer is Aja Pilar Simpson, our technical director is Jacob Winik, our executive producer is Eliza Smith, and our fact-checker is Krystian Aparta. I'm your host, Bilawal Sidhu. See y'all in the next one.