What happens when AI meets your brain data? In Episode 18 of AI in NYC, Rob May and Anna Kirk sit down with Kristen Mathews, cyber/data/privacy partner at Cooley LLP with nearly 30 years of experience, who has carved out a fascinating niche at the intersection of privacy law and neurotechnology. Kristen breaks down what neurotech actually is — from invasive brain implants to consumer wearable headbands — and explains how AI has been the key catalyst turning a century of brain signal data into actionable, decoded information.
The conversation dives deep into the different categories of neurotech, including how devices can not only read brain activity but also stimulate it — with real applications like predicting seizures 20 minutes before they happen and suppressing them with electrical pulses. Rob shares his firsthand experience from sitting on the board of a neurotech company, while Kristen paints a vivid picture of the current landscape, including New York City's role as a major hub for the neurotech community.
Perhaps the most thought-provoking segment explores the ethical frontier: the difference between decoding 'intended speech' (helping ALS patients communicate) and 'inner speech' (your private thoughts). Where's the line? Can AI tell the difference? Kristen is refreshingly honest about what we don't yet know, while emphasizing that every neurotech application she's seen in practice today is being used for good. This episode is essential listening for anyone interested in AI, privacy, the future of brain-computer interfaces, and why the next big privacy debate may be about your thoughts.
Relevant links:
www.nytimes.com/2003/06/22/magazine/savant-for-a-day.html
https://icaot.org/jose-delgado-a-controversial-trailblazer-inneuromodulation/
Kristen Mathews on LinkedIn: https://www.linkedin.com/in/kristen-mathews-6025257
Download BePresent: https://apps.apple.com/us/app/bepresent-screen-time-control/id1644737181