TED Talks · Civilisational risk and strategy · Spotlight · Released: 7 Jan 2025

How AI can bridge the Deaf and hearing worlds | Adam Munder

Why this matters

Auto-discovered candidate. Editorial positioning to be finalized.

Summary

Auto-discovered from TED Talks. Editorial summary pending review.

Perspective map

Mixed · Governance · Medium confidence · Transcript-informed

The amber marker shows the most Risk-forward score. The white marker shows the most Opportunity-forward score. The black marker shows the median perspective for this library item. Tap the band, a marker, or the track to open the transcript there.

The Perspective Map framework is explained in the methodology.

Episode arc by segment

Early → late · height = spectrum position · colour = band

Risk-forward · Mixed · Opportunity-forward

Each bar is tinted by where its score sits on the same strip as above (amber → cyan midpoint → white). Same lexicon as the headline. Bars are evenly spaced in transcript order (not clock time).


Across 5 full-transcript segments: median 0 · mean 0 · spread 0 (p10–p90 0–0) · 0% risk-forward, 100% mixed, 0% opportunity-forward slices.
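The headline statistics above can be reproduced from the per-slice scores. A minimal sketch, assuming a risk(−1)..opportunity(+1) axis and illustrative band cutoffs (the site's actual cutoffs and percentile method are not published here):

```python
from statistics import median, mean

def slice_summary(scores, risk_cutoff=-0.2, opp_cutoff=0.2):
    """Summarise per-slice spectrum scores the way the arc header does.

    scores: one score per transcript slice, on a risk(-1)..opportunity(+1) axis.
    The band cutoffs are illustrative assumptions, not the site's actual ones.
    """
    ordered = sorted(scores)
    n = len(ordered)
    # nearest-rank percentiles for the p10-p90 spread
    p10 = ordered[max(0, int(0.10 * (n - 1)))]
    p90 = ordered[min(n - 1, int(0.90 * (n - 1)))]
    bands = {
        "risk": sum(s < risk_cutoff for s in scores) / n,
        "mixed": sum(risk_cutoff <= s <= opp_cutoff for s in scores) / n,
        "opportunity": sum(s > opp_cutoff for s in scores) / n,
    }
    return {"median": median(scores), "mean": mean(scores),
            "p10": p10, "p90": p90, "bands": bands}
```

Five slices all scoring 0, as in this entry, give median 0, mean 0, a p10–p90 spread of 0, and 100% mixed.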

Slice bands
5 slices · p10–p90 0–0

Mixed leaning, primarily in the Governance lens. Evidence mode: interview. Confidence: medium.

  • Emphasizes governance
  • Emphasizes safety
  • Full transcript scored in 5 sequential slices (median slice 0).

Editor note

Auto-ingested from daily feed check. Review for editorial curation under intake methodology.

ai-safety · ted-talks

Play on sAIfe Hands

On-site playback is enabled when an episode-level media URL is connected. This entry currently has a show-level source URL, not an episode-level media URL.

Episode transcript

YouTube captions (TED associates this talk with a public YouTube mirror) · video ros3INVOQEU · stored Apr 10, 2026 · 123 caption segments

Captions are an imperfect primary: they can mis-hear names and technical terms. Use them alongside the audio and publisher materials when verifying claims.

No editorial assessment file yet. Add content/resources/transcript-assessments/how-ai-can-bridge-the-deaf-and-hearing-worlds-adam-munder.json when you have a listen-based summary.

Christan Hansen: Adam, wait. They need an interpreter. Now I'm ready. Go ahead. Hello, you guys need an interpreter. Thank you for briefly experiencing what it's like for me and millions of other Deaf people worldwide when we don't have an interpreter and we need to order a coffee, check into a hotel, or complete other daily tasks. What I was trying to share with you was a frustrating experience that my wife and I recently had. My wife is also Deaf and we took our daughter to a doctor's appointment. When we arrived, I'd asked for either an on-site or a virtual interpreter. Unfortunately, they told me neither were available, so I had to write back and forth with the receptionist at the front desk. To make matters worse, when the doctor came into the room, he said, "Oh, it's alright. We'll just have your daughter interpret for you." So could you imagine going to the doctor's appointment and being told by the doctor that your seven-year-old is going to relay the information to you? This was just one example of the two different worlds that I navigate. In my personal life, I raise my kids, I manage rental properties, I flip buildings, and I'm working on a coffee roastery. In this world, I interact with hearing individuals every day: teachers, lawyers, real estate agents, contractors. Yet still to this day, I have to rely on a smartphone to text back and forth or write with paper and pen to the people I interact with. Then I have the professional world where I have succeeded, in part due to having two highly qualified interpreters. I have the same degrees, the same educational background, the same job responsibilities as my hearing peers. I solve the same engineering problems in a very competitive, fast-paced environment. But the playing field isn't level. And all of my daily collaborations, all of my meetings and presentations, everything hinges on my interpreters. I am very fortunate, though. 
My employer ensures that I have access to the same information that my hearing colleagues do. Unfortunately, this is not true for many Deaf throughout the world. Interpreters are very expensive and scarce. Where I live in Arizona, there are more than 1.1 million individuals with a hearing loss. And only about 400 licensed interpreters. So there's scarcity of tools available for us, and our communication options are very limited. This puts us in a survival mode, forcing us to use the resources that are at our disposal. Writing back and forth on paper and pen, or using a smartphone to text is not equivalent to American Sign Language. The details and nuance that make us human are lost in both our personal and business conversations. So we're bringing the humanity back to these conversations. I've done that by building a platform called OmniBridge. So my team has established this bridge between the Deaf world and the hearing world. Bringing these worlds together without forcing one to adapt to the other. So we're using the power of AI to analyze thousands of signs in ASL and translate them into English. Now, thousands may seem small, but ASL is very complex. With slight nuance and changes in body language it can change the meaning of a sign. For instance, the sign "big." Or "enormous." Today, with the advancement in compute on AI PCs, we're able to run our models locally without relying on the internet, which dramatically increases accessibility. So I'd like to show you an alpha version of this in action. And I want to show you how this could have changed my interaction at the doctor appointment had this been available. I'd like to invite Hasiba on stage. She's going to pretend to be our receptionist at the doctor's office. What you're going to see on screen is a bidirectional conversation, where the software will translate my sign into the blue text, and Hasiba's spoken English will be in gray. Hasiba: Hi, how's it going? [Hi, how's it going?] (Applause) [I am good.] 
[I will like to use your technology to communicate with you.] Hasiba: Sure, we can use OmniBridge. Could you tell me the name of the patient? Sophie's my sister's name. That's so fun. What's Sophie's date of birth? Perfect. The doctor's going to be here with you shortly. [Thank you so much.] Hasiba: Thank you. (Applause and cheers) AM: So thank you. You can see that with this there's much less confusion and frustration. And the receptionist and I were able to establish a connection. The best thing of all is that my daughter just had to be the patient. So we're changing the world through the power of AI. Not just revolutionizing technology, but enhancing that human connection. My team is focused on using the AI PC, and the power of AI to humanize and include to really, truly level the playing field. It's two languages, signed and spoken, in one seamless conversation. Thank you. (Applause)

Counterbalance on this topic

Ranked with the mirror rule in the methodology: picks sit closer to the opposite side of the axis from this item's score (lens alignment preferred). Each card plots this item and the pick together.
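As described, the mirror rule amounts to ranking candidates by closeness to the reflection of this item's score across the axis midpoint, with a tiebreak bonus for a matching lens. A sketch under stated assumptions (the function name, tuple shape, and `lens_bonus` weight are hypothetical, not the site's implementation):

```python
def mirror_rank(item_score, item_lens, candidates, lens_bonus=0.25):
    """Rank counterbalance picks: closer to the opposite of item_score is better.

    item_score: position on a -1 (risk-forward) .. +1 (opportunity-forward) axis.
    candidates: iterable of (title, score, lens) tuples.
    lens_bonus: hypothetical weight for the 'lens alignment preferred' tiebreak.
    """
    target = -item_score  # reflect across the axis midpoint

    def key(cand):
        title, score, lens = cand
        distance = abs(score - target)
        # a matching lens shrinks the effective distance
        return distance - (lens_bonus if lens == item_lens else 0.0)

    return sorted(candidates, key=key)
```

For a risk-forward item at −0.5 in the Governance lens, a Governance pick at +0.5 would rank ahead of a mixed pick at +0.1, which in turn ranks ahead of a similarly risk-forward pick at −0.4.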

More from this source