Your smartphone can see, hear, and speak—even if you can’t. So it occurred to the engineers at Apple and Microsoft: Could the phone be a talking companion for people with low vision, describing the world around them?
Today, it can. Thanks to some heavy doses of machine learning and augmented reality, these companies’ apps can identify things, scenes, money, colors, text, and even people (“30-year-old man with brown hair, smiling, holding a laptop—probably Stuart”)—and then speak, in words, what’s in front of you, in a photo or in the real world. In this episode, the creators of these astonishing features reveal how they turned the smartphone into a professional personal describer—and why they care so deeply about making it all work.
Guests: Satya Nadella, Microsoft CEO. Saqib Shaikh, project lead for Microsoft’s Seeing AI app. Jenny Lay-Flurrie, Chief Accessibility Officer, Microsoft. Ryan Dour, accessibility engineer, Apple. Chris Fleizach, Mobile Accessibility Engineering Lead, Apple. Sarah Herrlinger, Senior Director of Global Accessibility, Apple.