113. Yaron Singer - Catching edge cases in AI

Towards Data Science - A podcast by The TDS team

It’s no secret that AI systems are being used in more and more high-stakes applications. As AI eats the world, it’s becoming critical to ensure that AI systems behave robustly: that they don’t get thrown off by unusual inputs and start spitting out harmful predictions or recommending dangerous courses of action. If we’re going to have AI drive us to work, or decide who gets bank loans and who doesn’t, we’d better be confident that our AI systems aren’t going to fail because of a freak blizzard, or because some intern missed a minus sign.

We’re now past the point where companies can afford to treat AI development like a glorified Kaggle competition, in which the only thing that matters is how well models perform on a test set. AI-powered screw-ups aren’t always life-or-death issues, but they can harm real users and cause brand damage to companies that fail to anticipate them. Fortunately, AI risk is starting to get more attention these days, and new companies, like Robust Intelligence, are stepping up to develop strategies that anticipate AI failures and mitigate their effects.

Joining me for this episode of the podcast was Yaron Singer, a former Googler, professor of computer science and applied math at Harvard, and now CEO and co-founder of Robust Intelligence. Yaron has the rare combination of theoretical and engineering expertise required to understand AI risk, and the product intuition to know how to build that understanding into solutions that can help developers and companies deal with it.

---

Intro music:

➞ Artist: Ron Gelinas
➞ Track Title: Daybreak Chill Blend (original mix)
➞ Link to Track: https://youtu.be/d8Y2sKIgFWc

---

Chapters:

0:00 Intro
2:30 Journey into AI risk
5:20 Guarantees of AI systems
11:00 Testing as a solution
15:20 Generality and software versus custom work
18:55 Consistency across model types
24:40 Different model failures
30:25 Levels of responsibility
35:00 Wrap-up