Episode 1: Autocorrect & Predictive Text

We use them every day, but are autocorrect and predictive text doing us any favors? We explore how we use them, and how they might influence us. We also share our personal autocorrect-gone-wrong stories.

Rob also pitches a weekly segment: things that blow our minds.

UPDATE: 26 Feb 2021

A listener contacted us and suggested we listen to the 99% Invisible podcast episode on the Enron Email Corpus. That episode shines a light on how Siri, autocorrect, predictive text, and many other technologies we use today were influenced by the corporate email collected from Enron back in 2001. This was news to both of us and really interesting to listen to.

After some conversations with colleagues, we discovered two recommended and related reads for anyone interested in digging into the ethical issues wrapped up in the development of algorithms and artificial intelligence:

Technically Wrong, by Sara Wachter-Boettcher

Algorithms of Oppression, by Safiya Umoja Noble