Episodes
Friday Nov 29, 2024
Media, Technology & Culture 09 (3rd Edition): Predictive Technologies
There is now widespread awareness of, suspicion about, and even opposition to the notion that computers seem to think. Or, if not think, at least learn things and then make decisions without our intervention, or indeed without us even knowing about it. Mysterious entities with names like ‘algorithms’, ‘bots’ and increasingly ‘AI’ seem to be making more and more decisions for us: around welfare payment claims, the fastest travel route at a given moment, what shopping coupons are made available to you, or the neighbourhoods police patrol. These entities are also pervasive in media and communications. They help inform what movies you watch, the posts you see in your social media feeds, the way a matchmaking website pairs you with others, the overall summary you might draw from a search query, or what your music streaming over the past year reveals about your cultural taste. Despite a more recent tendency to label these and other developments as ‘AI’, many scholars – not just in critical media studies, but in fields like computer science – are keen to remind us that this is not intelligence, per se. Instead, what we are seeing are mimicries of intelligence, which are in fact advanced forms of statistical prediction based on enormous amounts of collected data, both personal and environmental. These reminders are helpful, though they still leave murky how all of this happens. All this computational decision making, with its capacities for deep learning: it’s all so hidden, so obscure. In this episode, we think about the growing role of predictive technologies in shaping contemporary media cultures, from the early rise of apps and personalised ‘filter bubbles’ to the rather ordinary recommendation systems we rely on today. We also grapple with growing concerns about how deep structural biases around race, class, gender and sexuality are embedded into and reinforced by the way algorithms – such as those enabling facial recognition technologies – actually work. But we will also ask: is the adequate political response simply to roll up our sleeves, pry these predictive black boxes open, reveal their internal biases, and perhaps correct them? Or is it that we instead need to better understand the problematic social and cultural conditions from which these predictive technologies sprout, get nurtured and grow?
Thinkers Discussed: Chris Anderson and Michael Wolff (The Web is Dead: Long Live the Internet); Eli Pariser (The Filter Bubble: What the Internet is Hiding From You); Murray Shanahan (Talking about Large Language Models); Ajay Agrawal, Joshua Gans and Avi Goldfarb (Prediction Machines: The Simple Economics of Artificial Intelligence); Blake Hallinan and Ted Striphas (Recommended for You: The Netflix Prize and the Production of Algorithmic Culture); Raymond Williams (Keywords); Daniela Varela Martinez and Anne Kaun (The Netflix Experience: A User-Focused Approach to the Netflix Recommendation Algorithm); Safiya Umoja Noble (Algorithms of Oppression: How Search Engines Reinforce Racism); Ruha Benjamin (Race After Technology: Abolitionist Tools for the New Jim Code); Fabio Chiusi (Automating Society); Axel Bruns (Are Filter Bubbles Real?); Frank Pasquale (The Black Box Society: The Secret Algorithms That Control Money and Information); Taina Bucher (If...Then: Algorithmic Power and Politics); Donna Haraway (Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective); Kate Crawford (Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence).