The Algorithms Controlling Our World - Episode 8

We’re being scored with secret formulas we don’t understand, with no avenue of appeal if things don’t go our way…

Data scientists are writing code that learns from our every move, click, post and payment, and it’s tangibly changing the future of our tech-driven world. 

On the one hand, we’re served posts tailored to preferences we may not know we had. On the other hand, mortgages, insurance, jail sentences and job interviews are increasingly at the mercy of code - and it’s not free from bias.

In this week’s Disconnected, we’re uncovering the fundamental issues and historical bias that permeate the systems controlling what you can and can’t do, and why we should be demanding accountability from the tech titans behind it all.

Listen on: Apple | Spotify | Amazon Music | Google

This episode of Disconnected covers:

  • How algorithms correlate and analyse your every move

  • The 2 key ingredients behind every predictive algorithm

  • Who loses out when unconscious biases are carried forward in data

Episode Highlights:

“We’re being scored with secret formulas that we don’t understand, and there’s usually no avenue of appeal if things don’t go our way. If the computer says no to your mortgage application or to shortlisting your CV, if it says you don’t deserve as high a credit limit as your neighbour, who can you complain to…?” - 2:45 - Jag Sharma

“Algorithms correlate what you do with what almost everyone else has done. The algorithms don't really understand you, but there is power in numbers, especially in large numbers.” - 4:25 - Jag Sharma
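That “power in numbers” idea can be sketched as nearest-neighbour recommendation: the system never models you in particular, it just finds the people whose history most overlaps with yours and serves you what they did next. A minimal illustrative sketch in Python (all data hypothetical, not the algorithm of any specific platform):

```python
import numpy as np

# Rows = other users, columns = items; 1 = clicked/liked, 0 = not.
# All data here is hypothetical.
everyone_else = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
])
you = np.array([1, 1, 0, 0])

# Correlate your history with everyone else's: count the overlap.
similarity = everyone_else @ you          # -> [2, 2, 1]
nearest = everyone_else[similarity.argmax()]

# Recommend what your closest match did that you haven't done yet.
recommendations = np.where((nearest == 1) & (you == 0))[0]
print(recommendations)                    # -> [3]
```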

“To build a predictive algorithm you just need two things: past data, and a definition of success, which the algorithm uses together with the past data to decide what good looks like going forwards. From this we know that algorithms are opinions embedded in code, because one person decides what that definition of success is.” - 7:00 - Jag Sharma
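Those two ingredients are easy to see in code. Below is a minimal sketch, assuming scikit-learn and entirely hypothetical lending data; the point is that the “success” labels are a human choice, which is exactly where the opinion gets embedded:

```python
from sklearn.linear_model import LogisticRegression

# Ingredient 1: past data - features describing historical applicants
# (hypothetical: income in £k, number of missed payments).
past_data = [
    [30, 3],
    [75, 0],
    [22, 5],
    [60, 1],
]

# Ingredient 2: a definition of success - here, "repaid the loan".
# A person chose this label; choosing "most profitable to the lender"
# instead would yield a different algorithm from the same past data.
success = [0, 1, 0, 1]

model = LogisticRegression(max_iter=1000).fit(past_data, success)

# The model now decides what "good" looks like for new applicants,
# using nothing but the past data and that chosen definition.
print(model.predict([[55, 1]]))
```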

“It might be difficult to get our heads around the idea that machines have carried the biases of human culture from the past and use them when assessing humans, but the examples are many. Some seem harmless, but some far less so.” - 8:40 - Jag Sharma

“What you put in is what you get out. So if you take a random thousand photos of humans from around the web and feed them to a saliency algorithm, the results are probably going to be skewed in favour of the majority. When someone collects data for an algorithm’s training dataset, they can be motivated by the things we humans are often motivated by - convenience and cost - and end up with data that lacks diversity.” - 12:55 - Jag Sharma

“Human subjectivity in building algorithms has hindered progress in the workplace too. If the data fed in looks at past successful hires, then predicting future successful hires is going to skew one way. This of course doesn’t just affect minorities; women suffer too.” - 15:20 - Jag Sharma
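A deliberately simplified, hypothetical sketch of that mechanism: if the historical “hired” labels already encode a skew, a model trained on them reproduces the skew even for equally qualified candidates:

```python
from sklearn.tree import DecisionTreeClassifier

# Hypothetical features: [years_experience, cv_mentions_womens_club]
# Labels: whether the candidate was hired *in the past*. If past hiring
# under-selected women, that pattern is baked into the labels.
candidates = [
    [5, 0], [3, 0], [6, 0], [4, 0],
    [5, 1], [6, 1], [7, 1], [4, 1],
]
hired = [1, 1, 1, 1, 0, 0, 0, 0]  # skewed historical outcomes

model = DecisionTreeClassifier().fit(candidates, hired)

# Two equally experienced candidates get different predictions purely
# because the model has learned the historical skew, not merit.
print(model.predict([[5, 0], [5, 1]]))  # -> [1, 0]
```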

“There’s a brilliant quote from the author and Princeton University professor Ruha Benjamin: ‘Robots are not sentient beings, sure. But racism flourishes well beyond hate-filled hearts. No malice needed, no N-word required, just lack of concern for how the past shapes the present.’ Consider the challenges that women and disabled people face as a result of algorithmic bias, because of the human subjectivity in the data fed into training these algorithms.” - 16:15 - Jag Sharma

“Data scientists should not be the arbiters of truth. They should be translators of ethical discussions in larger society. And for the rest of us, this is not a maths exam; this is a fight for the future of how our tech-driven world will be shaped.” - 17:30 - Jag Sharma


Links & references:

Jag Sharma: 

https://www.linkedin.com/in/jagsharma

https://www.instagram.com/jagsharma/
