Technology & the Future · 14 min read
Weapons of Math Destruction
by Cathy O’Neil
How Big Data Increases Inequality and Threatens Democracy
Published: August 22, 2017
4.1 (184 ratings)
Table of Contents
1. What's in it for me? See math from a completely new perspective
2. Algorithms have the potential to sway the voting public and disrupt democracy
3. Algorithms designed to predict crime also reinforce prejudices
4. Insurance companies are exploiting people with bad credit
5. The job market is also being unfairly influenced by algorithms
6. University rankings have negative effects on higher education
7. Final summary
Book Summary
This is a comprehensive summary of “Weapons of Math Destruction” by Cathy O’Neil. The book explores how big data increases inequality and threatens democracy.
what’s in it for me? see math from a completely new perspective.#
Introduction
Maybe you've heard of big data and how algorithms using that data are providing new insights into consumer patterns, politics, and social media platforms.
Indeed, algorithms are everywhere.
They guide our social media feeds and filter the advertisements we see, and they influence human life in many other ways as well.
Often, they now dictate which jobs and schools we have access to.
You might think that decisions about hiring and admissions would be much fairer if based on objective calculations rather than on someone's gut feeling.
After all, algorithms judge everyone on the same scale, right?
Well, as you'll learn in these chapters, the situation is a bit more complex than that.
Chapter 1: Algorithms have the potential to sway the voting public and disrupt democracy
In many ways, the internet helps democracy.
It's a public platform that supports independent voices, but that same platform is also open to powerful propaganda machines that can manipulate the conversation.
Research has shown that social media and search engines are especially vulnerable to algorithms that can influence the decisions of unsuspecting users.
Researchers Robert Epstein and Ronald Robertson found proof of this after asking undecided voters in the United States and India to find information about a handful of political candidates.
The catch was that the voters were told to use a specific search engine, unaware that it had been programmed with an algorithm that favored one candidate over the others.
As a result, the participants showed a 20 percent shift toward voting for the algorithm's preferred candidate.
A similar study took place on Facebook just prior to the 2012 elections.
Solomon Messing of the Pew Research Center designed an algorithm that altered the news feeds of two million users, prioritizing political content over all other posts.
Facebook surveyed the participants before and after the elections, and turnout among them was three percent higher than had been expected before the algorithm was adjusted to favor politics.
While we can't know for sure whether certain search engine or social media algorithms are designed to influence users, it's clear that the potential for abuse is vast.
It is also clear that political candidates are well aware of their power to garner votes.
Heading into the 2012 elections, Obama had a team of data analysts who interviewed thousands of voters and used their answers, along with demographic and consumer data, to create mathematical profiles.
These profiles were then used to find similar people in national databases.
The assumption was that people with similar interests and backgrounds would also share the same political views.
Once people with similar data were grouped together, the analysts could create algorithms to make sure each group received ads tailored to its tastes.
Those who showed evidence of environmental concerns, for instance, were targeted with ads highlighting Obama's environmental policies.
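The profiling-and-matching step described above can be sketched in a few lines. This is a hypothetical illustration, not the campaign's actual system: the voter names, the three issue-interest features, and the 0.9 similarity threshold are all invented for the example.

```python
# Hypothetical sketch (invented names and scores) of "lookalike" voter
# targeting: each voter is reduced to a vector of issue-interest scores,
# and anyone close to a known supporter's profile gets that supporter's ad.
from math import sqrt

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Feature order: interest in [environment, economy, healthcare].
profiled_supporter = [0.9, 0.2, 0.4]   # known to respond to environmental ads

database = {
    "voter_a": [0.8, 0.3, 0.5],
    "voter_b": [0.1, 0.9, 0.2],
}

for name, features in database.items():
    if cosine_similarity(features, profiled_supporter) > 0.9:
        print(name, "-> show environmental-policy ad")   # only voter_a matches
```

The point of the sketch is that no individual opinion is ever asked for: proximity in feature space alone decides which message a voter sees.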
Chapter 2: Algorithms designed to predict crime also reinforce prejudices
Predicting future crimes sounds like something out of a Philip K. Dick science fiction story, but it is, in fact, part of today's reality.
Police departments are using algorithms to target prospective criminals.
But this software is far from perfect, and the algorithms have led to cities being unevenly policed and certain people being unfairly singled out.
How has this happened?
The algorithms rely on historical data to pinpoint where crimes are most likely to occur, and it's the police who determine which data are fed into the algorithm.
Part of the problem is that the police tend to focus on specific kinds of crimes, such as nuisance crimes, which include vagrancy and certain drug-related offenses.
Given that crimes like these tend to occur in poor neighborhoods, the analysis ends up heavily skewed toward those parts of the city.
As a result, the police send the majority of their patrol units to the streets of poor neighborhoods, making residents feel unfairly targeted.
This also leads to neglect of wealthier neighborhoods, which become more vulnerable to criminal activity.
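The self-reinforcing loop described here can be made concrete with a toy simulation. Everything in it is invented for illustration (the neighborhood names, the 60/40 starting counts, the winner-takes-all patrol rule); it is not how any real predictive-policing product works, but it shows why feeding arrest records back into patrol decisions amplifies an initial skew.

```python
# A toy simulation (not any vendor's actual model) of the feedback loop:
# patrols are sent where past arrests were recorded, patrols generate new
# recorded arrests there, and the skewed data draws still more patrols.
def allocate_patrols(arrest_counts, total_patrols=10):
    # Concentrate every patrol in the current statistical "hotspot".
    hotspot = max(arrest_counts, key=arrest_counts.get)
    return {n: total_patrols if n == hotspot else 0 for n in arrest_counts}

def simulate(arrests, rounds=3, arrests_per_patrol=2):
    # Each patrol records new nuisance arrests wherever it is stationed,
    # regardless of the true underlying crime rate.
    for _ in range(rounds):
        for n, patrols in allocate_patrols(arrests).items():
            arrests[n] += patrols * arrests_per_patrol
    return arrests

history = {"poor_neighborhood": 60, "wealthy_neighborhood": 40}
print(simulate(history))
# The initial 60/40 gap in *recorded* arrests widens every round, even
# though nothing about actual crime has changed.
```

Note that the model is "working" by its own metric: every round, the data confirms that the hotspot is where the arrests are.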
Similar built-in biases skew the data that police use to predict potential violent crimes as well, leading to innocent people being labeled as dangerous.
In 2009, the Chicago Police Department received a grant to develop new crime prediction software.
They used that money to build an algorithm that produced a list of the 400 people most likely to be involved in a homicide.
One of those people was Robert McDaniel, a 22-year-old who became the focus of police attention.
One day in 2013, a police officer even visited McDaniel's home to let him know that the police had their eyes on him.
Yet McDaniel had never been charged with any crime.
He was red-flagged by the algorithm solely on the basis of the people he followed on social networks and the criminals who happened to live in his neighborhood.
In short, growing up in a poor neighborhood is all it takes to get you labeled as potentially dangerous.
To be fair, crime prediction algorithms are designed to protect people, but they can very easily make people's lives worse than they were before.
As we'll see in the next chapter, a similar problem is plaguing the insurance business.
Chapter 3: Insurance companies are exploiting people with bad credit
If you're familiar with insurance agencies, you might be aware that they ask different clients to pay different premiums.
And no, they don't do this at random; they use the specific data they've collected on their customers.
For car insurance, algorithms calculate payment amounts based on how many prior accidents a customer has been in, as well as their credit reports.
In fact, in some areas, those credit reports are given more weight than a customer's driving record.
Such is the case in Florida, where adults with clean driving records and poor credit end up paying an average of $1,552 more per year than drivers with excellent credit and a history of drunk driving.
The result is that low-income drivers with impeccable records pay more for insurance than wealthy drivers with dangerous ones.
And so begins a vicious cycle.
Forced to pay more for insurance, cash-strapped families are more likely to miss a payment on another bill, worsening their credit scores even further.
Then, when their current insurance expires, the rate on the next contract climbs higher still, even if they haven't broken a single traffic rule.
Some insurance companies even use algorithms to calculate the likelihood that a customer will shop around for cheaper prices.
The insurance company Allstate does this with a model that draws on consumer and demographic data.
If the algorithm suggests that a customer is likely to search for lower prices, the company will offer them a discount, sometimes as large as 90 percent off the average rate.
If a customer is unlikely to shop around, however, their rate can increase by as much as 800 percent.
What Allstate's algorithm is really doing is taking advantage of poor people without formal education, since this is the demographic least likely to shop around for other options.
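The pricing logic just described, often called "price optimization," can be sketched as follows. The base rate, the probability thresholds, and the multipliers are illustrative numbers chosen to match the 90-percent-off and 800-percent figures above; this is not Allstate's actual model.

```python
# Hypothetical sketch of "price optimization" (illustrative numbers, not
# Allstate's actual model): the quoted premium depends on a predicted
# probability of comparison shopping rather than on driving risk.
BASE_RATE = 1000.0  # average annual premium, in dollars

def quote(shop_around_probability):
    if shop_around_probability > 0.8:
        return BASE_RATE * 0.10   # up to 90% off to retain likely shoppers
    if shop_around_probability < 0.2:
        return BASE_RATE * 9.0    # an 800% increase for captive customers
    return BASE_RATE

print(quote(0.9))   # likely shopper pays 100.0
print(quote(0.1))   # captive customer pays 9000.0
```

Notice that driving history appears nowhere in the function: the only input is the customer's predicted willingness to leave.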
Chapter 4: The job market is also being unfairly influenced by algorithms
It can be difficult to spot the best worker in a pool of hundreds of applicants, so it makes sense to use a variety of tests, with the help of data companies, to sort through the results.
But these tests have proven restrictive for certain kinds of people, especially personality tests, which made it next to impossible for someone like Kyle Behm to get a job.
Behm had to drop out of his classes at Vanderbilt University to get treatment for his bipolar disorder, but by 2012 he was healthy enough to start looking for a part-time job.
So he applied to Kroger, a supermarket chain, after a friend told him there was an open position.
When he was turned down, he checked with his friend, who told him that he'd been red-lighted because of the results of his personality test.
The algorithm had tagged Behm as likely to underperform.
Unfortunately, the same thing happened to Behm at all the other minimum-wage jobs he applied for.
So, with the help of his father, he filed a lawsuit against seven different companies under the Americans with Disabilities Act.
As of 2016, the suit was still pending.
Part of the problem is that the companies handling the data can make troubling mistakes.
When Catherine Taylor applied for a job with the Red Cross in Arkansas, she was rejected and told it was due to her criminal charge for intent to manufacture and sell methamphetamine.
This seemed odd to Taylor, since she had a clean record.
When she investigated further, she found that the charges belonged to another Catherine Taylor who happened to have the same birthday.
She also discovered that the company providing the data to the Red Cross had made the mistake, which prompted her to do more research.
In the end, Taylor found that at least ten data brokers had made the same error, linking her to a serious crime she'd never committed.
Chapter 5: University rankings have negative effects on higher education
It's no secret that colleges in the United States have gotten quite expensive over the past 30 years.
But few people know that one of the main drivers of this increase in tuition is a single publication.
In the 1980s, U.S. News & World Report began using an algorithm that ranked the quality of U.S. colleges using data the publication believed would determine their success, such as SAT scores and acceptance rates.
Suddenly, these rankings became crucially important for all the universities involved, and they set out to improve their performance in the areas the U.S. News algorithm measured.
But to do that, they needed resources.
This scramble for money is largely responsible for tuition going through the roof: between 1985 and 2013, the cost of higher education increased by 500 percent.
The rankings weren't the only factor behind this increase, but they certainly encouraged schools to raise their prices.
One of the most damaging things U.S. News did was to include acceptance rates in its formula, because it ruined the concept of the safety school.
Traditionally, a safety school was a college with a high acceptance rate that served as a good backup plan for a student also applying to a more prestigious school like Harvard or Yale.
But since U.S. News gave schools with lower acceptance rates a better position in the rankings, many schools began lowering their rates and sending out fewer acceptance letters.
To keep their actual enrollment numbers the same, they had to choose which students to reject.
Looking at their numbers, the safety schools could see that only a small percentage of top students would choose them over the prestigious schools, so they believed rejecting those students would do no harm.
But even if only a few of those high performers had chosen to attend, the school would have benefited.
Worse, the decision to reject high performers out of hand ruined the backup plans of many good students.
Like all the other algorithms we've looked at, what started out as a good idea ended up doing far more harm than good.
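The arithmetic behind this "yield protection" is simple enough to show directly. The applicant counts and predicted yields below are invented for illustration; they only demonstrate why fewer, better-predicted offers make the ranking metric look better while filling the same seats.

```python
# Toy arithmetic (invented numbers) for why an acceptance-rate metric
# rewards "yield protection": making offers only to applicants predicted
# likely to enroll fills the same 1,000 seats with fewer acceptance letters.
def acceptance_rate(offers, applicants):
    return offers / applicants

applicants = 10_000

# Naive policy: admit the strongest 2,000 and expect half to enroll.
naive_offers = 2_000        # predicted yield: 50%  -> 1,000 enroll

# Yield-protected policy: skip top applicants likely to go elsewhere and
# admit likelier enrollees instead; a ~71% yield fills the seats too.
protected_offers = 1_400    # predicted yield: ~71% -> 1,000 enroll

print(f"naive: {acceptance_rate(naive_offers, applicants):.0%}")
print(f"yield-protected: {acceptance_rate(protected_offers, applicants):.0%}")
# Same entering class, but the ranking formula sees 14% instead of 20%.
```

The students paying the price are exactly the strong applicants the school predicted would decline: they never receive the backup offer at all.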
Final summary
Conclusion
Thank you for listening to our summary of Cathy O'Neil's book, Weapons of Math Destruction.
The key message of O'Neil's book is that algorithms were initially created to be neutral and fair, avoiding all-too-human biases and faulty logic.
However, many of the algorithms used today, from the insurance market to the justice system, have absorbed the very prejudices and misconceptions of their designers.
And because these algorithms operate on a massive scale, those biases lead to millions of unfair decisions.
To minimize your chances of becoming another algorithm casualty in the quest to stand out from the job-seeking horde, the author suggests the following tactic: write machine-friendly résumés.
Most companies today use automatic résumé readers, so to increase your chances of getting the job, modify your résumé with the automatic reader in mind.
Here are some simple tips you can always apply.
Use simple fonts like Arial and Courier.
Stay away from images, which can't be processed by the reader.
And don't use symbols; even simple ones, like arrows, can confuse the reader.
If you have any feedback to spare, we'd love to hear your opinion of our content.
Just drop an email to remember@summarybook.org with Weapons of Math Destruction as the subject line and let us know how we're doing.
You Might Also Like
Discover more book summaries in the same category or by the same author.
The Singularity Is Near by Ray Kurzweil · Technology & the Future · 24 min read · 4.4
The Master Algorithm by Pedro Domingos · Technology & the Future · 21 min read · 4.4
The Science and Technology of Growing Young by Sergey Young · Technology & the Future · 24 min read · 4.4