
Bruce Schneier on Using Algorithms to Judge Society


Editor's note: Bruce Schneier (schneier.com) is widely regarded as a leading expert in data security. His critical analysis of our digital age is among the most valuable we have, being both well informed and oriented toward solutions. He has also been instrumental in designing data-security tools for personal use.

Replacing Judgment with Algorithms

China is considering a new “social credit” system, designed to rate everyone’s trustworthiness. Many fear that it will become a tool of social control — but in reality it has a lot in common with the algorithms and systems that score and classify us all every day.

Human judgment is being replaced by automatic algorithms, and that brings with it both enormous benefits and risks. The technology is enabling a new form of social control, sometimes deliberately and sometimes as a side effect. And as the Internet of Things ushers in an era of more sensors and more data — and more algorithms — we need to ensure that we reap the benefits while avoiding the harms.

Right now, the Chinese government is watching how companies use “social credit” scores in state-approved pilot projects. The most prominent one is Sesame Credit, and it’s much more than a financial scoring system.

Citizens are judged not only by conventional financial criteria, but by their actions and associations. Rumors abound about how this system works. Various news sites are speculating that your score will go up if you share a link from a state-sponsored news agency and go down if you post pictures of Tiananmen Square. Similarly, your score will go up if you purchase local agricultural products and down if you purchase Japanese anime. Right now the worst fears seem overblown, but could certainly come to pass in the future.

This story has spread because it’s just the sort of behavior you’d expect from the authoritarian government in China. But there’s little about the scoring systems used by Sesame Credit that’s unique to China. All of us are being categorized and judged by similar algorithms, both by companies and by governments. While the aim of these systems might not be social control, it’s often the byproduct. And if we’re not careful, the creepy results we imagine for the Chinese will be our lot as well.

Sesame Credit is largely based on a US system called FICO. That’s the system that determines your credit score. You actually have a few dozen different ones, and they determine whether you can get a mortgage, car loan or credit card, and what sorts of interest rates you’re offered. The exact algorithm is secret, but we know in general what goes into a FICO score: how much debt you have, how good you’ve been at repaying your debt, how long your credit history is and so on.
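The exact formula is proprietary, but the general shape of such a score is easy to sketch. Here is a toy illustration in Python; the factors and weights below are invented for this example and are not the real FICO model:

```python
# Toy illustration of a FICO-style credit score.
# The weights and factors are invented for this sketch;
# the real FICO formula is secret and more complex.

def toy_credit_score(debt_ratio, repayment_rate, history_years):
    """Combine a few financial factors into a single 300-850 number.

    debt_ratio:      fraction of available credit in use (0.0-1.0)
    repayment_rate:  fraction of payments made on time (0.0-1.0)
    history_years:   length of credit history, capped at 20 years
    """
    score = 300
    score += (1.0 - debt_ratio) * 200    # less debt -> higher score
    score += repayment_rate * 250        # on-time repayment dominates
    score += min(history_years, 20) * 5  # longer history helps, up to a cap
    return round(score)

print(toy_credit_score(debt_ratio=0.3, repayment_rate=0.95, history_years=10))
# -> 728: a single portable number any lender can read
```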

There’s nothing about your social network, but that might change. In August, Facebook was awarded a patent on using a borrower’s social network to help determine if he or she is a good credit risk. Basically, your creditworthiness becomes dependent on the creditworthiness of your friends. Associate with deadbeats, and you’re more likely to be judged as one.
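How such a system would actually score people is not public, but one plausible, purely hypothetical reading of the idea is that a borrower's own score gets blended with the average score of their connections:

```python
# Hypothetical sketch of social-network credit scoring, in the spirit
# of the patent described above. The blending weight is invented;
# how (or whether) such a system would really work is not public.

def social_adjusted_score(own_score, friend_scores, social_weight=0.2):
    """Blend a borrower's score with the average score of their friends."""
    if not friend_scores:
        return own_score
    friend_avg = sum(friend_scores) / len(friend_scores)
    return (1 - social_weight) * own_score + social_weight * friend_avg

# The same borrower, judged differently by their associations:
print(social_adjusted_score(700, [750, 780, 760]))  # solvent friends: ~712.7
print(social_adjusted_score(700, [450, 500, 480]))  # "deadbeat" friends: ~655.3
```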

Your associations can be used to judge you in other ways as well. It’s now common for employers to use social media sites to screen job applicants. This manual process is increasingly being outsourced and automated; companies like Social Intelligence, Evolv and First Advantage automatically process your social networking activity and provide hiring recommendations for employers. The dangers of this type of system — from discriminatory biases resulting from the data to an obsession with scores over more social measures — are too many.

The company Klout tried to make a business of measuring your online influence, hoping its proprietary system would become an industry standard used for things like hiring and giving out free product samples.

The US government is judging you as well. Your social media postings could get you on the terrorist watch list, affecting your ability to fly on an airplane and even get a job. In 2012, a British tourist’s tweet caused the US to deny him entry into the country. We know that the National Security Agency uses complex computer algorithms to sift through the Internet data it collects on both Americans and foreigners.

All of these systems, from Sesame Credit to the NSA’s secret algorithms, are made possible by computers and data. A couple of generations ago, you would apply for a home mortgage at a bank that knew you, and a bank manager would make a determination of your creditworthiness. Yes, the system was prone to all sorts of abuses, ranging from discrimination to an old-boy network of friends helping friends. But the system also couldn’t scale. It made no sense for a bank across the state to give you a loan, because they didn’t know you. Loans stayed local.

FICO scores changed that. Now, a computer crunches your credit history and produces a number. And you can take that number to any mortgage lender in the country. They don’t need to know you; your score is all they need to decide whether you’re trustworthy.

This score enabled the home mortgage, car loan, credit card and other lending industries to explode, but it brought with it other problems. People who don’t conform to the financial norm — having and using credit cards, for example — can have trouble getting loans when they need them. The automatic nature of the system enforces conformity.

The secrecy of the algorithms further pushes people toward conformity. If you are worried that the US government will classify you as a potential terrorist, you’re less likely to friend Muslims on Facebook. If you know that your Sesame Credit score is partly based on your not buying “subversive” products or being friends with dissidents, you’re more likely to overcompensate by not buying anything but the most innocuous books or corresponding with the most boring people.

Uber is an example of how this works. Passengers rate drivers and drivers rate passengers; both risk getting booted out of the system if their rankings get too low. This weeds out bad drivers and passengers, but also results in marginal people being blocked from the system, and everyone else trying to not make any special requests, avoid controversial conversation topics, and generally behave like good corporate citizens.
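The mechanics behind this are simple, which is part of why they scale so well. Here is a minimal sketch of such a mutual-rating system; the cutoff and rating window are invented for illustration, since the platform's actual deactivation rules are not public:

```python
# Minimal sketch of a mutual-rating system like the one described above.
# The 4.6 cutoff and 100-trip window are invented for illustration.

from collections import deque

class RatedUser:
    def __init__(self, window=100, cutoff=4.6):
        self.ratings = deque(maxlen=window)  # only recent trips count
        self.cutoff = cutoff

    def rate(self, stars):
        self.ratings.append(stars)

    @property
    def average(self):
        return sum(self.ratings) / len(self.ratings)

    @property
    def deactivated(self):
        # Drift below the cutoff and you're out, with no human review.
        return bool(self.ratings) and self.average < self.cutoff

driver = RatedUser()
for stars in [5, 5, 4, 5, 3, 5, 5, 2, 5, 5]:
    driver.rate(stars)
print(driver.average, driver.deactivated)  # 4.4 True
```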

Many have documented a chilling effect among American Muslims, who avoid certain discussion topics lest they be taken the wrong way. Even if nothing would come of it, their free speech has been curtailed by the secrecy surrounding government surveillance. How many of you are reluctant to Google “pressure cooker bomb”? How many are a bit worried that I used that phrase in this essay?

This is what social control looks like in the Internet age. The Cold-War-era methods of undercover agents, informants living in your neighborhood, and agents provocateurs are too labor-intensive and inefficient. These automatic algorithms make possible a wholly new way to enforce conformity. And by accepting algorithmic classification into our lives, we’re paving the way for the same sort of thing China plans to put into place.

It doesn’t have to be this way. We can get the benefits of automatic algorithmic systems while avoiding the dangers. It’s not even hard.

The first step is to make these algorithms public. Companies and governments both balk at this, fearing that people will deliberately try to game them, but the alternative is much worse.

The second step is for these systems to be subject to oversight and accountability. It’s already illegal for these algorithms to have discriminatory outcomes, even if the discrimination isn’t deliberately designed in. This concept needs to be expanded. We as a society need to understand what we expect out of the algorithms that automatically judge us and ensure that those expectations are met.

We also need to provide manual systems for people to challenge their classifications. Automatic algorithms are going to make mistakes, whether it’s by giving us bad credit scores or flagging us as terrorists. We need the ability to clear our names if this happens, through a process that restores human judgment.

Sesame Credit sounds like a dystopia because we can easily imagine how the Chinese government could use a system like this to enforce conformity and stifle dissent. Our own systems seem safer, because we don’t believe the corporations and governments that run them are malevolent. But the dangers are inherent in the technologies. As we move into a world where we are increasingly judged by algorithms, we need to ensure that they judge us fairly and properly.

This essay previously appeared on CNN.com.
http://www.cnn.com/2016/01/06/opinions/schneier-china-social-scores/index.html

China’s social credit system:
http://www.bbc.com/news/world-asia-china-34592186
http://qz.com/519737/all-chinese-citizens-now-have-a-score-based-on-how-well-we-live-and-mine-sucks
http://www.independent.co.uk/news/world/asia/china-has-made-obedience-to-the-state-a-game-a6783841.html
https://www.techinasia.com/china-citizen-scores-credit-system-orwellian
http://www.antipope.org/charlie/blog-static/2015/10/it-could-be-worse.html

Facebook’s patent:
http://money.cnn.com/2015/08/04/technology/facebook-loan-patent

Employer use of this data:
http://www.cleveland.com/business/index.ssf/2015/05/more_than_half_of_employers_no_1.html
http://www.forbes.com/sites/kashmirhill/2011/06/15/start-up-that-monitors-employees-internet-and-social-media-footprints-gets-gov-approval
http://www.datasociety.net/pubs/fow/EmploymentDiscrimination.pdf
http://www.fastcoexist.com/3036990/4-ways-using-big-data-to-hire-people-could-totally-backfire

How social media posts can land you on the terrorist watch list:
https://theintercept.com/2014/07/23/blacklisted
http://newsfeed.time.com/2012/01/31/british-tourists-tweets-get-them-denied-entry-to-the-u-s

Algorithmic secrecy:
https://aeon.co/essays/judge-jury-and-executioner-the-unaccountable-algorithm

Chilling effect of surveillance:
http://www.theverge.com/2013/3/11/4091282/muslim-new-yorkers-describe-nypd-surveillance-effects

The necessity of oversight and accountability:
http://towcenter.org/research/algorithmic-accountability-on-the-investigation-of-black-boxes-2
