
Germany edges toward Chinese-style rating of citizens

by Futures Centre, Mar 12

What are you worth? Today your answer might be based on what you earn, the price of your belongings or how much your family loves you. But in the near future, you may well be valued in a different way altogether – and your online behavior most certainly will be.

“We live in a world where judgment is being replaced by numbers – by scores that calculate the value of a human being, with the help of algorithms,” says Gerd Gigerenzer, director of the Harding Center for Risk Literacy at the Max Planck Institute for Human Development in Berlin. Mr. Gigerenzer also heads a council of experts on consumer affairs that advises the German Ministry of Justice. The panel is currently working on a report on this topic.

Being assessed by algorithms is advancing in China, where an experiment in what is being called a “social credit system” is underway, says the psychology professor and behavioral sciences specialist. The Chinese system, which uses a combination of mass surveillance and big data to score citizens, is currently voluntary, but it will become mandatory by 2020. At that stage, every citizen and company will be ranked whether they like it or not.

“If you don’t adhere to social conventions, if you search for the wrong website, if you buy too many video games, if you cross the road on a red light or even if you have friends with a low score, then your own score will fall,” Mr. Gigerenzer told Handelsblatt’s sister publication, Der Tagesspiegel.

“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place” – Eric Schmidt, former executive chairman, Google

In the various experimental versions of the system – rolled out by local governments and selected companies – millions of citizens with poor ratings are already labeled as “not qualified” to book flights or high-speed train tickets, let alone get a loan from the bank. The potential implications could be worse still. “Should your score fall too much, your children won’t be able to go to the better schools and many other limitations will apply,” the expert warns. A system like that ensures self-censorship even within families, he says, predicting that the plan is bound to be successful.

It sounds like some drastic, Orwellian version of the future. But as Mr. Gigerenzer notes, we in western countries are not as far from such a system as we might like to think. In Germany, he points out, it starts with the universal credit rating system known as Schufa. Very much like its US counterpart FICO, Schufa is a private company that assesses the creditworthiness of about three-quarters of all Germans and over 5 million companies in the country. Anyone wanting to rent a house or borrow money in Germany is required to produce their Schufa rating – or their FICO score in the US. Additionally, factors like “geo-scoring” can lower your overall grade if you happen to live in a low-rent neighborhood, or even if many of your neighbors have bad credit ratings.
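Neither Schufa nor FICO discloses its exact formula, but the mechanics of this kind of weighted scoring, including the “geo-scoring” penalty described above, can be illustrated with a toy model in Python. Every factor, weight and number below is invented for this sketch and does not reflect how either company actually calculates its ratings.

```python
# Toy weighted credit score. Every factor, weight and number here is
# invented for illustration; it does NOT reflect how Schufa or FICO
# actually calculate their ratings.

HYPOTHETICAL_WEIGHTS = {
    "payment_history": 0.40,            # share of bills paid on time
    "credit_utilization": 0.25,         # how much available credit is used up
    "account_age": 0.20,                # length of credit history
    "neighborhood_default_rate": 0.15,  # the "geo-scoring" component
}

def toy_score(payment_history, credit_utilization, account_age_years,
              neighborhood_default_rate):
    """Return a score from 0 to 100; all inputs are fractions in [0, 1],
    except account_age_years, which is capped at 10 years."""
    features = {
        "payment_history": payment_history,
        "credit_utilization": 1.0 - credit_utilization,   # less debt is better
        "account_age": min(account_age_years / 10.0, 1.0),
        "neighborhood_default_rate": 1.0 - neighborhood_default_rate,
    }
    return 100 * sum(HYPOTHETICAL_WEIGHTS[k] * v for k, v in features.items())

# The same person, with identical payment behavior, scores lower simply
# because many neighbors have defaulted on their debts.
print(toy_score(0.95, 0.30, 8, 0.02))   # roughly 86
print(toy_score(0.95, 0.30, 8, 0.40))   # roughly 80
```

The point of the sketch is only that a single opaque number can fall for reasons that have nothing to do with a person’s own behavior.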

In other areas, German health insurers will offer you lower premiums if you don’t get sick as often. They may offer you even better premiums if you share data from your fitness-tracking device to show you’re doing your part to stay healthy. Anyone using websites like Amazon, eBay or Airbnb is asked to rate others and is rated in turn. Those who try to avoid being rated are looked at askance. An increasing number of consumers will be denied certain services or, say, mortgages if they cannot present some kind of rating.

“We are in a crucial phase in which we need to have a discussion about our values,” Mr. Gigerenzer argues. “Do we want to go on like this? Should we keep scoring people in areas like finance, health, criminality, rental housing, mail-order businesses and so on? And if you answered yes to that, then here is the next question: Should we allow all of the data gathered to be brought together, so we can come up with a total score for every citizen?”

In China, the answer is a resounding yes, and yes. Beijing is selling the idea of a social credit system as a way to increase trust and combat crime and corruption. If an institution or individual has a good score, then you can have confidence in them, Chinese authorities trumpet.

And it seems to work like a charm. Citizens welcome the fact that the scoring system will help them tell whether the people around them are trustworthy or not, the behavioral scientist says. “And I believe that even in Germany there would be a certain group who would support the idea of ‘digitally transparent’ people.”

Mr. Gigerenzer points out that Eric Schmidt, a former executive chairman of Google, controversially talked about “the right to know” back in 2009. The entrepreneur, who recently turned technical adviser to the global giant, is a staunch advocate of “full transparency.”

“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place,” Mr. Schmidt said.

“I see a remarkable similarity between Mr. Schmidt and the Chinese government,” Mr. Gigerenzer notes, adding that Baidu, the Chinese equivalent of Google, is already working with Beijing’s fledgling social credit system.

Then again, even if you do like the idea of a social credit system, Mr. Gigerenzer thinks that the technology involved is not up to the job yet. Artificial intelligence might be doing very well with games like chess, or in other situations with well-defined parameters, he says. “But the situation looks totally different when we are talking about real-world situations with many uncertainties.”

He provides two examples. The first is Google Flu Trends, a program launched with great fanfare in 2008 that tried to work out how prevalent flu cases were, and where, by aggregating the searches Google users made for flu symptoms. But after the software badly misjudged the peak of the 2013 flu season, the program was quietly shelved.
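The idea behind Google Flu Trends was, in essence, statistical nowcasting: fit a model that maps the frequency of flu-related searches onto officially reported flu rates, then use current search volumes to estimate this week’s rate before the official numbers arrive. A minimal sketch of that approach, with invented figures rather than any real Google or health-authority data, might look like this (Python 3.10+):

```python
# Minimal sketch of search-query "nowcasting" in the spirit of Google Flu
# Trends. All figures are invented; the real system used enormous volumes
# of query data and a far more elaborate model.
from statistics import linear_regression  # Python 3.10+

# Past weeks: (fraction of searches that were flu-related, official flu rate in %)
history = [
    (0.010, 1.2), (0.015, 1.8), (0.022, 2.6),
    (0.030, 3.5), (0.041, 4.9), (0.055, 6.4),
]
query_share = [q for q, _ in history]
flu_rate = [r for _, r in history]

slope, intercept = linear_regression(query_share, flu_rate)

# "Nowcast" the current week from search volume alone. The model silently
# assumes the relationship between searching and being ill stays stable,
# which is exactly the assumption that broke down in the 2013 flu season.
this_week = 0.060
print(f"estimated flu rate: {slope * this_week + intercept:.1f}%")
```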

Secondly, Mr. Gigerenzer mentions COMPAS, a recidivism-prediction algorithm used in various US states. The tool was developed to help judges with sentencing by looking at defendants’ criminal histories and predicting how likely they were to reoffend. But further research found that the risk-assessment algorithm was wrong in over a third of the cases, as well as racially biased. In recent experiments, ordinary people with no experience in the field did a better job of predicting whether offenders would reoffend than the algorithm did.
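The criticism Mr. Gigerenzer cites rests on two simple measurements of the tool’s output: overall accuracy, and how often people who never went on to reoffend were nevertheless flagged as high risk in each demographic group. The sketch below runs those two checks on a handful of fabricated records, not on the real COMPAS data:

```python
# Two checks behind the COMPAS criticism: overall accuracy and the
# false-positive rate per group. The records are fabricated; the real
# analyses used actual court records.

# Each record: (predicted_high_risk, actually_reoffended, group)
records = [
    (True,  True,  "A"), (True,  True,  "A"), (True,  False, "A"),
    (True,  False, "A"), (False, False, "A"),
    (True,  True,  "B"), (True,  False, "B"), (False, False, "B"),
    (False, False, "B"), (False, True,  "B"),
]

correct = sum(pred == actual for pred, actual, _ in records)
print(f"overall accuracy: {correct / len(records):.0%}")  # 60% here: wrong in 4 of 10 cases

for group in ("A", "B"):
    # False positives: flagged as high risk although the person did not reoffend.
    non_reoffenders = [pred for pred, actual, g in records if g == group and not actual]
    fp_rate = sum(non_reoffenders) / len(non_reoffenders)
    print(f"group {group} false-positive rate: {fp_rate:.0%}")
```

On these made-up numbers, both groups reoffend at the same rate, yet group A’s non-reoffenders are flagged twice as often as group B’s, which is the kind of pattern that led to the charge of racial bias.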

“It would be tragic if somebody’s life was destroyed just because others put blind faith in a commercial algorithm,” Mr. Gigerenzer says.

So what can be done about this process, particularly given that it is already well underway?

It would have been an important topic for the recent coalition talks to form the next German government, Mr. Gigerenzer explains. In the constant talk about digital technology, the social and psychological dimensions are being left out, he continues. “A huge blind spot has developed.”

When his expert panel has finished its report on “scoring consumers,” it will be presented to the justice ministry. Mr. Gigerenzer hopes the report will also spark much more public debate on the topic.

There is mounting criticism in Germany of what some view as a lack of transparency in establishing Schufa scores. In 2014, the Federal Court of Justice dismissed the lawsuit of a woman who wanted to know how the company had calculated her negative credit rating.

But two non-governmental organizations, AlgorithmWatch and the Open Knowledge Foundation, launched the OpenSchufa initiative earlier this month. Supported by data journalists, they have set out to crack the Schufa algorithm. They rely mainly on financial information donated by the public.

“If we don’t do anything, then one day a corporation or a government institution will pull all the information from different data banks together and come up with a social credit score,” Mr. Gigerenzer warns. “And at the end, we will be in the same state as the Chinese. At the moment we are investing billions in digital technologies,” he continues, “when we should be investing just as much in digital education, so that humans are aware of what algorithms really can, and cannot, do. We cannot just stand by as they are used to change our minds and our societies.”

This interview was conducted by Heike Jahberg, who writes about consumer affairs for Handelsblatt’s sister publication, Der Tagesspiegel. This story was adapted for Handelsblatt Global by Cathrin Schaer. To contact the author: redaktion@tagesspiegel.de

This article was first published by Handelsblatt Global on 17 Feb 2018.

 

