The (good) driver rating and the social credit model
15 December 2020 | Written by Thomas Ducato
A well-known car sharing company has introduced in Italy a system that analyzes how users drive. For now there are no consequences for those who receive a low score, but we should start asking some questions.
Driving in the city during rush hour can be a stressful experience: sudden braking, queues and traffic jams, slaloming between cyclists and scooters, unruly pedestrians. To all this, users of a well-known car sharing company operating in several Italian cities must now add the feeling of being constantly judged: not by a real instructor, with whom one might at least chat to pass the time and ease the tension, but by an algorithm.
Since December 1st, in fact, a system has been active on these vehicles that monitors the entire journey and, once the car has been parked and the key removed, gives the user a score from 1 to 10.
But how are the collected data and the scores used? Officially, the algorithm was created to “encourage correct and safe driving behavior” and to “protect the company” by limiting the risk of damage to its cars. For the moment, then, there are no consequences for future rentals for “bad drivers”, but the development raises some questions: is it acceptable to be judged by algorithms? What would happen if this data were used to limit opportunities or access to services?
The Chinese social credit system. The debate over rating citizens has been alive for years, fueled in part by the Social Credit System launched by the People’s Republic of China: a tool for classifying the reputation of citizens, based on a range of information collected and analyzed by technological systems. The program was announced by the Beijing government as early as 2014 and has been described in the Western world as a form of control not far removed from Orwellian dystopian scenarios.
The basic idea is to create a system of sanctions and rewards that punishes or rewards citizens’ behavior in areas such as compliance with the law, economic reliability and social conduct. At least officially, the system is designed to provide social incentives that favor virtuous behavior and good practices: a tool that would allow economic and social life to be regulated in an almost natural way, by leveraging citizens’ reputations and limiting the cases in which direct intervention by the state is required.
Since 2014 the system has been tested and rolled out in several cities and provinces, drawing controversy from the Western world and from outside observers over how it works, the punishments it imposes and the arbitrariness of its judgments.
The traffic light of shame. The traffic lights installed in some Chinese cities are a curious but telling illustration of the system. Anyone who crosses on red is filmed, and their face is shown on a big screen to be (harshly) judged by fellow citizens. And that is not all: thanks to a facial recognition system, the “traffic light” can also identify the offender and then apply the prescribed penalties. It is a public pillory which, in a country where reputation and honor are important values, has produced results. But at what cost to privacy?
The end justifies the means? Setting aside the purposes of these rating systems, which at least in their declared intentions remain tools for encouraging virtuous behavior rather than introducing forms of control, it is worth scrutinizing the means used to achieve them.
On the one hand, there is the capacity of these systems to collect and analyze data that acquires value and forms the cornerstone of an economic model that tends to treat us more as products than as individuals. On the other, there is the question of the nature of algorithms themselves, which carry biases and lack something that in certain cases is fundamental: common sense.
In Europe, far more than in China, we remain distant from a “Big Brother” scenario. But the signs of the present oblige us to ask ourselves a few questions.