If you are familiar with the TV series Black Mirror, you already know that its main focus is the consequences of new technologies for society. Each episode introduces a new context and technology, set either in the future or in an alternative present, and deals with the social issues these technologies cause, such as loss of privacy and freedom.
The first episode of season 3, titled "Nosedive", introduces an application that people use to rate each other. These ratings affect every aspect of people's lives, including what they can buy, their activities, jobs, and friends, and they create prejudices against people, which in turn lead others to rate them accordingly.
In this episode, we see how this system forces people to act "nice" all the time, making their every action seem fake and unnatural. We also see how people judge each other just by looking at their ratings and act accordingly. We see similar issues in today's society with racism and judgments based on people's appearance. However, this episode shows how a rating system could take discrimination and prejudice to an extreme.
Although many countries have adopted credit scores for banking purposes (e.g. FICO in the US (Wu, 2017)), and people are sometimes judged by the numbers on their social media accounts, we have not yet converted to using scores for every aspect of our lives. However, the recently developed Zhima Credit system in China comes closer to integrating such a rating system into what people can buy and do. For example, a person's score can restrict how they are allowed to travel, which hotels they can stay in, which bank loans they can get, and so on (Vincent, 2017).
As far as I can see from the Black Mirror episodes, many of the issues presented in this series already exist in today's society, even if not in as extreme a form as in the episodes. That is why this series pushes people to think about the possible consequences of certain technologies and social structures. Seeing this development did not surprise me at all, since prejudices and informal ratings already exist among people, only not validated with numbers. We can still see many examples of discrimination based on people's skin color, choice of clothes, country of origin, and so on. In fact, numbers have already come into play with social media. Privileged people gain more privileges, while the unprivileged remain excluded from society.
Fictional stories such as Black Mirror, 1984 and Brave New World offer new perspectives on possible social issues by showing the extreme consequences of different social systems. I believe this allows viewers to become aware of these issues by looking at the current system and at similar existing concerns in our society. That is why more people should read or watch these stories, so that we can avoid going down the same path.
If by any chance you haven't watched Black Mirror, I highly recommend it. Although it can be a bit dark, it makes you think about existing social issues and about how new and supposedly beneficial technologies might turn out to be harmful in the end.
References
- Vincent, A. (2017, December 15). Black Mirror is coming true in China, where your ‘rating’ affects your home, transport and social circle. Retrieved March 28, 2018, from https://www.telegraph.co.uk/on-demand/2017/12/15/black-mirror-coming-true-china-rating-affects-home-transport/
- Hvistendahl, M. (2018, February 15). In China, a Three-Digit Score Could Dictate Your Place in Society. Retrieved March 28, 2018, from https://www.wired.com/story/age-of-social-credit/?mbid=synd_digg
- Wu, J. (2017, September 7). Credit Scores: What are they? How do they work? Retrieved March 28, 2018, from https://www.valuepenguin.com/credit-scores
Comments
Well, I would say that comparing this credit system with the Black Mirror episode, or with discrimination, is not fair. On one hand, it is essentially similar to a traditional banking credit system: you will not get loans from banks if you have a poor banking record or a weak ability to pay them back. Also, I'm pretty sure that banks want as much data as possible to evaluate applicants; I think the purpose behind this is mainly risk control. On the other hand, I would question to what extent this rating is really excessive. If, as you believe, the credit system is a violation of privacy, then the banks or this system will probably not provide service at all, on account of uncontrollable financial risk. Society doesn't work like that, right? I'm thinking that maybe a better solution is this: if we'd like to use the service, we authorize companies to use our private data; otherwise, we still have other options, but perhaps at higher cost.