*35C3 preroll music*

Herald Angel: Alright. Then it's my great pleasure to introduce Toni to you. She's going to talk about "the Social Credit System," which kind of feels to me like a Black Mirror episode coming to life. So I'm slightly nervous and really curious what we're going to learn today. Please give a huge, warm round of applause and welcome Toni!

*Applause*

Toni: Good morning, everyone! Before I go into my talk, I'm just going to announce the Chinese translation stream for everyone who doesn't speak English.

*speaks Chinese*

*Applause*

Because today's talk is about China, we figured it would be good to have it in Chinese as well. I'm going to be talking today about the Social Credit System in China, where "the" Social Credit System that you always hear about in Western media actually doesn't really exist, and most of my talk will be about what we don't know, which could fill an entire hour or even more. But I'm just going to focus on some of the things that are most interesting to me. First of all, a little bit about me. I'm an economist, but I'm not only concerned with money.
I look at economics as the study of incentives, which means that what I'm really interested in is how humans respond to different kinds of incentives. I don't believe that humans are completely rational, but I do believe that humans try to maximize what they think is their best interest. Now, some words about me: I studied math, economics, and political science in a couple of different cities around the world. I spent 19 months in China overall. Most recently I was there in July on a government scholarship, which was really, really interesting, because while there I had read all of these Western newspaper articles about the Chinese Social Credit System, and I went to a pretty good university and asked people: So what do you think about this system? And most of them basically looked at me blankly and said: What system? I haven't even heard of this! That was an interesting experience for me, because in the West it's this huge, all-encompassing system, and in China, most people that aren't directly in touch with it actually don't know anything about it. I'm broadly interested in the impact of technology on society, life, and the economy, obviously, and in my free time I do a lot of data science and machine learning with Python and R.
So I thought it was quite interesting to look at the Social Credit System from this point of view as well, because you always hear that it's this big-data initiative, and then, when it comes down to it, what you actually see is that they don't use machine learning all that much. They have, basically, a rule-based catalog: if you do this, you get 50 points; if you do that, you lose 50 points. And then they actually have a lot of people reporting on other people's behavior. I'm going to talk about how exactly it looks later on, but I was very, very surprised: after reading a lot of the Western newspaper articles that were basically "oh, this is this big Orwellian dystopia powered by big data," you read what's actually happening, and they have huge lists of "if you jaywalk, you get 10 points deducted," this kind of thing. If you want to get in touch with me you can use Twitter, but you can also use either my professional or my personal e-mail address, which you can both see there. If you have any thoughts on this or are interested in a little more, I can give you more resources as well, because obviously today's talk will only be scratching the surface. So: perceptions of the Social Credit System.
One of the interesting things I mentioned before was how the perception in the West and in China is completely different. In the West — this is from financialtimes.com — you see this huge, overwhelming guy who basically puts every Chinese person under a microscope. They're all kind of hunched over, everyone has a score attached to them, and they seem pretty sad: a very, very Orwellian concept. Whereas in China — this is actually from Chinese state media — what it says is: we can all live in harmony with this new system and all trust each other. And interestingly, Chinese people actually believe that, to some degree. They believe that technology will fix all these current problems in society, especially because in China, currently, trust is a rare commodity, and this new system is supposed to lead to more efficiency and trust, and a better life. And I have a really, really interesting quote from a Western scholar that summarizes the Western perspective: "What China is doing here is selectively breeding its population to select against the trait of critical, independent thinking. This may not be the purpose, indeed I doubt it's the primary purpose, but it's nevertheless the effect of giving only obedient people the social ability to have children, not to mention successful children."
This basically plays on the idea that if you have a low score — currently, in the cities that are already testing this system — your children can't attend good schools, you cannot take trains, you cannot take planes, you cannot book good hotels. Your life just becomes very, very inconvenient. And this is by design; this is the plan. The Chinese government puts it a little differently: the idea is about changing people's conduct by ensuring they are closely associated with the system. One of the main things about this system is that there isn't very much new data being generated for it. Instead, all the existing data that is already collected about you is combined into one big database entry for each and every person, keyed by your ID number. In China, once you're born, you get an ID number, which is similar to a Social Security number in the U.S. — we don't really have a similar concept in Germany. It used to be that your ID number was only necessary for government matters, but now you need it to get a bank account, and you need it to buy a cell phone — even a prepaid cell phone.
So all your online activity that happens with your cell phone is associated with your ID number, which means you can't really do anything anonymously, because it all goes back to your ID number. There are a couple of predecessor systems, some of them actually going back to the 1990s, that are supposed to be integrated into the new system. Two of them are blacklists. One is a court blacklist. Courts in China work a little differently: they like giving you fines, as they do in other countries, but they also like ordering public apologies. For example, if you're a company and your food safety wasn't up to par, you have to pay a fine, but in addition you also have to publish a public apology letter in the newspaper, saying how very sorry you are that this happened, that it was a moral failing on your part, and that it won't happen again. And if you don't do that, you go on this blacklist. Similarly, if you take out a line of credit and don't pay it back within three months — or don't make any payments for three months — you go on the debtors blacklist. If you're on a blacklist, which again is associated with your *shēnfènzhèng*, your ID number, what happens is you cannot take trains, you cannot take planes.
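The data architecture described here — existing records from many sources keyed to one national ID number, plus blacklist membership — can be sketched as a simple merge by ID. A minimal illustration in Python; all IDs, sources, and fields below are invented:

```python
# Hypothetical sketch: records from separate sources merged into one
# per-person profile keyed by a national ID number, with blacklist
# membership attached. All IDs and data are invented for illustration.
from collections import defaultdict

court_blacklist = {"110101199001011234"}                         # ignored a court order
debtor_blacklist = {"330106198505054321", "110101199001011234"}  # 3+ months in arrears

phone_records = [("110101199001011234", "bought prepaid SIM")]
bank_records = [("330106198505054321", "opened account")]

profiles = defaultdict(lambda: {"events": [], "blacklisted": False})
for source in (phone_records, bank_records):
    for person_id, event in source:
        profiles[person_id]["events"].append(event)
for person_id in court_blacklist | debtor_blacklist:
    profiles[person_id]["blacklisted"] = True

print(profiles["110101199001011234"])
```

The point of the sketch is that no new data has to be generated: once every source is keyed by the same ID, one dictionary lookup yields the combined profile.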
Your life basically becomes very, very inconvenient: your children can't go to good public schools, your children can't go to private schools, your children can't go to universities — all of these issues suddenly come up. There is also a company database called Credit China, which is similar to the public debtors blacklist but is essentially a credit score for companies. And then there's the Credit Reference Center of the People's Bank of China, which is a credit score for individuals — it was supposed to be like Schufa, or the U.S. FICO score. But one of the big problems in China is that a lot of people aren't part of the formal economy. A lot of people are migrant workers: they get their money in cash, they do not have bank accounts, they do not have rent or utility payments or anything like that, because they live in the countryside. They own their own home, which they built themselves, so they didn't even finance it — and their home isn't even officially theirs, because in China you can't actually own property; instead, the government leases it to you. So there were a lot of people that were not covered by this system, and the last figure I had was that less than 10 percent of Chinese adult citizens were actually in the system and had any sort of exposure to banks, which is very, very little.
And that meant that people couldn't get credit, because banks would only give credit to people that were in the system — people where they had some sort of handle on whether they would be paid back. Now, implementation details of the new system are very, very scarce, but the basic idea is that Chinese citizens are divided into trustworthy individuals and what the Chinese call "trust breakers." Sometimes there are five different groups, sometimes two, but in general there is a cut-off: above this line is good, below this line is bad. This is a graphic from the Wall Street Journal that shows some of the inputs that go into the system, and one of the things we see is that the inputs are crazy, crazy varied. Do you pay income taxes? Do you pay your utility bills on time? Do you respect your parents — however they measure that? Do you have a criminal record? Do you pay for public transportation, or have you been caught not paying? What about your friends? Do you retweet or use WeChat to distribute information against the party — which they call "reliability"? In actuality it's not about whether something is factual; it's about whether it's against the party or not. Where do you buy, and what do you buy? Apparently buying diapers is better for your score than buying video games.
Because, you know, if you buy video games you're obviously not very responsible, and if you buy diapers you have a kid, so you're conforming to the societal ideal. Your score is then supposed to feed into all these different areas: you're supposed to have better access to social services if your score is good, and better access to internet services. In theory, the idea is that at some point, if your score is too bad, you're not allowed to use WeChat anymore, you're not allowed to use Alibaba anymore, you can't become a government worker, you cannot take planes and high-speed trains, you cannot get a passport, and your insurance premiums go up. So it's supposed to be this really, really big, overwhelming system. But the stated goals are actually that "it's a shorthand for a broad range of efforts to improve market security and public safety by increasing integrity and mutual trust in society." So one idea is to allocate resources more efficiently. Resource allocation in China is a pretty big problem: people grow up with the idea that, with 1.3 billion people, everything is always going to be scarce, and the current distribution strategies — which are mostly financially based, but also often *guanxi*-based — don't really seem fair.
For example, public transport in China is highly subsidized, which means that the price does not reflect true scarcity. Currently, in theory it's first come, first served; in practice, there are people buying up all the tickets for, say, the high-speed train from Shanghai to Beijing and then selling them at a profit, or selling them to certain companies that have good ties to the government. That seems very unfair. So the new system is supposed to distribute them more fairly and more efficiently. The second goal is restoring trust in people. Perceived interpersonal trust and trust in institutions are extremely low in China. If you're from Germany, you might have heard that there are Chinese gangs basically buying up German milk powder and selling it in China. This is actually happening, because in 2008 there was a big scandal with contaminated milk powder, and ever since then, anyone who can afford it does not use Chinese milk powder — they don't trust the government, the regulations, or the firms enough to buy it, so they actually import it. And the big irony is: sometimes this milk powder is produced in China, exported to Germany, and then exported back to China. The Social Credit System is then supposed to identify those who deserve trust. And the third point is a sort of re-education of people.
The idea is that they want to mold people into the image that the Communist Party thinks people should be. And in addition to punishments and rewards, another way this could work is through the feeling of being surveilled. Because you can't do anything anonymously, you automatically adapt your behavior, because you know someone is watching you all the time. This is how a lot of the Chinese firewall actually works: most people I know that are more educated know ways to circumvent the firewall, but they also know that they're always being watched, so they don't do it — they censor themselves. As I said before, allocation of scarce resources so far happens mainly through financial and *guanxi* channels. *Guanxi* is basically an all-permeating network of relationships with a clear status hierarchy. If I attend a school, everyone who also attended this school will be in my *guanxi* network. And there's this idea that within my in-group we trust each other and do favors for each other, and everyone outside my immediate group I don't trust and don't do favors for. In some ways the *guanxi* system right now is a substitute for formal institutions in China — for example, if you want a passport right now.
You can of course apply for a passport through regular channels, which might take months and months, or you can apply for a passport through knowing someone who knows someone, which might take only two days. Whereas in Germany you have these very regular, formal institutions, in China they still use *guanxi*. But increasingly, especially young people find *guanxi* very unfair, because a lot of it comes down to where you went to school — which is determined by where you were born — who your parents are, and all these things. Another thing that's important to understand is that the system works through public shaming. In a lot of Western societies we can't really imagine that — I wouldn't really care if my name were in a newspaper as someone who jaywalked, for example; it would be: oh well, that's okay. But in China this is a very, very serious thing: saving face is extremely important. When I went to school there, we had a dormitory, an all-foreigners dormitory, where the staff responsible for it felt that the foreigners were not behaving the way they should. So their idea was to put the names, the pictures, and the offenses of the foreigners in the elevator, to shame them publicly.
So, for example, if you brought a person of the opposite sex to your room, they would put your name, your offense, and your room number in the elevator. And of course this didn't work, because for a lot of Western people it was basically: "oh well, I'm going to try to be on there as often as possible, because this is like a badge of honor for me," while the Chinese students figured: "well, this is really shameful and I'm losing face." One of the posted offenses: "She brought alcohol." So this didn't really work at all. But this is the mindset behind a lot of these initiatives. As I said, there's a problem: we don't really know what's going to happen. And one of the ways we can see what might happen is to look at pilot systems. Ever since the Communist Party took hold, the Chinese government has done a lot of policy experimentation. Whenever they try a new policy, they don't roll it out everywhere; they choose different pilot cities or pilot districts — "this is the district where I'm going to try this system, and I'm going to try another system in another district or city." And this is also what they're doing for the Social Credit System.
Now, I looked intensively at three systems for this presentation — overall there are about 70 that I know of: the Suining system (Suining is a city in China), the Rongcheng system (another city in China), and Sesame Credit. Sesame Credit is a commercial system from Alibaba — I assume everyone knows Alibaba; they're basically the Chinese Amazon, except they're bigger, have more users, and actually make more money. And they have their own little system. One of the problems with this kind of system, which I found when I tried modeling it, is that it's a very complex system, and small changes in the input change the output significantly. Usually when they run these pilots, they have a couple of them, choose the pilot that works best, and roll it out everywhere. But for this kind of thing, where you have a lot of complex interactions, that might not be the best approach. The Suining system is considered the predecessor of all current systems. It had a focus on punishment, and it was quite interesting: at the beginning of the trial period they published a catalog of scores and consequences. Here is an example, basically taken from this catalog. If you took out bank loans and didn't repay them, you got 50 points deducted. Everyone started with 1000 points in this system.
If you didn't pay back your credit cards, you also got 50 points deducted. If you evaded taxes, also 50 points. If you sold fake goods, 35 points were deducted. The system was abolished, I think in 2015 or 2016, because Chinese state media and a lot of internet users talked about how it was an Orwellian system and not a good one, because it was all very centralized and everything you do is recorded centrally. But Creemers writes: "Nonetheless, the Suining system already contained the embryonic forms of several elements of subsequent social credit initiatives: The notion of disproportional disincentives against rule breaking, public naming and shaming of wrongdoers, and most importantly, the expansion of the credit mechanism outside of the market economic context, also encompassing compliance with administrative regulations and urban management rules." One thing that is difficult, especially for German speakers, is that "credit" in Chinese, *xìnyòng*, means credit as in "loan," but also credit as in "trust." So the Social Credit System is one way of trying to conflate those two — the economic credit and the trust credit — into one big system. But the Suining system basically failed. So they adapted it and are now trying a new kind of system, the Rongcheng system.
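The Suining catalog described above — start at 1000 points, fixed deductions per offense — amounts to a simple lookup table rather than any learned model. A minimal sketch in Python, using the deduction values quoted in the talk (the offense keys are my own shorthand):

```python
# Suining-style scoring as described in the talk: everyone starts at 1000
# points and listed offenses carry fixed deductions. The offense keys are
# my own shorthand; the point values are the ones quoted in the talk.
SUINING_DEDUCTIONS = {
    "unpaid_bank_loan": 50,
    "unpaid_credit_card": 50,
    "tax_evasion": 50,
    "selling_fake_goods": 35,
}

def suining_score(offenses, start=1000):
    """Deduct points for each recorded offense; unlisted offenses cost nothing."""
    return start - sum(SUINING_DEDUCTIONS.get(o, 0) for o in offenses)

print(suining_score(["tax_evasion", "selling_fake_goods"]))  # 915
```

Note how little "big data" is involved: the entire scoring logic is a published table plus a subtraction.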
Whenever you read a newspaper article on the Social Credit System in the West, most journalists went to Rongcheng, because the city had just received a couple of awards from the Chinese government for being so advanced at this social credit thing. But it's difficult to call this "one system," because there are actually many, many intertwined systems. There is one city-level system, where city-level offenses are recorded — for example tax evasion, with a couple of rules: if you evade taxes, your score goes down 50. But then, if you live in one neighborhood, your score might go up for volunteering with the elderly; if you live in another neighborhood, your score might go up for, say, planting trees in your garden or backyard. So depending on your neighborhood, your score might be different. If you work for a taxi cab company, for example, they also have their own little score system: your score might go up if you get good reviews from your passengers, and it might go down if you don't follow traffic rules — these kinds of things. There are designated scorekeepers at each level: each district chooses a couple of people who are responsible for passing information about who did what on to the next higher level.
There is supposed to be an official appeals procedure — whenever your score changes, you're supposed to be notified — but apparently that's not happening for most people at this point. Again, it's a system of data sharing, and one thing they haven't disclosed yet is what kind of data is shared. Are they only sharing the points? If I'm in a district and I plant some trees, does the central system get the information "person A planted some trees," or does it get the information "person A got 5 points"? We don't know at this point, and it would mean something very different for how the system could be used. But the end result, at this point, is that there is one score: you have one central score, and all these different smaller systems feed into it. Currently about 85 percent of people are between 950 and 1050 — you start off with a thousand, and those are basically the normal people. Anyone above 1050 is considered a trustworthy person, and anyone below 950 is considered a trust-breaker. And, as I said before, with the naming and shaming, what you can actually see here is a billboard with the most trustworthy families in Rongcheng — the families with the highest scores, for example. Sesame Credit is a little different.
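The Rongcheng-style banding can be captured in a few lines. The cut-offs here follow the start-at-1000 scheme as I read it — above 1050 "trustworthy," the roughly 85 percent between 950 and 1050 "normal," and the bottom band "trust-breaker" — so treat the exact thresholds as a best guess:

```python
# Rongcheng-style banding: one central score starting at 1000; above 1050
# counts as "trustworthy", below 950 as a "trust-breaker", the large
# middle band as normal. Thresholds are my reading of the talk.
def classify(score):
    if score > 1050:
        return "trustworthy"
    if score < 950:
        return "trust-breaker"
    return "normal"

print([classify(s) for s in (1100, 1000, 900)])  # ['trustworthy', 'normal', 'trust-breaker']
```

The interesting design point is that the label, not the raw score, drives the consequences (billboards, blacklist-style restrictions), so everything hinges on where these two cut-offs sit.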
It's the only 289 00:26:23,590 --> 00:26:27,990 system that actually uses machine learning and artificial intelligence to determine 290 00:26:27,990 --> 00:26:32,980 the outputs. In Rongcheng, for example, they have artificial intelligence, they 291 00:26:32,980 --> 00:26:37,120 have computer vision, for the most part, and the computer vision cameras 292 00:26:37,120 --> 00:26:42,220 try to recognize you when you jaywalk. And when they recognize you 293 00:26:42,220 --> 00:26:48,250 jaywalking, you get a small SMS: "well, we just saw you jaywalking, your 294 00:26:48,250 --> 00:26:56,810 score is now dropping." But how the score develops, depending on your jaywalking, 295 00:26:56,810 --> 00:27:01,120 isn't really determined by machine learning or artificial intelligence. 296 00:27:01,120 --> 00:27:06,840 Instead, it's determined by rules. You know: one time jaywalking deducts five 297 00:27:06,840 --> 00:27:12,460 points, and this is stated somewhere. Sesame Credit doesn't work like that. 298 00:27:12,460 --> 00:27:19,810 Instead it uses a secret algorithm. I talked to some people that work 299 00:27:19,810 --> 00:27:25,330 for Sesame Credit or for Alibaba, and the way they described it was: they basically 300 00:27:25,330 --> 00:27:33,260 clustered people based on behavior, then gave scores to these clusters, and 301 00:27:33,260 --> 00:27:40,700 then afterwards basically reverse-engineered their own score, using machine 302 00:27:40,700 --> 00:27:45,950 learning, so that whenever something new happens, you can move to a different 303 00:27:45,950 --> 00:27:54,330 cluster. Sesame Credit was actually refused accreditation as a credit score in 304 00:27:54,330 --> 00:28:03,400 2017, so banks are not allowed to use the 305 00:28:03,400 --> 00:28:09,190 Sesame Credit score to determine whether they give you loans or not.
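That cluster-then-reverse-engineer procedure can be sketched roughly like this. The behavior features, thresholds, cluster scores, and the crude per-feature fit standing in for the real machine-learning step are all my own illustrative assumptions; only the general three-step procedure and the 350-850 range come from the talk.

```python
# Illustrative sketch of the procedure Sesame Credit employees described:
# 1) cluster users by behavior, 2) hand-assign a score to each cluster,
# 3) "reverse engineer" a model from those scores so new behavior maps
# onto the scale. Features, thresholds, and scores are invented assumptions.

def cluster(purchases, late_payments):
    # Step 1: crude behavior clustering (a stand-in for a real clustering
    # algorithm): heavy platform use? frequent late payments?
    return (purchases > 50, late_payments > 5)

# Step 2: scores assigned per cluster (assumed values on the real 350-850 scale).
CLUSTER_SCORE = {
    (True, False): 750,   # heavy platform use, pays on time
    (False, False): 600,
    (True, True): 550,
    (False, True): 400,
}

# Step 3: derive per-feature weights from the cluster scores (a crude
# stand-in for the machine-learning fit), so any new behavior vector
# gets a score without re-clustering.
PURCHASE_WEIGHT = (750 - 600) / 50     # score gained per extra purchase
LATE_WEIGHT = (600 - 400) / 5          # score lost per late payment

def predict(purchases, late_payments):
    raw = 600 + (purchases - 50) * PURCHASE_WEIGHT - late_payments * LATE_WEIGHT
    return max(350, min(850, raw))     # clip to the published 350-850 range

print(CLUSTER_SCORE[cluster(80, 1)])   # 750: cluster lookup for a known user
print(predict(90, 0))                  # a new heavy, punctual user
print(predict(5, 9))                   # a light user with many late payments
```

The consequence the talk describes follows directly: because users only see the output of step 3, not the rules, they cannot tell which behavior change will move them to a better cluster.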
Because Sesame 306 00:28:09,190 --> 00:28:13,770 Credit is quite ingenious – obviously Alibaba wants to keep you within their 307 00:28:13,770 --> 00:28:19,920 platform – if you buy using Alibaba and Alipay, your score goes up. If you 308 00:28:19,920 --> 00:28:29,090 buy using WeChat Pay, which is a competing platform, your score goes down. This uses 309 00:28:29,090 --> 00:28:34,000 many of the same reward mechanisms as the official government systems, and this is 310 00:28:34,000 --> 00:28:38,390 just an illustration of what kind of scores you can have: apparently your 311 00:28:38,390 --> 00:28:46,070 scores can go between 350 and 850, and in Chinese there's basically five different 312 00:28:46,070 --> 00:28:55,809 levels. So 385 is a "trust-breaker" or "missing trust". And then 731 is "trust is 313 00:28:55,809 --> 00:29:06,170 exceedingly high". So one way I tried to approach this issue was through 314 00:29:06,170 --> 00:29:10,100 agent-based modeling. The Social Credit System is individual-level, but what we're really 315 00:29:10,100 --> 00:29:13,400 interested in, or what I'm really interested in, is actually societal-level 316 00:29:13,400 --> 00:29:19,040 consequences. So if everyone gets this score, what does that mean for society? 317 00:29:19,040 --> 00:29:24,050 And agent-based modeling works quite well for that, because it allows us to imbue 318 00:29:24,050 --> 00:29:28,960 agents with some sort of rationality, but with a bounded rationality. What does 319 00:29:28,960 --> 00:29:32,800 bounded rationality mean? Usually in economics people assume agents are 320 00:29:32,800 --> 00:29:38,070 completely rational, so they are profit maximizers, they have all the information. 321 00:29:38,070 --> 00:29:45,380 But in reality, agents don't have all the information, they have a lot of issues 322 00:29:45,380 --> 00:29:51,100 with keeping stuff in their mind.
So a lot of the time, they won't choose the best 323 00:29:51,100 --> 00:29:57,760 thing in the world, but they choose the best thing that they see. And bounded 324 00:29:57,760 --> 00:30:02,040 rationality allows us to account for this. It allows us to account for 325 00:30:02,040 --> 00:30:08,380 heuristics and similar effects. And what I did is I took the propensity for specific 326 00:30:08,380 --> 00:30:12,320 behavior from current state-of-the-art research, mostly from behavioral 327 00:30:12,320 --> 00:30:17,900 economics. For example, I looked at tax evasion, and I looked at who is likely to 328 00:30:17,900 --> 00:30:24,630 evade taxes in a system, and then obviously there was some stochastic – 329 00:30:24,630 --> 00:30:30,809 some chance element. But the distribution that I chose is related to 330 00:30:30,809 --> 00:30:37,680 the current research. And I also checked that my model has similar results to the 331 00:30:37,680 --> 00:30:45,400 Rongcheng model, which I modeled at the beginning. So on average 87% of my users 332 00:30:45,400 --> 00:30:49,240 have a score within 10 percent of the original score, which is also the data 333 00:30:49,240 --> 00:31:00,740 that Rongcheng city actually publishes. Now, for the most part, I compared design 334 00:31:00,740 --> 00:31:05,220 choices on two axes. One of them was a centralized system versus a multi-level 335 00:31:05,220 --> 00:31:10,710 system, and a rule-based system versus a machine learning system. The centralized 336 00:31:10,710 --> 00:31:20,650 system is basically: all the information is kept centrally, and 337 00:31:20,650 --> 00:31:27,190 everyone in China, or wherever – in Rongcheng – has the exact same scoring 338 00:31:27,190 --> 00:31:35,520 opportunities. Now, if you have a centralized system the clear expectations 339 00:31:35,520 --> 00:31:39,600 were pretty good.
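The modeling setup described above – boundedly rational agents, a rule-based score, behavior propensities drawn from research – might be sketched in heavily simplified form like this. All propensities, penalties, and the adaptation probability here are illustrative placeholders, not the calibrated values from the behavioral-economics research used in the talk.

```python
# Minimal agent-based sketch: boundedly rational agents have a propensity
# to evade taxes; a rule-based score penalizes evasion; agents only
# sometimes notice the penalty and adapt, rather than re-optimizing
# perfectly. All numbers are illustrative placeholders.
import random

random.seed(1)

class Agent:
    def __init__(self):
        self.p_evade = random.uniform(0.0, 0.2)  # evasion propensity (assumed)
        self.score = 1000                        # rule-based baseline

    def step(self):
        if random.random() < self.p_evade:
            self.score -= 50                     # tax-evasion penalty
            # Bounded rationality: the agent reacts to the penalty only
            # with some probability, instead of optimizing with full
            # information.
            if random.random() < 0.3:
                self.p_evade *= 0.5

agents = [Agent() for _ in range(1000)]
for _ in range(10):                              # ten simulated periods
    for agent in agents:
        agent.step()

in_band = sum(950 <= a.score <= 1050 for a in agents) / len(agents)
print(f"{in_band:.0%} of agents stay in the 950-1050 band")
```

Calibration in the real model then means tuning the propensity distribution until aggregate outputs match the published Rongcheng data; this sketch only shows the mechanics, not that calibration.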
But, at the same time, the acceptance from the population was 340 00:31:39,600 --> 00:31:46,590 really, really low, which they found during the Suining experiment. And 341 00:31:46,590 --> 00:31:50,540 there's also the problem of a single point of failure. Who decides the central 342 00:31:50,540 --> 00:31:59,080 catalog? Depending on who has the power, it just 343 00:31:59,080 --> 00:32:05,100 reproduces power structures. So because you have this central catalog, the same 344 00:32:05,100 --> 00:32:10,760 people that are in power centrally, they are basically deciding some sort of score 345 00:32:10,760 --> 00:32:15,340 mechanism that works for them very well, so that they and their family will have 346 00:32:15,340 --> 00:32:23,280 high scores. And a multi-level system has the advantage that local adaptation kind 347 00:32:23,280 --> 00:32:29,080 of works, and there are many points of failure instead of a single one. But in my model, when I 348 00:32:29,080 --> 00:32:36,860 allowed locals to basically set their own rules, what happened was that they 349 00:32:36,860 --> 00:32:42,520 competed. So, this district of Rongcheng, for example, and 350 00:32:42,520 --> 00:32:46,120 that district of Rongcheng, they compete for the best people that they want to 351 00:32:46,120 --> 00:32:52,000 attract, and suddenly you have this kind of race to the bottom, where people want 352 00:32:52,000 --> 00:32:58,090 to move where they wouldn't be prosecuted, so they move to places where there are fewer 353 00:32:58,090 --> 00:33:04,000 cameras, for example. At the same time, there are many points of failure, especially 354 00:33:04,000 --> 00:33:13,500 the way it's currently set up, with people reporting data to the next higher level. 355 00:33:13,500 --> 00:33:19,450 And, a lot of the time, what we have actually seen in Rongcheng was that they 356 00:33:19,450 --> 00:33:24,360 reported data on people they didn't like more than data on people they did like.
357 00:33:24,360 --> 00:33:29,390 Or, their families got better scores than people they didn't know. So it also kind 358 00:33:29,390 --> 00:33:41,320 of reproduced these biases. The rule-based system has the advantage that people were 359 00:33:41,320 --> 00:33:46,530 more prone to adapt their behavior, because they actually knew what they 360 00:33:46,530 --> 00:33:50,630 needed to do in order to do so. But the score didn't really 361 00:33:50,630 --> 00:33:54,050 correlate with the important characteristics that they actually cared 362 00:33:54,050 --> 00:34:01,610 about. As opposed to that, in a machine learning system – you know how in Germany 363 00:34:01,610 --> 00:34:06,910 we don't really know the Schufa algorithm. And I, for example, don't exactly know 364 00:34:06,910 --> 00:34:12,139 what I could do in order to improve my Schufa score. And this is a similar system 365 00:34:12,139 --> 00:34:16,969 in China with the Sesame Credit score. A lot of people say, 366 00:34:16,969 --> 00:34:21,800 "well, I really want to adapt my behavior to the score, to improve my score, but 367 00:34:21,800 --> 00:34:29,119 when I tried doing that my score actually got worse." And you can have different 368 00:34:29,119 --> 00:34:36,440 biases, that I'm going to be talking about in a little bit. There's also this big 369 00:34:36,440 --> 00:34:42,579 problem of incentive mismatch. So, the decentralized, rule-based system like 370 00:34:42,579 --> 00:34:47,299 Rongcheng is the system that I analyzed the most. Why? Because I believe 371 00:34:47,299 --> 00:34:51,530 this is the system that we're moving towards right now, and because Rongcheng won a 372 00:34:51,530 --> 00:34:57,018 lot of awards.
So the Chinese government, the way they usually work is: they try 373 00:34:57,018 --> 00:35:01,710 pilots, then they choose the best couple of systems, they give them awards, and 374 00:35:01,710 --> 00:35:06,890 then they roll out the system nationwide. So I assume that 375 00:35:06,890 --> 00:35:12,990 the system in the end will be similar to the Rongcheng system. Now, one 376 00:35:12,990 --> 00:35:19,670 problem that I actually saw in my simulation was that you could have this 377 00:35:19,670 --> 00:35:24,960 possible race to the bottom. There's also this conflict of interest in those that 378 00:35:24,960 --> 00:35:29,710 set the rules, because a lot of the time, the way it works is, you have your 379 00:35:29,710 --> 00:35:35,940 company, and your company, you, in combination with your party leaders, 380 00:35:35,940 --> 00:35:43,760 actually decide on the rules for the score system. But the scores of all your 381 00:35:43,760 --> 00:35:48,319 employees actually determine your company's score. If you employ a lot of 382 00:35:48,319 --> 00:35:52,589 people with high scores you get a better score. So you will have this incentive to 383 00:35:52,589 --> 00:35:57,400 give out high scores and to make sure that everyone gets high scores. But at the same 384 00:35:57,400 --> 00:36:04,720 time the government has an incentive for scores to be comparable. So there's a lot 385 00:36:04,720 --> 00:36:10,020 of incentive mismatch. The government also has the incentive to keep false 386 00:36:10,020 --> 00:36:15,539 negatives down, but actually, the way the Chinese system currently works is, 387 00:36:15,539 --> 00:36:22,589 they emphasize catching trust-breakers more than rewarding 388 00:36:22,589 --> 00:36:28,750 trustworthy people.
So, false positives, for them, are less important, but false 389 00:36:28,750 --> 00:36:34,359 positives erode the trust in the system, and they lead to a lot less behavioral 390 00:36:34,359 --> 00:36:40,740 adaptation. I was actually able to show this using some nudging research that 391 00:36:40,740 --> 00:36:47,789 showed that as soon as you introduce an error probability and you can be caught 392 00:36:47,789 --> 00:36:54,530 for something that you didn't do, your probability of changing your behavior 393 00:36:54,530 --> 00:37:02,119 based on this score is actually lower. And in Rongcheng, one of the perverse things 394 00:37:02,119 --> 00:37:09,880 that they're doing is, you can donate money to the party or to, like, 395 00:37:09,880 --> 00:37:17,339 party-affiliated social services, and this will give you points, which is kind of an 396 00:37:17,339 --> 00:37:24,269 indulgence system – which is quite interesting, especially because a lot of 397 00:37:24,269 --> 00:37:32,690 these donation systems work in a way that you can donate 50,000 renminbi and you get 398 00:37:32,690 --> 00:37:36,740 50 points, and then you donate another 50,000 renminbi and you get another 50 399 00:37:36,740 --> 00:37:44,400 points. So you can basically donate a lot of money and then behave however you want, 400 00:37:44,400 --> 00:37:56,220 and still get a good score. And the trust in other people can actually go down even 401 00:37:56,220 --> 00:38:00,559 more in this system, because suddenly you only trust them because of their scores, 402 00:38:00,559 --> 00:38:05,460 and the current system is set up so that you can actually look up scores of 403 00:38:05,460 --> 00:38:09,730 everyone that you want to work with, and if they don't have a score high enough 404 00:38:09,730 --> 00:38:15,140 then suddenly you don't want to work with them. The trust in the legal system can 405 00:38:15,140 --> 00:38:22,259 also decrease, actually. Why?
Because trust in the legal system in China is already 406 00:38:22,259 --> 00:38:26,039 low, and a lot of the things, like jaywalking, they're already illegal in 407 00:38:26,039 --> 00:38:30,380 China, as they are here, but no one cares. And suddenly, you have this parallel 408 00:38:30,380 --> 00:38:37,269 system that punishes you for whatever. But why don't you just try to fix the 409 00:38:37,269 --> 00:38:45,190 legal system, which would be my approach? Suddenly, illegal activity could happen 410 00:38:45,190 --> 00:38:51,720 more offline, and this is one of those things that is quite interesting. In 411 00:38:51,720 --> 00:38:58,059 countries that have moved towards mobile payments, and away 412 00:38:58,059 --> 00:39:05,069 from cash, you see fewer robberies but you don't actually see less crime. Instead you 413 00:39:05,069 --> 00:39:12,430 see new types of crime. So, you see more credit card fraud, you see more phone 414 00:39:12,430 --> 00:39:19,390 robberies, these kinds of things. And this is also where things could move in the 415 00:39:19,390 --> 00:39:29,710 Chinese case. One major problem is also that this new system – I've talked a 416 00:39:29,710 --> 00:39:34,499 little bit about this one, but – it can introduce a lot of new bias, and reproduce 417 00:39:34,499 --> 00:39:44,710 existing bias even more. So, for example, China is a country of 55 minorities. The Han are 418 00:39:44,710 --> 00:39:50,249 a big majority, they have about 94 percent of the population. And for any computer vision 419 00:39:50,249 --> 00:39:58,020 task, it's been shown that systems are really, really bad at discriminating between 420 00:39:58,020 --> 00:40:04,930 individuals in smaller ethnic groups. In the U.S., most computer vision tasks 421 00:40:04,930 --> 00:40:09,920 perform worse for African-Americans, they perform worse for women, because all of 422 00:40:09,920 --> 00:40:16,930 the training sets are male and white, and maybe Asian.
In China, all of these tasks 423 00:40:16,930 --> 00:40:26,609 are actually performing worse for ethnic minorities, for the Uyghurs, for example. 424 00:40:26,609 --> 00:40:32,460 And one way that they could abuse the system – which 425 00:40:32,460 --> 00:40:38,210 they're also already doing in Xinjiang – is to basically just identify, "oh, this is 426 00:40:38,210 --> 00:40:45,979 a person of a minority, well, I'm just going to go and check him or her more 427 00:40:45,979 --> 00:40:50,210 thoroughly." This is actually what happens in Xinjiang. If you're in Xinjiang and you 428 00:40:50,210 --> 00:40:59,250 look like a Turkic person, say from Turkmenistan, from one of the Turkic peoples, you 429 00:40:59,250 --> 00:41:04,210 are a lot more likely to be questioned. You're a lot more likely to be stopped and 430 00:41:04,210 --> 00:41:12,579 asked or required to download spyware on your phone. And this is 431 00:41:12,579 --> 00:41:17,660 currently what happens, and this new kind of system can actually help with that. 432 00:41:17,660 --> 00:41:24,710 I've said that it can reproduce these kinds of power structures, and now obviously we 433 00:41:24,710 --> 00:41:29,890 all know neutral technology doesn't really exist, but in the Chinese case, in the 434 00:41:29,890 --> 00:41:33,529 social credit case, they always say "well, this 435 00:41:33,529 --> 00:41:37,349 is neutral technology and it's all a lot better," but actually it's the people 436 00:41:37,349 --> 00:41:43,970 currently in power who decide on what gives you points and what deducts points 437 00:41:43,970 --> 00:41:50,029 for you. Another problem: currently the entire system is set up in a way that it 438 00:41:50,029 --> 00:41:54,619 all goes together with your *shēnfènzhèng*, with your I.D. card. What if you don't 439 00:41:54,619 --> 00:41:59,410 have an I.D. card? That's foreigners for one.
But it's also people in China that 440 00:41:59,410 --> 00:42:05,479 were born during the one-child policy and were not registered. There's quite a lot 441 00:42:05,479 --> 00:42:09,200 of them, actually. They're not registered anywhere and suddenly they can't do 442 00:42:09,200 --> 00:42:13,700 anything, because they don't have a score, they can't get a phone, they can't do 443 00:42:13,700 --> 00:42:20,880 anything, really. And part of the push with this social credit system is to go 444 00:42:20,880 --> 00:42:26,779 away from cash, actually. So you need to use your phone to pay, but for your 445 00:42:26,779 --> 00:42:29,519 phone you need your *shēnfènzhèng*. If you don't have a *shēnfènzhèng*, 446 00:42:29,519 --> 00:42:32,569 well, tough luck for you. 447 00:42:32,569 --> 00:42:38,680 And currently the system in Rongcheng is set up in a way that you can check 448 00:42:38,680 --> 00:42:44,900 other people's scores and you can also see what they lose points for. So you can 449 00:42:44,900 --> 00:42:49,779 actually, sort of, choose to discriminate against people that are gay, for example, 450 00:42:49,779 --> 00:42:53,049 because they might have lost points for going to a gay bar, which you can lose 451 00:42:53,049 --> 00:43:02,130 points for. Another big issue, currently, is data privacy and security. Personal 452 00:43:02,130 --> 00:43:07,250 data is grossly undervalued in China. If you ask a Chinese person, "what do you 453 00:43:07,250 --> 00:43:14,690 think, how much is your data worth?," they say "what data? I don't have data." And, 454 00:43:14,690 --> 00:43:19,130 currently, the way it works is, if you have someone's ID number, which is quite 455 00:43:19,130 --> 00:43:24,890 easy to find out, you can actually buy access to a lot of personal information 456 00:43:24,890 --> 00:43:31,359 for a small fee.
So you pay about 100 euros and you get all hotel bookings of 457 00:43:31,359 --> 00:43:35,960 the last year, you get information on who booked these hotels with them, you get 458 00:43:35,960 --> 00:43:40,829 information on where they stayed, you get train bookings, you get access to all of 459 00:43:40,829 --> 00:43:47,200 the official databases for this one person. And for another 700 renminbi you 460 00:43:47,200 --> 00:43:52,829 can actually get live location data, so you can get the data of where this person 461 00:43:52,829 --> 00:43:56,400 is right now, or where his or her phone is right now, but if you've ever been to 462 00:43:56,400 --> 00:44:03,440 China you know that where the phone is, usually, the people aren't far. SupChina 463 00:44:03,440 --> 00:44:08,500 actually did an experiment where a couple of journalists tried buying that, because 464 00:44:08,500 --> 00:44:14,190 these kinds of services are actually offered on WeChat, pretty publicly. And 465 00:44:14,190 --> 00:44:26,359 you can just buy them, quite easily. So one additional thing that I looked at: 466 00:44:26,359 --> 00:44:30,579 one of the things that is quite interesting is that this idea of 467 00:44:30,579 --> 00:44:39,309 credit is twofold. Credit is trust credit but credit is also loan credit, and what 468 00:44:39,309 --> 00:44:43,730 if credit institutions actually use this unified credit score to determine credit 469 00:44:43,730 --> 00:44:49,059 distribution? The idea is that it's supposed to lead to reduced information 470 00:44:49,059 --> 00:44:55,471 asymmetry, obviously, so fewer defaults and overall more credit creation. New 471 00:44:55,471 --> 00:44:59,549 people are supposed to get access to credit, and there's supposed to be less 472 00:44:59,549 --> 00:45:04,589 shadow banking. But what actually happens? I'm not going to be talking about how I 473 00:45:04,589 --> 00:45:08,619 set up the model but just about my results.
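As a rough illustration of how such a model can behave, here is my own toy sketch – not the model from the talk, and every parameter is an assumption: banks lend into their free capacity, a score that barely predicts repayment lets bad loans through, and periodic write-offs free capacity up again.

```python
# Toy sketch (not the talk's model) of lending driven by a score that only
# weakly predicts repayment: non-performing loans (NPLs) crowd out new
# lending until a mass write-off, after which lending surges again.
# All parameters are illustrative assumptions.

def simulate(periods=12, default_rate=0.4, writeoff_threshold=50.0):
    capacity, bad_loans, history = 100.0, 0.0, []
    for _ in range(periods):
        new_loans = max(0.0, capacity - bad_loans)  # NPLs crowd out lending
        bad_loans += new_loans * default_rate       # weak score filter
        if bad_loans > writeoff_threshold:          # periodic mass write-off
            bad_loans = 0.0
        history.append(new_loans)
    return history

lending = simulate()
print(lending[:4])   # lending volume swings up and down period by period
```

Under these assumed parameters the lending volume alternates between a high and a low level instead of settling down, which is the kind of oscillation that makes the system fragile if a real crisis hits while bad loans are high.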
If you have this kind of score 474 00:45:08,619 --> 00:45:14,369 that includes credit information but also includes morally good – or measures of 475 00:45:14,369 --> 00:45:18,780 being morally good – what you have is, in the beginning, about 30 percent more 476 00:45:18,780 --> 00:45:23,960 agents get access to credit, and especially people that previously have not 477 00:45:23,960 --> 00:45:29,710 gotten credit access suddenly have credit access. But the problem is that this 478 00:45:29,710 --> 00:45:36,170 social credit score that combines all of these different issues correlates only 479 00:45:36,170 --> 00:45:41,809 very, very weakly with repayment ability or repayment willingness, and thus suddenly you 480 00:45:41,809 --> 00:45:47,880 have all of these non-performing loans. What we see is sort of 481 00:45:47,880 --> 00:45:51,960 like this: banks give out fewer loans because they 482 00:45:51,960 --> 00:45:59,400 have so many non-performing loans, and then the non-performing loans are written 483 00:45:59,400 --> 00:46:03,999 off, and suddenly banks give out more loans. But you have this oscillating 484 00:46:03,999 --> 00:46:09,239 financial system, where you give out a lot of loans, a lot of them are 485 00:46:09,239 --> 00:46:13,019 non-performing, then you give out a lot of loans again. And this is very, very 486 00:46:13,019 --> 00:46:19,150 vulnerable to crisis. If you have a real economic crisis during the time where 487 00:46:19,150 --> 00:46:24,569 non-performing loans are high, then a lot of banks will actually default, which is 488 00:46:24,569 --> 00:46:29,920 very, very dangerous for a financial system as nationalized as the Chinese one. 489 00:46:29,920 --> 00:46:36,799 Now, what are some possible corrections? You could create a score that basically is 490 00:46:36,799 --> 00:46:41,259 the same as the Schufa score.
So that it looks only at credit decisions, but 491 00:46:41,259 --> 00:46:45,190 suddenly, you lose a lot of incentives for the social credit score, if the social 492 00:46:45,190 --> 00:46:48,430 credit score doesn't matter for credit distribution anymore. 493 00:46:48,430 --> 00:46:52,040 Another thing, and this is, I think, the more likely one, 494 00:46:52,040 --> 00:46:55,959 is that you have a blacklist for people that have not repaid a loan 495 00:46:55,959 --> 00:46:59,400 in the past. So you basically get one freebie, and afterwards, 496 00:46:59,400 --> 00:47:04,150 if you didn't repay your loan in the past, then you will not get a loan in the 497 00:47:04,150 --> 00:47:08,400 future. You will still be part of the social credit system, and your social 498 00:47:08,400 --> 00:47:12,359 credit score will still be important for all of these other access issues, but it 499 00:47:12,359 --> 00:47:15,849 won't be important for access to loans anymore, once you've been on this 500 00:47:15,849 --> 00:47:21,670 blacklist. This is probably something that the Chinese government could get 501 00:47:21,670 --> 00:47:30,180 behind, but it's also more effort to take care of it; then you have to think about, 502 00:47:30,180 --> 00:47:33,819 "well, you can't leave them on the blacklist forever, so how long do you 503 00:47:33,819 --> 00:47:37,599 leave them on the blacklist? Do they have to pay back the loan and then they get off 504 00:47:37,599 --> 00:47:45,670 the blacklist? Or do they have to pay back the loan and then stay out of default 505 00:47:45,670 --> 00:47:52,859 for a year, or for five years?"
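Those open questions – when someone comes off the blacklist, and what counts as staying clean – can be made concrete in a sketch like this; the one-clean-year rule is a purely assumed parameter, exactly the kind of small design decision still undecided.

```python
# Sketch of the blacklist variant described above: default once and loans
# are refused until the debt is repaid AND an assumed clean waiting period
# has passed. The required_clean_years value (1 here) is an invented
# parameter; the talk leaves it open whether it should be one or five years.

class LoanBlacklist:
    def __init__(self, required_clean_years=1):
        self.required = required_clean_years
        self.listed = {}   # person -> clean years since repayment (None: unpaid)

    def record_default(self, person):
        self.listed[person] = None       # debt not yet repaid

    def record_repayment(self, person):
        if person in self.listed:
            self.listed[person] = 0      # the clean-period clock starts now

    def record_clean_year(self, person):
        years = self.listed.get(person)
        if years is not None:            # only counts after repayment
            years += 1
            if years >= self.required:
                del self.listed[person]  # rehabilitated
            else:
                self.listed[person] = years

    def may_borrow(self, person):
        return person not in self.listed

bl = LoanBlacklist()
bl.record_default("A")
print(bl.may_borrow("A"))    # False: on the blacklist
bl.record_repayment("A")
bl.record_clean_year("A")
print(bl.may_borrow("A"))    # True: repaid, plus one clean year
```

Note how every branch in this little class is one of the policy decisions mentioned: whether repayment alone suffices, how long the clean period runs, and whether clean years before repayment count at all.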
There are a lot of small decisions that, in my 506 00:47:52,859 --> 00:47:57,549 opinion, the Chinese government hasn't really thought about, up until now, 507 00:47:57,549 --> 00:48:01,170 because they're basically doing all these pilot studies, and all of these regional 508 00:48:01,170 --> 00:48:05,160 governments are thinking of all these small things, but they're not documenting 509 00:48:05,160 --> 00:48:10,349 everything that they're doing. So, once they – they want to roll it out by 2020, 510 00:48:10,349 --> 00:48:15,200 by the way, nationwide – once they've rolled it out there's a pretty big chance, 511 00:48:15,200 --> 00:48:18,630 in my opinion, that they'll have a lot of unintended consequences. A lot of things 512 00:48:18,630 --> 00:48:28,519 that they haven't thought about, and that they will then have to look at. So, I 513 00:48:28,519 --> 00:48:33,269 believe that some sort of system is likely to come, just in terms of how much energy 514 00:48:33,269 --> 00:48:37,289 they've expended on this one, and for the Chinese government at this point, for 515 00:48:37,289 --> 00:48:41,900 the party, it would be losing face if they did not introduce any such system, because 516 00:48:41,900 --> 00:48:45,969 they've been talking about this for a while. But most likely, it would be a kind 517 00:48:45,969 --> 00:48:53,019 of decentralized data sharing system. And when I ran my simulation... By the way, I 518 00:48:53,019 --> 00:48:59,700 will make my code public; I used some proprietary 519 00:48:59,700 --> 00:49:06,460 data for my model, and I still need the permission to publish this. Once I publish 520 00:49:06,460 --> 00:49:11,289 it I will also tweet it, and we'll put it on GitHub for everyone to play 521 00:49:11,289 --> 00:49:16,200 around with, if you want to.
And some of these implementation details that were 522 00:49:16,200 --> 00:49:20,450 very important in determining model outcomes were: "do we have a relative or 523 00:49:20,450 --> 00:49:25,289 absolute ranking?" So far, all of the systems I looked at had absolute rankings, 524 00:49:25,289 --> 00:49:30,700 but there's a point to be made for relative rankings. Do we have one score, 525 00:49:30,700 --> 00:49:35,089 where, basically, if you're a Chinese person you get one score? Or do we have 526 00:49:35,089 --> 00:49:40,880 different sub-scores in different fields? Do we have people reporting behavior, or 527 00:49:40,880 --> 00:49:46,369 do we have automatic behavior recording? How do you access other people's scores? 528 00:49:46,369 --> 00:49:50,339 How much information can you get from other people's scores? Currently, if 529 00:49:50,339 --> 00:49:55,529 someone is on a blacklist, for example, if you have their ID number, again, you can 530 00:49:55,529 --> 00:49:59,650 put it into this blacklist, and then it will say "oh, this person is on this 531 00:49:59,650 --> 00:50:04,630 blacklist for not following this judge's order," and then it says what kind of 532 00:50:04,630 --> 00:50:10,660 judge's order it was. So, most likely, it will be something like this. The idea is 533 00:50:10,660 --> 00:50:16,049 that the Social Credit system isn't only for individuals, but also for firms and 534 00:50:16,049 --> 00:50:22,219 for NGOs. So, what kind of roles will firms play in the system? I haven't looked 535 00:50:22,219 --> 00:50:28,319 at that in detail at this point, but it would be very interesting. Another idea 536 00:50:28,319 --> 00:50:34,390 that Western people often talk about is, do people also rank each other? Currently, 537 00:50:34,390 --> 00:50:39,390 that's not part of the system in China, but it might be at one point. And lastly, 538 00:50:39,390 --> 00:50:44,839 where does the aggregation happen?
So I've said that a lot of it is actually data 539 00:50:44,839 --> 00:50:53,749 sharing in China. So what kind of data is shared? Is the raw data shared? "Person A 540 00:50:53,749 --> 00:51:03,900 did something." Or is the aggregated data shared? "Person A got this score." At this 541 00:51:03,900 --> 00:51:07,890 point, most of the time, it is actually the raw data that is shared, but that also 542 00:51:07,890 --> 00:51:12,809 has sort of these data privacy issues, of course, that I've talked about. OK, 543 00:51:12,809 --> 00:51:18,950 perfect! No, there's 10 more minutes. Thank you for your attention! If you have 544 00:51:18,950 --> 00:51:24,270 questions or remarks, you can ask them now or you can catch me later. You can tweet 545 00:51:24,270 --> 00:51:29,829 at me or send me an e-mail, whatever you're interested in. Thank you very much! 546 00:51:29,829 --> 00:51:37,329 *applause* 547 00:51:37,329 --> 00:51:42,039 Herald Angel: Hello! As Toni said, we have 10 minutes left for questions. If you have 548 00:51:42,039 --> 00:51:46,981 a question in the room, please go crouch in front of our five microphones. If you're 549 00:51:46,981 --> 00:51:49,701 watching the stream, please ask your questions through IRC or Twitter, and 550 00:51:49,701 --> 00:51:53,847 we'll also try to make sure to get to those. Let's just go ahead and start with 551 00:51:53,847 --> 00:51:56,339 mic one. Question: Good! Thank you very much for 552 00:51:56,339 --> 00:52:02,809 this beautiful talk. I was wondering, how did the Chinese government, companies, and 553 00:52:02,809 --> 00:52:07,470 most of all, the citizens themselves, respond to you doing this research? Or, 554 00:52:07,470 --> 00:52:11,519 let's put it differently: if you would have been in the system yourself, 555 00:52:11,519 --> 00:52:14,440 how would your research affect your social credit score? 556 00:52:14,440 --> 00:52:17,270 *laughter* 557 00:52:17,270 --> 00:52:25,639 Answer: So, um...
There are actually two different responses that I've seen. When I 558 00:52:25,639 --> 00:52:31,380 talked to the government themselves, because I was there on a government scholarship, 559 00:52:31,380 --> 00:52:34,869 and mentioned that I'm really interested in this, they basically said: "oh well, this 560 00:52:34,869 --> 00:52:38,930 is just a technical system. You don't really need to be concerned with this. It 561 00:52:38,930 --> 00:52:43,439 is not very important. Just, you know, it's just a technicality. It's just for us 562 00:52:43,439 --> 00:52:49,299 to make life more efficient and better for everyone." So I assume my score would 563 00:52:49,299 --> 00:52:54,660 actually go down from doing this research. But when I talked to a lot of 564 00:52:54,660 --> 00:53:00,890 people at universities, they were very interested in my 565 00:53:00,890 --> 00:53:05,230 research, and a lot of them mentioned that they didn't even know that the system 566 00:53:05,230 --> 00:53:10,200 existed! Herald: Before we go to a question from 567 00:53:10,200 --> 00:53:14,729 our signal angel, a request for all the people leaving the room: please do so as 568 00:53:14,729 --> 00:53:20,500 quietly as possible, so we can continue this Q and A. The signal angel, please! 569 00:53:20,500 --> 00:53:26,249 Signal Angel: Jaenix wants to know, is this score actually influenced by 570 00:53:26,249 --> 00:53:31,549 association with people with a low score? Meaning, is there any peer pressure 571 00:53:31,549 --> 00:53:36,200 to stay away from people with bad scores? Answer: The Sesame Credit score definitely 572 00:53:36,200 --> 00:53:42,849 is influenced by your friends' scores; the Rongcheng score, so far, apparently, is 573 00:53:42,849 --> 00:53:47,619 not influenced, but it is definitely in the cards, and it is planned that it will 574 00:53:47,619 --> 00:53:53,800 be part of this.
I think WeChat, which is the main platform – it's sort of like 575 00:53:53,800 --> 00:53:59,880 WhatsApp, except it can do a lot, lot more – WeChat is still not connected to 576 00:53:59,880 --> 00:54:05,039 the Social Credit Score in Rongcheng. Once they do that, it will most likely also 577 00:54:05,039 --> 00:54:09,849 reflect your score. Herald: All right, let's continue with 578 00:54:09,849 --> 00:54:15,430 mic 3. Q: I have a question about your models. 579 00:54:15,430 --> 00:54:19,789 I'm wondering, what kind of interactions are you modeling? Or actions – like, what 580 00:54:19,789 --> 00:54:24,910 can the agents actually do? You mentioned moving somewhere else. And what else? 581 00:54:24,910 --> 00:54:31,190 A: Okay, so the way I set up my model was as a multilevel model. So I looked 582 00:54:31,190 --> 00:54:38,339 at different kinds of levels. I started out with, basically: they can evade taxes, 583 00:54:38,339 --> 00:54:46,890 they can get loans and repay loans, they can choose where to live, and they can 584 00:54:46,890 --> 00:54:54,479 follow traffic rules or not follow traffic rules. Because these were 585 00:54:54,479 --> 00:54:58,660 four big issues that were mentioned in all of the different systems, I started out 586 00:54:58,660 --> 00:55:04,810 with these issues and looked at what kind of behavior I see. I used some 587 00:55:04,810 --> 00:55:11,029 research – some friends of mine actually sent out surveys to people and 588 00:55:11,029 --> 00:55:16,299 asked them: "Well, you're now part of the system. Did your behavior change, and how 589 00:55:16,299 --> 00:55:23,109 did it change, depending on your responses, depending on your score, and depending on 590 00:55:23,109 --> 00:55:27,729 the score system that exists?"
And I basically used that, and some other 591 00:55:27,729 --> 00:55:34,430 research on nudging and on behavioral adaptation, to look at how likely it is 592 00:55:34,430 --> 00:55:39,160 that someone would change their behavior based on the score. 593 00:55:39,160 --> 00:55:42,109 Herald: All right, let's do another question from the interwebs. 594 00:55:42,109 --> 00:55:48,489 Q: Yeah, it's actually two questions in one. How does this system work for Chinese 595 00:55:48,489 --> 00:55:53,859 people living abroad, or for noncitizens that do business in China? 596 00:55:53,859 --> 00:56:00,529 A: Currently the system does not work for noncitizens that do business in China, 597 00:56:00,529 --> 00:56:04,700 because it works through the *shēnfènzhèng*. You only get a *shēnfènzhèng* if you're a 598 00:56:04,700 --> 00:56:10,789 Chinese citizen or you have lived in China for 10 or more years. So everyone who is not 599 00:56:10,789 --> 00:56:16,029 Chinese is currently excluded. Chinese people not living in China, if they have a 600 00:56:16,029 --> 00:56:21,099 *shēnfènzhèng*, are in this system, but there's not a lot of information. 601 00:56:21,099 --> 00:56:28,859 Herald: All right, mic 4. Q: Well, we've come a long way since the 602 00:56:28,859 --> 00:56:34,029 *Volkszählungsurteil*. Can you tell us anything about the dynamic in the 603 00:56:34,029 --> 00:56:43,700 time dimension? How quickly can I regain credit that was lost? Do you have any 604 00:56:43,700 --> 00:56:47,490 observations there? A: So in the Suining system, what they 605 00:56:47,490 --> 00:56:53,519 actually did was have a very, very strict period. So if you evaded taxes, your 606 00:56:53,519 --> 00:56:58,599 score would be down for two years and then it would rebound. In the Rongcheng 607 00:56:58,599 --> 00:57:03,359 system, they did not publish this kind of period. So my assumption is that it's 608 00:57:03,359 --> 00:57:08,760 going to be more on a case-by-case basis.
Because I looked at the Chinese data, I 609 00:57:08,760 --> 00:57:13,849 looked at the Chinese policy documents, and for most of the 610 00:57:13,849 --> 00:57:20,019 stuff they didn't say how long it would count. For the blacklists, which were kind 611 00:57:20,019 --> 00:57:24,630 of the predecessor that we currently look at, the way it works is you stay on 612 00:57:24,630 --> 00:57:29,309 there until whatever the reason for the 613 00:57:29,309 --> 00:57:34,510 blacklist is has been resolved. So you stay on there until you send off the 614 00:57:34,510 --> 00:57:40,039 apology that the judge ordered you to. And then, usually, you still need to apply to 615 00:57:40,039 --> 00:57:44,979 get off. So for blacklists, it does not work that you automatically 616 00:57:44,979 --> 00:57:49,410 get off. You need to apply, you need to show that you've done what they've asked 617 00:57:49,410 --> 00:57:53,089 you to do, and then you can get off this blacklist. And I assume it will be a 618 00:57:53,089 --> 00:57:57,019 similar sort of appeals procedure for the system. 619 00:57:57,019 --> 00:58:04,069 Herald: All right. Let's go to mic 2. Q: Thank you. I just wanted to ask if looking 620 00:58:04,069 --> 00:58:08,910 up someone else's data in detail, like position et cetera, affects your own 621 00:58:08,910 --> 00:58:12,049 score? A: Currently, it apparently does not, or 622 00:58:12,049 --> 00:58:17,369 at least they haven't published that it does. It might in the future, but most 623 00:58:17,369 --> 00:58:22,549 likely it's actually behavior that they want. So they want you to look up other 624 00:58:22,549 --> 00:58:26,480 people's scores before doing business with them. They want you to, basically, use 625 00:58:26,480 --> 00:58:30,069 this to decide who you're going to associate with. 626 00:58:30,069 --> 00:58:33,200 Q: Thank you!
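[Editor's illustration] The multilevel agent-based model described a few questions back (agents who can evade taxes, default on loans, or break traffic rules, with a score that nudges future behavior) could be sketched roughly as follows in Python. Everything here – the penalty values, the nudging coefficient, the score bounds, all names – is a hypothetical illustration, not taken from the actual research:

```python
import random

# Hypothetical sketch of a multilevel agent-based model: agents can evade
# taxes, default on loans, or break traffic rules; each "bad" choice lowers
# a social-credit-style score, and a lower score nudges the agent toward
# compliance on later decisions. All numbers are illustrative assumptions.

PENALTIES = {
    "evade_taxes": -30.0,
    "default_on_loan": -20.0,
    "break_traffic_rule": -5.0,
}

class Agent:
    def __init__(self, score=100.0, compliance=0.7):
        self.score = score            # current score, starts at a neutral 100
        self.compliance = compliance  # baseline probability of complying

    def step(self, rng):
        for action, penalty in PENALTIES.items():
            # Assumed nudging effect: the further the score falls below 100,
            # the more likely the agent is to comply next time.
            p_comply = min(1.0, self.compliance
                                + max(0.0, 100.0 - self.score) * 0.002)
            if rng.random() < p_comply:
                self.score = min(150.0, self.score + 1.0)     # small reward
            else:
                self.score = max(0.0, self.score + penalty)   # floored at 0

def average_score(n_agents=500, n_rounds=50, seed=42):
    """Run the simulation and return the population's mean final score."""
    rng = random.Random(seed)
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_rounds):
        for agent in agents:
            agent.step(rng)
    return sum(a.score for a in agents) / n_agents
```

Varying the assumed nudging coefficient is then one way to ask how strongly score feedback shifts aggregate behavior – the kind of question the surveys mentioned in the answer were meant to inform.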
Herald: All right, do we have another 627 00:58:33,200 --> 00:58:38,749 question from the Internet, maybe? Signal: Yes, I do! Stand by... The question 628 00:58:38,749 --> 00:58:47,569 is, how is this actually implemented for the offline rural population in China? 629 00:58:47,569 --> 00:58:53,119 A: Quite easily: not at all, at this point. The idea is that by 2020 they will 630 00:58:53,119 --> 00:58:59,089 actually have all of this implemented. But the 631 00:58:59,089 --> 00:59:05,869 offline rural population in China is getting smaller and smaller. Even in rural 632 00:59:05,869 --> 00:59:13,039 villages, about 50-60% of people are online. And most of them are 633 00:59:13,039 --> 00:59:16,049 online via smartphone, and their smartphone is connected to the 634 00:59:16,049 --> 00:59:21,309 *shēnfènzhèng*. So it's not very complicated to do that for everyone who is online. For 635 00:59:21,309 --> 00:59:25,989 everyone who's offline, of course, this is more problematic, but I think the end 636 00:59:25,989 --> 00:59:31,799 goal is to not have people offline at all. Herald: All right. Let's jump right back 637 00:59:31,799 --> 00:59:38,059 to microphone 2, please. Q: Thank you for the very good and 638 00:59:38,059 --> 00:59:44,940 frightening talk so far. First, I have to correct you on one point. In Germany we 639 00:59:44,940 --> 00:59:50,279 have a similar system, because we have this tax I.D., which is assigned from birth and 640 00:59:50,279 --> 00:59:58,769 is kept until 30 years after a person's death. Yeah. So we have a lifelong I.D. 641 00:59:58,769 --> 01:00:02,249 A: You're right. I just... I don't know mine, so I figured… *dismissive sound* 642 01:00:02,249 --> 01:00:08,130 Q: No problem! But at least we could establish a similar system, if we had a 643 01:00:08,130 --> 01:00:16,150 government that wanted it. A question for you: you mentioned this "*guanxi*." Is 644 01:00:16,150 --> 01:00:20,789 it a kind of social network? I didn't really understand it. 645 01:00:20,789 --> 01:00:26,000 A: Yes, it is a kind of social network, but one that is a lot more based on 646 01:00:26,000 --> 01:00:31,930 hierarchies than it is in the West. So you have people that are above you and people 647 01:00:31,930 --> 01:00:36,519 that are below you. And the expectation is that, while it's a quid pro quo, people 648 01:00:36,519 --> 01:00:41,539 that are above you in the hierarchy will give you less than you will give to them. 649 01:00:41,539 --> 01:00:46,529 Q: Aha, okay. Herald: OK, all right. Unfortunately, we 650 01:00:46,529 --> 01:00:52,099 are out of time, so please give another huge round of applause for Toni! 651 01:00:52,099 --> 01:00:54,394 *applause* 652 01:00:54,394 --> 01:00:57,842 *postroll music* 653 01:00:57,842 --> 01:01:17,000 Subtitles created by c3subtitles.de in the year 2019. Join, and help us!