0 00:00:00,000 --> 00:00:30,000 Dear viewer, these subtitles were generated by a machine via the service Trint and therefore are (very) buggy. If you are capable, please help us to create good quality subtitles: https://c3subtitles.de/talk/665 Thanks! 1 00:00:13,140 --> 00:00:14,879 I want to 2 00:00:14,880 --> 00:00:16,378 introduce Lisa Rost. 3 00:00:16,379 --> 00:00:17,969 Lisa loves to design data 4 00:00:17,970 --> 00:00:20,219 visualizations. She is currently the 5 00:00:20,220 --> 00:00:22,469 Mozilla OpenNews Fellow at 6 00:00:22,470 --> 00:00:25,529 National Public Radio in Washington, DC. 7 00:00:25,530 --> 00:00:28,019 If any of you do not know NPR, 8 00:00:28,020 --> 00:00:29,969 let me tell you it was the number one 9 00:00:29,970 --> 00:00:31,889 thing I personally missed in the decade 10 00:00:31,890 --> 00:00:34,349 from '99 to 2011 11 00:00:34,350 --> 00:00:35,459 that I lived in Germany. 12 00:00:35,460 --> 00:00:38,009 Almost as much as I miss currywurst 13 00:00:38,010 --> 00:00:39,419 in the States now. 14 00:00:39,420 --> 00:00:41,579 No, seriously, it's 15 00:00:41,580 --> 00:00:43,259 amazing. It's way better than public 16 00:00:43,260 --> 00:00:45,359 radio here. You have way better public TV 17 00:00:45,360 --> 00:00:47,429 than we do, but we have way better public 18 00:00:47,430 --> 00:00:49,499 radio. So if you get an opportunity, check out 19 00:00:49,500 --> 00:00:51,659 NPR.org and look for shows like 20 00:00:51,660 --> 00:00:53,969 Radiolab, All Things Considered, Fresh 21 00:00:53,970 --> 00:00:56,100 Air. But anyway... 22 00:00:57,630 --> 00:00:59,639 Absolutely. Any other suggestions in the 23 00:00:59,640 --> 00:01:00,640 audience? 24 00:01:02,200 --> 00:01:04,378 Yeah, absolutely. 25 00:01:04,379 --> 00:01:05,729 Morning Edition. 26 00:01:05,730 --> 00:01:07,199 I mean, I'm in New York City, so Brian 27 00:01:07,200 --> 00:01:08,610 Lehrer, lots of different options. 28 00:01:09,990 --> 00:01:11,519 But anyway, enough about Lisa's 29 00:01:11,520 --> 00:01:12,419 workplace. 30 00:01:12,420 --> 00:01:13,889 Lisa creates visualizations 31 00:01:13,890 --> 00:01:15,419 for NPR now. 32 00:01:15,420 --> 00:01:17,459 But previously, when in Berlin, she 33 00:01:17,460 --> 00:01:19,739 designed for and with OpenDataCity 34 00:01:19,740 --> 00:01:22,199 and Tagesspiegel, 35 00:01:22,200 --> 00:01:24,179 taught data visualization at universities, and 36 00:01:24,180 --> 00:01:27,029 also organized the Data Vis meetup. 37 00:01:27,030 --> 00:01:27,929 This is one of the talks I was looking 38 00:01:27,930 --> 00:01:29,219 forward to seeing, and so it's really 39 00:01:29,220 --> 00:01:31,049 a privilege for me to be able to 40 00:01:31,050 --> 00:01:31,949 introduce Lisa. 41 00:01:31,950 --> 00:01:33,689 But you're not here to see me. 42 00:01:33,690 --> 00:01:34,739 So, Lisa, take it away. 43 00:01:46,730 --> 00:01:47,929 Thank you so much. 44 00:01:47,930 --> 00:01:49,099 Thanks for having me, and thanks for the 45 00:01:49,100 --> 00:01:50,029 introduction. 46 00:01:50,030 --> 00:01:51,649 I'm not sure if you got everything. 47 00:01:51,650 --> 00:01:53,989 I think what you need to know about 48 00:01:53,990 --> 00:01:56,269 me is that I take numbers 49 00:01:56,270 --> 00:01:57,620 and I put them into something like that, 50 00:01:59,480 --> 00:02:00,829 and I've done it for a long time now, and 51 00:02:00,830 --> 00:02:02,329 I really like it.
52 00:02:02,330 --> 00:02:05,269 And data visualization is definitely 53 00:02:05,270 --> 00:02:07,399 hyped, like, 54 00:02:07,400 --> 00:02:09,499 the demand increased over the last five or 55 00:02:09,500 --> 00:02:11,959 10 years. Everybody wants to do it now, 56 00:02:11,960 --> 00:02:14,179 but only recently have we started 57 00:02:14,180 --> 00:02:17,419 thinking about whether these facts and figures 58 00:02:17,420 --> 00:02:19,310 that are visualized here actually matter. 59 00:02:21,110 --> 00:02:22,549 So that's what we did, especially this 60 00:02:22,550 --> 00:02:24,619 year. Like, this year we 61 00:02:24,620 --> 00:02:26,719 found out as a society, especially in 62 00:02:26,720 --> 00:02:28,879 the U.S., that some people 63 00:02:28,880 --> 00:02:31,099 take feelings and perceive them as 64 00:02:31,100 --> 00:02:32,269 truth. 65 00:02:32,270 --> 00:02:34,129 So today I want to ask the question: can 66 00:02:34,130 --> 00:02:36,229 you take truth and make 67 00:02:36,230 --> 00:02:37,609 it evoke feelings? 68 00:02:37,610 --> 00:02:40,129 Can we take data points and 69 00:02:40,130 --> 00:02:42,079 make people care about them, especially 70 00:02:42,080 --> 00:02:44,959 data points that can walk into a bar, 71 00:02:44,960 --> 00:02:45,960 meaning humans? 72 00:02:47,060 --> 00:02:49,159 And I want to answer this question in three 73 00:02:49,160 --> 00:02:51,349 parts. In the first part I want 74 00:02:51,350 --> 00:02:53,149 to talk about why feelings are good and 75 00:02:53,150 --> 00:02:55,249 why feelings are bad, why we shouldn't have 76 00:02:55,250 --> 00:02:57,169 them and why we should have them. 77 00:02:57,170 --> 00:02:59,569 And in the second part, I will shortly 78 00:02:59,570 --> 00:03:01,249 talk about what it all has to do with 79 00:03:01,250 --> 00:03:03,139 data visualization, and in the third part I 80 00:03:03,140 --> 00:03:05,360 want to bring specific examples. 81 00:03:06,890 --> 00:03:08,209 So, feelings: 82 00:03:09,530 --> 00:03:11,389 we all have these blurry things that 83 00:03:11,390 --> 00:03:13,009 sometimes come to us and then sometimes 84 00:03:13,010 --> 00:03:14,010 they leave. 85 00:03:14,630 --> 00:03:16,079 These are only some of them. 86 00:03:16,080 --> 00:03:17,599 So like right now, for example, I'm 87 00:03:17,600 --> 00:03:19,249 pretty happy to be here. 88 00:03:19,250 --> 00:03:21,199 I'm also grateful for being here. 89 00:03:21,200 --> 00:03:23,449 And I also have like a small 90 00:03:23,450 --> 00:03:25,519 version of fear, which is nervousness. 91 00:03:25,520 --> 00:03:27,020 Actually, that's a huge part right now. 92 00:03:28,700 --> 00:03:30,769 But there are two feelings here 93 00:03:30,770 --> 00:03:32,479 that I listed separately, and these are 94 00:03:32,480 --> 00:03:34,429 empathy and compassion. 95 00:03:34,430 --> 00:03:36,619 Because empathy and compassion we 96 00:03:36,620 --> 00:03:38,809 direct towards somebody else, and 97 00:03:38,810 --> 00:03:41,299 they can have as content everything 98 00:03:41,300 --> 00:03:42,300 that's on this list. 99 00:03:43,700 --> 00:03:45,349 So let's define some terms here. 100 00:03:45,350 --> 00:03:47,059 Let's make some 101 00:03:47,060 --> 00:03:48,199 distinctions. 102 00:03:48,200 --> 00:03:49,200 Empathy 103 00:03:50,270 --> 00:03:52,609 is when we feel the feeling 104 00:03:52,610 --> 00:03:55,009 somebody else has; we put ourselves 105 00:03:55,010 --> 00:03:56,119 in their shoes.
106 00:03:56,120 --> 00:03:57,589 If you would have empathy with me right 107 00:03:57,590 --> 00:03:59,330 now, you would also feel nervous. 108 00:04:00,980 --> 00:04:03,259 And compassion is more like 109 00:04:03,260 --> 00:04:04,159 sympathy. 110 00:04:04,160 --> 00:04:05,539 Like, if you would have compassion with 111 00:04:05,540 --> 00:04:07,279 me, you would not be nervous. 112 00:04:07,280 --> 00:04:08,269 You wouldn't have these negative 113 00:04:08,270 --> 00:04:10,429 feelings, but you would 114 00:04:10,430 --> 00:04:11,929 feel more positive feelings, like 115 00:04:11,930 --> 00:04:13,190 love or sympathy for me. 116 00:04:15,300 --> 00:04:17,518 And almost everybody on this planet can 117 00:04:17,519 --> 00:04:19,979 agree that having empathy and compassion 118 00:04:19,980 --> 00:04:21,569 is better than not having empathy and 119 00:04:21,570 --> 00:04:22,619 compassion. 120 00:04:22,620 --> 00:04:24,869 A quick Google search found that 121 00:04:24,870 --> 00:04:26,879 smart people on the internet think that 122 00:04:26,880 --> 00:04:28,859 empathy is "the only way we will survive," 123 00:04:28,860 --> 00:04:29,969 that 124 00:04:29,970 --> 00:04:32,309 "empathy makes for good people 125 00:04:32,310 --> 00:04:34,589 and good people make for good societies." 126 00:04:34,590 --> 00:04:36,929 And Obama is a big fan of empathy, saying 127 00:04:36,930 --> 00:04:39,089 empathy "makes it harder 128 00:04:39,090 --> 00:04:40,709 not to act, harder not to help." 129 00:04:42,060 --> 00:04:44,159 So empathy is something that comes naturally to all 130 00:04:44,160 --> 00:04:46,169 of us. We are really good at that. 131 00:04:46,170 --> 00:04:47,549 We are really good at directing our 132 00:04:47,550 --> 00:04:48,989 empathy to one of us. 133 00:04:48,990 --> 00:04:51,119 Like, for example, this child that most 134 00:04:51,120 --> 00:04:53,369 of you already know, who was 135 00:04:53,370 --> 00:04:55,259 washed ashore in 2015 in Turkey. 136 00:04:57,120 --> 00:04:58,679 But we have some trouble directing our 137 00:04:58,680 --> 00:05:01,139 empathy to lots of people. 138 00:05:01,140 --> 00:05:03,209 Imagine I would have brought my 139 00:05:03,210 --> 00:05:05,109 beloved clone and she would have given 140 00:05:05,110 --> 00:05:07,169 this talk here at the stage, slightly 141 00:05:07,170 --> 00:05:09,389 similar, but also slightly different. 142 00:05:09,390 --> 00:05:11,519 It would be hard for you to direct your 143 00:05:11,520 --> 00:05:14,099 attention towards me and towards her talk 144 00:05:14,100 --> 00:05:16,229 at the same time, and it would 145 00:05:16,230 --> 00:05:18,369 be really hard if all of us would be on 146 00:05:18,370 --> 00:05:20,429 that stage, except you, and you 147 00:05:20,430 --> 00:05:22,110 would listen to all of us giving a talk. 148 00:05:23,370 --> 00:05:24,689 I'm not sure if you would get anything 149 00:05:24,690 --> 00:05:26,759 out of the talks, and it would be 150 00:05:26,760 --> 00:05:29,369 super hard to direct your attention, 151 00:05:29,370 --> 00:05:30,809 and it's the same with empathy. 152 00:05:30,810 --> 00:05:32,579 We are really good at directing 153 00:05:32,580 --> 00:05:35,009 empathy to one person, but we are really, 154 00:05:35,010 --> 00:05:37,379 really bad at directing it to lots 155 00:05:37,380 --> 00:05:39,389 of people.
That's why we can have 156 00:05:39,390 --> 00:05:41,179 empathy with this one kid, but not with 157 00:05:41,180 --> 00:05:43,029 the three thousand seven hundred seventy 158 00:05:43,030 --> 00:05:45,179 other people who also died 159 00:05:45,180 --> 00:05:47,190 crossing the Mediterranean Sea in 2015. 160 00:05:48,630 --> 00:05:49,859 And I mean, that doesn't scale up. 161 00:05:49,860 --> 00:05:51,329 That's not something that you feel; 162 00:05:51,330 --> 00:05:53,189 it's a little bit blurry, etc. 163 00:05:53,190 --> 00:05:55,629 But this photo of the kid 164 00:05:55,630 --> 00:05:58,139 actually led to huge donations 165 00:05:58,140 --> 00:05:59,279 to the Red Cross. 166 00:05:59,280 --> 00:06:00,989 And it also changed politics. 167 00:06:00,990 --> 00:06:03,149 The Guardian wrote that European leaders 168 00:06:03,150 --> 00:06:04,589 have been shocked into forming more 169 00:06:04,590 --> 00:06:06,719 compassionate policies, and previously 170 00:06:06,720 --> 00:06:08,369 hostile media outlets took a more 171 00:06:08,370 --> 00:06:10,679 conciliatory tone. And 172 00:06:10,680 --> 00:06:12,959 look at the wording, "shocked into": 173 00:06:12,960 --> 00:06:14,759 the emotions were the reason for that. 174 00:06:14,760 --> 00:06:17,639 It almost feels like they were forced 175 00:06:17,640 --> 00:06:20,279 to form these 176 00:06:20,280 --> 00:06:22,889 policies, to have these implications, 177 00:06:22,890 --> 00:06:24,059 thanks to the emotions. 178 00:06:25,980 --> 00:06:28,169 So that's the conclusion out of 179 00:06:28,170 --> 00:06:30,329 all of this: our lives are actually 180 00:06:30,330 --> 00:06:33,329 not all worth the same to us. 181 00:06:33,330 --> 00:06:35,789 We value lives more 182 00:06:35,790 --> 00:06:38,069 if it's only like one or two or three, 183 00:06:38,070 --> 00:06:40,379 and the more lives there are, 184 00:06:40,380 --> 00:06:42,539 the harder they are to process, and 185 00:06:42,540 --> 00:06:44,999 we don't really have much empathy 186 00:06:45,000 --> 00:06:46,000 left for them. 187 00:06:47,250 --> 00:06:49,439 That's pretty sad. 188 00:06:49,440 --> 00:06:50,440 I mean... 189 00:06:51,140 --> 00:06:52,819 Yeah, it's really easy for us to 190 00:06:52,820 --> 00:06:55,579 connect with this one person and to have 191 00:06:55,580 --> 00:06:57,499 empathy for her, but we can't multiply 192 00:06:57,500 --> 00:06:58,669 it for three thousand seven hundred 193 00:06:58,670 --> 00:06:59,670 seventy. 194 00:07:01,100 --> 00:07:02,869 So we could conclude that feelings 195 00:07:02,870 --> 00:07:04,999 just suck 196 00:07:05,000 --> 00:07:06,000 for us. 197 00:07:06,500 --> 00:07:07,909 OK, they're not the worst, but they're not 198 00:07:07,910 --> 00:07:09,979 really accurate, like, 199 00:07:09,980 --> 00:07:11,629 in how good and bad they are. 200 00:07:11,630 --> 00:07:12,709 You have one candy: 201 00:07:12,710 --> 00:07:14,029 it's good. You have 202 00:07:14,030 --> 00:07:15,589 a second candy: it's not like that 203 00:07:15,590 --> 00:07:17,429 makes you twice as happy. 204 00:07:17,430 --> 00:07:18,949 It's the same with bad feelings. 205 00:07:18,950 --> 00:07:20,839 If you have one death, it's bad. 206 00:07:20,840 --> 00:07:22,249 If you have two deaths, that's not 207 00:07:22,250 --> 00:07:25,549 double; that's not making you twice as sad, 208 00:07:25,550 --> 00:07:28,069 and it might be even worse than that.
209 00:07:28,070 --> 00:07:30,349 Paul Slovic, who did a 210 00:07:30,350 --> 00:07:32,509 lot of research into empathy, he did 211 00:07:32,510 --> 00:07:35,059 a study where he showed people 212 00:07:35,060 --> 00:07:37,099 a child. Take this child, for example: 213 00:07:37,100 --> 00:07:38,179 she has a miserable life. 214 00:07:38,180 --> 00:07:40,099 She's living in Syria and you can help 215 00:07:40,100 --> 00:07:42,229 her. You can make a donation and she 216 00:07:42,230 --> 00:07:43,729 will have a better life. 217 00:07:43,730 --> 00:07:44,749 Would you donate? 218 00:07:44,750 --> 00:07:47,809 I think a lot of people would; 219 00:07:47,810 --> 00:07:49,209 again, empathy is something we are 220 00:07:49,210 --> 00:07:50,599 really good at. 221 00:07:50,600 --> 00:07:52,399 But then I tell you: OK, you can 222 00:07:52,400 --> 00:07:54,499 help that child, but you 223 00:07:54,500 --> 00:07:56,509 can't help her neighbor. She's 224 00:07:56,510 --> 00:07:58,399 staying there, and she will have 225 00:07:58,400 --> 00:08:00,499 like a very uncertain future, and all 226 00:08:00,500 --> 00:08:02,299 the donations will go to this first child, 227 00:08:02,300 --> 00:08:04,159 but not to the second one. 228 00:08:04,160 --> 00:08:05,899 And then I tell you the same 229 00:08:05,900 --> 00:08:07,609 thing about all these other children, that 230 00:08:07,610 --> 00:08:09,769 you can't help them. And 40 231 00:08:09,770 --> 00:08:12,079 percent, a 40 percent drop 232 00:08:12,080 --> 00:08:14,449 in donations is what the researchers 233 00:08:14,450 --> 00:08:17,359 have seen towards that first child. 234 00:08:17,360 --> 00:08:19,459 People are less likely to help if they 235 00:08:19,460 --> 00:08:21,439 are reminded of all the people they can't 236 00:08:21,440 --> 00:08:23,149 help. You want to make a difference. 237 00:08:23,150 --> 00:08:25,339 There's a term for it, the "warm glow": 238 00:08:25,340 --> 00:08:27,679 you want to feel good about helping. 239 00:08:27,680 --> 00:08:29,769 Yeah, 240 00:08:29,770 --> 00:08:30,949 we want to feel like we make a 241 00:08:30,950 --> 00:08:32,149 difference, and we don't want to get 242 00:08:32,150 --> 00:08:34,459 reminded of all the difference we can't 243 00:08:34,460 --> 00:08:35,460 make. 244 00:08:36,400 --> 00:08:38,489 So maybe it's even like that: 245 00:08:38,490 --> 00:08:40,629 maybe we actually 246 00:08:40,630 --> 00:08:42,969 value one life even more than 247 00:08:42,970 --> 00:08:45,159 3,000 lives. 3,000 lives mean 248 00:08:45,160 --> 00:08:47,409 nothing to us, but in one life we're really 249 00:08:47,410 --> 00:08:49,509 invested; it's a huge difference 250 00:08:49,510 --> 00:08:51,759 for us if one child dies 251 00:08:51,760 --> 00:08:53,319 or not. But it doesn't really matter if 252 00:08:53,320 --> 00:08:54,819 three thousand seven hundred seventy people 253 00:08:54,820 --> 00:08:56,139 die, or three thousand seven hundred 254 00:08:56,140 --> 00:08:57,140 seventy-one. 255 00:08:58,970 --> 00:09:01,679 That sucks, doesn't it? 256 00:09:01,680 --> 00:09:02,879 It's really... OK, 257 00:09:02,880 --> 00:09:05,179 yeah, feelings are not accurate, 258 00:09:05,180 --> 00:09:07,339 and there is this weird "ego wants to 259 00:09:07,340 --> 00:09:09,049 have impact" thing going on. 260 00:09:09,050 --> 00:09:10,909 It makes me pretty angry.
261 00:09:12,620 --> 00:09:14,299 How about we don't do feelings 262 00:09:14,300 --> 00:09:17,269 anymore and 263 00:09:17,270 --> 00:09:18,229 just do everything 264 00:09:18,230 --> 00:09:20,300 rationally? Rationality is good. 265 00:09:23,960 --> 00:09:25,819 Paul Bloom is a big opponent of 266 00:09:25,820 --> 00:09:27,469 empathy. He says: if you want to be good 267 00:09:27,470 --> 00:09:30,169 and do good, empathy is a poor guide. 268 00:09:30,170 --> 00:09:31,459 And it's not just 269 00:09:31,460 --> 00:09:33,439 because empathy has this "you 270 00:09:33,440 --> 00:09:35,689 can't really value lives properly" 271 00:09:35,690 --> 00:09:37,759 problem, but also because 272 00:09:37,760 --> 00:09:39,709 empathy is not fair. 273 00:09:39,710 --> 00:09:42,019 We tend to... 274 00:09:42,020 --> 00:09:43,789 we tend to help people and to have 275 00:09:43,790 --> 00:09:46,099 empathy towards people who look like us, 276 00:09:46,100 --> 00:09:48,199 who are looking cuter, who look 277 00:09:48,200 --> 00:09:50,059 more like they need help. 278 00:09:50,060 --> 00:09:51,799 That's not the case with rationality. 279 00:09:51,800 --> 00:09:53,119 Of course, it would treat everybody the 280 00:09:53,120 --> 00:09:54,120 same. 281 00:09:54,530 --> 00:09:56,689 But then again, why would we treat anybody 282 00:09:56,690 --> 00:09:59,269 at all? Because what should I tell 283 00:09:59,270 --> 00:10:01,159 my rational self? Don't care? 284 00:10:01,160 --> 00:10:02,629 Why should I care for some child 285 00:10:02,630 --> 00:10:04,699 that has maybe some 286 00:10:04,700 --> 00:10:06,949 very faraway impact 287 00:10:06,950 --> 00:10:09,349 on the economy, but actually, you know, 288 00:10:09,350 --> 00:10:10,350 I don't care. 289 00:10:11,360 --> 00:10:14,479 So maybe rationality is actually also 290 00:10:14,480 --> 00:10:15,589 not the solution. 291 00:10:15,590 --> 00:10:17,719 I think you need both. 292 00:10:17,720 --> 00:10:20,329 We need the numbers and the feelings. 293 00:10:20,330 --> 00:10:22,459 We need numbers and narrative, 294 00:10:22,460 --> 00:10:25,579 anecdotes and abstraction. 295 00:10:25,580 --> 00:10:27,289 We need the slow system and the fast one, 296 00:10:27,290 --> 00:10:28,999 if you know Kahneman and "Thinking, Fast and Slow." 297 00:10:30,570 --> 00:10:32,339 I think that's how it should be: we should 298 00:10:32,340 --> 00:10:34,739 make people care about a topic, 299 00:10:34,740 --> 00:10:36,299 and then we should tell them what to do 300 00:10:36,300 --> 00:10:38,549 and how to do it in a very rational way. 301 00:10:38,550 --> 00:10:40,049 First, we want to show them that they 302 00:10:40,050 --> 00:10:41,669 should do something and we want them to 303 00:10:41,670 --> 00:10:43,829 decide to do something, and then we can 304 00:10:43,830 --> 00:10:45,989 show them, in a very rational way, how 305 00:10:45,990 --> 00:10:47,580 to do it and what to do. 306 00:10:49,310 --> 00:10:51,589 And that's what I want to talk about 307 00:10:51,590 --> 00:10:53,059 in the rest of the talk. 308 00:10:53,060 --> 00:10:54,529 So first, data visualization: what does 309 00:10:54,530 --> 00:10:55,530 it have to do with that? 310 00:10:56,900 --> 00:10:58,639 Well, most data visualization still looks 311 00:10:58,640 --> 00:10:59,640 like that. 312 00:11:00,770 --> 00:11:02,299 It doesn't really do justice to the 313 00:11:02,300 --> 00:11:03,799 people that we represent.
314 00:11:03,800 --> 00:11:06,079 These are charts about malaria deaths and 315 00:11:06,080 --> 00:11:08,689 traffic fatalities and unemployed people, 316 00:11:08,690 --> 00:11:10,819 and they don't really, you know, make 317 00:11:10,820 --> 00:11:11,719 me care so much. 318 00:11:11,720 --> 00:11:13,399 It's like we just see them on a number 319 00:11:13,400 --> 00:11:15,499 scale. It really speaks to my 320 00:11:15,500 --> 00:11:16,500 analytical self. 321 00:11:18,550 --> 00:11:20,289 And I think some of you might say: oh, 322 00:11:20,290 --> 00:11:21,729 that's actually good, because data 323 00:11:21,730 --> 00:11:23,709 visualization is supposed to speak to the 324 00:11:23,710 --> 00:11:25,719 rational mind, and I actually like 325 00:11:25,720 --> 00:11:27,879 that it doesn't try to manipulate me 326 00:11:27,880 --> 00:11:29,979 like these super manipulative, 327 00:11:29,980 --> 00:11:32,859 emotional photos of kids 328 00:11:32,860 --> 00:11:33,860 at the shore. 329 00:11:34,690 --> 00:11:36,289 And I would say: OK, that's fair. 330 00:11:36,290 --> 00:11:37,689 It really depends on your goals in the 331 00:11:37,690 --> 00:11:39,759 end. I think data visualization 332 00:11:39,760 --> 00:11:42,039 is just a tool: you can use 333 00:11:42,040 --> 00:11:44,199 it to represent data in a very, 334 00:11:44,200 --> 00:11:46,149 very objective way. 335 00:11:46,150 --> 00:11:48,219 But you should also be able to do 336 00:11:48,220 --> 00:11:50,349 something like that, that speaks more 337 00:11:50,350 --> 00:11:51,909 to your emotions. 338 00:11:51,910 --> 00:11:53,139 In the end, it's like language. 339 00:11:53,140 --> 00:11:55,449 Language can also be super objective 340 00:11:55,450 --> 00:11:57,879 and, like, super harsh and cold, 341 00:11:57,880 --> 00:12:00,099 or you can have poems that 342 00:12:00,100 --> 00:12:01,210 make you really feel things. 343 00:12:04,550 --> 00:12:06,469 So data visualization is a tool. 344 00:12:06,470 --> 00:12:08,659 And if you want to evoke emotions 345 00:12:08,660 --> 00:12:10,009 with data visualization, I feel like 346 00:12:10,010 --> 00:12:12,109 we should be able to; we should build a 347 00:12:12,110 --> 00:12:14,239 toolkit to make that 348 00:12:14,240 --> 00:12:16,459 possible, even if 349 00:12:16,460 --> 00:12:18,619 most data visualization will still stay 350 00:12:18,620 --> 00:12:20,749 on the rational side, I 351 00:12:20,750 --> 00:12:21,750 swear. 352 00:12:22,640 --> 00:12:23,959 So that's the rest of my talk: 353 00:12:23,960 --> 00:12:26,779 how to make feelings with data visualization. 354 00:12:26,780 --> 00:12:28,309 I have some ideas, but I'm also very 355 00:12:28,310 --> 00:12:29,830 happy to hear yours. 356 00:12:31,880 --> 00:12:33,419 First of all, an easy one, a simple one: 357 00:12:33,420 --> 00:12:34,959 make use of colors. 358 00:12:34,960 --> 00:12:36,049 We're all using colors 359 00:12:36,050 --> 00:12:37,340 anyway in data visualization. 360 00:12:38,690 --> 00:12:39,769 This, for example, is 361 00:12:40,970 --> 00:12:43,249 a Syria tracker that tracks the deaths 362 00:12:43,250 --> 00:12:45,619 of Syrian people, and it comes 363 00:12:45,620 --> 00:12:47,749 in a very comfortable, cozy, 364 00:12:47,750 --> 00:12:50,239 like, very 365 00:12:50,240 --> 00:12:52,759 nice-looking blue. 366 00:12:52,760 --> 00:12:54,619 I think that really it's not what it's 367 00:12:54,620 --> 00:12:56,989 about.
In the end, I feel like, again, 368 00:12:56,990 --> 00:12:59,089 we would do more justice to the 369 00:12:59,090 --> 00:13:01,099 people we represent if we would show it 370 00:13:01,100 --> 00:13:02,509 in a slightly different way. 371 00:13:02,510 --> 00:13:03,679 And that's the least you can do. 372 00:13:05,030 --> 00:13:07,399 In fact, the three most intense, 373 00:13:07,400 --> 00:13:09,559 most emotional data visualizations 374 00:13:09,560 --> 00:13:11,629 of the past years 375 00:13:11,630 --> 00:13:13,879 all make a very impactful 376 00:13:13,880 --> 00:13:16,069 use of colors. 377 00:13:16,070 --> 00:13:18,169 The first one is about gun 378 00:13:18,170 --> 00:13:19,159 deaths in the U.S. 379 00:13:19,160 --> 00:13:21,349 The second one is about victims of 380 00:13:21,350 --> 00:13:23,149 the Second World War, and the third is 381 00:13:23,150 --> 00:13:25,429 about deaths from drones. 382 00:13:25,430 --> 00:13:26,959 And they all have black as a background 383 00:13:26,960 --> 00:13:29,029 and all have these highlights 384 00:13:29,030 --> 00:13:31,189 of colors. And 385 00:13:31,190 --> 00:13:33,139 yeah, this is really one of the simple 386 00:13:33,140 --> 00:13:35,839 tools to create 387 00:13:35,840 --> 00:13:36,840 empathy. 388 00:13:38,370 --> 00:13:41,249 Next: show what the data would mean for your experience. 389 00:13:41,250 --> 00:13:43,049 That's an interesting one. Maybe 390 00:13:43,050 --> 00:13:44,249 you can imagine three layers. 391 00:13:44,250 --> 00:13:46,469 First, you have the data as a table, 392 00:13:46,470 --> 00:13:48,569 just black and white, like, all your 393 00:13:48,570 --> 00:13:49,739 numbers. 394 00:13:49,740 --> 00:13:51,629 You don't really understand them all, and 395 00:13:51,630 --> 00:13:53,219 they don't have any meaning for you 396 00:13:53,220 --> 00:13:54,899 because you don't understand them. 397 00:13:54,900 --> 00:13:56,729 So that's why you add the visualization. 398 00:13:56,730 --> 00:13:58,469 That's what makes you understand the 399 00:13:58,470 --> 00:14:00,089 data. That's what shows you what's 400 00:14:00,090 --> 00:14:01,379 actually in the data and what the 401 00:14:01,380 --> 00:14:02,579 stories are. 402 00:14:02,580 --> 00:14:04,529 But then you want to add a layer, the 403 00:14:04,530 --> 00:14:06,809 experience layer, or like a meaning layer. 404 00:14:06,810 --> 00:14:09,059 That's what makes you care. 405 00:14:09,060 --> 00:14:10,259 That's why everybody is freaking out 406 00:14:10,260 --> 00:14:12,449 about VR, because VR does 407 00:14:12,450 --> 00:14:14,129 exactly that. It puts you into a 408 00:14:14,130 --> 00:14:16,589 situation. It makes you feel things 409 00:14:16,590 --> 00:14:18,299 in a situation, because you don't just see the data, 410 00:14:18,300 --> 00:14:19,529 you have the experience. 411 00:14:21,840 --> 00:14:24,509 So, yeah, experience makes you feel things, 412 00:14:24,510 --> 00:14:26,609 and I think that's something a lot of 413 00:14:26,610 --> 00:14:28,109 you have seen. And I want to show it 414 00:14:28,110 --> 00:14:30,299 again, because it's doing 415 00:14:30,300 --> 00:14:31,899 that really well: not putting you in a 416 00:14:31,900 --> 00:14:34,109 situation, but bringing the situation 417 00:14:34,110 --> 00:14:36,479 to you. Let's see if that works. 418 00:14:38,450 --> 00:14:40,309 [A video plays.] Make a wish. 419 00:14:45,110 --> 00:14:46,110 Granny! 420 00:14:48,770 --> 00:14:50,929 Have you done your homework? ...and a 421 00:14:50,930 --> 00:14:52,759 general strike...
422 00:14:52,760 --> 00:14:53,760 Ready or not? 423 00:14:54,680 --> 00:14:56,510 Here I come! [News audio:] ...clashes with 424 00:14:58,670 --> 00:15:00,859 volleys of ammunition... 425 00:15:00,860 --> 00:15:02,299 [unintelligible] 426 00:15:07,130 --> 00:15:09,559 Airstrikes on rebel positions. 427 00:15:09,560 --> 00:15:11,699 [unintelligible] 428 00:15:11,700 --> 00:15:12,700 [unintelligible] 429 00:15:18,480 --> 00:15:19,480 All right. 430 00:15:21,510 --> 00:15:22,949 [unintelligible] 431 00:15:25,360 --> 00:15:26,360 [unintelligible] 432 00:15:37,670 --> 00:15:38,670 [unintelligible] 433 00:15:45,290 --> 00:15:46,290 [unintelligible] 434 00:15:49,510 --> 00:15:50,510 [unintelligible] 435 00:16:08,540 --> 00:16:11,299 This video has more than 55 million 436 00:16:11,300 --> 00:16:12,709 views by now. 437 00:16:12,710 --> 00:16:14,479 It's really impactful. Again, it doesn't 438 00:16:14,480 --> 00:16:16,549 put you into a situation, 439 00:16:16,550 --> 00:16:18,259 it brings the situation to you. 440 00:16:18,260 --> 00:16:20,339 And one step further go 441 00:16:20,340 --> 00:16:21,649 visualizations like these ones. 442 00:16:21,650 --> 00:16:23,389 I think you've seen these ones before: 443 00:16:23,390 --> 00:16:25,219 they ask you to enter your zip code 444 00:16:25,220 --> 00:16:26,669 or something, and then they tell you what 445 00:16:26,670 --> 00:16:28,819 the data looks like for you. 446 00:16:28,820 --> 00:16:30,889 But then there are 447 00:16:30,890 --> 00:16:32,449 these thought 448 00:16:32,450 --> 00:16:33,919 experiments, like the Berliner Morgenpost 449 00:16:33,920 --> 00:16:35,059 did recently. 450 00:16:37,660 --> 00:16:39,159 The U.S. is pretty far away; 451 00:16:39,160 --> 00:16:40,809 you think, "so what," whatever that 452 00:16:40,810 --> 00:16:42,939 means. They take the data and 453 00:16:42,940 --> 00:16:44,709 put it in front of your door. 454 00:16:44,710 --> 00:16:46,569 Basically, you can see what it looks 455 00:16:46,570 --> 00:16:48,279 like, or what it would look like if it were 456 00:16:48,280 --> 00:16:50,139 actually there in your environment, in 457 00:16:50,140 --> 00:16:52,330 the experience that you have right now. 458 00:16:53,920 --> 00:16:56,259 The BBC is doing something similar 459 00:16:56,260 --> 00:16:58,539 with this news game 460 00:16:58,540 --> 00:17:00,609 that puts people in the lives of 461 00:17:00,610 --> 00:17:02,349 Syrian refugees and lets them make 462 00:17:02,350 --> 00:17:03,350 decisions. 463 00:17:03,970 --> 00:17:05,318 So, for example, you can decide 464 00:17:05,319 --> 00:17:07,419 to pay somebody a deposit, 465 00:17:07,420 --> 00:17:09,189 or refuse to pay them the whole 466 00:17:09,190 --> 00:17:10,190 deposit. 467 00:17:11,470 --> 00:17:12,699 Again, that's putting you into this 468 00:17:12,700 --> 00:17:13,700 situation. 469 00:17:14,530 --> 00:17:16,568 That was pretty advanced, but you can 470 00:17:16,569 --> 00:17:18,159 actually make something more simple and 471 00:17:18,160 --> 00:17:19,989 do some calculations. 472 00:17:19,990 --> 00:17:21,759 CNN was doing that. 473 00:17:21,760 --> 00:17:23,989 It calculated what it would mean if 474 00:17:23,990 --> 00:17:26,409 1 to 1.3 percent 475 00:17:26,410 --> 00:17:28,239 of the population would be killed, like 476 00:17:28,240 --> 00:17:30,189 it's the case right now in Syria.
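To make that kind of per-capita scaling concrete, here is a minimal back-of-the-envelope sketch in Python; the population and casualty figures are rough assumptions of mine for illustration, not CNN's exact inputs:

```python
# Back-of-the-envelope version of the "what if it happened here" calculation.
# All figures are rough, illustrative assumptions, not CNN's exact inputs.
syria_population = 22_000_000    # approximate pre-war population of Syria
syria_deaths = 250_000           # one widely cited estimate at the time
us_population = 320_000_000      # approximate U.S. population in 2016

share_killed = syria_deaths / syria_population       # roughly 0.011, i.e. ~1.1%
equivalent_us_deaths = share_killed * us_population  # roughly 3.6 million

print(f"{share_killed:.1%} of the population killed")
print(f"Scaled to the U.S.: about {equivalent_us_deaths / 1e6:.1f} million people")
```

With figures in this range, the scaled result lands in the "three to four million Americans" ballpark she mentions next.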
477 00:17:30,190 --> 00:17:31,749 I mean, I don't think that so many 478 00:17:31,750 --> 00:17:34,089 Americans actually care about Syria, 479 00:17:34,090 --> 00:17:36,759 but if three to four million Americans 480 00:17:36,760 --> 00:17:38,199 would be killed in their own country, 481 00:17:38,200 --> 00:17:39,200 like... 482 00:17:39,620 --> 00:17:41,689 wow. That is something we can 483 00:17:41,690 --> 00:17:42,690 relate to. 484 00:17:44,510 --> 00:17:45,949 Yeah, there are lots of questions you can 485 00:17:45,950 --> 00:17:47,599 ask to make these thought experiments, 486 00:17:47,600 --> 00:17:49,129 these parallel universes, and I would 487 00:17:49,130 --> 00:17:51,109 really like to see that done more often 488 00:17:51,110 --> 00:17:53,389 in 2017 in the data visualization 489 00:17:53,390 --> 00:17:54,390 scene. 490 00:17:56,670 --> 00:17:58,859 Another example: zoom into one dot. 491 00:18:00,540 --> 00:18:02,369 Every project needs a story, as the 492 00:18:02,370 --> 00:18:03,959 makers of this one would say. 493 00:18:03,960 --> 00:18:05,249 And that's exactly what they do: 494 00:18:05,250 --> 00:18:06,179 they always have these data 495 00:18:06,180 --> 00:18:08,309 visualizations, beautiful, at the top. Here, 496 00:18:08,310 --> 00:18:10,619 for example, one that shows where 497 00:18:10,620 --> 00:18:12,209 people who currently live in Berlin 498 00:18:12,210 --> 00:18:13,829 actually come from. 499 00:18:13,830 --> 00:18:15,839 But then they also go on the street and ask 500 00:18:15,840 --> 00:18:18,029 people; they zoom in, like, to 501 00:18:18,030 --> 00:18:20,159 one data point, and ask them, 502 00:18:20,160 --> 00:18:21,639 like: where are you from? 503 00:18:21,640 --> 00:18:22,949 Why do you live here? Do you like it in 504 00:18:22,950 --> 00:18:24,239 Berlin? Etc., 505 00:18:24,240 --> 00:18:25,979 so that people can relate to somebody. 506 00:18:28,140 --> 00:18:29,729 And that's an old journalistic trick, 507 00:18:29,730 --> 00:18:30,689 right? 508 00:18:30,690 --> 00:18:32,130 That's from the NPR website. 509 00:18:33,210 --> 00:18:35,669 Many stories, especially feature stories, 510 00:18:35,670 --> 00:18:37,199 start with like an anecdote at the 511 00:18:37,200 --> 00:18:39,089 beginning, here, for example, a photo of 512 00:18:39,090 --> 00:18:41,189 one person, and then go up a level, a high 513 00:18:41,190 --> 00:18:43,319 level, and show you the overview and show 514 00:18:43,320 --> 00:18:45,479 what the data means, or like how 515 00:18:45,480 --> 00:18:48,179 many people actually 516 00:18:48,180 --> 00:18:49,679 have similar stories, because they 517 00:18:49,680 --> 00:18:51,959 are represented similarly 518 00:18:51,960 --> 00:18:52,960 in a similar data set. 519 00:18:55,940 --> 00:18:58,399 And, of course, advocacy 520 00:18:58,400 --> 00:19:00,259 organizations are doing that a lot. 521 00:19:00,260 --> 00:19:01,260 That's actually interesting, 522 00:19:02,330 --> 00:19:04,819 because this one actually states 523 00:19:04,820 --> 00:19:06,719 "you can't lose sight of the individual"; 524 00:19:06,720 --> 00:19:08,509 it literally says that. 525 00:19:08,510 --> 00:19:10,459 Don't just look at the numbers: you 526 00:19:10,460 --> 00:19:12,679 will never see something from them that 527 00:19:12,680 --> 00:19:14,089 just shows you a data visualization, 528 00:19:14,090 --> 00:19:15,679 a chart of how many people died; 529 00:19:15,680 --> 00:19:18,589 they always show you the individual.
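As a rough illustration of this zoom-into-one-dot idea, here is a minimal matplotlib sketch with made-up data; the highlighted dot and its caption are hypothetical placeholders, not taken from any of the projects mentioned:

```python
import matplotlib.pyplot as plt
import numpy as np

# Made-up data: every dot is one person (say, year of arrival vs. age).
rng = np.random.default_rng(33)
arrival_year = rng.integers(1990, 2017, size=300)
age = rng.integers(18, 80, size=300)

fig, ax = plt.subplots()
ax.scatter(arrival_year, age, s=12, color="lightgray")  # the overall statistic

# "Zoom into one dot": single out one person and attach their story.
ax.scatter(arrival_year[0], age[0], s=60, color="crimson")
ax.annotate("this dot is one person:\ntheir story goes here",  # hypothetical caption
            xy=(arrival_year[0], age[0]),
            xytext=(10, 10), textcoords="offset points", fontsize=8)

ax.set_xlabel("year of arrival")
ax.set_ylabel("age")
plt.show()
```

The point of the design is that the one annotated dot gives readers somebody to relate to, while the gray dots keep the overall statistic visible.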
530 00:19:20,300 --> 00:19:21,709 And that's one of my favorite Twitter 531 00:19:21,710 --> 00:19:22,710 bots. 532 00:19:23,390 --> 00:19:25,399 As somebody who works with the American 533 00:19:25,400 --> 00:19:26,720 census a lot, I really like it. 534 00:19:28,320 --> 00:19:30,019 It shows you one data point of the 535 00:19:30,020 --> 00:19:32,119 American census and tells 536 00:19:32,120 --> 00:19:34,159 you all the data it knows about that data 537 00:19:34,160 --> 00:19:36,679 point. And I think that's 538 00:19:36,680 --> 00:19:38,779 quite a good example of how, if you 539 00:19:38,780 --> 00:19:40,729 have data about millions of people, 540 00:19:40,730 --> 00:19:42,049 it's like statistics; 541 00:19:42,050 --> 00:19:44,179 if you have data about one person, it's 542 00:19:44,180 --> 00:19:45,180 a story. 543 00:19:47,660 --> 00:19:48,739 So, next: show who you are talking about. 544 00:19:50,570 --> 00:19:52,159 That's something that's also pretty old. 545 00:19:52,160 --> 00:19:53,959 Otto Neurath and his graphic designer 546 00:19:53,960 --> 00:19:56,659 Gerd Arntz already did it in the 1930s, 547 00:19:56,660 --> 00:19:58,969 when they showed data about 548 00:19:58,970 --> 00:20:01,070 people with actual symbols of people. 549 00:20:02,600 --> 00:20:04,039 And that's what The New York Times is 550 00:20:04,040 --> 00:20:06,139 still doing, or has done in the 551 00:20:06,140 --> 00:20:07,669 last years. 552 00:20:07,670 --> 00:20:08,750 And The Washington Post. 553 00:20:11,110 --> 00:20:12,999 And: show the mass as individuals, the 554 00:20:13,000 --> 00:20:14,000 last point. 555 00:20:15,100 --> 00:20:17,079 It's similar to this zooming-into-dots 556 00:20:17,080 --> 00:20:18,249 thing. 557 00:20:18,250 --> 00:20:20,439 But it's more like you don't 558 00:20:20,440 --> 00:20:21,999 just zoom into one dot and show it as 559 00:20:22,000 --> 00:20:23,769 an example, but actually show, like, all 560 00:20:23,770 --> 00:20:25,839 the dots, like the whole 561 00:20:25,840 --> 00:20:27,939 data. The Alcoholics 562 00:20:27,940 --> 00:20:29,169 Anonymous are doing that. 563 00:20:29,170 --> 00:20:31,269 For example, they focus on small 564 00:20:31,270 --> 00:20:32,979 steps, they say. 565 00:20:32,980 --> 00:20:35,079 Although, of course, being abstinent for 566 00:20:35,080 --> 00:20:36,459 the rest of their lives is the goal of 567 00:20:36,460 --> 00:20:37,419 the program, 568 00:20:37,420 --> 00:20:39,579 alcoholics are told to stay sober one day 569 00:20:39,580 --> 00:20:42,069 at a time or one hour at a time. 570 00:20:42,070 --> 00:20:43,359 You focus on what's close. 571 00:20:43,360 --> 00:20:45,130 You focus on what's achievable. 572 00:20:46,690 --> 00:20:48,849 And you can bring the data 573 00:20:48,850 --> 00:20:50,769 closer, as in this example: 574 00:20:50,770 --> 00:20:53,529 800,000 575 00:20:53,530 --> 00:20:56,199 killed in the last 100 days versus 576 00:20:56,200 --> 00:20:59,199 one life lost every 11 seconds. 577 00:20:59,200 --> 00:21:01,009 The first number you can calculate with, and 578 00:21:01,010 --> 00:21:02,689 that's super important, too. 579 00:21:02,690 --> 00:21:05,079 The second number is what speaks to your 580 00:21:05,080 --> 00:21:07,449 heart, what speaks to emotions. 581 00:21:07,450 --> 00:21:09,159 Parship is essentially doing something 582 00:21:09,160 --> 00:21:10,269 similar with their, 583 00:21:11,410 --> 00:21:13,849 like, poster ads. Parship is like the German OkCupid;
584 00:21:13,850 --> 00:21:16,059 they're like 585 00:21:17,110 --> 00:21:18,909 that advertising 586 00:21:18,910 --> 00:21:20,079 matchmaking service. 587 00:21:20,080 --> 00:21:22,329 They're saying a single 588 00:21:22,330 --> 00:21:24,549 falls in love every 11 minutes on 589 00:21:24,550 --> 00:21:25,550 their website. 590 00:21:27,070 --> 00:21:29,379 Whatever that means, they don't 591 00:21:29,380 --> 00:21:31,659 say, like, 50,000 people per year 592 00:21:31,660 --> 00:21:33,819 falling in love on Parship. 593 00:21:33,820 --> 00:21:35,709 They focus on the individual, because 594 00:21:35,710 --> 00:21:37,419 you're standing on a subway platform and 595 00:21:37,420 --> 00:21:38,889 you see that advertisement and you can 596 00:21:38,890 --> 00:21:41,199 actually, you can actually 597 00:21:41,200 --> 00:21:43,479 imagine being that single, like 598 00:21:43,480 --> 00:21:44,989 that single who was alone 599 00:21:44,990 --> 00:21:47,019 and then found somebody and is happy. 600 00:21:47,020 --> 00:21:48,999 And then you can say: OK, every 11 601 00:21:49,000 --> 00:21:50,619 minutes is actually a lot. 602 00:21:50,620 --> 00:21:52,510 Maybe I should sign up to be less lonely. 603 00:21:54,940 --> 00:21:57,039 And here's the data 604 00:21:57,040 --> 00:21:59,259 visualization I showed before 605 00:21:59,260 --> 00:22:01,659 from Periscopic about U.S. 606 00:22:01,660 --> 00:22:04,159 gun deaths, which does something super 607 00:22:04,160 --> 00:22:05,829 interesting and similar. 608 00:22:05,830 --> 00:22:07,959 So it's actually an animation, but I will 609 00:22:07,960 --> 00:22:09,839 show slides to explain it. 610 00:22:11,020 --> 00:22:13,359 What you can see is arcs being drawn 611 00:22:13,360 --> 00:22:14,619 for every person. 612 00:22:14,620 --> 00:22:16,659 But then, like, the arcs drop at 613 00:22:16,660 --> 00:22:18,789 some point, because the person gets killed 614 00:22:18,790 --> 00:22:21,009 by a gun, and then they keep drawing 615 00:22:21,010 --> 00:22:23,079 the arc to show you how long that 616 00:22:23,080 --> 00:22:25,239 person would have lived. 617 00:22:25,240 --> 00:22:26,829 And it shows one example first; it wants to 618 00:22:26,830 --> 00:22:29,019 explain the data visualization 619 00:22:29,020 --> 00:22:31,269 by showing one example and then 620 00:22:31,270 --> 00:22:33,549 another example, and 621 00:22:33,550 --> 00:22:35,379 then another one, and then three at 622 00:22:35,380 --> 00:22:36,609 a time, etc. 623 00:22:36,610 --> 00:22:38,470 And it goes 624 00:22:39,550 --> 00:22:40,929 always faster and faster, and it 625 00:22:40,930 --> 00:22:43,119 accelerates a lot until it ends 626 00:22:43,120 --> 00:22:44,739 with something like that. 627 00:22:44,740 --> 00:22:46,389 So you can still see the dots, and that's 628 00:22:46,390 --> 00:22:47,679 the point I want to make. 629 00:22:47,680 --> 00:22:50,029 It's not like one big data 630 00:22:50,030 --> 00:22:51,309 blob that you have. 631 00:22:51,310 --> 00:22:53,529 You can still see and go into every 632 00:22:53,530 --> 00:22:54,530 dot if you want to.
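Here is a minimal sketch of two of the techniques just described together, a black background with one highlight color and one mark per person instead of a single aggregate bar; this is my own illustration with a made-up count, not Periscopic's implementation:

```python
import matplotlib.pyplot as plt

# One mark per person instead of one aggregate bar,
# on a black background with a single highlight color.
deaths = 150                      # illustrative count, not real data
cols = 25
rows = (deaths + cols - 1) // cols

fig, ax = plt.subplots(figsize=(6, 2.5))
fig.patch.set_facecolor("black")
ax.set_facecolor("black")

xs = [i % cols for i in range(deaths)]
ys = [-(i // cols) for i in range(deaths)]
ax.scatter(xs, ys, s=40, color="orange")  # each dot stays a visible individual

ax.set_xlim(-1, cols)
ax.set_ylim(-rows, 1)
ax.set_title("each dot is one person", color="white", loc="left")
ax.axis("off")
plt.show()
```

However large the total gets, the reader can still pick out, and count, single marks, which is exactly the "you can still go into every dot" point.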
633 00:22:57,020 --> 00:22:59,409 Something similar was said about 634 00:22:59,410 --> 00:23:01,149 the Holocaust by the Holocaust survivor 635 00:23:01,150 --> 00:23:03,459 Abel Herzberg: "There were not six million 636 00:23:03,460 --> 00:23:05,589 Jews murdered; there was one murder, 637 00:23:05,590 --> 00:23:07,549 six million times." And I urge you to do 638 00:23:07,550 --> 00:23:08,889 that the next time you see a big 639 00:23:08,890 --> 00:23:10,329 number about people: 640 00:23:10,330 --> 00:23:12,459 imagine the thing that it 641 00:23:12,460 --> 00:23:14,049 describes, 642 00:23:14,050 --> 00:23:16,749 say, three hundred thousand people, 643 00:23:16,750 --> 00:23:19,029 as actually happening to one 644 00:23:19,030 --> 00:23:20,770 person, and then just multiply it. 645 00:23:23,040 --> 00:23:24,629 This is not quite six million; this is 646 00:23:24,630 --> 00:23:26,519 five million six hundred thousand. 647 00:23:26,520 --> 00:23:28,679 It's the last number the artist Roman 648 00:23:28,680 --> 00:23:30,349 Opalka 649 00:23:31,740 --> 00:23:33,209 painted. He painted every single 650 00:23:33,210 --> 00:23:34,439 number from one to 651 00:23:34,440 --> 00:23:36,659 exactly this number. You can see him 652 00:23:36,660 --> 00:23:37,769 doing that. 653 00:23:37,770 --> 00:23:39,229 And... 654 00:23:39,230 --> 00:23:41,299 I always wonder if he actually is the 655 00:23:41,300 --> 00:23:43,609 person in this world who can 656 00:23:43,610 --> 00:23:45,439 grasp big numbers the best, who 657 00:23:45,440 --> 00:23:47,149 actually knows what a big 658 00:23:47,150 --> 00:23:48,259 number means, 659 00:23:48,260 --> 00:23:50,479 because he spent time with every single 660 00:23:50,480 --> 00:23:52,459 number. If these were people, 661 00:23:52,460 --> 00:23:54,019 he would know what it means, because it 662 00:23:54,020 --> 00:23:56,479 took him, yes, years to write them 663 00:23:56,480 --> 00:23:58,579 down, but every number took only 664 00:23:58,580 --> 00:24:00,229 like half a minute or so to write down. 665 00:24:02,060 --> 00:24:04,319 Dear Data is a project that did something 666 00:24:04,320 --> 00:24:06,389 similar. Stefanie Posavec and 667 00:24:06,390 --> 00:24:08,579 Giorgia Lupi wrote each other postcards 668 00:24:08,580 --> 00:24:09,929 that are drawn from data. 669 00:24:09,930 --> 00:24:12,269 So, for example, this is a postcard from 670 00:24:12,270 --> 00:24:14,729 Giorgia Lupi, where she shows 671 00:24:14,730 --> 00:24:16,889 all the songs she listened to 672 00:24:16,890 --> 00:24:18,959 in one week. 673 00:24:18,960 --> 00:24:19,859 And it's not, 674 00:24:19,860 --> 00:24:22,109 it's not generated from her tracked 675 00:24:22,110 --> 00:24:23,819 listening history or something; she actually 676 00:24:23,820 --> 00:24:25,049 drew all these things. 677 00:24:25,050 --> 00:24:26,849 She spent time with every single data 678 00:24:26,850 --> 00:24:27,889 point. 679 00:24:27,890 --> 00:24:29,759 Now, going from, like, the user of data to 680 00:24:29,760 --> 00:24:31,469 the creator: this is 681 00:24:31,470 --> 00:24:33,569 something you can do as a creator to 682 00:24:33,570 --> 00:24:34,980 understand big data better. 683 00:24:36,410 --> 00:24:38,779 And Giorgia Lupi has written about that:
684 00:24:38,780 --> 00:24:40,429 she called it something like 685 00:24:40,430 --> 00:24:43,189 "data humanism," and she actually said that 686 00:24:43,190 --> 00:24:45,559 instead of saving time on data, we should spend 687 00:24:45,560 --> 00:24:47,689 time with data, and instead of data, 688 00:24:47,690 --> 00:24:49,700 we should see the people behind it. 689 00:24:51,680 --> 00:24:53,379 So, yeah, that's what I wanted to 690 00:24:53,380 --> 00:24:54,559 talk about. 691 00:24:54,560 --> 00:24:55,560 Let me sum it up. 692 00:24:57,740 --> 00:24:59,119 Feelings are good, and maybe we do 693 00:24:59,120 --> 00:25:01,249 have to admit that 694 00:25:01,250 --> 00:25:03,349 feelings are, like, pretty bad 695 00:25:03,350 --> 00:25:05,329 with big numbers, but we still 696 00:25:05,330 --> 00:25:07,160 need them to make us actually care. 697 00:25:08,900 --> 00:25:10,249 We can evoke feelings with data 698 00:25:10,250 --> 00:25:11,659 visualization. We don't have to, 699 00:25:11,660 --> 00:25:12,679 but we can do it, and 700 00:25:12,680 --> 00:25:14,869 I think we should think about 701 00:25:14,870 --> 00:25:15,920 how we can achieve that. 702 00:25:17,180 --> 00:25:18,559 And then I was talking about a lot of 703 00:25:18,560 --> 00:25:20,569 options for how to make feelings with data 704 00:25:20,570 --> 00:25:22,669 visualization, for example: making use of 705 00:25:22,670 --> 00:25:24,979 color, zooming into one dot, showing 706 00:25:24,980 --> 00:25:25,999 who you're talking about 707 00:25:26,000 --> 00:25:28,129 with people symbols, showing what 708 00:25:28,130 --> 00:25:29,929 the data would mean for your experience, and 709 00:25:29,930 --> 00:25:31,339 showing the mass as individuals. 710 00:25:33,110 --> 00:25:34,309 So, yeah, thank you. 711 00:25:34,310 --> 00:25:35,310 Oh no. Wait, wait, wait. 712 00:25:37,580 --> 00:25:39,689 These were three important points. 713 00:25:39,690 --> 00:25:41,959 I think the fourth one is missing here. 714 00:25:41,960 --> 00:25:43,310 This is also super important. 715 00:25:44,480 --> 00:25:45,920 Once you've made the feelings, 716 00:25:46,970 --> 00:25:48,109 you need to do something. 717 00:25:48,110 --> 00:25:49,669 You can't just leave people with their 718 00:25:49,670 --> 00:25:51,679 feelings. You have some responsibility if 719 00:25:51,680 --> 00:25:53,659 you create feelings in people, because 720 00:25:53,660 --> 00:25:56,029 they feel helpless as heck when they 721 00:25:56,030 --> 00:25:57,439 don't know what to do with them. 722 00:25:57,440 --> 00:25:58,799 If you're angry, you need to punch 723 00:25:58,800 --> 00:26:00,469 something. If you're sad, you 724 00:26:00,470 --> 00:26:02,419 need to cry. And if you have empathy for 725 00:26:02,420 --> 00:26:04,249 something, you need to make a donation or 726 00:26:04,250 --> 00:26:06,409 something, to not feel like the world 727 00:26:06,410 --> 00:26:08,659 is burning while you don't do anything. 728 00:26:08,660 --> 00:26:10,039 That's not something I will talk about 729 00:26:10,040 --> 00:26:12,019 today, but keep it in mind: 730 00:26:12,020 --> 00:26:14,119 your responsibility as somebody who makes 731 00:26:14,120 --> 00:26:15,259 people feel. 732 00:26:15,260 --> 00:26:16,260 Thank you very much. 733 00:26:39,760 --> 00:26:40,809 Thank you. 734 00:26:40,810 --> 00:26:42,579 Thank you. That was awesome. 735 00:26:44,200 --> 00:26:45,640 Thanks. That was awesome.
736 00:26:47,500 --> 00:26:49,359 So one quick announcement before we take 737 00:26:49,360 --> 00:26:50,289 questions. 738 00:26:50,290 --> 00:26:52,929 We're going to clear the room after 739 00:26:52,930 --> 00:26:54,339 the talk is over. 740 00:26:54,340 --> 00:26:56,619 That means everybody who is in here will go 741 00:26:56,620 --> 00:26:59,289 out, and please use both doors. 742 00:26:59,290 --> 00:27:02,409 [The announcement is repeated in German.] 743 00:27:02,410 --> 00:27:04,389 ... 744 00:27:04,390 --> 00:27:06,609 ... 745 00:27:06,610 --> 00:27:08,159 ... 746 00:27:08,160 --> 00:27:10,299 ... 747 00:27:10,300 --> 00:27:12,519 ... 748 00:27:12,520 --> 00:27:14,079 ... 749 00:27:14,080 --> 00:27:15,369 ... 750 00:27:15,370 --> 00:27:17,799 ... 751 00:27:17,800 --> 00:27:18,800 ... 752 00:27:20,240 --> 00:27:22,219 For questions, please come to a microphone. 753 00:27:26,700 --> 00:27:28,380 Hey, thank you for your talk. Yeah, 754 00:27:29,790 --> 00:27:32,099 you very briefly mentioned 755 00:27:32,100 --> 00:27:34,379 VR and the possibilities for 756 00:27:34,380 --> 00:27:35,399 empathy in VR. 757 00:27:35,400 --> 00:27:36,540 And I think I kind of 758 00:27:37,590 --> 00:27:38,729 don't really see it that way. 759 00:27:38,730 --> 00:27:40,199 So I'm curious about how you see it and 760 00:27:40,200 --> 00:27:41,399 if you could say more about that. Thank 761 00:27:41,400 --> 00:27:42,400 you. 762 00:27:42,850 --> 00:27:44,649 I'm not an expert, but I would love to 763 00:27:44,650 --> 00:27:46,180 know why you don't see it that way. 764 00:27:48,550 --> 00:27:50,169 I'll tell you after you tell me. I mean, 765 00:27:50,170 --> 00:27:52,359 OK, no, I mean, 766 00:27:52,360 --> 00:27:54,429 I'm happy to talk about it, but I'm just 767 00:27:54,430 --> 00:27:57,309 sort of interested in how you're seeing it. 768 00:27:57,310 --> 00:27:59,289 If you think of a continuum of different 769 00:27:59,290 --> 00:28:01,119 kinds of visual techniques that we have 770 00:28:01,120 --> 00:28:03,279 to generate empathy 771 00:28:03,280 --> 00:28:04,780 and feelings with 772 00:28:05,800 --> 00:28:08,649 different kinds of data points, then 773 00:28:08,650 --> 00:28:10,719 data vis is one, and VR 774 00:28:10,720 --> 00:28:12,999 is another that's been kind of talked 775 00:28:13,000 --> 00:28:14,289 about a lot. 776 00:28:14,290 --> 00:28:16,029 I think people like Sam Gregory sort of 777 00:28:16,030 --> 00:28:18,279 discuss what's sort of problematic with 778 00:28:18,280 --> 00:28:20,439 this, in that 779 00:28:20,440 --> 00:28:22,569 it kind of says that 780 00:28:22,570 --> 00:28:23,589 it's going to immerse you in an 781 00:28:23,590 --> 00:28:25,179 experience. But what it does is it just 782 00:28:25,180 --> 00:28:27,039 puts you into that experience and then 783 00:28:27,040 --> 00:28:29,019 drops you. It doesn't actually take you 784 00:28:29,020 --> 00:28:30,699 anywhere with it, and just you having the 785 00:28:30,700 --> 00:28:32,799 experience is the most important part of 786 00:28:32,800 --> 00:28:34,119 VR.
787 00:28:34,120 --> 00:28:35,739 I mean, there are some which are slightly 788 00:28:35,740 --> 00:28:37,869 different, more interesting, 789 00:28:37,870 --> 00:28:40,119 and I can think of some examples, but I'm 790 00:28:40,120 --> 00:28:42,339 kind of curious to hear 791 00:28:42,340 --> 00:28:44,529 what you feel about VR, 792 00:28:44,530 --> 00:28:46,369 as somebody who works with data 793 00:28:46,370 --> 00:28:48,639 and numbers: what does that seem 794 00:28:48,640 --> 00:28:50,529 like on the horizon? 795 00:28:50,530 --> 00:28:51,999 That's it. Thank you for your 796 00:28:52,000 --> 00:28:52,929 point. 797 00:28:52,930 --> 00:28:54,769 Again, I'm not a VR expert. 798 00:28:54,770 --> 00:28:56,949 I've done it once and I was blown away, like, 799 00:28:56,950 --> 00:28:58,749 on a pretty basic level. 800 00:28:58,750 --> 00:28:59,750 Yeah. 801 00:29:02,440 --> 00:29:03,759 I think there's a reason that so many 802 00:29:03,760 --> 00:29:05,109 newsrooms, like the New York Times, 803 00:29:05,110 --> 00:29:06,880 invest in VR so much. 804 00:29:08,050 --> 00:29:09,609 Yeah, of course, it's, like, shiny and 805 00:29:09,610 --> 00:29:11,149 fancy and new, and that's why they love 806 00:29:11,150 --> 00:29:12,150 it. 807 00:29:13,450 --> 00:29:16,479 But I think there's something 808 00:29:16,480 --> 00:29:18,759 maybe just to the newness, you know. 809 00:29:18,760 --> 00:29:20,919 Like, I can still 810 00:29:20,920 --> 00:29:22,779 remember the first VR experience I had. 811 00:29:22,780 --> 00:29:24,459 And if this first VR piece would 812 00:29:24,460 --> 00:29:25,899 have been about, you know, something, 813 00:29:25,900 --> 00:29:27,339 I would have, like, I would have still 814 00:29:27,340 --> 00:29:28,929 remembered that, because it's something 815 00:29:28,930 --> 00:29:30,609 you don't see every day. 816 00:29:33,720 --> 00:29:35,369 And to your 817 00:29:35,370 --> 00:29:37,559 point about how people get dropped into 818 00:29:37,560 --> 00:29:39,749 a situation and then it's like, that's 819 00:29:39,750 --> 00:29:41,639 it, and they can always escape: 820 00:29:41,640 --> 00:29:42,809 I wonder if that's actually 821 00:29:44,640 --> 00:29:46,349 the opposite of beneficial, like if it 822 00:29:46,350 --> 00:29:47,369 actually hurts, because, 823 00:29:48,630 --> 00:29:50,159 yeah, because you don't live 824 00:29:50,160 --> 00:29:51,389 it; you don't have to stay in it, you 825 00:29:51,390 --> 00:29:52,319 can always escape. 826 00:29:52,320 --> 00:29:53,320 This is... 827 00:29:54,800 --> 00:29:56,029 I don't know. I'm not sure, 828 00:29:57,380 --> 00:29:58,380 but thank you. 829 00:29:59,780 --> 00:30:01,219 So we can do one more question, maybe 830 00:30:01,220 --> 00:30:02,369 two, if they're quick. 831 00:30:02,370 --> 00:30:04,459 And after that we'll go out; 832 00:30:04,460 --> 00:30:06,229 I'm sure Lisa will take some time outside 833 00:30:06,230 --> 00:30:08,329 as well for more discussion. 834 00:30:08,330 --> 00:30:10,049 And hi, thank you for your talk. 835 00:30:10,050 --> 00:30:12,109 I was wondering: if you would 836 00:30:12,110 --> 00:30:14,119 be giving this talk in front of a whole 837 00:30:14,120 --> 00:30:16,069 bunch of right-wing, horrible people, 838 00:30:16,070 --> 00:30:17,450 would you give the same talk? 839 00:30:19,520 --> 00:30:21,829 Well, how horrible are these people? 840 00:30:21,830 --> 00:30:22,849 Like, how horrible 841 00:30:28,310 --> 00:30:28,579 is it?
842 00:30:28,580 --> 00:30:29,650 Six out of 10 horrible. 843 00:30:33,140 --> 00:30:34,399 I don't really believe in horrible 844 00:30:34,400 --> 00:30:36,739 people, but 845 00:30:36,740 --> 00:30:37,740 I think... 846 00:30:39,350 --> 00:30:41,089 I would need to go through the slides 847 00:30:41,090 --> 00:30:42,379 again, but actually it feels like 848 00:30:42,380 --> 00:30:44,389 there's nothing that should offend them 849 00:30:44,390 --> 00:30:45,390 too much. 850 00:30:48,530 --> 00:30:49,969 I will let you know, though, the 851 00:30:49,970 --> 00:30:51,619 next time I give a talk in front 852 00:30:51,620 --> 00:30:53,539 of, like, lots of horrible people, 853 00:30:53,540 --> 00:30:54,919 and then we, you know, we can have a 854 00:30:54,920 --> 00:30:56,659 discussion about what I should leave out 855 00:30:56,660 --> 00:30:59,179 or something, to not get hurt. 856 00:30:59,180 --> 00:31:00,109 Thirty seconds. 857 00:31:00,110 --> 00:31:01,110 OK. 858 00:31:01,640 --> 00:31:03,739 Thank you for a really inspiring talk. 859 00:31:03,740 --> 00:31:05,119 Most of the great visualizations I've 860 00:31:05,120 --> 00:31:06,769 seen were designed as a very personal 861 00:31:06,770 --> 00:31:08,689 experience, for one single person to 862 00:31:08,690 --> 00:31:10,309 experience or observe at once. 863 00:31:10,310 --> 00:31:11,269 And I'm curious if you have thoughts 864 00:31:11,270 --> 00:31:13,639 about visualizations that are designed 865 00:31:13,640 --> 00:31:15,469 to connect people, or to be experienced by 866 00:31:15,470 --> 00:31:16,470 more than one person. 867 00:31:17,600 --> 00:31:19,729 Oh wow, OK, I've never thought about that 868 00:31:19,730 --> 00:31:21,979 at all. This is super interesting. 869 00:31:21,980 --> 00:31:24,829 You mean, like, visualizations that 870 00:31:24,830 --> 00:31:26,989 are supposed to be seen or, like, experienced 871 00:31:26,990 --> 00:31:28,069 by lots of people? 872 00:31:28,070 --> 00:31:29,479 I mean, I've definitely seen data 873 00:31:29,480 --> 00:31:31,639 visualizations in museums, and I think 874 00:31:31,640 --> 00:31:33,619 that's the closest thing I can think of, 875 00:31:33,620 --> 00:31:35,119 like 3D visualizations, 876 00:31:36,170 --> 00:31:37,399 or visualizations you need to build 877 00:31:37,400 --> 00:31:38,400 together, I guess. 878 00:31:40,170 --> 00:31:41,170 Hmm. 879 00:31:41,870 --> 00:31:43,719 Let's talk more after; you guys continue 880 00:31:43,720 --> 00:31:44,809 outside, 881 00:31:44,810 --> 00:31:46,669 people, let's talk about it 882 00:31:46,670 --> 00:31:47,929 after this event. 883 00:31:47,930 --> 00:31:48,189 I'm sure 884 00:31:48,190 --> 00:31:49,570 there'll be a great conversation over there.