AKIKO FUJITA: Our next guest is certainly well versed in online content as an early internet pioneer. We are excited to be joined by the founder of Wikipedia today, Jimmy Wales, joining us from the UK. Jimmy, it's good to talk to you today.

JIMMY WALES: Thank you.

AKIKO FUJITA: How are you thinking about this debate that's been playing out: on the one hand, allowing these social media platforms to offer a free exchange of information, but also ensuring that the information that's exchanged is actually based in truth?

JIMMY WALES: Yeah, so I mean, I think the most important thing to understand here is that Section 230 of the Communications Decency Act is really what has made the internet work at all. It's a very, very important principle, this idea that any internet service provider -- which could be a newspaper, it could be a news site, it could be Facebook, it could be Twitter -- can moderate content without therefore becoming responsible for everything that everyone does on that site. And I think that's really, really crucial to remember, because you want that balance between not forcing companies to be responsible for everything that everybody ever does, which means they're going to lock down everything in a way we wouldn't like, versus not allowing them to deal with spam and trolls and abuse and all that sort of thing. So it's a hugely important topic.

AKIKO FUJITA: You've certainly seen a number of lawmakers come forward and call for a change to Section 230. How are you thinking about the potential impact to Wikipedia, if it were to be revised?
JIMMY WALES: Well, I mean, it remains to be seen exactly what that would mean. But of course, we're very, very different in our moderation practices compared to, really, anybody else. The typical model for moderation of websites is that staff moderators, who are paid by the company, make all the decisions about what gets taken down, what stays up, who gets blocked, and so on. In our case, all of that decision-making is pushed out into the community itself. So our community makes all the rules. Our community enforces the rules. And our community spends a huge amount of time discussing and debating and trying to find the right balance between having an open discourse and dialogue -- which, in our case, is all about seeking the truth -- versus not allowing people to abuse each other, to troll, and to just come and disrupt the site. It's a hugely complicated topic, obviously. For us, we don't feel that it's the right thing for the Wikimedia Foundation, which is the charity that owns and operates Wikipedia, to get involved directly in those kinds of things, because the community does such a fantastic job of it.

ZACK GUZMAN: Yeah, it's amazing to think about how that's even run. There are some people who probably would have said it was impossible if you had pitched it to them and said, yeah, we're going to have 41 million people out there with Wikipedia accounts, and only 143,000 of those moderating this -- that's less than half a percent. And yet, it still works.
And so we keep hearing from Facebook about AI being used to moderate and remove biases from the platform. But when you look at that promise of AI and the way that Wikipedia has been able to function with its own human moderators, where do you put AI in the equation there to really have effective moderation?

JIMMY WALES: I mean, look, I think it's important to understand that although I have certain criticisms of Facebook and Twitter and the others about how they do this, they face a very, very hard problem -- a harder problem than we do. Our fundamental premise has always been: we're here to write an encyclopedia. That defines everything that we do. It defines the kinds of conversations we have, the kinds of behavioral rules we have. We don't have a little box that says, type here whatever you think, which is fundamentally what Twitter and Facebook are all about. And the truth is, a lot of people have obnoxious ideas and bad ideas. And I don't think that's necessarily the responsibility of Facebook and Twitter, until you get to the point where they are promoting bad ideas, where they are promoting disinformation. Then I think they do have a moral responsibility to think about that. But we can't make Facebook legally liable for everything that somebody's crazy uncle types on the internet.

AKIKO FUJITA: What do you attribute some of these concerns -- some of these issues around these social media platforms -- to?
I mean, if we're talking about Facebook or Twitter, there have been concerns about, obviously, online harassment, but also disinformation -- a big one. Some would argue that that is directly tied to the ad revenue model of these platforms. Wikipedia obviously relies on donations, as well as grants. Has that made you a little more immune to the kinds of issues we're seeing in these big tech names?

JIMMY WALES: Well, for sure. I mean, I think if you've got an advertising-only business model, where you only make money when people are clicking and staying for a long time on your site, then it's very easy to fall into a trap of letting your algorithms optimize for that with no real regard for the truth. I mean, we know if your sweet grandmother posts a nice picture of her dog, it probably doesn't get much comment. But if some racist jerk in your family posts something obnoxious, probably everybody jumps on to yell at them. And then suddenly we've got engagement. So we've got time on platform and so on. So those kinds of issues are very much intimately tied up in their business model. Obviously, I'm oversimplifying. I don't think that Facebook and Twitter would agree that the crazy racist uncle is the core of the business model. But we do see on these platforms that controversial, inflammatory content does get a lot of attention. It does get a lot of clicks. And therefore, it's hard for them not to optimize around it.

AKIKO FUJITA: One of the criticisms you've received around Wikipedia is the lack of diversity among the editors and administrators.
I know you've addressed that before. But as that conversation moves more and more to the forefront, how are you thinking about that mix? And what kind of changes do you think need to be implemented?

JIMMY WALES: Yeah, I mean, for us, it's really something we've been talking about for a long, long time. We want to diversify the community. And in fact, the community wants to diversify the community. So we've got a lot of different outreach programs. We try to understand what's preventing people from participating. Sometimes those are barriers -- technical barriers. Sometimes it's geographical. There are a lot of different things going on in a lot of different parts of the world. But one of the things that we definitely do look at is to say, look, what is behavior like in the community? If you come to Wikipedia and it's a bunch of angry boys, just to state the problem bluntly, a lot of people will just go, yeah, that's not actually for me. And so we want to make sure that we are welcoming newcomers, and that if people are violating our code of conduct, we're actually taking that seriously and getting rid of them. It's an ongoing and very, very human process for Wikipedia.

ZACK GUZMAN: Yeah, we've got crazy uncles. We've got grandmas here. We've got young boys talking bad on the internet. We've got a lot of people here that could potentially be bad actors. But when you think about it, Jimmy, I mean, you've been doing this for a long time, right?
Obviously, I think when this stuff first came around, regulators didn't necessarily know where Facebook, Twitter, where any of it was going to go. If you could think back -- I mean, obviously, Section 230 gets a lot of attention here. But if you could go back to when these things were first launched and think about a different way to set up a policy or regulatory framework around this -- if you say, look, we can't go after Facebook every time someone's crazy uncle posts something bad -- how would you effectively set up the right way for governments to be looking at this, given the impacts that we've now seen, potentially or maybe specifically, on elections and now health in this pandemic?

JIMMY WALES: So I mean, I think Section 230 is the right way. You know, I think it's overwhelmingly the right way. But I also think that we need to really strengthen rules around financial transparency, around political advertising. I think that is very, very hard. And actually, that transparency isn't just about who's spending money anymore. If you're a journalist, we're all very aware that through microtargeting, certain politicians are able to send completely contradictory and different messages to different people, based on who they are. And that's problematic. And it's something that journalists really have a hard time addressing, because if you just go on Facebook and start using it, you'll see a certain set of ads that are very different from what people who are different from you are seeing.
And so, really demanding increased transparency around all of that, and increased transparency about the algorithms -- how are the algorithms choosing what to show to people, when we see evidence that a certain platform may be radicalizing people? I don't think any of these platforms have any actual desire to radicalize people. But if it's happening, and their algorithm is causing it to happen, hey, I think that's a conversation we need to have. And transparency, shining a light on that, is really important.

AKIKO FUJITA: And finally, Jimmy, we recently saw the Wikimedia Foundation launch this new enterprise product, which seemed to be a bit of a break from what we've seen Wikipedia stand for after 20 years, which is the free flow of information, the availability of that. What prompted the launch of this product? And what do these conversations, especially with the tech names, look like?

JIMMY WALES: Yeah, so the basic concept here is, we depend, really, 100% on donations to keep Wikipedia running. And that's great. And we like that. We like that small donors are where the money comes from. It maintains our intellectual independence. It keeps us responsive to the public. At the same time, we see the internet giants. If you ask Siri or Alexa or Google a question, very, very frequently the answer you get is read to you directly out of Wikipedia. And we're actually completely fine with that. Everything is freely licensed. That's what it's all about.
But as a technical matter, they need things from us. They need faster access to the data. They need to make sure that in the data they're getting, there's no vandalism and so on. They've been doing that themselves by hand, and so on and so forth. We think we can offer them a product that says: actually, we can help you with that process of distributing the content, but it's going to cost you. And we think it would be quite fair -- if all of these companies are making billions on the back of our content, yeah, they should chip in a bit. But there's no interest in turning the Wikimedia Foundation into a captured client of these tech giants. If we make some money off of them, that just feels like it's fair.
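For readers curious what "reading directly out of Wikipedia" looks like in practice, here is a minimal sketch of pulling an article's freely licensed lead summary through Wikipedia's public REST API -- the kind of lookup a voice assistant might perform. This is an illustration using the public endpoint, not the Wikimedia Enterprise product Wales describes, whose interface is not covered in the interview; the script name and User-Agent string are hypothetical.

```python
# Minimal sketch: fetch the freely licensed summary of an English
# Wikipedia article via the public REST API (not Wikimedia Enterprise).
import requests

def fetch_summary(title: str) -> str:
    """Return the plain-text lead summary of an English Wikipedia article."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{title}"
    # Wikimedia's API etiquette asks clients to send a descriptive User-Agent.
    resp = requests.get(url, headers={"User-Agent": "summary-demo/0.1"}, timeout=10)
    resp.raise_for_status()
    # The summary endpoint returns JSON; "extract" holds the plain-text lead.
    return resp.json()["extract"]

if __name__ == "__main__":
    print(fetch_summary("Section_230"))
```

Enterprise reusers need more than this -- bulk snapshots, low-latency update feeds, and vandalism-filtered revisions -- which is the gap the paid product is meant to fill.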