1 00:00:00,000 --> 00:00:22,404 * rC3 2021 Chaos-West TV preroll music * 2 00:00:25,071 --> 00:00:30,480 Herald: Good, good afternoon, everyone. This upcoming talk is "Stop general data 3 00:00:30,480 --> 00:00:35,045 retention in the European Union and the current plans for mass surveillance". 4 00:00:35,045 --> 00:00:41,262 And it is not in German as the Fahrplan suggests. It's all done in English, but… 5 00:00:41,840 --> 00:00:46,060 Dieser Vortrag wird also simultan ins Deutsche übersetzt. (So this talk is being simultaneously interpreted into German.) 6 00:00:47,327 --> 00:00:52,775 So that's the extent of my German. I will carry on introducing the speakers. 7 00:00:54,782 --> 00:00:57,960 Also not included in the line-up in the Fahrplan is 8 00:00:57,960 --> 00:01:02,849 Friedemann Ebelt, a freelance campaigner in Germany against data retention. 9 00:01:03,917 --> 00:01:07,151 Another German is joining us, that is Patrick Breyer, 10 00:01:07,151 --> 00:01:12,090 who is a member of the European Parliament for the German Pirates. 11 00:01:14,397 --> 00:01:20,250 And we stay in Brussels for a little bit with Chloé Berthélémy, who is a policy 12 00:01:20,250 --> 00:01:22,890 adviser at European Digital Rights. 13 00:01:24,475 --> 00:01:27,865 And we are also staying a little bit with European Digital Rights, because 14 00:01:27,865 --> 00:01:32,884 we also have the chairman of the Danish NGO IT-Pol, Jesper Lund. 15 00:01:33,257 --> 00:01:35,428 That group's also an EDRi member. 16 00:01:35,428 --> 00:01:37,920 And last, but definitely not least, 17 00:01:37,920 --> 00:01:43,161 we have TJ McIntyre, who is a lecturer at University College Dublin, but is also 18 00:01:43,161 --> 00:01:47,848 part of Digital Rights Ireland. And is one of the brains, together with Austrian NGOs 19 00:01:47,848 --> 00:01:54,229 and activists, behind the original Digital Rights Ireland case in front of the 20 00:01:54,229 --> 00:01:59,783 Court of Justice of the European Union, which struck down data retention. 21 00:02:00,638 --> 00:02:03,726 That was already in 2014. Time flies when you're getting old. 22 00:02:05,098 --> 00:02:09,402 I'm going to hand it over to Friedemann, who is moderating this panel. 23 00:02:10,247 --> 00:02:14,349 Friedemann Ebelt: Thank you, Walter, for introducing this talk. 24 00:02:14,349 --> 00:02:20,266 And welcome everybody to this talk on mass surveillance of our communication data in 25 00:02:20,266 --> 00:02:26,221 the European Union. And also thank you for your interest in this very important 26 00:02:26,221 --> 00:02:32,459 issue. I had the joy of organizing this talk and I will help to navigate a little 27 00:02:32,459 --> 00:02:38,213 through this session. And the key questions of this talk are going to be: 28 00:02:39,135 --> 00:02:42,374 What is data retention? And what are the problems? 29 00:02:42,374 --> 00:02:46,409 Also, what is the legal situation in the European Union? 30 00:02:46,409 --> 00:02:51,585 And what are member state governments actually doing? 31 00:02:52,126 --> 00:02:55,900 What is the Commission of the European Union doing? 32 00:02:55,900 --> 00:03:00,136 And what's going on in the European Parliament? 33 00:03:00,738 --> 00:03:05,335 Also important questions are… What's the situation in Germany? 34 00:03:05,335 --> 00:03:09,836 What's the situation in France, Ireland, Denmark and Belgium? 35 00:03:09,836 --> 00:03:19,369 And what can we expect from the new year, from 2022 and the future?
36 00:03:19,369 --> 00:03:25,978 And of course, for many of you, one of the most important questions: 37 00:03:25,978 --> 00:03:31,172 What can citizens do about mass surveillance of communication data? 38 00:03:32,002 --> 00:03:37,093 You will find more information on the speakers, and you will also find 39 00:03:37,093 --> 00:03:41,457 the audio and video to download on media.ccc.de 40 00:03:41,457 --> 00:03:45,298 You can just search for data retention. 41 00:03:45,298 --> 00:03:49,720 And of course, if you like, you can recommend the talk to others. 42 00:03:50,469 --> 00:03:56,524 In general, to follow the discussion on data retention during the year 2022 and beyond, 43 00:03:56,524 --> 00:03:59,442 you can use the hashtag #DataRetention 44 00:03:59,442 --> 00:04:02,361 or the equivalent in your language. 45 00:04:02,361 --> 00:04:06,339 In German, this would be #Vorratsdatenspeicherung 46 00:04:06,339 --> 00:04:11,322 Patrick Breyer, as a member of the European Parliament, 47 00:04:11,322 --> 00:04:15,357 will start this talk with the question: 48 00:04:15,357 --> 00:04:20,235 What is data retention, and what are the problems? 49 00:04:21,932 --> 00:04:24,173 Patrick Breyer: Thank you very much, Friedemann. 50 00:04:24,173 --> 00:04:26,811 And thanks everybody for joining us for this talk. 51 00:04:26,811 --> 00:04:32,203 Data retention has been called the most privacy-invasive scheme ever 52 00:04:32,203 --> 00:04:38,160 adopted by the European Union. But what does the term mean exactly? Now, data 53 00:04:38,160 --> 00:04:44,638 retention means that a record is kept by your providers on all phone calls you made 54 00:04:44,638 --> 00:04:51,500 and received, on all electronic messages you sent or received, as well as on the 55 00:04:51,500 --> 00:04:56,180 IP address that was assigned to each of your internet connections. 56 00:04:56,180 --> 00:05:01,018 So this means that the record does not contain the content of your calls 57 00:05:01,018 --> 00:05:06,116 and messages, but details on who you were communicating with, at what time, 58 00:05:06,116 --> 00:05:10,130 and, in the case of mobile devices, where you were located. 59 00:05:11,206 --> 00:05:15,963 This record can be accessed by public authorities to investigate suspects 60 00:05:15,963 --> 00:05:21,690 of serious crime. But it will be created even if you are not suspected or in 61 00:05:21,690 --> 00:05:24,618 any way remotely connected to any crime. 62 00:05:25,413 --> 00:05:30,571 So what is the problem with creating this sea of personal data? 63 00:05:30,571 --> 00:05:36,686 For the first time with data retention, sensitive information is amassed 64 00:05:36,686 --> 00:05:41,666 on the everyday social contacts, including business contacts, 65 00:05:41,666 --> 00:05:48,155 on the movements and on the private lives of millions of citizens that are not even 66 00:05:48,155 --> 00:05:52,729 remotely connected to any wrongdoing. The German Constitutional Court said 67 00:05:52,729 --> 00:05:57,844 that data retention has a broader range than anything in the legal system to date. 68 00:05:57,844 --> 00:06:02,960 And the idea of collecting information just in case you might need it 69 00:06:02,960 --> 00:06:10,160 in the future, that idea opens the floodgates to recording our entire lives. 70 00:06:10,160 --> 00:06:20,400 Including collecting our travels, using ANPR data, facial recognition data… 71 00:06:20,400 --> 00:06:26,160 You name it.
This idea of retaining "just in case" is what is dangerous about this 72 00:06:26,160 --> 00:06:31,180 data retention. And that's why we are fighting this precedent so hard. 73 00:06:31,180 --> 00:06:34,168 It normalizes mass surveillance. 74 00:06:34,168 --> 00:06:38,969 Besides, blanket telecommunications data retention has proven 75 00:06:38,969 --> 00:06:44,080 to be harmful to many sectors of society. It disrupts confidential 76 00:06:44,080 --> 00:06:51,680 communications in areas that legitimately require non-traceability. For example, 77 00:06:51,680 --> 00:06:57,280 contacts with psychotherapists, with physicians, lawyers, workers' councils, 78 00:06:57,280 --> 00:07:02,640 marriage counselors, drug abuse counselors, help lines, et cetera. It thus 79 00:07:02,640 --> 00:07:09,760 endangers the physical and mental health of people in need of support, as well as 80 00:07:09,760 --> 00:07:14,880 of people around them. For example, a German crisis line reported they once 81 00:07:14,880 --> 00:07:19,760 talked a student out of a killing spree that he was contemplating at his own 82 00:07:19,760 --> 00:07:24,640 school. And you know, if you start recording information on these contacts 83 00:07:24,640 --> 00:07:30,000 and people risk being prosecuted, they might no longer call and it might not be 84 00:07:30,000 --> 00:07:37,440 possible to dissuade them from these kinds of crimes. Furthermore, the inability 85 00:07:37,440 --> 00:07:42,640 of journalists to electronically receive information through untraceable channels 86 00:07:42,640 --> 00:07:48,000 compromises the freedom of the press and damages the preconditions of our open and 87 00:07:48,000 --> 00:07:54,960 democratic society. Blanket data retention creates risks of data abuse and loss of 88 00:07:54,960 --> 00:08:00,960 confidential information relating to our contacts, movements, and interests. And 89 00:08:00,960 --> 00:08:06,000 communications data are particularly susceptible to producing unjustified 90 00:08:06,000 --> 00:08:11,520 suspicions and subjecting innocent citizens to criminal investigations 91 00:08:11,520 --> 00:08:19,840 because they relate to a connection point, not a specific person. And let me briefly 92 00:08:19,840 --> 00:08:25,840 explain why data retention is the problem and not the solution for law enforcement. 93 00:08:25,840 --> 00:08:31,600 It is, as I explained, a weapon of mass surveillance directed against the entire 94 00:08:31,600 --> 00:08:36,880 population. But on the other hand, the results are not even statistically 95 00:08:36,880 --> 00:08:44,000 significant. So neither the number of crimes nor the crime clearance rate 96 00:08:44,000 --> 00:08:49,600 depends on whether you have data retention legislation in place in a country or not. 97 00:08:49,600 --> 00:08:52,800 So I've commissioned the European Parliament's research service to look at 98 00:08:52,800 --> 00:08:57,904 the statistics throughout the EU, and they didn't find one country where the 99 00:08:57,904 --> 00:09:04,204 crime rate or the number of crimes depended on whether a data retention law is in 100 00:09:04,204 --> 00:09:09,894 effect or not. But you'd expect it, you know, considering the breadth and mass 101 00:09:09,894 --> 00:09:14,929 of information that's being recorded. Obviously, there are typically other ways 102 00:09:14,929 --> 00:09:20,362 of clearing crimes than historical records.
And also, blanket retention may 103 00:09:20,362 --> 00:09:26,322 have counterproductive effects, pushing criminals to other channels and making 104 00:09:26,322 --> 00:09:31,089 investigations even more difficult in some cases. And specifically, I want to say, in 105 00:09:31,089 --> 00:09:36,171 relation to child pornography online, which is really the favorite, most popular 106 00:09:36,171 --> 00:09:42,160 argument that proponents use currently. Let me underline that in Germany, without 107 00:09:42,160 --> 00:09:48,739 mandatory data retention in force, 91% of all investigations of child pornography 108 00:09:48,739 --> 00:09:55,265 are cleared. And the crime clearance rate actually dropped when IP data retention 109 00:09:55,265 --> 00:10:01,944 came into force in Germany, about 10 years ago. Besides, anonymous communication 110 00:10:01,944 --> 00:10:07,561 protects children by allowing for the anonymous counseling that they are in need 111 00:10:07,561 --> 00:10:14,330 of, by allowing for anonymous self-help groups, by allowing them to anonymously 112 00:10:14,330 --> 00:10:23,147 file criminal charges. So don't be mistaken about the killer argument of 113 00:10:23,147 --> 00:10:32,850 child pornography. It's an excuse, not a valid justification. 114 00:10:32,850 --> 00:10:38,610 Ebelt: Patrick, since you explained what blanket or general data retention is all 115 00:10:38,610 --> 00:10:45,951 about, it seems pretty clear that it's a huge problem in democracies. And it's a 116 00:10:45,951 --> 00:10:50,718 huge problem for the freedom of our communications. But how about the legal 117 00:10:50,718 --> 00:10:56,947 situation? And this is something TJ McIntyre, chairman of Digital Rights 118 00:10:56,947 --> 00:11:06,442 Ireland, will explain to us right now. TJ… TJ McIntyre: Thanks, Friedemann. So the 119 00:11:06,442 --> 00:11:13,100 situation here, if we go back to the early part of the 2000s, is that even in the 120 00:11:13,100 --> 00:11:17,199 run-up to 9/11, governments were using this kind of data retention essentially in 121 00:11:17,199 --> 00:11:21,624 secret. And they were getting telecom companies to retain this information 122 00:11:21,624 --> 00:11:27,072 without usually any real legal basis. And, after that was exposed, the early part of 123 00:11:27,072 --> 00:11:32,503 the 2000s saw some national laws being rushed in, in a hurry, to try to legalize 124 00:11:32,503 --> 00:11:36,884 this practice. But it also saw a lot of challenges brought by civil rights groups 125 00:11:36,884 --> 00:11:41,463 to these practices. And there were successful challenges in many individual 126 00:11:41,463 --> 00:11:47,574 countries on different grounds, in Germany, Romania, Bulgaria and so on. But 127 00:11:47,574 --> 00:11:52,240 what was most interesting from my perspective was when the battle shifted to 128 00:11:52,240 --> 00:11:56,669 the European level, because there was a move to introduce a European law, which 129 00:11:56,669 --> 00:12:01,440 would require data retention across all of Europe, which was eventually adopted as 130 00:12:01,440 --> 00:12:08,584 the so-called Data Retention Directive. And we in Digital Rights Ireland, along 131 00:12:08,584 --> 00:12:12,691 with colleagues from many other civil rights groups, brought an action seeking to 132 00:12:12,691 --> 00:12:16,331 challenge this.
And eventually we were successful in doing so before the European 133 00:12:16,331 --> 00:12:20,530 Court of Justice in 2014, which invalidated the directive on the basis 134 00:12:20,530 --> 00:12:25,761 that it was essentially disproportionate and a gross invasion of privacy, one that 135 00:12:25,761 --> 00:12:30,611 creates real risks of abuse when we have a piece of legislation which involves 136 00:12:30,611 --> 00:12:35,920 creating these huge dossiers of data on everybody indiscriminately. So that was 137 00:12:35,920 --> 00:12:42,863 2014, and since then we've seen massive national pushback against this finding, 138 00:12:42,863 --> 00:12:47,213 that this kind of indiscriminate data retention is disproportionate and 139 00:12:47,213 --> 00:12:52,414 therefore contrary to European law. And we've seen multiple cases since then where 140 00:12:52,414 --> 00:12:56,791 national governments have tried to persuade the European Court of Justice to 141 00:12:56,791 --> 00:13:05,356 change its tack. There was a judgment in 2016 in a case brought by litigants from 142 00:13:05,356 --> 00:13:12,305 the United Kingdom and from Sweden, the Tele2 and Davis and Watson case. There was 143 00:13:12,305 --> 00:13:19,083 a judgment in 2018 in a case arising from Spain. There was a judgment in 2020 in a 144 00:13:19,083 --> 00:13:24,560 case coming from France and the United Kingdom, the Quadrature du Net and the 145 00:13:24,560 --> 00:13:28,825 Privacy International joint cases. And again and again and again, what national 146 00:13:28,825 --> 00:13:32,033 governments have tried to do is to persuade the European Court of Justice 147 00:13:32,033 --> 00:13:37,287 that it was wrong in 2014. That this finding that mass indiscriminate 148 00:13:37,287 --> 00:13:42,031 surveillance is unacceptable in a democratic society should be rolled back. 149 00:13:42,031 --> 00:13:46,063 Now, to my mind, what's very interesting is what's happening at the moment. Because 150 00:13:46,063 --> 00:13:50,296 there is yet another of these cases, in fact, three parallel cases before the 151 00:13:50,296 --> 00:13:54,571 European Court of Justice at the moment, where national governments have 152 00:13:54,571 --> 00:13:59,947 essentially again tried to square up to the European Court of Justice. Where 153 00:13:59,947 --> 00:14:04,073 collectively – this is really quite remarkable, I don't think we've ever seen 154 00:14:04,073 --> 00:14:09,423 such a coordinated set of national disobedience to court rulings before – 155 00:14:09,423 --> 00:14:14,163 where collectively, national governments across the EU have tried to say to the 156 00:14:14,163 --> 00:14:17,816 European Court of Justice, "we need this kind of mass surveillance," – though, 157 00:14:17,816 --> 00:14:21,590 without producing any evidence to show that it's in fact necessary, as Patrick 158 00:14:21,590 --> 00:14:25,494 points out – "We need this kind of mass surveillance. We think you are wrong. We 159 00:14:25,494 --> 00:14:31,822 want you to change your mind on this." So, this to me is rather worrying because it 160 00:14:31,822 --> 00:14:35,885 shows, I think, a degree of lawlessness here. National governments are unwilling 161 00:14:35,885 --> 00:14:40,116 to accept the findings of the highest court in Europe on this point. In some 162 00:14:40,116 --> 00:14:45,057 countries – in Ireland, for example – the law hasn't been changed at 163 00:14:45,057 --> 00:14:50,471 all.
Despite the multiple judgments in this area, Irish law remains as it was in 164 00:14:50,471 --> 00:14:54,909 2011, so predating all these cases. And the Irish government has essentially 165 00:14:54,909 --> 00:14:59,332 indicated that it plans to keep that law in place until it is forced by the 166 00:14:59,332 --> 00:15:05,045 judgment of the national courts to do away with it. So we have, I think, a very 167 00:15:05,045 --> 00:15:09,509 difficult situation here. In one sense, of course, we've been very lucky. We've 168 00:15:09,509 --> 00:15:13,072 achieved a number of very important judgments from the European Court of 169 00:15:13,072 --> 00:15:17,665 Justice. We have a court there whose members understand the importance of this 170 00:15:17,665 --> 00:15:22,571 issue. But against that, you have a real problem here with national pushback and a 171 00:15:22,571 --> 00:15:26,477 desire to eventually face down the court. And perhaps wait until the 172 00:15:26,477 --> 00:15:30,115 composition of the court changes in future and get more favorable precedents. 173 00:15:30,115 --> 00:15:34,110 So I think, from that perspective, it's very important that at a national level, 174 00:15:34,110 --> 00:15:39,563 we push to increase the political pressure against these laws. And, as far as we can 175 00:15:39,563 --> 00:15:45,031 at European level, we try to push the Commission to act, to take steps against 176 00:15:45,031 --> 00:15:48,341 member states that have refused to implement judgments of the European Court 177 00:15:48,341 --> 00:15:52,846 of Justice, and to ensure compliance with European law. 178 00:15:55,665 --> 00:16:04,492 Ebelt: Thank you, TJ. Now, following the legal situation, let us have a look at the 179 00:16:04,492 --> 00:16:11,444 implementation of law. And here, one of the, or maybe the, most important players 180 00:16:11,444 --> 00:16:17,629 in the European Union is the Commission of the European Union. And this question goes 181 00:16:17,629 --> 00:16:22,626 to Jesper Lund, chairman of the IT-Political Association of Denmark. 182 00:16:23,888 --> 00:16:27,639 Jesper, what is going on at the Commission? 183 00:16:28,757 --> 00:16:31,127 Jesper Lund: Thank you, Friedemann. 184 00:16:31,127 --> 00:16:35,909 So, essentially, ever since the Data Retention Directive was annulled in 2014, 185 00:16:35,909 --> 00:16:39,984 there has been an ongoing discussion: is there going to be a new data retention 186 00:16:39,984 --> 00:16:47,221 instrument at the EU level? And so, besides waiting for the Commission, this 187 00:16:47,221 --> 00:16:51,543 discussion has also been going on in various working groups of the Council, 188 00:16:51,543 --> 00:16:56,771 where member states meet in secret. Fortunately, some of those documents were 189 00:16:56,771 --> 00:17:01,096 leaked or obtained through freedom of information requests. So we sort of 190 00:17:01,096 --> 00:17:05,746 know from this process that, as TJ mentioned, member states are in complete 191 00:17:05,746 --> 00:17:11,918 denial. They refuse to accept that general indiscriminate data retention is illegal, 192 00:17:11,918 --> 00:17:20,300 and try to move on from that starting point. So most recently, I think, the 193 00:17:20,300 --> 00:17:25,019 Commission has sort of taken a... wait-and-see attitude: wait for the next 194 00:17:25,019 --> 00:17:29,919 judgment.
But after the La Quadrature judgment in October last year, 195 00:17:30,693 --> 00:17:37,000 the Commission has come forward with a non-paper in June which generated a lot of 196 00:17:37,000 --> 00:17:44,462 attention. So it's mostly a paper that asks questions of member states, but 197 00:17:44,462 --> 00:17:48,736 sort of reading between the lines of the paper, we can also see what plans the 198 00:17:48,736 --> 00:17:53,583 Commission might have. Not necessarily a new data retention law; that 199 00:17:53,583 --> 00:17:58,303 is one of the options for member states. It could also be guidance for member 200 00:17:58,303 --> 00:18:07,412 states. But sort of going through the paper, it follows roughly the judgment in 201 00:18:07,412 --> 00:18:13,880 the La Quadrature case. One novel aspect of that is that the court reaffirmed that 202 00:18:13,880 --> 00:18:18,945 we cannot have mass surveillance, general and indiscriminate data retention, for the 203 00:18:18,945 --> 00:18:23,212 purpose of combating even serious crime. But unfortunately, the court 204 00:18:23,212 --> 00:18:28,793 said, it is possible, in certain cases, to have general and indiscriminate data 205 00:18:28,793 --> 00:18:33,980 retention for national security, if there is a serious threat to national security 206 00:18:33,980 --> 00:18:40,949 that is genuine and present or foreseeable. It's pretty clear from my 207 00:18:40,949 --> 00:18:45,999 reading of the judgment that this must be an extraordinary situation where 208 00:18:45,999 --> 00:18:50,640 surveilling everybody for a short time can help prevent a very serious threat to 209 00:18:50,640 --> 00:18:54,728 national security. But the Commission is using that, and member states are also 210 00:18:54,728 --> 00:19:02,560 doing that – we'll get to that later – using that as sort of a starting point to have 211 00:19:02,560 --> 00:19:08,177 general and indiscriminate data retention. So the Commission asks – even though 212 00:19:08,177 --> 00:19:11,367 national security is the sole competence of the member states, 213 00:19:11,367 --> 00:19:14,768 and the Commission is very unsure of any legal basis here – 214 00:19:14,768 --> 00:19:21,965 whether there should be an EU instrument on data retention for national security. 215 00:19:21,965 --> 00:19:24,879 One thing to note here is that this might be even worse 216 00:19:24,879 --> 00:19:30,160 than the Data Retention Directive because the Commission indicates that this 217 00:19:30,160 --> 00:19:34,240 should not just cover telecommunications services, as the Data Retention Directive 218 00:19:34,240 --> 00:19:40,000 did, but also so-called over-the-top providers (OTTs), which would be services 219 00:19:40,000 --> 00:19:47,520 like Signal, WhatsApp, Facebook Messenger, and so forth. And this could potentially 220 00:19:47,520 --> 00:19:55,120 be an EU legislative initiative, whereas perhaps data retention for traditional 221 00:19:55,120 --> 00:19:59,760 telecommunications services could remain with member states. The Commission is also 222 00:19:59,760 --> 00:20:03,920 suggesting that there could be a mixed approach of national legislation and 223 00:20:03,920 --> 00:20:10,000 EU legislation. The Commission is also asking member states about targeted data 224 00:20:10,000 --> 00:20:15,580 retention.
This is what the court has said, essentially since the first judgment: 225 00:20:15,580 --> 00:20:20,320 general and indiscriminate data retention is not allowed, but targeted 226 00:20:20,320 --> 00:20:27,680 data retention could be allowed, or is allowed, by EU law. And unfortunately, what 227 00:20:27,680 --> 00:20:33,040 the Commission does in this area is to take every hint in the La Quadrature 228 00:20:33,040 --> 00:20:38,640 judgment and sort of amplify it to make targeted data retention cover as much as 229 00:20:38,640 --> 00:20:44,000 possible. It's pretty clear from the judgment that targeted data retention has 230 00:20:44,000 --> 00:20:48,763 to be the exception, not the rule. It cannot cover half of the population. But 231 00:20:48,763 --> 00:20:53,600 this part is sort of forgotten by the Commission in the non-paper. So they 232 00:20:53,600 --> 00:20:58,720 mentioned a long list of areas: critical infrastructure, transport hubs, and then 233 00:20:58,720 --> 00:21:03,920 areas with above-average crime rates. This is not exactly what the court said; the court 234 00:21:03,920 --> 00:21:10,000 said specific areas with a high incidence of crime, and "above average" could very easily include 235 00:21:10,000 --> 00:21:16,000 a large part of a member state. And then on the person-based targeted data 236 00:21:16,000 --> 00:21:20,960 retention, it sort of mentions almost everybody who could be of interest to the 237 00:21:20,960 --> 00:21:24,880 police: known organized crime groups, individuals convicted of serious crime, 238 00:21:24,880 --> 00:21:30,160 individuals who have been subject to a lawful interception order, individuals on 239 00:21:30,160 --> 00:21:36,240 watchlists, and so forth. You know the tendency of the police and secret services 240 00:21:36,240 --> 00:21:46,377 to put people on watchlists in secret, so this list can presumably be very long. 241 00:21:46,377 --> 00:21:50,160 In connection with this – it actually gets even worse – because in connection with 242 00:21:50,160 --> 00:21:57,760 targeted data retention, the Commission mentions the idea of having subscriber 243 00:21:57,760 --> 00:22:01,840 information collected on everybody, and verified subscriber information, including 244 00:22:01,840 --> 00:22:09,600 an EU-wide mandatory obligation to have registration of anonymous SIM 245 00:22:09,600 --> 00:22:13,840 cards, pay-as-you-go SIM cards. And this is sort of justified by the targeted data 246 00:22:13,840 --> 00:22:18,267 retention. We need to make sure that the right persons can be targeted. 247 00:22:18,267 --> 00:22:23,520 The non-paper also mentions quick-freeze or expedited retention. This is interesting 248 00:22:23,520 --> 00:22:29,440 because quick-freeze data preservation is what civil society has called for as 249 00:22:29,440 --> 00:22:36,107 the alternative to data retention, basically ever since the mid-2000s. 250 00:22:36,107 --> 00:22:41,120 And finally, it goes into the generalized retention of IP addresses, which the Court 251 00:22:41,120 --> 00:22:46,640 of Justice unfortunately allowed on a general and indiscriminate basis in the 252 00:22:46,640 --> 00:22:52,000 La Quadrature judgment, but limited to serious crime. Just looking at the 253 00:22:52,000 --> 00:22:57,520 questions, the Commission is asking member states whether all 254 00:22:57,520 --> 00:23:03,840 relevant cybercrimes are covered by their notion of serious crime.
Let me briefly, 255 00:23:03,840 --> 00:23:09,120 in conclusion, mention some of the member states' reactions to this. Statewatch did a 256 00:23:09,120 --> 00:23:13,920 freedom of information request with the Commission to get the responses from 257 00:23:13,920 --> 00:23:20,640 member states. Most of them refused, but some (Denmark, Finland, Germany, 258 00:23:20,640 --> 00:23:26,613 Hungary, Luxembourg, the Netherlands, and Sweden) provided their responses. 259 00:23:26,613 --> 00:23:33,520 In general, they want an EU instrument, but they're not interested in having that 260 00:23:33,520 --> 00:23:38,560 cover national security. They're also not too keen on targeted data retention, and 261 00:23:38,560 --> 00:23:43,840 they don't really like the idea of quick-freeze. So it's not entirely clear what 262 00:23:43,840 --> 00:23:49,360 this EU instrument should cover. Except that they want, you know, indiscriminate 263 00:23:49,360 --> 00:23:54,640 data retention for everything, but they can't have that. And that is their chosen 264 00:23:54,640 --> 00:23:59,530 state of denial, which has been going on since the first judgment in 2014. 265 00:23:59,530 --> 00:24:05,484 Let me stop here with this summary of the present EU initiative, 266 00:24:05,484 --> 00:24:09,383 and we'll continue later. 267 00:24:09,383 --> 00:24:12,629 Ebelt: Thank you, Jesper. 268 00:24:12,629 --> 00:24:20,800 So the Commission is communicating and negotiating a lot with member state 269 00:24:20,800 --> 00:24:28,640 governments, and the aim seems to be to find new ways for more mass surveillance. 270 00:24:28,640 --> 00:24:33,798 So of course the question is, how do national governments 271 00:24:33,798 --> 00:24:40,586 in the European Union treat fundamental rights and respond to the legal situation? 272 00:24:40,586 --> 00:24:48,557 So what is going on in EU member states? And maybe let us start with France, 273 00:24:48,557 --> 00:24:56,160 where Chloé from EDRi can tell us about the situation. 274 00:24:56,160 --> 00:24:59,662 Chloé Berthélémy: Sure, thanks. I'd like to introduce 275 00:24:59,662 --> 00:25:04,519 a little bit maybe the work of our EDRi member in France? 276 00:25:04,519 --> 00:25:09,222 So EDRi is a network of members, and one of them is La Quadrature du Net. 277 00:25:09,222 --> 00:25:12,946 And they were among the main 278 00:25:12,946 --> 00:25:30,120 *sounds of fixing microphone* 279 00:25:30,120 --> 00:25:35,494 Hmm, would this one work? Yes, OK. Sorry about that. 280 00:25:35,494 --> 00:25:39,565 Yes, so I wanted to talk about like the work of like La Quadrature du Net, 281 00:25:39,565 --> 00:25:45,130 our EDRi member in France. Sorry for all the technical difficulties. 282 00:25:45,130 --> 00:25:49,978 They were one of the main plaintiffs in the case that led to the landmark ruling, 283 00:25:49,978 --> 00:25:53,668 La Quadrature du Net, that was mentioned already a couple of times now. 284 00:25:53,668 --> 00:25:58,006 And they brought this case, like the procedure started already in 2015. 285 00:25:58,006 --> 00:26:02,841 They went in front of the Council of State in France 286 00:26:02,841 --> 00:26:08,856 after the first ruling by the Court of Justice of the European Union 287 00:26:08,856 --> 00:26:15,266 in Digital Rights Ireland and wanted to have the legal framework in France removed 288 00:26:15,266 --> 00:26:21,742 or annulled by the Council of State.
And obviously, that wasn't to the taste of the 289 00:26:21,742 --> 00:26:26,035 highest administrative court in France, and they decided to refer yet another 290 00:26:26,035 --> 00:26:33,847 question to the Court of Justice. That led to the famous ruling in October 2020, 291 00:26:33,847 --> 00:26:39,404 saying that the national legal framework in France is actually contrary to EU law 292 00:26:39,404 --> 00:26:44,800 and to the Charter of Fundamental Rights. But unfortunately, what the kind of, the 293 00:26:44,800 --> 00:26:50,919 aftermath of this ruling in October 2020 shows us is that France is, possibly, one 294 00:26:50,919 --> 00:26:57,006 of the most aggressive, offensive member states in the EU, one which is willing to 295 00:26:57,006 --> 00:27:03,878 really pay the price, and a high price, to keep its mass retention regime in place. 296 00:27:03,878 --> 00:27:09,310 The reason why I'm saying this is because the government made a huge advocacy 297 00:27:09,310 --> 00:27:15,182 campaign towards the Council of State, so the highest administrative court, again 298 00:27:15,182 --> 00:27:21,400 charged with actually giving its decision after the CJEU ruling. They submitted, 299 00:27:21,400 --> 00:27:28,214 weeks before the decision was released by the Council of State, a statement of case. 300 00:27:28,214 --> 00:27:33,671 And in this statement of case, they argued that the Court of Justice 301 00:27:33,671 --> 00:27:38,961 of the European Union had actually abused its powers, 302 00:27:38,961 --> 00:27:44,494 and actually wanted to advise the Council of State to ignore 303 00:27:44,494 --> 00:27:48,353 everything that the court has said, and mentioned that it is 304 00:27:48,353 --> 00:27:53,431 way beyond its jurisdiction to actually limit member states in the EU 305 00:27:53,431 --> 00:27:57,010 in anything related to the fight against terrorism 306 00:27:57,010 --> 00:28:01,511 or anything related to national security, 307 00:28:01,511 --> 00:28:05,946 and that therefore the ruling should be completely ignored. 308 00:28:05,946 --> 00:28:13,110 The Council of State more or less followed this government approach. 309 00:28:13,110 --> 00:28:20,805 Obviously, not as radical a position as the French government's. 310 00:28:20,805 --> 00:28:24,884 Because I think it was reported even in the press, that 311 00:28:24,884 --> 00:28:27,602 at one point the French government 312 00:28:27,602 --> 00:28:33,847 was really willing to even negotiate a reopening of the EU treaties 313 00:28:33,847 --> 00:28:38,367 and notably the Charter. They would go as far as 314 00:28:38,367 --> 00:28:42,332 going against the primary law of the European Union 315 00:28:42,332 --> 00:28:47,741 to change it in order to accommodate France's needs in terms of national security. 316 00:28:47,741 --> 00:28:54,154 Which was pretty strong and quite telling in terms of like the contradiction 317 00:28:54,154 --> 00:29:00,055 that there is with France's typical kind of reputation as a pro-European Union 318 00:29:00,055 --> 00:29:05,233 integration leader. As like, a reputation it has to just drive 319 00:29:05,233 --> 00:29:09,156 EU integration forward and be pro-European in general. 320 00:29:09,156 --> 00:29:12,290 So they're really willing to jeopardize their strategic position 321 00:29:12,290 --> 00:29:18,113 in those fields to keep mass surveillance in place.
322 00:29:18,113 --> 00:29:22,576 And so to all appearances, even if the Council of State said that 323 00:29:22,576 --> 00:29:30,123 the decrees in place since 2015 should be revised, they largely actually give the 324 00:29:30,123 --> 00:29:36,880 legislature all the keys and solutions, corrective solutions, to just maintain the 325 00:29:36,880 --> 00:29:43,040 surveillance regime in place. So how did it do that? I'm not going to go through the 326 00:29:43,040 --> 00:29:48,000 entire judgment, because it's rich and there are a lot of conclusions that we 327 00:29:48,000 --> 00:29:52,000 could like analyze, and it's super interesting. But maybe I would mention one that is quite 328 00:29:52,000 --> 00:29:58,880 telling. And what shows like how France is willing to do anything 329 00:29:58,880 --> 00:30:03,280 to keep its data retention in place, in a manner that is indiscriminate and 330 00:30:03,280 --> 00:30:09,040 general, is the reinterpretation of the notion of national security. And this 331 00:30:09,040 --> 00:30:14,480 notion of national security goes far beyond terrorism, and this was also like 332 00:30:14,480 --> 00:30:18,800 showcased during the hearing held by the Council of State just before it released 333 00:30:18,800 --> 00:30:24,080 its decision. There was the general, the director general of the intelligence 334 00:30:24,080 --> 00:30:28,320 services, talking at the hearing and mentioning: "Actually, for terrorism, we have 335 00:30:28,320 --> 00:30:33,760 all the legal tools at hand. It's not so much that you are limited in competence. 336 00:30:33,760 --> 00:30:40,160 What worries us is more that, if we apply the CJEU ruling now, we will have 337 00:30:40,160 --> 00:30:46,880 less power to actually surveil and spy on people who are at the kind of the 338 00:30:46,880 --> 00:30:52,160 forefront of social movements, or who are organizing like demonstrations, who are 339 00:30:52,160 --> 00:30:57,200 engaged in social justice fights, and so on." And so under this 340 00:30:57,200 --> 00:31:02,640 notion of national security, the Council of State is putting any threat to the 341 00:31:02,640 --> 00:31:07,360 economic interests of the French nation. So they are thinking about economic 342 00:31:07,360 --> 00:31:14,080 espionage, but they're also thinking about mild drug trafficking, even like the 343 00:31:14,080 --> 00:31:19,920 smallest networks in your city suburbs. That could also count as a threat to 344 00:31:19,920 --> 00:31:24,160 national security and justify the indiscriminate and general data retention. 345 00:31:24,160 --> 00:31:31,200 And then lastly, the organization of non-registered protests as a permanent threat 346 00:31:31,200 --> 00:31:37,180 to public peace, they call it "public peace". And so that would 347 00:31:37,180 --> 00:31:44,160 demonstrate a permanent threat to 348 00:31:44,160 --> 00:31:49,600 national security and allow France to keep its indiscriminate and 349 00:31:49,600 --> 00:31:54,880 general data retention regime for good. Completely contrary to what the Court of 350 00:31:54,880 --> 00:32:03,280 Justice said, obviously. And even going beyond what France has in place until now, 351 00:32:03,280 --> 00:32:07,040 which was like the state of emergency. I think this is something that many of you 352 00:32:07,040 --> 00:32:11,600 probably heard about in 2015, during the terrorist attacks.
France reacted 353 00:32:11,600 --> 00:32:16,320 strongly, implemented a lot of measures that were anti-democratic, very much going 354 00:32:16,320 --> 00:32:20,960 against rights and freedoms. That was supposed to be temporary. Unfortunately, 355 00:32:20,960 --> 00:32:26,400 following the ruling by the Court of Justice – and this is also a natural trend 356 00:32:26,400 --> 00:32:31,840 and flow – they decided to bring over all those measures that were exceptionally allowed 357 00:32:31,840 --> 00:32:37,806 in exceptional times. And they proposed recently in April – just a 358 00:32:37,806 --> 00:32:44,344 few weeks after, a month after actually the decision of the Council – a reform that 359 00:32:44,344 --> 00:32:51,040 brings everything, all of these measures, into ordinary law. So obviously what the 360 00:32:51,040 --> 00:32:56,480 Council of State has said: indiscriminate general retention is obviously always OK, 361 00:32:56,480 --> 00:33:01,280 because there are constantly threats to national security. But there are obviously 362 00:33:01,280 --> 00:33:09,680 other measures linked to house arrest, use of drones, cooperation with private actors 363 00:33:09,680 --> 00:33:14,880 to enable government hacking into end devices of users, etc. etc. And so all 364 00:33:14,880 --> 00:33:19,680 of this is packaged into one nice little law. And the latest development that I can 365 00:33:19,680 --> 00:33:24,685 share with you in France is that the … 366 00:33:24,685 --> 00:33:28,800 the socialists in the parliament blocked 367 00:33:28,800 --> 00:33:34,240 submission of this bill to the council, the Constitutional Council of France, the 368 00:33:34,240 --> 00:33:38,720 only kind of institution that is left that kind of like controls a little bit 369 00:33:38,720 --> 00:33:43,840 what the government says and puts forward as legislation. 370 00:33:43,840 --> 00:33:48,647 And unfortunately, the part related, 371 00:33:48,647 --> 00:33:52,560 like the provisions related to data retention in this bill, 372 00:33:52,560 --> 00:33:55,388 weren't submitted to the Constitutional Council, 373 00:33:55,388 --> 00:33:59,188 so they never had any say in this. And so now the project is adopted. 374 00:33:59,188 --> 00:34:01,228 Which is like, this is… 375 00:34:01,228 --> 00:34:05,466 Voilà. Rubber-stamped, what the Council of State has decided for France. 376 00:34:05,466 --> 00:34:11,370 And this will be very difficult in the future to attack again. 377 00:34:11,370 --> 00:34:14,611 Ebelt: Thank you, Chloé. 378 00:34:14,611 --> 00:34:19,273 I must admit that it is extremely interesting to hear about the situation 379 00:34:19,273 --> 00:34:28,872 in France, but it's also extremely shocking to hear about the strong tensions in the 380 00:34:28,872 --> 00:34:35,722 relations between governments and courts, and governments and the rule of law. 381 00:34:35,722 --> 00:34:38,913 And now, to everybody who's watching this talk live, 382 00:34:38,913 --> 00:34:44,436 you can send in your questions to the speakers 383 00:34:44,436 --> 00:34:50,163 by using the hashtag #rc3cwtv, 384 00:34:50,163 --> 00:34:55,283 because we are having a Q&A after, right after the talk. 385 00:34:55,283 --> 00:35:01,282 And maybe we will come back to this point in the Q&A. 386 00:35:01,282 --> 00:35:09,529 And the hashtag is for Mastodon and Twitter. 387 00:35:09,529 --> 00:35:14,193 Since recently, there's a new government in Berlin.
388 00:35:14,193 --> 00:35:19,678 And also, Germany is, next to or together with France, 389 00:35:19,678 --> 00:35:24,947 a big and important player in EU politics. 390 00:35:24,947 --> 00:35:29,788 So also, there's a new situation in Germany with data retention. 391 00:35:29,788 --> 00:35:33,172 And of course, this question goes to Patrick Breyer 392 00:35:33,172 --> 00:35:37,722 as a German member of the European Parliament. 393 00:35:37,722 --> 00:35:42,910 Patrick, what can you tell us about the situation in Germany? 394 00:35:42,910 --> 00:35:47,786 Breyer: Well, legally speaking, indiscriminate data retention legislation 395 00:35:47,786 --> 00:35:53,649 is in force, but it's not being applied due to pending court cases that have 396 00:35:53,649 --> 00:35:59,645 said that it violates EU case law and the Charter of Fundamental Rights. 397 00:35:59,645 --> 00:36:05,440 The European Court of Justice will rule, will decide next year on the compatibility 398 00:36:05,440 --> 00:36:12,493 of the German regime with European fundamental rights. And in the meantime, 399 00:36:12,493 --> 00:36:19,069 the new government has agreed that data should be retained on an ad hoc basis 400 00:36:19,069 --> 00:36:27,168 and by judicial order only. Now, on the one hand, this excludes a sort of 401 00:36:27,168 --> 00:36:31,725 indiscriminate and general regime. But on the other hand, after what you've 402 00:36:31,725 --> 00:36:36,120 heard from previous speakers, you will know that it does not exclude, for 403 00:36:36,120 --> 00:36:42,134 example, a geographically targeted retention that could cover vast parts of 404 00:36:42,134 --> 00:36:48,305 the country with above-average crime rates and the like. Nor does it really exclude the 405 00:36:48,305 --> 00:36:55,353 retention of data referring to a present or foreseeable national security threat, 406 00:36:55,353 --> 00:37:01,566 which could also be said to be on an ad hoc basis. So we'll have to watch 407 00:37:01,566 --> 00:37:07,158 very closely what the government will do. The Liberals and the new Justice Minister 408 00:37:07,158 --> 00:37:13,039 are advocating for quick-freeze. But there is a risk, for example, that in the 409 00:37:13,039 --> 00:37:17,506 pending procedure, the courts will not invalidate indiscriminate IP data 410 00:37:17,506 --> 00:37:21,908 retention. You know, saying that the European Court of Justice said all IP data 411 00:37:21,908 --> 00:37:26,470 retention is OK. And there is a risk that the coalition cannot agree, cannot find a 412 00:37:26,470 --> 00:37:32,343 majority to agree on abolishing it politically. So we'll have to see and 413 00:37:32,343 --> 00:37:39,575 watch very closely how the new government will behave also at a European level. 414 00:37:39,575 --> 00:37:47,123 Ebelt: Thank you, Patrick. You said there is a pending procedure on data retention 415 00:37:47,123 --> 00:37:54,403 in Germany, and I know there's also a pending court case, if I'm not mistaken, 416 00:37:54,403 --> 00:38:00,540 on the national data retention regime in Ireland. 417 00:38:00,540 --> 00:38:07,324 And TJ, what can you tell us about the situation in Ireland? 418 00:38:07,324 --> 00:38:10,373 McIntyre: So there are, in fact, three 419 00:38:10,373 --> 00:38:14,451 cases before the Court of Justice at this moment: one from Germany, one from 420 00:38:14,451 --> 00:38:19,444 Ireland, and a parallel one from France.
And what to me is very interesting about 421 00:38:19,444 --> 00:38:22,868 those cases is not so much the questions that are asked, but how the court has been 422 00:38:22,868 --> 00:38:25,773 dealing with the cases. So the questions that are asked are basically the same 423 00:38:25,773 --> 00:38:31,413 questions over again. Can we have indiscriminate mass retention of data 424 00:38:31,413 --> 00:38:35,385 where we need it for dealing with serious crimes? That is essentially the question 425 00:38:35,385 --> 00:38:38,902 that the Irish court has asked. Again, it's basically putting it up to the 426 00:38:38,902 --> 00:38:44,667 European Court of Justice to change its mind. And then the questions from 427 00:38:44,667 --> 00:38:49,003 Germany are very similar, because we're dealing with a law which is again 428 00:38:49,003 --> 00:38:53,333 indiscriminate, albeit that the German retention period has now been reduced to 429 00:38:53,333 --> 00:38:58,910 approximately 10 weeks, I think, Patrick. And the question from France is a slightly 430 00:38:58,910 --> 00:39:02,645 more technical question in a parallel area of law. But again, the national 431 00:39:02,645 --> 00:39:06,583 governments were taking the opportunity here to push the agenda of looking to 432 00:39:06,583 --> 00:39:11,575 rewind the time machine to 2014, prior to the Digital Rights Ireland judgment, and 433 00:39:11,575 --> 00:39:15,398 go back to a situation where mass indiscriminate retention was allowed. 434 00:39:15,398 --> 00:39:18,112 What the court did in dealing with these cases, which to my mind was very interesting, 435 00:39:18,112 --> 00:39:21,742 was it initially said, right, we're going to 436 00:39:21,742 --> 00:39:26,277 ask the national courts, do they really want us to hear these cases? 437 00:39:26,277 --> 00:39:29,134 After the La Quadrature du Net judgment, the Court of Justice 438 00:39:29,134 --> 00:39:31,608 reached out to the Irish Supreme Court, for example, and said to it 439 00:39:31,608 --> 00:39:35,079 essentially, "Listen, we've already answered your question. Do you 440 00:39:35,079 --> 00:39:41,166 really want to go ahead with this case?" And in fact, the advocate general 441 00:39:41,166 --> 00:39:48,033 suggested something even more dramatic, if you like, as a response. 442 00:39:48,033 --> 00:39:51,164 Where he said that the response of the court should be 443 00:39:51,164 --> 00:39:58,219 to dispose of the case using Article 99 of the Rules of Procedure of the court. 444 00:39:58,219 --> 00:40:01,880 Now that might not sound very interesting, but Article 99 basically means you can take 445 00:40:01,880 --> 00:40:05,814 an incoming request from a national court and say, "We've dealt with this already. 446 00:40:05,814 --> 00:40:10,608 We don't need to hear this case, and we can dispose of it without a hearing." So 447 00:40:10,608 --> 00:40:16,672 the advocate general and I think the court itself is intent here on sending a signal 448 00:40:16,672 --> 00:40:21,144 that we've decided, we've made up our minds regarding these cases, we're not 449 00:40:21,144 --> 00:40:24,412 interested in hearing more and more national cases coming back to us from 450 00:40:24,412 --> 00:40:28,388 national governments. We'd really like you to change your mind now. The Court of 451 00:40:28,388 --> 00:40:32,911 Justice, to my mind, is about to send a signal here where it says, look, the law 452 00:40:32,911 --> 00:40:37,240 on this point is settled.
Please go and try and implement that law in good faith, 453 00:40:37,240 --> 00:40:41,560 as opposed to coming back to us with ever more ingenious ways of arguing in favor of 454 00:40:41,560 --> 00:40:45,809 mass data retention. The real question, of course, is whether that's going to happen, 455 00:40:45,809 --> 00:40:50,382 whether national courts are going – national governments, I should say – are 456 00:40:50,382 --> 00:40:54,367 prepared to respect the rule of law. Or whether, as Chloé pointed out, they're 457 00:40:54,367 --> 00:40:59,862 going to be prepared to continue to manufacture a crisis, to manufacture a 458 00:40:59,862 --> 00:41:04,806 collision between national law and European law for the sake of promoting 459 00:41:04,806 --> 00:41:09,663 this surveillance agenda. And unfortunately, I suspect the national 460 00:41:09,663 --> 00:41:13,156 governments are more likely to do the latter than the former. I think it is very 461 00:41:13,156 --> 00:41:17,562 likely that we'll continue to see pushback from them. 462 00:41:17,562 --> 00:41:25,520 Ebelt: Thank you. Next question goes to Jesper. And it would be, how is the signal 463 00:41:25,520 --> 00:41:31,520 – that's how you, TJ, framed what's going on – how is the signal from the European 464 00:41:31,520 --> 00:41:36,328 Court of Justice received in Denmark? 465 00:41:36,328 --> 00:41:38,855 Lund: Well, thank you very much. 466 00:41:38,855 --> 00:41:39,890 *cough* 467 00:41:39,890 --> 00:41:46,160 Sorry. Well, the current Danish data retention law, which is about to be 468 00:41:46,160 --> 00:41:50,880 updated, is essentially the old data retention directive, so general and 469 00:41:50,880 --> 00:41:54,064 indiscriminate data retention of telecommunications services, 470 00:41:54,064 --> 00:41:56,293 kept for one year. 471 00:41:56,293 --> 00:42:01,840 There's a court challenge to that which is still ongoing. The Association 472 00:42:01,840 --> 00:42:07,360 Against Illegal and Mass Surveillance actually lost the case in the first 473 00:42:07,360 --> 00:42:11,120 instance because the government, the Ministry of Justice, argued that the 474 00:42:11,120 --> 00:42:15,360 Danish law should not be annulled. Rather, it should not be applied to the extent 475 00:42:15,360 --> 00:42:21,520 that it is against EU law. So to some extent, this is the same situation as in 476 00:42:21,520 --> 00:42:29,600 Germany, although the Danish telecommunications providers are retaining 477 00:42:29,600 --> 00:42:34,320 the data voluntarily as though the law is still in effect. So to some extent, that 478 00:42:34,320 --> 00:42:38,880 is a sweet spot for the government. Officially they do not have to apply the 479 00:42:38,880 --> 00:42:44,560 data retention law, but telecommunications providers are respecting it anyhow. 480 00:42:44,560 --> 00:42:49,440 Nonetheless, the Danish government has taken upon itself the task of adjusting 481 00:42:49,440 --> 00:42:55,280 Danish law, claiming that after these adjustments, it will be compatible with 482 00:42:55,280 --> 00:42:59,280 the case law of the Court of Justice. So that sounds very interesting. 483 00:42:59,280 --> 00:43:04,640 Unfortunately it's a total exercise in circumventing the court, because, in the 484 00:43:04,640 --> 00:43:10,640 end, we will have almost exactly the same data retention as we have today. It'll 485 00:43:10,640 --> 00:43:15,520 just be relabeled in a way that the government claims it complies with the 486 00:43:15,520 --> 00:43:20,240 Court of Justice. 
And sort of the main vehicle for doing that is the same one 487 00:43:20,240 --> 00:43:27,520 used in France, namely data retention for national security. So Denmark is going to 488 00:43:27,520 --> 00:43:32,000 claim, similar to what France is doing, that there is a quasi-permanent threat to 489 00:43:32,000 --> 00:43:38,720 national security, which justifies the general and indiscriminate retention of 490 00:43:38,720 --> 00:43:47,520 all communications data. Some of the safeguards in the La Quadrature 491 00:43:47,520 --> 00:43:53,280 judgment, such as review by an independent court of these renewable 492 00:43:53,280 --> 00:44:00,080 decisions for general and indiscriminate data retention, are ignored completely. 493 00:44:00,080 --> 00:44:04,800 The Ministry of Justice says, well, you can sue us if you disagree with our decisions, 494 00:44:04,800 --> 00:44:10,240 and by the way, your civil court case will not get access to all the evidence 495 00:44:10,240 --> 00:44:15,520 that the Ministry of Justice used to justify the general and 496 00:44:15,520 --> 00:44:19,520 indiscriminate data retention due to a threat to national security. So it's an 497 00:44:19,520 --> 00:44:23,920 almost impossible situation. However, it gets even worse, because if you have 498 00:44:23,920 --> 00:44:28,240 data retention for national security, you would sort of, by the principle of purpose 499 00:44:28,240 --> 00:44:33,280 limitation, assume that it is limited to that purpose only. The Danish 500 00:44:33,280 --> 00:44:38,080 government disagrees with that, because the retained data, similar to France, can 501 00:44:38,080 --> 00:44:42,960 also be used for serious crime. So that is, in effect, maintaining the current 502 00:44:42,960 --> 00:44:47,760 data retention regime, except relabeling it as data retention for national 503 00:44:47,760 --> 00:44:53,760 security, but mainly used for the purpose of combating serious crime, as it is done 504 00:44:53,760 --> 00:45:01,520 currently. There's a small catch here. The Danish government recognizes that there is 505 00:45:01,520 --> 00:45:05,920 significant legal uncertainty with this interpretation that the data can be used or 506 00:45:05,920 --> 00:45:10,720 accessed in cases of serious crime. So it's actually very possible that 507 00:45:10,720 --> 00:45:15,920 Denmark will one day send a data retention case to Luxembourg. So let's see 508 00:45:15,920 --> 00:45:21,280 how that goes, when the Court of Justice believes that every possible question 509 00:45:21,280 --> 00:45:26,480 about data retention has been answered. But this is not the end of the story in 510 00:45:26,480 --> 00:45:31,600 Denmark. So the Minister of Justice is aware that one day it may not be possible 511 00:45:31,600 --> 00:45:37,040 to maintain general indiscriminate data retention, because it has to be for a time-limited 512 00:45:37,040 --> 00:45:42,240 period, so that cannot be evaded forever. There's also the possibility that 513 00:45:42,240 --> 00:45:47,120 the Danish government might lose a court case. So as an insurance policy to cover 514 00:45:47,120 --> 00:45:54,400 this situation, there is a provision on targeted data retention. This would only 515 00:45:54,400 --> 00:45:58,480 kick in if the general and indiscriminate data retention for national security 516 00:45:58,480 --> 00:46:04,160 cannot continue.
And it is not really targeted because, just like the 517 00:46:04,160 --> 00:46:09,760 European Commission obviously tries to do with the non-paper that I 518 00:46:09,760 --> 00:46:15,120 described earlier, the Danish government is taking the possibilities for 519 00:46:15,120 --> 00:46:21,360 targeted data retention and adding them together to the extreme. So the 520 00:46:21,360 --> 00:46:27,120 general criterion for above-average crime rates is defined in a way that makes no 521 00:46:27,120 --> 00:46:33,280 adjustment for population density. So any city or city-like area in Denmark will 522 00:46:33,280 --> 00:46:38,830 have an above-average number of crime cases, and that will be included in the 523 00:46:38,830 --> 00:46:45,120 geographical targeted data retention. So 5%… sorry, 75% of the Danish population 524 00:46:45,120 --> 00:46:50,880 lives in 5% of the Danish territory, the cities. They will be surveilled just like 525 00:46:50,880 --> 00:46:56,400 before. On top of that, you have infrastructure sites: every train station, 526 00:46:56,400 --> 00:47:03,760 or almost every train station, and the mobile towers are selected so that these 527 00:47:03,760 --> 00:47:08,480 areas are covered in full, even though it means surveilling people outside these 528 00:47:08,480 --> 00:47:14,400 areas. So in the end, with the targeted data retention, something like 80 to 90% 529 00:47:14,400 --> 00:47:19,760 of the Danish population will be covered. On top of that, there are person-based 530 00:47:19,760 --> 00:47:24,970 criteria, where every person convicted of serious crime and every person that's 531 00:47:24,970 --> 00:47:29,920 been subject to lawful interception is included, criteria mentioned in the non-paper from 532 00:47:29,920 --> 00:47:35,760 the Commission. And even with all of that – generalized indiscriminate data 533 00:47:35,760 --> 00:47:40,640 retention continuing, an insurance policy with targeted data retention that covers 534 00:47:40,640 --> 00:47:45,840 80 to 90% of the Danish population – the Danish politicians, those that are in 535 00:47:45,840 --> 00:47:51,600 favor of data retention, which is a vast majority, complain that they have to 536 00:47:51,600 --> 00:47:55,760 restrict the law because of the Court of Justice in Luxembourg. And they are 537 00:47:55,760 --> 00:48:00,320 really using rhetoric that we would expect 538 00:48:00,320 --> 00:48:05,920 from Hungary and Poland. Judges lack democratic legitimacy; why should they 539 00:48:05,920 --> 00:48:10,798 interfere with Danish politics, and so forth. That is a really terrible situation 540 00:48:10,798 --> 00:48:20,033 for the rule of law in Denmark and Europe in general. So this is sort of... 541 00:48:20,033 --> 00:48:26,902 A couple of minor tweaks also: there will be mandatory SIM card registration in 542 00:48:26,902 --> 00:48:31,783 Denmark, as one of the last EU member states to introduce that, 543 00:48:31,783 --> 00:48:36,493 unfortunately. There are a couple of others that also don't have it yet. 544 00:48:36,493 --> 00:48:42,372 And the threshold for serious crime will be lowered as well. So in effect, even though 545 00:48:42,372 --> 00:48:46,829 this is presented as adjusting to the case law of the Court of Justice, we will 546 00:48:46,829 --> 00:48:52,099 in practice have more data retention, and the police will have easier access to the 547 00:48:52,099 --> 00:48:57,507 data.
I really hope that this does not become a blueprint for how other member 548 00:48:57,507 --> 00:49:03,008 states in Europe adapt to the case law from the Court of Justice. 549 00:49:03,008 --> 00:49:07,793 But it is unfortunately following the non-paper from the Commission, 550 00:49:07,793 --> 00:49:13,292 perhaps taking it to a greater extreme than the Commission intended. 551 00:49:13,292 --> 00:49:17,798 But certainly not the response that we hoped for. 552 00:49:17,798 --> 00:49:21,446 So the fight in Denmark will continue, I can assure you of that. 553 00:49:21,446 --> 00:49:25,737 And let me stop here and pass the word back to Friedemann. 554 00:49:25,737 --> 00:49:28,240 Ebelt: Thank you, Jesper. Yes… 555 00:49:28,240 --> 00:49:31,187 I think the fight needs to continue. 556 00:49:31,187 --> 00:49:38,698 And you said that a lot of data retention politics has to do with 557 00:49:38,698 --> 00:49:44,006 circumventing the court and ignoring decisions and the rule of law. 558 00:49:44,006 --> 00:49:52,921 And on an EU level, a lot of this politics takes place, of course, in processes at the 559 00:49:52,921 --> 00:49:58,041 European Parliament and at the Commission. So… 560 00:49:58,041 --> 00:50:01,152 And after hearing this, I… Yeah. I hope… 561 00:50:01,152 --> 00:50:08,564 Chloé, maybe you have some good news for us? What is the situation in Belgium? 562 00:50:08,564 --> 00:50:13,089 Berthélémy: I'm afraid there's not such good news from Belgium either. 563 00:50:13,089 --> 00:50:17,039 Let me try to sketch the situation a little bit, from what has happened 564 00:50:17,039 --> 00:50:23,103 since, again, the landmark ruling in October 2020. So it's funny. The 565 00:50:23,103 --> 00:50:28,918 Constitutional Court in Belgium released its decision following that ruling 566 00:50:28,918 --> 00:50:35,580 on the 21st of April. So that means one day after the French Council of State gave 567 00:50:35,580 --> 00:50:41,170 its decision. And I listened to the President of the Court of Justice, who was 568 00:50:41,170 --> 00:50:46,302 once invited to the French National Assembly, in front of the committee 569 00:50:46,302 --> 00:50:50,968 specialized in legal affairs and European affairs. And he was saying, "Oh, don't 570 00:50:50,968 --> 00:50:55,194 you imagine that those two jurisdictions, the two courts, obviously talk to each other, 571 00:50:55,194 --> 00:50:59,346 and this is why they released their judgments so close to one another." And 572 00:50:59,346 --> 00:51:04,032 those are two very close neighboring countries, friendly countries. So you can imagine that 573 00:51:04,032 --> 00:51:08,752 they discussed and exchanged their points of view on the CJEU ruling. 574 00:51:08,752 --> 00:51:13,391 I was like, well, if this is the case, then probably the conclusion 575 00:51:13,391 --> 00:51:17,871 of their talks was "we agree to disagree." Because the Constitutional Court of 576 00:51:17,871 --> 00:51:23,165 Belgium chose a completely divergent way compared to the French Council of State – 577 00:51:23,165 --> 00:51:29,641 which, I remind you, chose to go completely rogue and ignore the court's 578 00:51:29,641 --> 00:51:36,586 main conclusions. The Constitutional Court decided to basically implement what the 579 00:51:36,586 --> 00:51:44,720 CJEU said. And it gave the Belgian government the task of finding the solution 580 00:51:44,720 --> 00:51:50,995 for itself. So something completely different from what the French Council of State has done.
581 00:51:50,995 --> 00:51:57,486 Which, in its case, really amounted to giving the French government the concrete 582 00:51:57,486 --> 00:52:04,400 corrective measures to keep its regime in place. In Belgium, it was: your legal 583 00:52:04,400 --> 00:52:10,509 regime is invalid and is annulled; now you have to work on the solutions 584 00:52:10,509 --> 00:52:14,430 yourself. And so this is what the government has been doing. They did 585 00:52:14,430 --> 00:52:19,740 it in only a month. So a month later they came up with a bill, with a proposal 586 00:52:19,740 --> 00:52:25,794 for a new law. And that was proposed by the Council of Ministers. Mainly, what the 587 00:52:25,794 --> 00:52:32,365 bill contained is a regime for targeted retention. So they are not 588 00:52:32,365 --> 00:52:39,000 even going for the national security mass retention approach. They try out the 589 00:52:39,000 --> 00:52:44,695 targeted retention approach, and they mainly focus on the criterion of 590 00:52:44,695 --> 00:52:52,768 geographical areas. They also include individual-based criteria, but mainly they 591 00:52:52,768 --> 00:52:57,657 focus on how they can maintain data retention as much as possible based on 592 00:52:57,657 --> 00:53:04,320 these geographical measures and measurements. And this is basically what 593 00:53:04,320 --> 00:53:09,894 Jesper explained for Denmark. This isn't very far from actually including the 594 00:53:09,894 --> 00:53:15,154 entire country under this "targeted" data retention. The way they do that is, 595 00:53:15,154 --> 00:53:21,305 they first select geographical areas that they call "by nature sensitive" 596 00:53:21,305 --> 00:53:26,382 for national security or for any kind of public security. And that includes 597 00:53:26,382 --> 00:53:30,989 airports, train stations, metro stations (so you can already imagine that Brussels 598 00:53:30,989 --> 00:53:36,235 is entirely covered), the border zones with the neighboring countries, 599 00:53:36,235 --> 00:53:41,840 hospitals, motorways (there are a lot of motorways in Belgium), research centers, 600 00:53:41,840 --> 00:53:46,566 so everything that has to do with state innovation and state research, 601 00:53:46,566 --> 00:53:51,217 justice and police buildings, and all such infrastructure, and then all the 602 00:53:51,217 --> 00:53:57,970 municipalities. So the entire territory of a municipality, even a small one, 603 00:53:57,970 --> 00:54:03,673 which has critical infrastructure on its territory. So water supply, energy 604 00:54:03,673 --> 00:54:08,507 supply, everything. And you can already see, it's just this list of 605 00:54:08,507 --> 00:54:14,964 geographical places that the government selected. Given the density, 606 00:54:14,964 --> 00:54:20,474 the urban density of Belgium, and the size of the country, it already covers quite a 607 00:54:20,474 --> 00:54:25,319 lot of people. And a large proportion of the population will be subjected to data 608 00:54:25,319 --> 00:54:32,644 retention, to this retention that is "targeted" in name only. And they also use, like 609 00:54:32,644 --> 00:54:39,918 Denmark, the average crime rate. This has been criticized heavily by the 610 00:54:39,918 --> 00:54:46,580 Data Protection Authority in Belgium. They said that the Minister of Justice failed to 611 00:54:46,580 --> 00:54:52,883 provide any statistics to actually explain why they decided on this number, on this amount 612 00:54:52,883 --> 00:54:59,196 of years.
And they even criticized the source of the statistics that will be used 613 00:54:59,196 --> 00:55:04,900 to determine whether a judicial district will be subjected to data retention or 614 00:55:04,900 --> 00:55:10,255 not, because the government wants to use a police database where crimes are 615 00:55:10,255 --> 00:55:15,212 registered. But it's exclusively managed by police 616 00:55:15,212 --> 00:55:19,593 officers. So there is a high risk, and a conflict of interest, that a police officer 617 00:55:19,593 --> 00:55:26,226 will just turn one minor act or one minor offense into a serious crime, and 618 00:55:26,226 --> 00:55:31,442 therefore their police district or their judicial district will fall under data 619 00:55:31,442 --> 00:55:37,670 retention. The database is called the BNG. And it was heavily criticized by 620 00:55:37,670 --> 00:55:43,120 journalists in Belgium. They released an entire investigation into the BNG, and 621 00:55:43,120 --> 00:55:48,701 they showed that the BNG contained a lot of false information, 622 00:55:48,701 --> 00:55:56,668 rumors, unverified information, and outdated information as well. And so the 623 00:55:56,668 --> 00:56:00,925 DPA, the Data Protection Authority, required that they use a different 624 00:56:00,925 --> 00:56:06,992 database, with actual criminal offenses that led to a conviction, that led to a 625 00:56:06,992 --> 00:56:12,435 criminal sentence. Which makes more sense, though even that is not a given. So these are the 626 00:56:12,435 --> 00:56:18,224 problems we see with the Belgian bill. It is not limited to that; these are only 627 00:56:18,224 --> 00:56:24,127 the two things that I can mention now. There are many other problems that the DPA 628 00:56:24,127 --> 00:56:31,520 objected to. But for now, the chance we had is that this bill also contained very 629 00:56:31,520 --> 00:56:38,259 dangerous and controversial provisions on access to encrypted content, with the 630 00:56:38,259 --> 00:56:43,291 possibility to force service providers to switch off encryption for certain users. 631 00:56:43,291 --> 00:56:47,809 And thanks to that, there was enough resistance from civil society and outrage 632 00:56:47,809 --> 00:56:53,224 among the public to halt the bill a little bit. So now it's still under negotiation 633 00:56:53,224 --> 00:56:58,312 between the ministers before it is presented to the parliament. 634 00:56:58,312 --> 00:57:03,546 But we hopefully can also bring some more attention and traction 635 00:57:03,546 --> 00:57:10,091 to the data retention provisions of this law, and try to… 636 00:57:10,091 --> 00:57:13,072 Yeah, halt as much as possible 637 00:57:13,072 --> 00:57:19,662 the general mass retention of metadata in Belgium as well. 638 00:57:19,662 --> 00:57:26,328 Ebelt: Thank you, Chloé. OK. There are many, many problems, but luckily there's 639 00:57:26,328 --> 00:57:31,547 also civil society, and there are also freedom advocates. 640 00:57:31,547 --> 00:57:40,757 So the big question I would really, really like to hear your opinion on is: 641 00:57:40,757 --> 00:57:43,204 what do you expect from the future? 642 00:57:43,204 --> 00:57:50,636 How should governments – but also maybe the Commission of the European Union – act? 643 00:57:50,636 --> 00:57:54,219 What should they do? 644 00:57:54,219 --> 00:57:59,849 Let's hear TJ first. 645 00:57:59,849 --> 00:58:06,078 McIntyre: Thanks Friedemann.
Well, I think the problem is, we know what governments 646 00:58:06,078 --> 00:58:11,113 should do, which is comply with the law, and they're unwilling to do that. So 647 00:58:11,113 --> 00:58:16,282 perhaps the question could become: what can we do to force them to comply with the 648 00:58:16,282 --> 00:58:22,091 law? Now, as civil society, we are collectively already very much 649 00:58:22,091 --> 00:58:26,885 overstretched, I think. Particularly, at the moment most people are doing this, 650 00:58:26,885 --> 00:58:34,715 myself included, as a part-time thing. It's unusual to have an organization such 651 00:58:34,715 --> 00:58:40,698 as EDRi, which is quite professional in this regard; particularly in the smaller 652 00:58:40,698 --> 00:58:44,475 member states, this tends to be a part-time activity for a small number of 653 00:58:44,475 --> 00:58:48,460 technologists, a small number of lawyers, and so on. So maybe the first thing 654 00:58:48,460 --> 00:58:52,374 everybody should be doing is supporting their local digital rights organization, 655 00:58:52,374 --> 00:58:57,818 and I think we'd probably all agree on that. Otherwise, we're caught in something of a 656 00:58:57,818 --> 00:59:04,129 loop here where we're being reactive. Governments put forward laws which are 657 00:59:04,129 --> 00:59:09,853 ever more draconian, which very often breach existing precedent from the Court 658 00:59:09,853 --> 00:59:14,994 of Justice, never mind the Court of Human Rights. And we as civil society have to 659 00:59:14,994 --> 00:59:20,950 respond to that very expensively. The cost, for governments, of introducing new 660 00:59:20,950 --> 00:59:26,700 measures is, relatively speaking, low. In the sense that, if it doesn't meet with 661 00:59:26,700 --> 00:59:33,663 great domestic political pushback, it's quite straightforward for them to push 662 00:59:33,663 --> 00:59:37,683 forward new measures. And those measures can often remain in place for months or 663 00:59:37,683 --> 00:59:43,164 even years before there is litigation to challenge them, if indeed it's possible in 664 00:59:43,164 --> 00:59:47,418 a particular jurisdiction to bring litigation to challenge them. So as civil 665 00:59:47,418 --> 00:59:52,716 society, we're always on the back foot here. It is very much a reactive sort of 666 00:59:52,716 --> 00:59:57,692 game that we're playing. Ultimately, we need to increase the costs of pushing 667 00:59:57,692 --> 01:00:03,063 these kinds of very illiberal measures, and we need to do that at the point when 668 01:00:03,063 --> 01:00:06,490 those measures are being proposed and adopted. And I think we can learn here 669 01:00:06,490 --> 01:00:11,280 from the German experience and the way in which data retention and encryption of 670 01:00:11,280 --> 01:00:16,803 communications have been baked into the coalition government's negotiations. 671 01:00:16,803 --> 01:00:22,016 That's something which I think we, as voters and advocates, need to try to get 672 01:00:22,016 --> 01:00:27,221 our governments to do at the point where those governments are being formed. Short 673 01:00:27,221 --> 01:00:30,477 of that, though, I don't really have any great answer, Friedemann. I'm sorry. 674 01:00:30,477 --> 01:00:34,581 Perhaps somebody else might be able to take it further. 675 01:00:34,581 --> 01:00:43,200 Ebelt: I think that's a great answer. And yet, Patrick, what are your 676 01:00:43,200 --> 01:00:54,219 thoughts on the future?
Breyer: Well, I can tell you for the European 677 01:00:54,219 --> 01:01:02,889 Parliament that I don't know what the majorities would be if the Commission 678 01:01:02,889 --> 01:01:08,600 proposed another piece of data retention legislation. Because, you know – having 679 01:01:08,600 --> 01:01:14,367 seen what's happened with chat control, where they justified even blanket, 680 01:01:14,367 --> 01:01:19,848 indiscriminate scanning of the content of communications, using the child protection 681 01:01:19,848 --> 01:01:24,336 killer argument – I'm not sure that the European Parliament's majority would go 682 01:01:24,336 --> 01:01:30,906 against another data retention instrument, especially if it claims to abide by the 683 01:01:30,906 --> 01:01:35,738 European Court of Justice jurisprudence. And also, I'm very outraged at the 684 01:01:35,738 --> 01:01:39,751 European Data Protection Supervisor who, in those court hearings that we've 685 01:01:39,751 --> 01:01:46,080 discussed earlier, actually undermines the ECJ jurisprudence and says, you know, what 686 01:01:46,080 --> 01:01:51,097 matters is access to data, not so much the storage. It's really outrageous. But 687 01:01:51,097 --> 01:01:56,653 one good thing from the European Parliament is that in the pending trilogue 688 01:01:56,653 --> 01:02:04,282 on the ePrivacy regulation, on the reform, the majority agrees that we won't accept 689 01:02:04,282 --> 01:02:08,791 data retention in that specific instrument, because it's about ePrivacy and 690 01:02:08,791 --> 01:02:15,742 not about e-surveillance. So what I'm trying to do at the EU level is to 691 01:02:15,742 --> 01:02:21,850 push back in the very early stages of the political process. First of all, I'm very 692 01:02:21,850 --> 01:02:28,239 happy that I found Friedemann to support my work. Last year, I 693 01:02:28,239 --> 01:02:32,000 commissioned a study by the European Parliament's research service to compare 694 01:02:32,000 --> 01:02:36,026 crime rates throughout the EU. I've already told you about that. And 695 01:02:36,026 --> 01:02:41,102 currently, I have commissioned a poll to find out public opinion on data 696 01:02:41,102 --> 01:02:46,964 retention in several EU countries. We'll have the results early next year. And 697 01:02:46,964 --> 01:02:55,325 I will also commission a legal opinion: I will ask a former European 698 01:02:55,325 --> 01:03:00,206 Court of Justice judge to write a legal opinion on the French resurrection of 699 01:03:00,206 --> 01:03:04,956 indiscriminate data retention, because that is a model that 700 01:03:04,956 --> 01:03:09,424 more and more countries are using. So if you have any more ideas about 701 01:03:09,424 --> 01:03:14,890 what we could do at EU level, please let me know. 702 01:03:14,890 --> 01:03:18,670 Ebelt: Thank you. 703 01:03:18,670 --> 01:03:26,546 Well, Chloé, what do you expect of the future, or maybe of 2022, 704 01:03:26,546 --> 01:03:32,515 from your EDRi, European Digital Rights, NGO perspective? 705 01:03:32,515 --> 01:03:35,520 Berthélémy: Sure. Well, we'll continue 706 01:03:35,520 --> 01:03:40,320 obviously monitoring the situation at EU level, just like Patrick does, only 707 01:03:40,320 --> 01:03:43,740 with our network of experts and NGOs.
708 01:03:43,740 --> 01:03:47,462 Obviously, looking at what the Commission has in mind, and where this 709 01:03:47,462 --> 01:03:52,466 long process of thinking about how all of this can be put together 710 01:03:52,466 --> 01:04:00,009 to enable mass data retention 711 01:04:00,009 --> 01:04:06,415 without insulting the Court of Justice too much will actually lead. 712 01:04:06,415 --> 01:04:10,613 That would obviously be one of our main tasks for the future. 713 01:04:10,613 --> 01:04:15,623 We'll continue, as I said, as a network to monitor what's going on at national level. 714 01:04:15,623 --> 01:04:21,755 And as TJ said, we are lacking resources, especially at national level, 715 01:04:21,755 --> 01:04:25,617 to follow all 27 jurisdictions. 716 01:04:25,617 --> 01:04:29,794 So if you're interested, and it concerns your country, I would advise 717 01:04:29,794 --> 01:04:40,928 viewers to look up our map of members on EDRi's website. And 718 01:04:40,928 --> 01:04:47,913 you can join and get in touch with some of them at their contact email address, 719 01:04:47,913 --> 01:04:52,560 if you want to lend a hand and contribute to the monitoring. Because the 720 01:04:52,560 --> 01:04:56,480 first step of what we're doing as civil society is just bringing those 721 01:04:56,480 --> 01:05:02,080 developments to light. Because in most cases, it just goes under 722 01:05:02,080 --> 01:05:07,680 the radar. The media isn't picking up the stories as quickly as we would like them 723 01:05:07,680 --> 01:05:15,200 to, and all those kinds of really rights-violating measures can go unchecked, 724 01:05:15,200 --> 01:05:20,080 without any kind of democratic pushback or anything. So this is the first thing 725 01:05:20,080 --> 01:05:25,440 that I would recommend viewers do if they want to get engaged: basically, 726 01:05:25,440 --> 01:05:32,720 join us. Follow us on social media. Follow our website. EDRi has a newsletter where 727 01:05:32,720 --> 01:05:37,600 each and every member of EDRi can contribute and write, and even guest 728 01:05:37,600 --> 01:05:44,160 writers sometimes. If you want to write about the situation in your country 729 01:05:44,160 --> 01:05:49,600 and you've investigated the state of play a little bit, please talk to us and drop 730 01:05:49,600 --> 01:05:55,120 us an email. Obviously, all the contact information can be found on 731 01:05:55,120 --> 01:05:59,923 our website. And you can obviously subscribe to this newsletter. 732 01:05:59,923 --> 01:06:03,000 If you want to go a bit further and get really engaged, 733 01:06:03,000 --> 01:06:05,856 the next step on the scale of engagement is: 734 01:06:05,856 --> 01:06:09,931 you can join our mailing list dedicated to the topic of data retention, 735 01:06:09,931 --> 01:06:13,680 just by dropping me an email. That's if you're really into it and 736 01:06:13,680 --> 01:06:19,098 want to contribute actively to the analysis, to possible future campaigns, 737 01:06:19,098 --> 01:06:25,320 or to any kind of advocacy actions we're organizing at EU level. 738 01:06:25,320 --> 01:06:28,295 And then… 739 01:06:28,295 --> 01:06:31,779 I think I didn't forget anything you can do as viewers. 740 01:06:31,779 --> 01:06:37,043 In general, what we're looking for: we'll try to push the Commission.
It's… 741 01:06:37,043 --> 01:06:40,678 I think it's a dead wish, but I will mention it nonetheless. 742 01:06:40,678 --> 01:06:45,577 We would like the Commission to launch infringement procedures 743 01:06:45,577 --> 01:06:51,588 against countries that do not comply with the CJEU rulings. 744 01:06:51,588 --> 01:06:56,039 As I said, it's a dead wish, because this is a highly political topic. 745 01:06:56,039 --> 01:07:02,400 The Commission has stated multiple times in public that it won't do this. 746 01:07:02,400 --> 01:07:06,400 They're not interested in doing this. They're interested in being in a 747 01:07:06,400 --> 01:07:11,760 cooperative state of mind, or spirit of collaboration, with member states to find 748 01:07:11,760 --> 01:07:15,680 solutions. Another way of saying: we will ignore what the 749 01:07:15,680 --> 01:07:20,788 ruling says and try to find solutions that can work out and that avoid 750 01:07:20,788 --> 01:07:26,960 the painful and embarrassing situation of having a future EU legal instrument 751 01:07:26,960 --> 01:07:32,160 struck down by their own courts. But yeah, this is to be seen. We'll work together 752 01:07:32,160 --> 01:07:35,090 with Jesper. I don't know if you have anything to add to that, Jesper, 753 01:07:35,090 --> 01:07:39,135 if I forgot something. 754 01:07:39,135 --> 01:07:42,757 Lund: No, I think… So even monitoring the situation 755 01:07:42,757 --> 01:07:49,463 in 27 member states is a huge task, and we definitely need help with this. 756 01:07:49,463 --> 01:07:52,702 Denmark is well covered, but 757 01:07:52,702 --> 01:08:00,351 there is also Sweden… 758 01:08:00,351 --> 01:08:04,237 Many, many different member states. 759 01:08:04,237 --> 01:08:10,206 Mostly you have governments that like data retention and either 760 01:08:10,206 --> 01:08:14,217 try to just ignore the Court of Justice or, as Denmark is doing, make 761 01:08:14,217 --> 01:08:18,696 adjustments to the national law that are not real adjustments, but just try to 762 01:08:18,696 --> 01:08:24,400 maintain what is already in place under the guise of adjusting to the Court 763 01:08:24,400 --> 01:08:28,889 of Justice. Or, as we have talked about with France, Denmark and Belgium, there are cases 764 01:08:28,889 --> 01:08:33,330 that really try to circumvent the case law. So… 765 01:08:33,330 --> 01:08:36,850 One definite risk here is that data retention will be forgotten. 766 01:08:36,850 --> 01:08:39,760 That is what member states want, that nobody talks about it. 767 01:08:39,760 --> 01:08:45,499 So we need to, yeah, we need to keep the public debate going 768 01:08:45,499 --> 01:08:46,600 and make sure that… 769 01:08:46,600 --> 01:08:50,523 contact journalists, make sure that they write about data retention. 770 01:08:50,523 --> 01:08:57,015 And also, I think, focus on the rule of law problem that is associated with this area, 771 01:08:57,015 --> 01:09:00,637 because it's really not a sustainable situation that 772 01:09:00,637 --> 01:09:04,436 all member states are ignoring fundamental rights. 773 01:09:08,560 --> 01:09:16,800 Ebelt: In the end, it all also sounds a little bit promising. And at least, 774 01:09:16,800 --> 01:09:21,520 let's not forget what this is about: it's about the citizens, it's about the people, 775 01:09:21,520 --> 01:09:27,411 it's about their data, it's about their governments, it's about their freedoms. 776 01:09:27,411 --> 01:09:33,715 Would you like to add something?
777 01:09:33,715 --> 01:09:37,360 I mean, here, as the speakers? 778 01:09:37,360 --> 01:09:43,107 If not… You still have time to interrupt me. 779 01:09:43,107 --> 01:09:47,138 I would hand over to Walter, and 780 01:09:47,138 --> 01:09:52,288 we can just go into the Q&A part of the talk. 781 01:09:52,288 --> 01:09:57,803 And I would like to thank everybody, too, for joining this talk. 782 01:09:57,803 --> 01:10:03,027 Also, thank you to the speakers. It's been really, really interesting. 783 01:10:03,027 --> 01:10:07,786 And everybody who liked the talk can recommend it, and 784 01:10:07,786 --> 01:10:13,818 you can get the audio and video to download on media.ccc.de 785 01:10:13,818 --> 01:10:15,218 And of course, 786 01:10:15,218 --> 01:10:22,043 join the discussion on data retention by using the hashtag #dataretention 787 01:10:22,043 --> 01:10:25,371 or the hashtag that is used in your language. 788 01:10:25,371 --> 01:10:29,337 So, Walter… 789 01:10:29,337 --> 01:10:33,600 Herald: Before we go into the questions that we have already collected 790 01:10:33,600 --> 01:10:37,585 through Mastodon, IRC, Matrix, and Twitter, 791 01:10:37,585 --> 01:10:43,847 TJ wanted to add something earlier on while Chloé was still talking, I remember. 792 01:10:43,847 --> 01:10:46,133 McIntyre: Thanks Walter. Chloé's 793 01:10:46,133 --> 01:10:51,120 points reminded me of something that I was very impressed 794 01:10:51,120 --> 01:10:57,040 with from the German campaign against data retention, which was the great use of 795 01:10:57,040 --> 01:11:02,800 civil society groups – the groups representing journalists, lawyers, medical 796 01:11:02,800 --> 01:11:06,640 professionals, and so on – to make the point that communications confidentiality 797 01:11:06,640 --> 01:11:10,960 is important for them, too. And that's something I have to admit we didn't 798 01:11:10,960 --> 01:11:16,800 do to the same extent in Ireland, but it's certainly something we've tried to do. And 799 01:11:16,800 --> 01:11:21,920 to the extent that anybody listening to this now is from a group where they have a 800 01:11:21,920 --> 01:11:25,600 professional interest in communications confidentiality, I think it's a good thing 801 01:11:25,600 --> 01:11:30,320 if you can work through your group to try to develop that. It might be that you're 802 01:11:30,320 --> 01:11:35,600 in an area such as technical security. It might be that you're in an area such as 803 01:11:35,600 --> 01:11:39,920 the legal profession, or in the medical profession, or the like. But if you, as a 804 01:11:39,920 --> 01:11:46,400 professional, have a special interest in communications confidentiality, then it 805 01:11:46,400 --> 01:11:50,320 would be a good idea to not just go to your local digital rights group, but also 806 01:11:50,320 --> 01:11:53,300 see if you can take this up through your own professional body. 807 01:11:55,334 --> 01:11:59,187 Herald: OK, thank you. 808 01:11:59,187 --> 01:12:03,233 Since I am the moderator for the Q&A questions, I also sort of 809 01:12:03,233 --> 01:12:07,280 get to rephrase the questions as they are passed on to me from the internet. 810 01:12:07,280 --> 01:12:12,480 And the most recent question, but I think also the most fascinating question, is: 811 01:12:12,480 --> 01:12:17,200 someone is asking to what extent the Gaia-X program and the Palantir 812 01:12:17,200 --> 01:12:20,531 collaborations in the EU are tied to this, and what can be done to stop this?
813 01:12:20,531 --> 01:12:22,928 For those who are unfamiliar with Gaia-X, 814 01:12:22,928 --> 01:12:30,674 that's an initiative to create a Europe-based cloud service. 815 01:12:30,674 --> 01:12:34,654 But it should be mentioned that all sorts of American tech companies 816 01:12:34,654 --> 01:12:39,029 are also participants in that, so the European nature of it could be disputed. 817 01:12:39,029 --> 01:12:41,810 But to get back to the question… 818 01:12:41,810 --> 01:12:46,337 How may Gaia-X and Palantir tie into this, or not? 819 01:12:46,337 --> 01:12:49,679 I think that might be a question for Patrick. 820 01:12:55,556 --> 01:13:00,949 Breyer: I'm afraid I don't know enough to answer it, Walter. 821 01:13:00,949 --> 01:13:06,538 Palantir has their hands, of course, in managing databases. 822 01:13:06,538 --> 01:13:14,794 And they will also offer products to police that will integrate data retention. 823 01:13:14,794 --> 01:13:20,728 And of course, the impact of communications data potentially grows with 824 01:13:20,728 --> 01:13:28,161 the capacities to analyze this data. It has long been established that, you know, 825 01:13:28,161 --> 01:13:35,280 the idea that listening in to phone calls is more sensitive than 826 01:13:35,280 --> 01:13:42,172 only knowing the call details is wrong nowadays, because you can use 827 01:13:42,172 --> 01:13:47,731 the bulk of data that has been collected over weeks and months 828 01:13:47,731 --> 01:13:53,489 to paint a picture of persons, and their networks, and their movements, 829 01:13:53,489 --> 01:14:01,246 and their personalities. That is actually much, much more intrusive and 830 01:14:01,246 --> 01:14:07,359 much more sensitive than what you can tell from just listening to phone calls. 831 01:14:07,359 --> 01:14:15,090 And yes, I think companies such as Palantir are taking this to very great lengths 832 01:14:15,090 --> 01:14:22,027 with the products they are offering. And certainly there's a commercial incentive. 833 01:14:22,027 --> 01:14:28,565 Mass surveillance is big business, and we need to be very aware of this. 834 01:14:28,565 --> 01:14:30,705 Herald: OK, I will also ask Jesper this, 835 01:14:30,705 --> 01:14:33,985 because he may have some thoughts on this as well. Jesper? 836 01:14:33,985 --> 01:14:38,611 Lund: Yeah, I think another worrying development we are seeing 837 01:14:38,611 --> 01:14:42,960 is the amendment to the Europol regulation, 838 01:14:42,960 --> 01:14:51,760 which to a large extent is about allowing big data analysis and, in fact, legalizing 839 01:14:51,760 --> 01:15:00,160 practices that are currently illegal. I could easily imagine the following: police 840 01:15:00,160 --> 01:15:04,000 authorities will not have access to the sort of complete data sets that are 841 01:15:04,000 --> 01:15:07,280 retained by the telecommunications providers. But whenever they have a 842 01:15:07,280 --> 01:15:12,800 criminal investigation and get access to some data, there's a risk that it will be 843 01:15:12,800 --> 01:15:19,840 stored in the database and used for other purposes. I could easily see that systems 844 01:15:19,840 --> 01:15:26,880 from Palantir could be used for analyzing such data, and that it could be 845 01:15:26,880 --> 01:15:34,880 disclosed to Europol and possibly analyzed by Europol using perhaps Palantir's 846 01:15:34,880 --> 01:15:40,480 software as well.
Even though the connection to Palantir and Gaia-X is a bit 847 01:15:40,480 --> 01:15:47,440 speculative, it certainly fits the picture of more big data analysis for the police. 848 01:15:50,240 --> 01:15:53,190 Herald: Chloé, you want to add something to that? 849 01:15:53,190 --> 01:15:56,427 Berthélémy: Just a quick remark on the… 850 01:15:56,427 --> 01:16:05,071 It's not linked to Palantir or Gaia-X directly, but this is also part of the 851 01:16:05,071 --> 01:16:12,575 French law that was actually struck down by the Court of Justice. 852 01:16:12,575 --> 01:16:17,608 One part was also about black boxes used by intelligence services: 853 01:16:17,608 --> 01:16:20,552 not police authorities, not law enforcement authorities, 854 01:16:20,552 --> 01:16:25,125 but intelligence services. And the new "Loi Renseignement 2", 855 01:16:25,125 --> 01:16:33,040 the revival or the reform of the former law, adopted this year, which I've 856 01:16:33,040 --> 01:16:40,400 mentioned before, also contains this kind of algorithm-based big data analysis of 857 01:16:40,400 --> 01:16:49,200 metadata, of communications data. And this is even further expanded in the new law by 858 01:16:49,200 --> 01:16:58,560 including URLs. So also an analysis of internet traffic, and of how websites 859 01:16:58,560 --> 01:17:04,160 are being visited, and which ones, and by whom in general. And all of this will now be 860 01:17:04,160 --> 01:17:10,240 done from the physical premises of the intelligence 861 01:17:10,240 --> 01:17:17,040 services in France, and no longer at the premises of the service providers. 862 01:17:17,040 --> 01:17:22,000 So it's kind of a huge shift, where intelligence services are getting a copy 863 01:17:22,000 --> 01:17:28,646 of the metadata on the basis of a judge's decision. But basically everything is 864 01:17:28,646 --> 01:17:32,070 copied, and then they're applying an algorithmic analysis to it. 865 01:17:32,070 --> 01:17:37,740 Something that is obviously not known by the public. 866 01:17:37,740 --> 01:17:40,242 In that sense, it… 867 01:17:40,242 --> 01:17:42,689 It follows the same trend. 868 01:17:42,689 --> 01:17:47,365 Herald: OK, thank you. Another question from the audience is… 869 01:17:47,365 --> 01:17:51,760 I think I'm going to give that one to TJ, because this may 870 01:17:51,760 --> 01:17:55,600 also require an expansion into other fundamental rights, or a broader set of 871 01:17:55,600 --> 01:18:00,240 fundamental rights – someone is wondering what is actually so bad about general 872 01:18:00,240 --> 01:18:06,720 data retention for just IP addresses, for just severe crimes? TJ. 873 01:18:06,720 --> 01:18:10,560 McIntyre: Well, that's a very good question. So first of all, what do we mean 874 01:18:10,560 --> 01:18:15,440 by "just serious crimes"? In Ireland, a "serious crime" includes stealing a Mars 875 01:18:15,440 --> 01:18:19,920 bar from your local shop, or a sweet of your choice from your local shop, because 876 01:18:19,920 --> 01:18:24,960 that theoretically carries a possible seven-year prison sentence. And in fact, 877 01:18:24,960 --> 01:18:29,600 the Irish police have been using this so-called serious crime provision to 878 01:18:29,600 --> 01:18:34,960 investigate things like theft of a mobile phone from a locker and theft of €100 879 01:18:34,960 --> 01:18:42,480 from an ATM. So first of all, the problem here is that scope creep is a 880 01:18:42,480 --> 01:18:45,840 thing.
And even if you describe something as being limited to serious crime, there's no 881 01:18:45,840 --> 01:18:49,120 guarantee that what you think of as a serious crime and what it will be used for are in 882 01:18:49,120 --> 01:18:55,600 fact the same thing. The second is that registration of any sort is a gateway to 883 01:18:55,600 --> 01:19:01,600 registration of everything. The kinds of registration we see talked about, 884 01:19:01,600 --> 01:19:06,250 which Jesper mentioned already, and which Patrick has fought against in different contexts, 885 01:19:06,250 --> 01:19:12,640 such as registration of SIM cards, generally require identity verification of 886 01:19:12,640 --> 01:19:18,240 some sort. And that, in turn, is a real threat to the people who rely on 887 01:19:18,240 --> 01:19:21,840 confidentiality – the whistleblowers who want to get information about what 888 01:19:21,840 --> 01:19:26,080 government is doing out to you, the people who want to talk to their doctors or their 889 01:19:26,080 --> 01:19:31,680 support helplines in confidence. But I think Patrick 890 01:19:31,680 --> 01:19:34,043 is probably better placed to discuss those points than I am, 891 01:19:34,043 --> 01:19:37,100 so perhaps I'll just hand over to him. 892 01:19:38,440 --> 01:19:41,377 Breyer: Just to add to what TJ said 893 01:19:41,377 --> 01:19:48,040 about IP addresses specifically… On the internet, the major providers of services 894 01:19:48,040 --> 01:19:54,144 will log your every click and every search term that you enter, and keep that data for 895 01:19:54,144 --> 01:19:59,113 months. And so basically, if you know a person's IP addresses, it's easy to 896 01:19:59,113 --> 01:20:07,316 request from Google all the search terms that they entered. Or if somebody is 897 01:20:07,316 --> 01:20:12,583 publishing anonymously using a Twitter account, or they think it's anonymous, 898 01:20:12,583 --> 01:20:18,402 then you'll ask for the IP address. And you can lift the 899 01:20:18,402 --> 01:20:27,243 anonymity of that whole account. And that's why IP addresses, or being able to 900 01:20:27,243 --> 01:20:32,520 trace IP addresses, really means that you can follow whatever a person has done on 901 01:20:32,520 --> 01:20:40,017 the internet. And you can even determine their location, because an IP address tells 902 01:20:40,017 --> 01:20:46,515 about your movements, more or less roughly. Whether you're at home or at work 903 01:20:46,515 --> 01:20:53,313 can be determined, according to research. And it's very telling; it's not true that 904 01:20:53,313 --> 01:21:01,215 it's somehow less sensitive. You know, if you call somebody and 905 01:21:01,215 --> 01:21:08,182 suppress your phone number, you wouldn't be allowed to retain data on this. 906 01:21:08,182 --> 01:21:13,247 But if you use digital services to send an email, you'll have the 907 01:21:13,247 --> 01:21:17,877 IP address in the header. If you use messaging services, they will be logging 908 01:21:17,877 --> 01:21:27,570 your IP address. So things very similar to making phone calls will be able to be 909 01:21:27,570 --> 01:21:31,611 retained indiscriminately by tracing the IP address.
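As a rough illustration of the correlation Patrick describes, here is a minimal sketch in Python in which the account, IP address, subscriber, and all records are entirely invented: a single retained IP-assignment record from an ISP, matched against one log entry from a platform, is enough to resolve a pseudonymous account to a named subscriber.

```python
# Sketch with invented data: how retained IP-assignment records let
# whoever can access them lift the anonymity of a pseudonymous account.
# Every name, address, and timestamp below is hypothetical.

from datetime import datetime

# What a platform could hand over for a pseudonymous account:
platform_log = {"account": "@whistleblower",
                "ip": "203.0.113.7",
                "seen_at": datetime(2021, 12, 1, 21, 15)}

# What an ISP would retain under IP data retention:
# (ip, assigned_from, assigned_until, subscriber)
isp_retention = [
    ("203.0.113.7", datetime(2021, 12, 1, 18, 0),
                    datetime(2021, 12, 2, 6, 0), "J. Doe, Example Street 1"),
    ("203.0.113.8", datetime(2021, 12, 1, 12, 0),
                    datetime(2021, 12, 1, 23, 0), "A. N. Other"),
]

# Correlate the two: match the IP and check the assignment window.
for ip, start, end, subscriber in isp_retention:
    if ip == platform_log["ip"] and start <= platform_log["seen_at"] <= end:
        print(f"{platform_log['account']} resolves to: {subscriber}")
```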
910 01:21:34,651 --> 01:21:39,864 Herald: OK, I have a very specific question about Chloé's bit in the presentation 911 01:21:39,864 --> 01:21:42,612 *unintelligible* 912 01:21:42,612 --> 01:21:45,944 I didn't quite get what the Socialist Party did block, 913 01:21:45,944 --> 01:21:48,353 and how, and what happened there. 914 01:21:49,455 --> 01:21:52,027 Berthélémy: Sure. That went by very fast. 915 01:21:52,027 --> 01:21:55,989 It's not just the socialists, obviously; I forgot to mention that the right-wing parties 916 01:21:55,989 --> 01:22:00,937 had a big role to play there. Basically, there is a procedure in 917 01:22:00,937 --> 01:22:06,567 France every time a piece of legislation is adopted by the parliament, so by both 918 01:22:06,567 --> 01:22:12,810 the French National Assembly and the Senate. There is 919 01:22:12,810 --> 01:22:20,030 this rule where either 60 senators or 60 MPs, members of the National Assembly, can 920 01:22:20,030 --> 01:22:25,815 vote in favor of submitting the bill, after it is adopted and before it is officially 921 01:22:25,815 --> 01:22:31,745 published in the official journal and can come into force, 922 01:22:31,745 --> 01:22:36,853 to the Constitutional Council. So the Constitutional Council in 923 01:22:36,853 --> 01:22:41,490 France is composed of nine members. They're not elected; they are appointed. 924 01:22:41,490 --> 01:22:47,880 So it's not the best kind of democratic counter-power you can imagine. But it is 925 01:22:47,880 --> 01:22:52,510 still there, and it has proven effective in the past years, especially since the 926 01:22:52,510 --> 01:22:59,213 presidency of Macron, at holding the government somewhat accountable for what it is 927 01:22:59,213 --> 01:23:06,373 proposing; it has annulled or prevented some provisions in several security laws from 928 01:23:06,373 --> 01:23:13,131 passing and coming into effect. And so for the second one, the "Loi Renseignement 2", 929 01:23:13,131 --> 01:23:17,828 the reform of the intelligence service law that contains the provision 930 01:23:17,828 --> 01:23:22,298 about data retention: when this reform was finally adopted by both 931 01:23:22,298 --> 01:23:28,553 the Senate and the National Assembly, there wasn't 932 01:23:28,553 --> 01:23:33,787 enough of a majority to send the text in front of the Constitutional Council, or at 933 01:23:33,787 --> 01:23:39,256 least not on the provisions related to data retention. Some other provisions 934 01:23:39,256 --> 01:23:45,009 contained in the bill were sent to the Constitutional Council. And the reason why 935 01:23:45,009 --> 01:23:49,357 data retention wasn't included is because the socialists didn't support this 936 01:23:49,357 --> 01:23:53,017 submission, this referral to the Constitutional Council. So it's not just the 937 01:23:53,017 --> 01:23:57,371 socialists' fault; as I just said, it's also the right-wing parties and the party in power, 938 01:23:57,371 --> 01:24:04,980 La République en marche.
But I think one assumption that we can make 939 01:24:04,980 --> 01:24:09,588 (and this is not verified information, but it is one assumption we can make) 940 01:24:09,588 --> 01:24:16,320 is that the reason why the socialists didn't support the submission to 941 01:24:16,320 --> 01:24:20,800 another scrutiny, or constitutional scrutiny, is because they were the ones 942 01:24:20,800 --> 01:24:26,520 introducing those measures in 2015, when they were in power under François Hollande. 943 01:24:31,630 --> 01:24:35,518 Herald: I've been told by the translators that we really have to wrap it up. 944 01:24:35,518 --> 01:24:39,467 So I will probably be asking the last question; depending on the length of 945 01:24:39,467 --> 01:24:43,303 the answer, maybe another question. And I'm going to ask them to Friedemann, 946 01:24:43,303 --> 01:24:46,592 because he hasn't been talking much during the Q&A. 947 01:24:46,852 --> 01:24:53,049 One of the questions is, and that was by a viewer who understood the 948 01:24:53,049 --> 01:24:58,508 explanations about the Court of Justice as it potentially being willing to revise its 949 01:24:58,508 --> 01:25:04,915 earlier jurisprudence on data retention, which I understood quite differently, 950 01:25:04,915 --> 01:25:07,579 so maybe you could explain that as well. But the question is… 951 01:25:07,579 --> 01:25:11,128 In case the Court of Justice of the European Union were to revisit its 952 01:25:11,128 --> 01:25:17,080 earlier opinions and weaken its stance, what would be possible to do against that? 953 01:25:17,080 --> 01:25:22,324 Ebelt: Well, to be honest, I would love to pass this question on to the speakers, 954 01:25:22,324 --> 01:25:27,805 since I'm not actually a speaker of this talk, just the host and moderator. 955 01:25:27,805 --> 01:25:33,371 So would one of the speakers take the question? 956 01:25:33,371 --> 01:25:37,210 Herald: Then I suggest Chloé does it. 957 01:25:37,210 --> 01:25:42,808 Berthélémy: Oh, wow, what a nice gift. Can I pass the ball to Jesper? *laughs* 958 01:25:42,808 --> 01:25:49,554 I don't know what can be done as a citizen to counter the Court of Justice's future rulings. 959 01:25:49,554 --> 01:25:55,698 One thing that Patrick mentioned, for example, is the influence of certain 960 01:25:55,698 --> 01:26:00,808 interventions during hearings in front of the courts. And notably, when the 961 01:26:00,808 --> 01:26:06,385 European Data Protection Supervisor is kind of weakening what the court has said 962 01:26:06,385 --> 01:26:12,847 previously, and says that access is more important than the mere retention of data, 963 01:26:12,847 --> 01:26:18,887 it's problematic. Even though it didn't have too much impact this time, 964 01:26:18,887 --> 01:26:22,663 there is a risk that this kind of narrative builds up 965 01:26:22,663 --> 01:26:28,322 and becomes very influential in the judges' ears. 966 01:26:28,322 --> 01:26:33,947 So this is something to be avoided. But Jesper, please complete this, because 967 01:26:33,947 --> 01:26:41,000 I'm probably far from a comprehensive answer to this question. 968 01:26:42,870 --> 01:26:50,080 Lund: In the hypothetical situation that the court would revise its position and, 969 01:26:50,080 --> 01:26:55,120 say, allow general and indiscriminate data retention, not just for national security 970 01:26:55,120 --> 01:27:01,971 in extraordinary circumstances, but for combating serious crime in general…
971 01:27:01,971 --> 01:27:04,571 That would put us back to sort of square one, 972 01:27:04,571 --> 01:27:07,945 back to before… 973 01:27:07,945 --> 01:27:11,320 to when the campaign against data retention started. And 974 01:27:11,320 --> 01:27:15,420 then we could no longer say that it's against fundamental rights. 975 01:27:15,420 --> 01:27:17,470 But it's still a bad idea. 976 01:27:17,470 --> 01:27:19,653 It still creates problems for vulnerable groups, 977 01:27:19,653 --> 01:27:26,284 persons with professional privileges, and so forth. 978 01:27:26,284 --> 01:27:30,826 So in that situation, ideally, we want to convince our parliamentarians 979 01:27:30,826 --> 01:27:36,307 at a national level that it's a bad idea. And if that is not possible, at least make 980 01:27:36,307 --> 01:27:42,296 exceptions so that these groups with professional privileges are not included 981 01:27:42,296 --> 01:27:45,337 in data retention. It may be possible: 982 01:27:45,337 --> 01:27:50,321 Germany, in its current data retention law, has some provisions where 983 01:27:50,321 --> 01:27:56,894 certain phone numbers of help groups, of help phone lines, are excluded. 984 01:27:56,894 --> 01:28:03,071 And also make sure that access to this data is subject to prior court review. 985 01:28:03,071 --> 01:28:06,027 That is not the case in all member states. 986 01:28:06,027 --> 01:28:10,303 And that access is exceptional, also. 987 01:28:10,303 --> 01:28:14,811 So these are the options remaining 988 01:28:14,811 --> 01:28:18,260 should the Court of Justice revise its position, which 989 01:28:18,260 --> 01:28:22,852 I would consider unlikely at this stage, and certainly the advocate general in 990 01:28:22,852 --> 01:28:30,224 *audio breaking up* 991 01:28:30,224 --> 01:28:36,861 …the three pending cases is not suggesting this at all. Rather the contrary. 992 01:28:36,861 --> 01:28:41,034 Herald: OK, thank you. And we have run out… 993 01:28:41,034 --> 01:28:45,998 Oh, and also, for those who want to look up the website, it's edri.org 994 01:28:45,998 --> 01:28:48,766 And that contains the map Chloé mentioned. 995 01:28:48,766 --> 01:28:52,289 But again, thank you all. And this is it for the session. 996 01:28:52,289 --> 01:28:55,128 Take care. Bye bye. 997 01:28:55,128 --> 01:28:58,543 * rC3 2021 Chaos-West TV postroll music * 998 01:28:58,543 --> 01:29:18,141 Subtitles created by c3subtitles.de in the year 2021. Join, and help us!