Europe’s CSAM-scanning plan is a tipping point for democratic rights, experts warn
October 25, 2023


A controversial child sexual abuse material (CSAM)-scanning proposal that’s under discussion by lawmakers in Europe is both the wrong response to tackling a sensitive and multifaceted societal problem and a direct threat to democratic values in a free and open society, a seminar organized by the European Data Protection Supervisor heard yesterday.

More than 20 speakers at the three hour event voiced opposition to a European Union legislative proposal that would require messaging services to scan the contents of users’ communications for known and unknown CSAM, and to try to detect grooming taking place in real-time — putting the comms of all users of apps subject to detection orders under an automatic and non-targeted surveillance dragnet.

Critics argue the approach runs counter to the fundamental freedoms that are pivotal to democratic societies. The European Commission has pushed back aggressively against this sort of criticism to date — arguing the proposal is a proportionate and targeted response to a growing problem. It was even spotted recently using microtargeted ads to promote the plan, apparently turning to covert targeting to attack critics by suggesting they do not support child protection (despite the existence of another live EU legislative proposal that seeks to restrict the use of political microtargeting… so, er, oops!).

The contentious debate is still live as it’s now up to EU co-legislators, in the European Parliament and Member States, via the Council, to hash out (no pun intended!) a way forward — which means there’s still time for regional lawmakers to pull back.

And the need for the bloc to pull back from this brink was absolutely the message from yesterday’s event.

The European Data Protection Supervisor (EDPS) himself, Wojciech Wiewiórowski, suggested the EU could be at a point of no return if lawmakers go ahead and pass a law that mandates the systemic, mass surveillance of private messaging. In his opening remarks, he suggested the Commission’s proposal could bring consequences that go ‘well beyond what concerns with the protection of children’.

‘It is often being used in the debate that this proposal is only about protecting children. I would like this to be the case — but it’s not,’ he went on, arguing that the Commission’s proposal questions the ‘foundations’ of what privacy means in a democratic society; and pointing out that privacy, once undermined, leads to ‘the radical shift from which there might be no return’, as he put it.

Without amendments, the proposal would ‘fundamentally change the Internet and the digital communication as we know it’, Wiewiórowski also warned at the event’s close — invoking his personal childhood experience of living under surveillance and restrictions on freedom of expression imposed by the Communist regime in Poland. And, most certainly, it’s an awkward comparison for the EU’s executive to be asked to contemplate coming from the mouth of one of its own expert advisors.

The EDPS, an EU institution which advises the Commission on data protection and privacy, is not a newly converted critic of the Commission proposal either. Indeed, the Supervisor and the European Data Protection Board put out a joint opinion a full year ago that warned the legislative plan raises ‘serious data protection and privacy concerns’ — including for encryption. But that joint expression of concern from inside the EU has — so far — failed to persuade home affairs commissioner Ylva Johansson or the Commission to rethink their full-throated backing for mass surveillance of citizens’ private communications.

Seminar attendees heard that Johansson had been invited to attend the event but had declined to do so. Nor did anyone else from the Commission agree to attend. (We’ve reached out to the Commission and Johansson’s office about her decision not to participate and will update our report if we get a response.)

Mounting concerns

The Commission presented its draft CSAM legislation back in May 2022. Since then, opposition has been building over human rights impacts as the implications of the proposal have become clearer, while concerns — and even suspicions — about the driving forces behind the proposal have mounted, not helped by a perceived lack of engagement from the Commission with civil society organizations and others expressing genuinely held misgivings. The emotive debate has also, at times, lent itself to unhelpful polarization.

Even from the start there were clear questions about the legality of the proposal. EU law requires any interference with fundamental rights like privacy and freedom of expression to be necessary and proportionate, and the imposition of a general content monitoring obligation on online platforms is prohibited — so how does that square with a law that could put the messages of hundreds of millions of Europeans under watch by design?

Giving a perspective on the legality at yesterday’s seminar, Frederik Borgesius, professor at iHub, Radboud University, in the Netherlands, said in his view the Commission’s proposal is not a proportionate way of interfering with fundamental rights. He referred back to case law on data retention for terrorism — as the most relevant comparison — which has seen the bloc’s top court repeatedly strike down general and indiscriminate storage of Europeans’ metadata. (Not that that’s stopped Member States from continuing to break the law, though…)

‘Actually, the court might not even get to a proportionality test because the data retention cases were about metadata. And this is about analysing the content of communications,’ he went on. ‘The EU Charter of Fundamental Rights has an element that says if the essence of a fundamental right is violated then the measure is illegal by definition — there’s not even a need for a proportionality test.’

Borgesius also pointed out there’s case law on this essence point too. ‘When is the essence violated? Well, the court has said — in a different case — if authorities can access the contents of communications on such a large scale then the discussion is over,’ he explained. ‘No room for proportionality test — the essence of the right to privacy would be violated, and therefore such a measure would be illegal.’

Legality is just one element the seminar considered. Multiple critics of the Commission’s proposal speaking at the event also argued it’s even ill-fitted in its main claimed purpose — of addressing the complex societal problem of child sexual abuse — and actually risks causing unintended consequences, including for children.

The seminar heard repeated concerns from panellists that minors could end up being harmed because of the regulation’s single-minded focus on scanning private comms, with speakers emphasizing the importance of evidence-based policymaking in such a sensitive area, rather than blindly rushing down the road of technosolutionism.

One issue several speakers raised is that a large proportion of the sexualized content involving minors that’s being shared online is actually being shared by (and between) consenting minors (i.e. sexting). The Commission proposal could therefore lead to children being investigated and even criminalized for exploring their own sexual identities, they suggested — since understanding what is and isn’t CSAM is not something that can necessarily be done simply by reviewing the imagery itself.

The Commission’s proposal hinges on the idea of forcing messaging platforms which are suspected of hosting CSAM to scan for illegal content (and grooming activity) and pass flagged content to a new EU Centre, which would carry out initial checks and potentially forward reported content on to law enforcement agencies.

Certainly in the case of new CSAM (i.e. suspected CSAM content that has not been previously seen, investigated and confirmed as illegal child sexual abuse material), context is essential to any assessment of what is being depicted. This is not something an AI scanning tool, or even a trained human looped in to review flagged imagery, can inherently know just by looking at a piece of content, the seminar heard.

So a law that automates CSAM scanning and reporting — without a foolproof way to distinguish between actual CSAM, innocent sexting between kids, or even just a parent sending a family holiday snap to a relative via a private messaging channel they believe to be a safe way to share personal stories — looks like the opposite of an intelligent response to child sexual abuse.

‘A big part of the material that we see is not a result of sexual abuse,’ Arda Gerkens, chair of the board of The Netherlands’ Authority for the Prevention of Online Terrorist Content and Child Sexual Abuse Material, told the seminar. ‘The material’s indeed being spread by the Internet — but it’s a growing number which is a result of sexual activity of young people themselves.’

The risk of ‘leaked images and sextortion’ are ‘even more reason why we should keep the Internet safe and secure’, she also suggested — pointing out children can be put in an exceptionally vulnerable position if their accounts are hacked and their private comms fall into the hands of someone who wants to manipulate and abuse them.

‘The scanning of private communication will certainly flag problematic situations — I definitely know that that’s already happening — but it’s not the solution to combat this sexual child abuse,’ she went on, speaking up in favor of a narrower practice of scanning for known CSAM on image hosting websites to stop the further spread of material — ‘but not to prosecute’.

The legislation proposed by the Commission doesn’t properly address image hosting websites as potential repositories of CSAM, she suggested, because it’s too focused on ‘private communication and subsequently prosecution’. She therefore predicted the EU’s approach will be ‘counterproductive’ when it comes to detecting perpetrators and ending domestic sexual violence.

‘The sheer amount of images will overflow the systems we have and put even more pressure on the fragile law enforcement systems,’ she suggested, adding: ‘It would be much better to invest in those systems and strengthen collaborations between the EU countries.’

Another speaker, Susan Landau, Bridge Professor in Cyber Security and Policy at Tufts University, also argued the Commission’s proposal misunderstands a multifaceted and highly sensitive issue — failing to respond to the different (and distinct) types of child sexual abuse and exploitation that can occur over the Internet.

An approach that’s centered on investigation and prosecution, as the Commission’s is, would hit a wall in many cases, she also predicted — pointing out, for example, that an overwhelming majority of Internet-enabled sex trafficking cases involve victims being abused by people close to them, who they do not want to report.

‘What you need there, as well as with real time abuse, is spaces that make children safe. Community spaces. Online safety education. Education about online security and safety by design,’ she suggested.

‘The point is that the law [that] requires scanning to handle the child sexual abuse and exploitation issue misunderstands the issue,’ Landau added. ‘There are multiple ways… to prevent and investigate the crime [of child sexual abuse] that are not affected by end-to-end encryption (E2EE). Meanwhile, end-to-end encryption is a technology that secures both children and adults.’

Also speaking during the event was Alexander Hanff, a privacy expert and advocate — who is himself a survivor of child sexual abuse. He too asserted that a lot of sexualized imagery of children that’s shared online is being done privately between consenting minors. But the impact the Commission’s proposal would have on minors who sext is not something the EU’s executive appears to have considered.

‘If we now introduce a law which requires the scanning of all digital communications, and by that we’re talking billions, tens of billions of communications every single day across the EU, those images would then be sent to multiple individuals along the chain of investigation —  including Europol and various law enforcement bodies, etc — creating victims,’ he warned. ‘Because one of the things that we see in relation to CSAM, and speaking as a survivor myself, is the impact on the dignity of the individuals for whom it relates.

‘The very fact that people are viewing these images — which is exactly the intent of the Commission to try and overcome — is a form of abuse itself. So if we now take innocent pictures, which have been shared among consenting individuals, and expose them to potentially hundreds of other individuals down the investigation chain then we are indeed actually creating more victims as a result of this.’

Another attendee — WhatsApp’s public policy director, Helen Charles — chipped into the discussion to offer an industry view, saying that while the Meta-owned messaging platform supports EU lawmakers in their aim of tackling child sexual abuse, it shares concerns that the Commission’s approach is not well targeted at this multifaceted problem; and that it risks major unintended consequences for web users of all ages.

‘We think that any outcome that requires scanning of content in end-to-end encrypted messaging would undermine fundamental rights, as several colleagues have set out,’ she told the seminar. ‘Instead, the draft regulation should set the right conditions for services like WhatsApp and other end-to-end encrypted services to mitigate the misuse of our services in a way that’s reasonable and proportionate but that also considers both the different nature of harm… but also includes things like prevention and other upstream measures that could help tackle these kinds of harms.’

Charles went on to advocate for EU lawmakers to give platforms more leeway to use ‘traffic data’ (i.e. metadata; not comms content) for the prevention of child sexual abuse under the EU’s existing ePrivacy Directive — noting that the current (temporary) ePrivacy derogation for platforms, which lets them scan non-E2EE messages for CSAM, only covers detection, reporting and takedown, not prevention.

‘Having access to some data can be helpful in a targeted proportionate way to address those risks,’ she argued. ‘Where it is legal Meta does deploy techniques, including the use of traffic data, to identify potentially problematic behaviour. This is not just about detection, though. This is also about prevention… [T]raffic data can be an important signal when used with other signals to help services proactively disrupt violating message groups and accounts who may be seeking to abuse our services.

‘So we would encourage institutions, when they’re thinking about the way forward here, to both ensure end-to-end encryption is protected and that services can tackle CSAM without accessing message content but also look at ways that EPG traffic data can be processed in a proportionate and targeted manner, including for prevention. We think, in this way, the regulation will move closer to achieving its objectives.’

The seminar also heard concerns about the limitations of the current state of the art in AI-based CSAM detection. A big issue here is that AI tools used to detect known CSAM are proprietary — with no independent verification of claims made for their accuracy, said Jaap-Henk Hoepman, visiting professor in computer science at Karlstad University. ‘The problem is that the [CSAM detection] techniques being discussed — either PhotoDNA [developed by Microsoft] or NeuralHash [made by Apple] — are proprietary and therefore not publicly available and known and study-able algorithms — which means that we simply have to rely on the figures provided by the companies on how effective these technologies are.’

He also pointed to work by academics and other researchers who have reverse engineered PhotoDNA that he said revealed some elementary flaws — such as evidence it’s particularly easy to evade detection against a known CSAM fingerprint by simply rotating or mirroring the image.
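To illustrate the class of technique at issue, the sketch below implements a very simple perceptual ‘average hash’ plus a Hamming-distance match of the kind such fingerprinting systems rely on. It is emphatically not PhotoDNA or NeuralHash (both proprietary); the library choices (Pillow, NumPy), the 8x8 hash size and the distance threshold are all illustrative assumptions. But even this toy matcher shows why a trivial mirror flip can push an image far outside a naive fingerprint’s matching distance.

```python
# Minimal illustrative sketch of perceptual-hash matching (NOT PhotoDNA,
# which is proprietary): an 8x8 "average hash" plus a Hamming-distance
# threshold. Library choice and threshold are assumptions for illustration.
import numpy as np
from PIL import Image, ImageOps

def average_hash(img, hash_size=8):
    """Downscale to hash_size x hash_size greyscale, threshold on the mean."""
    small = img.convert("L").resize((hash_size, hash_size), Image.LANCZOS)
    pixels = np.asarray(small, dtype=np.float64)
    return (pixels > pixels.mean()).flatten()  # 64 boolean "bits"

def hamming_distance(a, b):
    return int(np.count_nonzero(a != b))

def matches_fingerprint(img, fingerprint, max_distance=10):
    """Naive matcher: flag the image if its hash is 'close enough'."""
    return hamming_distance(average_hash(img), fingerprint) <= max_distance

if __name__ == "__main__":
    original = Image.open("example.jpg")      # stand-in for a known image
    fingerprint = average_hash(original)      # the stored "fingerprint"
    mirrored = ImageOps.mirror(original)      # visually identical, flipped copy

    print(matches_fingerprint(original, fingerprint))  # True: distance is 0
    # Mirroring scrambles the bit positions, so the distance is typically
    # large and the naive matcher misses the (visually identical) image.
    print(matches_fingerprint(mirrored, fingerprint))
```

Production systems use more robust perceptual hashes than this toy example, but the reverse-engineering work Hoepman cited suggests even those can be sidestepped with similarly simple transformations.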

‘Clearly this has implications for the proportionality of the [Commission] proposal, because a serious breach of privacy is being proposed for a not-so-effective measure,’ he added, going on to warn about risks from ‘targeted false positives’ — where attackers seek to manipulate an image so that an algorithm detects it as CSAM when, to the human eye, it looks innocuous — either to frame an innocent person or trick app users into forwarding a doctored image (and, if enough people share it, he warned it could flood detection systems and even cause a DDoS-like event).

‘This technology is not free from error — and we’re talking many billions of communications [being scanned] every day. So even if we have a 0.1% error rate that accounts for many millions of false positives or false negatives on a daily basis. Which is not something that we can subscribe to in a democracy,’ Hanff also warned, chiming in with a technosocial perspective on flawed AI tools.
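As a rough sense-check of the scale Hanff invokes, the back-of-the-envelope calculation below takes an assumed volume of 10 billion scanned messages a day and his 0.1% error figure; both inputs are illustrative assumptions, not official statistics.

```python
# Back-of-the-envelope scale check; both figures are illustrative assumptions.
daily_messages = 10_000_000_000   # "many billions" of messages per day, taken at the low end
error_rate = 0.001                # the 0.1% error rate cited in the quote

wrong_calls_per_day = daily_messages * error_rate
print(f"{wrong_calls_per_day:,.0f} erroneous flags or misses per day")  # -> 10,000,000
# i.e. millions of wrong decisions daily, each either a false report forwarded
# for human review or a missed piece of abuse material.
```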

Claudia Peersman, a senior research associate in the Cyber Security Research Group of the University of Bristol, had a pertinent assessment to offer related to work she’d been involved with at the Rephrain Centre. The expert academic group recently independently assessed five proof-of-concept projects, developed in the U.K. with government backing, to scan E2EE content for CSAM without — per the Home Office’s headline claim — compromising people’s privacy.

The problem is, none of the projects lived up to that billing. ‘None of these tools were able to meet [our assessment] criteria. I think this is the most important part of our conclusion. Which does not mean that we do not support the development of AI supported tools for online child protection in general. We just believe that the tools are not ready to be deployed on such a large scale on private messages within end-to-end encrypted environments,’ she told the seminar.

Delegates also heard a warning that client-side scanning — the technology experts suggest the EU law will force onto E2EE platforms such as WhatsApp if/when they’re served with a CSAM detection order — is far too new and immature to be rushed into mainstream application.

‘As computer science researchers we’ve just begun to look at this technology,’ said Matthew Green, a cryptographer and professor at The Johns Hopkins University, Baltimore. ‘I want to stress how completely new the idea of client side scanning is — the very first computer science research papers on the topic appeared in 2021 and we’re not even two years later, and we’re already discussing laws that mandate it.’

‘The problem of building systems where [content scanning] algorithms are confidential and can’t be exploited is something we’re just beginning to research. And so far, many of our technical results are negative — negative in the sense that we keep finding ways to break these systems,’ he also told the seminar. ‘Break them means that we will ultimately violate the confidentiality of many users. We will cause false positives. We will cause bad data to be injected into the system.

‘And, in some cases, there’s this possibility that abuse victims may be re traumatised if the systems are built poorly… I’m just a computer scientist. I’m not a legislator or a lawyer. My request to this community is please, please give us time. To actually do the research, to figure out whether and how to do this safely before we start to deploy these systems and mandate them by law.’
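For readers unfamiliar with the term, the sketch below shows, in purely conceptual form, where client-side scanning would sit in an end-to-end encrypted message flow: content is checked against a fingerprint list on the user’s device, in plaintext, before encryption is applied. Every name and structure here is hypothetical (a SHA-256 stand-in rather than a real perceptual hash, toy encryption and reporting callbacks); it is not any vendor’s design, only an illustration of why researchers argue the approach moves the trust boundary onto the device itself.

```python
# Conceptual sketch of client-side scanning in an E2EE flow (hypothetical,
# not any vendor's actual design). The key point: matching happens on the
# device, in plaintext, before end-to-end encryption is applied.
import hashlib

def fingerprint(payload: bytes) -> bytes:
    # Stand-in for a perceptual hash or classifier; real proposals are far
    # more complex (and, per the seminar, far less well understood).
    return hashlib.sha256(payload).digest()

def send_message(payload: bytes, known_fingerprints: set, encrypt, report) -> bytes:
    """Scan in plaintext on the client, then encrypt; the server sees only ciphertext."""
    if fingerprint(payload) in known_fingerprints:
        report(payload)            # flagged content is routed for review
    return encrypt(payload)

if __name__ == "__main__":
    blocklist = {fingerprint(b"known-flagged-bytes")}
    toy_encrypt = lambda p: bytes(b ^ 0x42 for b in p)   # placeholder cipher, not real crypto
    toy_report = lambda p: print("client-side scanner flagged a message")

    send_message(b"hello", blocklist, toy_encrypt, toy_report)                # not flagged
    send_message(b"known-flagged-bytes", blocklist, toy_encrypt, toy_report)  # flagged
```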

Children’s rights being ignored?

A number of speakers also had passionate critiques that the views (and rights) of children themselves are being ignored by lawmakers — with several accusing the Commission of failing to consult kids about a proposal with severe implications for children’s rights, as well as for the privacy and fundamental rights of everyone who uses digital comms tools.

‘We should involve the children,’ said Gerkens. ‘We are speaking here about them. We are judging about what they do online. We have a moral opinion about it. But we’re not talking to them. They are the ones we are speaking about. We haven’t involved them in this legislation. I think we should.’

Sabine Witting, assistant professor at Leiden University, also warned over a raft of ‘negative’ impacts on kids’ rights — saying the EU proposal will affect children’s right to privacy, personal data protection, freedom of expression and access to information.

‘In this context, I really would like to highlight that — from a children’s rights perspective — privacy and protection are not contradicting each other. Actually, the UN Committee on the Rights of the Child, and its General Comment number 25, made it very clear that privacy is actually vital for children’s safety. So privacy is not a hindrance of children’s safety as it’s often projected. It’s actually an important precondition to safety,’ she said.

Witting also had a strong message about the harms that could accrue for adolescents whose private texts to each other get sucked up and caught in a CSAM-scanning dragnet. ‘The investigation alone can already be very, very harmful for affected adolescents, especially in cases where we have adolescents from marginalised communities. For example, LGBTQ+ children,’ she warned. ‘Because this kind of investigation might lead to false disclosure, further marginalisation and in a worst case scenario, also political persecution or the like, so children and adolescents being targets of a criminal investigation is already harmful in and of themselves.

‘Unfortunately the proposal will not be able to prevent that from happening. So this whole process of scanning private communications among adolescents of the most intimate nature, the further review by private sector, by government, the potential involvement of law enforcement, all of that is really a significant violation of children’s right to privacy.’

Witting said she’d raised this issue with the Commission — but had no response. ‘I think because there is just no simple answer,’ she added. ‘Because it lies in the nature of the subject matter that these kinds of cases will not be able to be filtered out along the process.’

The idea of the EU passing a law that sanctions warrantless searches of everyone’s ‘digital worlds’ was skewered more generally by Iverna McGowan, director of the European office of the Center for Democracy and Technology.

‘What essentially the detection orders amount to are warrantless searches of our digital worlds,’ she argued. ‘We would of course never expect or accept that law enforcement would enter our homes, or private setting, with no warrants and no reasonable suspicion to search everything belonging to us. And so we cannot of course, allow that to happen on the online space either because it would be a death knell to the rule of law and criminal law as we know it in the digital context.’

Going on to offer some thoughts on how to salvage something from the Commission proposal — i.e. to undo the existential threat it poses to European values and democracy — she suggested this would require a fundamental rewriting of the detection order provisions, including ensuring they are formulated with ‘sufficient precision’ so as not to apply to people whose conduct is not suspected of being criminal.

She also argued for an independent judicial authority to sign off on searches and verify the purposes and basis upon which individuals or groups of people are suspected, plus built-in proportionality tests to determine whether the law allows for a truly independent assessment.

‘If you consider that all of these different elements would have to be in line in order for detection order to be lawful then I think we’re in a very challenging situation at the moment with the text that’s on the table before us,’ she cautioned.

Mission creep was another concern raised by several speakers — with panellists pointing to documents obtained by journalists that suggest Europol wants unfiltered access to data obtained under the CSAM-scanning proposal.

‘[We] know that the objective of Europol is to become a data hub and to also enlarge and strengthen its missions,’ said MEP Saskia Bricmont. ‘Is there a risk of function creep? I think there is obviously, not only because of what has happened before in the evolution of the legislation around Europol but also because it seems that Europol has been pushing to obtain, in this legislation, what it aims for — namely, collect data, extend access to data and filter data — and which is also reflected in the Commission’s proposal.’

She noted the Commission’s proposal requires all reports that are ‘not manifestly unfounded’ to be sent simultaneously to Europol and national law enforcement agencies. ‘So far the European Parliament’s position is not going in that direction… But, nevertheless, the initial proposal of the Commission is going in that direction and… Europol has been also pushing for extended detection to other crime areas beyond CSAM.’

No dialogue

The event took place just two days before Ylva Johansson, the bloc’s commissioner for Home Affairs — and the driving force behind the CSAM-scanning proposal — is due to attend a hearing with the European Parliament’s civil rights committee.

Critics have accused Johansson personally, and the Commission generally, of a lack of transparency and accountability around the controversial proposal. So the meeting will be closely watched and, most likely, rather a tense affair (to put it mildly).

One key criticism — also aired during the seminar — is that EU lawmakers are using the highly sensitive and emotive issue of child abuse to push a blanket surveillance plan on the region that would drastically impact the fundamental rights of hundreds of millions of Europeans.

‘What I find really problematic is the instrumentalization of such a sensitive topic, emotional question also related to the fight — and legitimate fight and prior fights — against child sexual abuse,’ said Bricmont. ‘And this is also what we as MEPs as European parliament have to dig into, and we will have the exchanges with the commissioner in our LIBE Committee. Because we want to know more about this and other revelations of BalkanInsight’s [investigative reporting] when it comes to potential conflicts of interests.’

‘Let us regret the absence of the Commission because I think we need dialogue,’ she added. ‘I think we need to share knowledge and information in this file and that the fact that there’s a closed door [increases] suspicion also. If we want to believe that the intentions are positive and to fight against child sexual abuse then it means that every party, every person needs to be also open to the counter arguments and bring founded arguments to explain why they want to go into that direction.

‘So the question is, is the Commission misled? Misinformed? What is it? And this is also true for the Council side — because we know [there is] probably also a lack of understanding of what is discussed today, and the risks related to the end of end-to-end encryption.’

Earlier this month Johansson responded to critics by penning a blog post that denounced what she characterized as personal attacks on her, and hit back at those raising doubts — including over opaque lobbying around the file, after journalists questioned the level of access given to lobbyists for commercial entities selling so-called safety tech, who stand to benefit from a law that makes use of such tools mandatory.

She went so far as to suggest that civil society opposition to the CSAM-scanning proposal may be acting as a sock puppet for Big Tech interests.

‘The biggest digital rights NGO in Europe gets funding from the biggest tech company in the world. EDRI, the European Digital Rights NGO, publishes on its website that it receives funding from Apple,’ she wrote. ‘Apple was accused of moving encryption keys to China, which critics say could endanger customer data. Yet no-one asks if these are strange bedfellows, no-one assumes Apple is drafting EDRI’s speaking points.’

Ella Jakubowska, senior policy advisor at EDRi, did not engage directly with the commissioner’s attack on her employer during her own contribution to the seminar. Instead she dedicated her two-to-three minutes of speaking time to calling out the Commission for asking Europeans to make a false choice between privacy and safety.

‘This disingenuous narrative has been reiterated in surveys which have crassly asked people if they agree that the ability to detect child abuse is more important than the right to online privacy,’ she said. ‘This is a misrepresentation of both privacy and safety — as if we can all altruistically give up some of our privacy in order to keep children safe online. It doesn’t work in that way. And this sort of attitude really failed to grasp the deep social roots of the crime that we are talking about. To the contrary, I think it’s clear that privacy and safety are mutually reinforcing.’

Jakubowska also accused the EU’s executive of seeking to manipulate public support for its proposal by deploying leading survey questions. (Johansson’s blog post pointed to a new Eurobarometer poll which she claimed showed massive public support for laws that regulate online service providers to fight child sexual abuse, including 81% supporting platforms having obligations to detect, report and remove child sexual abuse.)

She went on to highlight concerns that the proposal poses risks to professional secrecy (such as lawyer client privilege), warning: ‘This of course, on the surface should be of concern to all of us in a democratic society. But even more so when we’re thinking about this crime of child sexual abuse, where securing convictions of perpetrators is so deeply important. The idea that this law could stand in the way of already fragile access to justice for survivors should not be taken lightly. But this element was not even considered in the Commission’s proposal.’

She also highlighted the risk of negative impacts on children’s political participation — which she said is an angle that’s been under-examined by lawmakers, despite children’s rights law requiring their voices to be listened to in legislation that will impact young people.

‘We’ve heard very little from children and young people themselves. In fact, in a representative survey that was undertaken earlier this year, it was found that 80% of young people aged between 13 and 17 from across 13 EU Member States would not feel comfortable being politically active or exploring their sexuality if authorities were able to monitor their digital communication. And that was specifically asked if it was being done for the purpose of scanning for child sexual abuse. So it’s really clear when we asked young people if this is the sort of measure that they want to keep them safe that this is not the answer,’ she suggested.

The final panellist to speak during the event was MEP Patrick Breyer, who has been a stalwart voice raised in opposition to the CSAM-scanning proposal — aka ‘Chatcontrol’, as he’s pithily dubbed it — ever since the controversial plan popped up on the EU’s horizon.

During the seminar he described the proposal as ‘unprecedented in the free world’, suggesting it has led to unhelpfully polarized arguments both for and against. The more fruitful approach for policymakers would, he argued, be to work for consensus — to ‘try to bring the two sides together’, by keeping bits of the proposal everyone can get behind and then — ‘consensually’ — adding ‘new, effective approaches’ to push for something that ‘can protect children much better’.

Discussing how the proposal might be amended to reduce negative impacts and bolster protections for kids, Breyer said a new approach is needed — one that doesn’t just remove the controversial detection orders but focuses on prevention, by ‘strictly limiting the scanning of communications to persons presumably involved in child sexual exploitation’. So targeted investigations, not Chatcontrol. ‘That’s the only way to avoid involvement in court and achieving nothing at all for our children,’ he argued.

Per Breyer, MEPs who support reforming the proposal are working hard to achieve such changes in the Parliament. But — so far — he said the Council is ‘refusing any measure of targeting’.

‘We also need to avoid un-targeted voluntary detection by industry, both concerning content and metadata, because it suffers the same problems for proportionality as the mandated detection,’ he went on. ‘And the same goes for not turning our personal devices into scanners in order to backdoor encryption. So we need to explicitly exclude client-side scanning… What the Council is looking at — some vague commitments to how important encryption is — does not do the job.’

On detection, as an alternative to unacceptable mass surveillance, he spoke up in favor of proactive crawling of publicly accessible material — which he noted is already being done in the U.K. and Canada — as a way to ‘clean the web’. The proposed new EU Centre could be tasked with doing that, he suggested, along with focusing on crime prevention, victim support and best practices for law enforcement.

In wider remarks, he also urged lawmakers to resist calls to impose mandatory age verification on platforms as another ill-thought-through child safety measure — suggesting the focus should instead be placed on making services safe by design.

‘Shouldn’t profiles be restricted to being private unless the user explicitly wants to make them publicly visible? Should anybody be able to reach out to new users and to send them all sorts of photos without the user even being asked? And shouldn’t the users have the right to decide whether they want to see nude photos? It’s possible to tell on the device without giving any information to the provider. Such confirmation could also go a long way to warning teenagers and children of what could be the consequences — and maybe offering them to reach out for support.’

But non-technical solutions are ultimately ‘key’ to preventing child sexual abuse, he suggested, emphasizing: ‘We can’t focus just on technical approaches only.’

The stakes if EU lawmakers fail to reach a sensible revision of the proposal in trilogue negotiations on this file are grave indeed, he also warned — with the possibility that a CSAM-scanning law could mean ‘the end of truly confidential private messaging and secure encryption’ — and also ‘pave the way to introducing unprecedented authoritarian methods to democracies’.

It is the thin end of the wedge, in Breyer’s view. ‘If they start scanning our communications without cause what prevents them from scanning our devices, or even from scanning our homes? There is technology of shot detection; everything can be done with AI. And I think if that precedent is set — that it is justified to intrude in personal and private spaces just because they could be some hint of a crime — then this very much destroys the essence of the right to privacy,’ he suggested. 

‘But if we prevail with our views, I think we can set a global example for protecting children online in line with our values. And that’s what I’m fighting for. Not just for our generation but also for our children, because I want them to grow up in a free world where we trust each other and not in a surveillance state of mutual fear.’

Reference: https://techcrunch.com/2023/10/24/eu-csam-scanning-edps-seminar/

