Leaked: EU plans to mandate scanning of encrypted messages to stop child sexual abuse


Equin0x
Recommended Posts

https://www.imore.com/leaked-eu-plans-mandate-scanning-encrypted-messages-stop-child-sexual-abuse

A new leaked document appears to reveal that the EU is planning to mandate that the providers of messaging services like WhatsApp and iMessage must scan messages in order to detect child sexual abuse material (CSAM) and grooming of children within messages.

 

It's hard to believe that this is being done in good faith. They talk about end-to-end encryption being necessary for security and confidentiality, yet at the same time they propose something that would let other people read your messages without your permission. Well, I'm sorry, but I don't see how that can possibly be described as secure or confidential. Governments argue that end-to-end encryption exists to stop criminals snooping on your data, and that this proposal keeps that security intact because messages will still be encrypted as they travel across the network. But end-to-end encryption is also about preventing the government itself from snooping on your data, and a proposal like this would completely undermine that protection.
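To make that concrete: one way scanning can coexist with messages that are still "encrypted in transit" is client-side scanning, where the content is checked on the device before it is ever encrypted. A rough, hypothetical sketch in Python (none of these names come from the proposal, and the keyword list is just a stand-in for whatever classifier or hash list would really be used):

```python
# Hypothetical sketch: the scanner sees the plaintext *before* encryption,
# so "messages are still encrypted in transit" never protects the content
# from the scanner itself.
FLAGGED_PHRASES = {"example flagged phrase"}   # stand-in for a real classifier/hash list

def report_to_provider(plaintext: str) -> None:
    # Placeholder for the report the proposal would require providers to make.
    print("reported:", plaintext)

def encrypt_for_recipient(plaintext: str) -> bytes:
    # Stand-in for the real e2e encryption step (e.g. the Signal protocol).
    return plaintext.encode()[::-1]            # placeholder, not real cryptography

def send_message(plaintext: str) -> bytes:
    if any(phrase in plaintext.lower() for phrase in FLAGGED_PHRASES):
        report_to_provider(plaintext)          # the content leaves the channel in the clear
    return encrypt_for_recipient(plaintext)    # encryption only happens afterwards

send_message("an example flagged phrase inside a 'private' message")
```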

It's the thin end of the wedge. Paedophiles and CSAM are just the excuse being used to get public support. If implemented, this would eventually be expanded to scanning for evidence of other crimes and "unacceptable views".


If it nails paedophiles and sex offenders then bring it on; the sooner the better. These are the most heinous and terrifying of offenders.


13 hours ago, Equin0x said:

https://www.imore.com/leaked-eu-plans-mandate-scanning-encrypted-messages-stop-child-sexual-abuse

A new leaked document appears to reveal that the EU is planning to mandate that the providers of messaging services like WhatsApp and iMessage must scan messages in order to detect child sexual abuse material (CSAM) and grooming of children within messages.

 

It's hard to believe that this is being done in good faith. They talk about end-to-end encryption being necessary for security and confidentiality, yet at the same time they propose something that would let other people read your messages without your permission. Well, I'm sorry, but I don't see how that can possibly be described as secure or confidential. Governments argue that end-to-end encryption exists to stop criminals snooping on your data, and that this proposal keeps that security intact because messages will still be encrypted as they travel across the network. But end-to-end encryption is also about preventing the government itself from snooping on your data, and a proposal like this would completely undermine that protection.

It's the thin end of the wedge. Paedophiles and CSAM are just the excuse being used to get public support. If implemented, this would eventually be expanded to scanning for evidence of other crimes and "unacceptable views".

In short, and so as not to compromise operational info, the proposed scanning would only hit known IIOC, so it would be ineffective given the current means of detection.
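For illustration only, here is a minimal sketch of what matching against known material involves (a plain SHA-256 lookup stands in for the perceptual hashing real systems use; the limitation is the same either way: an image that has never been catalogued produces no match):

```python
import hashlib

# Hypothetical hash list of already-catalogued images (placeholder content only).
KNOWN_IIOC_HASHES = {hashlib.sha256(b"placeholder for a catalogued image").hexdigest()}

def is_known_image(image_bytes: bytes) -> bool:
    """Flag an image only if its hash already appears in the database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_IIOC_HASHES

print(is_known_image(b"placeholder for a catalogued image"))  # True - already catalogued
print(is_known_image(b"newly created material"))              # False - never flagged
```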


Posted (edited)

The EU have now published their proposals. These are just proposals and obviously may be modified/rejected.

However, as it stands it's a bad idea in my opinion. The proposals require the ability to scan all textual messages (with human oversight) for evidence of grooming. Of course, once e2e encryption is removed and we are left with in-transit encryption only, the provider becomes the weak point that allows for all sorts of problems down the line, be it criminals gaining access, totalitarian regimes using it to crack down on what they view as dissent, etc. What's fine today might not be good tomorrow.
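To illustrate the difference, a minimal sketch using PyNaCl (the parties and keys are hypothetical): with e2e, the relaying server only ever holds ciphertext; with in-transit-only encryption, the server terminates the encryption, holds the plaintext, and becomes exactly the point a criminal or an over-reaching regime would target.

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# End-to-end: only the two endpoints hold keys; the server just relays ciphertext.
alice, bob = PrivateKey.generate(), PrivateKey.generate()
e2e_message = Box(alice, bob.public_key).encrypt(b"hello bob")
# The server stores/forwards e2e_message but cannot read it; only Bob can:
print(Box(bob, alice.public_key).decrypt(e2e_message))

# In-transit only: the server terminates the encryption (much like plain TLS),
# so after decryption it holds the plaintext and can scan, store or leak it.
server = PrivateKey.generate()
to_server = Box(alice, server.public_key).encrypt(b"hello bob")
plaintext_at_server = Box(server, alice.public_key).decrypt(to_server)
print(plaintext_at_server)  # readable by the provider, and by anyone who compromises it
```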

Also, any online service where one user can chat/message another is affected by these proposals, e.g. online games, this site etc.

Needless to say I fully support prevention of child abuse & detection of paedophiles but I don't think this is the way to go about it. We could go the whole hog and just ban the internet since there is unfortunately CSAM and grooming taking place on it. Or we could try and take a measured approach, perhaps:

* Education on how to spot grooming - for both kids & parents
* Raising awareness amongst all those who care for kids in some capacity and the signs to look for (e.g. teachers, medical professionals, care providers etc)
* Honey pots (automated and human) to detect attempts to groom children

It is also entirely possible to disable e2e encryption for the accounts of children only, e.g. if you are under 18 years old then you are subject to automated scanning. Any communication to/from a child account would be accessible and could be used to detect/prove instances of grooming. However, adult accounts could keep e2e encryption. This certainly ticks all the boxes, unless there is actually another agenda for wanting to access all our messages...
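A minimal sketch of that per-account decision (all names hypothetical, purely to show the policy can be expressed per account rather than applied to everyone):

```python
from dataclasses import dataclass

@dataclass
class Account:
    username: str
    age: int

def requires_scanning(account: Account) -> bool:
    # Hypothetical policy: child accounts lose e2e and are subject to automated
    # scanning; adult accounts keep end-to-end encryption untouched.
    return account.age < 18

for acct in (Account("child_user", 14), Account("adult_user", 34)):
    mode = "scanned, in-transit encryption only" if requires_scanning(acct) else "end-to-end encrypted"
    print(acct.username, "->", mode)
```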

Edited by martin

