Cybersecurity experts are reviewing Apple and EU’s phone scanning plans

A group of cybersecurity researchers has warned that scanning mobile devices for child sexual abuse images is a "dangerous technology" that may not be as effective as Apple and the E.U. believe. On Thursday, more than a dozen prominent cybersecurity experts criticized the plans by Apple and the European Union to monitor mobile devices for child sexual abuse material.

The experts called the plan an ineffective and dangerous strategy that would invite government surveillance. Their criticism is laid out in a 46-page study reviewing the proposal by Apple to detect images of child sexual abuse on iPhones, a proposal echoed by members of the European Union, who want to detect similar abuse imagery as well as terrorist content on encrypted devices in Europe using the same "dangerous technology".

Also Read: Android App Sideloading has up to 15-47x more malware than iOS, compromising on security and privacy

According to the researchers, resisting attempts to spy on and influence law-abiding citizens should be a national-security priority. The technology in question, known as client-side scanning, would allow Apple and EU law enforcement officials to detect images of child sexual abuse on someone's phone by scanning images uploaded to Apple's iCloud storage service.

Also Read: Apple silently resolves the iOS Zero-Day vulnerability without crediting the bug reporter

Apple announced the plan in August, saying it would reject any requests by foreign governments to repurpose the system. The release of the scanning tool is currently on hold, and Apple has not commented further. The cybersecurity researchers are reviewing the proposal before any final rollout.

Meanwhile, the EU released documents and held a meeting with Apple last year. According to reports, the governing body also plans a similar program that would not only detect images of child sexual abuse but could also flag signs of organized crime or terrorist activity.

Via: The New York Times