
Apple plans to scan US iPhones for child ab*se imagery; in other words, Apple will install a backdoor in your phone.

MiddleEuroAsia

Divergent thinker
Pro Member
Nov 1, 2020
574
824
93
Visit site
----------------start------------------------

Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices. Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.

The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US. Apple confirmed its plans in a blog post, saying the scanning technology is part of a new suite of child protection systems that would “evolve and expand over time”.

The features will be rolled out as part of iOS 15, expected to be released next month. “This innovative new technology allows Apple to provide valuable and actionable information to the National Center for Missing and Exploited Children and law enforcement regarding the proliferation of known CSAM [child sexual abuse material],” the company said. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.”

The proposals are Apple’s attempt to find a compromise between its own promise to protect customers’ privacy and demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography. The tension between tech companies such as Apple and Facebook, which have defended their increasing use of encryption in their products and services, and law enforcement has only intensified since the iPhone maker went to court with the FBI in 2016 over access to a terror suspect’s iPhone following a shooting in San Bernardino, California.


Security researchers, while supportive of efforts to combat child abuse, are concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.

“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge. Although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple’s precedent could also increase pressure on other tech companies to use similar techniques. “This will break the dam — governments will demand it from everyone,” said Matthew Green, a security professor at Johns Hopkins University, who is believed to be the first researcher to post a tweet about the issue.

Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, said Apple’s move was “tectonic” and a “huge and regressive step for individual privacy”. “Apple are walking back privacy to enable 1984,” he said.

Cloud-based photo storage systems and social networking sites already scan for child abuse imagery, but that process becomes more complex when trying to access data stored on a personal device. Apple’s system is less invasive in that the screening is done on the phone, and “only if there is a match is notification sent back to those searching”, said Alan Woodward, a computer security professor at the University of Surrey. “This decentralised approach is about the best approach you could adopt if you do go down this route.”

Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse. The system has been trained on 200,000 sex abuse images collected by the NCMEC. According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.


------------end------------------------
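
Before I get into it, here is roughly the flow the FT piece above describes, boiled down to a toy sketch. This is not Apple's code: neuralMatch uses a proprietary perceptual hash and cryptographic "safety vouchers", while this stand-in uses a plain SHA-256 and a simple counter, and every name and number in it is made up.

```python
import hashlib

# Hypothetical stand-ins: the real hash list comes from NCMEC via Apple, and the
# real reporting threshold has not been published.
KNOWN_HASH_DB = {"placeholder_hash_1", "placeholder_hash_2"}
REPORT_THRESHOLD = 3


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for the on-device perceptual hash ("a string of numbers")."""
    return hashlib.sha256(image_bytes).hexdigest()


def make_safety_voucher(image_bytes: bytes) -> dict:
    """Every photo uploaded to iCloud gets a 'safety voucher' saying whether it is suspect."""
    digest = image_hash(image_bytes)
    return {"hash": digest, "suspect": digest in KNOWN_HASH_DB}


def needs_human_review(vouchers: list[dict]) -> bool:
    """Once a certain number of photos are marked suspect, the human review team is alerted."""
    return sum(v["suspect"] for v in vouchers) >= REPORT_THRESHOLD
```

Per the article, no single match does anything by itself; only an accumulation of flagged vouchers triggers decryption and human review.
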
So here we have it, folks: the future of surveillance is here, and they are using the same BS excuses to push this extremely invasive s**t:

the "patriot act" in the US, to catch "terrorists"

the anti-encryption bill called the "EARN IT Act", to catch "criminals and terrorists"

and now we have Apple, a private, for-profit company, willingly implementing a "publicly known backdoor" so that governments can access your data in the future if they want to. But of course, I'm just a conspiracy theorist, and the researchers quoted above warning that "Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent" are also total nut jobs. And Apple is totally the hero, doing all of this "for the kids".

So, can you imagine: in the future, China could add a bunch of pictures of the Tiananmen Square massacre to the database of hashes screened by Apple's program, and whoever has one of those pictures on their phone would be detected and reported to the authorities. What a scary, creepy future we are heading toward.

But yeah, "this is all for the kids". The road to hell is paved with good intentions, so I say it's time to ditch Apple.
 
Dr. Neal Krawetz, the creator of FotoForensics and a well-known researcher, says it best:

"Illegal Searches: As noted, Apple says that they will scan your Apple device for CSAM material. If they find something that they think matches, then they will send it to Apple. The problem is that you don't know which pictures will be sent to Apple. You could have corporate confidential information and Apple may quietly take a copy of it. You could be working with the legal authority to investigate a child exploitation case, and Apple will quietly take a copy of the evidence.

To reiterate: scanning your device is not a privacy risk, but copying files from your device without any notice is definitely a privacy issue.

Think of it this way: Your landlord owns your property, but in the United States, he cannot enter any time he wants. In order to enter, the landlord must have permission, give prior notice, or have cause. Any other reason is trespassing. Moreover, if the landlord takes anything, then it's theft. Apple's license agreement says that they own the operating system, but that doesn't give them permission to search whenever they want or to take content."

You can read his full statement here if you want to:
 
It's much less of a problem than some people want to make it appear.
Apple will just scan the checksums of files on iCloud to see if they match the checksums of known pedophile pictures. If a checksum matches, the picture will be flagged and sent for manual review. If it is indeed an illegal picture, it will be forwarded to the task force for further enforcement.

No files on your iPhone are scanned; only the files sent to iCloud will have their checksums scanned.

Other cloud providers scan all your files in the cloud using AI on each picture's content, not only the checksum. Some even use all the pictures on social media to feed their databases...

So while I am a privacy enthusiast, I have no issue with Apple's new policy. If you want to have this kind of picture and you are stupid enough to send it to a cloud server, you deserve to get caught.
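
To spell out the model I mean, here is a minimal sketch of "checksum matching on upload". It is only an illustration of that model, not Apple's actual implementation, and the digests and function names are made up.

```python
import hashlib

KNOWN_BAD_CHECKSUMS = {"placeholder_digest_1", "placeholder_digest_2"}  # made-up entries


def checksum(file_bytes: bytes) -> str:
    return hashlib.sha256(file_bytes).hexdigest()


def on_icloud_upload(file_bytes: bytes) -> bool:
    """In this model the check only ever runs on files the user chose to upload."""
    if checksum(file_bytes) in KNOWN_BAD_CHECKSUMS:
        return True   # flag for manual review, then the authorities if confirmed
    return False
```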
 
I respectfully disagree with a lot of what you said, because your arguments are completely based on misunderstandings of how the program works.

People keep saying it's looking for CSAM, or child ab*se material, but that's a misunderstanding of how it works. It's looking for a match against a database of hashes that, right now, represent child ab*se material, but it could be anything: Tiananmen Square pictures, copyrighted images, confidential corporate files, etc.
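
To make that concrete, here's a toy sketch of why the matching step is content-agnostic. The hash values are placeholders, nothing real:

```python
import hashlib


def scan(files: list[bytes], hash_db: set[str]) -> list[int]:
    """Return the indexes of files whose hash appears in whatever database it is handed."""
    return [i for i, f in enumerate(files)
            if hashlib.sha256(f).hexdigest() in hash_db]


# Today's database vs. a hypothetical tomorrow's: the scanner itself never changes.
csam_hashes      = {"placeholder_a", "placeholder_b"}
dissident_hashes = {"placeholder_c", "placeholder_d"}
# scan(photos, csam_hashes) and scan(photos, dissident_hashes) are the same operation.
```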

SwiftOnSecurity says:


No files on your iPhone are scanned; only the files sent to iCloud will have their checksums scanned.
This is 1000% false.

The hash comparison takes place on the local device, not in the cloud. PERIOD.

Add to that the fact that the iCloud feature is enabled by default and you have to jump through hoops to disable it.

You keep saying "everyone does it!", but that's incorrect. None of the major operating systems monitor your actions on-device for illegal activity and report you to the authorities if you are caught. Cloud providers will compare what you upload to their servers, but none of them will scan your photos on your device before you upload them to the cloud; there is a major, fundamental difference of principle here.
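
Here's the difference in sketch form, in case it isn't obvious. Neither snippet is real code from any provider; it just shows where the check runs in each model, with placeholder names throughout.

```python
import hashlib

BLOCKLIST = {"placeholder_hash"}  # made-up


def _hash(photo: bytes) -> str:
    return hashlib.sha256(photo).hexdigest()


# Model A: conventional cloud provider -- scanning happens server-side,
# and only on copies the user has already chosen to upload.
def provider_scans_after_upload(uploaded_photos: list[bytes]) -> list[int]:
    return [i for i, p in enumerate(uploaded_photos) if _hash(p) in BLOCKLIST]


# Model B: Apple's announced design -- the comparison runs on the phone itself,
# and the verdict travels with the photo before it ever leaves the device.
def device_scans_before_upload(local_photos: list[bytes]) -> list[tuple[bytes, bool]]:
    return [(p, _hash(p) in BLOCKLIST) for p in local_photos]
```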

Look at the Cato Institute and what they are saying about this shitty program:

"Described more abstractly and content neutrally, here’s what Apple is implementing: A surveillance program running on the user’s personal device, outside the user’s control, will scan the user’s data for files on a list of prohibited content, and then report to the authorities when it finds a certain amount of content on the list. Once the architecture is in place, it is utterly inevitable that governments around the world will demand its use to search for other kinds of content—and to exert pressure on other device manufacturers to install similar surveillance systems."

So, no, we are not pedos because we want to keep our privacy, just like we are not pedos for wanting to close the bathroom door when taking a s**t.

Man, it's just f*****g ironic that this is coming from Apple, a company that preaches day and night about "we protect your privacy, we are not like other gir- I mean phones, we are way better", and then they do this lol.

I will just leave this image here and rest my case
 

Attachment: DwGoq2uV4AA_Aov.jpg-large.jpeg (221 KB)
Google deleted a photo from my Android phone just now, without any prompt or consent. For real. The photo showed my girlfriend sleeping. I assume that since she's a girl and was sleeping, it was judged a non-consensual image, so they decided to remove the file from my phone's hard drive. The funny thing is, it happened right in front of my eyes five minutes ago. First I was viewing the photo, then I closed it, and it was gone. No, I didn't delete it myself. The photo had actually been synced to Google Photos before, where I looked for it but couldn't find it. That means somebody manually reviewed my files and decided to delete it from my phone locally. At least now we can stop pretending that there's any privacy left. Your phone is not yours, it's Google's, and they do whatever they want with your own personal data.
 
That’s scary as hell! I’ll try with my gf and see what happens…
 