----------------start------------------------
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices. Apple detailed its proposed system — known as “neuralMatch” — to some US academics earlier this week, according to two security researchers briefed on the virtual meeting.
The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US. Apple confirmed its plans in a blog post, saying the scanning technology is part of a new suite of child protection systems that would “evolve and expand over time”.
The features will be rolled out as part of iOS 15, expected to be released next month. “This innovative new technology allows Apple to provide valuable and actionable information to the National Center for Missing and Exploited Children and law enforcement regarding the proliferation of known CSAM [child sexual abuse material],” the company said. “And it does so while providing significant privacy benefits over existing techniques since Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.”
The proposals are Apple’s attempt to find a compromise between its own promise to protect customers’ privacy and demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography. The tension between tech companies such as Apple and Facebook, which have defended their increasing use of encryption in their products and services, and law enforcement has only intensified since the iPhone maker went to court with the FBI in 2016 over access to a terror suspect’s iPhone following a shooting in San Bernardino, California.
Security researchers, while supportive of efforts to combat child abuse, are concerned that Apple risks enabling governments around the world to seek access to their citizens’ personal data, potentially far beyond its original intent.
“It is an absolutely appalling idea, because it is going to lead to distributed bulk surveillance of . . . our phones and laptops,” said Ross Anderson, professor of security engineering at the University of Cambridge. Although the system is currently trained to spot child sex abuse, it could be adapted to scan for any other targeted imagery and text, for instance, terror beheadings or anti-government signs at protests, say researchers. Apple’s precedent could also increase pressure on other tech companies to use similar techniques. “This will break the dam — governments will demand it from everyone,” said Matthew Green, a security professor at Johns Hopkins University, who is believed to be the first researcher to post a tweet about the issue.
Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, said Apple’s move was “tectonic” and a “huge and regressive step for individual privacy”. “Apple are walking back privacy to enable 1984,” he said. Cloud-based photo storage systems and social networking sites already scan for child abuse imagery, but that process becomes more complex when trying to access data stored on a personal device. Apple’s system is less invasive in that the screening is done on the phone, and “only if there is a match is notification sent back to those searching”, said Alan Woodward, a computer security professor at the University of Surrey. “This decentralised approach is about the best approach you could adopt if you do go down this route.”
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse. The system has been trained on 200,000 sex abuse images collected by the NCMEC. According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a “safety voucher” saying whether it is suspect or not. Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.
------------end------------------------
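For anyone curious how the pipeline the article describes could work in principle (hash each photo on the device, compare the fingerprint against a database of known images, attach a "safety voucher" to each upload, and only escalate once a threshold of suspect photos is reached), here is a minimal, hypothetical sketch in Python. None of this is Apple's actual code: the real system reportedly uses a perceptual hash so that resized or re-encoded copies of an image still match, whereas the SHA-256 stand-in below only matches byte-identical files, and the threshold value and function names are invented for illustration.

```python
import hashlib

# Hypothetical illustration of the hash-matching scheme described in the
# article above. SHA-256 is a stand-in: a real deployment would use a
# perceptual hash so that visually identical images map to the same
# fingerprint even after resizing or re-encoding.

KNOWN_IMAGE_HASHES: set[str] = set()   # fingerprints of known abuse images (e.g. supplied by NCMEC)
REVIEW_THRESHOLD = 10                  # invented number; the article does not give the real threshold


def fingerprint(image_bytes: bytes) -> str:
    """Convert a photo into a fixed-length string of numbers ("hashing")."""
    return hashlib.sha256(image_bytes).hexdigest()


def make_safety_voucher(image_bytes: bytes) -> dict:
    """Attach a voucher to an uploaded photo saying whether it is suspect."""
    h = fingerprint(image_bytes)
    return {"hash": h, "suspect": h in KNOWN_IMAGE_HASHES}


def needs_human_review(vouchers: list[dict]) -> bool:
    """Only once enough photos are marked suspect is human review triggered."""
    suspect_count = sum(1 for v in vouchers if v["suspect"])
    return suspect_count >= REVIEW_THRESHOLD
```

Note that nothing in the matching logic knows what the fingerprints actually depict; it flags whatever happens to be in the database. That is exactly the property the researchers quoted above are worried about, and it is what the rest of this post is about.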
There we have it, folks: the future of surveillance is here, and they are using the same BS excuses to push this extremely invasive s**t.
The "Patriot Act" in the US, to catch "terrorists".
The anti-encryption bill called the "EARN IT Act", to catch "criminals and terrorists".
And now we have Apple, a private for-profit company, willingly implementing a publicly known backdoor so that governments can access your data in the future if they want to. But of course, I'm just a conspiracy theorist, and the researchers quoted above warning that Apple "risks enabling governments around the world to seek access to their citizens' personal data, potentially far beyond its original intent" are total nut jobs. Apple is the hero here, and they are doing it all "for the kids".
So imagine a future where China adds a bunch of pictures of the Tiananmen Square massacre to the database of hashes screened by Apple's program. Whoever has one of those pictures on their phone gets detected and reported to the authorities. What a scary, creepy future we are heading toward.
But yeah, "this is all for the kids". The road to hell is paved with good intentions, so I say it's time to ditch Apple.