Apple has been hit with a lawsuit from West Virginia over iCloud’s alleged role in the distribution of child sexual abuse material (CSAM), but the company’s steadfast commitment to user privacy remains a cornerstone of its approach — one that experts and advocates argue prevents far greater abuses by bad actors. In a complaint filed on February 19, 2026, in Mason County Circuit Court, West Virginia Attorney General JB McCuskey accused Apple of allowing iCloud to serve as a platform for storing and distributing illegal CSAM, citing internal communications and the company’s lower volume of reports to the National Center for Missing & Exploited Children (NCMEC) compared to competitors like Google and Meta.
The lawsuit claims Apple prioritized privacy over aggressive content scanning, pointing to a previously reported 2020 internal message in which an Apple executive described iCloud as the “greatest platform for distributing child porn” due to limited detection measures at the time. It also references Apple’s 2021 announcement of a proposed on-device CSAM-detection system (NeuralHash), which was ultimately abandoned in late 2022 amid widespread privacy backlash.

Apple has long maintained that introducing broad scanning mechanisms — even for illegal content — creates inevitable vulnerabilities.
Any client-side or server-side scanning system capable of detecting specific material could be repurposed or compelled for other uses, such as government surveillance, censorship, or mass data exploitation by malicious hackers. Once a technical capability exists to scan user content at scale, it becomes a prime target for bad actors seeking to access private data, impersonate users, or conduct broader attacks on privacy.
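The repurposing concern can be made concrete with a minimal sketch (this is illustrative only, not Apple’s NeuralHash; the function names and sample data are hypothetical): a hash-matching scanner never inspects what its blocklist represents, so swapping the database changes what is detected without changing a line of the scanner itself.

```python
# Illustrative sketch of why hash-matching scanners are target-agnostic.
# NOT Apple's NeuralHash: real systems use perceptual hashes of image
# features; a cryptographic hash stands in here for simplicity.
import hashlib

def fingerprint(data: bytes) -> str:
    """Stand-in for a perceptual hash of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def scan(user_files: list[bytes], blocklist: set[str]) -> list[int]:
    """Return indices of files whose fingerprint appears in the blocklist."""
    return [i for i, f in enumerate(user_files)
            if fingerprint(f) in blocklist]

# The scanner has no knowledge of the blocklist's provenance or purpose:
csam_db = {fingerprint(b"known-illegal-sample")}
dissent_db = {fingerprint(b"banned-political-pamphlet")}

files = [b"vacation.jpg", b"banned-political-pamphlet"]
print(scan(files, csam_db))     # [] -- no matches
print(scan(files, dissent_db))  # [1] -- same code, different target
```

The design choice the text criticizes is visible in the signature: `scan` accepts any `blocklist`, so whoever controls the database controls what the infrastructure detects.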
By rejecting such systems and instead implementing end-to-end encryption for iCloud backups (rolled out in December 2022), Apple ensures that user data — including photos, messages, and documents — remains inaccessible to the company itself, law enforcement without user consent, or cybercriminals who might breach servers. This architecture fundamentally limits the risk of abuse: no backdoor means no universal key that could be stolen, leaked, or demanded under legal pressure in less democratic jurisdictions.
Apple emphasized in its response to the lawsuit that it continues to innovate for safety while upholding privacy. Features like Communication Safety in Messages automatically detect and blur nudity in communications involving children, with interventions designed to protect young users without compromising overall encryption or enabling routine content inspection. The company reports detected CSAM when required by law and has built one of the most secure ecosystems in the industry.
Privacy advocates have long warned that mandatory scanning requirements or backdoors erode protections for everyone. Journalists, dissidents, domestic violence survivors, and ordinary users rely on strong encryption to safeguard sensitive information. Introducing exceptions for CSAM detection would create a precedent that weakens these safeguards, potentially enabling authoritarian regimes or cybercriminals to exploit the same mechanisms.
While the West Virginia lawsuit seeks damages and court-ordered changes to Apple’s detection practices, Apple’s position underscores a principled stance: true protection against child exploitation must not come at the cost of creating tools that bad actors could weaponize against millions of innocent users worldwide. Apple continues to lead in device security and privacy features, balancing child safety innovations with the fundamental right to private communication in an increasingly digital world.
MacDailyNews Take: When they stoop to the Think of the Children™ ruse, their desperation fairly screams. Apple’s iMessage service is end-to-end encrypted. Apple cannot see data sent via its iMessage service. Apple’s Advanced Data Protection for iCloud allows users to protect important iCloud data, including iCloud Backup, Photos, Notes, and more. Apple cannot see data protected by Advanced Data Protection for iCloud.
In December 2022, after much opposition, including, voluminously, from us here at MacDailyNews, Apple killed an effort to design an iCloud photo scanning tool for detecting child sexual abuse material (CSAM) in the storage service. This sounds wonderful at first glance (everyone’s for detecting and rooting out purveyors of child pornography) and horrible once you think about it for more than a second (massive, awful potential for misuse)… It’s a huge can of worms.
It’s a backdoor, plain and simple, and it neatly negates Apple’s voluminous claims of protecting users’ privacy. It doesn’t matter what they’re scanning for, because if they can scan for one thing, they can scan for anything. – MacDailyNews, August 6, 2021

Originally, Apple would use one database of hashes from the National Center for Missing and Exploited Children (NCMEC). Then, after outcry, Apple changed their backdoor scanning to match “two or more child safety organizations operating in separate sovereign jurisdictions.” Of course, Apple’s multi-country “safeguard” is no safeguard at all.
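The revised multi-database design amounts to a quorum filter, sketched below (names and hashes are hypothetical; this is not Apple’s actual protocol): a hash becomes eligible for matching only if it appears in at least two of the supplied databases. The flaw the text argues follows directly: if two nominally independent organizations submit the same hash, the quorum is satisfied.

```python
# Sketch of a "two or more jurisdictions" safeguard as a quorum filter.
from collections import Counter

def eligible_hashes(databases: list[set[str]], quorum: int = 2) -> set[str]:
    """Keep only hashes present in at least `quorum` of the databases."""
    counts = Counter(h for db in databases for h in db)
    return {h for h, n in counts.items() if n >= quorum}

ncmec_db = {"aaa", "bbb"}       # hypothetical hash lists
other_org_db = {"bbb", "ccc"}
print(eligible_hashes([ncmec_db, other_org_db]))  # {'bbb'}
```

Nothing in the filter verifies that the databases are genuinely independent; cooperating organizations in allied jurisdictions could each insert the same non-CSAM hash and meet the quorum, which is why critics call the safeguard hollow.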
The Five Eyes (FVEY) is an intelligence alliance comprising the United States, Australia, Canada, New Zealand, and the United Kingdom. These countries are parties to the multilateral UKUSA Agreement, a treaty for joint cooperation in signals intelligence. The FVEY further expanded their surveillance capabilities during the course of the “war on terror,” with much emphasis placed on monitoring the World Wide Web.
The former NSA contractor Edward Snowden described the Five Eyes as a “supra-national intelligence organization that does not answer to the known laws of its own countries.” Documents leaked by Snowden in 2013 revealed that the FVEY has been spying on one another’s citizens and sharing the collected information with each other in order to circumvent restrictive domestic regulations on surveillance of citizens.
Apple’s claim to backdoor scan only for CSAM was intended to be a trojan horse, introduced via the hackneyed “Think of the Children™” ruse, that would be bastardized in secret for all sorts of surveillance under the guise of “safety” in the future.

“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.” — Benjamin Franklin

The fact that Apple ever considered this travesty in the first place, much less announced and tried to implement it in the fashion they did, has damaged the company’s reputation for protecting user privacy immensely; perhaps irreparably.
Hopefully, if Apple management has any sense whatsoever, is not hopelessly compromised, and can resist whatever pressure forced them into this ill-considered abject disloyalty to customers who value their privacy and security, the company will end this disastrous scheme promptly and double-down on privacy by finally and immediately enabling end-to-end encryption of iCloud backups as a company which claims to be a champion of privacy would have done many years ago.
– MacDailyNews, December 23, 2021

MacDailyNews Note: Advanced Data Protection for iCloud offers Apple’s highest level of cloud data security. If you choose to enable it, the majority of your iCloud data — including iCloud Backup, Photos, Notes, and more — is protected using end-to-end encryption. No one else can access your end-to-end encrypted data, not even Apple, and this data remains secure even in the case of a data breach in the cloud.
To turn on Advanced Data Protection, first update the iPhone, iPad, or Mac that you’re using to the latest software version. Turning on Advanced Data Protection on one device enables it for your entire account and all your compatible devices.

On iPhone or iPad:
1. Open the Settings app.
2. Tap your name, then tap iCloud.
3. Scroll down, tap Advanced Data Protection, then tap Turn on Advanced Data Protection.
4. Follow the onscreen instructions to review your recovery methods and enable Advanced Data Protection.

On Mac:
1. Choose Apple menu > System Settings.
2. Click your name, then click iCloud.
3. Click Advanced Data Protection, then click Turn On.
4. Follow the onscreen instructions to review your recovery methods and enable Advanced Data Protection.

Please help support MacDailyNews — and enjoy subscriber-only articles, comments, chat, and more — by subscribing to our Substack: macdailynews.substack.com.
Thank you! Support MacDailyNews at no extra cost to you by using this link to shop at Amazon.
Original Source: Macdailynews.com | Author: MacDailyNews | Published: February 19, 2026, 6:27 pm

