West Virginia Attorney General JB McCuskey wants you to think he’s protecting children. His press release says so. His legal complaint opens with the genuinely horrific line that Apple has, i…
This is a mind-bending read, but I think the important part is
If an actual court orders Apple to start scanning iCloud for CSAM, then every image flagged by those mandated scans becomes evidence obtained through a warrantless government search conducted without probable cause.
… And thus it becomes inadmissible under the Fourth Amendment.
There is a reason Congress, when it enacted the federal statute requiring providers to report CSAM when they find it, explicitly included a disclaimer that providers cannot be forced to search for it.
I guess this wouldn’t prevent government officials from winking and nudging Apple to scan anyway.
And in case you’re just tuning in, client-side scanning is still universally bad, and Apple shouldn’t do it regardless.
What the complaint skips over is why the security community reacted so strongly to NeuralHash in the first place. We covered it at the time: the core objection from security researchers wasn’t that detecting CSAM is bad. It was that you cannot build client-side scanning infrastructure that only scans for CSAM. Once you build the pipe, you have built the pipe.
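The "pipe" objection is easy to see in code. Here is a minimal, hypothetical sketch of client-side hash matching — the function names and toy hash are mine, not Apple's actual NeuralHash — whose point is that the matching machinery never knows or cares what the target hashes represent:

```python
# Hypothetical sketch of client-side hash matching. The toy hash below
# stands in for a real perceptual hash (NeuralHash, PhotoDNA, etc.);
# it is NOT Apple's implementation.

def perceptual_hash(image_bytes: bytes) -> int:
    # Toy 16-bit stand-in so the sketch runs end to end.
    return sum(image_bytes) % 65536

def scan(image_bytes: bytes, target_hashes: set[int]) -> bool:
    # The matching logic never inspects *why* a hash is on the list.
    return perceptual_hash(image_bytes) in target_hashes

# Whoever controls the target list controls what gets flagged:
csam_list = {perceptual_hash(b"known-bad-image")}
expanded_list = csam_list | {perceptual_hash(b"political-meme")}

print(scan(b"political-meme", csam_list))      # False
print(scan(b"political-meme", expanded_list))  # True
```

Swapping one set of hashes for another requires no new engineering, which is exactly why researchers argued the infrastructure itself, not the initial target, is the danger.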