
> Apple had the CSAM scanning controversy.

You mean like Google’s recent CSAM controversy? At least Apple never proposed using ML to find new CSAM and report you to the police with a single violation…



No, he meant Apple's CSAM controversy. Apple wanted to scan files on your device, while Google scans stuff you upload to their cloud.


Not really.

It was a controversy created by a poorly written Apple press release that unveiled two distinct features, and subsequently ginned up by an even worse EFF blog post.

CSAM scanning was all about iCloud, just like every other vendor, and the benefit was a roadmap to E2E storage.

The more confusing thing was the explicit content notifications for parents for their minor children in family plans.

So instead, just like everyone else, you almost certainly have CSAM scanning happening in iCloud directly. All of your content is just a subpoena away. Thanks EFF.


Running this kind of stuff is already a foot in the door. Next they could decide to scan your device even if you don't upload anything anywhere. For your safety, of course: you might be in possession of objectionable material, like photos of MAGA hats or other extremist paraphernalia.

So this is a big red line. If you want to upload to iCloud, fine, scanning there is fair game, and you can always choose not to upload to iCloud. But your device should be your device, respecting your privacy.


My minor child’s device is my device.

As a parent, I may want to opt into a service on my managed family device so I know if my child's fellow ninth-grader girlfriend is sending pictures that may be a crime to possess, or vice versa. That is exactly what Apple did.

Like I said, you're conflating two things that are not the same, because you've been fed a steady diet of bad information stemming from Apple's unusually poor communications and the EFF's inaccurate blog post.


Apple scans files on your device to compare against known CSAM content.

Google scans your content for potential CSAM based on AI.

One can imagine the false positive rate for the Google approach is likely higher.
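To make the distinction concrete: matching against a fixed database of known hashes only flags near-duplicates of already-known material, whereas a classifier tries to judge novel content. A toy sketch of the known-hash side (illustrative only; the 64-bit hashes, `KNOWN_HASHES` set, and distance threshold are made up, and this is not Apple's actual NeuralHash):

```python
def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of known material.
KNOWN_HASHES = {0xDEADBEEFCAFEF00D, 0x0123456789ABCDEF}

def matches_known(image_hash: int, max_distance: int = 4) -> bool:
    """Flag only near-duplicates of already-known images."""
    return any(hamming(image_hash, h) <= max_distance for h in KNOWN_HASHES)

# A near-duplicate (two bits flipped) matches; an unrelated hash does not.
assert matches_known(0xDEADBEEFCAFEF00D ^ 0b11)
assert not matches_known(0x0000000000000000)
```

Nothing here ever "decides" whether an unseen image is objectionable; it can only confirm closeness to a known hash, which is why its false-positive profile differs from an ML classifier's.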


Apple scans files on your device, using your resources (storage, compute, power).

Google scans files on their devices, using their resources (storage, compute, power).

It is you who decided to upload files to Google's cloud.

However, it is difficult to use your own device if you don't have privacy there.


Apple's proposal means that your data would never leave your device unencrypted: they can't scan it on their servers precisely because the servers never see it in the clear in the first place.


The feature Apple proposed would only apply to files you were uploading to iCloud. Turning off iCloud photos prevented scanning.

The whole point of the Apple system was to enable end-to-end encryption so Apple couldn't view your data on their servers. The only way they could get access is when a threshold of likely-CSAM files was met, i.e., the keys are sharded.
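The threshold idea is essentially secret sharing: decryption material is split so it can only be reconstructed once enough shares (matches) accumulate. A minimal Shamir-style sketch in Python (illustrative only; Apple's actual construction is more involved than this):

```python
import random

PRIME = 2**127 - 1  # a Mersenne prime; all arithmetic is mod PRIME

def make_shares(secret: int, threshold: int, n: int):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(123456789, threshold=3, n=5)
assert recover(shares[:3]) == 123456789   # any 3 shares suffice
assert recover(shares[1:4]) == 123456789  # fewer than 3 reveal nothing
```

Below the threshold, the shares are information-theoretically useless, which is the property the "keys are sharded" claim relies on.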

With the Apple approach you have considerably more privacy.


For now. The foot would be in the door to enable it system-wide.


Why would they care about getting a foot in a door that they have the keys to and is wide open?

Modern SoCs have security domains that are invisible to the main CPUs. Apple can run just about anything it wants on the security coprocessor and users will be none the wiser.

The American DOJ, FBI etc do not want end-to-end encryption and use ‘think of the children’ as an excuse. This gives Apple the plausible pushback to say ‘We have thought of the children…’


Politicians feet not Apple's.


Apple doesn't scan files on your device; they backtracked on the plan (for now, at least).


Apple absolutely scans your files on iCloud.


They intended to, but the backlash was too strong. Not a vote of confidence in them either way.


Apple only wanted to scan files on your device once you decided to upload them. In other words, if you never actually upload anything, nothing would ever be scanned.


Nobody is mad about cloud providers scanning for objectionable content. Everyone is mad because Apple is using the hardware you paid for to perform an operation you're paying them to provide. It's like paying for an API that gives you the weather, and all the endpoint returns is a system call for your phone's thermometer.

It goes to show that Apple isn't afraid to undermine their own experience to implement adversarial software services. Combined with the way they "respect" the privacy of Chinese citizens, I don't think Apple actually cares about our privacy. It would seem that instead they're testing the waters for more radical changes[0], like using on-device machine learning to detect suicidal users or terrorists. And who's going to object when that happens, right?

[0] Source: the past century of US political discourse.


The guy who had his Google account deleted for sending pictures of his children to his doctor is definitely upset about cloud providers scanning his content!


I'm sure he is, but rules are rules. Google cannot store that content for any purpose whatsoever: there are very specific regulatory standards in place for storing medical data, and Google Drive/iCloud doesn't comply with those standards. They (both!) go as far as to tell you that in the EULA, so the context of the situation doesn't even matter in the first place.

The more bothersome thing (to me) is client-side scanning. Apple is setting a precedent that allows any first-party manufacturer to work against the user's interests as long as there's a large enough common enemy. It should go without saying, but that's a horrible idea!

If Google were doing the same thing on their Pixel phones, I'd be slamming them equally as hard. It's simply depressing to watch a company claim they know best, while proceeding to use your phone to run their regulatory compliance code.


Taking a picture of something on your own phone is not "medical data," and they weren't doing ToS compliance: they were scanning for novel CSAM in a way they're not required to, reported it to the police, and are now refusing to undelete his account because of their never-give-anyone-customer-service policy.

(Even if you took a picture to send to your doctor through an encrypted app, it just has to end up in the photo library to be cloud synced, unless you remember to hand delete it.)


Not sure what the distinction is. Either Apple uses the hardware you paid for to scan [before uploading] or uses the service you paid for [during upload] to scan.

The practical difference is nil. The fact of the matter is, either way, if you don't upload you won't be scanned, if you do upload you will be. End result is the same regardless of whether you use OneDrive, Box, Dropbox or Google Drive (who all implement CSAM scanning)


The practical difference is that I pay for the electricity they use, now. The wear-and-tear of their service is now imparted onto my phone.

If Apple wants to run that code, they're welcome to do it on their own servers; they pay for the hardware, they decide what software it runs. I shouldn't have to waste my storage space on their NeuralHash model, and I shouldn't waste my CPU cycles on their regulatory compliance. Arguing anything else is tantamount to saying that Apple owned your phone from the start, and if that's the case, then antitrust legislation can't come soon enough.


seems like a dumb thing to argue about. everything you're saying is moot as you can just plug in your phone to your computer and upload photos without scanning, or use a self-hosted file upload service, etc.

your compute and phone resources are already used for analytics and other things that you didn't explicitly tell the phone to do.

you also greatly overestimate how much CPU it takes to compare a hash, lol

fact of the matter is, you're only scanned when you're using apple services.


> your compute and phone resources are already used for analytics and other things that you didn't explicitly tell the phone to do.

I didn't know LineageOS sent analytics, it never showed up on Netguard. Curious!

If you think this is a dumb thing to argue about, then by all means; don't argue about it! If you're satisfied with your iPhone and convinced that Apple can do no wrong, then certainly don't let me get in your way. We're talking about the largest company in the world, though, and if you're unwilling to offer them a little criticism for objectively degrading the iPhone experience, then I don't see the point in discussing this.

> you also greatly overestimate how much CPU it takes to compare a hash, lol

Do I? If it's not such a big hassle, then Apple can just run it on their own servers. They own your iCloud decryption keys anyways, it shouldn't be too big of a deal.


We're talking about Apple devices here so LineageOS is irrelevant. Please keep up.

> Do I? If it's not such a big hassle, then Apple can just run it on their own servers. They own your iCloud decryption keys anyways, it shouldn't be too big of a deal.

Privacy is better preserved doing it on device compared to on their servers.

anyway, nothing more to discuss here.


You might think that it's silly to complain about having to directly pay for an adversarial program to run, but try to imagine how it could bother other people.


That upload that is the only way to load data onto your phone for many apps, that the phone is auto-opted into and that only Apple can offer? That one? The one that literally no competitor can replace since it uses special Apple entitlements?


I'm not sure what you're talking about? iOS has implemented interfaces to use things like Google Drive, One Drive, Dropbox, etc, baked into the file system.


1. There is no filesystem (exposed), there are per-app sandboxes with the ability to move files from another sandbox

2. It only allows single file actions, no sync/offline cache/etc. Most apps don't meaningfully work well with this - there is no "sync these ten comics to this device at all times"

Most apps still only use their local file sandbox.


Any application that saves files using the standard file selector allows you to choose any installed storage solution.



