
Apple to install spyware on Phones / Macs / Etc
reader50
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Aug 6, 2021, 03:19 AM
 
Initial Story

Apple Confirms Story
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.
They're looking for child porn. By scanning everyone's photo library. Your permission will not be asked for, nor will your refusal be honored.

It's using a hashing technique, involving machine learning. Law enforcement will provide image hashes of known porn pics. Any of your images that are substantially similar will be flagged. If there are too many hits, it gets reported to Apple, including the actual photos for human analysis.
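To illustrate how a hash can match "substantially similar" images at all, here is a toy "average hash" (aHash) in Python. This is not Apple's NeuralHash, just the simplest perceptual-hash idea: visually similar images map to the same bits.

```python
# Toy perceptual "average hash" over a tiny grayscale image given as rows
# of pixel values. Illustration only; Apple's NeuralHash is ML-based and
# far more sophisticated, but the matching property is the same.

def average_hash(pixels):
    # Each bit records whether a pixel is brighter than the image's mean.
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = ''.join('1' if p > mean else '0' for p in flat)
    return int(bits, 2)

original = [
    [200, 200,  10,  10],
    [200, 200,  10,  10],
    [ 10,  10, 200, 200],
    [ 10,  10, 200, 200],
]

# Re-encoding at lower quality nudges every pixel value slightly...
recompressed = [[p + 5 for p in row] for row in original]

# ...but the bright/dark pattern survives, so the hash is identical.
assert average_hash(original) == average_hash(recompressed)
```

This is why resizing or recompressing a photo doesn't evade the match, unlike a cryptographic checksum.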

Sounds like law enforcement can provide an image hash for anything they'd like to find. Say, for anti-government signs in a picture. Or BLM signs. Or a face they'd like to arrest. Or anything else they might not like. Drug smoking hardware, or needles perhaps. Once you compromise end-to-end encrypted communication, many governments will want to look for different things.

Oh yes, and teen sexting. If they spot an explicit image from a contact (who might be your wife or husband), they'll blur it, warn you that you don't need to open it, and notify your parents. Perhaps law enforcement eventually.

The spyware comes in the next major OS versions. Set your Apple products to not update as needed.
The changes will roll out "later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey," Apple said. Apple will also deploy software that can analyze images in the Messages application for a new system that will "warn children and their parents when receiving or sending sexually explicit photos."
Avoiding compromised OS versions is only a temporary solution. As Apple has sold out, our privacy is no longer guaranteed. They've stopped respecting who owns the product you paid for. And old OS versions eventually stop getting security updates.

Android may get pressured to compromise too. Along with Windows. It looks to me like fully-open-source is the only safe place, where malware cannot stay hidden. Linux on desktop, and fully open-source firmware/OS on a smartphone.
     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Aug 6, 2021, 07:37 AM
 
"Think of the children!" has long been the wedge used to chip away at privacy and the like. Every parent with a couple of cute pictures on their phone of their kid in the bathtub blowing soap bubbles should be very afraid of this. I'm extremely disappointed (and not a little scared) that Apple is heading down this road.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Aug 6, 2021, 09:51 AM
 
If they’re hashing, it won’t trigger on “substantially similar” - it will trigger on specific known images. I would guess that there is some cleverness about looking at a square in the image so you can’t fool it with a quick crop or mirror, but beyond that, it is a specific image they’re flagging. Apple is referencing the specific database it is using, so law enforcement sneaking something in seems unlikely.
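A quick stand-in demo of the exact-match point (using SHA-256 purely as an example; the hash Apple actually uses is not specified here): under a cryptographic hash, flipping a single bit of a file yields a completely unrelated digest, so only byte-identical copies match.

```python
import hashlib

# Stand-in "image" bytes; any one-bit change (a re-save, a crop, a flipped
# pixel) gives an unrelated digest under a cryptographic hash.
image = bytes(range(256)) * 4
tweaked = bytearray(image)
tweaked[0] ^= 1  # flip one bit

h1 = hashlib.sha256(image).hexdigest()
h2 = hashlib.sha256(bytes(tweaked)).hexdigest()

assert h1 != h2  # exact-match only; near-duplicates need a perceptual hash
```

That gap between exact matching and near-duplicate matching is precisely where the "cleverness" has to live.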

The other thing, about scanning images in iMessage, also appears to be connected to parental controls.

Neither of these are quite as bad as the initial story made it sound. What is concerning is that with the infrastructure in place, you can make it look for anything. But honestly, what did you expect Apple to do? So many politicians around the globe, and in particular in the US, have been pushing to ban encryption unless Apple et al does something. This is what they came up with. Good? No - absolutely awful. But banning encryption is worse.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
andi*pandi
Moderator
Join Date: Jun 2000
Location: inside 128, north of 90
Status: Online
Aug 6, 2021, 09:52 AM
 
Originally Posted by Thorzdad View Post
"Think of the children!" has long been the wedge used to chip away at privacy and the like. Every parent with a couple of cute pictures on their phone of their kid in the bathtub blowing soap bubbles should be very afraid of this. I'm extremely disappointed (and not a little scared) that Apple is heading down this road.

^exactly.
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Aug 6, 2021, 09:54 AM
 
Originally Posted by Thorzdad View Post
"Think of the children!" has long been the wedge used to chip away at privacy and the like. Every parent with a couple of cute pictures on their phone of their kid in the bathtub blowing soap bubbles should be very afraid of this. I'm extremely disappointed (and not a little scared) that Apple is heading down this road.
What to call it when you want to shut down any dissent:
- Child abuse
- Terrorism

Apple to install spyware on Phones / Macs / Etc
^ Extremely misleading. The phone creates a hash based on the photo, then uploads the photo and hash to iCloud. Apple compares the hash uploaded to iCloud against its record of hashes of known bad photos.

Apple does not "backdoor" into your phone to look through the pictures. If you don't upload photos to an iCloud library, nothing happens. The only thing Apple sees is the hash that your phone uploaded to iCloud when you chose to upload your photos to iCloud.
     
Doc HM
Professional Poster
Join Date: Oct 2008
Location: UKland
Status: Offline
Aug 6, 2021, 09:55 AM
 
I am profoundly shocked (and, like Thorz, actually scared) at the seismic change in surveillance this enables. In one quick and completely undebated step, Apple (and then others) will roll out a global, industrial-scale surveillance network.
Won't somebody think of the children's children! (to pinch a phrase)

It's not the photo of your kids in the bath etc. that's the problem, as I assume law enforcement won't have a similar hashed image to compare against. It's the ability to add hashes of any other image into the system in order to trigger match results.
This space for Hire! Reasonable rates. Reach an audience of literally dozens!
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Aug 6, 2021, 10:52 AM
 
Originally Posted by Doc HM View Post
It's not the photo of your kids in the bath etc. that's the problem, as I assume law enforcement won't have a similar hashed image to compare against. It's the ability to add hashes of any other image into the system in order to trigger match results.
That's interesting - they mention a "law enforcement database," but it also opens up the possibility of scanning for "revenge porn," copyrighted images, or any other photos that someone doesn't want out there on someone else's iCloud account.
     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Aug 6, 2021, 11:21 AM
 
Or, searching for anyone else who happened to have been sent pics of that BLM protest. Or, that 3-Percenter rally. Handy way to identify anyone the cops or other authorities consider "suspect" or "potential terrorist". The rabbit hole this plan digs is practically bottomless insofar as misuse goes.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Aug 6, 2021, 12:58 PM
 
Originally Posted by Laminar View Post
^ Extremely misleading [thread title]. The phone creates a hash based on the photo, then uploads the photo and hash to iCloud. Apple compares the hash uploaded to iCloud against its record of hashes of known bad photos.
It's not an MD5 hash. Rather something generated by AI, designed to match similar pics. Resizing the pic, or lowering the quality, will produce the same hash. So it will indeed match similar photos (or documents if tasked for file searches in future). So a hit on your baby pics is indeed possible. And yes, it's done on the local device - the one you paid for. Even if only currently applied to iCloud uploads, the searches still run on your property.

Real-world equivalent: the cops will enter your home daily to check for contraband. No warrant, no evidence of probable cause.

They promise to close their eyes to anything but what they're looking for. It doesn't matter if you're innocent until proven guilty (and in nearly all cases, actually innocent). They'll search anyway, daily, and you can't say 'No'. Apple is acting as an agent of government, so they are indeed searching for the cops.
Originally Posted by Laminar View Post
Apple does not "backdoor" into your phone to look through the pictures.
Apple places a scan utility on your phone without permission. One that uploads info about some of your files without your permission. With the option to decrypt the files for human analysis if they get "too many" hits. This isn't a backdoor? The ability to decrypt your files doesn't sound like end-to-end encryption to me.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Aug 6, 2021, 01:14 PM
 
Originally Posted by reader50 View Post
It's not an MD5 hash. Rather something generated by AI, designed to match similar pics. Resizing the pic, or lowering the quality, will produce the same hash. So it will indeed match similar photos (or documents if tasked for file searches in future). So a hit on your baby pics is indeed possible.
Apple doesn’t say that at all, and quite frankly it doesn’t really make sense.

https://www.apple.com/child-safety/

They say that it is a cryptographic hash that is going to be compared to a number of hashes stored on your device. If Apple were to run some sort of ML algorithm on all your images and then try to match each one to a large database of images, it would take forever and toast your phone while doing it. What I think they’re doing is computing hashes for a number of squares in the image and comparing them to the stored hashes, and if any one image hits on too many squares, it is flagged.
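A rough sketch of that per-square guess (this is speculation about the mechanism, not Apple's documented design): hash fixed tiles of the image and flag a candidate when enough tile hashes match the stored set, so a small edit doesn't break the match.

```python
import hashlib

def tile_hashes(pixels, tile=2):
    """Hash every non-overlapping tile x tile block of a grayscale image."""
    hashes = set()
    for r in range(0, len(pixels), tile):
        for c in range(0, len(pixels[0]), tile):
            block = bytes(pixels[rr][cc]
                          for rr in range(r, r + tile)
                          for cc in range(c, c + tile))
            hashes.add(hashlib.sha256(block).digest()[:8])
    return hashes

known = tile_hashes([[1,  2,  3,  4],
                     [5,  6,  7,  8],
                     [9, 10, 11, 12],
                     [13, 14, 15, 16]])

# A lightly edited copy: one tile's pixels changed, three tiles intact.
edited = tile_hashes([[1,  2,  3,  4],
                      [5,  6,  7,  8],
                      [9, 10, 99, 12],
                      [13, 14, 15, 16]])

matches = len(known & edited)
assert matches == 3  # enough tile hits to flag, despite the edit
```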

Compare to how the other feature is described. There it is on-device machine learning, not a hash. I don’t think the techniques are even similar.

And yes, it's done on the local device - the one you paid for. Even if only currently applied to iCloud uploads, the searches still run on your property.
Yes, but it is for privacy reasons, not to save on compute power.

Apple places a scan utility on your phone without permission.
I’m sure it is mentioned in the EULA…

One that uploads info about some of your files without your permission. With the option to decrypt the files for human analysis if they get "too many" hits. This isn't a backdoor? The ability to decrypt your files doesn't sound like end-to-end encryption to me.
Apple can decrypt data on your iCloud account, we knew that already.
     
Laminar
Posting Junkie
Join Date: Apr 2007
Location: Iowa, how long can this be? Does it really ruin the left column spacing?
Status: Offline
Aug 6, 2021, 02:25 PM
 
Originally Posted by reader50 View Post
Real-world equivalent: the cops will enter your home daily to check for contraband. No warrant, no evidence of probable cause.
I'd say it's closer to the cops checking your safety deposit box daily for contraband. You have to put it in the safety deposit box first. If you keep your contraband at home (no iCloud uploads), they can't check it.

Apple places a scan utility on your phone without permission.
A hash generator. The scanning happens on the server side, if I'm reading the articles correctly.

This isn't a backdoor?
Again, maybe I'm reading the article wrong, but I see No iCloud = No scan. They're not "going into" your device. You have to send files off of your device and onto Apple's servers. That's when they'll check.

edit: I'm also not saying this isn't a slippery slope to more surveillance. I'm just saying at this point it isn't a backdoor.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Aug 6, 2021, 03:14 PM
 
I’m upset that there is a perceived need for this sort of response. But I plan to sit back and wait, and see how this plays out.

If there is a substantial level of child porn traffic via iPhones, this is, while quite intrusive, a logical step. If not, I’m disappointed that Apple, which as a company has already proven to be less interested in customer interests than they want to appear, has played this card. Either way, I’m not happy about it at all.

Glenn -----OTR/L, MOT, Tx
     
OAW
Addicted to MacNN
Join Date: May 2001
Status: Offline
Aug 6, 2021, 04:09 PM
 
It's my understanding that this is an opt-in feature for parents with young children. If that is the case then I'm far less concerned. If not ...

OAW
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Aug 6, 2021, 04:38 PM
 
Everything seems to be connected to photos stored in iCloud... What happens with the on-phone software if you don't put your pictures in iCloud?

     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Aug 6, 2021, 05:38 PM
 
Originally Posted by ghporter View Post
Everything seems to be connected to photos stored in iCloud... What happens with the on-phone software if you don't put your pictures in iCloud?
That’s kind of the thing, isn’t it? Once all the pieces are in-place, it’s going to be quite easy to expand the process to your phone, too. And, it’s a fairly certain bet that some government or another is going to strongly press to look at the stuff on peoples’ phones. Because of the kids, of course.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Aug 7, 2021, 02:25 AM
 
This just popped up on my radar the day before yesterday or yesterday. (I'm very busy at work, so everything is a blur.) I really need to read up on the implementation to temper my reaction.

But the first question is: why? What is the need for such a system? Usually, the roadblocks when it comes to child pornography prosecutions are not where you'd think. I don't know whether the information is still up to date, but a few years ago there was a big scandal where pedophiles were not charged because the evidence could not be reviewed in a timely fashion. The work is psychologically very demanding for people with normal sexuality, so there were simply very few people willing and able to review the evidence.

Then there is the technical implementation. From what I can tell, it uses hashes, for now. If it doesn't, that'll be a major issue. The US has quite prudish standards when it comes to nudity; in Germany, Austria, and many other countries it is normal for very young children to play naked in the garden in the summer. My mom has pictures of me butt naked, and I have pictures of my daughter and me in the bathtub together. If these somehow get automatically classified as problematic, that'd be a huge issue.

Even if they use hashes and do not analyze the photos (which seems to be what they are doing now), we need to rely on the hashes being 100 % correct, and on collisions between hashes being handled correctly (since a hash is just a short checksum, the map from files larger than the hash to hash values is necessarily many-to-one). What concerns me the most is that this could be a hook into the user's data.
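The many-to-one point can be made concrete with a toy collision demo: truncate SHA-256 to 16 bits and distinct inputs collide after only a few hundred tries (the birthday bound for a 16-bit space is about 2^8 = 256 inputs). Real 256-bit hashes make accidental collisions astronomically unlikely, but the mapping is still many-to-one in principle.

```python
import hashlib

def tiny_hash(data: bytes) -> int:
    # Truncate SHA-256 to 16 bits: a deliberately tiny hash space.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], 'big')

seen = {}
collision = None
for i in range(100_000):
    h = tiny_hash(str(i).encode())
    if h in seen:
        collision = (seen[h], i)  # two distinct inputs, same hash
        break
    seen[h] = i

assert collision is not None
```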
I don't suffer from insanity, I enjoy every minute of it.
     
Doc HM
Professional Poster
Join Date: Oct 2008
Location: UKland
Status: Offline
Aug 7, 2021, 04:07 AM
 
On a side issue: whenever a tech, medical, engineering, or any other company claims a “trillions to one” chance of error, I automatically call bullshit.*
Such claims are usually followed almost immediately by a slew of reasons why the resulting failures are not representative, have been “fixed”, or are somehow edge cases.

*see the entire 2007 financial meltdown for evidence.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Aug 7, 2021, 06:59 AM
 
Originally Posted by OAW View Post
It's my understanding that this is an opt-in feature for parents with young children. If that is the case then I'm far less concerned. If not ...

OAW
Two different features that were leaked together by someone who is trying to force a change.

Feature 1 is for children. A child gets an image sent to them over iMessage. The phone will analyze it and say “We think this is a dick pic. Are you sure you want to view it? Also, we will tell your parents about it.” This is entirely opt-in and for children only.

Feature 2 is for everyone. It will scan photos stored on your phone to see if any of them are from a specific database of child-abuse imagery. This applies only to images stored in iCloud.
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Aug 7, 2021, 07:04 AM
 
Originally Posted by ghporter View Post
Everything seems to be connected to photos stored in iCloud... What happens with the on-phone software if you don't put your pictures in iCloud?
Nothing, because Apple can’t access your photos if you don’t put them on iCloud.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Aug 7, 2021, 08:37 AM
 
Originally Posted by Thorzdad View Post
That’s kind of the thing, isn’t it? Once all the pieces are in-place, it’s going to be quite easy to expand the process to your phone, too. And, it’s a fairly certain bet that some government or another is going to strongly press to look at the stuff on peoples’ phones. Because of the kids, of course.
Which process?
Are you talking about the Parental Control feature for kids 12 and under on a family account, that analyses images with Machine Learning to detect "sensitive content"? Because that image analysis engine has been part of iOS/macOS for years now; it just hasn't been directly applicable to the Messages app.

Or are you talking about the generation of hashes from images? There is *zero* analysis of content in that - none at all. It's merely the generation of a mathematical representation that can be compared to others. (Remember the confusion about "fingerprints being stored on the iPhone" for Touch ID? Vs. ACTUAL IMAGES of fingerprints being extracted from other phones?)
The issue I see here is that this database could be arbitrarily expanded — to the "tank man" picture from Tiananmen Square, as Gruber notes, or those particularly unflattering photographs of 45, or any other despot's personal peeves.
Local laws could be created to exploit this feature, I suppose. While it is currently limited to comparing hashes when uploading to iCloud, I see no technical reason why it couldn't be changed to scan, and send reports on, on-device content as well.

But here's the thing: If that was the way it was going, then all of this would be completely unnecessary, because Apple *already has* full content analysis via machine learning running on all devices' photo libraries. They could just report that.

As I understand it, this whole thing exists ONLY because Apple does NOT want to go down that lane. There'd be no reason for it if they did.

It seems to me that Apple's motivation (beyond "doing the right thing") is finding a way to avoid hosting illegal content without having to actually look at people's content.
     
MacNNFamous
Senior User
Join Date: Jul 2020
Status: Offline
Aug 7, 2021, 12:34 PM
 
So if I don't use icloud at all, I'm good?

I haven't used it and after the fappening I likely never will. I take way too many pics of my butthole in seductive poses pretending I am hunger games.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Aug 7, 2021, 07:21 PM
 
I’m afraid you’ll be fine.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Aug 10, 2021, 06:52 AM
 
Haven't read the full thread, but this is a fascinating example of American anti-government paranoia in action. So a system designed to catch child abusers, the non-partisan worst of the worst, is bad because it might be abused to catch people who don't like their government. But guns (which have done far more than "might be used" to murder innocent citizens, by the millions over the years) are fine for everyone to have. It's truly bonkers. Maybe having police at all is a bad idea? You know, in case they arrest anyone? Because they might arrest someone who doesn't like them. Or their bosses.

I could understand minorities who have actually been oppressed by their own government more recently than 200+ years ago being overly wary of it, but it's quite some achievement of psychology and propaganda that such a level of anti-government sentiment has been maintained for such a long time. And in no small part by one half of said government.


Apple was building itself a cast-iron reputation for personal privacy and security, and this is placing that in grave jeopardy. With considerable help from the idiot "Apple is looking at your naked selfies!" media. Maybe if they'd spent a few more years building that rep, they'd have had the credibility to pull it off. What a shame.

It's a clever enough idea, but it relies on the notion that a relatively small amount of objectionable material is cycled and recycled among the scumbags that share this stuff online. It's absolutely true that they do, but how long will it take for one of them to get their hands on a tool that simply re-encodes all the images in a way that alters the hashes? I can't see that being remotely difficult. This will only catch the luddite pedophiles.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Aug 10, 2021, 07:41 AM
 
Originally Posted by Waragainstsleep View Post
Apple was building itself a cast-iron reputation for personal privacy and security, and this is placing that in grave jeopardy. With considerable help from the idiot "Apple is looking at your naked selfies!" media. Maybe if they'd spent a few more years building that rep, they'd have had the credibility to pull it off. What a shame.
This.
I think this will have lasting consequences for Apple's image as a privacy-conscious company.
Originally Posted by Waragainstsleep View Post
It's a clever enough idea, but it relies on the notion that a relatively small amount of objectionable material is cycled and recycled among the scumbags that share this stuff online. It's absolutely true that they do, but how long will it take for one of them to get their hands on a tool that simply re-encodes all the images in a way that alters the hashes? I can't see that being remotely difficult. This will only catch the luddite pedophiles.
The issue is that the mechanism is then in place. I don't think Apple would be able to resist pressure from Chinese authorities who want to use it to censor images that dissidents and critics might share. Then you have the issue of false positives. It is just a bad idea, IMHO.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Aug 10, 2021, 11:12 AM
 
Here in the States, we have a history of things instituted “for all the best reasons” being misused, abused, and turned invasive and, in particular, politically weaponized. Criminalizing pot, for example, or Prohibition, or “the War on Drugs”; the list goes on…

To me, not only is this a complete reversal of Apple’s stated intention to protect customer privacy, it sets a very scary example of publicly announcing that they are going to install spyware - and that the user can’t do anything about it.

First, (I may have mentioned this earlier) are iPhones particularly dominant in capturing abusive pictures? Maybe, but I would really like a real, honest rationale for why this spyware is so important that it has to go on every iPhone.

Second, the supposed child-protective feature, informing children about inappropriate content in messages, etc., is really only useful on phones that are used by minor children. My son just turned 34, so that’s out (and he uses an Android phone anyway).

Finally, why announce this in the first place? I’d think it would be more effective if pervy perps got that knock on the door and later had the contents of their iCloud Photo Library used against them. That seems more likely to catch bad guys in the act than announcing the scheme up front. Or is it to discourage the bad guys from using Apple products? It really doesn’t seem to make sense to me.

     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Aug 10, 2021, 12:09 PM
 
Originally Posted by ghporter View Post
To me, not only is this a complete reversal of Apple’s stated intention to protect customer privacy, it sets a very scary example of publicly announcing that they are going to install spyware - and that the user can’t do anything about it
It's such a seemingly huge backward step by them, in terms of their public image (legit or not) as a privacy-focused company, that I keep wondering if there isn't something going on behind the scenes that pushed Apple to develop this and roll it out this way. A calculated move to head-off a government-imposed backdoor? It's just such an odd development.
     
Spheric Harlot
Clinically Insane
Join Date: Nov 1999
Location: 888500128, C3, 2nd soft.
Status: Offline
Aug 10, 2021, 02:13 PM
 
It has been sensibly postulated that this is the necessary step to end-to-end encryption of all iCloud content.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Aug 10, 2021, 03:09 PM
 
Nobody addressing the American anti-government paranoia? Just plowing on with it?
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Aug 10, 2021, 05:13 PM
 
@War,

Smartphones (and computers) are extensions of our minds. We record our thoughts, experiences, and movements. Who we communicate with, and how long. Short of a mind-reading device, I can't think of anything that comes closer to capturing our thoughts.

Apple is getting a foot in the door, to surveil our smartphones. With a utility you presumably cannot disable, that works against your interests. At the behest of government actors. Once you lose a right, it can be very difficult to get it back. It's not important that the violation is limited today.

Example: postal mail (snail mail) is protected by federal law. The government cannot open people's mail without a warrant, which requires probable cause, etc. This status was not extended to email when it was the new thing. Now governments (including the USA) routinely intercept & read email without warrants, and this position is entrenched. Businesses and ISPs do it too. We didn't defend a right our grandparents took for granted. And now that email is dominant, that right is effectively gone.

Second example:

[image: the "Tank Man" photograph from Tiananmen Square]
According to the WP article, there are only 4 versions of this picture in circulation. Any Chinese (or HK) citizen uploading this pic would likely be monitored or intercepted, based on past Chinese behavior. And Apple has bowed to Chinese demands before, such as locating iCloud servers within Chinese territory. I'm not clear whether cryptographic keys have been handed over, but physical access to the hardware is usually a definitive endgame.

Third example: when traffic light cams first rolled out, politicians faithfully promised to the press that the pictures would not leave the intersection. They were just a cheaper way to maintain the signal lights. I remember reading this promise in newspapers at the time. But ... over the years, those politicians retired. Mission creep happened.

Today, those cams feed to central servers. Cities can track you all over, and the redlight-ticket-camera businesses (using license plate readers) can track you across much of the country. Your movements beyond walking distance can be routinely monitored, and increasingly are. Another anonymity right our grandparents had, and we do not.

If we wait for "worse" examples to turn up, it will be too late. Accepting your premise (ride it out) ignores that, and has cost us a lot already. As to the focus on government, it is the one institution that can make practices illegal. But rather than banning new intrusive tools, it's taking advantage of them.
     
BLAZE_MkIV
Professional Poster
Join Date: Feb 2000
Location: Nashua NH, USA
Status: Offline
Aug 11, 2021, 11:13 PM
 
Since the scan happens on the device, not the server (a key part of the pitch), I think this runs afoul of the 4th Amendment. Just because Apple performs the search instead of the government doesn’t make it okay. So not only is it a slippery-slope case, it’s actually counterproductive.
     
Thorzdad
Moderator
Join Date: Aug 2001
Location: Nobletucky
Status: Offline
Aug 12, 2021, 08:05 AM
 
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Aug 12, 2021, 12:48 PM
 
Plenty of softball questions in that interview. Oh, and we learn of a 3rd component.
Interventions in Siri and search – A feature that will intervene when a user tries to search for CSAM-related terms through Siri and search and will inform the user of the intervention and offer resources.
Almost nothing further is mentioned on this subject. Searching for the wrong terms (or fat-fingered mistakes) will give warnings and helpful links to ... law enforcement? Doesn't sound like "bad searches" will be reported elsewhere, but the question does not get addressed. And this feature is not covered in Apple's FAQ (pdf).

On the plus side: parental notices on receiving (suspected) explicit photos can only be turned on for kids 12 and under. A minor positive: it doesn't enforce infant rules on teenagers. It will still blank out suspected pics and offer warnings, requiring a click through two pages of warnings to view (each?) suspected pic.

It seems likely the Messages feature will have problems with news reports. Automated systems have been known to block news articles with war photos, or showing various abusive situations caught on film.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Aug 12, 2021, 01:05 PM
 
I think we need to separate the two systems here since they are clearly very different in scope.

If I understand correctly, the "scan" of your photos for child porn is happening on the iCloud server, an optional service. And material manually checked by humans goes through a non-profit intermediary organisation, not straight to any government.
Is this any more risky or intrusive than a phone tap on your calls? I don't see it, tbh. I can see it would be nice to back it up with some legislation, but in the current political climate there is no way you are going to get any more rights enshrined in worthwhile law, because that would require a constitutional amendment, and there is no way the Republicans will cooperate with one of those as far as I can see.

The on-device system is the one for blocking nudity in your kids' text messages. It happens on device precisely so that the device owner can maintain control over the content they produce. If your kid is sending naked selfies, Apple will want no part of uploading those to their servers for any reason at all. Again, it's something you can opt out of. Or, more likely, have to opt into.

Can these things be open to abuse? Sure, anything can. But of the cited examples, where are the unlawful arrests of dissidents due to emails being read or traffic cameras tracking them? If anything, there are large numbers of domestic terrorists in the US who have been left well alone.
I have plenty of more important things to do, if only I could bring myself to do them....
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Aug 13, 2021, 02:25 PM
 
Apparently I was wrong about the CSAM check running on iCloud; both features run on the phones themselves.

https://www.macrumors.com/2021/08/13...afety-details/
I have plenty of more important things to do, if only I could bring myself to do them....
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Reply With Quote
Aug 13, 2021, 05:32 PM
 
This article seems to echo my thoughts about the iCloud-focus of the spyware.

I honestly don’t put any photos in iCloud because they tend to eat up the free iCloud space allotment. And my photos get backed up elsewhere, so I won’t lose them, but they aren’t where some rogue employee or lucky hacker can get at them. Though it would be interesting to see the face of that hacker when he/she/it came across the photos I have documenting some “interesting” surgical wounds…

Glenn -----OTR/L, MOT, Tx
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Aug 14, 2021, 09:16 PM
 
One of the other interesting takeaways from that interview I linked above is that the "scan" has to find around 30 matches before it flags anything for human review, and that review goes to Apple first and then a non-profit entity before it gets to government. That means there are two different organisations who presumably would not pass on anything dodgy inserted by a bad-acting government to said government.

As a note on that note, it's bizarre that we are talking about a government adding "dodgy" material to a database of child abuse. You wouldn't think it got any dodgier than that.
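For what it's worth, Apple's published technical summary says that ~30-match threshold is enforced cryptographically via threshold secret sharing: the server mathematically cannot decrypt any of the match "vouchers" until enough of them accumulate. A toy Shamir-style sketch of the underlying idea (the field size, function names, and API here are illustrative, not Apple's actual construction):

```python
import random

PRIME = 2**61 - 1  # a Mersenne prime; all arithmetic is in this field

def make_shares(secret, threshold, n):
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=30, n=100)
assert reconstruct(shares[:30]) == 42   # 30 shares suffice to decrypt
assert reconstruct(shares[:29]) != 42   # 29 shares reveal essentially nothing
```

Below the threshold the interpolated value is effectively random, which is why a handful of false positives can't expose anyone's photos to review.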
I have plenty of more important things to do, if only I could bring myself to do them....
     
Doc HM
Professional Poster
Join Date: Oct 2008
Location: UKland
Status: Offline
Reply With Quote
Aug 15, 2021, 09:01 AM
 
Originally Posted by ghporter View Post
Here in the States, we have a history of stuff instituted “for all the best reasons” being misused, abused, and turned into invasive and, in particular, politically weaponized. Criminalizing pot, for example, Prohibition, “the War on Drugs”, the list goes on…
We also have that tradition. I remember the last Labour government we had rolling out hugely intrusive powers solely to combat international terrorism. Oh, how they swore up and down that these powers would only be used in the absolute direst of emergencies, to literally save lives; the first use turned out to be prosecuting the old man protesting the Iraq war in the square outside Parliament.

So it goes.
This space for Hire! Reasonable rates. Reach an audience of literally dozens!
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Aug 15, 2021, 10:14 AM
 
Originally Posted by ghporter View Post
First, (I may have mentioned this earlier) are iPhones particularly dominant in capturing abusive pictures? Maybe, but I would really like a real, honest rationale for why this spyware is so important that it has to go on every iPhone.
They are not. Every cloud provider is reporting child porn to the authorities.

In 2020, Apple made 265 such reports. Google made 546,704. Facebook made over 20 million. Apple has previously been unable to scan iCloud accounts for these sorts of images, because they are encrypted. This is a way to do that.

Finally, why announce this in the first place? I’d think it would be more effective if pervy perps had that knock on the door and later had the contents of their iCloud Photo Library used against them. It seems more likely to catch bad guys in the act than this way. Or is it to discourage the bad guys from using Apple products? It really doesn’t seem to make sense to me.
Because someone leaked it and Apple wanted to get ahead of the news, because the leak made it sound a lot worse than it really was.

On the Siri/Search thing - this is nothing sinister, and it is only mentioned to make that classic shit sandwich (two good things and one bad). If you ask Siri about something like the score of some sports game it recognizes, it will show that score in a specially formatted box. If you try to ask it about something it doesn’t recognize, it will offer to search the web. This change will move a question about CSAM from the second result to the first.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Reply With Quote
Aug 15, 2021, 11:56 AM
 
I didn’t know about the leak, so that now makes sense.

Siri, Chrome, Safari, etc. keeping tabs on search history to “show relevant content” isn’t quite as unpleasant as it used to be for me. And this (in Chrome) has actually helped me construct better search queries; sometimes I can’t quite get the right term to show up in my brain, and my lame attempts have occasionally cued Chrome/Google to guide me toward the term I needed. (This is my word-finding going haywire, not some problem with me not really knowing what I’m looking for.)

War, I distrust appointees of our current administration, but less so than those of the prior group of thugs. Give someone power and you find out what sort of personal agenda they might have. I yearn for a day when public servants are actually motivated to serve the public instead of do something else.

Glenn -----OTR/L, MOT, Tx
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Aug 16, 2021, 06:20 AM
 
It continues to amaze me how the people most vocal about government being untrustworthy tend to be the ones voting for the least trustworthy candidates. I don't mean you lot btw.

I didn't hear about a leak; I figured Apple was announcing the features for political purposes, i.e. to get politicians off their backs about it.
I have plenty of more important things to do, if only I could bring myself to do them....
     
P
Moderator
Join Date: Apr 2000
Location: Gothenburg, Sweden
Status: Offline
Reply With Quote
Aug 16, 2021, 10:55 AM
 
The first link in reader’s post is a report of the leak. Ars is reprinting a Financial Times story that includes that “Apple declined to comment”. Other reports were not as kind as that one, and even the Ars piece includes a couple of things that are clearly incorrect and some very alarming comments.
The new Mac Pro has up to 30 MB of cache inside the processor itself. That's more than the HD in my first Mac. Somehow I'm still running out of space.
     
reader50  (op)
Administrator
Join Date: Jun 2000
Location: California
Status: Offline
Reply With Quote
Aug 19, 2021, 03:19 PM
 
90+ organizations dedicated to civil rights, digital rights, and human rights have sent Apple an open letter (pdf), asking them to scrap the image scanning. Both for child stuff, and the iMessages explicit-image scanning. Their reasons are much the same concerns hashed out here.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM), we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” said the letter, whose signatories include the Center for Democracy & Technology (CDT), the ACLU, PEN America, the Electronic Frontier Foundation (EFF), Access Now, Privacy International, Derechos Digitales, Global Voices, Global Partners Digital and groups from across Europe, Latin America, Africa, Asia, and Australia.
These organizations routinely push back against governments, defending journalists and others who have been charged with annoying a government. This isn't hypothetical for them, they deal with these issues every day.

Ars coverage

Once again, no mention of Apple's 3rd change. The "bad searches" changes across Apple platforms.
     
Waragainstsleep
Posting Junkie
Join Date: Mar 2004
Location: UK
Status: Offline
Reply With Quote
Aug 20, 2021, 04:09 AM
 
So it looks like the process is as follows:

The algorithm scans and compares hashes on device;
If it gets ~30 hits against the CSAM database, it's flagged up the chain;
A second version of the algorithm (potentially a slightly different one) re-checks the submitted material;
If things still look dodgy, a human is notified, presumably one at Apple;
It's then sent to a 3rd-party NGO, who check again;
It's then sent to law enforcement, I guess;
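That flow can be sketched in a few lines of Python. Purely illustrative: the exact-match SHA-256 hash below is a stand-in for Apple's NeuralHash perceptual hash (which is designed so near-identical images collide), and the function names and demo data are made up; only the ~30 threshold comes from the interview.

```python
import hashlib

THRESHOLD = 30  # approximate figure cited in the interview

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 only matches byte-identical
    # files, which is enough for a sketch of the matching logic.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_library(images, known_hashes, threshold=THRESHOLD):
    # On-device step: count matches against the supplied hash database
    # and only escalate once the threshold is crossed.
    matches = [img for img in images if image_hash(img) in known_hashes]
    if len(matches) >= threshold:
        return "escalate_for_review", matches  # server re-check, then humans
    return "no_action", []

# Hypothetical demo data, not real hashes:
known = {image_hash(f"flagged-{i}".encode()) for i in range(40)}
library = [f"flagged-{i}".encode() for i in range(31)] + [b"holiday.jpg"]
status, hits = scan_library(library, known)
print(status, len(hits))  # escalate_for_review 31
```

Below the threshold nothing leaves the device, which is the property the later human-review steps depend on.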

It seems like the potential weak links are the 3rd party NGOs in countries outside the US and whether or not Apple will say no if a government demands they compromise the system.
With the latter, it seems likely they would resist in smaller, less developed countries, but things might get very dicey if the UK or Australia or China were to make such demands.
I have plenty of more important things to do, if only I could bring myself to do them....
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 20, 2021, 08:47 PM
 
Learning more about what Apple has done and how it has implemented it has put me at ease a little: if you are implementing something like this while maximizing privacy, then this seems a good way to do it. Further, listening to the last two episodes of ATP, Siracusa speculated that the current implementation would allow for end-to-end encryption of your photos (i.e. Apple loses access to them) without impacting the functionality. They also brought up something interesting, namely that some countries (most notably in Europe) are preparing laws that would require functionality like this.

Don't get me wrong, I am still skeptical and we should keep our critical eyes on it, but it might not be as horrible as we make it out to be. Besides, if you keep your photos in e.g. Google's cloud, your digital stuff has been X-rayed, checked, and mined all along.
I don't suffer from insanity, I enjoy every minute of it.
     
Face Ache
Addicted to MacNN
Join Date: Jul 2001
Status: Offline
Reply With Quote
Aug 21, 2021, 04:41 AM
 
My initial thought after hearing this news was that Apple are going to sell a lot of phones in China.

I'm not sure I care, but it is a bit weird having your morals questioned (perpetually) by an electronics manufacturer. Imagine if all of your appliances reported your activities to the government. You'd be less likely to have a wank if you suspected your Fitbit was taking notes. If studies showed paedos liked burnt toast you'd suddenly have to watch yourself around the toaster. "Why are they carting Phil away?" "Burnt the toast." <nods knowingly>

It's a weird future we've built for ourselves. Cars have black boxes that can be (and are) used against their owners. Cameras everywhere... facial recognition... phone tower tracing... AI knowing more about us than we know about ourselves...ah fuck it... I'm going down the road to throw my electronics into the sea. I really could chuck it all away and spend my life down there - the internet has given morons a pipeline into my eyeballs that I could do without.

Am I alone in barely using my iPhone?
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 21, 2021, 05:59 AM
 
Originally Posted by Face Ache View Post
My initial thought after hearing this news was that Apple are going to sell a lot of phones in China.
I think it is important to keep in mind that Apple has been a holdout up until now. The numbers are proof of that: Apple reported significantly less than the other big cloud companies (by a factor of 1,000).
Originally Posted by Face Ache View Post
I'm not sure I care, but it is a bit weird having your morals questioned (perpetually) by an electronics manufacturer.
There is another side to this, I think: Facebook, Google, Twitter, Instagram, and many other cloud providers/social media networks have been shirking their responsibility when it comes to damaging content. I don't need to say much about Facebook: it has become a cesspool of misinformation, and is very strict when it comes to nipples but doesn't care much about bullying, hate speech, and conspiracy theories. Twitter was so courageous it dared to ban Trump two weeks before he left office. I think you can also view Apple's efforts in this light: they have not been very proactive (I'm sure they knew they were reporting a tiny fraction of what the other companies were), and perhaps they wanted to do this the right way. Plus, their solution allows for full end-to-end encryption of user photos; we will have to see whether Apple implements this in the next one or two years.

Given all that, I think you need to include this point of view in the discussion, even if ultimately you still think it'd be better not to have something like Apple's CSAM features. I'm still skeptical, but I think it makes more sense, and unlike with Facebook and the others, I don't see any direct benefit to Apple.
Originally Posted by Face Ache View Post
Imagine if all of your appliances reported your activities to the government. You'd be less likely to have a wank if you suspected your Fitbit was taking notes. If studies showed paedos liked burnt toast you'd suddenly have to watch yourself around the toaster. "Why are they carting Phil away?" "Burnt the toast." <nods knowingly>
All our appliances already report everything we do, just not to the government directly. Ad tracking is the worst here, and it is way ahead of us. Even in Japan, which isn't as tech-forward as other countries, ad agencies are all over it. A friend of mine (with know-how in AI/ML) was offered a job at a subsidiary of Japan's biggest ad agency, Dentsu, but he declined. He didn't want to "traverse the customer's journey" (that's what they called it). Many TVs log what you are watching and send it to the manufacturer.
Originally Posted by Face Ache View Post
Am I alone in barely using my iPhone?
I think you're using it more than you think. I'm using mine about 3 hours per day on average, I think. It doesn't feel like it, but I don't think Apple's Screen Time is lying to me. But honestly, it doesn't matter. Ad tracking works across devices and uses many sources you are not thinking of.
I don't suffer from insanity, I enjoy every minute of it.
     
Face Ache
Addicted to MacNN
Join Date: Jul 2001
Status: Offline
Reply With Quote
Aug 21, 2021, 07:07 AM
 
Originally Posted by OreoCookie View Post
I think you‘re using it more than you think.
I use mine for about 60 minutes a day: 5 or 10 minutes checking FB in the morning and night, and a podcast while I walk the dog. I guess I'm old - I still stare at the wall in waiting rooms.

Now the iMac, on the other hand, takes up most of my day. But that thinks my name is Sven and I live in Bristol.
     
ghporter
Administrator
Join Date: Apr 2001
Location: San Antonio TX USA
Status: Offline
Reply With Quote
Aug 21, 2021, 03:52 PM
 
I am honestly more concerned about vendors capturing information about ME than about the government gathering more data on me.

I’m a career Airman, USAF retiree, a disabled veteran, and licensed occupational therapist. On various occasions, I had various levels of security clearances. In other words, various government agencies already know all about me. And I was ahead of my time in not giving a flip that my windshield toll tag was visible and could always be pinged - most of the Texas toll roads I use are free for disabled vets, so I get something useful out of the thing, and that’s all I worry about with that.

And anything I photograph with my phone is most likely to be so boring or so arcane that nobody is going to have a reason to consider it “interesting.” Except maybe the various pix of wounds I’ve taken and not gotten around to deleting…those are pretty icky unless you have a clinical background.

But when Pinterest seems to know that I recently searched for Burmese cats and offer me more pictures of those cats, that bugs me.

I don’t have a “smart” appliance. I have to figure out what to buy at the grocery by myself, I have to keep up with whether or not the dishwasher is clean or dirty without emails or texts, and a “smart” range would be on my butt about how often I don’t clean the oven. I don’t need automation to keep track of that stuff.

So checking into what I’m doing with my phone should be pretty boring to anyone who wants to do anything more than yawn. But what about a young family with a baby? People have lost jobs and been put in jail because they had baby pictures on their phones. We took film pictures of our son when he was an infant, and many were not “properly clothed” because he was being hilarious or extra cute in the bath. That’s the kind of thing this program worries me about.

Glenn -----OTR/L, MOT, Tx
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 21, 2021, 08:27 PM
 
Originally Posted by Face Ache View Post
I use mine for about 60 minutes a day: 5 or 10 minutes checking FB in the morning and night, and a podcast while I walk the dog. I guess I'm old - I still stare at the wall in waiting rooms.
In my case I am listening to a lot of music and podcasts. The actual screen time is maybe an hour-and-a-half. Still that’s a lot.
I don't suffer from insanity, I enjoy every minute of it.
     
OreoCookie
Moderator
Join Date: May 2001
Location: Hilbert space
Status: Offline
Reply With Quote
Aug 21, 2021, 08:36 PM
 
Originally Posted by ghporter View Post
I don’t have a “smart” appliance.
Do you have a TV with network connectivity? If the answer is yes, you have a smart appliance and many models will snoop on you.
Originally Posted by ghporter View Post
So checking into what I’m doing with my phone should be pretty boring to anyone who wants to do anything more than yawn. But what about a young family with a baby?
I think this goes further: advertisers combine information from lots of sources (including credit card info), and they are able to reconstruct a lot about your life. I'm officially single on Facebook, because my wife wants to protect her privacy. But Facebook surely knows, simply by looking at our IP addresses. Advertisers are even able to target friends of yours indirectly by showing ads to you.

In a sense, a lot of the criticism of Apple's program, while apt in many respects, tends to forget that other companies already have much more invasive automated scanning programs in place. On the one hand, we should hold Apple to a higher standard. On the other hand, perhaps we should hold the other companies to Apple's standards when it comes to our privacy.
I don't suffer from insanity, I enjoy every minute of it.
     
 