Apple reportedly deployed and automatically enabled Enhanced Visual Search, a feature that can identify landmarks and places of interest on customers’ devices without their consent.
According to an article by The Register, the feature analyzes customers’ pictures stored in the Photos application on their iOS and macOS devices.
Apple users have just found out about the feature
The feature was deployed last year, but Apple users were not aware of it until recently. According to The Register, software developer Jeff Johnson called out Enhanced Visual Search last week and criticized Apple for failing to explicitly explain it to users. Johnson raised his concerns in two write-ups, noting that the feature is believed to have arrived with iOS 18.1 and macOS 15.1 on October 28 last year.
Users are also concerned with Apple’s deployment of the technology. Matthew Green, an associate professor of computer science at the Johns Hopkins Information Security Institute in the US, described it as frustrating.
He said:
“It’s very frustrating when you learn about a service two days before New Year’s and you find that it’s already been enabled on your phone.”
According to a policy document from November 18, 2024, Apple describes the feature as allowing “you to search for photos using landmarks or points of interest.”
“Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy and use an OHTTP relay that hides [your] IP address,” Apple said.
“This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General,” they further explained.
Apple explained the technology in a technical paper published on October 24 last year. This was around the same time that the Enhanced Visual Search feature was introduced.
How the Apple feature works
According to The Register, Apple started running its customers’ pictures through a locally running machine-learning algorithm that looks for landmarks visually, without using location data, and computes a value for anything in each photo that might be a landmark.
The device then sends that value to a remote server, which checks it against an index of such values stored on Apple’s servers, labeling the landmarks and places in each snap from Apple’s database.
In other words, when a user takes a picture, the device outlines what it thinks is a landmark or place of interest in the picture. It homomorphically encrypts a representation of that portion of the picture so that it can be examined without being decrypted, and the landmark can then be identified against a huge database of places.
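The key property described above, that a server can compute on encrypted data without ever decrypting it, can be illustrated with a toy example. The sketch below uses textbook RSA, which happens to be multiplicatively homomorphic; this is not the scheme Apple uses, and the tiny hardcoded primes are wildly insecure, purely for demonstration:

```python
# Toy illustration of homomorphic encryption: the "server" multiplies two
# ciphertexts without ever seeing the plaintexts. Textbook RSA satisfies
# Enc(a) * Enc(b) mod n == Enc(a * b). Demonstration only -- not secure,
# and not the construction Apple's production system uses.

p, q = 61, 53
n = p * q                          # public modulus (3233)
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

# Client encrypts two values and sends only the ciphertexts.
c1, c2 = encrypt(7), encrypt(11)

# Server computes on ciphertexts -- it never learns 7 or 11.
c_product = (c1 * c2) % n

# Client decrypts the server's answer: 7 * 11 = 77.
print(decrypt(c_product))  # 77
```

The same shape applies to the photo lookup: the device sends an encrypted representation, the server computes a match against its index on the ciphertext, and only the device can decrypt the result.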
Despite these descriptions of how the feature functions, there are objections. Jeff Johnson, in a second post, said:
“My objection to Apple’s Enhanced Visual Search is not the technical details specifically, which are difficult for most users to evaluate, but rather the fact that Apple has taken the choice out of my hands and enabled the online service by default.”
According to The Register’s article, if the feature functions as Apple claims, with no side channels or other leaks, Apple can see neither the contents of users’ images nor the looked-up labels.
Apple claims it protects users’ data privacy
By using homomorphic encryption and what’s called differential privacy, a technique for protecting the privacy of individuals whose data appears in a data set, Apple says it heads off any potential privacy problems.
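Differential privacy is most commonly implemented by adding calibrated random noise to a statistic, so that any one person’s data has only a bounded, deniable effect on what the server learns. Below is a minimal sketch of the standard Laplace mechanism; the function names are illustrative and the article does not detail Apple’s actual implementation:

```python
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise.

    The difference of two i.i.d. exponential variables with rate 1/scale
    is Laplace-distributed with that scale.
    """
    rate = 1.0 / scale
    return random.expovariate(rate) - random.expovariate(rate)

def private_count(true_count: int, epsilon: float) -> float:
    """Differentially private version of a counting query.

    A count changes by at most 1 when one person's data is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy. Smaller epsilon = more noise = stronger
    privacy.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: report how many photos matched some landmark,
# without revealing any single user's exact contribution.
print(private_count(42, epsilon=0.5))
```

The reported value hovers around the true count, but an observer cannot tell from any single noisy answer whether a particular individual’s photo was in the tally.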
“Apple is being thoughtful about doing this in a (theoretically) privacy-preserving way, but I don’t think the company is living up to its ideals here,” observed software developer Michael Tsai in an analysis shared Wednesday.
“Not only is it not opt-in, but you can’t effectively opt-out if it starts uploading metadata about your photos before you even use the search feature. It does this even if you’ve already opted out of uploading your photos to iCloud.”
The software developer also argued that Apple’s approach is less private than its abandoned CSAM scanning plan “because it applies to non-iCloud photos and uploads information about all photos, not just ones with suspicious neural hashes.”