Apple's Photos app includes a number of features designed to help you find images in your library and learn more about their content. One of them is called Enhanced Visual Search. Here's how it works, and what privacy measures Apple takes when you use it.
Visual Look Up versus Enhanced Visual Search
It's important to understand how Enhanced Visual Search differs from Apple's Visual Look Up feature. Visual Look Up, which debuted with iOS 15, lets users identify objects, landmarks, plants, and more directly within the Photos app.
For instance, swiping up on an image can reveal the dog breed portrayed in the photo. It can even interpret laundry care symbols on garments and clarify the meaning of different icons on your vehicle’s dashboard.
Enhanced Visual Search operates independently of Visual Look Up. While Visual Look Up tells you about a specific photo, Enhanced Visual Search lets you search your entire library for landmarks or places, and it works even for photos that have no geolocation information.
This means you could search your library for “Golden Gate Bridge” and retrieve all relevant images, even if the landmark is somewhat blurred and in the background.
How does Enhanced Visual Search safeguard your privacy?
Recently, there was a surge of discussion about Enhanced Visual Search transmitting your location data to Apple to facilitate the identification of landmarks and points of interest. According to the Settings app, Apple states: “Allow this device to privately match places in your photos with a global index maintained by Apple so you can search by virtually any landmark or point of interest.”
This naturally prompted concerns about privacy, especially since the feature is designed to be opt-out rather than opt-in.
Apple, however, has built a multi-layered privacy framework that it says protects your data when Enhanced Visual Search is enabled.
The process starts with something called homomorphic encryption, which works like this:
- Your iPhone encrypts a query before transmitting it to a server.
- The server processes the encrypted query and generates a response.
- This response is then sent back to your iPhone, where it is decrypted.
Importantly, in Apple's system only your devices hold the decryption key, so the server can never read the original request. Apple uses homomorphic encryption across several features, including Enhanced Visual Search.
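The round trip above can be illustrated with a toy additively homomorphic scheme. Apple's production system reportedly uses a lattice-based scheme (BFV); the sketch below instead uses the much simpler classic Paillier cryptosystem, with deliberately tiny and insecure primes, purely to show the core idea: the server can compute on ciphertexts it cannot decrypt.

```python
import math
import random

# Toy Paillier cryptosystem -- additively homomorphic.
# Tiny primes for illustration only; nothing here is secure.
p, q = 61, 53
n = p * q                      # public modulus
n2 = n * n
g = n + 1                      # standard generator choice
lam = math.lcm(p - 1, q - 1)   # private key
mu = pow(lam, -1, n)           # precomputed decryption helper

def encrypt(m: int) -> int:
    """Encrypt m with fresh randomness; only the key holder can decrypt."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(g, m, n2) * pow(r, n, n2) % n2

def decrypt(c: int) -> int:
    x = pow(c, lam, n2)
    return (x - 1) // n * mu % n

# 1. The device encrypts a query value and sends it to the server.
c1 = encrypt(12)
c2 = encrypt(30)
# 2. The server computes on the ciphertexts WITHOUT decrypting:
#    multiplying Paillier ciphertexts adds the underlying plaintexts.
c_sum = c1 * c2 % n2
# 3. The device decrypts the response with its private key.
assert decrypt(c_sum) == 42
```

The server that computed `c_sum` never saw 12, 30, or 42 in the clear; only the holder of `lam` can recover the result.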
Additionally, Apple incorporates a method called private nearest neighbor search, or PNNS, for Enhanced Visual Search. This feature allows a user’s device to securely query “a global index of popular landmarks and points of interest maintained by Apple to find approximate matches for locations depicted in their photo library.”
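Conceptually, the server-side index is a nearest-neighbor lookup over embedding vectors. Here is a minimal, unencrypted sketch of that matching step; the landmark names and vectors are made up for illustration, and the real system performs the comparison on encrypted data.

```python
import math

# Hypothetical global index: landmark name -> embedding vector.
INDEX = {
    "Golden Gate Bridge": [0.9, 0.1, 0.3],
    "Eiffel Tower":       [0.2, 0.8, 0.5],
    "Space Needle":       [0.4, 0.4, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest_landmark(query):
    """Return the index entry whose embedding best matches the query."""
    return max(INDEX, key=lambda name: cosine(query, INDEX[name]))

# An embedding computed on-device from a photo's region of interest.
print(nearest_landmark([0.85, 0.15, 0.25]))  # -> Golden Gate Bridge
```

This is why the search works without geolocation data: the match is made on what the photo looks like, not where it was taken.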
Here’s a detailed overview of the Enhanced Visual Search request pipeline, as described by Apple on its Machine Learning website:
- An on-device machine learning model evaluates a photo to identify any “region of interest” (ROI) that might contain a landmark.
- If an ROI is identified, a “vector embedding” for that portion of the image is calculated.
- That vector embedding is then encrypted and sent to a server database. Instead of sending the actual photo, a mathematical representation of the ROI is transmitted.
- Apple employs differential privacy alongside an OHTTP relay, operated by a third party, to conceal the device’s IP address before the request reaches Apple’s servers.
- The client also issues “fake queries in addition to its real ones, ensuring the server cannot discern which queries are authentic.” Furthermore, all requests traverse through an anonymization network to prevent linking multiple requests back to the same client.
- The server recognizes the relevant parts of the embeddings and returns appropriate metadata, such as landmark names, to the device. Importantly, the server does not retain the data after the results are sent back to your iPhone.
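The decoy-query step in the pipeline above can be sketched end to end. Everything here is simplified and hypothetical (no real encryption, embeddings, or OHTTP relay); it only illustrates how mixing fake queries with the real one keeps the server from knowing which request is genuine.

```python
import random

# Hypothetical landmark index (names and vectors are made up).
INDEX = {
    "Golden Gate Bridge": (0.9, 0.1, 0.3),
    "Eiffel Tower":       (0.2, 0.8, 0.5),
}

def server_match(query):
    """Server side: return the closest landmark (by squared distance).
    The server answers every query it receives, real or fake."""
    return min(INDEX, key=lambda name:
               sum((a - b) ** 2 for a, b in zip(INDEX[name], query)))

def client_query(real_embedding, num_fakes=3):
    """Client side: mix the real query with random decoys, shuffle them,
    send the whole batch, then keep only the answer to the real one."""
    batch = [tuple(random.random() for _ in range(3)) for _ in range(num_fakes)]
    batch.append(tuple(real_embedding))
    random.shuffle(batch)
    real_index = batch.index(tuple(real_embedding))  # only the client knows this
    answers = [server_match(q) for q in batch]       # server answers them all
    return answers[real_index]

print(client_query([0.85, 0.15, 0.25]))  # -> Golden Gate Bridge
```

Because the client alone knows which batch entry was real, the server learns nothing from answering, and routing the batch through an anonymization network further prevents it from linking requests to a device.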
This may sound complex and filled with jargon, but it supports Apple’s assertions that they cannot access any information from your photos.
You can disable Enhanced Visual Search in the Settings app by navigating to “Apps,” then “Photos.” Scroll to the end, where you will find a toggle for Enhanced Visual Search. Apple recommends using this toggle mainly in areas with low data connectivity.