Sure enough, when I checked my iPhone 15 Pro this morning, the toggle was turned on. You can find it yourself by going to Settings > Photos (or System Settings > Photos on a Mac). Enhanced Visual Search lets you identify landmarks you’ve photographed and search for those images using the landmarks’ names.
To see it in action in the Photos app, swipe up on a photo you’ve taken of a building and select “Look Up Landmark,” and a card will appear that ideally identifies it. Here are some examples from my phone:
On its face, it’s a convenient expansion of the Visual Look Up feature that Apple introduced to Photos in iOS 15, which lets you identify plants or, say, find out what the symbols on a laundry tag mean. But Visual Look Up doesn’t need special permission to share data with Apple, and this one does.
A description below the toggle says you’re giving Apple permission to “privately match the locations in your photos using a global index maintained by Apple.” There are more details in an Apple machine learning research blog post about Enhanced Visual Search that Johnson links to:
The process begins with an on-device ML model that analyzes a given photo to determine if there is a “region of interest” (ROI) that may contain a landmark. If the model detects an ROI in the “landmark” domain, a vector embedding is calculated for that region of the image.
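That description maps to a simple gating pipeline: scan on device, and only compute an embedding when a landmark-like region is found. Here is a minimal Swift sketch of that flow as I understand it; every type and function name in it (RegionOfInterest, LandmarkDetector, EmbeddingModel) is hypothetical, invented for illustration, and not Apple’s actual API:

```swift
import CoreGraphics

// Hypothetical stand-ins for the on-device models Apple's blog describes.
// None of these types are real Apple APIs.
struct RegionOfInterest {
    let boundingBox: CGRect  // part of the photo that may contain a landmark
    let domain: String       // e.g. "landmark"
}

struct LandmarkDetector {
    // Stands in for the on-device ML model that scans a photo for an ROI.
    func detectROI(in photo: CGImage) -> RegionOfInterest? {
        // ... on-device model inference would happen here ...
        return nil // placeholder
    }
}

struct EmbeddingModel {
    // Stands in for the model that condenses the ROI into a vector embedding.
    func embedding(for roi: RegionOfInterest, in photo: CGImage) -> [Float] {
        // ... on-device model inference would happen here ...
        return [] // placeholder
    }
}

// Only photos with an ROI in the "landmark" domain get an embedding at all;
// per Apple's blog, that embedding (not the photo itself) is what later gets
// encrypted and compared against the server-side index.
func enhancedVisualSearchCandidate(for photo: CGImage) -> [Float]? {
    guard let roi = LandmarkDetector().detectROI(in: photo),
          roi.domain == "landmark" else {
        return nil
    }
    return EmbeddingModel().embedding(for: roi, in: photo)
}
```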
According to the blog post, that vector embedding is then encrypted and sent to Apple to compare against its database. The company offers a deeply technical explanation of vector embeddings in a research paper, but IBM puts it more simply, writing that embeddings transform “a data point, such as a word, sentence, or image, into an n-dimensional array of numbers representing the characteristics of the data point.”
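To make that concrete, here is a toy Swift example of what comparing two embeddings can look like. The four-dimensional vectors are made-up numbers (real embeddings have hundreds of dimensions), and cosine similarity is a generic illustration of embedding comparison, not a description of how Apple’s encrypted matching actually works:

```swift
// Toy embeddings: n-dimensional arrays of numbers, per IBM's description.
// The values are invented purely for illustration.
let photoEmbedding: [Float]    = [0.12, -0.48, 0.90, 0.05]
let landmarkEmbedding: [Float] = [0.10, -0.50, 0.88, 0.07]

// Cosine similarity: embeddings of similar images point in similar
// directions, so scores near 1.0 suggest the same landmark.
func cosineSimilarity(_ a: [Float], _ b: [Float]) -> Float {
    precondition(a.count == b.count, "embeddings must share a dimension")
    let dot = zip(a, b).map { $0.0 * $0.1 }.reduce(0, +)
    let normA = a.map { $0 * $0 }.reduce(0, +).squareRoot()
    let normB = b.map { $0 * $0 }.reduce(0, +).squareRoot()
    return dot / (normA * normB)
}

print(cosineSimilarity(photoEmbedding, landmarkEmbedding)) // ≈ 0.999
```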
Like Johnson, I don’t fully understand Apple’s research blog posts, and Apple didn’t immediately respond to our request for comment on Johnson’s concerns. It appears the company has gone to great lengths to keep the data private, in part by condensing image data into a format that’s legible to an ML model but not to people.
Even so, making a toggle like this one opt-in, as Apple does for sharing analytics data or recordings of Siri interactions, rather than something users have to discover, seems like it would have been the better option.