On Monday, many Twitter users who have iPhones were surprised to discover that Apple has been quietly categorizing their bikini photos, bra selfies, and various nude pictures and making them easily searchable under the tag "Brassiere."

Here are two examples. NSFW, so click with caution.

The feature appears to have innocent-ish origins. It uses image-recognition capabilities that have been part of Apple Photos since summer 2016. "The Photos app makes it easy to find photos of an exact person, place, or thing. Advanced face recognition and computer vision technology lets you search your photos by who and what's in them," according to Apple's support page. "Photos recognizes scenes or specific objects in your photos, so you can search for things like dogs, mountains, or flowers."
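To get a sense of how this kind of labeling works in general, here is a minimal Swift sketch using the Vision framework's VNClassifyImageRequest, a built-in classifier Apple later exposed to developers. It is not Apple Photos' internal pipeline (which isn't public); the function name and confidence threshold are illustrative.

```swift
import Foundation
import Vision

// Illustrative sketch only: not Apple Photos' actual pipeline, just the kind of
// on-device image classification this sort of feature is built on.
// VNClassifyImageRequest ships with Apple's Vision framework (iOS 13+/macOS 10.15+)
// and returns labels with confidence scores.
func searchableLabels(for imageURL: URL, minimumConfidence: Float = 0.5) throws -> [String] {
    let request = VNClassifyImageRequest()                  // built-in multi-label classifier
    let handler = VNImageRequestHandler(url: imageURL, options: [:])
    try handler.perform([request])                          // runs entirely on-device

    // Keep only the labels the model is reasonably confident about.
    return (request.results ?? [])
        .filter { $0.confidence >= minimumConfidence }
        .map { $0.identifier }                              // e.g. "cat", "dog", "swimsuit"
}
```

A photo library could then build a reverse index from each label to the photos that carry it, which is what turns a search for "dogs" or "mountains" into a list of matching pictures, and why a category only appears once at least one photo has received that label.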

For example, one Twitter user pulled up an ad hoc "Adult Cat" folder.

Inc. staffers experimented a little and found that a category seems to come into existence only if you have photos the algorithm interprets as containing the object or attribute in question. Our social media editor confirmed that the "Brassiere" category was pulling in photos that qualified, but "it also picks up bathing suits and dresses with spaghetti straps," despite a separate "Swimsuit" category also existing.

This reporter has pictures of her cats and dog but wasn't able to pull up a "Cats" or "Dogs" category. A male editor didn't have the "Brassiere" category on his own phone; on his wife's phone, the same search pulled up a video of their infant daughter wearing a diaper.

Clearly the algorithm could use some work; its precision is lacking. And judging by the shock on display on Twitter, Apple could also do a better job of communicating this feature to users, especially when it comes to privacy controls.

The whole incident calls to mind the similar-but-worse Google Photos mishap, where the image-recognition algorithm tagged selfies of two black people as "Gorillas." Machine learning scales cheaply and is pretty successful considering that computer vision is in its infancy, but these techniques have no cultural awareness or emotional intelligence. It's up to humans to tell the algorithms why it might be a good idea to, say, keep baby photos and homemade pornography from appearing in the same search.

Inc. reached out to Apple; we will update this story if we hear back from them.

Published on: Oct 30, 2017