Yesterday, Amazon revealed its new Echo Look, the first of its Alexa AI devices that can both hear and see -- the result of a camera added to the usual microphones and speakers. The Echo Look is intended to provide fashion advice: it can help people create lookbooks of their daily attire (which helps them remember what they wore to specific meetings and events) and advise them on which outfits look best, utilizing "machine learning algorithms with advice from fashion specialists" to make such decisions.

But before signing up to purchase and use this device, you should think about the privacy issues. Here are some, though certainly not all, of the points to consider:

1) Do you really want an uncovered camera in your bedroom?

Most people get dressed in their bedrooms. Could the Echo Look camera be activated by hackers when you think it is off? In theory, you could place the Echo Look in a room other than your bedroom; realistically, though, think about where you are used to trying on and comparing outfits. If you are like many other people, you would have to consciously change that habit, and maybe the location of your mirror, once the Echo Look arrives. And if you share a residence with others, changing your "fashion analysis area" may not even be practical.

2) Will private objects, papers, medications, etc. be seen?

How tidy do you keep your bedroom, and do you have mirrors in it? While the Echo Look is supposed to blur out backgrounds, are you sure that when it photographs you it won't pick up items that you don't want to be seen in photos? Even if you place the Echo Look elsewhere, will it capture items of which you don't want Amazon to have an image? Think about how many private "items" you have that could show up in photos or videos -- from who is in your home, to developing medical conditions, to medication bottles, to pregnancy tests, to tattoos, to sex-related items, to letters from debt collectors, to rejection letters from jobs, to medical insurance claims, to firearms, to illegal substances -- I could dedicate an entire article to listing the private things that can be picked up in a photo or video taken inside one's home. Are you sure Alexa will never see them?

3) Could your kids trigger photos when you don't expect them to?

Do you have children? If so, are you sure that they will never ask Alexa to take a photograph when you don't want one taken?

4) If you cover your camera at the office, wouldn't you want to do the same anywhere in your home?

Covering the camera, however, undermines the hands-free appeal of the device: you can no longer activate it without any physical action, simply by telling the Echo Look, "Alexa, take a picture" or "Alexa, take a video." If I bought an Echo Look, I would probably cover it when not in use anyway.

5) Do you want Amazon to know what you own?

Amazon already knows a lot about what you buy from it and from vendors selling through its storefront. Do you really want the firm to also have a record of what other clothing you own and what condition it is in?

6) Do you want Amazon to know about your health issues?

Could Amazon detect health issues in the images? Could it use knowledge of health concerns to market various products and services to you? Unless Amazon commits to never using the photos for any purpose other than providing fashion-related advice, it is likely that, at some point, it will use the information in the images to market other kinds of products -- perhaps related to weight gain, pregnancy, skin disease, or other health-related matters. On that note...

7) Will Amazon discover that you are pregnant before you have told the world?

If Amazon infers a pregnancy from the images, and you share an Amazon account with family members, the resulting recommendations and ads could have serious repercussions for family relationships and for privacy between family members. (Remember when Target informed a father of his teenage daughter's pregnancy?)

8) Could AI applied to photos produce emotional distress?

Beyond warnings about potential weight gain (whether explicit, as described above, or implied through changes in fashion recommendations), there are other serious issues at hand. Could Alexa's artificial intelligence analysis of images, for example, lead to situations where women receive baby product samples and coupons after suffering miscarriages? If you think such a scenario is unlikely, consider that marketers have made that mistake multiple times in the past.

9) Could images be demanded by the government?

Of course, anything in Amazon's possession could be demanded by government officials with warrants. There may not be anything in the photos that concerns you -- but do you really want curious government folks snooping around inside your closet -- and your home?

10) Do you trust all of the people who will ever work for Amazon?

While Amazon claims that it has "rigorous controls in place to restrict access to these images," it also states, according to various reports, that "Designated Amazon personnel may view photos and video to provide and improve our services, for example to provide feedback through Style Check." Keep this in mind.

11) Will Amazon store the images forever?

If so, what will it do with them? Will it ever change its policies regarding how the images are used? Could it eventually sell them to third parties unless folks opt out at some point in the future? Forever is a long time - it is hard to know what the future brings.

12) What if someone's sense of clothing and modesty changes with time?

Someone who converts to a different faith, or adopts a stricter interpretation of the faith in which they were raised, for example, may not wish to have photos of themselves dressed in various outfits sitting on Amazon's servers. Will Amazon make it easy to remove all such images from all storage locations, including those used by the employees mentioned earlier?

There is little doubt that the Echo Look represents the future of AIs -- hearing and seeing. But, if you decide to get one, be sure to understand the privacy repercussions - and take steps to minimize their impact on your life.

Published on: Apr 27, 2017