Three years ago, I needed a new car. I'd done my research, knew exactly the make and model I wanted, and was at the dealership only for a test-drive before signing the paperwork. Then the salesman spent our entire ride explaining that this car was tops among soccer moms, and that it was so easy to drive, I wouldn't have to beg my husband for help. I liked the car. But the experience so infuriated me that I didn't buy it.
Shrewd sales associates know how to read and interpret all the data--body language, mood, extroversion--that customers bring. But too often, salespeople misread cues, fail to listen, or steer customers toward something that makes no sense. Soon, though, you'll be encountering new sales associates--ultra-smart, focused on you, attentive to your tastes and preferences. They'll know what you've purchased in the past. They'll predict what you're likely to want next. They may even speak in a voice that sounds a lot like yours.
They're digital associates, similar to the digital assistants you already know--Alexa, Siri, Google Assistant--but programmed with different objectives. In the form of roaming robots, smart kiosks, and augmented-reality mirrors, they'll start showing up in shops over the next two years--or sooner.
They could help save such shops as physical retail continues to suffer in an era of never-ending online choices. Consider what happened when Japanese tech giant SoftBank deployed its Pepper robot in a few shops in California. Pepper was charged with greeting customers and answering their questions. One of the Pepper-equipped shops reported a 70 percent increase in foot traffic, a 13 percent increase in revenue, and six times the average sales of a featured product. At a custom print apparel store, Pepper generated 20 percent more foot traffic--and tripled revenue. That's because, like all digital associates, Pepper isn't just a transactional device. It's a system that truly knows the customer, thanks to its technical ability to recognize and interpret human emotion.
Meanwhile, some MAC cosmetics stores have installed augmented-reality mirrors so customers can try out different makeup looks without worrying about sharing lipsticks and mascaras with strangers. Japanese clothing retailer Uniqlo deploys an AR mirror that lets customers see a full range of colors for various articles of clothing, simply by swiping through options on a screen. SenseMi Technology Solutions' virtual fitting-room mirror shows how clothing will move once it's on, thus helping a customer determine whether a dress is too short or low-cut without ever having to put it on. Early this year, Amazon patented an AR mirror that allows customers to try on clothing virtually and see themselves dressed for different occasions: walking on a beach, dancing at a gala, interviewing with a potential boss. As these systems mature, they'll store personal details--body type, style profile. And Chinese tech giant Alibaba's Ant Financial unit now lets customers "smile to pay," via an ordering and payment system that uses algorithms and 3-D face-scanning cameras to recognize them.
All this will come in handy in a clothing shop or a grocery store, but what about bigger purchases--like a car? Within the next five years, half of all interactions we have with machines will be in the form of a conversation--and companies will soon have the chance to develop their own synthetic voices. A San Francisco startup, Voicery, is developing a speech synthesis system that mimics emotion, can convey charisma and warmth, and modulates to a tone that best suits each customer. One day, a digital associate--capable of interpreting my personal data--will come along when I decide to test-drive a car, and it won't assume that I am primarily driving kids to soccer practice. It will tell me about sport mode, and how to customize the onboard computer--and that dealership will make a sale.