This year, you will start to see all kinds of new mobile phones, and they will surprise and delight you: Phones with foldable screens! Phones with cameras and buttons in different places! Phones that literally peer into your body and reveal not just whether you've hit your step count, but whether you're on the verge of a heart attack!
But these innovations paradoxically signal the end of an era, not the beginning of one. Rather than relying on a single smartphone and screen to connect them to work, friends, and entertainment, consumers will soon use dozens of smart devices, because the smartphone's primacy is starting to fade. Thanks to converging trends--among them 5G and AI in the cloud--the smartphone will spend the next decade acting as a central hub before being replaced entirely by wearable screens, ubiquitous voice assistants, and ambient interfaces.
If you were to go back to, say, 1998, you'd see yourself juggling lots of gadgets: a digital camera, a MiniDisc or CD player, a big, clunky external hard drive, a GPS unit that attaches to your car's windshield, maybe a portable DVD player. The smartphone absorbed all those functions (and more); now they're migrating beyond the phone again, as advances in other technologies lead us from one device back to many: smart earables--that's not a typo--wristbands, yoga clothes, mirrors, even glasses.
The world's largest tech companies, as ever, are leading the charge. Nearly a decade ago, Microsoft experimented with "skinput," which turned a person's arm and hand into an interactive interface. You could answer a call by tapping your fingers, or press your palm to skip a song on your playlist. Now Google's Project Soli is advancing that skinput idea: On December 31, the FCC approved its proposed tests of a new chip that uses radar to track micromotions. The Soli chip (or something like it) could be embedded into glasses, rings, bracelets, cars, doors--virtually any connected device. We're already transitioning from physical to digital buttons; soon skinput may teach consumers to live without any buttons at all.
Meanwhile, a new generation of smart clothes will send data to applications and deliver feedback in real time. Pivot Yoga makes connected yoga pants--you read that right--that monitor your downward dogs and help you adjust your form. The company's connected clothing syncs to an app, through which a digital assistant will tell you when to turn your left hip or move your legs three inches back on the mat. And Apple has patented "force-sensing" fabrics, including a glove that could monitor your blood pressure and heart rate.
The startup Mirror created a full-length mirror that serves as a portal to live fitness classes. During classes, users wear a monitor that connects to an app, which tracks their performance and progress. Mirror can also be personalized: If you have a bad lower back, the system adjusts the exercise routine. The makeup giant Coty debuted a smart mirror last spring that analyzes skin tone and recommends the right shades of its makeup. It's an in-store experience now, but plenty of tech and retail brands, from Philips to Sephora, are betting on smart mirrors that can be used at home to help people apply makeup and pick the best accessories for their outfits. Amazon has patented a smart mirror that goes even further: Not only does it let you try on clothes virtually, but it also shows you wearing those clothes in different settings, like having dinner at a swanky restaurant in L.A. or walking the sandy shores of Lake Michigan.
If you're watching only for new and smarter smartphones this spring, you'll miss this multidevice ecosystem as it snaps into focus. Consumers will soon be surrounded by information--untethered from a single screen--and will expect even their most mundane objects to do something extra. And that will change everything: how they shop, how they learn about new products and services, and how they relate to one another and to your business.