Last week, I rearranged my desk. I tiled virtual screens around my existing monitors, placed a virtual vase and flowers beneath a (real) lamp, and stationed a tiny pet dinosaur by my mousepad to keep me company.
The dinosaur, I soon realized, was a mistake. I waved her away, again and again, but she kept walking back to my keyboard to look up at me, adorably, hoping I'd stop working to play with her.
On my physical monitors were Slack, email, and various documents. On the virtual monitors, on a mixed-reality wall that responded to my gestures, were The New York Times, several Google searches, and a shared office calendar. My team works remotely, but I could invite them to meet with me--virtually--in my office. It was possible to see outlines of their bodies sitting and standing near a whiteboard, even though one staffer was in Austin and the other was in San Francisco.
This isn't sci-fi. It's spatial computing, which will enable machines to respond to us in real time through a combination of technologies: sensors, 3-D capture, rendering, algorithms, and wearable displays. This means computers won't be tethered to a single location, like your desk or closet, but will instead engulf offices, boardrooms, factory floors, and kitchens. Spatial computing means any room can become a computable environment--with you as part of the system.
You'll enter this new world using a pair of mixed-reality glasses, which will track your movements and relay information while allowing you to see the real world. Unlike augmented-reality headsets, which overlay digital information in response to your location, or virtual-reality headsets, which completely block the outside world, mixed-reality glasses will blend you and your surroundings with data, algorithms, and graphics.
Sony's R&D division in Tokyo has been researching mixed reality for years. Its Parallel Eyes project allows four people to share what they see with one another--and for onlookers to watch as well. Imagine watching the Rolling Stones from the stage, and being able to see through the eyes of Mick, Keith, Charlie, and Ron as they perform. More practically, this would allow construction teams, law enforcement officers, and coaches to collaborate on projects, fieldwork, and practices more viscerally than ever before.
Last August, the Florida-based startup Magic Leap launched Magic Leap One, its mixed-reality headset, along with the hybrid physical-virtual world users inhabit when they enter its spatial computing system. (It's the hardware and software I used to rearrange my desk.) Magic Leap has its detractors; having read all of its patents and spent ample time experimenting with the headset and spatial computing platform, I agree it isn't ready for the holiday season--like every groundbreaking technology, Magic Leap needs time to mature. But even at this early stage, the underlying technology powering it is breathtaking. Wearing its glasses, I was able to set up my office and desk in less than two minutes. Once I did, my tiny dinosaur pet could freely roam the room while responding to the physical world: She couldn't walk through walls or furniture.
Soon, I should be able to interact with Mica--Magic Leap's startlingly realistic mixed-reality version of Siri and Alexa. She's still a prototype, but already her facial movements respond to mine. In the near future, Mica could keep me company when I'm dining alone, or take a walk with me around my office.
It will take time to develop this technology and move it from the fringes to the mainstream, but our lives will change profoundly once every room we're in is also a computing environment. Blueprints and 3-D designs will transform into spaces we can see, walk through, and inspect before any nail is hammered. Gyms will offer personal-training A.I. that we'll see and interact with, as it coaches us using our data. And allergy sufferers, like me, will finally be able to own a pet. Just be forewarned: Your too-cute dinosaur won't leave you alone for a second.