Last month, I did something for the first time since 1994: I hung out at the mall. I bought a soda at the food court. I browsed the racks at department stores. Eventually, I bought a T-shirt. In many ways, nothing had changed. There were still plenty of teenagers and mall walkers in comfy shoes. And yet something was completely different. During that day, I shed mounds of data--not just where I shopped and what I bought, but also my intentions, interests, and idiosyncrasies.

Whether you're at the mall or on your couch, information about you can be captured from your voice, eyes, posture--even your bone and capillary structures. How often do you switch between your mobile device and desktop computer? Do you make more video than voice calls? How likely are you to purchase the things in your shopping cart on Sunday morning versus Tuesday afternoon? All of these questions touch on biometrics, which applies statistical analysis to biological and behavioral data, and which can be used to recognize not just who you are but also how you're likely to act. This year, you'll start hearing a lot about this hot new field.

Today's behavioral biometrics tools can map and measure how you interact with every screen you own--what force you use to press down, whether you fat-finger your C's and V's, and how quickly you flick your fingers when hunting through search results. Those tools know your unique typing pattern on a physical keyboard, too--whether you're someone who constantly misspells "behavioral," and whether you hold down the delete key or tap it repeatedly.
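Under the hood, these typing patterns reduce to simple timing features: how long you hold each key down, and how long you pause between keys. Here's a minimal Python sketch of that idea--the function name and the timing data are invented for illustration, not taken from any vendor's actual system:

```python
# Illustrative keystroke-dynamics features: dwell time (how long a key
# is held) and flight time (gap between releasing one key and pressing
# the next). Real tools capture these from OS-level key events.

def keystroke_features(events):
    """events: list of (key, press_time_ms, release_time_ms) tuples."""
    dwell = [release - press for _, press, release in events]
    flight = [
        events[i + 1][1] - events[i][2]  # next press minus current release
        for i in range(len(events) - 1)
    ]
    return dwell, flight

# Hypothetical timings for typing "cv"
sample = [("c", 0, 95), ("v", 140, 230)]
dwell, flight = keystroke_features(sample)
print(dwell)   # [95, 90] -- hold duration per key, in ms
print(flight)  # [45] -- pause between the two keys, in ms
```

Collected over thousands of keystrokes, distributions of features like these are distinctive enough to tell one typist from another.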

That's why BioCatch, which has built a system that captures and correlates more than 2,000 data points to build individual user profiles, says it knows within seconds if you've logged into a computer using someone else's user name and password. It's why call centers can now identify a customer's age, gender, sentiment, and emotional state just by listening to his or her voice and tone, thanks to tools like IBM Watson's Tone Analyzer. Meanwhile, Nuance Communications has built biometric software used by banks, telecom and insurance companies, and government agencies to create more than 400 million "voiceprints" for their customers. Those voiceprints, the company says, are far more secure than passwords or PINs.

Amazon recently secured patents for biometric sensing, including for an Alexa feature that could listen not just for your words, but also for your tone of voice, your coughs, and even your level of stress or fatigue. The result: a machine that may someday passively detect if you are sick. But Amazon could go further: It could determine whether other people in your neighborhood also sound congested, as well as analyze your previous online shopping habits and sift through your past grocery receipts to make a range of shopping suggestions--cough drops, fresh chicken soup, tissues. It might also trail you around the web with digital ads.

Passive behavioral biometric scanning extends to real-world objects, too. Walmart recently filed a patent for a connected shopping cart handle that can detect a shopper's heart rate, palm temperature, grip force, and walking speed to determine if he or she is struggling or agitated. If so, the cart would ping an employee who could walk over to help the customer.

These reams of bits and bytes, while convenient, bring up thorny questions. Who is the legal guardian of our data? Do companies have the right to change end-user agreements regarding our data? What should data governance look like in an era of behavioral biometrics? This is especially important since most people have no idea just how much data they're generating--and what can be done with it.

Once these questions are sorted out, though, we could start to see physical and digital realms merge seamlessly in a whole new way--one that points to more enriched, personalized experiences at home and as we move about the world, or while feeling strange pangs of nostalgia over a soda at the local mall.