Anyone who watched James Bond movies of the 1990s will recall the prototypical technical villain: a hacker in a headset, typing feverishly into a black computer terminal, prompting disaster with each keystroke. This villain had an elite ability to manipulate a computer, and used that rare skill of human-computer interaction to do nefarious deeds that mere mortals could barely hope to contain.

That technology can be manipulated for all purposes, good and bad, has not changed. What has changed are the methods of human-computer interaction, and the means by which humans control and manipulate machines. While evil-doers may still choose to hole up in dark basements and type line commands, this could change.

Steve Jobs famously dropped out of Reed College, and brought aesthetics he had learned in calligraphy to the computer interface. He introduced beautiful fonts, and an interface that was visual, rather than command-led. The mouse enabled the human to maneuver through a set of visual proxies for commanding outcomes. The human-computer interface went visual and "what you saw was what you got."

This was known as WYSIWYG, and it was the 2.0 of human-computer interaction.

Today we are on the cusp of a third wave in human-computer interaction. This third wave does not require humans to learn the computer's technical speak, typing line commands into a terminal. Nor does it require us to recall a digital rat path through file structures and apps to retrieve information, manipulating a mouse through a course of clicks, zooming ten levels deep to pull out a file we might have saved months before and hopefully categorized through visual and linguistic proxies such as folders and color codes. It requires us simply to talk.

This 3.0 human-computer interaction is driven by conversation. In some ways it is the highest-abstraction programming language: raw, natural English. Natural language processing and semantic interpretation are the hooks that carry the commands.

WYSIWYG no longer refers to what you "see." What you "say" is now what you get.

Conversation-based artificial intelligence (AI) is already here. The new Apple TV allows users to navigate directly to programming via voice command in English. The 1.0 version would be to launch an application via line command; the 2.0 version would be to navigate to the desired programming through logic or memory, by way of proxies such as search or the application itself. This 3.0 method of data retrieval is done in English, with natural language processing of the desired outcome.

"Siri, play Homeland," as a command has replaced, "turn on TV, open Apple TV, change the input on your remote to the correct HDMI port, navigate to Showtime, search for Homeland, locate most recent unwatched episode, and click Play."
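The collapse of that click path into a single utterance can be sketched as a toy intent parser. The grammar, intent names, and structure below are illustrative assumptions, not Siri's actual implementation:

```python
import re

def parse_command(utterance: str) -> dict:
    """Map a spoken command to a structured action (toy example).

    A real assistant uses statistical NLP; this hypothetical sketch
    uses one regex to show the idea: utterance in, intent out.
    """
    utterance = utterance.strip().lower()
    match = re.match(r"(?:siri,?\s*)?play\s+(.+)", utterance)
    if match:
        return {"intent": "play_video", "title": match.group(1)}
    return {"intent": "unknown", "raw": utterance}

print(parse_command("Siri, play Homeland"))
# {'intent': 'play_video', 'title': 'homeland'}
```

The device, not the user, then handles input switching, app selection, and episode lookup behind that one structured intent.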

And it's only getting more advanced, and more embedded into voice and chat functions. Since the advent of ubiquitous sensors and smartphones, we are surrounded by a flurry of data exhaust. Big data exists, but without intuitive methods of retrieval and segmentation, it's largely useless to us as people. Going into 2016, we will begin to see companies relying on chat-based AI to help users draw upon their data exhaust.

Slack famously built its product onboarding around a chat bot that asks questions rather than requiring a new user to fill out forms. And this is only the beginning. Applications such as Lark, an iOS digital weight-loss coach, already enable me to see and log my physical activity and sleep history, prompted via a series of conversational chats. Other startups, such as Digital Genius, are powering all types of enterprise interactions by pairing real-time AI and human intelligence to optimize across confidence levels.

New financial-service chat-based AI companies will enable me to ask my phone by voice or chat, "Hey, how much did I spend on Uber last month?" rather than dig into the data and do the analysis myself. It will be like having your own personal business analyst who can stitch together insights across your data, prompted by English commands.
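Under the hood, such an assistant pairs a language front end with a simple aggregation over transaction history. Here is a minimal sketch of the aggregation step, assuming the NLP layer has already extracted the merchant and time range from the question; the records and function names are hypothetical:

```python
from datetime import date

# Hypothetical transactions, as a personal-finance app might store them.
transactions = [
    {"merchant": "Uber", "amount": 23.50, "date": date(2015, 11, 3)},
    {"merchant": "Uber", "amount": 11.25, "date": date(2015, 11, 18)},
    {"merchant": "Whole Foods", "amount": 64.10, "date": date(2015, 11, 20)},
]

def spend_on(merchant: str, year: int, month: int) -> float:
    """Answer 'how much did I spend on X last month?' given a merchant
    and a month already extracted from the user's question."""
    return sum(
        t["amount"]
        for t in transactions
        if t["merchant"].lower() == merchant.lower()
        and t["date"].year == year
        and t["date"].month == month
    )

print(spend_on("Uber", 2015, 11))  # 34.75
```

The English question is the whole interface; the query, filter, and sum happen out of sight.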

In 2016, "what you say is what you get." WYSIWYG has gone conversational, and it's going to start driving how humans and computers interact in every way. It's the third wave of human-computer interaction, and the new coding language is English.

The next James Bond villain might just issue calm verbal commands to Siri.

Published on: Dec 10, 2015