The din of AI seems to be everywhere. From boardroom to bedroom, AI is promising to alter every aspect of how we work, live, and play. But we've heard it all before, right? AI as a topic is over 60 years old. The fundamental science and statistics behind it are not new. So, perhaps you should just ignore the hype? After all, there will be plenty of time to catch up, if and when AI becomes real.

Absolutely not, according to a recent report from MMC Ventures in collaboration with Barclays, "The State of AI: Divergence." 

In what has to be one of the most thoroughly researched and best-presented studies on AI that I've seen to date, the report's authors state in unequivocal terms that a great deal has changed to make AI a reality--a reality that, in their words, is going to create "the fastest major paradigm shift in the history of enterprise technology."

While much has been said about the extremes of the AI spectrum of possibilities--with the Terminator at one end and the Jetsons at the other--a few of the more radical implications have received almost no attention whatsoever. For example, in a recent column I talked about how the data needed to fuel AI is simply not affordable in the volumes needed. That could stop AI dead in its tracks. 

" the fastest major paradigm shift in the history of enterprise technology."

In reading through "The State of AI" I came across several other rarely mentioned phenomena that we need to pay close attention to as we navigate the fast-changing landscape of AI's evolution. What struck me most was that, despite all the talk of AI, we still do not have an adequate measure of how fast and how radical its implications will be.

From the standpoint of individuals and enterprises, not understanding AI's capabilities and the speed with which they will evolve will be lethal. Yes, there has been hyperbole, and there have been missteps. For example, in a recent podcast with Tom Davenport, I covered IBM's failure with Watson at M.D. Anderson, which cost over $63 million.

While these sorts of setbacks are inevitable with any fast-moving technology change, they're also causing many companies to adopt a wait-and-see attitude toward AI. For instance, one large truck manufacturer I worked with recently has a fully developed Level 4 autonomous truck, meaning it can drive itself in nearly any situation a human can. Yet they are waiting to roll it out because they do not want to be first to market.

That's understandable, given the reputational risk involved. But the real concern--the one that should be keeping all of us up at night--is how large a lead early adopters will build as they plow forward with AI, mistakes and all, while the rest wait on the sidelines.

If you have even the slightest interest in AI, I'd strongly encourage you to read the 130+ page report in its entirety. The insights may well change your mind about your own AI investments.

However, here are some of the highlights from the report and what I see as their implications for your business and your life--especially for what I suggested in the headline of this column.

A.I. Adoption

According to research by Gartner, "just 4 percent of enterprises had adopted AI 12 months ago. Today, 14 percent of enterprises have deployed AI. [and] A further 23 percent intend to deploy AI within the next 12 months. Adoption will continue to accelerate; in two years, nearly two thirds of large companies will have live AI initiatives."

The reported rate of adoption has little precedent in any prior technology. In the span of roughly three years, enterprise adoption will have gone from 4 percent to nearly two-thirds. And while the technology of AI may be advancing exponentially, it's certain that our ability to cope with its human, social, and organizational implications is not going to move anywhere near as fast. In a prior column I talked about how this sort of asymmetry between man and machine can have disastrous implications.

We are creating a huge cognitive and behavioral gap between the capabilities of AI and those of humans. Renegotiating this collaboration will not be trivial. That's reason No. 1 why early adopters will have an enormous leg up on laggards: the learning curve is not just technical, it's very human, and very steep.

A.I. Technology

The report lists a variety of new technologies that are creating a sea change of new opportunities for AI. This is an entirely new lexicon that we all need to familiarize ourselves with. For instance, custom silicon will alter some of the most fundamental aspects of computer architecture. The shift will be similar to what happened with data storage over the last 60 years. In 1956, an IBM 350 disk drive weighed two tons and had a 3.5 MB capacity. Today, advanced SSD storage can hold 300,000 times that data on a chip that is just one millionth of the weight of the IBM 350. The difference is that this time the change will take a few years as opposed to decades.
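To see just how stark that comparison is, here is a quick back-of-the-envelope check. The 300,000x capacity multiplier and the "one millionth the weight" figure come from the comparison above; the unit conversions are my own.

```python
# Back-of-the-envelope check on the IBM 350 vs. modern SSD comparison,
# using the figures quoted above.

ibm350_capacity_mb = 3.5        # IBM 350 capacity, in megabytes
ibm350_weight_kg = 2 * 907.0    # "two tons" (US short tons), in kilograms

ssd_capacity_mb = ibm350_capacity_mb * 300_000   # 300,000x the data
ssd_weight_kg = ibm350_weight_kg / 1_000_000     # one millionth the weight

print(f"Modern SSD capacity: {ssd_capacity_mb / 1_000_000:.2f} TB")  # ~1 TB
print(f"Modern SSD weight:   {ssd_weight_kg * 1000:.1f} g")          # ~1.8 g
```

In other words, a chip lighter than a coin now holds roughly a terabyte--what once took a two-ton machine to store a few megabytes.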

Another key technology most of us aren't aware of, but which will have serious implications for distorting reality, is that of GANs, generative adversarial networks. GANs are able to create fake content that is impossible to differentiate from real content. For example, GANs can be used to splice an individual's face onto a video in a way that makes it impossible to tell from the real thing. Take a look at these examples of GANs used for images and voice.

Other technologies, such as reinforcement learning (RL) and transfer learning (TL), may hold the key to the holy grail of AI: general AI. For instance, Google DeepMind's AlphaGo used deep reinforcement learning to hands-down beat Lee Sedol, one of the world's top Go masters. Its successor, AlphaGo Zero, went a step further: nobody taught it how to play Go at all. In just 40 days of teaching itself, it mastered what took humanity 3,000 years--and surpassed every previous version of AlphaGo.

The rate of this change in capability is nothing short of mind-numbing. Again, if you're a laggard, catching up will be like a 1920s horse and buggy trying to catch a Lambo by adding more horses to its hitch.

AI also doesn't just learn exponentially. It learns with the cumulative effect of collaborative AI. For example, an autonomous vehicle doesn't learn on its own; it learns from millions of other autonomous vehicles. This sharing has a compounding effect on how fast AI evolves and matures--imagine an exponent raised to the power of another exponent. Ultimately, general AI will likely be the result of this enormous combinatorial effect of having myriad individual domains from which AI can learn.
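A toy calculation makes the compounding concrete. The fleet size and daily mileage below are illustrative assumptions of mine, not figures from the report; the point is only that shared experience scales with the size of the fleet, not with any single vehicle's driving.

```python
# Toy illustration of fleet learning: every mile driven by any vehicle
# contributes to a shared model, so collective experience scales with
# fleet size rather than with one vehicle's mileage.

fleet_size = 1_000_000          # assumed number of connected vehicles
miles_per_vehicle_day = 30      # assumed average daily miles per vehicle

solo_experience = miles_per_vehicle_day                 # isolated learner
fleet_experience = fleet_size * miles_per_vehicle_day   # shared learner

print(f"Isolated vehicle: {solo_experience:,} miles/day of experience")
print(f"Shared fleet:     {fleet_experience:,} miles/day of experience")

# The shared fleet gains in a single day what one vehicle would need
# fleet_size days -- roughly 2,700 years -- to experience on its own.
years_equivalent = fleet_size / 365
```

A human driver, or an unconnected machine, simply has no way to close that gap by practicing harder.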

Once more, if you're an early adopter you're already building this exponential AI learning network, while laggards are still making linear strides forward. 

A.I. Talent

This one blew me away. According to the report, "Data scientists and machine learning specialists are among the best paid professional developers. At the 20 highest-paying companies, salaries for AI engineers average $224,000." 

In fact, do you want to guess who the No. 1 and No. 2 best-paying employers are when it comes to AI developer talent?

Uber leads the pack at $325,000! But what surprised me was that WalmartLabs and Netflix pay a very competitive $260,000. That's more than Facebook and Salesforce, which clock in at $250,000. The lesson here is that you can catch up even if you're not a tech company, but you need to do so early.

All of this paints a positive picture of AI's momentum, but the report also provides a detailed analysis of the implications for organizations that turn a blind eye to AI and become laggards.

In much the same way that the rich get richer--because they have the resources and the power to set the agenda--leaders in AI will quickly leverage their early adoption, technical superiority, and talent to claim high ground from which they may simply be impossible to dislodge.

It all illustrates the fast--exponentially fast--widening gap between leaders and laggards. And I'd extend that not only to organizations but also to nation-states, the subject of a future column.

The disparity between an AI-enabled organization and one still using linear, rules-based, 20th-century models for computing and learning may be as great as that between human intelligence and an ant farm's. Am I being hyperbolic? Perhaps, but only to shake you out of your indifference.

For now, I'd strongly suggest that you tune into the din and pay close attention to where you are, as an individual and as a business, on the AI learning and adoption curve.