I recently appeared as a guest on Wharton Professor David Robertson's radio show, Innovation Navigation. David is an old pro and recently published an excellent new book on innovation, The Power of Little Ideas, so it was an interesting, wide-ranging discussion that covered a lot of ground.

One of the subjects we touched on was the new era of innovation. For the past few decades, firms have innovated within well understood paradigms, Moore's Law being the most famous, but by no means the only one. This made innovation relatively straightforward, because we were fairly sure of where technology was going.

Today, however, Moore's Law is nearing its theoretical limits, as are lithium-ion batteries. Other technologies, such as the internal combustion engine, will be replaced by new paradigms. So the next few decades are likely to look a whole lot more like the 50s and the 60s than the 90s or the aughts. Much of the value will shift from applications to fundamental technologies.

The End Of Paradigms

As Thomas Kuhn explained in The Structure of Scientific Revolutions, we normally work within well established paradigms because they are useful for establishing the rules of the game. Specialists within a particular field can speak a common language, advance the field within well understood parameters and apply their knowledge to solve problems.

For example, Moore's Law created a stable trend of doubling computing power about every 18 months. That made it possible for technology companies to know how much computing power they would have to work with in the coming years and predict, with a fairly high level of accuracy, what they would be able to do with it.

Yet today, chip manufacturing has advanced to the point where, in just a few years, it will be theoretically impossible to fit more transistors on a silicon wafer. There are nascent technologies, such as quantum computing and neuromorphic chips, that could replace traditional architectures, but they are not nearly as well understood.

Computing is just one area reaching its theoretical limits. We also need next generation batteries to power our devices, electric cars and the grid. At the same time, new technologies, such as genomics, nanotechnology and robotics are becoming ascendant and even the scientific method is being called into question.

The Next Wave

Over the past few decades, technology and innovation have mostly been associated with the computer industry. As noted above, Moore's Law has enabled firms to bring out a steady stream of devices and services that improve so quickly that they become virtually obsolete in just a few years. Clearly, these improvements have made our lives better.

Still, as Robert Gordon points out in The Rise and Fall of American Growth, because advancement has been contained so narrowly within a single field, productivity gains have been meager compared to earlier technological revolutions, such as indoor plumbing, electricity and the internal combustion engine.

There are indications that's beginning to change. These days, the world of bits is beginning to invade the world of atoms. More powerful computers are being used for genetic engineering and to design new materials. Robots, both physical and virtual, are replacing human labor in many jobs, including high-value work in medicine, law and creative fields.

Yet again, these technologies are still fairly new and not nearly as well understood as traditional technologies. Unlike computer programming, subjects like nanotechnology, genetic engineering and machine learning aren't taught at your local community college. In many cases, the cost of the equipment and expertise needed to create these technologies is prohibitive for most organizations.

The Democratization Of Fundamental Research

In the 1950s and 60s, technological advancement brought increased scale to enterprises. Not only did mass production, distribution and marketing require more capital, but improved information and communication technologies made the management of a large enterprise far more feasible than ever before.

So it would stand to reason that this new era of innovation would lead to a similar trend. Only a handful of companies, such as IBM, Microsoft and Google in the tech space, and corporate giants like Boeing and Procter & Gamble in more conventional categories, can afford to invest billions of dollars in fundamental research.

Yet something else seems to be happening. Cloud technologies and open data initiatives are democratizing scientific research. Consider the Cancer Genome Atlas, a program that sequences the DNA inside tumors and makes it available on the Internet. It allows researchers at small labs to access the same data as major institutions. More recently, the Materials Genome Initiative was established to do much the same for manufacturing.

In fact, today there are a wide variety of ways for small businesses to access world-class scientific research. From government initiatives like the manufacturing hubs and Argonne Design Works to incubator, accelerator and partnership programs at major corporations, the opportunities are endless for those who are willing to explore and engage.

Indeed, many large firms that I've talked to have come to see themselves as essentially utility companies, providing fundamental technology and letting smaller firms and startups explore thousands of new business models.

Innovation Needs Exploration

Innovation has come to be seen as largely a matter of agility and adaptation. Small, nimble players can adapt to changing conditions much faster than industry giants. That gives them an advantage over large, bureaucratic firms in bringing new applications to market. When technologies are well understood, much of the value is generated through the interface with the end user.

Consider Steve Jobs's development of the iPod. Although he knew that his vision of "1000 songs in your pocket" was unachievable with available technology, he also knew it would only be a matter of time before someone developed a hard drive with the specifications he required. When one did, he pounced, building an amazing product and a great business.

He was able to do that for two reasons. First, the newer, more powerful hard drives worked exactly like the old ones and fit easily into Apple's design process. Second, because the technology was so well understood, the vendor had little ability to extract large margins, even for cutting-edge technology.

Yet as I explain in my book, Mapping Innovation, over the next few decades much of the value will shift back to fundamental technologies because they are not well understood, but will be essential for increasing the capability of products and services. They will require highly specialized expertise and will not fit so seamlessly into existing architectures. Rather than agility, exploration will emerge as a key competitive trait.

In short, the winners in this new era will not be those with a capacity to disrupt, but those that are willing to tackle grand challenges and probe new horizons.