
How Do We Define Innovation in an Era of Disruption?


Innovation is, arguably, not what it used to be. So how are we supposed to define it when every other technology seems to be considered disruptive?


On August 10th, I will be speaking at the Mendix on Tour event in Boston to share insights into what great innovators do differently. In my new book, Mapping Innovation, I deconstruct common innovation myths and explain how innovation in the digital age is different from what it was in previous generations. Simply put, technology has given us powerful new tools, and we need to learn how to use them effectively.

As an introduction to my talk at Mendix on Tour, here is an excerpt from my book about defining innovation.

It seems that any time we try to understand an innovation through events, the story only gets more tangled and bewildering. And it doesn’t get any clearer if we look at the innovators themselves. Some were highly trained PhDs, but others were college dropouts. Some were introverts. Others were extroverts. Some worked for the government, others in industry. Some worked in groups, but others largely alone.


Yet that brings us to an even more important question: How should we pursue innovation? Some companies, like IBM, invest heavily in basic research and always seem to be able to invent new businesses to replace the old ones that inevitably run out of steam. Others, like Procter & Gamble, are able to effectively partner with researchers and engineers outside their organizations to develop billion-dollar products. Apple became the world’s most valuable company by limiting the number of products it sells and relentlessly focusing on the end user to make things that are “insanely great.” Google continuously experiments to develop a seemingly endless stream of new innovations. Which path should you pursue?

Fortunately, there is an answer, and it starts with asking the right questions to define the problems you seek to solve and map the innovation space. From there, it is mostly a matter of choosing the right tools for the right jobs to develop an innovation playbook that will lead to success in the marketplace.

 

What Is Innovation?

In The Little Black Book of Innovation, Scott Anthony defines innovation as “something different that has impact.” That seems like a reasonable definition. After all, to innovate we need to come up with something different—if not a completely new invention, then a process for using an existing technology in a new way. That would cover significant technologies, like the Internet and the World Wide Web, while also making room for services like Uber and Facebook that harness those earlier inventions for new purposes.

And clearly, innovation needs to have an impact. Yet how are we to judge that? Did Engelbart’s “Mother of All Demos” have an impact in 1968? Maybe it did on the people who were there to witness it, but few others. But Anthony insists that innovations need to have a measurable impact, which probably didn’t happen until 1984, with the launch of the Macintosh. So does that mean that Steve Jobs was an innovator and Engelbart was not? That certainly doesn’t sound right. Maybe the Macintosh was the impact of “The Mother of All Demos.” But that would mean that Engelbart didn’t become an innovator until 16 years after he completed the work and that, in fact, Steve Jobs is responsible for making Engelbart’s work important and not the other way around. That doesn’t sound right either.

This is not, to be sure, a new debate, but one that’s been raging for over a century. In 1939, Abraham Flexner published an article in Harper’s Magazine entitled “The Usefulness of Useless Knowledge,” in which he recounted a conversation he had with the great industrialist George Eastman. He asked Eastman who he thought was the man most useful to science, to which Eastman replied that he felt it was Marconi, the inventor of radio. Flexner then argued that Marconi was inevitable, given the work of Maxwell and Hertz, who discovered the basic principles that made radio possible. Further, he argued that these men were driven not by practicality—or as Anthony would put it, by the impact of their work—but merely by curiosity.

Flexner went on to describe an institution he was building in Princeton, New Jersey, called the Institute for Advanced Study, in which minds like John von Neumann, Albert Einstein, Kurt Gödel, and many others could pursue any subject they liked in any manner they chose, without any responsibility to teach or publish or show any impact at all from their work.

It was there that von Neumann developed a computer with a revolutionary new architecture that could store programs. He devised his new machine using other ideas once thought useless, like the vacuum tubes invented by Vladimir Zworykin in the 1920s. This design, now known as the von Neumann architecture, was open sourced and led to the development of the first commercial computers that were sold to businesses. Just about every computing device in the world is still organized according to the scheme that von Neumann came up with in 1945.
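To make the stored-program idea concrete, here is a minimal sketch of a toy von Neumann machine in Python. The three-operation instruction set (LOAD, ADD, HALT) is hypothetical, invented purely for illustration; the point it demonstrates is the one that defined von Neumann's design, namely that instructions and data occupy the same memory.

```python
# A toy von Neumann machine: instructions and data share one memory.
# The three-op instruction set (LOAD, ADD, HALT) is hypothetical,
# invented here purely to illustrate the stored-program idea.

memory = [
    ("LOAD", 4),    # address 0: copy the value at address 4 into the accumulator
    ("ADD", 5),     # address 1: add the value at address 5 to the accumulator
    ("HALT", None), # address 2: stop
    None,           # address 3: unused
    2,              # address 4: data
    3,              # address 5: data
]

accumulator = 0
pc = 0  # program counter: index of the next memory cell to execute

while True:
    op, operand = memory[pc]  # fetch an instruction from the same memory as the data
    pc += 1
    if op == "LOAD":
        accumulator = memory[operand]
    elif op == "ADD":
        accumulator += memory[operand]
    elif op == "HALT":
        break

print(accumulator)  # prints 5: the program and its data came from one shared memory
```

Because the program is just data in memory, it can be loaded, copied, or even rewritten like any other value, which is the property that made general-purpose, reprogrammable computers practical.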

Today, hundreds of scholars come to the Institute for Advanced Study each year to work on problems like string theory and abstract geometry. Will there ever be a measurable impact from their work? We won’t know for decades, but clearly there is an incredible amount of innovative thinking about some very tough problems going on there.


So, I think a better definition for innovation would be “a novel solution to an important problem.” But that leads to the question: Important to whom? Well, first to a particular industry or field. Engelbart’s work was innovative because it was both new and considered incredibly important to the field of computer science, for which it created an entirely new paradigm. Also, innovations are important to the next innovator. Engelbart made Taylor and Kay’s work on the Alto possible, which made Steve Jobs’s work on the Macintosh possible, which in turn helped unleash the creativity of millions of others.

That’s why it’s so hard to understand where innovation begins and ends. The truth is that any significant innovation involves an incredible diversity of problems that need to be solved, from theoretical and engineering challenges to manufacturing and distribution hurdles. There is no silver bullet, and no one person—nor even a single organization—can provide all the answers alone.

Still—and this is a crucial point—we all must pursue our own path to innovation alone. We have to choose what problems we intend to solve, whom we will work with, the manner in which we will work with them, and how we will bring our solutions to market. Those are decisions that we need to make, and no one else can do it for us.

This book will show you how to map the innovation space in order to make those decisions in a more rational, informed manner. It will also help you build a strategy around those decisions that can help you win in the marketplace.

A New Era of Innovation

As we have seen, innovation is far more difficult and complex than most people give it credit for. It takes more than a single big idea to change the world, and it can take decades after the initial breakthroughs for the true impact of an idea to become clear.

Still, in some ways we’ve had it easy. Our basic computer architecture has not changed since John von Neumann created it in 1945. Moore’s Law, the regular doubling of chip performance that Gordon Moore postulated in 1965, has effectively given innovators a road map for developing new technology. Since the 1970s, engineers have depended on it to tell them how to focus their efforts. Other key technologies, such as the lithium-ion batteries that have made mobile devices predictably smaller and more powerful with each generation, have been in use since 1991. Over the last quarter century, these technologies have dramatically improved, but the basic paradigm of their design has not changed in any significant way.
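To see why that road map was so useful, consider a back-of-the-envelope projection. The sketch below (in Python) assumes the commonly cited two-year doubling period and the Intel 4004's roughly 2,300 transistors in 1971 as a starting point; both are illustrative approximations, not figures from the text.

```python
# A back-of-the-envelope Moore's Law projection. The two-year doubling
# period and the 1971 starting point (the Intel 4004's ~2,300 transistors)
# are commonly cited approximations used here only for illustration.

def projected_transistors(base_count, base_year, target_year, doubling_period=2.0):
    doublings = (target_year - base_year) / doubling_period
    return base_count * 2 ** doublings

for year in (1981, 1991, 2001, 2011):
    print(year, f"{projected_transistors(2300, 1971, year):,.0f}")
```

Projected this way, the curve roughly tracks the transistor counts of actual flagship chips across four decades, which is exactly what let engineers plan products around capabilities that did not yet exist.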


The next decade or two, however, will look more like the fifties and sixties than it will the nineties or the aughts. We’ll essentially be starting over. Moore’s Law, that trusty old paradigm that we’ve come to depend on, will likely come to an end around the year 2020, as transistors become so small that quantum effects between molecules will cause them to malfunction. Lithium-ion batteries will hit theoretical limits soon after that. They will be replaced by fundamentally new technologies, like quantum computing, neuromorphic chips, and new materials for energy storage that nobody really knows how to work with yet.

At the same time, new fields such as genomics, nanotechnology, and robotics are just beginning to hit their stride, leading to revolutionary new cures, advanced materials, and completely new ways to produce products. Artificial intelligence services like Apple’s Siri and Google Now will become thousands of times more powerful and change the way we work and collaborate—with machines as well as each other. I’ve talked to many of the people developing these revolutionary technologies and, despite the amazing potential of the breakthroughs, each time I’ve been struck by how much work there is still to do. We’re just beginning to scratch the surface.


Over the past 25 years, we’ve struggled to keep up with the pace of change. But over the next few decades, we will struggle to even understand the nature of change as fundamentally new technologies begin to influence the way we work, live, and strive to innovate. It will no longer be enough to simply move fast; we will have to develop a clear sense of where we’re going, how we intend to get there, and what role we will be able to play. We’ll need, in other words, to learn how to map innovation.

Download my full book, Mapping Innovation, for a better understanding of innovation and valuable tools to help you frame the problems that are important to you.


Topics: innovation, IoT, industrial internet, digital disruption

Published at DZone with permission of Greg Satell, DZone MVB. See the original article here.

