The Experimental Enterprise

Being data-driven is a hot phrase, but what does it mean in practice? It’s not about finding analytical fairy dust that you can sprinkle to make everything better; it’s a deep-rooted approach to operations, innovation and competition. Creating an organization designed for data-informed decision-making, the experimental enterprise, is the key to competing in a world that is being rapidly digitized.

From The Web To The World

What makes large web companies such as Google so successful? It’s the data. More specifically, it’s the ability to iterate and adapt in response to learning from data. As this approach has been deployed at scale, we’ve seen a raft of technologies emerge from leading web companies as they build environments that constantly innovate and optimize. Cloud, DevOps, big data: all owe their genesis to agility at scale.

As the tide of digitization rises, every industry sector is poised to be transformed by the ability to experiment rapidly. Digital manufacturing such as 3D printing has commoditized prototyping and product development. Mobile has put every business in direct contact with its customers. The Internet of Things promises an unprecedented flood of data and the ability to manipulate processes and environments.

Digitization rapidly accelerates the pace of change in the marketplace, bringing with it new players who use newly available visibility and agility to disrupt the competitive landscape. The ability to experiment has become an imperative for today’s companies that wish to play a part in tomorrow’s market. Hence we also see the rise of big data: the way to make sense of, and take advantage of, digitally native environments is through data science, that potent combination of big data, analytics and software development.

Although digitization is unquestionably an issue for the entire company, it places a laser focus on the IT organization. For many years, IT has played the role of “faster paper”, automating existing processes and ultimately being treated as a cost center. In a digitized marketplace, IT must adapt to a role where it underpins strategic advantage. This is not an incremental change but a radical reassessment. No wonder it has been easier for startups, which work from a clean slate.

Yet it is possible to bring the adaptive qualities of Silicon Valley startups to the broader business world. That’s what I call building the “experimental enterprise”: a company whose infrastructure is designed to make experimentation possible and efficient.

Components of the Experimental Enterprise

There are three core principles of experimentation: experiments must be cheap, they must be fast, and they must not break what is already in production.

  • Experimentation must be cheap in order to de-risk failure. It’s from failure that we can learn, so cost and stigma must be removed from trying and failing.
  • Experimentation must be quick, so a feedback loop can be used to learn from the market and environment, and allow the business to respond rapidly. There’s little use in understanding your customers intimately if it takes you a year to make small product changes.
  • Experimentation mustn’t break the important production processes of a business. As we build the experimental enterprise, we’re designing for a world where innovation sits alongside production, not the luxury of the startup garage with no legacy.

Within IT, these principles can be used to build an experimental enterprise by employing six building blocks: the cloud, DevOps and open source; agile development and platforms; and data science. Each is worth extensive explanation in its own right, but here is a brief sketch of how they fit together to create an IT infrastructure ready to serve as a strategic advantage.

The foundation of the experimental enterprise focuses on making infrastructure readily accessible, allowing us to shortcut both physical constraints and process hurdles such as licensing and purchasing.

  • Cloud: grants agility by removing costly and prohibitive equipment purchase cycles. It’s easy to test ideas at scales hitherto inconceivable. It doesn’t matter overly whether we’re talking public or private cloud: the important thing is the ease with which resources can be summoned.
  • DevOps: the emergence of programmable infrastructure makes setting up new environments trivial. Without it, code releases are stuck on release cycles that run to months. With it, new ideas can flow freely from test to production, even multiple times daily. A minimal sketch of summoning an environment in code follows this list.
  • Open source: with commodity building blocks available for free, open source software provides a steady stream of giants on whose shoulders we can stand, letting us spend more time on particular business problems and less on reinventing wheels (or negotiating punitive contracts for them).
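
To make this concrete, here is a minimal sketch of what “summoning resources” looks like when cloud and programmable infrastructure are combined. It uses Python and boto3 against AWS EC2 purely as one provider among many; the machine image ID and tag values are placeholders, and credentials are assumed to be already configured.

    # A short-lived experiment environment, created and destroyed entirely in code.
    import boto3

    ec2 = boto3.resource("ec2")

    # Spin up a small instance tagged for one specific experiment.
    instances = ec2.create_instances(
        ImageId="ami-0abcdef1234567890",   # placeholder machine image
        InstanceType="t3.micro",
        MinCount=1,
        MaxCount=1,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "purpose", "Value": "pricing-page-experiment"}],
        }],
    )

    for instance in instances:
        instance.wait_until_running()
        print("experiment host ready:", instance.id)

    # ... run the experiment ...

    # When the experiment is finished, its environment disappears with it.
    for instance in instances:
        instance.terminate()

The point is not the particular API but the shape of the workflow: infrastructure appears when an experiment needs it and vanishes when the experiment ends, so the cost of trying something stays close to zero.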

How you build on the foundation is crucial: we need to support investigative work while also providing a solid layer for production.

  • Agile development: the agile process is particularly suited to environments with a high rate of innovation. The nature of experimentation is that you don’t know all the answers in advance and have to adapt. An agile development process lets you find the right path more quickly, which can then be stabilized for the future; most importantly, it builds learning into the planning process. No more “too big to fail”.
  • Platforms and APIs: how do you build production systems that still allow future innovation? By taking an API-oriented approach, you can make core processes available as a robust internal utility: building blocks for future construction (a minimal sketch follows this list). Likewise, the creation of a data platform helps avoid the risk of silos, where data is collected for a single use and is unavailable for anything else.
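
As an illustration of the API-oriented idea, the sketch below wraps a core process behind a small internal HTTP service. Flask is used only for brevity, and quote_price() is a hypothetical stand-in for whatever the real core process happens to be (in practice it might call into an ERP or billing system).

    # A core business process exposed as a small internal API, so later
    # experiments can reuse it without touching the underlying system.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def quote_price(sku: str, quantity: int) -> float:
        """Hypothetical core process; in reality this might wrap an ERP call."""
        base_prices = {"WIDGET-1": 9.99, "WIDGET-2": 24.50}
        return round(base_prices.get(sku, 0.0) * quantity, 2)

    @app.route("/v1/quote")
    def quote():
        sku = request.args.get("sku", "")
        quantity = int(request.args.get("quantity", "1"))
        return jsonify({"sku": sku,
                        "quantity": quantity,
                        "price": quote_price(sku, quantity)})

    if __name__ == "__main__":
        app.run(port=5000)

Once a process sits behind a versioned endpoint like /v1/quote, new experiments can compose it freely while the production system behind it stays untouched.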

Finally, we need a way to observe our experiments and respond to the changing environment. Data science is the lens through which we comprehend our endeavors, and a powerful tool for deciding what to do next.

  • Data science: properly deploying an experimental approach is not trivial, because an understanding of how to use experiments must pervade the organization. The lack of glamor in publishing negative results has biased our perception of the scientific method. A bad experiment will tell you nothing. A good experiment will most often reveal where you need to try again, rather than validate your schemes with glorious clarity; the sketch below shows what reading a result honestly can look like. Edison’s maxim that “genius is one percent inspiration and ninety-nine percent perspiration” remains true; perhaps we just let the algorithms do more of the sweating.
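
To show what that honest reading looks like in practice, here is a small Python sketch of evaluating a hypothetical A/B test. The conversion counts are invented for illustration, and the 0.05 threshold is a common convention rather than anything prescribed here.

    # Reading an experiment's result: most of the time it says "try again",
    # and that is still useful information.
    from scipy.stats import chi2_contingency

    # [converted, did not convert] for each variant -- invented numbers.
    control = [132, 4868]
    variant = [151, 4849]

    chi2, p_value, dof, expected = chi2_contingency([control, variant])

    if p_value < 0.05:
        print(f"Difference unlikely to be chance (p = {p_value:.3f}); "
              f"roll out and keep measuring.")
    else:
        print(f"No clear winner (p = {p_value:.3f}); "
              f"refine the idea and run the next experiment.")

Counting a null result as progress rather than failure is what keeps the feedback loop turning.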

Software is Eating the World: Don’t Let It Eat You

This article has given a very brief sketch of how an adaptive IT environment is constructed to support an experimental enterprise. The importance is clear—the inexorable march of digitization makes software everybody’s business. As Marc Andreessen put it, “software is eating the world.”

Running at the front of the digitization pack, the web giants have taken data-driven business to a new level, and as a result have inspired a generation of technologies and approaches that every organization can use. By embracing the core ideas of the experimental enterprise, businesses can adapt and thrive under challenging and rapidly changing market conditions.