A core human competency is the capacity to model outcomes. This predictive ability has contributed to our successful growth as a species and provided the stage from which we extrude our technologies. We observe our world, log our experiences, and use this information to envision & plan our future possibilities. In the rush into tomorrow we’ve deputized machines to assist in our scenario modeling as our plans grow ever greater in scope.
Today we have tremendous amounts of data available about any system we wish to model. Drive platters are bulging into the terabytes just to store the information gathered by sensors, services, and empowered humans. Whether we study business networks, financial models, or natural systems, our awareness of their complexity has grown enormously. These systems are far larger and more interconnected than we could have imagined even 20 years ago.
All systems are sets of nodes with properties & variables that govern their behavior, coupled together by relational rules governing their interaction. The more complex a system, the more unique nodes it contains and the more interconnections among them. Given the human constraint of holding only 6 or 7 unique objects in mind at any given time, we're clearly overwhelmed by even the relatively modest task of understanding, say, a mid-size business well enough to predict its future, especially when the business itself is a single node embedded in a much larger global socio-economic system. Imagine the difficulties climate modelers face trying to document global circulatory systems…
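The node-and-rules picture above can be made concrete with a minimal sketch. Everything here (the `Node` and `System` names, the supplier/retailer example, the growth rule) is invented for illustration, not a reference to any particular modeling tool:

```python
# A minimal sketch of a system as nodes with state, coupled by relational
# rules. All names here are illustrative assumptions.

class Node:
    def __init__(self, name, state):
        self.name = name
        self.state = state          # properties & variables governing behavior

class System:
    def __init__(self):
        self.nodes = {}
        self.rules = []             # relational rules coupling nodes together

    def add_node(self, node):
        self.nodes[node.name] = node

    def couple(self, src, dst, rule):
        # rule: callable mapping (source state, current target state) -> new target state
        self.rules.append((src, dst, rule))

    def step(self):
        # Apply every relational rule once. Note that the work grows with
        # the number of nodes and, faster still, the number of couplings.
        updates = {}
        for src, dst, rule in self.rules:
            updates[dst] = rule(self.nodes[src].state,
                                updates.get(dst, self.nodes[dst].state))
        for name, state in updates.items():
            self.nodes[name].state = state

# Hypothetical example: a supplier's output nudges a retailer's inventory.
s = System()
s.add_node(Node("supplier", 10.0))
s.add_node(Node("retailer", 0.0))
s.couple("supplier", "retailer", lambda src, dst: dst + 0.5 * src)
s.step()
```

Even in this toy form, the combinatorics are visible: each new node can couple to every existing node, so the rule list, and the cognitive load, grows much faster than the node count.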
One emerging strategy for modeling complex systems looks to software and the floating-point wonders enabled by Moore’s Law. Computers are phenomenally capable of managing the inconceivable number of operations necessary to begin modeling dynamic systems. Yet until very recently one needed to book time on a supercomputer cluster to run weather models or robust behavioral analysis, and even today’s bleeding-edge hardware strains under the weight of such complexity. Research institutions have pursued natural-systems modeling for some time, and the business world has been paying attention. SAP now offers modeling capabilities with its business intelligence ERP solutions, enabling executives to run scenarios and envision possible outcomes of strategic decisions. Oracle recently acquired Hyperion, adding “performance management” to its suite of BI tools. You can bet these technologies will work their way into government & geopolitical protocols, as well as social & personal behavioral engineering, as we increasingly track & model our lives.
Effectively, this pattern echoes the deeper shift from individual enterprise to collective collaboration: you can only model a complex system with another sufficiently complex system. However, even the most interesting algorithms are encumbered by the impositions of their logic: they can only be as creative as their authors wrote them to be. A second emerging strategy for modeling complex systems looks to deputize humans as processing nodes, crowdsourcing future possibilities across endlessly creative sets of minds. The Institute for the Future has taken this approach with its Signtific Lab and the Superstruct platform, leveraging the principles of gameplay to engage massive participation in envisioning scenarios.
The Superstruct games have drawn in thousands of players offering their thoughts & dreams of the future. Players become processing nodes for the chosen subject (e.g. “when augmented reality is everywhere”, or “when personal satellites are as easy to deploy as websites”), iterating across large sets of potential outcomes. From these inputs, patterns emerge showing trends with greater frequency & momentum among the collective. Perhaps even more interesting – and where the Superstruct method is more flexible than computational modeling – are the outliers that emerge from players. Many of the most compelling signals of the future are those that completely break from current patterns. Indeed, one of the most fundamental shifts in the prevailing global paradigm is that change is accelerating in ways we cannot yet imagine.
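The aggregation described above, surfacing high-frequency trends while preserving the rare outliers, can be sketched in a few lines. The player responses and the frequency cutoff below are invented for illustration; this is not how Superstruct actually processed its data:

```python
from collections import Counter

# Rough sketch of mining crowdsourced scenario inputs: frequent signals
# indicate trends with collective momentum, while one-off signals are
# candidate outliers worth a closer look. All data here is hypothetical.

responses = [
    "ubiquitous AR ads", "ubiquitous AR ads", "ubiquitous AR ads",
    "personal satellite swarms", "personal satellite swarms",
    "AR-mediated grief rituals",   # a lone, pattern-breaking signal
]

counts = Counter(responses)
trends   = [signal for signal, n in counts.items() if n >= 2]
outliers = [signal for signal, n in counts.items() if n == 1]
```

The design point is that a purely statistical pipeline would discard the `outliers` list as noise; the human-sensing approach treats exactly those entries as the most valuable output.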
These two approaches both consider complex systems & scenario modeling from architectures that are themselves complex, object-oriented systems. The programmatic approach brings heavyweight number-crunching to dynamic data streams, while the Superstructing approach offers wide-reaching creativity and human sensing. Augmenting each approach with the other will mark the next phase of the predictive analysis necessary to safely navigate civilization into the future. Envisioning these scenarios and building compelling narratives around them will inevitably draw them into becoming.
Our lives are more & more complex and our enterprises & collaborations are commonly reaching global scales. The need to effectively model & predict is a fundamental human trait, reinforced in the face of escalating complexity in a hyper-connected, Read-Write world.