For the past year or so I’ve been tracking trends in analytics, sensing systems, community feedback services, and visualization & modeling tools as they might be applied to intentional civic design. IBM’s Smarter Planet initiative is both a fine example and a major signal of the move towards a deeper understanding of natural & human systems and the technologies that enable us to model and reprogram our world. What this means for my local community is that we’re increasingly able to collate run-time data about our city that can be scraped, sorted, analyzed, visualized, and used to inform behavior, policy, planning, and optimization. Services like EveryBlock and San Francisco CrimeSpotting, as well as the data visualization work carried out by Stamen Design, MIT’s CitySENSE, and many others, illustrate some of the ways local data can be harvested, parsed, & visualized to reveal valuable behavioral patterns in a city.
Such services have mostly been cobbled together using Google Maps and limited access to data feeds released by civic bodies, but there is a growing trend for city governments to mandate the standardization of their metrics into structured data streams (XML, RDF) and to aggregate & publish these feeds to the public. Gavin Newsom, the mayor of San Francisco, announced today the DataSF project, which aims to create a “clearinghouse of structured, raw & machine-readable gov data”. This commitment by such a major & influential city is a huge step toward legitimizing the value of open data and engaging developers & innovators to build better services for optimizing civic functioning. It is this intersection of government openness & data standardization that underlies the Gov 2.0 movement and reinforces the emerging metaphor of the City as Platform.
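To make the idea concrete: once a city publishes its metrics as structured, machine-readable feeds, a developer needs only a few lines of code to extract patterns from them. The sketch below is purely illustrative — the feed schema, element names, and sample records are hypothetical, not DataSF’s actual formats — but it shows how little stands between an open XML feed and useful analysis:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical sample of a machine-readable civic feed of service requests.
# A real project like DataSF would define its own schema; this one is invented
# purely to illustrate the parsing step.
FEED = """
<requests>
  <request><type>pothole</type><district>Mission</district></request>
  <request><type>pothole</type><district>Sunset</district></request>
  <request><type>graffiti</type><district>Mission</district></request>
</requests>
"""

def requests_by_district(xml_text):
    """Tally service requests per district from a structured XML feed."""
    root = ET.fromstring(xml_text)
    return Counter(r.findtext("district") for r in root.iter("request"))

print(requests_by_district(FEED))
# e.g. Counter({'Mission': 2, 'Sunset': 1})
```

The same handful of lines works whether the feed holds three records or three million — which is exactly why standardized publication, rather than clever scraping, is the real enabler here.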
In reviewing the General Plan 2030 [PDF] of my own city, Santa Cruz, CA, I see the standard (and important!) list of local resources, community planning & preservation concerns, issues regarding land use & economic development, etc… but what’s missing is any reference to the increasingly large and important data shadow cast by our civic, ecological, financial, and social structures. I think this is typical of most cities, which treat both event-driven data and cloud-mediated activities as tangential byproducts of primary traditional institutions. Businesses have sales metrics, the Department of Public Works has road repair updates, and farmers have crop reports. But in order for a city to become a platform, and for civic planning to truly step into the Information Age, the importance of its data shadow must be addressed as a fundamentally critical component of its overall functionality. Just as water resources, agriculture, and emergency services are important vertical columns in the map of city planning, so too is the dynamic body of information produced and mediated by local activities. Civic planning must consider how this data can be leveraged to better understand and optimize the vibrancy and resilience of the community.
There are many open source tools that enable the creation of local mash-ups and visualizations, but the fundamental roadblock impeding such progress is the missing mandate for civic bodies to convert their data into open & structured standards like XML, KML, and RDF. Just as companies invest increasingly in business intelligence suites, executive dashboards, and analytics platforms in order to better understand their operations and model future scenarios, so too must city planners underwrite their IT departments with the funds necessary to standardize & open their data, so that the behaviors & patterns of the city may better reveal themselves to analysis. It is a meager investment that will pay off immeasurably within just a few years. Implementing such a strategy will bring a tremendous amount of transparency to civic operational processes and stimulate a rich ecology of innovation, while engaging the community directly in the enterprise of building more efficient local systems.
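The conversion itself is not exotic work. As a sketch of what an IT department’s first step might look like — the department export, field names, and locations below are all hypothetical — here is a flat CSV of point data re-published as KML, the format Google Maps and most mapping mash-ups consume directly:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical flat export from a city department; the field names and
# coordinates are illustrative, not real municipal data.
CSV_DATA = """name,lon,lat
Harvey West Park,-122.038,36.985
Main Library,-122.026,36.972
"""

KML_NS = "http://www.opengis.net/kml/2.2"

def csv_to_kml(csv_text):
    """Re-publish a flat CSV of named point locations as a KML document."""
    ET.register_namespace("", KML_NS)  # serialize KML as the default namespace
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")
    for row in csv.DictReader(io.StringIO(csv_text)):
        placemark = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
        ET.SubElement(placemark, f"{{{KML_NS}}}name").text = row["name"]
        point = ET.SubElement(placemark, f"{{{KML_NS}}}Point")
        coords = ET.SubElement(point, f"{{{KML_NS}}}coordinates")
        # KML coordinates are lon,lat,altitude
        coords.text = f'{row["lon"]},{row["lat"]},0'
    return ET.tostring(kml, encoding="unicode")

print(csv_to_kml(CSV_DATA))
```

A script like this, run against each department’s existing exports, is the kind of meager investment described above — and its output is immediately usable by the mash-up and visualization tools already in the wild.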