Category: tech analysis

Biz Notes: Adobe/Omniture, Google Wave

Adobe’s acquisition of Omniture has overshadowed another recent purchase. In August of this year, Adobe quietly acquired Business Catalyst, a CRM & web-hosting company. With this acquisition Adobe picked up a turn-key solution for clients to publish, host, and manage business web sites. This suggests that Adobe is moving into territory similar to Amazon’s EC2 and other business cloud hosts.

Enter Omniture. Putting an analytics infrastructure behind Flash properties is a no-brainer, though as James Governor notes, it’s unclear how Adobe analytics would be any better than Google’s. The staggering Omniture client list aside (Apple, IBM, MSFT, etc.), Adobe could bring its analytics suite to Business Catalyst clients, thereby building an entire publishing, hosting, analytics, & CRM ecosystem. You buy the Creative Suite, publish through Business Catalyst, host on Adobe servers, and reap the user analytics from Omniture. And Adobe grabs a bit of cash at every step. (If you want to get really crazy, think about how LiveCycle & Flash might fit into this equation…)

Now, about Google Wave. Disclosure: I haven’t gotten an invite yet. But I’ve been doing my share of research since the announcement earlier this year. The press following their beta is mostly focusing on how competitive Wave is with email & IM, or how weak the typically-Google user experience is. Although Google has framed the whole offering as a new communications tool, I think this generalization is perhaps a deliberate obfuscation leading people to think it’s only about evolving email.

The most interesting thing to me about Wave is that it combines real-time collaboration with a context-aware architecture. The user’s experience depends on the context of their content, their role, and their transactions. I think most commentary has missed the point that Wave is the first real context-aware application framework. If we take the term “communication” and consider it more as an event protocol, Wave allows all components of a contextual transaction to communicate with each other. In other words, this isn’t just real-time collaboration for users. It’s real-time collaboration for machines. My sense is that Wave is a proof of concept, and that its core functionality will be a large part of Chrome OS, underlying all transactional processes. If this is the case, Chrome OS could be a truly revolutionary cloud-aware contextual operating system.
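To make the “collaboration for machines” idea concrete, here’s a toy sketch of participants, human and automated, subscribed to the same event stream. This is purely illustrative; the `Wave` and `SpellBot` names and shapes are my own invention, not the actual Wave protocol:

```python
class Wave:
    """A toy event bus: every participant, human or machine, sees every event."""
    def __init__(self):
        self.participants = []
        self.events = []

    def join(self, participant):
        self.participants.append(participant)

    def publish(self, author, kind, payload):
        event = {"author": author, "kind": kind, "payload": payload}
        self.events.append(event)
        for p in self.participants:  # machines react in real time, too
            p.on_event(event)


class SpellBot:
    """A hypothetical machine participant that reacts to content it has context for."""
    CORRECTIONS = {"teh": "the"}

    def __init__(self, wave):
        self.wave = wave

    def on_event(self, event):
        fix = self.CORRECTIONS.get(event["payload"])
        if event["kind"] == "text" and fix:
            # Append directly rather than re-publish, to avoid re-notifying everyone.
            self.wave.events.append(
                {"author": "spellbot", "kind": "correction", "payload": fix})
```

The point is that the bot receives the same real-time events a human client would, and contributes back into the shared context.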

My two bits for a Sunday…

Brain-Computer Interface

In my present tenure as a Visiting Researcher at the Institute for the Future I’ve been posting a lot of Signals pertinent to Brain-Computer Interface over at the Signtific open source research site. My Signals are listed under the tag “ProgrammableEverything”.

Check ‘em out if you’re interested in the fascinating & accelerating field of BCI. Also feel free to add your own Signals you see in the world or are engaging in your professional research.


Thoughts on Twitter’s Internal Strategy

The flurry of news surrounding the theft and publication of internal Twitter documents will inevitably engender even more goodwill for the world’s favorite social messaging platform. No betrayal of Twitter strategy short of implicating them in slapping babies with puppies can dent their supernova ascent into global stardom. Their current soap opera seems to bring them more sympathy than concern over their strategic objectives. In all likelihood, the player with the most to lose is Michael Arrington, who’s managed to come off as a bit of a bully barely restrained by his own self-interest in securing future access to Twitter insiders.

The most interesting bits are related to features. The revelations concerning Hosebird, Tweet Rank, Google Syndication, and a “secret project with the X-Box” do more to allay concerns over Twitter’s monetization strategy than reveal any lack of ideas or sinister motivations. Their goal of 1 billion users is handily sugar-coated by the suggestion that they are building a global nervous system, drafting on the oft-quoted predictions of the emergent Global Brain. If anything, these leaks, like the way Apple deftly foreshadows its own “super secret” Skunkworks product releases, will add even more drool to the salivations of the user base, the dev ecology, and 3rd-party interests eager to have more access to the Starchild. In fact, Arrington may be doing Twitter a huge favor.

Disclosures of ongoing talks with Google, Microsoft, Yahoo, Amazon, et al., while not especially new or surprising, underwrite the seriousness of Twitter’s enterprise and reinforce the fact that, aside from the wall of hype & buzz permeating the media, Twitter is one of the Big Boys now. If not yet in valuation, then certainly in its seriousness and capacity. Remember when Google was just this new, simple searchbar competing with WebCrawler & HotBot & Lycos? Twitter’s ability to keep the likes of Diddy and Marissa Mayer at arm’s length underscores the strength of their organization and the confidence they have in their status and strategy.

Another tell lies in the notes about Twitter’s future with respect to possible acquisitions. A line within the context of the failed Facebook acquisition and attempts by other would-be suitors states “it can give us understanding of what we are worth”. This is like going on job interviews so your current boss will promote you. By courting acquisitions Twitter gets hard numbers to reinforce what their real value is in the competitive marketplace. The inevitable press surrounding these offers gives them huge leverage for partnerships, funding, free press, and growth. Conversely, they admit that they may not be able to meet the scaling requirements of their exponential growth. These two statements together defend Twitter’s authority and secure its place firmly in the driver’s seat should it enter into any merger or acquisition with larger suitors.

Of course, search is the big deal here. Twitter must either fiercely defend its data and analytics against Google or cut a tight deal that serves its interests effectively without diluting its brand. As they admit, Google can do search much better, but Twitter controls the stream. Clearly, Google is afraid of losing ad share to Twitter, yet is salivating at the chance to sink its searchy incisors into Twitter’s data as deeply as possible. Indeed, “Twitter the product is a vehicle for Twitter Search” and “Twitter is an economy of information”.

Ironically or not, the release of these internal documents and the ensuing public discussion of their contents will empower the Twitter community even more to be the stewards of their pet. Recall that Twitter’s genesis was far simpler and less ambitious. As the user base swelled and began to co-opt its use, pulling it far beyond a fun SMS “What Are You Doing” billboard, the company had to quickly re-architect its infrastructure to support a global messaging system. Recent challenges brought by Twitter’s utility as a disaster-reporting tool, an emergency-service coordination network, and a significant threat to oppressive regimes further reinforce the sense that the service only partly belongs to its creators. These disclosures are not only harmless to Twitter’s goals, perhaps even furthering them; they are appropriate to the era of transparency and connectivity that Twitter has helped create.

To invoke the Global Brain myself, Twitter will get its 1 billion users and more (unless they piss off Goldman Suchs), and the weight of these sources and the connections they are weaving will continue to re-engineer the collective experience of information and sharing that humanity is engaged in. In the sea-change waves of the new Information Economy, amid all the challenges that the democratized landscape of free services poses to existing monetization strategies, something new is emerging, and it’s less and less concerned with funding and valuation and far more invested in utility and humanity.

Companies to Watch: IBM & SAP

In a time of monumental change it’s important to look at how the big players are adapting. Their moves are typically the most heavily researched and financed attempts at divining the underlying currents and capitalizing on the shifting technological marketplace. It’s especially interesting when conservative tech stalwarts like IBM & SAP suddenly start looking cool.

Both IBM & SAP are moving quickly into 3 of the most powerful trends in computing, each of which is driven by the enormous amounts of data being captured across all domains: business intelligence & modeling, stream computing, and sustainable systems analysis.

IBM’s new initiative A Smarter Planet states succinctly, “the planet will be instrumented, interconnected, intelligent.” This is a powerful statement from one of the largest and most technologically advanced companies in the world. They’re not just talking about business. IBM CEO Sam Palmisano speaks to the really large-scale planetary challenges in creating smart infrastructures for energy, water, transport, and data.

A key component is the recently announced System S project for supporting so-called Stream Computing.

System S is designed to perform real-time analytics using high-throughput data streams… to host applications that turn heterogeneous data streams into actionable intelligence… System S applications are able to take unstructured raw data and process it in real time.

“This is about what’s going to happen,” explains [director of high performance stream computing at IBM] Nagui Halim. “The thesis is that there are many signals that foreshadow what will occur if we have a system that is smart enough to pick them up and understand them. We tend to think it’s impossible to predict what’s going to happen; and in many cases it is. But in other cases there is a lot of antecedent information in the environment that strongly indicates what’s likely to be occurring in the future.”

With enough data you can start to create connections and patterns. With patterns you can derive meaning and ultimately be better enabled to make more accurate predictions. Since humans aren’t very well-adapted to processing large data sets, we build tools to handle the heavy lifting. Whether Wall Street indexes, ERP scenarios, government accounting, energy grid analysis, or dynamic climate models, serious hardware & software is required to process operational data into meaningful determinations and prescriptions.
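As a toy illustration of that heavy lifting (assuming nothing about System S internals), here’s a sliding-window detector that flags readings spiking well beyond the recent average, the kind of antecedent signal Halim describes:

```python
from collections import deque

def detect_signals(stream, window=5, threshold=2.0):
    """Flag readings that jump well beyond the recent moving average.

    A toy stand-in for stream analytics: keep a sliding window of
    recent values and emit (index, value) whenever a new reading
    exceeds the window mean by `threshold` times.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(stream):
        if len(recent) == window:
            mean = sum(recent) / window
            if mean and value > threshold * mean:
                alerts.append((i, value))
        recent.append(value)
    return alerts
```

Fed the stream `[10, 11, 9, 10, 10, 50, 10]`, it flags the 50 at index 5. Real systems operate on heterogeneous, unstructured feeds at enormous scale, but the principle is the same: patterns in the antecedent data drive the prediction.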

SAP has introduced the Clear New World initiative built on their Business Objects service architecture. Again, the notion is that businesses, enterprises, and even governments can run more efficiently when there is a free-flow of data and a suite of integrated services to crunch and render the info into meaningful contexts.

It’s time to build greater visibility, transparency, and accountability into the way your organization works. Because being clear allows timely and relevant information to be available when and where it is needed. Clarity demonstrates that your company is willing and able to stay accountable to key stakeholders. Clarity helps call out inefficiencies, reveal your best customers, create credible sustainability, and give your business the flexibility needed to anticipate and respond to a complex, ever-changing, global environment.

[See James Governor’s recent post for more on how SAP & IBM are tackling enterprise sustainability.]

Note the statements about accountability to stakeholders & creating credible sustainability. Clear data & clear reporting. Now consider the latest announcement about SAP for Public Sector “to support the management and reporting of economic stimulus funds”. As a plugin to their Business Objects suite, this utility drafts on the trends towards open accountability and government transparency, often termed Gov 2.0, to provide support for determining just how stimulus money is being spent.

Both IBM and SAP have the power to execute effectively on these strategies, though it remains to be seen how enterprise spending will move to implement these services, or whether the companies will offer flexible licensing to LLCs working on the really challenging non-profit global issues. Meanwhile, SAP has suffered usability problems for years, and their core object architecture is old and slow. They will need more than just branding and plugins to make a more transparent world.

Finally, it’s worth noting the branding for these projects. “A Smarter Planet” is a global posture indicating agency and identity on a planetary scale. This hints at the real deep trend across the human species towards a global sense of purpose and strategy. “Clear New World” acknowledges both the occlusions under which human endeavor has marched thus far and the great clarity of visibility we’re now gaining across all domains & enterprises, while admitting that indeed everything is changing and we are moving into a New World. The technology is stepping forward to help us more effectively manage the present and navigate into the unknown future. But of course like all foresight, it remains to be seen whether individuals will choose to act appropriately with the knowledge they come to possess…

Dematerialize: Changing the Ways We Relate to Product & Ownership

There is a large and fast-moving shift occurring within the landscape of tools & technology. Increasingly, products are dematerializing and being re-engineered as services. This shift is being driven in part by rising production costs and an increasing awareness of the very real environmental impacts of producing durable goods and managing their end-of-life downstreaming into landfills. It is also a response to the rapid digitization of culture pushing many consumables into less tangible data transactions, often mediated through increasingly fetishized devices. Thus, content is becoming disengaged from fixed carriers like disk media and paper and is, instead, flowing through networks and devices.

Perhaps the most iconic and revolutionary example of this trend is the pairing of Apple’s iPod with its iTunes service. For the past 20 years, millions upon millions of CDs, DVDs, cases, and printed inserts have been consuming resources, fixing materials into unrecoverable or downcycled hard media and filling landfills. Apple has fundamentally rewritten this paradigm by dematerializing the content – music & movies – and connecting it directly with the player. The materials & energetic overhead has been consolidated into a (hopefully) more durable device, freeing the high-volume transactional content from such a large resource burden. While there are manufacturing and reclamation costs associated with the device, the impact is lessened by decoupling those costs from the content.

There has since been an ever-increasing movement away from products towards services, as easily illustrated by the rise of online services within the Web 2.0 age. Digital cameras are another example that, like the iPod, decoupled the relentless production of content from a toxic & non-renewable material carrier – in this case, film & print paper. Likewise, print production itself has increasingly moved away from expensive, wasteful, and toxic inks & papers and has re-targeted to the ubiquity of screens. More & more “print” content – once the domain of magazines, newspapers, brochures, and advertising schwag – has moved away from hard carriers. Again, the pattern shows content being released from material substrates to move effortlessly across networks and devices.

There are a few interesting effects of this trend. Of course, piracy of content becomes considerably easier and cheaper. Content can be copied and moved across networks effortlessly, and copy protection is just another set of bits to be cracked. As Stewart Brand keenly observed, “information wants to be free” and the rapid digitization of culture has radically reinforced this proposition forcing every pre-web industry to completely re-evaluate their business models. Conversely, the bitifying of content and the democratization of powerful desktop authoring tools has empowered and emboldened the historical allure of remixing and massively reinvigorated our cultural creativity. Ironically, in an age that has enabled so many to create so much, the notion of intellectual property has less merit now than ever. When your content contains bits from 10 other pieces of content, who actually owns it? As has been noted by many authors & analysts, the genie is out of the bottle.

But perhaps more interesting are the behavioral and psychological shifts happening in response to these trends. As stuff dematerializes into intangible bits, the fact that we can no longer touch product subtly undermines the very notion of ownership. We begin to abstract our relationship to stuff as something we interact with more than possess. While this is potentially liberating it also makes it easier for content providers to assert total ownership in perpetuity: you’re merely borrowing content through a service provided by the “real” owner. Without direct ownership, are we protected and do we still have the right to share?

With respect to content, personal ownership has shifted to the device – the increasingly fetishized container through which content is constantly flowing. Our smart phones are awesomely empowering extensions of our selves, conferring unimaginable abilities to their owner. The simplest & most intuitive of these devices become second nature, third-hand extensions of our bodies, effortlessly wiring us to each other, to content, and vast stores of knowledge. Of course we fetishize such objects and of course we’ve grown dependent upon them.

Industrialization has regrettably optimized its business model through planned obsolescence, with much hard product designed to time out and push an upsell to the next model. No doubt the devices we now rely so heavily upon have their own built-in failings, whether intentional or simply as a byproduct of profit margins that incentivize investing in no more quality than is absolutely necessary. So have the benefits of dematerializing content from cheap carriers been negated by the resource requirements and inevitable breakdown of our devices? Has the energetic and environmental impact spared by going paperless been doubled by the sheer overhead of manufacturing and running vast global server farms? Any real evaluation of the dematerialization of products into services must consider the very large impact of the infrastructure supporting it.

Nevertheless, this is where we’re headed. Mobiles will get smarter & prettier and will be increasingly targeted for content and transient marketing. Screens will continue to multiply at an exponential pace finding their way into all aspects of our lives. Hardware manufacturers will be increasingly beholden to both international standards committees and shareholders to account for the carbon and environmental impacts of their processes. And the notion of object and ownership will continue to be challenged in ways yet unknowable.

[Acknowledgements to Gavin Starks of AMEE, Tish Shute at Ugotrade, and Lane Becker and Thor Muller of Get Satisfaction.]

Gavin Starks: Your Energy ID & Why You Should Care [E-Tech 2009 Notes]

These are my rough notes from last week’s E-Tech talk by Gavin Starks of AMEE:

We are hitting peaks and resource limitations. 5 potential futures: 1) Technology innovation; salvation through technology but increasing reliance on it. 2) Services, not products; moving from car to public transport; carbon costs encourage services over hard products. 3) Reframing value; what is progress? what is value? Meaningful jobs, stronger communities cultivated. 4) Rationing; things have gone too far, we need controls. Cap & trade. Sectors take control of citizens’ lives. Resource/H2O shortages lead to migrations and war. 5) War. Conflict over limited resources; divided communities; tribalism & territoriality. Quotes James Lovelock: “90% population cull in this century”.

Hansen: “Caps won’t work – we need carbon tax.” Are we moving to post-capitalist society? Triple-bottom-line accounting: fiscal, social, environmental. McKinsey: “Capitalism is a multi-generational Ponzi scheme.” Need carbon tax. Carbon will be part of the US budget by 2011. Federal cap & trade. Business-science-policy-technology: system of interconnects. Lots of data coming. EU policy stack being implemented. Anyone using over 6GWh or more than £500k/yr must disclose energy use. Coming to US. Carbon reduction commitment, energy efficiency, renewable obligations. “Moving to an economic age where we need to start obeying the 1st law of thermodynamics” [energy can neither be created nor destroyed]. Unpacking huge amounts of data. 20 largest cities use 75% of global energy. Future: many smaller cities. Pop density: cities are your country. Many local points of production & supply, networked together. No time left for closed systems. I/O models of everything. Democratization of energy. Smart grids. Microgeneration.

Data: citizens & things, private sector. public sector, cities, countries, earth. Data: purchases, materials, building, travel & transport, fuel & water & waste. Eg. SAP: 70% footprint is travel. Data is dangerous to business. Smart meters, eg fridge monitor yields whole layer of info. Every device will have accessible, identifiable profiles from data reporting. Energy Identity: Digital embodiment of your physical consumption. How to protect your digital identity? Now: everyone else assumes they own your data (utilities, suppliers, banks, retailers, etc). You own your data & can share or license it to interested parties. Collaboration networks are to business as social networks are to consumers. Emerging ecosystems, eg Planetary Skin, Oracle, IBM, Google & GE. Info about energy use; new grid; data on use belongs to you in standard, non-proprietary format. Lee: “Unlock all your raw data.” SW/SaaS/Systems integration. [tie into ERP] Eg Sun – Open Eco. Trading: Misys, EarthCP, Sandbag. Meters: Carbonmetrics, ISE. Consultancies: EQ2, NaturalLogic, CarbonVision, Greenmonk. Need transformational shift towards re-engineering behavior & production. Recession has so far had little input on carbon use.

To Do: 1) Give everything an energy ID; 2) Build SmartGrid behavior into everything; 3) Measure & map all of it; 4) Lobby for & create open standards; 5) Sort out data ownership now.
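As a sketch of what item 1 might mean in practice, here’s a hypothetical open-format energy identity record. The `energy_id` function and its field names are my own invention for illustration, not AMEE’s actual schema:

```python
import json

def energy_id(owner, device, readings_kwh):
    """Build a hypothetical, open-format energy identity record.

    `readings_kwh` is a list of (timestamp, kWh) samples. The record is
    plain JSON (a standard, non-proprietary format) so the owner can
    share or license it to interested parties on their own terms.
    """
    total = sum(kwh for _, kwh in readings_kwh)
    return json.dumps({
        "owner": owner,          # the data belongs to the owner, per the talk
        "device": device,
        "samples": len(readings_kwh),
        "total_kwh": round(total, 3),
    }, sort_keys=True)
```

A smart meter feeding records like this per device is exactly the “whole layer of info” the fridge-monitor example points at.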

E-Tech 2009 Twitter Round-up

Here’s a selection of my tweets from the O’Reilly Emerging Technology Conference this past week. These are the ones I think grab the juicy nuggets from the speakers’ presentations. [In temporal order, with the earliest (i.e. Monday eve) listed first.]

Tim O’Reilly: “We have greatness but have wasted it on so much.”
We have an unprecedented opportunity to build a digital commonwealth. #etech
Work on something that matters to you more than money. This is a robust strategy. #etech
Niall Kennedy: Energy Star rating for web apps? Thinking of clouds & programming like tuning a car for better gas mileage. #etech
Cloud computing: no reasonable expectation of privacy when data is not in your hands. Not protected by 4th amendment. #etech
Alex Steffen: Problems with water supply are based in part on our lack of beavers. #etech
Social media for human rights. #etech
Gavin Starks – Your Energy Identity & Why You Should Care. see #etech
Maureen McHugh – Consider that technology may be evolving in ways that are not particularly interested in us. #etech
Becker, Muller: We have under-estimated the costs and over-estimated the value of our economy. #etech
Becker, Muller: We assume economic trade must be the primary framing of value in our lives. Why? #etech
Design Patterns for PostConsumerism: Free; Repair Culture; Reputation Scaled; Loanership Society; Virtual Production. #etech
NYT: emerging platforms, text reflow, multitouch, flexy displays, smart content, sms story updates, sensors, GPS localized content. #etech
Jeremy Faludi: Buildings & transport have the largest impact on climate change. Biggest bang for the buck in re-design. #etech
Jeremy Faludi – Biggest contributor to species extinction & habitat loss is encroachment & byproducts from agriculture. #etech
Jeremy Faludi – Best strategies to vastly reduce overpopulation: access to birth control & family planning, empowerment of women. #etech
Tom Raftery: Grid 1.0 can’t manage excess power from renewables. Solution: electric cars as distributed storage. #etech
Considering the impact of plugging AMEE (@agentGav) data into ERP systems for feedback to biz about supply-chain impacts. BI meets NRG ID.
Mike Mathieu: Data becoming more important than code. Civic data is plentiful and largely untapped. Make civic apps! #etech
Mike Mathieu: Take 10 minutes today and pick your crisis. Figure out how to create software to help. #etech
What is #SantaCruz doing to make civic data available to service builders? We want to help SC be healthier & more productive.
Mark Frauenfelder: “I haven’t heard of anybody having great success with automatic chicken doors.” #etech [re-emerging technology]
Realities of energy efficiency: 1gallon of gasoline = ~1000hrs of human labor. #etech
Kevin Lynch: Adobe is saving over $1M annually just by managing energy. #etech
Designing backwards: Think about the destiny of the item before thinking about the initial use. (via Brian Dougherty) #etech
RealTimeCity: physical & digital space merges, people incorporate intelligent systems, cities react in accord w/needs of pub welfare. #etech
Oh my we’re being LIDAR’d while Zoe Keating plays live cello n loops. ZOMG!!!
zoe keating & live lidar is blowing my mind at #etech 1.3M points per sec!
Julian Bleeker cites David A. Kirby: “Diegetic prototypes have a major rhetorical advantage over true prototypes” #etech
Julian Bleeker: Stories matter when designing the future, eg. Minority Report. #etech
Julian Bleeker: “Think of Philip K. Dick as a System Administrator.” #etech
Rebecca MacKinnon: Which side are we helping, River Crabs or Grass Mud Horses? #etech
Kati London: How can we use games to game The System and how can they be used to solve civic problems? #etech
Nathan Wolfe: Trying to fight pandemics only at the viral human level ignores deep socioeconomic causes of animal-human transmission. #etech
Nathan Wolfe, re: viral jump from animal to human populations: “What happens in central Africa doesn’t stay in central Africa.”
Nathan Wolfe: need to work with % of population w/ hi freq of direct contact with animals for early detection of viral transmission.
Nathan Wolfe: Vast majority of biosphere is microscopic, mostly bacterial & viral. Humans: very small piece of life on Earth. #etech

Facebook, Twitter, and Walled Gardens

Today Facebook announced a new homepage whose re-design appears to be a response to the growing popularity of Twitter. Or more explicitly (to strip away the brand and focus on the technology), Facebook is moving towards the real-time web by adding a Stream view that shows updates from friends. In the words of Facebook’s director of product development, Chris Cox, “the stream is what’s happening”.

Indeed, the stream is certainly compelling. There is potentially great value in receiving & transmitting information as quickly as possible. As Twitter shows, people want to opt in for notices from connections & information sources, but it’s uncertain whether Facebook users will be able to handle the unrestrained volume of content that its users post. Information is valuable only when it’s useful. The 140-character limit of the SMS format underlying Twitter forces information to be clear & concise. It’s hard enough to keep up with Twitter posts, much less follow everything your Facebook connections are allowed to post. The stream may simply be too overwhelming for most.

However, the interesting bits include the addition of filters that allow users to manage stream views, offering some hope of paring down the data glut. Likewise, the proposed ability to visualize a user’s social graph – the immediate and extended connections they have on Facebook – coupled with a lifting of the 5,000-friend limit will open new opportunities for connectivity and communication, but will also force users to manage their filters in order to deal with the volume.
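Conceptually, such filters are just predicates applied over the update feed. A generic sketch (not Facebook’s implementation; the function and field names are invented for illustration):

```python
def filtered_stream(updates, allowed_authors=None, kinds=None):
    """Yield only the updates that pass the user's chosen filters.

    Each update is a dict with "author" and "kind" keys. Passing None
    for a filter leaves that dimension unrestricted.
    """
    for u in updates:
        if allowed_authors is not None and u["author"] not in allowed_authors:
            continue
        if kinds is not None and u["kind"] not in kinds:
            continue
        yield u
```

The hard part is not the mechanism but getting millions of users to actually curate their filters.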

The main downside seems to be Facebook’s ongoing insistence on private networks, probably a legacy of the college-only days of in-group cliques that initially colonized the service. How will the rest of the world find value in its thoughtstream? How will businesses leverage the trends and interests of Facebook users if it’s too prohibitive to get access? Facebook may have the advantage in user numbers, but Twitter has the advantage in connectivity.

While Facebook boasts 175 million users, they cluster mostly in private groups. As someone who doesn’t use Facebook, I often encounter links that take me to the Facebook gates only to be turned away. It’s a walled garden to which the uninitiated have no access. If Facebook is to approach the really interesting value of Twitter as a real-time search tool, it will need to open its network (and its API) to the rest of the world, thereby challenging its own users. Otherwise it will remain a land of closed & Balkanized cliques content to share party pictures and trade dollar beers, which may be enough for a business model but may fall short of moving into the territory currently occupied by everyone’s most surprising competitor: Twitter.

Patterns – 2.27.2009

1. Twitter Analytics
The phenomenal growth of Twitter has drawn tremendous amounts of data out of its estimated 3,400,000 users. From average Janes to elite technorati and global media stars, the spread of information shared across Twitter is immense. With the increasing presence of product brand managers and corporate reps, the Twitter channel has eyes & ears that now reach deep into commerce, government, and culture. Senators can speak to constituents and customers can speak to big business. Since these communications are public and archived behind the Twitter API, anyone can develop tools to extract the data.

The perennial question continues to focus on Twitter’s yet-to-be-revealed business model. Whatever ace they have up their sleeve (or however large Evan Williams’ bank account remains), the API has enabled a large ecology of third-party services to grow around an open data repository of public communication about almost everything. With an openly searchable public timeline and the addition of user-generated hashtags to coordinate thread topics, the greatest emergent value of Twitter is in the trends and meaning that can be extracted from its content. As often noted, Twitter presents a reading of the zeitgeist.
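Extracting that data really is that open. The sketch below hits the public search endpoint Twitter exposes as of this writing (search.twitter.com); the URL, parameters, and response shape are as publicly documented at the time and could change:

```python
import json
from urllib.request import urlopen

# Endpoint & params per Twitter's public search-API docs at the time of writing:
# q is the (URL-encoded) query, rpp the results per page.
SEARCH_URL = "http://search.twitter.com/search.json?q=%23etech&rpp=50"

def extract_texts(response):
    """Pull the tweet texts out of a decoded search-API response."""
    return [item["text"] for item in response.get("results", [])]

def search_tweets(url=SEARCH_URL):
    """One GET against the public search API; returns a list of tweet texts."""
    with urlopen(url) as resp:
        return extract_texts(json.loads(resp.read()))
```

One HTTP request and you have raw material for any analytics you care to build on top.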

Who to watch: Twitter’s meteoric rise hasn’t allowed a lot of time for many compelling solutions to be developed. Twitter itself has been playing its cards close to its chest, leaving most of the interesting development to third parties. But it’s likely they’re watching and plumbing the data in ways not yet exposed through the API. There have been hints at charging for commercial use, but if Google is any indicator (and it always is these days) Twitter will find its core footing in providing deeper access to data & user analytics.

The New York Times recently published a Twitter mashup showing timelined tweets related to the Superbowl. It’s a really simple yet compelling visualization that is immediately valuable to anyone trying to get a gauge on viewer sentiment. The timeline is displayed over a map of the US that indicates the geographical distribution of tweets. While it focuses on general content (eg Steelers, Cincinnati), it’s easy to imagine visualizations focusing on ad words like Pepsi, Sobe, and GoDaddy indicating viewer response.

Some smaller third-parties like Twitstat and Tweetmeme plumb the public timeline and collate a lot of info that shows off the power of the API, but their search tools do little more than return a list of query incidences. The real need is for targeted visualizations to extract trends and meaning from the stream.
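A targeted timeline like the Times' Super Bowl piece starts from something like this sketch: bucket tweets mentioning a keyword into time windows and hand the counts to a chart. (The timestamps and tuple shape here are illustrative, not any real feed format.)

```python
from collections import defaultdict
from datetime import datetime

def keyword_timeline(tweets, keyword, bucket_minutes=5):
    """Count tweets mentioning `keyword` per time bucket --
    the raw series behind a timeline visualization."""
    buckets = defaultdict(int)
    for ts, text in tweets:
        if keyword.lower() in text.lower():
            # floor the timestamp to the start of its bucket
            floored = (ts.minute // bucket_minutes) * bucket_minutes
            buckets[ts.replace(minute=floored, second=0, microsecond=0)] += 1
    return dict(sorted(buckets.items()))
```

Feed the resulting series to any charting layer and you have the skeleton of a sentiment timeline.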

Also keep an eye on Reuters/Calais as they extend their top-down semantic approach to more archives and services. And of course, Google, as they turn their eyes towards the Twitter datacloud and start to feel the pinch of the growing buzz around Twitter & real-time search.

Related: The really interesting element of Twitter is its emerging use as a live-search service, illustrated beautifully by Erick Schonfeld.

What if you could peer into the thoughts of millions of people as they were thinking those thoughts or shortly thereafter? And what if all of these thoughts were immediately available in a database that could be mined easily to tell you what people both individually and in aggregate are thinking right now about any imaginable subject or event? Well, then you’d have a different kind of search engine altogether. A real-time search engine. A what’s-happening-right-now search engine.

In fact, the crude beginnings of this “now” search engine already exists. It is called Twitter, and it is a big reason why new investors poured another $35 million into the two-year-old startup on Friday.

(See also Chris O'Brien's Mercury News article, How Twitter Could Be a Threat to Google.)

Twitter reports on the Now and is much closer to people and behaviors, emotions and intentions than Google can get with its static intentional search of the indexed web. Searching Twitter effectively searches human behavior. This may be a total game-changer.

2. Open Information
The much-touted "death of publishing" is essentially a realization of the declining value of static content and closed media. From books & newspapers to PDFs, value is shifting away from the container towards the information. Like all 20th century media, the industry giants are struggling to evolve their business models to adapt to the enormous changes wrought by the hyper-linked social web.

Increasingly, stories that may have traditionally lived in fixed media are now told through multiple channels, leveraging text, video, interactive visualizations, and user input & conversation. Likewise, formerly-walled gardens like The New York Times & WSJ – which tried to simply move their paper subscription models online – are quickly moving to free models that seek revenues through advertising, analytics, & partnerships. With the empowerment of user content capture & broadcast, major news outlets are increasingly relying on average people to give them data. In return, we are demanding more access to that data.

Of perhaps greater impact is the ongoing leveraging of social media tools, open APIs, and public accounting records to expose the operational transactions of corporations & governments. As a backlash against the intense privacy of the Bush administration, and further motivated by the web of hidden banking transactions that enabled the current financial meltdown, tech-savvy public interest groups are exposing institutional accounting data to the world, heralding a new age of transparency and accountability. These trends are beginning to hit business as well, and you can expect shareholders of public companies to demand more business intelligence reports about operations.

In both cases, the trend is towards the aphorism that “information wants to be free”.

Who to watch: The most active players in this area are the Old Media news publishing giants struggling to innovate in the digital world while their paper circulations dry up and disappear. Many of them have been recording and archiving valuable information about the world for over a century. As they move their content online they’re highly motivated to explore new strategies for making it valuable and compelling.

As noted, the New York Times is very actively transforming itself into an open, highly visual multimedia resource. Also, Reuters/Calais continue with their efforts to build semantic meaning into their own databases, as well as providing services to the greater web that build standard representations into any web content.

Google of course has interest in all online content and will actively pursue ways that it can help make that content more accessible and searchable. Adobe is making Flash more transparent to spiders and will also be well-served to break apart video and enable algorithmic visual analysis, identification, and tagging. The Sunlight Foundation is leveraging open information and web 2.0 service models to expose & annotate the dangerous relationships between elected officials and their special interest campaign donors.

Related: Container-aware content delivery. Obviously, reading The New York Times on a mobile should be a different experience than reading it online, in print, or on an eReader. The usability of the information takes priority over the consistency of the format, but the structure of the data must be readily retargetable to any interface. Expect more evolved translational systems that sit between dataclouds and interface layers.
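Such a translational layer can be sketched in a few lines — one structured story, several containers. (The field names and container labels below are invented for illustration; real feeds would use richer schemas.)

```python
def render(article, container):
    """Retarget one structured story to the container at hand.
    The data stays the same; only the presentation adapts."""
    if container == "mobile":
        # small screen: headline plus summary only
        return f"{article['headline']}\n{article['summary']}"
    if container == "ereader":
        # long-form reading: headline plus full body, no markup
        return f"{article['headline']}\n\n{article['body']}"
    # default: web presentation with markup
    return f"<h1>{article['headline']}</h1><p>{article['body']}</p>"
```

The point is the separation: the story lives as data in the cloud, and each interface layer asks for the rendering appropriate to it.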

Google, Mozilla, Adobe… & Twitter!

It is extremely important to acknowledge that Mozilla gets almost 90% of its revenue from Google search support. This power dynamic gives Google enormous leverage over Mozilla and positions Firefox as a potential Google proxy. I can't help but think that Chrome may be little more than a dev sandbox and a foil to distract attention from the concerted effort between both parties to rewrite the web in their favor.

Perhaps more importantly to Adobe, the assumed competition between Chrome & Firefox obfuscates the very real & present strategy to get web video out of Flash and to further de-legitimize Flash and all "closed" 3rd-party plugins against the rising value of HTML5. Both Google and Mozilla (Googlezilla!) are working to build canvas support into all browsers and to enhance the HTML5 spec to support rich media rendering. Likewise, the communications and positioning coming from both continue to stress the value of the "open web", "interoperability", and the danger of closed, 3rd-party plugins (i.e., Flash). Adobe will still claim a reasonable chunk of the rich web, but if HTML5 (or whatever subset implementation Googlezilla gets into Firefox, IE, & Safari) allows easy rendering of HD video to any screen, they can say goodbye to Flash as a video solution.

Meanwhile, Google itself may find unexpected competition from an unlikely challenger. Erick Schonfeld at TechCrunch has posted a brilliant insight into the deep value of Twitter… and what it may mean for Google. It's kinda mind-blowing to think that a hot-topic upstart like Twitter could pose a threat to the Googleplex, but Schonfeld nails it with his article, Mining the Thought Stream:

What if you could peer into the thoughts of millions of people..? And what if all of these thoughts were immediately available in a database that could be mined easily to tell you what people both individually and in aggregate are thinking right now..? Well, then you’d have a different kind of search engine altogether. A real-time search engine.

…In fact, the crude beginnings of this “now” search engine already exists. It is called Twitter…

He goes on to note that search engines like Google capture people's intent (what they are looking for), while Twitter captures their thoughts, feelings, and what they're doing. This is a new type of search model, more closely joined to the real-time global mind. It's much closer to people than Google search can get. Twitter is clearly already tremendously disruptive, even without any revenues. Imagine building search and analytics on top of it…

And yeah, everybody wants to know what Twitter’s business model is. Keep in mind that Twitter’s #1, Evan Williams, sold his earlier company, Blogger, to Google so he’s already got that channel open. If the model is to sell to Google and turn the world’s most successful web search engine into the world’s most powerful human thought & behavior probe, then yeah, you wanna keep that under wraps. Twitter will stay the same but Google search will suddenly get *a lot* smarter. If, on the other hand, Twitter seeks to challenge Google in search and analytics, then, oh damn you wanna play those cards as close to your chest as you can possibly keep them.

Patterns: Technology & Culture – January 2009

[This is the first of what I hope to be regular notes and analysis on some of the current prevailing trends in technology and, uh… culture.]

1. Video telepresence & “presence-based telecommunication services”
Instant messaging, Twittering, and the explosive rise and symbiosis of online video and consumer mobile capture are creating a seismic shift in communication, reporting, and collaboration across the globe. From third-world African fishermen using mobile IM to coordinate catch-and-sell markets for the best returns, to CNN partnering with Facebook to bring participatory democracy to new vistas during the 2009 inaugural ceremonies for President Obama, the era of instant communication is getting closer and closer to standard run-time procedures for much of humanity. This level of communication is extraordinarily compelling for numerous reasons, yet for the most part it has not yet been considered a core guideline for enterprise-scale operations. While some companies are starting to get the value of broadcast push-to-many networks, and some have been using internal IM solutions informally, a growing number are now using telepresence video solutions for meetings and presentations.

Who to watch: Cisco & Adobe, mano a mano. Cisco is pushing the future of its videoconferencing system, TelePresence, in ads showing civic installations of 2-way video walls connecting towns across the world. Adobe continues to market its Flash-based teleconferencing solution, Adobe Connect, as a desktop and hosted service. Of course, anything that drives more bandwidth through those increasingly clogged arterial Intertubes makes Cisco very happy. Video is huge, and hosting more and more of it will require companies to budget for bigger & better routers to handle the throughput. Any win for Adobe & Flash is a win for Cisco. Any loss for Adobe is also a win for Cisco. Also, watch for anyone gaining traction in securing IM channels or building Twitter-like solutions for internal enterprise deployment.

Related: Video. Cisco & Adobe are again heading into the same territory with internet TVs, telepresence, and the might of Adobe Flash. But with 1080p & H.264 support, combined with their hardware wizardry, Cisco is well-positioned to capitalize on bringing Netflix & the web straight to your TV. Cisco is the new gorilla in the corner that people should be paying attention to. But don't take your eyes off Google, because they certainly want a piece of the video ad pie.

2. Business Intelligence, Personal Intelligence
The term “business intelligence” has been around for a while but it’s showing up more and more across the tech world as traditional business practices begin to leverage emergent technologies to better manage their enterprise. Businesses can now track vast amounts of data in great detail across numerous channels, both internally and in the marketplace. As they face growing piles of valuable information, the enterprise is beginning to realize the powerful advantage of proactively managing both its acquisition & management through rich client dashboards. Such dynamic dashboards are allowing precise interrogation of vast stores of transactional data rendered in elegant and easily-digestible visualizations. Have a look at the Nasdaq Adobe AIR dashboard as an example of modern data management. New tools are evolving to better serve the same goal: help business know itself and its customers better.

Naturally, people are also finding ways to use these technologies to manage and optimize their behaviors, hence the emerging field of Personal Intelligence. Tools and services make it simple for users to collect and analyze data on energy consumption, caloric intake, weight loss, menstrual/fertility cycles, family budgets, and innumerable other niches including the extremely compelling world of the Awakened Consumer. Some offer web services or desktop apps, but the mobile web platform that many humans now have with them at all times is becoming the ideal place for capturing behavioral metrics, crunching the numbers, and displaying trends in pretty graphs and animations. Often these apps include access to relevant social networks where users can share tips & results and find the camaraderie & encouragement that's so helpful to effectively changing habits. Again, the goal is the same as in business intel: help people know themselves better and manage their own behaviors successfully.
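Under the hood, most of these PI tools reduce to the same move: log daily readings, then smooth them into a trend the user can act on. A minimal sketch (generic, not modeled on any particular app):

```python
def moving_average(readings, window=7):
    """Smooth daily readings (weight, calories, kWh...) into a trend line.
    Returns one averaged point per day once `window` days have accumulated."""
    return [round(sum(readings[i - window + 1:i + 1]) / window, 2)
            for i in range(window - 1, len(readings))]
```

Everything else — the pretty graphs, the social sharing, the encouragement — is presentation layered on a series this simple.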

Who to watch: In BI, SAP, Oracle, Salesforce, and Adobe PDF. All of these players should be radically re-evaluating their clunky, outdated UIs and static approaches to data structuring. They should be building semantics directly into their databases and wiring up elegant, flexible dynamic front-ends that allow customers to get more out of their data. These applications should make users more effective at their jobs, not force them to interface with an overweight bureaucratic intermediary. (And don't even get me started on forms…) Not only should these folks be rebuilding their software, but they should very intelligently consider what data, forms, and formatting really mean in the mobile landscape. It's not as easy as porting PDF to mobile or reconfiguring the SAP front-end to fit in a phone window. Mobile solutions for enterprise data must be appropriate to human use, first and foremost, and should ditch any allegiance to the existing desktop solution.

In PI, honestly the most interesting stuff is happening on the iPhone. Personal management apps like Carbon Tracker, LoseIt!, BrainHack, iPeriod, BP Buddy Blood Pressure Helper, & GoodGuide enable people to better engineer their habits towards their goals and empower them to make better decisions about their actions as consumers.

Related: The Semantic Web. Also called Web 3.0, the notion is that the present web exists as a uniquely human endeavor, consisting mostly of text and images from which we humans are readily capable of extracting meaning & relationships. However, software is not so adept at making inferential connections and understanding context. Semantic architects seek to standardize the approaches for building a relational context that can be understood and used by agents. They look at descriptive frameworks like XML and RDF, the query language SPARQL, the ontology language OWL, and HTML-based microformats to build both bottom-up semantics and top-down context and taxonomies. Services like Amazon's recommendations leverage contextual relationships to suggest products based on key text in your current selection. For both BI & PI, the result is much greater efficiency and relevance for intelligent agents tasked with the Herculean job of sifting through ridiculous amounts of data to fish out the key bits most important to you.
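The mechanics are easy to sketch: a graph of (subject, predicate, object) triples and a wildcard pattern match, which is the essence of what a SPARQL basic graph pattern does over RDF. (The identifiers in the toy graph below are made up.)

```python
def match(triples, s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, as in a SPARQL basic graph pattern."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# A toy graph with invented identifiers
graph = [
    ("nyt:article42", "dc:subject", "Steelers"),
    ("nyt:article42", "dc:creator", "Jane Doe"),
    ("nyt:article43", "dc:subject", "Inauguration"),
]
```

Once content carries triples like these, an agent can ask "everything tagged with this subject" or "everything this author wrote" without parsing prose — which is exactly the efficiency gain described above.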

The most active and visible players in the semantic game are Reuters/Calais, AdaptiveBlue, Powerset, and Expert System, but be sure that Google is heavily invested in this initiative. While the W3C is sorting out the plan we may all suddenly turn around and realize Google just built the semantic web. When heavyweights get behind a particular solution it tends to become a standard.

3. Crowd Content & Citizen Journalism
For the inauguration of Barack Obama, CNN partnered with Facebook to bring to the presidency what CurrentTV & Twitter brought to the election: direct broadcast user participation. CurrentTV mashed up tagged Twitter messages with the live streams of the presidential debates. Next to its inauguration stream, CNN included a live feed from Facebook users commenting on the swearing-in ceremonies. Suddenly, the audience became part of the broadcast. With Adobe, Cisco, Microsoft et al chomping at the bit, you can expect this kind of participatory content to be on your TV screen soon. Those crazy news ticker overlays crowding around Greta Van Susteren or whatever talking head will include tweets from Jim Bob in Philly. We are the new empowered crowd. Of course, industry & enterprise beware: the empowered crowd can rebel and turn into a mob.

Combining location-aware telepresence & instant communication with cameras & publishing tools, smartphone mobile devices are empowering users with the ability to capture and broadcast local events with unfiltered and immediate reporting. When a police confrontation occurred on a rail platform in Oakland, CA, bystanders immediately pulled out their mobiles and started recording. The resulting public videos of deadly force exercised by an officer drew major media attention and impacted the ability of courts to manage the evidence. When US Airways Flight 1549 landed in the Hudson River last week, the first reports were issued from passengers and nearby boat crews via mobile SMS to Twitter, then quickly broadcast out across social networks. Increasingly, some of the most significant global events of the past year – the Chengdu earthquake, the Mumbai terror attacks – have arrived to the masses first via Twitter. Mobile devices and broadcast services like Twitter are wiring people to the global cloud as ground-level sensors. We are the broadcast nodes.

Who to watch: Facebook & Twitter, of course. CNN, CurrentTV. And watch the major print & broadcast networks as they scramble to get with the times or perish. Also, city municipalities who will (eventually) leverage these tools to generate business intelligence for managing their communities. Expect increasing challenges to notions of privacy and surveillance, as well as a surge in mobile and web applications that build reporting tools & broadcast functions into social networks.

And one final aside: Autodesk. Building energy analysis directly into their BIM and CAD applications, designing advanced multitouch HCI solutions, and even opening their own SoMa tech gallery to show off their magic, Autodesk has been kicking serious ass and is a model for how to evolve the enterprise to meet the times.

A Brief Rant on Cloud Agents and Business Intelligence

From a comment I left over at ReadWriteWeb about What’s Next After Web 2.0?:

Business Intelligence. The enterprise will increasingly use cloud agents and semantic analytics to better understand their customers, markets, finances, and internal workflows. Companies will engage in behavioral modeling and web meme profiling more aggressively. With diminishing workforce resources due to budgetary constraints, increased investment in automation and intelligent software solutions will give businesses more information and feedback without requiring as many large paychecks. Electronic business workflows, services, and applications will evolve to write more intelligent metadata and semantic subtext into file formats while similarly reporting usage analytics out to dynamic data streams. All of this data will be sorted by cloud agents, filtered, parsed, and then rendered to rich media layers (e.g., Flash) for practical visualization and analysis. All documents and file types will evolve to contain more provenance information: who created the file and how, when & where, who has access rights and to what degree, who has reviewed it and what comments have been attached. Such intelligent files will enable greater and greater usage by both human and cloud agents.
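To gesture at what such an "intelligent file" might carry, here's a sketch with invented field names — embedded provenance, rights, and review metadata that both a person and a cloud agent can query:

```python
# Hypothetical metadata an intelligent file might embed alongside its content
doc = {
    "title": "Q4 Report",
    "created_by": "jdoe",
    "created_at": "2009-01-15",
    "access": {"jdoe": "owner", "asmith": "read"},
    "reviews": [{"by": "asmith", "comment": "Numbers check out."}],
}

def can_edit(document, user):
    """A cloud agent consults the file's embedded rights metadata
    before routing it into a workflow."""
    return document["access"].get(user) in ("owner", "write")
```

The specifics are made up, but the pattern — files that answer questions about themselves — is the enabler for the agent-driven sorting and filtering described above.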

Cisco, video, digital hardware… and Adobe?

When Redmonk’s James Governor opined that Cisco might make a play for Adobe Systems, bells went off in my head. It suddenly made a lot of sense and made me realize I should really be paying more attention to Cisco.

From James:

Cisco competing with Apple? Who would have thunk it? To really make its ambitions count I believe Cisco will make a play for Adobe, filling out a video internet value chain from low to high production to the web.

Adobe is arguably the predominant enabler of web video, with much of the web firmly invested in Flash and all of its platform components & accessories. Cisco knows how to make hardware and has not been at all shy about their goals in the consumer digital market, pursuing rich internet-enabled media on set top boxes and in TVs. Recall not too long ago the legal battle with Apple over the term "iPhone". And at CES in January, Cisco is expected to introduce a new line of consumer media products.

Cisco Systems, the dominant provider of the digital pipes that run the Internet, is making a big play in digital entertainment. At the Consumer Electronics Show in January in Las Vegas, it plans to introduce a new line of products, including a digital stereo system that is meant to move music wirelessly around a house.

That is the first small move in a long-term strategy to take on Apple, Sony and the other giants of consumer electronics. Cisco is working on other gadgets that will let people watch Internet video on their televisions more easily. And its biggest bet is that people will want to use a version of its corporate videoconferencing system called Telepresence to chat with their friends over their high-definition televisions.

While this article notes the looming battle with Apple & Sony it should be considered that the set-top and video market is clearly of interest to Adobe, as well as the obvious similarities between Telepresence & Adobe Connect for video conferencing solutions. Cisco & Adobe are both invested in the Open Screen project but the relationship between the two will surely get closer whether or not some sort of acquisition is in play.

The strategy for Cisco, of course, is to encourage more high-bandwidth content running through all those Cisco routers that will need to be upgraded to keep up with the throughput. There's no greater bandwidth hog than video, and it's just exploding with the boom in cheap consumer video hardware and turnkey hobbyist solutions. Adobe would be wise to pursue the authoring side of this hobbyist video boom and focus on an aggressively marketed, cross-platform Premiere Express solution, as well as developing an ecology for video capture and publishing from mobile devices. Meanwhile, their set-top Flash initiative will continue to intrigue Cisco, and if the two are not already talking to make sure Cisco is using Flash everywhere possible, then somebody at Adobe needs to get busy and make it happen.

It remains to be seen whether Cisco might make an acquisition play for Adobe, but it seems likely that the futures of the two companies will be tightly coupled.