This tweet got me riffing on potential outcomes & exploits available when autonomous vehicles become common:
New meaning to “blue screen of death” ;) MT @seth_fletcher: Carlos Ghosn: Autonomous cars “without any doubt” part of future… by 2020.
— chris arkenberg (@chris23) January 14, 2013
Also interesting re: autonomous cars: autobotnets, vehicular ddos, remote firmware exploits, stigmergics & unexpected swarming, etc…
— chris arkenberg (@chris23) January 14, 2013
@changeist biometric spoofing, route hijacking, flexible meshnet computing… oh! oh! street racing algo mod’s!!
— chris arkenberg (@chris23) January 14, 2013
@changeist and that’s just on my lunch break ;)
— chris arkenberg (@chris23) January 14, 2013
I also “like” (or “find interesting”, in the Chinese proverbial sense) the idea of rogue agents seizing control over vehicular fleets to direct and coordinate their movements towards some sort of goal, e.g. assembling to bust a road barricade or defend a bank heist. Interesting times, indeed…
[Apologies/nods to Scott Smith.]
Many mature software companies are now in the awkward position of trying to migrate their heavyweight legacy solutions from the desktop into the uncertain domain of the cloud. Fortune 500s are slow to adapt, preferring to leverage their cash-cow back catalogs for as long as possible while gently testing the waters with lightweight offerings more aligned with marketing than with their core execution layer. The results often paint these erstwhile giants as out of touch and late to the game, delivering simple offerings that fail to integrate with the evolving needs of their user base. The solution is not an easy one, requiring far more commitment and risk than most CFOs can stomach. But the cloud is not going away, and the alternative to full adoption is resignation to a narrowing niche.
Traditional software companies are beginning to extend their platform support to include web service layers that sit across their apps. While some try to port software features to a web front end, others are looking more closely at the collaborative workflows across apps and across stakeholders. A good example of the latter is CS5 from Adobe Systems, which includes workflow support services like CS Review and the CS Live service suite. These connected services address the real-world behaviors that grow around application suites.
A design shop has multiple users working in tandem, each with specific coordinated tasks necessary to produce the final outcome. The shop itself is in communication with the client and the publishing target (e.g., the printers, the web developers) and may be working with other third-party contributors. All of these have differing levels of contribution, permissions, and interaction which can be effectively mediated by well-designed services.
The shift to these kinds of services is inherently social. Effective service design addresses the collaborative workflows that emerge around the intersection of the tools and the business. This design process is not feature-driven, as is typical of most software development. Instead, it’s human-driven, with features addressing the real needs of the users, accreting around human behaviors derived from user research & ethnography rather than from market analysis or engineering visions.
So, businesses looking to extend their platforms to address the secondary workflows that emerge around them would be well-served to invest in solid user research & ethnography: understand how their tools are being used, what the stakeholder relationships are really like, and how businesses hack together their own work-arounds & optimizations for the interoperability & social elements of their work. Every shop & every business ecosystem has challenges that are more often remedied by frustrated internal users than by well-designed services. These ad hoc hacks are problems looking for solutions.
Businesses are ecosystems built around human engagement & productivity. Business ecosystems are platforms for innovation. How is your services model addressing the human ecosystem of productivity?
In a time of monumental change it’s important to look at how the big players are adapting. Their moves are typically the most heavily researched and financed attempts at divining the underlying currents and capitalizing on the shifting technological marketplace. It’s especially interesting when conservative tech stalwarts like IBM & SAP suddenly start looking cool.
Both IBM & SAP are moving quickly into 3 of the most powerful trends in computing, each of which is driven by the enormous amounts of data being captured across all domains: business intelligence & modeling, stream computing, and sustainable systems analysis.
IBM’s new initiative A Smarter Planet states succinctly, “the planet will be instrumented, interconnected, intelligent.” This is a powerful statement from one of the largest and most technologically advanced companies in the world. They’re not just talking about business. IBM CEO Sam Palmisano speaks to the really large-scale planetary challenges in creating smart infrastructures for energy, water, transport, and data.
System S is designed to perform real-time analytics using high-throughput data streams… to host applications that turn heterogeneous data streams into actionable intelligence… System S applications are able to take unstructured raw data and process it in real time.
“This is about what’s going to happen,” explains [director of high performance stream computing at IBM] Nagui Halim. “The thesis is that there are many signals that foreshadow what will occur if we have a system that is smart enough to pick them up and understand them. We tend to think it’s impossible to predict what’s going to happen; and in many cases it is. But in other cases there is a lot of antecedent information in the environment that strongly indicates what’s likely to be occurring in the future.”
With enough data you can start to create connections and patterns. With patterns you can derive meaning and ultimately be better enabled to make more accurate predictions. Since humans aren’t very well-adapted to processing large data sets, we build tools to handle the heavy lifting. Whether Wall Street indexes, ERP scenarios, government accounting, energy grid analysis, or dynamic climate models, serious hardware & software is required to process operational data into meaningful determinations and prescriptions.
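As a toy illustration of the stream-computing pattern described above (this is a generic sketch, not anything from IBM's System S): keep a bounded window of recent readings and flag any value that breaks sharply from it, surfacing "antecedent information" as it arrives rather than after the fact.

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(stream, window=20, k=3.0):
    """Flag readings that deviate sharply from the recent rolling window.

    A minimal stand-in for stream analytics: maintain a bounded window
    of recent values and emit any reading more than k sample standard
    deviations from the window's mean.
    """
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) >= 2:
            mu, sigma = mean(recent), stdev(recent)
            if sigma > 0 and abs(value - mu) > k * sigma:
                yield value  # a signal worth a closer look
        recent.append(value)

# A steady signal with one sharp spike:
readings = [10.0, 10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 50.0, 10.0]
print(list(detect_anomalies(readings, window=5, k=3.0)))  # → [50.0]
```

Real stream platforms do this over heterogeneous, high-throughput feeds with far richer models, but the shape is the same: state bounded in memory, decisions made as each datum passes.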
SAP has introduced the Clear New World initiative built on their Business Objects service architecture. Again, the notion is that businesses, enterprises, and even governments can run more efficiently when there is a free-flow of data and a suite of integrated services to crunch and render the info into meaningful contexts.
It’s time to build greater visibility, transparency, and accountability into the way your organization works. Because being clear allows timely and relevant information to be available when and where it is needed. Clarity demonstrates that your company is willing and able to stay accountable to key stakeholders. Clarity helps call out inefficiencies, reveal your best customers, create credible sustainability, and give your business the flexibility needed to anticipate and respond to a complex, ever-changing, global environment.
[See James Governor's recent post for more on how SAP & IBM are tackling enterprise sustainability.]
Note the statements about accountability to stakeholders & creating credible sustainability. Clear data & clear reporting. Now consider the latest announcement about SAP for Public Sector “to support the management and reporting of economic stimulus funds”. As a plugin to their Business Objects suite, this utility drafts on the trends towards open accountability and government transparency, often termed Gov 2.0, to provide support for determining just how stimulus money is being spent.
Both IBM and SAP have the power to execute effectively on these strategies, though it remains to be seen how enterprise spending will move to implement these services, or whether the companies will offer flexible licensing to the LLCs working on the really challenging non-profit global issues. Meanwhile, SAP has suffered usability problems for years, and their core object architecture is old and slow. They will need more than just branding and plugins to make a more transparent world.
Finally, it’s worth noting the branding for these projects. “A Smarter Planet” is a global posture indicating agency and identity on a planetary scale. This hints at the real deep trend across the human species towards a global sense of purpose and strategy. “Clear New World” acknowledges both the occlusions under which human endeavor has marched thus far and the great clarity of visibility we’re now gaining across all domains & enterprises, while admitting that indeed everything is changing and we are moving into a New World. The technology is stepping forward to help us more effectively manage the present and navigate into the unknown future. But of course like all foresight, it remains to be seen whether individuals will choose to act appropriately with the knowledge they come to possess…
Here’s a selection of my tweets from the O’Reilly Emerging Technology Conference this past week. These are the ones I think grab the juicy nuggets from the speakers’ presentations. [In temporal order, with the earliest (i.e., Monday eve) listed first.]
Tim O’Reilly: “We have greatness but have wasted it on so much.”
We have an unprecedented opportunity to build a digital commonwealth. #etech
Work on something that matters to you more than money. This is a robust strategy. #etech
Niall Kennedy: Energy Star rating for web apps? Thinking of clouds & programming like tuning a car for better gas mileage. #etech
Cloud computing: no reasonable expectation of privacy when data is not in your hands. Not protected by 4th amendment. #etech
Alex Steffen: Problems with water supply are based in part on our lack of beavers. #etech
Social media for human rights. http://hub.witness.org #etech
Gavin Starks – Your Energy Identity & Why You Should Care. see http://amee.com #etech
Maureen McHugh – Consider that technology may be evolving in ways that are not particularly interested in us. #etech
Becker, Muller: We have under-estimated the costs and over-estimated the value of our economy. #etech
Becker, Muller: We assume economic trade must be the primary framing of value in our lives. Why? #etech
Design Patterns for PostConsumerism: Free; Repair Culture; Reputation Scaled; Loanership Society; Virtual Production. #etech
NYT: emerging platforms, text reflow, multitouch, flexy displays, smart content, sms story updates, sensors, GPS localized content. #etech
Jeremy Faludi: Buildings & transport have the largest impact on climate change. Biggest bang for the buck in re-design. #etech
Jeremy Faludi – Biggest contributor to species extinction & habitat loss is encroachment & byproducts from agriculture. #etech
Jeremy Faludi – Best strategies to vastly reduce overpopulation: access to birth control & family planning, empowerment of women. #etech
Tom Raftery: Grid 1.0 can’t manage excess power from renewables. Solution: electric cars as distributed storage. #etech
Considering the impact of plugging AMEE (@agentGav) data into ERP systems for feedback to biz about supply chain impacts. BI meets NRG ID.
Mike Mathieu: Data becoming more important than code. Civic data is plentiful and largely untapped. Make civic apps! #etech
Mike Mathieu: Take 10 minutes today and pick your crisis. Figure out how to create software to help. #etech
What is #SantaCruz doing to make civic data available to service builders? We want to help SC be healthier & more productive.
Mark Frauenfelder: “I haven’t heard of anybody having great success with automatic chicken doors.” #etech [re-emerging technology]
Realities of energy efficiency: 1 gallon of gasoline = ~1000 hrs of human labor. #etech
Kevin Lynch: Adobe is saving over $1M annually just by managing energy. #etech
Designing backwards: Think about the destiny of the item before thinking about the initial use. (via Brian Dougherty) #etech
RealTimeCity: physical & digital space merges, people incorporate intelligent systems, cities react in accord w/needs of pub welfare. #etech
Oh my we’re being LIDAR’d while Zoe Keating plays live cello n loops. ZOMG!!!
zoe keating & live lidar is blowing my mind at #etech 1.3M points per sec!
Julian Bleecker cites David A. Kirby: “Diegetic prototypes have a major rhetorical advantage over true prototypes” #etech
Julian Bleecker: Stories matter when designing the future, e.g. Minority Report. #etech
Julian Bleecker: “Think of Philip K. Dick as a System Administrator.” #etech
Rebecca MacKinnon: Which side are we helping, River Crabs or Grass Mud Horses? #etech
Kati London: How can we use games to game The System and how can they be used to solve civic problems? #etech
Nathan Wolfe: Trying to fight pandemics only at the viral human level ignores deep socioeconomic causes of animal-human transmission. #etech
Nathan Wolfe, re: viral jump from animal to human populations: “What happens in central Africa doesn’t stay in central Africa.”
Nathan Wolfe: need to work with % of population w/ hi freq of direct contact with animals for early detection of viral transmission.
Nathan Wolfe: Vast majority of biosphere is microscopic, mostly bacterial & viral. Humans: very small piece of life on Earth. #etech
My thoughts submitted to the Adobe Reader Blog for the post Take the Adobe Reader Survey. As a former Adobe employee who worked on Acrobat & PDF I have a lot of personal interest in seeing the format grow and evolve.
The growing public perception is that PDF is too bulky and increasingly too opaque for the networked world. This is because PDFs have not kept up with the prevailing trends of transparency, findability, and collaboration. PDF is important as a container with certain rights & privileges (DigSig, Security, Markup, Forms), but the data inside a PDF is far more important. Currently, PDFs are too opaque, too bloated, and do not clearly convey value to most users. This is especially true on mobile (why would I choose to view PDF on mobile if not required by an enterprise I need to engage with?). For most enterprises and customers, PDF is a cloud of data more than a display standard. Its value is no longer in consistent display of fonts and formatting; it’s in the data within the millions of PDFs that the IRS holds, for example. Even as a forms front end, it’s difficult to see why Reader/Acrobat is a better solution than a robust, customizable Flash interface. The Flash-based Portfolios feature is a step in this direction.
How can Reader add value to the massive volumes of archival PDF that already exist? Answer: 1) replace Reader with a robust, customizable Flash front end, and 2) engineer semantic data* into new & existing PDFs so that cloud agents can sift through the documents and return meaningful results. Both of these strategies should focus heavily on supporting LiveCycle for both distilling and evaluating PDFs.
The static viewer model is dying. People need to be able to search, sort, find, annotate, and share. Reader is already too heavy to be of value in a browser, much less on a mobile device. Any mobile solution must disaggregate formatting from data and be able to dynamically reconfigure the display to present only the important data/form elements to the mobile user. At the very least, PDFs need some serious reformatting before they can be of any real value on the mobile platform. There’s just not enough real estate. Furthermore, any PDF-mobile solution must begin with the realization that mobile = personal, collaborative, locative.
If Adobe doesn’t do this, you can bet there will be lucrative opportunities for others who understand that the value of data is no longer in its formatting. It’s in accessibility and structured reporting. Frankly, any business intelligence solution that doesn’t address the growing heap of PDFs lying on the company’s servers will fail to really leverage its own data effectively.
* I think I’m starting to use the term “semantic” a bit loosely. Essentially, I’m suggesting that Acrobat should engineer active creation of RDF structures inside the PDF COS and as header info. PDFLib should extend to support both writing & reading of this framework. Likewise, top-down text analysis should spider both doc text and COS to construct relevant metadata (RDF & taxonomies) written into the PDF file header. The point is to make PDFs as transparent & searchable as possible to those actors & agents with access rights.
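For a rough sense of what this looks like in practice: the XMP metadata a PDF can carry is itself an RDF/XML packet, and a minimal one can be assembled with nothing but the standard library. The title and subject values below are invented for the example, and actually embedding the packet into a PDF's Metadata stream would require a PDF library on top of this.

```python
import xml.etree.ElementTree as ET

RDF_NS = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC_NS = "http://purl.org/dc/elements/1.1/"

def build_xmp_packet(title, subjects):
    """Build a minimal XMP-style RDF/XML packet using Dublin Core terms,
    of the kind a PDF's Metadata stream carries for semantic search."""
    ET.register_namespace("rdf", RDF_NS)
    ET.register_namespace("dc", DC_NS)
    rdf = ET.Element(f"{{{RDF_NS}}}RDF")
    desc = ET.SubElement(rdf, f"{{{RDF_NS}}}Description")
    ET.SubElement(desc, f"{{{DC_NS}}}title").text = title
    # dc:subject holds an unordered rdf:Bag of keywords
    bag = ET.SubElement(ET.SubElement(desc, f"{{{DC_NS}}}subject"),
                        f"{{{RDF_NS}}}Bag")
    for s in subjects:
        ET.SubElement(bag, f"{{{RDF_NS}}}li").text = s
    return ET.tostring(rdf, encoding="unicode")

print(build_xmp_packet("Quarterly Filing", ["tax", "1040", "2009"]))
```

An agent spidering documents would run this in reverse: parse the packet, index the dc:subject terms, and return hits without ever rendering a page.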
These are my brief (and very rough) notes from 5 minutes ago summarizing some guidelines I feel are critical for application & service development:
The cloud is everywhere.
Applications grab eyes.
Mobile/desktop/cloud – Don’t draw partitions.
Seek integrations across platforms.
Scale services by UI. E.g., editing photos on a mobile is not appropriate, but capturing images and uploading them to a workspace is.
Provide ubiquitous workspaces.
Communicate, Collaborate, Create, Share
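One way to make the "scale services by UI" guideline concrete is a per-platform capability matrix, with the shared service layer deciding which features each surface exposes. The platforms and features below are illustrative placeholders, not from any real product:

```python
# Shared service layer; surface-specific feature sets.
# Platform and feature names here are purely illustrative.
CAPABILITIES = {
    "mobile":  {"capture", "upload", "share"},
    "desktop": {"capture", "upload", "share", "edit", "batch"},
    "web":     {"upload", "share", "edit"},
}

def available(platform: str, feature: str) -> bool:
    """True if this surface should expose the feature at all."""
    return feature in CAPABILITIES.get(platform, set())

# Editing photos doesn't scale down to mobile; capture-and-upload does:
print(available("mobile", "edit"))    # False
print(available("mobile", "upload"))  # True
```

The point of the table is that no partition is drawn in the data or the services, only in which verbs each UI surfaces.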
From O’Reilly Radar:
The “internet operating system” that I’m hoping to see evolve over the next few years will require developers to move away from thinking of their applications as endpoints, and more as re-usable components. For example, why does every application have to try to recreate its own social network? Shouldn’t social networking be a system service?
This isn’t just a “moral” appeal, but strategic advice. The first provider to build a reasonably open, re-usable system service in any particular area is going to get the biggest uptake. Right now, there’s a lot of focus on low level platform subsystems like storage and computation, but I continue to believe that many of the key subsystems in this evolving OS will be data subsystems, like identity, location, payment, product catalogs, music, etc. And eventually, these subsystems will need to be reasonably open and interoperable, so that a developer can build a data-intensive application without having to own all the data his application requires. This is what John Musser calls the programmable web.
This was a great conference and the most consistent collection of speakers and topics I’ve ever experienced. Very fun and inspiring. Lots of hip 30-somethings trying to dream up tomorrow and make it real. It was a very balanced yet cutting-edge program aimed at an eager (and surprisingly mixed-gender) crowd. I noticed that most folks were using Mac laptops – this part of the edge seems to prefer Apple – and it was fascinating to watch many who were blogging the talks while pulling up references dropped by the speakers, tweeting out to Twitter, and snapping/downloading/posting photos in real time. As speakers dropped references I was pulling them up on my laptop and dropping links into my blog notes.
In the lobby a team was showing off a data viz video mapping real-time communications connecting NYC to the rest of the world. Andrea noticed that a surprising number were with an Italian city called Perugia. Maybe next year they could map the live feed of all web traffic from ETech. Imagine the bitstreams rising off such a gathering of digiterati.
Maybe it was just the Sudafed coursing through our virus-ridden veins (thank you, Portland), but ETech was a total intellectual turn-on, from ambient objects, Asian mobile media, green policy and sustainability, hardware hacking & drone building, Austrian post-Situationists, and neuroengineering, to the digital salvation of Democracy itself.
I hope I can go back next year!
Hacking brains & iPhones, building DIY aerial drones, ambient data streaming, data viz and crowd movements, ARGs, Vegas, and the Self awakened to its own tech. Oh baby!
With the help of my special lady friend (whose work sprang for the hotel, pass, and air) and of my employer (I’m doing some booth shifts on the floor in exchange for a pass – I get to rep Adobe AIR), I’m leaving tomorrow morning for sunny San Diego and a week at the O’Reilly Emerging Technology Conference! I’m psyched. I’ve wanted to go for the last few years but couldn’t afford it. All this time, I should have just told my corporate overlords they needed to send me on the company ticket!
I’ll be sending photos to the urbeingrecorded portal via tumblr, and I’ll likely post some keen bits here. Otherwise I’ll be fast hacking my iPhone to control a robotic crowd-sourcing drone I will use to track the culinary habits of tech luminaries and international political dissidents whose footpaths I’ll be datastreaming to various dynamic art installations and ambient devices.
From their site:
How does technology help you perceive things that you never noticed before? How does it help you be found, or draw attention to issues, objects, ideas, and projects that are important, no matter their size or location?
At the 2008 version of ETech, the O’Reilly Emerging Technology Conference, we’ll take a wide-eyed look at the brand new tech that’s tweaking how we are seen as individuals, how we choose to channel and divert our energy and attention, and what influences our perspective on the world around us:
Body Hacking. Genomics Hacking. Brain Hacking. Sex Hacking. Food Hacking. iPhone Hacking.
DIY Aerial Drones. DIY Talking Things. DIY Spectrum. DIY Apocalypse Survival.
Emerging Tech of India, Cuba, and Africa. International Political Dissidents.
Visualize Data and Crowds. Ambient Data Streaming.
Good Policy. Energy Policy. Defense Policy. Genetic Policy. Corruption.
Alternate Reality Games. Emotions of Games. Sensor Games.
ETech 2008 will cover all of these topics and more. We put on stage the speakers and the ideas that help our attendees prepare for and create the future, whatever it might be. Great speakers are going to pull us forward with them to see what technology can do… and sometimes shouldn’t do. From robotics and gaming to defense and geolocation, we’ll explore promising technologies that are just that–still promises–and renew our sense of wonder at the way technology is influencing and altering our everyday lives.
From a recent internal email thread (slightly modified and redacted):
I’ve done a reasonable amount of work developing 3D spaces and evaluating the opportunities in immersive worlds. Along the way I’ve learned a lot about virtual worlds and the people who frequent them, not least of which is the unfortunate reality that nobody seems to be able to make any real money on the open-ended, user-generated content model.
While Second Life enjoys the occasional publicity bump on the backs of Boing Boing and Wired et al., they have yet to really nail down their business model short of “get bought by Google”. As others have noted, the connection between their virtual economy and that of the real world is tenuous at best and criminal at worst (see the shady operations of some of its private banks…). IBM and others respond to the hype and dump millions into corporate islands, only to realize that people aren’t particularly interested. The tools offered to users suffer from poor UI and steep learning curves, leading to small cliques of content creators sucking up Linden dollars from downstreamers who wish their avatars were more interesting. As we learned with Atmosphere, letting the users take responsibility for all the content leads to very limited and insular creativity, with a lot of folks simply standing around in fancy outfits. Spending any substantial time in SL or the other user-content worlds leaves me with the sad aftertaste that millions and millions of polygons are being wasted on a fancy chat client.
Now clearly, virtual worlds are extremely compelling. We want cyberspace and the metaverse, and companies like SL ride this sci-fi future dream as far as they can, hoping that if enough people believe it, then it will come true. A common side effect of the hype machine is that people jump on the panacea bandwagon and start to think that the 3D world can replace everything we do on the desktop or IRL. As others have noted, running training seminars in full-featured flat apps like Connect is much better than trying to do it in 3D. Likewise with watching video, surfing the web, or writing spreadsheets. To find value in virtual worlds is to determine what they do better than flatware. Blizzard knows that one of the best things 3D worlds do is provide an immersive environment in which to unroll a compelling narrative. SL ditched the narrative and assumes that the users want to create their own world from a blank palette. A simple glance at the numbers shows who has the better game plan for virtual worlds right now.
Content creation in 3D worlds is fraught with peril due to its complexity. Modelling in 3D will always be a professional endeavor, as it should be. It’s fricken hard. Scripting actions is also challenging but a little more accessible. Skinning jpegs for fashionable avatar textures? Maybe your average photoshopper can do this if they wish, but don’t we already make a lot of money off the professional gaming companies that integrated PS into their workflows a long time ago?
The real point of interest for me in spaces like SL is not the creation of virtual design content, but the creation and management of social content. The most compelling thing in any social network, flat or 3d, is the ability to find your friends/connections, to share and retrieve information, to discover affinity groups based on your interests, and to have access to simple agents that help better integrate the online self with the real-world self.
To my mind, the current value proposition lies in creating extensible flash widgets that crawl through social nets and help users manage the data and enhance their productivity. How can I find the knowledge experts that can help me use Photoshop for pre-press? As a knowledge expert, how can I let others know I’m here to help? How can a user manage and personalize their Suite workflows and integrate them with their online data? What’s the easiest way to meet a LinkedIn contact in a Connect session to show off a portfolio of Flash content? How can I derive a color space from an image that will then lead me to an online resource for similar images? How can I capture real world media inspiration from my mobile and make sure it easily and reliably gets into my Suite workspace? How can a Second Life avatar show more personal attributes, interests, connections, profiles, etc to others in the virtual world? If an SL buddy texts a friend from within the 3D world, can the friend receive the text and respond with their cellphone?
I think we need to regard virtual worlds not as islands of discrete opportunities but as extensions of the real world and of the datasphere. I see little value in creating tools to enable SL/There/etc content creation or in buying advertising space in-world. To me, the most exciting virtual space right now is the social information and collaboration space – and it’s moving into the mobile form-factor a lot more quickly than into 3D worlds. The best value, IMHO, is working on the interstitial technologies that integrate all of these diverse spaces and workflows.
In the meantime, I’ll continue dreaming about the metaverse until it arrives.
So here is the entire install pathway for your new plugin, as uncovered by your intrepid adventurer, who has not yet been able to successfully download and install his $300 software.
1) insert CD and run “installer”
2) enter Serial # and email
3) installer queries hardware (—, in this case) for Authentication id
4) installer then goes to the web and sends this data to the host, or some subsidiary handler
5) server then sends an email to the address you entered with a link to download the file
6) from the email, go to the server and download the file
7) drag app pkg into Applications folder
8) in app UI, enter activation code and wait for server handshake
9) run your new $300 application, happy in the knowledge that your software provider no longer thinks you’re a dirty rotten criminal.
Note the many potential points of failure and multiple questionably-secure web connections. And I still don’t have any usable software. Now they can bear the cost of my tech support phone call, and emails, and blogging, etc…
Again, I really like their stuff but this is just ridiculous. Professionals pay for software. Kids and criminals pirate. And kids often end up becoming professionals who buy your software because they pirated it when they were in school.
I deal with installers and activation requirements often at work, so this sort of thing really bugs me. I ordered a hard copy of the —- plugin. I received the CD and began the installation, only to find that my CD is just an empty installer shell that goes to a web server to download the file. So here I am with no internet access on my workstation, completely unable to fetch the plugin I paid $300 for, holding a 700MB CD that doesn’t even contain the full installer!
Found a workaround that claims to allow me to download the installer file from a web-enabled machine, then manually move it to my music workstation for install. However, the installer shell asks for Serial & Authentication info (which I have – legally), but has no way of cross-checking the info to verify that it’s an acceptable combo, is entered correctly, etc… it simply passes this text on to a web server.
And promptly returns an error saying the page is not available. No feedback about my installation. No suggestion that I entered the serial wrong or that my email doesn’t match the one they have for me. Nothing but the eternal winds of suckage. And I haven’t even made it to Activation yet…
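For what it’s worth, even a purely local sanity check on the serial would catch typos before any network round trip. A hypothetical sketch, with a serial format and check digit invented for illustration (this is not the vendor’s actual scheme):

```python
import re

ALPHABET = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def serial_plausible(serial):
    """Sanity-check a serial of the (invented) form XXXX-XXXX-XXXX-XXXX,
    where the final character is a mod-36 check digit over the rest.
    A failed check means a typo: no need to bother the server yet."""
    if not re.fullmatch(r"[A-Z0-9]{4}(-[A-Z0-9]{4}){3}", serial):
        return False
    chars = serial.replace("-", "")
    total = sum(ALPHABET.index(c) for c in chars[:-1])
    return ALPHABET[total % 36] == chars[-1]

print(serial_plausible("AAAA-AAAA-AAAA-AAA6"))  # valid example: True
print(serial_plausible("AAAA-AAAA-AAAA-AAA7"))  # one-char typo: False
```

A check like this gives the user immediate, specific feedback (“that serial looks mistyped”) instead of a dead web page, and costs the installer almost nothing.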
I’ve used — for 4 years now and I love the system, but this stuff really undermines my faith in their product. You should always always always do whatever you can to guarantee a successful and easy installation.
This does not keep your software from being pirated. It only pisses off the honest people that are trying to pay you for your product.
Again I implore you and every other software shop: make installation easy and reliable.