“The intelligence of the city is on the streets.” – Manu Fernandez
Amidst the swirling maelstrom of technological progress so often heralded as the imminent salvation for all our ills, it can be necessary to remind ourselves that humanity sits at the center, not technology.
I have a new article over at Big Think looking at trends in wireless implant technology and the vulnerability profile presented by our emerging integration with connected biodevices. This article builds on my previous post here, Ubicomp Getting Under Your Skin? So Are Hackers.
From the intro:
In what amounts to a fairly shocking reminder of how quickly our technologies are advancing and how deeply our lives are being woven with networked computation, security researchers have recently reported successes in remotely compromising and controlling two different medical implant devices. Such implanted devices are becoming more and more common, implemented with wireless communications both across components and outward to monitors that allow doctors to non-invasively make changes to their settings. Until only recently, this technology was mostly confined to advanced labs but it is now moving steadily into our bodies. As these procedures become more common, researchers are now considering the security implications of wiring human anatomy directly into the web of ubiquitous computation and networked communications.
A 15-minute presentation on the emerging ubicomp interface of the urban landscape.
Also, here’s the full slide deck.
There are some really fascinating innovations & opportunities arising at the convergence of embedded sensors, the built environment, 3D modeling, and augmented reality. Buildings, manufacturing chains, cities, and environments are increasingly communicating their run-time processes through embedded sensors & systems. The data streams pouring off these devices are driving rich visualizations in monitoring dashboards that give operators & managers high-resolution insight into the state of these systems. Soon, these datastreams will be wired into 3D models – perhaps the very same CAD models that were originally developed to construct a building will live on as a real-time model of its living operations. Such models of buildings, civic infrastructure, and the environments in which they’re embedded will provide up-to-the-minute assessments of their operations, from at-a-glance macro overviews to incredibly detailed micro reports. Then model branches – mirrors – could be created to run simulations of future states, e.g. what happens to traffic if we allow 20% more development on the north end of town?
With augmented reality, the potential exists to bus selective overlays from the model & its datastreams out to augmented interfaces. In this way building/civic managers, code enforcers, first responders, environmental analysts, and many others will be able to see the run-time state of their city/building/ecosystem drawn across the real world. For example, a broken water main downtown would immediately be reflected in the civic model, pinging the Water Dept. dashboards, which then route to a field agent’s mobile; the agent uses their AR heads-up display to pinpoint the exact location of the leak for repair, possibly pulling up street schematics and a guided 3D repair manual on-site.
This convergence of the instrumented world and its virtual representation, mediated by an augmented reality interface between the two, may yield unprecedented opportunities to model & optimize the very structures of civilization.
[Wow! Looks like Screampoint has a big head-start on this...]
[Autodesk is also working on developing tools for sensor-driven Building Information Modeling (BIM).]
Via Tish Shute at UgoTrade:
“Imagine an environment where most physical objects know where they are, what they are, and can, (in principle) network with any other object. With this infrastructure, reality becomes its own database. Multiple consensual virtual environments are possible, each oriented to the needs of its constituency. If we also have open standards, then bottom-up social networks and even bottom up advertising become possible. Now imagine that in addition to sensors, many of these itsy-bitsy processors are equipped with effectors. Then the physical world becomes much more like a software construct. The possibilities are both scary and wondrous.” (Vernor Vinge – intro to ISMAR 2009)
[This is a narrative exploration of an idea @jingleyfish & I had walking around the Westside of Santa Cruz late at night...]
Imagine walking around a town wearing your stylish Ray Ban augmented reality glasses (because hand-held mobile devices will become a significant limiting factor to experiencing the annotated world). You see small transparent white dots glowing on people and objects indicating that they contain accessible cloud content. Maybe you “select” (by whatever mechanism constitutes selection through a pair of eyeglasses) a bench on the sidewalk then view a flyout markup indicating that the bench was commissioned by the Bruce family in memory of Aldis Bruce, manufactured by the Taiwanese Seating Concern. You click through the family link to see a brief bio of Aldis with a set of links to his life story, works, etc…
In the upper corner of your view a light begins to blink indicating a new feed is available in your subscription list. You select and expand, showing a menu item for Bob’s Neighborhood Chat. Initializing this feed draws a new green dot over the bench, indicating that Bob has published information tagged to it. You click and Bob’s markup flies out with text stating, “Tuesday, March 11, 2010: Don & Charise Ludemeyer celebrated their 25th wedding anniversary by returning to the place where they first kissed in 1985.” A link below this offers the couple’s personal website, a photo gallery & playlist of their wedding, and then a link to more public markups about the bench.
Clicking through the “more” link offers a list of other public comments. You choose the Sur 13 layer just to see what the local hoods are up to. Flyout: “Hernandez Bros. shot down by Westside Brownshirts, Sept. 23, 2009. RIP, locos.” Then, drawn over, a bit-crushed graffiti logo “WSB” animates across the view, hacked into the Sur 13 layer by Brownshirts. A click through would open the full Brownshirt regional layer but you already feel like a trespasser on suddenly dangerous turf.
Unsettled, you call up the local Police layer. A trailing list of crimes within a 5-mile radius begins scrolling. You narrow the search to your current location with a two-week time horizon. Three yellow car break-in indicators glow along the road, followed by a red assault marker 10 feet down the walk, and then two more blinking red markers at the bench. You hover over the bench markers and learn of two shootings here within the last four days.
You open up the iCabNow utility, send up your beacon, and wait nervously for Yellow Cab to find you. You thumb back to the Ludemeyer markup and click through to find the playlist from their wedding. As you hop into the cab a few moments later, the theme from Miami Vice swells up in your earbuds, sending you off from this time-twisted place. You call up the WordTweet micromarker app and make a traveler’s note: “This is a dangerous bench with an old heart.” Click “Publish” and a new feed indicator appears, offering your own layer update to subscribers.
Here’s a selection of my tweets from the O’Reilly Emerging Technology Conference this past week. These are the ones I think grab the juicy nuggets from the speakers’ presentations. [In temporal order with the earliest (i.e. Monday eve) listed first.]
Tim O’Reilly: “We have greatness but have wasted it on so much.”
We have an unprecedented opportunity to build a digital commonwealth. #etech
Work on something that matters to you more than money. This is a robust strategy. #etech
Niall Kennedy: Energy Star rating for web apps? Thinking of clouds & programming like tuning a car for better gas mileage. #etech
Cloud computing: no reasonable expectation of privacy when data is not in your hands. Not protected by 4th amendment. #etech
Alex Steffen: Problems with water supply are based in part on our lack of beavers. #etech
Social media for human rights. http://hub.witness.org #etech
Gavin Starks – Your Energy Identity & Why You Should Care. see http://amee.com #etech
Maureen McHugh – Consider that technology may be evolving in ways that are not particularly interested in us. #etech
Becker, Muller: We have under-estimated the costs and over-estimated the value of our economy. #etech
Becker, Muller: We assume economic trade must be the primary framing of value in our lives. Why? #etech
Design Patterns for PostConsumerism: Free; Repair Culture; Reputation Scaled; Loanership Society; Virtual Production. #etech
NYT: emerging platforms, text reflow, multitouch, flexy displays, smart content, sms story updates, sensors, GPS localized content. #etech
Jeremy Faludi: Buildings & transport have the largest impact on climate change. Biggest bang for the buck in re-design. #etech
Jeremy Faludi – Biggest contributor to species extinction & habitat loss is encroachment & byproducts from agriculture. #etech
Jeremy Faludi – Best strategies to vastly reduce overpopulation: access to birth control & family planning, empowerment of women. #etech
Tom Raftery: Grid 1.0 can’t manage excess power from renewables. Solution: electric cars as distributed storage. #etech
Considering the impact of plugging AMEE (@agentGav) data into ERP systems for feedback to biz about supply chain impacts. BI meets NRG ID.
Mike Mathieu: Data becoming more important than code. Civic data is plentiful and largely untapped. Make civic apps! #etech
Mike Mathieu: Take 10 minutes today and pick your crisis. Figure out how to create software to help. #etech
What is #SantaCruz doing to make civic data available to service builders? We want to help SC be healthier & more productive.
Mark Frauenfelder: “I haven’t heard of anybody having great success with automatic chicken doors.” #etech [re-emerging technology]
Realities of energy efficiency: 1 gallon of gasoline = ~1,000 hours of human labor. #etech
Kevin Lynch: Adobe is saving over $1M annually just by managing energy. #etech
Designing backwards: Think about the destiny of the item before thinking about the initial use. (via Brian Dougherty) #etech
RealTimeCity: physical & digital space merges, people incorporate intelligent systems, cities react in accord w/needs of pub welfare. #etech
Oh my we’re being LIDAR’d while Zoe Keating plays live cello and loops. ZOMG!!!
zoe keating & live lidar is blowing my mind at #etech 1.3M points per sec!
Julian Bleeker cites David A. Kirby: “Diegetic prototypes have a major rhetorical advantage over true prototypes” #etech
Julian Bleeker: Stories matter when designing the future, eg. Minority Report. #etech
Julian Bleeker: “Think of Philip K. Dick as a System Administrator.” #etech
Rebecca MacKinnon: Which side are we helping, River Crabs or Grass Mud Horses? #etech
Kati London: How can we use games to game The System and how can they be used to solve civic problems? #etech
Nathan Wolfe: Trying to fight pandemics only at the viral human level ignores deep socioeconomic causes of animal-human transmission. #etech
Nathan Wolfe, re: viral jump from animal to human populations: “What happens in central Africa doesn’t stay in central Africa.”
Nathan Wolfe: need to work with % of population w/ hi freq of direct contact with animals for early detection of viral transmission.
Nathan Wolfe: Vast majority of biosphere is microscopic, mostly bacterial & viral. Humans: very small piece of life on Earth. #etech
[This is a reply I left recently to a Global Futures question about the near-future of the web. It goes a little off-topic at the end but such is the risk of systems analysis. Everything's connected.]
Within 10-15 years mobile devices will constantly interact with the world around us, analyzing objects, faces, signage, locations, and anything else their sensors can engage. Camera viewfinders will identify visual sources using algorithms to match them up with cloud data repositories. Bluetooth and GPS will interact on sub-channels, silently exchanging relationships with embedded sensors across devices and objects. A user’s mobile device will become their IP address, hosting much of their profile information and mediating relationships across social nets, commercial transactions, security clearances, and the array of increasingly smart objects and devices.
Cloud access and screen presence will be nearly ubiquitous, further blurring the line between desktop, laptop, server, mobile devices, and the objects in our world. It will all be screens interfacing between data, objects, and humans. Amidst the overwhelming data/content glut we will outsource mathematical chores to cloud agents dedicated to scraping data and filtering the bits that are pertinent to our personalized affinities and needs. These data streams will be highly dynamic and cloud agents will send them to rich media layers that will render the results in comprehensible and meaningful displays.
The human sensorium and its interaction with reality will be highly augmented through mobile devices that layer rich information over the world around us. The digital world will move heavily into the natural analog world as the boundaries between the two further erode. This will be readily apparent in the increasing amount of communication we will receive from appliances, vehicles, storefronts, other people, animals, and even plants all wired to the cloud. Meanwhile, cloud agents will sort through vast amounts of human behavioral information creating smart profiles and socioeconomic and environmental systems models with incredible complexity and increasing predictive ability. The cloud itself will be made more intelligible to agents by the standardization of semantic web protocols implemented into most new sites and services. Agents will concatenate to tie services together into meta-functions, just as human collectives will be much more common as we move into increasingly multicellular functional bodies.
The sense of self and our philosophical paradigms will be iterating and revising on an almost weekly basis as we spread out across the cloud and innumerable virtual spaces connected through instantaneous communication. Virtual worlds themselves will be increasingly common but will break out of the walled-garden models of the present, allowing comm channels and video streams to move freely between them and the social web. World of Warcraft will have live video feeds from in-world out to device displays. Mobile GPS will report a user’s real-world location as well as their virtual location, mashing both into Google Maps and the SketchUp-enabled virtual map of the planet.
All of this abstraction will press back on the world and create even greater value for real face-to-face interactions. Familial bonds will be more and more cherished and local communities will take greater and greater control of their lives away from unreliable global supply chains and profit-driven corporate bodies. Most families will engage in some form of gardening to supplement their food supply. The state itself will be hollowed out through over-extended conflicts and insurgencies coupled with ongoing failures to manage domestic civic instabilities. Power outages and water failures will be common in large cities. This will of course further invigorate alternative energy technologies and shift civic responsibilities to local communities. US manufacturing will have partially shifted towards alternative energy capture and storage but much of the real successes will be in small progressive towns rallying around local resources, small-scale fab, and pre-existing economic successes.
All in all, the future will be a rich collage. Totally new and much the same as it has been.
I’m heartened to find the Metaverse Roadmap, sponsored by the Acceleration Studies Foundation. While I’ve been moaning about the shortcomings of immersive 3D technologies, they’ve been defining the template for progress. Much of their thinking aligns with my own, painting an exciting future of convergence across modalities, devices, and workflows.
The emergence of a robust Metaverse will shape the development of many technological realms that presently appear non-Internet-related. In manufacturing, 3D environments offer ideal design spaces for rapid-prototyping and customized and decentralized production. In logistics and transportation, spatially-aware tags and real-time world modeling will bring new efficiencies, insights, and markets. In artificial intelligence, virtual worlds offer low-risk, transparent platforms for the development and testing of autonomous machine behaviors, many of which may be also used in the physical world. These are just a sampling of coming developments based on early stage Metaverse technologies.
In sum, for the best view of the changes ahead, we suggest thinking of the Metaverse not as virtual space but as the junction or nexus of our physical and virtual worlds.
In a disturbing-but-not-surprising move, the U.S. military is contracting the development of small robotic biomimics for field deployment. Equipped with sensors and networked relays these robocritters will likely end up scurrying through apartment complexes at home and abroad, ala Minority Report. Expect swarming behaviors, social intelligence, and networked biometrics.
Everybody freeze for the spiders…
British defence giant BAE Systems is creating a series of tiny electronic spiders, insects and snakes that could become the eyes and ears of soldiers on the battlefield, helping to save thousands of lives [ed note: the video shows bugs being used to target a building for rocket attack].
Prototypes could be on the front line by the end of the year, scuttling into potential danger areas such as booby-trapped buildings or enemy hideouts to relay images back to troops safely positioned nearby.
Soldiers will carry the robots into combat and use a small tracked vehicle to transport them closer to their targets.
Then they would swarm into the building and relay images back to the soldiers’ hand-held or wrist-mounted computers, warning them of any threats inside.
BAE Systems has just signed a £19 million contract to develop the robots for the US Army.
QR Code is a UPC-like image code very popular in Japanese cities. Codes appear in magazines, on fliers, on storefronts, and on products. When a person takes a picture of a QR Code with their cellphone, the code is parsed for an embedded URL, which launches the mobile web browser and takes the user to a website. Now, QR Codes will be tested in San Francisco in the first US pilot program.
“More than 500 restaurants, shops and businesses reviewed by Citysearch are placing printed bar codes in their windows. People who have special software from Scanbuy Inc. loaded on their cell phones can simply take a picture of the code and their phone’s Internet browser will immediately take them to the restaurant’s corresponding Citysearch page.”
This is an interesting step towards smart objects where things begin to have their own websites. I suspect this is just a step along the way towards using an embedded RFID-type chip that will transmit stored information to mobiles while users pass by the tags. I can imagine a time when all consumables and media contain an alter-profile of data and cloud-aware links and can communicate these to each other, to users/consumers, and to supply chains…
This was a great conference and the most consistent collection of speakers and topics I’ve ever experienced. Very fun and inspiring. Lots of hip 30-somethings trying to dream up tomorrow and make it real. It was a very balanced, yet cutting-edge event aimed at an eager (and surprisingly mixed-gender) crowd. I noticed that most folks were using Mac laptops – this part of the edge seems to prefer Apple – and it was fascinating to watch many who were blogging the talks while pulling up references dropped by the speakers, tweeting out to Twitter, and snapping/downloading/posting photos in real-time. As speakers dropped references I was pulling them up on my laptop and dropping links into my blog notes.
In the lobby a team was showing off a data viz video mapping real-time communications connecting NYC to the rest of the world. Andrea noticed that a surprising number were with an Italian city called Perugia. Maybe next year they could map the live feed of all web traffic from ETech. Imagine the bitstreams rising off such a gathering of digerati.
Maybe it was just the Sudafed coursing through our virus-ridden veins (thank you Portland) but ETech was a total intellectual turn-on, from ambient objects, Asian mobile media, green policy and sustainability, hardware hacking & drone building, Austrian post-Situationists, neuroengineering, and the digital salvation of Democracy itself.
I hope I can go back next year!
“Insights and Applications from the Behavior of the Aggregate”. Using cell phones as trackable tags, then extrapolating patterns. Learning about the aggregate by sampling the individual.
Nathan is a research scientist at MIT & Santa Fe. Also holds positions across sub-Saharan Africa.
Mobile phones are the fastest tech adoption in human history. People have extraordinary processing power and access to data. A new era of wearable computing.
Data, Science, and Engineering: Social network analysis. Classical social net metrics breakdown quickly as networks grow.
Demo: dynamic display of individual people walking around Cambridge, Mass, making cell calls. After accumulating data over time, can we make predictions about behaviors? What happens when we extend this to dyads of people? What relationships can be inferred from where and when dyads are? What about aggregate behavior of the whole? Do patterns emerge based on outlying events?
Data being logged in this trial: cell tower IDs, proximate Bluetooth device names/activity, phone call/text logs. Obviously huge privacy implications. All subjects were informed of the logging. Have accumulated over 400,000 hours of continuous human behavior data collected over 2004-2005.
Transitional probabilities are used to evaluate, e.g., the likelihood of being at home versus being at work. Information entropy = the ratio of structure to randomness in a subject’s routine. Shows variations between highly habitual individuals and more random people. Low-entropy subject vs. high-entropy subject (which one are you?). “The Entropy of Life”. This data can be mapped against demographics to see which lifestyles are more or less entropic.
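The entropy-of-routine idea can be sketched in a few lines: Shannon entropy over a subject's distribution of visited locations. The location logs below are invented for illustration.

```python
from math import log2

def shannon_entropy(counts):
    """Shannon entropy (bits) of a distribution given raw counts."""
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values() if c > 0)

# Hypothetical hourly location logs for two subjects over a week
habitual = {"home": 90, "work": 70, "cafe": 8}          # tight routine
wanderer = {"home": 40, "work": 30, "cafe": 30,
            "gym": 25, "bar": 25, "park": 18}           # scattered routine

print(shannon_entropy(habitual) < shannon_entropy(wanderer))  # True
```

The habitual subject's routine compresses into fewer bits, which is exactly why such people are easier to predict (and, per the contagion point below, easier to contain).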
These models can be extended to map and model infectious patterns of contagions through social nets. Higher entropy individuals make containment much more difficult. (Work in progress).
Eigenbehaviors: A way to reduce highly-dimensional behavior data into a set of vectors that characterize individual behavior, but also behavior of demographics. Can a subject’s affiliation (demographic) be inferred from behavior patterns? Behavior space allows inference of demographic with high accuracy (90%).
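Eigenbehaviors are essentially principal components of daily behavior vectors. A toy sketch with an invented at-work/elsewhere matrix (real data would have far richer dimensions):

```python
import numpy as np

# Hypothetical behavior matrix: each row is one day, each column an
# hour-slot (1 = subject at work that hour, 0 = elsewhere).
days = np.array([
    [0, 1, 1, 1, 0],   # weekday routine
    [0, 1, 1, 1, 0],
    [0, 1, 1, 0, 0],
    [0, 0, 0, 0, 0],   # weekend
    [0, 0, 0, 0, 0],
], dtype=float)

# Eigenbehaviors = principal components of the centered day-vectors
centered = days - days.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
top_eigenbehavior = vt[0]      # the dominant daily pattern

# Any day collapses to a few weights on these vectors: a compact
# behavioral signature that can be compared across demographics.
weights = centered @ vt[:2].T
print(weights.shape)           # (5, 2)
```

Projecting a new subject's days onto a demographic's eigenbehavior basis and measuring the reconstruction error is one plausible route to the ~90% affiliation inference mentioned above.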
Friendship vs. proximity networks: can friendship be inferred from proximity? Behavioral signatures – friend vs. acquaintance. Properties: proximity on Saturday night, phone communication, number of unique locations, proximity outside work, proximity at work, proximity at home.
These models allow inferences about the true topology of social/friend networks. Data from mobile phones allows a much richer picture of social nets.
Organizational Rhythms: how the deadlines of an institution can be seen in the collective behavior of its individual members.
Network data mining: scales to 250 million nodes (phone #’s). Telecom corps are very interested. 5,000 calls/sec; 12 billion calls/month. Anonymized. Highly stable statistical averages are yielded every day. Furthermore, monthly plots are highly consistent from month to month. I.e. human behavior across large numbers is highly organized over time. Why does the symmetry exist? Why is the monthly curve of cell use the same every month for millions? What patterns/events coordinate or influence this behavior?
Diversity of your social net seems to correlate with positive socio-economic accomplishment (ie mo’ money & success).
Life inferences: Sleeping, Lunch = easy. Partying? Trickier. Auto diary to track your behaviors. How much sleep did I get? What did I do last Saturday night after midnight? How much time do I spend driving? Can I make predictions about my life?
The importance of triangles and mutual influence. Product adoption can be correlated in friend triangles. Friends have more influence over your purchasing.
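The triangle point is concrete graph math: a friend triangle is just a mutually connected triple. A toy sketch (the friendship graph is invented):

```python
from itertools import combinations

# Hypothetical friendship graph as adjacency sets
friends = {
    "ana": {"bo", "cy"},
    "bo":  {"ana", "cy", "di"},
    "cy":  {"ana", "bo"},
    "di":  {"bo"},
}

def triangles(graph):
    """Return all friend triangles (mutually connected triples)."""
    found = set()
    for a, b, c in combinations(graph, 3):
        if b in graph[a] and c in graph[a] and c in graph[b]:
            found.add(frozenset((a, b, c)))
    return found

print(len(triangles(friends)))  # one triangle: ana-bo-cy
```

In the product-adoption studies, an adoption event inside a closed triangle like ana-bo-cy predicts further adoption better than one along an open edge like bo-di.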
EPROM – Educating sub-Saharan students on the use of mobile data mining. SMS bootcamp. Mobile programming. Making epidemiology inferences from behavioral patterns, e.g. malaria susceptibility. Reality Mining Africa. SMS Bloodbank, BoonaNet.
Individual behavior prediction; relationship inference; organizational rhythms & aggregate behaviors; scalability and large-scale network analysis. Africa is fastest growing mobile phone market in the world. Incredibly smart kids in Africa hungry for this knowledge.
Note: this will be used by federal agencies to identify “terror” cells and predict criminal behavior.
Hardware is much easier to copy now. Hardware & software are blurring – ex: firmware updates.
Speed of hardware hacking is remarkable.
Why open source hardware? Contribute to the pool of knowledge; freedom to pursue software/hardware creativity; community development and quality; excitement about building things; education;
- Hardware/mechanical diagrams: 2D models, vector, DXF or AI (KiCAD)
- Schematics & circuit diagrams: PDF, BMP, GIF, PNG
- Parts list (Bill of Materials): data sheets (x0xb0x TB303)
- Layout diagrams: physical map of parts
- Core/Firmware: on-board source code
Like most developers, they don’t mention the human interface layer.
Roomba has an open API. Companies that release open platforms find much greater value (and mindshare) from user mods.
Ambient Orb publishes schematics and parts list. Neuros OSD publishes schematics (semi-open but falls short).
Hardware is mostly based on patents, not copyright. Licensing: CC, GPL, BSD, MIT
Chumby: programmable data portal.
Cool stuff: Twittering plants with Arduino – plants that call you and say they need to be watered (Twitter as SMS bridge); Open prosthetics; Minty Boost open source USB charger;
Ed note: Imagine an online repository of mechanical diagrams for DIY desktop fab/rep…
Fun with robots! Making aerial drones. Eye in the Sky. DIYdrones.com
UAV’s are very expensive. How to democratize the tech?
How cheap and simple can a UAV be?
Two requirements: stabilization & navigation.
$80 copilot: where is “down”? Use IR to seek horizon – consistent gradient between land & sky. Yields absolute frame of reference.
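A toy version of the IR-horizon trick: sky reads colder than ground on infrared sensors, so a left/right pair yields an error signal for leveling the wings. All the numbers and the gain below are invented for illustration.

```python
def roll_error(left_ir, right_ir):
    """Positive when the left sensor sees more warm ground than the
    right, i.e. the plane is banked right. Normalized to [-1, 1]."""
    total = left_ir + right_ir
    if total == 0:
        return 0.0
    return (left_ir - right_ir) / total

def aileron_command(left_ir, right_ir, gain=0.8):
    """Proportional correction back toward wings-level flight."""
    return -gain * roll_error(left_ir, right_ir)

# Banked right: left wing sees warm ground, right wing sees cold sky
print(aileron_command(left_ir=30.0, right_ir=10.0))  # negative -> roll left
```

Normalizing by the total reading makes the error a pure ratio, which is what gives the trick its absolute frame of reference regardless of ambient temperature.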
Could LEGO solve the problem? Yes (under $1000): a Mindstorms controller in a light model plane. Basic prototype, requires manual takeoff & landing.
Onboard camera takes pics with geotagging. Generates low-cost aerial photos at very high resolution by flying low with a 5Mpix camera.
OK, but can you use a cellphone? GPS, camera, broadband, onboard processing & mem.
Yes! Airplane now has a phone # – send it GPS waypoints (not yet realized in prototype).
In theory, small UAVs can hop across cell networks for nav & comm.
IR horizon sensor can also be used to stabilize the camera so it always looks down.
Be careful, especially when flying over secured federal facilities!
Can we make it cheaper (under $500)? Yes, using homemade embedded processors. Any open source or cheap chip can support an autopilot routine.
Program & test with flight sim apps. Watch your robotic UAV run the flight sim!
How to make an aerial robotic contest for kids? Use small blimps.
Blimps are intrinsically autonomous; when they fail, they fail very gracefully; nice to have around.
Prototype maintains altitude by pinging off the ground (IR); a vertical prop holds elevation; an IR beacon acts as a waypoint; the blimp will seek the waypoint; with a relative frame of reference it can use compass and IR to make its way across waypoints.
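The altitude-hold behavior is about the simplest control loop there is: a proportional throttle on the vertical prop, driven by the IR range ping. Gains and numbers below are invented.

```python
def vertical_prop_power(measured_altitude_m, target_altitude_m,
                        gain=0.5, max_power=1.0):
    """Proportional throttle for the vertical prop: push up when below
    target, ease off when above. Clamped to the motor's range."""
    error = target_altitude_m - measured_altitude_m
    return max(-max_power, min(max_power, gain * error))

# Blimp pings the ground and reads 1.2 m; it wants to hover at 2.0 m
print(vertical_prop_power(1.2, 2.0))  # 0.4 -> gentle climb
```

This is also why blimps fail gracefully: if the loop dies, the craft just drifts instead of falling out of the sky.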
Live demo: blimp is following the presenter around the room. ~$100. Entirely autonomous, if not very smart.
Evolution Robotics is a company that produces a bot nav solution. Paired with autopilot, the UAV can use more advanced navigation and movement. Aerial robotics is the cutting edge of robotics: “Soon the sky will be darkened with aerial drones!”
Regulations govern UAV deployment. Amateurs must fly under 400 ft, maintain line-of-sight, and the pilot must be able to assume full control.
Very limiting. Power source is also a limit.
Ed Notes: could use RFID or other beacons to deploy UAV over your home or for tracking your location; pair with live hi-res camera feed.
Rough notes on ambient devices – objects that express data streams, ala Ambient Orb. Ubiquitous computing, PC-free internet.
Expressors: motion, color, angle, pattern, text.
Devices should be pre-attentive, calm, and glanceable. “Bit-trickling datacasting”.
Energy Joule device: plugs into an outlet, exposing customers to energy use/price – the real-time price of energy & their own usage in the house.
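At bottom, a device like the Energy Joule is a mapping from a live number to a glanceable display state. A sketch with invented price bands (a real unit would presumably pull thresholds from the utility's tariff feed):

```python
def price_to_color(cents_per_kwh):
    """Map a live electricity price to an ambient glow color."""
    if cents_per_kwh < 10:
        return "green"    # cheap: run the dishwasher now
    elif cents_per_kwh < 20:
        return "amber"    # normal
    return "red"          # peak pricing: shed load

print(price_to_color(7.5))   # green
print(price_to_color(24.0))  # red
```

Collapsing the feed to three colors is the "pre-attentive, calm, glanceable" constraint in action: no numbers to read, just a glow in the corner of the eye.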
“Enchanted Objects”: support continuous, thin, awareness-communication.
Amulets, pentacles, potions – objects of healing.
Pervasive is persuasive – objects that grab attention and communicate data will change behavior towards the content being communicated.
Dashboards as feedback devices for personal behavior, health.
Ex: mirror with led icons that reflect blood pressure by analysis; dashboard with pollen count.
Other ex: intelligent pill box that sends dose reminders to a display device; active timed glow caps on prescription bottles.
Using shared information to enlist social dynamics (info begets behavior).
Look to fiction, pop culture for inspiration about expressive devices.
Goal: Enhance quality of life and make things better. Enable data acquisition without personal computers.
Hacking brains & iPhones, building DIY aerial drones, ambient data streaming, data viz and crowd movements, ARGs, Vegas, and the Self awakened to its own tech. Oh baby!
With the help of my special lady friend (who got her work to spring for the hotel, pass, and airfare) and the help of my employer (I’m doing some booth shifts on the floor in exchange for a pass – I get to rep Adobe AIR), I’m leaving tomorrow morning for sunny San Diego and a week at the O’Reilly Emerging Technology Conference! I’m psyched. I’ve wanted to go for the last few years but couldn’t afford it. All this time, I should have just told my corporate overlords they needed to send me on the company ticket!
I’ll be sending photos to the urbeingrecorded portal via tumblr, and I’ll likely post some keen bits here. Otherwise I’ll be fast hacking my iPhone to control a robotic crowd-sourcing drone I will use to track the culinary habits of tech luminaries and international political dissidents whose footpaths I’ll be datastreaming to various dynamic art installations and ambient devices.
From their site:
How does technology help you perceive things that you never noticed before? How does it help you be found, or draw attention to issues, objects, ideas, and projects that are important, no matter their size or location?
At the 2008 version of ETech, the O’Reilly Emerging Technology Conference, we’ll take a wide-eyed look at the brand new tech that’s tweaking how we are seen as individuals, how we choose to channel and divert our energy and attention, and what influences our perspective on the world around us:
Body Hacking. Genomics Hacking. Brain Hacking. Sex Hacking. Food Hacking. iPhone Hacking.
DIY Aerial Drones. DIY Talking Things. DIY Spectrum. DIY Apocalypse Survival.
Emerging Tech of India, Cuba, and Africa. International Political Dissidents.
Visualize Data and Crowds. Ambient Data Streaming.
Good Policy. Energy Policy. Defense Policy. Genetic Policy. Corruption.
Alternate Reality Games. Emotions of Games. Sensor Games.
ETech 2008 will cover all of these topics and more. We put on stage the speakers and the ideas that help our attendees prepare for and create the future, whatever it might be. Great speakers are going to pull us forward with them to see what technology can do… and sometimes shouldn’t do. From robotics and gaming to defense and geolocation, we’ll explore promising technologies that are just that–still promises–and renew our sense of wonder at the way technology is influencing and altering our everyday lives.
Ford is leveraging RFID tech to help workers track their tools.
Developed with DEWALT and ThingMagic, Tool Link offers owners the capability to mark and scan high-value tools, safety equipment, material inventories and other important assets using RFID tags. When the vehicle is running, a pair of RFID antennas, mounted in corrosion- and impact-resistant housings on the inside of the pickup box, scan the box for the items on a pre-programmed inventory list.
The data is transmitted to a reader mounted inside the cab and displayed on the in-dash computer screen, alerting the driver if any inventoried tools are not loaded on the truck.
And PC World opines on the near-future of smart objects.
We’re entering the era of “ambient intelligence,” when everyday objects will contain technology that broadcasts data about themselves and their environment, says Liebhold.
As you approach a dangerous intersection, sensors in your car will detect it and reduce speed. GPS coordinates of places unsafe to walk at night will be broadcast to mobile devices.
In Japan, location-based services from GeoVector let Mapion’s Pointing Application deliver information on businesses inside a building at the point of a GPS-enabled camera phone. U.S. handsets with the technology should appear by year’s end.
In homes, floor sensors will detect empty rooms and automatically lower the thermostat and turn off lights. Agilewaves, a firm started by ex-NASA scientists, is working with builders to install sensors on electrical switches, pipes, and gas valves. Eventually they hope to offer neighborhoods, subdivisions, or municipalities a big-picture view of their carbon footprint.
Deeper into the Googleplex:
One plan, which has already tentatively started, entails making literally everything in the world accessible at the click of a button. For now, this means every book, piece of music, film, TV and radio broadcast, official document and photograph.
But eventually… Google boffins believe it can be extended to people and their personal belongings.
The idea is that we, and our treasured possessions, will be fitted with minute microchips which could be linked to the internet, via computers, by a digital radio frequency.
In this way, you would only have to type “Where is my watch” or “Find Joe Bloggs” into your PC or handheld computer, and Google could assist you.
…More immediately, Google is switching its main focus from PCs and laptops to mobile phones.
Smashing Magazine has a brief but nice round-up of items under the title User Experience of the Future. They list several technologies under development – some of which I’ve blogged about on a few occasions, like multi-touch and the Reactable – all of which taken together certainly paint an intriguing near-future. Off the radar are the skunk works, undiscovered breakthroughs, and emergent interactions between devices and their interface with user communities that will push the ever extruding scifi narrative further into weirdness and fancifulness. Crowley considered the new age as being represented by the spiritization of matter, and I think we’re seeing that on greater and greater scales as the lines between human and machine, imagination and reality, continue to blur into strange new forms. As Clarke wrote, that which is sufficiently technologically advanced is indistinguishable from magic.