This is the most interesting thing I’ve read in a while. DARPA is using smart agent algorithms to crunch heavy data sets and convert them into human-grokable narratives. Before long, such agents will be living on our desktops, mobile devices, cars, and appliances, actively interpreting innumerable data streams, rendering them to transparent screens and speaking them through earbuds.
“Like people,” Darpa notes, such a story-telling system would be able to “retrieve and reuse stories to construct an appropriate interpretation of events … because they convey the aspects of a situation that are most important in determining a decision.”
Darpa hopes to have this Experience-based Narrative Memory (EN-Mem) system make “complex situations … simple, understandable, and solvable.”
…Making sense of a complex situation is like understanding a story; one must construct, impose and extract an interpretation. This interpretation weaves a commonly understood narrative into the information in a way that captures the basic interactions of characters and the dynamics of their motivations while filling in details not explicitly mentioned in the input stream. It uses story lines with which we all have experience as analogies, and it simplifies the detail in order to communicate the crucial aspects of a situation. The story lines it uses are those the decision maker should be reminded of, because they are similar to the current situation based upon what the decision maker is trying to do.
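The passage describes what amounts to analogical retrieval: stored story lines are matched against the current situation, with the decision maker’s goal biasing which similarities matter. As a rough illustration of that retrieval step only, here is a minimal sketch; the function names, feature-set representation, and scoring scheme are all my own illustrative assumptions, not DARPA’s design.

```python
# Hypothetical sketch of goal-biased story retrieval (illustrative only).
# Situations and story lines are represented as sets of features; matches
# on goal-relevant features count extra when scoring analogies.

def score(story_features, situation, goal, goal_weight=2.0):
    """Weighted feature overlap: goal-relevant matches count extra."""
    overlap = story_features & situation
    return sum(goal_weight if f in goal else 1.0 for f in overlap)

def retrieve(memory, situation, goal):
    """Return the stored story line most analogous to the situation."""
    return max(memory, key=lambda name: score(memory[name], situation, goal))

memory = {
    "supply-line collapse": {"convoy", "delay", "shortage", "route"},
    "negotiation stalemate": {"talks", "delay", "demands"},
}
situation = {"convoy", "route", "delay"}
goal = {"route"}  # what the decision maker is trying to do

print(retrieve(memory, situation, goal))  # → supply-line collapse
```

The point of the goal weight is the one the quote makes: which past story line is “similar” depends not just on surface overlap but on what the decision maker is trying to do.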