There are a lot of people wondering what to do with all the data being generated by social tools and sites around the web, and by the social tools and services inside organizations. The answer is to watch the flows, but the payoff is not in the flow itself; it is in contextualizing the data into usable information. Sadly, few systems have had the metadata available to provide context for location, conversation flow, or relevant objects (nouns), or the ability to deal with the granular social network.
How many times have you walked past a book store and thought, “Hmm, what was that book I was told I should check out?” Or, “my favorite restaurant is booked full; what was the name of the one recommended near here a month or so ago?” When conversations are digitized in services like Twitter, Facebook, or the hundreds of other shared services, they should be able to come back to you easily. Add in Skype or IM, which are often captured by the tools and could be pulled into a global context around you, your social connections, the contexts of interest for those relationships, and the context around the object or subject discussed, and the ability to search your way back to this should be within relatively easy reach.
Latency from Heavy Computational Requirements
What? I can already hear the engineers screaming about the computational power needed to do this, as well as the latency in such a system. At Design Engaged 2005 I brought up a similar scenario, within the context of my Personal InfoCloud and Local InfoCloud frameworks, in a talk called Clouds, Space & Black Boxes (a 500kb PDF). The key then, as it still is, is using location and people to build potential context and preprocess likely queries.
When my phone shares my location with the social contextual memory parser service and the service sees I am quite near a book store, it can queue the parsing of shared books, favorited conversations about books, recent wish list additions (as well as older ones), and so on. But it is also the time I usually eat or pick up food for a meal, so restaurant and food conversations get parsed too, along with favorited food blogs (bookmarked in Delicious, rated on the blogs themselves, copied into Evernote, or stored in Together or DevonThink on my desktop), to surface new options or remind me of forgotten favorites.
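The preprocessing idea above can be sketched in a few lines. This is a hypothetical illustration, not any real service's API: the place types, topic names, and meal windows are all assumptions standing in for whatever signals a real contextual memory parser would use.

```python
from dataclasses import dataclass
from datetime import time

@dataclass
class Context:
    nearby_place_types: set[str]  # e.g. derived from the phone's location
    current_time: time

# Assumed mapping from context signals to query topics worth pre-warming.
PLACE_TOPICS = {
    "bookstore": ["shared_books", "book_conversations", "wish_list"],
    "restaurant": ["restaurant_conversations", "food_blog_favorites"],
}

# Assumed mealtime windows that trigger food topics on their own.
MEAL_WINDOWS = [(time(11, 30), time(13, 30)), (time(18, 0), time(20, 0))]

def likely_queries(ctx: Context) -> list[str]:
    """Return the topics to preprocess, given where and when the user is."""
    topics: list[str] = []
    for place in ctx.nearby_place_types:
        topics.extend(PLACE_TOPICS.get(place, []))
    # Mealtime adds food-related topics even with no restaurant nearby.
    if any(start <= ctx.current_time <= end for start, end in MEAL_WINDOWS):
        for t in PLACE_TOPICS["restaurant"]:
            if t not in topics:
                topics.append(t)
    return topics

# Near a bookstore at lunchtime: book topics plus food topics get queued.
ctx = Context(nearby_place_types={"bookstore"}, current_time=time(12, 15))
print(likely_queries(ctx))
```

The point of the sketch is that the expensive parsing runs ahead of the question, triggered by cheap signals (place, time), so the answer feels instant when the user finally asks.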
Now, if we pull this contextual relevance into play with augmented reality applications, we get something that starts bringing Amazon-style recommendations and suggestions into our lives, as well as surfacing information we “knew” at some point to our fingertips when we want it and need it.
Inside the Firewall
I have been helping many companies think through this inside the firewall, to, as one client put it recently, “have what we collectively know brought before us to help us work smarter and more efficiently.” The biggest problem is poor metadata and the lack of even semi-structured data from RDFa or microformats. One of the most important pieces of metadata is identity: who said what, who shared it, who annotated it, who commented on it, who pointed to it, and what that person’s relationship is to me. Most organizations have not thought to ensure that tiny slice of information is available or captured in their tools and services. Once this tiny bit of information is captured and contextualized, the results are dramatic. Services like Connectbeam did this years ago with tags in their social bookmarking tool, and kept that identity capture when they extended the ability to add tagging, and with it context, to any service.
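To make the “tiny slice of identity” concrete, here is a minimal sketch of the idea: record who acted on an item and how, then weight each action by that person’s relationship to the reader. The activity shape, relationship labels, and weights are all invented for illustration; a real system would derive them from the organization’s own social graph and tools.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    item: str
    person: str
    action: str  # "said", "shared", "annotated", "commented", "pointed"

# Assumed relationship weights: closer connections count for more.
RELATIONSHIP_WEIGHT = {"teammate": 3.0, "follows": 2.0, "coworker": 1.0}

def score(item: str, activities: list[Activity],
          relationships: dict[str, str]) -> float:
    """Sum each action on an item, weighted by the actor's relationship
    to me; strangers still count, just for less (0.5)."""
    return sum(
        RELATIONSHIP_WEIGHT.get(relationships.get(a.person, ""), 0.5)
        for a in activities
        if a.item == item
    )

acts = [
    Activity("wiki/search-tips", "ana", "shared"),
    Activity("wiki/search-tips", "raj", "commented"),
]
rel = {"ana": "teammate", "raj": "follows"}
print(score("wiki/search-tips", acts, rel))  # 5.0
```

Without the identity slice, both activities would be anonymous and weigh the same; capturing who did what is what lets the same raw activity stream rank differently for each person reading it.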