The Meltwater Social Web Analytics team came round today to tell me about their plans for their service. They are starting out with the confidence and aggression that typified Meltwater’s entry into the ‘traditional’ media monitoring market six years back… and they’ve done pretty darn well in that regard.
For speed to market, they are currently white labelling Techrigy’s rather nifty SM2 service (shout out to @aaronnewman), and I understand this will form a ‘base’ or foundation for their endeavours going forward.
I enjoyed our conversation. In the short hour we had together we covered approaches to quantifying influence, assessing Twitter, semantic analysis approaches to gauging sentiment (aka tone), the growth in the number of Social Web Analytics vendors, the importance of the UI and ‘prettiness’ of charts, and pricing.
We debated my assertion that no one service serves all needs right now, and that a stable of differently capable services (often at different price points) is required. We even had time to chew over how Racepoint Group has achieved such distinct leadership in this field and the prospects for data visualisation.
Which is a super segue to another couple of interesting videos relating to my continuing obsession with, and search for, data visualisation technology and approaches to help PR consultants influence and be influenced more effectively and efficiently.
First up is a TED video from this year’s conference (February) by JoAnn Kuchera-Morin of the Center for Research in Electronic Art Technology (CREATE) at UC Santa Barbara. I’ll defer to the official description of CREATE’s Allosphere if you’ll allow me:
The AlloSphere space consists of a 3-story cube that is treated with extensive sound absorption material making it one of the largest anechoic chambers in the world. Standing inside this chamber are two 5-meter-radius hemispheres constructed of perforated aluminum that are designed to be optically opaque and acoustically transparent.
There are currently two projectors, soon to be multiple high-resolution video projectors, mounted around the seam between the two hemispheres, approaching eye-limited resolution on the inner surface. The loudspeaker real-time sound synthesis cluster (around 500 individual speaker elements plus sub-woofers) is/will be suspended behind the aluminum screen resulting in 3-D audio. Other clusters include simulation, sensor-array processing, effector-array processing, real-time video processing for motion-capture and visual computing, render-farm/real-time ray-tracing and radiosity cluster, and content and prototyping environments.
Anyway, probably best understood in the video. If anyone has two large hemispheres they no longer want, please let me know @sheldrake.
On a more immediately applicable scale, check out SweetNTweet below. It shows a lovely little application (built with the open source Processing 1.0) in which search keywords are entered and to which Tweets from Twitter gravitate in the form of candy-coloured petals. On reaching their destination they reveal their 140 characters of wisdom and beauty.
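The two ideas at the heart of that demo — matching incoming tweets against a search keyword, then animating each match toward a destination point — are simple to sketch. The following is a toy illustration in Python, not the actual SweetNTweet/Processing code; the tweet texts, keyword, and coordinates are all made up for the example:

```python
import math

def matches(tweet: str, keyword: str) -> bool:
    """Case-insensitive keyword match, standing in for a Twitter search."""
    return keyword.lower() in tweet.lower()

def step_toward(pos, target, speed=5.0):
    """Move pos one animation step toward target; snap when within one step."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:
        return target
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

# Hypothetical incoming tweets; only those matching the keyword become petals.
tweets = [
    "Loving the new data visualisation demo",
    "Lunch was great today",
    "Visualisation tools for PR are improving",
]
petals = [(0.0, 0.0) for t in tweets if matches(t, "visualisation")]

# Each frame, every petal gravitates toward the keyword's position on screen.
target = (100.0, 100.0)
for _ in range(50):
    petals = [step_toward(p, target) for p in petals]

print(len(petals), petals[0])  # → 2 (100.0, 100.0)
```

In the real application the petals would be drawn each frame (revealing the tweet text on arrival) rather than printed, but the filter-then-gravitate loop is the same shape.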
Does it really show any promise of helping PR consultants visualise their landscape? Nope, but it’s really quite pretty and might spark something more relevant in your mind!
Previous posts on visualisation: