On June 16 at the iHub in Nairobi, Jon Gosier, director of Swift River, gave a talk. I first heard of Swift River through its use with Ushahidi after the Haiti earthquake in early 2010, when crisis rescue & response teams faced the challenge of processing 200,000 SMS messages a day. The question then was: how do you filter that information to get help to the people (i.e. earthquake victims) who need it?
Jon said Swift River is an open source platform that uses algorithms and crowdsourcing to filter and validate information. He gave a quote from the book ‘The Long Tail’: with such vast amounts of data in the world, there is a need for filters. Swift River helps by pulling out relevant data, and he mentioned some of the tools used. In his guide titled ‘Swift River in Plain English’, Jon lists the arms of Swift River, which include:
• SiLCC pulls keywords from any text (including SMS and Twitter) and automatically sorts related text (a natural language processor)
• SULSa automatically detects the location of incoming content/reports
• SiCDS automatically filters out duplicate content (re-tweets, blogs, text messages)
• Reverberations detects how influential/popular content is online
• RiverID allows Swift users to carry their Swift score and reputation with them across the web
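To make the filtering idea above concrete, here is a minimal sketch of duplicate suppression in the spirit of SiCDS: normalize each incoming message and drop any whose fingerprint has already been seen, so re-tweets and reposts don't flood responders. This is an illustration only; the function names and normalization rules are my own assumptions, not code from the actual SwiftRiver platform.

```python
# Hypothetical sketch of SiCDS-style duplicate filtering.
# Not actual SwiftRiver code; names and rules are illustrative.
import hashlib
import re


def fingerprint(text: str) -> str:
    """Collapse case, punctuation, and whitespace, then hash the result."""
    normalized = re.sub(r"[^a-z0-9 ]", "", text.lower())
    normalized = " ".join(normalized.split())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()


def filter_duplicates(messages):
    """Keep the first occurrence of each message; drop near-exact repeats."""
    seen = set()
    unique = []
    for msg in messages:
        fp = fingerprint(msg)
        if fp not in seen:
            seen.add(fp)
            unique.append(msg)
    return unique


reports = [
    "Bridge down on Main St!",
    "bridge down on main st",   # same report, different formatting
    "Water needed at camp 4",
]
print(filter_duplicates(reports))
```

With the sample reports above, the reformatted repeat of the bridge message is dropped and only two distinct reports survive. Real deduplication of re-tweets would need fuzzier matching (e.g. stripping "RT @user" prefixes), but the fingerprint-and-set pattern is the core idea.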
All these enable an organization facing the challenge of too much data to, among other things, pick out what’s important, save time, suppress noise, and filter & curate the information. This matters even more in times of urgency or crisis.
Swift River can also be used by newsrooms to manage & curate very large volumes of information in a crisis, for online brand monitoring, or for election monitoring. It runs on a free and open source platform. It’s still in development, with more features & improvements being added to the beta (now at Version 0.2.1 Batuque) over the rest of the year by the development team, who are based in Uganda.