[okfn-labs] Frictionless Data Vision and Roadmap

Enric Garcia Torrents enricgarcia at uoc.edu
Wed Jan 22 01:08:12 UTC 2014


Brilliant. Please let me know what needs to be done in terms of outreach and data packaging; time permitting, I will be more than happy to contribute.

Enric

--- Original message from Rufus Pollock to okfn-labs, sent 21.01.2014 15:03

There is now a short Frictionless Data "vision" doc online at:
http://data.okfn.org/vision
It is based on input from various people and comments would be warmly welcome. I've excerpted some of it below for those who prefer info in the mail client.

Regards,

Rufus

## Frictionless Data Ecosystem

There's too much friction in working with data - friction getting data, friction processing data, friction sharing data. This friction stops people doing stuff: it stops them creating, sharing, collaborating on, and using data - especially amongst more distributed communities. It kills the cycles of find, improve, share that would make for a dynamic, productive and attractive (open) data ecosystem.

We need to make an ecosystem that, like open source for software, is useful and attractive to those without any principled interest - the vast majority who simply want the best tool for the job, the easiest route to their goal.

We think that by getting a few key pieces in place we can reduce friction enough to revolutionize how the (open) data ecosystem operates, with massively improved data quality, utilization and sharing. We think this because there's a multiplier here that means relatively small changes can have big effects. This multiplier is network effects: the utility of a particular standard, pattern or even tool depends on how many other people are using it. This means that creating a critical mass of use around the tooling and standards will have a huge effect.

This isn't easy. But after working on these issues for nearly a decade we think the time is right.

## A Metaphor

Today, when you decide to cook, the ingredients are readily available at local supermarkets or even already in your kitchen. You don't need to travel to a farm, collect eggs, mill the corn, cure the bacon etc - as you once would have done! Instead, thanks to standard systems of measurement, packaging, shipping (e.g. containerization) and payment, ingredients can get from the farm direct to your local shop or even your door.

But with data we're still largely stuck at this early stage: every time you want to do an analysis or build an app you have to set off around the internet to dig up data, extract it, clean it and prepare it before you can even get it into your tool and begin your work proper.

What do we need to do for working with data to be like cooking today - where you get to spend your time making the cake (creating insights), not preparing and collecting the ingredients (digging up and cleaning data)?

The answer: radical improvements in the "logistics" of data, through specialisation and standardisation. In analogy with food, we need standard systems of "measurement", packaging and transport so that it's easy to get data from its original source into the application where you can start working with it.

## What We Want To Do

We start with an advantage: unlike for physical goods, transporting digital information from one computer to another is very cheap! This means the focus can be on standardizing and simplifying the process of getting data from one application to another (or one form to another). The following gives an overview of the main areas of work. There is more detail in the Roadmap.
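To give a concrete flavour of what "packaged" data could look like, here is a minimal sketch: a small directory with a datapackage.json descriptor alongside its CSV resources, consumed with nothing but the Python standard library. The directory layout, file names and field names are invented for illustration, and the descriptor only uses the most basic ideas of a Data Package (a name plus a list of resources with paths).

    # Minimal sketch, not a definitive implementation: read a hypothetical
    # Data Package directory using only the Python standard library.
    # Assumed layout (all names made up for illustration):
    #   my-data-package/
    #     datapackage.json   e.g. {"name": "my-data-package",
    #                              "resources": [{"name": "prices",
    #                                             "path": "prices.csv"}]}
    #     prices.csv
    import csv
    import json
    import os


    def load_data_package(package_dir):
        """Return each listed CSV resource as a list of row dicts."""
        with open(os.path.join(package_dir, "datapackage.json")) as f:
            descriptor = json.load(f)

        data = {}
        for resource in descriptor.get("resources", []):
            path = os.path.join(package_dir, resource["path"])
            with open(path, newline="") as f:
                data[resource.get("name", resource["path"])] = list(csv.DictReader(f))
        return data


    if __name__ == "__main__":
        package = load_data_package("my-data-package")
        for name, rows in package.items():
            print(name, len(rows), "rows")

The point of the sketch is simply that once the "packaging" is standard, the consuming code stays the same no matter who published the data or what it describes.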

