[@OKau] "High value" datasets
Rebecca Cameron
rcameron.bis at gmail.com
Tue Apr 14 10:28:15 UTC 2015
Cassie
There are two ways this issue was dealt with when I worked for Qld Gov,
both primarily related to the use of the data. In theory some datasets
could, by academic definition, be high value, but if no one even opens a
dataset it has little value. Value was therefore measured in terms of use.
1. CKAN allows departments to measure page hits and downloads.
Initially this process was performed manually so we could gauge which
datasets added value and which needed maturing or more frequent
updating. From experience, these measures are best read three months after
initial publication, as the initial hits usually occur simply because the
data is newly published. I would hope that in the last six months CKAN's
capability to provide this information has matured; most open data
platforms should be able to provide this data. Qld Gov also publishes
monthly page views for its open data portal; see
https://data.qld.gov.au/dataset/visitor-statistics-data-qld-gov-au
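For anyone wanting to pull these numbers programmatically rather than manually, CKAN exposes dataset metadata through its Action API (`package_show`). A rough sketch below; note that the `tracking_summary` field is only populated when the CKAN instance has page-view tracking enabled, which is an assumption about the portal's configuration:

```python
import json
import urllib.parse
import urllib.request

def ckan_action_url(base_url, action, **params):
    """Build a CKAN Action API URL, e.g. .../api/3/action/package_show?id=..."""
    query = urllib.parse.urlencode(params)
    return f"{base_url.rstrip('/')}/api/3/action/{action}?{query}"

def fetch_usage(base_url, dataset_id):
    """Fetch a dataset's metadata and return its tracking summary.

    Returns None if the portal does not have tracking enabled
    (tracking_summary holds 'total' and 'recent' view counts when it does).
    """
    url = ckan_action_url(base_url, "package_show", id=dataset_id)
    with urllib.request.urlopen(url) as resp:
        result = json.load(resp)["result"]
    return result.get("tracking_summary")
```

For example, `fetch_usage("https://data.qld.gov.au", "some-dataset-id")` would retrieve that dataset's record from the portal's API.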
2. The reverse way of reading the question is: which datasets should
departments publish first because of their high public value? In setting
the list of priority publications for a Qld Gov department, a brief
investigation was undertaken of page hits and downloads of data and
information published on the department's website, coupled with records of
requests for information, both under FOI and from researchers. This allowed
the most "valuable" datasets to be published first. Bear in mind that the
data being published related to social services and was of particular
interest to researchers.
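Combining those demand signals into a ranking could be sketched like this. The weights and dataset names are purely illustrative assumptions, not the weighting Qld Gov used; the point is just that a download or a formal request usually signals stronger demand than a page view:

```python
def priority_score(page_hits, downloads, info_requests,
                   w_hits=1.0, w_downloads=3.0, w_requests=5.0):
    # Weights are illustrative assumptions: downloads and FOI/researcher
    # requests are treated as stronger demand signals than page views.
    return (w_hits * page_hits
            + w_downloads * downloads
            + w_requests * info_requests)

# Hypothetical candidate datasets with (hits, downloads, requests) counts
candidates = {
    "service-locations": priority_score(1200, 300, 4),
    "annual-statistics": priority_score(5000, 150, 1),
}
ranked = sorted(candidates, key=candidates.get, reverse=True)
# ranked lists the highest-scoring dataset first
```

In practice the weights would be tuned to the department's own sense of what each signal is worth.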
In respect of the 5-star deployment scheme for Open Data, all of the
department's data was assessed against this schema and amended to meet the
star rating. There is an OD form which accompanies this rating, but Qld Gov
wasn't ready to complete these.
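For readers unfamiliar with the scheme, this is Tim Berners-Lee's 5-star open data model, and a first-pass assessment can be approximated from file formats alone. The format-to-star mapping below is an illustrative simplification (a real assessment also considers licensing and linking, not just format):

```python
# Tim Berners-Lee's 5-star open data deployment scheme, summarised:
FIVE_STAR = {
    1: "on the web in any format, under an open licence (e.g. PDF)",
    2: "machine-readable structured data (e.g. Excel)",
    3: "non-proprietary open format (e.g. CSV)",
    4: "uses URIs to identify things (e.g. RDF)",
    5: "linked to other data for context (Linked Open Data)",
}

def star_for_format(fmt):
    """Rough first-pass rating from file format alone (an approximation:
    licensing and actual linking are ignored here)."""
    mapping = {"pdf": 1, "doc": 1, "xls": 2, "xlsx": 2,
               "csv": 3, "json": 3, "rdf": 4}
    return mapping.get(fmt.lower(), 1)
```

A quick sweep of a department's published formats with something like this gives a baseline before the more careful manual assessment.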
I hope this helps.
Regards
Rebecca
On Tue, Apr 14, 2015 at 4:30 PM, Cassie Findlay <findlay.cassie at gmail.com>
wrote:
> Hi all
>
> Has anyone come across good criteria or defined methods for identifying
> 'high value' datasets, if, for example, you are looking at a
> whole-of-government jurisdiction? I found some in this EU report
> <http://ec.europa.eu/isa/documents/publications/report-on-high-value-datasets-from-eu-institutions_en.pdf>
> but would like to gather some more.
>
> I realise that value is a highly subjective thing to assert (valuable for
> whom, why?) and really like Rosie's work on defining the problems first, in
> order to then work out where you might find datasets of value, but all that
> aside :) - are there examples out there of work to define high value stuff?
>
> Many thanks
>
> Cassie Findlay
>
>
> _______________________________________________
> okfn-au mailing list
> okfn-au at lists.okfn.org
> https://lists.okfn.org/mailman/listinfo/okfn-au
> Unsubscribe: https://lists.okfn.org/mailman/options/okfn-au
>
>