[open-science] open-science Digest, Vol 963, Issue 1

Michaël Bon (SJS) michael at sjscience.org
Mon Mar 13 14:25:49 UTC 2017


Dear all, 

I would like to invite you to take some time to consider and discuss a
new idea which I believe is the long-missing ingredient for achieving
top-quality, open science: a new research assessment process that
rewards quality (rather than impact), reproducibility, openness and
collaboration. 

I present this idea and put it into practice in the article: 

"Novel processes and metrics for a scientific evaluation rooted in the
principles of science [1]" 

which is hosted on the pre-print, curation and open peer-review platform
"The Self-Journals of Science (SJS) [2]". I would like to invite all
scholars on this list to take part in the debate. On SJS, peer review
takes the form of a public and open debate between reviewers and
authors, whose goal is not a binary decision to accept or reject the
article, but to evolve it dynamically until the community reaches a
consensus (positive or negative). SJS also introduces a new way to
assess the validity and importance of any article, which I explain in
the paper. This new evaluation logic can gradually reverse the
privatization and fragmentation of science that is implicit in the use
of impact factors and the associated publish-or-perish culture.
Moreover, it is technically, legally and immediately feasible in the
current environment. It is an answer to what is generally referred to
as "the lack of incentives" for peer review, green open access and open
science. I would welcome your input on this concrete proposal
concerning the fundamental question of research assessment. 

If you read the article and want to participate, just 

 	* authenticate on the platform with your institutional email address.
 	* freely (but non-anonymously!) embed your reviews in the article by
clicking the "+" block next to any paragraph you want to discuss. You
will have access to two forums, "Critiques" and "Comments", whose names
are self-explanatory. Your reviews will have the same visibility as the
article itself. I (and my co-authors) will be grateful for all relevant
critiques. You can also up-vote existing critiques.
 	* when your mind is made up, please indicate at the top of the page
whether you think the article has reached scientific standards or still
needs revisions, so that future readers can follow the article as it
evolves.

I am here if anything looks too unusual! 

Michaël Bon [3] 

[1] http://www.sjscience.org/article?id=580
[2] http://www.sjscience.org
[3] http://www.sjscience.org/memberPage?uId=1