Why the double standard on genomic data release policies?

David Dooling has a great post that starts with the conference blogging issue and then leaps off in a different but related direction: the curious double standard between the data release policies applied to large genome sequencing centres and those applied to other genomic researchers.

As David notes, the advent of second-generation sequencing technologies means that medium-sized genomics labs can now generate more sequence in a month than the entire Human Genome Project was able to generate in a year; indeed, a single Illumina Genome Analyzer can now produce an entire high-quality human genome sequence in a couple of weeks. And yet while large genome sequencing centres are obliged to release their data publicly as soon as possible (following the Bermuda Principles), smaller research labs are under no such obligation.

David argues both that large genome centres need more provision for time to perform proper quality control, and that the same data release policies need to be applied to sequence-generating facilities across the board. Here's his final paragraph:

The human reference has been published (with a recent update to GRCh37). The blueprint exists. Thus, many of the reasons underlying the conclusions of the Bermuda Principles are no longer applicable. So should those open access principles be applied more widely to other areas of biology and science at large or should they no longer apply to sequence data from a genome for which a reference exists? It is time to rethink the current policies and begin to apply them to all sequence generators. The double standard must end.

These are important issues; I've closed the comments here to encourage readers to add their views to David's post.
