Dead Links And Unmaintained/Broken Web-Services
4
13.6 years ago
Joachim ★ 2.9k

Hi!

Is there a central portal or service that keeps track of the growing number of dead links in the scientific literature? Is anyone keeping records on the vitality of bioinformatics projects?

I am aware of "Broken Links: The Ephemeral Nature of Educational WWW Hyperlinks" (http://www.springerlink.com/content/0fupwj0tv8am3r1f/), but are there other studies on this topic? Is there perhaps even a tool or service that provides an ongoing analysis of the status of a bioinformatics project?

Chris once mentioned http://www.ohloh.net/, but that seems to require a lot of manual interaction. For example, I looked up 'BioMart' again and it says there is 'No recent development activity', which is plainly wrong, of course. The 'Taverna' information, on the other hand, is very accurate.

Thanks.

literature project • 3.3k views
3
13.6 years ago

@rdmpage (Roderic Page) wrote a tool to check the "Status of biodiversity services". It is available at http://bioguid.info/status/

Quoting his blog:

The idea is to poll each service once an hour to see if it is online. Eventually I hope to draw some graphs for each service, to get some idea of how reliable it is.

Edit: BioCatalogue also monitors the activity of registered services: http://www.biocatalogue.org/latest#monitoring
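The "poll each service once an hour" idea quoted above can be sketched in a few lines of Python. This is only an illustration of the approach, not the code behind any of the tools mentioned; the service URLs are placeholders taken from this thread:

```python
import urllib.request
import urllib.error

# Placeholder list of services to watch; substitute real endpoints.
SERVICES = [
    "http://bioguid.info/status/",
    "http://www.biocatalogue.org/",
]

def is_online(url, timeout=10):
    """True if the URL answers a HEAD request with an HTTP status below 400."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, OSError):
        return False

def uptime(history):
    """Fraction of successful checks in a history of booleans."""
    return sum(history) / len(history) if history else 0.0
```

Calling `is_online` for each service from an hourly cron job and appending the result to a per-service history would give exactly the per-service reliability figure one could then graph.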

0

http://bioguid.info/status/ is very slow right now

0

I will check the bioguid.info link again later. So far I had no luck getting through...

3
13.6 years ago
Michael 55k

This problem is, imho, inherent to online resources, and it can be more complicated than just 404 errors. A resource can not only disappear; its content can also change, possibly without the user noticing, and then how do you verify that the content is 'correct'? A solution would be constant monitoring of resources, but it would be impossible to supply automatic monitoring for all 'scientific' resources, or even only for those used in papers (e.g. supplementary data). Further, there is the problem of sanctions if a resource disappears or becomes unmaintained, e.g. because funding runs out or the post-doc moved to another place.

So, I think the solution to the problem in general exists on a higher political level, by securing funding to valuable databases and resources, such that the providers can maintain them and upgrade and update them over the years.

In addition to Pierre's and Casey's answers, I would like to add that BioCatalogue has built-in automatic monitoring of the registered SOAP and REST web services. In this highly automated context of web services, monitoring seems feasible. However, simply 'pinging' or checking that the WSDL is there is not enough: the functionality of such services should be tested automatically using a unit-test-like procedure. I am not sure whether BioCatalogue already allows supplying test cases for services.
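One way to go beyond a plain ping, along the lines suggested above, is to pair the HTTP status with a content fingerprint so that silent content changes are flagged as well. This is a sketch of the idea only, not a feature BioCatalogue is known to provide:

```python
import hashlib

def content_fingerprint(body):
    """SHA-256 hex digest of a resource body, used to detect silent changes."""
    return hashlib.sha256(body).hexdigest()

def check_resource(body, status, expected_fingerprint=None):
    """Unit-test-like verdict for a fetched resource.

    Returns 'dead' for an error status, 'changed' if the body no longer
    matches a previously recorded fingerprint, and 'ok' otherwise.
    """
    if status >= 400:
        return "dead"
    if expected_fingerprint and content_fingerprint(body) != expected_fingerprint:
        return "changed"
    return "ok"
```

Recording a fingerprint when a resource is first cited, then re-checking it periodically, distinguishes the "quietly changed" case from plain link rot, which a 404 check alone cannot do.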

2
13.6 years ago
Woa ★ 2.9k

Well, it's not a repository of broken links, but if one is lucky and the site has been archived, it can be recovered using this tool.
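Assuming the tool meant here is the Internet Archive's Wayback Machine, its public availability API can be asked whether an archived copy of a dead link exists. A sketch, with the endpoint and JSON field names as publicly documented (worth double-checking before relying on them):

```python
import json
import urllib.parse
import urllib.request

# Internet Archive Wayback Machine availability endpoint.
API = "https://archive.org/wayback/available"

def availability_url(url):
    """Build the availability query URL for a given (possibly dead) link."""
    return API + "?" + urllib.parse.urlencode({"url": url})

def latest_snapshot(url, timeout=10):
    """Return the closest archived snapshot URL, or None if none exists."""
    with urllib.request.urlopen(availability_url(url), timeout=timeout) as resp:
        data = json.load(resp)
    snap = data.get("archived_snapshots", {}).get("closest")
    return snap["url"] if snap and snap.get("available") else None
```

Running `latest_snapshot` over the URLs cited in a paper would be a cheap first pass at recovering its dead links.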

