I have a ranked list of genes for my organism and experiment of interest. I have many such experiments, however.
Is there a way to easily automate the use of the GOrilla and/or REViGO tools to generate analysis results? In particular, if I could run the tools locally, that would help me avoid putting unnecessary load on their servers.
Is this commonly done, or do people populate "faked-out", reverse-engineered web forms with wget or curl and retrieve the results that way?
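To make the question concrete: I assume the "faked-out form" approach amounts to replaying the same POST the web form makes. A rough sketch with curl is below; the endpoint URL and field names are my guesses, not GOrilla's documented parameters, so they would need to be read out of the actual form with the browser's developer tools.

```bash
#!/usr/bin/env bash
# Rough sketch of submitting a reverse-engineered web form with curl.
# ASSUMPTIONS: the endpoint URL and the field names below are illustrative
# guesses, not GOrilla's documented parameters -- read the real ones out of
# the live form with the browser's developer tools.
curl -s -L \
     -F "species=HOMO_SAPIENS" \
     -F "run_mode=mhg" \
     -F "target_set=<ranked_genes.txt" \
     -o response.html \
     "http://cbl-gorilla.cs.technion.ac.il/servlet/GOrilla"
# The returned page (or the redirect URL) would then be scraped for the
# run/result ID used to fetch the output files.
```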
Thanks, this script worked very well.
After retrieving the ${id} value, curl or wget can be used to retrieve these eight files into their own sub-folder. These results can be loaded in a local web browser by opening GOResults.html, and they can be kept indefinitely.
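For reference, here is a rough sketch of that retrieval step. The base URL pattern is an assumption on my part, and only GOResults.html is named explicitly here, so the file list needs to be extended with the other seven names once you have them:

```bash
#!/usr/bin/env bash
# Sketch of fetching a finished run's result files into their own sub-folder.
# ASSUMPTION: results are served under <server>/GOrilla/<id>/ -- adjust the
# base URL to whatever the response actually points at, and extend FILES
# with the remaining seven filenames.
set -euo pipefail

id="$1"                                 # run ID scraped from the GOrilla response
base="http://cbl-gorilla.cs.technion.ac.il/GOrilla/${id}"
outdir="GOrilla-${id}"
FILES=(GOResults.html)                  # plus the other result files

mkdir -p "${outdir}"
for f in "${FILES[@]}"; do
    curl -s -o "${outdir}/${f}" "${base}/${f}"
done

echo "Open ${outdir}/GOResults.html in a local web browser"
```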
Here are my modifications to this script, which put the GO analysis results in the folder specified with --outputDir:

Thanks for the script! Saved me a lot of manual work. Here are a few modifications to a couple of lines which I believe are not correct. I am not a Perl guy, but it seems to be working better after the changes.
Line 80 should be
instead of
Lines 110-113 should be
instead of
This does not capture the Excel output. How can this be modified so the 3 Excel files can be saved?
Perhaps I can help modify this to add a few more GETs, assuming I understand your question correctly. What are their filenames? I just haven't looked at this in some years, so filenames would be useful.
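In the meantime, here is roughly what the extra GETs would look like, reusing the base and outdir variables from the retrieval sketch above; the three Excel filenames are placeholders until the real ones are confirmed:

```bash
# Hypothetical extension of the retrieval sketch above: add the Excel exports
# to the file list. PROCESS.xls / FUNCTION.xls / COMPONENT.xls are placeholder
# names, not confirmed filenames -- swap in the real ones.
FILES=(GOResults.html PROCESS.xls FUNCTION.xls COMPONENT.xls)
for f in "${FILES[@]}"; do
    curl -s -o "${outdir}/${f}" "${base}/${f}"
done
```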
The link to the Gist is broken!