Integrating Galaxy API Tool and Workflows
1
1
10.7 years ago

I have been trying to set up an HLA pipeline for Roche 454 amplicon data. I have gone through the paper http://www.biomedcentral.com/1471-2164/13/378# and found that the authors provide a supplemental file (File 2) which includes a Galaxy API tool and some workflow execution scripts. I have been trying to use them for my data but have no idea how to deploy them on a local Galaxy server. I have integrated a few scripts before, but none of this kind. If anyone could look at the files and give me an idea how to integrate them, that would be very much appreciated.

If anyone has done HLA typing, any suggestions on a data analysis pipeline are welcome. Thanks in advance.

galaxy pipeline • 2.8k views
0

Please accept the answer if it helped you. It might help others too.

0
10.7 years ago
martenson ▴ 380

Hi,

The included Galaxy API tool has the following description: "This tool writes the input file to the output file", and that is exactly what it does.

The file Galaxy-Workflow-API_HLA_BLAT_for_combined_Class_I.ga is just an exported definition of a Galaxy workflow that can be imported into any Galaxy instance via the menu Workflows / Upload or Import.
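
If you prefer the API route, a workflow definition like this can also be imported programmatically. Below is a minimal sketch using BioBlend (which the paper's scripts do not use); the URL, API key and filename are placeholders for your own values:

# Minimal sketch: import an exported .ga workflow definition with BioBlend.
# URL, key and filename are placeholders, not values from the supplemental file.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")

# Galaxy assigns the imported workflow a new id; keep it for later invocations.
wf = gi.workflows.import_workflow_from_local_path(
    "Galaxy-Workflow-API_HLA_BLAT_for_combined_Class_I.ga"
)
print(wf["id"], wf["name"])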

The folder "workflow scripts" contains a Perl script that wraps two Python scripts. One of them monitors a folder and, once data arrives in that folder, uploads it to a data library in Galaxy. After the upload, the other one executes the given workflow (the one above) on that data.
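
For orientation, here is a rough sketch of that same watch-and-run loop written directly against BioBlend rather than the scripts/api helpers the paper uses. The library name, workflow id, input directory and file pattern are placeholders, and the workflow is assumed to have a single input at step 0:

# Hypothetical watch-folder loop: upload new files to a data library and
# invoke a workflow on each of them. All names below are placeholders.
import time
from pathlib import Path
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="http://localhost:8080", key="YOUR_API_KEY")
library_id = gi.libraries.get_libraries(name="hawaii")[0]["id"]
workflow_id = "YOUR_WORKFLOW_ID"
input_dir = Path("input_files")
seen = set()

while True:
    for path in input_dir.glob("*.fasta"):   # adjust the pattern to your data
        if path in seen:
            continue
        seen.add(path)
        # Upload the new file to the data library ...
        dataset = gi.libraries.upload_file_from_local_path(library_id, str(path))[0]
        # ... and run the workflow on it; 'ld' marks a library dataset,
        # "0" is the (assumed) single input step of the workflow.
        gi.workflows.invoke_workflow(
            workflow_id,
            inputs={"0": {"id": dataset["id"], "src": "ld"}},
            history_name=f"HLA run for {path.name}",
        )
    time.sleep(30)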

If you want to use it as a whole you have to at least change the initialization section of the Perl script, since the values are hardcoded:

# initialize variables
my $apiKey = '9c0e3f98216864f53162ceb41f571470';   # your Galaxy API key (keep it secret)
my $url = 'http://localhost:8080/api/';            # API endpoint of your Galaxy instance
my $dataLibName = 'hawaii';                        # data library the files are uploaded to
my $workingDir = '/Users/oconnorlab/Desktop/simon_galaxy_api/';
my $galaxyScriptDir = '/Volumes/storagepool/oconnorlab/galaxy_dist/scripts/api/';
my $libraryScript = $galaxyScriptDir . "simon_watch_folder.py";
my $workflowScript = $galaxyScriptDir . "workflow_execute.py";
my $inputDir = $workingDir . 'input_files/';
my $workflow = '55504e7a2466a2e3';                 # id of the imported workflow

# split the reads file into multiple input files based on column 2
my $readsFile = $workingDir . 'runs/jr274_276.tsv';

Change most of these values and then import the workflow file manually through the UI into your Galaxy instance (or a public one).

A note for the future: never include your API key in anything you submit or publish. Treat it like a password.
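
One simple way to keep the key out of scripts entirely is to read it from an environment variable at run time. A minimal Python sketch, assuming you export a variable named GALAXY_API_KEY yourself (the name is my choice, not a Galaxy convention):

# Read the Galaxy API key from the environment instead of hardcoding it.
import os

api_key = os.environ.get("GALAXY_API_KEY")
if not api_key:
    raise SystemExit("Set the GALAXY_API_KEY environment variable first.")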

0

Thanks for the reply. But where should I keep the scripts?

0

You can keep them anywhere, but you have to modify the paths (in the Perl script) because they are hardcoded (another thing to avoid). I modified the answer to make that clear.

