Snakemake rule to copy files
DBScan ▴ 440 · 7 weeks ago

As part of a pipeline, I am trying to create a rule that takes a file containing paths to gVCF files and copies them to a compute node. I would like to have all of the copied files defined in the output of the rule, so that they get deleted when I mark them with temp(). Currently I am doing the following:

rule CopyToNode:
    input:
        gVCF_list
    output:
        gVCFs_copy_node
    threads:
        2
    shell:
        """
        cat {input} | xargs -n1 -P{threads} -I% rsync -avh % {SCRATCH_DIR}
        awk -F/ '{{print $NF}}' {input} | awk '{{printf "{SCRATCH_DIR}%s\\n", $0}}' > {output}
        """

This, however, doesn't actually put the copied files in the rule's output; the output is just a text file listing their paths.
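In other words, what I would ideally like is something along the following lines (a rough sketch only; copied_gvcfs is just a name I am using here for illustration), so that the copied files themselves can be flagged with temp():

import os

# Sketch: build the expected on-node paths at parse time from the gVCF list,
# so every copied file can be declared in the output and flagged with temp().
# Assumes gVCF_list already exists when the Snakefile is parsed and that
# SCRATCH_DIR ends with "/" (same assumption as the awk line above).
with open(gVCF_list) as fh:
    copied_gvcfs = [SCRATCH_DIR + os.path.basename(line.strip()) for line in fh if line.strip()]

rule CopyToNode:
    input:
        gVCF_list
    output:
        temp(copied_gvcfs)
    threads:
        2
    shell:
        """
        cat {input} | xargs -n1 -P{threads} -I% rsync -avh % {SCRATCH_DIR}
        """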

If I understand correctly, you wish to copy files onto a node and flag the copied files as temp? I'm not sure you can do this; the desired output files would be remote in that case. I don't think Snakemake supports this.

DBScan ▴ 440 · 7 weeks ago

Turns out there is a solution to this problem with Snakemake 8.

  1. Install the newest version of Snakemake (8.16.0 at the time of writing)
  2. Install snakemake-storage-plugin-fs
  3. Then you can define remote and local storage prefixes like this:

     snakemake --cores 16 \
         --default-storage-provider fs \
         --shared-fs-usage persistence software-deployment sources source-cache \
         --local-storage-prefix /scratch/ \
         --remote-job-local-storage-prefix /scratch/

I still need to figure out what all of these options mean, but it seems to work.
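In case it helps anyone: the same options should also be usable from a profile, assuming the long-form command-line flags map one-to-one onto profile keys as they normally do in Snakemake 8. An untested sketch of a profile config.yaml:

# Hypothetical profile config.yaml; keys mirror the command-line flags above.
cores: 16
default-storage-provider: fs
shared-fs-usage:
  - persistence
  - software-deployment
  - sources
  - source-cache
local-storage-prefix: /scratch/
remote-job-local-storage-prefix: /scratch/

Then running snakemake --profile <profile-dir> should be equivalent to the command above.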

7 weeks ago

I'm not sure what you're trying to do; maybe you want:

awk -F/ '{{print $NF}}' {input} | while read F; do cp -v "$F" {output} ; done