Hello, I need your help please.
I want to create a docker container to analyse my reads and run it via a Nextflow or WDL workflow. I have a Python script, and I would like to pass my reads as arguments, something like: "myScript.py args". I don't know how to run a docker container while passing files from the host to the container.
I don't want to copy my files into the container; I only want to pass my different reads to the container dynamically and run my script on them:
import sys

reads = sys.argv[1:]
for read in reads:
    analyse(read)
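For context, here is a fuller, runnable sketch of the script; the analyse function here is just a line-counting placeholder standing in for the real analysis:

```python
import sys
from pathlib import Path


def analyse(read_path):
    # Placeholder analysis: count the lines in the reads file.
    with open(read_path) as fh:
        return sum(1 for _ in fh)


def main(argv):
    # Each argument is a path that must be visible inside the
    # container (e.g. via a bind mount from the host).
    for arg in argv:
        path = Path(arg)
        print(f"{path.name}: {analyse(path)} lines")


if __name__ == "__main__":
    main(sys.argv[1:])
```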
My Dockerfile:
FROM python:3.8.3
WORKDIR /application
COPY src/ .
RUN useradd appuser && chown -R appuser /application
USER appuser
ENTRYPOINT ["python", "app.py"]
Thank you!
Can't you just do this? You may need to change the bind mount source from /data to wherever your data actually is.
Caveat: I am using Singularity, not Docker, so I couldn't test the syntax, but it should be quite close.
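An untested sketch of what I mean (translating from the Singularity syntax I use; the /data path, image name my_container, and file names are assumptions):

```shell
# Bind-mount the host reads directory into the container read-only,
# then pass the in-container paths as arguments to the ENTRYPOINT script.
docker run --rm \
    --mount type=bind,source=/data,target=/data,readonly \
    my_container /data/reads_1.fastq /data/reads_2.fastq
```

Because your ENTRYPOINT is ["python", "app.py"], everything after the image name is appended as arguments to the script.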
If you're using Nextflow, Docker containers mount the /tmp directory by default. There are some options here, but typically everything in the process working directory is mounted to /tmp. Can you add an input path parameter to your script? Something like python app.py -I /tmp? I usually use bash-based docker containers, so I'm unfamiliar with any specific problems or how paths work in Python containers. But in testing, so long as
docker run -v /path/to/reads:/tmp my_container
works, it should be fine in Nextflow.
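For reference, a Nextflow process along these lines might look like the following; the process name, image name, and script path are made up, and Nextflow stages the input reads into the process working directory for you:

```groovy
process analyse_reads {
    container 'my_container'

    input:
    path reads

    script:
    """
    python /application/app.py ${reads}
    """
}
```

With this, Nextflow handles the bind mounts itself when docker.enabled is set, so you don't need to write the docker run command by hand.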