Hi, I am generating sets of data up to 100x human WGS. Our lab is buying a server so I was wondering the normal time and memory this would require with say 10-15 threads using either bwa/bowtie/minimap. Thanks
Have you considered generating synthetic reads from the human reference genome with wgsim or art_illumina to answer this question?
You can use these tools to simulate reads at the desired depth, run the data through each aligner, and record the time to completion (and peak memory) with the /bin/time command (see its Linux manual page); a rough sketch is below.
Of course this depends on having access to the necessary hardware requirements (e.g., disk space and RAM).
I hope this can help you get a rough idea about the run time!
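For example, here is a minimal, untested sketch of what that benchmark could look like with wgsim and bwa mem, timed with GNU time. The file names, simulator parameters, and thread count are placeholders I'm assuming for illustration, not something from the thread; the same `/usr/bin/time -v` wrapper works just as well around bowtie2 or minimap2.

```bash
# Hypothetical benchmark sketch: simulate ~100x paired-end reads from a reference
# and time one aligner. Paths, read lengths, and thread count are assumptions.

REF=GRCh38.fa          # reference FASTA (assumed path)
THREADS=12             # within the 10-15 thread range mentioned in the question

# ~100x over a ~3.1 Gb genome with 2x150 bp reads is roughly 1e9 read pairs.
N_PAIRS=1000000000

# Option 1: wgsim (simple simulator, fixed error rate)
wgsim -N "$N_PAIRS" -1 150 -2 150 "$REF" sim_1.fq sim_2.fq

# Option 2: art_illumina (empirical error profiles; -f sets fold coverage directly)
# art_illumina -ss HS25 -i "$REF" -p -l 150 -f 100 -m 400 -s 50 -o sim_art

# Index once, then align while recording wall-clock time and peak memory.
bwa index "$REF"
/usr/bin/time -v bwa mem -t "$THREADS" "$REF" sim_1.fq sim_2.fq > aln.sam
# GNU time's -v output includes "Elapsed (wall clock) time" and
# "Maximum resident set size" (peak RAM of the aligner process).
```

As a rough expectation, you could also benchmark at a lower coverage first (say 10x) and extrapolate: alignment time scales roughly linearly with the number of reads, while peak memory is dominated by the reference index and should stay fairly flat as coverage grows.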
Hi, please convert the following code into R code, or help me if anyone knows how to perform bagging with a support vector machine classifier in R.
What does this have to do with the original question? Please create a new thread to ask a new question.