Dear all,
I have a fastq.gz file with more than 100 million reads. My aim is to split it into three separate fastq files so that every read from the original file ends up in exactly one of the new files, with no reads duplicated or lost. I tried fastqsplitter as shown below, but I get the following error:
fastqsplitter -i 10_S10_R1_001.fastq.gz -o group4_sample10_R1.fq.gz -o group5_sample10_R1.fq.gz -o group6_sample10_R1.fq.gz
The error is:
OSError: b'igzip: Error while decompressing extra concatenated gzip files on 10_S10_R1_001.fastq.gz\n' (exit code 1)
Do you know how I can solve this problem, or is there another tool that can do this?
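In case it helps to clarify what I am after: below is a minimal, untested Python sketch of the kind of round-robin split I have in mind, using only the standard-library gzip module (the file names are just the ones from my command above, and this is purely to illustrate the intended behaviour, not something I expect to be fast on 100 million reads).

import gzip
import itertools

infile = "10_S10_R1_001.fastq.gz"
outnames = ["group4_sample10_R1.fq.gz",
            "group5_sample10_R1.fq.gz",
            "group6_sample10_R1.fq.gz"]

# Open the three output files and cycle over them, one FASTQ record at a time.
outs = [gzip.open(name, "wt") for name in outnames]
cycler = itertools.cycle(outs)
with gzip.open(infile, "rt") as fh:
    while True:
        record = [fh.readline() for _ in range(4)]  # one FASTQ record = 4 lines
        if not record[0]:
            break  # end of input
        next(cycler).writelines(record)
for out in outs:
    out.close()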
Kind regards,
Cross-posted at https://github.com/LUMC/fastqsplitter/issues/18