Please look at my raw and pre-processed reads. Pre-processing was done with bbduk. Parameters were as follows: qtrim=r trimq=20
Why is it trimming so severely for min Q 20? Can I change something?
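For context, the command was along these lines (a sketch rather than the exact invocation; the paired-end file names are placeholders):

    # Hypothetical invocation matching the stated parameters; file names are placeholders
    bbduk.sh in=raw_R1.fastq.gz in2=raw_R2.fastq.gz \
        out=trimmed_R1.fastq.gz out2=trimmed_R2.fastq.gz \
        qtrim=r trimq=20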
Why is it trimming so severely for min Q 20?

bbduk.sh is doing what you asked it to do, i.e., trimming any data to the right once it encounters a region with average quality below 20. How many reads/bases were trimmed from the original?

If you are going to align to a reference, there is no pressing need to filter based on quality. If you must, use something like trimq=10; trimq=20 and above should only be needed if you were going to de novo assemble the data.
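If you do want some light quality trimming before alignment, something along these lines would be much gentler (again a sketch; file names are placeholders, and the optional minlen filter simply drops reads that end up too short after trimming):

    # Gentler right-end quality trimming for reads destined for alignment;
    # file names are placeholders, minlen=50 is an optional length filter
    bbduk.sh in=raw_R1.fastq.gz in2=raw_R2.fastq.gz \
        out=trimmed_R1.fastq.gz out2=trimmed_R2.fastq.gz \
        qtrim=r trimq=10 minlen=50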
Links to the FastQC reports for the raw and trimmed reads:
https://ibb.co/kTMAHm
https://ibb.co/jwcsxm