Excessive data loss after trimming
4.9 years ago by jomagrax ▴ 40

Hi everyone!

I am trimming the adapters from my data. For one of the controls, cutadapt keeps only 5.2% of the reads after trimming and removing reads shorter than 17 bp:

$ cutadapt -a GATCGGAAGAGCACACGTCTGAACTCCAGTCACATCACGATCTCGTATGCCGTCTTCTGCTTG -j 4  -m 17 -o control_2_trimmed.fastq control_2.fastq

=== Summary ===

Total reads processed:               5,252,641
Reads with adapters:                 5,169,644 (98.4%)
Reads that were too short:           4,981,734 (94.8%)
Reads written (passing filters):       270,907 (5.2%)

Total basepairs processed:   262,632,050 bp
Total written (filtered):      8,801,796 bp (3.4%)

For the other samples, around 40% of the reads are retained. Is the problem somehow my fault, or could it come from the data?

Thank you in advance, Jose

genome RNA-Seq

Looks to me like the library prep was screwed up and you got primer dimer. Can you get hold of the library QC (pre-sequencing)? Maybe other people with more experience can comment.
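
If you want to look at what is actually being thrown away, cutadapt can write the too-short reads to their own file so you can inspect them. A minimal sketch based on your original command (the too-short output file name is just an example):

# Keep the reads that fail the -m 17 filter in a separate file instead of discarding them
$ cutadapt -a GATCGGAAGAGCACACGTCTGAACTCCAGTCACATCACGATCTCGTATGCCGTCTTCTGCTTG -j 4 -m 17 --too-short-output control_2_tooshort.fastq -o control_2_trimmed.fastq control_2.fastq

If most of those reads trim down to next to nothing, that points to adapter/primer dimer rather than short real inserts.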

Can you try removing the minimum length filter -m 17? It won't solve the problem by itself, but you would at least know the number of reads in this file without any length filter. As Asaf pointed out, try reaching out to the sequencing core. @ jomagrax
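
Roughly something like this, as a sketch (the no-filter output name is made up, and the awk line assumes a standard 4-line FASTQ):

# Trim without any minimum length filter
$ cutadapt -a GATCGGAAGAGCACACGTCTGAACTCCAGTCACATCACGATCTCGTATGCCGTCTTCTGCTTG -j 4 -o control_2_trimmed_nofilter.fastq control_2.fastq
# Post-trim read length distribution (count of reads per length, shortest first)
$ awk 'NR % 4 == 2 {print length($0)}' control_2_trimmed_nofilter.fastq | sort -n | uniq -c | head -40

A large spike at very short lengths would support the primer-dimer explanation.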
