Is it normal for RCorrector to remove millions of reads?
3.3 years ago
nina.maryn ▴ 30

I'm trying to build De Novo transcriptomes for unsequenced plants to do sequence analysis.

I'm trying to choose a tool for my first pass of quality filtering after running FastQC on my raw reads. I've tried AfterQC and Rcorrector. It looks like AfterQC actually made my read quality worse, at least at the ends of reads. Rcorrector's output looks great, but I went from 30 million reads down to 3.5 million. Far fewer reads were tossed out with AfterQC, but my understanding is that that tool is better suited to genome sequencing. Any advice or thoughts?

Tags: filtering, read, RCorrector, Trinity, RNAseq

What did your initial FastQC plots look like? And did you pool the reads you fed to Rcorrector, by any chance?

3.3 years ago
ponganta ▴ 590

No, this is not normal: Rcorrector should not remove any reads (unless you turned on -trim). All it does is correct reads and flag the uncorrectable ones. Do you have paired-end (PE) or single-end (SE) reads? Did you do any filtering before error correction? I'd take a look at the approach taken by the Oyster River Protocol; I found it to work quite well.
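To illustrate the flagging behavior: Rcorrector tags reads it cannot correct by adding "unfixable_error" to the FASTQ header rather than deleting them, and protocols like the Oyster River Protocol then drop those tagged reads in a separate step. Below is a minimal, hypothetical Python sketch of that post-correction filter (the parser and record names are illustrative, not Rcorrector's own code):

```python
def read_fastq(handle):
    """Minimal FASTQ parser: yield 4-line records as tuples
    (header, sequence, plus line, quality)."""
    while True:
        lines = [handle.readline().rstrip("\n") for _ in range(4)]
        if not lines[0]:
            return
        yield tuple(lines)


def filter_unfixable(records):
    """Keep only records whose header lacks the 'unfixable_error'
    tag that Rcorrector appends to uncorrectable reads."""
    for rec in records:
        if "unfixable_error" not in rec[0]:
            yield rec


if __name__ == "__main__":
    import io

    # Toy input: one corrected read, one flagged as uncorrectable.
    fq = io.StringIO(
        "@read1 cor\nACGT\n+\nIIII\n"
        "@read2 unfixable_error\nACGT\n+\nIIII\n"
    )
    kept = list(filter_unfixable(read_fastq(fq)))
    print(len(kept))  # only the corrected read survives
```

Note that this only removes reads Rcorrector explicitly flagged; losing ~90% of a library, as described above, points at something else (e.g. input problems or an extra trimming/filtering step).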

For trimming, I would rely on TrimGalore rather than Rcorrector.

