Hi,
I am using 454 data for OTU analysis in mothur, and I am confused by the output after converting my SFF file to FASTA. Sequencing information: platform 454 FLX, flow pattern TACG, barcode (AAAAAAAC) removed by the sequencing center, primer GAGTTTGATCNTGGCTCAG.
However, I have trouble understanding the sequence section from the 5th to the 12th base; the primer starts at base 13. Below is the FASTA output from different toolkits.
sff_extract (from seq_crumbs toolkit) with clipping:
GAGTTTGATCCTGGCTCAGATTGAACGCTGG....
sff_extract (from seq_crumbs toolkit) without clipping:
tcagagagcgaaGAGTTTGATCCTGGCTCAGATTGAACGCTGG...
mothur output after denoise:
AGAGCGAAGAGTTTGATCCTGGCTCAGATTGAACGCTGG...
Can anyone help me understand the agagcgaa part? Based on the information from the sequencing center, it is not part of the barcode. How should I deal with it? For example, is there a way to remove this region in mothur? Thank you!
Hi Josh,
Sorry for the confusion. In my sequence file (FASTA output from mothur's shhh.flow), every single sequence has the section AGAGCGAA ahead of my primer sequence (GAGTTTGATCCTGGCTCAG). According to the sequencing center, all barcode adapters were removed. Also, because of the AGAGCGAA sequence, mothur would not let me remove the primer from each sequence. Do you have any idea what that sequence is for, and how I can remove it? Thank you!
Hello lzheng.chn,
I'm sorry to hear that you are having this problem -- I'm not sure what has caused it. It might help to add that sequence (AGAGCGAA) from all your samples to the oligos file that you provide to mothur, so that the region is trimmed during the shhh.flow step. You can alternatively trim the extra sequence and primer regions off of your FASTA file outside of mothur with adapter-trimming tools (see here for example) or at the command line with standard Linux tools.
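If you go the scripting route, a minimal sketch might look like the following. It assumes the unexpected head is always the literal AGAGCGAA followed immediately by the primer, and that reads without that prefix should pass through unchanged; the filenames are placeholders, not anything from your run.

```python
# Sketch: strip a fixed leading artifact plus the primer from each FASTA record.
# Assumes the head is always the literal AGAGCGAA followed by the primer;
# reads that don't match the prefix are written unchanged.

HEAD = "AGAGCGAA"
PRIMER = "GAGTTTGATCCTGGCTCAG"
PREFIX = HEAD + PRIMER

def trim_record(seq):
    """Remove the artifact+primer prefix if present (case-insensitive)."""
    if seq.upper().startswith(PREFIX):
        return seq[len(PREFIX):]
    return seq

def trim_fasta(in_path, out_path):
    """Copy a FASTA file, trimming the prefix from each sequence line."""
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            if line.startswith(">"):
                fout.write(line)  # header lines pass through untouched
            else:
                fout.write(trim_record(line.rstrip("\n")) + "\n")
```

Note this exact-match version will miss reads with sequencing errors in the primer region; a real adapter trimmer (e.g. one that allows mismatches) is more robust.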
Hi Josh,
I may have found the problem. I switched to QIIME's process_sff.py, which trims the head of each read the way sff_extract does. Maybe this is a bug in mothur. Thank you for your help!
Le
Great -- glad it worked out!