ALLPATHS-LG: Using Sanger Data To Create The Fragment Library
13.2 years ago
David M ▴ 580

We're currently evaluating different assemblers, and I'm in the process of learning about and testing ALLPATHS-LG. At present we have 14 lanes of Illumina data (for a mammalian-sized genome), but don't yet have the fragment (= super-read) library. Could I still get a decent assembly by artificially creating the fragment library from a library of Sanger reads?

Thanks!

assembly sanger illumina • 2.9k views

Anybody have any ideas if this is a problem?

13.1 years ago
Torst ▴ 980

This would probably work, as long as you have (1) high quality across the whole read and (2) enough coverage. For (1): the super-read is constructed by an overlapping alignment of the two 100 bp reads from a 180 bp fragment, so that the lower-quality 3' ends of the reads overlap to produce an overall higher-quality 180 bp read. For (2) I think you could be in trouble, as you won't get the ~50x coverage required from a Sanger library, unless you shred your Sanger reads into overlapping 180 bp reads at sufficient depth. In that case it would probably be fine.
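In case it's useful, the shredding step described above can be sketched in a few lines of Python. This is a hypothetical helper, not part of ALLPATHS-LG; the fragment length (180 bp) and target depth (50x) follow the numbers in the answer, and the step size is derived from them.

```python
def shred(read, frag_len=180, depth=50):
    """Shred one long Sanger read into overlapping frag_len-bp pieces.

    The step between fragment start positions is chosen so that each
    interior base ends up covered roughly `depth` times
    (coverage ~= frag_len / step). Hypothetical helper, not an
    ALLPATHS-LG tool.
    """
    step = max(1, frag_len // depth)  # 180 // 50 -> 3 bp between starts
    return [read[i:i + frag_len]
            for i in range(0, len(read) - frag_len + 1, step)]


# An 800 bp Sanger read yields a pile of overlapping 180 bp fragments
# covering its interior at roughly the requested depth.
fragments = shred("ACGT" * 200)  # 800 bp dummy read
```

Note this only fakes the fragment *length*; it does not recreate true paired-end error profiles, which is exactly the statistical-certainty concern raised in the comment below.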


If I shred the Sanger reads into overlapping reads, wouldn't I be lending the new reads a statistical certainty that doesn't exist? I.e., if a Sanger base is incorrect and that base is used in 50 'fake' reads, don't I grant certainty to an incorrect base where none should exist?


Does ALLPATHS use quality scores at all? If not, then wherever there is a low-quality base in your Sanger read, you could replace it with "N" to mark it for correction/PCR-check later. If it does, perhaps scale all the Q scores in the shredded reads by a factor appropriate for the shredding depth, so that ALLPATHS treats them more correctly.
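The N-masking idea can be sketched as follows. This is a minimal illustration, assuming per-base Phred scores are available alongside the sequence; the threshold Q20 is an arbitrary example, not an ALLPATHS recommendation.

```python
def mask_low_quality(seq, quals, min_q=20):
    """Replace bases whose Phred score falls below min_q with 'N'.

    seq   : base-call string, e.g. "ACGT..."
    quals : per-base Phred scores (same length as seq)
    Hypothetical helper for pre-processing shredded Sanger reads.
    """
    return "".join(b if q >= min_q else "N" for b, q in zip(seq, quals))


# Bases at Q10 and Q5 (below the Q20 cutoff) are masked:
masked = mask_low_quality("ACGTACGT", [30, 30, 10, 30, 5, 30, 30, 30])
print(masked)  # -> ACNTNCGT
```

Masking before shredding means one uncertain Sanger base becomes 50 N's rather than 50 confidently wrong calls, which partly addresses the false-certainty concern above.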
