Calculating abundance of metagenomes
5.3 years ago
biobiu ▴ 150

Hi, we performed shotgun metagenomics on dozens of stool samples. We then assembled the reads into contigs and binned the contigs to reconstruct genomes. We want to use these reconstructed genomes to calculate the abundance of the microbes they represent across all samples.

  1. Is there any recommended aligner (and parameters) for such a task, considering that we align against incomplete genomes, have multi-mapping reads, etc.?
  2. I came across some studies that calculated relative abundance by dividing the number of reads aligned to a metagenome by the total number of reads in the sample. Any idea why genome length was not also taken into account? (See the sketch after this list for the two calculations I mean.)
  3. Is there any reason to calculate absolute abundance in such a case (maybe for downstream analyses/tools)?
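To make question 2 concrete, this is roughly what I mean by the two ways of calculating it (a minimal sketch with made-up numbers; the bin names and counts are just placeholders):

```python
# reads_aligned[bin] = reads from one sample mapped to that bin (reconstructed genome)
# genome_len[bin]    = total length of the bin's contigs
# total_reads        = total reads in the sample
reads_aligned = {"bin1": 120_000, "bin2": 30_000}
genome_len = {"bin1": 4_000_000, "bin2": 2_000_000}
total_reads = 10_000_000

# (a) Simple fraction of reads, as in the studies I mentioned:
frac = {b: n / total_reads for b, n in reads_aligned.items()}

# (b) Length-normalized: reads per base, rescaled so the values sum to 1.
per_base = {b: reads_aligned[b] / genome_len[b] for b in reads_aligned}
norm = sum(per_base.values())
rel_abund = {b: v / norm for b, v in per_base.items()}

print(frac)       # larger genomes recruit more reads, which inflates (a)
print(rel_abund)  # (b) corrects for genome length
```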

Thanks!

Have you seen MetaPhlAn v.2.0?

Thanks. Yes, I've seen it; we're thinking of a way to estimate abundance without using marker genes.

Why not use marker-gene methods?

We assume that we have uncharacterized bacterial species as well as non-bacterial genomes.

4.4 years ago
Naren ▴ 1000

phyloFlash (https://github.com/HRGV/phyloFlash), which uses SSU rRNA (16S) reads against the SILVA database.

4.4 years ago

It's not really intended to answer your question, and you've probably solved this by now, but our pipeline can do the alignment and your intended normalization (including number of reads and genome length):

https://github.com/MHH-RCUG/Wochenende

Since you have custom output, you'll need to create a reference sequence (combine the contigs into "genomes" by removing their headers) and index it.
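If it helps, here is a rough, generic sketch of that kind of normalization (not the Wochenende pipeline itself; the file names and the contig-to-bin table are placeholders from your own binning step). It assumes you aligned one sample's reads to the combined reference and ran `samtools idxstats sample.bam > sample.idxstats`:

```python
from collections import defaultdict

# contig_to_bin.tsv: <contig_name>\t<bin_name>  (hypothetical mapping from your binning)
contig_to_bin = {}
with open("contig_to_bin.tsv") as fh:
    for line in fh:
        contig, bin_name = line.split()
        contig_to_bin[contig] = bin_name

# idxstats columns: contig, contig_length, mapped_reads, unmapped_reads
bin_reads = defaultdict(int)   # mapped reads per bin
bin_length = defaultdict(int)  # summed contig length per bin
with open("sample.idxstats") as fh:
    for line in fh:
        contig, length, mapped, _unmapped = line.split()
        bin_name = contig_to_bin.get(contig)
        if bin_name is None:
            continue  # contig (or the "*" line) was not assigned to any bin
        bin_reads[bin_name] += int(mapped)
        bin_length[bin_name] += int(length)

# Reads per base corrects for genome (bin) length; rescaling to sum to 1
# gives a relative abundance per bin for this sample.
per_base = {b: bin_reads[b] / bin_length[b] for b in bin_reads}
total = sum(per_base.values())
rel_abund = {b: v / total for b, v in per_base.items()}
for b, v in sorted(rel_abund.items()):
    print(b, round(v, 4))
```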
