Normalized Bigwig Files
4
11.7 years ago
vj ▴ 520

I am trying to normalise bigwig files (starting from BAM files) for a large number of ChIP-seq samples. Is there a well-agreed method for doing this? I know of normalising to RPM. Any suggestions?

bigwig • 21k views
9
11.7 years ago
Ian 6.1k

If you use bedtools genomecov, you can apply a scaling factor:

bedtools genomecov -ibam input.bam -bg -scale X -g genome.chrom.sizes > normalised.bg

where X is the scaling factor. For each sample, X could be 1,000,000 divided by its number of mapped reads (i.e. RPM), or each sample's mapped-read count divided by the mean mapped-read count across all samples.
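For example, here is a minimal sketch of computing an RPM scaling factor with samtools and passing it to genomecov (assuming samtools is installed; input.bam and genome.chrom.sizes are placeholders):

# count mapped reads in the BAM
MAPPED=$(samtools view -c -F 4 input.bam)
# scaling factor: 1,000,000 / mapped reads (RPM)
SCALE=$(awk -v n="$MAPPED" 'BEGIN{print 1000000/n}')
bedtools genomecov -ibam input.bam -bg -scale "$SCALE" -g genome.chrom.sizes > normalised.bg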

You can then convert the bedGraph to bigWig (if your version of wigToBigWig rejects bedGraph input, use bedGraphToBigWig instead):

wigToBigWig -clip normalised.bg genome.chrom.sizes normalised.bw
0

Thanks. This seems to open up a lot of options.

2
11.7 years ago
Ryan Dale 5.0k

pybedtools has a function that scales your BAM by million mapped reads (the scaling used by many ENCODE data sets) and creates a bigWig file, all in one shot:

from pybedtools.contrib.bigwig import bam_to_bigwig
bam_to_bigwig(bam='path/to/bam', genome='hg19', output='path/to/bigwig')

More details in this answer: Converting Bam To Bedgraph For Viewing On Ucsc?

0
5.6 years ago
sztankatt • 0

As of today, you can use deepTools for exactly these kinds of tasks: https://deeptools.readthedocs.io/en/latest/index.html
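For instance, a minimal sketch using deepTools' bamCoverage to write a normalised bigWig directly from a sorted, indexed BAM (the CPM method and the 10 bp bin size are illustrative choices; deepTools 3 or later is assumed):

# coverage bigWig normalised to counts per million mapped reads (CPM)
bamCoverage -b input.sorted.bam -o normalised.bw --normalizeUsing CPM --binSize 10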
