Hi,
I know Homer can normalize tag counts to 10 million reads, but do you know of any other methods for this normalization step besides Homer? For example MACS2 itself, or ...?
Thanks,
Again, as in your previous questions: if you ask for a solution, you have to specify what kind of input data you have. Please keep that in mind for the future. If you have a count matrix, you can do it with a one-liner in R:
matrix_norm10mio <- data.frame(round(sweep(matrix_raw_counts, 2, colSums(matrix_raw_counts), FUN = "/") * 10000000))
You can scale to any number of reads by replacing the 10000000 at the end with the intended total read count. This is of course a very simple way of normalizing counts.
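To make the one-liner concrete, here is a minimal self-contained sketch with a made-up toy count matrix (the sample and peak names are invented for illustration; substitute your own data frame of raw counts):

```r
# Toy count matrix: rows = regions/peaks, columns = samples.
# The values and names below are demo data, not real sequencing counts.
matrix_raw_counts <- data.frame(
  sampleA = c(100, 300, 600),   # library size: 1,000 reads
  sampleB = c(50, 150, 300),    # library size:   500 reads
  row.names = c("peak1", "peak2", "peak3")
)

# Divide each column by its column sum (library size),
# then scale every sample to a common depth of 10 million reads.
matrix_norm10mio <- data.frame(
  round(sweep(matrix_raw_counts, 2, colSums(matrix_raw_counts), FUN = "/") * 10000000)
)

# After scaling, each column sums to (approximately) 10 million;
# rounding can introduce small deviations on real data.
print(colSums(matrix_norm10mio))
```

Note that this per-sample scaling only corrects for sequencing depth; it does not account for composition effects, which is why methods like TMM or the strategies in TCC exist.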
Maybe the R package TCC can help you with this. Paper: "TCC: an R package for comparing tag count data with robust normalization strategies" (available on Bioconductor).