Ben
I am attempting to run DESeq2, edgeR, and limma-voom to find DEGs for a 'sensitivity' condition, with 'lineage' as an additional factor in the design.
My dispersion model doesn't seem to fit my data well, but I wanted to check whether it is acceptable. Here is my code:
library(DESeq2)
# Filter lowly expressed genes (default filterByExpr settings)
keep <- edgeR::filterByExpr(y = counts, group = sensitivity)
counts <- counts[keep, ]
# fit_type is either "parametric" or "local" (plots for both fits are below)
dds <- DESeqDataSetFromMatrix(counts, metadata, design = ~ lineage + sensitivity)
dds <- DESeq(dds, fitType = fit_type, minReplicatesForReplace = Inf)
plotDispEsts(dds)
Here is my dispersion model fitted to my data using the parametric fit type.
And here is the model with the local fit type.
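For reference, this is roughly how the two plots were generated (a minimal sketch, not my exact script; it assumes counts, metadata, lineage, and sensitivity are already loaded as above):
library(DESeq2)
# Fit the same model once per fit type and plot the dispersion estimates side by side
par(mfrow = c(1, 2))
for (fit_type in c("parametric", "local")) {
  dds <- DESeqDataSetFromMatrix(counts, metadata, design = ~ lineage + sensitivity)
  dds <- DESeq(dds, fitType = fit_type, minReplicatesForReplace = Inf)
  plotDispEsts(dds, main = fit_type)
}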
Any thoughts?
Yep. That's a pretty weird looking fit.
One thing that is noticeable is that the gene-wise dispersion estimates look bimodal in the mid-range of mean counts. One of your factors is "lineage". Is it possible that there are different cell types, and the genes with lower dispersion represent genes expressed in both cell types, while those with higher dispersion represent genes that are only expressed in one cell type?
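A quick, rough way to check that idea might be something like the following (only a sketch; it assumes the dds from your first post, that colData(dds)$lineage is your lineage column, and an arbitrary mean-CPM > 1 cutoff for calling a gene "expressed"):
library(DESeq2)
# Mean CPM of each gene within each lineage (CPM > 1 is an arbitrary "expressed" cutoff)
cpm_mat <- edgeR::cpm(counts(dds))
lineages <- unique(as.character(colData(dds)$lineage))
mean_cpm_by_lineage <- sapply(lineages, function(l) {
  rowMeans(cpm_mat[, colData(dds)$lineage == l, drop = FALSE])
})
n_lineages_expressed <- rowSums(mean_cpm_by_lineage > 1)
# Do genes detected in every lineage sit in the lower dispersion mode?
boxplot(log10(mcols(dds)$dispGeneEst) ~ n_lineages_expressed,
        xlab = "number of lineages with mean CPM > 1",
        ylab = "log10 gene-wise dispersion estimate")
If the high-dispersion mode is dominated by genes only detected in one or two lineages, that would support the cell-type explanation.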
Yes, agreed, it definitely looks bimodal. Lineage is actually about 4 or 5 different cell lineages. I couldn't think of a quick way to determine which genes were differentially expressed between these, so I just individually ran a couple of lineages with the design ~ sensitivity (roughly as sketched after the plots below). My dispersions still look quite bimodal. Any ideas?
Bowel:
Lung (ymin = 1e-03 as there are several outliers at 1e-08):
CNS (ymin = 1e-04 as there are several outliers at 1e-08):
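For reference, roughly what I ran for each lineage (a sketch, not my exact code; it assumes counts, metadata, and fit_type from my first post, and run_one_lineage is just a hypothetical helper name):
library(DESeq2)
# Hypothetical helper: re-fit within a single lineage so lineage drops out of the design;
# extra arguments (e.g. ymin) are passed on to plotDispEsts
run_one_lineage <- function(lin, ...) {
  sel <- metadata$lineage == lin
  keep <- edgeR::filterByExpr(counts[, sel], group = metadata$sensitivity[sel])
  dds_lin <- DESeqDataSetFromMatrix(counts[keep, sel], metadata[sel, ], design = ~ sensitivity)
  dds_lin <- DESeq(dds_lin, fitType = fit_type, minReplicatesForReplace = Inf)
  plotDispEsts(dds_lin, main = lin, ...)
  dds_lin
}
dds_bowel <- run_one_lineage("Bowel")
dds_lung  <- run_one_lineage("Lung", ymin = 1e-3)  # several outliers down at 1e-08
dds_cns   <- run_one_lineage("CNS",  ymin = 1e-4)  # several outliers down at 1e-08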
I think you probably need input from Mike Love (@mikelove). He hangs around here sometimes, but is more active on support.bioconductor.org, so you might try cross-posting there.
I have cross-posted, thanks for your help!