Hi,
So I'm trying to test for differential exon usage in a case/control study across all time points while accounting for within-subject variability. I know the full and reduced models to test between, but I'm a bit unsure how to set this up in edgeR.
Models:
Full: ~condition + time + condition:subject.nested + condition:time
Reduced: ~condition + time + condition:subject.nested
So far I have 1) made the DGEList, 2) specified the full model with model.matrix(), 3) run calcNormFactors(dge), and 4) estimated dispersions with estimateDisp(dge, design, robust=TRUE).
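For reference, a minimal sketch of that setup, assuming a hypothetical exon-level count matrix counts and a data frame samples whose columns condition, time, and subject.nested are factors:

library(edgeR)

# 1) exon-level DGEList from the (hypothetical) count matrix
dge <- DGEList(counts = counts)
# 2) full model
design <- model.matrix(~ condition + time + condition:subject.nested + condition:time, data = samples)
# 3) TMM normalization factors
dge <- calcNormFactors(dge)
# 4) robust dispersion estimation
dge <- estimateDisp(dge, design, robust = TRUE)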
But now with:
fit <- glmFit(dge, design)
lrt <- glmLRT(fit, coef = 35:36)
I'm a bit unsure how to define the LRT correctly here. My goal is to set up the test so that it compares the full and reduced models. If I build both model matrices, they are otherwise identical, but the last two coefficients (35 and 36) are missing from the reduced one. If I want to test what I described above, is it correct to pass those last two coefficients to glmLRT?
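For what it's worth, here is a sketch of how one might derive those coefficients programmatically instead of hard-coding 35:36, assuming the same hypothetical samples data frame as above (glmLRT also accepts coefficient names):

full <- model.matrix(~ condition + time + condition:subject.nested + condition:time, data = samples)
reduced <- model.matrix(~ condition + time + condition:subject.nested, data = samples)
# Columns in the full design that are absent from the reduced design
# are exactly the coefficients the LRT should drop:
drop_coefs <- setdiff(colnames(full), colnames(reduced))
lrt <- glmLRT(fit, coef = drop_coefs)  # equivalent to coef = 35:36
topTags(lrt)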
I would really appreciate any help.
Try the DEXSeq package (Inference of differential exon usage in RNA-Seq) in R: https://bioconductor.org/packages/release/bioc/html/DEXSeq.html. With edgeR, you can use the diffSpliceDGE function (https://www.rdocumentation.org/packages/edgeR/versions/3.14.0/topics/diffSpliceDGE).
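A minimal sketch of diffSpliceDGE on the fit you already have, assuming a hypothetical vector exons$GeneID mapping each exon row to its gene; as far as I know, diffSpliceDGE tests a single coefficient or contrast at a time, so each interaction term would get its own call:

# Exon-level splicing test: compares each exon's logFC for the chosen
# coefficient against the average logFC of its gene
sp <- diffSpliceDGE(fit, coef = 35, geneid = exons$GeneID)
topSpliceDGE(sp, test = "Simes", number = 20)  # gene-level results
topSpliceDGE(sp, test = "exon", number = 20)   # exon-level results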