I have a small matrix of count data from different time points, which looks like this:
```
essentials.mean
            t1     t2     t3     t4     t5    t6   t7
Gene-1  3381.0 2132.5 1743.0 1774.5  724.0 125.5 56.0
Gene-2  4191.5 2894.0 2323.0 1323.0 2481.5 194.5 94.5
Gene-3  4522.0 2516.0 1751.0  803.0  443.5 163.5 54.0
Gene-4  5139.5 3412.0 3153.5 2669.5 1855.5 130.0 85.0
Gene-5  4447.0 1870.5 1425.5  907.5  324.5  34.0 55.0
...
```
When plotting the genes, I can clearly see a decrease in expression intensity over time for almost all of them.
I would like to know if there is a way to test the significance of this trend. How can I create a model to show that this decrease is statistically significant?
Thanks
Assa
You can test for time series effects in limma. This guide may help you decide which sort of time series model you want:
https://bioconductor.org/packages/release/workflows/vignettes/RNAseq123/inst/doc/designmatrices.html#linear-time-series
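As a minimal sketch of the linear-trend approach from that guide: treat time as a numeric covariate and test whether each gene's slope differs from zero. This assumes a matrix called `counts` with raw per-sample counts (genes x samples) whose seven columns correspond to time points 1 to 7; adjust the object name and the time values to your data. If you only have averaged counts like `essentials.mean`, limma-trend on log-transformed values may be more appropriate than voom.

```r
library(limma)

# Numeric time covariate (assumed to be 1..7; replace with your actual times)
time <- 1:7
design <- model.matrix(~ time)

# voom converts counts to logCPM with precision weights (expects raw counts)
v <- voom(counts, design)

# Fit a linear model per gene and moderate the variances
fit <- lmFit(v, design)
fit <- eBayes(fit)

# Test of the linear time trend for each gene; negative logFC = decrease per time unit
topTable(fit, coef = "time", number = Inf)
```

The p-value for the `time` coefficient tests whether the linear trend in log-expression is non-zero, which is one way to quantify the decrease you see in the plots.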
I'm not sure this would be the right approach, as I'm only interested in looking at 20 genes from the complete data set. Would this work with limma if I subset my counts matrix to leave only those 20 genes?
But thanks, I'll have a look at limma.