Every RNA-seq analysis application talks about the dispersion of a gene. As far as I understand, it is not simply the variance of the normalized counts for a given gene; it seems to be something more complicated.
DESeq defines the dispersion alpha_i of gene i through the variance of the negative binomial model it fits to the counts: Var(K_ij) = mu_ij + alpha_i * mu_ij^2, where mu_ij is the fitted mean count for gene i in sample j.
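For instance, taking a hypothetical mean of mu = 100, a dispersion of 0.19 would give Var = 100 + 0.19 * 100^2 = 2000, while a dispersion of 0.80 would give Var = 100 + 0.80 * 100^2 = 8100, so the same dispersion value corresponds to a different variance at every expression level.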
But what would a dispersion of 0.19 or a dispersion of 0.80 tell me? Can I still interpret it as the variance of a gene?
In my understanding, you can. Start from the general statistical definition of dispersion:
In statistics, dispersion (also called variability, scatter, or spread) denotes how stretched or squeezed a distribution (theoretical or that underlying a statistical sample) is. Common examples of measures of statistical dispersion are the variance, standard deviation and interquartile range.
That is the definition from Wikipedia; however, I am not sure that the authors really mean raw variance, as we usually understand it, when they talk about the dispersion.
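To make the relationship concrete, here is a minimal Python sketch. It assumes the negative binomial parameterization Var(K) = mu + alpha * mu^2 quoted in the question; the function names are my own, not part of DESeq:

```python
import math

# Assumption: DESeq-style negative binomial model with
# Var(K) = mu + alpha * mu^2, where alpha is the gene's dispersion.

def nb_variance(mu: float, alpha: float) -> float:
    """Variance of the counts for mean mu and dispersion alpha."""
    return mu + alpha * mu ** 2

def coef_of_variation(mu: float, alpha: float) -> float:
    """CV = standard deviation / mean; tends to sqrt(alpha) for large mu."""
    return math.sqrt(nb_variance(mu, alpha)) / mu

for alpha in (0.19, 0.80):
    for mu in (10, 100, 1000):
        print(f"alpha={alpha:.2f}  mu={mu:4d}  "
              f"var={nb_variance(mu, alpha):9.1f}  "
              f"CV={coef_of_variation(mu, alpha):.2f}  "
              f"sqrt(alpha)={math.sqrt(alpha):.2f}")
```

The variance changes with the mean, but the coefficient of variation settles at roughly sqrt(alpha): about 0.44 for a dispersion of 0.19 and about 0.89 for 0.80. This matches the DESeq authors' description of the dispersion as (approximately) the squared coefficient of biological variation between replicates, i.e. a scale-free measure of spread rather than a raw variance.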