Hi there,
I'm not sure if this is the right place for this question, but I feel like I can't see the forest for the trees. ;) I have some data from a FACS experiment and want to plot histograms of a given channel.
The channel's data is a 1D array of decimals ranging from slightly below zero to 10^5. No matter whether I use Excel or Python pandas, the histograms show that the majority of values are below 100 (counts > 3.5e5 of 2e4). The rare events with high signals are barely visible. However, if I use Flowing Software 2 or look at the report from the facility (they're using BD Diva), the histograms only show counts up to ~200, with most values above 100. Obviously, there's some data transformation going on. The axes in the dedicated FACS software seem to be log-scaled (or logicle), and there are counts around zero, which are heavily compressed in the final graph. However, simply plotting the data on a log-scaled axis in Python does not help.
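For reference, this is roughly what I mean by "plotting on a log-scaled axis" in Python (just a sketch; the file name, column name, and bin count are placeholders, not my actual export):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder file/column names -- the actual export format differs
df = pd.read_csv("export.csv")
values = df["FL1-A"]

fig, ax = plt.subplots()
ax.hist(values, bins=256)   # linear bins over the full range
ax.set_xscale("log")        # values <= 0 disappear and the linear bins look uneven
ax.set_xlabel("FL1-A")
ax.set_ylabel("Count")
plt.show()
```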
As said, I'm not sure whether this is the right place, because it's more of a question about the theoretical and mathematical background. ;) But if someone could give some explanation to a naive person, that would be great.
Can you post the Python and the software plots, so we can get an idea?
They look like this: