Help with Generating Clustered Heatmap from Large Datasets
ArtiCore • 23 days ago

Hi,

I'm working with a large dataset from which I'm trying to generate a clustered heatmap of z-scores. However, I’m hitting a memory error: "Error: cannot allocate vector of size 614.9 Gb". The data are simply too large to process in one go.

Does anyone have advice on:

  • How to handle heatmap generation for such large datasets?
  • Are there any methods to process or visualize the data in chunks while retaining meaningful clusters?
  • Any tools, R packages, or approaches for optimizing memory usage for this type of task?

I'd appreciate any insights or suggestions—thank you!

Heatmap

614.9 Gb

What kind of data is this? You should add some information about that to get specific help. Depending on the data type there may be different strategies.


The data are derived from a genomic raw count matrix transformed into z-scores; it is a numeric matrix. In R, I am trying to run this command:

library(pheatmap)
pheatmap(mat_z_scores, cluster_rows = TRUE, cluster_cols = TRUE)

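For context on the size: with cluster_rows = TRUE, pheatmap hands the rows to dist()/hclust(), which needs an all-vs-all distance matrix of roughly 8 * nrow^2 / 2 bytes, so an allocation in the hundreds of gigabytes suggests hundreds of thousands of rows. Below is a minimal sketch of two ways to shrink the clustering step, assuming mat_z_scores is the matrix from the post above; the matrixStats dependency, the 2,000-row cutoff, and kmeans_k = 1000 are arbitrary illustrations, not recommendations.

library(pheatmap)
library(matrixStats)   # rowVars(); any row-variance computation would do

## Option 1: keep only the most variable rows before clustering
row_var <- rowVars(mat_z_scores)
keep    <- order(row_var, decreasing = TRUE)[seq_len(min(2000, nrow(mat_z_scores)))]
pheatmap(mat_z_scores[keep, ], cluster_rows = TRUE, cluster_cols = TRUE)

## Option 2: let pheatmap aggregate rows into k-means clusters first,
## so only the 1000 cluster centroids are clustered and drawn
pheatmap(mat_z_scores, kmeans_k = 1000, cluster_cols = TRUE)

Both leave the column clustering untouched; the point is only to avoid building a distance matrix over every individual row.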


Not sure if this makes a difference in R, but in Python one could force the data type to be float32 rather than the default float64.
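Base R has no single-precision type (numeric matrices are always 64-bit doubles), but as a rough sketch of the same idea, the CRAN float package offers 32-bit storage; the fl()/dbl() calls below come from that package. Note that downstream functions such as dist()/hclust() may coerce back to double, so this mainly halves the stored matrix rather than the clustering step itself.

library(float)                 # CRAN package for 32-bit float matrices

mat_f32  <- fl(mat_z_scores)   # cast the double matrix to single precision
object.size(mat_z_scores)      # size of the original 64-bit matrix
object.size(mat_f32)           # roughly half of that
mat_back <- dbl(mat_f32)       # cast back to double where a function requires it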
