Gene expression using R
1
0
Entering edit mode
6.4 years ago
Leite ★ 1.3k

Hello everyone,

I am analyzing a large microarray dataset (n = 820), and R has exhausted the total memory of my computer:

Warning messages:
1: In dimnames(object[[nm]]) <- dn :
  Reached total allocation of 6036Mb: see help(memory.size)
(the same warning is repeated three more times)

I was thinking of splitting the database and normalizing the groups separately:

Example:

  1. Patients who survived + healthy controls, then look for the DEGs

  2. Patients who did not survive + healthy controls, then look for the DEGs

My question is: if I normalize the data of the two groups separately, would the DEG results differ from normalizing all the groups together and then doing the DEG analysis?

Best,

Leite

r microarray Reached total allocation • 2.0k views
0
Entering edit mode

If your computer has 6Mb of memory you have more pressing issues.

Increase the allocation R is allowed to use:

https://stackoverflow.com/questions/1395229/increasing-or-decreasing-the-memory-available-to-r-processes

0
Entering edit mode

Dear jrj.healey,

6Mb is the total memory capacity of my computer.

1
Entering edit mode

That is unlikely with a relatively new computer (unless you are using one from the 1990s). Surely you meant to say 6 GB.

0
Entering edit mode

Sorry, my mistake; yes, it's 6 GB and not 6 MB.

1
Entering edit mode

OK, with that cleared up, you have a few options, I think. I can't speak to the statistical robustness of splitting your data up, as I'm no statistician.

Your first option is to look for more memory-efficient ways of doing the computations, so the whole dataset is never read into memory at once. If this were Python, for instance, I'd suggest taking a look at generators; R has equivalent capabilities for reading data incrementally. You may also be able to identify the problematic steps of your process and gain some performance by vectorising or handling the computations differently.
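As a rough R sketch of the incremental-reading idea (the file name and chunk size are hypothetical, and the per-chunk work is left as a placeholder):

```r
# Read a large tab-delimited matrix in chunks through a connection,
# so only one chunk of rows is held in memory at a time.
con <- file("expression_matrix.txt", open = "r")
header <- readLines(con, n = 1)           # keep the column names
repeat {
  lines <- readLines(con, n = 10000)      # next chunk of rows
  if (length(lines) == 0) break
  chunk <- read.table(text = lines, sep = "\t")
  # ... compute per-probe summaries on `chunk` here ...
}
close(con)
```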

Your other option is to move to a new machine/HPC provision, or to get more RAM put into the one you have (if possible).

0
Entering edit mode

Dear jrj.healey,

Thank you for the reply and sorry for the mistake. I think my best option will be a new machine.

0
Entering edit mode

Find a better computer if you want to do proper bioinformatics, or find a bioinformatician to collaborate with.

0
Entering edit mode

Try the ff package in R for data this large. It gives output in a slightly different data structure, but tinkering around with a small subset (N = 10) first may help for guidance.
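A minimal sketch with ff, assuming the package is installed and the data live in a tab-delimited text file (the file name is hypothetical):

```r
library(ff)

# read.table.ffdf stores the table on disk in an ffdf object,
# so the full expression matrix never has to fit in RAM at once.
expr <- read.table.ffdf(file = "expression_matrix.txt",
                        header = TRUE, sep = "\t")
dim(expr)  # dimensions are available without loading all the data
```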

0
Entering edit mode
6.4 years ago

I guess you are working on Windows and your memory limit is set very low. You can check the output of memory.limit(). If it is too small, try increasing it, e.g. memory.limit(size = 4000) for approximately 4 GB of memory for R.
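For example, on Windows (sizes are in MB; note that setting a limit above physical RAM may push R into swap, and memory.limit() was removed in R 4.2, where memory is managed automatically):

```r
memory.limit()             # report the current limit in MB (Windows only)
memory.limit(size = 8000)  # raise the limit to roughly 8 GB
```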

1
Entering edit mode

OP has 6 GB, which is the limit in the error that R gives.

0
Entering edit mode

Ah, you are right. I did not check the whole thread!

