When running this command on macOS: "makeblastdb -in protein.fa -dbtype prot -parse_seqids" it fails with the following error: BLAST Database error: Database memory map file error.
I need help creating the database.
I had the same problem. The path to the fasta file contained a space. After moving the fasta file to a folder without spaces in the path I was able to run the makeblastdb command.
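For example (a minimal sketch; the folder names here are just placeholders), move the file somewhere without spaces in the path, quoting the old path while it still contains a space:

mkdir -p ~/blastdb                                    # any destination without spaces in the path
mv "/Volumes/My Drive/protein.fa" ~/blastdb/          # quote the source path because it contains a space
makeblastdb -in ~/blastdb/protein.fa -dbtype prot -parse_seqids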
I also had this error. The fasta source file was on an external drive but when I moved it to my internal drive in my home folder the makeblastdb command worked just fine. I suspect it's a permissions error for using external drives (Sequoia, Mac mini M4, 24 GB RAM) but I'm not certain. I set the external drive's permissions to "Ignore ownership on this volume" using "Get Info" in the finder, but that did not resolve the problem.
Add a title when making the db. On v2.16 I had the same issue with a db downloaded from the NCBI FTP site; it worked after adding a db alias, so it's worth a try.
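Something along these lines (a sketch; "my_prot_db" is just a placeholder title/alias):

makeblastdb -in protein.fa -dbtype prot -parse_seqids \
    -title my_prot_db -out my_prot_db                 # -title and -out give the database an explicit name/alias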
Error reporting 101: if the error mentions memory, you tell us about your computer's memory. Chances are your system falls short there.
I have checked my computer's memory and it seems to have 88GB available.
So this is a Mac Pro (they can be fitted with 64, 128, or 192 GB RAM; 88 GB I haven't seen)? What is the output of "About this Mac"?
And can you please run makeblastdb with only a single small protein sequence, to see whether the problem is related to the DB size or to the binary itself?
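For instance, something like this (a sketch; the sequence itself is arbitrary):

printf '>test_seq\nMKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ\n' > tiny.fa    # a single short protein record
makeblastdb -in tiny.fa -dbtype prot -parse_seqids                   # should finish almost instantly if the binary is fine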
Which version of the binaries did you install? This may have to do with the processor type. Try to re-install the macOS universal binaries from https://ftp.ncbi.nlm.nih.gov/blast/executables/blast+/LATEST/ or the ones compiled for your chip (M1/M2: aarch64; Intel: x64).
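If you are not sure which chip you have or which binary is installed, these two standard commands (nothing BLAST-specific) will tell you:

uname -m                        # arm64 = Apple silicon (M1/M2), x86_64 = Intel
file "$(which makeblastdb)"     # reports whether the installed binary is arm64, x86_64, or universal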
I am working with a MacBook Air with 8 GB and an M1 chip. I tried with another small sequence and it gives me the same error. Regarding the version of the binaries, I am not very familiar with these terms, so I don't really know what to do with the link you provided; I followed my professor's instructions to install everything. When I run makeblastdb -version it says version 2.16.0+. Do you think you could give me a hand? Do you need any more information?
If the smaller sequence yields the same error it doesn't have anything to do with memory. Try to reinstall using this package: https://ftp.ncbi.nlm.nih.gov/blast/executables/blast+/LATEST/ncbi-blast-2.16.0+-universal.dmg The universal image should work on both Intel and M1 Macs. Unfortunately, I am still on an Intel Mac, so I cannot reproduce any M-chip related problems.
Did you find the answer to this please?
I find the same on a decent-spec Intel Mac, using both the correct binaries and the universal one. This is a test of 10 records:
You made a protein database but are trying to use blastn (with nucleotide or protein data) against it. If you have nucleotide data as query then you need to use blastx. If you have protein queries then use blastp.
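In other words, roughly like this (a sketch; the query file names and database name are placeholders):

blastp -query protein_query.fa    -db my_prot_db -outfmt 6 -out blastp_hits.tsv   # protein query against a protein DB
blastx -query nucleotide_query.fa -db my_prot_db -outfmt 6 -out blastx_hits.tsv   # nucleotide query, translated, against a protein DB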
Thank you for taking the time to help this newbie @genomax. What I should have been using is this, which does work:
That makes sense. Thanks for confirming.
Good morning everyone, I have the same problem. I am trying to build a local BLAST database for nucleotides, and I downloaded all the nt files with this command:
for i in $(seq -w 0 225); do wget "https://ftp.ncbi.nlm.nih.gov/blast/db/nt.${i}.tar.gz"; done
then decompressed them: for f in nt.*.tar.gz; do tar -xvzf "$f"; done
and then when I run this: blastdbcmd -db nt -info
I obtain this error: BLAST Database error: Database memory map file error. The files seem fine and decompressed with all the correct extensions: .nog, .nni, .nnd, .nin, .nhr. I also should have enough memory. Does anyone have some ideas to fix this issue? Thank you!
First, please don't do this, use the update_blastdb.pl command.
Second, as to why this doesn't work this time: as of 2025-02-19 the NT database has 228 archives and counting, so your loop over 0-225 misses the most recent volumes. Another reason to use the update script.
Thank you! I used this approach and it is working right now. I ran update_blastdb.pl --decompress nt and now it seems to recognize the database.
On a side note: if you download directly from NCBI, do also download the md5 checksum keys and use them to check the sanity of your downloaded files (they are provided for a reason ;) ).
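A rough sketch of that check (assuming the .md5 files sit next to the archives and that GNU md5sum is available; stock macOS only ships the BSD md5 tool):

for f in nt.*.tar.gz.md5; do md5sum -c "$f"; done    # every line should report OK; anything else means re-download that archive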
More on topic: can you confirm you also have a file with the extension .pal? (That's the one that ties all those others together into a single DB.)
I downloaded the md5 checksums too. I still don't have a file with the .pal extension, but it seems to work at the moment. Do you think it's a problem if I don't have it? Thank you!
Also no .nal (since you're using the nucleotide DB)? Maybe it's not needed anymore ...
The only idea I have is that you don't have enough memory. What exactly does it mean that you should have enough memory? 256+ GB? I don't know what the current memory requirement is for a local BLAST run, but it wouldn't surprise me if it is inching towards half a TB.
Assuming that you have enough memory, how long does it take from the moment you hit Enter on your command line until the error shows up?
I am working on a remote university server and I have 17 TB, so I think that is enough to run a local BLAST. Before, the error appeared immediately; now it seems to work. Thank you!