TopHat with Bowtie2 large index
10.4 years ago
BDK_compbio ▴ 140

I am getting an error while running TopHat2. I built the indices with Bowtie2 using the following command:

bowtie2-build --large-index <reference> <output directory>

It generates indices with the suffix .bt2l (instead of .bt2). But when I run TopHat it gives me the following error:

Could not find Bowtie 2 index files (.*.bt2)
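
For reference, the build output can be listed to confirm the suffix; the path below is just the same placeholder as in the command above:

ls -lh <output directory>/*.bt2l*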

Which version of tophat2? I think only the most recent version supports large indexes (it doesn't actually say in the release notes).
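
In case it helps, the installed versions can be printed like this (assuming both tools are on your PATH):

tophat2 --version
bowtie2 --version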


I am using tophat/2.0.12, which is the latest version, I guess.


Hello sbdk82!

It appears that your post has been cross-posted to another site: SEQanswers

This is typically not recommended as it runs the risk of annoying people in both communities.


Thanks for the suggestion. I posted on both sites thinking that I would get a quick answer.


I ran it like this:

/usr/local/bin/bowtie2-build -f ~/Genome_fasta/hg19refseq.fa ~/homo_bowtie2_index/hg19

where hg19 is the index basename (prefix) and hg19refseq.fa is the reference sequence.

I got

hg19.1.bt2
hg19.2.bt2
hg19.3.bt2
hg19.4.bt2
hg19.rev.1.bt2
hg19.rev.2.bt2

in the given directory

And TopHat2 always runs very well using them.
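
For completeness, a typical TopHat2 run against that basename looks roughly like the following; the read file names, thread count, and output directory are placeholders, not the exact values I use:

tophat2 -p 8 -o tophat_out ~/homo_bowtie2_index/hg19 reads_1.fastq reads_2.fastq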


I also ran it like that, but it gave me the following error:

Error: Reference sequence has more than 2^32-1 characters!  Please divide the reference into batches or chunks of about 3.6 billion characters or less each and index each independently.

That's why I tried the --large-index option. I am now using BWA-MEM and STAR instead of Bowtie + TopHat.
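
In case it is useful to others, the replacement workflow is roughly as follows; the file names, thread counts, and output paths are placeholders, and the defaults may need tuning for a very large reference:

bwa index -a bwtsw reference.fa
bwa mem -t 8 reference.fa reads_1.fastq reads_2.fastq > bwa_mem.sam

STAR --runMode genomeGenerate --runThreadN 8 --genomeDir star_index --genomeFastaFiles reference.fa
STAR --runThreadN 8 --genomeDir star_index --readFilesIn reads_1.fastq reads_2.fastq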


Maybe try TopHat version 2.0.8, which works for me.


I am using tophat 2.0.12


Actually, the reference file is too large, so building a Bowtie index won't work without the --large-index option, and the current version of TopHat does not support large indexes. Did you use the --large-index parameter? Maybe your reference file is not large enough to need it.


Which reference genome are you using?

I did this with human, chimp, gorilla, rhesus, cynomolgus, mouse, etc. reference genomes and never faced any problem.


None of those genomes is large enough to require large-index support, which is good, since TopHat can't yet handle that.
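
As a quick sanity check, whether a reference actually crosses the 2^32-1 limit can be estimated by counting the non-header characters in the FASTA; the file name here is just a placeholder:

grep -v '^>' reference.fa | tr -d '\n' | wc -c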


I am using a prebuilt index (.bt2 files) of hg19 for Bowtie2, but every run gives an error saying it is too large:

Out of memory allocating the ebwt[] array for the Bowtie index. Please try again on a computer with more memory.

I know hg19 is large, and the FASTQ file I am using is also large. But isn't there any way to run Bowtie2 on this?
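
For what it's worth, it may help to compare the size of the index on disk with the memory actually free when the run starts; these are generic Linux commands, and the index path is a placeholder:

ls -lh /path/to/hg19_index/*.bt2
free -h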


Firstly, please start a new thread for things like this rather than posting questions as a comment.

Secondly, you can certainly use bowtie2 for hg19, you just need to use a computer with more RAM (as the error message said).


Oops, sorry for not starting a new thread. I am new here, and somewhere it said to use the comment tab for discussions!

OK, so I need more RAM. What is the minimum RAM required to run a mapping against hg19? My PC has 8 GB of RAM.

I am quite new to this, so please consider this a genuine doubt! And yes, I will make sure I post questions as new threads from now on!

Thanks!


I'm a little surprised that that's not enough, though I suspect you have other things running. I would expect that 12 gigs would suffice.


Thank you for the reply! I was wondering the same thing. I tried closing all other programs and running it again, but no luck!
