How to automate submitting sequences to the plantCARE database
7.8 years ago
ashish ▴ 680

I am searching for cis-acting regulatory elements using the plantCARE database: http://bioinformatics.psb.ugent.be/webtools/plantcare/html/ I have more than 500 sequences, but this tool takes a single input at a time. I want to automate this process with a sleep of 60 seconds between each request, and I want the results in a table format. I've tried writing a script using the beautifulsoup and requests modules in Python, but in vain. Can anyone help me write a script for this? I've never written one and failed miserably trying to write one for this. Thanks.

python plantCARE

A couple of points:

  • If you have never written a script before, this is not an easy task, so you should really start learning the basics first if you want to use scripts more often in the future.
  • The homepage looks absolutely awful. I am not too much into plants, but are you sure this is a good resource? Are you sure there is nothing better? If I click on 'about us', the last reference to this project is from 2002! So do you know that this data is up to date? A well-constructed resource would allow you to download the data or offer an API to access it, which would make things much easier. Searching for sequences by parsing HTML in 2017 sounds completely wrong!

So, ARE you sure you want to use this site? And could you tell us why? Are there no alternatives for your use case? Does something like http://plants.ensembl.org/index.html not help you? Are you sure?

Please ask yourself these questions before you invest a lot of time! You have to slow down here! And yes, it is still a good idea to learn Python basics.


No, there is no alternative; in fact, this is the only resource available for such an analysis in plants. This is the only database which will give me the desired results. And I am already learning Python so that I can write good scripts in the future; right now I am only able to do small tasks with Python such as extracting sequences, sorting them, changing IDs etc. Is there any way a script can be written which will work on this website?


Have you tried contacting the authors/maintainers to ask them about options for batch usage?


Yes, they said scripts are allowed, but asked me to keep a sleep time of 60 seconds between each request.


If this is really the only resource for this - poor you!! Well, if you really have to parse the website, you are on the right track with beautifulsoup; it should be a good choice for this. I cannot help you more, sorry - but I am still shocked that a resource like this exists in 2017. This is ugly, brrr.


There are iframes in there. I'd find parsing this interactively with JavaScript difficult. The good part about these weirdly designed websites is that they don't use a lot of dynamic elements or JavaScript-driven content, so simple parsers should be able to reach the DOM elements consistently.

I would hate to be the one parsing this site.


Don't ask them if scripts are allowed. Ask them if you can access their database directly.


I've tried writing a script ... but in vain

http://www.reactiongifs.com/r/ktf.gif


Why add this as an answer?


I didn't. I added it as a comment.


No, I moved it to a comment. It was an answer.


Ah, my bad. Sorry about that.


No worries. Please exercise a little more caution in the future! :)

7.8 years ago
Ram 44k

For starters, use this URL: http://bioinformatics.psb.ugent.be/webtools/plantcare/cgi-bin/qfm_querycare.htpl

It's the actual query page. The site uses iframes, so the URL in the URL bar doesn't change when you click links; the navigation happens in a separate frame. Use your browser's Inspect tool to find your way through the site.

Once you get there, it is a simple POST web form. You should be able to fill in values and submit the form. If you give me sample values, I can help you parse the output.
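
If the form really is a plain POST, a sketch along these lines with the requests library could drive it. This is only a sketch: the endpoint and field names below are the ones that show up when inspecting the submission form (they also match the working script at the end of this thread), so confirm them with the Inspect tool before relying on them.

# Sketch: submit one sequence to the plantCARE form with requests.
# Endpoint and field names taken from inspecting the form / the script
# later in this thread -- verify them yourself before use.
import time
import requests

CARE_URL = "http://bioinformatics.psb.ugent.be/webtools/plantcare/cgi-bin/CallMat_NN47.htpl"

def query_plantcare(name, sequence):
    payload = {
        "Mode": "readonly",
        "StartAt": "0",
        "Field_Sequence": sequence,
        "Field_SequenceName": name,
        "Field_SequenceDate": "31/03/2017",
    }
    resp = requests.post(CARE_URL, data=payload, timeout=1500)
    resp.raise_for_status()
    return resp.text  # raw HTML result page, still needs parsing

# One sequence at a time, followed by the 60 second pause the maintainers asked for.
html = query_plantcare("gene1", "ATGGTGGGATACGGG")  # toy sequence, illustration only
time.sleep(60)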


Hey Ram, thanks for giving some new information. You're right that the URL doesn't change when we click on links. However, the link you've suggested is not my query page. After opening the link provided in the question, we need to click on "search for CARE"; the page which opens is our query page, and it takes FASTA sequences as input. If you're willing to try, you can use these example sequences.

gene1 ATGGTGGGATACGGGATCCAGTCGATGCTCAAGGATGGGCACAAGCATCTCTCGGGCCTC GACGAGGCCGTCCTGAAGAACATCGGCGCCGGCCGCGAGCTCTCCGCCATCACCCGAACC TCCCTTGGCCCCAACGGAATGAATAAGATGGTCATTAATCATCTAGACAAGCTCTTTATC ACGAATGATGCTGCAACAATCGTGAACGAGCTTGAGGTCCAGCACCCTGCTGCTAAGCTC CTGGTGCTCGCTGCCAGAGCGCAACAGGAGGAGATCGGAGATGGAGCTAACCTCACCATA TCTTTCGCTGGTGAACTGCTCGAGAAGGCTGAGGAGCTCATCAGGATGGGACTGCATCCA AGCGAGATCATCATTGGGTACACCAAGGCCATTAATAAGACGCTTGAGATTTTGGATGAT CTAGTTGAGAAAGGCTCTGAAAATATGGATGTCAGGAACAGGGAGGAGGTCGTGTTGAGG ATGAAATCTGCTGTTGCAAGCAAGCAATTTGGCCAGGAGGATGTGCTGTGTCCACTTGTC GCTGATGCCTGTATCCAAGTTTGTCCCAAAAATCCGGCAAACTTTAATGTTGACAATGTT AGAGTGGCAAAGCTGGTTGGAGGCGGCCTTCACAATTCTTCTGTAGTTCGTGGGATGGTT CTGAAAAATGATGCTGTTGGGAGCATTAAAAAAGTGGAGAAAGTGAAGGTTGCGGTATTT GCTGGTGGTGTTGACACTTCTGCAACTGAAACAAAGGGAACTGTTCTTATACATTCTGCT GAGCAGCTTGAAAACTATGCTAAAACTGAGGAAGCAAAGGTGGAGGAGCTTATTAAATCT GTGGCTGATTCAGGTGCCAAGGTTATTGTCAGTGGAGCAGCAGTTGGCGATATGGCATTG CATTTTTGTGAACGTTACAAGCTAATGGTTCTGAAAGTCAGCTCAAAGTTTGAGCTTCGA AGATTTTGCCGTACTACTGGGGCTATAGCACTTTTGAAGCTTAGCCGACCAAATGTGGAC GAATTAGGATATGCAGACTCTGTTTCAGTGGAAGAAATTGGCGGTGCTCGAGTGACTGTT GTCAAAAATGAAGAAGGTGGAAATTCAGTGGCTACTGTTGTCTTGAGAGGCAGTACAGAT AGTATATTAGATGATCTTGAAAGAGCTGTCGATGATGGTGTAAACACGTACAAGTCAATG TGCAGGGACAGTCGGATTATCCCTGGTGCTGCTGCAACAGAAATAGAACTGGCAAAGAGA TTGAAGGAGTTTTCTCTGAAGGAAACAGGGTTGGATCAGTATGCTATTGCAAAATTTGGT GAAAGTTTTGAAATGGTTCCAAGAACATTGTCTGAAAATGCCGGACTTGGTGCAATGGAG ATAATATCTTCCCTCTATGCTGAGCATGCTGCTGGCAATGTGAAAGTTGGCATTGACTTG GAGAAAGGTGCTTGCGAGGATGCCTCAATCATGAAAATATGGGACCTCTATGTTACAAAG TCCTTCGCTCTAAAATACTCAGCAGATGCTGTATGTACTGTTTTACGGGTTGACCAGATC ATTATGGCAAAGCCAGCAGGAGGGCCGCGGCCCCCCCAACAAGGTGGCATTGATGAAGAC TAG gene2 ATGGTGGAAGACTATCTTGCTATGAATGGAATTAGTATGACGTTGTTCTGTCAGAACAAG GACTCTCTGCACAGGTCCTTGACAGCTAGCTATATCTTTCGAGAGTATGGTGTCGAAGCC ACAAGCTCTGACTGGAAAGGCACTACTCGCCTTGCAATGAAAAGCTGCACCTATCTAGGC GACACGATTGCATTTGATCAGGTCGTGGAATTGCTTGATAAGGGCAAGCCGGTGCTTGCT ATGATGATCATGGGTCCTGATTTTTGGATGCTTGGCGATGACGAAATTTACAAATGCAAA CCTGTCTTCTTGTCAGGAGAGCTCCAAAAGCCATTTGAGTACCACAGGGTACTCCTGACT GGGCATGGAGAGACGGAAGTAACAAAGCGAAAGTTTGTTCCATTCCTCAACTCCCACAGC AACAAGTTTGGGCAGAACGGTGTTGGGCGTGCTTACTTCAAGGACCTGCGCTGGTTTTTC ACTTTGGAGGGCACCTACCCACCGCTTCTTCGACATCAGCCACCTAAAGTTAATGCACTC AACCGACCAAACCTCGACAAACTAAGGCGGCGGCCTGAGTGTGATAATCAGCCAAACCTT GACATACGAAGGTGGCGGCCTGAATGTGATAACCGACGAAACCTTGATAAACCATGGAGG CGGCCTGAGTGTGATAATGGAGCAGTGAAACAAACGCAGGCACCTGGAAAAGCTGACGCT TCCTCTTGGTGGCGTGTTGACATGCCCCAGCCGATTGCTAAGGTGACCAATGGTACTGCT TGGAATGGTACAACCTGTCTCCGTGTTTTAACGAAGGAAATATTTGATCAAGGTTGCAAG CTTATGGCATCTGGAGTGAATGGAATTGATTTGAGAGATGGTATCTGCATGGCAGTTGAT GATGTCATGAGAAACCTGAAGAACATGGCCCGAATGATAAACACTTTCCAGGAAATAGCA CAGGTGGGTACCATATCAGCTGATGGGGAGAAAGAACTTGGCGAGCTCATTGCAAATGCT ATTGTGATGGTTGATAAGGAGGAAATTATCACCATTGTGGATGGTAGCACTCGATGCAGC GAGCTTCAAGTTGTGAAATGCATAAAGCTTGAGAGAGGCTACTTCACTCAACACTTTATT ACCAACCAAAAGTACAAAACATGTGAACTAGATGATCCCTTAATCTTCATACATGATCAG ATGGTATCCAATGTTCAAGCCATCAAAGTGTTGGAGTTTGCTATCACGGAAGGAAGGCCT CTGCTTATTCTTGCAGAAGATGTAGATGCTAAAGCAATAAAAACTATGGTCGCTGAAAAA TATATTAAAGGCATGAAGGTCTGCGCGGTCAAAGTTCCTGGTCATGGAGAGAACATGGAA CTCAATTTACTTGCTATCGTTACAGGCGGGCAAATCTTTACTGAAGAAAATGGAATGAAC CTTTTGCCTCACACACTTGGCACTTGTAAGAAGGTCATAGTATCTGAGGATGACTGCACA ATCTTTGGTGGAGCCGCCGATGATAATACCATCGAGAACAGAACTAAGCAGCTGGGATCT GCAACTGAGAAAAGCACGGGATTGGCAGAAATGTATTGTCGTGCTGCTATTATGAAGATT GGTGGAATGGATAAGCTGGAGATTGGTGAGAAGAAAGACAGAGCGATACATGCACTAAAT GCAACGAAAGCTGCAGTTGAGGAGGGCATTGTACCAGGTGATGGTGTTGCCCTGGTTTAT GCGGCAAAAGACCTGGATAAGCTGGACACTGCAAATGTTGGCCAAAAGAGCGGTGTTCAG ATTATTCAAAATGTCTTGAAGATACCATTTCACACCAATGCTTCAAGTGCTGGTCATGAT 
GGTGATGACATTCTTAGCAAGATCTTAGAGCAGAACAATACTGGCCTGGCATATGATGCT GCCAAAGGTGAATATGCCGATATGCTGGAGGCCGGTTCCATTGAGCCACTTAAAGTGATC AGAACAGCCCTGATGGATGTCCAACGAGAGTCATGTCAAATGTTAACTGACACAGACGGC CACTTGAGCCCGAAGGAAAGAACGCAGAACAAGCAATTTGCGGTGCCATGA gene3 ATGTACCGCGCCGCCGCCAGCCTCGCCTCCAAGGCTCGGCAAGCCGGGAGCAGCGCTCGC CAGGTTGGAAGCAGGCTTGCCTGGAGCAGGAACTATGCCGCCAAAGACATCAGGTTTGGT GTCGAGGCCCGTGCCTCCATGTTGAGGGGTGTCGAGGAGTTGGCAGATGCGGTGAAAGTG ACCATGGGTCCTAAGGGGCGCACTGTGATCATTGAGCAGAGCTTTGGTGCCCCAAAAGTC ACAAAGGATGGTGTGACTGTTGCCAAGAGCATTGAATTTAAAGATAGAGTCAAGAACGTT GGTGCAAGCCTTGTAAAACAGGTTGCTAATGCAACAAATGATACTGCTGGAGATGGTACC ACATGTGCCACTGTTTTGACAAAAGCTATATTTACTGAGGGCTGCAAGTCTGTAGCCGCT GGAATGAATGCAATGGATCTAAGGCGTGGAATCTCAATGGCGGTTGATGATGTTGTGACA AACCTAAAGGGCATGGCTAGAATGATCAGCACTTCAGAGGAAATAGCACAGGTGGGTACA ATATCAGCAAATGGCGAAAGGGAAATCGGTGAGCTGATTGCTAAGGCTATGGAGAAGGTT GGGAAAGAAGGTGTAATCACCATTGTGGTAAGATTTTCTAAGTTCAGATTACCAGAACTA TAA


I just took the sequence of your gene 1 and tried to search for it, because I wanted to see what it looks like. This is what I got:

Server busy, please be patient, 5 running before your job

I doubt that this site is interactive enough to actually process your job once the other 5 are finished. At least nothing has changed in the last 5 minutes. This is going to be dreadful. ---> Oops, now I got a result, so I take that back, it actually does update! However, it says "For cpu reasons it was truncated to 1500nt from the 3'end".

The result: there is a <div id="Sequence0"> which contains <div id=divX> and <div id=SequenceX> elements (where X is a number) that contain a table with the results - so you would have to parse through this.
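
Based on that structure, a minimal parsing sketch with BeautifulSoup could look like the following. The div ids (starting with "Sequence") follow the description above, and the file name gene1.html is just an example; both should be checked against a real result page.

from bs4 import BeautifulSoup

def extract_rows(html):
    # Collect every table row found inside the divs whose id starts with "Sequence".
    soup = BeautifulSoup(html, "html.parser")
    rows = []
    for div in soup.find_all("div", id=lambda i: i and i.startswith("Sequence")):
        for table in div.find_all("table"):
            for tr in table.find_all("tr"):
                cells = [td.get_text(strip=True) for td in tr.find_all(["td", "th"])]
                if cells:
                    rows.append(cells)
    return rows

# Example: print the table of a saved result page as tab-separated values.
with open("gene1.html") as fh:
    for row in extract_rows(fh.read()):
        print("\t".join(row))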


Yes, it is a funny website and that is just how it works. It analyses the whole sequence; the truncation is only done to display the sequence on screen. The table under each link is the result I am interested in. What do you think the approach to automate this should be? I don't think beautifulsoup can do this.


There are multiple ways to do it: link1 link2 link3 - I don't know which one is the easiest for you. (Check my post above again; I had to edit it because the interesting bits were missing, since I posted them without declaring them as code.) These seem to be the divs you are looking for - the tables inside those divs, actually.


Thanks LLTommy, I'll try this and the other things suggested above. I will post the results here if I succeed.


In case anyone is having the same problem: I've got this solved. Here is the script:

import os.path
import time
import urllib
import urllib2


def parse():
    # Read the FASTA file and return a list of [name, sequence] pairs.
    f = open("WHEAT_2000bp_UPSTREAM_SEQUENCES", "r")
    data = []
    dtemp = ["", ""]
    for line in f:
        if line[0] == ">":
            data.append(dtemp)
            dtemp = ["", ""]
            dtemp[0] = line[1:].replace("\n", "")
        else:
            dtemp[1] += line
    data.append(dtemp)  # keep the last record too
    f.close()
    return data[1:]  # drop the empty placeholder appended before the first header


def fetch(fseqname, fseq):
    # Skip sequences that have already been fetched.
    if os.path.exists(fseqname + ".html"):
        return
    url = "http://bioinformatics.psb.ugent.be/webtools/plantcare/cgi-bin/CallMat_NN47.htpl"
    data = urllib.urlencode({"Mode": "readonly",
                             "StartAt": "0",
                             "Field_Sequence": fseq,
                             "Field_SequenceName": fseqname,
                             "Field_SequenceDate": "31/03/2017"})
    req = urllib2.Request(url, data)
    print "Fetching: " + fseqname
    res = urllib2.urlopen(req, timeout=1500)
    f = open(fseqname + ".html", "w")
    f.write(res.read())
    f.close()
    time.sleep(60)  # 60 second pause between requests, as the maintainers asked


if __name__ == "__main__":
    for x in parse():
        try:
            fetch(x[0], x[1])
        except Exception, e:
            print e

Could you please give one example of a result you got from plantCARE?

I found lots of CREs, some with no defined function in the description, so I got confused about which CREs to keep and which to discard.
