a511512345 · 6.9 years ago:
Hello everyone, I am learning SVM by following http://compdiag.molgen.mpg.de/ngfn/docs/2005/sep/exercises-classif.pdf. Since I am a complete novice, I have run into the following questions while working through the exercises:

1. How do I select and output the informative genes in the resulting model?
2. How many genes do I need to still get a reasonable CV error?
3. How do I use the model to do ROC curve analysis?

I am very much looking forward to your reply (and a script). Thank you very much!
Please validate (by up-voting and/or accepting) previous answers to your questions, or at least acknowledge them by replying; several answers to your questions have not yet received a response.
Dear Kevin Blighe, thank you for your help and reminders. In fact, I have tried to reply every time you have helped me, but for some reason my replies do not always go through, perhaps because I am in China and my connection to Biostars is poor. I feel guilty that I could not let you know in time how much you have helped me learn, and I hope you will continue to help me.
No problem. Have you read the entire manual?
Sincere thanks; you always help me in such detail. As for the SVM, I have read the manual thoroughly. It is frustrating that I still do not fully understand it and do not know which important genes to choose, probably because of my extremely poor programming ability as a medical doctor. Thank you again!
The tutorial uses multiple indicators to choose important genes, but the one most relevant to the tutorial is a low error rate after cross-validation of the linear-kernel SVM.
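To make this concrete: the tutorial itself works in R, but here is a minimal illustrative sketch in Python with scikit-learn (not the tutorial's actual code) of how one might rank genes by the magnitude of their linear-SVM weights, which is one way to answer question 1. The expression matrix, labels, and gene names below are synthetic placeholders.

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic placeholder data: 60 samples x 500 genes, two classes.
# In practice X is your expression matrix (samples x genes) and y the labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))
y = np.array([0] * 30 + [1] * 30)
gene_names = [f"gene_{i}" for i in range(X.shape[1])]

# Fit a linear-kernel SVM. With a linear kernel the decision function is
# w . x + b, so every gene gets exactly one weight w_i.
model = SVC(kernel="linear")
model.fit(X, y)

# Rank genes by |w_i|: a larger absolute weight means the gene contributes
# more to separating the two classes, i.e. it is more "informative".
weights = model.coef_.ravel()
order = np.argsort(np.abs(weights))[::-1]

# Print the 10 most informative genes in this (synthetic) model.
for i in order[:10]:
    print(f"{gene_names[i]}\t{weights[i]:+.3f}")
```

Note that `coef_` is only available for the linear kernel; for non-linear kernels there is no per-gene weight vector to inspect.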
Cross-validation repeats the SVM fit multiple times (here, 10 times) on different splits of the data and then compares the results across repeats. This is what allows error estimates to be produced: if a gene behaves very similarly in each cross-validation fold, its error will be low; if it behaves very differently from fold to fold, its error will be high.
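Continuing the same illustrative Python/scikit-learn sketch (again, the tutorial uses R, so treat this only as a demonstration of the idea), the snippet below touches questions 2 and 3: it estimates the 10-fold cross-validation error while keeping only the top-k ranked genes, and then computes a ROC curve from cross-validated decision scores. All data are the same synthetic placeholders as before.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score, cross_val_predict
from sklearn.metrics import roc_curve, auc

# Same synthetic placeholder data as in the previous sketch.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 500))
y = np.array([0] * 30 + [1] * 30)

# Rank genes once by linear-SVM weight magnitude.
w = SVC(kernel="linear").fit(X, y).coef_.ravel()
order = np.argsort(np.abs(w))[::-1]

# Question 2: how does the 10-fold CV error change as fewer genes are kept?
# Caveat: ranking the genes on ALL samples before cross-validating, as done
# here for brevity, gives an optimistic (selection-biased) error estimate.
for k in (500, 100, 50, 20, 10, 5):
    acc = cross_val_score(SVC(kernel="linear"), X[:, order[:k]], y, cv=10).mean()
    print(f"top {k:>3} genes: CV error = {1 - acc:.3f}")

# Question 3: ROC analysis. Using cross-validated decision scores means the
# curve is not drawn from predictions on the training data itself.
scores = cross_val_predict(SVC(kernel="linear"), X, y, cv=10,
                           method="decision_function")
fpr, tpr, _ = roc_curve(y, scores)
print(f"AUC = {auc(fpr, tpr):.3f}")
```

In a real analysis the gene ranking should be redone inside each cross-validation fold (for example with a scikit-learn Pipeline) to avoid the selection bias noted in the comments.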
谢谢 (Chinese for "thank you")
OK, I will dig in again and study hard; I hope to learn it! Thank you. And by the way, your Chinese is very good, ha ha ha. Thank you!