RSCTC 2010 Discovery Challenge: Advanced Track

Status: Closed
Type: Scientific
Start: 2009-12-01 00:00:00 CET
End: 2010-02-28 23:59:59 CET
Prize: $2,000

Registration is required.

Summary

RSCTC 2010 Discovery Challenge is OVER

The data mining challenge attracted more than 200 participants, of whom nearly 100 submitted solutions, most of them many times. The best algorithms achieved 74-76% accuracy in predicting diseases from genetic profiles - a very high result, taking into account the difficulty of the problem (multi-class classification with up to 10 different classes, a small number of objects, a large number of attributes) and the form of the quality measure, which paid more attention to the smallest and most difficult classes. The winning solutions also scored much better than the baseline algorithms: the simple one (28%) and the intelligent one (65%). We thank all participants for their effort and congratulate the winners!
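
For a concrete sense of this measure, the sketch below assumes it is balanced accuracy - the mean of per-class accuracies - which matches the description above: every class contributes equally, so a classifier that ignores a small class pays heavily. The class names and toy data here are ours; this is not the official evaluation code from Eval_RSCTC2010.jar.

    import java.util.HashMap;
    import java.util.Map;

    /** Minimal sketch of balanced accuracy: the mean of per-class accuracies.
     *  Unlike plain accuracy, every class contributes equally, so a classifier
     *  that ignores small classes is penalized heavily. */
    public class BalancedAccuracy {

        static double balancedAccuracy(int[] truth, int[] predicted) {
            Map<Integer, Integer> total = new HashMap<>();    // samples per class
            Map<Integer, Integer> correct = new HashMap<>();  // correct per class
            for (int i = 0; i < truth.length; i++) {
                total.merge(truth[i], 1, Integer::sum);
                if (truth[i] == predicted[i])
                    correct.merge(truth[i], 1, Integer::sum);
            }
            double sum = 0.0;
            for (Map.Entry<Integer, Integer> e : total.entrySet())
                sum += correct.getOrDefault(e.getKey(), 0) / (double) e.getValue();
            return sum / total.size();
        }

        public static void main(String[] args) {
            // Toy data: class 0 has 8 samples, class 1 only 2.
            int[] truth    = {0,0,0,0,0,0,0,0,1,1};
            int[] majority = {0,0,0,0,0,0,0,0,0,0}; // always predict the big class
            int[] better   = {0,0,0,0,0,0,0,0,1,0};
            System.out.println(balancedAccuracy(truth, majority)); // 0.5
            System.out.println(balancedAccuracy(truth, better));   // 0.75
        }
    }

On the toy data, the majority-class predictor scores 0.8 in plain accuracy but only 0.5 in balanced accuracy, which illustrates why trivial baselines score so poorly under a measure of this kind.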

Winners of the RSCTC 2010 Discovery Challenge

From left: Guoyin Wang, Huan Luo, ChuanJiang Luo (Advanced Track), Vladimir Nikulin (Basic Track), Marcin Wojnarski (Chair). Read the interview!

The Winners

Advanced Track:

  1. ChuanJiang Luo, Ze Chen, Feng Hu, Guoyin Wang, Lihe Guan, Institute of Computer Science and Technology, Chongqing University of Posts and Telecommunications, China (RoughBoy)
  2. Huan Luo, Juan Gao, Feng Hu, Guoyin Wang, Yuanxia Shen, Institute of Computer Science and Technology, Chongqing University of Posts and Telecommunications, China (ChenZe)
  3. wulala

Basic Track:

  1. Vladimir Nikulin, Department of Mathematics, University of Queensland, Australia (UniQ) - read the interview!
  2. Matko Bošnjak, Dragan Gamberger, Ruđer Bošković Institute, Croatia (RandomGuy)
  3. Ryoji Yanashima, Keio University, Japan (yanashi)

Full results can be found on Leaderboards: Advanced and Basic.

Results of ensemble classifiers combining a number of top solutions from the Basic Track can be found in the Knowledge Base. See also the descriptions of the files: best07, best54, best92.
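
The combination scheme behind these ensemble files is not detailed here; the sketch below only illustrates the simplest way of combining several solutions' decisions, plain majority voting, under assumed names and toy data.

    import java.util.HashMap;
    import java.util.Map;

    /** Hypothetical sketch of combining several solutions by majority vote.
     *  predictions[k][i] is the decision of solution k on test sample i. */
    public class MajorityVote {

        static int[] combine(int[][] predictions) {
            int n = predictions[0].length;
            int[] result = new int[n];
            for (int i = 0; i < n; i++) {
                Map<Integer, Integer> votes = new HashMap<>();
                for (int[] solution : predictions)
                    votes.merge(solution[i], 1, Integer::sum);
                // pick the decision with the most votes
                int best = predictions[0][i], bestVotes = -1;
                for (Map.Entry<Integer, Integer> e : votes.entrySet())
                    if (e.getValue() > bestVotes) { bestVotes = e.getValue(); best = e.getKey(); }
                result[i] = best;
            }
            return result;
        }

        public static void main(String[] args) {
            // Three solutions' decisions on five test samples:
            int[][] preds = { {1,2,2,0,1}, {1,2,0,0,1}, {2,2,0,1,1} };
            System.out.println(java.util.Arrays.toString(combine(preds))); // [1, 2, 0, 0, 1]
        }
    }

Weighted voting, where better-ranked solutions receive larger weights, is a natural refinement of the same idea.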

We may now reveal that the competition data sets described the following diseases and diagnostic problems: acute lymphoblastic leukemia, human glioma, gingival tissue diseases, heart failure, brain cancer, systemic inflammatory response syndrome (SIRS), sepsis, response to anthracycline/taxane chemotherapy, Burkitt's lymphoma, hepatocellular carcinoma, ovarian tumour and multiple human cancer types.

Post-challenge Research

All challenge resources are now published in the challenge folder in the Repository, and everyone can use them as a basis for new research.

How to Cite

If you employ any of the RSCTC 2010 Discovery Challenge resources in your published work, please cite the following paper:

RSCTC'2010 Discovery Challenge: Mining DNA Microarray Data for Medical Diagnosis and Treatment.
Wojnarski M., Janusz A., Nguyen H.S., Bazan J., Luo C., Chen Z., Hu F., Wang G., Guan L., Luo H., Gao J., Shen Y., Nikulin V., Huang T.-H., McLachlan G.J., Bošnjak M. and Gamberger D. In: Rough Sets and Current Trends in Computing (RSCTC 2010), LNAI 6086, Springer, Heidelberg, pp. 4-19.

Post-challenge Submissions

Post-challenge submissions per se are not supported. Instead, TunedIT provides a versatile framework for evaluating new solutions against any evaluation procedure and dataset, not only the ones used in the challenge. Thus, you may prepare derived versions of the challenge resources and compare test results with the original ones. You may also control access rights to your resources and remove solutions you have previously uploaded. To use the framework, upload a solution file to your home folder in the Repository and evaluate it with TunedTester, entering the following names of the evaluation procedure and datasets in the TT window, together with the full name of your algorithm as shown on its Repository page (a hypothetical sketch of an Advanced Track solution follows the lists below):

For Basic Track:

  • RSCTC/2010/Eval_RSCTC2010.jar:rsctc2010.EvalDecisions
  • RSCTC/2010/B/preliminary.zip
  • RSCTC/2010/B/final.zip

For Advanced Track:

  • RSCTC/2010/Eval_RSCTC2010.jar:rsctc2010.EvalCode
  • RSCTC/2010/A/preliminary.zip
  • RSCTC/2010/A/final.zip
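
For orientation, an Advanced Track solution was submitted as code and evaluated by rsctc2010.EvalCode. The exact Java interface such a solution must implement is defined by Eval_RSCTC2010.jar and is not reproduced here; the sketch below is only a hypothetical illustration of the train/classify contract, with a nearest-centroid rule standing in for a real algorithm, and all class and method names are ours.

    /** Hypothetical shape of an Advanced Track solution: train on labeled
     *  expression profiles, then classify unlabeled ones. The real interface
     *  is dictated by Eval_RSCTC2010.jar; nearest centroid is only a stand-in. */
    public class NearestCentroidSolution {

        private double[][] centroids; // one mean expression profile per class

        /** Learn a centroid for each class from labeled training profiles. */
        public void train(double[][] samples, int[] labels) {
            int numClasses = java.util.Arrays.stream(labels).max().orElse(0) + 1;
            int dim = samples[0].length;
            centroids = new double[numClasses][dim];
            int[] counts = new int[numClasses];
            for (int i = 0; i < samples.length; i++) {
                counts[labels[i]]++;
                for (int j = 0; j < dim; j++)
                    centroids[labels[i]][j] += samples[i][j];
            }
            for (int c = 0; c < numClasses; c++)
                if (counts[c] > 0)
                    for (int j = 0; j < dim; j++)
                        centroids[c][j] /= counts[c];
        }

        /** Assign a test profile to the class with the nearest centroid. */
        public int classify(double[] sample) {
            int best = 0;
            double bestDist = Double.MAX_VALUE;
            for (int c = 0; c < centroids.length; c++) {
                double d = 0;
                for (int j = 0; j < sample.length; j++) {
                    double diff = sample[j] - centroids[c][j];
                    d += diff * diff;
                }
                if (d < bestDist) { bestDist = d; best = c; }
            }
            return best;
        }

        public static void main(String[] args) {
            NearestCentroidSolution s = new NearestCentroidSolution();
            s.train(new double[][]{{0,0},{1,1},{9,9},{10,10}}, new int[]{0,0,1,1});
            System.out.println(s.classify(new double[]{8.5, 9.0})); // 1
        }
    }

Nearest centroid is a deliberately simple choice: with thousands of attributes and few objects, averaging profiles per class keeps the model stable, which is one reason such baselines are often hard to beat on microarray data.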

If you wish, you may choose to send results to the Knowledge Base, where they can afterwards be viewed and analyzed by everyone on KB pages. For example, the following links generate post-challenge leaderboards with results corresponding to exactly the same evaluation setup as used in the challenge (the same evaluation procedures and datasets):

All results related to the challenge resources, possibly involving some other resources as well (e.g., the same evaluation procedure but a different dataset), are listed here:

When you open a KB page, notice that the drop-down lists may contain several evaluation procedures or datasets; every choice generates a different chart. There is also a Raw Results tab, where all results are presented in tabular form and can be downloaded as CSV files. The Raw Results tab also contains hyperlinks to the Repository pages of particular algorithms and datasets.
