
ICPR 2010 – Contest: Extended Deadline May 26

Call for Contest Participation – Classifier domains of competence: The landscape contest (ICPR 2010)

Classifier domains of competence: The landscape contest is a research competition aimed at investigating the relationship between data complexity and the performance of learners. Comparing your techniques with those of other participants on targeted-complexity problems may help enrich our understanding of the behavior of machine learning techniques and open further research lines.

The contest will take place on August 22, during the 20th International Conference on Pattern Recognition (ICPR 2010) in Istanbul, Turkey.

We encourage everyone to participate and share your work with us! For further details about dates and submission, please see http://www.salle.url.edu/ICPR10Contest/.

SCOPE OF THE CONTEST

The landscape contest involves running and evaluating classifier systems over synthetic data sets. Over the last two decades, the pattern recognition and machine learning communities have developed many supervised learning techniques. Nevertheless, the competitiveness of such techniques has typically been claimed over a small, repeatedly reused set of problems. This contest provides a new and configurable testing framework, reliable enough to test the robustness of each technique and detect its limitations.

INSTRUCTIONS FOR PARTICIPANTS

Contest participants are allowed to use any type of technique. However, we highly encourage and appreciate the use of novel algorithms.

Participants are required to submit the results by email to the organizers.
Submission e-mail: nmacia@salle.url.edu
Submission deadline: Wednesday, May 26, 2010

The contest is divided into two phases: (1) offline test and (2) live test. For the offline test, participants should run their algorithms over two sets of problems, S1 and S2. However, the real competition, the live test, will take place during the conference. Two more collections of problems, S3 and S4, will be presented.

S1: Collection of data sets spread along the complexity space to train the learner. All the instances will be duly labeled.

S2: Collection of data sets spread along the complexity space with no class labeling to test the learner performance.

S3: Collection of data sets with no class labeling, like S2, to be run for a limited period of time.

S4: Collection of data sets with no class labeling covering specific regions of the complexity space to determine the neighborhood dominance.

For the offline test, the results report consists of:

1. Labeling the data sets of the collection S2.

The procedure is the following:

  1. Train the learner using Dn-trn.arff in S1.
  2. Report the rate of correctly classified instances over a 10-fold cross-validation.
  3. Label the corresponding data set Dn-tst.arff in S2.
  4. Store the n models generated for the data sets to perform the live contest on August 22. Be ready to load them on that day.
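The steps above can be sketched as follows. This is a minimal illustration, not part of the contest materials: it uses scikit-learn with a random forest as a stand-in learner, and synthetic data generated in place of the Dn-trn.arff / Dn-tst.arff files (which in practice could be loaded with, e.g., scipy.io.arff.loadarff). All names and parameters here are assumptions for illustration only.

```python
# Hedged sketch of the offline-test procedure:
#   step 1: train a learner on a labeled S1 data set,
#   step 2: compute accuracy over a 10-fold cross-validation,
#   step 3: label the matching unlabeled S2 data set,
#   step 4: keep the fitted model for the live test.
# Synthetic data stands in for the contest's ARFF files.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Stand-ins for Dn-trn.arff (labeled) and Dn-tst.arff (features only).
X_trn, y_trn = make_classification(n_samples=300, n_features=10, random_state=0)
X_tst, _ = make_classification(n_samples=100, n_features=10, random_state=1)

clf = RandomForestClassifier(random_state=0)

# Step 2: rate of correctly classified instances, 10-fold cross-validation.
cv_accuracy = cross_val_score(clf, X_trn, y_trn, cv=10).mean()

# Steps 3-4: fit on the full training set, label the test set,
# and retain the fitted model (e.g., pickle it) for the live contest.
clf.fit(X_trn, y_trn)
predicted_labels = clf.predict(X_tst)

print(f"10-fold CV accuracy: {cv_accuracy:.3f}")
print(f"Labeled {len(predicted_labels)} instances")
```

In a real submission, the predicted labels for each Dn-tst.arff would be written out and emailed to the organizers, and one model per data set would be saved for reloading during the live session.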

2. Describing the techniques used.

A brief summary (1-2 pages) of the machine learning technique(s) used in the experiments must be submitted. We expect details such as the learning paradigm, configuration parameters, strengths and limitations, and computational cost.

IMPORTANT DATES

* May 26, 2010: Deadline for submission of the results and technical report

* May 29, 2010: Notification of participation

* Aug 22, 2010: Release of S3 and S4

* Aug 22, 2010: ICPR 2010 – Interactive Session


CONTACT DETAILS

Dr. Tin Kam Ho – tkh at research.bell-labs.com
Núria Macià – nmacia at salle.url.edu
Prof. Albert Orriols Puig – aorriols at salle.url.edu
Prof. Ester Bernadó Mansilla – esterb at salle.url.edu
