In addition to submitting your classification results, we require participants to submit a 4-page paper in the appropriate ISBI style explaining their algorithm (download templates here or here). Submissions without a 4-page paper will not be evaluated.
Please submit a single ZIP file that contains:
Your classification results, stored in a CSV file. The format of a submitted CSV file should exactly match the format of the stage_labels.csv file in the training folder. Please see the submission_example.csv example file in the testing folder. Although we will only use your assigned pN stages in the evaluation, we ask you to also add your slide-level classification results (i.e. the macro/micro/itc/negative labels) to the CSV file to help us identify typical false positives or false negatives among algorithms as well as any possible mistakes in the reference data.
Your 4-page paper, submitted as either a PDF or an MS Word document.
For the evaluation we will use the evaluate.py Python script that is shared in the testing folder.
Help is accessible via the --help option. The script expects two options, --submission and --ground_truth, each containing a path to a CSV file in the format described above. The files are processed in a case-sensitive manner: the assigned pN stages must be exactly pN0, pN0(i+), pN1mi, pN1, or pN2, and the ZIP file names must exactly match the names of the shared files.
python evaluate.py --ground_truth //CAMELYON17/validation.csv --submission //CAMELYON17/results.csv
Calculate inter-annotator agreement.
Ground truth: //CAMELYON17/validation.csv
Submission: //CAMELYON17/results.csv
Score: 0.944852941176
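The score printed above is an inter-annotator agreement measure over the five pN stages. Assuming the script computes quadratic-weighted Cohen's kappa (evaluate.py itself remains the authoritative implementation), the calculation can be sketched as:

```python
# Sketch of quadratic-weighted Cohen's kappa over the five pN stages.
# This is an illustration of the agreement measure, not a copy of evaluate.py.
import numpy as np

STAGES = ["pN0", "pN0(i+)", "pN1mi", "pN1", "pN2"]

def quadratic_weighted_kappa(truth, submission):
    n = len(STAGES)
    idx = {s: i for i, s in enumerate(STAGES)}
    # Observed confusion matrix between ground truth and submission.
    O = np.zeros((n, n))
    for t, s in zip(truth, submission):
        O[idx[t], idx[s]] += 1
    # Quadratic disagreement weights: zero on the diagonal, growing
    # with the squared distance between stage indices.
    W = np.array([[(i - j) ** 2 for j in range(n)] for i in range(n)],
                 dtype=float) / (n - 1) ** 2
    # Expected confusion matrix under chance, from the marginals.
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()
```

Perfect agreement yields a kappa of 1.0; disagreements are penalized more heavily the further apart the two assigned stages are.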
The deadline for the workshop was April 6, 2017, 23:59 (CEST). The submission is now reopened; all new submissions are collected on a separate leaderboard.
Submit your results here.