# Challenge Phases & Rules
The MIDOG 2025 Challenge is structured into three key phases to ensure fair, robust, and reproducible evaluation of mitosis detection and classification algorithms.
## Phase 1 – Training Phase
- Access to the training data is provided. The data spans multiple domains and includes both whole slide images and hotspot regions of interest. See our dataset page.
- Participants develop and train their algorithms using the provided data and other publicly available, open datasets (private datasets are not allowed).
- The focus is on creating models that generalize well across staining, scanning, and tissue variability.
## Phase 2 – Technical Validation Phase
- A preliminary evaluation set is made available for evaluating submissions on the grand-challenge.org platform.
- Participants are required to submit their algorithms as Docker containers (a minimal entry-point sketch follows this list).
- Only one submission per team per day is permitted. Attempts to circumvent this limit (e.g., by setting up additional teams) will result in exclusion from the challenge.
- This phase ensures:
    - Technical compatibility
    - Successful inference execution
    - Preliminary feedback (not used for ranking)
- ⚠️ Note: This is not the final test set and has no bearing on the final ranking.
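The exact submission interface is defined by the grand-challenge.org platform and the organizers; the following is only a minimal sketch of a fully automated container entry point, in which the mount points (`/input`, `/output`), the output filename, and the `detect_mitoses` helper are all assumptions for illustration.

```python
# entrypoint.py -- minimal sketch of a container entry point.
# Paths, filenames, and detect_mitoses() are assumptions, not the official
# MIDOG 2025 / grand-challenge interface.
import json
from pathlib import Path

INPUT_DIR = Path("/input")    # assumed mount point for the evaluation images
OUTPUT_DIR = Path("/output")  # assumed mount point for the predictions


def detect_mitoses(image_path: Path) -> list[dict]:
    """Hypothetical model call: return one record per detected mitotic figure."""
    # Load the bundled model once and run tiled/sliding-window inference here.
    return [{"image": image_path.name, "x": 0.0, "y": 0.0, "probability": 0.0}]


def main() -> None:
    predictions = []
    # Fully automated: process whatever the platform mounts, no manual steps.
    for image_path in sorted(INPUT_DIR.glob("*.tiff")):
        predictions.extend(detect_mitoses(image_path))

    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    # The required output schema is challenge-specific; JSON is used here
    # only as a placeholder.
    (OUTPUT_DIR / "mitotic-figures.json").write_text(json.dumps(predictions))


if __name__ == "__main__":
    main()
```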
## Phase 3 – Final Submission Phase
- Participants submit final Docker containers for evaluation on the hidden test set.
- Submissions must be self-contained, fully automated, and reproducible (a sketch of loading bundled model weights offline follows this list).
- Participants are requested to publish a brief description (approx. 2 pages) of their method and results on a preprint server (e.g., arxiv.org or medrxiv.org) together with their submission. We provide a template (double column, IEEE style) for this. There is no explicit page limit, but the preprint must fully describe the participating team's approach. Participants of both tracks can combine both approaches in a single preprint.
- Only one submission per team is allowed.
- Final evaluation is performed centrally on the platform under standardized conditions.
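Because the container must run with no internet access, model weights have to be bundled inside the image and loaded from a local path. A minimal sketch, assuming PyTorch, a checkpoint baked into the image at `/opt/algorithm/model.pth`, and a placeholder architecture (all of these names are illustrative, not part of the challenge specification):

```python
# load_model.py -- sketch of loading bundled weights without network access.
import os

# If Hugging Face hub utilities are used anywhere, force offline mode before
# they are imported so that an accidental download attempt fails immediately.
os.environ.setdefault("HF_HUB_OFFLINE", "1")

import torch
import torch.nn as nn

WEIGHTS_PATH = "/opt/algorithm/model.pth"  # assumed path inside the image


def load_model() -> nn.Module:
    # Placeholder architecture; a team would use its actual detector here.
    model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.Conv2d(16, 1, 1))
    # Local file only -- nothing is fetched at run time.
    state_dict = torch.load(WEIGHTS_PATH, map_location="cpu")
    model.load_state_dict(state_dict)
    model.eval()  # the final submission performs inference only
    return model
```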
## Results & Presentation
- Final results will be presented at the MICCAI 2025 MIDOG workshop (September 23, 2025).
- Top-performing methods will be highlighted, and participants will be invited to present their approaches.
## Key Rules
- Participation: Teams may participate in track 1, track 2, or both.
- Reproducibility: All submissions must be made as Docker containers and operate without internet access.
- Fair Play: Excessive use of GPU resources is prohibited.
- Transparency: Abstracts describing each method are required alongside the final submission.
- Conflicts of Interest: Researchers affiliated with the organizers' institutions are not allowed to participate, to avoid potential conflicts of interest.
Have a look at the complete list of rules here.
## Publication policy
- Participants may publish papers that include their official performance on the challenge data set, provided the challenge is properly referenced. There is no embargo period.
- We aim to publish a summary of the challenge in a peer-reviewed journal. Participating teams are free to publish their own results in a separate publication.
🔗 Full details: https://midog2025.deepmicroscopy.org
💬 Questions? Join our Discord community