CDC'23 Workshop on Benchmarking, Reproducibility, and Open-Source Code in Controls
Overview: In recent years, the scientific community has grown more cognizant of the importance and challenges of transparent, reproducible research. This topic has become increasingly important with the rise of complex algorithms (e.g., machine learning models or optimization-based algorithms), which cannot be adequately documented in standard publications alone. Benchmarking and code sharing are two key instruments researchers use to improve reproducibility.

Benchmarks have played a critical role in advancing the state of the art in machine learning research. Analogously, well-established benchmarks in controls could enable researchers to compare the effectiveness of different control algorithms. Currently, only a few benchmarks are available for comparing control algorithms (e.g., the Autonomie simulation model of a Toyota Prius or the Robotarium shared experimental testbed). Comparisons are further limited by the modest number of open-source implementations of control algorithms.

Over the six-year period 2016-2021, we found that the percentage of CDC papers with code more than doubled. However, at CDC 2021 only 2.6% of publications had code, compared to around 5% at the robotics conference ICRA and over 60% at the machine learning conference NeurIPS. These trends are encouraging, but much work remains to promote reproducible research that accelerates innovation. Benchmarking and releasing code alongside papers can serve as a critical first step in this direction.

Call for Contributions:

We invite submissions of short abstracts (maximum 300 words) on efforts and challenges in improving the accessibility and reproducibility of control research through benchmarks, open-source code, software/hardware platforms, and educational content. Submissions are due September 18 at 23:59 PST. Accepted abstracts will be presented as lightning talks at the workshop. Topics of interest include, but are not limited to:

- Any open-source implementation of control algorithms
- Any benchmarks and comparisons of control approaches, or competitions
- Tools and software/hardware platforms that enable accessible and reproducible research
- Tutorials/lectures on reproducible research best practices and standards
- Educational resources to make control theory accessible

By bringing together researchers working on these topics, we aim to improve the accessibility, reproducibility, comparability, usability, and visibility of research in control theory. We look forward to your contributions!

Submission form fields:

- First name (required)
- Last name (required)
- Email address (required)
- Affiliation (required)
- Theme where your work fits in (required)
- Short abstract (maximum 300 words) (required)
- Link to GitHub or supplementary materials (optional)
This form was created inside of UTIAS Robotics.