I found this paper to be a worthwhile commentary on benchmarking practices in quantum computing. The key contribution is drawing parallels between current quantum computing marketing practices and historical issues in parallel computing benchmarking from the early 1990s.
Main points:

- References David Bailey's 1991 paper "Twelve Ways to Fool the Masses" about misleading parallel computing benchmarks
- Argues that quantum computing faces similar risks of performance exaggeration
- Discusses how the parallel computing community developed standards and best practices for honest benchmarking
- Proposes that quantum computing needs similar standardization
Technical observations:

- The paper does not present new experimental results
- Focuses on benchmarking methodology and reporting practices
- Emphasizes transparency in sharing limitations and constraints
- Advocates for standardized testing procedures
The practical implications are significant for the quantum computing field:

- Need for consistent benchmarking standards across companies and research groups
- Importance of transparent reporting of system limitations
- Risk of eroding public trust through overstated performance claims
- Value of learning from parallel computing's historical experience
TLDR: A commentary paper drawing parallels between quantum computing benchmarking and historical parallel computing benchmarking issues, and arguing for standardized practices to ensure honest performance reporting.