A method to benchmark the balance resilience of robots

Front Robot AI. 2023 Jan 20;9:817870. doi: 10.3389/frobt.2022.817870. eCollection 2022.

Abstract

Robots that work in unstructured scenarios are often subjected to collisions with the environment or with external agents. Accordingly, researchers have recently focused on designing robust and resilient systems. This work presents a framework that quantitatively assesses the balance resilience of self-stabilizing robots subjected to external perturbations. Our proposed framework consists of a set of novel Performance Indicators (PIs), experimental protocols for the reliable and repeatable measurement of the PIs, and a novel testbed to execute the protocols. The design of the testbed, the control structure, the post-processing software, and all the documentation related to the performance indicators and protocols are provided as open-source material so that other institutions can replicate the system. As an example of the application of our method, we report an experimental campaign of more than 1,100 tests on a two-wheeled humanoid robot. The investigation demonstrates the high repeatability of the method and its efficacy in executing reliable and precise perturbations.
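To illustrate the kind of post-processing a balance-resilience PI involves, the sketch below computes a hypothetical recovery-time indicator from a tilt signal recorded after a perturbation. This is not one of the paper's actual PIs; the threshold, hold window, and signal are illustrative assumptions only.

```python
import numpy as np

def recovery_time(t, tilt_deg, threshold_deg=2.0, hold_s=1.0):
    """Hypothetical PI: time after the perturbation at which the tilt
    stays below `threshold_deg` for at least `hold_s` seconds, or None
    if the robot never settles within the recording."""
    dt = float(np.mean(np.diff(t)))
    hold_n = max(1, int(round(hold_s / dt)))
    below = np.abs(tilt_deg) < threshold_deg
    for i in range(len(below) - hold_n + 1):
        if below[i:i + hold_n].all():
            return t[i] - t[0]
    return None

# Synthetic example: a decaying oscillation standing in for post-impact pitch data.
t = np.linspace(0.0, 5.0, 501)
tilt = 10.0 * np.exp(-1.5 * t) * np.cos(8.0 * t)
print(f"recovery time: {recovery_time(t, tilt):.2f} s")
```

In a benchmarking protocol, an indicator of this form would be averaged over many repeated, identical perturbations so that the repeatability of the testbed directly determines the reliability of the reported PI.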

Keywords: benchmarking method; performance assessment; robots balance; robustness; self-stabilizing robots.

Grants and funding

This work was supported by the European Union's Horizon 2020 research and innovation programme under the projects Eurobench (grant agreement No. 779963) and Natural Intelligence (grant agreement No. 101016970). The content of this publication is the sole responsibility of the authors. The European Commission and its services cannot be held responsible for any use that may be made of the information it contains.