Graphical User Interface (GUI) testing has long been a significant topic in the software engineering community. Existing GUI testing frameworks are intrusive and support only a limited set of specific platforms. With the emergence of distinct application scenarios, diverse embedded systems and customized operating systems on different devices cannot support existing intrusive GUI testing frameworks. Some approaches adopt robotic arms to replace interface invocation in mobile app testing and use computer vision technologies to identify GUI elements. However, several challenges remain unsolved in such approaches. First, existing approaches assume that GUI screens are fixed, so they cannot be adapted to diverse embedded systems with different screen conditions. Second, existing approaches use an XY-plane robotic arm system, which cannot flexibly simulate human testing operations. Third, existing approaches focus only on crash bugs and ignore the compatibility bugs of apps. In summary, a more practical approach is required for the non-intrusive scenario.
To address these challenges, we propose RoboTest, a practical non-intrusive GUI testing framework with visual-based robotic arms. RoboTest utilizes a novel GUI screen detection algorithm that adapts to screens of different sizes. It then extracts GUI widgets from the detected GUI screen and applies a complete set of widely used testing operations with a 4-DOF robotic arm, which can more effectively and flexibly simulate human testing operations. During app exploration, RoboTest integrates a specially designed Principle of Proximity-guided (PoP-guided) exploration strategy, which selects widgets close to the previous operation target to reduce robotic arm movement overhead and improve exploration efficiency. Moreover, RoboTest can effectively detect compatibility bugs beyond crash bugs by comparing GUI screens on different devices after the same test operations. We implement RoboTest and evaluate it with 20 real-world mobile apps, and we also conduct a case study on a representative industrial embedded system. The results show that RoboTest can effectively, efficiently, and generally explore the app under test (AUT) to find bugs, while reducing the extra exploration time overhead introduced by robotic arm movement compared with baseline non-intrusive approaches.
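The abstract does not give implementation details of the PoP-guided strategy, but its core idea, preferring the candidate widget nearest to the previous operation target so the arm travels less, can be illustrated with a minimal sketch. The function name `pop_select` and the toy widget coordinates are our own assumptions, not part of RoboTest.

```python
import math

def pop_select(prev_target, candidates):
    """Illustrative sketch of a PoP-guided choice: pick the candidate
    widget whose center is closest to the previous operation target,
    minimizing robotic-arm travel between consecutive operations.

    prev_target: (x, y) of the last tapped widget.
    candidates:  list of (x, y) centers of widgets on the current screen.
    """
    return min(candidates, key=lambda c: math.dist(prev_target, c))

# Hypothetical widget centers detected on the current screen.
widgets = [(10, 10), (300, 420), (40, 25)]
print(pop_select((30, 30), widgets))  # → (40, 25)
```

A real implementation would also balance proximity against exploration coverage (e.g., skipping already-visited widgets), but the distance-minimizing selection above captures the movement-overhead intuition.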
We propose a practical non-intrusive GUI exploration testing framework with a visual-based robotic arm.
We propose a set of GUI screen and GUI widget identification techniques designed for the robotic arm scenario to improve the generalizability of RoboTest.
We design a set of robotic arm movements that can flexibly and effectively simulate human testing operations, together with a PoP-guided exploration strategy to improve GUI exploration efficiency.
We introduce a novel comparison-based GUI compatibility bug detection method that is practical for detecting bugs beyond crashes.
We conduct experiments on real-world apps and a case study on a representative industrial embedded system to demonstrate the effectiveness, efficiency, and generalizability of RoboTest.