Our system block diagram is shown to the left. The main computing device of our system is the STM32F767ZI microcontroller. It was chosen because it is powerful enough to process the information from the distance sensors with minimal latency. For our distance sensors, we used VL53L0X time-of-flight optical sensors, which can measure absolute distances up to 2 meters. Communication between the sensors and the STM32 is handled by 4 independent I2C buses on the chip, each handling seven or eight sensors.
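As a rough sketch of how the sensors can be balanced across the buses (the sensor count of 30 and all names are illustrative assumptions, not taken from the actual firmware), a simple round-robin assignment yields seven or eight sensors per bus:

```python
# Hypothetical sketch: spread the VL53L0X sensors across the 4 I2C buses
# so that each bus carries seven or eight sensors. The total of 30 sensors
# is an assumed example value.

def assign_sensors_to_buses(num_sensors: int, num_buses: int) -> list[list[int]]:
    """Round-robin sensor IDs across buses for a balanced per-bus load."""
    buses = [[] for _ in range(num_buses)]
    for sensor_id in range(num_sensors):
        buses[sensor_id % num_buses].append(sensor_id)
    return buses

buses = assign_sensors_to_buses(30, 4)
# With 30 sensors and 4 buses, the buses carry 8, 8, 7, and 7 sensors.
```

Keeping the buses evenly loaded matters here because each bus is polled independently, so the slowest (most loaded) bus sets the overall sensor update rate.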
Communication with the UR arm goes through a UART connection to a Windows control system, which communicates with the arm through the RTDE interface. This connection provides the ability to set control inputs on the arm and to receive information such as joint positions and angles. This arm information is used to create a visualization of the arm and to detect nearby objects in real time.
The diagram on the bottom shows the interfaces within the software system. See the Software System section below for more details.
There are several requirements for any system developed for this purpose. The first and most obvious requirement is that the object detection system actually prevents collisions with objects and does not perform any unexpected movements without operator control, such as a sudden jerk of motion or moving in a completely unexpected direction. Along the same lines, there should be no purely autonomous motion. The robot currently can only move with direct interaction from an operator, and it has been reiterated that it must stay that way.
Since the system will be used in a surgical environment, all parts have to be approved for medical environments. In the case of optical sensing, this can mean, for example, that no high-powered lasers may be used.
The SafeVision software system consists of four key parts: object detection, visualization, control, and Android app communication.
Object detection is responsible for determining which sensors are important based on the arm's position and direction of movement. It updates the current arm status using the movement and sensor distance data. An object detected in the arm's path of motion triggers the system to stop the arm and prevent a collision.
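A minimal sketch of this decision logic, assuming each sensor reports one distance in millimeters and is tagged with the unit direction it faces (the threshold value and all names are illustrative, not the tuned parameters of the real system):

```python
# Illustrative sketch of the object-detection decision. The 150 mm stop
# threshold is an assumed example value, not the system's tuned threshold.

STOP_THRESHOLD_MM = 150

def select_active_sensors(sensor_directions: dict[int, tuple[float, float, float]],
                          motion_dir: tuple[float, float, float]) -> list[int]:
    """Keep only sensors facing roughly the direction the arm is moving."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    return [sid for sid, d in sensor_directions.items() if dot(d, motion_dir) > 0]

def should_stop(readings_mm: dict[int, int], active: list[int]) -> bool:
    """Stop the arm if any sensor in the motion path sees a close object."""
    return any(readings_mm[sid] < STOP_THRESHOLD_MM for sid in active)
```

Filtering to the sensors facing the direction of motion is what makes the check cheap: only a handful of readings must be compared against the threshold each cycle, and an object behind the arm cannot trigger a false stop.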
The visualization shows a 3D model of the UR arm and the location of any detected object in space.
The control system communicates with the UR arm through the Real-Time Data Exchange (RTDE) interface to update registers and receive information such as arm positions and velocities. The arm is started and stopped through the control system after the arm status updates.
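As a hedged sketch of the start/stop path (the register index, the 1 = run / 0 = stop encoding, and the fake register class are all assumptions for illustration, not the project's actual RTDE register map):

```python
# Illustrative sketch: map the detection status and operator command onto
# an RTDE input register that arm-side logic could poll. Register index 0
# and the RUN/STOP encoding are assumed example values.

RUN = 1
STOP = 0

def control_register_value(object_detected: bool, operator_running: bool) -> int:
    """The arm runs only when the operator commands it AND the path is clear."""
    return RUN if (operator_running and not object_detected) else STOP

class FakeRTDERegisters:
    """Stand-in for the RTDE input-register interface, for offline testing."""
    def __init__(self):
        self.regs = {}
    def write(self, index: int, value: int):
        self.regs[index] = value

rtde_regs = FakeRTDERegisters()
# An object in the path overrides the operator's run command.
rtde_regs.write(0, control_register_value(object_detected=True, operator_running=True))
```

The key property of this logic is that stopping always wins: a detected object forces STOP regardless of the operator's input, which matches the requirement that the system prevents collisions without ever initiating motion on its own.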
The Android app acts as a remote control for the whole SafeVision software system. It receives the arm's positions and velocities and can start the arm from the app.
Another part of the software is an object simulator, used for testing or remote work when the UR arm is not available.
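A minimal sketch of such a simulator, assuming point sensors with unit direction vectors and a single spherical object (all geometry, units, and names are illustrative assumptions):

```python
# Illustrative sketch: simulate what one VL53L0X-style sensor would read
# for a virtual spherical object, clamped to the sensor's ~2 m range.

import math

def simulate_reading_mm(sensor_pos, sensor_dir, obj_center, obj_radius_mm,
                        max_range_mm=2000):
    """Distance along the sensing ray to a sphere, or max range on a miss.

    sensor_dir is assumed to be a unit vector; all coordinates are in mm.
    """
    # Vector from the sensor to the object's center.
    to_obj = [c - p for c, p in zip(obj_center, sensor_pos)]
    # Projection of the center onto the sensing ray.
    proj = sum(t * d for t, d in zip(to_obj, sensor_dir))
    if proj < 0:
        return max_range_mm  # object is behind the sensor
    # Squared perpendicular distance from the ray to the center.
    perp_sq = sum(t * t for t in to_obj) - proj * proj
    if perp_sq > obj_radius_mm ** 2:
        return max_range_mm  # ray misses the sphere
    # First intersection of the ray with the sphere's surface.
    hit = proj - math.sqrt(obj_radius_mm ** 2 - perp_sq)
    return min(max(hit, 0), max_range_mm)
```

Feeding simulated readings through the same detection and control path as real sensor data lets the stop logic be exercised end to end without the arm or the sensor hardware present.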
The hardware system consists of the microcontroller and the optical sensors. The sensors are distributed across all four I2C buses so that the microcontroller can communicate with all of them. The microcontroller initializes the sensors and sends their readings over UART to the Windows control system.
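As an illustrative sketch of the UART hand-off (the frame format is an assumption for illustration, not the actual firmware protocol), the microcontroller could pack one distance per sensor into a newline-terminated ASCII frame, which the Windows side parses back into a list of readings:

```python
# Hypothetical UART frame format: comma-separated millimeter readings,
# one frame per sensor sweep, terminated by a newline. This encoding is
# an assumption, not the firmware's real protocol.

def pack_frame(readings_mm: list[int]) -> bytes:
    """Encode one set of sensor readings as an ASCII frame (firmware side)."""
    return (",".join(str(r) for r in readings_mm) + "\n").encode("ascii")

def parse_frame(frame: bytes) -> list[int]:
    """Decode a frame back into per-sensor distances (Windows side)."""
    return [int(field) for field in frame.decode("ascii").strip().split(",")]
```

A text-based, newline-delimited frame like this is easy to inspect with a serial terminal during bring-up; a binary format with a checksum would be the natural next step once the link is trusted.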