For each zone, the robot is placed at the centre of the zone, always oriented along the X axis.
Then multiple "360° scans" are performed and stored together with the north orientation provided by the IMU compass: {(x, y, a1, o), (x, y, a2, o), …}, where a1…an are the sweep angles in {0°, …, 180°} and o is the north orientation.
For instance, this yields 20 (number of scans) × 28 (number of zones) × 15 (number of measurements between 0° and 180°) records.
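As a rough sketch of the dataset layout described above, the code below builds arrays with the stated dimensions (20 scans per zone, 28 zones, 15 measurements per scan, plus one orientation per scan). The variable names and the use of random placeholder values are assumptions for illustration, not the actual acquisition code.

```python
import numpy as np

# Dimensions taken from the text: 20 scans per zone, 28 zones,
# 15 distance measurements per scan (sensor swept from 0° to 180°).
n_scans, n_zones, n_angles = 20, 28, 15

rng = np.random.default_rng(0)
# One row per (zone, scan): 15 distance readings in cm (placeholder values).
distances = rng.uniform(0, 400, size=(n_zones * n_scans, n_angles))
# One north orientation o (from the IMU compass) per scan.
orientations = rng.uniform(0, 360, size=(n_zones * n_scans,))
# The zone each scan was recorded in (this becomes the learning label).
labels = np.repeat(np.arange(n_zones), n_scans)

print(distances.shape)  # → (560, 15)
```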
Preparing the data
Distances are the learning features and zones are the labels for the AI learning system (for instance 15 features and 28 labels).
Because a 0 measurement mostly occurs when the distance exceeds the sensor's capacity, all 0 distances are replaced by the maximum distance the ultrasound sensor can detect (for instance 400 cm).
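This zero-replacement step can be sketched as follows; the 400 cm maximum comes from the text, while the sample values are illustrative.

```python
import numpy as np

MAX_RANGE_CM = 400  # assumed ultrasound sensor limit (from the text)

scan = np.array([120.0, 0.0, 310.0, 0.0, 45.0])
# A reading of 0 means "no echo returned", i.e. the obstacle is beyond
# the sensor's range, so substitute the maximum detectable distance.
scan[scan == 0] = MAX_RANGE_CM
# The two zeros are now 400; all other readings are unchanged.
```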
Data are added to each original scan by virtually rotating it twice clockwise and twice anti-clockwise. This generates 4 new scans, as if the robot were oriented at -26°, -13°, +13° and +26° relative to the original position. The number of learning labels becomes 5 × the number of zones (5 × 28 for instance).
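Since 15 measurements over 0°–180° put the readings roughly 13° apart (180° / 14 ≈ 12.9°), a ±13° or ±26° virtual rotation can be approximated by shifting the measurement vector by one or two positions. The sketch below makes that assumption and pads positions that rotate out of the window with the maximum range; the original method may handle the edges differently.

```python
import numpy as np

MAX_RANGE_CM = 400
# One illustrative 15-value scan (distances in cm).
scan = np.arange(15, dtype=float) * 10 + 100

def virtual_rotation(scan, steps):
    """Shift the scan by `steps` angular steps (~13° each).
    Readings that rotate out of the 0°-180° window are replaced by
    the maximum range (an assumption, not from the original text)."""
    out = np.full_like(scan, MAX_RANGE_CM)
    if steps > 0:
        out[steps:] = scan[:-steps]   # clockwise shift
    elif steps < 0:
        out[:steps] = scan[-steps:]   # anti-clockwise shift
    else:
        out[:] = scan
    return out

# -26°, -13°, original, +13°, +26°: five scans per original scan.
augmented = [virtual_rotation(scan, k) for k in (-2, -1, 0, 1, 2)]
```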
Distances are centred on half of the maximum distance, i.e. 200 cm is subtracted from each reading (for instance).
Lastly, the data are randomly split into three parts: training, testing, and validation (around 75%, 10% and 5% respectively).
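The centering and random split steps can be sketched as below. The 200 cm offset and the 75%/10% fractions come from the text; here the remaining records simply go to the validation set, which is an assumption since the stated fractions do not sum to 100%.

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.uniform(0, 400, size=(560, 15))  # distances in cm (placeholder)
y = rng.integers(0, 28, size=560)        # zone labels (placeholder)

# Centre the distances on half of the 400 cm maximum.
X = X - 200.0

# Shuffle indices, then cut into train / test / validation.
idx = rng.permutation(len(X))
n_train = int(0.75 * len(X))
n_test = int(0.10 * len(X))
train, test, val = np.split(idx, [n_train, n_train + n_test])
```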