The three "*.ino" Arduino sketches are to be placed in a folder named "Magabot_SerialControl" and uploaded to the Magabot's Arduino using the public-domain Arduino software. Information about the Magabot can be found at "Magabot.cc".
The Python files are to be placed in a common folder; the main file to run is "MLMMagabot07.py". Communication with the robot uses PySerial, which can be found on the internet and installed alongside Python. As provided here, the software assumes a Magabot connected to the computer through serial port 11. The "magabot_module.py" source file needs editing so that it starts the serial connection on the serial port actually used by the Magabot: the "self.offline_test" variable must be set to zero (it is by default), and the "self.port_number" variable must be set to the correct serial port number. The Arduino software has a menu option that reports the port number used by the robot.
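The configuration step above can be sketched as follows. The attribute names "offline_test" and "port_number" come from "magabot_module.py" as described; the class name and the mapping from a port number to a platform device name are illustrative assumptions, not the module's actual code.

```python
import sys

def port_device(port_number):
    """Map a numeric serial-port index to a platform device name.
    (Illustrative assumption: COMxx on Windows, /dev/ttyUSBxx elsewhere.)"""
    if sys.platform.startswith("win"):
        return "COM%d" % port_number        # e.g. COM11 for serial port 11
    return "/dev/ttyUSB%d" % port_number    # common naming on Linux

class MagabotConfig:
    """Hypothetical stand-in for the settings edited in magabot_module.py."""
    def __init__(self):
        self.offline_test = 0   # 0 (the default) = talk to the real robot
        self.port_number = 11   # must match the port shown in the Arduino menu

cfg = MagabotConfig()
device = port_device(cfg.port_number)
```

With PySerial installed, the connection would then be opened with something like `serial.Serial(device, 9600)`, where the baud rate must match the one used in the Arduino sketch.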
This project is still in an early experimental state. The Magabot is equipped with five sonars in the frontal area, three infrared sensors under the body to detect ground edges or marks, and two frontal bumpers to detect collisions. The Magabot sketch used here receives motion commands from the Arduino's serial monitor, displays bumper collisions on the monitor as they happen, and returns sonar and infrared data on request. It also provides the basic reflex actions to bumps and to ground edge detection. The Python code needs to properly gather the data from the Arduino's serial output buffer, extract relevant features, and then use the MLM mechanism for prediction and voluntary action generation. Post-cinematic actions (i.e. reflex mechanisms that are tried only in the absence of voluntary actions) are also implemented at the Python level.
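The gathering step can be illustrated with a minimal parser for lines read from the serial output buffer. The line formats used below ("S ..." for the five sonars, "I ..." for the three infrared sensors, "B" for a bumper event) are illustrative assumptions, not the actual Magabot serial protocol.

```python
def parse_sensor_line(line):
    """Return (kind, values) for one line from the serial buffer, or None.
    The tags and field counts below are hypothetical, for illustration only."""
    fields = line.strip().split()
    if not fields:
        return None
    tag, values = fields[0], [int(v) for v in fields[1:]]
    if tag == "S" and len(values) == 5:
        return ("sonar", values)       # five frontal sonar distances
    if tag == "I" and len(values) == 3:
        return ("infrared", values)    # three ground infrared sensors
    if tag == "B":
        return ("bumper", values)      # bumper collision event
    return None

# Example: a buffered chunk of serial output split into lines.
buffer = "S 120 85 60 90 140\nI 1 0 1\n"
readings = [r for r in map(parse_sensor_line, buffer.splitlines()) if r]
```

In the real module the buffer would come from a PySerial `read()` or `readline()` call; feature extraction for the MLM mechanism would then operate on the parsed values.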
For instance, the file "MLMMagabot07.py" calls from "magabot_module.py" the post-cinematic method "sonar_driven_choice_1", which implements reflex obstacle avoidance based on sonar information. Any voluntary choice based on cinematic data, however, takes precedence. A "sonar_driven_choice" method can be used instead of "sonar_driven_choice_1": it simply provides random choices taken from a user-definable list of actions. In that case, consistent obstacle avoidance relies exclusively on the learning abilities of the machine.
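The contrast between the two post-cinematic methods can be sketched as below. The action names, the left/right split of the five sonar readings, and the distance threshold are all illustrative assumptions; the real methods live in "magabot_module.py".

```python
import random

ACTIONS = ["forward", "turn_left", "turn_right"]  # user-definable action list

def sonar_driven_choice(sonar):
    """Random fallback: pick any listed action, ignoring the sonar data."""
    return random.choice(ACTIONS)

def sonar_driven_choice_1(sonar, threshold=50):
    """Reflex obstacle avoidance sketch: steer away from the closer side.

    sonar is assumed to be the five frontal distances, left to right;
    the threshold and the left/right grouping are hypothetical.
    """
    if min(sonar) >= threshold:
        return "forward"                     # path clear, keep going
    left, right = min(sonar[:2]), min(sonar[3:])
    return "turn_right" if left < right else "turn_left"
```

With "sonar_driven_choice", the robot only avoids obstacles consistently once the MLM mechanism has learned to produce suitable voluntary actions; with "sonar_driven_choice_1", the reflex covers the cases where no voluntary action is generated.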