Beyond Basics Ex 12-18
Exercise 12: Compare
Objective:
Once every second the color sensor will try to recognise a color: if it detects green, the robot will advance for one wheel rotation; otherwise the 'click' sound will be played. The robot will continue checking once per second until the program is interrupted.
Solution:
This script assumes that you have downloaded the EV3 sounds in WAV format (including the file 'Click.wav') to a folder called 'sounds' in your 'robot' folder, as described on the Sound page. To stop this program, press the stop button in VS Code if you launched it from there, or press the Back button on the EV3.
#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor
from ev3dev2.sound import Sound
from time import sleep
cl = ColorSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)
sound = Sound()
while True:
    if cl.color_name == 'Green':
        steer_pair.on_for_rotations(steering=0, speed=50, rotations=1)
    else:
        sound.play_file('/home/robot/sounds/Click.wav')
    sleep(1)
Notes:
When trying to recognise colors, the color sensor is looking for 'standard Lego colors' i.e. the colors of Lego pieces, and it's very fussy about that. So your best chance of success is to use green Lego pieces. Optimal separation between colored object and sensor is about 8mm.
Exercise 13: Variables
Variables are, of course, a very basic concept in any programming language, and you have already been using them often in previous exercises. However, it is only in official Lego exercise 13 'Variables' of the 'Beyond Basics' sequence that the 'Variables' programming block is presented. In Lego's EV3-G programming environment, changing the contents of a variable is remarkably complicated: you first have to make a copy of the variable you want to change, then you can modify the copy (for example, by adding one to it), and then you have to write the modified copy back into the variable! I think the poor handling of variables is one of the biggest weaknesses of EV3-G, along with the fact that it is icon-based and therefore a poor preparation for a career in programming.
For us clever EV3 Python programmers, this exercise won't introduce anything new. You already know that a variable is like a named container that can contain a text string, a number, a logical value (True or False), or even multiple values, as in lists, tuples, dictionaries and arrays. Be aware that the 'arrays' in EV3-G are quite different from the arrays in Python, which are an advanced topic. An array in EV3-G is really more like a list or tuple in Python. You have been using variables in many scripts on this site already. The script below uses the variable Presses to store the number of presses of the touch sensor button, but it uses other variables too.
Actually this lesson will teach you something new: how to detect the moment that a touch sensor button is pressed (or released) as opposed to simply detecting whether the button is in the 'pressed' state. A 'press and release' is called a 'bump' in the EV3-G software.
Objective:
The user will be allowed 5 seconds during which (s)he can press the touch sensor button several times. If (s)he does that then the robot will move straight forward with 50% speed for a number of wheel rotations equal to the number of times the sensor was pressed.
Solution:
#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import TouchSensor
from ev3dev2.sound import Sound
from time import sleep, time
ts = TouchSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)
sound = Sound()
presses = 0  # number of presses (actually releases)
previous_state = 0
sound.beep()  # signal to start pressing
start_time = time()
# loop until 5 seconds have passed
while time() - start_time < 5:
    current_state = ts.is_pressed
    if previous_state == 1 and current_state == 0:
        # button has just been released
        presses += 1  # short for presses = presses + 1
    previous_state = current_state  # ready for next loop
    sleep(0.01)
steer_pair.on_for_rotations(steering=0, speed=50, rotations=presses)
Notes:
This script uses the function time() which returns the number of seconds that have elapsed since a reference moment in the past known as the 'epoch'. You don't need to know exactly when the 'epoch' was, only that it is a certain fixed point in time (a long time ago).
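To see the timing pattern in isolation, here is a hardware-free sketch of the same technique: record time() once at the start, then loop until the elapsed time exceeds a limit. The function name run_for is invented for this illustration.

```python
from time import sleep, time

def run_for(seconds):
    """Loop until 'seconds' have elapsed since the start, counting iterations."""
    iterations = 0
    start_time = time()  # seconds since the 'epoch'
    while time() - start_time < seconds:
        iterations += 1
        sleep(0.01)  # short pause each time round, as in the EV3 script
    return iterations

count = run_for(0.1)  # loop for roughly a tenth of a second
print(count)
```

Only the difference time() - start_time matters, which is why the exact date of the 'epoch' is irrelevant.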
EV3 Python has a wait_for_bump() function, but it cannot easily be used in our loop: once the user stops bumping the sensor, the code would wait for a next bump that never comes, and the program would hang.
This script demonstrates an excellent way of detecting and counting 'bumps' or touch sensor button releases. Counting the number of presses needs a little thought. Each complete 'press' is really a 'press and release' - this double action is what the Lego software calls a 'bump'. We could look for the presses or releases or both - my program looks for releases. When the touch sensor button is released then its is_pressed property will change from 1 to 0, so my script compares the current is_pressed property of the sensor (current_state) with the value that it had the last time the while loop was run through (previous_state). If the current is_pressed value is 0 and the previous is_pressed value is 1 then the button has just been released and we can add one to the press count variable (presses).
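The edge-detection idea can be tried without a robot by feeding a made-up sequence of is_pressed readings (1 = pressed, 0 = released) through the same 1-to-0 comparison. The helper count_releases is invented here for illustration:

```python
def count_releases(states):
    """Count 1-to-0 transitions (button releases) in a sequence of readings."""
    presses = 0
    previous_state = 0
    for current_state in states:
        if previous_state == 1 and current_state == 0:
            presses += 1  # a release has just happened
        previous_state = current_state
    return presses

# Two complete press-and-release 'bumps', each sampled several times
readings = [0, 0, 1, 1, 1, 0, 0, 1, 0, 0]
print(count_releases(readings))  # 2
```

Note that a button held down for many loop iterations still counts only once, because the count increases only at the moment the reading changes from 1 to 0.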
Here is the official EV3-G solution to the same problem. It uses a thread (a branch of code that can run simultaneously with other threads) to interrupt the main loop after 5 seconds. I think the best EV3 Python solution would not use threads - it seems unnecessarily complicated to have code in one thread interrupting a loop in a different thread. Again, compare the EV3 Python solution with the EV3-G solution and ask yourself which seems to be the clearer, neater solution (taking into account that the EV3-G solution has no comments)...
Exercise 14: Color sensor - calibrate
The EV3 color sensor was probably calibrated in the factory so that it returns (in mode 0, reflected light intensity) a value of 100% when a perfectly pure white surface is brought near the sensor and 0% when a perfectly black surface is brought near. But in the real world such surfaces are rare and we are more likely to be working with ordinary white paper that is not a perfect white and ordinary black paper or black marker that is not perfectly black. Therefore when we use the color sensor we might be getting a reflectance reading of, say, 13 with our (not so) black surface and 87 with our (not so) white surface. Wouldn't it be convenient if we could recalibrate the sensor so that it gives a reading of almost exactly zero with our real-world black surface and almost exactly 100 with our real-world white surface? EV3-G has a special calibration block for this purpose but it does not work in the way most people expect, hence the existence of a special lesson in the 'Beyond Basics' section of the Lego software (Education edition) to show its correct use (see the corresponding program below). EV3 Python users can also use code to 'recalibrate' the sensor (or at least to adjust the displayed value in the desired way).
Objective:
Write a program in which the color sensor is recalibrated so that our real-world black surface gives a reflectance value of 0% and our real-world white surface gives a reflectance value of 100%. At the same time, the measured reflected light intensity should be continuously displayed on the LCD screen, so we should be able to see that after each recalibration the sensor does indeed give the expected values when presented with a black or white surface.
The exact procedure will be this:
User brings a 'black' surface close to the sensor and notes that the displayed value is not close to 0
User presses the touch sensor button to indicate that the dark surface is in place
Program recalibrates the sensor reading to 0% for that surface
Program plays a 'click' sound to indicate that the recalibration is complete
User checks that a value very close to 0 is indeed displayed when the 'black' object is in place
User brings a 'white' surface close to the sensor and notes that the displayed value is not close to 100
User presses the touch sensor button to indicate that the 'white' surface is in place
Program recalibrates the sensor reading to 100% for that surface
Program plays a 'click' sound to indicate that the recalibration is complete
User checks that a value very close to 100 is indeed displayed when the 'white' object is in place AND that a value very close to 0 is indeed displayed when the 'black' object is in place
User presses the touch sensor button to reset the sensor to its factory calibration
Program resets the sensor
Program plays a 'click' sound to indicate that the resetting is complete.
User checks that the original reflectance values are again displayed for both the black and the white surfaces
Solution:
As in the EV3-G solution, we will use a separate thread to continuously display the sensor value (after the calibrations for black and white, the ADJUSTED sensor reading will be displayed). Review the multithreading page if you have forgotten the principles of multithreading and how to use a 'daemon' thread.
For this exercise, it is of course important to always hold the objects at the same distance from the sensor. My experiments suggest that objects should be held about 0.5 cm from the sensor since this gives the strongest possible reflected light intensity. It is normal that after the adjustment has been made for 'black' the displayed value will then be negative when no object is near the color sensor.
Since the procedure is rather complex, instead of making 'clicks', the script causes instructions to be spoken, something that EV3-G is not capable of doing. Also, I include an extra wait_for_bump at the end to give the user time to check that the calibration really has been undone.
#!/usr/bin/env python3
from ev3dev2.sensor.lego import ColorSensor, TouchSensor
from ev3dev2.sound import Sound
from ev3dev2.button import Button
from ev3dev2.display import Display
from time import sleep
from threading import Thread
ts = TouchSensor()
cl = ColorSensor()
btn = Button()
sound = Sound()
lcd = Display()
black = 0 # this value will be replaced with the actual
# value returned by the sensor when it is placed close to a black surface
white = 100
def daemon_thread():
    while True:
        lcd.clear()
        adjusted_intensity = str(int((cl.reflected_light_intensity - black) * 100 / (white - black)))
        lcd.text_pixels(adjusted_intensity, x=75, y=50, font='helvB24')
        lcd.update()  # push the drawn text to the screen
        sleep(0.5)

t = Thread(target=daemon_thread)
t.daemon = True  # a daemon thread dies when the main program ends
t.start()
# Calibrate for black
sound.speak('Put the sensor close to a black surface and press the button')
ts.wait_for_bump()
black = cl.reflected_light_intensity
sound.speak('Calibrated for black')
# Calibrate for white
sound.speak('Put the sensor close to a white surface and press the button')
ts.wait_for_bump()
white = cl.reflected_light_intensity
sound.speak('Calibrated for black and white')
sound.speak('Check that the calibration works, then press the button')
ts.wait_for_bump()
# undo the calibration
black = 0
white = 100
sound.speak('Calibration undone')
sound.speak('Check that the calibration has been undone, then press the button')
ts.wait_for_bump() # bump to end the program
sound.speak('Goodbye')
Notes:
I'm not explaining the formula for adjusting the reading (the adjusted_intensity line in the script above). Try to make sense of it for yourself, and try calculating by hand what the adjusted result would be, for your chosen black and white calibration values, if the sensor is brought up to the black surface again. And for a white surface?
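If you want to check your hand calculation, the scaling formula can be tried on its own in plain Python. The calibration readings of 20 for black and 80 for white below are invented for illustration; your own measured values will differ:

```python
def adjusted(raw, black, white):
    """Rescale a raw reflectance reading so black maps to 0 and white to 100."""
    return (raw - black) * 100 / (white - black)

black, white = 20, 80   # invented example calibration readings
print(adjusted(20, black, white))  # the black surface now reads 0.0
print(adjusted(80, black, white))  # the white surface now reads 100.0
print(adjusted(50, black, white))  # a mid-grey surface reads 50.0
```

With the default values black = 0 and white = 100 the formula returns the raw reading unchanged, which is why resetting those two variables 'undoes' the calibration.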
This is a relatively long and clumsy EV3 Python solution - perhaps you can find a better one?
Here is the official EV3-G solution to this exercise. Note how the recalibration blocks (such as the one just to the left of the 'play Click' block in the top line) do not get any information directly from the sensor, as most people would expect, they have to be fed a number through a data wire in order to do the recalibration.
Exercise 15: Messaging
Objective:
This program (or rather, this pair of programs) will allow two bricks to communicate with one another using Bluetooth (short range wireless radio communication). The programs will allow you to control the rotational speed of one wheel of the receiving robot, which will be moving in circles, by turning the right wheel (motor C) of the sender robot. That's right - we will be controlling the receiving robot by 'remote control' using Bluetooth (not using the EV3 infrared 'beacon').
I don't own two bricks at the moment so this solution will have to wait...
Exercise 16: Logic
In computer programming the logical (or 'Boolean') values are True and False. Logical operators include 'and' and 'or'.
When the 'and' operator is applied to two expressions, the result is True only if both expressions are True.
When the 'or' operator is applied to two expressions, the result is True if either (or both) of the expressions are True.
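These two rules can be verified directly in ordinary Python by printing the full truth table:

```python
# Truth table for 'and' and 'or' over all four combinations
for a in (False, True):
    for b in (False, True):
        print(a, b, '->', a and b, a or b)
```

Notice that 'a and b' is True only on the final line (both True), while 'a or b' is False only on the first line (both False).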
Objective:
The robot will move forward in a straight line towards an object until its color sensor detects a black surface AND the robot is within 6-25cm of the object that it is approaching. When BOTH these conditions are met the robot will stop moving.
Solution:
Really there are THREE conditions to be met:
The color sensor must be detecting black
The object must be more than 6 cm away.
The object must be less than 25 cm away.
#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor, UltrasonicSensor
from time import sleep
cl = ColorSensor()
us = UltrasonicSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)
steer_pair.on(steering=0, speed=50)
while True:  # loop until all three conditions are met
    distance = us.distance_centimeters
    if distance > 6 and distance < 25 and cl.color_name == 'Black':
        break
    sleep(0.01)
steer_pair.off()
Notes:
The instruction break causes the program execution to exit the loop, allowing the steer_pair.off() command to run and allowing the program to then terminate.
It would have been possible to save several lines by putting all the conditions in the while line but the line would have then been long and hard to read.
Here is the official EV3-G solution to this exercise. It demonstrates how to use the 'Range' block and the 'Logic' block.
In the video that accompanies the official EV3-G exercise the robot is initially far from the reflecting object and over a white mat. The conditions are met for the robot to move forward. It passes over a black line but does not stop because the robot is not yet within the defined range. It continues moving forward until it reaches a second black line and there it stops because it detects black AND the robot is within the defined range.
It seems to me the above EV3-G program has an error. It should not be necessary to repeatedly turn on the motors. Shouldn't the first motor block be placed before the loop, not within it?
Exercise 17: Maths - Advanced
This exercise uses the gyro sensor which is not included in the 'Home' version of the EV3 kit, though it is available for purchase as an optional extra, and the corresponding programming block can be downloaded to the Home version of the EV3-G software at no cost.
Recall that it is vitally important that the gyro sensor should be absolutely still when the brick is powered up or the sensor plugged in, otherwise the sensor reading will wander away from the correct value.
Objective:
The robot is assumed to have already moved along two perpendicular arms of a right triangle, each with length 25 cm, and to have turned around 180° so it is now in the right location to begin tracing the hypotenuse but it is not pointing in the right direction. The robot should now turn slowly on the spot until the gyro sensor detects that the robot has turned at least 45°, then the motors should be turned off.
Then the robot should calculate the length of the hypotenuse using the actual angle turned by the robot (as measured by the gyro sensor) rather than the 45° angle that the robot should have turned. The calculation will be
hypotenuse length = adjacent arm length / cos(turn angle)
Then the robot should calculate the corresponding number of wheel rotations needed, given that the circumference of the standard Lego wheel is 17.6 cm. Then the robot should move at speed 30 in the correct direction and for the correct distance in order to trace out the hypotenuse of the triangle.
Solution:
#!/usr/bin/env python3
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import GyroSensor
from math import cos, radians
gyro = GyroSensor()
tank_pair = MoveTank(OUTPUT_B, OUTPUT_C)
# Make robot slowly turn to the right on the spot
tank_pair.on(left_speed=10, right_speed=-10)
gyro.wait_until_angle_changed_by(45)
tank_pair.off()
# calculate length of hypotenuse
length = 25/cos(radians(gyro.angle))
# calculate wheel rotations (wheel circumference = 17.6cm)
rots = length/17.6
tank_pair.on_for_rotations(left_speed=30, right_speed=30, rotations=rots)
Notes:
In order to be able to use the math functions cos() and radians() you need to import the math library or import at least the needed math functions.
Python's trigonometric functions sin(), cos() etc. work in radians, not degrees.
1 radian = 57.3° (approx); 2*pi radians = 360°. In Python, you can convert degrees to radians with radians() and radians to degrees with degrees().
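The same calculation can be checked in ordinary Python, using the ideal case of an exact 45 degree turn (the 25 cm arm length and 17.6 cm wheel circumference are taken from the exercise):

```python
from math import cos, radians, degrees

print(round(degrees(1), 1))  # one radian is about 57.3 degrees

angle = 45  # the ideal turn angle, in degrees
hypotenuse = 25 / cos(radians(angle))  # hypotenuse length in cm
rotations = hypotenuse / 17.6          # wheel circumference is 17.6 cm
print(round(hypotenuse, 2))  # about 35.36 cm
print(round(rotations, 2))   # about 2.01 wheel rotations
```

Forgetting the radians() conversion is a classic bug: cos(45) treats 45 as radians and gives a meaningless result for this problem.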
Don't confuse the angle that the robot turns with the angle that the wheels turn.
Here is the official EV3-G solution to this exercise. Note that EV3-G trig functions use degrees, not radians.
Exercise 18: Arrays (in Python, the closest equivalent to the EV3-G 'array' is the 'list')
Objective:
First, the user will give coded instructions to the robot by showing it a sequence of four colors. Each color is an instruction to carry out a certain movement. Then the robot will carry out the corresponding movements.
More specifically, the procedure will be:
The robot beeps to indicate that it is ready.
The user presses the touch sensor button to indicate that (s)he is also ready.
The program waits until the user presents an object with a valid color (blue, green or yellow) to the color sensor.
When the color sensor recognises a valid color it stores the corresponding color string (such as 'Blue') in a list.
The above sequence is repeated three more times to store a total of four color strings in the same list.
The robot then plays the sound 'Horn 2' at 100% volume, waiting for the sound to finish playing before continuing.
Then the program begins reading the contents of the list, carrying out the corresponding movement for each color string.
The program stops when all four color strings have been read and the corresponding four movements carried out.
If the user presents a blue object then the robot should turn 90° left.
If the user presents a green object then the robot should go straight forward for one wheel rotation.
If the user presents a yellow object then the robot should turn 90° right.
Solution:
EV3-G defines an 'array' as a variable that can hold multiple values. The EV3-G solution to this exercise uses an array to store the four color code numbers. There exists in Python something called an array (see HERE or HERE or HERE) but in reality the Python structure that best corresponds to an EV3-G 'array' is simply a list.
The order of the elements in a list is important. Each element in the list has an index number, and the first element has index number zero. Here is an example of a list in Python: [5, 2.7, 3.1]. This list has three elements - we say the list has a 'length' of three. The elements have index numbers 0, 1 and 2, so the third element has index number 2, not 3.
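Here is the same idea in a tiny standalone example, using an empty list that is filled with color strings just as color_list is in the script below:

```python
colors = []                # start with an empty list
colors.append('Blue')
colors.append('Green')
colors.append('Yellow')
print(len(colors))         # 3: the list has a length of three
print(colors[0])           # Blue: the first element has index 0
print(colors[2])           # Yellow: the third element has index 2
```

Trying colors[3] here would raise an IndexError, because the valid indexes run from 0 to len(colors) - 1.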
This script assumes that you have downloaded the EV3 sounds in WAV format (including the file 'Horn 2.wav') to a folder called 'sounds' in your 'robot' folder as described on the Sound page.
#!/usr/bin/env python3
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor, TouchSensor
from ev3dev2.sound import Sound
from time import sleep
cl = ColorSensor()
ts = TouchSensor()
tank_pair = MoveTank(OUTPUT_B, OUTPUT_C)
sound = Sound()
def get_color():
    while True:  # wait for a valid color to be detected
        color = cl.color_name
        if color in ('Blue', 'Green', 'Yellow'):
            color_list.append(color)
            break  # exit the loop
        sleep(0.01)

color_list = []  # create an empty list
for i in range(4):  # i = 0, then 1, then 2, then 3
    sound.beep()
    ts.wait_for_bump()
    get_color()
sound.play_file('/home/robot/sounds/Horn 2.wav')
for col in color_list:
    if col == 'Blue':  # blue: turn the robot 90 degrees left
        tank_pair.on_for_degrees(-50, 50, degrees=345)
    elif col == 'Green':  # green: go straight forward for one wheel rotation
        tank_pair.on_for_rotations(50, 50, rotations=1)
    elif col == 'Yellow':  # yellow: turn the robot 90 degrees right
        tank_pair.on_for_degrees(50, -50, degrees=345)
Notes:
On my robot, 345° of wheel rotation (by both wheels, moving in opposite directions) is roughly correct to make the ROBOT turn 90°. You may need to use a different value, depending on the type of wheels you have and on their spacing.
To shorten some lines I have not included the parameter names left_speed and right_speed.
Advanced point: why don't I need to include 'global color_list' in the get_color() function definition in order to modify the list from inside the function? Because the function never assigns a new value to the name color_list - it only calls color_list.append(), which modifies the list object that already exists in the main block. The 'global' keyword is only needed when a function assigns to a global name.
Here is the official EV3-G solution to this problem:
Conclusion
Congratulations! You have now completed the EV3 Python versions of the 'Basics' and 'Beyond Basics' exercises that are included in the Education version of the EV3 software. You have seen how EV3 Python is almost always capable of performing the same tasks as the standard Lego EV3-G software, while at the same time giving you valuable experience of textual programming. But EV3 Python is also capable of handling programs that go beyond the capabilities of the standard Lego software (such as programs that make use of EV3 Python's text-to-speech feature). Going beyond what EV3-G can do will be the theme of the next set of exercises... so stay tuned!