Beyond Basics Ex 12-18

Exercise 12: Compare

Objective:

Once every second, the color sensor will try to recognise a color. If it detects green, the robot will advance for one wheel rotation; otherwise the 'click' sound will be played, followed by a one-second pause. The robot will continue checking until the program is interrupted.

Solution:

This script assumes that you have downloaded the EV3 sounds in WAV format (including the file 'Click.wav') to a folder called 'sounds' in your 'robot' folder, as described on the Sound page. To stop this program, press the stop button in VS Code if you launched it from there, or press the Back button on the EV3.

#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor
from ev3dev2.sound import Sound
from time import sleep

cl = ColorSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)
sound = Sound()

while True:
    if cl.color_name == 'Green':
        steer_pair.on_for_rotations(steering=0, speed=50, rotations=1)
    else:
        sound.play_file('/home/robot/sounds/Click.wav')
        sleep(1)

Notes:

Exercise 13: Variables

Variables are, of course, a very basic concept in any programming language, and you have already been using them often in previous exercises. However, it is only in official Lego exercise 13, 'Variables', of the 'Beyond Basics' sequence that the 'Variables' programming block is presented. In Lego's EV3-G programming environment, changing the contents of a variable is remarkably complicated: you first have to make a copy of the variable you want to change, then modify the copy (for example, add one to it), and then write the modified copy back into the variable! I think the poor handling of variables is one of the biggest weaknesses of EV3-G, along with the fact that it is icon-based and therefore a poor preparation for a career in programming.

For us clever EV3 Python programmers, this exercise won't introduce anything new. You already know that a variable is like a named container that can hold a text string, a number, a logical value (True or False), or even multiple values, as in lists, tuples, dictionaries and arrays. Be aware that the 'arrays' in EV3-G are quite different from arrays in Python, which are an advanced topic. An array in EV3-G is really more like a list or tuple in Python. You have been using variables in many scripts on this site already. The script below uses the variable presses to store the number of presses of the touch sensor button, but it uses other variables too.
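As a quick hardware-free reminder (the names below are purely illustrative), here is a minimal sketch of a Python variable holding each of the kinds of value mentioned above:

```python
# A variable is a named container; the same rules apply in EV3 Python
# scripts as in any other Python program.
robot_name = 'EV3'       # a text string
presses = 3              # a number
is_moving = False        # a logical (Boolean) value
speeds = [10, 25, 50]    # multiple values, stored in a list

# A variable can be updated in place with augmented assignment
presses += 1             # short for presses = presses + 1
print(robot_name, presses, is_moving, speeds)
```

Note that in Python, unlike EV3-G, updating a variable is a single statement; there is no need to copy a value out, change the copy and write it back.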

Actually this lesson will teach you something new: how to detect the moment that a touch sensor button is pressed (or released) as opposed to simply detecting whether the button is in the 'pressed' state. A 'press and release' is called a 'bump' in the EV3-G software.
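The idea behind 'bump' detection is to compare the sensor's current state with its state on the previous pass of the loop. The hardware-free sketch below, in which an illustrative list of samples stands in for successive ts.is_pressed readings, counts one 'bump' for each press-and-release:

```python
# Simulated touch sensor readings, one per loop pass:
# 1 = pressed, 0 = released (this list stands in for ts.is_pressed)
samples = [0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0]

presses = 0
previous_state = 0
for current_state in samples:
    if previous_state == 1 and current_state == 0:
        # A 1 -> 0 transition means the button has just been released
        presses += 1
    previous_state = current_state  # Remember the state for the next pass

print(presses)  # the sample data contains three press-and-release cycles
```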

Objective:

The user will be allowed 5 seconds during which they can press the touch sensor button several times. The robot will then move straight forward at 50% speed for a number of wheel rotations equal to the number of times the sensor was pressed.

Solution:

#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import TouchSensor
from ev3dev2.sound import Sound
from time import sleep, time

ts = TouchSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)
sound = Sound()

presses = 0  # Number of presses (actually releases)
previous_state = 0

sound.beep()  # Signal the start of the pressing period
start_time = time()

# Loop until 5 seconds have passed
while (time() - start_time) < 5:
    current_state = ts.is_pressed
    if previous_state == 1 and current_state == 0:
        # The button has just been released
        presses += 1  # Short for presses = presses + 1
    previous_state = current_state  # Ready for the next loop
    sleep(0.01)

steer_pair.on_for_rotations(steering=0, speed=50, rotations=presses)

Notes:

Exercise 14: Color sensor - calibrate

The EV3 color sensor was probably calibrated in the factory so that it returns (in mode 0, reflected light intensity) a value of 100% when a perfectly pure white surface is brought near the sensor and 0% when a perfectly black surface is brought near. But in the real world such surfaces are rare, and we are more likely to be working with ordinary white paper that is not a perfect white and ordinary black paper or black marker that is not perfectly black. Therefore when we use the color sensor we might be getting a reflectance reading of, say, 13 with our (not so) black surface and 87 with our (not so) white surface. Wouldn't it be convenient if we could recalibrate the sensor so that it gives a reading of almost exactly zero with our real-world black surface and almost exactly 100 with our real-world white surface? EV3-G has a special calibration block for this purpose, but it does not work in the way most people expect, hence the existence of a special lesson in the 'Beyond Basics' section of the Lego software (Education edition) to show its correct use (see the corresponding program below). EV3 Python users can also use code to 'recalibrate' the sensor (or at least to adjust the displayed value in the desired way).
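The adjustment is a simple linear rescaling: subtract the measured 'black' value from the raw reading, then divide by the measured black-to-white range. Here is a hardware-free sketch, using the illustrative readings of 13 and 87 mentioned above:

```python
def adjust(reading, black=13, white=87):
    """Rescale a raw reflectance reading so that the measured 'black'
    value maps to 0 and the measured 'white' value maps to 100."""
    return (reading - black) * 100 / (white - black)

print(adjust(13))   # 0.0   - our real-world black now reads as zero
print(adjust(87))   # 100.0 - our real-world white now reads as 100
print(adjust(50))   # 50.0  - halfway between the two calibration points
```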

Objective:

Write a program in which the color sensor is recalibrated so that our real-world black surface gives a reflectance value of 0% and our real-world white surface gives a reflectance value of 100%. At the same time, the measured reflected light intensity should be continuously displayed on the LCD screen, so we should be able to see that after each recalibration the sensor does indeed give the expected values when presented with a black or white surface.

The exact procedure will be this:

1. The spoken instructions tell the user to hold the 'black' object close to the sensor and press the touch sensor button; the current reading is then stored as the new 'black' reference value.

2. The user holds the 'white' object close to the sensor and presses the button; the current reading is stored as the new 'white' reference value.

3. User checks that a value very close to 100 is indeed displayed when the 'white' object is in place AND that a value very close to 0 is indeed displayed when the 'black' object is in place.

4. The user presses the button again to undo the calibration, and can then check that the raw (unadjusted) readings are displayed once more before a final button press ends the program.

Solution:

As in the EV3-G solution, we will use a separate thread to continuously display the sensor value (after the calibrations for black and white, the ADJUSTED sensor reading will be displayed). Review the multithreading page if you have forgotten the principles of multithreading and how to use a 'daemon' thread.

For this exercise, it is of course important to always hold the objects at the same distance from the sensor. My experiments suggest that objects should be held about 0.5 cm from the sensor since this gives the strongest possible reflected light intensity. It is normal that after the adjustment has been made for 'black' the displayed value will then be negative when no object is near the color sensor.

Since the procedure is rather complex, instead of making 'clicks', the script causes instructions to be spoken, something that EV3-G is not capable of doing. Also, I include an extra wait_for_bump at the end to give the user time to check that the calibration really has been undone.

#!/usr/bin/env python3
from ev3dev2.sensor.lego import ColorSensor, TouchSensor
from ev3dev2.sound import Sound
from ev3dev2.button import Button
from ev3dev2.display import Display
from time import sleep
from threading import Thread

ts = TouchSensor()
cl = ColorSensor()
btn = Button()
sound = Sound()
lcd = Display()

black = 0    # This value will be replaced with the actual value returned
             # by the sensor when it is placed close to a black surface
white = 100

def daemon_thread():
    while True:
        lcd.clear()
        adjusted_intensity = str(int((cl.reflected_light_intensity - black) * 100 / (white - black)))
        lcd.text_pixels(adjusted_intensity, x=75, y=50, font='helvB24')
        sleep(0.5)

t = Thread(target=daemon_thread)
t.daemon = True
t.start()

# Calibrate for black
sound.speak('Put the sensor close to a black surface and press the button')
ts.wait_for_bump()  # Alternatively: btn.wait_for_bump('right')
black = cl.reflected_light_intensity
sound.speak('Calibrated for black')

# Calibrate for white
sound.speak('Put the sensor close to a white surface and press the button')
ts.wait_for_bump()
white = cl.reflected_light_intensity
sound.speak('Calibrated for black and white')

sound.speak('Check that the calibration works, then press the button')
ts.wait_for_bump()

# Undo the calibration
black = 0
white = 100
sound.speak('Calibration undone')

sound.speak('Check that the calibration has been undone, then press the button')
ts.wait_for_bump()  # Bump to end the program
sound.speak('Goodbye')

Notes:

Exercise 15: Messaging

Objective:

This program (or rather, this pair of programs) will allow two bricks to communicate with one another using Bluetooth (short range wireless radio communication). The programs will allow you to control the rotational speed of one wheel of the receiving robot, which will be moving in circles, by turning the right wheel (motor C) of the sender robot. That's right - we will be controlling the receiving robot by 'remote control' using Bluetooth (not using the EV3 infrared 'beacon').

I don't own two bricks at the moment so this solution will have to wait...

Exercise 16: Logic

In computer programming, the logical (or 'Boolean') values are True and False. Logical operators include 'and' and 'or'.
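As a quick refresher (a minimal sketch, not specific to the EV3): 'and' is True only when both operands are True, while 'or' is True when at least one operand is True. Comparisons produce Boolean values, so they can be combined with these operators:

```python
print(True and True)    # True
print(True and False)   # False
print(True or False)    # True
print(False or False)   # False

# Comparisons return True or False, so they can be chained with 'and'
distance = 15
print(distance > 6 and distance < 25)  # True: both conditions hold
```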

Objective:

The robot will move forward in a straight line towards an object until its color sensor detects a black surface AND the robot is within 6-25cm of the object that it is approaching. When BOTH these conditions are met the robot will stop moving.

Solution:

Really there are THREE conditions to be met: the measured distance must be greater than 6 cm, it must be less than 25 cm, and the color sensor must detect black.

#!/usr/bin/env python3
from ev3dev2.motor import MoveSteering, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor, UltrasonicSensor
from time import sleep

cl = ColorSensor()
us = UltrasonicSensor()
steer_pair = MoveSteering(OUTPUT_B, OUTPUT_C)

steer_pair.on(steering=0, speed=50)

while True:   # Loop forever (until we break out)
    distance = us.distance_centimeters
    if distance > 6 and distance < 25 and cl.color_name == 'Black':
        break
    sleep(0.01)

steer_pair.off()

Notes:

In the video that accompanies the official EV3-G exercise the robot is initially far from the reflecting object and over a white mat. The conditions are met for the robot to move forward. It passes over a black line but does not stop because the robot is not yet within the defined range. It continues moving forward until it reaches a second black line and there it stops because it detects black AND the robot is within the defined range.

It seems to me the above EV3-G program has an error. It should not be necessary to repeatedly turn on the motors. Shouldn't the first motor block be placed before the loop, not within it?

Exercise 17: Maths - Advanced

This exercise uses the gyro sensor which is not included in the 'Home' version of the EV3 kit, though it is available for purchase as an optional extra, and the corresponding programming block can be downloaded to the Home version of the EV3-G software at no cost.

Recall that it is vitally important that the gyro sensor should be absolutely still when the brick is powered up or the sensor plugged in, otherwise the sensor reading will wander away from the correct value.

Objective:

The robot is assumed to have already moved along two perpendicular arms of a right triangle, each with length 25 cm, and to have turned around 180° so it is now in the right location to begin tracing the hypotenuse but it is not pointing in the right direction. The robot should now turn slowly on the spot until the gyro sensor detects that the robot has turned at least 45°, then the motors should be turned off.

Then the robot should calculate the length of the hypotenuse using the actual angle turned by the robot (as measured by the gyro sensor) rather than the 45° angle that the robot should have turned. The calculation will be

    hypotenuse length = adjacent arm length / cos(turn angle)

Then the robot should calculate the corresponding number of wheel rotations needed, given that the circumference of the standard Lego wheel is 17.6 cm. Then the robot should move at speed 30 in the correct direction and for the correct distance in order to trace out the hypotenuse of the triangle.
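As a sanity check, here is the calculation worked through (a sketch assuming the robot turned exactly 45°; in practice the gyro will report a slightly different angle):

```python
from math import cos, radians

adjacent = 25                                 # length of each triangle arm, in cm
angle = 45                                    # ideal turn angle, in degrees
hypotenuse = adjacent / cos(radians(angle))   # 25 / cos(45°) ≈ 35.36 cm
rotations = hypotenuse / 17.6                 # standard Lego wheel circumference
print(round(hypotenuse, 2), round(rotations, 2))  # 35.36 2.01
```

So for a perfect 45° turn the robot needs just over two wheel rotations to trace the hypotenuse.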

Solution:

#!/usr/bin/env python3
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import GyroSensor
from math import cos, radians

gyro = GyroSensor()
tank_pair = MoveTank(OUTPUT_B, OUTPUT_C)

# Make the robot turn slowly to the right on the spot
tank_pair.on(left_speed=10, right_speed=-10)
gyro.wait_until_angle_changed_by(45)
tank_pair.off()

# Calculate the length of the hypotenuse
length = 25 / cos(radians(gyro.angle))

# Calculate the wheel rotations needed (wheel circumference = 17.6 cm)
rots = length / 17.6

tank_pair.on_for_rotations(left_speed=30, right_speed=30, rotations=rots)

Notes:

Exercise 18: Arrays (in Python, the closest equivalent to the EV3-G 'array' is the 'list')

Objective:

First, the user will give coded instructions to the robot by showing it a sequence of four colors. Each color is an instruction to carry out a certain movement. Then the robot will carry out the corresponding movements.

More specifically, the procedure will be:

1. The robot beeps to show that it is ready for the next color.

2. The user presses the touch sensor button and then shows the color sensor a blue, green or yellow object; the detected color is added to the list.

3. Steps 1 and 2 are repeated until four colors have been recorded, then a horn sound is played.

4. The robot carries out the four corresponding movements in order: blue = turn 90° left, green = move straight forward for one wheel rotation, yellow = turn 90° right.

Solution:

EV3-G defines an 'array' as a variable that can hold multiple values. The EV3-G solution to this exercise uses an array to store the four color code numbers. There does exist in Python something called an array, but in reality the Python structure that best corresponds to an EV3-G 'array' is simply a list.

The order of the elements in a list is important. Each element in the list has an index number, and the first element has index number zero. Here is an example of a list in Python: [5, 2.7, 3.1]. This list has three elements, so we say the list has a 'length' of three. The elements have index numbers 0, 1 and 2, so the third element has index number 2, not 3.
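In code, that looks like this (a minimal sketch):

```python
values = [5, 2.7, 3.1]   # a list with three elements

print(len(values))       # 3, the 'length' of the list
print(values[0])         # 5, the first element has index 0
print(values[2])         # 3.1, the third element has index 2

values.append(8)         # lists can grow one element at a time
print(values)            # [5, 2.7, 3.1, 8]
```

The solution below builds its list of colors in exactly this way, appending one detected color at a time.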

This script assumes that you have downloaded the EV3 sounds in WAV format (including the file 'Horn 2.wav') to a folder called 'sounds' in your 'robot' folder, as described on the Sound page.

#!/usr/bin/env python3
from ev3dev2.motor import MoveTank, OUTPUT_B, OUTPUT_C
from ev3dev2.sensor.lego import ColorSensor, TouchSensor
from ev3dev2.sound import Sound
from time import sleep

cl = ColorSensor()
ts = TouchSensor()
tank_pair = MoveTank(OUTPUT_B, OUTPUT_C)
sound = Sound()

def get_color():
    while True:   # Wait for a valid color to be detected
        color = cl.color_name
        if color in ('Blue', 'Green', 'Yellow'):
            color_list.append(color)
            break   # Exit the loop
        sleep(0.01)

color_list = []  # Create an empty list

for i in range(4):  # i = 0, then 1, then 2, then 3
    sound.beep()
    ts.wait_for_bump()
    get_color()

sound.play_file('/home/robot/sounds/Horn 2.wav')

for col in color_list:
    if col == 'Blue':      # Blue: turn the robot 90 degrees left
        tank_pair.on_for_degrees(-50, 50, degrees=345)
    elif col == 'Green':   # Green: go straight forward for one wheel rotation
        tank_pair.on_for_rotations(50, 50, rotations=1)
    elif col == 'Yellow':  # Yellow: turn the robot 90 degrees right
        tank_pair.on_for_degrees(50, -50, degrees=345)

Notes:

Conclusion

Congratulations! You have now completed the EV3 Python versions of the 'Basics' and 'Beyond Basics' exercises that are included in the Education version of the EV3 software. You have seen how EV3 Python is almost always capable of performing the same tasks as the standard Lego EV3-G software, while at the same time giving you valuable experience of textual programming. But EV3 Python is also capable of handling programs that go beyond the capabilities of the standard Lego software (such as programs that make use of EV3 Python's text-to-speech feature). Going beyond what EV3-G can do will be the theme of the next set of exercises... so stay tuned!