Weekly Update

Week 7: Setting up the LCD

I was able to get the LCD up and running this week. The startup command sequence was a bit cryptic, so I referenced the LCD driver code we used in ECE 153A for this LCD. I also referenced the startup code in this Adafruit library for the same LCD, which sends additional Gamma control commands: https://github.com/adafruit/Adafruit_ILI9341/blob/master/Adafruit_ILI9341.cpp. I used the provided SPI drivers for the MAX78000 microcontroller to send the startup commands and then draw simple colored blocks on the screen.

I then used the provided camera driver to get a pointer to the raw frame buffer from the camera in YUV422 format so that I can extract the luminance (grayscale) values. I will display RGB on the screen but pass a decimated grayscale image (1x80x80) to the CNN to speed up inference. The images below show the output of the camera displayed in grayscale and RGB. The output format is RGB565 for both; by giving the R, G, and B channels equal weight, the image appears gray.
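A rough sketch of that conversion is below. It assumes a packed YUYV byte order for the YUV422 stream (if the camera driver uses UYVY instead, the Y samples sit at odd byte offsets) and shows how replicating the luma value across the three RGB565 channels produces a gray pixel:

    #include <stdint.h>
    #include <stddef.h>

    /* Pack an 8-bit luminance value into RGB565 by giving all three
     * channels (roughly) equal weight, which renders as gray. */
    static inline uint16_t gray_to_rgb565(uint8_t y)
    {
        uint16_t r = y >> 3;   /* 5 bits */
        uint16_t g = y >> 2;   /* 6 bits */
        uint16_t b = y >> 3;   /* 5 bits */
        return (uint16_t)((r << 11) | (g << 5) | b);
    }

    /* Walk a packed YUV422 frame and keep only the luma (Y) samples. */
    static void extract_luma(const uint8_t *yuv422, uint8_t *gray,
                             size_t width, size_t height)
    {
        size_t num_pixels = width * height;
        for (size_t i = 0; i < num_pixels; i++) {
            gray[i] = yuv422[2 * i];   /* every other byte is a Y sample */
        }
    }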

Week 8: Setting up the motion sensor and programming the state machine

  • This week I set up the PIR motion sensor and coded up most of the state machine. The motion sensor is very simple and outputs a rising-edge signal when it detects movement. The output goes to GPIO Port 2 Pin 7, which is configured for an external interrupt.

  • Using the Maxim Integrated drivers I registered a callback function that gets called when there is an interrupt on that pin; this callback takes a void* as its input parameter. Rather than interacting with the state machine directly from the callback, I pass a function pointer in as that void* parameter. The function pointer is an ssm_action_fn, which is how external events trigger transitions in the state machine.

  • The point of doing this is data encapsulation. The motion sensor interrupt handler should not have direct access to the state machine and its data but should only be able to communicate with it indirectly.

  • As shown in the diagram and pseudocode below, the function motion_sensor_trigger() is the ssm_action_fn. This function handles the state machine transition from IDLE to SEARCH (note: this is a static function inside state_machine.c, not PIR_sensor.c). In the callback function PIR_motion_handler(void* ssm_action) (which is really the interrupt handler), the 'action' function gets called. This lets the motion sensor interact with the state machine without any direct access to it.
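A minimal sketch of this pattern, assuming ssm_action_fn takes no arguments; the accessor that hands the function pointer out of state_machine.c is a hypothetical name, and the actual registration call into the Maxim GPIO driver is omitted:

    /* state_machine.h */
    typedef void (*ssm_action_fn)(void);
    ssm_action_fn state_machine_motion_action(void);   /* hypothetical accessor */

    /* state_machine.c -- the state machine's data stays private to this file;
     * outside code only ever sees an opaque function pointer. */
    static void motion_sensor_trigger(void)
    {
        /* transition IDLE -> SEARCH (details omitted) */
    }

    ssm_action_fn state_machine_motion_action(void)
    {
        return motion_sensor_trigger;
    }

    /* PIR_sensor.c -- GPIO interrupt callback: the void* argument carries the
     * ssm_action_fn, so this file never touches the state machine directly.
     * (Casting between void* and a function pointer is not strictly portable C,
     * but it is the pattern described above and common on embedded targets.) */
    void PIR_motion_handler(void *ssm_action)
    {
        ssm_action_fn action = (ssm_action_fn)ssm_action;
        if (action) {
            action();
        }
    }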

Week 9: Integrating the face detector, timer interrupts, and state transitions

  • This week I set up the face detector. This part of the project is something I have been working on for most of the quarter in ECE 196, using a different board built around the same MAX78000 chip, so the majority of the work was porting the code to this new board. I also had to do some preprocessing of the image because the camera is rotated and outputs a 160x160 image. I decimated the image to 80x80, extracted the grayscale values, and then rotated the image to align it with the LCD orientation before feeding it into the CNN (a sketch of this preprocessing follows this list).

  • I also continued working on the state machine code and set the motion sensor to trigger the CNN inference. As shown in the video below, the LCD is not active by default. When there is motion (I shake my phone) the motion sensor triggers a transition from IDLE --> SEARCH. The SEARCH state looks for a face in the image; if one is present, it transitions to POSITIONING, which draws a bounding box. It is a little hard to see in the video, but if no face is detected then no bounding box is displayed and the state machine returns to the SEARCH state.

  • I also set up the timer interrupt for the state machine. If a state is stuck for ~10 seconds, the state machine will reset and return to the IDLE state (a sketch of this timeout logic also follows this list).

  • The final steps are to hook up the IR temperature sensor, figure out how to get measurements from it, display text on the LCD, and set up the remaining components of the state machine.
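Here is a rough sketch of the frame preprocessing described above. The rotation direction is an assumption (it depends on how the camera is mounted relative to the LCD), and the input buffer is assumed to already hold the extracted grayscale values:

    #include <stdint.h>

    #define CAM_DIM 160   /* raw camera frame is 160x160 */
    #define CNN_DIM 80    /* CNN input is 1x80x80 */

    /* Decimate the 160x160 grayscale frame to 80x80 by keeping every other
     * pixel in both dimensions, then rotate 90 degrees (clockwise here) so
     * the image matches the LCD orientation before it goes into the CNN. */
    static void preprocess_frame(const uint8_t *gray160, uint8_t *cnn80)
    {
        for (int y = 0; y < CNN_DIM; y++) {
            for (int x = 0; x < CNN_DIM; x++) {
                uint8_t px = gray160[(2 * y) * CAM_DIM + (2 * x)];
                /* 90-degree clockwise rotation: (row y, col x) -> (row x, col CNN_DIM-1-y) */
                cnn80[x * CNN_DIM + (CNN_DIM - 1 - y)] = px;
            }
        }
    }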
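And a sketch of the ~10 second timeout that resets a stuck state back to IDLE, assuming a periodic timer interrupt; the function names and tick period are placeholders, not the actual driver hooks:

    #include <stdint.h>

    #define TICK_PERIOD_MS   100      /* assumed timer interrupt period */
    #define STATE_TIMEOUT_MS 10000    /* ~10 s before a stuck state is reset */

    static volatile uint32_t ms_in_state;

    /* Called from the periodic timer ISR: track how long the machine has sat
     * in the current state and force a reset to IDLE once the budget is used up. */
    void state_machine_tick(void)
    {
        ms_in_state += TICK_PERIOD_MS;
        if (ms_in_state >= STATE_TIMEOUT_MS) {
            ms_in_state = 0;
            /* reset_to_idle(); -- transition back to IDLE */
        }
    }

    /* Every successful state transition clears the counter. */
    void state_machine_on_transition(void)
    {
        ms_in_state = 0;
    }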

20210305_155745_1536x864.mp4

** Additional late update. I got the POSITIONING state working pretty well. As shown in the video, a predefined target box and center dot are drawn in black while in the POSITIONING state. The user needs to align their face (the yellow box) with the black box. If it is aligned within the defined threshold (~ +/- 15 pixels), the black dot turns green. I plan to display helper text on the screen, such as 'move left'. Getting the bounding box dimensions right is quite difficult (a smaller, farther-away face should get a smaller bounding box), and given my limited training data the dimensions will likely be somewhat off.
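A minimal sketch of the alignment check, assuming the detected face box and the target box are both expressed in LCD pixel coordinates (the struct and function names are illustrative):

    #include <stdbool.h>
    #include <stdlib.h>

    #define ALIGN_THRESHOLD_PX 15   /* ~ +/- 15 pixel alignment tolerance */

    typedef struct {
        int x0, y0, x1, y1;   /* bounding box corners */
    } bbox_t;

    /* Return true when the detected face box is centered on the target box
     * within the threshold; the caller then draws the center dot in green
     * instead of black. */
    static bool face_is_aligned(const bbox_t *face, const bbox_t *target)
    {
        int face_cx   = (face->x0 + face->x1) / 2;
        int face_cy   = (face->y0 + face->y1) / 2;
        int target_cx = (target->x0 + target->x1) / 2;
        int target_cy = (target->y0 + target->y1) / 2;

        return abs(face_cx - target_cx) <= ALIGN_THRESHOLD_PX &&
               abs(face_cy - target_cy) <= ALIGN_THRESHOLD_PX;
    }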

20210305_221304.mp4

Week 10: Adding the temperature sensor and finishing the state machine

  • This week I figured out how to get temperature values from the IR sensor using I2C. The output temperatures seem a bit low, so I may need to add an offset or calibrate the sensor against some ground-truth reference measurements (a sketch of a simple one-point calibration follows this list).

  • I integrated the MEASUREMENT state into the state machine, so the entire system is now functioning, though it still needs to be tweaked and tested. For example, I may want to set different timeout values for each state.

  • The MAX78000FTHR board also has a PMIC, so it can be powered from an external battery. I ordered a compatible battery online and will integrate it into the final project. (Update: unfortunately the battery will not arrive before the project deadline, so I will power the board with a power bank.)

  • One last thing I still need to do is figure out how to display text on the LCD.
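A minimal sketch of the one-point calibration idea for the IR sensor readings. The raw I2C read is left abstract, and the function names are placeholders:

    /* Simple one-point calibration: the sensor reads consistently low, so store
     * a fixed offset derived from a trusted reference (e.g. a contact thermometer)
     * and apply it to every subsequent reading. */
    static float temp_offset_c = 0.0f;

    /* Record the correction once, using a ground-truth reference measurement. */
    void temp_calibrate(float sensor_reading_c, float reference_c)
    {
        temp_offset_c = reference_c - sensor_reading_c;
    }

    /* Apply the stored offset to a raw sensor reading (in degrees Celsius). */
    float temp_corrected(float sensor_reading_c)
    {
        return sensor_reading_c + temp_offset_c;
    }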