EE444 Embedded Systems Design - Student Projects: Gonowon

Disclaimer: This page was created by Jessica Gonowon.

All opinions expressed here are those of the author(s) and not of D. Raskovic.

Imaging System for UAF Alaska Research Cube-satellite (ARC)

Jessica Gonowon

University of Alaska Fairbanks

May 11, 2011

EE 444 Embedded Systems Design

Abstract— Acquiring images of arctic regions to observe snow coverage and glacial areas is the main task of the microcontroller-based imaging system and fulfills one of the science mission objectives for the ARC. The imaging system is designed to use an image sensor, a first-in first-out (FIFO) cascaded buffer, and an MSP430 microcontroller to manage image processing and storage. Communicating image data to a separate subsystem using the serial peripheral interface (SPI) protocol is a key aspect of the imaging system and contributes to the orbital missions of the ARC.

Index Terms—camera, cubesat, FIFO, SPI

I. Background

The concept design for ARC was developed at UAF in the EE 656 Space Systems Engineering course in Fall 2009 by an interdisciplinary group of students interested in satellite engineering. This concept design included an imaging system to provide visual data of Earth's arctic regions during the ARC mission. The imaging system was inspired by the AAU-CubeSat, a satellite developed by students at Aalborg University, Denmark, in 2001. A working camera on the ARC is further supported by other cubesat projects that have included an imaging system, such as the XI of the University of Tokyo in 2004, the CanX-1 of the University of Toronto in 2003, and the DTUSat of the Technical University of Denmark [1]. The image sensor board will be placed on the Z-axis of ARC, which will face Earth. All images captured will require a UTC time stamp for image processing reference. The imaging system has two modes: active and sleep. During active mode, ARC is located above the arctic and power is used to capture images. During sleep mode, ARC is away from the polar region and power consumption is greatly reduced. The imaging system will idle at low power for the majority of the ARC mission. After pictures are taken, image processing will take place on Earth once the images are downloaded.

II. Objective

The image board will use a Texas Instruments MSP430F2618 microcontroller, an OmniVision OV10620 low-power camera, two cascaded Averlogic AL440B FIFO buffers, and external flash memory. As shown in Figure 1, the main operational process begins with the MSP430 executing its programmed C instructions. When the MSP430 receives a signal to take a picture, it forwards this signal to the camera to carry out the task. The bits of the image taken by the camera are then sent to the cascaded buffer, which acts as a pipeline to the MSP430. The microcontroller takes these image bits and sends them to flash memory for permanent storage. Because the camera operates at a much higher frequency than the flash memory can accept data, the cascaded buffer is our current solution to this frequency mismatch. When the signal to retrieve a picture is received, the microcontroller reads the stored picture from flash memory and sends it to the communications board, then on to the user on Earth. Other features of the system include accessing different pictures stored in flash memory and clearing the flash memory of retrieved images.

Fig. 1. Image Board System Level layout and main imaging process

III. Requirements

- Shape of image board will conform to the ARC standard board outline

- Sleep power mode (off-state) that draws 0 - 10 mA

- Active power mode (on-state) that draws approximately 250 mA

- Take one picture and store that picture to memory within 180 s

- Flash memory can store at least 10 pictures

- Well-documented C instructions for MSP 430

- Well-documented oscilloscope pictures of image board inputs and outputs

- Acceptable quality of pictures taken

IV. Proposed Timeline

The proposed timeline for the imaging system is shown in Figure 2. The first portion of this project began with familiarizing myself with each component's layout and researching standard ways to write instructions for the MSP430 in C. I used the Rowley CrossStudio cross-compiler for the MSP430. Greg Geiger—a UAF graduate student in electrical engineering who has worked with the image board—had already written some C instructions for the MSP430. These programs were used as a guideline, with changes made to fit the objectives of the image board project. The next phases were to test the components of the imaging system and rebuild it on the pre-made development boards provided. The last phase is to test the system as a whole and to integrate its communication processes with the rest of the ARC as a group project. The balloon flight has been rescheduled for August 2011 and will incorporate the imaging system.

Fig. 2. Proposed Timeline for imaging system

V. Methods and Results

A. Approach 1: Test Operation of Current Imaging System

When I tested the previously built imaging system with an oscilloscope, I observed that the image signals from the camera were not propagating to the outputs of the cascaded RAM buffer. According to the OmniVision camera datasheet, outputting image signals in parallel does not require I2C configuration changes; the eight blue parallel data lines are shown in Figure 3 [2].

B. Approach 2: Rebuild Imaging System to fit on Pre-Made Development Board

Building a daughter board to fit on the development board would provide accessible pins to test the propagation of the image signals. I used Cadence PCB Editor to design routed traces between the cascaded RAM buffer, connection headers, the SD card flash memory port, and various passive components included in the schematic (Figure 4). In this design, the camera is oriented on the daughter board so that its lens points in the negative Z-direction. In other words, pin 1 on the camera is located on the top right of the connection header as described in the OmniVision datasheet. Figure 5 shows a picture of the milled daughter board. I did not pursue a third attempt after two failed milled daughter boards.

Fig. 3. Previous imaging system built by Greg Geiger in Spring 2010. Image signals from the camera were not propagating to the outputs of the first RAM buffer.

Fig. 4. PCB Editor 2-layer daughter board design with cascaded RAM buffer in the middle, SD card port to the right, connection headers to the MSP430-2618 to the left, right, and top, and connection header to the camera on the bottom.

Fig. 5. Milled daughter board fitting on development board with camera facing negative Z-axis

C. Approach 3: Rebuild Imaging System on Breadboard

I rebuilt the project on a breadboard so that wires could be easily modified if necessary. One breakout board was designed for testing the surface-mount RAM chip. I measured the image propagation and found the same problem as before, so I isolated and tested the RAM buffer on its own. According to the datasheet of the RAM chip, the output ready (ORDY) status flag should be set high when data has been written to the buffer [3]. However, ORDY measured a logic low of about 6 mV during testing.

Fig. 6. Isolated RAM breakout board ready for testing

D. Approach 4: SPI Communication Between two ARC subsystems

Even though the imaging system may be a standalone project, it is more importantly part of a group project with the ARC communications subsystem designed by Sam Vandervaal. According to Davies, the SPI protocol between the master device (communications subsystem) and the slave device (imaging subsystem) requires compatible SPI register settings [4]. For our SPI protocol, the communications subsystem sends two bytes, 'I' and 'C', for "image capture." When 'IC' is received by the imaging subsystem, it should enter the interrupt service routine and send any available image data to the transmit buffer (TXBUF) one byte at a time, because TXBUF is only eight bits long [5]. For initial testing, an array of alphanumeric characters from the slave device is used for transmission. The measured signals from the master device were as expected, but the data transmitted back to the master was an inconsistently shifted version of the expected pattern. Thus, the request for data retrieval was accepted, but the content of the transmission was incorrect. Figures 7 and 8 show the two subsystems connected for SPI communication and an oscilloscope capture of the SPI signals, respectively. I used the USCI B1 SPI module specified in the MSP430F2618 datasheet [6].

Fig. 7. Communications subsystem connected to MSP430-2618 through 4 SPI lines. The blue line is the master clock. The purple line is STE from the master device. The white line is slave-in-master-out (SIMO) data. The red line is slave-out-master-in (SOMI) data.

Fig. 8. Oscilloscope capture of SPI signals. Master sends "IC" to slave; slave attempts to transmit a succession of 'A's

VI. Project Considerations

A. Engineering Standards to Consider

The main hurdles preventing the imaging system from being successful are writing data to the RAM buffer and transmitting correct data from the slave device to the master device. Thus, there were several engineering standards I should have taken into account while working on this project. First, I should have tested the buffer separately after I found that image signals were not being written to the first RAM buffer of Geiger's board. Since this was a problem with the previous imaging system, it should have been apparent that it would also appear in my rebuilt system. Second, a design review for having a daughter board professionally built should only have been considered after the imaging system was completely working. Design reviews are common among engineering teams and help members decide whether or not a working design is ready to be implemented in the overall system. Avoiding the daughter board method would have saved a lot of time, and I could have focused more on programming the imaging system. Finally, I should have spent more time understanding the buffer datasheet so that I could better troubleshoot the buffer problem. Again, this would have saved a lot of work, especially at the end of the project. I will continue to troubleshoot these difficulties until an acceptable working imaging system is implemented.

B. Project Constraints

Some parts of the imaging system are working. The camera is able to take pictures if given a clock and an enable signal. The MSP430 can process given data in our SPI protocol. However, my project is incomplete because the main goal of the imaging system is not fulfilled, particularly the software requirements. Data is not being written to the buffer, and the external SD flash memory is not yet interfaced with the MSP430. A compression scheme and a timestamp scheme need to be implemented for all images stored on the external memory. The imaging system also lacks the two other commands used when communicating with the communications subsystem: "ACK" for acknowledgement and "RI" for retrieve image. A physical constraint the imaging system faces is providing a support structure for the camera. Even though the camera fits well on the daughter board, the weight of the lens is not supported and may be a problem when travelling at high speeds in orbit. Fortunately, it is possible to have a working imaging system before the balloon test flight in August 2011.

C. Project Ethical Issues

The concept of the imaging system is simple, but there are certain ethical and legal issues that could cause this system to be of poor engineering design. First, when the imaging system takes pictures, it could potentially capture images containing private property or non-consenting individuals. If these pictures were published without the permission of the individuals, the project would not be protecting the rights of the general public. However, since the ARC will be in orbit around Earth, the distance to the surface and the resulting image resolution will be such that private property and individuals will not show up in the images taken. The second issue is the imaging system taking pictures of Earth without permission from the National Oceanic and Atmospheric Administration (NOAA). But since the ARC mission is a NASA-supported project, permission to take pictures of Earth's surface is most likely granted. Another ethical issue is that of modifying images to falsely represent the actual data taken. For example, if an imaging system user decides that an image was unacceptable for scientific display, they may use software techniques to make the image more impressive. This is ethically dishonest and provides inaccurate visual data from ARC's orbit, which does not satisfy the ARC science mission objective.

VII. Conclusion

Overall, the imaging system allowed me to practice the design cycle applied to a microcontroller-based project. Through the mistakes I made during this project, important design guidelines have been solidified. First, I have learned to stick with a proposed timeline. Even if another action item on the timeline can be addressed, tasks should still be done in the order I originally proposed. Second, I learned to overestimate the time for tasks specified on the timeline; for most of the action items, I underestimated how long they would take to completely finish. Next, I learned that while working through hardware issues, I should also be developing the software that will run on the finished product. Despite my numerous mistakes, I hope to continue working on my own microcontroller projects, practicing what I have learned from this project.

References

[1] Earth Observation Portal. http://www.eoportal.org/directory/pres_CubeSatLaunch1.html. January 24, 2008.

[2] "OmniVision Advanced Information Datasheet for OV10620 CameraChip Sensor." OmniVision Technologies Inc. Version 2.0. August 2008.

[3] "Averlogic AL440B Datasheet." Averlogic Technologies Inc. Version 1.0. November 2001.

[4] Davies, John H. MSP430 Microcontroller Basics. Newnes, 2008.

[5] "MSP430x2xx Family User's Guide SLAU144F." Texas Instruments. December 2010.

[6] "MSP430F261x Mixed Signal Microcontroller Datasheet SLAS541E." Texas Instruments. January 2009.