horndogs: CS344R Aibo Lab group

Code Setup

1) Open a Linux terminal. Type $cd to go to your home directory, then check out the code:

$svn checkout svn+ssh://ras.ece.utexas.edu/var/lib/svn/horndogs

2) This step just makes your life easier; symlinks are cool!

$ln -s horndogs/trunk/trunk trunklink

$ln -s horndogs/trunk/trunk/shared/lua lualink
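The setup steps above can be tried safely in a scratch directory first. This sketch simulates the checkout layout with mkdir (a stand-in for the svn checkout) so the symlink commands can be tested without network access; in real use, run the ln commands from your home directory after the checkout:

```shell
# Simulated run of the setup above: mkdir stands in for `svn checkout`,
# so the symlink commands can be tried without touching the repository.
demo=$(mktemp -d)
cd "$demo"
mkdir -p horndogs/trunk/trunk/shared/lua   # stand-in for the checkout
ln -s horndogs/trunk/trunk trunklink
ln -s horndogs/trunk/trunk/shared/lua lualink
readlink trunklink   # shows where each link points
readlink lualink
```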

Code Update and Commit Procedures

1) Every time you want to write code, you must first update with SVN:

$cd

$svn update horndogs (or just $svn update from inside the horndogs directory)

SVN will report the revision you are now at.

2) When you are done editing a file, make sure you commit it!

$cd ~/lualink

$svn commit comptask.lua

SVN will then ask you to write a log message; type your comments, exit the editor (^X in nano), answer Yes, press Enter, and SVN will confirm the commit.
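If you'd rather skip the editor prompt entirely, svn accepts the log message inline with -m. Here's a dry run; the leading echo only prints the command, so nothing is actually committed:

```shell
# Dry run: echo only prints the command. Remove the echo (and run from
# ~/lualink) to actually commit with an inline log message, no editor needed.
cmd='svn commit -m "fixed head scan" comptask.lua'
echo "$cmd"
```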

If this doesn't work, email me: lyncas AT gmail DOT com

VIM Tips

I've created my own vimrc file with syntax highlighting and quick-editing settings for Python, Lua, C/C++, and Perl, thanks to a VIM tutorial.

Lab 1 - Tasks

Checklist:

This assignment is worth 5 points. Here's how you earn them. Partial credit is possible.

1) In init.lua, change PRESSED to TAPPED for Mid Penalize

2) Set up STATES to alternate accordingly in the PlaySoccer loop

    • (0.5 points) Demonstrate the ability to read the changing values from the Aibo's sensors as useful data in your program, and then display them on your workstation.

You can display changing values on either the telnet screen or in UTAssist. Pick something specific to show off, e.g., a leg joint sensor value, and show that it changes when the robot walks.

1) In comptask.lua, call the GetPan function (check this!)
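A minimal sketch of the idea, assuming GetPan (the function the notes mention) returns the current head pan value; its exact return type is an assumption here. Polling it in a loop and printing lets you watch the value change on the telnet screen while the robot moves:

```lua
-- Sketch only: GetPan is the function named in the notes above, but its
-- return value is an assumption; check comptask.lua for the real signature.
function watchJoint()
  for i = 1, 100 do
    print(string.format("step %d: pan = %.3f", i, GetPan()))
  end
end
```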

    • (0.5 points) Same for the camera image.

If you can connect to the Aibo through UTAssist and display the segmented image, you are done. All teams have already demonstrated this.

1) Follow the Running Software notes

    • (0.5 points) Demonstrate the ability to detect and track a colored blob in the camera image with the head held still. (Information on specifying colors to be provided on the wiki.)

Again, this has been demonstrated by all teams with the orange ball. Further things that *can* be done (but are not required) are: improve the filtering (i.e., how the Aibo recognizes the ball), recognize the pink ball that comes with the Aibo, recognize blobs of other colors.

1) Look through comptask.lua for the relevant functions

    • (0.5 points) Demonstrate that you can control sitting, standing, and head-turning.

You *don't* have to demonstrate "sitting". Standing and head-turning have been demonstrated, but I want you to show the head move in a different pattern from the left-right scan provided.

1) Look up the functions: Head Scan, Stand, Sitting

2) In another terminal, load Lua, run comptask.lua, and call HeadScan

    • (0.5 points) Demonstrate that you can control walking: forward and turning.

This is straightforward.

1) Load up the Lua interpreter, load comptask.lua, and call Move (with no change in thetaV)

    • (0.5 points) Demonstrate that your Aibo can walk in a curve: forward and turning at the same time.

Pick a pattern to show off, e.g., have the Aibo walk in a circle, an s-curve, etc.

1) Load up the Lua interpreter, load comptask.lua, and change RelHeading at even intervals; try doing squares first
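The squares idea above can be sketched roughly as follows. This is not the course API: the (forwardV, sideV, thetaV) signature for Move and the Sleep delay helper are hypothetical stand-ins; check comptask.lua for the real calls.

```lua
-- Hedged sketch: Move's argument order and the Sleep helper are guesses,
-- not the real comptask.lua interface.
function walkSquare()
  for leg = 1, 4 do
    Move(1.0, 0.0, 0.0)   -- walk straight: forward velocity only
    Sleep(3.0)            -- tune for the side length you want
    Move(0.0, 0.0, 1.0)   -- turn in place
    Sleep(1.5)            -- tune until the turn is about 90 degrees
  end
  Move(0.0, 0.0, 0.0)     -- stop
end
```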

    • (1 point) Demonstrate that your Aibo can move its head to keep the visible blob from a colored ball near the center of the image, if the ball isn't moving too fast.

Demonstrate this for the orange ball or the pink ball.
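One way to think about the ball-centering task is as a proportional controller on the blob's offset from the image center. This sketch assumes hypothetical function names (GetBlob, GetHeadAngles, SetHeadAngles) standing in for whatever comptask.lua actually provides:

```lua
-- Hedged sketch, not the course API: GetBlob, GetHeadAngles, and
-- SetHeadAngles are hypothetical stand-ins for the real comptask.lua calls.
GAIN = 0.002                -- head motion per pixel of error; tune on the robot
IMG_W, IMG_H = 208, 160     -- common Aibo segmented-image size; check yours

function trackBall()
  local blob = GetBlob("orange")     -- hypothetical: centroid table or nil
  if blob == nil then return end     -- no ball in view: hold still
  local errX = blob.x - IMG_W / 2    -- pixels right of center
  local errY = blob.y - IMG_H / 2    -- pixels below center
  local pan, tilt = GetHeadAngles()  -- hypothetical current head angles
  -- Nudge the head by a fraction of the error each frame; a small GAIN
  -- keeps it stable when the ball moves slowly.
  SetHeadAngles(pan - GAIN * errX, tilt - GAIN * errY)
end
```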

    • (1 point) Demonstrate that your Aibo can spot a colored patch in the distance, and walk toward that patch until it fills more than half of its camera image, and then stop. (The patch will be a different color from the ball, but we'll give you both things well in advance.)

The colored patch can be the orange ball, the pink ball, or any other patch of your choosing.
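The walk-to-the-patch task combines the two previous pieces: steer toward the blob while it is small, stop once it fills more than half the image. Again a sketch only, with GetBlob and Move as hypothetical stand-ins for the real comptask.lua calls:

```lua
-- Hedged sketch, not the course API: GetBlob and Move are hypothetical
-- stand-ins; see comptask.lua for the real calls.
IMG_AREA = 208 * 160   -- common Aibo segmented-image size; check yours

function approachPatch(color)
  local blob = GetBlob(color)           -- hypothetical: area/centroid or nil
  if blob == nil or blob.area > IMG_AREA / 2 then
    Move(0.0, 0.0, 0.0)                 -- lost it, or it fills half the image: stop
  else
    local turn = (blob.x - 104) / 104   -- normalized offset from center (208/2)
    Move(1.0, 0.0, -0.5 * turn)         -- walk forward, steering toward the patch
  end
end
```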