Below is a block flow diagram that represents how the code for ARGOS will be structured. Starting at the top, input from the remote user is recorded in the form of audio, video, and gaze location. The audio and video are transmitted over the internet to the local tablet display, and audio and video are likewise sent back from the local side to the remote side.

On the local side, the gaze location is mapped onto the screen. The horizontal gaze location is used to calculate the desired pan, which is then translated into motor speed and duration commands that drive both pan motors. The vertical gaze location is used to calculate the desired tilt, which is converted into commands for the single motor that directly controls the screen tilt. In both cases, the motor instructions take the form of voltage outputs and the time durations for which those voltages are applied. Finally, potentiometers record how far the system actually physically turned, and their readings are fed to a controller that compares the actual movement to the desired movement and adjusts the motor commands accordingly.
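To make this pipeline concrete, the sketch below walks through one pan/tilt cycle: mapping normalized gaze coordinates to desired angles, converting an angular error into a voltage-plus-duration motor command, and applying one feedback correction from a potentiometer reading. It is a minimal illustration only; the function names, angle ranges, drive voltage, and motor speed constant (PAN_RANGE_DEG, DEG_PER_SEC_AT_6V, and so on) are assumptions, not values from the actual ARGOS design.

```python
"""Illustrative sketch of the gaze-to-motion flow described above.

All constants and names here are hypothetical placeholders, not
values taken from the ARGOS design documents.
"""

from dataclasses import dataclass

# Assumed hardware parameters (hypothetical).
PAN_RANGE_DEG = (-90.0, 90.0)    # horizontal sweep covered by the two pan motors
TILT_RANGE_DEG = (-30.0, 30.0)   # vertical sweep of the single tilt motor
MOTOR_VOLTAGE = 6.0              # fixed drive voltage (V); sign selects direction
DEG_PER_SEC_AT_6V = 45.0         # assumed rotation speed at the drive voltage


@dataclass
class MotorCommand:
    voltage: float     # signed voltage output to the motor
    duration_s: float  # how long that voltage is applied


def gaze_to_angles(gaze_x: float, gaze_y: float) -> tuple[float, float]:
    """Map normalized gaze coordinates (0..1 across the remote screen)
    to desired pan and tilt angles on the local unit."""
    pan = PAN_RANGE_DEG[0] + gaze_x * (PAN_RANGE_DEG[1] - PAN_RANGE_DEG[0])
    tilt = TILT_RANGE_DEG[0] + gaze_y * (TILT_RANGE_DEG[1] - TILT_RANGE_DEG[0])
    return pan, tilt


def angle_to_command(delta_deg: float) -> MotorCommand:
    """Translate an angular error into the voltage-plus-duration
    command form described in the text."""
    direction = 1.0 if delta_deg >= 0 else -1.0
    duration = abs(delta_deg) / DEG_PER_SEC_AT_6V
    return MotorCommand(voltage=direction * MOTOR_VOLTAGE, duration_s=duration)


def closed_loop_step(desired_deg: float, actual_deg: float) -> MotorCommand:
    """One feedback correction: compare the potentiometer reading
    (actual angle) against the desired angle and command the residual."""
    return angle_to_command(desired_deg - actual_deg)


if __name__ == "__main__":
    # The remote user looks at the upper-right region of the screen.
    pan_goal, tilt_goal = gaze_to_angles(gaze_x=0.8, gaze_y=0.7)
    print(f"desired pan {pan_goal:.1f} deg, tilt {tilt_goal:.1f} deg")

    # Initial open-loop command, then one correction using a simulated
    # potentiometer reading in which the motors undershot by 10%.
    print("pan command:", angle_to_command(pan_goal))
    simulated_pot_reading = 0.9 * pan_goal
    print("correction:", closed_loop_step(pan_goal, simulated_pot_reading))
```

In a real implementation this correction step would run repeatedly, re-reading the potentiometers after each move until the error falls below a tolerance, which is the comparison loop the controller in the diagram performs.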