The ShareTable system consists of two identical ShareTables, one in the home of the parent and one in the home of the child. To place a call through the system, the user simply needs to open the doors of the ShareTable cabinet. The paired table in the other household rings for 1 minute, as would a phone. To answer the call, the remote user needs to open their cabinet doors. Once a call is connected, audio is shared and the monitor screen of the ShareTable shows a standard face-to-face videoconferencing view (large view of the remote participant and a small view showing self). Additionally, the local table surface of the ShareTable now shows a projected view of the remote table surface and vice versa. To end the call, either side simply has to close the cabinet doors.
Challenge 1: Layering Physical Artifacts
In order to support layering physical artifacts in a realistic way, we implemented the ShareTable using top-down projection. For example, if the parent places a physical token on a projected game board, top-down projection allows the projected token to appear on top of the child’s physical board rather than being projected, unseen, onto the underside of the board. Similarly, if a parent writes a comment on top of a projected worksheet, top-down projection allows this annotation to be displayed on top of the physical worksheet.
Challenge 2: Removing Visual Echo
Visual feedback or “echo” is a major concern in a camera-projector system. Unmodified, the camera records an image of the projected artifact and sends it back to the originating surface. If the physical artifact is moved, an echo of its projection remains on the surface. If projected images are re-projected without intervention, the resulting image grows progressively brighter and less clear. Without some way to filter projected artifacts from real ones, the ShareTable would be unusable due to this feedback effect. We wanted a lightweight way to eliminate visual feedback, so we used linear polarizing lenses to filter the projected artifacts from the physical ones. Light that passes through a linear polarizing lens becomes polarized and is blocked by a second lens whose polarization axis is perpendicular to the first. Thus, by attaching lenses with perpendicular polarization to the camera and projector, we prevent projected artifacts from being recaptured and re-projected. To preserve the polarization of the light once it strikes the table surface, we use a non-depolarizing silver lenticular projection screen as the surface backdrop.
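The effect of the crossed polarizers can be sanity-checked with Malus’s law, which gives the fraction of polarized light transmitted through an analyzer at a given angle. The snippet below is an illustrative calculation only (the function name and values are ours, not part of the ShareTable code): at 90° the projector’s polarized light is fully blocked, while unpolarized room light reflecting off physical artifacts still passes at roughly half intensity, so the real artifacts remain visible to the camera.

```python
import math

def transmitted_intensity(i0, theta_deg):
    """Malus's law: intensity of polarized light of intensity i0
    transmitted through an analyzer whose axis is theta_deg degrees
    away from the light's polarization axis."""
    return i0 * math.cos(math.radians(theta_deg)) ** 2

# Projector light re-entering the camera's perpendicular polarizer:
echo = transmitted_intensity(1.0, 90)  # effectively zero
# Unpolarized light from physical artifacts averages cos^2 over all
# angles, so about half of it passes and the artifacts stay visible.
```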
Challenge 3: Easy-to-Initiate Connection
To make the system as easy to use as the phone, we incorporated a kiosk-like activation. To attempt to contact the parent or to answer an incoming call, the child must simply open the doors of the cabinet. To end a call, either party can simply close these doors. As an added benefit, the open doors create a cubby-like area for the interaction, which engenders a sense of privacy.
The ShareTable uses a Dell Inspiron 530s, customized with an ATI Radeon HD 3450 256MB HDMI video card to allow for multiple monitors. Video and audio are captured by a QuickCam Pro 9000 web camera. The face-to-face video is shown on a standard 15-inch flat-panel monitor; the audio is played through a Dell AX210 stereo speaker system. This is essentially a low-cost, off-the-shelf system.
The door sensor consists of a Reed switch mounted on the body of the cabinet door and a magnet mounted to the main cabinet of the table. When the Reed switch is close to the magnet, a signal is sent to the system. Since this does not require actual physical contact, the sensor is robust and tolerant of actions such as slamming the door. The Reed switch is connected to an Arduino prototyping board, which communicates with the main program via USB.
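On the host side, the raw switch readings arriving over the Arduino’s USB-serial link benefit from debouncing, so that contact bounce during a door slam does not register as several spurious open/close events. The class below is a simplified Python reconstruction for illustration rather than the original code; the class name, threshold, and sampling scheme are our own choices.

```python
class DoorSensor:
    """Debounce raw Reed-switch samples (True = magnet near switch,
    i.e., doors closed). The reported state flips only after
    `threshold` consecutive identical raw samples, so brief bounce
    from a slammed door is ignored."""

    def __init__(self, threshold=3, initial=True):
        self.threshold = threshold
        self.state = initial        # debounced state
        self._candidate = initial   # value we might flip to
        self._count = 0             # consecutive samples of candidate

    def update(self, raw):
        if raw == self.state:
            self._count = 0                 # bounce ended; stay put
        elif raw == self._candidate:
            self._count += 1
            if self._count >= self.threshold:
                self.state = raw            # stable change: commit it
                self._count = 0
        else:
            self._candidate = raw           # start tracking new value
            self._count = 1
        return self.state
```

In the real system, a committed state change would be forwarded to the main program to start or end a call.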
The ShareTable uses a Dell 2400MP projector mounted four feet above the table surface, resulting in a 20'' × 16'' projected image. Commands are sent to the projector over an RS-232 connection. When not in use, the projector remains in standby mode.
The tabletop surface image is captured by a 207MW Axis Camera at 1024 by 768 resolution and sent directly to the partner table at 5 fps using the Axis Camera's internal server. The cameras were connected directly to the router via a physical Ethernet connection in three of the homes. In the last home, the camera was connected to the router via a networking-over-power-line connection.
Finally, each ShareTable includes components to reduce the "visual echo" of the tabletop. We used a polarization solution as a low-tech, robust approach to avoid recapturing projected artifacts. Each projector and each Axis camera is outfitted with a custom mount for a linear polarizing lens. Projectors thus emit light that is polarized orthogonally to the light that the cameras receive, which breaks the visual echo feedback loop. One last modification keeps the light from depolarizing when it hits the tabletop surface: each surface is covered with a silver lenticular projection screen (the kind used for projecting 3D movies, for example). To protect the screen and to make the table dry-erase-marker friendly, the screen is covered in a thin layer of Plexiglas.
In order to correctly overlay the image from the partner table onto the physical space of the local table, the ShareTable system must account for the barrel distortion introduced by the Axis cameras and crop the image to include only the relevant portion of the view. We modified an existing Python OpenCV solution to generate two matrices specifying the distortion of the Axis cameras after collecting 20 images of a checkerboard pattern held at varying angles to the camera. We then created a custom Visual C# component that applies these matrices to the image and allowed us to select the relevant tabletop area for each camera. The ShareTable applies these cropping points and de-warping matrices to each image received from the remote Axis camera.
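The de-warping step can be illustrated with a one-coefficient radial model, a simplified stand-in for the full OpenCV distortion model the system actually uses (the function names and the single `k1` coefficient are our simplification). For each pixel of the corrected output image, the forward distortion model tells us where to sample the captured image; for barrel distortion, `k1` is negative.

```python
def distort(x, y, k1, cx, cy):
    """Forward one-coefficient radial model: where an ideal
    (undistorted) point lands in the captured image. (cx, cy) is the
    distortion center; barrel distortion corresponds to k1 < 0."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    s = 1 + k1 * r2
    return cx + dx * s, cy + dy * s

def undistort_image(img, k1, cx, cy):
    """Build the corrected image by inverse mapping: for each output
    pixel, apply the forward model to find the source pixel in the
    distorted image (nearest-neighbor sampling, the same idea behind
    OpenCV's remap)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx, sy = distort(x, y, k1, cx, cy)
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y][x] = img[sy][sx]
    return out
```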
The main ShareTable system is programmed in Visual C# and consists of the following components: the status database, the face-to-face videoconferencing, the tabletop video, and the logging infrastructure.

Each ShareTable must know the current status of its partner in order to appropriately start and end sessions. To achieve this, each ShareTable pings a MySQL database with its status every second.

The face-to-face video and audio are achieved by leveraging the Skype4COM API to initiate a full-screen videoconferencing session. Optionally, the ShareTable can be set to use the TokBox API instead for face-to-face video and audio, depending on the constraints faced by the designer (TokBox is much more CPU-intensive, but Skype has stricter intellectual property constraints). The ShareTable surface is a C# component that extends the Axis Media Control API to display a cropped and de-warped image (see above) of the partner surface.

Finally, the ShareTable logs system use in two ways. First, any time the status of the system changes (e.g., going from "Call Routing" to "Call In Progress"), a message is added to a text file marking the time and reason for the transition. Second, any time a call is successfully placed, the ShareTable records video of the local participant and their table surface. Video recording is achieved using a QuickCam Pro 9000 web camera mounted on the top shelf of the table. Whenever a call is connected, the ShareTable initiates video and audio recording by issuing a command-line call to Flash Media Live Encoder (used on a low-CPU setting). All software for the ShareTable is available for download here.
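The call lifecycle that the door sensors drive can be sketched as a small state machine. The version below is a simplified Python reconstruction for illustration (the original component is C# and coordinates through the MySQL status database): the state names "Call Routing" and "Call In Progress" come from the log format described above, while the method names and transitions are our own reading of the interaction, and the one-minute ring timeout is omitted.

```python
import time

class ShareTableSession:
    """Minimal sketch of one table's call state machine: opening the
    local doors starts routing a call, the partner opening theirs
    connects it, and either side closing its doors returns to Idle.
    Every transition is logged with a timestamp and a reason,
    mirroring the text-file logging described above."""

    def __init__(self, log=None):
        self.state = "Idle"
        self.log = log if log is not None else []

    def _transition(self, new_state, reason):
        self.log.append((time.time(), self.state, new_state, reason))
        self.state = new_state

    def local_doors(self, opened):
        if opened and self.state == "Idle":
            self._transition("Call Routing", "local doors opened")
        elif not opened and self.state != "Idle":
            self._transition("Idle", "local doors closed")

    def remote_doors(self, opened):
        if opened and self.state == "Call Routing":
            self._transition("Call In Progress", "remote doors opened")
        elif not opened and self.state == "Call In Progress":
            self._transition("Idle", "remote doors closed")
```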
Figure 2. Video of the table surface is projected onto the remote table. The standard videoconferencing system allows for a face-to-face audio/video connection.
Figure 3. A board game being played via the ShareTable. As shown, physical artifacts on one side (the game board, the hand, the tokens, the die) are projected onto the other side.
If you want to use the ShareTable code, you are welcome to do so, with the appropriate attributions. If you have any questions about any aspects of the code, feel free to contact Lana Yarosh (lana AT cc DOT gatech DOT edu) for help.