Research Projects 2012

Educational Location-based Game

"Operation Dino" - Game for the Academy of Natural Sciences

In our project, Operation Dino, we worked with the Academy of Natural Sciences to create an interactive educational experience around their dinosaur exhibition. Based on the museum's educational goals and the needs of its visitors, we developed a new interactive museum exhibit that takes place in the Dinosaur Hall. Using touch-screen tablets and QR tags, the exhibit is designed as an educational experience targeted at children ages 8-12 that extends the information learned in the museum into a fun and interactive game. Unlike most museum games, which are confined to a single kiosk, Operation Dino uses server technology to run multiple game stations throughout Dinosaur Hall.

Each game station contains a unique multiple-choice question that encourages players to explore the museum to answer it correctly and collect that station's explorer badge. There are many badges to collect, each able to unlock unique in-game animations and collectibles. The game itself features colorful, animated dinosaurs that guide players on their quest as an archeologist to collect all the badges and become the Master Archeologist. Future installations of the game will add a live scoreboard at the museum to encourage competition among players. The game also has the potential to extend outside the museum through an online portal that gives players access to educational content related to what they learned at the museum.
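The station-and-badge flow described above can be sketched as a small piece of shared server state: each station holds one multiple-choice question, a correct answer earns that station's badge, and collecting every badge earns the Master Archeologist title. This is a minimal illustrative sketch; the station IDs, answer encoding, and class name are assumptions, not the project's actual design.

```python
class BadgeServer:
    """Minimal sketch of Operation Dino's shared game state.

    Each station maps to the index of its correct multiple-choice
    answer; players earn a station's explorer badge by answering
    correctly. Station IDs here are made up for illustration.
    """

    def __init__(self, stations):
        # stations: {station_id: correct_choice_index}
        self.stations = stations
        self.badges = {}  # player_id -> set of earned station_ids

    def answer(self, player_id, station_id, choice):
        """Record an answer; return True and award the badge if correct."""
        earned = self.badges.setdefault(player_id, set())
        if choice == self.stations[station_id]:
            earned.add(station_id)
            return True
        return False

    def is_master_archeologist(self, player_id):
        # Collecting every station's badge earns the title.
        return self.badges.get(player_id, set()) == set(self.stations)
```

Because the state lives on a server rather than in any one kiosk, every tablet station can consult the same badge record as a player moves through the hall.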

  • Project Team

    • Glenn Winters

    • Nate Lapinski

    • Girish Balakrishnan

    • Sonia Havens

    • Kevin Gross

    • Yujie Zhu

    • Bobby Speck

    • Ali Hassanzadeh

    • Yiqun Shao

    • Ian Woskey

    • Tom Burdak

  • Project Advisors

    • Dr. Jichen Zhu

    • Dr. Paul Diefenbach

  • Museum Advisor

    • Jason Poole

Music-Reactive Gaming — "Pulse" and "Pulse 2"


Pulse is a music-reactive game that uses the player's own music library, dynamically combining music analysis with web-accessible lyrics, text, and images retrieved using the metadata in the song files. The team produced a web-deliverable 2D side-scroller game integrating these features that performs real-time analysis of songs in a user's music library to drive the gameplay, providing a novel form of game-music interaction. This work was presented at the IEEE Games Innovation Conference (ICE-GIC 2009) in London, England.
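One simple way music analysis can drive a side-scroller, in the spirit described above, is to map the short-time energy of the audio to the level's scroll speed, so louder passages push the game faster. This is a hedged sketch of that idea only; the function names, frame size, and speed mapping are illustrative assumptions, not Pulse's actual analysis pipeline.

```python
import math

def short_time_energy(samples, frame_size=1024):
    """RMS energy per non-overlapping frame of a mono sample stream."""
    frames = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / frame_size)
        frames.append(rms)
    return frames

def scroll_speed(rms, base=4.0, gain=12.0):
    """Map a frame's energy to side-scroller speed: louder audio
    pushes the level faster, clamped to a sane range."""
    return base + gain * min(rms, 1.0)
```

In a real-time setting the same mapping would be applied frame by frame as audio is decoded, one speed update per analysis window.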

    • Master's Students:

    • Christian Hahn

    • Kevin Hoffman

    • David Lally

    • Dan Letarte

    • Evan Boucher

    • Thomas Bergamini

    • Nicholas Avallone

    • Nicholas Deimler

    • Justin Wilcott

    • Le Joyce Tong

    • Advisors: Dr. Paul Diefenbach, Dr. Youngmoo Kim

Augmented Reality — "crAR"

crAR is an Augmented Reality research project investigating novel gameplay mixing physical and virtual objects. This work was featured at the Philadelphia Museum of Art during Philly Tech Week.

    • Advisor: Dr. Paul Diefenbach

Brain-Computer Interface Game — "Maxwell's Demon"

Maxwell's Demon is a 3D first-person puzzle game that uses brain-computer interface (BCI) devices as a core gameplay mechanic. Specifically, the game can be interfaced with an fNIR device or the NeuroSky EEG headset. The player must move from one side of the room to the other by walking on platforms. However, the path is incomplete, and the user must concentrate to activate special movable floor platforms that complete it. Additionally, the player must avoid physical, auditory, and visual distractions that attempt to break the player's concentration.
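The concentration mechanic described above can be sketched as a threshold with hysteresis applied to a normalized attention signal, so a platform does not flicker on and off as the reading hovers near the cutoff. The class name, signal range, and threshold values below are illustrative assumptions, not the game's actual tuning.

```python
class ConcentrationPlatform:
    """Activates a movable platform while a normalized attention
    signal (0.0-1.0, e.g. derived from an EEG headset) stays high.

    Hysteresis: the signal must rise above `activate_at` to switch
    the platform on, but only falls off below `deactivate_at`.
    """

    def __init__(self, activate_at=0.7, deactivate_at=0.5):
        self.activate_at = activate_at
        self.deactivate_at = deactivate_at
        self.active = False

    def update(self, attention):
        # Rising edge: require the higher threshold to turn on.
        if not self.active and attention >= self.activate_at:
            self.active = True
        # Falling edge: only drop out below the lower threshold.
        elif self.active and attention < self.deactivate_at:
            self.active = False
        return self.active
```

Called once per frame with the latest attention reading, this keeps a platform solid through brief dips in concentration while still letting distractions eventually deactivate it.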

Multitouch Gaming — "Planet Diggum"


While devices such as the Wii and the PlayStation Eye (formerly EyeToy) have expanded user-interface possibilities in the gaming world, there is still a traditional disconnect between the user and the physical screen image. The advance of gesture computing and the new availability of multi-point touch screens permit a novel mechanism and universal vocabulary for direct interaction with a virtual world. Planet Diggum is a multi-user god-game which uses a novel combination of finger and hand-stroke gestures. The goal of this project is to create a testbed kiosk where multiple users can interact with the system without need of training. Planet Diggum uses a Drexel-built Frustrated Total Internal Reflection multi-point touch screen, standards-based software, and custom media assets, and brings together teams from Digital Media, Computer Science, and Engineering to provide a unique and fun user experience.
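A first step in supporting the finger and hand-stroke gestures mentioned above is distinguishing a stationary tap from a stroke by how far a tracked contact travels on the surface. The function below is a minimal sketch under that assumption; the names and the tap radius are illustrative, not the project's actual gesture recognizer.

```python
import math

def classify_touch(trace, tap_radius=10.0):
    """Classify a touch trace (a list of (x, y) points reported for
    one contact by the multi-touch surface) as a 'tap' or a 'stroke',
    based on the total distance the contact travelled in pixels."""
    length = 0.0
    for (x0, y0), (x1, y1) in zip(trace, trace[1:]):
        length += math.hypot(x1 - x0, y1 - y0)
    return "tap" if length <= tap_radius else "stroke"
```

A fuller recognizer would also consider contact area (to separate fingers from whole hands) and stroke shape, but path length alone already separates the two basic gesture families.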

    • Team Members: Dr. Paul Diefenbach, William Muto, Matthew Smith, Chester Cunanan, Justin Dobies, Arvind Neelakantan, Sara Colucci, James Grow, Dr. Frank Lee, Louis Kratz, Dr. Ko Nishino, Craig Polakoff, Boris Block, Dan Hennessey, Zenko Klapko, David Millar, William Morgan; Electrical Engineering: Dr. Youngmoo Kim, Timothy Kurzweg, Vijay Balchandani, Eric Effinger, Jeevan Kotha, Pannha Prak, Joseph Romeo

    • Advisors: Dr. Paul Diefenbach, Dr. Youngmoo Kim

This project leverages open source software libraries, low-cost hardware, and standards-based 3D protocols to create a modular, extensible framework for rapid prototyping and experimentation. It evolved from the task of investigating novel mechanisms of multi-user gameplay, and was driven by several assumptions and observations:

• Gross body gestures are now being adequately explored by modern consoles (Wii and Sony Eye) as an alternative to the controller.

• Physical proximity of users can enhance the gameplay experience.

• Most interactions with games involve a physical disconnect between the user and the image.

• Casual games have the largest growth potential.

• Gesture computing (base) technology has reached sufficient maturity for rapid prototyping.

Gross body gestures, such as those employed by the Wii, were abandoned in favor of a more intimate gaming experience to enable proximity-based collaborative play. Planet Diggum was first publicly presented at the DMS 2007 Conference in San Francisco. The project was later expanded by a second group of graduate students to include mini-games, a heads-up display with embedded video, and expanded creature behaviors. The pedagogy of performing this multi-term research within a multi-discipline graduate curriculum was featured in a SIGGRAPH 2008 talk.