Look below and add your name to the Google Sheet by 9 AM Friday: https://docs.google.com/spreadsheets/d/17vPWuWcDtki-i5bbNyVpOphVYxF-Il8GZBbZWwttz7s/edit?usp=sharing
Grad Student Advisor:
Ryan Rose
Project Outline:
Shimi is a musical robot companion, featuring speakers, five degrees of freedom, a microphone, a camera, and a powerful embedded CPU and GPU. Research with Shimi involves human-robot interaction (HRI) through music, both for communication and entertainment.
Sample Tasks:
Skills/Background:
Nothing is required! For programming tasks, Python knowledge is recommended. All of the codebase is in Python.
Grad Student Advisor:
Richard Savery
Project outline:
Developing forms of robot-musician interaction for humans who aren't necessarily musicians. For example: https://www.youtube.com/watch?time_continue=2&v=YRb0XAnUpIk, where the human makes musical gestures that are played on the piano. The interaction could be through hardware or through software (audio, visuals).
Sample tasks:
-Creating a physical interface
-Designing a way for a human's movements (captured by a camera) to translate to Shimon (see the sketch below)
-Very open to any ideas
Required skills/background:
-Open to everyone (all skill levels can be accommodated)
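To make the camera-to-Shimon task above more concrete, here is a minimal sketch (not part of any existing codebase) that maps the vertical position of bright pixels in a webcam frame to notes sent over MIDI. It assumes the opencv-python and mido packages (with a MIDI backend such as python-rtmidi); the brightness threshold is only a stand-in for real gesture tracking.

```python
# Minimal sketch: map the vertical position of bright pixels in a webcam frame
# to MIDI notes. The brightness threshold is a stand-in for real gesture
# tracking; assumes opencv-python and mido (with python-rtmidi) are installed.
import cv2
import mido

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale, MIDI note numbers

def frame_to_note(frame):
    """Return a MIDI note based on the vertical centroid of bright pixels."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    moments = cv2.moments(mask)
    if moments["m00"] == 0:
        return None                                   # nothing bright detected
    cy = moments["m01"] / moments["m00"]              # centroid row
    index = int((1.0 - cy / frame.shape[0]) * (len(SCALE) - 1))
    return SCALE[index]

def main():
    cap = cv2.VideoCapture(0)                         # default webcam
    port = mido.open_output()                         # default MIDI output port
    last_note = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        note = frame_to_note(frame)
        if note is not None and note != last_note:
            if last_note is not None:
                port.send(mido.Message("note_off", note=last_note))
            port.send(mido.Message("note_on", note=note, velocity=96))
            last_note = note
        cv2.imshow("camera", frame)
        if cv2.waitKey(30) & 0xFF == ord("q"):        # quit on 'q'
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    main()
```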
Film score generation based on deep learning visual analysis (paper discussed here: http://galapagos.ucd.ie/wiki/pub/OpenAccess/CSMC/Savery.pdf)
Grad Student Advisor: Richard Savery
Project outline:
Generating film scores automatically, driven by deep-learning-based visual analysis of the video; see the paper linked above. A small sketch of this kind of visual analysis appears below.
Sample tasks:
Required Skills:
-The entire codebase will use Python. Being new to Python is fine if you are prepared to learn.
-Multiple components will use machine learning and/or deep learning. Experience is helpful, but for those with no experience this is a chance to learn the basics through the project.
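As a rough illustration of the visual-analysis side, here is a minimal sketch, assuming opencv-python and numpy, that computes per-frame brightness and motion from a clip and maps them to a tempo and dynamic level. The mapping and the file name are made-up placeholders, not the method from the linked paper.

```python
# Minimal sketch: extract simple visual features from a film clip and map them
# to coarse musical parameters. Assumes opencv-python and numpy; the mapping
# below is a placeholder for illustration, not the model from the linked paper.
import cv2
import numpy as np

def analyze_clip(path):
    """Yield (brightness, motion) features for each frame of a video file."""
    cap = cv2.VideoCapture(path)
    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
        brightness = gray.mean() / 255.0                 # 0 = dark, 1 = bright
        motion = 0.0
        if prev is not None:
            motion = np.abs(gray - prev).mean() / 255.0  # mean frame difference
        prev = gray
        yield brightness, motion
    cap.release()

def features_to_music(brightness, motion):
    """Map visual features to a (tempo_bpm, dynamic) pair -- purely illustrative."""
    tempo = 60 + int(motion * 120)                       # more motion -> faster tempo
    dynamic = "forte" if brightness > 0.5 else "piano"
    return tempo, dynamic

if __name__ == "__main__":
    for brightness, motion in analyze_clip("example_clip.mp4"):  # hypothetical file
        print(features_to_music(brightness, motion))
```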
Grad Student Advisor: Richard Savery
Project outline:
Generating and analyzing music for Shimon performances.
Sample tasks:
-Computer analysis and generation of lyrics (see the sketch below)
-Computer analysis and generation of chords and melody
-Analyzing relation of lyrics and melody
-Analysis of a live singer/rapper, using their words to generate music
Required Skills:
-The entire codebase will use Python. Being new to Python is fine if you are prepared to learn.
-Multiple components will use machine learning and/or deep learning. Experience is helpful, but for those with no experience this is a chance to learn the basics through the project.
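As one possible starting point for the lyric analysis and generation task above, here is a minimal sketch of a word-level Markov chain lyric generator. The tiny corpus is a hard-coded placeholder; a real project would train on a large lyric dataset and likely move to a neural language model.

```python
# Minimal sketch: a word-level Markov chain for generating lyric lines.
# The training corpus is a tiny hard-coded example; a real project would
# train on a large lyric dataset and likely use a neural language model.
import random
from collections import defaultdict

CORPUS = [
    "i walk the lonely road at night",
    "the night is long the road is cold",
    "i sing the night into the light",
]

def train(lines):
    """Build a mapping from each word to the words that follow it."""
    model = defaultdict(list)
    for line in lines:
        words = line.split()
        for current, nxt in zip(words, words[1:]):
            model[current].append(nxt)
    return model

def generate(model, seed, length=8):
    """Generate a line by sampling successor words starting from `seed`."""
    words = [seed]
    for _ in range(length - 1):
        choices = model.get(words[-1])
        if not choices:
            break
        words.append(random.choice(choices))
    return " ".join(words)

if __name__ == "__main__":
    model = train(CORPUS)
    print(generate(model, seed="the"))
```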
Grad Student Advisor: Margot Paez
Project Outline:
The goal is to make Shimon produce gestures that augment or fill in for limitations in dynamic range and articulation. Shimon should also be able to generate these gestures at appropriate points in a musical performance.
Sample Tasks:
-Help with gesture development
-Contribute to building a library of gestures
Required Skills:
General research skills
Grad Student Advisor: Margot Paez
Project Outline:
We want to characterize, from a physics perspective, the dynamics and kinematics of marimba performance, specifically how players produce musical dynamics and articulation.
Sample tasks:
-Assist with setting up experiments
-Help with data collection and analysis
Required Skills:
General research skills
Grad Student Advisor: Raghavasimhan Sankaranarayanan
Project Outline:
An interactive music application that lets users view a concert stitched together from clips that other users upload to a server.
Sample Tasks:
- iOS / Android development for seamless recording and uploading of concert video to a server (see the server-side sketch below)
- Object recognition and tracking in concert videos
Required Skills (one or more of these):
Swift, Java, Python for web development, JavaScript, deep learning, AWS
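For the recording-and-uploading task above, here is a minimal sketch of what the server side might look like, assuming Flask; the route name, upload directory, and file-naming scheme are hypothetical.

```python
# Minimal sketch of a server-side upload endpoint, assuming Flask.
# The route name, upload directory, and file-naming scheme are hypothetical.
import os
import uuid
from flask import Flask, request, jsonify

app = Flask(__name__)
UPLOAD_DIR = "uploaded_clips"
os.makedirs(UPLOAD_DIR, exist_ok=True)

@app.route("/upload", methods=["POST"])
def upload_clip():
    """Accept a concert clip from a mobile client and store it for stitching."""
    clip = request.files.get("clip")
    if clip is None:
        return jsonify(error="no clip attached"), 400
    clip_id = uuid.uuid4().hex
    clip.save(os.path.join(UPLOAD_DIR, f"{clip_id}.mp4"))
    return jsonify(clip_id=clip_id), 201

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

A mobile client (or, during development, curl -F "clip=@test.mp4" http://localhost:5000/upload) could post clips to this endpoint, and the stitching pipeline would then read them back from the upload directory.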
Grad Student Advisor: Keshav Bimbraw
Project Outline:
We are replacing Shimon's current actuators with DC motors, which will give us better overall dynamic range, speed, and control over striking characteristics. This project deals with the different aspects of a mechatronics system: design, modeling, control theory, controller interfacing and integration, and eventually testing, evaluation, and user studies.
Sample Tasks:
Skills/Background:
Mechanical Engineering background and interest in control theory and microcontroller integration.
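To give a feel for the control-theory side, here is a toy PID position loop simulated in Python; the gains, time step, and plant model are made up for illustration, and the actual striker controller would run against the real DC motor hardware (likely on a microcontroller).

```python
# Minimal sketch: a discrete PID position controller on a toy first-order motor
# model. Gains, time step, and plant constants are illustrative only; the real
# striker controller would be tuned on the actual DC motor hardware.

def simulate_pid(setpoint=1.0, kp=4.0, ki=2.0, kd=0.1, dt=0.001, steps=5000):
    position, velocity = 0.0, 0.0
    integral, prev_error = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - position
        integral += error * dt
        derivative = (error - prev_error) / dt
        command = kp * error + ki * integral + kd * derivative
        prev_error = error
        # Toy first-order plant: command drives velocity with some damping.
        velocity += (command - 2.0 * velocity) * dt
        position += velocity * dt
    return position

if __name__ == "__main__":
    print(f"final position: {simulate_pid():.3f} (target 1.0)")
```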
Grad Student Advisor: Keshav Bimbraw, Jason Smith
Project Outline:
As part of the Skywalker project, we are working on extracting additional features from ultrasound images to control prosthetics and robots. We are also incorporating vibrotactile and visual feedback from the environment to improve the system's performance, and integration with other robots will be explored as well.
Sample Tasks:
Skills/Background:
No prior experience required. An interest in robotics and some background in Python is recommended.
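To illustrate what extracting features from ultrasound frames can look like in Python, here is a minimal sketch that computes simple intensity features per frame and fits a scikit-learn classifier; the frames, labels, and feature set are random placeholders, not Skywalker data.

```python
# Minimal sketch: turn ultrasound frames into simple intensity features and
# train a classifier to predict a discrete hand state. The frames and labels
# here are random placeholders; a real pipeline would load recorded scan data.
import numpy as np
from sklearn.linear_model import LogisticRegression

def frame_features(frame):
    """Simple per-frame features: mean, std, and row-wise intensity profile."""
    rows = frame.mean(axis=1)                 # average intensity per depth row
    return np.concatenate([[frame.mean(), frame.std()], rows])

rng = np.random.default_rng(0)
frames = rng.random((200, 64, 64))            # placeholder ultrasound frames
labels = rng.integers(0, 2, size=200)         # placeholder open/closed hand labels

X = np.stack([frame_features(f) for f in frames])
clf = LogisticRegression(max_iter=1000).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```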