Week 1:
- First Gestures team meeting
- Assigned to Into Your Mind
- Installed Max/MSP and downloaded the Shimon gestures GitHub repo
- Intro to Max/MSP from Lisa & Richard
- First assignment: create 1 or 2 complex gestures to share at the next meeting (9-2-2019)
Week 2:
- Downloaded Max/MSP and learned to use it
- Created zc.triplet1.maxpat
- Created zc.triplet2.maxpat and zc.triplet3.maxpat
Week 3:
- Created gearSpinning.maxpat
- Shimon's head spins in time (as a human's might) with gears that slowly speed up, and a chord plays each time a rotation finishes. In Max, a bang fires every time the gear completes a rotation; for now this can be simulated by manually pressing the bang at a slowly accelerating rate.
- It's the coolest movement I have created so far because it synchronizes the speed of the gesture to the speed of the spinning gear.
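Since Max is a visual patching language, here is only a rough Python sketch of the timing idea behind gearSpinning.maxpat: a bang on each finished rotation, with the rotation time slowly speeding up. The interval lengths and speed-up factor are made-up values for illustration, not taken from the patch.

```python
import time

def simulate_gear_bangs(start_interval=1.0, min_interval=0.25, speedup=0.9, rotations=8):
    """Print a 'bang' each time the gear finishes a rotation,
    with the rotation time slowly speeding up (assumed parameters)."""
    interval = start_interval
    for i in range(rotations):
        time.sleep(interval)                              # wait out the current rotation
        print(f"bang {i + 1}: rotation finished -> play chord, head completes its spin")
        interval = max(min_interval, interval * speedup)  # each rotation is a bit faster

simulate_gear_bangs()
```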
Week 4:
- Was taught the umenu, counter, cycle, pack and unpack, and tempo objects (to do: learn the cycle object further)
- Talked about how it's a way to display or change a range of numbers (a rough analogy is sketched after this list)
- You can set it by:
- Using a message: set num num
- Using integer inlets
- Dragging with the mouse (it's a UI-based object)
- The outlets are useful for seeing the range
- Listmode outputs the values as a list in a message
- It can be helpful for creating a range for the song we want to test
- A drawback is that it only uses integers, so it's good for velocity but not position
- Brainstormed ideas for gestures
- Created sidetosideDescend.maxpat -- Super Cool!
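Max objects are patched visually rather than written as text, so this is only an assumption-laden Python analogy of the integer-range behavior described in the list above (set via a "set num num" message or via inlets, list-style output, integers only); none of these names come from Max itself.

```python
class IntRange:
    """Rough analogy of a Max-style integer range object: it holds a range
    that can be set by a 'set min max' message or by per-inlet values,
    and only ever deals in integers (fine for velocity, not for position)."""

    def __init__(self, low=0, high=127):
        self.low, self.high = int(low), int(high)

    def set_message(self, msg):
        # like sending a 'set num num' message, e.g. "set 40 90"
        _, low, high = msg.split()
        self.low, self.high = int(low), int(high)

    def set_inlets(self, low=None, high=None):
        # like feeding integers into the object's inlets
        if low is not None:
            self.low = int(low)
        if high is not None:
            self.high = int(high)

    def listmode(self):
        # like listmode: output the whole range as one list
        return list(range(self.low, self.high + 1))

r = IntRange()
r.set_message("set 40 90")   # a velocity range to test for a song
print(r.listmode()[:5])      # [40, 41, 42, 43, 44]
```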
Week 5:
- Identify two features from audio or vision (Shimon will soon have a camera during performances)
- Suggest a way Shimon might move in response to each of those features. These gestures will be reused, possibly during the same performance, so you may find that you would like to include some amount of randomness. Try to be specific about which motions/parameters are fixed and which are random, and whether you would limit any ranges of motion in order to get the movement you want.
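As a hedged sketch of how a reusable gesture could mix fixed and random parameters, the example below responds to an assumed audio feature (loudness); all names, ranges, and limits here are my own placeholders, not part of the Shimon code.

```python
import random

def side_sway_gesture(loudness):
    """Hypothetical response to an audio loudness feature.
    Fixed: the sway always alternates left then right.
    Random: the sway amplitude, limited to an assumed safe range
    so the motion stays within the intended range of motion."""
    amplitude = random.uniform(0.2, 0.6) * loudness       # random, but range-limited
    return [("left", amplitude), ("right", amplitude)]    # fixed alternation

print(side_sway_gesture(loudness=0.8))
```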
Week 6:
- Created the gesture for intoyourmind.maxpat at 3:01-3:14, called zc.week6Gest.maxpat.
- Thought about the Automating Gestures task I am going to work on for the midterm presentation: shifting Shimon's gestures to center on the area of notes he's playing on the marimba.
- Input: MIDI file parsing? (Lisa will give me parsed MIDI inputs)
- "You have access to a list of 4 MIDI values corresponding to the note each arm is playing (they'll be in order of lowest to highest on the marimba; his arms never cross each other)."
- Output: Given the ongoing gestures, assign an appropriate gesture centered at the right location on the marimba.
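A minimal sketch of the input/output idea, assuming the parsed input arrives as a sorted list of 4 MIDI note numbers (one per arm) and that the gesture center can be expressed as a single MIDI pitch on the marimba; the averaging choice is my own placeholder, not the decided method.

```python
def gesture_center(arm_notes):
    """arm_notes: 4 MIDI values, lowest to highest (the arms never cross).
    Returns the pitch to center the ongoing gesture on (simple average)."""
    assert len(arm_notes) == 4
    return sum(arm_notes) / len(arm_notes)

print(gesture_center([48, 55, 60, 67]))  # e.g. center near MIDI 57.5
```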
Week 7:
Automating Gestures_Demo:
- Classify the input notes from MIDI files into 3 registers/areas (high, mid, low) and associate each with a corresponding place/direction on the marimba (see the sketch at the end of this section).
- Assign gestures with options:
- For notes played within a small range (high, mid, or low):
- Use the gestures we have created so far and shift the appropriate gesture to the register/area of notes Shimon is playing on the marimba.
- For notes spread widely across the marimba:
- Roll from side to side with the head/neck nodding or leaning diagonally (a small motion along with the rolling, like worm.gesture).
Refining the synchronization of Week6.gesture.
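A rough Python sketch of the demo logic above. The register boundaries and the spread threshold are assumed values for illustration (the real marimba range and thresholds would need to be checked), and the returned labels are placeholders for whichever patches end up being triggered.

```python
def classify_register(note, low_max=52, mid_max=67):
    """Map a MIDI note to a register/area on the marimba (assumed boundaries)."""
    if note <= low_max:
        return "low"
    if note <= mid_max:
        return "mid"
    return "high"

def choose_gesture(arm_notes, spread_threshold=24):
    """arm_notes: 4 MIDI values, lowest to highest.
    Small spread -> shift an existing gesture to that register.
    Wide spread  -> side-to-side roll with a small head/neck nod (worm-like)."""
    spread = arm_notes[-1] - arm_notes[0]
    if spread <= spread_threshold:
        center = sum(arm_notes) / len(arm_notes)
        return ("shift_existing_gesture", classify_register(center))
    return ("roll_side_to_side_with_nod", None)

print(choose_gesture([60, 62, 64, 65]))   # small range -> shifted existing gesture
print(choose_gesture([40, 55, 70, 84]))   # wide spread -> rolling gesture
```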