Brainstormed project ideas and details
Finalize project idea, divide tasks
Plan 1: Swift object detection (communicating ASL on the fly)
Initial Interview
Project Interview - after project is complete
Follow up Survey on Project Interview
Website
Poster
Final Paper
Xiurong & Sarah will work on the initial survey:
Recruit 2-3 people by 11/15
Schedule a 15-30 minute talk by 11/17, asking the following questions to brainstorm for the 2nd user scenario requirement:
What kind of AR do you think would be helpful for you to learn ASL?
Ekram & Peirong will get started on object detection:
Ekram: start on 11/16
Peirong: start on 11/17
Everyone gets together to talk about user scenario 2 before the end of 11/21
Sarah and Xiurong conducted an interview with Neeraja Menon, a student taking an ASL class, over Zoom (~20 min)
Possible 2nd user scenario based on the conversation with Neeraja:
Learning Playground: place a random object in the world and have the learner look for it with a mobile phone. The ASL student can walk around with their phone/iPad to find random objects; when she sees one, she can practice making the sign and then check her answer to evaluate whether she signed it correctly. If she did, her vocab score would increase by 1
The object in the room can be refreshed
Progress tracking is implemented. If we have additional time, we can add an AR animation (e.g., "Yay, you learned a new sign!") when the vocab score increases by 1
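To make the score mechanic concrete, here is a minimal sketch of how the vocab score could be tracked and persisted between sessions. The type and method names (VocabProgress, markSigned) and the use of UserDefaults are illustrative assumptions, not taken from our actual code.

```swift
import Foundation

// Hypothetical vocab-score model for the Learning Playground scenario.
struct VocabProgress: Codable {
    private(set) var learnedSigns: Set<String> = []

    var vocabScore: Int { learnedSigns.count }

    // Record a correctly signed word; returns true if the score increased.
    mutating func markSigned(_ word: String) -> Bool {
        learnedSigns.insert(word).inserted
    }

    // Persist with UserDefaults (a simple choice for a prototype).
    func save(to defaults: UserDefaults = .standard) {
        if let data = try? JSONEncoder().encode(self) {
            defaults.set(data, forKey: "vocabProgress")
        }
    }

    static func load(from defaults: UserDefaults = .standard) -> VocabProgress {
        guard let data = defaults.data(forKey: "vocabProgress"),
              let saved = try? JSONDecoder().decode(VocabProgress.self, from: data)
        else { return VocabProgress() }
        return saved
    }
}
```

When markSigned returns true, the UI could trigger the "new sign learned" AR animation mentioned above.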
Sarah sent out surveys to professors.
Ekram’s update on object recognition: objects are recognized with markers
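For reference, a minimal sketch of how marker-based recognition can be wired up with ARKit. This assumes the markers are reference images stored in an asset catalog group named "ASLMarkers" (the group name is an assumption); scanned 3D objects would work similarly via ARWorldTrackingConfiguration's detectionObjects.

```swift
import ARKit
import RealityKit

// Sketch of marker-based detection; not the exact project code.
final class MarkerDetector: NSObject, ARSessionDelegate {
    private weak var arView: ARView?

    init(arView: ARView) {
        self.arView = arView
        super.init()
        arView.session.delegate = self
    }

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        if let markers = ARReferenceImage.referenceImages(inGroupNamed: "ASLMarkers",
                                                          bundle: nil) {
            configuration.detectionImages = markers
            configuration.maximumNumberOfTrackedImages = 1
        }
        arView?.session.run(configuration,
                            options: [.resetTracking, .removeExistingAnchors])
    }

    // Each detected marker arrives as an ARImageAnchor; the matching ASL video
    // or model can be placed at its transform.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            print("Detected marker:", imageAnchor.referenceImage.name ?? "unnamed")
        }
    }
}
```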
Our survey has received one response from a professor so far.
Peirong found ASL instruction videos for truck, cell phone, cup, bowl, and person on YouTube and attached them to object-detection anchors. Also enabled each video to replay itself.
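A hedged sketch of how a looping instruction video can be attached to a detected object, assuming RealityKit's VideoMaterial (iOS 14+) and clips bundled with the app; the clip file name is a placeholder, not the project's actual asset.

```swift
import RealityKit
import AVFoundation

// Builds a small video plane that replays itself; attach the returned entity
// to the AnchorEntity created for the detected object.
final class LoopingVideoEntityFactory {
    // Keep the player and looper alive as long as the factory exists.
    private let queuePlayer = AVQueuePlayer()
    private var looper: AVPlayerLooper?

    func makeEntity(clipName: String) -> ModelEntity? {
        guard let url = Bundle.main.url(forResource: clipName, withExtension: "mp4") else {
            return nil
        }
        // AVPlayerLooper restarts the clip without manual seek-to-start logic.
        let item = AVPlayerItem(url: url)
        looper = AVPlayerLooper(player: queuePlayer, templateItem: item)

        // A thin plane textured with the video, roughly phone-screen sized.
        let material = VideoMaterial(avPlayer: queuePlayer)
        let plane = ModelEntity(mesh: .generatePlane(width: 0.2, height: 0.12),
                                materials: [material])
        queuePlayer.play()
        return plane
    }
}
```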
Group Meeting
Evaluated the timeline
Confirmed the assigned parts for user scenario 2: Winter (snowman-scarf, sled, tree-snow)
Make an animated hand using Reality Composer for the 3 objects?
Future design: https://www.youtube.com/watch?v=rD-0t8r3I7k
Peirong added a restart button for the first user scenario: it clears the entire screen
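A minimal sketch of what the restart behavior could look like with a RealityKit ARView; the extension name is illustrative and the actual implementation may differ.

```swift
import ARKit
import RealityKit

extension ARView {
    // Clears everything placed so far and reruns the session from a clean state.
    func restartExperience() {
        scene.anchors.removeAll()
        if let configuration = session.configuration {
            session.run(configuration,
                        options: [.resetTracking, .removeExistingAnchors])
        }
    }
}
```

A restart button's action handler would then just call arView.restartExperience().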
Xiurong, Sarah, and Minghui met at Rettner; Peirong and Ekram joined virtually on Zoom to work on the project together.
Peirong: changed the restart function to automatically clear the entire screen
Ekram: finding an avatar, researching online how to make use of ARKit
Xiurong: working on a 3D hand model in Blender by following YouTube tutorials
Steps Completed:
1. Created the hand model in Blender
2. Added bone/movement to hand
3. Exported the Blender file and converted it to USDZ to work in Reality Composer
Created sign animations: Welcome
Working on: Ice Skate, jacket, gift, gloves (see the USDZ loading sketch after this update list)
Sarah: working on the prototype for the 3rd scenario and making presentation slides
Minghui: making the navigation controller for the project and implementing the 2nd scenario
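Regarding the Blender-to-USDZ steps in Xiurong's update above, here is a hedged sketch of loading the exported hand model and playing its baked-in sign animation with RealityKit. The file name "welcome_sign" and the loader type are assumptions, not the project's actual names.

```swift
import RealityKit
import Combine

// Loads a USDZ exported from Blender/Reality Composer and loops its animation.
final class SignAnimationLoader {
    private var cancellable: AnyCancellable?

    func loadAndPlay(into anchor: AnchorEntity, fileName: String = "welcome_sign") {
        // Entity.loadAsync preserves the entity hierarchy and its animations.
        cancellable = Entity.loadAsync(named: fileName)
            .sink(receiveCompletion: { completion in
                if case .failure(let error) = completion {
                    print("Failed to load \(fileName):", error)
                }
            }, receiveValue: { hand in
                anchor.addChild(hand)
                // Play every animation baked into the USDZ (the exported sign).
                for animation in hand.availableAnimations {
                    hand.playAnimation(animation.repeat())
                }
            })
    }
}
```

The anchor passed in could be the same object-detection anchor used for the instruction videos in scenario 1.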
Low Fidelity Prototype Requirement:
Scenario 1- Object detection:
Objective: On-the-fly communication
Interaction: gaze
Paper Prototype
Physical Prototype
Scenario 2- ASL Playground:
Objective: Learning, self-assessment, and memory retrieval
Interaction: air tap, gaze
Paper Prototype
Physical Prototype
Scenario 3- Progress:
Objective: Track learning progress
Interaction: air tap
Paper Prototype
Physical Prototype
Update Report
Work in progress
Ekram’s Requirement
Data collection of videos in scenario 1
Had a brief meeting
Made progress on Poster