Using Unity, we developed a custom C# script to implement a camera livestream and simulate homonymous hemianopsia (HH).
We were also able to remap the livestream into the visual field of an HH patient.
Google Daydream Headset: Adds the comfort of a premium headset and a controller for users to interact with the UI.
Wide-Angle Lens: Increases the user's visual field in the livestream.
Interactive buttons can be selected through IR remotes.
The disease state can be toggled on and off during the livestream.
Our final solution differed from our initial aspirations: the Heads-Up Display (HUD) was replaced with remapping of the user's visual field, because the HUD obstructed much of that field. Remapping the visual field simply means that we reposition what is seen in the central field of vision. In addition, a wide-angle lens was attached to the smartphone to enhance peripheral vision.
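The simulation and remapping described above can be sketched as a Unity C# script. This is a minimal illustration, not our production code: it assumes the livestream is shown on a UI `RawImage`, that a black `Image` panel masks the right hemifield, and the remapping offset value is a placeholder to be tuned per user.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: stream the phone camera into the headset view, optionally
// black out the right half (simulated right HH), and shift the image
// so the central field lands in the intact hemifield (remapping).
public class HHLivestream : MonoBehaviour
{
    public RawImage display;      // full-screen RawImage showing the livestream
    public Image rightFieldMask;  // opaque black panel over the right hemifield
    public bool simulateRightHH;  // disease state, toggled during the livestream

    private WebCamTexture cam;

    void Start()
    {
        cam = new WebCamTexture();  // default device camera
        display.texture = cam;
        cam.Play();
    }

    void Update()
    {
        // Turn the simulated disease state on or off.
        rightFieldMask.enabled = simulateRightHH;

        // Remap: when simulating HH, offset the camera image toward the
        // sighted side. The 0.25 offset is an assumed example value.
        display.uvRect = simulateRightHH
            ? new Rect(0.25f, 0f, 1f, 1f)
            : new Rect(0f, 0f, 1f, 1f);
    }
}
```

In practice the `simulateRightHH` flag would be bound to a UI button so the disease state can be switched during the livestream, as described above.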
Livestream from camera of simulated Right HH Visual Field
Livestream from camera of Remapped Visual Field with the Wide Angle Lens
These camera livestreams show, even in the first frame of each video, an increased visual field for the user. The user can also interact with their environment more easily: with remapping they no longer bump into walls, as can be seen happening in the right HH simulation.
Improved UI Functionality:
The UI needs to be tested under real-world conditions to ensure that it meets all user needs.
Improve Depth Perception:
As we continue to improve our device, depth perception must also improve; introducing two cameras or a 360-degree camera would be researched further.
Additionally, a camera with a higher frame rate (FPS) would help decrease latency in the livestream video output.
Pre-Clinical Validation of Disease State:
Our device would be validated against patients with HH by comparing visual field test outcomes.
Validation would confirm that our device is an accurate representation of the HH disease state, allowing future clinical trials to be approved.