Augmenting Reality in Museums with Interactive Virtual Models
Theodore Koterwas, Jessica Suess, Scott Billings, Andrew Haith and Andrew Lamb
Abstract: Two projects at the University of Oxford extend beyond screen-based
interactivity to create physically interactive models of museum objects on smartphones
utilising Bluetooth, image recognition and sensors. The Pocket Curator app
gives visitors to the Museum of the History of Science the opportunity to recreate a
19th-century demonstration of wireless technology in the gallery and to find their
latitude with a virtual sextant. The re-sOUnd app transforms phones into historic
musical instruments: moving your arm in a bowing motion plays an Amati Violin
and blowing into the phone while tilting it up and down sounds a trumpet used by
Oliver Cromwell’s trumpeter. This paper describes the apps, discusses challenges
discovered in testing them with museum visitors, and reports findings from user
interviews.
1. Introduction
Augmented reality usually refers to a specific mode of interactivity in which a
device acts as a lens through which a user experiences their physical environment
enhanced with digital content. Two projects at the University of Oxford provide
alternative modes of augmented reality through interactive virtual models of
museum objects. The Pocket Curator app provides visitors to the Museum of the
History of Science the opportunity to recreate a 19th-century demonstration of
wireless technology by ringing a bell in the physical gallery with their phone, and to
find their latitude with a virtual sextant. The app was a product of the Hidden
Museum project (Suess 2016), which aimed to prototype and test approaches to
delivering content via mobile devices to museum visitors in the physical context of
the gallery. Our explicit aim was to address the most prevalent concerns about
mobile guides in museums: that they are “heads down”, isolating, and disconnected
from the gallery space and the objects (e.g. Hsi 2003; Boa and Choi 2015). One
particularly well received approach was to create experiences that enabled users to
interact with the physical space and objects using their devices—experiences that
“augment reality” in novel ways. While the Pocket Curator app overlays content on
the camera feed, it goes a few steps further, providing the user with an experience of
using the object in front of them and causing the environment itself to react.
Separately the re-sOUnd app utilises sound and motion to place historic musical
instruments in the hands of users: moving your arm in a bowing motion plays a
violin and blowing into the phone while tilting it up and down sounds a trumpet.
Both apps were iteratively tested with visitors, and the challenges that emerged are
applicable to augmented reality in museums and mobile museum guides
more broadly.
2. Company Introduction
The University of Oxford is a world-leading centre of learning, teaching and
research and the oldest university in the English-speaking world. The role of IT
Services is to ensure that the University of Oxford has the modern, robust, reliable,
high-performing and leading-edge IT facilities it requires to support the distinctive
needs of those engaged in teaching, learning, research, administration and strategic
planning. The Gardens, Libraries and Museums (GLAM) of the University of
Oxford contain some of the world’s most significant collections. While they provide
important places of scholarly enquiry and teaching for the University, for the public
they also represent the front door to the wealth of knowledge and research curated
and generated at the University. The Bate Collection of Musical Instruments has
over 2000 instruments from the Western orchestral music tradition, spanning the
Renaissance, Baroque, Classical and Romantic periods up to modern times.
3. Project Details
The Pocket Curator app presents seven objects in the Museum of the History of
Science (www.mhs.ox.ac.uk). For each object it provides two to three short audio
clips and at least one animation, video, or interactive experience.
The museum’s Marconi Wireless display features the Marconi Coherer, a
mysterious box stuffed with wires and inscrutable antique electronics, and if you
look carefully, a bell on the top. Guglielmo Marconi used this device in the first
public demonstration of wireless signal transmission. He put the box in one corner
of a room and generated a spark in a different corner. The spark would cause the
bell on the top of the box on the other side of the room to ring. Pocket Curator
enables you to re-enact this demonstration with your phone. It superimposes a line
drawing of the box over the camera feed, and when you line up the actual box with
the outline, image recognition triggers the next step: the appearance of a “Transmit”
button. Pressing the button causes a spark animation and vibration on the phone,
and a bell rings out from the Marconi display case holding the real box. Technically
the bell is an audio sample triggered over Bluetooth and played through a transducing
speaker attached to the top of the case.
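The interaction flow can be sketched as a small state machine. This is an illustrative sketch, not the app's actual code: `send_bell_trigger` stands in for the Bluetooth message that fires the bell sample on the case's transducing speaker, and the recognition-confidence threshold is an assumed value.

```python
class MarconiInteractive:
    """Sketch of the Marconi interaction: align the outline, then transmit."""

    MATCH_THRESHOLD = 0.8  # assumed image-recognition confidence threshold

    def __init__(self, send_bell_trigger):
        # send_bell_trigger is a stand-in for the Bluetooth trigger that
        # plays the bell sample through the speaker on the display case.
        self.send_bell_trigger = send_bell_trigger
        self.transmit_enabled = False

    def on_camera_frame(self, match_confidence: float):
        # Image recognition compares the camera feed to the line drawing;
        # once the real box lines up, the "Transmit" button appears.
        if match_confidence >= self.MATCH_THRESHOLD:
            self.transmit_enabled = True

    def on_transmit_pressed(self) -> bool:
        # The spark animation and vibration would run on the phone here;
        # the bell itself rings out from the case.
        if self.transmit_enabled:
            self.send_bell_trigger()
            return True
        return False
```

Gating the button on a recognition match is what ties the re-enactment to the physical object: the user must actually frame the real Marconi Coherer before they can "transmit".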
The Museum displays a collection of devices used for navigation. One of these, a
sextant, may be familiar to some people from films, but few people we interviewed
knew how it works. A navigator at sea looks through the scope at the horizon, and
using the arm of the sextant adjusts a split mirror so that the sun (or other celestial
body) appears in line with the horizon. The track along which the arm moves has
degree markings, so the navigator can take a reading of the arc the arm traversed,
which corresponds to the angle of the celestial body above the horizon. With this
angle reading the navigator can determine the ship’s latitude from charts. Pocket
Curator simulates this with a virtual model of a sextant. The user holds the phone
vertically in front of them. On the screen they see the view through the camera
overlaid with a brass ring representing the sextant scope. Within the scope they see
an image of the sea, which moves up and down with the motion of the phone.
They centre the sea horizon within the scope, tap a button to set the horizon, and tilt
the phone upward until a virtual sun appears. When the sun lines up with the
horizon, they tap another button to take a reading. The app shows them the angle
they measured along with a short animation of the arm moving on a sextant. They
tap the Find Latitude button and see a chart converting their angle measurement,
along with a map displaying a line at that latitude. On a given day the app sets the
angle of the virtual sun above the horizon so that the measurement they take
accurately equates to the latitude of Oxford.
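The noon-sight arithmetic the app simulates can be sketched as follows. This is a simplified model, assuming a northern-hemisphere observer sighting the sun due south at local noon and ignoring corrections such as refraction; the function name and values are illustrative, not taken from the app.

```python
def latitude_from_noon_sight(sun_altitude_deg: float,
                             sun_declination_deg: float) -> float:
    """Estimate latitude from the sun's measured altitude at local noon.

    For a northern-hemisphere observer with the sun due south:
        altitude = 90 - latitude + declination
    which rearranges to the expression below.
    """
    return 90.0 - sun_altitude_deg + sun_declination_deg


# At an equinox the sun's declination is ~0, so a measured noon altitude
# of 38.25 degrees places the observer at Oxford's latitude, ~51.75 N.
print(latitude_from_noon_sight(38.25, 0.0))  # 51.75
```

The app presumably works this relationship backwards: given the day's solar declination and Oxford's latitude, it places the virtual sun at the altitude that makes the user's reading come out correct.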
The re-sOUnd app presents historic musical instruments from the Bate
Collection of Musical Instruments and the Ashmolean Museum. In addition to
offering audio clips of the instruments being played by musicians, the app allows
users to play the instruments themselves. Users play wind and brass instruments by
blowing into the microphone and tilting the phone up and down to vary pitch. For
the Amati Violin, a user holds the phone as if it were the neck of a violin and places
their fingers down on the touchscreen as if holding down strings. Moving their arm
back and forth in a bowing motion sounds the notes as if bowing the real thing.
They can also use interfaces that more directly simulate the instrument, e.g. by
covering holes or paddles with their fingers on the touchscreen. Each instrument
was sampled in a recording studio so that users hear the real sound the instrument
makes.
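The two input techniques can be sketched as follows. This is an illustrative sketch rather than the app's implementation: the RMS threshold and tilt range are assumed values that in practice would be tuned against live microphone and motion-sensor data.

```python
import math

BLOW_RMS_THRESHOLD = 0.2  # assumed; real breath detection would be tuned


def is_blowing(samples) -> bool:
    """Treat sustained microphone energy (RMS) above a threshold as breath noise."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms > BLOW_RMS_THRESHOLD


def pitch_index(tilt_deg: float, n_notes: int = 8,
                min_deg: float = -30.0, max_deg: float = 60.0) -> int:
    """Map the phone's tilt angle onto one of n_notes recorded sample slots."""
    t = (tilt_deg - min_deg) / (max_deg - min_deg)  # normalise to 0..1
    t = min(max(t, 0.0), 1.0)                       # clamp outside the range
    return min(int(t * n_notes), n_notes - 1)
```

Because each instrument was sampled in a studio, the tilt only has to select among discrete recorded notes while the breath detector acts as the on/off gate, which is why quantising the tilt range into slots is a plausible approach.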
4. Discussion and Conclusion
The apps were iteratively tested with visitors during development to ascertain both
their usability and effectiveness in conveying concepts about the objects. The
sextant interactive was developed over several iterations, from a simplified
single-screen experience to the guided, multi-step experience described above. With
the single-screen experience, users were shown a map under the scope, with a line
indicating latitude which changed as they tilted the phone. It was initially thought
this simple, direct association between angle and latitude would convey the concept
more clearly than a more complicated process. However, users were not making a
strong connection between their experience and how the sextant actually works, e.g.
they were not able to describe how tilting the phone corresponded to moving the
arm of the sextant. They also missed the correlation between angle and latitude,
because in focusing on trying to line up the sun and horizon in the scope, they
didn’t see the latitude line moving on the map below. This problem of inattentional
blindness has also been found in head-up displays (McCann et al. 1993) and
surgical AR applications (Dixon et al. 2013). The second iteration was a two-screen
process. On the first screen, the map was replaced with an image of the sextant arm,
which moved as the phone tilted, and on the second screen the map and latitude
indicator were presented with explanatory text. While this more clearly connected
taking an angle reading and finding a latitude, users were still unclear on how it
related to the object itself, so a third screen was required. Now, when the user
presses a button to set the angle of the sun, the scope is replaced with a message
stating the angle they measured, with the sextant arm “set” on that angle, and an
animation of the sextant with the arm moving. Users interviewed after the third
iteration reported that the connection with the real object was clear, and that they
preferred the kinetic aspect of the interactive to a hypothetically proposed
demonstration video. They also liked that it got them “behind the glass.” One
respondent remarked that even though the display wasn’t really in her area of
interest, the interactive helped grab her interest.
In developing Pocket Curator we also learned broader lessons about delivering
content on mobile devices (Suess 2016). iBeacons can be effective as an
enhancement but not as the only way to access content, and users didn’t like QR
codes, preferring to simply select content from a menu in both cases. Audio is very
effective, but it should answer a question visitors genuinely have about the object
rather than what the museum might want to tell them. The longest anyone watched
or listened to something without checking the remaining duration (an indicator of
fatigue) was 45 seconds. Video is effective when it offers content that can only be experienced
visually, such as a demonstration or animation. Talking heads should be
avoided unless the presenter is someone the user recognises and cares about.
Because these interactions are novel, users required guidance to use them
effectively. The interactives begin with an instructional overlay and a “Start” button,
but they also need to guide users intrinsically throughout the process. For the
sextant, this was accomplished by breaking down the process into discrete steps and
taking visitors through each step with short instructions reinforced by text on the
buttons. Timings in the app ensure users don’t skip steps, providing the next button
only after the user starts the current step. For instance, in step two of the sextant,
tilting the phone upward to find the sun, the button to set the “Angle of The Sun”
appears only after the user begins tilting the phone upward.
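That gating logic can be sketched as a simple predicate (illustrative only; the tilt threshold is an assumed value, not taken from the app):

```python
def show_set_angle_button(initial_pitch_deg: float,
                          current_pitch_deg: float,
                          min_tilt_change_deg: float = 5.0) -> bool:
    """Reveal the next-step button only once the user has actually begun
    tilting the phone upward from the pitch at which the step started."""
    return (current_pitch_deg - initial_pitch_deg) >= min_tilt_change_deg
```

Keying the button's appearance to the start of the gesture, rather than to a fixed timer, is what prevents users from skipping a step they have not yet attempted.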
Usability testing for the re-sOUnd app was conducted with 12 visitors over the
course of two afternoons in the Bate Collection and Ashmolean. Once users worked
out how to hold the phone or were given a hint such as “how would you play a real
trumpet”, they played the instruments effectively. Showing them a schematic
drawing of how to hold and play the instrument was very effective in getting them
started, so drawings like these were incorporated into the app.
Further evaluation for re-sOUnd was conducted over several days in the Bate
Collection and the Ashmolean. Eighteen visitors were each asked to read about two
instruments (control), listen to two instruments (treatment one), and play two
instruments (treatment two). The order of activities and instruments was
randomized, but each instrument was read about, listened to and played an equal
number of times over the 18 tests. Users were then asked to take a short survey in
which they rated on a 5-point scale the degree to which they felt they learned about
each instrument, enjoyed each instrument, and got a “sense” of what it was like to
play it. Their responses suggest a correlation between the app giving them a “sense”
of playing the instrument and the degree to which they felt they both learned about
it and enjoyed it. The survey suggested they enjoyed playing instruments (mean of
3.89) more than reading about them (3.53) and listening to them (3.81). The
question regarding how much they learned showed a bias toward reading (3.64)
over both listening (3.55) and playing (3.44). The survey also asked more general
questions about the app. Respondents agreed the app made them want to look at the
real instruments (4.28), want to learn about other instruments in the collection (4),
want to learn to play an instrument (3.68), and made the instrument seem more
“real” (3.78). They agreed they would recommend the app to a friend visiting the
collection (4.33). There are limitations to this evaluation. The general conceptual
association between “reading” and “learning” likely skewed responses, and it is
unclear whether respondents distinguished between listening and playing when
thinking back on their experience; they may have rated all experiences of hearing
the instrument similarly.
Google Analytics in Pocket Curator anonymously tracks when audio, video, and
interactives are started and finished by a user. The data show that the sextant
interactive is the most started and finished item in the app, and Marconi is second
most started and fourth most finished. Both interactives are started and finished
more than any individual audio or video clip and the top three audio clips combined.
For re-sOUnd, the violin is currently the most popular. The apps have been
available for only a few months and have not been widely advertised, so numbers
are low, but as the data grow they should indicate user behaviour and preferences
in the wild.
Combining these data with further surveys and interviews will allow further study
of the virtual models approach to augmenting reality.
Acknowledgements
Both projects were funded by the University of Oxford’s IT Innovation
Fund. Pocket Curator was developed in collaboration with staff at the Museum of the History of
Science (www.mhs.ox.ac.uk), especially Stephen Johnston. Re-sOUnd was developed in collaboration
with the Music Faculty (www.music.ox.ac.uk), the Bate Collection of Musical Instruments
(www.bate.ox.ac.uk) and the Ashmolean Museum (www.ashmus.ox.ac.uk), principally Colin
Harrison and Sarah Casey. The apps were technically developed by Theodore Koterwas, Andrew
Haith and Markos Ntoumpanakis.
References
Boa, R., & Choi, Y. (2015). Using mobile technology for enhancing museum experience: Case
studies of museum mobile applications in S. Korea. International Journal of Multimedia and
Ubiquitous Engineering, 10(6), 39–44.
Dixon, B. J., Daly, M. J., Chan, H., et al. (2013). Surgeons blinded by enhanced navigation: the
effect of augmented reality on attention. Surgical Endoscopy, 27(2), 454–461.
Hsi, S. (2003). A study of user experiences mediated by nomadic web content in a museum.
Journal of Computer Assisted Learning, 19(3), 308–319.
McCann, R. S., Foyle, D. C., & Johnston, J. C. (1993). Attentional limitations with heads up
displays. In R. S. Jensen (Ed.), Proceedings of the seventh international symposium on aviation
psychology (pp. 70–75). Columbus: Ohio State University.
Suess, J. (2016). Hidden Museum project. http://www.oxfordaspiremuseums.org/blog/hiddenmuseum-project