Animalearn. (2020). Froggipedia: Humane Frog Dissection in AR! YouTube. Retrieved September 29, 2022, from https://youtu.be/JCXYlRPUI1Q.
Keiichi Matsuda. (2016). Hyper-Reality. YouTube. Retrieved September 29, 2022, from https://youtu.be/YJg02ivYzSs.
Paul Hamilton. (2020). Angry Bird style game in Reality Composer. YouTube. Retrieved September 29, 2022, from https://youtu.be/GyZWUnpvkDI.
VirQ Tech. (2017). AR Chemistry Augmented Reality Education Arloon. YouTube. Retrieved September 29, 2022, from https://youtu.be/Qi3h18wJJiI.
Wikitude. (2019). Augmented Reality Experiences for Museums and Cultural Institutions. YouTube. Retrieved September 29, 2022, from https://youtu.be/QlN9S_bjGOY.
Apple. (2019, September 19). Reality composer. App Store. Retrieved September 29, 2022, from https://apps.apple.com/us/app/reality-composer/id1462358802
Apple. (n.d.). Creating 3D content with Reality Composer. Apple Developer Documentation. Retrieved September 29, 2022, from https://developer.apple.com/documentation/realitykit/creating-3d-content-with-reality-composer
EyeJack. (n.d.). EyeJack Creator. Retrieved September 29, 2022, from https://creator.eyejackapp.com/
Hallberg, M. (2021, October 11). 👾 beer invaders 🛸 #augmentedreality #tech #virtualreality #technology #madewithunity pic.twitter.com/AXBYI4CJ3L. Twitter. Retrieved September 29, 2022, from https://twitter.com/MatthewHallberg/status/1447684009207767046?s=20&t=kRhLBhuW4GVdhAVirIvJoA
Apple Inc. (n.d.). AR creation tools - Augmented reality. Apple Developer. Retrieved September 29, 2022, from https://developer.apple.com/augmented-reality/tools/
Krakofsky, R. (2022, June 9). Opened to the 1st page of the 21/22 @edtech_isf yearbook to find a QR code & an explanation of why the yearbook team wanted to "extend their sights beyond the horizons of traditional cover design...and explore #ar". School culture is everything. @cospaces_edu #augmentedreality pic.twitter.com/QLSLVDXFZX. Twitter. Retrieved September 29, 2022, from https://twitter.com/mrkpyp/status/1534894909001371648?s=20&t=Ss45Vhdu9fkx4uKjWxZEug
Merge. (n.d.). Learn science, master STEM, be future ready: AR/VR learning & creation. Retrieved September 29, 2022, from https://mergeedu.com/cube
Userfacet. (2019). Solar Planets in Augmented Reality | Userfacet. YouTube. Retrieved September 29, 2022, from https://youtu.be/L6lrpWwZj64.
For this case, I focussed on Augmented Reality and browsed possibilities for a variety of applications and subjects, including how to create AR experiences and ways students and teachers can engage with AR technology without needing to create anything themselves. There is no specific grade range attached to this technology, as it could be used across multiple age ranges and subjects.
Augmented Reality (commonly referred to as "AR") is a digital overlay superimposed on something real that you are viewing through a mobile device, with the purpose of increasing engagement with the content and making it easier to visualise.
To view an AR overlay, you need a device that can display the experience, usually a mobile device. Typically, there are two parts to an AR experience: 1) a static trigger image; and 2) a video that is played when triggered by pointing your mobile device at the trigger image. Some AR apps do not need a trigger image and instead use spatial recognition to place an AR experience on a surface, like a table, floor, or wall (like the example image to the right). Other examples of AR experiences that don't require a trigger are the various filters available in apps like Snapchat or Instagram, which use spatial recognition to detect your face and layer the experience on top of it.
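To make those two approaches concrete, here is a minimal sketch of both anchor types using Apple's RealityKit framework (the same technology behind Reality Composer). It assumes an existing ARView in an iOS app; the resource group "AR Resources" and image name "poster" are hypothetical placeholders rather than anything referenced above.

```swift
import RealityKit
import UIKit

// Hedged sketch: the two common ways AR content gets placed.
// "AR Resources" / "poster" are hypothetical names for an asset catalog group and
// a registered trigger image; arView is assumed to be an ARView already on screen.
func addAnchors(to arView: ARView) {
    // 1) Trigger-image anchor: the cube appears only when the camera sees the printed image.
    let imageAnchor = AnchorEntity(.image(group: "AR Resources", name: "poster"))
    let cube = ModelEntity(mesh: .generateBox(size: 0.1),
                           materials: [SimpleMaterial(color: .blue, isMetallic: false)])
    imageAnchor.addChild(cube)
    arView.scene.addAnchor(imageAnchor)

    // 2) Trigger-free (spatial) anchor: the sphere is placed on any detected horizontal
    //    surface, such as a table or floor -- the IKEA Place style of AR.
    let planeAnchor = AnchorEntity(plane: .horizontal)
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05),
                             materials: [SimpleMaterial(color: .green, isMetallic: false)])
    planeAnchor.addChild(sphere)
    arView.scene.addAnchor(planeAnchor)
}
```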
AR should not be confused with VR or MR. In Virtual Reality (VR), the user is fully immersed in an environment and is meant to see and hear only what is happening in the VR experience. Augmented Reality differs in that it overlays an experience on top of what we are seeing in real life. Mixed Reality (MR) occurs when the need to hold a mobile device is eliminated and we begin to use wearables, like "Google Glass" (pictured on the right) or a contact lens, that can seamlessly augment everything you see.
Additionally, a key difference in MR is the ability for the user to physically interact with the digital experience they are seeing by simply moving their hands and fingers (often seen in sci-fi movies). Current limitations in motion-tracking software and hardware make it unlikely that MR will become widespread technology for a number of years.
In this image, the IKEA Place AR app is being used to test how well a piece of furniture fits into a space.
To engage with AR technology, a mobile device like an iPad or iPhone is needed. There are a variety of AR apps available on the market that allow users to engage with the technology: some relate to interior design, like the IKEA Place app, while others are geared towards taking your artistic creations to the next level, such as the EyeJack app. In all instances, the functionality is similar: the app overlays something artificial on top of what we are seeing in reality, thereby augmenting our reality.
Use this app to visualize and test out pieces of furniture in your space before you buy.
One of the most widely known AR applications in mobile gaming, created to encourage daily physical activity during play.
The Merge Object Viewer and Explorer apps are great starting apps to use your new Merge Cube. They come with free experiences and paid upgrades.
Students can use EyeJack to add AR experiences to static art images as the trigger image (see WebAR examples below).
An AR app from Adobe with built-in assets, and the ability to import custom assets to add to your experience.
This is Apple's native AR experience builder that gives you quite a bit more control over your scene than many other AR apps.
For high-quality, custom AR experiences, the 3D objects must be created separately in 3D animation software and then combined with the trigger image in an AR app so the experience can be viewed. This is one limitation of AR production: a student looking to excel at producing AR experiences will need 3D animation knowledge. However, apps like Reality Composer (pictured below) allow quite a bit of creative freedom while still making creation easy, and there are sites that share 3D files made specifically for AR software, as discussed below.
In the image to the right, you can see an AR experience being built in Reality Composer on a MacBook (on the left). On the right, the same experience can be interacted with in 3D and virtual space using an iPad.
Depending on how the experience is programmed, a student could implement physics simulations or other interactions using the app's built-in functions. In some cases, the 3D asset can be animated to move on its own, such as the robot seen in the picture.
One limitation of Reality Composer is that you cannot create 3D assets within the program; all assets used in Reality Composer are either built-in or downloaded as 3D files (typically in the USDZ format) from a website that shares these designs (like usdzshare.com). However, once a user has imported an asset, Reality Composer's intuitive design allows a variety of experiences to be produced. Some apps, such as Adobe Aero, allow simple experiences to be created from built-in templates; these apps are the easiest to use and most user-friendly, but the resulting experiences are often generic and limited in terms of customisability.
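For those curious what this looks like in code, below is a hedged RealityKit sketch (the framework underlying Reality Composer, not a Reality Composer feature itself) that loads a downloaded USDZ asset and loops any animation bundled with it. The file name "robot.usdz" is an assumed placeholder standing in for an asset like the robot mentioned above.

```swift
import RealityKit

// Hedged sketch: load a downloaded USDZ asset (file name "robot.usdz" is assumed)
// from the app bundle, place it on a detected surface, and loop any bundled animation.
func placeDownloadedAsset(in arView: ARView) {
    do {
        // Load the USDZ file as an entity RealityKit can render.
        let robot = try Entity.loadModel(named: "robot")

        // Anchor it to a horizontal surface so it appears to sit on a table or floor.
        let anchor = AnchorEntity(plane: .horizontal)
        anchor.addChild(robot)
        arView.scene.addAnchor(anchor)

        // If the asset ships with an animation, play it on a loop.
        if let animation = robot.availableAnimations.first {
            robot.playAnimation(animation.repeat())
        }
    } catch {
        print("Could not load asset: \(error)")
    }
}
```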
That being said, creating AR experiences isn't a requirement for using this technology. As seen in the examples below, there are ways to learn from AR apps that don't involve creating AR. These are apps that simply make it possible to extend learning, improve student engagement, and take learning to places that wouldn't be possible without the technology.
While using Augmented Reality apps, in most cases students aren't asked to accomplish anything. AR is often simply used as a way to increase engagement and extend the learning experience, allowing students to visualise the learning materials in greater detail. Please view the examples below to see some of the possibilities. Each of the six categories has GIF media examples; if they have not loaded, please wait a moment for them to load so you can view the experience.
Teaching Art students to use AR allows them to take their creative skills further by building on their works of physical art and adding a digital overlay on top.
Students can use a variety of AR apps, such as Reality Composer, to learn about physics models and simulate them easily in real time using Augmented Reality.
Some museums have begun to incorporate AR into their exhibits to extend learning and increase engagement from their patrons.
Imagine biology students virtually dissecting an animal for an anatomy unit. Students can visualise the body parts in ways that wouldn't otherwise be possible in the classroom.
Imagine students being able to hold, see, and manipulate virtual molecules in real time.
Using a variety of AR apps or physical manipulatives (like the Merge Cube), students can explore the final frontier like never before.
In contrast with developing 3D animations, as described in question 5, another common way educators use Augmented Reality with students is through digital art, using apps like the EyeJack app (listed above). The embedded video in the tweet to the right demonstrates an excellent example of AR art in the classroom.
For this activity, students would need to move through the following steps:
Create a physical piece of art.
Scan or take a picture of the art; this will be used as the "trigger" image to activate the AR experience later.
Bring that picture of the art into software that allows for creating animations on top of it. The Keynote app from Apple, or even Procreate, would be good options, since students would already need an iPad to create the experience.
Develop an animation that begins with the original image and then transforms it into something new or different.
Export that animation as a finished video.
Use an app like EyeJack (listed above) to "connect" the animated video to the real-world image, which will then "trigger" the video when it is viewed.
The animation needs to start with an image of the original art so that, when it plays, it looks as if it is playing on top of the physical art and is a seamless, natural part of it.
An alternative first step could be to start with the animation and print out its first frame as both the trigger image and the physical piece of art posted on the wall, then program the app to recognize that image and play the animation once it is recognized.
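As a rough idea of what an app like EyeJack does behind the scenes (and not its actual implementation), here is a hedged RealityKit sketch that anchors the exported video to the registered trigger image so playback appears to happen on the physical art. The names "artwork" and "artwork-animation.mp4" and the 0.3 m by 0.4 m dimensions are assumptions for illustration.

```swift
import RealityKit
import AVFoundation

// Hedged sketch: anchor an exported animation video to the physical artwork.
// "artwork" (an image registered in the "AR Resources" asset catalog group) and
// "artwork-animation.mp4" are hypothetical names; the 0.3 m x 0.4 m plane size is
// assumed to match the printed art.
func attachVideo(to arView: ARView) {
    // The anchor tracks the physical artwork (the trigger image).
    let anchor = AnchorEntity(.image(group: "AR Resources", name: "artwork"))

    // Load the exported animation and wrap it in a video-backed material.
    guard let url = Bundle.main.url(forResource: "artwork-animation", withExtension: "mp4") else { return }
    let player = AVPlayer(url: url)
    let videoMaterial = VideoMaterial(avPlayer: player)

    // A flat plane the same size as the artwork, laid over the trigger image.
    let screen = ModelEntity(mesh: .generatePlane(width: 0.3, depth: 0.4),
                             materials: [videoMaterial])
    anchor.addChild(screen)
    arView.scene.addAnchor(anchor)

    // Because the video's first frame matches the physical art, playback looks seamless.
    player.play()
}
```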
There are a multitude of applications for Augmented Reality in education. AR allows students to visualize and engage with their learning materials in ways that would otherwise be impossible. Where students would normally have to look at pictures, such as when learning about the solar system, they can now hold planets and full solar systems in their own hands. In addition, with apps like Reality Composer, students can create their own solar systems and simulate planets' day cycles, all built with the app's easily accessible features.
The primary disadvantage of using AR is the technology required to use it: if you do not have access to a class set of iPads, it will likely be impossible to design a unit that incorporates AR. Additionally, if you are using a class set of iPads rather than BYOD devices, it is unlikely that students will be able to continue their learning with the technology outside of the classroom.
Other than the requirement for physical tech, there are no other main drawbacks; as seen in the examples above, this technology has been designed to increase engagement and student learning, provided you have access to the devices.
Since Augmented Reality is such a broad type of technology, I feel this question doesn't fully apply; however, I will do my best to answer. As shown in the use cases for AR technology, it allows for the extension of learning in a variety of settings: museums, science classes, art rooms, and more can all benefit from the extension of learning and the ability to create with AR technology. Students can use this technology to engage more deeply with their learning material, and teachers can use AR to help students understand and visualise a topic more clearly. Depending on the class, this visualisation is not only helpful for seeing what is being discussed and learned about; in some situations AR technology can also be interactive, which allows for even deeper learning. The physics simulation example shows a custom-designed wall of preselected items with preselected weights, and an object with a set material type and weight being hurled towards it, all of which affects how the objects behave as they collide and interact.
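To give a sense of the physics example described above, here is a hedged RealityKit sketch of a similar setup: a small wall of blocks with assigned masses and a heavier ball launched toward it. Reality Composer lets students assemble this kind of scene without any code; the sizes, masses, and launch velocity below are purely assumptions.

```swift
import RealityKit
import UIKit

// Hedged sketch: a wall of light blocks and a heavier ball thrown at it.
// All sizes (metres), masses (kilograms), and the launch velocity are assumed values.
func buildPhysicsDemo(in arView: ARView) {
    let anchor = AnchorEntity(plane: .horizontal)

    // A static "floor" so the dynamic objects have something to land on.
    let floor = ModelEntity(mesh: .generatePlane(width: 2, depth: 2),
                            materials: [SimpleMaterial(color: .white, isMetallic: false)])
    floor.generateCollisionShapes(recursive: false)
    floor.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: [2, 0.01, 2])],
                                             mass: 0, mode: .static)
    anchor.addChild(floor)

    // A 3 x 3 wall of light blocks that can be knocked over.
    for row in 0..<3 {
        for column in 0..<3 {
            let block = ModelEntity(mesh: .generateBox(size: 0.1),
                                    materials: [SimpleMaterial(color: .gray, isMetallic: false)])
            block.position = [Float(column) * 0.11 - 0.11, Float(row) * 0.11 + 0.05, 0]
            block.generateCollisionShapes(recursive: false)
            block.physicsBody = PhysicsBodyComponent(shapes: [.generateBox(size: [0.1, 0.1, 0.1])],
                                                     mass: 0.2, mode: .dynamic)
            anchor.addChild(block)
        }
    }

    // A heavier ball that starts in front of the wall.
    let ball = ModelEntity(mesh: .generateSphere(radius: 0.05),
                           materials: [SimpleMaterial(color: .red, isMetallic: true)])
    ball.position = [0, 0.2, 1.0]
    ball.generateCollisionShapes(recursive: false)
    ball.physicsBody = PhysicsBodyComponent(shapes: [.generateSphere(radius: 0.05)],
                                            mass: 1.0, mode: .dynamic)
    // Launch the ball toward the blocks so they collide and scatter.
    ball.physicsMotion = PhysicsMotionComponent(linearVelocity: [0, 0, -2])
    anchor.addChild(ball)

    arView.scene.addAnchor(anchor)
}
```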
That being said, the technology empowers teachers to let students be creative and take their artistic creations further by adding AR extensions, as demonstrated in sections 6 and 7 and in the "WebAR Demo" section included below. Students can bring their art to life using these technologies, and arts teachers can push their students' creativity even further by adding AR to physical artworks.
The major factor that would prevent educators from being able to use AR technologies in the classroom is the physical tech required to do so. In most cases, this means class sets of iPads. In some situations a teacher could rely on students bringing their own iPhones or iPads in a BYOD environment; however, this creates a new challenge if teachers want to use AR apps that are not free. With many of the more subject-specific apps costing money, trying to do AR in a BYOD environment might be unrealistic. This is the primary disadvantage of using this tech in the classroom.
That being said, if access to tech and funding for apps is not a concern, AR technology is quite accessible and easy to use and integrate in a variety of classes and subjects, as seen in the many examples below.
There are limitless applications for Augmented Reality across a variety of industries. Businesses are using AR for brand recognition and engagement; factories are using AR to improve the effectiveness of their workforce; the medical industry is using it to improve how treatments are learned and delivered; and firefighters are beginning to use it to improve their ability to rescue victims and keep themselves safe, with modified helmets and built-in 3D scanners.
However, there are a number of limitations preventing wide-scale adoption of AR in our daily lives. These include the requirement of a device, the high price points of some devices, the need for a server to host the overlaid AR information, and current physical limitations in the technology itself.
In most cases, AR can currently only be accessed by average users holding a smartphone or tablet up to a trigger image; however, "smart glasses" and other smart devices with built-in cameras and 3D scanning capabilities are in development and will allow for greater integration of AR into our daily lives. Even so, the financial cost of this tech will likely be prohibitive at first.
Additionally, most AR experiences are tied to a particular app or brand, such as Adobe Aero or EyeJack, and all experiences made with an app like EyeJack are hosted only on EyeJack's servers, making the content non-interoperable and accessible only if the user views the AR experience through the EyeJack app.
Solutions are being developed, such as WebAR (see the example below 👇), to allow more users to view an experience without needing to download a specific app. However, this is currently costly for the creator, as hosting this content is new and expensive.
It will not be until these issues are solved that we will see wide-scale adoption of AR.
Outdoor mural coming to life with AR
A beer company inviting its customers to play an AR experience with its product to increase engagement.
As previously mentioned, AR experiences typically require a special app to view. However, improvements to the technology are changing this; WebAR is one example. You can test out the AR overlays below without an AR app.
To activate the AR overlays in the examples below, point your mobile device or tablet's camera at this QR code. You will be prompted to open your device's web browser, where you can view each AR experience directly.
Tip: if your device is too far from the screen, it will be confused about which AR experience to play. Try isolating a single piece of art at a time.