Explore India’s Space Missions in Augmented Reality! 🚀
Experience India’s space legacy firsthand with Nakshatra AR, a powerful augmented reality app that brings the Chandrayaan-3 and Aditya L1 missions to life in your environment. Dive into realistic simulations, explore detailed mission models, and learn from your virtual AI guide, Kalpana, as you navigate the cosmos!
AR Foundation SDK for AR interactions in Unity.
Observer Pattern to decouple components, enabling event-driven updates.
Command Pattern for control actions.
Scriptable Object Pattern for a reusable, data-driven voice-over system.
Localization: the voice-over chatbot speaks both Hindi and English.
Factory Pattern for instantiating configurable particle systems.
Custom Editor Tools to streamline asset management and development workflows.
Shader Scripting in Unity to implement stencil testing for AR portal.
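The Observer Pattern mentioned above can be sketched roughly as follows — a minimal, illustrative example of decoupled, event-driven updates in Unity C# (the class and event names are hypothetical, not the app's actual code):

```csharp
using System;
using UnityEngine;

// Publisher: raises an event whenever the mission phase changes.
// It never references its listeners directly.
public static class MissionEvents
{
    public static event Action<string> OnPhaseChanged;

    public static void RaisePhaseChanged(string phase) =>
        OnPhaseChanged?.Invoke(phase);
}

// Subscriber: UI (or the voice-over system) reacts to the event,
// subscribing on enable and unsubscribing on disable.
public class PhaseBanner : MonoBehaviour
{
    void OnEnable()  => MissionEvents.OnPhaseChanged += UpdateBanner;
    void OnDisable() => MissionEvents.OnPhaseChanged -= UpdateBanner;

    void UpdateBanner(string phase) => Debug.Log($"Phase: {phase}");
}
```

Keeping the publisher ignorant of its subscribers is what lets new systems (audio, UI, analytics) hook into mission events without touching existing code.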
Worked as the solo developer, 3D modeler, and game designer, responsible for creating, designing, and implementing every aspect of Nakshatra AR.
Unity: Used for building and deploying the app, leveraging its powerful AR capabilities and versatile 3D environment.
Blender: For designing and modeling detailed 3D assets, including rockets, satellites, landers, rovers, and other elements.
Adobe Substance 3D Painter: Applied to add realistic textures and enhance the visual fidelity of models, contributing to an immersive AR experience.
Chandrayaan-3 Simulation: Built a simulation where users can control the Vikram lander’s descent and explore the Moon’s surface with the Pragyan rover, bringing real-life mission details to an interactive AR format.
Aditya L1 Mission: Created a detailed model of ISRO’s solar observation mission, allowing users to explore and interact with components in AR while learning about the science behind solar studies.
Developed an educational mode for detailed exploration of mission models, including scientific instruments and equipment used by ISRO, to educate users on India's space technology and achievements. The AI guide, Kalpana, provides information about each component in both English and Hindi, enhancing accessibility and understanding.
Developed Kalpana, an interactive AI guide within the app, programmed to deliver context, mission insights, and guidance in both English and Hindi, making complex space science accessible and engaging for a wide age range.
Focused on an intuitive and educational user experience, blending interactive simulations with accessible learning elements to foster interest in STEM fields.
Ensured app accessibility for all age groups, providing educational value with a hands-on approach to space exploration.
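A ScriptableObject-based, bilingual voice-over line like the one Kalpana uses might be modeled as below — a hedged sketch only; the asset fields and enum are illustrative assumptions, not the project's actual definitions:

```csharp
using UnityEngine;

public enum VoiceLanguage { English, Hindi }

// One asset per voice-over line: audio and subtitle text for each
// supported language, editable as data without code changes.
[CreateAssetMenu(menuName = "NakshatraAR/VoiceLine")]
public class VoiceLine : ScriptableObject
{
    public AudioClip englishClip;
    public AudioClip hindiClip;
    [TextArea] public string englishSubtitle;
    [TextArea] public string hindiSubtitle;

    public AudioClip ClipFor(VoiceLanguage lang) =>
        lang == VoiceLanguage.Hindi ? hindiClip : englishClip;

    public string SubtitleFor(VoiceLanguage lang) =>
        lang == VoiceLanguage.Hindi ? hindiSubtitle : englishSubtitle;
}
```

Because each line is an asset, adding or re-recording dialogue becomes a content task in the editor rather than a code change.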
Aqua Recall is an AR memory game inspired by the Indian card game Chattai, blending cultural heritage with modern tech. Set in a virtual aquarium, players match fish pairs within a minute, using Vuforia’s AR markers for immersive gameplay. With unique 3D fish species and seamless worldspace UI, the game challenges players to score points and keep their three lives intact, making every second count.
Vuforia SDK for integrating marker-based AR in Unity.
Immersive Aquarium Environment: Players engage in a virtual aquarium where fish are initially positioned outside the water.
Memory Challenge: The objective is to memorize and match at least five pairs of identical fish within a one-minute timeframe.
Diverse Fish Species: The game features five distinct fish species, each with unique textures, adding variety and complexity to the matching challenge.
Through-the-Glass Perspective: Players can view the aquarium through a virtual glass, enhancing the immersive experience.
Integrated User Interface: UI elements are seamlessly incorporated into the world space, reducing screen clutter and improving user experience.
Point Scoring: Players earn points by successfully matching pairs of fish.
Time Management: A countdown timer adds urgency, requiring quick and accurate matches.
Life Preservation: Players have three lives per session, adding a strategic layer to the gameplay.
Data Structures:
Arrays: Utilized for efficient storage and management of fish objects.
Lists: Employed for dynamic handling of game elements.
Dictionaries: Implemented for rapid lookups and associations between fish pairs.
AR Integration: Leveraged Vuforia’s marker-based AR capabilities to anchor virtual elements in the real world, enhancing interactivity.
3D Asset Creation: All 3D models and animations were crafted using Blender and Adobe Substance 3D Painter, ensuring high-quality visuals and smooth animations.
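The dictionary-based pair lookup described above could look something like this minimal C# sketch (class names, species ids, and scoring values are illustrative assumptions):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Pair matching with a dictionary: each fish carries a species id, and the
// dictionary maps species id -> the first selected fish, giving O(1) lookups
// when the player taps a second fish.
public class PairMatcher : MonoBehaviour
{
    readonly Dictionary<int, GameObject> pending = new Dictionary<int, GameObject>();
    public int Score { get; private set; }

    public void Select(GameObject fish, int speciesId)
    {
        if (pending.TryGetValue(speciesId, out var first) && first != fish)
        {
            // Second fish of the same species: a successful match.
            pending.Remove(speciesId);
            Score += 10;
            first.SetActive(false);
            fish.SetActive(false);
        }
        else
        {
            // First fish of this species: remember it for the next tap.
            pending[speciesId] = fish;
        }
    }
}
```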
The NFS-Inspired Car Configurator with AR Viewing is a dynamic application that allows users to customize cars with over 65 material options and view them in augmented reality. Developed within three days, the project integrates advanced design patterns, Unity's new Input System, and the Universal Render Pipeline to deliver an optimized and immersive user experience. The application features a sophisticated AR car spawning mechanism, ensuring precise placement and interaction within the user's environment.
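The plane-anchored spawning described above might be sketched as follows with AR Foundation's raycast API — shown with the legacy touch API for brevity (the project itself uses the new Input System), and with a hypothetical prefab field:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Raycast from the touch point against detected planes and place
// (or reposition) the car at the hit pose.
public class CarSpawner : MonoBehaviour
{
    public GameObject carPrefab;                    // assumed assigned in the Inspector
    ARRaycastManager raycaster;
    readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();
    GameObject spawned;

    void Awake() => raycaster = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        if (raycaster.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            var pose = hits[0].pose;
            if (spawned == null)
                spawned = Instantiate(carPrefab, pose.position, pose.rotation);
            else
                spawned.transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```

Reusing the single spawned instance (rather than instantiating per tap) keeps placement precise and avoids cluttering the scene with duplicates.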
Had a blast creating Bubble Blast AR this year at The Global Game Jam 2025.
With an online leaderboard integrated into the game, I had fun competing with others at the jam venue as we tried to beat each other's high scores. I also went as a mentor this time; every team did great, and seeing everyone's passion for game development was wonderful.
Fun story: On the first day, I was just mentoring and helping other teams, but watching everyone bring their ideas to life made me want to make a game of my own. So on the second day I started my own game and created it in just 20 hours. I really love how game jams push you to build a game matching a theme within a tight time frame, and how much you improve in the process.
It's an immersive augmented reality game where players use a slingshot to smash virtual glass targets. By tapping the screen at the right moment, players adjust the force of their shots, simulating the satisfying action of breaking glass. The game features realistic 3D models, responsive audio feedback, and increasing difficulty levels, offering a unique and engaging AR experience.
Utilized Unity 2022.3.22f1, AR Foundation, ARCore, and Vuforia to create an engaging augmented reality (AR) experience.
Designed a bar-like 3D model using Blender.
Applied textures with Adobe Substance 3D Painter.
Optimized the model for performance before importing into Unity.
Implemented a slingshot mechanism allowing players to smash virtual glass targets.
Incorporated a timing-based challenge where players tap the screen to adjust projectile force, simulating glass smashing.
Enabled player interaction through screen taps to control the slingshot.
Provided audio feedback with cash register sounds and a "now open" sign upon target detection.
Activated visual elements like the slingshot, 3D model, and background music to enhance immersion.
Developed a loosely coupled and cohesive codebase to minimize dependencies and promote scalability.
Adhered to camel case naming conventions for readability.
Employed getter and setter methods where necessary.
Tested and optimized the game on the developer's device to ensure a smooth AR experience.
Introduced a mini-game within an AR context, combining AR technology with a familiar fairground activity.
Enhanced user engagement through audio feedback and interactive gameplay mechanics.
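The tap-timing force mechanic above might be implemented roughly like this — a hedged sketch where the oscillating power meter, force range, and field names are all illustrative assumptions:

```csharp
using UnityEngine;

// Tap-timing force: a power value oscillates while aiming; whatever the
// value is at the moment of the tap becomes the launch impulse.
public class Slingshot : MonoBehaviour
{
    public Rigidbody projectilePrefab;     // assumed assigned in the Inspector
    public Transform muzzle;               // launch point and direction
    public float minForce = 2f, maxForce = 12f, cycleSpeed = 2f;

    // PingPong sweeps 0..1..0, mapped onto the force range.
    float Power => Mathf.Lerp(minForce, maxForce,
        Mathf.PingPong(Time.time * cycleSpeed, 1f));

    void Update()
    {
        if (Input.GetMouseButtonDown(0))   // registers as a screen tap on device
        {
            var shot = Instantiate(projectilePrefab, muzzle.position, muzzle.rotation);
            shot.AddForce(muzzle.forward * Power, ForceMode.Impulse);
        }
    }
}
```

Sampling the oscillating value at tap time is what turns raw reaction timing into the skill element of the glass-smashing challenge.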
The AR Portfolio Showcase is an augmented reality application that allows users to explore 3D representations of academic projects by scanning designated markers. Each marker reveals a specific project, offering interactive elements and detailed information, providing an immersive and engaging way to present and review past work.
Developed an augmented reality (AR) application to present first and second-year projects using multiple markers, as part of an AR & VR Development class assignment.
Utilized Unity for application development.
Integrated Vuforia for marker-based AR functionality.
Implemented multiple image markers, each corresponding to a specific project.
Enabled users to scan markers and view associated projects in 3D.
Incorporated interactive elements, such as an info button, providing detailed information about each project upon user interaction.
Designed the interface to be intuitive, enhancing user engagement and accessibility.
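Marker-driven content toggling in this style might look like the sketch below, assuming Vuforia Engine 10+ where `ObserverBehaviour` exposes `OnTargetStatusChanged` (older versions used `DefaultTrackableEventHandler`); the content field is a hypothetical example:

```csharp
using UnityEngine;
using Vuforia;

// Show a project's 3D content while its image marker is tracked,
// and hide it when tracking is lost.
public class ProjectMarker : MonoBehaviour
{
    public GameObject projectContent;      // the 3D project shown on this marker

    void Start()
    {
        var observer = GetComponent<ObserverBehaviour>();
        observer.OnTargetStatusChanged += (_, status) =>
            projectContent.SetActive(status.Status == Status.TRACKED ||
                                     status.Status == Status.EXTENDED_TRACKED);
    }
}
```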
The AR Virtual Piano Experience is an augmented reality application that allows users to interact with a virtual piano by pressing virtual buttons corresponding to the notes of Kanye West's "Runaway." As users engage with the virtual keys, they collaboratively recreate the melody, culminating in the full song playing upon completion. This project exemplifies the integration of AR technology with musical interaction, offering an engaging and immersive user experience.
Developed an augmented reality (AR) application in which users press virtual keys to collaboratively recreate a melody.
Utilized Unity for application development.
Integrated Vuforia for marker-based AR functionality.
Triggered playback of the full song once the complete melody had been recreated.
Designed the interface to be intuitive, enhancing user engagement and accessibility.
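The melody-completion logic described above could be sketched as follows (the note indices, field names, and the restart-on-wrong-note rule are illustrative assumptions, not the project's confirmed behavior):

```csharp
using UnityEngine;

// Sequence checking: each virtual key reports its note index; once the
// expected notes have all been pressed in order, the full song plays.
public class MelodyTracker : MonoBehaviour
{
    public int[] expectedNotes;            // the melody's note indices, in order
    public AudioSource fullSong;           // played once the melody is complete
    int progress;

    // Hooked to each virtual key's press event.
    public void OnKeyPressed(int note)
    {
        if (note == expectedNotes[progress])
        {
            progress++;
            if (progress == expectedNotes.Length) fullSong.Play();
        }
        else
        {
            progress = 0;                  // wrong note: restart the sequence
        }
    }
}
```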
The AR Shoe Customization Project is an augmented reality application that enables users to personalize Nike shoes in real-time and visualize their designs within their environment. By combining optimized 3D modeling, dynamic material manipulation, and intuitive UI/UX design, the application offers an immersive and engaging user experience, bridging the gap between digital customization and physical visualization.
Developed an augmented reality (AR) application enabling users to customize Nike shoes in real-time within their environment.
Utilized Unity for application development.
Employed Blender for 3D modeling and optimization.
Integrated Figma for UI/UX design collaboration.
Optimized 3D shoe models in Blender to ensure seamless integration and performance in Unity.
Reduced polygon count and applied efficient texturing techniques to maintain visual fidelity while enhancing performance.
Implemented scripts in Unity to allow dynamic material changes on shoe models, enabling real-time customization by users.
Ensured smooth transitions and responsiveness during material updates to enhance user experience.
Worked closely with a UI/UX designer to create an intuitive and user-friendly interface for the customization process.
Incorporated user feedback to refine the design, focusing on accessibility and engagement.
Integrated AR functionalities to allow users to visualize customized shoes within their real-world environment.
Ensured accurate scaling and positioning of the virtual shoes to provide a realistic experience.
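The real-time material swapping described above might reduce to something like this minimal sketch (renderer and material fields are hypothetical and would be assigned in the Inspector):

```csharp
using UnityEngine;

// Real-time customization: swap the material on one part of the shoe,
// driven by UI swatch buttons.
public class ShoeCustomizer : MonoBehaviour
{
    public Renderer targetPart;            // e.g. the sole or the upper mesh
    public Material[] options;             // the selectable material swatches

    // Hooked to a UI button; index identifies the chosen swatch.
    public void ApplyMaterial(int index)
    {
        targetPart.material = options[index];
    }
}
```

Assigning shared `Material` assets (rather than editing material properties per frame) keeps the swap instant and avoids per-instance material leaks.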