A game programmer, XR developer, and Age of Empires enthusiast with a knack for bringing virtual worlds to life! As a final-year BTech student in Computer Science at UPES, I’ve spent two years in game development, recently launching Nakshatra AR on the Google Play Store and completing three industry internships. With experience in Unity, XR, and teamwork, I bring a collaborative spirit and a passion for innovation to every project. For me, it’s all about creating immersive and fun experiences!
Explore India’s Space Missions in Augmented Reality! 🚀
Experience India’s space legacy firsthand with Nakshatra AR, a powerful augmented reality app that brings the Chandrayaan-3 and Aditya L1 missions to life in your environment. Dive into realistic simulations, explore detailed mission models, and learn from your virtual AI guide, Kalpana, as you navigate the cosmos!
AR Foundation SDK for AR interactions in Unity.
Observer Pattern to decouple components, enabling event-driven updates.
Command Pattern for control actions.
Scriptable Object Pattern for a reusable, data-driven voice-over system.
Localization: the voice-over chatbot speaks both Hindi and English.
Factory Pattern for instantiating configurable particle systems.
Custom Editor Tools to streamline asset management and development workflows.
Shader Scripting in Unity to implement stencil testing for an AR portal.
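As a rough illustration of the event-driven approach above, here is a minimal observer-pattern sketch in Python (the app itself is written in C# for Unity; the MissionEvents hub and the event name are hypothetical, not from the actual codebase):

```python
# Minimal observer-pattern sketch. MissionEvents and the event names are
# illustrative placeholders, not the app's real classes.

class MissionEvents:
    """A tiny event hub: components subscribe without knowing each other."""
    def __init__(self):
        self._listeners = {}

    def subscribe(self, event_name, callback):
        self._listeners.setdefault(event_name, []).append(callback)

    def publish(self, event_name, payload=None):
        for callback in self._listeners.get(event_name, []):
            callback(payload)

# Usage: the UI and the voice-over system both react to one event,
# without the lander script referencing either of them directly.
events = MissionEvents()
log = []
events.subscribe("lander_touchdown", lambda p: log.append(f"UI: {p}"))
events.subscribe("lander_touchdown", lambda p: log.append(f"VO: {p}"))
events.publish("lander_touchdown", "Vikram has landed")
```

This is the decoupling the Observer Pattern buys: the publisher never holds a direct reference to any subscriber.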
Worked as the solo developer, 3D modeler, and game designer, responsible for creating, designing, and implementing every aspect of Nakshatra AR.
Unity: Used for building and deploying the app, leveraging its powerful AR capabilities and versatile 3D environment.
Blender: For designing and modeling detailed 3D assets, including rockets, satellites, landers, the rover, and other elements.
Adobe Substance 3D Painter: Applied to add realistic textures and enhance the visual fidelity of models, contributing to an immersive AR experience.
Chandrayaan-3 Simulation: Built a simulation where users can control the Vikram lander’s descent and explore the moon’s surface with the Pragyaan rover, bringing real-life mission details to an interactive AR format.
Aditya L1 Mission: Created a detailed model of ISRO’s solar observation mission, allowing users to explore and interact with components in AR while learning about the science behind solar studies.
Developed an educational mode for detailed exploration of mission models, including scientific instruments and equipment used by ISRO, to educate users on India's space technology and achievements. The AI guide, Kalpana, provides information about each component in both English and Hindi, enhancing accessibility and understanding.
Developed Kalpana, an interactive AI guide within the app, programmed to deliver context, mission insights, and guidance in both English and Hindi, making complex space science accessible and engaging for a wide age range.
Focused on an intuitive and educational user experience, blending interactive simulations with accessible learning elements to foster interest in STEM fields.
Ensured app accessibility for all age groups, providing educational value with a hands-on approach to space exploration.
Here are some of the hyper-casual games I have developed.
1) Vending Frenzy - match the pairs of 3
2) Paper Glide - a Flappy Bird-inspired paper-plane game
3) Flappy Dunk - a basketball-dunk-style game
4) Tower Stack - inspired by the game Stack
Around 80% of the UI assets were generated by AI.
This project showcases a dynamic Items and Inventory System designed for an immersive gaming experience. It features a Scriptable Object-based inventory system with item categories, a custom editor for scalability, and intuitive UI elements such as tabs, toggles, scroll views, and a 3D preview display. Players can interact with a game world, collect and manage items, and use a functional weapon system with ammunition.
1. Explore the Environment
Navigate a game world filled with scattered, interactable items.
2. Interact with Items
Pick Up Items: Collect items from the environment and add them to your inventory.
Drop Items: Remove items from your inventory and place them back in the game world with realistic physics.
3. Inventory Features
Scriptable Object-based system: Ensures flexible, reusable, and scalable inventory management.
Carry multiple items of the same type (e.g., health packs, keys).
Automatically organized inventory with item categories for better management.
Custom editor for adding new items and improving scalability.
4. Intuitive Inventory UI
Tabs, toggles, and scroll views for seamless navigation.
3D preview display for selected inventory items.
5. Weapon System
Carry only one unique weapon at a time.
Weapons are usable, firing projectiles when activated.
6. Bullet System
Bullets must be picked up as separate items from the environment.
Weapons consume bullets as ammunition during firing.
A tiny, two-level 2D platformer built to showcase game feel. The focus is on depth, responsiveness, and readable feedback, so the cube is simply fun to move.
Coyote Time You can still jump a split-second after stepping off a ledge. It forgives human timing and turns “I pressed jump!” moments into actual jumps instead of frustration.
Jump Buffer If you press jump just before landing, the jump still fires on touchdown. This removes the “dead button” feel and keeps flow intact during fast movement.
Variable Jump Height Tap for a short hop, hold for a full jump. This gives fine control in tight spaces and makes platforming feel expressive instead of binary.
Apex Boost A subtle acceleration near the top of a jump so lateral motion feels snappier when you’re committing to a landing. The result: fewer “stalling” feelings mid-air.
Input Leniency on Landing Quick reversals right after landing get a small assist. You can stick landings and turn quickly without fighting inertia, great for precise ledge work.
8-Way Dash Dash in any of the eight directions with predictable distance and speed. It’s a reliable “I meant to go THERE” button that opens up routing options and fun micro levels.
Dash Hit-Pause (Time Dilation) A tiny freeze + brief slow-mo at dash start. This creates punch without overwhelming the pace, and it clarifies “dash began” in your hands and eyes.
Wall Slide + Wall Jump Stick to walls to slide, with proper wall-to-wall jumps in the style of Prince of Persia, consistent and readable. Walls give levels an additional layer of depth, because jumping off them feels good.
Squash & Stretch The cube subtly stretches/tilts on jump, dash, wall slide, and land. It’s purely visual juice that communicates motion intent and impact without UI.
Camera Feel Gentle look-ahead in the move direction, a tiny zoom during dash, and micro-shake on dash/wall interactions only (never on normal jumps). The camera anticipates action and sells momentum while staying out of the way.
Color Language The cube tints based on state: soft yellow on jump, yellow→orange→red across the dash, easing back to neutral; slight red bias while falling hard, then snaps clean on landing. It’s a quick read of “what mode am I in?” at a glance.
Systems are modular and loosely coupled (input, locomotion, jump, dash, sensors, camera, FX).
Visual polish (squash/stretch, color) lives in visual-only scripts—gameplay physics stay clean.
Checkpoints and respawn are data-driven to support the tutorial → L1 → L2 flow.
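The coyote-time and jump-buffer features above reduce to two timers. Here is a minimal sketch in Python (the game is a Unity/C# project; the 0.1 s windows are assumed values, not the shipped tuning):

```python
# Coyote time + jump buffer, reduced to two timers. Window sizes are
# assumptions for illustration, not the game's actual tuning.
COYOTE_TIME = 0.1   # grace period after stepping off a ledge (seconds)
JUMP_BUFFER = 0.1   # how long a jump press is remembered (seconds)

class JumpHelper:
    def __init__(self):
        self.time_since_grounded = 999.0
        self.time_since_jump_pressed = 999.0

    def tick(self, dt, grounded, jump_pressed):
        """Advance one frame; returns True if a jump should fire."""
        self.time_since_grounded = 0.0 if grounded else self.time_since_grounded + dt
        if jump_pressed:
            self.time_since_jump_pressed = 0.0
        else:
            self.time_since_jump_pressed += dt
        # Jump fires if we were grounded recently (coyote time) AND the
        # jump button was pressed recently (jump buffer).
        if (self.time_since_grounded <= COYOTE_TIME
                and self.time_since_jump_pressed <= JUMP_BUFFER):
            self.time_since_jump_pressed = 999.0   # consume the press
            return True
        return False
```

Both forgiveness mechanics fall out of one condition: “recently grounded AND recently pressed,” which is why they compose so cleanly.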
A lightweight, grid-based Tower Defense game built in Unity, featuring efficient object pooling for “Rams” and projectile spawns.
Grid-based Placement
Snap towers and traps precisely onto your battlefield grid.
Object Pooling
Reuse “Ram” and projectile instances to keep GC spikes at bay and maintain smooth performance.
Customizable Pool Settings
Tweak pool sizes, pre-warm counts, and expansion policies right from the Inspector.
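The pooling idea above can be sketched in a few lines. This is a generic Python analogue of the Inspector-tunable settings (pre-warm count, max size, expansion policy); the names are illustrative, not the project's actual API:

```python
# Generic object-pool sketch: reuse instances instead of allocating and
# destroying them, which is what keeps GC spikes at bay in the game.

class ObjectPool:
    def __init__(self, factory, prewarm=0, max_size=None, allow_expand=True):
        self.factory = factory
        self.max_size = max_size
        self.allow_expand = allow_expand
        self.free = [factory() for _ in range(prewarm)]   # pre-warmed instances
        self.total = prewarm

    def acquire(self):
        if self.free:
            return self.free.pop()
        # Expansion policy: grow on demand, up to an optional cap.
        if self.allow_expand and (self.max_size is None or self.total < self.max_size):
            self.total += 1
            return self.factory()
        return None   # pool exhausted and expansion disallowed

    def release(self, obj):
        self.free.append(obj)   # return to the pool for reuse

pool = ObjectPool(factory=dict, prewarm=2, max_size=3)
a, b, c = pool.acquire(), pool.acquire(), pool.acquire()
```

A “Ram” or projectile would be released back here on death or impact rather than destroyed.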
This is a fully functional chess moves calculator. It’s designed with clean and modular code, making it easy to extend and customize. The game handles chess piece movement dynamically, highlights possible moves in real time, and tracks the board state efficiently.
Strategy Pattern: Each piece type (Pawn, Rook, Knight, etc.) has its own movement logic encapsulated in a separate class.
Singleton Pattern: The chessboard manager is a single instance that tracks tiles, pieces, and interactions across the game.
ChessPiece.cs :
Represents each piece on the board.
Decides where the piece can move using its assigned movement strategy.
Highlights moves dynamically based on the board state.
ChessBoardPlacementHandler.cs :
Handles the chessboard tiles and manages the state of all pieces.
Clears old highlights, shows new ones, and ensures the board stays consistent.
Movement Strategies :
Each piece type (Pawn, Knight, etc.) has its own strategy class (e.g., KnightMoveStrategy) that encapsulates its unique movement rules.
ChessPlayerPlacementHandler.cs :
Manages the position of a piece and syncs it with the board dynamically.
Smart Movement Logic: Each chess piece has its own movement rules powered by the Strategy Pattern—making the code flexible and easy to maintain.
Real-Time Highlights: The game highlights valid moves in green and potential captures in red, giving clear feedback as you play.
Efficient Board Management: The chessboard is managed with a 2D grid system, ensuring smooth interactions and fast lookups for moves.
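The Strategy Pattern described above can be sketched as follows. This is a Python analogue of the C# scripts listed earlier (ChessPiece, KnightMoveStrategy); the class bodies here are illustrative, not the actual code:

```python
# Strategy-pattern sketch: each piece type supplies its own move generator
# on an 8x8 board, and ChessPiece just delegates to its assigned strategy.

class KnightMoveStrategy:
    OFFSETS = [(1, 2), (2, 1), (2, -1), (1, -2),
               (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

    def moves(self, row, col):
        # All L-shaped jumps that stay on the board.
        return [(row + dr, col + dc) for dr, dc in self.OFFSETS
                if 0 <= row + dr < 8 and 0 <= col + dc < 8]

class RookMoveStrategy:
    def moves(self, row, col):
        # Every square in the same rank or file (ignoring blockers here).
        return ([(r, col) for r in range(8) if r != row] +
                [(row, c) for c in range(8) if c != col])

class ChessPiece:
    """Delegates move generation to its assigned movement strategy."""
    def __init__(self, strategy):
        self.strategy = strategy

    def legal_moves(self, row, col):
        return self.strategy.moves(row, col)

knight = ChessPiece(KnightMoveStrategy())
rook = ChessPiece(RookMoveStrategy())
```

Adding a new piece type means adding one strategy class, with no changes to ChessPiece or the board manager.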
A real-time heatmap shader that “paints” any surface you look at in VR (or a 3D game) with a dynamic color overlay. Essentially, wherever your gaze lingers, the shader gradually builds up a visual indicator—red for high attention, green for lower attention—so you can instantly see where users are focusing in your scene.
Aqua Recall is an AR memory game inspired by the Indian card game Chattai, blending cultural heritage with modern tech. Set in a virtual aquarium, players match fish pairs within a minute, using Vuforia’s AR markers for immersive gameplay. With unique 3D fish species and seamless worldspace UI, the game challenges players to score points and keep their three lives intact, making every second count.
Vuforia SDK used for integrating marker-based AR in Unity.
Immersive Aquarium Environment: Players engage in a virtual aquarium where fish are initially positioned outside the water.
Memory Challenge: The objective is to memorize and match at least five pairs of identical fish within a one-minute timeframe.
Diverse Fish Species: The game features five distinct fish species, each with unique textures, adding variety and complexity to the matching challenge.
Through-the-Glass Perspective: Players can view the aquarium through a virtual glass, enhancing the immersive experience.
Integrated User Interface: UI elements are seamlessly incorporated into the world space, reducing screen clutter and improving user experience.
Point Scoring: Players earn points by successfully matching pairs of fish.
Time Management: A countdown timer adds urgency, requiring quick and accurate matches.
Life Preservation: Players have three lives per session, adding a strategic layer to the gameplay.
Data Structures:
Arrays: Utilized for efficient storage and management of fish objects.
Lists: Employed for dynamic handling of game elements.
Dictionaries: Implemented for rapid lookups and associations between fish pairs.
AR Integration: Leveraged Vuforia’s marker-based AR capabilities to anchor virtual elements in the real world, enhancing interactivity.
3D Asset Creation: All 3D models and animations were crafted using Blender and Adobe Substance Painter, ensuring high-quality visuals and smooth animations.
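The dictionary-driven pair matching described above can be sketched like this (Python for brevity; the game is built in Unity/C#, and the fish IDs, species names, and MatchGame class are illustrative):

```python
# Pair-matching sketch using a dictionary for O(1) species lookups.
# Two picked fish match when their species are equal.

class MatchGame:
    def __init__(self, fish_species_of):
        self.species_of = dict(fish_species_of)   # fish_id -> species
        self.first_pick = None
        self.score = 0
        self.lives = 3

    def pick(self, fish_id):
        """Returns None on the first pick, then True/False for the pair."""
        if self.first_pick is None:
            self.first_pick = fish_id
            return None
        matched = self.species_of[self.first_pick] == self.species_of[fish_id]
        self.first_pick = None
        if matched:
            self.score += 1
        else:
            self.lives -= 1      # a wrong pair costs one of the three lives
        return matched

game = MatchGame({1: "clownfish", 2: "clownfish",
                  3: "angelfish", 4: "angelfish"})
```

The one-minute timer would simply end the session when it expires, independent of this matching logic.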
Came 3rd and won a ₹50,000 prize.
India’s First AI Game Jam, held at IICS, Delhi, with a prize pool of ₹2,50,000! 🎮🤖 This groundbreaking event was organized by Nilee Games and Future Technologies Pvt. Ltd. The challenge: create a complete game, from concept to completion, in just 9 hours, leveraging the best of Artificial Intelligence. What seemed nearly impossible at first turned into an inspiring showcase of creativity and tech innovation, thanks to our prior experience with AI in game development at Nilee Games.
✨ 25 teams participated
✅ 13 successful submissions
🏆 3 winners, selected by an esteemed jury: Rajat Ojha, Anuj Sahani, and Nikhil Malankar
1. Why this matters for UX in VR/3D development:
Immediate Attention Mapping: Live visual feedback on user focus—no post-session analytics or external hardware needed.
Iterative Design: Spot “dead zones” and hotspots in real time; tweak lighting, geometry, or narrative cues accordingly.
Accessibility & Comfort: A subtle heatmap overlay guides players toward points of interest, reducing disorientation and improving onboarding.
2. How it works:
Raycast from the camera’s forward vector every frame to detect the surface you’re looking at.
Accumulate “heat” values on that material’s UV coordinates—brightening over time if your gaze remains fixed.
Blend a color gradient onto the existing material, with red indicating high-focus areas and green indicating emerging areas of interest.
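The accumulate-and-blend steps above can be sketched outside the shader. A real version runs on the GPU per texel; here is a hedged Python sketch where the heat rates are assumed constants, not the shader's actual values:

```python
# Gaze-heat accumulation and the green -> red gradient, per texel.
# HEAT_PER_SECOND / COOL_PER_SECOND are illustrative assumptions.

HEAT_PER_SECOND = 0.5   # how fast a fixated point heats up
COOL_PER_SECOND = 0.1   # gentle cooldown when gaze moves away

def accumulate(heat, gazed, dt):
    """Advance one texel's heat value, clamped to [0, 1]."""
    rate = HEAT_PER_SECOND if gazed else -COOL_PER_SECOND
    return min(1.0, max(0.0, heat + rate * dt))

def heat_color(heat):
    """Lerp green (low attention) -> red (high attention), RGB in [0, 1]."""
    return (heat, 1.0 - heat, 0.0)

# One second of steady fixation at 60 fps:
h = 0.0
for _ in range(60):
    h = accumulate(h, gazed=True, dt=1 / 60)
```

In the shader this same math is a lerp between two colors driven by a heat value stored per UV coordinate.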
The NFS-Inspired Car Configurator with AR Viewing is a dynamic application that allows users to customize cars with over 65 material options and view them in augmented reality. Developed within three days, the project integrates advanced design patterns, Unity's new Input System, and the Universal Render Pipeline to deliver an optimized and immersive user experience. The application features a sophisticated AR car spawning mechanism, ensuring precise placement and interaction within the user's environment.
Observer Pattern via events/delegates for modular and efficient communication.
URP Integration for optimized performance and visuals in mobile AR.
Vector3.Dot and plane normals for accurate AR car placement.
3D Model Optimization: custom-textured models for improved efficiency and fidelity.
Completed the project from scratch within three days as part of a company assignment.
Focused on creating a polished car configurator with augmented reality (AR) capabilities.
Utilized events and delegates to implement the observer pattern, enhancing modularity and extensibility.
Facilitated efficient communication between components, promoting a decoupled architecture.
Adopted Unity's new Input System, moving away from the traditional Input Manager.
Improved reliability by avoiding string references, ensuring efficient input handling.
Implemented URP to optimize performance on mobile platforms.
Enhanced rendering efficiency and graphical fidelity, crucial for AR experiences.
Provided users with over 65 custom materials for car customization.
Enhanced user engagement by offering a wide range of personalization choices.
Developed a mechanism for spawning cars in AR, dynamically selecting the bottommost plane for placement.
Utilized vector calculations, including Vector3.Dot and plane normals, to determine accurate Y positions for precise placement.
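The plane-selection step can be sketched as follows: among detected AR planes, keep the upward-facing ones (normal · up close to 1) and take the lowest by Y. This is a hedged Python analogue of the Unity logic, with plain tuples standing in for Vector3 and made-up plane data:

```python
# Bottommost-plane selection via dot products. Plane data is illustrative;
# in Unity this would use Vector3.Dot on ARPlane centers and normals.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pick_spawn_plane(planes, up=(0.0, 1.0, 0.0), threshold=0.95):
    """planes: list of (center_xyz, normal_xyz) tuples.
    Returns the bottommost upward-facing plane's center, or None."""
    horizontal = [(c, n) for c, n in planes if dot(n, up) > threshold]
    if not horizontal:
        return None
    return min(horizontal, key=lambda p: p[0][1])[0]   # lowest Y = the floor

planes = [
    ((0, 0.8, 0), (0, 1, 0)),   # table surface (upward-facing, but high)
    ((0, 0.0, 0), (0, 1, 0)),   # floor (upward-facing, lowest Y)
    ((1, 0.4, 0), (1, 0, 0)),   # wall (normal not upward, rejected)
]
```

Filtering by the dot product first is what keeps walls and steep surfaces from ever being chosen as the spawn plane.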
Optimized a 3D model sourced from Sketchfab, tailoring it specifically for the project.
Applied custom textures using Substance 3D Painter to enhance visual fidelity and performance.
Ensured a clean, readable, and scalable codebase.
Adopted a loosely coupled and highly cohesive structure, adhering to Unity's component-based architecture.
Maintained the principle of one script per functionality, facilitating easier maintenance and expansion.
The AR Virtual Piano Experience is an augmented reality application that allows users to interact with a virtual piano by pressing virtual buttons corresponding to the notes of Kanye West's "Runaway." As users engage with the virtual keys, they collaboratively recreate the melody, culminating in the full song playing upon completion. This project exemplifies the integration of AR technology with musical interaction, offering an engaging and immersive user experience.
Developed an augmented reality (AR) application featuring a virtual piano that allows users to interactively play Kanye West's "Runaway" using virtual buttons.
Utilized Unity for application development.
Integrated Vuforia for implementing virtual buttons in AR.
Designed nine virtual buttons, each corresponding to a specific note in the "Runaway" melody.
Configured buttons to respond to user interactions, triggering the associated musical notes.
Enabled users to press virtual piano keys in an AR environment, creating an immersive musical experience.
Programmed the application to play the complete "Runaway" melody upon activation of all nine notes.
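The completion logic above amounts to tracking which of the nine notes have been activated. A minimal Python sketch (the app is Unity/C# with Vuforia virtual buttons; the note names are placeholders, not the actual “Runaway” transcription):

```python
# Song-completion sketch: each virtual button registers its note, and the
# full song triggers the moment all nine notes have been activated.

MELODY_NOTES = {f"note_{i}" for i in range(9)}   # nine virtual buttons

class PianoProgress:
    def __init__(self, required=MELODY_NOTES):
        self.required = set(required)
        self.activated = set()

    def press(self, note):
        """Returns True when the last missing note is activated."""
        self.activated.add(note)
        return self.activated >= self.required   # set superset check

piano = PianoProgress()
results = [piano.press(f"note_{i}") for i in range(9)]
```

Using a set means repeated presses of the same key are harmless; only the ninth distinct note completes the melody.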
Gained practical experience in AR development, virtual button integration, and interactive user experience design.
Explored the fusion of technology and music through immersive AR applications.