C# AI Demo: Showcasing patrol routes, sight/hearing detection, chase behaviour, last known position investigation, and return-to-patrol logic.
This project, my final major project for the 'Games Development' course (Year 2, 2024) at Gateshead College, is a deep dive into creating believable NPC AI behaviours using Unity and C#. The goal was to simulate realistic AI decision-making and reactions within given scenarios, prioritising believable behaviour over purely gameplay-driven 'fun'. The project involved not only implementation but also extensive research into AI programming techniques and analysis of advanced AI in existing games.
The core behaviour tree logic was adapted from Mina Pêcheux's tutorials on YouTube, and the Field-of-View system implementation was guided by Comp-3 Interactive. These resources were instrumental to the project's success. Further research drawing on many other authors informed the design process, all of which is documented on my development blog.
This project demonstrates several key AI features, implemented in Unity using C#.
State Transitions (Behaviour Tree): The AI correctly transitions between states. It defaults to patrolling, reacts appropriately to visual or auditory stimuli, investigates last known positions, and seamlessly returns to its patrol routine.
Pathfinding (NavMesh): Uses Unity's built-in NavMesh system to let the NPC navigate the environment according to the project's requirements.
Patrol Behaviour: The AI autonomously navigates between a configurable set of waypoints, which can be repositioned freely within the world.
Detection & Field-of-View (FOV): Implemented a visual detection system allowing the NPC to "see" the player within a defined cone. A trigger-based system simulates hearing, allowing the AI to react to player movement sounds.
Chase State: Upon visually detecting the player, the AI enters a chase state, actively pursuing the player and updating their last known position.
Last Known Position: If visual contact is lost, or if only sound is detected, the AI navigates to the player's last known location, pauses briefly to "search", and then defaults back to patrolling if no new stimuli are found.
Demoing: The demo uses a 3D character controller adapted from a previous college project focused on player movement.
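To give a rough idea of how the sight cone and the state priority (sight beats sound, sound beats patrol) fit together, here is a simplified, engine-free sketch. The actual project uses Unity's Vector3, raycast occlusion checks, and a full behaviour tree; names such as `GuardAI`, `CanSeePlayer`, and `NextState` are illustrative only and do not come from the project's code.

```csharp
using System;
using System.Numerics;

// Hypothetical, simplified guard logic: a cone-based sight check plus the
// priority ordering Chase > Investigate > Patrol. Names are illustrative.
enum GuardState { Patrol, Chase, Investigate }

static class GuardAI
{
    // The player is "seen" if within viewRadius and inside the view cone
    // (half-angle = viewAngleDegrees / 2, measured from the guard's forward).
    public static bool CanSeePlayer(Vector3 guardPos, Vector3 guardForward,
                                    Vector3 playerPos,
                                    float viewRadius, float viewAngleDegrees)
    {
        Vector3 toPlayer = playerPos - guardPos;
        if (toPlayer.Length() > viewRadius) return false;

        float angle = MathF.Acos(Vector3.Dot(Vector3.Normalize(guardForward),
                                             Vector3.Normalize(toPlayer)))
                      * 180f / MathF.PI;
        return angle <= viewAngleDegrees * 0.5f;
    }

    // Behaviour-tree style priority: sight wins over sound, sound wins over patrol.
    public static GuardState NextState(bool seesPlayer, bool hasLastKnownPosition)
    {
        if (seesPlayer) return GuardState.Chase;
        if (hasLastKnownPosition) return GuardState.Investigate;
        return GuardState.Patrol;
    }
}
```

A real implementation would also raycast from the guard to the player so that walls block line of sight; this sketch omits occlusion entirely.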
Engine: Unity
Language: C#
Core Concepts: AI Behaviour (State Transitions), Behaviour Trees, Pathfinding (Unity NavMesh), Field of View Implementation, Event Triggers (for sound detection)
IDEs: Visual Studio
Developing this AI project involved significant research, critical decision-making, and persistent problem-solving:
Initial Research & Design: A major early challenge was choosing the right tools and architecture. After researching Unity versus Unreal Engine along with different AI implementation structures, I opted for Unity and behaviour trees due to familiarity and perceived ease. This phase also involved in-depth analysis of advanced AI in games like F.E.A.R., Alien: Isolation, and Halo 2 to understand techniques for creating believable NPC behaviour (documented further on the development blog).
Implementation & Debugging: While adapting the core Behaviour Tree framework was relatively smooth, a significant hurdle arose with the Last Known Position logic: the AI would often return to patrolling before reaching the target position. After considerable debugging and input from peers and tutors, the solution was to temporarily insert the position into the AI's patrol point array. Resolving subsequent issues with removing this temporary point reinforced the importance of careful state management and persistent debugging. One underlying issue stemmed from the depth of the research itself: many additional features were planned, but as the deadline approached they had to be cut or scaled back.
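The patrol-array workaround described above can be sketched roughly as follows. This is a minimal, engine-free illustration with hypothetical names (`PatrolRoute`, `Investigate`, `OnTargetReached`); the actual project stores its waypoints on a Unity component and drives movement through the NavMesh agent.

```csharp
using System.Collections.Generic;
using System.Numerics;

// Sketch of the workaround: the last known position is spliced into the
// patrol list as the immediate next target, so the existing patrol code
// drives the guard there, then it is removed once reached so normal
// patrolling resumes. Names are hypothetical, not from the project.
class PatrolRoute
{
    private readonly List<Vector3> waypoints;
    private int currentIndex;
    private bool investigating;

    public PatrolRoute(IEnumerable<Vector3> points)
        => waypoints = new List<Vector3>(points);

    public Vector3 CurrentTarget => waypoints[currentIndex];

    // Make the last known position the immediate next target.
    public void Investigate(Vector3 lastKnownPosition)
    {
        if (investigating)
        {
            waypoints[currentIndex] = lastKnownPosition; // refresh in place
        }
        else
        {
            waypoints.Insert(currentIndex, lastKnownPosition);
            investigating = true;
        }
    }

    // Called by the navigation code when the agent reaches CurrentTarget.
    public void OnTargetReached()
    {
        if (investigating)
        {
            waypoints.RemoveAt(currentIndex); // drop the temporary point
            investigating = false;
            if (currentIndex >= waypoints.Count) currentIndex = 0;
        }
        else
        {
            currentIndex = (currentIndex + 1) % waypoints.Count;
        }
    }
}
```

Keeping a flag for the temporary point, and removing it in exactly one place, is what avoids the premature-return and stale-waypoint problems described above.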
Key Takeaways: This project solidified my understanding of different AI architectures and the importance of thorough research before implementation. Overcoming the last known position bug was a crucial lesson in problem decomposition, seeking help, and the iterative nature of debugging complex systems. Ultimately, the main takeaway was a deeper appreciation for what makes AI believable and how to approach implementing such behaviours.
For a more in-depth look at the challenges presented by this project, check out the Reflective Journal on my development blog.
Course Grade: Distinction