Unity Artificial Intelligence Programming - Fifth Edition

By Dr. Davide Aversa

Table of Contents

Preface
Part 1: Basic AI
  Chapter 1: Introduction to AI
  Chapter 2: Finite State Machines
  Chapter 3: Randomness and Probability
  Chapter 4: Implementing Sensors
Part 2: Movement and Navigation
  Chapter 5: Flocking
  Chapter 6: Path Following and Steering Behaviors
  Chapter 7: A* Pathfinding
  Chapter 8: Navigation Mesh
Part 3: Advanced AI
  Chapter 9: Behavior Trees
  Chapter 10: Procedural Content Generation
  Chapter 11: Machine Learning in Unity
  Chapter 12: Putting It All Together
Other Books You May Enjoy

Chapter 4: Implementing Sensors

As we discussed in the previous chapter, a character AI system needs to be aware of its surrounding environment. For example, Non-Player Characters (NPCs) need to know where the obstacles are, which direction the player is looking, whether the player can see them, and much more. The quality of our NPCs' AI depends, for the most part, on the information they can get from the environment. Sensor mistakes are immediately apparent to the player: we have all laughed at an NPC that clearly should have seen us, or been frustrated when an NPC spotted us from behind a wall.

Video game characters usually get the input for their underlying AI decision-making algorithms from sensory information. For simplicity, in this chapter, we will consider sensory information to be any kind of data coming from the game world. If there's not enough information, characters might show...

Technical requirements

For this chapter, you just need Unity3D 2022. You can find the example project described in this chapter in the Chapter 4 folder in the book repository: https://github.com/PacktPublishing/Unity-Artificial-Intelligence-Programming-Fifth-Edition/tree/main/Chapter04.

Basic sensory systems

An AI sensory system emulates senses such as sight, hearing, and even smell to gather information from other GameObjects. In such a system, the NPCs periodically examine the environment, checking each of their senses for the stimuli they are interested in.

In a minimal sensory system, we have two principal elements: aspects (also called event emitters) and senses (also called event sensors). Each sense can perceive only a specific aspect; for instance, an NPC with just the sense of hearing can only perceive the sounds (one kind of aspect) emitted by other GameObjects, while a zombie NPC can use its sense of smell to track down the player's brain. As in real life, an NPC is not limited to a single sense; it can have sight, smell, and touch all at once.

In our demo, we'll implement a base interface, called Sense, that we'll use to build custom senses. In this chapter, we'll implement sight and touch senses. Sight is what we use to see the...
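The excerpt is cut off at this point, but the polling pattern it describes can be sketched. The following is a minimal illustration only, written as a base class rather than an interface (since MonoBehaviour hooks make that the common Unity pattern); the member names (detectionRate, elapsedTime, UpdateSense) are assumptions, not necessarily the book's exact code:

using UnityEngine;

// Base class that every concrete sense (Sight, Touch, and so on)
// derives from.
public class Sense : MonoBehaviour {
    [SerializeField]
    protected float detectionRate = 1.0f; // seconds between checks
    protected float elapsedTime = 0.0f;

    // Hooks that concrete senses override.
    protected virtual void Initialize() { }
    protected virtual void UpdateSense() { }

    void Start() {
        Initialize();
    }

    void Update() {
        // Poll the environment periodically instead of every frame.
        elapsedTime += Time.deltaTime;
        if (elapsedTime >= detectionRate) {
            elapsedTime = 0.0f;
            UpdateSense();
        }
    }
}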

Setting up our scene

Let's get started by setting up our scene:

  1. First, we add a plane as a floor.
  2. Let's create a few walls to block the line of sight from our AI character to the enemy. We make these out of short but wide cubes that we group under an empty GameObject called Obstacles.
  3. Finally, we add a directional light to see what is going on in our scene.

We represent the player with a tank, similar to what we used earlier, and we represent the NPCs with simple cubes. We also have a Target object to show us where the tank is moving in our scene. Our Scene hierarchy should look similar to the following screenshot:

Figure 4.1 – The setup of the example's Hierarchy

Now, let's position the tank, AI character, and walls randomly in our scene. First, make sure to increase the size of the plane to something that looks good. Fortunately, in this demo, all the objects are locked on the plane, and there...
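If you'd rather script this layout than place the objects by hand, the steps above can be reproduced in code. The following is purely a convenience sketch (the book builds the scene in the editor); every name in it, such as SceneBuilder, is illustrative:

using UnityEngine;

public class SceneBuilder : MonoBehaviour {
    void Start() {
        // 1. A plane as the floor, scaled up so there is room to roam.
        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Plane);
        floor.transform.localScale = new Vector3(10f, 1f, 10f);

        // 2. Short but wide cubes as walls, grouped under "Obstacles".
        GameObject obstacles = new GameObject("Obstacles");
        for (int i = 0; i < 4; i++) {
            GameObject wall = GameObject.CreatePrimitive(PrimitiveType.Cube);
            wall.transform.parent = obstacles.transform;
            wall.transform.localScale = new Vector3(10f, 2f, 1f);
            wall.transform.position = new Vector3(
                Random.Range(-20f, 20f), 1f, Random.Range(-20f, 20f));
        }

        // 3. A directional light so we can see the scene.
        GameObject lightGo = new GameObject("Directional Light");
        Light light = lightGo.AddComponent<Light>();
        light.type = LightType.Directional;
        lightGo.transform.rotation = Quaternion.Euler(50f, -30f, 0f);
    }
}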

The player's tank and the aspect class

The Target object is a simple sphere with its Mesh Renderer disabled. We have also created a point light and made it a child of the Target object. Make sure that the light is centered, or it will not be very helpful.

Look at the following code in the Target.cs file:

using UnityEngine;

public class Target : MonoBehaviour {
    [SerializeField]
    private float hOffset = 0.2f;

    void Update() {
        int button = 0;
        // Get the point of the hit position when the mouse
        // is being clicked
        if (Input.GetMouseButtonDown(button)) {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            // The excerpt truncates here; the rest is a plausible
            // reconstruction using the standard raycast-to-point pattern.
            if (Physics.Raycast(ray, out RaycastHit hit)) {
                // Move the marker to the clicked point, lifted slightly
                // above the floor by hOffset.
                transform.position = hit.point + new Vector3(0, hOffset, 0);
            }
        }
    }
}
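This section's title also promises an aspect class, which the excerpt never reaches. A minimal version consistent with the description in the previous section might look like this (the enum values are assumptions):

using UnityEngine;

// Attach to any GameObject that a sense should be able to perceive.
// The player's tank carries the Player aspect.
public class Aspect : MonoBehaviour {
    public enum AspectTypes { Player, Enemy }
    public AspectTypes aspectType = AspectTypes.Player;
}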

AI characters

In this example, the AI characters roam around the scene in random directions. They have two senses: sight and touch. The sight sense checks whether the enemy aspect is within the NPC's field of view and viewing distance. The touch sense detects whether the enemy aspect has collided with the Box Collider around the character. As we saw previously, our player's tank has the Player aspect, so these senses are triggered when they detect the player's tank.
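The Sight implementation itself is not part of this excerpt, but based on the description above (a view-cone angle check plus a distance check), a minimal sketch might look like the following. It builds on the Sense base class sketched earlier; the field names, the target reference, and the raycast line-of-sight check are assumptions:

using UnityEngine;

public class Sight : Sense {
    [SerializeField] private float fieldOfView = 45.0f;  // half-angle, degrees
    [SerializeField] private float viewDistance = 100.0f;
    [SerializeField] private Transform target;           // the player's tank

    protected override void UpdateSense() {
        Vector3 toTarget = target.position - transform.position;
        // Is the target inside the view cone and close enough?
        if (Vector3.Angle(toTarget, transform.forward) < fieldOfView &&
            toTarget.magnitude < viewDistance) {
            // Confirm the line of sight isn't blocked by a wall.
            if (Physics.Raycast(transform.position, toTarget.normalized,
                                out RaycastHit hit, viewDistance) &&
                hit.transform == target) {
                Debug.Log("Enemy detected");
            }
        }
    }
}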

For now, let's look at the script we use to move the NPCs around:

using UnityEngine;

public class Wander : MonoBehaviour {
    private Vector3 tarPos;
    [SerializeField] private float movementSpeed = 5.0f;
    [SerializeField] private float rotSpeed = 2.0f;
    [SerializeField] private float minX = -45.0f;
    [SerializeField] private float maxX = 45.0f;
    [SerializeField] private float minZ = -45.0f;
    [SerializeField] private float maxZ = 45.0f;
    // The excerpt truncates after minX; what follows is a plausible
    // reconstruction: pick random points in the bounds and steer to them.
    void Start() { GetNextPosition(); }

    void Update() {
        // Pick a new destination once we are close to the current one.
        if (Vector3.Distance(tarPos, transform.position) <= 5.0f) GetNextPosition();
        // Turn smoothly toward the destination, then move forward.
        Quaternion tarRot = Quaternion.LookRotation(tarPos - transform.position);
        transform.rotation = Quaternion.Slerp(transform.rotation, tarRot,
                                              rotSpeed * Time.deltaTime);
        transform.Translate(0, 0, movementSpeed * Time.deltaTime);
    }

    void GetNextPosition() {
        tarPos = new Vector3(Random.Range(minX, maxX), transform.position.y,
                             Random.Range(minZ, maxZ));
    }
}

Testing the game

Now, play the game in Unity3D and move the player's tank near the wandering AI character by clicking on the ground. You should see the Enemy touch detected message in the console log window whenever our AI character gets close to our player's tank.
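The touch sense that prints this message isn't shown in the excerpt. A plausible minimal version, building on the Sense and Aspect sketches above, could look like this; the class name TouchSense is chosen here to avoid clashing with UnityEngine.Touch, and it assumes the NPC carries a trigger Box Collider and that either it or the tank has a Rigidbody (Unity only fires trigger events when one of the two objects has one):

using UnityEngine;

public class TouchSense : Sense {
    // Called by Unity when another collider enters our trigger volume.
    void OnTriggerEnter(Collider other) {
        // Only react to objects that carry an Aspect component,
        // such as the player's tank.
        Aspect aspect = other.GetComponent<Aspect>();
        if (aspect != null) {
            Debug.Log("Enemy touch detected");
        }
    }
}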

Figure 4.6 – Our player and tank in action

The previous screenshot shows an AI agent with touch and sight senses looking for an enemy aspect. Move the player's tank in front of the AI character, and you'll get the Enemy detected message. If you switch to the editor's Scene view while running the game, you should see the debug drawings rendered by the OnDrawGizmos method implemented in the Sight sense class.
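The excerpt mentions OnDrawGizmos but doesn't show it. For the Sight sketch above, a minimal debug-drawing method (an illustration, not the book's exact code) could be added inside the class like this:

// Inside the Sight class: draw the two edges of the view cone
// in the Scene view.
void OnDrawGizmos() {
    Gizmos.color = Color.green;
    Vector3 leftEdge = Quaternion.Euler(0, -fieldOfView, 0) * transform.forward;
    Vector3 rightEdge = Quaternion.Euler(0, fieldOfView, 0) * transform.forward;
    Gizmos.DrawRay(transform.position, leftEdge * viewDistance);
    Gizmos.DrawRay(transform.position, rightEdge * viewDistance);
}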

Summary

This chapter introduced the concept of using sensors in game AI, and we implemented two senses, Sight and Touch, for our AI character. A sensory system is just the first element of a complete AI system's decision making. For example, we can use the sensory system to control the execution of a behavior system or change the state of a Finite State Machine once we have detected an enemy within the AI's line of sight.

We will cover how to apply behavior tree systems in Chapter 9, Behavior Trees. In the meantime, in the next chapter, we'll look at how to implement flocking behaviors in Unity3D, as well as how to implement Craig Reynolds' flocking algorithm.
