Chapter 5. Agent Awareness

In this chapter, we will learn some algorithm recipes for simulating senses and agent awareness:

  • The seeing function using a collider-based system

  • The hearing function using a collider-based system

  • The smelling function using a collider-based system

  • The seeing function using a graph-based system

  • The hearing function using a graph-based system

  • The smelling function using a graph-based system

  • Creating awareness in a stealth game

Introduction


In this chapter, we will learn different approaches to simulating sense stimuli on an agent. We will learn how to use tools we are already familiar with to create these simulations: colliders and graphs.

In the first approach, we will take advantage of ray casting, colliders, and the MonoBehaviour functions bound to colliders, such as OnCollisionEnter, in order to detect objects nearby in the three-dimensional world. Then, we will learn how to simulate the same stimuli using graph theory and its functions, so that we can take advantage of this way of representing the world.

Finally, we'll learn how to implement agent awareness using a mixed approach that considers the previously learned sensory-level algorithms.

The seeing function using a collider-based system


This is probably the easiest way to simulate vision. We take a collider, be it a mesh or a Unity primitive, and use it as the tool for determining whether or not an object is inside the agent's vision range.

Getting ready

It's important to have a collider component attached to the same game object as the script in this recipe, and the same applies to the other collider-based algorithms in this chapter. In this case, it's recommended that the collider be a pyramid-shaped one in order to simulate a vision cone. The fewer polygons it has, the faster it will run in the game.

How to do it…

We will create a component that is able to see enemies nearby:

  1. Create the Visor component, declaring its member variables. It is important to add the corresponding tags used below (Wall and Enemy) to Unity's tag configuration:

    using UnityEngine;
    using System.Collections;
    
    public class Visor : MonoBehaviour
    {
        public string tagWall = "Wall";
        public string tagTarget = "Enemy";
        public GameObject...
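
The listing above is cut short here. As a rough illustration only, and not the book's full listing, a collider-based visor along these lines might combine a trigger collider (the vision cone) with a ray cast that confirms nothing blocks the line of sight; the agent field and the OnTriggerStay approach below are assumptions:

    using UnityEngine;
    
    // Hedged sketch of a collider-based visor, not the book's listing.
    // The trigger collider attached to this game object acts as the
    // vision cone; a ray cast confirms the line of sight is not blocked.
    public class VisorSketch : MonoBehaviour
    {
        public string tagWall = "Wall";
        public string tagTarget = "Enemy";
        public GameObject agent; // assumed field: the seeing agent
    
        void Start()
        {
            if (agent == null)
                agent = gameObject;
        }
    
        // Called while another collider stays inside the vision cone.
        void OnTriggerStay(Collider other)
        {
            if (!other.CompareTag(tagTarget))
                return;
            Vector3 origin = agent.transform.position;
            Vector3 direction = other.transform.position - origin;
            RaycastHit hit;
            if (Physics.Raycast(origin, direction, out hit, direction.magnitude))
            {
                // The target counts as seen only if the first thing the
                // ray hits is the target itself (no wall in between).
                if (hit.collider.gameObject.CompareTag(tagTarget))
                    Debug.Log("Target in sight: " + other.name);
            }
        }
    }

For OnTriggerStay to fire, the cone collider must be marked as Is Trigger, and at least one of the two objects involved needs a Rigidbody.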

The hearing function using a collider-based system


In this recipe, we will emulate the sense of hearing by developing two entities: a sound emitter and a sound receiver. It is based on the principles proposed by Millington for simulating a hearing system, and it uses the power of Unity colliders to detect receivers near an emitter.

Getting ready

As with the other recipes based on colliders, we will need collider components attached to every object that is to be checked, and rigid body components attached to either emitters or receivers.

How to do it…

We will create the SoundReceiver class for our agents, and SoundEmitter for things such as alarms:

  1. Create the class for the sound-receiver object:

    using UnityEngine;
    using System.Collections;
    
    public class SoundReceiver : MonoBehaviour
    {
        public float soundThreshold;
    }
  2. Define the virtual function that handles the reception of sound, which we will later override with our own behavior:

    public virtual void Receive(float intensity, Vector3 position)
    {
        // TODO
        // code your own...
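
The receiver's listing is truncated above. As a hedged sketch of the emitter side, assuming a linear falloff of intensity with distance (the attenuation model and member names below are assumptions, not the book's listing), an emitter could track receivers through its trigger collider and notify those that hear the sound above their threshold:

    using System.Collections.Generic;
    using UnityEngine;
    
    // Hedged sketch of a sound emitter, not the book's listing. It keeps
    // track of receivers inside its trigger collider and notifies them
    // with an intensity attenuated linearly by distance.
    public class SoundEmitterSketch : MonoBehaviour
    {
        public float soundIntensity = 10f;
        public float attenuation = 1f; // intensity lost per unit of distance
        private List<SoundReceiver> receivers = new List<SoundReceiver>();
    
        void OnTriggerEnter(Collider other)
        {
            SoundReceiver receiver = other.GetComponent<SoundReceiver>();
            if (receiver != null && !receivers.Contains(receiver))
                receivers.Add(receiver);
        }
    
        void OnTriggerExit(Collider other)
        {
            SoundReceiver receiver = other.GetComponent<SoundReceiver>();
            if (receiver != null)
                receivers.Remove(receiver);
        }
    
        // Call this when the emitter produces a sound (an alarm, a step).
        public void Emit()
        {
            foreach (SoundReceiver receiver in receivers)
            {
                float distance = Vector3.Distance(transform.position,
                    receiver.transform.position);
                float intensity = soundIntensity - attenuation * distance;
                if (intensity >= receiver.soundThreshold)
                    receiver.Receive(intensity, transform.position);
            }
        }
    }

As noted in the Getting ready section, the trigger detection relies on the objects having colliders and a rigid body on either the emitter or the receivers.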

The smelling function using a collider-based system


Smelling is one of the trickiest senses to translate from the real to the virtual world. There are several techniques, but most of them rely on colliders or graph logic.

Smelling can be simulated by computing a collision between an agent and odor particles scattered throughout the game level.

Getting ready

As with the other recipes based on colliders, we will need collider components attached to every object that is to be checked, and rigid body components attached to either emitters or receivers.

How to do it…

We will develop the scripts for representing odor particles and agents that are able to smell:

  1. Create the particle's script and define its member variables for computing its lifespan:

    using UnityEngine;
    using System.Collections;
    
    public class OdourParticle : MonoBehaviour
    {
        public float timespan;
        private float timer;
    }
  2. Implement the Start function for proper validations:

    void Start()
    {
        if (timespan < 0f)...
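
The validation step is cut off above. Here is a minimal sketch of how the particle's lifespan could be handled, assuming the particle simply destroys itself when its timer runs out (the Update logic below is an assumption, not the book's listing):

    using UnityEngine;
    
    // Hedged sketch of an odour particle's lifespan handling, not the
    // book's listing: the particle counts down and destroys itself.
    public class OdourParticleSketch : MonoBehaviour
    {
        public float timespan = 5f; // seconds the particle stays alive
        public int parent;          // id of the emitter that spawned it
        private float timer;
    
        void Start()
        {
            // Guard against invalid values set in the Inspector.
            if (timespan < 0f)
                timespan = 0f;
            timer = timespan;
        }
    
        void Update()
        {
            timer -= Time.deltaTime;
            if (timer <= 0f)
                Destroy(gameObject);
        }
    }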

The seeing function using a graph-based system


We now move on to the recipes that use graph-based logic to simulate the senses. Again, we start by developing the sense of vision.

Getting ready

It is important to have grasped the chapter regarding path finding in order to understand the inner workings of the graph-based recipes.

How to do it…

We only need to implement a single new file:

  1. Create the class for handling vision:

    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class VisorGraph : MonoBehaviour
    {
        public int visionReach;
        public GameObject visorObj;
        public Graph visionGraph;
    }
  2. Validate the visor object, defaulting to the game object that holds the component if none is assigned:

    void Start()
    {
        if (visorObj == null)
            visorObj = gameObject;
    }
  3. Define and start building the function for detecting the visibility of a given set of nodes:

    public bool IsVisible(int[] visibilityNodes)
    {
        int vision = visionReach;
        int src = visionGraph.GetNearestVertex(visorObj);
        HashSet<int> visibleNodes...
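
The IsVisible function is truncated above. One possible shape for the rest of it, shown purely as a hedged sketch, is a breadth-first expansion limited by visionReach; the GetNeighbours(int) call is assumed to exist on the Graph class from the path-finding chapter and should be adapted to your own graph API:

    // Hedged continuation sketch, not the book's listing: expand outward
    // from the source vertex up to visionReach hops and report whether
    // any of the requested nodes is reached.
    public bool IsVisible(int[] visibilityNodes)
    {
        int vision = visionReach;
        int src = visionGraph.GetNearestVertex(visorObj);
        HashSet<int> visibleNodes = new HashSet<int>();
        Queue<int> frontier = new Queue<int>();
        frontier.Enqueue(src);
        visibleNodes.Add(src);
        while (vision > 0 && frontier.Count > 0)
        {
            int count = frontier.Count;
            for (int i = 0; i < count; i++)
            {
                int v = frontier.Dequeue();
                // GetNeighbours(int) is an assumed helper returning the
                // ids of the vertices connected to v.
                foreach (int n in visionGraph.GetNeighbours(v))
                {
                    if (visibleNodes.Contains(n))
                        continue;
                    visibleNodes.Add(n);
                    frontier.Enqueue(n);
                }
            }
            vision--;
        }
        // The agent sees the target if the two sets of nodes overlap.
        foreach (int node in visibilityNodes)
        {
            if (visibleNodes.Contains(node))
                return true;
        }
        return false;
    }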

The hearing function using a graph-based system


Hearing works similarly to vision, but it doesn't take the nodes' direct visibility into account because of the properties of sound. However, we still need a sound receiver in order to make it work. Instead of making an agent a direct sound receiver, in this recipe the sound travels along the sound graph and is perceived by the graph nodes.

Getting ready

It is important to have grasped the chapter regarding path finding in order to understand the inner workings of the graph-based recipes.

How to do it…

  1. Create the emitter class:

    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class EmitterGraph : MonoBehaviour
    {
        // next steps
    }
  2. Declare the member variables:

    public int soundIntensity;
    public Graph soundGraph;
    public GameObject emitterObj;
  3. Implement the validation of the emitter object's reference:

    public void Start()
    {
        if (emitterObj == null)
            emitterObj = gameObject;
    }
  4. Declare the function for emitting...
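
The step above is cut off before its listing. As a hedged sketch, assuming the sound loses one unit of intensity per edge and that the Graph class exposes a GetNeighbours(int) helper (an assumption, as in the vision recipe), the emission could be a breadth-first spread that returns the vertices reached:

    // Hedged sketch of an emission function, not the book's listing:
    // spread the sound outward from the emitter's nearest vertex, losing
    // one unit of intensity per edge traversed.
    public int[] Emit()
    {
        List<int> reached = new List<int>();
        HashSet<int> visited = new HashSet<int>();
        Queue<int> frontier = new Queue<int>();
        int src = soundGraph.GetNearestVertex(emitterObj);
        int intensity = soundIntensity;
        frontier.Enqueue(src);
        visited.Add(src);
        while (intensity > 0 && frontier.Count > 0)
        {
            int count = frontier.Count;
            for (int i = 0; i < count; i++)
            {
                int v = frontier.Dequeue();
                reached.Add(v);
                foreach (int n in soundGraph.GetNeighbours(v))
                {
                    if (visited.Contains(n))
                        continue;
                    visited.Add(n);
                    frontier.Enqueue(n);
                }
            }
            intensity--;
        }
        // The returned vertex ids are the nodes that perceived the sound.
        return reached.ToArray();
    }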

The smelling function using a graph-based system


In this recipe, we take a mixed approach, tagging vertices with the odor particles that collide with them.

Getting ready

The vertices should have a broad collider attached so that they catch the odor particles nearby.

How to do it…

  1. Add the following member variable to the odor-particle script to store its parent ID:

    public int parent;
  2. Create the new odor-enabled class, deriving from the original vertex:

    using UnityEngine;
    using System.Collections;
    using System.Collections.Generic;
    
    public class VertexOdour : Vertex
    {
        private Dictionary<int, OdourParticle> odourDic;
    }
  3. Initialize the odor dictionary in the proper function:

    public void Start()
    {
        odourDic = new Dictionary<int, OdourParticle>();
    }
  4. Add the odor to the vertex's dictionary:

    public void OnCollisionEnter(Collision coll)
    {
        OdourParticle op;
        op = coll.gameObject.GetComponent<OdourParticle>();
        if (op == null)
            return;
        int id = op.parent;
        odourDic...
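
The dictionary update is truncated above. A couple of companion methods, offered only as a hedged sketch with assumed names, could remove the odour when its particle leaves and let agents ask whether the vertex currently holds any odour:

    // Hedged companion sketch (method names are assumptions):
    public void OnCollisionExit(Collision coll)
    {
        OdourParticle op = coll.gameObject.GetComponent<OdourParticle>();
        if (op == null)
            return;
        // Forget the odour registered by this particle's emitter.
        odourDic.Remove(op.parent);
    }
    
    public bool HasOdour()
    {
        return odourDic.Count > 0;
    }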

Creating awareness in a stealth game


Now that we know how to implement sensory-level algorithms, it's time to see how they could be taken into account in order to develop higher-level techniques for creating agent awareness.

This recipe is based on the work of Brook Miles and his team at Klei Entertainment on the game Mark of the Ninja. The mechanism revolves around the notion of having interest sources that can be seen or heard by the agents, and a sensory manager that handles them.

Getting ready

As a lot of things revolve around the idea of interests, we'll need two data structures for defining an interest's sense and priority, and a data type for the interest itself.

This is the data structure for sense:

public enum InterestSense
{
    SOUND,
    SIGHT
};

This is the data structure for priority:

public enum InterestPriority
{
    LOWEST = 0,
    BROKEN = 1,
    MISSING = 2,
    SUSPECT = 4,
    SMOKE = 4,
    BOX = 5,
    DISTRACTIONFLARE = 10,
    TERROR = 20
};

The following is the interest data type...
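
The listing itself is not shown here. As a hedged sketch only, an interest type built on the two enumerations above might carry the sense, the priority, and the position of the stimulus; the field names are assumptions, not the book's:

// Hedged sketch of an interest record, not the book's listing.
// (Vector3 comes from UnityEngine.)
public struct Interest
{
    public InterestSense sense;        // how the interest is perceived
    public InterestPriority priority;  // how urgent it is for the agent
    public Vector3 position;           // where the stimulus originated
}

In the scheme described above, the sensory manager would hand new interests to the agents whose senses can perceive them, and each agent would keep only the highest-priority interest it is currently aware of.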
