

Kynapse SDK
Release 3.8, published Friday, June 24, 2005
Copyright © 2000-2005 Kynogon S.A. All rights reserved.

Contents

Kynapse User Guide 5
  Introduction 6
    Getting started 7
      What is Kynapse? 8
      Purpose of Kynapse 9
      Design philosophy 10
      Kynapse specs 11
    Using this documentation 13
      Typographic conventions 14
      Finding information 15
  Programming Guide 16
    Kynapse 3.7 to Kynapse 3.8 migration guide 17
    Getting started with Kynapse 18
      Fundamental principles 19
        Behaviors 20
        How behaviors are generated 21
      Kynapse architecture 22
        Architectural overview 23
        Kynapse layout 25
        Simulation engine integration 26
      Main objects 27
        AI engine 28
        AI world 29
        Entities and entity attributes 30
        Actions and action attributes 32
        Brains 33
        Agents 35
        Services 36
        Teams 38
        Pathfinding constraints 39
        Pathfinding heuristics 40
        Pathobjects 41
      Main data structures 43
        Resources data structures 44
          PathData 45
          Pathways 47
        Configuration data structures 48
          AI engine configuration parameters (CParamBlock) 49
          AI world definition 50
          Entity definition 51
      Core management 52
        Time management 53


        Exact time measurement 55
        Deterministic time measurement 56
        Time management sample code 57
        Memory management 59
        Object management 62
        Data management 63
      Initializing, updating and terminating Kynapse 65
        Overview 66
        Initializing Kynapse 67
        Updating Kynapse 69
        Terminating Kynapse 70
      Kynapse coordinate system 71
    Using Kynapse standard components 72
    Creating and customizing Kynapse components 73
    Configuring Kynapse 74
    Generating pathdata 75
    Common topics 76
    Integration helpers 77
    Tools 78
  Reference 79
  Glossary


Kynapse user guide

Welcome to the Kynapse 3.8 user guide (online LIMITED version)!

New to Kynapse?
If you're new to Kynapse you should start reading:
• The introduction (p.6). (Available)
• What is Kynapse? (p.8). (Available)
• Getting started with Kynapse (p.18). (Available)

Already familiar with previous versions of Kynapse?
Learn about migrating from Kynapse 3.7 to Kynapse 3.8 (p.17). (Not available)

User guide content
The user guide contains the following sections:
• Introduction (p.6): introduction to Kynapse and the Kynapse user guide. (Available)
• Programming guide (p.16). (Available)
• Getting started with Kynapse (p.18): introduction to the Kynapse design philosophy. (Available)
• Using Kynapse standard components (p.17): basic use of Kynapse. (Not available)
• Creating and customizing Kynapse components (p.17): advanced use of Kynapse. (Not available)
• Configuring Kynapse (p.17): to quickly set up Kynapse. (Not available)
• Generating path data (p.17): learn everything about the Kynapse data generation process. (Not available)
• Common topics (p.17): learn how Kynapse can be used for implementing typical game behaviours. (Not available)
• Integration helpers (p.17): to quickly and easily integrate Kynapse. (Not available)
• Tools (p.17): discover the Kynapse tools. (Not available)
• Reference (p.17): standard C++ reference documentation. (Not available)


Introduction

Welcome to the Kynapse User Guide. Kynapse is a powerful AI library. This user guide is aimed at helping newcomers to Kynapse become familiar with the product. Kynapse is the result of many years of development, which began in 2000. The power and flexibility of the product have been increased with every release. Kynapse is a multi-platform Application Programming Interface (API), which is constantly being improved and updated to keep it at the cutting edge of AI. If you are new to Kynapse, it is strongly suggested that you read this user guide as you go along to acquaint yourself with its operation. If you have already been using Kynapse for a while, you can also find a lot of help here. The user guide is written by the people who created the library, so you can gain new insights, tricks and tips. Additionally, it has been revised and expanded to cover all the latest new technology.

Read these sections if you are new to Kynapse.
• Getting started (p.7)
• Using this documentation (p.13)


Getting started

You should read this section if you are new to Kynapse.
• What is Kynapse? (p.8)
• Purpose of Kynapse (p.9)
• Design philosophy (p.10)
• Kynapse specs (p.11)


What is Kynapse?

An AI SDK
Kynapse is an AI (Artificial Intelligence) SDK. It is used by programmers to create real-time AI applications, such as computer games and simulations.

Multi-platform
Kynapse is a multi-platform, portable API that allows high-level functionality to be achieved on all platforms, with platform-specific optimizations to get the best from the hardware pipelines. Kynapse is available for Sony PlayStation 2, Microsoft Xbox, Nintendo GameCube and Microsoft Windows.

Customizable
Kynapse has a component-based architecture built around a small-footprint, thin-layer, highly-optimized core library, supplemented by a number of customisable components. This component mechanism is fully exposed. You can write components or extend existing ones for your own requirements - in fact you are encouraged to do so, as we do not claim to have thought of everything!


Purpose of Kynapse

From a general perspective, a game can be structured as follows:

Typically, the game is built around a game engine that is the core of the game application. This game engine manages all aspects of a game (interface, rendering, network, and so on) and is fed with data (such as images, 3D models, sounds, and so on). Here is a brief description of these key elements:

3D graphics engine: What the user can see on his screen.
Physics engine: How 3D models in the simulation react to gravity, collisions, friction, and so on.
Sound engine: What the user can hear from the simulation.
Interfaces: How the player can interact with the game.
Network: How the player can share their simulation experience with other players.
AI behavior: All objects that are expected to have a living behavior and that are not controlled by the player can be managed via Kynapse. This includes advanced perception handling, decision taking, complex pathfinding, high-level behaviors (such as flee, hide, attack), and so on.

There is an exhaustive list of Kynapse specifications (p.11).


Design philosophy

The design brief for Kynapse was to create an AI library that would perfectly match game designers' AI needs and bring an easy-to-use but powerful collection of tools and functionalities to developers. We have tried hard to make sure the library is the most powerful multi-platform AI library available.

Performance
CPU and memory performance was one of the main concerns when designing Kynapse. The core algorithms have been designed with this in mind, and many tuning opportunities are exposed to the developer for customizing memory and CPU management.

Easy integration process
The number and complexity of connection points between client game engines and Kynapse have been minimized. This way, the integration process and SDK evaluation can be very straightforward.

Platform-independent development
Kynapse has been designed to let you get the most out of all the supported platforms with no compromises. APIs are provided which expose low-level features and optimisation opportunities to the developer, so the best performance can be obtained for your projects.


Kynapse functional specifications

Here is a brief list of functionality that is supported by the current release of Kynapse.

Decision (thinking logic)
• C++ brains (p.17) support.
• Lua script brains (p.17) support.

Pathfinding
• Automatic pathdata generation (p.17).
• Complex 3D topology support (slopes, holes, irregular surfaces, and so on).
• Dynamic avoidance between entities.
• Support for dynamic pathobjects (p.41) (doors, elevators, ladders, and so on).
• Constrained pathfinding (p.39) support.
  • Shortest path (p.17) pathfinding constraint.
  • Stealthiest path (p.17) pathfinding constraint.
  • Point to flee path (p.17) pathfinding constraint.
  • Custom pathfinding constraints (p.17) support.
• Simple (p.17) path following mode.
• Complex (p.17) path following mode.
• Customizable pathfinding heuristics (p.40).
  • Euclidian (p.17) pathfinding heuristic.
  • (p.17) pathfinding heuristic.
  • Path cost (p.17) pathfinding heuristic.
  • Custom (p.17) pathfinding heuristic support.

Agents (high-level actions)
• Goto agent (p.17).
• Follower agent (p.17).
• Flee agent (p.17).
  • From danger points.
  • From dynamic entities.
• Wander agent (p.17) (wander freely in the level).
• Pathway agent (p.17) (follow an authored path).
• Hide agent (p.17).
  • Relative or absolute.


  • Far away or close.
• Shooter agent (p.17).
• Imitator agent (player cloning).
• Test agent (p.17).
• Custom agent support.

Perceptions
• Visibility between entities (p.17).
• Sophisticated understanding of the surrounding topology via access ways (p.17) (points from which danger can come).
• Sound and smell perception manager.

Team support
• Dynamic team (p.38) composition.
• Communication between team-mates.

Performance
• Time-slicing (p.53) mechanisms.
• Customisable memory management (p.59).


Using this documentation

This section is designed to help you read this documentation.
• Typographic conventions (p.14)
• Finding information (p.15)


Typographic conventions

Kynapse user guide follows certain typographic conventions. These are used to clarify the meaning of the text:

Convention          Meaning
courier             Function/class/variable name
italic              Kynapse concept that is defined in the glossary (p.)
hyperlink (p.14)    To find out more about a topic.


Finding information

The Navigation pane contains tabs enabling you to find information fast.

Content
Use the Content section to find the information you require. The structure of the API Reference is explained in the API reference section.

Index
The index contains a list of all functions in the API Reference.

Search tab
To search for specific terms, enter the word and press Enter, or use wildcards. Searches are not case sensitive and punctuation marks are ignored.

Wildcards
• * matches any number of missing characters; for example, b*n finds brain and banana.
• ? matches one missing character; for example, b?t finds bot.
• AND finds topics containing both words; for example, brain AND agent.
• OR finds topics containing either word; for example, team OR entity.
• NOT finds topics containing the first word, but not the second word; for example, action NOT attribute.
• NEAR finds topics containing the first word within eight words of the second word; for example, imitator NEAR agent.


Programming guide

The programming guide is a guide for developers who are implementing AI functionality into their applications using Kynapse. The guide contains architecture and component descriptions and customization tips. The guide is split into the following sections:

• 3.7 to 3.8 migration guide (p.17)
  This section covers the major changes 3.7 users should reflect in their code, data and processes in order to run properly under Kynapse 3.8.
• Getting started with Kynapse (p.18) (Available)
  This section contains both an overview of the architecture and the fundamental principles of Kynapse in order to help you get a clear understanding of how Kynapse works.
• Using Kynapse standard components (p.17)
  This section explains how you can use Kynapse standard components with minimum effort.
• Creating and customizing Kynapse components (p.17)
  This section explains how you can customize and develop new Kynapse components.
• Kynapse configuration (p.17)
  This section explains how you can configure Kynapse.
• Generating PathData (p.17)
  This section explains how you can generate PathData.
• Common topics (p.17)
  This section describes how Kynapse can be used for implementing typical game behaviours.
• Integration helpers (p.17)
  This section presents helper code that may help you integrate Kynapse into your simulation.
• Tools (p.17)
  This section describes how you can use Kynapse tools.


Topic is not available in online documentation

Please contact Kynogon to get the full Kynapse User Guide.


Getting started with Kynapse

This section is an introduction to Kynapse. You should be familiar with the concepts described in this section before starting to use Kynapse.
• Fundamental principles (p.19)
• Kynapse architecture (p.22)
• Main objects (p.27)
• Main data structures (p.43)
• Core management (p.52)
• Initializing, updating and terminating Kynapse (p.65)
• Kynapse coordinate system (p.71)


Fundamental principles

This section introduces the fundamental Kynapse principles you need to be familiar with.
• Behaviors (p.20)
• How behaviors are generated (p.21)


Behaviors

All objects that are expected to have a living behavior, and which are not controlled by the player, can be managed via Kynapse. Kynapse will first create a model of all these objects called the AI World (p.29). As an analogy: just as there are models of the world from a rendering and a physical point of view (polygons and collision meshes), there is also a model of the world from a behavioral point of view. This model contains information about:
• 3D world topology (known as PathData (p.45))
• Objects in the game engine world (doors, energy spots, and so on)
• NPCs in the game engine world.

Objects and NPCs in the game engine that are also represented in the AI World are called entities (p.30). Actions (p.32) are the way entities interact with the game engine:
• Movement actions (go forward, turn left, jump, and so on)
• Activation actions (fire, open a door, and so on).

An entity can perform several actions over time to achieve an objective. The cumulative effect is known as a behavior. For example, generating successive movement actions (go forward, turn left, turn right, and so on) to go to a specific destination gives an entity a behavior.


Behavior generation pipeline

An AI entity's behaviors are generated in three steps:

Perception step
The perception step gathers the data that an entity can perceive and use to make a decision. For example:
• entity position
• new visible enemy
• remaining ammunition.

The types of perception data can differ according to the game engine, and the data can be retrieved or updated in many different ways. But it is always necessary to update information about the environment in which the entity exists, and the current situation of the entity.

Decision step
The decision step selects which behavior to activate, based on:
• perception data
• decision data.

Decision data describes the logic and mechanism for making the decision. The type of data varies depending on the algorithm that makes the decision. For example, this data could be a set of rules described in a script, a set of probabilities, or values of connections for a neural net. At the end of this step, a behavior has been selected and needs to be translated into actions.

Action step
The action step translates:
1. a high-level behavior (for example, going to a specific destination) into low-level actions (go forward, turn left, and so on).
2. low-level actions into commands understood by the game engine.


Kynapse architecture

This section contains general information about the overall structure of Kynapse components.
• Architecture overview (p.23)
• Kynapse layout (p.25)
• Game engine integration (p.26)


Architecture overview

To generate behaviors for entities, Kynapse offers a solution structured around four layers. Kynapse provides a set of tools to make it easier for you to control each layer and generate the required data. Here are the four layers:

Layer: Decision
Description: Groups several algorithms (called brains (p.33)) for decision-making.

Layer: Agents
Description: Groups all high-level behaviors (called agents (p.35)) that can be selected at the decision layer (attack, flee, hide, and so on).
Tool: High-level actions can be generated automatically instead of being coded.

Layer: Services
Description: Offers a set of functionalities (called services (p.36)) shared by the agents and decision layers.
Tool: The modeling of the world from an AI perspective (PathData (p.45)) is achieved via automatic data production tools.

Layer: Architecture
Description: Manages:
• integration of Kynapse with the game engine
• the AI world
• performance, via time-slicing mechanisms (p.53).
Tool: Architecture parameters can be configured (CPU consumption, number of entities, and so on).

Each layer communicates with the others and is open for customization and/or extension. The different layers correspond to different steps of the behavior generation:



Kynapse layout

Kynapse is divided into two main parts:

KAIM (Kynogon Artificial Intelligence Modules)
The AI engine that powers Kynapse. It allows you to generate complex and realistic behaviors for your game. It requires the definition of a small set of integration functions to communicate with any environment.

Kynapse Skeleton (optional)
A default implementation of the integration functions required by KAIM.

Game engine integration

There are two levels of communication for Kynapse:
1. Kynapse communicates with your simulation modules via a set of low-level integration functions called the Skeleton (p.17). This link is optional; Kynapse can work without the skeleton, but you will have to write your own management routines for collisions, memory, IO, and so on.
2. Your game engine initializes and controls Kynapse via API functions using the C++ programming language.


Main objects

This section introduces all the fundamental Kynapse objects you will need to be familiar with.
• AI engine (p.28)
• AI world (p.29)
• Entities and entity attributes (p.30)
• Actions and action attributes (p.32)
• Brains (p.33)
• Agents (p.35)
• Services (p.36)
• Teams (p.38)
• Pathfinding constraints (p.39)
• Pathfinding heuristics (p.40)
• Path objects (p.41)


AI engine

Introducing the AI engine
This object exposes the Kynapse engine. The AI engine manages:
• Initializing, running and terminating Kynapse
• Collision callbacks
• Memory callbacks
• Time callbacks
• Debug message callbacks
• Debug 3D draw callbacks
• File management callbacks
• Metrics log callbacks.

Creating and destroying the AI engine
All Kynapse applications must open the Kynapse engine before entering their main loop. This is done through the RwAiOpen function. An AI engine Configuration Parameter Block (p.49) is passed to this function to specify the vital callbacks mentioned above (collision, memory, and so on). The optional Kynapse AI Skeleton (p.17) offers you a default implementation that may help you to set up the AI engine Configuration Parameter Block. The Kynapse engine should be the last AI component to be destroyed. This is done through the RwAiClose() function. A minimal sketch of this life cycle is shown after the "See also" list below.

Updating the AI engine
There is no update function to be called at the AI engine level. AI components are updated through the AI world (p.29) update function.

See also...
• The AI engine has to be opened with a Configuration Parameter Block structure (p.49).
• Learn more about how Kynapse should be initialized, updated and destroyed (p.65).
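The sketch below illustrates the life cycle described above. RwAiOpen, RwAiClose, KsSetDefaultParamBlock and CParamBlock are named in this guide, but the exact signatures, argument lists and the header name used here are assumptions made for this illustration, not the definitive API.

#include "kyparamblock.h"    // hypothetical header declaring CParamBlock

void OpenAi()
{
    CParamBlock paramBlock;

    // With the optional AI skeleton, the block can be filled with default
    // callbacks (collision, memory, time, debug output, and so on).
    KsSetDefaultParamBlock(&paramBlock);

    // Open the AI engine before entering the game main loop.
    RwAiOpen(&paramBlock);
}

void CloseAi()
{
    // The AI engine should be the last AI component to be destroyed.
    RwAiClose();
}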


The AI world

Introducing the AI world
The AI world is a container for all AI objects in your game. It links dynamic and static components and provides a cohesive world space in which they can exist. Dynamic objects such as entities (p.30) and teams (p.38) can be added or removed at any time from an AI world. The AI world also holds a dynamic set of references to management services (p.36) (for example, the graph manager (p.17) handles PathData (p.45), the sound manager handles sound perceptions, and so on).

Creating and destroying the AI world
After initialising Kynapse, you will have to create an AI world before any other AI object. An AI world is created from an AI world definition (p.50). When terminating Kynapse, you should destroy the current AI world object. AI objects inside the AI world (entities, teams, services) will be automatically destroyed.

Updating the AI world
The AI world should be updated each frame during the simulation. This is done through the Kaim::WorldUpdate(...) function. You may be interested in having a deeper look at how Kynapse should be initialised, updated and destroyed (p.69). A minimal update-loop sketch is shown after the "See also" list below.

See also...
• The AI world is created from an AI world definition (p.50).
• The AI world can contain entities (p.30).
• The AI world can contain teams (p.38).
• The AI world can reference services (p.36).
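A minimal update-loop sketch, assuming the signatures shown here: RwAiWorldCreate and Kaim::WorldUpdate are named in this guide, but their parameters, the definition type, the helper functions and the world-destruction call are all assumptions made for this illustration.

void GameMainLoop(CWorldDefinition* worldDefinition)   // hypothetical definition type
{
    // Create the AI world from an AI world definition (p.50).
    void* aiWorld = RwAiWorldCreate(worldDefinition);

    while (GameIsRunning())                            // hypothetical game-side predicate
    {
        const float elapsedSeconds = GameGetFrameTime();   // hypothetical helper

        // Update all AI components (entities, brains, services, and so on)
        // once per frame; there is no separate AI engine update call.
        Kaim::WorldUpdate(elapsedSeconds);
    }

    RwAiWorldDestroy(aiWorld);                         // hypothetical counterpart of RwAiWorldCreate
}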


Entities and entity attributes

Introducing entities
Objects in the game engine which need to be perceived by and/or controlled by Kynapse should exist in the AI world (p.29) as entities. Basically, there are two types of entities:
• Active entities: From a behavioral point of view, these are completely controlled by Kynapse. Kynapse gets information from the game engine for each active entity, and then sends an action to the game engine to generate their behaviors. This includes NPCs and other machines you want to act logically (enemies, allies, robotic arms, surveillance cameras, and so on). Active entities should be connected to a brain (p.33) (to think) and to an action (p.32) (to act).
• Passive entities: These represent entities in the game engine that are not controlled by Kynapse. For example, the player and doors are passive entities. Kynapse gets information from the game engine for each passive entity (so that active entities can perceive them), but Kynapse does not generate behaviors for passive entities. Passive entities have no brain (p.33) (they do not think) and no action (p.32) (they do not act).

More information: • How entities are represented in C++ (p.17).

Creating and destroying entities
An entity only exists in the AI world (p.29). Entities can be created and destroyed at any time during the simulation. However, creating entities is CPU-consuming, so you should create a pool of entities at the beginning of your game that will be activated and deactivated according to your needs.
• When an entity is deactivated in the AI world, it is ignored during the Kynapse update sequence.
• When an entity is activated in the AI world, it is taken into account during the Kynapse update sequence.

An entity is created through an entity definition (p.51). When it is constructed, it is automatically inserted in the AI world (p.29) as a deactivated entity. When destroying the AI world (p.29), all entities in the AI world (activated and deactivated) are destroyed. More information: • How entities should be created (p.17).

Updating entities
Active and passive entities represent game engine objects in Kynapse. These "AI images" must be synchronized with the underlying objects in the game engine so that Kynapse can use them as valid representations. Information related to Kynapse entities should be updated at each frame of the simulation in order to avoid divergence between the entity's parameters in the game engine and in Kynapse. To achieve this synchronization step, Kynapse calls all activated AI entities in the AI world at the beginning of each frame through a callback mechanism. This operation is called the "GetInput step" for entities. More information can be found in the Kynapse update step description (p.69). You therefore need to translate game engine entity characteristics like position, head orientation, width, maximal speed, and so on, into Kynapse entity attributes that can be considered and interpreted by standard Kynapse agents (p.35) and services (p.36). See the list of standard entity attributes. A purely illustrative synchronization sketch is shown below.

More information:
• How the GetInput step is implemented (p.17).
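The guide does not show the exact callback signature used during the "GetInput step", so the sketch below is purely illustrative: every identifier is a hypothetical stand-in. It only shows the kind of per-frame copy a game would perform from its own object into Kynapse entity attributes.

// Purely illustrative: CMyGameCharacter, CMyEntityImage and the attribute
// setters are hypothetical names; only the idea of copying game-engine state
// into Kynapse entity attributes each frame comes from this guide.
struct CMyGameCharacter
{
    float x, y, z;     // position in the game engine
    float maxSpeed;    // maximal speed in the game engine
};

class CMyEntityImage   // imagined wrapper around a Kynapse active entity
{
public:
    // Called once per frame during the "GetInput step" so that the AI image
    // of the character never diverges from the game-engine object.
    void GetInput(const CMyGameCharacter& character)
    {
        SetPosition(character.x, character.y, character.z);
        SetMaxSpeed(character.maxSpeed);
    }

private:
    void SetPosition(float, float, float) {}   // hypothetical attribute setter
    void SetMaxSpeed(float) {}                 // hypothetical attribute setter
};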

See also...
• Active entities have to be connected to a brain (p.33).
• Active entities have to be connected to an action (p.32).
• PathObjects (p.41) are a special type of entity.


Actions

Introducing actions

Definition
Action is a key concept for implementing the Kynapse behavior generation process (p.21). Since Kynapse cannot anticipate how your game engine action system is structured, it works with its own formal actions. An action is a Kynapse representation of a combination of basic movements an entity can perform, for example "move forward" and "turn left" and "crouch". The basic movements constituting an action are called action attributes (in the example above, "move forward", "turn left" and "crouch" are action attributes). See the list of standard action attributes (p.17).

Implementation
When constructing an active entity (p.30), you will need to tell Kynapse which action attributes it may use. For example, you may want Kynapse to compute acceleration instead of velocity for moving your game entities forward, or to explicitly allow or forbid strafe moves. Each frame during the simulation, the brain (p.33) of every active entity combines action attributes to generate an instantaneous action. High-level agents and services can help the brain during this process. This action will be translated and applied by your game engine entities during the Kynapse update step. This translation and application step is called the "ApplyEntityAction step".

For more information:
• See the Kynapse update sequence description (p.69).
• How agents can help me to generate actions (p.35).
• How services can help me to generate actions (p.36).

Kynapse offers a collection of standard action attributes you can use as they are. Users can also define their own action attributes from scratch.

Creating and destroying actions
You do not have to be concerned about creating or destroying actions.

See also...
• Actions are the orders Kynapse gives to the game engine for controlling active entities (p.30).
• Actions should be synthesized by brains (p.33).
• Agents (p.35) can produce action proposals.
• Services (p.36) can produce action proposals.


Brains

Introducing brains
The brain hosts the thinking logic for active entities (p.30). As a first step, it selects and activates agents (p.35) to give the appropriate behavior according to a specific game scenario. This decision is based upon:
• Perception (visual, auditive, topological, and so on).
• Current entity attributes.
• Thinking logic (hardcoded, scripted, and so on).

The decision is then translated into an action to be applied to the active entity. Brains therefore articulate the key Kynapse components:
• Agents (go to, flee, hide, customized, and so on).
• Services (perception, pathfinding, customized, and so on).

The brain can be connected to one or many agents (p.35) that may help to generate actions (agents could be considered as building blocks for simplifying brain programming). The brain can also use services (p.36) that are connected to it (like perception services or, for example, the PathFinder service (p.17)). Kynapse implements brain thinking logic (which is strongly game-design dependent) using one of the following options:
• Implementing brain logic using the C++ brain layout.
• Implementing brain logic using the Lua layout.
• You can also implement your own logic via, for example, your own script language. Brains are fully open.

Creating and destroying brains
You do not have to be concerned about creating or destroying brains. Brains are automatically created when you create an active entity. You can retrieve a brain from an active entity, or an active entity from a brain. Brains are automatically destroyed when you destroy an active entity (or the AI world it belongs to).

Updating brains
Brains are updated every frame of the simulation. At each frame, Kynapse will call all active entities' brains in the AI world (through a callback mechanism). This is the "think step" for brains. See the Kynapse update step (p.69) description. During the "think step", the logic code you have implemented in your brain will be evaluated.

More information:
• How the "think step" is implemented in C++ (p.17).

See also...
• Brains belong to active entities (p.30).
• At each frame of the simulation, the brain should synthesize an action (p.32).


Agents

Introducing agents
Agents implement entity (p.30) high-level actions like:
• go to point
• wander
• follow a target
• flee
• and so on.

Agents simplify brain programming. Agents are called by brains (p.33) to generate action (p.32) proposals. The brain then selects and activates the appropriate agents. To control an entity, agents retrieve information about the entity through entity attributes (p.30) and action attributes (p.32). Agents can call other agents and/or services (p.36). Kynapse offers a collection of standard agents (p.17) that are ready to use.

More information:
• Users can also create their own agents (p.17) from scratch, or customize existing agents (p.17).

Creating and destroying agents
You do not have to be concerned about creating or destroying agents. Agents are automatically created when you create an active entity. Agents are automatically destroyed when you destroy the active entity or the AI world (p.29) they belong to.

Updating agents
Like brains, agents "think". You have to specify in the brain how and when the agents are called. Therefore, unlike a brain, for which the "think step" takes place at each frame, the agent "think step" is not systematic. You will have to call an agent each frame until it manages to reach its objective (for example, you will ask the Goto agent to think until the entity reaches its destination point). A purely illustrative sketch of this pattern is shown after the "See also" list below.

See also...
• Agents are used by the brain (p.33).
• Agents can use services (p.36).
• Agents generate actions (p.32).
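The sketch below illustrates the "call the agent each frame until it reaches its objective" pattern described above. The Goto agent and the brain "think step" are Kynapse concepts, but every identifier here is a hypothetical stand-in, not the real SDK interface.

// Purely illustrative: CMyAction, CMyGotoAgent and CMyBrain are invented
// names used only to show the calling pattern, not Kynapse classes.
struct CMyAction { };                         // placeholder action proposal

struct CMyGotoAgent
{
    bool      HasArrived() const { return false; }   // hypothetical query
    CMyAction Think()            { return CMyAction(); }
};

class CMyBrain
{
public:
    // Called every frame during the brain "think step": keep asking the Goto
    // agent to think until the entity reaches its destination point.
    void Think()
    {
        if (!m_gotoAgent.HasArrived())
            m_action = m_gotoAgent.Think();   // agent returns an action proposal
    }

private:
    CMyGotoAgent m_gotoAgent;
    CMyAction    m_action;
};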


Services

Introducing services
A service is a functionality shared by several agents and brains (p.33). Services can be divided into two categories depending on whether they are entity (p.30)-specific or not:
• World services: These are not entity-specific. There is only one instance of each of these services for each AI world (p.29).
• Brain services: These are specific to an entity. There is only one instance of each of these services for each brain.

Services can also be split according to their function:
• Data services: These load, store, and make data available, and finally destroy it.
• Perception services: These manage perception requests. For example "is this door open?", "is the enemy visible?", "is the sound audible?".
• Action services: These generate actions.

Kynapse offers a collection of standard services (p.17) that are ready to use:

World services:
• Data: Graph manager (p.17), PathWay manager (p.17), Point mapper (p.17), Mapbuilder (p.17)
• Perception: Sound Manager (p.17)
• Action: PathObjects manager (p.17)

Brain services:
• Data: Entity manager (p.17)
• Action: PathFinder (p.17)

Users can also create their own services from scratch (p.17), or customize existing services (p.17).

Creating and destroying services
You do not have to be concerned about creating or destroying services. World services are created when the AI world is created and destroyed when it is destroyed. Brain services are created when the active entity containing the brain connected to these services is created, and destroyed when this entity is destroyed.

Updating services

36 Kynapse powered by Kynogon SDK

Only world services are systematically updated, after the entity "GetInput step (p.69)" and after the entity "think step (p.69)".

See also...
• Services can be used by agents (p.35).
• Services can be used by brains (p.33).


Teams

Introducing teams
Basically, teams are simple entity (p.30) containers. Teams offer cooperative behaviour by simplifying communication between team-mates, and proposing a holder to store common information. Moreover, part of the team behavior can be implemented at the team level.
• Entities can belong to several teams simultaneously.
• A team can be activated or deactivated.
• When an entity is removed from the AI world (p.29), it is automatically removed from the teams it belongs to.

Creating and destroying teams
A team only exists in the AI world. Teams can be created and destroyed at any time during the simulation. When a team is created, it is automatically inserted in the AI world as a deactivated team. When destroying the AI world, all teams in the AI world (activated and deactivated) are destroyed.

Using teams
Team members can be added and removed at any time during the simulation. You are free to use and structure your teams as you want.

See also...
• A team can only exist in the AI world (p.29).
• A team can contain entities (p.30).
• A team can contain PathObjects (p.41).


Pathfinding constraints

Introducing pathfinding constraints
Pathfinding is one of the main purposes of Kynapse. Pathfinding constraints define the way paths are calculated by the pathfinder service (p.17). A path can be furtive, avoid dangerous zones, be as short as possible, and so on. Kynapse offers a collection of standard pathfinding constraints (p.17) which are ready to use. Users can also create their own pathfinding constraints (p.17) from scratch. The default pathfinding constraint is the shortest path pathfinding constraint (p.17).

Creating and destroying pathfinding constraints
You do not have to be concerned about creating or destroying pathfinding constraints. Pathfinding constraints are automatically created when used for the first time by the pathfinder service linked to an active entity (p.30). Pathfinding constraints are automatically destroyed when you destroy the active entity that owns them (through the destruction of the pathfinder service).

Activating pathfinding constraints
More information:
• How pathfinding constraints are activated (p.17).

When are pathfinding constraints evaluated?
An entity's paths are computed along a spatial graph (p.45). The spatial graph is a representation of the world topology using vertices and edges. Pathfinding constraints are evaluated for each edge considered by the pathfinder service at path construction time. They may also be evaluated when trying to optimise trajectories during path following. A purely illustrative constraint sketch is shown after the "See also" list below.

See also...
• Pathfinding constraints are evaluated by the pathfinder service (p.17).
• You should also take a look at pathfinding heuristics (p.40).
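The sketch below illustrates the idea of a custom pathfinding constraint that makes dangerous edges more expensive. Custom constraints are supported by Kynapse, but the class, method name and edge structure here are hypothetical stand-ins, not the real SDK interface.

// Purely illustrative: CMyEdge and CMyAvoidDangerConstraint are invented
// names; only the "cost per edge" concept comes from this guide.
struct CMyEdge
{
    float length;        // geometric length of the edge
    bool  isDangerous;   // flagged by the game, for example near a danger point
};

class CMyAvoidDangerConstraint
{
public:
    // Called for each edge considered by the pathfinder: return the cost of
    // traversing the edge under this constraint.
    float GetCost(const CMyEdge& edge) const
    {
        float cost = edge.length;
        if (edge.isDangerous)
            cost *= 10.0f;   // make dangerous edges ten times more expensive
        return cost;
    }
};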


Pathfinding heuristics

Introducing pathfinding heuristics
A pathfinding heuristic is a cost estimate for the shortest path to reach a target. This cost can be measured according to various criteria: distance, time, and so on. The pathfinding heuristic is critical for the performance of the A* algorithm used by Kynapse. The A* algorithm is guaranteed to find the shortest path as long as the pathfinding heuristic estimate is admissible. A heuristic is said to be admissible if it never over-estimates the cost of travelling to the target. The pathfinding heuristic must never over-estimate, but the closer it sticks to the real cost, the more efficient the A* algorithm will be. Kynapse offers a collection of standard pathfinding heuristics (p.17) which are ready to use. C++ users can also create their own pathfinding heuristics (p.17) from scratch. The default pathfinding heuristic is the Euclidian distance pathfinding heuristic (p.17). A purely illustrative heuristic sketch is shown after the "See also" list below.

Creating and destroying pathfinding heuristics
You do not have to be concerned about creating or destroying pathfinding heuristics. Pathfinding heuristics are automatically created when used for the first time by the pathfinder service (p.17) of an active entity (p.30). Pathfinding heuristics are automatically destroyed when you destroy the active entity that owns them (through the destruction of the pathfinder service).

Activating pathfinding heuristics
Look at how pathfinding heuristics are activated (p.17).

When are pathfinding heuristics used?
Pathfinding heuristics are used for each edge considered by the pathfinder service when a path is computed.

See also...
• Pathfinding heuristics are used by the pathfinder service (p.17).
• You should also take a look at pathfinding constraints (p.39).
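The sketch below illustrates why the Euclidian distance heuristic is admissible: the straight-line distance can never exceed the length of any real path, so A* keeps its shortest-path guarantee. The class, vector type and method name are hypothetical stand-ins for the real Kynapse heuristic interface.

// Purely illustrative: CMyVec3 and CMyEuclidianHeuristic are invented names.
#include <cmath>

struct CMyVec3 { float x, y, z; };

class CMyEuclidianHeuristic
{
public:
    // Estimated cost from 'from' to 'target': the straight-line distance.
    // It never over-estimates the real travel cost, so it is admissible.
    float GetCost(const CMyVec3& from, const CMyVec3& target) const
    {
        const float dx = target.x - from.x;
        const float dy = target.y - from.y;
        const float dz = target.z - from.z;
        return std::sqrt(dx * dx + dy * dy + dz * dz);
    }
};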


PathObjects

Introducing PathObjects
PathObjects are used to take into account dynamic objects that interfere with pathfinding. For example, a door, a lift, a teleporter, and dynamic obstacles can be implemented as PathObjects. A PathObject is a specialized type of entity (p.30). A PathObject references a collection of edges in a move graph (p.45).
• A PathObject can dynamically lock and unlock its edges depending on its logic (a door may be opened or closed, a teleporter deactivated, and so on).
• A PathObject can also define how an entity goes through its edges (for example, to use a lift an entity will have to press a button, wait, walk into the lift, press another button, wait again, and finally walk out of the lift car).

Creating and destroying PathObjects
PathObjects are created and destroyed exactly like entities (take a look at the entity creation and destruction (p.30) section).

When are PathObjects evaluated?
The spatial graph (p.45) based PathFinder service (p.17) in Kynapse proceeds in three steps:
1. Path construction.
2. Subgoal selection.
3. Go to the current subgoal (and restart from step 2 until arrived).

PathObjects allow customization of each of these three steps:
1. Path construction: When the PathFinder service encounters an edge linked to a PathObject, it does not compute the edge cost using the current constraint (as it usually does). Instead, it asks the linked PathObject to give the edge cost through a dedicated GetCost function. For example, this allows you to forbid an edge linked to a locked door, or to give an insignificant cost to edges referenced by a teleporter (a purely illustrative GetCost sketch is shown at the end of this section).
2. Subgoal selection: When the PathFinder service encounters an edge linked to a PathObject, it asks the PathObject whether the vertices related to this PathObject can be bypassed or not.
3. Go to the current subgoal: When the PathFinder service encounters an edge linked to a PathObject, it asks the linked PathObject how to get to the end of the edge through a dedicated Goto function.

Updating PathObjects
PathObjects are updated exactly like entities (take a look at the entity creation and destruction (p.30) section). The "GetInput step" for PathObjects is an appropriate place to update the collection of edges linked to a dynamic PathObject. For example, a moving obstacle PathObject (like a fire in a meadow) should be synchronized with the underlying game engine fire model propagation in this step.

See also...
• A PathObject is a special type of entity (p.30).
• It is tightly linked to the PathFinder service (p.17).
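The sketch below illustrates the GetCost idea from step 1 above with a door PathObject. Kynapse PathObjects do expose GetCost and Goto functions, but the signature, the "negative cost means forbidden" convention and the class name below are hypothetical stand-ins invented for this illustration.

// Purely illustrative: CMyDoorPathObject is an invented name, and the
// "return -1.0f to forbid the edge" convention is an assumption of this
// sketch, not the real Kynapse convention.
class CMyDoorPathObject
{
public:
    // Called by the PathFinder during path construction instead of the
    // current pathfinding constraint.
    float GetCost(float edgeLength) const
    {
        if (m_locked)
            return -1.0f;      // sketch convention: edge is forbidden
        return edgeLength;     // an open door costs the same as a normal edge
    }

    void SetLocked(bool locked) { m_locked = locked; }   // driven by game logic

private:
    bool m_locked = false;
};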


Main data structures

This section introduces all the fundamental Kynapse data structures you should be familiar with.
• Resource data structures (p.44)
• Configuration data structures (p.48)


Main resource data structures

This section introduces all the fundamental Kynapse resource data structures you need to be familiar with. The following resource data structures are described:
• PathData (p.45)
• PathWays (p.47)


PathData

Introducing PathData
In the same way that a graphics engine uses polygons and collision meshes to model the world from a rendering and physical point of view, Kynapse uses PathData to model the world from a behavioral point of view. PathData comprises:
• Spatial graphs. These are Kynapse data structures (p.43) corresponding to oriented 3-dimensional graphs with vertices linked by edges. They are widely used by the PathFinder service (p.17) and by some agents (p.35) (such as the hide agent (p.17) and the flee agent (p.17)). You can find more information about the graph structure and its size in memory in this section (p.17). Also, you should be familiar with the spatial graph structure rules (p.17), especially if you want to modify spatial graphs and/or if you want to create your own PathData generation tool.
• Additional data. This data overloads spatial graphs with additional properties for vertices and edges. Many types of standard additional data (p.17) are available in Kynapse:
  • Find nearest (p.17) additional data. This is used to accelerate proximity queries on spatial graphs.
  • Path cost (p.17) additional data. This is used to optimize the A* algorithm.
  • Access ways (p.17) additional data. This is used for detecting points from which danger can come during the simulation.
Users can define custom additional data (look at the additional data sample (p.17)).

Usually, in a game level you will need these spatial graphs:
• A move graph for every type of active entity (p.30) (that is, every type that moves differently), for every separate area they could be moving inside. Edges in the graph will be considered as possible ways an entity can go during the simulation.
• One or more optional positions graphs for each type of active entity. Points in the graph may represent strategic places in the world like hiding places, snipe positions, dangerous places, and so on.

Spatial graphs are managed by a standard dedicated management service called the graph manager (p.17).

Generating PathData
PathData may be generated in four ways:

• Automatically using the Pathdata generator tool (p.17). PathData usually contains hundreds of nodes, so manually generating this amount of data would be impossible.


• Automatically using the standard generator agent (p.17). See automatic PathData generation (p.17).
• Manually, using the PathData editor (p.17).

• Procedurally, at run-time or from your custom tools. See spatial graphs.

Creating and destroying PathData
You do not have to be concerned about creating or destroying PathData. PathData is created automatically when you create an AI world (p.29) containing a graph manager. PathData is destroyed automatically when you destroy the AI world containing it. Users can dynamically load and unload PathData.

Updating PathData
PathData should be considered as a static structure. You should not add or remove vertices or edges from spatial graphs at runtime. Moving vertices during the simulation is not recommended. PathObjects (p.41) provide you with an efficient way of dynamically changing a spatial graph's properties at runtime.

See also...
• PathData is widely used by the PathFinder service (p.17).
• PathData is managed by the graph manager (p.17).


PathWays

Introducing PathWays
A PathWay is an object describing a sequence of waypoints that an active entity (p.30) should follow (ideal for a guard patrol, for example). Specific information can be stored at each waypoint and on each link between waypoints. This information is used to script the entity behaviour when following its PathWay:
• The entity can use several types of movement (walk, run, jump, crouch) between waypoints.
• When a waypoint has been reached, the entity can orient itself in a particular direction, turn to an angle, or wait a few seconds.

PathWays can be used through the standard PathWay agent (p.17). PathWays are managed by a standard dedicated PathWay manager service (p.17).

Generating PathWays
PathWays should be generated procedurally, at run-time or from your custom tools.

Creating and destroying PathWays
You do not have to be concerned about creating or destroying PathWays. PathWays are automatically created when you create an AI world (p.29) containing a PathWay manager. PathWays are automatically destroyed when you destroy their AI world.

Updating PathWays
Users can update PathWays at any time during the simulation.

See also...
• Active entities (p.30) can follow PathWays.
• PathWays can be used by the standard PathWay agent (p.17).
• PathWays are managed by a standard dedicated PathWay manager service (p.17).


Configuration data structures

This section introduces all the fundamental Kynapse configuration data structures you need to be familiar with:
• AI engine configuration parameters (CParamBlock) (p.49)
• AI world definition (p.50)
• Entity definition (p.51).


AI engine configuration parameters (CParamBlock)

Introducing the CParamBlock structure
The CParamBlock is used to open the AI engine (p.28). It specifies:
• Data required by Kynapse, such as the maximum number of entities in your AI world.
• Pointers to the integration functions that Kynapse needs to interface properly (like collision detection, debug outputs, and so on).

Kynapse provides an optional AI skeleton (p.17). With the AI skeleton, the CParamBlock class can be automatically initialised using the KsSetDefaultParamBlock function.

Members of the structure
See the reference API (p.17) for a detailed description of this structure.

See also...
• The CEngineConfig is used to open the AI engine (p.28).
• You can use the AI skeleton (p.17).


AI world definition

Introducing the AI world definition
AI worlds (p.29) must be created through an AI world definition. An AI world definition can be considered as a construction plan for an AI world. A definition is a hierarchical structure of named nodes containing values (service (p.36) parameters, time-slicing (p.53) parameters, and so on). An AI world definition can potentially include entity definitions (p.51).

Creating and destroying AI world definitions
You should explicitly create and destroy your AI world definitions. An AI world definition can be created:
• by reading an external XML definition file (see here (p.17) for more details)
• by creating it from scratch in C++
• by reading an external XML file and modifying the definition via C++ code.

Main sections for a standard AI world definition
• Structure (p.17) of a standard AI world definition.
• Example (p.17) of an XML AI world definition.

See also...
• An AI world definition is used to build AI worlds (p.29).
• An AI world definition can include entity definitions (p.51).


Entity definitions

Introducing entity definitions
Entities (p.30) should be created through entity definitions. An entity definition can be considered as a construction plan for an entity. A definition is a hierarchical structure of named nodes carrying values (parameters for services, brain settings, and so on). An entity definition can be included in an AI world definition (p.50).

Creating and destroying entity definitions
You should explicitly create and destroy your entity definitions. An entity definition can be specified:
• by reading an external XML definition file
• by creating it from scratch in C++
• by reading an external XML file and modifying the definition via C++ code.

Main sections for a standard entity definition
• Structure (p.17) of a standard entity definition.
• Example (p.17) of an XML entity definition.

See also...
• An entity definition is used to build entities (p.30).
• An entity definition can be included in an AI World definition (p.50).


Core management

This section introduces all the fundamental Kynapse core management mechanisms you need to be familiar with:
• Time management (p.53)
• Memory management (p.59)
• Object management (p.62)
• Data management (p.63)


Time management

Introducing time slicing
Time consumption is critical. Most games dedicate very limited CPU time to AI in comparison with other tasks (rendering, physics, and so on). On the other hand, AI sometimes uses very complex algorithms that can be time-consuming. To provide both fine behaviors and low CPU consumption, Kynapse provides a time slicing mechanism. When dealing with behaviors, it is not necessary to have a very accurate action at each frame. Indeed, an action can be applied during several consecutive frames without any difference from the player's point of view. It is therefore possible to spread behavior computation over several frames to avoid CPU peaks. Kynapse guarantees an average time consumption per frame.

Each time-consuming task is managed by a time manager. This object processes CPU requests from all Kynapse entities (p.30) which need computations. The main task of the time manager is to define priorities between entities so that all entities have relevant CPU time. This is achieved in the scheduling function. You can modify priorities between entities to fit specific needs. For example, you may want to give more time to entities visible from the player.

The time slicing mechanism allows you to save a lot of CPU time. However, you should give a reasonable minimum amount of CPU time to Kynapse. If you do not, Kynapse will still work, but entities will not be reactive. The more time you dedicate to Kynapse, the higher the reactivity will be. You need to find a good compromise between CPU time consumption and reactivity.

How can you use time slicing?
A time manager is implicitly created when you create an AI world (p.29). You can set up time slicing at two levels:
• Global time slicing
You should learn how to activate or deactivate time slicing in Kynapse. You can ask Kynapse to deactivate time slicing by setting the "tpf" (time per frame) attribute in the AI world definition (p.50) to 0. If you want time slicing to be used, you should set the "tpf" attribute to a value greater than 0. This value represents the maximum CPU time that you want to dedicate to Kynapse, expressed in milliseconds per frame. Here is a sample of an XML AI world definition with the global tpf value setting:

[XML sample not reproduced: the markup was lost in this extraction. The sample shows an AI world definition whose global "tpf" value is set to 40.0, together with other definition values (35, 1.5) and a GraphManager section.]


• Task-level time slicing
The time manager manages two kinds of task requests:

Periodic
Performed regularly, with a given frequency. Here is a sample of an XML AI world definition of a periodic task:

[XML sample not reproduced: the markup was lost in this extraction. The sample declares a periodic task; the only value visible is 4000.]

Aperiodic
Performed as frequently as possible. These tasks are not performed when there is no CPU time available. Here is a sample of an XML AI world definition of an aperiodic task:

[XML sample not reproduced: the markup was lost in this extraction. The sample declares an aperiodic task; the values visible are 1.0, 0.5 and 100.]

Note: The "tpf" value specified here is absolute. It is not relative to the global "tpf". If this value is set to 0, the task will never be called more than once per frame (it might not be called at all if there is no more time left for AI). If the task-level "tpf" is greater than the global "tpf", then the task-level "tpf" will be clamped to the global "tpf". So usually, the value of a task-level "tpf" should be between 0 and the global "tpf" value. Users can customize the time slicing settings for standard Kynapse components. They can also decide whether their own tasks (in their custom components) should be managed by the Kynapse time manager.

How does the AI code check for free processing time?
Users may want to define their own tasks, and then ask Kynapse whether a portion of code may be evaluated or not. Please take a look here (p.57) to see how this should be done in C++.

Time measurement
To be able to spread computation over several frames, Kynapse should be able to measure the time that is spent during function calls. Kynapse provides two mechanisms for measuring these durations:
• Exact time measurement (p.55).
• Deterministic time measurement (p.56).


Exact time measurement

How does it work?
To be able to spread computation time over several frames, Kynapse should be able to know how much time is spent during critical function calls. The exact time measurement mechanism simply uses the RwAiParamBlock (p.49) getTimeFunc function pointer to know the exact duration of a specific function call.

Advantages and drawbacks
This time measurement mechanism is the most exact measurement mechanism provided by Kynapse. The problem with this time measurement mechanism is that it is not deterministic; on a PC, for example, two similar function calls can produce different measures (due to a hard-drive access or a multithread side effect). So this mechanism is not suited to games expecting deterministic behaviour (distributed games, replay, ...).

Using the exact time measurement mechanism
You only have to set the RwAiParamBlock (p.49) getTimeFunc function pointer to the RsAiGetTime skeleton function for the exact time measurement mechanism to work.

See also...
Kynapse also supports a deterministic time measurement mechanism (p.56).


Deterministic time measurement mechanism

How does it work?
To be able to spread computation time over several frames, Kynapse should be able to know how much time is spent during critical function calls. The deterministic time measurement mechanism simply uses the RwAiParamBlock (p.49) getTimeFunc function pointer to know the fixed estimated duration of a specific function call.

Advantages and drawbacks
This time measurement mechanism is deterministic. It is well suited to games expecting deterministic behaviour (distributed games, replay, ...). The problem with this time measurement mechanism is that it is not exact, and you have to provide an accurate estimation for each critical function in your game.

Using the deterministic time measurement mechanism

• Setting up the relevant time function callback: First of all, you should set the RwAiParamBlock (p.49) getTimeFunc function pointer to the RsAiGetEstimatedTime skeleton function. The default RsAiGetTime function is not suited to this mechanism.

• Then you should add an estimation section to your AI world definition. Here is an AI world XML definition sample for this:

[XML sample not reproduced: the markup was lost in this extraction. The sample shows an estimation section containing the values 0.005, 0.04, 0.02 and 0.08.]

The values in this section should be carefully tuned on a reference computer.

See also...
Kynapse also supports an exact time measurement mechanism (p.55).


Time management sample code

Here is an example of how the Kynapse time manager can be asked whether a portion of code should be evaluated periodically:

void TestPeriodic()
{
    CTimeManager& scheduler = CTimeManager::GetInstance();
    extern KyInt32 myPeriodicFunctionId;

    // myPeriodicFunctionId should have been initialized before, at the init stage.
    // Example: myPeriodicFunctionId = scheduler.GetPeriodicTaskByName("MyPeriodicFunction");

    KyBool granted = scheduler.RequestPeriodic(myPeriodicFunctionId, m_bot->m_index);
    if (granted == KYTRUE)
    {
        // I can call my function now
        MyPeriodicFunction();
    }
    else
    {
        // I cannot call my function
        if (ExceptionalCondition() == KYTRUE)
        {
            // In some (rare) cases, I want to call my function anyway
            MyPeriodicFunction();
            // Then I have to warn the scheduler
            scheduler.RecordPeriodicTaskUse(myPeriodicFunctionId, m_bot->m_index);
        }
    }
}

Source: examples\code_examples\src\tsperiodic.cpp

Here is an example of how the Kynapse time manager can be asked whether a portion of code should be evaluated aperiodically:

void TestAperiodic()
{
    CTimeManager& scheduler = CTimeManager::GetInstance();
    extern KyInt32 myAperiodicFunctionId;

    // myAperiodicFunctionId should have been initialized before, at the init stage.
    // Example: myAperiodicFunctionId = scheduler.GetAperiodicTaskByName("MyAperiodicFunction");

    KyBool granted = scheduler.RequestAperiodic(myAperiodicFunctionId, m_bot->m_index);
    if (granted == KYTRUE)
    {
        // I can call my function now
        MyAperiodicFunction();
        // Then I give feedback to the scheduler when I have finished
        scheduler.ProcessFeedback(myAperiodicFunctionId, m_bot->m_index);
    }
    else
    {
        // I cannot call my function, there is no time left
    }
}

Source: examples\code_examples\src\tsaperiodic.cpp

Caution: It is not possible to call the RequestAperiodic function recursively. For example, this kind of code is not supported:

void F()
{
    ...
    if (RequestAperiodic(taskId))
    {
        ...
        F();
        ...
    }
}


Memory management

Introducing memory management in Kynapse
Memory management is closely linked to performance issues. Kynapse has been designed to:
• minimize memory consumption
• avoid memory fragmentation
• avoid in-game allocations/deallocations
• allow user-defined memory management.

Kynapse never uses standard C or C++ memory management functions to get or release memory (malloc, free, and so on). It always uses the functions you set through the AI engine configuration parameter block (p.49) when opening the AI engine (p.28). On top of that, Kynapse manages memory with allocators or FreeLists.
• An allocator is a simple object that can allocate and deallocate memory.
• A FreeList is a more sophisticated object that can allocate and deallocate memory through children operators (see the FreeLists section below for more details).

Kynapse offers a collection of standard allocators that are ready to use:
• a basic allocator that simply calls the memory functions defined in the engine parameter block
• a memory pool based allocator.

The default allocator is the pool based allocator (a pool based allocator is created when you create your AI world using the RwAiWorldCreate function). Users can also develop their own allocators from scratch.

Allocators

Basic allocator
The basic allocator is the simplest type of allocator possible. It directly calls the memory functions defined in the AI engine configuration parameter block, without doing any upper level management. This allocator is used by the AI engine at the early stages of Kynapse initialization.

Pool based allocator
This allocator creates a large memory pool when it is created; then, each time an allocation is requested, a part of the pool is used and no actual allocation is performed. This saves CPU time, because only one allocation is actually performed, and it avoids memory fragmentation. Moreover, it guarantees that several consecutive allocations will use a contiguous block of memory. Note that a block allocated by a pool based allocator is never freed until the destruction of the pool; the pool based allocator is therefore not convenient when performing many alloc/free operations. In this case, you should consider FreeLists. When the pool is not large enough to perform all the requested allocations, Kynapse sends a debug message and falls back on the basic allocator.

A pool based allocator can be constructed in two ways:
• by specifying the pool size (in bytes)
• by specifying a memory pointer to any available memory block.

Using allocators
If you want your classes to be managed by the Kynapse memory manager, they have to either derive from the Kaim::CObject class (see kygraphmanager.h for an example) or implement the KY_DEFINE_NEW_DELETE_OPERATORS macro (see the CVertex class definition in the kyspatialgraph.h file for an example).

FreeLists
A FreeList in Kynapse is a list of “intelligent” pool based allocators. Each allocator is called a FreeList block. A FreeList block is a pool based allocator that performs reference counting: each time an object is allocated inside a FreeList block, the number of references is incremented; when this object is deleted, the number of references is decremented. When the number of references reaches zero, the FreeList block is released and can be used again. A FreeList is merely a container that holds a list of free FreeList blocks and a list of used FreeList blocks.
FreeLists should be considered when you need to manage many dynamic alloc/free operations. For example, if you need to manage 40 entities from 100 possible entity definitions (p.51), you can create a FreeList of 40 blocks of 15 kB each (the approximate maximum size of an entity in memory). You then create each entity in a FreeList block. When an entity is destroyed, its FreeList block is automatically released and can be used for the next entity creation.

Example
Here is an example of how allocators and FreeLists can be mixed together for managing memory in Kynapse:
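For instance, the 40 dynamic entities mentioned above can be carved out of a dedicated FreeList, while long-lived world data stays in the pool based allocator created with the AI world. The sketch below illustrates only the class-level side of such a setup, namely making user classes allocatable by the Kynapse memory manager through the two mechanisms described under "Using allocators"; the class names and the include chosen here are assumptions, not the actual Kynapse API.

// Sketch: placing user classes under the Kynapse memory manager.
// Kaim::CObject and KY_DEFINE_NEW_DELETE_OPERATORS are the mechanisms named above;
// the class names and the header below are illustrative assumptions.
#include "kygraphmanager.h"   // assumed to declare Kaim::CObject (used as an example above)

// Option 1: derive from Kaim::CObject.
class CMyWorldData : public Kaim::CObject
{
    // new/delete for CMyWorldData now go through the Kynapse allocators
};

// Option 2: a class that cannot (or should not) derive from Kaim::CObject.
class CMyEntityPayload
{
public:
    KY_DEFINE_NEW_DELETE_OPERATORS   // same effect via the macro (see CVertex in kyspatialgraph.h)
};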


See also • Allocation and deallocation functions are set through the AI engine configuration parameter block (p.49).


Object management

Introducing object factories
Kynapse offers many components, but you may want to develop your own components or extend existing ones. Kynapse has been designed to be fully open: you can extend its functionality by developing your own C++ classes. At runtime, however, Kynapse must be able to instantiate them. Kynapse itself only handles purely abstract classes; concrete object creation is done through an object factory mechanism. Since native C++ does not provide a meta class abstraction, Kynapse implements the factory mechanism with a C++ template class, CMetaClass, which links a class name with the function that instantiates the class. Hence, knowing a class name is enough to instantiate an object. Moreover, CMetaClass allows you to identify:
• the base class of a class (if any)
• the number of instances of a given class
• the number of registered classes.

Using object factories
The following components should be customized with object factories: actions (p.32), action attributes (p.32), agents (p.35), brains (p.33), pathfinding constraints (p.39), entities (p.30), entity attributes (p.30), pathfinding heuristics (p.40), PathObjects (p.41), and services (p.36). Each meta class should be implemented as a global variable, so that the class registers itself automatically in the meta class constructor.
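As an illustration, here is a self-contained sketch of this self-registering factory pattern in plain C++. It is not the actual Kaim::CMetaClass declaration (which is a template and also tracks base classes and instance counts); all names below are illustrative.

#include <map>
#include <string>

// Stands in for a purely abstract Kynapse component interface.
class IComponent
{
public:
    virtual ~IComponent() {}
};

// A minimal "meta class": links a class name to the function that instantiates that class.
class MetaClass
{
public:
    typedef IComponent* (*CreateFunc)();

    MetaClass(const std::string& name, CreateFunc create)
    {
        Registry()[name] = create;                 // automatic registration in the constructor
    }

    static IComponent* CreateByName(const std::string& name)
    {
        std::map<std::string, CreateFunc>::const_iterator it = Registry().find(name);
        return (it != Registry().end()) ? it->second() : 0;
    }

private:
    static std::map<std::string, CreateFunc>& Registry()
    {
        static std::map<std::string, CreateFunc> s_registry;   // shared name -> factory table
        return s_registry;
    }
};

// A concrete component, its creation function, and its meta class declared as a global
// variable so that registration happens automatically at static initialization time.
class CMyAgent : public IComponent { /* ... */ };
static IComponent* CreateMyAgent() { return new CMyAgent(); }
static MetaClass g_myAgentMetaClass("CMyAgent", &CreateMyAgent);

// Usage: IComponent* agent = MetaClass::CreateByName("CMyAgent");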


Data management

Introducing data management in Kynapse
Data management is key because it directly affects performance, especially during initialization. Kynapse handles two types of data:
Configuration data: a description of the objects to be managed by Kynapse (entities (p.30), brains (p.33), agents (p.35), and so on), together with parameters for behavior tuning and CPU time consumption.
Raw data: pre-computed binary data (PathData (p.45), for example).
Both configuration data and raw data should be read through data readers.

Configuration data
Most Kynapse objects (agents, services, and so on) use configuration parameters. Just after the instantiation of an object, Kynapse calls its Init() virtual function to perform various initializations. This function receives the configuration data as an input parameter. The default implementation of Init() usually reads each element of the configuration data through the InitParam virtual function. Hence, in inherited classes, the InitParam function should be overloaded to handle specific configuration parameters.
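A plain C++ sketch of this Init()/InitParam() pattern is shown below. None of these names or signatures are the actual Kynapse API (the real Init() receives the configuration data as a CData tree); the sketch only illustrates the call structure and how a derived class consumes the parameters it understands.

#include <cstring>
#include <cstdlib>

class ConfiguredObject
{
public:
    virtual ~ConfiguredObject() {}

    // Stands in for Init(): walks the configuration elements and calls InitParam()
    // once per (name, value) pair.
    virtual bool Init(const char* const* names, const char* const* values, int count)
    {
        for (int i = 0; i < count; ++i)
            if (!InitParam(names[i], values[i]))
                return false;
        return true;
    }

    // Derived classes overload this to handle the parameters they understand.
    virtual bool InitParam(const char* /*name*/, const char* /*value*/) { return true; }
};

class CMyService : public ConfiguredObject
{
public:
    CMyService() : m_updatePeriod(0.1) {}

    virtual bool InitParam(const char* name, const char* value)
    {
        if (std::strcmp(name, "UpdatePeriod") == 0)        // "UpdatePeriod" is an invented parameter
        {
            m_updatePeriod = std::atof(value);
            return true;
        }
        return ConfiguredObject::InitParam(name, value);    // unknown parameters fall through
    }

private:
    double m_updatePeriod;
};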

Configuration data is stored in CData objects. CData stores configuration information in a tree structure.

Raw data
Raw data is usually loaded during initialization, since loading it can be time consuming. Typically, raw data is stored in world services, which are built when the world is created and freed when the world is destroyed.

Reading data through the data reader
The data reader (CDataReader) is an abstract class that manages Kynapse data reading.


This object allows you to:
• store configuration data in a CData object
• access raw data through a file I/O API.


Initializing, updating and terminating Kynapse

This section explains what you need to do when initializing, updating and terminating Kynapse.
• Overview (p.66)
• Initializing Kynapse (p.67)
• Updating Kynapse (p.69)
• Terminating Kynapse (p.70)


Overview

Below are the main Kynapse API calls that you need to add to your game engine. For more details, you should read:
• Initializing Kynapse (p.67)
• Updating Kynapse (p.69)
• Terminating Kynapse (p.70)
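
The sketch below shows schematically where these calls sit in a game engine. The call names (Open, WorldCreate, EntityCreate, WorldUpdate, WorldDestroy, Close) and the CEngineConfig parameter block are the ones used in this guide; the argument lists and the surrounding code are simplified placeholders, not the real signatures.

// Schematic integration sketch; placeholders, not the actual Kynapse signatures.
void RunGame()
{
    // --- Initialization (see "Initializing Kynapse") ---
    CEngineConfig engineConfig;                  // AI engine configuration parameter block
    // ... fill in the memory, time and other engine parameters ...
    Open(engineConfig);                          // open the AI engine

    WorldCreate(/* AI world definition */);      // create the AI world for the current level
    EntityCreate(/* entity definition */);       // create (a pool of) AI entities

    // --- Main loop (see "Updating Kynapse") ---
    while (/* the game is running */ true)
    {
        // ... game engine update ...
        WorldUpdate();                           // the single Kynapse update call, once per frame
        // ... rendering, and so on ...
    }

    // --- Termination (see "Terminating Kynapse") ---
    WorldDestroy();                              // destroys the AI world and its entities
    Close();                                     // closes the AI engine
}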


Initializing Kynapse

Initialization steps
These are the steps you should follow to initialize Kynapse:

• Set an AI engine configuration parameter block
Before opening the AI engine (p.28) (the main object representing the Kynapse engine), you should build an AI engine configuration parameter block (p.49) (CEngineConfig) so that Kynapse can communicate with your game engine.

• Open the AI engine
Then call the Open() function, passing the AI engine configuration parameter block you built above.

• Set the PathData specific callbacks
To use the automatic PathData generator (p.17), you have to develop three functions. These functions provide low-level control of entities: they set their position and direction in the game engine, and test whether a position is valid for pathfinding. They are used exclusively by the automatic PathData generator and bypass the constraints of the movement engine in order to save time. No default functions are supplied in the AI skeleton (p.17), because their implementation is too specific to the game engine. Once developed, you attach these functions as callbacks to Kynapse with the following functions:

KsSetSetPositionInGameEngineFunc(): attaches the callback to the function that sets the entity position in the game engine.
KsSetSetDirectionInGameEngineFunc(): attaches the callback to the function that sets the entity direction in the game engine.
KsSetIsReachableFunc(): attaches the callback to the function that indicates whether a position can be reached, to accelerate PathData generation. This function can be implemented trivially as {return KYTRUE;}.
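
A minimal sketch of this step is shown below. Only the three Ks... setter names come from this guide; the callback signatures, the entity handle type and the position type are assumptions made purely for illustration.

// Hypothetical callback signatures; the real prototypes are defined by the Kynapse headers.
static void MySetPositionInGameEngine(void* gameEntity, const float position[3])
{
    // teleport the game engine entity to 'position' (no physics, no collision)
}

static void MySetDirectionInGameEngine(void* gameEntity, const float direction[3])
{
    // orient the game engine entity along 'direction'
}

static KyBool MyIsReachable(const float position[3])
{
    return KYTRUE;   // trivial implementation, as suggested above
}

void AttachPathDataCallbacks()
{
    KsSetSetPositionInGameEngineFunc(MySetPositionInGameEngine);
    KsSetSetDirectionInGameEngineFunc(MySetDirectionInGameEngine);
    KsSetIsReachableFunc(MyIsReachable);
}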

See also PathData (p.45).

Now that the AI engine is ready, you should create an AI world (p.29). An AI world can be created and deleted for each level in your game:

• Set an AI world definition
AI worlds are complex. You should first build an AI world construction plan (an AI world definition (p.50)).


• Create the AI world
Call the WorldCreate() function, passing the AI world definition you created above.

Now that your AI world is ready, you can populate it with entities (p.30). For performance reasons, it is better to avoid creating and destroying entities during your main loop. You should therefore consider a pool based approach and create a pool of entities that will be activated and deactivated inside the AI world:

• Set your entity definitions
AI entities may be complex. You should first build AI entity construction plans (entity definitions (p.51)).

• Create your AI entities
Call the EntityCreate() function with the AI entity definition you created above.

You can also create teams (p.38) and PathObjects (p.41) at this point. Now you can enter the game main loop.

See also...
• Updating Kynapse (p.69)
• Terminating Kynapse (p.70)


Updating Kynapse

The update call
Kynapse is updated through a single function call: WorldUpdate().

The Kynapse update sequence
Below is a description of what happens during each frame of the simulation (a schematic sketch is given at the end of this section).
1. First loop (for each entity in the AI world):

• The Kynapse “GetInput step”. Each frame, at the “GetInput step”, the Kynapse entity parameters (position, life points, and so on) are updated from the game engine. The GetInput method for your entities (p.30) is called.
2. Second loop (for each world service in the AI world):

• The Kynapse world services update. World services (p.36) are updated at this point, in the order in which they are declared in the AI world definition.
3. Third loop (for each active entity in the AI world):

• The Kynapse “think step”. Each frame, at the “think step”, the decision making process is launched and returns an action to be applied to the entity. The Think method for your entity's brain (p.33) is called.

• The Kynapse “ApplyEntityAction step”. Each frame, at the “ApplyEntityAction step”, the action constructed by the “think step” is translated into game engine commands. This step is executed only if the “think step” for the AI entity is successful (meaning it returns true). The entity's action is translated into the game engine entity's actions and commands. The ApplyEntityAction method for your entity's action (p.32) is called.

See also...
• Initializing Kynapse (p.67)
• Terminating Kynapse (p.70)
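
For reference, the sketch below summarizes this sequence in plain C++. The interfaces and containers are illustrative stand-ins, not the actual Kynapse types; only the method names GetInput, Think and ApplyEntityAction and the three-loop order come from the description above.

#include <vector>
#include <cstddef>

// Illustrative stand-ins for entities and world services.
struct IEntity
{
    virtual ~IEntity() {}
    virtual void GetInput() = 0;             // refresh attributes from the game engine
    virtual bool Think() = 0;                // decision making; true if an action was produced
    virtual void ApplyEntityAction() = 0;    // translate the action into game engine commands
};

struct IWorldService
{
    virtual ~IWorldService() {}
    virtual void Update() = 0;
};

// Schematic equivalent of one WorldUpdate() call, following the three loops described above.
void WorldUpdateSketch(std::vector<IEntity*>& entities,
                       std::vector<IWorldService*>& services,
                       std::vector<IEntity*>& activeEntities)
{
    for (std::size_t i = 0; i < entities.size(); ++i)          // 1. GetInput step
        entities[i]->GetInput();

    for (std::size_t i = 0; i < services.size(); ++i)          // 2. world services, in declaration order
        services[i]->Update();

    for (std::size_t i = 0; i < activeEntities.size(); ++i)    // 3. think, then apply the action
        if (activeEntities[i]->Think())
            activeEntities[i]->ApplyEntityAction();
}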


Terminating Kynapse

Steps for terminating Kynapse
These are the steps you should follow to terminate Kynapse:

• Destroy the AI world
The AI world should be destroyed by calling the WorldDestroy() function. AI entities (p.30) in the AI world (p.29) (activated or not) are implicitly destroyed.
• Close the AI engine
The AI engine (p.28) should be closed by calling the Close() function.

See also... • Initializing Kynapse (p.67). • Updating Kynapse (p.69).


Kynapse coordinate system

Kynapse uses an orthogonal right-handed coordinate system for its 3D spaces.

The three axes are defined in Kynapse by three vectors. By convention, the axis names used are:

Axis  Vector representation
X     "Right"
Y     "Up"
Z     "Front"
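
The following self-contained snippet illustrates this convention (the Vec3 type is illustrative, not a Kynapse class): in a right-handed basis, the cross product of the "Right" and "Up" axes gives the "Front" axis.

struct Vec3 { float x, y, z; };

Vec3 Cross(const Vec3& a, const Vec3& b)
{
    Vec3 r = { a.y * b.z - a.z * b.y,
               a.z * b.x - a.x * b.z,
               a.x * b.y - a.y * b.x };
    return r;
}

const Vec3 kRight = { 1.0f, 0.0f, 0.0f };   // X axis
const Vec3 kUp    = { 0.0f, 1.0f, 0.0f };   // Y axis
const Vec3 kFront = { 0.0f, 0.0f, 1.0f };   // Z axis
// Cross(kRight, kUp) yields (0, 0, 1) == kFront, which is what makes the basis right-handed.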


The following topics are not available in the online documentation: Using Kynapse standard components, Creating and customizing Kynapse components, Configuring Kynapse, Generating pathdata, Common topics, Integration helpers, Tools, and Reference. Please contact Kynogon to get the full Kynapse User Guide.

Glossary

AI world (p.29)
The modeling of the game world in Kynapse.

Action (p.32)
A means for an entity to interact with the game world. For example, going forward or backward, turning left or right, or jumping are actions.

Agent (p.35)
For Kynapse, an agent is a high level behavior that generates actions.

Attribute
A specific item of information attached to an entity. There are two categories of attributes:
Entity attributes (p.30): position, height, width, amount of ammunition, and so on.
Action attributes (p.32): go forward, go backward, turn left, turn right, shoot, and so on.
Each entity in Kynapse has a set of entity attributes and a set of action attributes.

Behavior
In general, a behavior is an algorithm that generates actions for an entity.

Bot
Synonym for active entity (p.30).

Brain (p.33)
Where decisions take place.

Decision
In Kynapse, a decision is the selection of an agent.

Entity (p.30)
An instance of a game engine object in the AI world. In Kynapse, all game engine objects necessary from an AI perspective are represented as entities in the AI world.

game engine
As a general definition, a game engine is the modeling of the world in the game.

game engine object
The term game engine object covers all objects existing in the game engine. For example, the player(s), enemies, a car, a door, a bridge, and so on are objects.

Perception
Input data from the AI world that an entity can use to make a decision. For example, visible enemies, specific sounds, and track trajectory data are some of the perceptions that might be available to an entity.

Kynapse
A set of off-the-shelf services, agents and brains available for you to start creating behaviors.


Service (p.36)
A service is a complex algorithm that can be shared among several entities.

Time slicing (p.53)
A mechanism through which computations are spread over several frame cycles, to avoid frame loss.
