AI Communication is about playing sound/voice and/or animations at the right times in the course of the game.
Note that Communication animations are not played if the AI is currently playing a smart object action.
To actually trigger a Communication (event), use the goalop "Communicate".
For example:
<GoalPipe name="Cover2_Communicate">
    <Communicate name="comm_welcome" channel="Search" expirity="0.5"/>
</GoalPipe>
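For illustration, such a goal pipe could be selected from an AI's Lua behavior script at runtime. This is only a sketch: the behavior table and the OnEnemySeen handler are hypothetical, and only the goal pipe name comes from the example above.
-- Hypothetical behavior event handler: selecting the goal pipe defined above
-- runs its Communicate goalop and thereby triggers the communication.
OnEnemySeen = function(self, entity)
    entity:SelectPipe(0, "Cover2_Communicate")
end,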
Communication Channels and Communication Configurations are defined in XML files under Game/Scripts/AI/Communication (see below).
A Communication Channel determines whether an AI can play a Communication at a given moment, depending on whether the Channel is occupied or free.
Communication Channels form a hierarchy that determines which other Channels (namely, the parent Channels) become occupied as well when a child Channel is occupied.
Note that a Communication Channel is a completely self-contained concept, independent of the other AI Communication concepts. Its sole purpose is to be in one of two states: occupied or free.
<Communications>
    <ChannelConfig>
        <Channel name="Global" minSilence="1.5" flushSilence="0.5" type="global">
            <Channel name="Group" minSilence="1.5" flushSilence="0.5" type="group">
                <Channel name="Search" minSilence="6.5" type="group"/>
                <Channel name="Reaction" priority="2" minSilence="2" flushSilence="0.5" type="group"/>
                <Channel name="Threat" priority="4" minSilence="0.5" flushSilence="0.5" type="group"/>
            </Channel>
            <Channel name="Personal" priority="1" minSilence="2" actorMinSilence="3" type="personal"/>
        </Channel>
    </ChannelConfig>
</Communications>
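For example, with the configuration above, a Communication played on the Reaction channel also occupies its parent channels Group and Global, because Reaction is nested inside them.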
Communication Configuration determines which Communications an AI can play, and how. A concrete Communication Configuration is assigned to an AI using its property CommConfig.
<Communications>
    <!-- Animation + Sound Event example (needs a state using the action/signal in the animation graph) -->
    <Config name="Surprise">
        <Communication name="comm_anim" finishMethod="animation" blocking="all" forceAnimation="1">
            <Variation animationName="Surprise" soundName="sounds/interface:player:heartbeat"/>
        </Communication>
    </Config>
    <!-- Sound Event example -->
    <Config name="Welcome">
        <Communication name="comm_welcome" finishMethod="sound" blocking="none">
            <Variation soundName="sounds/dialog:dialog:welcome"/>
        </Communication>
    </Config>
</Communications>
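For illustration, assigning such a configuration to an AI could look like the following sketch of the entity's Lua property table. The table layout is an assumption; only the property name CommConfig and the config name Welcome are taken from this article.
Properties =
{
    CommConfig = "Welcome", -- name of a <Config> element from the Communication Configuration XML
},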
Every Communication Configuration should contain at least one Communication, defined with the <Communication> tag.
Every Communication should contain at least one Variation.
To support localized dialog, subtitles, and lip sync, Voice Libraries are necessary. The right Voice Libraries can be assigned to the right AIs (or Entity Archetypes) using the property esVoice.
A Voice Library is an Excel XML file that resides in GameSDK/Libs/Communication/Voice. Its format is shown in the example file GameSDK/Libs/Communication/Voice/npc_01_example.xml:
Language: American English
File Path: languages/dialog/ai_npc_01/

Signal | Sound File | SDK NPC 01 Example |
---|---|---|
see | | |
see_player_00 | i see you | |
see_player_01 | hey there you are | |
see_player_02 | hey i have been looking for you | |
pain | | |
pain_01 | ouch | |
pain_02 | ouch | |
pain_03 | ouch | |
pain_04 | ouch | |
pain_05 | ouch | |
death | | |
death_01 | arrrhh | |
death_02 | arrrhh | |
death_03 | arrrhh | |
death_04 | arrrhh | |
death_05 | arrrhh | |
alerted | | |
alerted_00 | watch_out | |
alerted_01 | be careful | |
alerted_02 | something there | |
American English defines the localization this Voice Library belongs to.
Localization/<language>/dialog/ai_npc_01/ is the path where the files listed in the 2nd column actually reside.
For each Communication (1st column) - "see", "pain", "death", and "alerted" - a few variations are provided.
The 3rd column is just for comments.
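In the same spirit, hooking a Voice Library up via the esVoice property mentioned above might look like the sketch below. The value format shown (the library name without path or extension) is an assumption and should be checked against your own entity scripts.
Properties =
{
    esVoice = "npc_01_example", -- assumed value format; the voice library lives in GameSDK/Libs/Communication/Voice
},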
To trigger Communications from Flow Graph logic, the Flow Graph node "AI:Communication" can be used.
At startup, the folder Game/Scripts/AI/Communication and all of its subfolders are scanned for XML files containing Channel Configurations and Communication Configurations.
The SDK includes two basic Configuration XML files in this folder; HumanCommunication.xml, referenced in the troubleshooting example below, is one of them.
Communication animation and/or voice can be turned off using the AI's Lua script or the entity properties in Sandbox:
Readability =
{
    bIgnoreAnimations = 0,
    bIgnoreVoice = 0,
},
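Setting either flag to 1 makes that AI ignore the corresponding part of its Communications, i.e. its communication animations or its voice.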
To get debug information about AI Communication, dedicated AI CVars are available (provided ai_DebugDraw is set to 1).
Debug output to console:
Playing communication: comm_welcome[3007966447] as playID[84]
CommunicationPlayer::PlayState: All finished! commID[-1287000849]
CommunicationManager::OnCommunicationFinished: comm_welcome[3007966447] as playID[84]
CommunicationPlayer removed finished: comm_welcome[3007966447] as playID[84] with listener[20788600]
The code of the Communication Manager resides in CryAISystem and CryAction:
CryAISystem: Code/CryEngine/CryAISystem/Communication/
CryAction: Code/CryEngine/CryAction/AI/
[Warning] Communicate(77) [Friendly.Norm_Rifle1] Communication failed to start
You may find a message similar to the above if your AI's behavior tree calls a communication but the communication config is not set properly.
In this instance, "77" refers to the line in the GameSDK/Scripts/AI/BehaviorTrees/SDK_Advanced_Grunt.xml behavior tree script:
<Communicate name="TargetSpottedWhileSearching" channel="Reaction" expirity="1.0" waitUntilFinished="0" />
The TargetSpottedWhileSearching communication is defined in the GameSDK/Scripts/AI/Communication/HumanCommunication.xml script:
<Communication name="TargetSpottedWhileSearching" finishMethod="voice" blocking="none">
    <Variation voiceName="TargetSpottedWhileSearching"/>
</Communication>
So, if your AI character uses this Modular Behavior Tree but does not have the "Human" communication config set, the above error would occur.
You should also select a "Voice" to use for the communication, in this case AI_02. Voice Libraries are discussed earlier in this article (see the Voice Libraries section).