Enhancing the cognition of NPCs (bots) in High Fidelity


#1

Hey there fellow High Fidelity avatars,

I am looking for ways to enhance the overall cognition of NPCs (bots/scripts/pets/companions etc.) in High Fidelity… I would love to collaborate with a JavaScript programmer to port some preexisting libraries written in other languages so that they can be used in High Fidelity. I would also like to have fun making humanoid bots and some weird alien NPCs that can think deeply about their virtual environment (e.g. other avatars, NPCs, objects etc.) rather than simply respond to it in a predictable manner.

The following text is what I wrote to Liv and Ozan when they asked me to write a one-page summary of what I want to do in HF…

Current High Fidelity NPC situation: High Fidelity's NPCs, like most NPCs in other social virtual worlds and video games, are aesthetically very interesting, but cognitively they are just model-based reflex agents (finite-state machines). They simply react to environmental input with a (usually) predictably limited choice of behavioral responses. There is little sense that they are thinking deeply about the virtual environment, avatars and agents they are interacting with. They react to perceptual triggers that transition them into a particular state (e.g. idle or chat mode), and will always perform one of a limited set of possible actions while in that state (until the next trigger). Sometimes one has to intentionally interact with an NPC to get a response at all. For rudimentary applications this is sufficient, but not for more meaningful interactions.
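To make the limitation concrete, here is a minimal sketch (plain JavaScript; every state, trigger and reply is made up) of the finite-state-machine pattern that drives most current NPCs:

```javascript
// Minimal finite-state-machine NPC: it is always in exactly one named
// state, and each state maps a fixed set of triggers to fixed responses.
var npc = {
    state: "idle",

    transitions: {
        idle: {
            avatarNearby: function () {
                npc.state = "chat";
                return "Hello there!";
            }
        },
        chat: {
            avatarSpoke: function () {
                return "Interesting!"; // the same reply, every time
            },
            avatarLeft: function () {
                npc.state = "idle";
                return null; // go quiet again
            }
        }
    },

    // Every perceptual trigger is just a table lookup; anything the
    // current state does not recognize is silently ignored.
    perceive: function (trigger) {
        var handler = this.transitions[this.state][trigger];
        return handler ? handler() : null;
    }
};

print(npc.perceive("avatarNearby")); // "Hello there!" (print = HF's script console log)
print(npc.perceive("avatarSpoke"));  // "Interesting!" (always)
```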

Proposed enhancements: I wish to enhance the intelligence of these NPCs with cognitive architectures. Several cognitive-architecture code libraries are available online for use and further development. The architecture I am most interested in is CLARION:

https://sites.google.com/site/clarioncognitivearchitecture/

Cognitive architectures such as CLARION allow NPCs to think more deeply about whether or not to perform an action. These deeper thoughts can be designed around pre-defined goals, drives and motivations, and the NPC can also monitor its own behavior. If designed properly, such NPCs will come across as more deliberative and less reactive, with an automated psychology closer to that of a human being.
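A rough sketch of the contrast (the drives, weights and actions here are all made up, and this is generic utility-style selection, not CLARION's actual mechanism): instead of a trigger-to-response table, the NPC scores each candidate action against its current drives and picks the best one.

```javascript
// Drive-based action selection: score candidate actions by how well
// they satisfy the NPC's current drives, then act on the best score.
var drives = { curiosity: 0.8, sociability: 0.5, energy: 0.3 };

var actions = [
    { name: "greetAvatar",   satisfies: { sociability: 0.7 } },
    { name: "inspectObject", satisfies: { curiosity: 0.9 } },
    { name: "rest",          satisfies: { energy: 0.8 } }
];

function deliberate() {
    var best = null;
    var bestScore = -Infinity;
    actions.forEach(function (action) {
        var score = 0;
        for (var drive in action.satisfies) {
            // Weight each action by the strength of the drives it serves.
            score += drives[drive] * action.satisfies[drive];
        }
        if (score > bestScore) { bestScore = score; best = action; }
    });
    return best.name;
}

print(deliberate()); // "inspectObject" with the drive levels above
```

The drive levels themselves could then be updated over time (e.g. curiosity decays after inspecting something), which is part of what makes the behavior feel less canned.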

Technical challenges: While available under flexible licenses, many of these cognitive architecture libraries were written in languages other than JavaScript. The biggest challenge would be to convert these libraries into JavaScript (or some compatible approximation) so that they can work with High Fidelity's in-house scripting environment (with additional access to the Limitless integration). A personal challenge is that while I have previously hired programmers to help me design new NPC minds, I do not consider myself a programmer. I can occasionally read and modify code, but I am not at a level where I can write code from scratch. I am able, however, to design the NPC's mind at a higher level, from which a programmer can work. I am very familiar with cognitive architectures like CLARION and would know how to use these code libraries to enhance the cognitive functionality of a particular High Fidelity NPC.

Currently available resources: For greeter/guide NPCs, I can select from a variety of marketplace avatars. For my own NPCs, I can design the bodies using either High Fidelity's in-world tools or import them from mesh (I am used to modeling in Second Life). Limitless also already offers the capability to retrieve assets via a voice-command prompt, which would be very useful for providing voice-based perceptual input for a cognitively enhanced NPC to think deeply about. Of course, High Fidelity gives me my own sandbox space to create, store and test these NPCs. There is also a great community of High Fidelity content creators and programmers I enjoy socializing with.

Proposed test cases for High Fidelity: To learn how to use cognitive architectures in High Fidelity, I would help design and test a greeter or guide companion, with the aim of designing my own NPCs as I become more proficient. This design process would include the development of goals, motivations, drives, intentions and salient perceptual information, as well as ways in which the NPC can monitor its own mind for improvement and/or optimization. The architecture I wish to try out first is CLARION. With CLARION in particular, the NPC can gradually convert trial-and-error knowledge into logical rules. I could also help specify the kinds of rule learning possible within the High Fidelity environment.

Required resources: I would require a fairly dedicated JavaScript programmer (either as a volunteer or for hire) to help me convert these cognitive architecture libraries for use in High Fidelity, and to quickly show me how these libraries work in the new JavaScript environment. I would hope that these capabilities would also be compatible with the Limitless voice-prompt integration.

Motivation: If funding is an issue, I am more interested in working with a programmer than in any financial reimbursement for me. For me, this is both a hobby and part of my ongoing academic research, from which some publications are likely to appear. In the meantime, I would be very happy to learn how to program High Fidelity NPCs the usual way (as finite-state machines). I will attend every University session I can in High Fidelity for this purpose.


#2

If you have 3rd-party libraries or scripts available, could you not just build an API to them? Rewriting them in JavaScript does not seem like the best option; instead, look at a service-oriented architecture and use High Fidelity JavaScript to integrate with the API service.
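A rough sketch of what I mean (the service URL and JSON fields are placeholders, and I am assuming the HF script environment exposes the standard XMLHttpRequest object):

```javascript
// Ship percepts to an external cognition service over HTTP and act on
// the reply; the heavy library runs unmodified on the server side.
var SERVICE_URL = "http://example.com/npc-brain/decide"; // placeholder

function askService(percept, onDecision) {
    var req = new XMLHttpRequest();
    req.open("POST", SERVICE_URL, true);
    req.setRequestHeader("Content-Type", "application/json");
    req.onreadystatechange = function () {
        if (req.readyState === 4 && req.status === 200) {
            var decision = JSON.parse(req.responseText);
            onDecision(decision.action); // e.g. "greet", "wander"
        }
    };
    req.send(JSON.stringify({ percept: percept }));
}

askService("avatarNearby", function (action) {
    print("Service chose: " + action);
});
```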


#3

Thanks for replying…That is a good idea…I think the service-oriented workaround is why HF is already gaining access to some very cool external AI scripts…

I just do not know whether the cognitive architecture libraries I am personally interested in implementing are maintained actively enough to be wrapped as services…The CLARION library, for example, seems to be a dusty old academic code library that has not been maintained since about 2012…I can check whether there is some CLARION API…If not, what other ways would there be to use libraries like these?


#4

The latest version I can find is version 6.1.1 from 2013.

Have you looked into OpenCog? It is still active; the last post was on February 28th, 2017.
They had virtual pets back in 2009 that spoke English and understood the 3D world around them.


#5

Hey there, thanks for replying to this thread :slight_smile:

Thanks for locating a newer CLARION library…

Yes, I have looked into Ben Goertzel’s OpenCog and AtomSpace and almost set it up a few months ago…If I cannot get CLARION to work, I will try something like OpenCog again…There are also other cognitive architectures for virtual agents such as Joscha Bach’s Micro-Psi - http://cognitive-ai.com/page2/page2.html
CLARION is the architecture I really want to try to implement, though…

Maybe there is still a way to get CLARION running on High Fidelity?

Cheers,
Onto Distro


#6

Why is it you want to use CLARION?

CLARION trains neural networks as part of its learning, which suggests it isn't a real-time algorithm. This comparison doesn't list it as supporting real-time learning.

And I don't see anything about NLP with CLARION, although the documentation on CLARION seems very light. :frowning_face:


#7

I agree that the current software implementation(s) of CLARION leave something to be desired…
I want to use CLARION because I have studied its theoretical aspects and I like its capabilities for meta-reasoning, as well as its ability to eventually convert trial-and-error connectionist learning into rules (via the RER algorithm)…yeah, that conversion process might not occur in real time, but the agent could learn through successive iterations over time…CLARION is probably not that practical to use right now, so maybe, yeah, OpenCog and/or Micro-Psi might be a better workaround…
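To give a feel for the idea (this is only a toy illustration of bottom-up rule extraction, not CLARION's actual RER algorithm; the threshold numbers are made up): tally the outcomes of trial-and-error actions, and once a state-action pair succeeds often enough, promote it to an explicit rule.

```javascript
// Toy bottom-up rule extraction: implicit trial-and-error statistics
// are promoted to explicit if-then rules once they prove reliable.
var experience = {}; // "state|action" -> { tries, successes }
var rules = [];      // extracted explicit rules

function recordOutcome(state, action, succeeded) {
    var key = state + "|" + action;
    var stats = experience[key] || (experience[key] = { tries: 0, successes: 0 });
    stats.tries += 1;
    if (succeeded) { stats.successes += 1; }

    var alreadyExtracted = rules.some(function (r) {
        return r.ifState === state && r.thenAction === action;
    });

    // Promote once the pair has been tried enough and mostly works.
    if (!alreadyExtracted && stats.tries >= 10 && stats.successes / stats.tries > 0.8) {
        rules.push({ ifState: state, thenAction: action });
    }
}
```

So even if the underlying network training is slow, the rule store accumulates across sessions, which fits the "successive iterations over time" idea.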


#8

I have a growing interest in developing NPCs with a focus on animation, and I'm very interested in the ideas around implementing cognition presented here. Is there a possibility that an implemented ‘intelligence’ could also produce signals that reflect the simulated emotional state of the NPC? I'm really quite stoked by the idea of an NPC displaying body language that reflects its simulated emotional state. This, combined with some simple techniques like an NPC maintaining eye contact with the avatars it is interacting with, could produce some very engaging, lifelike interaction.

@ontodistro777 Are you still looking into this? If so, is there any scope for producing signals that could be translated into NPC body language?
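To make the question concrete, here is the kind of mapping I am imagining (plain JavaScript; the emotion dimensions and signal names are all hypothetical):

```javascript
// Map a simulated emotional state to body-language signals that an
// animation layer could read; cognition only has to emit a small vector.
var emotion = { valence: 0.6, arousal: 0.8 }; // valence in [-1, 1], arousal in [0, 1]

function bodyLanguageSignals(e) {
    return {
        postureOpenness: (e.valence + 1) / 2, // slumped (0) to open (1)
        gestureRate: e.arousal,               // how often to gesture
        maintainEyeContact: e.valence > 0     // look at the avatar?
    };
}

print(JSON.stringify(bodyLanguageSignals(emotion)));
```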


#9

Hi Dave(dub),

I am very happy that you are as enthused about VR NPCs as I am :smiley:
Actually, the original creator of the desktop VR world “There.com” is a friend of mine, and his focus is on NPCs and non-verbal communication (NVC). His name is Jeffrey Ventrella…
http://press.etc.cmu.edu/content/virtual-body-language

There are also many teachers and researchers at my school with this particular specialty (i.e. Nilay Yalcin, Michael Nixon and Steve DiPaola).

http://dipaola.org/lab/research/virthuman/

I am personally most interested in the cool cognition side of NPC minds, but I am also interested in how NPCs express themselves using NVC.

I would love to meet you in HF and chat about it more directly :slight_smile:

Cheers,
Jeremy