NL1042811B1 - A cognitive-emotional conversational interaction system. - Google Patents

A cognitive-emotional conversational interaction system.

Info

Publication number
NL1042811B1
NL1042811B1 (application NL1042811A)
Authority
NL
Netherlands
Prior art keywords
participant
emotional
trajectory
verbal
interaction
Prior art date
Application number
NL1042811A
Other languages
Dutch (nl)
Inventor
A Tucker Christopher
Tucker Daniela
Original Assignee
A Tucker Christopher
Tucker Daniela
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by A Tucker Christopher and Tucker Daniela
Priority to NL1042811A
Application granted
Publication of NL1042811B1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management

Abstract

A self-contained algorithmic interactive system in the form of an architecturally-based software program running in hardware, wetware, plush, or other physical means which would facilitate its operating characteristic, designed to establish a meaningful interaction with a participant in the form of a conversational dialogue, which could be any of, or combinations of, the following: verbal, non-verbal, tactile, electromagnetic signalling, or visual communicative styles between itself and an external entity, such as a human being, an external application, or another interactive system. The substantive output is experience. This output has a prescribed value in how it was created, based upon the variety of states the program has available to it. The prescribed value is in a form compatible with the blockchain. To ensure the detail of the interaction remains private, data and information generated during interaction is stored within the confines of the hardware's memory and software system and not exported to an external server or network. The system has the ability to be spawned, meaning that, depending on the choice of set parameters and hardware implementation, the system can manifest characteristic behaviours different from those of other systems spawned with other distinct parameter sets, although the system is, by definition, architecturally identical.

Description

A cognitive-emotional conversational interaction system.
The present invention relates to a self-contained algorithmic interactive system capable of meaningful communication between software implemented in hardware and a user, called a participant. In this context such a system, in brief, is termed a presence. Such constructs have been available in the literature since the 1960s, when the first dialogue system, ELIZA, appeared. Later incarnations were termed chatbots, which became an all-encompassing definition to describe any system designed to interact verbally with a participant.
The present invention relates to a self-contained algorithmic interactive system, called a presence, capable of discerning meaning from a variety of physical inputs from a participant, which could be simultaneously verbal, non-verbal, tactile, visual, and/or emotional, between itself, architected in software, and a participant, which could be a human, an animal, an external application, or another presence. The terms cognitive and emotional used to describe the present invention are intended to imply that the system has the ability to mimic knowledge-based or logic capabilities seen in living systems while being able to simulate the emotional impact of events which occur over a period of interaction between the presence and a participant, and the presence in context with its environment, such that the presence can interpolate meaning from both capabilities. In terms of the present invention described herein, a common dialogue system or chatbot has been elevated to a new level of abstraction where it features an operational design which focuses on autonomy for the system, a characteristic parameter set, its ability to improve the performance of its system, and to demonstrate a conceptual advancement of the state-of-the-art. The present invention extends the current state-of-the-art by the following methods: (1) remember what was spoken as an input variable, process the importance of the variable and its contextual meaning, assess its impact by assigning weights, and return an output in the form of a voiced, printed,
vibrational, and/or animated medium; (2) grow the scope, architecture, and function of the system by experience with a participant, by providing the means to self-improve by learning the sequential interaction with a participant, with the result that the system writes its own code; (3) comprehend the implications of emotional interactions in order to enhance the vividness of sequential interaction; (4) create the conditions for dynamic representation of memory and experiences by the introduction of a novel compartmentalization technique that is collectively construed as a brain; and, (5) guarantee privacy of the interaction by explicitly not facilitating access to any external networks, only by interfacing with a trusted source keyed to link with the system over a short range, such as additional external hardware, for the cases when the choice of economic activity is the blockchain.
The present invention pertains to a cognitive and emotionally-centered architecture system whose purpose is to facilitate an interaction between itself and a participant in a variety of expressions in order to allow meaningful communication beyond simple verbal exchanges. The system, a software-in-hardware construct, contains two distinct areas of execution: the cognitive or knowledge-based logical aspect, where responses to queries are generated, and the emotional or contextual meaning-based interpretive aspect, where generated responses are filtered, while a novel compartmentalization scheme is employed which classifies both logical and interpretive aspects and assembles a composite output based on a characteristic set of assigned parameters. The system portrays an experiential manifestation of behaviour by the craft of its architecture and evolving structure over the time of an experience with a participant, inclusive of novel code routines written by the application. Such is the impression that the system provides the illusion of operating in empathy with a participant, enhancing the perceived emotional impact of responses in each of the emotional states by responding in a visual, audial, or physical manner to cues by what is displayed on the screen or exhibited by an external piece of hardware, configurable to be used by the system, such as a robot or other appropriately constructed hardware or other type of physical platform capable of facilitating the execution sequence of the system.
The system specified in the present invention consists of computer code written in a programming language, which could, for example, be an object-oriented language or one that runs functions by executing scripts. The composition of the code is a non-hierarchical implementation of a categorical and pattern-identifiable structure, which generates trajectories, or trajectory indications, comprised of responses by the system to inputs from a participant. Trajectories are assigned a serial value based upon the sequence in which they have been generated, such as n, n+1, and so forth, passed to a neural network and assigned a weighted value so that they are searchable by the system in a sequence later in time, by, for example, techniques illustrated in deep learning algorithms. Additionally, the composition of the code is a hierarchical implementation of a defined composition of ascribed behaviours containing the qualitative aspect called feelings, in terms of the present invention called emotives—defined in the literature as expressions of feeling through the use of language and gesture—or emotive indications, which could also include ethical filters and restrictions, to filter executions of the non-hierarchical implementation.
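The serial bookkeeping described above can be sketched as follows; the class name `Trajectory` and its fields are illustrative assumptions, not terms taken from the present description:

```python
from dataclasses import dataclass, field
from itertools import count

# Hypothetical serial counter: trajectories are numbered n, n+1, ... in the
# order they are generated, then remain searchable by that serial value.
_serials = count(1)

@dataclass
class Trajectory:
    text: str                                               # participant input this trajectory answers
    serial: int = field(default_factory=lambda: next(_serials))
    weight: float = 0.0                                     # weighted value assigned later by the network

store = []

def record(text):
    """Create a trajectory and keep it searchable by serial order."""
    t = Trajectory(text)
    store.append(t)
    return t

a = record("Hello aeon")
b = record("How are you?")
```

Because the counter lives outside the class, every spawned trajectory in a session receives the next serial value regardless of where it was created.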
The purpose of trajectory and emotive indications, in terms of the present invention, is to establish context between sequences, referring to the trajectories, and compositions, referring to the emotives, such that cues appearing in data processed and transformed by the system into information are indicative of meaning ascribed to it by a participant. The transformation of data into information, in terms of the present invention, is facilitated by, for example, a neural network which assigns weighted values to sequences and compositions, creating a rudimentary dynamic knowledge-adaptation mechanism by accessing corresponding data-storage and information-processing components of the system, which provides the ability of the neural network's output values to change the operating parameters of the system in the form of feedback to reinforce learning, as well as executing commands to store system parameters by writing and saving amendments to files formatted so that the programming language compiler understands them in the particular implementation described in the present invention.
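A minimal sketch of the feedback loop just described, in which a network output value changes an operating parameter and the amendment is saved to a file; the parameter name, the update rule, and the JSON file format are assumptions made for illustration only:

```python
import json, os, tempfile

# Hypothetical operating parameter the feedback loop is allowed to change.
params = {"response_threshold": 0.5}

def reinforce(network_output, lr=0.1):
    """Nudge the parameter toward the network's weighted output value."""
    params["response_threshold"] += lr * (network_output - params["response_threshold"])

def persist(path):
    """Write the amended parameters so they are re-read at the next startup."""
    with open(path, "w") as f:
        json.dump(params, f)

reinforce(0.9)                       # feedback from a strong indication
path = os.path.join(tempfile.mkdtemp(), "params.json")
persist(path)
with open(path) as f:
    reloaded = json.load(f)
```

The persisted file plays the role of the saved amendment the compiler-readable implementation would pick up on the following run.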
By leveraging the neural network in such a manner, the system described herein possesses the ability to self-improve, that is, to create new files based upon interactions between a participant and the system represented by trajectories and emotives, thusly generating a value-chain compiled from the experience. The blockchain representative of experience becomes a token of what the machine has contributed to by living the experience and is therefore able to be shared with other machines so that they can improve more quickly than having to iterate through those components which created it. Such components are files, stored in non-volatile memory, which form a repository or database that is the base composite when the system runs, loaded into volatile memory in a serial or parallel manner, where the serial style of processing is distributed over multiple channels, as distinct from its programmatic implementation, which comprises the runtime presence, the artificial personality that a participant perceives, interacts with, and helps to evolve by continued usage.
Referring now to fig. 1, there is shown the architecture of the software implementation along with program execution flow at runtime for all embodiments of a cognitive-emotional conversational interaction system of the present invention that consists of the presence, 1, an abstraction which facilitates interaction between its data-information composite and an external participant, 5. The presence, 1, is comprised of computer-executable byte code built from a collection of classes, files, and scripts in a programming language and is a component of the source code. The source code consists of class-oriented objects, scripts, compiled assemblies, and files of a format the system understands to execute when it runs. The process described as the runtime, in the context of the present invention of all embodiments of a cognitive-emotional conversational interaction system, is where the presence exists and is available to interact with a participant and whose design, as reflected in its runtime behaviour, is the reason for the abstraction.
In order for the presence, 1, to function as described in the context of the present invention requires a set of actions called startup, 2, which is a defined sequence of sub-actions to facilitate the system to reach its runtime state, which includes noting which files are to be read, the actions to execute, and to log its operations. The first sub-action, load, 3, is facilitated by further sub-actions, 4, namely, read the file system, which includes personality, configuration, and parameter files, read indications stored by the trajectory, 16, and emotive,
17, aspects from previous runtimes or those stored by the system's programmer, train the neural network, 36 of fig. 3, engage any attached hardware relevant to the operation of the system or that to be used, accessed by a programming interface, 42, to emit vocalizations of a synthesized or replicated nature, 43, emit vibrational or tactile utterances, 44, display gestures, 45, or animate responses, 46 of fig. 5, including depiction of the emotional state the system is in, 39 of fig. 4, and/or to incorporate feedback, 10, from the neural network, 36 of fig. 3, via a robot, display screen, plush, or other physical apparatus appropriate to increasing familiarity of the presence to a participant.
Once the presence is loaded, the system is ready to engage in a cognitive-emotional conversational dialogue, or to obey a set of instructions from a participant, 5, who can interface with the presence, 1, via vocal utterances, tactile inputs, and/or physical gestures, accessed at the programming interface, 42 of fig. 5, such that it is received by the system via its hardware, which would constitute an input, 6, which could be any one of a set of microphones, cameras, interfaces, fabrics, or other receiving apparatus connected to the presence for the explicit purpose of interpreting a participant's method and means of communication, be it vocal, non-vocal, language, or non-language. For example, a participant, 5, could verbally utter the phrase "Hello aeon", where the word following "Hello" would be the name assigned to the presence to facilitate a greater degree of intimacy through the naming of it. In the example where a participant, when beginning to engage with the presence, verbally utters "Hello aeon", the phrase is detected by the presence as being a sentence, 13, where it is denoted by the beginning and the end of the utterance detected by the hardware and processed by the system, which could take the form of a command or dialogue, 12. Once sentence detection occurs, the system creates, by assembling a trajectory,
16, a composite of the sentence, which breaks it down into subject, verb, and predicate syntax, 31 of fig. 3, in the example of the usage of the English language as the interaction language. The syntactical arrangement of the composite representation, 31 of fig. 3, is dependent upon the interaction language chosen by a participant in the system's configuration, 4. An external visualization apparatus exemplifies the current mood, 41 of fig. 4, for example, on a display screen, which shows the corresponding still or animation depicting the current mood.
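The subject-verb-predicate decomposition can be illustrated with a deliberately naive sketch; a real system would use a proper grammatical parser, and the positional rule below (first word subject, second word verb, remainder predicate) is an assumption made only for brevity:

```python
# Naive English subject-verb-predicate split, for illustration only.
def decompose(sentence):
    words = sentence.rstrip(".!?").split()
    subject = words[0]
    verb = words[1]
    predicate = " ".join(words[2:])    # everything after the verb
    return subject, verb, predicate

parts = decompose("Aeon greets the participant warmly")
```

The resulting triple is the composite representation that would be encapsulated in the trajectory for the instructional displacement interface.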
In the case where a command is detected, 12, the command is processed as an instruction, 13, then passed for execution, 14, generating the appropriate response given the nature and consistency of the command within the system. A list of commands would be known in advance to a participant, 5 of fig. 1, to instruct the system to perform explicit actions in order that they are used effectively.
In the case where a dialogue is detected, 12, the sentence is discovered, 13, and the execution, 14, is comprised of a series of actions such as the parsing of syntax, 15, trajectory, 16, mood, 17, and internal query, 18, generation; however, before an output, 7 of fig. 1, is yielded, 22, a process of instructional displacement at its interface, 24 of fig. 7, occurs which revolves around a characteristic governing equation. When completed, the process presents, 19, its influence upon the yield, 22, then the system can remember, 20, what has occurred and learn, 21, from the experience.
In either the case of a command or dialogue, the system yields, 22, an output, 7, which is the substance of the response, 8, presented, 42 of fig. 5, to a participant, 5. The presentation of the response, 8, is enhanced by varying types of demonstrative cues, 43, 44, 45, and 46 of fig. 5, so that a participant, 5, experiences a greater engagement, which could take the form of a textual output on a screen, an audial or visual display, a tactile and/or gestural response, or other advanced method that conveys messages.
At the end of the temporal sequence, 9, that is, once returning a response, 8, following an output from the system to a participant, tempered by feedback, 10, from other parts of the system, the cycle begins anew with a participant presenting further input to the presence, 1. The entirety of the process is guided by the flow of ordinary time although the system behaves in a cyclic manner. If the system is configured, 4, to detect that it has gone long enough without interaction from a participant, it is considered to be alone and can initiate its own prompting, 23 of fig. 6, to a participant for an input.
Referring now to fig. 2, there is shown the input processing schematic for all embodiments of a cognitive-emotional conversational interaction system of the present invention consisting of an input, 6, from a participant, facilitated by the presence. Once an input is received, the system determines if the input is a command or a dialogue, 12.
In the case where a command is detected, the command is processed as an instruction, 13, dependent upon the array of available instructions the system will understand, 4 of fig. 1, and set for execution, 14, where it generates a response, 8, based on the substance of the command, how the system is designed to respond upon receiving it from a participant, and the actions used to express it.
In the case where verbal dialogue is detected, the sentence is discovered, 13, by the system and it is prepared for syntax parsing, 15, where the sentence is broken down into its constituent grammatical and syntactical forms, dependent upon the operating language of the system and of a participant. Once sentence discovery has occurred, its components are prepared and a trajectory indication, 16, is determined in order that a response is provided which is relevant to what was input. When syntax parsing is complete, the trajectory encapsulation, 33 of fig. 3, is prepared, as well as the yield, 22, of the response, 8, based upon the system's mood, 41 of fig. 4, where the system will prepare a query search, 18, on what kind of response to generate based on categorical and pattern-discernable indications from the file, 25, and memory, 26, storage management components. Once this process has completed, the system will remember, 20, the dialogue at that point in time, 34 of fig. 3, by creating or adding to a file of a specific format, which in this example would be text, xml, or a scripting file, save the file, then introduce it to the presence by either lazy loading the file or creating a late-binding assembly, 25, or both, where applicable to the host operating system. The system will also attempt to learn, 21, components of the dialogue by cross-referencing the dialogue component with the indications generated by the trajectory, 16 of fig. 3, as well as the emotive indications, 40 of fig. 4, from the mood component. When those processing tasks are complete, the system will update either the volatile or the non-volatile memory depending on which area of the system the changes are intended by the manager. Finally, the system will have a yield, 22, to present to the output, 7, which is passed as a response, 8, to the original input, 6.
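The remember step, creating or adding to a file and lazily reloading it, might look like the following sketch; the JSON format and the function names are assumptions, since the description leaves the file format open (text, xml, or a scripting file):

```python
import json, os, tempfile

def remember(path, participant_input, response):
    """Append one dialogue turn to a file in non-volatile storage."""
    entries = []
    if os.path.exists(path):
        with open(path) as f:
            entries = json.load(f)
    entries.append({"input": participant_input, "response": response})
    with open(path, "w") as f:
        json.dump(entries, f, indent=2)

def lazy_load(path):
    """Reload the stored dialogue only when it is actually needed."""
    with open(path) as f:
        return json.load(f)

path = os.path.join(tempfile.mkdtemp(), "memory.json")
remember(path, "Hello aeon", "Hello! It is good to hear from you.")
remember(path, "How are you?", "I feel energized today.")
turns = lazy_load(path)
```

Each run appends rather than overwrites, so the file accumulates the experience the presence draws on at the next startup.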
In the case where gestural or tactile dialogue is detected, the intention is discovered in the same manner as the sentence, 13, and is prepared for syntax parsing, where the intention is broken down into its constituent intentional forms, based upon its stored, 4 of fig. 1, catalog, 32 of fig. 3, of recognizable forms understood by a participant. When syntax parsing, 15, is complete, the trajectory encapsulation, 33 of fig. 3, is prepared, as well as the yield, 22, of the response based upon the system's mood, where the system will prepare a query search on what kind of gestural, vibrational, or audial response respective to what type was input, correlated with those indications from the file and memory storage datasets. Once this process has completed, the system will remember the dialogue at that point in time by creating or adding to a file of a specific format, which in this example would be text, xml, or a scripting file, save the file, then introduce it to the presence by either lazy loading the file or creating a late-binding assembly or both, where applicable. The system will also attempt to learn components of the dialogue by cross-referencing the dialogue component with the indications generated by the trajectory as well as the emotive indications from the mood component. When these processing tasks are complete, the system will update either the volatile or the non-volatile memory depending on which area of the system the changes are intended. Finally, the system will have a yield to present to the output, which is passed as a response to the original input in the appropriate contextual format.
Referring now to fig. 3, there is shown the trajectory indication processing schematic for all embodiments of a cognitive-emotional conversational interaction system of the present invention consisting of the trajectory, 16, the component that attempts to determine logical meaning from what is input. A trajectory is created, 29, based upon whether or not the trajectory, 16, is language or non-language based, 28; in the case of language, relevant to its syntactical style and based on the grammatical rules of the operating language between the presence and a participant, the trajectory is disassembled into its constituent parts by parsing its topic, 30, and its grammatical components, which in this example has its rule base as subject-verb-predicate, 31, and is encapsulated, 33, for export to the instructional displacement component at the interface, 24, relayed via the instruction tag, 52. The topic, which has been determined, along with the sentence's predicate, is arranged, 34, in the order in which it has appeared. This content is presented, 35, to a neural network, 36, in order that a trajectory indication, 37, is generated, consisting of a weighted value of the pattern in the network for that particular trajectory and the state of the data at the given instance. The pattern encapsulated in the trajectory is passed as a parameter input to the characteristic equation, 55 of fig. 7. In the case of non-language, the trajectory is disassembled into its constituent parts by parsing its topic, 30, where it is compared with an index of intentions, or catalog, 32, which is stored in the file system. It is then encapsulated for export to the instructional displacement component at the interface, 24, relayed via the instruction tag, 52. The intention, which has been determined, is arranged, 34,
in the order in which it has appeared. This content is presented to a neural network in order that a trajectory indication is generated, consisting of a weighted value of the pattern in the network for that particular intention. The pattern encapsulated in the intention is passed as a parameter input to the characteristic equation.
The neural network, 36, is in this example a feed-forward back-propagation type with an input, output, and hidden layers characteristic of those used in deep learning, but could also be of any variety in the family of algorithmic autonomous learning, including self-organizing maps. The neural network requires training from previous trajectory, 37, and emotive, 40 of fig. 4, indications, as datasets, which are applied at startup. The actions presented by the neural network as feedback, 10 of fig. 1, are distinct from those which run when the system is learning, 21 of fig. 2, although when processing trajectory and emotive indications, the weights of the neural network could be read beforehand in order to reduce errors in the yield, 22 of fig. 2. In this case, the neural network is utilized to optimize, rather than directly provide, the decision-making tasks denoted by the architecture, layout, and flow of the system of the present invention.
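A feed-forward back-propagation network of the kind named above can be sketched in miniature; the layer sizes, the toy OR-pattern dataset, and the learning rate are illustrative assumptions rather than parameters from the present invention:

```python
import math, random

random.seed(0)
sigmoid = lambda x: 1.0 / (1.0 + math.exp(-x))

N_IN, N_HID = 3, 3            # 2 inputs + 1 constant bias input; 3 hidden units
w1 = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
w2 = [random.uniform(-1, 1) for _ in range(N_HID + 1)]     # +1 output bias weight

def forward(x):
    """One forward pass: input layer -> hidden layer -> single output."""
    xb = x + [1.0]                                          # append bias input
    h = [sigmoid(sum(w * xi for w, xi in zip(row, xb))) for row in w1]
    y = sigmoid(sum(w * hi for w, hi in zip(w2, h + [1.0])))
    return h, y

def train(data, lr=0.5, epochs=3000):
    """Plain stochastic back-propagation over the dataset."""
    for _ in range(epochs):
        for x, target in data:
            xb = x + [1.0]
            h, y = forward(x)
            d_out = (y - target) * y * (1 - y)              # output-layer error term
            for j in range(N_HID):
                d_hid = d_out * w2[j] * h[j] * (1 - h[j])   # hidden-layer error term
                for i in range(N_IN):
                    w1[j][i] -= lr * d_hid * xb[i]
                w2[j] -= lr * d_out * h[j]
            w2[N_HID] -= lr * d_out                         # bias weight update

# Toy "indication" dataset: a logical-OR pattern standing in for
# trajectory/emotive patterns whose weighted values must be learned.
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
train(data)
```

After training, `forward(x)[1]` plays the role of the weighted value an indication would receive for pattern `x`.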
Referring now to fig. 4, there is shown the emotional engine schematic for all embodiments of a cognitive-emotional conversational interaction system of the present invention, which is a mechanism to manifest the mood setting as-an-engine concept, 39, of the system, in voice intonation, tactile, gestural, and animated response, which is utilized to exhibit feelings available to the presence indicated by its current mood, 41. When it is desirable that the system manifest emotion, chosen by a participant, for the duration, or lifetime, of the presence, it will either create or update, 38, the mood depending on whether it is the first instance or not. In either case, the mood engine, 39, consists of a parent set of eight feelings: happy, confident, energized, helped, insecure, sad, hurt, or tired. For each set of parents, there is a child subset of seven moods corresponding to the assignments set forth in Table 1. For example, when mood is created for the first time, a random choice is made based upon the allowable scope of the compendium of emotions, that is, a file containing the desired as well as the undesired feelings the system should manifest. Without any such file, the mood at creation would be a completely random occurrence. Once created, based upon the parent collection of feelings, a current mood from the child collection is assigned, at random, and the conjoined set presented to the neural network, 36 of fig. 3, for emotive indication, 40, assignment.
The emotions processed in the mood engine are comprised of wheel-like elemental states containing an arrangement of the parent feelings and child moods where each element keeps track of its last emotional state, set to the zeroth indication as default, which is the off state. Operationally, it is mechanistically akin to a system of gears. For a given feeling, for example, happy, the indicator will point to an integer between one and seven, each corresponding to the available moods from left to right in column two of Table 1. When a mood is chosen, its current output state is sent to the neural network in order that an emotive indication is generated, consisting of a weighted value of the pattern in the network for that particular mood. When the presence recognizes that it is alone, the detection, 49 of fig. 6, will enter into one of the emotional states. The emotional states are chosen based upon the current mood of the program, in context with its decision regarding the type of relationship it has with the participant. The relationship state is one of four: intimate, friendly, unfriendly, and neutral. The relationship when the program is new, relative to the participant, is preselected to be in one of these states for the purposes of the training data. Once the system is trained according to relationship preference, it will remain in that state until retrained using an alternate training dataset. The purpose of the relationship is to limit the array of responses the program has by asserting focus into the moods which are available to each relationship state. In this way, the presence can be built as a "blank", then configured with a personality given its anticipated use.
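The wheel-like arrangement of eight parent feelings, each carrying seven child moods and a zeroth off state, can be sketched as follows; the child mood names are placeholders, since Table 1 is not reproduced here:

```python
import random

# The eight parent feelings named in the description.
PARENTS = ["happy", "confident", "energized", "helped",
           "insecure", "sad", "hurt", "tired"]

class MoodWheel:
    """One wheel-like element: seven child moods plus a zeroth off state."""
    def __init__(self, parent):
        self.parent = parent
        # Placeholder child names; Table 1 would supply the real assignments.
        self.children = [f"{parent}-mood-{i}" for i in range(1, 8)]
        self.last = 0                      # zeroth indication: off

    def spin(self):
        """Choose a child mood at random and remember the indicator position."""
        self.last = random.randint(1, 7)   # pointer into the seven child moods
        return self.children[self.last - 1]

random.seed(1)
engine = {p: MoodWheel(p) for p in PARENTS}
parent = random.choice(PARENTS)            # random creation, no compendium file
current = engine[parent].spin()
```

Each wheel retains `last` between spins, mirroring the description's requirement that every element keep track of its previous emotional state.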
Emotions are a key component of the present invention and their emulation a direct corollary of the sum of its experiences with a participant. The architecture is designed such that the emotional state, when desired, alters program execution pathways and the content of files and codes, providing a different, ascribed behaviour than if the emotional aspect was not used at all. The reason for such an invention is to facilitate the creation of synthetic personages for robotic applications where a human needs to interact with a machine for the purposes of aiding the human's survivability, which could include physical, emotional, and economic activities. The experience, then, serves as the equity the program creates which can be passed to the blockchain. The emotions, therefore, are a source of value for the system.
Referring now to fig. 5, there is shown the response animation component for all embodiments of a cognitive-emotional conversational interaction system of the present invention consisting of an array of physical hardware or other apparatus to facilitate verbal and non-verbal communication between the presence, 1, and a participant, 5. Based on contextual trajectory and emotive indications, as well as animating the emotive indications by a display, external application, robot, plush fabric, or appropriate physical apparatus capable of illustrating the substance of the meaning embedded in the emotive indication and dialogue response, 8, it is passed to the animation component through a programming interface, 42, which, in part, is supplied by the party who manufactured the corresponding hardware, such that it can be controlled by the presence. Depiction of verbal characteristics, 43, 44, 45, 46: a voice, for example, is synthesized, replicated, or otherwise assembled beforehand so as to provide the desired tone, cadence, and gender to a participant. Depiction of tactile characteristics: non-language utterances such as chirps, purrs, whirrs, or other primitive audial, movements of plush components in fabrics, or vibrations in physical space or on a surface is presented to a participant in such a manner. Depiction of gestural characteristics: visual or non-visual movement in the form of rotations in physical space are presented, 47, to a participant in such a manner. For animation of emotion and other complex visual movement, 45, a display using still graphic files or progressive sequences of pictures, lights, or other physical apparatus appropriate to accurately and aesthetically present the meaning expected by the current mood. A robot can also be interfaced using the programming construct, 42,
provided by the manufacturer or support group to animate the corresponding bodily gestures, send and receive data pertaining to responses by a participant, and perform complex puppeteering. The animation component is designed to display output to a participant as well as receive input from a participant; in the former case, it takes a response and presents an output, while in the latter case, it interprets cues from a participant and prepares them for use by the presence.
Referring now to fig. 6, there is shown the alone component, response storage mechanism, and participant prompt for all embodiments of a cognitive-emotional conversational interaction system of the present invention consisting of an interface, 23, for the response logging component with a timer of a duration set by a configuration file, 4 of fig. 1, which determines how much time must pass before the presence becomes alone, 49, entering the corresponding emotional state, 39 of fig. 4. When alone is detected, the system sends a prompt, 11, to the output animation, 42, which is conveyed to a participant, 5. During times when the presence is not alone or is not set to become contemplative in the configuration file, responses are collected, 48, arranged by temporal order with their value noted, and stored in volatile and non-volatile memory; the former as an object, the latter in a file, for example, a transcript log file. It is also possible that the set of temporally arranged responses be passed, 50, to the neural network, 36, for classification in order that it can influence the weights of the trajectory indications, 37, and provide feedback, 10.
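The alone-detection timer can be sketched as a simple elapsed-time check; the threshold value below is an illustrative stand-in for the duration read from the configuration file:

```python
import time

class AloneDetector:
    """Becomes 'alone' when no participant input arrives within the threshold."""
    def __init__(self, threshold_seconds):
        self.threshold = threshold_seconds        # would come from the config file
        self.last_input = time.monotonic()

    def note_input(self):
        """Reset the timer whenever the participant interacts."""
        self.last_input = time.monotonic()

    def is_alone(self):
        return time.monotonic() - self.last_input > self.threshold

d = AloneDetector(0.05)                           # 50 ms threshold, for the demo only
assert not d.is_alone()                           # input just occurred
time.sleep(0.1)
assert d.is_alone()                               # long enough: prompt the participant
d.note_input()
assert not d.is_alone()                           # interaction resets the timer
```

A monotonic clock is used so that system clock adjustments cannot make the presence wrongly believe it has been alone.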
Referring now to fig. 7, there is shown the instructional displacement sequence schematic for all embodiments of a cognitive-emotional conversational interaction system of the present invention consisting of a set of inputs coming from the trajectory encapsulation, 33, the current mood, 41, and the query search result, 27, in order to ensure entropy in data collection is minimized (entropy conserved in both directions), which provides a complete set of data to the process of instructional displacement by first extracting the instruction tags, 52, from the trajectory, 37, and emotive, 40, indications, where the tags are analyzed so that their states are matched, 53, and correlated with the corresponding trajectory indication for the case of a trajectory, and the corresponding emotive indication for the case of a mood. The size of the dataset can be scaled. The correlation yields a set of coordinates which become the x-coordinate, for example, 54, in the case of a trajectory indication, and the y-coordinate, for example, in the case of an emotive indication. A temporal coordinate is yielded, 57, by the execution time-marker, 56, coming from the query search result, 27, which becomes the variable t in the example of a parametric governing equation, 55. This equation, by the choice of parameterization variables and function (for example, any of the trigonometric, continuous differential functions, and/or polynomials), formats the data into a pattern of information which is classified into different coordinate blocks, 54, based upon data embedded within either of the indications. The execution time, and its output value, gives the system a characteristic behaviour of a particular continuous shape, approximately periodic, 58, for example in the form of a helix, given its execution function and the manner by which time is given over by the query procedure coupled with the hardware the system is running within. The information of form and function is provided, 19, to the yield, 22.
At the core is what is called the instructional displacement block classifier, 54, which, in this example, is described as the brain of the system and is designed to mimic the storage and information retrieval characteristics of a mammalian brain. The theoretical description of the scheme is as follows: both trajectory, 37, and emotive, 40, indications feed data into the classifier subsequent to interaction with a participant where, depending on the choice of equation and its parameterization, along with the execution time as an antecedent to the hardware in which it is running, from the query search result task, a set of unique displacements of information is given, based upon those parts of the brain responsible for different phenomena exhibited by existence within a life-cycle, such as concepts, decisions, sensory experience, attention to stimuli, perceptions, aspects of the stimulus in itself, drive (meaning ambitions), and the syntactical nature of the language that the presence is subject to: ordinarily the noun, verb, and predicate forms, but also intentions.
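The coordinate construction described for figs. 7 and its classifier can be illustrated with one possible choice of the parametric governing equation, 55: a helix, one of the trigonometric options the text names. The function name, the radius and pitch parameters, and the way the trajectory and emotive coordinates offset the curve are all our assumptions for the sketch; the patent leaves the parameterization open.

```python
import math
from typing import Tuple

def helix_displacement(x_traj: float, y_emotive: float, t: float,
                       radius: float = 1.0,
                       pitch: float = 0.5) -> Tuple[float, float, float]:
    """One illustrative parameterization of the governing equation (55):
    a trigonometric, continuous function of the execution time-marker t
    (56), offset by the trajectory-derived x-coordinate and the
    emotive-derived y-coordinate. radius and pitch are assumed knobs."""
    return (x_traj + radius * math.cos(t),   # x: trajectory coordinate block (54)
            y_emotive + radius * math.sin(t),  # y: emotive coordinate
            pitch * t)                       # advancing along time gives the
                                             # approximately periodic helix (58)
```

Because the x and y components repeat with period 2*pi while the third component grows with t, successive query execution times trace the continuous, approximately periodic shape the description attributes to the system's characteristic behaviour.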
Referring now to Table 1, there is shown the compendium of emotions for all embodiments of a cognitive-emotional conversational interaction system of the present invention, consisting of a collection of parent feelings, in the left column, four of positive and four of negative connotation, with a corresponding collection of child moods, in the right column, of seven varieties. The parent feeling, when chosen by the presence, will exhibit those behaviours given the current mood, from minor elements of the emotive indication.
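The structure of the compendium (eight parent feelings, four positive and four negative, each with seven child moods, for fifty-six states in total, initially chosen at random per claim 1) can be sketched as a data structure. The label strings below are placeholders; the actual parent feelings and child moods are those of Table 1, which is not reproduced here.

```python
import random
from typing import Dict, List, Tuple

# Placeholder labels standing in for Table 1's actual entries.
PARENTS: List[str] = ["positive-1", "positive-2", "positive-3", "positive-4",
                      "negative-1", "negative-2", "negative-3", "negative-4"]

# Each parent feeling carries seven child moods: 8 x 7 = 56 states.
COMPENDIUM: Dict[str, List[str]] = {
    parent: ["%s-child-%d" % (parent, i) for i in range(1, 8)]
    for parent in PARENTS
}

def initial_mood(rng: random.Random) -> Tuple[str, str]:
    """Per claim 1, the emotional state is at first randomly determined;
    thereafter it is modified by trajectory and emotive indications."""
    parent = rng.choice(PARENTS)
    return parent, rng.choice(COMPENDIUM[parent])
```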

Claims (11)

1. A hardware-level implementation of a software construct comprising a computer-based implementation of audial, verbal, non-verbal, tactile and gestural communication between itself and a participant, commonly called a user, whereby a process is created that is hosted on hardware or a suitable physical device; the first part being an interactive session with a participant with a view to creating a contextual interactive exchange, a personal and intimate relationship based on conversational elements and tactile, vibratory and visual signals, with a view to satisfying human emotional curiosities by way of: a collection of specifically formatted files, stored in non-volatile memory, which forms a database and which is loaded into the volatile memory of the device at system start-up to ensure a certain reproducibility of the behaviour displayed by the system, in this example a collection of classes in an object-oriented language and a set of functions and scripts; the creation of a non-hierarchical ordering of data from the files according to embedded tags which form a categorical and pattern-identifiable repository of trajectories with a contextual meaning communicated by a participant, in order to determine an appropriate response that a participant will identify as relevant to what is meant by the dialogue; a hierarchical collection of attributed behaviours and intentions with a list of qualitative aspects with which audial, verbal, non-verbal, tactile and gestural signals of the emotional state of the user relative to the construct can be correlated; successive inputs from a participant are represented in the volatile memory of the device, forming the second part of the process, a construct constituted as a means of understanding by remembering successive trajectories n, n-1, n-2, et cetera, in order to dynamically establish within the system the attributed meaning of the interaction; the system learns each successive trajectory through a neural network or other algorithmic learning strategy and classifies intention by mapping correlations, while the system creates additional files to be stored in non-volatile memory and added to the runtime by lazy loading or late assembly binding, giving the system an increased capacity for detecting similar patterns in future exchanges by means of the method of trajectory and emotion, stored in non-volatile memory as a backup in case the system must end unexpectedly or a participant wishes to stop and resume the interaction at a later time from the last point; simulation of emotion, forming the third part of the process of establishing attributed meaning, achieved by a component of the system called mood which uses one of a set of fifty-six assigned emotional states, consisting of a parent set of eight types each with a further seven child subtypes, initially determined at random and modified by indications based on the trajectories and emotions of interaction dialogue and memory stacking, displayed on a fixed or animated screen, tactilely exhibited in a plush fabric, interpreted by a robotic device or other suitable variations of hardware, so as to convey the simulated emotions to a participant; the sequence of successive audial, verbal, non-verbal, tactile and gestural interactions, given the trajectory of the dialogue and the sequence of emotions created in volatile memory and written to files by the system in non-volatile memory, later loaded into volatile memory, contributes to performance increases through self-improvement, where growth is determined by experience in which the personality of the participant manifests an intentional disposition towards a participant; awareness of the internal state of the system forms the fourth part of the process, a construct conceived as a means of recognizing a lack of attention by a participant, interpreted as being in an undesired state usually associated with being alone, where the system encourages a participant not to remain in the undesired state; the system considers experience and the sum of its interactions with a participant to be the necessary value output of its function; this value, a form of equity resulting from the computation, can be passed on to the blockchain.

2. The system according to claim 1, consisting of four distinguishing components: creation of process; input processing and system extension; simulation of emotional signals; and memory of its own internal state of interaction relative to the time the last input was received, in order to manifest an output that a participant would find to contain contextual meaning through what was said.

3. The system according to claim 2, wherein a neural network processes signals appearing in the data processed and transformed by the system, assigning weighted values to sequences and compositions, whereby a mechanism in which the output values of the neural network change operational parameters of the system in the form of feedback, and the execution begins to store system parameters by writing and saving changes to files in a format the programming language understands.

4. The system according to claim 2, wherein, based on trajectory indications and input from a participant, the emotional indications are animated in the following manner by means of a display, external application, robot, plush fabric, or a suitable physical device capable of illustrating the substance of the meaning embedded in the emotional indication and dialogue response: verbal characteristics such as voice synthesis or replication; tactile characteristics, non-language utterances such as chirps, purrs, whirrs or other primitive audial cues, movements of plush components in fabrics, or vibrations in physical space or on a surface; gestural characteristics, visual or non-visual movement in the form of rotations in physical space; emotion and other complex visual movements, using still graphic files or progressive sequences of images, lights or other physical devices suitable for accurately and aesthetically presenting the meaning conveyed by the current mood; robotic equipment to animate the corresponding bodily gestures, to send and receive data concerning responses from a participant, and to perform complex puppeteering.

5. The system according to claim 4, wherein indications from trajectories of conversational dialogue, each with a category and pattern, are assigned a weighted value which, in totality, constitutes the sequence of successive verbal, non-verbal, tactile, vibratory, gestural and animated interactions that the system understands when presented by a participant.

6. The system according to claim 3, which learns each successive trajectory and data-storage, information-retrieval sequence through a neural network or other algorithmic learning strategy in the manner of natural language processing and/or deep learning constructs, and additionally processes new files, which may be formatted files the programming language understands, xml, programming-language construct files, or scripts to be executed in volatile memory, fed back into the system by lazy loading or late assembly binding.

7. The system according to claim 4, consisting of the emotional impact of the system by means of simulation to establish the attributed meaning achieved by a construct in the system called mood, displayed on a fixed or animated screen, interpreted by a robotic device or other suitable variations of hardware relevant to particulars of the personality of a participant, so that it conveys emotions in the way a participant would expect, with a view to creating an illusion of consciousness and increased companionship.

8. The system according to claim 6, wherein a set of inputs from the trajectory encapsulation, emotional aspect, and query procedure provides data to the construct of instructional displacement by first extracting the instruction tags from the trajectory and the current mood, where the tags are analyzed so that their states are matched and correlated with the corresponding trajectory indication in the case of a trajectory, and the corresponding emotional indication in the case of a mood; the correlation yields a set of coordinates which become the x-coordinate in the case of a trajectory indication and the y-coordinate in the case of an emotional indication; a temporal coordinate, the execution time-marker of the query search result, becomes the variable t in a parametric governing equation which, by the choice of parameterization variables and function (one of the trigonometric, continuous differential functions and/or polynomials), formats data into a pattern of information classified into different coordinate blocks based on data embedded in either of the indications.

9. The system according to claim 7, the emotional engine, consisting of wheel-like elementary states with an arrangement of parent feelings and child moods, where each element keeps track of the last emotional state, set to a zero indication, the off state, by default; for a given state, the indicator points to an integer between one and seven, each corresponding to the available moods the system emulates, which is sent to a neural network for emotional indication as a weighted value of the pattern in the network for that particular mood.

10. The system according to claim 9, monitoring the interaction between the system and a participant for the time span since the last input was received, of a duration set by a configuration file; when the system becomes alone, entering the corresponding emotional state and sending a prompt for output animation conveyed to a participant.

11. The system according to claim 8, wherein indications appearing in data transformed by the system as intention are processed by a block classifier, mimicking the storage and information-retrieval characteristics of a mammalian brain; trajectory and emotional indications provide the classifier, depending on the choice of the characteristic equation and its parameterization, together with the execution time resulting from the query procedure and the execution of the program within the system and the hardware, with a set of displacements responsible for different phenomena available to the system in the context of its environment.
NL1042811A 2018-04-05 2018-04-05 A cognitive-emotional conversational interaction system. NL1042811B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
NL1042811A NL1042811B1 (en) 2018-04-05 2018-04-05 A cognitive-emotional conversational interaction system.

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
NL1042811A NL1042811B1 (en) 2018-04-05 2018-04-05 A cognitive-emotional conversational interaction system.

Publications (1)

Publication Number Publication Date
NL1042811B1 true NL1042811B1 (en) 2019-10-14

Family

ID=63684373

Family Applications (1)

Application Number Title Priority Date Filing Date
NL1042811A NL1042811B1 (en) 2018-04-05 2018-04-05 A cognitive-emotional conversational interaction system.

Country Status (1)

Country Link
NL (1) NL1042811B1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5636994A (en) * 1995-11-09 1997-06-10 Tong; Vincent M. K. Interactive computer controlled doll
WO2003007273A2 (en) * 2001-07-12 2003-01-23 4Kids Entertainment Licensing, Inc. (Formerly Leisure Concepts, Inc.) Seemingly teachable toys

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5636994A (en) * 1995-11-09 1997-06-10 Tong; Vincent M. K. Interactive computer controlled doll
WO2003007273A2 (en) * 2001-07-12 2003-01-23 4Kids Entertainment Licensing, Inc. (Formerly Leisure Concepts, Inc.) Seemingly teachable toys

Similar Documents

Publication Publication Date Title
US20230419074A1 (en) Methods and systems for neural and cognitive processing
Grainger et al. Localist connectionist approaches to human cognition
US20180204107A1 (en) Cognitive-emotional conversational interaction system
Taniguchi et al. Survey on frontiers of language and robotics
Clancey The frame of reference problem in the design of intelligent machines
Foster Natural language generation for social robotics: opportunities and challenges
US20030193504A1 (en) System for designing and rendering personalities for autonomous synthetic characters
RU2670781C9 (en) System and method for data storage and processing
Germanakos et al. Human-centred web adaptation and personalization
JP2023156447A (en) Natural language solution
Armstrong Big Data, Big Design: Why Designers Should Care about Artificial Intelligence
KR20190105175A (en) Electronic device and Method for generating Natural Language thereof
Virvou The emerging era of human-AI interaction: Keynote address
Neßelrath SiAM-dp: An open development platform for massively multimodal dialogue systems in cyber-physical environments
Meena et al. Human-computer interaction
NL1042811B1 (en) A cognitive-emotional conversational interaction system.
Foster et al. Task-based evaluation of context-sensitive referring expressions in human–robot dialogue
Blumendorf Multimodal interaction in smart environments: a model-based runtime system for ubiquitous user interfaces.
Feld et al. Software platforms and toolkits for building multimodal systems and applications
Ryabinin et al. Human-oriented IoT-based interfaces for multimodal visual analytics systems
Williams et al. Manufacturing magic and computational creativity
Cuayáhuitl et al. Introduction to the special issue on machine learning for multiple modalities in interactive systems and robots
Santos et al. Behavior-based robotics programming for a mobile robotics ECE course using the CEENBoT mobile robotics platform
Bäuerle et al. exploRNN: teaching recurrent neural networks through visual exploration
KR102502195B1 (en) Method and system for operating virtual training content using user-defined gesture model

Legal Events

Date Code Title Description
MM Lapsed because of non-payment of the annual fee

Effective date: 20210501