CN116974412A - Method and related device for processing interaction information in interaction product - Google Patents
- Publication number
- CN116974412A (application number CN202310950990.4A)
- Authority
- CN
- China
- Prior art keywords
- character
- interactive
- interaction
- user
- node
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
Abstract
The application provides a method and related device for processing interaction information in an interactive product. First, a first prompt popup frame is displayed on a product interaction interface. In response to a first selection operation by the user on a first operation control, the device jumps to display a character setting interface; in response to a second selection operation on a selection operation control, it ends display of the character setting interface and jumps to display the interactive interface of the current interactive node; alternatively, in response to a third selection operation on a second operation control, it ends display of the first prompt popup frame and jumps to display the interactive interface of the current interactive node. The device then responds to a user input operation on the interactive interface of the current interactive node and, in response to the interactive node jump response message sent by the server, jumps to display the interactive interface corresponding to the next interactive node. In this way, characters are set for story roles according to the user's wishes, which improves the flexibility and intelligence of the device and the user's interest in reading.
Description
Technical Field
The application belongs to the technical field of general data processing in the Internet industry, and in particular relates to a method and related device for processing interactive information in an interactive product.
Background
Interactive stories currently popular on the market are aimed mainly at user entertainment and leisure. The personas of the story characters are one-dimensional, the user's degree of freedom is low, and interaction between the user and the characters during reading is insufficient. Users therefore easily become bored, feel little sense of immersion, lose interest in the interactive story after a few reads, and abandon it, resulting in a poor user experience.
Therefore, a method for processing interactive information in an interactive product is needed to solve the above problems.
Disclosure of Invention
The application provides a method and related device for processing interaction information in an interactive product, which aim to carry out different man-machine interactions with the user, through the terminal device, according to the different characters of the story roles, improving the device-side capability of processing the character data of an interactive story while also improving the user's experience and reading interest.
In a first aspect, an embodiment of the present application provides a method for processing interactive information in an interactive product, which is applied to a terminal device for product interaction with a user through a human-machine dialogue script in an interactive system, where the interactive system includes the terminal device and a server, the human-machine dialogue script includes a plurality of interactive nodes, and each of the plurality of interactive nodes includes machine output content and at least one user input content; the method comprises the following steps:
displaying a first prompt popup frame on a product interaction interface, wherein the first prompt popup frame comprises a first operation control for setting a character of a target character, a second operation control for skipping the character setting step of the target character, and first prompt text information, the first prompt text information being used to prompt the user that the target character currently being interacted with has a plurality of selectable characters, and the target character being the character with which the user interacts for the first time when reading a target interaction story;
receiving, in response to a first selection operation of the user on the first operation control, information sent by the server that comprises the plurality of selectable characters of the target character, jumping to display a character setting interface, and displaying on the character setting interface the selectable characters and the selection operation controls corresponding to them, wherein the plurality of selectable characters of the target character are associated with the character settings and background story in the target interaction story;
responding to a second selection operation of the user on the selection operation control, generating a setting request according to the information of the target character designated by the second selection operation and sending the setting request to the server, wherein the setting request is used to instruct the server to set the designated character as the character of the target character; ending display of the character setting interface and jumping to display the interactive interface of the current interactive node; or,
responding to a third selection operation of the user on the second operation control, ending display of the first prompt popup frame, jumping to display the interactive interface of the current interactive node, and interacting with the server to determine the character of the target character as a default character, wherein the default character is one of the plurality of selectable characters;
responding to a user input operation of the user on the interactive interface of the current interactive node by sending an interactive node jump request to the server, the server generating an interactive node jump response message according to the interactive node jump request, wherein the user input operation is an operation in which the user inputs characteristic voice information, the characteristic voice information matches a target jump condition for the current interactive node to jump to a next interactive node, the target jump condition is one condition in a jump condition set, and the jump condition set is a set of conditions, determined by the server according to the character of the target character, for triggering the jump to the next interactive node;
and responding to the interactive node jump response message sent by the server by jumping to display the interactive interface corresponding to the next interactive node, according to the next interactive node indicated by the interactive node jump response message.
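Purely as an illustration, the character-dependent jump logic described in the steps above can be modeled as follows. The patent does not specify a data model or a matching algorithm; every name here and the keyword-in-speech matching rule are assumptions for the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class InteractiveNode:
    """Hypothetical interactive node: machine output plus, per selectable
    character (personality), jump conditions mapping a keyword in the
    user's recognized speech to the next node id."""
    node_id: str
    machine_output: str
    # character -> {keyword in recognized speech: next node id}
    jump_conditions: dict = field(default_factory=dict)

def next_node_id(node: InteractiveNode, character: str, voice_text: str):
    """Return the next node id when the recognized speech matches a
    target jump condition for the chosen character, else None."""
    for keyword, target in node.jump_conditions.get(character, {}).items():
        if keyword in voice_text:
            return target
    return None

# Example: the same spoken input can lead to different next nodes
# depending on the character set for the target role.
cave = InteractiveNode(
    "n1", "A dark cave lies ahead.",
    {"brave": {"go in": "n2_explore"},
     "timid": {"go in": "n3_hesitate"}},
)
```

Under this sketch, `next_node_id(cave, "brave", "let's go in together")` and `next_node_id(cave, "timid", "let's go in together")` resolve to different interactive nodes, which is the jump difference the claim attributes to differing characters.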
In a second aspect, an embodiment of the present application provides a processing apparatus for interactive information in an interactive product, which is applied to a terminal device for product interaction with a user through a human-machine dialogue script in an interactive system, where the interactive system includes the terminal device and a server, the human-machine dialogue script includes a plurality of interactive nodes, and each of the plurality of interactive nodes includes machine output content and at least one user input content; the device comprises:
the prompting unit is used for displaying a first prompting popup frame on a product interaction interface, wherein the first prompting popup frame comprises a first operation control for setting characters of a target character, a second operation control for skipping the character setting step of the target character and first prompting text information, the first prompting text information is used for prompting the user that a plurality of selectable characters exist in the target character which is interacted currently, and the target character is the character which is interacted for the first time when the user reads a target interaction story;
the setting interface display unit is used for responding to a first selection operation of the user on the first operation control, receiving information sent by the server that comprises the multiple selectable characters of the target character, jumping to display a character setting interface, and displaying on the character setting interface the multiple selectable characters and the selection operation controls corresponding to them, wherein the multiple selectable characters of the target character are associated with the character settings and background story in the target interaction story;
the first jump display unit is used for responding to the second selection operation of the user on the selection operation control, generating a setting request according to the information of the target character designated by the second selection operation, sending the setting request to the server to instruct the server to set the designated character as the character of the target character, ending display of the character setting interface, and jumping to display the interactive interface of the current interactive node; or,
the second jump display unit is used for responding to the third selection operation of the user for the second operation control, finishing displaying the first prompt popup frame, jumping to display the interactive interface of the current interactive node, and interacting with the server to determine that the character of the target character is a default character, wherein the default character is one of the plurality of selectable characters;
the interactive node jump unit is used for responding to the user input operation of the user on the interactive interface of the current interactive node by sending an interactive node jump request to the server, the server generating an interactive node jump response message according to the interactive node jump request, wherein the user input operation is an operation in which the user inputs characteristic voice information, the characteristic voice information matches a target jump condition for the current interactive node to jump to the next interactive node, the target jump condition is one condition in a jump condition set, and the jump condition set is a set of conditions, determined by the server according to the character of the target character, for triggering the jump to the next interactive node;
and the interactive interface display unit is used for responding to the interactive node jump response message sent by the server by jumping to display the interactive interface corresponding to the next interactive node, according to the next interactive node indicated by the interactive node jump response message.
In a third aspect, an embodiment of the present application provides an electronic device comprising a processor, a memory, and one or more programs stored in the memory and configured to be executed by the processor, the programs including instructions for performing steps as in the first aspect of the embodiment of the present application.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon computer programs/instructions which, when executed by a processor, perform the steps of the first aspect of the embodiments of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product comprising computer programs/instructions which when executed by a processor implement some or all of the steps as described in the first aspect of the embodiments of the present application.
It can be seen that in the embodiments of the application, a user uses an interactive product through a terminal device that interacts with the user through a human-machine dialogue script in an interactive system. The terminal device displays a first prompt popup frame on the product interaction interface. Then, in response to a first selection operation of the user on the first operation control, it receives the information sent by the server comprising the plurality of selectable characters of the target character and jumps to display a character setting interface. Next, in response to a second selection operation of the user on the selection operation control, it sends the setting request to the server, ends display of the character setting interface, and jumps to display the interactive interface of the current interactive node; or, in response to a third selection operation of the user on the second operation control, it ends display of the first prompt popup frame and jumps to display the interactive interface of the current interactive node. Further, in response to a user input operation on the interactive interface of the current interactive node, it sends an interactive node jump request to the server; finally, in response to the interactive node jump response message sent by the server, it jumps to display the interactive interface corresponding to the next interactive node. In this way, the embodiments of the application set characters for story roles, so that roles with different characters exhibit different jump behavior. This improves the user's communication with characters of different personalities while reading the interactive story, optimizes the user experience, and improves the flexibility and intelligence of the interactive product.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a block diagram of an interactive system according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for processing interactive information in an interactive product according to an embodiment of the present application;
fig. 3a is a schematic diagram of a display interface of a terminal device according to an embodiment of the present application;
FIG. 3b is a schematic diagram of a character setting interface provided by an embodiment of the present application;
FIG. 3c is a schematic diagram of another character setting interface provided by an embodiment of the present application;
FIG. 4 is a schematic interface diagram of a prompt popup frame according to an embodiment of the present application;
FIG. 5a is a functional block diagram of a device for processing interactive information in an interactive product according to an embodiment of the present application;
FIG. 5b is a functional block diagram of a processing device for interactive information in another interactive product according to an embodiment of the present application;
Fig. 6 is a block diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In order that those skilled in the art will better understand the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the application, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort are intended to be within the scope of the application.
The terms first, second and the like in the description and in the claims and in the above-described figures are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In the embodiment of the application, "and/or" describes the association relation of the association objects, which means that three relations can exist. For example, a and/or B may represent three cases: a alone; both A and B are present; b alone. Wherein A, B can be singular or plural.
In the embodiment of the present application, the symbol "/" may indicate that the associated object is an or relationship. In addition, the symbol "/" may also denote a divisor, i.e. performing a division operation. For example, A/B may represent A divided by B.
"At least one" and similar expressions in the embodiments of the present application mean any combination of the listed items, including any combination of a single item or plural items, where "one" means one or more and "plural" means two or more. For example, at least one (item) of a, b or c may represent the following seven cases: a; b; c; a and b; a and c; b and c; a, b and c. Each of a, b, c may be an element or a set comprising one or more elements.
In the embodiments of the present application, "equal to" may be used together with "greater than" or together with "less than". When used together with "greater than", it applies to the technical scheme of "not less than"; when used together with "less than", it applies to the technical scheme of "not greater than".
In order to better understand the solution of the embodiment of the present application, the following describes electronic devices, related concepts and backgrounds that may be related to the embodiment of the present application.
The electronic device according to the embodiments of the present application may include various handheld devices, vehicle-mounted devices, wearable devices, computing devices, or other processing devices connected to a wireless modem, as well as various forms of user equipment (UE), mobile stations (MS), terminal devices, and so on. For convenience of description, the above-mentioned devices are collectively referred to as electronic devices.
Referring to fig. 1, fig. 1 is a block diagram of an interactive system according to an embodiment of the present application. As shown in fig. 1, the interactive system 100 includes a terminal device 110 and a server 120, where the terminal device 110 and the server 120 are communicatively connected, and a user interacts with a control on a display interface of the terminal device 110 to use an interactive product.
In one possible example, a first prompt popup frame is displayed on the product interaction interface of the terminal device 110. In response to a first selection operation of the user on the first operation control, the terminal device 110 receives the information sent by the server 120 that includes the multiple selectable characters of the target character, and jumps to display a character setting interface. Then, in response to a second selection operation of the user on the selection operation control, the terminal device 110 generates a setting request according to the information of the target character designated by the second selection operation and sends the setting request to the server 120; the terminal device 110 ends display of the character setting interface and jumps to display the interactive interface of the current interactive node. Alternatively, in response to a third selection operation of the user on the second operation control, the terminal device 110 ends display of the first prompt popup frame, jumps to display the interactive interface of the current interactive node, and interacts with the server 120 to determine the character of the target character as a default character. In response to a user input operation on the interactive interface of the current interactive node, the terminal device 110 sends an interactive node jump request to the server 120, and the server 120 generates an interactive node jump response message according to the interactive node jump request. Finally, in response to the interactive node jump response message sent by the server, the terminal device 110 jumps to display the interactive interface corresponding to the next interactive node, according to the next interactive node indicated by the interactive node jump response message.
The terminal device 110 may be a mobile phone, a tablet computer, a notebook computer, etc., and the server 120 may be a single server, a server cluster formed by multiple servers, or a cloud computing service center. One server 120 may correspond to multiple terminal devices 110 simultaneously, or the interactive system 100 may include multiple servers 120, each corresponding to one or more terminal devices 110.
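The terminal-to-server exchange in the example above might, purely as an illustration, use messages like the following. Every field name and message type here is an assumption for the sketch; the patent does not define a wire format.

```python
def make_setting_request(story_id, role_id, character):
    """Terminal -> server: ask the server to record `character` as the
    character (personality) of the target role (field names assumed)."""
    return {"type": "set_character", "story_id": story_id,
            "role_id": role_id, "character": character}

def make_jump_request(node_id, character, voice_text):
    """Terminal -> server: report the user's recognized speech at the
    current interactive node."""
    return {"type": "node_jump", "node_id": node_id,
            "character": character, "voice_text": voice_text}

def handle_jump_request(jump_table, request):
    """Server side: look up the jump-condition set for the (node,
    character) pair and answer with a jump response naming the next
    node, or the current node if no condition matches."""
    key = (request["node_id"], request["character"])
    for keyword, target in jump_table.get(key, {}).items():
        if keyword in request["voice_text"]:
            return {"type": "node_jump_response", "next_node_id": target}
    return {"type": "node_jump_response", "next_node_id": request["node_id"]}
```

In a real deployment these dictionaries would travel over the communication link between terminal device 110 and server 120 in whatever serialization the system uses; the sketch only shows the request/response shape implied by the description.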
Based on this, the embodiment of the application provides a method for processing interactive information in an interactive product, and the embodiment of the application is described in detail below with reference to the accompanying drawings.
Referring to fig. 2, fig. 2 is a flow chart of a method for processing interactive information in an interactive product according to an embodiment of the present application. The method is applied to a terminal device 110 that interacts with a user through a human-machine dialogue script in the interactive system 100 shown in fig. 1; the interactive system 100 includes the terminal device 110 and a server 120, the human-machine dialogue script includes a plurality of interactive nodes, and each interactive node includes machine output content and at least one user input content. The method comprises the following steps:
Step 210, displaying a first prompt popup frame on a product interaction interface, wherein the first prompt popup frame comprises a first operation control for setting a character of a target character, a second operation control for skipping the character setting step of the target character, and first prompt text information.
The first prompt text information is used to prompt the user that the target character currently being interacted with has a plurality of selectable characters. The target character is the character with which the user interacts for the first time when reading a target interaction story, and a target character in the target interaction story may have a plurality of selectable characters, meaning two or more character types. Because the target character's character differs, the interactive node jump differs, and the user is taken to a different next interactive node. It should be understood that "first interaction" here refers to the first interaction with the target character during the current reading of the target interaction story, not the first interaction in the user's history of reading interactive stories.
The product interaction interface is the interactive story interface presented by the terminal device; the interactive story may be an interactive novel, but is not limited to that form. The nodes of the interactive story can further be divided into interactive nodes and non-interactive nodes. An interactive node includes a user operation and a machine response operation, and its plot can only advance after the user operates on the display screen of the terminal device. A non-interactive node includes only a machine response operation and mainly covers narration content in the interactive story, dialogue between characters without user participation, and the like; the information types of the machine-side output content of a non-interactive node include non-video types and video types.
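The node taxonomy above (non-interactive nodes auto-play, interactive nodes wait for a user operation) can be sketched as a small playback loop. The dictionary keys and the return convention are illustrative assumptions, not part of the patent.

```python
def play_until_interactive(nodes):
    """Auto-play nodes in order: non-interactive nodes (narration,
    dialogue with no user participation) advance on their own, and
    playback pauses at the first interactive node, which needs a user
    operation before the plot can continue.

    Returns (machine outputs shown so far, index of the node now
    waiting for user input, or None if the segment had no interactive
    node)."""
    shown = []
    for i, node in enumerate(nodes):
        shown.append(node["output"])      # machine response operation
        if node["interactive"]:           # requires a user operation
            return shown, i
    return shown, None

# Illustrative script segment: two non-interactive nodes, then one
# interactive node that pauses playback.
script = [
    {"output": "Narrator: night falls over the village.", "interactive": False},
    {"output": "Elder: 'The road forks ahead...'", "interactive": False},
    {"output": "Guide: 'Which way shall we go?'", "interactive": True},
]
```

Running `play_until_interactive(script)` shows all three outputs and stops at index 2, the interactive node awaiting the user's input.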
Specifically, referring to fig. 3a, fig. 3a is a schematic diagram of a display interface of a terminal device provided by an embodiment of the present application. As shown in fig. 3a, a first prompt popup frame 30 is displayed at the uppermost layer of the product interaction interface and includes a first operation control 301, a second operation control 302, and first prompt text information 303. The first prompt text information 303 is text content that prompts the user that the current target character has multiple selectable characters and asks whether to set a character, for example: "The current character has multiple selectable characters. Set a character for this character?", or other text content serving a similar prompting function. The first operation control 301 is an operation control for confirming the character setting of the target character, and the second operation control 302 is an operation control for canceling the character setting of the target character.
Step 220, in response to a first selection operation of the user on the first operation control, receiving information sent by the server that includes the multiple selectable characters of the target character, jumping to display a character setting interface, and displaying on the character setting interface the multiple selectable characters and the selection operation controls corresponding to them.
The multiple selectable characters of the target character are associated with the character settings and background story in the target interaction story. When the server generates the interactive story script, it assembles the characters the target character may have according to comprehensive factors such as the target character's persona and the story background, generating a multi-selectable character set exclusive to that character. It should be appreciated that the character of the target character cannot differ too greatly from its persona and the story background setting. For example, if a target character in an adventure story is set as a member of the protagonist's party, i.e., a positive character, its selectable characters should lean toward those rich in adventurous spirit and bravery, rather than narrow-minded, suspicious, or cunning characters, which would differ too greatly from, and contradict, the positive persona and background setting.
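A hedged sketch of how a server might assemble such a character set while excluding personalities that contradict the persona and the story background. The tag names and the filtering rule are invented for illustration; the patent only states that the set is assembled from these comprehensive factors.

```python
# Personalities that, by assumption, contradict a positive protagonist
# persona in an adventure story (illustrative tags only).
ADVENTURE_CONTRADICTORY = {"narrow-minded", "suspicious", "cunning"}

def assemble_selectable_characters(persona, background, candidates):
    """Filter candidate personalities against the role's persona and
    the story background so that no selectable character contradicts
    them; other persona/background pairs pass candidates through."""
    if persona == "protagonist" and background == "adventure":
        return [c for c in candidates if c not in ADVENTURE_CONTRADICTORY]
    return list(candidates)
```

For a protagonist in an adventure story this keeps only adventure-compatible personalities, matching the example in the text; a fuller implementation would presumably score compatibility rather than hard-filter.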
Depending on the type of terminal device, the first selection operation and subsequent selection operations may be a finger touch-and-click operation on a touch screen, a mouse click on the related control, or a voice input whose recognized content the server converts into an operation instruction for selecting the related control.
After the user performs the first selection operation, the terminal device sends a reading request to the server; the server retrieves from memory the multiple pieces of selectable character information adapted to the target character and sends them to the terminal device, where they are displayed on the interactive display interface. Referring to fig. 3b, fig. 3b is a schematic diagram of a setting interface provided by an embodiment of the present application. As shown in fig. 3b, the character setting interface 31 includes an avatar 311 of the target character, the selectable characters 312 of the target character, and selection operation controls 313 for the selectable characters 312.
In other possible embodiments, referring to fig. 3c, fig. 3c is a schematic diagram of another setting interface provided by an embodiment of the present application. The difference between fig. 3c and fig. 3b is that in fig. 3c the selectable characters of the target character are presented as cartoon images. To adapt the interactive story to readers across a wide age range, and because young children may have only a fuzzy notion of character traits, the characters of the target character are presented through cartoon images, so that children who cannot recognize the trait names can still set a character according to their own preference. For example, pattern 321 in fig. 3c is a piglet image, suggesting to the user that the character of the target character for that option is optimistic and easygoing; pattern 322 is a rabbit image, suggesting that the character for that option is introverted and quiet; pattern 323 is a puppy image, suggesting that the character for that option is loyal; and pattern 324 is a kitten image, suggesting that the character for that option is courageous.
Step 230, responding to a second selection operation of the user for the selection operation control, generating a setting request according to the information of the character designated by the second selection operation, sending the setting request to the server, ending displaying the character setting interface and jumping to display the interactive interface of the current interactive node; or,
The setting request is used for instructing the server to take the selected character as the character of the target character. For example, in fig. 3b, the user selects the selection operation control corresponding to the "cheerful and enthusiastic" character; the server receives the setting request containing "cheerful and enthusiastic" and invokes from memory the interaction nodes and the jump condition set that apply when the target character has a cheerful and enthusiastic character.
After the selection is completed, the display of the character setting interface ends, the interface returns to the interactive interface where the user interacts with the target character, and the current interactive node continues to completion.
Step 240, responding to a third selection operation of the user for the second operation control, ending displaying the first prompt popup frame and jumping to display the interactive interface of the current interactive node.
Wherein the default character is one of the plurality of selectable characters.
After the user selects the second operation control, meaning the user does not want to set the character of the target character, the terminal device generates a corresponding indication message and sends it to the server. The indication message is used for indicating to the server that the user rejects custom character setting and requests that the setting be completed with the default character; the server then sets the character of the target character according to the user's history reading record or related parameters in the interaction process, and the character thus set is the default character.
Step 250, in response to the user input operation of the user on the interactive interface of the current interactive node, sending an interactive node jump request to the server.
The server generates the interactive node jump response message according to the interactive node jump request. The user input operation refers to an operation in which the user inputs characteristic voice information, and the characteristic voice information matches a target jump condition for jumping from the current interactive node to the next interactive node. The target jump condition is one condition in a jump condition set, which is a condition set determined by the server according to the character of the target character and used for triggering a jump to the next interactive node.
The user input operation may also be user text input, so as to adapt to various application scenarios.
When the user inputs a word or sentence with feature semantics, a jump from the current interaction node to the next interaction node is triggered. The feature semantics are semantics preset by the server as able to push the interaction node to jump, such as approving or affirmative semantics, or words and sentences without special semantics but containing preset keywords. For example, before the interaction node of this section, the user has been told that Columbus was a great navigator with an adventurous spirit; the target character then asks the user: "We have decided to go on an adventure and want to learn from a great navigator. He is...", after which the user answers: "Columbus", triggering the jump to the next interaction node and starting the interaction of the adventure chapter.
The jump condition sets are classified by the characters of the target character, and the jump condition sets corresponding to different characters may contain overlapping jump conditions; that is, when the user outputs the same feature sentence to target characters with different characters, they may jump to the same interaction node, which is not limited herein.
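As a rough illustration of the keyword matching described above, the following Python sketch shows one way a jump condition set could be matched against user input. The data layout, keyword lists, and node names are assumptions made for this example, not part of the described embodiment.

```python
# Illustrative sketch: match user input against a jump-condition set.
# Keyword lists and node names below are invented for this example.

def match_jump_condition(user_text, jump_condition_set):
    """Return the first jump condition whose keywords appear in the
    input text, or None if no condition is triggered."""
    text = user_text.lower()
    for condition in jump_condition_set:
        if any(keyword in text for keyword in condition["keywords"]):
            return condition
    return None

# A jump-condition set for one character (personality) of the target role,
# including the "Columbus" keyword example from the text above.
adventure_conditions = [
    {"keywords": ["columbus"], "next_node": "adventure_chapter"},
    {"keywords": ["yes", "agree", "ok"], "next_node": "confirm_branch"},
    {"keywords": ["no", "refuse"], "next_node": "decline_branch"},
]

hit = match_jump_condition("Columbus!", adventure_conditions)
print(hit["next_node"])  # prints adventure_chapter
```

In a real system the substring check would be replaced by the server-side speech recognition and semantic analysis the embodiment describes; the sketch only shows the condition-set lookup shape.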
Step 260, in response to the interaction node jump response message sent by the server, jumping to display the interaction interface corresponding to the next interaction node according to the next interaction node indicated by the interaction node jump response message.
The interactive interface comprises an avatar of the target character and output contents of the target character, and user output contents.
It can be seen that in the embodiment of the application, firstly, a user uses an interactive product through a terminal device which performs product interaction with the user through a man-machine dialogue script in an interactive system, and the terminal device displays a first prompt popup frame on a product interactive interface; then, by responding to a first selection operation of the user for the first operation control, receiving information, which is sent by the server and comprises the plurality of selectable characters of the target character, from the server, and skipping to display a character setting interface; then, responding to a second selection operation of the user for the selection operation control, sending the setting request to the server, ending displaying the character setting interface and jumping to display the interactive interface of the current interactive node; or, ending displaying the first prompt popup frame and jumping to display the interactive interface of the current interactive node by responding to the third selection operation of the user on the second operation control; further, sending an interactive node jump request to the server by responding to a user input operation of the user on an interactive interface of the current interactive node; and finally, by responding to the interactive node jump response message sent by the server, jumping to display the interactive interface corresponding to the next interactive node. In this way, the embodiment of the application sets characters for story characters, executes different jumping behaviors by the characters with different characters, improves the communication capability of the user facing the characters with different characters in the process of reading the interactive story, optimizes the use experience of the user, and improves the flexibility and intelligence of the interactive product.
In one possible example, the sending the interaction node jump request to the server, the server generating the interaction node jump response message according to the interaction node jump request includes: judging whether the user reads the target interaction story for the first time or not; if not, determining a history selection record of the user for the target role, wherein the history selection record comprises the history characters of the target role selected by the user and the history skip condition corresponding to the characteristic voice information input to the current interaction node, and determining whether the characters of the currently selected target role are the same as the history characters of the target role; if the historical jump condition is the same, taking the historical jump condition as the target jump condition; if the target jump conditions are different, the characteristic voice information is obtained, the characteristic voice information is analyzed, and the target jump conditions matched with the characteristic voice information are determined.
Determining whether the user reads the target interaction story for the first time is necessary because the target role mentioned in step 210 is only the first role the user interacts with in the current reading of the target interaction story; being the first role interacted with in the current reading process does not indicate whether this reading is the user's first or one of many historical readings, so whether the user is reading the target interaction story for the first time must be confirmed.
If the user does not read the target interaction story for the first time, reading character selection records of the target characters and skip condition records of the user before, and judging whether the characters of the currently selected target characters are the same as the historical character selections of the target characters; if the historical jump conditions are the same, the historical jump conditions are directly used as target jump conditions, and if the historical jump conditions are different, characteristic voice information of a user is analyzed, and the target jump conditions are further determined.
The feature voice information is voice information containing feature words or feature semantics, and the feature words or feature semantics are conditions for indicating a jump to the next interaction node. The server acquires the feature words or feature semantics and retrieves from memory the jump condition matching them as the target jump condition. For example, when the user outputs a sentence containing "yes" or feature semantics expressing approval or agreement, the server retrieves the jump condition matching the approval feature from memory and jumps to the next interaction node according to that condition; conversely, if the user outputs a sentence containing "no" or feature semantics expressing negation or rejection, the server determines the jump condition matching the negation feature and ends the jump action of the interaction node.
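The branch logic described above (reuse the historical jump condition when the selected character is unchanged, otherwise analyze the feature voice) can be sketched as follows; the record fields and the trivial one-keyword analyzer are illustrative assumptions, standing in for the server's real semantic analysis.

```python
# Sketch of choosing the target jump condition from history or analysis.
# Field names and the toy analyzer are assumptions for illustration.

def analyze_feature_voice(text):
    """Toy analyzer: classify the input as an approval or a refusal."""
    return "approve_jump" if "yes" in text.lower() else "refuse_jump"

def determine_target_jump_condition(first_read, history_record,
                                    current_character, feature_voice):
    if (not first_read and history_record is not None
            and history_record["character"] == current_character):
        # Same character as the historical selection: reuse the record.
        return history_record["jump_condition"]
    # First reading, or the character changed: analyze the feature voice.
    return analyze_feature_voice(feature_voice)

record = {"character": "cheerful and enthusiastic",
          "jump_condition": "approve_jump"}
print(determine_target_jump_condition(
    False, record, "cheerful and enthusiastic", ""))  # prints approve_jump
```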
In this example, whether the user reads for the first time is determined, so as to further determine a history selection record of the user on the target role, if the character of the currently selected target role is the same as the history character, the target jump condition is determined according to the history selection record, if the character of the currently selected target role is different from the history character, the target jump condition is determined according to the characteristic voice information, so that the target jump condition is effectively and accurately determined, the efficiency and flexibility of the terminal device in man-machine interaction are improved, and the user experience is optimized.
In one possible example, after said analyzing said characteristic speech information and determining said target jump condition matching thereto, said method further comprises:
determining whether the target jump condition is the same as the historical jump condition corresponding to the current interaction node;
if so, determining whether the target interaction node corresponding to the target jump condition is the same as the history interaction node corresponding to the history jump condition;
if the interactive nodes are different from the previous interactive nodes, displaying a second prompting popup frame on the product interactive interface, wherein the second prompting popup frame comprises a second prompting word, a first confirmation operation control and a first cancellation operation control, and the second prompting word is used for prompting the user that the interactive nodes to be jumped are different from the previous interactive nodes, and inquiring whether the user confirms to enter the interactive nodes the same as the previous interactive nodes;
responding to a fourth selection operation of the user for the first confirmation operation control, ending displaying the second prompt popup frame, and displaying a third prompt popup frame on the product interaction interface, wherein the third prompt popup frame comprises third prompt text used for prompting the user to input the target voice information required to enter the same interaction node as last time, and jumping to that interaction node after receiving the target voice information input by the user;
responding to a fifth selection operation of the user for the first cancel operation control, and jumping according to the target interaction node corresponding to the target jumping condition;
if not, determining whether the target interaction node is the same as the history interaction node;
if the interaction nodes are the same, jumping to display the interaction interface corresponding to both the target interaction node and the history interaction node; and,
displaying a fourth prompting popup frame on the product interactive interface, wherein the fourth prompting popup frame comprises fourth prompting characters, a second confirmation operation control and a second cancellation operation control, the fourth prompting characters are used for prompting the user to have an interactive node backtracking function currently, and the interactive node backtracking function is a function capable of jumping forward to a previous interactive node of the current interactive node;
responding to a sixth selection operation of the user for the second confirmation operation control, starting the interactive node backtracking function, jumping to the previous interactive node of the current interactive node, and displaying the interactive interface of the previous interactive node;
and responding to a seventh selection operation of the user for the second cancel operation control, and ending displaying the fourth prompt popup.
If the target jump condition of the current interaction node is the same as the historical jump condition, the characteristic voice input by the user is the voice information of the same type, namely the voice information containing the same characteristic word or similar characteristic semantics; and further continuously judging whether the interaction node corresponding to the target jump condition is the same as the interaction node corresponding to the historical jump condition.
If they are different, it means the user input the same type of feature voice information but jumped to different interaction nodes; that is, facing target characters with different characters, the user output the same type of feature voice information yet was routed to different interaction nodes. A second prompt popup frame is therefore displayed on the interaction interface of the terminal device. The second prompt text in it may be, for example, "The interaction node about to be entered is different from last time. Confirm entering the same interaction node as last time?"; the first confirmation operation control in the second prompt popup frame is used to confirm entering the same interaction node as last time, and the first cancellation operation control is used to refuse to enter it. After the user confirms entering the same interaction node as last time, a third prompt popup frame is displayed on the interaction interface of the terminal device, and the third prompt text in it prompts the user to input the target voice content needed to enter the interaction node consistent with last time.
If they are the same, it means that when facing different characters, the user input different feature voices but entered the same interaction node; a fourth prompt popup frame is then displayed on the interaction interface of the terminal device. Referring to fig. 4, fig. 4 is a schematic interface diagram of a prompt popup frame provided by an embodiment of the present application. As shown in fig. 4, the fourth prompt popup frame 40 includes fourth prompt text 410, a second confirmation operation control 420, and a second cancellation operation control 430. The fourth prompt text 410 may be "About to jump to the same interaction node as last time. Backtrack and interact with the current interaction node again?", or other similar prompting text content, without limitation. The user may select the second confirmation operation control 420 in the fourth prompt popup frame 40 to backtrack, returning to the previous interaction node of the current interaction node to interact again, or select the second cancellation operation control 430 to continue the interaction at the repeated interaction node.
In this example, whether the user inputs the same feature sentence is determined by judging whether the target jump condition is the same as the history jump condition, if so, whether the target interactive node is the same as the history interactive node is continuously judged, if not, the same feature sentence is indicated to enter different interactive nodes, and whether the user enters the same interactive node is confirmed by a second prompt bullet frame; if the target interaction node and the history interaction node are different, whether the target interaction node and the history interaction node are the same is determined, if the target interaction node and the history interaction node are the same, the different feature sentences jump to the same interaction node, an interaction node backtracking function is provided for a user, and the user can carry out the interaction node again, so that different prompt effects aiming at the conditions of different jumping conditions are achieved, the user reading process is easier and more convenient, the user using experience is improved, and the flexibility and the intelligence of the processing of jumping of the interaction node of the interaction story by equipment are improved.
In one possible example, the interacting with the server to determine the personality of the target persona as a default personality includes:
generating a default character setting message according to the third selection operation, and sending the default character setting message to the server, wherein the default character setting message is used for triggering the server to judge whether the current interaction node is the first reading of the user;
if yes, acquiring a history node reading record of the user through the server, and determining the default character of the target role from the history node reading record, wherein the history node reading record refers to at least one reading record of other interaction nodes in the target interaction story, and the other interaction nodes are one or more interaction nodes except the current interaction node in the interaction nodes;
if not, acquiring, through the server, a historical character setting record of the user, wherein the historical character setting record refers to at least one character setting record for the target character at the current interaction node; and taking, as the default character of the target character, either the character set for the target character when the user last read the current interaction node, or the character set the greatest number of times for the target character in the historical character setting record.
The other interaction nodes can be the previous interaction nodes of the current interaction node according to the sequence of the interaction stories, or the subsequent interaction nodes according to the sequence of the interaction stories, and the default character of the target role is confirmed according to the reading records of the user at the other interaction nodes.
The historical character setting record refers to a historical setting record in the current interaction node, and includes one or more setting records, if only one setting record is provided, the character is directly determined to be a default character, and if a plurality of setting records are provided, the character indicated by the last setting record is taken as the default character, and in other possible examples, the character with the largest setting times in the plurality of setting records can also be taken as the default character.
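A minimal sketch of the two default-character rules above: use the last-set character, or optionally the most frequently set one. The list-of-strings record format is an assumption for illustration.

```python
# Sketch: pick a default character from a historical setting record.
# The record is assumed to be a list of character names, oldest first.
from collections import Counter

def default_from_setting_records(records, use_most_frequent=False):
    """With one record, that character is the default; with several,
    take the last-set character, or the most frequently set one."""
    if not records:
        return None
    if use_most_frequent:
        return Counter(records).most_common(1)[0][0]
    return records[-1]

records = ["introverted and quiet", "introverted and quiet", "brave"]
print(default_from_setting_records(records))                          # brave
print(default_from_setting_records(records, use_most_frequent=True))  # introverted and quiet
```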
In this example, whether the current interaction node is first read by the user is first judged, if yes, the default character of the target character is determined according to the history node reading record, if not, the default character of the target character is determined according to the history character record, so that the efficiency of setting the default character of the target character by the device is improved, the adaptation degree of the default character of the set target character and the user will is improved, and the user experience is optimized.
In one possible example, the determining the default character of the target character from the history node reading record includes:
determining interaction fluency of roles with different characters in the reading process of the user, wherein the interaction fluency is determined by the times of starting the interaction node backtracking function in the reading process of the user, and the more the times of the interaction node backtracking function are, the lower the interaction fluency is, the less the times of the interaction node backtracking function are, and the higher the interaction fluency is;
and determining the default character from the plurality of optional characters corresponding to the target character according to the interaction fluency.
The interaction fluency can also be reflected by the number of dialogue rounds between the two parties when the user interacts with characters of different characters. For example, at the current interaction node the user needs to communicate with the target character to obtain the A information; if the user obtains the A information through 3 dialogues, the communication is relatively easy and the interaction fluency is high, whereas if the user obtains the A information through 13 dialogues, the communication is relatively difficult and the interaction fluency is low. It should be understood that the difficulty level may be determined by the relative number of dialogues as above, or measured against a standard value preset by the server; for example, if the preset number of dialogues for the current interaction node is 5, then a dialogue count less than or equal to 5 means high interaction fluency, while a count greater than 5 means low interaction fluency.
The more times the interactive node backtracking function is started, the lower the user's interest in the interaction node currently jumped to; that is, the user communicates poorly with the character of that personality, and the jumps do not match the user's wishes.
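The two fluency measures described above (a preset dialogue-round threshold, and the count of backtracking activations) might be sketched as follows. The threshold value of 5 comes from the example in the text, while the reciprocal scoring formula is an illustrative assumption.

```python
# Sketch of the two interaction-fluency measures; the reciprocal score
# is an invented formula, not taken from the described embodiment.

PRESET_ROUNDS = 5  # preset dialogue-round count from the example above

def fluency_from_rounds(dialogue_rounds, preset=PRESET_ROUNDS):
    """Threshold measure: at or under the preset round count is fluent."""
    return "high" if dialogue_rounds <= preset else "low"

def fluency_score(backtrack_count):
    """Backtrack measure: more activations of the node-backtracking
    function mean lower fluency; a reciprocal keeps the score in (0, 1]."""
    return 1.0 / (1 + backtrack_count)

print(fluency_from_rounds(3))   # prints high
print(fluency_from_rounds(13))  # prints low
```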
In this example, the default character is determined from the multiple selectable characters of the target character according to the interaction fluency by determining the interaction fluency of the user and the characters of different characters, so that the adaptive target character is set for the user, the efficiency of the terminal device in setting the characters of the target character is improved, the suitability of the set default character and the user will is improved, and the user experience is optimized.
In one possible example, the determining the default character from the plurality of selectable characters corresponding to the target character according to the interaction fluency includes: confirming the reading purpose of the target interaction story, wherein the reading purpose includes any one of the following: entertainment and communication capability improvement; when the reading purpose is entertainment, taking the character with the highest interaction fluency as the default character of the target character; and when the reading purpose is improving communication capability, taking the character with the lowest interaction fluency as the default character of the target character.
The reading purpose may be confirmed by asking the user through voice, a text pattern, or other feasible means before the user reads the target interaction story; for example, displaying "Please confirm your reading purpose this time" on the display interface of the terminal device together with "Option 1: entertainment" and "Option 2: improve communication capability", and then setting the default character of the target character adapted to the user's reading purpose according to the reply information input by the user.
When the user selects the reading purpose as entertainment, the user hopes to read smoothly enough, more contents can be read in a short time when the user faces the target role, the interaction link is simple, the story progress is fast, interaction backtracking is reduced as much as possible, and the story development is promoted in a most rapid and efficient mode; similarly, when the user selects the reading purpose to improve the communication capability, the user hopes to communicate with more characters in the interactive story, and the story progress is developed into a secondary factor, so that when the user faces the target character, the user needs to acquire more story information through frequent communication, and the reading purpose of training the communication capability is achieved.
Specifically, in the actual application, there may be a plurality of characters with the same interaction fluency, and at this time, one character is selected from the plurality of characters as a default character.
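A compact sketch of the purpose-based selection above: entertainment picks the character with the highest interaction fluency, communication training the lowest, and ties resolve to the first candidate encountered, matching the note that one character is simply chosen among equals. The score values and character names are illustrative.

```python
# Sketch: choose the default character according to the reading purpose.
# Scores and character names are invented for this example.

def default_character_by_purpose(fluency_by_character, purpose):
    """Entertainment takes the most fluent character, communication
    training the least fluent; ties go to the first key encountered."""
    pick = max if purpose == "entertainment" else min
    return pick(fluency_by_character, key=fluency_by_character.get)

scores = {"cheerful and enthusiastic": 0.9,
          "introverted and quiet": 0.4,
          "suspicious": 0.2}
print(default_character_by_purpose(scores, "entertainment"))          # most fluent
print(default_character_by_purpose(scores, "improve_communication"))  # least fluent
```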
In other possible examples, when the user selects the reading purpose to improve the communication capability, a man-machine interaction link can be appropriately added, and the user can obtain related information such as more story backgrounds and character introductions during usual reading through interaction, so that the achievement sense of the user in the communication process is improved, and the capability of the user to communicate with the character type person is further improved.
In this example, the reading purpose of the user is confirmed, so that the default character of the target character of the reading purpose of the user can be confirmed, when the user takes entertainment as the reading purpose, the character with the highest interaction fluency is taken as the default character of the target character, and when the user takes the improving communication capacity as the reading purpose, the character with the lowest interaction fluency is taken as the default character of the target character, so that the special setting of the reading purpose of the user is adapted, the reading experience of the user is improved, and meanwhile, the setting accuracy of the terminal equipment for the character is improved.
In one possible example, before the determining the default character of the target character from the history node reading record, the method further includes: acquiring the characteristic voice information input by the user facing characters of different characters from the history node reading record, analyzing the speed, the mood and the number of emotion characteristic words of the user when the characteristic voice information is input, and determining the interactive emotion of the user when the characteristic voice information is input, wherein the emotion characteristic words refer to words used for reflecting the emotion of the user in the characteristic voice information; determining a low-emotion personality from a plurality of different interaction emotions of the user facing different roles of the plurality of selectable personality, wherein the low-emotion personality refers to one or more role personality enabling the interaction emotion of the user to be lower than a preset value; and deleting one or more low-emotion characters from the plurality of selectable characters if the one or more low-emotion characters exist in the plurality of selectable characters of the target character.
Specifically, the faster the speech speed, the heavier the mood and the larger the number of emotion feature words when the user inputs the feature voice information, the lower the interaction emotion is indicated, namely, the user is uncomfortable in mood and is easy to generate emotion fluctuation when facing the target character of the character type.
The weight distribution may be performed over the three elements of speech speed, tone, and number of emotion feature words, for example a speech speed weight of 0.2, a tone weight of 0.3, and an emotion-feature-word weight of 0.5; the server then performs a weighted calculation on the relevant parameters recognized when the user inputs the characteristic voice information. For example, with a speech speed of 150 words/minute, a volume of 70 db, and 10 emotion feature words in the characteristic voice information, the calculation gives (150×0.2+70×0.3+10×0.5)=56; if the preset value is 50, the user's interaction emotion in this example is low, and the character of the target character belongs, for this user, to a low-emotion character. It should be understood that the sum of the weights of the three elements is 1.
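The weighted calculation in the example above can be reproduced directly. The weights and the preset value of 50 come from the example; the function and variable names are assumptions.

```python
# Sketch of the weighted emotion calculation from the worked example:
# weights 0.2 / 0.3 / 0.5 for speech rate, tone (volume), and
# emotion-feature-word count, summing to 1.

WEIGHTS = {"speech_rate": 0.2, "volume": 0.3, "emotion_words": 0.5}

def emotion_score(speech_rate, volume, emotion_word_count, weights=WEIGHTS):
    return (speech_rate * weights["speech_rate"]
            + volume * weights["volume"]
            + emotion_word_count * weights["emotion_words"])

def is_low_emotion_character(score, preset=50):
    # A higher score means faster speech, heavier tone, and more emotion
    # words, i.e. lower interaction emotion; above the preset value the
    # character counts as a low-emotion character for this user.
    return score > preset

s = emotion_score(150, 70, 10)  # 150 words/min, 70 dB, 10 emotion words
print(round(s, 6))              # approximately 56
print(is_low_emotion_character(s))
```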
In this example, based on the interaction emotion the user exhibits toward roles with different characters, the characters for which the user's interaction emotion is low are deleted from the selectable characters of the target character, so that those characters can no longer be set in later reading. This optimizes the user's reading experience and improves the intelligence with which the terminal device assists interaction in the interactive product.
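The deletion step described in this example can be sketched as a simple filter. The score threshold semantics follow the worked example above (score above the preset value means low emotion); character names and scores are invented for illustration.

```python
def prune_low_emotion_characters(selectable, score_by_character, preset=50):
    """Drop characters whose weighted score marks the user's emotion as low.

    score_by_character maps each selectable character to the user's weighted
    emotion score when facing a role with that character; scores above the
    preset value indicate low emotion, so those characters are removed.
    """
    low = {c for c, score in score_by_character.items() if score > preset}
    return [c for c in selectable if c not in low]

selectable = ["gentle", "stern", "playful"]
scores = {"gentle": 32.0, "stern": 56.0, "playful": 41.5}
print(prune_low_emotion_characters(selectable, scores))  # ['gentle', 'playful']
```

Here "stern" scores 56 (above the preset value 50), so it is removed and can no longer be set for the target character in later reading.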
The foregoing description of the embodiments of the present application has been presented primarily in terms of a method-side implementation. It will be appreciated that the mobile electronic device, in order to implement the above-described functions, includes hardware structures and/or software modules corresponding to the respective functions. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional units of the electronic device according to the method example, for example, each functional unit can be divided corresponding to each function, and two or more functions can be integrated in one processing unit. The integrated units may be implemented in hardware or in software functional units. It should be noted that, in the embodiment of the present application, the division of the units is schematic, which is merely a logic function division, and other division manners may be implemented in actual practice.
In the case of dividing functional modules according to corresponding functions, referring to fig. 5a, fig. 5a is a functional-unit block diagram of a processing device for interaction information in an interactive product according to an embodiment of the present application, applied to a terminal device 110 that performs product interaction with a user through a man-machine dialogue scenario in the interactive system 100 shown in fig. 1, where the interactive system 100 includes the terminal device 110 and a server 120, the man-machine dialogue scenario includes a plurality of interaction nodes, and each interaction node includes machine output content and at least one user input content. The processing device 50 includes: a prompting unit 501, configured to display a first prompt box on a product interaction interface, where the first prompt box includes a first operation control for setting the character of a target character, a second operation control for skipping the character setting step of the target character, and first prompt text information, the first prompt text information being used to prompt the user that a plurality of selectable characters exist for the target character currently interacting with the user, the target character being a character the user interacts with for the first time when reading a target interaction story; a setting interface display unit 502, configured to receive, in response to a first selection operation of the user on the first operation control, information sent from the server that includes the plurality of selectable characters of the target character, jump to display a character setting interface, and display on the character setting interface the plurality of selectable characters and the selection operation controls corresponding to them, where the plurality of selectable characters of the target character are associated with character settings and background stories in the target interaction story; a first skip display unit 503, configured to generate a setting request according to the information of the target character specified by a second selection operation of the user on the selection operation controls and send the setting request to the server, where the setting request is used to instruct the server to take the specified target character as the character of the target role, end displaying the character setting interface and jump to display the interactive interface of the current interaction node; or, a second skip display unit 504, configured to respond to a third selection operation of the user on the second operation control, end displaying the first prompt box, jump to display the interactive interface of the current interaction node, and interact with the server to determine that the character of the target character is a default character, where the default character is one of the plurality of selectable characters; an interaction node skip unit 505, configured to send, in response to a user input operation on the interactive interface of the current interaction node, an interaction node jump request to the server, where the server generates an interaction node jump response message according to the request, the user input operation is an operation of the user inputting characteristic voice information, the characteristic voice information matches a target jump condition of the current interaction node for jumping to a next interaction node, the target jump condition is one condition in a jump condition set, and the jump condition set is determined by the server according to the character of the target role and is used to trigger the jump to the next interaction node; and an interactive interface display unit 506, configured to respond to the interaction node jump response message sent by the server and jump to display the interactive interface corresponding to the next interaction node indicated by the message.
In one possible example, in the sending of an interaction node jump request to the server, where the server generates the interaction node jump response message according to the interaction node jump request, the interaction node skip unit 505 is specifically configured to: determine whether the user is reading the target interaction story for the first time; if not, determine a history selection record of the user for the target role, where the history selection record includes the historical character of the target role selected by the user and the historical jump condition corresponding to the characteristic voice information input at the current interaction node, and determine whether the character of the currently selected target role is the same as the historical character of the target role; if they are the same, take the historical jump condition as the target jump condition; if they are different, acquire the characteristic voice information, analyze it, and determine the target jump condition matching it.
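The reuse-or-reanalyze branching above can be sketched as follows. This is a simplified sketch under stated assumptions: the history store, `Node` type, field names and condition labels are all invented for illustration, and `analyze_voice` stands in for the server-side voice analysis.

```python
from dataclasses import dataclass

@dataclass
class Node:
    character: str       # currently selected character of the target role
    feature_voice: str   # the user's characteristic voice input at this node

def determine_target_jump_condition(user, story, node, analyze_voice, history):
    """Return the jump condition to match against the user's voice input.

    Mirrors the branching in the text: reuse the historical jump condition
    only when the story was read before AND the currently selected character
    equals the historical one; otherwise (first read, or a different
    character) re-analyze the new characteristic voice information.
    """
    record = history.get((user, story))            # None on a first read
    if record is not None and record["character"] == node.character:
        return record["jump_condition"]            # same character: reuse
    return analyze_voice(node.feature_voice)       # otherwise: re-analyze

history = {("alice", "story1"): {"character": "gentle",
                                 "jump_condition": "cond_A"}}
same = Node(character="gentle", feature_voice="go on")
diff = Node(character="stern", feature_voice="go on")
print(determine_target_jump_condition("alice", "story1", same, len, history))
# cond_A  (character unchanged: historical condition reused)
print(determine_target_jump_condition("alice", "story1", diff,
                                      lambda v: "cond_" + v, history))
# cond_go on  (character changed: voice re-analyzed)
```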
In one possible example, after said analyzing said characteristic speech information and determining said target jump condition matching therewith, said interactive node jump unit 505 is specifically further configured to: determining whether the target jump condition is the same as the historical jump condition corresponding to the current interaction node; if so, determining whether the target interaction node corresponding to the target jump condition is the same as the history interaction node corresponding to the history jump condition; if the interactive nodes are different from the previous interactive nodes, displaying a second prompting popup frame on the product interactive interface, wherein the second prompting popup frame comprises a second prompting word, a first confirmation operation control and a first cancellation operation control, and the second prompting word is used for prompting the user that the interactive nodes to be jumped are different from the previous interactive nodes, and inquiring whether the user confirms to enter the interactive nodes the same as the previous interactive nodes; responding to a fourth selection operation of the user for the first confirmation operation control, ending displaying the second prompting popup frame, displaying a third prompting popup frame on the product interaction interface, wherein the third prompting popup frame comprises third prompting characters used for prompting the user to enter the target voice information which is required to be input by the interaction node which is the same as the last time, and jumping to the interaction node which is the same as the last time after receiving the target voice information input by the user; responding to a fifth selection operation of the user for the first cancel operation control, and jumping according to the target interaction node corresponding to the target jumping condition; if not, determining whether the target interaction node is the same as the 
history interaction node; if they are the same, jumping to and displaying the interaction interface corresponding to both the target interaction node and the history interaction node, and displaying a fourth prompting popup frame on the product interaction interface, wherein the fourth prompting popup frame comprises fourth prompting text, a second confirmation operation control and a second cancellation operation control, and the fourth prompting text is used for prompting the user that an interactive node backtracking function is currently available, the interactive node backtracking function being a function capable of jumping back to the previous interactive node of the current interactive node; responding to a sixth selection operation of the user for the second confirmation operation control, starting the interactive node backtracking function, jumping to the previous interactive node of the current interactive node, and displaying the interactive interface of the previous interactive node; and responding to a seventh selection operation of the user for the second cancel operation control, and ending displaying the fourth prompting popup frame.
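The branching in this example (comparing the target jump condition and jump target against the historical ones) can be condensed into a small decision function. The action tags are illustrative stand-ins for the UI behavior described in the text, and the fallback for the case not explicitly specified (both the condition and the node differ) is an assumption:

```python
def decide_jump_action(target_cond, target_node, hist_cond, hist_node):
    """Map the (jump condition, target node) comparison onto a UI action.

    Sketch of the branching described above; the real device would display
    the corresponding popup frame or perform the jump directly.
    """
    if target_cond == hist_cond:
        if target_node == hist_node:
            return "jump_to_target"            # nothing changed: plain jump
        # Same condition now leads to a different node: ask the user whether
        # to enter the same node as last time (second/third popup frames).
        return "show_second_popup"
    if target_node == hist_node:
        # New condition, same destination: jump and offer the interactive
        # node backtracking function via the fourth popup frame.
        return "jump_and_offer_backtracking"
    # Not explicitly specified in the text; assume a plain jump.
    return "jump_to_target"

print(decide_jump_action("c1", "n1", "c1", "n1"))  # jump_to_target
print(decide_jump_action("c1", "n2", "c1", "n1"))  # show_second_popup
print(decide_jump_action("c2", "n1", "c1", "n1"))  # jump_and_offer_backtracking
```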
In one possible example, in the interaction with the server to determine that the character of the target character is a default character, the second skip display unit 504 is specifically configured to: generate a default character setting message according to the third selection operation and send it to the server, where the default character setting message is used to trigger the server to determine whether this is the user's first reading of the current interaction node; if yes, acquire a history node reading record of the user through the server, and determine the default character of the target role from the history node reading record, where the history node reading record refers to at least one reading record of other interaction nodes in the target interaction story, the other interaction nodes being one or more interaction nodes other than the current interaction node; if not, acquire a historical character setting record of the user through the server, where the historical character setting record refers to at least one character setting record for the target role at the current interaction node, and take as the default character of the target role either the character set for the target role when the user last read the current interaction node, or the character set the greatest number of times for the target role in the historical character setting record.
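The server-side branching in this example can be sketched as follows. The `prefer` switch covers the two alternatives the text names (most recently set character, or most frequently set character); the record format and the `derive_from_history` callback standing in for the history-node-reading-record path are illustrative assumptions.

```python
from collections import Counter

def default_character(first_time_at_node, setting_records,
                      derive_from_history, prefer="last"):
    """Pick the default character per the server-side branching above.

    setting_records: chronological list of characters previously set for the
    target role at this node. On a first reading of the node, fall back to
    deriving the default from the history node reading record; otherwise use
    either the last-set or the most frequently set character.
    """
    if first_time_at_node:
        return derive_from_history()   # path via the history node reading record
    if prefer == "last":
        return setting_records[-1]     # character set on the most recent reading
    return Counter(setting_records).most_common(1)[0][0]  # most frequent

records = ["gentle", "stern", "gentle"]
print(default_character(False, records, lambda: "playful"))          # gentle
print(default_character(False, records, lambda: "playful", "most"))  # gentle
print(default_character(True, [], lambda: "playful"))                # playful
```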
In one possible example, in the determining of the default character of the target character from the history node reading record, the second skip display unit 504 is specifically configured to: determine the interaction fluency of roles with different characters during the user's reading, where the interaction fluency is determined by the number of times the interaction node backtracking function was started during reading: the more often the backtracking function was triggered, the lower the interaction fluency, and the less often it was triggered, the higher the interaction fluency; and determine the default character from the plurality of selectable characters corresponding to the target character according to the interaction fluency.
In one possible example, in the determining, according to the interaction fluency, of the default character from the plurality of selectable characters corresponding to the target character, the second skip display unit 504 is specifically configured to: confirm the reading purpose of the target interaction story, where the reading purpose includes either of the following: entertainment or communication-capability improvement; when the reading purpose is entertainment, take the character with the highest interaction fluency as the default character of the target character; and when the reading purpose is improving communication capability, take the character with the lowest interaction fluency as the default character of the target character.
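Combining the two examples above, fluency is inverse to the backtracking count, so the purpose-dependent selection can be sketched directly over per-character backtracking counts. The purpose labels, character names and counts are illustrative:

```python
def default_by_fluency(backtrack_counts, purpose):
    """Choose the default character from per-character backtracking counts.

    Fluency is inversely related to how often the backtracking function was
    triggered, so for the entertainment purpose we pick the character with
    the fewest backtracks (highest fluency), and for the communication-
    improvement purpose the one with the most (lowest fluency).
    """
    if purpose == "entertainment":
        return min(backtrack_counts, key=backtrack_counts.get)
    if purpose == "communication":
        return max(backtrack_counts, key=backtrack_counts.get)
    raise ValueError("unknown reading purpose: " + purpose)

counts = {"gentle": 1, "stern": 6, "playful": 3}
print(default_by_fluency(counts, "entertainment"))  # gentle
print(default_by_fluency(counts, "communication"))  # stern
```

Picking the lowest-fluency character for communication training follows the rationale in the text: a character the user interacts with less smoothly gives more practice.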
In one possible example, before the determining of the default character of the target character from the history node reading record, the second skip display unit 504 is specifically further configured to: acquire, from the history node reading record, the characteristic voice information input by the user when facing roles with different characters; analyze the speech speed, the tone and the number of emotion feature words when the user input the characteristic voice information, and determine the interaction emotion of the user at the time of input, where an emotion feature word refers to a word in the characteristic voice information that reflects the emotion of the user; determine a low-emotion character from the plurality of different interaction emotions exhibited by the user toward roles with the plurality of selectable characters, where a low-emotion character refers to one or more role characters that cause the interaction emotion of the user to fall below a preset value; and, if one or more low-emotion characters exist among the plurality of selectable characters of the target character, delete the one or more low-emotion characters from the plurality of selectable characters.
In the case of using integrated units, referring to fig. 5b, fig. 5b is a functional-unit block diagram of a processing device for interaction information in another interactive product according to an embodiment of the present application. As shown in fig. 5b, the processing device 50 further includes a communication module 511 and a processing module 512. The processing module 512 is configured to control and manage the actions of the processing device for interaction information in the interactive product, for example, the steps performed by the prompting unit 501, the setting interface display unit 502, the first skip display unit 503, the second skip display unit 504, the interaction node skip unit 505 and the interactive interface display unit 506, and/or other processes of the techniques described herein. The communication module 511 is configured to support interaction between the processing device for interaction information in the interactive product and other devices. As shown in fig. 5b, the processing device 50 may further comprise a memory module 513, the memory module 513 being configured to store the program code and data of the processing device 50.
The processing module 512 may be a processor or controller, such as a central processing unit (Central Processing Unit, CPU), a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logic blocks, modules and circuits described in connection with this disclosure. The processor may also be a combination implementing computing functions, for example a combination comprising one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 511 may be a transceiver, an RF circuit, a communication interface, or the like. The memory module 513 may be a memory.
For all relevant contents of each scenario involved in the above method embodiment, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated here. The processing device 50 can execute the processing method for interaction information in an interactive product shown in fig. 2.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any other combination. When implemented in software, the above-described embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product comprises one or more computer instructions or computer programs. When the computer instructions or computer program are loaded or executed on a computer, the processes or functions described in accordance with embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium, for example, the computer instructions may be transmitted from one website site, computer, server, or data center to another website site, computer, server, or data center by wired or wireless means. The computer readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server, data center, etc. that contains one or more sets of available media. The usable medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium. The semiconductor medium may be a solid state disk.
Fig. 6 is a block diagram of a terminal device according to an embodiment of the present application. As shown in fig. 6, the terminal device 600 may include one or more of the following components: a processor 601, a memory 602 coupled to the processor 601, wherein the memory 602 may store one or more computer programs that may be configured to implement the methods as described in the embodiments above when executed by the one or more processors 601. The terminal device 600 may be the terminal device 110 in the above-described embodiment or the above-described electronic device.
Processor 601 may include one or more processing cores. The processor 601 connects various parts within the terminal device 600 using various interfaces and lines, and performs the various functions of the terminal device 600 and processes data by running or executing instructions, programs, code sets or instruction sets stored in the memory 602 and invoking data stored in the memory 602. Alternatively, the processor 601 may be implemented in at least one hardware form of digital signal processing (Digital Signal Processing, DSP), field-programmable gate array (FPGA) and programmable logic array (PLA). The processor 601 may integrate one or a combination of a central processing unit (Central Processing Unit, CPU), a graphics processing unit (Graphics Processing Unit, GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs and the like; the GPU is responsible for rendering and drawing display content; the modem is used to handle wireless communications. It will be appreciated that the modem may not be integrated into the processor 601 and may instead be implemented by a separate communication chip.
The Memory 602 may include random access Memory (Random Access Memory, RAM) or Read-Only Memory (ROM). Memory 602 may be used to store instructions, programs, code, a set of codes, or a set of instructions. The memory 602 may include a stored program area and a stored data area, wherein the stored program area may store instructions for implementing an operating system, instructions for implementing at least one function (e.g., a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the various method embodiments described above, and the like. The storage data area may also store data created by the terminal device 600 in use, etc.
It will be appreciated that the terminal device 600 may include more or fewer structural elements than those shown in the above-described block diagrams and is not limited in this regard.
The embodiments of the present application also provide a computer storage medium having stored thereon a computer program/instruction which, when executed by a processor, performs part or all of the steps of any of the methods described in the method embodiments above.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer-readable storage medium storing a computer program operable to cause a computer to perform part or all of the steps of any one of the methods described in the method embodiments above.
It should be understood that, in various embodiments of the present application, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed method, apparatus and system may be implemented in other manners. For example, the device embodiments described above are merely illustrative; for example, the division of the units is only one logic function division, and other division modes can be adopted in actual implementation; for example, multiple units or components may be combined or may be integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may be physically included separately, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a volatile memory or a nonvolatile memory. The nonvolatile memory may be a read-only memory (read-only memory, ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (random access memory, RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchronous link DRAM (SLDRAM) and direct memory bus RAM (DR RAM), all of which are media that can store program code.
Although the present invention is disclosed above, the present invention is not limited thereto. Variations and modifications, including combinations of the different functions and implementation steps, as well as embodiments of the software and hardware, may be readily apparent to those skilled in the art without departing from the spirit and scope of the invention.
Claims (10)
1. A processing method for interaction information in an interactive product, applied to a terminal device that performs product interaction with a user through a man-machine dialogue scenario in an interactive system, wherein the interactive system comprises the terminal device and a server, the man-machine dialogue scenario comprises a plurality of interaction nodes, and each of the interaction nodes comprises machine output content and at least one user input content; the method comprises the following steps:
displaying a first prompt popup frame on a product interaction interface, wherein the first prompt popup frame comprises a first operation control for setting a character of a target character, a second operation control for skipping the character setting step of the target character, and first prompt text information, the first prompt text information is used for prompting the user that a plurality of selectable characters exist for the target character currently interacting with the user, and the target character is a character that the user interacts with for the first time when reading a target interaction story;
receiving, in response to a first selection operation of the user on the first operation control, information sent by the server comprising the plurality of selectable characters of the target character, jumping to display a character setting interface, and displaying, on the character setting interface, the plurality of selectable characters and the selection operation controls corresponding to the plurality of selectable characters, wherein the plurality of selectable characters of the target character are associated with character settings and background stories in the target interaction story;
responding to a second selection operation of the user for the selection operation control, generating a setting request according to information of a target character designated by the second selection operation, and sending the setting request to the server, wherein the setting request is used for indicating the server to take the target character as the character of the target character, ending displaying the character setting interface and jumping to display the interactive interface of the current interactive node; or,
responding to a third selection operation of the user for the second operation control, ending displaying the first prompt popup frame, jumping to display an interactive interface of the current interactive node, and interacting with the server to determine that the character of the target character is a default character, wherein the default character is one of the plurality of selectable characters;
Responding to user input operation of the user aiming at the interactive interface of the current interactive node, sending an interactive node jump request to the server, generating an interactive node jump response message by the server according to the interactive node jump request, wherein the user input operation refers to the operation of the user for inputting characteristic voice information, the characteristic voice information is matched with a target jump condition of the current interactive node for jumping to a next interactive node, the target jump condition is one condition in a jump condition set, and the jump condition set is a condition set which is determined by the server according to the character of the target character and is used for triggering the next interactive node to jump;
and responding to the interaction node jump response message sent by the server, and jumping and displaying an interaction interface corresponding to the next interaction node according to the next interaction node indicated by the interaction node jump response message.
2. The method of claim 1, wherein the sending an interactive node jump request to the server, the server generating the interactive node jump response message according to the interactive node jump request, comprises:
Judging whether the user reads the target interaction story for the first time or not;
if not, determining a history selection record of the user for the target role, wherein the history selection record comprises the history characters of the target role selected by the user and the history skip condition corresponding to the characteristic voice information input to the current interaction node, and determining whether the characters of the currently selected target role are the same as the history characters of the target role;
if the historical jump condition is the same, taking the historical jump condition as the target jump condition;
if the target jump conditions are different, the characteristic voice information is obtained, the characteristic voice information is analyzed, and the target jump conditions matched with the characteristic voice information are determined.
3. The method of claim 2, wherein after said analyzing said characteristic speech information to determine said target jump condition matching thereto, said method further comprises:
determining whether the target jump condition is the same as the historical jump condition corresponding to the current interaction node;
if so, determining whether the target interaction node corresponding to the target jump condition is the same as the history interaction node corresponding to the history jump condition;
If the interactive nodes are different from the previous interactive nodes, displaying a second prompting popup frame on the product interactive interface, wherein the second prompting popup frame comprises a second prompting word, a first confirmation operation control and a first cancellation operation control, and the second prompting word is used for prompting the user that the interactive nodes to be jumped are different from the previous interactive nodes, and inquiring whether the user confirms to enter the interactive nodes the same as the previous interactive nodes;
responding to a fourth selection operation of the user for the first confirmation operation control, ending displaying the second prompting popup frame, displaying a third prompting popup frame on the product interaction interface, wherein the third prompting popup frame comprises third prompting characters used for prompting the user to enter the target voice information which is required to be input by the interaction node which is the same as the last time, and jumping to the interaction node which is the same as the last time after receiving the target voice information input by the user;
responding to a fifth selection operation of the user for the first cancel operation control, and jumping according to the target interaction node corresponding to the target jumping condition;
if not, determining whether the target interaction node is the same as the history interaction node;
if the target interaction node is the same as the history interaction node, jumping to and displaying the interaction interface corresponding to both the target interaction node and the history interaction node; and,
displaying a fourth prompting popup frame on the product interaction interface, wherein the fourth prompting popup frame comprises fourth prompting text, a second confirmation operation control and a second cancellation operation control, and the fourth prompting text is used for prompting the user that the interaction node backtracking function is currently available, the interaction node backtracking function being a function for jumping back to the previous interaction node of the current interaction node;
in response to a sixth selection operation by the user on the second confirmation operation control, starting the interaction node backtracking function, jumping to the previous interaction node of the current interaction node, and displaying the interaction interface of the previous interaction node;
and in response to a seventh selection operation by the user on the second cancellation operation control, ending display of the fourth prompting popup frame.
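The branching in claim 3 above can be summarized as a small decision procedure. The sketch below is illustrative only — function and field names are invented, not from the patent: it compares the matched target jump condition and target interaction node against the history recorded for the current node, and reports which prompt, if any, the interface would raise before jumping.

```python
from dataclasses import dataclass

@dataclass
class JumpDecision:
    action: str          # "confirm_same_node", "show_backtrack_prompt", or "jump"
    target_node: str     # node the UI would jump to if no prompt intervenes

def decide_jump(target_condition, target_node, history_condition, history_node):
    if target_condition == history_condition:
        if target_node != history_node:
            # Second prompting popup: ask whether to re-enter the previous node.
            return JumpDecision("confirm_same_node", target_node)
        # Claim 3 leaves the same-condition, same-node case unspecified;
        # a plain jump is assumed here.
        return JumpDecision("jump", target_node)
    # Conditions differ: if the nodes coincide, offer the backtracking function.
    if target_node == history_node:
        return JumpDecision("show_backtrack_prompt", target_node)
    return JumpDecision("jump", target_node)
```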
4. The method of claim 1, wherein the interacting with the server to determine the character of the target character as a default character comprises:
generating a default character setting message according to the third selection operation and sending the default character setting message to the server, wherein the default character setting message is used for triggering the server to determine whether the current interaction node is being read by the user for the first time;
if yes, acquiring, through the server, a history node reading record of the user, and determining the default character of the target character from the history node reading record, wherein the history node reading record refers to at least one reading record of other interaction nodes in the target interaction story, and the other interaction nodes are one or more interaction nodes other than the current interaction node;
if not, acquiring, through the server, a history character setting record of the user, wherein the history character setting record refers to at least one character setting record for the target character at the current interaction node, and taking as the default character of the target character either the character of the target character set when the user last read the current interaction node, or the character set the greatest number of times for the target character in the history character setting record.
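A minimal sketch of the server-side selection in claim 4, under invented names and data shapes (the patent does not define the record formats): first readings fall back to the history node reading record, later readings to the history character setting record, using either the last-set or the most-frequently-set character.

```python
from collections import Counter

def default_character(is_first_reading, node_history, setting_history, use_last=True):
    # node_history: characters observed in the user's readings of other
    # interaction nodes in the story (claim 4's history node reading record).
    # setting_history: characters previously set for the target character at
    # this node, ordered oldest to newest (history character setting record).
    if is_first_reading:
        # Claim 4 defers the exact rule to claim 5; picking the most common
        # character across other nodes is an invented stand-in.
        return Counter(node_history).most_common(1)[0][0]
    if use_last:
        return setting_history[-1]          # character set on the last reading
    return Counter(setting_history).most_common(1)[0][0]  # most often set
```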
5. The method of claim 4, wherein said determining the default character of the target character from the history node reading record comprises:
determining the interaction fluency of roles with different characters during the user's reading process, wherein the interaction fluency is determined by the number of times the interaction node backtracking function is started during the user's reading: the more times the backtracking function is started, the lower the interaction fluency, and the fewer times it is started, the higher the interaction fluency;
and determining the default character from the plurality of optional characters corresponding to the target character according to the interaction fluency.
6. The method of claim 5, wherein the determining the default character from the plurality of selectable characters corresponding to the target character according to the interaction fluency comprises:
confirming the reading purpose of the target interaction story, wherein the reading purpose comprises any one of the following: entertainment and communication capability improvement;
when the reading purpose is entertainment, taking the character with the highest interaction fluency as the default character of the target character;
and when the reading purpose is communication capability improvement, taking the character with the lowest interaction fluency as the default character of the target character.
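Claims 5 and 6 together define a ranking: fluency falls as backtracking-function activations rise, and the reading purpose decides whether to pick the most or least fluent character. A sketch under assumed names — the patent gives no concrete formula, so the 1/(1+n) mapping is an invented monotone choice:

```python
def interaction_fluency(backtrack_count):
    # More activations of the interaction node backtracking function
    # => lower fluency (claim 5); any decreasing mapping would do.
    return 1.0 / (1 + backtrack_count)

def pick_default(backtracks_by_character, purpose):
    # backtracks_by_character: character -> number of backtracking-function
    # activations observed while the user read with that character.
    fluency = {c: interaction_fluency(n) for c, n in backtracks_by_character.items()}
    if purpose == "entertainment":
        return max(fluency, key=fluency.get)   # smoothest interaction
    if purpose == "communication_improvement":
        return min(fluency, key=fluency.get)   # most challenging interaction
    raise ValueError("unknown reading purpose")
```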
7. The method of claim 4, wherein prior to said determining the default character of the target character from the history node reading record, the method further comprises:
acquiring, from the history node reading record, the characteristic voice information input by the user when facing roles with different characters, analyzing the speech speed, the mood and the number of emotion feature words of the user when inputting the characteristic voice information, and determining the interaction emotion of the user when inputting the characteristic voice information, wherein the emotion feature words refer to words in the characteristic voice information that reflect the user's emotion;
determining a low-emotion character from the plurality of different interaction emotions of the user when facing roles with the different selectable characters, wherein a low-emotion character refers to one or more role characters that cause the user's interaction emotion to be lower than a preset value;
and if one or more low-emotion characters exist among the plurality of selectable characters of the target character, deleting the one or more low-emotion characters from the plurality of selectable characters.
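The pre-filter in claim 7 can be sketched as below. The scoring formula and weights are invented for illustration — the patent only names the three signals (speech speed, mood, emotion-feature-word count); only characters whose measured interaction emotion stays at or above the preset value survive.

```python
def interaction_emotion(speech_speed, mood, emotion_word_count, weights=(0.2, 0.3, 0.5)):
    # Combine the three signals named in claim 7 into one score.
    # The linear form and the weights are assumptions, not from the patent.
    w_speed, w_mood, w_words = weights
    return w_speed * speech_speed + w_mood * mood + w_words * emotion_word_count

def drop_low_emotion(selectable, emotion_by_character, preset):
    # Delete characters whose interaction emotion is below the preset value
    # (claim 7's "low-emotion character").
    return [c for c in selectable if emotion_by_character[c] >= preset]
```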
8. A device for processing interaction information in an interactive product, characterized in that the device is applied to a terminal device that performs product interaction with a user through a human-machine dialogue script in an interactive system, wherein the interactive system comprises the terminal device and a server, the human-machine dialogue script comprises a plurality of interaction nodes, and each of the interaction nodes comprises machine output content and at least one user input content; the device comprises:
a prompting unit, used for displaying a first prompting popup frame on a product interaction interface, wherein the first prompting popup frame comprises a first operation control for setting the character of a target character, a second operation control for skipping the character setting step of the target character, and first prompting text information, the first prompting text information being used for prompting the user that the currently interacted target character has a plurality of selectable characters, and the target character is the character interacted with for the first time when the user reads a target interaction story;
a setting interface display unit, used for, in response to a first selection operation by the user on the first operation control, receiving information sent by the server that comprises the plurality of selectable characters of the target character, and jumping to display a character setting interface, wherein the plurality of selectable characters and the selection operation controls corresponding to them are displayed on the character setting interface, and the plurality of selectable characters of the target character are associated with the character settings and background story of the target interaction story;
a first jump display unit, used for, in response to a second selection operation by the user on a selection operation control, generating a setting request according to the information of the target character designated by the second selection operation, sending the setting request to the server to instruct the server to take the designated character as the character of the target character, ending display of the character setting interface, and jumping to display the interaction interface of the current interaction node; or,
a second jump display unit, used for, in response to a third selection operation by the user on the second operation control, ending display of the first prompting popup frame, jumping to display the interaction interface of the current interaction node, and interacting with the server to determine the character of the target character as a default character, wherein the default character is one of the plurality of selectable characters;
an interaction node jump unit, used for, in response to a user input operation on the interaction interface of the current interaction node, sending an interaction node jump request to the server so that the server generates an interaction node jump response message according to the interaction node jump request, wherein the user input operation is an operation of the user inputting characteristic voice information, the characteristic voice information matches a target jump condition for jumping from the current interaction node to the next interaction node, the target jump condition is one condition in a jump condition set, and the jump condition set is determined by the server according to the character of the target character and is used for triggering a jump to the next interaction node;
and an interaction interface display unit, used for, in response to the interaction node jump response message sent by the server, jumping to display the interaction interface corresponding to the next interaction node indicated by the interaction node jump response message.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
10. A computer readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method of any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310950990.4A CN116974412A (en) | 2023-07-28 | 2023-07-28 | Method and related device for processing interaction information in interaction product |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116974412A true CN116974412A (en) | 2023-10-31 |
Family
ID=88479197
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310950990.4A Pending CN116974412A (en) | 2023-07-28 | 2023-07-28 | Method and related device for processing interaction information in interaction product |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116974412A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||