CN115253272A - Game interaction method and device, storage medium and electronic equipment - Google Patents

Game interaction method and device, storage medium and electronic equipment

Info

Publication number
CN115253272A
Authority
CN
China
Prior art keywords
target user
state
game
voice interaction
activity state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210894162.9A
Other languages
Chinese (zh)
Inventor
张思敏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202210894162.9A
Publication of CN115253272A
Legal status: Pending

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/424: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45: Controlling the progress of the video game

Abstract

The present disclosure provides a game interaction method and apparatus, an electronic device, and a storage medium, and relates to the technical field of games. The game interaction method comprises the following steps: acquiring biological activity data of a target user collected by a wearable device; determining a current activity state of the target user according to the biological activity data; and in response to the current activity state of the target user being a preset state, controlling a game virtual character to perform, with the target user, a scenario interaction operation corresponding to the activity state of the target user. The method and apparatus can effectively improve the efficiency of game information transmission and the matching degree of the information content.

Description

Game interaction method and device, storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of game technologies, and in particular, to a game interaction method, a game interaction apparatus, an electronic device, and a computer-readable storage medium.
Background
With the development of game technology, games interact with users more and more, and effective game interaction can deepen the connection between a game and its users.
In the prior art, some game interaction methods do not take the user's current situation or state into account when choosing the time to initiate an interactive operation, so the user may be unable to view or respond to the interaction while sleeping or exercising. Moreover, the game interaction content may not match the user, resulting in poor information conveyance.
Therefore, a new game interaction method needs to be proposed.
It is to be noted that the information disclosed in the above background section is only for enhancement of understanding of the background of the present disclosure, and thus may include information that does not constitute prior art known to those of ordinary skill in the art.
BRIEF SUMMARY OF THE PRESENT DISCLOSURE
The present disclosure is directed to a game interaction method, a game interaction apparatus, an electronic device, and a computer-readable storage medium, which can effectively improve game information transmission efficiency and information content matching degree.
According to an aspect of the present disclosure, there is provided a game interaction method, including:
acquiring biological activity data of a target user acquired by a wearable device;
determining a current activity state of the target user according to the biological activity data;
and in response to the current activity state of the target user being a preset state, controlling a game virtual character to perform, with the target user, a scenario interaction operation corresponding to the activity state of the target user.
In an exemplary embodiment of the present disclosure, the biological activity data comprise physical sign information data of the target user and activity information data of the target user.
In an exemplary embodiment of the present disclosure, determining the current activity state of the target user from the biological activity data comprises:
acquiring corresponding weight information according to the category of the biological activity data;
and determining the current activity state of the target user according to each biological activity data and the corresponding weight information.
In an exemplary embodiment of the present disclosure, the preset state includes an idle state; and controlling the game virtual character to perform, with the target user, a scenario interaction operation corresponding to the activity state of the target user in response to the current activity state being a preset state comprises:
in response to the current activity state of the target user being an idle state, controlling a game virtual character to perform, with the target user, a voice interaction operation corresponding to the activity state of the target user.
In an exemplary embodiment of the present disclosure, the controlling a game virtual character to perform a voice interaction operation with the target user corresponding to an activity state of the target user includes:
sending a control instruction to the wearable device to control the wearable device to generate prompt information; the prompt information is used for prompting the target user to accept the voice interaction operation.
In an exemplary embodiment of the present disclosure, when the current activity state is an idle state, initiating a voice interaction operation to the target user includes:
initiating the voice interaction operation to the target user through the terminal equipment identity identification number of the target user; or alternatively,
and initiating the voice interaction operation to the target user through a game interaction interface of the game virtual character.
In an exemplary embodiment of the present disclosure, controlling a game virtual character to perform a voice interaction operation with a target user corresponding to an activity state of the target user includes:
determining voice interaction content according to the activity state of the target user;
and performing voice interaction operation with the target user by combining the voice interaction content.
In an exemplary embodiment of the present disclosure, determining voice interaction content according to an activity state of the target user includes:
determining voice interaction content according to the current activity state of the target user; or alternatively,
and determining voice interaction content according to the historical activity state of the target user.
In an exemplary embodiment of the present disclosure, determining the voice interaction content according to the current activity state of the target user or according to the historical activity state of the target user includes:
acquiring a voice content template corresponding to the activity state of the target user;
and generating the voice content by combining the activity state of the target user and the voice content template.
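The two template steps above can be sketched as follows: look up a voice content template matching the activity state, then fill it with the state details. The templates and field names here are illustrative assumptions, not content defined by the patent.

```python
# Hypothetical voice-content templates keyed by activity state.
VOICE_TEMPLATES = {
    "idle": "You walked {steps} steps today. Shall we chat for a bit?",
    "exercising": "Nice workout! Your heart rate reached {heart_rate}.",
}

def build_voice_content(activity_state, state_details):
    # Acquire the template corresponding to the activity state,
    # then combine it with the state data to generate the content.
    template = VOICE_TEMPLATES[activity_state]
    return template.format(**state_details)
```

For example, `build_voice_content("idle", {"steps": 3908})` would fill the idle-state template with the day's step count.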
According to an aspect of the present disclosure, there is provided a game interaction apparatus including:
the acquisition module is used for acquiring biological activity data corresponding to a target user, wherein the biological activity data is acquired by wearable equipment of the target user;
an analysis module for determining an activity state of the target user according to the biological activity data;
and the processing module is used for performing, with the target user, a scenario interaction operation corresponding to the activity state when the target user is determined to be in the preset activity state.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the method of any one of the above via execution of the executable instructions.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the above.
Exemplary embodiments of the present disclosure may have some or all of the following advantages:
in the game interaction method provided by an example embodiment of the present disclosure, in response to the current activity state of the target user being a preset state, a game virtual character is controlled to perform, with the target user, a scenario interaction operation corresponding to that activity state. On the one hand, the scenario interaction operation is performed only when the current activity state of the target user is a preset state, so different scenario interaction operations can be performed in different activity states. For example, a voice interaction operation can be performed while the user is idle, and no scenario interaction operation is performed while the user is sleeping, which prevents the user from refusing or missing the scenario interaction operation and effectively improves the transmission efficiency of game information. On the other hand, because the scenario interaction operation performed with the target user corresponds to the target user's activity state, its content is related to the target user, which solves the problem that game interaction content may not match the user, improves the matching degree of the game message content, and enhances the user's game experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 is a schematic diagram illustrating an exemplary system architecture to which a game interaction method and apparatus of the disclosed embodiments may be applied;
FIG. 2 schematically shows a flow diagram of a game interaction method according to one embodiment of the present disclosure;
FIG. 3 schematically shows a schematic diagram of user biological activity data in one embodiment according to the present disclosure;
FIG. 4 schematically shows a flow chart of a game interaction method according to another embodiment of the present disclosure;
FIG. 5 schematically shows a diagram of wearable device prompt information according to an embodiment of the present disclosure;
FIG. 6 schematically shows a diagram of voice interaction with a user through an operator according to an embodiment of the present disclosure;
FIG. 7 schematically shows a diagram of simulated voice interaction with a user through a game according to one embodiment of the present disclosure;
FIG. 8 schematically shows a flow chart of a game interaction method according to another embodiment of the present disclosure;
FIG. 9 schematically shows a flow chart of a game interaction method according to one embodiment of the present disclosure;
FIG. 10 schematically shows a block diagram of a game interaction device according to one embodiment of the present disclosure;
FIG. 11 illustrates a schematic structural diagram of a computer system suitable for use in implementing an electronic device of an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and the like. In other instances, well-known technical solutions have not been shown or described in detail to avoid obscuring aspects of the present disclosure.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which a game interaction method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, system architecture 100 may include a target user 101, a wearable device 102, a terminal device 103, and a server 104. The target user 101 may be a person of any gender or ethnicity, and the wearable device 102 may take the form of various portable accessories that can be worn on the body of the target user 101, have partial computing capability, and can be connected to various terminal devices and servers, including smart bracelets and smart watches. The terminal device 103 may be any of various electronic devices with game interaction functions, typically a mobile terminal device such as a mobile phone, but also a desktop computer, a portable computer, a tablet computer, and the like. It should be understood that the numbers of target users, wearable devices, terminal devices, and servers in fig. 1 are merely illustrative. There may be any number of target users, wearable devices, terminal devices, and servers, as required by the implementation. For example, the server 104 may be a server cluster composed of multiple servers, or the like.
The game interaction method provided by the embodiment of the present disclosure is generally executed by the terminal device 103, and accordingly, the game interaction apparatus is generally disposed in the terminal device 103. However, it is easily understood by those skilled in the art that the game interaction method provided in the embodiment of the present disclosure may also be executed by the server 104, and accordingly, the game interaction apparatus may also be disposed in the server 104, which is not particularly limited in the exemplary embodiment. For example, in an exemplary embodiment, the biological activity data of the target user 101 may be collected by a biological activity sensor on the wearable device 102 and uploaded to the terminal device 103; the terminal device 103 then uploads the biological activity data to the server 104, and the server 104 performs game interaction with the terminal device 103 through the game interaction method provided by the embodiment of the present disclosure. The game interaction method provided by the embodiment of the present disclosure may also be executed by the terminal device 103 and the server 104 together.
With the development of game technology, many games (such as popular romance-themed mobile games) feature game avatars that interact with users to enhance the game experience.
In some scenario interaction operations, a user selects a game virtual character in the game for voice interaction, and the game virtual character can then initiate voice interaction to the user, generally in one of two modes: a simulated voice call placed inside the game, or a real call placed by an operator outside the game to the user's mobile device. However, passively receiving the in-game voice interaction does not give the user a strong sense of immersion, while a direct call from outside the game is easily rejected or missed by the user.
Based on this, in the present exemplary embodiment, there is first provided a game interaction method, which, as shown with reference to fig. 2, may include the steps of:
step S210, acquiring biological activity data of a target user acquired by wearable equipment;
step S220, determining the current activity state of the target user according to the biological activity data;
step S230, in response to the current activity state of the target user being a preset state, controlling a game virtual character to perform, with the target user, a scenario interaction operation corresponding to the activity state of the target user.
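The three steps above can be sketched end to end as follows. This is a minimal illustration, not the patented implementation; `FakeWearable`, `determine_activity_state`, and `run_interaction` are all hypothetical names, and the heart-rate rule standing in for step S220 is an assumption.

```python
class FakeWearable:
    """Stand-in for the wearable device of step S210."""
    def get_biological_activity_data(self):
        return {"heart_rate": 72, "steps": 3908}

def determine_activity_state(bio_data):
    # Placeholder for step S220: treat a resting heart rate as idle.
    return "idle" if bio_data["heart_rate"] < 100 else "busy"

def run_interaction(wearable, preset_states=("idle",)):
    bio_data = wearable.get_biological_activity_data()  # step S210
    state = determine_activity_state(bio_data)          # step S220
    if state in preset_states:                          # step S230
        return f"start scenario interaction for state: {state}"
    return "defer interaction"
```

With an empty `preset_states`, the sketch defers the interaction regardless of the user's state, mirroring the "do not disturb" behavior described later.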
In the game interaction method provided by an example embodiment of the present disclosure, in response to the current activity state of the target user being a preset state, a game virtual character is controlled to perform, with the target user, a scenario interaction operation corresponding to that activity state. On the one hand, the scenario interaction operation is performed only when the current activity state of the target user is a preset state, so different scenario interaction operations can be performed in different activity states. For example, a voice interaction operation can be performed while the user is idle, and no scenario interaction operation is performed while the user is sleeping, which prevents the user from refusing or missing the scenario interaction operation and effectively improves the transmission efficiency of game information. On the other hand, because the scenario interaction operation performed with the target user corresponds to the target user's activity state, its content is related to the target user, which solves the problem that game interaction content may not match the user, improves the matching degree of the game message content, and enhances the user's game experience.
Next, in another embodiment, the above steps are explained in more detail.
In step S210, biological activity data of the target user collected by the wearable device is acquired.
In this example embodiment, the wearable device takes the form of a portable accessory that can be worn on the body of the target user; mainstream product forms include smart watches, smart bracelets, smart glasses, and other wearable devices, and the embodiment of the present disclosure is not limited thereto. The wearable device is equipped with multiple types of sensors, which can detect vital sign data of the target user, such as heart rate, blood pressure, and body temperature. Some environmental data in daily life can also be detected, including temperature, wind direction, and air humidity. In addition, exercise data of the target user may be detected, such as step count records and calorie consumption records. The wearable device has partial computing capability and may be connected to various terminal devices to transmit or share data; operating systems of wearable devices include, but are not limited to, the Android operating system, the iOS operating system, the Windows operating system, and the like. Since the user's biological activity data relates to personal privacy, the user's explicit authorization should generally be obtained before acquiring and using the biological activity data collected by the wearable device. Authorization may be obtained when the user logs in to the game for the first time, when the user starts the scenario interaction gameplay, or at other suitable times; a person skilled in the art may select a suitable time and adopt corresponding technical means as required, which is not described again here.
It should be noted that the above steps for acquiring the user's biological activity data are not necessary technical features for the technical problem to be solved by the present disclosure, and should not be taken as a limitation to the scope of the present disclosure.
In this example embodiment, the biological activity data may include vital sign information data of the target user and activity information data of the target user. The vital sign information data are data representing the vital signs of the target user, including a heart rate value, a blood pressure value, a blood sugar value, and the like. Much information about the target user can be inferred from such data; for example, a low blood sugar value may indicate that the target user is hungry, and an excessively high heart rate value may indicate that the target user is engaged in strenuous activity. Of course, the vital sign information data are not limited to the heart rate value and the like; pulse data, blood oxygen data, respiration data, and the like of the target user may also be detected, and the embodiment of the present disclosure is not limited. The activity information data of the target user include step count data, sleep data, and the like, which can represent the motion state, the sleep state, and the like of the target user.
In this example embodiment, the game server may obtain the biological activity data of the target user by connecting directly to the wearable device, or through the terminal device, where the terminal device and the wearable device are connected to each other and can transmit or share data in both directions. For example, in a scenario where a mobile phone running the Android operating system and a smart watch are connected through Bluetooth, the game server acquires the target user's biological activity data uploaded from the mobile phone, which shares the data with the smart watch over the Bluetooth connection. Referring to FIG. 3, the biological activity data of the target user on December 12 are acquired through the mobile phone: a movement distance of 2.59 kilometers, a step count record of 3908 steps, a sleep time of 6 hours and 47 minutes, and a stress value of 49. The sleep record shows the target user in a sleep state from 02:00 to 08:00 in the morning, and also displays the different sleep states, i.e., the time spent in light sleep and in deep sleep. In addition, the manner of acquiring the biological activity data of the target user collected by the wearable device is not limited to the above method.
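The daily record described for FIG. 3 could be represented as a simple upload payload from the handset; the field names below are assumptions for illustration, not a format defined by the patent.

```python
# Hypothetical payload mirroring the FIG. 3 daily record.
fig3_record = {
    "date": "12-12",
    "distance_km": 2.59,
    "steps": 3908,
    "sleep_duration": "6h47m",
    "stress_value": 49,
    "sleep_window": ("02:00", "08:00"),  # sleep state from 02:00 to 08:00
}

def summarize(record):
    # One-line summary of the uploaded biological activity data.
    return f"{record['steps']} steps, {record['distance_km']} km, slept {record['sleep_duration']}"
```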
In step S220, the current activity state of the target user is determined according to the biological activity data.
In the present exemplary embodiment, the activity state represents the movement or activity situation of the target user, and includes an idle state, a busy state, a sleep state, and the like. For example, the target user is in an idle state when not moving or performing other activities, in a busy state when running or performing highly stimulating activities, and in a sleep state when breathing is relatively smooth and the target user is resting or has fallen asleep.
In the present exemplary embodiment, the target user's biological activity data obtained in step S210 are analyzed to determine the current activity state of the target user. Referring to FIG. 4, determining the current activity state of the target user from the biological activity data may be performed through the following steps S410-S420.
In step S410, corresponding weight information is obtained according to the category of the biological activity data.
In the present exemplary embodiment, the biological activity data are of different types, which carry different proportions in determining the activity state. For example, when judging the current activity state of the target user, the heart rate determines the activity state more intuitively and more typically than other biological activity data, so the weight of the heart rate is higher than that of the other data; the weight information required for determining the activity state can therefore be preset and stored in the database. In addition, different weight information may also be preconfigured using other characteristics of the biological activity data, such as assigning weights according to the data format, and the embodiment of the present disclosure is not limited. In this example embodiment, after the biological activity data of the target user are acquired, they are classified by category, and the corresponding weight information Wi is taken out of the database for each category. For example, if the acquired biological activity data only include a heart rate value and a blood pressure value, the weight W1 corresponding to the heart rate value acquired from the database is 0.7, and the weight W2 of the blood pressure value is 0.3.
In step S420, a current activity status of the target user is determined according to each piece of the biological activity data and the corresponding weight information.
In this exemplary embodiment, after each piece of biological activity data and its corresponding weight information are acquired, they are combined for analysis, and the current activity state of the user is determined from the analysis result. For example, each piece of biological activity data Ii is multiplied by its corresponding weight Wi, the products are added, and the sum is compared with preset thresholds in the database; the preset thresholds differ between activity states, and the activity state of the user is determined from the comparison. If the activity state is divided into a busy state, an idle state, and a sleep state, the thresholds of the three states differ: the busy-state threshold is set highest, the idle-state threshold second, and the sleep-state threshold lowest. For example, if the current heart rate value I1 is 110 and the blood pressure value I2 is 50, multiplying by the corresponding weights W1 and W2 gives products of 77 and 15, respectively, and adding the products gives 92. Assuming a busy-state threshold of 80, the user's activity state is busy and it is not appropriate to communicate with or disturb the user. In other exemplary embodiments of the present disclosure, data with a high weight value may be selected as key data, and the activity state of the user may be inferred by monitoring the key data, which is not limited in this exemplary embodiment.
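Steps S410-S420 can be sketched with the numbers from the example above: heart rate 110 with weight 0.7 and blood pressure 50 with weight 0.3 give 110*0.7 + 50*0.3 = 92, which exceeds the busy threshold of 80. The weights and the busy threshold come from the example; the idle and sleep thresholds below are illustrative assumptions (the text only fixes their ordering).

```python
# Step S410: per-category weight information, as if taken from the database.
WEIGHTS = {"heart_rate": 0.7, "blood_pressure": 0.3}
# Busy threshold from the example; idle/sleep values are assumed, highest first.
THRESHOLDS = [("busy", 80), ("idle", 40), ("sleep", 0)]

def determine_activity_state(bio_data):
    # Step S420: weighted sum of each datum, compared against the thresholds.
    score = sum(value * WEIGHTS[category] for category, value in bio_data.items())
    for state, threshold in THRESHOLDS:
        if score >= threshold:
            return state, score
    return "unknown", score
```

Calling `determine_activity_state({"heart_rate": 110, "blood_pressure": 50})` reproduces the example: a score of 92 and a busy state.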
In other exemplary embodiments of the present disclosure, the activity state of the target user may also be determined from the biological activity data in other ways, such as by using other data (respiration, step count, etc.) or by sorting the differences in the data.
In the present exemplary embodiment, the method described above is not limited to determining the current activity state of the target user from the biological activity data of the target user. The present disclosure is not particularly limited thereto.
In step S230, in response to that the current activity state of the target user is a preset state, controlling a game virtual character and the target user to perform a scenario interaction operation corresponding to the activity state of the target user.
In this exemplary embodiment, the preset state can be set flexibly according to actual needs. After the current activity state of the target user is obtained by analyzing the biological activity data of the target user, and when that current activity state is found to be the preset state, the game server can control the virtual character to perform, with the target user, the scenario interaction operation corresponding to the activity state of the target user. For example, the current activity state of the target user may be classified as a busy state, an idle state, a sleep state, and so on. There are many modes of scenario interaction operation, common ones being voice interaction operation, video interaction operation (for example, simulated video interaction between a character in the game and the user), message interaction operation, and the like, which the present disclosure does not particularly limit. The scenario interaction operation differs between activity states: for example, if the target user is currently in an idle state, the game server can control the virtual character to perform voice interaction or video interaction with the target user, and when the target user is in a busy state, the virtual character can instead be controlled to perform message interaction with the target user. In this exemplary embodiment, during the scenario interaction operation corresponding to the activity state of the target user, the scenario interaction content may be generated according to the current activity state or the historical activity state of the target user, so as to improve the matching degree of the game message content and enhance the game experience of the target user.
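The state-to-mode dispatch described above can be sketched as a simple lookup. The mode names and the choice of exactly one mode per state are assumptions for illustration; the text also mentions video interaction for idle users, which could be added alongside voice.

```python
# Illustrative mapping from detected activity state to scenario-interaction mode.
# The mode names ("voice", "message") are assumptions; None means no interaction.
INTERACTION_MODE = {
    "idle": "voice",      # idle users can take a simulated voice or video call
    "busy": "message",    # busy users only receive an in-game message
    "sleep": None,        # no interaction while the user is asleep
}

def pick_interaction(state):
    """Return the interaction mode for a state, or None if no interaction fits."""
    return INTERACTION_MODE.get(state)
```

A real implementation would likely let the game designer configure this table per scenario rather than hard-code it.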
In this example embodiment, in response to the current activity state of the target user being an idle state, a game virtual character is controlled to perform, with the target user, a voice interaction operation corresponding to the activity state of the target user. For example, if the target user's current heart rate value and blood pressure value are relatively low, the calculation combining them with the corresponding weights yields an idle state; since the preset activity state is also the idle state, the game virtual character is controlled to initiate a voice interaction operation to the target user. Initiating the voice interaction operation at this moment conveys the game scenario content to the target user as reliably as possible, reduces the chance of the interaction being missed or refused, and improves the transmission efficiency of game information.
To further improve game information transmission efficiency, prompt information can be sent to the target user to remind the target user to accept the voice interaction so that it is not missed; for example, the target user can be reminded through the wearable device.
In this example embodiment, a control instruction is sent to the wearable device to control the wearable device to generate prompt information, where the prompt information is used to prompt the target user to accept the voice interaction operation. The control instruction sent to the target user's wearable device may generate different kinds of prompt information on the device: for example, it may be a vibration instruction or a display instruction for the wearable device, which the present disclosure does not limit. The prompt information generated on the wearable device may take the form of a vibration prompt, an icon prompt in the display area of the wearable device, or a voice broadcast. For example, a caption can be displayed directly on the target user's smart glasses to indicate that a call is coming in and needs to be answered, or the target user's smart bracelet can be controlled to vibrate to indicate the incoming call. Referring to fig. 5, for example, when it is detected that the target user is currently in an idle state, voice interaction is initiated with the target user while a control instruction is sent to the smart watch to make it vibrate and display a voice incoming-call image; the user is thereby prompted that a voice interaction is pending and can then choose whether to accept it according to the user's own convenience. This further improves game information transmission efficiency. In addition, other manners may also be used to deliver prompt information reminding the target user to accept the voice interaction, which the present disclosure does not particularly limit.
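One way to sketch such a control instruction is as a small serialized message sent to the device. The field names, the `vibrate` action, and the JSON transport below are all assumptions for illustration; the patent does not specify an instruction format.

```python
# Hypothetical sketch of building a control instruction for the wearable device.
# Field names, action kinds, and the JSON encoding are illustrative assumptions.
import json

def build_prompt_instruction(device_id, kind="vibrate", caption=None):
    """Build an instruction asking the wearable to prompt the user to accept
    the pending voice interaction (vibration, icon/caption display, etc.)."""
    instruction = {"device_id": device_id, "action": kind}
    if caption is not None:
        instruction["caption"] = caption  # e.g. shown on smart glasses or watch
    return json.dumps(instruction)

# e.g. vibrate the smart watch and show an incoming-call caption:
msg = build_prompt_instruction("watch-01", caption="Incoming voice call")
```

The same builder could cover the display-instruction case from the text by passing `kind="display"`.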
In this example embodiment, when the preset activity state is an idle state, a voice interaction operation is initiated to the target user, and the operation may be initiated in multiple ways. For example, the manners of initiating the voice interaction operation to the user include: initiating the voice interaction operation to the target user through the terminal device identity identification number of the target user; or initiating the voice interaction operation to the target user through a game interaction interface of the game virtual character. The game server can perform voice interaction with the target user according to the terminal device identification number. For example, if the terminal device the target user uses to interact with the game is a mobile phone, the terminal device identification number representing the target user's identity is the target user's mobile phone number; the game server can then dial the target user's mobile phone through an operator, so that the mobile phone receives the game voice interaction while the target user's smart watch simultaneously receives the prompt information reminding the target user to accept it. Fig. 6 shows a call record of the game server performing voice interaction with the target user through the operator, in which the time and duration of each voice interaction between the game virtual character and the target user are recorded.
In addition, the voice interaction operation can be initiated to the target user through a game interaction interface of the game virtual character. That is, the game server can also initiate voice interaction (a simulated conversation) with the target user in the game interaction interface by controlling the game virtual character in the game. Referring to fig. 7, the game server performs voice interaction with the target user through the game virtual character at the game interaction interface and triggers the game interaction scenario; in the figure, the game virtual character performs voice interaction with the target user in a study room, and subtitles are also shown on the game interaction interface to facilitate interaction with the target user.
In most existing game scenario interactions, the scenario interaction between the game virtual character and the target user is carried out according to selections made by the target user, which reflects the passivity of such game interactions; some scenario interactions, once started, cannot be skipped or ignored, leaving little freedom. In the present exemplary embodiment, when the target user notices a pending game voice interaction operation, the target user may freely choose whether to accept it according to whether the current situation is convenient, giving the target user a great degree of freedom. Moreover, the scenario interaction operation is actively triggered by the game according to the state of the target user, which brings the game closer to life and further intertwines the game with reality, improving the target user's sense of immersion; the target user then accepts voice interaction more readily, further improving game information transmission efficiency. Based on the scheme in some example embodiments of the present disclosure, voice interaction can be actively triggered merely by acquiring the user's biological activity information through the wearable device, which greatly reduces development cost.
In this exemplary embodiment, the current activity state of the target user is determined according to the biological activity data, and information is transferred in response to the current activity state of the target user being a preset state. This effectively improves information transmission efficiency, makes it easier for the target user to receive the information, and prevents the target user from missing important events. For example, initiating information interaction while the target user is idle has a higher probability of success than while the target user is busy, which also saves resources. In the present exemplary embodiment, the analysis of the target user's activity state is used to choose the timing of game interaction, preventing the user from refusing or missing the interaction for the user's own reasons and improving the transmission efficiency of game information to the target user.
In addition, as for controlling the game virtual character to perform, with the target user, the voice interaction operation corresponding to the activity state of the target user, other modes of voice interaction may also be used, and the present disclosure is not particularly limited in this respect.
In this example embodiment, a game virtual character and the target user are controlled to perform a voice interaction operation corresponding to the activity state of the target user. This also includes tailoring the voice interaction content to that activity state: when the game virtual character performs voice interaction with the target user, the voice content of the conversation can be generated according to the activity state of the target user. For example, if the current activity state during the voice interaction is an idle state, the corresponding voice interaction content is generated according to the target user's current state. Referring to fig. 8, the corresponding voice content is generated through steps S810-S820.
In step S810, determining the voice interaction content according to the activity state of the target user.
In this example embodiment, determining the voice interaction content according to the activity state of the target user allows the voice interaction to be carried out effectively according to the target user's actual situation, improving the content matching degree of the scenario interaction. The voice interaction content can be determined and generated according to the target user's activity state within certain time periods.
In the present exemplary embodiment, the voice interaction content is determined according to the current activity state of the target user, or according to the historical activity state of the target user. To improve the matching degree of the voice content, the voice interaction content can first be generated according to the target user's current activity state; this better fits the target user's current mood or activity, makes the simulated scene more realistic, and gives the target user a stronger sense of immersion. Alternatively, the voice interaction content can be determined according to the target user's historical activity state, selecting the interaction state within a certain time period as required; for example, the target user's activity states over that period may be analyzed as a whole to generate well-fitted voice interaction content. In this regard, generating voice interaction content based on the target user's activity state may be accomplished with the following steps S1010-S1020.
In step S1010, a voice content template corresponding to the activity state of the target user is obtained.
In the present exemplary embodiment, the voice content template is obtained as follows: preset voice content is stored in a database; the various biological activity data of the target user in different activity states are analyzed and compared with the corresponding preset threshold intervals; the biological activity data categories that fall outside their threshold intervals are identified; and these categories are used as keywords to look up the corresponding voice content templates in the database. For example, if the target user's step count for the day is 3000 steps while the preset step-count threshold interval is 5000-6000 steps, the acquired voice content template is one built around 'pay more attention to exercise' or 'ask whether the user is at home'; or, if the target user's sleep time for the day is 5 hours, less than the preset sleep-time interval, the acquired voice content template reminds the target user to 'pay attention to rest', and so on.
In step S1020, the voice content is generated by combining the activity status of the target user and the voice content template.
In the present exemplary embodiment, the voice content to be spoken is generated for the activity state of the target user by combining the corresponding voice content templates. Across the target user's different activity states, multiple categories of biological activity data and several voice content templates are collected to form the final voice content. For example, if the user's step-count record for the day exceeds the step-count threshold interval, the acquired voice content template is one that incorporates the step count, reminding the user that many steps were walked that day and that the user should rest, or asking where the user went to play. As another example, if the target user's heart rate for the day stayed above the preset heart rate threshold, the acquired voice content templates are ones such as 'the heart rate exceeded the threshold today, please take more rest' or 'tell the user a joke to lighten the mood', so the theme of the final generated voice content revolves around reminding the user to rest and cheering the user up, for example: 'How many steps did you take today? Did you go out to play, or were you exercising? I also noticed your blood pressure was high all day, you must rest more.' followed by some jokes to divert the user's attention. In the prior art, most voice interaction content does not relate to the user personally: it is mostly in-game scenario dialogue with little sense of immersion for the user, whereas the 'customized scenario' generated according to the user's activity state in steps S1010-S1020 effectively improves the matching degree of the game message content.
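Steps S1010-S1020 can be sketched as a lookup-and-fill pipeline: find the data categories outside their preset intervals, use them as keywords to fetch templates, then fill the templates with the user's actual values. The interval values and template strings below are illustrative assumptions matching the examples in the text (3000 steps against a 5000-6000 interval, 5 hours of sleep against a 7-9 hour interval).

```python
# Hypothetical sketch of steps S1010-S1020. The threshold intervals and the
# template table stand in for the database of preset voice content; both are
# illustrative assumptions, not values from the patent.
THRESHOLDS = {"steps": (5000, 6000), "sleep_hours": (7, 9)}
TEMPLATES = {
    "steps": "You only took {steps} steps today; how about going out to play?",
    "sleep_hours": "You slept {sleep_hours} hours today; please take more rest.",
}

def generate_voice_content(activity_data):
    """S1010: find out-of-interval categories and fetch their templates.
    S1020: fill each template with the user's value and join into one utterance."""
    parts = []
    for category, value in activity_data.items():
        low, high = THRESHOLDS[category]
        if not (low <= value <= high):  # out-of-interval category is the keyword
            parts.append(TEMPLATES[category].format(**{category: value}))
    return " ".join(parts)
```

With `{"steps": 3000, "sleep_hours": 8}` only the step-count template fires, mirroring the 3000-step example above; with multiple out-of-interval categories the filled templates are concatenated into one customized utterance.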
In step S820, a voice interaction operation is performed with the target user in combination with the voice interaction content.
In this exemplary embodiment, the game server controls the game virtual character to perform voice interaction with the target user through the generated voice interaction content, thereby completing the entire game interaction operation.
Based on the game interaction method in the above example embodiment, the electronic device can automatically analyze the user's biological activity data and determine the user's current activity state merely by acquiring the biological activity data collected by the target user's wearable device, and then, when the target user is determined to be in a preset activity state, perform with the target user the scenario interaction operation corresponding to that activity state. Compared with the prior art, the game interaction method in this example embodiment greatly optimizes the user's participation in game interaction and can therefore greatly improve the user experience. For example, the method is particularly suitable for a romance game scenario in which the game virtual character directly calls the user, where the user might otherwise be busy or find it inconvenient to participate in the interaction. Furthermore, in the game interaction method of this example embodiment, the voice interaction content can be 'personalized': a corresponding voice content template is obtained according to the user's activity state, and the voice content is generated by combining the activity state with the template. In response to the current activity state of the target user being a preset state, the game virtual character and the target user are controlled to perform the scenario interaction operation corresponding to the activity state of the target user. On one hand, the scenario interaction operation is carried out only when the current activity state of the target user is a preset state, so that different scenario interaction operations can be performed in the target user's different activity states.
For example, a voice interaction operation can be performed while the user is idle, and no scenario interaction operation is performed while the user is sleeping, preventing the scenario interaction operation from being refused or missed by the user and effectively improving the transmission efficiency of game information. On the other hand, performing with the target user the scenario interaction operation corresponding to the target user's activity state makes the content of the operation relevant to the target user, solving the problem that game interaction content may not match the target user, further improving the matching degree of the game message content and enhancing the user's game experience.
It should be noted that although the steps of the methods of the present disclosure are depicted in the drawings in a particular order, this does not require or imply that the steps must be performed in this particular order or that all of the depicted steps must be performed to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions, etc.
Further, in the present exemplary embodiment, a game interaction apparatus is also provided. The game interaction device can be applied to a server or terminal equipment. Referring to fig. 10, the game interaction apparatus 1000 may include an obtaining module 1010, an analyzing module 1020, and a processing module 1030, wherein:
the acquiring module 1010 is configured to acquire biological activity data corresponding to a target user, which is acquired by wearable equipment of the target user;
an analysis module 1020 for determining a current activity state of the target user from the biological activity data;
and the processing module 1030 is configured to perform scenario interaction operation corresponding to the activity state with the target user when it is determined that the target user is in a preset activity state.
In an exemplary embodiment of the present disclosure, the analysis module 1020 includes:
the weight acquisition unit is used for acquiring corresponding weight information according to the category of the biological activity data;
and the activity state determining unit is used for determining the current activity state of the target user according to each piece of biological activity data and the corresponding weight information.
In an exemplary embodiment of the present disclosure, the processing module 1030 includes:
and the operation processing unit is used for initiating voice interaction operation to the target user when the preset active state is an idle state.
In an exemplary embodiment of the present disclosure, the operation processing unit sends the voice interaction prompt to the user by: when the preset activity state is an idle state, sending a control instruction to the wearable device to control the wearable device to generate prompt information; the prompt information is used for prompting the target user to accept the voice interaction operation.
In an exemplary embodiment of the present disclosure, the operation processing unit sends the voice interaction operation to the user by: initiating the voice interaction operation to the target user through the terminal equipment identity identification number of the target user; or initiating the voice interaction operation to the target user through a game interaction interface of the game virtual character.
In an exemplary embodiment of the present disclosure, the operation processing unit performs a voice interaction operation with a user by: determining voice interaction content according to the activity state of the target user; and performing voice interaction operation with the target user by combining the voice interaction content.
In an exemplary embodiment of the present disclosure, the operation processing unit determines the voice interaction content according to a current activity state of the target user or determines the voice interaction content according to a historical activity state of the target user.
In an exemplary embodiment of the present disclosure, the operation processing unit generates the corresponding voice content by: acquiring a voice content template corresponding to the activity state of the target user; and generating the voice content by combining the activity state of the target user and the voice content template.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
FIG. 11 illustrates a schematic structural diagram of a computer system suitable for use in implementing an electronic device of an embodiment of the present disclosure.
It should be noted that the computer system 1100 of the electronic device shown in fig. 11 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 11, the computer system 1100 includes a Central Processing Unit (CPU) 1101, which can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 1102 or a program loaded from a storage section 1108 into a Random Access Memory (RAM) 1103. In the RAM 1103, various programs and data necessary for system operation are also stored. The CPU 1101, ROM 1102, and RAM 1103 are connected to each other by a bus 1104. An input/output (I/O) interface 1105 is also connected to bus 1104.
The following components are connected to the I/O interface 1105: an input portion 1106 including a keyboard, mouse, and the like; an output portion 1107 including a signal output unit such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and a speaker; a storage section 1108 including a hard disk and the like; and a communication section 1109 including a network interface card such as a LAN card, a modem, or the like. The communication section 1109 performs communication processing via a network such as the internet. A driver 1110 is also connected to the I/O interface 1105 as necessary. A removable medium 1111 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 1110 as necessary, so that a computer program read out therefrom is mounted into the storage section 1108 as necessary.
In particular, the processes described above with reference to the flowcharts may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 1109 and/or installed from the removable medium 1111. The computer program, when executed by a Central Processing Unit (CPU) 1101, performs various functions defined in the methods and apparatus of the present application. In some embodiments, computer system 1100 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the electronic device described in the above embodiment; or may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by one of the electronic devices, cause the electronic device to implement the method as described in the embodiments below. For example, the electronic device may implement various steps shown in the present disclosure, and the like.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (12)

1. A game interaction method, the method comprising:
acquiring biological activity data of a target user acquired by a wearable device;
determining a current activity state of the target user according to the biological activity data;
and responding to the current activity state of the target user as a preset state, and controlling the game virtual role and the target user to perform plot interactive operation corresponding to the activity state of the target user.
2. The method of claim 1, wherein the biological activity data comprises vital sign information data of the target user and activity information data of the target user.
3. The method of claim 1, wherein determining the current activity state of the target user from the biological activity data comprises:
acquiring corresponding weight information according to the category of the biological activity data;
and determining the current activity state of the target user according to each biological activity data and the corresponding weight information.
4. The method of claim 1, wherein the preset state comprises an idle state; and the step of controlling the game virtual character and the target user to perform plot interactive operation corresponding to the activity state of the target user in response to the current activity state of the target user being a preset state comprises the following steps:
and responding to the current activity state of the target user as an idle state, and controlling a game virtual character and the target user to perform voice interaction operation corresponding to the activity state of the target user.
5. The method of claim 4, wherein controlling the game avatar to perform voice interaction with the target user corresponding to the active state of the target user comprises:
sending a control instruction to the wearable device to control the wearable device to generate prompt information; the prompt information is used for prompting the target user to accept the voice interaction operation.
6. The method of claim 4, wherein initiating the voice interaction operation to the target user when the current activity state is the idle state comprises:
initiating the voice interaction operation to the target user through a terminal device identity identification number of the target user; or,
initiating the voice interaction operation to the target user through a game interaction interface of the game virtual character.
7. The method of claim 4, wherein controlling the game virtual character to perform the voice interaction operation with the target user corresponding to the activity state of the target user comprises:
determining voice interaction content according to the activity state of the target user;
performing the voice interaction operation with the target user based on the voice interaction content.
8. The method of claim 7, wherein determining the voice interaction content according to the activity state of the target user comprises:
determining the voice interaction content according to a current activity state of the target user; or,
determining the voice interaction content according to a historical activity state of the target user.
9. The method of claim 7, wherein determining the voice interaction content according to the activity state of the target user comprises:
acquiring a voice content template corresponding to the activity state of the target user;
generating the voice interaction content by combining the activity state of the target user with the voice content template.
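Claim 9's two steps (acquire a template matching the state, then combine the state details with it) can be sketched as a dictionary lookup plus string formatting; the template texts, state names, and field names are illustrative assumptions:

```python
# Hypothetical voice-content templates keyed by activity state (claim 9).
TEMPLATES = {
    "idle": "You have been resting for {minutes} minutes, shall we continue the story?",
    "active": "You walked {steps} steps today, your character gained energy!",
}


def generate_voice_content(state, details):
    template = TEMPLATES[state]        # acquire the template for the activity state
    return template.format(**details)  # combine state details with the template


print(generate_voice_content("idle", {"minutes": 20}))
```

Separating templates from state data is what lets the same interaction logic serve both the current-state and historical-state variants of claim 8.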
10. A game interaction apparatus, comprising:
an acquisition module, configured to acquire biological activity data of a target user collected by a wearable device;
an analysis module, configured to determine a current activity state of the target user according to the biological activity data;
a processing module, configured to, in response to the current activity state of the target user being a preset state, control a game virtual character to perform a plot interaction operation with the target user corresponding to the activity state of the target user.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the method of any one of claims 1-9.
12. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1-9 via execution of the executable instructions.
CN202210894162.9A 2022-07-27 2022-07-27 Game interaction method and device, storage medium and electronic equipment Pending CN115253272A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210894162.9A CN115253272A (en) 2022-07-27 2022-07-27 Game interaction method and device, storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115253272A true CN115253272A (en) 2022-11-01

Family

ID=83772572



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination