CN111659106B - Game interaction method, device, equipment and storage medium

Game interaction method, device, equipment and storage medium

Info

Publication number
CN111659106B
CN111659106B (application CN202010526868.0A)
Authority
CN
China
Prior art keywords
player
interaction
interactive object
interactive
trend
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010526868.0A
Other languages
Chinese (zh)
Other versions
CN111659106A (en)
Inventor
秦觅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd filed Critical Netease Hangzhou Network Co Ltd
Priority to CN202010526868.0A priority Critical patent/CN111659106B/en
Publication of CN111659106A publication Critical patent/CN111659106A/en
Application granted granted Critical
Publication of CN111659106B publication Critical patent/CN111659106B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20: Input arrangements for video game devices
    • A63F13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211: Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428: Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes

Abstract

The application provides a game interaction method, apparatus, device and storage medium, relating to the technical field of game interaction. The method comprises the following steps: collecting motion information of the VR terminal; judging, according to the motion information, whether the player has an interaction intention with an interactive object, wherein the motion information comprises at least one of the following items: displacement, acceleration and time; and if the player has an interaction intention with the interactive object, sending out interaction prompt information. Compared with the prior art, this solves the problem that low interaction efficiency increases the player's trial-and-error cost and degrades the game experience.

Description

Game interaction method, device, equipment and storage medium
Technical Field
The present application relates to the field of game interaction technologies, and in particular, to a game interaction method, apparatus, device, and storage medium.
Background
A Virtual Reality (VR) game is an immersive game that uses a computer to simulate a three-dimensional virtual world and provides the user with simulated senses such as vision, hearing and touch, so that the user feels personally present in the scene and can interact with objects in that space.
In the prior art, players generally move and interact with objects in a VR game by manipulating player controllers.
However, due to limitations of the human visual system and of VR imaging technology, this interaction mode requires the player to repeatedly attempt, by subjective judgment, to interact with objects in the game scene, which reduces interaction efficiency, degrades the experience, and increases the player's trial-and-error cost.
Disclosure of Invention
An object of the present application is to provide a game interaction method, apparatus, device and storage medium that address the deficiencies in the prior art, so as to solve the problems that interaction efficiency is low, the player's trial-and-error cost is increased, and the game experience is reduced.
In order to achieve the above purpose, the embodiments of the present application adopt the following technical solutions:
in a first aspect, an embodiment of the present application provides a game interaction method, which is applied to a VR terminal, and the method includes:
collecting motion information of the VR terminal;
and judging whether the player and the interactive object have the interaction intention according to the motion information, wherein the motion information comprises at least one of the following items: displacement, acceleration, time;
and if the player and the interactive object have the interaction intention, sending out interaction prompt information.
Optionally, the determining, according to the motion information, whether the player and the interactive object have an interaction intention includes:
acquiring motion trend and stable time information between the player and the interactive object according to the motion information;
and if the movement trend and the stable time meet preset conditions, determining that the player and the interactive object have an interaction intention.
Optionally, the obtaining, according to the motion information, motion trend and stabilization time information between the player and the interactive object includes:
determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration;
and determining the stable time with the motion trend according to the time.
Optionally, if the motion trend and the stable time satisfy preset conditions, determining that the player has an interaction intention with the interaction object includes:
and if the player has a movement trend towards the interactive object in the interactive trend determination area, the displacement is greater than a preset movement distance, and the stabilization time is greater than a preset time interval, determining that the player and the interactive object have an interactive intention.
Optionally, the determining, according to the displacement and the acceleration, a movement trend of the player towards the interactive object in an interactive trend determination area includes:
acquiring the interaction trend judgment area corresponding to the interaction object according to the type of the interaction object;
determining that the player enters the interaction trend determination area according to the distance between the player and the interaction object;
and determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration.
Optionally, the prompt message includes at least one of a visual prompt message, an audible prompt message, and a tactile prompt message.
Optionally, the sending out the interaction prompt message includes:
determining a corresponding prompt mode according to the type of the interactive object;
and sending the interactive prompt information according to the prompt mode.
Optionally, the determining, according to the type of the interactive object, a corresponding prompt manner includes:
if the interactive object is a contact type interactive object, displaying the prompt information aiming at the interactive object;
and if the interactive object is a space type interactive object, displaying the prompt information aiming at a space range.
In a second aspect, another embodiment of the present application provides a game interaction apparatus applied to a VR terminal, the apparatus including: collection module, judgement module and prompt module, wherein:
the acquisition module is used for acquiring the motion information of the VR terminal;
the judging module is used for judging whether the player and the interactive object have an interaction intention according to the motion information, wherein the motion information comprises at least one of the following items: displacement, acceleration, time;
and the prompt module is used for sending out interactive prompt information if the player has an interactive intention with the interactive object.
Optionally, the apparatus further comprises: an acquisition module and a determination module, wherein:
the acquisition module is used for acquiring the motion trend and the stable time information between the player and the interactive object according to the motion information;
the determining module is further configured to determine that the player has an interaction intention with the interaction object if the motion trend and the stabilization time meet preset conditions.
Optionally, the determining module is further configured to determine, according to the displacement and the acceleration, a movement trend of the player towards the interactive object in an interactive trend determination area;
the determining module is further configured to determine the stable time with the motion trend according to the time.
Optionally, the determining module is further configured to determine that there is an interaction intention between the player and the interaction object if the player has a movement trend towards the interaction object in the interaction trend determination area, the displacement is greater than a preset movement distance, and the stabilization time is greater than a preset time interval.
Optionally, the obtaining module is further configured to obtain the interaction trend determination area corresponding to the interaction object according to the type of the interaction object;
the determining module is further configured to determine that the player enters the interaction trend determination area according to a distance between the player and the interaction object;
the determining module is further configured to determine a movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration.
Optionally, the apparatus further comprises: the sending module is used for sending out voice or image-text prompt information, and the prompt information comprises: distance information between the player and the interactive object.
Optionally, the determining module is further configured to determine a corresponding prompt manner according to the type of the interactive object;
the sending module is further configured to send the interactive prompt information according to the prompt mode.
Optionally, the apparatus further comprises: the display module is used for displaying the prompt information aiming at the interactive object if the interactive object is a contact interactive object;
the display module is further configured to display the prompt message for a spatial range if the interactive object is a spatial interactive object.
In a third aspect, another embodiment of the present application provides a game interaction device, including: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the game interaction device is operated, the processor executing the machine-readable instructions to perform the steps of the method according to any one of the first aspect.
In a fourth aspect, another embodiment of the present application provides a storage medium having a computer program stored thereon, where the computer program is executed by a processor to perform the steps of the method according to any one of the above first aspects.
The beneficial effects of this application are:
by adopting the game interaction method provided by the application, whether the interaction intention exists between the current player and the interactive object is determined through the collected motion information of the VR terminal, if so, the interaction prompt information is sent, the user can determine the interactive object in the current range through the interaction prompt information, and the player can visually see the position of the interactive object in the current game scene according to the interaction prompt information, so that the interaction between the player and the interactive object can be completed according to the interaction prompt information in the subsequent game process, and the problems of low interaction efficiency and poor game experience of the player and the object in the game scene in the prior art are solved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required by the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered as limiting its scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic flow chart illustrating a game interaction method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart illustrating a game interaction method according to another embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a game interaction method according to another embodiment of the present application;
FIG. 4 is a schematic flow chart illustrating a game interaction method according to another embodiment of the present application;
FIG. 5 is a schematic view of a game interface provided in an embodiment of the present application;
FIG. 6 is a schematic view of a game interaction interface provided in another embodiment of the present application;
FIG. 7 is a schematic structural diagram of a game interaction device according to an embodiment of the present application;
FIG. 8 is a schematic structural diagram of a game interaction device according to another embodiment of the present application;
fig. 9 is a schematic structural diagram of a game interaction device according to an embodiment of the present application.
Detailed Description
To make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
Additionally, the flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In order to enable those skilled in the art to use the present disclosure, the following embodiments are given by taking an example of an interactive scene in a Virtual Reality (VR) game. It will be apparent to those skilled in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the application. It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
VR is a computer simulation system that creates and lets users experience a virtual world: a computer generates a simulated environment into which the user is immersed. Virtual reality technology combines real-world data with electronic signals generated by computer technology and various output devices to produce phenomena the user can perceive. These may be reproductions of real objects or substances invisible to the naked eye, expressed through three-dimensional models; because such phenomena are not observed directly but are simulated by computer technology, the result is called virtual reality.
In a VR game, a VR terminal is generally required, and optionally, in some possible embodiments, the VR terminal may be a head-mounted terminal, an integrated terminal, a split VR terminal, and the like, for example, VR glasses, a VR helmet, and the like, and is matched with other accessories, such as a handle, a moving ring, and the like, where the type of the specific VR terminal is not limited herein.
The VR terminal can play stand-alone games, i.e. local games without networking, or online games via the Internet. Alternatively, the VR terminal may be directly networked, i.e., directly connected to the server, or may be connected to the server through a game machine, a mobile terminal, a computer, or the like, without being particularly limited thereto.
It is worth noting that before the application is provided, in the existing VR technology, a game interaction process generally requires a player to subjectively determine a position relationship between the player and an interactive object in a scene, and interaction with the interactive object in the game scene is realized in a continuous trial and error manner.
In order to solve the problems in the prior art, the application provides a game interaction method that determines, from the motion information of the VR terminal, whether an interaction intention exists between the current player and an interactive object; if so, interaction prompt information is sent out. Through the prompt information the user can identify the interactive objects within the current range, so the player can visually see whether an interactive object exists nearby in the current game scene without continuous trial and error. This solves the prior-art problems of low interaction efficiency, poor game experience, and continuous trial and error when a player interacts with objects in a game scene.
The game interaction method provided by the embodiments of the present application is explained below with reference to several specific application examples. Fig. 1 is a schematic flowchart of a game interaction method according to an embodiment of the present application; the execution subject of the method may be the VR terminal, or another terminal connected to the VR terminal, such as a mobile phone or a computer, without limitation. As shown in fig. 1, the method includes:
s101: and collecting the motion information of the VR terminal.
The motion information is the motion information, in the virtual reality world, of the object controlled by the player of the current VR terminal. Optionally, the VR terminal may include a player controller, and the motion information of the player controller is collected as the motion information of the VR terminal.
For example: if the object the player controls in the virtual reality world is the character's two hands, the motion information is the motion information of the two hands in the virtual reality world; if the controlled object is a gun, the motion information is the motion information of the gun in the virtual reality world; if the controlled object is the whole character, the motion information is the motion information of the whole character in the virtual reality world. The type of controlled object can be determined according to the user's needs and is not limited to the embodiments described above.
Optionally, in an embodiment of the present application, the motion information may include at least one of: displacement, acceleration, time.
S102: and judging whether the player and the interactive object have the interaction intention or not according to the motion information.
An interactive object is a thing the player can interact with during the game; the interaction can be contact-based or contactless. For example, the interactive object may be a virtual object in the game, such as another player, an NPC, an item, or scenery in the scene, without limitation.
In some possible embodiments, the distance between the player and the interactive object may be determined from the motion information, and the interaction intention judged from that distance; or whether the player tends to move toward the interactive object may be determined from the motion information, and the intention judged from that result; or the time the player stays within a certain range around the interactive object may be determined from the motion information, and the intention judged from the duration of that stay. The specific manner of judging whether an interaction intention exists may be flexibly adjusted according to the user's needs and is not limited to the embodiments described above.
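The three cues above (proximity, movement trend, dwell time) can be combined into a minimal sketch. All thresholds (`max_distance_m`, `min_dwell_s`) and function names here are illustrative assumptions, not values from the patent:

```python
import math

def distance(player_pos, object_pos):
    """Euclidean distance between the player's controlled object and an interactive object."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(player_pos, object_pos)))

def has_interaction_intent(player_pos, object_pos, velocity, dwell_time_s,
                           max_distance_m=2.5, min_dwell_s=3.0):
    """Combine the three cues: close enough, moving toward the object,
    and having dwelled near it long enough."""
    close_enough = distance(player_pos, object_pos) <= max_distance_m
    # Movement trend: velocity has a positive projection on the
    # player-to-object direction.
    to_object = [o - p for p, o in zip(player_pos, object_pos)]
    moving_toward = sum(v * d for v, d in zip(velocity, to_object)) > 0
    return close_enough and moving_toward and dwell_time_s >= min_dwell_s
```

Any single cue could also be used alone, as the paragraph notes; conjoining all three is just one possible policy.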
S103: and if the player has the interaction intention with the interaction object, sending out interaction prompt information.
In some possible embodiments, the interaction prompt information may be a voice prompt, a color prompt, a vibration prompt, or a combination of several of these. The content and form of the prompt may be set according to the user's needs and are not limited to those given in the above embodiments.
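One way to organize such a choice is a lookup from object type to prompt modalities. The mapping below is purely hypothetical; the patent leaves the concrete modality choices to the implementer:

```python
# Hypothetical mapping from interactive-object type to prompt modalities.
PROMPT_MODALITIES = {
    "item":    ("visual", "haptic"),   # e.g. color highlight + vibration
    "npc":     ("visual", "audio"),    # e.g. highlight + voice prompt
    "scenery": ("visual",),
}

def prompt_modalities(obj_type: str):
    """Return the set of prompt modalities for an object type,
    defaulting to a visual prompt for unknown types."""
    return PROMPT_MODALITIES.get(obj_type, ("visual",))
```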
By adopting the game interaction method provided by the application, whether an interaction intention exists between the current player and an interactive object is determined from the collected motion information of the VR terminal; if so, interaction prompt information is sent out. Through this prompt information the user can identify the interactive objects within the current range, and the player can visually confirm whether an interactive object exists in the current game scene, so that the interaction can be completed by following the prompt information in the subsequent game process without repeated trial and error. This solves the prior-art problems of low interaction efficiency and poor game experience when a player interacts with objects in a game scene.
Optionally, on the basis of the above embodiments, the embodiments of the present application may further provide a game interaction method, which is described below with reference to the accompanying drawings. Fig. 2 is a schematic flowchart of a game interaction method according to another embodiment of the present application, and as shown in fig. 2, S102 may include:
s104: and acquiring the motion trend and the stable time information between the player and the interactive object according to the motion information.
The motion trend is the motion trend, in the virtual reality world, of the object controlled by the player; the stable time is the duration for which the controlled object maintains the same motion trend.
If the player maintains a stable motion trend for a certain time, the player is likely to keep moving along that trend; and if the current trend is movement toward the interactive object, the player probably intends to interact with it.
S105: and if the motion trend and the stable time meet the preset conditions, determining that the player and the interactive object have the interaction intention.
Optionally, if the current motion trend is that the player's controlled object is moving toward the interactive object, and the stable time for which this trend has been maintained is greater than a preset time threshold, the controlled object is moving steadily toward the interactive object; that is, it is determined that the player has an interaction intention with the interactive object. In one embodiment of the present application, the stable-time threshold may be set to 3-5 s, but the specific value may be flexibly adjusted according to the user's needs and is not limited to the above embodiment.
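The decision in this step reduces to a simple predicate. The 3-second default below is just the lower end of the 3-5 s range mentioned above and is an assumed value:

```python
def intent_confirmed(moving_toward_object: bool, stable_time_s: float,
                     threshold_s: float = 3.0) -> bool:
    """True when the controlled object is moving steadily toward the
    interactive object: the trend points at the object AND has been
    held longer than the preset time threshold."""
    return moving_toward_object and stable_time_s > threshold_s
```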
Judging the interaction intention from the motion trend and the stable time makes it possible to predict the player's motion trend at the next moment, and thus whether an interaction intention with the interactive object will exist at the next moment.
Optionally, on the basis of the above embodiments, the embodiments of the present application may further provide a game interaction method, which is described below with reference to the accompanying drawings. Fig. 3 is a schematic flowchart of a game interaction method according to another embodiment of the present application, and as shown in fig. 3, S104 may include:
s106: and determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration.
Optionally, in an embodiment of the present application, the interaction trend determination area may be an area within a preset range around the interactive object. Only after the player enters this area is the player's motion information collected; the current motion trend is then judged from the motion information to determine whether the player has an interaction intention with the interactive object.
In another embodiment of the application, the interaction trend determination area may instead be an area within a preset range around the player in the virtual reality world. Whether to collect the player's motion information is decided by checking whether an interactive object exists within that range: only when one does is the motion information collected and the current motion trend judged from it, so as to determine whether the player has an interaction intention with the interactive object.
For example, in some possible embodiments, the preset range may be a circular area in the virtual reality world centered on the player or the interactive object with a radius of 2-3 meters, but the size and determination manner of the preset range may be flexibly adjusted according to the user's needs and are not limited to the above embodiments.
In addition, in an embodiment of the present application, the area may be further subdivided, according to the distance between the player and the interactive object, into an interaction prompt area and an interaction effective area. The distance between the player and the interactive object may be determined jointly from the horizontal distance and the vertical (depth) distance between them. For example, suppose the player interacts with an object by moving two hands in the virtual world. If the hands are directly above the object (so the horizontal distance is nearly 0) but the depth distance is too large, the distance is not within the effective interaction range, and the player cannot interact with the object no matter how they operate. The interaction is triggered only when the player operates with both hands directly above the object and the depth distance is within the interaction effective area.
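The two-component distance check described here can be sketched as follows; both thresholds are illustrative assumptions rather than values given in the text:

```python
def in_effective_area(horizontal_dist_m: float, depth_dist_m: float,
                      max_horizontal_m: float = 0.3,
                      max_depth_m: float = 0.5) -> bool:
    """Interaction triggers only when BOTH components are in range.
    Hands directly above an object (horizontal distance ~0) but far
    away in depth must NOT trigger, so the components are checked
    independently rather than as one Euclidean distance."""
    return horizontal_dist_m <= max_horizontal_m and depth_dist_m <= max_depth_m
```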
The interaction prompt area reminds the player, through prompt information, that an interactive object exists in the current area; the interaction effective area reminds the player, through prompt information, to operate in the current area, and triggers the interaction between the player and the interactive object. In a possible embodiment, different prompt information can be shown depending on which area the player is in, so that the player can tell their current position from the prompts and use them as guidance for the next operation.
It should be noted that, among the three regions, the interaction trend determination region has the largest range, the interaction prompt region is smaller than the interaction trend determination region but larger than the interaction effective region, and the interaction effective region is the smallest. The interaction effective region is generally closest to the interactive object, although this may vary with the scene and is not limited herein.
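The nesting of the three regions can be sketched as a simple distance classifier. This is a hedged Python illustration; the radii are assumptions for the example, not values specified by the patent.

```python
def classify_zone(distance,
                  effective_radius=0.5,
                  prompt_radius=1.5,
                  determination_radius=3.0):
    """Classify the player's distance to an interactive object into the
    three nested regions described above (determination > prompt >
    effective). The radii are illustrative and must satisfy
    effective < prompt < determination."""
    if distance <= effective_radius:
        return "interaction effective area"
    if distance <= prompt_radius:
        return "interaction prompt area"
    if distance <= determination_radius:
        return "interaction trend determination area"
    return "out of range"
```

As the player approaches the object, the classification moves from the determination area, to the prompt area, and finally to the effective area where operations actually trigger the interaction.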
If the displacement direction of the player's corresponding manipulation object is toward the interactive object and the acceleration direction is also toward the interactive object, it is determined that the manipulation object currently has a movement trend toward the interactive object.
S107: determining, from the time, the stabilization time of the movement trend.
The time records how long the player maintains the movement trend toward the interactive object. If the player has a movement trend toward the interactive object within the interaction trend determination area, the displacement is greater than the preset movement distance, and the stabilization time is greater than the preset time interval, it is determined that the player has an interaction intention with the interactive object.
For example, in an embodiment of the present application, the stabilization time of the movement trend may be determined from the holding time of the trend. If the movement trend of the player's manipulation object keeps changing within a preset time threshold, the current player may merely be exploring the area and does not intend to interact with the interactive object. Only when the player maintains the same movement trend throughout the preset time threshold, and that trend is toward the interactive object, is an interaction intention between the player and the interactive object determined; the player is then prompted with the interaction prompt information. This improves the player's sense of immersion during the game and reduces the interaction failures and high trial-and-error cost caused by blind, repeated attempts.
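Steps S105-S107 can be summarized as a single intent predicate. The sketch below is a hedged Python illustration; the threshold values and parameter names are assumptions, not the patent's wording.

```python
def has_interaction_intent(displacement_toward, accel_toward,
                           displacement, stable_time,
                           min_distance=0.2, min_stable_time=0.5):
    """Intent exists only if the movement trend points toward the object
    (both the displacement direction and the acceleration direction),
    the displacement exceeds a preset movement distance, and the trend
    has been held for longer than a preset time interval.

    displacement_toward / accel_toward: booleans, direction toward object
    displacement: meters moved; stable_time: seconds the trend was held.
    Thresholds are illustrative assumptions."""
    trend_toward_object = displacement_toward and accel_toward
    return (trend_toward_object
            and displacement > min_distance
            and stable_time > min_stable_time)
```

All three conditions must hold simultaneously: a player whose hand drifts toward the object briefly (small stable time), or who merely sweeps past it (acceleration not toward the object), is treated as exploring rather than intending to interact.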
Optionally, on the basis of the above embodiments, the embodiments of the present application may further provide a game interaction method, which is described below with reference to the accompanying drawings. Fig. 4 is a schematic flowchart of a game interaction method according to another embodiment of the present application, and as shown in fig. 4, S106 may include:
S108: acquiring an interaction trend determination area corresponding to the interactive object according to the type of the interactive object.
In one embodiment of the application, in the interaction prompt area, the player may be given weaker prompt information about objects that are currently interactable; in the interaction effective area, the player may be given stronger prompt information, and performing an interactive operation in this area causes the interaction with the interactive object to take effect.
For example, in the embodiments provided in the present application, the prompt information includes at least one of visual prompt information (such as graphics, text, color, or highlighting), auditory prompt information (such as voice or music), and tactile prompt information (such as controller vibration of different intensities); the prompt information may further include the distance between the player and the interactive object. The following takes different kinds of prompt information as examples and describes in detail the specific prompting processes in different areas:
Embodiment one: taking voice prompt information as an example, after it is determined that the player has an interaction intention with the interactive object, the voice prompt information may indicate that the current object is interactable. Specifically, if the player is currently located in the interaction prompt area, regular voice prompts are issued at a preset time interval and preset volume; as the distance between the player and the interactive object shrinks and the player enters the interaction effective area, the prompt frequency or volume may be made larger and larger to indicate that the player is getting closer, so that the player can determine the distance relationship with the interactive object in time from the voice prompts.
Embodiment two: taking color prompt information as an example, after it is determined that the player has an interaction intention with the interactive object, the current interactive object may be marked in a preset marking color through the color prompt information. The marking may be done by enclosing the interactive object in a colored frame, or by directly filling the interactive object with the preset marking color; the specific marking manner may be flexibly adjusted and is not limited thereto. An object shown in the preset marking color is an interactive object with which the current player has an interaction intention. Specifically, if the player is currently located in the interaction prompt area, the interactive object is marked according to the preset marking manner; then, as the distance between the player and the interactive object shrinks and the player enters the interaction effective area, the brightness of the color prompt may be made higher and higher to indicate that the player is getting closer, so that the player can intuitively determine the distance relationship with the interactive object from the color prompt information.
Embodiment three: taking vibration prompt information as an example, after it is determined that the player has an interaction intention with the interactive object, the player is prompted by vibration that the current object is interactable. Specifically, if the player is currently located in the interaction prompt area, a slight vibration may indicate that the current object is interactable; as the distance between the player and the interactive object shrinks and the player enters the interaction effective area, the amplitude and/or frequency of the vibration may be made larger and larger to indicate that the player is getting closer, so that the player can determine the distance relationship with the interactive object in time from the vibration amplitude and/or frequency.
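The distance-graded prompt strength shared by the three embodiments above can be sketched as a single scaling function. This is a hedged Python illustration: the linear ramp and the radii are assumptions chosen for the example; a real implementation might use any monotone mapping from distance to volume, brightness, or vibration amplitude.

```python
def prompt_strength(distance, prompt_radius=1.5, effective_radius=0.5):
    """Map the player-object distance to a prompt strength in [0, 1]:
    zero outside the interaction prompt area, maximal inside the
    interaction effective area, and growing linearly in between.
    The same value can drive volume, color brightness, or vibration."""
    if distance > prompt_radius:
        return 0.0                      # outside the prompt area: silent
    if distance <= effective_radius:
        return 1.0                      # in the effective area: strongest
    # Linear interpolation between the two boundaries.
    return (prompt_radius - distance) / (prompt_radius - effective_radius)
```

For example, with the default radii a player 1.0 m from the object receives a half-strength prompt, which then ramps up to full strength as the player crosses into the effective area.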
Different prompt information is determined for different interaction areas, so the player can intuitively determine the distance to the interactive object from the prompt information and perform the subsequent interactive operation accordingly, which further reduces the player's trial-and-error cost and improves interaction efficiency. However, the prompt manner and prompt content of the specific interaction prompt information may be flexibly set and adjusted as needed and are not limited to those provided in this embodiment.
In the embodiments provided by the application, the corresponding prompt manner may be determined according to the type of the interactive object, and the interaction prompt information is then sent out according to that prompt manner.
If the interactive object is a contact-type interactive object, the prompt information is displayed for the interactive object itself; if the interactive object is a space-type interactive object, the prompt information is displayed for the spatial range.
By way of example, in one embodiment of the present application, a contact-type interactive object may be an object that can be picked up, such as a gold ingot, a treasure box, a plant, a bullet, a helmet, a gun, or a dart; it may also be a rocker that can be pulled, a tree that can be shaken, a Non-Player Character (NPC) that can be talked to by clicking, and the like. When the interactive object is a contact-type interactive object, its surroundings may be divided into an interaction prompt area and an interaction effective area according to the distance between the player and the object, and the prompt information may be displayed in the manner described in any one of embodiments one to three; that is, different prompt manners are determined according to the distance between the interactive object and the player, and the closer the player is to the interactive object, the stronger the prompt.
A space-class interactive object may be a spatially delimited object, such as a window, a door, or a delivery area. When the interactive object is a space-class interactive object, only the interaction prompt area exists and there is no interaction effective area; that is, the player is only prompted, based on the distance to the object, that an interactable object currently exists. For example, in a shooting game in which the player drives a car while holding a gun, the player needs to extend the gun out past the windshield before shooting; otherwise, if the gun does not extend out of the windshield when firing, the shot harms the player himself. The windshield here is a space-class interactive object, and only when the player is within the range of the interaction prompt area does the interaction prompt information indicate that the windshield is currently an interactive object.
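The type-dependent prompting just described can be sketched as a small dispatch function. This is a hedged Python illustration; the type labels and returned prompt descriptions are assumptions for the example, not the patent's wording.

```python
def choose_prompt(object_type, in_prompt_area, in_effective_area):
    """Contact-type objects get distance-graded prompts (weak in the
    prompt area, strong in the effective area); space-class objects
    (windows, doors, delivery areas) only have a prompt area, so at most
    a weak existence prompt on the spatial range is ever shown."""
    if not in_prompt_area:
        return None                     # too far away: no prompt at all
    if object_type == "contact":
        if in_effective_area:
            return "strong prompt on object"
        return "weak prompt on object"
    if object_type == "spatial":
        return "weak prompt on spatial range"
    raise ValueError(f"unknown object type: {object_type}")
```

A dart would be queried with `object_type="contact"` and escalate its prompt as the player's hand closes in, whereas a windshield (`object_type="spatial"`) never escalates because it has no effective area.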
S109: and determining that the player enters the interaction trend determination area according to the distance between the player and the interaction object.
S110: and determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration.
After it is determined that the player has a movement trend toward the interactive object in the interaction trend determination area, the stabilization time of the trend is determined from its holding time. Specifically, if the player maintains the same movement trend throughout the preset time threshold, and that trend is toward the interactive object, an interaction intention between the player and the interactive object is determined. The corresponding interaction prompt information is then determined according to the type of the interactive object and the distance between the player and the object, so that when judging the distance to the interactive object, the player can rely not only on subjective judgment but also read the distance intuitively from the interaction prompt information, improving the player's game experience.
Fig. 5 is a schematic view of a game interaction interface provided in an embodiment of the present application, and Fig. 6 is a schematic view of a game interaction interface provided in another embodiment of the present application. The following takes a contact-type interactive object, a dart, as an example:
As shown in Fig. 5, it is determined from the distance between the player and the interactive objects that the player is currently located in the interaction prompt area; the objects enclosed by the dotted line in Fig. 5 are the interactive objects within the current range, indicating that the player is relatively close to them. In the interaction prompt area, the interactive objects within the player's range can be indicated by color marking, sound, or vibration, and the player can adjust his or her position by moving and determine, from the prompt information, whether an interactive object exists in the current range and where it is located. If, starting from Fig. 5, the player controls the VR terminal so that the hand in the virtual reality world moves still closer to the dart in the scene, the hand becomes close enough to enter the preset interaction effective area of the interactive object outlined in Fig. 6; the prompt information then indicates, in a stronger manner, the interactive object corresponding to the player's current interaction effective area, and if the player operates within the current range, the interactive operation with the object outlined in Fig. 6 can be completed.
In addition, optionally, taking a non-contact interactive object (a space-class interactive object) such as a window as an example: suppose a player in a shooting game takes on a task that requires leaning out of a window to finish shooting, so the window is a virtual interactive object. After it is determined from the distance between the player and the window that the player is currently located in the interaction prompt area, and that the player's movement trend is toward the window, prompt information for the window is sent out, for example: displaying a text prompt within the window range; making the window edge change color and flicker; controlling the VR device to vibrate; or issuing a voice prompt, and so on, which is not specifically limited herein.
With the game interaction method provided by the present application, the movement trend of the corresponding manipulation object is determined from the motion information of the VR terminal; the corresponding interaction prompt information is then determined according to the movement trend, its holding time, and the type of the interactive object; and the distance between the player and the interactive object can be determined from the interaction prompt information. The interaction distance is thus fed back to the user intuitively, achieving the effect of front-loading interaction feedback: the player can use the interaction prompt information as a reference for the next movement direction, which improves the player's interaction efficiency in VR games, reduces the trial-and-error cost, and solves the prior-art problems of high trial-and-error cost and poor game immersion.
The following explains the game interaction device provided in the present application with reference to the drawings, where the game interaction device can execute any one of the game interaction methods shown in fig. 1 to 6, and specific implementation and beneficial effects of the game interaction device refer to the above description, which is not described again below.
Fig. 7 is a schematic structural diagram of a game interaction device according to an embodiment of the present application; as shown in fig. 7, the device includes: a collection module 201, a judgment module 202, and a prompt module 203, wherein:
and the acquisition module 201 is used for acquiring the motion information of the VR terminal.
The determining module 202 is configured to determine whether the player and the interactive object have an interaction intention according to the motion information, where the motion information includes at least one of: displacement, acceleration, time.
And the prompt module 203 is used for sending out interaction prompt information if the player has an interaction intention with the interaction object.
Fig. 8 is a schematic structural diagram of a game interaction device according to an embodiment of the present application, and as shown in fig. 8, the device further includes: an obtaining module 204 and a determining module 205, wherein:
and an obtaining module 204, configured to obtain, according to the motion information, motion trend and stable time information between the player and the interactive object.
The determining module 205 is further configured to determine that the player has an interaction intention with the interaction object if the motion trend and the stabilization time satisfy the preset conditions.
Optionally, the determining module 205 is further configured to determine a movement trend of the player towards the interactive object in the interactive trend determination area according to the displacement and the acceleration.
The determining module 205 is further configured to determine a stable time with a motion trend according to the time.
Optionally, the determining module 205 is further configured to determine that the player and the interactive object have an interaction intention if the player has a motion trend of the interactive object in the interaction trend determining area, the displacement is greater than the preset motion distance, and the stabilization time is greater than the preset time interval.
Optionally, the obtaining module 204 is further configured to obtain an interaction trend determination area corresponding to the interaction object according to the type of the interaction object.
The determining module 205 is further configured to determine that the player enters the interaction tendency determination area according to a distance between the player and the interaction object.
The determining module 205 is further configured to determine a movement trend of the player towards the interactive object in the interactive trend determination area according to the displacement and the acceleration.
As shown in fig. 8, the apparatus further includes: a sending module 206, configured to send out a voice or image-text prompt message, where the prompt message includes: distance information between the player and the interactive object.
Optionally, the determining module 205 is further configured to determine a corresponding prompting manner according to the type of the interactive object.
The sending module 206 is further configured to send an interactive prompt message according to the prompt mode.
As shown in fig. 8, the apparatus further includes: and the display module 207 is configured to display prompt information for the interactive object if the interactive object is a contact interactive object.
The display module 207 is further configured to display prompt information for the spatial range if the interactive object is a spatial interactive object.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
The above modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU), or another processor capable of calling program code. As yet another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 9 is a schematic structural diagram of a game interaction device according to an embodiment of the present application, where the game interaction device may be integrated in a terminal device or a chip of the terminal device.
As shown in fig. 9, the game interaction apparatus includes: a processor 501, a storage medium 502, and a bus 503.
The storage medium 502 is used for storing a program, and the processor 501 calls the program stored in the storage medium 502 to execute the method embodiments corresponding to Figs. 1-6. The specific implementation and technical effects are similar and are not described again here.
Optionally, the present application also provides a program product, such as a storage medium, on which a computer program is stored, including a program, which, when executed by a processor, performs embodiments corresponding to the above-described method.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.

Claims (9)

1. A game interaction method is applied to a VR terminal, and comprises the following steps:
collecting motion information of the VR terminal;
and judging whether the player and the interactive object have interaction intention or not according to the motion information, wherein the motion information comprises at least one of the following items: displacement, acceleration, time;
if the player and the interactive object have the interaction intention, sending out interaction prompt information;
the judging whether the player and the interactive object have the interaction intention according to the motion information comprises the following steps:
acquiring motion trend and stable time information between the player and the interactive object according to the motion information;
if the movement trend and the stable time meet preset conditions, determining that the player and the interactive object have an interaction intention;
determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration;
and determining the stable time with the motion trend according to the duration of the motion trend.
2. The method of claim 1, wherein determining that the player has an interaction intention with the interactive object if the motion trend and the stabilization time satisfy preset conditions comprises:
and if the player has a movement trend towards the interactive object in the interactive trend determination area, the displacement is greater than a preset movement distance, and the stabilization time is greater than a preset time interval, determining that the player and the interactive object have an interaction intention.
3. The method of claim 1 or 2, wherein the determining the movement trend of the player towards the interactive object in the interactive trend determination area according to the displacement and the acceleration comprises:
acquiring the interaction trend judgment area corresponding to the interaction object according to the type of the interaction object;
determining that the player enters the interaction trend determination area according to the distance between the player and the interaction object;
and determining the movement trend of the player to the interactive object in the interactive trend determination area according to the displacement and the acceleration.
4. The method of claim 1, wherein the cue information comprises at least one of a visual cue information, an audible cue information, and a tactile cue information.
5. The method of claim 1 or 4, wherein the sending out the interactive prompt message comprises:
determining a corresponding prompt mode according to the type of the interactive object;
and sending the interactive prompt information according to the prompt mode.
6. The method of claim 5, wherein the determining a corresponding prompting mode according to the type of the interactive object comprises:
if the interactive object is a contact interactive object, displaying the prompt information aiming at the interactive object;
and if the interactive object is a space type interactive object, displaying the prompt information aiming at the space range.
7. A game interaction apparatus, for application to a VR terminal, the apparatus comprising: collection module, judgement module and prompt module, wherein:
the acquisition module is used for acquiring the motion information of the VR terminal;
the judging module is configured to judge whether the player and the interactive object have an interaction intention according to the motion information, where the motion information includes at least one of: displacement, acceleration, time;
the prompt module is used for sending out interactive prompt information if the player has interactive intention with the interactive object;
the device further comprises: an acquisition module and a determination module, wherein:
the acquisition module is used for acquiring the motion trend and the stable time information between the player and the interactive object according to the motion information;
the determining module is further configured to determine that the player and the interactive object have an interaction intention if the motion trend and the stabilization time meet preset conditions;
the determining module is further used for determining the movement trend of the player to the interactive object in the interactive trend judging area according to the displacement and the acceleration;
the determining module is further configured to determine a stable time with the motion trend according to the duration of the motion trend.
8. A game interaction device, the device comprising: a processor, a storage medium and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating via the bus when the game interaction device is operated, the processor executing the machine-readable instructions to perform the method of any one of claims 1 to 6.
9. A storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, performs the method of any of the preceding claims 1-6.
CN202010526868.0A 2020-06-10 2020-06-10 Game interaction method, device, equipment and storage medium Active CN111659106B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010526868.0A CN111659106B (en) 2020-06-10 2020-06-10 Game interaction method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN111659106A CN111659106A (en) 2020-09-15
CN111659106B true CN111659106B (en) 2023-03-31

Family

ID=72386953

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010526868.0A Active CN111659106B (en) 2020-06-10 2020-06-10 Game interaction method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111659106B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110543236A (en) * 2019-07-31 2019-12-06 苏州浪潮智能科技有限公司 Machine room monitoring system and method based on virtual reality technology




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant