CN106569602B - Interaction method and device for behavior data - Google Patents


Info

Publication number
CN106569602B
CN106569602B (application CN201610970785.4A)
Authority
CN
China
Prior art keywords
preset
speed
target object
behavior
behavior data
Prior art date
Legal status
Active
Application number
CN201610970785.4A
Other languages
Chinese (zh)
Other versions
CN106569602A (en)
Inventor
张童飞 (Zhang Tongfei)
姜进 (Jiang Jin)
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201610970785.4A priority Critical patent/CN106569602B/en
Publication of CN106569602A publication Critical patent/CN106569602A/en
Application granted granted Critical
Publication of CN106569602B publication Critical patent/CN106569602B/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Abstract

The invention discloses a behavior data interaction method and device. The method comprises the following steps: detecting first behavior data generated by a target object while simulating a flight action; when the first behavior data is detected, judging whether the speed data in the first behavior data meets a preset condition; and if the speed data meets the preset condition, controlling the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene, where the first preset speed corresponds to the speed data. The invention solves the technical problem in the related art that the behavior of the target object differs greatly from the behavior of the virtual character.

Description

Interaction method and device for behavior data
Technical Field
The invention relates to the field of computers, in particular to a behavior data interaction method and device.
Background
At present, existing behavior interaction technology performs behavior data interaction by twisting a head-mounted display device (head display) or a handle. Because the scene the user perceives by twisting the head display or the handle differs from the actual scene in which the body is located, the user easily becomes dizzy while experiencing virtual reality, so this technology cannot give the user a good experience. Table 1 compares the mainstream products on the market whose technology involves flight interaction.
TABLE 1 Comparison of interactive control modes
[Table 1 was filed as an image (BDA0001143794650000011); its contents are not reproduced in the text.]
Currently, a mainstream behavior interaction method controls the change of position of the user's corresponding virtual character in a preset scene through the head display, for example controlling the rise and fall of the virtual character's position in the air. Fig. 1 is a schematic view of controlling flight steering by a head-mounted display device according to one example of the related art. As shown in fig. 1, in the game Mount wingunit, flight steering is controlled by the head-mounted display device: the user shown in the middle of the figure steers the corresponding virtual character by turning the head-mounted display device. Fig. 2 is a schematic view of controlling flight steering by a head-mounted display device according to another example of the related art. As shown in fig. 2, in the game Eagle Flight, the user controls flight turns through the head-mounted display device, including controlling the speed, altitude, and flight range of the virtual character. Since the direction of the virtual character is controlled by the head display, the difference between what the user sees and how the body rotates is further increased, which intensifies the user's feeling of vertigo. In addition, in the game Leave The Nest, the flight direction of the user's virtual character is changed through the keys of the handle; this mode differs greatly from the experience of real flight, so the user cannot experience direction changes under natural conditions, and the user's immersion is poor. The speed increase of the virtual character can also be controlled by clicking the handle; this interaction mode neither makes full use of the various leading-edge sensor technologies nor provides the user with the greatest fun and immersion.
According to physical principles, a bird obtains upward lift by flapping its wings and thereby overcomes gravity to gain upward flying speed. None of the current interaction modes takes the gravity factor into account, so they remain far from real experience, and the user cannot experience the intense immersion of a flying bird by flapping the arms.
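As a toy illustration of this principle (not part of the patent), the upward speed gained per flap must outweigh the speed lost to gravity over the flap interval. The impulse value and flap duration below are illustrative assumptions:

```python
# Toy model of the stated principle: each flap adds an upward impulse
# speed, while gravity subtracts G * dt over the flap interval.
G = 9.8  # gravitational acceleration, m/s^2

def vertical_speed_after_flap(v0: float, flap_impulse_speed: float,
                              dt: float) -> float:
    """Vertical speed after one flap of duration dt seconds."""
    return v0 + flap_impulse_speed - G * dt
```

Under these assumed numbers, a flap adding 3 m/s over 0.25 s outweighs the roughly 2.45 m/s lost to gravity, so the net vertical speed rises, while a weaker 1 m/s flap does not.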
The prior art also ignores a high-frequency scene experienced in flight; for example, the gliding flight most common among birds is not embodied in behavior interaction technology.
In view of the above problem in the related art that the behavior of the target object differs greatly from the behavior of the virtual character, no effective solution has yet been proposed.
Disclosure of Invention
The embodiment of the invention provides a behavior data interaction method and device, which at least solve the technical problem in the related art that the behavior of a target object differs greatly from the behavior of a virtual character.
According to an aspect of the embodiment of the invention, an interaction method of behavior data is provided. The interaction method of behavior data comprises the following steps: detecting first behavior data generated by a target object while simulating a flight action; when the first behavior data is detected, judging whether the speed data in the first behavior data meets a preset condition; and if the speed data meets the preset condition, controlling the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene, where the first preset speed corresponds to the speed data.
According to another aspect of the embodiment of the invention, an interaction device for behavior data is also provided. The interaction device for behavior data comprises: a detection unit, configured to detect first behavior data generated by the target object while simulating a flight action; a judging unit, configured to judge, when the first behavior data is detected, whether the speed data in the first behavior data meets a preset condition; and a first control unit, configured to control, when the speed data is judged to meet the preset condition, the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene, where the first preset speed corresponds to the speed data.
In the embodiment of the invention, an interaction method of behavior data is adopted: first behavior data generated by a target object while simulating a flight action is detected; when the first behavior data is detected, it is judged whether the speed data in the first behavior data meets a preset condition; and if the speed data meets the preset condition, the virtual character corresponding to the target object is controlled to execute a first preset flight behavior at a first preset speed in a preset scene, where the first preset speed corresponds to the speed data. This achieves behavior interaction between the target object and the virtual character, produces the technical effect of reducing the difference between the behavior of the target object and the behavior of the virtual character, and thereby solves the technical problem in the related art that this difference is large.
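The three claimed steps (detect, judge, control) can be sketched as a minimal state function. This is an illustrative sketch only: the class name, the threshold value, and the returned state strings are assumptions, not terms from the claims.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorData:
    """Assumed shape of the first behavior data."""
    speed: float       # magnitude of the gesture speed
    ascending: bool    # True if the speed direction is the ascending direction

PRESET_THRESHOLD = 1.0  # assumed preset-condition threshold

def interact(detected: Optional[BehaviorData]) -> str:
    """Return the flight behavior the virtual character should execute."""
    if detected is None:
        return "idle"                 # no first behavior data detected
    if detected.ascending and detected.speed > PRESET_THRESHOLD:
        return "first_preset_flight"  # take off at the first preset speed
    return "idle"                     # preset condition not met: cancel control
```

For example, `interact(BehaviorData(2.0, True))` triggers the first preset flight behavior, while a downward or too-slow gesture leaves the character idle.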
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
FIG. 1 is a schematic diagram of controlling flight steering by a head mounted display device according to one of the related art;
FIG. 2 is a schematic diagram of controlling flight steering by a head mounted display device according to another of the related art;
FIG. 3 is a diagram of a hardware environment for a method of interaction of behavioral data, according to an embodiment of the invention;
FIG. 4 is a flow chart of a method of interaction of behavioral data according to an embodiment of the invention;
FIG. 5 is a flow diagram of another interaction method for behavioral data, according to an embodiment of the present invention;
fig. 6 is a flowchart of a method for controlling a flight behavior of a virtual character performing steering at a second preset speed in a preset scene according to an embodiment of the present invention;
FIG. 7 is a schematic illustration of a physical model of bird flight according to an embodiment of the invention;
FIG. 8 is a flow diagram of another interaction method for behavioral data, according to an embodiment of the present invention;
FIG. 9A is a diagram illustrating the behavior of a target object to fan an arm according to an embodiment of the present invention;
FIG. 9B is a schematic illustration of a virtual character fly-up according to an embodiment of the present invention;
FIG. 10A is a diagram illustrating a backward swinging of an arm by a target object according to an embodiment of the present invention;
FIG. 10B is a schematic illustration of an accelerated progress of a virtual character, in accordance with an embodiment of the present invention;
FIG. 11A is a schematic diagram of a target object with a right arm higher than a left arm according to an embodiment of the invention;
FIG. 11B is a schematic diagram of a target object with a lower right arm than a left arm, in accordance with embodiments of the present invention;
FIG. 12 is a schematic view of a virtual character changing flight direction according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of an interaction means for behavioral data according to an embodiment of the present invention;
FIG. 14 is a schematic diagram of another interaction means for behavioral data, according to an embodiment of the present invention; and
fig. 15 is a block diagram of a terminal according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
According to an embodiment of the present invention, an embodiment of an interaction method for behavior data is provided.
Alternatively, in this embodiment, the interaction method of the behavior data may be applied to a hardware environment formed by the server 302 and the terminal 304 as shown in fig. 3. Fig. 3 is a schematic diagram of a hardware environment of an interaction method of behavior data according to an embodiment of the present invention. As shown in fig. 3, the server 302 connects to the terminal 304 over a network that includes, but is not limited to, a wide area network, a metropolitan area network, or a local area network; the terminal 304 includes, but is not limited to, a PC, a mobile phone, a tablet computer, and the like. The interaction method of the behavior data in the embodiment of the present invention may be executed by the server 302, by the terminal 304, or by both the server 302 and the terminal 304. When the terminal 304 executes the interaction method of the behavior data of the embodiment of the present invention, the method may also be executed by a client installed on the terminal.
Fig. 4 is a flowchart of an interaction method of behavior data according to an embodiment of the present invention. As shown in fig. 4, the method may include the steps of:
step S402, detecting first behavior data generated by the target object during the simulated flight action.
In the technical solution provided by step S402 of the present invention, first behavior data generated by the target object during the simulated flight action is detected.
The target object simulates a flight action. For example, when the target object is a target user, the target user simulates the wing-flapping action of a bird by flapping the arms in a vertical plane. The target object may also be a wearable device, for example the handle of a Virtual Reality (VR) device, and the target user may simulate a bird flapping its wings by waving the handle in a vertical plane. First behavior data generated by the target object while simulating the flight action is detected to obtain a detection result; the detection result indicates either that first behavior data generated by the target object while simulating the flight action was detected, or that it was not. That is, it is determined whether the target object generates first behavior data corresponding to the simulated flight action. Optionally, the flight action simulated by the target object is a wing-flapping action, and whether the target object performs the wing-flapping action is detected through a photosensitive sensor.
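As a sketch of this detection step, suppose (hypothetically) that the photosensitive sensor exposes a boolean flapping flag and the VR handle reports a gesture speed; neither interface is named in the patent, so both are stand-ins for a real device SDK.

```python
from typing import Optional

def detect_first_behavior(sensor_saw_flapping: bool,
                          handle_gesture_speed: float) -> Optional[float]:
    """Return the gesture speed as first behavior data when a
    wing-flapping action is detected, else None (nothing detected)."""
    if sensor_saw_flapping:
        return handle_gesture_speed
    return None
```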
The scene in which the first behavior data generated by the target object while simulating the flight action is detected may be a game scene or a social scene.
Step S404, determining whether the speed data in the first behavior data meets a preset condition.
In the technical solution provided by step S404 of the present invention, when the first behavior data is detected, it is determined whether the speed data in the first behavior data meets a preset condition.
The first behavior data includes speed data generated by the target object while simulating the flight action. The speed data indicates the speed of the target object, including the direction and magnitude of the speed, e.g., the rate of ascent of the target object in a vertical plane. After the first behavior data generated by the target object while simulating the flight action is detected, it is judged whether the speed data in the first behavior data meets a preset condition. It may be judged whether the rising speed indicated by the speed data is greater than the vertical falling speed and whether the speed direction in the speed data is the ascending direction. For example, using sensor technology, whether the target object performs a wing-flapping action is detected through a photosensitive sensor; when the target object performs the wing-flapping action, the gesture speed of the target object is obtained through the VR device, the first behavior data includes the gesture speed of the target object, and it is judged whether the gesture speed meets the preset condition.
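A minimal sketch of this judgment, assuming the VR device reports a 3-D gesture velocity with the y axis vertical and up-positive; the threshold value standing in for the preset vertical falling speed is an assumption:

```python
PRESET_FALL_SPEED = 1.5  # assumed preset vertical falling speed, m/s

def speed_data_meets_condition(velocity_xyz) -> bool:
    """True when the speed direction is ascending and the rising speed
    exceeds the preset vertical falling speed."""
    _vx, vy, _vz = velocity_xyz   # y assumed vertical, up-positive
    return vy > 0 and vy > PRESET_FALL_SPEED
```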
A bird obtains upward lift by flapping its wings and thereby overcomes gravity to gain upward flying speed; the speed data in the first behavior data accordingly includes speed data obtained by overcoming gravity. This avoids the phenomenon that the behavior of the virtual character is far removed from the real experience of the target object because the two behaviors differ greatly, and achieves the effect that the target user can experience the feeling of overcoming gravity while experiencing flight.
Detecting the first behavior data generated by the target object while simulating the flight action and judging whether the speed data in the first behavior data meets the preset condition together form the take-off stage of the behavior interaction between the target object and its corresponding virtual character, and determine whether the virtual character is controlled to start taking off in the preset scene.
Step S406, the virtual character corresponding to the target object is controlled to execute a first preset flight behavior at a first preset speed in a preset scene.
In the technical solution provided in step S406 of the present invention, if it is determined that the speed data meets the preset condition, the virtual character corresponding to the target object is controlled to execute a first preset flight behavior at a first preset speed in the preset scene, where the first preset speed corresponds to the speed data.
After it is judged whether the speed data in the first behavior data meets the preset condition, the judgment may succeed as follows: if the speed in the speed data is greater than the vertical falling speed and the speed direction in the speed data is the ascending direction, it is determined that the speed data in the first behavior data meets the preset condition. The speed of the target object's virtual character in the preset scene may be calculated from the speed data and preset parameters to obtain the first preset speed, so that the virtual character corresponding to the target object is controlled to execute the first preset flight behavior in the preset scene at the first preset speed corresponding to the speed data, where the virtual character is the target object's avatar in the preset scene. The first preset flight behavior may be a take-off behavior of the virtual character in the preset scene: for example, by flapping its wings the virtual character obtains an ascending speed in the preset scene, leaves the ground according to the ascending speed, and flies according to a forward speed, where the first preset speed includes the ascending speed and the forward speed of the virtual character in the preset scene. After the virtual character corresponding to the target object is controlled to execute the first preset flight behavior at the first preset speed in the preset scene, different flight behaviors of the virtual character in the preset scene can be further controlled through different behavior actions of the target object.
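The mapping from the detected gesture speed to the first preset speed is not specified beyond "corresponds to the speed data", so the linear scaling and the two gain constants below are illustrative assumptions:

```python
ASCENT_GAIN = 0.8   # assumed preset parameter for the ascending speed
FORWARD_GAIN = 0.5  # assumed preset parameter for the forward speed

def first_preset_speed(gesture_speed: float):
    """Return (ascending_speed, forward_speed) for the virtual character,
    scaled from the gesture speed by preset parameters."""
    return ASCENT_GAIN * gesture_speed, FORWARD_GAIN * gesture_speed
```

Under these assumed gains, a gesture speed of 2.0 would yield an ascending speed of 1.6 and a forward speed of 1.0.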
Optionally, when a wing-flapping action of the target object is detected, the gesture speed of the target object is obtained through the VR device. When the gesture speed meets the preset condition, the speed of the virtual character corresponding to the target object in the preset scene is calculated from the gesture speed, and the virtual character can obtain an upward flying speed and a forward speed in the preset scene. This achieves the purpose of the virtual character flapping its wings and flying in the preset scene, reduces the difference between the behavior of the target object and the behavior of the virtual character, and lets the target user obtain a flight experience in the preset scene. It changes the traditional interaction mode of controlling the flight direction through the head display, overcomes the vertigo caused by the target user rotating the head display, and brings the target user an optimal sense of immersion in the preset scene, so that the target user can simulate the flight of a bird, or of a character in a movie, in the preset scene and obtain a stronger experience.
It should be noted that the preset scene is a virtual reality scene, for example a virtual reality scene in a game application or in a social application. By controlling the virtual character corresponding to the target object to execute the first preset flight behavior at the first preset speed in the virtual reality scene, the difference between the behavior of the target object and the behavior of the virtual character is reduced, bringing the target user an optimal flight experience and sense of immersion in the virtual reality scene.
Through the above steps S402 to S406, first behavior data generated by the target object while simulating a flight action is detected; when the first behavior data is detected, it is judged whether the speed data in the first behavior data meets a preset condition; and if the speed data meets the preset condition, the virtual character corresponding to the target object is controlled to execute a first preset flight behavior at a first preset speed in the preset scene, where the first preset speed corresponds to the speed data. This can solve the technical problem in the related art that the behavior of the target object differs greatly from the behavior of the virtual character, and achieves the technical effect of reducing that difference.
As an optional implementation manner, after the first behavior data of the target object in the simulated flight action is detected, if the first behavior data is not detected, control of the virtual character corresponding to the target object is canceled; and after it is judged whether the speed data in the first behavior data meets the preset condition, if the speed data does not meet the preset condition, control of the virtual character corresponding to the target object is likewise canceled.
Whether the target object simulates the wing-flapping action, and thus whether first behavior data is generated, is detected through the photosensitive sensor. If the photosensitive sensor does not detect the target object simulating the wing-flapping action, that is, if the first behavior data is not detected, control of the virtual character corresponding to the target object is canceled so that the virtual character remains in its original state; for example, the virtual character corresponding to the target object stays on the ground, on a mountain, in the water, or in similar scenes within the virtual reality scene. If the first behavior data is detected, it is judged whether the rising speed indicated by the speed data in the first behavior data is greater than the vertical falling speed; if it is not, it is determined that the speed data does not meet the preset condition, and control of the virtual character corresponding to the target object is canceled, for example by leaving the virtual character on the ground, on a mountain, or in the water.
In this way, after the photosensitive sensor detects whether first behavior data is generated when the target object simulates the wing-flapping action, control of the virtual character is canceled when the first behavior data is not detected, and likewise canceled when the speed data does not meet the preset condition, thereby achieving the purpose of interaction between the behavior of the target object and the behavior of the virtual character.
As an optional implementation manner, canceling control of the virtual character corresponding to the target object includes: controlling the virtual character corresponding to the target object to stop in the preset scene.
First behavior data of the target object in the simulated flight action is detected. If the first behavior data is not detected, that is, no wing-flapping action of the target object is detected, the virtual character corresponding to the target object is controlled to stop in the preset scene, for example to rest on the ground, on a mountain, or in the water. When the target object does not simulate the flight action, the virtual character stops in the preset scene, so the behavior of the target object remains consistent with that of the virtual character: the target user perceives that, when no flight action is simulated, the virtual character matches the user's own behavior. This reduces the difference between the behavior of the target object and the behavior of the virtual character and reduces the vertigo caused by the mismatch between the target user's visual field and body movements.
As an optional implementation manner, after the virtual character corresponding to the target object is controlled to execute the first preset flight behavior at the first preset speed in the preset scene, the virtual character is controlled to execute a second preset flight behavior at a second preset speed in the preset scene when second behavior data generated by the target object while simulating the flight action conforms to preset behavior data.
Fig. 5 is a flowchart of another interaction method for behavior data according to an embodiment of the present invention. As shown in fig. 5, the interaction method of behavior data further includes the following steps:
step S501, judging whether second behavior data generated by the target object during the flight action simulation accords with preset behavior data.
In the technical solution provided in step S501 of the present invention, it is determined whether second behavior data generated by the target object during the simulated flight action conforms to preset behavior data.
After the virtual character corresponding to the target object is controlled to execute the first preset flight behavior at the first preset speed in the preset scene, that is, after the virtual character takes off at the first preset speed, the target object continues to simulate flight behavior. For example, the target user holds the handles of the virtual reality device with both hands in a V shape to simulate the posture of a bird accelerating forward during flight; tilts the handles with both arms to simulate the posture of a bird changing direction in the air; or keeps both arms horizontal while holding the handles to simulate the posture of a bird gliding in the air. Second behavior data generated while the target object continues to simulate the flight action is detected to obtain a detection result, which indicates either that such second behavior data was detected or that it was not. When second behavior data generated while the target object continues to simulate the flight action is detected, it is judged whether the second behavior data conforms to the preset behavior data.
The second behavior data includes speed data indicating the speed of the target object's simulated flight action after the virtual character has executed the first preset flight behavior at the first preset speed. The preset behavior data may be the standard behavior data corresponding to the target user simulating the posture of a bird accelerating forward during flight, the standard behavior data corresponding to the target object simulating the posture of a bird changing direction in the air, or the standard behavior data corresponding to the target object simulating the posture of a bird gliding in the air.
And step S502, controlling the virtual character to execute a second preset flight behavior according to a second preset speed in a preset scene.
In the technical solution provided in step S502 of the present invention, if it is determined that the second behavior data conforms to the preset behavior data, the virtual character is controlled to execute a second preset flight behavior at a second preset speed in the preset scene, where the preset behavior data corresponds to the second preset flight behavior.
And after judging whether second behavior data generated by the target object in the process of simulating the flight action conforms to the preset behavior data or not, controlling the flight action of the virtual character in the preset scene to correspond to the flight action simulated by the target object under the condition of judging that the second behavior data conforms to the preset behavior data.
Optionally, when it is judged that the second behavior data generated by the target user holding the handles of the virtual reality device with both hands in a V shape conforms to the standard behavior data for simulating the posture of a bird accelerating forward in flight, the virtual character is controlled to execute the accelerated forward flight behavior at the second preset speed in the preset scene; when it is judged that the second behavior data generated by the target user tilting the handles of the virtual reality device with both arms conforms to the standard behavior data for simulating the posture of a bird changing direction in the air, the virtual character is controlled to execute the direction-changing flight behavior at the second preset speed in the preset scene; and when it is judged that the second behavior data generated by the target user keeping both arms horizontal while holding the handles of the virtual reality device conforms to the standard behavior data for simulating the posture of a bird gliding in the air, the virtual character is controlled to execute the gliding flight behavior at the second preset speed in the preset scene. In this way the behavior of the target object is kept consistent with the behavior of the virtual character; the target user feels that the virtual character's behavior matches the flight action being simulated, the difference between the behavior of the target object and the behavior of the virtual character is reduced, and the dizziness caused by incoordination between the target user's visual field and body actions is reduced.
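The three gesture-to-behavior mappings above can be sketched as a simple dispatch. This is a minimal illustration only; all labels and names are hypothetical, as the patent does not specify an API:

```python
# Hypothetical mapping from a recognized gesture label to the second
# preset flight behavior the virtual character should execute.
GESTURE_TO_BEHAVIOR = {
    "v_shape_both_hands": "accelerate_forward",  # V-shape posture, accelerating flight
    "arms_tilted":        "change_direction",    # tilted arms, turning flight
    "arms_horizontal":    "glide",               # horizontal arms, gliding flight
}

def select_flight_behavior(gesture: str) -> str:
    """Return the preset flight behavior matching the detected gesture,
    or keep the current behavior when no standard gesture matches."""
    return GESTURE_TO_BEHAVIOR.get(gesture, "keep_current")
```

A real system would compare the detected second behavior data against the standard behavior data before emitting a gesture label; the dispatch above only covers the final selection step.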
According to the embodiment, after the virtual role corresponding to the target object is controlled to execute the first preset flight behavior according to the first preset speed in the preset scene, whether second behavior data generated by the target object in the process of simulating the flight action accords with the preset behavior data is judged; and if the second behavior data are judged to be in accordance with the preset behavior data, the virtual character is controlled to execute a second preset flight behavior at a second preset speed under the preset scene, wherein the preset behavior data correspond to the second preset flight behavior, interaction of the behavior data of the target object and the virtual character is realized, and the difference between the behavior of the target object and the behavior of the virtual character is reduced.
As an optional implementation manner, after the virtual character is controlled to execute the second preset flight behavior at the second preset speed in the preset scene, when the rising speed indicated by the rising speed data in the second behavior data remains greater than the preset descending speed, the virtual character corresponding to the target object is controlled to continue executing the second preset flight behavior at the second preset speed in the preset scene; and when the rising speed is not greater than the preset descending speed, the virtual character corresponding to the target object is controlled to descend in the preset scene.
After the virtual character is controlled to execute a second preset flight behavior according to a second preset speed in a preset scene, judging whether the ascending speed indicated by the ascending speed data in the second behavior data is greater than a preset descending speed, wherein the preset descending speed is the speed of the virtual character descending along the vertical direction under the influence of gravity only; if the ascending speed is judged to be larger than the preset descending speed, the virtual role corresponding to the target object is controlled to continue to execute a second preset flight behavior according to a second preset speed under the preset scene; and if the rising speed is judged to be not greater than the preset falling speed, controlling the virtual role corresponding to the target object to fall in the preset scene.
The second behavior data includes rising speed data indicating the rising speed of the target object. A bird in flight is always subject to gravitational acceleration, and obtains upward lift by flapping its wings, thereby gaining an upward flight speed that overcomes gravity. If the rising speed is judged to be greater than the preset descending speed, the virtual character corresponding to the target object is controlled to continue executing the second preset flight behavior at the second preset speed in the preset scene; if the rising speed is judged to be not greater than the preset descending speed, the virtual character corresponding to the target object is controlled to descend in the preset scene, optionally down to the ground of the preset scene. The experience of overcoming gravity that the target user has in a real scene is thereby reproduced in the preset scene, so that when simulating flight actions the target user feels that the virtual character's behavior is consistent with his own; the difference between the behavior of the target object and the behavior of the virtual character is reduced, the dizziness caused by incoordination between the target user's visual field and body actions is reduced, and an optimal flying experience and sense of immersion are brought to the target user.
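The climb-or-descend decision above can be sketched as follows. The function and state names are illustrative, not from the patent; the 2 m/s reference value for the gravity-fall speed comes from the avatardownSpeed parameter described later in this document:

```python
PRESET_DESCENT_SPEED = 2.0  # m/s, gravity-fall speed (reference value from the patent)

def update_vertical_state(rising_speed: float) -> str:
    """Continue the second preset flight behavior while the rising speed
    indicated by the rising speed data exceeds the preset descending
    speed; otherwise control the virtual character to descend."""
    if rising_speed > PRESET_DESCENT_SPEED:
        return "continue_second_preset_flight"
    return "descend"
```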
As an optional implementation manner, for the above interaction method of behavior data, determining whether the speed data in the first behavior data meets a preset condition includes: judging whether the ascending speed indicated by the ascending speed data in the first behavior data is greater than a preset descending speed, wherein the preset descending speed is a speed of the virtual character descending along the vertical direction under the influence of gravity only; if the speed data are judged to meet the preset conditions, the step of controlling the virtual role corresponding to the target object to execute a first preset flight behavior according to a first preset speed under the preset scene comprises the following steps: and if the ascending speed is larger than the preset descending speed, controlling the virtual character corresponding to the target object to leave the ground of the preset scene according to the ascending speed, and executing flight behavior according to the advancing speed in the first preset speed, wherein the first preset speed comprises the ascending speed.
The first preset speed at which the virtual character corresponding to the target object executes the first preset flight behavior in the preset scene includes a rising speed and a forward speed; that is, the virtual character flies both upward and forward in the preset scene. It is judged whether the rising speed indicated by the rising speed data in the first behavior data is greater than the preset descending speed; when it is, the virtual character leaves the ground of the preset scene at the rising speed and flies forward at the forward speed, so that the aim of controlling the virtual character to fly in the preset scene is fulfilled, the difference between the behavior of the target object and the behavior of the virtual character is reduced, the dizziness caused by incoordination between the target user's visual field and body actions is reduced, and the target user can feel the pleasure of flying in the preset scene and obtain a stronger sense of immersion.
As an optional implementation manner, the determining whether the rising speed is greater than the preset falling speed includes: calculating the rising speed V1 by the following first preset formula: V1 = k × V, where k is a velocity coefficient and V is the gesture velocity of the target object.
When the virtual character corresponding to the target object ascends in the vertical plane, its ascent speed is calculated as V1 = k × V, where V is the gesture speed of the target object when simulating the flight action and k is the velocity coefficient, that is, the swing vertical distance coefficient, whose data type may be single precision (float). The ascent speed of the virtual character can be adjusted by the velocity coefficient to the value best suited to the preset scene; preferably, k = 1.
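A minimal Python sketch of the first preset formula; the function name and argument names are illustrative, and the default k = 1 follows the preferred value stated above:

```python
def ascent_speed(gesture_speed: float, k: float = 1.0) -> float:
    """First preset formula: V1 = k * V.

    gesture_speed -- V, the measured gesture speed of the target object
    k             -- velocity coefficient (swing vertical distance
                     coefficient), float; the patent prefers k = 1
    """
    return k * gesture_speed
```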
As an optional implementation manner, the virtual character is controlled to execute the gliding flight behavior at the second preset speed in the preset scene by judging whether the second behavior data generated by the target object when simulating the flight action is the behavior data generated when the target object keeps both arms horizontally unfolded.
Judging whether second behavior data generated by the target object during the flight action simulation conforms to preset behavior data comprises the following steps: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated when the target object keeps the two arms horizontally unfolded or not, wherein the preset behavior data comprises the behavior data generated when the target object keeps the two arms horizontally unfolded; if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior at a second preset speed under the preset scene comprises the following steps: and if the second behavior data is judged to be the behavior data generated when the target object keeps the two arms horizontally unfolded, controlling the virtual character to execute the gliding flight behavior according to the forward speed and the descending speed smaller than the preset descending speed under the preset scene, wherein the second preset speed comprises the forward speed and the descending speed, and the second preset flight behavior comprises the gliding flight behavior.
The preset behavior data may be the standard behavior data corresponding to the target object simulating the posture of a bird gliding in the air. When the target object simulates the gliding posture of a bird in flight by keeping both arms horizontally unfolded, the generated second behavior data conforms to the preset behavior data, and the virtual character is controlled to execute the gliding flight behavior at the second preset speed in the preset scene. The second preset speed includes a forward speed and a descending speed smaller than the preset descending speed; a swing horizontal distance coefficient may be set to adjust the second preset speed of the virtual character in the preset scene according to the speed data in the second behavior data, and the data type of the swing horizontal distance coefficient is single precision, preferably 1. The virtual character corresponding to the target user keeps flying forward at its existing speed while descending at a descending speed lower than the preset descending speed, realizing the gliding behavior of the virtual character in the preset scene, so that the behavior of the target object is consistent with that of the virtual character; the target user feels that, while simulating the gliding flight action, the virtual character also glides in the preset scene in a gliding posture, the difference between the behavior of the target object and the behavior of the virtual character is reduced, and the dizziness caused by incoordination between the target user's visual field and body actions is reduced.
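The gliding speed components above can be sketched as follows. The glide_factor is an illustrative stand-in for the swing horizontal distance coefficient; the patent only requires that the descending speed stay below the preset descending speed:

```python
def glide_velocity(forward_speed: float,
                   preset_descent_speed: float = 2.0,
                   glide_factor: float = 0.5) -> tuple:
    """While both arms stay horizontally unfolded, keep the existing
    forward speed and descend more slowly than the pure gravity fall.
    glide_factor < 1 guarantees descent below the preset descending
    speed (illustrative value, not specified by the patent)."""
    descent = glide_factor * preset_descent_speed  # strictly below free fall
    return forward_speed, descent
```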
As an optional implementation manner, the virtual character is controlled to execute the accelerated forward flight behavior at the second preset speed in the preset scene by judging whether the second behavior data generated by the target object when simulating the flight action is the behavior data generated when the target object extends the arms backward.
Judging whether second behavior data generated by the target object during the flight action simulation conforms to preset behavior data comprises the following steps: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated by the target object when the arm extends backwards, wherein the preset behavior data comprises behavior data generated by the target object when the arm extends backwards; if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior at a second preset speed under the preset scene comprises the following steps: and if the second behavior data is judged to be the behavior data generated when the target object stretches the arm backwards, controlling the virtual character to execute the accelerated forward flight behavior according to a second preset speed under the preset scene, wherein the second preset speed is greater than the forward speed, and the second preset flight behavior comprises the accelerated forward flight behavior.
A bird is subject to gravitational acceleration throughout its flight. When flying in the air it acquires, depending on the situation, a new acceleration and a new speed, and flies forward with them. The target object simulates the posture of a bird accelerating forward in the air by extending the arms backward during flight; the second behavior data generated at this time conforms to the standard behavior data corresponding to that posture, and the virtual character is controlled to execute the accelerated forward flight behavior in the preset scene at a second preset speed higher than the original forward speed, realizing the accelerated forward behavior of the virtual character in the preset scene. The behavior of the target object is thereby kept consistent with that of the virtual character, and the target user feels that, while simulating the accelerating forward action, the virtual character also accelerates forward in the preset scene in an accelerating forward posture, so that the difference between the behavior of the target object and the behavior of the virtual character is reduced, and the dizziness caused by incoordination between the target user's visual field and body actions is reduced.
Optionally, the target object simulates the posture of a bird decelerating while flying forward in the air; the generated second behavior data conforms to the standard behavior data corresponding to that posture, and the virtual character is controlled to execute the decelerated forward flight behavior in the preset scene at a second preset speed lower than the original forward speed, realizing the decelerated forward behavior of the virtual character in the preset scene. The behavior of the target object is thereby kept consistent with that of the virtual character, and the target user feels that, while simulating the decelerating forward action, the virtual character also decelerates forward in the preset scene in a decelerating forward posture, so that the difference between the behavior of the target object and the behavior of the virtual character is reduced, and the dizziness caused by incoordination between the target user's visual field and body actions is reduced.
As an optional implementation manner, controlling the virtual character to execute the accelerated forward flight behavior at the second preset speed in the preset scene includes: calculating the second preset speed V2 by the following second preset formula: V2 = V0 + k × V + A × t, where V0 is the initial speed of the virtual character in the preset scene, k is the velocity coefficient, V is the gesture speed of the target object, A is the acceleration that makes the second preset speed of the virtual character in the preset scene change with time, and t is the acceleration time.
In the behavior data interaction program, starting from the initial speed V0, the gesture speed of the target object is scaled by the velocity coefficient and the change of speed over time is added, so that the second preset speed V2 at which the virtual character accelerates in the preset scene is calculated by the formula V2 = V0 + k × V + A × t. The backward-swipe gesture speed V of the user is detected through the positioning technology of the virtual reality device; this gesture speed V is the speed data in the first behavior data, so the drawback of delay between the behaviors of the target object and the virtual character can be effectively avoided. A is the acceleration that makes the second preset speed of the virtual character in the preset scene change with time: when A is greater than 0, it is an acceleration that makes the speed increase over time, and A can be made greater than 0 by pressing the trigger of the virtual reality device; when A is less than 0, it is an acceleration that makes the speed decrease over time, and A can be made less than 0 by releasing the trigger. t is the acceleration time, that is, the time over which the second preset speed V2 changes: when A is greater than 0, t is the time for which the trigger of the virtual reality device has been pressed; when A is less than 0, t is the time since the trigger was released.
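The second preset formula and the trigger-driven sign of A can be sketched in Python as follows. The function name and the acceleration magnitude accel_mag are illustrative; the patent specifies only the formula V2 = V0 + k × V + A × t and the press/release semantics:

```python
def accelerated_speed(v0: float, gesture_speed: float, t: float,
                      trigger_pressed: bool,
                      k: float = 1.0, accel_mag: float = 1.5) -> float:
    """Second preset formula: V2 = V0 + k*V + A*t.

    A > 0 while the VR trigger is pressed (speed grows with time),
    A < 0 after it is released (speed decays).  accel_mag is an
    illustrative magnitude, not a value given in the patent."""
    a = accel_mag if trigger_pressed else -accel_mag
    return v0 + k * gesture_speed + a * t
```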
As an alternative implementation manner, the virtual character is controlled to execute the turning flight behavior at the second preset speed in the preset scene by judging whether the second behavior data generated by the target object when simulating the flight action is the behavior data generated when the target object's arms are inclined relative to the horizontal position.
Judging whether second behavior data generated by the target object during the flight action simulation conforms to preset behavior data comprises the following steps: judging whether second behavior data generated by the target object during the simulated flight action is behavior data generated when the double arms of the target object are inclined relative to the horizontal position or not, wherein the preset behavior data comprises the behavior data generated when the double arms of the target object are inclined relative to the horizontal position; if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior at a second preset speed under the preset scene comprises the following steps: and if the second behavior data is judged to be the behavior data generated when the double arm of the target object inclines relative to the horizontal position, controlling the virtual character to execute the turning flight behavior according to a second preset speed under the preset scene, wherein the second preset flight behavior comprises the turning flight behavior.
The preset behavior data can be standard behavior data corresponding to the posture of the target object simulating the bird turning in the air, the target object simulating the turning posture of the bird in the air flight process, the behavior that the two arms incline relative to the horizontal position in the flight process can be the behavior that the right arm is higher than the left arm, or the behavior that the right arm is lower than the left arm, the second behavior data generated at the moment accord with the preset behavior data, and the virtual character is controlled to execute the turning flight behavior according to the second preset speed in the preset scene.
As an optional implementation manner, the controlling of the virtual character to execute the flying behavior of the turning at the second preset speed in the preset scene includes: detecting a distance deviation of the double arms of the target object relative to a horizontal position; under the condition that the distance deviation that the left arm of the target object is higher than the right arm of the target object is detected to be within a first threshold range, controlling the virtual character to execute a flying behavior turning to the right side according to a second preset speed under a preset scene; and under the condition that the distance deviation that the left arm of the target object is lower than the right arm of the target object is detected to be within a second threshold value range, controlling the virtual character to perform flight behavior turning to the left side at a second preset speed under a preset scene.
Detecting the distance deviation of the target object's two arms relative to the horizontal position includes detecting the deviation by which the left arm is higher than the right arm and the deviation by which the left arm is lower than the right arm: it is judged whether the height difference between the two handles of the virtual reality device falls within a deviation threshold range, the direction is identified when the target object's two arms deviate from the horizontal position by the threshold amount, and the change of direction of the virtual character corresponding to the target object in the preset scene is controlled accordingly. The deviation threshold range is set by a test value in the behavior data interaction program, and the data type of the test value is single precision. When the distance deviation by which the left arm of the target object is higher than the right arm is detected to be within the first threshold range, the virtual character is controlled to execute the flight behavior of turning to the right side at the second preset speed in the preset scene; when the distance deviation by which the left arm is lower than the right arm is detected to be within the second threshold range, the virtual character is controlled to execute the flight behavior of turning to the left side at the second preset speed in the preset scene. The posture of a bird changing direction in the air is thereby simulated, and the virtual character corresponding to the target object obtains a change of flight direction in the preset scene, achieving the aim of making the behavior of the target object consistent with the behavior of the virtual character; the difference between the behavior of the target object and the behavior of the virtual character is reduced, and the vertigo caused by incoordination between the target user's visual field and body actions is reduced.
Fig. 6 is a flowchart of a method for controlling a flight behavior of a virtual character performing steering at a second preset speed in a preset scene according to an embodiment of the present invention. As shown in fig. 6, the method for controlling the virtual character to perform the steered flying behavior at the second preset speed in the preset scene includes the following steps:
in step S601, a distance deviation of the both arms of the target object with respect to the horizontal position is detected.
In the technical solution provided by step S601 of the present invention, a distance deviation of the both arms of the target object with respect to the horizontal position is detected.
If the second behavior data are judged to accord with the preset behavior data, the distances of the left arm and the right arm of the target object relative to the horizontal position are respectively detected, the difference between the distance of the left arm relative to the horizontal position and the distance of the right arm relative to the horizontal position is calculated, when the difference between the distances is larger than 0, the distance deviation of the left arm higher than the right arm of the target object is determined, and when the difference between the distances is smaller than 0, the distance deviation of the left arm lower than the right arm of the target object is determined.
Step S602, in a case that it is detected that the distance deviation of the left arm of the target object higher than the right arm of the target object is within the first threshold range, controlling the virtual character to execute a flying behavior turning to the right side at a second preset speed in a preset scene.
In the technical solution provided in step S602 of the present invention, after the distance deviation of the target object's two arms relative to the horizontal position is detected, in the case that the distance deviation by which the left arm of the target object is higher than the right arm of the target object is detected to be within the first threshold range, the virtual character is controlled to perform the flying behavior of turning to the right side at the second preset speed in the preset scene.
And step S603, in the case that it is detected that the distance deviation of the left arm of the target object being lower than the right arm of the target object is within the second threshold range, controlling the virtual character to perform a flying behavior of turning to the left side at a second preset speed in a preset scene.
In the technical solution provided in step S603 of the present invention, after the distance deviation of the target object's two arms relative to the horizontal position is detected, if it is detected that the distance deviation by which the left arm of the target object is lower than the right arm of the target object is within the second threshold range, the virtual character is controlled to perform the flying behavior of turning to the left side at the second preset speed in the preset scene.
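The threshold comparison of steps S601 to S603 can be sketched as follows. The threshold ranges are illustrative test values (the patent leaves them to a single-precision test value in the program), and all names are hypothetical:

```python
FIRST_THRESHOLD = (0.1, 0.5)   # illustrative range (m): left arm above right
SECOND_THRESHOLD = (0.1, 0.5)  # illustrative range (m): left arm below right

def steering_direction(left_height: float, right_height: float) -> str:
    """Compare the heights of the two handles and turn only when the
    deviation falls inside the corresponding threshold range."""
    diff = left_height - right_height
    if FIRST_THRESHOLD[0] <= diff <= FIRST_THRESHOLD[1]:
        return "turn_right"   # left arm higher than right arm
    if SECOND_THRESHOLD[0] <= -diff <= SECOND_THRESHOLD[1]:
        return "turn_left"    # left arm lower than right arm
    return "straight"         # deviation below threshold: no turn
```

Requiring the deviation to reach the lower bound of the range before turning is what prevents small, unintentional arm tilts from changing the flight direction.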
This embodiment sets a deviation threshold for the target object's two arms during flight, and identifies the direction of the virtual character corresponding to the target object in the preset scene once the arms deviate from the horizontal position by the threshold amount, so as to control the direction transition of the virtual character. This avoids the dizziness caused in traditional VR products, which control direction by rotating the head-mounted display and thereby produce incoordination between the target user's visual field and body actions. By recognizing the height difference between the two arms and the horizontal direction in a way that simulates how a flying bird turns, the change of the virtual character's orientation in the preset scene is driven, which effectively reduces the difference between the behavior of the target object and that of the virtual character and weakens the target user's dizziness.
As an optional implementation manner, after first behavior data generated by a target object during flight action simulation is detected, filtering displacement data smaller than a preset amplitude in the first behavior data through a preset filtering parameter to obtain filtered behavior data; judging whether the speed data in the first behavior data meet the preset conditions comprises the following steps: judging whether the speed data in the filtered behavior data meet preset conditions or not; if the speed data are judged to meet the preset conditions, the step of controlling the virtual role corresponding to the target object to execute a first preset flight behavior according to a first preset speed under the preset scene comprises the following steps: and if the speed data in the filtered behavior data is judged to meet the preset conditions, controlling the virtual role corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene.
After first behavior data generated by a target object during flight action simulation are detected, preset filtering parameters are obtained, displacement data smaller than a preset amplitude in the first behavior data are obtained, and the displacement data smaller than the preset amplitude in the first behavior data are filtered according to the preset filtering parameters to obtain filtered behavior data, so that a small-amplitude flapping gesture of the target object at an initial stage is effectively filtered. And judging whether the speed data in the filtered behavior data meets a preset condition, and if so, controlling the virtual role corresponding to the target object to execute a first preset flight behavior at a first preset speed under a preset scene, thereby effectively providing more accurate behavior interaction experience for the target user.
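The small-amplitude filtering step above can be sketched as follows. The threshold value and function name are illustrative; the patent only states that displacement data below a preset amplitude is filtered out by a preset filtering parameter:

```python
def filter_small_flaps(displacements, min_amplitude: float = 0.05):
    """Drop displacement samples whose amplitude is below the preset
    threshold, so small warm-up flapping gestures in the initial stage
    do not trigger take-off.  min_amplitude is an illustrative preset
    filtering parameter, not a value given in the patent."""
    return [d for d in displacements if abs(d) >= min_amplitude]
```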
As an optional implementation manner, the preset scene is a virtual reality scene, and detecting first behavior data generated by the target object when simulating the flight action includes: in the laser positioning range, first behavior data generated when the target object simulates flight actions are detected through laser positioning.
The behavior data interaction method of the embodiment is applied to a virtual reality scene. In a virtual reality scene, in a laser positioning range, first behavior data generated by a target object in a simulated flight action are detected through laser positioning, and actions of handles worn by both hands of a target user in the laser positioning range can be accurately detected through laser positioning so as to acquire the first behavior data. Under the condition that the first behavior data are detected, judging whether the speed data in the first behavior data meet a preset condition or not; and if the speed data are judged to meet the preset conditions, controlling the virtual character corresponding to the target object to execute a first preset flight behavior according to a first preset speed in the virtual reality scene, wherein the first preset speed corresponds to the speed data, so that the technical effect of reducing the difference between the behavior of the target object and the behavior of the virtual character is realized, and the technical problem of large difference between the behavior of the target object and the behavior of the virtual character in the related technology is solved.
Example 2
The technical solution of the present invention will be described below with reference to preferred embodiments.
FIG. 7 is a schematic illustration of a physical model of bird flight in accordance with an embodiment of the present invention. As shown in fig. 7, a bird in flight is always subject to the gravitational acceleration g. When the bird flaps its wings, it obtains an upward acceleration a1 and a rising speed v1; when it spreads its wings in the air, it obtains a new acceleration a2 and a new speed v2; and, depending on the situation, the bird flies forward with an acceleration a3 and a speed v3.
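This physical model can be sketched as a simple per-tick velocity update. The sketch below is purely illustrative: the numeric values and the names flap_accel, glide_accel and forward_accel are assumptions, not taken from the patent.

```python
G = 9.8  # gravitational acceleration g (m/s^2), always acting downward

def step_velocity(v_up, v_fwd, dt, flapping=False, gliding=False,
                  flap_accel=15.0, glide_accel=2.0, forward_accel=3.0):
    """Advance vertical and forward speed by one time step dt.

    flap_accel models a1 (flapping -> rising speed v1), glide_accel models
    a2 (wings spread in the air -> v2), forward_accel models a3 (-> v3).
    """
    a_up = -G                      # gravity always applies
    if flapping:
        a_up += flap_accel         # a1: upward acceleration from flapping
    if gliding:
        a_up += glide_accel        # a2: extra lift with wings spread
    v_up = v_up + a_up * dt
    # a3: forward acceleration only while actively flying
    v_fwd = v_fwd + (forward_accel if (flapping or gliding) else 0.0) * dt
    return v_up, v_fwd
```

With flap_accel larger than G, flapping yields a net upward speed; with no input, gravity dominates and the vertical speed goes negative.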
Based on this analysis, if the speed and acceleration values of the physical model are incorporated into the program's interaction process, the user's flying experience can be faithfully reproduced, giving the user an exhilarating sense of flight. The parameters involved are shown in Table 2.
TABLE 2: Behavior data interaction parameter table
[Table 2 appears in the original as an image (BDA0001143794650000211); its parameters are described individually below.]
Parameter avatardownSpeed: the fixed gravity-fall speed; the reference value is preferably 2 m/s, and the parameter is of type float. It makes the user descend vertically at this speed in the VR scene, reproducing the experience of descending under gravitational acceleration in a real scene.
The parameter avatarforwardMutiplier: the swing horizontal-distance coefficient, of type float. This parameter defines the interaction by which the user accelerates forward in flight; the program uses the formula V = V0 + k × V (gesture speed) + a × t (trigger press/release time). The speed V of the user's backward-swipe gesture is detected by the positioning technology of the VR device and combined with the time t after the user presses the trigger to obtain the final accelerated value. Detecting the gesture speed V through the VR device effectively overcomes the large difference between the behavior of the target object and the behavior of the virtual character that time delay would otherwise cause.
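A minimal sketch of this forward-speed formula; the function and parameter names are illustrative assumptions:

```python
def forward_speed(v0, k, gesture_speed, accel, trigger_time):
    """Forward-speed formula V = V0 + k * V(gesture) + a * t.

    v0: initial forward speed V0 of the virtual character;
    k: swing horizontal-distance coefficient (avatarforwardMutiplier);
    gesture_speed: backward-swipe speed V detected by the VR positioning system;
    accel: acceleration (positive) or deceleration (negative) a;
    trigger_time: time t the trigger has been pressed or released.
    """
    return v0 + k * gesture_speed + accel * trigger_time
```

For example, with v0 = 1.0, k = 1.0, a gesture speed of 2.0 and an acceleration of 0.5 applied for 2 s, the resulting speed is 4.0.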
Parameter avatarupMutiplier: the flap vertical-distance coefficient. This parameter defines the interaction by which the user rises in the vertical plane, implemented by the formula V' = k × V (gesture speed), where the reference value of the speed coefficient k is preferably 1. The parameter adjusts the rising speed of the virtual character in the preset scene to the value best suited to the scene.
The parameter avatarupThreshold: the rising threshold. This parameter achieves the filtering effect; its reference value is determined by testing. Displacement data smaller than the preset amplitude in the first behavior data are filtered out by this preset filtering parameter to obtain filtered behavior data, so that small-amplitude flapping gestures made by the user at the initial stage are effectively filtered out, effectively providing a more accurate interaction experience for the user.
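The filtering role of avatarupThreshold can be sketched as follows; the sample format (a list of signed displacement values) and the threshold value used in the example are assumptions:

```python
def filter_small_flaps(displacements, up_threshold):
    """Discard displacement samples whose amplitude is below the preset
    threshold (avatarupThreshold), so that small initial flapping
    gestures do not trigger flight."""
    return [d for d in displacements if abs(d) >= up_threshold]
```

With a threshold of 0.1, the samples 0.01 and 0.05 are dropped while 0.2 and -0.3 pass through to the speed-condition check.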
The parameter latelthreshold: the turning threshold. This parameter defines a threshold for the deviation between the target user's hands during flight: when the deviation of the horizontal positions of the two arms reaches this threshold, the inclination of the two arms is recognized and the user's flight direction is changed accordingly. The reference value is determined by testing.
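A hypothetical sketch of this direction control, matching the behavior later described in steps S805–S806 (left arm higher than right turns the character right, and vice versa); the function name and return values are assumptions:

```python
def steering(left_height, right_height, threshold):
    """Map the arm-height deviation to a turn direction once it
    exceeds the turning threshold."""
    delta = left_height - right_height
    if delta > threshold:
        return "right"   # left arm higher than right arm -> fly right
    if delta < -threshold:
        return "left"    # right arm higher than left arm -> fly left
    return "straight"    # deviation within threshold -> keep heading
```

Deviations within the threshold are ignored, which prevents small unintentional tilts from changing the flight direction.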
In this embodiment, the user holds the handles of the virtual reality device in both hands and simulates a bird flapping its wings, so that the character in the preset scene obtains an upward speed; the user keeps both arms level so that the character can glide through the sky like a bird. This interaction posture involves the avatardownSpeed, avatarupMutiplier and avatarupThreshold parameters in the program. The user holds both hands in a V shape to simulate the posture of a bird accelerating forward in flight, so that the character in the VR world advances at an accelerated speed; this interaction posture involves the avatarforwardMutiplier parameter. By controlling the rising, direction control, accelerated advance and so on of the corresponding virtual character in the preset scene through the behavior data of the target object, the difference between the behaviors of the target user and those of the virtual character is reduced, bringing the user more immersion and a stronger experience while flying through the preset scene, and relieving the vertigo the target user feels when experiencing the VR world.
Fig. 8 is a flowchart of another interaction method of behavior data according to an embodiment of the present invention. As shown in fig. 8, the interaction method of behavior data includes the following steps:
Step S801, superimposing a vertical, ground-directed speed V1.
Birds overcome gravity by flapping their wings to gain lift and thus upward flight speed. To take gravity into account, the vertical, ground-directed speed V1 is superimposed into the flight process, so that the user experiences the feeling of overcoming gravity in flight.
Step S802, detecting whether the user has the motion of flapping wings through a photosensitive sensor.
In the laser positioning range, first behavior data generated by a target object in the simulated flight action is detected through laser positioning, and whether a user has the action of flapping wings or not can be detected through a photosensitive sensor. If the motion of the user with the flapping wings is detected by the photosensitive sensor, step S804 is performed, and if the motion of the user with the flapping wings is not detected by the photosensitive sensor, step S803 is performed.
And step S803, controlling the virtual character corresponding to the user to stay on the ground.
After the photosensitive sensor detects whether the user has the action of flapping wings, if the photosensitive sensor does not detect the action of flapping wings of the user, the control on the virtual role corresponding to the target object is cancelled, and the virtual role corresponding to the target object is controlled to stay on the ground.
Step S804, controlling the virtual character corresponding to the user to leave the ground and begin flying at a rising speed V2 and a forward speed V3.
After detecting through the photosensitive sensor whether the user makes the wing-flapping motion, if the motion is detected, the virtual character corresponding to the user leaves the ground of the preset scene and begins flying at a rising speed V2 and a forward speed V3.
Step S805, determining whether a deviation in the horizontal positions of the two arms occurs during the user's flight.
After the virtual character corresponding to the user leaves the ground and begins flying at a rising speed V2 and a forward speed V3, it is determined whether a deviation in the horizontal positions of the user's two arms occurs during flight; if so, step S806 is performed.
Step S806, when the left arm is higher than the right arm, the virtual character corresponding to the user flies to the right side; when the right arm is higher than the left arm, the virtual character corresponding to the user flies to the left side.
If the difference of the horizontal positions of the two arms in the flying process of the user is judged, when the left arm of the user is higher than the right arm, the virtual character corresponding to the user flies to the right side in a preset scene; and when the right arm of the user is higher than the left arm, the virtual character corresponding to the user flies to the left side in the preset scene, so that the change of the flying direction of the virtual character in the preset scene is controlled.
Step S807, determining whether the rising speed remains greater than the descending speed.
After the change of the virtual character's flight direction in the preset scene is controlled, it is judged whether the rising speed remains greater than the descending speed. If so, the virtual character keeps its original flight state; if not, step S803 is executed and the virtual character corresponding to the user is controlled to stay on the ground.
Step S808, determining whether an action of extending the arm backward occurs during the user' S flight.
After the virtual character corresponding to the user leaves the ground and begins flying at a rising speed V2 and a forward speed V3, it is determined whether the user extends an arm backward during flight. If so, step S809 is performed.
Step S809, controlling the virtual character corresponding to the user to fly forward at a speed higher than the original speed V3.
When it is judged that the user extends an arm backward during flight, the virtual character corresponding to the user flies forward at a speed higher than the original speed V3, realizing the accelerated-advance flight behavior of the virtual character in the preset scene. After the virtual character flies forward at a speed higher than the original speed V3, step S807 is executed to determine whether the rising speed is greater than the descending speed; if so, the virtual character keeps its original flight state, and if not, step S803 is executed to control the virtual character corresponding to the user to stay on the ground.
Step S810, judging whether the user has the action of horizontally unfolding the two arms in the flying process.
After the virtual character corresponding to the user leaves the ground and begins flying at a rising speed V2 and a forward speed V3, it is judged whether the user horizontally spreads both arms during flight; if so, step S811 is performed.
Step S811, controlling the virtual character corresponding to the user to keep flying forward at its current speed while descending at a speed smaller than V1.
If it is judged that the user horizontally spreads both arms during flight, the virtual character corresponding to the target object keeps flying forward at the current speed while subject to a descent speed smaller than V1, that is, it glides in the preset scene. After the virtual character corresponding to the target object performs the gliding flight behavior in the preset scene, step S807 is executed to determine whether the rising speed is greater than the descending speed; if so, the original flight state is maintained, and if not, step S803 is executed to control the virtual character corresponding to the user to stay on the ground.
In this embodiment, a vertical, ground-directed speed V1 is superimposed, and the photosensitive sensor detects whether the user makes the wing-flapping motion. If not, the virtual character corresponding to the user is controlled to stay on the ground; if so, the virtual character leaves the ground and begins flying at a rising speed V2 and a forward speed V3. It is then judged whether a deviation in the horizontal positions of the two arms occurs during flight: when the left arm is higher than the right arm, the virtual character flies to the right; when the right arm is higher than the left arm, it flies to the left. It is judged whether the rising speed remains greater than the descending speed; if so, the virtual character keeps its original state, and if not, it is controlled to stay on the ground. It is judged whether the user extends an arm backward during flight; if so, the virtual character is controlled to fly forward at a speed higher than the original speed V3. Finally, it is judged whether the user horizontally spreads both arms during flight; if so, the virtual character is controlled to keep flying forward at the current speed while descending at a speed smaller than V1.
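The flow of Fig. 8 can be condensed into a single hypothetical control tick. The acceleration factor, the return format and the default values below are assumptions; the S807 rising/descending check is omitted for brevity:

```python
def update(flapping, left_h, right_h, arms_back, arms_spread,
           V3=3.0, turn_threshold=0.1):
    """One control tick of the Fig. 8 flow; returns (state, forward_speed, heading)."""
    if not flapping:
        return ("grounded", 0.0, "straight")  # S803: stay on the ground
    forward = V3                              # S804: leave ground, rise at V2, advance at V3
    heading = "straight"
    delta = left_h - right_h                  # S805/S806: direction control
    if delta > turn_threshold:
        heading = "right"                     # left arm higher -> fly right
    elif delta < -turn_threshold:
        heading = "left"                      # right arm higher -> fly left
    if arms_back:                             # S808/S809: accelerate forward
        forward = V3 * 1.5                    # "higher than the original V3" (factor assumed)
    elif arms_spread:                         # S810/S811: glide at the current speed,
        pass                                  # descending at a speed smaller than V1
    return ("flying", forward, heading)
```

Each tick, the detected gestures map directly to a ground/flight state, a forward speed and a heading, mirroring the branch order of steps S802–S811.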
This embodiment abstracts bird flight into a physical model. Based on the laser positioning technique of VR, the motion of the handles worn on the user's two hands within the laser positioning range can be accurately detected, achieving the technical effect of reducing the difference between the behavior of the target object and the behavior of the virtual character, and thus solving the technical problem in the related art that this difference is large.
Example 3
The application environment of the embodiment of the present invention may refer to the application environment in the above embodiments, but is not described herein again. The embodiment of the invention provides an optional specific application for implementing the interaction method of the behavior data, and the specific application is specifically applied to a virtual reality scene.
Fig. 9A is a schematic diagram of a behavior of a target object flapping its arms according to an embodiment of the present invention. As shown in fig. 9A, the user can simulate a bird flapping its wings by swinging the handles of the virtual reality device held in both hands. Fig. 9B is a schematic diagram of a virtual character flying upward according to an embodiment of the present invention. As shown in fig. 9B, when the target object simulates the motion of a bird flapping its wings, the virtual character in the VR world acquires flight behavior. Repeated experimental tests show that by swinging the handles in the vertical plane to simulate a bird flapping its wings, the user gives the character in the VR world an upward speed, realizing the upward flight behavior of the virtual character.
Fig. 10A is a schematic diagram of a behavior of a target object swinging an arm backward according to an embodiment of the present invention. As shown in fig. 10A, the user holds the V-shape with both hands, simulating the posture of birds accelerating forward in flight. FIG. 10B is a schematic diagram of accelerated progress of a virtual character, according to an embodiment of the invention. As shown in fig. 10B, when simulating the attitude of birds accelerating forward in flight, virtual characters in the VR world are caused to acquire the flight behavior of accelerating forward.
FIG. 11A is a diagram illustrating a target object with a right arm higher than a left arm, according to an embodiment of the invention. As shown in fig. 11A, the target object simulates the posture of the bird changing direction in the air, and the action of turning left during the flight of the bird is simulated by the right arm being higher than the left arm. FIG. 11B is a diagram illustrating a target object with a lower right arm than a left arm, according to an embodiment of the invention. As shown in fig. 11B, the target object simulates the posture of the bird changing direction in the air, and the action of turning right during the flight is simulated by the right arm being lower than the left arm.
Fig. 12 is a schematic view of a virtual character changing a flight direction according to an embodiment of the present invention. As shown in fig. 12, when the target object simulates the posture of a bird changing directions in the air, the deviation of the horizontal distance between the two hands of the user is accurately positioned through a sensor in the device, and the inclination condition that the right arm of the target object is higher than the left arm or the inclination condition that the right arm of the target object is lower than the left arm is determined, so that the virtual character of the user in the VR world obtains the change of the left flight direction or the right flight direction.
This embodiment is based on a cutting-edge VR platform, fully exploits the characteristics of the new platform, and uses a purpose-built positioning technology to design an interaction technique unique to VR. The value of this technique is explained mainly from the following aspects:
and (3) improving immersion feeling: VR technology can improve the user's sense of immersion that other platforms cannot match. If scenes that the user cannot experience at ordinary times are repeatedly engraved in the VR world, the user can be provided with super-strong immersion. According to the embodiment, the arms are fanned in the real world through simulating the flight action, meanwhile, the user can feel the pleasure of winging flight in the VR world, and the user can obtain stronger immersion through the interaction technology.
Reduced vertigo: traditional VR products control direction through rotation of the head-mounted display, a scheme that causes vertigo because the visual field and body motion are uncoordinated. The behavior data interaction technique of this embodiment instead simulates the way a bird turns in flight, recognizing the positional difference of the two arms relative to the horizontal to drive the change of the user's avatar position in the VR world, thereby effectively reducing the user's uncomfortable vertigo.
Multi-platform expansion: the interaction technique of this embodiment can be extended to multiple platforms, for example as an interaction mode in VR products such as VR games, VR social applications and VR tools.
It should be noted that, in the above embodiment, the postures by which the user simulates a bird flapping its wings, gliding, accelerating forward and changing direction in the air are not limited to the motions described here; they may be changed to other motions according to the user's habits to control the virtual character in the preset scene and reduce the difference between the behavior of the target object and the behavior of the virtual character.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the order of acts, as some steps may occur in other orders or concurrently in accordance with the invention. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required by the invention.
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 4
According to the embodiment of the invention, the invention also provides the behavior data interaction device for implementing the behavior data interaction method. Fig. 13 is a schematic diagram of an interaction device for behavior data according to an embodiment of the present invention. As shown in fig. 13, the interaction device for behavior data may include: a detection unit 10, a judgment unit 20 and a first control unit 30.
The detection unit 10 is used for detecting first behavior data generated by the target object when the flight action is simulated.
The determining unit 20 is configured to determine whether the speed data in the first behavior data meets a preset condition when the first behavior data is detected.
The first control unit 30 is configured to, when it is determined that the speed data meets the preset condition, control the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in the preset scene, where the first preset speed corresponds to the speed data.
Fig. 14 is a schematic diagram of another interaction device for behavior data according to an embodiment of the invention. As shown in fig. 14, the interaction means of the behavior data may include: a detection unit 10, a judgment unit 20 and a first control unit 30. The interaction device for the behavior data may further include: a second control unit 40 and a third control unit 50.
It should be noted that the detecting unit 10, the determining unit 20 and the first control unit 30 of this embodiment have the same functions as those of the behavior data interaction apparatus of the embodiment shown in fig. 13, and are not described again here.
And a second control unit 40 configured to cancel the control of the virtual character corresponding to the target object when the first behavior data is not detected after the first behavior data of the target object simulating the flight action is detected.
And a third control unit 50, configured to cancel control over the virtual character corresponding to the target object when it is determined that the speed data does not meet the preset condition after it is determined whether the speed data in the first behavior data meets the preset condition.
Optionally, the third control unit 50 is configured to control the virtual character corresponding to the target object to stop in the preset scene.
Optionally, after the virtual character corresponding to the target object is controlled to execute the first preset flight behavior at the first preset speed in the preset scene, the determining unit 20 is further configured to determine whether second behavior data generated by the target object during the flight action simulation conforms to the preset behavior data; the first control unit 30 is further configured to control the virtual character to execute a second preset flight behavior at a second preset speed in the preset scene when the second behavior data is determined to conform to the preset behavior data, where the preset behavior data corresponds to the second preset flight behavior.
Optionally, the determining unit 20 is further configured to determine whether an ascending speed indicated by the ascending speed data in the second behavior data remains greater than a preset descending speed after controlling the virtual character to execute a second preset flight behavior at a second preset speed in the preset scene, where the preset descending speed is a speed of the virtual character descending in the vertical direction under the influence of only gravity; the first control unit 30 is further configured to, when the rising speed is determined to be greater than the preset falling speed, control the virtual character corresponding to the target object to continue to execute a second preset flight behavior at a second preset speed in the preset scene, and if the rising speed is determined to be not greater than the preset falling speed, control the virtual character corresponding to the target object to fall in the preset scene.
Alternatively, the judging unit 20 is configured to judge whether an ascending speed indicated by the ascending speed data in the first behavior data is greater than a preset descending speed, where the preset descending speed is a speed at which the virtual character descends in a vertical direction when being influenced only by gravity; the first control unit 30 is configured to, when it is determined that the ascending speed is greater than the preset descending speed, control the virtual character corresponding to the target object to leave the ground of the preset scene according to the ascending speed, and execute a flight behavior according to an advancing speed in a first preset speed, where the first preset speed includes the ascending speed.
Optionally, the judging unit 20 is configured to calculate the rising speed V1 by the following first preset formula: V1 = k × V, where k is a speed coefficient and V is the gesture speed of the target object.
Optionally, the determining unit 20 is configured to determine whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated by the target object when the target object keeps the two arms horizontally unfolded, where the preset behavior data includes behavior data generated by the target object when the target object keeps the two arms horizontally unfolded; the first control unit 30 is configured to, when it is determined that the second behavior data is behavior data generated when the target object keeps the two arms horizontally unfolded, control the virtual character to execute a gliding flight behavior according to a forward speed and a descent speed smaller than a preset descent speed in a preset scene, where the second preset speed includes the forward speed and the descent speed, and the second preset flight behavior includes the gliding flight behavior.
Optionally, the determining unit 20 is configured to determine whether second behavior data generated by the target object when simulating the flight action is behavior data generated by the target object when extending the arm backwards, where the preset behavior data includes behavior data generated by the target object when extending the arm backwards; the first control unit 30 is configured to, when it is determined that the second behavior data is behavior data generated when the target object extends the arm backward, control the virtual character to execute a flight behavior of accelerated forward in a preset scene according to a second preset speed, where the second preset speed is greater than the forward speed, and the second preset flight behavior includes a flight behavior of accelerated forward.
Optionally, the first control unit 30 is configured to calculate the second preset speed V2 by a second preset formula: V2 = V0 + k × V + A × t, where V0 is the initial speed of the virtual character in the preset scene, k is a speed coefficient, V is the gesture speed of the target object, A is the acceleration that makes the second preset speed of the virtual character in the preset scene change with time, and t is the acceleration time.
Optionally, the judging unit 20 is configured to judge whether second behavior data generated by the target object when simulating the flight action is behavior data generated by tilting the two arms of the target object relative to a horizontal position, wherein the preset behavior data includes behavior data generated by tilting the two arms of the target object relative to the horizontal position; the first control unit 30 is configured to control the virtual character to execute a steered flight behavior at a second preset speed in a preset scene when it is determined that the second behavior data is behavior data generated when the double arm of the target object is inclined with respect to the horizontal position, where the second preset flight behavior includes the steered flight behavior.
Alternatively, the first control unit 30 includes: the device comprises a detection module, a first control module and a second control module. The detection module is used for detecting the distance deviation of the double arms of the target object relative to the horizontal position; the first control module is used for controlling the virtual character to execute a flying behavior turning to the right side according to a second preset speed under a preset scene under the condition that the distance deviation that the left arm of the target object is higher than the right arm of the target object is detected to be within a first threshold range; the second control module is used for controlling the virtual character to carry out flying behavior of turning to the left side according to a second preset speed under a preset scene under the condition that the distance deviation that the left arm of the target object is lower than the right arm of the target object is detected to be within a second threshold range.
Optionally, the interaction device for behavior data further includes: the filtering unit is used for filtering displacement data smaller than a preset amplitude in the first behavior data through a preset filtering parameter after detecting the first behavior data generated by the target object during the flight action simulation, so as to obtain filtered behavior data; the judging unit 20 is configured to judge whether speed data in the filtered behavior data meets a preset condition; if the speed data is judged to meet the preset condition, the first control unit 30 is configured to control the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene when the speed data in the filtered behavior data is judged to meet the preset condition.
Optionally, the preset scene is a virtual reality scene, and the detection unit 10 is configured to detect first behavior data generated by the target object during the simulated flight action through laser positioning within a laser positioning range.
It should be noted that the detecting unit 10 in this embodiment may be configured to execute step S202 in embodiment 1 of this application, the determining unit 20 in this embodiment may be configured to execute step S204 in embodiment 1 of this application, and the first controlling unit 30 in this embodiment may be configured to execute step S206 in embodiment 1 of this application.
It should be noted here that the above units and modules are the same as the examples and application scenarios realized by the corresponding steps, but are not limited to the disclosure of the above embodiments. It should be noted that the above units and modules as a part of the apparatus may be operated in a hardware environment as shown in fig. 3, and may be implemented by software, or may be implemented by hardware, where the hardware environment includes a network environment.
In the embodiment, the detection unit 10 detects first behavior data generated by the target object during the simulated flight action, and the judgment unit 20 judges whether the speed data in the first behavior data meets the preset condition or not when the first behavior data is detected. When the first control unit 30 determines that the speed data meets the preset condition, the virtual character corresponding to the target object is controlled to execute a first preset flight behavior at a first preset speed in a preset scene, where the first preset speed corresponds to the speed data, so that a technical problem that a difference between a behavior of the target object and a behavior of the virtual character in a related technology is large is solved, and a technical effect of reducing a difference between the behavior of the target object and the behavior of the virtual character is achieved.
Example 5
According to the embodiment of the invention, the invention also provides a server or a terminal for implementing the interaction method of the behavior data.
Fig. 15 is a block diagram of a terminal according to an embodiment of the present invention. As shown in fig. 15, the terminal may include: one or more processors 151 (only one shown), a memory 153, and a transmission device 155. As shown in fig. 15, the terminal may further include an input-output device 157.
The memory 153 may be used to store software programs and modules, such as program instructions/modules corresponding to the behavior data interaction method and apparatus in the embodiment of the present invention, and the processor 151 executes various functional applications and data processing by running the software programs and modules stored in the memory 153, that is, implements the behavior data interaction method described above. The memory 153 may include high-speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 153 may further include memory located remotely from the processor 151, which may be connected to the terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 155 is used for receiving or transmitting data via a network, and may also be used for data transmission between the processor and the memory. Examples of the network may include a wired network and a wireless network. In one example, the transmission device 155 includes a Network Interface Controller (NIC) that can be connected to other network devices and a router via a network cable so as to communicate with the internet or a local area network. In another example, the transmission device 155 is a Radio Frequency (RF) module, which communicates with the internet wirelessly.
Specifically, the memory 153 is used to store an application program.
The processor 151 may call the application stored in the memory 153 through the transmission means 155 to perform the following steps:
detecting first behavior data generated by a target object during flight action simulation;
under the condition that the first behavior data are detected, judging whether the speed data in the first behavior data meet a preset condition or not;
and if the speed data are judged to meet the preset conditions, controlling the virtual role corresponding to the target object to execute a first preset flight behavior according to a first preset speed in a preset scene, wherein the first preset speed corresponds to the speed data.
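The three processor steps above can be sketched as follows. This is a minimal illustration only, not the patented implementation; the names and constants (`SPEED_THRESHOLD`, `SPEED_COEFF`, the tuple command format) are assumptions:

```python
# Minimal sketch of the detect -> judge -> control flow described above.
# SPEED_THRESHOLD and SPEED_COEFF are hypothetical preset parameters.
SPEED_THRESHOLD = 0.5   # preset condition: minimum gesture speed (m/s)
SPEED_COEFF = 2.0       # maps the detected speed data to the first preset speed

def handle_first_behavior(speed_data):
    """Return the flight command for the virtual character, or None."""
    if speed_data is None:              # first behavior data not detected
        return None                     # cancel control of the virtual character
    if speed_data < SPEED_THRESHOLD:    # preset condition not met
        return None
    first_preset_speed = SPEED_COEFF * speed_data  # corresponds to the speed data
    return ("first_preset_flight", first_preset_speed)

print(handle_first_behavior(1.2))  # ('first_preset_flight', 2.4)
print(handle_first_behavior(0.1))  # None
```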
Processor 151 is further configured to perform the following steps: after first behavior data of a target object in a simulated flight action is detected, under the condition that the first behavior data is not detected, the control of a virtual role corresponding to the target object is cancelled; and after judging whether the speed data in the first behavior data meet the preset conditions or not, if the speed data do not meet the preset conditions, canceling the control of the virtual role corresponding to the target object.
Processor 151 is further configured to perform the following steps: and controlling the virtual role corresponding to the target object to stop in the preset scene.
Processor 151 is further configured to perform the following steps: after a virtual role corresponding to a target object is controlled to execute a first preset flight behavior according to a first preset speed under a preset scene, judging whether second behavior data generated by the target object during flight action simulation conforms to preset behavior data; and if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior at a second preset speed under the preset scene, wherein the preset behavior data correspond to the second preset flight behavior.
Processor 151 is further configured to perform the following steps: after the virtual character is controlled to execute a second preset flight behavior according to a second preset speed in a preset scene, judging whether the ascending speed indicated by the ascending speed data in the second behavior data is greater than a preset descending speed, wherein the preset descending speed is the speed of the virtual character descending along the vertical direction under the influence of gravity only; if the ascending speed is judged to be larger than the preset descending speed, the virtual role corresponding to the target object is controlled to continue to execute a second preset flight behavior according to a second preset speed under the preset scene; and if the rising speed is judged to be not greater than the preset falling speed, controlling the virtual role corresponding to the target object to fall in the preset scene.
Processor 151 is further configured to perform the following steps: judging whether the ascending speed indicated by the ascending speed data in the first behavior data is greater than a preset descending speed, wherein the preset descending speed is a speed of the virtual character descending along the vertical direction under the influence of gravity only; and if the ascending speed is larger than the preset descending speed, controlling the virtual character corresponding to the target object to leave the ground of the preset scene according to the ascending speed, and executing flight behavior according to the advancing speed in the first preset speed, wherein the first preset speed comprises the ascending speed.
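The takeoff rule above — the character leaves the ground only when the ascending speed exceeds the gravity-only descending speed — can be sketched as follows. The coefficient and fall speed are illustrative assumptions:

```python
# Sketch of the takeoff decision: the virtual character leaves the ground
# only when the ascending speed k*V exceeds the preset descending speed
# (the speed of falling vertically under gravity alone).
K = 1.5            # velocity coefficient (assumed value)
FALL_SPEED = 3.0   # preset descending speed under gravity only (m/s, assumed)

def takeoff(gesture_speed, forward_speed):
    ascend = K * gesture_speed          # first preset formula: V1 = k * V
    if ascend > FALL_SPEED:
        # leave the ground at the ascending speed, then fly at the forward speed
        return {"ascend": ascend, "forward": forward_speed}
    return None                         # ascent too slow: stays on the ground
```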
Processor 151 is further configured to perform the following steps: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated when the target object keeps the two arms horizontally unfolded or not, wherein the preset behavior data comprises the behavior data generated when the target object keeps the two arms horizontally unfolded; and if the second behavior data is judged to be the behavior data generated when the target object keeps the two arms horizontally unfolded, controlling the virtual character to execute the gliding flight behavior according to the forward speed and the descending speed smaller than the preset descending speed under the preset scene, wherein the second preset speed comprises the forward speed and the descending speed, and the second preset flight behavior comprises the gliding flight behavior.
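The gliding rule above — both arms held horizontal yields forward motion with a descent slower than the gravity-only fall — can be sketched as follows; the constants are illustrative assumptions:

```python
# Sketch of the glide behavior: with both arms horizontally extended, the
# character moves forward while descending at a speed strictly below the
# preset gravity-only falling speed. Constants are assumed for illustration.
FALL_SPEED = 3.0     # preset descending speed under gravity only (m/s)
GLIDE_RATIO = 0.5    # fraction of the fall speed used while gliding

def glide(forward_speed):
    descend = GLIDE_RATIO * FALL_SPEED   # strictly less than FALL_SPEED
    return {"forward": forward_speed, "descend": descend}
```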
Processor 151 is further configured to perform the following steps: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated by the target object when the arm extends backwards, wherein the preset behavior data comprises behavior data generated by the target object when the arm extends backwards; and if the second behavior data is judged to be the behavior data generated when the target object stretches the arm backwards, controlling the virtual character to execute the accelerated forward flight behavior according to a second preset speed under the preset scene, wherein the second preset speed is greater than the forward speed, and the second preset flight behavior comprises the accelerated forward flight behavior.
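The accelerated forward flight above follows the second preset formula given later in the claims, V2 = V0 + k·V + a·t; a sketch with assumed constants:

```python
# Sketch of the boost: when the target object stretches an arm backwards,
# the second preset speed grows with time from the character's current
# speed. Constants k and a are illustrative assumptions.
K = 1.5    # speed coefficient
A = 0.5    # in-scene acceleration making the speed change over time (m/s^2)

def boosted_speed(v0, gesture_speed, t):
    """Second preset formula: V2 = V0 + k*V + a*t."""
    return v0 + K * gesture_speed + A * t

print(boosted_speed(2.0, 1.0, 5.0))  # 6.0
```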
Processor 151 is further configured to perform the following steps: judging whether the second behavior data generated by the target object during the simulated flight action is behavior data generated when the two arms of the target object are inclined relative to the horizontal position, wherein the preset behavior data includes the behavior data generated when the two arms of the target object are inclined relative to the horizontal position; and if the second behavior data is judged to be the behavior data generated when the two arms of the target object are inclined relative to the horizontal position, controlling the virtual character to execute the turning flight behavior at the second preset speed in the preset scene, wherein the second preset flight behavior includes the turning flight behavior.
Processor 151 is further configured to perform the following steps: detecting a distance deviation of the two arms of the target object relative to the horizontal position; under the condition that the distance deviation by which the left arm of the target object is higher than the right arm is detected to be within a first threshold range, controlling the virtual character to execute a flight behavior of turning to the right side at the second preset speed in the preset scene; and under the condition that the distance deviation by which the left arm of the target object is lower than the right arm is detected to be within a second threshold range, controlling the virtual character to execute a flight behavior of turning to the left side at the second preset speed in the preset scene.
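The threshold test for turning can be sketched as follows; the threshold ranges and names are illustrative assumptions, not the patented parameters:

```python
# Sketch of the turn rule: the height deviation between the two arms maps
# to a turn direction when it falls inside a preset threshold range.
# The ranges (in metres) are illustrative assumptions.
FIRST_RANGE = (0.10, 0.60)    # left arm higher than right -> turn right
SECOND_RANGE = (0.10, 0.60)   # left arm lower than right -> turn left

def turn_direction(left_height, right_height):
    deviation = left_height - right_height
    if FIRST_RANGE[0] <= deviation <= FIRST_RANGE[1]:
        return "right"
    if SECOND_RANGE[0] <= -deviation <= SECOND_RANGE[1]:
        return "left"
    return None   # arms roughly level (or deviation out of range): no turn
```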
Processor 151 is further configured to perform the following steps: after first behavior data generated by a target object during flight action simulation are detected, filtering displacement data smaller than a preset amplitude in the first behavior data through a preset filtering parameter to obtain filtered behavior data; judging whether the speed data in the filtered behavior data meet preset conditions or not; and if the speed data in the filtered behavior data is judged to meet the preset conditions, controlling the virtual role corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene.
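The filtering step above — discarding displacement data below a preset amplitude so small unintentional movements do not trigger flight — can be sketched as follows; the threshold value is an assumption:

```python
# Sketch of the jitter filter: displacement samples whose amplitude is
# smaller than the preset filtering parameter are removed before the
# speed data is checked. MIN_AMPLITUDE is an assumed value.
MIN_AMPLITUDE = 0.05   # preset amplitude threshold (metres)

def filter_behavior(displacements):
    """Keep only displacement samples at or above the preset amplitude."""
    return [d for d in displacements if abs(d) >= MIN_AMPLITUDE]
```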
Processor 151 is further configured to perform the following steps: the preset scene is a virtual reality scene, and first behavior data generated by the target object in the simulated flight action is detected through laser positioning in a laser positioning range.
The embodiment of the invention provides an interaction method of behavior data: detecting first behavior data generated by a target object during flight action simulation; under the condition that the first behavior data are detected, judging whether the speed data in the first behavior data meet a preset condition; and if the speed data are judged to meet the preset condition, controlling the virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in the preset scene, wherein the first preset speed corresponds to the speed data. This achieves behavior interaction between the target object and the virtual character, thereby achieving the technical effect of reducing the difference between the behavior of the target object and the behavior of the virtual character, and solving the technical problem in the related art that this difference is large.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments, and this embodiment is not described herein again.
It can be understood by those skilled in the art that the structure shown in fig. 15 is only an illustration, and the terminal may be a terminal device such as a smart phone (e.g., an Android phone, an iOS phone, etc.), a tablet computer, a palm computer, a Mobile Internet Device (MID), or a PAD. Fig. 15 does not limit the structure of the electronic device. For example, the terminal may also include more or fewer components (e.g., network interfaces, display devices, etc.) than shown in fig. 15, or have a different configuration from that shown in fig. 15.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
Example 6
The embodiment of the invention also provides a storage medium. Alternatively, in this embodiment, the storage medium may be used to execute a program code of an interaction method of behavior data.
Optionally, in this embodiment, the storage medium may be located on at least one of a plurality of network devices in a network shown in the above embodiment.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
detecting first behavior data generated by a target object during flight action simulation;
under the condition that the first behavior data are detected, judging whether the speed data in the first behavior data meet a preset condition or not;
and if the speed data are judged to meet the preset conditions, controlling the virtual role corresponding to the target object to execute a first preset flight behavior according to a first preset speed in a preset scene, wherein the first preset speed corresponds to the speed data.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after first behavior data of a target object in a simulated flight action is detected, under the condition that the first behavior data is not detected, the control of a virtual role corresponding to the target object is cancelled; and after judging whether the speed data in the first behavior data meet the preset conditions or not, if the speed data do not meet the preset conditions, canceling the control of the virtual role corresponding to the target object.
Optionally, the storage medium is further arranged to store program code for performing the steps of: and controlling the virtual role corresponding to the target object to stop in the preset scene.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after a virtual role corresponding to a target object is controlled to execute a first preset flight behavior according to a first preset speed under a preset scene, judging whether second behavior data generated by the target object during flight action simulation conforms to preset behavior data; and if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior at a second preset speed under the preset scene, wherein the preset behavior data correspond to the second preset flight behavior.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after the virtual character is controlled to execute a second preset flight behavior according to a second preset speed in a preset scene, judging whether the ascending speed indicated by the ascending speed data in the second behavior data is greater than a preset descending speed, wherein the preset descending speed is the speed of the virtual character descending along the vertical direction under the influence of gravity only; if the ascending speed is judged to be larger than the preset descending speed, the virtual role corresponding to the target object is controlled to continue to execute a second preset flight behavior according to a second preset speed under the preset scene; and if the rising speed is judged to be not greater than the preset falling speed, controlling the virtual role corresponding to the target object to fall in the preset scene.
Optionally, the storage medium is further arranged to store program code for performing the steps of: judging whether the ascending speed indicated by the ascending speed data in the first behavior data is greater than a preset descending speed, wherein the preset descending speed is a speed of the virtual character descending along the vertical direction under the influence of gravity only; and if the ascending speed is larger than the preset descending speed, controlling the virtual character corresponding to the target object to leave the ground of the preset scene according to the ascending speed, and executing flight behavior according to the advancing speed in the first preset speed, wherein the first preset speed comprises the ascending speed.
Optionally, the storage medium is further arranged to store program code for performing the steps of: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated when the target object keeps the two arms horizontally unfolded or not, wherein the preset behavior data comprises the behavior data generated when the target object keeps the two arms horizontally unfolded; and if the second behavior data is judged to be the behavior data generated when the target object keeps the two arms horizontally unfolded, controlling the virtual character to execute the gliding flight behavior according to the forward speed and the descending speed smaller than the preset descending speed under the preset scene, wherein the second preset speed comprises the forward speed and the descending speed, and the second preset flight behavior comprises the gliding flight behavior.
Optionally, the storage medium is further arranged to store program code for performing the steps of: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated by the target object when the arm extends backwards, wherein the preset behavior data comprises behavior data generated by the target object when the arm extends backwards; and if the second behavior data is judged to be the behavior data generated when the target object stretches the arm backwards, controlling the virtual character to execute the accelerated forward flight behavior according to a second preset speed under the preset scene, wherein the second preset speed is greater than the forward speed, and the second preset flight behavior comprises the accelerated forward flight behavior.
Optionally, the storage medium is further arranged to store program code for performing the steps of: judging whether the second behavior data generated by the target object during the simulated flight action is behavior data generated when the two arms of the target object are inclined relative to the horizontal position, wherein the preset behavior data includes the behavior data generated when the two arms of the target object are inclined relative to the horizontal position; and if the second behavior data is judged to be the behavior data generated when the two arms of the target object are inclined relative to the horizontal position, controlling the virtual character to execute the turning flight behavior at the second preset speed in the preset scene, wherein the second preset flight behavior includes the turning flight behavior.
Optionally, the storage medium is further arranged to store program code for performing the steps of: detecting a distance deviation of the two arms of the target object relative to the horizontal position; under the condition that the distance deviation by which the left arm of the target object is higher than the right arm is detected to be within a first threshold range, controlling the virtual character to execute a flight behavior of turning to the right side at the second preset speed in the preset scene; and under the condition that the distance deviation by which the left arm of the target object is lower than the right arm is detected to be within a second threshold range, controlling the virtual character to execute a flight behavior of turning to the left side at the second preset speed in the preset scene.
Optionally, the storage medium is further arranged to store program code for performing the steps of: after first behavior data generated by a target object during flight action simulation are detected, filtering displacement data smaller than a preset amplitude in the first behavior data through a preset filtering parameter to obtain filtered behavior data; judging whether the speed data in the filtered behavior data meet preset conditions or not; and if the speed data in the filtered behavior data is judged to meet the preset conditions, controlling the virtual role corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene.
Optionally, the storage medium is further arranged to store program code for performing the steps of: the preset scene is a virtual reality scene, and first behavior data generated by the target object in the simulated flight action is detected through laser positioning in a laser positioning range.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments, and this embodiment is not described herein again.
Optionally, in this embodiment, the storage medium may include, but is not limited to: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that those skilled in the art can make various improvements and refinements without departing from the principle of the present invention, and these improvements and refinements should also be regarded as falling within the protection scope of the present invention.

Claims (17)

1. A behavior data interaction method is characterized by comprising the following steps:
detecting first behavior data generated by a target object during flight action simulation;
under the condition that the first behavior data are detected, judging whether speed data in the first behavior data meet preset conditions or not, wherein the speed data are obtained through detection of virtual reality equipment;
if the speed data is judged to meet the preset condition, controlling the virtual role corresponding to the target object to execute a first preset flight behavior according to a first preset speed under a preset scene, wherein the first preset speed corresponds to the speed data, the preset scene is a virtual reality scene,
and calculating the speed of the virtual character of the target object in the virtual reality scene according to the speed data and preset parameters to obtain the first preset speed, so that the first preset flight behavior of the virtual character executed according to the first preset speed in the virtual reality scene is consistent with the behavior of the target object in the process of simulating the flight action.
2. The method of claim 1,
after detecting the first behavior data generated by the target object while simulating flight action, the method further comprises: under the condition that the first behavior data is not detected, canceling to control the virtual role corresponding to the target object;
after determining whether the speed data in the first behavior data meets the preset condition, the method further includes: and if the speed data is judged not to accord with the preset condition, canceling the control of the virtual role corresponding to the target object.
3. The method of claim 2, wherein canceling control of the virtual role corresponding to the target object comprises:
and controlling the virtual role corresponding to the target object to stop in the preset scene.
4. The method according to claim 1, wherein after controlling the virtual character corresponding to the target object to execute the first preset flight behavior at the first preset speed in the preset scene, the method further comprises:
judging whether second behavior data generated by the target object when the target object simulates the flight action accords with preset behavior data or not;
and if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute a second preset flight behavior according to a second preset speed under the preset scene, wherein the preset behavior data correspond to the second preset flight behavior.
5. The method of claim 4, wherein after controlling the avatar to perform the second preset flight maneuver at the second preset speed in the preset scenario, the method further comprises:
judging whether the ascending speed indicated by the ascending speed data in the second behavior data is greater than a preset descending speed, wherein the preset descending speed is a speed of the virtual character descending along the vertical direction under the influence of gravity only;
if the rising speed is judged to be larger than the preset falling speed, controlling the virtual role corresponding to the target object to continue to execute the second preset flight behavior at the second preset speed under the preset scene;
and if the rising speed is judged to be not greater than the preset falling speed, controlling the virtual role corresponding to the target object to fall in the preset scene.
6. The method according to any one of claims 1 to 5,
judging whether the speed data in the first behavior data meet the preset condition or not comprises the following steps: judging whether the ascending speed indicated by the ascending speed data in the first behavior data is greater than a preset descending speed, wherein the preset descending speed is a speed of the virtual character descending along the vertical direction under the influence of gravity only;
if the speed data is judged to meet the preset condition, controlling the virtual role corresponding to the target object to execute the first preset flight behavior at the first preset speed under the preset scene comprises the following steps: and if the rising speed is judged to be greater than the preset falling speed, controlling the virtual character corresponding to the target object to leave the ground of the preset scene according to the rising speed, and executing the flight behavior according to the advancing speed in the first preset speed, wherein the first preset speed comprises the rising speed.
7. The method of claim 6, wherein determining whether the rise rate is greater than the preset fall rate comprises:
the ascending speed V1 is calculated by the following first preset formula:
V1 = k × V, where k is a velocity coefficient and V is the gesture speed of the target object.
8. The method of claim 4,
judging whether the second behavior data generated by the target object during the flight action simulation conforms to the preset behavior data comprises the following steps: judging whether second behavior data generated by the target object when the target object simulates the flight action is behavior data generated by the target object when the target object keeps the two arms horizontally unfolded or not, wherein the preset behavior data comprises the behavior data generated by the target object when the target object keeps the two arms horizontally unfolded;
if the second behavior data are judged to accord with the preset behavior data, controlling the virtual character to execute the second preset flight behavior at the second preset speed under the preset scene comprises the following steps: and if the second behavior data is judged to be the behavior data generated when the target object keeps the two arms horizontally unfolded, controlling the virtual character to execute the gliding flight behavior according to the forward speed and the descending speed smaller than the preset descending speed in the preset scene, wherein the second preset speed comprises the forward speed and the descending speed, and the second preset flight behavior comprises the gliding flight behavior.
9. The method of claim 4,
judging whether the second behavior data generated by the target object during the flight action simulation conforms to the preset behavior data comprises the following steps: judging whether the second behavior data generated by the target object when simulating the flight action is behavior data generated when the target object stretches the arm backwards, wherein the preset behavior data comprises the behavior data generated when the target object stretches the arm backwards;
and if the second behavior data is judged to conform to the preset behavior data, controlling the virtual character to execute the second preset flight behavior at the second preset speed in the preset scene comprises the following steps: if the second behavior data is judged to be the behavior data generated when the target object stretches the arm backwards, controlling the virtual character to execute the accelerated forward flight behavior at the second preset speed in the preset scene, wherein the second preset speed is greater than the forward speed, and the second preset flight behavior comprises the accelerated forward flight behavior.
10. The method of claim 9, wherein controlling the virtual character to execute the accelerated forward flight behavior at the second preset speed in the preset scene comprises:
the second preset speed V2 is calculated by the following second preset formula:
V2 = V0 + k × V + a × t, where V0 is the initial speed of the virtual character in the preset scene, k is a speed coefficient, V is a gesture speed of the target object, a is an acceleration of the virtual character in the preset scene that causes the second preset speed to change over time, and t is the acceleration time.
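The second preset formula adds a uniform-acceleration term on top of the gesture-scaled speed, so the character speeds up the longer the arm stays stretched back. A sketch (function and parameter names, and the sample coefficient, are illustrative):

```python
def accelerated_speed(v0: float, gesture_speed: float, accel: float,
                      t: float, k: float = 1.5) -> float:
    """Second preset formula: V2 = V0 + k*V + a*t.

    v0            -- initial speed V0 of the virtual character in the scene.
    gesture_speed -- gesture speed V of the target object.
    accel         -- acceleration a applied while the gesture is held.
    t             -- time spent accelerating.
    k             -- speed coefficient (same role as in the first formula).
    """
    return v0 + k * gesture_speed + accel * t

# Starting at 1.0 m/s, gesture 2.0 m/s, acceleration 0.5 m/s^2 held for 4 s.
v2 = accelerated_speed(1.0, 2.0, 0.5, 4.0)
```

At t = 0 the formula reduces to V0 + k·V, i.e. the unaccelerated gesture mapping, which keeps the transition into accelerated flight continuous.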
11. The method of claim 4,
judging whether the second behavior data generated by the target object during the flight action simulation conforms to the preset behavior data comprises the following steps: judging whether the second behavior data generated by the target object when simulating the flight action is behavior data generated when the two arms of the target object are inclined relative to a horizontal position, wherein the preset behavior data comprises the behavior data generated when the two arms of the target object are inclined relative to the horizontal position;
and if the second behavior data is judged to conform to the preset behavior data, controlling the virtual character to execute the second preset flight behavior at the second preset speed in the preset scene comprises the following steps: if the second behavior data is judged to be the behavior data generated when the two arms of the target object are inclined relative to the horizontal position, controlling the virtual character to execute the turning flight behavior at the second preset speed in the preset scene, wherein the second preset flight behavior comprises the turning flight behavior.
12. The method of claim 11, wherein controlling the virtual character to execute the turning flight behavior at the second preset speed in the preset scene comprises:
detecting a distance deviation of the two arms of the target object relative to the horizontal position;
when the detected distance deviation by which the left arm of the target object is higher than the right arm of the target object is within a first threshold range, controlling the virtual character to execute a flight behavior of turning to the right side at the second preset speed in the preset scene;
and when the detected distance deviation by which the left arm of the target object is lower than the right arm of the target object is within a second threshold range, controlling the virtual character to execute a flight behavior of turning to the left side at the second preset speed in the preset scene.
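Claim 12's turn decision can be read as a simple threshold test on the height difference between the arms: left arm higher within the first range banks right, left arm lower within the second range banks left, anything else flies straight. A sketch, with names and the metre-valued threshold ranges chosen for illustration only:

```python
def turn_direction(left_arm_height: float, right_arm_height: float,
                   first_range=(0.05, 0.60),
                   second_range=(0.05, 0.60)) -> str:
    """Map the arms' distance deviation from horizontal to a turn command.

    first_range / second_range -- the first and second threshold ranges
    (illustrative values, in metres); deviations outside both ranges
    (e.g. sensor jitter, or an implausibly large reading) cause no turn.
    """
    deviation = left_arm_height - right_arm_height
    if first_range[0] <= deviation <= first_range[1]:
        return "right"   # left arm higher -> turn to the right side
    if second_range[0] <= -deviation <= second_range[1]:
        return "left"    # left arm lower -> turn to the left side
    return "straight"
```

Using a lower bound on each range filters out tiny unintentional tilts, while the upper bound rejects readings that are more likely tracking errors than a deliberate banking gesture.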
13. The method of claim 1,
after detecting the first behavior data generated by the target object while simulating the flight action, the method further comprises: filtering out, through a preset filtering parameter, displacement data smaller than a preset amplitude from the first behavior data to obtain filtered behavior data;
judging whether the speed data in the first behavior data meets the preset condition comprises the following steps: judging whether the speed data in the filtered behavior data meets the preset condition;
and if the speed data is judged to meet the preset condition, controlling the virtual character corresponding to the target object to execute the first preset flight behavior at the first preset speed in the preset scene comprises the following steps: if the speed data in the filtered behavior data is judged to meet the preset condition, controlling the virtual character corresponding to the target object to execute the first preset flight behavior at the first preset speed in the preset scene.
14. The method of claim 1, wherein detecting the first behavior data generated by the target object while simulating the flight action comprises: detecting, through laser positioning within a laser positioning range, the first behavior data generated by the target object when simulating the flight action.
15. An interaction device for behavior data, comprising:
the detection unit is used for detecting first behavior data generated by a target object during a simulated flight action;
the judging unit is used for judging whether speed data in the first behavior data meets a preset condition when the first behavior data is detected, wherein the speed data is obtained through detection by a virtual reality device;
the first control unit is used for controlling a virtual character corresponding to the target object to execute a first preset flight behavior at a first preset speed in a preset scene when the speed data is judged to meet the preset condition, wherein the first preset speed corresponds to the speed data, and the preset scene is a virtual reality scene;
and for calculating the speed of the virtual character of the target object in the virtual reality scene according to the speed data and preset parameters to obtain the first preset speed, so that the first preset flight behavior executed by the virtual character at the first preset speed in the virtual reality scene is consistent with the behavior of the target object during the simulated flight action.
16. The apparatus of claim 15, further comprising:
the second control unit is used for canceling control over the virtual character corresponding to the target object when, after the first behavior data generated by the target object during the simulated flight action has been detected, the first behavior data is no longer detected;
and the third control unit is used for canceling control over the virtual character corresponding to the target object when, after judging whether the speed data in the first behavior data meets the preset condition, the speed data is judged not to meet the preset condition.
17. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 14 when executed.
CN201610970785.4A 2016-10-27 2016-10-27 Interaction method and device for behavior data Active CN106569602B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610970785.4A CN106569602B (en) 2016-10-27 2016-10-27 Interaction method and device for behavior data

Publications (2)

Publication Number Publication Date
CN106569602A CN106569602A (en) 2017-04-19
CN106569602B true CN106569602B (en) 2020-05-19

Family

ID=58539934

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610970785.4A Active CN106569602B (en) 2016-10-27 2016-10-27 Interaction method and device for behavior data

Country Status (1)

Country Link
CN (1) CN106569602B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113521734A (en) * 2021-08-16 2021-10-22 网易(杭州)网络有限公司 Flight control method and device in game

Citations (2)

Publication number Priority date Publication date Assignee Title
CN1394325A (en) * 2000-09-01 2003-01-29 美国索尼电脑娱乐公司 User input device and method for interaction with graphic images
CN104368146A (en) * 2014-10-17 2015-02-25 广东小天才科技有限公司 Method and device for controlling dummy objects to move

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20140018169A1 (en) * 2012-07-16 2014-01-16 Zhong Yuan Ran Self as Avatar Gaming with Video Projecting Device

Similar Documents

Publication Publication Date Title
Tezza et al. The state-of-the-art of human–drone interaction: A survey
KR101810834B1 (en) The skydiving feel simulator system based virtual reality
KR100678373B1 (en) User input device and method for interaction with graphic images
CN104950902B (en) The control method and multi-rotor aerocraft of multi-rotor aerocraft
CN103977559B (en) Exchange method and interactive device
US10751608B2 (en) Full body movement control of dual joystick operated devices
CN106502388A (en) A kind of interactive movement technique and head-wearing type intelligent equipment
CN110935172A (en) Virtual object processing method, device, system and storage medium thereof
US11951398B2 (en) Method and apparatus for controlling virtual race car, storage medium, and device
US20160287989A1 (en) Natural body interaction for mixed or virtual reality applications
JP7447299B2 (en) Adaptive display method and device for virtual scenes, electronic equipment, and computer program
KR101410410B1 (en) Bodily sensation type learning apparatus and method
CN110559653B (en) Control method, device, terminal and storage medium of virtual aircraft
WO2018103656A1 (en) Motion processing method and device for props in vr scene, and storage medium
EP2862112B1 (en) Anti-cheating method and system for online games
US20220351472A1 (en) Remote camera augmented reality system
CN111354240A (en) Ejection and/or parachute jumping life-saving training method based on VR
CN106569602B (en) Interaction method and device for behavior data
EP2642467A1 (en) System and method for simulated aircraft control through desired direction of flight
CN109935132A (en) Flight simulator
CN113559516B (en) Virtual character control method and device, storage medium and electronic equipment
US11181938B2 (en) Full body movement control of dual joystick operated devices
CN102483652B (en) Control the Apparatus for () and method therefor of game motion
WO2019165588A1 (en) Teaching method for unmanned aerial vehicle and remote controller for unmanned aerial vehicle
CN111185005B (en) Parachuting information prompting method, terminal equipment, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant