CN112764545A - Virtual character motion synchronization method and terminal equipment - Google Patents

Virtual character motion synchronization method and terminal equipment

Info

Publication number
CN112764545A
Authority
CN
China
Prior art keywords
motion
data
motion data
real
character
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110127008.4A
Other languages
Chinese (zh)
Other versions
CN112764545B (en)
Inventor
陈元
刘伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Ziyuan Technology Co ltd
Original Assignee
Chongqing Ziyuan Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Ziyuan Technology Co ltd
Priority to CN202110127008.4A
Publication of CN112764545A
Application granted
Publication of CN112764545B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N5/00Computing arrangements using knowledge-based models
    • G06N5/01Dynamic search techniques; Heuristics; Dynamic trees; Branch-and-bound
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to the technical field of data analysis in virtual reality and provides a virtual character motion synchronization method and a terminal device. The method comprises: acquiring first motion data and second motion data from an inertial MEMS sensor, wherein the inertial MEMS sensor is bound to the ankles of a real person, the first motion data are acceleration data, and the second motion data are angular velocity data; analyzing the start-stop state of the real person according to the second motion data and acquiring start-stop information of the real person; when the real person is in a motion state, analyzing the current gait type for the current frame from the first motion data and the second motion data through a decision tree algorithm, and calculating leg motion data based on the current gait type; and synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data. The invention avoids the switching delay caused by the mixed processing of multiple kinds of data.

Description

Virtual character motion synchronization method and terminal equipment
Technical Field
The invention relates to the technical field of data analysis in virtual reality, and in particular to a virtual character motion synchronization method and a terminal device.
Background
Virtual reality technology is a computer simulation technique for creating and experiencing virtual worlds. It uses a computer to generate a simulated environment, fusing multi-source information into interactive three-dimensional dynamic views and simulated entity behaviors, and immerses the user in that environment.
Generally, the space presented to the user through virtual reality does not allow the user to walk freely; instead, the movement of the virtual character is driven by capturing the actions of the real person on a motion platform, while the real person performs no actual physical movement. As a result, the user's immersion is low and the user is prone to dizziness.
At present, however, there are virtual reality devices, such as universal motion platforms, that let the user move freely so that the real person and the virtual character follow the same motion track. Yet the motion data of a real person are complex, and capturing the motion of the real person on the motion platform to drive the movement of the virtual character usually introduces a large switching delay, making it difficult to accurately reflect the motion state of the real person. It therefore remains difficult to significantly improve the user's immersion.
Disclosure of Invention
The invention mainly aims to provide a virtual character motion synchronization method and a terminal device to solve the prior-art problems that, when the motion of a real person on a motion platform is captured to drive the movement of a virtual character, the switching delay is large and the motion state of the real person is difficult to reflect accurately.
In order to achieve the above object, a first aspect of the embodiments of the present invention provides a method for synchronizing motions of virtual characters, including:
acquiring first motion data and second motion data in an inertial MEMS sensor;
the inertial MEMS sensor is bound to the ankles of a real person, the first motion data are acceleration data, and the second motion data are angular velocity data;
analyzing the start-stop state of the real person according to the second motion data, and acquiring start-stop information of the real person;
when the real person is in a motion state, analyzing a current gait type based on a current frame according to the first motion data and the second motion data through a decision tree algorithm, and calculating leg motion data based on the current gait type;
and synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data.
With reference to the first aspect of the present invention, in the first embodiment of the present invention, after acquiring the first motion data and the second motion data in the inertial MEMS sensor, the method includes:
and performing low-pass filtering and smoothing processing on the first motion data and the second motion data.
With reference to the first aspect of the present invention, in a second implementation manner of the present invention, analyzing the start-stop state of the real person according to the second motion data, and acquiring start-stop information of the real person includes:
acquiring a preset start-stop threshold;
acquiring second motion data of the left foot and second motion data of the right foot, and converting the second motion data of the left foot and the second motion data of the right foot into vector modulus values;
when the vector modulus value of the left foot and the vector modulus value of the right foot are both smaller than the start-stop threshold, the real person is in a static state, and the static start time is recorded;
when the vector modulus value of the left foot and/or the vector modulus value of the right foot is larger than the start-stop threshold, the real person is in a motion state, and the motion start time is recorded;
the static start time and the motion start time constitute the start-stop information of the real person.
With reference to the first aspect of the present invention, in a third implementation manner of the present invention, when the real person is in a motion state, analyzing a current gait type according to the first motion data and the second motion data through a decision tree algorithm includes:
calculating an acceleration vector modulus and the acceleration data frequency distribution according to the first motion data;
calculating an angular velocity vector modulus and the gyroscope data frequency distribution according to the second motion data;
calculating the ratio of the acceleration vector modulus to the angular velocity vector modulus;
taking the acceleration vector modulus, the acceleration data frequency distribution, the angular velocity vector modulus, the gyroscope data frequency distribution, and the ratio of the acceleration vector modulus to the angular velocity vector modulus as the current gait data;
acquiring standard gait data of a standard gait type;
and taking the current gait data and the standard gait data as the input of the decision tree algorithm, and obtaining the current gait type according to the output of the decision tree algorithm.
With reference to the third embodiment of the first aspect of the present invention, in a fourth embodiment of the present invention, the calculating leg movement data based on the current gait type includes:
determining a calculated weight of the first motion data and a calculated weight of the second motion data according to the current gait type to calculate leg motion data based on the current gait type.
With reference to the third and fourth embodiments of the first aspect of the present invention, in a fifth embodiment of the present invention, the decision tree algorithm is provided with preset gait type labels and a confidence level for each gait type label;
the gait type labels include at least one of stepping, sliding, and others.
With reference to the first aspect of the present invention, in a sixth implementation manner of the present invention, synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data includes:
packing the current gait type and the leg motion data corresponding to the current frame, frame by frame, to obtain a motion data set based on the real person;
uploading the motion data set based on the real person to a host computer, and sending the motion data set based on the real person to a VR engine through the host computer;
wherein the VR engine controls the movement of the virtual character according to the motion data set based on the real person, so as to reconstruct the motion of the real person in real time.
With reference to the sixth implementation manner of the first aspect of the present invention, after the step of sending the motion data set based on the real person to the VR engine, the method includes:
adjusting the mapping magnification of the VR engine;
the mapping magnification represents the conversion factor that the VR engine applies to the motion data set based on the real person when controlling the motion of the virtual character.
A second aspect of embodiments of the present invention provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the method provided in the first aspect are implemented.
A third aspect of embodiments of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the method as provided in the first aspect above.
The embodiment of the invention provides a virtual character motion synchronization method that judges the start-stop state of a real person from two kinds of data from an inertial MEMS sensor: first motion data representing acceleration and second motion data representing angular velocity. The current gait type of the real person in the current frame, and the leg motion data based on that gait type, are analyzed through a decision tree algorithm. The motion of the real person and the motion of the virtual character are then synchronized using the start-stop information and the gait type and leg motion data of each frame. The virtual character motion synchronization method provided by the embodiment of the invention therefore avoids the switching delay caused by the mixed processing of multiple kinds of data, and the resulting difficulty of the synchronization result in accurately reflecting the motion state of the real person, and significantly improves the user's immersion.
Drawings
Fig. 1 is a schematic flow chart illustrating an implementation of a virtual character movement synchronization method according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Suffixes such as "module", "part", or "unit" used to denote elements are used herein only for convenience of description and have no specific meaning in themselves. Thus, "module" and "component" may be used interchangeably.
As shown in fig. 1, an embodiment of the present invention provides a virtual character motion synchronization method to synchronize the motion of a real person and the motion of a virtual character in virtual reality technology. The virtual character motion synchronization method provided by the embodiment of the invention comprises the following steps:
s101, acquiring first motion data and second motion data in the inertial MEMS sensor.
The inertial MEMS sensor is bound to the ankles of a real person, the first motion data are acceleration data, and the second motion data are angular velocity data.
In a specific application, the inertial MEMS sensor is one of the sensors commonly used in virtual reality technology for acquiring the posture information of a real person, and integrates functions such as an accelerometer, a gyroscope (angular velocity meter), an inertial measurement unit, a pressure sensor, a flow sensor, and a displacement sensor. In the embodiment of the invention, only the acceleration data and the angular velocity data are used to realize the motion synchronization of the virtual character.
It will be appreciated that the first motion data and the second motion data each comprise two groups, corresponding to the two ankles of the real person.
In one embodiment, after the step S101 and before the step S102, the method further includes:
and performing low-pass filtering and smoothing processing on the first motion data and the second motion data.
The high-frequency noise can be filtered out by low-pass filtering. In the embodiment of the invention, a 2nd-order Butterworth low-pass filter is used, which has the characteristics of low data delay and a good filtering effect.
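As an illustration, a minimal sketch of this pre-processing step is given below in Python. It assumes a 100 Hz sample rate, a 5 Hz cutoff, and a 5-sample moving-average window; none of these values is specified in the text, and the function and parameter names are hypothetical.

```python
import numpy as np
from scipy.signal import butter, lfilter

def preprocess(samples: np.ndarray, fs: float = 100.0, cutoff: float = 5.0) -> np.ndarray:
    """Low-pass filter and smooth raw 3-axis sensor samples of shape (n_frames, 3)."""
    # 2nd-order Butterworth low-pass filter, as named in the text; lfilter is
    # causal, matching the low-delay requirement of a real-time system.
    b, a = butter(N=2, Wn=cutoff / (fs / 2), btype="low")
    filtered = lfilter(b, a, samples, axis=0)
    # Simple moving-average smoothing (window size is an assumption).
    kernel = np.ones(5) / 5.0
    return np.apply_along_axis(lambda ch: np.convolve(ch, kernel, mode="same"), 0, filtered)
```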
S102, analyzing the starting and stopping states of the real person according to the second motion data, and acquiring starting and stopping information of the real person.
According to step S102, the start-stop state of the real person can be analyzed from the second motion data, that is, the angular velocity data.
In a specific application, the angular velocity data are generated by the gyroscope of the inertial MEMS sensor. The gyroscope is highly sensitive, and tiny jitters are immediately reflected in its data, which makes it suitable for analyzing the start-stop state of the real person.
The start-stop states of the real person comprise a motion state and a static state, and the real person is in the motion state during the period from the start of one motion state to the start of the next static state.
In one embodiment, one implementation manner of the step S102 may be:
acquiring a preset start-stop threshold;
acquiring second motion data of the left foot and second motion data of the right foot, and converting the second motion data of the left foot and the second motion data of the right foot into vector modulus values;
when the vector modulus value of the left foot and the vector modulus value of the right foot are both smaller than the start-stop threshold, the real person is in a static state, and the static start time is recorded;
when the vector modulus value of the left foot and/or the vector modulus value of the right foot is larger than the start-stop threshold, the real person is in a motion state, and the motion start time is recorded;
the static start time and the motion start time constitute the start-stop information of the real person.
In the above steps, converting the second motion data into vector modulus values in effect converts the gyroscope vectors into scalar moduli, each representing the magnitude of rotation at that moment. In a specific application, a threshold may be set; when the vector modulus exceeds the threshold, the current degree of rotation exceeds the degree of rotation represented by the threshold, and the measured object is judged to be in motion.
In the embodiment of the invention, because the measured object is a real person, the start-stop state is analyzed from the two groups of second motion data on the two ankles together with the preset start-stop threshold. Specifically, in the above steps, the real person is determined to be in the static state only when the second motion data of both ankles indicate a static state.
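A minimal sketch of this per-frame judgment is shown below, assuming pre-processed 3-axis angular-velocity readings for each ankle; the function name, units, and threshold value are hypothetical.

```python
import numpy as np

START_STOP_THRESHOLD = 0.3  # rad/s; a hypothetical, calibrated value

def start_stop_state(gyro_left: np.ndarray, gyro_right: np.ndarray) -> str:
    """Return 'static' or 'moving' for the current frame."""
    left_norm = np.linalg.norm(gyro_left)    # vector modulus, left ankle
    right_norm = np.linalg.norm(gyro_right)  # vector modulus, right ankle
    # Static only when BOTH ankles fall below the threshold;
    # moving as soon as either ankle exceeds it.
    if left_norm < START_STOP_THRESHOLD and right_norm < START_STOP_THRESHOLD:
        return "static"
    return "moving"
```

The caller would record a timestamp whenever the returned state changes; those timestamps form the start-stop information.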
In practical application, the setting of the start-stop threshold also affects the size of the switching delay. If the start-stop threshold is too small, problems such as transmission delay and picture-rendering delay cannot be effectively absorbed, and the final presentation lags; if it is too large, walking feels laborious and frequently stalls. Only when a proper threshold is selected, making use of the available delay time, is the start-stop delay almost imperceptible in the final experience.
In one embodiment, the start-stop threshold may be set as follows:
setting a candidate threshold set A = {a1, a2, a3, ..., aN}; measuring the switching delay obtained with each candidate threshold in the set; and taking the candidate threshold that yields the minimum switching delay as the preset start-stop threshold.
S103, when the real person is in a motion state, analyzing the current gait type based on the current frame according to the first motion data and the second motion data through a decision tree algorithm, and calculating leg motion data based on the current gait type.
According to step S103, the embodiment of the invention synchronizes the motion of the real person and the motion of the virtual character in units of frames and analyzes the gait type with a decision tree algorithm, so the motion state of the real person is reflected in time, improving the user's immersion.
In one embodiment, one implementation manner of the step S103 may be:
calculating an acceleration vector modulus and the acceleration data frequency distribution according to the first motion data;
calculating an angular velocity vector modulus and the gyroscope data frequency distribution according to the second motion data;
calculating the ratio of the acceleration vector modulus to the angular velocity vector modulus;
taking the acceleration vector modulus, the acceleration data frequency distribution, the angular velocity vector modulus, the gyroscope data frequency distribution, and the ratio of the acceleration vector modulus to the angular velocity vector modulus as the current gait data;
acquiring standard gait data of a standard gait type;
and taking the current gait data and the standard gait data as the input of the decision tree algorithm, and obtaining the current gait type according to the output of the decision tree algorithm.
In the above steps, the classification result of the decision tree algorithm affects the subsequent calculation of the leg motion data; that is, the first motion data and the second motion data influence how the leg motion data are calculated.

First, the leg motion data must truly reflect the frequency of the foot swing. Qualitatively, when the foot swings at a high frequency, the mean values of both the acceleration and the angular velocity over one swing period should be larger; when the foot swings at a low frequency, both means should be smaller. This is the principle of the speed judgment. For the accelerometer, only the magnitude of the linear acceleration is of interest, so the influence of gravity must be removed.

Analyzing how the accelerometer and gyroscope data change over one gait cycle: the accelerometer value is relatively large in the short interval when the foot has just entered the swing phase from the support phase and in the short interval when the leg begins to fall from the highest point of the swing, and relatively small in the middle, where the leg moves at a nearly constant speed. For the gyroscope, the reading is small when the foot is on the ground and near the highest point of the swing, where the angular speed approaches zero, and large near the middle of the swing. Combining the gyroscope and accelerometer measurements numerically therefore yields a more stable index that better reflects the true leg-swing frequency; in other words, the first motion data and the second motion data jointly determine the leg motion data.
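As an illustration, a minimal sketch of the feature extraction and classification is given below, using scikit-learn's DecisionTreeClassifier as a stand-in for the decision tree algorithm. Treating the standard gait data as the training set is one plausible reading of the text; the window size, tree depth, dominant-frequency summary of the "frequency distribution", and all names are assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def gait_features(accel: np.ndarray, gyro: np.ndarray) -> np.ndarray:
    """Five-element gait feature vector for one window of (n_frames, 3) samples."""
    a_norm = np.linalg.norm(accel, axis=1)   # per-frame acceleration vector modulus
    w_norm = np.linalg.norm(gyro, axis=1)    # per-frame angular velocity vector modulus
    # Dominant-frequency bin as a one-number summary of the frequency distribution.
    a_freq = float(np.argmax(np.abs(np.fft.rfft(a_norm - a_norm.mean()))))
    w_freq = float(np.argmax(np.abs(np.fft.rfft(w_norm - w_norm.mean()))))
    ratio = a_norm.mean() / max(w_norm.mean(), 1e-6)  # accel/gyro modulus ratio
    return np.array([a_norm.mean(), a_freq, w_norm.mean(), w_freq, ratio])

clf = DecisionTreeClassifier(max_depth=4)  # depth is an assumption
# clf.fit(standard_feature_matrix, standard_labels)  # labels: "step", "slide", "other"
# gait_type = clf.predict([gait_features(accel_win, gyro_win)])[0]
# confidence = clf.predict_proba([gait_features(accel_win, gyro_win)]).max()
```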
In one embodiment, calculating leg motion data based on the current gait type includes:
determining a calculated weight of the first motion data and a calculated weight of the second motion data according to the current gait type to calculate leg motion data based on the current gait type.
In the embodiment of the invention, the decision tree algorithm is provided with preset gait type labels and a confidence level for each gait type label;
the gait type labels include at least one of stepping, sliding, and others.
Taking a practical application as an example, the calculation of the leg motion data based on the current gait type is implemented as follows:
the calculation weight of the first motion data and the calculation weight of the second motion data are set according to different gait types, and can be calculated by the probability of each gait type obtained by the decision tree algorithm in the previous step, and the value of the probability is in an interval of 0-1. For example, when the probability of the stepping mode is 0.1, the calculation weight of the first motion data is set to 0.7, and the calculation weight of the second motion data is set to 0.3. When the probability of the stepping mode is 0.3, the calculation weight of the first motion data is set to be 0.5, and the calculation weight of the second motion data is set to be 0.5.
Or several groups of fixed weight values can be selected according to different motion states, for example, if the acceleration data is more important in the stepping mode, the calculation weight of the first motion data is 0.7, and the calculation weight of the second motion data is 0.3; and if the gyroscope data is emphasized in the sliding step mode, the calculation weight of the second motion data is 0.7, and the calculation weight of the first motion data is 0.3.
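A minimal sketch of the fixed-weight variant, using the example weights above; combining the two modulus means into a single leg-motion index is an assumption, since the text only states that the two measurements are combined with gait-dependent weights.

```python
GAIT_WEIGHTS = {           # (acceleration weight, gyroscope weight)
    "step":  (0.7, 0.3),   # stepping mode emphasizes acceleration data
    "slide": (0.3, 0.7),   # sliding mode emphasizes gyroscope data
    "other": (0.5, 0.5),
}

def leg_motion(gait_type: str, accel_mean: float, gyro_mean: float) -> float:
    """Weighted leg-motion index for one frame, given per-window modulus means."""
    w_accel, w_gyro = GAIT_WEIGHTS.get(gait_type, (0.5, 0.5))
    return w_accel * accel_mean + w_gyro * gyro_mean
```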
S104, synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data.
In step S104, since the current gait type and the leg motion data are counted in frames, one implementation of step S104 may be:
packing the current gait type and the leg motion data corresponding to the current frame, frame by frame, to obtain a motion data set based on the real person;
uploading the motion data set based on the real person to a host computer, and sending the motion data set based on the real person to a VR engine through the host computer;
wherein the VR engine controls the movement of the virtual character according to the motion data set based on the real person, so as to reconstruct the motion of the real person in real time.
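A minimal sketch of the per-frame packing and hand-off; the dictionary layout, JSON transport, and send_to_vr_engine call are all hypothetical, as the text only requires that the gait type and leg motion data be packed per frame and forwarded to the VR engine via the host computer.

```python
import json

def pack_frame(frame_id: int, gait_type: str, leg_motion_value: float) -> bytes:
    """Pack one frame's gait type and leg motion data for upload to the host."""
    frame = {"frame": frame_id, "gait": gait_type, "leg_motion": leg_motion_value}
    return json.dumps(frame).encode()

# Host-computer side: forward each packed frame to the VR engine, which
# drives the virtual character to reconstruct the real person's motion.
# send_to_vr_engine(pack_frame(42, "step", 1.35))   # hypothetical call
```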
In one embodiment, after sending the motion data set based on the real person to the VR engine, the method includes:
adjusting the mapping magnification of the VR engine;
the mapping magnification represents a conversion magnification for controlling the motion of the virtual character based on the motion data set of the real character according to the VR engine.
In a particular application, the mapping magnification makes a higher or lower movement speed achievable, adjusting the difficulty of movement in the virtual reality experience; for example, the mapping magnification can be increased or decreased with a slider to make it easier or harder for a game character to act.
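A minimal sketch of this scaling; the class, attribute name, and default value are assumptions.

```python
class VREngineSettings:
    """Holds the user-tunable mapping magnification of the VR engine."""

    def __init__(self, mapping_magnification: float = 1.0):
        self.mapping_magnification = mapping_magnification  # adjusted via a UI slider

    def virtual_motion(self, real_leg_motion: float) -> float:
        """Scale the real person's leg motion into the virtual character's motion."""
        return self.mapping_magnification * real_leg_motion
```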
The embodiment of the present invention further provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, and when the processor executes the computer program, the steps of the virtual character motion synchronization method in the above embodiments are implemented.
An embodiment of the present invention further provides a storage medium, where the storage medium is a computer-readable storage medium, and a computer program is stored on the storage medium, and when being executed by a processor, the computer program implements the steps in the virtual character motion synchronization method in the foregoing embodiments.
The above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit them. Although the foregoing embodiments describe the invention in detail, those of ordinary skill in the art will understand that the technical solutions described therein may still be modified, or some technical features equivalently replaced, and that such modifications and substitutions do not depart in substance from the spirit and scope of the embodiments of the present invention and are intended to be included within its scope.

Claims (10)

1. A virtual character motion synchronization method is characterized by comprising the following steps:
acquiring first motion data and second motion data in an inertial MEMS sensor;
the inertial MEMS sensor is bound to the ankles of a real person, the first motion data are acceleration data, and the second motion data are angular velocity data;
analyzing the start-stop state of the real person according to the second motion data, and acquiring start-stop information of the real person;
when the real person is in a motion state, analyzing a current gait type based on a current frame according to the first motion data and the second motion data through a decision tree algorithm, and calculating leg motion data based on the current gait type;
and synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data.
2. The virtual character motion synchronization method of claim 1, wherein after acquiring the first motion data and the second motion data in the inertial MEMS sensor, comprising:
and performing low-pass filtering and smoothing processing on the first motion data and the second motion data.
3. The virtual character motion synchronization method of claim 1, wherein analyzing the start-stop state of the real person according to the second motion data and acquiring the start-stop information of the real person comprises:
acquiring a preset start-stop threshold;
acquiring second motion data of the left foot and second motion data of the right foot, and converting the second motion data of the left foot and the second motion data of the right foot into vector modulus values;
when the vector modulus value of the left foot and the vector modulus value of the right foot are both smaller than the start-stop threshold, the real person is in a static state, and the static start time is recorded;
when the vector modulus value of the left foot and/or the vector modulus value of the right foot is larger than the start-stop threshold, the real person is in a motion state, and the motion start time is recorded;
the static start time and the motion start time constitute the start-stop information of the real person.
4. The virtual character motion synchronization method of claim 1, wherein analyzing a current gait type according to the first motion data and the second motion data through a decision tree algorithm when the real person is in a motion state comprises:
calculating an acceleration vector modulus and the acceleration data frequency distribution according to the first motion data;
calculating an angular velocity vector modulus and the gyroscope data frequency distribution according to the second motion data;
calculating the ratio of the acceleration vector modulus to the angular velocity vector modulus;
taking the acceleration vector modulus, the acceleration data frequency distribution, the angular velocity vector modulus, the gyroscope data frequency distribution, and the ratio of the acceleration vector modulus to the angular velocity vector modulus as the current gait data;
acquiring standard gait data of a standard gait type;
and taking the current gait data and the standard gait data as the input of the decision tree algorithm, and obtaining the current gait type according to the output of the decision tree algorithm.
5. The virtual character motion synchronization method of claim 4, wherein calculating leg motion data based on the current gait type comprises:
determining a calculated weight of the first motion data and a calculated weight of the second motion data according to the current gait type to calculate leg motion data based on the current gait type.
6. The virtual character motion synchronization method of claim 4 or 5, wherein the decision tree algorithm is provided with preset gait type labels and a confidence level for each gait type label;
the gait type labels include at least one of stepping, sliding, and others.
7. The virtual character motion synchronization method of claim 1, wherein synchronizing the motion of the real person and the motion of the virtual character according to the start-stop information, the current gait type, and the leg motion data comprises:
packing the current gait type and the leg motion data corresponding to the current frame, frame by frame, to obtain a motion data set based on the real person;
uploading the motion data set based on the real person to a host computer, and sending the motion data set based on the real person to a VR engine through the host computer;
wherein the VR engine controls the movement of the virtual character according to the motion data set based on the real person, so as to reconstruct the motion of the real person in real time.
8. The virtual character motion synchronization method of claim 7, wherein after sending the motion data set based on the real person to a VR engine, the method comprises:
adjusting the mapping magnification of the VR engine;
wherein the mapping magnification represents the conversion factor that the VR engine applies to the motion data set based on the real person when controlling the motion of the virtual character.
9. A terminal device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the virtual character movement synchronization method according to any one of claims 1 to 8 when executing the computer program.
10. A storage medium which is a computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the virtual character motion synchronization method according to any one of claims 1 to 8.
CN202110127008.4A 2021-01-29 2021-01-29 Virtual character motion synchronization method and terminal equipment Active CN112764545B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110127008.4A CN112764545B (en) 2021-01-29 2021-01-29 Virtual character motion synchronization method and terminal equipment

Publications (2)

Publication Number Publication Date
CN112764545A 2021-05-07
CN112764545B 2023-01-24

Family

ID=75703728

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110127008.4A Active CN112764545B (en) 2021-01-29 2021-01-29 Virtual character motion synchronization method and terminal equipment

Country Status (1)

Country Link
CN (1) CN112764545B (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110043443A1 (en) * 2006-07-14 2011-02-24 Ailive, Inc. Systems and methods for utilizing personalized motion control in virtual environment
CN102004840A (en) * 2009-08-28 2011-04-06 深圳泰山在线科技有限公司 Method and system for realizing virtual boxing based on computer
US20200060602A1 (en) * 2013-11-12 2020-02-27 Highland Instruments Motion analysis systems and methods of use thereof
CN104461013A (en) * 2014-12-25 2015-03-25 中国科学院合肥物质科学研究院 Human body movement reconstruction and analysis system and method based on inertial sensing units
CN104898669A (en) * 2015-04-30 2015-09-09 贺杰 Virtual reality walking control method and system based on inertia sensor
US20160370854A1 (en) * 2015-06-16 2016-12-22 Wilson Steele Method and System for Analyzing a Movement of a Person
WO2017024177A1 (en) * 2015-08-04 2017-02-09 Board Of Regents Of The Nevada System Of Higher Education,On Behalf Of The University Of Nevada,Reno Immersive virtual reality locomotion using head-mounted motion sensors
US20180007382A1 (en) * 2016-06-30 2018-01-04 Facebook, Inc. Systems and methods for determining motion vectors
CN108958465A (en) * 2017-05-25 2018-12-07 纽密克斯传媒有限公司 Walking analysis method, virtual reality interlock method and device
US20190076618A1 (en) * 2017-09-14 2019-03-14 Advanced Micro Devices, Inc. Method, apparatus and system for mitigating motion sickness in a virtual reality environment
WO2019114337A1 (en) * 2017-12-15 2019-06-20 阿里巴巴集团控股有限公司 Biometric authentication, identification and detection method and device for mobile terminal and equipment
CN108762498A (en) * 2018-05-22 2018-11-06 重庆子元科技有限公司 The virtual reality system of human body walking gesture stability
CN108831527A (en) * 2018-05-31 2018-11-16 古琳达姬(厦门)股份有限公司 A kind of user movement condition detection method, device and wearable device
CN109045682A (en) * 2018-07-13 2018-12-21 深圳众赢时代科技有限公司 A method of it reducing projection mobile phone and interacts body-building game propagation delay time with intelligent shoe
CN109770911A (en) * 2019-01-21 2019-05-21 北京诺亦腾科技有限公司 A kind of gait analysis method, device and storage medium
CN112169296A (en) * 2019-07-05 2021-01-05 华为技术有限公司 Motion data monitoring method and device
CN110327054A (en) * 2019-07-17 2019-10-15 袁兴光 A kind of gait analysis method and device based on acceleration and angular speed sensor

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WANG Fei et al., "Automatic classification of gait patterns based on process neural networks", Journal of Northeastern University (Natural Science) *
CHEN Yingchao et al., "Design of a MEMS-based virtual reality step-tracking system", Micro-Nanoelectronic Technology *

Also Published As

Publication number Publication date
CN112764545B (en) 2023-01-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant