CN117664173A - Calibration method, device, equipment and medium of motion capture equipment


Info

Publication number
CN117664173A
Authority
CN
China
Prior art keywords: coordinate system, conversion relation, posture conversion, head, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211021886.9A
Other languages
Chinese (zh)
Inventor
李耿磊
李汉振
山君良
Current Assignee
Beijing Zitiao Network Technology Co Ltd
Original Assignee
Beijing Zitiao Network Technology Co Ltd
Priority date
Application filed by Beijing Zitiao Network Technology Co Ltd filed Critical Beijing Zitiao Network Technology Co Ltd
Priority to CN202211021886.9A priority Critical patent/CN117664173A/en
Publication of CN117664173A publication Critical patent/CN117664173A/en
Pending legal-status Critical Current


Abstract

The application provides a calibration method, apparatus, device and medium for a motion capture device, where the motion capture device is worn on a lower limb of a human body and the method is applied to a head-mounted display device. The method comprises the following steps: determining a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, where the device coordinate system is the motion capture device coordinate system; and calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation. The method can effectively reduce the calibration difficulty of the motion capture device, shorten the calibration time, and improve the calibration precision and the calibration effect.

Description

Calibration method, device, equipment and medium of motion capture equipment
Technical Field
The embodiment of the application relates to the technical field of human motion capture, in particular to a calibration method, a device, equipment and a medium of motion capture equipment.
Background
Human motion capture technology has broad application prospects in fields such as Virtual Reality (VR) and human-computer interaction. Current human motion capture schemes mainly perform capture through a motion capture device (such as an inertial sensor, or a tracking device equipped with an inertial sensor) that collects motion data of the human body in the real world.
Before the motion capture device is used to collect human motion data, it needs to be calibrated to ensure the accuracy and precision of the collected data. In the related art, calibrating the motion capture device requires a reference object and a fixed wearing position for the device. This increases the operational difficulty for the user, causes inconvenience, and may even introduce calibration errors when the device is worn inaccurately, resulting in low calibration precision and a poor calibration effect.
Disclosure of Invention
The embodiment of the application provides a calibration method, a device, equipment and a medium for motion capture equipment, which can effectively reduce the calibration difficulty of the motion capture equipment, shorten the calibration time and improve the calibration precision and the calibration effect of the motion capture equipment.
In a first aspect, an embodiment of the present application provides a method for calibrating a motion capture device, where the motion capture device is worn on a lower limb of a human body, the method being applied to a head-mounted display device, including:
determining a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, wherein the device coordinate system is the motion capture device coordinate system;
and calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation.
In a second aspect, embodiments of the present application provide a calibration apparatus for a motion capture device, where the motion capture device is worn on a lower limb of a human body, and the apparatus is configured in a head-mounted display device, including:
the device comprises a determining module, a processing module and a processing module, wherein the determining module is used for determining a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, wherein the device coordinate system is the motion capture device coordinate system;
and the calibration module is used for calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a processor and a memory, where the memory is used to store a computer program and the processor is used to call and run the computer program stored in the memory, so as to execute the calibration method of the motion capture device in the embodiment of the first aspect or its implementations.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium storing a computer program that causes a computer to perform a method of calibrating a motion capture device as described in the embodiments of the first aspect or implementations thereof.
In a fifth aspect, embodiments of the present application provide a computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform a method of calibrating a motion capture device as described in the embodiments of the first aspect or implementations thereof.
The technical scheme disclosed by the embodiment of the application has at least the following beneficial effects:
a first posture conversion relation between the world coordinate system and the reference coordinate system and a second posture conversion relation between the reference coordinate system and the device coordinate system are determined, and the motion capture device is calibrated according to the two relations. Because the motion capture device is worn on the lower limb of the human body and calibrated using the natural motion of the human body, the calibration difficulty is effectively reduced, the calibration time is shortened, and the calibration precision and the calibration effect are improved; in turn, limb motion capture based on the calibrated device improves, providing conditions for more accurate human posture reconstruction.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 2 is a schematic diagram of a wearing usage scenario when the head-mounted display device 100 provided in the embodiment of the present application is an HMD;
FIG. 3 is a schematic diagram of a head mounted display device, a motion capture device and peripheral devices worn on different parts of a human body according to an embodiment of the present application;
FIG. 4 is a flowchart of a first calibration method for a motion capture device according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram of the relationship between a device coordinate system, a world coordinate system, and a reference coordinate system provided by an embodiment of the present application;
FIG. 6 is a flowchart of a third calibration method for a motion capture device according to an embodiment of the present application;
FIG. 7 is a flowchart of a fourth calibration method for a motion capture device according to an embodiment of the present disclosure;
FIG. 8 is a flowchart of a fifth calibration method for a motion capture device according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of the relationship between the device coordinate system, the world coordinate system, and the limb coordinate system provided by the embodiments of the present application;
FIG. 10 is a flowchart of a sixth calibration method for a motion capture device according to an embodiment of the present application;
FIG. 11 is a schematic block diagram of a calibration apparatus of a motion capture device provided in an embodiment of the present application;
FIG. 12 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
fig. 13 is a schematic block diagram of an electronic device provided in an embodiment of the present application as an HMD.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present application in light of the embodiments herein.
It should be noted that the terms "first," "second," and the like in the description and claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The present application is suitable for human motion capture scenarios. At present, human motion capture schemes mainly perform capture through a motion capture device to collect human motion data in the real world, where the motion capture device is typically an inertial sensor or a tracking device equipped with an inertial sensor. To ensure the accuracy and precision of the motion data the device acquires, the device must first be calibrated. In the related art, calibration requires a reference object and a fixed wearing position for the device, which increases the operational difficulty for the user, causes confusion, and may even introduce calibration errors due to an inaccurate wearing position, resulting in low calibration precision and a poor calibration effect. The present application therefore designs a calibration method for the motion capture device that can effectively reduce the calibration difficulty, shorten the time spent on calibration, and improve the calibration precision and the calibration effect.
In order to facilitate understanding of embodiments of the present application, before describing various embodiments of the present application, some concepts related to all embodiments of the present application are first appropriately explained, specifically as follows:
1) Virtual Reality (VR) is a technology for creating and experiencing a virtual world. It computationally generates a virtual environment built from multi-source information (the virtual reality mentioned herein at least includes visual perception, and may also include auditory, tactile, and motion perception, and even gustatory and olfactory perception), providing a fused, interactive three-dimensional dynamic view and a simulation of physical behavior that immerses the user in a simulated virtual reality environment. It finds application in virtual environments such as maps, games, video, education, medical treatment, simulation, collaborative training, sales, manufacturing assistance, and maintenance and repair.
2) A virtual reality device (VR device) may be provided in the form of glasses, a head mounted display (Head Mount Display, abbreviated as HMD), or a contact lens for realizing visual perception and other forms of perception, but the form of the virtual reality device is not limited thereto, and may be further miniaturized or enlarged according to actual needs.
Optionally, the virtual reality device described in the embodiments of the present application may include, but is not limited to, the following types:
2.1) Computer-side virtual reality (PCVR) devices, which use the PC side to perform the computation and data output for the virtual reality function; an external PCVR device uses the data output by the PC side to realize the virtual reality effect.
2.2) Mobile virtual reality devices, which support mounting a mobile terminal (e.g., a smartphone) in various ways (e.g., a head-mounted display provided with a dedicated card slot). Connected by wire or wirelessly, the mobile terminal performs the computation for the virtual reality function and outputs the data to the mobile virtual reality device, e.g., viewing virtual reality video through an APP on the mobile terminal.
2.3) All-in-one virtual reality devices, which have a processor for performing the computation of the virtual function and therefore have independent virtual reality input and output, need no connection to a PC side or a mobile terminal, and offer a high degree of freedom of use.
3) Virtual field of view: the region of the virtual environment that the user can perceive through the lens of the virtual reality device, expressed using the Field Of View (FOV) angle.
4) Augmented Reality (AR): a technique for computing, in real time, the camera pose parameters of a camera in the real world (also called the three-dimensional world) while the camera acquires images, and adding virtual elements to the acquired images according to those pose parameters. Virtual elements include, but are not limited to: images, videos, and three-dimensional models. The goal of AR technology is to overlay the virtual world on the real world on the screen and allow interaction with it.
5) Mixed Reality (MR for short): a simulated scenery integrating computer-created sensory input (e.g., a virtual object) with sensory input from a physical scenery or a representation thereof, in some MR sceneries, the computer-created sensory input may be adapted to changes in sensory input from the physical scenery. In addition, some electronic systems for rendering MR scenes may monitor orientation and/or position relative to the physical scene to enable virtual objects to interact with real objects (i.e., physical elements from the physical scene or representations thereof). For example, the system may monitor movement such that the virtual plants appear to be stationary relative to the physical building.
6) Extended Reality (XR) refers to all real and virtual combined environments and human-machine interactions generated by computer technology and wearable devices, and includes multiple forms of Virtual Reality (VR), augmented Reality (AR), and Mixed Reality (MR).
7) A virtual scene is the scene that an application program displays (or provides) when running on an electronic device. The virtual scene may be a simulation of the real world, a semi-simulated and semi-fictional virtual scene, or a purely fictional virtual scene. The virtual scene may be any one of a two-dimensional virtual scene, a 2.5-dimensional virtual scene or a three-dimensional virtual scene, and the dimension of the virtual scene is not limited in the embodiment of the present application. For example, a virtual scene may include sky, land, sea, etc., the land may include environmental elements of a desert, city, etc., and a user may control a virtual object to move in the virtual scene.
8) A virtual object is an object that interacts in a virtual scene, and is controlled by a user or a robot program (e.g., an artificial intelligence-based robot program) to be able to rest, move, and perform various actions in the virtual scene, such as various characters in a game.
In order to clearly explain the technical scheme of the application, the application scenario of the technical scheme of the application is described below. It should be understood that the technical solution of the present application may be applied to the following scenarios, but is not limited thereto:
fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 1, the application scenario 1000 may include: a head mounted display device 100 and a motion capture device 200. And, communication may be performed between the head mounted display device 100 and the motion capture device 200.
In some implementations, the head mounted display device 100 may be an HMD, such as a head mounted display in a VR all-in-one machine, and the like, which is not limited in this embodiment.
In addition, an inertial sensor and a positioning device may be provided on the head-mounted display device 100. In the embodiment of the application, the inertial sensor and the positioning device together provide 6-degree-of-freedom data comprising position and posture. The positioning device may be any device used for tracking and positioning, such as a camera. The positioning device in this embodiment is preferably an optical positioning device, so that tracking and positioning are performed through simultaneous localization and mapping (SLAM).
In some alternative implementations, as shown in fig. 2, when the head-mounted display device 100 is an HMD, the HMD is relatively light, ergonomically comfortable, and can provide low-latency, high-resolution content. An inertial sensor for posture detection, such as a nine-axis inertial sensor, is therefore provided in the HMD so that posture changes of the HMD can be detected in real time. For example, when a user wears the HMD and the user's head posture changes, the inertial sensor transmits real-time posture information of the head to the processor; the processor computes the gaze point of the user's line of sight in the virtual environment from this information, then computes the image of the three-dimensional virtual environment model that lies within the user's gaze range, that is, the virtual field of view, and displays it on the display screen, so that viewing it feels like looking at the real environment.
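The gaze computation described above reduces to rotating the headset's local forward axis into the world frame. A minimal sketch (the function names and the -z "forward" convention are our assumptions, not specified by the patent):

```python
import numpy as np

def gaze_direction(R_head, forward=np.array([0.0, 0.0, -1.0])):
    """Rotate the headset's local forward axis into the world frame to
    obtain the user's gaze ray. The -z forward axis is a common graphics
    convention and an assumption here."""
    return R_head @ forward

def yaw_rotation(theta):
    """Rotation about the y (up) axis by theta radians, for testing."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])
```

With the identity head pose the gaze ray is the forward axis itself; a 90-degree yaw swings it to the side, which is the behavior the gaze-point computation relies on.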
In some implementations, the motion capture device 200 may be an inertial sensor, or any device with an inertial sensor, such as a tracker, or the like.
In some implementations, as shown in fig. 3, the motion capture devices 200 are worn on the lower limbs of a human body, which include the thighs and the calves. That is, four motion capture devices 200 are worn on the thighs and calves of the human body, respectively, to collect thigh motion data and calf motion data (collectively, lower limb motion data), which is transmitted to the head-mounted display device 100, achieving the purpose of tracking the lower limb motion of the human body.
It should be noted that, the lower limb movement data collected by the motion capture device 200 in this embodiment is specifically three-degree-of-freedom data of the lower limb posture.
Considering that the upper limbs of the human body can also perform different actions, a peripheral device 300 is optionally worn on an upper limb, such as on the hands and/or arms. The peripheral device 300 then collects the upper limb motion data of the human body and transmits it to the head-mounted display device 100, realizing tracking of the upper limb motion; the wearing manner of the peripheral device 300 is shown in fig. 3.
In some implementations, the peripheral device 300 may be, but is not limited to: handles, gloves, hand rings, wrist bands, finger rings, and other wearable devices, etc.
The peripheral device 300 is provided with an inertial sensor that can provide 6-degree-of-freedom data including the position of the upper limb of the human body and the posture of the upper limb of the human body.
It should be noted that, the inertial sensor in this embodiment specifically refers to an inertial measurement unit (Inertial Measurement Unit, abbreviated as IMU), and the IMU is a nine-axis IMU, which may include: three-axis gyroscopes, three-axis accelerometers and three-axis magnetometers.
It can be appreciated that the embodiment of the present application may implement whole body tracking of a human body by wearing the motion capture device 200 and the peripheral apparatus 300 on limbs of the human body.
It should be understood that the head mounted display device 100, the motion capture device 200, and the peripheral device 300 shown in figs. 1-3 are illustrative only and are not limiting to the present application.
After an application scenario of the embodiments of the present application is introduced, a method for calibrating a motion capture device provided by the embodiments of the present application is described in detail below with reference to the accompanying drawings.
Fig. 4 is a flowchart of a calibration method of a motion capture device according to an embodiment of the present application. The embodiment is applicable to calibration scenarios of a motion capture device in human motion capture, and the method may be performed by a calibration apparatus of the motion capture device. The calibration apparatus may be composed of hardware and/or software and may be integrated in the head-mounted display device.
In the embodiment of the application, the head-mounted display device may be any electronic device capable of reproducing human actions. Alternatively, the electronic device may be an Extended Reality (XR) device. The XR device may be a VR device, an augmented reality (Augmented Reality, AR) device, a Mixed Reality (MR) device, or the like, which is not particularly limited in this application.
As shown in fig. 4, the method may include the steps of:
s101, determining a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, wherein the device coordinate system is a motion capture device coordinate system.
In the embodiment of the present application, the world coordinate system refers to a global world coordinate system, which may be denoted as G, where G stands for Global.
The reference coordinate system is specifically a geographic coordinate system and may be denoted as C; here it is the northeast geographic coordinate system. In embodiments of the present application, the reference coordinate system may be determined jointly by an accelerometer and a magnetometer in the motion capture device. The specific determination process can be found in the prior art and is not described in detail here.
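The patent defers the accelerometer-plus-magnetometer construction to the prior art. As a hedged sketch, one standard way to build such a geographic triad is to take gravity as "up" and cross it with the magnetic field (the function name, axis ordering, and sample values are our assumptions; the patent's northeast frame may order axes differently):

```python
import numpy as np

def reference_frame_from_sensors(accel, mag):
    """Estimate the rotation taking device-frame vectors into an
    east/north/up geographic triad from one accelerometer reading
    (gravity, device at rest) and one magnetometer reading.
    Returns a 3x3 rotation matrix whose rows are the E, N, U axes
    expressed in device coordinates."""
    up = accel / np.linalg.norm(accel)      # at rest, accel measures gravity
    east = np.cross(mag, up)                # east is horizontal, normal to B and up
    east /= np.linalg.norm(east)
    north = np.cross(up, east)              # completes the right-handed triad
    return np.vstack([east, north, up])
```

With the device axes already aligned to the geographic frame (gravity on +z, magnetic field in the north/down plane), the result is the identity, and the output is always a proper rotation.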
In the embodiment of the present application, the device coordinate system may be denoted as L_i, where L stands for Local and i is identification information of the motion capture device. The identification information may be any information capable of determining the identity of the motion capture device, such as a number or a name, and is not specifically limited here.
As an alternative implementation, the relationship between the world coordinate system, the reference coordinate system, and the device coordinate systems in the present application may be as shown in fig. 5, where O_G is the world coordinate system, O_C the reference coordinate system, O_L1 the coordinate system of a motion capture device worn on a thigh of a lower limb of the human body, and O_L2 the coordinate system of another motion capture device worn on the calf associated with that thigh.
Specifically, before S101 is executed, the user wears the head-mounted display device, the motion capture devices, and other devices on different parts of the human body, for example the head-mounted display device on the head and the motion capture devices on the lower limbs, as shown in fig. 3. When wearing these devices, the face, chest, and legs of the human body must face the same direction, providing conditions for acquiring the lower limb motion data and the head motion data of the human body in the same scene.
Then, the human body is kept in a standing state and held still for a preset period of time. When the human body is standing still, the world coordinate system, the reference coordinate system, and the device coordinate system are all fixed coordinate systems. Thus the first posture conversion relation between the world coordinate system and the reference coordinate system is determined from their fixed relationship, and is generally a constant approximating the identity matrix I.
Likewise, the second posture conversion relation between the reference coordinate system and the device coordinate system can be obtained from their fixed relationship, and is typically a constant approximating the identity matrix.
The fixed relation between the reference coordinate system and the device coordinate system can be obtained through measurement of an inertial sensor in the motion capture device.
It should be noted that the preset duration may be flexibly set according to the actual calibration requirement, which is not limited herein. For example, the preset duration may be set to 1 second(s), 2s, 3s, or the like.
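One way to verify that the wearer actually held still for the preset duration is to threshold the gyroscope magnitude over the sample window. The sketch below illustrates the idea; the function name and the threshold value are our assumptions, not values from the patent:

```python
import numpy as np

GYRO_STILL_THRESH = 0.05  # rad/s; assumed threshold, tune per device

def held_still(gyro_samples, thresh=GYRO_STILL_THRESH):
    """Return True if every gyroscope sample in the window (an N x 3
    array of angular rates in rad/s) stays below the magnitude
    threshold, i.e. the wearer kept the standing pose for the whole
    preset duration."""
    mags = np.linalg.norm(np.asarray(gyro_samples, dtype=float), axis=1)
    return bool(np.all(mags < thresh))
```

A window of near-zero rates passes; any sustained rotation fails the check, in which case the calibration window would be restarted.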
S102, calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation.
After the first posture conversion relation between the world coordinate system and the reference coordinate system and the second posture conversion relation between the reference coordinate system and the equipment coordinate system are obtained, the method and the device can calibrate the motion capture equipment according to the first posture conversion relation and the second posture conversion relation.
Specifically, a target posture conversion relationship of a world coordinate system and a device coordinate system can be determined according to the first posture conversion relationship and the second posture conversion relationship; and then, calibrating the motion capture device according to the target posture conversion relation between the world coordinate system and the device coordinate system.
As an optional implementation manner, determining the target posture conversion relationship of the world coordinate system and the device coordinate system according to the first posture conversion relationship and the second posture conversion relationship may be implemented by the following formula (1):

$${}^{G}R_{L_i} = {}^{G}R_{C} \cdot {}^{C}R_{L_i} \tag{1}$$

where ${}^{G}R_{L_i}$ is the target posture conversion relationship between the world coordinate system and the i-th device coordinate system, ${}^{G}R_{C}$ is the first posture conversion relationship between the world coordinate system and the reference coordinate system, ${}^{C}R_{L_i}$ is the second posture conversion relationship between the reference coordinate system and the i-th device coordinate system, and $i \in [1, 4]$.
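Formula (1) is a plain rotation-matrix product applied once per tracker. A sketch for the four lower-limb devices (matrix and function names are ours):

```python
import numpy as np

def target_pose_relations(R_G_C, R_C_L):
    """Formula (1): for each device i, R_G_Li = R_G_C @ R_C_Li.
    R_G_C is the 3x3 first posture conversion (world <- reference);
    R_C_L is a list of four 3x3 second conversions (reference <- device i).
    Returns the four target conversions (world <- device i)."""
    return [R_G_C @ R_C_Li for R_C_Li in R_C_L]
```

Since at standstill both factors approximate the identity, each target relation starts near the identity, matching the constants described above.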
It can be appreciated that the present application calibrates the motion capture device by using the first posture conversion relationship between the world coordinate system and the reference coordinate system and the second posture conversion relationship between the reference coordinate system and the device coordinate system. That is, the two relationships serve as the bridge for calibrating the motion capture device. Compared with the related-art approach of calibrating by means of a reference object and a fixed wearing manner of the device, this method is simpler and easier to realize, and it avoids calibration errors caused by the user wearing the device in an inaccurate direction, so the accuracy and correctness of the calibration can be improved.
According to the calibration method of the motion capture device provided by the embodiments of the present application, the first posture conversion relation between the world coordinate system and the reference coordinate system and the second posture conversion relation between the reference coordinate system and the device coordinate system are determined, and the motion capture device is then calibrated according to these two relations. Because the motion capture device is worn on the lower limb of the human body and calibrated using the natural motion of the human body, the calibration difficulty is effectively reduced, the calibration time is shortened, and the calibration precision and effect are improved; based on the calibrated device, the limb motion capture effect improves, providing conditions for more accurate human posture reconstruction.
As can be seen from the above description, the embodiments of the present application calibrate the motion capture device according to the first posture conversion relationship and the second posture conversion relationship.
On the basis of the above embodiment, the first posture conversion relation between the world coordinate system and the reference coordinate system in the embodiment of the present application is further explained below, with reference to fig. 6 in particular.
As shown in fig. 6, the method may include the steps of:
S201, determining the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, and the posture conversion relationship between the reference coordinate system and the head-mounted coordinate system.
The head-mounted coordinate system is a head-mounted display device coordinate system.
In practical applications, the head-mounted display device worn on the head of the human body determines the gaze point of the user's line of sight in the virtual environment according to the user's head posture, so as to display the corresponding virtual image to the user according to the gaze point. The head posture is acquired by an inertial sensor disposed in the head-mounted display device; that is, the head-mounted display device of the present application may include an inertial sensor. In addition, the head-mounted display device may further include a positioning device, so as to realize tracking and positioning by using the positioning device.
Therefore, when determining the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, the present application may include the following cases:
First case
If the head-mounted display device comprises an inertial sensor, determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system specifically comprises: determining the posture conversion relation between the inertial coordinate system and the reference coordinate system, and determining the posture conversion relation between the reference coordinate system and the world coordinate system; determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the inertial coordinate system and the reference coordinate system and the posture conversion relation of the reference coordinate system and the world coordinate system; the inertial coordinate system is an inertial sensor coordinate system.
In the embodiment of the application, the posture conversion relation between the inertial coordinate system and the reference coordinate system can be measured by an inertial sensor in the head-mounted display device. Specifically, the determination is based on a magnetometer and an accelerometer on an inertial sensor, wherein the determination process is conventional in the art, and is not described in detail herein.
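The construction from magnetometer and accelerometer readings is described above as conventional in the art. For illustration only, one common approach is a TRIAD-style construction; the following sketch (the function name, the north-east-down reference frame, and the use of NumPy are assumptions of this example, not details given by the application) builds a rotation matrix from a gravity reading and a magnetic-field reading, both expressed in the inertial-sensor frame:

```python
import numpy as np

def attitude_from_accel_mag(accel, mag):
    """TRIAD-style sketch: derive a rotation matrix from an accelerometer
    reading (gravity direction) and a magnetometer reading (field direction),
    both in the inertial-sensor frame. Returns R mapping sensor-frame vectors
    into a gravity/magnetic-north reference frame (an assumed convention)."""
    down = -accel / np.linalg.norm(accel)   # at rest, the accelerometer measures -gravity
    east = np.cross(down, mag)              # east is perpendicular to both
    east /= np.linalg.norm(east)
    north = np.cross(east, down)
    # Rows are the reference axes expressed in sensor coordinates, so
    # R @ v_sensor yields the vector's components in the reference frame.
    return np.vstack([north, east, down])
```

When the sensor frame happens to coincide with the north-east-down reference frame, the function returns the identity matrix, consistent with the approximation ${}^{C}R_{G} \approx I$ used below.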
Therefore, the present application can determine the posture conversion relationship of the world coordinate system and the head-mounted coordinate system by the following formula (2):
${}^{G}R_{HMD} = \left({}^{HMD\_imu}R_{C} \cdot {}^{C}R_{G}\right)^{-1}$   (2)

wherein ${}^{G}R_{HMD}$ is the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, ${}^{HMD\_imu}R_{C}$ is the posture conversion relationship between the inertial coordinate system and the reference coordinate system, ${}^{C}R_{G}$ is the posture conversion relationship between the reference coordinate system and the world coordinate system, and $(\cdot)^{-1}$ denotes matrix inversion. ${}^{C}R_{G}$ is determined based on the fixed relationship between the reference coordinate system and the world coordinate system; this posture conversion relationship is generally a constant, approximately the identity matrix $I$.
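As a minimal illustration of formula (2), the composition can be written in a few lines of NumPy; the naming convention `R_a_b` for the posture conversion relationship ${}^{a}R_{b}$ and the function name are assumptions of this sketch:

```python
import numpy as np

def world_to_headset(R_HMDimu_C, R_C_G):
    # Formula (2): R_G_HMD = (R_HMDimu_C @ R_C_G)^(-1).
    # For proper rotation matrices the inverse equals the transpose.
    return np.linalg.inv(R_HMDimu_C @ R_C_G)
```

Since ${}^{C}R_{G}$ is approximately the identity, ${}^{G}R_{HMD}$ reduces in practice to the inverse of the sensor-measured ${}^{HMD\_imu}R_{C}$.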
Second case
If the head-mounted display device comprises a positioning device and an inertial sensor, determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system specifically comprises the following steps: determining the posture conversion relation between the world coordinate system and the positioning device coordinate system and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system; determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the world coordinate system and the positioning device coordinate system and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system; the inertial coordinate system is an inertial sensor coordinate system.
When the head-mounted display device is worn on the head of the user, its position is fixed. Accordingly, if the head-mounted display device includes a positioning device and an inertial sensor, the posture conversion relationship between the positioning device coordinate system and the world coordinate system is a fixed relationship and a known quantity. At the same time, the positions of the positioning device and the inertial sensor on the head-mounted display device are fixed, so the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system is also a fixed, known quantity. In the present embodiment, each of these known quantities is generally a constant that approximates the identity matrix $I$.
Correspondingly, determining the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, based on the posture conversion relationship between the world coordinate system and the positioning device coordinate system and the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, can be realized through the following formula (3):
G R HMDG R HMD_d · HMD_d R HMD_imu ..............................(3)
wherein, G R HMD is the posture conversion relation of the world coordinate system and the head-wearing coordinate system, G R HMD_d is the posture conversion relation between the world coordinate system and the positioning device coordinate system, HMD-d R HMD_imu The method is a posture conversion relation between a positioning device coordinate system and an inertial coordinate system.
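Formula (3) is a plain chained rotation product in which the intermediate positioning-device frame cancels. A minimal sketch, assuming NumPy and the `R_a_b` naming for ${}^{a}R_{b}$:

```python
import numpy as np

def world_to_headset_tracked(R_G_HMDd, R_HMDd_imu):
    # Formula (3): R_G_HMD = R_G_HMDd @ R_HMDd_imu; chaining the
    # world->positioning-device and positioning-device->inertial
    # posture conversion relationships.
    return R_G_HMDd @ R_HMDd_imu
```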
In addition, when determining the posture conversion relation between the reference coordinate system and the head-mounted coordinate system, the method specifically comprises the following steps: determining the posture conversion relation between the reference coordinate system and the inertial coordinate system; the inertial coordinate system is an inertial sensor coordinate system.
The attitude conversion relation between the reference coordinate system and the inertial coordinate system is determined and can be measured by an inertial sensor.
It should be understood that, in the embodiment of the present application, the pose conversion relationship between the world coordinate system and the head-mounted coordinate system is determined, and in fact, the pose conversion relationship between the world coordinate system and the inertial coordinate system in the head-mounted display device is determined;
accordingly, the posture conversion relationship of the reference coordinate system and the head-mounted coordinate system is determined, and in fact, the posture conversion relationship of the reference coordinate system and the inertial coordinate system in the head-mounted display device is determined.
S202, determining a first posture conversion relation of the world coordinate system and the reference coordinate system according to the posture conversion relation of the world coordinate system and the head-mounted coordinate system and the posture conversion relation of the reference coordinate system and the head-mounted coordinate system.
Specifically, according to the posture conversion relationship between the world coordinate system and the head-mounted coordinate system and the posture conversion relationship between the reference coordinate system and the head-mounted coordinate system, the first posture conversion relationship between the world coordinate system and the reference coordinate system is determined by the following formula (4):
G R CG R HMD ·( C R HMD ) -1 ......................................(4)
Wherein, G R C for the first pose conversion relationship of the world coordinate system and the reference coordinate system, G R HMD is the posture conversion relation of the world coordinate system and the head-wearing coordinate system, C R HMD the posture conversion relation of the reference coordinate system and the head-wearing coordinate system is calculated, and minus 1 is calculated as the inversion.
The posture conversion relationship between the world coordinate system and the head-mounted coordinate system can take either of the forms described in step S201 above, and the posture conversion relationship between the reference coordinate system and the head-mounted coordinate system is the posture conversion relationship between the reference coordinate system and the inertial coordinate system. Therefore, formula (4) for the first posture conversion relationship between the world coordinate system and the reference coordinate system may be rewritten, based on the foregoing formulas (2) and (3), into formula (5) and formula (6), as follows:
${}^{G}R_{C} = \left({}^{HMD\_imu}R_{C} \cdot {}^{C}R_{G}\right)^{-1} \cdot \left({}^{C}R_{HMD\_imu}\right)^{-1}$   (5)

wherein ${}^{G}R_{C}$ is the first posture conversion relationship between the world coordinate system and the reference coordinate system, $({}^{HMD\_imu}R_{C} \cdot {}^{C}R_{G})^{-1}$ is the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, ${}^{HMD\_imu}R_{C}$ is the posture conversion relationship between the inertial coordinate system and the reference coordinate system, ${}^{C}R_{G}$ is the posture conversion relationship between the reference coordinate system and the world coordinate system, ${}^{C}R_{HMD\_imu}$ is the posture conversion relationship between the reference coordinate system and the inertial coordinate system, and $(\cdot)^{-1}$ denotes matrix inversion.
G R CG R HMD_d · HMDd R HMD_imu ·( C R HMD_imu ) -1 .................................(6)
Wherein, G R C for the first pose conversion relationship of the world coordinate system and the reference coordinate system, G R HMD_d Is the posture conversion relation between the world coordinate system and the positioning device coordinate system, HMD_d R HMD_imu for the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, C R HMD_imu the posture conversion relation of the reference coordinate system and the inertial coordinate system is obtained, and the-1 is the inversion.
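Formulas (5) and (6) admit a direct sketch; the function names and the `R_a_b` naming for ${}^{a}R_{b}$ are assumptions of this NumPy illustration:

```python
import numpy as np

def first_pose_conversion_imu(R_HMDimu_C, R_C_G, R_C_HMDimu):
    # Formula (5): R_G_C = (R_HMDimu_C @ R_C_G)^(-1) @ (R_C_HMDimu)^(-1)
    return np.linalg.inv(R_HMDimu_C @ R_C_G) @ np.linalg.inv(R_C_HMDimu)

def first_pose_conversion_tracked(R_G_HMDd, R_HMDd_imu, R_C_HMDimu):
    # Formula (6): R_G_C = R_G_HMDd @ R_HMDd_imu @ (R_C_HMDimu)^(-1)
    return R_G_HMDd @ R_HMDd_imu @ np.linalg.inv(R_C_HMDimu)
```

As a sanity check, when ${}^{C}R_{G} = I$ and ${}^{HMD\_imu}R_{C} = ({}^{C}R_{HMD\_imu})^{-1}$, formula (5) yields the identity, consistent with the reference frame approximately coinciding with the world frame.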
S203, determining a second posture conversion relation of the reference coordinate system and a device coordinate system, wherein the device coordinate system is a motion capture device coordinate system.
S204, calibrating the motion capture device according to the first gesture conversion relation and the second gesture conversion relation.
Based on the calculation formulas (5) and (6) for the first posture conversion relationship between the world coordinate system and the reference coordinate system determined in step S202, the present application determines the target posture conversion relationship between the world coordinate system and the device coordinate system according to the first and second posture conversion relationships, through the following formulas (7) and (8).
Formulas (7) and (8) are obtained by rewriting formula (1) based on formulas (5) and (6), specifically:
${}^{G}R_{L_i} = \left({}^{HMD\_imu}R_{C} \cdot {}^{C}R_{G}\right)^{-1} \cdot \left({}^{C}R_{HMD\_imu}\right)^{-1} \cdot {}^{C}R_{L_i}$   (7)

wherein ${}^{G}R_{L_i}$ is the target posture conversion relationship between the world coordinate system and the i-th device coordinate system, $({}^{HMD\_imu}R_{C} \cdot {}^{C}R_{G})^{-1}$ is the posture conversion relationship between the world coordinate system and the head-mounted coordinate system, ${}^{HMD\_imu}R_{C}$ is the posture conversion relationship between the inertial coordinate system and the reference coordinate system, ${}^{C}R_{G}$ is the posture conversion relationship between the reference coordinate system and the world coordinate system, ${}^{C}R_{HMD\_imu}$ is the posture conversion relationship between the reference coordinate system and the inertial coordinate system, $(\cdot)^{-1}$ denotes matrix inversion, ${}^{C}R_{L_i}$ is the second posture conversion relationship between the reference coordinate system and the i-th device coordinate system, and $i \in [1,4]$.
${}^{G}R_{L_i} = {}^{G}R_{HMD\_d} \cdot {}^{HMD\_d}R_{HMD\_imu} \cdot \left({}^{C}R_{HMD\_imu}\right)^{-1} \cdot {}^{C}R_{L_i}$   (8)

wherein ${}^{G}R_{L_i}$ is the target posture conversion relationship between the world coordinate system and the i-th device coordinate system, ${}^{G}R_{HMD\_d}$ is the posture conversion relationship between the world coordinate system and the positioning device coordinate system, ${}^{HMD\_d}R_{HMD\_imu}$ is the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, ${}^{C}R_{HMD\_imu}$ is the posture conversion relationship between the reference coordinate system and the inertial coordinate system, $(\cdot)^{-1}$ denotes matrix inversion, ${}^{C}R_{L_i}$ is the second posture conversion relationship between the reference coordinate system and the i-th device coordinate system, and $i \in [1,4]$.
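The target posture conversion relationship amounts to composing the first posture conversion relationship with each device's second posture conversion relationship. A minimal sketch, assuming NumPy and the `R_a_b` naming for ${}^{a}R_{b}$:

```python
import numpy as np

def target_pose_conversion(R_G_C, R_C_Li):
    # R_G_Li = R_G_C @ R_C_Li: the world-frame pose conversion of the
    # i-th device frame, composed from the first posture conversion
    # relationship and the i-th second posture conversion relationship.
    return R_G_C @ R_C_Li

# For four devices (i in 1..4), the target relationships would be, e.g.:
#   targets = [target_pose_conversion(R_G_C, R) for R in second_relations]
```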
And calibrating the motion capture device based on the determined target posture conversion relation.
As described above, determining the first posture conversion relationship between the world coordinate system and the reference coordinate system and the second posture conversion relationship between the reference coordinate system and the device coordinate system, and then calibrating the motion capture device worn on the lower limb according to these two relationships by means of the natural motion of the human body, effectively reduces the calibration difficulty, shortens the calibration time, and improves the calibration precision and effect, thereby improving limb motion capture and providing the conditions for accurate human posture reconstruction.
As another alternative implementation, after calibrating the motion capture device, the present application may determine the pose of the motion capture device in the world coordinate system based on the calibrated motion capture device. The following describes a specific description of determining the pose of the motion capture device in the world coordinate system according to the embodiment of the present application with reference to fig. 7.
As shown in fig. 7, the method may include the steps of:
S301, determining a first posture conversion relationship between a world coordinate system and a reference coordinate system and a second posture conversion relationship between the reference coordinate system and a device coordinate system, wherein the device coordinate system is a motion capture device coordinate system.
S302, calibrating the motion capture device according to the first gesture conversion relation and the second gesture conversion relation.
S303, acquiring lower limb motion data of a human body acquired by the motion capture device.
The lower limb motion data specifically refers to the lower limb posture data collected by the motion capture device while the user, after wearing the various devices, stands still.
Because the motion capture device is worn on the lower limb and the human body is standing still, the lower limb motion data it collects is also the motion data of the motion capture device itself. Therefore, the present application acquires the lower limb motion data of the human body through the motion capture device worn on the lower limb, and the collected data is sent to the head-mounted display device, so that the head-mounted display device can determine the initial posture of the motion capture device in the world coordinate system based on the lower limb motion data sent by the motion capture device.
S304, determining the initial posture of the motion capture device under the world coordinate system according to the lower limb motion data and the target posture conversion relation.
Specifically, after the head-mounted display device receives the lower limb motion data sent by the motion capture device, it extracts the target lower limb motion data corresponding to the initial moment (i.e., time t0) and substitutes this data into the target posture conversion relationship, obtaining the initial posture of the motion capture device in the world coordinate system at the initial moment.
The lower limb motion data sent by the motion capture device corresponds to a series of moments. Therefore, after determining the initial posture of the motion capture device in the world coordinate system at the initial moment, the present application may further determine the real-time posture of the motion capture device in the world coordinate system based on the initial posture and the motion data at subsequent moments, as described in steps S305 to S306.
S305, performing integral processing on the lower limb movement data to obtain integral data.
The lower limb motion data includes lower limb gyroscope data, lower limb accelerometer data, and lower limb magnetometer data, and the posture of the motion capture device in the world coordinate system is determined mainly from the gyroscope data. Therefore, the present application performs the integration on the lower limb gyroscope data only, which reduces the data processing load and speeds up the determination of the posture of the motion capture device in the world coordinate system.
S306, determining the real-time gesture of the motion capture device under the world coordinate system according to the integral data, the initial gesture and the target gesture conversion relation.
After the integral data is obtained, the integral data and the initial gesture can be substituted into a target gesture conversion relation, so that the real-time gesture of the motion capture device under the world coordinate system is obtained.
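The propagation in steps S305–S306 can be sketched as follows. The first-order exponential-map update (Rodrigues' formula per gyroscope sample) and the function names are assumptions of this example; the application does not specify an integration scheme:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector, so that skew(a) @ b = a x b."""
    wx, wy, wz = w
    return np.array([[0.0, -wz, wy],
                     [wz, 0.0, -wx],
                     [-wy, wx, 0.0]])

def integrate_gyro(R0, gyro_samples, dt):
    """Propagate the initial pose R0 (device frame in the world frame) by
    integrating body-frame angular rates (rad/s) sampled at interval dt."""
    R = R0.copy()
    for w in gyro_samples:
        angle = np.linalg.norm(w) * dt
        if angle < 1e-12:
            continue                      # negligible rotation this step
        K = skew(w / np.linalg.norm(w))
        # Rodrigues' formula for the incremental rotation about axis w/|w|
        dR = np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)
        R = R @ dR                        # right-multiply: rate is in the body frame
    return R
```

Here the integral data corresponds to the accumulated product of incremental rotations, and the initial posture R0 would itself be obtained from the target posture conversion relationship as in step S304.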
According to the calibration method of the motion capture device provided by this embodiment, the motion capture device worn on the lower limb is calibrated through the first and second posture conversion relationships, which reduces the calibration difficulty, shortens the calibration time, and improves the calibration precision and effect. In addition, after the calibration, the initial posture of the motion capture device in the world coordinate system is determined by acquiring the lower limb motion data collected by the motion capture device and substituting the target lower limb motion data corresponding to the initial moment into the target posture conversion relationship determined from the first and second posture conversion relationships.
The lower limb gyroscope data is then integrated, and the integral data and the initial posture are substituted into the target posture conversion relationship to determine the real-time posture of the motion capture device in the world coordinate system, completing the alignment operation of the motion capture device while reducing the data processing load and speeding up the posture determination.
As an alternative implementation scenario, considering that the motion capture device is worn on a lower limb of the human body, after the motion capture device is calibrated, the posture conversion relationship between the limb coordinate system and the world coordinate system can optionally be determined, and the posture of the lower limb corresponding to the motion capture device in the world coordinate system can then be determined based on that relationship. The process of determining the posture conversion relationship between the limb coordinate system and the world coordinate system, and of determining the posture of the lower limb in the world coordinate system based on it, is described below with reference to fig. 8.
As shown in fig. 8, the method may include the steps of:
S401, determining a third posture conversion relationship between a limb coordinate system and a device coordinate system, wherein the limb coordinate system is a lower limb coordinate system.
In the embodiment of the present application, the limb coordinate system refers to the coordinate system of a lower limb of the human body and may be denoted as B_i, where B stands for Body and i is the identification information of the limb corresponding to the limb coordinate system. The identification information may be any information capable of identifying the limb, such as the number or name of a major bone of the human body, and is not particularly limited herein.
As an alternative implementation, the relationship among the device coordinate systems, the world coordinate system, and the limb coordinate systems in the embodiment of the present application may be as shown in fig. 9, wherein O_G represents the world coordinate system, O_B1 represents the limb coordinate system of a thigh, O_B2 represents the limb coordinate system of the calf associated with that thigh, O_L1 represents the device coordinate system of a motion capture device worn on the thigh, and O_L2 represents the device coordinate system of another motion capture device worn on the calf.
It is contemplated that after the motion capture device is worn on the lower limb, it is stationary relative to the lower limb. Correspondingly, the posture conversion relationship between the limb coordinate system and the device coordinate system is also fixed; that is, it is a known quantity, generally a constant approximating the identity matrix.
S402, determining the posture conversion relation between the limb coordinate system and the world coordinate system according to the first posture conversion relation, the second posture conversion relation and the third posture conversion relation.
Specifically, the posture conversion relationship between the limb coordinate system and the world coordinate system may be determined based on the first posture conversion relationship, the second posture conversion relationship, and the third posture conversion relationship by the following formula (9):
${}^{G}R_{B_i} = {}^{G}R_{L_i} \cdot \left({}^{B_i}R_{L_i}\right)^{-1}$, with ${}^{G}R_{L_i} = {}^{G}R_{C} \cdot {}^{C}R_{L_i}$   (9)

wherein ${}^{G}R_{B_i}$ is the posture conversion relationship between the i-th limb coordinate system and the world coordinate system, ${}^{B_i}R_{L_i}$ is the third posture conversion relationship between the i-th limb coordinate system and the i-th device coordinate system, ${}^{G}R_{L_i}$ is the target posture conversion relationship between the world coordinate system and the i-th device coordinate system, ${}^{G}R_{C}$ is the first posture conversion relationship between the world coordinate system and the reference coordinate system, ${}^{C}R_{L_i}$ is the second posture conversion relationship between the reference coordinate system and the i-th device coordinate system, $i \in [1,4]$, and $(\cdot)^{-1}$ denotes matrix inversion.
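A minimal NumPy sketch of this composition; the function name and the `R_a_b` naming for ${}^{a}R_{b}$ are assumptions of this example:

```python
import numpy as np

def limb_in_world(R_G_Li, R_Bi_Li):
    # Formula (9): R_G_Bi = R_G_Li @ (R_Bi_Li)^(-1), mapping the i-th
    # limb frame into the world frame via the i-th device frame.
    return R_G_Li @ np.linalg.inv(R_Bi_Li)
```

Since the third posture conversion relationship approximates the identity (the device is strapped rigidly to the limb), the limb pose approximately equals the device pose in the world coordinate system.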
S403, determining the posture of the lower limb under the world coordinate system according to the posture data of the lower limb and the posture conversion relation between the limb coordinate system and the world coordinate system.
It is contemplated that a motion capture device worn on a person's lower limb, while capable of capturing 6 degrees of freedom including position and pose, provides only 3 degrees of freedom data of pose to the head mounted display device. That is, in the embodiment of the present application, the lower limb posture data is acquired based on the motion capture device worn on the lower limb of the human body.
Furthermore, the application can substitute the lower limb posture data into the posture conversion relation between the limb coordinate system and the world coordinate system so as to obtain the posture of the lower limb under the world coordinate system.
The lower limb posture data includes thigh posture data and shank posture data. Therefore, determining the posture of the lower limb in the world coordinate system specifically means determining the posture of each thigh in the world coordinate system and the posture of each shank in the world coordinate system.
According to the calibration method of the motion capture device provided by this embodiment, the third posture conversion relationship between the limb coordinate system and the device coordinate system is determined; the posture conversion relationship between the limb coordinate system and the world coordinate system is then determined from the first posture conversion relationship between the world coordinate system and the reference coordinate system, the second posture conversion relationship between the reference coordinate system and the device coordinate system, and the third posture conversion relationship; and the posture of the lower limb in the world coordinate system is determined from the lower limb posture data and the posture conversion relationship between the limb coordinate system and the world coordinate system. Calibrating the motion capture device worn on the lower limb by means of the natural motion of the human body reduces the calibration difficulty, shortens the calibration time, and improves the calibration precision and effect, thereby improving limb motion capture and providing the conditions for accurate human posture reconstruction. Moreover, the limb coordinate system can be calibrated based on the posture conversion relationship between the limb coordinate system and the world coordinate system, and the posture of the lower limb in the world coordinate system can be determined based on the calibrated limb coordinate system, improving the accuracy and correctness of that determination and the use experience of the user.
On the basis of the above embodiment, a third posture conversion relationship between the limb coordinate system and the device coordinate system determined in the embodiment of the present application will be further explained, with reference to fig. 10 in particular.
As shown in fig. 10, the method may include the steps of:
S501, determining the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system and the posture conversion relationship between the head-mounted coordinate system and the device coordinate system.
The head-mounted coordinate system is a head-mounted display device coordinate system.
In practical applications, the head-mounted display device worn on the head of the human body determines the gaze point of the user's line of sight in the virtual environment according to the user's head posture, so as to display the corresponding virtual image to the user according to the gaze point. The head posture is acquired by an inertial sensor disposed in the head-mounted display device; that is, the head-mounted display device of the present application may include an inertial sensor, and may further include a positioning device to realize tracking and positioning.
Therefore, when determining the posture conversion relation between the limb coordinate system and the head-mounted coordinate system, the present application may include the following cases:
First case
If the head-mounted display device comprises an inertial sensor, determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system specifically comprises the following steps: determining the posture conversion relation between the limb coordinate system and the inertial coordinate system; the inertial coordinate system is an inertial sensor coordinate system.
It is considered that the limb coordinate system is fixed relative to the human leg, and the head-mounted display device worn on the head of the human body is likewise fixed. Therefore, the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system is a constant approximating the identity matrix $I$, and it can in particular be obtained from the inertial sensor in the head-mounted display device.
In the embodiment of the application, determining the posture conversion relationship between the limb coordinate system and the inertial coordinate system can be achieved by the following formula (10):
${}^{B_i}R_{HMD} = {}^{B_i}R_{HMD\_imu}$   (10)

wherein ${}^{B_i}R_{HMD}$ is the posture conversion relationship between the i-th limb coordinate system and the head-mounted coordinate system, ${}^{B_i}R_{HMD\_imu}$ is the posture conversion relationship between the i-th limb coordinate system and the inertial coordinate system, and $i \in [1,4]$.
Second case
If the head-mounted display device includes a positioning device and an inertial sensor, determining the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system specifically includes: determining the posture conversion relationship between the limb coordinate system and the positioning device coordinate system and the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system; and determining the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system according to these two relationships. The inertial coordinate system is the inertial sensor coordinate system.
Consider that the positions of the positioning device and the inertial sensor in the head-mounted display device are fixed and known, as is the lower limb of the human body. Therefore, the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system is a known quantity, and the posture conversion relationship between the limb coordinate system and the positioning device coordinate system is also a known quantity. Moreover, each known quantity is typically a constant that approximates the identity matrix I.
Therefore, the determination of the posture conversion relationship of the limb coordinate system and the head-mounted coordinate system can be achieved by the following formula (11):
limb_i R HMD = limb_i R HMD_d · HMD_d R HMD_imu    (11)

wherein limb_i R HMD is the posture conversion relationship between the i-th limb coordinate system and the head-mounted coordinate system, limb_i R HMD_d is the posture conversion relationship between the i-th limb coordinate system and the positioning device coordinate system, HMD_d R HMD_imu is the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, and i ∈ [1,4].
Third case
If the head-mounted display device comprises a positioning device and an inertial sensor, determining the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system specifically comprises: determining the posture conversion relationship between the limb coordinate system and the head coordinate system, the posture conversion relationship between the head coordinate system and the positioning device coordinate system, and the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system; and determining the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system according to these three posture conversion relationships. The inertial coordinate system is the inertial sensor coordinate system.
The head coordinate system specifically refers to a coordinate system of a physical head of a human body.
In the present embodiment, the physical head of the human body and the lower limb of the human body belong to different parts of the human body, and these parts are known and fixed. That is, the head coordinate system corresponding to the physical head and the limb coordinate system corresponding to the lower limb are both known; correspondingly, the posture conversion relationship between the head coordinate system and the limb coordinate system is also a known quantity. Moreover, this known quantity is generally a constant that approximates the identity matrix I.
Further, because the positions of the positioning device and the inertial sensor included in the head-mounted display device are fixed, the posture conversion relationship between the corresponding positioning device coordinate system and the inertial coordinate system is a known quantity, and the posture conversion relationship between the head coordinate system and the positioning device coordinate system is also a known quantity. Moreover, each known quantity is generally a constant that approximates the identity matrix I.
Therefore, the determination of the posture conversion relation between the limb coordinate system and the head-mounted coordinate system can be realized by the following formula (12):
limb_i R HMD = limb_i R head · head R HMD_d · HMD_d R HMD_imu    (12)

wherein limb_i R HMD is the posture conversion relationship between the i-th limb coordinate system and the head-mounted coordinate system, limb_i R head is the posture conversion relationship between the i-th limb coordinate system and the head coordinate system, head R HMD_d is the posture conversion relationship between the head coordinate system and the positioning device coordinate system, HMD_d R HMD_imu is the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, and i ∈ [1,4].
In addition, in the embodiment of the present application, the posture conversion relationship between the head-mounted coordinate system and the device coordinate system is determined specifically as follows: determining the posture conversion relationship between the head-mounted coordinate system and the world coordinate system and the posture conversion relationship between the world coordinate system and the device coordinate system; and determining the posture conversion relationship between the head-mounted coordinate system and the device coordinate system according to the posture conversion relationship between the head-mounted coordinate system and the world coordinate system and the posture conversion relationship between the world coordinate system and the device coordinate system.
The implementation principle of determining the posture conversion relationship between the head-mounted coordinate system and the world coordinate system is the same as or similar to that of determining the posture conversion relationship between the world coordinate system and the head-mounted coordinate system in step S201 in the previous embodiment, and the specific implementation process may refer to the foregoing embodiment, and will not be repeated herein. That is, the posture conversion relationship between the head-mounted coordinate system and the world coordinate system in this embodiment is actually a conversion relationship obtained by inverting the posture conversion relationship between the world coordinate system and the head-mounted coordinate system.
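As an illustrative aside (not part of the patent text): because posture conversion relationships here are rotation matrices, the inversion mentioned above reduces to a matrix transpose, since rotation matrices are orthogonal. A minimal pure-Python sketch under that assumption, with illustrative matrix names:

```python
import math

def transpose(R):
    # Transpose a 3x3 matrix given as nested lists.
    return [[R[j][i] for j in range(3)] for i in range(3)]

def mat_mul(A, B):
    # Multiply two 3x3 matrices.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Example: G_R_HMD rotates by 30 degrees about the z-axis
# (world -> head-mounted posture conversion; values are illustrative).
theta = math.radians(30.0)
G_R_HMD = [[math.cos(theta), -math.sin(theta), 0.0],
           [math.sin(theta),  math.cos(theta), 0.0],
           [0.0, 0.0, 1.0]]

# For a rotation matrix the inverse equals the transpose, so the
# head-mounted-to-world relation is obtained without a general inverse.
HMD_R_G = transpose(G_R_HMD)

# Sanity check: composing the two gives (numerically) the identity.
identity = mat_mul(G_R_HMD, HMD_R_G)
```

This is why obtaining the head-mounted-to-world relation from the world-to-head-mounted relation is cheap in practice.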
In addition, the implementation principle of determining the posture-transformation relationship between the world coordinate system and the device coordinate system is the same as or similar to that of determining the target posture-transformation relationship between the world coordinate system and the device coordinate system in the step S101 to the step S102 in the foregoing embodiment, and the specific implementation process may be referred to in the foregoing embodiment, and will not be repeated here.
Specifically, the posture conversion relationship between the head-mounted coordinate system and the device coordinate system can be determined by the following formula (13):

HMD R dev_i = HMD R G · G R dev_i    (13)

wherein HMD R dev_i is the posture conversion relationship between the head-mounted coordinate system and the i-th device coordinate system, HMD R G is the posture conversion relationship between the head-mounted coordinate system and the world coordinate system, and G R dev_i is the posture conversion relationship between the world coordinate system and the i-th device coordinate system.
S502, determining a third posture conversion relation of the limb coordinate system and the equipment coordinate system according to the posture conversion relation of the limb coordinate system and the head-mounted coordinate system and the posture conversion relation of the head-mounted coordinate system and the equipment coordinate system.
Specifically, determining the third posture conversion relationship between the limb coordinate system and the device coordinate system can be achieved by the following formula (14):
limb_i R dev_i = limb_i R HMD · HMD R dev_i    (14)

wherein limb_i R dev_i is the third posture conversion relationship between the i-th limb coordinate system and the i-th device coordinate system, limb_i R HMD is the posture conversion relationship between the i-th limb coordinate system and the head-mounted coordinate system, HMD R dev_i is the posture conversion relationship between the head-mounted coordinate system and the i-th device coordinate system, and i ∈ [1,4].
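As an illustrative sketch (the matrix names and values are assumptions, not the patent's data): the composition in formula (14) is an ordinary matrix product of the two rotations, here in pure Python:

```python
def mat_mul(A, B):
    # Multiply two 3x3 matrices given as nested lists.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# limb_R_HMD: posture conversion of the i-th limb coordinate system and
# the head-mounted coordinate system; per the text it is approximately
# the identity matrix I.
limb_R_HMD = [[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1]]

# HMD_R_dev: posture conversion of the head-mounted coordinate system and
# the i-th device coordinate system (illustrative value: a 90-degree
# rotation about the x-axis).
HMD_R_dev = [[1, 0, 0],
             [0, 0, -1],
             [0, 1, 0]]

# Formula (14): the third posture conversion relationship is the product.
limb_R_dev = mat_mul(limb_R_HMD, HMD_R_dev)
```

With the identity approximation for limb_R_HMD, the third relationship simply equals HMD_R_dev, which matches the patent's observation that the constant factors can be approximated away.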
The calculation formulas (10) to (12) of the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system, and the calculation formula (13) of the posture conversion relationship between the head-mounted coordinate system and the device coordinate system, are determined based on the foregoing step S501. Substituting formulas (10) to (13) into formula (14) yields the calculation formulas (15) to (17), specifically as follows:
limb_i R dev_i = limb_i R HMD_imu · HMD R G · G R dev_i    (15)

wherein limb_i R dev_i is the third posture conversion relationship between the i-th limb coordinate system and the i-th device coordinate system, limb_i R HMD_imu is the posture conversion relationship between the i-th limb coordinate system and the inertial coordinate system, HMD R G is the posture conversion relationship between the head-mounted coordinate system and the world coordinate system, G R dev_i is the posture conversion relationship between the world coordinate system and the i-th device coordinate system, and i ∈ [1,4].
limb_i R dev_i = limb_i R HMD_d · HMD_d R HMD_imu · HMD R G · G R dev_i    (16)

wherein limb_i R dev_i is the third posture conversion relationship between the i-th limb coordinate system and the i-th device coordinate system, limb_i R HMD_d is the posture conversion relationship between the i-th limb coordinate system and the positioning device coordinate system, HMD_d R HMD_imu is the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, HMD R G is the posture conversion relationship between the head-mounted coordinate system and the world coordinate system, G R dev_i is the posture conversion relationship between the world coordinate system and the i-th device coordinate system, and i ∈ [1,4].
limb_i R dev_i = limb_i R head · head R HMD_d · HMD_d R HMD_imu · HMD R G · G R dev_i    (17)

wherein limb_i R dev_i is the third posture conversion relationship between the i-th limb coordinate system and the i-th device coordinate system, limb_i R head is the posture conversion relationship between the i-th limb coordinate system and the head coordinate system, head R HMD_d is the posture conversion relationship between the head coordinate system and the positioning device coordinate system, HMD_d R HMD_imu is the posture conversion relationship between the positioning device coordinate system and the inertial coordinate system, HMD R G is the posture conversion relationship between the head-mounted coordinate system and the world coordinate system, G R dev_i is the posture conversion relationship between the world coordinate system and the i-th device coordinate system, and i ∈ [1,4].
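A hedged sketch of the full five-factor chain of formula (17) (matrix names and example values are illustrative assumptions): the first three factors are approximated by the identity matrix per the text, so the chain collapses to the last two factors.

```python
from functools import reduce

def mat_mul(A, B):
    # Multiply two 3x3 matrices given as nested lists.
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

# Per the text, the first three factors approximate the identity matrix.
limb_R_head = I3
head_R_HMDd = I3
HMDd_R_imu = I3

# Illustrative values: a 90-degree rotation about z and its inverse.
HMD_R_G = [[0, -1, 0], [1, 0, 0], [0, 0, 1]]
G_R_dev = [[0, 1, 0], [-1, 0, 0], [0, 0, 1]]

# Formula (17): left-to-right product of the five factors.
limb_R_dev = reduce(mat_mul, [limb_R_head, head_R_HMDd, HMDd_R_imu,
                              HMD_R_G, G_R_dev])
```

With these example values the head-mounted-to-world and world-to-device rotations cancel, so the chain returns the identity; in practice the last two factors are measured per device and do not cancel.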
S503, determining the posture conversion relation between the limb coordinate system and the world coordinate system according to the first posture conversion relation, the second posture conversion relation and the third posture conversion relation.
S504, determining the posture of the lower limb under the world coordinate system according to the posture data of the lower limb and the posture conversion relation between the limb coordinate system and the world coordinate system.
According to the calibration method of the motion capture device provided in this embodiment, the third posture conversion relationship between the limb coordinate system and the device coordinate system is determined; the posture conversion relationship between the limb coordinate system and the world coordinate system is then determined according to the first posture conversion relationship between the world coordinate system and the reference coordinate system, the second posture conversion relationship between the reference coordinate system and the device coordinate system, and the third posture conversion relationship; and the posture of the lower limb in the world coordinate system is determined according to the lower-limb posture data and the posture conversion relationship between the limb coordinate system and the world coordinate system. In this method, the motion capture device is worn on the lower limb of the human body and is calibrated using the natural motion of the human body, which effectively reduces the calibration difficulty, shortens the calibration time, and improves the calibration precision and calibration effect; the limb motion capture effect can thus be improved based on the calibrated motion capture device, providing conditions for improving the accuracy of human posture restoration. Moreover, the limb coordinate system can be calibrated based on the posture conversion relationship between the limb coordinate system and the world coordinate system, and the posture of the lower limb in the world coordinate system can be determined based on the calibrated limb coordinate system, thereby improving the accuracy and correctness of determining the lower-limb posture in the world coordinate system and improving the use experience of the user.
A calibration apparatus for a motion capture device according to an embodiment of the present application is described below with reference to fig. 11. FIG. 11 is a schematic block diagram of a calibration apparatus for a motion capture device provided in an embodiment of the present application.
The motion capture device is worn on the lower limb of the human body, and the calibration device of the motion capture device is configured on the head-mounted display device. The calibration apparatus 600 of the motion capture device includes: a determination module 610 and a calibration module 620.
The determining module 610 is configured to determine a first pose conversion relationship between a world coordinate system and a reference coordinate system, and a second pose conversion relationship between the reference coordinate system and a device coordinate system, where the device coordinate system is the motion capture device coordinate system;
the calibration module 620 is configured to calibrate the motion capture device according to the first gesture conversion relationship and the second gesture conversion relationship.
An optional implementation manner of the embodiment of the present application, the determining module 610 includes: a first determination unit and a second determination unit;
the first determining unit is used for determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system and the posture conversion relation of the reference coordinate system and the head-mounted coordinate system;
A second determining unit configured to determine a first posture conversion relationship of the world coordinate system and the reference coordinate system according to the posture conversion relationship of the world coordinate system and the head-mounted coordinate system, and the posture conversion relationship of the reference coordinate system and the head-mounted coordinate system;
the head-mounted coordinate system is a head-mounted display device coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: an inertial sensor;
correspondingly, the first determining unit is specifically configured to:
determining the posture conversion relation between an inertial coordinate system and the reference coordinate system and the posture conversion relation between the reference coordinate system and the world coordinate system;
determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the inertial coordinate system and the reference coordinate system and the posture conversion relation of the reference coordinate system and the world coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: a positioning device and an inertial sensor;
Correspondingly, the first determining unit is specifically configured to:
determining the posture conversion relation between the world coordinate system and the positioning device coordinate system and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the world coordinate system and the positioning device coordinate system and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: an inertial sensor;
correspondingly, the first determining unit is specifically configured to:
determining the posture conversion relation of the reference coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of the embodiment of the present application, the calibration module 620 includes: a third determination unit and a calibration unit;
the third determining unit is used for determining a target posture conversion relation of the world coordinate system and the equipment coordinate system according to the first posture conversion relation and the second posture conversion relation;
And the calibration unit is used for calibrating the motion capture device according to the target posture conversion relation of the world coordinate system and the device coordinate system.
In an optional implementation manner of this embodiment of the present application, the calibration apparatus 600 of the motion capture device further includes: the system comprises a data acquisition module and a gesture determination module;
the motion capture device comprises a data acquisition module, a motion capture module and a control module, wherein the data acquisition module is used for acquiring lower limb motion data of the human body;
and the gesture determining module is used for determining the initial gesture of the motion capture device under the world coordinate system according to the lower limb motion data and the target gesture conversion relation.
In an optional implementation manner of this embodiment of the present application, the calibration apparatus 600 of the motion capture device further includes: a processing module;
the processing module is used for carrying out integral processing on the lower limb movement data to obtain integral data;
and the gesture determining module is specifically used for determining the real-time gesture of the motion capture device under the world coordinate system according to the integral data, the initial gesture and the target gesture conversion relation.
An optional implementation manner of the embodiment of the present application, the lower limb movement data includes: lower limb gyroscope data;
Correspondingly, the processing module is specifically configured to:
and integrating the lower limb gyroscope data.
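As an illustrative sketch (assumed names and values, not the patent's implementation): integrating gyroscope angular-velocity samples over time accumulates an orientation change. A simple one-axis version using the rectangle rule:

```python
def integrate_gyro(samples, dt):
    # Accumulate angular-velocity samples (rad/s), taken every dt
    # seconds, into a running angle (rad) using the rectangle rule.
    angle = 0.0
    trace = []
    for omega in samples:
        angle += omega * dt
        trace.append(angle)
    return trace

# Illustrative: a constant 0.5 rad/s rate sampled at 100 Hz for 4 steps.
angles = integrate_gyro([0.5, 0.5, 0.5, 0.5], dt=0.01)
```

A full implementation would integrate the three-axis angular velocity into a rotation (e.g., a quaternion), but the accumulation principle is the same.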
An optional implementation manner of the embodiment of the present application, the determining module 610 is further configured to:
determining a third posture conversion relation between a limb coordinate system and an equipment coordinate system, wherein the limb coordinate system is a lower limb coordinate system;
and determining the posture conversion relation between the limb coordinate system and the world coordinate system according to the first posture conversion relation, the second posture conversion relation and the third posture conversion relation.
In an optional implementation manner of the embodiment of the present application, the determining module 610 further includes: a fourth determination unit and a fifth determination unit;
the fourth determining unit is used for determining the gesture conversion relation between the limb coordinate system and the head-mounted coordinate system and the gesture conversion relation between the head-mounted coordinate system and the equipment coordinate system;
a fifth determining unit, configured to determine a third posture conversion relationship between the limb coordinate system and the device coordinate system according to the posture conversion relationship between the limb coordinate system and the head-mounted coordinate system, and the posture conversion relationship between the head-mounted coordinate system and the device coordinate system;
the head-mounted coordinate system is a head-mounted display device coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: an inertial sensor;
correspondingly, the fourth determining unit is specifically configured to:
determining the posture conversion relation between a limb coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: a positioning device and an inertial sensor;
correspondingly, the fourth determining unit is specifically configured to:
determining the posture conversion relation between the limb coordinate system and the positioning device coordinate system and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system according to the posture conversion relation of the limb coordinate system and the positioning device coordinate system and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of an embodiment of the present application, the head-mounted display device includes: a positioning device and an inertial sensor;
Correspondingly, the fourth determining unit is specifically configured to:
determining the posture conversion relation between the limb coordinate system and the head coordinate system, the posture conversion relation between the head coordinate system and the positioning device coordinate system, and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system according to the posture conversion relation of the limb coordinate system and the head coordinate system, the posture conversion relation of the head coordinate system and the positioning device coordinate system, and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
An optional implementation manner of the embodiment of the present application, the fourth determining unit is specifically configured to:
determining the posture conversion relation between the head-mounted coordinate system and the world coordinate system and the posture conversion relation between the world coordinate system and the equipment coordinate system;
and determining the posture conversion relation of the head-mounted coordinate system and the equipment coordinate system according to the posture conversion relation of the head-mounted coordinate system and the world coordinate system and the posture conversion relation of the world coordinate system and the equipment coordinate system.
An optional implementation manner of this embodiment of the present application, the gesture determining module is further configured to:
and determining the posture of the lower limb under the world coordinate system according to the posture data of the lower limb and the posture conversion relation between the limb coordinate system and the world coordinate system.
According to the calibration apparatus for the motion capture device provided in this embodiment, the first posture conversion relationship between the world coordinate system and the reference coordinate system and the second posture conversion relationship between the reference coordinate system and the device coordinate system are determined, and the motion capture device is then calibrated according to the first and second posture conversion relationships. In this apparatus, the motion capture device is worn on the lower limb of the human body and is calibrated using the natural motion of the human body, which effectively reduces the calibration difficulty, shortens the calibration time, and improves the calibration precision and calibration effect; the limb motion capture effect can thus be improved based on the calibrated motion capture device, providing conditions for improving the accuracy of posture restoration.
It should be understood that apparatus embodiments and the foregoing method embodiments may correspond to each other, and similar descriptions may refer to the method embodiments. To avoid repetition, no further description is provided here. Specifically, the apparatus 600 shown in fig. 11 may perform the method embodiment corresponding to fig. 4, and the foregoing and other operations and/or functions of each module in the apparatus 600 are respectively for implementing the corresponding flow in each method in fig. 4, which are not described herein for brevity.
The apparatus 600 of the embodiments of the present application is described above in terms of functional modules in connection with the accompanying drawings. It should be understood that the functional module may be implemented in hardware, or may be implemented by instructions in software, or may be implemented by a combination of hardware and software modules. Specifically, each step of the method embodiment of the first aspect in the embodiments of the present application may be implemented by an integrated logic circuit of hardware in a processor and/or an instruction in software, and the steps of the method of the first aspect disclosed in connection with the embodiments of the present application may be directly implemented as an execution of a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. Alternatively, the software modules may be located in a well-established storage medium in the art such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, and the like. The storage medium is located in a memory, and the processor reads information in the memory, and in combination with hardware, performs the steps in the method embodiment of the first aspect.
Fig. 12 is a schematic block diagram of an electronic device provided in an embodiment of the present application. As shown in fig. 12, the electronic device 700 may include:
A memory 710 and a processor 720, the memory 710 being configured to store a computer program and to transfer the program code to the processor 720. In other words, the processor 720 may call and run a computer program from the memory 710 to implement the method of calibrating the motion capture device in embodiments of the present application.
For example, the processor 720 may be configured to perform the above-described calibration method embodiments of the motion capture device according to instructions in the computer program.
In some embodiments of the present application, the processor 720 may include, but is not limited to:
a general purpose processor, digital signal processor (Digital Signal Processor, DSP), application specific integrated circuit (Application Specific Integrated Circuit, ASIC), field programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, or the like.
In some embodiments of the present application, the memory 710 includes, but is not limited to:
volatile memory and/or nonvolatile memory. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
In some embodiments of the present application, the computer program may be partitioned into one or more modules that are stored in the memory 710 and executed by the processor 720 to perform the methods of calibrating the motion capture device provided herein. The one or more modules may be a series of computer program instruction segments capable of performing the specified functions, which are used to describe the execution of the computer program in the electronic device.
As shown in fig. 12, the electronic device may further include:
a transceiver 730, the transceiver 730 being connectable to the processor 720 or the memory 710.
The processor 720 may control the transceiver 730 to communicate with other devices, and in particular, may send information or data to other devices or receive information or data sent by other devices. Transceiver 730 may include a transmitter and a receiver. Transceiver 730 may further include antennas, the number of which may be one or more.
It will be appreciated that the various components in the electronic device are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, and a status signal bus.
In an embodiment of the present application, when the electronic device is an HMD, the embodiment of the present application provides a schematic block diagram of the HMD, as shown in fig. 13.
As shown in fig. 13, the main functional modules of the HMD800 may include, but are not limited to, the following: the detection module 810, the feedback module 820, the sensor 830, the control module 840, the modeling module 850.
The detection module 810 is configured to detect operation commands of a user by using various sensors, and act on a virtual environment, such as continuously updating images displayed on a display screen along with the line of sight of the user, so as to realize interaction between the user and the virtual scene.
The feedback module 820 is configured to receive data from the sensors and provide real-time feedback to the user. For example, the feedback module 820 may generate a feedback instruction according to user operation data and output the feedback instruction.
The sensor 830 is configured, on the one hand, to accept operation commands from the user and apply them to the virtual environment; and, on the other hand, to provide the results generated after the operation to the user in the form of various feedback.
The control module 840 is configured to control sensors and various input/output devices, including obtaining user data such as motion, speech, etc., and outputting sensory data such as images, vibrations, temperature, sounds, etc., to affect the user, virtual environment, and the real world. For example, the control module 840 may obtain user gestures, voice, and the like.
The modeling module 850 is configured to construct a three-dimensional model of the virtual environment, and may also include various feedback mechanisms of sound, touch, etc. in the three-dimensional model.
It should be appreciated that the various functional modules in the HMD800 are connected by a bus system that includes, in addition to a data bus, a power bus, a control bus, a status signal bus, and the like.
The present application also provides a computer storage medium having stored thereon a computer program which, when executed by a computer, enables the computer to perform the method of the above-described method embodiments.
Embodiments of the present application also provide a computer program product comprising program instructions which, when run on an electronic device, cause the electronic device to perform the method of the method embodiments described above.
When implemented in software, the embodiments may be realized, in whole or in part, in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that the computer can access, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a digital video disc (DVD)), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Those of ordinary skill in the art will appreciate that the various illustrative modules and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of hardware and software. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, and for example, the division of the modules is merely a logical function division, and there may be additional divisions when actually implemented, for example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or modules, which may be in electrical, mechanical, or other forms.
The modules illustrated as separate components may or may not be physically separate, and components shown as modules may or may not be physical modules, i.e., may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. For example, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
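The core idea of the claims below, obtaining a target posture conversion relation between the world coordinate system and the device coordinate system by chaining the first (world-to-reference) and second (reference-to-device) posture conversion relations, can be sketched as a composition of rotation matrices. This is an illustrative sketch, not the patent's implementation: the two sample rotations (90° about z and 90° about x) and the helper names are assumptions.

```python
def matmul3(a, b):
    """Multiply two 3x3 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]


def apply3(r, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(r[i][k] * v[k] for k in range(3)) for i in range(3)]


# First posture conversion relation: world -> reference (assumed 90° about z).
R_world_ref = [[0, -1, 0],
               [1,  0, 0],
               [0,  0, 1]]

# Second posture conversion relation: reference -> device (assumed 90° about x).
R_ref_device = [[1, 0,  0],
                [0, 0, -1],
                [0, 1,  0]]

# Target posture conversion relation: world -> device, obtained by chaining
# the first and second relations (cf. claim 6).
R_world_device = matmul3(R_world_ref, R_ref_device)

# Express a device-frame vector in the world frame using the target relation.
v_device = [1.0, 0.0, 0.0]
v_world = apply3(R_world_device, v_device)
print(v_world)
```

The same chaining pattern underlies the intermediate relations in claims 2 to 5 and 10 to 15: any missing conversion between two frames is recovered by multiplying the known conversions along a path through shared intermediate frames.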

Claims (20)

1. A method of calibrating a motion capture device, the motion capture device being worn on a lower limb of a human body, the method being applied to a head-mounted display device and comprising:
determining a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, wherein the device coordinate system is the motion capture device coordinate system;
and calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation.
2. The method of claim 1, wherein determining a first pose conversion relationship of the world coordinate system and the reference coordinate system comprises:
determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system and the posture conversion relation of the reference coordinate system and the head-mounted coordinate system;
determining a first posture conversion relation of the world coordinate system and the reference coordinate system according to the posture conversion relation of the world coordinate system and the head-mounted coordinate system and the posture conversion relation of the reference coordinate system and the head-mounted coordinate system;
the head-mounted coordinate system is a head-mounted display device coordinate system.
3. The method of claim 2, wherein the head mounted display device comprises: an inertial sensor;
correspondingly, determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation between an inertial coordinate system and the reference coordinate system and the posture conversion relation between the reference coordinate system and the world coordinate system;
determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the inertial coordinate system and the reference coordinate system and the posture conversion relation of the reference coordinate system and the world coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
4. The method of claim 2, wherein the head mounted display device comprises: a positioning device and an inertial sensor;
correspondingly, determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation between the world coordinate system and the positioning device coordinate system and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the world coordinate system and the head-mounted coordinate system according to the posture conversion relation of the world coordinate system and the positioning device coordinate system and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
5. The method of claim 2, wherein the head mounted display device comprises: an inertial sensor;
correspondingly, determining the posture conversion relation of the reference coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation of the reference coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
6. The method of claim 1, wherein calibrating the motion capture device according to the first posture conversion relation and the second posture conversion relation comprises:
determining a target attitude conversion relation of the world coordinate system and the equipment coordinate system according to the first attitude conversion relation and the second attitude conversion relation;
and calibrating the motion capture device according to the target posture conversion relation of the world coordinate system and the device coordinate system.
7. The method as recited in claim 6, further comprising:
acquiring lower limb motion data of the human body acquired by the motion capture equipment;
and determining the initial posture of the motion capture equipment under the world coordinate system according to the lower limb motion data and the target posture conversion relation.
8. The method as recited in claim 7, further comprising:
integrating the lower limb movement data to obtain integrated data;
and determining the real-time gesture of the motion capture device under the world coordinate system according to the integral data, the initial gesture and the target gesture conversion relation.
9. The method of claim 8, wherein the lower limb movement data comprises: lower limb gyroscope data;
correspondingly, the integrating processing of the lower limb movement data comprises the following steps:
and integrating the lower limb gyroscope data.
10. The method according to any one of claims 1-9, wherein the method further comprises:
determining a third posture conversion relation between a limb coordinate system and an equipment coordinate system, wherein the limb coordinate system is a lower limb coordinate system;
and determining the posture conversion relation between the limb coordinate system and the world coordinate system according to the first posture conversion relation, the second posture conversion relation and the third posture conversion relation.
11. The method of claim 10, wherein determining a third pose conversion relationship of the limb coordinate system and the device coordinate system comprises:
determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system and the posture conversion relation of the head-mounted coordinate system and the equipment coordinate system;
determining a third posture conversion relation of the limb coordinate system and the equipment coordinate system according to the posture conversion relation of the limb coordinate system and the head-mounted coordinate system and the posture conversion relation of the head-mounted coordinate system and the equipment coordinate system;
the head-mounted coordinate system is a head-mounted display device coordinate system.
12. The method of claim 11, wherein the head mounted display device comprises: an inertial sensor;
correspondingly, determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation between a limb coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
13. The method of claim 11, wherein the head mounted display device comprises: a positioning device and an inertial sensor;
correspondingly, determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation between the limb coordinate system and the positioning device coordinate system and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system according to the posture conversion relation of the limb coordinate system and the positioning device coordinate system and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
14. The method of claim 11, wherein the head mounted display device comprises: a positioning device and an inertial sensor;
correspondingly, determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system comprises the following steps:
determining the posture conversion relation between the limb coordinate system and the head coordinate system, the posture conversion relation between the head coordinate system and the positioning device coordinate system, and the posture conversion relation between the positioning device coordinate system and the inertial coordinate system;
determining the posture conversion relation of the limb coordinate system and the head-mounted coordinate system according to the posture conversion relation of the limb coordinate system and the head coordinate system, the posture conversion relation of the head coordinate system and the positioning device coordinate system, and the posture conversion relation of the positioning device coordinate system and the inertial coordinate system;
wherein the inertial coordinate system is an inertial sensor coordinate system.
15. The method of claim 11, wherein determining the pose conversion relationship of the headset coordinate system and the device coordinate system comprises:
determining the posture conversion relation between the head-mounted coordinate system and the world coordinate system and the posture conversion relation between the world coordinate system and the equipment coordinate system;
and determining the posture conversion relation of the head-mounted coordinate system and the equipment coordinate system according to the posture conversion relation of the head-mounted coordinate system and the world coordinate system and the posture conversion relation of the world coordinate system and the equipment coordinate system.
16. The method as recited in claim 10, further comprising:
and determining the posture of the lower limb under the world coordinate system according to the posture data of the lower limb and the posture conversion relation between the limb coordinate system and the world coordinate system.
17. A calibration apparatus for a motion capture device, the motion capture device being worn on a lower limb of a human body, the apparatus being configured in a head mounted display device, comprising:
a determining module, configured to determine a first posture conversion relation between a world coordinate system and a reference coordinate system and a second posture conversion relation between the reference coordinate system and a device coordinate system, wherein the device coordinate system is the motion capture device coordinate system;
and a calibration module, configured to calibrate the motion capture device according to the first posture conversion relation and the second posture conversion relation.
18. An electronic device, comprising:
a processor and a memory for storing a computer program, the processor for invoking and running the computer program stored in the memory to perform the method of calibrating the motion capture device of any of claims 1 to 16.
19. A computer readable storage medium storing a computer program for causing a computer to perform the method of calibrating the motion capture device of any of claims 1 to 16.
20. A computer program product comprising program instructions that, when run on an electronic device, cause the electronic device to perform the method of calibrating a motion capture device according to any of claims 1 to 16.
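Claims 7 to 9 above describe recovering a real-time posture by integrating lower-limb gyroscope data on top of an initial posture. A minimal sketch of that integration step follows; it is not the patent's implementation, and the single-axis Euler integration, sample rate, and function name are all assumptions made for illustration.

```python
def integrate_yaw(initial_yaw, gyro_z_samples, dt):
    """Euler-integrate z-axis angular rate samples (rad/s) into a yaw angle (rad).

    A full implementation would integrate all three gyroscope axes and
    maintain an orientation (e.g. a quaternion), per claims 8 and 9; a single
    axis suffices to show the accumulation.
    """
    yaw = initial_yaw
    for omega in gyro_z_samples:
        yaw += omega * dt
    return yaw


# 100 samples of a constant 0.5 rad/s turn at 100 Hz -> 0.5 rad of total yaw.
samples = [0.5] * 100
yaw = integrate_yaw(0.0, samples, dt=0.01)
print(yaw)
```

In practice the integrated posture still lives in the device coordinate system; the target posture conversion relation of claim 6 is what maps it into the world coordinate system, which is why claim 8 combines the integral data, the initial posture, and the target posture conversion relation.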
CN202211021886.9A 2022-08-24 2022-08-24 Calibration method, device, equipment and medium of motion capture equipment Pending CN117664173A (en)
Publications (1)

Publication Number: CN117664173A, Publication Date: 2024-03-08

Family ID: 90071734



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination