WO2017120767A1 - Head posture prediction method and apparatus - Google Patents

Head posture prediction method and apparatus

Info

Publication number
WO2017120767A1
Authority
WO
WIPO (PCT)
Prior art keywords
fitting
time
curve
reporting
posture
Prior art date
Application number
PCT/CN2016/070687
Other languages
English (en)
Chinese (zh)
Inventor
李刚
龙寿伦
Original Assignee
深圳多哚新技术有限责任公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳多哚新技术有限责任公司 filed Critical 深圳多哚新技术有限责任公司
Priority to PCT/CN2016/070687 priority Critical patent/WO2017120767A1/fr
Publication of WO2017120767A1 publication Critical patent/WO2017120767A1/fr

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present invention relates to the field of head-mounted display (head display) devices, and in particular to a head attitude prediction method and apparatus.
  • in the field of virtual reality, users wear head-mounted devices to experience virtual reality. The most important component is head tracking: as the user's head rotates, the virtual scene must rotate with it, which requires sensors that sense the head posture (attitude angle).
  • the main sensors used are angular rate sensors (gyroscopes), accelerometers, and geomagnetic sensors. From the raw values of these sensors, the current attitude angle can be obtained through algorithmic processing; one common fusion approach is sketched below.
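For illustration only (this is not part of the patent text, which does not specify the fusion algorithm), one widely used way to combine a gyroscope rate with an accelerometer-derived angle is a complementary filter; the function name and the weighting factor alpha are assumptions:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse gyro and accelerometer readings into one attitude angle (single axis).

    Gyro integration (angle_prev + gyro_rate * dt) tracks fast motion but drifts;
    the accelerometer angle is noisy but drift-free; alpha weights the two.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```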
  • the embodiments of the invention provide a head posture prediction method and device, which can solve the problem of picture delay caused by the posture angle reported in the prior art.
  • the attitude angle corresponding to the future time is reported to the upper application of the head display device.
  • a method of quadratic fitting is used to perform curve fitting according to the posture angle and the corresponding reporting time, and a fitting curve for predicting a future time is obtained.
  • acquiring the posture angle and the corresponding reporting time within a preset time length closest to the current time includes:
  • the method of quadratic fitting is used to perform curve fitting according to the posture angle and the corresponding reporting time, and the fitting curve for predicting the future time is obtained, which specifically includes:
  • the quadratic fitting formula is as follows: Qi = A·Ti² + B·Ti + C (i = 1, 2, 3);
  • Q1, Q2, and Q3 are the three attitude angles, T1, T2, and T3 are the three reporting times, and A, B, and C are the three fitting coefficients;
  • a fitting curve predicting the future time is obtained based on the three fitting coefficients; an illustrative numeric sketch follows.
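As a minimal sketch (not part of the patent text), the three fitting coefficients can be obtained by solving the 3×3 linear system defined by the quadratic form Qi = A·Ti² + B·Ti + C, and the curve can then be evaluated at a future time point; the sample values below are invented for illustration:

```python
import numpy as np

def fit_quadratic(times, angles):
    """Solve for A, B, C in Q(T) = A*T**2 + B*T + C from three (reporting time, attitude angle) samples."""
    T = np.asarray(times, dtype=float)
    Q = np.asarray(angles, dtype=float)
    M = np.column_stack([T**2, T, np.ones_like(T)])  # one row [Ti**2, Ti, 1] per sample
    A, B, C = np.linalg.solve(M, Q)                  # exact solve: three equations, three unknowns
    return A, B, C

def predict(coeffs, t_future):
    """Evaluate the fitted curve at the future time point."""
    A, B, C = coeffs
    return A * t_future**2 + B * t_future + C

# Three most recent attitude angles (one axis, degrees) reported at t = 0, 10, 20 ms
coeffs = fit_quadratic([0.0, 10.0, 20.0], [1.0, 2.5, 4.5])
print(predict(coeffs, 35.0))  # predicted attitude angle at a future time t1 = 35 ms
```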
  • the method further includes:
  • the attitude angle time acquiring module is configured to acquire an attitude angle and a corresponding reporting time within a preset time length that is closest to the current time;
  • a curve fitting module configured to perform curve fitting according to the posture angle and the corresponding reporting time, to obtain a fitting curve for predicting a future time
  • a future attitude angle prediction module configured to substitute a time point of a future time that needs to be predicted into the fitting curve to obtain an attitude angle corresponding to the future time
  • the attitude angle reporting module is configured to report the posture angle corresponding to the future time to the upper application of the head display device.
  • the curve fitting module comprises:
  • a quadratic fitting unit is configured to perform curve fitting according to the posture angle and the corresponding reporting time by using a quadratic fitting method to obtain a fitting curve for predicting a future time.
  • the attitude angle time acquisition module includes:
  • a three-pose angle acquisition module configured to acquire three attitude angles closest to the current time
  • the three reporting time acquisition module is configured to acquire three reporting times corresponding to the three posture angles.
  • the quadratic fitting unit comprises:
  • Q1, Q2, and Q3 are the three attitude angles, T1, T2, and T3 are the three reporting times, and A, B, and C are the three fitting coefficients;
  • a fitting curve subunit, configured to obtain a fitting curve for predicting a future time based on the three fitting coefficients.
  • the head posture prediction device includes:
  • a sensor determining module configured to determine whether the sensor of the head display device generates sensor data
  • the triggering module is configured to trigger the attitude angle time acquiring module when the determination result of the sensor determining module is YES.
  • in the embodiment of the present invention, first, a posture angle and the corresponding reporting time within a preset time length closest to the current time are acquired; then, curve fitting is performed according to the posture angle and the corresponding reporting time to obtain a fitting curve for predicting a future time; next, the time point of the future time to be predicted is substituted into the fitting curve to obtain the attitude angle corresponding to the future time; finally, the attitude angle corresponding to the future time is reported to the upper-layer application of the head display device.
  • by predicting the attitude angle at a future moment and reporting that predicted attitude angle to the upper-layer application, the head motion at the moment rendering is completed coincides with the predicted attitude angle, so the picture stays synchronized with the head movement in real time and the user experience is improved.
  • FIG. 1 is a flow chart of an embodiment of a head posture prediction method according to an embodiment of the present invention
  • FIG. 2 is a flowchart of another embodiment of a head posture prediction method according to an embodiment of the present invention.
  • FIG. 3 is a structural diagram of an embodiment of a head posture prediction apparatus according to an embodiment of the present invention.
  • FIG. 4 is a structural diagram of another embodiment of a head posture prediction apparatus according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of the delay caused by the posture angle reported in the prior art;
  • FIG. 6 is a schematic diagram of the principle by which the future attitude angle predicted by the head attitude prediction method of the present invention overcomes that delay.
  • the embodiment of the present invention provides a method and a device for predicting a head posture, which are used to solve the problem of picture delay caused by the posture angle reported in the prior art.
  • an embodiment of a head posture prediction method includes:
  • the attitude angle and the corresponding reporting time within the preset time length closest to the current time can be acquired.
  • after acquiring the attitude angle and the corresponding reporting time within the preset time length closest to the current time, curve fitting may be performed according to the posture angle and the corresponding reporting time to obtain a fitting curve for predicting the future time.
  • the time point of the future time that needs to be predicted may be substituted into the fitting curve to obtain the attitude angle corresponding to the future time.
  • the attitude angle corresponding to the future time may be reported to the upper layer application of the head display device.
  • referring to FIG. 2, another embodiment of the head posture prediction method in the embodiment of the present invention includes:
  • the attitude angle and the corresponding reporting time within the preset time length closest to the current time can be acquired.
  • the attitude angle can be defined by a quaternion.
  • the preset time length closest to the current time can be set according to the actual situation and is related to the performance of different head display devices. For example, it may be set to the last 100 ms before the current time, or it may be specified by the number of posture angles, for example as the three posture angles closest to the current time.
  • using the three attitude angles closest to the current time achieves an optimal balance between effect and efficiency; a quaternion-based sketch of this three-sample fit follows.
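The patent states only that the attitude angle can be defined by a quaternion; it does not spell out how a quaternion-valued attitude is fitted. Purely as an assumption for illustration, the sketch below fits each of the four quaternion components independently with the same quadratic form and renormalizes the predicted quaternion:

```python
import numpy as np

def predict_quaternion(times, quats, t_future):
    """Per-component quadratic fit of a quaternion attitude, then renormalization (assumed approach)."""
    T = np.asarray(times, dtype=float)           # three reporting times
    Q = np.asarray(quats, dtype=float)           # shape (3, 4): three quaternions (w, x, y, z)
    M = np.column_stack([T**2, T, np.ones_like(T)])
    coeffs = np.linalg.solve(M, Q)               # shape (3, 4): rows are A, B, C for each component
    q = coeffs[0] * t_future**2 + coeffs[1] * t_future + coeffs[2]
    return q / np.linalg.norm(q)                 # predicted quaternion, back on the unit sphere
```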
  • a quadratic fitting method may be used to perform curve fitting according to the posture angle and the corresponding reporting time, and the fitting curve for predicting the future time is obtained, specifically:
  • the quadratic fitting formula is as follows: Qi = A·Ti² + B·Ti + C (i = 1, 2, 3);
  • Q1, Q2, and Q3 are the three attitude angles, T1, T2, and T3 are the three reporting times, and A, B, and C are the three fitting coefficients;
  • a fitting curve predicting the future time is obtained based on the three fitting coefficients.
  • the time point of the future time that needs to be predicted may be substituted into the fitting curve to obtain the attitude angle corresponding to that future moment.
  • for example, t1 can be substituted into the fitting curve to obtain the attitude angle corresponding to time t1; the system then performs its processing, rendering is completed at time t1, and the frame is displayed (see FIG. 6), so that the user does not perceive the delay.
  • the attitude angle corresponding to the future time may be reported to the upper-layer application of the head display device, that is, the above reporting is performed; the system of the display device then processes it, and rendering and the other operations are finally completed and the picture is displayed. One way the future time point might be chosen is sketched below.
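The patent does not prescribe how the future time point t1 is chosen. A plausible reading, used here only as an assumption, is that t1 is the current reporting time plus the estimated processing-and-rendering latency of the display pipeline; the 20 ms figure and the function name are invented for illustration:

```python
import time

RENDER_LATENCY_S = 0.020  # assumed end-to-end processing/rendering latency (~20 ms); not stated in the patent

def choose_future_time(now=None):
    """Pick the future time point t1 at which the predicted attitude should be valid."""
    now = time.monotonic() if now is None else now
    return now + RENDER_LATENCY_S

# predicted_angle = predict(coeffs, choose_future_time())  # reuses the fit sketched earlier
```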
  • step 205: determine whether the sensor of the head display device generates sensor data; if yes, return to step 201, and if not, end the process.
  • after reporting the posture angle corresponding to the future time to the upper-layer application of the head display device, it may be determined whether the sensor of the head display device is still generating sensor data; if yes, return to step 201, and if not, end the process. It can be understood that if the sensor of the head display device is still generating sensor data, the head display device is still in use and the sensor data still needs to be processed to obtain the user's head posture, so the method returns to step 201 and the head posture prediction method of this embodiment is executed cyclically (see the loop sketch below). Otherwise, it can be considered that the head display device is no longer in use or the user's head posture no longer needs to be acquired, and the head posture prediction process ends.
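Putting steps 201 to 205 together, a minimal sketch of the cyclic flow might look as follows. `sensor` and `report_to_app` are hypothetical interfaces standing in for the sensor driver and the upper-layer application, the 20 ms latency is an assumed choice of future time point, and `fit_quadratic`/`predict` are the helpers sketched earlier:

```python
from collections import deque

def head_pose_prediction_loop(sensor, report_to_app, latency_s=0.020):
    """Acquire, fit, predict, report; repeat while the sensor keeps producing data."""
    history = deque(maxlen=3)                      # the three samples closest to the current time
    for t, angle in sensor:                        # step 205: continue while sensor data arrives
        history.append((t, angle))                 # step 201: acquire attitude angle and reporting time
        if len(history) < 3:
            continue                               # need three samples before fitting
        times, angles = zip(*history)
        coeffs = fit_quadratic(times, angles)      # step 202: quadratic curve fitting
        angle_future = predict(coeffs, t + latency_s)  # step 203: substitute the future time t1
        report_to_app(angle_future)                # step 204: report to the upper-layer application
```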
  • an embodiment of a head posture prediction apparatus includes:
  • the attitude angle time acquiring module 301 is configured to acquire an attitude angle and a corresponding reporting time within a preset time length that is closest to the current time;
  • the curve fitting module 302 is configured to perform curve fitting according to the posture angle and the corresponding reporting time to obtain a fitting curve for predicting a future time;
  • the future attitude angle prediction module 303 is configured to substitute a time point of a future time that needs to be predicted into the fitting curve to obtain an attitude angle corresponding to the future time;
  • the attitude angle reporting module 304 is configured to report the attitude angle corresponding to the future time to the upper application of the head display device.
  • first, the attitude angle time acquiring module 301 acquires the posture angle and the corresponding reporting time within the preset time length closest to the current time; then, the curve fitting module 302 performs curve fitting according to the posture angle and the corresponding reporting time, obtaining a fitting curve for predicting a future time; next, the future attitude angle prediction module 303 substitutes the time point of the future time that needs to be predicted into the fitting curve to obtain the attitude angle corresponding to the future time; finally, the attitude angle reporting module 304 reports the attitude angle corresponding to the future time to the upper-layer application of the head display device.
  • referring to FIG. 4, another embodiment of a head posture prediction apparatus according to an embodiment of the present invention includes:
  • the attitude angle time acquiring module 401 is configured to acquire an attitude angle and a corresponding reporting time within a preset time length that is closest to the current time;
  • the curve fitting module 402 is configured to perform curve fitting according to the posture angle and the corresponding reporting time to obtain a fitting curve for predicting a future time;
  • a future attitude angle prediction module 403 configured to substitute a time point of a future time that needs to be predicted into the fitting curve, to obtain an attitude angle corresponding to the future time;
  • the attitude angle reporting module 404 is configured to report the attitude angle corresponding to the future time to the upper-layer application of the head display device.
  • the curve fitting module 402 in this embodiment may include:
  • the quadratic fitting unit 4021 is configured to perform curve fitting according to the posture angle and the corresponding reporting time by using a quadratic fitting method to obtain a fitting curve for predicting a future time.
  • the posture angle time acquiring module 401 in this embodiment may include:
  • the three attitude angle obtaining module 4011 is configured to acquire three attitude angles closest to the current time
  • the three reporting time acquisition module 4012 is configured to acquire three reporting times corresponding to the three posture angles.
  • the quadratic fitting unit 4021 in this embodiment may include:
  • Q1, Q2, and Q3 are the three attitude angles, T1, T2, and T3 are the three reporting times, and A, B, and C are the three fitting coefficients;
  • the fitting curve sub-unit 0212 is configured to obtain a fitting curve for predicting future time according to the three fitting coefficients.
  • the sensor determining module 405 is configured to determine whether the sensor of the head display device generates sensor data.
  • the triggering module 406 is configured to trigger the attitude angle time acquiring module when the determination result of the sensor determining module 405 is YES.
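For illustration only, the module decomposition of FIG. 3 and FIG. 4 can be mapped onto a small class; the class and method names are hypothetical and simply mirror the module names in the text, and `fit_quadratic`/`predict` are the helpers sketched earlier:

```python
from collections import deque

class HeadPosturePredictionDevice:
    """Hypothetical mapping of modules 301/401-304/404 plus 405/406 onto one class."""

    def __init__(self, upper_app, latency_s=0.020):
        self.history = deque(maxlen=3)    # state used by the attitude angle time acquiring module
        self.upper_app = upper_app        # target of the attitude angle reporting module
        self.latency_s = latency_s

    def on_sensor_data(self, t, angle):
        """Called while the sensor determining module sees new data (triggering module behaviour)."""
        self.history.append((t, angle))                      # acquire angle and reporting time
        if len(self.history) < 3:
            return None
        times, angles = zip(*self.history)
        coeffs = fit_quadratic(times, angles)                # curve fitting module
        angle_future = predict(coeffs, t + self.latency_s)   # future attitude angle prediction module
        self.upper_app(angle_future)                         # attitude angle reporting module
        return angle_future
```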
  • the disclosed system, apparatus, and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative. For example, the division of the units is only a logical functional division; in actual implementation there may be other division manners, for example multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium.
  • the part of the technical solution of the present invention that is essential, or that contributes to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium.
  • the software product includes a number of instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • the foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

Disclosed are a head posture prediction method and apparatus, used to solve the prior-art problem of picture delay caused by the reported attitude angle. The method comprises: acquiring an attitude angle within a preset time length closest to the current time, and a corresponding reporting time (101); performing curve fitting according to the attitude angle and the corresponding reporting time to obtain a fitting curve for predicting a future time (102); substituting a time point of the future time to be predicted into the fitting curve to obtain an attitude angle corresponding to the future time (103); and reporting the attitude angle corresponding to the future time to an upper-layer application of a head-mounted display device (104).
PCT/CN2016/070687 2016-01-12 2016-01-12 Head posture prediction method and apparatus WO2017120767A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/070687 WO2017120767A1 (fr) 2016-01-12 2016-01-12 Head posture prediction method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/070687 WO2017120767A1 (fr) 2016-01-12 2016-01-12 Head posture prediction method and apparatus

Publications (1)

Publication Number Publication Date
WO2017120767A1 (fr) 2017-07-20

Family

ID=59310682

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/070687 WO2017120767A1 (fr) 2016-01-12 2016-01-12 Head posture prediction method and apparatus

Country Status (1)

Country Link
WO (1) WO2017120767A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111050271A (zh) * 2018-10-12 2020-04-21 北京微播视界科技有限公司 Method and apparatus for processing an audio signal

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100026714A1 (en) * 2008-07-31 2010-02-04 Canon Kabushiki Kaisha Mixed reality presentation system
US20140354515A1 (en) * 2013-05-30 2014-12-04 Oculus Vr, Llc Perception based predictive tracking for head mounted displays
WO2015098292A1 (fr) * 2013-12-25 2015-07-02 ソニー株式会社 Image processing device, image processing method, computer program, and image processing system
CN105144196A (zh) * 2013-02-22 2015-12-09 微软技术许可有限责任公司 Method and device for calculating a camera or object pose

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100026714A1 (en) * 2008-07-31 2010-02-04 Canon Kabushiki Kaisha Mixed reality presentation system
CN105144196A (zh) * 2013-02-22 2015-12-09 微软技术许可有限责任公司 Method and device for calculating a camera or object pose
US20140354515A1 (en) * 2013-05-30 2014-12-04 Oculus Vr, Llc Perception based predictive tracking for head mounted displays
WO2015098292A1 (fr) * 2013-12-25 2015-07-02 ソニー株式会社 Image processing device, image processing method, computer program, and image processing system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
YU, CHI. Ying Yong Tong Ji Xue [Applied Statistics]. Hua Nan Li Gong Da Xue Chu Ban She [South China University of Technology Press], 31 August 2014 (2014-08-31), pages 145-147, ISBN: 978-7-5623-4316-5 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111050271A (zh) * 2018-10-12 2020-04-21 北京微播视界科技有限公司 Method and apparatus for processing an audio signal

Similar Documents

Publication Publication Date Title
US11145083B2 (en) Image-based localization
US10115210B2 (en) Display control device, display control method, and program
CN109743626B (zh) Image display method, image processing method and related device
WO2016155377A1 (fr) Image display method and device
CN113396443A (zh) Augmented reality system
US20160217615A1 (en) Method and System for Implementing a Multi-User Virtual Environment
JP6190035B2 (ja) Segmentation of content delivery
WO2022174594A1 (fr) Multi-camera-based bare hand tracking and display method and system, and apparatus
US11727648B2 (en) Method and device for synchronizing augmented reality coordinate systems
WO2019160699A3 (fr) Use of display device tracking to control image display
US10147240B2 (en) Product image processing method, and apparatus and system thereof
JP2018537748A (ja) Light field rendering of an image using variable computational complexity
WO2017206451A1 (fr) Image information processing method and augmented reality device
US9536351B1 (en) Third person view augmented reality
CN117529700A (zh) Human body pose estimation using self-tracking controllers
CN107065164B (zh) Image display method and apparatus
CN109766006B (zh) Display method, apparatus and device for virtual reality scenes
KR102448833B1 (ko) Rendering method for cloud VR
WO2016011763A1 (fr) Image presentation method, apparatus and device, and non-volatile computer storage medium
CN108804161B (zh) Application initialization method and apparatus, terminal, and storage medium
US20160378178A1 (en) Visualized content transmission control method, sending method and apparatuses thereof
WO2022021631A1 (fr) Interaction control method, terminal device, and storage medium
WO2017120767A1 (fr) Head posture prediction method and apparatus
WO2019114092A1 (fr) Image augmented reality apparatus and method, and augmented reality display device and terminal
JP2015118577A5 (fr)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16884330

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 14.12.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16884330

Country of ref document: EP

Kind code of ref document: A1