CN116126141A - Pose data processing method, pose data processing system, electronic equipment and computer readable medium


Info

Publication number
CN116126141A
CN116126141A (application CN202211734695.7A)
Authority
CN
China
Prior art keywords: head, data, mounted display, display device, main control
Prior art date
Legal status
Pending
Application number
CN202211734695.7A
Other languages
Chinese (zh)
Inventor
徐伟刚
王文兵
Current Assignee
Hangzhou Companion Technology Co ltd
Original Assignee
Hangzhou Companion Technology Co ltd
Priority date: 2022-12-30
Filing date: 2022-12-30
Publication date: 2023-05-16
Application filed by Hangzhou Companion Technology Co ltd filed Critical Hangzhou Companion Technology Co ltd
Priority to CN202211734695.7A
Publication of CN116126141A
Legal status: Pending

Classifications

    • G PHYSICS › G06 COMPUTING; CALCULATING OR COUNTING › G06F ELECTRIC DIGITAL DATA PROCESSING › G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements › G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Power Sources (AREA)

Abstract

Embodiments of the present disclosure disclose a pose data processing method, system, electronic device, and computer readable medium. One embodiment of the method comprises: acquiring head attitude data of a user collected by a head-mounted display device; in response to detecting that the head-mounted display device does not meet a head position data acquisition condition, acquiring master pose data collected by a master device, where the master device is communicatively connected to the head-mounted display device, the master pose data includes master position data, and both the head-mounted display device and the master device correspond to the user; merging the head attitude data and the master position data to obtain combined pose data; and controlling operation of the head-mounted display device according to the combined pose data. This embodiment improves the responsiveness of head-pose interaction with the head-mounted display device without increasing the device's weight or power consumption.

Description

Pose data processing method, pose data processing system, electronic equipment and computer readable medium
Technical Field
Embodiments of the present disclosure relate to the field of computer technology, and in particular, to a pose data processing method, a pose data processing system, an electronic device, and a computer readable medium.
Background
A head-mounted display device allows a user to view display content in a virtual space and experience immersive interaction. At present, to reduce weight and power consumption, head-mounted display devices typically provide only a head attitude (orientation) acquisition function, and the position data used for pose localization is merely estimated.
However, the inventors found that pose localization performed in this way often suffers from the following technical problem: the estimated position data is unstable, so head-pose interaction with the head-mounted display device stutters and responds poorly.
The information disclosed in this Background section is only for enhancement of understanding of the background of the inventive concept and therefore may contain information that does not constitute prior art already known to a person of ordinary skill in the art in this country.
Disclosure of Invention
This Summary is provided to introduce concepts in a simplified form that are further described below in the Detailed Description. It is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose pose data processing methods, systems, electronic devices, and computer readable media to address one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a pose data processing method, comprising: acquiring head attitude data of a user collected by a head-mounted display device; in response to detecting that the head-mounted display device does not meet a head position data acquisition condition, acquiring master pose data collected by a master device, where the master device is communicatively connected to the head-mounted display device, the master pose data includes master position data, and both the head-mounted display device and the master device correspond to the user; merging the head attitude data and the master position data to obtain combined pose data; and controlling operation of the head-mounted display device according to the combined pose data.
Optionally, the head-mounted display device is an AR device or a VR device.
Optionally, the master pose data further includes master attitude data; and merging the head attitude data and the master position data to obtain combined pose data comprises: deleting the master attitude data from the master pose data to obtain the master position data; and merging the head attitude data and the obtained master position data to obtain combined pose data.
Optionally, controlling the operation of the head-mounted display device according to the obtained combined pose data includes: updating the display content of the head-mounted display device according to the obtained combined pose data.
Optionally, updating the display content of the head-mounted display device according to the obtained combined pose data includes: determining application update information corresponding to the current application according to the obtained combined pose data; and displaying the application update information on a display screen of the head-mounted display device so as to update the display content of the head-mounted display device.
Optionally, the master pose data further includes master attitude data; and before controlling the operation of the head-mounted display device according to the obtained combined pose data, the method further comprises: in response to determining that the head-mounted display device has not collected real-time head attitude data, determining target historical head attitude data collected by the head-mounted display device; generating smoothed attitude data according to the target historical head attitude data and the master attitude data; and merging the smoothed attitude data and the master position data to obtain combined pose data.
Optionally, before controlling the operation of the head-mounted display device according to the obtained combined pose data, the method further includes: in response to determining that the master position data is empty, determining target historical master position data collected by the master device; generating estimated position data according to the head attitude data; generating smoothed position data according to the target historical master position data and the estimated position data; and merging the head attitude data and the smoothed position data to obtain combined pose data.
In a second aspect, some embodiments of the present disclosure provide a pose data processing system comprising: a head-mounted display device configured to collect head attitude data of a user; and a master device configured to perform the method described in any implementation of the first aspect, where the master device is communicatively connected to the head-mounted display device, and both the head-mounted display device and the master device correspond to the user.
In a third aspect, some embodiments of the present disclosure provide an electronic device comprising: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect above.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect above.
The above embodiments of the present disclosure have the following beneficial effects: the pose data processing method of some embodiments of the present disclosure improves the responsiveness of head-pose interaction with the head-mounted display device without increasing the device's weight or power consumption. Specifically, the reason head-pose interaction with the head-mounted display device stutters and responds poorly is that the estimated position data is unstable. Based on this, the pose data processing method of some embodiments first acquires head attitude data of a user collected by the head-mounted display device; this data characterizes the head orientation of the user wearing the device. Then, in response to detecting that the head-mounted display device does not meet the head position data acquisition condition, master pose data collected by the master device is acquired, where the master device is communicatively connected to the head-mounted display device, the master pose data includes master position data, and both devices correspond to the user. Thus, when the head-mounted display device cannot collect head position data, the master pose data, including the master position data, can be obtained automatically. Next, the head attitude data and the master position data are merged into combined pose data, pairing the attitude collected by the head-mounted display device with the position collected by the master device. Finally, the operation of the head-mounted display device is controlled according to the combined pose data. Because the position is taken from the master device rather than estimated from the head data, the stability of the position data is improved, and therefore the responsiveness of head-pose interaction with the head-mounted display device is improved without adding weight or power consumption to the device.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a schematic illustration of one application scenario of a pose data processing method according to some embodiments of the present disclosure;
FIG. 2 is a flow chart of some embodiments of a pose data processing method according to the present disclosure;
FIG. 3 is a flow chart of other embodiments of a pose data processing method according to the present disclosure;
FIG. 4 is a schematic structural view of some embodiments of a pose data processing system according to the present disclosure;
FIG. 5 is a schematic structural diagram of an electronic device suitable for use in implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that the modifiers "a", "an", and "a plurality of" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of a pose data processing method according to some embodiments of the present disclosure.
In the application scenario of fig. 1, the master device 101 may first acquire the head attitude data 103 of the user collected by the head-mounted display device 102. Then, in response to detecting that the head-mounted display device 102 does not meet the head position data acquisition condition, the master device 101 may acquire the master pose data 104 that it has collected. The master device 101 is communicatively connected to the head-mounted display device 102; for example, the connection may be wireless. The master pose data 104 includes master position data 105. Both the head-mounted display device 102 and the master device 101 correspond to the user; that is, the user wears the head-mounted display device 102 and holds the master device 101. Thereafter, the master device 101 may merge the head attitude data 103 and the master position data 105 to obtain combined pose data 106. Finally, the master device 101 may control the operation of the head-mounted display device 102 according to the combined pose data 106.
The master device 101 may be hardware or software. When implemented as hardware, it may be a distributed cluster of multiple servers or terminal devices, or a single server or terminal device. When implemented as software, it may be installed in the hardware devices listed above, either as multiple pieces of software or software modules (for example, to provide distributed services) or as a single piece of software or software module. No specific limitation is made here.
It should be understood that the number of master devices in fig. 1 is merely illustrative. There may be any number of master devices as desired for implementation.
With continued reference to fig. 2, a flow 200 of some embodiments of a pose data processing method according to the present disclosure is shown. The pose data processing method comprises the following steps:
In step 201, head attitude data of a user collected by a head-mounted display device is acquired.
In some embodiments, the executing entity of the pose data processing method (such as the master device 101 shown in fig. 1) may acquire the head attitude data of the user from the head-mounted display device through a wired or wireless connection. The head-mounted display device may be a device through which the user views a virtual space; for example, it may be MR glasses. The user is a user who wears the head-mounted display device and holds a master device. The head attitude data may be orientation coordinates in three-dimensional space with 3 degrees of freedom. It should be noted that the wireless connection may include, but is not limited to, 3G/4G, Wi-Fi, Bluetooth, WiMAX, ZigBee, UWB (Ultra-Wideband), and other wireless connection means now known or developed in the future.
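As an illustration of the kind of data handled in step 201 (not part of the disclosure), the following Python sketch defines a 3-degree-of-freedom head attitude sample and a parser for a hypothetical wire format; the packet layout, field names, and the use of yaw/pitch/roll angles are assumptions made only for this example.

```python
# Minimal sketch: a 3-DoF head attitude sample and a parser for an assumed packet layout.
import struct
from dataclasses import dataclass


@dataclass
class HeadAttitude:
    """Orientation-only (3-DoF) head data: yaw, pitch, roll in radians."""
    yaw: float
    pitch: float
    roll: float
    timestamp_ms: int


def parse_head_attitude(packet: bytes) -> HeadAttitude:
    # Assumed layout: three little-endian float32 angles followed by a uint32 timestamp.
    yaw, pitch, roll, ts = struct.unpack("<fffI", packet)
    return HeadAttitude(yaw, pitch, roll, ts)


if __name__ == "__main__":
    sample = struct.pack("<fffI", 0.10, -0.05, 0.01, 123456)
    print(parse_head_attitude(sample))
```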
Alternatively, the head mounted display device may be an AR device or a VR device.
Step 202: in response to detecting that the head-mounted display device does not meet the head position data acquisition condition, acquire master pose data collected by the master device.
In some embodiments, the executing entity may acquire the master pose data collected by the master device in response to detecting that the head-mounted display device does not satisfy the head position data acquisition condition. The head position data acquisition condition may be that the types of pose data the head-mounted display device can collect include position data; the condition is not met when the head-mounted display device has no ability to collect position data. The position data may be position coordinates in three-dimensional space with 3 degrees of freedom. The master device may be a computing device communicatively connected to the head-mounted display device; for example, it may be, but is not limited to, a mobile phone, a smart terminal, or a tablet computer, and may have a built-in camera and/or radar. The master pose data is pose data collected by the master device and includes master position data. The master position data may be position data determined by SLAM (Simultaneous Localization And Mapping). That the head-mounted display device and the master device correspond to the user means that the user wears the head-mounted display device and holds the master device. In practice, the executing entity may collect the master pose data through SLAM techniques.
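A minimal sketch of the step-202 decision follows, assuming a capability query on the head-mounted display device and a SLAM pose query on the master device. The `Headset` and `MasterDevice` interfaces, their method names, and the quaternion layout are invented for illustration and are not defined by the disclosure.

```python
# Sketch: fall back to the master device's SLAM pose when the headset cannot supply position data.
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class MasterPose:
    position: Tuple[float, float, float]          # x, y, z in metres
    attitude: Tuple[float, float, float, float]   # quaternion (w, x, y, z)


class Headset:
    def capabilities(self) -> set:
        return {"attitude"}          # this headset reports orientation only


class MasterDevice:
    def slam_pose(self) -> MasterPose:
        # Placeholder for a SLAM query on the handheld device.
        return MasterPose((0.0, 0.0, 0.0), (1.0, 0.0, 0.0, 0.0))


def acquire_master_pose_if_needed(hmd: Headset, master: MasterDevice) -> Optional[MasterPose]:
    # "Does not meet the head position data acquisition condition" is read here as:
    # the headset cannot report position data at all.
    if "position" not in hmd.capabilities():
        return master.slam_pose()
    return None


if __name__ == "__main__":
    print(acquire_master_pose_if_needed(Headset(), MasterDevice()))
```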
Step 203: merge the head attitude data and the master position data to obtain combined pose data.
In some embodiments, the executing entity may merge the head attitude data and the master position data to obtain combined pose data. In practice, the executing entity may take the head attitude data as the 3-DoF attitude component and the master position data as the 3-DoF position component, and combine the two into combined pose data, which is 6-DoF pose data.
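The merge in step 203 amounts to pairing the 3-DoF attitude with the 3-DoF position into one 6-DoF record. The sketch below assumes a quaternion attitude and an (x, y, z) position; in practice the two streams would also need time alignment and a shared coordinate frame, which the disclosure does not detail.

```python
# Sketch of step 203: combine headset attitude (3 DoF) and master position (3 DoF) into a 6-DoF pose.
from dataclasses import dataclass
from typing import Tuple

Quaternion = Tuple[float, float, float, float]   # (w, x, y, z)
Position = Tuple[float, float, float]            # (x, y, z)


@dataclass
class CombinedPose:
    """Six degrees of freedom: orientation from the headset, position from the master device."""
    attitude: Quaternion
    position: Position


def merge_pose(head_attitude: Quaternion, master_position: Position) -> CombinedPose:
    return CombinedPose(attitude=head_attitude, position=master_position)


if __name__ == "__main__":
    print(merge_pose((1.0, 0.0, 0.0, 0.0), (0.1, 1.5, -0.3)))
```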
Optionally, the master pose data may further include master attitude data, i.e. attitude data collected by the master device. The master attitude data may be orientation coordinates in three-dimensional space with 3 degrees of freedom.
In some optional implementations of some embodiments, the executing entity may also merge the head attitude data and the master position data to obtain combined pose data as follows:
First, delete the master attitude data from the master pose data to obtain the master position data.
Second, merge the head attitude data and the obtained master position data to obtain combined pose data. The merging may be performed as in step 203 and is not repeated here. Deleting the master attitude data from the master pose data before merging saves storage space.
Step 204: control the operation of the head-mounted display device according to the obtained combined pose data.
In some embodiments, the executing entity may control the operation of the head-mounted display device according to the obtained combined pose data. In practice, the executing entity may first determine the selection item corresponding to the combined pose data, i.e. the item the user selects from the available items by a head-control operation; the specific content of the selection item is not limited here and may, for example, be "determine play". The executing entity may then control the head-mounted display device to execute the operation corresponding to that selection item, for example to play the audio associated with it.
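One way step 204 could resolve a selection item from the combined pose (purely an assumed example, since the disclosure does not fix a selection mechanism) is to cast the head's forward direction against candidate item positions and pick the item within a small angular threshold:

```python
# Sketch: pick the UI item closest to the head's forward direction from the combined pose.
import math
from typing import Dict, Optional, Tuple

Vec3 = Tuple[float, float, float]


def forward_from_yaw_pitch(yaw: float, pitch: float) -> Vec3:
    # Unit forward vector for a yaw/pitch orientation (roll ignored).
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))


def pick_selection(position: Vec3, yaw: float, pitch: float,
                   items: Dict[str, Vec3], max_angle_rad: float = 0.2) -> Optional[str]:
    fwd = forward_from_yaw_pitch(yaw, pitch)
    best, best_angle = None, max_angle_rad
    for name, target in items.items():
        to_item = tuple(t - p for t, p in zip(target, position))
        norm = math.sqrt(sum(c * c for c in to_item)) or 1.0
        cos_angle = sum(f * c / norm for f, c in zip(fwd, to_item))
        angle = math.acos(max(-1.0, min(1.0, cos_angle)))
        if angle < best_angle:
            best, best_angle = name, angle
    return best


if __name__ == "__main__":
    # A "determine play" item straight ahead of the user is selected.
    print(pick_selection((0.0, 0.0, 0.0), 0.0, 0.0, {"determine play": (0.0, 0.0, 2.0)}))
```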
In some optional implementations of some embodiments, the executing entity may update the display content of the head-mounted display device according to the obtained combined pose data.
In some optional implementations of some embodiments, the executing entity may update the display content of the head-mounted display device according to the obtained combined pose data by:
and the first step, determining application update information corresponding to the current application according to the obtained combined pose data. In practice, the execution body may determine an application execution node corresponding to the application operation of merging the pose data representation in each application execution node of the current application. The current application may be an application currently running in the foreground. The above-mentioned each application execution node may be each execution node corresponding to the running logic of the current application. For example, after application page A clicks on the a control, the user jumps to application page B. The application execution node "application page B" may be used as an execution node after the application execution node "application page a" after the control a is triggered. Here, the application operation corresponding to the above-mentioned merging of the pose data representation may be an operation of triggering the a control. Then, the application information corresponding to the application execution node may be determined as application update information. The application information may be information to be displayed by the executing application executing node. For example, the application information may be page display information of an application page. The page display information may be information for displaying an application page.
And a second step of displaying the application update information on a display screen of the head-mounted display device to update the display content of the head-mounted display device. In practice, the execution body may display an application page corresponding to the application update information in a display screen of the head-mounted display device to update display contents of the head-mounted display device.
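The optional display-update flow can be pictured as a lookup in the application's running logic: the combined pose yields a triggered control, the control yields the next execution node, and that node's page display information is rendered. The two-node application graph below is invented solely to illustrate the page A to page B example above.

```python
# Sketch: map a triggered control to the next application execution node and render its page info.
from typing import Callable, Dict

# Hypothetical running logic: triggering "a control" on page A jumps to page B.
APP_GRAPH: Dict[str, Dict[str, str]] = {
    "application page A": {"a control": "application page B"},
}
PAGE_DISPLAY_INFO: Dict[str, str] = {
    "application page B": "<render data for page B>",
}


def update_display(current_page: str, triggered_control: str,
                   render: Callable[[str], None]) -> str:
    next_page = APP_GRAPH.get(current_page, {}).get(triggered_control, current_page)
    render(PAGE_DISPLAY_INFO.get(next_page, "<unchanged>"))
    return next_page


if __name__ == "__main__":
    # The triggered control would in practice be derived from the combined pose
    # (e.g. via the selection step sketched earlier); here it is passed in directly.
    print(update_display("application page A", "a control", render=print))
```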
The above embodiments of the present disclosure have the following beneficial effects: the pose data processing method of some embodiments of the present disclosure improves the responsiveness of head-pose interaction with the head-mounted display device without increasing the device's weight or power consumption. Specifically, the reason head-pose interaction with the head-mounted display device stutters and responds poorly is that the estimated position data is unstable. Based on this, the pose data processing method of some embodiments first acquires head attitude data of a user collected by the head-mounted display device; this data characterizes the head orientation of the user wearing the device. Then, in response to detecting that the head-mounted display device does not meet the head position data acquisition condition, master pose data collected by the master device is acquired, where the master device is communicatively connected to the head-mounted display device, the master pose data includes master position data, and both devices correspond to the user. Thus, when the head-mounted display device cannot collect head position data, the master pose data, including the master position data, can be obtained automatically. Next, the head attitude data and the master position data are merged into combined pose data, pairing the attitude collected by the head-mounted display device with the position collected by the master device. Finally, the operation of the head-mounted display device is controlled according to the combined pose data. Because the position is taken from the master device rather than estimated from the head data, the stability of the position data is improved, and therefore the responsiveness of head-pose interaction with the head-mounted display device is improved without adding weight or power consumption to the device.
With further reference to FIG. 3, a flow 300 of further embodiments of a pose data processing method is illustrated. The process 300 of the pose data processing method includes the following steps:
In step 301, head attitude data of a user collected by a head-mounted display device is acquired.
In step 302, in response to detecting that the head-mounted display device does not meet the head position data acquisition condition, master pose data collected by the master device is acquired.
In step 303, the head attitude data and the master position data are merged to obtain combined pose data.
In some embodiments, the specific implementation of steps 301 to 303 and the technical effects thereof may refer to steps 201 to 203 in those embodiments corresponding to fig. 2, which are not described herein.
In step 304, in response to determining that the head-mounted display device has not collected real-time head attitude data, target historical head attitude data collected by the head-mounted display device is determined.
In some embodiments, the executing entity of the pose data processing method (e.g., the master device 101 shown in fig. 1) may determine the target historical head attitude data collected by the head-mounted display device in response to determining that the head-mounted display device has not collected real-time head attitude data. The target historical head attitude data is head attitude data collected by the head-mounted display device before the current moment, and may include at least one recent historical sample. For example, if the current moment is time 5, the target historical head attitude data may include the historical head attitude data collected by the head-mounted display device from time 1 to time 4.
Step 305: generate smoothed attitude data according to the target historical head attitude data and the master attitude data.
In some embodiments, the executing entity may generate smoothed attitude data from the target historical head attitude data and the master attitude data. In practice, the executing entity may smooth the master attitude data according to the target historical head attitude data, for example by applying a smoothing algorithm, to obtain the smoothed attitude data (one assumed choice is sketched after step 306).
Step 306: merge the smoothed attitude data and the master position data to obtain combined pose data.
In some embodiments, the executing entity may merge the smoothed attitude data and the master position data to obtain combined pose data. In practice, the executing entity may take the smoothed attitude data as the 3-DoF attitude component and the master position data as the 3-DoF position component, and combine the two into combined pose data with six degrees of freedom.
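Steps 304 to 306 leave the smoothing algorithm unspecified. As one assumed concrete choice, the sketch below applies an exponential moving average over the recent historical head attitude samples and the master attitude, component by component (yaw, pitch, roll); a production implementation would more likely smooth quaternions (e.g. via slerp) and handle angle wrap-around.

```python
# Sketch: exponential smoothing of the master attitude against recent historical head attitudes.
from typing import Sequence, Tuple

Attitude = Tuple[float, float, float]   # yaw, pitch, roll in radians


def smooth_attitude(history: Sequence[Attitude], master: Attitude,
                    alpha: float = 0.4) -> Attitude:
    if not history:
        return master
    smoothed = list(history[0])
    # Fold in each later historical sample and finally the master attitude.
    for sample in list(history[1:]) + [master]:
        smoothed = [alpha * s + (1.0 - alpha) * prev
                    for s, prev in zip(sample, smoothed)]
    return tuple(smoothed)


if __name__ == "__main__":
    # Four historical samples (times 1-4) and the master attitude at time 5.
    hist = [(0.00, 0.0, 0.0), (0.02, 0.0, 0.0), (0.04, 0.0, 0.0), (0.06, 0.0, 0.0)]
    print(smooth_attitude(hist, master=(0.30, 0.0, 0.0)))
```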
Optionally, before step 307, the executing entity may further perform the following steps:
First, in response to determining that the master position data is empty, determine target historical master position data collected by the master device. Empty master position data indicates that the master device has not collected master position data. The target historical master position data is master position data collected by the master device before the current moment, and may include at least one recent historical sample; for example, if the current moment is time 5, it may include the historical master position data collected from time 1 to time 4.
Second, generate estimated position data according to the head attitude data. In practice, the executing entity may generate estimated pose data from the head attitude data by a SLAM algorithm; the estimated pose data may include position data and attitude data, and the position data it contains is taken as the estimated position data.
Third, generate smoothed position data according to the target historical master position data and the estimated position data. In practice, the executing entity may smooth the estimated position data according to the target historical master position data, for example by applying a smoothing algorithm, to obtain the smoothed position data (see the sketch after this list).
Fourth, merge the head attitude data and the smoothed position data to obtain combined pose data. In practice, the executing entity may take the head attitude data as the 3-DoF attitude component and the smoothed position data as the 3-DoF position component, and combine the two into 6-DoF combined pose data. In this way, when the master device does not collect real-time master position data, i.e. when its position acquisition function fails, the smoothed position data serves as the 3-DoF position component, so the user can still continue head interaction. Because the 3-DoF position data is obtained by smoothing against the master position data collected earlier by the master device, directly using the raw estimated position data is avoided, which improves the stability of the position data and further reduces the stutter of head interaction.
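For the optional fallback above, the disclosure again names only "a smoothing algorithm". The sketch below assumes a simple blend of the estimated position toward the average of the recent historical master positions; the blend weight and the use of a plain average are illustrative choices only.

```python
# Sketch: smooth an estimated position against recent historical master positions.
from typing import Sequence, Tuple

Position = Tuple[float, float, float]


def smooth_position(history: Sequence[Position], estimated: Position,
                    alpha: float = 0.3) -> Position:
    if not history:
        return estimated
    # The average of the recent history acts as a stable anchor; the SLAM estimate
    # derived from the head attitude data is blended in with weight alpha.
    n = len(history)
    anchor = tuple(sum(p[i] for p in history) / n for i in range(3))
    return tuple((1.0 - alpha) * a + alpha * e for a, e in zip(anchor, estimated))


if __name__ == "__main__":
    print(smooth_position([(0.00, 0.0, 1.0), (0.01, 0.0, 1.0)], estimated=(0.20, 0.0, 1.0)))
```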
Step 307: control the operation of the head-mounted display device according to the obtained combined pose data.
In some embodiments, the specific implementation of step 307 and the technical effects thereof may refer to step 204 in those embodiments corresponding to fig. 2, which are not described herein.
As can be seen from fig. 3, compared with the embodiments corresponding to fig. 2, the flow 300 of the pose data processing method in the embodiments corresponding to fig. 3 adds the step of generating smoothed attitude data. Thus, when the head-mounted display device has not collected real-time head attitude data, i.e. when its head attitude acquisition function fails, the scheme described in these embodiments can use, as the 3-DoF attitude component, smoothed attitude data obtained by smoothing the master attitude data collected by the master device against the target historical head attitude data collected by the head-mounted display device. Therefore, the user can still continue head interaction when the head attitude acquisition function of the head-mounted display device fails, and the stutter of head interaction is reduced.
With further reference to FIG. 4, the present disclosure provides some embodiments of a pose data processing system as an implementation of the method illustrated in the above figures.
As shown in fig. 4, the pose data processing system 400 of some embodiments includes: a head-mounted display device 401 configured to collect head attitude data of a user; and a master device 402 configured to perform the pose data processing method of any of the embodiments shown in fig. 2 or fig. 3, where the master device is communicatively connected to the head-mounted display device, and both the head-mounted display device and the master device correspond to the user.
The above embodiments of the present disclosure have the following beneficial effects: with the pose data processing system of some embodiments of the present disclosure, the responsiveness of head-pose interaction with the head-mounted display device can be improved without increasing the weight and power consumption of the head-mounted display device.
Referring now to fig. 5, a schematic diagram of an electronic device 500 (e.g., master device 101 of fig. 1) suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 5 is merely an example and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 5, the electronic device 500 may include a processing means 501 (e.g., a central processor, a graphics processor, etc.) that may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic apparatus 500 are also stored. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 5 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communications device 509, or from the storage device 508, or from the ROM 502. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that, the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks ("LAN"), wide area networks ("WAN"), internetworks (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device, or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire head attitude data of a user collected by a head-mounted display device; in response to detecting that the head-mounted display device does not meet a head position data acquisition condition, acquire master pose data collected by a master device, wherein the master device is communicatively connected to the head-mounted display device, the master pose data includes master position data, and the head-mounted display device and the master device both correspond to the user; merge the head attitude data and the master position data to obtain combined pose data; and control operation of the head-mounted display device according to the obtained combined pose data.
Computer program code for carrying out operations for some embodiments of the present disclosure may be written in one or more programming languages, or a combination thereof, including object oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the technical principles employed. Those skilled in the art will appreciate that the scope of the invention in the embodiments of the present disclosure is not limited to solutions formed by the specific combination of the above technical features, and also covers other solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example, solutions in which the above features are replaced with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.

Claims (10)

1. A pose data processing method, comprising:
acquiring head attitude data of a user collected by a head-mounted display device;
in response to detecting that the head-mounted display device does not meet a head position data acquisition condition, acquiring master pose data collected by a master device, wherein the master device is communicatively connected to the head-mounted display device, the master pose data comprises master position data, and the head-mounted display device and the master device both correspond to the user;
merging the head attitude data and the master position data to obtain combined pose data; and
controlling operation of the head-mounted display device according to the obtained combined pose data.
2. The method of claim 1, wherein the head mounted display device is an AR device or a VR device.
3. The method of claim 1, wherein the master pose data further comprises master attitude data; and
the merging the head attitude data and the master position data to obtain combined pose data comprises:
deleting the master attitude data from the master pose data to obtain the master position data; and
merging the head attitude data and the obtained master position data to obtain combined pose data.
4. The method of claim 1, wherein the controlling the operation of the head-mounted display device according to the obtained combined pose data comprises:
and updating the display content of the head-mounted display device according to the obtained combined pose data.
5. The method of claim 4, wherein updating the display content of the head-mounted display device according to the obtained combined pose data comprises:
determining application update information corresponding to the current application according to the obtained combined pose data;
and displaying the application update information in a display screen of the head-mounted display device so as to update the display content of the head-mounted display device.
6. The method of claim 1, wherein the master pose data further comprises master attitude data; and
before controlling the operation of the head-mounted display device according to the obtained combined pose data, the method further comprises:
in response to determining that the head-mounted display device has not collected real-time head attitude data, determining target historical head attitude data collected by the head-mounted display device;
generating smoothed attitude data according to the target historical head attitude data and the master attitude data; and
merging the smoothed attitude data and the master position data to obtain combined pose data.
7. The method of any one of claims 1-6, wherein, before controlling the operation of the head-mounted display device according to the obtained combined pose data, the method further comprises:
in response to determining that the master position data is empty, determining target historical master position data collected by the master device;
generating estimated position data according to the head attitude data;
generating smoothed position data according to the target historical master position data and the estimated position data; and
merging the head attitude data and the smoothed position data to obtain combined pose data.
8. A pose data processing system, comprising:
a head-mounted display device configured to collect head attitude data of a user; and
a master device configured to perform the method of any one of claims 1-7, wherein the master device is communicatively connected to the head-mounted display device, and the head-mounted display device and the master device both correspond to the user.
9. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-7.
10. A computer readable medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the method of any of claims 1-7.
Priority and publication data

Application CN202211734695.7A, filed 2022-12-30, priority date 2022-12-30, title "Pose data processing method, pose data processing system, electronic equipment and computer readable medium".
Publication CN116126141A, published 2023-05-16. Country: CN. Family ID: 86307449. Legal status: Pending.


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination