CN113382228A - Head-mounted display device and head-mounted display system - Google Patents

Head-mounted display device and head-mounted display system

Info

Publication number
CN113382228A
Authority
CN
China
Prior art keywords
head
mounted display
display device
wearer
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110535316.0A
Other languages
Chinese (zh)
Other versions
CN113382228B (en)
Inventor
张秀志
周宏伟
柳光辉
郭衡江
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Xiaoniao Kankan Technology Co Ltd
Original Assignee
Qingdao Xiaoniao Kankan Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Xiaoniao Kankan Technology Co Ltd filed Critical Qingdao Xiaoniao Kankan Technology Co Ltd
Priority to CN202110535316.0A
Publication of CN113382228A
Application granted
Publication of CN113382228B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G02B27/0172 Head mounted characterised by optical features
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H04N13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/398 Synchronisation thereof; Control thereof
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)

Abstract

The embodiments of the present disclosure disclose a head-mounted display device and a head-mounted display system. The head-mounted display device includes a processor, a first camera device, a lens module and at least one infrared lamp located on the lens module. The at least one infrared lamp is configured to emit an infrared light signal so that the eyes of a wearer of the head-mounted display device receive the infrared light signal; the first camera device is configured to capture an eye image of the wearer after the wearer's eyes receive the infrared light signal; and the processor is configured to perform eye tracking on the wearer according to the eye image.

Description

Head-mounted display device and head-mounted display system
Technical Field
The present disclosure relates to the field of virtual reality display technologies, and more particularly, to a head-mounted display device and a head-mounted display system.
Background
The head-mounted display device may be a head-mounted virtual reality display device. Virtual reality display technology simulates a three-dimensional virtual world and provides the user with simulated visual, auditory, tactile and other sensory input, allowing the user to observe objects in the three-dimensional space in real time and without restriction, as if physically present in the scene.
In the prior art, when a user wears a head-mounted virtual reality display device for a virtual reality experience, the wearer's eyes are in a dark environment, so eye tracking of the wearer cannot be performed.
Disclosure of Invention
It is an object of the embodiments of the present disclosure to provide a new technical solution for a head-mounted display device.
According to a first aspect of the present disclosure, there is provided a head mounted display device comprising: a processor, a first camera device, a lens module and at least one infrared lamp positioned on the lens module,
the at least one infrared lamp is used for emitting an infrared light signal for receiving by eyes of a wearer of the head-mounted display device;
the first camera device is used for collecting the human eye image of the wearer after the eyes of the wearer receive the infrared light signal;
the processor is used for carrying out eyeball tracking on the wearer according to the human eye image.
Optionally, the first camera device comprises a first infrared camera and a second infrared camera,
the first infrared camera is used for collecting the left eye image of the wearer after the left eye of the wearer receives the infrared light signal;
the processor is used for carrying out eyeball tracking on the left eye of the wearer according to the left eye image to obtain the left eye fixation point of the wearer;
the second infrared camera is used for collecting the right eye image of the wearer after the right eye of the wearer receives the infrared light signal;
the processor is used for carrying out eyeball tracking on the right eye of the wearer according to the right eye image to obtain the right eye fixation point of the wearer.
Optionally, the lens module comprises a left lens and a right lens, the head-mounted display device further comprises a display screen,
the processor is further configured to: calculating a first distance between a left eye eyeball and a right eye eyeball of the wearer according to the left eye fixation point and the right eye fixation point of the wearer; acquiring a second distance between the left lens and the right lens; and adjusting the display center position of the image displayed by the display screen according to the first distance and the second distance.
Optionally, the head-mounted display device further comprises a dichroic mirror on the lens module,
the dichroic mirror is for reflecting the infrared light signal if the eye of the wearer receives the infrared light signal;
the first camera device is used for collecting the human eye image of the wearer under the condition that the dichroic mirror reflects the infrared light signal.
Optionally, the head-mounted display device further comprises a first interface, a switch and a wireless communication module,
the wireless communication module is used for establishing a wireless communication connection with the handle;
the processor is used for acquiring first position and posture data of the head-mounted display device and second position and posture data of the handle;
the processor is used for controlling the change-over switch to be switched to connect the first interface and the display screen under the condition that the first interface is detected to be connected with a host, sending the first position and posture data and the second position and posture data to the host through the first interface so that the host can render images according to the first position and posture data, and outputting the rendered images to the display screen through the first interface for displaying; or,
the processor is used for controlling the selector switch to connect the processor and the display screen under the condition that the first interface is not connected with the host, rendering an image according to the first position and posture data, and outputting the rendered image to the display screen for display.
Optionally, the head-mounted display device further comprises a second camera and a first inertial measurement unit, the handle comprises a second inertial measurement unit and a plurality of illuminants,
the second camera device is used for acquiring a first image in a first exposure time length and acquiring a second image in a second exposure time length; the second image at least comprises light spots corresponding to a plurality of luminous bodies arranged on the handle;
the processor is configured to determine the first pose data from the first image and the first inertial measurement unit, and to determine the second pose data from the second image, the second inertial measurement unit, and the first pose data.
Optionally, the second camera device comprises at least one fisheye camera.
Optionally, the head-mounted display device further comprises an audio output module,
the processor is used for receiving audio data transmitted by the host through the first interface and decoding the audio data under the condition that the first interface is detected to be connected with the host;
the audio output module is used for playing the decoded audio data.
Optionally, the head-mounted display device further comprises a second interface,
the second interface is used for connecting power supply equipment so that the power supply equipment can supply power for the head-mounted display equipment.
According to a second aspect of the present disclosure, there is also provided a head mounted display system, comprising:
a head mounted display device according to the first aspect above, the head mounted display device comprising a wireless communication module and a first interface;
the handle is in wireless communication connection with the head-mounted display device through the wireless communication module;
the host computer is in wired communication connection with the head-mounted display device through the first interface.
According to the embodiments of the present disclosure, a novel head-mounted display device is provided. In addition to a processor and a lens module, the head-mounted display device includes a first camera device and at least one infrared lamp located on the lens module. The at least one infrared lamp emits an infrared light signal, and after the eyes of the wearer of the head-mounted display device receive the infrared light signal, the first camera device can capture an eye image of the wearer. Eye tracking can therefore be performed on the wearer according to the eye image, real-time gaze point rendering can be carried out, and rendering efficiency is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
FIG. 1 is a functional block diagram of a head mounted display device according to an embodiment of the present disclosure;
FIG. 2 is a functional block diagram of infrared light emission according to an embodiment of the present disclosure;
FIG. 3 is a functional block diagram of a head mounted display device according to another embodiment of the present disclosure;
FIG. 4a is a functional block diagram of a head mounted display system according to an embodiment of the present disclosure;
FIG. 4b is a functional block diagram of a head mounted display system according to another embodiment of the present disclosure;
FIG. 5 is a block diagram of the configuration of a camera device according to the present disclosure;
FIG. 6a is a functional block diagram of a head mounted display system according to another embodiment of the present disclosure;
FIG. 6b is a functional block diagram of a head mounted display system according to another embodiment of the present disclosure.
Detailed Description
Various exemplary embodiments of the present disclosure will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present disclosure unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the disclosure, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
The embodiment of the present disclosure provides a head-mounted display device, which may be a head-mounted virtual reality display device, and fig. 1 is a block schematic diagram of the head-mounted display device provided according to the embodiment of the present disclosure.
Referring to fig. 1, the head-mounted display apparatus 10 includes a processor 110, a first camera 120, a lens module 130, and at least one infrared lamp 131 located on the lens module 130, wherein the processor 110 is in communication connection with the first camera 120 and the lens module 130.
The at least one infrared lamp 131 is used to emit an infrared light signal for reception by the eye of a wearer of the head-mounted display device 10.
In this embodiment, as shown in fig. 1, the head-mounted display device 10 includes four infrared lamps 131, namely an infrared lamp 131a, an infrared lamp 131b, an infrared lamp 131c, and an infrared lamp 131d. The four infrared lamps 131 may be disposed on the outer surface of the lens module 130, that is, on the surface of the lens module 130 facing away from the display screen 140, and can simultaneously emit infrared light signals to be received by the left and right eyes of the wearer of the head-mounted display device 10 (only one eye of the wearer is shown in fig. 2).
The first camera device 120 is used for collecting the human eye image of the wearer after the eye of the wearer receives the infrared light signal.
In this embodiment, the first camera device 120 may be disposed inside the head-mounted display device 10. The first camera device 120 is an infrared camera device and is configured to capture an eye image of the wearer after the wearer's eye receives the infrared light signal. Because the infrared light signal needs to be reflected so that the first camera device 120 can capture an eye image representing the wearer's eye movement, as shown in fig. 2, a dichroic mirror 132 is further disposed in the head-mounted display device 10. The dichroic mirror 132 is located on the lens module 130 and is configured to reflect the infrared light signal when the wearer's eye receives it, while continuing to transmit the visible light from the screen to the eye, so that the first camera device 120 captures the eye image of the wearer while the dichroic mirror 132 reflects the infrared light signal.
In one example, the first camera 120 may include a first infrared camera and a second infrared camera (only one of which is shown in fig. 2).
The first infrared camera is configured to capture a left-eye image of the wearer after the wearer's left eye receives the infrared light signal; the processor 110 is configured to perform eye tracking on the wearer's left eye according to the left-eye image to obtain the wearer's left-eye gaze point.
The second infrared camera is configured to capture a right-eye image of the wearer after the wearer's right eye receives the infrared light signal; the processor 110 is configured to perform eye tracking on the wearer's right eye according to the right-eye image to obtain the wearer's right-eye gaze point.
For example, taking fig. 2 as showing the wearer's left eye: the four infrared lamps 131 emit infrared light signals simultaneously, the wearer's left eye receives the infrared light signals, and after the infrared light signals are reflected by the dichroic mirror 132, the first infrared camera captures a left-eye image of the wearer. The processor 110 then performs eye tracking according to the left-eye image to obtain the wearer's left-eye gaze point, and left-eye gaze point rendering is performed on the display screen 140, which improves rendering efficiency.
Likewise, taking fig. 2 as showing the wearer's right eye: the four infrared lamps 131 emit infrared light signals simultaneously, the wearer's right eye receives the infrared light signals, and after the infrared light signals are reflected by the dichroic mirror 132, the second infrared camera captures a right-eye image of the wearer. The processor 110 performs eye tracking according to the right-eye image to obtain the wearer's right-eye gaze point, and right-eye gaze point rendering is performed on the display screen 140, which improves rendering efficiency.
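For illustration only, the sketch below shows one way such per-eye gaze estimation could be implemented from an infrared eye image, assuming a simple dark-pupil detection step followed by a pre-fitted polynomial calibration mapping. The patent does not specify the tracking algorithm, and the threshold value, calibration shape and function names here are hypothetical.

```python
# Hypothetical sketch of per-eye gaze estimation from an IR eye image.
# Assumes a dark-pupil approach and a pre-fitted quadratic calibration;
# the threshold and the calibration routine are illustrative only.
import cv2
import numpy as np

def pupil_center(eye_image_gray):
    """Locate the pupil as the largest dark blob in the infrared image."""
    _, mask = cv2.threshold(eye_image_gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])

def gaze_point(pupil_xy, calib_coeffs):
    """Map pupil coordinates to a display-space gaze point using a
    quadratic polynomial fitted during a calibration step (assumed)."""
    x, y = pupil_xy
    features = np.array([1.0, x, y, x * x, x * y, y * y])
    return calib_coeffs @ features  # calib_coeffs has shape (2, 6) -> (gx, gy)
```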
According to the embodiments of the present disclosure, the head-mounted display device includes a processor, a lens module, a first camera device and at least one infrared lamp located on the lens module. The infrared lamp emits an infrared light signal, and after the eyes of the wearer of the head-mounted display device receive the infrared light signal, the first camera device can capture eye images of the wearer, so that eye tracking can be performed on the wearer according to the eye images, real-time gaze point rendering can be carried out, and rendering efficiency is improved.
In one embodiment, the lens module 130 includes a left lens and a right lens, and here, in the case that four infrared lamps 131 are disposed in the head-mounted display device 10, two of the infrared lamps 131 may be disposed on an outer surface of the left lens, that is, on a side of the left lens away from the display screen 140. Two more infrared lamps are disposed on the outer surface of the right lens, i.e., on the side of the right lens away from the display screen 140.
The processor 110 is further configured to: calculating a first distance between a left eye eyeball and a right eye eyeball of the wearer according to the left eye fixation point and the right eye fixation point of the wearer; acquiring a second distance between the left lens and the right lens; and then the display center position of the image displayed on the display screen 140 is adjusted according to the first distance and the second distance.
It can be understood that, in the related art, the actual distance between the left lens and the right lens is adjusted by a mechanical structure. For a head-mounted display device 10 with two display screens, each display screen moves together with its corresponding lens, and only the optical-axis center of the lens module needs to be moved to stay aligned with the center of the display screen; however, this adjustment is limited by the size of the display screens and of the head-mounted display device, and the practical adjustment range is only about 58-68 mm. For a head-mounted display device with a single display screen, the lens module and the display screen cannot be adjusted together, so interpupillary-distance adjustment is even more restricted and is usually limited to a few fixed positions that shift the display centers of the left and right images. Moreover, in practice the wearer usually does not know his or her own interpupillary distance accurately and judges whether the adjustment is suitable purely by feel, and this blind operation also degrades the experience.
In this embodiment, the processor 110 calculates a first distance between the wearer's left and right eyeballs according to the wearer's left-eye gaze point and right-eye gaze point, obtains a second distance between the left lens and the right lens, calculates the difference between the first distance and the second distance, and then adjusts the display center position of the image displayed on the display screen according to that difference. The wearer's interpupillary distance can thus be detected automatically, so the device adapts itself to wearers with different interpupillary distances.
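As a rough, non-authoritative sketch of this adaptation, the snippet below turns the difference between the measured inter-eye distance and the fixed lens spacing into a pixel offset of the displayed image centers; the pixel-pitch conversion and the symmetric split of the correction between the two half-images are assumptions, not details given in this disclosure.

```python
# Hypothetical sketch: adapt the display center of a single panel to the
# wearer's interpupillary distance (IPD). Pixel pitch and the even split of
# the correction between the left/right half-images are assumptions.
def adjust_display_centers(left_eye_x_mm, right_eye_x_mm,
                           lens_spacing_mm, pixels_per_mm):
    ipd_mm = abs(right_eye_x_mm - left_eye_x_mm)   # first distance
    delta_mm = ipd_mm - lens_spacing_mm            # difference vs. second distance
    shift_px = 0.5 * delta_mm * pixels_per_mm      # half of the correction per side
    # Negative shift moves the left image center inward, positive the right one.
    return -shift_px, +shift_px
```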
In one embodiment, existing head-mounted virtual reality display devices fall into two categories: PC helmets and all-in-one devices. Content resources for PC helmets are relatively mature and stable, but the usage scenario is complex, the environment is relatively difficult to set up, and the device is not convenient to carry and use. All-in-one devices are convenient to use and carry, but game resources are relatively scarce; most all-in-one devices currently achieve wireless streaming through 5G WIFI, 11ad and similar schemes, yet wireless streaming is strongly affected by router load and the like, which increases latency, and limited wireless bandwidth also lowers image definition and degrades the experience. The head-mounted display device in this embodiment therefore provides a switching function between the all-in-one mode and the PC helmet mode, completes the switch automatically, and allows the head and hand tracking functions of the all-in-one device to be shared.
Referring to fig. 3, the head-mounted display device 10 further includes a first interface 150, a switch 160, and a wireless communication module 170. The wireless communication module 170 is used for wireless communication with the handle 20. The first interface 150 may be a Type-C interface, through which image data, audio data and pose data can be transmitted simultaneously. The processor 110 is connected to the first interface 150 and to the wireless communication module 170. The switch 160 includes a movable contact and two stationary contacts: one stationary contact is connected to the first interface 150, the other is connected to the processor 110, and the movable contact is connected to the display screen 140. The display screen 140 may be a liquid crystal display (LCD).
The processor 110 is configured to obtain first pose data of the head-mounted display device 10 and second pose data of the handle 20. Pose data generally include position data and orientation data, and may also be referred to as 6DoF (six degrees of freedom) data. For example, the first pose data include the position and orientation of the head-mounted display device 10, and the second pose data include the position and orientation of the handle 20. In other words, head and hand tracking of the wearer can be performed by the head-mounted display device 10 itself.
In this embodiment, as shown in fig. 4a and 4b, the head-mounted display device 10 further includes a second camera device 180 and a first inertial measurement unit 190, and the handle 20 includes a second inertial measurement unit and a plurality of light emitters (not shown in the figure).
The second camera device 180 is configured to capture a first image with a first exposure duration and a second image with a second exposure duration, wherein the second image includes at least the light spots corresponding to the plurality of light emitters arranged on the handle 20. The processor 110 is configured to determine the first pose data from the first image and the first inertial measurement unit, and to determine the second pose data from the second image, the second inertial measurement unit and the first pose data.
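The following sketch illustrates the two-stage pose pipeline just described. The visual-inertial tracker and the LED-constellation solver stand in for whatever algorithms the device actually uses, and all class and method names are hypothetical.

```python
# Hypothetical sketch of the head/hand pose pipeline. `vio` and `blob_solver`
# are placeholders for unspecified visual-inertial and LED-constellation
# algorithms; poses are assumed to be 4x4 homogeneous numpy matrices.
class HeadHandTracker:
    def __init__(self, vio, blob_solver):
        self.vio = vio                  # environment tracking (assumed)
        self.blob_solver = blob_solver  # handle LED light-spot solver (assumed)

    def update(self, first_image, hmd_imu_samples, second_image, handle_imu_samples):
        # First pose: headset pose from the environment image plus the
        # headset IMU (first inertial measurement unit).
        hmd_pose = self.vio.track(first_image, hmd_imu_samples)
        # Second pose: handle pose from the short-exposure LED-spot image and
        # the handle IMU, expressed relative to the headset and then lifted
        # into the world frame using the first pose.
        handle_in_hmd = self.blob_solver.solve(second_image, handle_imu_samples)
        handle_pose = hmd_pose @ handle_in_hmd
        return hmd_pose, handle_pose
```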
In this embodiment, the second camera device 180 includes at least one fisheye camera 181. The at least one fisheye camera 181 captures the first image and the second image alternately with different exposure durations, for example first capturing the first image with the first exposure duration and then the second image with the second exposure duration. When several fisheye cameras 181 are used, they capture the first image (or the second image) at the same moment, that is, the center point of the exposure time of each fisheye camera 181 is the same.
For example, as shown in fig. 5, four fisheye cameras 181 are disposed on an outer surface of the head-mounted display device 10, and specifically include an upper left fisheye camera 181a, an upper right fisheye camera 181b, a lower left fisheye camera 181c, and a lower right fisheye camera 181d, where the four fisheye cameras 181 are disposed at different positions, and for the description of the fisheye cameras 181, reference may be made to the following embodiments, which are not described herein again.
Because each fisheye camera 181 is mounted at a different position, the part of the external environment it observes is also different. To ensure the consistency of the first images output in different environments, the four fisheye cameras 181 may each capture their first image with a different first exposure duration: a fisheye camera 181 facing a darker environment uses a longer first exposure duration, and one facing a brighter environment uses a shorter one. By aligning the exposure center points of the several fisheye cameras 181, the moments at which the four fisheye cameras 181 capture their first images are kept consistent, so the four cameras photograph the surrounding environment at the same moment and tracking accuracy is guaranteed.
The handle 20 is provided with a plurality of light emitters that are not located in the same plane; the light emitters may be, for example, visible or infrared light sources such as LED lamps. Because the handle light sources are much brighter than the external environment, the second exposure duration of the four fisheye cameras 181 is set very short to effectively reduce the influence of the external environment on handle tracking. The four fisheye cameras 181 all use the same second exposure duration, and it is ensured that the handle light sources are emitting light at the exposure moment, so the four fisheye cameras can photograph the second image of the handle 20 simultaneously. For example, when the at least one fisheye camera 181 disposed on the head-mounted display device 10 captures a second image with the second exposure duration, the light emitters on the handle 20 are controlled to turn on for a preset lighting duration, and the middle moment of the second exposure duration corresponds to the middle moment of the lighting duration.
That is, the middle moment of the second exposure duration is synchronized with the middle moment of the lighting duration: the light emitters on the handle 20 are lit during the exposure window in which the at least one fisheye camera 181 captures the second image, which ensures that the second image contains the light spots corresponding to the plurality of light emitters arranged on the handle 20.
In this embodiment, the lighting duration may be longer than the second exposure duration, that is, the on-time of the light emitters extends slightly before and after the exposure window. This absorbs the timing error that exists when the capture of the second image by the camera device and the lighting of the handle's light emitters are synchronized over a wireless link, and thus guarantees that the camera device captures the light spots produced by the light emitters when it captures the second image.
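A minimal timing sketch of this synchronization is given below: the handle's lighting window is centered on the midpoint of the cameras' second exposure and made slightly longer than it, so a bounded wireless synchronization error cannot push the exposure outside the lit interval. The margin value is an assumption.

```python
# Hypothetical timing sketch: center the handle LED on-window on the midpoint
# of the second exposure and extend it beyond the exposure on both sides, so
# a wireless sync error (assumed bounded by sync_margin_us) is tolerated.
def led_on_window(exposure_start_us, exposure_len_us, sync_margin_us=500):
    exposure_mid = exposure_start_us + exposure_len_us / 2.0
    on_len = exposure_len_us + 2 * sync_margin_us   # longer than the exposure
    on_start = exposure_mid - on_len / 2.0           # midpoints coincide
    return on_start, on_len
```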
After acquiring the first position and orientation data of the head mounted display device 10 and the second position and orientation data of the handle 20, the processor 110 may process the first position and orientation data and the second position and orientation data differently according to the operation mode of the head mounted display device 10.
In the first case: when detecting that the first interface 150 is connected to the host 30, the processor 110 controls the switch 160 to switch to connect the first interface 150 and the display screen 140, and sends the first position data and the second position data to the host 30 through the first interface 150, so that the host 30 renders an image according to the first position data and the second position data, and outputs the rendered image to the display screen 140 through the first interface 150 for display.
In this embodiment, when the host 30 accesses the first interface 150 through the PC data line, the processor 110 controls one of the fixed contacts and the movable contact of the switch 160 to connect the first interface 150 and the display screen 140, and the head-mounted display device 10 works in the PC helmet mode.
In this embodiment, the head-mounted display device 10 further includes a bridge chip 1100, the bridge chip 1100 is connected between the first interface 150 and the switch 160, and the bridge chip 1100 is configured to convert the received image rendered by the host 30 and output the converted image to the display screen 140 for displaying.
As shown in fig. 4a, the first interface 150 may have an image end and a data end. The data end is a USB end and transmits pose data and audio data; the image end is an image-data interface such as a DP (DisplayPort) end or an HDMI (High-Definition Multimedia Interface) end and transmits image data. The image end is connected to the bridge chip 1100, and the data end is connected to the processor 110. When the host 30 is connected to the first interface 150 through the PC data line, the processor 110 controls one of the stationary contacts of the switch 160 to connect to the movable contact so as to link the first interface 150, the bridge chip 1100 and the display screen 140. The processor 110 transmits the pose data of the head-mounted display device 10 to the host 30 through the data end; the host 30 renders an image according to the pose data of the head-mounted display device 10 and the pose data of the handle 20 to obtain a DP signal and transmits it to the bridge chip 1100 through the image end of the first interface 150; and the bridge chip 1100 converts the DP signal into a MIPI (Mobile Industry Processor Interface) signal and outputs it to the display screen 140 for display.
On the other hand, the processor 110 receives the audio data transmitted by the host 30 through the data terminal of the first interface 150, decodes the audio data, and plays the decoded audio data through the audio output module 1300.
In the second case: the processor 110 is configured to control the switch 160 to switch to connect the processor 110 and the display screen 140, render an image according to the first position posture data and the second position posture data, and output the rendered image to the display screen 140 for displaying, when detecting that the first interface 150 is not connected to the host 30.
In this embodiment, when the host 30 is not connected to the first interface 150, the processor 110 controls another fixed contact of the switch 160 to connect with the movable contact to communicate the processor 110 and the display screen 140, and herein, the head-mounted display device 10 operates in the all-in-one mode.
In this embodiment, the head-mounted display device 10 further includes a display driving module 1200, the display driving module 1200 is connected between the processor 110 and the switch 160, and the display driving module 1200 is configured to drive the display screen 140 to display an image rendered by the processor 110.
For example, as shown in fig. 4b, when the host 30 is not connected to the first interface 150, the processor 110 controls the other stationary contact of the switch 160 to connect with the movable contact so as to link the processor 110, the display driving module 1200 and the display screen 140. The processor 110 renders an image according to the pose data of the head-mounted display device 10 and the pose data of the handle 20, and controls the display driving module 1200 to drive the display screen 140 to display the image rendered by the processor 110.
According to the embodiments of the present disclosure, the processor can acquire the first pose data of the head-mounted display device and the second pose data of the handle. When it detects that the first interface is connected to a host, it controls the switch to link the first interface and the display screen and sends the first and second pose data to the host through the first interface so that the host renders the image; when it detects that the first interface is not connected to a host, it controls the switch to link the processor and the display screen and renders the image itself according to the first and second pose data. In other words, the head-mounted display device implements a switching function between the all-in-one mode and the PC helmet mode, completes the switch automatically, and shares the head and hand tracking functions of the all-in-one device.
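To make the control flow concrete, the sketch below mirrors the two cases just described; the interface, switch, display and renderer objects and their method names are placeholders, not an API defined by this disclosure.

```python
# Hypothetical sketch of the all-in-one / PC-helmet mode switch. All object
# and method names are placeholders for the hardware described above.
def present_frame(first_interface, switch, display, local_renderer,
                  hmd_pose, handle_pose):
    if first_interface.host_connected():
        # PC helmet mode: route the display to the first interface, stream
        # both poses to the host, and show the frame the host rendered.
        switch.route(source="first_interface", sink="display")
        first_interface.send_poses(hmd_pose, handle_pose)
        display.show(first_interface.receive_frame())
    else:
        # All-in-one mode: route the display to the processor and render locally.
        switch.route(source="processor", sink="display")
        display.show(local_renderer.render(hmd_pose, handle_pose))
```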
In an embodiment, since the second camera device 180 includes at least one fisheye camera 181, for example the four fisheye cameras 181a, 181b, 181c and 181d shown in fig. 5, this embodiment briefly describes the at least one fisheye camera 181.
The fisheye cameras 181 may have the same horizontal, vertical and diagonal fields of view. Each fisheye camera 181 is slightly recessed, so that if the head-mounted display device 10 is dropped, or is placed on a horizontal surface such as a desktop, the lens module is protected.
The at least one fisheye camera 181 is placed at multiple positions on the surface of the head-mounted display device 10, which first of all enlarges the range over which the head-mounted display device 10 can track: when the device performs tracking, it can observe the surrounding environment over a large area. On the one hand this improves tracking stability and accuracy; on the other hand it enlarges the range over which the handle 20 can be tracked, reduces the blind zones of optical handle tracking, and gives a better handle-tracking experience.
A tracking coverage diagram for each fisheye camera 181 can be produced by plotting the area that each fisheye camera 181 covers according to its field-of-view data.
Each fisheye camera 181 can output its own field-of-view range in a different color. For example, within the main visual range of the human eye, the fields of view of several fisheye cameras are kept overlapping, which improves tracking accuracy and stability, since the overlap region of several fisheye cameras 181 guarantees better accuracy and stability. Specifically, the lower-left fisheye camera 181c and the lower-right fisheye camera 181d can be oriented to increase the overlap region of the fisheye cameras 181, improving the accuracy and stability of the visual region. In areas where the human eye is not focused, for example those covered by the upper-left fisheye camera 181a and the upper-right fisheye camera 181b, the tracking area of a single fisheye camera 181 is enlarged instead, which increases the trackable range.
In one embodiment, the head-mounted display device 10 may further include a second interface (not shown in the figure) for connecting an external device, so that the external device can supply power to the head-mounted display device 10, perform data transmission, and perform upgrading. The second interface may be a USB 3.0 interface.
It is understood that the head-mounted display device 10 may further include a distance sensor detection module, a memory storage module, a WIFI/BT module, a power management module, an audio input module, an optical display module, and the like.
The present embodiment further provides a head-mounted display system 60, as shown in fig. 6a and 6b, the head-mounted display system 60 includes the head-mounted display device 10, the handle 20, and the host 30 provided in any of the above embodiments.
The handle 20 is connected to the head-mounted display device 10 through the wireless communication module 170 of the head-mounted display device 10 in a wireless communication manner. The handle 20 includes a left handle 210 and a right handle 220, for example, each of the left handle 210 and the right handle 220 is provided with a wireless communication module, so as to wirelessly communicate with the head-mounted display device 10 through the corresponding wireless communication module.
The handle 20 includes a second inertial measurement unit and a plurality of light emitters (not shown in the figure). The handle 20 may also include a power/system key, an enter key, a return key, a menu key, a rocker enter key, a trigger key, a grab key, a rocker and so on, and thus supports enter, return, rocker, trigger and grab functions. For the control of the handle 20, reference may be made to the above embodiments; this embodiment is not limited in this respect.
The host 30 can be connected to the head-mounted display device 10 through the first interface 150 in a wired manner. As shown in fig. 6a, when the host 30 is not connected to the head-mounted display device 10 through the first interface 150, the head-mounted display device 10 operates in the all-in-one mode. As shown in fig. 6b, when the host 30 is connected to the head-mounted display device 10 through the first interface 150, the head-mounted display device 10 operates in the PC helmet mode.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA) may execute the computer-readable program instructions by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry, in order to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A head-mounted display device is characterized by comprising a processor, a first camera device, a lens module and at least one infrared lamp positioned on the lens module,
the at least one infrared lamp is used for emitting an infrared light signal for receiving by eyes of a wearer of the head-mounted display device;
the first camera device is used for collecting the human eye image of the wearer after the eyes of the wearer receive the infrared light signal;
the processor is used for carrying out eyeball tracking on the wearer according to the human eye image.
2. The head-mounted display device of claim 1, wherein the first camera comprises a first infrared camera and a second infrared camera,
the first infrared camera is used for collecting the left eye image of the wearer after the left eye of the wearer receives the infrared light signal;
the processor is used for carrying out eyeball tracking on the left eye of the wearer according to the left eye image to obtain the left eye fixation point of the wearer;
the second infrared camera is used for collecting the right eye image of the wearer after the right eye of the wearer receives the infrared light signal;
the processor is used for carrying out eyeball tracking on the right eye of the wearer according to the right eye image to obtain the right eye fixation point of the wearer.
3. The head-mounted display device of claim 2, wherein the lens module comprises a left lens and a right lens, the head-mounted display device further comprising a display screen,
the processor is further configured to: calculating a first distance between a left eye eyeball and a right eye eyeball of the wearer according to the left eye fixation point and the right eye fixation point of the wearer; acquiring a second distance between the left lens and the right lens; and adjusting the display center position of the image displayed by the display screen according to the first distance and the second distance.
4. The head mounted display device of claim 1, further comprising a dichroic mirror positioned on the lens module,
the dichroic mirror is for reflecting the infrared light signal if the eye of the wearer receives the infrared light signal;
the first camera device is used for collecting the human eye image of the wearer under the condition that the dichroic mirror reflects the infrared light signal.
5. The head mounted display device of claim 3, further comprising a first interface, a switch, and a wireless communication module,
the wireless communication module is used for carrying out wireless communication connection with the handle;
the processor is used for acquiring first position and posture data of the head-mounted display device and second position and posture data of the handle;
the processor is used for controlling the change-over switch to be switched to connect the first interface and the display screen under the condition that the first interface is detected to be connected with a host, sending the first position and posture data and the second position and posture data to the host through the first interface so that the host can render images according to the first position and posture data, and outputting the rendered images to the display screen through the first interface for displaying; or,
the processor is used for controlling the selector switch to connect the processor and the display screen under the condition that the first interface is not connected with the host, rendering an image according to the first position and posture data, and outputting the rendered image to the display screen for display.
6. The head-mounted display device of claim 5, further comprising a second camera and a first inertial measurement unit, wherein the handle comprises a second inertial measurement unit and a plurality of lights,
the second camera device is used for acquiring a first image in a first exposure time length and acquiring a second image in a second exposure time length; the second image at least comprises light spots corresponding to a plurality of luminous bodies arranged on the handle;
the processor is configured to determine the first pose data from the first image and the first inertial measurement module; and determining the second pose data from the second image, the second inertial measurement module, and the first pose data.
7. The head-mounted display apparatus of claim 6, wherein the second camera comprises at least one fisheye camera.
8. The head mounted display device of claim 5, further comprising an audio output module,
the processor is used for receiving audio data transmitted by the host through the first interface and decoding the audio data under the condition that the first interface is detected to be connected with the host;
the audio output module is used for playing the decoded audio data.
9. The head mounted display device of claim 1, wherein the head mounted display device further comprises a second interface,
the second interface is used for connecting an external device, so that the external device can supply power to the head-mounted display device, and any one or more of data transmission and upgrading can be performed.
10. A head-mounted display system, comprising:
the head mounted display device according to any one of claims 1-9, the head mounted display device comprising a wireless communication module and a first interface;
the handle is in wireless communication connection with the head-mounted display device through the wireless communication module;
the host computer is in wired communication connection with the head-mounted display device through the first interface.
CN202110535316.0A 2021-05-17 2021-05-17 Head-mounted display device and head-mounted display system Active CN113382228B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110535316.0A CN113382228B (en) 2021-05-17 2021-05-17 Head-mounted display device and head-mounted display system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110535316.0A CN113382228B (en) 2021-05-17 2021-05-17 Head-mounted display device and head-mounted display system

Publications (2)

Publication Number Publication Date
CN113382228A (en) 2021-09-10
CN113382228B (en) 2023-04-18

Family

ID=77571136

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110535316.0A Active CN113382228B (en) 2021-05-17 2021-05-17 Head-mounted display device and head-mounted display system

Country Status (1)

Country Link
CN (1) CN113382228B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023093649A1 (en) * 2021-11-26 2023-06-01 北京七鑫易维信息技术有限公司 Eye tracking module accessory and manufacturing method therefor, head-mounted display device

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104822061A (en) * 2015-04-30 2015-08-05 小鸟科技有限公司 Interpupillary distance adjusting method, system, and module of head-mounted 3D display
CN106527687A (en) * 2016-10-10 2017-03-22 北京小鸟看看科技有限公司 Virtual reality helmet and double computing platform method for realizing same
CN106686365A (en) * 2016-12-16 2017-05-17 歌尔科技有限公司 Lens adjusting method and lens adjusting device for head-mounted display equipment, and head-mounted display equipment
CN106773038A (en) * 2016-12-06 2017-05-31 栗明 The infrared lamp penetrating apparatus for wearing display device that a kind of view-based access control model is followed the trail of
CN106990847A (en) * 2017-04-06 2017-07-28 小派科技(上海)有限责任公司 A kind of virtual implementing helmet and the method for adjusting virtual implementing helmet interpupillary distance
CN107462992A (en) * 2017-08-14 2017-12-12 深圳创维新世界科技有限公司 A kind of adjusting method for wearing display device, device and wear display device
CN109857255A (en) * 2019-02-13 2019-06-07 京东方科技集团股份有限公司 A kind of display parameter regulation method, device and wear display equipment
CN209858860U (en) * 2019-06-28 2019-12-27 歌尔科技有限公司 Head-mounted display equipment
US10528128B1 (en) * 2017-12-15 2020-01-07 Facebook Technologies, Llc Head-mounted display devices with transparent display panels for eye tracking
CN111061363A (en) * 2019-11-21 2020-04-24 青岛小鸟看看科技有限公司 Virtual reality system
CN112286343A (en) * 2020-09-16 2021-01-29 青岛小鸟看看科技有限公司 Positioning tracking method, platform and head-mounted display system
CN113382230A (en) * 2021-05-17 2021-09-10 青岛小鸟看看科技有限公司 Head-mounted display device and head-mounted display system

Also Published As

Publication number Publication date
CN113382228B (en) 2023-04-18

Similar Documents

Publication Publication Date Title
CN113382230B (en) Head-mounted display device and head-mounted display system
US9710973B2 (en) Low-latency fusing of virtual and real content
CA2820950C (en) Optimized focal area for augmented reality displays
CN107710284B (en) Techniques for more efficiently displaying text in a virtual image generation system
CN108535868B (en) Head-mounted display device and control method thereof
US20180114353A1 (en) Integrating Real World Conditions into Virtual Imagery
KR20160113139A (en) Gaze swipe selection
CN105009039A (en) Direct hologram manipulation using IMU
KR20180002534A (en) External imaging system, external imaging method, external imaging program
EP3671408B1 (en) Virtual reality device and content adjusting method therefor
CN111684496B (en) Apparatus and method for tracking focus in a head-mounted display system
WO2016136074A1 (en) Information processing apparatus, information processing method, and program
US10482670B2 (en) Method for reproducing object in 3D scene and virtual reality head-mounted device
US10725540B2 (en) Augmented reality speed reading
JP2017102732A (en) Display control device and display control method
CN111061363A (en) Virtual reality system
CN113382228B (en) Head-mounted display device and head-mounted display system
JP2010067154A (en) Head mounted display, information browsing system, and management server
JP2018207151A (en) Display device, reception device, program, and control method of reception device
US20200264437A1 (en) Display system, control program for information processor, and control method for information processor
US10319346B2 (en) Method for communicating via virtual space and system for executing the method
US11662592B2 (en) Head-mounted display device and head-mounted display system
JP7145944B2 (en) Display device and display method using means for providing visual cues
JP2018018315A (en) Display system, display unit, information display method, and program
JP2018042004A (en) Display device, head-mounted type display device, and method for controlling display device

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant