WO2018120554A1 - Image display method and head-mounted display device - Google Patents

Image display method and head-mounted display device

Info

Publication number
WO2018120554A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
eyeball
position information
movement
eye movement
Prior art date
Application number
PCT/CN2017/082439
Other languages
English (en)
French (fr)
Inventor
吴欣凯
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to CN201780009379.4A (patent CN108604015B)
Publication of WO2018120554A1

Links

Images

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/02Viewing or reading apparatus

Definitions

  • The present application relates to the field of head-mounted display technologies, and in particular to an image display method and a head-mounted display device.
  • Virtual Reality (VR) technology uses visual display and audio playback to place the user in a virtual environment, giving the user a realistic sensory experience.
  • Virtual display technology can be implemented with a head-mounted display (HMD).
  • The HMD displays a different image on each of the screens corresponding to the left eye and the right eye; after each eye acquires the image on its corresponding screen, the two images combine to produce a stereoscopic image.
  • The virtual image output by the HMD can cover the real-world view observed by the user, thereby enhancing the user's sense of immersion.
  • In the prior art, the HMD switches the displayed picture according to the motion data of detected head movement, for example, according to detected head-motion information.
  • Because of the limitations of the lenses in the HMD's optical components, the virtual image may exhibit chromatic aberration or picture distortion at its edges.
  • When the user's head remains still but the eyes rotate, the range of the virtual picture presented by the HMD does not change; the user may observe the border of the virtual image while moving the eyes, and may thus see chromatic aberration or distortion at the edges of the picture, which degrades the sensory experience of using the HMD.
  • Embodiments of the present application provide an image display method and a head-mounted display device that enable the user to observe the corresponding image by rotating the eyes, which helps improve the user's sense of immersion.
  • In a first aspect, an embodiment of the present application provides an image display method, including: acquiring first position information of an eyeball reference object when eye movement is detected; determining a change of the eyeball viewing axis according to the first position information of the eyeball reference object; and adjusting a position of the display system according to the change of the eyeball viewing axis, and displaying a virtual image associated with the position, so as to control a central axis of the virtual image to be aligned with the eyeball viewing axis, the display system including a display screen and optical components coupled to the display screen.
  • Acquiring the first position information of the eyeball reference object includes: acquiring the first position information of the eyeball reference object at an end point of the eye movement; or acquiring the first position information of the eyeball reference object in real time during the eye movement.
  • Adjusting the position of the display system includes adjusting a position of at least one of the display screen or the optical components.
  • The method further includes: when head movement is detected while eye movement is being monitored, determining whether the eye movement is synchronized with the head movement; and if they are determined not to be synchronized, determining offset information between the eye movement and the head movement. Adjusting the position of the display system according to the change of the eyeball viewing axis then includes: adjusting the position of the display system according to the offset information.
  • The method further includes: acquiring second position information of the head when head movement is detected while eye movement is being monitored; determining whether the eye movement is synchronized with the head movement then includes: determining, according to the first position information and the second position information, whether the eye movement is synchronized with the head movement.
  • Determining the offset information between the eye movement and the head movement may include: determining third position information of the eyeball reference object that corresponds synchronously to the second position information of the head; and determining the offset information between the eye movement and the head movement according to the first position information and the third position information.
  • Determining the offset information between the eye movement and the head movement may alternatively include: determining fourth position information of the head that corresponds synchronously to the first position information of the eyeball reference object; and determining the offset information between the eye movement and the head movement according to the second position information and the fourth position information.
  • In a second aspect, an embodiment of the present application provides a head-mounted display device that includes functional units configured to perform some or all of the methods of the first aspect.
  • In a third aspect, an embodiment of the present application provides a head-mounted display device that includes a memory, a processor, and a computer program stored in the memory and executable by the processor; the processor executes the computer program to implement some or all of the methods of the first aspect.
  • In a fourth aspect, an embodiment of the present application provides a storage medium storing computer instructions that, when executed, implement some or all of the methods of the first aspect.
  • It can be seen that, when eye movement is detected, the position information of the eyeball reference object can be acquired, and the change of the eyeball viewing axis can be determined according to that position information; the position of the display system is then adjusted according to the change of the viewing axis.
  • A virtual image associated with that position is displayed so that the central axis of the virtual image is aligned with the eyeball viewing axis. The user can thus observe the corresponding image by rotating the eyes, which helps improve the user's sense of immersion.
  • FIG. 1A is a schematic diagram of a virtual environment observed by a user through an HMD;
  • FIG. 1B is a schematic diagram of the region of the virtual environment shown in FIG. 1A that the user observes at the current position;
  • FIG. 2 is a schematic flow chart of an image display method according to an embodiment of the present application.
  • FIG. 3A and FIG. 3B are schematic diagrams of positional correspondences between the eyeball viewing axis and the display system according to embodiments of the present application;
  • FIG. 4 is a schematic flowchart of another image display method according to an embodiment of the present application;
  • FIG. 5 is a schematic structural diagram of a head-mounted display device according to an embodiment of the present application;
  • FIG. 6 is a functional block diagram of a head mounted display device according to an embodiment of the present application.
  • FIG. 1A shows a virtual environment that the HMD provides to the user. Through its configured display system, the HMD displays virtual images for the user, and the virtual images observed by the user can form a stereoscopic image in the brain.
  • The displayed virtual image may include, for example, graphics, text, and/or video.
  • The content of the displayed image may relate to any number of contexts, including but not limited to the wearer's current environment, the activity the wearer is currently engaged in, the wearer's biometric state, and any audio, video, or text communication directed at the wearer.
  • The image displayed by the HMD can also be part of an interactive user interface.
  • The HMD can be part of a wearable computing device.
  • The images displayed by the HMD can therefore include menus, selection boxes, navigation icons, or other user-interface features that enable the wearer to invoke functions of the wearable computing device or otherwise interact with it.
  • As the user's head rotates, the HMD can switch the displayed picture in real time to provide the user with the virtual image corresponding to the current viewing angle.
  • In other words, because the HMD moves together with the head, it can generally use HMD motion data to adjust the virtual image shown by the display system.
  • The HMD motion data can include the position and orientation of the HMD. This places the user in the scene, enhancing the user's realistic sensory experience and sense of immersion.
  • FIG. 1B shows the region of the virtual image that the user can observe at one viewing angle.
  • The region of the virtual image observed by the user differs with the position or orientation of the HMD.
  • When the user keeps the head still while the eyes move (for example, to the left or right), the region of the virtual image provided by the HMD corresponds to the HMD motion data and therefore remains unchanged; through eye movement the user will observe the edge of the picture and may see the black areas on both sides of the currently displayed image, which reduces the user's sense of immersion.
  • FIG. 2 is a schematic flowchart of an image display method according to an embodiment of the present application. As shown in FIG. 2, the method includes the following steps.
  • Step S201: when the HMD detects eye movement, acquire position information of the eyeball reference object.
  • The HMD can track eye movement.
  • The HMD may first determine an eyeball reference object and determine the eye-movement state from the motion state of that reference object.
  • The eyeball reference object may include one or more reference points in the pupil, or on the sclera/iris boundary (also called the heterochromatic edge), for example the centroid of the pupil or the centroid of that boundary.
  • The eyeball reference object may also include one or more glint reflection points on the eyeball; this is not specifically limited in the embodiments of the present application. The position information of the pupil centroid is taken as an example to illustrate a specific way of determining the position information of the eyeball reference object.
  • The user's eyes are illuminated by one or more infrared light sources; the infrared light emitted by the sources is reflected by the eyes, and an infrared camera can collect the reflected infrared light and image it.
  • After the eye is imaged, the extent of the pupil within the eye and the position of its centroid can be determined.
  • The position of the centroid can be recorded as spatial coordinates.
  • Embodiments of the present application may also use other known techniques and methods for determining the eyeball reference object, including visible-light illumination and/or other imaging techniques.
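As a concrete illustration of the pupil-centroid approach described above, here is a minimal sketch of extracting the centroid from one infrared eye frame. It assumes OpenCV and NumPy; the dark-pixel threshold and the frame source are illustrative assumptions and would need per-device calibration — the patent does not prescribe a specific algorithm.

```python
# Minimal sketch: estimate the pupil centroid from one infrared eye image.
# Under infrared illumination the pupil is typically the darkest region,
# so we keep the darkest pixels and take the largest blob's centroid.
import cv2
import numpy as np

def pupil_centroid(ir_frame: np.ndarray):
    """Return the (x, y) pupil centroid in image coordinates, or None."""
    gray = ir_frame if ir_frame.ndim == 2 else cv2.cvtColor(ir_frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # Threshold of 40 is an assumption; a real system would calibrate it.
    _, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)      # largest dark blob
    m = cv2.moments(pupil)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # centroid from image moments
```

The returned image-plane centroid would still have to be mapped to the spatial coordinates that the method records, for example through a per-user calibration.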
  • Once the eyeball reference object is determined, its motion state can be monitored. If the reference object is detected changing from a stationary state to a moving state, its movement can be tracked to acquire its position information.
  • The acquired position information may be obtained in real time while tracking the movement of the eyeball reference object, or it may be the position information of the end point of the movement, determined after tracking.
  • The acquired position information of the eyeball reference object is its spatial coordinate information.
  • Step S202: determine the change of the eyeball viewing axis according to the position information of the eyeball reference object.
  • Once the position information of the eyeball reference object is acquired, the change of the eyeball viewing axis can be determined from it.
  • The eyeball viewing axis can be understood as the central axis of the region the eye can observe; it can be represented concretely as a straight line that passes through the pupil region and is perpendicular to the surface of the eyeball, or, more specifically, a straight line through the pupil centroid perpendicular to the eyeball surface. The direction of the viewing axis can also be understood as the direction of the central beam of light that can enter the pupil.
  • The change of the viewing axis can be determined from the initial position information of the eyeball reference object and its acquired position information.
  • The initial position information refers to the spatial coordinates of the eyeball reference object acquired while it is in a stationary state.
  • The position information acquired in step S201 may be the spatial coordinates of the eyeball reference object detected in real time, or the spatial coordinates of the end point of its movement.
  • From these, the change of the eyeball reference object's movement can be determined; the change includes the amount of spatial-coordinate change and the change orientation.
  • Alternatively, the acquired position information of the eyeball reference object can itself be understood as including the amount of spatial-coordinate change and the change orientation from rest to motion, the amount of change being a distance. That is, the change of the eyeball reference object's movement can be determined from its acquired position information.
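The change of the reference object described above reduces to simple vector arithmetic. The following sketch, with assumed coordinate values, computes the change amount (a distance) and the change orientation from the initial and acquired positions:

```python
# Sketch: change of the eyeball reference object from its initial
# (stationary) spatial coordinates to the coordinates acquired in S201.
# The coordinate values are illustrative assumptions.
import numpy as np

initial_pos = np.array([0.0, 0.0, 0.0])    # reference object at rest
current_pos = np.array([2.1, -0.4, 0.0])   # acquired during/after the movement

delta = current_pos - initial_pos
amount = float(np.linalg.norm(delta))      # amount of change (a distance)
orientation = delta / amount               # change orientation (unit vector)
```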
  • After the change of the eyeball reference object is determined, the change of the eyeball viewing axis can be determined correspondingly.
  • The change of the viewing axis is related to the distance from the eyeball to the display position of the virtual image: for the same change of the eyeball reference object, the farther the virtual image is displayed from the eyeball, the larger the resulting change of the viewing axis.
  • A mapping between the change of the eyeball reference object and the change of the viewing axis can therefore be preset based on the distance between the eyeball and the virtual image displayed by the HMD. It can be understood that the change orientation of the eyeball reference object (including the change angle) is consistent with the change orientation of the viewing axis, so one implementation presets only a mapping between the change amounts. Alternatively, the change of the eyeball reference object can simply be regarded as identical to the change of the viewing axis.
  • This is not specifically limited in the embodiments of the present application.
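One way to realize such a preset mapping is to model the eye as a sphere and the virtual image as a plane at a fixed distance: a pupil displacement then rotates the viewing axis, and the axis's intersection with the image plane shifts more the farther away the plane is. This geometry is an assumption for illustration, not the patent's prescribed model:

```python
# Sketch: map a pupil displacement to a viewing-axis change, assuming a
# spherical eye of radius r and a virtual-image plane at distance d.
import math

def viewing_axis_change(pupil_shift_mm: float,
                        eye_radius_mm: float = 12.0,
                        image_distance_mm: float = 1500.0):
    """Return (axis rotation in degrees, axis shift on the image plane in mm)."""
    angle = math.asin(min(pupil_shift_mm / eye_radius_mm, 1.0))
    plane_shift = image_distance_mm * math.tan(angle)
    return math.degrees(angle), plane_shift

# Same pupil movement, farther virtual image => larger axis shift,
# matching the distance dependence described above.
print(viewing_axis_change(2.0, image_distance_mm=1000.0))
print(viewing_axis_change(2.0, image_distance_mm=2000.0))
```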
  • Step S203: adjust the position of the display system according to the change of the eyeball viewing axis, and display a virtual image associated with the position, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis.
  • After the change of the viewing axis is determined, the position of the display system can be adjusted accordingly.
  • The display system can then be controlled to display the virtual image associated with the new position, with the central axis of the virtual image aligned with the eyeball viewing axis; that is, the region of the image observed by the user is adjusted according to the eye movement so that the user observes the best viewing angle.
  • Adjusting the position of the display system includes adjusting the spatial coordinates of the display system's center of gravity and/or the display system's angle.
  • The display system includes a display screen and optical components; the image shown on the display screen is presented to the user through the optical components as a virtual image.
  • Adjusting the position of the display system can thus include adjusting the position of the display screen and/or of at least one optical element in the optical components.
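A sketch of the adjustment step follows. The DisplaySystem class and its method are hypothetical stand-ins; in the hardware of FIG. 5, the equivalent adjustment would be driven by the stepper or servo motors of the control system 115:

```python
# Sketch: re-position the display system so the central axis of the output
# virtual image tracks the eyeball viewing axis. All names are illustrative.
from dataclasses import dataclass

@dataclass
class DisplaySystem:
    x_mm: float = 0.0        # spatial coordinates of the system's center of gravity
    y_mm: float = 0.0
    tilt_deg: float = 0.0    # angle of the display system

    def apply_axis_change(self, dx_mm: float, dy_mm: float, dtilt_deg: float) -> None:
        # Translate the screen/optics and/or tilt them by the amounts
        # derived from the viewing-axis change in step S202.
        self.x_mm += dx_mm
        self.y_mm += dy_mm
        self.tilt_deg += dtilt_deg

system = DisplaySystem()
system.apply_axis_change(dx_mm=3.2, dy_mm=-0.5, dtilt_deg=9.6)
```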
  • In summary, when eye movement is detected, the position information of the eyeball reference object can be acquired, and the change of the eyeball viewing axis determined from it; the position of the display system is then adjusted according to the change of the viewing axis.
  • A virtual image associated with that position is displayed so that the central axis of the virtual image is aligned with the eyeball viewing axis. The user can thus observe the corresponding image by rotating the eyes, which helps improve the user's sense of immersion.
  • FIG. 3A shows the positional relationship between the display system and the eyeball viewing axis in the stationary state. The eye 301 includes the dark iris region 302 and the pupil 303.
  • The eyeball reference object may be the iris region 302 or the pupil 303, or one or more reference points on them, such as the center of the iris region or the pupil centroid.
  • The eyeball reference object may also be the boundary between the iris and the sclera (not shown), or one or more reference points on that boundary.
  • In this embodiment, the pupil is used as the eyeball reference object.
  • The display system 305 includes a display screen and optical components, through which a virtual image 304 can be output.
  • In FIG. 3A, the central axis 306 of the virtual image coincides with the eyeball viewing axis 306. In this case, the user observes the central field of view and has the best visual experience.
  • FIG. 3A also provides a reference spatial coordinate system, in which the pupil is positioned as shown.
  • When pupil movement is detected, the position information of the pupil can be acquired in real time, and the change of the eyeball viewing axis determined from it.
  • As shown in FIG. 3B, as the eyeball moves, the position of the eyeball viewing axis 307 also changes, deviating from the central axis 306 of the virtual image 304 displayed by the display system 305.
  • The position information of the pupil 303 may be acquired after its movement ends, or in real time; the change of the viewing axis is determined from it, and the display system 305 is adjusted based on that change.
  • The adjusted position is shown in FIG. 3B, where the central axis of the virtual image 304 output by the display system 305 again coincides with the viewing axis.
  • Note that adjusting the position of the display system as a whole, as from FIG. 3A to FIG. 3B, is merely an example; the display screen and/or the optical components within the display system can also be adjusted so that the central axis of the output virtual image is aligned with the eyeball viewing axis.
  • FIG. 4 is a schematic flowchart of another image display method according to an embodiment of the present application. As shown in FIG. 4, the method includes at least the following steps.
  • Step S401: when head movement is detected while eye movement is being monitored, determine whether the eye movement is synchronized with the head movement.
  • Eye movement can be monitored by an infrared camera, a video camera, or a microwave-type positioning device.
  • Monitoring head movement can also be understood as monitoring HMD movement, performed by a gyroscope configured in the HMD (such as a three-axis or six-axis gyroscope).
  • The position information of the head may be the position of the head's center of gravity as monitored by the HMD, or the position of a head-positioning device configured on the HMD.
  • When eye movement is detected, the position information of the eyeball reference object can be acquired, and the position information of the head can be acquired as well. Based on the eyeball-reference position information and the head position information acquired at the same time, it can be determined whether the eye movement is synchronized with the head movement.
  • In some embodiments, a synchronization correspondence between eyeball-reference positions and head positions can be preset; the head position that corresponds synchronously to the acquired eyeball-reference position is looked up and compared with the acquired head position. If they agree, the eye movement is synchronized with the head movement; if not, it is not.
  • In other embodiments, the spatial-orientation change of the eyeball reference object can be determined from its acquired position information (a spatial-orientation change meaning a change orientation determined on three-dimensional coordinate axes), and the spatial-orientation change of the head can be determined from the acquired head position information. If the two orientation changes are consistent, the eye movement is synchronized with the head movement; if they are inconsistent, it is not.
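A small sketch of the orientation-comparison variant: treat the eye-reference change and the head change as 3D vectors and call them synchronized when their directions agree within a tolerance. The angular tolerance is an assumption; the text fixes no threshold:

```python
# Sketch: decide synchronization by comparing spatial orientation changes.
import numpy as np

def is_synchronized(eye_delta: np.ndarray, head_delta: np.ndarray,
                    max_angle_deg: float = 5.0) -> bool:
    """True if the two change orientations agree within the tolerance."""
    ne, nh = np.linalg.norm(eye_delta), np.linalg.norm(head_delta)
    if ne == 0 or nh == 0:
        return False                      # one of the two did not move
    cos_angle = float(np.dot(eye_delta, head_delta) / (ne * nh))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return angle <= max_angle_deg

print(is_synchronized(np.array([1.0, 0.2, 0.0]), np.array([0.9, 0.25, 0.0])))
```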
  • If the eye movement and head movement are determined to be synchronized, the virtual image currently displayed by the HMD need not be switched; alternatively, it may be switched to a virtual image related to the position information of the eyeball reference object or of the head. No specific limitation is made here.
  • Step S402: if they are determined not to be synchronized, determine the offset information between the eye movement and the head movement.
  • If the eye movement is determined not to be synchronized with the head movement, the offset information between them can be further determined.
  • Specifically, from the acquired head position information, the eyeball-reference position that would be synchronized with it can be determined; the offset information between the eye movement and the head movement is then determined from the change between the acquired eyeball-reference position and that synchronized position.
  • The offset information described here can be understood as a change of the eyeball reference object, including change distance and change orientation.
  • Alternatively, the head position that would be synchronized with the acquired eyeball-reference position can be determined, and the offset information between the eye movement and the head movement determined from the change between the acquired head position and that determined head position. Further, the change of the eyeball viewing axis can also be determined based on the offset information determined in either way.
  • Determining the eyeball-reference position synchronized with the acquired head position, or the head position synchronized with the acquired eyeball-reference position, may be based on a preset synchronization algorithm or on a stored synchronization correspondence.
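The first variant above can be sketched as follows; `synchronized_eye_position` is a hypothetical stand-in for the preset synchronization algorithm or stored correspondence:

```python
# Sketch: offset between eye movement and head movement. The expected
# (synchronized) eye-reference position for the acquired head position is
# looked up, then subtracted from the eye-reference position actually acquired.
import numpy as np

def synchronized_eye_position(head_pos: np.ndarray) -> np.ndarray:
    # Placeholder for the preset algorithm / stored correspondence.
    return head_pos * 0.05

def eye_head_offset(eye_pos: np.ndarray, head_pos: np.ndarray) -> np.ndarray:
    expected_eye = synchronized_eye_position(head_pos)
    return eye_pos - expected_eye     # encodes both change distance and orientation

offset = eye_head_offset(np.array([2.1, -0.4, 0.0]), np.array([10.0, 0.0, 0.0]))
```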
  • Step S403: adjust the position of the display system according to the offset information.
  • The position of the display system can be adjusted according to the determined offset information; alternatively, after the offset information is determined, the change of the eyeball viewing axis can be further determined from it, and the position of the display system adjusted based on that change.
  • After the adjustment, a virtual image related to the adjusted position of the display system is displayed, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis.
  • Taking head movement into account during eye movement allows the position of the display system to be adjusted more precisely, providing the user with a better sensory experience.
  • The head-mounted display device described in the embodiments of the present application may include a head-mounted display (HMD), a near-eye display device, or a head-up display (HUD).
  • The head-mounted display device described in the embodiments of the present application may also be part of a wearable display device.
  • FIG. 5 shows a schematic diagram of an HMD 100 that includes a number of different components and subsystems.
  • The components of the HMD 100 may include an eye tracking system 102, an HMD tracking system 104, a display system 106, peripherals 108, a power source 110, a processor 112, a memory 114, and a control system 115.
  • The eye tracking system 102 can include hardware capable of locating the eye, such as an infrared camera 116 and at least one infrared light source 118.
  • The HMD tracking system 104 can include a gyroscope 120, a global positioning system (GPS) 122, and an accelerometer 124.
  • The display system 106, in one embodiment, can include a display screen 126, a display light source 128, and optical components 130.
  • The peripherals 108 can include, for example, a wireless communication interface 134, a touchpad 136, a microphone 138, a camera 140, and a speaker 142.
  • The control system 115 may include an actuator such as a stepper motor 144 or a servo motor 146.
  • The components of the HMD 100 may be configured to work in an interconnected fashion with other components within or outside their respective systems. For example, the infrared camera 116 may image one or both eyes of the HMD wearer.
  • The infrared camera 116 in the eye tracking system 102 can deliver image information to the processor 112, which can access the memory 114 and make a determination regarding the direction of the HMD wearer's gaze, also referred to as the eyeball viewing axis.
  • The processor 112 can also control the control system 115 to adjust the position of the display system 106, for example adjusting the position of the display screen 126 and/or of the optical components 130; the processor 112 can then control the display screen to show the HMD wearer the virtual image associated with the adjusted position of the display system 106.
  • The HMD 100 can be configured, for example, as glasses, goggles, a helmet, a hat, a visor, a headband, or some other form that can be supported on or from the wearer's head. Additionally, the HMD 100 can represent an opaque display configured to display images to one or both of the wearer's eyes without a view of the real-world environment.
  • The power source 110 can provide power to the various HMD components and can represent, for example, a rechargeable lithium-ion battery. Various other power-source materials and types known in the art are possible.
  • The HMD 100 may be controlled by the processor 112, which executes instructions stored in a non-transitory computer-readable medium such as the memory 114.
  • The processor 112, in combination with the instructions stored in the memory 114, can thus function as a controller of the HMD 100.
  • The processor 112 can control the image displayed on the display screen 126.
  • The processor 112 can also control the wireless communication interface 134 and various other components of the HMD 100.
  • The processor 112 may also represent multiple computing devices that can be used to control individual components or subsystems of the HMD 100 in a distributed manner.
  • In addition to the instructions executable by the processor 112, the memory 114 may store data that includes a set of calibrated wearer eye-pupil positions and a collection of past eye-pupil positions.
  • The memory 114 can thus function as a database of information related to gaze direction. This information can be used by the HMD 100 to anticipate where the user will look and to determine what image to show the wearer.
  • The calibrated wearer eye-pupil positions may include, for example, information about the extent or range of the wearer's eye-pupil movement (right/left and up/down) and wearer eye-pupil positions that can be related to various reference axes.
  • The viewing axis may represent, for example, an axis extending from the viewing position through the apparent center of the target object or field of view (that is, a central axis that can be projected through the apparent center point of the HMD's display screen). Other possibilities for the viewing axis exist. The viewing axis can thus also represent a basis for determining the direction in which the user is looking.
  • The memory 114 can also store various recorded data from previous HMD/user interactions. For example, multiple images of the HMD wearer's eye (one or both eyes) may be averaged to obtain an average viewing axis. This mitigates the effect of saccadic eye movements, in which the eye moves around the gaze axis in a rapid and somewhat random manner. These saccades help humans build a mental image of the field of view with better resolution than if the eye remained static, and by averaging several eye images over a specific time period, the average viewing axis can be determined with less saccadic "noise".
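The averaging idea can be sketched as a short running filter over per-image viewing-axis estimates; the window length is an assumption:

```python
# Sketch: average several per-image viewing-axis directions to suppress
# saccadic "noise", as the memory-114 description suggests.
from collections import deque
import numpy as np

class ViewingAxisFilter:
    def __init__(self, window: int = 10):
        self.samples = deque(maxlen=window)

    def update(self, axis_sample: np.ndarray) -> np.ndarray:
        """Add one per-image axis direction; return the running average axis."""
        self.samples.append(axis_sample / np.linalg.norm(axis_sample))
        mean = np.mean(np.stack(list(self.samples)), axis=0)
        return mean / np.linalg.norm(mean)   # re-normalize the averaged direction
```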
  • The HMD 100 can include a user interface for providing information to or receiving input from the wearer.
  • The user interface can be associated with, for example, displayed virtual images, a touchpad, a keypad, buttons, a microphone, and/or other peripheral input devices.
  • The processor 112 can control the functions of the HMD 100 based on input received through the user interface. For example, the processor 112 may use user input from the user interface 115 to control how the HMD 100 displays images within the field of view or to determine what images the HMD 100 displays.
  • An eye tracking system 102 can be included in the HMD 100.
  • The eye tracking system 102 can deliver position information about the eyeball reference object of the wearer of the HMD 100 to the processor 112.
  • The processor 112 may determine changes of the eyeball viewing axis based on information from the eye tracking system 102.
  • The processor 112 can then control the display system 106 to adjust the displayed image and position in various ways.
  • The infrared camera 116 can be used by the eye tracking system 102 to capture images of the viewing position associated with the HMD 100.
  • The infrared camera 116 can thus image the eyes of the HMD wearer, which may be located at the viewing position.
  • The viewing position can be illuminated by the infrared light source 118.
  • The images can be video images or still images.
  • The images of the HMD wearer's eyes obtained by the infrared camera 116 can help determine where the wearer is looking within the HMD's field of view, and thus the direction of the viewing axis, for example by allowing the processor 112 to ascertain the position of the HMD wearer's eye pupil or other eyeball reference object. Analysis of the images obtained by the infrared camera 116 may be performed by the processor 112 in conjunction with the memory 114.
  • Imaging of the viewing position may occur continuously or at discrete times, depending, for example, on user interaction with the user interface.
  • The infrared camera 116 can be integrated into the display system 106 or mounted on the HMD 100. Alternatively, the infrared camera can be positioned entirely separate from the HMD 100. Additionally, the infrared camera 116 may alternatively represent a conventional visible-light camera with sensing capability in infrared wavelengths.
  • The infrared light source 118 can represent one or more infrared light-emitting diodes (LEDs) or infrared laser diodes that illuminate the viewing position.
  • One or both eyes of the wearer of the HMD 100 can be illuminated by the infrared light source 118.
  • The infrared light source 118 can be positioned along an optical axis common with the infrared camera 116, and/or the infrared light source 118 can be positioned elsewhere.
  • The infrared light source 118 can illuminate the viewing position continuously or be switched on at discrete times. Additionally, when illuminating, the infrared light source 118 can be modulated at a particular frequency. Other types of modulation of the infrared light source 118 are possible.
  • Other devices capable of capturing images of the eyeball or the eyeball reference object may also be configured in the eye tracking system 102.
  • The infrared camera 116 and infrared light source 118 included in the eye tracking system 102 in this embodiment are merely exemplary.
  • The HMD tracking system 104 can be configured to provide the HMD position and HMD orientation to the processor 112. This position and orientation data can help determine whether the head movement is synchronized with the eye movement.
  • The gyroscope 120 can be a microelectromechanical system (MEMS) gyroscope, a fiber-optic gyroscope, or another type of gyroscope known in the art, and can be configured to provide orientation information to the processor 112.
  • The GPS unit 122 may be a receiver that obtains clock and other signals from GPS satellites and may be configured to provide real-time position information to the processor 112.
  • The HMD tracking system 104 may also include an accelerometer 124 configured to provide motion input data to the processor 112.
  • The display system 106 can represent components configured to provide virtual images to the viewing position.
  • The optical components 130 in the display system 106 can include various lenses, and can also include a device that integrates both a lens and a display screen.
  • Various peripherals 108 can be included in the HMD 100 and used to provide information to and from the wearer. In one example, the HMD 100 can include a wireless communication interface 134 for communicating wirelessly with one or more devices, either directly or via a communication network.
  • For example, the wireless communication interface 134 can use 3G cellular communication, such as CDMA, EVDO, or GSM/GPRS, or 4G cellular communication, such as WiMAX or LTE.
  • Alternatively, the wireless communication interface 134 can communicate with a wireless local area network (WLAN), for example using WiFi.
  • In some embodiments, the wireless communication interface 134 can communicate directly with a device, for example using an infrared link, Bluetooth, or ZigBee.
  • Although FIG. 5 shows various components of the HMD 100 (namely, the wireless communication interface 134, the processor 112, the memory 114, the infrared camera 116, the display screen 126, the GPS 122, and the user interface 115) integrated into the HMD 100, one or more of these components may be physically separate from the HMD 100.
  • For example, the infrared camera 116 can be mounted on the wearer separately from the HMD 100.
  • The HMD 100 can thus be part of a wearable computing device in the form of separate devices that can be worn or carried by the wearer.
  • The separate components that make up the wearable computing device can be communicatively coupled together in a wired or wireless manner.
  • FIG. 6 is a functional block diagram of an HMD provided by an embodiment of the present invention.
  • The functional blocks of the HMD may implement the inventive arrangements in hardware, software, or a combination of hardware and software. Those skilled in the art will understand that the functional blocks described in FIG. 6 may be combined or separated into several sub-blocks to implement the embodiments of the present invention. Accordingly, the above description may support any possible combination, separation, or further definition of the functional modules described below.
  • The HMD 200 may include an input unit 201, a processing unit 203, and an output unit 205, where:
  • when eye movement is detected, the input unit 201 acquires first position information of the eyeball reference object;
  • the processing unit 203 determines a change of the eyeball viewing axis according to the first position information of the eyeball reference object; and
  • the output unit 205 adjusts the position of the display system according to the change of the eyeball viewing axis and displays a virtual image associated with the position, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis.
  • The display system includes a display screen and optical components coupled to the display screen.
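A sketch of how the three functional units could be tied together in software follows; the method names, the identity mapping from reference change to axis change, and the display-system interface (e.g. the DisplaySystem sketch above, extended with a render call) are all illustrative assumptions:

```python
# Sketch of the HMD 200 functional split: input unit 201 acquires the
# reference position, processing unit 203 derives the axis change, and
# output unit 205 adjusts the display system and shows the image.
import numpy as np

class HMD200:
    def __init__(self, display_system, eye_tracker):
        self.display_system = display_system
        self.eye_tracker = eye_tracker
        self.initial_ref = None

    def input_unit(self) -> np.ndarray:
        """Acquire first position information of the eyeball reference object."""
        return self.eye_tracker.reference_position()

    def processing_unit(self, ref_pos: np.ndarray) -> np.ndarray:
        """Determine the viewing-axis change from the reference position."""
        if self.initial_ref is None:
            self.initial_ref = ref_pos
        return ref_pos - self.initial_ref   # simplest preset mapping: identical change

    def output_unit(self, axis_change: np.ndarray) -> None:
        """Adjust the display system and display the associated virtual image."""
        self.display_system.apply_axis_change(*axis_change)
        self.display_system.render_virtual_image()
```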
  • The above functional units can also perform some or all of the methods described in the foregoing method embodiments.
  • For the hardware structure on which the above functional units are based, see the embodiment shown in FIG. 5; details are not repeated here.
  • An embodiment of the present application further provides a readable non-volatile or non-transitory storage medium storing computer instructions that are executed by the HMD to perform some or all of the methods in the above embodiments.
  • Embodiments of the present application can be provided as a method, a system, or a computer program product.
  • The present application can take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware.
  • The application can take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
  • The computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable data-processing device to operate in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.
  • These computer program instructions can also be loaded onto a computer or other programmable data-processing device, such that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of a flowchart and/or one or more blocks of a block diagram.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An image display method and a head-mounted display device (100, 200). The image display method includes: when eye movement is detected, acquiring first position information of an eyeball reference object (S201); determining a change of an eyeball viewing axis (307) according to the first position information of the eyeball reference object (S202); and adjusting a position of a display system (106, 305) according to the change of the eyeball viewing axis (307), and displaying a virtual image (304) associated with the position, so as to control a central axis (306) of the virtual image (304) to be aligned with the eyeball viewing axis (307) (S203), the display system (106, 305) including a display screen (126) and optical components (130) coupled to the display screen (126). The method enables the user to observe the corresponding image by rotating the eyes, which helps improve the user's sense of immersion.

Description

Image display method and head-mounted display device
Technical Field
This application relates to the field of head-mounted display technologies, and in particular to an image display method and a head-mounted display device.
Background
Virtual Reality (VR) technology uses visual display and audio playback to place the user in a virtual environment, giving the user a realistic sensory experience. Virtual display technology can be implemented with a head-mounted display (Head Mount Display, HMD). The HMD displays a different image on each of the screens corresponding to the left eye and the right eye; after each eye acquires the image on its corresponding screen, the two images combine to produce a stereoscopic image. The virtual image output by the HMD can cover the real-world view observed by the user, thereby enhancing the user's sense of immersion.
In the prior art, the HMD switches the displayed picture according to the motion data of detected head movement, for example, switching the displayed picture according to detected head-motion information. In this implementation, because of the limitations of the lenses of the optical components configured in the head-mounted display device, the virtual image exhibits chromatic aberration, picture distortion, and similar phenomena at its edges. When the user's head remains still but the eyes rotate, the range of the virtual picture presented by the HMD does not change; the user may then observe the border of the virtual image while moving the eyes and may see chromatic aberration or distortion at the edges of the picture, which degrades the sensory experience of using the HMD.
Summary
Embodiments of this application provide an image display method and a head-mounted display device that enable the user to observe the corresponding image by rotating the eyes, which helps improve the user's sense of immersion.
In a first aspect, an embodiment of this application provides an image display method, including: when eye movement is detected, acquiring first position information of an eyeball reference object; determining a change of an eyeball viewing axis according to the first position information of the eyeball reference object; and adjusting a position of the display system according to the change of the eyeball viewing axis, and displaying a virtual image associated with the position, so as to control a central axis of the virtual image to be aligned with the eyeball viewing axis, the display system including a display screen and optical components coupled to the display screen.
With reference to the first aspect, in some possible implementations, acquiring the first position information of the eyeball reference object includes: acquiring the first position information of the eyeball reference object at an end point of the eye movement; or acquiring the first position information of the eyeball reference object in real time during the eye movement.
With reference to the first aspect, in some possible implementations, adjusting the position of the display system includes: adjusting a position of at least one of the display screen or the optical components.
With reference to the first aspect, in some possible implementations, the method further includes: when head movement is detected while eye movement is being monitored, determining whether the eye movement is synchronized with the head movement; and if they are determined not to be synchronized, determining offset information between the eye movement and the head movement; adjusting the position of the display system according to the change of the eyeball viewing axis then includes: adjusting the position of the display system according to the offset information.
With reference to the first aspect, in some possible implementations, the method further includes: when head movement is detected while eye movement is being monitored, acquiring second position information of the head; where determining whether the eye movement is synchronized with the head movement includes: determining, according to the first position information and the second position information, whether the eye movement is synchronized with the head movement.
With reference to the first aspect, in some possible implementations, determining the offset information between the eye movement and the head movement includes: determining third position information of the eyeball reference object that corresponds synchronously to the second position information of the head; and determining the offset information between the eye movement and the head movement according to the first position information and the third position information.
With reference to the first aspect, in some possible implementations, determining the offset information between the eye movement and the head movement includes: determining fourth position information of the head that corresponds synchronously to the first position information of the eyeball reference object; and determining the offset information between the eye movement and the head movement according to the second position information and the fourth position information.
In a second aspect, an embodiment of this application provides a head-mounted display device that includes functional units configured to perform some or all of the methods of the first aspect.
In a third aspect, an embodiment of this application provides a head-mounted display device that includes a memory, a processor, and a computer program stored in the memory and executable by the processor; the processor executes the computer program to implement some or all of the methods of the first aspect.
In a fourth aspect, an embodiment of this application provides a storage medium storing computer instructions that, when executed, implement some or all of the methods of the first aspect.
It can be seen that, when eye movement is detected, the position information of the eyeball reference object can be acquired, and the change of the eyeball viewing axis determined according to it; the position of the display system is then adjusted according to the change of the viewing axis, and the virtual image associated with that position is displayed so that the central axis of the virtual image is aligned with the viewing axis. The user can thus observe the corresponding image by rotating the eyes, which helps improve immersion.
Brief Description of the Drawings
To describe the technical solutions in the embodiments of this application more clearly, the following briefly introduces the accompanying drawings used in describing the embodiments.
FIG. 1A is a schematic diagram of a virtual environment observed by a user through an HMD;
FIG. 1B is a schematic diagram of the region of the virtual environment shown in FIG. 1A that the user observes at the current position;
FIG. 2 is a schematic flowchart of an image display method according to an embodiment of this application;
FIG. 3A and FIG. 3B are schematic diagrams of positional correspondences between the eyeball viewing axis and the display system according to embodiments of this application;
FIG. 4 is a schematic flowchart of another image display method according to an embodiment of this application;
FIG. 5 is a schematic structural diagram of a head-mounted display device according to an embodiment of this application;
FIG. 6 is a functional block diagram of a head-mounted display device according to an embodiment of this application.
Detailed Description
The terms used in this implementation section are intended only to explain specific embodiments of this application and are not intended to limit this application.
To make the technical solutions of this application easy to understand, the application scenario addressed by this application is introduced below.
Referring to FIG. 1A, FIG. 1A shows a virtual environment that the HMD provides to the user. Through its configured display system, the HMD displays virtual images for the user, and the virtual images observed by the user can form a stereoscopic image in the brain. It should be noted that the displayed virtual images may include, for example, graphics, text, and/or video. The content of the displayed images may relate to any number of contexts, including but not limited to the wearer's current environment, the activity the wearer is currently engaged in, the wearer's biometric state, and any audio, video, or text communication directed at the wearer. The images displayed by the HMD may also be part of an interactive user interface. For example, the HMD may be part of a wearable computing device, so the images it displays may include menus, selection boxes, navigation icons, or other user-interface features that enable the wearer to invoke functions of the wearable computing device or otherwise interact with it. As the user's head rotates, the HMD can switch the displayed picture in real time and provide the virtual image corresponding to the current viewing angle; in other words, because the HMD moves together with the head, it can generally use HMD motion data to adjust the virtual image shown by the display system. The HMD motion data can include the position and orientation of the HMD. This places the user in the scene, enhancing the user's realistic sensory experience and sense of immersion.
FIG. 1B shows the region of the virtual image that the user can observe at one viewing angle; the region differs with the position or orientation of the HMD. When the user keeps the head still while the eyes move (for example, to the left or right), the region of the virtual image provided by the HMD corresponds to the HMD motion data and therefore does not change; through eye movement the user will observe the edge of the picture and may see the black areas on both sides of the currently displayed image, which reduces the user's sense of immersion.
In view of the technical defects in the above application scenario, the technical solutions provided by this application are introduced below, beginning with the method embodiments.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of an image display method according to an embodiment of this application. As shown in FIG. 2, the method includes the following steps.
Step S201: when the HMD detects eye movement, acquire position information of the eyeball reference object.
In some possible implementations, the HMD can track eye movement. The HMD may first determine an eyeball reference object and determine the eye-movement state from the motion state of that reference object. In the embodiments of this application, the eyeball reference object may include the pupil, or one or more reference points on the sclera/iris boundary (also called the heterochromatic edge), for example the centroid of the pupil or the centroid of that boundary; the eyeball reference object may also include one or more glint reflection points on the eyeball, which is not specifically limited here. Taking the position information of the pupil centroid as an example, a specific way to determine the position information of the eyeball reference object is as follows: one or more infrared light sources illuminate the user's eyes (left and/or right); the infrared light they emit is reflected by the eyes; an infrared camera collects the reflected infrared light and images it. After the eye is imaged, the extent of the pupil and the position of its centroid can be determined, and the centroid position can be recorded as spatial coordinates. The embodiments of this application may also use other known techniques and methods for determining the eyeball reference object, including visible-light illumination and/or other imaging techniques.
In some possible implementations, after the eyeball reference object is determined, its motion state can be monitored. If the reference object is detected changing from a stationary state to a moving state, its movement can be tracked to acquire its position information. The acquired position information may be obtained in real time while tracking the movement, or it may be the position information of the end point of the movement, determined after tracking. The acquired position information of the eyeball reference object is its spatial coordinate information.
Step S202: determine the change of the eyeball viewing axis according to the position information of the eyeball reference object.
In some possible implementations, once the position information of the eyeball reference object is acquired, the change of the eyeball viewing axis can be determined from it. The eyeball viewing axis can be understood as the central axis of the region the eye can observe; it can be represented concretely as a straight line that passes through the pupil region and is perpendicular to the surface of the eyeball, or, more specifically, a straight line through the pupil centroid perpendicular to the eyeball surface. The direction of the viewing axis can also be understood as the direction of the central beam of light that can enter the pupil.
The change of the viewing axis can be determined from the initial position information of the eyeball reference object and its acquired position information. Specifically, the initial position information refers to the spatial coordinates of the reference object acquired while it is stationary. The position information acquired in step S201 may be the spatial coordinates detected in real time or the spatial coordinates of the end point of the movement. From the initial and acquired positions, the change of the reference object's movement can be determined; this change includes the amount of spatial-coordinate change and the change orientation.
Alternatively, the acquired position information of the eyeball reference object can itself be understood as including the amount of spatial-coordinate change and the change orientation from rest to motion, the amount of change being a distance. That is, the change of the reference object's movement can be determined from its acquired position information.
After the change of the eyeball reference object is determined, the change of the eyeball viewing axis can be determined correspondingly. The change of the viewing axis is related to the distance from the eyeball to the display position of the virtual image: for the same change of the reference object, the farther the virtual image is displayed from the eyeball, the larger the determined change of the viewing axis. A mapping between the change of the reference object and the change of the viewing axis can be preset based on the distance between the eyeball and the virtual image displayed by the HMD. It can be understood that the change orientation of the reference object (including the change angle) is consistent with that of the viewing axis, so one implementation presets only a mapping between the change amounts. Alternatively, the change of the reference object can be regarded as identical to the change of the viewing axis. This is not specifically limited in the embodiments of this application.
Step S203: adjust the position of the display system according to the change of the eyeball viewing axis, and display a virtual image associated with the position, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis.
In one embodiment, after the change of the viewing axis is determined, the position of the display system can be adjusted correspondingly. Once the display system is moved to the corresponding position, it can be controlled to display the virtual image associated with that position, with the central axis of the virtual image aligned with the viewing axis; that is, the region of the image observed by the user can be adjusted according to the eye movement so that the user observes the best viewing angle. Optionally, adjusting the position of the display system includes adjusting the spatial coordinates of its center of gravity and/or its angle. In the embodiments of this application, the display system includes a display screen and optical components; the image shown on the display screen is presented to the user through the optical components as a virtual image. Here, adjusting the position of the display system may include adjusting the position of the display screen and/or of at least one optical element in the optical components.
It can be seen that, when eye movement is detected, the position information of the eyeball reference object can be acquired, and the change of the eyeball viewing axis determined from it; the position of the display system is then adjusted according to the change of the viewing axis, and the virtual image associated with that position is displayed so that the central axis of the virtual image is aligned with the viewing axis. The user can thus observe the corresponding image by rotating the eyes, which helps improve immersion.
Some embodiments implementing the above method are described with reference to FIG. 3A and FIG. 3B.
FIG. 3A shows the positional relationship between the display system and the eyeball viewing axis in the stationary state. The eye 301 includes the dark iris region 302 and the pupil 303. The eyeball reference object may be the iris region 302 or the pupil 303, or one or more reference points on them, such as the center of the iris region or the pupil centroid. Alternatively, the reference object may be the boundary between the iris and the sclera (not shown), or one or more reference points on that boundary. This embodiment uses the pupil as the eyeball reference object. The display system 305 includes a display screen and optical components, through which a virtual image 304 can be output. In FIG. 3A, the central axis 306 of the virtual image coincides with the eyeball viewing axis 306; in this case the user observes the central field of view and has the best visual experience. FIG. 3A also provides a reference spatial coordinate system, in which the pupil position is as shown.
When pupil movement is detected, the position information of the pupil can be acquired in real time, and the change of the viewing axis determined from it. As shown in FIG. 3B, as the eyeball moves, the position of the eyeball viewing axis 307 also changes, deviating from the central axis 306 of the virtual image 304 displayed by the display system 305. The position information of the pupil 303 may be acquired after tracking its movement to its end, or in real time; the change of the viewing axis is determined from it, and the display system 305 is adjusted based on that change. The adjusted position is shown in FIG. 3B, where the central axis of the virtual image 304 output by the display system 305 again coincides with the viewing axis. Moreover, while the display system moves, it displays the virtual image corresponding to its position, presenting the virtual image in the region matching the user's line of sight and improving immersion. It should be noted that adjusting the position of the display system as a whole, as from FIG. 3A to FIG. 3B, is merely an example; the display screen and/or optical elements within the display system can also be adjusted so that the central axis of the output virtual image is aligned with the viewing axis.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of another image display method according to an embodiment of this application. As shown in FIG. 4, the method includes at least the following steps.
Step S401: when head movement is detected while eye movement is being monitored, determine whether the eye movement is synchronized with the head movement.
In some possible implementations, when head movement is detected while eye movement is being monitored, it can be determined whether the two are synchronized. Specifically, eye movement can be monitored with an infrared camera, a video camera, or a microwave-type positioning device; monitoring head movement can also be understood as monitoring HMD movement, performed by a gyroscope configured in the HMD (such as a three-axis or six-axis gyroscope). The position information of the head may be the position of the head's center of gravity as monitored by the HMD, or the position of a head-positioning device configured on the HMD. Further, when eye movement is detected, the position information of the eyeball reference object can be acquired, and the position information of the head can be acquired as well; based on the eyeball-reference position information and the head position information acquired at the same time, it is determined whether the eye movement is synchronized with the head movement. In some embodiments, a synchronization correspondence between eyeball-reference positions and head positions can be preset; the head position that corresponds synchronously to the acquired eyeball-reference position is looked up and compared with the acquired head position. If they agree, the eye movement is synchronized with the head movement; if not, it is not. In other embodiments, the spatial-orientation change of the eyeball reference object can be determined from its acquired position information (a spatial-orientation change meaning a change orientation determined on three-dimensional coordinate axes), and the spatial-orientation change of the head from the acquired head position information; if the two orientation changes are consistent, the eye movement is synchronized with the head movement, and otherwise it is not.
If the eye movement and head movement are determined to be synchronized, the virtual image currently displayed by the HMD need not be switched; alternatively, it may be switched to a virtual image related to the position information of the eyeball reference object or of the head. No specific limitation is made here.
Step S402: if they are determined not to be synchronized, determine the offset information between the eye movement and the head movement.
In some possible implementations, if the eye movement is determined not to be synchronized with the head movement, the offset information between them can be further determined. Specifically, from the acquired head position information, the eyeball-reference position that would be synchronized with it can be determined, and the offset information is determined from the change between the acquired eyeball-reference position and that synchronized position. The offset information described here can be understood as a change of the eyeball reference object, including change distance and change orientation. Alternatively, the head position that would be synchronized with the acquired eyeball-reference position can be determined, and the offset information determined from the change between the acquired head position and that determined head position. Further, the change of the eyeball viewing axis can also be determined based on the offset information determined in either way.
It should be noted that determining the eyeball-reference position synchronized with the acquired head position, or the head position synchronized with the acquired eyeball-reference position, may be based on a preset synchronization algorithm or on a stored synchronization correspondence.
Step S403: adjust the position of the display system according to the offset information.
In some possible implementations, the position of the display system can be adjusted according to the determined offset information; alternatively, after the offset information is determined, the change of the eyeball viewing axis can be further determined from it, and the position of the display system adjusted based on that change.
In some possible implementations, after the position of the display system is adjusted accordingly, a virtual image related to the adjusted position is displayed, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis.
It can be seen that taking head movement into account during eye movement allows the position of the display system to be adjusted more precisely, providing the user with a better sensory experience.
The apparatus embodiments of this application are described below with reference to the foregoing embodiments. The head-mounted display device described in the embodiments of this application may include a head-mounted display (HMD), a near-eye display device, or a head-up display (Head-Up Display, HUD). The head-mounted display device described in the embodiments of this application may also be part of a wearable display device.
Referring to FIG. 5, FIG. 5 shows a schematic diagram of an HMD 100 that includes a number of different components and subsystems. The components of the HMD 100 may include an eye tracking system 102, an HMD tracking system 104, a display system 106, peripherals 108, a power source 110, a processor 112, a memory 114, and a control system 115. The eye tracking system 102 may include hardware capable of locating the eye, such as an infrared camera 116 and at least one infrared light source 118. The HMD tracking system 104 may include a gyroscope 120, a global positioning system (GPS) 122, and an accelerometer 124. The display system 106, in one embodiment, may include a display screen 126, a display light source 128, and optical components 130. The peripherals 108 may include, for example, a wireless communication interface 134, a touchpad 136, a microphone 138, a camera 140, and a speaker 142. The control system 115 may include an actuator such as a stepper motor 144 or a servo motor 146.
The components of the HMD 100 may be configured to work in an interconnected fashion with other components within or outside their respective systems. For example, in an example embodiment, the infrared camera 116 may image one or both eyes of the HMD wearer. The infrared camera 116 in the eye tracking system 102 may deliver image information to the processor 112, which may access the memory 114 and make a determination regarding the direction of the HMD wearer's gaze, also referred to as the eyeball viewing axis. The processor 112 may also control the control system 115 to adjust the position of the display system 106, for example adjusting the position of the display screen 126 and/or of the optical components 130; the processor 112 may then control the display screen to show the HMD wearer the virtual image associated with the adjusted position of the display system 106.
The HMD 100 may be configured as, for example, eyeglasses, goggles, a helmet, a hat, a visor, a headband, or some other form that can be supported on or from the wearer's head. Additionally, the HMD 100 may represent an opaque display configured to display images to one or both of the wearer's eyes without a view of the real-world environment.
The power source 110 may provide power to the various HMD components and may represent, for example, a rechargeable lithium-ion battery. Various other power-source materials and types known in the art are possible.
The functions of the HMD 100 may be controlled by the processor 112, which executes instructions stored in a non-transitory computer-readable medium such as the memory 114. The processor 112, in combination with the instructions stored in the memory 114, can thus act as a controller of the HMD 100. As such, the processor 112 can control the image displayed on the display screen 126. The processor 112 can also control the wireless communication interface 134 and various other components of the HMD 100. The processor 112 may also represent multiple computing devices that can be used to control individual components or subsystems of the HMD 100 in a distributed manner.
In addition to the instructions executable by the processor 112, the memory 114 may store data that includes a set of calibrated wearer eye-pupil positions and a collection of past eye-pupil positions. The memory 114 can thus act as a database of information related to gaze direction. This information can be used by the HMD 100 to anticipate where the user will look and to determine what image to show the wearer. The calibrated wearer eye-pupil positions may include, for example, information about the extent or range of the wearer's eye-pupil movement (right/left and up/down) and wearer eye-pupil positions that can be related to various reference axes.
The viewing axis may represent, for example, an axis extending from the viewing position through the apparent center of the target object or field of view (that is, a central axis that can be projected through the apparent center point of the HMD's display screen). Other possibilities for the viewing axis exist. The viewing axis can thus also represent a basis for determining the direction in which the user is looking.
In addition to the above features, the memory 114 may also store various recorded data from previous HMD/user interactions. For example, multiple images of the HMD wearer's eye (one or both eyes) may be averaged to obtain an average viewing axis. This can mitigate the effect of saccadic eye movements, in which the eye moves around the gaze axis in a rapid and somewhat random manner. These saccades help humans build a mental image of the field of view with better resolution than if the eye remained static, and by averaging several eye images over a specific time period, the average viewing axis can be determined with less saccadic "noise".
The HMD 100 may include a user interface for providing information to or receiving input from the wearer. The user interface may be associated with, for example, displayed virtual images, a touchpad, a keypad, buttons, a microphone, and/or other peripheral input devices. The processor 112 may control the functions of the HMD 100 based on input received through the user interface. For example, the processor 112 may use user input from the user interface 115 to control how the HMD 100 displays images within the field of view or to determine what images the HMD 100 displays.
An eye tracking system 102 may be included in the HMD 100. In an example embodiment, the eye tracking system 102 may deliver position information about the eyeball reference object of the wearer of the HMD 100 to the processor 112. Specifically, the processor 112 may determine the change of the eyeball viewing axis based on information from the eye tracking system 102. The processor 112 may then control the display system 106 to adjust the displayed image and position in various ways.
The infrared camera 116 may be used by the eye tracking system 102 to capture images of the viewing position associated with the HMD 100. The infrared camera 116 can thus image the eyes of the HMD wearer, which may be located at the viewing position. The viewing position may be illuminated by the infrared light source 118. The images may be video images or still images. The images of the HMD wearer's eyes obtained by the infrared camera 116 can help determine where the wearer is looking within the HMD's field of view, and thus the direction of the viewing axis, for example by allowing the processor 112 to ascertain the position of the HMD wearer's eye pupil or other eyeball reference object. Analysis of the images obtained by the infrared camera 116 may be performed by the processor 112 in conjunction with the memory 114.
Imaging of the viewing position may occur continuously or at discrete times, depending, for example, on user interaction with the user interface. The infrared camera 116 may be integrated into the display system 106 or mounted on the HMD 100. Alternatively, the infrared camera may be positioned entirely separate from the HMD 100. Additionally, the infrared camera 116 may alternatively represent a conventional visible-light camera with sensing capability in infrared wavelengths.
The infrared light source 118 may represent one or more infrared light-emitting diodes (LEDs) or infrared laser diodes that can illuminate the viewing position. One or both eyes of the wearer of the HMD 100 can be illuminated by the infrared light source 118. The infrared light source 118 may be positioned along an optical axis common with the infrared camera 116, and/or positioned elsewhere. The infrared light source 118 may illuminate the viewing position continuously or be switched on at discrete times. Furthermore, when illuminating, the infrared light source 118 may be modulated at a particular frequency. Other types of modulation of the infrared light source 118 are possible.
It should be noted that other devices capable of capturing images of the eyeball or the eyeball reference object may also be configured in the eye tracking system 102; the infrared camera 116 and infrared light source 118 included in the eye tracking system 102 in this embodiment are merely exemplary.
The HMD tracking system 104 may be configured to provide the HMD position and HMD orientation to the processor 112. This position and orientation data can help determine whether the head movement is synchronized with the eye movement.
The gyroscope 120 may be a microelectromechanical system (MEMS) gyroscope, a fiber-optic gyroscope, or another type of gyroscope known in the art, and may be configured to provide orientation information to the processor 112. The GPS unit 122 may be a receiver that obtains clock and other signals from GPS satellites and may be configured to provide real-time position information to the processor 112. The HMD tracking system 104 may also include an accelerometer 124 configured to provide motion input data to the processor 112.
The display system 106 may represent components configured to provide virtual images to the viewing position. The optical components 130 in the display system 106 may include various lenses, and may also include a device that integrates both a lens and a display screen.
Various peripherals 108 may be included in the HMD 100 and used to provide information to and from the wearer of the HMD 100. In one example, the HMD 100 may include a wireless communication interface 134 for communicating wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication interface 134 may use 3G cellular communication, such as CDMA, EVDO, or GSM/GPRS, or 4G cellular communication, such as WiMAX or LTE. Alternatively, the wireless communication interface 134 may communicate with a wireless local area network (WLAN), for example using WiFi. In some embodiments, the wireless communication interface 134 may communicate directly with a device, for example using an infrared link, Bluetooth, or ZigBee.
Although FIG. 5 shows various components of the HMD 100 (namely, the wireless communication interface 134, the processor 112, the memory 114, the infrared camera 116, the display screen 126, the GPS 122, and the user interface 115) integrated into the HMD 100, one or more of these components may be physically separate from the HMD 100. For example, the infrared camera 116 may be mounted on the wearer separately from the HMD 100. The HMD 100 may thus be part of a wearable computing device in the form of separate devices that can be worn or carried by the wearer; the separate components making up the wearable computing device may be communicatively coupled together in a wired or wireless manner.
FIG. 6 shows a functional block diagram of an HMD according to an embodiment of the present invention. The functional blocks of the HMD may implement the solutions of the invention in hardware, software, or a combination of hardware and software. Those skilled in the art should understand that the functional blocks described in FIG. 6 may be combined or separated into several sub-blocks to implement the embodiments of the invention. Accordingly, the content described above may support any possible combination, separation, or further definition of the functional modules described below. As shown in FIG. 6, the HMD 200 may include an input unit 201, a processing unit 203, and an output unit 205, where:
when eye movement is detected, the input unit 201 acquires first position information of the eyeball reference object;
the processing unit 203 determines a change of the eyeball viewing axis according to the first position information of the eyeball reference object; and
the output unit 205 adjusts the position of the display system according to the change of the eyeball viewing axis and displays a virtual image associated with the position, so as to control the central axis of the virtual image to be aligned with the eyeball viewing axis, the display system including a display screen and optical components coupled to the display screen.
It should be noted that the above functional units can also perform some or all of the methods described in the foregoing method embodiments. For the hardware structure on which the above functional units are based, see the embodiment shown in FIG. 5; details are not repeated here.
An embodiment of this application further provides a readable non-volatile or non-transitory storage medium storing computer instructions that are executed by the above HMD to perform some or all of the methods in the above embodiments.
Those skilled in the art should understand that the embodiments of this application may be provided as a method, a system, or a computer program product. Therefore, this application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage and optical storage) containing computer-usable program code.
This application is described with reference to flowcharts and/or block diagrams of the method, device (system), and computer program product according to the embodiments of this application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, special-purpose computer, embedded processor, or other programmable data-processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data-processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data-processing device to operate in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data-processing device, so that a series of operational steps are performed on the computer or other programmable device to produce computer-implemented processing; the instructions executed on the computer or other programmable device thus provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, those skilled in the art can make various changes and modifications to this application without departing from its spirit and scope. If these modifications and variations fall within the scope of the claims of this application and their technical equivalents, this application is also intended to encompass them.

Claims (15)

  1. An image display method, comprising:
    when eye movement is detected, acquiring first position information of an eyeball reference object;
    determining a change of an eyeball viewing axis according to the first position information of the eyeball reference object; and
    adjusting a position of the display system according to the change of the eyeball viewing axis, and displaying a virtual image associated with the position, so as to control a central axis of the virtual image to be aligned with the eyeball viewing axis, the display system comprising a display screen and optical components coupled to the display screen.
  2. The method according to claim 1, wherein acquiring the first position information of the eyeball reference object comprises:
    acquiring the first position information of the eyeball reference object at an end point of the eye movement; or
    acquiring the first position information of the eyeball reference object in real time during the eye movement.
  3. The method according to claim 1 or 2, wherein adjusting the position of the display system comprises:
    adjusting a position of at least one of the display screen or the optical components.
  4. The method according to any one of claims 1 to 3, further comprising:
    when head movement is detected while eye movement is being monitored, determining whether the eye movement is synchronized with the head movement; and
    if they are determined not to be synchronized, determining offset information between the eye movement and the head movement;
    wherein adjusting the position of the display system according to the change of the eyeball viewing axis comprises:
    adjusting the position of the display system according to the offset information.
  5. The method according to claim 4, further comprising:
    when head movement is detected while eye movement is being monitored, acquiring second position information of the head;
    wherein determining whether the eye movement is synchronized with the head movement comprises:
    determining, according to the first position information and the second position information, whether the eye movement is synchronized with the head movement.
  6. The method according to claim 5, wherein determining the offset information between the eye movement and the head movement comprises:
    determining third position information of the eyeball reference object that corresponds synchronously to the second position information of the head; and
    determining the offset information between the eye movement and the head movement according to the first position information and the third position information.
  7. The method according to claim 5, wherein determining the offset information between the eye movement and the head movement comprises:
    determining fourth position information of the head that corresponds synchronously to the first position information of the eyeball reference object; and
    determining the offset information between the eye movement and the head movement according to the second position information and the fourth position information.
  8. A head-mounted display device, comprising a memory, a processor, and a computer program stored in the memory and executable by the processor, wherein the processor executes the computer program to implement:
    when eye movement is detected, acquiring first position information of an eyeball reference object;
    determining a change of an eyeball viewing axis according to the first position information of the eyeball reference object; and
    adjusting a position of the display system according to the change of the eyeball viewing axis, and displaying a virtual image associated with the position, so as to control a central axis of the virtual image to be aligned with the eyeball viewing axis, the display system comprising a display screen and optical components coupled to the display screen.
  9. The head-mounted display device according to claim 8, wherein the processor acquiring the first position information of the eyeball reference object comprises:
    acquiring the first position information of the eyeball reference object at an end point of the eye movement; or
    acquiring the first position information of the eyeball reference object in real time during the eye movement.
  10. The head-mounted display device according to claim 7 or 8, wherein the processor adjusting the position of the display system comprises:
    adjusting a position of at least one of the display screen or the optical components.
  11. The head-mounted display device according to any one of claims 8 to 10, wherein the processor executes the computer program to implement:
    when head movement is detected while eye movement is being monitored, determining whether the eye movement is synchronized with the head movement; and
    if they are determined not to be synchronized, determining offset information between the eye movement and the head movement;
    wherein the processor adjusting the position of the display system according to the change of the eyeball viewing axis comprises:
    adjusting the position of the display system according to the offset information.
  12. The head-mounted display device according to claim 11, wherein the processor executes the computer program to implement:
    when head movement is detected while eye movement is being monitored, acquiring second position information of the head;
    wherein determining whether the eye movement is synchronized with the head movement comprises:
    determining, according to the first position information and the second position information, whether the eye movement is synchronized with the head movement.
  13. The head-mounted display device according to claim 12, wherein the processor determining the offset information between the eye movement and the head movement comprises:
    determining third position information of the eyeball reference object that corresponds synchronously to the second position information of the head; and
    determining the offset information between the eye movement and the head movement according to the first position information and the third position information.
  14. The head-mounted display device according to claim 12, wherein the processor determining the offset information between the eye movement and the head movement comprises:
    determining fourth position information of the head that corresponds synchronously to the first position information of the eyeball reference object; and
    determining the offset information between the eye movement and the head movement according to the second position information and the fourth position information.
  15. A storage medium storing computer instructions, the computer instructions being executed to perform the method according to any one of claims 1 to 7.
PCT/CN2017/082439 2016-12-26 2017-04-28 Image display method and head-mounted display device WO2018120554A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201780009379.4A CN108604015B (zh) 2016-12-26 2017-04-28 图像显示方法和头戴显示设备

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201611220189.0 2016-12-26
CN201611220189 2016-12-26

Publications (1)

Publication Number Publication Date
WO2018120554A1 2018-07-05

Family

ID=62706697

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/082439 WO2018120554A1 (zh) 2016-12-26 2017-04-28 Image display method and head-mounted display device

Country Status (2)

Country Link
CN (1) CN108604015B (zh)
WO (1) WO2018120554A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109240493A (zh) * 2018-08-22 2019-01-18 联想(北京)有限公司 Control method and electronic device
CN111367405A (zh) * 2020-02-17 2020-07-03 深圳岱仕科技有限公司 Adjustment method and apparatus for a head-mounted display device, computer device, and storage medium
CN112069480A (zh) * 2020-08-06 2020-12-11 Oppo广东移动通信有限公司 Display method and apparatus, storage medium, and wearable device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030020755A1 (en) * 1997-04-30 2003-01-30 Lemelson Jerome H. System and methods for controlling automatic scrolling of information on a display or screen
CN102323829A (zh) * 2011-07-29 2012-01-18 青岛海信电器股份有限公司 Display-screen viewing-angle adjustment method and display device
CN103380625A (zh) * 2011-06-16 2013-10-30 松下电器产业株式会社 Head-mounted display and position-deviation adjustment method therefor
US20130328925A1 (en) * 2012-06-12 2013-12-12 Stephen G. Latta Object focus in a mixed reality environment
CN103593044A (zh) * 2012-08-13 2014-02-19 鸿富锦精密工业(深圳)有限公司 Electronic device calibration system and method
CN104067160A (zh) * 2011-11-22 2014-09-24 谷歌公司 Method of centering image content on a display using eye tracking
CN104280883A (zh) * 2013-07-04 2015-01-14 精工爱普生株式会社 Image display device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103605208B (zh) * 2013-08-30 2016-09-28 北京智谷睿拓技术服务有限公司 Content projection system and method
CN103439794B (zh) * 2013-09-11 2017-01-25 百度在线网络技术(北京)有限公司 Calibration method for a head-mounted device and head-mounted device
US9244539B2 (en) * 2014-01-07 2016-01-26 Microsoft Technology Licensing, Llc Target positioning with gaze tracking



Also Published As

Publication number Publication date
CN108604015A (zh) 2018-09-28
CN108604015B (zh) 2020-07-14

Similar Documents

Publication Publication Date Title
US10055642B2 (en) Staredown to produce changes in information density and type
KR102626821B1 (ko) 고정-거리 가상 및 증강 현실 시스템들 및 방법들
EP3368170B1 (en) Adjusting image frames based on tracking motion of eyes
US10740971B2 (en) Augmented reality field of view object follower
JP7431246B2 (ja) 異なる露光時間を有する画像を使用した眼追跡
US8971570B1 (en) Dual LED usage for glint detection
KR102208376B1 (ko) Hmd 상의 하이브리드 월드/바디 락 hud
JP5887026B2 (ja) ヘッドマウントシステム及びヘッドマウントシステムを用いてディジタル画像のストリームを計算しレンダリングする方法
US8955973B2 (en) Method and system for input detection using structured light projection
TWI549505B (zh) 用於擴展現實顯示的基於理解力和意圖的內容
JP7423659B2 (ja) 眼姿勢を推定するためのシステムおよび技法
US20130241805A1 (en) Using Convergence Angle to Select Among Different UI Elements
JP7005658B2 (ja) 非平面コンピュテーショナルディスプレイ
JP2019507902A (ja) 深度平面間の低減された切り替えを伴う多深度平面ディスプレイシステム
US20200322595A1 (en) Information processing device and information processing method, and recording medium
JP2022540675A (ja) 1つ以上の眼追跡カメラを用いた眼回転中心の決定
US20240085980A1 (en) Eye tracking using alternate sampling
WO2018120554A1 (zh) 2018-07-05 Image display method and head-mounted display device
US20210392318A1 (en) Gaze tracking apparatus and systems
US11747897B2 (en) Data processing apparatus and method of using gaze data to generate images
WO2023195995A1 (en) Systems and methods for performing a motor skills neurological test using augmented or virtual reality

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17888469

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17888469

Country of ref document: EP

Kind code of ref document: A1