WO2019007050A1 - Pupil distance measurement method, wearable eye device and storage medium - Google Patents

Pupil distance measurement method, wearable eye device and storage medium

Info

Publication number
WO2019007050A1
WO2019007050A1 · PCT/CN2018/074980 · CN2018074980W
Authority
WO
WIPO (PCT)
Prior art keywords
image
pupil
eye
distance
camera
Prior art date
Application number
PCT/CN2018/074980
Other languages
English (en)
French (fr)
Other versions
WO2019007050A8 (zh)
Inventor
张浩
陈丽莉
楚明磊
孙建康
闫桂新
郭子强
王亚坤
田文红
马占山
Original Assignee
京东方科技集团股份有限公司
北京京东方光电科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京东方科技集团股份有限公司 and 北京京东方光电科技有限公司
Priority to EP18788993.6A (published as EP3651457B1)
Priority to US16/096,428 (published as US11534063B2)
Priority to JP2018557090A (published as JP2020526735A)
Publication of WO2019007050A1
Publication of WO2019007050A8

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/60 Analysis of geometric attributes
    • G06T 7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/11 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B 3/111 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016 Operational features thereof
    • A61B 3/0025 Operational features thereof characterised by electronic signal processing, e.g. eye models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10048 Infrared image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30041 Eye; Retina; Ophthalmic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30196 Human being; Person
    • G06T 2207/30201 Face

Definitions

  • Embodiments of the present disclosure relate to a pupil distance measuring method, a wearable eye device, and a storage medium.
  • VR devices are required to have a higher refresh rate to reduce smear; screens are required to have higher resolution to reduce visual graininess and the like.
  • Users also have increasingly high requirements for anti-distortion and anti-dispersion in VR devices. If the anti-distortion and anti-dispersion processing does not take into account differences between users' pupil distances, the same VR device will produce different viewing effects for users with different pupil distances.
  • At least one embodiment of the present disclosure provides a pupil distance measuring method, including: capturing at least one eye image; extracting, from the at least one eye image, one pupil image position corresponding to a single-eye pupil, or two pupil image positions corresponding to the pupils of both eyes; and determining an actual distance between the pupils of the two eyes based on the one or two pupil image positions.
  • the photographing of at least one eye image includes: capturing a single eye image that simultaneously contains both pupils.
  • the extracting of the one or two pupil image positions from the at least one eye image includes determining the pupil image positions of the two pupils in the eye image; and the determining of the actual distance between the pupils based on the pupil image positions includes: determining the image distance of the two pupils on the eye image according to their pupil image positions, and determining the actual distance between the two pupils based on that image distance.
  • determining the actual distance of the binocular pupils according to their image distance comprises: measuring the vertical actual distance between the eyes and the camera; determining the shooting angle of the camera; reading the image distance of the binocular pupils; reading the height of the eye image; and determining the actual distance between the pupils from the image distance of the binocular pupils, the shooting angle, the vertical actual distance between the eyes and the camera, and the height of the eye image.
  • the capturing at least one eye image includes: capturing, by the first camera and the second camera, a left eye image including a left eye and a right eye image including a right eye, respectively. Extracting, from the at least one eye image, a one or two pupil image positions corresponding to the one eye pupil or the binocular pupil in the at least one eye image, comprising: determining a left eye pupil in the left eye image The pupil image position and the pupil image position of the right eye pupil in the right eye image.
  • determining the actual distance between the binocular pupils based on the one or two pupil image positions includes: determining the image position of the first camera in the left eye image and the image position of the second camera in the right eye image; in the left eye image, determining a first horizontal image distance between the pupil image position of the left eye pupil and the image position of the first camera; determining a first horizontal actual distance between the left eye pupil and the first camera according to the first horizontal image distance; in the right eye image, determining a second horizontal image distance between the pupil image position of the right eye pupil and the image position of the second camera; determining a second horizontal actual distance between the right eye pupil and the second camera according to the second horizontal image distance; and determining the actual distance of the binocular pupils from the first and second horizontal actual distances.
  • determining the first horizontal actual distance between the left eye pupil and the first camera according to the first horizontal image distance comprises: measuring the vertical actual distance between the two cameras and the two eyes; determining a first shooting angle of the first camera; reading the height of the left eye image; and determining the first horizontal actual distance between the left eye pupil and the first camera from the vertical actual distance, the first shooting angle, the first horizontal image distance, and the height of the left eye image.
  • determining the second horizontal actual distance between the right eye pupil and the second camera according to the second horizontal image distance comprises: determining a second shooting angle of the second camera; reading the height of the right eye image; and determining the second horizontal actual distance between the right eye pupil and the second camera from the vertical actual distance, the second shooting angle, the second horizontal image distance, and the height of the right eye image.
  • the determining of the actual distance between the pupils further includes: acquiring the actual distance between the first and second cameras; calculating the sum of the first horizontal actual distance and the second horizontal actual distance; and determining the actual distance between the pupils as the actual distance between the first and second cameras minus that sum.
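Under the similar-triangle model used throughout the disclosure, the two-camera computation can be sketched as below. This is a minimal sketch, not the patent's reference implementation: function names and the numeric values are illustrative, and θ denotes half the camera's field-of-view angle (FOV = 2θ).

```python
import math

def horizontal_actual_distance(image_dx, image_h, d, theta):
    """Sep_x: horizontal actual distance from one pupil to the point
    corresponding to its camera, scaled from pixels by similar triangles.
    image_dx is in pixels and may be signed, so Sep_x may be negative.
    image_h is the image height Image_H in pixels; d is the vertical
    actual distance from the eyes to the cameras."""
    return 2.0 * d * math.tan(theta) * image_dx / image_h

def pupil_distance_two_cameras(sep_l, sep_r, camera_distance):
    """pd = actual distance between the two cameras minus (Sep_l + Sep_r)."""
    return camera_distance - (sep_l + sep_r)
```

For example, with a 60 mm eye-to-camera distance, a 60° FOV (θ = 30°), 480-pixel-high images, and each pupil 48 pixels inside its camera's projection point, each Sep_x is about 6.93 mm, so two cameras 70 mm apart would yield a pupil distance of about 56.1 mm.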
  • the photographing of at least one eye image includes: capturing an eye image including the single-eye pupil and a set marker point. Extracting the one or two pupil image positions from the at least one eye image includes: extracting the pupil image position of the single-eye pupil in the eye image; and extracting the marker position of the marker point from the eye image. Determining the actual distance between the pupils of the two eyes based on the one or two pupil image positions includes: determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and determining the actual distance between the pupils of the two eyes based on that horizontal image distance.
  • determining the actual distance between the pupils of the two eyes according to the horizontal image distance between the single-eye pupil and the marker point comprises: measuring the vertical actual distance between the eyes and the camera; determining the shooting angle of the camera; reading the horizontal image distance between the single-eye pupil and the marker point; reading the height of the eye image; and determining the actual distance between the binocular pupils from the vertical actual distance between the eyes and the camera, the shooting angle, the horizontal image distance between the single-eye pupil and the marker point, and the height of the eye image.
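The steps above convert the pixel distance to an actual distance by the same similar-triangle scaling as the other embodiments. The disclosure does not spell out here how the single pupil-to-marker distance yields the full pupil distance, so the sketch below adds one labeled assumption: the marker point lies on the facial midline, making the pupil distance twice the pupil-to-marker actual distance.

```python
import math

def pupil_distance_from_marker(image_dist, image_h, d, theta):
    """image_dist: horizontal image distance (pixels) between the single-eye
    pupil and the marker point; image_h: image height Image_H in pixels;
    d: vertical actual distance from eyes to camera; theta: half the FOV.

    ASSUMPTION (not stated in this passage): the marker sits on the facial
    midline, so the binocular pupil distance is twice the pupil-to-marker
    actual distance."""
    actual = 2.0 * d * math.tan(theta) * image_dist / image_h
    return 2.0 * actual
```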
  • extracting the one or two pupil image positions from the at least one eye image includes: extracting, from the at least one eye image, the image area in which one or both eyes are located; filtering the image area and converting it to grayscale; converting the grayscale image into a binary image; searching the binary image for the boundary of the single-eye pupil region or the boundaries of the two pupil regions; fitting an ellipse to each pupil-region boundary to obtain one or two elliptical figures; and taking the center of each ellipse as the image position of the corresponding pupil.
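A minimal sketch of these extraction steps, assuming a grayscale image in which the pupil is the darkest region. For brevity the centroid of the binarized pupil pixels stands in for the center of the fitted ellipse described above; a real implementation might instead trace the region boundary and use something like OpenCV's `fitEllipse`. The threshold value is an illustrative assumption.

```python
import numpy as np

def pupil_center(gray, threshold=50):
    """Return the (x, y) image position of the pupil, or None if absent.

    gray: 2-D array of grayscale values. Binarization keeps pixels darker
    than `threshold` (the pupil); the centroid of those pixels approximates
    the ellipse-fit center of the pupil region."""
    mask = gray < threshold                      # binarize: pupil -> True
    ys, xs = np.nonzero(mask)                    # coordinates of pupil pixels
    if xs.size == 0:
        return None                              # no pupil region found
    return float(xs.mean()), float(ys.mean())    # centroid as (x, y)
```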
  • the height of the eye image is the vertical pixel size of the image captured by the camera.
  • At least one embodiment of the present disclosure provides a computer readable storage medium having stored thereon computer instructions that, when executed by a processor, perform the following operations: extracting, from at least one eye image, the one or two pupil image positions corresponding to a single-eye pupil or to the pupils of both eyes; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
  • At least one embodiment of the present disclosure provides a wearable eye device comprising a processor and a memory; the memory stores instructions that, when executed by the processor, perform the following operations: extracting, from at least one eye image, the one or two pupil image positions corresponding to a single-eye pupil or to the pupils of both eyes; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
  • the wearable eye device further includes a first camera configured to capture an eye image containing both pupils. The instructions, when executed by the processor, perform the following operations: determining the pupil image positions of the two pupils in the eye image; determining, from those pupil image positions, the image distance of the two pupils on the eye image; and determining the actual distance of the binocular pupils from that image distance.
  • the wearable eye device further includes a first camera and a second camera; the first camera and the second camera are configured to respectively capture a left eye image including a left eye and a right eye image including a right eye;
  • the instructions stored in the memory are executed by the processor to: determine the pupil image position of the left eye pupil in the left eye image and the pupil image position of the right eye pupil in the right eye image; determine the image position of the first camera in the left eye image and the image position of the second camera in the right eye image; in the left eye image, determine a first horizontal image distance between the pupil image position of the left eye pupil and the image position of the first camera; determine a first horizontal actual distance between the left eye pupil and the first camera according to the first horizontal image distance; in the right eye image, determine a second horizontal image distance between the pupil image position of the right eye pupil and the image position of the second camera; determine a second horizontal actual distance between the right eye pupil and the second camera according to the second horizontal image distance; and determine the actual distance of the binocular pupils from the first and second horizontal actual distances.
  • the wearable eye device further includes a first camera configured to capture an eye image including the single-eye pupil and the set marker point. The instructions, when executed by the processor, perform the following operations: extracting the pupil image position of the single-eye pupil in the eye image; extracting the marker position of the marker point from the eye image; determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and determining the actual distance between the binocular pupils according to that horizontal image distance.
  • the marker point is disposed on a housing of the wearable eye device.
  • the wearable eye device further includes an infrared light source, wherein the infrared light source provides a light source for the wearable eye device to take an eye image.
  • the embodiments of the present disclosure measure the actual distance between the binocular pupils by means of image processing. The obtained distance can further be used in other vision-related fields such as eye tracking and line-of-sight calculation, and can improve the experience of the wearer of a wearable eye device (such as a VR device).
  • FIG. 1 is a flowchart of a method for measuring a pupil distance according to an embodiment of the present disclosure
  • 2A is a flowchart of another method for measuring pupil distance according to an embodiment of the present disclosure
  • 2B is a schematic diagram of a pupil distance measuring device according to an embodiment of the present disclosure.
  • 2C is a schematic diagram of a binocular image taken by a camera according to an embodiment of the present disclosure
  • 2D is a schematic diagram of similar graphics provided by an embodiment of the present disclosure.
  • FIG. 3A is a flowchart of still another method for measuring pupil distance according to an embodiment of the present disclosure
  • FIG. 3B is still another schematic diagram of a pupil distance measuring device according to an embodiment of the present disclosure.
  • FIG. 3C-1 is a schematic diagram of a captured left eye image according to an embodiment of the present disclosure.
  • 3C-2 is a schematic diagram of a captured right eye image provided by an embodiment of the present disclosure.
  • FIG. 3D is a schematic diagram of similar graphics provided by an embodiment of the present disclosure.
  • FIG. 3E is a schematic diagram of positions of two eyes and a camera on a plane where two eyes are located according to an embodiment of the present disclosure
  • FIG. 4A is a flowchart of still another method for measuring pupil distance according to an embodiment of the present disclosure.
  • 4B is still another schematic diagram of a pupil distance measuring device according to an embodiment of the present disclosure.
  • 4C is a schematic diagram of a position of a marker point according to an embodiment of the present disclosure.
  • 4D is a relative positional relationship diagram of a binocular and a marker point according to an embodiment of the present disclosure
  • 4E is an image of a single eye pupil and a marker point taken by a camera provided by an embodiment of the present disclosure
  • 4F is a schematic diagram of a similar figure provided by an embodiment of the present disclosure.
  • FIG. 5 is a flowchart of extracting a pupil position according to an embodiment of the present disclosure
  • FIG. 6 is a block diagram of a composition of a wearable eye device according to an embodiment of the present disclosure.
  • pupils may be employed to represent the eye without causing ambiguity.
  • the position of the pupil may be used to represent the position of the eye, the distance of the pupil of the two eyes represents the distance between the eyes, and the like.
  • the pupil distance measuring method 100 may include: step 101, capturing at least one eye image; and step 111, extracting, from the at least one eye image, a corresponding one of the one eye pupil or the binocular pupil in the at least one eye image Or two pupil image positions (ie, extracting one pupil image position corresponding to the single eye pupil, or extracting two pupil image positions corresponding to the two eye pupils); and step 121, determining the based on the one or two pupil image positions The actual distance between the pupils of both eyes.
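The three steps of method 100 can be sketched as a small pipeline; the function and stage names below are illustrative, not taken from the disclosure:

```python
def measure_pupil_distance(capture, extract, compute):
    """Illustrative skeleton of method 100: capture -> extract -> compute."""
    images = capture()                 # step 101: capture at least one eye image
    pupil_positions = extract(images)  # step 111: one or two pupil image positions
    return compute(pupil_positions)    # step 121: actual distance between pupils
```

In use, `capture` would drive the camera(s), `extract` would run the image-processing pipeline of FIG. 5, and `compute` would apply one of the similar-triangle conversions described below.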
  • step 101 includes taking an image that includes one eye, while in other embodiments, step 101 includes taking an image that includes both eyes or capturing two images each containing one eye.
  • step 111 may extract the image distance of the binocular pupil from the image; in other embodiments, step 111 may extract the distance between the pupil of one eye and a set marker point from the image. Or extract the distance between the pupil of one eye and the position of the camera device on the image from the image.
  • step 121 derives the actual distance between the pupils of the two eyes from the image distance of the pupils obtained in step 111, using the proportional relationship between similar figures. In other embodiments, step 121 derives the actual distance between the pupils from the image distance between the single-eye pupil and the marker point obtained in step 111, again using the proportional relationship between similar figures. In still other embodiments, step 121 may first derive, from the image distance of step 111 and the proportional relationship between similar figures, the actual distance between each eye and its corresponding camera, and then combine the actual distance between the left eye and one camera with the actual distance between the right eye and the other camera to obtain the actual distance between the pupils of both eyes.
  • in order to derive the actual distance between the pupils from the pupil-related image distance, embodiments of the present disclosure also construct a virtual imaging plane located on the focal plane of the camera; in some embodiments the virtual imaging plane may coincide with the actual imaging plane. If it does not coincide, the image formed on the virtual imaging plane still satisfies a fixed proportional relationship with the image taken on the actual imaging plane, for example a uniform scaling up or down. Based on the proportional relationship between the similar figures on the virtual imaging plane and on the plane of the two eyes, the relationship between the pupil-related image distance on the captured image and the actual pupil distance can be derived.
  • the following embodiments are described by taking an image distance, that is, a correlation distance on an image taken on an actual imaging plane as an example.
  • step 121 may extract the position of the pupil on the image from the captured eye image (ie, the image located on the actual imaging plane) using the method illustrated in FIG.
  • the distance associated with the pupil on the captured image can be further obtained.
  • the distance of the pupil of the eye on the captured image is determined based on the position of the pupil extracted on the captured image.
  • the first horizontal image distance or the second horizontal image distance is determined according to the pupil position extracted on the captured image and the corresponding position of the camera.
  • the horizontal image distance is determined based on the position of the pupil extracted on the captured image and the position of the marker point.
  • when the user starts to use a wearable device such as a VR device, the user performs some operations on the start interface to enter the main interface, for example clicking a menu; alternatively, the main interface may be entered directly after a startup phase.
  • the pupil distance measuring method of the present disclosure can be activated to measure the actual distance between the pupils.
  • the image on which the pupil measurement method 200 is based may be a binocular image acquired using one camera.
  • the pupil measurement method 200 may include: step 201, capturing an eye image including the binocular pupil; and step 211, determining a pupil image position of the binocular pupil in the eye image; The pupil image position of the binocular pupil is determined, the image distance of the binocular pupil on the eye image is determined; and in step 231, the actual distance of the binocular pupil is determined according to the image distance of the binocular pupil.
  • step 201 may use an infrared camera to capture the user's eyes, and at the same time, multiple infrared LED lights may be used to provide the infrared camera with a light source.
  • when the camera captures images, it is necessary to ensure that both of the user's eyes can be captured, that is, that the camera is working normally and the LED lamps are turned on.
  • the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting. The disclosure is not limited herein.
  • the pupil distance measuring method can be applied in a VR device; the infrared LED lamps 4 illuminate the inside of the VR device with infrared light. The LED lamps 4 are disposed around the lens 3 of the VR device and may also be distributed on the inner casing 5 of the VR device. The number of LED lamps 4 is chosen so that the inside of the VR device is illuminated well enough to capture a clear eye image.
  • the camera 2 is placed at an upper intermediate position of the VR device.
  • the VR device may also include an interface 1 that can connect the infrared camera 2 to a data processing device.
  • the interface 1 can be a USB interface (connected to an external computer), a MIPI interface (connected to a mobile terminal), a WIFI interface, or a Bluetooth interface.
  • the image size of the image actually captured by the camera is "Image_W*Image_H".
  • the position of the left eye pupil on the image is Image_el
  • the position of the right eye pupil on the image is Image_er
  • the distance between the left eye pupil point and the right eye pupil point on the image (i.e., the image distance of the binocular pupils) is Image_ed.
  • Fig. 2D shows two planes, a virtual imaging plane O' (i.e., a focal plane) and a plane O where the eyes are located.
  • the angle of the field of view (FOV) of the camera 20 is 2* ⁇ .
  • the positions el0 and er0 of the binocular pupils are shown in the virtual imaging plane O'.
  • H represents the height of the plane in which the eyes are located.
  • the parameter H can be calculated from the shooting angle of the camera; h represents the height of the focal plane.
  • the parameter h can also be calculated from the shooting angle of the camera.
  • pd (i.e., the actual distance between the pupils of both eyes) represents the actual distance between the left eye pupil el and the right eye pupil er on the plane O;
  • pd' (i.e., the distance of the binocular pupils on the focal plane O') represents the distance on the virtual imaging plane O' between the binocular pupil points er0 (corresponding to point er on plane O) and el0 (corresponding to point el on plane O).
  • Image_ed / pd' = Image_H / h
  • determining the actual distance pd of the binocular pupils according to the image distance Image_ed in step 221 of FIG. 2A may include: measuring the vertical actual distance d between the eyes and the camera (e.g., in FIG. 2D, the vertical distance between the plane O of the eyes and the camera 20); determining the shooting angle θ of the camera (e.g., the shooting angle of the camera 20 shown in FIG. 2D); reading the image distance Image_ed of the binocular pupils (e.g., the image distance between the left eye pupil and the right eye pupil on the captured image shown in FIG. 2C); and reading the height Image_H of the eye image (e.g., the vertical pixel size of the captured image in FIG. 2C).
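Combining the similar-triangle relation Image_ed / pd' = Image_H / h with the plane heights H = 2·d·tan(θ) and h = 2·f·tan(θ) from FIG. 2D gives pd = 2·d·tan(θ)·Image_ed/Image_H. A minimal sketch, with illustrative parameter names:

```python
import math

def pupil_distance_single_camera(image_ed, image_h, d, theta):
    """pd from one binocular image, by similar triangles (FIG. 2D):
      Image_ed / Image_H = pd' / h      (pixels vs. focal-plane length)
      pd / pd' = H / h, with H = 2*d*tan(theta), h = 2*f*tan(theta)
    hence pd = 2 * d * tan(theta) * image_ed / image_h.

    image_ed : pupil-to-pupil distance on the image, in pixels
    image_h  : image height Image_H, in pixels
    d        : vertical actual distance from the eyes to the camera
    theta    : half of the camera's field-of-view angle (FOV = 2*theta)
    """
    return 2.0 * d * math.tan(theta) * image_ed / image_h
```

For example, with d = 60 mm, a 60° FOV (θ = 30°), a 480-pixel-high image, and a 400-pixel pupil image distance, pd comes out to roughly 57.7 mm.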
  • the embodiments of the present disclosure use an infrared camera to capture the user's two eyes and, through image processing, measure the distance of the binocular pupils on the image and then calculate the actual pupil distance. After the pupil distance is detected using the embodiments of the present disclosure, it can also be used in other eye-related fields such as eye tracking and line-of-sight calculation.
  • the pupil distance measuring method 300 may include: step 301, capturing a left eye image including the left eye and a right eye image including the right eye with the first camera and the second camera, respectively; step 311, determining the pupil image position of the left eye pupil in the left eye image and the pupil image position of the right eye pupil in the right eye image; step 321, determining the image position of the first camera in the left eye image and the image position of the second camera in the right eye image; step 331, in the left eye image, determining a first horizontal image distance between the pupil image position of the left eye pupil and the image position of the first camera; step 341, determining a first horizontal actual distance Sep_l between the left eye pupil and the first camera according to the first horizontal image distance; step 351, in the right eye image, determining a second horizontal image distance between the pupil image position of the right eye pupil and the image position of the second camera; step 361, determining a second horizontal actual distance Sep_r between the right eye pupil and the second camera according to the second horizontal image distance; and determining the actual distance of the binocular pupils from Sep_l and Sep_r.
  • step 301 can take an image with the device shown in FIG. 3B.
  • two infrared cameras 11 (i.e., the first camera and the second camera), respectively located above the left eye and above the right eye
  • each of the infrared cameras 11 is capable of capturing one eye of the user.
  • a plurality of infrared LED lamps 14 are also included, which provide light sources for the two infrared cameras.
  • the two cameras 11 are respectively above the lens 13, and the two cameras 11 are on the same horizontal line.
  • the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting. The disclosure is not limited herein.
  • the lens 13 of Figure 3B can be a lens used by a VR device, and the infrared LED lamp 14 illuminates the interior of the VR device with infrared light.
  • the LED lamps 14 are typically located around the lens 13 of the VR device and may also be distributed over the inner casing 15. The number of LED lamps 14 is chosen so that the inside of the VR device is illuminated and a clear eye image can be captured.
  • the image captured by one of the cameras 12 in Fig. 3B can be as shown in Figs. 3C-1 and 3C-2.
  • the pixel sizes of the captured left and right eye images shown in FIGS. 3C-1 and 3C-2 are Image_W*Image_H, and the center of each image is P_ImageC (i.e., the position on the actual imaging plane corresponding to the camera 12).
  • the position of the center of the monocular pupil on the captured image is the point Peye, and the lateral extension of the point Peye intersects the longitudinal extension of the point P_ImageC at the point P_c.
  • the distance between the point Peye and the point P_c is the first horizontal image distance Image_ed_l or the second horizontal image distance Image_ed_r.
  • Fig. 3D shows two planes, a virtual imaging plane O' (i.e., a focal plane) and a plane O where the eyes are located.
  • the point P_eye_o and the point P_c_o in Fig. 3D are the point positions corresponding to the point Peye on the captured image and the point P_c on the plane O where the eyes are located, respectively.
  • the points P_eye_o and P_c_o in Fig. 3D correspond to the point Peye' and the point P_c', respectively, on the virtual imaging plane O'.
  • the point Peye' may be to the left or to the right of the point P_c', so the first horizontal image distance Image_ed_l' or the second horizontal image distance Image_ed_r' may be positive or negative.
  • the focal lengths of the first camera and the second camera 30 in FIG. 3D are f, and the shooting angle FOV is 2* ⁇ .
  • the height of the virtual imaging plane O' is h
  • the vertical actual distance from the human eye to the camera is d
  • P_eye_o is the position of the human eye on the plane O
  • P_c_o is the projection of the point P_c' on the plane O.
  • the distance between the point P_eye_o and the point P_c_o is the first or second horizontal actual distance Sep_x (where the parameter x may be l or r, denoting respectively the first horizontal actual distance Sep_l corresponding to the left eye and the second horizontal actual distance Sep_r corresponding to the right eye).
  • Image_ed/Image_ed' = Image_H/h
  • the actual distance of the binocular pupil can be further calculated by referring to FIG. 3E.
  • for the image processing method, refer for example to FIG. 5.
  • the first camera 12 captures the user's left-eye image, and the horizontal distance Sep_l from the left-eye pupil on the captured image to the point corresponding to the first camera is obtained by calculation; the same method can be used to calculate the distance Sep_r from the point corresponding to the second camera 12 on the captured image to the center of the right-eye pupil.
  • determining the first horizontal actual distance in step 341 may include: measuring the vertical actual distance d between the first and second cameras and the two eyes (e.g., the vertical distance between the plane O of the eyes shown in FIG. 3D and the camera 30); determining the first shooting angle θ of the first camera (e.g., the shooting angle of the camera 30 shown in FIG. 3D); reading the height of the left-eye image (e.g., the number of vertical pixels of the image in FIG. 3C-1); and determining the first horizontal actual distance Sep_l between the left-eye pupil and the first camera (refer to FIG. 3E) according to the vertical actual distance d, the first shooting angle θ, the first horizontal image distance Image_ed_l (e.g., the distance between the left-eye pupil and the projection point P_ImageC of the first camera 12 on the image shown in FIG. 3C-1), and the height of the left-eye image.
  • the step 361 of determining the second horizontal actual distance may include: determining the second shooting angle θ of the second camera; reading the height of the right-eye image (e.g., the number of vertical pixels of the image in FIG. 3C-2); and determining the second horizontal actual distance Sep_r between the right-eye pupil and the second camera (refer to FIG. 3E) according to the vertical actual distance d, the second shooting angle θ, the second horizontal image distance Image_ed_r (e.g., the distance between the right-eye pupil and the projection point P_ImageC of the second camera 12 on the image shown in FIG. 3C-2), and the height of the right-eye image.
  • step 371, determining the actual distance between the binocular pupils, may further include: obtaining the actual distance Sep between the first and second cameras; calculating the sum of the first horizontal actual distance Sep_l and the second horizontal actual distance Sep_r; and determining the actual distance between the binocular pupils as the actual distance between the first and second cameras minus that sum.
  • the distance between the two cameras in the VR device is Sep
  • the user's left-eye image is captured by the first camera camera_l and, by means of image processing and calculation, the horizontal distance Sep_l from the left-eye pupil to the left camera is obtained.
  • the same method can be used to calculate the distance Sep_r from the second camera camera_r to the center of the right eye pupil.
  • the user's final pupil distance is pd = Sep - Sep_l - Sep_r, as shown in FIG. 3E.
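The two-camera computation above (Sep_x = 2 * d * tan(θ) * Image_ed_x / Image_H for each eye, then pd = Sep - Sep_l - Sep_r) can be sketched in Python. This is only an illustrative sketch of the formulas: the function names and all numeric inputs are assumed example values, not values taken from the disclosure.

```python
import math

def horizontal_actual_distance(d_mm, theta_deg, image_ed_px, image_h_px):
    """Sep_x = 2 * d * tan(theta) * Image_ed_x / Image_H.
    d_mm: vertical actual distance from the eyes to the camera plane,
    theta_deg: half of the camera's field of view (FOV = 2*theta),
    image_ed_px: horizontal pixel distance from the pupil to the camera's
                 corresponding point P_ImageC on the image (may be negative,
                 since Peye' can lie on either side of P_c'),
    image_h_px: image height Image_H in pixels."""
    return 2 * d_mm * math.tan(math.radians(theta_deg)) * image_ed_px / image_h_px

def pupil_distance_two_cameras(sep_mm, sep_l_mm, sep_r_mm):
    """pd = Sep - Sep_l - Sep_r (the geometry of FIG. 3E)."""
    return sep_mm - sep_l_mm - sep_r_mm

# Assumed example values: cameras 64 mm apart, eyes 50 mm away,
# 30-degree half-angle, 480-pixel-high images.
sep_l = horizontal_actual_distance(50, 30, 20, 480)   # left-eye offset
sep_r = horizontal_actual_distance(50, 30, 15, 480)   # right-eye offset
pd = pupil_distance_two_cameras(64, sep_l, sep_r)
```

Because Image_ed_x is signed, the same formula covers a pupil on either side of the camera's projection point without a special case.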
  • a single-eye image is taken by two infrared cameras with known positions, each camera can capture one eye of the user, and a plurality of infrared LED lights are used to provide a light source for the infrared camera.
  • a method is provided in which two infrared cameras in a VR device each capture one of the user's eyes; the distance on the captured image from a single pupil to the point corresponding to the optical center of its camera is measured by image processing, and the actual pupil distance is then calculated.
  • after the pupil distance is detected, the cameras used in the embodiments of the present disclosure can also be used for other eye-related work such as eye tracking and line-of-sight calculation.
  • the pupil distance measurement method 400 includes: step 401, capturing an eye image containing the single-eye pupil and a preset marker point; step 411, extracting the pupil image position of the single-eye pupil in the eye image, and extracting the marker position of the marker point from the eye image; step 421, determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and step 431, determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point.
  • the pupil distance measurement method 400 can be applied to a VR device.
  • the VR device is configured with a single infrared camera 22 and an infrared LED lamp 24.
  • an infrared light source such as the LED lamps 24 illuminates the human eye
  • the infrared camera 22 captures the user's single eye and the marker point 26 on the VR housing 25; the user's pupil and the marker point 26 are obtained by image processing, and the actual distance between the binocular pupils is estimated based on the lateral distance from the pupil to the marker point on the image.
  • the marker point 26 can be disposed on the housing 25 of the VR device.
  • the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting. The disclosure is not limited herein.
  • the VR device uses a lens 23, and the infrared LED lamps 24 illuminate the interior of the VR device with infrared light.
  • the LED lamps 24 are typically around the lens 23 and may also be distributed over the inner casing 25 of the VR device.
  • the number of LED lamps 24 is chosen so as to illuminate the interior of the VR device well enough to capture an eye image; typically eight LED lamps are used.
  • the marker point 26 is located above the VR device viewing window 27, for example, the marker point 26 is located in the middle of the left and right direction of the VR housing 25.
  • when the user wears the VR device normally, the marker point 26 can be set at the center of the user's binocular spacing.
  • the marker point 26 can be an object point, such as an infrared LED lamp 24, that the infrared camera can recognize.
  • the viewing window 27 of the VR device is the window through which the user, looking through the lens 23, views the content displayed by the VR device.
  • an image captured by the camera is shown in FIG. 4E; it includes the single-eye position point Peye and the marker point Pmark.
  • the image size of the image taken by the camera is Image_W*Image_H
  • the position of the center of the monocular pupil on the image is Peye
  • the position of the marker point on the image is Pmark.
  • the lateral extension line of the point Peye on the image intersects the longitudinal extension line of the point Pmark at the point Pcross.
  • the distance between the point Peye and the point Pcross is the horizontal image distance Image_ed.
  • FIG. 4F shows two planes: the virtual imaging plane O' at the focal plane and the plane O where the binocular pupils are located.
  • the point Peye, the point Pmark, and the point Pcross on the captured image correspond to the point Peye', the point Pmark', and the point Pcross' on the virtual imaging plane O', respectively.
  • the position Peye' of the single-eye pupil and the position Pmark' of the marker point are shown in the virtual imaging plane O'.
  • the distance from the human eye to the camera is d.
  • Peye_o is the position of the human eye on the plane O
  • Pcross_o is the projection of Pcross' on the plane O.
  • the distance between the corresponding points Peye_o and Pcross_o is half the actual distance of the binocular pupils.
  • the focal length of the camera 40 is f, and the shooting angle is ⁇ .
  • H represents the height of the plane in which the eyes are located. For example, H can be calculated from the shooting angle of the camera; h represents the height of the focal plane, for example, h can be calculated from the shooting angle of the camera.
  • Image_ed/pd' = Image_H/h
  • pd' = 2 * d * tan(θ) * Image_ed / Image_H can be obtained.
  • step 421 can include: measuring the vertical actual distance d between the eyes and the camera (e.g., the vertical distance between the plane O of the eyes shown in FIG. 4F and the camera 40); determining the shooting angle θ of the camera (e.g., the shooting angle of the camera 40 shown in FIG. 4F); reading the horizontal image distance Image_ed between the single-eye pupil and the marker point on the captured image (e.g., the distance between the monocular pupil and the marker point on the image shown in FIG. 4E); reading the height of the eye image (e.g., the number of vertical pixels of the image shown in FIG. 4E); and determining the actual distance between the binocular pupils (i.e., pd) according to the vertical actual distance d between the eyes and the camera, the shooting angle θ, the horizontal image distance between the single-eye pupil and the marker point, and the height Image_H of the eye image.
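The single-camera, marker-point computation above can be sketched in Python. The doubling reflects the marker point sitting at the center of the binocular spacing, so the Peye_o-to-Pcross_o distance is half of pd. The function name and all numeric inputs are illustrative assumptions, not values from the disclosure.

```python
import math

def pupil_distance_marker(d_mm, theta_deg, image_ed_px, image_h_px):
    """Single-camera marker-point method:
    pd' = 2 * d * tan(theta) * Image_ed / Image_H   (half the pupil distance)
    pd  = 2 * pd'
    theta_deg is half the camera's field of view; image_ed_px is the
    horizontal pixel distance from the pupil to the marker column."""
    half_pd = 2 * d_mm * math.tan(math.radians(theta_deg)) * image_ed_px / image_h_px
    return 2 * half_pd

# Assumed example values: eyes 50 mm from the camera, 45-degree half-angle,
# pupil 150 px from the marker column, 480-px-high image.
pd = pupil_distance_marker(50, 45, 150, 480)
```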
  • a step of extracting, from the at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils may include: step 501, extracting the image region where the single eye or both eyes are located from each eye image; step 511, filtering and grayscaling the image region to obtain a grayscale image, and converting the grayscale image into a binary image; step 521, finding the boundary of the monocular pupil region or the boundaries of the binocular pupil regions in the binary image; step 531, performing ellipse fitting on the boundary of the monocular pupil region or the boundaries of the binocular pupil regions to obtain one or two ellipses; and step 541, taking the center of each ellipse as the image position of the corresponding pupil.
  • other methods can also be used to obtain the image position of the pupil, which is not limited herein.
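As a rough sketch of steps 501 through 541, the following pure-Python fragment binarizes an already-cropped eye region and returns the centroid of the dark pupil pixels. It deliberately substitutes a centroid for the boundary tracing and ellipse fitting described above (a real implementation might use OpenCV's findContours and fitEllipse); for a roughly circular pupil the centroid approximates the ellipse center. The threshold and the synthetic image are assumed example values.

```python
def pupil_center(gray, threshold=60):
    """Binarize a cropped grayscale eye region (list of pixel rows),
    treat pixels darker than `threshold` as pupil, and return the
    centroid of that region as (col, row), or None if no pupil is found."""
    xs, ys, n = 0, 0, 0
    for row, line in enumerate(gray):
        for col, v in enumerate(line):
            if v < threshold:          # dark pixel -> pupil candidate
                xs += col
                ys += row
                n += 1
    if n == 0:
        return None
    return (xs / n, ys / n)

# Synthetic 5x5 eye patch: a dark 2x2 "pupil" in a bright field.
patch = [
    [200, 200, 200, 200, 200],
    [200,  10,  10, 200, 200],
    [200,  10,  10, 200, 200],
    [200, 200, 200, 200, 200],
    [200, 200, 200, 200, 200],
]
center = pupil_center(patch)            # centroid of the dark block: (1.5, 1.5)
```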
  • the height of the eye image in the above embodiment may be the pixel size in the longitudinal direction of the image captured by the camera.
  • extracting the image of the marker point may include the following steps: after the image acquired by the camera is obtained, marker point extraction can be performed.
  • marker point extraction includes the following steps: (1) Marker region extraction (e.g., extracting the approximate region where the marker point is located from the whole image); since the position of the marker point in the image is substantially fixed once the user wears the VR device, a relatively large image region can be selected as the candidate region. (2) Image filtering: filter the extracted image, e.g., with Gaussian filtering. (3) Grayscaling: convert the filtered image from an RGB image into a grayscale image. (4) Thresholding: convert the grayscale image into a binary image; since the marker point is itself an infrared LED lamp, it appears as a white spot on the image captured by the infrared camera, so a relatively large threshold (e.g., 240) can be used for binarization. (5) Marker region boundary detection: find the boundary of the marker region using morphological methods. (6) Marker ellipse fitting: perform ellipse fitting using the boundary of the marker region to obtain an ellipse. (7) Marker center output: the marker position is the center of the ellipse.
  • Embodiments of the present disclosure provide a method of estimating a pupil distance using a single infrared camera.
  • an infrared camera may be employed with multiple LED lamps.
  • a single camera can capture the user's single eye and the marker point on the VR housing, and multiple infrared LEDs provide a light source for the infrared camera.
  • the infrared camera in the VR device captures the user's single eye and the marker point on the VR housing; the lateral distance from the single-eye pupil to the marker point is measured by image processing, and the actual distance between the binocular pupils is then calculated.
  • the camera used in the present disclosure can also be used for other eye-related tasks such as eye tracking and line of sight calculation after detecting the pupil distance.
  • At least one embodiment of the present disclosure provides a computer-readable storage medium having stored thereon computer instructions that, when executed by a processor, perform the following operations: extracting, from at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
  • the wearable eye device 600 can include a processor 602 and a memory 603.
  • the memory 603 stores instructions that, when executed by the processor 602, perform the following operations: extracting, from at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
  • the wearable eye device 600 can also include at least one camera 601.
  • at least one camera 601 is configured to capture the at least one eye image.
  • the wearable eye device 600 can also include at least one infrared light source 604, which can be configured to provide a shooting light source for the at least one camera 601.
  • wearable eye device 600 can have the basic components of VR glasses.
  • the basic components of VR glasses can include lenses, housings, and the like.
  • At least one camera 601 includes a first camera, wherein the first camera is configured to capture an eye image that contains the binocular pupils.
  • when the instructions stored in the memory are executed by the processor, the following operations are performed: determining the pupil image positions of the binocular pupils in the eye image; determining the image distance of the binocular pupils on the eye image according to the pupil image positions of the binocular pupils; and determining the actual distance of the binocular pupils according to the image distance of the binocular pupils.
  • the at least one camera 601 may further include a first camera and a second camera, wherein the first camera and the second camera are configured to respectively capture a left-eye image containing the left eye and a right-eye image containing the right eye; when the instructions stored by the memory 603 are executed by the processor 602, the following operations are performed: determining the pupil image position of the left-eye pupil in the left-eye image and the pupil image position of the right-eye pupil in the right-eye image; determining the image position of the first camera in the left-eye image and the image position of the second camera in the right-eye image; determining, in the left-eye image, the first horizontal image distance between the pupil image position of the left-eye pupil and the image position of the first camera; determining, according to the first horizontal image distance, the first horizontal actual distance between the left-eye pupil and the first camera; determining, in the right-eye image, the second horizontal image distance between the pupil image position of the right-eye pupil and the image position of the second camera; determining, according to the second horizontal image distance, the second horizontal actual distance between the right-eye pupil and the second camera; and determining the actual distance of the binocular pupils according to the first horizontal actual distance and the second horizontal actual distance.
  • the camera 601 can also include a first camera, wherein the first camera is configured to capture an eye image containing the single-eye pupil and a preset marker point.
  • when the instructions stored by the memory 603 are executed by the processor 602, the following operations are performed: extracting the pupil image position of the single-eye pupil in the eye image; extracting the marker position of the marker point from the eye image; determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point.
  • the marker point is disposed on a housing of the wearable eye device.
  • the memory 603 and the processor 602 may be located on a processing device such as the same PC.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Eye Examination Apparatus (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

A pupil distance measurement method (100), a wearable eye device (600), and a computer-readable storage medium. The method (100) includes: capturing at least one eye image; extracting, from the at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.

Description

Pupil Distance Measurement Method, Wearable Eye Device, and Storage Medium — Technical Field
Embodiments of the present disclosure relate to a pupil distance measurement method, a wearable eye device, and a storage medium.
Background
With the continuous development of virtual reality (VR) technology, users place increasingly high demands on VR devices. For example, VR devices are required to have higher refresh rates to reduce motion smearing, and screens are required to have higher resolutions to reduce visual graininess. In addition, the requirements for anti-distortion and anti-dispersion processing in VR devices are also increasing: if the anti-distortion and anti-dispersion computation does not take the differences between users' pupils into account, the same VR device will produce different viewing effects for users with different interpupillary distances.
Summary
At least one embodiment of the present disclosure provides a pupil distance measurement method, including: capturing at least one eye image; extracting, from the at least one eye image, one pupil image position corresponding to a single-eye pupil, or two pupil image positions corresponding to the binocular pupils; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
For example, capturing at least one eye image includes: capturing an eye image containing the binocular pupils at the same time. Extracting the one or two pupil image positions corresponding to the single-eye pupil or the binocular pupils in the at least one eye image includes: determining the pupil image positions of the binocular pupils in the eye image. And determining the actual distance between the pupils based on the one or two pupil image positions includes: determining the image distance of the binocular pupils on the eye image according to the pupil image positions of the binocular pupils; and determining the actual distance of the binocular pupils according to the image distance of the binocular pupils.
For example, determining the actual distance of the binocular pupils according to the image distance of the binocular pupils includes: measuring the vertical actual distance between the eyes and the camera; determining the shooting angle of the camera; reading the image distance of the binocular pupils; reading the height of the eye image; and determining the actual distance between the binocular pupils according to the image distance of the binocular pupils, the shooting angle, the vertical actual distance between the eyes and the camera, and the height of the eye image.
For example, capturing at least one eye image includes: capturing a left-eye image containing the left eye and a right-eye image containing the right eye with a first camera and a second camera, respectively. Extracting, from the at least one eye image, the one or two pupil image positions corresponding to the single-eye pupil or the binocular pupils includes: determining the pupil image position of the left-eye pupil in the left-eye image and the pupil image position of the right-eye pupil in the right-eye image. And determining the actual distance between the binocular pupils based on the one or two pupil image positions includes: determining the image position of the first camera in the left-eye image and the image position of the second camera in the right-eye image; determining, in the left-eye image, the first horizontal image distance between the pupil image position of the left-eye pupil and the image position of the first camera; determining, according to the first horizontal image distance, the first horizontal actual distance between the left-eye pupil and the first camera; determining, in the right-eye image, the second horizontal image distance between the pupil image position of the right-eye pupil and the image position of the second camera; determining, according to the second horizontal image distance, the second horizontal actual distance between the right-eye pupil and the second camera; and determining the actual distance of the binocular pupils according to the first horizontal actual distance and the second horizontal actual distance.
For example, determining the first horizontal actual distance between the left-eye pupil and the first camera according to the first horizontal image distance includes: measuring the vertical actual distance between the first and second cameras and the eyes; determining the first shooting angle of the first camera; reading the height of the left-eye image; and determining the first horizontal actual distance between the left-eye pupil and the first camera according to the vertical actual distance, the first shooting angle, the first horizontal image distance, and the height of the left-eye image. Determining the second horizontal actual distance between the right-eye pupil and the second camera according to the second horizontal image distance includes: determining the second shooting angle of the second camera; reading the height of the right-eye image; and determining the second horizontal actual distance between the right-eye pupil and the second camera according to the vertical actual distance, the second shooting angle, the second horizontal image distance, and the height of the right-eye image.
For example, determining the actual distance between the binocular pupils further includes: obtaining the actual distance between the first and second cameras; calculating the sum of the first horizontal actual distance and the second horizontal actual distance; and determining the actual distance between the binocular pupils as the actual distance between the first and second cameras minus that sum.
For example, capturing at least one eye image includes: capturing an eye image containing the single-eye pupil and a preset marker point. Extracting, from the at least one eye image, the one or two pupil image positions corresponding to the single-eye pupil or the binocular pupils includes: extracting the pupil image position of the single-eye pupil in the eye image; and extracting the marker position of the marker point from the eye image. And determining the actual distance between the binocular pupils based on the one or two pupil image positions includes: determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point.
For example, determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point includes: measuring the vertical actual distance between the eyes and the camera; determining the shooting angle of the camera; reading the horizontal image distance between the single-eye pupil and the marker point; reading the height of the eye image; and determining the actual distance between the binocular pupils according to the vertical actual distance between the eyes and the camera, the shooting angle, the horizontal image distance between the single-eye pupil and the marker point, and the height of the eye image.
For example, extracting, from the at least one eye image, the one or two pupil image positions corresponding to the single-eye pupil or the binocular pupils includes: extracting the image region where the single eye or both eyes are located from the at least one eye image; filtering and grayscaling the image region to obtain a grayscale image; converting the grayscale image into a binary image; finding the boundary of the monocular pupil region or the boundaries of the binocular pupil regions in the binary image; performing ellipse fitting on the boundary of the monocular pupil region or the boundaries of the binocular pupil regions to obtain one or two ellipses; and taking the center of each ellipse as the image position of the corresponding pupil.
For example, the height of the eye image is the pixel size in the vertical direction of the image acquired by the camera.
At least one embodiment of the present disclosure provides a computer-readable storage medium having computer instructions stored thereon that, when executed by a processor, perform the following operations: extracting, from at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
At least one embodiment of the present disclosure provides a wearable eye device, including a processor and a memory; the memory stores instructions that, when executed by the processor, perform the following operations: extracting, from at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image; and determining the actual distance between the binocular pupils based on the one or two pupil image positions.
For example, the wearable eye device further includes a first camera, wherein the first camera is configured to capture an eye image containing the binocular pupils. And the instructions stored in the memory, when executed by the processor, perform the following operations: determining the pupil image positions of the binocular pupils in the eye image; determining the image distance of the binocular pupils on the eye image according to the pupil image positions of the binocular pupils; and determining the actual distance of the binocular pupils according to the image distance of the binocular pupils.
For example, the wearable eye device further includes a first camera and a second camera; the first camera and the second camera are configured to respectively capture a left-eye image containing the left eye and a right-eye image containing the right eye; and the instructions stored in the memory, when executed by the processor, perform the following operations: determining the pupil image position of the left-eye pupil in the left-eye image and the pupil image position of the right-eye pupil in the right-eye image; determining the image position of the first camera in the left-eye image and the image position of the second camera in the right-eye image; determining, in the left-eye image, the first horizontal image distance between the pupil image position of the left-eye pupil and the image position of the first camera; determining, according to the first horizontal image distance, the first horizontal actual distance between the left-eye pupil and the first camera; determining, in the right-eye image, the second horizontal image distance between the pupil image position of the right-eye pupil and the image position of the second camera; determining, according to the second horizontal image distance, the second horizontal actual distance between the right-eye pupil and the second camera; and determining the actual distance of the binocular pupils according to the first horizontal actual distance and the second horizontal actual distance.
For example, the wearable eye device further includes a first camera, wherein the first camera is configured to capture an eye image containing the single-eye pupil and a preset marker point. And the instructions stored in the memory, when executed by the processor, perform the following operations: extracting the pupil image position of the single-eye pupil in the eye image; extracting the marker position of the marker point from the eye image; determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point.
For example, the marker point is disposed on the housing of the wearable eye device.
For example, the wearable eye device further includes an infrared light source, wherein the infrared light source provides a light source for the wearable eye device to capture eye images.
Embodiments of the present disclosure measure the actual distance of the binocular pupils by means of image processing; the obtained actual distance of the binocular pupils can further be used in other vision-related fields such as eye tracking and line-of-sight calculation, and can further improve the experience of wearers of wearable eye devices (e.g., VR devices).
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present disclosure more clearly, the accompanying drawings of the embodiments are briefly introduced below. Obviously, the drawings described below relate only to some embodiments of the present disclosure, rather than limiting the present disclosure.
FIG. 1 is a flowchart of a pupil distance measurement method provided by an embodiment of the present disclosure;
FIG. 2A is a flowchart of another pupil distance measurement method provided by an embodiment of the present disclosure;
FIG. 2B is a schematic diagram of a pupil distance measurement apparatus provided by an embodiment of the present disclosure;
FIG. 2C is a schematic diagram of a binocular image captured by a camera according to an embodiment of the present disclosure;
FIG. 2D is a schematic diagram of similar figures provided by an embodiment of the present disclosure;
FIG. 3A is a flowchart of yet another pupil distance measurement method provided by an embodiment of the present disclosure;
FIG. 3B is another schematic diagram of a pupil distance measurement apparatus provided by an embodiment of the present disclosure;
FIG. 3C-1 is a schematic diagram of a captured left-eye image provided by an embodiment of the present disclosure;
FIG. 3C-2 is a schematic diagram of a captured right-eye image provided by an embodiment of the present disclosure;
FIG. 3D is a schematic diagram of similar figures provided by an embodiment of the present disclosure;
FIG. 3E is a schematic diagram of the positions of the eyes and the cameras on the plane where the eyes are located, provided by an embodiment of the present disclosure;
FIG. 4A is a flowchart of yet another pupil distance measurement method provided by an embodiment of the present disclosure;
FIG. 4B is yet another schematic diagram of a pupil distance measurement apparatus provided by an embodiment of the present disclosure;
FIG. 4C is a schematic diagram of the position of a marker point provided by an embodiment of the present disclosure;
FIG. 4D is a diagram of the relative positions of the eyes and the marker point provided by an embodiment of the present disclosure;
FIG. 4E is an image of a single-eye pupil and a marker point captured by a camera according to an embodiment of the present disclosure;
FIG. 4F is a schematic diagram of similar figures provided by an embodiment of the present disclosure;
FIG. 5 is a flowchart of extracting a pupil position provided by an embodiment of the present disclosure;
FIG. 6 is a block diagram of a wearable eye device provided by an embodiment of the present disclosure.
Detailed Description
The technical solutions in the embodiments of the present disclosure are described below clearly and completely with reference to the accompanying drawings. The example embodiments of the present disclosure, together with their various features and advantageous details, are explained more fully with reference to the non-limiting example embodiments illustrated in the drawings and detailed in the following description. It should be noted that the features shown in the figures are not necessarily drawn to scale. The examples given are intended only to facilitate understanding of the implementation of the embodiments of the present disclosure and to further enable those skilled in the art to practice the example embodiments; they should therefore not be construed as limiting the scope of the embodiments of the present disclosure.
Unless otherwise specifically defined, the technical or scientific terms used in the present disclosure shall have the ordinary meaning understood by a person with ordinary skill in the art to which the present disclosure belongs. The words "first", "second", and similar terms used in the present disclosure do not denote any order, quantity, or importance, but are only used to distinguish different components. In addition, throughout the embodiments of the present disclosure, the same or similar reference numerals denote the same or similar components.
The pupil distance measurement method, computer-readable storage medium, and wearable eye device of the present disclosure are described below with reference to the drawings. In the embodiments of the present disclosure, where no ambiguity arises, the pupil may be used to represent the eye: for example, the position of the pupil may represent the position of the eye, and the distance between the binocular pupils may represent the distance between the eyes.
As shown in FIG. 1, at least one embodiment of the present disclosure provides a pupil distance measurement method 100. The pupil distance measurement method 100 may include: step 101, capturing at least one eye image; step 111, extracting, from the at least one eye image, one or two pupil image positions corresponding to a single-eye pupil or the binocular pupils in the at least one eye image (i.e., extracting one pupil image position corresponding to a single-eye pupil, or two pupil image positions corresponding to the binocular pupils); and step 121, determining the actual distance between the binocular pupils based on the one or two pupil image positions.
In some embodiments, step 101 includes capturing an image containing one eye; in other embodiments, step 101 includes capturing one image containing both eyes, or capturing two images each containing one eye.
In some embodiments, step 111 may extract the image distance of the binocular pupils from the image; in other embodiments, step 111 may extract from the image the distance between the pupil of one eye and a certain preset marker point, or the distance between the pupil of one eye and the position point of the imaging device on the image.
In some embodiments, step 121 derives the actual distance between the binocular pupils from the image distance of the binocular pupils obtained in step 111, combined with the proportional relationships between similar figures. In other embodiments, step 121 derives the actual distance between the binocular pupils from the image distance between the single-eye pupil and the marker point obtained in step 111, combined with the proportional relationships between figures. In still other embodiments, step 121 may also derive, from the image distance of step 111 and the proportional relationships between figures, the actual distance between an eye and the corresponding imaging device, and then obtain the actual distance of the binocular pupils by combining the actual distance between the left eye and one imaging device with the actual distance between the right eye and the other imaging device.
In some embodiments, in order to obtain the actual distance between the pupils from a pupil-related image distance, the embodiments of the present disclosure further construct a virtual imaging plane located at the focal plane of the imaging device; in some embodiments this virtual imaging plane may coincide with the actual imaging plane. If the virtual imaging plane does not coincide with the actual imaging plane, the image formed on the virtual imaging plane and the image captured on the actual imaging plane also satisfy a certain proportional relationship, for example, proportional enlargement or reduction. Based on the proportional relationship between similar figures on the virtual imaging plane and on the plane where the eyes are located, a calculation formula relating the pupil-related image distance on the captured image to the actual pupil distance can be derived. The following embodiments take the image distance to be the relevant distance on the image captured on the actual imaging plane.
In some embodiments, step 121 may use the method shown in FIG. 5 to extract the position of the pupil on the image from the captured eye image (i.e., the image on the actual imaging plane). Based on the position of the pupil on the captured image, the pupil-related distances on the captured image can be further obtained. For example, the distance between the binocular pupils on the captured image is determined from the extracted pupil positions. For example, the first horizontal image distance or the second horizontal image distance is determined from the extracted pupil position and the corresponding position of the camera. For example, the horizontal image distance is determined from the extracted pupil position and the position of the marker point.
In some embodiments, when the user starts using a wearable device such as a VR device, the user performs some operations on the start screen to enter the main interface, for example, selecting a menu item, or the main interface can only be entered at a certain start-up stage. At this point, the pupil distance measurement method of the present disclosure can be activated to measure the actual distance between the pupils.
As shown in FIG. 2A, another pupil distance measurement method 200 is provided. The images on which the pupil measurement method 200 is based may be binocular images acquired with a single camera. For example, the pupil measurement method 200 may include: step 201, capturing an eye image containing the binocular pupils at the same time; step 211, determining the pupil image positions of the binocular pupils in the eye image; step 221, determining the image distance of the binocular pupils on the eye image according to the pupil image positions of the binocular pupils; and step 231, determining the actual distance of the binocular pupils according to the image distance of the binocular pupils.
In some embodiments, step 201 may use one infrared camera to photograph the user's eyes, while a plurality of infrared LED lamps provide a light source for the infrared camera. When the camera acquires an image, it must be ensured that both of the user's eyes can be captured, i.e., that the camera works normally and the LED lamps are turned on. Of course, the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting, which is not limited herein.
Referring to FIG. 2B, in some embodiments the pupil distance measurement method can be applied in a VR device. The corresponding infrared LED lamps 4 illuminate the interior of the VR device with infrared light; the LED lamps 4 are arranged around the lens 3 used by the VR device and may also be distributed on the inner housing 5 of the VR device. The number of LED lamps 4 is chosen so as to illuminate the interior of the VR device well enough to capture a clear eye image. The camera 2 is arranged in the upper middle position of the VR device. The VR device may further include an interface 1, which can connect the infrared camera 2 with a data processing device. For example, the interface 1 may be a USB interface (connected to an external computer), a MIPI interface (connected to a mobile terminal), a WiFi interface, a Bluetooth interface, or the like.
Assume that the pixel size of the image actually captured by the camera is Image_W*Image_H. As shown in FIG. 2C, by analyzing the captured image (i.e., the image of FIG. 2C), the position of the left-eye pupil on the image is Image_el, the position of the right-eye pupil on the image is Image_er, and the distance between the left-eye pupil and the right-eye pupil on the captured image (i.e., the distance of the binocular pupil distance on the image) is Image_ed.
To obtain the actual distance of the binocular pupils from their image distance Image_ed on the captured image, further reference is made to the relevant points and constructed planes shown in FIG. 2D. FIG. 2D shows two planes: the virtual imaging plane O' (i.e., the focal plane) and the plane O where the eyes are located. Assume the focal length of the camera 20 is f and the field of view (FOV) of the camera 20 is 2*θ. The positions el0 and er0 of the binocular pupils are shown on the virtual imaging plane O'. The actual positions el and er of the left and right eyes are shown on the actual plane O where the binocular pupils are located; in FIG. 2D the position of the eye is used to represent the position of the eye's pupil. H denotes the height of the plane where the eyes are located; for example, H can be calculated from the camera's shooting angle. h denotes the height of the focal plane; for example, h can also be calculated from the camera's shooting angle. pd (i.e., the actual distance of the binocular pupils) denotes the actual distance between the left-eye pupil el and the right-eye pupil er on the plane O in FIG. 2D; pd' (i.e., the distance of the binocular pupils on the focal plane O') denotes the distance between the binocular pupils er0 (corresponding to point er on plane O) and el0 (corresponding to point el on plane O) on the imaging plane O'.
From the perspective relations in FIG. 2D:
tan(θ)=(h/2)/f;
tan(θ)=(H/2)/d;
Image_ed/pd’=Image_H/h;
pd/pd’=H/h。
From the above relations, the actual distance of the user's binocular pupils is pd = 2*d*tan(θ)*Image_ed/Image_H.
In some embodiments, determining the actual distance pd of the binocular pupils from their image distance Image_ed in step 221 of FIG. 2A may include: measuring the vertical actual distance d between the eyes and the camera (e.g., the vertical distance between the plane O of the eyes shown in FIG. 2D and the camera 20); determining the shooting angle θ of the camera (e.g., the shooting angle of the camera 20 shown in FIG. 2D); reading the image distance Image_ed of the binocular pupils (e.g., the image distance between the left-eye pupil and the right-eye pupil on the captured image shown in FIG. 2C); reading the height Image_H of the eye image (e.g., the number of vertical pixels of the captured image in FIG. 2C); and determining the actual distance pd between the binocular pupils (i.e., the distance between er and el in FIG. 2D) according to the image distance Image_ed of the binocular pupils, the shooting angle θ, the vertical actual distance d between the eyes and the camera, and the height Image_H of the eye image.
The embodiments of the present disclosure use one infrared camera to photograph the user's two eyes, measure the distance of the binocular pupils on the image by image processing, and then calculate the actual pupil distance. After the pupil distance is detected using the embodiments of the present disclosure, the result can also be used in other eye-related fields such as eye tracking and line-of-sight calculation.
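The single-camera relation derived above can be sketched in Python. This is only a minimal illustration of the formula pd = 2*d*tan(θ)*Image_ed/Image_H; the function name and all numeric inputs are assumed example values, not values from the disclosure.

```python
import math

def pupil_distance_single_camera(d_mm, theta_deg, image_ed_px, image_h_px):
    """pd = 2 * d * tan(theta) * Image_ed / Image_H, where Image_ed is
    the pixel distance between the two pupils on the captured image and
    theta is half the camera's field of view (FOV = 2*theta)."""
    return 2 * d_mm * math.tan(math.radians(theta_deg)) * image_ed_px / image_h_px

# Assumed example: eyes 50 mm from the camera, FOV = 90 deg (theta = 45),
# pupils 300 px apart on a 480-px-high image.
pd = pupil_distance_single_camera(50, 45, 300, 480)  # about 62.5 mm here
```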
As shown in FIG. 3A, a pupil distance measurement method 300 is provided. The pupil distance measurement method 300 may include: step 301, capturing a left-eye image containing the left eye and a right-eye image containing the right eye with a first camera and a second camera, respectively; step 311, determining the pupil image position of the left-eye pupil in the left-eye image and the pupil image position of the right-eye pupil in the right-eye image; step 321, determining the image position of the first camera in the left-eye image and the image position of the second camera in the right-eye image; step 331, determining, in the left-eye image, the first horizontal image distance between the pupil image position of the left-eye pupil and the image position of the first camera; step 341, determining the first horizontal actual distance Sep_l between the left-eye pupil and the first camera according to the first horizontal image distance; step 351, determining, in the right-eye image, the second horizontal image distance between the pupil image position of the right-eye pupil and the image position of the second camera; step 361, determining the second horizontal actual distance between the right-eye pupil and the second camera according to the second horizontal image distance; and step 371, determining the actual distance of the binocular pupils according to the first horizontal actual distance and the second horizontal actual distance.
In some embodiments, step 301 may capture images using the apparatus shown in FIG. 3B. For example, two infrared cameras 11 (i.e., the first camera and the second camera, located above the left eye and the right eye, respectively) at known relative positions can be used to measure the distance between the pupils, each infrared camera 11 being able to capture one of the user's eyes. A plurality of infrared LED lamps 14 are also included to provide a light source for the two infrared cameras. As shown in FIG. 3B, the two cameras 11 are respectively above the lenses 13 and lie on the same horizontal line. Of course, the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting, which is not limited herein.
For example, the lens 13 of FIG. 3B may be a lens used by a VR device, and the infrared LED lamps 14 illuminate the interior of the VR device with infrared light. The LED lamps 14 are usually around the lens 13 of the VR device and may also be distributed on the inner housing 15 of the VR device. The number of LED lamps 14 is chosen so as to illuminate the interior of the VR device well enough to capture an eye image.
The image captured by one camera 12 in FIG. 3B may be as shown in FIG. 3C-1 and FIG. 3C-2.
The pixel size of the captured left-eye and right-eye images shown in FIG. 3C-1 and FIG. 3C-2 is Image_W*Image_H, and the image center is P_ImageC (i.e., the position on the captured image corresponding to the camera 12 on the actual imaging plane). The position of the monocular pupil center on the captured image is the point Peye, and the lateral extension line of the point Peye intersects the longitudinal extension line of the point P_ImageC at the point P_c. The distance between the point Peye and the point P_c is the first horizontal image distance Image_ed_l or the second horizontal image distance Image_ed_r.
To calculate the actual distance of the binocular pupils from the first and second horizontal image distances, further reference is made to the relevant points and constructed planes shown in FIG. 3D.
FIG. 3D shows two planes: the virtual imaging plane O' (i.e., the focal plane) and the plane O where the eyes are located. The points P_eye_o and P_c_o in FIG. 3D are the positions on the plane O of the eyes corresponding, respectively, to the points Peye and P_c on the captured image. The points P_eye_o and P_c_o correspond, on the virtual imaging plane O', to the points Peye' and P_c', respectively. Assuming the focal plane is the actual imaging plane, the first or second horizontal image distance on the captured image can be expressed as Image_ed.x' = P_c'.x - Peye'.x, where the parameter x may be l (left) or r (right), denoting respectively the first horizontal image distance corresponding to the left eye (e.g., Image_ed.l' = P_c'.l - Peye'.l) and the second horizontal image distance corresponding to the right eye (e.g., Image_ed.r' = P_c'.r - Peye'.r). Since in practice the point Peye' may lie to the left or to the right of the point P_c', the first horizontal image distance Image_ed.l' or the second horizontal image distance Image_ed.r' may be positive or negative.
Assume that the focal length of the first and second cameras 30 in FIG. 3D is f and the shooting angle FOV is 2*θ. The height of the virtual imaging plane O' is h, the vertical actual distance from the human eye to the camera is d, P_eye_o is the position of the human eye on the plane O, and P_c_o is the projection of the point P_c' on the plane O. The distance between the points P_eye_o and P_c_o is the first or second horizontal actual distance Sep_x (where the parameter x may be l or r, denoting respectively the first horizontal actual distance Sep_l corresponding to the left eye and the second horizontal actual distance Sep_r corresponding to the right eye).
From the perspective relations shown in FIG. 3D:
tan(θ)=(h/2)/f
tan(θ)=(H/2)/d
Image_ed/Image_ed’=Image_H/h;
Sep_x/Image_ed' = H/h
From the above relations, the formula for the user's first or second horizontal actual distance is: Sep_x = 2*d*tan(θ)*Image_ed/Image_H.
From the left-eye image captured by the first camera, the lateral distance Image_ed_l from the pupil to the image center of the camera on the image can be obtained; substituting into the above formula gives the first horizontal actual distance Sep_l = 2*d*tan(θ)*Image_ed_l/Image_H.
Similarly, for the second camera, the second horizontal actual distance Sep_r = 2*d*tan(θ)*Image_ed_r/Image_H can be obtained.
After the first and second horizontal actual distances are obtained with the above method, the actual distance of the binocular pupils can be further calculated with reference to FIG. 3E. As shown in FIG. 3E, if the distance between the two cameras 12 (the first camera and the second camera) in the VR device is known to be Sep, the user's left-eye image is captured by the first camera 12 and processed (for example, with reference to FIG. 5) to calculate the horizontal distance Sep_l from the left-eye pupil on the captured image to the point corresponding to the first camera; the same method can be used to calculate the distance Sep_r from the point corresponding to the second camera 12 on the captured image to the center of the right-eye pupil. The actual distance between the user's binocular pupils is then pd = Sep - Sep_l - Sep_r.
In some embodiments, determining the first horizontal actual distance in step 341 may include: measuring the vertical actual distance d between the first and second cameras and the eyes (e.g., the vertical distance between the plane O of the eyes shown in FIG. 3D and the camera 30); determining the first shooting angle θ of the first camera (e.g., the shooting angle of the camera 30 shown in FIG. 3D); reading the height of the left-eye image (e.g., the number of vertical pixels of the image in FIG. 3C-1); and determining the first horizontal actual distance Sep_l between the left-eye pupil and the first camera (refer to FIG. 3E) according to the vertical actual distance d, the first shooting angle θ, the first horizontal image distance Image_ed_l (e.g., the distance between the left-eye pupil and the projection point P_ImageC of the first camera 12 on the image shown in FIG. 3C-1), and the height of the left-eye image. The step 361 of determining the second horizontal actual distance may include: determining the second shooting angle θ of the second camera; reading the height of the right-eye image (e.g., the number of vertical pixels of the image in FIG. 3C-2); and determining the second horizontal actual distance Sep_r between the right-eye pupil and the second camera (refer to FIG. 3E) according to the vertical actual distance d, the second shooting angle θ, the second horizontal image distance Image_ed_r (e.g., the distance between the right-eye pupil and the projection point P_ImageC of the second camera 12 on the image shown in FIG. 3C-2), and the height of the right-eye image.
Referring to FIG. 3E, in some embodiments, step 371 of determining the actual distance between the binocular pupils may further include: obtaining the actual distance Sep between the first and second cameras; calculating the sum of the first horizontal actual distance Sep_l and the second horizontal actual distance Sep_r; and determining the actual distance between the binocular pupils as the actual distance between the first and second cameras minus that sum. On the VR device, the distance Sep between the first and second cameras is known, so the user's interpupillary distance is: pd = Sep - (Sep_l + Sep_r).
In summary, with the distance Sep between the two cameras in the VR device known, the user's left-eye image is captured by the first camera camera_l and, by means of image processing and calculation, the horizontal distance Sep_l from the left-eye pupil to the left camera is obtained; the same method can be used to calculate the distance Sep_r from the second camera camera_r to the center of the right-eye pupil. The user's final pupil distance is pd = Sep - Sep_l - Sep_r, as shown in FIG. 3E.
Two infrared cameras at known positions each capture a single-eye image, each camera capturing one of the user's eyes, while a plurality of infrared LED lamps provide a light source for the infrared cameras. For example, a method is provided in which two infrared cameras in a VR device each photograph one of the user's eyes; the distance on the captured image from a single pupil to the point corresponding to the optical center of its camera is measured by image processing, and the actual pupil distance is then calculated. After the pupil distance is detected, the cameras used in the embodiments of the present disclosure can also be used for other eye-related work such as eye tracking and line-of-sight calculation.
As shown in FIG. 4A, a pupil distance measurement method 400 is provided. The pupil distance measurement method 400 includes: step 401, capturing an eye image containing the single-eye pupil and a preset marker point; step 411, extracting the pupil image position of the single-eye pupil in the eye image, and extracting the marker position of the marker point from the eye image; step 421, determining the horizontal image distance between the single-eye pupil and the marker point according to the pupil image position of the single-eye pupil and the marker position; and step 431, determining the actual distance between the binocular pupils according to the horizontal image distance between the single-eye pupil and the marker point.
As shown in FIG. 4B, in some embodiments the pupil distance measurement method 400 can be applied in a VR device. The VR device is equipped with a single infrared camera 22 and infrared LED lamps 24. When the VR device is in use, the infrared light source of the LED lamps 24 illuminates the human eye, and the infrared camera 22 captures the user's single eye and the marker point 26 on the VR housing 25; the user's pupil and the marker point 26 are obtained by image processing, and the actual distance between the binocular pupils is estimated based on the lateral distance from the pupil to the marker point on the image. For example, the marker point 26 can be disposed on the housing 25 of the VR device. Of course, the embodiments of the present disclosure may also use other types of cameras and other types of light sources for shooting, which is not limited herein.
For example, the VR device uses a lens 23, and the infrared LED lamps 24 illuminate the interior of the VR device with infrared light. The LED lamps 24 are usually around the lens 23 and may also be distributed on the inner housing 25 of the VR device. The number of LED lamps 24 is chosen so as to illuminate the interior of the VR device well enough to capture an eye image; typically eight LED lamps are used.
As shown in FIG. 4C and FIG. 4D, the marker point 26 is located above the viewing window 27 of the VR device; for example, the marker point 26 is located in the middle of the VR housing 25 in the left-right direction. When the user uses the VR device normally, the marker point 26 can be set at the center of the user's binocular spacing. For example, the marker point 26 can be an object point, such as an infrared LED lamp 24, that the infrared camera can recognize. The viewing window 27 of the VR device is the window through which the user, looking through the lens 23, views the content displayed by the VR device.
An image captured by the camera is shown in FIG. 4E; the captured image includes the single-eye position point Peye and the marker point Pmark.
Assume that the pixel size of the image captured by the camera (shown in FIG. 4E) is Image_W*Image_H, the position of the monocular pupil center on the image is Peye, and the position of the marker point on the image is Pmark. The lateral extension line of the point Peye on the image intersects the longitudinal extension line of the point Pmark at the point Pcross. The distance between the point Peye and the point Pcross is the horizontal image distance Image_ed.
To derive the relation between the horizontal image distance between the points Peye and Pcross and the actual distance between the pupils, further reference is made to the relevant points and constructed planes shown in FIG. 4F.
FIG. 4F shows two planes: the virtual imaging plane O' located at the focal plane and the plane O where the binocular pupils are located. The points Peye, Pmark, and Pcross on the captured image correspond to the points Peye', Pmark', and Pcross' on the virtual imaging plane O', respectively. The position Peye' of the single-eye pupil and the position Pmark' of the marker point are shown on the virtual imaging plane O'. The distance from the human eye to the camera is d. Peye_o is the position of the human eye on the plane O, and Pcross_o is the projection of Pcross' on the plane O. The distance between the corresponding points Peye_o and Pcross_o is half the actual distance of the binocular pupils. The focal length of the camera 40 is f and the shooting angle is θ. H denotes the height of the plane where the eyes are located; for example, H can be calculated from the camera's shooting angle. h denotes the height of the focal plane; for example, h can also be calculated from the camera's shooting angle.
From the perspective relations shown in FIG. 4F, the following can be obtained:
tan(θ)=(h/2)/f
tan(θ)=(H/2)/d
Image_ed/pd’=Image_H/h;
pd/pd’=H/h
由以上关系即可得到pd’=2*d*tan(θ)*Image_ed/Image_H。使用者的双目瞳孔的距离pd=2*pd’。
在一些实施例中,步骤421可以包括:测量双眼与摄像头之间的垂直实际距离d(例如,图4F所示的双眼所在的平面O与摄像头40之间的垂直距离);确定所述摄像头的拍摄张角θ(例如,图4F所示的摄像头40的拍摄张角);读取拍摄图像上所述单眼瞳孔和所述标志点的水平图像距离Image_ed(例如,图4E所示的图像上的单眼瞳孔到标志点之间的距离);读取所述拍摄图像的眼部图像的高度(例如,图4E所示的图像的纵向像素数);依据所述双眼与摄像头之间的垂直实际距离d、拍摄张角θ、所述单眼瞳孔和所述标志点的水平图像距离Image_ed以及所述眼部图像的高度Image_H,确定所述双目瞳孔之间的实际距离(即pd)。
如图5所示,上述三个实施例中的从所述至少一眼部图像中,提取单眼瞳孔或双眼瞳孔在所述至少一眼部图像中对应的一或两个瞳孔图像位置的步骤,例如该步骤可以包括:步骤501,从各眼部图像中提取单眼或双眼所在图像区域;步骤511,对所述图像区域进行滤波和灰度化,得到灰度图,将所述灰度图转化为二值图;步骤521,查找二值图中的的单眼瞳孔区域的边界或双眼瞳孔区域的边界;步骤531,将所述单眼瞳孔区域的边界或双眼瞳孔区域的边界进行椭圆拟合得到一或两个椭圆图形;以及步骤541,将各椭圆图形的中心作为相应瞳孔的图像位置。当然,也可以使用其他方法得到瞳孔的图像位置,本公开在此不作限定。
上述实施例中所述眼部图像的高度可以为摄像头所拍摄的图像的纵向的像素大小。
在一些实施例中,提取上述标识点的图像可以包括如下步骤:获得摄像 头采集的图像后,即可进行标示点提取。标示点提取包括以下几步:(1)标示点区域提取(例如,从整副图像中提取出标示点所在的大概区域)。因使用者佩戴上VR设备后,标示点在图像中的位置大致是确定的,因此,可以选取一个较大的图像区域作为标示点所在区域的备选区域。(2)图像滤波:对提取出来的图像做滤波,如高斯滤波等。(3)灰度化:对滤波后的图像进行灰度化,由RGB图转化层灰度图。(4)阈值处理:将灰度图转化成二值图;因为标示点本身就是红外LED灯,所以,在红外摄像头拍摄的图像上,标示点为一白色的斑点,可以设置一个较大的阈值,如,阈值=240,进行图像的二值化。(5)标示点区域检查:利用形态学的方法,找到标示点区域的边界。(6)标示点椭圆拟合:利用标示点区域的边界进行椭圆拟合,得到椭圆。(7)标示点中心输出:标示点位置即为椭圆的中心。
本公开实施例提供了一种能够利用单个红外摄像头估算瞳距的方法,例如,可以采用一个红外摄像头多个LED灯。单个摄像头能够拍摄到使用者的单眼和VR壳体上的标示点,多个红外LED灯为红外摄像头提供光源。利用VR设备内的红外摄像头拍摄使用者的单眼和VR壳体上的标示点,并通过图像处理的方式,测试单眼瞳孔到标示点的横向距离,继而,计算出双眼瞳孔之间的实际距离。本公开所用的摄像头,在检测到瞳孔距离后,也可以用于眼球追踪和视线计算等其他与眼睛相关的工作。
本公开的至少一个实施例提供一种计算机可读存储介质,其上存储有计算机指令,所述指令被处理器执行时实现以下操作:从所述至少一眼部图像中,提取单眼瞳孔或双眼瞳孔在所述至少一眼部图像中对应的一或两个瞳孔图像位置;基于所述一或两个瞳孔图像位置,确定所述双眼瞳孔之间的实际距离。
如图6所示,该图为本公开提供的可穿戴眼部设备600的组成框图。可穿戴眼部设备600可以包括:处理器602以及存储器603。所述存储器603存储指令,所述指令被所述处理器602执行时实现以下操作:从至少一眼部图像中,提取单眼瞳孔或双眼瞳孔在所述至少一眼部图像中对应的一或两个瞳孔图像位置;基于所述一或两个瞳孔图像位置,确定所述双眼瞳孔之间的实际距离。
在一些实施例中,可穿戴眼部设备600还可以包括:至少一个摄像头601。例如,至少一个摄像头601被配置为拍摄所述至少一眼部图像。
在一些实施例中,可穿戴眼部设备600还可以包括:至少一个红外光源604,红外光源604可以被配置为所述至少一个摄像头601提供拍摄光源。
在一些实施例中,可穿戴眼部设备600可以具有VR眼镜的基本部件。例如,VR眼镜的基本部件可以包括透镜、壳体等。
在一些实施例中,至少一个摄像头601包括第一摄像头,其中,该第一摄像头被配置为;拍摄包含所述双眼瞳孔的眼部图像。所述存储器存储的指令被所述处理器执行时实现以下操作:确定在所述眼部图像中的所述双眼瞳孔的瞳孔图像位置;依据所述双眼瞳孔的瞳孔图像位置,确定在所述眼部图像上的所述双眼瞳孔的图像距离;根据所述双眼瞳孔的图像距离,确定所述双眼瞳孔的所述实际距离。
In some embodiments, the at least one camera 601 may include a first camera and a second camera configured to capture, respectively, a left-eye image containing the left eye and a right-eye image containing the right eye. The instructions stored in the memory 603, when executed by the processor 602, implement the following operations: determining the pupil image position of the left pupil in the left-eye image and the pupil image position of the right pupil in the right-eye image; determining the image position of the first camera in the left-eye image and the image position of the second camera in the right-eye image; determining, in the left-eye image, a first horizontal image distance between the pupil image position of the left pupil and the image position of the first camera; determining, from the first horizontal image distance, a first horizontal actual distance between the left pupil and the first camera; determining, in the right-eye image, a second horizontal image distance between the pupil image position of the right pupil and the image position of the second camera; determining, from the second horizontal image distance, a second horizontal actual distance between the right pupil and the second camera; and determining, from the first horizontal actual distance and the second horizontal actual distance, the actual distance between the two pupils.
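The final combination step of this two-camera embodiment (spelled out in claim 6) can be sketched as a one-line calculation; the function name and sample numbers are illustrative, not from the patent:

```python
def pd_two_cameras(camera_separation, left_offset, right_offset):
    """Two-camera estimate of the interpupillary distance.

    camera_separation -- actual distance between the first and second cameras
    left_offset       -- first horizontal actual distance (left pupil to
                         the first camera)
    right_offset      -- second horizontal actual distance (right pupil to
                         the second camera)

    The pupil distance is the camera separation minus the sum of the two
    horizontal offsets (each pupil assumed inward of its camera).
    """
    return camera_separation - (left_offset + right_offset)

# Illustrative numbers in millimetres.
pd = pd_two_cameras(70.0, 3.5, 2.5)  # 64.0
```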
In some embodiments, the at least one camera 601 may include a first camera configured to capture an eye image containing the monocular pupil and a preset marker point. The instructions stored in the memory 603, when executed by the processor 602, implement the following operations: extracting the pupil image position of the monocular pupil in the eye image; extracting the mark position of the marker point from the eye image; determining, from the pupil image position of the monocular pupil and the mark position, the horizontal image distance between the monocular pupil and the marker point; and determining, from the horizontal image distance between the monocular pupil and the marker point, the actual distance between the pupils of the two eyes. For example, the marker point is arranged on the housing of the wearable ophthalmic device.
In some embodiments, the memory 603 and the processor 602 may be located on the same processing device, such as a PC.
With reference to FIGS. 1-5, similar descriptions of the wearable ophthalmic device 600 are not repeated here.
The above are merely specific embodiments of the present disclosure, but the protection scope of the present disclosure is not limited thereto. Any person skilled in the art can readily conceive of variations or substitutions within the technical scope disclosed herein, and all such variations or substitutions shall fall within the protection scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be defined by the appended claims.
The present disclosure claims priority to Chinese Patent Application No. 201710551489.5 filed on July 7, 2017, the entire disclosure of which is incorporated herein by reference as part of the present application.

Claims (17)

  1. An interpupillary distance measuring method, comprising:
    capturing at least one eye image;
    extracting, from the at least one eye image, one pupil image position corresponding to a monocular pupil, or two pupil image positions corresponding to the pupils of both eyes; and
    determining, based on the one or two pupil image positions, an actual distance between the pupils of the two eyes.
  2. The interpupillary distance measuring method according to claim 1, wherein
    capturing the at least one eye image comprises: capturing an eye image containing both pupils simultaneously;
    extracting the two pupil image positions corresponding to both pupils in the at least one eye image comprises: determining pupil image positions of both pupils in the eye image; and
    determining, based on the one or two pupil image positions, the actual distance between the pupils comprises:
    determining, from the pupil image positions of both pupils, an image distance between the two pupils in the eye image; and
    determining, from the image distance between the two pupils, the actual distance between the two pupils.
  3. The interpupillary distance measuring method according to claim 2, wherein determining, from the image distance between the two pupils, the actual distance between the two pupils comprises:
    measuring a vertical actual distance between the eyes and a camera;
    determining a shooting angle of the camera;
    reading the image distance between the two pupils;
    reading a height of the eye image; and
    determining the actual distance between the two pupils from the image distance between the two pupils, the shooting angle, the vertical actual distance between the eyes and the camera, and the height of the eye image.
  4. The interpupillary distance measuring method according to claim 1, wherein
    capturing the at least one eye image comprises: capturing, with a first camera and a second camera respectively, a left-eye image containing the left eye and a right-eye image containing the right eye;
    extracting, from the at least one eye image, the one pupil image position corresponding to a monocular pupil comprises: determining a pupil image position of the left pupil in the left-eye image and a pupil image position of the right pupil in the right-eye image; and
    determining, based on the one or two pupil image positions, the actual distance between the pupils of the two eyes comprises:
    determining an image position of the first camera in the left-eye image and an image position of the second camera in the right-eye image;
    determining, in the left-eye image, a first horizontal image distance between the pupil image position of the left pupil and the image position of the first camera;
    determining, from the first horizontal image distance, a first horizontal actual distance between the left pupil and the first camera;
    determining, in the right-eye image, a second horizontal image distance between the pupil image position of the right pupil and the image position of the second camera;
    determining, from the second horizontal image distance, a second horizontal actual distance between the right pupil and the second camera; and
    determining, from the first horizontal actual distance and the second horizontal actual distance, the actual distance between the two pupils.
  5. The interpupillary distance measuring method according to claim 4, wherein
    determining, from the first horizontal image distance, the first horizontal actual distance between the left pupil and the first camera comprises:
    measuring a vertical actual distance between the first and second cameras and the eyes;
    determining a first shooting angle of the first camera;
    reading a height of the left-eye image; and
    determining the first horizontal actual distance between the left pupil and the first camera from the vertical actual distance, the first shooting angle, the first horizontal image distance, and the height of the left-eye image; and
    determining, from the second horizontal image distance, the second horizontal actual distance between the right pupil and the second camera comprises:
    determining a second shooting angle of the second camera;
    reading a height of the right-eye image; and
    determining the second horizontal actual distance between the right pupil and the second camera from the vertical actual distance, the second shooting angle, the second horizontal image distance, and the height of the right-eye image.
  6. The interpupillary distance measuring method according to claim 4, wherein
    determining the actual distance between the pupils of the two eyes further comprises:
    obtaining an actual distance between the first camera and the second camera;
    computing a sum of the first horizontal actual distance and the second horizontal actual distance; and
    determining the actual distance between the two pupils as the actual distance between the first and second cameras minus said sum.
  7. The interpupillary distance measuring method according to claim 1, wherein
    capturing the at least one eye image comprises: capturing an eye image containing the monocular pupil and a preset marker point;
    extracting, from the at least one eye image, the one pupil image position corresponding to the monocular pupil comprises:
    extracting a pupil image position of the monocular pupil in the eye image; and
    extracting a mark position of the marker point from the eye image; and
    determining, based on the one or two pupil image positions, the actual distance between the pupils of the two eyes comprises:
    determining, from the pupil image position of the monocular pupil and the mark position, a horizontal image distance between the monocular pupil and the marker point; and
    determining, from the horizontal image distance between the monocular pupil and the marker point, the actual distance between the pupils of the two eyes.
  8. The interpupillary distance measuring method according to claim 7, wherein determining, from the horizontal image distance between the monocular pupil and the marker point, the actual distance between the pupils of the two eyes comprises:
    measuring a vertical actual distance between the eyes and a camera;
    determining a shooting angle of the camera;
    reading the horizontal image distance between the monocular pupil and the marker point;
    reading a height of the eye image; and
    determining the actual distance between the pupils of the two eyes from the vertical actual distance between the eyes and the camera, the shooting angle, the horizontal image distance between the monocular pupil and the marker point, and the height of the eye image.
  9. The interpupillary distance measuring method according to any one of claims 1-8, wherein extracting, from the at least one eye image, the one or two pupil image positions corresponding to a monocular pupil or to both pupils comprises:
    extracting, from the at least one eye image, an image region in which one or both eyes are located;
    filtering the image region and converting it to grayscale to obtain a grayscale image;
    converting the grayscale image into a binary image;
    finding a boundary of the monocular pupil region or boundaries of both pupil regions in the binary image;
    performing ellipse fitting on the boundary of the monocular pupil region or the boundaries of both pupil regions to obtain one or two ellipses; and
    taking the center of each ellipse as the image position of the corresponding pupil.
  10. The interpupillary distance measuring method according to claim 3, 5 or 8, wherein the height is the vertical pixel size of the image captured by the camera.
  11. A computer-readable storage medium having computer instructions stored thereon which, when executed by a processor, implement the following operations:
    extracting, from at least one eye image, one or two pupil image positions corresponding to a monocular pupil or to both pupils in the at least one eye image; and
    determining, based on the one or two pupil image positions, an actual distance between the pupils of the two eyes.
  12. A wearable ophthalmic device, comprising a processor and a memory, wherein
    the memory stores instructions which, when executed by the processor, implement the following operations:
    extracting, from at least one eye image, one or two pupil image positions corresponding to a monocular pupil or to both pupils in the at least one eye image; and
    determining, based on the one or two pupil image positions, an actual distance between the pupils of the two eyes.
  13. The wearable ophthalmic device according to claim 12, further comprising a first camera, wherein
    the first camera is configured to capture an eye image containing both pupils; and
    the instructions stored in the memory, when executed by the processor, implement the following operations:
    determining pupil image positions of both pupils in the eye image;
    determining, from the pupil image positions of both pupils, an image distance between the two pupils in the eye image; and
    determining, from the image distance between the two pupils, the actual distance between the two pupils.
  14. The wearable ophthalmic device according to claim 12, further comprising a first camera and a second camera, wherein
    the first camera and the second camera are configured to capture, respectively, a left-eye image containing the left eye and a right-eye image containing the right eye; and
    the instructions stored in the memory, when executed by the processor, implement the following operations:
    determining a pupil image position of the left pupil in the left-eye image and a pupil image position of the right pupil in the right-eye image;
    determining an image position of the first camera in the left-eye image and an image position of the second camera in the right-eye image;
    determining, in the left-eye image, a first horizontal image distance between the pupil image position of the left pupil and the image position of the first camera;
    determining, from the first horizontal image distance, a first horizontal actual distance between the left pupil and the first camera;
    determining, in the right-eye image, a second horizontal image distance between the pupil image position of the right pupil and the image position of the second camera;
    determining, from the second horizontal image distance, a second horizontal actual distance between the right pupil and the second camera; and
    determining, from the first horizontal actual distance and the second horizontal actual distance, the actual distance between the two pupils.
  15. The wearable ophthalmic device according to claim 12, further comprising a first camera, wherein
    the first camera is configured to capture an eye image containing the monocular pupil and a preset marker point; and
    the instructions stored in the memory, when executed by the processor, implement the following operations:
    extracting a pupil image position of the monocular pupil in the eye image;
    extracting a mark position of the marker point from the eye image;
    determining, from the pupil image position of the monocular pupil and the mark position, a horizontal image distance between the monocular pupil and the marker point; and
    determining, from the horizontal image distance between the monocular pupil and the marker point, the actual distance between the pupils of the two eyes.
  16. The wearable ophthalmic device according to claim 15, wherein the marker point is arranged on a housing of the wearable ophthalmic device.
  17. The wearable ophthalmic device according to any one of claims 12-16, further comprising an infrared light source, wherein the infrared light source provides a light source for the wearable ophthalmic device to capture eye images.
PCT/CN2018/074980 2017-07-07 2018-02-01 Pupillary distance measurement method, wearable ophthalmic device and storage medium WO2019007050A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
EP18788993.6A EP3651457B1 (en) 2017-07-07 2018-02-01 Pupillary distance measurement method, wearable eye equipment and storage medium
US16/096,428 US11534063B2 (en) 2017-07-07 2018-02-01 Interpupillary distance measuring method, wearable ophthalmic device and storage medium
JP2018557090A JP2020526735A (ja) 2017-07-07 2018-02-01 Pupillary distance measurement method, wearable ophthalmic device and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710551489.5 2017-07-07
CN201710551489.5A CN109429060B (zh) Pupillary distance measurement method, wearable ophthalmic device and storage medium

Publications (2)

Publication Number Publication Date
WO2019007050A1 true WO2019007050A1 (zh) 2019-01-10
WO2019007050A8 WO2019007050A8 (zh) 2019-03-07

Family

ID=64949675

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/074980 WO2019007050A1 (zh) Pupillary distance measurement method, wearable ophthalmic device and storage medium

Country Status (5)

Country Link
US (1) US11534063B2 (zh)
EP (1) EP3651457B1 (zh)
JP (1) JP2020526735A (zh)
CN (1) CN109429060B (zh)
WO (1) WO2019007050A1 (zh)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109674443B * 2017-10-19 2022-04-26 华为终端有限公司 Interpupillary distance measurement method and terminal
JP7180209B2 * 2018-08-30 2022-11-30 日本電信電話株式会社 Eye information estimation device, eye information estimation method, and program
CN111486798B * 2020-04-20 2022-08-26 苏州智感电子科技有限公司 Image distance measurement method, image distance measurement system, and terminal device
CN111860292B * 2020-07-16 2024-06-07 科大讯飞股份有限公司 Monocular-camera-based human eye localization method, apparatus, and device
CN111854620B * 2020-07-16 2022-12-06 科大讯飞股份有限公司 Monocular-camera-based actual interpupillary distance measurement method, apparatus, and device
CN112987853B * 2021-02-05 2021-12-10 读书郎教育科技有限公司 Early-education tablet based on a vision algorithm
US11663739B2 (en) * 2021-03-11 2023-05-30 Microsoft Technology Licensing, Llc Fiducial marker based field calibration of a device
CN113177434A * 2021-03-30 2021-07-27 青岛小鸟看看科技有限公司 Gaze rendering method and system for a virtual-reality system based on monocular eyeball tracking
GB2611579A (en) * 2021-10-11 2023-04-12 Fuel 3D Tech Limited Methods and systems for interpupillary distance measurement

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103793719A * 2014-01-26 2014-05-14 深圳大学 Monocular distance measurement method and system based on human eye localization
US9538166B2 (en) * 2011-12-16 2017-01-03 Sk Planet Co., Ltd. Apparatus and method for measuring depth of the three-dimensional image
CN106686365A * 2016-12-16 2017-05-17 歌尔科技有限公司 Lens adjustment method and apparatus for a head-mounted display device, and head-mounted display device
CN106803950A * 2017-03-02 2017-06-06 深圳晨芯时代科技有限公司 All-in-one VR device and image adjustment method therefor

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS53106070A (en) * 1977-02-28 1978-09-14 Tokyo Kouon Denpa Kk Measuring apparatus for dimension
JP3921968B2 2001-07-12 2007-05-30 株式会社豊田自動織機 Position detection method and position detection device
JP2005103039A * 2003-09-30 2005-04-21 Pentax Corp Pupillary distance measuring method and measuring instrument
US7682026B2 (en) * 2006-08-22 2010-03-23 Southwest Research Institute Eye location and gaze detection system and method
JP2012239566A * 2011-05-18 2012-12-10 Nikon Corp Measuring device for eyeglasses and three-dimensional measuring device
JP6128977B2 * 2013-06-14 2017-05-17 中村留精密工業株式会社 Peripheral edge machining device for plate material and method for measuring and correcting machining accuracy
CN105637512B * 2013-08-22 2018-04-20 贝斯普客公司 Method and system for creating customized products
US20160019720A1 (en) * 2014-07-15 2016-01-21 Ion Virtual Technology Corporation Method for Viewing Two-Dimensional Content for Virtual Reality Applications
US9719871B2 (en) * 2014-08-09 2017-08-01 Google Inc. Detecting a state of a wearable device
CN104834381B * 2015-05-15 2017-01-04 中国科学院深圳先进技术研究院 Wearable device for gaze focus localization and gaze focus localization method
US9877649B2 (en) * 2015-10-23 2018-01-30 Gobiquity, Inc. Photorefraction method and product
CN106019588A * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 Near-eye display device and method capable of automatically measuring interpupillary distance
CN106325510B * 2016-08-19 2019-09-24 联想(北京)有限公司 Information processing method and electronic device


Also Published As

Publication number Publication date
EP3651457B1 (en) 2023-08-30
CN109429060B (zh) 2020-07-28
EP3651457A4 (en) 2021-03-24
WO2019007050A8 (zh) 2019-03-07
EP3651457A1 (en) 2020-05-13
US11534063B2 (en) 2022-12-27
JP2020526735A (ja) 2020-08-31
CN109429060A (zh) 2019-03-05
US20210228075A1 (en) 2021-07-29

Similar Documents

Publication Publication Date Title
WO2019007050A1 Pupillary distance measurement method, wearable ophthalmic device and storage medium
CN109716268B Eye and head tracking
US10002463B2 (en) Information processing apparatus, information processing method, and storage medium, for enabling accurate detection of a color
WO2018161877A1 Processing method, processing apparatus, electronic apparatus and computer-readable storage medium
KR100949743B1 Wearable eye tracking apparatus and method in the form of goggles
CN102812416B Pointing input device, pointing input method, program, recording medium, and integrated circuit
ES2742416T3 Corneal imaging device and method
US20090051871A1 (en) Custom eyeglass manufacturing method
JP4501003B2 Face posture detection system
TWI694809B Method for detecting eye movement, program therefor, storage medium storing the program, and device for detecting eye movement
CN106204431B Display method and device for smart glasses
KR20120057033A Remote gaze tracking apparatus and method for IPTV control
CN112220444B Interpupillary distance measurement method and apparatus based on a depth camera
CN109478227A Iris or other body part recognition on a computing device
US20170186170A1 (en) Facial contour recognition for identification
CA3041021C (en) Body part color measurement detection and method
JP2015080647A Captured image display device
ES2933452T3 Method and device for measuring the local refractive power and/or the refractive power distribution of a spectacle lens
WO2020032254A1 Attention target estimation device and attention target estimation method
US20090202180A1 (en) Rotation independent face detection
CN112073640B Method, apparatus and system for acquiring the pose for panoramic information capture
TWI603225B Method and device for adjusting the display viewing angle of a liquid crystal display
KR20140003107A Augmented reality presentation apparatus and method
CN111522431B Classifying glints using an eye tracking system
CN112651270B Gaze information determination method and apparatus, terminal device, and display object

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018557090

Country of ref document: JP

Kind code of ref document: A

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18788993

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018788993

Country of ref document: EP

Effective date: 20200207