WO2019085487A1 - Display device, and method and apparatus for adjusting image presentation of a display device - Google Patents

Display device, and method and apparatus for adjusting image presentation of a display device

Info

Publication number
WO2019085487A1
Authority
WO
WIPO (PCT)
Prior art keywords
distance
image
lens
display device
user
Prior art date
Application number
PCT/CN2018/091076
Other languages
English (en)
French (fr)
Inventor
刘木
Original Assignee
华为技术有限公司
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 filed Critical 华为技术有限公司
Priority to EP18874723.2A (EP3683614B1)
Publication of WO2019085487A1
Priority to US16/854,419 (US11115648B2)

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/111Transformation of image signals corresponding to virtual viewpoints, e.g. spatial image interpolation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106Processing image signals
    • H04N13/128Adjusting depth or disparity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/344Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking
    • H04N13/371Image reproducers using viewer tracking for tracking viewers with different interocular distances; for tracking rotational head movements around the vertical axis
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0138Head-up displays characterised by optical features comprising image capture systems, e.g. camera

Definitions

  • the present disclosure relates to the field of display devices, and in particular, to a display device, a method and apparatus for adjusting image presentation of a display device.
  • the display device acquires the left pupil image and the right pupil image of the user, determines the left pupil coordinate and the right pupil coordinate, calculates the pupil distance according to the two coordinates, and adjusts the positions of the left lens and the right lens according to the obtained pupil distance,
  • so that the distance from each of the left lens and the right lens to the central axis of the display device is half of the interpupillary distance.
  • The interpupillary distance is the distance between the center point of the left pupil and the center point of the right pupil. Specifically, referring to FIG. 1, the center point O between the two eyes is defined as the origin and the horizontal direction is defined as the x-axis direction; the distance from the leftmost position A' of the left pupil to the origin and the distance from the rightmost position B' of the right pupil to the origin are obtained, the interpupillary distance (ipd) is calculated from them by a formula (not reproduced here), and the distance between each of the left lens and the right lens and the central axis is adjusted to ipd/2.
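  • For orientation, the Python sketch below illustrates this related-art flow: an interpupillary distance is derived from the pupil positions in a coordinate system whose origin O lies midway between the eyes, and each lens is moved to ipd/2 from the central axis. The patent's own formula based on A' and B' is not reproduced in this text, so the center-to-center computation and all values here are assumptions.

```python
def related_art_ipd(left_pupil_center_x: float, right_pupil_center_x: float) -> float:
    """Interpupillary distance as the distance between the two pupil centers.

    Coordinates are along the x-axis, with the origin O midway between the
    eyes (assumption: the patent's exact A'/B'-based formula is not
    reproduced in this text).
    """
    return abs(right_pupil_center_x - left_pupil_center_x)


def lens_offsets_from_axis(ipd: float) -> tuple[float, float]:
    """Each lens is placed at half the interpupillary distance from the
    longitudinal central axis of the display device."""
    half = ipd / 2.0
    return half, half  # (left lens offset, right lens offset)


# Example: pupil centers at -31.5 mm and +31.5 mm give ipd = 63 mm,
# so each lens is moved to 31.5 mm from the central axis.
print(lens_offsets_from_axis(related_art_ipd(-31.5, 31.5)))
```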
  • However, the distance between the user's pupils and the display device may vary, resulting in different sizes of the captured pupil images and hence a lower precision of the measured interpupillary distance; adjusting the lens positions according to such a low-precision interpupillary distance cannot achieve clear viewing.
  • The embodiments of the present disclosure provide a display device, and a method and an apparatus for adjusting image presentation of a display device, which can solve the problem in the related art that the precision of the measured interpupillary distance is low. The technical solutions are as follows:
  • In a first aspect, a display device is provided, including a display device body. The display device body includes a first lens barrel and a second lens barrel; a first end of the first lens barrel is provided with a first lens, and a first end of the second lens barrel is provided with a second lens;
  • the display device body is further provided with a first distance sensor and a second distance sensor;
  • the first distance sensor and the second distance sensor are symmetrically disposed on both sides of a longitudinal center axis of the display device body;
  • the first distance sensor is configured to measure a distance between the left eyeball and the first lens
  • the second distance sensor is configured to measure a distance between the right eyeball and the second lens.
  • In this way, two distance sensors are disposed on the display device body: the distance between the left eyeball and the first lens is measured by the first distance sensor, and the distance between the right eyeball and the second lens is measured by the second distance sensor.
  • Normalizing the left pupil image and the right pupil image according to these two distances offsets the influence of the distance between the pupils and the display device, and helps obtain a high-precision interpupillary distance.
  • A clear viewing effect can then be achieved by adjusting the lens positions based on this high-precision interpupillary distance.
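  • The Python sketch below summarizes this idea end to end, assuming hypothetical callables for image capture, distance sensing, image scaling, pupil localization, and lens motor control (none of these names come from the patent); the roughly 15 mm reference distance is taken from the detailed description later in this document.

```python
# Minimal sketch of the overall idea, with hypothetical callables supplied by
# the caller. This is not the patent's own code.

REFERENCE_DISTANCE_MM = 15.0  # preset reference distance (about 15 mm per the description)

def adjust_image_presentation(capture_images, measure_distances,
                              scale_image, locate_pupils_mm, move_lenses_mm):
    first_image, second_image = capture_images()            # left-eye and right-eye images
    first_distance, second_distance = measure_distances()   # from the two distance sensors

    # Normalize each image to the size it would have at the reference distance,
    # cancelling the effect of how far the pupils sit from the display device.
    first_norm = scale_image(first_image, first_distance / REFERENCE_DISTANCE_MM)
    second_norm = scale_image(second_image, second_distance / REFERENCE_DISTANCE_MM)

    # Interpupillary distance from the normalized images, then move each lens
    # to half that distance from the longitudinal central axis.
    ipd_mm = locate_pupils_mm(first_norm, second_norm)
    move_lenses_mm(ipd_mm / 2.0)
    return ipd_mm
```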
  • the first distance sensor is disposed in an edge region of the first end of the first lens barrel, and the second distance sensor is disposed in an edge region of the first end of the second lens barrel.
  • the first distance sensor is disposed at an intersection of an edge region of the first end of the first lens barrel and a lateral central axis of the display device body that is near the longitudinal central axis;
  • the second distance sensor is disposed at an intersection of an edge region of the first end of the second barrel and a lateral center axis of the display device body near the longitudinal center axis.
  • In this way, the distance measured by each of the first distance sensor and the second distance sensor is actually the distance between the inner corner of the corresponding eye and that distance sensor, so the measured distance is not affected by differences in the depth of the user's eye sockets, which improves the accuracy of the measured distance.
  • the first distance sensor is disposed outside the edge region of the first end of the first lens barrel, and the second distance sensor is disposed outside the edge region of the first end of the second lens barrel.
  • the first distance sensor is disposed on a first vertical line, and the first vertical line is a line parallel to the longitudinal central axis and passing through a lens center of the first lens;
  • the second distance sensor is disposed on a second vertical line that is a line parallel to the longitudinal center axis and passing through the center of the lens of the second lens.
  • a light emitting diode (LED) unit is disposed in the edge region of the first end of the first lens barrel, and an LED unit is also disposed in the edge region of the first end of the second lens barrel.
  • In a second aspect, a method for adjusting image presentation of a display device is provided, including:
  • the first image includes an image of the left eyeball of the user;
  • the second image includes an image of the right eyeball of the user;
  • the first lens is the lens, among all lenses of the display device, that is closest to the left eyeball of the user;
  • the second lens is the lens, among all lenses of the display device, that is closest to the right eyeball of the user;
  • the position of the first lens and/or the second lens is adjusted according to the user's interpupillary distance.
  • In this solution, the first image and the second image are normalized according to the first distance between the first lens and the left eyeball and the second distance between the second lens and the right eyeball, respectively.
  • Both the normalized first image and the normalized second image eliminate the influence of the distance between the user's pupils and the display device, so the interpupillary distance calculated from the normalized first image and the normalized second image is not subject to errors caused by differences in that distance, and its precision is high. By adjusting the lens positions according to this high-precision interpupillary distance, a clear viewing effect can be achieved.
  • the normalizing of the first image and the second image according to the first distance and the second distance respectively includes:
  • the first image is scaled using a first scaling factor, and the second image is scaled using a second scaling factor.
  • In another possible design, the wearing state of the display device is detected before the first image and the second image are respectively normalized according to the first distance and the second distance; when the wearing state is normal, the normalization is performed and the position of the lens is adjusted, which avoids the problem that the measured interpupillary distance is inaccurate due to tilt of the display device.
  • the detecting of the wearing state of the display device according to the first distance and the second distance includes:
  • the detecting of the wearing state of the display device according to the first image and the second image includes:
  • the detecting of the wearing state of the display device according to the first image and the second image includes:
  • after the detecting of the wearing state of the display device according to the first distance and the second distance, or after the detecting of the wearing state of the display device according to the first image and the second image, the method further includes:
  • the user is prompted to re-wear the display device according to a preset prompt manner.
  • In another possible design, the blinking state of the user is detected before the first image and the second image are respectively normalized according to the first distance and the second distance; when the blinking state is normal, the normalization is performed and the position of the lens is adjusted. This avoids the problem that the measured interpupillary distance is inaccurate because of an abnormal blinking state, such as the user's eyes being closed, when the images are captured.
  • the detecting of the blink state of the user according to the first distance, the second distance, the first image, and the second image includes:
  • the preset pupil diameter is a pupil diameter detected when the distance between the first lens and the left eyeball of the user is a preset reference distance and the blink state is normal, or the preset pupil diameter is a pupil diameter detected when the distance between the second lens and the right eyeball of the user is the preset reference distance and the blink state is normal; the left pupil diameter is determined according to the first image, and the right pupil diameter is determined according to the second image;
  • the first scaling factor is a ratio of the first distance to the preset reference distance
  • the second scaling factor is a ratio of the second distance to the preset reference distance
  • the preset pupil diameter is an average value of sample pupil diameters detected based on a plurality of sample first images or a plurality of sample second images acquired by the display device.
  • the method further includes:
  • a correspondence between the user identifier and the user's interpupillary distance is stored.
  • In this way, the correspondence between the user identifier and the user's interpupillary distance can be stored. The next time the same user wears the display device, the user's interpupillary distance can be obtained directly from the stored correspondence according to the user identifier, without re-measuring it. This helps obtain the interpupillary distance quickly, so the distance between the lenses can be adjusted quickly, saving time and improving efficiency.
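  • A minimal sketch of this caching idea, assuming a simple in-memory dictionary keyed by a user identifier; the patent does not specify how or where the correspondence is persisted.

```python
# Hypothetical cache of user identifier -> interpupillary distance (mm).
ipd_cache: dict[str, float] = {}

def get_ipd_mm(user_id: str, measure_ipd_mm) -> float:
    """Return the stored interpupillary distance for this user, measuring it
    only on the first wear and storing the correspondence for later wears."""
    if user_id not in ipd_cache:
        ipd_cache[user_id] = measure_ipd_mm()  # full measurement pipeline
    return ipd_cache[user_id]
```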
  • An apparatus for adjusting image presentation of a display device is provided, including a plurality of functional modules configured to implement the method for adjusting image presentation of a display device according to the second aspect or any possible design of the second aspect.
  • A computer readable storage medium is provided, the storage medium storing instructions which, when executed by a processor, perform the method for adjusting image presentation of a display device according to the second aspect or any possible design of the second aspect.
  • FIG. 1 is a schematic diagram of a method for adjusting image presentation of a display device provided by the related art
  • FIG. 2 is a schematic structural diagram of a display device according to an embodiment of the present disclosure
  • FIG. 3 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 4 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 5 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 6 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 7 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 8 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 9 is a schematic structural diagram of a display device according to an embodiment of the present disclosure.
  • FIG. 10 is a flowchart of a method for adjusting image presentation of a display device according to an embodiment of the present disclosure
  • FIG. 11 is a schematic diagram of a method for adjusting image presentation of a display device according to an embodiment of the present disclosure
  • FIG. 12 is a block diagram of an apparatus for adjusting image presentation of a display device according to an embodiment of the present disclosure.
  • In embodiments of the present disclosure, the display device is a head mounted display device, and may be a virtual reality device, such as a virtual reality helmet or virtual reality glasses, or may be an augmented reality device, such as an augmented reality helmet.
  • the display device includes a display device body 201, a first distance sensor 202 and a second distance sensor 203 disposed on the display device body 201.
  • the display device body 201 includes a first lens barrel 2011 and a second lens barrel 2012. The first end of the first lens barrel 2011 is provided with a first lens 2013, and the first end of the second lens barrel 2012 is provided with a second lens 2014.
  • the display device body 201 may further include a first camera module, a second camera module, a first motor, a second motor, a display screen, and a processor.
  • the display device body 201 may further include other components used to implement the display function.
  • the first lens barrel 2011 and the second lens barrel 2012 each have two opposite ends. For ease of description, the end of each lens barrel that faces the user when the display device is normally worn is referred to as the first end.
  • a first lens 2013 is fixed to an edge region of the first end of the first barrel 2011, and a second lens 2014 is fixed to an edge region of the first end of the second barrel 2012.
  • This fixing method can include the following two designs.
  • the manner in which the first lens 2013 is fixed to the edge region of the first barrel 2011 will be described as an example, and the manner in which the edge region of the second barrel 2012 fixes the second lens 2014 is the same.
  • the edge area of the first end of the first barrel 2011 may be the lens cover of the first barrel 2011.
  • In one design, the edge region is tightly engaged with the first lens 2013 to buckle the first lens 2013 onto the first end of the first lens barrel 2011.
  • Alternatively, a plurality of screws may pass through the edge region and be inserted into a plurality of through holes in the outer edge of the first lens 2013 to fix the edge region to the first lens 2013.
  • The edge region can also be fixed to the first lens 2013 by glue or welding.
  • the edge region of the first end of the first barrel 2011 may be the edge of the first end of the first barrel 2011.
  • the inner wall of the edge region closely fits the outer edge of the first lens 2013, and the first lens 2013 is sleeved inside the edge region and is fixed in the edge region to prevent slippage from the edge region.
  • the inner wall of the edge region may be fixedly connected to the outer edge of the first lens 2013 by glue, screw, welding or the like.
  • the first lens 2013 may be a positive focal length lens or a negative focal length lens.
  • the first lens 2013 may include a first mirror surface and a second mirror surface; the first mirror surface faces the first end, and faces the left eye of the user when the display device is normally worn.
  • the first mirror surface may be convex and protrude in a direction toward the first end.
  • the second mirror surface faces away from the first mirror surface to face the interior of the display device when the display device is normally worn.
  • the second mirror surface may be a concave surface, and the recessed direction is also toward the first end.
  • the first mirror surface and the second mirror surface may both be convex surfaces, or the first mirror surface may be a concave surface and the second mirror surface may be a convex surface.
  • the second lens 2014 is similar to the first lens 2013; that is, the second lens 2014 may include a third mirror surface and a fourth mirror surface. The third mirror surface faces the first end, and faces the right eye of the user when the display device is normally worn.
  • the third mirror surface may be convex and protrude in a direction toward the first end.
  • the fourth mirror surface is opposite to the third mirror surface, and faces the interior of the display device when the display device is normally worn.
  • the fourth mirror surface may be a concave surface, with the recessed direction also toward the first end; alternatively, the third mirror surface and the fourth mirror surface may both be convex surfaces, or the third mirror surface may be a concave surface and the fourth mirror surface may be a convex surface.
  • the first distance sensor 202 and the second distance sensor 203 are symmetrically disposed on both sides of the longitudinal central axis of the display device body 201. The longitudinal central axis refers to the central axis of the display device in the vertical direction, and the symmetric arrangement means that the first distance sensor 202 and the second distance sensor 203 are respectively disposed on the left and right sides of the longitudinal central axis, and that the distance between the first distance sensor 202 and the longitudinal central axis may be equal to the distance between the second distance sensor 203 and the longitudinal central axis.
  • the first distance sensor 202 may be disposed on the left side of the longitudinal central axis for measuring the distance between the left eyeball and the first lens 2013, and the first distance sensor 202 may be a distance sensor with high measurement distance accuracy. That is, a high-precision distance sensor, such as a laser distance sensor, an infrared distance sensor, a microwave distance sensor, or the like.
  • the second distance sensor 203 may be disposed on the right side of the longitudinal center axis for measuring the distance between the right eyeball and the second lens 2014.
  • the second distance sensor 203 may also be a high-precision distance sensor, such as a laser distance sensor, an infrared distance sensor, a microwave distance sensor, or the like.
  • In a first possible design, referring to FIG. 3, the first distance sensor 202 is disposed in the edge region of the first end of the first lens barrel 2011, and the second distance sensor 203 is disposed in the edge region of the first end of the second lens barrel 2012.
  • That is, the two distance sensors are respectively disposed on the edge regions of the two lens barrels: the distance between the first distance sensor 202 and the lens center of the first lens 2013 is equal to the radius of the first lens 2013, and the distance between the second distance sensor 203 and the lens center of the second lens 2014 is equal to the radius of the second lens 2014.
  • the first distance sensor 202 may be disposed at any position of the edge region of the first end of the first barrel 2011, and the second distance sensor 203 may be disposed at any position of the edge region of the first end of the second barrel 2012, It is sufficient that the first distance sensor 202 and the second distance sensor 203 are symmetrically disposed about the longitudinal center axis.
  • the design provides the following two specific methods.
  • In the first method, the first distance sensor 202 is disposed at the intersection of the edge region of the first end of the first lens barrel 2011 and the lateral central axis of the display device body 201 that is near the longitudinal central axis, and the second distance sensor 203 is disposed at the intersection of the edge region of the first end of the second lens barrel 2012 and the lateral central axis of the display device body 201 that is near the longitudinal central axis.
  • Specifically, the edge region of the first end of the first lens barrel 2011 has two intersections with the lateral central axis, and the first distance sensor 202 may be disposed at the one of the two intersections that is nearer to the longitudinal central axis; in other words, the first distance sensor 202 may be disposed at the position, among all positions in the edge region of the first end of the first lens barrel 2011, that is closest to the longitudinal central axis.
  • Similarly, the edge region of the first end of the second lens barrel 2012 also has two intersections with the lateral central axis, and the second distance sensor 203 may be disposed at the one of the two intersections that is nearer to the longitudinal central axis; in other words, the second distance sensor 203 may be disposed at the position, among all positions in the edge region of the first end of the second lens barrel 2012, that is closest to the longitudinal central axis.
  • In this case, the first distance sensor 202 is located directly in front of the inner corner of the user's left eye, so the measured distance is actually the distance between the inner corner of the left eye and the first distance sensor 202. Since the center of the first distance sensor 202 is in the same vertical plane as the lens center of the first lens 2013, and the inner corner of the left eye is in the same vertical plane as the center of the left pupil, the distance measured by the first distance sensor 202 is effectively the distance between the center of the left pupil and the lens center of the first lens 2013.
  • Likewise, the second distance sensor 203 is located directly in front of the inner corner of the user's right eye, so it measures the distance between the inner corner of the right eye and the second distance sensor 203. Since the center of the second distance sensor 203 is in the same vertical plane as the lens center of the second lens 2014, and the inner corner of the right eye is in the same vertical plane as the center of the right pupil, the distance measured by the second distance sensor 203 is effectively the distance between the center of the right pupil and the lens center of the second lens 2014.
  • Therefore, the distance measured by the first distance sensor 202 is the exact distance between the left eyeball and the first lens 2013, and the distance measured by the second distance sensor 203 is the exact distance between the right eyeball and the second lens 2014. When the lens positions are then adjusted based on these accurate distances, the distance between the first lens 2013 and the second lens 2014 will closely match the user's interpupillary distance, ensuring a clear viewing effect.
  • In addition, because the distance measured by either distance sensor is actually the distance between the inner corner of the eye and that distance sensor, the measurement is not affected by the depth of the user's eye socket. This avoids the measured distance being too large, that is, greater than the actual distance between the eyeball and the lens, when a user with a deep eye socket wears the display device, and avoids the measured distance being too small, that is, less than the actual distance between the eyeball and the lens, when a user with a shallow eye socket wears the display device.
  • In the second method, the first distance sensor 202 is disposed on a first vertical line, and the second distance sensor 203 is disposed on a second vertical line.
  • the first vertical line is a line parallel to the longitudinal central axis and passing through the lens center of the first lens 2013, and the first vertical line has two intersections with the edge region of the first end of the first lens barrel 2011.
  • the first distance sensor 202 may be disposed at the upper one of the two intersections, that is, directly above the lens center of the first lens 2013 at the position whose distance from the lens center is equal to the lens radius.
  • Alternatively, the first distance sensor 202 may be disposed at the lower one of the two intersections, that is, directly below the lens center of the first lens 2013 at the position whose distance from the lens center is equal to the lens radius.
  • the second vertical line is a line parallel to the longitudinal center axis and passing through the lens center of the second lens 2014, and the second vertical line also has two intersections with the edge area of the first end of the second barrel 2012.
  • Similarly, the second distance sensor 203 may be disposed at the upper one of the two intersections or at the lower one of the two intersections.
  • the display device body 201 may further include an LED unit 205, and the two distance sensors may actually be disposed on the LED units 205.
  • the LED unit 205 may be disposed in an edge region of the first end of the first lens barrel 2011.
  • the LED unit 205 includes a plurality of LEDs arranged in a ring centered on the lens center of the first lens 2013 with a radius equal to the radius of the first lens 2013, and the first distance sensor 202 can be disposed on this LED ring.
  • the LED unit 205 may be disposed in the edge region of the first end of the second lens barrel 2012, and the second distance sensor 203 may be disposed on the LED unit 205 of the second lens barrel 2012.
  • In a second possible design, the first distance sensor 202 is disposed outside the edge region of the first end of the first lens barrel 2011, and the second distance sensor 203 is disposed outside the edge region of the first end of the second lens barrel 2012.
  • In this case, the two distance sensors are respectively disposed around the edge regions of the two lens barrels: the distance between the first distance sensor 202 and the lens center of the first lens 2013 may be greater than the radius of the first lens 2013, and the distance between the second distance sensor 203 and the lens center of the second lens 2014 may be greater than the radius of the second lens 2014.
  • For example, the first distance sensor 202 may be disposed on the first vertical line, that is, directly above the lens center of the first lens 2013 at any position whose distance from the lens center is greater than the lens radius, or directly below the lens center of the first lens 2013 at any position whose distance from the lens center is greater than the lens radius.
  • Similarly, the second distance sensor 203 may be disposed on the second vertical line, that is, directly above the lens center of the second lens 2014 at any position whose distance from the lens center is greater than the lens radius, or directly below the lens center of the second lens 2014 at any position whose distance from the lens center is greater than the lens radius.
  • the display device body 201 may further include a protection pad 206 for protecting the first lens barrel 2011 and the second lens barrel 2012 so that they are not damaged; the protection pad also makes wearing the display device more comfortable for the user.
  • the first distance sensor 202 and the second distance sensor 203 may be actually disposed on the protection pad 206.
  • the first camera module may be disposed in the first lens barrel 2011 with a fixed focal length and orientation, capable of capturing an image of the user's left eyeball, and transmitting the image to the processor.
  • the second camera module may be disposed in the second lens barrel 2012 with a fixed focal length and orientation, capable of capturing an image of the user's right eyeball, and transmitting the image to the processor.
  • the first motor is configured to drive the movement of the first lens 2013 under the control of the processor to adjust the position of the first lens 2013, and the second motor is configured to drive the movement of the second lens 2014 under the control of the processor to adjust the position of the second lens 2014.
  • the scheme in which the processor adjusts the positions of the first lens 2013 and the second lens 2014 is described in detail in the embodiment shown in FIG. 10 below.
  • the display screen is used to display images and present a virtual or enhanced world in front of the user's eyes, which may be a thin film transistor liquid crystal display (LCD) screen or other type of screen.
  • It should be noted that, in the related art, only one distance sensor is disposed on the display device, and it is used to detect an obstacle within a certain range, thereby detecting whether the user has put on the display device, so that the processor determines whether to start the display device according to the detection result.
  • In that case, the processor obtains the interpupillary distance based only on the captured pupil images and then adjusts the lens positions according to that interpupillary distance. The result is affected by the distance between the pupils and the display device and its precision is low, so clear viewing may not be achieved after the lens positions are adjusted.
  • In the embodiments of the present disclosure, by contrast, two distance sensors are disposed on the display device body: the distance between the left eyeball and the first lens is measured by the first distance sensor, and the distance between the right eyeball and the second lens is measured by the second distance sensor. Normalizing the left pupil image and the right pupil image according to these two distances offsets the influence of the distance between the pupils and the display device and helps obtain a high-precision interpupillary distance, so a clear viewing effect can be achieved by adjusting the lens positions based on this high-precision interpupillary distance.
  • In addition, the first distance sensor may be disposed at the intersection of the edge region of the first end of the first lens barrel and the lateral central axis of the display device body that is near the longitudinal central axis, and the second distance sensor may be disposed at the intersection of the edge region of the first end of the second lens barrel and the lateral central axis of the display device body that is near the longitudinal central axis. The distance measured by the first distance sensor and the second distance sensor is then actually the distance between the inner corner of the eye and the corresponding distance sensor, which ensures that the measured distance is not affected by differences in the depth of the user's eye sockets and improves the accuracy of the measured distance.
  • In an exemplary embodiment, a computer readable storage medium, such as a memory including instructions, is also provided; the instructions are executable by a processor in an electronic device to perform the method for adjusting image presentation of a display device in the following embodiments.
  • For example, the computer readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
  • FIG. 10 is a flowchart of a method for adjusting an image presentation of a display device according to an exemplary embodiment.
  • The method is performed by the processor of the display device described in the foregoing embodiments. As shown in FIG. 10, the method includes the following steps:
  • the processor acquires a first image and a second image.
  • As an example, an image including the left eyeball of the user is referred to as a first image, and an image including the right eyeball of the user is referred to as a second image.
  • the processor may capture the left eyeball of the user through the first camera module when receiving the shooting instruction, obtain the first image, and capture the right eyeball of the user through the second camera module to obtain the second image.
  • the shooting instruction may be triggered by an operation of wearing the display device or triggered by an operation of activating the display device.
  • the display device may play the virtual perspective image, thereby guiding the user's line of sight toward the front, ensuring that the left pupil and the right pupil of the user directly look at the display device.
  • the processor obtains a first distance between the first lens and a left eyeball of the user.
  • the processor obtains a second distance between the second lens and the right eye of the user.
  • the processor may measure the distance between the first lens and the left eyeball of the user by the first distance sensor to obtain a first distance, and measure a distance between the second lens and the right eyeball of the user by the second distance sensor, Get the second distance.
  • the first lens is a lens which is the closest to the left eyeball of the user among all the lenses of the display device
  • the second lens is a lens which is the closest to the right eye of the user among all the lenses of the display device.
  • the processor detects a wearing state of the display device.
  • the processor can detect the wearing state of the display device according to the first distance and the second distance, as described in the first possible design below.
  • the processor may detect the wearing state of the display device according to the first image and the second image, as described in the second possible design or the third possible design below.
  • the processor may calculate a difference between the first distance and the second distance, and determine that the display device is worn properly when the difference is less than the first preset difference.
  • After calculating the difference between the first distance and the second distance, the processor may determine whether the difference is less than the first preset difference, thereby determining whether the display device is worn properly.
  • the first preset difference is used to indicate a maximum difference between the first distance and the second distance when the display device is worn normally, and the first preset difference may be pre-stored in the display device.
  • When the difference is less than the first preset difference, the first distance and the second distance may be considered relatively close, that is, the distance between the left eyeball and the first lens and the distance between the right eyeball and the second lens are substantially equal. In this case, the direction of the line connecting the two lenses is substantially parallel to the direction of the line connecting the two eyes, that is, the two lenses are on the same horizontal line, and it can be determined that the display device is properly worn.
  • When the difference is not less than the first preset difference, the first distance and the second distance may be considered to differ considerably, that is, the distance between the left eyeball and the first lens and the distance between the right eyeball and the second lens differ greatly. The direction of the line connecting the two lenses is then offset from the direction of the line connecting the two eyes, that is, the two lenses are not on the same horizontal line, and it can be determined that the display device is worn tilted.
  • Specifically, if the first distance is greater than the second distance and the difference is not less than the first preset difference, it indicates that the tilt is such that the first lens is forward and the second lens is backward.
  • If the first distance is smaller than the second distance and the difference is not less than the first preset difference, it indicates that the tilt is such that the first lens is backward and the second lens is forward.
  • When the difference is less than the first preset difference, the processor determines that the display device is properly worn.
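  • A minimal sketch of this first design in Python; the threshold value and the millimetre units are illustrative assumptions, not values from the patent.

```python
FIRST_PRESET_DIFFERENCE_MM = 2.0  # illustrative threshold, not from the patent text

def detect_wearing_state(first_distance_mm: float, second_distance_mm: float) -> str:
    """First possible design: compare the two eye-to-lens distances.

    Returns "normal" when the device sits level; otherwise reports the tilt
    direction inferred from which measured distance is larger.
    """
    difference = abs(first_distance_mm - second_distance_mm)
    if difference < FIRST_PRESET_DIFFERENCE_MM:
        return "normal"  # distances nearly equal -> properly worn
    if first_distance_mm > second_distance_mm:
        return "tilted: first lens forward, second lens backward"
    return "tilted: first lens backward, second lens forward"

# Example: a 5 mm mismatch between the two measured distances reports a tilt.
print(detect_wearing_state(18.0, 13.0))
```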
  • This design may include the following steps 1 to 4.
  • Step 1: Obtain the point coordinates of the pupil centers in the two images.
  • Specifically, the processor may establish a coordinate system for the first image, perform image recognition on the first image, and identify the position of the pupil center in the coordinate system of the first image, thereby obtaining the point coordinate of the pupil center in that coordinate system, that is, the point coordinate of the center of the left pupil.
  • Similarly, the processor may establish a coordinate system for the second image, perform image recognition on the second image, and identify the position of the pupil center in the coordinate system of the second image, thereby obtaining the point coordinate of the pupil center in that coordinate system, that is, the point coordinate of the center of the right pupil.
  • Step 2: Determine whether the point coordinate of the center of the left pupil belongs to a preset range of the first image. When it belongs to the preset range of the first image, perform step 3; when it does not belong to the preset range of the first image, determine that the display device is not properly worn.
  • Specifically, the processor may obtain the preset range of the first image and determine whether the point coordinate belongs to it. When the point coordinate of the pupil center in the first image belongs to the preset range of the first image, the center position of the left pupil is normal, that is, the device is worn normally with respect to the left eye; otherwise, the processor may determine that the display device is worn tilted.
  • the preset range of the first image is used to indicate the range within which the point coordinates of the pupil center in the captured image of the left eyeball fall when the display device is worn normally.
  • the preset range of the first image may be pre-stored in the display device.
  • Determining whether the point coordinates of the pupil center in the first image belong to the preset range of the first image can be realized by determining whether the point coordinates satisfy a preset formula (not reproduced here), where x1 represents the abscissa of the point coordinates of the pupil center in the first image and y1 represents the ordinate of the point coordinates of the pupil center in the first image.
  • Step 3: Determine whether the point coordinate of the center of the right pupil belongs to the preset range of the second image. When it belongs to the preset range of the second image, perform step 4; when it does not belong to the preset range of the second image, determine that the display device is not properly worn.
  • Specifically, the processor may acquire the preset range of the second image and determine whether the point coordinate belongs to it. When the point coordinate of the pupil center in the second image belongs to the preset range of the second image, the center position of the right pupil is normal, that is, the device is worn normally with respect to the right eye; otherwise, the processor may determine that the display device is worn tilted.
  • the preset range of the second image is used to indicate the range within which the point coordinates of the pupil center in the captured image of the right eyeball fall when the display device is worn normally.
  • the preset range of the second image may be pre-stored in the display device.
  • Determining whether the point coordinates of the pupil center in the second image belong to the preset range of the second image can be realized by determining whether the point coordinates satisfy a preset formula (not reproduced here), where x2 represents the abscissa of the point coordinates of the pupil center in the second image and y2 represents the ordinate of the point coordinates of the pupil center in the second image.
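  • Because the formulas for the preset ranges are not reproduced above, the sketch below simply assumes each preset range is an axis-aligned rectangle in the image coordinate system; all bounds are illustrative.

```python
# Assumed rectangular preset ranges (x_min, x_max, y_min, y_max), in pixels.
PRESET_RANGE_FIRST_IMAGE = (200, 440, 160, 320)
PRESET_RANGE_SECOND_IMAGE = (200, 440, 160, 320)

def in_preset_range(x: float, y: float,
                    preset_range: tuple[float, float, float, float]) -> bool:
    """True when the pupil-center point (x, y) falls inside the preset range."""
    x_min, x_max, y_min, y_max = preset_range
    return x_min <= x <= x_max and y_min <= y <= y_max

def wearing_state_from_pupil_centers(left_center, right_center) -> str:
    """Second possible design: both pupil centers must fall in their preset ranges."""
    if not in_preset_range(*left_center, PRESET_RANGE_FIRST_IMAGE):
        return "not properly worn"
    if not in_preset_range(*right_center, PRESET_RANGE_SECOND_IMAGE):
        return "not properly worn"
    return "normal"

print(wearing_state_from_pupil_centers((320, 240), (318, 236)))  # -> "normal"
```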
  • Step 4: Determine that the display device is properly worn.
  • the processor determines that the display device is properly worn.
  • In the third possible design, the processor may calculate the distance between the point coordinate of the pupil center in the first image and the reference position in the first image, and the distance between the point coordinate of the pupil center in the second image and the reference position in the second image; when the difference between the two distances is less than the second preset difference, it is determined that the display device is worn properly.
  • the design may specifically include the following steps 1 to 3.
  • Step 1: Obtain the distance between the point coordinate of the pupil center in the first image and the reference position in the first image.
  • Specifically, the processor can establish a coordinate system for the first image and perform image recognition on the first image to identify the position of the pupil center in that coordinate system, thereby obtaining the point coordinate of the pupil center, that is, the point coordinate of the center of the left pupil; the processor then takes any position in the coordinate system of the first image as the reference position, obtains the point coordinate of that reference position, and calculates the distance between the point coordinate of the center of the left pupil and the point coordinate of the reference position.
  • the display device can use the center of the first image, that is, the origin of the coordinate system of the first image as the reference position of the first image.
  • Step 2: Obtain the distance between the point coordinate of the pupil center in the second image and the reference position in the second image.
  • Specifically, the processor can establish a coordinate system for the second image and perform image recognition on the second image to identify the position of the pupil center in that coordinate system, thereby obtaining the point coordinate of the pupil center, that is, the point coordinate of the center of the right pupil; the processor then takes any position in the coordinate system of the second image as the reference position, obtains the point coordinate of that reference position, and calculates the distance between the point coordinate of the center of the right pupil and the point coordinate of the reference position.
  • the display device can use the center of the second image, that is, the origin of the coordinate system of the second image as the reference position of the second image.
  • Note that this design requires the point coordinate of the reference position in the first image to be the same as the point coordinate of the reference position in the second image; that is, if the point (M, N) in the coordinate system of the first image is taken as the reference position of the first image, the point (M, N) in the coordinate system of the second image must also be taken as the reference position of the second image, to ensure that the difference calculated subsequently correctly reflects the wearing condition of the display device.
  • Step 3: Calculate the difference between the two distances and determine whether the difference is smaller than the second preset difference. When the difference is less than the second preset difference, determine that the display device is properly worn; when the difference is not less than the second preset difference, determine that the display device is not properly worn.
  • The distance between the point coordinate of the pupil center in the first image and the reference position of the first image reflects the relative positional relationship between the center of the left pupil and the reference position of the first image, and the distance between the point coordinate of the pupil center in the second image and the reference position of the second image reflects the relative positional relationship between the center of the right pupil and the reference position of the second image. Because the point coordinate of the reference position in the first image is the same as that in the second image, the difference between the two distances reflects the positional relationship between the center of the left pupil and the center of the right pupil. If the difference is smaller than the second preset difference, it can be determined that the distance between the center of the left pupil and the longitudinal central axis is approximately equal to the distance between the center of the right pupil and the longitudinal central axis, and that the height of the center of the left pupil is approximately equal to the height of the center of the right pupil, so the display device is properly worn. If the difference is not less than the second preset difference, the left pupil center and the right pupil center may be offset relative to each other, for example one closer to the axis than the other, or one higher than the other.
  • The second preset difference is used to indicate the maximum allowed difference between the distance from the point coordinate of the pupil center in the first image to the reference position in the first image and the distance from the point coordinate of the pupil center in the second image to the reference position in the second image. The second preset difference may be determined according to the actual accuracy requirement and may be pre-stored in the display device.
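  • A minimal sketch of this third design, assuming Euclidean pixel distances, the image center (taken as the coordinate origin) as the common reference position, and an illustrative threshold.

```python
import math

SECOND_PRESET_DIFFERENCE_PX = 20.0  # illustrative threshold, not from the patent text

def distance_to_reference(pupil_center, reference=(0.0, 0.0)) -> float:
    """Euclidean distance from a pupil-center point to the reference position.
    The description allows any reference position; the image center is the
    example it gives, used here as the coordinate origin."""
    return math.hypot(pupil_center[0] - reference[0], pupil_center[1] - reference[1])

def wearing_state_from_reference_distances(left_center, right_center) -> str:
    """Third possible design: compare the two pupil-to-reference distances."""
    d_left = distance_to_reference(left_center)
    d_right = distance_to_reference(right_center)
    if abs(d_left - d_right) < SECOND_PRESET_DIFFERENCE_PX:
        return "normal"
    return "not properly worn"

print(wearing_state_from_reference_distances((12.0, -3.0), (14.0, 2.0)))  # -> "normal"
```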
  • When it is determined that the display device is not properly worn, the processor prompts the user to re-wear the display device according to a preset prompt manner.
  • the preset prompting manner may include displaying the first prompting page, issuing the first prompting voice, and the like.
  • the first prompt page may include text information, picture information, and the like for prompting to re-wear the display device.
  • For example, the text information may be “the current wearing method is incorrect, please re-wear the device”, and the first prompt voice may be a voice prompting the user to re-wear the display device.
  • In this way, the user knows that the display device needs to be re-worn and re-wears it, after which the display device performs the above step 1001 again.
  • the processor detects a blink status of the user according to the first distance, the second distance, the first image, and the second image.
  • That is, the blinking state of the user is first detected according to the first distance, the second distance, the first image, and the second image; when the blinking state is abnormal, the following step 1007 is performed, and when the blinking state is normal, the following step 1008 is performed.
  • the specific process of detecting whether the blink state is normal may include the following steps 1 to 5.
  • Step 1: The processor calculates the ratio between the left pupil diameter and the preset pupil diameter to obtain a first specified coefficient.
  • If the processor adopted the second possible design or the third possible design when performing the above step 1004, image recognition has already been performed on the first image, and the left pupil diameter may have been recognized at the same time; in that case, the left pupil diameter can be obtained directly in this step.
  • If the processor adopted the first possible design when performing the above step 1004, or if the second or third possible design was adopted but the left pupil diameter was not recognized, the processor needs to perform image recognition on the first image in this step to obtain the left pupil diameter.
  • After obtaining the left pupil diameter, the processor obtains the preset pupil diameter, calculates the ratio between the left pupil diameter and the preset pupil diameter, and takes this ratio as the first specified coefficient.
  • the preset pupil diameter is a pupil diameter that is detected when the distance between the first lens and the user's left eyeball is a preset reference distance and the blink state is normal.
  • the preset pupil diameter can be detected in advance from sample images acquired when a plurality of sample users wear the display device, and the specific process of obtaining the preset pupil diameter can include the following three possible designs.
  • In the first design, sample images are acquired to obtain a plurality of sample first images. For each of the plurality of sample first images, the pupil diameter in that sample first image is obtained as a sample pupil diameter; a plurality of sample pupil diameters are thereby obtained, and the average value of the plurality of sample pupil diameters is taken as the preset pupil diameter.
  • In the second design, sample images are acquired to obtain a plurality of sample second images. For each of the plurality of sample second images, the pupil diameter in that sample second image is obtained as a sample pupil diameter; a plurality of sample pupil diameters are thereby obtained, and the average value of the plurality of sample pupil diameters is taken as the preset pupil diameter.
  • In the third design, a plurality of sample pupil diameters are obtained based on the plurality of sample first images, a further plurality of sample pupil diameters are obtained based on the plurality of sample second images, and the average value of all of these sample pupil diameters is taken as the preset pupil diameter.
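  • A minimal sketch of the third design above (averaging over both sample sets); the sample diameters are illustrative values.

```python
from statistics import mean

# Illustrative sample pupil diameters (in pixels) detected from sample first
# images (left eye) and sample second images (right eye).
left_sample_diameters = [42.0, 44.5, 43.1, 41.8]
right_sample_diameters = [43.2, 42.7, 44.0]

# Third design: the preset pupil diameter is the average over all sample diameters.
preset_pupil_diameter = mean(left_sample_diameters + right_sample_diameters)
print(round(preset_pupil_diameter, 2))
```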
  • Step 2: The processor calculates the ratio between the right pupil diameter and the preset pupil diameter to obtain a second specified coefficient.
  • If the processor adopted the second possible design or the third possible design when performing the above step 1004, image recognition has already been performed on the second image, and the right pupil diameter may have been recognized at the same time; in that case, the right pupil diameter can be obtained directly in this step.
  • If the processor adopted the first possible design when performing the above step 1004, or if the second or third possible design was adopted but the right pupil diameter was not recognized, the processor needs to perform image recognition on the second image in this step to obtain the right pupil diameter.
  • Step 3: The processor calculates the ratio of the first distance to the preset reference distance to obtain a first scaling factor, and calculates the ratio of the second distance to the preset reference distance to obtain a second scaling factor.
  • The first scaling factor and the second scaling factor may be stored, so that the subsequent normalization can be performed based on the first scaling factor and the second scaling factor.
  • Step 4: The processor calculates the difference between the first scaling factor and the first specified coefficient, and the difference between the second scaling factor and the second specified coefficient.
  • the processor may calculate a difference between the first scaling factor minus the first specified coefficient, or calculate a difference between the first specified coefficient and the first scaling factor, and then obtain an absolute value of the obtained difference, and the absolute value The value is taken as the difference between the first scaling factor and the first specified coefficient.
  • the processor may calculate a difference between the second scaling factor minus the second specified coefficient, or calculate a difference between the second specified coefficient and the second scaling factor, and then obtain an absolute value of the obtained difference, the absolute value The value is taken as the difference between the second scaling factor and the second specified coefficient.
  • Step 5: The processor determines whether each of the two differences is less than the third preset difference. When either difference is not less than the third preset difference, it is determined that the blink state is abnormal; when both differences are less than the third preset difference, it is determined that the blink state is normal.
  • Specifically, the display device may obtain the third preset difference, determine whether the difference between the first scaling factor and the first specified coefficient is less than the third preset difference, and determine whether the difference between the second scaling factor and the second specified coefficient is less than the third preset difference. When both differences are smaller than the third preset difference, the left-eye blink state and the right-eye blink state are both normal, and the display device determines that the blink state is normal.
  • the third preset difference is used to indicate a maximum difference between the first scaling factor and the first specified coefficient, or to indicate a maximum difference between the second scaling factor and the second specified coefficient.
  • the third preset difference may be determined according to the demand for accuracy and may be pre-stored in the display device.
  • When the difference between the first scaling factor and the first specified coefficient is not less than the third preset difference, it can be inferred that the left eye may be closed, half-closed, or the like, that is, the left-eye blink state is abnormal, and the display device will determine that the blink state is abnormal.
  • Likewise, when the difference between the second scaling factor and the second specified coefficient is not less than the third preset difference, it can be inferred that the right eye may be closed, half-closed, or the like, that is, the right-eye blink state is abnormal, and the display device will determine that the blink state is abnormal.
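  • A minimal sketch of steps 1 to 5, with illustrative values for the preset pupil diameter, the preset reference distance, and the third preset difference.

```python
PRESET_REFERENCE_DISTANCE_MM = 15.0   # preset reference distance (about 15 mm per the description)
PRESET_PUPIL_DIAMETER_PX = 43.0       # average of sample pupil diameters (illustrative value)
THIRD_PRESET_DIFFERENCE = 0.15        # illustrative tolerance, not from the patent text

def blink_state(first_distance_mm: float, second_distance_mm: float,
                left_pupil_diameter_px: float, right_pupil_diameter_px: float) -> str:
    """Steps 1-5 of the blink-state check: compare each specified coefficient
    (measured pupil diameter / preset pupil diameter) against the scaling
    factor (measured eye-to-lens distance / preset reference distance)."""
    first_specified = left_pupil_diameter_px / PRESET_PUPIL_DIAMETER_PX
    second_specified = right_pupil_diameter_px / PRESET_PUPIL_DIAMETER_PX
    first_scaling = first_distance_mm / PRESET_REFERENCE_DISTANCE_MM
    second_scaling = second_distance_mm / PRESET_REFERENCE_DISTANCE_MM

    if abs(first_scaling - first_specified) >= THIRD_PRESET_DIFFERENCE:
        return "abnormal"   # left eye likely closed or half-closed
    if abs(second_scaling - second_specified) >= THIRD_PRESET_DIFFERENCE:
        return "abnormal"   # right eye likely closed or half-closed
    return "normal"

# Example: at the reference distance with pupils close to the preset diameter.
print(blink_state(15.0, 15.0, 42.5, 43.6))  # -> "normal"
```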
  • When it is determined that the blinking state is abnormal, the processor prompts the user to blink correctly according to a preset prompting manner.
  • the preset prompting manner may include displaying a second prompting page, issuing a second prompting voice, and the like.
  • the second prompt page may include text information, picture information, and the like for prompting the correct blinking.
  • the text information may be “please open your eyes and look straight ahead”, and the second prompt voice is prompting the correct blink. Voice. After the user views the second prompt page or hears the second prompt voice, the user can know that the blink is correct.
  • the present embodiment does not limit the sequence of performing the process of detecting the wearing state and the process of performing the process of detecting the blinking state. That is, the wearing state may be detected first, and the blinking state is detected when it is determined that the wearing state is normal. When it is determined that the wearing state is abnormal, the subsequent steps are not required, and the user is directly prompted to re-wear the display device. It is also possible to detect the blinking state first, and then detect the wearing state when it is determined that the blinking state is normal, and when it is determined that the blinking state is abnormal, there is no need to perform the subsequent steps, and the user is directly prompted to blink correctly.
  • When it is determined that the display device is worn correctly and the eye-open state is normal, the processor normalizes the first image and the second image respectively.
  • the normalization in this step means adjusting the size of an image captured at any distance to the size of the image that would be captured at the preset reference distance; the preset reference distance may be a distance of about 15 mm.
  • Specifically, since the display device has already calculated the first scaling factor and the second scaling factor, in this step 1008 the display device may acquire the two scaling factors, normalize the first image based on the first scaling factor, and normalize the second image based on the second scaling factor.
  • For the first image: when the first scaling factor is greater than 1, that is, the first distance is greater than the preset reference distance, the left pupil is relatively far from the display device and the captured first image is relatively small; the first scaling factor is then used as an enlargement ratio, the first image is enlarged proportionally, the size of the enlarged first image equals the product of the size of the first image and the first scaling factor, and the enlarged first image is used as the normalized first image. When the first scaling factor is less than 1, that is, the first distance is less than the preset reference distance, the captured first image is relatively large; the first scaling factor is used as a reduction ratio, the first image is reduced proportionally, the size of the reduced first image equals the product of the size of the first image and the first scaling factor, and the reduced first image is used as the normalized first image.
  • The same applies to the second image: when the second scaling factor is greater than 1, the second scaling factor is used as an enlargement ratio, the second image is enlarged proportionally, the size of the enlarged second image equals the product of the size of the second image and the second scaling factor, and the enlarged second image is used as the normalized second image; when the second scaling factor is less than 1, the second scaling factor is used as a reduction ratio, the second image is reduced proportionally, the size of the reduced second image equals the product of the size of the second image and the second scaling factor, and the reduced second image is used as the normalized second image. A minimal sketch of this scaling is given below.
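A minimal sketch of the proportional scaling described above, assuming the eye images are handled with the Pillow library (an assumption for illustration; the description does not specify an imaging stack):

```python
from PIL import Image

def normalize_eye_image(image: Image.Image,
                        eye_to_lens_distance_mm: float,
                        preset_reference_distance_mm: float = 15.0) -> Image.Image:
    """Scale the captured eye image to the size it would have at the reference distance."""
    # Scaling factor = measured distance / preset reference distance
    # (> 1 enlarges a too-small image, < 1 reduces a too-large one).
    scaling_factor = eye_to_lens_distance_mm / preset_reference_distance_mm
    new_size = (max(1, round(image.width * scaling_factor)),
                max(1, round(image.height * scaling_factor)))
    # The normalized size equals the original size multiplied by the scaling factor.
    return image.resize(new_size)
```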
  • The first point to be noted is that, because of differences in wearing habits, the distance between the user's pupils and the display device may differ, so that the first image and the second image captured at different distances have different sizes. In this embodiment, a preset reference distance is set; a first scaling factor is calculated from the first distance between the left eyeball and the first lens and the preset reference distance, and the first image is normalized with this factor, so that first images captured at various first distances are uniformly scaled to the size of the first image that would be captured at the preset reference distance, thereby eliminating the effect of the first distance on the size of the first image. Similarly, a second scaling factor is calculated from the second distance between the right eyeball and the second lens and the preset reference distance, and the second image is normalized with this factor, so that second images captured at various second distances are uniformly scaled to the size of the second image that would be captured at the preset reference distance, thereby eliminating the effect of the second distance on the size of the second image.
  • In other words, the farther the left pupil is from the first lens, the smaller the first image and the larger the first scaling factor, that is, the larger the enlargement ratio of the first image, which offsets the effect of the long distance and ensures that the enlarged first image is the first image that would be captured at the preset reference distance; the closer the left pupil is to the first lens, the larger the first image and the smaller the first scaling factor, that is, the smaller the reduction ratio of the first image, which offsets the effect of the short distance and ensures that the reduced first image is the first image that would be captured at the preset reference distance. Likewise, the farther the right pupil is from the second lens, the smaller the second image and the larger the second scaling factor, which offsets the effect of the long distance and ensures that the enlarged second image is the second image that would be captured at the preset reference distance; the closer the right pupil is to the second lens, the larger the second image and the smaller the second scaling factor, which offsets the effect of the short distance and ensures that the reduced second image is the second image that would be captured at the preset reference distance.
  • The second point to be noted is that the step of detecting the wearing state and the step of detecting the eye-open state are optional; that is, the processor may skip the above steps 1004 to 1007 and directly normalize the first image and the second image after performing steps 1001 to 1003.
  • 1009. The processor calculates the interpupillary distance between the user's left eyeball and the user's right eyeball according to the normalized first image and the normalized second image.
  • This step 1009 may include the following steps 1 through 4.
  • Step 1 Obtain the point coordinates of the left pupil center in the normalized first image, obtain the mapping point coordinates of the center of the first lens in the normalized first image, and calculate the difference between the point coordinates of the left pupil center and the mapping point coordinates as the first difference.
  • For example, referring to FIG. 11, a coordinate system may be established with the midpoint between the first lens and the second lens as the origin and the longitudinal central axis of the display device as the y-axis, so that the absolute value of the abscissa of any point in the normalized first image is the distance from that point to the longitudinal central axis.
  • The left pupil in the normalized first image is then detected, the left pupil center is located and converted into the coordinate system, and the distance deye1 between the point coordinates of the left pupil center and the y-axis is obtained; the distance dlens1 between the mapping point coordinates of the lens center of the first lens and the y-axis is then calculated, and the difference between the two distances is calculated as δ1 = deye1 − dlens1.
  • Step 2 Obtain the point coordinates of the right pupil center in the normalized second image, obtain the mapping point coordinates of the center of the second lens in the normalized second image, and calculate the difference between the point coordinates of the right pupil center and the mapping point coordinates as the second difference.
  • A coordinate system can be established in the same way; the right pupil in the normalized second image is detected, the right pupil center is located and converted into the coordinate system, the distance deye2 between the point coordinates of the right pupil center and the y-axis is obtained, the distance dlens2 between the mapping point coordinates of the lens center of the second lens and the y-axis is calculated, and the difference between the two distances is calculated as δ2 = deye2 − dlens2.
  • Step 3 Obtain a current distance between a center of the first lens and a center of the second lens.
  • Step 4 Calculate the sum of the current distance, the first difference, and the second difference to obtain the interpupillary distance of the user. If the current distance is denoted d, the interpupillary distance equals d + δ1 + δ2.
  • It should be noted that, because the normalized first image has eliminated the influence of the first distance and the normalized second image has eliminated the influence of the second distance, the interpupillary distance calculated from the normalized first image and the normalized second image is an accurate interpupillary distance. A minimal numerical sketch of this calculation is given below.
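A minimal numerical sketch of steps 1 to 4 above, with hypothetical coordinate values; all quantities are distances (in the same unit) from the longitudinal central axis (the y-axis) in the normalized images:

```python
def interpupillary_distance(deye1: float, dlens1: float,
                            deye2: float, dlens2: float,
                            current_lens_distance: float) -> float:
    """Compute the interpupillary distance from the normalized images."""
    delta_1 = deye1 - dlens1   # first difference (left eye)
    delta_2 = deye2 - dlens2   # second difference (right eye)
    # Step 4: interpupillary distance = current lens spacing + delta_1 + delta_2.
    return current_lens_distance + delta_1 + delta_2

# Hypothetical example: lens spacing 62 mm, left pupil 1.5 mm outside the left lens
# center, right pupil 0.8 mm inside the right lens center.
ipd = interpupillary_distance(deye1=32.5, dlens1=31.0, deye2=30.2, dlens2=31.0,
                              current_lens_distance=62.0)   # -> 62.7 mm
```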
  • the processor adjusts the first lens and/or the second lens according to a user's interpupillary distance.
  • For description, the adjustment direction of the first lens is referred to as the first adjustment direction, the adjustment distance of the first lens as the first adjustment distance, the adjustment direction of the second lens as the second adjustment direction, and the adjustment distance of the second lens as the second adjustment distance.
  • The specific process of adjusting the lenses may be as follows. Based on step 1 of the above step 1009, when the first difference is greater than 0, inward movement is used as the first adjustment direction; when the first difference is less than 0, outward movement is used as the first adjustment direction; the absolute value of the first difference is used as the first adjustment distance, and the first lens is adjusted according to the first adjustment direction and the first adjustment distance.
  • Based on step 2 of the above step 1009, when the second difference is greater than 0, inward movement is used as the second adjustment direction; when the second difference is less than 0, outward movement is used as the second adjustment direction; the absolute value of the second difference is used as the second adjustment distance, and the second lens is adjusted according to the second adjustment direction and the second adjustment distance.
  • Then, when the current distance is greater than the interpupillary distance, the first lens and/or the second lens move inward to reduce the current distance between the first lens and the second lens until the current distance equals the interpupillary distance; when the current distance is less than the interpupillary distance, the first lens and/or the second lens move outward to increase the current distance between the first lens and the second lens until the current distance equals the interpupillary distance. The current distance between the first lens and the second lens then equals the interpupillary distance, ensuring that the image presented by the display device is clear. A minimal sketch of this rule is given below.
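A minimal sketch of the adjustment rule just described; the function name and the example values are hypothetical and continue the numerical example above:

```python
def lens_adjustment(difference_mm: float) -> tuple:
    """Map a per-lens difference to (direction, travel): positive -> inward, negative -> outward."""
    direction = "inward" if difference_mm > 0 else "outward"
    return direction, abs(difference_mm)

# Continuing the hypothetical example (delta_1 = +1.5 mm, delta_2 = -0.8 mm):
first_dir, first_travel = lens_adjustment(1.5)     # ('inward', 1.5)
second_dir, second_travel = lens_adjustment(-0.8)  # ('outward', 0.8)
```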
  • Optionally, in step 1 of step 1009, after the display device obtains the point coordinates of the left pupil center in the normalized first image, it may calculate the distance between those coordinates and the longitudinal central axis of the display device.
  • this distance is approximately equal to the distance between the left pupil center and the bridge of the user's nose.
  • this distance can be used as the left-eye pupillary distance, and the first adjustment direction and the first adjustment distance are determined from the left-eye pupillary distance alone, so as to adjust the first lens.
  • Similarly, after the display device obtains the point coordinates of the right pupil center in the normalized second image, it may calculate the distance between those coordinates and the longitudinal central axis, which is approximately equal to the distance between the right pupil center and the bridge of the user's nose.
  • this distance can be used as the right-eye pupillary distance, and the second adjustment direction and the second adjustment distance are determined from the right-eye pupillary distance alone, so as to adjust the second lens.
  • By calculating the left-eye pupillary distance and the right-eye pupillary distance separately, the method provided by this embodiment is also applicable to users whose left-eye and right-eye pupillary distances differ considerably, which improves its general applicability.
  • The first point to be noted is that, after the display device obtains the interpupillary distance, the correspondence between the user identifier and the interpupillary distance can be stored, so that the next time the user wears the display device, the lens positions are adjusted directly according to the stored interpupillary distance, without obtaining the interpupillary distance again.
  • Specifically, when displaying the login page, the display device may receive an input instruction, obtain the entered user identifier, and query the stored correspondence between user identifiers and interpupillary distances based on that user identifier. When the stored correspondence does not include the interpupillary distance corresponding to the user identifier, the user's interpupillary distance is obtained by the above method, and the correspondence between the user identifier and the interpupillary distance is then stored.
  • When the stored correspondence already includes the interpupillary distance corresponding to the user identifier, the display device has previously obtained the user's interpupillary distance and can read it directly from the correspondence without measuring it again; this helps obtain the interpupillary distance quickly and thus adjust the distance between the lenses quickly, saving time and improving efficiency. A minimal sketch of such a cache is given below.
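A minimal sketch of a user-identifier / interpupillary-distance cache, assuming a simple JSON file as the storage medium (an assumption; the description only requires that the correspondence be stored, not how):

```python
import json
import os
from typing import Optional

CACHE_PATH = "ipd_cache.json"  # hypothetical storage location

def load_cached_ipd(user_id: str) -> Optional[float]:
    """Return the stored interpupillary distance for this user, or None if not stored yet."""
    if not os.path.exists(CACHE_PATH):
        return None
    with open(CACHE_PATH, "r", encoding="utf-8") as f:
        return json.load(f).get(user_id)

def store_ipd(user_id: str, ipd_mm: float) -> None:
    """Persist the correspondence between the user identifier and the interpupillary distance."""
    cache = {}
    if os.path.exists(CACHE_PATH):
        with open(CACHE_PATH, "r", encoding="utf-8") as f:
            cache = json.load(f)
    cache[user_id] = ipd_mm
    with open(CACHE_PATH, "w", encoding="utf-8") as f:
        json.dump(cache, f)
```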
  • The second point to be noted is that the process in which the processor adjusts the first lens and/or the second lens can be implemented by the first motor and the second motor included in the display device body: after calculating the first adjustment direction, the first adjustment distance, the second adjustment direction, and the second adjustment distance, the processor transmits the first adjustment direction and the first adjustment distance to the first motor, and transmits the second adjustment direction and the second adjustment distance to the second motor.
  • the first motor pushes the first lens to move left or right according to the first adjustment direction and the first adjustment distance.
  • the second motor pushes the second lens to move left or right according to the second adjustment direction and the second adjustment distance, finally achieving the purpose of matching the lens spacing to the interpupillary distance.
  • the operation flow of the method for adjusting the image presentation of the display device may be as follows, including the following steps S1 - S4:
  • S1: Acquire a left eyeball image of the user through the first camera module, and acquire a right eyeball image of the user through the second camera module.
  • S2: Measure the distance between the user's left eyeball and the first lens with the first distance sensor, and normalize the left eyeball image acquired in step S1 based on this distance.
  • Measure the distance between the user's right eyeball and the second lens with the second distance sensor, and normalize the right eyeball image acquired in step S1 based on this distance.
  • S3: Detect the user's pupil in the normalized left eyeball image from step S2, locate the left pupil center, and obtain the point coordinates of the left pupil center; calculate the adjustment direction and adjustment distance of the first lens from the difference between the point coordinates of the left pupil center and the mapping point coordinates of the lens center of the first lens on the image. Likewise, detect the user's pupil in the normalized right eyeball image from step S2, locate the right pupil center, and obtain the point coordinates of the right pupil center; calculate the adjustment direction and adjustment distance of the second lens from the difference between the point coordinates of the right pupil center and the mapping point coordinates of the lens center of the second lens on the image, and calculate the interpupillary distance value.
  • S4: Adjust the first lens and the second lens of the display device according to the interpupillary distance of the user currently using the display device. A minimal end-to-end sketch of this flow is given below.
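Purely as an illustrative aid, the sketch below ties steps S1 to S4 together, reusing the hypothetical helpers from the earlier sketches (`normalize_eye_image`, `interpupillary_distance`, `lens_adjustment`); the capture, distance-measurement, and pupil-location callables are assumed to be provided by the camera modules, distance sensors, and image-processing pipeline.

```python
def adjust_display_for_user(capture_left, capture_right,          # S1: camera modules
                            measure_left_mm, measure_right_mm,    # S2: distance sensors
                            pupil_center_x, lens_map_x_left, lens_map_x_right,
                            current_lens_distance_mm: float):
    """Run the S1-S4 flow and return the IPD plus the per-lens adjustments."""
    left_img, right_img = capture_left(), capture_right()                 # S1
    left_norm = normalize_eye_image(left_img, measure_left_mm())          # S2
    right_norm = normalize_eye_image(right_img, measure_right_mm())
    deye1, deye2 = pupil_center_x(left_norm), pupil_center_x(right_norm)  # S3
    ipd = interpupillary_distance(deye1, lens_map_x_left,
                                  deye2, lens_map_x_right,
                                  current_lens_distance_mm)
    return (ipd,
            lens_adjustment(deye1 - lens_map_x_left),                     # S4
            lens_adjustment(deye2 - lens_map_x_right))
```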
  • In the method provided by this embodiment, the first image and the second image are normalized based on the first distance between the first lens and the left eyeball and the second distance between the second lens and the right eyeball, respectively.
  • Both the normalized first image and the normalized second image eliminate the influence of the varying distance between the user's pupils and the display device, so the interpupillary distance calculated from the normalized first image and the normalized second image does not suffer errors caused by that varying distance and has high precision; adjusting the lens positions according to this high-precision interpupillary distance achieves a clear viewing effect.
  • Further, by detecting the wearing state of the display device and adjusting the lens positions only when the display device is worn correctly, the problem of an inaccurate measured interpupillary distance caused by a tilted display device is avoided.
  • Further, when the display device is determined to be worn incorrectly, prompting the user in a preset prompting manner to re-wear it guides the user to wear the display device correctly, which is more user-friendly.
  • Further, by detecting the user's eye-open state and adjusting the lens positions only when the eye-open state is normal, the problem of an inaccurate measured interpupillary distance caused by an abnormal eye-open state, such as the user's eyes being closed when the image is captured, is avoided.
  • Further, after the interpupillary distance is measured, the correspondence between the user identifier and the user's interpupillary distance can be stored, so that the next time the user wears the display device, the interpupillary distance corresponding to the user identifier can be obtained directly from the correspondence without measuring it again; this helps obtain the interpupillary distance quickly and thus adjust the distance between the lenses quickly, saving time and improving efficiency.
  • FIG. 12 is a schematic structural diagram of an apparatus for adjusting the image presentation of a display device according to an exemplary embodiment. As shown in FIG. 12, the apparatus includes an acquisition module 1201, an obtaining module 1202, a normalization module 1203, a calculation module 1204, and an adjustment module 1205.
  • the acquisition module 1201 is configured to acquire the first image and the second image.
  • the obtaining module 1202 is configured to obtain the first distance and the second distance.
  • the normalization module 1203 is configured to normalize the first image and the second image.
  • the calculation module 1204 is configured to calculate the interpupillary distance.
  • the adjustment module 1205 is configured to adjust the first lens and/or the second lens according to the interpupillary distance.
  • the normalization module 1203 is configured to perform the above step 1008.
  • the apparatus further includes: a wearing state detecting module for performing the above step 1004.
  • the apparatus further includes: a prompting module, configured to perform the above step 1005.
  • the apparatus further includes: an eye-open state detection module, configured to perform the above step 1006.
  • the device further includes: a storage module, configured to store a correspondence between the user identifier and the user's interpupillary distance.
  • It should be noted that, when the apparatus for adjusting the image presentation of a display device provided by the above embodiment adjusts the image presentation, the division into the functional modules described above is merely an example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the display device may be divided into different functional modules to complete all or part of the functions described above.
  • In addition, the apparatus embodiment provided above for adjusting the image presentation of a display device and the method embodiment for adjusting the image presentation of a display device belong to the same concept; for the specific implementation process, refer to the method embodiment, and details are not repeated here.
  • A person skilled in the art may understand that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the related hardware; the program may be stored in a computer-readable storage medium.
  • the storage medium mentioned may be a read-only memory, a magnetic disk, an optical disc, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

一种显示设备、用于调整显示设备的图像呈现的方法及装置,属于显示设备领域。该显示设备包括:显示设备本体(201),显示设备本体(201)包含第一镜筒(2011)和第二镜筒(2012),第一镜筒(2011)的第一端设置有第一透镜(2013),第二镜筒(2012)的第一端设置有第二透镜(2014);显示设备本体(201)上还设置有第一距离传感器(202)和第二距离传感器(203);第一距离传感器(202)和第二距离传感器(203)对称设置于显示设备本体(201)的纵向中轴线的两侧;第一距离传感器(202)用于对左眼球和第一透镜(2013)之间的距离进行测量;第二距离传感器(203)用于对右眼球和第二透镜(2014)之间的距离进行测量。在显示设备本体(201)上设置了两个距离传感器(202,203),有助于得到高精度的瞳距。

Description

显示设备、用于调整显示设备的图像呈现的方法及装置
本申请要求于2017年10月30日提交中国国家知识产权局、申请号为201711038481.5、发明名称为“显示设备、用于调整显示设备的图像呈现的方法及装置”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本公开涉及显示设备领域,特别涉及一种显示设备、用于调整显示设备的图像呈现的方法及装置。
背景技术
随着虚拟现实技术以及增强现实技术的发展,虚拟现实头盔、虚拟现实眼镜等显示设备纷纷诞生,用户可以佩戴显示设备,显示设备会显示图像,用户可以透过显示设备内的透镜看到图像,以体验虚拟的或者增强的世界。为了达到清晰的观看效果,通常需要调节透镜的位置,以使透镜中心与瞳孔中心在水平方向和垂直方向相吻合。
相关技术中,显示设备会获取用户的左瞳孔图像和右瞳孔图像,确定左瞳孔坐标和右瞳孔坐标,根据两个坐标计算瞳距,根据得到的瞳距调节左透镜和右透镜的位置,以使左透镜、右透镜与显示设备中轴线之间的距离均为瞳距的一半。其中,瞳距为左瞳孔中心点和右瞳孔中心点之间的距离。具体来说,参见图1,定义双眼的中心点O为原点,水平方向为x轴方向,获取右瞳孔最左侧位置B和原点的距离|OB|,右瞳孔最右侧位置B’到原点的距离|OB'|,并同理地获取左瞳孔最左侧位置A和原点的距离|OA|,左瞳孔最右侧位置A’到原点的距离|OA'|,采用以下公式进行计算,得到瞳距(ipd),将左透镜、右透镜与中轴线之间的距离调节为ipd/2。
Figure PCTCN2018091076-appb-000001
在实现本公开的过程中,发明人发现相关技术至少存在以下问题:
由于用户佩戴习惯的差异,用户瞳孔与显示设备的距离可能不同,导致拍摄到的瞳孔图像的大小不同,进而导致测量到的瞳距精度较低,按照这种低精度的瞳距调节透镜的位置,无法达到清晰的观看效果。
发明内容
本公开实施例提供了一种显示设备、用于调整显示设备的图像呈现的方法及装置,可以解决相关技术中测量到的瞳距精度较低的问题,所述技术方案如下:
第一方面,提供了一种显示设备,所述显示设备包括:显示设备本体,所述显示设备本体包含第一镜筒和第二镜筒,所述第一镜筒的第一端设置有第一透镜,所述第二镜筒的第一端设置有第二透镜;
所述显示设备本体上还设置有第一距离传感器和第二距离传感器;
所述第一距离传感器和所述第二距离传感器对称设置于所述显示设备本体的纵向中轴 线的两侧;
所述第一距离传感器用于对左眼球和所述第一透镜之间的距离进行测量;
所述第二距离传感器用于对右眼球和所述第二透镜之间的距离进行测量。
本实施例提供的显示设备,在显示设备本体上设置了两个距离传感器,由第一距离传感器测量左眼球和第一透镜之间的距离,由第二距离传感器测量右眼球和第二透镜之间的距离,以便根据这两个距离分别对左瞳孔图像和右瞳孔图像进行归一化处理后,可以抵消掉瞳孔与显示设备之间的距离的影响,有助于得到高精度的瞳距,在基于高精度的瞳距调节透镜的位置后,可以达到清晰的观看效果。
在一种可能的设计中,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域,所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域。
在一种可能的设计中,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域与所述显示设备本体的横向中轴线的两个交点中靠近所述纵向中轴线的交点上;
所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域与所述显示设备本体的横向中轴线的两个交点中靠近所述纵向中轴线的交点上。
在这种可能的设计中,第一距离传感器、第二距离传感器测量的距离实际为内眼角与对应距离传感器之间的距离,保证测量到的距离不会受到用户眼窝深浅不同造成的影响,提高了测量距离的准确性。
在一种可能的设计中,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域的外侧,所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域的外侧。
在一种可能的设计中,所述第一距离传感器设置于第一垂直线上,所述第一垂直线为与所述纵向中轴线平行且穿过所述第一透镜的透镜中心的直线;
所述第二距离传感器设置于第二垂直线上,所述第二垂直线为与所述纵向中轴线平行且穿过所述第二透镜的透镜中心的直线。
在一种可能的设计中,所述第一镜筒的第一端的边缘区域中设置有发光二极管(Light Emitting Diode,LED)单元,所述第二镜筒的第一端的边缘区域中设置有LED单元。
第二方面,提供了一种用于调整显示设备的图像呈现的方法,所述方法包括:
获取第一图像和第二图像,其中,第一图像中包含用户的左眼球的图像,第二图像中包含所述用户的右眼球的图像;
得到第一透镜与所述用户的左眼球之间的第一距离,其中,所述第一透镜为所述显示设备的全部透镜中与所述用户的左眼球的距离最近的透镜;
得到第二透镜与所述用户的右眼球之间的第二距离,其中,所述第二透镜为所述显示设备的全部透镜中与所述用户的右眼球的距离最近的透镜;
根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理;
根据归一化处理后的第一图像和归一化处理后的第二图像,计算所述用户的左眼球和所述用户的右眼球之间的瞳距;
根据所述用户的瞳距调整所述第一透镜和/或所述第二透镜。
本实施例提供的方法,通过第一透镜与左眼球之间的第一距离、第二透镜与右眼球之间的第二距离,分别对第一图像和第二图像进行了归一化处理,归一化处理后的第一图像 和归一化处理后的第二图像均消除了由于用户瞳孔与显示设备的远近不同造成的影响,因此根据归一化处理后的第一图像和归一化处理后的第二图像计算得到的瞳距,不会由于用户瞳孔与显示设备的远近不同产生误差,精度较高,按照这种高精度的瞳距调节透镜的位置,可以达到清晰的观看效果。
在一种可能的设计中,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
计算所述第一距离与预设参考距离之间的比值,得到第一缩放系数,所述预设参考距离为获取到预设尺寸的瞳孔图像时所述第一透镜与所述用户的左眼球之间的距离,或为获取到预设尺寸的瞳孔图像时所述第二透镜与所述用户的右眼球之间的距离;
计算所述第二距离与所述预设参考距离之间的比值,得到第二缩放系数;
采用所述第一缩放系数,对所述第一图像进行缩放;
采用所述第二缩放系数,对所述第二图像进行缩放。
在一种可能的设计中,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态;或,根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态;
当确定所述显示设备佩戴正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
在这种可能的设计中,通过检测显示设备的佩戴状态,当显示设备佩戴正常时,再调节透镜的位置,避免由于显示设备佩戴倾斜导致测量到的瞳距不准确的问题。
在一种可能的设计中,所述根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态,包括:
计算所述第一距离与所述第二距离之间的差值;
当所述差值小于第一预设差值时,确定所述显示设备佩戴正常。
在一种可能的设计中,所述根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态,包括:
当所述第一图像中瞳孔中心的点坐标属于所述第一图像的预设范围,且所述第二图像中瞳孔中心的点坐标属于所述第二图像的预设范围时,确定所述显示设备佩戴正常。
在一种可能的设计中,所述根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态,包括:
计算所述第一图像中瞳孔中心的点坐标与所述第一图像中参考位置之间的距离,得到第三距离;
计算所述第二图像中瞳孔中心的点坐标与所述第二图像中参考位置之间的距离,得到第四距离;
当所述第三距离与所述第四距离之间的差值小于第二预设差值时,确定所述显示设备佩戴正常。
在一种可能的设计中,所述根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态;或,根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态之后,所述方法还包括:
当确定所述显示设备佩戴不正常时,按照预设提示方式,提示所述用户重新佩戴所述显示设备。
在这种可能的设计中,通过按照预设提示方式,提示用户重新佩戴显示设备,可以引导用户正确佩戴显示设备,更加人性化。
在一种可能的设计中,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
根据所述第一距离、第二距离、所述第一图像和所述第二图像,检测所述用户的睁眼状态;
当确定所述睁眼状态正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
在这种可能的设计中,通过检测用户的睁眼状态,当睁眼状态正常时,再调节透镜的位置,避免由于拍摄图像时由于出现用户闭眼等睁眼状态不正常的情况导致测量到的瞳距不准确的问题。
在一种可能的设计中,所述根据所述第一距离、第二距离、所述第一图像和所述第二图像,检测所述用户的睁眼状态,包括:
计算左瞳孔直径与预设瞳孔直径之间的比值,得到第一指定系数,计算右瞳孔直径与所述预设瞳孔直径之间的比值,得到第二指定系数,所述预设瞳孔直径为当所述第一透镜与所述用户的左眼球之间的距离之间为预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,或者所述预设瞳孔直径为当所述第二透镜与所述用户的右眼球之间的距离之间为所述预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,所述左瞳孔直径根据所述第一图像确定,所述右瞳孔直径根据所述第二图像确定;
当所述第一缩放系数与所述第一指定系数之间的差值、所述第二缩放系数与所述第二指定系数之间的差值均小于第三预设差值时,确定所述睁眼状态正常,所述第一缩放系数为所述第一距离与所述预设参考距离的比值,所述第二缩放系数为所述第二距离与所述预设参考距离的比值。
在一种可能的设计中,所述预设瞳孔直径为基于所述显示设备所获取的多个样本第一图像或多个样本第二图像检测得到的样本瞳孔直径的平均值。
在一种可能的设计中,所述根据归一化处理后的第一图像和归一化处理后的第二图像计算所述用户的左眼球和所述用户的右眼球之间的瞳距之后,所述方法还包括:
获取所述用户的用户标识;
存储所述用户标识和所述用户的瞳距之间的对应关系。
在这种可能的设计中,在测量得到瞳距后,可以存储用户标识和用户的瞳距之间的对应关系,当用户下次佩戴显示设备时,则可直接从对应关系中获取用户标识对应的用户的瞳距,而无需再次获取用户的瞳距,有利于快速获取瞳距,从而快速调节透镜之间的距离,节省了时间,提高了效率。
第三方面,提供了一种用于调整显示设备的图像呈现的装置,所述装置包括多个功能模块,以实现上述第二方面以及第二方面的任一种可能设计的用于调整显示设备的图像呈现的方法。
第四方面,提供了一种计算机可读存储介质,所述计算机可读存储介质上存储有指令, 所述指令被处理器执行以完成上述第二方面以及第二方面的任一种可能设计的用于调整显示设备的图像呈现的方法。
附图说明
图1是相关技术提供的一种用于调整显示设备的图像呈现的方法的示意图;
图2是本公开实施例提供的一种显示设备的结构示意图;
图3是本公开实施例提供的一种显示设备的结构示意图;
图4是本公开实施例提供的一种显示设备的结构示意图;
图5是本公开实施例提供的一种显示设备的结构示意图;
图6是本公开实施例提供的一种显示设备的结构示意图;
图7是本公开实施例提供的一种显示设备的结构示意图;
图8是本公开实施例提供的一种显示设备的结构示意图;
图9是本公开实施例提供的一种显示设备的结构示意图;
图10是本公开实施例提供的一种用于调整显示设备的图像呈现的方法的流程图;
图11是本公开实施例提供的一种用于调整显示设备的图像呈现的方法的示意图;
图12是本公开实施例提供的一种用于调整显示设备的图像呈现的装置的框图。
具体实施方式
为使本公开的目的、技术方案和优点更加清楚,下面将结合附图对本公开实施方式作进一步地详细描述。
图2是本公开实施例提供的一种显示设备的结构示意图,该显示设备为头戴式显示设备,可以属于虚拟现实设备,例如虚拟现实头盔、虚拟现实眼镜,或者可以属于增强现实设备,例如增强现实头盔等。
参见图2,该显示设备包括显示设备本体201、在该显示设备本体201上设置的第一距离传感器202和第二距离传感器203。显示设备本体201包含第一镜筒2011和第二镜筒2012,该第一镜筒2011的第一端设置有第一透镜2013,该第二镜筒2012的第一端设置有第二透镜2014。可选地,该显示设备本体201的内部还可以包含第一摄像头模组、第二摄像头模组、第一马达、第二马达、显示屏幕和处理器,另外,该显示设备本体201还可以包括其他用于实现显示功能的部件。
(1)第一镜筒2011和第二镜筒2012。
第一镜筒2011和第二镜筒2012均包括相对的两端,为了区分描述,在此将显示设备正常佩戴时显示设备面向用户的一端称为第一端。该第一镜筒2011的第一端的边缘区域固定有第一透镜2013,第二镜筒2012的第一端的边缘区域固定有第二透镜2014。
该固定方式可以包括以下两种设计。以下以第一镜筒2011的边缘区域固定第一透镜2013的方式为例进行阐述,第二镜筒2012的边缘区域固定第二透镜2014的方式与此同理。
在第一种可能的设计中,第一镜筒2011第一端的边缘区域可以为第一镜筒2011的镜头盖。该边缘区域与第一透镜2013之间紧密咬合,从而将第一透镜2013扣在第一镜筒2013的第一端上。其中,该边缘区域上可以设置有多个螺钉,该多个螺钉穿过边缘区域,插入第一透镜2013外边缘的多个通孔中,从而与第一透镜2013固定。又如边缘区域可以通过 胶水或焊接的方式与第一透镜2013固定。
在第二种可能的设计中,第一镜筒2011第一端的边缘区域可以为第一镜筒2011第一端的边沿。该边缘区域的内壁与第一透镜2013的外边缘紧密贴合,第一透镜2013套在边缘区域内部,固定于边缘区域中,以免从边缘区域中滑脱。其中,边缘区域的内壁可以与第一透镜2013的外边缘通过胶水、螺钉、焊接等方式固定连接。
(2)第一透镜2013和第二透镜2014。
第一透镜2013可以为正焦距透镜,或者为负焦距透镜。第一透镜2013可以包括第一镜面和第二镜面,该第一镜面朝向第一端,在显示设备正常佩戴时可以面向用户的左眼。该第一镜面可以为凸面,且凸出方向朝向第一端。该第二镜面与第一镜面相背离,在显示设备正常佩戴时以面向显示设备内部。第二镜面可以为凹面,且凹陷方向也朝向第一端,或者,第一镜面和第二镜面可以均为凸面,或者,第一镜面可以为凹面而第二镜面可以为凸面。
该第二透镜2014与第一透镜2013类似,即,该第二透镜2014可以包括第三镜面和第四镜面,该第三镜面朝向第一端,在显示设备正常佩戴时可以面向用户的右眼。该第三镜面可以为凸面,且凸出方向朝向第一端。该第四镜面与第三镜面相背离,在显示设备正常佩戴时会面向显示设备内部,第四镜面可以为凹面,且凹陷方向也朝向第一端,或者,第一镜面和第二镜面可以均为凸面,或者,第一镜面可以为凹面而第二镜面可以为凸面。
(3)第一距离传感器202和第二距离传感器203。
第一距离传感器202和第二距离传感器203对称设置于该显示设备本体201的纵向中轴线的两侧,该纵向中轴线是指显示设备垂直方向的中轴线,该对称设置是指该第一距离传感器202、第二距离传感器203分别设置于纵向中轴线的左侧和右侧,第一距离传感器202与纵向中轴线之间的距离、第二距离传感器203与纵向中轴线之间的距离可以相等。
其中,第一距离传感器202可以设置于纵向中轴线的左侧,用于对左眼球和第一透镜2013之间的距离进行测量,第一距离传感器202可以为测量距离精确度较高的距离传感器,即高精度距离传感器,例如为激光距离传感器、红外线距离传感器、微波距离传感器等。第二距离传感器203可以设置于纵向中轴线的右侧,用于对右眼球和第二透镜2014之间的距离进行测量。该第二距离传感器203也可以为高精度距离传感器,例如为激光距离传感器、红外线距离传感器、微波距离传感器等。
针对第一距离传感器202和第二距离传感器203在显示设备本体201上的具体设置方式,详见以下两种设计。
在第一种可能的设计中,参见图3,第一距离传感器202设置于第一镜筒2011的第一端的边缘区域,该第二距离传感器203设置于第二镜筒2012的第一端的边缘区域。
本设计中,两个距离传感器分别设置在两个镜筒的边缘区域上,第一距离传感器202与第一透镜2013的透镜中心的距离等于第一透镜2013的半径,第二距离传感器203与第二透镜2014的透镜中心的距离等于第二透镜2014的半径。
第一距离传感器202可以设置于第一镜筒2011的第一端的边缘区域的任意位置,第二距离传感器203可以设置于第二镜筒2012的第一端的边缘区域的任意位置,只需保证第一距离传感器202与第二距离传感器203关于纵向中轴线对称设置即可。
而为了保证距离传感器测量到的距离准确度较高,本设计提供了以下两种具体方式。
方式一、第一距离传感器202设置于该第一镜筒2011的第一端的边缘区域与该显示设备本体201的横向中轴线的两个交点中靠近该纵向中轴线的交点上,第二距离传感器203设置于该第二镜筒2012的第一端的边缘区域与该显示设备本体201的横向中轴线的两个交点中靠近该纵向中轴线的交点上。
参见图4,第一镜筒2011的第一端的边缘区域会与横向中轴线具有两个交点,第一距离传感器202可以设置于这两个交点中靠近纵向中轴线的交点上,第一距离传感器202可以设置于第一镜筒2011的第一端的边缘区域的所有位置中最靠近纵向中轴线的位置处。同理地,第二镜筒2012的第一端的边缘区域也会与横向中轴线具有两个交点,第二距离传感器203可以设置于这两个交点中靠近纵向中轴线的交点上,第二距离传感器203可以设置于第二镜筒2012的第一端的边缘区域的所有位置中最靠近纵向中轴线的位置处。
那么,当显示设备佩戴正常时,第一距离传感器202会位于用户左眼的内眼角的正前方,测量到的距离实际为左眼的内眼角与该第一距离传感器202之间的距离,而由于第一距离传感器202的中心与第一透镜2013的透镜中心处于同一垂直平面上,左眼的内眼角和左眼瞳孔中心处于同一垂直平面上,则第一距离传感器202测量到的距离会是左眼瞳孔中心与第一透镜2013的透镜中心之间的距离。同理地,第二距离传感器203会位于用户右眼的内眼角的正前方,会测量到右眼的内眼角与该第二距离传感器203之间的距离,而由于第二距离传感器203的中心与第二透镜2014的中心处于同一垂直平面上,右眼的内眼角和右眼瞳孔中心处于同一垂直平面上,则第二距离传感器203测量到的距离会是右眼瞳孔中心与第二透镜2014的透镜中心之间的距离。
基于这种设置方式,能够保证第一距离传感器201测量到的距离即为左眼球和第一透镜2013之间的准确的距离,第二距离传感器203测量到的距离即为右眼球和第二透镜2014之间的准确的距离,那么,基于这种准确的距离调节透镜位置时,第一透镜201和第二透镜2013之间的距离会与用户的瞳距十分匹配,保证观看效果清晰。另外,对于眼窝深浅程度不同的用户来说,由于任一距离传感器测量的距离实际为内眼角与该距离传感器之间的距离,该距离不会受到眼窝深浅对测量距离造成的影响,避免了当眼窝较深的用户佩戴显示设备时,出现距离传感器测量到的距离偏大,即大于眼球与透镜之间的实际距离的情况,也避免了当眼窝较浅的用户佩戴显示设备时,出现距离传感器测量到的距离偏小,即小于眼球与透镜之间的实际距离的情况。
方式二、第一距离传感器202设置于第一垂直线上,第二距离传感器203设置于第二垂直线上。
参见图5,第一垂直线为与该纵向中轴线平行且穿过该第一透镜2013的透镜中心的直线,第一垂直线会与第一镜筒2011的第一端的边缘区域具有两个交点,第一距离传感器202可以设置于这两个交点中靠上的交点上,即,第一距离传感器202可以设置在第一透镜2013的透镜中心的正上方且与透镜中心距离等于透镜半径的位置处。或者,第一距离传感器202可以设置于这两个交点中靠下的交点上,即,第一距离传感器可以设置在第一透镜2013的透镜中心的正下方且与透镜中心距离等于透镜半径的位置处。
同理地,第二垂直线为与纵向中轴线平行且穿过第二透镜2014的透镜中心的直线,第二垂直线会与第二镜筒2012的第一端的边缘区域也具有两个交点,第一距离传感器202可以设置于这两个交点中靠上的交点上,或设置于这两个交点中靠下的交点上。
需要说明的是,参见图6,显示设备本体201还可以包含LED单元205,两个距离传感器实际可以落在LED单元205上。其中,第一镜筒2011的第一端的边缘区域中可以设置有LED单元205,该LED单元205包括多个LED,该多个LED以第一透镜2013的透镜中心为圆心,第一透镜2013的半径为半径,围绕成一个LED环,第一距离传感器202可以设置于该LED环上。同理地,该第二镜筒2012的第一端的边缘区域中也可以设置有LED单元205,第二距离传感器203可以设置于第二镜筒2012的LED单元205上。
在第二种可能的设计中,第一距离传感器202设置于第一镜筒2011的第一端的边缘区域的外侧,第二距离传感器203设置于第二镜筒2012的第一端的边缘区域的外侧。
本设计中,参见图7,两个距离传感器分别设置在两个镜筒的边缘区域的周围,第一距离传感器202与第一透镜2013的透镜中心的距离会大于第一透镜2013的半径,第二距离传感器203与第二透镜2014的透镜中心的距离会大于第二透镜2014的半径。
可选地,参见图8,第一距离传感器202可以设置于第一垂直线上,即,第一距离传感器202可以设置于第一透镜2013的透镜中心的正上方且与透镜中心之间的距离大于透镜半径的任一位置处,或设置于第一透镜2013的透镜中心的正下方且与透镜中心之间的距离大于透镜半径的任一位置处。同理地,第二距离传感器203可以设置于第二垂直线上,即,第二距离传感器203可以设置于第二透镜2014的透镜中心的正上方且与透镜中心之间的距离大于透镜半径的任一位置处,或设置于第二透镜2014的透镜中心的正下方且与透镜中心之间的距离大于透镜半径的任一位置处。
需要说明的是,参见图9,显示设备本体201还可以包含保护垫206,保护垫206用于保护第一镜筒2011和第二镜筒2012,避免第一镜筒2011和第二镜筒2012由于磕碰发生损坏,同时保证用户佩戴显示设备时让用户的佩戴体验更加舒适。第一距离传感器202、第二距离传感器203实际可以设置于保护垫206上。
(4)、第一摄像头模组和第二摄像头模组。
第一摄像头模组可以设置于第一镜筒2011中,其焦距和朝向固定,能够拍摄用户的左眼球的图像,并可以将该图像传输至处理器。第二摄像头模组可以设置于第二镜筒2012中,其焦距和朝向固定,能够拍摄用户的右眼球的图像,并可以将该图像传输至处理器。
(5)、第一马达和第二马达。
第一马达用于在处理器的控制下,驱动第一透镜2013移动,从而调节第一透镜2013的位置,第二马达用于在处理器的控制下,用于驱动第二透镜2014移动,从而调节第一透镜2014的位置。其中,处理器调节第一透镜2013、第二透镜2014的位置的方案详见下述图10所示实施例。
(6)、显示屏幕。
显示屏幕用于显示图像,在用户眼前呈现虚拟的或者增强的世界,可以为薄膜晶体管液晶显示器(Liquid Crystal Display,LCD)屏幕或者其它类型的屏幕。
相关技术中,显示设备上仅会设置一个距离传感器,用于对一定范围内的障碍物进行检测,从而检测出用户是否已经佩戴上显示设备,以便处理器根据检测结果确定是否启动显示设备。当显示设备启动后,处理器仅会基于拍摄到的瞳孔图像,得到瞳距,进而根据瞳距来调节透镜的位置,这种瞳距会受到瞳孔与显示设备之间的距离的影响,精度较低,导致调节透镜的位置后很可能无法达到清晰的观看效果。
本实施例提供的显示设备,在显示设备本体上设置了两个距离传感器,由第一距离传感器测量左眼球和第一透镜之间的距离,由第二距离传感器测量右眼球和第二透镜之间的距离,以便根据这两个距离分别对左瞳孔图像和右瞳孔图像进行归一化处理后,可以抵消掉瞳孔与显示设备之间的距离的影响,有助于得到高精度的瞳距,在基于高精度的瞳距调节透镜的位置后,可以达到清晰的观看效果。
进一步地,第一距离传感器可以设置于该第一镜筒的第一端的边缘区域与该显示设备本体的横向中轴线的两个交点中靠近该纵向中轴线的交点上,第二距离传感器可以设置于该第二镜筒的第一端的边缘区域与该显示设备本体的横向中轴线的两个交点中靠近该纵向中轴线的交点上,那么,第一距离传感器、第二距离传感器测量的距离实际为内眼角与对应距离传感器之间的距离,保证测量到的距离不会受到用户眼窝深浅不同造成的影响,提高了测量距离的准确性。
在示例性实施例中,还提供了一种计算机可读存储介质,例如包括指令的存储器,上述指令可由电子设备中的处理器执行以完成下述实施例中的用于调整显示设备的图像呈现的方法。例如,所述计算机可读存储介质可以是只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、CD-ROM、磁带、软盘和光数据存储设备等。
图10是根据一示例性实施例示出的一种用于调整显示设备的图像呈现的方法的流程图,该方法的执行主体为上述实施例阐述的显示设备的处理器,如图10所示,包括以下步骤:
1001、处理器获取第一图像和第二图像。
本实施例以包含用户的左眼球的图像称为第一图像,包含用户的右眼球的图像称为第二图像为例进行阐述。处理器可以在接收到拍摄指令时,通过第一摄像头模组对用户左眼球进行拍摄,得到该第一图像,通过第二摄像头模组对用户右眼球进行拍摄,得到该第二图像。其中,拍摄指令可以由佩戴显示设备的操作触发,或者由启动显示设备的操作触发。
可选地,在获取第一图像和第二图像之前,显示设备可以播放虚拟的远景图像,从而引导用户的视线朝向正前方,保证用户的左瞳孔、右瞳孔均直视显示设备。
1002、处理器得到第一透镜与用户的左眼球之间的第一距离。
1003、处理器得到第二透镜与用户的右眼球之间的第二距离。
处理器可以通过第一距离传感器对第一透镜与用户的左眼球之间的距离进行测量,得到第一距离,通过第二距离传感器对第二透镜与用户的右眼球之间的距离进行测量,得到第二距离。其中,第一透镜为显示设备的全部透镜中与用户的左眼球的距离最近的透镜,第二透镜为显示设备的全部透镜中与用户的右眼球的距离最近的透镜。第一透镜与第二透镜的具体位置详见上述图2-图9实施例。
1004、处理器检测显示设备的佩戴状态。
处理器可以根据第一距离和第二距离检测显示设备的佩戴状态,详见以下第一种可能的设计。或,处理器可以根据第一图像和第二图像检测显示设备的佩戴状态,详见以下第二种可能的设计或第三种可能的设计。
在第一种可能的设计中,处理器可以计算第一距离与第二距离之间的差值,当差值小于第一预设差值时,确定显示设备佩戴正常。
处理器计算出第一距离与第二距离之间的差值后,可以判断该差值是否小于第一预设差值,从而确定显示设备佩戴是否正常。其中,该第一预设差值用于指示显示设备佩戴正常时,第一距离与第二距离之间的最大差值,该第一预设差值可以在显示设备中预先存储。
当该差值小于第一预设差值时,可以认为第一距离与第二距离较为接近,即,左眼球与第一透镜之间的距离、右眼球与第二透镜之间的距离大致相等,两个透镜连线的方向与双眼连线的方向大致平行,即两个透镜处于同一水平线上,则可以确定显示设备佩戴正常。
当该差值不小于第一预设差值时,可以认为第一距离与第二距离相差较大,即,左眼球与第一透镜之间的距离、右眼球与第二透镜之间的距离相差较大,两个透镜连线的方向与双眼连线的方向产生偏移,即两个透镜不在同一水平线上,则可以确定显示设备佩戴倾斜。其中,当第一距离大于第二距离,且差值不小于第一预设差值时,表明倾斜情况为第一透镜靠前、第二透镜靠后。当第一距离小于第二距离,且差值不小于第一预设差值时,表明倾斜情况为第一透镜靠后、第二透镜靠前。
在第二种可能的设计中,当第一图像中瞳孔中心的点坐标属于第一图像的预设范围,且第二图像中瞳孔中心的点坐标属于第二图像的预设范围时,处理器确定显示设备佩戴正常。
本设计可以包括以下步骤一至步骤三。
步骤一、获取两个图像中瞳孔中心的点坐标。
处理器在得到第一图像和第二图像后,可以为第一图像建立坐标系,并对第一图像进行图像识别,识别出瞳孔中心在第一图像的坐标系中的位置,从而得到瞳孔中心在坐标系中的点坐标,即左瞳孔中心的点坐标。同理地,处理器可以在得到第二图像后,可以为第二图像建立坐标系,并对第二图像进行图像识别,识别出瞳孔中心在第二图像的坐标系中的位置,从而得到瞳孔中心在坐标系中的点坐标,即右瞳孔中心的点坐标。
步骤二、判断左瞳孔中心的点坐标是否属于第一图像的预设范围,当左瞳孔中心的点坐标属于第一图像的预设范围,执行步骤三,当左瞳孔中心的点坐标不属于第一图像的预设范围,确定显示设备佩戴不正常。
当得到第一图像中瞳孔中心的点坐标后,处理器可以获取第一图像的预设范围,判断该点坐标是否属于第一图像的预设范围,当第一图像中瞳孔中心的点坐标属于第一图像的预设范围,表明左瞳孔中心位置正常,即左眼佩戴正常。而当第一图像中瞳孔中心的点坐标不属于第一图像的预设范围时,表明左瞳孔中心未处于正确的位置,即左眼佩戴不正常,则处理器可以确定显示设备佩戴倾斜。其中,该第一图像的预设范围用于指示显示设备佩戴正常时,拍摄到的左眼球的图像中瞳孔中心的点坐标会落入的范围。该第一图像的预设范围可以在显示设备中预先存储。
在一种可能的设计中,判断第一图像中瞳孔中心的点坐标是否属于第一图像的预设范围可以通过判断点坐标是否满足以下公式实现:
Figure PCTCN2018091076-appb-000002
Figure PCTCN2018091076-appb-000003
其中,x1表示第一图像中瞳孔中心的点坐标的横坐标,y1表示第一图像中瞳孔中心的 点坐标的纵坐标,W的值可以根据对精确度的实际需求确定。
步骤三、判断右瞳孔中心的点坐标是否属于第二图像的预设范围,当右瞳孔中心的点坐标属于第二图像的预设范围,执行步骤四,当右瞳孔中心的点坐标不属于第二图像的预设范围,确定显示设备佩戴不正常。
当得到第二图像中瞳孔中心的点坐标后,处理器可以获取第二图像的预设范围,判断该点坐标是否属于第二图像的预设范围,当第二图像中瞳孔中心的点坐标属于第二图像的预设范围,表明右瞳孔中心位置正常,即右眼佩戴正常。而当第二图像中瞳孔中心的点坐标不属于第二图像的预设范围时,表明右瞳孔中心未处于正确的位置,即右眼佩戴不正常,则处理器可以确定显示设备佩戴倾斜。其中,该第二图像的预设范围用于指示显示设备佩戴正常时,拍摄到的右眼球的图像中瞳孔中心的点坐标会落入的范围。该第二图像的预设范围可以在显示设备中预先存储。
在一种可能的设计中,判断第二图像中瞳孔中心的点坐标是否属于第二图像的预设范围可以通过判断点坐标是否满足以下公式实现:
Figure PCTCN2018091076-appb-000004
Figure PCTCN2018091076-appb-000005
其中,x2表示第二图像中瞳孔中心的点坐标的横坐标,y2表示第二图像中瞳孔中心的点坐标的纵坐标,W的值可以根据对精确度的实际需求确定。
步骤四、确定显示设备佩戴正常。
也即是,当第一图像中瞳孔中心的点坐标属于第一图像的预设范围,且第二图像中瞳孔中心的点坐标属于第二图像的预设范围时,即左眼和右眼均佩戴正常时,处理器确定显示设备佩戴正常。
在第三种可能的设计中,处理器可以当第一图像中瞳孔中心的点坐标与第一图像中参考位置之间的距离、第二图像中瞳孔中心的点坐标与第二图像中参考位置之间的距离之间的差值小于第二预设差值时,确定显示设备佩戴正常。
本设计具体可以包括以下步骤一至步骤三。
步骤一、获取第一图像中瞳孔中心的点坐标与第一图像中参考位置之间的距离。
处理器可以为第一图像建立坐标系,并对第一图像进行图像识别,识别出瞳孔中心在第一图像的坐标系中的位置,从而得到瞳孔中心在坐标系中的点坐标,即左瞳孔中心的点坐标,并将第一图像的坐标系中的任一位置作为参考位置,获取参考位置在坐标系中的点坐标,计算左瞳孔中心的点坐标与参考位置的点坐标之间的距离。其中,显示设备可以将第一图像的中心,即第一图像的坐标系的原点作为第一图像的参考位置。
步骤二、获取第二图像中瞳孔中心的点坐标与第二图像中参考位置之间的距离。
处理器可以为第二图像建立坐标系,并对第二图像进行图像识别,识别出瞳孔中心在第二图像的坐标系中的位置,从而得到瞳孔中心在坐标系中的点坐标,即右瞳孔中心的点坐标,并将第二图像的坐标系的任一位置作为参考位置,获取参考位置在坐标系中的点坐标,计算右瞳孔中心的点坐标与参考位置的点坐标之间的距离。其中,显示设备可以将第二图像的中心,即第二图像的坐标系的原点作为第二图像的参考位置。
需要说明的是,本设计要求第一图像中的参考位置的点坐标与第二图像中的参考位置的点坐标相同,即,若将第一图像中的坐标系的(M,N)作为第一图像的参考位置,则也要将第二图像的坐标系的(M,N)作为第二图像的参考位置,以保证后续计算的差值能够正确反映显示设备的佩戴情况。
步骤三、计算两个距离之间的差值,判断差值是否小于第二预设差值,当差值小于第二预设差值时,确定显示设备佩戴正常,当差值不小于第二预设差值时,确定显示设备佩戴倾斜。
由于第一图像中瞳孔中心的点坐标与第一图像中参考位置之间的距离能够反映左瞳孔中心的点坐标与第一图像的参考位置之间的相对位置关系,第二图像中瞳孔中心的点坐标与第二图像的参考位置之间的距离能够反映右瞳孔中心的点坐标与第二图像的参考位置之间的相对位置关系,而第一图像的参考位置又与第二图像中参考位置的点坐标相同,因此两个距离之间的差值能够反映左瞳孔中心的点坐标、右瞳孔中心的点坐标之间位置的偏差,若差值小于第二预设差值,则可以确定左瞳孔中心与纵向中轴线之间的距离、右瞳孔中心与纵向中轴线之间的距离近似相等,且左瞳孔中心的高度与右瞳孔中心的高度近似相等,则可以确定显示设备佩戴正常,而该差值不小于第二预设差值,表明可能出现了左瞳孔中心、右瞳孔中心一外一内、或一高一低的情况,则可以确定显示设备佩戴倾斜。
其中,该第二预设差值用于指示第一图像中瞳孔中心的点坐标与第一图像中参考位置之间的距离、第二图像中瞳孔中心的点坐标与第二图像中参考位置之间的距离之间的最大差值,该第二预设差值可以根据实际对精确度的需求确定,可以在显示设备中预先存储。
1005、当确定显示设备佩戴不正常时,处理器按照预设提示方式,提示用户重新佩戴显示设备。
该预设提示方式可以包括显示第一提示页面、发出第一提示语音等。其中,该第一提示页面可以包括用于提示重新佩戴显示设备的文字信息、图片信息等,例如该文字信息可以为“当前佩戴方式有误,请重新佩戴设备”,该第一提示语音可以为提示重新佩戴显示设备的语音。用户查看第一提示页面,或听到第一提示语音后,即可获知要重新佩戴显示设备,并重新佩戴显示设备,则显示设备会重新开始执行上述步骤1001。
1006、处理器根据第一距离、第二距离、第一图像和第二图像,检测用户的睁眼状态。
由于后续要根据第一图像和第二图像计算用户的瞳距,若用户睁眼状态不正常,例如用户在拍照时发生了闭眼,则无法正常计算瞳距。为此,会先根据第一距离、第二距离、第一图像和第二图像,检测用户的睁眼状态,当睁眼状态不正常时,执行下述步骤1007,当睁眼状态正常时,执行下述步骤1008。
检测睁眼状态是否正常的具体过程可以包括以下步骤一至步骤五。
步骤一、处理器计算左瞳孔直径与预设瞳孔直径之间的比值,得到第一指定系数。
当处理器在执行上述步骤1004的过程中采用了第二种可能的设计或第三种可能的设计,则会对第一图像进行图像识别,而在对第一图像进行识别的过程中,可以同时识别出左瞳孔直径,本步骤即可直接获取该左瞳孔直径。而当处理器在执行上述步骤1004的过程中采用了第一种可能的设计,或在采用了第二种可能的设计或第三种可能的设计并未识别左瞳孔直径时,本步骤中处理器需要对第一图像进行图像识别,得到该左瞳孔直径。
在得到左瞳孔直径后,处理器会获取预设瞳孔直径,计算左瞳孔直径与预设瞳孔直径 之间的比值,将该比值作为第一指定系数。其中,预设瞳孔直径为当第一透镜与用户的左眼球之间的距离为预设参考距离,且睁眼状态正常时会检测到的瞳孔直径。预设瞳孔直径可以预先通过对多个样本用户佩戴显示设备时获取到的样本图像进行检测得到,获取预设瞳孔直径的具体过程可以包括以下三种可能的设计。
在第一种可能的设计中,可以当第一透镜与样本用户的左眼球之间的距离为预设参考距离,且样本用户睁眼状态正常时采集样本图像,得到多个样本第一图像,对于该多个样本第一图像中的每个样本第一图像,获取该样本图像中的瞳孔直径,得到样本瞳孔直径,得到多个样本瞳孔直径,求取该多个样本瞳孔直径的平均值,作为预设瞳孔直径。
在第二种可能的设计中,当第二透镜与样本用户的右眼球之间的距离为预设参考距离,且样本用户睁眼状态正常时采集样本图像,得到多个样本第二图像。对于该多个样本第二图像中的每个样本第二图像,获取该样本第二图像中的瞳孔直径,得到样本瞳孔直径,得到多个样本瞳孔直径,求取该多个样本瞳孔直径的平均值,作为预设瞳孔直径。
在第三种可能的设计中,可以基于多个样本第一图像得到多个样本瞳孔直径、基于多个样本第二图像得到多个样本瞳孔直径,再求取这些样本瞳孔直径的平均值,作为预设瞳孔直径。
步骤二、处理器计算右瞳孔直径与预设瞳孔直径之间的比值,得到第二指定系数。
当处理器在执行上述步骤1004的过程中采用了第二种可能的设计或第三种可能的设计,则会对第二图像进行图像识别,而在对第二图像进行识别的过程中,可以同时识别出右瞳孔直径,本步骤即可直接获取该右瞳孔直径。而当处理器在执行上述步骤1004的过程中采用了第一可能的设计,或在采用了第二种可能的设计或第三种可能的设计并未识别右瞳孔直径时,本步骤中处理器需要对第二图像进行图像识别,得到该右瞳孔直径。
步骤三、处理器计算第一距离与预设参考距离的比值,得到第一缩放系数,计算第二距离与预设参考距离的比值,得到第二缩放系数。
处理器得到第一缩放系数、第二缩放系数后,可以存储该第一缩放系数、第二缩放系数,以便后续基于该第一缩放系数、第二缩放系数进行归一化过程。
步骤四、处理器计算第一缩放系数与第一指定系数之间的差值,计算第二缩放系数与第二指定系数之间的差值。
可选地,处理器可以计算第一缩放系数减第一指定系数的差值,或计算第一指定系数减第一缩放系数的差值,再对得到的差值求取绝对值,将该绝对值作为第一缩放系数与第一指定系数之间的差值。同理地,处理器可以计算第二缩放系数减第二指定系数的差值,或计算第二指定系数减第二缩放系数的差值,再对得到的差值求取绝对值,将该绝对值作为第二缩放系数与第二指定系数之间的差值。
步骤五、处理器分别判断两个差值中的每个差值是否小于第三预设差值,当任一个差值小于第三预设差值时,确定睁眼状态不正常,当两个差值均小于第三预设差值时,确定睁眼状态正常。
存储设备可以获取第三预设差值,判断第一缩放系数与第一指定系数之间的差值是否小于第三预设差值,并判断第二缩放系数与第二指定系数之间的差值是否小于第三预设差值,当两个差值均小于第三预设差值时,表明左眼睁眼状态和右眼睁眼状态均正常,则显示设备会确定睁眼状态正常。其中,该第三预设差值用于指示第一缩放系数与第一指定系 数之间的最大差值,或用于指示第二缩放系数与第二指定系数之间的最大差值。该第三预设差值可以根据对精确度的需求确定,可以在显示设备中预先存储。
而当第一缩放系数与第一指定系数之间的差值不小于第三预设差值时,可以获知可能出现了左眼闭眼、左眼半睁半闭等情况,即左眼的睁眼状态不正常,则显示设备会确定睁眼状态不正常。当第二缩放系数与第二指定系数之间的差值不小于第三预设差值时,可以获知可能出现了右眼闭眼、右眼半睁半闭等情况,即右眼的睁眼状态不正常,则显示设备会确定睁眼状态不正常。
1007、当确定睁眼状态不正常时,处理器按照预设提示方式,提示用户正确睁眼。
该预设提示方式可以包括显示第二提示页面、发出第二提示语音等。其中,该第二提示页面可以包括用于提示正确睁眼的文字信息、图片信息等,例如该文字信息可以为“请睁开双眼,直视前方”,该第二提示语音为提示正确睁眼的语音。用户查看第二提示页面,或听到第二提示语音后,即可获知要正确睁眼。
需要说明的是,本实施例对执行检测佩戴状态的过程和执行检测睁眼状态的过程的先后顺序不做限定。即,可以先检测佩戴状态,当确定佩戴状态正常时再检测睁眼状态,当确定佩戴状态不正常时则无需执行后续步骤,直接提示用户重新佩戴显示设备。也可以先检测睁眼状态,当确定睁眼状态正常时再检测佩戴状态,而当确定睁眼状态不正常时则无需执行后续步骤,直接提示用户正确睁眼。
1008、当确定显示设备佩戴正常且睁眼状态正常时,处理器分别对第一图像和第二图像进行归一化处理。
本步骤中的归一化处理是指将不同距离下拍摄到的图像的尺寸均调整为预设参考距离下会拍摄到的图像的尺寸,该预设参考距离可以为15mm左右的距离。
具体来说,在上述步骤1005中,显示设备计算出了第一缩放系数和第二缩放系数,则本步骤1008中显示设备可以获取第一缩放系数和第二缩放系数,基于第一缩放系数对第一图像进行归一化处理,基于第二缩放系数对第二图像进行归一化处理。
针对基于第一缩放系数对第一图像进行归一化处理的具体过程,当第一缩放系数大于1,即第一距离大于预设参考距离时,表明左瞳孔与显示设备的距离偏大,则获取到的第一图像的尺寸实际偏小,则可以将第一缩放系数作为第一图像的放大比例,对第一图像进行等比例放大,放大后的第一图像的尺寸会等于第一图像的尺寸与第一缩放系数的乘积,可以将放大后的第一图像作为归一化处理后的第一图像。
同理地,当第一缩放系数小于1,即第一距离小于预设参考距离时,表明左瞳孔与显示设备的距离偏小,则获取到的第一图像的尺寸实际偏大,则可以将第一缩放系数作为第一图像的缩小比例,对第一图像进行等比例缩小,缩小后的第一图像的尺寸会等于第一图像的尺寸与第一缩放系数的乘积,可以将放小后的第一图像作为归一化处理后的第一图像。
针对基于第二缩放系数对第二图像进行归一化处理的具体过程,当第二缩放系数大于1,即第二距离大于预设参考距离时,表明右瞳孔与显示设备的距离偏大,则获取到的第二图像的尺寸实际偏小,则可以将第二缩放系数作为第二图像的放大比例,对第二图像进行等比例放大,放大后的第二图像的尺寸会等于第二图像的尺寸与第二缩放系数的乘积,可以将放大后的第二图像作为归一化处理后的第二图像。
同理地,当第二缩放系数小于1,即第二距离小于预设参考距离时,表明右瞳孔与显示 设备的距离偏小,则获取到的第二图像的尺寸实际偏大,则可以将第二缩放系数作为第二图像的缩小比例,对第二图像进行等比例缩小,缩小后的第二图像的尺寸会等于第二图像的尺寸与第二缩放系数的乘积,可以将放小后的第二图像作为归一化处理后的第二图像。
需要说明的第一点是,由于用户佩戴习惯的差异,用户瞳孔与显示设备的距离可能不同,导致不同距离下拍摄到的第一图像、第二图像的尺寸不同。而本实施例中,会设定预设参考距离,根据左眼球和第一透镜之间的第一距离与该预设参考距离,计算出第一缩放系数,通过该第一缩放系数对第一图像进行归一化处理,能够将各种第一距离下拍摄到的第一图像的尺寸统一地缩放至预设参考距离下会拍摄到的第一图像的尺寸,从而消除了第一距离对第一图像的尺寸造成的影响。同理地,根据右眼球和第二透镜之间的第二距离与该预设参考距离,计算出第二缩放系数,通过该第二缩放系数对第二图像进行归一化处理,将各种第二距离下拍摄到的第二图像的尺寸统一地缩放至预设参考距离下会拍摄到的第二图像的尺寸,从而消除了第二距离对第二图像的尺寸造成的影响。
也即是,左瞳孔与第一透镜的距离越远,第一图像会越小,而第一缩放系数越大,即第一图像的放大比例越大,则抵消掉了远距离的影响,保证放大后的第一图像为预设参考距离下会拍摄到的第一图像。而左瞳孔与第一透镜的距离越近,第一图像会越大,则第一缩放系数越小,即第一图像的缩小比例越小,则抵消掉了近距离的影响,保证缩小后的第一图像为预设参考距离下会拍摄到的第一图像。同理地,右瞳孔与第二透镜的距离越远,第二图像会越小,而第二缩放系数越大,即第二图像的放大比例越大,则抵消掉了远距离的影响,保证放大后的第二图像会为预设参考距离下拍摄到的第二图像。而右瞳孔与第二透镜的距离越近,第二图像会越大,则第二缩放系数越小,即第二图像的缩小比例越小,则抵消掉了近距离的影响,保证缩小后的第二图像会为预设参考距离下拍摄到的第二图像。
需要说明的第二点是,检测佩戴状态的步骤以及检测睁眼状态的步骤仅为可选步骤,可以无需执行,即,处理器可以无需执行上述步骤1004-步骤1007,在执行步骤1001-步骤1003之后,直接对第一图像和第二图像进行归一化处理。
1009、处理器根据归一化处理后的第一图像和归一化处理后的第二图像计算用户的左眼球和用户的右眼球之间的瞳距。
本步骤1009可以包括以下步骤一至步骤四。
步骤一、获取归一化处理后的第一图像中的左瞳孔中心的点坐标,并获取第一透镜的中心在归一化处理后的第一图像中的映射点坐标,计算左瞳孔中心的点坐标与映射点坐标之间的差值,作为第一差值。
示例性地,参见图11,可以以第一透镜与第二透镜之间的中点为坐标系的原点,以显示设备的纵向中轴线为坐标系的y轴,建立坐标系,则归一化处理后的第一图像中的任一点的横坐标的绝对值为该点到纵向中轴线之间的距离。再对归一化处理后的第一图像的左眼瞳孔进行检测,并对左瞳孔中心进行定位,转换到坐标系中,得到左瞳孔中心的点坐标与y轴之间的距离deye1,再计算第一透镜的透镜中心的映射点坐标与y轴之间的距离dlens1,计算两个距离之间的差值δ1=deye1–dlens1。
步骤二、获取归一化处理后的第二图像中的右瞳孔中心的点坐标,并获取第二透镜的中心在归一化处理后的第二图像中的映射点坐标,计算右瞳孔中心的点坐标与映射点坐标之间的差值,作为第二差值。
可以以同样的方式建立坐标系,并对归一化处理后的第二图像的右眼瞳孔进行检测,对右瞳孔中心进行定位,转换到坐标系中,得到右瞳孔中心的点坐标与y轴之间的距离deye2,再计算第二透镜的透镜中心的映射点坐标与y轴之间的距离dlens2,计算两个距离之间的差值δ2=deye2–dlens2。
步骤三、获取第一透镜的中心与第二透镜的中心之间的当前距离。
步骤四、计算该当前距离、该第一差值和第二差值的和值,得到该用户的瞳距。
假设该当前距离为d,则瞳距=d+δ1+δ2。
需要说明的是,由于归一化处理后的第一图像已经消除了第一距离的大小造成的影响,归一化处理后的第二图像已经消除了第二距离的大小造成的影响,根据归一化处理后的第一图像和归一化处理后的第二图像所计算的瞳距即为准确的瞳距。
1110、处理器根据用户的瞳距调整第一透镜和/或第二透镜。
以第一透镜的调节方向称为第一调节方向,第一透镜的调节距离称为第一调节距离,第二透镜的调节方向称为第二调节方向,第二透镜的调节距离称为第二调节距离为例,调整透镜的具体过程可以为:在上述步骤1009的步骤一中,当第一差值大于0时,可以将向内移动作为第一调节方向,当第一差值小于0时,可以将向外移动作为第一调节方向,并将第一差值的绝对值作为第一调节距离,根据第一调节方向和第一调节距离调节第一透镜。在上述步骤1007的步骤二中,当第二差值大于0时,可以将向内移动作为第二调节方向,当第二差值小于0时,可以将向外移动作为第二调节方向,并将第二差值的绝对值作为第二调节距离,根据第二调节方向和第二调节距离调节第二透镜。
那么,在当前距离大于瞳距时,第一透镜和/或第二透镜会向内移动,以使第一透镜和第二透镜之间的当前距离缩小,直至当前距离等于瞳距。在当前距离小于瞳距时,第一透镜和/或第二透镜会向外移动,以使第一透镜和第二透镜之间的当前距离增大,直至当前距离等于瞳距。那么,第一透镜和第二透镜之间的当前距离会等于瞳距,从而保证显示设备呈现图像的效果清晰。
可选地,在上述步骤1009的步骤一中,显示设备获取归一化处理后的第一图像中的左瞳孔中心的点坐标后,可以计算该点坐标与显示设备的纵向中轴线之间的距离,该距离近似等于左瞳孔中心与用户鼻梁之间的距离,可以将该距离作为左眼瞳距,单独根据左眼瞳距确定第一调节方向和第一调节距离,进而调节第一透镜。显示设备获取归一化处理后的第二图像中的右瞳孔中心的点坐标后,可以计算该点坐标与显示设备的纵向中轴线之间的距离,该距离近似等于右瞳孔中心与用户鼻梁之间的距离,可以将该距离作为右眼瞳距,单独根据右眼瞳距确定第二调节方向和第二调节距离,进而调节第二透镜。通过单独计算左眼瞳距和右眼瞳距,使得本实施例提供的方法也适用于左眼瞳距、右眼瞳距差距相对较大的特殊人群,提高了普适性。
需要说明的第一点是,显示设备获取到瞳距后,可以存储用户标识和瞳距之间的对应关系,以便用户下次佩戴显示设备时,直接根据已存储的瞳距调节透镜的位置,而无需再次获取瞳距。其中,显示设备可以在显示登录页面时,接收输入指令,获取已输入的用户标识,基于该用户标识查询已存储的用户标识与用户的瞳距之间的对应关系,当已存储的对应关系中不包括该用户标识对应的用户的瞳距时,则采用上述方法获取用户的瞳距,进而存储用户标识和用户的瞳距之间的对应关系。而当已存储的对应关系中包括该用户标识 对应的用户的瞳距时,表明显示设备之间已经获取到了用户的瞳距,则可以直接从对应关系中获取用户标识对应的用户的瞳距,而无需再次获取用户的瞳距,有利于快速获取瞳距,从而快速调节透镜之间的距离,节省了时间,提高了效率。
需要说明的第二点是,处理器调整第一透镜和/或第二透镜的过程实际可以通过显示设备本体包含的第一马达和第二马达实现:处理器可以在计算得到第一调节方向、第一调节距离、第二调节方向、第二调节距离后,将该第一调节方向和第一调节距离传入第一马达,将该第二调节方向和第二调节距离传入第二马达,第一马达会根据第一调节方向和第一调节距离推动第一透镜左右移动,第二马达会根据第二调节方向和第二调节距离推动第二透镜左右移动,最终达到调节瞳距的目的。
综上所述,本实施例提供的用于调整显示设备的图像呈现的方法的操作流程可以如下所示,包括以下步骤S1-步骤S4:
S1:由第一摄像头模组获取用户左眼球图像,由第二摄像头模组获取用户右眼球图像。
S2:由第一距离传感器测量用户的左眼球和第一透镜之间的距离,基于该距离对步骤S1中获取的左眼球图像进行归一化处理。由第二距离传感器测量用户的右眼球和第二透镜之间的距离,基于该距离对步骤S1中获取的右眼球图像进行归一化处理。
S3:针对步骤S2中的归一化处理后的左眼球图像对用户的瞳孔进行检测,对左瞳孔中心进行定位,得到左瞳孔中心的点坐标,并根据左瞳孔中心的点坐标与第一透镜的透镜中心在图像上的映射点坐标之间的差值,计算出第一透镜的调节方向和调节距离,针对步骤S2中的归一化处理后的右眼球图像对用户的瞳孔进行检测,对右瞳孔中心进行定位,得到右瞳孔中心的点坐标,并根据右瞳孔中心的点坐标与第二透镜的透镜中心在图像上的映射点坐标之间的差值,计算出第二透镜的调节方向和调节距离,并计算出瞳距值。
S4:根据当前使用显示设备的用户的瞳距,调整显示设备的第一透镜与第二透镜。
本实施例提供的方法,通过第一透镜与左眼球之间的第一距离、第二透镜与右眼球之间的第二距离,分别对第一图像和第二图像进行了归一化处理,归一化处理后的第一图像和归一化处理后的第二图像均消除了由于用户瞳孔与显示设备的远近不同造成的影响,因此根据归一化处理后的第一图像和归一化处理后的第二图像计算得到的瞳距,不会由于用户瞳孔与显示设备的远近不同产生误差,精度较高,按照这种高精度的瞳距调节透镜的位置,可以达到清晰的观看效果。
进一步地,通过检测显示设备的佩戴状态,当显示设备佩戴正常时,再调节透镜的位置,避免由于显示设备佩戴倾斜导致测量到的瞳距不准确的问题。
进一步地,当确定显示设备佩戴不正常时,通过按照预设提示方式,提示用户重新佩戴显示设备,可以引导用户正确佩戴显示设备,更加人性化。
进一步地,通过检测用户的睁眼状态,当睁眼状态正常时,再调节透镜的位置,避免由于拍摄图像时由于出现用户闭眼等睁眼状态不正常的情况导致测量到的瞳距不准确的问题。
进一步地,在测量得到瞳距后,可以存储用户标识和用户的瞳距之间的对应关系,当用户下次佩戴显示设备时,则可直接从对应关系中获取用户标识对应的用户的瞳距,而无需再次获取用户的瞳距,有利于快速获取瞳距,从而快速调节透镜之间的距离,节省了时间,提高了效率。
图12是根据一示例性实施例示出的一种用于调整显示设备的图像呈现的装置的结构示意图,如图12所示,该装置包括:获取模块1201、得到模块1202、归一化模块1203、计算模块1204和调整模块1205。
获取模块1201,用于获取第一图像和第二图像。
得到模块1202,用于得到第一距离和第二距离。
归一化模块1203,用于对第一图像和第二图像进行归一化处理;
计算模块1204,用于计算瞳距;
调整模块1205,用于根据瞳距调整第一透镜和/或第二透镜。
在一种可能的设计中,该归一化模块1203用于执行上述步骤1008。
在一种可能的设计中,该装置还包括:佩戴状态检测模块,用于执行上述步骤1004。
在一种可能的设计中,该装置还包括:提示模块,用于执行上述步骤1005。
在一种可能的设计中,该装置还包括:睁眼状态检测模块,用于执行上述步骤1006。
在一种可能的设计中,该装置还包括:存储模块,用于存储该用户标识和该用户的瞳距之间的对应关系。
上述所有可选技术方案,可以采用任意结合形成本公开的可选实施例,在此不再一一赘述。
需要说明的是:上述实施例提供的用于调整显示设备的图像呈现的装置在调整显示设备的图像呈现的时,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将显示设备的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。另外,上述实施例提供的用于调整显示设备的图像呈现的装置用于调整显示设备的图像呈现的方法实施例属于同一构思,其具体实现过程详见方法实施例,这里不再赘述。
本领域普通技术人员可以理解实现上述实施例的全部或部分步骤可以通过硬件来完成,也可以通过程序来指令相关的硬件完成,所述的程序可以存储于一种计算机可读存储介质中,上述提到的存储介质可以是只读存储器,磁盘或光盘等。
以上所述仅为本公开的可选实施例,并不用以限制本公开,凡在本公开的精神和原则之内,所作的任何修改、等同替换、改进等,均应包含在本公开的保护范围之内。

Claims (29)

  1. 一种显示设备,其特征在于,所述显示设备包括:显示设备本体,所述显示设备本体包含第一镜筒和第二镜筒,所述第一镜筒的第一端设置有第一透镜,所述第二镜筒的第一端设置有第二透镜;
    所述显示设备本体上还设置有第一距离传感器和第二距离传感器;
    所述第一距离传感器和所述第二距离传感器对称设置于所述显示设备本体的纵向中轴线的两侧;
    所述第一距离传感器用于对左眼球和所述第一透镜之间的距离进行测量;
    所述第二距离传感器用于对右眼球和所述第二透镜之间的距离进行测量。
  2. 根据权利要求1所述的显示设备,其特征在于,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域,所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域。
  3. 根据权利要求2所述的显示设备,其特征在于,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域与所述显示设备本体的横向中轴线的两个交点中靠近所述纵向中轴线的交点上;
    所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域与所述显示设备本体的横向中轴线的两个交点中靠近所述纵向中轴线的交点上。
  4. 根据权利要求1所述的显示设备,其特征在于,所述第一距离传感器设置于所述第一镜筒的第一端的边缘区域的外侧,所述第二距离传感器设置于所述第二镜筒的第一端的边缘区域的外侧。
  5. 根据权利要求2或4所述的显示设备,其特征在于,
    所述第一距离传感器设置于第一垂直线上,所述第一垂直线为与所述纵向中轴线平行且穿过所述第一透镜的透镜中心的直线;
    所述第二距离传感器设置于第二垂直线上,所述第二垂直线为与所述纵向中轴线平行且穿过所述第二透镜的透镜中心的直线。
  6. 根据权利要求2至5任一项所述的显示设备,其特征在于,所述第一镜筒的第一端的边缘区域中设置有发光二极管LED单元,所述第二镜筒的第一端的边缘区域中设置有LED单元。
  7. 一种用于调整显示设备的图像呈现的方法,其特征在于,所述方法包括:
    获取第一图像和第二图像,其中,第一图像中包含用户的左眼球的图像,第二图像中包含所述用户的右眼球的图像;
    得到第一透镜与所述用户的左眼球之间的第一距离,其中,所述第一透镜为所述显示设备的全部透镜中与所述用户的左眼球的距离最近的透镜;
    得到第二透镜与所述用户的右眼球之间的第二距离,其中,所述第二透镜为所述显示设备的全部透镜中与所述用户的右眼球的距离最近的透镜;
    根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理;
    根据归一化处理后的第一图像和归一化处理后的第二图像,计算所述用户的左眼球和所述用户的右眼球之间的瞳距;
    根据所述用户的瞳距调整所述第一透镜和/或所述第二透镜。
  8. 根据权利要求7所述的方法,其特征在于,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
    计算所述第一距离与预设参考距离之间的比值,得到第一缩放系数,所述预设参考距离为获取到预设尺寸的瞳孔图像时所述第一透镜与所述用户的左眼球之间的距离,或为获取到预设尺寸的瞳孔图像时所述第二透镜与所述用户的右眼球之间的距离;
    计算所述第二距离与所述预设参考距离之间的比值,得到第二缩放系数;
    采用所述第一缩放系数,对所述第一图像进行缩放;
    采用所述第二缩放系数,对所述第二图像进行缩放。
  9. 根据权利要求7所述的方法,其特征在于,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
    根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态;或,根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态;
    当确定所述显示设备佩戴正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
  10. 根据权利要求9所述的方法,其特征在于,所述根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态,包括:
    计算所述第一距离与所述第二距离之间的差值;
    当所述差值小于第一预设差值时,确定所述显示设备佩戴正常。
  11. 根据权利要求9所述的方法,其特征在于,所述根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态,包括:
    当所述第一图像中瞳孔中心的点坐标属于所述第一图像的预设范围,且所述第二图像中瞳孔中心的点坐标属于所述第二图像的预设范围时,确定所述显示设备佩戴正常。
  12. 根据权利要求9所述的方法,其特征在于,所述根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态,包括:
    计算所述第一图像中瞳孔中心的点坐标与所述第一图像中参考位置之间的距离,得到第三距离;
    计算所述第二图像中瞳孔中心的点坐标与所述第二图像中参考位置之间的距离,得到第四距离;
    当所述第三距离与所述第四距离之间的差值小于第二预设差值时,确定所述显示设备佩戴正常。
  13. 根据权利要求9-12任一项所述的方法,其特征在于,所述根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态;或,根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态之后,所述方法还包括:
    当确定所述显示设备佩戴不正常时,按照预设提示方式,提示所述用户重新佩戴所述显示设备。
  14. 根据权利要求7所述的方法,其特征在于,所述根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理,包括:
    根据所述第一距离、第二距离、所述第一图像和所述第二图像,检测所述用户的睁眼状态;
    当确定所述睁眼状态正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
  15. 根据权利要求14所述的方法,其特征在于,所述根据所述第一距离、第二距离、所述第一图像和所述第二图像,检测所述用户的睁眼状态,包括:
    计算左瞳孔直径与预设瞳孔直径之间的比值,得到第一指定系数,计算右瞳孔直径与所述预设瞳孔直径之间的比值,得到第二指定系数,所述预设瞳孔直径为当所述第一透镜与所述用户的左眼球之间的距离之间为预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,或者所述预设瞳孔直径为当所述第二透镜与所述用户的右眼球之间的距离之间为所述预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,所述左瞳孔直径根据所述第一图像确定,所述右瞳孔直径根据所述第二图像确定;
    当所述第一缩放系数与所述第一指定系数之间的差值、所述第二缩放系数与所述第二指定系数之间的差值均小于第三预设差值时,确定所述睁眼状态正常,所述第一缩放系数为所述第一距离与所述预设参考距离的比值,所述第二缩放系数为所述第二距离与所述预设参考距离的比值。
  16. 根据权利要求15所述的方法,其特征在于,所述预设瞳孔直径为基于所述显示设备所获取的多个样本第一图像或多个样本第二图像检测得到的样本瞳孔直径的平均值。
  17. 根据权利要求7所述的方法,其特征在于,所述根据归一化处理后的第一图像和归一化处理后的第二图像计算所述用户的左眼球和所述用户的右眼球之间的瞳距之后,所述方法还包括:
    获取所述用户的用户标识;
    存储所述用户标识和所述用户的瞳距之间的对应关系。
  18. 一种用于调整显示设备的图像呈现的装置,其特征在于,所述装置包括:
    获取模块,用于获取第一图像和第二图像,其中,第一图像中包含用户的左眼球的图像,第二图像中包含所述用户的右眼球的图像;
    得到模块,用于得到第一透镜与所述用户的左眼球之间的第一距离,其中,所述第一透镜为所述显示设备的全部透镜中与所述用户的左眼球的距离最近的透镜;
    所述得到模块,还用于得到第二透镜与所述用户的右眼球之间的第二距离,其中,所述第二透镜为所述显示设备的全部透镜中与所述用户的右眼球的距离最近的透镜;
    归一化模块,用于根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理;
    计算模块,用于根据归一化处理后的第一图像和归一化处理后的第二图像,计算所述用户的左眼球和所述用户的右眼球之间的瞳距;
    调整模块,用于根据所述用户的瞳距调整所述第一透镜和/或所述第二透镜。
  19. 根据权利要求18所述的装置,其特征在于,所述归一化模块,包括:
    计算子模块,用于计算所述第一距离与预设参考距离之间的比值,得到第一缩放系数,所述预设参考距离为获取到预设尺寸的瞳孔图像时所述第一透镜与所述用户的左眼球之间的距离,或为获取到预设尺寸的瞳孔图像时所述第二透镜与所述用户的右眼球之间的距离;
    所述计算子模块,还用于计算所述第二距离与所述预设参考距离之间的比值,得到第二缩放系数;
    缩放模块,用于采用所述第一缩放系数,对所述第一图像进行缩放;
    所述缩放模块,还用于采用所述第二缩放系数,对所述第二图像进行缩放。
  20. 根据权利要求18所述的装置,其特征在于,所述装置还包括:
    佩戴状态检测模块,用于根据所述第一距离和所述第二距离检测所述显示设备的佩戴状态;或,根据所述第一图像和所述第二图像检测所述显示设备的佩戴状态;
    所述归一化模块,还用于当所述佩戴状态检测模块确定所述显示设备佩戴正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
  21. 根据权利要求20所述的装置,其特征在于,所述佩戴状态检测模块,还用于计算所述第一距离与所述第二距离之间的差值;当所述差值小于第一预设差值时,确定所述显示设备佩戴正常。
  22. 根据权利要求20所述的装置,其特征在于,所述佩戴状态检测模块,还用于当所述第一图像中瞳孔中心的点坐标属于所述第一图像的预设范围,且所述第二图像中瞳孔中心的点坐标属于所述第二图像的预设范围时,确定所述显示设备佩戴正常。
  23. 根据权利要求20所述的装置,其特征在于,所述佩戴状态检测模块,还用于计算所述第一图像中瞳孔中心的点坐标与所述第一图像中参考位置之间的距离,得到第三距离;计算所述第二图像中瞳孔中心的点坐标与所述第二图像中参考位置之间的距离,得到第四距离;当所述第三距离与所述第四距离之间的差值小于第二预设差值时,确定所述显示设备佩戴正常。
  24. 根据权利要求18-23任一项所述的装置,其特征在于,所述装置还包括:
    提示模块,用于当确定所述显示设备佩戴不正常时,按照预设提示方式,提示所述用户重新佩戴所述显示设备。
  25. 根据权利要求18所述的装置,其特征在于,所述装置还包括:
    睁眼状态检测模块,用于根据所述第一距离、第二距离、所述第一图像和所述第二图像,检测所述用户的睁眼状态;
    所述归一化模块,还用于当确定所述睁眼状态正常时,根据所述第一距离、所述第二距离,分别对所述第一图像和所述第二图像进行归一化处理。
  26. 根据权利要求25所述的装置,其特征在于,所述睁眼状态检测模块,包括:
    计算子模块,用于计算左瞳孔直径与预设瞳孔直径之间的比值,得到第一指定系数,计算右瞳孔直径与所述预设瞳孔直径之间的比值,得到第二指定系数,所述预设瞳孔直径为当所述第一透镜与所述用户的左眼球之间的距离之间为预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,或者所述预设瞳孔直径为当所述第二透镜与所述用户的右眼球之间的距离之间为所述预设参考距离,且睁眼状态正常时会检测到的瞳孔直径,所述左瞳孔直径根据所述第一图像确定,所述右瞳孔直径根据所述第二图像确定;
    确定子模块,用于当所述第一缩放系数与所述第一指定系数之间的差值、所述第二缩放系数与所述第二指定系数之间的差值均小于第三预设差值时,确定所述睁眼状态正常,所述第一缩放系数为所述第一距离与所述预设参考距离的比值,所述第二缩放系数为所述第二距离与所述预设参考距离的比值。
  27. 根据权利要求25所述的装置,其特征在于,所述预设瞳孔直径为基于所述显示设备所获取的多个样本第一图像或多个样本第二图像检测得到的样本瞳孔直径的平均值。
  28. 根据权利要求18所述的装置,其特征在于,所述获取模块,还用于获取所述用户的用户标识;
    所述装置还包括:存储模块,用于存储所述用户标识和所述用户的瞳距之间的对应关系。
  29. 一种计算机可读存储介质,其特征在于,所述存储介质中存储有至少一条指令,所述指令由处理器加载并执行以实现如权利要求7至权利要求17任一项所述的用于调整显示设备的图像呈现的方法中所执行的操作。
PCT/CN2018/091076 2017-10-30 2018-06-13 显示设备、用于调整显示设备的图像呈现的方法及装置 WO2019085487A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP18874723.2A EP3683614B1 (en) 2017-10-30 2018-06-13 Display apparatus and method and device for adjusting image display of the display apparatus
US16/854,419 US11115648B2 (en) 2017-10-30 2020-04-21 Display device, and method and apparatus for adjusting image presence on display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711038481.5 2017-10-30
CN201711038481.5A CN109725418B (zh) 2017-10-30 2017-10-30 显示设备、用于调整显示设备的图像呈现的方法及装置

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/854,419 Continuation US11115648B2 (en) 2017-10-30 2020-04-21 Display device, and method and apparatus for adjusting image presence on display device

Publications (1)

Publication Number Publication Date
WO2019085487A1 true WO2019085487A1 (zh) 2019-05-09

Family

ID=66292450

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/091076 WO2019085487A1 (zh) 2017-10-30 2018-06-13 显示设备、用于调整显示设备的图像呈现的方法及装置

Country Status (4)

Country Link
US (1) US11115648B2 (zh)
EP (1) EP3683614B1 (zh)
CN (1) CN109725418B (zh)
WO (1) WO2019085487A1 (zh)

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN110794585A (zh) * 2019-10-16 2020-02-14 中国航空工业集团公司洛阳电光设备研究所 一种头盔显示器双目对准校准方法
JPWO2020079906A1 (ja) * 2018-10-15 2021-09-02 国立大学法人東京農工大学 ヘッドマウントディスプレイおよびこれに用いられる広焦点レンズの設計方法

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
CN110401831B (zh) * 2019-07-19 2021-06-11 歌尔光学科技有限公司 一种vr设备及其显示控制方法
US11064188B2 (en) * 2019-07-29 2021-07-13 Htc Corporation Driving method for calculating interpupillary distance and related head-mounted device
CN110426862B (zh) * 2019-08-14 2020-11-06 广东小天才科技有限公司 一种调节智能眼镜度数的方法及智能眼镜
CN110837294B (zh) * 2019-10-14 2023-12-12 成都西山居世游科技有限公司 一种基于眼球追踪的面部表情控制方法及系统
CN111025653B (zh) * 2019-12-29 2022-03-29 歌尔光学科技有限公司 调整头戴式设备中的显示器的方法、系统及头戴式设备
CN111158146A (zh) * 2019-12-31 2020-05-15 歌尔股份有限公司 头戴设备及其控制方法和计算机可读存储介质
CN111248852A (zh) * 2020-02-24 2020-06-09 脉脉眼镜科技(深圳)有限责任公司 智能瞳距测量装置和测量方法
CN112162408A (zh) * 2020-10-21 2021-01-01 潍坊歌尔电子有限公司 智能眼镜及其调节装置、调节系统和调节方法
US20230324710A1 (en) * 2022-04-11 2023-10-12 Snap Inc. Intelligent actuated nose bridge
CN115508979B (zh) * 2022-11-22 2023-03-21 深圳酷源数联科技有限公司 Ar眼镜自动调焦系统

Citations (6)

Publication number Priority date Publication date Assignee Title
US6215461B1 (en) * 1996-08-30 2001-04-10 Minolta Co., Ltd. Image viewing system and image display device
CN106019588A (zh) * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 一种可以自动测量瞳距的近眼显示装置及方法
CN106019589A (zh) * 2016-06-25 2016-10-12 深圳市虚拟现实科技有限公司 一种自动调整光学系统的近眼显示装置
CN205826969U (zh) * 2016-06-25 2016-12-21 深圳市虚拟现实科技有限公司 一种自适应近眼显示装置
CN106445167A (zh) * 2016-10-20 2017-02-22 网易(杭州)网络有限公司 单眼视界自适配调整方法及装置、头戴式可视设备
WO2017179938A1 (ko) * 2016-04-15 2017-10-19 이문기 눈 촬영 장치

Family Cites Families (45)

Publication number Priority date Publication date Assignee Title
FR2807169B1 (fr) * 2000-03-31 2002-06-07 Essilor Int Procede de montage de lentilles ophtalmiques
FR2863857B1 (fr) * 2003-12-23 2006-10-13 Essilor Int Mesure du comportement d'un porteur de lentilles ophtalmologiques
JP5507797B2 (ja) * 2007-03-12 2014-05-28 キヤノン株式会社 頭部装着型撮像表示装置及び画像生成装置
US8957835B2 (en) * 2008-09-30 2015-02-17 Apple Inc. Head-mounted display apparatus for retaining a portable electronic device with display
JP5538160B2 (ja) 2010-09-24 2014-07-02 パナソニック株式会社 瞳孔検出装置及び瞳孔検出方法
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
US20140247286A1 (en) * 2012-02-20 2014-09-04 Google Inc. Active Stabilization for Heads-Up Displays
US20130241805A1 (en) * 2012-03-15 2013-09-19 Google Inc. Using Convergence Angle to Select Among Different UI Elements
US20130258486A1 (en) * 2012-03-27 2013-10-03 Dumitru Mihai Ionescu Head-mount display
WO2014024649A1 (ja) * 2012-08-06 2014-02-13 ソニー株式会社 画像表示装置および画像表示方法
IN2015KN00682A (zh) * 2012-09-03 2015-07-17 Sensomotoric Instr Ges Für Innovative Sensorik Mbh
CN102937745B (zh) * 2012-11-13 2015-04-01 京东方科技集团股份有限公司 开放式头戴显示装置及其显示方法
TWI481901B (zh) * 2012-12-03 2015-04-21 Wistron Corp 頭戴式顯示裝置
JP2016509292A (ja) * 2013-01-03 2016-03-24 メタ カンパニー エクストラミッシブ空間撮像デジタル眼鏡装置または拡張介在ビジョン
JP6264734B2 (ja) * 2013-03-13 2018-01-24 セイコーエプソン株式会社 虚像表示装置
US20160033770A1 (en) * 2013-03-26 2016-02-04 Seiko Epson Corporation Head-mounted display device, control method of head-mounted display device, and display system
WO2014156389A1 (ja) * 2013-03-29 2014-10-02 ソニー株式会社 情報処理装置、提示状態制御方法及びプログラム
US9239460B2 (en) * 2013-05-10 2016-01-19 Microsoft Technology Licensing, Llc Calibration of eye location
US10345903B2 (en) * 2013-07-30 2019-07-09 Microsoft Technology Licensing, Llc Feedback for optic positioning in display devices
TWI507729B (zh) * 2013-08-02 2015-11-11 Quanta Comp Inc 頭戴式視覺輔助系統及其成像方法
US10099030B2 (en) * 2013-09-06 2018-10-16 Iarmourholdings, Inc. Mechanical and fluid system and method for the prevention and control of motion sickness, motion-induced vision sickness, and other variants of spatial disorientation and vertigo
US9470906B2 (en) 2013-10-16 2016-10-18 Magic Leap, Inc. Virtual or augmented reality headsets having adjustable interpupillary distance
EP2886041A1 (en) * 2013-12-17 2015-06-24 ESSILOR INTERNATIONAL (Compagnie Générale d'Optique) Method for calibrating a head-mounted eye tracking device
US10228562B2 (en) * 2014-02-21 2019-03-12 Sony Interactive Entertainment Inc. Realtime lens aberration correction from eye tracking
US9995933B2 (en) * 2014-06-24 2018-06-12 Microsoft Technology Licensing, Llc Display devices with transmittance compensation mask
JP6269387B2 (ja) * 2014-08-21 2018-01-31 セイコーエプソン株式会社 表示装置及び電子機器
KR102244222B1 (ko) * 2014-09-02 2021-04-26 삼성전자주식회사 가상 현실 서비스를 제공하기 위한 방법 및 이를 위한 장치들
CN104216841A (zh) * 2014-09-15 2014-12-17 联想(北京)有限公司 一种信息处理方法及电子设备
EP3216023A4 (en) * 2014-11-07 2018-07-18 Eye Labs, LLC Visual stabilization system for head-mounted displays
US10571696B2 (en) * 2014-12-26 2020-02-25 Cy Vision Inc. Near-to-eye display device
JP2016212177A (ja) * 2015-05-01 2016-12-15 セイコーエプソン株式会社 透過型表示装置
JP6693060B2 (ja) * 2015-07-06 2020-05-13 セイコーエプソン株式会社 表示システム、表示装置、表示装置の制御方法、及び、プログラム
EP3360001B1 (en) * 2015-10-08 2023-11-15 LG Electronics Inc. Head mount display device
US10424117B2 (en) * 2015-12-02 2019-09-24 Seiko Epson Corporation Controlling a display of a head-mounted display device
CN105549209B (zh) * 2016-02-04 2018-09-04 京东方科技集团股份有限公司 3d显示装置及其光栅周期调节方法
JP6686505B2 (ja) * 2016-02-15 2020-04-22 セイコーエプソン株式会社 頭部装着型画像表示装置
CN205485069U (zh) * 2016-02-25 2016-08-17 北京耐德佳显示技术有限公司 具有视度显示的近眼显示设备
KR20180002387A (ko) * 2016-06-29 2018-01-08 엘지전자 주식회사 헤드 마운티드 디스플레이 장치 및 이의 제어방법
KR20180014492A (ko) * 2016-08-01 2018-02-09 삼성전자주식회사 영상 표시 방법 및 이를 지원하는 전자 장치
CN106019600B (zh) * 2016-08-03 2018-10-09 深圳酷酷科技有限公司 光学模组及头戴式显示设备
WO2018078409A1 (en) * 2016-10-28 2018-05-03 Essilor International Method of determining an eye parameter of a user of a display device
CN206301289U (zh) 2016-11-29 2017-07-04 阿里巴巴集团控股有限公司 Vr终端设备
CN106802486A (zh) * 2017-04-11 2017-06-06 广东小天才科技有限公司 一种调节焦距的方法及头戴显示器
CN107132657B (zh) * 2017-05-22 2023-06-30 歌尔科技有限公司 Vr一体机、手机、手机和vr一体机套装
CN107167924B (zh) * 2017-07-24 2019-07-19 京东方科技集团股份有限公司 一种虚拟现实设备和虚拟现实设备的透镜调节方法

Patent Citations (6)

Publication number Priority date Publication date Assignee Title
US6215461B1 (en) * 1996-08-30 2001-04-10 Minolta Co., Ltd. Image viewing system and image display device
WO2017179938A1 (ko) * 2016-04-15 2017-10-19 이문기 눈 촬영 장치
CN106019588A (zh) * 2016-06-23 2016-10-12 深圳市虚拟现实科技有限公司 一种可以自动测量瞳距的近眼显示装置及方法
CN106019589A (zh) * 2016-06-25 2016-10-12 深圳市虚拟现实科技有限公司 一种自动调整光学系统的近眼显示装置
CN205826969U (zh) * 2016-06-25 2016-12-21 深圳市虚拟现实科技有限公司 一种自适应近眼显示装置
CN106445167A (zh) * 2016-10-20 2017-02-22 网易(杭州)网络有限公司 单眼视界自适配调整方法及装置、头戴式可视设备

Non-Patent Citations (1)

Title
See also references of EP3683614A4

Cited By (4)

Publication number Priority date Publication date Assignee Title
JPWO2020079906A1 (ja) * 2018-10-15 2021-09-02 国立大学法人東京農工大学 ヘッドマウントディスプレイおよびこれに用いられる広焦点レンズの設計方法
US11740459B2 (en) 2018-10-15 2023-08-29 National University Corporation Tokyo University Of Agriculture And Technology Head-mounted display and method for designing wide-focus lens to be used for the head-mounted display
CN110794585A (zh) * 2019-10-16 2020-02-14 中国航空工业集团公司洛阳电光设备研究所 一种头盔显示器双目对准校准方法
CN110794585B (zh) * 2019-10-16 2022-05-24 中国航空工业集团公司洛阳电光设备研究所 一种头盔显示器双目对准校准方法

Also Published As

Publication number Publication date
EP3683614B1 (en) 2022-12-28
CN109725418B (zh) 2020-10-16
EP3683614A4 (en) 2020-11-18
EP3683614A1 (en) 2020-07-22
US20200267380A1 (en) 2020-08-20
US11115648B2 (en) 2021-09-07
CN109725418A (zh) 2019-05-07

Similar Documents

Publication Publication Date Title
WO2019085487A1 (zh) 显示设备、用于调整显示设备的图像呈现的方法及装置
KR102469507B1 (ko) 정보 처리 장치, 정보 처리 방법 및 프로그램
US9291834B2 (en) System for the measurement of the interpupillary distance using a device equipped with a display and a camera
US8752963B2 (en) See-through display brightness control
US11500607B2 (en) Using detected pupil location to align optical components of a head-mounted display
JP2019527377A (ja) 視線追跡に基づき自動合焦する画像捕捉システム、デバイス及び方法
US10360450B2 (en) Image capturing and positioning method, image capturing and positioning device
US9335567B2 (en) Method for manufacturing binocular loupe
US20190235255A1 (en) System, head mounted device (hmd) and method for adjusting a position of an hmd worn by a user
US20140240470A1 (en) Method, system and device for improving optical measurement of ophthalmic spectacles
US20190073820A1 (en) Ray Tracing System for Optical Headsets
US11676422B2 (en) Devices, systems and methods for predicting gaze-related parameters
TW201814356A (zh) 頭戴顯示裝置與其鏡片位置調整方法
CN111512217B (zh) 用于确定镜片的光学参数的方法
JP2015025859A (ja) 画像処理装置、電子機器、眼鏡特性判定方法および眼鏡特性判定プログラム
JP7081599B2 (ja) 情報処理装置、情報処理方法、およびプログラム
JP2007029126A (ja) 視線検出装置
JP2005253778A (ja) 視線検出方法及び同装置
JP2003329541A (ja) アイポイントの位置決定方法及びアイポイント測定システム
JP2006095008A (ja) 視線検出方法
EP2772795A1 (en) Method, system and device for improving optical measurement of ophthalmic spectacles
US11662574B2 (en) Determining gaze depth using eye tracking functions
US20220142473A1 (en) Method and system for automatic pupil detection
WO2019116675A1 (ja) 情報処理装置、情報処理方法、およびプログラム
JP6433258B2 (ja) 双眼ルーペの製作方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18874723

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2018874723

Country of ref document: EP

Effective date: 20200416