WO2022127612A1 - Image calibration method and device (图像校准方法和设备) - Google Patents

Image calibration method and device (图像校准方法和设备)

Info

Publication number
WO2022127612A1
Application number
PCT/CN2021/135168
Authority
WO — WIPO (PCT)
Prior art keywords
user, image, offset, center point, displayed
Other languages
English (en), French (fr)
Inventors
靳云峰, 朱帅帅, 曾以亮, 单双
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority to US18/257,263 (published as US20240036306A1)
Priority to EP21905541.5A (published as EP4242727A4)
Publication of WO2022127612A1

Classifications

    • G02B27/0025 — Optical systems or apparatus for optical correction, e.g. distortion, aberration
    • G02B27/0068 — Optical correction having means for controlling the degree of correction, e.g. using phase modulators, movable elements
    • G02B27/0081 — Optical systems with means for altering, e.g. enlarging, the entrance or exit pupil
    • G02B27/017 — Head-up displays; head mounted
    • G02B27/0172 — Head mounted, characterised by optical features
    • G02B27/0179 — Display position adjusting means not related to the information to be displayed
    • G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013 — Eye tracking input arrangements
    • G02B2027/011 — Head-up displays comprising a device for correcting geometrical aberrations, distortion
    • G02B2027/0138 — Head-up displays comprising image capture systems, e.g. camera
    • G02B2027/0178 — Head mounted, eyeglass type
    • G02B2027/0181 — Adaptation to the pilot/driver

Definitions

  • the present application relates to the field of terminal technologies, and in particular, to an image calibration method and device.
  • The head-mounted display device includes two optical systems, left and right. Each optical system includes an optical lens and a display screen. After the user wears the head-mounted display device, the light emitted by the display screen enters the human eye after being refracted by the optical lens, so that the user sees an enlarged virtual image, which enhances the user's sense of immersion.
  • In some cases, however, the point seen when each of the user's two eyes looks straight ahead has a different degree of offset relative to the center point of the image displayed on the display screen; the two eyes then see different pictures, and the user will feel dizzy.
  • the present application provides an image calibration method and apparatus for calibrating an image.
  • The present application provides an image calibration method, comprising: acquiring a user's pupil position offset and a user's diopter, where the user's pupil position offset is the offset of the user's pupil relative to the optical axis; determining, according to the user's pupil position offset and the user's diopter, a first image center point offset corresponding to the user's pupil position offset and the user's diopter; and adjusting the center point of the image to be displayed according to the first image center point offset and the direction of the pupil offset.
  • Determining the first image center point offset corresponding to the user's pupil position offset and the user's diopter includes: determining the first image center point offset according to the user's pupil position offset, the user's diopter, and a first mapping relationship, where the first mapping relationship is used to indicate the correspondence between the pupil position offset, the diopter, and the image center point offset.
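  • For illustration only, this lookup can be sketched as a table indexed by pupil position offset and diopter. The key granularity and all numeric values below are assumptions made for the example; the application only specifies that the first mapping relationship maps a pupil position offset and a diopter to an image center point offset.

```python
# Minimal sketch of the first-mapping-relationship lookup (values are illustrative).
FIRST_MAPPING = {
    # (pupil position offset in mm, diopter in degrees): image center point offset in mm
    (1, 200): 1.1,
    (2, 200): 2.3,
    (3, 200): 3.6,
}

def first_image_center_point_offset(pupil_offset_mm: float, diopter: float) -> float:
    """Look up the image center point offset for the measured pupil offset and diopter."""
    key = (round(pupil_offset_mm), round(diopter / 100) * 100)  # snap to the table grid
    return FIRST_MAPPING[key]
```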
  • Adjusting the center point of the image to be displayed according to the first image center point offset and the direction of the pupil offset includes: moving the center point of the image to be displayed generated by the first module a first distance in the direction of the pupil offset, where the first distance is equal to the first image center point offset, and the first module is used to render the scene data according to an instruction input by the user or a default playback order and generate the corresponding image.
  • the first module is also referred to as a virtual camera in this embodiment of the present application.
  • Alternatively, adjusting the center point of the image to be displayed according to the first image center point offset and the direction of the pupil offset includes: adjusting the displacement parameter value of the first module according to the first image center point offset, so that the center point of the image to be displayed generated by the first module moves a first distance in the direction of the pupil offset, where the first distance is equal to the first image center point offset, and the first module is configured to render the scene data according to an instruction input by the user or a default playback sequence and generate the corresponding image.
  • Acquiring the offset of the user's pupil position includes: acquiring an eye image of the user; identifying the pupil position from the eye image; and taking the distance between the pupil position and the reference position as the user's pupil position offset.
  • Acquiring the user's diopter includes: acquiring the first position of each optical lens; searching the fourth mapping relationship for the first diopter corresponding to the first position; and using the first diopter as the user's diopter, where the fourth mapping relationship is used to indicate the correspondence between the position of each optical lens and the diopter.
  • With the above image calibration method, even when the user wears the head-mounted display device improperly, the point the user sees when looking straight ahead is the center point of the image displayed on the display screen. After the images corresponding to the optical systems on both sides are calibrated in this way, the user's two eyes see the same picture and no dizziness occurs.
  • The present application provides an image calibration method, comprising: acquiring a user's pupil position offset and a user's diopter, where the user's pupil position offset is the offset of the user's pupil relative to the optical axis; determining, according to the user's pupil position offset and the user's diopter, a first distortion correction parameter corresponding to the user's pupil position offset and the user's diopter; and correcting the position of each pixel on the image to be displayed according to the first distortion correction parameter.
  • Determining the first distortion correction parameter corresponding to the user's pupil position offset and the user's diopter includes: determining the first distortion correction parameter according to the user's pupil position offset, the user's diopter, and a second mapping relationship, where the second mapping relationship is used to indicate the correspondence between the pupil position offset, the diopter, and the distortion correction parameter.
  • The first distortion correction parameters include distortion correction parameters corresponding to the red sub-pixels, distortion correction parameters corresponding to the green sub-pixels, and distortion correction parameters corresponding to the blue sub-pixels. Correcting the position of each pixel on the to-be-displayed image according to the first distortion correction parameters includes: correcting the position of each red sub-pixel on the to-be-displayed image according to the distortion correction parameters corresponding to the red sub-pixels; correcting the position of each green sub-pixel according to the distortion correction parameters corresponding to the green sub-pixels; and correcting the position of each blue sub-pixel according to the distortion correction parameters corresponding to the blue sub-pixels.
  • Acquiring the offset of the user's pupil position includes: acquiring an eye image of the user; identifying the pupil position from the eye image; and taking the distance between the pupil position and the reference position as the user's pupil position offset.
  • Acquiring the user's diopter includes: acquiring the first position of each optical lens; searching the fourth mapping relationship for the first diopter corresponding to the first position; and using the first diopter as the user's diopter, where the fourth mapping relationship is used to indicate the correspondence between the position of each optical lens and the diopter.
  • Because the distortion correction parameters stored in the second mapping relationship are obtained by simulating different eye offsets and different diopters, distortion correction based on the parameters stored in the second mapping relationship is close to ideal.
  • The present application provides an image calibration method, comprising: acquiring a user's diopter; determining, according to the user's diopter, a first image height corresponding to the user's diopter; and adjusting the image height of the image to be displayed according to the first image height.
  • Determining the first image height corresponding to the user's diopter includes: determining the first image height according to the user's diopter and a third mapping relationship, where the third mapping relationship is used to indicate the correspondence between the diopter and the image height.
  • Adjusting the image height of the to-be-displayed image according to the first image height includes: adjusting the image height of the to-be-displayed image generated by the first module to be the same as the first image height, where the first module is used to render the scene data according to the instruction input by the user or the default play sequence and generate the corresponding image.
  • Alternatively, adjusting the image height of the image to be displayed according to the first image height includes: adjusting the field angle parameter value of the first module according to the first image height, so that the image height of the to-be-displayed image generated by the first module is the same as the first image height, where the first module is used to render the scene data according to the instruction input by the user or the default playback order and generate the corresponding image.
  • Acquiring the user's diopter includes: acquiring the first position of each optical lens; searching the fourth mapping relationship for the first diopter corresponding to the first position; and using the first diopter as the user's diopter, where the fourth mapping relationship is used to indicate the correspondence between the position of each optical lens and the diopter.
  • The above-mentioned image calibration method enables users with different diopters to have the same field of view when using the head-mounted display device, which enhances the user experience.
  • The present application provides an electronic device, comprising: a camera, a lens position detector, and a processor. The camera is used to capture an eye image, the lens position detector is used to detect the position of an optical lens, and the processor is configured to perform the method described in the first aspect, the second aspect, or the third aspect.
  • The present application provides an electronic device, comprising: a memory and a processor. The processor is configured to be coupled to the memory and to read and execute instructions in the memory, so as to implement the method described in the first aspect, the second aspect, or the third aspect.
  • The present application provides a readable storage medium on which a computer program is stored; when the computer program is executed, the method described in the first aspect, the second aspect, or the third aspect is implemented.
  • In the present application, the first mapping relationship can be used to find the corresponding image center point offset, and the center point of the image to be displayed on the display screen is adjusted according to that offset, so that even when the user wears the head-mounted display device improperly, the point the user sees when looking straight ahead is the center point of the image displayed on the display screen.
  • In addition, the corresponding distortion correction parameter is found in the second mapping relationship, and the image to be displayed on the display screen is distortion-corrected according to that parameter. Because the distortion correction parameters stored in the second mapping relationship are obtained by simulating different eye offsets and different diopters, distortion correction based on them is close to ideal.
  • Further, the corresponding image height is found in the third mapping relationship according to the user's diopter, and the image height of the image to be displayed on the display screen is adjusted accordingly, so that users with different diopters have the same field of view when using the head-mounted display device, which enhances the user experience.
  • FIG. 1 is an application scenario diagram provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of an imaging principle provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of an optical axis provided by an embodiment of the present application.
  • FIG. 4 is a frame diagram of a head-mounted display device 100 provided by an embodiment of the present application.
  • FIG. 5 is a schematic position diagram of the camera 404 and the lens position detector 405 provided by the embodiment of the present application;
  • FIG. 6 is a schematic flowchart of an embodiment of an image calibration method provided by the present application.
  • FIG. 7A is a schematic diagram of the principle of determining the offset of the user's pupil position according to an embodiment of the present application.
  • FIG. 7B is a schematic diagram of the principle of determining a reference position according to an embodiment of the present application.
  • FIG. 8 is a schematic diagram of pupil position offset provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of adjusting a center point of an image to be displayed according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of on-axis distortion provided by an embodiment of the present application.
  • FIG. 11 is a schematic diagram of off-axis distortion provided by an embodiment of the present application.
  • FIG. 12 is a schematic diagram of the variation of the degree of image distortion with the angle of view according to an embodiment of the present application.
  • FIG. 13 is a schematic diagram of field angles corresponding to different viewing degrees according to an embodiment of the present application.
  • FIG. 14 is a first schematic diagram of adjusting the image height of an image to be displayed according to an embodiment of the present application;
  • FIG. 15 is a second schematic diagram of adjusting the image height of an image to be displayed according to an embodiment of the present application.
  • FIG. 16 shows a schematic structural diagram of the electronic device 10 .
  • FIG. 1 is an application scenario diagram provided by an embodiment of the present application.
  • FIG. 1 shows a head-mounted display device 100.
  • the head-mounted display device 100 is a binocular display device.
  • the head-mounted display device 100 includes two optical systems: left and right.
  • The image calibration method provided by the embodiment of the present application is applied while the two optical systems display images.
  • The shape of the head-mounted display device 100 shown in FIG. 1 is glasses, which is only an example; the head-mounted display device 100 may also take other forms, including but not limited to goggles, a helmet, or a head-mounted display. The embodiment of the present application does not limit the form of the head-mounted display device 100.
  • each optical system includes two optical lenses and a display screen.
  • The optical system forms images using the principle shown in FIG. 2. Referring to FIG. 2, from left to right are the pupil 10, the optical lens 101, the optical lens 102, and the display screen 103.
  • The display screen 103 emits light, which enters the pupil 10 after being refracted by the optical lens 102 and the optical lens 101, so that the user sees an enlarged virtual image.
  • The virtual image range is shown in FIG. 2; this enhances the user's sense of immersion.
  • FIG. 2 only shows the imaging principle of one optical system in the head-mounted display device 100 , and the imaging principle of the other optical system is similar, and the imaging principle of the other optical system will not be repeated in this embodiment of the present application.
  • the line connecting the center point P1 of the optical lens 101, the center point P2 of the optical lens 102, and the center point P3 of the display screen 103 constitutes the optical axis of the optical system.
  • When the user's pupil 10 is on the corresponding optical axis, the point that the user sees when looking straight ahead corresponds to the center point of the image displayed on the display screen 103; that is, the point the user sees when looking straight ahead is not offset from the center point of the image displayed on the display screen 103.
  • When the user's two pupils 10 are each on the corresponding optical axis, the two eyes see the same picture, and the user experience is good.
  • In practice, however, the user's two pupils 10 are not necessarily on the corresponding optical axes. As a result, the points seen by the two eyes when looking straight ahead are offset to different degrees from the center point of the image displayed on the display screen 103, the pictures seen by the two eyes differ, and the user may feel dizzy.
  • The optical lens 101 and the optical lens 102 in the optical system correspond to different diopters at different positions. Therefore, the user can adjust the positions of the optical lens 101 and the optical lens 102 so that the positions of the two lenses match the user's own diopter. For example, if the user's right-eye prescription is approximately 700 degrees, when using the head-mounted display device 100 the user can adjust the optical lens 101 and the optical lens 102 in the right optical system to the positions corresponding to approximately 700 degrees.
  • At different diopters, the degree to which the point the user sees when looking straight ahead deviates from the center point of the image displayed on the display screen 103 also differs. That is, when the user's pupil 10 is not on the corresponding optical axis, the degree of deviation of the point seen by the user when looking straight ahead relative to the center point of the image displayed on the display screen 103 is related to the user's diopter.
  • the present application provides an image calibration method, which can be used in the process of displaying images by two optical systems.
  • In the method, the offset of the user's pupil position is detected with the camera, and the user's diopter is determined with the lens position detector. Then, according to the user's pupil position offset and the user's diopter, the corresponding image center point offset is looked up in a pre-configured mapping relationship. Finally, the center point of the image to be displayed on the display screen is adjusted according to the found image center point offset, so that the point seen by the user when looking straight ahead is not offset from the center point of the image displayed on the display screen 103.
  • FIG. 4 is a frame diagram of a head-mounted display device 100 according to an embodiment of the present application.
  • the head-mounted display device 100 includes but is not limited to: two optical systems 400 , a processor 401 , and a memory 402 .
  • Each optical system 400 includes, but is not limited to, a display screen 103 , an optical lens barrel module 403 , a camera 404 , a lens position detector 405 and a diopter adjustment module 406 .
  • the optical lens barrel module 403 , the display screen 103 , the camera 404 and the lens position detector 405 in each optical system 400 are all connected to the processor 401 , and the memory 402 is also connected to the processor 401 .
  • The memory 402 also stores related software units, including a virtual camera. The functions of each hardware or software unit are introduced one by one as follows:
  • the optical lens barrel module 403 includes the optical lens 101 and the optical lens 102 as shown in FIG. 2 .
  • the diopter adjustment module 406 is used to adjust the positions of the optical lens 101 and the optical lens 102 so that the positions of the optical lens 101 and the optical lens 102 match the user's diopter.
  • the number of lenses included in the optical lens barrel module 403 may also be one or more than two, and the embodiment of the present application uses two lenses as an example to describe the calibration process of the present application.
  • the camera 404 is used to capture an eye image and send the eye image to the processor 401 .
  • the lens position detector 405 is used to detect the position of each optical lens in the optical lens barrel module 403 , and send the position of each optical lens to the processor 401 .
  • the position of each optical lens detected by the lens position detector 405 may be the position of each optical lens relative to the display screen 103 .
  • the positions of the camera 404 and the lens position detector 405 in the two optical systems 400 may be as shown in FIG. 5 .
  • the processor 401 is configured to determine the offset of the user's pupil position according to the eye image sent by the camera 404 . It is also used to determine the user's diopter according to the position of each optical lens sent by the lens position detector 405 .
  • An application (application, APP) is also stored in the memory 402; the APP may be a game APP or a video APP.
  • This application does not limit the APP type, and a game APP is taken as an example.
  • the game APP includes several pre-established game scenes, each game scene has corresponding scene data, and the scene data is rendered to generate a corresponding image.
  • the virtual camera renders the scene data according to the instructions input by the user to generate a corresponding image; or, renders the scene data according to the default playback order to generate a corresponding image.
  • the virtual camera can send the generated image to be displayed to the display screen 103, and the display screen 103 is used to display the image sent by the virtual camera.
  • The virtual camera has a displacement parameter and a field angle parameter. The center point of the image generated by the virtual camera can be adjusted by changing the value of the displacement parameter, and the image height of the image generated by the virtual camera can be adjusted by changing the value of the field angle parameter.
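  • As a hedged illustration of these two parameters, the sketch below models them as plain fields; the actual rendering engine, parameter names, and units are not specified in this application and are assumptions here.

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """Illustrative stand-in for the virtual camera (first module)."""
    displacement_mm: tuple = (0.0, 0.0)  # changing this shifts the center point of the generated image
    field_angle_deg: float = 90.0        # changing this changes the image height of the generated image

    def render(self, scene_data):
        # Render scene_data into an image whose center point and image height
        # follow the two parameters above (rendering itself is omitted).
        raise NotImplementedError
```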
  • The processor 401 is further configured to adjust the center point of the image to be displayed on the display screen 103 according to the user's pupil position offset and the user's diopter. It is also configured to determine, according to the user's pupil position offset and the user's diopter, the distortion correction parameter corresponding to the image to be displayed on the display screen 103, and to perform distortion correction on that image according to the distortion correction parameter. It is further configured to adjust the image height of the image to be displayed on the display screen 103 according to the user's diopter.
  • In this way, the virtual image seen by the user is not affected by the pupil position offset or the diopter.
  • the user's two eyes see the same picture, which improves the user's experience of using the head-mounted display device 100 .
  • In some embodiments, the head-mounted display device 100 further includes a mechanical interpupillary distance adjustment device, which is used to adjust the distance between the two optical axes so that it matches the user's interpupillary distance.
  • However, if the mechanical interpupillary distance adjustment is not appropriate, or the pupils are shifted upward or downward relative to the optical axes after the user wears the head-mounted display device 100, the user's two pupils 10 are still not on the corresponding optical axes. In this case, the image to be displayed on the display screen 103 can be calibrated using the image calibration method provided in the embodiment of the present application.
  • FIG. 6 is a schematic flowchart of an embodiment of the image calibration method provided by the present application.
  • the image calibration method provided in this embodiment can be applied after the user triggers the start of the head-mounted display device 100, and can also be applied after the user triggers the opening of the APP installed on the head-mounted display device 100, which is not limited in this embodiment of the present application.
  • The image calibration method provided in this embodiment specifically includes the following steps:
  • the camera 404 captures an eye image, and sends the eye image to the processor 401 .
  • the lens position detector 405 detects the first position of each optical lens, and sends the first position of each optical lens to the processor 401 .
  • the processor 401 determines the offset of the user's pupil position according to the eye image.
  • Specifically, the pupil position is identified from the eye image, the distance between the pupil position and the reference position is then calculated, and that distance is taken as the user's pupil position offset.
  • To determine the reference position, a positioning point is set on the optical lens barrel module 403, and a groove matching the positioning point is set on the tooling. After the calibration object is installed on the tooling, the positioning point is clamped in the groove, which ensures that the calibration object is on the optical axis. The camera 404 then takes a picture to obtain an image of the calibration object, the center point of the calibration object is identified from that image, and the position of the center point is used as the above-mentioned reference position.
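  • The pupil-detection step itself is only outlined in the text; purely as a placeholder, the sketch below takes the centroid of the darkest pixels as the pupil center and returns its distance from the calibrated reference position. The thresholding heuristic and the pixel units are assumptions.

```python
import numpy as np

def pupil_position_offset(eye_image: np.ndarray, reference_xy: tuple) -> float:
    """Distance between the detected pupil center and the reference position (same units as the image)."""
    threshold = eye_image.mean() * 0.4                 # assume the pupil is the darkest region
    ys, xs = np.nonzero(eye_image < threshold)         # pixels that look like pupil
    pupil_xy = (xs.mean(), ys.mean())                  # centroid of those pixels
    return float(np.hypot(pupil_xy[0] - reference_xy[0],
                          pupil_xy[1] - reference_xy[1]))
```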
  • the processor 401 determines the user's diopter according to the first position of each optical lens.
  • Specifically, the processor 401 can search the fourth mapping relationship for the first diopter corresponding to the first position, and use the found first diopter as the user's diopter.
  • the fourth mapping relationship is the corresponding relationship between the pre-configured positions of the respective optical lenses and the diopter.
  • For example, the diopter may include 0 degrees, 100 degrees, 200 degrees, 300 degrees, 400 degrees, 500 degrees, 600 degrees, and 700 degrees.
  • The corresponding positions of the optical lenses are shown in Table 1; the positions in Table 1 are the positions of the optical lenses relative to the display screen 103. It should be noted that the correspondence shown in Table 1 is only an example; the correspondence between the position of the optical lens 101, the position of the optical lens 102, and the diopter may also be another relationship, and Table 1 does not constitute a limitation on the present application.
  • Table 1

        Optical lens 101    Optical lens 102    Diopter
        Position A1         Position B1         0 degrees
        Position A2         Position B2         100 degrees
        Position A3         Position B3         200 degrees
        Position A4         Position B4         300 degrees
        Position A5         Position B5         400 degrees
        Position A6         Position B6         500 degrees
        Position A7         Position B7         600 degrees
        Position A8         Position B8         700 degrees
  • Alternatively, the diopter can be designed as a function of the positions of the two optical lenses. After the user puts on the head-mounted display device 100, the positions of the two optical lenses are adjusted so that the diopter calculated by the function matches the user's diopter. This design allows a user with any diopter to use the head-mounted display device 100.
  • For example, if the detected first positions are position A3 and position B3 in Table 1, the processor 401 can find from the fourth mapping relationship that the user's diopter is 200 degrees.
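  • The fourth-mapping-relationship lookup can be sketched as a small dictionary keyed by the two detected lens positions, mirroring Table 1; the position labels are kept symbolic because the application does not give numeric positions.

```python
# Sketch of the Table-1-style fourth mapping relationship (positions are symbolic placeholders).
FOURTH_MAPPING = {
    # (optical lens 101 position, optical lens 102 position): diopter in degrees
    ("A1", "B1"): 0,
    ("A2", "B2"): 100,
    ("A3", "B3"): 200,
    ("A4", "B4"): 300,
}

def user_diopter(lens_101_position: str, lens_102_position: str) -> int:
    """Return the diopter corresponding to the detected first positions of the two optical lenses."""
    return FOURTH_MAPPING[(lens_101_position, lens_102_position)]
```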
  • The processor 401 adjusts the center point of the image to be displayed on the display screen 103 according to the user's pupil position offset and the user's diopter.
  • Specifically, the processor 401 searches the first mapping relationship for the first image center point offset corresponding to the user's pupil position offset obtained in S603 and the user's diopter obtained in S604, and adjusts the center point of the image to be displayed on the display screen based on the first image center point offset and the direction of the pupil offset.
  • the first mapping relationship is a pre-configured correspondence between the offset of the pupil position, the diopter, and the offset of the image center point.
  • the image to be displayed on the display screen is also referred to as the image to be displayed in this application.
  • the diopter may include 0 degrees, 100 degrees, 200 degrees, 300 degrees, 400 degrees, 500 degrees, 600 degrees, and 700 degrees
  • the pupil position offset may include 1mm, 2mm, 3mm and 4mm.
  • The image center point offsets corresponding to each diopter and each pupil position offset are shown in Table 2. It should be noted that the correspondence shown in Table 2 is only an example; the correspondence between pupil position offset, diopter, and image center point offset may also be another relationship, and Table 2 does not constitute a limitation on the present application.
  • During configuration, the pupil can be simulated as shifted upward by 2 mm relative to the optical axis, and the optical lens 101 and the optical lens 102 can be adjusted, according to the correspondence shown in Table 1, to the positions corresponding to 200 degrees. In this case, the point Q1 that the user sees when looking straight ahead is not on the optical axis; the direction in which Q1 deviates from the optical axis is the same as the direction in which the pupil deviates from the optical axis, and the distance by which Q1 deviates equals the distance by which the pupil deviates. That is, Q1 is also shifted upward by 2 mm relative to the optical axis. Accordingly, Q1 does not correspond to the center point of the image displayed on the display screen 103, but to a point shifted upward from that center point by a certain distance; this point is denoted P4 in FIG. 9. If the center point of the image displayed on the display screen 103 is moved up from P3 to P4, the point that the user sees when looking straight ahead is again the center point of the displayed image. Therefore, the distance A3 mm from P3 to P4 can be used as the image center point offset corresponding to a pupil position offset of 2 mm and a diopter of 200 degrees.
  • In one implementation, after finding the first image center point offset in the first mapping relationship according to the user's pupil position offset obtained in S603 and the user's diopter obtained in S604, the processor 401 can move the center point of the to-be-displayed image generated by the virtual camera a first distance in the direction of the pupil offset, the first distance being equal to the first image center point offset.
  • For example, the processor 401 can move the center point of the image to be displayed generated by the virtual camera by A3 mm in the direction of the pupil offset, so that the point the user sees when looking straight ahead is the center point of the image displayed on the display screen. After the optical systems on both sides are calibrated in this way, the user's two eyes see the same picture and no dizziness occurs.
  • FIG. 9 illustrates the adjustment process by taking an upward pupil displacement as an example; the adjustment process when the pupil is displaced in another direction is similar and is not repeated in this embodiment of the present application.
  • In another implementation, after finding the first image center point offset in the first mapping relationship according to the user's pupil position offset obtained in S603 and the user's diopter obtained in S604, the processor 401 can convert the first image center point offset into a displacement adjustment amount of the virtual camera and adjust the displacement parameter value of the virtual camera according to that adjustment amount, so that the center point of the to-be-displayed image generated by the virtual camera moves a first distance in the direction of the pupil offset, the first distance being equal to the first image center point offset.
  • For example, if the user's pupil position offset obtained in S603 is 2 mm and the user's diopter obtained in S604 is 200 degrees, the displacement adjustment amount of the virtual camera obtained by conversion is a3 mm, and the displacement parameter value of the virtual camera is adjusted accordingly. After the adjustment, the center point of the image generated by the virtual camera moves by A3 mm in the direction of the pupil offset, so that the point the user sees when looking straight ahead is the center point of the image displayed on the display screen.
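  • A minimal sketch of the first implementation of S605 is given below: it shifts the image content so that the center point moves by the looked-up offset in the pupil offset direction. The millimetre-to-pixel factor and the use of np.roll (which wraps instead of padding) are simplifying assumptions for illustration.

```python
import numpy as np

def shift_image_center(image: np.ndarray, offset_mm: float,
                       direction_xy: tuple, mm_per_pixel: float) -> np.ndarray:
    """Move the image center point by offset_mm along direction_xy (a unit vector)."""
    dx = int(round(offset_mm * direction_xy[0] / mm_per_pixel))  # horizontal shift in pixels
    dy = int(round(offset_mm * direction_xy[1] / mm_per_pixel))  # vertical shift in pixels
    return np.roll(np.roll(image, dy, axis=0), dx, axis=1)      # a real implementation would pad
```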
  • The processor 401 performs distortion correction on the image to be displayed on the display screen according to the user's pupil position offset and the user's diopter.
  • the image to be displayed on the display screen involved in S606 may be the image obtained after the processor 401 executes S605.
  • FIG. 10 and 11 are schematic diagrams of two distortions of the same image viewed by the user, FIG. 10 is a schematic diagram of on-axis distortion, and FIG. 11 is a schematic diagram of off-axis distortion.
  • On-axis distortion refers to the distortion seen by the user when the pupil is not offset from the optical axis.
  • Off-axis distortion refers to the distortion seen by the user when the pupil is offset from the optical axis; FIG. 11 shows the distortion seen when the pupil is shifted upward relative to the optical axis.
  • the grid intersections represent the positions of the pixels when no distortion occurs, and the cross-shaped points represent the positions of the pixels after the image is distorted. It can be seen from the comparison that the image distortion seen by the user when the pupil does not shift is different from the image distortion seen by the user when the pupil is shifted. For example, when the pupil does not shift, the pixel at P0 is distorted and then moved to P1 in FIG. 10 , as indicated by the arrow in FIG. 10 . When the pupil is shifted, the pixel at P0 is distorted and moved to P2 in FIG. 11 , as indicated by the arrow in FIG. 11 . The position indicated by P1 in FIG. 10 is obviously different from the position indicated by P2 in FIG. 11 .
  • In FIG. 12, the abscissa of the upper and lower plots is the degree of distortion and the ordinate is the field angle. The upper plot shows how the degree of image distortion varies with the field angle when the diopter is 700 degrees, and the lower plot shows the variation when the diopter is 0 degrees. The comparison shows that users with different diopters see different degrees of image distortion, that is, the degree of image distortion is related to the diopter.
  • Specifically, the processor 401 searches the second mapping relationship for the first distortion correction parameter corresponding to the user's pupil position offset obtained in S603 and the user's diopter obtained in S604, and, for each pixel on the image to be displayed, substitutes the first distortion correction parameter and the coordinates of the pixel into the distortion correction formula to obtain the coordinates of that pixel after distortion correction.
  • the second mapping relationship is the pre-configured correspondence between the pupil position offset, the diopter, and the distortion correction parameters.
  • In the distortion correction formula, R represents the distance between the pixel before distortion correction and the center point of the image, and can be calculated from the coordinates of the pixel before correction and the coordinates of the image center point; R' represents the distance between the pixel after distortion correction and the center point of the image. After R' is obtained from R and the first distortion correction parameter, the pixel is moved to the position corresponding to R' along the line connecting the pixel before correction and the image center point, which completes the distortion correction of that pixel.
  • the above-mentioned distortion correction formula may also be the polynomial formula used in the following polynomial fitting process.
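  • Assuming the polynomial form mentioned above, the per-pixel correction of S606 can be sketched as follows; the specific polynomial R' = R·(1 + k1·R² + k2·R⁴) and the number of coefficients are assumptions, since the application only states that R and the first distortion correction parameter are substituted into a distortion correction formula to obtain R'.

```python
import math

def correct_pixel(xy: tuple, center_xy: tuple, coeffs: tuple) -> tuple:
    """Move one pixel from its pre-correction position to its corrected position."""
    x, y = xy[0] - center_xy[0], xy[1] - center_xy[1]
    r = math.hypot(x, y)                       # R: distance from the image center before correction
    k1, k2 = coeffs                            # assumed two-coefficient distortion correction parameter
    r_new = r * (1 + k1 * r**2 + k2 * r**4)    # R': distance from the image center after correction
    scale = r_new / r if r else 1.0
    # Move the pixel along the line joining it to the image center.
    return (center_xy[0] + x * scale, center_xy[1] + y * scale)
```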
  • For example, the diopter may include 0 degrees, 100 degrees, 200 degrees, 300 degrees, 400 degrees, 500 degrees, 600 degrees, and 700 degrees, and the pupil position offset may include 1 mm, 2 mm, 3 mm, and 4 mm.
  • The distortion correction parameters corresponding to each diopter and each pupil position offset are shown in Table 3. It should be noted that the correspondence shown in Table 3 is only an example of the relationship between pupil position offset, diopter, and distortion correction parameter; other relationships are possible, and a distortion correction parameter may also contain more than one value. Table 3 does not constitute a limitation on the present application.
  • During configuration, the eye position can be simulated as shifted upward by 2 mm relative to the optical axis, and the optical lens 101 and the optical lens 102 can be adjusted, according to the correspondence shown in Table 1, to the positions corresponding to 200 degrees. The distortion correction parameter obtained under this simulation can then be used as the distortion correction parameter corresponding to a pupil position offset of 2 mm and a diopter of 200 degrees, and stored in the second mapping relationship. Because the distortion correction parameters stored in the second mapping relationship are obtained by simulating different eye offsets and different diopters, distortion correction based on them is close to ideal.
  • Each pixel on the image is composed of three sub-pixels of red, blue and green. After the image is distorted, the three sub-pixels move to different positions, resulting in color separation of the image displayed on the screen.
  • Therefore, the red sub-pixels, the blue sub-pixels, and the green sub-pixels can each be subjected to the above distortion correction processing separately to overcome the color separation problem.
  • To this end, the correspondence between the pupil position offset, the diopter, and the distortion correction parameters corresponding to the sub-pixels of each color may be pre-configured.
  • the diopter may include 0 degrees, 100 degrees, 200 degrees, 300 degrees, 400 degrees, 500 degrees, 600 degrees and 700 degrees
  • the pupil position offset may include 1 mm, 2 mm, 3 mm and 4 mm
  • the distortion correction parameters may include distortion correction parameters corresponding to red sub-pixels, distortion correction parameters corresponding to blue sub-pixels, and distortion correction parameters corresponding to green sub-pixels, as shown in Table 4.
  • the corresponding relationship shown in Table 4 is only an example, and the corresponding relationship between the pupil position offset, the diopter, and the distortion correction parameters of each color may also be other relationships, and Table 4 does not constitute a limitation on the present application.
  • In this case, the processor 401 searches the second mapping relationship shown in Table 4, according to the user's pupil position offset obtained in S603 and the user's diopter obtained in S604, for the distortion correction parameters corresponding to the red sub-pixels, the distortion correction parameters corresponding to the blue sub-pixels, and the distortion correction parameters corresponding to the green sub-pixels. For each red sub-pixel on the image to be displayed, the distortion correction parameters corresponding to the red sub-pixels and the coordinates of that red sub-pixel are substituted into the distortion correction formula to obtain the coordinates of the red sub-pixel after correction; for each green sub-pixel, the distortion correction parameters corresponding to the green sub-pixels and the coordinates of that green sub-pixel are substituted into the formula to obtain its corrected coordinates; and likewise, for each blue sub-pixel, the distortion correction parameters corresponding to the blue sub-pixels are used to obtain its corrected coordinates.
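  • The per-color variant can reuse the same radial sketch with one coefficient set per channel, so that the red, green, and blue sub-pixels are remapped separately; the coefficient values below are invented for illustration and correspond to nothing in Table 4.

```python
# Hypothetical Table-4-style parameters: one coefficient pair per color channel.
PARAMS_BY_CHANNEL = {
    "red":   (1.0e-7, 2.0e-13),
    "green": (1.1e-7, 2.1e-13),
    "blue":  (1.2e-7, 2.2e-13),
}

def correct_subpixel(xy: tuple, center_xy: tuple, channel: str) -> tuple:
    """Correct one sub-pixel of the given color using its channel-specific parameters."""
    return correct_pixel(xy, center_xy, PARAMS_BY_CHANNEL[channel])  # correct_pixel() from the sketch above
```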
  • The processor 401 adjusts the image height of the image to be displayed on the display screen according to the user's diopter.
  • Specifically, the processor 401 searches the third mapping relationship for the first image height corresponding to the user's diopter obtained in S604, and adjusts the image height of the image to be displayed on the display screen based on the first image height.
  • The third mapping relationship is a pre-configured correspondence between diopter and image height.
  • For example, the diopter may include 0 degrees, 100 degrees, 200 degrees, 300 degrees, 400 degrees, 500 degrees, 600 degrees, and 700 degrees, and the image height corresponding to each diopter is shown in Table 5. It should be noted that the correspondence shown in Table 5 is only an example; the correspondence between diopter and image height may also be another relationship, and Table 5 does not constitute a limitation on the present application.
  • In FIG. 13, the diopter corresponding to the optical lens 101 at the solid-line position is 0 degrees, and the diopter corresponding to the optical lens 101 moved to the dotted-line position is 200 degrees; the field angle of a user whose diopter is 0 degrees is denoted α, and the field angle of a user whose diopter is 200 degrees is denoted β.
  • To compensate, the image height of the image to be displayed on the display screen 103 can be adjusted until the field angle of the user whose diopter is 200 degrees also becomes α; the image height at this time is the H3 mm in Table 5.
  • In one implementation, the processor 401 searches the third mapping relationship for the first image height corresponding to the user's diopter obtained in S604, and then adjusts the image height of the to-be-displayed image generated by the virtual camera to be the same as the first image height. Because the image heights stored in the third mapping relationship are obtained by simulating different diopters, after the image to be displayed on the display screen 103 is adjusted according to the found image height, the user's field of view is the same as that of users with other diopters.
  • For example, the processor 401 can adjust the image height of the image generated by the virtual camera to H3 mm.
  • In this way, the field angle when using the head-mounted display device 100 is α for both a user with a diopter of 200 degrees and a user with a diopter of 0 degrees, which enhances the user experience.
  • In another implementation, the processor 401 finds the first image height in the third mapping relationship according to the user's diopter obtained in S604, converts it into a field angle adjustment amount of the virtual camera, and adjusts the field angle parameter value of the virtual camera according to that adjustment amount, so that the image height of the to-be-displayed image generated by the virtual camera is the same as the first image height.
  • For example, if the field angle adjustment amount of the virtual camera obtained by conversion is x degrees, the field angle parameter value of the virtual camera is adjusted by x degrees. After this adjustment, the image height of the image generated by the virtual camera becomes H3 mm, so that the field angle when using the head-mounted display device is α for both a user with a diopter of 200 degrees and a user with a diopter of 0 degrees, which enhances the user experience.
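  • For this second implementation, converting a target image height into a field angle can be sketched with a simple pinhole-style relation; both the relation and the notion of a fixed virtual image distance are modelling assumptions, not values taken from this application.

```python
import math

def field_angle_for_image_height(target_height_mm: float, virtual_image_distance_mm: float) -> float:
    """Field angle (degrees) at which the generated image is target_height_mm tall."""
    return math.degrees(2 * math.atan(target_height_mm / (2 * virtual_image_distance_mm)))
```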
  • The order of S601 and S602 is not limited in this embodiment of the present application.
  • S601 may be performed first, then S602 may be performed, or S602 may be performed first, and then S601 may be performed, or S601 and S602 may be performed simultaneously.
  • this embodiment of the present application does not limit the sequence of S603 and S604.
  • S603 may be executed first and then S604 may be executed, or S604 may be executed first and then S603 may be executed, or S603 and S604 may be executed simultaneously.
  • this embodiment of the present application does not limit the sequence of S605, S606, and S607, and S605, S606, and S607 may be performed in any order.
  • the sequence of Figure 6 is only an example.
  • To summarize, a search can be performed in the first mapping relationship for the corresponding image center point offset, and the center point of the image to be displayed on the display screen is adjusted according to that offset, so that the point the user sees when looking straight ahead is the center point of the image displayed on the display screen; after the optical systems on both sides undergo this calibration, the user's two eyes see the same picture and no dizziness occurs.
  • In addition, the corresponding distortion correction parameter is found in the second mapping relationship, and the image to be displayed on the display screen is distortion-corrected according to that parameter. Because the distortion correction parameters stored in the second mapping relationship are obtained by simulating different eye offsets and different diopters, distortion correction based on them is close to ideal.
  • Further, the corresponding image height is found in the third mapping relationship according to the user's diopter, and the image height of the image to be displayed on the display screen is adjusted accordingly, so that users with different diopters have the same field of view when using the head-mounted display device, which enhances the user experience.
  • FIG. 16 shows a schematic structural diagram of the electronic device 10 .
  • the electronic device 10 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, speaker 170A, receiver 170B, microphone 170C, headphone jack 170D, sensor module 180, buttons 190, motor 191, indicator 192, camera 193, display screen 194, etc.
  • the sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an acceleration sensor 180E, a distance sensor 180F, a touch sensor 180K, and the like.
  • the structures illustrated in the embodiments of the present invention do not constitute a specific limitation on the electronic device 10 .
  • The electronic device 10 may include more or fewer components than shown, or combine some components, or split some components, or arrange the components differently.
  • the illustrated components may be implemented in hardware, software, or a combination of software and hardware.
  • the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), controller, video codec, digital signal processor (digital signal processor, DSP), baseband processor, and/or neural-network processing unit (neural-network processing unit, NPU), etc. Wherein, different processing units may be independent devices, or may be integrated in one or more processors.
  • the controller can generate an operation control signal according to the instruction operation code and timing signal, and complete the control of fetching and executing instructions.
  • a memory may also be provided in the processor 110 for storing instructions and data.
  • the memory in processor 110 is cache memory. This memory may hold instructions or data that have just been used or recycled by the processor 110 . If the processor 110 needs to use the instruction or data again, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby increasing the efficiency of the system.
  • The processor 110 is configured to adjust the center point of the image to be displayed on the display screen 194 according to the user's pupil position offset and the user's diopter. It is also configured to determine, according to the user's pupil position offset and the user's diopter, the distortion correction parameter corresponding to the image to be displayed on the display screen 194, and to perform distortion correction on that image according to the distortion correction parameter. It is further configured to adjust the image height of the image to be displayed on the display screen 194 according to the user's diopter.
  • the processor 110 may include one or more interfaces.
  • the interface may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous transceiver (universal asynchronous transmitter) receiver/transmitter, UART) interface, mobile industry processor interface (MIPI), general-purpose input/output (GPIO) interface, subscriber identity module (SIM) interface, and / or universal serial bus (universal serial bus, USB) interface, etc.
  • the I2C interface is a bidirectional synchronous serial bus that includes a serial data line (SDA) and a serial clock line (SCL).
  • the processor 110 may contain multiple sets of I2C buses.
  • the processor 110 can be respectively coupled to the touch sensor 180K, the charger, the flash, the camera 193 and the like through different I2C bus interfaces.
  • the processor 110 may couple the touch sensor 180K through the I2C interface, so that the processor 110 and the touch sensor 180K communicate through the I2C bus interface, so as to realize the touch function of the electronic device 10 .
  • the I2S interface can be used for audio communication.
  • the processor 110 may contain multiple sets of I2S buses.
  • the processor 110 may be coupled with the audio module 170 through an I2S bus to implement communication between the processor 110 and the audio module 170 .
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the I2S interface, so as to realize the function of answering calls through a Bluetooth headset.
  • the PCM interface can also be used for audio communications, sampling, quantizing and encoding analog signals.
  • the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface.
  • the audio module 170 can also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to realize the function of answering calls through the Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
  • the UART interface is a universal serial data bus used for asynchronous communication.
  • the bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
  • a UART interface is typically used to connect the processor 110 with the wireless communication module 160 .
  • the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
  • the audio module 170 can transmit audio signals to the wireless communication module 160 through the UART interface, so as to realize the function of playing music through the Bluetooth headset.
  • the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193 .
  • MIPI interfaces include camera serial interface (CSI), display serial interface (DSI), etc.
  • the processor 110 and the camera 193 communicate through a CSI interface to implement the photographing function of the electronic device 10 .
  • the processor 110 communicates with the display screen 194 through the DSI interface to implement the display function of the electronic device 10 .
  • the GPIO interface can be configured by software.
  • the GPIO interface can be configured as a control signal or as a data signal.
  • the GPIO interface may be used to connect the processor 110 with the camera 193, the display screen 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like.
  • the GPIO interface can also be configured as I2C interface, I2S interface, UART interface, MIPI interface, etc.
  • the USB interface 130 is an interface that conforms to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, and the like.
  • the USB interface 130 can be used to connect a charger to charge the electronic device 10, and can also be used to transmit data between the electronic device 10 and peripheral devices. It can also be used to connect headphones to play audio through the headphones.
  • the interface can also be used to connect other electronic devices, such as AR devices.
  • the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration, and does not constitute a structural limitation of the electronic device 10 .
  • in other embodiments, the electronic device 10 may also use an interface connection manner different from that in the foregoing embodiment, or a combination of multiple interface connection manners.
  • the wireless communication function of the electronic device 10 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modulation and demodulation processor, the baseband processor, and the like.
  • Antenna 1 and Antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in electronic device 10 may be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • the antenna 1 can be multiplexed as a diversity antenna of the wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
  • the mobile communication module 150 can provide wireless communication solutions including 2G/3G/4G/5G etc. applied on the electronic device 10 .
  • the mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (LNA) and the like.
  • the mobile communication module 150 can receive electromagnetic waves from the antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
  • the mobile communication module 150 can also amplify the signal modulated by the modulation and demodulation processor, and then turn it into an electromagnetic wave for radiation through the antenna 1 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the processor 110 .
  • at least part of the functional modules of the mobile communication module 150 may be provided in the same device as at least part of the modules of the processor 110 .
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low frequency baseband signal is processed by the baseband processor and passed to the application processor.
  • the application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display screen 194 .
  • the modem processor may be a separate device.
  • the modulation and demodulation processor may be independent of the processor 110, and be provided in the same device as the mobile communication module 150 or other functional modules.
  • the wireless communication module 160 can provide wireless communication solutions applied on the electronic device 10, including wireless local area network (WLAN) (such as wireless fidelity (Wi-Fi) network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), and infrared (IR) technology.
  • WLAN wireless local area networks
  • BT Bluetooth
  • GNSS global navigation satellite system
  • FM frequency modulation
  • NFC near field communication
  • IR infrared technology
  • the wireless communication module 160 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 160 receives electromagnetic waves via the antenna 2 , frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 110 .
  • the wireless communication module 160 can also receive the signal to be sent from the processor 110 , perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation through the antenna 2 .
  • the antenna 1 of the electronic device 10 is coupled with the mobile communication module 150, and the antenna 2 is coupled with the wireless communication module 160, so that the electronic device 10 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology, etc.
  • the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
  • GPS global positioning system
  • GLONASS global navigation satellite system
  • BDS Beidou navigation satellite system
  • QZSS quasi-zenith satellite system
  • SBAS satellite based augmentation systems
  • the electronic device 10 implements a display function through a GPU, a display screen 194, an application processor, and the like.
  • the GPU is a microprocessor for image processing, and is connected to the display screen 194 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
  • Display screen 194 is used to display images, videos, and the like.
  • Display screen 194 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and so on.
  • the electronic device 10 may include one or N display screens 194 , where N is a positive integer greater than one.
  • the electronic device 10 can realize the shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194 and the application processor.
  • the ISP is used to process the data fed back by the camera 193 .
  • for example, when a photo is taken, the shutter is opened, light is transmitted through the lens to the camera's photosensitive element, the optical signal is converted into an electrical signal, and the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone.
  • ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 193 .
  • the camera 193 is used for capturing still images or video.
  • an optical image of an object is generated through the lens and projected onto the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • CMOS complementary metal-oxide-semiconductor
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • the DSP converts the digital image signal into an image signal in a standard format such as RGB or YUV.
  • the electronic device 10 may include 1 or N cameras 193 , where N is a positive integer greater than 1.
  • a digital signal processor is used to process digital signals, in addition to processing digital image signals, it can also process other digital signals. For example, when the electronic device 10 selects a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy, and the like.
  • Video codecs are used to compress or decompress digital video.
  • Electronic device 10 may support one or more video codecs.
  • the electronic device 10 can play or record videos in various encoding formats, for example, moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, and so on.
  • MPEG moving picture experts group
  • the NPU is a neural-network (NN) computing processor.
  • NN neural-network
  • Applications such as intelligent cognition of the electronic device 10 can be implemented through the NPU, such as image recognition, face recognition, speech recognition, text understanding, and the like.
  • the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 10 .
  • the external memory card communicates with the processor 110 through the external memory interface 120 to implement the data storage function, for example, to save files such as music and videos in the external memory card.
  • Internal memory 121 may be used to store computer executable program code, which includes instructions.
  • the internal memory 121 may include a storage program area and a storage data area.
  • the storage program area can store an operating system, an application program required for at least one function (such as a sound playback function, an image playback function, etc.), and the like.
  • the storage data area can store data (such as audio data, phone book, etc.) created during the use of the electronic device 10 and the like.
  • the internal memory 121 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (UFS), and the like.
  • the processor 110 executes various functional applications and data processing of the electronic device 10 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
  • the electronic device 10 may implement audio functions, such as music playback and recording, through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like.
  • the audio module 170 is used for converting digital audio information into analog audio signal output, and also for converting analog audio input into digital audio signal. Audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be provided in the processor 110 , or some functional modules of the audio module 170 may be provided in the processor 110 .
  • the speaker 170A, also referred to as a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the electronic device 10 can listen to music through the speaker 170A, or listen to a hands-free call.
  • the receiver 170B, also referred to as an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the electronic device 10 answers a call or plays a voice message, the voice can be heard by placing the receiver 170B close to the ear.
  • the microphone 170C, also called a "mic", is used to convert sound signals into electrical signals.
  • when making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
  • the electronic device 10 may be provided with at least one microphone 170C. In other embodiments, the electronic device 10 may be provided with two microphones 170C, which can implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 10 may further be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement directional recording functions.
  • the earphone jack 170D is used to connect wired earphones.
  • the earphone interface 170D can be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a cellular telecommunications industry association of the USA (CTIA) standard interface.
  • OMTP open mobile terminal platform
  • CTIA cellular telecommunications industry association of the USA
  • the pressure sensor 180A is used to sense pressure signals, and can convert the pressure signals into electrical signals.
  • the pressure sensor 180A may be provided on the display screen 194 .
  • the capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes.
  • the electronic device 10 determines the intensity of the pressure according to the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 10 detects the intensity of the touch operation according to the pressure sensor 180A.
  • the electronic device 10 can also calculate the touched position according to the detection signal of the pressure sensor 180A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than the first pressure threshold acts on the short message application icon, the instruction for viewing the short message is executed. When a touch operation with a touch operation intensity greater than or equal to the first pressure threshold acts on the short message application icon, the instruction to create a new short message is executed.
  • the gyro sensor 180B may be used to determine the motion attitude of the electronic device 10 .
  • the angular velocity of the electronic device 10 about three axes (i.e., the x, y, and z axes) may be determined by using the gyro sensor 180B.
  • the gyro sensor 180B can be used for image stabilization.
  • the gyroscope sensor 180B detects the shaking angle of the electronic device 10, calculates the distance to be compensated by the lens module according to the angle, and allows the lens to counteract the shaking of the electronic device 10 through reverse motion to achieve anti-shake.
  • the gyro sensor 180B can also be used for navigation and somatosensory game scenarios.
  • the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 10 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the electronic device 10 is stationary. The sensor can also be used to identify the posture of the electronic device, and is applied in scenarios such as landscape/portrait switching and pedometers.
  • the electronic device 10 can measure distances by infrared or laser. In some embodiments, when shooting a scene, the electronic device 10 can use the distance sensor 180F to measure the distance to achieve fast focusing.
  • Touch sensor 180K also called “touch device”.
  • the touch sensor 180K may be disposed on the display screen 194 , and the touch sensor 180K and the display screen 194 form a touch screen, also called a “touch screen”.
  • the touch sensor 180K is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations may be provided through display screen 194 .
  • the touch sensor 180K may also be disposed on the surface of the electronic device 10 , which is different from the location where the display screen 194 is located.
  • the keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys.
  • the electronic device 10 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 10 .
  • Motor 191 can generate vibrating cues.
  • the motor 191 can be used for vibrating alerts for incoming calls, and can also be used for touch vibration feedback.
  • touch operations acting on different applications can correspond to different vibration feedback effects.
  • the motor 191 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 194 .
  • different application scenarios (for example, time reminder, message receiving, alarm clock, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 192 can be an indicator light, which can be used to indicate the charging state, the change of the power, and can also be used to indicate a message, a missed call, a notification, and the like.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

An image calibration method and device. The method includes: obtaining a user pupil position offset and a user diopter, where the user pupil position offset is the offset of the user's pupil (10) relative to an optical axis; and adjusting an image to be displayed according to the user pupil position offset and the user diopter. In this way, even when the head-mounted display device (100) is worn improperly, the point the user sees when looking straight ahead is the center point of the image displayed on the display screen (103), and after the images corresponding to the optical systems (400) on both sides are calibrated, the pictures seen by the user's two eyes are the same, so no dizziness is caused.

Description

图像校准方法和设备
本申请要求于2020年12月14日提交中国专利局、申请号为202011469253.5、申请名称为“图像校准方法和设备”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及终端技术领域,尤其涉及一种图像校准方法和设备。
背景技术
头戴显示设备包括左右两个光学系统,每个光学系统包括光学镜片和显示屏,用户戴上头戴显示设备后,显示屏发出的光线经过光学镜片的折射后进入人眼,使得用户可以看到一个放大的虚像,增强了用户的沉浸感。
然而,由于用户的瞳距和头戴显示设备的两个光轴之间的距离不匹配,或者用户佩戴不恰当等原因,用户两只眼睛平视前方时看到的点相对于显示屏显示的图像的中心点有不同程度的偏移,两只眼睛看到的画面不同,用户会感到眩晕。
发明内容
本申请提供一种图像校准方法和设备,用于校准图像。
第一方面,本申请提供一种图像校准方法,包括:获取用户瞳孔位置偏移量和用户视度,所述用户瞳孔位置偏移量为用户的瞳孔相对光轴的偏移量;根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一图像中心点偏移量;根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点。
一种可能的实现方式中,所述根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一图像中心点偏移量,包括:根据所述用户瞳孔位置偏移量、所述用户视度以及第一映射关系,确定所述第一图像中心点偏移量,所述第一映射关系用于指示瞳孔位置偏移量、视度和图像中心点偏移量之间的对应关系。
一种可能的实现方式中,所述根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点,包括:将第一模块生成的所述待显示图像的中心点向所述瞳孔偏移的方向移动第一距离,所述第一距离和所述第一图像中心点偏移量大小相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。第一模块在本申请实施例中也称作虚拟相机。
一种可能的实现方式中,所述根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点,包括:根据所述第一图像中心点偏移量,调整第一模块的位移参数值,以使所述第一模块生成的所述待显示图像的中心点向所述瞳孔偏移的方向移动第一距离,所述第一距离和所述第一图像中心点偏移量大小相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
一种可能的实现方式中,获取用户瞳孔位置偏移量,包括:获取用户的眼睛图像;从所述眼睛图像中识别瞳孔位置;将所述瞳孔位置和基准位置之间的距离作为所述用户瞳孔位置偏移量。
一种可能的实现方式中,获取用户视度,包括:获取各个光学镜片的第一位置;从第四映射关系中查找所述第一位置对应的第一视度,将所述第一视度作为所述用户视度,所述第四映射关系用于指示各个光学镜片的位置和视度之间的对应关系。
上述图像校准方法,使得用户佩戴头戴显示设备不恰当的情况下,用户平视前方时看到的点为显示屏显示的图像的中心点,并且,对两侧光学系统对应的图像均进行上述校准后,用户两只眼睛看到的画面是相同的,不会产生眩晕感。
第二方面,本申请提供一种图像校准方法,包括:获取用户瞳孔位置偏移量和用户视度,所述用户瞳孔位置偏移量为用户的瞳孔相对光轴的偏移量;根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一畸变矫正参数;根据所述第一畸变矫正参数,对待显示图像上每个像素的位置进行矫正。
一种可能的实现方式中,所述根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一畸变矫正参数,包括:根据所述用户瞳孔位置偏移量、所述用户视度以及第二映射关系,确定所述第一畸变矫正参数,所述第二映射关系用于指示瞳孔位置偏移量、视度以及畸变矫正参数之间的对应关系。
一种可能的实现方式中,所述第一畸变矫正参数包括红色子像素对应的畸变矫正参数、绿色子像素对应的畸变矫正参数和蓝色子像素对应的畸变矫正参数;所述根据所述第一畸变矫正参数,对待显示图像上每个像素的位置进行矫正,包括:根据所述红色子像素对应的畸变矫正参数,对所述待显示图像上每个红色子像素的位置进行矫正;根据所述绿色子像素对应的畸变矫正参数,对所述待显示图像上每个绿色子像素的位置进行矫正;根据所述蓝色子像素对应的畸变矫正参数,对所述待显示图像上每个蓝色子像素的位置进行矫正。
一种可能的实现方式中,获取用户瞳孔位置偏移量,包括:获取用户的眼睛图像;从所述眼睛图像中识别瞳孔位置;将所述瞳孔位置和基准位置之间的距离作为所述用户瞳孔位置偏移量。
一种可能的实现方式中,获取用户视度,包括:获取各个光学镜片的第一位置;从第四映射关系中查找所述第一位置对应的第一视度,将所述第一视度作为所述用户视度,所述第四映射关系用于指示各个光学镜片的位置和视度之间的对应关系。
上述图像校准方法,由于第二映射关系存储的畸变矫正参数是模拟不同眼睛偏移量以及不同视度的情况下得到的,基于该第二映射关系存储的畸变矫正参数进行畸变矫正效果比较理想。
第三方面,本申请提供一种图像校准方法,包括:获取用户视度;根据所述用户视度,确定所述用户视度对应的第一像高;根据所述第一像高,调整待显示图像的像高。
一种可能的实现方式中,所述根据所述用户视度,确定所述用户视度对应的第一像高,包括:根据所述用户视度和第三映射关系,确定所述第一像高,所述第三映射关系用于指示视度和像高之间的对应关系。
一种可能的实现方式中,所述根据所述第一像高,调整待显示图像的像高,包括:将第一模块生成的所述待显示图像的像高调整为和所述第一像高相同,所述第一模块用于根 据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
一种可能的实现方式中,所述根据所述第一像高,调整所述待显示图像的像高,包括:根据所述第一像高,调整第一模块的视场角参数值,以使所述第一模块生成的所述待显示图像的像高和所述第一像高相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
一种可能的实现方式中,获取用户视度,包括:获取各个光学镜片的第一位置;从第四映射关系中查找所述第一位置对应的第一视度,将所述第一视度作为所述用户视度,所述第四映射关系用于指示各个光学镜片的位置和视度之间的对应关系。
上述图像校准方法,使得不同视度的用户使用头戴显示设备时视场角是相同的,增强了用户体验。
第四方面,本申请提供一种电子设备,包括:相机、镜片位置探测器和处理器,所述相机用于拍摄眼睛图像,所述镜片位置探测器用于探测光学镜片的位置,所述处理器用于执行第一方面、第二方面或者第三方面所述的方法。
第五方面,本申请提供一种电子设备,包括:存储器和处理器;所述处理器用于与所述存储器耦合,读取并执行所述存储器中的指令,以实现第一方面、第二方面或者第三方面所述的方法。
第六方面,本申请提供一种可读存储介质,其特征在于,所述可读存储介质上存储有计算机程序;所述计算机程序在被执行时,实现第一方面、第二方面或者第三方面所述的方法。
本申请提供的图像校准方法和设备,在检测到用户瞳孔位置偏移量以及用户视度后,一方面,可根据用户瞳孔位置偏移量和用户视度,从第一映射关系中查找对应的图像中心点偏移量,根据该图像中心点偏移量调整显示屏所要显示的图像的中心点,使得用户佩戴头戴显示设备不恰当的情况下,用户平视前方时看到的点为显示屏显示的图像的中心点,并且对两侧光学系统对应的图像均进行该校准后,用户两只眼睛看到的画面是相同的,不会产生眩晕感。另一方面,根据用户瞳孔位置偏移量和用户视度,从第二映射关系中查找对应的畸变矫正参数,并根据该畸变矫正参数对显示屏所要显示的图像进行畸变矫正,由于第二映射关系存储的畸变矫正参数是模拟不同眼睛偏移量以及不同视度的情况下得到的,基于该第二映射关系中存储的畸变矫正参数进行畸变矫正效果比较理想。另一方面,根据用户视度从第三映射关系中查找对应的像高,根据该像高对显示屏所要显示的图像的像高进行调整,使得不同视度的用户使用头戴显示设备时视场角是相同的,增强了用户体验。
附图说明
图1为本申请实施例提供的应用场景图;
图2为本申请实施例提供的成像原理示意图;
图3为本申请实施例提供的光轴示意图;
图4为本申请实施例提供的头戴显示设备100的框架图;
图5为本申请实施例提供的相机404和镜片位置探测器405的位置示意图;
图6为本申请提供的图像校准方法的一实施例的流程示意图;
图7A为本申请实施例提供的确定用户瞳孔位置偏移量的原理示意图;
图7B为本申请实施例提供的确定基准位置的原理示意图;
图8为本申请实施例提供的瞳孔位置偏移示意图;
图9为本申请实施例提供的调整待显示图像的中心点的示意图;
图10为本申请实施例提供的轴上畸变示意图;
图11为本申请实施例提供的离轴畸变示意图;
图12为本申请实施例提供的图像畸变程度随视场角的变化示意图;
图13为本申请实施例提供的不同视度对应的视场角示意图;
图14为本申请实施例提供的调整待显示图像的像高的示意图一;
图15为本申请实施例提供的调整待显示图像的像高的示意图二;
图16示出了电子设备10的结构示意图。
具体实施方式
图1为本申请实施例提供的应用场景图。图1示出一种头戴显示设备100,头戴显示设备100为双目显示设备,头戴显示设备100包括左右两个光学系统,本申请实施例提供的图像校准方法应用在两个光学系统显示图像的过程中。需要说明的是:图1所示头戴显示设备100的形态为眼镜,该形态仅是一种示例,头戴显示设备100还可为其他形态,其他形态包括但不限于眼罩、头盔或者头戴显示器,本申请实施例对头戴显示设备100的形态不做限定。
在一些实施例中,每个光学系统包括两枚光学镜片和显示屏,用户戴上头戴显示设备100后,光学系统通过图2所示原理成像,参见图2所示,从左到右依次为瞳孔10、光学镜片101、光学镜片102以及显示屏103。显示屏103发出光线后,该光线经过光学镜片102和光学镜片101的折射后进入瞳孔10,使得用户可以看到一个放大的虚像,虚像范围参见图2,增强了用户的沉浸感。
需要说明的是:图2仅示出了头戴显示设备100中一个光学系统的成像原理,另一个光学系统的成像原理类似,本申请实施例不再对另一个光学系统的成像原理赘述。
参见图3所示,针对每个光学系统,光学镜片101的中心点P1、光学镜片102的中心点P2以及显示屏103的中心点P3的连线构成了该光学系统的光轴。继续参见图2所示,当用户的瞳孔10在对应的光轴上时,用户平视前方时看到的点对应显示屏103显示的图像的中心点,也就是说,用户平视前方时看到的点相对于显示屏103显示的图像的中心点没有偏移。用户的两只瞳孔10均在对应的光轴上时,两只眼睛看到的画面相同,用户体验高。
然而,由于用户的瞳距和头戴显示设备100的两个光轴之间的距离不匹配,或者用户佩戴不恰当等原因,用户的两只瞳孔10不一定都在对应的光轴上,导致两只眼睛平视前方时看到的点相对于显示屏103显示的图像的中心点有不同程度的偏移,两只眼睛看到的画面不同,用户会感到眩晕。
本申请实施例提供的头戴显示设备100,光学系统中光学镜片101和光学镜片102在不同位置时对应的视度也不同,本申请实施例提到的视度包括但不限于:近视度数或者远视度数,因此,用户可通过调节光学镜片101和光学镜片102的位置来使两枚镜片的位置和自身视度匹配。比如:用户右眼近视度数为700度,那么在使用头戴显示设备100时,可将右侧光学系统中光学镜片101和光学镜片102调节至近视700度对应的位置。
需要说明的是:若用户的瞳孔10不在对应的光轴上,光学镜片101和光学镜片102在不同位置时,用户平视前方时看到的点相对于显示屏103显示的图像的中心点的偏离程度也不同。即,用户的瞳孔10不在对应的光轴上时,用户平视前方时看到的点相对于显示屏103显示的图像的中心点的偏离程度和用户视度有关。
为了解决上述用户会感到眩晕的问题,本申请提供一种图像校准方法,该方法可用于两个光学系统显示图像的过程中,一方面,通过相机检测用户瞳孔位置偏移量,另一方面,通过镜片位置探测器检测用户视度,然后,根据用户瞳孔位置偏移量和用户视度,从预先配置的映射关系中查找对应的图像中心点偏移量,最后根据查找到的图像中心点偏移量来调整显示屏所要显示的图像的中心点,从而使用户平视前方时看到的点相对于显示屏103显示的图像的中心点不会发生偏移,两个光学系统经过上述校准后,使得用户双眼看到的画面是相同的,不会产生眩晕感。
图4为本申请实施例提供的头戴显示设备100的框架图,头戴显示设备100包括但不限于:两个光学系统400、处理器401以及存储器402。每个光学系统400包括但不限于:显示屏103、光学镜筒模组403、相机404、镜片位置探测器405以及视度调节模组406。每个光学系统400中的光学镜筒模组403、显示屏103、相机404以及镜片位置探测器405均与处理器401连接,存储器402也与处理器401连接,存储器402中存储有和图像处理相关的软件单元,该软件单元包括虚拟相机。下面逐一介绍各个硬件或者软件单元的功能:
一种可能的实现方式中,光学镜筒模组403包括如图2所示的光学镜片101和光学镜片102。视度调节模组406用于调节光学镜片101和光学镜片102的位置,以使光学镜片101和光学镜片102的位置和用户视度匹配。需要说明的是:光学镜筒模组403包括的镜片数量也可为一枚或者大于两枚,本申请实施例以两枚为例说明本申请的校准过程。
一种可能的实现方式中,相机404用于拍摄眼睛图像,并将该眼睛图像发送给处理器401。
一种可能的实现方式中,镜片位置探测器405用于探测光学镜筒模组403中各个光学镜片的位置,并将各个光学镜片的位置发送给处理器401。镜片位置探测器405所探测的各个光学镜片的位置可以是各个光学镜片相对于显示屏103的位置。
一种可能的实现方式中,两个光学系统400中相机404和镜片位置探测器405设置的位置可以如图5所示。
一种可能的实现方式中,处理器401用于根据相机404发送的眼睛图像,确定用户瞳孔位置偏移量。还用于根据镜片位置探测器405发送的各个光学镜片的位置,确定用户视度。
一种可能的实现方式中,存储器402中还存储有应用程序APP(Application,APP),该APP可以为游戏APP,也可以为视频APP,本申请对APP类型不做限定,以游戏APP为例,游戏APP包含预先建立的若干游戏场景,每个游戏场景有对应的场景数据,对场景数据进行渲染,可生成对应的图像。用户打开该APP后,虚拟相机根据用户输入的指令对场景数据进行渲染,生成对应的图像;或者,根据默认的播放顺序对场景数据进行渲染,生成对应的图像。虚拟相机可将生成的待显示图像发送给显示屏103,显示屏103用于显示虚拟相机发送的图像。
一种可能的实现方式中,虚拟相机包括位移参数和视场角参数,可通过改变虚拟相机的位移参数值调整虚拟相机生成的图像的中心点,可通过改变虚拟相机的视场角参数值来调整虚拟相机生成的图像的像高。
一种可能的实现方式中,处理器401还用于根据用户瞳孔位置偏移量和用户视度调整显示屏103所要显示的图像的中心点。还用于根据用户瞳孔位置偏移量和用户视度确定显示屏103所要显示的图像对应的畸变矫正参数,并根据该畸变矫正参数对显示屏103所要显示的图像进行畸变矫正。还用于根据用户视度调整显示屏103所要显示的图像的像高。经过上述图像校准处理,使得用户看到的虚像不受瞳孔位置偏移的影响,也不受视度的影响。两个光学系统均做上述校准处理后,用户两只眼睛看到的画面相同,提升了用户对头戴显示设备100的使用体验。
一种可能的实现方式中,头戴显示设备100还包括瞳距机械调节装置,该瞳距机械调节装置用于调节两个光轴之间的距离,以使两个光轴之间的距离和用户的瞳距匹配。然而,经过该瞳距机械调节装置的调节后,如果调节不合适或者用户戴上头戴显示设备100后瞳孔相对于光轴向上或者向下偏移,用户的两只瞳孔10还是不在对应的光轴上,可通过本申请实施例提供的图像校准方法对显示屏103所要显示的图像进行校准。
应用于图4所示头戴显示设备100,图6为本申请提供的图像校准方法的一实施例的流程示意图。本实施例提供的图像校准方法可应用在用户触发启动头戴显示设备100之后,也可应用在用户触发打开头戴显示设备100安装的APP之后,本申请实施例对此不做限定,本实施例提供的图像校准方法具体包括:
S601、相机404拍摄眼睛图像,并将眼睛图像发送给处理器401。
S602、镜片位置探测器405探测各个光学镜片的第一位置,并将各个光学镜片的第一位置发送给处理器401。
S603、处理器401根据眼睛图像,确定用户瞳孔位置偏移量。
一种可能的实现方式中,参见图7A所示,接收到相机404发送的眼睛图像后,从眼睛图像中识别出瞳孔位置,然后计算该瞳孔位置和基准位置之间的距离,将该距离作为用户瞳孔位置偏移量。
下面介绍基准位置的获取方式:
参见图7B所示,在光学镜筒模组403上设置定位点,在工装上设置与定位点匹配的凹槽,标定物安装在工装上后,将定位点卡设在凹槽内,便可保证标定物处于光轴上,使用相机404拍照,得到标定物的图像,从标定物的图像中识别标定物中心点,将该中心点的位置作为上述基准位置。
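As a rough illustration of this step, the sketch below estimates the pupil position from a grayscale eye image and measures its distance from the reference position. It is not from the original text: the intensity threshold and the millimetre-per-pixel scale are assumptions made only for the example.

```python
import numpy as np

def pupil_offset(eye_image, reference_xy, threshold=40, mm_per_pixel=0.05):
    """Estimate the pupil centre as the centroid of the darkest pixels and
    return its distance (mm) and direction from the reference position."""
    ys, xs = np.nonzero(eye_image < threshold)   # pupil assumed to be the darkest region
    if xs.size == 0:
        raise ValueError("no pupil-like region found")
    dx = xs.mean() - reference_xy[0]
    dy = ys.mean() - reference_xy[1]
    distance_mm = float(np.hypot(dx, dy)) * mm_per_pixel
    return distance_mm, (dx, dy)                 # offset magnitude and pupil-shift direction
```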
S604、处理器401根据各个光学镜片的第一位置,确定用户视度。
一种可能的实现方式中,处理器401接收到镜片位置探测器405发送的各个光学镜片的第一位置后,可从第四映射关系中查找与该第一位置对应的第一视度,将查找到的第一视度作为用户视度。该第四映射关系为预先配置的各个光学镜片的位置和视度之间的对应关系。
一种可能的实现方式中,在第四映射关系中,视度可包括0度、100度、200度、300度、400度、500度、600度以及700度,各个视度对应的各个光学镜片的位置如表1所示,表1中涉及到的位置为光学镜片相对于显示屏103的位置。需要说明的是,表1所示对应 关系仅是一种示例,光学镜片101的位置、光学镜片102的位置以及视度的对应关系还可以为其他关系,表1不构成对本申请的限制。
表1
光学镜片101 光学镜片102 视度
位置A1 位置B1 0度
位置A2 位置B2 100度
位置A3 位置B3 200度
位置A4 位置B4 300度
位置A5 位置B5 400度
位置A6 位置B6 500度
位置A7 位置B7 600度
位置A8 位置B8 700度
为了使更多视度的用户能够使用头戴显示设备100,可将视度设计为两枚光学镜片的位置的函数。用户戴上头戴显示设备100后,调整两枚光学镜片的位置,使得通过该函数计算得到的视度和用户视度匹配。该设计可以使任意视度的用户均能使用头戴显示设备100。
下面结合表1举例说明:
假设,镜片位置探测器405探测到光学镜片101在位置A3处,光学镜片102在位置B3处,处理器401从第四映射关系可以查到用户视度为200度。
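The fourth mapping can be pictured as a simple lookup table. The sketch below mirrors Table 1; the position labels and the single-table form are illustrative assumptions.

```python
# Fourth mapping: (position of lens 101, position of lens 102) -> diopter
FOURTH_MAPPING = {
    ("A1", "B1"): 0,   ("A2", "B2"): 100, ("A3", "B3"): 200, ("A4", "B4"): 300,
    ("A5", "B5"): 400, ("A6", "B6"): 500, ("A7", "B7"): 600, ("A8", "B8"): 700,
}

def user_diopter(lens101_position, lens102_position):
    return FOURTH_MAPPING[(lens101_position, lens102_position)]

assert user_diopter("A3", "B3") == 200   # matches the example above
```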
S605、处理器401根据用户瞳孔位置偏移量和用户视度,调整显示屏103所要显示的图像的中心点。
一种可能的实现方式中,处理器401根据S603得到的用户瞳孔位置偏移量和S604得到的用户视度,从第一映射关系查找对应的第一图像中心点偏移量,基于该第一图像中心点偏移量和瞳孔偏移的方向,调整显示屏所要显示的图像的中心点。第一映射关系为预先配置的瞳孔位置偏移量、视度以及图像中心点偏移量之间的对应关系。显示屏所要显示的图像在本申请中也称为待显示图像。
一种可能的实现方式中,在第一映射关系中,视度可包括0度、100度、200度、300度、400度、500度、600度以及700度,瞳孔位置偏移量可包括1mm、2mm、3mm以及4mm。各个视度和各个瞳孔位置偏移量对应的图像中心点偏移量如表2所示。需要说明的是,表2所示对应关系仅是一种示例,瞳孔位置偏移量、视度以及图像中心点偏移量的对应关系还可以为其他关系,表2不构成对本申请的限制。
表2
(表2以图像形式给出,示出各瞳孔位置偏移量和各视度对应的图像中心点偏移量)
下面以瞳孔位置偏移量为2mm,视度为200度为例,对配置第一映射关系的过程进行说明:
参见图8所示,可通过仿真模拟的方式模拟瞳孔相对于光轴向上偏移2mm,可按照表1所示第一映射关系将光学镜片101和光学镜片102调节至200度对应的位置。这种情况下,用户平视前方时看到的点Q1并不在光轴上,Q1偏离光轴的方向和瞳孔偏离光轴的方向相同,Q1偏离光轴的距离和瞳孔偏离光轴的距离相同,即Q1相对于光轴也向上偏移2mm。Q1对应的也不是显示屏103显示的图像的中心点,而是显示屏103显示的图像的中心点向上偏移一定距离的点,图8中使用P4示意,只有将显示屏103显示的图像的中心点从P3上移到P4时,用户平视前方时看到的点才是显示屏103显示的图像的中心点,因此,可将P3到P4的距离A3mm作为瞳孔位置偏移量为2mm、视度为200度对应的图像中心点偏移量。
一种可能的实现方式中,处理器401根据S603得到的用户瞳孔位置偏移量和S604得到的用户视度,从第一映射关系查找到对应的第一图像中心点偏移量,然后可将虚拟相机生成的待显示图像的中心点向瞳孔偏移的方向移动第一距离,第一距离和第一图像中心点偏移量大小相同。
举例来说:
假设,S603得到的用户瞳孔位置偏移量为2mm,S604得到的用户视度为200度,从第一映射关系可以查到对应的图像中心点偏移量为A3mm,参见图9所示,处理器401可将虚拟相机生成的待显示图像的中心点向瞳孔偏移的方向移动A3mm,从而使得用户平视前方时看到的点为显示屏显示的图像的中心点,两侧光学系统经过上述校准后,用户两只眼睛看到的画面是相同的,不会产生眩晕感。
需要说明的是:图9以瞳孔偏移的方向为向上偏移为例说明本申请的调整过程,瞳孔 向其他方向偏移时的调整过程类似,本申请实施例不再赘述。
另一种可能的实现方式中,处理器401根据S603得到的用户瞳孔位置偏移量和S604得到的用户视度,从第一映射关系查找到对应的第一图像中心点偏移量,然后可根据该第一图像中心点偏移量换算虚拟相机的位移调整量,根据该位移调整量调整虚拟相机的位移参数值,以使虚拟相机生成的待显示图像的中心点会向瞳孔偏移的方向移动第一距离,第一距离和第一图像中心点偏移量大小相同。
举例来说:
假设,S603得到的用户瞳孔位置偏移量为2mm,S604得到的用户视度为200度,从第一映射关系可以查到对应的图像中心点偏移量为A3mm,换算得到虚拟相机的位移调整量为a3mm,可根据该位移调整量调整虚拟相机的位移参数值,经过该调整后,虚拟相机生成的图像的中心点会向瞳孔偏移的方向移动A3mm,从而使得用户平视前方时看到的点为显示屏显示的图像的中心点,两侧光学系统经过上述校准后,用户两只眼睛看到的画面是相同的,不会产生眩晕感。
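A minimal sketch of this centre-point adjustment is given below. The first-mapping entry, the pixels-per-millimetre factor, and the use of np.roll (standing in for re-rendering the frame with an adjusted displacement parameter of the virtual camera) are all simplifying assumptions, not the method of the original text.

```python
import numpy as np

FIRST_MAPPING_MM = {(2.0, 200): 0.8}   # (pupil offset mm, diopter) -> centre shift mm (placeholder)

def shift_image_center(image, pupil_offset_mm, diopter, direction_xy, pixels_per_mm=20.0):
    """Shift the image centre along the pupil-offset direction by the looked-up distance."""
    shift_mm = FIRST_MAPPING_MM[(pupil_offset_mm, diopter)]
    norm = float(np.hypot(*direction_xy)) or 1.0
    dx = int(round(shift_mm * pixels_per_mm * direction_xy[0] / norm))
    dy = int(round(shift_mm * pixels_per_mm * direction_xy[1] / norm))
    # np.roll stands in for regenerating the frame with an adjusted displacement parameter
    return np.roll(image, shift=(dy, dx), axis=(0, 1))
```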
S606、处理器401根据用户瞳孔位置偏移量和用户视度对显示屏所要显示的图像进行畸变矫正。
一种可能的实现方式中,S606中涉及的显示屏所要显示的图像可以为处理器401执行S605后得到的图像。
由于光学镜片101和光学镜片102的边缘部分和中心部分的放大倍率不一致,导致用户看到的图像会发生畸变。图10和图11为用户看到的同一张图像的两个畸变示意图,图10为轴上畸变示意图,图11为离轴畸变示意图。轴上畸变指的是瞳孔相对于光轴没有发生偏移时用户看到的畸变,离轴畸变指的是瞳孔相对于光轴发生偏移时用户看到的畸变,图11以瞳孔相对于光轴发生向上偏移时看到的畸变来示意。图10和图11中网格交点代表未发生畸变时像素所在的位置,叉形点代表图像畸变后像素所在的位置。对比可见,瞳孔没有发生偏移时用户看到的图像畸变情况和瞳孔发生偏移时用户看到的图像畸变情况不同。比如,瞳孔没有发生偏移时,P0处的像素畸变后移动至图10中P1处,参见图10箭头示意。瞳孔发生偏移时,P0处的像素畸变后移动至图11中P2处,参见图11箭头示意。图10中P1示意的位置和图11中P2示意的位置显然不同。
另外,参见图12所示,上下两个图的横坐标为畸变程度,纵坐标为视场角,上图为视度为700度时图像畸变程度随视场角的变化情况,下图为视度为0度时图像畸变程度随视场角的变化情况,对比可见,用户视度不同时,用户看到的图像畸变程度不同,即,图像畸变程度和视度有关。
一种可能的实现方式中,处理器401根据S603得到的用户瞳孔位置偏移量和S604得到的用户视度,从第二映射关系查找对应的第一畸变矫正参数,针对待显示图像上的每个像素,将第一畸变矫正参数和该像素的坐标代入畸变矫正公式,便可得到该像素畸变矫正后的坐标。第二映射关系为预先配置的瞳孔位置偏移量、视度以及畸变矫正参数之间的对应关系。
举例来说:
假设畸变矫正公式为R’=a*R+b*R^3+c*R^5+d*R^7+e*R^9,其中a,b,c,d,e代表从第二映射关系查找到的第一畸变矫正参数,R代表畸变矫正前像素和图像中心点之间的距离,R可通过畸变矫正前像素的坐标和图像中心点坐标计算得到,R’代表畸变矫正后像素和图像中心点之间的距离。针对待显示图像上的每个像素,求取上述R’,沿着畸变矫正前像素和图像中心点的连线将像素移动至R’对应的位置,便完成了该像素的畸变矫正。
一种可能的实现方式中,上述畸变矫正公式也可以是下文多项式拟合过程中采用的多项式公式。
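Applied to a single pixel, the formula above can be transcribed as the following sketch; the default coefficient values are placeholders standing in for the parameters looked up from the second mapping.

```python
import math

def correct_pixel(xy, center_xy, coeffs=(1.0, 0.02, -0.001, 0.0, 0.0)):
    """Move a pixel along the line through the image centre from R to R'."""
    a, b, c, d, e = coeffs
    x, y = xy[0] - center_xy[0], xy[1] - center_xy[1]
    r = math.hypot(x, y)                                # R: distance before correction
    if r == 0.0:
        return center_xy
    r_new = a*r + b*r**3 + c*r**5 + d*r**7 + e*r**9     # R'
    scale = r_new / r
    return (center_xy[0] + x * scale, center_xy[1] + y * scale)
```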
一种可能的实现方式中,在第二映射关系中,视度可包括0度、100度、200度、300度、400度、500度、600度以及700度,瞳孔位置偏移量可包括1mm、2mm、3mm以及4mm,各个视度和各个瞳孔位置偏移量对应的畸变矫正参数参见表3。需要说明的是,表3所示对应关系仅是一种示例,瞳孔位置偏移量、视度以及畸变矫正参数的对应关系还可以为其他关系,而且畸变矫正参数包含的值的个数还可以为多个,表3不构成对本申请的限制。
表3
(表3以图像形式给出,示出各瞳孔位置偏移量和各视度对应的畸变矫正参数)
下面以瞳孔位置偏移量为2mm,视度为200度为例,对配置第二映射关系的过程进行说明:
可通过仿真模拟的方式模拟眼睛位置相对于光轴向上偏移2mm,可按照表1所示第一映射关系将光学镜片101和光学镜片102调节至200度对应的位置。这种情况下,假设用户看到的图像畸变情况如图11,从图11中获取未发生畸变时各个像素所在的位置,以及图像畸变后各个像素所在的位置,然后采用多项式拟合的方式对两者进行拟合,拟合结果为图11所示图像畸变情况对应的畸变矫正参数,可将该畸变矫正参数作为瞳孔位置偏移量为2mm、视度为200度对应的畸变矫正参数,并存储至第二映射关系中。由于第二映射关系存储的畸变矫正参数是模拟不同眼睛偏移量以及不同视度的情况下得到的,基于该第二映射关系中存储的畸变矫正参数进行畸变矫正效果比较理想。
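One way such a fit could be done is an ordinary least-squares fit of the odd polynomial over corresponding radii taken from the simulated grid. The sample radii below are made up, and treating the undistorted radii as the regressor is an assumption of this sketch, not a statement about the original procedure.

```python
import numpy as np

r = np.array([0.5, 1.0, 2.0, 3.0, 4.0])                  # radii of grid points without distortion (placeholder)
r_distorted = np.array([0.52, 1.05, 2.18, 3.45, 4.90])    # radii of the same points after simulated distortion

basis = np.stack([r, r**3, r**5, r**7, r**9], axis=1)
coeffs, *_ = np.linalg.lstsq(basis, r_distorted, rcond=None)
a, b, c, d, e = coeffs    # candidate correction parameters for this (offset, diopter) pair
```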
图像上的每个像素均是由红色、蓝色和绿色三个子像素组成的,图像畸变后三个子像素移动到不同的位置,导致显示屏显示的图像会出现颜色分离,因此,可分别针对红色子像素、蓝色子像素以及绿色子像素做上述畸变矫正处理,以克服该颜色分离的问题。
一种可能的实现方式中,在配置第二映射关系时,可配置瞳孔位置偏移量、视度以及每种颜色的子像素对应的畸变矫正参数之间的对应关系。在第二映射关系中,视度可包括0度、100度、200度、300度、400度、500度、600度以及700度,瞳孔位置偏移量可包括1mm、2mm、3mm以及4mm,畸变矫正参数可包括红色子像素对应畸变矫正参数、蓝色子像素对应的畸变矫正参数以及绿色子像素对应的畸变矫正参数,如表4所示。表4所示对应关系仅是一种示例,瞳孔位置偏移量、视度以及每种颜色的畸变矫正参数的对应关系还可以为其他关系,表4不构成对本申请的限制。
表4
(表4以图像形式给出,示出各瞳孔位置偏移量和各视度对应的红色、绿色和蓝色子像素的畸变矫正参数)
一种可能的实现方式中,处理器401根据S603得到的用户瞳孔位置偏移量和S604得到的用户视度,从表4所示第二映射关系中查找对应的红色子像素对应畸变矫正参数、蓝色子像素对应畸变矫正参数以及绿色子像素对应畸变矫正参数,针对待显示图像上的每个红色子像素,将红色子像素对应畸变矫正参数和红色子像素的坐标代入畸变矫正公式,可得到红色子像素畸变矫正后的坐标;针对待显示图像上的每个绿色子像素,将绿色子像素对应畸变矫正参数和绿色子像素的坐标代入畸变矫正公式,可得到绿色子像素畸变矫正后的坐标;针对待显示图像上的每个蓝色子像素,将蓝色子像素对应畸变矫正参数和蓝色子像素的坐标代入畸变矫正公式,可得到蓝色子像素畸变矫正后的坐标。对各个颜色子像素进行畸变矫正的过程可参见上文描述。
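The per-channel correction can be pictured as running the same radial polynomial three times with channel-specific coefficients, as sketched below; the coefficient values are placeholders, with the real values coming from the second mapping.

```python
# Placeholder coefficients per colour channel (in practice taken from the second mapping)
CHANNEL_COEFFS = {
    "red":   (1.0, 0.021, -0.0011, 0.0, 0.0),
    "green": (1.0, 0.020, -0.0010, 0.0, 0.0),
    "blue":  (1.0, 0.019, -0.0009, 0.0, 0.0),
}

def corrected_radius(r, coeffs):
    a, b, c, d, e = coeffs
    return a*r + b*r**3 + c*r**5 + d*r**7 + e*r**9

for channel, coeffs in CHANNEL_COEFFS.items():
    print(channel, corrected_radius(1.5, coeffs))   # each sub-pixel set is remapped separately
```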
S607、处理器401根据用户视度调整显示屏所要显示的图像的像高。
参见图13所示,以光学镜筒模组403仅包括一枚光学镜片101为例,光学镜片101在实线位置时对应的视场角(FOV)为α,光学镜片101移动到虚线位置时对应的视场角为β,由于光学镜片的位置会改变视度,因此,用户的视场角和用户视度有关。为了使不同视度的用户使用头戴显示设备100时视场角是相同的,处理器401可根据用户视度调整显示屏所要显示的图像的像高。
一种可能的实现方式中,处理器401根据S604得到的用户视度,从第三映射关系查找对应的第一像高,基于该第一像高对显示屏所要显示的图像的像高进行调整。第三映射关系为预先配置的视度和像高的对应关系。
一种可能的实现方式中,第三映射关系中,视度可包括0度、100度、200度、300度、400度、500度、600度以及700度,各个视度对应的像高如表5所示。需要说明的是,表5所示对应关系仅是一种示例,视度和像高的对应关系还可以为其他关系,表5不构成对本申请的限制。
表5
视度 像高
0度 H1mm
100度 H2mm
200度 H3mm
300度 H4mm
400度 H5mm
500度 H6mm
600度 H7mm
700度 H8mm
下面以视度为200度为例,对配置第三映射关系的过程进行说明:
图13中,假设光学镜片101在实线位置时对应的视度为0度,光学镜片101移动到虚线位置时对应的视度为200度,视度为0度的用户的视场角为α,视度为200度的用户的视场角为β,为了使视度为200度的用户和视度为0度的用户视场角相同,在配置第三映射关系时,参见图14所示,可对显示屏103所要显示的图像的像高做调整,直至视度为200度的用户的视场角变为α为止,此时的像高即为表5中H3mm。
一种可能的实现方式中,处理器401根据S604得到的用户视度,从第三映射关系查找到对应的第一像高,然后可将虚拟相机生成的待显示图像的像高调整为和第一像高相同,由于第三映射关系存储的像高是模拟不同视度的情况下得到的,依据查找到的像高对显示屏103所要显示的图像进行调整后,该用户的视场角和其他视度的用户的视场角是相同的。
举例来说:
假设,S604得到的用户视度为200度,从第三映射关系可以查到对应的像高为H3mm,参见图15所示,处理器401可将虚拟相机拍摄的图像的像高调为H3mm。将像高调为H3mm时,视度为200度的用户和视度为0度的用户使用头戴显示设备100时视场角均为α,增强了用户体验。
另一种可能的实现方式中,处理器401根据S604得到的用户视度,从第三映射关系查找到对应的第一像高,然后可根据该像高换算虚拟相机的视场角调整量,根据该视场角调整量调整虚拟相机的视场角参数值,以使虚拟相机生成的待显示图像的像高和第一像高相同。
举例来说:
假设,S604得到的用户视度为200度,则从第三映射关系可以查到对应的像高为H3mm,换算得到虚拟相机的视场角调整量为x度,可根据该视场角调整量调整虚拟相机的视场角参数值,经过该调整后,虚拟相机生成的图像的像高变为H3mm,同样使得视度为200度的用户和视度为0度的用户使用头戴显示设备100时视场角均为α,增强了用户体验。
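A small sketch of this conversion is given below. The third-mapping values and the effective focal length of the optics are assumptions used only to show how a target image height could be turned into a field-of-view parameter value for the virtual camera.

```python
import math

THIRD_MAPPING_MM = {0: 30.0, 200: 31.5, 700: 34.0}   # diopter -> target image height (placeholder values)
EFFECTIVE_FOCAL_LENGTH_MM = 20.0                      # assumed constant of the optics model

def fov_for_diopter(diopter):
    h = THIRD_MAPPING_MM[diopter]
    return math.degrees(2 * math.atan(h / (2 * EFFECTIVE_FOCAL_LENGTH_MM)))

print(fov_for_diopter(200))   # field-of-view value the virtual camera would be set to
```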
需要说明的是:本申请实施例对S601和S602的先后顺序不限定,可以先执行S601,后执行S602,也可先执行S602,后执行S601,或者同时执行S601和S602。同样的,本申请实施例对S603和S604的先后顺序不做限定,可以先执行S603,后执行S604,也可先执行S604,后执行S603,或者同时执行S603和S604。同样的,本申请实施例对S605、S606和S607的先后顺序不做限定,S605、S606和S607可按照任意顺序执行。图6的顺序仅是一种示例。
本申请实施例提供的图像校准方法,在检测到用户瞳孔位置偏移量以及用户视度后,一方面,可根据用户瞳孔位置偏移量和用户视度,从第一映射关系中查找对应的图像中心点偏移量,根据该图像中心点偏移量调整显示屏所要显示的图像的中心点,使得用户平视前方时看到的点为显示屏显示的图像的中心点,两侧光学系统经过该校准后,用户两只眼睛看到的画面是相同的,不会产生眩晕感。另一方面,根据用户瞳孔位置偏移量和用户视度,从第二映射关系中查找对应的畸变矫正参数,并根据该畸变矫正参数对显示屏所要显示的图像进行畸变矫正,由于第二映射关系存储的畸变矫正参数是模拟不同眼睛偏移量以及不同视度的情况下得到的,基于该第二映射关系中存储的畸变矫正参数进行畸变矫正效果比较理想。另一方面,根据用户视度从第三映射关系中查找对应的像高,根据该像高对显示屏所要显示的图像的像高进行调整,使得不同视度的用户使用头戴显示设备时视场角是相同的,增强了用户体验。
图16示出了电子设备10的结构示意图。电子设备10可以包括处理器110,外部存储器接口120,内部存储器121,通用串行总线(universal serial bus,USB)接口130,天线1,天线2,移动通信模块150,无线通信模块160,音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,传感器模块180,按键190,马达191,指示器192,摄像头193,显示屏194等。其中传感器模块180可以包括压力传感器180A,陀螺仪传感器180B,加速度传感器180E,距离传感器180F,触摸传感器180K等。
可以理解的是,本发明实施例示意的结构并不构成对电子设备10的具体限定。在本申请另一些实施例中,电子设备10可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器110可以包括一个或多个处理单元,例如:处理器110可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。
控制器可以根据指令操作码和时序信号,产生操作控制信号,完成取指令和执行指令的控制。
处理器110中还可以设置存储器,用于存储指令和数据。在一些实施例中,处理器110中的存储器为高速缓冲存储器。该存储器可以保存处理器110刚用过或循环使用的指令或数据。如果处理器110需要再次使用该指令或数据,可从所述存储器中直接调用。避免了重复存取,减少了处理器110的等待时间,因而提高了系统的效率。
处理器110用于根据用户瞳孔位置偏移量和用户视度调整显示屏194所要显示的图像的中心点。还用于根据用户瞳孔位置偏移量和用户视度确定显示屏194所要显示的图像对应的畸变矫正参数,并根据该畸变矫正参数对显示屏194所要显示的图像进行畸变矫正。还用于根据用户视度调整显示屏194所要显示的图像的像高。经过上述图像校准处理,使得用户看到的虚像不受眼睛位置偏移的影响,也不受眼睛视度的影响。两个光学系统均做上述校准处理后,用户两只眼睛看到的画面相同,提升了用户对电子设备10的使用体验。
在一些实施例中,处理器110可以包括一个或多个接口。接口可以包括集成电路 (inter-integrated circuit,I2C)接口,集成电路内置音频(inter-integrated circuit sound,I2S)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,移动产业处理器接口(mobile industry processor interface,MIPI),通用输入输出(general-purpose input/output,GPIO)接口,用户标识模块(subscriber identity module,SIM)接口,和/或通用串行总线(universal serial bus,USB)接口等。
I2C接口是一种双向同步串行总线,包括一根串行数据线(serial data line,SDA)和一根串行时钟线(derail clock line,SCL)。在一些实施例中,处理器110可以包含多组I2C总线。处理器110可以通过不同的I2C总线接口分别耦合触摸传感器180K,充电器,闪光灯,摄像头193等。例如:处理器110可以通过I2C接口耦合触摸传感器180K,使处理器110与触摸传感器180K通过I2C总线接口通信,实现电子设备10的触摸功能。
I2S接口可以用于音频通信。在一些实施例中,处理器110可以包含多组I2S总线。处理器110可以通过I2S总线与音频模块170耦合,实现处理器110与音频模块170之间的通信。在一些实施例中,音频模块170可以通过I2S接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。
PCM接口也可以用于音频通信,将模拟信号抽样,量化和编码。在一些实施例中,音频模块170与无线通信模块160可以通过PCM总线接口耦合。在一些实施例中,音频模块170也可以通过PCM接口向无线通信模块160传递音频信号,实现通过蓝牙耳机接听电话的功能。所述I2S接口和所述PCM接口都可以用于音频通信。
UART接口是一种通用串行数据总线,用于异步通信。该总线可以为双向通信总线。它将要传输的数据在串行通信与并行通信之间转换。在一些实施例中,UART接口通常被用于连接处理器110与无线通信模块160。例如:处理器110通过UART接口与无线通信模块160中的蓝牙模块通信,实现蓝牙功能。在一些实施例中,音频模块170可以通过UART接口向无线通信模块160传递音频信号,实现通过蓝牙耳机播放音乐的功能。
MIPI接口可以被用于连接处理器110与显示屏194,摄像头193等外围器件。MIPI接口包括摄像头串行接口(camera serial interface,CSI),显示屏串行接口(display serial interface,DSI)等。在一些实施例中,处理器110和摄像头193通过CSI接口通信,实现电子设备10的拍摄功能。处理器110和显示屏194通过DSI接口通信,实现电子设备10的显示功能。
GPIO接口可以通过软件配置。GPIO接口可以被配置为控制信号,也可被配置为数据信号。在一些实施例中,GPIO接口可以用于连接处理器110与摄像头193,显示屏194,无线通信模块160,音频模块170,传感器模块180等。GPIO接口还可以被配置为I2C接口,I2S接口,UART接口,MIPI接口等。
USB接口130是符合USB标准规范的接口,具体可以是Mini USB接口,Micro USB接口,USB Type C接口等。USB接口130可以用于连接充电器为电子设备10充电,也可以用于电子设备10与外围设备之间传输数据。也可以用于连接耳机,通过耳机播放音频。该接口还可以用于连接其他电子设备,例如AR设备等。
可以理解的是,本发明实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备10的结构限定。在本申请另一些实施例中,电子设备10也可以采用上 述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备10的无线通信功能可以通过天线1,天线2,移动通信模块150,无线通信模块160,调制解调处理器以及基带处理器等实现。
天线1和天线2用于发射和接收电磁波信号。电子设备10中的每个天线可用于覆盖单个或多个通信频带。不同的天线还可以复用,以提高天线的利用率。例如:可以将天线1复用为无线局域网的分集天线。在另外一些实施例中,天线可以和调谐开关结合使用。
移动通信模块150可以提供应用在电子设备10上的包括2G/3G/4G/5G等无线通信的解决方案。移动通信模块150可以包括至少一个滤波器,开关,功率放大器,低噪声放大器(low noise amplifier,LNA)等。移动通信模块150可以由天线1接收电磁波,并对接收的电磁波进行滤波,放大等处理,传送至调制解调处理器进行解调。移动通信模块150还可以对经调制解调处理器调制后的信号放大,经天线1转为电磁波辐射出去。在一些实施例中,移动通信模块150的至少部分功能模块可以被设置于处理器110中。在一些实施例中,移动通信模块150的至少部分功能模块可以与处理器110的至少部分模块被设置在同一个器件中。
调制解调处理器可以包括调制器和解调器。其中,调制器用于将待发送的低频基带信号调制成中高频信号。解调器用于将接收的电磁波信号解调为低频基带信号。随后解调器将解调得到的低频基带信号传送至基带处理器处理。低频基带信号经基带处理器处理后,被传递给应用处理器。应用处理器通过音频设备(不限于扬声器170A,受话器170B等)输出声音信号,或通过显示屏194显示图像或视频。在一些实施例中,调制解调处理器可以是独立的器件。在另一些实施例中,调制解调处理器可以独立于处理器110,与移动通信模块150或其他功能模块设置在同一个器件中。
无线通信模块160可以提供应用在电子设备10上的包括无线局域网(wireless local area networks,WLAN)(如无线保真(wireless fidelity,Wi-Fi)网络),蓝牙(bluetooth,BT),全球导航卫星系统(global navigation satellite system,GNSS),调频(frequency modulation,FM),近距离无线通信技术(near field communication,NFC),红外技术(infrared,IR)等无线通信的解决方案。无线通信模块160可以是集成至少一个通信处理模块的一个或多个器件。无线通信模块160经由天线2接收电磁波,将电磁波信号调频以及滤波处理,将处理后的信号发送到处理器110。无线通信模块160还可以从处理器110接收待发送的信号,对其进行调频,放大,经天线2转为电磁波辐射出去。
在一些实施例中,电子设备10的天线1和移动通信模块150耦合,天线2和无线通信模块160耦合,使得电子设备10可以通过无线通信技术与网络以及其他设备通信。所述无线通信技术可以包括全球移动通讯系统(global system for mobile communications,GSM),通用分组无线服务(general packet radio service,GPRS),码分多址接入(code division multiple access,CDMA),宽带码分多址(wideband code division multiple access,WCDMA),时分码分多址(time-division code division multiple access,TD-SCDMA),长期演进(long term evolution,LTE),BT,GNSS,WLAN,NFC,FM,和/或IR技术等。所述GNSS可以包括全球卫星定位系统(global positioning system,GPS),全球导航卫星系统(global navigation satellite system,GLONASS),北斗卫星导航系统(beidou navigation satellite system,BDS),准天顶卫星系统(quasi-zenith satellite system,QZSS)和/或星基增强系统(satellite based  augmentation systems,SBAS)。
电子设备10通过GPU,显示屏194,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏194和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器110可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏194用于显示图像,视频等。显示屏194包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode的,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备10可以包括1个或N个显示屏194,N为大于1的正整数。
电子设备10可以通过ISP,摄像头193,视频编解码器,GPU,显示屏194以及应用处理器等实现拍摄功能。
ISP用于处理摄像头193反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头193中。
摄像头193用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备10可以包括1个或N个摄像头193,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当电子设备10在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。电子设备10可以支持一种或多种视频编解码器。这样,电子设备10可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备10的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口120可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备10的存储能力。外部存储卡通过外部存储器接口120与处理器110通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器121可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器121可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存 储电子设备10使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器121可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器110通过运行存储在内部存储器121的指令,和/或存储在设置于处理器中的存储器的指令,执行电子设备10的各种功能应用以及数据处理。
电子设备10可以通过音频模块170,扬声器170A,受话器170B,麦克风170C,耳机接口170D,以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块170用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块170还可以用于对音频信号编码和解码。在一些实施例中,音频模块170可以设置于处理器110中,或将音频模块170的部分功能模块设置于处理器110中。
扬声器170A,也称“喇叭”,用于将音频电信号转换为声音信号。电子设备10可以通过扬声器170A收听音乐,或收听免提通话。
受话器170B,也称“听筒”,用于将音频电信号转换成声音信号。当电子设备10接听电话或语音信息时,可以通过将受话器170B靠近人耳接听语音。
麦克风170C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风170C发声,将声音信号输入到麦克风170C。电子设备10可以设置至少一个麦克风170C。在另一些实施例中,电子设备10可以设置两个麦克风170C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,电子设备10还可以设置三个,四个或更多麦克风170C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
耳机接口170D用于连接有线耳机。耳机接口170D可以是USB接口130,也可以是3.5mm的开放移动电子设备平台(open mobile terminal platform,OMTP)标准接口,美国蜂窝电信工业协会(cellular telecommunications industry association of the USA,CTIA)标准接口。
压力传感器180A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器180A可以设置于显示屏194。压力传感器180A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器180A,电极之间的电容改变。电子设备10根据电容的变化确定压力的强度。当有触摸操作作用于显示屏194,电子设备10根据压力传感器180A检测所述触摸操作强度。电子设备10也可以根据压力传感器180A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度小于第一压力阈值的触摸操作作用于短消息应用图标时,执行查看短消息的指令。当有触摸操作强度大于或等于第一压力阈值的触摸操作作用于短消息应用图标时,执行新建短消息的指令。
陀螺仪传感器180B可以用于确定电子设备10的运动姿态。在一些实施例中,可以通过陀螺仪传感器180B确定电子设备10围绕三个轴(即,x,y和z轴)的角速度。陀螺仪传感器180B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器180B检测电子设备10抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备10的抖动,实现防抖。陀螺仪传感器180B还可以用于导航,体感游戏场景。
加速度传感器180E可检测电子设备10在各个方向上(一般为三轴)加速度的大小。当电子设备10静止时可检测出重力的大小及方向。还可以用于识别电子设备姿态,应用于横竖屏切换,计步器等应用。
距离传感器180F,用于测量距离。电子设备10可以通过红外或激光测量距离。在一些实施例中,拍摄场景,电子设备10可以利用距离传感器180F测距以实现快速对焦。
触摸传感器180K,也称“触控器件”。触摸传感器180K可以设置于显示屏194,由触摸传感器180K与显示屏194组成触摸屏,也称“触控屏”。触摸传感器180K用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏194提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器180K也可以设置于电子设备10的表面,与显示屏194所处的位置不同。
按键190包括开机键,音量键等。按键190可以是机械按键。也可以是触摸式按键。电子设备10可以接收按键输入,产生与电子设备10的用户设置以及功能控制有关的键信号输入。
马达191可以产生振动提示。马达191可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏194不同区域的触摸操作,马达191也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器192可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (15)

  1. 一种图像校准方法,其特征在于,包括:
    获取用户瞳孔位置偏移量和用户视度,所述用户瞳孔位置偏移量为用户的瞳孔相对光轴的偏移量;
    根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一图像中心点偏移量;
    根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点。
  2. 根据权利要求1所述的方法,其特征在于,所述根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一图像中心点偏移量,包括:
    根据所述用户瞳孔位置偏移量、所述用户视度以及第一映射关系,确定所述第一图像中心点偏移量,所述第一映射关系用于指示瞳孔位置偏移量、视度和图像中心点偏移量之间的对应关系。
  3. 根据权利要求1或2所述的方法,其特征在于,所述根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点,包括:
    将第一模块生成的所述待显示图像的中心点向所述瞳孔偏移的方向移动第一距离,所述第一距离和所述第一图像中心点偏移量大小相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
  4. 根据权利要求1或2所述的方法,其特征在于,所述根据所述第一图像中心点偏移量和瞳孔偏移的方向,调整待显示图像的中心点,包括:
    根据所述第一图像中心点偏移量,调整第一模块的位移参数值,以使所述第一模块生成的所述待显示图像的中心点向所述瞳孔偏移的方向移动第一距离,所述第一距离和所述第一图像中心点偏移量大小相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
  5. 一种图像校准方法,其特征在于,包括:
    获取用户瞳孔位置偏移量和用户视度,所述用户瞳孔位置偏移量为用户的瞳孔相对光轴的偏移量;
    根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一畸变矫正参数;
    根据所述第一畸变矫正参数,对待显示图像上每个像素的位置进行矫正。
  6. 根据权利要求5所述的方法,其特征在于,所述根据所述用户瞳孔位置偏移量和所述用户视度,确定所述用户瞳孔位置偏移量和所述用户视度对应的第一畸变矫正参数,包括:
    根据所述用户瞳孔位置偏移量、所述用户视度以及第二映射关系,确定所述第一畸变矫正参数,所述第二映射关系用于指示瞳孔位置偏移量、视度以及畸变矫正参数之间的对应关系。
  7. 根据权利要求5或6所述的方法,其特征在于,所述第一畸变矫正参数包括红色子像素对应的畸变矫正参数、绿色子像素对应的畸变矫正参数和蓝色子像素对应的畸变矫正参数;
    所述根据所述第一畸变矫正参数,对待显示图像上每个像素的位置进行矫正,包括:
    根据所述红色子像素对应的畸变矫正参数,对所述待显示图像上每个红色子像素的位置进行矫正;
    根据所述绿色子像素对应的畸变矫正参数,对所述待显示图像上每个绿色子像素的位置进行矫正;
    根据所述蓝色子像素对应的畸变矫正参数,对所述待显示图像上每个蓝色子像素的位置进行矫正。
  8. 一种图像校准方法,其特征在于,包括:
    获取用户视度;
    根据所述用户视度,确定所述用户视度对应的第一像高;
    根据所述第一像高,调整待显示图像的像高。
  9. 根据权利要求8所述的方法,其特征在于,所述根据所述用户视度,确定所述用户视度对应的第一像高,包括:
    根据所述用户视度和第三映射关系,确定所述第一像高,所述第三映射关系用于指示视度和像高之间的对应关系。
  10. 根据权利要求8或9所述的方法,其特征在于,所述根据所述第一像高,调整待显示图像的像高,包括:
    将第一模块生成的所述待显示图像的像高调整为和所述第一像高相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
  11. 根据权利要求8或9所述的方法,其特征在于,所述根据所述第一像高,调整所述待显示图像的像高,包括:
    根据所述第一像高,调整第一模块的视场角参数值,以使所述第一模块生成的所述待显示图像的像高和所述第一像高相同,所述第一模块用于根据用户输入的指令或者默认播放顺序对场景数据进行渲染,生成对应的图像。
  12. 一种电子设备,其特征在于,包括:相机、镜片位置探测器和处理器,所述相机用于拍摄眼睛图像,所述镜片位置探测器用于探测光学镜片的位置,所述处理器用于执行权利要求1-4任一项所述的方法,或者用于执行权利要求5-7任一项所述的方法,或者用于执行权利要求8-11任一项所述的方法。
  13. 一种电子设备,其特征在于,包括:存储器和处理器;所述处理器用于与所述存储器耦合,读取并执行所述存储器中的指令,以实现权利要求1-4任一项所述的方法,或者权利要求5-7任一项所述的方法,或者权利要求8-11任一项所述的方法。
  14. 一种可读存储介质,其特征在于,所述可读存储介质上存储有计算机程序;所述计算机程序在被执行时,实现权利要求1-4任一项所述的方法,或者权利要求5-7任一项所述的方法,或者权利要求8-11任一项所述的方法。
  15. 一种程序产品,其特征在于,所述程序产品包括计算机程序,所述计算机程序存储在可读存储介质中,通信装置的至少一个处理器可以从所述可读存储介质读取所述计算机程序,所述至少一个处理器执行所述计算机程序使得通信装置实施如权利要求1-4任意一项所述的方法或者如权利要求5-7任意一项所述的方法或者如权利要求8-11任意一项所述的方法。
PCT/CN2021/135168 2020-12-14 2021-12-02 图像校准方法和设备 WO2022127612A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US18/257,263 US20240036306A1 (en) 2020-12-14 2021-12-02 Image calibration method and device
EP21905541.5A EP4242727A4 (en) 2020-12-14 2021-12-02 IMAGE CALIBRATION DEVICE AND METHOD

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011469253.5 2020-12-14
CN202011469253.5A CN114624875B (zh) 2020-12-14 2020-12-14 图像校准方法和设备

Publications (1)

Publication Number Publication Date
WO2022127612A1 true WO2022127612A1 (zh) 2022-06-23

Family

ID=81896969

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/135168 WO2022127612A1 (zh) 2020-12-14 2021-12-02 图像校准方法和设备

Country Status (4)

Country Link
US (1) US20240036306A1 (zh)
EP (1) EP4242727A4 (zh)
CN (1) CN114624875B (zh)
WO (1) WO2022127612A1 (zh)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102944935A (zh) * 2012-11-13 2013-02-27 京东方科技集团股份有限公司 双眼式头戴显示装置及其图像间距调整方法
CN105955477A (zh) * 2016-04-29 2016-09-21 乐视控股(北京)有限公司 一种调节vr设备的显示图像的方法、装置及对应的vr设备
CN105974582A (zh) * 2015-08-10 2016-09-28 乐视致新电子科技(天津)有限公司 头戴式显示设备的图像校正方法及系统
CN106094200A (zh) * 2016-08-24 2016-11-09 深圳市视睿迪科技有限公司 一种像素结构、显示面板及显示装置
US20170205630A1 (en) * 2015-10-12 2017-07-20 Intel Corporation Adjustable pupil distance wearable display
CN107682690A (zh) * 2017-10-19 2018-02-09 京东方科技集团股份有限公司 自适应视差调节方法和虚拟现实vr显示系统
CN107924229A (zh) * 2016-04-14 2018-04-17 华为技术有限公司 一种虚拟现实设备中的图像处理方法和装置
CN108008537A (zh) * 2017-12-27 2018-05-08 北京传嘉科技有限公司 基于vr眼镜的调节处理方法及vr眼镜
CN109283997A (zh) * 2017-07-20 2019-01-29 中兴通讯股份有限公司 显示方法、装置和系统
CN109891296A (zh) * 2016-10-26 2019-06-14 威尔乌集团 使用瞳孔位置校正光学透镜畸变

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9025252B2 (en) * 2011-08-30 2015-05-05 Microsoft Technology Licensing, Llc Adjustment of a mixed reality display for inter-pupillary distance alignment
CN104914575B (zh) * 2014-09-29 2017-11-14 北京蚁视科技有限公司 带有屈光度检测装置的微透镜阵列式近眼显示器
US10115205B2 (en) * 2016-03-11 2018-10-30 Facebook Technologies, Llc Eye tracking system with single point calibration
SG11201907370XA (en) * 2017-02-12 2019-09-27 Lemnis Tech Pte Ltd Methods, devices and systems for focus adjustment of displays
CN107037588B (zh) * 2017-03-24 2019-07-02 华勤通讯技术有限公司 一种虚拟现实设备、及其显示图像调整方法
CN107462992B (zh) * 2017-08-14 2020-09-18 深圳创维新世界科技有限公司 一种头戴显示设备的调节方法、装置及头戴显示设备

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102944935A (zh) * 2012-11-13 2013-02-27 京东方科技集团股份有限公司 双眼式头戴显示装置及其图像间距调整方法
CN105974582A (zh) * 2015-08-10 2016-09-28 乐视致新电子科技(天津)有限公司 头戴式显示设备的图像校正方法及系统
US20170205630A1 (en) * 2015-10-12 2017-07-20 Intel Corporation Adjustable pupil distance wearable display
CN107924229A (zh) * 2016-04-14 2018-04-17 华为技术有限公司 一种虚拟现实设备中的图像处理方法和装置
CN105955477A (zh) * 2016-04-29 2016-09-21 乐视控股(北京)有限公司 一种调节vr设备的显示图像的方法、装置及对应的vr设备
CN106094200A (zh) * 2016-08-24 2016-11-09 深圳市视睿迪科技有限公司 一种像素结构、显示面板及显示装置
CN109891296A (zh) * 2016-10-26 2019-06-14 威尔乌集团 使用瞳孔位置校正光学透镜畸变
CN109283997A (zh) * 2017-07-20 2019-01-29 中兴通讯股份有限公司 显示方法、装置和系统
CN107682690A (zh) * 2017-10-19 2018-02-09 京东方科技集团股份有限公司 自适应视差调节方法和虚拟现实vr显示系统
CN108008537A (zh) * 2017-12-27 2018-05-08 北京传嘉科技有限公司 基于vr眼镜的调节处理方法及vr眼镜

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4242727A4

Also Published As

Publication number Publication date
CN114624875B (zh) 2023-07-28
EP4242727A1 (en) 2023-09-13
US20240036306A1 (en) 2024-02-01
EP4242727A4 (en) 2024-05-08
CN114624875A (zh) 2022-06-14

Similar Documents

Publication Publication Date Title
US11765463B2 (en) Multi-channel video recording method and device
WO2020192458A1 (zh) 一种图像处理的方法及头戴式显示设备
US11782554B2 (en) Anti-mistouch method of curved screen and electronic device
CN111510630B (zh) 图像处理方法、装置及存储介质
CN113810601B (zh) 终端的图像处理方法、装置和终端设备
CN112530382B (zh) 电子设备调整画面色彩的方法和装置
US11546531B2 (en) Camera assembly, image acquisition method, and mobile terminal
US11750926B2 (en) Video image stabilization processing method and electronic device
EP4024898A1 (en) Method and device for improving sound quality of speaker
US20240119566A1 (en) Image processing method and apparatus, and electronic device
CN113572956A (zh) 一种对焦的方法及相关设备
CN114257920B (zh) 一种音频播放方法、系统和电子设备
WO2022062884A1 (zh) 文字输入方法、电子设备及计算机可读存储介质
CN113850709A (zh) 图像变换方法和装置
WO2022127612A1 (zh) 图像校准方法和设备
WO2022068505A1 (zh) 一种拍摄方法和电子设备
CN112565735B (zh) 一种虚拟现实的测量和显示方法、装置、以及系统
CN115297269B (zh) 曝光参数的确定方法及电子设备
RU2782312C1 (ru) Способ обработки изображения и устройство отображения, устанавливаемое на голове
WO2022160795A1 (zh) 基于光场显示的显示模式的转换方法及转换装置
WO2022105670A1 (zh) 一种显示方法及终端
CN115150543B (zh) 拍摄方法、装置、电子设备及可读存储介质
CN114390195B (zh) 一种自动对焦的方法、装置、设备及存储介质
WO2023030067A1 (zh) 遥控方法、遥控设备和被控制设备
CN115696067A (zh) 终端的图像处理方法、装置和终端设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21905541

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 18257263

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2021905541

Country of ref document: EP

Effective date: 20230605

NENP Non-entry into the national phase

Ref country code: DE