CN111694158A - Calibration method, calibration equipment and calibration system for near-eye display device - Google Patents



Publication number
CN111694158A
CN111694158A
Authority
CN
China
Prior art keywords
optical module
display element
image
adjusting
coordinate information
Prior art date
Legal status
Pending
Application number
CN202010551867.1A
Other languages
Chinese (zh)
Inventor
陈彪
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010551867.1A
Publication of CN111694158A

Classifications

    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00: Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01: Head-up displays
    • G02B27/0149: Head-up displays characterised by mechanical features
    • G02B27/62: Optical apparatus specially adapted for adjusting optical elements during the assembly of optical systems
    • G02B7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/003: Alignment of optical elements
    • G02B2027/0161: Head-up displays characterised by mechanical features characterised by the relative positioning of the constitutive elements


Abstract

The application discloses a calibration method, calibration equipment, and a calibration system for a near-eye display device. The calibration method comprises the following steps: a first imaging module shoots a first virtual image formed by a first optical module to obtain a first image, and a second imaging module shoots a second virtual image formed by a second optical module to obtain a second image; a first adjustment amount of a first display element in the first optical module in a first direction is determined according to the sharpness of the first image, and a second adjustment amount of a second display element in the second optical module in the first direction is determined according to the sharpness of the second image; and the position of the first display element in the first optical module is adjusted according to the first adjustment amount, and the position of the second display element in the second optical module is adjusted according to the second adjustment amount. By adjusting the positions of the display elements in the optical modules, the calibration method of the embodiments of the application improves the imaging effect of the near-eye display device, so that the user obtains a better experience.

Description

Calibration method, calibration equipment and calibration system for near-eye display device
Technical Field
The present disclosure relates to the field of calibration technologies of near-eye display devices, and in particular, to a calibration method, a calibration apparatus, and a calibration system for a near-eye display device.
Background
Augmented Reality (AR) and Virtual Reality (VR) have attracted much attention in recent years. Their near-eye display devices use a series of optical imaging elements to project the pixels formed on a display to the human eye as a distant virtual image. Compared with a virtual-reality device, a near-eye display device for augmented reality needs a see-through function so that the user can see both the real external scene and the virtual scene; the optical modules inside an augmented-reality near-eye display device therefore cannot block the user's line of sight. To solve this occlusion problem, one or a set of optical combiners may be added to the near-eye display device to integrate, complement, and "enhance" the virtual scene and the real scene in a "stacked" fashion. Both augmented-reality and virtual-reality near-eye display devices include two sets of optical modules. When the quality of the images formed by the two sets of optical modules differs greatly, the overall imaging effect of the near-eye display device is affected, which easily causes visual fatigue and reduces the user experience.
Disclosure of Invention
The embodiment of the application provides a calibration method of a near-eye display device, calibration equipment of the near-eye display device and a calibration system.
The calibration method of the embodiment of the application is used for a near-eye display device. The near-eye display device comprises a first optical module and a second optical module. The first optical module and the second optical module are used for forming a virtual image. The calibration method comprises the following steps: the first imaging module shoots a first virtual image formed by the first optical module to obtain a first image, and the second imaging module shoots a second virtual image formed by the second optical module to obtain a second image; determining a first adjustment amount of a first display element in the first optical module in a first direction according to the sharpness of the first image, and determining a second adjustment amount of a second display element in the second optical module in the first direction according to the sharpness of the second image; and adjusting the position of the first display element in the first optical module according to the first adjustment amount, and adjusting the position of the second display element in the second optical module according to the second adjustment amount.
The calibration equipment of the embodiment of the application is used for a near-eye display device. The near-eye display device comprises a first optical module and a second optical module. The first optical module and the second optical module are used for forming a virtual image. The calibration equipment includes a first imaging module, a second imaging module, a processor, and an adjusting component. The first imaging module is used for shooting a first virtual image formed by the first optical module to obtain a first image. The second imaging module is used for shooting a second virtual image formed by the second optical module to obtain a second image. The processor is used for determining a first adjustment amount of a first display element in the first optical module in a first direction according to the sharpness of the first image, and determining a second adjustment amount of a second display element in the second optical module in the first direction according to the sharpness of the second image. The adjusting component is used for adjusting the position of the first display element in the first optical module according to the first adjustment amount and adjusting the position of the second display element in the second optical module according to the second adjustment amount.
The calibration system of the embodiment of the application comprises a near-eye display device and calibration equipment. The calibration equipment is used to calibrate the near-eye display device. The near-eye display device comprises a first optical module and a second optical module. The first optical module and the second optical module are used for forming a virtual image. The calibration equipment includes a first imaging module, a second imaging module, a processor, and an adjusting component. The first imaging module is used for shooting a first virtual image formed by the first optical module to obtain a first image. The second imaging module is used for shooting a second virtual image formed by the second optical module to obtain a second image. The processor is used for determining a first adjustment amount of a first display element in the first optical module in a first direction according to the sharpness of the first image, and determining a second adjustment amount of a second display element in the second optical module in the first direction according to the sharpness of the second image. The adjusting component is used for adjusting the position of the first display element in the first optical module according to the first adjustment amount and adjusting the position of the second display element in the second optical module according to the second adjustment amount.
The calibration method, calibration equipment, and calibration system of the embodiments of the application determine the adjustment amounts of the display elements in the two sets of optical modules according to image sharpness, and adjust the position of the display element within each optical module accordingly, so as to improve the imaging quality of each optical module. The overall imaging effect of the near-eye display device is thereby improved, and the user obtains a better experience when using the near-eye display device.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow chart of a method of calibrating a near-eye display device according to some embodiments of the present disclosure;
FIG. 2 is a schematic view of a near-eye display device according to certain embodiments of the present application;
FIG. 3 is a schematic diagram of a calibration apparatus for a near-eye display device according to certain embodiments of the present application;
FIG. 4 is a schematic diagram of a calibration system of certain embodiments of the present application;
FIG. 5 is a schematic flow chart of a method of calibrating a near-eye display device according to some embodiments of the present disclosure;
FIG. 6 is a schematic diagram of a scenario illustrating a method for calibrating a near-eye display device according to some embodiments of the present application;
FIG. 7 is a schematic flow chart of a method of calibrating a near-eye display device according to some embodiments of the present disclosure;
FIG. 8 is a schematic diagram illustrating a method for calibrating a near-eye display device according to certain embodiments of the present application;
FIG. 9 is a schematic flow chart of a method for calibrating a near-eye display device according to some embodiments of the present disclosure;
FIG. 10 is a schematic flow chart of a method of calibrating a near-eye display device according to some embodiments of the present disclosure;
FIG. 11 is a flowchart illustrating a calibration method for a near-eye display device according to an embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
Referring to fig. 1 and 4, a calibration method of a near-eye display device 20 is provided in the present embodiment. The near-eye display device 20 includes a first optical module 21 and a second optical module 23, and the first optical module 21 and the second optical module 23 are used for forming a virtual image. The calibration method comprises the following steps:
01: the first imaging module 11 shoots a first virtual image formed by the first optical module 21 to obtain a first image, and the second imaging module 13 shoots a second virtual image formed by the second optical module 23 to obtain a second image;
02: determining a first adjustment amount of the first display element 213 in the first optical module 21 in the first direction according to the definition of the first image, and determining a second adjustment amount of the second display element 233 in the second optical module 23 in the first direction according to the definition of the second image; and
03: the position of the first display element 213 in the first optical block 21 is adjusted according to the first adjustment amount, and the position of the second display element 233 in the second optical block 23 is adjusted according to the second adjustment amount.
Referring to fig. 3 and 4, the present embodiment further provides a calibration apparatus 10 for a near-eye display device 20. The calibration method of the present embodiment can be implemented by the calibration apparatus 10 of the present embodiment. The calibration apparatus 10 includes a first imaging module 11, a second imaging module 13, a processor 15, an adjusting component 17, and a support 19. Step 01 can be implemented by the first imaging module 11 and the second imaging module 13, step 02 can be implemented by the processor 15, and step 03 can be implemented by the adjusting component 17.
That is, the first imaging module 11 is used for shooting a first virtual image formed by the first optical module 21 to obtain a first image; the second imaging module 13 is used for capturing a second virtual image formed by the second optical module 23 to obtain a second image. The processor 15 is configured to determine a first adjustment amount of the first display element 213 in the first optical module 21 in the first direction according to the sharpness of the first image, and determine a second adjustment amount of the second display element 233 in the second optical module 23 in the first direction according to the sharpness of the second image. The adjusting component 17 is used for adjusting the position of the first display element 213 in the first optical module 21 according to the first adjusting amount and adjusting the position of the second display element 233 in the second optical module 23 according to the second adjusting amount.
Referring to fig. 4, in particular, the near-eye display device 20 of the embodiment of the present disclosure may be a Virtual Reality (VR) device or an Augmented Reality (AR) device, which is not limited herein. The first optical module 21 of the near-eye display device 20 includes a first lens 211 and a first display element 213, which cooperate to form a virtual image. The second optical module 23 includes a second lens 231 and a second display element 233, which cooperate to form a virtual image. The first imaging module 11 and the second imaging module 13 of the embodiment of the present application simulate the eyes of a person, and their optical parameters (e.g., focal length) are identical. The field angle of the first imaging module 11 should be greater than or equal to the field angle corresponding to the first virtual image formed by the first optical module 21 (i.e., large enough to photograph the first virtual image completely), and the field angle of the second imaging module 13 should be greater than or equal to the field angle corresponding to the second virtual image formed by the second optical module 23 (i.e., large enough to photograph the second virtual image completely).
The distance between the optical axes of the first imaging module 11 and the second imaging module 13 is set to a binocular interpupillary distance computed from a large number of face samples. Users of different ages correspond to different interpupillary distances; for example, users below a predetermined age have a first interpupillary distance and users above it have a second interpupillary distance, where the first is smaller than the second. The developer selects an interpupillary distance according to the user population of the near-eye display device 20, uses it as the distance between the optical axes of the first imaging module 11 and the second imaging module 13, and fixes both imaging modules on the support 19 with that spacing; both are connected to the processor 15 through data interfaces. After the imaging modules are assembled, the developer places the first optical module 21 on the optical path of the first imaging module 11 at a predetermined distance from it, and places the second optical module 23 on the optical path of the second imaging module 13 at the same predetermined distance. The predetermined distance typically corresponds to the eye relief. In practical applications of embodiments of the present application, the predetermined distance generally cannot be less than a certain value, so that myopic users can still wear their glasses.
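The age-dependent selection of the camera-axis spacing can be sketched as a simple lookup. All numeric values and the function name are hypothetical, since the patent gives no concrete distances:

```python
def select_ipd_mm(user_age_years: int,
                  age_threshold: int = 13,
                  first_ipd_mm: float = 55.0,
                  second_ipd_mm: float = 63.0) -> float:
    """Return the spacing between the two imaging-module optical axes.
    Users below the age threshold get the smaller (first) interpupillary
    distance, others the larger (second) one. All values are placeholders."""
    return first_ipd_mm if user_age_years < age_threshold else second_ipd_mm
```

The chosen value then fixes the mounting positions of the two imaging modules on the support 19.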
After setting the relative position between the calibration apparatus 10 and the near-eye display device 20, the processor 15 may control the first imaging module 11 to capture a first virtual image formed by the first optical module 21 to obtain a first image, and control the second imaging module 13 to capture a second virtual image formed by the second optical module 23 to obtain a second image. The processor 15 may then determine a first adjustment amount of the first display element 213 in the first optical module 21 in the first direction according to the sharpness of the first image, and determine a second adjustment amount of the second display element 233 in the second optical module 23 in the first direction according to the sharpness of the second image. Sharpness can be measured, for example, with the Modulation Transfer Function (MTF), which is not limited herein. To determine the first adjustment amount, the processor 15 may calculate the difference between the sharpness of the first image and a predetermined sharpness, and determine from this difference how far the first display element 213 needs to move so that the first imaging module 11 can capture a first image whose sharpness reaches the predetermined sharpness; the second adjustment amount of the second display element 233 is determined from the sharpness of the second image in the same way.
Subsequently, the adjusting component 17 adjusts the position of the first display element 213 in the first optical module 21 according to the determined first adjustment amount, and adjusts the position of the second display element 233 in the second optical module 23 according to the determined second adjustment amount. Adjusting the position of the first display element 213 in the first optical module 21 means adjusting the distance between the first display element 213 and the first lens 211 along the first direction; likewise, adjusting the position of the second display element 233 in the second optical module 23 means adjusting the distance between the second display element 233 and the second lens 231 along the first direction. The first direction is parallel to both the optical axis of the first optical module 21 and the optical axis of the second optical module 23.
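A simple stand-in for the sharpness measurement is the variance of the image Laplacian, a common autofocus metric. The patent names MTF, so this cheaper metric is only an illustrative substitute:

```python
def laplacian_variance(img):
    """Variance of the 4-neighbour Laplacian over the image interior.
    Higher variance indicates a sharper image. (Illustrative substitute
    for the MTF-based sharpness measurement mentioned in the patent.)"""
    h, w = len(img), len(img[0])
    laps = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            laps.append(img[y - 1][x] + img[y + 1][x]
                        + img[y][x - 1] + img[y][x + 1]
                        - 4 * img[y][x])
    mean = sum(laps) / len(laps)
    return sum((v - mean) ** 2 for v in laps) / len(laps)
```

A perfectly flat (fully defocused) image scores zero, while an image containing a sharp edge scores higher; the processor 15 would compare such a score against the predetermined sharpness.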
It is understood that the first optical module 21 and the second optical module 23 of the near-eye display device 20 are both used for forming a virtual image. When there is a large difference between the sharpness of the first image captured by the first imaging module 11 and the sharpness of the second image captured by the second imaging module 13, the overall imaging effect of the near-eye display device 20 is affected, which is likely to cause visual fatigue and reduce the user experience.
The calibration method and the calibration apparatus 10 according to the embodiment of the application can enable the first imaging module 11 to clearly image the first virtual image formed by the first optical module 21 by adjusting the distance between the first display element 213 and the first lens 211, and enable the second imaging module 13 to clearly image the second virtual image formed by the second optical module 23 by adjusting the distance between the second display element 233 and the second lens 231, so as to improve the overall imaging effect of the near-eye display device 20, and enable a user to obtain better experience when using the near-eye display device 20.
Referring to fig. 4 and 5, in some embodiments, before the step 01 of capturing the first virtual image formed by the first optical module 21 by the first imaging module 11 to obtain the first image and capturing the second virtual image formed by the second optical module 23 by the second imaging module 13 to obtain the second image, the calibration method further includes:
04: the position of the first display element 213 in the first optical block 21 is adjusted according to a first predetermined adjustment amount, and the position of the second display element 233 in the second optical block 23 is adjusted according to a second predetermined adjustment amount.
Referring to fig. 3 and 4, in some embodiments, step 04 may be implemented by the adjusting component 17. That is, the adjusting component 17 is further configured to adjust the position of the first display element 213 in the first optical module 21 according to a first predetermined adjustment amount, and adjust the position of the second display element 233 in the second optical module 23 according to a second predetermined adjustment amount.
Referring to fig. 6, when the first imaging module 11 shoots the calibration object 14 and obtains a first calibration image whose sharpness is greater than a predetermined sharpness, the first predetermined adjustment amount is calculated according to the distance between the calibration object 14 and the first imaging module 11; when the second imaging module 13 shoots the calibration object 14 and obtains a second calibration image whose sharpness is greater than the predetermined sharpness, the second predetermined adjustment amount is calculated according to the distance between the calibration object 14 and the second imaging module 13.
Referring to fig. 3 and 6, specifically, before the first imaging module 11 captures the first virtual image formed by the first optical module 21 and the second imaging module 13 captures the second virtual image formed by the second optical module 23, the calibration object 14 may be used to calibrate the positions at which the first imaging module 11 and the second imaging module 13 can image clearly, so as to establish the coordinate relationship between the calibration object 14 and each imaging module. That is, when the first imaging module 11 captures the calibration object 14 and obtains a first calibration image whose sharpness is greater than the predetermined sharpness, the processor 15 calibrates the coordinate relationship between the first imaging module 11 and the calibration object 14 and calculates the first predetermined adjustment amount from it; when the second imaging module 13 captures the calibration object 14 and obtains a second calibration image whose sharpness is greater than the predetermined sharpness, the processor 15 calibrates the coordinate relationship between the second imaging module 13 and the calibration object 14 and calculates the second predetermined adjustment amount from it. The adjusting component 17 then adjusts the position of the first display element 213 in the first optical module 21 according to the first predetermined adjustment amount, and adjusts the position of the second display element 233 in the second optical module 23 according to the second predetermined adjustment amount.
In the embodiment of the present application, before the first imaging module 11 captures the first virtual image and the second imaging module 13 captures the second virtual image, the calibration object 14 is used to calibrate the positions at which the first imaging module 11 and the second imaging module 13 can image clearly. The first display element 213 and the second display element 233 can therefore be pre-installed at the calculated theoretical positions of clear imaging, which avoids the loss of adjustment accuracy that could result from having to make relatively large adjustments to the first display element 213 and the second display element 233 later.
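The pre-installation position can be sketched under a thin-lens assumption. The patent does not spell out this model; it only states that the predetermined adjustment is computed from the calibrated distance to the calibration object 14:

```python
def display_to_lens_distance_mm(focal_length_mm: float, virtual_image_mm: float) -> float:
    """Thin-lens sketch (an assumption, not the patent's stated method):
    for a virtual image at distance v on the object side of the lens,
    1/f = 1/u - 1/v, so the display element must sit at
    u = f*v / (v + f), slightly inside the focal length."""
    return focal_length_mm * virtual_image_mm / (virtual_image_mm + focal_length_mm)
```

For example, with a 50 mm focal length and a virtual image calibrated at 1000 mm, the display element would be pre-installed just inside the focal length of the lens.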
Referring to fig. 4, 7 and 8, in some embodiments, the calibration method further includes:
05: acquiring first coordinate information of a first feature point 111 in a first image and second coordinate information of a second feature point 131 in a second image, wherein the first feature point 111 and the second feature point 131 form a feature matching pair;
06: calculating a first difference between the first coordinate information and the reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information;
07: determining a third adjustment amount of the first display element 213 in the second direction based on the first difference, and determining a fourth adjustment amount of the second display element 233 in the second direction based on the second difference, the second direction being different from the first direction; and
08: the position of the first display element 213 in the first optical block 21 is adjusted according to the third adjustment amount, and the position of the second display element 233 in the second optical block 23 is adjusted according to the fourth adjustment amount.
Referring to fig. 3, 4 and 8, in some embodiments, step 05, step 06 and step 07 can be implemented by the processor 15, and step 08 can be implemented by the adjusting component 17. That is, the processor 15 is further configured to obtain first coordinate information of the first feature point 111 in the first image, and obtain second coordinate information of the second feature point 131 in the second image, where the first feature point 111 and the second feature point 131 form a feature matching pair. The processor 15 is further configured to calculate a first difference between the first coordinate information and the reference coordinate information and a second difference between the second coordinate information and the reference coordinate information, and determine a third adjustment amount of the first display element 213 in a second direction according to the first difference and a fourth adjustment amount of the second display element 233 in the second direction according to the second difference, the second direction being different from the first direction. The adjusting component 17 can also be used to adjust the position of the first display element 213 in the first optical module 21 according to the third adjustment amount, and adjust the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount.
Specifically, the processor 15 may identify a first feature point 111 in the first image and obtain its first coordinate information, and identify a second feature point 131 in the second image and obtain its second coordinate information. The first feature point 111 and the second feature point 131 form a feature matching pair; that is, the point in the first virtual image formed by the first optical module 21 corresponding to the first feature point 111 has the same feature as the point in the second virtual image formed by the second optical module 23 corresponding to the second feature point 131. It should be noted that the first feature point 111 may correspond to a point at the center of the first virtual image or to a point at its edge, which is not limited herein; likewise, the second feature point 131 may correspond to a point at the center or at the edge of the second virtual image. The number of first feature points 111 and second feature points 131 may be one or more: when there is one of each, the single first feature point 111 corresponds to the single second feature point 131 to form one feature matching pair; when there are multiple of each, the first feature points 111 correspond one-to-one with the second feature points 131 to form multiple feature matching pairs.
The first coordinate information is coordinate information on the image pixel coordinate system 113 of the first imaging module 11; the second coordinate information is coordinate information on the image pixel coordinate system 133 of the second imaging module 13.
After determining the feature points and their corresponding coordinate information, the processor 15 may calculate a first difference between the first coordinate information and the reference coordinate information, and calculate a second difference between the second coordinate information and the reference coordinate information. Since the first imaging module 11 and the second imaging module 13 are identical imaging modules, the reference coordinate information used for calculating the first difference and the reference coordinate information used for calculating the second difference are the same coordinate information; the reference coordinate information may be coordinate information on the image pixel coordinate system of the first imaging module 11 or on the image pixel coordinate system of the second imaging module 13 (either may be the coordinate system 12 shown in fig. 8), which is not limited herein. For example, assuming that the first coordinate information is (x1, y1), the second coordinate information is (x2, y2), and the reference coordinate information is (x0, y0), the first difference is Δ1 = √((x1 − x0)² + (y1 − y0)²), and the second difference is Δ2 = √((x2 − x0)² + (y2 − y0)²).
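Under the example above, the two differences reduce to plain Euclidean distances in the shared image pixel coordinate system. A minimal sketch, with all coordinate values hypothetical:

```python
import math

def coordinate_difference(point, reference):
    """Euclidean distance between a feature point and the reference point,
    both expressed in the same image pixel coordinate system."""
    return math.hypot(point[0] - reference[0], point[1] - reference[1])

# Hypothetical pixel coordinates for illustration only.
first_coord = (323.0, 244.0)   # (x1, y1): first feature point in the first image
second_coord = (317.0, 236.0)  # (x2, y2): second feature point in the second image
reference = (320.0, 240.0)     # (x0, y0): shared reference coordinate

first_difference = coordinate_difference(first_coord, reference)    # Δ1 = 5.0
second_difference = coordinate_difference(second_coord, reference)  # Δ2 = 5.0
```

Because both imaging modules are identical, the same `reference` is used for both differences, matching the description above.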
Subsequently, the processor 15 calculates a third adjustment amount of the first display element 213 in the second direction based on the first difference, and calculates a fourth adjustment amount of the second display element 233 in the second direction based on the second difference. The second direction is a direction perpendicular to the optical axis of the first imaging module 11 and perpendicular to the optical axis of the second imaging module 13. The adjusting component 17 may then adjust the position of the first display element 213 in the first optical module 21 according to the third adjustment amount, in the direction perpendicular to the optical axis of the first imaging module 11, so that the first difference between the adjusted first coordinate information of the first feature point 111 and the reference coordinate information is less than or equal to a predetermined threshold; the adjusting component 17 may also adjust the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount, in the direction perpendicular to the optical axis of the second imaging module 13, so that the second difference between the adjusted second coordinate information of the second feature point 131 and the reference coordinate information is less than or equal to the same predetermined threshold.
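The patent does not specify how a residual pixel difference maps to a physical displacement of the display element. The sketch below assumes an idealized linear model in which the virtual-image point moves one-for-one with the commanded correction, so a gain of 1 drives the difference to within any threshold in a single step; the function name, gain, and iteration cap are all illustrative assumptions:

```python
import math

def adjust_toward_reference(coord, reference, threshold_px, gain=1.0, max_iter=10):
    """Iteratively move a feature point toward the reference coordinate until
    the residual difference is at or below threshold_px (idealized model)."""
    x, y = coord
    x0, y0 = reference
    for _ in range(max_iter):
        if math.hypot(x - x0, y - y0) <= threshold_px:
            break
        # Commanded correction in the second direction (perpendicular to the
        # optical axis), assumed to translate directly into pixel motion.
        x -= gain * (x - x0)
        y -= gain * (y - y0)
    return (x, y)

adjusted = adjust_toward_reference((323.0, 244.0), (320.0, 240.0), threshold_px=0.5)
```

In practice the gain would come from the optical magnification of the module, and the loop would re-capture an image between corrections.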
In one example, if there is one first feature point 111 and one second feature point 131, the first feature point 111 in the first image corresponds to the point at the center position of the first virtual image, and the second feature point 131 in the second image corresponds to the point at the center position of the second virtual image, then the first coordinate information after adjustment by the adjusting component 17 should be equal to the second coordinate information after adjustment and equal to the reference coordinate information (the reference coordinate information being, for example, the coordinate information of the origin of the coordinate system 12 shown in fig. 8); the center of the first image and the center of the second image are then completely fused.
In another example, if there are multiple first feature points 111 and multiple second feature points 131, the first feature points 111 in the first image including both a point corresponding to the center position of the first virtual image and points corresponding to edge positions of the first virtual image, and the second feature points 131 in the second image likewise including both a point corresponding to the center position of the second virtual image and points corresponding to edge positions of the second virtual image, then: for the first feature point 111 corresponding to the point at the center position of the first virtual image and the second feature point 131 corresponding to the point at the center position of the second virtual image, the first coordinate information after adjustment by the adjusting component 17 should be equal to the second coordinate information after adjustment and equal to the reference coordinate information (in this case, for example, the coordinate information of the origin of the coordinate system 12 shown in fig. 8); for a first feature point 111 corresponding to a point at an edge position of the first virtual image and the second feature point 131 corresponding to the matching point at an edge position of the second virtual image, the first coordinate information after adjustment should be equal to the second coordinate information after adjustment and equal to the reference coordinate information (in this case, for example, the coordinate information of a non-origin position of the coordinate system 12 shown in fig. 8).
In yet another example, if there are multiple first feature points 111 and multiple second feature points 131, with the first feature points 111 in the first image including both a point corresponding to the center position of the first virtual image and points corresponding to edge positions of the first virtual image, and the second feature points 131 in the second image including both a point corresponding to the center position of the second virtual image and points corresponding to edge positions of the second virtual image, then for the first feature point 111 and the second feature point 131 corresponding to the points at the center positions of the two virtual images, the adjusted first coordinate information of the first feature point 111 need not be exactly equal to the reference coordinate information (in this case, for example, the coordinate information of the origin of the coordinate system 12 shown in fig. 8), and the adjusted second coordinate information of the second feature point 131 likewise need not be exactly equal to the reference coordinate information, as long as the difference between the first coordinate information after adjustment by the adjusting component 17 and the reference coordinate information is less than or equal to the predetermined threshold, and the difference between the second coordinate information after adjustment and the reference coordinate information is also less than or equal to the predetermined threshold.
Similarly, for a first feature point 111 corresponding to a point at an edge position of the first virtual image and the second feature point 131 corresponding to the matching point at an edge position of the second virtual image, the adjusted first coordinate information need not be exactly equal to the reference coordinate information (in this case, for example, the coordinate information of a non-origin position of the coordinate system 12 shown in fig. 8), and the adjusted second coordinate information need not be exactly equal to the reference coordinate information, as long as the difference between the first coordinate information after adjustment by the adjusting component 17 and the reference coordinate information is less than or equal to the predetermined threshold, and the difference between the second coordinate information after adjustment and the reference coordinate information is less than or equal to the predetermined threshold.
It can be understood that, when the first virtual image formed by the first optical module 21 and the second virtual image formed by the second optical module 23 are fused with high precision, the first image captured by the first imaging module 11 and the second image captured by the second imaging module 13 should be identical. In the process of actually assembling the near-eye display device 20, however, there are errors in the installation positions of the first display element 213 of the first optical module 21 and the second display element 233 of the second optical module 23, which result in poor fusion precision of the two virtual images formed by the two optical modules; this easily causes visual fatigue and degrades the user experience.
Referring to fig. 1 and 4, in the embodiment of the present application, the processor 15 determines a third adjustment amount of the first display element 213 in the second direction by calculating a first difference between the first coordinate information and the reference coordinate information, and determines a fourth adjustment amount of the second display element 233 in the second direction by calculating a second difference between the second coordinate information and the reference coordinate information. Then, the adjusting component 17 adjusts the position of the first display element 213 in the first optical module 21 according to the third adjustment amount, and adjusts the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount, so that the fusion precision of two virtual images formed by two optical modules can be improved, the overall imaging effect of the near-eye display device 20 can be improved, and a user can obtain better experience when using the near-eye display device 20.
In the calibration method according to the embodiment of the present application, steps 05 to 08 may be performed after step 03 (i.e., calibrating the sharpness before calibrating the fusion precision), or may be performed before step 02 (i.e., calibrating the fusion precision before calibrating the sharpness), which is not limited herein.
Referring to fig. 4 and 9, in some embodiments, the calibration method further includes:
09: the first imaging module 11 shoots a third virtual image formed by the first optical module 21 to obtain a third image, and the second imaging module 13 shoots a fourth virtual image formed by the second optical module 23 to obtain a fourth image, wherein the third virtual image is different from the first virtual image, and the fourth virtual image is different from the second virtual image;
010: acquiring first coordinate information of a first feature point 111 in a third image and second coordinate information of a second feature point 131 in a fourth image, wherein the first feature point 111 and the second feature point 131 form a feature matching pair;
011: calculating a first difference between the first coordinate information and the reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information;
012: determining a third adjustment amount of the first display element 213 in the second direction based on the first difference, and determining a fourth adjustment amount of the second display element 233 in the second direction based on the second difference, the second direction being different from the first direction;
013: the position of the first display element 213 in the first optical block 21 is adjusted according to the third adjustment amount, and the position of the second display element 233 in the second optical block 23 is adjusted according to the fourth adjustment amount.
Referring to fig. 3, 4 and 9, in some embodiments, step 09 can be implemented by the first imaging module 11 and the second imaging module 13, step 010, step 011 and step 012 can be implemented by the processor 15, and step 013 can be implemented by the adjusting component 17.
That is, the first imaging module 11 can also be used to capture a third virtual image formed by the first optical module 21 to obtain a third image. The second imaging module 13 can also be used to capture a fourth virtual image formed by the second optical module 23 to obtain a fourth image. The third virtual image is different from the first virtual image, and the fourth virtual image is different from the second virtual image. The processor 15 is further configured to obtain first coordinate information of the first feature point 111 in the third image, and obtain second coordinate information of the second feature point 131 in the fourth image, where the first feature point 111 and the second feature point 131 form a feature matching pair. The processor 15 is further configured to calculate a first difference between the first coordinate information and the reference coordinate information and a second difference between the second coordinate information and the reference coordinate information, and determine a third adjustment amount of the first display element 213 in a second direction according to the first difference and a fourth adjustment amount of the second display element 233 in the second direction according to the second difference, the second direction being different from the first direction. The adjusting component 17 is further configured to adjust the position of the first display element 213 in the first optical module 21 according to the third adjustment amount, and adjust the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount.
Referring to fig. 3 and 4, it can be understood that the image used for the sharpness test and the image used for the fusion precision test may be the same (as in the embodiment of fig. 7) or different (as in the embodiment of fig. 9). The test image for calibrating the fusion precision may be an image containing stripes, so that the fusion of the third virtual image formed by the first optical module 21 and the fourth virtual image formed by the second optical module 23 can be observed. Therefore, in the present embodiment, the first imaging module 11 may capture the third virtual image formed after the first optical module 21 switches its displayed image to obtain a third image, and the second imaging module 13 may capture the fourth virtual image formed after the second optical module 23 switches its displayed image to obtain a fourth image. After obtaining the third image and the fourth image, the processor 15 may calculate a first difference between the first coordinate information and the reference coordinate information to determine a third adjustment amount of the first display element 213 in the second direction, and calculate a second difference between the second coordinate information and the reference coordinate information to determine a fourth adjustment amount of the second display element 233 in the second direction.
Then, the adjusting component 17 adjusts the position of the first display element 213 in the first optical module 21 according to the third adjustment amount, in the direction perpendicular to the optical axis of the first optical module 21, and adjusts the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount, in the direction perpendicular to the optical axis of the second optical module 23, so as to improve the fusion precision of the third virtual image formed by the first optical module 21 and the fourth virtual image formed by the second optical module 23, thereby improving the overall imaging effect of the near-eye display device 20 and giving the user a better experience when using the near-eye display device 20. The specific execution process of the processor 15 in steps 010 to 013 is the same as in steps 05 to 08, and is not repeated here.
Referring to fig. 4 and 10, in some embodiments, the calibration method further includes:
014: adjusting the position of the first display element 213 in the first optical block 21 so that the difference between the coordinate information of the center point of the virtual image formed by the first optical block 21 and the reference coordinate information is less than a predetermined difference;
015: the position of the second display element 233 in the second optical block 23 is adjusted so that the difference between the coordinate information of the center point of the virtual image formed by the second optical block 23 and the reference coordinate information is smaller than a predetermined difference.
Referring to fig. 3, 4 and 10, in some embodiments, steps 014 and 015 can be implemented by the adjusting assembly 17. That is, the adjusting component 17 is also used for adjusting the position of the first display element 213 in the first optical module 21, so that the difference between the coordinate information of the center point of the virtual image formed by the first optical module 21 and the reference coordinate information is smaller than the predetermined difference; the position of the second display element 233 in the second optical block 23 is adjusted so that the difference between the coordinate information of the center point of the virtual image formed by the second optical block 23 and the reference coordinate information is smaller than a predetermined difference.
The coordinate information of the center point of the virtual image (the first virtual image or the third virtual image) formed by the first optical module 21 refers to: the coordinate information of the point corresponding to the center point of the virtual image formed by the first optical module 21 in the image (the first image or the third image) captured by the first imaging module 11, and the coordinate information of the center point of the virtual image (the second virtual image or the fourth virtual image) formed by the second optical module 23 are: coordinate information of a point corresponding to the center point of the virtual image formed by the second optical block 23 in the image (the second image or the fourth image) captured by the second imaging block 13.
Referring to fig. 4 and 8, it can be understood that if the position of the first display element 213 in the first optical module 21 deviates greatly from its target position, the difference between the coordinate information of the center point of the virtual image formed by the first optical module 21 and the reference coordinate information is large, and the first imaging module 11 may then fail to capture an image (the first image or the third image) containing the first feature point 111. Similarly, if the position of the second display element 233 in the second optical module 23 deviates greatly from its target position, the difference between the coordinate information of the center point of the virtual image formed by the second optical module 23 and the reference coordinate information is large, and the second imaging module 13 may then fail to capture an image (the second image or the fourth image) containing the second feature point 131. When the first imaging module 11 fails to capture an image containing the first feature point 111 and/or the second imaging module 13 fails to capture an image containing the second feature point 131, the processor 15 cannot calibrate the fusion precision by feature matching.
Therefore, in the present embodiment, before acquiring the coordinate information of the feature points, the adjusting component 17 may adjust the position of the first display element 213 in the first optical module 21 so that the difference between the coordinate information of the center point of the virtual image formed by the first optical module 21 and the reference coordinate information is less than a predetermined difference, and may adjust the position of the second display element 233 in the second optical module 23 so that the difference between the coordinate information of the center point of the virtual image formed by the second optical module 23 and the reference coordinate information is less than the predetermined difference. The processor 15 can then obtain a first feature point 111 and a second feature point 131 that form a feature matching pair, and further adjustment of the first display element 213 and the second display element 233 can be performed based on the first feature point 111 and the second feature point 131, so as to improve the fusion precision of the virtual images formed by the two optical modules.
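The coarse pre-alignment of steps 014 and 015 amounts to a gating check before feature matching begins. A sketch, treating the predetermined difference as a hypothetical pixel radius:

```python
import math

def center_within_tolerance(center, reference, predetermined_diff_px):
    """True when the virtual-image center point is close enough to the
    reference coordinate for the feature-matching stage to proceed."""
    dx = center[0] - reference[0]
    dy = center[1] - reference[1]
    return math.hypot(dx, dy) < predetermined_diff_px

# Hypothetical coordinates: the virtual-image center is ~1.4 px from the reference.
ready = center_within_tolerance((321.0, 241.0), (320.0, 240.0), predetermined_diff_px=5.0)
```

Only once this check passes for both optical modules does the capture of feature points (steps 05 or 010) become reliable.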
It should be noted that, step 014 and step 015 may be implemented before step 09 (not shown), in addition to step 05 (shown in fig. 10), and are not limited herein.
Referring to fig. 4 and 11, in some embodiments, after adjusting the position of the first display element 213 in the first optical module 21 according to the third adjustment amount and adjusting the position of the second display element 233 in the second optical module 23 according to the fourth adjustment amount in step 08, the calibration method further includes:
016: the first imaging module 11 shoots a virtual image formed by the first optical module 21 to obtain a fifth image, and the second imaging module 13 shoots a virtual image formed by the second optical module 23 to obtain a sixth image;
017: when the definition of the fifth image and the definition of the sixth image are both greater than the predetermined definition, it is determined that the current position of the first display element 213 in the first optical module 21 is the target position, and it is determined that the current position of the second display element 233 in the second optical module 23 is the target position.
Referring to fig. 3, fig. 4 and fig. 11, in some embodiments, step 016 can be implemented by the first imaging module 11 and the second imaging module 13, and step 017 can be implemented by the processor 15.
That is, the first imaging module 11 can also be used to capture the virtual image formed by the first optical module 21 to obtain the fifth image. The second imaging module 13 can also be used to take a virtual image formed by the second optical module 23 to obtain a sixth image. The processor 15 is further configured to determine that the current position of the first display element 213 in the first optical module 21 is the target position and determine that the current position of the second display element 233 in the second optical module 23 is the target position when the sharpness of the fifth image and the sharpness of the sixth image are both greater than the predetermined sharpness.
Specifically, the first imaging module 11 captures the virtual image formed by the first optical module 21 to obtain a fifth image, and the second imaging module 13 captures the virtual image formed by the second optical module 23 to obtain a sixth image. The fifth image may be obtained by the first imaging module 11 capturing the first virtual image formed when the first optical module 21 does not switch its displayed image (in this case, step 016 is performed after step 08), or the third virtual image formed after the first optical module 21 switches its displayed image (in this case, step 016 is performed after step 013). Similarly, the sixth image may be obtained by the second imaging module 13 capturing the second virtual image formed when the second optical module 23 does not switch its displayed image, or the fourth virtual image formed after the second optical module 23 switches its displayed image. If the sharpness of the fifth image and the sharpness of the sixth image are both greater than the predetermined sharpness, the processor 15 determines that the current position of the first display element 213 in the first optical module 21 is the target position, and that the current position of the second display element 233 in the second optical module 23 is the target position. That is, the target position of the first display element 213 is its final position in the first optical module 21, and the target position of the second display element 233 is its final position in the second optical module 23; after the first display element 213 and the second display element 233 are each fixed (e.g., glued) at the corresponding target position, the adjustment of the near-eye display device 20 is complete.
Referring to fig. 4 and 11, in some embodiments, the calibration method further includes:
018: when the definition of the fifth image is less than the predetermined definition, determining a fifth adjustment amount of the first display element 213 in the first direction according to the definition of the fifth image, and adjusting the position of the first display element 213 in the first optical module 21 according to the fifth adjustment amount; and
019: when the definition of the sixth image is less than the predetermined definition, a sixth adjustment amount of the second display element 233 in the first direction is determined according to the definition of the sixth image, and the position of the second display element 233 in the second optical block 23 is adjusted according to the sixth adjustment amount.
Referring to fig. 3, 4 and 11, in some embodiments, steps 018 and 019 can be implemented by the processor 15 and the adjusting element 17.
That is, the processor 15 may be further configured to determine a fifth adjustment amount of the first display element 213 in the first direction according to the definition of the fifth image when the definition of the fifth image is less than the predetermined definition. The adjusting assembly 17 can also be used to adjust the position of the first display element 213 in the first optical module 21 according to the fifth adjustment amount. The processor 15 may be further configured to determine a sixth adjustment amount of the second display element 233 in the first direction according to the sharpness of the sixth image when the sharpness of the sixth image is less than the predetermined sharpness. The adjusting assembly 17 can also be used to adjust the position of the second display element 233 in the second optical module 23 according to the sixth adjustment amount.
Specifically, after the fifth image and the sixth image are obtained, if the sharpness of the fifth image is less than the predetermined sharpness, that is, the fifth image does not meet the factory sharpness requirement of the near-eye display device 20, the processor 15 determines a fifth adjustment amount of the first display element 213 in the first direction according to the difference between the sharpness of the fifth image and the predetermined sharpness, and the position of the first display element 213 in the first optical module 21 is adjusted according to the fifth adjustment amount so that the sharpness of the fifth image becomes greater than the predetermined sharpness. Likewise, if the sharpness of the sixth image is less than the predetermined sharpness, that is, the sixth image does not meet the factory sharpness requirement of the near-eye display device 20, the processor 15 determines a sixth adjustment amount of the second display element 233 in the first direction according to the difference between the sharpness of the sixth image and the predetermined sharpness, and the position of the second display element 233 in the second optical module 23 is adjusted according to the sixth adjustment amount so that the sharpness of the sixth image becomes greater than the predetermined sharpness. After the positions of the first display element 213 and the second display element 233 are adjusted again, the first display element 213 and the second display element 233 are each fixed (e.g., glued) at the adjusted position, and the adjustment of the near-eye display device 20 is complete.
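Steps 017 to 019 reduce to a sharpness gate applied to each display element after the fusion adjustment. A sketch, using a hypothetical 0–1 sharpness score:

```python
def post_fusion_check(sharpness_fifth, sharpness_sixth, predetermined):
    """Classify each display element after the fusion adjustment: 'fixed'
    means the current position becomes the target position (step 017);
    'readjust' means a further first-direction (axial) adjustment is
    required (steps 018/019)."""
    first_state = "fixed" if sharpness_fifth > predetermined else "readjust"
    second_state = "fixed" if sharpness_sixth > predetermined else "readjust"
    return first_state, second_state

# Hypothetical scores: the first module passes, the second needs readjustment.
states = post_fusion_check(sharpness_fifth=0.92, sharpness_sixth=0.71, predetermined=0.8)
```

The two elements are gated independently, so one side can be glued in place while the other receives a further axial correction.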
It can be understood that, in the process of adjusting the fusion precision of the two virtual images, moving the first display element 213 and the second display element 233 in the direction perpendicular to the optical axes of the first optical module 21 and the second optical module 23 may inadvertently also displace them along those optical axes, which may reduce the sharpness of the images formed through the first imaging module 11 and the second imaging module 13. Therefore, after the fusion precision is adjusted, it is necessary to check again whether the sharpness of the images meets the requirement.
In the embodiment of the present application, the sharpness of the images is checked again after the fusion precision is adjusted, so that the overall imaging effect of the near-eye display device 20 meets the usage requirement and the user obtains a better experience when using the near-eye display device 20.
Referring to fig. 2 and 4, a calibration system 100 is further provided according to an embodiment of the present disclosure. The calibration system 100 includes a near-eye display device 20 and the calibration apparatus 10 of any of the previous embodiments.
The calibration system 100 according to the embodiment of the present application assists in adjusting the positions of the two sets of optical modules in the near-eye display device 20 through the calibration apparatus 10 including the first imaging module 11 and the second imaging module 13, so that the overall imaging effect of the near-eye display device 20 can be improved and a user can obtain a better experience when using the near-eye display device 20.
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present application; changes, modifications, substitutions, and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (15)

1. A calibration method for a near-eye display device, characterized in that the near-eye display device comprises a first optical module and a second optical module, both used for forming a virtual image; the calibration method comprises:
capturing, by a first imaging module, a first virtual image formed by the first optical module to obtain a first image, and capturing, by a second imaging module, a second virtual image formed by the second optical module to obtain a second image;
determining a first adjustment amount of a first display element in the first optical module in a first direction according to the sharpness of the first image, and determining a second adjustment amount of a second display element in the second optical module in the first direction according to the sharpness of the second image; and
adjusting the position of the first display element in the first optical module according to the first adjustment amount, and adjusting the position of the second display element in the second optical module according to the second adjustment amount.
2. The calibration method according to claim 1, wherein before the first imaging module captures the first virtual image formed by the first optical module to obtain the first image and the second imaging module captures the second virtual image formed by the second optical module to obtain the second image, the calibration method further comprises:
adjusting the position of the first display element in the first optical module according to a first preset adjustment amount, and adjusting the position of the second display element in the second optical module according to a second preset adjustment amount;
wherein the first preset adjustment amount is calculated from the distance between a calibration object and the first imaging module when the first imaging module captures the calibration object to obtain a first calibration image whose sharpness is greater than a predetermined sharpness, and the second preset adjustment amount is calculated from the distance between the calibration object and the second imaging module when the second imaging module captures the calibration object to obtain a second calibration image whose sharpness is greater than the predetermined sharpness.
3. The calibration method according to claim 1, further comprising:
acquiring first coordinate information of a first feature point in the first image and second coordinate information of a second feature point in the second image, wherein the first feature point and the second feature point form a matched feature pair;
calculating a first difference between the first coordinate information and reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information;
determining a third adjustment amount of the first display element in a second direction according to the first difference, and determining a fourth adjustment amount of the second display element in the second direction according to the second difference, the second direction being different from the first direction; and
adjusting the position of the first display element in the first optical module according to the third adjustment amount, and adjusting the position of the second display element in the second optical module according to the fourth adjustment amount.
4. The calibration method according to claim 1, further comprising:
capturing, by the first imaging module, a third virtual image formed by the first optical module to obtain a third image, and capturing, by the second imaging module, a fourth virtual image formed by the second optical module to obtain a fourth image, the third virtual image being different from the first virtual image and the fourth virtual image being different from the second virtual image;
acquiring first coordinate information of a first feature point in the third image and second coordinate information of a second feature point in the fourth image, wherein the first feature point and the second feature point form a matched feature pair;
calculating a first difference between the first coordinate information and reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information;
determining a third adjustment amount of the first display element in a second direction according to the first difference, and determining a fourth adjustment amount of the second display element in the second direction according to the second difference, the second direction being different from the first direction; and
adjusting the position of the first display element in the first optical module according to the third adjustment amount, and adjusting the position of the second display element in the second optical module according to the fourth adjustment amount.
5. The calibration method according to claim 3 or 4, further comprising:
adjusting the position of the first display element in the first optical module so that the difference between the coordinate information of the central point of the virtual image formed by the first optical module and the reference coordinate information is smaller than a preset difference;
and adjusting the position of the second display element in the second optical module so that the difference between the coordinate information of the central point of the virtual image formed by the second optical module and the reference coordinate information is smaller than the preset difference.
6. The calibration method according to claim 3 or 4, wherein after said adjusting the position of the first display element in the first optical module according to the third adjustment amount and adjusting the position of the second display element in the second optical module according to the fourth adjustment amount, the calibration method further comprises:
capturing, by the first imaging module, a virtual image formed by the first optical module to obtain a fifth image, and capturing, by the second imaging module, a virtual image formed by the second optical module to obtain a sixth image;
and when the sharpness of the fifth image and the sharpness of the sixth image are both greater than a predetermined sharpness, confirming that the current position of the first display element in the first optical module is a target position, and confirming that the current position of the second display element in the second optical module is a target position.
7. The calibration method of claim 6, further comprising:
when the sharpness of the fifth image is smaller than the predetermined sharpness, determining a fifth adjustment amount of the first display element in the first direction according to the sharpness of the fifth image, and adjusting the position of the first display element in the first optical module according to the fifth adjustment amount; and
when the sharpness of the sixth image is smaller than the predetermined sharpness, determining a sixth adjustment amount of the second display element in the first direction according to the sharpness of the sixth image, and adjusting the position of the second display element in the second optical module according to the sixth adjustment amount.
8. A calibration apparatus for a near-eye display device, characterized in that the near-eye display device comprises a first optical module and a second optical module, both used for forming a virtual image; the calibration apparatus comprises:
a first imaging module for capturing a first virtual image formed by the first optical module to obtain a first image;
a second imaging module for capturing a second virtual image formed by the second optical module to obtain a second image;
a processor for determining a first adjustment amount of a first display element in the first optical module in a first direction according to the sharpness of the first image and determining a second adjustment amount of a second display element in the second optical module in the first direction according to the sharpness of the second image; and
an adjusting component for adjusting the position of the first display element in the first optical module according to the first adjustment amount and adjusting the position of the second display element in the second optical module according to the second adjustment amount.
9. The calibration apparatus of claim 8, wherein the adjustment assembly is further configured to:
adjust the position of the first display element in the first optical module according to a first preset adjustment amount, and adjust the position of the second display element in the second optical module according to a second preset adjustment amount;
wherein the first preset adjustment amount is calculated from the distance between a calibration object and the first imaging module when the first imaging module captures the calibration object to obtain a first calibration image whose sharpness is greater than a predetermined sharpness, and the second preset adjustment amount is calculated from the distance between the calibration object and the second imaging module when the second imaging module captures the calibration object to obtain a second calibration image whose sharpness is greater than the predetermined sharpness.
10. The calibration device of claim 8, wherein the processor is further configured to:
acquire first coordinate information of a first feature point in the first image and second coordinate information of a second feature point in the second image, wherein the first feature point and the second feature point form a matched feature pair;
calculating a first difference between the first coordinate information and reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information; and
determining a third adjustment amount of the first display element in a second direction according to the first difference, and determining a fourth adjustment amount of the second display element in the second direction according to the second difference, the second direction being different from the first direction;
the adjusting component is further configured to adjust the position of the first display element in the first optical module according to the third adjustment amount and to adjust the position of the second display element in the second optical module according to the fourth adjustment amount.
11. The calibration apparatus according to claim 8, wherein the first imaging module is further configured to capture a third virtual image formed by the first optical module to obtain a third image, the third virtual image being different from the first virtual image;
the second imaging module is further configured to capture a fourth virtual image formed by the second optical module to obtain a fourth image, the fourth virtual image being different from the second virtual image;
the processor is further configured to:
acquire first coordinate information of a first feature point in the third image and second coordinate information of a second feature point in the fourth image, wherein the first feature point and the second feature point form a matched feature pair;
calculating a first difference between the first coordinate information and reference coordinate information, and calculating a second difference between the second coordinate information and the reference coordinate information; and
determining a third adjustment amount of the first display element in a second direction according to the first difference, and determining a fourth adjustment amount of the second display element in the second direction according to the second difference, the second direction being different from the first direction;
the adjusting component is further configured to adjust the position of the first display element in the first optical module according to the third adjustment amount and to adjust the position of the second display element in the second optical module according to the fourth adjustment amount.
12. The calibration apparatus of claim 10 or 11, wherein the adjustment assembly is further configured to:
adjusting the position of the first display element in the first optical module so that the difference between the coordinate information of the central point of the virtual image formed by the first optical module and the reference coordinate information is smaller than a preset difference;
and adjusting the position of the second display element in the second optical module so that the difference between the coordinate information of the central point of the virtual image formed by the second optical module and the reference coordinate information is smaller than the preset difference.
13. The calibration apparatus according to claim 10 or 11, wherein the first imaging module is further configured to capture a virtual image formed by the first optical module to obtain a fifth image;
the second imaging module is further configured to capture a virtual image formed by the second optical module to obtain a sixth image;
the processor is further configured to determine that the current position of the first display element in the first optical module is a target position and that the current position of the second display element in the second optical module is a target position when the sharpness of the fifth image and the sharpness of the sixth image are both greater than a predetermined sharpness.
14. The calibration device of claim 13, wherein the processor is further configured to determine a fifth adjustment amount of the first display element in the first direction based on the sharpness of the fifth image when the sharpness of the fifth image is less than the predetermined sharpness;
the adjusting component is further configured to adjust the position of the first display element in the first optical module according to the fifth adjustment amount;
the processor is further configured to determine a sixth adjustment amount of the second display element in the first direction according to the sharpness of the sixth image when the sharpness of the sixth image is less than the predetermined sharpness;
the adjusting component is further configured to adjust the position of the second display element in the second optical module according to the sixth adjustment amount.
15. A calibration system, characterized in that the calibration system comprises:
the near-eye display device comprises a first optical module and a second optical module; and
the calibration apparatus according to any one of claims 8 to 14, for calibrating the near-eye display device.
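The first-direction adjustment of claims 1, 6, and 7 amounts to a sharpness-driven search for the display element's position along the optical axis. A minimal sketch of one such search, assuming a variance-of-Laplacian focus metric and a hypothetical `capture(offset)` callback (neither the metric nor the callback is specified by the claims, which only require *a* sharpness measure):

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Focus metric: variance of a 4-neighbour discrete Laplacian
    (one common choice of sharpness measure)."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def best_axial_offset(capture, offsets):
    """Return the candidate first-direction offset whose captured
    virtual image is sharpest (the claims' 'first adjustment amount').
    `capture(offset)` is a hypothetical callback that moves the display
    element by `offset` and returns the imaging module's picture as a
    2-D array."""
    return max(offsets, key=lambda d: sharpness(capture(d)))
```

In practice the candidate offsets would come from the actuator's travel range, and the loop of claims 6 and 7 repeats the capture-and-compare step until the sharpness exceeds the predetermined threshold.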
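Claims 3 to 5 derive the second-direction (lateral) adjustments from the offset between each feature point of a matched pair and a shared reference coordinate. A sketch under assumed conventions — the pixel-to-actuator `scale` factor and the sign convention are illustrative, the claims fix neither:

```python
import numpy as np

def lateral_adjustments(first_pt, second_pt, ref, scale=1.0):
    """Third/fourth adjustment amounts of claims 3-4: the difference
    between each feature point of a matched pair and the reference
    coordinate, mapped to a display-element shift.  `scale` converts
    image pixels to actuator units (an assumed calibration constant)."""
    d1 = np.asarray(first_pt, float) - np.asarray(ref, float)
    d2 = np.asarray(second_pt, float) - np.asarray(ref, float)
    # Move each display element opposite to its image offset so both
    # virtual images converge toward the reference point, bringing the
    # centre-point differences below the preset difference of claim 5.
    return -scale * d1, -scale * d2
```

For example, a first feature point at (12, 8) and a second at (9, 11) against a reference of (10, 10) yield shifts of (-2, 2) and (1, -1) respectively at unit scale.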
CN202010551867.1A 2020-06-17 2020-06-17 Calibration method, calibration equipment and calibration system for near-eye display device Pending CN111694158A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010551867.1A CN111694158A (en) 2020-06-17 2020-06-17 Calibration method, calibration equipment and calibration system for near-eye display device


Publications (1)

Publication Number Publication Date
CN111694158A 2020-09-22

Family

ID=72481296

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010551867.1A Pending CN111694158A (en) 2020-06-17 2020-06-17 Calibration method, calibration equipment and calibration system for near-eye display device

Country Status (1)

Country Link
CN (1) CN111694158A (en)


Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107063646A (en) * 2017-06-27 2017-08-18 歌尔科技有限公司 Method, device and the virtual reality helmet of lens effective focal length are determined using camera
CN107101808A (en) * 2017-06-27 2017-08-29 歌尔科技有限公司 Method, device and the virtual reality helmet of lens back focal length are determined using camera
CN108012146A (en) * 2017-12-15 2018-05-08 歌尔科技有限公司 Virtual image distance detection method and equipment
CN108769668A (en) * 2018-05-31 2018-11-06 歌尔股份有限公司 Method for determining position and device of the pixel in VR display screens in camera imaging
CN109031688A (en) * 2018-06-11 2018-12-18 歌尔股份有限公司 The localization method and positioning device of display screen in a kind of optics module
CN109596319A (en) * 2018-11-26 2019-04-09 歌尔股份有限公司 The detection system and method for optics module parameter
CN110087049A (en) * 2019-05-27 2019-08-02 广州市讯码通讯科技有限公司 Automatic focusing system, method and projector
CN209417442U (en) * 2019-02-26 2019-09-20 弗提图德萨沃有限公司 The adjustment test device of optics module in binocular helmet
CN111164493A (en) * 2018-08-29 2020-05-15 法国圣戈班玻璃厂 Detection device for head-up display (HUD)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113099203A (en) * 2021-05-10 2021-07-09 青岛小鸟看看科技有限公司 Display system calibration method and system
CN113099203B (en) * 2021-05-10 2023-08-22 青岛小鸟看看科技有限公司 Display system calibration method and system
CN114245092A (en) * 2022-02-23 2022-03-25 北京灵犀微光科技有限公司 Multi-depth near-to-eye display method and device
CN115278203A (en) * 2022-07-20 2022-11-01 广州视享科技有限公司 Calibration method and calibration device for virtual reality equipment and calibration robot

Similar Documents

Publication Publication Date Title
CN110675348B (en) Augmented reality image display method and device and image processing equipment
US20070248260A1 (en) Supporting a 3D presentation
JP4852591B2 (en) Stereoscopic image processing apparatus, method, recording medium, and stereoscopic imaging apparatus
CN111694158A (en) Calibration method, calibration equipment and calibration system for near-eye display device
US9188849B2 (en) 3D imaging device and 3D imaging method
CN108663799A (en) A kind of display control program and its display control method of VR images
US9049434B2 (en) 3D imaging device and 3D imaging method
US9128367B2 (en) 3D imaging device and 3D imaging method
CN112311965A (en) Virtual shooting method, device, system and storage medium
CN108833795B (en) Focusing method and device of image acquisition equipment
US20150304625A1 (en) Image processing device, method, and recording medium
CN109785390B (en) Method and device for image correction
US20040184656A1 (en) Method for measuring object based on image and photographing apparatus
CN113112407B (en) Method, system, device and medium for generating field of view of television-based mirror
CN112598751A (en) Calibration method and device, terminal and storage medium
JP6168220B2 (en) Image generation apparatus, image processing apparatus, image generation method, and image processing program
JP2017103695A (en) Image processing apparatus, image processing method, and program of them
KR101873161B1 (en) Method and apparatus for providing personal 3-dimensional image using convergence matching algorithm
JP6330955B2 (en) Imaging apparatus and imaging method
CN109917908A (en) A kind of image acquiring method and system of AR glasses
KR101737260B1 (en) Camera system for extracting depth from images of different depth of field and opertation method thereof
CN111292380B (en) Image processing method and device
JP6569769B2 (en) Arbitrary viewpoint image composition method and image processing apparatus
CN113938578A (en) Image blurring method, storage medium and terminal device
CN114862934B (en) Scene depth estimation method and device for billion pixel imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200922