WO2021212978A1 - 标定方法、标定装置和非易失性计算机可读存储介质 - Google Patents

标定方法、标定装置和非易失性计算机可读存储介质

Info

Publication number
WO2021212978A1
Authority
WO
WIPO (PCT)
Prior art keywords
calibration
marking
camera module
rectangle
points
Prior art date
Application number
PCT/CN2021/076301
Other languages
English (en)
French (fr)
Inventor
徐乃江
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2021212978A1 publication Critical patent/WO2021212978A1/zh

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • This application relates to the fields of super-resolution imaging, lithography and data storage, and in particular to a calibration method, a calibration device and a non-volatile computer-readable storage medium.
  • With the development of display technology, the Near-Eye Display (NED) represented by devices such as VR/AR glasses is gradually entering people's lives thanks to its portability, strong sense of immersion, and wide range of uses. Although technical advances in the design and optimization of near-eye displays themselves keep emerging, relatively little research has addressed the measurement of near-eye display devices.
  • the embodiments of the present application provide a calibration method, a calibration device, and a non-volatile computer-readable storage medium.
  • The calibration method of the embodiments of the present application includes: acquiring camera parameters of a camera module, the camera parameters including the focal length of the camera module; controlling a display module to display a calibration pattern and photographing the calibration pattern with the camera module to obtain a calibration image; and calculating the field of view of the display module according to the focal length and the distances between a plurality of calibration points identified in the calibration image.
  • The calibration device of the embodiments of the present application includes a display module, a camera module, and a processor. The processor is used to obtain camera parameters of the camera module, the camera parameters including the focal length of the camera module; to control the display module to display a calibration pattern and photograph the calibration pattern with the camera module to obtain a calibration image; and to calculate the field of view of the display module according to the focal length and the distances between a plurality of calibration points recognized in the calibration image.
  • A non-volatile computer-readable storage medium contains computer-executable instructions that, when executed by one or more processors, cause the processors to perform the calibration method. The calibration method includes acquiring camera parameters of a camera module, the camera parameters including the focal length of the camera module; controlling a display module to display a calibration pattern and photographing the calibration pattern with the camera module to obtain a calibration image; and calculating the field of view of the display module according to the focal length and the distances between a plurality of calibration points identified in the calibration image.
  • FIG. 1 is a schematic flowchart of a calibration method according to some embodiments of the present application.
  • Figure 2 is a schematic plan view of a calibration device according to some embodiments of the present application.
  • Fig. 3 is a schematic plan view of another view of the calibration device of some embodiments of the present application.
  • Fig. 4 is a schematic plan view of a calibration pattern of some embodiments of the present application.
  • Fig. 5 is a schematic plan view of a calibration pattern of some embodiments of the present application.
  • Fig. 6 is a schematic plan view of a calibration image according to some embodiments of the present application.
  • FIG. 7 is a schematic plan view of a calibration image in some embodiments of the present application.
  • FIG. 8 is a schematic flowchart of a calibration method according to some embodiments of the present application.
  • FIG. 9 is a schematic plan view of a checkerboard image of some embodiments of the present application.
  • FIG. 10 is a schematic plan view of a photographed image obtained by photographing a checkerboard image by a camera module according to some embodiments of the present application.
  • FIG. 11 is a schematic plan view of a photographed image obtained by photographing a checkerboard image by a camera module according to some embodiments of the present application;
  • FIG. 12 is a schematic plan view of a photographed image obtained by photographing a calibration pattern by a camera module according to some embodiments of the present application;
  • FIG. 13 is a schematic flowchart of a calibration method according to some embodiments of the present application.
  • FIG. 14 is a schematic flowchart of a calibration method according to some embodiments of the present application.
  • FIG. 15 is a schematic flowchart of a calibration method according to some embodiments of the present application.
  • FIG. 16 is a schematic diagram of the connection between a processor and a computer-readable storage medium in some embodiments of the present application.
  • Referring to FIG. 1 to FIG. 3, the calibration method of the embodiments of the present application includes 011: obtaining the camera parameters of the camera module 20, the camera parameters including the focal length of the camera module 20; 012: controlling the display module 10 to display the calibration pattern and photographing the calibration pattern with the camera module 20 to obtain a calibration image; and 013: calculating the field of view of the display module 10 according to the focal length and the distances between the plurality of calibration points recognized in the calibration image.
  • In some embodiments, obtaining the camera parameters of the camera module includes 0111: controlling the camera module 20 to align with the calibration board 40; 0112: controlling the camera module 20 to photograph the checkerboard pattern on the calibration board 40; and 0113: calculating the camera parameters from the captured checkerboard image.
  • In some embodiments, the display area is rectangular and the calibration pattern includes a plurality of marking points P. The marking points P in the calibration pattern form a first marking side L1 and a second marking side L2; the first marking side L1 is perpendicular to the second marking side L2, and the first marking side L1 is parallel to the first side-length direction of the display area of the display module 10. A first calibration side N1 and a second calibration side N2 formed by the plurality of calibration points M are identified in the calibration image. Calculating the field of view of the display module according to the focal length and the distances between the calibration points identified in the calibration image includes: calculating the horizontal field of view of the display module 10 according to the focal length, the length of the first calibration side N1, and the ratio of the length of the first marking side L1 to the length of the calibration pattern; and calculating the vertical field of view of the display module 10 according to the focal length, the length of the second calibration side N2, and the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • In some embodiments, the plurality of calibration points M in the calibration image include a center calibration point M1 and a plurality of edge calibration points M2. The edge calibration points M2 enclose a first rectangle T1 and are located at the vertices of the first rectangle T1 and at the midpoints of its sides; the center calibration point M1 is located at the center of the first rectangle T1, the long side of the first rectangle T1 is the first calibration side N1, and the short side of the first rectangle T1 is the second calibration side N2. The calibration method further includes: determining that the camera module 20 and the display module 10 are aligned in the first side-length direction of the display area when the distances from the two first calibration sides N1 to the center calibration point M1 are equal; and determining that the camera module 20 and the display module 10 are aligned in the second side-length direction of the display area when the distances from the two second calibration sides N2 to the center calibration point M1 are equal, the second side-length direction being perpendicular to the first side-length direction.
  • In some embodiments, the calibration pattern is rectangular and includes a plurality of marking points P; the plurality of marking points P include a central marking point P1 and a plurality of edge marking points P2. The edge marking points P2 enclose a second rectangle T2 and are located at the vertices of the second rectangle T2 and at the midpoints of its sides; the central marking point P1 is located at the center of the second rectangle T2, the long side of the second rectangle T2 is the first marking side L1, and the short side of the second rectangle T2 is the second marking side L2.
  • the ratio of the length of the first marking side L1 to the length of the calibration pattern is the same as the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • In some embodiments, there are two camera modules 20 and two display modules 10 in one-to-one correspondence, and the calibration method further includes 016: calculating the field of view of the corresponding display module 10 according to the focal length of the camera module 20 and the distances between the plurality of calibration points M in the corresponding calibration image.
  • In some embodiments, the calibration points M are located at the centers of a plurality of circles of the same radius, and the distance between calibration points M is the center-to-center distance of the circles in which they are located.
  • In some embodiments, aligning the camera module 20 with the calibration board 40 includes positioning the center of the calibration board 40 on the optical axis O of the camera module 20 and making the calibration board 40 perpendicular to the optical axis O.
  • obtaining the camera parameters of the camera module 20 includes: controlling the camera module 20 to align with the display module 10; controlling the display module 10 to display a checkerboard pattern; controlling the camera module 20 to photograph the checkerboard pattern To obtain a checkerboard image; and calculate camera parameters based on the checkerboard image.
  • the calibration device of the embodiment of the present application includes a display module 10, a camera module 20 and a processor 30.
  • The processor 30 is used to obtain the camera parameters of the camera module 20, the camera parameters including the focal length of the camera module 20; to control the display module 10 to display the calibration pattern and photograph the calibration pattern with the camera module 20 to obtain a calibration image; and to calculate the field of view of the display module 10 according to the focal length and the distances between the plurality of calibration points recognized in the calibration image.
  • the calibration device 100 further includes a calibration board 40 and a positioning device 50.
  • The positioning device 50 is used to align the camera module 20 with the calibration board 40; the processor 30 is further used to control the camera module 20 to photograph the checkerboard pattern on the calibration board 40 and to calculate the camera parameters from the captured checkerboard image.
  • In some embodiments, the display area is rectangular and the calibration pattern includes a plurality of marking points P that form a first marking side L1 and a second marking side L2; the first marking side L1 is perpendicular to the second marking side L2 and parallel to the first side-length direction of the display area of the display module 10. A first calibration side N1 and a second calibration side N2 formed by the plurality of calibration points M are identified in the calibration image. The processor 30 is further used to calculate the horizontal field of view of the display module 10 according to the focal length, the length of the first calibration side N1, and the ratio of the length of the first marking side L1 to the length of the calibration pattern, and to calculate the vertical field of view of the display module 10 according to the focal length, the length of the second calibration side N2, and the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • The plurality of calibration points M in the calibration image include a center calibration point M1 and a plurality of edge calibration points M2. The edge calibration points M2 enclose a first rectangle T1 and are located at the vertices of the first rectangle T1 and at the midpoints of its sides; the center calibration point M1 is located at the center of the first rectangle T1, the long side of the first rectangle T1 is the first calibration side N1, and the short side is the second calibration side N2. The processor 30 is further used to determine that the camera module 20 and the display module 10 are aligned in the first side-length direction of the display area when the distances from the two first calibration sides N1 to the center calibration point M1 are equal, and aligned in the second side-length direction when the distances from the two second calibration sides N2 to the center calibration point M1 are equal.
  • In some embodiments, the calibration pattern is rectangular and includes a plurality of marking points P; the plurality of marking points P include a central marking point P1 and a plurality of edge marking points P2. The edge marking points P2 enclose a second rectangle T2 and are located at the vertices of the second rectangle T2 and at the midpoints of its sides; the central marking point P1 is located at the center of the second rectangle T2, the long side of the second rectangle T2 is the first marking side L1, and the short side of the second rectangle T2 is the second marking side L2.
  • the ratio of the length of the first marking side L1 to the length of the calibration pattern is the same as the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • In some embodiments, there are two camera modules 20 and two display modules 10 in one-to-one correspondence, and the processor 30 is further used to calculate the field of view of the corresponding display module 10 according to the focal length of the camera module 20 and the distances between the plurality of calibration points in the corresponding calibration image.
  • In some embodiments, the calibration points M are located at the centers of a plurality of circles of the same radius, and the distance between calibration points M is the center-to-center distance of the circles in which they are located.
  • the positioning device 50 is used to position the center of the calibration plate 40 on the optical axis O of the camera module 20 and make the calibration plate 40 perpendicular to the optical axis O.
  • the processor 30 is also used to control the alignment of the camera module 20 and the display module 10; control the display module 10 to display the checkerboard pattern; control the camera module 20 to photograph the checkerboard pattern to obtain a checkerboard image ; And calculate the camera parameters based on the checkerboard image.
  • The non-volatile computer-readable storage medium 300 of the embodiments of the present application contains computer-executable instructions 302 that, when executed by one or more processors 30, cause the processor 30 to execute the calibration method of any of the foregoing embodiments.
  • the calibration method of the embodiment of the present application includes the following steps:
  • 011: obtain the camera parameters of the camera module 20, the camera parameters including the focal length of the camera module 20; 012: control the display module 10 to display the calibration pattern and photograph the calibration pattern with the camera module 20 to obtain a calibration image; and 013: calculate the field of view of the display module 10 according to the focal length and the distances between the plurality of calibration points identified in the calibration image.
  • the calibration device 100 includes a display module 10, a camera module 20 and a processor 30.
  • The processor 30 is used to obtain the camera parameters of the camera module 20, the camera parameters including the focal length of the camera module 20; to control the display module 10 to display the calibration pattern and photograph the calibration pattern with the camera module 20 to obtain a calibration image; and to calculate the field of view of the display module 10 according to the focal length and the distances between the plurality of calibration points recognized in the calibration image.
  • step 011 can be implemented by the processor 30, step 012 can be implemented by the processor 30 in cooperation with the camera module 20 and the display module 10, and step 013 can be implemented by the processor 30.
  • Specifically, the camera module 20 has a set of design parameters when it leaves the factory. When manufacturing and assembly errors during production are small, the actual camera parameters of the camera module 20 remain basically consistent with the design parameters; in addition, some manufacturers use high-precision equipment to calibrate the camera parameters again after the camera is assembled, so as to ensure the accuracy of the camera parameters after leaving the factory. The processor 30 can therefore obtain the camera parameters directly: for example, the processor 30 may be connected to the camera module 20 to read the camera parameters stored in the camera module 20, or the camera parameters may be pre-stored in a server or a local memory to which the processor 30 is connected.
  • the camera parameters may include the focal length of the camera module 20, the horizontal field of view angle, the vertical field of view angle of the camera module 20, and so on. In this embodiment, only the focal length of the camera module 20 needs to be obtained.
  • After obtaining the camera parameters, the processor 30 controls the connected display module 10 to display a preset calibration pattern (as shown in FIG. 4 and FIG. 5) and then controls the camera module 20 to photograph the calibration pattern to obtain a calibration image (as shown in FIG. 6 and FIG. 7). The calibration pattern is a specially designed pattern containing specific marking points P that can easily be recognized in the captured calibration image, and the positions of the marking points P are set in advance.
  • The processor 30 first identifies the calibration points M corresponding to the marking points P in the calibration image (as shown in FIG. 6 and FIG. 7) and then obtains the distances between the plurality of calibration points M. For example, photographing the calibration pattern shown in FIG. 4 yields the nine calibration points M of FIG. 6 corresponding to its nine marking points P, and photographing the calibration pattern shown in FIG. 5 yields the five calibration points M of FIG. 7 corresponding to its five marking points P; the calibration points M and the marking points P correspond one-to-one.
  • The calibration pattern includes a plurality of marking points P, and the marking points P in the calibration pattern form a first marking side L1 and a second marking side L2. As shown in FIG. 4, the plurality of marking points P include a central marking point P1 and edge marking points P2 (i.e., one central marking point P1 and eight edge marking points P2). The eight edge marking points P2 enclose a second rectangle T2 and are located at its vertices and at the midpoints of its sides; the central marking point P1 is located at the center of the second rectangle T2 (i.e., the intersection of its diagonals). The long side of the second rectangle T2 is the first marking side L1, and the short side of the second rectangle T2 is the second marking side L2.
  • As shown in FIG. 5, the plurality of marking points P include a central marking point P1 and edge marking points P2 (i.e., one central marking point P1 and four edge marking points P2). The first marking side L1 is the line connecting the two opposite edge marking points P2 in the X direction, and the second marking side L2 is the line connecting the two opposite edge marking points P2 in the Y direction; the central marking point P1 is located at the intersection of the two lines, where the X direction and the Y direction are perpendicular. The first marking side L1 and the second marking side L2 are perpendicular, and the first marking side L1 is parallel to the first side-length direction of the display area of the display module 10. The display area and the calibration pattern are both rectangular, the first side length is the long side of the rectangular display area, and the first side-length direction is the extension direction of the first side length (i.e., the X direction).
  • A first calibration side N1 and a second calibration side N2, corresponding to the first marking side L1 and the second marking side L2 respectively, are formed by the plurality of calibration points M identified in the calibration image. As shown in FIG. 6, the plurality of calibration points M include a center calibration point M1 and edge calibration points M2 (i.e., one center calibration point M1 and eight edge calibration points M2). The edge calibration points M2 enclose a first rectangle T1 and are located at its vertices and at the midpoints of its sides; the center calibration point M1 is located at the center of the first rectangle T1 (i.e., the intersection of its diagonals). The long side of the first rectangle T1 is the first calibration side N1, and the short side of the first rectangle T1 is the second calibration side N2.
  • As shown in FIG. 7, the plurality of calibration points M include a center calibration point M1 and edge calibration points M2 (i.e., one center calibration point M1 and four edge calibration points M2). The first calibration side N1 is the line connecting the two opposite edge calibration points M2 in the X direction, and the second calibration side N2 is the line connecting the two opposite edge calibration points M2 in the Y direction; the center calibration point M1 is located at the intersection of the two lines. The first calibration side N1 and the second calibration side N2 are perpendicular, and the first calibration side N1 is parallel to the first side-length direction of the display area of the display module 10 (i.e., the X direction).
  • The processor 30 obtains the lengths of the first calibration side N1 and the second calibration side N2 from the distances between the plurality of calibration points M in the calibration image, and from these lengths and the focal length of the camera module 20 the field of view of the display module 10 can be calculated. The field of view includes the horizontal field of view, the vertical field of view, and the diagonal field of view; since the diagonal field of view can be derived from the horizontal and vertical fields of view, generally only the horizontal and vertical fields of view need to be calculated. It should be understood that the calibration pattern is not limited to the patterns shown in FIG. 4 and FIG. 5, as long as the calibration pattern contains a first marking side L1 parallel to the X direction and a second marking side L2 parallel to the Y direction.
  • In the calibration method and calibration device 100 of the present application, the display module 10 is controlled to display the calibration pattern, and the camera module 20 photographs the calibration pattern to obtain a calibration image containing calibration points that correspond to the marking points of the calibration pattern. Using the focal length of the camera module 20 and the distances between the calibration points, the field of view of the display module 10 can be calculated accurately. Because manufacturing tolerances mean the two display modules of a pair of VR/AR glasses rarely have exactly the same field of view even when configured with identical parameters, this allows the field of view of each display module 10 to be configured accurately, ensuring that the left-eye and right-eye images seen by the human eye completely overlap.
  • Referring to FIG. 8 and FIG. 9, in some embodiments step 011 includes 0111: controlling the camera module 20 to align with the calibration board 40; 0112: controlling the camera module 20 to photograph the checkerboard pattern on the calibration board 40; and 0113: calculating the camera parameters from the captured checkerboard image.
  • In some embodiments, the calibration device 100 further includes a positioning device 50 used to align the camera module 20 with the calibration board 40; the processor 30 is further used to control the camera module 20 to photograph the checkerboard pattern on the calibration board 40 and to calculate the camera parameters from the captured checkerboard image.
  • step 0111 can be implemented by the positioning device 50
  • step 0112 and step 0113 can be implemented by the processor 30.
  • Specifically, manufacturing and assembly errors may cause the actual parameters of the camera module 20 to deviate significantly from its design parameters, so to guarantee the accuracy of the acquired camera parameters they need to be measured accurately in advance. To ensure the sharpness of the images captured by the camera module 20, the camera module 20 should be aligned with the calibration board 40: the positioning device 50 positions the center of the calibration board 40 on the optical axis O of the camera module 20 and makes the calibration board 40 perpendicular to the optical axis O. In this way the calibration board 40 occupies the central region of the field of view of the camera module 20 as much as possible, avoiding distortion at the edge of the field of view from affecting the imaging quality and ensuring the quality of the images captured by the camera module 20.
  • After positioning, the camera module 20 photographs the calibration pattern on the calibration board 40 (this application takes the checkerboard pattern shown in FIG. 9 as an example, where 1 denotes a black area and 2 denotes a white area; the calibration pattern may also be another suitable pattern, which is not limited here) to determine whether the two are aligned. First it is determined whether the center point z of the checkerboard pattern is located at the center of the captured image S1: when the center point z of the checkerboard pattern lies on the optical axis O of the camera module 20, it necessarily lies at the center of the captured image S1 (as shown in FIG. 10), whereas when the center of the calibration board 40 is not on the optical axis O, the center point z deviates from the center of the captured image S2 (as shown in FIG. 11). In this way it can be accurately determined whether the center of the calibration board 40 lies on the optical axis O of the camera module 20. Next it is determined whether the plane of the calibration board 40 is perpendicular to the optical axis O: when it is, the distances from the center point z of the checkerboard pattern in the captured image S1 to the two long sides (the two sides parallel to the X direction) are equal (d1 and d2 in FIG. 10), and the distances from z to the two short sides (the two sides parallel to the Y direction) are also equal (d3 and d4 in FIG. 10). If they are not equal, the calibration board 40 is not perpendicular to the optical axis O and its orientation has a residual rotation (about the X or Y direction) that can be corrected by the positioning device 50: the rotation error about the X direction is corrected from the difference between d1 and d2, and the rotation error about the Y direction from d3 and d4.
  • The positioning device 50 can fix the camera module 20, the calibration board 40, and the display module 10 and provides height adjustment, three-axis angle adjustment, and similar functions, thereby ensuring the alignment of the calibration board 40 with the camera module 20, which helps improve the accuracy of the subsequently obtained camera parameters of the camera module 20 and of the field of view of the display module 10.
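  • For illustration, here is a minimal Python sketch (using OpenCV, an assumption of this note rather than anything stated in the patent) of the centring and tilt check described above: the checkerboard centre z is compared with the centre of the captured image, and the distances d1/d2 and d3/d4 are approximated by the distances from z to the outermost corner rows and columns of the detected grid. The file name, pattern size, tolerances, and corner-ordering assumption are all illustrative.

```python
import cv2
import numpy as np

PATTERN_SIZE = (9, 6)      # inner corners per row, per column (assumed layout)
CENTER_TOL_PX = 5.0        # tolerance for the centring check (assumption)
TILT_TOL_PX = 3.0          # tolerance for the d1/d2 and d3/d4 comparison (assumption)

img = cv2.imread("captured_checkerboard.png", cv2.IMREAD_GRAYSCALE)
found, corners = cv2.findChessboardCorners(img, PATTERN_SIZE)
assert found, "checkerboard not detected"

corners = corners.reshape(-1, 2)
# Assume the usual row-major corner ordering returned by findChessboardCorners.
grid = corners.reshape(PATTERN_SIZE[1], PATTERN_SIZE[0], 2)   # rows x cols x (x, y)

# 1) Is the board centre z at the centre of the captured image S1?
z = corners.mean(axis=0)
image_center = np.array([img.shape[1] / 2.0, img.shape[0] / 2.0])
centred = bool(np.all(np.abs(z - image_center) < CENTER_TOL_PX))

# 2) Tilt check: distances from z to the top/bottom rows of the imaged grid
#    stand in for d1, d2, and to the left/right columns for d3, d4.
d1 = abs(z[1] - grid[0, :, 1].mean())     # to the top row
d2 = abs(grid[-1, :, 1].mean() - z[1])    # to the bottom row
d3 = abs(z[0] - grid[:, 0, 0].mean())     # to the left column
d4 = abs(grid[:, -1, 0].mean() - z[0])    # to the right column

aligned_about_x = abs(d1 - d2) < TILT_TOL_PX   # no residual rotation about X
aligned_about_y = abs(d3 - d4) < TILT_TOL_PX   # no residual rotation about Y
print(centred, aligned_about_x, aligned_about_y, d1, d2, d3, d4)
```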
  • After alignment, the processor 30 controls the camera module 20 to photograph the checkerboard pattern on the calibration board 40 to obtain a checkerboard image, and the camera parameters of the camera module 20 can then be accurately determined from the checkerboard image using Zhang Zhengyou's camera calibration method.
  • In other embodiments, the calibration board 40 may be omitted; it is only necessary to align the camera module 20 with the display module 10 (that is, the center of the display area of the display module 10 lies on the optical axis O of the camera module 20, and the plane of the display module 10 is perpendicular to the optical axis O). The checkerboard pattern is then displayed by the display module 10 and photographed by the camera module 20 to obtain the checkerboard image, from which the camera parameters of the camera module 20 can again be accurately determined using Zhang Zhengyou's camera calibration method. In this way no calibration board is needed and only the display module 10 and the camera module 20 have to be aligned, so the structure of the calibration device 100 is simpler and the calibration procedure is shortened.
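  • As a sketch of this step, the snippet below estimates the focal length from several checkerboard photographs with OpenCV's implementation of Zhang-style calibration (cv2.calibrateCamera). This is illustrative only: the file pattern, the 9×6 inner-corner grid, and the square size are assumptions, and the focal length comes out in pixels (fx, fy), which is a convenient unit if the calibration-side lengths are later also measured in pixels.

```python
import glob
import cv2
import numpy as np

PATTERN_SIZE = (9, 6)     # inner corners per row, per column (assumption)
SQUARE_SIZE_MM = 10.0     # printed square size (assumption)

# Ideal 3-D board points, z = 0.
objp = np.zeros((PATTERN_SIZE[0] * PATTERN_SIZE[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN_SIZE[0], 0:PATTERN_SIZE[1]].T.reshape(-1, 2) * SQUARE_SIZE_MM

obj_points, img_points, image_size = [], [], None
criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)

for path in glob.glob("checkerboard_*.png"):   # illustrative file names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN_SIZE)
    if not found:
        continue
    corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
    obj_points.append(objp)
    img_points.append(corners)

# Zhang-style calibration: K holds fx, fy in pixels.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
fx, fy = K[0, 0], K[1, 1]
print(f"RMS reprojection error: {rms:.3f}, fx = {fx:.1f} px, fy = {fy:.1f} px")
```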
  • In the embodiments of the present application, the calibration board 40 is located on the back side of the display module 10. The display module 10 of an AR device is generally light-transmissive, so that the scene behind the display module 10 can also be observed by the human eye, realizing the combination of the virtual image with the real image; the camera module 20 can therefore photograph the checkerboard pattern of the calibration board 40 through the display module 10. The positioning device 50 can thus align the camera module 20, the display module 10, and the calibration board 40 with one another, enabling both the calibration of the camera module 20 and the calibration of the display module 10.
  • Referring to FIG. 12 and FIG. 13, in some embodiments the plurality of calibration points M in the calibration image include a center calibration point M1 and a plurality of edge calibration points M2. The edge calibration points M2 enclose a first rectangle T1 and are located at its vertices and at the midpoints of its sides; the center calibration point M1 is located at the center of the first rectangle T1, the long side of the first rectangle T1 is the first calibration side N1, and the short side is the second calibration side N2. The calibration method further includes 014: when the distances from the two first calibration sides N1 to the center calibration point M1 are equal, determining that the camera module 20 and the display module 10 are aligned in the first side-length direction of the display area; and 015: when the distances from the two second calibration sides N2 to the center calibration point M1 are equal, determining that the camera module 20 and the display module 10 are aligned in the second side-length direction of the display area, the second side-length direction being perpendicular to the first side-length direction.
  • The processor 30 is further used to determine that the camera module 20 and the display module 10 are aligned in the first side-length direction of the display area when the distances from the two first calibration sides N1 to the center calibration point M1 are equal, and to determine that they are aligned in the second side-length direction of the display area when the distances from the two second calibration sides N2 to the center calibration point M1 are equal.
  • step 014 and step 015 can be implemented by the processor 30.
  • Specifically, to ensure that the camera module 20 photographs the calibration pattern displayed by the display module 10 accurately, the positioning device 50 may first align the camera module 20 with the display module 10 so that the captured calibration image lies in the central region of the field of view of the camera module 20, avoiding distortion of the edge region from degrading the image quality. When the center calibration point M1 of the first rectangle T1 (corresponding to the central marking point P1 of the calibration pattern shown in FIG. 4) lies on the optical axis O of the camera module 20, the center calibration point M1 of the calibration image S3 necessarily lies at the center of the captured image S4, where the calibration image S3 is the image displayed in the display area of the display module 10. Since the field of view of the camera module 20 is generally larger than that of the display module 10, the captured image S4 contains not only the display area but also part of the area outside it, so the captured image S4 generally contains the calibration image S3. In this way it can be accurately determined whether the center of the display module 10 (i.e., the central marking point P1 of the calibration pattern) lies on the optical axis O of the camera module 20.
  • The processor 30 then determines whether the plane of the display module 10 is perpendicular to the optical axis O; when it is, the plane of the display module 10 and the camera module 20 are aligned in both the X direction and the Y direction.
  • Specifically, when the distances from the two first calibration sides N1 of the first rectangle T1 to the center calibration point M1 (that is, the distances from the edge calibration points M2 located at the midpoints of the two first calibration sides N1 to the center calibration point M1) are equal, it is determined that the camera module 20 and the display module 10 are aligned in the first side-length direction of the display area, where alignment in the first side-length direction means that the display module 10 has no rotation about the X direction relative to the camera module 20. When the distances from the two second calibration sides N2 of the first rectangle T1 to the center calibration point M1 (that is, the distances from the edge calibration points M2 located at the midpoints of the two second calibration sides N2 to the center calibration point M1) are equal, it is determined that the camera module 20 and the display module 10 are aligned in the second side-length direction of the display area, where the second side length is the short side of the rectangular display area, the second side-length direction is the extension direction of the second side length (i.e., the Y direction), and alignment in the second side-length direction means that the display module 10 has no rotation about the Y direction relative to the camera module 20. In this way it can be accurately determined whether the camera module 20 and the display module 10 are aligned, which helps improve the accuracy of the subsequent calculation of the field of view of the display module 10.
  • In other embodiments, the processor 30 compares the length of the first calibration side N1 with a preset first standard length, the first standard length being the length of the first calibration side N1 determined by having the camera module 20 photograph the same calibration pattern displayed by the display module 10 when the two are perfectly aligned; when the length of the first calibration side N1 equals the first standard length, it is determined that the camera module 20 and the display module 10 are aligned in the Y direction. Likewise, the processor 30 compares the length of the second calibration side N2 with a preset second standard length, defined in the same way; when the length of the second calibration side N2 equals the second standard length, it is determined that the camera module 20 and the display module 10 are aligned in the X direction.
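  • A minimal sketch of the alignment test of steps 014/015, assuming the nine calibration points have already been detected and labelled; the coordinates and the tolerance below are made-up example values, not data from the patent.

```python
import numpy as np

TOL_PX = 2.0   # equality tolerance for the distance comparison (assumption)

# Detected calibration points M (pixel coordinates) -- synthetic example values.
# M1 is the centre point; top/bottom are the midpoints of the two long sides N1,
# left/right are the midpoints of the two short sides N2.
M1     = np.array([640.0, 360.0])
top    = np.array([640.0, 158.0])
bottom = np.array([640.0, 562.0])
left   = np.array([338.0, 360.0])
right  = np.array([942.0, 360.0])

d_top, d_bottom = np.linalg.norm(top - M1), np.linalg.norm(bottom - M1)
d_left, d_right = np.linalg.norm(left - M1), np.linalg.norm(right - M1)

# Step 014: equal distances to the two N1 sides -> aligned along the first
# (long) side direction, i.e. no rotation of the display about the X axis.
aligned_first = abs(d_top - d_bottom) < TOL_PX
# Step 015: equal distances to the two N2 sides -> aligned along the second
# (short) side direction, i.e. no rotation about the Y axis.
aligned_second = abs(d_left - d_right) < TOL_PX

print(f"d(N1): {d_top:.1f}/{d_bottom:.1f}  d(N2): {d_left:.1f}/{d_right:.1f}")
print("aligned:", aligned_first and aligned_second)
```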
  • Referring to FIG. 4 and FIG. 14, in some embodiments the calibration pattern includes a plurality of marking points P which form a first marking side L1 and a second marking side L2; the first marking side L1 is perpendicular to the second marking side L2 and parallel to the first side-length direction of the display area of the display module 10, and the display area and the calibration pattern are rectangular. A first calibration side N1 and a second calibration side N2 formed by the plurality of calibration points M are identified in the calibration image, and step 013 includes 0131: calculating the horizontal field of view of the display module 10 according to the focal length, the length of the first calibration side N1, and the ratio of the length of the first marking side L1 to the length of the calibration pattern; and 0132: calculating the vertical field of view of the display module 10 according to the focal length, the length of the second calibration side N2, and the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • the processor 30 is further configured to calculate the horizontal field of view of the display module 10 based on the focal length, the length of the first calibration side N1, and the ratio of the length of the first marking side L1 to the length of the calibration pattern, And the vertical field angle of the display module 10 is calculated according to the focal length, the length of the second calibration side N2, and the ratio of the length of the second logo side L2 to the width of the calibration pattern.
  • step 0131 and step 0132 can be implemented by the processor 30.
  • Specifically, the first marking side L1 and the second marking side L2 are shown in FIG. 4 and FIG. 5; they differ with the calibration pattern, and so do the corresponding calibration sides. The first calibration side N1 and the second calibration side N2 of FIG. 6 correspond to the first marking side L1 and the second marking side L2 of FIG. 4, and the first calibration side N1 and the second calibration side N2 of FIG. 7 correspond to the first marking side L1 and the second marking side L2 of FIG. 5. The horizontal field of view and the vertical field of view can be obtained by the following formulas: fw = 2·atan(Dw/(2·a·F)) and fh = 2·atan(Dh/(2·b·F)), where fw is the horizontal field of view of the display module 10, fh is the vertical field of view of the display module 10, Dw is the length of the first calibration side N1, Dh is the length of the second calibration side N2, a is the ratio of the length of the first marking side L1 to the length of the calibration pattern, b is the ratio of the length of the second marking side L2 to the width of the calibration pattern, and F is the focal length of the camera module 20. In this way the horizontal and vertical fields of view can be calculated quickly from these formulas.
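  • Reading the formula as fw = 2·atan(Dw/(2·a·F)), i.e. dividing the measured calibration-side length by the marking-side ratio before applying the half-angle relation, a small self-contained sketch of the calculation could look as follows. The numerical values are invented for illustration, and F must be expressed in the same units as Dw and Dh (pixels, if fx from the camera calibration is used).

```python
import math

def display_fov_deg(side_len_px, ratio, focal_px):
    """FOV = 2 * atan(D / (2 * ratio * F)); D and F in the same (pixel) units."""
    return math.degrees(2.0 * math.atan(side_len_px / (2.0 * ratio * focal_px)))

# Illustrative numbers: lengths of the first/second calibration sides N1, N2
# measured in the calibration image, the marking-side ratios a, b, and the
# focal length F in pixels (e.g. fx from calibrateCamera).
Dw, Dh = 1400.0, 800.0
a, b = 0.7, 0.7
F = 1750.0

fov_w = display_fov_deg(Dw, a, F)   # horizontal field of view of the display module
fov_h = display_fov_deg(Dh, b, F)   # vertical field of view
print(f"horizontal FOV ~ {fov_w:.1f} deg, vertical FOV ~ {fov_h:.1f} deg")
```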
  • the ratio of the length of the first marking side L1 to the length of the calibration pattern is the same as the ratio of the length of the second marking side L2 to the width of the calibration pattern.
  • Specifically, the ratio of the length of the first marking side L1 to the length of the rectangular calibration pattern is a first predetermined ratio, and the ratio of the length of the second marking side L2 to the width of the rectangular calibration pattern is a second predetermined ratio; the two ratios may be the same or different. For example, as in FIG. 4 and FIG. 5, the first predetermined ratio and the second predetermined ratio may both be 0.7, or both be 0.8, or they may differ and be 0.7 and 0.8 respectively.
  • Referring to FIG. 3, FIG. 6 and FIG. 15, in some embodiments there are two camera modules 20 and two display modules 10 in one-to-one correspondence, and the calibration method further includes 016: calculating the field of view of the corresponding display module 10 according to the focal length of the camera module 20 and the distances between the plurality of calibration points M in the corresponding calibration image.
  • the processor 30 is further configured to calculate the corresponding field angle of the display module 10 according to the focal length of the camera module 20 and the distance between the multiple calibration points in the corresponding calibration image.
  • step 016 can be implemented by the processor 30.
  • Specifically, an AR/VR device is generally provided with two display modules 10 corresponding to the two human eyes, and the two display modules 10 display the same content; once suitable field-of-view parameters are configured, the images displayed by the two display modules coincide on the retina. When calibrating the display modules 10, two camera modules 20 can be used to simulate the human eyes: the distance D between the two camera modules 20 equals the distance between the eyes, and the two camera modules correspond to the two display modules 10 of the AR/VR device so that each display module 10 is calibrated by its own camera module. The field of view of each display module 10 is calculated from the focal length of the corresponding camera module 20 and the distances between the plurality of calibration points in the corresponding calibration image, where the corresponding calibration image is obtained by that camera module 20 photographing the calibration pattern displayed by the corresponding display module 10.
  • the field of view angles of the two display modules 10 can be accurately obtained to accurately configure the display modules 10 to ensure that the images of the two display modules 10 seen by the human eye are overlapped.
  • the two display modules 10 are calibrated at the same time, and the calibration efficiency is relatively high.
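  • Purely as an illustration of this per-eye flow, the sketch below runs the same field-of-view computation for a hypothetical left/right pair; measure_calibration_sides is a made-up stand-in for the point-detection step and all numbers are invented.

```python
import math

def display_fov_deg(side_len_px, ratio, focal_px):
    # Same relation as before: FOV = 2 * atan(D / (2 * ratio * F)).
    return math.degrees(2.0 * math.atan(side_len_px / (2.0 * ratio * focal_px)))

def measure_calibration_sides(image_path):
    # Hypothetical stand-in for detecting the calibration points and measuring
    # the first/second calibration sides N1, N2 in pixels.
    return {"left.png": (1402.0, 801.0), "right.png": (1396.0, 797.0)}[image_path]

RATIO_A = RATIO_B = 0.7      # marking-side-to-pattern ratios (example values)
modules = {"left": (1752.0, "left.png"), "right": (1748.0, "right.png")}

fovs = {}
for eye, (focal_px, image_path) in modules.items():
    n1, n2 = measure_calibration_sides(image_path)
    fovs[eye] = (display_fov_deg(n1, RATIO_A, focal_px),
                 display_fov_deg(n2, RATIO_B, focal_px))

# The two display modules can then be configured so the left/right FOVs match.
print(fovs)
```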
  • the calibration point M is located at the center of multiple circles with the same radius, and the distance between the calibration points M is the distance between the centers of the circles where the calibration point M (FIG. 12) is located.
  • Specifically, to calculate the distances between calibration points M more accurately, a plurality of circles of the same radius can be used as the calibration regions in the calibration pattern, with the center of each circle taken as a calibration point M; this removes the influence of the size of a calibration point M itself on the measured distances. In subsequent calculations, the center-to-center distance between two calibration regions is used as the distance between their calibration points M, so the distances between calibration points M can be determined more accurately, improving the accuracy of the calculated field of view of the display module 10.
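  • One way such circle centres could be recovered is with a Hough circle transform; the sketch below (OpenCV-based, with illustrative parameters and file name, none of which come from the patent) detects the circles and uses their centre-to-centre distance as the calibration-point distance.

```python
import cv2
import numpy as np

# Captured calibration image containing equal-radius circular marks (assumption:
# the circles are well contrasted; the Hough parameters below are illustrative).
gray = cv2.imread("calibration_image.png", cv2.IMREAD_GRAYSCALE)
gray = cv2.medianBlur(gray, 5)

circles = cv2.HoughCircles(
    gray, cv2.HOUGH_GRADIENT, dp=1, minDist=80,
    param1=100, param2=30, minRadius=15, maxRadius=60)
assert circles is not None, "no circular calibration marks found"
centers = circles[0, :, :2]                 # calibration points M = circle centres

def dist(p, q):
    # Distance between calibration points = centre-to-centre distance.
    return float(np.linalg.norm(p - q))

# Example: take the two extreme centres along X as the end points of the first
# calibration side N1 (how the detected points are labelled is application specific).
by_x = centers[np.argsort(centers[:, 0])]
Dw = dist(by_x[0], by_x[-1])
print(f"{len(centers)} centres found, N1 length ~ {Dw:.1f} px")
```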
  • One or more non-volatile computer-readable storage media 300 containing computer-executable instructions 302 are provided in the embodiments of the present application; when the computer-executable instructions 302 are executed by one or more processors 30, the processor 30 is caused to execute the calibration method of any of the foregoing embodiments.
  • For example, when the computer-executable instructions 302 are executed by one or more processors 30, the processor 30 executes the following steps: 011: obtaining the camera parameters of the camera module 20, the camera parameters including the focal length of the camera module 20; 012: controlling the display module 10 to display the calibration pattern and photographing the calibration pattern with the camera module 20 to obtain a calibration image; and 013: calculating the field of view of the display module 10 according to the focal length and the distances between the plurality of calibration points identified in the calibration image.
  • As another example, when the computer-executable instructions 302 are executed by one or more processors 30, the processor 30 may also execute the following steps: 0111: controlling the camera module 20 to align with the calibration board 40; 0112: controlling the camera module 20 to photograph the checkerboard pattern on the calibration board 40; and 0113: calculating the camera parameters from the captured checkerboard image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A calibration method, a calibration device, and a computer-readable storage medium. The calibration method comprises: acquiring camera parameters of a camera module (011); controlling a display module to display a calibration pattern and photographing the calibration pattern with the camera module to obtain a calibration image (012); and calculating the field of view of the display module according to the focal length and the distances between a plurality of calibration points M identified in the calibration image (013).

Description

标定方法、标定装置和非易失性计算机可读存储介质
优先权信息
本申请请求2020年4月23日向中国国家知识产权局提交的、专利申请号为202010325625.0的专利申请的优先权和权益,并且通过参照将其全文并入此处。
技术领域
本申请涉及超分辨成像、光刻及数据存储领域,特别涉及一种标定方法、标定装置和非易失性计算机可读存储介质。
背景技术
随着显示技术的发展,以VR/AR眼镜等设备为代表的眼戴式显示器(NED,ear-EyeDisplay)因其便携、沉浸感强、用途广泛等特点,正逐渐走入人们的生活。尽管对眼戴式显示器本身的设计和优化方面的技术成果层出不穷,但关于对眼戴式显示设备的测量,相关的研究相对较少。
发明内容
本申请的实施例提供了一种标定方法、标定装置和非易失性计算机可读存储介质。
本申请实施方式的标定方法包括获取相机模组的相机参数,所述相机参数包括所述相机模组的焦距;控制显示模组显示标定图案,并通过所述相机模组拍摄所述标定图案以获取标定图像;及根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角。
本申请实施方式的标定装置包括显示模组、相机模组和处理器;所述处理器用于获取所述相机模组的相机参数,所述相机参数包括所述相机模组的焦距,控制显示模组显示标定图案,并通过所述相机模组拍摄所述标定图案以获取标定图像,及根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角。
一种包含计算机可执行指令的非易失性计算机可读存储介质,当所述计算机可执行指令被一个或多个处理器执行时,使得所述处理器执行所述标定方法。所述标定方法包括获取相机模组的相机参数,所述相机参数包括所述相机模组的焦距;控制显示模组显示标定图案,并通过所述相机模组拍摄所述标定图案以获取标定图像;及根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角。
本申请的附加方面和优点将在下面的描述中部分给出,部分将从下面的描述中变得明显,或通过本申请的实践了解到。
附图说明
为了更清楚地说明本申请实施例或现有技术中的技术方案,下面将对实施例或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请某些实施方式的标定方法的流程示意图;
图2是本申请某些实施方式的标定装置的平面示意图;
图3是本申请某些实施方式的标定装置的另一视角的平面示意图;
图4是本申请某些实施方式的标定图案的平面示意图;
图5是本申请某些实施方式的标定图案的平面示意图;
图6是本申请某些实施方式的标定图像的平面示意图;
图7是本申请某些实施方式的标定图像的平面示意图;
图8是本申请某些实施方式的标定方法的流程示意图;
图9是本申请某些实施方式的棋盘格图像的平面示意图;
图10是本申请某些实施方式的相机模组拍摄棋盘格图像得到的拍摄图像的平面示意图;
图11是本申请某些实施方式的相机模组拍摄棋盘格图像得到的拍摄图像的平面示意图;
图12是本申请某些实施方式的相机模组拍摄标定图案得到的拍摄图像的平面示意图;
图13是本申请某些实施方式的标定方法的流程示意图;
图14是本申请某些实施方式的标定方法的流程示意图;
图15是本申请某些实施方式的标定方法的流程示意图;及
图16是本申请某些实施方式的处理器和计算机可读存储介质的连接示意图。
具体实施方式
以下结合附图对本申请的实施方式作进一步说明。附图中相同或类似的标号自始至终表示相同或类似的元件或具有相同或类似功能的元件。另外,下面结合附图描述的本申请的实施方式是示例性的,仅用于解释本申请的实施方式,而不能理解为对本申请的限制。
请参阅图1至图3,本申请实施方式的标定方法包括011:获取相机模组20的相机参数,相机参数包括相机模组20的焦距;012:控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像;及013:根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组10的视场角。
在某些实施方式中,获取相机模组的相机参数,包括:0111:控制相机模组20和标定板40对准;及0112:控制相机模组20拍摄标定板40上的棋盘格图案;及0113:根据拍摄得到的棋盘格图像计算相机参数。
请参阅图5和图6,在某些实施方式中,显示区域呈矩形,标定图案包括多个标志点P,标定图案中的多个标志点P形成第一标志边L1和第二标志边L2,第一标志边L1和第二标志边L2垂直,第一标志边L1平行于显示模组10的显示区域的第一边长方向;标定图像中识别到多个标定点M形成的第一标定边N1和第二标定边N2,根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组的视场角,包括:
根据焦距、第一标定边N1的长度、和第一标志边L1的长度与标定图案的长度的比值计算显示模组10的水平视场角;及
根据焦距、第二标定边N2的长度、和第二标志边L2的长度与标定图案的宽度的比值计算显示模组10的垂直视场角。
请参阅图12和图13,在某些实施方式中,标定图像中的多个标定点M包括中心标定点M1和多个边缘标定点M2,边缘标定点M2围成第一矩形T1并位于第一矩形T1的顶点和第一矩形T1的边的中点,中心标定点M1位于第一矩形T1的中心,第一矩形T1的长边为第一标定边N1,第一矩形T1的短边为第二标定边N2,标定方法还包括:
014:在两个第一标定边N1和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第一边长方向对齐;及
015:在两个第二标定边N2和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第二边长方向对齐,第二边长方向和第一边长方向垂直。
在某些实施方式中,标定图案呈矩形,标定图案包括多个标志点P,多个标志点P包括中心标志点P1和多个边缘标志点P2,边缘标志点P2围成第二矩形T2并位于第二矩形T2的顶点和第二矩形T2的边的中点,中心标志点P1位于第二矩形T2的中心,第二矩形T2的长边为第一标志边L1,第二矩形的短边为第二标志边L2。
在某些实施方式中,第一标志边L1的长度和标定图案的长度的比值和第二标志边L2的长度和标定图案的宽度的比值相同。
请参阅图2和图6,在某些实施方式中,相机模组20和显示模组10均为两个,两相机模组20和两显示模组10一一对应,标定方法还包括:016:根据相机模组20的焦距和对应的标定图像中的多个标定点M之间的距离计算对应的显示模组10的视场角。
在某些实施方式中,标定点M位于多个半径相同的圆的圆心位置,标定点M之间的距离为标定点M所在的圆之间的圆心距。
请参阅图2,在某些实施方式中,控制相机模组20和标定板40对准,包括:将标定板40的中心定位在相机模组20的光轴O,并使得标定板40垂直光轴O。
在某些实施方式中,获取相机模组20的相机参数,包括:控制相机模组20和显示模组10对准;控制显示模组10显示棋盘格图案;控制相机模组20拍摄棋盘格图案以获取棋盘格图像;及根据棋盘格图像计算相机参数。
请参阅图2和图3,本申请实施方式的标定装置包括显示模组10、相机模组20和处理器30。处理器30用于获取相机模组20的相机参数,相机参数包括相机模组20的焦距;控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像;及根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组10的视场角。
在某些实施方式中,标定装置100还包括标定板40和定位装置50,定位装置50用于控制相机模组20和标定板40对准;处理器30还用于控制相机模组20拍摄标定板40上的棋盘格图案;及根据拍摄得到的棋盘格图像计算相机参数。
请参阅图3、图5和图6,在某些实施方式中,显示区域呈矩形,标定图案包括多个标志点P,标定图案中的多个标志点P形成第一标志边L1和第二标志边L2,第一标志边L1和第二标志边L2垂直,第一标志边L1平行于显示模组10的显示区域的第一边长方向;标定图像中识别到多个标定点M形成的第一标定边N1和第二标定边N2,处理器30还用于根据焦距、第一标定边N1的长度、和第一标志边L1的长度与标定图案的长度的比值计算显示模组10的水平视场角,及根据焦距、第二标定边N2的长度、和第二标志边L2的长度与标定图案的宽度的比值计算显示模组10的垂直视场角。
请参阅图3和图12,在某些实施方式中,标定图像中的多个标定点M包括中心标定 点M1和多个边缘标定点M2,边缘标定点M2围成第一矩形T1并位于第一矩形T1的顶点和第一矩形T1的边的中点,中心标定点M1位于第一矩形T1的中心,第一矩形T1的长边为第一标定边N1,第一矩形T1的短边为第二标定边N2,处理器30还用于:在两个第一标定边N1和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第一边长方向对齐,及在两个第二标定边N2和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第二边长方向对齐。
在某些实施方式中,标定图案呈矩形,标定图案包括多个标志点P,多个标志点P包括中心标志点P1和多个边缘标志点P2,边缘标志点P2围成第二矩形T2并位于第二矩形T2的顶点和第二矩形T2的边的中点,中心标志点P1位于第二矩形T2的中心,第二矩形T2的长边为第一标志边L1,第二矩形的短边为第二标志边L2。
在某些实施方式中,第一标志边L1的长度和标定图案的长度的比值和第二标志边L2的长度和标定图案的宽度的比值相同。
请参阅图2和图6,在某些实施方式中,相机模组20和显示模组10均为两个,两相机模组20和两显示模组10一一对应,处理器30还用于根据相机模组20的焦距和对应的标定图像中的多个标定点之间的距离计算对应的显示模组10的视场角。
在某些实施方式中,标定点M位于多个半径相同的圆的圆心位置,标定点M之间的距离为标定点M所在的圆之间的圆心距。
请参阅图2,在某些实施方式中,定位装置50用于将标定板40的中心定位在相机模组20的光轴O,并使得标定板40垂直光轴O。
在某些实施方式中,处理器30还用于控制相机模组20和显示模组10对准;控制显示模组10显示棋盘格图案;控制相机模组20拍摄棋盘格图案以获取棋盘格图像;及根据棋盘格图像计算相机参数。
请参阅图16,本申请实施方式的计算机可执行指令302的非易失性计算机可读存储介质300,当计算机可执行指令302被一个或多个处理器30执行时,使得处理器30执行上述任一实施方式的的标定方法。
请参阅图1至图3,本申请实施方式的标定方法包括以下步骤:
011:获取相机模组20的相机参数,相机参数包括相机模组20的焦距;
012:控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像;及
013:根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组10的视场角。
在某些实施方式中,标定装置100包括显示模组10、相机模组20和处理器30。处理器30用于获取相机模组20的相机参数,相机参数包括相机模组20的焦距;控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像;及根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组10的视场角。也即是说,步骤011可以由处理器30实现,步骤012可以由处理器30配合相机模组20及显示模组10实现,步骤013可以由处理器30实现。
具体地,相机模组20在出厂时有一个设计参数,在相机模组20制造过程中的制造和组装误差较小的情况下,相机模组20本身的相机参数基本和设计参数基本保持一致,另 外一些厂商在相机组装完成后也会利用高精度的一起再次对相机参数进行标定,从而保证出厂后的相机参数的准确性。
处理器30能够获取相机参数,例如处理器30可以与相机模组20连接以获取存储在相机模组20中的相机参数,或者,相机模组20的相机参数已预先存储好,如服务器或者本地存储器,处理器30与服务器或本地存储器连接以获取相机参数。相机参数可包括相机模组20的焦距、相机模组20的水平视场角、垂直视场角等。本实施方式中,只需要获取相机模组20的焦距即可。
在得到相机参数后,处理器30控制连接好的显示模组10显示预设的标定图案(如图4和图5所示),然后控制相机模组20拍摄标定图案,以获取标定图像(如图6和图7所示),标定图案为特制的图案,标定图案存在特定的标志点P,这些标志点P能够从拍摄得到的标定图像中轻松的识别到,标志点P的位置也是提前设定好的。
处理器30首先识别标定图像中与标志点P对应的标定点M(如图6和图7所示),然后得到多个标定点M(如本实施方式拍摄图4所示的标定图案,可得到图6所示的与九个标志点P分别对应的九个标定点M,拍摄图5所示的标定图案,可得到图7所示的与五个标志点P分别对应的五个标定点M)之间的距离,标定点M和标志点P一一对应。
标定图案包括多个标志点P,标定图案中的多个标志点P形成第一标志边L1和第二标志边L2。如图4所示,多个标志点P包括中心标志点P1和边缘标志点P2(即,一个中心标志点P1和八个边缘标志点P2),八个边缘标志点P2围成第二矩形T2,边缘标志点P2位于第二矩形T2的顶点和边的中点,中心标志点P1位于第二矩形T2的中心(即,第二矩形T2的对角线的交点),第二矩形T2的长边为第一标志边L1,第二矩形T2的短边为第二标志边L2。如图5所示,多个标志点P包括中心标志点P1和边缘标志点P2(即,一个中心标志点P1和四个边缘标志点P2),第一标志边L1和第二标志边L2分别为X方向上相对的两个边缘标志点P2的连线和Y方向相对的两个边缘标志点P2的连线,中心标志点P1位于两个连线的交点,其中,X方向和Y方向垂直,第一标志边L1和第二标志边L2垂直,第一标志边L1平行于显示模组10的显示区域的第一边长方向,显示区域和标定图案均呈矩形,其中,第一边长为矩形显示区域的长边,第一边长方向即为第一边长的延伸方向(即,X方向)。
标定图像中识别到多个标定点M形成的分别与第一标志边L1和第二标志边L2对应的第一标定边N1和第二标定边N2。如图6所示,标定图像中的多个标定点M包括中心标定点M1和多个边缘标定点M2(即,一个中心标定点M1和八个边缘标定点M2),边缘标定点M2围成第一矩形T1并位于第一矩形T1的顶点和第一矩形T1的边的中点,中心标定点M1位于第一矩形T1的中心(即,第一矩形T1的对角线的交点),第一矩形T1的长边为第一标定边N1,第一矩形T1的短边为第二标定边N2。如图7所示,多个标定点M包括中心标定点M1和边缘标定点M2(即,一个中心标定点M1和四个边缘标定点M2),第一标定边N1和第二标定边N2分别为X方向上相对的两个边缘标定点M2的连线和Y方向相对的两个边缘标定点M2的连线,中心标定点M1位于两个连线的交点,第一标定边N1和第二标定边N2垂直,第一标定边N1平行于显示模组10的显示区域的第一边长方向(即,X方向)。
处理器30根据标定图像中多个标定点M之间的距离能够得到第一标定边N1和第二 标定边N2的长度,根据第一标定边N1和第二标定边N2的长度以及相机模组20的焦距即可计算得到显示模组10的视场角,视场角包括水平视场角、垂直视场角和对角线视场角,其中,对角线视场角可根据水平视场角和垂直视场角计算得到,故一般只需要计算水平视场角和垂直视场角即可。可以理解,标定图案的类型包括但不限于上述图4和图5所示的图案,只需标定图案中存在分别与X方向平行的第一标志边L1和与Y方向平行的第二标志边L2即可。
由于生产工艺的限制,即使配置同一的视场角参数,每台VR/AR眼镜的两显示模组的视场角也并不完全相同,会导致人眼透过显示模组后看到的左右眼图像不完全重合,因此,如何准确地检测左右眼对应的显示模组的视场角是亟需解决的问题。
本申请的标定方法和标定装置100中,通过控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像,标定图像中存在与标定图案的标志点对应的标定点,利用相机模组20的焦距和标定点之间的距离即可准确地计算显示模组10的视场角。从而能够对显示模组10的视场角进行准确地配置,保证人眼看到的左右眼图像完全重合。
请结合图8和图9,在某些实施方式中,步骤011包括:
0111:控制相机模组20和标定板40对准;及
0112:控制相机模组20拍摄标定板40上的棋盘格图案;及
0113:根据拍摄得到的棋盘格图像计算相机参数。
请结合图2和图3,在某些实施方式中,标定装置100还包括定位装置50,定位装置50用于控制相机模组20和标定板40对准;处理器30还用于控制相机模组20拍摄标定板40上的棋盘格图案;及根据拍摄得到的棋盘格图像计算相机参数。也即是说,步骤0111可以由定位装置50实现,步骤0112和步骤0113可以由处理器30实现。
具体地,为了防止相机模组20的实际参数因为制造和组装误差导致和设定参数出现较大偏差,因此,为了保证获取的相机参数的准确性,需要事先准确地检测好相机参数。为了保证相机模组20拍摄的图像的清晰度,相机模组20要对准标定板40,具体地,可通过定位装置50将标定板40的中心定位到相机模组20的光轴O上,并使得标定板40垂直于光轴O,如此,标定板40能最大限度的处于相机模组20的视场范围的中心区域,避免了视场范围的边缘区域的畸变影响成像质量,保证相机模组20拍摄的图像的成像质量。
在定位完成后,可通过相机模组20拍摄标定板40上的标定图案(本申请以标定图案为图9所示的棋盘格图案为例进行说明,其中1表示黑色区域,2表示白色区域,标定图案还可以是其他合适的图案,在此不做限制),确定是否对准,具体为先判断棋盘格图案的中心点z是否位于整个拍摄图像S1的中心,当棋盘格图案的中心点z位于相机模组20的光轴O时,棋盘格图案的中心点z必然位于拍摄图像S1的中心(如图10所示),而当标定板40的中心点z不位于相机模组20的光轴O时,棋盘格图案的中心点z就会偏离拍摄图像S2的中心(如图11所示),如此,可准确判断标定板40的中心是否位于相机模组20的光轴O上;
然后可判断标定板40所在平面是否垂直光轴O,可以理解,当标定板40垂直光轴O时,拍摄图像S1中的棋盘格图案的中心点z到两条长边(平行于X方向的两条边)的距 离应该是相同的(如图10中的d1和d2),同样的,拍摄图像S1中的棋盘格图案的中心点z到两条短边(平行于Y方向的两条边)的距离也应该是相同的(如图10中的d3和d4),若不相同,表示标定板40未垂直光轴O,标定板40的定位角度存在一定误差(如存在绕X方向或Y方向的旋转角度),可通过定位装置50矫正该旋转角度,如根据d1、d2的差值可校正X方向的旋转角度误差,根据d3和d4可校正Y方向的旋转角度误差,定位装置50能够实现固定相机模组20、标定板40和显示模组10固定、高度调节、三轴角度调节等功能,从而保证标定板40相机模组20的对准,有利于提升后续相机模组20的相机参数以及显示模组10的视场角的准确性。
在对准后,处理器30控制相机模组20拍摄标定板40上的棋盘格图案以得到棋盘格图像,根据张正友相机标定法和棋盘格图像,即可准确地检测得到相机模组20的相机参数。在其他实施方式中,可不设置标定板40,而只需要将相机模组20和显示模组10对准(即,显示模组10的显示区域的中心位于相机模组20的光轴O上,且显示模组10所在的面垂直光轴O)。然后通过显示模组10显示该棋盘格图案,相机模组20拍摄显示模组10显示的棋盘格图案即可获取棋盘格图像,从而根据张正友相机标定法和棋盘格图像,即可准确地检测得到相机模组20的相机参数。如此,无需设置标定板,只需对准显示模组10和相机模组20即可,标定装置100的结构更为简单,且减少了标定工序。
本申请实施方式中,本申请的标定板40位于显示模组10的背面一侧,可以理解,AR设备的显示模组10一般是透光的,以使得显示模组10后的景象也能够被人眼所观察到,从而实现虚弥图像与现实图像的结合,相机模组20可透过显示模组10拍摄标定板40的棋盘格图案。因此,定位装置50可将相机模组20、显示模组10以及标定板40三者均对准,从而实现相机模组20的标定以及显示模组10的标定。
请参阅图12和图13,在某些实施方式中,标定图像中的多个标定点M包括中心标定点M1和多个边缘标定点M2,边缘标定点M2围成第一矩形T1并位于第一矩形T1的顶点和第一矩形T1的边的中点,中心标定点M1位于第一矩形T1的中心,第一矩形T1的长边为第一标定边N1,第一矩形T1的短边为第二标定边N2,标定方法还包括:
014:在两个第一标定边N1和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第一边长方向对齐;及
015:在两个第二标定边N2和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第二边长方向对齐,第二边长方向和第一边长方向垂直。
请结合图3,在某些实施方式中,处理器30还用于:在两个第一标定边N1和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第一边长方向对齐,及在两个第二标定边N2和中心标定点M1的距离相等时,确定相机模组20和显示模组10在显示区域的第二边长方向对齐。也即是说,步骤014和步骤015可以由处理器30实现。
具体地,为了保证相机模组20拍摄显示模组10显示的标定图案的准确性,定位装置50可先控制相机模组20和显示模组10对准,以使得拍摄的标定图像位于相机模组20的视场范围的中心区域,避免边缘区域的畸变影响标定图像的成像质量。
在第一矩形T1的中心标定点M1(对应标定图案的中心标志点P1(图4示))位于相机模组20的光轴O时,标定图像S3的中心标定点M1必然位于拍摄图像S4的中心,其 中,标定图像S3即显示模组30的显示区域显示的图像,由于相机模组30的视场角一般大于显示模组10的视场角,在拍摄时不仅会拍摄到显示区域,而且还会拍摄到显示区域之外的部分区域,因此,拍摄图像S4一般包含了标定图像S3。如此,可准确判断显示模组10的中心(即,标定图案的中心标志点P1)是否位于相机模组20的光轴O上。
处理器30再判断显示模组10所在的面是否垂直光轴O,当标定板40垂直光轴O时,显示模组10所在的面和相机模组20在X方向和Y方向上均对齐。具体地,在第一矩形T1的两个第一标定边N1X方向和中心标定点M1的距离(即,分别位于两个第一标定边N1中点的边缘标定点M2和中心标定点M1的距离)相等时,确定相机模组20和显示模组10在显示区域的第一边长方向对齐,其中,在显示区域的第一边长方向对齐指的是显示模组10相对相机模组20没有绕X方向的旋转角度);在第一矩形T1中两个第二标定边N2和中心标定点M1的距离(即,分别位于两个第二标定边N2的中点的边缘标定点M2和中心标定点M2的距离)相等时,确定相机模组20和显示模组10在显示区域的第二边长方向对齐,其中,第二边长为矩形显示区域的短边,第二边长方向即为第二边长的延伸方向(即,Y方向),在显示区域的第二边长方向对齐指的是显示模组10相对相机模组20没有绕Y方向的旋转角度)。如此,可准确地确定相机模组20和显示模组10是否对准,有利于提升后续计算显示模组10的视场角的准确性。在其他实施方式中,处理器30将第一标定边N1的长度和预设的第一标准长度进行比较,第一标准长度为在相机模组30和显示模组10完全对准的情况下,相机模组30拍摄显示模组10显示的同样的标定图案以确定的第一标定边N1的长度,在第一标定边N1的长度和第一标准长度相同时,即可确定相机模组20和显示模组10在Y方向对齐,同样地,处理器30将第二标定边N2的长度和预设的第二标准长度进行比较,第二标准长度为在相机模组30和显示模组10完全对准的情况下,相机模组30拍摄显示模组10显示的同样的标定图案以确定的第二标定边N2的长度,在第二标定边N2的长度和第二标准长度相同时,即可确定相机模组20和显示模组10在X方向对齐。
请参阅图4和14,在某些实施方式中,标定图案包括多个标志点P,标定图案中的多个标志点P形成第一标志边L1和第二标志边L2,第一标志边L1和第二标志边L2垂直,第一标志边L1平行于显示模组10的显示区域的第一边长方向,显示区域和标定图案呈矩形;标定图像中识别到多个标定点M形成的第一标定边N1和第二标定边N2,步骤013包括:
0131:根据焦距、第一标定边N1的长度、和第一标志边L1的长度与标定图案的长度的比值计算显示模组10的水平视场角;及
0132:根据焦距、第二标定边N2的长度、和第二标志边L2的长度与标定图案的宽度的比值计算显示模组10的垂直视场角。
在某些实施方式中,处理器30还用于根据焦距、第一标定边N1的长度、和第一标志边L1的长度与标定图案的长度的比值计算显示模组10的水平视场角,及根据焦距、第二标定边N2的长度、和第二标志边L2的长度与标定图案的宽度的比值计算显示模组10的垂直视场角。也即是说,步骤0131和步骤0132可以由处理器30实现。
具体地,第一标志边N1和第二标志边N2如图4和5所示,根据标定图案的不同,第一标志边N1和第二标志边N2也不同,图6的第一标定边N1和第二标定边N2分别与 图4的第一标志边N1和第二标志边N2对应,图7的第一标定边N1和第二标定边N2分别与图5的第一标志边N1和第二标志边N2对应。水平视场角和垂直视场角具体可通过如下公式得到:
fw=2*atan(Dw/2*a/F),fh=2*atan(Dh/2*b/F),其中,fw是显示模组10的水平视场角,fh是显示模组10的垂直视场角,Dw是第一标定边N1的长度,Dh是第二标定边N2的长度,a是第一标志边L1的长度和标定图案的长度的比值,b是第二标志边L2的长度和标定图案的宽度的比值,F是相机模组20的焦距。如此,根据上述公式可快速计算得到水平视场角和垂直视场角。
请参阅图4,在某些实施方式中,第一标志边L1的长度和标定图案的长度的比值和第二标志边L2的长度和标定图案的宽度的比值相同。
具体地,第一标志边L1的长度和矩形的标定图案的长度的比值为第一预定比值,第二标志边L2的长度和矩形标定图案的宽度的比值为第二预定比值,第一预定比值和第二预定比值可以相同,也可以不同。例如,如图4和图5中的第一预定比值和第二预定比值相同且均为0.7;或者,第一预定比值和第二预定比值相同且均为0.8;或者,第一预定比值和第二预定比值不相同且分别为0.7和0.8等。
请参阅图3、图6和图15,在某些实施方式中,相机模组20和显示模组10均为两个,两相机模组20和两显示模组10一一对应,标定方法还包括:
016:根据相机模组20的焦距和对应的标定图像中的多个标定点M之间的距离计算对应的显示模组10的视场角。
在某些实施方式中,处理器30还用于根据相机模组20的焦距和对应的标定图像中的多个标定点之间的距离计算对应的显示模组10的视场角。也即是说,步骤016可以由处理器30实现。
具体地,AR/VR设备一般和人眼对应的设置有两个显示模组10,在显示时两个显示模组10显示同样的内容,配置合适的视场角参数后,两个显示模组显示的图像在视网膜的成像就是重合的,在对显示模组10进行标定时,可采用两个相机模组20模拟人眼,两个相机模组20之间的距离D即为人眼之间的距离,和AR/VR设备的两个显示模组10分别对应,以分别对对应的显示模组10进行标定,根据相机模组20的焦距和对应的标定图像中多个标定点之间的距离计算显示模组10的视场角,其中对应的标定图像为相机模组20拍摄对应的显示模组10显示的标定图案获取的。从而分别准确地得到两个显示模组10视场角,以对显示模组10进行准确的配置,保证人眼看到的两个显示模组10的图像是重合的。且同时对两个显示模组10进行标定,标定效率较高。
在某些实施方式中,标定点M位于多个半径相同的圆的圆心位置,标定点M之间的距离为标定点M(图12)所在的圆之间的圆心距。
具体地,为了更为准确的计算标定点M之间的距离,可使用多个半径大小相同的圆作为标定图案S3中的标定区域,然后将每个圆的圆心作为标定点M,从而避免了标定点M本身大小对标定点M之间的距离的影响,可更为准确地确定标定点M。在后续计算中,可将两个标定区域之间的圆心距作为两个标定区域的标定点M的距离,从而能够更为准确的确定标定点M之间的距离,提升显示模组10的视场角的计算准确性。
请参阅图16,本申请实施方式的一个或多个包含计算机可执行指令302的非易失性 计算机可读存储介质300,当计算机可执行指令302被一个或多个处理器30执行时,使得处理器30可执行上述任一实施方式的启动方法。
例如,请结合图1至图3,当计算机可执行指令302被一个或多个处理器30执行时,使得处理器30执行以下步骤:
011:获取相机模组20的相机参数,相机参数包括相机模组20的焦距;
012:控制显示模组10显示标定图案,并通过相机模组20拍摄标定图案以获取标定图像;及
013:根据焦距和标定图像中识别到的多个标定点之间的距离计算显示模组10的视场角。
再例如,请结合图2、图8和图9,当计算机可执行指令302被一个或多个处理器30执行时,处理器30还可以执行以下步骤:
0111:控制相机模组20和标定板40对准;及
0112:控制相机模组20拍摄标定板40上的棋盘格图案;及
0113:根据拍摄得到的棋盘格图像计算相机参数。
在本说明书的描述中,参考术语“一个实施方式”、“一些实施方式”、“示意性实施方式”、“示例”、“具体示例”或“一些示例”等的描述意指结合所述实施方式或示例描述的具体特征、结构、材料或者特点包含于本申请的至少一个实施方式或示例中。在本说明书中,对上述术语的示意性表述不一定指的是相同的实施方式或示例。而且,描述的具体特征、结构、材料或者特点可以在任何的一个或多个实施方式或示例中以合适的方式结合。此外,在不相互矛盾的情况下,本领域的技术人员可以将本说明书中描述的不同实施例或示例以及不同实施例或示例的特征进行结合和组合。
流程图中或在此以其他方式描述的任何过程或方法描述可以被理解为,表示包括一个或更多个用于实现特定逻辑功能或过程的步骤的可执行指令的代码的模块、片段或部分,并且本申请的优选实施方式的范围包括另外的实现,其中可以不按所示出或讨论的顺序,包括根据所涉及的功能按基本同时的方式或按相反的顺序,来执行功能,这应被本申请的实施例所属技术领域的技术人员所理解。
尽管上面已经示出和描述了本申请的实施方式,可以理解的是,上述实施方式是示例性的,不能理解为对本申请的限制,本领域的普通技术人员在本申请的范围内可以对上述实施方式进行变化、修改、替换和变型。

Claims (21)

  1. 一种标定方法,其特征在于,所述标定方法包括:
    获取相机模组的相机参数,所述相机参数包括所述相机模组的焦距;
    控制显示模组显示标定图案,并通过所述相机模组拍摄所述标定图案以获取标定图像;及
    根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角。
  2. 根据权利要求1所述的标定方法,其特征在于,所述获取相机模组的相机参数,包括:
    控制所述相机模组和标定板对准;
    控制所述相机模组拍摄标定板上的棋盘格图案;及
    根据拍摄得到的棋盘格图像计算所述相机参数。
  3. 根据权利要求1所述的标定方法,其特征在于,所述显示区域呈矩形,所述标定图案包括多个标志点,所述标定图案中的多个标志点形成第一标志边和第二标志边,所述第一标志边和所述第二标志边垂直,所述第一标志边平行于所述显示模组的显示区域的第一边长方向;所述标定图像中识别到多个标定点形成的第一标定边和第二标定边,所述根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角,包括:
    根据所述焦距、所述第一标定边的长度、和所述第一标志边的长度与所述标定图案的长度的比值计算所述显示模组的水平视场角;及
    根据所述焦距、所述第二标定边的长度、和所述第二标志边的长度与所述标定图案的宽度的比值计算所述显示模组的垂直视场角。
  4. 根据权利要求3所述的标定方法,其特征在于,所述标定图像中的多个标定点包括中心标定点和多个边缘标定点,所述边缘标定点围成第一矩形并位于所述第一矩形的顶点和所述第一矩形的边的中点,所述中心标定点位于所述第一矩形的中心,所述第一矩形的长边为所述第一标定边,所述第一矩形的短边为所述第二标定边,所述标定方法还包括:
    在两个所述第一标定边和所述中心标定点的距离相等时,确定所述相机模组和所述显示模组在所述显示区域的第一边长方向对齐;及
    在两个所述第二标定边和所述中心标定点的距离相等时,确定所述相机模组和所述显示模组在所述显示区域的第二边长方向对齐,所述第二边长方向和所述第一边长方向垂直。
  5. 根据权利要求3所述的标定方法,其特征在于,所述标定图案呈矩形,所述标定图案包括多个标志点,所述多个标志点包括中心标志点和多个边缘标志点,所述边缘标志点围成第二矩形并位于所述第二矩形的顶点和所述第二矩形的边的中点,所述中心标志点位于所述第二矩形的中心,所述第二矩形的长边为所述第一标志边,所述第二矩形的短边为所述第二标志边。
  6. 根据权利要求3或5所述的标定方法,其特征在于,所述第一标志边的长度和所述标定图案的长度的比值和所述第二标志边的长度和所述标定图案的宽度的比值相同。
  7. 根据权利要求1所述的标定方法,其特征在于,所述相机模组和所述显示模组均为两个,两所述相机模组和两所述显示模组一一对应,所述标定方法还包括:
    根据所述相机模组的焦距和对应的所述标定图像中的多个标定点之间的距离计算对应的所述显示模组的视场角。
  8. 根据权利要求1所述的标定方法,其特征在于,所述标定点位于多个半径相同的圆的圆心位置,所述标定点之间的距离为所述标定点所在的圆之间的圆心距。
  9. 根据权利要求2所述的标定方法,其特征在于,所述控制所述相机模组和所述标定板对准,包括:将所述标定板的中心定位在所述相机模组的光轴,并使得所述标定板垂直所述光轴。
  10. 根据权利要求1所述的标定方法,其特征在于,所述获取相机模组的相机参数,包括:
    控制所述相机模组和所述显示模组对准;
    控制所述显示模组显示棋盘格图案;
    控制所述相机模组拍摄所述棋盘格图案以获取棋盘格图像;及
    根据所述棋盘格图像计算所述相机参数。
  11. 一种标定装置,其特征在于,所述标定装置包括显示模组、相机模组和处理器;所述处理器用于获取所述相机模组的相机参数,所述相机参数包括所述相机模组的焦距,控制显示模组显示标定图案,并通过所述相机模组拍摄所述标定图案以获取标定图像,及根据所述焦距和所述标定图像中识别到的多个标定点之间的距离计算所述显示模组的视场角。
  12. 根据权利要求11所述的标定装置,其特征在于,所述标定装置还包括定位装置和标定板,所述定位装置用于控制所述相机模组和所述标定板对准;所述处理器还用于控制所述相机模组拍摄标定板上的棋盘格图案,及根据拍摄得到的棋盘格图像计算所述相机参数。
  13. 根据权利要求11所述的标定装置,其特征在于,所述显示区域呈矩形,所述标定图案包括多个标志点,所述标定图案中的多个标志点形成第一标志边和第二标志边,所述第一标志边和所述第二标志边垂直,所述第一标志边平行于所述显示模组的显示区域的第一边长方向;所述标定图像中识别到多个标定点形成的第一标定边和第二标定边,所述处理器还用于根据所述焦距、所述第一标定边的长度、和所述第一标志边的长度与所述标定 图案的长度的比值计算所述显示模组的水平视场角,及根据所述焦距、所述第二标定边的长度、和所述第二标志边的长度与所述标定图案的宽度的比值计算所述显示模组的垂直视场角。
  14. 根据权利要求13所述的标定装置,其特征在于,所述标定图像中的多个标定点包括中心标定点和多个边缘标定点,所述边缘标定点围成第一矩形并位于所述第一矩形的顶点和所述第一矩形的边的中点,所述中心标定点位于所述第一矩形的中心,所述第一矩形的长边为所述第一标定边,所述第一矩形的短边为所述第二标定边,所述处理器还用于在两个所述第一标定边和所述中心标定点的距离相等时,确定所述相机模组和所述显示模组在所述显示区域的第一边长方向对齐,及在两个所述第二标定边和所述中心标定点的距离相等时,确定所述相机模组和所述显示模组在所述显示区域的第二边长方向对齐,所述第二边长方向和所述第一边长方向垂直。
  15. 根据权利要求13所述的标定装置,其特征在于,所述标定图案呈矩形,所述标定图案包括多个标志点,所述多个标志点包括中心标志点和多个边缘标志点,所述边缘标志点围成第二矩形并位于所述第二矩形的顶点和所述第二矩形的边的中点,所述中心标志点位于所述第二矩形的中心,所述第二矩形的长边为所述第一标志边,所述第二矩形的短边为所述第二标志边。
  16. 根据权利要求13或15所述的标定装置,其特征在于,所述第一标志边的长度和所述标定图案的长度的比值和所述第二标志边的长度和所述标定图案的宽度的比值相同。
  17. 根据权利要求11所述的标定装置,其特征在于,所述相机模组和所述显示模组均为两个,两所述相机模组和两所述显示模组一一对应,所述处理器还用于:
    根据所述相机模组的焦距和对应的所述标定图像中的多个标定点之间的距离计算所述显示模组的视场角。
  18. 根据权利要求10所述的标定装置,其特征在于,述标定点位于多个半径相同的圆的圆心位置,所述标定点之间的距离为所述标定点所在的圆之间的圆心距。
  19. 根据权利要求12所述的标定装置,其特征在于,所述定位装置用于将所述标定板的中心定位在所述相机模组的光轴,并使得所述标定板垂直所述光轴。
  20. 根据权利要求11所述的标定装置,其特征在于,所述处理器还用于:
    控制所述相机模组和所述显示模组对准;
    控制所述显示模组显示棋盘格图案;
    控制所述相机模组拍摄所述棋盘格图案以获取棋盘格图像;及
    根据所述棋盘格图像计算所述相机参数。
  21. 一种包含计算机可执行指令的非易失性计算机可读存储介质,当所述计算机可执行指令被一个或多个处理器执行时,使得所述处理器执行如权利要求1至10中任一项所述的标定方法。
PCT/CN2021/076301 2020-04-23 2021-02-09 标定方法、标定装置和非易失性计算机可读存储介质 WO2021212978A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010325625.0 2020-04-23
CN202010325625.0A CN111402344A (zh) 2020-04-23 2020-04-23 标定方法、标定装置和非易失性计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2021212978A1 true WO2021212978A1 (zh) 2021-10-28

Family

ID=71433323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/076301 WO2021212978A1 (zh) 2020-04-23 2021-02-09 标定方法、标定装置和非易失性计算机可读存储介质

Country Status (3)

Country Link
CN (1) CN111402344A (zh)
TW (1) TWI771961B (zh)
WO (1) WO2021212978A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114363482A (zh) * 2022-03-08 2022-04-15 荣耀终端有限公司 一种标定图像的确定方法及电子设备

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402344A (zh) * 2020-04-23 2020-07-10 Oppo广东移动通信有限公司 标定方法、标定装置和非易失性计算机可读存储介质
WO2022041128A1 (zh) * 2020-08-28 2022-03-03 中国科学院深圳先进技术研究院 用于多个相机的自动标定方法及系统
CN112268538A (zh) * 2020-10-20 2021-01-26 中国人民解放军陆军工程大学 一种适用于小型无人车的几何光学测距方法
CN112561992A (zh) * 2020-12-01 2021-03-26 浙江大华技术股份有限公司 位置的确定方法及装置、存储介质和电子装置
CN112995514B (zh) * 2021-03-03 2023-05-30 上海万物新生环保科技集团有限公司 一种工业相机的拍照物距的获取方法及设备
CN112991459B (zh) * 2021-03-09 2023-12-12 阿波罗智联(北京)科技有限公司 一种相机标定方法、装置、设备以及存储介质
CN113705389A (zh) * 2021-08-13 2021-11-26 北京市商汤科技开发有限公司 人脸识别模组测试方法、装置、存储介质及电子设备

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107037584A (zh) * 2016-02-03 2017-08-11 深圳市易瞳科技有限公司 一种智能眼镜透视方法及系统
CN108765328A (zh) * 2018-05-18 2018-11-06 凌美芯(北京)科技有限责任公司 一种高精度多特征平面模板及其畸变优化和标定方法
CN108848374A (zh) * 2018-08-08 2018-11-20 京东方科技集团股份有限公司 显示参数测量方法及其装置、存储介质和测量系统
CN109767472A (zh) * 2018-12-06 2019-05-17 东南大学 一种用于测量眼戴式显示器fov的方法
CN110006634A (zh) * 2019-04-15 2019-07-12 北京京东方光电科技有限公司 视场角测量方法、视场角测量装置、显示方法和显示设备
CN111402344A (zh) * 2020-04-23 2020-07-10 Oppo广东移动通信有限公司 标定方法、标定装置和非易失性计算机可读存储介质

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB0327339D0 (en) * 2003-11-25 2003-12-31 Fortkey Ltd Inspection apparatus and method
JP4521568B2 (ja) * 2005-06-14 2010-08-11 国立大学法人京都大学 対応点探索方法、相互標定方法、3次元画像計測方法、対応点探索装置、相互標定装置、3次元画像計測装置、対応点探索プログラム及び対応点探索プログラムを記録したコンピュータ読み取り可能な記録媒体
CN103049918A (zh) * 2011-10-17 2013-04-17 天津市亚安科技股份有限公司 一种视频监控中准确计算实际目标大小的方法
CN104835115A (zh) * 2015-05-07 2015-08-12 中国科学院长春光学精密机械与物理研究所 用于航空相机的图像成像方法及系统
CN105716582B (zh) * 2016-02-15 2018-08-28 中林信达(北京)科技信息有限责任公司 摄像机视场角的测量方法、装置以及摄像机视场角测量仪
US11889979B2 (en) * 2016-12-30 2024-02-06 Barco Nv System and method for camera calibration
CN108961195B (zh) * 2018-06-06 2021-03-23 Oppo广东移动通信有限公司 图像处理方法及装置、图像采集装置、可读存储介质和计算机设备
CN109191374B (zh) * 2018-10-10 2020-05-08 京东方科技集团股份有限公司 一种畸变参数测量方法、装置及系统
CN109712193B (zh) * 2018-12-04 2021-06-15 浙江大华技术股份有限公司 一种球机视场角的确定方法及装置
CN110148167B (zh) * 2019-04-17 2021-06-04 维沃移动通信有限公司 一种距离测量方法及终端设备
CN110148186B (zh) * 2019-05-28 2021-01-22 河北农业大学 一种rgb-d相机快速标定方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107037584A (zh) * 2016-02-03 2017-08-11 深圳市易瞳科技有限公司 一种智能眼镜透视方法及系统
CN108765328A (zh) * 2018-05-18 2018-11-06 凌美芯(北京)科技有限责任公司 一种高精度多特征平面模板及其畸变优化和标定方法
CN108848374A (zh) * 2018-08-08 2018-11-20 京东方科技集团股份有限公司 显示参数测量方法及其装置、存储介质和测量系统
CN109767472A (zh) * 2018-12-06 2019-05-17 东南大学 一种用于测量眼戴式显示器fov的方法
CN110006634A (zh) * 2019-04-15 2019-07-12 北京京东方光电科技有限公司 视场角测量方法、视场角测量装置、显示方法和显示设备
CN111402344A (zh) * 2020-04-23 2020-07-10 Oppo广东移动通信有限公司 标定方法、标定装置和非易失性计算机可读存储介质

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114363482A (zh) * 2022-03-08 2022-04-15 荣耀终端有限公司 一种标定图像的确定方法及电子设备

Also Published As

Publication number Publication date
CN111402344A (zh) 2020-07-10
TW202141431A (zh) 2021-11-01
TWI771961B (zh) 2022-07-21

Similar Documents

Publication Publication Date Title
WO2021212978A1 (zh) 标定方法、标定装置和非易失性计算机可读存储介质
ES2370512T3 (es) Método y aparato para ajustar automáticamente la alineación de un proyector con respecto a una pantalla de proyección.
US11575883B2 (en) 3D test chart, adjusting arrangement, forming method and adjusting method thereof
CN107527336B (zh) 镜头相对位置标定方法及装置
KR101910288B1 (ko) 듀얼 카메라모듈의 정렬 방법
US10148944B2 (en) Calibration method of an image capture system
JP2017182038A (ja) 投影システム及び投影画面の修正方法
JP2003307466A (ja) キャリブレーション装置、方法及び結果診断装置、並びにキャリブレーション用チャート
CN104931421B (zh) 液晶面板视觉检测系统的定位对焦方法
TWI752089B (zh) 長度測量裝置
TW202141099A (zh) 相機模組之對位方法
TW202131084A (zh) 判斷相機模組之組裝品質的方法
CN109682312A (zh) 一种基于摄像头测量长度的方法及装置
JP4141874B2 (ja) 焦点距離及び/又は画角算出方法ならびに焦点距離算出用光投射装置
US11546564B2 (en) Projection system and self-adaptive adjustment method thereof
TW201443393A (zh) 拍攝並拼接物件影像的方法
JP3914938B2 (ja) プロジェクタの台形歪み補正装置と該台形歪み補正装置を備えたプロジェクタ
CN110650331A (zh) 阵列摄像模组测试方法及其标板装置
JP2024077029A (ja) 画像投写装置および制御方法
CN115631187A (zh) 一种检测方法、检测装置及检测设备
JP2014175330A (ja) 基板のアライメント方法
CN114549295A (zh) 一种基于幂级数展开法的鱼眼图片矫正方法
CN117061713A (zh) 校正投影机的投影区域的方法和投影系统
CN117593378A (zh) 车载摄像头模组内参标定装置及方法
CN115755538A (zh) 线路板曝光机校准方法、装置、电子设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21792870

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21792870

Country of ref document: EP

Kind code of ref document: A1