WO2021254014A1 - Display calibration method and apparatus, device, and storage medium - Google Patents

Display calibration method and apparatus, device, and storage medium

Info

Publication number: WO2021254014A1
Application number: PCT/CN2021/091942
Authority: WIPO (PCT)
Prior art keywords: calibration, camera, display, coordinate system, image
Other languages: English (en), French (fr)
Inventor: 徐乃江
Applicant: Oppo广东移动通信有限公司

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration

Definitions

  • the present disclosure relates to the field of computer technology, and in particular to a display calibration method and apparatus, a device, and a storage medium.
  • augmented reality devices have gradually come into use, and such devices are usually provided with a follow camera and a display module.
  • the follow camera is used to collect images of the real environment, and the display module is used to display virtual images.
  • in the related art, display calibration is typically performed with SPAAM (Single Point Active Alignment Method), which relies on the user's eye to judge alignment.
  • the purpose of the present disclosure is to provide a display calibration method, apparatus, device, and storage medium, so as to solve, at least to some extent, one or more problems caused by limitations of the related art.
  • a display calibration method including:
  • the calibration mapping relationship includes a conversion relationship between a calibration camera coordinate system and a display module coordinate system
  • the first calibration image group including a plurality of first calibration images
  • the second calibration image group including a plurality of second calibration images corresponding to the first calibration images
  • a conversion relationship between the device camera coordinate system and the display module coordinate system is determined.
  • a display calibration device comprising:
  • the first determining module is configured to determine a calibration mapping relationship, where the calibration mapping relationship includes a conversion relationship between a calibration camera coordinate system and a display module coordinate system;
  • a first acquisition module configured to acquire a first calibration image group of a calibration board by using the calibration camera, the first calibration image group including a plurality of first calibration images;
  • the second acquisition module is configured to acquire a second calibration image group of the calibration board by using a device camera, where the device camera and the display module are provided in an electronic device, and the second calibration image group includes a plurality of second calibration images corresponding to the first calibration images;
  • the second determining module is configured to determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • a display calibration device used for the calibration of a device to be calibrated, the device to be calibrated including a display module and a device camera, and the display calibration device includes:
  • a fixing fixture, which is used to clamp the device to be calibrated;
  • a calibration board, which is arranged on one side of the fixing fixture and located within the shooting range of the device camera;
  • a calibration camera, which is set within the eye movement range of the display module;
  • a control unit, which is connected to the calibration camera and the device camera respectively and is configured to execute the above display calibration method.
  • a computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, implements the method according to any one of the above.
  • FIG. 1 is a schematic structural diagram of a display calibration device provided by an exemplary embodiment of the present disclosure
  • FIG. 2 is a flowchart of the first display calibration method provided by an exemplary embodiment of the present disclosure
  • FIG. 3 is a schematic diagram of a coordinate system conversion provided by an exemplary embodiment of the present disclosure
  • FIG. 4 is a flowchart of a second display calibration method provided by an exemplary embodiment of the present disclosure
  • FIG. 5 is a flowchart of a third display calibration method provided by an exemplary embodiment of the present disclosure.
  • FIG. 6 is a flowchart of a fourth display calibration method provided by an exemplary embodiment of the present disclosure.
  • FIG. 7 is a block diagram of a display calibration device provided by an exemplary embodiment of the present disclosure.
  • FIG. 8 is a block diagram of a controller provided by an exemplary embodiment of the present disclosure.
  • FIG. 9 is a schematic diagram of a computer-readable storage medium provided by an exemplary embodiment of the present disclosure.
  • Exemplary embodiments of the present disclosure first provide a display calibration device, which is used to calibrate the device to be calibrated.
  • the device to be calibrated may be an electronic device such as augmented reality glasses, augmented reality helmets, mixed reality glasses, or mixed reality helmets.
  • the device to be calibrated may include a display module and a device camera (a follow camera); that is, the device camera and the display module are set on the same electronic device, and the light-emitting side of the display module and the lens side of the device camera face in opposite directions.
  • the display module can be a transflective display module, such as an optical waveguide display module.
  • on one hand, light from the real scene can pass through the display module into the user's eyes; on the other hand, the virtual image can be reflected by the display module into the user's eyes.
  • when the augmented reality device displays a virtual image, the location of the virtual image needs to be determined according to the real scene, and the device camera obtains the current real-scene image.
  • the device camera and the display module may also have a split structure, and the embodiments of the present disclosure are not limited to this.
  • the display calibration equipment includes: a fixed clamp 101, a calibration board 102, a calibration camera 103, and a control unit (not shown in the figure).
  • the fixed clamp 101 is used to clamp the device to be calibrated;
  • the calibration board 102 is arranged on one side of the fixing fixture 101, is located within the shooting range of the device camera 112, and is located on the side of the display module 111 away from the calibration camera 103;
  • the calibration camera 103 is set within the eye movement range of the display module;
  • the control unit is connected to the calibration camera 103 and the device camera 112 respectively, and the control unit is used to control the display calibration.
  • the display calibration equipment may further include a camera adjustment device 104 and a calibration board adjustment device 105; the camera adjustment device 104 is connected to the fixing fixture 101 and the calibration camera 103, and is used to adjust the relative position of the calibration camera 103 and the fixing fixture 101.
  • the calibration board adjustment device 105 is connected to the calibration board 102 and the fixing fixture 101, and is used to adjust the relative positional relationship between the calibration board 102 and the fixing fixture 101.
  • an embodiment of the present disclosure provides a display calibration method, and the display calibration method may include the following steps:
  • Step S210: Determine the calibration mapping relationship, where the calibration mapping relationship includes the conversion relationship between the calibration camera coordinate system and the display module coordinate system;
  • Step S220: Use the calibration camera to acquire a first calibration image group of the calibration board, the first calibration image group including a plurality of first calibration images;
  • Step S230: Use the device camera to acquire a second calibration image group of the calibration board, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images;
  • Step S240: Determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • the conversion relationship between the device camera coordinate system and the display module coordinate system is determined through the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • the first calibration image group is acquired by the calibration camera, which avoids relying on the human eye to judge the alignment of the cross cursor during calibration, increases the accuracy of the display module calibration, and can thereby increase the display rendering accuracy of the device to be calibrated.
  • in addition, the user does not need to wear the device to be calibrated, which saves manpower.
  • the calibration principle is as follows.
  • the defined coordinate system W represents the world coordinate system
  • the coordinate system C represents the device camera coordinate system
  • the coordinate system S represents the display module coordinate system.
  • the coordinate system V represents the coordinate system of a virtual camera of a pinhole model composed of the human eye and the imaging plane of the display module (that is, the calibration camera coordinate system), and the imaging plane of the calibration camera is P.
  • the matrix H is the homography matrix between the calibration camera imaging plane P and the display module coordinate system S;
  • G is the conversion matrix for coordinate conversion between the device camera coordinate system C and the display module coordinate system S;
  • F is the transformation matrix between the device camera coordinate system C and the world coordinate system W.
  • the purpose of the calibration in the display calibration method provided by the embodiments of the present disclosure is to obtain the mapping relationship between the device camera coordinate system and the display module coordinate system. Assume an arbitrary point P in space whose coordinate in the world coordinate system W is Pw and whose coordinate in the device camera coordinate system C is Pc; when the point P is viewed through the display area of the display module, its coordinate in the display module coordinate system is Ps. From the definitions above, the following two equations are obtained: Pc = F·Pw and Ps = G·Pc.
  • Pw and Pc are homogeneous coordinates of a three-dimensional position (a homogeneous coordinate represents an originally n-dimensional vector with an (n+1)-dimensional vector, and is the coordinate representation used in projective geometry), so Pw and Pc are both four-dimensional vectors.
  • Ps is the homogeneous coordinate of a two-dimensional position, so it is a three-dimensional vector.
  • each occurrence of R is a 3x3 rotation matrix, and each T is a 3x1 translation vector (a column vector containing 3 elements).
  • K represents the intrinsic parameter matrix of the virtual camera formed by the human eye, and K is a 3x3 matrix.
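As a minimal illustration of the homogeneous coordinates described above (the function names and values here are illustrative, not taken from the patent):

```python
import numpy as np

def to_homogeneous(p):
    """Append 1 to an n-dimensional point, giving its (n+1)-dimensional homogeneous form."""
    return np.append(np.asarray(p, dtype=float), 1.0)

def from_homogeneous(ph):
    """Divide by the last component to recover the n-dimensional point."""
    ph = np.asarray(ph, dtype=float)
    return ph[:-1] / ph[-1]

# A 3D point such as Pw or Pc becomes a 4-vector; a 2D point such as Ps is a 3-vector.
Pw = to_homogeneous([0.1, -0.2, 1.5])        # 4-dimensional
Ps = from_homogeneous([640.0, 480.0, 2.0])   # back to 2D: (320, 240)
```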
  • G represents the mapping relationship between the device camera coordinate system and the display module coordinate system.
  • through G, coordinates can be converted from the tracking camera coordinate system to the display module coordinate system, so that virtual information annotations for real objects can be made in the display area of the display module.
  • G can also be decomposed into an intrinsic parameter matrix, a rotation matrix, and a translation matrix (the translation matrix being the translation vector regarded as an n×1 matrix).
  • based on the spatial position relationship, three-dimensional virtual objects can be superimposed, so that virtual pictures (such as the effects of characters, buildings, etc.) can be superimposed on the real environment.
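To make the decomposition of G concrete, the following is a small sketch (with made-up intrinsic values, not values from the patent) of mapping a 3D point in the device camera coordinate system C to 2D display coordinates via G = K[R|T]:

```python
import numpy as np

def project_to_display(K, R, T, Pc):
    """Map a 3D point Pc (device-camera coordinates) to 2D display
    coordinates using G = K [R|T], then dehomogenize."""
    G = K @ np.hstack([R, T.reshape(3, 1)])   # 3x4 conversion matrix
    ps = G @ np.append(Pc, 1.0)               # homogeneous 2D point
    return ps[:2] / ps[2]

# Illustrative intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)     # assume no rotation between C and S for this sketch
T = np.zeros(3)   # assume no translation
# A point on the optical axis lands at the principal point (320, 240).
center = project_to_display(K, R, T, np.array([0.0, 0.0, 2.0]))
```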
  • the calibration mapping relationship may be determined, and the calibration mapping relationship includes the mapping relationship between the calibration camera coordinate system and the display module coordinate system.
  • determining the calibration mapping relationship in step S210 may include the following steps:
  • Step S410: Control the display module to display the calibration pattern;
  • Step S420: Use the calibration camera to obtain a third calibration image, where the third calibration image is an image of the calibration pattern;
  • Step S430: Determine the coordinates of the calibration feature points according to the third calibration image;
  • Step S440: Determine the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the corresponding points in the calibration pattern displayed by the display module.
  • in step S410, the display module can be controlled to display the calibration pattern.
  • the calibration pattern may be a checkerboard pattern.
  • the display module is driven to display the checkerboard calibration pattern through the processor of the equipment to be calibrated and the display driving circuit.
  • the calibration pattern may also be another pattern (for example, a dot calibration pattern or an AprilTag calibration pattern), and the embodiments of the present disclosure are not limited thereto.
  • when displaying the calibration pattern, the display module can display it at different angles in a preset order; for example, the display module can display the calibration pattern at three different angles.
  • a calibration camera may be used to obtain a third calibration image, and the third calibration image is an image of a calibration pattern.
  • the third calibration image is the image of the calibration pattern displayed by the display module taken by the calibration camera.
  • the calibration camera captures the screen displayed by the display module.
  • the third calibration image is a photo of the checkerboard pattern.
  • the third calibration image may include multiple pictures, and the calibration of the calibration camera and the display module can be achieved through the multiple pictures.
  • in step S430, the coordinates of the calibration feature points can be determined according to the third calibration image.
  • the coordinates of the calibration feature point are the coordinates in the calibration camera coordinate system.
  • the calibration feature point may be a feature point in the third calibration image.
  • the calibration pattern is a checkerboard pattern
  • the calibration feature point may be the intersection of the checkerboard.
  • the coordinates of the calibration feature points in the calibration camera coordinate system can be determined through the images of the multiple calibration patterns acquired in step S420.
  • for example, the coordinates of the calibration feature points in the calibration camera coordinate system can be obtained by Zhang Zhengyou's calibration method, the least squares method, or the like.
  • in step S440, the calibration mapping relationship is determined according to the coordinates of the calibration feature points and the coordinates of the corresponding points in the calibration pattern displayed by the display module.
  • the coordinates of the calibration feature points are obtained in step S430, and the coordinates of the corresponding points in the displayed calibration pattern are already determined when the image is rendered; for example, the coordinates of the chessboard corner points are known at rendering time. From the coordinates of the same points in the two different coordinate systems, the conversion relationship between the display module coordinate system and the calibration camera coordinate system is obtained.
  • for example, the fill light of the calibration board or the backlight source can be turned off, and a virtual calibration pattern can be rendered in the display module.
  • a calibration camera is used to shoot the virtual calibration pattern in the display module to obtain a photographed image V of the virtual calibration pattern, the checkerboard corner point is detected in the photographed image V, and the checkerboard corner point coordinates V_Points are obtained.
  • the corner coordinates of the virtual calibration pattern rendered in the display module are known D_Points.
  • the two sets of corner points V_Points and D_Points are matched to obtain one-to-one corresponding point pairs, and the homography matrix H (the conversion relationship between the display module coordinate system and the calibration camera coordinate system) is obtained by solving.
  • the coordinate conversion relationship between the display module and the calibration camera can also be solved by least-squares fitting of a two-dimensional polynomial, and the embodiments of the present disclosure are not limited to this.
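The homography solving step described above can be sketched with a standard direct linear transform (DLT); this is a generic illustration of solving H from matched point pairs, not necessarily the exact solver used in the patent:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src (homogeneous
    coordinates) from >= 4 corresponding point pairs, via SVD (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        A.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)   # null-space vector = flattened H
    return H / H[2, 2]         # fix the scale ambiguity

# Corner pairs related by a pure translation (+5, -3): DLT should recover it.
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 3.0)]
dst = [(x + 5.0, y - 3.0) for x, y in src]
H = estimate_homography(src, dst)
```

In practice the matched V_Points and D_Points corner sets would play the roles of `src` and `dst` here.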
  • the calibration camera may be used to obtain a first calibration image group of the calibration board, and the first calibration image group includes a plurality of first calibration images.
  • the display module can transmit light, and the light reflected by the calibration board can enter the calibration camera through the display module.
  • the calibration camera can be used to separately acquire the first calibration images of the calibration board at a plurality of different preset angles.
  • for example, the calibration camera takes photos of the calibration board (that is, the first calibration images) when the calibration board is at a first preset angle, a second preset angle, and a third preset angle, where the first, second, and third preset angles are all different.
  • the calibration board may also be placed at other angles, and the number of images in the first calibration image group may also be other numbers, which is not specifically limited in the embodiments of the present disclosure.
  • the device camera may be used to obtain a second calibration image group of the calibration board, where the device camera and the display module are set in the same electronic device, and the second calibration image group includes a plurality of second calibration images corresponding to the first calibration images.
  • the device camera and the display module are provided in the same electronic device.
  • the device camera may be a follower camera of the augmented reality device.
  • the lens of the device camera faces the side away from the user, and the light-emitting side of the display module faces the side close to the user.
  • the device camera may be used to separately acquire a plurality of second calibration images of the calibration board corresponding to the first calibration images.
  • the second calibration image acquired by the device camera corresponds to the first calibration image acquired by the calibration camera, and the corresponding first calibration image and second calibration image are images of the calibration board at the same time.
  • for example, the device camera takes photos of the calibration board (that is, the second calibration images) when the calibration board is at the first preset angle, the second preset angle, and the third preset angle, where the first, second, and third preset angles are all different.
  • for example, it is possible to obtain images of the calibration board at 0 degrees, -20 degrees, and +20 degrees.
  • at 0 degrees, the calibration board and the lens plane of the calibration camera are parallel.
  • the calibration board may also be placed at other angles, and the number of images in the second calibration image group may also be other numbers, which is not specifically limited in the embodiments of the present disclosure.
  • for example, the calibration board is set to 20°, and the calibration camera and the device camera shoot at the same time to obtain images W_20 and T_20;
  • the calibration board is then rotated (counterclockwise) to -20°, and the calibration camera and the device camera shoot at the same time again to obtain images W_-20 and T_-20.
  • in step S240, the conversion relationship between the device camera coordinate system and the display module coordinate system may be determined according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • step S240 can be implemented through the following steps:
  • Step S510: Convert the first calibration images into display calibration images according to the calibration mapping relationship;
  • Step S520: Determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display calibration images and the second calibration images.
  • in step S510, the first calibration image is converted into a display calibration image according to the calibration mapping relationship.
  • the first calibration image is the calibration board image in the calibration camera coordinate system
  • the display calibration image is the calibration board image in the display module coordinate system.
  • the conversion of the first calibration image into the display calibration image can be achieved in the following ways:
  • obtain the coordinates of the first feature points, which are the feature points in the first calibration image; then use the calibration mapping relationship to convert the coordinates of the first feature points into display feature point coordinates in the display module coordinate system.
  • the first feature point may be multiple feature points on the first calibration image, and it does not mean that the coordinates of only one feature point are acquired here.
  • the calibration board is a chessboard calibration board
  • the first feature point is the corner point of the chessboard on the first calibration image.
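The coordinate conversion in this step amounts to applying the homography H of the calibration mapping relationship to each detected corner point; a minimal generic sketch (the H values are illustrative, not from the patent):

```python
import numpy as np

def map_points(H, pts):
    """Apply a 3x3 homography to an (N, 2) array of pixel coordinates:
    lift to homogeneous form, multiply, then divide by the last component."""
    pts = np.asarray(pts, dtype=float)
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # (N, 3) homogeneous
    mapped = pts_h @ H.T                              # (N, 3)
    return mapped[:, :2] / mapped[:, 2:3]

# Illustrative calibration mapping: scale by 2 and shift by (10, 20).
H = np.array([[2.0, 0.0, 10.0],
              [0.0, 2.0, 20.0],
              [0.0, 0.0, 1.0]])
corners = [(0.0, 0.0), (100.0, 50.0)]       # detected first feature points
display_pts = map_points(H, corners)        # display feature point coordinates
```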
  • in step S520, the conversion matrix between the device camera coordinate system and the display module coordinate system is determined according to the display calibration image and the second calibration image.
  • the calibration of the display module and the device camera can be converted into binocular camera calibration.
  • obtain the coordinates of the second feature points, which are the feature points in the second calibration image corresponding to the first feature points; then determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display feature point coordinates and the second feature point coordinates.
  • the second feature point may be multiple feature points on the second calibration image, and it does not mean that the coordinates of only one feature point are acquired here.
  • the first feature point is the corner point of the chessboard on the first calibration image
  • the second feature point is the corner point of the chessboard on the second calibration image.
  • for example, feature point detection is performed on the images W_0, T_0, W_20, T_20, W_-20, and T_-20 obtained above, yielding the feature point sets W_0_Points, T_0_Points, W_20_Points, T_20_Points, W_-20_Points, and T_-20_Points.
  • step S240 can be implemented through the following steps:
  • Step S610: Determine a camera conversion relationship according to the first calibration image and the second calibration image, where the camera conversion relationship includes the conversion relationship between the device camera coordinate system and the calibration camera coordinate system;
  • Step S620: Determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
  • the camera conversion relationship may be determined according to the first calibration image and the second calibration image, and the camera conversion relationship includes the conversion relationship between the coordinate system of the device camera and the coordinate system of the calibration camera.
  • determining the camera conversion relationship according to the first calibration image and the second calibration image can be achieved in the following manner:
  • based on the corresponding feature points in the first calibration images and the second calibration images, the conversion matrix between the device camera coordinate system and the calibration camera coordinate system is determined.
  • multiple first calibration images and multiple second calibration images are acquired.
  • the camera conversion relationship can be obtained from the coordinates of the first feature points in the different first calibration images and the coordinates of the second feature points in the corresponding second calibration images, for example through a binocular (stereo) camera calibration algorithm.
  • the first feature point may be multiple feature points on the first calibration image, and it does not mean that the coordinates of only one feature point are acquired here.
  • the second feature point may be multiple feature points on the second calibration image, and it does not mean that the coordinates of only one feature point are acquired here.
  • the first feature point is the corner point of the chessboard on the first calibration image
  • the second feature point is the corner point of the chessboard on the second calibration image.
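The core of the binocular-style calibration mentioned above is recovering the relative pose between the two cameras. Assuming each camera's pose with respect to the calibration board is already known (e.g. from Zhang's method), the relative extrinsics follow by composition; this is a generic sketch with illustrative poses, not the patent's exact procedure:

```python
import numpy as np

def relative_extrinsics(R1, t1, R2, t2):
    """Given board-to-camera poses (X_ci = Ri @ X + ti) for cameras 1 and 2,
    return (R, t) mapping camera-1 coordinates to camera-2 coordinates."""
    R = R2 @ R1.T
    t = t2 - R @ t1
    return R, t

# Illustrative poses: camera 1 coincides with the board frame,
# camera 2 is shifted by 0.1 m along x.
R1, t1 = np.eye(3), np.zeros(3)
R2, t2 = np.eye(3), np.array([0.1, 0.0, 0.0])
R, t = relative_extrinsics(R1, t1, R2, t2)   # expect R = I, t = (0.1, 0, 0)
```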
  • in step S620, the conversion matrix between the device camera coordinate system and the display module coordinate system may be determined according to the calibration mapping relationship and the camera conversion relationship.
  • for example, feature point detection is performed on the images W_0, T_0, W_20, T_20, W_-20, and T_-20 to obtain the feature point sets W_0_Points, T_0_Points, W_20_Points, T_20_Points, W_-20_Points, and T_-20_Points.
  • the method of binocular camera calibration is then used to solve for the intrinsic parameters of the calibration camera and the extrinsic parameters between the calibration camera and the device camera.
  • according to the calibration mapping relationship (the H matrix) obtained in step S210, the intrinsic parameters of the calibration camera, and the extrinsic parameters between the calibration camera and the device camera, the conversion matrix between the device camera coordinate system and the display module coordinate system is obtained, completing the entire calibration process.
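As a simplified illustration of the final composition step (the patent's actual conversion also involves the H matrix and the camera intrinsics), rigid transforms expressed as 4x4 homogeneous matrices chain by matrix multiplication, so a device-camera-to-display transform can be obtained from the two intermediate transforms; all values below are illustrative:

```python
import numpy as np

def rigid(R, t):
    """Pack rotation R (3x3) and translation t (3,) into a 4x4 homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Illustrative values: the calibration camera sits 0.02 m from the display frame,
# and the device camera sits 0.05 m from the calibration camera.
T_display_from_calib = rigid(np.eye(3), np.array([0.02, 0.0, 0.0]))
T_calib_from_device = rigid(np.eye(3), np.array([0.05, 0.0, 0.0]))

# Chaining the two gives device-camera -> display-module coordinates directly;
# the net translation here is 0.07 m along x.
T_display_from_device = T_display_from_calib @ T_calib_from_device
```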
  • the conversion relationship between the device camera coordinate system and the display module coordinate system is determined through the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • the first calibration image group is acquired by the calibration camera, which avoids relying on the human eye to judge the alignment of the cross cursor during calibration, increases the accuracy of the display module calibration, and can thereby increase the display rendering accuracy of the device to be calibrated.
  • in addition, the user does not need to wear the device to be calibrated, which saves manpower.
  • Exemplary embodiments of the present disclosure also provide a display calibration device. As shown in FIG. 7, the display calibration device includes:
  • the first determining module 710 is configured to determine a calibration mapping relationship, where the calibration mapping relationship includes a mapping relationship between a calibration camera coordinate system and a display module coordinate system;
  • the first acquisition module 720 is configured to acquire a first calibration image group of the calibration board by using a calibration camera, and the first calibration image group includes a plurality of first calibration images;
  • the second acquisition module 730 is configured to acquire a second calibration image group of the calibration board by using the device camera, the second calibration image group including a plurality of second calibration images corresponding to the first calibration image;
  • the second determining module 740 is configured to determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • the display calibration device determines the conversion relationship between the device camera coordinate system and the display module coordinate system through the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  • the first calibration image group is acquired by the calibration camera, which avoids relying on the human eye to judge the alignment of the cross cursor during calibration, increases the accuracy of the display module calibration, and can thereby increase the display rendering accuracy of the device to be calibrated.
  • in addition, the user does not need to wear the device to be calibrated, which saves manpower.
  • the first determining module may include:
  • the first control unit is used to control the display module to display the calibration pattern
  • the first acquisition unit is configured to acquire a third calibration image by using the calibration camera, the third calibration image being an image of the calibration pattern;
  • the first determining unit is configured to determine the coordinates of the calibration feature points according to the third calibration image
  • the second determining unit is configured to determine the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the corresponding points in the calibration pattern displayed by the display module.
  • the first obtaining module may include:
  • the second acquisition unit is configured to use the calibration camera to acquire the first calibration images of the calibration board at a first preset angle, a second preset angle, and a third preset angle, where the first, second, and third preset angles are all different.
  • the second acquisition module may include:
  • the third acquiring unit is used to acquire the second calibration images of the calibration board at the first preset angle, the second preset angle, and the third preset angle by using the device camera.
  • the second determining module may include:
  • the first conversion unit is configured to convert the first calibration image into a display calibration image according to the calibration mapping relationship
  • the third determining unit is used to determine the conversion matrix between the coordinate system of the device camera and the coordinate system of the display module according to the display calibration image and the second calibration image.
  • the first conversion unit may include:
  • the first obtaining subunit is used to obtain the coordinates of the first feature point, the first feature point being the feature point in the first calibration image;
  • the first conversion subunit is used for converting the coordinates of the first feature point into the coordinates of the display feature point in the coordinate system of the display module by using the calibration mapping relationship.
  • the third determining unit may include:
  • the second acquiring subunit is used to acquire the coordinates of the second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
  • the first determining subunit is used to determine the conversion matrix between the coordinate system of the device camera and the coordinate system of the display module according to the coordinate of the display characteristic point and the coordinate of the second characteristic point.
  • the second determining module may include:
  • the second conversion unit is configured to determine a camera conversion relationship according to the first calibration image and the second calibration image, and the camera conversion relationship includes the conversion relationship between the coordinate system of the device camera and the coordinate system of the calibration camera;
  • the fourth determining unit is used to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
  • the second conversion unit may include:
  • the third acquisition subunit is used to acquire the coordinates of the first feature point, where the first feature point is the feature point in the first calibration image;
  • the fourth acquiring subunit is used to acquire the coordinates of the second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
  • the fifth determining subunit is used to determine the conversion matrix between the coordinate system of the device camera and the coordinate system of the calibration camera according to the coordinates of the first feature point and the coordinates of the second feature point.
  • although several modules or units of the display calibration device are mentioned in the above detailed description, this division is not mandatory.
  • the features and functions of two or more modules or units described above may be embodied in one module or unit.
  • the features and functions of a module or unit described above can be further divided into multiple modules or units to be embodied.
  • Those skilled in the art can understand that various aspects of the present disclosure can be implemented as a system, a method, or a program product.
  • the display calibration device may further include a controller 800, which is respectively connected to the calibration camera 103 and the device camera 112, and the controller 800 is configured to perform the above The display calibration method.
  • the display calibration device may be provided with a data interface, and the device to be calibrated is connected to the display calibration device through the data interface.
  • the controller may include at least one processing unit 810, at least one storage unit 820, a bus 830 connecting different system components (including the storage unit 820 and the processing unit 810), a display unit 840, and the like.
  • the storage unit stores program code, and the program code can be executed by the processing unit 810, so that the processing unit 810 executes the steps according to the various exemplary embodiments described in the "Exemplary Methods" section of this specification.
  • the storage unit 820 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM) 8201 and/or a cache storage unit 8202, and may further include a read-only storage unit (ROM) 8203.
  • the storage unit 820 may also include a program/utility tool 8204 having a set of (at least one) program module 8205.
  • program module 8205 includes, but is not limited to: an operating system, one or more application programs, other program modules, and program data; each of these examples, or some combination thereof, may include an implementation of a network environment.
  • the bus 830 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a variety of bus structures.
  • the display calibration device can also communicate with one or more external devices 870 (such as keyboards, pointing devices, Bluetooth devices, etc.), with one or more devices that enable users to interact with the display calibration device, and/or with any device (such as a router or modem) that enables the display calibration device to communicate with one or more other computing devices. This communication can be performed through an input/output (I/O) interface 850.
  • the display calibration device can also communicate with one or more networks (for example, a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 860. As shown in the figure, the network adapter 860 communicates with the other modules of the display calibration device through the bus 830.
  • the exemplary embodiments described here can be implemented by software, or by software combined with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (such as a CD-ROM, USB flash drive, or removable hard disk) or on a network, and includes several instructions to make a computing device (which may be a personal computer, a server, a terminal device, a network device, etc.) execute the method according to the embodiments of the present disclosure.
  • a computer-readable storage medium is also provided, on which a program product capable of implementing the above-mentioned method of this specification is stored.
  • various aspects of the present invention may also be implemented in the form of a program product, which includes program code; when the program product runs on a terminal device, the program code causes the terminal device to execute the steps according to the various exemplary embodiments of the present invention described in the "Exemplary Methods" section of this specification.
  • a program product 900 for implementing the above method according to an embodiment of the present invention is described; it may take the form of a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer.
  • the program product of the present invention is not limited to this.
  • the readable storage medium can be any tangible medium that contains or stores a program, and the program can be used by or in combination with an instruction execution system, device, or device.
  • the program product can use any combination of one or more readable media.
  • the readable medium may be a readable signal medium or a readable storage medium.
  • the readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
  • the computer-readable signal medium may include a data signal propagated in baseband or as a part of a carrier wave, and readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the readable signal medium may also be any readable medium other than a readable storage medium, and the readable medium may send, propagate, or transmit a program for use by or in combination with the instruction execution system, apparatus, or device.
  • the program code contained on the readable medium can be transmitted by any suitable medium, including but not limited to wireless, wired, optical cable, RF, etc., or any suitable combination of the foregoing.
  • the program code used to perform the operations of the present invention can be written in any combination of one or more programming languages.
  • the programming languages include object-oriented programming languages, such as Java and C++, as well as conventional procedural programming languages, such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computing device, partly on the user's device, as an independent software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server.
  • the remote computing device can be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or can be connected to an external computing device (for example, through the Internet using an Internet service provider).

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present disclosure relates to a display calibration method and apparatus, a device, and a storage medium. The display calibration method includes: determining a calibration mapping relationship, the calibration mapping relationship including a conversion relationship between a calibration camera coordinate system and a display module coordinate system; acquiring a first calibration image group of a calibration board with the calibration camera, the first calibration image group including a plurality of first calibration images; acquiring a second calibration image group of the calibration board with a device camera, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images; and determining the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship. This improves the accuracy of display module calibration.

Description

Display calibration method and apparatus, device, and storage medium
Cross-Reference
This disclosure claims priority to Chinese patent application No. 202010554384.7, filed on June 17, 2020 and entitled "Display calibration method and apparatus, device, and storage medium", the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of computer technology, and in particular to a display calibration method and apparatus, a device, and a storage medium.
Background
With the development and progress of technology, augmented reality devices have gradually come into use. An augmented reality device is usually provided with a follow camera and a display module. The follow camera captures images of the real environment, and the display module displays virtual images; to match the virtual images with the real images, the follow camera and the display module generally need to be calibrated.
At present, the calibration method commonly used for augmented reality devices is the Single Point Active Alignment Method (SPAAM). This method requires the user to wear the augmented reality device and align a cross cursor on the display module with an object in the real world multiple times, each alignment being completed by turning the head. Because the alignment of the cross cursor must be judged by the user's naked eye, the calibration accuracy is low.
It should be noted that the information disclosed in the background section above is only intended to enhance the understanding of the background of the present disclosure, and therefore may include information that does not constitute prior art known to those of ordinary skill in the art.
Summary
The purpose of the present disclosure is to provide a display calibration method and apparatus, a device, and a storage medium, so as to solve, at least to some extent, one or more problems caused by the limitations of the related art.
According to a first aspect of the present disclosure, a display calibration method is provided, the display calibration method including:
determining a calibration mapping relationship, the calibration mapping relationship including a conversion relationship between a calibration camera coordinate system and a display module coordinate system;
acquiring a first calibration image group of a calibration board with the calibration camera, the first calibration image group including a plurality of first calibration images;
acquiring a second calibration image group of the calibration board with a device camera, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images;
determining the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
According to a second aspect of the present disclosure, a display calibration apparatus is provided, the display calibration apparatus including:
a first determining module, configured to determine a calibration mapping relationship, the calibration mapping relationship including a conversion relationship between a calibration camera coordinate system and a display module coordinate system;
a first acquisition module, configured to acquire a first calibration image group of a calibration board with the calibration camera, the first calibration image group including a plurality of first calibration images;
a second acquisition module, configured to acquire a second calibration image group of the calibration board with a device camera, the device camera and the display module being provided on one electronic device, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images;
a second determining module, configured to determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
According to a third aspect of the present disclosure, a display calibration device is provided for the calibration of a device to be calibrated, the device to be calibrated including a display module and a device camera, the display calibration device including:
a fixing fixture, the fixing fixture being used to clamp the device to be calibrated;
a calibration board, the calibration board being arranged on one side of the fixing fixture and located within the shooting range of the device camera;
a calibration camera, the calibration camera being arranged within the eye-movement range of the display module;
a control unit, the control unit being connected to the calibration camera and the device camera respectively and configured to perform the display calibration method described above.
According to a fourth aspect of the present disclosure, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the method according to any one of the above.
It should be understood that the above general description and the following detailed description are exemplary and explanatory only, and do not limit the present application.
Brief Description of the Drawings
The accompanying drawings herein are incorporated into and constitute a part of this specification, illustrate embodiments consistent with the present disclosure, and together with the specification serve to explain the principles of the present disclosure. Obviously, the drawings in the following description are only some embodiments of the present disclosure; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic structural diagram of a display calibration device provided by an exemplary embodiment of the present disclosure;
FIG. 2 is a flowchart of a first display calibration method provided by an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic diagram of coordinate system conversion provided by an exemplary embodiment of the present disclosure;
FIG. 4 is a flowchart of a second display calibration method provided by an exemplary embodiment of the present disclosure;
FIG. 5 is a flowchart of a third display calibration method provided by an exemplary embodiment of the present disclosure;
FIG. 6 is a flowchart of a fourth display calibration method provided by an exemplary embodiment of the present disclosure;
FIG. 7 is a block diagram of a display calibration apparatus provided by an exemplary embodiment of the present disclosure;
FIG. 8 is a block diagram of a controller provided by an exemplary embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a computer-readable storage medium provided by an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be implemented in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concepts of the example embodiments to those skilled in the art. The same reference numerals in the figures denote the same or similar parts, and their repeated description will therefore be omitted.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of the embodiments of the present disclosure. However, those skilled in the art will recognize that the technical solutions of the present disclosure may be practiced without one or more of the specific details, or that other methods, components, materials, apparatuses, steps, and the like may be employed. In other cases, well-known structures, methods, apparatuses, implementations, materials, or operations are not shown or described in detail to avoid obscuring aspects of the present disclosure.
The block diagrams shown in the drawings are merely functional entities and do not necessarily correspond to physically separate entities. That is, these functional entities, or parts of them, may be implemented in software, in one or more software-hardened modules, or in different network and/or processor apparatuses and/or microcontroller apparatuses.
An exemplary embodiment of the present disclosure first provides a display calibration device for calibrating a device to be calibrated. The device to be calibrated may be an electronic device such as augmented reality glasses, an augmented reality helmet, mixed reality glasses, or a mixed reality helmet. The device to be calibrated may include a display module and a device camera (follow camera); that is, the device camera and the display module are provided on the same electronic device, and the light-emitting side of the display module and the lens side of the device camera face opposite directions. The display module may be a transflective display module, such as an optical waveguide display module: on the one hand, light from the real scene can pass through the display module into the user's eyes; on the other hand, a virtual picture can enter the human eye through reflection by the display module. When the augmented reality device displays a virtual picture, the position of the virtual image needs to be determined according to the real scene, and the device camera captures the current real-scene image. Of course, in practical applications the device camera and the display module may also be a split-type structure; the embodiments of the present disclosure are not limited thereto.
As shown in FIG. 1, the display calibration device includes: a fixing fixture 101, a calibration board 102, a calibration camera 103, and a control unit (not shown in the figure). The fixing fixture 101 is used to clamp the device to be calibrated; the calibration board 102 is arranged on one side of the fixing fixture, located within the shooting range of the device camera 112, and on the side of the display module 111 away from the calibration camera 103; the calibration camera 103 is arranged within the eye-movement range of the display module; the control unit is connected to the calibration camera 103 and the device camera 112 respectively and is used to control the display calibration.
Further, the display calibration device provided by the embodiment of the present disclosure may also include: a camera adjustment apparatus 104 and a calibration board adjustment apparatus 105. The camera adjustment apparatus 104 is connected to the fixing fixture 101 and the calibration camera 103 and is used to adjust the relative position of the calibration camera 103 and the fixing fixture 101; the calibration board adjustment apparatus 105 is connected to the calibration board 102 and the fixing fixture 101 and is used to adjust the relative position of the calibration board 102 and the fixing fixture 101.
As shown in FIG. 2, an embodiment of the present disclosure provides a display calibration method, which may include the following steps:
Step S210: determining a calibration mapping relationship, the calibration mapping relationship including the mapping relationship between the calibration camera coordinate system and the display module coordinate system;
Step S220: acquiring a first calibration image group of the calibration board with the calibration camera, the first calibration image group including a plurality of first calibration images;
Step S230: acquiring a second calibration image group of the calibration board with the device camera, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images;
Step S240: determining the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
The display calibration method provided by the embodiment of the present disclosure determines the conversion relationship between the device camera coordinate system and the display module coordinate system through the first calibration image group, the second calibration image group, and the calibration mapping relationship, thereby achieving display calibration. Since the first calibration image group is acquired by the calibration camera during the calibration process, judging the alignment of a cross cursor by human eye is avoided, which improves the accuracy of the display module calibration and can in turn improve the display rendering accuracy of the device to be calibrated. In addition, the user does not need to wear the device to be calibrated during the calibration process, which saves manpower.
In the display calibration method provided by the embodiment of the present disclosure, the calibration principle is as follows. As shown in FIG. 3, coordinate system W denotes the world coordinate system, coordinate system C denotes the device camera coordinate system, coordinate system S denotes the display module coordinate system, and coordinate system V denotes the coordinate system of a virtual pinhole camera formed by the human eye and the imaging plane of the display module (that is, the calibration camera coordinate system); the imaging plane of the calibration camera is P. Matrix H is the homography matrix between the calibration camera imaging plane P and the display module coordinate system S, G is the conversion matrix for coordinate conversion between the device camera coordinate system C and the display module coordinate system S, and F is the conversion matrix between the device camera coordinate system C and the world coordinate system W.
The purpose of the calibration in the display calibration method provided by the embodiment of the present disclosure is to obtain the mapping relationship between the device camera coordinate system and the display module coordinate system. Suppose an arbitrary point P in space has coordinates Pw in the world coordinate system W and coordinates Pc in the device camera coordinate system C; when point P is viewed through the display area of the display module, its coordinates in the display module coordinate system are Ps. The following two equations are then obtained:
Pc = [Rwc | Twc; 0 0 0 1] Pw;
Ps = K [Rcv | Tcv] Pc = G Pc;
where Pw and Pc are homogeneous coordinates of three-dimensional positions (a homogeneous coordinate represents an originally n-dimensional vector by an (n+1)-dimensional vector, and is the coordinate system used in projective geometry), so Pw and Pc are both four-dimensional vectors. Ps is the homogeneous coordinate of a two-dimensional position, and is therefore a three-dimensional vector. Every R that appears is a 3x3 rotation matrix, and every T is a 3x1 translation vector, a column vector containing three elements. K is the intrinsic parameter matrix of the virtual camera containing the human eye, and is a 3x3 matrix.
Let Ps = [u v 1]^T, Pc = [xc yc zc 1]^T, and G = [g11 g12 g13 g14; g21 g22 g23 g24; g31 g32 g33 g34]. The following expression is then obtained:
[u; v; 1] = [g11 g12 g13 g14; g21 g22 g23 g24; g31 g32 g33 g34] [xc; yc; zc; 1]
What needs to be calibrated at this point is G, which represents the mapping relationship between the device camera coordinate system and the display module coordinate system. With G, coordinate conversion from the follow camera coordinate system to the display module coordinate system is achieved, so that virtual information annotations for real objects can be made in the display area of the display module. G can also be decomposed into an intrinsic parameter matrix together with a rotation matrix and a translation matrix (the translation matrix being the translation vector regarded as an nx1 matrix). On this basis, not only can text annotation be performed; three-dimensional virtual objects can also be superimposed according to the spatial positional relationship between the virtual camera and the device camera, so that virtual pictures (for example, the effects of objects such as people and buildings) can be superimposed on the real environment.
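To make the role of G concrete, the projection Ps = G Pc above can be sketched in a few lines of numpy. The matrix entries below are illustrative placeholders chosen for the example, not values from this disclosure.

```python
import numpy as np

# G maps homogeneous device-camera coordinates (a 4-vector) to homogeneous
# display coordinates (a 3-vector): Ps = G @ Pc. Placeholder values only.
G = np.array([
    [800.0,   0.0, 320.0, 10.0],
    [  0.0, 800.0, 240.0, -5.0],
    [  0.0,   0.0,   1.0,  0.0],
])

def project_to_display(p_cam):
    """Project a 3D point in the device camera frame to display coordinates."""
    pc = np.append(np.asarray(p_cam, dtype=float), 1.0)  # homogeneous Pc
    ps = G @ pc                                          # homogeneous Ps
    return ps[:2] / ps[2]                                # dehomogenize -> (u, v)

u, v = project_to_display([0.1, -0.05, 2.0])
```

For this placeholder G, a point 2 m in front of the device camera lands at roughly (365.0, 217.5) in display coordinates, which is the pixel the virtual annotation would be rendered at.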
In step S210, a calibration mapping relationship may be determined, the calibration mapping relationship including the mapping relationship between the calibration camera coordinate system and the display module coordinate system.
As shown in FIG. 4, determining the calibration mapping relationship in step S210 may include the following steps:
Step S410: controlling the display module to display a calibration pattern;
Step S420: acquiring a third calibration image with the calibration camera, the third calibration image being an image of the calibration pattern;
Step S430: determining the coordinates of calibration feature points according to the third calibration image;
Step S440: determining the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the points corresponding to the calibration feature points in the calibration pattern of the display module.
In step S410, the display module may be controlled to display a calibration pattern.
The calibration pattern may be a checkerboard pattern; the processor and the display driving circuit of the device to be calibrated drive the display module to display the checkerboard calibration pattern. Of course, in practical applications the calibration pattern may also be another pattern (such as a dot calibration pattern or an AprilTag calibration pattern); the embodiments of the present disclosure are not limited thereto. When displaying the calibration pattern, the display module may display the calibration pattern at different angles; that is, the display module displays calibration patterns at different angles in a preset order. For example, the display module may display calibration patterns at three different angles.
In step S420, a third calibration image may be acquired with the calibration camera, the third calibration image being an image of the calibration pattern.
The third calibration image is the image, captured by the calibration camera, of the calibration pattern displayed by the display module. While the display module displays the calibration pattern, the picture displayed by the display module is photographed by the calibration camera. When a checkerboard pattern is displayed on the display module, the third calibration image is a photo of the checkerboard pattern. When the display module displays calibration patterns at multiple angles, the third calibration image may include multiple pictures, through which the calibration of the calibration camera and the display module is carried out.
In step S430, the coordinates of the calibration feature points may be determined according to the third calibration image.
The coordinates of the calibration feature points are coordinates in the calibration camera coordinate system. The calibration feature points may be feature points in the third calibration image; for example, when the calibration pattern is a checkerboard pattern, the calibration feature points may be the intersection points of the checkerboard. The coordinates of the calibration feature points in the calibration camera coordinate system can be determined from the multiple calibration pattern images acquired in step S420, for example by Zhang Zhengyou's calibration method (least squares, etc.).
In step S440, the calibration mapping relationship is determined according to the coordinates of the calibration feature points and the coordinates of the points corresponding to the calibration feature points in the calibration pattern of the display module.
The coordinates of the calibration feature points can be obtained in step S430, while the coordinates of the corresponding points in the calibration pattern displayed by the display module were already determined at image rendering time: the checkerboard corner coordinates at the time the calibration pattern was rendered. From the coordinates of the same points in the two different coordinate systems, the conversion relationship between the display module coordinate system and the calibration camera coordinate system is obtained.
As an example, when obtaining the conversion relationship between the display module and the calibration camera, the fill light or backlight of the calibration board may be turned off and a virtual calibration pattern rendered in the display module. The calibration camera photographs the virtual calibration pattern in the display module to obtain a captured image V of the virtual calibration pattern; the checkerboard corners are detected in the captured image V to obtain the corner coordinates V_Points. Meanwhile, the corner coordinates of the virtual calibration pattern rendered in the display module are the known D_Points. Corresponding points are screened from the two corner sets V_Points and D_Points to obtain one-to-one point pairs, from which the homography matrix H (the conversion relationship between the display module coordinate system and the calibration camera coordinate system) is solved. Of course, in practical applications the coordinate conversion relationship between the display module and the calibration camera may also be solved by least squares fitting of a bivariate polynomial; the embodiments of the present disclosure are not limited thereto.
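The solve for H from the V_Points/D_Points pairs described above can be sketched with a standard Direct Linear Transform. The point sets below are synthetic stand-ins, generated from an assumed homography so the recovered H can be checked; real corner pairs would come from checkerboard detection and the rendered pattern. In practice, a library routine such as OpenCV's cv2.findHomography performs the same solve.

```python
import numpy as np

def solve_homography(src, dst):
    """Direct Linear Transform: solve H such that dst ~ H @ [src; 1]."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Synthetic stand-ins: V_Points plays the detected corners in the calibration
# camera photo, D_Points the known rendered corner positions.
rng = np.random.default_rng(0)
V_Points = rng.uniform(100.0, 500.0, size=(20, 2))
H_true = np.array([[1.05, 0.02, 12.0],
                   [-0.01, 0.98, -7.0],
                   [1e-5, -2e-5, 1.0]])   # assumed ground-truth homography
hom = np.hstack([V_Points, np.ones((20, 1))]) @ H_true.T
D_Points = hom[:, :2] / hom[:, 2:3]

H = solve_homography(V_Points, D_Points)  # should recover H_true
```

With exact correspondences the DLT recovers the assumed homography to numerical precision; with noisy detections, more point pairs and a robust estimator (e.g. RANSAC) are typically used instead.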
In step S220, a first calibration image group of the calibration board may be acquired with the calibration camera, the first calibration image group including a plurality of first calibration images.
The display module is light-transmissive, so the light reflected by the calibration board can pass through the display module into the calibration camera. During the calibration process, first calibration images of the calibration board at multiple different preset angles may be acquired with the calibration camera.
With the calibration camera, photos of the calibration board (that is, the first calibration images) are taken when the calibration board is at a first preset angle, a second preset angle, and a third preset angle, the first, second, and third preset angles all being different.
As an example, images of the calibration board at 0 degrees, -20 degrees, and +20 degrees may be acquired, where the calibration board being at 0 degrees means that the calibration board is parallel to the lens plane of the calibration camera. Of course, in practical use the angles of the calibration board may be other angles, and the number of images in the first calibration image group may also be other numbers; the embodiments of the present disclosure do not specifically limit this.
In step S230, a second calibration image group of the calibration board may be acquired with the device camera; the device camera and the display module are provided on one electronic device, and the second calibration image group includes a plurality of second calibration images corresponding to the first calibration images.
The device camera and the display module are provided on the same electronic device; for example, when the electronic device is an augmented reality device, the device camera may be the follow camera of the augmented reality device. The lens of the device camera faces the side away from the user, while the light-emitting side of the display module faces the side close to the user.
While the calibration camera acquires the first calibration images, the device camera may be used to acquire a plurality of second calibration images of the calibration board corresponding to the first calibration images. The second calibration images acquired by the device camera correspond one-to-one with the first calibration images acquired by the calibration camera, a corresponding pair of first and second calibration images being images of the calibration board at the same moment.
With the device camera, photos of the calibration board (that is, the second calibration images) are taken when the calibration board is at the first preset angle, the second preset angle, and the third preset angle, the first, second, and third preset angles all being different.
As an example, images of the calibration board at 0 degrees, -20 degrees, and +20 degrees may be acquired, where the calibration board being at 0 degrees means that the calibration board is parallel to the lens plane of the calibration camera. Of course, in practical use the angles of the calibration board may be other angles, and the number of images in the second calibration image group may also be other numbers; the embodiments of the present disclosure do not specifically limit this.
In practical application, the fill light of the calibration board may be turned on and the calibration board rotated to 0 degrees (parallel to the calibration camera); the calibration camera and the device camera shoot simultaneously to obtain images W_0 and T_0. The calibration board is then rotated (clockwise) to 20 degrees, and the calibration camera and the device camera shoot simultaneously to obtain images W_20 and T_20; the calibration board is rotated (counterclockwise) to -20 degrees, and the calibration camera and the device camera shoot simultaneously to obtain images W_-20 and T_-20.
In step S240, the conversion relationship between the device camera coordinate system and the display module coordinate system may be determined according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
In a feasible implementation of the present disclosure, as shown in FIG. 5, step S240 may be implemented by the following steps:
Step S510: converting the first calibration image into a display calibration image according to the calibration mapping relationship;
Step S520: determining the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display calibration image and the second calibration image.
In step S510, the first calibration image is converted into a display calibration image according to the calibration mapping relationship.
The first calibration image is an image of the calibration board in the calibration camera coordinate system, and the display calibration image is an image of the calibration board in the display module coordinate system. Converting the first calibration image into a display calibration image according to the calibration mapping relationship may be implemented as follows:
acquiring the coordinates of first feature points, the first feature points being feature points in the first calibration image; and converting, using the calibration mapping relationship, the coordinates of the first feature points into display feature point coordinates in the display module coordinate system.
It should be noted that the first feature points may be multiple feature points on the first calibration image; it does not mean that the coordinates of only one feature point are acquired here. For example, when the calibration board is a checkerboard calibration board, the first feature points are the corner points of the checkerboard on the first calibration image.
In step S520, the conversion matrix between the device camera coordinate system and the display module coordinate system is determined according to the display calibration image and the second calibration image.
Since step S510 converts the first calibration image into a display calibration image, the calibration of the display module and the device camera can now be converted into a binocular camera calibration, through which the conversion relationship between the display module and the device camera is obtained. Determining the conversion matrix between the device camera coordinate system and the display module coordinate system may be implemented as follows:
acquiring the coordinates of second feature points, the second feature points being the feature points in the second calibration image corresponding to the first feature points; and determining the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display feature point coordinates and the second feature point coordinates.
It should be noted that the second feature points may be multiple feature points on the second calibration image; it does not mean that the coordinates of only one feature point are acquired here. For example, when the calibration board is a checkerboard calibration board, the first feature points are the corner points of the checkerboard on the first calibration image, and the second feature points are the corner points of the checkerboard on the second calibration image.
As an example, feature point detection is performed on the images W_0, T_0, W_20, T_20, W_-20, and T_-20 to obtain the feature point sets W_0_Points, T_0_Points, W_20_Points, T_20_Points, W_-20_Points, and T_-20_Points. Using the calibration mapping relationship H obtained in step S210, the feature points detected in the calibration camera images are converted into display module coordinates D_i_Points = W_i_Points * H; then, using D_i_Points and T_i_Points with a binocular camera calibration method, the display module intrinsic parameters and the extrinsic parameters between the display module and the device camera are solved, completing the whole calibration flow.
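The per-image conversion D_i_Points = W_i_Points * H that precedes the binocular solve can be sketched as below. The homography and pixel coordinates are made-up placeholders; the resulting D_i_Points, paired with the matching T_i_Points, would then be handed to a stereo calibration routine such as cv2.stereoCalibrate.

```python
import numpy as np

def to_display_coords(w_points, H):
    """Map calibration-camera pixel coordinates into display-module
    coordinates using the homography H obtained in step S210."""
    pts = np.hstack([w_points, np.ones((len(w_points), 1))])  # homogeneous
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                     # dehomogenize

H = np.array([[1.1, 0.0,  5.0],
              [0.0, 0.9, -3.0],
              [0.0, 0.0,  1.0]])                # placeholder, not real data
W_0_Points = np.array([[100.0, 200.0],
                       [250.0,  80.0]])         # placeholder detected corners
D_0_Points = to_display_coords(W_0_Points, H)
# D_0_Points and the matching T_0_Points (device camera corners) are the
# inputs to the binocular calibration that yields the display module
# intrinsics and the display-module-to-device-camera extrinsics.
```

For this placeholder H, the corner (100, 200) maps to (115, 177) in display coordinates.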
In a feasible implementation of the present disclosure, as shown in FIG. 6, step S240 may be implemented by the following steps:
Step S610: determining a camera conversion relationship according to the first calibration image and the second calibration image, the camera conversion relationship including the conversion relationship between the device camera coordinate system and the calibration camera coordinate system;
Step S620: determining the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
In step S610, a camera conversion relationship may be determined according to the first calibration image and the second calibration image, the camera conversion relationship including the conversion relationship between the device camera coordinate system and the calibration camera coordinate system.
Determining the camera conversion relationship according to the first calibration image and the second calibration image may be implemented as follows:
acquiring the coordinates of first feature points, the first feature points being feature points in the first calibration image; acquiring the coordinates of second feature points, the second feature points being the feature points in the second calibration image corresponding to the first feature points; and determining the conversion matrix between the device camera coordinate system and the calibration camera coordinate system according to the coordinates of the first feature points and the coordinates of the second feature points.
In the above embodiment, multiple first calibration images and multiple second calibration images are acquired. The camera conversion relationship can be obtained from the coordinates of the first feature points on the different first calibration images and the coordinates of the second feature points on the corresponding second calibration images, for example through a binocular camera calibration algorithm.
It should be noted that the first feature points may be multiple feature points on the first calibration image, and the second feature points may be multiple feature points on the second calibration image; it does not mean that the coordinates of only one feature point are acquired here. For example, when the calibration board is a checkerboard calibration board, the first feature points are the corner points of the checkerboard on the first calibration image, and the second feature points are the corner points of the checkerboard on the second calibration image.
In step S620, the conversion matrix between the device camera coordinate system and the display module coordinate system may be determined according to the calibration mapping relationship and the camera conversion relationship.
As an example, feature point detection is performed on the images W_0, T_0, W_20, T_20, W_-20, and T_-20 to obtain the feature point sets W_0_Points, T_0_Points, W_20_Points, T_20_Points, W_-20_Points, and T_-20_Points. According to W_0_Points, T_0_Points, W_20_Points, and T_20_Points, a binocular camera calibration method is used to solve the intrinsic parameters of the calibration camera module and the extrinsic parameters between the calibration camera and the device camera. Using the calibration mapping relationship (matrix H) obtained in step S210, the intrinsic parameters of the calibration camera module and the extrinsic parameters between the calibration camera and the device camera are converted into the conversion matrix between the device camera coordinate system and the display module coordinate system, completing the whole calibration flow.
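Under a pinhole reading of the chain described above, the final composition of the second flow can be sketched as G = H * K_cal * [R|T]: project a device-camera point into the calibration camera with the stereo extrinsics [R|T] and intrinsics K_cal, then map into display coordinates with H. This specific factorization and all numeric values are illustrative assumptions for the sketch, not the disclosure's literal formula.

```python
import numpy as np

# Illustrative placeholders (not real calibration results):
K_cal = np.array([[700.0,   0.0, 320.0],
                  [  0.0, 700.0, 240.0],
                  [  0.0,   0.0,   1.0]])     # calibration camera intrinsics
R = np.eye(3)                                 # device-cam -> calibration-cam rotation
T = np.array([[0.06], [0.0], [0.0]])          # ...and translation (metres)
H = np.array([[1.05, 0.0,  8.0],
              [0.0, 0.95, -4.0],
              [0.0,  0.0,  1.0]])             # calibration-cam image -> display coords

RT = np.hstack([R, T])                        # 3x4 extrinsic matrix [R|T]
G = H @ K_cal @ RT                            # device-cam coords -> display coords

def project(p_cam):
    """Map a 3D device-camera point straight to display coordinates via G."""
    ps = G @ np.append(np.asarray(p_cam, dtype=float), 1.0)
    return ps[:2] / ps[2]
```

Chaining the three factors once up front gives a single 3x4 matrix G, so at runtime each annotated point costs one matrix-vector product, which matches the role G plays in the calibration principle section.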
The display calibration method provided by the embodiment of the present disclosure determines the conversion relationship between the device camera coordinate system and the display module coordinate system through the first calibration image group, the second calibration image group, and the calibration mapping relationship, thereby achieving display calibration. Since the first calibration image group is acquired by the calibration camera during the calibration process, judging the alignment of a cross cursor by human eye is avoided, which improves the accuracy of the display module calibration and can in turn improve the display rendering accuracy of the device to be calibrated. In addition, the user does not need to wear the device to be calibrated during the calibration process, which saves manpower.
It should be noted that although the steps of the method in the present disclosure are described in a specific order in the drawings, this does not require or imply that the steps must be performed in that specific order, or that all of the illustrated steps must be performed to achieve the desired result. Additionally or alternatively, some steps may be omitted, multiple steps may be combined into one step for execution, and/or one step may be decomposed into multiple steps for execution, and so on.
An exemplary embodiment of the present disclosure also provides a display calibration apparatus. As shown in FIG. 7, the display calibration apparatus includes:
a first determining module 710, configured to determine a calibration mapping relationship, the calibration mapping relationship including the mapping relationship between the calibration camera coordinate system and the display module coordinate system;
a first acquisition module 720, configured to acquire a first calibration image group of the calibration board with the calibration camera, the first calibration image group including a plurality of first calibration images;
a second acquisition module 730, configured to acquire a second calibration image group of the calibration board with the device camera, the second calibration image group including a plurality of second calibration images corresponding to the first calibration images;
a second determining module 740, configured to determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
The display calibration apparatus provided by the embodiment of the present disclosure determines the conversion relationship between the device camera coordinate system and the display module coordinate system through the first calibration image group, the second calibration image group, and the calibration mapping relationship, thereby achieving display calibration. Since the first calibration image group is acquired by the calibration camera during the calibration process, judging the alignment of a cross cursor by human eye is avoided, which improves the accuracy of the display module calibration and can in turn improve the display rendering accuracy of the device to be calibrated. In addition, the user does not need to wear the device to be calibrated during the calibration process, which saves manpower.
Optionally, the first determining module may include:
a first control unit, configured to control the display module to display a calibration pattern;
a first acquisition unit, configured to acquire a third calibration image with the calibration camera, the third calibration image being an image of the calibration pattern;
a first determining unit, configured to determine the coordinates of calibration feature points according to the third calibration image;
a second determining unit, configured to determine the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the points corresponding to the calibration feature points in the calibration pattern of the display module.
Optionally, the first acquisition module may include:
a second acquisition unit, configured to acquire, with the calibration camera, first calibration images of the calibration board at a first preset angle, a second preset angle, and a third preset angle, the first preset angle, the second preset angle, and the third preset angle all being different.
Optionally, the second acquisition module may include:
a third acquisition unit, configured to acquire, with the device camera, second calibration images of the calibration board at the first preset angle, the second preset angle, and the third preset angle.
Optionally, the second determining module may include:
a first conversion unit, configured to convert the first calibration image into a display calibration image according to the calibration mapping relationship;
a third determining unit, configured to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display calibration image and the second calibration image.
Optionally, the first conversion unit may include:
a first acquiring subunit, configured to acquire the coordinates of a first feature point, the first feature point being a feature point in the first calibration image;
a first conversion subunit, configured to convert, using the calibration mapping relationship, the coordinates of the first feature point into display feature point coordinates in the display module coordinate system.
Optionally, the third determining unit may include:
a second acquiring subunit, configured to acquire the coordinates of a second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
a first determining subunit, configured to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display feature point coordinates and the second feature point coordinates.
Optionally, the second determining module may include:
a second conversion unit, configured to determine a camera conversion relationship according to the first calibration image and the second calibration image, the camera conversion relationship including the conversion relationship between the device camera coordinate system and the calibration camera coordinate system;
a fourth determining unit, configured to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
Optionally, the second conversion unit may include:
a third acquiring subunit, configured to acquire the coordinates of a first feature point, the first feature point being a feature point in the first calibration image;
a fourth acquiring subunit, configured to acquire the coordinates of a second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
a fifth determining subunit, configured to determine the conversion matrix between the device camera coordinate system and the calibration camera coordinate system according to the coordinates of the first feature point and the coordinates of the second feature point.
The specific details of each module of the display calibration apparatus above have already been described in detail in the corresponding display calibration method, and will therefore not be repeated here.
It should be noted that although several modules or units of the display calibration apparatus are mentioned in the above detailed description, this division is not mandatory. In fact, according to embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in one module or unit; conversely, the features and functions of one module or unit described above may be further divided into multiple modules or units. Those skilled in the art can understand that various aspects of the present disclosure can be implemented as a system, a method, or a program product. Therefore, various aspects of the present disclosure may be embodied in the following forms: a complete hardware embodiment, a complete software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software, which may be collectively referred to herein as a "circuit", "module", or "system".
In addition, as shown in FIG. 8, in an exemplary embodiment of the present disclosure, the display calibration device may further include a controller 800, the controller 800 being connected to the calibration camera 103 and the device camera 112 respectively and configured to perform the display calibration method described above.
The display calibration device may be provided with a data interface, through which the device to be calibrated is connected to the display calibration device. The controller may include at least one processing unit 810, at least one storage unit 820, a bus 830 connecting different system components (including the storage unit 820 and the processing unit 810), a display unit 840, and the like.
The storage unit stores program code, which can be executed by the processing unit 810 so that the processing unit 810 performs the steps according to the various exemplary embodiments of the present invention described in the "Exemplary Methods" section of this specification.
The storage unit 820 may include a readable medium in the form of a volatile storage unit, such as a random access storage unit (RAM) 8201 and/or a cache storage unit 8202, and may further include a read-only storage unit (ROM) 8203.
The storage unit 820 may also include a program/utility 8204 having a set of (at least one) program modules 8205; such program modules 8205 include, but are not limited to: an operating system, one or more application programs, other program modules, and program data, and each of these examples, or some combination thereof, may include an implementation of a network environment.
The bus 830 may represent one or more of several types of bus structures, including a storage unit bus or storage unit controller, a peripheral bus, a graphics acceleration port, a processing unit, or a local bus using any of a variety of bus structures.
The display calibration device can also communicate with one or more external devices 870 (such as keyboards, pointing devices, Bluetooth devices, etc.), with one or more devices that enable a user to interact with the display calibration device, and/or with any device (such as a router or modem) that enables the display calibration device to communicate with one or more other computing devices. This communication can be performed through an input/output (I/O) interface 850. The display calibration device can also communicate with one or more networks (for example, a local area network (LAN), a wide area network (WAN), and/or a public network such as the Internet) through the network adapter 860. As shown in the figure, the network adapter 860 communicates with the other modules of the display calibration device through the bus 830. It should be understood that although not shown in the figure, other hardware and/or software modules may be used in conjunction with the display calibration device, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage systems, and the like.
Through the description of the above embodiments, those skilled in the art will readily understand that the example embodiments described here can be implemented by software, or by software combined with necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure can be embodied in the form of a software product, which can be stored in a non-volatile storage medium (which may be a CD-ROM, USB flash drive, removable hard disk, etc.) or on a network, and includes several instructions to make a computing device (which may be a personal computer, a server, a terminal apparatus, a network device, etc.) execute the method according to the embodiments of the present disclosure.
In an exemplary embodiment of the present disclosure, a computer-readable storage medium is also provided, on which a program product capable of implementing the above-mentioned method of this specification is stored. In some possible embodiments, various aspects of the present invention may also be implemented in the form of a program product, which includes program code; when the program product runs on a terminal device, the program code causes the terminal device to execute the steps according to the various exemplary embodiments of the present invention described in the "Exemplary Methods" section of this specification.
Referring to FIG. 9, a program product 900 for implementing the above method according to an embodiment of the present invention is described; it may take the form of a portable compact disc read-only memory (CD-ROM), include program code, and run on a terminal device such as a personal computer. However, the program product of the present invention is not limited thereto; in this document, a readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
The program product may use any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples (a non-exhaustive list) of readable storage media include: an electrical connection with one or more wires, a portable disk, a hard disk, random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM or flash memory), optical fiber, portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, in which readable program code is carried. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. The readable signal medium may also be any readable medium other than a readable storage medium, which can send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device.
The program code contained on a readable medium may be transmitted by any suitable medium, including but not limited to wireless, wired, optical cable, RF, etc., or any suitable combination of the above.
The program code for performing the operations of the present invention can be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code can be executed entirely on the user's computing device, partly on the user's device, as an independent software package, partly on the user's computing device and partly on a remote computing device, or entirely on a remote computing device or server. In the case of a remote computing device, the remote computing device can be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or can be connected to an external computing device (for example, through the Internet using an Internet service provider).
In addition, the above drawings are merely schematic illustrations of the processing included in the methods according to exemplary embodiments of the present invention, and are not intended to be limiting. It is easy to understand that the processing shown in the above drawings does not indicate or limit the chronological order of these processes. It is also easy to understand that these processes may be performed, for example, synchronously or asynchronously in multiple modules.
Those skilled in the art will readily think of other embodiments of the present disclosure after considering the specification and practicing the invention disclosed herein. The present application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the technical field that are not disclosed by the present disclosure. The specification and embodiments are to be regarded as exemplary only, with the true scope and spirit of the present disclosure being indicated by the claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the drawings, and that various modifications and changes can be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.

Claims (20)

  1. A display calibration method, wherein the display calibration method comprises:
    determining a calibration mapping relationship, the calibration mapping relationship comprising a conversion relationship between a calibration camera coordinate system and a display module coordinate system;
    acquiring a first calibration image group of a calibration board with the calibration camera, the first calibration image group comprising a plurality of first calibration images;
    acquiring a second calibration image group of the calibration board with a device camera, the second calibration image group comprising a plurality of second calibration images corresponding to the first calibration images;
    determining the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  2. The display calibration method according to claim 1, wherein determining the calibration mapping relationship comprises:
    controlling the display module to display a calibration pattern;
    acquiring a third calibration image with the calibration camera, the third calibration image being an image of the calibration pattern;
    determining the coordinates of calibration feature points according to the third calibration image;
    determining the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the points corresponding to the calibration feature points in the calibration pattern of the display module.
  3. The display calibration method according to claim 1, wherein acquiring the first calibration image group of the calibration board with the calibration camera comprises:
    acquiring, with the calibration camera, first calibration images of the calibration board at a plurality of different preset angles;
    and acquiring the second calibration image group of the calibration board with the device camera comprises:
    acquiring, with the device camera, a plurality of second calibration images of the calibration board corresponding to the first calibration images.
  4. The display calibration method according to claim 3, wherein acquiring the first calibration image group of the calibration board with the calibration camera comprises:
    acquiring, with the calibration camera, first calibration images of the calibration board at a first preset angle, a second preset angle, and a third preset angle, the first preset angle, the second preset angle, and the third preset angle all being different;
    and acquiring the second calibration image group of the calibration board with the device camera comprises:
    acquiring, with the device camera, second calibration images of the calibration board at the first preset angle, the second preset angle, and the third preset angle.
  5. The display calibration method according to claim 1, wherein determining the conversion relationship between the device camera coordinate system and the display module coordinate system comprises:
    converting the first calibration image into a display calibration image according to the calibration mapping relationship;
    determining a conversion matrix between the device camera coordinate system and the display module coordinate system according to the display calibration image and the second calibration image.
  6. The display calibration method according to claim 5, wherein converting the first calibration image into a display calibration image according to the calibration mapping relationship comprises:
    acquiring the coordinates of a first feature point, the first feature point being a feature point in the first calibration image;
    converting, using the calibration mapping relationship, the coordinates of the first feature point into display feature point coordinates in the display module coordinate system.
  7. The display calibration method according to claim 6, wherein determining the conversion matrix between the device camera coordinate system and the display module coordinate system comprises:
    acquiring the coordinates of a second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
    determining the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display feature point coordinates and the second feature point coordinates.
  8. The display calibration method according to claim 1, wherein determining the conversion relationship between the device camera coordinate system and the display module coordinate system comprises:
    determining a camera conversion relationship according to the first calibration image and the second calibration image, the camera conversion relationship comprising the conversion relationship between the device camera coordinate system and the calibration camera coordinate system;
    determining the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
  9. The display calibration method according to claim 8, wherein determining the camera conversion relationship according to the first calibration image and the second calibration image comprises:
    acquiring the coordinates of a first feature point, the first feature point being a feature point in the first calibration image;
    acquiring the coordinates of a second feature point, the second feature point being the feature point in the second calibration image corresponding to the first feature point;
    determining the conversion matrix between the device camera coordinate system and the calibration camera coordinate system according to the coordinates of the first feature point and the coordinates of the second feature point.
  10. A display calibration apparatus, wherein the display calibration apparatus comprises:
    a first determining module, configured to determine a calibration mapping relationship, the calibration mapping relationship comprising a conversion relationship between a calibration camera coordinate system and a display module coordinate system;
    a first acquisition module, configured to acquire a first calibration image group of a calibration board with the calibration camera, the first calibration image group comprising a plurality of first calibration images;
    a second acquisition module, configured to acquire a second calibration image group of the calibration board with a device camera, the second calibration image group comprising a plurality of second calibration images corresponding to the first calibration images;
    a second determining module, configured to determine the conversion relationship between the device camera coordinate system and the display module coordinate system according to the first calibration image group, the second calibration image group, and the calibration mapping relationship.
  11. The display calibration apparatus according to claim 10, wherein the first determining module comprises:
    a first control unit, configured to control the display module to display a calibration pattern;
    a first acquisition unit, configured to acquire a third calibration image with the calibration camera, the third calibration image being an image of the calibration pattern;
    a first determining unit, configured to determine the coordinates of calibration feature points according to the third calibration image;
    a second determining unit, configured to determine the calibration mapping relationship according to the coordinates of the calibration feature points and the coordinates of the points corresponding to the calibration feature points in the calibration pattern of the display module.
  12. The display calibration apparatus according to claim 10, wherein the first acquisition module comprises:
    a second acquisition unit, configured to acquire, with the calibration camera, first calibration images of the calibration board at a plurality of different preset angles;
    and the second acquisition module comprises:
    a third acquisition unit, configured to acquire, with the device camera, a plurality of second calibration images of the calibration board corresponding to the first calibration images.
  13. The display calibration apparatus according to claim 12, wherein the first acquisition module comprises:
    a second acquisition unit, configured to acquire, with the calibration camera, first calibration images of the calibration board at a first preset angle, a second preset angle, and a third preset angle, the first preset angle, the second preset angle, and the third preset angle all being different;
    and the second acquisition module comprises:
    a third acquisition unit, configured to acquire, with the device camera, second calibration images of the calibration board at the first preset angle, the second preset angle, and the third preset angle.
  14. The display calibration apparatus according to claim 10, wherein the second determining module comprises:
    a first conversion unit, configured to convert the first calibration image into a display calibration image according to the calibration mapping relationship;
    a third determining unit, configured to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the display calibration image and the second calibration image.
  15. The display calibration apparatus according to claim 14, wherein the first conversion unit comprises:
    a first acquiring subunit, configured to acquire the coordinates of a first feature point, the first feature point being a feature point in the first calibration image;
    a first conversion subunit, configured to convert, using the calibration mapping relationship, the coordinates of the first feature point into display feature point coordinates in the display module coordinate system.
  16. The display calibration apparatus according to claim 10, wherein the second determining module comprises:
    a second conversion unit, configured to determine a camera conversion relationship according to the first calibration image and the second calibration image, the camera conversion relationship comprising the conversion relationship between the device camera coordinate system and the calibration camera coordinate system;
    a fourth determining unit, configured to determine the conversion matrix between the device camera coordinate system and the display module coordinate system according to the calibration mapping relationship and the camera conversion relationship.
  17. A display calibration device, for the calibration of a device to be calibrated, the device to be calibrated comprising a display module and a device camera, wherein the display calibration device comprises:
    a fixing fixture, the fixing fixture being used to clamp the device to be calibrated;
    a calibration board, the calibration board being arranged on one side of the fixing fixture and located within the shooting range of the device camera;
    a calibration camera, the calibration camera being arranged within the eye-movement range of the display module;
    a controller, the controller being connected to the calibration camera and the device camera respectively and configured to perform the display calibration method according to any one of claims 1-9.
  18. The display calibration device according to claim 17, wherein the display calibration device further comprises:
    a camera adjustment apparatus, the camera adjustment apparatus being connected to the fixing fixture and the calibration camera and used to adjust the relative position of the calibration camera and the fixing fixture.
  19. The display calibration device according to claim 17, wherein the display calibration device further comprises:
    a calibration board adjustment apparatus, the calibration board adjustment apparatus being connected to the calibration board and the fixing fixture and used to adjust the relative position of the calibration board and the fixing fixture.
  20. A computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1 to 9.
PCT/CN2021/091942 2020-06-17 2021-05-06 Display calibration method and apparatus, device, and storage medium WO2021254014A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010554384.7A CN111652946B (zh) 2020-06-17 Display calibration method and apparatus, device, and storage medium
CN202010554384.7 2020-06-17

Publications (1)

Publication Number Publication Date
WO2021254014A1 true WO2021254014A1 (zh) 2021-12-23

Family

ID=72348214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/091942 WO2021254014A1 (zh) 2020-06-17 2021-05-06 Display calibration method and apparatus, device, and storage medium

Country Status (2)

Country Link
CN (1) CN111652946B (zh)
WO (1) WO2021254014A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114596792A (zh) * 2022-02-24 2022-06-07 武汉精立电子技术有限公司 Bonding method and device for micro-displays
WO2023107845A1 (en) * 2021-12-07 2023-06-15 Snap Inc. Optical waveguide combiner systems and methods

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111652946B (zh) 2020-06-17 2024-05-24 Oppo广东移动通信有限公司 Display calibration method and apparatus, device, and storage medium
CN112945090B (zh) 2020-12-11 2023-04-21 深圳市菲森科技有限公司 Calibration apparatus and calibration method for a three-dimensional measurement system
CN115457148A (zh) 2022-09-16 2022-12-09 博众精工科技股份有限公司 Rotation-center calibration method and apparatus, computer device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014095557A (ja) * 2012-11-07 2014-05-22 Shimadzu Corp Motion tracker device
CN109949228A (zh) * 2019-03-14 2019-06-28 北京航空航天大学 Online calibration device and method for optical see-through AR glasses
CN110880188A (zh) * 2018-09-06 2020-03-13 舜宇光学(浙江)研究院有限公司 Calibration method, calibration apparatus, and calibration system for a near-eye display optical system
CN111652946A (zh) * 2020-06-17 2020-09-11 Oppo广东移动通信有限公司 Display calibration method and apparatus, device, and storage medium
CN112654886A (zh) * 2020-05-27 2021-04-13 华为技术有限公司 Extrinsic parameter calibration method, apparatus, device, and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6507730B2 (ja) * 2015-03-10 2019-05-08 富士通株式会社 Coordinate conversion parameter determination apparatus, coordinate conversion parameter determination method, and computer program for coordinate conversion parameter determination
CN109978956B (zh) * 2019-03-22 2021-07-06 新华三技术有限公司 Calibration method, apparatus, and calibration system for an acquisition device
CN110782499B (zh) * 2019-10-23 2023-01-06 Oppo广东移动通信有限公司 Calibration method and calibration apparatus for an augmented reality device, and terminal device



Also Published As

Publication number Publication date
CN111652946A (zh) 2020-09-11
CN111652946B (zh) 2024-05-24

Similar Documents

Publication Publication Date Title
WO2021254014A1 (zh) Display calibration method and apparatus, device, and storage medium
CN110782499B (zh) Calibration method and calibration apparatus for an augmented reality device, and terminal device
US11625841B2 (en) Localization and tracking method and platform, head-mounted display system, and computer-readable storage medium
EP1899793B1 (en) Control device for information display, corresponding system, method and program product
WO2022012085A1 (zh) Face image processing method and apparatus, storage medium, and electronic device
CN109032348B (zh) Intelligent manufacturing method and device based on augmented reality
US11553123B2 (en) Dynamic detection and correction of light field camera array miscalibration
CN114145011A (zh) Dynamic detection and correction of light field camera array miscalibration
WO2020029373A1 (zh) Method, apparatus, device, and storage medium for determining the spatial position of human eyes
JP2021166075A (ja) Image projection method, apparatus, device, and storage medium
US10331393B2 (en) Vehicle-mounted terminal and method for obtaining resolution of a screen of a handheld terminal
US10063792B1 (en) Formatting stitched panoramic frames for transmission
CN107566793A (zh) Method, apparatus, system, and electronic device for remote assistance
WO2021011087A1 (en) Device pose detection and pose-related image capture and processing for light field based telepresence communications
US20190116354A1 (en) Camera calibration
WO2021112382A1 (en) Apparatus and method for dynamic multi-camera rectification using depth camera
US20170045957A1 (en) System and Method of Object Recognition for an Interactive Input System
WO2023056840A1 (zh) Three-dimensional object display method, apparatus, device, and medium
JP7223072B2 (ja) Roadside sensing method and apparatus, electronic device, storage medium, roadside equipment, and program
WO2024055837A1 (zh) Image processing method, apparatus, device, and medium
WO2024055470A1 (zh) Rotation-center calibration method and apparatus, computer device, and storage medium
WO2019119999A1 (zh) Method and apparatus for presenting the unfolding process of a solid figure, device, and storage medium
WO2019100547A1 (zh) Projection control method and apparatus, projection interaction system, and storage medium
US20240046554A1 (en) Presenting virtual representation of real space using spatial transformation
CN114373016A (zh) Method for locating a point in an extended reality scene

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21825691

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21825691

Country of ref document: EP

Kind code of ref document: A1