CN114219867A - Method and device for calibrating camera, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN114219867A
Authority
CN
China
Prior art keywords
camera
motion
mobile platform
plane
trajectory
Prior art date
Legal status
Pending
Application number
CN202111560370.7A
Other languages
Chinese (zh)
Inventor
周骥
冯歆鹏
Current Assignee
NextVPU Shanghai Co Ltd
Original Assignee
NextVPU Shanghai Co Ltd
Application filed by NextVPU Shanghai Co Ltd
Priority to CN202111560370.7A
Publication of CN114219867A
Legal status: Pending

Classifications

    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G — Physics; G06 — Computing; G06T — Image data processing; G06T 7/00 — Image analysis)
    • G06T 7/73 — Determining position or orientation of objects or cameras using feature-based methods (G06T 7/70 — Determining position or orientation of objects or cameras)
    • G06T 2207/30241 — Trajectory (G06T 2207/30 — Subject of image; context of image processing)


Abstract

Embodiments of the present disclosure provide a method and an apparatus for calibrating a camera, an electronic device, and a readable storage medium, relating to the field of camera calibration. The method includes: capturing a plurality of images with the camera while a mobile platform on which the camera is mounted performs planar motion; determining at least two different motion trajectories of the camera based on the plurality of captured images; and determining mounting pose information of the camera relative to the mobile platform based on the determined motion trajectories.

Description

Method and device for calibrating camera, electronic equipment and readable storage medium
Technical Field
Embodiments of the present disclosure relate to the field of camera calibration, and in particular to a method and an apparatus for camera calibration, an electronic device, and a readable storage medium.
Background
Devices that perform planar motion, such as sweeping robots and logistics robots, are usually provided with a camera on a mobile platform so that visual measurements can be made while the device moves, enabling functions such as automatic positioning, obstacle avoidance, and information detection. Because of installation errors, however, the camera is generally not in its ideal mounting pose relative to the mobile platform. The resulting deviation in visual measurements degrades the accuracy of positioning, detection, and other functions.
The approaches described in this section are not necessarily approaches that have been previously conceived or pursued. Unless otherwise indicated, it should not be assumed that any of the approaches described in this section qualify as prior art merely by virtue of their inclusion in this section. Similarly, unless otherwise indicated, the problems mentioned in this section should not be considered as having been acknowledged in any prior art.
Disclosure of Invention
According to an aspect of an exemplary embodiment of the present disclosure, there is provided a calibration method of a camera, including: when a mobile platform provided with a camera performs plane motion, shooting a plurality of images of the environment through the camera at a plurality of track points in the plane motion; determining at least two different motion trajectories of the camera in the environment based on a plurality of images captured at the plurality of trajectory points; and determining mounting pose information of the camera relative to the mobile platform based on the determined at least two different motion trajectories.
According to another aspect of exemplary embodiments of the present disclosure, there is provided an apparatus for calibration of a camera, including: a capturing unit configured to control the camera to capture a plurality of images while the mobile platform carrying the camera performs planar motion; a motion trajectory determination unit that determines at least two different motion trajectories of the camera based on the plurality of captured images; and a camera pose information determination unit that determines mounting pose information of the camera relative to the mobile platform based on the at least two different motion trajectories.
According to another aspect of exemplary embodiments of the present disclosure, there is provided an electronic device including: a processor; and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform a method according to some exemplary embodiments of the present disclosure.
According to another aspect of exemplary embodiments of the present disclosure, there is provided a computer-readable storage medium storing a program, the program comprising instructions that, when executed by a processor of an electronic device, cause the processor to perform a method according to some exemplary embodiments of the present disclosure.
According to another aspect of exemplary embodiments of the present disclosure, a computer program product is provided, comprising a computer program, wherein the computer program realizes any of the methods described above when executed by a processor.
The method, apparatus, electronic device, and readable storage medium for camera calibration according to the exemplary embodiments of the present disclosure determine at least two motion trajectories of the camera from images captured while the mobile platform is moving, and from these trajectories determine the actual mounting pose of the camera, thereby completing the calibration. This simplifies the calibration of the camera's mounting pose and avoids the influence of manual operation on calibration accuracy.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the disclosure and, together with the description, serve to explain their exemplary implementations. The illustrated embodiments are for purposes of illustration only and do not limit the scope of the claims. Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements:
FIG. 1 shows an exemplary flow diagram of a method for calibration of a camera according to an embodiment of the present disclosure;
FIG. 2 illustrates a schematic diagram of arranging a plurality of identifiers in an environment, according to an embodiment of the present disclosure;
FIG. 3 illustrates an exemplary flow chart for determining a motion trajectory of a camera in accordance with an embodiment of the present disclosure;
FIG. 4 illustrates an exemplary flow chart for determining mounting pose information of a camera relative to a mobile platform according to an embodiment of the present disclosure;
FIG. 5 shows a flow chart for determining an axis of rotation of a mobile platform relative to a camera according to an embodiment of the disclosure;
FIG. 6 illustrates an exemplary flow chart for determining a rotation angle of the mobile platform relative to the camera about a rotation axis according to an embodiment of the present disclosure;
FIG. 7 shows a schematic block diagram of an apparatus for calibration of a camera according to an embodiment of the present disclosure; and
FIG. 8 is a block diagram illustrating an example of an electronic device according to an exemplary embodiment of the present disclosure.
Detailed Description
In the present disclosure, unless otherwise specified, the use of the terms "first", "second", etc. to describe various elements is not intended to limit the positional relationship, the timing relationship, or the importance relationship of the elements, and such terms are used only to distinguish one element from another. In some examples, a first element and a second element may refer to the same instance of the element, and in some cases, based on the context, they may also refer to different instances.
The terminology used in the description of the various described examples in this disclosure is for the purpose of describing particular examples only and is not intended to be limiting. Unless the context clearly indicates otherwise, if the number of elements is not specifically limited, there may be one or more elements. Furthermore, the term "and/or" as used in this disclosure is intended to encompass any and all possible combinations of the listed items. For example, "A and/or B" may represent: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects.
A camera mounted on the mobile platform may acquire images during movement of the mobile platform. In order to accurately acquire surrounding information of the mobile platform by using an image acquired by the camera, after the camera is mounted on the mobile platform, the camera needs to be calibrated to acquire real mounting posture information of the camera relative to the mobile platform, so that the information can be corrected during subsequent vision measurement.
Devices that move substantially on a plane, such as sweeping robots, logistics robots, and vehicles, may include a mobile platform that moves in a plane. A camera can be mounted on the mobile platform to perform visual measurements during movement, realizing functions such as automatic positioning, obstacle avoidance, and information detection. Specifically, as the mobile platform moves, the camera mounted on it takes pictures, from which visual information around the mobile platform may be obtained. If the camera is mounted on the mobile platform in a predetermined manner, for example such that the coordinate system of the camera coincides with that of the mobile platform, the position of the mobile platform can be characterized by the position of the camera determined from the pictures. In this way, the position of the mobile platform is determined indirectly.
However, when the camera is mounted on the mobile platform, mounting errors may leave the camera out of its ideal mounting pose relative to the platform. The camera position determined from captured pictures then no longer accurately represents the position of the mobile platform, affecting the accuracy of positioning, detection, and so on. In the related art, the actual mounting pose of the camera is determined by manual measurement or by calibration against an external calibration object, and the mounting error of the camera relative to the mobile platform is corrected accordingly. These methods, however, require human intervention and are cumbersome to operate, which limits the achievable calibration accuracy: manually measuring the camera's offset angle is imprecise, while calibration with a dedicated external calibration object imposes strict requirements on the object's placement and complicates the calibration process.
In this regard, exemplary embodiments of the present disclosure provide an improved approach for calibration of a camera. By determining at least two motion trajectories of the camera by means of images taken by the camera while the mobile platform is moving, the mounting attitude of the camera can be determined, thereby completing the calibration. The calibration process can be automatically completed without external calibration or measurement equipment and manual operation.
In embodiments of the present disclosure, the mounting offset of the camera on the mobile platform may be indicated by a rotation matrix between the camera coordinate system and the mobile platform coordinate system, where the mounting point of the camera on the mobile platform is taken as the common origin of both coordinate systems. The mobile platform coordinate system may include the following three characteristic directions: a first platform characteristic direction corresponding to the forward direction of the mobile platform, a second platform characteristic direction perpendicular to the moving plane of the mobile platform, and a third platform characteristic direction orthogonal to both of them. The camera coordinate system may include the following three characteristic directions: a first camera characteristic direction corresponding to the optical axis direction, a second camera characteristic direction corresponding to the vertical direction of the imaging plane, and a third camera characteristic direction orthogonal to the first and second camera characteristic directions. The mobile platform coordinate system may coincide with the world coordinate system. When the first camera characteristic direction coincides with the first platform characteristic direction, the second camera characteristic direction with the second platform characteristic direction, and the third camera characteristic direction with the third platform characteristic direction, the camera trajectory determined from images captured by the camera reflects the movement trajectory of the mobile platform. In this case, the camera can be considered to be mounted without offset.
When there is an offset in the mounting of the camera on the mobile platform, the mounting offset of the camera may be indicated using a rotation matrix between the camera coordinate system and the mobile platform coordinate system.
Embodiments of the present disclosure are further described below with reference to the accompanying drawings.
Fig. 1 shows an exemplary flow diagram of a method 100 for calibration of a camera according to an embodiment of the present disclosure. As shown in fig. 1, in step S110, when the mobile platform mounted with the camera performs a planar motion, a plurality of images of the environment are captured by the camera at a plurality of locus points in the planar motion.
It should be understood that the camera may be a device that forms an image and records the image using optical imaging principles. In some embodiments, the camera may be configured as a device that images based on visible or invisible light (e.g., infrared light).
In some embodiments, images taken by the camera may be acquired from the camera in real time, in which case steps S120 to S130 described below may also be performed in real time. In other embodiments, the plurality of images taken by the camera may first be stored in a storage device and later acquired from it to perform steps S120 to S130.
The images captured in step S110 contain information that can represent the motion trajectory of the camera during the planar motion.
In step S120, at least two different motion trajectories of the camera in the environment may be determined based on the plurality of images captured at the plurality of trajectory points.
The motion trajectory of the camera during the planar motion of the mobile platform may be determined using information in the image photographed in step S110 and predetermined ambient environment information.
The motion trajectory of the camera during planar motion of the mobile platform can be determined by means of any characteristic information in the environment.
In some embodiments, the shooting position of the camera when each image was taken may be determined by means of a plurality of identifiers in the environment, for the subsequent execution of steps S120 to S130. In such embodiments, each of the plurality of captured images includes at least one of a plurality of identifiers disposed in the environment. The markers have predefined dimensions and/or patterns, so that the shooting position of the camera relative to each marker can be determined from them using techniques known in the art. The sizes and patterns of different identifiers may be the same or different. In this case, the ambient environment information may include the relative positions of the identifiers in the environment; in some examples it may include point cloud information corresponding to the identifiers. The identifiers may be arranged in the environment in a random or predefined manner by the operator before use. In the embodiment illustrated in FIG. 2, the identifiers may be AprilTag markers 201-1, 201-2, and the like, prearranged in the environment by the operator in a random manner. Markers of any other shape and color may be used without departing from the principles of the present disclosure, as long as the position and orientation of a marker can be easily identified in an image so that the relative position of the camera with respect to the marker at capture time can be determined.
In further embodiments, the shooting position of the camera may also be determined by means of any other characteristic information in the environment. In addition to the size and pattern of a marker, other characteristics such as its color information may assist in determining the shooting position of the camera.
As previously mentioned, in some embodiments, each of the plurality of captured images includes at least one of a plurality of identifiers disposed in the environment. Thus, the capturing position of the camera for each image can be determined by means of the markers, from which at least two different motion trajectories of the camera are determined. In some implementations, each of the markers disposed in the environment appears in at least one of the captured images. In this way, the camera acquires as much information about the surrounding environment as possible during the planar motion of the mobile platform, so that its motion trajectory can be determined more accurately. For example, when determining the movement trajectory of the camera, the full set of captured images may be made to cover all markers arranged in the environment. Using more markers helps to improve the accuracy and robustness of the camera position estimate.
Fig. 3 shows an exemplary flowchart for determining a motion trajectory of a camera according to an embodiment of the present disclosure, and step S120 in fig. 1 may be implemented by the process shown in fig. 3.
In step S310, the shooting position of the camera in the environment for each image is determined based on the position of at least one marker in that image and the relative positional relationship between the plurality of markers in the environment, where at least one of the plurality of identifiers is included in at least one of the plurality of images captured by the camera. In some embodiments, the relative positional relationship between each identifier and the other identifiers comprises their relative rotation and displacement. In step S320, at least two different motion trajectories of the camera are determined based on the shooting positions corresponding to the plurality of captured images.
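The computation in steps S310 to S320 can be illustrated with a small numerical sketch. This is an illustrative assumption, not the patent's implementation; the helper names and poses are made up: given a marker's known pose in the environment (world) frame and that marker's pose in the camera frame as estimated from one image, the camera's shooting position follows from a transform composition.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_pose_from_marker(T_world_marker, T_cam_marker):
    """Camera pose in the world frame: T_world_cam = T_world_marker · T_cam_marker⁻¹."""
    return T_world_marker @ np.linalg.inv(T_cam_marker)

# Example: a marker sits at world position (2, 0, 1) with identity orientation,
# and the camera sees that marker 1 m straight ahead along its optical axis.
T_world_marker = make_T(np.eye(3), np.array([2.0, 0.0, 1.0]))
T_cam_marker = make_T(np.eye(3), np.array([0.0, 0.0, 1.0]))

T_world_cam = camera_pose_from_marker(T_world_marker, T_cam_marker)
camera_position = T_world_cam[:3, 3]   # the shooting position in the world frame
```

Repeating this for every image yields the sequence of shooting positions that is connected into a motion trajectory in step S320.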
In some embodiments, the environment information may be determined by capturing calibration images of the environment before the calibration method begins. In some implementations, the environment information can be determined by identifying markers in the calibration images. For example, the relative positional relationship between the plurality of markers may be determined from at least one calibration image of the environment acquired with the camera mounted on the mobile platform, or with another camera device whose intrinsic parameters have been calibrated. Determining the environment information from calibration images places no constraints on how the markers are arranged, which simplifies the calibration process and reduces the influence of marker-placement accuracy on calibration precision. Furthermore, as shown in fig. 2, markers may be disposed on a wall or on the ground, so the calibration images may also include information about the walls and ground of the environment. The moving plane of the mobile platform may be assumed to be parallel to the ground and the walls perpendicular to it, so that the expression of the mobile platform coordinate system in the environment can be determined from the environment information.
In other embodiments, pre-stored environmental information may also be obtained from a database, where the environmental information includes a relative positional relationship between a plurality of identifiers arranged in an environment for camera calibration.
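As a hedged sketch of how the relative positional relationship between markers could be extracted from a single calibration image (an assumed approach, not the patent's exact procedure): if one calibration image shows two markers, the pose of each marker in the camera frame (e.g. from a tag detector) yields the relative pose between the markers directly, with no constraint on how they were placed.

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_marker_pose(T_cam_m1, T_cam_m2):
    """Pose of marker 2 expressed in marker 1's frame: T_m1_m2 = T_cam_m1⁻¹ · T_cam_m2."""
    return np.linalg.inv(T_cam_m1) @ T_cam_m2

# Example: marker 1 is 1 m ahead of the camera; marker 2 is 1 m ahead
# and 0.5 m to the side (identity orientations, for simplicity).
T_cam_m1 = make_T(np.eye(3), np.array([0.0, 0.0, 1.0]))
T_cam_m2 = make_T(np.eye(3), np.array([0.5, 0.0, 1.0]))

T_m1_m2 = relative_marker_pose(T_cam_m1, T_cam_m2)
```

Chaining such pairwise relative poses over all calibration images builds up the map of marker positions that the trajectory estimation relies on.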
Referring back to fig. 1, in step S130, mounting pose information of the camera relative to the mobile platform is determined based on the determined at least two different motion trajectories. In some embodiments, the mounting pose information is represented as a rotation matrix between the camera coordinate system and the mobile platform coordinate system. The at least two different motion trajectories include at least a first motion trajectory and a second motion trajectory. The process of determining the mounting pose information using the first and second motion trajectories is described below.
Fig. 4 illustrates an exemplary flow chart for determining mounting pose information of a camera relative to a mobile platform according to an embodiment of the present disclosure. Step S130 shown in fig. 1 may be implemented using the process shown in fig. 4.
As shown in fig. 4, determining the installation posture information of the camera with respect to the mobile platform further includes steps S410 to S430.
In step S410, a rotation axis of the mobile platform relative to the camera may be determined based on the first motion trajectory. In this step, the mobile platform moves in a plane in the environment along a first predetermined trajectory that includes at least three non-collinear trajectory points; the specific shape of the first predetermined trajectory is not limited here. The camera may be controlled to capture images during this motion, obtaining images at at least three non-collinear capture positions. From the captured images, at least three trajectory points of the camera's movement can be determined, and the first motion trajectory can be formed by connecting them. It will be appreciated that capturing images at more trajectory points improves the accuracy and robustness of the rotation-axis estimate.
A plane in which the trajectory of the camera movement lies and a normal vector of the plane may be determined based on the first motion trajectory.
If the second camera characteristic direction and the second platform characteristic direction coincide when the camera is installed, the plane containing the first motion trajectory determined in step S410 should be parallel to the ground, and their normal vectors should coincide. However, if there is a deviation in the installation angle of the camera, the plane containing the first motion trajectory is not parallel to the ground, and the two normal vectors do not coincide. Thus, based on the first motion trajectory, the rotation axis of the mobile platform relative to the camera may be determined; this rotation axis represents the deviation between the vertical axis of the camera and the vertical axis of the mobile platform.
Fig. 5 shows a flow chart for determining an axis of rotation of a mobile platform relative to a camera according to an embodiment of the disclosure. Step S410 in fig. 4 may be implemented using the process shown in fig. 5.
In step S510, the at least three trajectory points (i.e., the first motion trajectory) are fitted to a common plane to form a first plane-fitted motion trajectory. In some embodiments, at least one plane constraint is imposed on the at least three first trajectory points to fit them to the same plane; because of this constraint, the first plane-fitted motion trajectory is a planar curve. In step S520, based on the first plane-fitted motion trajectory, the normal vector of the plane containing it is determined as the rotation axis of the mobile platform relative to the camera. That is, the expression of this normal vector in the camera coordinate system is taken as the rotation axis of the mobile platform relative to the camera coordinate system.
In some implementations, the plane constraint includes a non-linear least squares optimization algorithm. In fact, any plane constraint algorithm can be used by those skilled in the art to implement the above plane constraint, and the specific method for applying the plane constraint is not limited herein.
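A minimal sketch of the plane fitting in steps S510 to S520, assuming a least-squares fit via SVD (the patent leaves the choice of plane-constraint algorithm open): the unit normal of the best-fit plane through the trajectory points, expressed in the camera frame, serves as the rotation axis.

```python
import numpy as np

def fit_plane_normal(points):
    """Least-squares plane normal for (N, 3) trajectory points in the camera frame.

    The right singular vector with the smallest singular value of the
    centered point cloud spans the direction of least variance, i.e. the
    normal of the best-fit plane.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)

# Example: four trajectory points lying exactly in the z = 0 plane,
# so the recovered axis is (0, 0, ±1).
trajectory = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
axis = fit_plane_normal(trajectory)
```

With noisy real trajectory points the same SVD gives the least-squares plane, which is one concrete way to realize the nonlinear least-squares constraint mentioned above.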
Referring back to fig. 4, after the rotation axis of the mobile platform relative to the camera is determined in step S410, step S420 is performed. In step S420, a rotation angle of the mobile platform relative to the camera about the rotation axis is determined based on the second motion trajectory and the determined rotation axis. In this step, the mobile platform moves in a plane in the environment along a second predetermined trajectory comprising at least two trajectory points. The camera is controlled to capture images during this motion, obtaining images at at least two trajectory points. From these images, the actual positions of the camera at the trajectory points can be determined and the actual motion trajectory of the camera fitted. By comparing the actual motion trajectory of the camera with the movement trajectory of the mobile platform (i.e., the second predetermined trajectory), the rotation angle of the mobile platform relative to the camera about the rotation axis can be determined.
Fig. 6 illustrates an exemplary flow chart for determining a rotation angle of a mobile platform relative to a camera about a rotation axis according to an embodiment of the present disclosure. Step S420 in fig. 4 may be implemented using the process shown in fig. 6.
In step S610, the at least two second trajectory points are connected to form the second motion trajectory. In step S620, using the previously determined rotation axis, the second motion trajectory may be fitted to the plane in which the mobile platform performs the second planar motion, forming a second plane-fitted motion trajectory. For example, the second motion trajectory may be rotationally compensated using the rotation axis determined from the first motion trajectory, so that the compensated trajectory lies in the plane in which the mobile platform moves. In step S630, the angle between the second predetermined trajectory of the mobile platform and the second plane-fitted motion trajectory of the camera is determined as the rotation angle.
As described above, the rotation axis determined in step S410 captures the deviation between the second camera characteristic direction and the second platform characteristic direction, but it cannot represent the deviation between the optical axis direction of the camera and the forward direction of the mobile platform. That deviation must be determined using the second motion trajectory, yielding the complete rotation matrix between the camera coordinate system and the mobile platform coordinate system. After the second motion trajectory is rotation-compensated by the rotation axis determined in step S410, the second plane-fitted motion trajectory corresponds to the camera trajectory that would be observed if the vertical directions of the camera and mobile platform coordinate systems coincided. In this case, the angle between the second plane-fitted motion trajectory and the second predetermined trajectory of the mobile platform represents the deviation between the optical axis direction of the camera and the forward direction of the mobile platform.
In some embodiments, the second motion trajectory is a straight line. In that case, the angle between the second predetermined trajectory and the second plane-fitted motion trajectory of the camera can be obtained directly as the rotation angle by comparing the direction of the mobile platform's trajectory (i.e., the second predetermined trajectory) with the direction of the camera's actual motion trajectory. It will be appreciated that the second motion trajectory may also have any other shape, in which case the rotation angle may be determined as the angle between the tangent direction at any point of the second plane-fitted motion trajectory (corresponding to the forward direction of the camera) and the tangent direction at the same-time position on the second predetermined trajectory of the mobile platform (corresponding to the forward direction of the mobile platform).
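The compensate-then-compare procedure of steps S610 to S630 can be sketched numerically. Everything below is an illustrative assumption (a synthetic 10-degree mounting tilt and 5-degree yaw offset), not the patent's implementation: the straight-line camera trajectory is rotationally compensated with the axis estimated from the first trajectory, and the remaining angle to the platform's forward direction is the rotation angle.

```python
import numpy as np

def rotation_aligning(a, b):
    """Rotation matrix taking unit vector a onto unit vector b (Rodrigues)."""
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    if np.isclose(c, 1.0):
        return np.eye(3)
    K = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.eye(3) + K + K @ K * (1.0 / (1.0 + c))

# Rotation axis estimated from the first trajectory: a 10-degree tilt about x.
tilt = np.deg2rad(10.0)
axis = np.array([0.0, -np.sin(tilt), np.cos(tilt)])   # plane normal in camera frame
R_comp = rotation_aligning(axis, np.array([0.0, 0.0, 1.0]))  # compensation rotation

# Synthesize the camera's raw straight-line direction: the platform drives
# along its forward axis x, the camera is yawed 5 degrees and tilted as above.
yaw = np.deg2rad(5.0)
cam_dir = R_comp.T @ np.array([np.cos(yaw), np.sin(yaw), 0.0])  # uncompensated

flat_dir = R_comp @ cam_dir                    # second plane-fitted direction
platform_dir = np.array([1.0, 0.0, 0.0])       # second predetermined trajectory

angle = np.arccos(np.clip(np.dot(flat_dir, platform_dir), -1.0, 1.0))
rotation_deg = np.degrees(angle)               # recovers the 5-degree yaw offset
```

After compensation the camera direction lies in the platform's moving plane, so a single angle between the two direction vectors is well defined.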
Referring back to fig. 4, after the rotation axis and the rotation angle of the mobile platform with respect to the camera are obtained, step S430 is performed. In step S430, a rotation matrix between the camera coordinate system and the mobile platform coordinate system is determined as the mounting posture information based on the determined rotation axis and rotation angle. The rotation matrix here may be used to represent a rotational transformation between the camera coordinate system and the mobile platform coordinate system. In one embodiment, the determined rotation axis is the unit vector n = (x, y, z)ᵀ, where x, y and z are the components of the rotation axis in the mobile platform coordinate system, and the determined rotation angle is θ. A rotation matrix R of the camera relative to the mobile platform can then be obtained from the axis-angle representation (the Rodrigues formula), as shown in equation (1):

R = cos θ · I + (1 − cos θ) · n nᵀ + sin θ · [n]×  (1)

where I is the 3 × 3 identity matrix and [n]× denotes the skew-symmetric cross-product matrix of n.
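The axis-angle to rotation-matrix conversion can be sketched directly in code. This is an illustrative implementation of the standard Rodrigues formula, not code from the patent:

```python
import numpy as np

def axis_angle_to_matrix(axis, theta):
    """Rodrigues formula: rotation matrix for a rotation of theta radians
    about the axis (x, y, z), which is normalized internally."""
    n = np.asarray(axis, dtype=float)
    n = n / np.linalg.norm(n)
    K = np.array([[0.0, -n[2], n[1]],
                  [n[2], 0.0, -n[0]],
                  [-n[1], n[0], 0.0]])   # [n]_x, the cross-product matrix
    return (np.cos(theta) * np.eye(3)
            + (1.0 - np.cos(theta)) * np.outer(n, n)
            + np.sin(theta) * K)

# A 90-degree rotation about the z-axis maps the x-axis onto the y-axis.
R = axis_angle_to_matrix([0.0, 0.0, 1.0], np.pi / 2)
```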
In other embodiments, equation (1) may also be converted into an Euler-angle representation, in which the mounting attitude of the camera on the mobile platform is expressed by three parameters: a yaw angle, a pitch angle and a roll angle. In fact, those skilled in the art may represent the rotational transformation in any suitable way, depending on the actual situation.
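That conversion can be sketched as follows, assuming the common Z-Y-X convention R = Rz(yaw) · Ry(pitch) · Rx(roll); the patent does not fix a particular Euler convention, so this choice is an assumption:

```python
import math
import numpy as np

def matrix_to_euler_zyx(R):
    """Extract (yaw, pitch, roll) in radians from a rotation matrix,
    assuming R = Rz(yaw) @ Ry(pitch) @ Rx(roll) and |pitch| < 90 degrees."""
    yaw = math.atan2(R[1, 0], R[0, 0])
    pitch = math.asin(-R[2, 0])
    roll = math.atan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

# A pure 30-degree yaw about z should come back as (30, 0, 0) degrees.
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))
Rz = np.array([[c, -s, 0.0],
               [s, c, 0.0],
               [0.0, 0.0, 1.0]])
yaw, pitch, roll = matrix_to_euler_zyx(Rz)
```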
After determining the rotation matrix of the camera relative to the mobile platform, the position of the mobile platform can be determined from the actual position of the camera by means of the rotation matrix, thereby compensating for mounting errors of the camera relative to the mobile platform.
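A minimal usage sketch of that compensation (the 10-degree mounting yaw here is a hypothetical value, made up for illustration): a direction measured in the camera frame is mapped into the mobile-platform frame by applying the rotation matrix.

```python
import numpy as np

# Hypothetical mounting error: the camera is yawed 10 degrees relative to
# the platform's advancing direction.
theta = np.radians(10.0)
R_cam_to_platform = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                              [np.sin(theta),  np.cos(theta), 0.0],
                              [0.0,            0.0,           1.0]])

# A heading observed in the camera frame...
v_camera = np.array([np.cos(-theta), np.sin(-theta), 0.0])
# ...is expressed in the platform frame by applying the rotation, which
# cancels the mounting error and recovers the true advancing direction.
v_platform = R_cam_to_platform @ v_camera
```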
Although the above method describes determining the rotation axis of the mobile platform relative to the camera from a single first motion trajectory and the rotation angle from a single second motion trajectory, those skilled in the art will understand that the rotation axis may equally be determined from the fitting results of a plurality of first motion trajectories, and the rotation angle from the fitting results of a plurality of second motion trajectories. Moreover, the first motion trajectory and the second motion trajectory may be obtained from a single planar motion.
The method for calibration of a camera according to an exemplary embodiment of the present disclosure is described above. Although the operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, nor that all illustrated operations be performed, to achieve desirable results.
Fig. 7 shows a schematic block diagram of an apparatus 700 for calibration of a camera according to an embodiment of the present disclosure. As shown in fig. 7, an apparatus 700 for calibration of a camera may include a photographing unit 710, a motion trajectory determining unit 720, and a camera pose information determining unit 730.
The apparatus 700 shown in fig. 7 may be used to implement the various methods described in connection with fig. 1-6 above in the present disclosure. In some embodiments, the apparatus 700 may be implemented as a robot that performs planar motion, such as a sweeping robot, a logistics robot, or the like.
The photographing unit 710 may be configured to control the camera to photograph a plurality of images while the mobile platform carrying the camera performs a planar motion. The motion trajectory determination unit 720 may be configured to determine at least two different motion trajectories of the camera based on the plurality of captured images. The camera pose information determination unit 730 may be configured to determine pose information of the camera with respect to the mobile platform based on the determined two different motion trajectories.
Although specific functionality is discussed above with reference to particular modules, it should be noted that the functionality of the various modules discussed herein may be divided into multiple modules and/or at least some of the functionality of multiple modules may be combined into a single module. Performing an action by a particular module discussed herein includes the particular module itself performing the action, or alternatively the particular module invoking or otherwise accessing another component or module that performs the action (or performs the action in conjunction with the particular module). Thus, a particular module that performs an action can include the particular module that performs the action itself and/or another module that the particular module invokes or otherwise accesses that performs the action.
An exemplary embodiment of the present disclosure provides an electronic device, which may include a processor; and a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform any of the methods previously described.
The exemplary embodiments of the present disclosure also provide a computer-readable storage medium storing a program comprising instructions which, when executed by a processor of an electronic device, cause the processor to perform any of the methods described above.
The exemplary embodiments of the present disclosure also provide a computer program product comprising a computer program, wherein the computer program realizes any of the methods described above when executed by a processor.
Referring to fig. 8, an electronic device 800, which is an example of a hardware device (electronic device) that can be applied to aspects of the present disclosure, will now be described. The electronic device 800 may be any machine configured to perform processing and/or computing, and may be, but is not limited to, a workstation, a server, a desktop computer, a laptop computer, a tablet computer, a personal digital assistant, a robot, a smart phone, an on-board computer, or any combination thereof. The above-described method of camera calibration may be implemented in whole or at least in part by the electronic device 800 or a similar device or system.
Electronic device 800 may include components connected to bus 802 (possibly via one or more interfaces) or in communication with bus 802. For example, electronic device 800 may include a bus 802, one or more processors 804, one or more input devices 806, and one or more output devices 808. The one or more processors 804 may be any type of processor and may include, but are not limited to, one or more general purpose processors and/or one or more special purpose processors (e.g., special processing chips). Input device 806 may be any type of device capable of inputting information to electronic device 800 and may include, but is not limited to, a mouse, a keyboard, a touch screen, a microphone, and/or a remote control. Output device 808 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. The electronic device 800 may also include a non-transitory storage device 810, which may be any storage device that is non-transitory and that may enable data storage, including but not limited to a magnetic disk drive, an optical storage device, solid state memory, a floppy disk, a flexible disk, a hard disk, a magnetic tape, or any other magnetic medium, an optical disk or any other optical medium, a ROM (read only memory), a RAM (random access memory), a cache memory, and/or any other memory chip or cartridge, and/or any other medium from which a computer may read data, instructions, and/or code. The non-transitory storage device 810 may be removable from the interface. The non-transitory storage device 810 may have data/programs (including instructions)/code for implementing the above-described methods and steps. The electronic device 800 may also include a communication device 812. 
Communication device 812 may be any type of device or system that enables communication with external devices and/or with a network, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication device, and/or a chipset, such as Bluetooth™ devices, 802.11 devices, Wi-Fi devices, Wi-Max devices, cellular communication devices, and/or the like.
Electronic device 800 may also include a working memory 814, which may be any type of working memory that can store programs (including instructions) and/or data useful for the operation of processor 804, and which may include, but is not limited to, random access memory and/or read only memory devices.
Software elements (programs) may be located in the working memory 814 including, but not limited to, an operating system 816, one or more application programs 818, drivers, and/or other data and code. Instructions for performing the above-described methods and steps may be included in one or more applications 818, and the above-described methods for camera calibration may be implemented by the instructions being read and executed by the processor 804 in the one or more applications 818. More specifically, in the above-described camera calibration method, steps S110-S130 may be implemented, for example, by the processor 804 executing the application 818 having the instructions of steps S110-S130. Further, other steps in the camera calibration method described above may be implemented, for example, by the processor 804 executing an application 818 having instructions to perform the respective steps. Executable code or source code for the instructions of the software elements (programs) may be stored in a non-transitory computer readable storage medium, such as storage device 810 described above, and may be stored in working memory 814 (possibly compiled and/or installed) upon execution. Executable code or source code for the instructions of the software elements (programs) may also be downloaded from a remote location.
It will also be appreciated that various modifications may be made in accordance with specific requirements. For example, customized hardware might also be used and/or particular elements might be implemented in hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. For example, some or all of the disclosed methods and apparatus may be implemented by programming hardware (e.g., programmable logic circuitry including Field Programmable Gate Arrays (FPGAs) and/or Programmable Logic Arrays (PLAs)) in an assembly language or a hardware programming language such as VERILOG, VHDL, or C++, using logic and algorithms according to the present disclosure.
It should also be understood that the foregoing method may be implemented in a server-client mode. For example, a client may receive data input by a user and send the data to a server. The client may also receive data input by the user, perform part of the processing in the foregoing method, and transmit the data obtained by the processing to the server. The server may receive data from the client and perform the aforementioned method or another part of the aforementioned method and return the results of the execution to the client. The client may receive the results of the execution of the method from the server and may present them to the user, for example, through an output device.
It should also be understood that the components of the electronic device 800 may be distributed across a network. For example, some processing may be performed by one processor while other processing is performed by another processor remote from it. Other components of the electronic device 800 may also be similarly distributed. As such, the electronic device 800 may be interpreted as a distributed computing system that performs processing at multiple locations.
While the disclosure has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative and exemplary and not restrictive; the present disclosure is not limited to the disclosed embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps not listed, the indefinite article "a" or "an" does not exclude a plurality, and the term "a plurality" means two or more. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (18)

1. A calibration method of a camera comprises the following steps:
while a mobile platform on which a camera is mounted performs a planar motion, capturing, by the camera, a plurality of images of an environment at a plurality of trajectory points of the planar motion;
determining at least two different motion trajectories of the camera in the environment based on the plurality of images captured at the plurality of trajectory points; and
determining mounting pose information of the camera relative to the mobile platform based on the determined at least two different motion trajectories.
2. The method of claim 1, wherein each of the plurality of captured images includes at least one of a plurality of identifiers disposed in an environment, and,
determining at least two different motion trajectories of the camera based on the captured plurality of images comprises:
determining, for each captured image, a corresponding capture position of the camera in the environment based on the position in that image of at least one identifier appearing in the image and the relative positional relationship among the plurality of identifiers in the environment; and
determining the at least two different motion trajectories based on a plurality of capture positions corresponding to the plurality of captured images.
3. The method of claim 2, wherein the identifier has a predefined size and/or pattern.
4. The method of claim 2, wherein the relative positional relationship between each of the plurality of identifiers and the other identifiers comprises a relative rotation and displacement of the identifier and the other identifiers.
5. The method of claim 2, wherein the relative positional relationship between the plurality of identifiers is determined based on at least one calibration image acquired for the environment by the camera mounted on the mobile platform or other camera device in the environment that has completed an internal reference calibration.
6. The method of claim 5, wherein each of the plurality of identifiers is included in at least one of at least one calibration image acquired for the environment.
7. The method of claim 2, wherein at least one of the plurality of identifiers is included in at least one of a plurality of images captured by the camera.
8. The method of any of claims 2-7, wherein the plurality of identifiers comprises AprilTag labels.
9. The method of any of claims 1-7, wherein the at least two different motion trajectories include a first motion trajectory and a second motion trajectory, and,
determining mounting pose information of the camera relative to the mobile platform based on the determined at least two different motion trajectories comprises:
determining a rotation axis of the mobile platform relative to the camera based on the first motion trajectory;
determining a rotation angle of the camera about the rotation axis relative to the mobile platform based on the second motion trajectory and the determined rotation axis; and
determining a rotation matrix between the camera coordinate system and a mobile platform coordinate system as the mounting attitude information based on the determined rotation axis and the rotation angle.
10. The method of claim 9, wherein the first motion trajectory corresponds to at least three first trajectory points of at least three non-collinear capture positions of the camera when the mobile platform makes a first planar motion, and
determining, based on the first motion profile, an axis of rotation of the mobile platform relative to the camera comprises:
fitting the at least three first trajectory points to a common plane to form a first plane-fitted motion trajectory; and
determining, based on the first plane-fitted motion trajectory, a normal vector of the plane in which the first plane-fitted motion trajectory lies as the rotation axis of the mobile platform relative to the camera.
11. The method of claim 10, wherein determining, based on the first motion profile, an axis of rotation of the mobile platform relative to the camera comprises: applying at least one plane constraint to the at least three first trajectory points to fit the at least three first trajectory points to the same plane to form a first plane-fitted motion trajectory.
12. The method of claim 11, wherein the at least one plane constraint comprises a non-linear least squares optimization algorithm.
13. The method of claim 9, wherein the second motion trajectory corresponds to at least two second trajectory points of at least two capture positions of the camera when the mobile platform is in a second planar motion, and
based on the second motion trajectory and the determined rotation axis, determining a rotation angle of the mobile platform relative to the camera about the rotation axis comprises:
connecting the at least two second trajectory points to form the second motion trajectory;
fitting, by using the rotation axis, the second motion trajectory onto the plane in which the second planar motion of the mobile platform takes place, to form a second plane-fitted motion trajectory; and
determining, as the rotation angle, an angle between the trajectory of the second planar motion of the mobile platform and the second plane-fitted motion trajectory of the camera.
14. The method of claim 13, wherein the second planar motion of the mobile platform is a linear motion.
15. An apparatus for calibration of a camera, comprising:
a photographing unit configured to control the camera to capture a plurality of images while a mobile platform carrying the camera performs a planar motion;
a motion trajectory determination unit configured to determine at least two different motion trajectories of the camera based on the plurality of captured images; and
a camera pose information determination unit configured to determine mounting pose information of the camera relative to the mobile platform based on the at least two different motion trajectories.
16. An electronic device, comprising:
a processor; and
a memory storing a program comprising instructions that, when executed by the processor, cause the processor to perform the method of any of claims 1 to 14.
17. A computer readable storage medium storing a program, the program comprising instructions that, when executed by a processor of an electronic device, cause the processor to perform the method of any of claims 1-14.
18. A computer program product comprising a computer program, wherein the computer program realizes the method according to any one of claims 1 to 14 when executed by a processor.
CN202111560370.7A 2021-12-20 2021-12-20 Method and device for calibrating camera, electronic equipment and readable storage medium Pending CN114219867A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111560370.7A CN114219867A (en) 2021-12-20 2021-12-20 Method and device for calibrating camera, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN114219867A true CN114219867A (en) 2022-03-22

Family

ID=80704233

Country Status (1)

Country Link
CN (1) CN114219867A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination