CN117132657A - Pose correction method, pose correction device and storage medium


Info

Publication number
CN117132657A
Authority
CN
China
Prior art keywords
target
angle
coordinate system
axis
world coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210559310.1A
Other languages
Chinese (zh)
Inventor
张超 (Zhang Chao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN202210559310.1A
Publication of CN117132657A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30204 - Marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The disclosure relates to a pose correction method, a pose correction device and a storage medium. The pose correction method includes: acquiring a target image captured by a camera, and de-distorting the target image based on optical design parameters; extracting, from the de-distorted image, a region of interest that satisfies a field-of-view threshold range; determining a target angle based on the region of interest, where the target angle is the included angle between an imaging sensor of the camera and the target when the camera captured the target image; and correcting the relative pose between the camera and the target based on the target angle. The method and device ease the hardware-environment setup required for distortion correction during camera production, reduce errors, and improve correction accuracy.

Description

Pose correction method, pose correction device and storage medium
Technical Field
The disclosure relates to the technical field of cameras, and in particular to a pose correction method, a pose correction device and a storage medium.
Background
With the continuous progress of technology, cameras are increasingly used in daily life and in technical fields such as intelligent terminals, high-end robotics and autonomous driving.
Optical distortion correction based on optical design parameters is a common way to correct distortion and to ensure reliable subsequent image processing by the camera. At present, however, such correction is performed mainly from the image side, and correction accuracy and results remain unsatisfactory.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides a pose correction method, apparatus, and storage medium.
According to a first aspect of embodiments of the present disclosure, there is provided a pose correction method, including: acquiring a target image captured by a camera, and de-distorting the target image based on optical design parameters; extracting a region of interest satisfying a field-of-view threshold range from the de-distorted image; determining a target angle based on the region of interest, where the target angle is the included angle between an imaging sensor of the camera and the target when the camera captures the target image; and correcting the relative pose between the camera and the target based on the target angle.
In one embodiment, the determining the target angle based on the region of interest includes: mapping the region of interest to a world coordinate system; determining a target angle between the target image and a target coordinate axis in the world coordinate system based on a target pixel in the region of interest; the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; or the target pixel comprises a pixel in the connecting line direction between the imaging sensor and the target, and the target coordinate axis is the z axis of the world coordinate system.
In yet another embodiment, the target pixel includes pixels in the same row direction in the region of interest, and the target coordinate axis is an x-axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; the determining, based on the target pixels in the region of interest, a target angle between the target image and a target coordinate axis in the world coordinate system includes: determining a first distance between a first pixel of the target pixels and a target coordinate axis of the world coordinate system; determining a second distance between a second pixel of the target pixels and a target coordinate axis of the world coordinate system; determining a first projection length of the target pixel on a target coordinate axis of the world coordinate system; and performing arctangent function operation based on the first distance, the second distance and the first projection length to obtain a target angle.
In yet another embodiment, the target pixel includes a pixel in a direction of a line between the imaging sensor and a target, and the target coordinate axis is a z-axis of the world coordinate system; the determining, based on the target pixels in the region of interest, a target angle between the target image and a target coordinate axis in the world coordinate system includes: acquiring a second projection length of the target pixel on the world coordinate system; determining a length of a pixel in the region of interest in a direction of a line between the imaging sensor and a target in the world coordinate system as a third distance; determining a difference between the third distance and the second projected length; determining a ratio of a distance between the imaging sensor and a target to a focal length of the camera to obtain a first ratio; determining a ratio between the difference value and the first ratio to obtain a second ratio; and performing arctangent function operation on the second ratio to obtain a target angle.
In yet another embodiment, the correcting the relative pose between the camera and the target based on the target angle includes: if there is a target angle greater than or equal to a preset angle threshold, reversely adjusting the target, or the camera, by the target angle in the direction of the target coordinate axis, where the target angle includes a first angle between the target image and the x-axis direction of the world coordinate system; or a second angle between the target image and the y-axis direction of the world coordinate system; or a third angle between the target image and the z-axis direction of the world coordinate system.
In yet another embodiment, the correcting the relative pose between the camera and the target based on the target angle includes: and if the target angle is smaller than a preset angle threshold, generating a rotation matrix based on the target angle, and adjusting the target or the camera based on the rotation matrix. The target angle includes a first angle between the target image and an x-axis direction of the world coordinate system, a second angle between the target image and a y-axis direction of the world coordinate system, and a third angle between the target image and a z-axis direction of the world coordinate system.
In yet another embodiment, the method further includes: before extracting the region of interest satisfying the field-of-view threshold range from the de-distorted image, determining, based on the correspondence between the optical design parameter and the field of view, the field-of-view threshold range within which the optical design parameter is smaller than a set distortion threshold.
According to a second aspect of the embodiments of the present disclosure, there is provided a pose correction apparatus including: the processing unit is used for acquiring a target image shot by the camera and de-distorting the target image based on optical design parameters; the extraction unit is used for extracting a region of interest meeting a field threshold range from the de-distorted image; the determining unit is used for determining a target angle in the region of interest, wherein the target angle is an included angle between an imaging sensor of the camera and a target when the camera shoots the target image; and the correction unit is used for correcting the relative pose between the camera and the target based on the target angle.
In one embodiment, the determining unit determines the target angle based on the region of interest in the following manner: mapping the region of interest to a world coordinate system; determining a target angle between the target image and a target coordinate axis in the world coordinate system based on a target pixel in the region of interest; the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; or the target pixel comprises a pixel in the connecting line direction between the imaging sensor and the target, and the target coordinate axis is the z axis of the world coordinate system.
In another embodiment, the target pixel includes pixels in the same row direction in the region of interest, and the target coordinate axis is an x-axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; the determination unit determines a target angle between the target image and a target coordinate axis in the world coordinate system based on the target pixel in the region of interest in the following manner: determining a first distance between a first pixel of the target pixels and a target coordinate axis of the world coordinate system; determining a second distance between a second pixel of the target pixels and a target coordinate axis of the world coordinate system; determining a first projection length of the target pixel on a target coordinate axis of the world coordinate system; and performing arctangent function operation based on the first distance, the second distance and the first projection length to obtain a target angle.
In another embodiment, the target pixel includes a pixel in a direction of a line between the imaging sensor and a target, and the target coordinate axis is a z-axis of the world coordinate system; the determination unit determines a target angle between the target image and a target coordinate axis in the world coordinate system based on the target pixel in the region of interest in the following manner: acquiring a second projection length of the target pixel on the world coordinate system; determining a length of a pixel in the region of interest in a direction of a line between the imaging sensor and a target in the world coordinate system as a third distance; determining a difference between the third distance and the second projected length; determining a ratio of a distance between the imaging sensor and a target to a focal length of the camera to obtain a first ratio; determining a ratio between the difference value and the first ratio to obtain a second ratio; and performing arctangent function operation on the second ratio to obtain a target angle.
In another embodiment, the correction unit corrects the relative pose between the camera and the target based on the target angle in the following manner: if there is a target angle greater than or equal to a preset angle threshold, reversely adjusting the target, or the camera, by the target angle in the direction of the target coordinate axis, where the target angle includes a first angle between the target image and the x-axis direction of the world coordinate system; or a second angle between the target image and the y-axis direction of the world coordinate system; or a third angle between the target image and the z-axis direction of the world coordinate system.
In another embodiment, the correction unit corrects the relative pose between the camera and the target based on the target angle in the following manner: and if the target angle is smaller than a preset angle threshold, generating a rotation matrix based on the target angle, and adjusting the target or the camera based on the rotation matrix. The target angle includes a first angle between the target image and an x-axis direction of the world coordinate system, a second angle between the target image and a y-axis direction of the world coordinate system, and a third angle between the target image and a z-axis direction of the world coordinate system.
In another embodiment, the extraction unit is further configured to: before extracting the region of interest satisfying the field-of-view threshold range from the de-distorted image, determine, based on the correspondence between the optical design parameter and the field of view, the field-of-view threshold range within which the optical design parameter is smaller than a set distortion threshold.
According to a third aspect of the embodiments of the present disclosure, there is provided a pose correction device, characterized by comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to: a method for performing pose correction in the first aspect or any implementation of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a storage medium having instructions stored therein, which when executed by a processor of a terminal, enable the terminal including the processor to perform the method of pose correction in the first aspect or any implementation of the first aspect.
The technical scheme provided by the embodiments of the disclosure can have the following beneficial effects: in the process of correcting the pose so that the camera and the target are in a relatively parallel positional relationship, a target image is acquired. After the target image is de-distorted based on optical design parameters, a region of interest is extracted, and the target angle between the target image and the camera's imaging sensor is calculated within the region of interest. The pose of the target image is then corrected according to the target angle, reducing errors and improving correction accuracy.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a pose correction method according to an exemplary embodiment.
FIG. 2 is a flow chart illustrating a method of determining a target angle based on a region of interest, according to an exemplary embodiment.
FIG. 3 is a flowchart illustrating a method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on target pixels in a region of interest, according to an exemplary embodiment.
FIG. 4 is a flowchart illustrating a method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on target pixels in a region of interest, according to an exemplary embodiment.
FIG. 5 is a flow chart illustrating a method of correcting a relative pose between a camera and a target based on a target angle according to an exemplary embodiment.
FIG. 6 is a flow chart illustrating a method of correcting a relative pose between a camera and a target based on a target angle according to an exemplary embodiment.
FIG. 7 is a flowchart illustrating a method of correcting a relative pose between a camera and a target based on a target angle, according to an exemplary embodiment.
Fig. 8 is a flowchart illustrating processing of a target image according to an exemplary embodiment.
Fig. 9 is a graph illustrating distortion as a function of field of view according to an exemplary embodiment.
FIG. 10 is a schematic diagram illustrating region of interest selection according to an exemplary embodiment.
Fig. 11 shows a schematic diagram of the angle between the target image 2 and the camera imaging sensor in the x-axis direction in an exemplary embodiment of the present disclosure.
Fig. 12 (a) shows a schematic diagram of an included angle between the target image 2 and the camera imaging sensor in the x-axis direction in an exemplary embodiment of the present disclosure.
Fig. 12 (b) shows a schematic diagram of an angle between the target image 2 and the camera imaging sensor in the x-axis direction in an exemplary embodiment of the present disclosure.
Fig. 13 shows a schematic diagram of the angle between the target image 2 and the camera imaging sensor in the y-axis direction in an exemplary embodiment of the present disclosure.
Fig. 14 shows a schematic diagram of the angle between the target image 2 and the camera imaging sensor in the y-axis direction in an exemplary embodiment of the present disclosure.
Fig. 15 shows a schematic diagram of an angle between a line connecting the target image 2 and an imaging sensor of the camera and a z-axis direction in an exemplary embodiment of the present disclosure.
Fig. 16 shows a schematic diagram of an angle between a line connecting the target image 2 and an imaging sensor of the camera and a z-axis direction in an exemplary embodiment of the present disclosure.
Fig. 17 shows a schematic diagram of target adjustment according to an angle between the target image 2 and a sensor of the camera in the x-axis direction in an exemplary embodiment of the present disclosure.
Fig. 18 is a block diagram illustrating a pose correction apparatus according to an exemplary embodiment.
Fig. 19 is a block diagram illustrating an apparatus for pose correction according to an exemplary embodiment.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure.
In the related art, distortion calibration obtains a de-distortion coefficient based on optical design parameters and then applies a distortion correction scheme, which requires the hardware environment, i.e. the relative position between the camera and the target, to be set up precisely. Camera distortion correction is usually based on optical design parameters alone, while pose correction between the target and the camera is neglected; as a result, the hardware environment for distortion correction during camera production is demanding, and the user experience during camera correction is poor.
In order to solve the above-mentioned problems, the present disclosure provides a pose correction method that brings the target and the camera as close as possible to the expected arrangement, for example with the x, y and z axes of their relative positions parallel, thereby easing the hardware-environment setup for distortion correction during camera production.
Fig. 1 is a flowchart illustrating a pose correction method according to an exemplary embodiment, as shown in fig. 1, including the following steps.
In step S11, a target image captured by a camera is acquired, and de-distortion based on optical design parameters is performed on the target image.
In the embodiment of the disclosure, a target image captured by a camera is acquired, the target image being an image of a target captured by the camera. For example, the target may be a printed chart used to correct camera distortion. The target pattern may be a checkerboard or a target drawing with straight-line features; the specific pattern is not limited.
In the embodiments of the present disclosure, distortion is one of the important factors limiting the accuracy of optical measurement and may deform the image. For an ideal optical system, the magnification is constant over a pair of conjugate object-image planes. In a practical optical system, however, this property holds only when the field of view is small; when the field of view is large, the magnification varies with the field of view, and the image loses similarity to the object. Such an imaging defect that deforms the image is called distortion.
Conjugate object-image planes are a pair of object and image planes matched according to a certain rule, or two object-image planes symmetric about a certain axis.
In the embodiment of the disclosure, a target drawing, which may be a checkerboard or have straight-line features, is photographed to obtain a target image, and a de-distortion operation is performed on the obtained target image based on the optical design parameters. The optical design parameter may be the ratio, expressed as a percentage, of the difference between the actual and ideal image heights of the target image to the ideal image height. The target image may be optically de-distorted once according to the optical design parameters.
In step S12, in the undistorted image, a region of interest satisfying the field-of-view threshold range is extracted.
In the embodiment of the disclosure, the field of view is the maximum range that can be observed by the camera.
In the disclosed embodiment, in the optically de-distorted target image, an appropriate field-of-view threshold is selected based on a distortion-versus-field curve plotted with the actual image height of the target image on the x-axis and the ideal image height on the y-axis. The field-of-view threshold range corresponds to fields of view with small optical distortion; performing pose correction within such a field of view reduces errors.
In the embodiment of the disclosure, the region of interest is a field of view region that satisfies a field of view threshold range.
In step S13, a target angle is determined based on the region of interest, where the target angle is an angle between an imaging sensor of the camera and the target when the camera captures an image of the target.
In the embodiment of the disclosure, the included angle between the imaging sensor of the camera and the target is determined in the region of interest of the target image, namely the included angles between the imaging sensor and the target about the x, y and z axes.
In step S14, the relative pose between the camera and the target is corrected based on the target angle.
In the embodiment of the disclosure, according to the obtained included angles between the imaging sensor of the camera and the target about the x, y and z axes, the corresponding angles are adjusted in reverse about the three axes, thereby correcting the pose between the camera and the target.
According to the pose correction method provided by the embodiment of the disclosure, the camera and the target are brought as close as possible to a parallel relative position, the hardware-environment setup for distortion correction during camera production is simplified, and user experience is improved.
The following examples of the present disclosure further illustrate and describe the method of determining the target angle in the above-described examples of the present disclosure.
FIG. 2 is a flow chart illustrating a method of determining a target angle based on a region of interest, as shown in FIG. 2, the method of determining a target angle based on a region of interest, according to an exemplary embodiment, including the following steps.
In step S21, the region of interest is mapped to a world coordinate system.
In the embodiment of the disclosure, the region of interest in the target image and the camera imaging sensor may be mapped into a world coordinate system; the world coordinate system may be chosen arbitrarily and is not limited here. The included angle between the region of interest and the imaging sensor is then obtained from their relative position in the target image.
In step S22, a target angle between the target image and a target coordinate axis in the world coordinate system is determined based on the target pixel in the region of interest.
In the embodiment of the disclosure, the target pixels in the region of interest may be pixels in the same row or the same column, and the target coordinate axis in the world coordinate system may be the x-axis, the y-axis or the z-axis, but is not limited to these; for example, the line x = 3 may also serve as the target coordinate axis.
In step S23, the target pixel includes pixels in the same row direction in the region of interest, and the target coordinate axis is the x-axis of the world coordinate system.
In the embodiment of the disclosure, the target pixels may be pixels in any same row of the region of interest, and the target pixels in the same row may be abstracted into a straight line.
In step S24, alternatively, the target pixels include pixels in the same column direction in the region of interest, and the target coordinate axis is the y-axis of the world coordinate system.
In the embodiment of the disclosure, the target pixel may be any pixel in the same column of the region of interest, and the target pixel in the same column may be abstracted into a straight line.
In step S25, alternatively, the target pixel includes a pixel in the direction of the line connecting the imaging sensor and the target, and the target coordinate axis is the z-axis of the world coordinate system.
In the embodiment of the disclosure, the target pixel may also be a pixel on the same line on a connection line between the imaging sensor of the camera and the region of interest on the target, and the target pixel on the same line may be abstracted into a straight line.
The following embodiments of the present disclosure further explain and illustrate the method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest in the above embodiments of the present disclosure.
FIG. 3 is a flowchart illustrating a method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest, as shown in FIG. 3, the method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest, according to an exemplary embodiment, comprising the following steps.
In step S31, a first distance between a first one of the target pixels and a target coordinate axis of the world coordinate system is determined.
In the embodiment of the present disclosure, the target pixels are pixels on the same line in the region of interest, and the first pixel may be any pixel on the straight line to which the target pixels are abstracted. A perpendicular is drawn from the first pixel to the target coordinate axis; its length is the first distance. If the target pixels are pixels in the same row of the region of interest, the target coordinate axis is the x-axis; if they are pixels in the same column, the target coordinate axis is the y-axis.
In step S32, a second distance between a second one of the target pixels and a target coordinate axis of the world coordinate system is determined.
In the embodiment of the disclosure, the second pixel may be a pixel on the same line in the region of interest as the first pixel but distinct from it. A perpendicular is drawn from the second pixel to the target coordinate axis; its length is the second distance.
In step S33, a first projection length of the target pixel on the target coordinate axis of the world coordinate system is determined.
In the embodiment of the disclosure, the projection length of the target pixels on the target coordinate axis may be the distance between the foot of the perpendicular from the first pixel and the foot of the perpendicular from the second pixel on the target coordinate axis.
In step S34, an arctangent function operation is performed based on the first distance, the second distance, and the first projection length, to obtain a target angle.
In the embodiment of the disclosure, if the first pixel and the second pixel lie on opposite sides of the point where their line intersects the coordinate axis, that is, taking that intersection as the origin of an x-y coordinate system, the two pixels fall in the first and third quadrants, or in the second and fourth quadrants, of that coordinate system, then the ratio of the sum of the first distance and the second distance to the first projection length is determined.
In the embodiment of the disclosure, if the first pixel and the second pixel lie on the same side of the point where their line intersects the coordinate axis, that is, taking that intersection as the origin of an x-y coordinate system, the two pixels fall in the same quadrant, then the ratio of the absolute value of the difference between the first distance and the second distance to the first projection length is determined.
In the embodiment of the disclosure, the target angle can be obtained by applying the arctangent function to the measured first distance, second distance and first projection length. If the target pixels are pixels in the same row of the region of interest, the target angle is the angle between the region of interest and the imaging sensor in the x-axis direction. If the target pixels are pixels in the same column, the target angle is the angle between the region of interest and the imaging sensor in the y-axis direction.
In the embodiment of the disclosure, taking the opposite-side case as an example: if the perpendicular distance from the first pixel to the x-axis of the world coordinate system is 1, the perpendicular distance from the second pixel to the x-axis is 1, and the projection length on the x-axis is 2, then by the arctangent function the included angle between the row of pixels and the x-axis of the world coordinate system is arctan((1+1)/2) = 45 degrees, i.e. the target angle is 45 degrees.
In the embodiment of the disclosure, taking the same-side case as an example: if the perpendicular distance from the first pixel in the column direction to the y-axis of the world coordinate system is 3, the perpendicular distance from the second pixel to the y-axis is 1, and the projection length on the y-axis is 2, then by the arctangent function the included angle between the column of pixels and the y-axis of the world coordinate system is arctan((3-1)/2) = 45 degrees, i.e. the target angle is 45 degrees.
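For illustration only, the row/column-direction angle computation of steps S31 to S34 and the two worked examples above can be sketched in Python; the helper name and signature are assumptions, not part of the disclosure:

```python
import math

def target_angle_deg(d1: float, d2: float, proj: float, same_side: bool) -> float:
    """Angle between a line of target pixels and a world-coordinate axis.

    d1, d2:    perpendicular distances from two pixels on the line to the axis
               (the first distance and the second distance).
    proj:      first projection length of the pixels onto the axis.
    same_side: True if both pixels lie on the same side of the axis.
    """
    offset = abs(d1 - d2) if same_side else d1 + d2
    return math.degrees(math.atan2(offset, proj))

# Both worked examples from the text give 45 degrees:
print(target_angle_deg(1, 1, 2, same_side=False))  # opposite sides of the x-axis
print(target_angle_deg(3, 1, 2, same_side=True))   # same side of the y-axis
```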
The following embodiments of the present disclosure further explain and illustrate the method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest in the above embodiments of the present disclosure.
FIG. 4 is a flowchart illustrating a method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest, as shown in FIG. 4, the method of determining a target angle between a target image and a target coordinate axis in a world coordinate system based on a target pixel in a region of interest, according to an exemplary embodiment, comprising the following steps.
In step S41, a second projection length of the target pixel on the world coordinate system is acquired.
In the embodiment of the disclosure, the target pixel may be projected into the world coordinate system, and the projection length is measured to obtain the second projection length.
In step S42, the length of the pixel in the region of interest in the direction of the line between the imaging sensor and the target in the world coordinate system is determined as the third distance.
In step S43, a difference between the third distance and the second projection length is determined.
In step S44, a ratio between the distance between the imaging sensor and the target and the focal length of the camera is determined, resulting in a first ratio.
In step S45, a ratio between the difference and the first ratio is determined, resulting in a second ratio.
In step S46, an arctangent function operation is performed on the second ratio to obtain the target angle.
In the embodiment of the disclosure, for example, suppose the length of the pixels along the line between the camera's imaging sensor and the target in the region of interest (the third distance) is 2, the length of their projection in the world coordinate system is 1, the distance between the target and the imaging sensor is 1, and the focal length of the camera is 1. By the arctangent function, the included angle in the direction of the line between the imaging sensor and the target is arctan((2-1)/(1/1)) = 45 degrees, i.e. the target angle is 45 degrees.
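A corresponding hedged sketch of the z-axis angle of steps S41 to S46, computed from the third distance, the second projection length, the sensor-target distance and the focal length (function and parameter names are assumptions):

```python
import math

def z_axis_angle_deg(pixel_len: float, proj_len: float,
                     dist: float, focal_len: float) -> float:
    """Angle between the sensor-target line and the world z-axis.

    pixel_len: length of the pixels along the sensor-target line (third distance).
    proj_len:  second projection length of those pixels in the world coordinate system.
    dist:      distance between the imaging sensor and the target.
    focal_len: focal length of the camera.
    Computes arctan((pixel_len - proj_len) / (dist / focal_len)).
    """
    return math.degrees(math.atan2(pixel_len - proj_len, dist / focal_len))

print(z_axis_angle_deg(2, 1, 1, 1))  # worked example from the text -> 45.0
```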
The following embodiments of the present disclosure further explain and illustrate the method of correcting the relative pose between the camera and the target based on the target angle in the above embodiments of the present disclosure.
FIG. 5 is a flowchart illustrating a method of correcting a relative pose between a camera and a target based on a target angle according to an exemplary embodiment, as shown in FIG. 5, the method of correcting a relative pose between a camera and a target based on a target angle comprising the following steps.
In step S51, if there is a target angle greater than or equal to the preset angle threshold, the target is reversely adjusted or the camera is reversely adjusted in the direction of the target coordinate axis.
In the embodiment of the disclosure, the angle threshold may be set to 1 degree. When the target angle is greater than or equal to the preset 1-degree threshold, the angle between the target and the camera is considered too large, and the target, or the camera, is adjusted in reverse by the target angle.
In the embodiment of the disclosure, for example, suppose the target angle formed between the target image and the x-axis direction of the world coordinate system is 3 degrees and the preset angle threshold is 1 degree; the target angle exceeds the threshold, so the target image, or the camera, is reversely adjusted by 3 degrees in the x-axis direction of the world coordinate system. Likewise, if the target angle formed with the y-axis direction is 5 degrees, the target image is reversely adjusted by 5 degrees in the y-axis direction; and if the target angle formed with the z-axis direction is 4 degrees, the target image is reversely adjusted by 4 degrees in the z-axis direction.
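As a minimal sketch of this thresholded reverse adjustment (the function name and the dictionary output format are illustrative assumptions, not part of the disclosure):

```python
def reverse_adjustments(angles_deg, threshold_deg=1.0):
    """Per-axis reverse adjustment: correct any axis whose angle reaches the threshold."""
    return {axis: (-a if abs(a) >= threshold_deg else 0.0)
            for axis, a in zip("xyz", angles_deg)}

# Worked example from the text: angles of 3, 5 and 4 degrees about x, y, z
print(reverse_adjustments([3.0, 5.0, 4.0]))  # {'x': -3.0, 'y': -5.0, 'z': -4.0}
```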
In step S52, the target angle includes a first angle between the target image and the x-axis direction of the world coordinate system; a second angle between the target image and the y-axis direction of the world coordinate system; a third angle between the target image and the z-axis direction of the world coordinate system.
In the embodiment of the disclosure, the target angle includes an included angle between a pixel in a certain row direction in the region of interest on the target image and an x-axis direction in the world coordinate system, or an included angle between a pixel in a certain column direction in the region of interest on the target image and a y-axis direction in the world coordinate system, or an included angle between a line between the region of interest on the target image and the camera imaging sensor and a z-axis direction in the world coordinate system.
The following embodiments of the present disclosure further explain and illustrate the method of correcting the relative pose between the camera and the target based on the target angle in the above embodiments of the present disclosure.
FIG. 6 is a flowchart illustrating a method of correcting a relative pose between a camera and a target based on a target angle according to an exemplary embodiment, as shown in FIG. 6, the method of correcting a relative pose between a camera and a target based on a target angle comprising the following steps.
In step S61, if the target angle is smaller than the preset angle threshold, a rotation matrix is generated based on the target angle, and the target is adjusted or the camera is adjusted based on the rotation matrix.
In the embodiment of the disclosure, the angle threshold is a preset angle, for example, may be 1 degree, and when a first angle formed by the target image and the x-axis direction of the world coordinate system, a second angle formed by the target image and the y-axis direction of the world coordinate system, and a third angle formed by the target image and the z-axis direction of the world coordinate system are all smaller than the preset angle of 1 degree, a rotation matrix is generated, and the target is adjusted based on the rotation matrix.
In the embodiment of the present disclosure, the rotation matrices may be calculated, for example, as follows: the rotation matrix generated from the first angle α, formed between the target image and the x-axis direction of the world coordinate system, may be the 3×3 matrix whose rows, from top to bottom, are (1, 0, 0), (0, cos α, -sin α) and (0, sin α, cos α).
The rotation matrix generated from the second angle β, formed between the target image and the y-axis direction of the world coordinate system, may be the 3×3 matrix whose rows are (cos β, 0, sin β), (0, 1, 0) and (-sin β, 0, cos β).
The rotation matrix generated from the third angle γ, formed between the target image and the z-axis direction of the world coordinate system, may be the 3×3 matrix whose rows are (cos γ, -sin γ, 0), (sin γ, cos γ, 0) and (0, 0, 1).
In an embodiment of the present disclosure, the target is adjusted based on the resulting rotation matrices. The adjustment may take the rotation matrix generated from the third angle γ, right-multiply it by the rotation matrix generated from the second angle β, and then right-multiply the result by the rotation matrix generated from the first angle α.
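As an illustration, the three matrices above and their composition can be written with numpy as follows; this is a sketch assuming the angles are available in radians, not the disclosure's own code:

```python
import numpy as np

def rot_x(a: float) -> np.ndarray:
    """Rotation matrix generated from the first angle (about the x-axis)."""
    return np.array([[1.0, 0.0,        0.0       ],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

def rot_y(b: float) -> np.ndarray:
    """Rotation matrix generated from the second angle (about the y-axis)."""
    return np.array([[ np.cos(b), 0.0, np.sin(b)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(b), 0.0, np.cos(b)]])

def rot_z(g: float) -> np.ndarray:
    """Rotation matrix generated from the third angle (about the z-axis)."""
    return np.array([[np.cos(g), -np.sin(g), 0.0],
                     [np.sin(g),  np.cos(g), 0.0],
                     [0.0,        0.0,       1.0]])

# Composition as described in the text: R = R(gamma) . R(beta) . R(alpha)
alpha, beta, gamma = np.radians([0.4, 0.6, 0.3])  # example sub-threshold angles
R = rot_z(gamma) @ rot_y(beta) @ rot_x(alpha)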
In step S62, the target angle includes a first angle between the target image and the x-axis direction of the world coordinate system, a second angle between the target image and the y-axis direction of the world coordinate system, and a third angle between the target image and the z-axis direction of the world coordinate system.
The following examples of the present disclosure further illustrate and describe the method of determining the threshold range of the field of view in the above-described examples of the present disclosure.
FIG. 7 is a flowchart illustrating a method of correcting a relative pose between a camera and a target based on a target angle according to an exemplary embodiment, as shown in FIG. 7, the method of correcting a relative pose between a camera and a target based on a target angle comprising the following steps.
In step S71, it is determined that the optical design parameter is smaller than the field threshold range corresponding to when the distortion threshold is set, based on the correspondence between the optical design parameter and the field of view.
In an embodiment of the present disclosure, a field-of-view threshold range within which the optical design parameter stays below the distortion threshold is determined from the curve of the optical design parameter versus the field of view. The field-of-view threshold may be chosen as the maximum field of view of the undistorted, or least distorted, portion of this curve. The field-of-view threshold range can thus be obtained from the curve of the optical design parameter as a function of the field of view.
For example, given a circular area with radius R and, inside it, a small circular area with radius r, if the ratio of r to R is 0.1, the area bounded by r corresponds to the 0.1 field of view.
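A minimal sketch of picking the field-of-view threshold, assuming the design data is given as sampled (field, distortion) pairs; the function name and the 1% default are assumptions:

```python
def field_threshold(fields, distortion, max_distortion=0.01):
    """Largest field fraction whose distortion stays within max_distortion.

    fields:     ascending field-of-view fractions, e.g. [0.0, 0.1, ..., 1.0]
    distortion: optical distortion at each field, from the design data
    """
    ok = 0.0
    for f, d in zip(fields, distortion):
        if abs(d) <= max_distortion:
            ok = f
        else:
            break
    return ok

# e.g. distortion negligible up to the 0.8 field, then growing -> threshold 0.8
fields = [i / 10 for i in range(11)]
dist = [0.0] * 9 + [0.03, 0.08]
print(field_threshold(fields, dist))  # 0.8
```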
In step S72, in the undistorted image, a region of interest satisfying the field-of-view threshold range is extracted.
In the embodiment of the disclosure, a field of view with small optical distortion is obtained according to the field-of-view threshold and set as the region of interest.
The following example takes a checkerboard target to illustrate the target pose correction method and the region-of-interest selection of the foregoing embodiments of the present disclosure.
Fig. 8 is a flowchart illustrating processing of a target image according to an exemplary embodiment. Referring to fig. 8, first, a target image captured by a camera is acquired, and an optical distortion coefficient is obtained as the ratio of the difference between the actual and ideal image heights of the target image to the ideal image height; for example, if the actual image height is 150 and the ideal image height is 100, the optical distortion coefficient is 50%. The target image is then optically de-distorted based on the obtained optical distortion coefficient, yielding target image 1.
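A minimal sketch of the coefficient computation only; how the coefficient is subsequently applied to de-distort the image is not specified by the text, so it is omitted here:

```python
def optical_distortion_coefficient(actual_height: float, ideal_height: float) -> float:
    """Optical distortion coefficient: (actual - ideal) / ideal image height."""
    return (actual_height - ideal_height) / ideal_height

# Worked example from the text: actual 150, ideal 100 -> 50%
print(f"{optical_distortion_coefficient(150.0, 100.0):.0%}")
```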
In the embodiment of the disclosure, fig. 9 is a schematic diagram of a distortion-versus-field curve according to an exemplary embodiment. Referring to fig. 9, the maximum value of the straight portion of the curve is selected as the field-of-view threshold, and the region of interest is extracted based on that threshold. For example, if distortion begins to appear at the 0.8 field in the distortion-versus-field curve, the field-of-view threshold may be set to 0.8; that is, a region centered on the center of the target image, whose radius is 0.8 times that of the target image, is selected as the region of interest. Referring to fig. 10, fig. 10 is a schematic diagram illustrating region-of-interest selection according to an exemplary embodiment.
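As a minimal sketch of extracting such a circular region of interest, assuming the full field corresponds to the largest circle centered in the image; the masking approach and the function name are assumptions:

```python
import numpy as np

def circular_roi(image: np.ndarray, field_threshold: float = 0.8) -> np.ndarray:
    """Zero out everything outside the circle of radius field_threshold * max radius."""
    h, w = image.shape[:2]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    yy, xx = np.ogrid[:h, :w]
    r_max = min(cy, cx)  # assumption: full field = largest centered circle
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= (field_threshold * r_max) ** 2
    out = image.copy()
    out[~mask] = 0
    return out
```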
In the embodiment of the disclosure, target image 2 is obtained after the region of interest is extracted. The included angle between target image 2 and the camera imaging sensor is then calculated based on target image 2.
Fig. 11 is a schematic diagram illustrating an angle between a target image 2 and a camera imaging sensor in an x-axis direction in an exemplary embodiment of the present disclosure. Fig. 12 (a) and 12 (b) are schematic diagrams showing angles between the target image 2 and the camera imaging sensor in the x-axis direction in an exemplary embodiment of the present disclosure, respectively.
Referring to fig. 12 (a), the pixels in the row direction of target image 2 are abstracted into a straight line, a world coordinate system is established with the plane of the camera's imaging sensor as its horizontal direction, the x-axis of the world coordinate system is drawn as a dotted line, and the included angle between the row-direction pixels of target image 2 and the x-axis direction is calculated. Here h1 and h2 denote the distances from two pixels in the row direction of target image 2 to the x-axis of the world coordinate system, and H denotes the length of the projection of the row-direction pixels onto the x-axis. From h1, h2 and H and the arctangent function, the included angle α between the row-direction pixels of target image 2 and the x-axis direction of the world coordinate system is obtained as α = arctan[(h1+h2)/H].
Referring to fig. 12 (b), the same construction is used: the pixels in the row direction of target image 2 are abstracted into a straight line, the world coordinate system is established with the plane of the imaging sensor as its horizontal direction, and the x-axis is drawn as a dotted line. With h1 and h2 the distances from two pixels in the row direction to the x-axis, and H the length of the projection of the row-direction pixels onto the x-axis, the included angle α between the row-direction pixels of target image 2 and the x-axis direction of the world coordinate system is obtained as α = arctan[(h1-h2)/H].
Fig. 13 shows a schematic diagram of the angle between the target image 2 and the camera imaging sensor in the y-axis direction in an exemplary embodiment of the present disclosure. Fig. 14 shows a schematic diagram of the angle between the target image 2 and the camera imaging sensor in the y-axis direction in an exemplary embodiment of the present disclosure.
Referring to fig. 14, the pixels in the column direction of target image 2 are abstracted into a straight line, the y-axis of the world coordinate system is drawn as a dotted line, and the included angle between the column-direction pixels of target image 2 and the y-axis direction is calculated. Here l1 and l2 denote the distances from two pixels in the column direction of target image 2 to the y-axis of the world coordinate system, and L denotes the length of the projection of the column-direction pixels onto the y-axis. From l1, l2 and L and the arctangent function, the included angle β between the column-direction pixels of target image 2 and the y-axis direction of the world coordinate system is obtained as β = arctan[(l1+l2)/L].
Fig. 15 shows a schematic diagram of an angle between a line connecting the target image 2 and an imaging sensor of the camera and a z-axis direction in an exemplary embodiment of the present disclosure. Fig. 16 shows a schematic diagram of an angle between a line connecting the target image 2 and an imaging sensor of the camera and a z-axis direction in an exemplary embodiment of the present disclosure.
Referring to fig. 16, the pixels along the line connecting target image 2 and the camera's imaging sensor are abstracted into a straight line, the z-axis of the world coordinate system is drawn as a dotted line, and the included angle between those pixels and the z-axis direction is calculated. Here H1 denotes the length of the projection, onto the world coordinate system, of the pixels along the line between target image 2 and the imaging sensor, and H2 denotes the length of those pixels themselves. Dist denotes the distance between target image 2 and the imaging sensor, and f denotes the focal length of the camera. From H1, H2, Dist and f and the arctangent function, the included angle γ between the pixels along the line and the z-axis direction of the world coordinate system is obtained as γ = arctan[(H2-H1)/(Dist/f)].
Fig. 17 shows a schematic diagram of target adjustment according to the angle between target image 2 and the camera's sensor in the x-axis direction in an exemplary embodiment of the present disclosure. Referring to fig. 17, if the included angle between target image 2 and the camera's sensor in the x-axis direction is greater than or equal to the preset angle, the target is reversely adjusted in the x-axis direction. If the included angle is smaller than the preset angle, a rotation matrix is generated based on the included angle and applied to target image 2. The target can be adjusted in the same way according to the included angles between target image 2 and the camera's sensor in the y-axis and z-axis directions.
Based on the same conception, the embodiment of the disclosure also provides a pose correction device.
It can be appreciated that, in order to implement the above-mentioned functions, the pose correction apparatus provided in the embodiments of the present disclosure includes corresponding hardware structures and/or software modules for performing the respective functions. The disclosed embodiments may be implemented in hardware or a combination of hardware and computer software, in combination with the various example elements and algorithm steps disclosed in the embodiments of the disclosure. Whether a function is implemented as hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. Those skilled in the art may implement the described functionality using different approaches for each particular application, but such implementation is not to be considered as beyond the scope of the embodiments of the present disclosure.
Fig. 18 is a block diagram illustrating a pose correction apparatus according to an exemplary embodiment. Referring to fig. 18, the apparatus includes a processing unit 101, an extracting unit 102, a determining unit 103, and a correcting unit 104.
The processing unit 101 is configured to acquire a target image captured by a camera, and perform de-distortion on the target image based on optical design parameters.
The extraction unit 102 is configured to extract a region of interest satisfying a threshold range of a field of view in an image subjected to de-distortion based on optical design parameters.
The determining unit 103 is configured to determine, in the region of interest, a target angle, where the target angle is an included angle between an imaging sensor of the camera and the target when the camera captures an image of the target.
The correction unit 104 corrects the relative pose between the camera and the target based on the target angle.
In one embodiment, the determining unit 103 determines the target angle based on the region of interest in the following manner: mapping the region of interest to a world coordinate system; determining a target angle between the target image and a target coordinate axis in the world coordinate system based on the target pixel in the region of interest; the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; or the target pixel comprises a pixel in the connecting line direction between the imaging sensor and the target, and the target coordinate axis is the z axis of the world coordinate system.
In another embodiment, the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x-axis of the world coordinate system; or the target pixels comprise pixels in the same column direction in the region of interest, and the target coordinate axis is the y-axis of the world coordinate system. The determining unit 103 determines the target angle between the target image and the target coordinate axis in the world coordinate system based on the target pixels in the region of interest in the following manner: determining a first distance between a first pixel of the target pixels and the target coordinate axis of the world coordinate system; determining a second distance between a second pixel of the target pixels and the target coordinate axis of the world coordinate system; determining a first projection length of the target pixels on the target coordinate axis of the world coordinate system; and performing an arctangent function operation based on the first distance, the second distance, and the first projection length to obtain the target angle.
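For illustration only, one consistent reading of these steps for the x-axis case is angle = arctan((d2 - d1) / L), sketched below; the exact combination of the three quantities is our assumption, since the text names the inputs but not the formula:

```python
import numpy as np

# Hypothetical sketch of the row-direction (x-axis) tilt angle.
def row_tilt_angle(p1, p2):
    """p1, p2: (x, y) world coordinates of two pixels in one row."""
    d1 = p1[1]            # first distance: p1 to the x axis
    d2 = p2[1]            # second distance: p2 to the x axis
    proj = p2[0] - p1[0]  # first projection length on the x axis
    return np.arctan2(d2 - d1, proj)
```

The y-axis case is symmetric, with the distances measured to the y axis and the projection taken along it.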
In another embodiment, the target pixels comprise pixels in the direction of the line between the imaging sensor and the target, and the target coordinate axis is the z-axis of the world coordinate system. The determining unit 103 determines the target angle between the target image and the target coordinate axis in the world coordinate system based on the target pixels in the region of interest in the following manner: acquiring a second projection length of the target pixels in the world coordinate system; determining, as a third distance, the length of the pixels in the region of interest in the world coordinate system in the direction of the line between the imaging sensor and the target; determining the difference between the third distance and the second projection length; determining the ratio of the distance between the imaging sensor and the target to the focal length of the camera to obtain a first ratio; determining the ratio between the difference and the first ratio to obtain a second ratio; and performing an arctangent function operation on the second ratio to obtain the target angle.
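For illustration only, the stated sequence of operations can be written directly; the variable names are assumptions:

```python
import numpy as np

# Hypothetical sketch of the z-axis angle, following the stated order:
# difference = d3 - proj2; first ratio = D / f;
# second ratio = difference / first ratio; angle = arctan(second ratio).
def z_tilt_angle(d3, proj2, D, f):
    """d3: third distance (pixel length along the sensor-target line);
    proj2: second projection length; D: sensor-target distance;
    f: focal length of the camera."""
    difference = d3 - proj2
    first_ratio = D / f
    second_ratio = difference / first_ratio
    return np.arctan(second_ratio)
```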
In another embodiment, the correction unit 104 corrects the relative pose between the camera and the target based on the target angle in the following manner: if the target angle is greater than or equal to a preset angle threshold, the target or the camera is reversely adjusted in the direction of the target coordinate axis. The target angle includes a first angle between the target image and the x-axis direction of the world coordinate system; or a second angle between the target image and the y-axis direction of the world coordinate system; or a third angle between the target image and the z-axis direction of the world coordinate system.
In another embodiment, the correction unit 104 corrects the relative pose between the camera and the target based on the target angle in the following manner: if the target angle is smaller than the preset angle threshold, a rotation matrix is generated based on the target angle, and the target or the camera is adjusted based on the rotation matrix. The target angle includes a first angle between the target image and the x-axis direction of the world coordinate system, a second angle between the target image and the y-axis direction of the world coordinate system, and a third angle between the target image and the z-axis direction of the world coordinate system.
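For illustration only, one conventional way to build such a compensation rotation from the three measured angles is to compose per-axis rotations; the composition order Rz·Ry·Rx below is an assumption, as the text does not fix it:

```python
import numpy as np

# Hypothetical sketch: compose a compensation rotation from the
# first, second, and third angles (about x, y, and z respectively).
def rotation_from_angles(ax, ay, az):
    cx, sx = np.cos(-ax), np.sin(-ax)
    cy, sy = np.cos(-ay), np.sin(-ay)
    cz, sz = np.cos(-az), np.sin(-az)
    r_x = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    r_y = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    r_z = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return r_z @ r_y @ r_x  # applied to the target or camera pose
```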
In another embodiment, before the region of interest meeting the field-of-view threshold range is extracted from the de-distorted image, the apparatus further determines, based on the correspondence between the optical design parameters and the field of view, the field-of-view threshold range within which the distortion corresponding to the optical design parameters is smaller than a set distortion threshold.
The specific manner in which the various units of the apparatus in the above embodiments perform their operations has been described in detail in the embodiments of the method and will not be repeated here.
Fig. 19 is a block diagram illustrating an apparatus 200 for pose correction according to an exemplary embodiment. For example, apparatus 200 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, exercise device, personal digital assistant, or the like.
Referring to fig. 19, the apparatus 200 may include one or more of the following components: a processing component 202, a memory 204, a power component 206, a multimedia component 208, an audio component 210, an input/output (I/O) interface 212, a sensor component 214, and a communication component 216.
The processing component 202 generally controls overall operation of the apparatus 200, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 202 may include one or more processors 220 to execute instructions to perform all or part of the steps of the methods described above. Further, the processing component 202 can include one or more modules that facilitate interactions between the processing component 202 and other components. For example, the processing component 202 may include a multimedia module to facilitate interaction between the multimedia component 208 and the processing component 202.
The memory 204 is configured to store various types of data to support operations at the apparatus 200. Examples of such data include instructions for any application or method operating on the device 200, contact data, phonebook data, messages, pictures, videos, and the like. The memory 204 may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as static random-access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, or a magnetic or optical disk.
The power component 206 provides power to the various components of the device 200. The power components 206 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the device 200.
The multimedia component 208 includes a screen that provides an output interface between the device 200 and the user. In some embodiments, the screen may include a liquid crystal display (LCD) and a touch panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe operation. In some embodiments, the multimedia component 208 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 200 is in an operation mode, such as a photographing mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have focus and optical zoom capability.
The audio component 210 is configured to output and/or input audio signals. For example, the audio component 210 includes a Microphone (MIC) configured to receive external audio signals when the device 200 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may be further stored in the memory 204 or transmitted via the communication component 216. In some embodiments, audio component 210 further includes a speaker for outputting audio signals.
The I/O interface 212 provides an interface between the processing component 202 and peripheral interface modules, which may be a keyboard, a click wheel, buttons, and the like. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 214 includes one or more sensors for providing status assessments of various aspects of the apparatus 200. For example, the sensor assembly 214 may detect the on/off state of the device 200 and the relative positioning of components, such as the display and keypad of the device 200. The sensor assembly 214 may also detect a change in position of the device 200 or of a component of the device 200, the presence or absence of user contact with the device 200, the orientation or acceleration/deceleration of the device 200, and a change in temperature of the device 200. The sensor assembly 214 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact. The sensor assembly 214 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 214 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 216 is configured to facilitate wired or wireless communication between the apparatus 200 and other devices. The device 200 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In one exemplary embodiment, the communication component 216 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 216 further includes a near field communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra-wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 200 may be implemented by one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements for executing the methods described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 204 including instructions executable by the processor 220 of the apparatus 200 to perform the above-described method. For example, the non-transitory computer-readable storage medium may be a ROM, a random-access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
It is understood that the term "plurality" in this disclosure means two or more, and other quantifiers are similar thereto. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean that A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates that the associated objects are in an "or" relationship. The singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It is further understood that the terms "first," "second," and the like are used to describe various information, but such information should not be limited to these terms. These terms are only used to distinguish one type of information from another and do not denote a particular order or importance. Indeed, the expressions "first", "second", etc. may be used entirely interchangeably. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure.
It will be further understood that "connected" includes both direct connection, where no other element is present, and indirect connection, where other elements are present, unless specifically stated otherwise.
It will be further understood that although operations are depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the scope of the appended claims.

Claims (16)

1. A pose correction method, characterized by comprising:
acquiring a target image shot by a camera, and de-distorting the target image based on optical design parameters;
extracting a region of interest meeting a field-of-view threshold range from the de-distorted image;
determining a target angle based on the region of interest, wherein the target angle is an included angle between an imaging sensor of the camera and a target when the camera shoots the target image;
and correcting the relative pose between the camera and the target based on the target angle.
2. The method of claim 1, wherein the determining a target angle based on the region of interest comprises:
mapping the region of interest to a world coordinate system;
determining a target angle between the target image and a target coordinate axis in the world coordinate system based on a target pixel in the region of interest;
the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x axis of the world coordinate system;
or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system;
or the target pixel comprises a pixel in the connecting line direction between the imaging sensor and the target, and the target coordinate axis is the z axis of the world coordinate system.
3. The method of claim 2, wherein the target pixel comprises pixels in a same row direction in the region of interest, the target coordinate axis being an x-axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system;
the determining, based on the target pixels in the region of interest, a target angle between the target image and a target coordinate axis in the world coordinate system includes:
determining a first distance between a first pixel of the target pixels and a target coordinate axis of the world coordinate system;
determining a second distance between a second pixel of the target pixels and a target coordinate axis of the world coordinate system;
determining a first projection length of the target pixel on a target coordinate axis of the world coordinate system;
and performing arctangent function operation based on the first distance, the second distance and the first projection length to obtain a target angle.
4. The method of claim 2, wherein the target pixel comprises a pixel in a direction of a line between the imaging sensor and a target, the target coordinate axis being a z-axis of the world coordinate system;
The determining, based on the target pixels in the region of interest, a target angle between the target image and a target coordinate axis in the world coordinate system includes:
acquiring a second projection length of the target pixel on the world coordinate system;
determining a length of a pixel in the region of interest in a direction of a line between the imaging sensor and a target in the world coordinate system as a third distance;
determining a difference between the third distance and the second projected length;
determining a ratio of a distance between the imaging sensor and a target to a focal length of the camera to obtain a first ratio;
determining a ratio between the difference value and the first ratio to obtain a second ratio;
and performing arctangent function operation on the second ratio to obtain a target angle.
5. The method of any one of claims 1 to 4, wherein correcting the relative pose between the camera and the target based on the target angle comprises:
if a target angle greater than or equal to a preset angle threshold exists, reversely adjusting the target or reversely adjusting the camera in the direction of the target coordinate axis;
wherein the target angle comprises a first angle between the target image and the x-axis direction of the world coordinate system; or a second angle between the target image and the y-axis direction of the world coordinate system; or a third angle between the target image and the z-axis direction of the world coordinate system.
6. The method of any one of claims 1 to 4, wherein correcting the relative pose between the camera and the target based on the target angle comprises:
if the target angle is smaller than a preset angle threshold, generating a rotation matrix based on the target angle, and adjusting the target or the camera based on the rotation matrix;
the target angle includes a first angle between the target image and an x-axis direction of the world coordinate system, a second angle between the target image and a y-axis direction of the world coordinate system, and a third angle between the target image and a z-axis direction of the world coordinate system.
7. The method according to claim 1, wherein the method further comprises:
before extracting the region of interest meeting the field-of-view threshold range from the de-distorted image, determining, based on the correspondence between the optical design parameters and the field of view, the field-of-view threshold range within which the distortion corresponding to the optical design parameters is smaller than a set distortion threshold.
8. A pose correction apparatus, characterized by comprising:
the processing unit is used for acquiring a target image shot by the camera and de-distorting the target image based on optical design parameters;
the extraction unit is used for extracting a region of interest meeting a field-of-view threshold range from the de-distorted image;
the determining unit is used for determining a target angle based on the region of interest, wherein the target angle is an included angle between an imaging sensor of the camera and a target when the camera shoots the target image;
and the correction unit is used for correcting the relative pose between the camera and the target based on the target angle.
9. The apparatus according to claim 8, wherein the determining unit determines the target angle based on the region of interest by:
mapping the region of interest to a world coordinate system;
determining a target angle between the target image and a target coordinate axis in the world coordinate system based on a target pixel in the region of interest;
the target pixels comprise pixels in the same row direction in the region of interest, and the target coordinate axis is the x axis of the world coordinate system; or (b)
The target pixels comprise pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system; or (b)
The target pixel comprises a pixel in the connecting line direction between the imaging sensor and the target, and the target coordinate axis is the z axis of the world coordinate system.
10. The apparatus of claim 9, wherein the target pixel comprises pixels in a same row direction in the region of interest, the target coordinate axis being an x-axis of the world coordinate system; or the target pixel comprises pixels in the same column direction in the region of interest, and the target coordinate axis is the y axis of the world coordinate system;
the determining unit determines a target angle between the target image and a target coordinate axis in the world coordinate system based on the target pixel in the region of interest in the following manner:
determining a first distance between a first pixel of the target pixels and a target coordinate axis of the world coordinate system;
determining a second distance between a second pixel of the target pixels and a target coordinate axis of the world coordinate system;
determining a first projection length of the target pixel on a target coordinate axis of the world coordinate system;
And performing arctangent function operation based on the first distance, the second distance and the first projection length to obtain a target angle.
11. The apparatus of claim 9, wherein the target pixel comprises a pixel in a direction of a line between the imaging sensor and a target, the target coordinate axis being a z-axis of the world coordinate system;
the determining unit determines a target angle between the target image and a target coordinate axis in the world coordinate system based on the target pixel in the region of interest in the following manner:
acquiring a second projection length of the target pixel on the world coordinate system;
determining a length of a pixel in the region of interest in a direction of a line between the imaging sensor and a target in the world coordinate system as a third distance;
determining a difference between the third distance and the second projected length;
determining a ratio of a distance between the imaging sensor and a target to a focal length of the camera to obtain a first ratio;
determining a ratio between the difference value and the first ratio to obtain a second ratio;
and performing arctangent function operation on the second ratio to obtain a target angle.
12. The apparatus according to any one of claims 8 to 11, wherein the correction unit corrects the relative pose between the camera and the target based on the target angle in the following manner:
if a target angle greater than or equal to a preset angle threshold exists, reversely adjusting the target or reversely adjusting the camera in the direction of the target coordinate axis;
wherein the target angle comprises a first angle between the target image and the x-axis direction of the world coordinate system; or a second angle between the target image and the y-axis direction of the world coordinate system; or a third angle between the target image and the z-axis direction of the world coordinate system.
13. The apparatus according to any one of claims 8 to 11, wherein the correction unit corrects the relative pose between the camera and the target based on the target angle in the following manner:
if the target angle is smaller than a preset angle threshold, generating a rotation matrix based on the target angle, and adjusting the target or the camera based on the rotation matrix;
the target angle includes a first angle between the target image and an x-axis direction of the world coordinate system, a second angle between the target image and a y-axis direction of the world coordinate system, and a third angle between the target image and a z-axis direction of the world coordinate system.
14. The apparatus of claim 8, wherein before the region of interest meeting the field-of-view threshold range is extracted from the de-distorted image, the apparatus is further configured to determine, based on the correspondence between the optical design parameters and the field of view, the field-of-view threshold range within which the distortion corresponding to the optical design parameters is smaller than a set distortion threshold.
15. A pose correction apparatus, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the pose correction method according to any one of claims 1 to 7.
16. A storage medium having instructions stored therein which, when executed by a processor of a terminal, enable the terminal to perform the pose correction method according to any one of claims 1 to 7.
CN202210559310.1A 2022-05-18 2022-05-18 Pose correction method, pose correction device and storage medium Pending CN117132657A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210559310.1A CN117132657A (en) 2022-05-18 2022-05-18 Pose correction method, pose correction device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210559310.1A CN117132657A (en) 2022-05-18 2022-05-18 Pose correction method, pose correction device and storage medium

Publications (1)

Publication Number Publication Date
CN117132657A 2023-11-28

Family

ID=88861558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210559310.1A Pending CN117132657A (en) 2022-05-18 2022-05-18 Pose correction method, pose correction device and storage medium

Country Status (1)

Country Link
CN (1) CN117132657A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination