CN113763477B - Camera point calibration method, device, equipment and medium - Google Patents


Info

Publication number: CN113763477B (application CN202010495680.4A)
Other versions: CN113763477A
Original language: Chinese (zh)
Inventor: 黄黎滨
Assignee: Zhejiang Uniview Technologies Co Ltd
Application filed by Zhejiang Uniview Technologies Co Ltd
Priority to CN202010495680.4A
Publication of CN113763477A
Application granted; publication of CN113763477B
Legal status: Active


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 — Image analysis
    • G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a camera point calibration method, device, equipment and medium, wherein the method comprises the following steps: acquiring a first target image and a second target image acquired by a target camera, together with a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image; determining, by using the first target image and the second target image, a first distance from the target object to the target camera when the target object is at the first acquisition position and a second distance from the target object to the target camera when the target object is at the second acquisition position; and determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance. The method can perform point location calibration automatically, which speeds up point location calibration and improves point location calibration efficiency.

Description

Camera point calibration method, device, equipment and medium
Technical Field
The present disclosure relates to the field of data processing technologies, and in particular, to a method, an apparatus, a device, and a medium for calibrating a camera point.
Background
With the widespread use of map applications, the workload of marking points on maps keeps increasing. The currently common point marking method is mainly manual marking: each point in the map is usually obtained first and then marked directly on the map. However, as the number of points increases, the labeling workload multiplies; in particular, after points are added, the visible field of each point also needs to be configured, which further increases the manual labeling workload. In addition, manual marking is slow and labor-intensive, so point calibration efficiency is low.
Disclosure of Invention
In view of this, the purpose of the present application is to provide a camera point calibration method, apparatus, device and medium, which can perform point calibration automatically and improve point calibration efficiency. The specific scheme is as follows:
in a first aspect, the application discloses a camera point calibration method, including:
acquiring a first target image and a second target image acquired by a target camera, and a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired;
determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the first target image and the second target image;
and determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance.
Optionally, the camera point calibration method further includes:
acquiring the height information of the target object;
correspondingly, the determining, by using the first target image and the second target image, a first distance from the target object to the target camera when the target object is at the first acquisition position and a second distance from the target object to the target camera when the target object is at the second acquisition position includes:
and determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the height information, the first target image and the second target image.
Optionally, the determining, using the height information, the first target image and the second target image, a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position includes:
determining a first distance from the target object to the target camera at a first acquisition position by using a lens imaging principle, the height information and the first target image;
And determining a second distance from the target object to the target camera at a second acquisition position by using a lens imaging principle, the height information and the second target image.
Optionally, the determining, by using the height information and the first target image, a first distance from the target object to the target camera when the target object is at the first acquisition position includes:
and if n distances are determined by using the height information and the first target image, determining one distance from the n distances according to a preset distance determination rule as the first distance.
Optionally, the determining the camera point of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance includes:
determining a first arc using the first acquisition location and the first distance;
determining a second arc using the second acquisition location and the second distance;
and determining an intersection point of the first arc and the second arc as a camera point position of the target camera.
Optionally, the determining the intersection point of the first arc and the second arc as the camera point of the target camera includes:
And if the number of the intersection points of the first circular arc and the second circular arc is 2, determining the camera point position of the target camera according to the position change of the target object in the first target image and the second target image.
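The arc-intersection step described above can be sketched as plain circle–circle intersection; this is an illustrative sketch, not the patent's implementation, and all names and the example coordinates are assumptions:

```python
import math

def circle_intersections(p0, r0, p1, r1):
    """Return the intersection points of two circles.

    p0, p1: (x, y) centers -- here the two acquisition positions.
    r0, r1: radii -- here the first and second distances to the camera.
    """
    x0, y0 = p0
    x1, y1 = p1
    d = math.hypot(x1 - x0, y1 - y0)
    # No intersection if the circles are too far apart, nested, or coincident.
    if d > r0 + r1 or d < abs(r0 - r1) or d == 0:
        return []
    a = (r0**2 - r1**2 + d**2) / (2 * d)   # distance from p0 to the chord midpoint
    h = math.sqrt(max(r0**2 - a**2, 0.0))  # half-length of the common chord
    xm = x0 + a * (x1 - x0) / d
    ym = y0 + a * (y1 - y0) / d
    if h == 0:                             # circles touch at a single point
        return [(xm, ym)]
    return [(xm + h * (y1 - y0) / d, ym - h * (x1 - x0) / d),
            (xm - h * (y1 - y0) / d, ym + h * (x1 - x0) / d)]

# Example: acquisition positions M=(0,0) and N=(10,0) with distances 8 and 6;
# two candidate camera points are returned, matching the two-intersection case.
points = circle_intersections((0, 0), 8.0, (10, 0), 6.0)
```

When two points are returned, one of them must still be selected — in the method above, from the direction of the target object's position change between the two images.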
Optionally, after determining the camera point of the target camera according to the first acquisition position, the second acquisition position, the first distance, and the second distance, the method further includes:
if the target camera is a zoom fixed camera or a fixed-focus fixed camera, determining a point position visual field of the target camera by utilizing the first target image and the first distance;
and/or if the target camera is a zoom fixed camera or a fixed focus fixed camera, determining a point location visual field of the target camera by using the second target image and the second distance.
In a second aspect, the application discloses a camera point calibration device, including:
the information acquisition module is used for acquiring a first target image and a second target image acquired by a target camera, a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired;
The distance determining module is used for determining, by using the first target image and the second target image, a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position;
and the camera point position determining module is used for determining the camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance.
In a third aspect, the application discloses a camera point calibration equipment, comprising:
a memory and a processor;
wherein the memory is used for storing a computer program;
the processor is used for executing the computer program to implement the camera point calibration method disclosed above.
In a fourth aspect, the application discloses a computer readable storage medium for storing a computer program, wherein the computer program when executed by a processor implements the camera point calibration method disclosed above.
Therefore, in the application, the first target image and the second target image acquired by the target camera are acquired first, together with the first acquisition position corresponding to the first target image and the second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired; then, using the first target image and the second target image, a first distance from the target object to the target camera at the first acquisition position and a second distance from the target object to the target camera at the second acquisition position are determined; and the camera point position of the target camera is determined according to the first acquisition position, the second acquisition position, the first distance and the second distance. In this way, the distances from two different positions to the target camera can be determined from two images, acquired by the target camera at those positions and at least comprising the same target object, and the camera point position is then determined from the two distances, so that the point position calibration of the camera is completed; the point position calibration can thus be carried out automatically, which speeds up point position calibration and improves point position calibration efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application or in the background art, the drawings needed in the description of the embodiments or the background art are briefly introduced below. It is obvious that the drawings in the following description are only embodiments of the present application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
FIG. 1 is a flow chart of a camera point calibration method disclosed in the present application;
FIG. 2 is a flowchart of a specific camera point calibration method disclosed in the present application;
FIG. 3 is a schematic diagram of a lens imaging principle disclosed in the present application;
fig. 4 is a schematic diagram of an imaging principle of a zoom pan-tilt camera disclosed in the present application;
FIG. 5 is a schematic diagram of a zoom fixed camera imaging principle disclosed in the present application;
FIG. 6 is a schematic view of a zoom fixed camera imaging disclosed herein;
FIG. 7 is a schematic diagram of the zoom fixed camera imaging principle taking the horizontal imaging offset into account as disclosed herein;
fig. 8 is a schematic diagram of an imaging principle of a fixed-focus fixed camera disclosed in the present application;
fig. 9 is a schematic diagram of a principle of determining a camera point location disclosed in the present application;
FIG. 10 is a schematic illustration of a change in position from right to left for a calibrator as disclosed herein;
FIG. 11 is a schematic illustration of the position of a calibrator as disclosed herein as varying from left to right;
FIG. 12 is a schematic view of a camera point calibration device disclosed in the present application;
fig. 13 is a schematic structural diagram of a camera point calibration device disclosed in the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by one of ordinary skill in the art without undue burden from the present disclosure, are within the scope of the present disclosure.
At present, the commonly used point marking method is mainly manual marking: each point in the map is usually obtained first and then marked directly on the map. However, as the number of points increases, the labeling workload multiplies; in particular, after points are added, the visible field of each point also needs to be configured, which further increases the manual labeling workload. In addition, manual marking is slow and labor-intensive, so point calibration efficiency is low.
Referring to fig. 1, an embodiment of the application discloses a camera point calibration method, which includes:
step S11: the method comprises the steps of acquiring a first target image and a second target image acquired by a target camera, a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired.
In a specific implementation process, the first target image and the second target image acquired by the target camera need to be acquired first, together with the first acquisition position corresponding to the first target image and the second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and, similarly, the second acquisition position is the position of the target object when the second target image is acquired. For example, the first target image and the second target image both include a calibrator 1, the first target image being an image acquired by the target camera when the calibrator 1 is at a position 1, and the second target image being an image acquired by the target camera when the calibrator 1 is at a position 2. The first acquisition position and the second acquisition position may be positions acquired with a positioning device, e.g., a GPS (Global Positioning System) based positioning device, a BeiDou system based positioning device, etc.
Step S12: and determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the first target image and the second target image.
After the first target image and the second target image are acquired, a first distance from the target camera when the target object is at the first acquisition position and a second distance from the target camera when the target object is at the second acquisition position can be determined by using the first target image and the second target image.
In an actual application, the determining, by using the first target image and the second target image, a first distance from the target object to the target camera when the target object is at the first acquisition position and a second distance from the target object to the target camera when the target object is at the second acquisition position includes: determining a first distance from the target object to the target camera at a first acquisition position by utilizing a lens imaging principle and the first target image; and determining a second distance from the target object to the target camera at a second acquisition position by using a lens imaging principle and the second target image.
Step S13: and determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance.
After determining the first distance and the second distance, a camera point of the target camera may be determined according to the first acquisition position, the second acquisition position, the first distance, and the second distance.
Therefore, in this embodiment, the first target image and the second target image acquired by the target camera are acquired first, together with the first acquisition position corresponding to the first target image and the second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired; then, using the first target image and the second target image, a first distance from the target object to the target camera at the first acquisition position and a second distance from the target object to the target camera at the second acquisition position are determined; and the camera point position of the target camera is determined according to the first acquisition position, the second acquisition position, the first distance and the second distance. In this way, the distances from two different positions to the target camera can be determined from two images, acquired by the target camera at those positions and at least comprising the same target object, and the camera point position is then determined from the two distances, so that the point position calibration of the camera is completed; the point position calibration can thus be carried out automatically, which speeds up point position calibration and improves point position calibration efficiency.
Referring to fig. 2, an embodiment of the present application discloses a specific camera point calibration method, which includes:
step S21: the method comprises the steps of acquiring a first target image and a second target image acquired by a target camera, a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired.
Step S22: and acquiring the height information of the target object.
It will be appreciated that the height information of the target object also needs to be obtained, so that the first distance from the target object to the target camera at the first acquisition position and the second distance from the target object to the target camera at the second acquisition position can be determined by using the height information, the first target image and the second target image. When the target object is a calibrator, the height information is the height of the calibrator.
Step S23: and determining a first distance from the target object to the target camera at a first acquisition position by using a lens imaging principle, the height information and the first target image.
After the first target image, the first acquisition position and the height information are acquired, the height information, the first target image and the second target image can be used to determine the first distance from the target object to the target camera at the first acquisition position and the second distance from the target object to the target camera at the second acquisition position. Specifically, the lens imaging principle, the height information and the first target image may be used to determine the first distance from the target object to the target camera when the target object is at the first acquisition position, wherein the first distance is a horizontal distance. The determining, by using the height information and the first target image, the first distance from the target object to the target camera when the target object is at the first acquisition position includes: if n distances are determined by using the height information and the first target image, determining one distance from the n distances as the first distance according to a preset distance determination rule. The preset distance determination rule may be set according to the actual situation; for example, the distance needs to be greater than or equal to 0, and the installation angle of the camera may also be taken into account. In the following description, a calibrator is taken as the target object.
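A hedged sketch of one possible preset distance-determination rule — the text only requires non-negativity and consideration of the installation angle, so the concrete rule below (prefer the root implying an installation angle below 45 degrees, i.e. L > H) is an illustrative assumption:

```python
def pick_distance(candidates, H):
    """Pick one distance from n candidate distances.

    Drops non-positive roots and, following the worked example's reasoning,
    prefers a root whose implied installation angle is below 45 degrees:
    tan(theta) = H / L < 1, i.e. L > H (H is the camera mounting height).
    """
    valid = [L for L in candidates if L > 0]
    below_45 = [L for L in valid if L > H]
    return max(below_45 or valid)

# Two candidate roots as in the later worked example, camera height H = 300 cm.
chosen = pick_distance([726.0, 126.0], 300.0)
```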
Referring to fig. 3, a schematic diagram of the lens imaging principle is shown. In the figure, h represents the height of the target object, h' represents the height of the target object in the image, f represents the focal length of the target camera, u represents the object distance from the target object to the target camera, and v represents the image distance. The lens imaging principle can then be expressed as:
1/f = 1/u + 1/v
Then, using the similar triangles h/h' = u/v, it follows that:
u = f·(1 + h/h')
Once the distance u is obtained, the distance between the imaged object and the corresponding camera is known.
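As a short numeric sketch of the thin-lens distance relation used here (the function name and example values are illustrative assumptions, chosen to match the magnitudes used later in the text):

```python
# Thin-lens distance estimate: 1/f = 1/u + 1/v together with the
# similar-triangle ratio h/h' = u/v gives u = f * (1 + h/h').
def object_distance(f_cm: float, h_cm: float, h_img_cm: float) -> float:
    """Distance from the object to the lens; all lengths in cm."""
    return f_cm * (1.0 + h_cm / h_img_cm)

# Example: f = 0.5 cm, calibrator height h = 170 cm, imaged height h' = 0.1 cm.
u = object_distance(0.5, 170.0, 0.1)   # 0.5 * (1 + 1700) = 850.5 cm
```

Since h/h' is on the order of 1000, the `1 +` term contributes only the focal length itself, which is why it is dropped in the angled-camera derivation below.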
When the target camera is a zoom pan-tilt camera, an imaging schematic diagram can be seen in fig. 4, where L' represents the linear distance between the zoom pan-tilt camera and the target object, θ represents the installation angle of the zoom pan-tilt camera, H represents the vertical distance between the zoom pan-tilt camera and the ground, L represents the first distance (the horizontal distance) between the zoom pan-tilt camera and the target object, and h represents the height of the target object. Since the zoom pan-tilt camera has an installation angle θ, the actual height of the target object used for imaging is h·cosθ, and since the ratio h/h' is greater than 1000, the 1 in (1 + h/h') can be ignored, so the above formula gives:
L' ≈ f·h·cosθ/h'
Substituting cosθ = L/L' and L'² = L² + H² yields the quadratic equation L² − (f·h/h')·L + H² = 0, whose roots follow from the quadratic formula:
L = [f·h/h' ± √((f·h/h')² − 4H²)] / 2
When the first distance between the target object and the target camera at the first acquisition position is obtained by using the formula, the first distance needs to be determined according to the actual conditions. For example, assume the camera mounting height H = 300 CM, the camera focal length f = 0.5 CM, the calibrator height h = 170 CM, and the imaging size h' = (person pixel value in the picture / total pixel value) × CCD size; assuming a person pixel value of 300, a total pixel value of 1080 and a 1/3-inch CCD, h' = 0.1 CM is obtained. Substituting these assumed values into the root formula (with all calculation units unified to CM for convenience) gives two values for the distance L between the zoom pan-tilt camera and the calibrator, L = 726 CM and L = 126 CM. As can be seen from the diagram, these two lengths correspond to the two cases of the installation angle θ being smaller than 45 degrees and greater than 45 degrees; considering the actual situation, the value for an angle smaller than 45 degrees is taken, that is, L = 726 CM.
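The worked example above can be checked with a short sketch that solves the reconstructed quadratic L² − (f·h/h')·L + H² = 0 (names are illustrative; the patent reports the two roots as roughly 726 CM and 126 CM):

```python
import math

def horizontal_distances(f, h, h_img, H):
    """Solve L^2 - (f*h/h_img)*L + H^2 = 0 for the two candidate distances."""
    k = f * h / h_img                 # the f*h/h' coefficient
    disc = k * k - 4.0 * H * H
    if disc < 0:
        return []                     # no real solution for this geometry
    root = math.sqrt(disc)
    return [(k + root) / 2.0, (k - root) / 2.0]

# H = 300 cm, f = 0.5 cm, h = 170 cm, h' = 0.1 cm, as in the example above.
cands = horizontal_distances(0.5, 170.0, 0.1, 300.0)
# The larger root corresponds to an installation angle below 45 degrees
# and comes out at about 726 cm.
```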
When the target camera is a zoom fixed camera, the imaging schematic diagram can be seen in fig. 5. Since the zoom fixed camera has an installation angle θ and the target object, i.e., the calibrator, is generally not located on the center of the optical axis, the specific imaging is as shown in fig. 6, where a' represents the vertical distance of the calibrator from the center point in the image, and b' represents the horizontal distance of the calibrator from the center point in the image. Referring to fig. 5, assume for the calculation that the calibrator moves in the vertical direction to the center of the optical axis, where the moving distance is a and the vertical moving distance is d; L' represents the linear distance between the zoom fixed camera and the position of the target object after moving, θ represents the installation angle of the zoom fixed camera, H represents the vertical distance between the zoom fixed camera and the ground, L represents the horizontal distance between the zoom fixed camera and the position of the target object after moving, h represents the height information of the target object, and L1 represents the first distance between the zoom fixed camera and the position of the target object before moving.
Since a' and h' are imaged at the same scale on the focal plane, a'/h' = d/(h·cosθ), so that the vertical movement distance is d = a'·h·cosθ/h'. Combining this with the object imaging formula and the similar-triangle equal-ratio relations used above yields a quadratic equation in d, which is again solved by the quadratic formula.
When the first distance between the target object and the target camera at the first acquisition position is obtained by using the above formula, the first distance needs to be determined according to the actual conditions. For example, assume the camera mounting height H = 300 CM, the camera focal length f = 0.5 CM, the calibrator height h = 170 CM, and the imaging size h' = (person pixel value in the picture / total pixel value) × CCD size; assuming a person pixel value of 300, a total pixel value of 1080 and a 1/3-inch CCD, h' = 0.1 CM is obtained. Assuming the longitudinal height of the calibrator from the center in the picture is a' = 0.01 CM and substituting these assumed values into the root formula (with all calculation units unified to CM for convenience), two values of d are obtained after calculation, one positive and one negative; taking the positive number gives d = 14.67 CM, from which a = 15.792 CM and L = 789.6 CM are calculated, and the first distance L1 between the person and the camera is then obtained from L and the movement distance a.
When the target camera is a zoom fixed camera and the horizontal offset also needs to be considered, the imaging schematic diagram is shown in fig. 7, where L2 represents the actual first distance from the person to the camera when the horizontal offset is considered, and α represents the horizontal included angle of the camera. OA represents the distance from the center point of the camera picture to the snapshot point, denoted as w, and w' represents the imaging of w in the camera, where w' = b'. The actual first distance L2 from the person to the camera when the horizontal offset is considered is then obtained from L1 and the horizontal included angle α.
When the target camera is a fixed-focus fixed camera, the imaging principle is as shown in fig. 8. If the calibrator stands at the point B, a perfectly sharp image cannot be obtained on the imaging plane because the calibrator is not on the focusing plane of the camera; however, because of the depth of field, the imaging is clear within a certain range, and if the picture were unclear, the calibrator could not be identified at all. Provided that the calibrator can be identified, the height h' obtained at the point C on the imaging plane is slightly enlarged, so a distance calculated from h' is slightly inaccurate. The actual true imaging position should be at the point D, where the imaging height is h''; v denotes the image distance when imaging at the point C, and v1 denotes the image distance when imaging at the point D.
From the simplified imaging formula, u ≈ f·h/h'', and according to the similar triangles, h'/h'' = v/v1, where only h' can be measured from the picture; therefore, the exact object distance cannot be calculated without knowing the actual image distances inside the camera.
However, the resulting error can be estimated. Assume that the height of the calibrator is 170 CM, the focal length of the lens is 0.5 CM, and the point E, at a distance of 800 CM from the camera, lies on the sharpest focusing plane, so that the image distance v at the point C can be calculated. When the calibrator comes to the point B, at a distance of 100 CM from the point E, the imaging height of the calibrator at the point D can be calculated from lens imaging to be 0.094 CM; from this, the image distance v1 at the point D can be calculated, and then, from the similar triangles, the imaging height of the calibrator at the point C can be calculated to be 0.09400653186 CM. If the first distance between the calibrator and the camera is calculated in the zoom fixed camera manner, the imaging height of the calibrator at the point C is substituted into the formula, which gives a distance of 904 CM between the calibrator and the camera; compared with the actual distance of 900 CM at the point B, this is only 4 CM more, an error within the acceptable range. Therefore, as long as the calibrator can be recognized by the intelligent camera, i.e., the definition is sufficient, the calculation error within a reasonable depth-of-field range is relatively small, and the fixed-focus fixed camera can calculate the first distance in the same way as the zoom fixed camera.
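The 904 CM figure in the example above can be reproduced by plugging the slightly enlarged imaging height at the point C back into the simplified distance formula u ≈ f·h/h'; the numbers below are taken from the example in the text, and the variable names are illustrative:

```python
# Simplified distance formula u ~= f*h/h' applied with the imaging height
# obtained at the point C, compared against the true 900 cm distance at B.
f = 0.5                 # lens focal length, cm
h = 170.0               # calibrator height, cm
h_c = 0.09400653186     # imaging height at the point C, cm (per the example)
u_est = f * h / h_c     # estimated distance, cm -- about 904 cm
error = u_est - 900.0   # overestimate caused by defocus, about 4 cm
```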
Step S24: and determining a second distance from the target object to the target camera at a second acquisition position by using a lens imaging principle, the height information and the second target image.
After the first distance is obtained, a second distance from the target object to the target camera at the second acquisition position is determined by using the lens imaging principle, the height information and the second target image. The calculation may follow the process of determining the first distance, which is not repeated here.
Step S25: and determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance.
After the first distance and the second distance are obtained, the camera point position of the target camera can be determined from the first acquisition position, the second acquisition position, the first distance and the second distance. Specifically, this includes: determining a first arc using the first acquisition position and the first distance; determining a second arc using the second acquisition position and the second distance; and determining an intersection point of the first arc and the second arc as the camera point position of the target camera. If the number of intersection points of the first arc and the second arc is 2, the camera point position is determined according to the position change of the target object between the first target image and the second target image. Referring to fig. 9, a schematic diagram of the camera point position determination principle is shown. Point M is the first acquisition position with corresponding first distance L3, and point N is the second acquisition position with corresponding second distance L4. A circle is drawn with M as the center and L3 as the radius, and another with N as the center and L4 as the radius; the camera then lies at an intersection of the two circles, i.e., at point I or point K. The camera point position can now be determined from the position change of the target object, which moves from left to right, between the first and second target images: if the change is as shown in fig. 10, the camera point position is point I; if it is as shown in fig. 11, the camera point position is point K. In addition, if the target moves from near to far or from far to near, the judgment can be made according to the size of the captured human body.
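The two-circle construction of fig. 9 can be sketched numerically. This is an illustrative sketch (the function name and example coordinates are ours, not the patent's): given the two acquisition positions and the two measured distances, the candidate camera points are the intersections of the two circles, and the true one is then picked from the target's motion between the two images.

```python
import math

def circle_intersections(m, r1, n, r2):
    """Intersection points of circles (center m, radius r1) and (center n, radius r2).

    m, n are (x, y) tuples; returns a list of 0, 1 or 2 points.
    """
    dx, dy = n[0] - m[0], n[1] - m[1]
    d = math.hypot(dx, dy)
    if d == 0 or d > r1 + r2 or d < abs(r1 - r2):
        return []                      # no intersection (or concentric circles)
    # Distance from m to the chord midpoint along the line of centers.
    a = (r1 * r1 - r2 * r2 + d * d) / (2 * d)
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))
    xm, ym = m[0] + a * dx / d, m[1] + a * dy / d   # chord midpoint
    if h == 0:
        return [(xm, ym)]              # tangent circles: a single point
    ox, oy = -dy * h / d, dx * h / d   # offset perpendicular to MN
    return [(xm + ox, ym + oy), (xm - ox, ym - oy)]

# Hypothetical example: M = (0, 0) with L3 = 5, N = (6, 0) with L4 = 5
# yields two symmetric candidates, corresponding to points I and K in fig. 9.
candidates = circle_intersections((0.0, 0.0), 5.0, (6.0, 0.0), 5.0)
print(candidates)
```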
Step S26: and if the target camera is a zooming fixed camera or a fixed focus fixed camera, determining the point position visual field of the target camera by utilizing the first target image and the first distance.
After the camera point position of the target camera is determined, if the target camera is a zoom fixed camera or a fixed-focus fixed camera, the point position visual field of the target camera is determined by using the first target image and the first distance, and/or by using the second target image and the second distance; either pair suffices on its own. In practical application, considering that certain errors exist in determining the first distance and the second distance, a first visual field may be determined from the first target image and the first distance, a second visual field from the second target image and the second distance, and the average of the two taken as the visual field of the target camera.
The visual field may be represented by the angle α in fig. 7 together with the installation angle θ in fig. 5, the included angle between the camera and the horizontal. For a zoom pan-tilt camera, which can rotate to different angles, no further determination of the field of view is required.
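One plausible reading of "determining the point position visual field from a target image and a distance" is a pinhole-model angle estimate. The sketch below is our assumption, not the patent's formula: if a calibrator of known height h occupies h_px of an H_px-tall image at distance d, the frame's vertical coverage, and hence the full vertical viewing angle, can be approximated, and the two per-image estimates averaged as the embodiment suggests.

```python
import math

def vertical_fov(h_cm, h_px, H_px, d_cm):
    """Approximate full vertical field of view (degrees) under a pinhole model.

    h_cm: real height of the calibrator; h_px: its height in pixels;
    H_px: image height in pixels; d_cm: distance from camera to calibrator.
    Assumes the calibrator spans h_px/H_px of the frame's vertical coverage.
    """
    frame_height_cm = h_cm * H_px / h_px          # scene height covered by the frame
    return math.degrees(2.0 * math.atan(frame_height_cm / (2.0 * d_cm)))

# Hypothetical measurements: averaging the estimate from each target image
# damps the error in the individual first and second distances.
fov1 = vertical_fov(170.0, 400.0, 1080.0, 900.0)
fov2 = vertical_fov(170.0, 420.0, 1080.0, 860.0)
fov = (fov1 + fov2) / 2.0
print(round(fov, 2))
```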
Thus, without manually adding map point positions, a calibrator holding a terminal device with a position-reporting function simply walks around within the shooting range of the cameras to be added, and point positions can be added automatically. When a plurality of cameras participate in image acquisition, the system traverses all camera point positions; if the calibrator has not been captured by some camera, the system raises an alarm in time and notifies the calibrator to go and complete the acquisition, so that missed calibration is avoided.
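The missed-calibration check described above can be sketched as a simple traversal; the camera identifiers and data shapes here are hypothetical, not from the patent:

```python
def find_missed_cameras(all_cameras, captures):
    """Return the cameras that never captured the calibrator.

    all_cameras: iterable of camera IDs to be calibrated.
    captures: mapping from camera ID to its list of capture records.
    """
    return [cam for cam in all_cameras if not captures.get(cam)]

# Hypothetical example: cam_3 never saw the calibrator, so the system would
# raise an alarm and direct the calibrator into that camera's shooting range.
cameras = ["cam_1", "cam_2", "cam_3"]
captures = {"cam_1": ["img_a", "img_b"], "cam_2": ["img_c"]}
missed = find_missed_cameras(cameras, captures)
print(missed)   # → ['cam_3']
```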
Referring to fig. 12, an embodiment of the present application discloses a camera point calibration device, including:
the information acquisition module 11 is configured to acquire a first target image, a second target image, a first acquisition position corresponding to the first target image, and a second acquisition position corresponding to the second target image, where the first target image and the second target image at least include the same target object, the first acquisition position is a position where the target object is located when the first target image is acquired, and the second acquisition position is a position where the target object is located when the second target image is acquired;
a distance determining module 12, configured to determine a first distance from the target camera when the target object is at the first acquisition position and a second distance from the target camera when the target object is at the second acquisition position using the first target image and the second target image;
the camera point position determining module 13 is configured to determine a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance, and the second distance.
Therefore, the first target image, the second target image, the first acquisition position corresponding to the first target image and the second acquisition position corresponding to the second target image are acquired first, wherein the first target image and the second target image at least include the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired. Then, the first distance from the target object to the target camera at the first acquisition position and the second distance at the second acquisition position are determined by using the first target image and the second target image, and the camera point position of the target camera is determined according to the first acquisition position, the second acquisition position, the first distance and the second distance. In this way, from two images that are acquired by the target camera at two different positions and at least include the same target object, the distances from the two corresponding positions to the target camera can be determined, and the camera point position is then determined from the two distances. Point position calibration is thus completed automatically, which increases the calibration speed and improves the calibration efficiency.
Specifically, the camera point calibration device further comprises:
the height information acquisition module is used for acquiring the height information of the target object;
accordingly, the distance determining module 12 is configured to determine, using the altitude information, the first target image and the second target image, a first distance from the target camera when the target object is at the first acquisition position and a second distance from the target camera when the target object is at the second acquisition position.
Specifically, the distance determining module 12 is configured to determine, using a lens imaging principle, the height information, and the first target image, a first distance from the target object to the target camera when the target object is at the first acquisition position; and determining a second distance from the target object to the target camera at a second acquisition position by using a lens imaging principle, the height information and the second target image.
Further, the distance determining module 12 is configured to determine, when determining n distances by using the altitude information and the first target image, one distance from the n distances according to a preset distance determining rule as the first distance.
Further, the camera point position determining module 13 includes:
a first determining unit configured to determine a first arc using the first acquisition position and the first distance;
a second determining unit configured to determine a second arc using the second acquisition position and the second distance;
and a third determining unit, configured to determine an intersection point of the first arc and the second arc as a camera point of the target camera.
Further, the third determining unit is specifically configured to: and when the number of intersection points of the first circular arc and the second circular arc is 2, determining the camera point position of the target camera according to the position change of the target object in the first target image and the second target image.
Further, the camera point calibration device further comprises:
the visual field determining module is used for determining the point location visual field of the target camera by utilizing the first target image and the first distance when the target camera is a zooming fixed camera or a fixed-focus fixed camera; and/or when the target camera is a zooming fixed camera or a fixed-focus fixed camera, determining the point position visible area of the target camera by utilizing the second target image and the second distance.
Referring to fig. 13, a schematic structural diagram of a camera point calibration device 20 according to an embodiment of the present application is shown, where the camera point calibration device may specifically be, but is not limited to, a tablet computer, a notebook computer, a desktop computer, or the like.
In general, the camera point calibration apparatus 20 in the present embodiment includes: a processor 21 and a memory 22.
Processor 21 may include one or more processing cores, such as a four-core processor, an eight-core processor, or the like. The processor 21 may be implemented in at least one hardware form selected from DSP (digital signal processing), FPGA (field-programmable gate array) and PLA (programmable logic array). The processor 21 may also include a main processor and a coprocessor; the main processor, also called a CPU (central processing unit), processes data in the awake state, while the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 21 may be integrated with a GPU (graphics processing unit), which takes care of rendering and drawing the images that the display screen needs to display. In some embodiments, the processor 21 may include an AI (artificial intelligence) processor for processing computing operations related to machine learning.
Memory 22 may include one or more computer-readable storage media, which may be non-transitory. Memory 22 may also include high-speed random access memory as well as non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In this embodiment, the memory 22 is at least used for storing a computer program 221 which, when loaded and executed by the processor 21, implements the camera point calibration method steps disclosed in any of the foregoing embodiments. In addition, the resources stored in the memory 22 may also include an operating system 222, data 223, and the like, and the storage may be transient or permanent. The operating system 222 may be Windows, Unix, Linux, or the like. The data 223 may include a variety of data.
In some embodiments, the camera point calibration device 20 may further include a display screen 23, an input/output interface 24, a communication interface 25, a sensor 26, a power supply 27, and a communication bus 28.
It will be appreciated by those skilled in the art that the configuration shown in FIG. 13 is not limiting of the camera point calibration device 20 and may include more or fewer components than illustrated.
Further, the embodiment of the application also discloses a computer readable storage medium for storing a computer program, wherein the computer program is executed by a processor to implement the camera point calibration method disclosed in any of the foregoing embodiments.
The specific process of the camera point calibration method may refer to the corresponding content disclosed in the foregoing embodiment, and will not be described herein.
In this specification, the embodiments are described in a progressive manner, each embodiment focusing on its differences from the others; for the same or similar parts between embodiments, reference may be made to one another. Since the device disclosed in an embodiment corresponds to the method disclosed in the same embodiment, its description is relatively brief, and the relevant points can be found in the description of the method section.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software modules may be disposed in Random Access Memory (RAM), memory, read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
Finally, it is further noted that relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by "comprising a … …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
The above describes in detail a camera point calibration method, apparatus, device and medium. Specific examples are applied herein to illustrate the principles and embodiments of the present application, and the above description of the embodiments is only intended to help understand the method and core ideas of the present application. Meanwhile, those skilled in the art may make modifications to the specific embodiments and the application scope in accordance with the ideas of the present application; in view of the above, this description should not be construed as limiting the present application.

Claims (9)

1. A camera point calibration method, characterized by comprising the following steps:
acquiring a first target image and a second target image acquired by a target camera, and a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired;
determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the first target image and the second target image;
determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance;
wherein the method further comprises:
acquiring the height information of the target object;
the determining, using the first target image and the second target image, a first distance from the target object to the target camera at a first acquisition position and a second distance from the target object to the target camera at a second acquisition position includes:
and determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the height information, the first target image and the second target image.
2. The method of calibrating a camera point according to claim 1, wherein determining a first distance from the target camera when the target object is at a first acquisition position and a second distance from the target camera when the target object is at a second acquisition position using the height information, the first target image, and the second target image includes:
determining a first distance from the target object to the target camera at a first acquisition position by using a lens imaging principle, the height information and the first target image;
and determining a second distance from the target object to the target camera at a second acquisition position by using a lens imaging principle, the height information and the second target image.
3. The method for calibrating a camera point according to claim 2, wherein determining a first distance from the target camera when the target object is at the first acquisition position using the height information and the first target image includes:
And if n distances are determined by using the height information and the first target image, determining one distance from the n distances according to a preset distance determination rule as the first distance.
4. The camera point calibration method according to claim 1, wherein the determining the camera point of the target camera according to the first acquisition position, the second acquisition position, the first distance, and the second distance includes:
determining a first arc using the first acquisition location and the first distance;
determining a second arc using the second acquisition location and the second distance;
and determining an intersection point of the first arc and the second arc as a camera point position of the target camera.
5. The method for calibrating a camera point according to claim 4, wherein determining an intersection point of the first arc and the second arc as the camera point of the target camera includes:
and if the number of the intersection points of the first circular arc and the second circular arc is 2, determining the camera point position of the target camera according to the position change of the target object in the first target image and the second target image.
6. The method according to any one of claims 1 to 5, wherein after determining the camera point of the target camera according to the first acquisition position, the second acquisition position, the first distance, and the second distance, the method further comprises:
if the target camera is a zooming fixed camera or a fixed focus fixed camera, determining a point position visual field of the target camera by utilizing the first target image and the first distance;
and/or if the target camera is a zoom fixed camera or a fixed focus fixed camera, determining a point location visual field of the target camera by using the second target image and the second distance.
7. A camera point position calibration device, characterized by comprising:
the information acquisition module is used for acquiring a first target image and a second target image acquired by a target camera, a first acquisition position corresponding to the first target image and a second acquisition position corresponding to the second target image, wherein the first target image and the second target image at least comprise the same target object, the first acquisition position is the position of the target object when the first target image is acquired, and the second acquisition position is the position of the target object when the second target image is acquired;
the distance determining module is used for determining a first distance from the target camera when the target object is at a first acquisition position and a second distance from the target camera when the target object is at a second acquisition position by using the first target image and the second target image;
the camera point position determining module is used for determining a camera point position of the target camera according to the first acquisition position, the second acquisition position, the first distance and the second distance;
the camera point calibration device is further specifically used for:
acquiring the height information of the target object;
the distance determining module is specifically configured to:
and determining a first distance from the target object to the target camera when the target object is at a first acquisition position and a second distance from the target object to the target camera when the target object is at a second acquisition position by using the height information, the first target image and the second target image.
8. A camera point calibration device, comprising:
a memory and a processor;
wherein the memory is used for storing a computer program;
the processor is configured to execute the computer program to implement the camera point calibration method of any one of claims 1 to 6.
9. A computer readable storage medium for storing a computer program, wherein the computer program when executed by a processor implements the camera point calibration method according to any one of claims 1 to 6.
CN202010495680.4A 2020-06-03 2020-06-03 Camera point calibration method, device, equipment and medium Active CN113763477B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010495680.4A CN113763477B (en) 2020-06-03 2020-06-03 Camera point calibration method, device, equipment and medium


Publications (2)

Publication Number Publication Date
CN113763477A CN113763477A (en) 2021-12-07
CN113763477B (en) 2024-04-05

Family

ID=78783325

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010495680.4A Active CN113763477B (en) 2020-06-03 2020-06-03 Camera point calibration method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN113763477B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114486186A (en) * 2021-12-27 2022-05-13 歌尔股份有限公司 Detection device and method for effective focal length of lens

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003075148A (en) * 2001-09-03 2003-03-12 Techno Vanguard:Kk Displacement measuring instrument using digital still camera
KR100792852B1 (en) * 2006-10-25 2008-01-14 전자부품연구원 Method for extracting distance of landmark of mobile robot with a single camera
CN103134489A (en) * 2013-01-29 2013-06-05 北京凯华信业科贸有限责任公司 Method of conducting target location based on mobile terminal
CN111178317A (en) * 2020-01-06 2020-05-19 广东博智林机器人有限公司 Detection positioning method, system, device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN113763477A (en) 2021-12-07

Similar Documents

Publication Publication Date Title
US9250328B2 (en) Graphics-aided remote position measurement with handheld geodesic device
CN105451012B (en) 3-D imaging system and three-D imaging method
CN111862180B (en) Camera set pose acquisition method and device, storage medium and electronic equipment
US20120299936A1 (en) Graphics-aided remote position measurement with handheld geodesic device
CN108805938B (en) Detection method of optical anti-shake module, mobile terminal and storage medium
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN110689580B (en) Multi-camera calibration method and device
CN104111059A (en) Distance measuring and locating device and method and terminal
CN104778656A (en) Fisheye image correction method on basis of spherical perspective projection
CN102768762A (en) Digital camera calibration method targeted to shield tunnel defect digital radiography detection and device thereof
US20220074743A1 (en) Aerial survey method, aircraft, and storage medium
US11514608B2 (en) Fisheye camera calibration system, method and electronic device
WO2020124517A1 (en) Photographing equipment control method, photographing equipment control device and photographing equipment
CN113763477B (en) Camera point calibration method, device, equipment and medium
CN114979956A (en) Unmanned aerial vehicle aerial photography ground target positioning method and system
CN113674424B (en) Method and device for drawing electronic map
CN117288151B (en) Three-dimensional attitude determination method and device of projection equipment and electronic equipment
CN112907745B (en) Method and device for generating digital orthophoto map
CN116486290B (en) Unmanned aerial vehicle monitoring and tracking method and device, electronic equipment and storage medium
KR20220058846A (en) Robot positioning method and apparatus, apparatus, storage medium
CN114638880B (en) Planar ranging method, monocular camera and computer readable storage medium
US20160188141A1 (en) Electronic device and method for displaying target object thereof
CN113421300B (en) Method and device for determining actual position of object in fisheye camera image
CN113240754B (en) Method, device, equipment and storage medium for determining internal parameters of PTZ image pickup device
CN109377529A (en) A kind of picture coordinate transformation method, system and the device of ground coordinate and Pan/Tilt/Zoom camera

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant