CN110555888B - Master-slave camera calibration method, storage device, computer equipment and system thereof - Google Patents

Master-slave camera calibration method, storage device, computer equipment and system thereof

Info

Publication number
CN110555888B
CN110555888B (application number CN201910780875.0A)
Authority
CN
China
Prior art keywords
coordinate
amount
image
horizontal rotation
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910780875.0A
Other languages
Chinese (zh)
Other versions
CN110555888A (en)
Inventor
李中振
潘华东
高美
龚磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN201910780875.0A
Publication of CN110555888A
Application granted
Publication of CN110555888B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30232: Surveillance
    • G06T 2207/30244: Camera pose

Abstract

The application discloses a master-slave camera calibration method, a storage device, computer equipment and a master-slave camera monitoring system. The master-slave camera calibration method comprises the following steps: recording a first coordinate of a target in a first image, wherein the first image is generated by the master camera; recording, at the same moment, a second coordinate of the target in a second image together with a first horizontal rotation amount, a first pitch amount and a field angle of the slave camera, wherein the second image is generated by the slave camera while it detects and tracks the target; calculating, from the second coordinate, first horizontal rotation amount, first pitch amount and field angle recorded at the same moment, a second horizontal rotation amount and a second pitch amount corresponding to the target being at the center of the second image at that moment; and establishing a mapping table between the first coordinate and the second horizontal rotation amount and second pitch amount corresponding to the same moment. By this method, the mapping table of the master camera and the slave camera can be established even in a pure-color single scene.

Description

Master-slave camera calibration method, storage device, computer equipment and system thereof
Technical Field
The application relates to the technical field of security monitoring, in particular to a master-slave camera calibration method, a storage device, computer equipment and a master-slave camera monitoring system.
Background
Video surveillance in the security field is developing rapidly. In particular, the emergence of network cameras has greatly accelerated the installation and deployment of cameras, providing protection for people's safety.
A current master-slave camera monitoring system comprises a master camera and a slave camera installed according to certain requirements. The master camera monitors the overall scene and finds a target, and the slave camera rotates towards the specified target area so that the target appears clearly in its picture. This process requires the master camera and the slave camera to be calibrated.
However, in some scenes, such as a pure-color single scene, it is difficult to extract and match features between the image captured by the master camera and the image captured by the slave camera, so effective coordinate matching points of the two cameras cannot be extracted and the mapping relationship cannot be established.
Disclosure of Invention
The application mainly provides a master-slave camera calibration method, a storage device, computer equipment and a master-slave camera monitoring system, aiming to solve the problem that the mapping relationship between the master camera and the slave camera cannot be established in certain scenes.
In order to solve the technical problem, the application adopts a technical scheme that: a master-slave camera calibration method is provided. The master-slave camera calibration method comprises the following steps: recording a first coordinate of a target in a first image, wherein the first image is generated by the master camera; recording, at the same moment, a second coordinate of the target in a second image together with a first horizontal rotation amount, a first pitch amount and a field angle of the slave camera, wherein the second image is generated by the slave camera while it detects and tracks the target; calculating, from the second coordinate, first horizontal rotation amount, first pitch amount and field angle recorded at the same moment, a second horizontal rotation amount and a second pitch amount corresponding to the target being at the center of the second image at that moment; and establishing a mapping table between the first coordinate and the second horizontal rotation amount and second pitch amount corresponding to the same moment.
In order to solve the above technical problem, another technical solution adopted by the present application is: a storage device is provided. The storage device stores a program that, when executed, implements the method described above.
In order to solve the above technical problem, another technical solution adopted by the present application is: a computer device is provided. The computer device comprises a processor and a memory coupled to the processor; the memory is used for storing a program, and the processor is used for executing the program to implement the method described above.
In order to solve the above technical problem, another technical solution adopted by the present application is: a master-slave camera monitoring system is provided. The master-slave camera monitoring system comprises a master camera, a slave camera and a computer device as described above, the computer device being in communication connection with the master camera and the slave camera respectively.
The beneficial effect of this application is: different from the prior art, the application discloses a master-slave camera calibration method, a storage device, computer equipment and a master-slave camera monitoring system. The method records a first coordinate of a moving target in a first image instead of extracting feature points directly from the environment of the first image; the moving target has obvious features and is easy to mark. At the same moment, a second coordinate of the target in a second image and the first horizontal rotation amount, first pitch amount and field angle of the slave camera are recorded, and the second horizontal rotation amount and second pitch amount corresponding to the target being at the center of the second image at that moment are determined from the second coordinate. A mapping table between the first coordinate and the corresponding second horizontal rotation amount and second pitch amount at the same moment is thereby established, realizing the mapping relationship between the master camera and the slave camera even in a pure-color single scene where matching points for the two cameras cannot be extracted directly from the environment.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art without creative efforts, wherein:
fig. 1 is a schematic flowchart illustrating an embodiment of a method for calibrating a master camera and a slave camera according to the present disclosure;
FIG. 2 is a schematic flow diagram of S13 in the process flow of FIG. 1;
FIG. 3 is a schematic block diagram of an embodiment of a computer device provided herein;
fig. 4 is a schematic structural diagram of an embodiment of a memory device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and "third" in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," or "third" may explicitly or implicitly include at least one of the feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless explicitly specifically limited otherwise. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of a method for calibrating a master camera and a slave camera according to the present disclosure, in which the embodiment of the method for calibrating a master camera and a slave camera includes:
s11: a first coordinate of the object in the first image is recorded.
First coordinates of the object in a first image are recorded, wherein the first image is generated by the main camera.
The master camera may be a fixed-focus, wide-field camera such as a box (bullet) camera, a panoramic camera or a fisheye camera; it has a wide monitoring range and can monitor global information and find targets.
The target may be an object that is easily recognized, such as a car, a pedestrian, a house, or the like. For example, if the target is a moving vehicle, the main camera continuously keeps track of the first coordinates of the vehicle in the first image, and thus the first coordinates are the coordinates of the moving vehicle at a certain time in the first image. The first coordinate of the stationary object, like a house, in the first image is unique and defined, so that no tracking is required.
The first coordinates are determined from a coordinate system established on the first image. For example, the length of a pixel point in the first image is used as the unit coordinate length, the resolution of the first image is obtained, and a pixel point is selected as the origin of coordinates in the first image, so that the first coordinate of the target in the first image at a certain time can be easily determined. Or, the unit coordinate length and the coordinate origin in the first image are freely set, so that the first coordinate of the target in the first image at a certain time is determined and recorded.
Optionally, the main camera continuously tracks and records a plurality of first coordinates of the same moving object in the first image. Or, the main camera simultaneously records a plurality of first coordinates corresponding to a plurality of groups of objects in the first image, that is, each object may record a plurality of first coordinates in time sequence.
For example, in a scene where the master camera monitors a section of road, some vehicles travel the whole section while other vehicles travel only part of it before stopping, entering a residential area or turning around. Tracking and recording multiple first coordinates of multiple groups of vehicles in the first image at the same time therefore yields richer data, which is more favorable for subsequently establishing the mapping table.
S12: a second coordinate of the target in the second image, and a first horizontal rotation amount, a first pitch amount and a field angle of the slave camera, are recorded at the same moment.
A second coordinate of the target in a second image, together with a first horizontal rotation amount, a first pitch amount and a field angle of the slave camera, is recorded at the same moment, wherein the second image is generated by the slave camera while it detects and tracks the target.
The slave camera is a zoom camera, such as a zoom dome camera. It can rotate flexibly in the horizontal and vertical directions while zooming, and after zooming the detail information of the target can be seen clearly; however, as the zoom ratio increases the field of view narrows, so the slave camera alone cannot monitor the overall scene well.
Therefore, to combine the advantages of both, the master camera and the slave camera are installed according to certain requirements and calibrated to form a master-slave camera monitoring system. For example, the master camera and the slave camera are installed one above the other, the master camera is fixed, and the slave camera can rotate up and down and left and right around a fixed shaft. Alternatively, the master camera is mounted side by side with the slave camera.
The master camera monitors the overall scene, detects and tracks the target, and records the first coordinate of the target in the first image; the slave camera tracks the target, keeps it clearly in the second image and as close to the center of the second image as possible, and records the second coordinate of the target in the second image together with its own first horizontal rotation amount, first pitch amount and field angle.
When the first coordinate of the target in the first image is recorded at a certain moment, the second coordinate of the target in the second image and the first horizontal rotation amount, first pitch amount and field angle of the slave camera are recorded at the same moment, so that each first coordinate has a corresponding second coordinate, first horizontal rotation amount, first pitch amount and field angle, which provides the data needed for master-slave camera calibration.
If either camera loses the target or tracks it incorrectly while both are tracking the same target, only the information recorded at moments when both cameras track the target successfully is kept.
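As an illustration of the synchronized recording above, the following minimal Python sketch shows one possible per-moment record; the field names are assumptions chosen for illustration and are not prescribed by this application (the zoom value discussed in the next paragraphs is included as an optional field):

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CalibrationSample:
        """One synchronized observation of the same target at the same moment.

        Only samples for which both the master and the slave camera tracked
        the target successfully are kept.
        """
        timestamp: float   # common recording moment for master and slave
        x1: float          # abscissa of the first coordinate (master image)
        y1: float          # ordinate of the first coordinate (master image)
        x2: float          # abscissa of the second coordinate (slave image)
        y2: float          # ordinate of the second coordinate (slave image)
        pan1: float        # first horizontal rotation amount P1 of the slave
        tilt1: float       # first pitch amount T1 of the slave
        h_fov: float       # horizontal field angle H at the current zoom
        v_fov: float       # vertical field angle V at the current zoom
        zoom: Optional[float] = None  # optional zoom value (see below)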
Further, at the same time, the zoom values of the slave camera when the object is clearly in the second image are also recorded, with different zoom values corresponding to different angles of view.
The zoom value may be set manually or calibrated automatically, for example by setting the target to occupy one third of the area of the second image, or by identifying the target, determining its class, and then applying the zoom value associated with that class in the second image.
For example, when the master camera detects and tracks a target, the slave camera is horizontally rotated by a first horizontal rotation amount and vertically rotated by a first pitch amount relative to its initial position at the same moment so that the target is located in the second image; the zoom value of the slave camera is then adjusted so that the target is clearly located within the second image, and the first horizontal rotation amount, first pitch amount, zoom value and field angle at that moment are recorded.
Therefore, the next time the first coordinate of the target is provided, the attitude of the slave camera can be quickly adjusted according to that first coordinate so that the target is clearly positioned in the second image.
The second coordinates are determined from a coordinate system established on the second image. For example, the length of a pixel point in the second image is used as the unit coordinate length, the resolution of the second image is obtained, and a pixel point is selected in the second image as the origin of coordinates, so that the second coordinate of the target in the second image at a certain time can be easily determined. Or, the unit coordinate length and the coordinate origin in the second image are freely set, so that the second coordinate of the target in the second image at a certain time is determined and recorded.
In this embodiment, the coordinate systems of the first image and the second image both use the length of one pixel point as the unit coordinate length, and use the pixel point at the lower left corner in the image as the origin of coordinates, so as to mark the first coordinate and the second coordinate.
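Because object detectors commonly report pixel positions with a top-left origin and a downward y-axis, a small conversion to the bottom-left-origin convention used in this embodiment may be needed; the helper below is a minimal sketch under that assumption:

    def to_bottom_left_origin(px, py, image_height):
        """Convert a pixel coordinate (px, py) given with a top-left origin
        and a downward y-axis into the bottom-left-origin convention used
        here, where one pixel is the unit coordinate length."""
        return px, (image_height - 1) - py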
Optionally, when multiple first coordinates of multiple groups of targets in the first image are recorded, the corresponding second coordinates of those targets in the second image, together with the first horizontal rotation amount, first pitch amount and field angle of the slave camera, are also recorded at the same moments.
S13: calculating, from the second coordinate, first horizontal rotation amount, first pitch amount and field angle recorded at the same moment, a second horizontal rotation amount and a second pitch amount corresponding to the target being at the center of the second image at that moment.
In S12 it is not guaranteed that the target is located exactly at the center of the second image, so the second horizontal rotation amount and second pitch amount corresponding to the target being at the center of the second image at the same moment are calculated from the recorded second coordinate, first horizontal rotation amount, first pitch amount and field angle.
Specifically, the step of determining the second horizontal rotation amount and the second pitch amount includes:
s131: the second horizontal rotation amount is calculated from the abscissa of the second coordinate, the horizontal angle of view, the first horizontal rotation amount, and the width coordinate of the second image.
The angle of view includes a horizontal angle of view H and a vertical angle of view V.
Acquire the abscissa x2 of the second coordinate, obtain the coordinate difference between x2 and the horizontal coordinate w/2 of the center point of the second image, take the ratio of this difference to the width coordinate w of the second image, and multiply the ratio by the horizontal field angle H to obtain the corrected horizontal rotation amount of the target relative to the center point of the second image. The sum of the first horizontal rotation amount P1 and the corrected horizontal rotation amount is the second horizontal rotation amount P2 when the target is located at the center point of the second image.
Specifically, the second horizontal rotation amount P2 is calculated as: P2 = P1 + (x2 - w/2)/w * H.
S132: calculating the second pitch amount from the ordinate of the second coordinate, the vertical field angle, the first pitch amount, and the height coordinate of the second image.
Acquire the ordinate y2 of the second coordinate, obtain the coordinate difference between y2 and the vertical coordinate h/2 of the center point of the second image, take the ratio of this difference to the height coordinate h of the second image, and multiply the ratio by the vertical field angle V to obtain the corrected pitch amount of the target relative to the center point of the second image. The sum of the first pitch amount T1 and the corrected pitch amount is the second pitch amount T2 when the target is located at the center point of the second image.
Specifically, the second pitch amount T2 is calculated as: T2 = T1 + (y2 - h/2)/h * V.
Where (x2, y2) is the second coordinate, w is the width coordinate of the second image, h is the height coordinate of the second image, (w/2, h/2) is the coordinate of the center point of the second image, H is the corresponding horizontal field angle, V is the corresponding vertical field angle, P1 is the first horizontal rotation amount, P2 is the second horizontal rotation amount, T1 is the first pitch amount, and T2 is the second pitch amount.
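The two formulas above can be applied directly. The sketch below computes the second horizontal rotation amount P2 and the second pitch amount T2 from one recorded sample; variable names follow the symbols defined above, and the numeric example is illustrative only:

    def center_target_pose(x2, y2, w, h, h_fov, v_fov, pan1, tilt1):
        """Return (P2, T2), the slave attitude that places the target at the
        center of the second image:
            P2 = P1 + (x2 - w/2) / w * H
            T2 = T1 + (y2 - h/2) / h * V
        """
        pan2 = pan1 + (x2 - w / 2.0) / w * h_fov
        tilt2 = tilt1 + (y2 - h / 2.0) / h * v_fov
        return pan2, tilt2

    # Example: target at (1200, 400) in a 1920x1080 second image with a
    # 60 x 34 degree field angle while the slave is at pan 30, tilt -10:
    # center_target_pose(1200, 400, 1920, 1080, 60.0, 34.0, 30.0, -10.0)
    # returns approximately (37.5, -14.41).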
S14: and establishing a mapping table of the first coordinate and a second horizontal rotation amount and a second pitching amount corresponding to the same time.
Wherein the abscissa x1 of the first coordinate is mapped to the second horizontal rotation amount P2, and the ordinate y1 of the first coordinate is mapped to the second pitch amount T2.
Through the above steps, multiple groups of first coordinates (x1, y1) and the corresponding second horizontal rotation amounts P2 and second pitch amounts T2 at the same moments are obtained, thereby establishing a mapping table between the first coordinate (x1, y1) and the attitude information (P2, T2) of the slave camera when the target is at the center point of the second image.
Further, multiple groups of first coordinates (x1, y1) of the target and the corresponding groups of second horizontal rotation amounts and second pitch amounts (P2, T2) are obtained and the mapping table is established, so that the abscissa x1 of any first coordinate on the first image corresponds to a second horizontal rotation amount P2, and the ordinate y1 of any first coordinate corresponds to a second pitch amount T2.
Specifically, according to the abscissas x1 of the multiple groups of first coordinates and the corresponding second horizontal rotation amounts P2, the second horizontal rotation amount P2 mapped to the abscissa x1 of any first coordinate is established by interpolation. According to the ordinates y1 of the multiple groups of first coordinates and the corresponding second pitch amounts T2, the second pitch amount T2 mapped to the ordinate y1 of any first coordinate is established by interpolation.
For example, in the scene of monitoring a section of road, multiple first coordinates (x1, y1) along the road and the corresponding groups of second horizontal rotation amounts and second pitch amounts (P2, T2) are acquired. These first coordinates are discrete, so a continuous function curve is fitted to the discrete x1 values and the corresponding P2 values, and another continuous function curve is fitted to the discrete y1 values and the corresponding T2 values. In this way the abscissa x1 of any first coordinate on the first image corresponds to a second horizontal rotation amount P2 and the ordinate y1 corresponds to a second pitch amount T2, so that any first coordinate (x1, y1) in the first image has a unique corresponding (P2, T2). The mapping table is thereby established and the master-slave camera calibration is completed.
With the master-slave camera calibration completed, each coordinate (x1, y1) in the master camera corresponds to attitude information (P2, T2) of the slave camera. Subsequently the master camera tracks and identifies the target and provides its first coordinate in the first image in real time, and the slave camera obtains the corresponding tracking attitude (P2, T2) from the first coordinate (x1, y1) and the mapping table, so that the target can be quickly tracked and kept at the center of the second image.
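A minimal sketch of the interpolation-based mapping table described above, using 1-D linear interpolation (numpy.interp) as one possible choice of interpolation method; the class and method names are illustrative assumptions:

    import numpy as np

    class MappingTable:
        """Maps any first coordinate (x1, y1) to slave attitude (P2, T2) by
        interpolating separately over the recorded abscissas and ordinates,
        as described in S14."""

        def __init__(self, samples):
            # samples: iterable of (x1, y1, p2, t2) tuples obtained in S11-S13
            x1, y1, p2, t2 = (np.asarray(v, dtype=float) for v in zip(*samples))
            ix, iy = np.argsort(x1), np.argsort(y1)   # np.interp needs sorted keys
            self._x1, self._p2 = x1[ix], p2[ix]
            self._y1, self._t2 = y1[iy], t2[iy]

        def lookup(self, x1, y1):
            """Return the (P2, T2) mapped to an arbitrary first coordinate."""
            pan2 = float(np.interp(x1, self._x1, self._p2))
            tilt2 = float(np.interp(y1, self._y1, self._t2))
            return pan2, tilt2

For example, table = MappingTable(samples) followed by table.lookup(960.0, 540.0) would return the slave attitude mapped to the first coordinate (960, 540).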
In a pure-color single scene, such as grassland or snow, the whole scene is green or white respectively, so the scene features at different positions in the captured first image differ very little. In such a scene it is difficult to extract and match features between the image captured by the master camera and the image captured by the slave camera, and the calibration between them is difficult to complete.
According to the method and the device of the present application, the first coordinate of the moving target in the first image is tracked and recorded even in a pure-color single scene, so that the position coordinates in the scene can still be calibrated.
Therefore, even in a pure-color single scene, the master camera only needs to recognize the target and determine its first coordinate in the first image, and the slave camera points at the position corresponding to that first coordinate according to the mapping relationship, so that the target is positioned at the center of the second image and the master and slave cameras can track the monitored target quickly and accurately.
Based on this, the present application further provides a computer device 100, please refer to fig. 3, fig. 3 is a schematic structural diagram of a first embodiment of the computer device of the present application, in this embodiment, the computer device 100 includes a processor 110 and a memory 120, the processor 110 is coupled to the memory 120, the memory 120 is used for storing a program, and the processor 110 is used for executing the program to implement the method for calibrating the master-slave camera according to any of the embodiments.
The computer device 100 may be a codec. The processor 110 may also be referred to as a CPU (Central Processing Unit). The processor 110 may be an integrated circuit chip having signal processing capabilities. The processor 110 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components. The general-purpose processor 110 may be a microprocessor, or the processor may be any conventional processor or the like.
Based on this, the present application further provides a storage device 200, please refer to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of the storage device provided in the present application, in this embodiment, the storage device 200 stores a program 210, and when the program 210 is executed, the method for calibrating the master camera and the slave camera according to any of the embodiments described above can be implemented.
The program 210 may be stored in the storage device 200 in the form of a software product, and includes several instructions to make a device or a processor execute all or part of the steps of the methods according to the embodiments of the present application.
The storage device 200 is a medium in computer memory for storing some discrete physical quantity. The storage device 200 having a storage function includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk, and various media capable of storing the code of the program 210.
The present application further provides a master-slave camera monitoring system comprising a master camera, a slave camera and a computer device 100 as described above, the computer device 100 being in communication connection with the master camera and the slave camera, respectively.
The master camera may be a fixed-focus, wide-field camera such as a box (bullet) camera, a panoramic camera or a fisheye camera; it has a wide monitoring range and can monitor global information and find targets. The slave camera is a variable-focus camera, such as a zoom camera, which can rotate flexibly in the horizontal and vertical directions while zooming, and after zooming the detail information of the target can be seen clearly.
The computer device 100 is communicatively connected with the master camera and the slave camera respectively. It tracks and identifies the target through the master camera, records the first coordinate of the target in the first image, and obtains from the mapping table the second horizontal rotation amount and second pitch amount corresponding to that first coordinate, thereby controlling the slave camera to track the target quickly and accurately and display its detailed information.
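One way such a system could tie these pieces together at run time is sketched below; detect_target and move_to are hypothetical camera interfaces chosen for illustration, since this application does not prescribe a particular camera API:

    import time

    def track_loop(master, slave, table):
        """Online master-slave tracking using the mapping table from S14.

        master.detect_target() is assumed to return the target's first
        coordinate (x1, y1) or None; slave.move_to(pan, tilt) is assumed to
        drive the slave camera.  Both interfaces are hypothetical.
        """
        while True:
            coord = master.detect_target()           # first coordinate in the first image
            if coord is not None:
                pan2, tilt2 = table.lookup(*coord)   # attitude from the mapping table
                slave.move_to(pan2, tilt2)           # target lands at the image center
            time.sleep(0.04)                         # roughly one video frame interval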
Different from the prior art, the application discloses a master-slave camera calibration method, a storage device, computer equipment and a master-slave camera monitoring system. The method records a first coordinate of a moving target in a first image instead of extracting feature points directly from the environment of the first image; the moving target has obvious features and is easy to mark. At the same moment, a second coordinate of the target in a second image and the first horizontal rotation amount, first pitch amount and field angle of the slave camera are recorded, and the second horizontal rotation amount and second pitch amount corresponding to the target being at the center of the second image at that moment are determined from the second coordinate. A mapping table between the first coordinate and the corresponding second horizontal rotation amount and second pitch amount at the same moment is thereby established, realizing the mapping relationship between the master camera and the slave camera even in a pure-color single scene where matching points for the two cameras cannot be extracted directly from the environment.
In the several embodiments provided in the present application, it should be understood that the disclosed system, apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be wholly or partly embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods according to the embodiments of the present application.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings, or which are directly or indirectly applied to other related technical fields, are intended to be included within the scope of the present application.

Claims (9)

1. A method for calibrating a master camera and a slave camera is characterized by comprising the following steps:
recording first coordinates of a target in a first image, wherein the first image is generated by a primary camera;
recording a second coordinate of the target in a second image, a first horizontal rotation amount of a slave camera, a first pitch amount and a field angle at the same time, wherein the slave camera detects and tracks the target to generate the second image;
calculating, from the second coordinate, the first horizontal rotation amount, the first pitch amount and the field angle recorded at the same moment, a second horizontal rotation amount and a second pitch amount corresponding to the target being at the center of the second image at that moment;
establishing a mapping table of the first coordinate and the second horizontal rotation amount and the second pitching amount corresponding to the same moment;
wherein the step of calculating, from the second coordinate, the first horizontal rotation amount, the first pitch amount and the field angle recorded at the same moment, a second horizontal rotation amount and a second pitch amount corresponding to the target being at the center of the second image at that moment includes:
calculating the second horizontal rotation amount according to the abscissa of the second coordinate, a horizontal field angle, the first horizontal rotation amount, and the width coordinate of the second image;
calculating the second pitching amount according to the ordinate of the second coordinate, the vertical field angle, the first pitching amount, and the height coordinate of the second image;
wherein the field angle includes the horizontal field angle and the vertical field angle.
2. The method of master-slave camera calibration according to claim 1,
P2 = P1 + (x2 - w/2)/w * H, T2 = T1 + (y2 - h/2)/h * V;
wherein (x2, y2) is the second coordinate, w is the width coordinate of the second image, h is the height coordinate of the second image, H is the corresponding horizontal field angle, V is the corresponding vertical field angle, P1 is the first horizontal rotation amount, P2 is the second horizontal rotation amount, T1 is the first pitch amount, and T2 is the second pitch amount.
3. A method of master-slave camera calibration according to claim 1, wherein the abscissa of the first coordinate is mapped to the second amount of horizontal rotation and the ordinate of the first coordinate is mapped to the second amount of pitch.
4. The method of master-slave camera calibration according to claim 3, wherein the establishing a mapping table of the first coordinate and the second amount of horizontal rotation and the second amount of pitch comprises:
and acquiring multiple groups of first coordinates of the target, and corresponding multiple groups of second horizontal rotation amount and second pitching amount, and establishing the mapping table.
5. The method of master-slave camera calibration according to claim 4, wherein the obtaining a plurality of sets of the first coordinates and corresponding second horizontal rotation amounts and second pitch amounts of the target and establishing the mapping table comprises:
establishing a second horizontal rotation amount mapped with the abscissa of any first coordinate by an interpolation method according to the abscissas of the multiple groups of first coordinates and the corresponding second horizontal rotation amount;
and establishing the second pitching amount mapped with the ordinate of any first coordinate by the interpolation method according to the ordinates of the multiple groups of first coordinates and the corresponding second pitching amounts.
6. A method for master-slave camera calibration according to claim 1, wherein a plurality of sets of corresponding first coordinates of the target in the first image are recorded simultaneously;
and recording the corresponding second coordinates, the first horizontal rotation amount, the first pitching amount and the field angle of the slave camera of a plurality of groups of targets in the second image at the same time.
7. A storage device, characterized in that it stores a program which, when executed, is able to implement the method according to any one of claims 1 to 6.
8. A computer device, characterized in that the computer device comprises a processor and a memory coupled to the processor, the memory being configured to store a program and the processor being configured to execute the program to implement the method according to any one of claims 1-6.
9. A master-slave camera monitoring system, characterized in that it comprises a master camera, a slave camera and a computer device according to claim 8, which is communicatively connected with the master camera and the slave camera, respectively.
CN201910780875.0A 2019-08-22 2019-08-22 Master-slave camera calibration method, storage device, computer equipment and system thereof Active CN110555888B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910780875.0A CN110555888B (en) 2019-08-22 2019-08-22 Master-slave camera calibration method, storage device, computer equipment and system thereof

Publications (2)

Publication Number Publication Date
CN110555888A CN110555888A (en) 2019-12-10
CN110555888B true CN110555888B (en) 2022-10-04

Family

ID=68737965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910780875.0A Active CN110555888B (en) 2019-08-22 2019-08-22 Master-slave camera calibration method, storage device, computer equipment and system thereof

Country Status (1)

Country Link
CN (1) CN110555888B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111416942B (en) * 2020-04-27 2021-06-29 深圳市瑞立视多媒体科技有限公司 Method, device, equipment and storage medium for limiting camera search range
CN111627048B (en) * 2020-05-19 2022-07-01 浙江大学 Multi-camera cooperative target searching method
CN113640755A (en) * 2021-05-24 2021-11-12 中国南方电网有限责任公司超高压输电公司广州局 Target pitch angle acquisition method and device based on radar photoelectric linkage system
CN113194263B (en) * 2021-07-01 2021-10-22 中国南方电网有限责任公司超高压输电公司广州局 Gun and ball linkage control method and device, computer equipment and storage medium
CN114384568A (en) * 2021-12-29 2022-04-22 达闼机器人有限公司 Position measuring method and device based on mobile camera, processing equipment and medium
CN114742897B (en) * 2022-03-31 2023-02-28 阿波罗智联(北京)科技有限公司 Method, device and equipment for processing camera installation information of roadside sensing system

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2247921B1 (en) * 2008-02-12 2014-10-08 Trimble AB Determining coordinates of a target in relation to a survey instruments having a camera
US8897482B2 (en) * 2008-02-29 2014-11-25 Trimble Ab Stereo photogrammetry from a single station using a surveying instrument with an eccentric camera
KR101970197B1 (en) * 2012-10-29 2019-04-18 에스케이 텔레콤주식회사 Method for Controlling Multiple Camera, Apparatus therefor
US9532031B1 (en) * 2014-04-08 2016-12-27 The United States Of America As Represented By The Secretary Of The Navy Method for extrinsic camera calibration using a laser beam
CN105516661B (en) * 2015-12-10 2019-03-29 吴健辉 Principal and subordinate's target monitoring method that fisheye camera is combined with ptz camera
CN109118545B (en) * 2018-07-26 2021-04-16 深圳市易尚展示股份有限公司 Three-dimensional imaging system calibration method and system based on rotating shaft and binocular camera
CN109146980B (en) * 2018-08-12 2021-08-10 浙江农林大学 Monocular vision based optimized depth extraction and passive distance measurement method
CN109613935A (en) * 2018-12-05 2019-04-12 苏州博众机器人有限公司 A kind of overall view monitoring method, system, equipment and storage medium

Also Published As

Publication number Publication date
CN110555888A (en) 2019-12-10


Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant