CN111461994A - Method for obtaining coordinate transformation matrix and positioning target in monitoring picture - Google Patents

Method for obtaining coordinate transformation matrix and positioning target in monitoring picture

Info

Publication number
CN111461994A
Authority
CN
China
Prior art keywords
camera
point
calibration
coordinate system
calculating
Prior art date
Legal status
Pending
Application number
CN202010239392.2A
Other languages
Chinese (zh)
Inventor
姚佳丽
毛晓蛟
章勇
曹李军
Current Assignee
Suzhou Keda Technology Co Ltd
Original Assignee
Suzhou Keda Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Keda Technology Co Ltd filed Critical Suzhou Keda Technology Co Ltd
Priority to CN202010239392.2A
Publication of CN111461994A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/60 Rotation of whole images or parts thereof
    • G06T3/604 Rotation of whole images or parts thereof using coordinate rotation digital computer [CORDIC] devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Manufacturing & Machinery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method for acquiring a coordinate transformation matrix and positioning a target in a monitoring picture. The method comprises the following steps: controlling the camera to move to a preset PTZ position, and taking the PTZ position as a calibration scene; selecting at least one calibration point from the calibration scene; acquiring the pixel coordinates of the calibration point in the camera coordinate system corresponding to the camera in the calibration scene; acquiring GPS information of the calibration point and GPS information of the camera; establishing a world coordinate system with the projection point of the camera on the ground as the origin, and calculating the world coordinates of the calibration point from the GPS information of the calibration point and the GPS information of the camera; and calculating the external parameter matrix of the world coordinate system and the camera coordinate system from the world coordinates and the pixel coordinates of the calibration point. The invention does not require the PTZ position to be adjusted multiple times, which reduces the calibration workload; the external parameter matrix, which expresses the conversion relation between the world coordinate system and the camera coordinate system, can convert the world coordinates of a target point into pixel coordinates, reducing the difficulty of the positioning calculation.

Description

Method for obtaining coordinate transformation matrix and positioning target in monitoring picture
Technical Field
The invention relates to the technical field of intelligent video monitoring, in particular to a method for acquiring a coordinate transformation matrix and positioning a target in a monitoring picture.
Background
With the progress and development of society, applications such as target identification, tracking, scene identification and indexing have become increasingly widespread, and the requirements placed on video monitoring have grown accordingly. For example, in a new generation of augmented-reality live-scene monitoring systems, security monitoring manufacturers in the field of intelligent video monitoring emphasize functions such as using dynamic labels to display landmark buildings in the video frame, track moving vehicles, and associate other monitoring equipment, and they establish the spatial relationship of various structured and unstructured data in three-dimensional space through the rich-media capability of augmented-reality labels. For example:
in the prior art, calibration is performed around a camera using calibration points. When a target is located at any position in the camera's field of view, the pixel coordinates of the target point are predicted by comparing the positional relationship between the calibration positions and the target point, and selecting the parameters of nearby calibration points together with the current PTZ information of the camera, where PTZ (Pan/Tilt/Zoom) denotes the pan-tilt's omnidirectional (left-right/up-down) movement and the zoom control of the lens in security monitoring applications. This allows the pixel coordinates of a target point at any position in the field of view to be obtained, so that such a target can be tracked and its position information displayed in real time. However, the calibration operation in this method requires selecting multiple PTZ positions of the camera, which makes the calibration workload heavy and inconvenient; in particular, when the effective reference object at a certain PTZ position of the camera is blocked and cannot be calibrated, the effectiveness of this method may be affected.
Disclosure of Invention
In view of this, embodiments of the present invention provide a method for obtaining a coordinate transformation matrix and positioning a target in a monitoring picture, so as to solve the problems in the prior art that the calibration operation must be performed at a plurality of PTZ positions in order to predict the pixel coordinates of a target point, which makes the calibration workload heavy and inconvenient, and that the process of predicting the pixel coordinates of the target point is complicated.
In order to achieve the purpose, the invention provides the following technical scheme:
in a first aspect, an embodiment of the present invention provides a method for obtaining a coordinate transformation matrix, comprising: controlling a camera to move to a preset PTZ position, and taking the PTZ position as a calibration scene; selecting at least one calibration point from the calibration scene; acquiring the pixel coordinates of the calibration point in the camera coordinate system corresponding to the camera in the calibration scene; acquiring GPS information of the calibration point and GPS information of the camera; establishing a world coordinate system with the projection point of the camera on the ground as the origin, and calculating the world coordinates of the calibration point in the world coordinate system from the GPS information of the calibration point and the GPS information of the camera; and calculating the external parameter matrix of the world coordinate system and the camera coordinate system from the world coordinates of the calibration point and the pixel coordinates of the calibration point, wherein the external parameter matrix is used to express the conversion relation between the world coordinate system and the camera coordinate system.
In one embodiment, calculating the world coordinates of the calibration point in the world coordinate system using the GPS information of the calibration point and the GPS information of the camera comprises: acquiring the GPS information of the camera to determine the GPS information of the ground projection point of the camera; calculating, from the GPS information of the ground projection point and the GPS information of the calibration point, the distance between the calibration point and the ground projection point and the azimuth angle of the calibration point relative to the ground projection point; and calculating the world coordinates of the calibration point from the distance and the azimuth angle.
In one embodiment, the world coordinates (X_wi, Y_wi, Z_wi) of the calibration point are calculated by the following formula:

X_wi = d_i · cos(β_i - 1.5π), Y_wi = -E, Z_wi = d_i · sin(β_i - 1.5π)

where X_wi, Y_wi and Z_wi denote the coordinate values of the calibration point on the X-, Y- and Z-axes of the world coordinate system respectively, d_i denotes the distance between the calibration point and the ground projection point, β_i denotes the azimuth angle of the calibration point relative to the ground projection point, and E denotes the altitude in the GPS information of the camera.
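Under the convention stated above (Y-axis pointing skyward, azimuth measured clockwise from true north), the formula can be sketched in Python; the function and argument names are illustrative, not from the patent:

```python
import math

def world_coords(d_i: float, beta_i_deg: float, camera_altitude: float):
    """Convert the distance d_i (metres) and azimuth beta_i (degrees,
    clockwise from true north) of a calibration point, plus the camera
    altitude E, into world coordinates per the formula:
    X = d*cos(beta - 1.5*pi), Y = -E, Z = d*sin(beta - 1.5*pi)."""
    beta = math.radians(beta_i_deg)
    x_w = d_i * math.cos(beta - 1.5 * math.pi)
    y_w = -camera_altitude  # ground plane lies E metres below the camera datum
    z_w = d_i * math.sin(beta - 1.5 * math.pi)
    return x_w, y_w, z_w
```

For instance, a point 10 m due west of the ground projection point (β = 270°) gives β - 1.5π = 0, hence world coordinates (10, -E, 0).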
In one embodiment, the external reference matrix is calculated by the following formula:
M1=[Rr|Tr]
where M1 denotes the external parameter matrix, Rr denotes the rotation matrix of the camera coordinate system relative to the world coordinate system, and Tr denotes the translation matrix of the camera coordinate system relative to the world coordinate system.
In a second aspect, an embodiment of the present invention provides a method for positioning a target in a monitoring picture, comprising: acquiring an external parameter matrix by the method for acquiring a coordinate transformation matrix according to the first aspect or any optional mode thereof; acquiring GPS information of a target point, and calculating the world coordinates of the target point from the GPS information of the target point and the GPS information of the camera; calculating a current internal parameter matrix of the camera, the internal parameter matrix representing camera parameter information in the monitoring picture; calculating the pixel coordinates of the target point in the calibration scene from the external parameter matrix, the internal parameter matrix and the world coordinates of the target point; and calculating the pixel coordinates of the target point in the monitoring picture from the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene of the camera and the current field angle of the camera.
In one embodiment, calculating the current internal parameter matrix of the camera comprises: acquiring the current horizontal field angle, the current vertical field angle and the imaging resolution of the camera; and calculating the current internal parameter matrix of the camera using the current horizontal field angle, the current vertical field angle and the imaging resolution.
In an embodiment, calculating the current internal parameter matrix of the camera using the current horizontal field angle, the current vertical field angle and the imaging resolution comprises: calculating the current horizontal focal length and the current vertical focal length of the camera using the current horizontal field angle, the current vertical field angle and the imaging resolution; calculating the abscissa and the ordinate on the two-dimensional plane formed by the camera imaging plane using the imaging resolution; and calculating the current internal parameter matrix of the camera using the horizontal focal length, the vertical focal length, and the abscissa and ordinate on the two-dimensional plane.
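The intrinsic computation these steps describe can be sketched as follows, assuming the standard pinhole relations fx = (W/2)/tan(hfov/2) and fy = (H/2)/tan(vfov/2) with the principal point at the image centre; the patent does not state the exact formulas, so this is an illustrative reading:

```python
import math

def intrinsic_matrix(hfov_deg: float, vfov_deg: float, width: int, height: int):
    """Build a 3x3 pinhole intrinsic matrix from the current field
    angles and the imaging resolution (a sketch; assumes the principal
    point sits at the image centre)."""
    fx = (width / 2) / math.tan(math.radians(hfov_deg) / 2)   # horizontal focal length, pixels
    fy = (height / 2) / math.tan(math.radians(vfov_deg) / 2)  # vertical focal length, pixels
    cx, cy = width / 2, height / 2                            # principal point
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]
```

With a 90° horizontal field angle at 1920x1080, for example, fx comes out to half the image width (960 pixels).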
In one embodiment, the pixel coordinates (x, y) of the target point in the calibration scene are calculated by the following formula:

s · [x, y, 1]^T = M2 · M1 · [Xw, Yw, Zw, 1]^T

where x denotes the pixel value of the abscissa of the target point, y denotes the pixel value of the ordinate of the target point, s is the scale factor (the depth of the target point in the camera coordinate system), M1 denotes the external parameter matrix, M2 denotes the internal parameter matrix, and Xw, Yw and Zw denote the coordinate values of the target point on the X-, Y- and Z-axes of the world coordinate system.
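A minimal sketch of this projection, using plain Python lists so no third-party library is assumed (M2 is the 3x3 internal parameter matrix, M1 the 3x4 external parameter matrix):

```python
def project(M2, M1, Xw: float, Yw: float, Zw: float):
    """Project a world point through the 3x4 external parameter matrix
    M1 and the 3x3 internal parameter matrix M2, then divide by the
    camera-space depth to obtain pixel coordinates."""
    pw = [Xw, Yw, Zw, 1.0]  # homogeneous world point
    # World frame -> camera frame
    pc = [sum(M1[r][c] * pw[c] for c in range(4)) for r in range(3)]
    # Camera frame -> image plane (homogeneous pixel coordinates)
    uvw = [sum(M2[r][c] * pc[c] for c in range(3)) for r in range(3)]
    return uvw[0] / uvw[2], uvw[1] / uvw[2]  # (x, y) pixel coordinates
```

For instance, with M1 an identity pose and M2 = [[1000, 0, 960], [0, 1000, 540], [0, 0, 1]], the world point (0, 0, 10) projects to the image centre (960, 540).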
In an embodiment, after calculating the pixel coordinates of the target point in the monitoring picture from the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene of the camera and the current field angle of the camera, the method further comprises: judging whether the pixel coordinates of the target point in the monitoring picture exceed the imaging resolution of the camera; and, when they do, prompting that the target point is not in the monitoring picture.
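The check described here reduces to comparing the projected pixel against the imaging resolution; a minimal sketch:

```python
def in_frame(x: float, y: float, width: int, height: int) -> bool:
    """Return True when the pixel (x, y) lies inside an image of the
    given resolution; otherwise the caller can prompt that the target
    point is not in the monitoring picture."""
    return 0 <= x < width and 0 <= y < height
```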
In a third aspect, an embodiment of the present invention provides a computer-readable storage medium, where a computer instruction is stored, and the computer instruction is configured to enable a computer to execute the method for obtaining a coordinate transformation matrix according to the first aspect of the embodiment of the present invention, or to execute the method for positioning an object in a monitoring screen according to the second aspect of the embodiment of the present invention.
In a fourth aspect, an embodiment of the present invention provides an electronic device, including: the system comprises at least one processor and a memory which is in communication connection with the at least one processor, wherein the memory stores instructions which can be executed by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor executes the method for acquiring the coordinate transformation matrix according to the first aspect of the embodiment of the invention, or executes the method for positioning the target in the monitoring picture according to the second aspect of the embodiment of the invention.
The technical scheme of the invention has the following advantages:
1. In the method for obtaining a coordinate transformation matrix and positioning a target in a monitoring picture provided by the invention, the camera is controlled to move to a single PTZ position, which is adopted as the calibration scene; at least one calibration point is selected, its pixel coordinates are acquired, and its world coordinates are calculated in the established world coordinate system, so that the external parameter matrix expressing the coordinate conversion relation between the world coordinate system and the camera coordinate system can be calculated. When monitoring and positioning a target point, the corresponding pixel coordinates can then be obtained from the world coordinates of the target point and the external parameter matrix. The embodiment of the invention does not need to adjust the PTZ position multiple times; it only needs a suitable calibration scene in which to select the calibration points, which reduces the calibration workload, and the external parameter matrix converts the world coordinates of the target point into pixel coordinates, which reduces the difficulty of the positioning calculation.
2. The method for acquiring a coordinate transformation matrix and positioning a target in a monitoring picture uses the GPS information of the calibration point and of the camera to calculate the world coordinates more accurately, and finally calculates the pixel coordinates of the target point in the monitoring picture. Real-time positioning, tracking and augmented-reality functions are realized, the three-dimensional visualization effect of video monitoring is enriched, and macroscopic command and scheduling are facilitated. Because the pixel coordinates of the target point on the monitoring picture are calculated from the world coordinates of the target point, the position of the target can still be displayed on the picture when the target point is blocked by an object, making target positioning and tracking more intuitive. Based on the GPS information of the target point, the position information of any target point in the monitoring picture can be displayed in real time, making otherwise invisible data visible and manageable.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of a specific example of a method for acquiring a coordinate transformation matrix according to an embodiment of the present invention;
fig. 2 is a flowchart of a specific example of a method for locating a target in a monitoring screen according to an embodiment of the present invention;
fig. 3 is a flowchart of another specific example of a method for locating a target in a monitoring screen according to an embodiment of the present invention;
fig. 4 is a composition diagram of a specific example of an electronic terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that the terms "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", "outer", etc., indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of description and simplicity of description, but do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly, e.g., as meaning either a fixed connection, a removable connection, or an integral connection; can be mechanically or electrically connected; the two elements may be directly connected or indirectly connected through an intermediate medium, or may be communicated with each other inside the two elements, or may be wirelessly connected or wired connected. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1
The embodiment of the invention provides a method for acquiring a coordinate transformation matrix, which comprises the following steps as shown in fig. 1:
step S01: and controlling the camera to move to a preset PTZ position, and taking the PTZ position as a calibration scene.
In practical applications, the world in which the camera is located is three-dimensional while a photograph is two-dimensional, and the mapping from three dimensions to two dimensions is not invertible. Camera calibration is therefore required: its purpose is to find a suitable mathematical model and solve for the parameters of that model, so that the three-dimensional-to-two-dimensional mapping can be approximated and an inverse of the mapping can be found. In the embodiment of the present invention, the camera is first controlled to move to a preset PTZ position, and the preset PTZ position is determined as the calibration scene. It should be noted that the calibration scene may be selected according to actual needs and precision requirements, and the present invention is not limited in this respect.
Step S02: at least one calibration point is selected from the calibration scene.
In the embodiment of the invention, after the camera is controlled to move to the calibration scene, at least one calibration point is selected in the current calibration scene. If a plurality of calibration points are selected, each calibration point should be a point that is near the ground, scattered from the others, and has obvious landmark characteristics; when the information of a single calibration point cannot determine the world coordinates, or the GPS information at the position of a calibration point cannot be acquired, a plurality of calibration points are required for the world-coordinate calculation. It should be noted that, in the embodiment of the present invention, the number of calibration scenes and calibration points is selected according to actual needs; once the position and related parameters of the camera are determined, the monitoring picture of the camera is determined, and in practical applications the calibration is performed according to the application data, and the present invention is not limited in this respect.
Step S03: and acquiring the pixel coordinates of the calibration point in a camera coordinate system corresponding to the camera in the calibration scene.
In the embodiment of the invention, after the calibration points are determined, a group of images is obtained by shooting each calibration point with the current camera. The pixel coordinates of each calibration point in the monitoring picture of the calibration scene are then measured manually in the image using a predefined pixel coordinate system, to facilitate the subsequent world-coordinate calculation; the pixel coordinates comprise a horizontal abscissa and a vertical ordinate. It should be noted that, in practical applications, different pixel coordinates may be obtained depending on the field angle of the camera, the world coordinates of the camera, and the definition of the pixel coordinate system; this causes no problem as long as each coordinate system is kept consistent once established. The pixel coordinates of the calibration point can be obtained by direct measurement or, in practical applications, in other ways, and the present invention is not limited in this respect.
Step S04: and acquiring GPS information of the calibration point and GPS information of the camera.
In the embodiment of the invention, an existing satellite positioning system is used, together with a device or a map, to acquire the GPS information of each calibration point and the GPS information of the camera in the calibration scene, where the GPS information comprises longitude, latitude and altitude. It should be noted that the acquired GPS information may depend on the satellite positioning system and the device used, so in practical applications a suitable device is selected for measurement according to the required accuracy of the experiment and the practical requirements, and the present invention is not limited in this respect.
Step S05: and establishing a world coordinate system by taking the projection point of the camera on the ground as an origin, and calculating the world coordinate of the calibration point in the world coordinate system by using the GPS information of the calibration point and the GPS information of the camera.
In the embodiment of the invention, a world coordinate system first needs to be established with the projection point of the camera on the ground as the origin; the world coordinate system is the absolute coordinate system of the objective three-dimensional world, also called the objective coordinate system. After the world coordinate system is established, the world coordinates of the calibration point are calculated. For example: the ground projection point of the installation position of the camera is determined as the origin of the world coordinate system, the geographic north direction is the positive direction of the Z-axis, and the direction perpendicular to the ground and pointing skyward is the positive direction of the Y-axis; the world coordinate system is established accordingly.
In practical application, after the world coordinate system is determined, the world coordinates (X_wi, Y_wi, Z_wi) of the calibration point are obtained by calculation from the distance between the calibration point and the ground projection point, the azimuth angle of the calibration point relative to the ground projection point, and the altitude in the GPS information of the camera, where the GPS information comprises longitude, latitude and altitude. It should be noted that, when calculating the world coordinates of the calibration points, a base value or a weighting may be added on top of the above calculation; in practical applications the calculation may be adjusted according to the actual needs of the system, the weight of each calibration point's information, and so on, and the present invention is not limited in this respect.
Step S06: and calculating to obtain an external reference matrix of the world coordinate system and the camera coordinate system by using the world coordinate of the calibration point and the pixel coordinate of the calibration point, wherein the external reference matrix is used for expressing the conversion relation of the world coordinate system and the camera coordinate system.
In the embodiment of the invention, the world coordinates of the calibration points obtained by the above calculation and the pixel coordinates of the calibration points in the calibration scene are used to calculate the external parameter matrix, which represents the conversion relation between the world coordinate system and the camera coordinate system. In general, the world coordinate system and the camera coordinate system do not coincide: when a point in the world coordinate system is projected onto the image plane, its coordinates are first converted into the camera coordinate system, and the conversion from the world coordinate system to the camera coordinate system can be obtained through a rotation and a translation, from which the external parameter matrix of the two coordinate systems is calculated. The camera coordinate system, also called the optical-centre coordinate system, is a coordinate system established on the camera; it is defined to describe the position of an object from the angle of the camera and serves as the intermediate link between the world coordinate system and the image/pixel coordinate system. For example, the camera coordinate system may be established with the optical centre of the camera as the origin, the X-axis and Y-axis parallel to the X-axis and Y-axis of the image coordinate system respectively, and the optical axis of the camera as the Z-axis.
Alternatively, when calculating the external parameter matrix, the external parameter matrix may be calculated by the following formula:
M1 = [Rr | Tr]    (1)
where M1 denotes the external parameter matrix, Rr denotes the rotation matrix of the camera coordinate system relative to the world coordinate system, and Tr denotes the translation matrix of the camera coordinate system relative to the world coordinate system. It should be noted that, in practical applications, other conventions may be chosen when establishing the camera coordinate system because of special needs; the embodiment of the present invention only illustrates one case of establishing the camera coordinate system, and the present invention is not limited in this respect.
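For illustration only: one simple way to realise formula (1) when the camera's pan and tilt angles and mounting height are known is to compose Rr from the two rotations and take Tr = -Rr·C, where C is the camera centre in world coordinates. The angle conventions below are an assumption, not taken from the patent, which derives the matrix from calibration-point correspondences:

```python
import math

def extrinsic_matrix(pan_deg: float, tilt_deg: float, cam_height: float):
    """Compose a 3x4 external parameter matrix M1 = [Rr | Tr] from a
    pan angle (rotation about the world Y-axis) and a tilt angle
    (rotation about the camera X-axis), with the camera mounted
    cam_height metres above its ground projection point (the world
    origin). A sketch under assumed angle conventions."""
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    ry = [[math.cos(p), 0.0, math.sin(p)],     # pan: rotation about Y
          [0.0, 1.0, 0.0],
          [-math.sin(p), 0.0, math.cos(p)]]
    rx = [[1.0, 0.0, 0.0],                     # tilt: rotation about X
          [0.0, math.cos(t), -math.sin(t)],
          [0.0, math.sin(t), math.cos(t)]]
    # Rr = Rx(tilt) * Ry(pan)
    r = [[sum(rx[i][k] * ry[k][j] for k in range(3)) for j in range(3)]
         for i in range(3)]
    c = [0.0, cam_height, 0.0]                 # camera centre in world coordinates
    tr = [-sum(r[i][k] * c[k] for k in range(3)) for i in range(3)]  # Tr = -Rr*C
    return [r[0] + [tr[0]], r[1] + [tr[1]], r[2] + [tr[2]]]
```

In practice the patent computes Rr and Tr from the world and pixel coordinates of the calibration points (a pose-estimation solve) rather than from known angles; this sketch only shows the structure of [Rr | Tr].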
In the method for acquiring a coordinate transformation matrix provided by the invention, the camera is controlled to move to a single PTZ position, which is adopted as the calibration scene; at least one calibration point is selected, the pixel coordinates of the calibration point are acquired, and the world coordinates of the calibration point are calculated in the established world coordinate system, so that the external parameter matrix expressing the coordinate conversion relation between the world coordinate system and the camera coordinate system can be calculated. When monitoring and positioning a target point, the corresponding pixel coordinates can then be calculated from the world coordinates of the target point and the external parameter matrix. The embodiment of the invention does not need to adjust the PTZ position multiple times; it only needs a suitable calibration scene in which to select the calibration points, which reduces the calibration workload, and the external parameter matrix expressing the coordinate conversion relation between the world coordinate system and the camera coordinate system can convert the world coordinates of the target point into pixel coordinates, which reduces the difficulty of the positioning calculation.
In one embodiment, calculating the world coordinates of the calibration point in the world coordinate system using the GPS information of the calibration point and the GPS information of the camera includes the steps of:
step S051: and acquiring the GPS information of the camera to determine the GPS information of the ground projection point of the camera.
In the embodiment of the present invention, after the camera is erected, the GPS information of the camera can be obtained, and then the GPS information of the camera is projected to the ground to obtain the GPS information of the ground projection point of the camera.
Step S052: and calculating the distance between the calibration point and the ground projection point and the azimuth angle of the calibration point relative to the ground projection point by using the GPS information of the ground projection point and the GPS information of the calibration point.
In the embodiment of the invention, the distance between the calibration point and the ground projection point, and the azimuth angle of the calibration point relative to the ground projection point, are calculated from the GPS information of the two points. Using the longitude and latitude values in the GPS information, the straight-line distance between the two points (in meters or kilometers) and the azimuth angle of the end point relative to the start point can be calculated, where an azimuth of 0 degrees represents due north, 90 degrees due east, 180 degrees due south, 270 degrees due west, and 360 degrees wraps around to due north again. It should be noted that the embodiment of the present invention only illustrates the calculation of the distance and azimuth between the calibration point and the ground projection point; in practical applications, the definitions of units and directions may be set according to actual needs, and the present invention is not limited thereto.
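The distance and azimuth computation of step S052 can be sketched with the standard haversine and initial-bearing formulas. The spherical Earth model, the mean radius constant, and the function name below are assumptions for illustration, not part of the embodiment:

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres (assumed spherical model)

def distance_and_azimuth(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg; 0 = due north,
    90 = due east) from point 1 (ground projection) to point 2 (calibration point)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # haversine formula for the central angle between the two points
    a = math.sin((p2 - p1) / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    d = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    # initial bearing, normalized into [0, 360)
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    beta = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return d, beta
```

For example, a point one degree of longitude due east on the equator yields a bearing of 90 degrees and a distance of roughly 111 km.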
Step S053: and calculating to obtain the world coordinates of the calibration point by using the distance and the azimuth angle.
Alternatively, the world coordinates (Xwi, Ywi, Zwi) of the calibration point are calculated by the following formula:

Xwi = di cos(βi - 1.5π), Ywi = -E, Zwi = di sin(βi - 1.5π) (2)

wherein Xwi denotes the coordinate value of the calibration point on the X axis of the world coordinate system, Ywi denotes its coordinate value on the Y axis, Zwi denotes its coordinate value on the Z axis, di denotes the distance between the calibration point and the ground projection point, βi denotes the azimuth angle of the calibration point relative to the ground projection point, and E denotes the elevation in the GPS information of the camera. In practical applications, the calculation of the world coordinates depends on the established world coordinate system and may be performed according to actual needs, which is not limited by the invention.
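Formula (2) can be sketched directly; the assumption below is that the azimuth is supplied in degrees (per the convention of step S052) and converted to radians before the 1.5π offset is applied:

```python
import math

def world_coords(d_i, beta_i_deg, elevation_e):
    """World coordinates of a calibration point per formula (2).

    d_i: distance to the ground projection point (m)
    beta_i_deg: azimuth in degrees (0 = due north, 90 = due east); assumed unit
    elevation_e: elevation E from the camera GPS information
    """
    b = math.radians(beta_i_deg) - 1.5 * math.pi
    return (d_i * math.cos(b), -elevation_e, d_i * math.sin(b))
```

For instance, a point 10 m due west (azimuth 270 degrees) of a camera at elevation 5 lands at (10, -5, 0) in this coordinate system.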
An embodiment of the present invention further provides a method for positioning an object in a monitoring screen, as shown in fig. 2, the method for positioning an object in a monitoring screen specifically includes:
step S1: the external reference matrix is obtained by the method for obtaining the coordinate transformation matrix in the above embodiment 1.
Step S2: and acquiring GPS information of the target point, and calculating to obtain the world coordinate of the target point by using the GPS information of the target point and the GPS information of the camera.
In the embodiment of the present invention, the GPS information of the target point is acquired in the same manner as that of the calibration point, which is not repeated here; the GPS information of the target point likewise includes longitude, latitude and altitude. The world coordinates (Xw, Yw, Zw) of the target point are then calculated using the GPS information of the target point and the GPS information of the camera. The calculation method is the same as for the calibration point and uses the same world coordinate system, so it is not described here again.
Step S3: and calculating a current internal reference matrix of the camera, wherein the internal reference matrix represents the camera parameter information in the monitoring picture.
In the embodiment of the present invention, in addition to the transformation relationship between the world coordinate system and the camera coordinate system, the transformation relationship between the pixel coordinate system and the camera coordinate system, that is, the current internal reference matrix of the camera, is also required. Each parameter in the internal reference matrix is related only to the internal parameters of the camera and does not change with the position of the object, so when calculating the current internal reference matrix, only the internal parameters of the camera need to be calculated.
Step S4: and calculating to obtain the pixel coordinates of the target point in the calibration scene by using the external reference matrix, the internal reference matrix and the world coordinates of the target point.
In the embodiment of the invention, after the external reference matrix, the internal reference matrix and the world coordinates of the target point are obtained, the pixel coordinates (x, y) of the target point in the calibration scene can be calculated: the world coordinates of the target point are first converted into camera coordinates through the external reference matrix, and the resulting camera coordinates are then converted into pixel coordinates through the internal reference matrix. It should be noted that each coordinate is calculated according to the coordinate systems established in advance; in practical applications, the calculation may be performed according to differently established coordinate systems, which is not limited by the present invention.
Optionally, the pixel coordinates (x, y) of the target point in the calibration scene are calculated by the following formula:
s [x, y, 1]^T = M2 M1 [Xw, Yw, Zw, 1]^T (3)

wherein x denotes the abscissa pixel value of the target point, y denotes the ordinate pixel value of the target point, s denotes the homogeneous scale factor, M1 denotes the external reference matrix, M2 denotes the internal reference matrix, Xw denotes the coordinate value of the target point on the X axis of the world coordinate system, Yw denotes its coordinate value on the Y axis, and Zw denotes its coordinate value on the Z axis.
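The projection described in this step can be exercised numerically. The sketch below is illustrative only: the matrix values are hypothetical, and it assumes the standard pinhole convention in which the homogeneous product is divided by its third component:

```python
import numpy as np

def project_to_pixels(M1, M2, world_point):
    """Project a world point to pixel coordinates via M2 @ M1 @ [Xw, Yw, Zw, 1]^T,
    dividing out the homogeneous scale factor (standard pinhole convention)."""
    Xw, Yw, Zw = world_point
    homog = M2 @ M1 @ np.array([Xw, Yw, Zw, 1.0])
    return homog[0] / homog[2], homog[1] / homog[2]

# Illustrative matrices: identity pose, fx = fy = 1000, principal point (960, 540).
M1 = np.hstack([np.eye(3), np.zeros((3, 1))])
M2 = np.array([[1000.0, 0.0, 960.0],
               [0.0, 1000.0, 540.0],
               [0.0, 0.0, 1.0]])
print(project_to_pixels(M1, M2, (0.0, 0.0, 10.0)))  # point on the optical axis
```

A point on the optical axis projects to the principal point, as expected.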
Step S5: and calculating to obtain the pixel coordinates of the target point in the monitoring picture by utilizing the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene of the camera and the current field angle of the camera.
Specifically, in the embodiment of the present invention, after the pixel coordinates of the target point in the calibration scene are calculated, the information required for the calculation is acquired: the GPS information of the camera, including longitude, latitude and altitude; the PTZ information of the camera set during calibration, including the pan angle and tilt angle of the pan-tilt head; the geographic orientation of the center of the monitoring picture during calibration, that is, the pan angle value at which the camera points due north; the horizontal and vertical field angles of the camera lens during calibration; the GPS information and altitude of each selected calibration point, together with its pixel coordinates in the monitoring picture; the GPS information and altitude of the target point; and the PTZ information and the horizontal and vertical field angles of the lens at the current camera position. Using the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene and the current field angle of the camera, the pixel coordinates (xo, yo) of the target point in the monitoring picture are then calculated by existing methods.
Specifically, using the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene and the current field angle of the camera, the pixel coordinates of the target point in the central area of the calibration scene are first converted into angular coordinates. After the camera operation parameters change, these angular coordinates are converted into the corresponding pixel coordinates in the new scene (that is, the monitoring picture in which the target point is located), where the camera operation parameters include at least one of the pan angle, tilt angle and field angle of the pan-tilt head. It should be noted that the embodiment of the present invention only illustrates this process of calculating the pixel coordinates of the target point in the monitoring picture; in practical applications, other methods known to those skilled in the art may also be used, and the present invention is not limited thereto.
In the target positioning method in a monitoring picture provided by the invention, the GPS information of the calibration points and of the camera is used to calculate world coordinates more accurately, and finally the pixel coordinates of the target point in the monitoring picture are calculated. Real-time positioning, tracking and augmented-reality functions are realized, the three-dimensional visualization effect of video monitoring is enriched, and macroscopic command and scheduling are facilitated. Because the pixel coordinates of the target point on the monitoring picture are calculated from its world coordinates, the position of the target can still be displayed on the picture even when the target point is occluded by an object, so that target positioning and tracking are realized more intuitively. Based on the GPS information of the target point, the position information of any target point in the monitoring picture can be displayed in real time, making otherwise invisible data visible.
In a specific embodiment, the calculating a current internal reference matrix of a camera in the embodiment of the present invention includes the following steps:
step S31: the current horizontal field angle, the vertical field angle and the resolution of the camera image are acquired.
Step S32: and calculating the current internal reference matrix of the camera by using the current horizontal field angle, the current vertical field angle and the resolution of camera imaging.
In a specific embodiment, the current internal reference matrix of the camera is calculated by using the current horizontal field angle, the vertical field angle and the imaging resolution of the camera, and the method comprises the following steps:
step S321: and calculating the current horizontal focal length and the current vertical focal length of the camera by using the current horizontal field angle and the current vertical field angle of the camera and the imaging resolution of the camera.
In the embodiment of the invention, using the current horizontal field angle, the current vertical field angle and the imaging resolution of the camera, the tangent of half of the current horizontal field angle is taken, and the horizontal resolution of the camera imaging is divided by twice this tangent value to obtain the current horizontal focal length of the camera. Similarly, the tangent of half of the current vertical field angle is taken, and the vertical resolution is divided by twice this tangent value to obtain the current vertical focal length. It should be noted that in practical applications a base value or weighting may be added, with corresponding adjustments made according to the actual needs of the system, the proportion of the horizontal or vertical resolution of the device, and so on; the present invention is not limited thereto.
Alternatively, the horizontal focal length and the vertical focal length are calculated by the following formulas:
fx = W / (2 tan(αx / 2)), fy = H / (2 tan(αy / 2)) (4)

wherein fx denotes the horizontal focal length, fy denotes the vertical focal length, αx denotes the current horizontal field angle of the camera, αy denotes the current vertical field angle of the camera, W denotes the horizontal resolution of the camera imaging resolution, and H denotes the vertical resolution of the camera imaging resolution.
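Formula (4) can be sketched as follows; the assumption below is that the field angles are supplied in degrees and converted to radians for the tangent:

```python
import math

def focal_lengths(alpha_x_deg, alpha_y_deg, width, height):
    """Pixel focal lengths from the current fields of view, per formula (4):
    fx = W / (2 tan(ax/2)), fy = H / (2 tan(ay/2))."""
    fx = width / (2.0 * math.tan(math.radians(alpha_x_deg) / 2.0))
    fy = height / (2.0 * math.tan(math.radians(alpha_y_deg) / 2.0))
    return fx, fy

print(focal_lengths(90.0, 90.0, 1920, 1080))  # (960.0, 540.0)
```

With a 90-degree field angle, tan(45 degrees) = 1, so the focal length is simply half the resolution in that direction.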
Step S322: and calculating the abscissa and the ordinate of the principal point on the two-dimensional plane formed by the camera imaging plane by using the resolution of the camera imaging.
In the embodiment of the present invention, by using the resolution of the camera imaging, half of the horizontal resolution of the camera imaging resolution is determined as the abscissa of the principal point on the two-dimensional plane formed by the camera imaging plane, and similarly, half of the vertical resolution of the camera imaging resolution is determined as the ordinate of the principal point on the two-dimensional plane formed by the camera imaging plane.
Alternatively, the abscissa and ordinate of the principal point on the two-dimensional plane formed by the camera imaging plane are calculated by the following formulas:
u0 = W / 2, v0 = H / 2 (5)

wherein u0 denotes the abscissa value of the principal point on the two-dimensional plane formed by the camera imaging plane, v0 denotes the ordinate value of the principal point on that plane, W denotes the horizontal resolution of the camera imaging resolution, and H denotes the vertical resolution of the camera imaging resolution.
Step S323: and calculating to obtain the current internal reference matrix of the camera by using the horizontal focal length, the vertical focal length, and the abscissa and the ordinate on the two-dimensional plane.
In the embodiment of the invention, the current internal reference matrix of the camera is calculated from the horizontal focal length, the vertical focal length, and the abscissa and ordinate of the principal point on the two-dimensional plane. This matrix expresses the transformation relationship between the pixel coordinate system and the camera coordinate system. Each parameter in the internal reference matrix is related only to the internal parameters of the camera and does not change with the position of the object, so when calculating the current internal reference matrix, only the internal parameters of the camera need to be calculated.
Optionally, the internal reference matrix is calculated by the following formula:
M2 = [ fx 0 u0 ; 0 fy v0 ; 0 0 1 ] (6)

wherein M2 denotes the internal reference matrix, fx denotes the horizontal focal length, fy denotes the vertical focal length, u0 denotes the abscissa value of the principal point on the two-dimensional plane formed by the camera imaging plane, and v0 denotes the ordinate value of the principal point on that plane.
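Steps S321 to S323 (formulas (4) through (6)) can be combined into one short sketch that assembles the internal reference matrix; degree-valued field angles are again an assumption:

```python
import math
import numpy as np

def intrinsic_matrix(alpha_x_deg, alpha_y_deg, width, height):
    """Assemble the internal reference matrix M2 of formula (6) from the
    current fields of view and the imaging resolution (formulas (4) and (5))."""
    fx = width / (2.0 * math.tan(math.radians(alpha_x_deg) / 2.0))   # formula (4)
    fy = height / (2.0 * math.tan(math.radians(alpha_y_deg) / 2.0))  # formula (4)
    u0, v0 = width / 2.0, height / 2.0                               # formula (5)
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

M2 = intrinsic_matrix(90.0, 60.0, 1920, 1080)
```

Because the matrix depends only on field angles and resolution, it can be recomputed whenever the camera zoom (and hence the field of view) changes.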
In a specific embodiment, as shown in fig. 3, the method for positioning an object in a monitoring screen according to an embodiment of the present invention further includes the following steps:
step S6: and judging whether the pixel coordinates of the target point in the camera calibration scene exceed the imaging resolution of the camera.
In the embodiment of the invention, in practical applications, if the calculated pixel coordinate exceeds either the horizontal or the vertical resolution of the camera imaging resolution, the target point lies outside the field of view of the camera calibration scene. The field of view of the monitoring picture is then some distance away from the calibration position, and the larger this distance, the larger the difference between the calculated pixel coordinates and the actual values, so a correction is needed to reduce the deviation.
Step S7: when the pixel coordinate of the target point in the camera calibration scene exceeds the imaging resolution of the camera, prompting that the target point is not in the monitoring picture, and correcting the pixel coordinate of the target point in the monitoring picture to obtain a corrected pixel coordinate.
In the embodiment of the invention, when the pixel coordinates of the target point in the camera calibration scene exceed the imaging resolution of the camera, the pixel coordinates of the target point in the monitoring picture are corrected. In this case the difference between the calculated pixel coordinates and the actual values is large, and correcting the pixel coordinates of target points far from the calibration position yields the corrected pixel coordinates.
In practical applications, after the step of judging whether the pixel coordinates of the target point in the camera calibration scene exceed the imaging resolution of the camera, the method further includes the following step:
step S8: and when the pixel coordinate of the target point in the camera calibration scene does not exceed the imaging resolution of the camera, determining the pixel coordinate of the target point in the monitoring picture as the final pixel coordinate.
In a specific embodiment, the method for correcting the pixel coordinates of the target point in the monitoring picture to obtain the corrected pixel coordinates in the embodiment of the present invention includes the following steps:

Step S71: acquiring the current pan angle P0 and tilt angle T0 of the camera at the calibration position.
Step S72: selecting a new calibration scene within a preset range of the calibration scene, selecting a plurality of groups of new calibration points in a monitoring picture of the new calibration scene, and determining the recording pixel coordinates of each new calibration point in the monitoring picture of the new calibration scene and the current pan angle and tilt angle of the camera at the new calibration position in a manual picking mode.
In the embodiment of the invention, a new calibration scene is selected within a preset range of the calibration scene, a plurality of groups of new calibration points are selected in the monitoring picture of the new calibration scene, and the recorded pixel coordinates of each new calibration point in that picture, together with the current pan angle Pi and tilt angle Ti of the camera at the new calibration position, are determined by manual picking. It should be noted that after the camera is erected, camera parameters such as the monitoring picture, the pan angle and the tilt angle are determined and can be set according to actual needs; the preset range is likewise selected according to actual needs, and the present invention is not limited thereto.
Step S73: and calculating the world coordinates of each new calibration point, and calculating to obtain the calculated pixel coordinates of each new calibration point in the new calibration scene by using the current external reference matrix and the internal reference matrix of the camera at the new calibration position and the world coordinates of each new calibration point.
In the embodiment of the present invention, the calculation methods for calculating the world coordinates of the new calibration point, the current external reference matrix and the internal reference matrix of the camera are the same as those for calculating the calibration point, and are not repeated in the embodiment of the present invention.
Step S74: and calculating the horizontal coordinate difference value and the vertical coordinate difference value of each new calibration point by using each recorded pixel coordinate and each calculated pixel coordinate, and calculating the average value of the horizontal coordinate difference values and the average value of the vertical coordinate difference values of a plurality of groups of new calibration points by using the horizontal coordinate difference value and the vertical coordinate difference value of each new calibration point.
In the embodiment of the invention, using each recorded pixel coordinate (xk, yk) and each calculated pixel coordinate (x'k, y'k), where k = 1, ..., K indexes the K new calibration points of a group, the abscissa difference and the ordinate difference of each new calibration point are calculated, and from these the average abscissa difference Δxi and the average ordinate difference Δyi of each group of new calibration points are obtained.

Step S75: calculating the pan angle difference ΔPi between the current pan angle of the camera at the calibration position and the current pan angle of the camera at the new calibration position, and calculating the tilt angle difference ΔTi between the current tilt angle of the camera at the calibration position and the current tilt angle of the camera at the new calibration position.
Step S76: fitting a plurality of groups of functional relationships using the average abscissa differences, the average ordinate differences, the pan angle differences and the tilt angle differences, and calculating the corrected pixel coordinates using these functional relationships.
In a specific embodiment, a large amount of data from actual measurements shows that an approximately linear relationship exists between the average abscissa differences, the average ordinate differences, the pan angle differences and the tilt angle differences. The embodiment of the present invention therefore fits a plurality of groups of functional relationships from these quantities, in the following steps:
step S761: and linearly fitting the functional relationship between the average value of the rolling coordinate difference and the rolling angle difference by using the average value of the rolling coordinate difference and the rolling angle difference, and fitting to obtain the rolling slope and the rolling intercept of the functional relationship between the average value of the rolling coordinate difference and the rolling angle difference.
In the embodiment of the invention, linear fitting with the i groups of Δxi and ΔPi yields the following expression:

Δxp = apxΔP + bpx (7)

wherein apx denotes the roll slope and bpx denotes the roll intercept.
Step S762: linearly fitting the functional relationship between the average abscissa difference and the tilt angle difference, obtaining the lateral tilt slope and the lateral tilt intercept of this relationship.
In the embodiment of the invention, linear fitting with the i groups of Δxi and ΔTi yields the following expression:

ΔxT = aTxΔT + bTx (8)

wherein aTx denotes the lateral tilt slope and bTx denotes the lateral tilt intercept.
Step S763: linearly fitting the functional relationship between the average ordinate difference and the pan angle difference, obtaining the pitch slope and the pitch intercept of this relationship.
In the embodiment of the invention, linear fitting with the i groups of Δyi and ΔPi yields the following expression:

Δyp = apyΔP + bpy (9)

wherein apy denotes the pitch slope and bpy denotes the pitch intercept.
Step S764: linearly fitting the functional relationship between the average ordinate difference and the tilt angle difference, obtaining the longitudinal tilt slope and the longitudinal tilt intercept of this relationship.
In the embodiment of the present invention, linear fitting with the i groups of Δyi and ΔTi yields the following expression:

ΔyT = aTyΔT + bTy (10)

wherein aTy denotes the longitudinal tilt slope and bTy denotes the longitudinal tilt intercept.
Further, the corrected pixel coordinates (xc, yc) are calculated by the following formula:

xc = xo + apx(P0 - P) + bpx + aTx(T0 - T) + bTx, yc = yo + apy(P0 - P) + bpy + aTy(T0 - T) + bTy (11)

wherein xc denotes the abscissa pixel value of the corrected pixel coordinates, yc denotes the ordinate pixel value of the corrected pixel coordinates, xo denotes the abscissa pixel value of the pixel coordinates of the target point in the monitoring picture, yo denotes the ordinate pixel value of the pixel coordinates of the target point in the monitoring picture, apx denotes the roll slope, bpx denotes the roll intercept, aTx denotes the lateral tilt slope, bTx denotes the lateral tilt intercept, apy denotes the pitch slope, bpy denotes the pitch intercept, aTy denotes the longitudinal tilt slope, bTy denotes the longitudinal tilt intercept, P0 denotes the camera pan angle at the calibration position, T0 denotes the camera tilt angle at the calibration position, P denotes the camera pan angle at the target point, and T denotes the camera tilt angle at the target point.
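How the fitted relationships combine into the corrected coordinates follows the correction formula of this step. The sketch below assumes one plausible reading: the pan and tilt offsets, evaluated at the differences relative to the calibration position, are added to the computed pixel coordinates. The helper name, the tuple layout and the sign convention are all assumptions:

```python
def corrected_pixel(xo, yo, P, T, P0, T0, fits):
    """Apply the pan/tilt correction to the computed pixel coordinates (xo, yo).

    fits holds the eight fitted constants in the order
    (a_px, b_px, a_Tx, b_Tx, a_py, b_py, a_Ty, b_Ty).
    Sign convention (assumed): differences are taken relative to the
    calibration position, dP = P0 - P and dT = T0 - T.
    """
    a_px, b_px, a_Tx, b_Tx, a_py, b_py, a_Ty, b_Ty = fits
    dP, dT = P0 - P, T0 - T
    xc = xo + a_px * dP + b_px + a_Tx * dT + b_Tx
    yc = yo + a_py * dP + b_py + a_Ty * dT + b_Ty
    return xc, yc
```

In a deployment the eight constants would come from the fits of steps S761 to S764 rather than being hand-specified.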
In the target positioning method in a monitoring picture provided by the invention, the GPS information of the calibration points and of the camera is used to calculate world coordinates more accurately, and the pixel coordinates of the target point are finally calculated. The calculated pixel coordinates are then checked, and target points that fall outside the camera calibration scene are corrected: within a preset range of the calibration scene, a plurality of groups of functional relationships are fitted between the differences of the manually picked and calculated pixel coordinates and the current camera parameters, and the corrected pixel coordinates of the target point are calculated. Real-time positioning, tracking and augmented-reality functions are realized, the three-dimensional visualization effect of video monitoring is enriched, and macroscopic command and scheduling are facilitated. Because the pixel coordinates of the target point on the monitoring picture are calculated from its world coordinates, the position of the target can still be displayed on the picture even when the target point is occluded by an object, so that target positioning and tracking are realized more intuitively. Based on the GPS information of the target point, the position information of any target point in the monitoring picture can be displayed in real time, making otherwise invisible data visible. Moreover, the error correction improves the accuracy of target positioning and tracking and reduces errors.
An embodiment of the present invention provides an electronic device, as shown in fig. 4, including: at least one processor 401, such as a CPU (Central Processing Unit), at least one communication interface 403, a memory 404, and at least one communication bus 402, wherein the communication bus 402 is used to enable communication between these components. The communication interface 403 may include a display and a keyboard, and may optionally also include a standard wired interface and a standard wireless interface. The memory 404 may be a RAM (Random Access Memory) or a non-volatile memory, such as at least one disk memory; optionally, the memory 404 may also be at least one storage device located remotely from the processor 401. A set of program codes is stored in the memory 404, and the processor 401 calls the program codes stored in the memory 404 to execute the above-mentioned method for acquiring a coordinate transformation matrix or the above-mentioned method for positioning a target in a monitoring picture.
The communication bus 402 may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The communication bus 402 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one line is shown in fig. 4, but this does not mean that there is only one bus or one type of bus.
The memory 404 may include a volatile memory, such as a random-access memory (RAM); the memory may also include a non-volatile memory, such as a flash memory, a hard disk drive (HDD), or a solid-state drive (SSD); the memory 404 may also comprise a combination of the above kinds of memory.
The processor 401 may be a Central Processing Unit (CPU), a Network Processor (NP), or a combination of a CPU and an NP.
The processor 401 may further include a hardware chip, which may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD), or a combination thereof. The PLD may be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a generic array logic (GAL), or any combination thereof.
Optionally, the memory 404 is also used to store program instructions. The processor 401 may call a program instruction to implement the method for acquiring the coordinate transformation matrix described above, or implement the method for positioning an object in the monitoring screen described above.
An embodiment of the invention also provides a computer-readable storage medium storing computer-executable instructions which can execute the above coordinate transformation matrix acquisition method or the above method for locating a target in the monitoring picture. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), a flash memory, a hard disk drive (HDD), a solid-state drive (SSD), or the like; the storage medium may also comprise a combination of the above kinds of memory.
It should be understood that the above embodiments are only examples given for clarity of illustration and are not intended to limit the embodiments. Other variations and modifications will be apparent to persons skilled in the art in light of the above description; it is neither necessary nor possible to exhaust all embodiments here. Obvious variations or modifications may be made without departing from the spirit or scope of the invention.

Claims (11)

1. A method for acquiring a coordinate transformation matrix is characterized by comprising the following steps:
controlling a camera to move to a preset PTZ position, and taking the PTZ position as a calibration scene;
selecting at least one calibration point from the calibration scene;
acquiring pixel coordinates of the calibration point in a camera coordinate system corresponding to the camera in a calibration scene;
acquiring GPS information of the calibration point and GPS information of the camera;
establishing a world coordinate system by taking a projection point of the camera on the ground as an origin, and calculating a world coordinate of the calibration point in the world coordinate system by using the GPS information of the calibration point and the GPS information of the camera;
and calculating to obtain an external reference matrix of the world coordinate system and the camera coordinate system by using the world coordinate of the calibration point and the pixel coordinate of the calibration point, wherein the external reference matrix is used for expressing the conversion relation between the world coordinate system and the camera coordinate system.
2. The method of claim 1, wherein calculating the world coordinates of the calibration point in a world coordinate system using the GPS information of the calibration point and the GPS information of the camera comprises:
acquiring GPS information of the camera to determine the GPS information of a ground projection point of the camera;
calculating the distance between the calibration point and the ground projection point and the azimuth angle of the calibration point relative to the ground projection point by using the GPS information of the ground projection point and the GPS information of the calibration point;
and calculating to obtain the world coordinate of the calibration point by using the distance and the azimuth angle.
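One common way to realize the distance and azimuth computation of claim 2 from two GPS fixes is the haversine distance together with the standard forward-azimuth formula. This is a sketch; the patent does not prescribe which geodesic formulas are used.

```python
import math

R_EARTH = 6371000.0  # mean Earth radius in metres

def distance_azimuth(lat1, lon1, lat2, lon2):
    """Ground distance (m) and initial bearing (rad, clockwise from north)
    from point 1 (the camera's ground projection point) to point 2
    (the calibration point). Inputs are in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = p2 - p1
    dlon = math.radians(lon2 - lon1)
    # Haversine distance
    h = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    d = 2 * R_EARTH * math.asin(math.sqrt(h))
    # Forward azimuth
    beta = math.atan2(math.sin(dlon) * math.cos(p2),
                      math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon))
    return d, beta % (2 * math.pi)
```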
3. The method of claim 2, wherein the world coordinates (X_wi, Y_wi, Z_wi) of the calibration point are calculated by the following formula:
X_wi = d_i·cos(β_i − 1.5π), Y_wi = −E, Z_wi = d_i·sin(β_i − 1.5π)
wherein X_wi denotes the coordinate value of the calibration point on the X axis of the world coordinate system, Y_wi denotes the coordinate value of the calibration point on the Y axis of the world coordinate system, Z_wi denotes the coordinate value of the calibration point on the Z axis of the world coordinate system, d_i denotes the distance between the calibration point and the ground projection point, β_i denotes the azimuth angle of the calibration point relative to the ground projection point, and E denotes the elevation in the GPS information of the camera.
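The formula of claim 3 translates directly into code (the function name is hypothetical):

```python
import math

def world_coords(d_i, beta_i, elevation):
    """World coordinates of a calibration point per claim 3.

    d_i: ground distance from the camera's ground projection point
    beta_i: azimuth relative to that projection point, in radians
    elevation: E, the elevation from the camera GPS information
    """
    x = d_i * math.cos(beta_i - 1.5 * math.pi)
    y = -elevation                     # world Y axis points down to the ground plane
    z = d_i * math.sin(beta_i - 1.5 * math.pi)
    return (x, y, z)
```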
4. The method of claim 1, wherein the external reference matrix is calculated by the following formula:
M_1 = [Rr|Tr]
wherein M_1 denotes the external reference matrix, Rr denotes the rotation matrix of the camera coordinate system relative to the world coordinate system, and Tr denotes the translation matrix of the camera coordinate system relative to the world coordinate system.
5. A method for positioning an object in a monitoring picture is characterized by comprising the following steps:
acquiring an external parameter matrix by using the coordinate transformation matrix acquisition method of any one of claims 1 to 4;
acquiring GPS information of a target point, and calculating to obtain a world coordinate of the target point by using the GPS information of the target point and the GPS information of the camera;
calculating a current internal reference matrix of the camera, wherein the internal reference matrix represents camera parameter information in the monitoring picture;
calculating to obtain the pixel coordinates of the target point in the calibration scene by using the external reference matrix, the internal reference matrix and the world coordinates of the target point;
and calculating to obtain the pixel coordinates of the target point in the monitoring picture by utilizing the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene of the camera and the current field angle of the camera.
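The final step of claim 5 maps pixel coordinates from the calibration scene into the current monitoring picture using the field-angle relationship. A sketch under the simplifying assumption that only the zoom (field angle) differs between the two views, so the mapping reduces to a focal-length rescale about the image centre; the claim itself does not fix this form.

```python
import math

def remap_pixel(u, v, hfov_cal, vfov_cal, hfov_cur, vfov_cur, width, height):
    """Map a pixel from the calibration scene to the current picture,
    assuming pan/tilt unchanged and only the field angles differing.
    Field angles are in radians."""
    cx, cy = width / 2.0, height / 2.0
    # focal length f = (W/2)/tan(hfov/2), so f_cur/f_cal is a tangent ratio
    sx = math.tan(hfov_cal / 2.0) / math.tan(hfov_cur / 2.0)
    sy = math.tan(vfov_cal / 2.0) / math.tan(vfov_cur / 2.0)
    return cx + (u - cx) * sx, cy + (v - cy) * sy
```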
6. The method of claim 5, wherein the calculating the current internal reference matrix of the camera comprises:
acquiring the current horizontal field angle, the current vertical field angle and the imaging resolution of the camera;
and calculating the current internal reference matrix of the camera by using the current horizontal field angle, the current vertical field angle and the imaging resolution of the camera.
7. The method of claim 6, wherein the calculating the current camera internal reference matrix using the current horizontal field angle, the current vertical field angle and the resolution of the camera image comprises:
calculating the current horizontal focal length and the current vertical focal length of the camera by using the current horizontal field angle and the current vertical field angle of the camera and the imaging resolution of the camera;
calculating an abscissa and an ordinate on the two-dimensional plane formed by the camera imaging plane by using the imaging resolution of the camera;
and calculating to obtain the current internal reference matrix of the camera by using the horizontal focal length, the vertical focal length, the abscissa and the ordinate on the two-dimensional plane.
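Claims 6 and 7 can be sketched as follows: the focal lengths follow from the field angles and the imaging resolution, and the principal point is taken at the image centre (an assumption for illustration; the claim only speaks of an abscissa and an ordinate on the imaging plane).

```python
import math

def intrinsic_matrix(hfov, vfov, width, height):
    """Current internal reference matrix from the horizontal/vertical
    field angles (radians) and the imaging resolution (pixels)."""
    fx = (width / 2.0) / math.tan(hfov / 2.0)   # current horizontal focal length
    fy = (height / 2.0) / math.tan(vfov / 2.0)  # current vertical focal length
    cx, cy = width / 2.0, height / 2.0          # principal point at image centre
    return [[fx, 0.0, cx],
            [0.0, fy, cy],
            [0.0, 0.0, 1.0]]
```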
8. The method according to claim 5, wherein the pixel coordinates (x, y) of the target point in the calibration scene are calculated by the following formula:
s·[x, y, 1]^T = M_2·M_1·[X_w, Y_w, Z_w, 1]^T
wherein x denotes the pixel value of the abscissa of the target point, y denotes the pixel value of the ordinate of the target point, s denotes a scale factor, M_1 denotes the external reference matrix, M_2 denotes the internal reference matrix, X_w denotes the coordinate value of the target point on the X axis of the world coordinate system, Y_w denotes the coordinate value of the target point on the Y axis of the world coordinate system, and Z_w denotes the coordinate value of the target point on the Z axis of the world coordinate system.
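The projection of claim 8 — the internal matrix times the external matrix times the homogeneous world point, followed by division by depth — as a small plain-Python sketch (function name hypothetical):

```python
def project(M2, M1, Pw):
    """Pixel coordinates of world point Pw = (Xw, Yw, Zw).

    M1: 3x4 external reference matrix [Rr|Tr]
    M2: 3x3 internal reference matrix
    """
    Pw_h = [Pw[0], Pw[1], Pw[2], 1.0]   # homogeneous world point
    # Camera coordinates: Pc = M1 * Pw_h
    Pc = [sum(M1[i][j] * Pw_h[j] for j in range(4)) for i in range(3)]
    # Image coordinates: apply intrinsics, then divide by the depth (scale s)
    uvw = [sum(M2[i][j] * Pc[j] for j in range(3)) for i in range(3)]
    return (uvw[0] / uvw[2], uvw[1] / uvw[2])
```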
9. The method according to claim 5, further comprising, after calculating the pixel coordinates of the target point in the monitoring picture by using the pixel coordinates of the target point in the calibration scene and the relationship between the calibration scene of the camera and the current field angle of the camera:
judging whether the pixel coordinates of the target point in the camera calibration scene exceed the imaging resolution of the camera;
and when the pixel coordinate of the target point in the camera calibration scene exceeds the imaging resolution of the camera, prompting that the target point is not in the monitoring picture.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium stores computer instructions which, when executed by a processor, implement the method of any one of claims 1-4 or the method of any one of claims 5-9.
11. An electronic device, comprising:
a memory and a processor, the memory and the processor being communicatively coupled to each other, the memory having stored therein computer instructions, the processor performing the method of any of claims 1-4 or performing the method of any of claims 5-9 by executing the computer instructions.
CN202010239392.2A 2020-03-30 2020-03-30 Method for obtaining coordinate transformation matrix and positioning target in monitoring picture Pending CN111461994A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010239392.2A CN111461994A (en) 2020-03-30 2020-03-30 Method for obtaining coordinate transformation matrix and positioning target in monitoring picture


Publications (1)

Publication Number Publication Date
CN111461994A true CN111461994A (en) 2020-07-28

Family

ID=71680203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010239392.2A Pending CN111461994A (en) 2020-03-30 2020-03-30 Method for obtaining coordinate transformation matrix and positioning target in monitoring picture

Country Status (1)

Country Link
CN (1) CN111461994A (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107464264A (en) * 2016-06-02 2017-12-12 南京理工大学 A kind of camera parameter scaling method based on GPS
CN109413384A (en) * 2018-10-19 2019-03-01 天津天地人和企业管理咨询有限公司 Video monitoring system and method based on GPS information and PTZ
CN109523471A (en) * 2018-11-16 2019-03-26 厦门博聪信息技术有限公司 A kind of conversion method, system and the device of ground coordinate and wide angle cameras picture coordinate


Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132909A (en) * 2020-09-23 2020-12-25 字节跳动有限公司 Parameter acquisition method and device, media data processing method and storage medium
CN114333199A (en) * 2020-09-30 2022-04-12 中国电子科技集团公司第五十四研究所 Alarm method, equipment, system and chip
CN114333199B (en) * 2020-09-30 2024-03-26 中国电子科技集团公司第五十四研究所 Alarm method, equipment, system and chip
CN112308930A (en) * 2020-10-30 2021-02-02 杭州海康威视数字技术股份有限公司 Camera external parameter calibration method, system and device
CN112308930B (en) * 2020-10-30 2023-10-10 杭州海康威视数字技术股份有限公司 Camera external parameter calibration method, system and device
CN112465911A (en) * 2020-11-06 2021-03-09 北京迈格威科技有限公司 Image processing method and device
CN114511640A (en) * 2020-11-17 2022-05-17 北京四维图新科技股份有限公司 Method, device and storage medium for calibrating camera by using map
CN112561990A (en) * 2021-01-21 2021-03-26 禾多科技(北京)有限公司 Positioning information generation method, device, equipment and computer readable medium
CN112925002A (en) * 2021-02-07 2021-06-08 沈阳航空航天大学 Distributed visual positioning method for non-cooperative target in air
CN112925002B (en) * 2021-02-07 2023-09-26 沈阳航空航天大学 Distributed visual positioning method for non-cooperative targets in space
CN113033426B (en) * 2021-03-30 2024-03-01 北京车和家信息技术有限公司 Dynamic object labeling method, device, equipment and storage medium
CN113033426A (en) * 2021-03-30 2021-06-25 北京车和家信息技术有限公司 Dynamic object labeling method, device, equipment and storage medium
CN113115002A (en) * 2021-04-13 2021-07-13 大庆安瑞达科技开发有限公司 Oil and gas field personnel and vehicle positioning associated video monitoring method
CN113223087A (en) * 2021-07-08 2021-08-06 武大吉奥信息技术有限公司 Target object geographic coordinate positioning method and device based on video monitoring
CN113572960A (en) * 2021-07-23 2021-10-29 武汉星环恒宇信息科技有限公司 Video rapid label positioning method for water affair prevention and control
CN113572960B (en) * 2021-07-23 2023-11-14 武汉星环恒宇信息科技有限公司 Video quick tag positioning method for water affair prevention and control
CN113781575A (en) * 2021-08-09 2021-12-10 上海奥视达智能科技有限公司 Camera parameter calibration method, device, terminal and storage medium
CN113781575B (en) * 2021-08-09 2024-01-12 上海奥视达智能科技有限公司 Calibration method and device for camera parameters, terminal and storage medium
US12125240B2 (en) 2021-09-07 2024-10-22 Hong Kong Applied Science And Technology Research Institute Co., Ltd Camera calibration method
US11625860B1 (en) 2021-09-07 2023-04-11 Hong Kong Applied Science And Technology Research Institute Co., Ltd. Camera calibration method
CN114199124A (en) * 2021-11-09 2022-03-18 汕头大学 Coordinate calibration method, device, system and medium based on linear fitting
WO2023087894A1 (en) * 2021-11-18 2023-05-25 京东方科技集团股份有限公司 Region adjustment method and apparatus, and camera and storage medium
CN114119651A (en) * 2021-11-30 2022-03-01 重庆紫光华山智安科技有限公司 Target tracking method, system, device and storage medium
CN114608555A (en) * 2022-02-28 2022-06-10 珠海云洲智能科技股份有限公司 Target positioning method, system and storage medium
CN114638980A (en) * 2022-03-04 2022-06-17 支付宝(杭州)信息技术有限公司 Dish type identification processing method and device
CN114754743B (en) * 2022-04-18 2024-09-10 中国人民解放军陆军军事交通学院军事交通运输研究所 Target positioning method for carrying multiple PTZ cameras on ground intelligent unmanned platform
CN114754743A (en) * 2022-04-18 2022-07-15 中国人民解放军陆军军事交通学院军事交通运输研究所 Target positioning method for carrying multiple PTZ cameras on intelligent ground unmanned platform
CN115588040A (en) * 2022-09-09 2023-01-10 四川省寰宇众恒科技有限公司 System and method for counting and positioning coordinates based on full-view imaging points
CN115375779A (en) * 2022-10-27 2022-11-22 智广海联(天津)大数据技术有限公司 Method and system for marking AR (augmented reality) real scene of camera
CN116228888B (en) * 2023-04-21 2023-08-22 智广海联(天津)大数据技术有限公司 Conversion method and system for geographic coordinates and PTZ camera coordinates
CN116228888A (en) * 2023-04-21 2023-06-06 智广海联(天津)大数据技术有限公司 Conversion method and system for geographic coordinates and PTZ camera coordinates
CN117095066A (en) * 2023-10-18 2023-11-21 智广海联(天津)大数据技术有限公司 Method and device for marking PTZ camera screen
CN117095066B (en) * 2023-10-18 2024-01-05 智广海联(天津)大数据技术有限公司 Method and device for marking PTZ camera screen
CN118587290A (en) * 2024-08-06 2024-09-03 浙江大华技术股份有限公司 Coordinate conversion method, apparatus and storage medium

Similar Documents

Publication Publication Date Title
CN111461994A (en) Method for obtaining coordinate transformation matrix and positioning target in monitoring picture
US10681271B2 (en) Image processing apparatus, image capturing system, image processing method, and recording medium
CN110310248B (en) A kind of real-time joining method of unmanned aerial vehicle remote sensing images and system
US9600859B2 (en) Image processing device, image processing method, and information processing device
US20190340737A1 (en) Image processing apparatus, image processing system, image capturing system, image processing method, and recording medium
US10467726B2 (en) Post capture imagery processing and deployment systems
JP6398472B2 (en) Image display system, image display apparatus, image display method, and program
CN103200358B (en) Coordinate transformation method between video camera and target scene and device
CN113345028B (en) Method and equipment for determining target coordinate transformation information
WO2022042350A1 (en) Target sky area image acquisition method and astrophotography device
CN110736447B (en) Vertical-direction horizontal position calibration method for integrated image acquisition equipment
KR20130121290A (en) Georeferencing method of indoor omni-directional images acquired by rotating line camera
CN114565677A (en) Positioning deviation rectifying method, monitoring equipment and computer readable storage medium
CN110749311B (en) Positioning method, positioning device and storage medium
US20080170799A1 (en) Method for calibrating a response curve of a camera
CN116817929B (en) Method and system for simultaneously positioning multiple targets on ground plane by unmanned aerial vehicle
CN111598930B (en) Color point cloud generation method and device and terminal equipment
CN116228888B (en) Conversion method and system for geographic coordinates and PTZ camera coordinates
CN109377529B (en) Method, system and device for converting ground coordinates and picture coordinates of PTZ camera
CN112230257A (en) Positioning monitoring method based on PTZ camera
CN111649716A (en) Space point-to-point distance measuring and calculating method based on panoramic image
CN107655458B (en) Panorama scene automatic association method based on GIS
CN115511961A (en) Three-dimensional space positioning method, system and storage medium
JP6610741B2 (en) Image display system, image display apparatus, image display method, and program
CN113810606A (en) Shooting target position determining method, shooting target position determining equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200728