CN109472829B - Object positioning method, device, equipment and storage medium - Google Patents


Publication number
CN109472829B
Authority
CN
China
Prior art keywords
camera
coordinate system
calibration object
image data
calibration
Prior art date
Legal status
Active
Application number
CN201811025819.8A
Other languages
Chinese (zh)
Other versions
CN109472829A
Inventor
杨小平
宋翔
胡志恒
Current Assignee
SF Technology Co Ltd
Original Assignee
SF Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SF Technology Co Ltd
Priority to CN201811025819.8A
Publication of CN109472829A
Application granted
Publication of CN109472829B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/70: Determining position or orientation of objects or cameras

Abstract

The application discloses an object positioning method, apparatus, device, and storage medium. The method comprises the following steps: calculating intrinsic parameters of a camera based on first image data, where the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves at a constant, predetermined speed along a predetermined route; calculating extrinsic parameters of the camera relative to a second calibration object based on second image data, where the second image data is image data of the second calibration object acquired by the camera, the position of the second calibration object relative to the camera is fixed, and the second calibration object is arranged on the ground; acquiring pixel coordinates of a target object in third image data acquired by the camera; and determining the ground position of the target object in a world coordinate system whose origin is the projection point of the camera on the ground, according to the intrinsic and extrinsic parameters of the camera and the pixel coordinates of the target object. With the technical solution of the embodiments of the application, a monocular camera can accurately position an object.

Description

Object positioning method, device, equipment and storage medium
Technical Field
The present application relates generally to the field of vision measurement technology, and more particularly, to a method, an apparatus, a device, and a storage medium for object positioning.
Background
Computer vision uses cameras and computers to acquire data and information about a photographed target object, with the goal of enabling the computer to perceive its environment. The subjects of computer vision are mainly three-dimensional scenes mapped onto one or more images, for example the reconstruction of a three-dimensional scene. With the development of computer vision technology, the demand for scene understanding keeps increasing, but in practical applications scene understanding still faces various problems that are difficult to solve.
For example, an airport may be equipped with a seamless camera monitoring system through which video streams from various positions in the airport can be retrieved and video within a certain time range can be traced back. However, current monitoring systems can only record; they do not understand the content of the scene, and understanding or analysis still requires manual participation.
Understanding a scene begins with locating the objects in it, so as to determine whether each object properly appears where it should, and to give a corresponding real-time alarm when an object appears where it should not.
In a seamless airport camera monitoring system, because the airport environment is very large, the cameras are distributed widely and sparsely, and it is difficult to form binocular imaging between any two cameras. Objects in the airport therefore cannot be positioned with a binocular camera, so accurately positioning objects with a monocular camera becomes a problem that urgently needs to be solved.
Disclosure of Invention
In view of the above-mentioned drawbacks and deficiencies of the prior art, it is desirable to provide a solution for accurately positioning an object based on a monocular camera.
In a first aspect, an embodiment of the present application provides an object positioning method, where the method includes:
calculating intrinsic parameters of a camera based on first image data, wherein the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves at a constant speed along a preset route at a preset speed;
calculating an external parameter of the camera relative to a second calibration object based on second image data, wherein the second image data is image data of the second calibration object acquired by the camera, and the position of the second calibration object relative to the camera is fixed, and the second calibration object is arranged on the ground;
acquiring pixel coordinates of a target object in third image data acquired by the camera;
and determining the ground position of the target object in a world coordinate system taking the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the target object.
In a second aspect, an embodiment of the present application further provides an object positioning apparatus, where the apparatus includes:
the internal parameter calculating unit is used for calculating internal parameters of the camera based on first image data, wherein the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves at a constant speed along a preset route according to a preset speed;
the external parameter calculation unit is used for calculating an external parameter of the camera relative to a second calibration object based on second image data, wherein the second image data is image data of the second calibration object acquired by the camera, and the second calibration object is arranged on the ground and is fixed relative to the position of the camera;
the coordinate acquisition unit is used for acquiring pixel coordinates of a target object in third image data acquired by the camera;
and the positioning unit is used for determining the ground position of the target object in a world coordinate system with the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the target object.
In a third aspect, an embodiment of the present application further provides a computer device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the object positioning method is implemented.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; the computer program, when executed by a processor, implements the object positioning method described above.
According to the object positioning scheme, the internal parameters of the camera are calculated through the first calibration object, the external parameters of the camera are calculated through the second calibration object, and the ground position of the target object in a world coordinate system with the projection point of the camera on the ground as the origin is determined through the internal parameters and the external parameters of the camera and the collected pixel coordinates of the target object. According to the technical scheme, the object can be accurately positioned by the monocular camera.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the detailed description of non-limiting embodiments made with reference to the following drawings:
fig. 1 illustrates an exemplary flowchart of an object locating method provided in an embodiment of the present application;
FIG. 2 shows a schematic diagram of a camera placed in a world coordinate system;
FIG. 3 is a block diagram illustrating an exemplary structure of an object positioning apparatus according to an embodiment of the present disclosure;
FIG. 4 illustrates a schematic block diagram of a computer system suitable for implementing a server according to embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that, in the present application, the embodiments and features of the embodiments may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
As mentioned in the background, since the airport environment is very large, the cameras are distributed widely and sparsely, and it is difficult to form binocular imaging between any two cameras; objects in the airport therefore cannot be positioned binocularly, and only monocular positioning can be performed. Current monocular positioning solutions typically require the height of the camera and its depression angle to be obtained in advance. However, these requirements are impractical for cameras already installed in an airport.
In view of the foregoing drawbacks of the prior art, embodiments of the present application provide an object positioning solution. According to the scheme, the internal parameters of the camera are calculated through the first calibration object, the external parameters of the camera are calculated through the second calibration object, and the ground position of the target object in a world coordinate system taking the projection point of the camera on the ground as the origin is determined through the internal and external parameters of the camera and the pixel coordinates of the acquired target object. According to the technical scheme, the object can be accurately positioned by the monocular camera.
The method of the embodiment of the present application will be described below with reference to a flowchart.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an object positioning method according to an embodiment of the present disclosure.
As shown in fig. 1, the method includes:
and 110, calculating intrinsic parameters of the camera based on first image data, wherein the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves at a constant speed along a preset route according to a preset speed.
In the embodiments of the present application, a calibration object may refer to, but is not limited to: a checkerboard image of a predetermined size, for example with each cell 3 cm or 5 cm wide, or of another size.
In step 110, the first calibration object may be a checkerboard image, or another image usable for calibration. For example, the first calibration object may be placed on a trolley, or moved using a mobile device such as a remote-controlled car.
When the mobile device carrying the first calibration object moves at a constant speed on the airport runway, first image data is obtained; the first image data is the camera's imaging result of the first calibration object moving at the constant speed. Calibration of the camera's intrinsic parameters is then achieved from this imaging result.
In the embodiment of the present application, the intrinsic parameters of the camera may be expressed as:
$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

wherein $f_x$, $f_y$, $u_0$ and $v_0$ are all known variables.
And step 120, calculating an external parameter of the camera relative to a second calibration object based on second image data, wherein the second image data is image data of the second calibration object acquired by the camera, and the second calibration object is arranged on the ground and is fixed relative to the camera.
In the embodiment of the present application, the second calibration object may be a checkerboard image, or other images that can implement calibration. The position of the second calibration object relative to the camera is fixed, for example on the ground in front of the camera.
It should be noted that the second calibration object needs to be placed in a manner parallel to the horizontal and vertical axes of the world coordinate system. The world coordinate system is a coordinate system taking a projection point of the camera on the ground as an origin.
The camera acquires a second calibration object to obtain second image data, wherein the second image data is an imaging result of the camera on the second calibration object fixed on the ground. And then the calibration of the camera relative to the external parameters of the second calibration object is realized according to the imaging result. Wherein the external parameters of the camera relative to the second calibration object may include:
$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \qquad T = \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix}$$

wherein $r_1$ to $r_9$ and $t_1$ to $t_3$ are all known variables.
And step 130, acquiring the pixel coordinates of the target object in the third image data acquired by the camera.
In the embodiment of the present application, a pixel coordinate system is established in the third image data acquired by the camera: the top-left vertex of the third image data may be taken as the origin of the pixel coordinate system, the pixel coordinates of the target object may be written as (u, v), and u and v are respectively the column and row of the pixel in the image.
And step 140, determining the ground position of the target object in a world coordinate system according to the internal and external parameters of the camera and the pixel coordinates of the target object.
Wherein, step 140 can be implemented, but not limited to, as follows:
firstly, determining translation parameters of the camera relative to a second calibration object according to a coordinate conversion relation between a camera coordinate system taking the camera as an origin and a calibration object coordinate system taking a specified point in the second calibration object as the origin;
specifically, the coordinate transformation relationship between the camera coordinate system and the calibration object coordinate system can be expressed by formula (1):
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = R^{-1}\begin{bmatrix} x \\ y \\ z \end{bmatrix} + R^{-1}T' \qquad (1)$$

wherein X, Y and Z are coordinates in the calibration object coordinate system, x, y and z are coordinates in the camera coordinate system, $R^{-1}$ is the rotation matrix from the camera coordinate system to the calibration object coordinate system, and $R^{-1}T'$ (with $T' = -T$) is the translation matrix from the camera coordinate system to the calibration object coordinate system. The translation coordinates of the camera with respect to the second calibration object are the three entries of this translation matrix, namely $d_x$, $d_y$ and $h$.
FIG. 2 shows a schematic diagram of a camera placed in a world coordinate system. Referring to FIG. 2, three coordinate systems are involved: the world coordinate system whose origin is the projection point $O_{wo}$ of the camera on the ground, the camera coordinate system whose origin is the camera $O_{ca}$, and the calibration object coordinate system whose origin is the lower-left corner vertex $O_{ch}$ of the second calibration object.
Referring to the three coordinate systems, the relationship that can be established by converting the coordinate system of the calibration object to the coordinate system of the camera is shown in the following formula (2):
$$\begin{bmatrix} x \\ y \\ z \end{bmatrix} = R\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T \qquad (2)$$
referring to equation (2), in order to convert the coordinate system of the calibration object to the coordinate system of the camera, the coordinate system of the calibration object needs to be rotated first, and then translated, so that the obtained translation matrix cannot be used in (d) of fig. 2 x ,d y H), these three variables cannot be solved, but if equation (2) is transformed as shown in equation (1), then: the camera coordinate system is rotated and then translated such that the translation parameter is (d) x ,d y H), i.e. R -1 T'。
Secondly, determining the position coordinates of the target object in a coordinate system of the calibration object based on a first relation between the pixel coordinates of the target object and the camera coordinates of the target object in the coordinate system of the camera, a second relation between the coordinates of the calibration object of the target object in the coordinate system of the calibration object and the camera coordinates, and internal and external parameters of the camera;
specifically, the first relationship may be expressed by equation (3):
$$z_{ca}\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\begin{bmatrix} x_{ca} \\ y_{ca} \\ z_{ca} \end{bmatrix} \qquad (3)$$

wherein $x_{ca}$, $y_{ca}$ and $z_{ca}$ are the coordinates of the target object in the camera coordinate system.
The second relationship can be expressed by equation (4):
$$\begin{bmatrix} x_{ca} \\ y_{ca} \\ z_{ca} \end{bmatrix} = R\begin{bmatrix} X_{ch} \\ Y_{ch} \\ Z_{ch} \end{bmatrix} + T \qquad (4)$$

wherein $X_{ch}$, $Y_{ch}$ and $Z_{ch}$ are the coordinates of the target object in the calibration object coordinate system.
From the above equation (3) and equation (4), and the internal and external parameters K, R, and T of the camera, the following equation (5) can be derived:
$$z_{ca}\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\left(R\begin{bmatrix} X_{ch} \\ Y_{ch} \\ Z_{ch} \end{bmatrix} + T\right) \qquad (5)$$
Since the second calibration object lies in the ground plane of the calibration object coordinate system, substituting $Z_{ch} = 0$ into the above equation (5) gives the following equation (6):

$$\begin{cases} \left[f_x r_1 + (u_0 - u)r_7\right] X_{ch} + \left[f_x r_2 + (u_0 - u)r_8\right] Y_{ch} = (u - u_0)t_3 - f_x t_1 \\ \left[f_y r_4 + (v_0 - v)r_7\right] X_{ch} + \left[f_y r_5 + (v_0 - v)r_8\right] Y_{ch} = (v - v_0)t_3 - f_y t_2 \end{cases} \qquad (6)$$
determining the position coordinates of the target object in a calibration object coordinate system;
let a = f in the above equation (6) x r 1 +(u 0 -u)r 7 、B=f x r 2 +(u 0 -u)r 8 、C=f y r 4 +(v 0 -v)r 7 、D=f y r 5 +(v 0 -v)r 8 、E=(u-u 0 )t 3 -f x t 1 、F=(v-v 0 )t 3 -f y t 2 Determining the position coordinates of the target object in a coordinate system of the calibration object:
Figure GDA0001944621830000075
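The substitution above amounts to solving a 2x2 linear system by Cramer's rule; a minimal sketch follows (the function name `pixel_to_board` is illustrative):

```python
# Sketch of equation (6): with Z_ch = 0 the unknown board-plane
# coordinates satisfy the 2x2 system A*X + B*Y = E, C*X + D*Y = F,
# solved here by Cramer's rule.
import numpy as np

def pixel_to_board(u, v, K, R, T):
    fx, fy, u0, v0 = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    (r1, r2, _), (r4, r5, _), (r7, r8, _) = R
    t1, t2, t3 = T
    A = fx * r1 + (u0 - u) * r7
    B = fx * r2 + (u0 - u) * r8
    C = fy * r4 + (v0 - v) * r7
    D = fy * r5 + (v0 - v) * r8
    E = (u - u0) * t3 - fx * t1
    F = (v - v0) * t3 - fy * t2
    det = A * D - B * C
    X_ch = (E * D - B * F) / det
    Y_ch = (A * F - E * C) / det
    return X_ch, Y_ch
```

Note that the system is solvable only when the determinant AD - BC is non-zero, which holds for pixels that actually image the ground plane.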
and finally, converting the position coordinates of the target object in the coordinate system of the calibration object into a world coordinate system according to the translation parameters of the camera relative to the second calibration object, so as to obtain the ground position of the target object in the world coordinate system.
Specifically, according to the formula

$$X_w = X_{ch} - d_x, \qquad Y_w = Y_{ch} - d_y$$

the ground position of the target object in the world coordinate system is determined, wherein $X_w$ and $Y_w$ are the coordinate values of the target object in the world coordinate system.
In this embodiment, after determining the ground position of the target object in the world coordinate system, the method may further include:
according to the formula
Figure GDA0001944621830000082
And determining a calibration error.
Wherein, E dis For calibration errors, X n And Y n The number of N is defined according to the requirement for the coordinate of the nth point in the second calibration object in the coordinate system of the calibration object.
The calibration error serves mainly to evaluate the accuracy of the algorithm and to check the accuracy of the positioning of the target object.
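A small sketch of how such an error check might be computed; the mean-of-distances form is an assumption made here, since the patent's formula image is not reproduced:

```python
# Hedged sketch of the calibration-error check: the N board corners
# have known coordinates (X_n, Y_n); re-estimating them from their
# pixel positions and averaging the planar Euclidean distance gives
# E_dis.  The exact averaging is assumed to be the mean over N points.
import math

def calibration_error(known_pts, estimated_pts):
    """Mean planar distance between known and re-estimated corners."""
    n = len(known_pts)
    return sum(
        math.hypot(Xe - Xn, Ye - Yn)
        for (Xn, Yn), (Xe, Ye) in zip(known_pts, estimated_pts)
    ) / n
```

In practice `estimated_pts` would come from feeding each corner's pixel coordinates back through the positioning equations above.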
According to the object positioning scheme, the internal parameters of the camera are calculated through the first calibration object, the external parameters of the camera are calculated through the second calibration object, and the ground position of the target object in a world coordinate system with the projection point of the camera on the ground as the origin is determined through the internal and external parameters of the camera and the acquired pixel coordinates of the target object. According to the technical scheme, the object can be accurately positioned by the monocular camera.
It should be noted that while the operations of the methods of the present invention are depicted in the drawings in a particular order, this does not require or imply that these operations must be performed in this particular order, or that all of the illustrated operations must be performed, in order to achieve desirable results. Rather, the steps depicted in the flowcharts may change order of execution. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
Further referring to fig. 3, it shows an exemplary structural block diagram of an object positioning device provided in the embodiments of the present application.
The apparatus may include:
an internal parameter calculating unit 31, configured to calculate an internal parameter of a camera based on first image data, where the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves at a constant speed along a predetermined route and at a predetermined speed;
an external parameter calculating unit 32, configured to calculate an external parameter of the camera relative to a second calibration object based on second image data, where the second image data is image data of the second calibration object captured by the camera, and the second calibration object is disposed on the ground and has a fixed position relative to the camera;
a coordinate acquiring unit 33 configured to acquire pixel coordinates of the target object in the third image data acquired by the camera;
and the positioning unit 34 is used for determining the ground position of the target object in a world coordinate system taking the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the target object.
Optionally, the positioning unit 34 includes:
the translation parameter determining module is used for determining translation parameters of the camera relative to the second calibration object according to a coordinate conversion relation between a camera coordinate system taking the camera as an origin and a calibration object coordinate system taking a designated point in the second calibration object as the origin;
a first position determining module for determining the position coordinates of the object in the calibration object coordinate system based on a first relationship between the pixel coordinates of the object and the coordinates of the object in the camera coordinate system, a second relationship between the coordinates of the object in the calibration object coordinate system and the coordinates of the object in the camera coordinate system, and the inside and outside parameters of the camera;
and the second position determining module is used for converting the position coordinates of the target object in the coordinate system of the calibration object into the world coordinate system according to the translation parameters of the camera relative to the second calibration object so as to obtain the ground position of the target object in the world coordinate system.
Optionally, the translation parameter determining module is specifically configured to:
according to the formula
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = R^{-1}\begin{bmatrix} x \\ y \\ z \end{bmatrix} + R^{-1}T'$$

the translation coordinates of the camera relative to the second calibration object are determined, wherein X, Y and Z are coordinates in the calibration object coordinate system, x, y and z are coordinates in the camera coordinate system, $R^{-1}$ is the rotation matrix from the camera coordinate system to the calibration object coordinate system, and $R^{-1}T'$ is the translation matrix from the camera coordinate system to the calibration object coordinate system.
Optionally, the first relationship in the first position determining module is:
$$z_{ca}\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\begin{bmatrix} x_{ca} \\ y_{ca} \\ z_{ca} \end{bmatrix}$$
the second relationship is as follows:
$$\begin{bmatrix} x_{ca} \\ y_{ca} \\ z_{ca} \end{bmatrix} = R\begin{bmatrix} X_{ch} \\ Y_{ch} \\ Z_{ch} \end{bmatrix} + T$$
the position coordinates of the target object in the calibration object coordinate system are as follows:
$$X_{ch} = \frac{ED - BF}{AD - BC}, \qquad Y_{ch} = \frac{AF - EC}{AD - BC}$$
wherein K is the intrinsic parameter matrix of the camera,

$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix},$$

R and T are the extrinsic parameters of the camera,

$$R = \begin{bmatrix} r_1 & r_2 & r_3 \\ r_4 & r_5 & r_6 \\ r_7 & r_8 & r_9 \end{bmatrix}, \qquad T = \begin{bmatrix} t_1 \\ t_2 \\ t_3 \end{bmatrix},$$

u and v are the pixel coordinates of the target object, and $A = f_x r_1 + (u_0 - u)r_7$, $B = f_x r_2 + (u_0 - u)r_8$, $C = f_y r_4 + (v_0 - v)r_7$, $D = f_y r_5 + (v_0 - v)r_8$, $E = (u - u_0)t_3 - f_x t_1$, $F = (v - v_0)t_3 - f_y t_2$.
Optionally, the second position determining module is specifically configured to:
according to the formula
Figure GDA0001944621830000108
Determining a ground location of the target object in the world coordinate system; wherein, X is w And Y w Is the coordinate value of the object in the world coordinate system, d x And d y Is a translation parameter of the camera relative to the second calibration object.
Optionally, the apparatus may further include:
a calibration error determination unit, configured to determine a calibration error according to the formula

$$E_{dis} = \frac{1}{N}\sum_{n=1}^{N}\sqrt{\left(\hat{X}_n - X_n\right)^2 + \left(\hat{Y}_n - Y_n\right)^2}$$

wherein $E_{dis}$ is the calibration error, and $X_n$ and $Y_n$ are the coordinates of the $n$-th point of the second calibration object in the calibration object coordinate system.
The second image data and/or the third image data may be image data that have undergone distortion correction using the intrinsic parameters of the camera.
It should be understood that the subsystems or units recited in the apparatus correspond to the various steps in the method described with reference to fig. 1-2. Thus, the operations and features described above for the method are equally applicable to the apparatus and the units comprised therein and will not be described in further detail here.
Referring now to FIG. 4, a block diagram of a computer system 400 suitable for use in implementing a server according to embodiments of the present application is shown.
As shown in fig. 4, the computer system 400 includes a Central Processing Unit (CPU) 401 that can perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 402 or a program loaded from a storage section 408 into a Random Access Memory (RAM) 403. In the RAM 403, various programs and data necessary for the operation of the system 400 are also stored. The CPU 401, ROM 402, and RAM 403 are connected to each other via a bus 404. An input/output (I/O) interface 405 is also connected to bus 404.
The following components are connected to the I/O interface 405: an input section 406 including a keyboard, a mouse, and the like; an output section 407 including a display device such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section 408 including a hard disk and the like; and a communication section 409 including a network interface card such as a LAN card, a modem, or the like. The communication section 409 performs communication processing via a network such as the internet. A drive 410 is also connected to the I/O interface 405 as needed. A removable medium 411 such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive 410 as needed, so that a computer program read out therefrom is mounted in the storage section 408 as needed.
In particular, the processes described above with reference to fig. 1-2 may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program comprising program code for performing the methods of fig. 1-2. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 409 and/or installed from the removable medium 411.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units or modules described in the embodiments of the present application may be implemented by software or hardware. The described units or modules may also be provided in a processor. The names of these units or modules do not in some cases constitute a limitation of the unit or module itself.
As another aspect, the present application also provides a computer-readable storage medium, which may be the computer-readable storage medium included in the apparatus of the above-described embodiments, or a separate computer-readable storage medium not incorporated into the device. The computer-readable storage medium stores one or more programs for use by one or more processors in performing the object positioning method described herein.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by a person skilled in the art that the scope of the invention as referred to in the present application is not limited to the embodiments with a specific combination of the above-mentioned features, but also covers other embodiments with any combination of the above-mentioned features or their equivalents without departing from the inventive concept. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (9)

1. A method of locating an object, the method comprising:
calculating internal parameters of a camera based on first image data, wherein the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves along a preset route at a constant preset speed;
calculating an external parameter of the camera relative to a second calibration object based on second image data, wherein the second image data is image data of the second calibration object acquired by the camera, the second calibration object is arranged on the ground, and its position relative to the camera is fixed;
acquiring pixel coordinates of a target object in third image data acquired by the camera;
determining the ground position of the object in a world coordinate system taking the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the object,
the determining the ground position of the object in a world coordinate system with the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the object comprises:
determining translation parameters of the camera relative to the second calibration object according to a coordinate conversion relation between a camera coordinate system taking the camera as an origin and a calibration object coordinate system taking a specified point in the second calibration object as the origin;
determining the position coordinates of the object in the calibration object coordinate system based on a first relation between the pixel coordinates of the object and the coordinates of the object in the camera coordinate system, a second relation between the coordinates of the object in the calibration object coordinate system and the coordinates of the object in the camera coordinate system, and the internal and external parameters of the camera;
and according to the translation parameter of the camera relative to the second calibration object, converting the position coordinate of the target object in the calibration object coordinate system into the world coordinate system to obtain the ground position of the target object in the world coordinate system.
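Read together, claims 1-4 amount to back-projecting a pixel onto the ground plane (Z_ch = 0) of the calibration object frame and shifting the origin to the camera's ground projection. The following NumPy sketch is illustrative only, not the patented implementation: the pinhole model and the sign convention of the world-frame shift are our assumptions, and the variable names mirror the claims.

```python
import numpy as np

def locate_on_ground(K, R, T, u, v):
    """Ground position (Z_ch = 0) of pixel (u, v): first in the
    calibration object frame, then in a world frame whose origin is
    the camera's vertical projection on the ground."""
    fx, fy = K[0, 0], K[1, 1]
    u0, v0 = K[0, 2], K[1, 2]
    r = R.ravel()                      # r_1..r_9, row-major
    t1, t2, t3 = np.asarray(T, dtype=float).ravel()

    # The projection equation with Z_ch = 0 reduces to the linear
    # system A*X + B*Y = E, C*X + D*Y = F (coefficients of claim 3).
    A = fx * r[0] + (u0 - u) * r[6]
    B = fx * r[1] + (u0 - u) * r[7]
    C = fy * r[3] + (v0 - v) * r[6]
    D = fy * r[4] + (v0 - v) * r[7]
    E = (u - u0) * t3 - fx * t1
    F = (v - v0) * t3 - fy * t2
    X_ch, Y_ch = np.linalg.solve([[A, B], [C, D]], [E, F])

    # Camera centre in the calibration object frame: -R^(-1) * T.
    d = -np.linalg.inv(R) @ np.array([t1, t2, t3])
    return (X_ch, Y_ch), (X_ch - d[0], Y_ch - d[1])
```

For a camera 5 m above the calibration origin looking straight down (R = diag(1, -1, -1), T = (0, 0, 5)^T), the pixel of a ground point back-projects to exactly that point in both frames, since the camera's ground projection coincides with the calibration origin.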
2. The method of claim 1, wherein determining translation parameters of the camera relative to the second calibration object according to the coordinate transformation relationship between the camera coordinate system and the calibration object coordinate system comprises:
according to the formula

(X, Y, Z)^T = R^(-1) · (x, y, z)^T − R^(-1) · T'

determining the translational coordinates of the camera relative to the second calibration object; wherein X, Y and Z are coordinates in the calibration object coordinate system, x, y and z are coordinates in the camera coordinate system, R^(-1) is the rotation matrix from the camera coordinate system to the calibration object coordinate system, and −R^(-1)·T' is the translation matrix from the camera coordinate system to the calibration object coordinate system.
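As a hedged NumPy sketch of this step (illustrative; the function name is ours, not from the patent): setting x = y = z = 0 in the claim-2 transform yields the camera centre −R^(-1)·T' in the calibration object coordinate system.

```python
import numpy as np

def camera_translation(R, T):
    """Translation of the camera relative to the calibration object:
    the camera centre -R^(-1) * T in the calibration object frame,
    obtained by mapping the camera-frame origin through the
    camera-to-calibration-object transform."""
    return -np.linalg.inv(R) @ np.asarray(T, dtype=float).ravel()
```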
3. The method of claim 1, wherein the first relationship is:
z_ca · (u, v, 1)^T = K · (x_ca, y_ca, z_ca)^T

the second relationship is as follows:

(x_ca, y_ca, z_ca)^T = R · (X_ch, Y_ch, Z_ch)^T + T

wherein X_ch, Y_ch and Z_ch are the coordinates of the object in the calibration object coordinate system, and x_ca, y_ca and z_ca are the coordinates of the object in the camera coordinate system,
wherein

z_ca · (u, v, 1)^T = K · (R · (X_ch, Y_ch, Z_ch)^T + T)

wherein K is the internal parameter matrix of the camera,

K = [f_x, 0, u_0; 0, f_y, v_0; 0, 0, 1]

R and T are the external parameters of the camera,

R = [r_1, r_2, r_3; r_4, r_5, r_6; r_7, r_8, r_9], T = (t_1, t_2, t_3)^T

and, with Z_ch = 0 for an object on the ground,

[A, B; C, D] · (X_ch, Y_ch)^T = (E, F)^T

wherein u and v are the pixel coordinates of the object; A = f_x·r_1 + (u_0 − u)·r_7, B = f_x·r_2 + (u_0 − u)·r_8, C = f_y·r_4 + (v_0 − v)·r_7, D = f_y·r_5 + (v_0 − v)·r_8, E = (u − u_0)·t_3 − f_x·t_1, and F = (v − v_0)·t_3 − f_y·t_2.
4. The method of claim 3, wherein converting the position coordinates of the target object in the calibration object coordinate system into the world coordinate system according to the translation parameters of the camera relative to the second calibration object to obtain the ground position of the target object in the world coordinate system comprises:
according to the formula
X_w = X_ch − d_x, Y_w = Y_ch − d_y

determining the ground location of the target object in the world coordinate system; wherein X_w and Y_w are the coordinate values of the target object in the world coordinate system, and d_x and d_y are the translation parameters of the camera relative to the second calibration object.
5. The method of claim 4, further comprising:
according to the formula
E_dis = (1/N) · Σ_{n=1..N} √((X_n − X'_n)² + (Y_n − Y'_n)²)

determining a calibration error;

wherein E_dis is the calibration error, X_n and Y_n are the coordinates of the nth point of the second calibration object in the calibration object coordinate system, X'_n and Y'_n are the corresponding coordinates determined from the image data, and N is the number of calibration points.
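The claim-5 formula is published only as an image; one plausible reading, consistent with the variables named in the claim, is a mean point-distance error between the located and the known calibration-point coordinates. A hedged sketch (the exact form in the patent may differ):

```python
import numpy as np

def calibration_error(located, reference):
    """Mean Euclidean distance between the ground coordinates located
    for the calibration points and their known coordinates in the
    calibration object frame."""
    located = np.asarray(located, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Per-point 2-norm of the coordinate residual, averaged over points.
    return float(np.mean(np.linalg.norm(located - reference, axis=1)))
```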
6. The method according to claim 1, wherein the second image data and/or the third image data is image data that has been distortion-corrected using the internal parameters of the camera.
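Claim 6 presupposes distortion-corrected image data. A minimal sketch of point undistortion under a simple two-coefficient radial model — an assumption, since the patent does not specify the distortion model; production code would typically use cv2.undistort or cv2.undistortPoints, which also handle tangential terms:

```python
import numpy as np

def undistort_points(pts, K, dist):
    """Undo radial distortion (k1, k2) on pixel coordinates by
    fixed-point iteration on the normalized coordinates."""
    fx, fy = K[0, 0], K[1, 1]
    u0, v0 = K[0, 2], K[1, 2]
    k1, k2 = dist
    out = []
    for u, v in pts:
        xd = (u - u0) / fx          # normalized distorted coordinates
        yd = (v - v0) / fy
        x, y = xd, yd
        for _ in range(20):         # invert x_d = x * (1 + k1*r^2 + k2*r^4)
            r2 = x * x + y * y
            s = 1.0 + k1 * r2 + k2 * r2 * r2
            x, y = xd / s, yd / s
        out.append((fx * x + u0, fy * y + v0))
    return out
```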
7. An object positioning device, the device comprising:
the internal parameter calculating unit is used for calculating internal parameters of the camera based on first image data, wherein the first image data is image data of a first calibration object acquired by the camera, and the first calibration object moves along a preset route at a constant preset speed;
the external parameter calculation unit is used for calculating an external parameter of the camera relative to a second calibration object based on second image data, wherein the second image data is image data of the second calibration object acquired by the camera, and the second calibration object is arranged on the ground and is fixed relative to the position of the camera;
the coordinate acquisition unit is used for acquiring pixel coordinates of a target object in third image data acquired by the camera;
a positioning unit for determining the ground position of the object in a world coordinate system with the projection point of the camera on the ground as an origin according to the internal parameter and the external parameter of the camera and the pixel coordinate of the object,
wherein the positioning unit includes:
the translation parameter determining module is used for determining translation parameters of the camera relative to the second calibration object according to a coordinate conversion relation between a camera coordinate system taking the camera as an origin and a calibration object coordinate system taking a designated point in the second calibration object as the origin;
a first position determining module for determining the position coordinates of the object in the calibration object coordinate system based on a first relationship between the pixel coordinates of the object and the coordinates of the object in the camera coordinate system, a second relationship between the coordinates of the object in the calibration object coordinate system and the coordinates of the object in the camera coordinate system, and the inside and outside parameters of the camera;
and the second position determining module is used for converting the position coordinates of the target object in the coordinate system of the calibration object into the world coordinate system according to the translation parameters of the camera relative to the second calibration object so as to obtain the ground position of the target object in the world coordinate system.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1-6 when executing the program.
9. A computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the method according to any one of claims 1-6.
CN201811025819.8A 2018-09-04 2018-09-04 Object positioning method, device, equipment and storage medium Active CN109472829B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811025819.8A CN109472829B (en) 2018-09-04 2018-09-04 Object positioning method, device, equipment and storage medium


Publications (2)

Publication Number Publication Date
CN109472829A CN109472829A (en) 2019-03-15
CN109472829B true CN109472829B (en) 2022-10-21

Family

ID=65661450

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811025819.8A Active CN109472829B (en) 2018-09-04 2018-09-04 Object positioning method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109472829B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112435300A (en) * 2019-08-26 2021-03-02 华为技术有限公司 Positioning method and device
CN111673735A (en) * 2020-04-28 2020-09-18 平安科技(深圳)有限公司 Mechanical arm control method and device based on monocular vision positioning
CN111781585B (en) * 2020-06-09 2023-07-18 浙江大华技术股份有限公司 Method for determining firework setting-off position and image acquisition equipment
CN112288751A (en) * 2020-10-12 2021-01-29 董宇青 Automatic floor sweeping device and control algorithm
WO2022088103A1 (en) * 2020-10-30 2022-05-05 华为技术有限公司 Image calibration method and apparatus
CN113538578B (en) * 2021-06-22 2023-07-25 恒睿(重庆)人工智能技术研究院有限公司 Target positioning method, device, computer equipment and storage medium
CN113436279B (en) * 2021-07-23 2023-02-28 杭州海康威视数字技术股份有限公司 Image processing method, device and equipment
CN115588040A (en) * 2022-09-09 2023-01-10 四川省寰宇众恒科技有限公司 System and method for counting and positioning coordinates based on full-view imaging points

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101419705A (en) * 2007-10-24 2009-04-29 深圳华为通信技术有限公司 Video camera demarcating method and device
CN101963500A (en) * 2010-09-28 2011-02-02 四川大学 Computer vision large-scale distance measuring method and portable distance measurer for applying same
CN102750697A (en) * 2012-06-08 2012-10-24 华为技术有限公司 Parameter calibration method and device
CN102914295A (en) * 2012-09-21 2013-02-06 上海大学 Computer vision cube calibration based three-dimensional measurement method
CN103559707A (en) * 2013-10-30 2014-02-05 同济大学 Industrial fixed-focus camera parameter calibration method based on moving square target calibration object
CN106485755A (en) * 2016-09-26 2017-03-08 中国科学技术大学 A kind of multi-camera system scaling method
CN106485753A (en) * 2016-09-09 2017-03-08 奇瑞汽车股份有限公司 Method and apparatus for the camera calibration of pilotless automobile
CN107239748A (en) * 2017-05-16 2017-10-10 南京邮电大学 Robot target identification and localization method based on gridiron pattern calibration technique
CN107328364A (en) * 2017-08-15 2017-11-07 顺丰科技有限公司 A kind of volume, weight measuring system and its method of work
CN107610178A (en) * 2017-07-27 2018-01-19 北京航天计量测试技术研究所 A kind of industrial photogrammetry system camera parameter movable type scaling method
CN107993265A (en) * 2017-11-29 2018-05-04 深圳市沃特沃德股份有限公司 The calibration facility of monocular sweeper, method and device
CN108053375A (en) * 2017-12-06 2018-05-18 智车优行科技(北京)有限公司 Image data correction method, device and its automobile

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180012377A1 (en) * 2016-07-08 2018-01-11 Toyota Motor Engineering & Manufacturing North America, Inc. Vision-assist devices and methods of calibrating vision-assist devices


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A Real-time Camera Calibration System Based On OpenCV; Zhang Hui et al.; Proceedings of the SPIE; 2015-07-31; Vol. 9631; pp. 1-5 *
UAV pose estimation based on fusion of monocular vision and inertial navigation; Xiong Minjun et al.; Journal of Computer Applications; 2017-12-20; Vol. 37, No. S2; pp. 127-133 *

Also Published As

Publication number Publication date
CN109472829A (en) 2019-03-15

Similar Documents

Publication Publication Date Title
CN109472829B (en) Object positioning method, device, equipment and storage medium
US11024052B2 (en) Stereo camera and height acquisition method thereof and height acquisition system
CN108629756B (en) Kinectv2 depth image invalid point repairing method
JP2012517651A (en) Registration of 3D point cloud data for 2D electro-optic image data
KR20130138247A (en) Rapid 3d modeling
CN111340749B (en) Image quality detection method, device, equipment and storage medium
US9183634B2 (en) Image processing apparatus and image processing method
CN113029128B (en) Visual navigation method and related device, mobile terminal and storage medium
CN111383204A (en) Video image fusion method, fusion device, panoramic monitoring system and storage medium
CN109934873B (en) Method, device and equipment for acquiring marked image
WO2022088881A1 (en) Method, apparatus and system for generating a three-dimensional model of a scene
US20220270294A1 (en) Calibration methods, apparatuses, systems and devices for image acquisition device, and storage media
WO2018044378A1 (en) Quantifying gas leak rates using frame images acquired by a camera
CN111627073B (en) Calibration method, calibration device and storage medium based on man-machine interaction
CN116704048B (en) Double-light registration method
CN113763478B (en) Unmanned vehicle camera calibration method, device, equipment, storage medium and system
CN113496503B (en) Point cloud data generation and real-time display method, device, equipment and medium
CN111489398B (en) Imaging equipment calibration method and device
US10776928B1 (en) Efficient egomotion estimation using patch-based projected correlation
CN114549613A (en) Structural displacement measuring method and device based on deep super-resolution network
KR102076635B1 (en) Apparatus and method for generating panorama image for scattered fixed cameras
CN111489397A (en) Imaging device calibration method and device
CN113450415A (en) Imaging device calibration method and device
CN112308809A (en) Image synthesis method and device, computer equipment and storage medium
US20210150683A1 (en) Method and device for generating virtual reality data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant