CN118050723A - Combined calibration method, system, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN118050723A
CN118050723A (application CN202311773129.1A)
Authority
CN
China
Prior art keywords: image, calibration method, points, millimeter wave radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311773129.1A
Other languages
Chinese (zh)
Inventor
黄宁波
柏林
刘彪
舒海燕
袁添厦
祝涛剑
沈创芸
王恒华
方映峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Gosuncn Robot Co Ltd
Original Assignee
Guangzhou Gosuncn Robot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Gosuncn Robot Co Ltd filed Critical Guangzhou Gosuncn Robot Co Ltd
Priority to CN202311773129.1A priority Critical patent/CN118050723A/en
Publication of CN118050723A publication Critical patent/CN118050723A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a joint calibration method, a system, an electronic device and a computer readable storage medium. The joint calibration method comprises the following steps: acquiring an image containing a metal plate through an RGB camera; obtaining the centroid of the metal plate from the image and displaying the centroid on the image; converting the cluster points output by the millimeter wave radar into the pixel coordinate system of the image and displaying them on the image; and adjusting the conversion until the cluster points coincide with the centroid in the image. The disclosed joint calibration method uses the millimeter wave radar and the RGB camera for joint calibration, can be used on an inspection robot, and has the advantages of convenient operation and cost saving.

Description

Combined calibration method, system, electronic equipment and computer readable storage medium
Technical Field
The present invention relates to the field of calibration processing technologies, and in particular, to a joint calibration method, a system, an electronic device, and a computer readable storage medium.
Background
The inspection robot needs to sense surrounding vehicles during inspection in order to avoid collisions. Because the inspection robot is a low-speed unmanned vehicle, its requirements on the real-time performance and accuracy of environmental perception are generally much lower than those of an autonomous car. For cost reasons, inspection robots typically do not carry the high-performance processors dedicated to deep learning inference (such as the Horizon Robotics series) that autonomous cars do. These practical requirements on perception performance and cost mean that the inspection robot cannot adopt the perception scheme of autonomous cars, namely the BEV perception scheme fusing lidar and RGB cameras.
Therefore, development of a calibration method which is convenient to operate and cost-effective and can be used for the inspection robot is urgently needed.
Disclosure of Invention
The invention aims to provide a new technical scheme for a joint calibration method, a system, an electronic device and a computer readable storage medium, providing a calibration method that is convenient to operate, cost-saving, and usable on an inspection robot.
In a first aspect of the present invention, a joint calibration method is provided, including the following steps: acquiring an image containing a metal plate through an RGB camera; obtaining the centroid of the metal plate from the image and displaying the centroid on the image; converting the cluster points output by the millimeter wave radar into the pixel coordinate system of the image and displaying them on the image; and adjusting the conversion until the cluster points coincide with the centroid in the image.
Optionally, the centroid of the metal plate is obtained by the following formulas:
u_c = ((u_1 + u_2 + u_3 + u_4)/4 + u_k)/2 (1);
v_c = ((v_1 + v_2 + v_3 + v_4)/4 + v_k)/2 (2);
where (u_i, v_i), i = 1, ..., 4, are the pixel coordinates of the four detected corner points of the metal plate and (u_k, v_k) is the cluster center computed by the k-means clustering algorithm.
Optionally, the cluster points are converted to the pixel coordinate system using an IMU coordinate system.
Optionally, the initial formula for converting the cluster points to the pixel coordinate system is:
(u, v, 1)^T = K · R_I2C · R_R2I · (x_R, y_R, 1, 1)^T
Optionally, the iterative formula for converting the cluster points to the pixel coordinate system is:
(u, v, 1)^T = K · R_I2C · ΔR_i · R_R2I · (x_R, y_R, 1, 1)^T
Optionally, R_R2I is adjusted by adjusting the rotation part through the matrix change amount ΔR_i.
Optionally, R_R2I is adjusted by making clockwise and/or counterclockwise adjustments about the three coordinate axes of the coordinate system.
In a second aspect of the present invention, a joint calibration system applied to the joint calibration method described in the above embodiments is provided. The system includes: an image acquisition module that acquires an image containing a metal plate; a centroid processing module that obtains the centroid of the metal plate and displays it on the image; and a cluster point processing module that converts the cluster points output by the millimeter wave radar into the pixel coordinate system of the image, displays them on the image, and adjusts the conversion so that the cluster points coincide with the centroid in the image.
In a third aspect of the present invention, there is provided an electronic apparatus comprising: a processor and a memory in which computer program instructions are stored, wherein the computer program instructions, when executed by the processor, cause the processor to perform the steps of the joint calibration method described in the above embodiments.
In a fourth aspect of the present invention, there is provided a computer readable storage medium storing a computer program which, when executed by a processor, causes the processor to perform the steps of the joint calibration method described in the above embodiments.
The combined calibration method disclosed by the invention utilizes the millimeter wave radar and the RGB camera to jointly calibrate, can be used for the inspection robot, and has the advantages of convenience in operation and cost saving.
Other features of the present invention and its advantages will become apparent from the following detailed description of exemplary embodiments of the invention, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart of a joint calibration method according to an embodiment of the present invention;
FIG. 2 is another flow chart of a joint calibration method according to an embodiment of the invention;
fig. 3 is a schematic diagram of the operation of an electronic device according to an embodiment of the invention.
Reference numerals:
A processor 201;
a memory 202; an operating system 2021; an application 2022;
a network interface 203;
An input device 204;
a hard disk 205;
A display device 206.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless it is specifically stated otherwise.
The following description of at least one exemplary embodiment is merely exemplary in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to one of ordinary skill in the relevant art may not be discussed in detail, but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any specific values should be construed as merely illustrative, and not a limitation. Thus, other examples of exemplary embodiments may have different values.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further discussion thereof is necessary in subsequent figures.
The joint calibration method according to the embodiment of the invention is specifically described below with reference to the accompanying drawings.
As shown in fig. 1 and 2, the joint calibration method according to the embodiment of the present invention includes the following steps:
acquiring an image containing the metal plate through an RGB camera, acquiring the mass center of the metal plate through the image and displaying the mass center on the image;
Converting the cluster points output by the millimeter wave radar into a pixel coordinate system of an image, and displaying the cluster points on the image;
The cluster points are converted to points in the image that coincide with the centroid.
In other words, the joint calibration method according to the embodiment of the present invention mainly includes the following steps. First, a metal plate is photographed by an RGB camera, i.e., an image containing the metal plate is obtained. Then, the centroid coordinates of the metal plate are computed from the captured image, and the centroid is displayed on the image; that is, after calculating the centroid coordinates, they also need to be displayed on the image. Next, the cluster points output by the millimeter wave radar are converted into the pixel coordinate system of the image and displayed on the image. The conversion is then adjusted until the cluster points coincide with the centroid in the image. A 3D millimeter wave radar can be selected as the millimeter wave radar.
It should be noted that the joint calibration method provided by the embodiment of the invention can be suitable for the inspection robot, and can be combined with the 3D millimeter wave radar and the RGB camera carried by the inspection robot to realize joint calibration, so as to realize the integration of the 3D millimeter wave radar and the RGB camera of the inspection robot.
The 3D millimeter wave radar can give the coordinates and speed of a moving object but cannot give its category, while an image captured by the RGB camera can give category information through a deep-learning 2D object detection model (which can run on a CPU without a dedicated high-performance deep-learning inference processor). Given the low price of both sensors, the inspection robot fuses the 3D millimeter wave radar and the RGB camera to sense vehicles in the surrounding environment (pedestrians can avoid the robot by themselves and therefore are not sensed), which saves cost.
Therefore, according to the combined calibration method provided by the embodiment of the invention, the millimeter wave radar and the RGB camera are fused, so that the combined calibration method can be applied to the inspection robot and has the advantages of convenience in operation, cost reduction and the like.
According to one embodiment of the invention, the centroid of the metal plate is obtained by the following formulas:
u_c = ((u_1 + u_2 + u_3 + u_4)/4 + u_k)/2 (1);
v_c = ((v_1 + v_2 + v_3 + v_4)/4 + v_k)/2 (2);
where (u_i, v_i), i = 1, ..., 4, are the pixel coordinates of the four detected corner points of the metal plate and (u_k, v_k) is the cluster center computed by the k-means clustering algorithm.
It should be noted that the millimeter wave radar treats a stationary metal object as a moving object, and the Object mode of the 3D millimeter wave radar generally clusters the original point cloud with a nearest-neighbor clustering algorithm and outputs a cluster center point. Consequently, if the centroid of the rectangular metal plate in the image is simply assumed to be the target point detected in Object mode, the following problem arises during actual calibration: after the cluster points output by the 3D millimeter wave radar are converted into the camera pixel coordinate system, they cannot be made to coincide accurately with the centroid of the metal plate.
In the present embodiment, the centroid is therefore computed by formulas (1) and (2) as the average of the corner-based centroid and the k-means cluster center, which is beneficial for achieving accurate coincidence.
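The averaging of the corner-based centroid and the k-means cluster center can be sketched as follows (a minimal sketch; the function name, array shapes, and example values are illustrative, not from the patent):

```python
import numpy as np

def plate_centroid(corners_uv, kmeans_center_uv):
    """Centroid per formulas (1)-(2): average of the corner-based centroid
    (mean of the four detected corner pixels) and the k-means cluster
    center of the plate's pixel region."""
    corner_centroid = np.asarray(corners_uv, dtype=float).mean(axis=0)
    return 0.5 * (corner_centroid + np.asarray(kmeans_center_uv, dtype=float))

# Four corner pixels of the plate and an assumed k-means center:
uv = plate_centroid([[100, 50], [200, 50], [200, 150], [100, 150]], [152, 101])
# corner centroid (150, 100) averaged with (152, 101) -> (151.0, 100.5)
```

In practice the corners would come from a corner detector on the camera image and the cluster center from k-means over the plate's pixel region.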
It should be noted that a comparison baseline is the scheme that directly solves the 3D-millimeter-wave-radar-to-RGB-camera conversion matrix with a nonlinear optimization algorithm. In that scheme, the 3D millimeter wave radar is first switched to Object mode, and a rectangular metal plate is placed in front of the robot carrying the 3D millimeter wave radar and the RGB camera (the 3D millimeter wave radar has the characteristic of treating a stationary metal object as a moving object).
With the robot stationary, the 3D millimeter wave Object-mode point is recorded, the four corner points of the metal plate are detected in the camera image with a corner detection method, and the centroid coordinates of the metal plate are calculated (the Object-mode point and the plate centroid form a point pair).
That scheme requires continuously adjusting the position of the robot while keeping the metal plate stationary (or moving the plate while the robot stays still) to obtain at least 4 such point pairs, and finally solves the radar-to-camera conversion matrix directly with a nonlinear optimization algorithm such as the Levenberg-Marquardt (LM) algorithm. The scheme may or may not use the camera intrinsics.
That scheme also requires switching the 3D millimeter wave radar to Object mode, in which the original point cloud collected by the radar is clustered by a nearest-neighbor or k-means clustering algorithm and the cluster center is output. The main drawbacks of the scheme are as follows:
(1) It requires continuously and manually adjusting the position of the robot while keeping the metal plate stationary (or moving the plate while the robot stays still) to obtain at least 4 pairs of 3D millimeter wave points and metal plate centroids, which is very inconvenient to operate.
(2) The nonlinear optimization algorithm it uses needs an initial value, and the quality of the initial value directly affects the accuracy of the final result. However, the initial value in that scheme is in fact randomly generated, so the calibration accuracy cannot be guaranteed at all.
In the present embodiment, in formulas (1) and (2), the term on the left of the plus sign is the centroid computed from the detected corner points (as in the baseline scheme), the term on the right of the plus sign is the metal plate cluster center computed by the k-means clustering algorithm added in this embodiment, and the final output is the average of the two.
Therefore, the new metal plate centroid calculation method provided in this embodiment solves the technical problem that the cluster points output by the 3D millimeter wave radar, after conversion into the camera pixel coordinate system, cannot be made to coincide accurately with the metal plate centroid.
In some embodiments of the invention, the cluster points are converted to a pixel coordinate system using an IMU coordinate system. That is, the 3D millimeter wave radar cluster points may be converted onto the RGB camera image by the IMU coordinate system. In the embodiment, the conversion of the 3D millimeter wave radar to the RGB camera transformation matrix is realized by taking the IMU as a medium.
It should be noted that a cluster point output in the Object mode of the 3D millimeter wave radar has the form (r, yaw, v), and its coordinates in the radar's own coordinate system are obtained as follows:
x_R = r · sin(yaw) (3);
y_R = r · cos(yaw) (4).
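Equations (3) and (4) are a plain polar-to-Cartesian conversion and can be sketched as follows (the function name is illustrative; the radar's yaw is assumed to be in radians):

```python
import math

def radar_object_to_xy(r, yaw):
    """Equations (3)-(4): convert an Object-mode point (range r, azimuth yaw)
    into the radar's Cartesian coordinate system. x_R points along sin(yaw),
    y_R along cos(yaw), matching the patent's convention."""
    return r * math.sin(yaw), r * math.cos(yaw)

# A target 5 m away at 30 degrees azimuth:
x_r, y_r = radar_object_to_xy(5.0, math.radians(30))
# x_R = 5*sin(30°) = 2.5, y_R = 5*cos(30°) ≈ 4.33
```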
It should be noted that, since the coordinate systems of both the IMU and the 3D millimeter wave radar are right-handed with respect to the Z axis, in this embodiment the rotation between the two coordinate systems can be ignored when the radar cluster points are first converted to RGB image pixels, and only a translation between them is assumed. That is, the initial transformation matrix R_R2I between the 3D millimeter wave radar and the IMU takes the form:
R_R2I = [1, 0, 0, p_x; 0, 1, 0, p_y; 0, 0, 1, p_z; 0, 0, 0, 1] (5).
In equation (5), (p_x, p_y, p_z) is the vector from the IMU coordinate system origin to the origin of the 3D millimeter wave radar coordinate system; a rough value of this vector is easily obtained from the mechanical design software.
In addition, the 3D millimeter wave radar cannot measure the Z-axis value, so the Z value is taken as 1 during calibration. Because R_R2I is 4×4, a homogeneous coordinate is appended to the radar coordinate, giving the final form (x_R, y_R, 1, 1)^T, where the superscript T denotes transpose.
According to one embodiment of the invention, the initial formula for converting the cluster points to the pixel coordinate system is:
(u, v, 1)^T = K · R_I2C · R_R2I · (x_R, y_R, 1, 1)^T (6);
In equation (6), u and v denote image pixel coordinates, K denotes the RGB camera intrinsics, and R_I2C denotes the IMU-to-RGB-camera transformation matrix.
That is, the initial formula for converting the 3D millimeter wave radar cluster points to RGB camera image pixel coordinates is as follows:
(u, v, 1)^T = K · R_I2C · R_R2I · (x_R, y_R, 1, 1)^T (6).
In equation (6), u and v denote image pixel coordinates, and R_I2C denotes the IMU-to-RGB-camera transformation matrix, which can be calibrated when the navigation system is designed. K denotes the RGB camera intrinsics, which may be calibrated in advance. An existing robot system usually carries an IMU (inertial measurement unit), and the 3 coordinate axes of the IMU form a right-handed system with respect to the Z axis, consistent with the 3D millimeter wave radar. Therefore, in the radar-to-IMU transformation matrix R_R2I, if high accuracy is not required, the rotation can be ignored entirely and only a translation assumed, a rough value of which is easily obtained from the mechanical design software. In addition, the IMU-to-RGB-camera transformation matrix R_I2C is already calibrated when the navigation system is designed and is also readily available.
It can be seen that, in this embodiment, the initial calculation converts the 3D millimeter wave radar cluster points into the RGB camera coordinate system via the IMU coordinate system, which allows the transformation matrix between the radar and camera coordinate systems to be calibrated and is favorable for fusing the 3D millimeter wave radar and the RGB camera.
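The projection chain of equation (6) can be sketched as follows. This is a minimal sketch with illustrative values: the real K, R_I2C, and R_R2I come from prior calibration, and R_I2C is written here as a 3×4 extrinsic so that the matrix product directly yields a 3-vector (an assumption about how the 4×4 chain reduces to pixel coordinates); a perspective divide is added so (u, v) are actual pixels.

```python
import numpy as np

# Illustrative values only; the real matrices come from prior calibration.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])                   # RGB camera intrinsics
R_I2C = np.hstack([np.eye(3), np.zeros((3, 1))])  # IMU -> camera, 3x4 so eq. (6) yields a 3-vector
R_R2I = np.eye(4)
R_R2I[:3, 3] = [0.10, 0.00, 0.20]                 # radar -> IMU: translation only, as in eq. (5)

def project_cluster_point(x_r, y_r):
    """Equation (6): map a radar cluster point to image pixel coordinates."""
    p = np.array([x_r, y_r, 1.0, 1.0])            # homogeneous (x_R, y_R, 1, 1)^T
    uvw = K @ R_I2C @ R_R2I @ p
    return uvw[0] / uvw[2], uvw[1] / uvw[2]       # perspective divide

u, v = project_cluster_point(0.0, 0.0)
# camera-frame point (0.1, 0.0, 1.2) -> u ≈ 386.67, v = 240.0
```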
Alternatively, the devices used in the joint calibration method may be IMUs, RGB cameras, and 3D millimeter wave radars carried by the inspection robot itself.
In some embodiments of the present invention, the iterative formula for converting the cluster points to the pixel coordinate system is:
(u, v, 1)^T = K · R_I2C · ΔR_i · R_R2I · (x_R, y_R, 1, 1)^T (7);
That is, this embodiment provides an iterative calculation for converting the 3D millimeter wave radar cluster points onto the RGB camera image via the IMU coordinate system. Each time the user drags a progress bar in the module that adjusts the radar-to-IMU conversion matrix, the cluster points are re-projected with equation (7).
In equation (7), ΔR_i denotes the matrix change applied by each drag of a progress bar in the adjustment module. It can be seen that, in this embodiment, R_R2I is adjusted by multiplying it by the matrix change amount ΔR_i.
It should be noted that another comparison baseline computes the 3D-millimeter-wave-radar-to-RGB-camera conversion matrix with a machine learning method such as a neural network, generally operating in the original point cloud mode. That scheme first switches the radar to the original point cloud mode and places several metal balls in front of the robot carrying the radar and the RGB camera (the radar treats a stationary metal object as a moving object), or places moving objects on both sides of the road directly ahead. With the robot stationary, the 3D millimeter wave original point cloud is recorded, and the center point of each metal ball, or of each moving object, is computed in the RGB camera image. Outlier point cloud points are deleted so that the number of cloud points matches the number of center points; the cloud points are then matched with the image center points according to their front-to-back positional relationship to form point pairs; and finally the point pairs are fed into a neural network or another machine learning algorithm to learn the radar-to-camera conversion matrix.
That neural-network-based scheme has two main drawbacks. First, the neural network or other machine learning algorithm it uses needs a large amount of training data; to collect the training data, many metal balls must be purchased in advance, which makes the calibration costly, and if training data are instead collected with a movable object (such as a person), a target detection algorithm for that object must also be developed, which likewise increases the calibration cost. Second, collecting the training data obviously requires a large calibration site.
In contrast, because the calibration process in this embodiment uses the IMU as the coordinate-system conversion medium, the whole calibration process requires no movement of the robot or the metal plate; one person simply drags the progress bars that adjust R_R2I on a computer. This embodiment therefore has the advantage of being extremely easy to operate.
According to one embodiment of the invention, R_R2I is adjusted by adjusting the rotation part through the matrix change amount ΔR_i. That is, the transformation matrix from the 3D millimeter wave radar coordinate system to the IMU coordinate system can be adjusted. Since the transformation matrix R_R2I consists of a rotation matrix and a translation vector, the rotation matrix may be expressed as Euler angles.
In some embodiments of the present invention, R_R2I is adjusted by clockwise and/or counterclockwise rotations about the three coordinate axes. For example, adjusting the rotation part of R_R2I can be seen as rotating a certain angle (Euler angle) clockwise or counterclockwise about the z, y, or x axis. Since there are three coordinate axes and two rotation directions, adjusting the rotation part of R_R2I corresponds to 6 values of ΔR_i in total. Adjusting the translation part of R_R2I can be seen as translating a certain amount in the positive or negative direction of the z, y, or x axis, which similarly corresponds to 6 values of ΔR_i.
For example, the 12 matrix change amounts ΔR_i for adjusting the transformation matrix R_R2I are as follows (each is a 4×4 homogeneous matrix, rows separated by semicolons; counterclockwise is taken here as the positive rotation direction):
(1) Rotation clockwise or counterclockwise about the z axis by angle α:
clockwise: ΔR = [cos α, sin α, 0, 0; -sin α, cos α, 0, 0; 0, 0, 1, 0; 0, 0, 0, 1];
counterclockwise: ΔR = [cos α, -sin α, 0, 0; sin α, cos α, 0, 0; 0, 0, 1, 0; 0, 0, 0, 1].
(2) Rotation clockwise or counterclockwise about the y axis by angle α:
clockwise: ΔR = [cos α, 0, -sin α, 0; 0, 1, 0, 0; sin α, 0, cos α, 0; 0, 0, 0, 1];
counterclockwise: ΔR = [cos α, 0, sin α, 0; 0, 1, 0, 0; -sin α, 0, cos α, 0; 0, 0, 0, 1].
(3) Rotation clockwise or counterclockwise about the x axis by angle α:
clockwise: ΔR = [1, 0, 0, 0; 0, cos α, sin α, 0; 0, -sin α, cos α, 0; 0, 0, 0, 1];
counterclockwise: ΔR = [1, 0, 0, 0; 0, cos α, -sin α, 0; 0, sin α, cos α, 0; 0, 0, 0, 1].
(4) Translation by distance β along the positive or negative x axis:
positive: ΔR = [1, 0, 0, β; 0, 1, 0, 0; 0, 0, 1, 0; 0, 0, 0, 1];
negative: ΔR = [1, 0, 0, -β; 0, 1, 0, 0; 0, 0, 1, 0; 0, 0, 0, 1].
(5) Translation by distance β along the positive or negative y axis:
positive: ΔR = [1, 0, 0, 0; 0, 1, 0, β; 0, 0, 1, 0; 0, 0, 0, 1];
negative: ΔR = [1, 0, 0, 0; 0, 1, 0, -β; 0, 0, 1, 0; 0, 0, 0, 1].
(6) Translation by distance β along the positive or negative z axis:
positive: ΔR = [1, 0, 0, 0; 0, 1, 0, 0; 0, 0, 1, β; 0, 0, 0, 1];
negative: ΔR = [1, 0, 0, 0; 0, 1, 0, 0; 0, 0, 1, -β; 0, 0, 0, 1].
According to one embodiment of the invention, α takes 0.00785 radians (0.45 degrees) and β takes 0.08.
In some embodiments of the present invention, the 12 kinds of ΔR_i adjustment are presented in a graphical interface, and R_R2I is adjusted by dragging progress bars. It should be noted that, because this embodiment converts the 3D millimeter wave radar cluster points into the image pixel coordinate system via the IMU, even if the initial conversion is inaccurate, i.e., the projected points do not initially coincide with the centroid of the metal plate, they will eventually coincide as the progress bars are dragged.
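The 12 kinds of ΔR_i can be generated as standard 4×4 homogeneous rotation and translation matrices. A minimal sketch follows; the function names are illustrative, and which sign counts as "clockwise" is an assumption (here sign=+1 is counterclockwise, the positive mathematical direction):

```python
import numpy as np

ALPHA = 0.00785  # radians (about 0.45 degrees), per the embodiment
BETA = 0.08      # translation step per drag, per the embodiment

def delta_rotation(axis, sign, angle=ALPHA):
    """One of the 6 rotational ΔR_i: 4x4 homogeneous rotation by sign*angle
    about the given axis (sign=+1 counterclockwise, -1 clockwise)."""
    c, s = np.cos(sign * angle), np.sin(sign * angle)
    R = np.eye(4)
    if axis == 'z':
        R[0, 0], R[0, 1], R[1, 0], R[1, 1] = c, -s, s, c
    elif axis == 'y':
        R[0, 0], R[0, 2], R[2, 0], R[2, 2] = c, s, -s, c
    else:  # 'x'
        R[1, 1], R[1, 2], R[2, 1], R[2, 2] = c, -s, s, c
    return R

def delta_translation(axis, sign, step=BETA):
    """One of the 6 translational ΔR_i: translate sign*step along one axis."""
    T = np.eye(4)
    T[{'x': 0, 'y': 1, 'z': 2}[axis], 3] = sign * step
    return T

# Each drag of a progress bar premultiplies the running radar->IMU estimate:
R_R2I = delta_rotation('z', +1) @ delta_translation('x', -1) @ np.eye(4)
```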
The joint calibration method according to the embodiment of the present invention is described in detail below with reference to specific embodiments.
(1) The 3D millimeter wave radar is adjusted to Object mode and a rectangular metal plate is arranged in front of the robot.
(2) The four corner points of the metal plate are detected in the RGB camera image, the centroid of the metal plate is calculated, and the centroid is displayed on the image.
(3) The 3D millimeter wave radar cluster point (x_R, y_R, 1)^T is converted into the RGB camera image pixel coordinate system and displayed on the image:
(u, v, 1)^T = K · R_I2C · R_R2I · (x_R, y_R, 1, 1)^T
where (x_R, y_R, 1, 1)^T is the homogeneous coordinate of the 3D millimeter wave radar cluster point, u and v denote image pixel coordinates, K denotes the RGB camera intrinsics, and the superscript T denotes transpose.
(4) R_R2I is adjusted, i.e., R_R2I is multiplied by a matrix change amount ΔR_i (presented as a graphical interface and adjusted by dragging a progress bar).
(5) Repeating the steps (3) and (4), and finally enabling the point of the 3D millimeter wave radar clustering point (Object mode) converted into the RGB camera image to be precisely coincident with the mass center of the metal plate.
(6) Assuming N adjustments were made in total, the finally calibrated 3D-millimeter-wave-radar-to-RGB-camera transformation matrix is:
R_R2C = R_I2C · ΔR_N · ... · ΔR_2 · ΔR_1 · R_R2I.
It should be noted that, because the point of converting the 3D millimeter wave radar cluster point (Object mode) into the RGB camera image and the centroid of the metal plate are precisely coincident, the joint calibration method of the embodiment of the invention has the characteristic of high precision.
In the scheme that directly solves the radar-to-camera conversion matrix with a nonlinear optimization algorithm, the initial value is random, so after the algorithm finishes the projected radar cluster point does not necessarily coincide with the metal plate centroid in the image; in contrast, the present embodiment can guarantee the calibration accuracy.
In addition, the 3D millimeter wave radar coordinate system can be converted into the image coordinate system by dragging progress bars; the whole calibration process requires no movement of the robot or the metal plate, and calibration can be completed by one person with one computer. Compared with the scheme that computes the radar-to-camera conversion matrix with a machine learning method such as a neural network, this is extremely easy to operate.
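Composing the finally calibrated matrix from the accumulated slider adjustments can be sketched as follows. This is a sketch under the assumption, consistent with equation (7), that each ΔR_i premultiplies the running radar-to-IMU estimate; the function name is illustrative:

```python
import numpy as np

def calibrated_radar_to_camera(R_I2C, R_R2I_init, deltas):
    """Compose the final radar->camera transform after N adjustments:
    R_I2C @ ΔR_N @ ... @ ΔR_1 @ R_R2I_init, where deltas = [ΔR_1, ..., ΔR_N]
    in the order they were applied."""
    R = np.asarray(R_R2I_init, dtype=float)
    for d in deltas:                         # ΔR_1 applied first, ΔR_N last
        R = np.asarray(d, dtype=float) @ R
    return np.asarray(R_I2C, dtype=float) @ R

# Two opposite translation drags along x compose back to the start:
d1, d2 = np.eye(4), np.eye(4)
d1[0, 3], d2[0, 3] = 0.08, -0.08
R = calibrated_radar_to_camera(np.eye(4), np.eye(4), [d1, d2])
# the two opposite drags cancel -> R is the identity
```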
The invention also discloses a combined calibration system which is applied to the combined calibration method of any embodiment, and the system comprises an image acquisition module, a centroid processing module and a clustering point processing module.
Specifically, the image acquisition module acquires an image containing the metal plate; the centroid processing module obtains the centroid of the metal plate and displays it on the image; and the cluster point processing module converts the cluster points output by the millimeter wave radar into the pixel coordinate system of the image, displays them on the image, and adjusts the conversion so that the cluster points coincide with the centroid in the image.
The present invention also provides an electronic device including: a processor 201 and a memory 202, wherein computer program instructions are stored in the memory 202, wherein the computer program instructions, when executed by the processor 201, cause the processor 201 to perform the steps of the joint calibration method in the above-described embodiments.
Further, as shown in fig. 3, the electronic device further comprises a network interface 203, an input device 204, a hard disk 205, and a display device 206.
The interfaces and devices described above may be interconnected by a bus architecture, which may include any number of interconnected buses and bridges. The bus architecture links together one or more central processing units (CPUs), represented by the processor 201, and various circuits of one or more memories, represented by the memory 202. It may also connect various other circuits, such as peripheral devices, voltage regulators, and power management circuits, and enables communication between the connected components. In addition to a data bus, the bus architecture includes a power bus, a control bus, and a status signal bus, all of which are well known in the art and are therefore not described in detail here.
The network interface 203 may be connected to a network (e.g., the internet, a local area network, etc.), and may obtain relevant data from the network and store the relevant data in the hard disk 205.
The input device 204 may receive various instructions entered by an operator and send them to the processor 201 for execution. The input device 204 may include a keyboard or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen).
A display device 206 may display results obtained by the execution of instructions by the processor 201.
The memory 202 stores the programs and data required for the operation of the operating system 2021, as well as data such as intermediate results produced during computation by the processor 201.
It will be appreciated that the memory 202 in embodiments of the invention can be volatile memory or nonvolatile memory, or can include both. The nonvolatile memory may be read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. The volatile memory can be random access memory (RAM), which acts as an external cache. The memory 202 of the apparatus and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 202 stores the following elements, executable modules or data structures, or a subset thereof, or an extended set thereof: an operating system 2021 and application programs 2022.
The operating system 2021 contains various system programs, such as a framework layer, a core library layer, and a driver layer, for implementing various basic services and processing hardware-based tasks. The application programs 2022 include various applications, such as a browser, for implementing various application services. A program implementing the method of the embodiments of the present invention may be contained in the application programs 2022.
When calling and executing the application programs 2022 and the data stored in the memory 202 — specifically, the programs or instructions stored in the application programs 2022 — the processor 201 performs the steps of the joint calibration method of the above embodiments.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the processor 201. The processor 201 may be an integrated circuit chip with signal processing capability. In implementation, the steps of the above method may be performed by integrated logic circuits of hardware in the processor 201 or by instructions in the form of software. The processor 201 may be a general purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, or discrete hardware components, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general purpose processor may be a microprocessor, or the processor 201 may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software modules may be located in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, or registers. The storage medium is located in the memory 202; the processor 201 reads the information in the memory 202 and, in combination with its hardware, performs the steps of the method described above.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), general purpose processors, controllers, microcontrollers, microprocessors, other electronic units designed to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions herein. The software codes may be stored in the memory 202 and executed by the processor 201. The memory 202 may be implemented within the processor 201 or external to the processor 201.
Specifically, the processor 201 is further configured to read the computer program and perform the steps of the joint calibration method.
In a fourth aspect of the present invention, there is also provided a computer readable storage medium storing a computer program, which when executed by the processor 201, causes the processor 201 to perform the steps of the joint calibration method of the above embodiment.
In the several embodiments provided in the present invention, it should be understood that the disclosed methods and apparatus may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may be physically included separately, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units implemented in the form of software functional units may be stored in a computer readable storage medium. Such a software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
While certain specific embodiments of the invention have been described in detail by way of example, it will be appreciated by those skilled in the art that the above examples are for illustration only and are not intended to limit the scope of the invention. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope and spirit of the invention. The scope of the invention is defined by the appended claims.

Claims (10)

1. A joint calibration method, characterized by comprising the following steps:
acquiring an image containing a metal plate through an RGB camera, obtaining the centroid of the metal plate from the image, and displaying the centroid on the image;
converting the cluster points output by the millimeter wave radar into the pixel coordinate system of the image, and displaying the cluster points on the image; and
converting the cluster points so that they coincide with the centroid in the image.
2. The joint calibration method according to claim 1, wherein the centroid of the metal plate is obtained by the following formula:
3. The joint calibration method according to claim 1, wherein the cluster points are converted to the pixel coordinate system using an IMU coordinate system.
4. The joint calibration method according to claim 3, wherein the initial formula for converting the cluster points into the pixel coordinate system is:
(u, v, 1)^T = K · R_I2C · R_R2I · (x_R, y_R, 1, 1)^T
5. the joint calibration method according to claim 4, wherein the iterative formula for converting the cluster points to the pixel coordinate system is:
6. The joint calibration method according to claim 5, wherein R_R2I is adjusted by adjusting the rotation portion in the matrix change ΔR_i.
7. The joint calibration method according to claim 6, wherein R_R2I is adjusted clockwise and/or counterclockwise about the three coordinate axes of the coordinate system.
8. A joint calibration system for use in a joint calibration method according to any one of claims 1 to 7, the system comprising:
an image acquisition module that acquires an image including a metal plate;
a centroid processing module that obtains the centroid of the metal plate and displays the centroid on the image; and
a cluster point processing module that converts the cluster points output by the millimeter wave radar into the pixel coordinate system of the image, displays the cluster points on the image, and converts the cluster points so that they coincide with the centroid in the image.
9. An electronic device, comprising: a processor and a memory in which computer program instructions are stored, wherein the computer program instructions, when executed by the processor, cause the processor to perform the steps of the joint calibration method of any of claims 1-7.
10. A computer readable storage medium, characterized in that the computer readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the steps of the joint calibration method of any of claims 1-7.
Priority Applications (1)

CN202311773129.1A — priority date 2023-12-21, filing date 2023-12-21 — Combined calibration method, system, electronic equipment and computer readable storage medium

Publications (1)

CN118050723A — published 2024-05-17

Family ID: 91052783

Country: CN


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination