CN117092659A - System and method for jointly measuring ship height by laser imaging radar and camera - Google Patents

Info

Publication number
CN117092659A
Authority
CN
China
Prior art keywords: camera, point, data, ship, height
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310930371.9A
Other languages
Chinese (zh)
Inventor
熊木地 (Xiong Mudi)
赵永杰 (Zhao Yongjie)
戚超 (Qi Chao)
林意涵 (Lin Yihan)
Current Assignee
Dalian Maritime University
Original Assignee
Dalian Maritime University
Priority date
Filing date
Publication date
Application filed by Dalian Maritime University
Priority application: CN202310930371.9A
Publication: CN117092659A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a system and a method for jointly measuring ship height with a laser imaging radar and a camera. Data acquired by the camera sensor, the lidar sensor and the inclination sensor are transmitted to an industrial computer, where camera calibration, image processing and point cloud processing are carried out by the camera calibration module, the image processing module and the point cloud data processing module respectively. The image-processed data and the point-cloud-processed data are then jointly calibrated by the joint calibration module, coordinate transformation is performed by the coordinate transformation module, and the jointly calibrated and coordinate-transformed data are input into the data fusion module for data fusion. Inclination compensation is applied to the data detected by the inclination sensor by the angle compensation module, and finally the height calculation module computes the ship height based on the fused data and the inclination-compensated data. Through the combined measurement of the camera and the lidar, the invention can measure the height of a ship rapidly and accurately.

Description

System and method for jointly measuring ship height by laser imaging radar and camera
Technical Field
The invention relates to the technical field of ship height measurement, in particular to a system and a method for measuring ship height by combining a laser imaging radar and a camera.
Background
With the continuous growth of inland waterway traffic, violations in which inland vessels sail above the permitted height have become more frequent. This seriously affects the navigation safety, navigation efficiency and economic benefit of inland waterways and hinders the healthy development of the shipping economy.
At present, the technology for measuring the height of inland river vessels mainly uses several sensors arranged at different heights, or a single sensor measuring at different heights, and includes the lidar altimetry method and the binocular camera parallax method.
However, the lidar altimetry method can only acquire height information of the ship surface; the lidar point cloud is sparse in the vertical direction, so the highest point of the ship may not be detected and accuracy cannot be guaranteed. Although the binocular camera parallax method can obtain more accurate height information, it cannot accurately detect object boundaries in regions with little change in color or reflectivity, and it suffers from blind spots and from the influence of illumination on accuracy.
Therefore, there is currently no effective technical means for rapid off-board ship height measurement, and a new and efficient ship height measurement method is needed to measure ships quickly and accurately.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a system and a method for jointly measuring ship height with a laser imaging radar and a camera. Through the combined measurement of the camera and the lidar, the height of a ship can be measured quickly and accurately; compared with traditional ship height measurement methods, the measurement precision is effectively improved, and the method has broad application prospects.
In order to achieve the above object, the technical scheme of the present invention is as follows:
the invention provides a system for measuring the height of a ship by combining a laser imaging radar and a camera, which comprises the following components:
a camera sensor for acquiring ship image data;
a lidar sensor for laser imaging to obtain ship point cloud data;
the inclination sensor is used for collecting the inclination of the ship;
the camera calibration module is connected with the camera sensor and used for calibrating the camera to obtain internal reference data of the camera;
the image processing module is connected with the camera calibration module and used for performing image processing on the ship image data calibrated by the camera calibration module;
the point cloud data processing module is connected with the laser radar sensor and is used for processing ship point cloud data acquired by the laser radar sensor;
the joint calibration module is connected with the point cloud data processing module and the image processing module and is used for performing joint calibration on the processed point cloud data and the processed image data to obtain external parameter data of the camera and the laser radar;
the coordinate transformation module is connected with the joint calibration module and used for transforming coordinates of the data after joint calibration;
the data fusion module is connected with the joint calibration module and the coordinate transformation module and used for carrying out data fusion on the data subjected to joint calibration and the data subjected to coordinate transformation;
an angle compensation module connected with the inclination sensor and used for performing angle compensation on the ship inclination acquired by the inclination sensor; and
a height calculation module connected with the data fusion module and the angle compensation module and used for calculating the ship height based on the fused data and the angle-compensated data.
Further, the laser radar and the camera are horizontally arranged on the same bracket, the laser radar is arranged right above the camera, and a two-dimensional inclination sensor is arranged on one side of the laser radar so that the camera and the laser radar are kept horizontal.
Further, the height calculation module obtains the height of the ship relative to the water surface, namely the clearance height of the ship, according to the distance and the coordinates of the point cloud obtained by the data fusion module, and the installation height and the water level information of the laser radar.
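The bookkeeping just described can be sketched as follows. This is an illustrative example only; the function name, variable names and sign conventions are assumptions rather than the patent's, with the projected point's height expressed relative to the lidar (negative when below it):

```python
def clearance_height(mount_height, water_level, point_height_rel_lidar, gap_to_top):
    """Ship clearance height above the water surface.

    mount_height:            lidar/camera mounting height above a common datum
    water_level:             water surface height above the same datum
    point_height_rel_lidar:  height of the projected point-cloud point relative
                             to the lidar (negative when below the sensor)
    gap_to_top:              real height difference from that point up to the
                             ship's highest point (from the pixel-gap conversion)
    """
    return (mount_height + point_height_rel_lidar + gap_to_top) - water_level
```

For example, a lidar mounted 20 m above the datum, a water level of 2 m, a projected point 5 m below the lidar and a 1.5 m gap to the highest point give a clearance of 14.5 m.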
Further, according to the highest point P0 of the ship found by image processing and the point cloud data projected onto the image pixels, the height calculation module searches downward from the highest point for the nearest projected point-cloud point; when the first projected point P1 is found, the distance between P0 and P1, namely the number N of pixels between the two points, is calculated. The point corresponding to P1 has coordinates (ac, bc, cc) in the camera coordinate system and (al, bl, cl) in the lidar coordinate system; (al, bl, cl) is obtained directly from the lidar point cloud data, and (ac, bc, cc) is obtained from the lidar point cloud data through the external parameter matrix transformation, so both are high-precision known data. In actual measurement, the Z-axis coordinates (in the lidar coordinate system) of the points between P0 and P1 are equal or approximately equal to cl, i.e. the points between P0 and P1 lie at the same distance from the plane in which the camera and the radar are located; the Y-axis coordinate of the camera coordinate system and the Z-axis coordinate of the radar coordinate system represent the ship height.
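The patent does not print how the pixel count N is converted into a real distance. Under a pinhole camera model, N vertical pixels at depth Z_C correspond to a real offset of N·Z_C/f_y, which follows from the projection formulas given later in the description. A sketch under that assumption, with hypothetical names:

```python
def pixel_gap_to_metres(n_pixels, depth_z, fy):
    """Real vertical offset implied by an n-pixel gap at depth depth_z.

    Inverts the pinhole relation dv = fy * dY / Z, giving dY = dv * Z / fy.
    Assumes the highest point lies at approximately the same depth as the
    projected point P1, as stated for the height calculation module.
    """
    return n_pixels * depth_z / fy
```

For instance, a 50-pixel gap at a depth of 10 m with f_y = 500 px corresponds to a real offset of 1 m.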
Further, the data fusion module utilizes the obtained internal reference data of the camera and the external reference data of the laser radar and the camera to carry out joint processing on the point cloud data acquired by the laser radar and the image data of the camera, extracts the highest point of the ship through a technology based on image processing, uses the internal reference of the camera and the external reference calibrated by the laser radar and the camera, and projects the point cloud acquired by the laser radar onto the image acquired by the camera through a projection algorithm and the conversion of a coordinate system; and then detecting the distance between the highest point of the ship and the closest point projected by the point cloud below the highest point, and calculating the real distance from the point to the point cloud projected point in a world coordinate system by using a correlation algorithm according to the distance.
Further, the camera calibration module adopts a Zhang Zhengyou calibration method, and obtains an internal reference matrix and a distortion coefficient of the camera by shooting checkerboard patterns at different positions and solving camera parameters by using a calibration algorithm.
Further, the image processing module performs background modeling on the two-dimensional image captured by the camera and extracts the foreground, i.e. the moving ship, with a background modeling algorithm. The image is then binarized: the moving-ship target is given gray level 1 and displayed as white, while the static background area is given gray level 0 and displayed as black. The binarized image is traversed pixel by pixel from top to bottom; the first pixel whose value is 1 and which has at least one pixel of value 1 among its 8 neighbors is taken as the pixel of the highest point of the ship's image, and its coordinates give the highest point of the corresponding ship in the image.
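The top-down scan just described can be sketched as follows (an illustrative example on a binary mask; the function name is hypothetical):

```python
import numpy as np

def highest_ship_pixel(mask):
    """Top-down scan of a binary foreground mask (1 = ship) for the first
    pixel that is set and has at least one set pixel among its 8 neighbors.
    Returns (row, col) of the ship's highest imaged point, or None."""
    h, w = mask.shape
    for v in range(h):                          # v axis: top of image downward
        for u in range(w):
            if mask[v, u] != 1:
                continue
            v0, v1 = max(v - 1, 0), min(v + 2, h)
            u0, u1 = max(u - 1, 0), min(u + 2, w)
            if mask[v0:v1, u0:u1].sum() >= 2:   # the pixel itself plus >= 1 neighbor
                return v, u
    return None
```

The 8-neighbor check rejects isolated noise pixels, matching the condition stated above.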
Further, the point cloud data processing module filters noise and removes outliers and ground points of the point cloud data.
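A minimal sketch of such filtering, assuming a simple height threshold for ground returns and a k-nearest-neighbor statistical test for outliers; the patent does not name specific algorithms, and a RANSAC plane fit is a common alternative for the ground step:

```python
import numpy as np

def filter_cloud(points, ground_z, k=8, std_ratio=2.0):
    """Remove ground points and statistical outliers from an (N, 3) cloud.

    ground_z:  points with z at or below this threshold are treated as ground
               (assumes z is the height axis of the cloud's frame).
    k, std_ratio: a point is an outlier when its mean distance to its k
               nearest neighbors exceeds mean + std_ratio * std over the cloud.
    """
    pts = points[points[:, 2] > ground_z]              # drop ground returns
    # Pairwise distances; O(N^2) broadcasting, fine for modest cloud sizes.
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    d.sort(axis=1)
    mean_knn = d[:, 1:k + 1].mean(axis=1)              # skip the zero self-distance
    keep = mean_knn <= mean_knn.mean() + std_ratio * mean_knn.std()
    return pts[keep]
```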
Furthermore, the joint calibration module uses a checkerboard calibration method to enable the laser radar and the camera to synchronously acquire data, and the conversion relation from the laser radar coordinate system to the camera coordinate system is solved by finding corner data of the checkerboard.
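The patent states that the lidar-to-camera conversion is solved from the checkerboard corner data but does not print the solver. One common approach, sketched here as an assumption, is a least-squares rigid fit (Kabsch/SVD) between corresponding 3-D corner positions expressed in the two frames:

```python
import numpy as np

def fit_rigid_transform(P_lidar, P_cam):
    """Least-squares R, t with P_cam ~= R @ p + t for each row p (Kabsch/SVD).

    P_lidar, P_cam: (N, 3) arrays of corresponding checkerboard corner
    positions in the lidar and camera frames.
    """
    mu_l, mu_c = P_lidar.mean(axis=0), P_cam.mean(axis=0)
    H = (P_lidar - mu_l).T @ (P_cam - mu_c)             # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = mu_c - R @ mu_l
    return R, t
```

In practice the camera-side corner positions would themselves come from the detected 2-D corners (e.g. via a PnP solve); that step is omitted here.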
The invention also provides a method for jointly measuring the ship height by the laser imaging radar and the camera, which is based on a system for jointly measuring the ship height by the laser imaging radar and the camera, and comprises the following steps:
calibrating internal parameters of the camera, and respectively calculating internal reference data of the camera;
performing joint calibration on the camera and the laser radar to obtain external parameter data of the laser radar relative to the camera;
synchronously acquiring ships by using a camera and a laser radar to obtain image data and point cloud data;
processing the point cloud data, removing outliers and ground points, projecting the rest points into a new two-dimensional matrix according to an internal reference matrix of the camera and an external reference matrix of the camera and the laser radar, and reserving a one-to-one correspondence between points in the two-dimensional matrix and three-dimensional point cloud data points before projection;
according to the size of the image acquired by the camera, the points of the projected two-dimensional matrix that fall within the image range are projected onto the image, each point being projected onto one pixel;
background modeling is carried out on a two-dimensional image shot by a camera, a ship target in the image is extracted, and two-dimensional coordinates of a ship high-point in a pixel coordinate system are obtained;
after the two-dimensional coordinates of the highest point are found on the image, searching in the positive direction of the v-axis of the pixel coordinate system for the nearest projected point-cloud point, recording the pixel distance between the ship's highest point and the projected point, and calculating from this distance the distance between them in the world coordinate system, i.e. the real distance;
according to the one-to-one correspondence between the points of the two-dimensional matrix after the point cloud projection and the points in the three-dimensional point cloud before the projection, the three-dimensional coordinates of the projection points are found, and the real height value of the points relative to the mounting height of the laser radar and the camera is obtained by combining the angle compensation information of the inclination sensor;
and obtaining the height of the ship relative to the water surface according to the installation height of the camera and the laser radar, the water surface height, the height of the laser radar projection point and the height difference from the highest point of the ship to the laser radar projection point.
Compared with the prior art, the invention has the following beneficial effects:
the invention adopts the camera and the laser radar to jointly measure the ship height, and combines the information and advantages of the two sensors. Based on the image data of the camera and the three-dimensional point cloud data of the laser radar, the measurement data of the two sensors are combined, the acquisition of the point cloud on the surface of the ship body is realized, and the acquisition of the height information of the ship body is realized based on the algorithms such as the highest point extraction, the distance calculation, the coordinate transformation and the like. The advantages of high accuracy and good stability of the three-dimensional point cloud data of the laser radar and no influence of illumination, namely color, are fully exerted, the data collected by the camera are denser, the advantages of being convenient for target detection due to the fact that the texture is better are achieved, and the measurement accuracy and reliability are improved. The invention has the advantages of real-time performance, high precision, simple operation and the like, and can be effectively applied to the field of ship height measurement.
The invention maintains the one-to-one correspondence between the points of the two-dimensional matrix after the projection of the point cloud and the points in the three-dimensional point cloud before the projection, avoids the back projection of the image (from a pixel coordinate system to a radar coordinate system) when calculating the height, avoids the error generated thereby and improves the efficiency.
According to the invention, the two-dimensional inclination sensor is used for compensating the inclination angles of the camera and the laser radar, so that the measurement accuracy is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings may be obtained according to the drawings without inventive effort to a person skilled in the art.
FIG. 1 is a block diagram of a laser imaging radar and camera combined ship height measurement system in accordance with an embodiment of the present invention;
FIG. 2 is a flowchart of a method for measuring the ship height by combining a laser imaging radar and a camera according to an embodiment of the invention;
FIG. 3 is a diagram of an installation frame in an embodiment of the present invention;
FIG. 4 is a schematic diagram of the joint calibration conversion of three-dimensional coordinates of two sensors according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a pixel point of an image projected by a point cloud according to an embodiment of the present invention;
in the figure, 1, a camera sensor; 2. a lidar sensor; 3. an inclination sensor; 4. a point cloud data processing module; 5. an image processing module; 6. an angle compensation module; 7. a camera calibration module; 8. a joint calibration module; 9. a coordinate transformation module; 10. a data fusion module; 11. and a height calculating module.
Detailed Description
In order that those skilled in the art will better understand the present invention, a technical solution in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in which it is apparent that the described embodiments are only some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
As shown in fig. 1, in an embodiment of the present invention, there is provided a system for measuring a ship height by combining a laser imaging radar and a camera, the system generally comprising: the system comprises a camera sensor 1, a laser radar sensor 2, an inclination sensor 3, a point cloud data processing module 4, an image processing module 5, an angle compensation module 6, a camera calibration module 7, a joint calibration module 8, a coordinate transformation module 9, a data fusion module 10 and a height calculation module 11; wherein:
a camera sensor 1 for acquiring ship image data;
the laser radar sensor 2 is used for laser imaging to obtain ship point cloud data;
the inclination sensor 3 is used for collecting the inclination of the ship;
in a specific implementation, as shown in fig. 3, the laser radar and the camera are horizontally arranged on the same bracket, the laser radar is arranged right above the camera, and a two-dimensional inclination sensor 3 is arranged on one side of the laser radar to keep the camera and the laser radar horizontal.
The camera calibration module 7 is connected with the camera sensor 1 and is used for calibrating the camera to obtain information such as an internal reference matrix and a distortion coefficient of the camera.
The image processing module 5 is connected with the camera calibration module 7 and is used for performing image processing on ship image data calibrated by the camera calibration module 7;
the point cloud data processing module 4 is connected with the laser radar sensor 2 and is used for processing ship point cloud data acquired by the laser radar sensor 2;
the joint calibration module 8 is connected with the point cloud data processing module 4 and the image processing module 5, and is used for performing joint calibration on the processed point cloud data and the processed image data to obtain external reference data of the camera and the laser radar, namely a rotation matrix and a translation matrix.
The coordinate transformation module 9 is connected with the joint calibration module 8 and is used for carrying out coordinate transformation on the data after joint calibration;
the data fusion module 10 is connected with the joint calibration module 8 and the coordinate transformation module 9, and is used for carrying out data fusion on the data after joint calibration and the data after coordinate transformation, and specifically comprises the following steps: the method comprises the steps of carrying out joint processing on point cloud data acquired by a laser radar and image data of a camera by using acquired internal reference data of the camera and external reference data of the laser radar and the camera, extracting the highest point of a ship by using an image processing technology, and projecting the point cloud acquired by the laser radar onto an image acquired by the camera by using the internal reference of the camera and the external reference calibrated by the laser radar and the camera through a projection algorithm and conversion of a coordinate system; then, the distance (pixel point with phase difference) between the highest point of the ship and the closest point projected by the point cloud below the highest point (in the v-axis forward direction of the pixel coordinate system) is detected, and the real distance from the point to the point cloud projected point in the world coordinate system is calculated by using a correlation algorithm according to the distance.
The angle compensation module 6 is connected with the inclination sensor 3 and is used for performing angle compensation on the ship inclination acquired by the inclination sensor 3;
the height calculating module 11 is connected with the data fusion module 10 and the angle compensating module 6, and performs ship height calculation based on the data after data fusion and the data after angle compensation, specifically, obtains the height of the ship relative to the water surface, namely the clearance height of the ship according to the distance obtained by the data fusion module 10, the coordinates of the point cloud, the installation height of the laser radar, the water level and other information.
The working process of the system is as follows: data acquired by the camera sensor 1, the lidar sensor 2 and the inclination sensor 3 are transmitted to an industrial computer; camera calibration, image processing and point cloud processing are carried out by the camera calibration module 7, the image processing module 5 and the point cloud data processing module 4 respectively; the image-processed data and the point-cloud-processed data are then jointly calibrated by the joint calibration module 8, coordinate transformation is performed by the coordinate transformation module 9, and the jointly calibrated and coordinate-transformed data are input into the data fusion module 10 for data fusion; inclination compensation is applied to the data detected by the inclination sensor 3 by the angle compensation module 6; finally, the height calculation module 11 computes the ship height based on the fused data and the inclination-compensated data.
As shown in fig. 2, based on the above system, a method for jointly measuring the ship height by using a laser imaging radar and a camera in an embodiment of the present invention includes the following steps:
s1, calibrating internal parameters of a camera, and respectively calculating internal reference data of the camera;
specifically, calibrating the camera includes solving an internal reference matrix and a distortion coefficient of the camera. The Zhang Zhengyou calibration method can be adopted, and the internal reference matrix and the distortion coefficient of the camera are obtained by shooting checkerboard patterns at different positions and solving camera parameters by using a calibration algorithm.
S2, carrying out joint calibration on the camera and the laser radar to obtain external parameter data of the laser radar relative to the camera;
specifically, the lidar is placed directly above the camera with both steps at the level. And synchronously acquiring data by using a checkerboard calibration method, and solving the conversion relation from a laser radar coordinate system to a camera coordinate system by finding corner data of the checkerboard.
In order to project points measured in the lidar coordinate system into the camera coordinate system, an additional mapping transformation is required, which is divided into two parts: translation and rotation.
Translation describes a linear movement from a point P in three-dimensional coordinates to P′, which can be achieved by adding a translation vector t to P:
P′ = P + t (1)
In homogeneous coordinates this can be written as a single matrix multiplication by augmenting the M-dimensional identity matrix I with the translation vector, where M is the number of elements of P and T is the translation matrix:
[P′; 1] = T·[P; 1], with T = [ I t ; 0 1 ] (2)
In the rotation operation, the relation after rotation through an angle α about the x-axis is:
X′_L = X_L
Y′_L = Y_L cos α - Z_L sin α (3)
Z′_L = Y_L sin α + Z_L cos α (4)
Similarly, the relations after rotation through β about the y-axis and through γ about the z-axis are:
X′_L = X_L cos β + Z_L sin β (5)
Z′_L = -X_L sin β + Z_L cos β (6)
X′_L = X_L cos γ - Y_L sin γ (7)
Y′_L = X_L sin γ + Y_L cos γ (8)
Multiplying the rotation components about the three axes gives the rotation matrix between the two three-dimensional coordinate systems:
R = R_z(γ)·R_y(β)·R_x(α)
as can be obtained from the above formula, the complete coordinate conversion required for mapping the laser radar three-dimensional point cloud onto the camera coordinate system is shown as follows, and the joint calibration formula of the camera and the laser radar is as follows:
wherein P_c = (x_c, y_c, z_c) and P_l = (x_l, y_l, z_l) are the point cloud data coordinates in the camera coordinate system and in the lidar coordinate system respectively, and R and T are the rotation matrix and the translation matrix from the lidar coordinate system to the camera coordinate system.
Equation (9) converts and corrects the coordinate systems of the two sensors according to the internal reference matrix of the camera and the external reference matrix of the camera and the lidar: using a transformation matrix based on rigid-body transformation, the point cloud data acquired by the lidar are converted into the camera coordinate system. The camera coordinate system is then converted into the image coordinate system according to the camera's internal reference matrix and distortion coefficients, the image coordinate system is converted into the pixel coordinate system, and image fusion with the image acquired by the camera is performed, realizing the projection of the radar point cloud data onto the image.
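Applied to a whole cloud, the rigid-body transform of formula (9) can be written in a few lines of NumPy (an illustrative sketch; names are hypothetical):

```python
import numpy as np

def lidar_to_camera(points_lidar, R, t):
    """Apply the extrinsic transform of formula (9): P_c = R @ P_l + t.

    points_lidar: (N, 3) cloud in the lidar frame; R: 3x3 rotation matrix;
    t: length-3 translation. Returns the (N, 3) cloud in the camera frame.
    """
    # Row-vector form of R @ p + t for every point at once.
    return points_lidar @ R.T + t
```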
S3, synchronously acquiring ships by using a camera and a laser radar to obtain image data and point cloud data;
s4, processing the point cloud data, removing outliers and ground points, projecting the rest points into a new two-dimensional matrix according to an internal reference matrix of the camera and an external reference matrix of the camera and the laser radar, and reserving a one-to-one correspondence between the points in the two-dimensional matrix and three-dimensional point cloud data points before projection;
the point cloud data processing comprises the following steps: noise is filtered, outliers and ground points of the point cloud data are removed, and the points influence the accuracy of subsequent calculation, so that the outliers and the ground points need to be removed.
S5, projecting points of the projected two-dimensional matrix in the image range onto the image according to the size of the image acquired by the camera, wherein each point is correspondingly projected onto a pixel point;
to achieve data fusion, three-dimensional point cloud data are projected onto an image, firstly, the three-dimensional point cloud is converted from a laser radar coordinate system to a camera coordinate system according to the R and T carried in the marked formula (9). And converting the camera coordinate system into a pixel coordinate system according to the internal parameters of the camera, projecting the point cloud data onto pixel points of an image of the camera, and reserving a one-to-one correspondence relation between the point cloud data projected onto the pixel coordinate system and three-dimensional point cloud in the laser radar coordinate system before projection. The projected image is shown in fig. 5.
Specifically, the point cloud data P_L = (X_L, Y_L, Z_L) acquired by the lidar are projected into the camera coordinate system to obtain the corresponding coordinates P_C = (X_C, Y_C, Z_C):
P_C = R·P_L + T (10)
The points in the camera coordinate system are then projected onto the pixel coordinate system with the projection formulas:
u = (f_x·X_C / Z_C) + c_x (11)
v = (f_y·Y_C / Z_C) + c_y (12)
where (u, v) are the point coordinates in the pixel coordinate system, (c_x, c_y) is the coordinate offset of the camera optical center in the pixel coordinate system, and f_x and f_y are the camera focal lengths in the horizontal and vertical directions respectively. With these formulas, the lidar point cloud can be projected onto the pixels of the image.
And filtering out points beyond the range of the image according to the size of the image, and projecting the rest points onto the image.
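Equations (11)-(12) together with this image-bounds filtering can be sketched as follows; returning the indices of the surviving points keeps the one-to-one pixel-to-point correspondence required by step S4 (function and parameter names are hypothetical):

```python
import numpy as np

def project_to_pixels(points_cam, fx, fy, cx, cy, width, height):
    """Project camera-frame points with equations (11)-(12), keep only those
    landing inside the image, and return the indices of the surviving 3-D
    points so the pixel -> point correspondence is preserved."""
    Z = points_cam[:, 2]
    u = fx * points_cam[:, 0] / Z + cx
    v = fy * points_cam[:, 1] / Z + cy
    pix = np.stack([u, v], axis=1)
    # Z > 0 drops points behind the camera; the rest is the image-bounds filter.
    inside = (Z > 0) & (u >= 0) & (u < width) & (v >= 0) & (v < height)
    return pix[inside].astype(int), np.flatnonzero(inside)
```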
When the point cloud data are projected onto the pixel points of the image of the camera, the one-to-one correspondence relation between the point cloud data projected onto the pixel coordinate system and the three-dimensional point cloud in the laser radar coordinate system before projection is reserved.
S6, performing background modeling on the two-dimensional image captured by the camera, extracting the ship target in the image, and obtaining the two-dimensional coordinates of the highest point of the ship in the pixel coordinate system;
The image processing includes: performing background modeling on the two-dimensional image captured by the camera and extracting the foreground, i.e. the moving ship, with a background modeling algorithm. The image is then binarized: the moving ship target is given a gray value of 1 and displayed as white, while the static background area is given a gray value of 0 and displayed as black. Each pixel of the binarized image is traversed in order from top to bottom (the positive direction of the pixel coordinate system's v-axis). The first pixel encountered whose value is 1 and whose 8-neighborhood also contains a pixel with value 1 is taken as the pixel of the imaged highest point of the ship, which yields the coordinates of the ship's highest point in the image.
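The top-down scan with the 8-neighborhood check can be sketched as below (the background modeling itself is assumed done; the input is already a binary foreground mask, and the toy mask values are illustrative).

```python
import numpy as np

def highest_ship_pixel(mask):
    """Scan a binary foreground mask top to bottom (increasing v) and return the
    first foreground pixel that also has a foreground pixel among its 8 neighbours,
    i.e. the imaged highest point of the ship (isolated noise pixels are skipped)."""
    h, w = mask.shape
    for v in range(h):
        for u in range(w):
            if mask[v, u] == 1:
                v0, v1 = max(v - 1, 0), min(v + 2, h)
                u0, u1 = max(u - 1, 0), min(u + 2, w)
                # window contains the pixel itself, so >= 2 means the
                # 8-neighbourhood holds at least one other foreground pixel
                if mask[v0:v1, u0:u1].sum() >= 2:
                    return u, v
    return None

# Toy mask: one isolated noise pixel above a small ship blob.
mask = np.zeros((8, 8), dtype=np.uint8)
mask[1, 6] = 1                     # isolated noise -> ignored
mask[4:7, 2:5] = 1                 # ship blob
print(highest_ship_pixel(mask))    # -> (2, 4)
```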
S7, after the two-dimensional coordinates of the highest point are found on the image, searching along the positive direction of the pixel coordinate system's v-axis for the nearest projected point of the point cloud, recording the pixel distance between the ship's highest point and the projected point, and calculating from it the corresponding distance in the world coordinate system, i.e. the real distance;
s8, according to the one-to-one correspondence between the points of the two-dimensional matrix after the point cloud projection and the points in the three-dimensional point cloud before the projection, the three-dimensional coordinates of the projection points are found, and the real height value of the points relative to the mounting height of the laser radar and the camera is obtained by combining the angle compensation information of the inclination sensor;
s9, obtaining the height of the ship relative to the water surface according to the installation height of the camera and the laser radar, the water surface height, the height of the laser radar projection point and the height difference from the highest point of the ship to the laser radar projection point.
Specifically, the height calculation finds the ship's highest point P0 from the image processing result together with the point cloud data projected onto the image pixels; as shown in fig. 5, the nearest projected cloud point is searched downward (the positive direction of the pixel coordinate system's v-axis) from the ship's highest point. When the first projected point P1 is found, the distance between the two points is computed as the number N of pixels between them. The coordinates of P1 in the camera coordinate system are (a_c, b_c, c_c) and in the laser radar coordinate system (a_l, b_l, c_l); (a_l, b_l, c_l) is obtained directly from the laser radar point cloud, while (a_c, b_c, c_c) is obtained from it through the extrinsic matrix transformation, so both are high-precision known data. In actual measurement, the Z-axis coordinates (in the laser radar coordinate system) of the points between P0 and P1 are all equal or approximately equal to c_l, i.e. the points between P0 and P1 are at the same distance from the plane in which the camera and radar lie. The orientation relationship between the coordinate axes of the camera and radar coordinate systems is shown in fig. 4; the Y-axis coordinate of the camera coordinate system and the Z-axis coordinate of the radar coordinate system can both represent the ship height.
Assuming the camera's vertical field of view is α and the photograph taken by the camera has n pixels in the vertical direction, the actual distance d corresponding to a shift of one pixel at this range is:
d = (b_l · tan(α/n)) / cos(α/2) (13)
where b_l, the Y-axis coordinate in the laser radar coordinate system of the point corresponding to the projected cloud point on the image, represents the distance from that point to the plane in which the laser radar and the camera are located.
The actual distance D between the highest point of the ship and the point cloud projection point at this time is:
D = N · d (14)
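Formulas (13) and (14) can be checked with a short numeric sketch; the field of view, image height, range and pixel count below are illustrative assumptions, not values from the embodiment.

```python
import math

def pixel_distance(b_l, alpha_deg, n_pixels):
    """Real-world distance covered by one vertical pixel at range b_l, formula (13)."""
    alpha = math.radians(alpha_deg)
    return (b_l * math.tan(alpha / n_pixels)) / math.cos(alpha / 2)

def gap_distance(N, b_l, alpha_deg, n_pixels):
    """Formula (14): true distance between the ship's highest point and the
    nearest projected cloud point N pixels below it."""
    return N * pixel_distance(b_l, alpha_deg, n_pixels)

# Illustrative numbers: 40-degree vertical FOV, 1080-pixel-tall image,
# projected point 50 m from the sensor plane, gap of 37 pixels.
D = gap_distance(N=37, b_l=50.0, alpha_deg=40.0, n_pixels=1080)
```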
The height H_1 of the ship's highest point relative to the mounting height of the laser radar and camera is then:

H_1 = D + c_l (15)
where c_l is the Z-axis coordinate in the laser radar coordinate system of the point corresponding to the projected cloud point on the image.
If the compensation angle provided by the inclination sensor is θ (the laser radar coordinate system rotated counterclockwise by θ degrees around the X-axis), the compensated height H_2 of the ship's highest point relative to the mounting height of the laser radar and camera is:

H_2 = H_1 · cosθ + b_l · sinθ (16)
The clearance height H of the vessel is then:

H = H_2 + H_0 + S (17)

where H_0 and S together are the distance from the bottom of the camera mounting frame to the water surface of the river.
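The height chain (15)-(17) can be sketched end-to-end as follows; all numeric inputs are illustrative assumptions, not measured values.

```python
import math

def clearance_height(D, c_l, b_l, theta_deg, H0, S):
    """Chain formulas (15)-(17): height of the ship's highest point relative to
    the sensors, tilt-compensated, then referenced to the water surface."""
    H1 = D + c_l                                        # (15)
    theta = math.radians(theta_deg)
    H2 = H1 * math.cos(theta) + b_l * math.sin(theta)   # (16) tilt compensation
    return H2 + H0 + S                                  # (17) clearance height

# Illustrative values: 1.27 m pixel-gap distance, projected point coordinate
# c_l = 3.5 m, range b_l = 50 m, 1.5-degree tilt, frame/water offsets 2 m + 1 m.
H = clearance_height(D=1.27, c_l=3.5, b_l=50.0, theta_deg=1.5, H0=2.0, S=1.0)
```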
In this embodiment, the ship height can be measured quickly and accurately through the combined measurement by the camera and the laser radar; compared with traditional ship height measurement methods, the measurement accuracy is effectively improved, giving the method broad application prospects.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the invention.

Claims (10)

1. A system for measuring vessel height by combining a laser imaging radar and a camera, the system comprising:
a camera sensor (1) for acquiring ship image data;
a laser radar sensor (2) for acquiring ship point cloud data by laser imaging;
an inclination sensor (3) for acquiring the inclination of the ship;
the camera calibration module (7) is connected with the camera sensor (1) and used for calibrating the camera to obtain internal reference data of the camera;
the image processing module (5) is connected with the camera calibration module (7) and is used for performing image processing on ship image data calibrated by the camera calibration module (7);
the point cloud data processing module (4) is connected with the laser radar sensor (2) and is used for processing ship point cloud data acquired by the laser radar sensor (2);
the joint calibration module (8) is connected with the point cloud data processing module (4) and the image processing module (5) and is used for performing joint calibration on the processed point cloud data and the processed image data to obtain external parameter data of the camera and the laser radar;
the coordinate transformation module (9) is connected with the joint calibration module (8) and is used for transforming the coordinates of the data after joint calibration;
the data fusion module (10) is connected with the joint calibration module (8) and the coordinate transformation module (9) and is used for carrying out data fusion on the data after joint calibration and the data after coordinate transformation;
an angle compensation module (6) connected with the inclination sensor (3) and used for performing angle compensation on the ship inclination acquired by the inclination sensor (3), and
a height calculation module (11) connected with the data fusion module (10) and the angle compensation module (6), for calculating the ship height based on the fused data and the angle-compensated data.
2. A system for combined measurement of ship height by laser imaging radar and camera according to claim 1, characterized in that the laser radar and the camera are horizontally arranged on the same bracket, the laser radar is arranged right above the camera, and a two-dimensional tilt sensor (3) is arranged on the side of the laser radar to keep the camera and the laser radar horizontal.
3. The system for measuring the ship height by combining the laser imaging radar and the camera according to claim 1, wherein the height calculation module (11) obtains the height of the ship relative to the water surface, namely the clearance height of the ship, according to the distance and point cloud coordinates obtained by the data fusion module (10), together with the mounting height of the laser radar and the water level information.
4. A system for combined measurement of ship height by laser imaging radar and camera according to claim 3, characterized in that the height calculation module (11) finds the ship's highest point P0 from the image processing result and the point cloud data projected onto the image pixels, and searches downward from the ship's highest point for the nearest projected cloud point; when the first projected point P1 is found, the distance between the two points, namely the number N of pixels between them, is calculated; the coordinates of P1 in the camera coordinate system are (a_c, b_c, c_c) and in the laser radar coordinate system (a_l, b_l, c_l); (a_l, b_l, c_l) is obtained directly from the laser radar point cloud data, (a_c, b_c, c_c) is obtained from it through the extrinsic matrix transformation, and both are high-precision known data; in actual measurement, the Z-axis coordinates in the laser radar coordinate system of the points between P0 and P1 are equal or approximately equal to c_l, i.e. the points between P0 and P1 are at the same distance from the plane in which the camera and radar lie; the Y-axis coordinate of the camera coordinate system and the Z-axis coordinate of the radar coordinate system represent the vessel height.
5. The system for jointly measuring the ship height by laser imaging radar and camera according to claim 1, wherein the data fusion module (10) jointly processes the point cloud data collected by the laser radar and the image data of the camera using the obtained camera internal reference data and the laser radar and camera external parameter data; the highest point of the ship is extracted by image-processing techniques, and the point cloud collected by the laser radar is projected onto the image collected by the camera through a projection algorithm and coordinate-system conversion, using the camera internal parameters and the jointly calibrated external parameters; the distance between the highest point of the ship and the nearest projected cloud point below it is then detected, and from this distance the real distance from the highest point to the projected point in the world coordinate system is calculated.
6. The system for measuring the ship height by combining the laser imaging radar and the camera according to claim 1, wherein the camera calibration module (7) adopts a Zhang Zhengyou calibration method, and obtains an internal reference matrix and a distortion coefficient of the camera by shooting checkerboard patterns at different positions and solving camera parameters by using a calibration algorithm.
7. The system for measuring the ship height by combining the laser imaging radar and the camera according to claim 1, wherein the image processing module (5) performs background modeling on the two-dimensional image captured by the camera and extracts the foreground, i.e. the moving ship, with a background modeling algorithm; the image is then binarized, the moving ship target having a gray value of 1 and displayed as white, and the static background area having a gray value of 0 and displayed as black; each pixel of the binarized image is traversed in order from top to bottom, and the first pixel whose value is 1 and whose 8-neighborhood also contains a pixel with value 1 is taken as the pixel of the imaged highest point of the ship, at which point the coordinates of the ship's highest point in the image are found.
8. The system for measuring the ship height by combining the laser imaging radar and the camera according to claim 1, wherein the point cloud data processing module (4) filters noise and removes outliers and ground points of the point cloud data.
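The claim-8 preprocessing can be sketched with plain NumPy as below. The height threshold, the centroid-distance outlier criterion, and all numeric values are illustrative assumptions; a production system might instead use a point cloud library's statistical outlier removal.

```python
import numpy as np

def preprocess_cloud(points, ground_z=0.2, std_ratio=2.0):
    """Sketch of claim-8 preprocessing: drop ground returns below a height
    threshold, then drop outliers whose distance to the centroid exceeds
    mean + std_ratio * std of all such distances."""
    pts = np.asarray(points, dtype=float)
    pts = pts[pts[:, 2] > ground_z]               # ground-point removal
    d = np.linalg.norm(pts - pts.mean(axis=0), axis=1)
    keep = d <= d.mean() + std_ratio * d.std()    # statistical outlier removal
    return pts[keep]

# Synthetic cloud: 10 points clustered near (10, 1, 2), one ground return,
# and one far outlier.
rng = np.random.default_rng(0)
inliers = np.array([10.0, 1.0, 2.0]) + 0.1 * rng.standard_normal((10, 3))
cloud = np.vstack([inliers, [[5.0, 5.0, 0.05]], [[110.0, 1.0, 2.0]]])
clean = preprocess_cloud(cloud)                   # keeps only the 10 inliers
```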
9. The system for measuring the ship height by combining the laser imaging radar and the camera according to claim 1, wherein the combined calibration module (8) uses a checkerboard calibration method to enable the laser radar and the camera to synchronously acquire data, and the conversion relation from a laser radar coordinate system to a camera coordinate system is solved by finding corner data of a checkerboard.
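One common way to solve the lidar-to-camera rigid transform from matched checkerboard corner points, as described in claim 9, is the SVD-based Kabsch alignment; the sketch below recovers a known synthetic R and T, and all numeric values are illustrative.

```python
import numpy as np

def solve_rigid_transform(P_lidar, P_cam):
    """Kabsch/SVD alignment: find R, T with P_cam = R @ P_lidar + T from
    matched 3-D corner points."""
    cl, cc = P_lidar.mean(axis=0), P_cam.mean(axis=0)
    H = (P_lidar - cl).T @ (P_cam - cc)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = cc - R @ cl
    return R, T

# Synthetic check: recover a known rotation about Z plus a translation.
theta = np.radians(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([0.5, -1.0, 2.0])
corners_lidar = np.random.default_rng(1).uniform(-1, 1, (8, 3))
corners_cam = corners_lidar @ R_true.T + T_true
R_est, T_est = solve_rigid_transform(corners_lidar, corners_cam)
```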
10. A method of jointly measuring the height of a vessel by a laser imaging radar and a camera, characterized in that, based on the system of any one of claims 1-9, the method comprises:
calibrating the internal parameters of the camera to obtain its internal reference data;
performing joint calibration on the camera and the laser radar to obtain external parameter data of the laser radar relative to the camera;
synchronously acquiring ships by using a camera and a laser radar to obtain image data and point cloud data;
processing the point cloud data, removing outliers and ground points, projecting the rest points into a new two-dimensional matrix according to an internal reference matrix of the camera and an external reference matrix of the camera and the laser radar, and reserving a one-to-one correspondence between points in the two-dimensional matrix and three-dimensional point cloud data points before projection;
projecting the points of the projected two-dimensional matrix that fall within the image range onto the image according to the size of the image acquired by the camera, each point being projected onto one pixel;
performing background modeling on the two-dimensional image captured by the camera, extracting the ship target in the image, and obtaining the two-dimensional coordinates of the highest point of the ship in the pixel coordinate system;
after the two-dimensional coordinates of the highest point are found on the image, searching along the positive direction of the pixel coordinate system's v-axis for the nearest projected point of the point cloud, recording the distance between the ship's highest point and the projected point, and calculating from it the distance in the world coordinate system, i.e. the real distance;
according to the one-to-one correspondence between the points of the two-dimensional matrix after the point cloud projection and the points in the three-dimensional point cloud before the projection, the three-dimensional coordinates of the projection points are found, and the real height value of the points relative to the mounting height of the laser radar and the camera is obtained by combining the angle compensation information of the inclination sensor;
and obtaining the height of the ship relative to the water surface according to the installation height of the camera and the laser radar, the water surface height, the height of the laser radar projection point and the height difference from the highest point of the ship to the laser radar projection point.
CN202310930371.9A 2023-07-26 2023-07-26 System and method for jointly measuring ship height by laser imaging radar and camera Pending CN117092659A (en)

Publications (1)

Publication Number Publication Date
CN117092659A true CN117092659A (en) 2023-11-21

Family

ID=88782695


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117329971A (en) * 2023-12-01 2024-01-02 海博泰科技(青岛)有限公司 Compartment balance detection method and system based on three-dimensional laser radar



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination