CN114494806A - Target identification method, system, device and medium based on multivariate information fusion - Google Patents

Target identification method, system, device and medium based on multivariate information fusion

Info

Publication number
CN114494806A
CN114494806A
Authority
CN
China
Prior art keywords
dimensional image
real
dimensional
matrix
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111549900.8A
Other languages
Chinese (zh)
Inventor
吴丹青
鲁敏
何成
彭冬华
颜依兰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Guotian Electronic Technology Co ltd
Original Assignee
Hunan Guotian Electronic Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Guotian Electronic Technology Co ltd filed Critical Hunan Guotian Electronic Technology Co ltd
Priority to CN202111549900.8A priority Critical patent/CN114494806A/en
Publication of CN114494806A publication Critical patent/CN114494806A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention provides a target identification method, system, device and medium based on multivariate information fusion. The method comprises the following steps: collecting real-time three-dimensional geographic coordinates of a ship and converting them into a real two-dimensional image point matrix; acquiring a real-time two-dimensional image of the ship to form a two-dimensional image point matrix; collecting real-time laser three-dimensional geographic coordinates of the ship and converting them into a laser two-dimensional image point matrix; fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with a fused two-dimensional image point matrix, and calculating its coincidence degree with the real two-dimensional image; and, for a fused image whose coincidence degree reaches the 80% coincidence threshold, using convolutional neural network iterative optimization to determine the target frame required for positioning and identifying the target in the fused image. The invention fuses multi-sensor data to obtain more comprehensive target information, improves the detection precision of ships, expands the detection range, and provides more comprehensive information for safeguarding maritime rights and interests.

Description

Target identification method, system, device and medium based on multivariate information fusion
Technical Field
The invention belongs to the technical field of target identification, and particularly relates to a target identification method, system, device and medium based on multivariate information fusion.
Background
The ship Automatic Identification System (AIS) can provide relevant information on cooperative ship targets in a monitoring area, and the identity of a ship target can be obtained from AIS data. AIS, however, is a passive sensor and cannot detect non-cooperative vessel targets at sea.
Synthetic Aperture Radar (SAR) offers high resolution and strong maneuverability, and can accurately detect the position of a ship target on the ocean. SAR can detect all vessel targets within the monitored area, but cannot provide their identity information. SAR is an active microwave imaging radar that provides high-resolution images, works in all weather and at all times of day, and can therefore monitor targets under any weather conditions. SAR platforms include satellites, aircraft and unmanned aerial vehicles, and with the development of synthetic aperture technology SAR can also be mounted on vehicles. Airborne SAR is subject to few limiting conditions and supports real-time imaging, so it can be used flexibly for ship target monitoring. Airborne SAR has high spatial resolution and strong maneuverability, and the detected ship position information is more accurate, but it is more easily affected by weather conditions.
The three-spectrum camera is a remote monitoring device that can perform effective dynamic monitoring within a range of 3-10 kilometers and track hot-spot early warnings in a timely manner; it offers dual-channel imaging with visible light and far-infrared thermal sensing, day-night complementarity, and uninterrupted 24-hour monitoring.
Although the sensors adopted by these three ship target monitoring devices each have their own strengths, using only one type of target sensor yields a single source of sampling data and therefore inaccurate target tracking and monitoring. Some multi-sensor target monitoring algorithms exist in the prior art, such as the UT-PHD-based multi-sensor sequential fusion tracking method disclosed in 201911041389.3 or the multi-sensor decision-level fusion intelligent ship water-surface target perception and identification method disclosed in 201911125608.6, but their noise is too large and their feasibility is low; the data acquired by the multiple sensors are not effectively fused after coordinate conversion, and the target selection frame used for final target identification is not determined in an optimized manner, so target identification accuracy drops and targets to be tracked and identified are missed.
Disclosure of Invention
The invention addresses the above defects and provides a target identification method, system, device and medium based on multivariate information fusion. The invention can fuse multi-sensor data to obtain more comprehensive target information, can improve the detection precision of ships and expand the detection range, and can also provide more comprehensive information for safeguarding maritime rights and interests.
The invention provides the following technical scheme: the target identification method based on multivariate information fusion comprises the following steps:
S1: collecting real-time three-dimensional geographic coordinates of a ship, and forming a real three-dimensional point cloud from the real-time real three-dimensional geographic coordinate matrix; converting the real three-dimensional point cloud into a real two-dimensional image point matrix;
S2: acquiring a real-time two-dimensional image of the ship to form a two-dimensional image point set, wherein the coordinates of the two-dimensional image points in the set form a two-dimensional image point matrix; simultaneously collecting real-time laser three-dimensional geographic coordinates of the ship, and forming a laser three-dimensional point cloud from the real-time laser three-dimensional geographic coordinate matrix; converting the laser three-dimensional point cloud into a laser two-dimensional image point matrix;
S3: fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with a fused two-dimensional image point matrix, and calculating the coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix obtained in the step S1;
S4: judging whether the coincidence degree is greater than or equal to the coincidence-degree threshold of 80%; if so, proceeding to the next step, otherwise repeating the steps S1-S4;
S5: iteratively optimizing, with a convolutional neural network, the fused two-dimensional image point matrix obtained in the step S3 that satisfies the coincidence-degree threshold; constructing a minimum iteration loss model to obtain the optimized two-dimensional image point matrix with minimum iteration loss; and determining the target frame required for positioning and identifying the target in the image obtained by fusing the laser two-dimensional image point matrix and the two-dimensional image point matrix.
Further, the real-time real three-dimensional geographic coordinate matrix formed by the real-time real three-dimensional geographic coordinates collected in the step S1 is X_real = (x_real, y_real, z_real, 1)^T, where x_real is the true x-axis coordinate of the vessel, y_real the true y-axis coordinate of the vessel, and z_real the true z-axis coordinate of the vessel;
the real three-dimensional point cloud is converted into the real two-dimensional image point matrix Y_real by the following formula:
Y_real = T_real · X_real
where T_real is the transformation matrix of the real three-dimensional point cloud [equation image], d_i is the distance between the i-th converted real two-dimensional image point and the real point, (c_x, c_y) is the centre point of the image composed of the converted real two-dimensional image points, and b_i is the baseline of that image.
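Purely as an illustration of the projection Y_real = T_real · X_real, the Python sketch below applies a 3×4 transformation matrix to homogeneous AIS coordinates; since the explicit form of T_real is only given as an equation image in the original, the pinhole-style matrix used here is an assumption.

```python
import numpy as np

def ais_to_image_points(coords_3d, t_real):
    """Map homogeneous AIS coordinates X_real = (x, y, z, 1)^T to 2-D image points."""
    homog = np.hstack([coords_3d, np.ones((coords_3d.shape[0], 1))])  # (N, 4)
    projected = (t_real @ homog.T).T                                  # (N, 3)
    return projected[:, :2] / projected[:, 2:3]   # normalise by the homogeneous term

# Example with an assumed pinhole-style T_real (focal length 1000 px, centre (640, 360)).
t_real = np.array([[1000.0, 0.0, 640.0, 0.0],
                   [0.0, 1000.0, 360.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0]])
print(ais_to_image_points(np.array([[120.0, 35.0, 800.0]]), t_real))
```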
Further, in the step S2 the laser three-dimensional point cloud formed by the real-time laser three-dimensional geographic coordinates is converted into the laser two-dimensional image point matrix Y_Lidar by the following formula:
Y_Lidar = T_Lidar · R · S_Lidar · X_Lidar
where the real-time laser three-dimensional geographic coordinate matrix is X_Lidar = (x_l, y_l, z_l, 1)^T, with x_l, y_l and z_l the x-axis, y-axis and z-axis coordinates of the vessel measured by the lidar; R is the laser three-dimensional point cloud rotation matrix, an orthogonal matrix with determinant 1; T_Lidar is the laser three-dimensional point cloud conversion matrix [equation image], in which d_j is the distance between the j-th converted laser two-dimensional image point and the real point, (m_x, m_y) is the centre point of the image composed of the converted laser two-dimensional image points, and b_j is the baseline of that image; S_Lidar is the transformation matrix [equation images], and R_Lidar is an orthogonal matrix with determinant 1.
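For the chained conversion Y_Lidar = T_Lidar · R · S_Lidar · X_Lidar, the following sketch assumes T_Lidar is a 3×4 projection matrix and that R and S_Lidar are 4×4 homogeneous matrices assembled from a rotation R_Lidar and translation t_Lidar; these shapes are assumptions, as the explicit matrices appear only as equation images in the original.

```python
import numpy as np

def rigid_transform(r_3x3, t_3):
    """Assemble an assumed 4x4 homogeneous matrix from a rotation R_Lidar and translation t_Lidar."""
    m = np.eye(4)
    m[:3, :3] = r_3x3
    m[:3, 3] = t_3
    return m

def lidar_to_image_points(x_lidar, t_proj, r_rot, s_lidar):
    """Y_Lidar = T_Lidar . R . S_Lidar . X_Lidar for a batch of homogeneous lidar points.

    x_lidar : (N, 4) rows (x_l, y_l, z_l, 1); t_proj : (3, 4); r_rot, s_lidar : (4, 4).
    """
    projected = (t_proj @ r_rot @ s_lidar @ x_lidar.T).T  # (N, 3)
    return projected[:, :2] / projected[:, 2:3]
```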
Further, the method for fusing the laser two-dimensional image point matrix and the two-dimensional image point matrix in the step S3 includes the following steps:
S31: constructing a two-dimensional image point set A obtained by stitching the two-dimensional image points in the laser two-dimensional image point matrix with the two-dimensional image points in the two-dimensional image point matrix;
S32: dividing the two-dimensional image point set A into N×N grids, and determining the position of the w-th two-dimensional image point within the grids [symbol given as an equation image];
S33: constructing a maximum similarity model so that the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam are fused to the maximum extent.
Further, the maximum similarity model constructed in the step S33 is as follows:
[equation images]
where j = 1, 2, 3, 4 and m is Col, Tex, Siz or Shape; the four terms are, respectively, the color parameter of the fused image, the texture parameter of the fused image, the size parameter of the fused image in pixels, and the image shape compatibility parameter in the fused image; p denotes the p-th point of the image composed of the laser two-dimensional image point matrix Y_Lidar within the stitched two-dimensional image point set A divided into N×N grids, and q denotes the q-th point of the image composed of the two-dimensional image point matrix Y_Cam within that set.
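The stitching of the lidar and camera image points into the set A and its division into N×N grids (steps S31-S32 above) could be realised as in the sketch below; the cell-indexing convention is an assumption, since the original only states that the position of the w-th point within the grids is determined.

```python
import numpy as np

def stitch_and_grid(y_lidar_pts, y_cam_pts, n=3):
    """S31-S32 sketch: stitch two 2-D point sets into A and bucket points into an N x N grid."""
    a = np.vstack([y_lidar_pts, y_cam_pts])      # point set A (S31)
    lo, hi = a.min(axis=0), a.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)       # guard against zero extent
    cells = np.clip(np.floor((a - lo) / span * n).astype(int), 0, n - 1)
    return a, cells                              # cells[w] is the grid cell of the w-th point
```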
Further, the fused-image color parameter is calculated as [equation image];
the fused-image texture parameter is calculated as [equation image];
the size parameter of the fused image in pixels is calculated as [equation image];
and the image shape compatibility parameter in the fused image is calculated as [equation image];
in these formulas, the color-descriptor terms are the k-th histogram value of the image composed of the laser two-dimensional image point matrix Y_Lidar in the two-dimensional image point set A and the k-th histogram value of the image composed of the two-dimensional image point matrix Y_Cam in the set A; the texture-descriptor terms are the corresponding k-th texture-histogram values of those two images; and siz(·) denotes the image-size calculation function in units of pixels.
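Since the explicit similarity formulas are given only as equation images, the sketch below shows one plausible histogram-based reading of the four terms (histogram intersection for the color and texture parameters, and size and shape terms computed relative to the stitched set A); it is an assumption, not the patented formulas.

```python
import numpy as np

def histogram_intersection(h_lidar, h_cam):
    """Assumed color/texture term: sum of element-wise minima of two normalised histograms."""
    return float(np.minimum(h_lidar, h_cam).sum())

def size_term(siz_p, siz_q, siz_a):
    """Assumed pixel-size term: favour merging regions that are still small relative to A."""
    return 1.0 - (siz_p + siz_q) / siz_a

def shape_term(bbox_p, bbox_q, siz_p, siz_q, siz_a):
    """Assumed shape-compatibility term: how tightly p and q fill their joint bounding box."""
    x0, y0 = min(bbox_p[0], bbox_q[0]), min(bbox_p[1], bbox_q[1])
    x1, y1 = max(bbox_p[2], bbox_q[2]), max(bbox_p[3], bbox_q[3])
    return 1.0 - ((x1 - x0) * (y1 - y0) - siz_p - siz_q) / siz_a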
Further, the minimum iteration loss model constructed in the step S5 is as follows:
[equation images]
where σ is the weight coefficient of the smoothness-function loss, s is the iteration index, smooth(·) is the smoothness function of the image formed by the two-dimensional image points in the two-dimensional image point set, and the remaining term is the Euclidean distance between the adjacent w-th and (w-1)-th two-dimensional image points.
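The explicit loss is likewise an equation image; one plausible shape of such an objective, assumed here, is a data term plus a smoothness penalty weighted by σ, with smooth(·) measured through Euclidean distances between adjacent image points.

```python
import numpy as np

def smoothness(points):
    """Assumed smooth(.): mean Euclidean distance between adjacent two-dimensional image points."""
    diffs = np.diff(np.asarray(points, dtype=float), axis=0)
    return float(np.linalg.norm(diffs, axis=1).mean())

def iteration_loss(data_term, points, sigma=0.1):
    """Hypothetical per-iteration loss: data term plus sigma-weighted smoothness penalty."""
    return data_term + sigma * smoothness(points)
```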
The invention also provides a target identification system based on multivariate information fusion that adopts the above method, comprising an AIS ship automatic identification system, a laser radar system, a three-spectrum camera and a main control computing module;
the AIS ship automatic identification system is used for acquiring real-time three-dimensional geographic coordinates of the ship;
the three-spectrum camera is used for acquiring a real-time two-dimensional image of the ship;
the laser radar system is used for acquiring the real-time laser three-dimensional geographic coordinate matrix of the ship;
the main control computing module is used for receiving the real-time three-dimensional geographic coordinates of the ship to form a real three-dimensional point cloud and converting it into a real two-dimensional image point matrix; receiving the real-time two-dimensional image of the ship to form a two-dimensional image point set and a two-dimensional image point matrix; receiving the real-time laser three-dimensional geographic coordinate matrix of the ship to form a laser three-dimensional point cloud and converting it into a laser two-dimensional image point matrix; fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with a fused two-dimensional image point matrix; and iteratively optimizing, with a convolutional neural network, the fused two-dimensional image point matrix that satisfies the coincidence-degree threshold, thereby determining the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix and the two-dimensional image point matrix.
The invention also provides a computer device, which comprises a memory, a processor and a computer program stored on the memory and capable of running on the processor, and is characterized in that the processor implements the steps of the target identification method based on multivariate information fusion as described above when executing the computer program.
The present invention also provides a computer storage medium having stored thereon a computer program which, when being executed by a processor, carries out the steps of the multivariate information fusion based object recognition method as described above.
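Purely for orientation, the following Python sketch shows how a system of this kind might drive the S1-S5 loop of the method; every callable passed in is an assumed placeholder, not part of the disclosure, and the 80% threshold is the coincidence-degree threshold of step S4.

```python
def identify_target(collect_ais, collect_camera, collect_lidar,
                    project_real, project_lidar, fuse, coincidence,
                    optimise_with_cnn, threshold=0.80, max_rounds=100):
    """Hypothetical driver for steps S1-S5; all callables are assumptions."""
    for _ in range(max_rounds):
        y_real = project_real(collect_ais())        # S1: AIS 3-D -> real 2-D points
        y_cam = collect_camera()                    # S2: camera 2-D image points
        y_lidar = project_lidar(collect_lidar())    # S2: lidar 3-D -> 2-D points
        fused = fuse(y_lidar, y_cam)                # S3: fuse lidar and camera points
        if coincidence(fused, y_real) >= threshold: # S4: coincidence-degree check
            return optimise_with_cnn(fused)         # S5: CNN iteration -> target frame
    return None                                     # threshold never reached
```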
The invention has the beneficial effects that:
1. The invention fuses the three-dimensional data collected by the ship automatic identification system AIS, the three-dimensional data collected by the laser radar system, and the two-dimensional data collected by the three-spectrum camera to obtain more comprehensive target information. Before fusion, the three-dimensional data collected by the laser radar system are converted into two-dimensional image coordinates to form a two-dimensional image that is fused with the real-time two-dimensional image collected by the three-spectrum camera, and a maximum similarity model is constructed during fusion so that the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam are fused to the maximum extent. This effectively improves the detection precision of ships, expands the detection range, and provides more comprehensive information for safeguarding maritime rights and interests.
2. Before the fused image with the fused two-dimensional image point matrix Y_fuse is iteratively optimized, the invention calculates the coincidence degree between the fused image and the real two-dimensional image obtained by converting the real three-dimensional point cloud formed from the real-time three-dimensional geographic coordinates collected by the ship automatic identification system AIS. This coincidence calculation determines whether the fusion result meets the requirement for the optimization iteration before the iteration is carried out, which ensures the accuracy and feasibility of the optimization iteration and the reliability of the identified target.
3. The invention performs point-trace association fusion between the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam to obtain the fused image with the fused two-dimensional image point matrix Y_fuse; track generation and track association are then used to generate ship target tracks from ship target point traces over continuous time, so that more comprehensive information on cooperative ship targets at sea can be obtained, and non-cooperative ship targets can be obtained as well.
4. In the optimization iteration process, the invention constructs the minimum iteration loss model and takes the smoothness function of the image points within each grid of the fused image as a constraint, so that the iteration loss over the fused image grid is minimized and the optimal iteration result is obtained without losing the smoothness and fidelity of the fused image; the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam is then determined, and this target frame is the identified target. From another aspect, this effectively improves the accuracy with which multi-sensor fusion determines the identified target.
Drawings
The invention will be described in more detail hereinafter on the basis of embodiments and with reference to the accompanying drawings. Wherein:
FIG. 1 is a schematic flow chart of a target identification method based on multivariate information fusion provided by the present invention;
fig. 2 is a schematic diagram of a target identification structure based on multivariate information fusion provided by the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
As shown in fig. 1, the target identification method based on multivariate information fusion provided by this embodiment includes the following steps:
S1: collecting real-time three-dimensional geographic coordinates of a ship, and forming a real three-dimensional point cloud from the real-time real three-dimensional geographic coordinate matrix; converting the real three-dimensional point cloud into a real two-dimensional image point matrix;
S2: acquiring a real-time two-dimensional image of the ship to form a two-dimensional image point set, wherein the coordinates of the two-dimensional image points in the set form a two-dimensional image point matrix; simultaneously acquiring real-time laser three-dimensional geographic coordinates of the ship, and forming a laser three-dimensional point cloud from the real-time laser three-dimensional geographic coordinate matrix; converting the laser three-dimensional point cloud into a laser two-dimensional image point matrix;
S3: fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with a fused two-dimensional image point matrix, and calculating the coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix obtained in the step S1;
S4: judging whether the coincidence degree is greater than or equal to the coincidence-degree threshold of 80%; if so, proceeding to the next step, otherwise repeating the steps S1-S4;
S5: iteratively optimizing, with a convolutional neural network, the fused two-dimensional image point matrix obtained in the step S3 that satisfies the coincidence-degree threshold; constructing a minimum iteration loss model to obtain the optimized two-dimensional image point matrix with minimum iteration loss; and determining the target frame required for positioning and identifying the target in the image obtained by fusing the laser two-dimensional image point matrix and the two-dimensional image point matrix, this target frame being the identified target.
Example 2
As shown in fig. 1, the target identification method based on multivariate information fusion provided by this embodiment includes the following steps:
S1: the ship automatic identification system AIS collects the real-time three-dimensional geographic coordinates of the ship; the collected real-time real three-dimensional geographic coordinates form the real-time real three-dimensional geographic coordinate matrix X_real = (x_real, y_real, z_real, 1)^T, where x_real is the true x-axis coordinate of the vessel, y_real the true y-axis coordinate of the vessel, and z_real the true z-axis coordinate of the vessel; the real-time real three-dimensional geographic coordinate matrix X_real forms the real three-dimensional point cloud, which is converted into the real two-dimensional image point matrix Y_real by the following formula:
Y_real = T_real · X_real
where T_real is the transformation matrix of the real three-dimensional point cloud [equation image], d_i is the distance between the i-th converted real two-dimensional image point and the real point, (c_x, c_y) is the centre point of the image composed of the converted real two-dimensional image points, and b_i is the baseline of that image.
S2: the three-spectrum camera collects real-time two-dimensional images of the ship to form a two-dimensional image point set, and the coordinates of the two-dimensional image points in the set form the two-dimensional image point matrix Y_Cam; meanwhile, the laser radar system collects the real-time laser three-dimensional geographic coordinates of the ship, and the real-time laser three-dimensional geographic coordinate matrix X_Lidar forms the laser three-dimensional point cloud; the laser three-dimensional point cloud formed by the real-time laser three-dimensional geographic coordinates X_Lidar is converted into the laser two-dimensional image point matrix Y_Lidar by the following formula:
Y_Lidar = T_Lidar · R · S_Lidar · X_Lidar
where the real-time laser three-dimensional geographic coordinate matrix is X_Lidar = (x_l, y_l, z_l, 1)^T, with x_l, y_l and z_l the x-axis, y-axis and z-axis coordinates of the vessel measured by the lidar; R is the laser three-dimensional point cloud rotation matrix, an orthogonal matrix with determinant 1; T_Lidar is the laser three-dimensional point cloud conversion matrix [equation image], in which d_j is the distance between the j-th converted laser two-dimensional image point and the real point, (m_x, m_y) is the centre point of the image composed of the converted laser two-dimensional image points, and b_j is the baseline of that image; S_Lidar is the transformation matrix [equation images]; R_Lidar is an orthogonal matrix with determinant 1 and may take the form [equation image]; t_Lidar is the translation vector matrix generated in the conversion process, whose components are the translations produced when the coordinates of the 3-dimensional lidar system are transformed into the 2-dimensional planar coordinate system.
S3: the laser two-dimensional image point matrix Y_Lidar is fused with the two-dimensional image point matrix Y_Cam to obtain the fused image with the fused two-dimensional image point matrix Y_fuse, and the coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix Y_real obtained in the step S1 is calculated;
S4: judging whether the coincidence degree is greater than or equal to the coincidence-degree threshold of 80%; if so, proceeding to the next step, otherwise repeating the steps S1-S4;
S5: the fused two-dimensional image point matrix Y_fuse obtained in the step S3 that satisfies the coincidence-degree threshold is iteratively optimized with a convolutional neural network, the minimum iteration loss model is constructed, the optimized two-dimensional image point matrix with minimum iteration loss is obtained, and the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam is determined; this target frame is the identified target.
Example 3
As shown in fig. 1, the target identification method based on multivariate information fusion provided by this embodiment includes the following steps:
S1: the ship automatic identification system AIS collects the real-time three-dimensional geographic coordinates of the ship; the collected real-time real three-dimensional geographic coordinates form the real-time real three-dimensional geographic coordinate matrix X_real = (x_real, y_real, z_real, 1)^T, where x_real is the true x-axis coordinate of the vessel, y_real the true y-axis coordinate of the vessel, and z_real the true z-axis coordinate of the vessel; the real-time real three-dimensional geographic coordinate matrix X_real forms the real three-dimensional point cloud, which is converted into the real two-dimensional image point matrix Y_real by the following formula:
Y_real = T_real · X_real
where T_real is the transformation matrix of the real three-dimensional point cloud [equation image], d_i is the distance between the i-th converted real two-dimensional image point and the real point, (c_x, c_y) is the centre point of the image composed of the converted real two-dimensional image points, and b_i is the baseline of that image.
S2: the three-spectrum camera collects real-time two-dimensional images of the ship to form a two-dimensional image point set, and the coordinates of the two-dimensional image points in the set form the two-dimensional image point matrix Y_Cam; meanwhile, the laser radar system collects the real-time laser three-dimensional geographic coordinates of the ship, and the real-time laser three-dimensional geographic coordinate matrix X_Lidar forms the laser three-dimensional point cloud; the laser three-dimensional point cloud formed by the real-time laser three-dimensional geographic coordinates X_Lidar is converted into the laser two-dimensional image point matrix Y_Lidar by the following formula:
Y_Lidar = T_Lidar · R · S_Lidar · X_Lidar
where the real-time laser three-dimensional geographic coordinate matrix is X_Lidar = (x_l, y_l, z_l, 1)^T, with x_l, y_l and z_l the x-axis, y-axis and z-axis coordinates of the vessel measured by the lidar; R is the laser three-dimensional point cloud rotation matrix, an orthogonal matrix with determinant 1; T_Lidar is the laser three-dimensional point cloud conversion matrix [equation image], in which d_j is the distance between the j-th converted laser two-dimensional image point and the real point, (m_x, m_y) is the centre point of the image composed of the converted laser two-dimensional image points, and b_j is the baseline of that image; S_Lidar is the transformation matrix [equation images]; R_Lidar is an orthogonal matrix with determinant 1 and may take the form [equation image]; t_Lidar is the translation vector matrix generated in the conversion process, whose components are the translations produced when the coordinates of the 3-dimensional lidar system are transformed into the 2-dimensional planar coordinate system.
S3: the laser two-dimensional image point matrix Y_Lidar is fused with the two-dimensional image point matrix Y_Cam to obtain the fused image with the fused two-dimensional image point matrix Y_fuse; this specifically comprises the following steps:
S31: constructing the two-dimensional image point set A obtained by stitching the two-dimensional image points in the laser two-dimensional image point matrix Y_Lidar with the two-dimensional image points in the two-dimensional image point matrix Y_Cam; the two-dimensional image point matrix is Y_Cam = (u, v, 1), where u is the x-axis coordinate and v the y-axis coordinate of the vessel in the two-dimensional image measured by the three-spectrum camera;
S32: dividing the two-dimensional image point set A into N×N grids and determining the position of the w-th two-dimensional image point within the grids [symbol given as an equation image]; preferably, N×N is 3×3 or 4×4;
S33: constructing the maximum similarity model so that the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam are fused to the maximum extent:
[equation images]
where j = 1, 2, 3, 4 and m is Col, Tex, Siz or Shape; the four terms are, respectively, the color parameter of the fused image, the texture parameter of the fused image, the size parameter of the fused image in pixels, and the image shape compatibility parameter in the fused image; p denotes the p-th point of the image composed of the laser two-dimensional image point matrix Y_Lidar within the stitched two-dimensional image point set A divided into N×N grids, and q denotes the q-th point of the image composed of the two-dimensional image point matrix Y_Cam within that set.
The fused-image color parameter is calculated as [equation image];
the fused-image texture parameter is calculated as [equation image];
the size parameter of the fused image in pixels is calculated as [equation image];
the image shape compatibility parameter in the fused image is calculated as [equation image];
in these formulas, the color-descriptor terms are the k-th histogram value of the image composed of the laser two-dimensional image point matrix Y_Lidar in the two-dimensional image point set A and the k-th histogram value of the image composed of the two-dimensional image point matrix Y_Cam in the set A; the texture-descriptor terms are the corresponding k-th texture-histogram values of those two images; siz(·) denotes the image-size calculation function in units of pixels, applied to the image of the p-th point composed of Y_Lidar and to the image of the q-th point composed of Y_Cam within the stitched set A divided into N×N grids, and siz(A) is the image size of the stitched two-dimensional image point set A.
The coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix Y_real obtained in the step S1 is then calculated: with the image size of the fused image denoted S and the image size of the two-dimensional image formed by Y_real denoted H, the coincidence degree is computed as [equation image].
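The explicit coincidence formula is an equation image; one plausible reading, assumed here, is an overlap ratio between the fused image S and the real image H, for instance intersection over union.

```python
def coincidence_degree(area_fused, area_real, area_overlap):
    """Assumed coincidence degree: overlap of fused image S and real image H over their union."""
    union = area_fused + area_real - area_overlap
    return area_overlap / union if union > 0 else 0.0

# Example: S = 9000 px^2, H = 10000 px^2, overlap 8500 px^2 -> ~0.81, above the 80% threshold.
print(coincidence_degree(9000.0, 10000.0, 8500.0))
```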
S4: judging whether the coincidence degree is greater than or equal to the coincidence-degree threshold of 80%; if so, proceeding to the next step, otherwise repeating the steps S1-S4;
S5: the fused two-dimensional image point matrix Y_fuse obtained in the step S3 that satisfies the coincidence-degree threshold is iteratively optimized with a convolutional neural network, the minimum iteration loss model is constructed, the optimized two-dimensional image point matrix with minimum iteration loss is obtained, and the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam is determined; this target frame is the identified target.
Example 4
The target identification method based on multivariate information fusion provided by this embodiment comprises the following steps:
S1: the ship automatic identification system AIS collects the real-time three-dimensional geographic coordinates of the ship; the collected real-time real three-dimensional geographic coordinates form the real-time real three-dimensional geographic coordinate matrix X_real = (x_real, y_real, z_real, 1)^T, where x_real is the true x-axis coordinate of the vessel, y_real the true y-axis coordinate of the vessel, and z_real the true z-axis coordinate of the vessel; the real-time real three-dimensional geographic coordinate matrix X_real forms the real three-dimensional point cloud, which is converted into the real two-dimensional image point matrix Y_real by the following formula:
Y_real = T_real · X_real
where T_real is the transformation matrix of the real three-dimensional point cloud [equation image], d_i is the distance between the i-th converted real two-dimensional image point and the real point, (c_x, c_y) is the centre point of the image composed of the converted real two-dimensional image points, and b_i is the baseline of that image.
S2: the three-spectrum camera collects real-time two-dimensional images of the ship to form a two-dimensional image point set, and the coordinates of the two-dimensional image points in the set form the two-dimensional image point matrix Y_Cam; meanwhile, the laser radar system collects the real-time laser three-dimensional geographic coordinates of the ship, and the real-time laser three-dimensional geographic coordinate matrix X_Lidar forms the laser three-dimensional point cloud; the laser three-dimensional point cloud formed by the real-time laser three-dimensional geographic coordinates X_Lidar is converted into the laser two-dimensional image point matrix Y_Lidar by the following formula:
Y_Lidar = T_Lidar · R · S_Lidar · X_Lidar
where the real-time laser three-dimensional geographic coordinate matrix is X_Lidar = (x_l, y_l, z_l, 1)^T, with x_l, y_l and z_l the x-axis, y-axis and z-axis coordinates of the vessel measured by the lidar; R is the laser three-dimensional point cloud rotation matrix, an orthogonal matrix with determinant 1; T_Lidar is the laser three-dimensional point cloud conversion matrix [equation image], in which d_j is the distance between the j-th converted laser two-dimensional image point and the real point, (m_x, m_y) is the centre point of the image composed of the converted laser two-dimensional image points, and b_j is the baseline of that image; S_Lidar is the transformation matrix [equation images]; R_Lidar is an orthogonal matrix with determinant 1 and may take the form [equation image]; t_Lidar is the translation vector matrix generated in the conversion process, whose components are the translations produced when the coordinates of the 3-dimensional lidar system are transformed into the 2-dimensional planar coordinate system.
S3: the laser two-dimensional image point matrix Y_Lidar is fused with the two-dimensional image point matrix Y_Cam to obtain the fused image with the fused two-dimensional image point matrix Y_fuse; this specifically comprises the following steps:
S31: constructing the two-dimensional image point set A obtained by stitching the two-dimensional image points in the laser two-dimensional image point matrix Y_Lidar with the two-dimensional image points in the two-dimensional image point matrix Y_Cam; the two-dimensional image point matrix is Y_Cam = (u, v, 1), where u is the x-axis coordinate and v the y-axis coordinate of the vessel in the two-dimensional image measured by the three-spectrum camera;
S32: dividing the two-dimensional image point set A into N×N grids and determining the position of the w-th two-dimensional image point within the grids [symbol given as an equation image]; preferably, N×N is 3×3 or 4×4;
S33: constructing the maximum similarity model so that the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam are fused to the maximum extent:
[equation images]
where j = 1, 2, 3, 4 and m is Col, Tex, Siz or Shape; the four terms are, respectively, the color parameter of the fused image, the texture parameter of the fused image, the size parameter of the fused image in pixels, and the image shape compatibility parameter in the fused image; p denotes the p-th point of the image composed of the laser two-dimensional image point matrix Y_Lidar within the stitched two-dimensional image point set A divided into N×N grids, and q denotes the q-th point of the image composed of the two-dimensional image point matrix Y_Cam within that set.
The fused-image color parameter is calculated as [equation image];
the fused-image texture parameter is calculated as [equation image];
the size parameter of the fused image in pixels is calculated as [equation image];
the image shape compatibility parameter in the fused image is calculated as [equation image];
in these formulas, the color-descriptor terms are the k-th histogram value of the image composed of the laser two-dimensional image point matrix Y_Lidar in the two-dimensional image point set A and the k-th histogram value of the image composed of the two-dimensional image point matrix Y_Cam in the set A; the texture-descriptor terms are the corresponding k-th texture-histogram values of those two images; siz(·) denotes the image-size calculation function in units of pixels, applied to the image of the p-th point composed of Y_Lidar and to the image of the q-th point composed of Y_Cam within the stitched set A divided into N×N grids, and siz(A) is the image size of the stitched two-dimensional image point set A.
The coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix Y_real obtained in the step S1 is then calculated: with the image size of the fused image denoted S and the image size of the two-dimensional image formed by Y_real denoted H, the coincidence degree is computed as [equation image].
S4: judging whether the coincidence degree is greater than or equal to the coincidence-degree threshold of 80%; if so, proceeding to the next step, otherwise repeating the steps S1-S4;
S5: the fused two-dimensional image point matrix Y_fuse obtained in the step S3 that satisfies the coincidence-degree threshold is iteratively optimized with a convolutional neural network, the minimum iteration loss model is constructed, the optimized two-dimensional image point matrix with minimum iteration loss is obtained, and the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam is determined; this target frame is the identified target.
The minimum iteration loss model is constructed as follows:
[equation images]
where σ is the weight coefficient of the smoothness-function loss, s is the iteration index, smooth(·) is the smoothness function of the image formed by the two-dimensional image points in the two-dimensional image point set, and the remaining term is the Euclidean distance between the adjacent w-th and (w-1)-th two-dimensional image points.
Example 5
As shown in fig. 2, the target identification system based on multivariate information fusion provided by this embodiment, which adopts the method provided in any one of embodiments 1 to 4, comprises an AIS ship automatic identification system, a laser radar system, a three-spectrum camera and a main control computing module;
the AIS ship automatic identification system is used for acquiring real-time three-dimensional geographic coordinates of the ship;
the three-spectrum camera is used for acquiring a real-time two-dimensional image of the ship;
the laser radar system is used for acquiring the real-time laser three-dimensional geographic coordinate matrix of the ship;
the main control computing module is used for receiving the real-time three-dimensional geographic coordinates of the ship to form a real three-dimensional point cloud and converting it into the real two-dimensional image point matrix Y_real; receiving the real-time two-dimensional image of the ship to form a two-dimensional image point set with the two-dimensional image point matrix Y_Cam; receiving the real-time laser three-dimensional geographic coordinate matrix X_Lidar of the ship to form a laser three-dimensional point cloud and converting it into the laser two-dimensional image point matrix Y_Lidar; fusing the laser two-dimensional image point matrix Y_Lidar with the two-dimensional image point matrix Y_Cam to obtain the fused image with the fused two-dimensional image point matrix Y_fuse; iteratively optimizing, with a convolutional neural network, the fused two-dimensional image point matrix Y_fuse that satisfies the coincidence-degree threshold; and determining the target frame required for positioning and identifying the target in the image fused from the laser two-dimensional image point matrix Y_Lidar and the two-dimensional image point matrix Y_Cam, this target frame being the identified target.
Example 6
The present invention also provides a computer device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the target identification method based on multivariate information fusion according to any one of claims 1-7.
example 7
The present invention is also embodied in a computer storage medium having a computer program stored thereon, where the computer program is executed by a processor to implement the steps of the multivariate information fusion based object recognition method provided in any of embodiments 1-4.
In an exemplary embodiment, the electronic device may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as a memory, including computer program instructions executable by a processor of an electronic device to perform the above-described method is also provided.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state setting data, or source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C + + or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, the electronic circuitry that can execute the computer-readable program instructions implements aspects of the present disclosure by utilizing the state information of the computer-readable program instructions to personalize the electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA).
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may be embodied in hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a software product, such as a Software Development Kit (SDK) or the like.
While specific embodiments of the disclosure have been described above, the above description is illustrative rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or technical improvements over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A target identification method based on multivariate information fusion, characterized by comprising the following steps:
S1: collecting real-time real three-dimensional geographic coordinates of a ship, and forming a real three-dimensional point cloud from the real-time real three-dimensional geographic coordinate matrix; converting the real three-dimensional point cloud into a real two-dimensional image point matrix;
S2: acquiring a real-time two-dimensional image of the ship to form a two-dimensional image point set, wherein the coordinates of the two-dimensional image points in the set form a two-dimensional image point matrix; simultaneously collecting real-time laser three-dimensional geographic coordinates of the ship, and forming a laser three-dimensional point cloud from the real-time laser three-dimensional geographic coordinate matrix; converting the laser three-dimensional point cloud into a laser two-dimensional image point matrix;
S3: fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with a fused two-dimensional image point matrix, and calculating the coincidence degree between the obtained fused image and the two-dimensional image formed by the real two-dimensional image point matrix obtained in step S1;
S4: judging whether the coincidence degree is greater than or equal to the coincidence degree threshold of 80%; if so, proceeding to the next step, otherwise repeating steps S1-S4;
S5: iteratively optimizing, by a convolutional neural network, the fused two-dimensional image point matrix obtained in step S3 that meets the coincidence degree threshold, constructing a minimum iteration loss model, obtaining the optimized two-dimensional image point matrix with the minimum iteration loss, and determining the target frame required for positioning and identifying the target in the image obtained by fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix.
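The following sketch is only an illustrative, non-limiting reading of the coincidence-degree check of steps S3-S4: it rasterizes the fused and real two-dimensional image points onto a common pixel grid and measures their overlap ratio, which is then compared against the 80% threshold. The function name, the grid size and the overlap-ratio definition are assumptions and are not fixed by the patent.

```python
# Illustrative sketch, not part of the claims: a possible coincidence-degree
# measure between the fused image points and the real image points (steps S3-S4).
import numpy as np

def coincidence_degree(fused_pts, real_pts, shape=(480, 640)):
    """Overlap ratio between two (N, 2) arrays of (x, y) pixel coordinates."""
    def rasterize(pts):
        pts = np.asarray(pts)
        mask = np.zeros(shape, dtype=bool)
        cols = pts[:, 0].astype(int)
        rows = pts[:, 1].astype(int)
        keep = (rows >= 0) & (rows < shape[0]) & (cols >= 0) & (cols < shape[1])
        mask[rows[keep], cols[keep]] = True
        return mask

    fused_mask, real_mask = rasterize(fused_pts), rasterize(real_pts)
    union = np.logical_or(fused_mask, real_mask).sum()
    inter = np.logical_and(fused_mask, real_mask).sum()
    return float(inter / union) if union else 0.0

# Step S4: proceed to step S5 only if coincidence_degree(...) >= 0.80
```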
2. The method for identifying an object based on multivariate information fusion as defined in claim 1, wherein the real-time real three-dimensional geographic coordinates collected in step S1 form a real-time real three-dimensional geographic coordinate matrix X_real = (x_real, y_real, z_real, 1)^T, wherein x_real is the real x-axis coordinate of the ship, y_real is the real y-axis coordinate of the ship, and z_real is the real z-axis coordinate of the ship;
the real three-dimensional point cloud is converted into the real two-dimensional image point matrix Y_real by the following formula:
Y_real = T_real X_real
wherein T_real is the real three-dimensional point cloud conversion matrix:
[matrix expression of T_real given as a figure in the original filing]
and d_i is the distance between the i-th converted real two-dimensional image point and the real point, (c_x, c_y) is the centre point of the image composed of the converted real two-dimensional image points, and b_i is the baseline of the image composed of the converted real two-dimensional image points.
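As an illustrative sketch only, the projection Y_real = T_real X_real of claim 2 can be realized as a matrix product on homogeneous coordinates. The pinhole-style layout of T_real below (focal length f, principal point (c_x, c_y)) is an assumption, since the patent gives the matrix only as a figure in the filing.

```python
# Illustrative sketch, not part of the claims: projecting homogeneous real
# coordinates X_real = (x, y, z, 1)^T to 2D image points with a 3x4 matrix.
import numpy as np

def project_real_points(points_xyz, f=1000.0, cx=320.0, cy=240.0):
    """points_xyz: (N, 3) real geographic coordinates; assumes z > 0."""
    T_real = np.array([[f,   0.0, cx,  0.0],
                       [0.0, f,   cy,  0.0],
                       [0.0, 0.0, 1.0, 0.0]])          # assumed pinhole-style T_real
    X_real = np.hstack([points_xyz, np.ones((len(points_xyz), 1))]).T  # 4 x N
    Y = T_real @ X_real                                 # 3 x N homogeneous image points
    return (Y[:2] / Y[2]).T                             # perspective divide -> (N, 2)
```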
3. The method for identifying an object based on multivariate information fusion as claimed in claim 1, wherein in step S2 the laser three-dimensional point cloud formed by the real-time laser three-dimensional geographic coordinates is converted into the laser two-dimensional image point matrix Y_Lidar by the following formula:
Y_Lidar = T_Lidar R S_Lidar X_Lidar
wherein the real-time laser three-dimensional geographic coordinate matrix is X_Lidar = (x_l, y_l, z_l, 1)^T, x_l is the ship x-axis coordinate measured by the laser radar, y_l is the ship y-axis coordinate measured by the laser radar, and z_l is the ship z-axis coordinate measured by the laser radar; R is the laser three-dimensional point cloud rotation matrix, an orthogonal matrix with determinant 1; T_Lidar is the laser three-dimensional point cloud conversion matrix:
[matrix expression of T_Lidar given as a figure in the original filing]
d_j is the distance between the j-th converted laser two-dimensional image point and the real point, (m_x, m_y) is the centre point of the image composed of the converted laser two-dimensional image points, and b_j is the baseline of the image composed of the converted laser two-dimensional image points; S_Lidar is the transformation matrix:
[matrix expressions of S_Lidar and R_Lidar given as figures in the original filing]
wherein R_Lidar is an orthogonal matrix with determinant 1.
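Again as an illustrative sketch, the chain Y_Lidar = T_Lidar R S_Lidar X_Lidar of claim 3 can be written as a sequence of matrix products, with R an orthogonal rotation of determinant 1. The concrete T_Lidar, S_Lidar and rotation axis below are assumptions; the patent's own matrices appear only as figures in the filing.

```python
# Illustrative sketch, not part of the claims: projecting lidar-measured
# homogeneous coordinates through rotation, transformation and conversion matrices.
import numpy as np

def project_lidar_points(points_xyz, yaw=0.1, scale=1.0, f=1000.0, mx=320.0, my=240.0):
    """points_xyz: (N, 3) lidar geographic coordinates; assumes z > 0 after rotation."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0, 0],
                  [s,  c, 0, 0],
                  [0,  0, 1, 0],
                  [0,  0, 0, 1]], dtype=float)          # rotation about z, det(R) = 1
    S = np.diag([scale, scale, scale, 1.0])             # assumed stand-in for S_Lidar
    T = np.array([[f, 0, mx, 0],
                  [0, f, my, 0],
                  [0, 0, 1,  0]], dtype=float)          # assumed stand-in for T_Lidar
    X = np.hstack([points_xyz, np.ones((len(points_xyz), 1))]).T  # 4 x N homogeneous
    Y = T @ R @ S @ X                                    # 3 x N image points
    return (Y[:2] / Y[2]).T                              # (N, 2) laser image points
```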
4. The method for identifying an object based on multivariate information fusion as defined in claim 1, wherein the method for fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix in step S3 comprises the following steps:
S31: constructing a two-dimensional image point set A obtained by splicing the two-dimensional image points in the laser two-dimensional image point matrix with the two-dimensional image points in the two-dimensional image point matrix;
S32: dividing the two-dimensional image point set A into N×N grids, and determining the position of the w-th two-dimensional image point in the grids [position expression given as a figure in the original filing];
S33: constructing a maximum similarity model, and fusing the laser two-dimensional image point matrix and the two-dimensional image point matrix to the maximum extent.
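An illustrative sketch of step S32 follows: the spliced point set A is normalized to the unit square and each point is mapped to one of the N×N grid cells. The exact position expression used by the patent is given only as a figure, so the mapping below is an assumption.

```python
# Illustrative sketch, not part of the claims: locating the w-th point of the
# spliced two-dimensional image point set A inside an N x N grid (step S32).
import numpy as np

def grid_cell(A, w, N=8):
    """A: (M, 2) spliced two-dimensional image points; returns (row, col) of point w."""
    A = np.asarray(A, dtype=float)
    mins, maxs = A.min(axis=0), A.max(axis=0)
    norm = (A[w] - mins) / np.maximum(maxs - mins, 1e-9)   # scale point into [0, 1]^2
    col, row = np.minimum((norm * N).astype(int), N - 1)    # clamp to the last cell
    return int(row), int(col)
```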
5. The method for identifying an object based on multivariate information fusion as defined in claim 4, wherein the maximum similarity model constructed in step S33 is as follows:
[maximum similarity model expressions given as figures in the original filing]
wherein j = 1, 2, 3, 4 and m is Col, Tex, Siz or Shape, the four terms being, respectively, the fused image color parameter, the fused image texture parameter, the fused image size parameter in pixels, and the image shape compatibility parameter in the fused image;
[symbol given as a figure in the original filing] represents the p-th point, in the spliced two-dimensional image point set A divided into N×N grids, belonging to the image composed of the laser two-dimensional image point matrix Y_Lidar, and [symbol given as a figure in the original filing] represents the q-th point, in the spliced two-dimensional image point set A divided into N×N grids, belonging to the image composed of the two-dimensional image point matrix Y_Cam.
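One plausible, non-authoritative reading of the maximum similarity model of claim 5 is a weighted sum of the color, texture, size and shape terms, with the most similar lidar/camera region pair fused first. The weights and the selection rule below are assumptions, since the patent's own expressions appear only as figures in the filing.

```python
# Illustrative sketch, not part of the claims: combining the four similarity
# terms of claim 5 and picking the most similar lidar/camera point-region pair.
import numpy as np

def overall_similarity(terms, weights=(1.0, 1.0, 1.0, 1.0)):
    """terms: (s_Col, s_Tex, s_Siz, s_Shape) for one lidar/camera region pair."""
    return float(np.dot(weights, terms))

def most_similar_pair(pair_terms):
    """pair_terms: dict mapping (p, q) -> similarity terms; returns the best pair."""
    return max(pair_terms, key=lambda pq: overall_similarity(pair_terms[pq]))
```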
6. The method for identifying an object based on multivariate information fusion as defined in claim 5, wherein the fused image color parameter is calculated by the following formula:
[color parameter formula given as a figure in the original filing]
the fused image texture parameter is calculated by the following formula:
[texture parameter formula given as a figure in the original filing]
the fused image size parameter in pixels is calculated by the following formula:
[size parameter formula given as a figure in the original filing]
and the image shape compatibility parameter in the fused image is calculated by the following formula:
[shape compatibility formula given as a figure in the original filing]
wherein [symbol given as a figure in the original filing] represents the k-th histogram value, in the color descriptor, of the image composed of the points in the two-dimensional image point set A belonging to the laser two-dimensional image point matrix Y_Lidar, and [symbol given as a figure in the original filing] represents the k-th histogram value, in the color descriptor, of the image composed of the points in A belonging to the two-dimensional image point matrix Y_Cam; [symbol given as a figure in the original filing] represents the k-th histogram value, in the texture descriptor, of the image composed of the points in A belonging to Y_Lidar, and [symbol given as a figure in the original filing] represents the k-th histogram value, in the texture descriptor, of the image composed of the points in A belonging to Y_Cam; siz(·) represents an image size calculation function in units of pixels.
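The histogram-based parameters of claim 6 resemble the region similarities used in selective-search-style merging; a sketch under that assumption is given below, with histogram intersection for color and texture and pixel counts for size and shape compatibility. The patent's exact formulas are given only as figures in the filing, so every expression here is an assumption.

```python
# Illustrative sketch, not part of the claims: candidate realizations of the four
# parameters of claims 5-6, assuming normalized histograms and pixel-count sizes.
import numpy as np

def color_similarity(hist_lidar, hist_cam):
    """Sum over k of the minimum of the two normalized color histograms."""
    return float(np.minimum(hist_lidar, hist_cam).sum())

def texture_similarity(tex_lidar, tex_cam):
    """Same intersection measure applied to texture-descriptor histograms."""
    return float(np.minimum(tex_lidar, tex_cam).sum())

def size_similarity(siz_lidar, siz_cam, siz_image):
    """Favors merging small regions first; siz(.) counts pixels."""
    return 1.0 - (siz_lidar + siz_cam) / siz_image

def shape_compatibility(siz_lidar, siz_cam, siz_bbox, siz_image):
    """How well the two regions fill their joint bounding box."""
    return 1.0 - (siz_bbox - siz_lidar - siz_cam) / siz_image
```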
7. The method for identifying an object based on multivariate information fusion as defined in claim 1, wherein the minimum iteration loss model constructed in step S5 is as follows:
[minimum iteration loss model expressions given as figures in the original filing]
wherein σ is the weight coefficient of the smoothness function loss, s is the iteration number, smooth(·) is the smoothness function of the image formed by the two-dimensional image points in the two-dimensional image point set, and [symbol given as a figure in the original filing] is the Euclidean distance between the adjacent w-th and (w-1)-th two-dimensional image points.
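A hedged sketch of the minimum iteration loss of claim 7: the loss combines the Euclidean distances between consecutive optimized image points with a smoothness term weighted by σ. The concrete smoothness function below (a second-difference penalty) is an assumption; the patent's loss appears only as a figure in the filing.

```python
# Illustrative sketch, not part of the claims: one possible form of the minimum
# iteration loss evaluated on the optimized two-dimensional image point matrix.
import numpy as np

def iteration_loss(points, sigma=0.5):
    """points: (M, 2) optimized two-dimensional image points for one iteration s."""
    diffs = np.diff(points, axis=0)                    # p_w - p_{w-1}
    euclidean = np.linalg.norm(diffs, axis=1).sum()    # sum of adjacent-point distances
    smooth = np.square(np.diff(diffs, axis=0)).sum()   # assumed second-difference smoothness
    return euclidean + sigma * smooth

# A convolutional network would be optimized to produce the point matrix that
# minimizes this loss over iterations s = 1, 2, ... (step S5).
```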
8. A target identification system based on multivariate information fusion, adopting the method of any one of claims 1-7, characterized by comprising an AIS ship automatic identification system, a laser radar system, a three-spectrum camera and a main control computing module;
the AIS ship automatic identification system is used for acquiring real-time three-dimensional geographic coordinates of ships;
the three-spectrum camera is used for acquiring a real-time two-dimensional image of the ship;
the laser radar system is used for acquiring a real-time laser three-dimensional geographic coordinate matrix of the ship;
the main control computing module is used for receiving the real-time real three-dimensional geographic coordinates of the ship to form a real three-dimensional point cloud and converting the real three-dimensional point cloud into a real two-dimensional image point matrix; receiving the real-time two-dimensional image of the ship to form a two-dimensional image point set and a two-dimensional image point matrix; receiving the real-time laser three-dimensional geographic coordinate matrix of the ship to form a laser three-dimensional point cloud and converting the laser three-dimensional point cloud into a laser two-dimensional image point matrix; fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix to obtain a fused image with the fused two-dimensional image point matrix; iteratively optimizing, by a convolutional neural network, the fused two-dimensional image point matrix meeting the coincidence degree threshold; and determining the target frame required for positioning and identifying the target in the image obtained by fusing the laser two-dimensional image point matrix with the two-dimensional image point matrix.
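Purely as an illustrative sketch of the system composition of claim 8, the main control computing module can be modeled as an object holding the three sensor interfaces recited in the claim; all names below are hypothetical and do not reflect any actual implementation in the filing.

```python
# Illustrative sketch, not part of the claims: the claim 8 system as a composition
# of sensor interfaces feeding a main control computing module.
from dataclasses import dataclass
from typing import Protocol
import numpy as np

class Sensor(Protocol):
    def read(self) -> np.ndarray: ...

@dataclass
class MainControlModule:
    ais: Sensor                   # real-time real 3D geographic coordinates
    three_spectrum_camera: Sensor # real-time 2D image
    laser_radar: Sensor           # real-time laser 3D geographic coordinates

    def acquire(self):
        """Collect one synchronized set of measurements from the three sources."""
        return self.ais.read(), self.three_spectrum_camera.read(), self.laser_radar.read()
```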
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the target identification method based on multivariate information fusion as defined in any one of claims 1-7.
10. A computer storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the target identification method based on multivariate information fusion as defined in any one of claims 1-7.
CN202111549900.8A 2021-12-17 2021-12-17 Target identification method, system, device and medium based on multivariate information fusion Pending CN114494806A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111549900.8A CN114494806A (en) 2021-12-17 2021-12-17 Target identification method, system, device and medium based on multivariate information fusion

Publications (1)

Publication Number Publication Date
CN114494806A true CN114494806A (en) 2022-05-13

Family

ID=81494649

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111549900.8A Pending CN114494806A (en) 2021-12-17 2021-12-17 Target identification method, system, device and medium based on multivariate information fusion

Country Status (1)

Country Link
CN (1) CN114494806A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9215382B1 (en) * 2013-07-25 2015-12-15 The United States Of America As Represented By The Secretary Of The Navy Apparatus and method for data fusion and visualization of video and LADAR data
CN107609601A (en) * 2017-09-28 2018-01-19 北京计算机技术及应用研究所 A kind of ship seakeeping method based on multilayer convolutional neural networks
CN109444911A (en) * 2018-10-18 2019-03-08 哈尔滨工程大学 A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion
CN110532865A (en) * 2019-07-19 2019-12-03 南京航空航天大学 Spacecraft structure recognition methods based on visible light and laser fusion
CN111694010A (en) * 2020-05-27 2020-09-22 东南大学 Roadside vehicle identification method based on fusion of vision and laser radar
WO2020233443A1 (en) * 2019-05-21 2020-11-26 菜鸟智能物流控股有限公司 Method and device for performing calibration between lidar and camera
CN112767391A (en) * 2021-02-25 2021-05-07 国网福建省电力有限公司 Power grid line part defect positioning method fusing three-dimensional point cloud and two-dimensional image
CN113111887A (en) * 2021-04-26 2021-07-13 河海大学常州校区 Semantic segmentation method and system based on information fusion of camera and laser radar
CN113156421A (en) * 2021-04-07 2021-07-23 南京邮电大学 Obstacle detection method based on information fusion of millimeter wave radar and camera

Similar Documents

Publication Publication Date Title
CN110415342B (en) Three-dimensional point cloud reconstruction device and method based on multi-fusion sensor
Muhovič et al. Obstacle tracking for unmanned surface vessels using 3-D point cloud
Yang et al. Concrete defects inspection and 3D mapping using CityFlyer quadrotor robot
JP7088288B2 (en) Image processing device, image processing method, and image processing program
US20220051425A1 (en) Scale-aware monocular localization and mapping
CN113743385A (en) Unmanned ship water surface target detection method and device and unmanned ship
CN111812978B (en) Cooperative SLAM method and system for multiple unmanned aerial vehicles
DeBortoli et al. Elevatenet: A convolutional neural network for estimating the missing dimension in 2d underwater sonar images
CN115424233A (en) Target detection method and target detection device based on information fusion
Wang et al. 3D-LIDAR based branch estimation and intersection location for autonomous vehicles
CN117132649A (en) Ship video positioning method and device for artificial intelligent Beidou satellite navigation fusion
CN116844124A (en) Three-dimensional object detection frame labeling method, three-dimensional object detection frame labeling device, electronic equipment and storage medium
Li et al. Driver drowsiness behavior detection and analysis using vision-based multimodal features for driving safety
CN116642490A (en) Visual positioning navigation method based on hybrid map, robot and storage medium
CN114494806A (en) Target identification method, system, device and medium based on multivariate information fusion
Sulaj et al. Examples of real-time UAV data processing with cloud computing
CN114485607A (en) Method for determining motion track, operation equipment, device and storage medium
CN113792645A (en) AI eyeball fusing image and laser radar
Bhaganagar et al. A novel machine-learning framework with a moving platform for maritime drift calculations
Ji et al. A machine visual-based ship monitoring system for offshore wind farms
Liu et al. A vision based system for underwater docking
Cafaro et al. Towards Enhanced Support for Ship Sailing
CN117649619B (en) Unmanned aerial vehicle visual navigation positioning recovery method, system, device and readable storage medium
CN114390270B (en) Real-time intelligent site panorama exploration method and device and electronic equipment
Zhang et al. Towards Dense and Accurate Radar Perception Via Efficient Cross-Modal Diffusion Model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination