CN114325635A - Target fusion method for laser radar and navigation radar - Google Patents

Target fusion method for laser radar and navigation radar

Info

Publication number
CN114325635A
CN114325635A (application CN202111661713.9A)
Authority
CN
China
Prior art keywords
radar
track
target
navigation
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111661713.9A
Other languages
Chinese (zh)
Inventor
彭树林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Advanced Avionics Co ltd
Original Assignee
Shanghai Advanced Avionics Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Advanced Avionics Co ltd filed Critical Shanghai Advanced Avionics Co ltd
Priority to CN202111661713.9A
Publication of CN114325635A
Legal status: Pending

Landscapes

  • Optical Radar Systems And Details Thereof (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a target fusion method for a laser radar and a navigation radar, comprising the following steps: S1: converting the three-dimensional point cloud data of the laser radar into two-dimensional image data; S2: extracting and tracking laser radar targets and assigning tracks with a radar tracker, based on the two-dimensional image data of the laser radar; S3: a track fusion engine acquires the laser radar target tracks output by the radar tracker and fuses them with the navigation radar target tracks to generate fused targets and their track information. In the method, the point cloud data of the laser radar are converted into three-dimensional targets in geographic space, the three-dimensional targets are converted into two-dimensional targets, and fusion with the two-dimensional targets detected by the navigation radar is realized through the fusion engine, providing data support for accurate detection and correct judgment in complex traffic situations.

Description

Target fusion method for laser radar and navigation radar
Technical Field
The invention relates to the field of data processing, in particular to a target fusion method of a laser radar and a navigation radar.
Background
An unmanned ship must navigate autonomously, which requires indispensable obstacle detection, course planning, collision avoidance operations and the like. Obstacle detection means include laser radar, navigation radar, optical cameras, AIS receivers and so on. Unlike manned navigation and collision avoidance systems, the identification of obstacles and the collision avoidance operations are not ultimately confirmed by a human.
A commercial ship can operate under automatic steering on the open ocean because ship density there is extremely low, the water depth far exceeds the draught of the ships, and there is ample room to adjust the planned course; the probability of a computer making a wrong collision avoidance decision is therefore close to zero and lower than the error probability of manual decision making. For this reason, commercial ships sailing on the open ocean usually choose machine autopilot.
However, when entering or leaving port or in dense navigation channels, ship density is high, the traffic situation changes rapidly, and the range of safe water depth is narrow; under such complex conditions the error probability of machine decisions is far higher than that of manual decisions. Commercial vessels therefore choose manual steering in both of these situations.
Autonomous navigation of an unmanned ship must also cope with complex traffic conditions such as entering and leaving port or dense channels, where obstacle detection must be accurate and the judgment correct: not only where a target is, but also whether the target is real or false. Therefore, observing different attributes of the same target from different aspects with multiple sensors on the unmanned ship, and fusing this attribute information to distinguish real targets from false ones, is one of the basic prerequisites for autonomous sailing of the unmanned ship.
Fusing the targets of the laser radar and the navigation radar therefore plays an important role when the unmanned ship navigates in and out of ports and through dense channels.
Disclosure of Invention
The invention aims to provide a method for fusing laser radar targets and navigation radar targets.
The technical scheme adopted by the invention to solve the above technical problem is a laser radar and navigation radar target fusion method comprising the following steps: S1: converting the three-dimensional point cloud data of the laser radar into two-dimensional image data; S2: extracting and tracking laser radar targets and assigning tracks with a radar tracker, based on the two-dimensional image data of the laser radar; S3: a track fusion engine acquires the laser radar target tracks output by the radar tracker and fuses them with the navigation radar target tracks to generate fused targets and their track information.
Further, the step S1 includes: S11: performing signal enhancement on the three-dimensional point cloud data of the laser radar; S12: acquiring navigation information and navigation attitude information, and transforming the signal-enhanced laser radar point cloud data into geographic space; S13: spatially expanding the three-dimensional point cloud data, so that point cloud fragments split from one continuous target are reconnected into spatially unified targets; S14: performing weighted average filtering on the three-dimensional point cloud data of two successive full-circle scans; S15: performing spatial filtering on the three-dimensional point cloud data obtained after the weighted average filtering of step S14; S16: vertically projecting the three-dimensional point cloud data obtained after the spatial filtering of step S15 onto the horizontal plane at the ship's position, converting it into two-dimensional trace point image data without height.
Further, the radar tracker in step S2 is a built-in radar tracker, and step S2 includes: S21: merging contiguous traces along the trace edges in the two-dimensional trace point image data; S22: merging all traces whose distance from a trace edge is smaller than a set distance, and generating trace point numbers and trace point attribute descriptions, where the trace point attribute description includes the size, position and proportion of the trace point; S23: marking the merged trace points whose size is larger than a set value and whose proportion is larger than a set value with the latest timestamp, and recording them into an internal trace point database or updating the information of trace points already stored; S24: marking trace points that have existed in n-2 scans in the internal trace point database with the latest timestamp after filtering by an alpha-beta filter, where n is an odd number greater than or equal to 3; recording such a trace point as a track in an internal track database or updating the stored track information, and marking the track as 'active' in the track database; S25: according to the timestamp of storage or track update, changing the mark of any 'active' track whose data age exceeds one scan period to the 'extrapolation' state; S26: in each scan, sliding every track marked 'extrapolation' at the velocity recorded at the marked moment, and taking the calculated possible current position as the track position; S27: changing the mark of a track target in the 'extrapolation' state to the 'disappeared' state after a set number of scans is exceeded; S28: changing the mark of a track target in the 'disappeared' state to the 'deleted' state after a set number of scans is exceeded; S29: outputting the track data newly stored or updated in the track database and the tracks whose state changed in the current scan period, and deleting the track targets in the 'deleted' state from the track database.
Further, the radar tracker in step S2 may be an external radar tracker, and step S2 includes: converting the two-dimensional trace point image data of the laser radar generated in step S1 into compatible radar echo video data and outputting it to the radar tracker; the radar tracker then processes the radar echo video data of the laser radar to extract, track and assign tracks to the laser radar targets.
Further, converting the two-dimensional trace point image data of the laser radar into radar echo video data comprises: performing azimuth sampling of the two-dimensional trace point image in a 360-degree scanning mode in a polar coordinate system, and converting and packing the radar echo data sampled at each azimuth into radar echo video data for output; the simulated rotation speed of the 360-degree scan is greater than or equal to 120 RPM, so that the data rate in the 360-degree scanning mode is greater than or equal to 2 Hz.
Furthermore, the number of azimuth codes of the azimuth sampling is greater than or equal to 4096, and the angular resolution of the output radar echo video data is less than or equal to 0.1 degree; the number of echo samples is greater than or equal to the laser radar range divided by 0.1 meter and is a power of two (2^n), and the range resolution of the echo sampling is less than or equal to 0.1 meter.
Further, in step S12, the navigation information and the navigation attitude information are obtained by measuring the position, the motion, and the navigation attitude of the ship through a satellite navigation system, a gyroscope, an accelerometer, a geomagnetic sensor, and a compass.
Further, the step S3 further includes: the track fusion engine receives target information output by the AIS receiver and fuses it with the radar target tracks; the track fusion engine receives satellite positioning navigation information and attitude information, obtains time, position, course, heading and rate-of-turn information, and realizes track spatial registration, time registration and track association.
Compared with the prior art, the invention has the following beneficial effects: in the laser radar and navigation radar target fusion method provided by the invention, the point cloud data of the laser radar are converted into three-dimensional targets in geographic space, the three-dimensional targets are converted into two-dimensional targets, and fusion with the two-dimensional targets detected by the navigation radar is realized through the fusion engine, providing data support for accurate detection and correct judgment in complex traffic situations.
Drawings
FIG. 1 is a flow chart of a laser radar and navigation radar target fusion method in an embodiment of the present invention;
fig. 2 is a flowchart illustrating a process of converting three-dimensional point cloud data of a laser radar into two-dimensional image data according to an embodiment of the present invention.
Detailed Description
The invention is further described below with reference to the figures and examples.
Fig. 1 is a schematic structural diagram of a laser radar and navigation radar target fusion method in the embodiment of the invention.
Referring to fig. 1, a method for fusing a laser radar target and a navigation radar target according to an embodiment of the present invention includes the following steps:
S1: converting the three-dimensional point cloud data of the laser radar into two-dimensional image data;
S2: extracting and tracking laser radar targets and assigning tracks with a radar tracker, based on the two-dimensional image data of the laser radar;
S3: a track fusion engine acquires the laser radar target tracks output by the radar tracker and fuses them with the navigation radar target tracks to generate fused targets and their track information.
Referring to fig. 2, in the laser radar and navigation radar target fusion method according to the embodiment of the present invention, the step S1 of converting the three-dimensional point cloud data of the laser radar into two-dimensional image data includes:
S11: performing signal enhancement on the three-dimensional point cloud data of the laser radar: the reflectivity values of the laser radar are amplified nonlinearly, so that signals with small reflectivity are enhanced and targets with low reflectivity become easier to observe.
S12: acquiring navigation information and navigation attitude information, and transforming the signal-enhanced laser radar point cloud data into geographic space; at the same time, the point cloud data measured while the laser radar is in motion are compensated, so that the observation of a stationary target is stationary in geographic space and its size and position do not change with the motion of the laser radar. The navigation information and navigation attitude information are obtained by measuring the position, motion and attitude of the ship with a satellite navigation system, a gyroscope, an accelerometer, a geomagnetic sensor and a compass.
S13: spatially expanding the three-dimensional point cloud data, so that point cloud fragments split from one continuous target are reconnected into spatially unified targets. A laser radar with few scanning lines, or one rotating at high speed, may render a continuous object as two or more disconnected pieces, so spatial expansion is needed to merge the split point cloud data back into a single spatially unified target.
S14: performing weighted average filtering on the three-dimensional point cloud data of two successive full-circle scans. This suppresses fluctuations in the observations of the same target and merges observations of the same object taken at different pitch angles into a continuous, consistent result. Because the ship platform can never stay absolutely still on the water surface, combining the fluctuation characteristics of two dynamic observations yields a result closer to the true situation.
S15: performing spatial filtering on the three-dimensional point cloud data obtained after the weighted average filtering of step S14: point cloud data whose height in geographic space falls outside the configured upper and lower limits are removed. This eliminates targets such as bridges passing above the hull or overhead high-voltage lines over the water that do not affect the ship's navigation, and reduces interference from high-altitude targets.
S16: vertically projecting the three-dimensional point cloud data obtained after the spatial filtering of step S15 onto the horizontal plane at the ship's position, converting them into two-dimensional trace point image data without height.
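A minimal sketch of steps S11 and S12 is given below. The gamma curve used for the nonlinear reflectivity amplification, the roll/pitch/yaw attitude convention and the local east-north-up position offset are illustrative assumptions, since the patent does not fix these details.

```python
import numpy as np

def enhance_reflectivity(points: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    """S11: points is an (N, 4) array of x, y, z, reflectivity with reflectivity in
    [0, 1]; a gamma exponent below 1 boosts weak returns more than strong ones."""
    out = points.copy()
    out[:, 3] = np.clip(out[:, 3], 0.0, 1.0) ** gamma
    return out

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Body-to-geographic rotation built from roll, pitch and yaw (Z-Y-X order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return rz @ ry @ rx

def to_geographic(points_xyz: np.ndarray, roll: float, pitch: float, yaw: float,
                  position_enu: np.ndarray) -> np.ndarray:
    """S12: rotate sensor-frame points by the ship attitude and translate by the ship
    position, so a stationary target stays fixed in the geographic frame."""
    return points_xyz @ rotation_matrix(roll, pitch, yaw).T + position_enu
```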
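Steps S13 to S16 can likewise be sketched on a voxel grid, as below. The grid resolution, dilation radius, averaging weight and height limits are assumed values for illustration only, not parameters taken from the patent.

```python
import numpy as np
from scipy import ndimage

def rasterise(points_xyz: np.ndarray, origin: np.ndarray, cell: float, shape: tuple) -> np.ndarray:
    """Mark the voxels occupied by one full-circle scan in a fixed geographic grid."""
    grid = np.zeros(shape, dtype=np.float32)
    idx = np.floor((points_xyz - origin) / cell).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)
    grid[tuple(idx[ok].T)] = 1.0
    return grid

def expand(voxel_grid: np.ndarray, dilate_iters: int = 2) -> np.ndarray:
    """S13: morphological dilation so fragments split from one object reconnect."""
    return ndimage.binary_dilation(voxel_grid > 0, iterations=dilate_iters).astype(np.float32)

def fuse_two_scans(prev_grid: np.ndarray, curr_grid: np.ndarray, w_curr: float = 0.6) -> np.ndarray:
    """S14: weighted average of two successive full-circle scans to damp jitter."""
    return w_curr * curr_grid + (1.0 - w_curr) * prev_grid

def project_to_plot_image(grid: np.ndarray, z_origin: float, cell: float,
                          z_min: float, z_max: float) -> np.ndarray:
    """S15 + S16: keep only the height layers inside [z_min, z_max], then collapse the
    vertical axis to obtain a two-dimensional trace point image without height."""
    k0 = max(int((z_min - z_origin) / cell), 0)
    k1 = min(int((z_max - z_origin) / cell) + 1, grid.shape[2])
    return (grid[:, :, k0:k1].max(axis=2) * 255).astype(np.uint8)
```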
Specifically, in step S2, two approaches may be adopted to extract and track the laser radar target: using a built-in radar tracker, or using an existing external radar tracker.
When a built-in radar tracker is used to extract and track the laser radar target, the method comprises the following steps:
S21: merging contiguous traces along the trace edges in the two-dimensional trace point image data;
S22: merging all traces whose distance from a trace edge is smaller than a set distance, and generating trace point numbers and trace point attribute descriptions, where the trace point attribute description includes the size, position and proportion of the trace point;
S23: marking the merged trace points whose size is larger than a set value and whose proportion is larger than a set value with the latest timestamp, and recording them into an internal trace point database or updating the information of trace points already stored;
S24: marking trace points that have existed in n-2 scans in the internal trace point database with the latest timestamp after filtering by an alpha-beta filter, where n is an odd number greater than or equal to 3, usually n is less than or equal to 31, and preferably n is 7; recording such a trace point as a track in an internal track database or updating the stored track information, and marking the track as 'active' in the track database;
S25: according to the timestamp of storage or track update, changing the mark of any 'active' track whose data age exceeds one scan period to the 'extrapolation' state;
S26: in each scan, sliding every track marked 'extrapolation' at the velocity recorded at the marked moment, and taking the calculated possible current position as the track position;
S27: changing the mark of a track target in the 'extrapolation' state in the track database to the 'disappeared' state after a set number of scans is exceeded;
S28: changing the mark of a track target in the 'disappeared' state in the track database to the 'deleted' state after a set number of scans is exceeded;
S29: outputting the track data newly stored or updated in the track database and the tracks whose state changed in the current scan period, and deleting the track targets in the 'deleted' state from the track database.
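The following is a simplified, illustrative sketch of the track life cycle of steps S24 to S29: one alpha-beta filter per track plus the 'active', 'extrapolation', 'disappeared' and 'deleted' states. The filter gains and the scan-count thresholds are assumptions, not values specified by the patent.

```python
from dataclasses import dataclass

@dataclass
class Track:
    x: float               # filtered position (east)
    y: float               # filtered position (north)
    vx: float              # filtered velocity
    vy: float
    state: str = "active"
    age_scans: int = 0     # scans since the last real trace point update

def alpha_beta_update(trk: Track, zx: float, zy: float, dt: float,
                      alpha: float = 0.5, beta: float = 0.3) -> None:
    """One alpha-beta filter step with a new trace point (zx, zy) at interval dt."""
    px, py = trk.x + trk.vx * dt, trk.y + trk.vy * dt           # predict
    rx, ry = zx - px, zy - py                                    # residual
    trk.x, trk.y = px + alpha * rx, py + alpha * ry              # correct position
    trk.vx, trk.vy = trk.vx + beta * rx / dt, trk.vy + beta * ry / dt
    trk.state, trk.age_scans = "active", 0

def age_track(trk: Track, dt: float, extrapolate_after: int = 1,
              disappear_after: int = 3, delete_after: int = 6) -> None:
    """Called once per scan when no trace point is associated with the track."""
    trk.age_scans += 1
    trk.x += trk.vx * dt                                         # slide at last known speed (S26)
    trk.y += trk.vy * dt
    if trk.age_scans > delete_after:
        trk.state = "deleted"                                    # S28
    elif trk.age_scans > disappear_after:
        trk.state = "disappeared"                                # S27
    elif trk.age_scans > extrapolate_after:
        trk.state = "extrapolation"                              # S25
```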
When an existing external radar tracker is used to extract and track the laser radar target, the data such a tracker can recognize are radar echo video data; therefore, before the external radar tracker extracts and tracks the laser radar target, the two-dimensional trace point image data generated in step S1 must be converted into compatible radar echo video data. The external radar tracker then processes the radar echo video data of the laser radar to extract, track and assign tracks to the laser radar targets.
Specifically, converting the two-dimensional trace point image data of the laser radar into radar echo video data comprises: performing azimuth sampling of the two-dimensional trace point image in a 360-degree scanning mode in a polar coordinate system, and converting and packing the radar echo data sampled at each azimuth into radar echo video data for output; the simulated rotation speed of the 360-degree scan is greater than or equal to 120 RPM, so that the data rate in the 360-degree scanning mode is greater than or equal to 2 Hz; the number of azimuth codes of the azimuth sampling is greater than or equal to 4096, and the angular resolution of the output radar echo video data is less than or equal to 0.1 degree; the number of echo samples is greater than or equal to the laser radar range divided by 0.1 meter and is a power of two (2^n), and the range resolution of the echo sampling is less than or equal to 0.1 meter.
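A sketch of this polar resampling is given below. It only produces the per-azimuth echo amplitudes; the packing into a specific radar video format and the network output are omitted and would follow the interface of the tracker actually used. The own-ship-at-image-centre layout and the function names are illustrative assumptions.

```python
import numpy as np

def image_to_echo_sweep(plot_image: np.ndarray, cell: float, max_range: float,
                        n_azimuths: int = 4096, range_bin: float = 0.1) -> np.ndarray:
    """Sample the Cartesian trace point image along n_azimuths spokes and return an
    (n_azimuths, n_bins) array of echo amplitudes; n_bins is rounded up to a power
    of two, and the own ship is assumed to sit at the image centre."""
    n_bins = 1 << int(np.ceil(np.log2(max_range / range_bin)))
    cy, cx = plot_image.shape[0] / 2.0, plot_image.shape[1] / 2.0
    az = np.linspace(0.0, 2.0 * np.pi, n_azimuths, endpoint=False)
    rng = (np.arange(n_bins) + 0.5) * range_bin
    sweep = np.zeros((n_azimuths, n_bins), dtype=plot_image.dtype)
    for i, a in enumerate(az):
        px = cx + rng * np.sin(a) / cell      # east component of the spoke
        py = cy - rng * np.cos(a) / cell      # north component of the spoke
        inside = (px >= 0) & (px < plot_image.shape[1]) & (py >= 0) & (py < plot_image.shape[0])
        sweep[i, inside] = plot_image[py[inside].astype(int), px[inside].astype(int)]
    return sweep
```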
Step S3 further includes: the track fusion engine receives target information output by an AIS (Automatic Identification System) receiver and fuses it with the radar target tracks; the track fusion engine receives satellite positioning navigation information and attitude information, obtains time, position, course, heading and rate-of-turn information, and realizes track spatial registration, time registration and track association.
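A minimal sketch of the time registration and track association step is shown below; the constant-velocity time alignment and the fixed association gate are assumptions for illustration, not the internals of any particular fusion engine.

```python
import numpy as np

def time_align(pos: np.ndarray, vel: np.ndarray, dt: float) -> np.ndarray:
    """Propagate (N, 2) track positions by dt seconds so that both sensors refer to
    the same instant (time registration)."""
    return pos + vel * dt

def associate(lidar_pos: np.ndarray, nav_pos: np.ndarray, gate: float = 30.0):
    """Gated nearest-neighbour association: return (i, j) pairs of laser radar and
    navigation radar tracks closer than `gate` meters."""
    pairs = []
    if len(nav_pos) == 0:
        return pairs
    for i, p in enumerate(lidar_pos):
        d = np.linalg.norm(nav_pos - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= gate:
            pairs.append((i, j))
    return pairs
```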
The fusion of the laser radar target tracks and the ship navigation radar target tracks is usually realized with an existing track fusion engine. Existing radar track fusion engines can fuse several radars, fuse radar with AIS, fuse radar with ADS-B, and so on. The fused target information can indicate, for each fused target, which radar it comes from, whether it is an AIS target, whether it is a target fused from a radar and AIS, or a target seen by several radars at the same time, or by several radars and AIS at the same time.
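As an illustration only (not the data model of any existing fusion engine), a fused target can carry a simple record of which sources contributed to it:

```python
from dataclasses import dataclass, field

@dataclass
class FusedTarget:
    target_id: int
    sources: set = field(default_factory=set)   # e.g. {"lidar", "nav_radar", "AIS"}

    def add_source(self, name: str) -> None:
        self.sources.add(name)

    def describe(self) -> str:
        return f"target {self.target_id}: fused from " + " + ".join(sorted(self.sources))
```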
The observation range of the laser radar is relatively short, while that of the ship navigation radar is relatively long. After target fusion, small targets at short range can be distinguished effectively, the splitting of a single target in the laser radar output is further suppressed, the fused output is closer to the true situation, and the interference caused by one target being split into several radar targets is reduced.
In a specific embodiment, the laser radar and navigation radar target fusion proceeds as follows:
firstly, the point cloud data of the 32-line laser radar are processed through the procedure of step S1, converting the three-dimensional point cloud data of the laser radar into two-dimensional trace point image data;
then, the two-dimensional trace point image data are converted into 360-degree-scanned two-coordinate radar echo data in the SPx standard format and sent over the network to the SPxServer radar tracker 1, which processes the two-dimensional trace point image data of the laser radar to realize trace point extraction, track tracking and track assignment;
finally, the laser radar tracks are sent in SPx format to the SPxFuse track fusion engine; the echo video data of the ship navigation radar are processed by the SPxServer radar tracker 2, whose ship navigation radar target tracks are likewise output to the SPxFuse track fusion engine, and track fusion is carried out by the SPxFuse track fusion engine.
The SPxServer radar tracker 1 and the SPxServer radar tracker 2 use different working parameters; the SPxServer radar tracker 1 is set to a higher range resolution, a higher angular resolution and a smaller measurement error.
Target information output by the AIS receiver can be input into the SPxFuse track fusion engine and fused with the radar target tracks. Satellite positioning navigation data and attitude information can also be input into the SPxFuse track fusion engine, so that the fusion engine processes ship navigation information such as time, position, course, heading and rate-of-turn information and accurately realizes track spatial registration, time registration and track association.
The SPxFuse track fusion engine is configured with at least three sensor inputs: the first sensor input is the laser radar targets output by the SPxServer radar tracker 1, with a smaller measurement error parameter; the second sensor input is the ship navigation radar targets output by the SPxServer radar tracker 2, with a larger measurement error parameter; the third sensor input is the AIS information.
In summary, in the laser radar and navigation radar target fusion method provided by the embodiment of the invention, the point cloud data of the laser radar are converted into three-dimensional targets in geographic space, the three-dimensional targets are converted into two-dimensional targets, and fusion with the two-dimensional targets detected by the navigation radar is realized through the fusion engine, providing data support for accurate detection and correct judgment in complex traffic situations.
Although the present invention has been described with respect to the preferred embodiments, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (8)

1. A target fusion method for a laser radar and a navigation radar, characterized by comprising the following steps:
S1: converting the three-dimensional point cloud data of the laser radar into two-dimensional image data;
S2: extracting and tracking laser radar targets and assigning tracks with a radar tracker, based on the two-dimensional image data of the laser radar;
S3: a track fusion engine acquires the laser radar target tracks output by the radar tracker and fuses them with the navigation radar target tracks to generate fused targets and their track information.
2. The lidar and navigation radar target fusion method of claim 1, wherein said step S1 comprises:
S11: performing signal enhancement on the three-dimensional point cloud data of the laser radar;
S12: acquiring navigation information and navigation attitude information, and transforming the signal-enhanced laser radar point cloud data into geographic space;
S13: spatially expanding the three-dimensional point cloud data, so that point cloud fragments split from one continuous target are reconnected into spatially unified targets;
S14: performing weighted average filtering on the three-dimensional point cloud data of two successive full-circle scans;
S15: performing spatial filtering on the three-dimensional point cloud data obtained after the weighted average filtering of step S14;
S16: vertically projecting the three-dimensional point cloud data obtained after the spatial filtering of step S15 onto the horizontal plane at the ship's position, converting it into two-dimensional trace point image data without height.
3. The lidar and navigation radar target fusion method of claim 2, wherein the radar tracker in the step S2 is a built-in radar tracker, and the step S2 comprises:
S21: merging contiguous traces along the trace edges in the two-dimensional trace point image data;
S22: merging all traces whose distance from a trace edge is smaller than a set distance, and generating trace point numbers and trace point attribute descriptions, wherein the trace point attribute description includes the size, position and proportion of the trace point;
S23: marking the merged trace points whose size is larger than a set value and whose proportion is larger than a set value with the latest timestamp, and recording them into an internal trace point database or updating the information of trace points already stored;
S24: marking trace points that have existed in n-2 scans in the internal trace point database with the latest timestamp after filtering by an alpha-beta filter, wherein n is an odd number greater than or equal to 3; recording such a trace point as a track in an internal track database or updating the stored track information, and marking the track as 'active' in the track database;
S25: according to the timestamp of storage or track update, changing the mark of any 'active' track whose data age exceeds one scan period to the 'extrapolation' state;
S26: in each scan, sliding every track marked 'extrapolation' at the velocity recorded at the marked moment, and taking the calculated possible current position as the track position;
S27: changing the mark of a track target in the 'extrapolation' state in the track database to the 'disappeared' state after a set number of scans is exceeded;
S28: changing the mark of a track target in the 'disappeared' state in the track database to the 'deleted' state after a set number of scans is exceeded;
S29: outputting the track data newly stored or updated in the track database and the tracks whose state changed in the current scan period, and deleting the track targets in the 'deleted' state from the track database.
4. The lidar and navigation radar target fusion method of claim 2, wherein the radar tracker in the step S2 is an external radar tracker, and the step S2 comprises: converting the two-dimensional point trace image data of the laser radar generated in the step S1 into compatible radar echo video data and outputting the compatible radar echo video data to a radar tracker; and the external radar tracker processes radar echo video data of the laser radar to realize the extraction, tracking and track distribution of a laser radar target.
5. The lidar and navigation radar target fusion method of claim 4, wherein converting the two-dimensional trace point image data of the lidar into radar echo video data comprises: performing azimuth sampling of the two-dimensional trace point image in a 360-degree scanning mode in a polar coordinate system, and converting and packing the radar echo data sampled at each azimuth into radar echo video data for output; the simulated rotation speed of the 360-degree scan is greater than or equal to 120 RPM, so that the data rate in the 360-degree scanning mode is greater than or equal to 2 Hz.
6. The lidar and navigation radar target fusion method of claim 5, wherein the number of azimuth codes of the azimuth sampling is greater than or equal to 4096, and the angular resolution of the output radar echo video data is less than or equal to 0.1 degree; the number of echo samples is greater than or equal to the lidar range divided by 0.1 meter and is a power of two (2^n), and the range resolution of the echo sampling is less than or equal to 0.1 meter.
7. The lidar and navigation radar target fusion method of claim 2, wherein the navigation information and the navigation attitude information are obtained by a satellite navigation system, a gyroscope, an accelerometer, a geomagnetic sensor, and a compass measuring a position, a motion, and a navigation attitude of the ship in step S12.
8. The lidar and navigation radar target fusion method of claim 2, wherein said step S3 further comprises: the track fusion engine receives target information output by the AIS receiver and fuses it with the radar target tracks; the track fusion engine receives satellite positioning navigation information and attitude information, obtains time, position, course, heading and rate-of-turn information, and realizes track spatial registration, time registration and track association.
CN202111661713.9A 2021-12-30 2021-12-30 Target fusion method for laser radar and navigation radar Pending CN114325635A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111661713.9A CN114325635A (en) 2021-12-30 2021-12-30 Target fusion method for laser radar and navigation radar


Publications (1)

Publication Number Publication Date
CN114325635A true CN114325635A (en) 2022-04-12

Family

ID=81019772

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111661713.9A Pending CN114325635A (en) 2021-12-30 2021-12-30 Target fusion method for laser radar and navigation radar

Country Status (1)

Country Link
CN (1) CN114325635A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024001629A1 (en) * 2022-07-01 2024-01-04 重庆邮电大学 Multi-sensor fusion method and system for intelligent driving vehicle



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination