CN116592871B - Unmanned ship multi-source target information fusion method - Google Patents

Unmanned ship multi-source target information fusion method Download PDF

Info

Publication number
CN116592871B
CN116592871B (application CN202310476123.1A)
Authority
CN
China
Prior art keywords
obstacle
information
target information
obstacle target
equipment
Prior art date
Legal status
Active
Application number
CN202310476123.1A
Other languages
Chinese (zh)
Other versions
CN116592871A (en)
Inventor
祁明浩
梁立
孙之光
潘雁行
付悦文
Current Assignee
Lianyungang Jereh Science Park Management Co ltd
Original Assignee
Lianyungang Jereh Science Park Management Co ltd
Priority date
Filing date
Publication date
Application filed by Lianyungang Jereh Science Park Management Co ltd filed Critical Lianyungang Jereh Science Park Management Co ltd
Priority to CN202310476123.1A
Publication of CN116592871A
Application granted
Publication of CN116592871B
Legal status: Active

Classifications

    • G01C21/005 — Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01S13/86 — Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/937 — Radar or analogous systems specially adapted for anti-collision purposes of marine craft
    • G01S19/45 — Determining position by combining measurements of signals from a satellite radio beacon positioning system with a supplementary measurement
    • G01S7/41 — Radar systems using analysis of the echo signal for target characterisation; target signature; target cross-section
    • Y02A90/30 — Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Ocean & Marine Engineering (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a method for fusing multi-source target information of an unmanned ship. Based on a navigation radar, photoelectric equipment, AIS equipment and a satellite map, the method acquires target position and related information, unifies the information description format, and defines a data transmission interface to realize information data interaction, so that the multiple sources complement one another and the stability of the target information is improved. By performing multi-source information fusion processing, more accurate and reliable target information is obtained.

Description

Unmanned ship multi-source target information fusion method
Technical Field
The invention belongs to the technical field of unmanned ships, and particularly relates to a method for fusing multi-source target information of an unmanned ship.
Background
The offshore environment is complex. For autonomous navigation of unmanned ships, various sensors are generally installed on the hull to acquire target information about offshore obstacles. However, the obstacle target information detected by multiple sensors is complex and prone to false alarms; to ensure the credibility of the target information, the multi-source target information must be fused, and the various inputs synthesized into a real-time navigation situation. Patent CN113687349A discloses a method and a device for tracking sea-surface targets of an unmanned ship based on multi-sensor fusion, wherein the method comprises: acquiring original point cloud data collected by a lidar; acquiring original millimeter-wave data collected by a millimeter-wave radar; fusing the original point cloud data and the original millimeter-wave data to obtain fused target information; and tracking sea-surface targets based on the fused target information. Patent CN113484864B discloses a collaborative environment sensing method for a navigation radar and a photoelectric pod on an unmanned ship, which comprises the following steps: the unmanned ship is equipped with a fixed navigation radar and a photoelectric pod, and their parameters are initialized; the navigation radar scans the environment area to obtain a navigation radar image, which is filtered and subjected to target screening and target characteristic analysis to form a radar-perceived target distribution map; the photoelectric pod scans the environment area to obtain a photoelectric pod image, on which real-time target detection is performed using an improved SSD target detection algorithm.
Existing unmanned ship information fusion methods are generally based on only two information sources and do not incorporate a satellite map into the fusion, so they lack comprehensiveness and reliability.
Therefore, a method for fusing the multi-source target information of an unmanned ship needs to be studied to guarantee the accuracy of obstacle target information and thereby improve the reliability of autonomous navigation of the unmanned ship.
Disclosure of Invention
Aiming at the problem of low target-information accuracy in the prior art, the invention provides an unmanned ship multi-source target information fusion method that offers high accuracy and high reliability and can meet the information fusion requirements of autonomous navigation of the unmanned ship.
The technical solution for realizing the purpose of the invention is as follows: in one aspect, a method for fusing multisource target information of an unmanned ship is provided, and the method comprises the following steps:
Step 1, acquiring navigation radar echo video information, photoelectric equipment image information, satellite map information and obstacle target information acquired by AIS equipment;
Step 2, performing obstacle target recognition on the echo video information, and extracting the azimuth, distance, navigational speed and heading information of the target;
Step 3, identifying obstacle targets from the image information, and tracking and ranging the identified targets to obtain their azimuth and distance information;
Step 4, extracting fixed obstacle target information based on the satellite map information;
Step 5, fusing the obstacle target information from steps 1 to 3 to obtain fused targets, and then combining the fixed obstacle target information extracted in step 4 to form a comprehensive environment situation around the unmanned ship, so as to realize obstacle avoidance of the unmanned ship.
Further, in step 5, the obstacle target information from steps 1 to 3 is fused as follows, with the obstacle targets detected by the navigation radar as the basis. For dynamic obstacle targets: the obstacle target information acquired by the navigation radar is fused and confirmed with that acquired by the AIS equipment; if it cannot be confirmed, the navigation radar information is then fused with the obstacle target information acquired by the photoelectric equipment. For static targets: the obstacle target information acquired by the navigation radar is fused with that acquired by the photoelectric equipment.
Further, the fusing of the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the AIS equipment comprises the following specific processes:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the AIS equipment are within the set thresholds, the two are judged to be the same obstacle target. The obstacle target information acquired by the AIS equipment is taken as the standard, so the fused obstacle target information is that acquired by the AIS equipment.
Further, the fusing of the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the photoelectric device comprises the following specific processes:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the photoelectric equipment are within the set thresholds, the two are judged to be the same obstacle target. If only the navigation radar target is stably output, the fused obstacle target information is that acquired by the navigation radar; if only the photoelectric equipment target is stably output, the fused obstacle target information is that acquired by the photoelectric equipment; if both are stably output, the photoelectric equipment is taken as the standard, and the fused obstacle target information is that acquired by the photoelectric equipment.
Further, the navigation radar, the photoelectric equipment and the AIS equipment are all installed on the center line of the unmanned ship, and the center positions of the other equipment are unified by taking the navigation radar as the reference center.
Further, the information description formats of the navigation radar, the photoelectric device and the AIS device are unified, and all adopt standard communication protocols.
Further, the navigation radar, the photoelectric equipment and the AIS equipment all transmit data over Ethernet.
In another aspect, an unmanned ship multi-source target information fusion system is provided, the system comprising: a navigation radar, photoelectric equipment, AIS equipment, a satellite map and an information fusion processing unit;
the information fusion processing unit is used for identifying obstacle targets of the echo information of the navigation radar and extracting the azimuth, distance, navigational speed and heading information of the targets;
The information fusion processing unit is used for identifying obstacle targets of the image information of the photoelectric equipment, tracking and ranging the identified targets to obtain target azimuth and distance information;
The information fusion processing unit is used for acquiring azimuth, distance, navigational speed and course information of surrounding ship targets through AIS equipment;
The information fusion processing unit is used for acquiring fixed obstacle target information through a satellite map;
The information fusion processing unit is also used for fusing obstacle target information acquired by the navigation radar, the photoelectric equipment and the AIS equipment to obtain a fusion target, and then combining the extracted fixed obstacle target to form a comprehensive environment situation around the unmanned ship so as to realize obstacle avoidance of the unmanned ship.
Compared with the prior art, the invention has the following remarkable advantages: 1) target position and related information are acquired from the navigation radar, photoelectric equipment, AIS equipment and satellite map, and a unified information description format and a data transmission interface are defined to realize information data interaction, so that the multi-source information is mutually complementary and the stability of the target information is improved; 2) the center positions of all the devices are unified, which improves the accuracy of the target information; 3) during navigation, the information fusion processing unit comprehensively fuses the obstacle target information acquired by the multiple sensors and can send the fused target information to the navigation control system to form a real-time navigation situation, realizing autonomous obstacle avoidance and safe navigation of the unmanned ship.
The invention is described in further detail below with reference to the accompanying drawings.
Drawings
Fig. 1 is a flow chart of the unmanned ship multi-source target information fusion.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application.
In one embodiment, a method for fusing multi-source target information of an unmanned ship is provided, wherein a navigation radar, photoelectric equipment and AIS equipment are configured on the unmanned ship, and the method specifically comprises the following steps:
Step 1, acquiring navigation radar echo video information, photoelectric equipment image information, satellite map information and obstacle target information acquired by AIS equipment;
Step 2, performing obstacle target recognition on the echo video information, and extracting the azimuth, distance, navigational speed and heading information of the target;
Step 3, identifying obstacle targets from the image information, and tracking and ranging the identified targets to obtain their azimuth and distance information;
Step 4, extracting fixed obstacle target information based on the satellite map information;
Step 5, fusing the obstacle target information from steps 1 to 3 to obtain fused targets, and then combining the fixed obstacle target information extracted in step 4 (if the sensors do not detect an obstacle extracted from the satellite map, the satellite map prevails) to form a comprehensive environment situation around the unmanned ship, which is sent to the unmanned ship navigation control system to realize obstacle avoidance of the unmanned ship.
Here, by fusing the multi-source target information detected by the multiple sensors, more accurate target information can be acquired. Multi-source information fusion makes full use of the unmanned ship's multi-source situation awareness capability: it collects the information reported by the navigation radar, photoelectric equipment and AIS equipment on board, detects, identifies, positions and tracks target objects and generates situations, thereby forming a comprehensive environment situation around the unmanned ship. The data from the multiple sensors are integrated through operations such as conversion, de-duplication and filtering, performing multi-level, multi-space information complementation and optimal combination; the complementarity between different devices improves the accuracy and reliability of the constructed environment situation.
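The overall pipeline of steps 1 to 5 can be sketched as a small driver. This is a minimal illustration, not the patent's implementation: the `Target` record, the association gate `same_obstacle`, and the threshold values are all assumptions, since the patent leaves the data structures and "set threshold" values unspecified.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Target:
    azimuth_deg: float                 # azimuth of the obstacle target
    distance_m: float                  # distance to the obstacle target
    speed_kn: Optional[float] = None   # navigational speed (radar/AIS only)
    heading_deg: Optional[float] = None
    source: str = "radar"              # "radar" | "eo" | "ais" | "map"

# Assumed association gates; the patent only says "within the set threshold".
AZ_GATE_DEG, DIST_GATE_M = 2.0, 50.0

def same_obstacle(a: Target, b: Target) -> bool:
    return (abs(a.azimuth_deg - b.azimuth_deg) <= AZ_GATE_DEG
            and abs(a.distance_m - b.distance_m) <= DIST_GATE_M)

def build_situation(fused_sensor_targets: List[Target],
                    map_fixed: List[Target]) -> List[Target]:
    """Step 5: combine the sensor-level fusion result (steps 1-3) with the
    fixed obstacles extracted from the satellite map (step 4). A map obstacle
    that no sensor confirmed is still kept, with the map taken as prevailing."""
    situation = list(fused_sensor_targets)
    for obstacle in map_fixed:
        if not any(same_obstacle(obstacle, t) for t in situation):
            situation.append(obstacle)
    return situation
```

A map obstacle that gates with an already-fused sensor target is treated as the same object and not duplicated; only map-only fixed obstacles are appended.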
Further, in one embodiment, the navigation radar, the photoelectric equipment and the AIS equipment are all installed on the center line of the unmanned ship, and the center positions of the other equipment are unified by taking the navigation radar as the reference center.
Here, by unifying the hull coordinate system, that is, unifying the center position, the accuracy of the target information can be improved.
Further, in one embodiment, in terms of information transmission, the information description formats of the navigation radar, the photoelectric equipment and the AIS equipment are unified and all adopt standard communication protocols; a data transmission interface is defined, and data are transmitted in Ethernet form.
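A unified description format carried over Ethernet could look like the fixed-layout report below. This is purely a sketch: the field order, field sizes, source-id mapping and the 0xA5A5 sync word are assumptions, as the patent only requires a unified format and an Ethernet transport.

```python
import struct

# Hypothetical fixed-layout target report: sync word, source id, then
# azimuth, distance, navigational speed, heading as 32-bit floats.
REPORT_FMT = "<HBffff"
SOURCE_IDS = {"radar": 1, "eo": 2, "ais": 3}

def pack_report(source, azimuth_deg, distance_m, speed_kn=0.0, heading_deg=0.0):
    """Serialize one target report into the assumed unified wire format."""
    return struct.pack(REPORT_FMT, 0xA5A5, SOURCE_IDS[source],
                       azimuth_deg, distance_m, speed_kn, heading_deg)

def unpack_report(payload):
    """Parse a report back into a dict, checking the sync word."""
    sync, sid, az, dist, spd, hdg = struct.unpack(REPORT_FMT, payload)
    assert sync == 0xA5A5, "bad sync word"
    name = {v: k for k, v in SOURCE_IDS.items()}[sid]
    return {"source": name, "azimuth_deg": az, "distance_m": dist,
            "speed_kn": spd, "heading_deg": hdg}
```

Because every device emits the same record layout, the information fusion processing unit can parse any sensor's report with a single decoder, which is the practical benefit of unifying the description format.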
Further, in one embodiment, in step 5, the obstacle target information from steps 1 to 3 is fused as follows, with the obstacle targets detected by the navigation radar as the basis. For dynamic obstacle targets (ships): the obstacle target information acquired by the navigation radar is fused and confirmed with that acquired by the AIS equipment; if it cannot be confirmed, the navigation radar information is then fused with the obstacle target information acquired by the photoelectric equipment. For static targets: the obstacle target information acquired by the navigation radar is fused with that acquired by the photoelectric equipment.
Here, the obstacle profile information detected from different angles by the two sensors is fused, so that richer profile information is obtained and the obstacle recognition and detection capability is improved.
The specific process for fusing the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the AIS equipment comprises the following steps:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the AIS equipment are within the set thresholds, the two are judged to be the same obstacle target. The obstacle target information acquired by the AIS equipment is taken as the standard, so the fused obstacle target information is that acquired by the AIS equipment.
The specific process for fusing the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the photoelectric equipment comprises the following steps:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the photoelectric equipment are within the set thresholds, the two are judged to be the same obstacle target. If only the navigation radar target is stably output, the fused obstacle target information is that acquired by the navigation radar; if only the photoelectric equipment target is stably output, the fused obstacle target information is that acquired by the photoelectric equipment; if both are stably output, the photoelectric equipment is taken as the standard, and the fused obstacle target information is that acquired by the photoelectric equipment.
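The stability-based selection between the radar and photoelectric (EO) tracks can be written down directly. A minimal sketch, assuming a boolean `stable` flag stands in for "stably output" and assuming gate values the patent does not specify:

```python
from dataclasses import dataclass

@dataclass
class Track:
    azimuth_deg: float
    distance_m: float
    source: str        # "radar" or "eo"
    stable: bool       # True when the device outputs this target stably

AZ_GATE_DEG, DIST_GATE_M = 2.0, 50.0  # assumed association thresholds

def fuse_radar_eo(radar: Track, eo: Track):
    """If the two tracks gate as the same obstacle, pick the stably output
    one; when both are stable, the EO device is taken as the standard."""
    same = (abs(radar.azimuth_deg - eo.azimuth_deg) <= AZ_GATE_DEG
            and abs(radar.distance_m - eo.distance_m) <= DIST_GATE_M)
    if not same:
        return None    # different obstacles: no fusion
    if eo.stable:
        return eo      # EO stable (alone or alongside radar): EO prevails
    if radar.stable:
        return radar   # only the radar track is stable
    return radar       # neither stable: keep the radar detection (assumption)
```

Checking `eo.stable` first encodes all three cases of the rule in order: both stable yields EO, EO-only yields EO, radar-only yields radar.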
In one embodiment, an unmanned ship multi-source target information fusion system is provided, the system comprising: a navigation radar, photoelectric equipment, AIS equipment, a satellite map and an information fusion processing unit;
the information fusion processing unit is used for identifying obstacle targets of the echo information of the navigation radar and extracting the azimuth, distance, navigational speed and heading information of the targets;
The information fusion processing unit is used for identifying obstacle targets of the image information of the photoelectric equipment, tracking and ranging the identified targets to obtain target azimuth and distance information;
The information fusion processing unit is used for acquiring azimuth, distance, navigational speed and course information of surrounding ship targets through AIS equipment;
The information fusion processing unit is used for acquiring fixed obstacle target information through a satellite map;
The information fusion processing unit is also used for fusing obstacle target information acquired by the navigation radar, the photoelectric equipment and the AIS equipment to obtain a fusion target, and then combining the extracted fixed obstacle target to form a comprehensive environment situation around the unmanned ship so as to realize obstacle avoidance of the unmanned ship.
The navigation radar, the photoelectric equipment and the AIS equipment are all installed on the center line of the unmanned ship. Taking the navigation radar as the center, the distances between the photoelectric equipment and the navigation radar in the horizontal and height directions are measured, and the information fusion processing unit corrects the center of the photoelectric equipment to the center of the navigation radar using these distance values, so that the center positions are unified.
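The center-position correction can be sketched as a plain coordinate shift: an EO measurement (azimuth and distance about the EO mount) is re-referenced to the radar's center using the measured offsets. The function name, axis conventions (x forward, y to starboard, azimuth clockwise from the bow) and flat-plane simplification are assumptions for illustration.

```python
import math

def correct_to_radar_center(azimuth_deg, distance_m, dx_m, dy_m, dz_m=0.0):
    """Re-reference an EO measurement to the navigation radar's center.
    dx_m/dy_m/dz_m: position of the EO device relative to the radar
    (forward, starboard, up), as measured during installation."""
    # Target position in the EO-centered frame (horizontal plane)
    x = distance_m * math.cos(math.radians(azimuth_deg))
    y = distance_m * math.sin(math.radians(azimuth_deg))
    # Shift into the radar-centered frame
    xr, yr = x + dx_m, y + dy_m
    rng = math.hypot(xr, yr)
    rng = math.hypot(rng, dz_m)       # height offset affects slant range only
    az = math.degrees(math.atan2(yr, xr)) % 360.0
    return az, rng
```

For example, with the EO device mounted 10 m forward of the radar, a target seen dead ahead at 100 m from the EO device is 110 m from the radar at the same azimuth.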
For the specific limitations of the unmanned ship multi-source target information fusion system, reference may be made to the limitations of the unmanned ship multi-source target information fusion method above, which are not repeated here. All or part of the modules in the unmanned ship multi-source target information fusion system can be realized by software, by hardware, or by a combination thereof. The above modules may be embedded in hardware, be independent of the processor in the computer device, or be stored as software in the memory of the computer device, so that the processor can call and execute the operations corresponding to each module.
The unmanned ship multi-source target information fusion method provided by the invention has good stability and high reliability, improves the accuracy of target information, and can provide a guarantee for the safe navigation of the unmanned ship.
The foregoing has outlined and described the basic principles, features and advantages of the present invention. It will be understood by those skilled in the art that the invention is not limited to the above embodiments; the above embodiments and descriptions merely illustrate the principles of the invention, and various modifications, equivalent substitutions and improvements may be made without departing from the spirit and scope of the invention.

Claims (8)

1. The unmanned ship multi-source target information fusion method is characterized by comprising the following steps of:
Step 1, acquiring navigation radar echo video information, photoelectric equipment image information, satellite map information and obstacle target information acquired by AIS equipment;
step 2, obstacle target recognition is carried out on the echo video information, and azimuth, distance, navigational speed and heading information of the target are extracted;
step 3, identifying obstacle targets from the image information, and tracking and ranging the identified targets to obtain azimuth and distance information of the targets;
Step 4, extracting fixed obstacle target information based on satellite map information;
Step 5, carrying out fusion processing on the obstacle target information in the steps 1 to 3 to obtain a fusion target, and then combining the fixed obstacle target information extracted in the step 4 to form a comprehensive environment situation around the unmanned ship, and sending the comprehensive environment situation to an unmanned ship navigation control system to realize obstacle avoidance of the unmanned ship;
in step 5, the obstacle target information from steps 1 to 3 is fused as follows, with the obstacle targets detected by the navigation radar as the basis: for dynamic obstacle targets, the obstacle target information acquired by the navigation radar is fused and confirmed with that acquired by the AIS equipment, and if it cannot be confirmed, the navigation radar information is then fused with the obstacle target information acquired by the photoelectric equipment; for static targets, the obstacle target information acquired by the navigation radar is fused with that acquired by the photoelectric equipment.
2. The unmanned ship multi-source target information fusion method according to claim 1, wherein the fusion of the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the AIS equipment comprises the following specific processes:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the AIS equipment are within the set thresholds, the two are judged to be the same obstacle target; the obstacle target information acquired by the AIS equipment is taken as the standard, and the fused obstacle target information is that acquired by the AIS equipment.
3. The unmanned ship multi-source target information fusion method according to claim 1, wherein the fusion of the obstacle target information acquired by the navigation radar and the obstacle target information acquired by the photoelectric device comprises the following specific steps:
If the azimuth and distance differences between an obstacle target acquired by the navigation radar and one acquired by the photoelectric equipment are within the set thresholds, the two are judged to be the same obstacle target; if only the navigation radar target is stably output, the fused obstacle target information is that acquired by the navigation radar; if only the photoelectric equipment target is stably output, the fused obstacle target information is that acquired by the photoelectric equipment; if both are stably output, the photoelectric equipment is taken as the standard, and the fused obstacle target information is that acquired by the photoelectric equipment.
4. The unmanned ship multi-source target information fusion method according to claim 1, wherein the navigation radar, the photoelectric equipment and the AIS equipment are all installed on a central line of the unmanned ship, and the central positions of other equipment are uniform by taking the navigation radar as a center.
5. The unmanned ship multi-source target information fusion method according to claim 4, wherein the information description formats of the navigation radar, the photoelectric equipment and the AIS equipment are unified, and standard communication protocols are adopted.
6. The unmanned ship multi-source target information fusion method according to claim 4, wherein the navigation radar, the photoelectric equipment and the AIS equipment all adopt an Ethernet mode for data transmission.
7. An unmanned ship multi-source target information fusion system based on the method of any one of claims 1 to 6, characterized in that the system comprises: a navigation radar, photoelectric equipment, AIS equipment, a satellite map and an information fusion processing unit;
the information fusion processing unit is used for identifying obstacle targets from the echo information of the navigation radar and extracting the azimuth, distance, navigational speed and heading information of the targets;
the information fusion processing unit is used for identifying obstacle targets from the image information of the photoelectric equipment, and tracking and ranging the identified targets to obtain target azimuth and distance information;
the information fusion processing unit is used for acquiring the azimuth, distance, navigational speed and heading information of surrounding ship targets through the AIS equipment;
the information fusion processing unit is used for acquiring fixed obstacle target information from the satellite map;
the information fusion processing unit is further used for fusing the obstacle target information acquired by the navigation radar, the photoelectric equipment and the AIS equipment to obtain fused targets, and then combining the extracted fixed obstacle targets to form a comprehensive environment situation around the unmanned ship, so as to realize obstacle avoidance of the unmanned ship.
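The overall flow of claim 7 can be sketched as a simple pipeline: de-duplicate the dynamic tracks from the three sensors, then append the fixed obstacles from the satellite map. This is an illustrative sketch under stated assumptions, not the patented implementation; the greedy association, the tuple representation, and the threshold values are all assumptions.

```python
from typing import List, Tuple

# A target is represented minimally as (azimuth_deg, distance_m, source).
Target = Tuple[float, float, str]

def associate(targets: List[Target], az_thr: float = 2.0,
              dist_thr: float = 50.0) -> List[Target]:
    """Greedy de-duplication: a target within both thresholds of an already
    kept target is treated as the same obstacle. Later entries in the input
    replace earlier ones, so callers pass sources in ascending priority."""
    kept: List[Target] = []
    for t in targets:
        for i, k in enumerate(kept):
            if abs(t[0] - k[0]) <= az_thr and abs(t[1] - k[1]) <= dist_thr:
                kept[i] = t          # replace with the higher-priority track
                break
        else:
            kept.append(t)
    return kept

def build_environment_situation(radar, photo, ais, fixed) -> List[Target]:
    """Fuse the three dynamic sources (photoelectric last, so it wins ties),
    then append the fixed obstacles extracted from the satellite map."""
    dynamic = associate(list(radar) + list(ais) + list(photo))
    return dynamic + list(fixed)
```

Listing the photoelectric tracks last mirrors the method's priority rule: when the radar and the photoelectric equipment see the same obstacle, the photoelectric information is the one kept in the fused picture.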
8. The unmanned ship multi-source target information fusion system according to claim 7, wherein the navigation radar, the photoelectric equipment and the AIS equipment are all installed on the center line of the unmanned ship; the offsets of the photoelectric equipment from the navigation radar in the horizontal and height directions are measured by taking the navigation radar as the center, and the information fusion processing unit corrects the center of the photoelectric equipment to the center of the navigation radar using the measured offsets, so that the center positions are unified.
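The center correction in claim 8 amounts to translating each photoelectric measurement by the measured mounting offsets. A minimal planar sketch follows; the coordinate convention (x to starboard, y forward) and the treatment of the height offset are assumptions, not part of the claim.

```python
import math

def correct_to_radar_center(azimuth_deg: float, distance_m: float,
                            dx_m: float, dy_m: float) -> tuple:
    """Translate an (azimuth, distance) measurement from the photoelectric
    equipment's center to the navigation radar's center.

    dx_m, dy_m are the measured horizontal offsets of the photoelectric
    equipment relative to the radar (to starboard and forward); the height
    offset affects only elevation and is ignored in this planar sketch.
    """
    az = math.radians(azimuth_deg)
    # Target position in the photoelectric frame, shifted into the radar frame.
    x = distance_m * math.sin(az) + dx_m
    y = distance_m * math.cos(az) + dy_m
    corrected_distance = math.hypot(x, y)
    corrected_azimuth = math.degrees(math.atan2(x, y)) % 360.0
    return corrected_azimuth, corrected_distance
```

With zero offsets the correction is the identity, and a purely forward offset simply lengthens the distance of a dead-ahead target, which is a quick sanity check on the geometry.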
CN202310476123.1A 2023-04-28 2023-04-28 Unmanned ship multi-source target information fusion method Active CN116592871B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310476123.1A CN116592871B (en) 2023-04-28 2023-04-28 Unmanned ship multi-source target information fusion method

Publications (2)

Publication Number Publication Date
CN116592871A CN116592871A (en) 2023-08-15
CN116592871B 2024-04-23

Family

ID=87599961

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310476123.1A Active CN116592871B (en) 2023-04-28 2023-04-28 Unmanned ship multi-source target information fusion method

Country Status (1)

Country Link
CN (1) CN116592871B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107450546A (en) * 2017-08-16 2017-12-08 北京克路德人工智能科技有限公司 Obstacle Avoidance based on GPS and ultrasonic radar
CN109829386A (en) * 2019-01-04 2019-05-31 清华大学 Intelligent vehicle based on Multi-source Information Fusion can traffic areas detection method
CN110667783A (en) * 2019-08-30 2020-01-10 安徽科微智能科技有限公司 Unmanned boat auxiliary driving system and method thereof
CN112985406A (en) * 2021-02-23 2021-06-18 武汉理工大学 Ship obstacle avoidance path planning method and device and storage medium
CN114812581A (en) * 2022-06-23 2022-07-29 中国科学院合肥物质科学研究院 Cross-country environment navigation method based on multi-sensor fusion
CN114879180A (en) * 2022-03-22 2022-08-09 大连海事大学 Seamless situation perception method for real-time fusion of unmanned ship-borne multi-element multi-scale radar
CN115686021A * 2022-11-11 2023-02-03 中国船舶集团有限公司第七一六研究所 Unmanned ship for ocean cloud and fog observation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230131553A1 (en) * 2021-10-27 2023-04-27 Volvo Car Corporation Environment-aware path planning for a self-driving vehicle using dynamic step-size search

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on information fusion of shipborne navigation instruments; Zhang Xinggu et al.; Chinese Journal of Scientific Instrument; Vol. 26, No. 3; full text *

Similar Documents

Publication Publication Date Title
Thombre et al. Sensors and AI techniques for situational awareness in autonomous ships: A review
CN110414396B (en) Unmanned ship perception fusion algorithm based on deep learning
CN110175186B (en) Intelligent ship environment threat target sensing system and method
Han et al. Coastal SLAM with marine radar for USV operation in GPS-restricted situations
CN109460035B (en) Secondary autonomous obstacle avoidance method for unmanned ship in high-speed state
CN105841688B A ship-assisted berthing method and system
WO2019096401A1 (en) Real-time monitoring of surroundings of marine vessel
CN109029465B (en) Millimeter wave radar-based tracking and obstacle avoidance system for unmanned ship
CN107817488A (en) The unmanned plane obstacle avoidance apparatus and barrier-avoiding method merged based on millimetre-wave radar with vision
Clunie et al. Development of a perception system for an autonomous surface vehicle using monocular camera, lidar, and marine radar
CN108227739B (en) Close-range obstacle avoidance method of underwater automatic driving equipment and underwater automatic driving equipment
US11954918B2 (en) Object detection device, object detection method, and storage medium
EP2211200A1 (en) Marine radar system with three-dimensional memory
CN110879394A (en) Unmanned ship radar obstacle avoidance system and method based on motion attitude information
KR20170090138A (en) System for avoiding risk environments of ship and method for avoiding using the same
CN113687349A (en) Unmanned ship sea surface target tracking method and device based on multi-sensor fusion
CN115311900B (en) Inland waterway ship auxiliary target identification method based on visual enhancement
CN115761286A (en) Method for detecting navigation obstacle of unmanned surface vehicle based on laser radar under complex sea condition
Yao et al. LiDAR-based simultaneous multi-object tracking and static mapping in nearshore scenario
KR20240080189A (en) Distance measurement method and distance measurement device using the same
CN114325635A (en) Target fusion method for laser radar and navigation radar
CN112611376B (en) RGI-Lidar/SINS tightly-coupled AUV underwater navigation positioning method and system
CN116592871B (en) Unmanned ship multi-source target information fusion method
CN117542225A (en) Augmented reality ship auxiliary navigation system
CN111474536A (en) Intelligent ship autonomous positioning system and method based on shore-based radar system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant