CN110376593A - Target perception method and device based on laser radar - Google Patents

Target perception method and device based on laser radar

Info

Publication number
CN110376593A
Authority
CN
China
Prior art keywords
target
laser radar
electro-optical system
unmanned boat
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910715446.5A
Other languages
Chinese (zh)
Other versions
CN110376593B (en)
Inventor
袁敏
倪侃俊
钱伟
王新雅
张国兴
王南南
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHANGHAI AIWEI AEROSPACE ELECTRONIC CO Ltd
Original Assignee
SHANGHAI AIWEI AEROSPACE ELECTRONIC CO Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SHANGHAI AIWEI AEROSPACE ELECTRONIC CO Ltd
Priority to CN201910715446.5A
Publication of CN110376593A
Application granted
Publication of CN110376593B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G01C11/04 Interpretation of pictures
    • G01C11/06 Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/203 Instruments for performing navigational calculations specially adapted for sailing ships
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for anti-collision purposes

Abstract

The invention discloses a target perception method and device based on laser radar. The method comprises the following steps. Step 1: a laser radar sensing module detects objects around the unmanned surface vehicle (USV) and obtains the position information of a target. Step 2: a pan-tilt control module calculates the electro-optical system pan-tilt parameters from the position information of the target and controls the electro-optical system to precisely aim at the target. Step 3: a video analysis module performs frame-by-frame detection and recognition on the target surveillance video, obtains the target category and the target position in the image, and extracts the target contour, size and color. Step 4: the perceived target data are uploaded to the USV control center. The target perception method and device based on laser radar provided by the invention use the laser radar and the electro-optical system in coordination to actively detect objects around the USV, use the target position to calculate the electro-optical system pan-tilt parameters, precisely aim at the target, and improve target recognition accuracy.

Description

Target perception method and device based on laser radar
Technical field
The present invention relates to a target perception method and device, and more particularly to a target perception method and device based on laser radar.
Background technique
The unmanned surface vehicle (USV) is small, highly maneuverable and highly intelligent, and can replace manned surface vessels in carrying out complex and demanding missions in extreme environments; it therefore has broad application prospects in production, scientific research, national defense and other fields. USV environmental perception devices and methods are key technologies for realizing autonomous navigation and have high research value.
Patent document CN109282813A discloses a global obstacle recognition method for unmanned surface vehicles, comprising: scanning obstacles with a navigation radar; calculating the corrected position of each obstacle; capturing the obstacle with a photoelectric tracker and calculating its size; averaging the acquired obstacle data; and performing global obstacle-avoidance planning. The method detects objects around the USV with a navigation radar and then captures and identifies the target with the photoelectric tracker, so it is an active perception system. However, the detection accuracy of a navigation radar is lower than that of a laser radar, so accurate target position information cannot be obtained; the method does not address how the photoelectric tracker's pan-tilt parameters are calculated; and it extracts only the size of the target and cannot classify or identify it.
Patent document CN109255820A discloses an active perception device and method based on an unmanned surface vehicle, comprising: a USV navigation control unit, a shipborne laser radar device, a camera calibration unit, a data acquisition and display unit, a data transmission unit, a shore-based server control unit and a shore-based target detection unit. The shore-based target detection unit determines the position of a target after detecting it in complex waters; the USV navigation control unit controls the USV to navigate to the vicinity of the target to be observed; the shipborne laser radar device detects the specific position of the target; the camera calibration unit calibrates all cell areas through the camera; the data acquisition and display unit captures clear images of the target; and the shore-based server control unit stores the image data acquired by the data acquisition and display unit. This method calibrates the camera through the camera calibration unit so that the camera can observe the preset cell areas; the calibration work is cumbersome and the positioning accuracy is not high. Moreover, the method only collects image data of the target and cannot classify or identify it.
Therefore, it is desirable to develop a device that can actively perceive the position information of targets around the USV.
Summary of the invention
The technical problem to be solved by the present invention is to provide a target perception method and device based on laser radar that use the laser radar and the electro-optical system in coordination to actively detect objects around the USV, use the target position to calculate the electro-optical system pan-tilt parameters, precisely aim at the target, and improve target recognition accuracy.
The technical solution adopted by the present invention to solve the above technical problem is to provide a target perception method based on laser radar, comprising the following steps:
Step 1: a laser radar sensing module detects objects around the USV and obtains the position information of a target;
Step 2: a pan-tilt control module calculates the electro-optical system pan-tilt parameters from the position information of the target and controls the electro-optical system to precisely aim at the target;
Step 3: a video analysis module performs frame-by-frame detection and recognition on the target surveillance video, obtains the target category and the target position in the image, and extracts the target contour, size and color;
Step 4: the perceived target data are uploaded to the USV control center.
Preferably, the video analysis module performs frame-by-frame detection and recognition on the target surveillance video using a YOLOv3 water-surface target detection model trained under the Darknet framework.
Preferably, step 1 specifically includes: the laser radar detects objects around the USV and obtains the distance, azimuth and radial size of the target relative to the USV, denoted D_VT, θ_A and L_T respectively; the position of the USV is obtained by a high-precision GPS and denoted (x_lon, y_lat); and the position information (x, y) of the target is calculated from D_VT, θ_A and (x_lon, y_lat), where R, the mean radius of the Earth, is used to convert the metric offset into longitude and latitude.
Preferably, step 2 specifically includes: the electro-optical system pan-tilt parameters are denoted (p, t, z); the pan-tilt is adjusted to its zero position, i.e. the pan-tilt parameters are (0, 0, 0), and the angle between the camera optical axis and true north and the angle between the optical axis and the horizontal plane are measured and denoted θ_1 and θ_2 respectively.
The pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A - θ_1,
where θ_A is the azimuth of the target relative to the USV;
the tilt t is calculated from θ_2, H, the height of the electro-optical system mounting position above sea level, and D_VT, the distance of the target from the USV;
the zoom z is calculated from M_max, the maximum magnification of the electro-optical system, f_min and f_max, the minimum and maximum focal lengths of the electro-optical camera lens, and f, the optimal focal length for displaying the current target in focus, which in turn is calculated from D_VT, the distance of the target from the USV, L, the width of the CMOS or CCD imaging sensor of the electro-optical camera, and L_T, the radial size of the target.
Preferably, training the YOLOv3 water-surface target detection model under the Darknet framework includes: collecting multiple pictures of six classes of water-surface targets, namely ships, reefs, islands, floating logs, floating ice and other floating objects, with one part of the pictures used as training data and the other part as validation data.
Preferably, step 4 includes uploading the geographic position of the target, the distance and azimuth of the target relative to the USV, and the contour, size and category of the target to the USV control center, where they are stored.
Another technical solution adopted by the present invention to solve the above technical problem is to provide a target perception device based on laser radar, comprising:
a laser radar sensing module comprising a laser radar, a high-precision GPS and a radar data processing unit, wherein the laser radar obtains the distance, azimuth and radial size of the target relative to the USV, the high-precision GPS acquires the position of the laser radar, and the radar data processing unit obtains the position information of the target;
an electro-optical system connected to the pan-tilt control module and the video analysis module;
a pan-tilt control module comprising a pan-tilt parameter calculation unit and a high-speed pan-tilt control unit, wherein the pan-tilt parameter calculation unit calculates the electro-optical system pan-tilt parameters from the position information of the target and the high-speed pan-tilt control unit controls the electro-optical system to precisely aim at the target;
a video analysis module comprising an image processing unit and a target recognition unit, wherein the image processing unit performs frame-by-frame detection and recognition on the target surveillance video and the target recognition unit obtains the target category and the target position in the image and extracts the target contour, size and color; and
a USV control center for receiving the perceived target data.
Compared with the prior art, the present invention has the following beneficial effects: the target perception method and device based on laser radar provided by the invention use the laser radar and the electro-optical system in coordination to actively detect objects around the USV, which improves detection efficiency; the target position is used to calculate the electro-optical system pan-tilt parameters and precisely aim at the target, which enhances the observation effect and improves target recognition accuracy; a YOLOv3 water-surface target detection model trained under the Darknet framework is used for water-surface target classification and recognition, with a high recognition rate; and rich perception information is provided, including the geographic position of the target, the distance, azimuth and radial size of the target relative to the USV, and the contour and category of the target.
Brief description of the drawings
Fig. 1 is a schematic diagram of an unmanned surface vehicle active perception system equipped with the laser-radar-based target perception device in an embodiment of the present invention;
Fig. 2 is a top view of the angular relationships used in the pan-tilt parameter calculation of the laser-radar-based target perception device in an embodiment of the present invention;
Fig. 3 is a side view of the angular relationships used in the pan-tilt parameter calculation of the laser-radar-based target perception device in an embodiment of the present invention;
Fig. 4 is a structural schematic diagram of the laser-radar-based target perception device in an embodiment of the present invention.
Specific embodiment
The invention will be further described with reference to the accompanying drawings and examples.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be apparent to those of ordinary skill in the art that the invention may be practiced without these specific details. The specific details are therefore merely exemplary and may be varied while remaining within the spirit and scope of the invention.
The target perception method and device based on laser radar provided in this embodiment use the laser radar and the electro-optical system in coordination to actively detect objects around the USV, use the target position to calculate the electro-optical system pan-tilt parameters, precisely aim at the target, and improve target recognition accuracy.
Referring now to Fig. 1: 1 is the unmanned surface vehicle, 2 is the mounting pillar, 3 is the electro-optical pod, 4 is the laser radar, 5 is the water-surface target, and 6 is the USV control center.
This embodiment discloses a target perception method based on laser radar, comprising the following steps:
Step 1: a laser radar sensing module detects objects around the USV and obtains the position information of a target;
Step 2: a pan-tilt control module calculates the electro-optical system pan-tilt parameters from the position information of the target and controls the electro-optical system to precisely aim at the target;
Step 3: a video analysis module performs frame-by-frame detection and recognition on the target surveillance video, obtains the target category and the target position in the image, and extracts the target contour, size and color;
Step 4: the perceived target data are uploaded to the USV control center.
Preferably, the video analysis module performs frame-by-frame detection and recognition on the target surveillance video using a YOLOv3 water-surface target detection model trained under the Darknet framework.
Preferably, step 1 specifically includes: the laser radar detects objects around the USV and obtains the distance, azimuth and radial size of the target relative to the USV, denoted D_VT, θ_A and L_T respectively; the position of the USV is obtained by a high-precision GPS and denoted (x_lon, y_lat); and the position information (x, y) of the target is calculated from D_VT, θ_A and (x_lon, y_lat), where R, the mean radius of the Earth, is used to convert the metric offset into longitude and latitude.
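The coordinate-conversion formula itself appears only as an image in the published text. The sketch below is a minimal reconstruction under stated assumptions: a small-offset, spherical-Earth conversion in which θ_A is measured clockwise from true north and R is the mean Earth radius; the function name, argument units and the exact conversion are illustrative and are not taken from the patent.

```python
import math

EARTH_RADIUS_M = 6371008.8  # assumed value for the mean Earth radius R, in metres

def target_position(x_lon, y_lat, d_vt, theta_a):
    """Estimate the target longitude/latitude (x, y) in degrees from the USV position
    (x_lon, y_lat) in degrees, the lidar range d_vt in metres and the azimuth theta_a
    in degrees clockwise from true north. Small-offset spherical-Earth approximation;
    an assumed reconstruction, not the formula published in the patent."""
    az = math.radians(theta_a)
    lat = math.radians(y_lat)
    north = d_vt * math.cos(az)   # metric offset of the target towards true north
    east = d_vt * math.sin(az)    # metric offset of the target towards east
    y = y_lat + math.degrees(north / EARTH_RADIUS_M)
    x = x_lon + math.degrees(east / (EARTH_RADIUS_M * math.cos(lat)))
    return x, y

# Example: a target 500 m away at bearing 30 degrees from a USV at (121.5 E, 31.2 N).
print(target_position(121.5, 31.2, 500.0, 30.0))
```

At typical water-surface detection ranges of a few kilometres, the error of this small-offset approximation is far below the uncertainty of the lidar and GPS measurements themselves.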
Preferably, step 2 specifically includes: the electro-optical system pan-tilt parameters are denoted (p, t, z); the pan-tilt is adjusted to its zero position, i.e. the pan-tilt parameters are (0, 0, 0), and the angle between the camera optical axis and true north and the angle between the optical axis and the horizontal plane are measured and denoted θ_1 and θ_2 respectively.
The pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A - θ_1,
where θ_A is the azimuth of the target relative to the USV;
the tilt t is calculated from θ_2, H, the height of the electro-optical system mounting position above sea level, and D_VT, the distance of the target from the USV;
the zoom z is calculated from M_max, the maximum magnification of the electro-optical system, f_min and f_max, the minimum and maximum focal lengths of the electro-optical camera lens, and f, the optimal focal length for displaying the current target in focus, which in turn is calculated from D_VT, the distance of the target from the USV, L, the width of the CMOS or CCD imaging sensor of the electro-optical camera, and L_T, the radial size of the target.
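The tilt, zoom and focal-length formulas are likewise published only as images. The sketch below fills them in with one plausible set of geometric relations, stated here as assumptions rather than the patent's own equations: the tilt as the depression angle arctan(H / D_VT) measured against the zero-position elevation θ_2, the optimal focal length from the pinhole relation f = L * D_VT / L_T, and the zoom as a linear mapping of f from [f_min, f_max] onto [1, M_max].

```python
import math

def pan_tilt_zoom(theta_a, theta_1, theta_2, h, d_vt, l_t,
                  sensor_width, f_min, f_max, m_max):
    """Sketch of the pan-tilt-zoom computation under assumed conventions.

    theta_a      : target azimuth relative to the USV, degrees from true north
    theta_1      : optical-axis azimuth at the pan-tilt zero position, degrees
    theta_2      : optical-axis elevation at the pan-tilt zero position, degrees
    h            : mounting height of the electro-optical system above sea level, m
    d_vt         : lidar range to the target, m
    l_t          : radial size of the target, m
    sensor_width : width L of the camera CMOS/CCD, m
    f_min, f_max : minimum and maximum lens focal length, m
    m_max        : maximum magnification of the electro-optical system
    """
    # Pan: rotate from the zero-position heading theta_1 to the target azimuth.
    p = theta_a - theta_1

    # Tilt (assumed relation): depress the axis by the angle subtended by the
    # mounting height at the target range, measured from the zero elevation.
    t = math.degrees(math.atan2(h, d_vt)) - theta_2

    # Optimal focal length (assumed pinhole relation): make a target of radial
    # size l_t at range d_vt just fill the sensor width.
    f = sensor_width * d_vt / l_t
    f = min(max(f, f_min), f_max)

    # Zoom (assumed relation): map [f_min, f_max] linearly onto [1, m_max].
    z = 1.0 + (m_max - 1.0) * (f - f_min) / (f_max - f_min)
    return p, t, z
```

Clamping f to the physical focal-length range keeps the zoom command valid even when the target is very close or very small.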
Preferably, training the YOLOv3 water-surface target detection model under the Darknet framework includes: collecting multiple pictures of six classes of water-surface targets, namely ships, reefs, islands, floating logs, floating ice and other floating objects, with one part of the pictures used as training data and the other part as validation data. For example, 30,000 pictures of the six target classes are collected, of which 18,000 are used as training data and 12,000 as validation data, and the YOLOv3 water-surface target detection model is trained under the Darknet framework. The trained model then performs frame-by-frame detection and recognition on the target surveillance video, obtains the target category and the target position in the image, and extracts the target contour, size and appearance feature data.
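As an illustration of the frame-by-frame detection step, the sketch below runs a Darknet-trained YOLOv3 model through OpenCV's DNN module. The configuration and weight file names, the 416 x 416 input size and the confidence/NMS thresholds are assumptions, and the class list simply mirrors the six categories listed above.

```python
import cv2
import numpy as np

# Placeholder paths for the trained Darknet/YOLOv3 configuration and weights.
CFG, WEIGHTS = "yolov3-water.cfg", "yolov3-water.weights"
CLASSES = ["ship", "reef", "island", "floating log", "floating ice", "other floating object"]

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_names = net.getUnconnectedOutLayersNames()

def detect(frame, conf_thr=0.5, nms_thr=0.4):
    """Run one video frame through the detector; return (class, confidence, box) tuples."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores, class_ids = [], [], []
    for out in net.forward(out_names):
        for det in out:                      # det = [cx, cy, bw, bh, objectness, class scores...]
            cls = int(np.argmax(det[5:]))
            conf = float(det[5 + cls])
            if conf < conf_thr:
                continue
            cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(conf)
            class_ids.append(cls)
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thr, nms_thr)
    return [(CLASSES[class_ids[i]], scores[i], boxes[i]) for i in np.array(keep).flatten()]
```

Each returned box can then be cropped from the frame for contour, size and colour extraction, for example with cv2.findContours and cv2.mean on the region of interest.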
Preferably, step 4 includes uploading the geographic position of the target, the distance and azimuth of the target relative to the USV, and the contour, size and category of the target to the USV control center, where they are stored and can provide data support for route planning.
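Step 4 names the fields that are uploaded but not how they are packaged. One possible record layout, assuming a JSON message over whatever data link the USV uses, is sketched below; the field names and the transport format are illustrative, not specified by the patent.

```python
import json
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class TargetReport:
    """Perceived-target record uploaded to the USV control center (illustrative schema)."""
    longitude: float                 # target geographic position x
    latitude: float                  # target geographic position y
    distance_m: float                # D_VT, range from the USV
    azimuth_deg: float               # theta_A, bearing from the USV
    radial_size_m: float             # L_T
    category: str                    # one of the six detection classes
    contour: List[Tuple[int, int]]   # extracted contour points in image coordinates

def to_message(report: TargetReport) -> bytes:
    """Serialize a report for transmission to and storage at the control center."""
    return json.dumps(asdict(report)).encode("utf-8")

msg = to_message(TargetReport(121.51, 31.21, 500.0, 30.0, 12.0, "ship",
                              [(10, 20), (40, 20), (40, 60), (10, 60)]))
```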
Referring now to Fig. 4, this embodiment also discloses a target perception device based on laser radar, comprising:
a laser radar sensing module comprising a laser radar, a high-precision GPS and a radar data processing unit, wherein the laser radar obtains the distance, azimuth and radial size of the target relative to the USV, the high-precision GPS acquires the position of the laser radar, and the radar data processing unit obtains the position information of the target;
an electro-optical system connected to the pan-tilt control module and the video analysis module;
a pan-tilt control module comprising a pan-tilt parameter calculation unit and a high-speed pan-tilt control unit, wherein the pan-tilt parameter calculation unit calculates the electro-optical system pan-tilt parameters from the position information of the target and the high-speed pan-tilt control unit controls the electro-optical system to precisely aim at the target;
a video analysis module comprising an image processing unit and a target recognition unit, wherein the image processing unit performs frame-by-frame detection and recognition on the target surveillance video and the target recognition unit obtains the target category and the target position in the image and extracts the target contour, size and color; and
a USV control center for receiving the perceived target data.
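Read as software, the four modules form a simple pipeline: the laser radar sensing module produces a position fix, the pan-tilt control module turns it into (p, t, z) commands, the video analysis module classifies the aimed image, and the result is reported to the control center. The loop below is only an illustrative sketch of that data flow; the module interfaces (lidar.detect(), pan_tilt.aim() and so on) are assumed, not taken from the patent.

```python
def perceive_once(lidar, pan_tilt, camera, analyzer, uplink):
    """One perception cycle of the device, mirroring steps 1-4 of the method
    (illustrative interfaces; the patent does not define these calls)."""
    x, y, d_vt, theta_a, l_t = lidar.detect()        # step 1: lidar position fix
    pan_tilt.aim(d_vt, theta_a, l_t)                 # step 2: aim the electro-optical system
    detections = analyzer.recognize(camera.grab())   # step 3: frame-by-frame recognition
    for category, confidence, box in detections:     # step 4: report to the control center
        uplink.upload({"lon": x, "lat": y, "range_m": d_vt,
                       "azimuth_deg": theta_a, "category": category})
```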
In summary, the target perception method and device based on laser radar provided in this embodiment use the laser radar and the electro-optical system in coordination to actively detect objects around the USV, which improves detection efficiency; the target position is used to calculate the electro-optical system pan-tilt parameters and precisely aim at the target, which enhances the observation effect and improves target recognition accuracy; a YOLOv3 water-surface target detection model trained under the Darknet framework is used for water-surface target classification and recognition, with a high recognition rate; and rich perception information is provided, including the geographic position of the target, the distance, azimuth and radial size of the target relative to the USV, and the contour and category of the target.
Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Any person skilled in the art may make modifications and improvements without departing from the spirit and scope of the present invention, and the scope of protection of the invention shall therefore be as defined by the claims.

Claims (7)

1. A target perception method based on laser radar, characterized by comprising the following steps:
Step 1: a laser radar sensing module detects objects around the unmanned surface vehicle (USV) and obtains the position information of a target;
Step 2: a pan-tilt control module calculates the electro-optical system pan-tilt parameters from the position information of the target and controls the electro-optical system to precisely aim at the target;
Step 3: a video analysis module performs frame-by-frame detection and recognition on the target surveillance video, obtains the target category and the target position in the image, and extracts the target contour, size and color;
Step 4: the perceived target data are uploaded to the USV control center.
2. The target perception method based on laser radar according to claim 1, characterized in that the video analysis module performs frame-by-frame detection and recognition on the target surveillance video using a YOLOv3 water-surface target detection model trained under the Darknet framework.
3. The target perception method based on laser radar according to claim 2, characterized in that step 1 specifically includes: the laser radar detects objects around the USV and obtains the distance, azimuth and radial size of the target relative to the USV, denoted D_VT, θ_A and L_T respectively; the position of the USV is obtained by a high-precision GPS and denoted (x_lon, y_lat); and the position information (x, y) of the target is calculated from D_VT, θ_A and (x_lon, y_lat), where R, the mean radius of the Earth, is used to convert the metric offset into longitude and latitude.
4. The target perception method based on laser radar according to claim 3, characterized in that step 2 specifically includes: the electro-optical system pan-tilt parameters are denoted (p, t, z); the pan-tilt is adjusted to its zero position, i.e. the pan-tilt parameters are (0, 0, 0), and the angle between the camera optical axis and true north and the angle between the optical axis and the horizontal plane are measured and denoted θ_1 and θ_2 respectively;
the pan-tilt parameters (p, t, z) are calculated as follows:
p = θ_A - θ_1,
where θ_A is the azimuth of the target relative to the USV;
the tilt t is calculated from θ_2, H, the height of the electro-optical system mounting position above sea level, and D_VT, the distance of the target from the USV;
the zoom z is calculated from M_max, the maximum magnification of the electro-optical system, f_min and f_max, the minimum and maximum focal lengths of the electro-optical camera lens, and f, the optimal focal length for displaying the current target in focus, which in turn is calculated from D_VT, the distance of the target from the USV, L, the width of the CMOS or CCD imaging sensor of the electro-optical camera, and L_T, the radial size of the target.
5. The target perception method based on laser radar according to claim 4, characterized in that training the YOLOv3 water-surface target detection model under the Darknet framework includes: collecting multiple pictures of six classes of water-surface targets, namely ships, reefs, islands, floating logs, floating ice and other floating objects, with one part of the pictures used as training data and the other part as validation data.
6. The target perception method based on laser radar according to claim 1, characterized in that step 4 includes uploading the geographic position of the target, the distance and azimuth of the target relative to the USV, and the contour, size and category of the target to the USV control center, where they are stored.
7. A target perception device based on laser radar, characterized by comprising:
a laser radar sensing module comprising a laser radar, a high-precision GPS and a radar data processing unit, wherein the laser radar obtains the distance, azimuth and radial size of the target relative to the unmanned surface vehicle (USV), the high-precision GPS acquires the position of the laser radar, and the radar data processing unit obtains the position information of the target;
an electro-optical system connected to the pan-tilt control module and the video analysis module;
a pan-tilt control module comprising a pan-tilt parameter calculation unit and a high-speed pan-tilt control unit, wherein the pan-tilt parameter calculation unit calculates the electro-optical system pan-tilt parameters from the position information of the target and the high-speed pan-tilt control unit controls the electro-optical system to precisely aim at the target;
a video analysis module comprising an image processing unit and a target recognition unit, wherein the image processing unit performs frame-by-frame detection and recognition on the target surveillance video and the target recognition unit obtains the target category and the target position in the image and extracts the target contour, size and color; and
a USV control center for receiving the perceived target data.
CN201910715446.5A 2019-08-05 2019-08-05 Target sensing method and device based on laser radar Active CN110376593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910715446.5A CN110376593B (en) 2019-08-05 2019-08-05 Target sensing method and device based on laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910715446.5A CN110376593B (en) 2019-08-05 2019-08-05 Target sensing method and device based on laser radar

Publications (2)

Publication Number Publication Date
CN110376593A true CN110376593A (en) 2019-10-25
CN110376593B CN110376593B (en) 2021-05-04

Family

ID=68257964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910715446.5A Active CN110376593B (en) 2019-08-05 2019-08-05 Target sensing method and device based on laser radar

Country Status (1)

Country Link
CN (1) CN110376593B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112835055A (en) * 2020-12-30 2021-05-25 潍柴动力股份有限公司 Positioning method and system of laser SLAM equipment
CN112927233A (en) * 2021-01-27 2021-06-08 湖州市港航管理中心 Marine laser radar and video combined target capturing method
CN113064157A (en) * 2021-06-01 2021-07-02 北京高普乐光电科技股份公司 Radar and photoelectric linkage early warning method, device and system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108471497A (en) * 2018-03-02 2018-08-31 天津市亚安科技有限公司 A kind of ship target real-time detection method based on monopod video camera
KR20190005413A (en) * 2017-07-06 2019-01-16 세한대학교기술지주회사 주식회사 Collision detection device of Marina leisure ship based on laser sensor
CN109298708A (en) * 2018-08-31 2019-02-01 中船重工鹏力(南京)大气海洋信息系统有限公司 A kind of unmanned boat automatic obstacle avoiding method merging radar and photoelectric information
CN109375633A (en) * 2018-12-18 2019-02-22 河海大学常州校区 River course clear up path planning system and method based on global state information
CN109444911A (en) * 2018-10-18 2019-03-08 哈尔滨工程大学 A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion
CN109784278A (en) * 2019-01-17 2019-05-21 上海海事大学 The small and weak moving ship real-time detection method in sea based on deep learning

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190005413A (en) * 2017-07-06 2019-01-16 세한대학교기술지주회사 주식회사 Collision detection device of Marina leisure ship based on laser sensor
CN108471497A (en) * 2018-03-02 2018-08-31 天津市亚安科技有限公司 A kind of ship target real-time detection method based on monopod video camera
CN109298708A (en) * 2018-08-31 2019-02-01 中船重工鹏力(南京)大气海洋信息系统有限公司 A kind of unmanned boat automatic obstacle avoiding method merging radar and photoelectric information
CN109444911A (en) * 2018-10-18 2019-03-08 哈尔滨工程大学 A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion
CN109375633A (en) * 2018-12-18 2019-02-22 河海大学常州校区 River course clear up path planning system and method based on global state information
CN109784278A (en) * 2019-01-17 2019-05-21 上海海事大学 The small and weak moving ship real-time detection method in sea based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MUHAMMAD ASROFI ET AL.: "Optimal Path Planning of a Mini USV using Sharp Cornering Algorithm", 《2016 INTERNATIONAL CONFERENCE ON INFORMATION TECHNOLOGY SYSTEMS AND INNOVATION (ICITSI)》 *
YAN Xinping (严新平): "Research status and development trends of intelligent ships" (智能船舶的研究现状与发展趋势), 《交通与港航》 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112835055A (en) * 2020-12-30 2021-05-25 潍柴动力股份有限公司 Positioning method and system of laser SLAM equipment
CN112927233A (en) * 2021-01-27 2021-06-08 湖州市港航管理中心 Marine laser radar and video combined target capturing method
CN113064157A (en) * 2021-06-01 2021-07-02 北京高普乐光电科技股份公司 Radar and photoelectric linkage early warning method, device and system
CN113064157B (en) * 2021-06-01 2022-05-27 北京高普乐光电科技股份公司 Radar and photoelectric linkage early warning method, device and system

Also Published As

Publication number Publication date
CN110376593B (en) 2021-05-04

Similar Documents

Publication Publication Date Title
CN109931939B (en) Vehicle positioning method, device, equipment and computer readable storage medium
US9465129B1 (en) Image-based mapping locating system
CN110376593A (en) A kind of target apperception method and device based on laser radar
CN106408601B (en) A kind of binocular fusion localization method and device based on GPS
CN110142785A (en) A kind of crusing robot visual servo method based on target detection
CN104501779A (en) High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
US20220024549A1 (en) System and method for measuring the distance to an object in water
CN109446973B (en) Vehicle positioning method based on deep neural network image recognition
CN106092054A (en) A kind of power circuit identification precise positioning air navigation aid
Nagai et al. UAV borne mapping by multi sensor integration
CN206611521U (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
KR20210007767A (en) Autonomous navigation ship system for removing sea waste based on deep learning-vision recognition
CN113627473B (en) Multi-mode sensor-based water surface unmanned ship environment information fusion sensing method
CN104599281B (en) A kind of based on the conforming panorama sketch in horizontal linear orientation and remote sensing figure method for registering
CN115717867A (en) Bridge deformation measurement method based on airborne double cameras and target tracking
CN107741233A (en) A kind of construction method of the outdoor map of three-dimensional
JP5152913B2 (en) Offshore monitoring system and method
KR100878781B1 (en) Method for surveying which can measure structure size and coordinates using portable terminal
CN104613928A (en) Automatic tracking and air measurement method for optical pilot balloon theodolite
US11587241B2 (en) Detection of environmental changes to delivery zone
Fan et al. Bio-inspired multisensor navigation system based on the skylight compass and visual place recognition for unmanned aerial vehicles
CN111402324B (en) Target measurement method, electronic equipment and computer storage medium
CN115471555A (en) Unmanned aerial vehicle infrared inspection pose determination method based on image feature point matching
WO2022059603A1 (en) Flood damage determination device, flood damage determination method, and program
CN113592837A (en) Road kiln well lid height difference calculation method based on unmanned aerial vehicle fixed-point aerial photography

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant