CN110580044A - unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing - Google Patents
Unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing
- Publication number
- CN110580044A CN110580044A CN201910814289.3A CN201910814289A CN110580044A CN 110580044 A CN110580044 A CN 110580044A CN 201910814289 A CN201910814289 A CN 201910814289A CN 110580044 A CN110580044 A CN 110580044A
- Authority
- CN
- China
- Prior art keywords
- ship
- unmanned ship
- module
- subsystem
- chart
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S11/00—Systems for determining distance or velocity not using reflection or reradiation
- G01S11/12—Systems for determining distance or velocity not using reflection or reradiation using electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/0206—Control of position or course in two dimensions specially adapted to water vehicles
Abstract
The invention relates to the technical field of intelligent ship control systems, and in particular to a fully automatic navigation heterogeneous system for an unmanned ship based on intelligent sensing. The system senses information in all directions with a minimal set of sensors, improving the efficiency and degree of intelligence of the overall operation.
Description
Technical Field
The invention relates to the technical field of intelligent ship control systems, in particular to a full-automatic navigation heterogeneous system of an unmanned ship based on intelligent sensing.
Background
Against the background of current worldwide economic development, vigorously developing the ocean economy is a necessary path for coastal countries and regions. The development of the ocean economy is bound to enter the fast lane, and unmanned ships are indispensable for carrying out ocean activities.
The unmanned ship is a novel ocean-going platform with extremely wide application prospects in both civil and military fields, capable of tasks such as marine environment monitoring, oceanographic weather forecasting, communication relay, and territorial-sea surveillance. However, current unmanned ship systems remain in a remote-controlled or semi-automatic state. During remote control, an operator must work from shore or from a mother ship, and the time and money costs of renting a mother ship, training personnel and so on are considerable. Unskilled operation greatly increases operating time and makes operating quality hard to guarantee; a moment's carelessness can cause the unmanned ship to collide with an obstacle, shortening its service life or even capsizing it. In the semi-automatic mode, the unmanned ship can sail along a preset route, but the operator must avoid surface and underwater obstacles by plotting the route from a known electronic chart, and must switch back to remote control in time when a dynamic obstacle appears, so overall safety is low. In unknown waters, the navigation risk of the unmanned ship rises sharply.
Disclosure of Invention
The invention aims to overcome the above shortcomings and provide a fully automatic navigation heterogeneous system for an unmanned ship based on intelligent sensing.
In order to achieve this purpose, the invention adopts the following technical scheme: a fully automatic navigation heterogeneous system for an unmanned ship based on intelligent sensing, characterized in that it comprises an unmanned ship hull platform, a hull power system, a remote control module, a chart module, a path memory module, a sensor module and an automatic control module;
The unmanned ship hull platform is used for carrying all intelligent control systems and a hull power system;
The ship body power system comprises a power supply module and a ship body propeller, the power supply module supplies power to all equipment of the unmanned ship, and the ship body propeller is connected with the power supply module and the automatic control module so as to control the motion of the unmanned ship;
The remote control module is implemented as a mobile phone app or a hand-held remote controller and interacts with the shipborne automatic control module through wireless communication;
The chart module stores the electronic chart;
The path memory module memorizes the route the unmanned ship has travelled, so that it can conveniently return to its starting point after the operation is finished;
The sensor module comprises a GNSS receiver, an inertial measurement unit, a look-around camera, a binocular camera, a laser radar, a millimeter-wave radar and an underwater sonar, providing all-round sensing of surface and underwater obstacles;
The automatic control module comprises a data acquisition subsystem, a data fusion subsystem, a planning decision subsystem and a motion control subsystem, and is directly connected to the shipborne sensors; the data acquisition subsystem controls the multi-source sensors to collect environmental information, and the different types of data they acquire are time-stamped and transmitted to the data fusion subsystem;
The data fusion subsystem performs clustering calculations, grouped by time mark, on the object information detected by all sensors at the same moment in the data prepared by the data acquisition subsystem, and establishes a real-time dynamic model around the unmanned ship. From this model, data such as the relative speed and relative coordinates between the unmanned ship and each obstacle are obtained, time-stamped and transmitted to the planning decision subsystem;
The planning decision subsystem imports electronic chart data through the chart module and builds a bottom-layer map data model. It then fuses the time-stamped dynamic data of the unmanned ship provided by the data fusion subsystem into this model, establishes a real-time target model and classifies targets as dynamic or static; a rasterized representation yields the coordinates, length and width of each static obstacle and the coordinates and motion state of each dynamic obstacle in the current environment;
The planning decision subsystem comprehensively plans and decides the ship's motion state and speed at the current moment and over the next few seconds, according to the unmanned ship's position and speed in the chart and the position, size and speed of every real target in it, forms a control signal through the planning algorithm, and transmits it to the motion control subsystem;
The motion control subsystem receives the control signal from the planning decision subsystem, and the hull propellers respond to it, controlling the ship's direction of motion, speed and steering angle. The ship's inertial measurement unit compares the measured motion state with the control signal in real time, forming closed-loop control so that the unmanned ship can complete a series of motions.
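The closed-loop control described above, in which the IMU-measured state is compared against the commanded state and the error fed back to the propellers, can be sketched minimally as a proportional heading corrector. This is an illustrative sketch only; the function names and the gain value are assumptions, not the patent's algorithm:

```python
def wrap_angle(deg: float) -> float:
    """Wrap an angle in degrees into the range [-180, 180)."""
    return (deg + 180.0) % 360.0 - 180.0

def heading_correction(commanded_deg: float, measured_deg: float,
                       gain: float = 0.5) -> float:
    """Proportional correction: the IMU-measured heading is compared
    with the commanded heading and the (wrapped) error, scaled by a
    gain, is fed back; positive output means steer to starboard."""
    return gain * wrap_angle(commanded_deg - measured_deg)
```

A real controller would add integral and derivative terms and rate limits; the sketch only shows the compare-and-feed-back structure of the closed loop.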
Preferably, the GNSS receiver is elevated above the unmanned ship platform so that it receives satellite signals better; the inertial measurement unit is mounted at the center of the unmanned ship, and an embedded algorithm, combined with the hull shape parameters, computes the hull motion state in real time; the look-around cameras are four wide-angle cameras giving 360-degree coverage, mounted at the front, rear and both sides slightly above the ship platform, and are mainly used to identify short-range obstacles; the binocular camera is mounted directly ahead on the hull, is mainly used for medium- and long-range scenes, and can clearly identify distant static and dynamic obstacles at sea; the laser radar is mounted directly ahead on the hull and obtains very high speed, range and angular resolution, forming an accurate 3D map with strong anti-jamming capability; four millimeter-wave radars are arranged at the four corners of the hull, forming blind-spot-free coverage that effectively extracts depth and speed information and identifies obstacles; the underwater sonar is located directly below the hull, fully immersed in the water, and scans for underwater obstacles.
Preferably, the chart module comprises a chart importing module and a global preplanning module;
The chart import module imports the electronic chart of the sea area to be worked into the automatic control module in the chart import format and establishes the bottom-layer map data model;
The global preplanning module plans a route between the ship's starting point and destination, according to the imported chart and marine navigation standards, such that the route avoids all known obstacles.
Preferably, the hull propeller of the hull power system adopts double propellers, the propeller has forward and reverse rotation and 5 speed gears, and the unmanned ship can advance, retreat and turn at different speeds by using the differential speed between the two propellers.
Preferably, the path memory module stores the unmanned ship's navigation route and automatically returns along it after the operation is finished; on the next identical operation, the stored chart route can be called up preferentially during navigation for more intelligent operation.
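The return behaviour of the path memory module amounts to retracing the recorded route in reverse. A minimal sketch, with the waypoint format assumed to be (x, y) pairs:

```python
def return_route(outbound_waypoints: list) -> list:
    """Return leg: the recorded outbound waypoints, retraced
    last-first, so the ship sails back to its starting point."""
    return list(reversed(outbound_waypoints))
```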
The beneficial effects are as follows. The fully automatic navigation system is a comprehensive system integrating environment perception, planning and decision, motion control and other functions; during navigation it depends on the collection, fusion and processing of, and reaction to, static and dynamic data about the surrounding environment from multiple sensors, and on the analysis of complex scenes. By laying out the sensors and distributing the overall working framework reasonably, all-round sensing is achieved with the minimum number of sensors, improving the efficiency and intelligence of the whole operation.
Drawings
FIG. 1 is a schematic structural diagram of the present invention.
Detailed Description
The preferred embodiments of the invention are described in detail below with reference to the accompanying drawings (see FIG. 1). It should be noted that the terms "first," "second," and the like in the description, claims and drawings of this application are used to distinguish similar elements and not necessarily to describe a particular sequential or chronological order; the data so used are interchangeable under appropriate circumstances, such that the embodiments described herein can, for example, be practiced in sequences other than those illustrated or described.
Referring to FIG. 1, the invention is an intelligent-sensing-based fully automatic navigation heterogeneous system for an unmanned ship, comprising an unmanned ship hull platform, a hull power system, a remote control module, a chart module, a path memory module, a sensor module and an automatic control module. The sensor module comprises a GNSS receiver, an inertial measurement unit, a look-around camera, a binocular camera, a laser radar, a millimeter-wave radar and an underwater sonar; the automatic control module comprises a data acquisition subsystem, a data fusion subsystem, a planning decision subsystem and a motion control subsystem.
The data acquisition subsystem mainly completes acquisition and preprocessing of sensor data. It adopts an FPGA + CPU + GPU architecture as the main processing platform, which meets the system's demand for parallel high-speed processing, makes complex algorithms easy to implement, and greatly improves computing performance. All simple multiply-accumulate operations that can run in parallel are implemented in the FPGA, while the CPU handles only complex operations such as iteration and optimization, minimizing the CPU's load and improving the real-time performance of the system.
Data collected by the unmanned ship's data collection subsystem fall mainly into image data, radar data and inertial navigation data. Image data collection comprises look-around image collection and forward-looking image collection. Look-around collection uses the four wide-angle cameras with 360-degree coverage, mounted at the front, rear and both sides slightly above the hull platform; triggered by a system synchronization signal, the images from the cameras are captured simultaneously, converted to digital form and passed to the CPU on the acquisition module, which applies conventional processing such as noise reduction, filtering and white balance to the raw data and then stitches and corrects the pictures with an image-splicing algorithm to form an omnidirectional image of the surroundings. Feature and other useful information is extracted from the image by an image-recognition algorithm, time-stamped and passed to the data fusion subsystem for further processing. Forward-looking collection uses the two cameras of the binocular pair to capture images from different viewing angles; from the matching relation between pixels, the processing algorithm computes the pixel offset and, by triangulation, the object's three-dimensional information including its depth of field, from which the actual distance between the object and the cameras and the object's three-dimensional size are calculated. Camera ranging is not fully accurate because of limitations of the sensing principle; radar data collection makes up for this deficiency of visual ranging.
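The binocular triangulation step above reduces to the standard relation Z = f·B/d between focal length, stereo baseline and pixel disparity. A minimal sketch; the function name and parameter values are illustrative:

```python
def depth_from_disparity(focal_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth Z = f * B / d: the offset (disparity) between matched
    pixels in the two camera views yields the object's distance."""
    if disparity_px <= 0.0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px
```

E.g. with an assumed 800 px focal length and 0.2 m baseline, a 16 px disparity corresponds to an object about 10 m away; the smaller the disparity, the farther (and less accurately ranged) the object.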
The radar transmitter emits continuous waves, receives their reflections, records the time difference, and thereby determines the distance between an obstacle and the hull. The object's point cloud is acquired, coordinate-converted and denoised, then clustered by point-cloud distance or reflection intensity, and the clustered features are extracted and classified. Inertial navigation data collection uses a CPLD logic circuit to capture the pulse signals output by the inertial navigation unit; the pulses within a fixed window are sent as digital signals over I/O to the DSP, and after the DSP completes acquisition, the gyroscope and accelerometer data are sent via serial ports to the data fusion subsystem and the host computer.
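The time-difference ranging used by the radar (and by the underwater sonar) follows the same round-trip relation: distance is half the echo delay times the wave's propagation speed. A minimal sketch, with the propagation speeds as assumed approximate constants:

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0   # radar / lidar in air (approx.)
SPEED_OF_SOUND_WATER_MPS = 1500.0    # sonar in sea water (approx.)

def range_from_echo(round_trip_s: float, wave_speed_mps: float) -> float:
    """Distance to the reflector: half the round-trip time multiplied
    by the wave's propagation speed."""
    return wave_speed_mps * round_trip_s / 2.0
```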
The data fusion subsystem mainly performs fusion calculations on the various sensor data; in this embodiment it reuses the hardware architecture of the data acquisition subsystem. Using the time-stamped data provided by the data acquisition subsystem, it clusters the object information detected by all sensors at the same moment, computes the geometric similarity of targets across two or more adjacent clustering results, matches targets by that similarity, confirms successfully matched targets as real targets, and extracts each target's velocity relative to the unmanned ship's coordinate system. From the target distance data supplied by the data acquisition subsystem, it computes the coordinates of each real target's center, its length and its width. The data fusion subsystem then passes the target's category, speed, coordinates, length, width and other attributes, together with a time stamp, to the planning decision subsystem.
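The target-matching step, pairing detections across adjacent clustering results by geometric similarity, can be sketched as follows. The similarity metric and the threshold are illustrative assumptions; the patent does not specify them:

```python
from dataclasses import dataclass

@dataclass
class Target:
    x: float       # center coordinates in the ship frame (m)
    y: float
    length: float  # clustered extent (m)
    width: float

def similarity(a: Target, b: Target) -> float:
    """Geometric similarity in [0, 1]; 1.0 means identical dimensions."""
    dl = abs(a.length - b.length) / max(a.length, b.length)
    dw = abs(a.width - b.width) / max(a.width, b.width)
    return 1.0 - 0.5 * (dl + dw)

def match_targets(frame_a, frame_b, threshold=0.8):
    """Confirm as real targets the pairs whose similarity across two
    adjacent clustering results exceeds the threshold."""
    matches = []
    for a in frame_a:
        best = max(frame_b, key=lambda b: similarity(a, b), default=None)
        if best is not None and similarity(a, best) >= threshold:
            matches.append((a, best))
    return matches
```

A matched pair across two time-stamped frames also yields the target's displacement, hence its velocity relative to the ship.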
The planning decision subsystem establishes a feature model for each real target from the geometric and motion characteristics provided by the data fusion subsystem, and judges and classifies the target type. The target features are rasterized to obtain an obstacle map of the current environment, which is matched and fused with the high-precision electronic chart preinstalled in the system to obtain a fused cognitive map for the current moment. The planning decision subsystem then comprehensively plans and decides the ship's motion state and speed at the current moment and over the next few seconds, according to the ship's position and speed and the relative positions, sizes and speeds of all obstacles, and transmits the result to the motion control subsystem as control signals.
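The rasterized obstacle representation can be sketched as an occupancy grid; the cell size, grid extent and axis-aligned box model of obstacles are illustrative assumptions:

```python
def rasterize(obstacles, cell_m: float = 1.0, size: int = 20):
    """Mark the grid cells covered by axis-aligned obstacle boxes,
    each given as (x, y, length, width) in metres; 1 = occupied."""
    grid = [[0] * size for _ in range(size)]
    for x, y, length, width in obstacles:
        for i in range(int(x // cell_m), int((x + length) // cell_m) + 1):
            for j in range(int(y // cell_m), int((y + width) // cell_m) + 1):
                if 0 <= i < size and 0 <= j < size:
                    grid[j][i] = 1
    return grid
```

The planner would then search this grid (fused with the chart's known obstacles) for a collision-free path.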
The motion control subsystem drives the two propellers according to the control signals from the planning decision subsystem: both propellers rotating forward together produce high-speed forward motion, a speed differential between them accomplishes left and right steering, and reversing the propellers decelerates the ship.
The foregoing is only a preferred embodiment of the present invention. It should be noted that those skilled in the art can make various modifications and variations without departing from the principle of the invention, and such modifications and variations should also be regarded as falling within the protection scope of the invention.
Claims (5)
1. A fully automatic navigation heterogeneous system for an unmanned ship based on intelligent sensing, characterized in that: the system comprises an unmanned ship hull platform, a hull power system, a remote control module, a chart module, a path memory module, a sensor module and an automatic control module;
The unmanned ship hull platform carries all the intelligent control systems and the hull power system;
The hull power system comprises a power supply module and hull propellers; the power supply module supplies power to all equipment of the unmanned ship, and the hull propellers are connected with the power supply module and the automatic control module so as to control the motion of the unmanned ship;
The remote control module is implemented as a mobile phone app or a hand-held remote controller and interacts with the shipborne automatic control module through wireless communication;
The chart module stores the electronic chart;
The path memory module memorizes the route the unmanned ship has travelled, so that it can conveniently return to its starting point after the operation is finished;
The sensor module comprises a GNSS receiver, an inertial measurement unit, a look-around camera, a binocular camera, a laser radar, a millimeter-wave radar and an underwater sonar, providing all-round sensing of surface and underwater obstacles;
The automatic control module comprises a data acquisition subsystem, a data fusion subsystem, a planning decision subsystem and a motion control subsystem, and is directly connected to the shipborne sensors; the data acquisition subsystem controls the multi-source sensors to collect environmental information, and the different types of data they acquire are time-stamped and transmitted to the data fusion subsystem;
The data fusion subsystem performs clustering calculations, grouped by time mark, on the object information detected by all sensors at the same moment in the data prepared by the data acquisition subsystem, and establishes a real-time dynamic model around the unmanned ship. From this model, data such as the relative speed and relative coordinates between the unmanned ship and each obstacle are obtained, time-stamped and transmitted to the planning decision subsystem;
The planning decision subsystem imports electronic chart data through the chart module and builds a bottom-layer map data model. It then fuses the time-stamped dynamic data of the unmanned ship provided by the data fusion subsystem into this model, establishes a real-time target model and classifies targets as dynamic or static; a rasterized representation yields the coordinates, length and width of each static obstacle and the coordinates and motion state of each dynamic obstacle in the current environment;
The planning decision subsystem comprehensively plans and decides the ship's motion state and speed at the current moment and over the next few seconds, according to the unmanned ship's position and speed in the chart and the position, size and speed of every real target in it, forms a control signal through the planning algorithm, and transmits it to the motion control subsystem;
The motion control subsystem receives the control signal from the planning decision subsystem, and the hull propellers respond to it, controlling the ship's direction of motion, speed and steering angle. The ship's inertial measurement unit compares the measured motion state with the control signal in real time, forming closed-loop control so that the unmanned ship can complete a series of motions.
2. The intelligent-sensing-based fully automatic navigation heterogeneous system for an unmanned ship of claim 1, wherein the GNSS receiver is elevated above the unmanned ship platform so that it receives satellite signals better; the inertial measurement unit is mounted at the center of the unmanned ship, and an embedded algorithm, combined with the hull shape parameters, computes the hull motion state in real time; the look-around cameras are four wide-angle cameras giving 360-degree coverage, mounted at the front, rear and both sides slightly above the ship platform, and are mainly used to identify short-range obstacles; the binocular camera is mounted directly ahead on the hull, is mainly used for medium- and long-range scenes, and can clearly identify distant static and dynamic obstacles at sea; the laser radar is mounted directly ahead on the hull and obtains very high speed, range and angular resolution, forming an accurate 3D map with strong anti-jamming capability; four millimeter-wave radars are arranged at the four corners of the hull, forming blind-spot-free coverage that effectively extracts depth and speed information and identifies obstacles; the underwater sonar is located directly below the hull, fully immersed in the water, and scans for underwater obstacles.
3. The intelligent-sensing-based fully automatic navigation heterogeneous system for an unmanned ship of claim 1, wherein the chart module comprises a chart import module and a global preplanning module;
The chart import module imports the electronic chart of the sea area to be worked into the automatic control module in the chart import format and establishes the bottom-layer map data model;
The global preplanning module plans a route between the ship's starting point and destination, according to the imported chart and marine navigation standards, such that the route avoids all known obstacles.
4. The unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing of claim 1, wherein the ship propulsion system adopts dual propulsion units, each supporting forward and reverse rotation with five speed gears; forward motion, reverse motion, and differential turning of the unmanned ship are achieved through the speed difference between the two units.
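The differential-steering scheme of claim 4 can be sketched as a mixer that maps a normalized surge and turn command onto the two propulsion units, each quantized to a rotation direction and one of five speed gears (gear 0 meaning stopped). The mixing and quantization conventions here are assumptions, not taken from the patent:

```python
def thrust_commands(surge, turn):
    """Map surge in [-1, 1] and turn in [-1, 1] to (direction, gear)
    commands for the left and right propulsion units.

    Positive turn slows the right unit relative to the left, producing a
    starboard turn via the speed difference between the two units.
    """
    def quantize(u):
        u = max(-1.0, min(1.0, u))
        gear = round(abs(u) * 5)          # gears 0..5, where 0 = stopped
        direction = "forward" if u >= 0 else "reverse"
        return direction, gear

    left = surge + turn
    right = surge - turn
    # Rescale so neither unit is commanded beyond full throttle.
    peak = max(1.0, abs(left), abs(right))
    return quantize(left / peak), quantize(right / peak)
```

With `surge=0` and full `turn`, the units spin in opposite directions, letting the hull rotate in place; this is the limiting case of the differential turning the claim describes.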
5. The unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing of claim 1, wherein the path memory module stores the unmanned ship's navigation route, automatically navigates back along that route after the operation is finished, and, on the next run of the same operation, can preferentially recall the stored chart route for more intelligent navigation.
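The path memory behavior of claim 5 — record the outbound track, then replay it reversed for the automatic return voyage — could be sketched as below. The class name, the waypoint-thinning rule, and the 2D coordinate representation are illustrative assumptions:

```python
class PathMemory:
    """Records the outbound track and replays it, reversed, as the
    automatic return route; the stored track can also be recalled for
    the next run of the same operation."""

    def __init__(self, min_spacing_m=5.0):
        self.min_spacing_m = min_spacing_m
        self.track = []

    def record(self, x, y):
        # Thin the track: keep a fix only once the ship has moved far
        # enough from the last stored waypoint.
        if not self.track:
            self.track.append((x, y))
            return
        px, py = self.track[-1]
        if ((x - px) ** 2 + (y - py) ** 2) ** 0.5 >= self.min_spacing_m:
            self.track.append((x, y))

    def return_route(self):
        # Follow the recorded waypoints in reverse back to the start point.
        return list(reversed(self.track))
```

Thinning by a minimum spacing keeps the stored route compact without losing its shape; a real system would persist the track alongside the chart so it survives power-down between operations.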
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910814289.3A CN110580044A (en) | 2019-08-30 | 2019-08-30 | unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110580044A true CN110580044A (en) | 2019-12-17 |
Family
ID=68812250
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910814289.3A Withdrawn CN110580044A (en) | 2019-08-30 | 2019-08-30 | unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110580044A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106909145A (en) * | 2017-02-22 | 2017-06-30 | 武汉理工大学 | Unmanned hydrographical survey ship barrier real-time perception obstacle avoidance system and method |
US20170225760A1 (en) * | 2014-12-15 | 2017-08-10 | Leidos, Inc. | System and Method For Fusion of Sensor Data to Support Autonomous Maritime Vessels |
CN107422736A (en) * | 2017-08-03 | 2017-12-01 | 大连海事大学 | A kind of unmanned boat independently makes a return voyage system and its method of work |
CN107577230A (en) * | 2017-08-16 | 2018-01-12 | 武汉理工大学 | A kind of intelligent avoidance collision system towards unmanned boat |
CN107748561A (en) * | 2017-09-25 | 2018-03-02 | 华南理工大学 | A kind of unmanned boat part obstacle avoidance system and method based on more parameter sensings |
CN108469817A (en) * | 2018-03-09 | 2018-08-31 | 武汉理工大学 | The unmanned boat obstruction-avoiding control system merged based on FPGA and information |
CN109636921A (en) * | 2018-12-17 | 2019-04-16 | 武汉理工大学 | Intelligent vision ship sensory perceptual system and data processing method based on cloud platform |
CN110058597A (en) * | 2019-06-19 | 2019-07-26 | 奥特酷智能科技(南京)有限公司 | A kind of automatic Pilot heterogeneous system and implementation method |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113010958A (en) * | 2019-12-20 | 2021-06-22 | 财团法人船舶暨海洋产业研发中心 | Simulation system of self-propelled ship and operation method thereof |
CN111277792A (en) * | 2020-01-08 | 2020-06-12 | 北京航天发射技术研究所 | Small boat control system and method based on mobile terminal |
CN111508005A (en) * | 2020-03-02 | 2020-08-07 | 北京优世达科技有限公司 | Unmanned ship overwater obstacle autonomous detection system based on binocular vision |
CN111324126A (en) * | 2020-03-12 | 2020-06-23 | 集美大学 | Visual unmanned ship and visual navigation method thereof |
CN111324126B (en) * | 2020-03-12 | 2022-07-05 | 集美大学 | Vision unmanned ship |
CN111366959B (en) * | 2020-03-16 | 2021-11-30 | 华中科技大学 | Unmanned ship recovery method and system based on images |
CN111366959A (en) * | 2020-03-16 | 2020-07-03 | 华中科技大学 | Unmanned ship recovery method and system based on images |
CN111694072A (en) * | 2020-06-21 | 2020-09-22 | 黄河勘测规划设计研究院有限公司 | Multi-platform and multi-sensor development system integration and data processing platform |
CN112051574A (en) * | 2020-08-05 | 2020-12-08 | 华友天宇科技(武汉)股份有限公司 | Automatic rotary tillage ship based on high-precision map |
CN112180915A (en) * | 2020-09-16 | 2021-01-05 | 哈尔滨工业大学(威海) | ROS-based double-thrust unmanned ship motion control system and control method |
CN113156960A (en) * | 2021-04-28 | 2021-07-23 | 苏州优世达智能科技有限公司 | Unmanned ship autonomous obstacle avoidance system based on combination of binocular vision and millimeter wave radar |
CN113238538A (en) * | 2021-06-08 | 2021-08-10 | 浙江大虫科技股份有限公司 | Visual intelligent command scheduling platform |
CN113602416A (en) * | 2021-08-09 | 2021-11-05 | 山东交通学院 | Unmanned ship vision perception system |
CN113917930A (en) * | 2021-11-11 | 2022-01-11 | 中国船舶重工集团公司第七一九研究所 | Unmanned ship navigation state control method based on sensing data |
CN115586777A (en) * | 2022-11-04 | 2023-01-10 | 广西壮族自治区水利电力勘测设计研究院有限责任公司 | Unmanned ship remote measurement control method for water depth measurement |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110580044A (en) | unmanned ship full-automatic navigation heterogeneous system based on intelligent sensing | |
CN110414396B (en) | Unmanned ship perception fusion algorithm based on deep learning | |
CN109283538B (en) | Marine target size detection method based on vision and laser sensor data fusion | |
US10943355B2 (en) | Systems and methods for detecting an object velocity | |
CN110850403B (en) | Multi-sensor decision-level fused intelligent ship water surface target feeling knowledge identification method | |
CN109084747B (en) | Waterborne traffic panoramic three-dimensional navigation system and method based on universal three-dimensional engine | |
Cheng et al. | Are we ready for unmanned surface vehicles in inland waterways? The usvinland multisensor dataset and benchmark | |
KR20220155559A (en) | Autonomous navigation method using image segmentation | |
CN108303988A (en) | A kind of the target identification tracing system and its working method of unmanned boat | |
CN110175186A (en) | A kind of intelligent ship environmental threat target apperception system and method | |
US11514668B2 (en) | Method and device for situation awareness | |
CN207908979U (en) | A kind of target identification tracing system of unmanned boat | |
CN111524392B (en) | Comprehensive system for assisting intelligent ship remote driving | |
US20220024549A1 (en) | System and method for measuring the distance to an object in water | |
CA2950791A1 (en) | Binocular visual navigation system and method based on power robot | |
CN109515086A (en) | Hydrospace detection robot and its operational method | |
KR20210007767A (en) | Autonomous navigation ship system for removing sea waste based on deep learning-vision recognition | |
KR102466804B1 (en) | Autonomous navigation method using image segmentation | |
Clunie et al. | Development of a perception system for an autonomous surface vehicle using monocular camera, lidar, and marine radar | |
CN113124864A (en) | Water surface navigation method adopting machine vision and inertial navigation fusion | |
CN113110514A (en) | Unmanned ship navigation obstacle avoidance system and method based on big data | |
KR102278674B1 (en) | Method and device for monitoring harbor and ship considering sea level | |
Joshi et al. | Underwater exploration and mapping | |
CN115031718A (en) | Unmanned ship synchronous positioning and mapping method (SLAM) and system with multi-sensor fusion | |
Yao et al. | Waterscenes: A multi-task 4d radar-camera fusion dataset and benchmark for autonomous driving on water surfaces |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WW01 | Invention patent application withdrawn after publication | | Application publication date: 2019-12-17