KR20170089574A - System for managing obstacle of ship and method for managing obstacle - Google Patents


Info

Publication number
KR20170089574A
Authority
KR
South Korea
Prior art keywords
obstacle
received
camera module
obstacles
information
Prior art date
Application number
KR1020160009952A
Other languages
Korean (ko)
Other versions
KR101823029B1 (en)
Inventor
박성규
이인성
류승각
유진열
한성곤
박소희
Original Assignee
대우조선해양 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 대우조선해양 주식회사 filed Critical 대우조선해양 주식회사
Priority to KR1020160009952A priority Critical patent/KR101823029B1/en
Publication of KR20170089574A publication Critical patent/KR20170089574A/en
Application granted granted Critical
Publication of KR101823029B1 publication Critical patent/KR101823029B1/en


Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B63: SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B: SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B35/00: Vessels or similar floating structures specially adapted for specific purposes and not otherwise provided for
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B63: SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B: SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B43/00: Improving safety of vessels, e.g. damage control, not otherwise provided for
    • B63B43/18: Improving safety of vessels, e.g. damage control, not otherwise provided for, preventing collision or grounding; reducing collision damage
    • B63B43/20: Feelers
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B63: SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B: SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B49/00: Arrangements of nautical instruments or navigational aids
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00: Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01: Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • H04N5/225
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B63: SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B: SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B35/00: Vessels or similar floating structures specially adapted for specific purposes and not otherwise provided for
    • B63B2035/006: Unmanned surface vessels, e.g. remotely controlled
    • B63B2035/007: Unmanned surface vessels, e.g. remotely controlled, autonomously operating
    • B63B2732/00

Abstract

The present invention relates to a system and a method for managing obstacles of a ship in which signals concerning the various kinds of obstacles located near an autonomously sailing unmanned ship are received, the directing angle of a camera module is changed so that the camera module first points at the position of an obstacle, and the degree of danger of the obstacle, determined by obtaining and analyzing the surrounding image and temperature, is transmitted to an autonomous sailing module. According to one embodiment of the present invention, there is provided an obstacle management method for an obstacle management system comprising a camera module for obtaining surrounding images of the unmanned ship and an autonomous sailing module for controlling the autonomous sailing of the unmanned ship. The method comprises the steps of: receiving positional message information of an obstacle measured by a radar installed on the unmanned ship; calculating the directing angle of the camera module using the distance to the obstacle included in the received positional message information; determining the degree of danger of the obstacle using the surrounding image and temperature received from the camera module driven at the calculated directing angle; and transmitting the determined degree of danger of the obstacle to the autonomous sailing module.

Description

Technical Field [0001] The present invention relates to an obstacle management system and an obstacle management method for a ship.

The present invention relates to an obstacle management system and an obstacle management method for a ship. More particularly, it relates to an obstacle management system and method that receive signals concerning the various kinds of obstacles located around an unmanned ship, change the directing angle of a camera module so that the camera first points at the position of an obstacle, determine the degree of danger of the obstacle by analyzing the surrounding image and temperature obtained at the changed directing angle, and transmit the determined degree of danger to an autonomous navigation module.

An unmanned vessel, i.e., a drone ship, refers to a ship that can automatically navigate a defined route without a crew and whose navigation and engine equipment (e.g., engines and rudder devices) can, if necessary, be controlled from a remote control center.

A remote control center for remotely maneuvering unmanned vessels and fleets is therefore needed on shore, and to address both technical and legal issues the master and chief engineer must exercise direct command from that remote control center.

Unmanned ships of this kind have been the subject of a number of filings, including Korean Registered Patent No. 0734814 (Jun. 27, 2007).

Such unmanned vessels use X-band and S-band radar to recognize the presence and direction of obstacles, which are then identified with the naked eye with reference to the AIS (Automatic Identification System). At night, however, it is difficult to identify objects, and navigation becomes risky.

In addition, although cameras for both daytime and nighttime use are available, obstacles cannot be judged automatically from the images obtained through such cameras, so it is difficult to guarantee the safety of an unmanned vessel under autonomous navigation.

Meanwhile, systems for recognizing obstacles around a ship are disclosed in Korean Patent Publication No. 1233698 (2013.02.06) and Korean Patent Registration No. 1457171 (Oct. 27, 2014).

However, of these, Korean Patent Registration No. 1233698 detects obstacles using only a motion sensor, and Korean Patent Registration No. 1457171 recognizes a dangerous object only once the object seen by the camera has grown large, so the object can be avoided only after it has been recognized; at night it is difficult to identify objects at all, and navigation remains dangerous.

Therefore, a system is required that can be applied to an unmanned ship, automatically recognizes signals concerning the various kinds of obstacles located around the ship, first points a camera at the position of each obstacle, determines the degree of danger of the obstacle from the high-resolution image and temperature obtained there, and transmits the determined degree of danger to an autonomous navigation module or a remote control module.

Korean Registered Patent No. 0734814 (Jun. 27, 2007), "Autonomous Unmanned Ship"; Korean Patent Registration No. 1233698 (Feb. 23, 2013), "Obstacle Detection Apparatus and Method"; Korean Registered Patent Publication No. 1457171 (Oct. 27, 2014), "Vessel Situation Recognition System".

It is an object of the present invention to provide an obstacle management system and an obstacle management method for a ship that receive signals concerning the various kinds of obstacles located around an unmanned ship under autonomous navigation, change the directing angle of a camera module so that it matches the position of an obstacle, analyze the surrounding image and temperature obtained at that angle, and transmit the determined degree of danger of the obstacle to an autonomous navigation module.

According to an aspect of the present invention, there is provided an obstacle management method for an obstacle management system including a camera module for acquiring surrounding images of an unmanned ship and an autonomous navigation module for controlling autonomous navigation of the unmanned vessel, the method comprising: receiving positional message information of an obstacle measured by a radar installed on the unmanned vessel; calculating a directing angle of the camera module using the distance to the obstacle included in the received positional message information; determining the degree of danger of the obstacle based on the surrounding image and temperature received from the camera module driven at the calculated directing angle; and transmitting the determined degree of danger of the obstacle to the autonomous navigation module.

In the calculating step, the altitude of the camera module may be estimated using the altitude received from a GPS receiver and the altitude difference between the installed position of the GPS receiver and the preset mounting position of the camera module, and the directing angle of the camera module may be calculated using the estimated altitude and the distance to the obstacle.

Also, the obstacle management method according to an embodiment of the present invention may further include, after the receiving step, calculating the coordinates of the radar using the coordinates received from the GPS receiver and the positional difference between the GPS receiver and the radar, estimating the coordinates of the obstacle using the calculated radar coordinates and the direction angle and distance contained in the positional message information of the obstacle, and storing and managing the estimated coordinates in an obstacle database.

Further, the obstacle management method according to an embodiment of the present invention may further include, after the receiving step, receiving message information of an obstacle provided by an AIS installed on the unmanned vessel, and in the storing and managing step, when the message information of the obstacle is received, an obstacle whose coordinates match the coordinates included in the message information may be searched for, and the received message information may be mapped to the found obstacle and stored in the obstacle database.

The method may further include, after the calculating step, applying an operation command to a pan motor and a tilt motor attached to the camera module so that the camera module is driven to the calculated directing angle.

The determining step may comprise: extracting the outline of the received surrounding image and the pixels of the image; estimating the area of the obstacle using the extracted outline and pixels; comparing the area of the obstacle and the obtained temperature with obstacle classification information stored in a database to determine the type of the obstacle; and calculating the degree of danger of the obstacle based on the specific gravity assigned to the type of obstacle, the estimated area of the obstacle, and the relative speed of the obstacle.

In calculating the degree of danger of the obstacle, the information having the highest preset priority among the azimuth and distance information of the obstacle received from the radar, the message information of the obstacle received from the AIS, and the distance and azimuth of the obstacle obtained by the camera module may be reflected.

Further, the obstacle management method according to an embodiment of the present invention may further include, after the transmitting step: receiving the degree of danger of the obstacle and displaying the obstacles within a certain distance on a display unit; and providing the image information obtained through the camera module on the display unit when a specific obstacle among the displayed obstacles is selected.

In the displaying step, the obstacles displayed on the display unit may be shown in descending order of the degree of risk among the obstacles located within a predetermined distance of the unmanned vessel.

According to another embodiment of the present invention, there is provided an obstacle management system including a camera module for acquiring surrounding images of an unmanned ship, an autonomous navigation module for controlling autonomous navigation of the unmanned vessel, and a central processing module that receives the positional message information of an obstacle measured by a radar installed on the unmanned vessel, determines the degree of danger of the obstacle through the surrounding image and temperature received from the camera module driven at a directing angle calculated using the distance to the obstacle included in the received positional message information, and transmits the determined degree of danger of the obstacle to the autonomous navigation module.

The central processing module may estimate the altitude of the camera module using the altitude received from the GPS receiver and the altitude difference between the installed position of the GPS receiver and the preset mounting position of the camera module, and may calculate the directing angle of the camera module using the estimated altitude and the distance to the obstacle.

The central processing module may calculate the coordinates of the radar using the coordinates received from the GPS receiver and the positional difference between the GPS receiver and the radar, estimate the coordinates of the obstacle using the calculated radar coordinates and the direction angle and distance received in the positional message information of the obstacle, and store and manage the estimated coordinates in the obstacle database.

When the central processing module receives message information of an obstacle provided by the AIS installed on the unmanned vessel, it may search for an obstacle whose coordinates match the coordinates included in the message information, map the received message information to the found obstacle, and store it.

The central processing module may extract the outline of the received surrounding image and the pixels of the image, estimate the area of the obstacle using the extracted outline and pixels, determine the type of the obstacle by comparing the estimated area and the obtained temperature with the obstacle classification information stored in the database, and calculate the degree of danger of the obstacle using the specific gravity assigned to the type of obstacle, the estimated area of the obstacle, and the relative speed of the obstacle.

The central processing module may calculate the degree of danger of the obstacle by reflecting the information having the highest preset priority among the azimuth and distance information of the obstacle received from the radar, the message information of the obstacle received from the AIS, and the distance and azimuth of the obstacle obtained by the camera module.

In addition, the obstacle management system according to another embodiment of the present invention may further include a remote control module that receives the degree of danger of the obstacle, displays the obstacles within a certain distance on a display unit, and provides the image information obtained through the camera module on the display unit when a specific obstacle among the displayed obstacles is selected.

The remote control module may display the obstacles on the display unit in descending order of the degree of risk among the obstacles located within a predetermined distance of the ship.

According to embodiments of the present invention, signals concerning the various types of obstacles located around the unmanned ship during autonomous navigation are received, the directing angle of the camera module is changed so that the camera is aligned with the position of an obstacle, and the degree of danger determined for the obstacle is transmitted to the autonomous navigation module. This makes it possible to take decisions such as avoiding obstacles and rescuing drifting ships. Navigation safety is further improved by also communicating the degree of danger of obstacles to the on-shore control center where the remote control module is installed.

FIG. 1 is a block diagram illustrating an obstacle management system for a ship according to an embodiment of the present invention;
FIG. 2 is a block diagram illustrating the camera module shown in FIG. 1;
FIG. 3 is a block diagram illustrating the central processing module shown in FIG. 1;
FIG. 4 is a block diagram illustrating the remote control module shown in FIG. 1;
FIG. 5 is an operational flowchart illustrating an obstacle management method using the obstacle management system of a ship according to an embodiment of the present invention; and
FIG. 6 is a view showing a screen for monitoring the positions of obstacles with respect to the unmanned ship.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a block diagram illustrating a ship obstacle management system according to an embodiment of the present invention, FIG. 2 is a block diagram of the camera module shown in FIG. 1, FIG. 3 is a block diagram of the central processing module shown in FIG. 1, and FIG. 4 is a block diagram of the remote control module shown in FIG. 1.

Referring to FIG. 1, a ship obstacle management system according to an embodiment of the present invention includes a camera module 10, a central processing module 20, an interface module 30, a remote control module 40, an autonomous navigation module 50, and the like.

The obstacle management system of the vessel shown in FIG. 1 is a danger notification system applied to an unmanned vessel capable of autonomous operation, but it may also be applied to a manned vessel. For a manned ship, the degree of danger of an obstacle can be provided to the main controller installed in the control room.

Hereinafter, the description will be limited to an unmanned ship.

The camera module 10 acquires images and temperatures of the objects located around the unmanned ship.

Referring to FIG. 2, the camera module 10 includes an integrated day/night camera, that is, a first camera 10a (for example, an HD camera) and a second camera 10b (for example, an LWIR camera); a pan motor 10c and a tilt motor 10d, installed on the integrated day/night camera, for adjusting the directing angles of the first and second cameras 10a and 10b; and a communication section 10e for communication with the central processing module 20.

The first and second cameras 10a and 10b rotate sequentially to monitor all directions and acquire images of the surroundings of the unmanned ship.

The pan motor 10c is a motor capable of rotating the first and second cameras 10a and 10b through 360 degrees, and the tilt motor 10d is a motor capable of tilting the first and second cameras 10a and 10b through 90 degrees.

The central processing module 20 first changes the directing angle of the camera module 10 using the distance and direction angle to the obstacle included in the positional message information of the obstacle measured by the radar 4 installed on the unmanned vessel, determines the degree of danger of the obstacle using the surrounding image and temperature acquired through the camera module 10 driven at the changed directing angle, and transmits the determined degree of danger to the autonomous navigation module 50 and to the remote control module 40 installed in the control center on land.

Here, the radar 4 emits X-band or S-band radio waves; it may be a RADAR, an apparatus that estimates the presence of and distance to an obstacle, or an ARPA, which additionally provides collision-related information such as the CPA and TCPA, and an ARPA is preferably used.

The central processing module 20 receives all the data related to the unmanned ship through the interface module 30.

The interface module 30 is connected to a GPS receiver 1, a COMPASS 2, an AIS (Automatic Identification System, a ship automatic identification device) 3, and a radar 4. The interface module 30 receives the NMEA 0183 signals of the various devices installed on the unmanned ship, converts them into XML format, and transfers the converted data to the central processing module 20.

The interface module 30 receives position information including the latitude and longitude of the unmanned ship from the GPS receiver 1, receives the heading measured by the COMPASS 2, receives the message information of obstacles provided by the AIS 3, and receives the positional message information and coordinate information of the obstacles located around the unmanned ship measured by the radar 4. When only the positional message information of an obstacle is received from the radar 4 and the coordinate information of the obstacle is not received, the central processing module 20 calculates the coordinates of the radar 4 using the coordinates received from the GPS receiver 1 and the positional difference (x, y, z, or azimuth and distance) between the GPS receiver 1 and the radar 4, and estimates the coordinate information of the obstacle using the calculated radar coordinates together with the azimuth and distance of the obstacle received from the radar 4.
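
As a concrete illustration of the interface module's role, the sketch below converts a single NMEA 0183 sentence into a flat XML element in Python. It is a minimal sketch only: the example TLL sentence and the generic field names are assumptions made for illustration and do not come from the patent.

```python
# Minimal sketch (not the patent's implementation) of converting one NMEA 0183
# sentence, as produced by the radar or AIS, into XML for the central
# processing module. The sample sentence and field naming are illustrative.
import xml.etree.ElementTree as ET

def nmea_checksum_ok(sentence: str) -> bool:
    """Validate the optional *hh checksum of an NMEA 0183 sentence."""
    if "*" not in sentence:
        return True
    body, checksum = sentence.strip().lstrip("$").rsplit("*", 1)
    calc = 0
    for ch in body:
        calc ^= ord(ch)                      # NMEA checksum is an XOR of the body
    return f"{calc:02X}" == checksum.upper()

def nmea_to_xml(sentence: str) -> str:
    """Wrap the comma-separated NMEA fields in a flat XML element."""
    if not nmea_checksum_ok(sentence):
        raise ValueError("bad NMEA checksum")
    body = sentence.strip().lstrip("$").split("*", 1)[0]
    fields = body.split(",")
    root = ET.Element("nmea", talker=fields[0][:2], type=fields[0][2:])
    for idx, value in enumerate(fields[1:], start=1):
        ET.SubElement(root, f"field{idx}").text = value
    return ET.tostring(root, encoding="unicode")

# Example: a radar TLL (target latitude/longitude) sentence without a checksum.
print(nmea_to_xml("$RATLL,01,3537.50,N,12945.10,E,TGT01,082015.00,T"))
```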

The AIS 3 receives, through VHF communication, message information of obstacles including other ships operating around the unmanned ship. Here, the message information of an obstacle is, for example, one of the NMEA 0183 messages received from the AIS 3 and consists of a message encoded according to a predetermined protocol, for example the ITU-R M.1371 protocol. The message information of such an obstacle includes the Maritime Mobile Service Identity (MMSI) of the target vessel, the target ship's name, ship type, size, position, speed, heading, route, loading status, cargo, and destination.

When the central processing module 20 receives the positional message information of an obstacle from the radar 4 installed on the unmanned ship, it stores the positional message information under an obstacle item; if the information concerns an obstacle already stored, only the changed information is updated, and if it is positional message information for a new obstacle, a new obstacle item is generated and stored. When an obstacle item is stored or updated, the coordinate information of the obstacle received from the radar 4 is stored with it, or the coordinates of the obstacle are estimated as described above and stored.

When the message information of an obstacle provided by the AIS 3 is received, the central processing module 20 updates it to the latest message information, searches, using the coordinate information included in the updated message information, for the stored obstacle whose coordinates from the radar 4 match, and assigns identification information to the found obstacle. In the obstacle database, the message information of obstacles is tabulated and the assigned identification information is stored and managed as a unique key. Because the message information of an obstacle provided by the AIS 3 arrives at an interval of a few seconds, and the coordinate information of the obstacle from the radar 4 is likewise received within a few seconds, the coordinate information of the matching obstacle can be searched and the unique number mapped immediately after the message information is received from the AIS 3.

The central processing module 20 receives from the radar 4 the positional message information of obstacles, including the azimuth and distance of each obstacle relative to the unmanned vessel, the CPA, the TCPA, and the like. When the coordinate information of an obstacle is also received, the obstacle list is updated with the received coordinate information, and the final obstacle list is re-sorted on the basis of the CPA (the closest distance of approach to the unmanned vessel) or the TCPA (the time to the closest point of approach). If the central processing module 20 receives only the positional message information of an obstacle from the radar 4, it estimates the actual coordinate information of the obstacle using the azimuth and distance included in the positional message information and updates the obstacle list accordingly.

That is, immediately after receiving the message information of an obstacle provided by the AIS 3 via the interface module 30, the central processing module 20 searches the obstacle list for the actual obstacle whose coordinates coincide with the coordinate information included in that message, and updates the obstacle list by adding the information provided by the AIS 3, such as the ship name, MMSI, ship size, ship type, heading, speed, and route.
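
A minimal sketch of how such an obstacle list might be kept and re-sorted is shown below. The fields and the rule of sorting by TCPA, then CPA, follow the description above, while the class and function names are illustrative assumptions rather than the patent's code.

```python
# Illustrative sketch of an obstacle list re-sorted by urgency after each
# radar (TTM) update; names and structure are assumptions, not the patent's code.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Obstacle:
    track_id: str          # identifier assigned when the track is first stored
    bearing_deg: float     # azimuth of the obstacle relative to the unmanned ship
    distance_m: float      # range to the obstacle
    cpa_m: float           # closest point of approach
    tcpa_s: float          # time to the closest point of approach
    name: str = ""         # filled in when a matching AIS (VDM) message arrives

obstacle_db: Dict[str, Obstacle] = {}

def update_from_radar(track: Obstacle) -> List[Obstacle]:
    """Insert or refresh a track, then return the list ordered by urgency."""
    obstacle_db[track.track_id] = track
    # A smaller TCPA (or CPA) means the obstacle must be handled first.
    return sorted(obstacle_db.values(), key=lambda o: (o.tcpa_s, o.cpa_m))
```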

Thereafter, the central processing module 20 calculates the azimuth angle at which the camera module 10 should aim using the azimuth of the obstacle and the distance to the obstacle, estimates the altitude of the camera module 10 using the altitude received from the GPS receiver 1 and the altitude difference between the installed position of the GPS receiver 1 and the preset mounting position of the camera module 10, calculates the altitude angle at which the camera module 10 should aim using the estimated altitude and the distance to the obstacle included in the message information of the obstacle, and applies an operation command to the pan motor 10c and the tilt motor 10d attached to the camera module 10 so that they are driven to a directing angle comprising the calculated azimuth and altitude angles.
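
The directing-angle computation described above can be sketched as follows. The geometry (bearing relative to the heading for the pan angle, arctangent of camera height over range for the tilt angle) is an assumption consistent with the description, and the parameter names are illustrative.

```python
# A minimal sketch, under the stated assumptions, of computing the pan (azimuth)
# and tilt (altitude) angles at which the camera module should aim.
import math

def camera_directing_angle(obstacle_bearing_deg: float,
                           ship_heading_deg: float,
                           gps_altitude_m: float,
                           camera_offset_above_gps_m: float,
                           obstacle_distance_m: float):
    # Pan angle: bearing of the obstacle relative to the bow, 0..360 degrees.
    pan_deg = (obstacle_bearing_deg - ship_heading_deg) % 360.0

    # Estimated camera height above the water: GPS altitude plus the preset
    # mounting offset between the GPS receiver and the camera (may be negative).
    camera_height_m = gps_altitude_m + camera_offset_above_gps_m

    # Tilt angle: look down from the camera height toward the obstacle's range.
    tilt_deg = -math.degrees(math.atan2(camera_height_m, obstacle_distance_m))
    return pan_deg, tilt_deg

# e.g. an obstacle bearing 080 deg while the ship heads 045 deg, 800 m away,
# seen from a camera roughly 12 m above the waterline:
print(camera_directing_angle(80.0, 45.0, 10.0, 2.0, 800.0))
```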

The central processing module 20 receives the surrounding image and temperature obtained through the camera module 10 set to the changed directing angle, estimates the area of the obstacle using the outline of the obtained surrounding image and its image pixels, determines the type of the obstacle by comparing the estimated area and the received temperature with the obstacle classification information stored in the obstacle database, and transmits to the autonomous navigation module 50 and the remote control module 40 the degree of danger of the obstacle calculated using the specific gravity assigned to the determined type of obstacle, the area of the obstacle, and the relative speed of the obstacle. The central processing module 20 also receives and manages various settings, such as the exact position at which the camera module 10 is installed, for example its coordinates relative to the radar 4 and relative to the GPS receiver 1.

The degree of danger of an obstacle may be calculated by reflecting the information with the highest priority among the message information of the obstacle received from the AIS 3, the positional message information of the obstacle received from the radar 4, and the distance and azimuth of the obstacle obtained by the camera module 10. The priorities are shown in Table 1 below.

[Table 1]
Priority, from high to low: TTM, TLL, VDM, Vincenty Formulae, Camera Image Processing, Camera drive motor
Update Time: V V V
Obstacle number: V V
Obstacle distance: V V
Obstacle name: V
Relative speed: V V V
...

Here, TTM is the positional message information of an obstacle received from the radar 4, TLL is the coordinate (latitude/longitude) information of an obstacle received from the radar 4, and VDM is the message information of an obstacle received from the AIS 3.
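
The row-by-row assignment of the V marks in Table 1 is not fully recoverable from the text, but the rule it encodes, namely that each attribute of an obstacle is taken from the highest-priority source that reports it, can be sketched as follows; the attribute names are illustrative.

```python
# Hedged sketch of the priority rule behind Table 1: keep, for every attribute,
# the value reported by the highest-priority source (TTM > TLL > VDM >
# Vincenty formulae > camera image processing > camera drive motor).
SOURCE_PRIORITY = ["TTM", "TLL", "VDM", "VINCENTY", "IMAGE", "MOTOR"]

def fuse(reports: dict) -> dict:
    """reports maps a source name to the attribute values that source provided."""
    fused = {}
    for source in reversed(SOURCE_PRIORITY):   # lowest priority first ...
        fused.update(reports.get(source, {}))  # ... so higher sources overwrite
    return fused

merged = fuse({
    "VDM": {"name": "MV EXAMPLE", "distance_m": 1520.0},
    "TTM": {"distance_m": 1490.0, "relative_speed_kn": 4.2},
})
print(merged)  # distance comes from the radar TTM, the name only from the AIS VDM
```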

Referring to FIG. 3, the central processing module 20 includes an image format changing unit 20a, a remote image streaming unit 20b, an image processing unit 20c, an obstacle database 20d, a first communication unit 20e, a drive control unit 20f, a coordinate calculation unit 20g, an obstacle management unit 20h, a second communication unit 20i, a setting management unit 20j, an information collecting unit 20k, and a third communication unit 20l.

The image format changing unit 20a compresses and converts the image files received from the camera module 10 into a format suitable for remote transmission.

The remote image streaming unit 20b connects to the camera module 10 and provides a live image so that live images can be checked.

The image processing unit 20c performs image-based analysis on the surrounding image received from the camera module 10, such as removing the fixed regions of the unmanned ship, recognizing the horizon line, measuring the size of an obstacle, and converting its shape into a vector image.

In addition, the image processing unit 20c analyzes the surrounding image to remove the deck of the unmanned ship from the received image and to select obstacles, that is, objects: when an object occupying at least 3 pixels is captured below the recognized horizon line, it is observed periodically as a candidate obstacle, and when it comes to occupy 7 or more pixels, it is selected as an obstacle.
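
The pixel thresholds described above reduce to a simple rule, sketched below; blob extraction and horizon detection themselves are outside this sketch.

```python
# Sketch of the selection rule: a blob below the recognised horizon line becomes
# a candidate at 3 or more pixels and a confirmed obstacle at 7 or more pixels.
def classify_blob(pixel_count: int, below_horizon: bool) -> str:
    if not below_horizon:
        return "ignore"        # above the horizon: sky, not a surface obstacle
    if pixel_count >= 7:
        return "obstacle"
    if pixel_count >= 3:
        return "candidate"     # keep observing this object periodically
    return "ignore"
```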

The obstacle database 20d stores obstacle classification information that can determine the type of the obstacle by using the size, shape, and temperature of the obstacle.

The first communication unit 20e receives the peripheral image and the temperature acquired through the camera module 10 and transmits the operation command generated through the drive control unit 20f to the camera module 10.

The drive control unit 20f generates operation commands for the pan motor 10c and the tilt motor 10d according to the directing angle to which the camera is to be moved.

The coordinate calculation unit 20g calculates the coordinate information of an obstacle using the coordinates (position) of the unmanned ship, the relative angle of the obstacle, and the distance to the obstacle, or conversely calculates the relative angle and distance of an obstacle from its coordinate information. The coordinate calculation unit 20g may perform these calculations, for example, through the Vincenty algorithm.

More precisely, the coordinate calculation unit 20g can also calculate the coordinate information of an obstacle using the tilt angle of the tilt motor 10d and the azimuth angle of the pan motor 10c of the camera module 10, the distance information of the obstacle, the coordinate information of the unmanned ship, and the Vincenty algorithm described above.
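
The patent names the Vincenty formulae for this conversion; the sketch below substitutes the simpler spherical direct-geodesic formula (an assumption, adequate over radar ranges) to turn the ship's coordinates, a bearing, and a distance into obstacle coordinates.

```python
# Spherical stand-in for the Vincenty direct problem: given own-ship latitude and
# longitude, a bearing and a distance, return the obstacle's coordinates.
import math

EARTH_RADIUS_M = 6371000.0

def obstacle_coordinates(lat_deg: float, lon_deg: float,
                         bearing_deg: float, distance_m: float):
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    brg = math.radians(bearing_deg)
    ang = distance_m / EARTH_RADIUS_M            # angular distance on the sphere

    lat2 = math.asin(math.sin(lat1) * math.cos(ang) +
                     math.cos(lat1) * math.sin(ang) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(ang) * math.cos(lat1),
                             math.cos(ang) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

# Obstacle 2 km away on a bearing of 060 degrees from 35.1 N, 129.0 E:
print(obstacle_coordinates(35.1, 129.0, 60.0, 2000.0))
```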

The obstacle management unit 20h assigns identification information to the obstacles reported by the external devices, that is, the AIS 3 and the radar 4, and to the obstacle coordinate information produced by image analysis, and stores and manages them in the obstacle database 20d.

The second communication unit 20i is a means for communicating with the remote control module 40: it transmits the degree of danger of an obstacle to the remote control module 40 and receives steering values for remote control from the remote control module 40.

The setting management unit 20j stores and manages various setting information, such as the size of the unmanned ship, the installation position of the camera module 10 (distance to the bow, distance to the stern, distance to the port side, distance to the starboard side, height above the deck), the installation position of the radar 4, the installation position of the GPS receiver 1, the administrator ID/PW, the output image format, and whether the streaming unit is activated. The various setting information may be stored in the obstacle database 20d described above or in a separate database (not shown).

The information collecting unit 20k receives the information provided by the external equipment, that is, the GPS receiver 1, the COMPASS 2, the AIS 3, and the radar 4, and collects it according to the identification information of the obstacles.

The third communication unit 20l is a means for communicating with the interface module 30 and receives the information provided by the GPS receiver 1, the COMPASS 2, the AIS 3, and the radar 4.

Referring to FIG. 4, the remote control module 40 includes a display unit 40a, a control unit 40b, and a communication unit 40c.

The display unit 40a displays information received from the central processing module 20 or from the autonomous navigation module 50. The display unit 40a displays image information on obstacles located in the vicinity of the unmanned ship, and a plurality of images may be displayed.

A controller (not shown) included in the remote control module 40 displays on the display unit 40a the obstacles existing within a certain distance of the position of the unmanned vessel and, when a specific obstacle among the displayed obstacles is selected, displays on the display unit 40a the image information obtained from the camera module 10 for the selected obstacle.

In addition, the controller included in the remote control module 40 can display the obstacles in colors classified in descending order of risk, based on the degrees of danger of the obstacles received from the central processing module 20.

The control unit 40b is used to input a steering value when the unmanned vessel needs to be steered remotely; the input steering value is transmitted to the autonomous navigation module 50 through the communication unit 40c.

Also, the control unit 40b can manually or automatically control the camera module 10.

The communication unit 40c is a means for communicating with the central processing module 20 and the autonomous navigation module 50. The communication unit 40c receives the degree of danger of the obstacle from the central processing module 20 and transmits the steering value to the autonomous navigation module 50.

FIG. 5 is a flowchart illustrating an obstacle management method using an obstacle management system of a ship according to an embodiment of the present invention.

Referring to FIG. 5, the central processing module 20 included in the obstacle management system of the ship receives the message information of the obstacle provided from the AIS 3 (S11).

The central processing module 20 determines whether the location message information of the obstacle has been received from the radar 4 (S13).

If it is determined in step S13 that the positional message information of an obstacle has not been received, the central processing module 20 determines whether there has been a change in the surrounding image acquired from the camera module 10 (S16). The camera module 10 acquires surrounding images while rotating by a predetermined angle at a fixed period and transmits them to the central processing module 20, and the central processing module 20 can determine whether the surrounding image received from the camera module 10 has changed by more than a predetermined amount.

If it is determined in step S16 that there is no change in the surrounding image, the central processing module 20 moves the process to step S11 described above.

If it is determined in step S16 that there is a change in the surrounding image, the central processing module 20 receives the surrounding image and temperature acquired through the camera module 10 (S18).

The central processing module 20 estimates the position and coordinate information of the obstacle based on the received surrounding image (S20), and then moves the process to step S25, described later.

If it is determined in step S13 that the positional message information of an obstacle has been received, the central processing module 20 searches the message information received in step S11 for the message whose coordinates coincide with the coordinate information of the obstacle received from the radar 4, or with the coordinate information estimated using the relative azimuth and distance included in the positional message information of the obstacle, and updates the obstacle list by mapping the additional information included in the found message information (S15).

Although the present embodiment describes step S15 as being performed after step S13, the central processing module 20 may perform, before or after step S13, a step of determining whether the coordinate information of the obstacle has been received. When the coordinate information of the obstacle has not been received, the central processing module 20 can estimate it using the relative azimuth and distance included in the positional message information of the obstacle, as described above.

Accordingly, the central processing module 20 can search for the message information of an obstacle that matches either the coordinate information estimated as described above or the coordinate information of the obstacle received from the radar 4.

Thereafter, the central processing module 20 sorts the updated obstacle list in order of the risk of the obstacles, for example on the basis of the TCPA (S17).

The central processing module 20 selects a specific obstacle from the sorted obstacle list based on a predetermined criterion (S19). The predetermined criterion may be, for example, the TCPA: the central processing module 20 selects obstacles from the sorted list in order of the shortest time to the closest approach to the unmanned vessel. The process from step S21 to step S33, described below, is then performed for each of the plurality of obstacles in the obstacle list.

The central processing module 20 calculates the directing angle (that is, the azimuth angle and the altitude angle) at which the camera module should aim, using the azimuth of the obstacle included in the positional message information of the selected obstacle and the distance to the obstacle (S21). As described above, the central processing module 20 uses the azimuth of the obstacle and the distance to the obstacle to calculate the azimuth angle at which the camera module 10 should aim, estimates the altitude of the camera module 10 using the altitude received from the GPS receiver 1 and the altitude difference between the installation position of the GPS receiver 1 and the preset installation position of the camera module 10, and calculates the altitude angle at which the camera module 10 should aim using the estimated altitude and the distance to the obstacle included in the message information of the obstacle.

The central processing module 20 applies an operation command to the pan motor 10c and the tilt motor 10d attached to the camera module 10 so that the camera is driven to the calculated directing angle, and receives the surrounding image and temperature acquired at that angle (S23).

The central processing module 20 extracts the outline of the received peripheral image (S25).

The central processing module 20 estimates the numbers of pixels in the horizontal and vertical extents using an average that excludes the upper 5% and lower 5% of the continuously acquired values, and estimates the area of the obstacle based on the total number of pixels (S27).
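
The trimmed averaging of step S27 can be expressed compactly as below; how the per-frame pixel extents are measured is assumed to come from the contour extraction of step S25.

```python
# Sketch of step S27: average the horizontal and vertical pixel extents measured
# over consecutive frames after dropping the top 5% and bottom 5% of samples,
# then take the area as the product of the two trimmed means.
def trimmed_mean(samples):
    ordered = sorted(samples)
    k = int(len(ordered) * 0.05)                 # samples to drop at each end
    kept = ordered[k:len(ordered) - k] if len(ordered) > 2 * k else ordered
    return sum(kept) / len(kept)

def obstacle_area_px(widths_px, heights_px):
    return trimmed_mean(widths_px) * trimmed_mean(heights_px)
```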

The central processing module 20 determines the type of the obstacle based on the values obtained so far (S29). That is, the central processing module 20 compares the temperature received in step S23, the area of the obstacle estimated in step S27, and the outline shape extracted in step S25 with the obstacle classification information stored in the obstacle database 20d to determine the type of the obstacle.

The central processing module 20 substitutes the specific gravity r assigned to the type of obstacle determined in step S29, the obstacle area m, and the relative speed v of the obstacle into the following Equation (1) to calculate the degree of danger of the obstacle (S31). The specific gravity is set, for example, to 7.87, 0.91, 0.98, and 1.0 for the obstacle types ship, glacier, person, and unknown, respectively.

[Equation (1) is presented as an image in the original publication and is not reproduced here.]

Here, when the relative speed is negative, v = 0 is used.
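
Because Equation (1) itself appears only as an image in the original publication, the combining function in the sketch below (a plain product) is a placeholder assumption; the specific-gravity table and the rule that a negative relative speed is clamped to zero are taken from the text above.

```python
# Hedged sketch of step S31; the product used here stands in for the original
# Equation (1), which is not reproduced in this text.
SPECIFIC_GRAVITY = {"ship": 7.87, "glacier": 0.91, "person": 0.98, "unknown": 1.0}

def danger_level(obstacle_type: str, area_m2: float, relative_speed_mps: float) -> float:
    r = SPECIFIC_GRAVITY.get(obstacle_type, 1.0)   # specific gravity r by type
    v = max(relative_speed_mps, 0.0)               # v = 0 when the speed is negative
    return r * area_m2 * v                         # placeholder for Equation (1)
```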

The central processing module 20 transmits the calculated degree of danger of the obstacle to the autonomous navigation module 50 (S33). It also transmits, to the autonomous navigation module 50 and to the remote control module 40 installed in the remote control center, the identification information of the obstacles sorted by their degree of danger.

When the identification information of an obstacle selected from among the obstacles displayed on the display unit 40a is received from the remote control module 40, the central processing module 20 changes the directing angle of the camera module 10 using the azimuth of the obstacle and the distance to the obstacle included in the positional message information stored in association with that identification information, and transmits the acquired image information of the obstacle to the remote control module 40.

In this way, the types and areas of the various obstacles encountered during the operation of a large unmanned ship can be identified and transmitted to the autonomous navigation module 50 and to the remote control module 40 in the control center on land, which not only improves the safety of navigation but also supports decisions such as obstacle avoidance and the rescue of drifting ships.

The invention being thus described, it will be obvious that the same may be varied in many ways. Such modifications are intended to be within the spirit and scope of the invention as defined by the appended claims.

1: GPS receiver 2: COMPASS
3: AIS 4: Radar
10: camera module 20: central processing module
30: Interface module 40: Remote control module
50: Autonomous navigation module

Claims (17)

An obstacle management method for an obstacle management system including a camera module for acquiring a peripheral image of an unmanned ship and an autonomous navigation module for controlling autonomous navigation of the unmanned ship,
Receiving location message information of an obstacle measured from a radar installed on the unmanned vessel;
Calculating a directivity angle of the camera module using a distance from the obstacle included in the location message information of the received obstacle;
Determining a degree of danger of an obstacle based on the ambient image and temperature received from the camera module driven at the calculated orientation angle; And
And transmitting the risk level of the determined obstacle to the autonomous navigation module.
The method according to claim 1,
Wherein the calculating step estimates the altitude of the camera module using the altitude received from the GPS receiver and the altitude difference between the installed position of the GPS receiver and the preset mounting position of the camera module, and calculates the directing angle of the camera module using the estimated altitude and the distance to the obstacle.
The method according to claim 1,
After the receiving step,
Further comprising the steps of: calculating the coordinates of the radar using the coordinates received from a GPS receiver and the positional difference between the GPS receiver and the radar; and estimating the coordinates of the obstacle using the calculated radar coordinates and the direction angle and distance received in the positional message information of the obstacle, and storing and managing the estimated coordinates of the obstacle in an obstacle database.
The method of claim 3,
After the receiving step,
Further comprising receiving message information of an obstacle provided from an AIS installed on the unmanned vessel,
The storing and managing step
When the message information of the obstacle is received, searches for an obstacle whose coordinates match the coordinates included in the message information of the obstacle, maps the received message information of the obstacle to the found obstacle, and stores the mapped information in the obstacle database.
The method according to claim 1,
After the calculating step,
Further comprising the step of applying an operation command to the pan motor and the tilt motor attached to the camera module so that the camera module is driven at the calculated directing angle.
The method according to claim 1,
The step of determining
Extracting an outline of the received peripheral image and pixels of the image;
Estimating an area of the obstacle using the extracted outline and pixels of the image;
Comparing the area of the obstacle and the obtained temperature with the obstacle classification information stored in a database to determine the type of the obstacle; And
And calculating a danger level of the obstacle based on the specific gravity set for the type of the obstacle, the estimated area of the obstacle, and the relative speed of the obstacle.
The method of claim 6,
Wherein the degree of danger of the obstacle is calculated by reflecting the information having the highest preset priority among the azimuth and distance information of the obstacle received from the radar, the message information of the obstacle received from the AIS, and the distance and azimuth of the obstacle obtained by the camera module.
The method according to claim 1,
After the transmitting step,
Receiving the degree of danger of the obstacle and displaying the obstacles within a certain distance on a display unit; And
Further comprising providing the image information obtained through the camera module on the display unit when a specific obstacle is selected from the displayed obstacles.
The method of claim 8,
The step of displaying
And displaying the obstacles displayed on the display unit in descending order of the degree of risk among the obstacles located within a predetermined distance from the unmanned vessel.
An obstacle management system comprising a camera module for acquiring a peripheral image of an unmanned ship, and an autonomous navigation module for controlling autonomous navigation of the unmanned ship,
And a central processing module that receives the positional message information of an obstacle measured by a radar installed on the unmanned vessel, determines the degree of danger of the obstacle through the surrounding image and temperature received from the camera module driven at a directing angle calculated using the distance to the obstacle included in the received positional message information, and transmits the determined degree of danger of the obstacle to the autonomous navigation module.
The system of claim 10,
Wherein the central processing module estimates the altitude of the camera module using the altitude received from the GPS receiver and the altitude difference between the installed position of the GPS receiver and the preset mounting position of the camera module, and calculates the directing angle of the camera module using the estimated altitude and the distance to the obstacle.
The system of claim 10,
Wherein the central processing module calculates the coordinates of the radar using the coordinates received from the GPS receiver and the positional difference between the GPS receiver and the radar, estimates the coordinates of the obstacle using the calculated radar coordinates and the direction angle and distance received in the positional message information of the obstacle, and stores and manages the estimated coordinates in the obstacle database.
The system of claim 12,
Wherein, when the central processing module receives message information of an obstacle provided by the AIS installed on the unmanned vessel, the central processing module searches for an obstacle whose coordinates match the coordinates included in the message information of the obstacle, maps the received message information to the found obstacle, and stores it.
The system of claim 10,
Wherein the central processing module extracts the outline of the received surrounding image and the pixels of the image, estimates the area of the obstacle using the extracted outline and pixels, determines the type of the obstacle by comparing the estimated area and the obtained temperature with the obstacle classification information stored in the database, and calculates the degree of danger of the obstacle using the specific gravity set for the type of the obstacle, the estimated area of the obstacle, and the relative speed of the obstacle.
The system of claim 14,
Wherein the central processing module calculates the degree of danger of the obstacle by reflecting the information having the highest preset priority among the azimuth and distance information of the obstacle received from the radar, the message information of the obstacle received from the AIS, and the distance and azimuth of the obstacle obtained by the camera module.
The system of claim 10,
Further comprising a remote control module that receives the degree of danger of the obstacle, displays the obstacles within a certain distance on a display unit, and provides the image information received through the camera module on the display unit when a specific obstacle among the displayed obstacles is selected.
The system of claim 16,
Wherein the remote control module distinguishes and displays the obstacles displayed on the display unit in descending order of the degree of risk among obstacles located within a predetermined distance from the ship.
KR1020160009952A 2016-01-27 2016-01-27 System for managing obstacle of ship and method for managing obstacle KR101823029B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160009952A KR101823029B1 (en) 2016-01-27 2016-01-27 System for managing obstacle of ship and method for managing obstacle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160009952A KR101823029B1 (en) 2016-01-27 2016-01-27 System for managing obstacle of ship and method for managing obstacle

Publications (2)

Publication Number Publication Date
KR20170089574A true KR20170089574A (en) 2017-08-04
KR101823029B1 KR101823029B1 (en) 2018-01-31

Family

ID=59654444

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160009952A KR101823029B1 (en) 2016-01-27 2016-01-27 System for managing obstacle of ship and method for managing obstacle

Country Status (1)

Country Link
KR (1) KR101823029B1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20200070761A (en) * 2018-12-10 2020-06-18 ㈜버틀러네트워크 System and method for measuring distance to obstacle in ship
KR20200070758A (en) * 2018-12-10 2020-06-18 ㈜버틀러네트워크 System for sailing unmanned ship using infrared camera and method thereof
KR20210136807A (en) * 2020-05-08 2021-11-17 주식회사 아비커스 support system for vessel operation and ship having the same
KR20210154501A (en) * 2020-06-12 2021-12-21 엘아이지넥스원 주식회사 Behavior-based control method and system considering the interaction between operator and an autonomous surface vehicle
KR20230012271A (en) 2021-07-15 2023-01-26 대우조선해양 주식회사 System and method for recognition of atypical obstacle system and computer-readable recording medium including the same
CN115980739A (en) * 2023-03-21 2023-04-18 安徽隼波科技有限公司 Automatic defense deploying method for radar guided photoelectric tracking
EP4187347A3 (en) * 2021-11-29 2023-09-27 Korea Institute of Ocean Science & Technology System and method for multi-image-based vessel proximity situation recognition support

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11741709B2 (en) * 2018-05-22 2023-08-29 Starship Technologies Oü Method and system for analyzing surroundings of an autonomous or semi-autonomous vehicle
KR102198091B1 (en) 2019-11-20 2021-01-04 국방과학연구소 Apparatus and method for displaying ship information

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4574157B2 (en) * 2003-10-17 2010-11-04 富士重工業株式会社 Information display device and information display method
KR101455071B1 (en) * 2013-09-17 2014-10-30 주식회사 서울텍 Method for enhancing night time image using digital compositing


Also Published As

Publication number Publication date
KR101823029B1 (en) 2018-01-31

Similar Documents

Publication Publication Date Title
KR101823029B1 (en) System for managing obstacle of ship and method for managing obstacle
US10322804B2 (en) Device that controls flight altitude of unmanned aerial vehicle
US20200207474A1 (en) Unmanned aerial vehicle and payload delivery system
US9783320B2 (en) Airplane collision avoidance
CA2767312C (en) Automatic video surveillance system and method
AU2021202509B2 (en) Image based localization for unmanned aerial vehicles, and associated systems and methods
US20180032042A1 (en) System And Method Of Dynamically Controlling Parameters For Processing Sensor Output Data
US20210053680A1 (en) Systems and methods for autonomous navigation and computation of unmanned vehicles
CN106164931B (en) Method and device for displaying objects on a vehicle display device
KR20170088124A (en) Communication system of unmanned ship and communication method using the same
KR102399982B1 (en) Ships using the aircraft safe operation support systems
EP3248139A1 (en) Cloud feature detection
JP2020138681A (en) Control system for unmanned flight vehicle
EP3248138A1 (en) Detecting and ranging cloud features
CN110997488A (en) System and method for dynamically controlling parameters for processing sensor output data
KR102340589B1 (en) Ships using the aircraft safe operation support systems
JP6890108B2 (en) Unmanned flight equipment, imaging system and imaging method
JP2000152220A (en) Method for controlling monitor itv camera
JP6628373B1 (en) Wall trace type flight control system for multicopter
US9120569B2 (en) Clickable camera window
CN111492326B (en) Image-based positioning for unmanned aerial vehicles and related systems and methods
WO2021075072A1 (en) Object detection device, flight vehicle, object detection method, computer program, and method for generating learning model
KR102464086B1 (en) Collision avoidance system and method between autonomous vessel and general vessel

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant