EP4213111A1 - System for detecting objects on a water surface - Google Patents
- Publication number
- EP4213111A1 (Application EP23150776.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- feature point
- water area
- distance
- imager
- controller
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/203—Specially adapted for sailing ships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63B—SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
- B63B49/00—Arrangements of nautical instruments or navigational aids
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63B—SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
- B63B79/00—Monitoring properties or operating parameters of vessels in operation
- B63B79/40—Monitoring properties or operating parameters of vessels in operation for controlling the operation of vessels, e.g. monitoring their speed, routing or maintenance schedules
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
- G05D1/02—Control of position or course in two dimensions
- G05D1/0206—Control of position or course in two dimensions specially adapted to water vehicles
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/771—Feature selection, e.g. selecting representative features from a multi-dimensional feature space
Definitions
- the present invention relates to a water area object detection system, and more particularly, it relates to a water area object detection system that includes an imager and creates a surrounding map based on an image captured by the imager.
- An autonomous mobile robot that includes an imager and creates an environment map (surrounding map) based on an image captured by the imager is known in general.
- Such an autonomous mobile robot is disclosed by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, for example.
- an autonomous mobile robot including an imager and a controller.
- the controller of the autonomous mobile robot creates a surrounding environment map (surrounding map) by detecting the posture and position of the autonomous mobile robot and a distance to an object around the autonomous mobile robot based on an image captured by the imager.
- the controller detects a measurement point (feature point) indicating an object, sets, around the measurement point, a probability distribution range (object presence range) in which the object around the autonomous mobile robot is probabilistically distributed, and shows the probability distribution range on the environment map. That is, a range in which an object that interferes with movement of the autonomous mobile robot is probably present, which is the probability distribution range in which the object is probabilistically distributed, is shown on the environment map.
- the controller increases the probability distribution range (object presence range) as the distance from the imager to the object increases, taking into account a distance measurement error. Specifically, the distance measurement error becomes greater as the distance from the imager to the object increases, and thus the controller also increases the probability distribution range (object presence range) according to the distance measurement error. Conversely, the distance measurement error becomes smaller as the distance from the imager to the object decreases, and thus the controller also reduces the probability distribution range (object presence range) according to the distance measurement error.
- techniques to create an environment map (surrounding map) around a mobile body using an imager include a technique to create the environment map in consideration of the real-time characteristics (responsiveness) of control by detecting a small number of measurement points (feature points) indicating objects from a captured image.
- in such a technique, the probability distribution range (object presence range) in which an object around the autonomous mobile robot is probabilistically distributed becomes smaller according to the distance measurement error as the imager gets closer to the object, and thus the probability distribution range (object presence range) may become too small.
- in that case, the probability distribution range may not include a portion in which the object is originally present, and a gap may occur between adjacent probability distribution ranges (object presence ranges). Therefore, it is desired to create an environment map that more accurately indicates an object near the imager.
- said object is solved by a water area object detection system having the features of independent claim 1. Preferred embodiments are laid down in the dependent claims.
- a water area object detection system includes an imager configured to be provided on a hull and configured to capture an image around the hull, and a controller configured or programmed to perform a control to detect a feature point corresponding to an object in the image together with a distance to the feature point based on the image captured by the imager to create a water area map in which an object presence range including a likelihood that the object is present is set around the feature point.
- the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set a size of the object presence range to a lower limit when the distance from the imager to the feature point is equal to or less than a predetermined distance.
- a water area object detection system includes the controller configured or programmed to create the water area map in which the object presence range including a likelihood that the object is present is set, and the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the imager is relatively near the object such that the distance from the imager to the feature point is equal to or less than the predetermined distance, the size of the object presence range is set to the lower limit, and thus an excessive reduction in the object presence range is prevented.
- the lower limit is preferably set such that a plurality of the object presence ranges set for a plurality of the feature points corresponding to the same object partially overlap each other when the distance from the imager to the feature point is equal to or less than the predetermined distance.
- the distance from the imager to the feature point is equal to or less than the predetermined distance, i.e., when each of the plurality of object presence ranges set for the plurality of feature points corresponding to the same object is relatively small, the plurality of object presence ranges partially overlap each other, and thus the possibility that a portion in which the object is originally present is not included in the object presence range, and a gap occurs between the adjacent object presence ranges is more reliably reduced or prevented. Therefore, the water area map that still more accurately indicates the object near the imager is created.
- the controller is preferably configured or programmed to automatically dock the hull by automatically moving the hull toward a shore structure corresponding to the object. Accordingly, the hull is easily docked at the shore structure.
- the controller is preferably configured or programmed to, when an obstacle corresponding to the object is present between the hull and the shore structure when the hull is automatically docked, set the size of the object presence range to the lower limit or more, set a movement route configured to avoid the object presence range around the feature point corresponding to the obstacle, and automatically move the hull along the movement route.
- the sizes of the plurality of object presence ranges indicating the shore structure are set to the lower limit or more, and thus a gap between the plurality of object presence ranges indicating the shore structure is filled. Consequently, when the hull is automatically docked at the shore structure, contact of the hull with the obstacle is reduced or prevented.
- the controller is preferably configured or programmed to change the size of the object presence range according to a distance measurement error that quadratically becomes greater as the distance from the imager to the feature point increases when the distance from the imager to the feature point is larger than the predetermined distance. Accordingly, when the distance from the imager to the feature point is larger than the predetermined distance, the size of the object presence range is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map that more accurately indicates the object imaged by the imager is created.
- the controller is preferably configured or programmed to create the two-dimensional water area map horizontally extending by setting the object presence range in a horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map is created in consideration of an upward-downward direction (height direction), the processing load on the controller is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
- the controller is preferably configured or programmed to redetect the feature point corresponding to the object in the image together with the distance to the feature point for each predetermined number of imaging frames of the imager to update the water area map. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability converges at an earlier stage (an area in which the object is present has a higher probability, and an area in which the object is absent has a lower probability).
- the controller is preferably configured or programmed to update the water area map using Bayesian estimation, and assign a current probability larger than an initial probability and a prior probability to the object presence range and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range to calculate a posterior probability using the Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map such that the probability that the noise such as bubbles is present in the water area map is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map.
- the imager preferably includes two imaging light receivers configured to be spaced apart at different locations on the hull, and the controller is preferably configured or programmed to measure the distance from the imager to the feature point using the two imaging light receivers. Accordingly, two different images are captured simultaneously by the two imaging light receivers to measure the distance to the feature point by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point by the triangulation method, the accuracy of distance measurement is improved.
- a water area object detection system preferably further includes a display configured to be provided on the hull and configured to display the water area map, and the controller is preferably configured or programmed to perform a control to display the one feature point in one pixel of the display and set the object presence range having a perfect circular shape around the one pixel in which the one feature point is displayed to display the one feature point and the object presence range on the display. Accordingly, the boundary of the object presence range is set at positions to which distances from the feature point are equal to each other, and the water area map that more accurately indicates the object imaged by the imager is created.
- the controller is preferably configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, the size of the object presence range is set to the lower limit within a predetermined range from the imager to the feature point with the predetermined distance of 15 m or more and 25 m or less, which is relatively close to the hull, and the water area map that more accurately indicates the object is created.
- the structure of a marine vessel 100 including a water area object detection system 103 is now described with reference to FIGS. 1 to 8 .
- the water area object detection system 103 is an example of a "surrounding object detection system".
- arrow FWD represents the forward movement direction of the marine vessel 100 (front side with reference to a hull 101)
- arrow BWD represents the reverse movement direction of the marine vessel 100 (rear side with reference to the hull 101).
- the hull 101 is an example of a "mobile body".
- arrow L represents the portside direction of the marine vessel 100 (left side with respect to the hull 101), and arrow R represents the starboard direction of the marine vessel 100 (right side with respect to the hull 101).
- the marine vessel 100 includes the hull 101, a marine propulsion device 102 provided on the hull 101, and the water area object detection system 103 provided on or in the hull 101.
- the marine propulsion device 102 is attached to a transom of the hull 101 from behind. That is, in preferred embodiments, the marine propulsion device 102 is an outboard motor, and the marine vessel 100 is an outboard motor boat.
- the marine vessel 100 performs a control to estimate the self-position of the hull 101 in the water area map M while creating a two-dimensional water area map M (see FIG. 2 ) around the hull 101 that extends horizontally, using the water area object detection system 103.
- the water area map M is an example of a "surrounding map".
- the control described above is achieved by simultaneous localization and mapping (SLAM).
- the SLAM is a technique to simultaneously create an environment map around a mobile device and estimate the self-position of the mobile device in the environment map, using an image captured by a camera installed on the mobile device, for example.
- estimation of the self-position of the mobile device using the SLAM is able to be performed even in an environment such as indoors in which a GPS or the like is not usable.
- the SLAM enables the mobile device to move while avoiding surrounding objects so as to not collide with the objects, and to move along an optimal movement route without duplication of routes, for example.
- the SLAM includes passive SLAM (such as so-called visual SLAM) that uses an image sensor such as a camera to image a surrounding object, and active SLAM (such as so-called LiDAR SLAM) performed by irradiating a surrounding object with a laser beam of a laser device and detecting the reflected laser beam.
- the marine vessel 100 performs a control using a means such as the former passive SLAM.
- the passive SLAM using an image sensor such as a camera includes a means to acquire dense detection data and a means to acquire sparse detection data.
- the means to acquire dense detection data requires a larger amount of data processing in a controller as compared with the means to acquire sparse detection data.
- the sparse detection data refers to data obtained by extracting a feature point that is a portion of an image captured by an image sensor such as a camera, for example.
- the marine vessel 100 performs a control using the latter means, i.e., the SLAM that acquires sparse detection data. Consequently, the marine vessel 100 computes acquired detection data more quickly, and performs a real-time control according to movement of the marine vessel 100. That is, the marine vessel 100 performs a highly responsive (real-time) control.
- the marine vessel 100 performs a control to automatically move while avoiding obstacles (objects O) along a movement route R using the water area map M created using the water area object detection system 103 and a control to automatically dock the hull 101 at a shore structure O1 such as a floating pier.
- the marine vessel 100 uses the water area map M as a means to know the positions of the obstacles (objects O) not only when the marine vessel 100 automatically moves but also when a user manually maneuvers the marine vessel 100. That is, the water area map M is a so-called cost map to indicate the positions of the obstacles (objects O) that are present around the marine vessel 100, for example.
- the water area object detection system 103 includes an imager 1 provided on the hull 101, a display 2 provided on the hull 101, and a controller 3 (see FIG. 1 ).
- the water area object detection system 103 (controller 3) performs a control to create the water area map M in which an object presence range F1 including a likelihood that an object O is present is set around a feature point F by detecting, based on an image captured by the imager 1, the feature point F corresponding to the object O in the image together with a distance to the feature point F.
- the imager 1 captures an image around the hull 101.
- the imager 1 includes two imaging light receivers 1a spaced apart at different locations on the hull 101.
- Each imaging light receiver 1a includes a monocular camera including an imaging device such as a CCD sensor or a CMOS sensor.
- the water area object detection system 103 (controller 3) measures a distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Specifically, the water area object detection system 103 (controller 3) measures the distance to the feature point F corresponding to the object O in an image captured by a triangulation method based on images captured by the two imaging light receivers 1a.
- the "feature point F corresponding to the object O in an image" refers to a specific point shown in a portion of the image in which the object O is located.
- the feature point F is set in a portion of the image in which there is a particularly large change in brightness or color tone.
- the water area object detection system 103 preliminarily performs distortion correction of images captured by the two imaging light receivers 1a, rectification to associate the images with each other, and parallax estimation by matching corresponding feature points F on the images, for example, as preprocessing for distance measurement by the triangulation method.
- a distance L in the horizontal direction from the other of the imaging light receivers 1a to the feature point F (object O) is obtained by the following formula.
- L = (L1 × f)/(p × d), where L1 is the distance (baseline) between the two imaging light receivers 1a, f is the focal length, p is the pixel pitch, and d is the parallax.
- the displacement amount ⁇ L of the measurement distance increases as the parallax d decreases, and the displacement amount ⁇ L of the measurement distance decreases as the parallax d increases.
- This displacement amount ⁇ L of the measurement distance corresponds to a distance resolution at the time of measuring the distance L. That is, as the distance L in the horizontal direction from the imaging light receiver 1a to the feature point F increases, a distance measurement error becomes greater. The distance measurement error quadratically becomes greater as the distance L increases.
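The triangulation relation and the quadratic growth of the distance measurement error described above can be sketched as follows; the focal length, the spacing of the two imaging light receivers 1a, and the parallax values are illustrative assumptions, not values taken from this description:

```python
# Sketch of stereo triangulation distance measurement (illustrative values).
# Assumed parameters -- not taken from the patent:
F_PX = 1400.0      # focal length expressed in pixels (focal length / pixel pitch)
BASELINE_M = 0.5   # horizontal spacing of the two imaging light receivers, in m

def distance_from_parallax(d_px: float) -> float:
    """Distance L to a feature point F from the parallax d (in pixels)."""
    return F_PX * BASELINE_M / d_px

def distance_error(L: float, d_err_px: float = 1.0) -> float:
    """Displacement dL of the measured distance for a parallax error of
    d_err_px pixels: dL = L**2 * d_err / (f * B), i.e. quadratic in L."""
    return L * L * d_err_px / (F_PX * BASELINE_M)

# A parallax of 70 px corresponds to a distance of 10 m with these values:
assert abs(distance_from_parallax(70.0) - 10.0) < 1e-9
# The error at 40 m is four times the error at 20 m (quadratic growth):
assert abs(distance_error(40.0) / distance_error(20.0) - 4.0) < 1e-9
```

The second assertion illustrates why the object presence range must be enlarged for distant feature points: doubling the distance quadruples the distance resolution error.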
- the display 2 displays the water area map M created by the controller 3 (see FIG. 1 ).
- the display 2 displays the water area map M within a display image 450 pixels wide by 600 pixels high.
- the display image 450 pixels wide by 600 pixels high is a display image in which a point group is plotted in a world coordinate system.
- the size of one pixel of the water area map M corresponds to a size of 0.1 m horizontally and 0.1 m vertically in the world coordinate system.
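Under this resolution (0.1 m per pixel in a 450 × 600 pixel display image), the mapping from world coordinates to display pixels might be sketched as follows; placing the world origin at the image center is an assumption made only for illustration:

```python
# World-coordinate point group -> display pixels, assuming 0.1 m per pixel
# and (for illustration only) the world origin at the image center.
WIDTH_PX, HEIGHT_PX = 450, 600
METERS_PER_PIXEL = 0.1

def world_to_pixel(x_m: float, y_m: float) -> tuple[int, int]:
    """Map a world-coordinate point (in meters) to a display pixel P."""
    px = int(round(x_m / METERS_PER_PIXEL)) + WIDTH_PX // 2
    py = int(round(y_m / METERS_PER_PIXEL)) + HEIGHT_PX // 2
    return px, py

# A feature point 5 m to one side and 10 m ahead of the assumed origin:
assert world_to_pixel(5.0, 10.0) == (275, 400)
```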
- the display 2 displays the imager 1 (imaging light receiver 1a) at a predetermined pixel position of the water area map M in a predetermined orientation.
- the display 2 also displays the hull 101 together with the imager 1 on the water area map M.
- the display 2 displays a schematic model of the hull 101 on the water area map M.
- the display 2 displays the feature points F on the water area map M. Furthermore, the display 2 displays, on the water area map M, the object presence range F1 set around the feature point F and including a likelihood that the object O is present.
- the controller 3 performs a control to display one feature point F in one pixel P of the display 2 and set a perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2 (see FIG. 6 ).
- the object presence range F1 refers to a range set around the feature point F and including a likelihood that the object O is present.
- the object presence range F1 refers to a range in which the object O is probabilistically present.
- the object presence range F1 should be avoided when the marine vessel 100 moves, and a movement route R (see FIGS. 2 and 8 ) for automatic movement is not set in the object presence range F1.
- the display 2 displays an imaging area A with a predetermined angle of view indicating a range currently being imaged by the imager 1 on the water area map M.
- the controller 3 (see FIG. 1 ) includes a circuit board including a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), etc., for example.
- the controller 3 is provided in the hull 101.
- the controller 3 is connected to the two imaging light receivers 1a, the display 2, and the marine propulsion device 102 by signal lines.
- the controller 3 performs a control to automatically move the hull 101 by setting the movement route R for automatic movement based on the created water area map M and controlling driving of the marine propulsion device 102.
- the controller 3 performs a control to create the water area map M in which the perfect circular object presence range F1 including a likelihood that the object O is present is set around the feature point F by detecting the feature point F corresponding to the object O in the image together with the distance to the feature point F based on the image captured by the imager 1.
- the controller 3 creates the two-dimensional water area map M extending in the horizontal direction of the hull 101 by setting the object presence range F1 in a horizontal plane. That is, the controller 3 creates the two-dimensional water area map M extending in the forward-rearward direction and the right-left direction (arrows FWD, BWD, L, and R) of the hull 101.
- the controller 3 reduces the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases.
- when the distance from the imager 1 to the feature point F is equal to or less than a predetermined distance d1, the size of the object presence range F1 is set to a lower limit r1.
- the controller 3 sets a minimum value for the size of the object presence range F1 such that the size of the object presence range F1 does not become too small.
- the predetermined distance d1 refers to a distance that determines the range in which the size of the object presence range F1 is set to the lower limit r1.
- the controller 3 is able to change the predetermined distance d1 via a setting.
- the controller 3 sets the predetermined distance d1 to 15 m or more and 25 m or less, and sets the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
- the controller 3 sets the predetermined distance d1 to about 21 m, and sets the size of the object presence range F1 (a distance from the feature point F to the outer edge of the perfect circular object presence range F1, i.e., the radius of the object presence range F1) to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1 (about 21 m). That is, in a range relatively close to the hull 101 in which the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to a constant lower limit r1.
- the lower limit r1 is set such that a plurality of object presence ranges F1 set for a plurality of feature points F corresponding to the same object O partially overlap each other. That is, the controller 3 reduces or prevents the occurrence of a gap between adjacent object presence ranges F1 of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O. If the lower limit r1 is not set for the size of the object presence range, the object presence range continues to become smaller toward the feature point as the distance from the imager to the feature point decreases, and thus a gap is more likely to occur between the adjacent object presence ranges.
- the controller 3 changes the size of the object presence range according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases (see FIG. 7 ).
- the controller 3 increases the size of the object presence range F1 in proportion to the magnitude of the distance measurement error as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1.
- the controller 3 sets the size (radius) of the object presence range F1 to r2 larger than the lower limit r1. Furthermore, the controller 3 sets the size (radius) of the object presence range F1 to r3 larger than r2 when the distance from the imager 1 to the feature point F is d3 larger than the distance d2.
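The sizing rule above can be sketched as follows; the lower limit r1 and the coefficient of the quadratic term are assumed values, the coefficient being chosen only so that the curve is continuous at d1:

```python
# Radius of the object presence range F1 as a function of the distance from
# the imager 1 to the feature point F. D1_M, R1_M, and K are assumptions:
D1_M = 21.0          # predetermined distance d1 (the text gives 15-25 m)
R1_M = 0.5           # lower limit r1 of the radius (assumed value)
K = R1_M / D1_M**2   # chosen so the curve is continuous at d1 (assumption)

def presence_range_radius(distance_m: float) -> float:
    """Radius of the perfect circular object presence range F1: constant at
    the lower limit r1 up to d1, then growing with the quadratic distance
    measurement error beyond d1."""
    if distance_m <= D1_M:
        return R1_M
    return K * distance_m ** 2

# At or below d1 the radius stays at the lower limit r1:
assert presence_range_radius(10.0) == R1_M
assert presence_range_radius(21.0) == R1_M
# Beyond d1 the radius grows (r3 > r2 > r1 for d3 > d2 > d1):
assert presence_range_radius(42.0) > presence_range_radius(30.0) > R1_M
```

The constant branch below d1 is what keeps adjacent object presence ranges of the same object overlapping near the hull, instead of shrinking toward their feature points and leaving gaps.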
- the controller 3 performs a control to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each imaging frame of the imager 1 and compare the detected feature point F with the currently used feature point F.
- the controller 3 performs a control to reject the newly detected feature point F and continue to use the currently used feature point F when the degree of change is smaller than a predetermined value, and conversely performs a control to update the result to the newly detected feature point F and to update the water area map M and the position of the imager 1 in the water area map M when the degree of change is larger than the predetermined value.
- the controller 3 updates the water area map M every predetermined imaging frame (every ten frames, for example).
- the controller 3 updates the water area map M using Bayesian estimation.
- the controller 3 performs a Bayesian estimation for each pixel P and updates a presence probability indicating that the object O will be present at each pixel P. For example, the controller 3 assigns a current probability larger than an initial probability and a prior probability to the object presence range F1 and assigns a current probability smaller than the initial probability and the prior probability to a range (absence range) outside the object presence range F1 to calculate a posterior probability using Bayesian estimation.
- a presence probability of "0.5" is assigned as the initial probability to all pixels P, a presence probability of "1" is uniformly assigned to each pixel P in the object presence range F1, and a presence probability of "0.35" is uniformly assigned to the range outside the object presence range F1.
- the controller 3 sets the current probability and the initial probability to the same value to make the posterior probability equal to the prior probability in Bayesian estimation outside the range of the angle of view of the imager 1.
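The per-pixel Bayesian update described above can be sketched in log-odds form, the standard formulation for occupancy-grid updates. This is an illustrative sketch, not the patented implementation: the document assigns a current probability of "1" inside the presence range, which would saturate the update immediately, so a value of 0.95 is used in the example below; all names and values are assumptions.

```python
import math

def logodds(p):
    return math.log(p / (1.0 - p))

def bayes_update(prior_p, current_p, initial_p=0.5):
    """One Bayesian update of the presence probability of a single pixel
    P of the water area map: fuse the prior probability with the current
    measurement probability, discounting the uninformative initial
    probability so that repeated updates converge toward 0 or 1."""
    l = logodds(prior_p) + logodds(current_p) - logodds(initial_p)
    return 1.0 / (1.0 + math.exp(-l))
```

Inside the object presence range (current probability above 0.5) the posterior rises over successive frames; outside it (for example 0.35) the posterior falls, which is how transient noise such as bubbles fades out of the map. Setting the current probability equal to the initial probability leaves the posterior equal to the prior, matching the treatment of pixels outside the angle of view.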
- the controller 3 controls driving of the marine propulsion device 102 to move the hull 101.
- the controller 3 automatically docks the hull 101 by automatically moving the hull 101 toward the shore structure O1 (see FIG. 3) corresponding to the object O.
- the controller 3 sets the size of the object presence range F1 to the lower limit r1 or more, sets the movement route R that avoids the object presence range F1 around the feature point F corresponding to an obstacle (object O), and automatically moves the hull 101 along the movement route R when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1.
- the water area object detection system 103 includes the controller 3 configured or programmed to create the water area map M in which the object presence range F1 including a likelihood that the object O is present is set.
- the controller 3 is configured or programmed to reduce the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
- the size of the object presence range F1 is set to the lower limit r1, and thus an excessive reduction in the object presence range F1 is prevented. Therefore, the possibility that in the water area map M, a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is reduced or prevented. Consequently, the water area map M that accurately indicates the object O near the imager 1 is created.
- the lower limit r1 is set such that the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O partially overlap each other when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
- when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, i.e., when each of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O is relatively small, the plurality of object presence ranges F1 partially overlap each other, and thus the possibility that a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is more reliably reduced or prevented. Therefore, the water area map M that still more accurately indicates the object O near the imager 1 is created.
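The geometric condition behind this overlap guarantee can be sketched briefly. This is an illustrative check only, under the assumption that presence ranges are circles of equal radius; the function names and the notion of a known maximum spacing between neighboring feature points on the same object are not from the patent.

```python
def ranges_overlap(p1, p2, radius):
    """Two circular object presence ranges of the same radius around
    feature points p1 and p2 partially overlap exactly when the
    distance between their centers is less than twice the radius."""
    dx = p2[0] - p1[0]
    dy = p2[1] - p1[1]
    return (dx * dx + dy * dy) ** 0.5 < 2.0 * radius

def overlap_threshold(max_spacing):
    """Overlap threshold for the lower limit r1: r1 must exceed half
    the maximum spacing between adjacent feature points detected on
    the same object for the ranges to overlap without gaps."""
    return max_spacing / 2.0
```

For feature points at most 1.0 m apart, any lower limit strictly greater than 0.5 m keeps adjacent ranges overlapping.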
- the controller 3 is configured or programmed to automatically dock the hull 101 by automatically moving the hull 101 toward the shore structure O1 corresponding to the object O. Accordingly, the hull 101 is easily docked at the shore structure O1.
- the controller 3 is configured or programmed to, when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1 when the hull 101 is automatically docked, set the size of the object presence range F1 to the lower limit r1 or more, set the movement route R that avoids the object presence range F1 around the feature point F corresponding to the obstacle, and automatically move the hull 101 along the movement route R. Accordingly, the sizes of the plurality of object presence ranges F1 indicating the shore structure O1 are set to the lower limit r1 or more, and thus a gap between the plurality of object presence ranges F1 indicating the shore structure O1 is filled. Consequently, when the hull 101 is automatically docked at the shore structure O1, contact of the hull 101 with the obstacle is reduced or prevented.
- the controller 3 is configured or programmed to change the size of the object presence range F1 according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the size of the object presence range F1 is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
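The two sizing rules above (clamp to the lower limit r1 at or below the predetermined distance d1; quadratic growth beyond it) can be sketched as a single function. The numeric values are illustrative assumptions: d1 = 21 m echoes the preferred embodiment, while r1 and the growth coefficient k are not specified in the patent.

```python
def presence_range_radius(distance_m, d1=21.0, r1=0.5, k=0.002):
    """Radius of the object presence range F1 as a function of the
    imager-to-feature-point distance: clamped to the lower limit r1 at
    or below the predetermined distance d1, then growing with the
    square of the distance, mirroring the quadratic growth of the
    distance measurement error."""
    if distance_m <= d1:
        return r1
    # continuous at d1, quadratic growth beyond it
    return r1 + k * (distance_m ** 2 - d1 ** 2)
```

The clamp prevents the excessive shrinkage near the imager that would otherwise open gaps between adjacent presence ranges.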
- the controller 3 is configured or programmed to create the two-dimensional water area map M extending in the horizontal direction by setting the object presence range F1 in the horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map M is created in consideration of an upward-downward direction (height direction), the processing load on the controller 3 is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
- the controller 3 is configured or programmed to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each predetermined number of imaging frames of the imager 1 to update the water area map M. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability is converged in an earlier stage (an area in which the object O is present has a higher probability, and an area in which the object O is absent has a lower probability).
- the controller 3 is configured or programmed to update the water area map M using Bayesian estimation, and assign the current probability larger than the initial probability and the prior probability to the object presence range F1 and assign the current probability smaller than the initial probability and the prior probability to the range outside the object presence range F1 to calculate the posterior probability using Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map M such that the probability that the noise such as bubbles is present in the water area map M is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map M.
- the imager 1 includes the two imaging light receivers 1a spaced apart at different locations on the hull 101, and the controller 3 is configured or programmed to measure the distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Accordingly, two different images are captured simultaneously by the two imaging light receivers 1a to measure the distance to the feature point F by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point F by the triangulation method, the accuracy of distance measurement is improved.
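The triangulation with two imaging light receivers reduces, for rectified images, to the standard stereo relation Z = f·B/d. The sketch below illustrates this relation only; the parameter names and values are assumptions, and the preprocessing (rectification and feature matching) described elsewhere in the document is taken as already done.

```python
def stereo_distance(focal_px, baseline_m, x_left_px, x_right_px):
    """Distance to a feature point by the triangulation method with two
    imaging light receivers: Z = f * B / d, where f is the focal length
    in pixels, B the baseline between the two receivers in meters, and
    d the disparity (horizontal shift of the matched feature point
    between the left and right images, in pixels)."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("matched feature point must have positive disparity")
    return focal_px * baseline_m / disparity
```

Because both images are captured simultaneously, the baseline B is fixed and known, which is what gives the two-receiver arrangement its accuracy advantage over two time-separated captures from a single moving camera.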
- the water area object detection system 103 further includes the display 2 provided on the hull 101 to display the water area map M, and the controller 3 is configured or programmed to perform a control to display one feature point F in one pixel P of the display 2 and set the perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2. Accordingly, the boundary of the object presence range F1 is set at positions to which distances from the feature point F are equal to each other, and the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
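Rendering the perfect circular presence range around the single pixel holding the feature point amounts to stamping a rasterized circle into the map grid. The following is a minimal sketch under the assumption of a row-major grid of 0/1 values; the function name and marking convention are illustrative, not from the patent.

```python
def stamp_presence_range(grid, cx, cy, radius_px):
    """Mark a perfect-circle object presence range on the pixel grid of
    the water area map, centered on the pixel (cx, cy) in which the
    feature point is displayed."""
    r2 = radius_px * radius_px
    for y in range(cy - radius_px, cy + radius_px + 1):
        for x in range(cx - radius_px, cx + radius_px + 1):
            # stay inside the grid and inside the circle
            if 0 <= y < len(grid) and 0 <= x < len(grid[0]):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r2:
                    grid[y][x] = 1
    return grid
```

Because membership is tested against the squared center distance, every marked pixel lies at an equal-or-smaller distance from the feature point than the radius, matching the equidistant boundary described above.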
- the controller 3 is configured or programmed to set the predetermined distance d1 to 15 m or more and 25 m or less, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, the size of the object presence range F1 is set to the lower limit r1 within a predetermined range from the imager 1 to the feature point F with the predetermined distance d1 of 15 m or more and 25 m or less, which is relatively close to the hull 101, and the water area map M that more accurately indicates the object O is created.
- while the present teaching is preferably applied to a marine vessel in the preferred embodiments described above, the present teaching is not restricted to this.
- the present teaching may alternatively be applied to a vehicle 201 shown in FIG. 9 .
- the vehicle 201 includes a surrounding object detection system 203.
- the surrounding object detection system 203 includes two imaging light receivers 1a, a display 2, and a controller 3 to create a surrounding map.
- the present teaching may alternatively be applied to a flying object such as a drone.
- the vehicle 201 and the flying object are examples of a "mobile body".
- while the marine vessel is preferably an outboard motor boat in the preferred embodiments described above, the present teaching is not restricted to this.
- the marine vessel may alternatively be a marine vessel other than an outboard motor boat.
- the marine vessel may be a marine vessel including an inboard motor, an inboard-outboard motor, or a jet propulsion device.
- while the imager preferably includes the two imaging light receivers each including a monocular camera in the preferred embodiments described above, the present teaching is not restricted to this.
- the imager may alternatively include stereo cameras, for example.
- the imager may include only one monocular camera.
- the marine vessel preferably includes a highly accurate GPS to detect the position of the hull and a highly accurate inertial measurement unit (IMU) to detect the attitude of the hull.
- while the size of one pixel indicating the water area map preferably corresponds to a size of 0.1 m vertically and 0.1 m horizontally in the world coordinate system in the preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the size of one pixel indicating the water area map may alternatively correspond to a size different from 0.1 m vertically and 0.1 m horizontally in the world coordinate system.
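The correspondence between world coordinates and map pixels implied by this cell size can be sketched as a simple floor mapping. This is an illustrative helper under the stated 0.1 m assumption; the function name and the choice of floor (rather than rounding) semantics are not from the patent.

```python
import math

def world_to_pixel(x_m, y_m, cell_m=0.1):
    """Map world coordinates in meters to pixel indices of the water
    area map, assuming one pixel covers a cell_m x cell_m square in the
    world coordinate system (0.1 m in the preferred embodiments)."""
    return int(math.floor(x_m / cell_m)), int(math.floor(y_m / cell_m))
```

With 0.1 m cells, a feature point at (1.25 m, 0.35 m) falls into pixel (12, 3).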
- while the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit is preferably 15 m or more and 25 m or less (about 21 m) in the preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit may alternatively be different from 15 m or more and 25 m or less.
- while a presence probability of "1" is preferably uniformly assigned to each pixel in the object presence range in Bayesian estimation in the preferred embodiments described above, the present teaching is not restricted to this.
- a presence probability of "1" may not be uniformly assigned to each pixel in the object presence range in Bayesian estimation.
- the presence probability of the feature point may be set to "1", and the presence probability may be decreased according to a Gaussian distribution, for example, as the distance from the feature point increases in the object presence range.
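The Gaussian alternative mentioned above can be sketched as follows. This is an illustrative profile only: the width parameter sigma is not specified in the document, so it defaults here to half the range radius.

```python
import math

def presence_probability(dist_from_feature, radius, sigma=None):
    """Presence probability within the object presence range that peaks
    at 1 at the feature point and decays with a Gaussian profile as the
    distance from the feature point increases; zero outside the range."""
    if dist_from_feature > radius:
        return 0.0
    if sigma is None:
        sigma = radius / 2.0  # arbitrary width chosen for illustration
    return math.exp(-(dist_from_feature ** 2) / (2.0 * sigma ** 2))
```

Compared with a uniform probability of "1", this profile concentrates confidence near the measured feature point while still covering the whole range.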
- while the object presence range is preferably indicated by a perfect circle in the water area map in the preferred embodiments described above, the present teaching is not restricted to this.
- the object presence range may alternatively be indicated by a shape different from a perfect circle, such as an ellipse.
Description
- The present invention relates to a water area object detection system, and more particularly, it relates to a water area object detection system that includes an imager and creates a surrounding map based on an image captured by the imager.
- An autonomous mobile robot that includes an imager and creates an environment map (surrounding map) based on an image captured by the imager is known in general. Such an autonomous mobile robot is disclosed by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, for example.
- Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454 disclose an autonomous mobile robot including an imager and a controller. The controller of the autonomous mobile robot creates a surrounding environment map (surrounding map) by detecting the posture and position of the autonomous mobile robot and a distance to an object around the autonomous mobile robot based on an image captured by the imager. At this time, the controller detects a measurement point (feature point) indicating an object, sets, around the measurement point, a probability distribution range (object presence range) in which the object around the autonomous mobile robot is probabilistically distributed, and shows the probability distribution range on the environment map. That is, a range in which an object that interferes with movement of the autonomous mobile robot is probably present, which is the probability distribution range in which the object is probabilistically distributed, is shown on the environment map.
- The controller increases the probability distribution range (object presence range) as the distance from the imager to the object increases, taking into account a distance measurement error. Specifically, the distance measurement error becomes greater as the distance from the imager to the object increases, and thus the controller also increases the probability distribution range (object presence range) according to the distance measurement error. Conversely, the distance measurement error becomes smaller as the distance from the imager to the object decreases, and thus the controller also reduces the probability distribution range (object presence range) according to the distance measurement error.
- Although not clearly described by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, techniques to create an environment map (surrounding map) around a mobile body using an imager include a technique to create an environment map in consideration of the real-time characteristics (responsiveness) of control by detecting a small number of measurement points (feature points) indicating objects from a captured image.
- When the environment map creation technique described above to decrease the number of measurement points in consideration of the real-time characteristics of control is used in the autonomous mobile robot described by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, a probability distribution range (object presence range) in which an object around the autonomous mobile robot is probabilistically distributed becomes smaller according to a distance measurement error as the imager is closer to the object, and thus the probability distribution range (object presence range) may become too small. Consequently, the probability distribution range (object presence range) may not include a portion in which the object is originally present, and a gap may occur between adjacent probability distribution ranges (object presence ranges). Therefore, it is desired to create an environment map that more accurately indicates an object near the imager.
- It is an object of the present invention to provide a water area object detection system that creates a water area map (surrounding map) that more accurately indicates an object near an imager. According to the present invention, said object is solved by a water area object detection system having the features of
independent claim 1. Preferred embodiments are laid down in the dependent claims. - A water area object detection system according to a preferred embodiment includes an imager configured to be provided on a hull and configured to capture an image around the hull, and a controller configured or programmed to perform a control to detect a feature point corresponding to an object in the image together with a distance to the feature point based on the image captured by the imager to create a water area map in which an object presence range including a likelihood that the object is present is set around the feature point. The controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set a size of the object presence range to a lower limit when the distance from the imager to the feature point is equal to or less than a predetermined distance.
- A water area object detection system according to a preferred embodiment includes the controller configured or programmed to create the water area map in which the object presence range including a likelihood that the object is present is set, and the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the imager is relatively near the object such that the distance from the imager to the feature point is equal to or less than the predetermined distance, the size of the object presence range is set to the lower limit, and thus an excessive reduction in the object presence range is prevented. Therefore, the possibility that in the water area map, a portion in which the object is originally present is not included in the object presence range, and a gap occurs between adjacent object presence ranges is reduced or prevented. Therefore, the water area map that more accurately indicates the object near the imager is created.
- In a water area object detection system according to a preferred embodiment, the lower limit is preferably set such that a plurality of the object presence ranges set for a plurality of the feature points corresponding to the same object partially overlap each other when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the distance from the imager to the feature point is equal to or less than the predetermined distance, i.e., when each of the plurality of object presence ranges set for the plurality of feature points corresponding to the same object is relatively small, the plurality of object presence ranges partially overlap each other, and thus the possibility that a portion in which the object is originally present is not included in the object presence range, and a gap occurs between the adjacent object presence ranges is more reliably reduced or prevented. Therefore, the water area map that still more accurately indicates the object near the imager is created.
- In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to automatically dock the hull by automatically moving the hull toward a shore structure corresponding to the object. Accordingly, the hull is easily docked at the shore structure.
- In such a case, the controller is preferably configured or programmed to, when an obstacle corresponding to the object is present between the hull and the shore structure when the hull is automatically docked, set the size of the object presence range to the lower limit or more, set a movement route configured to avoid the object presence range around the feature point corresponding to the obstacle, and automatically move the hull along the movement route. Accordingly, the sizes of the plurality of object presence ranges indicating the shore structure are set to the lower limit or more, and thus a gap between the plurality of object presence ranges indicating the shore structure is filled. Consequently, when the hull is automatically docked at the shore structure, contact of the hull with the obstacle is reduced or prevented.
- In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to change the size of the object presence range according to a distance measurement error that quadratically becomes greater as the distance from the imager to the feature point increases when the distance from the imager to the feature point is larger than the predetermined distance. Accordingly, when the distance from the imager to the feature point is larger than the predetermined distance, the size of the object presence range is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map that more accurately indicates the object imaged by the imager is created.
- In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to create the two-dimensional water area map extending horizontally by setting the object presence range in a horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map is created in consideration of an upward-downward direction (height direction), the processing load on the controller is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
- In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to redetect the feature point corresponding to the object in the image together with the distance to the feature point for each predetermined number of imaging frames of the imager to update the water area map. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability is converged in an earlier stage (an area in which the object is present has a higher probability, and an area in which the object is absent has a lower probability).
- In such a case, the controller is preferably configured or programmed to update the water area map using Bayesian estimation, and assign a current probability larger than an initial probability and a prior probability to the object presence range and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range to calculate a posterior probability using the Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map such that the probability that the noise such as bubbles is present in the water area map is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map.
- In a water area object detection system according to a preferred embodiment, the imager preferably includes two imaging light receivers configured to be spaced apart at different locations on the hull, and the controller is preferably configured or programmed to measure the distance from the imager to the feature point using the two imaging light receivers. Accordingly, two different images are captured simultaneously by the two imaging light receivers to measure the distance to the feature point by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point by the triangulation method, the accuracy of distance measurement is improved.
- A water area object detection system according to a preferred embodiment preferably further includes a display configured to be provided on the hull and configured to display the water area map, and the controller is preferably configured or programmed to perform a control to display the one feature point in one pixel of the display and set the object presence range having a perfect circular shape around the one pixel in which the one feature point is displayed to display the one feature point and the object presence range on the display. Accordingly, the boundary of the object presence range is set at positions to which distances from the feature point are equal to each other, and the water area map that more accurately indicates the object imaged by the imager is created.
- In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, the size of the object presence range is set to the lower limit within a predetermined range from the imager to the feature point with the predetermined distance of 15 m or more and 25 m or less, which is relatively close to the hull, and the water area map that more accurately indicates the object is created.
- The above and other elements, features, steps, characteristics and advantages of preferred embodiments will become more apparent from the following detailed description of the preferred embodiments with reference to the attached drawings.
- FIG. 1 is a side view showing a marine vessel including a water area object detection system according to a preferred embodiment.
- FIG. 2 is a diagram showing a water area map created by a water area object detection system according to a preferred embodiment.
- FIG. 3 is a plan view illustrating an actual object (such as a structure) in a range corresponding to a water area map.
- FIG. 4 is a schematic view illustrating the size of an object presence range on a water area map.
- FIG. 5 is a diagram illustrating a triangulation method using two imaging light receivers.
- FIG. 6 is an enlarged view showing a feature point and an object presence range on a water area map.
- FIG. 7 is a graph showing a relationship between a distance between an imager and a feature point and the size of an object presence range.
- FIG. 8 is a diagram illustrating a movement route that avoids an object (obstacle).
- FIG. 9 is a plan view showing a vehicle including a surrounding object detection system according to a modified example.
- Preferred embodiments are hereinafter described with reference to the drawings.
- The structure of a
marine vessel 100 including a water area object detection system 103 according to preferred embodiments is now described with reference to FIGS. 1 to 8. The water area object detection system 103 is an example of a "surrounding object detection system". - In the figures, arrow FWD represents the forward movement direction of the marine vessel 100 (front side with reference to a hull 101), and arrow BWD represents the reverse movement direction of the marine vessel 100 (rear side with reference to the hull 101). The
hull 101 is an example of a "mobile body". - In the figures, arrow L represents the portside direction of the marine vessel 100 (left side with respect to the hull 101), and arrow R represents the starboard direction of the marine vessel 100 (right side with respect to the hull 101).
- As shown in
FIG. 1, the marine vessel 100 includes the hull 101, a marine propulsion device 102 provided on the hull 101, and the water area object detection system 103 provided on or in the hull 101. - The
marine propulsion device 102 is attached to a transom of the hull 101 from behind. That is, in preferred embodiments, the marine propulsion device 102 is an outboard motor, and the marine vessel 100 is an outboard motor boat. - The
marine vessel 100 performs a control to estimate the self-position of the hull 101 in the water area map M while creating a two-dimensional water area map M (see FIG. 2) around the hull 101 that extends horizontally, using the water area object detection system 103. The water area map M is an example of a "surrounding map". - As an example, the control described above (the control to estimate the self-position of the
hull 101 in the water area map M while creating the water area map M) is achieved by simultaneous localization and mapping (SLAM). - The SLAM is a technique to simultaneously create an environment map around a mobile device and estimate the self-position of the mobile device in the environment map, using an image captured by a camera installed on the mobile device, for example. Unlike estimation of a self-position on a map using a global positioning system (GPS), for example, estimation of the self-position of the mobile device using the SLAM is able to be performed even in an environment such as indoors in which a GPS or the like is not usable.
- The SLAM enables the mobile device to move while avoiding surrounding objects so as to not collide with the objects, and to move along an optimal movement route without duplication of routes, for example.
- The SLAM includes passive SLAM (such as so-called visual SLAM) that uses an image sensor such as a camera to image a surrounding object, and active SLAM (such as so-called LiDAR SLAM) performed by irradiating a surrounding object with a laser beam of a laser device and detecting the reflected laser beam. The
marine vessel 100 according to preferred embodiments performs a control using a means such as the former passive SLAM. - The passive SLAM using an image sensor such as a camera includes a means to acquire dense detection data and a means to acquire sparse detection data. The means to acquire dense detection data requires a larger amount of data processing in a controller as compared with the means to acquire sparse detection data. The sparse detection data refers to data obtained by extracting a feature point that is a portion of an image captured by an image sensor such as a camera, for example.
- The
marine vessel 100 according to preferred embodiments performs a control using the means such as the SLAM to acquire the latter sparse detection data. Consequently, the marine vessel 100 computes acquired detection data more quickly, and performs a real-time control according to movement of the marine vessel 100. That is, the marine vessel 100 performs a highly responsive (real-time) control. - As shown in
FIGS. 2 and 3, the marine vessel 100 performs a control to automatically move while avoiding obstacles (objects O) along a movement route R using the water area map M created using the water area object detection system 103 and a control to automatically dock the hull 101 at a shore structure O1 such as a floating pier. - The
marine vessel 100 uses the water area map M as a means to know the positions of the obstacles (objects O) not only when the marine vessel 100 automatically moves but also when a user manually maneuvers the marine vessel 100. That is, the water area map M is a so-called cost map to indicate the positions of the obstacles (objects O) that are present around the marine vessel 100, for example. - The water area
object detection system 103 includes an imager 1 provided on the hull 101, a display 2 provided on the hull 101, and a controller 3 (see FIG. 1). - As shown in
FIG. 4, the water area object detection system 103 (controller 3) performs a control to create the water area map M in which an object presence range F1 including a likelihood that an object O is present is set around a feature point F by detecting, based on an image captured by the imager 1, the feature point F corresponding to the object O in the image together with a distance to the feature point F. - The
imager 1 captures an image around the hull 101. The imager 1 includes two imaging light receivers 1a spaced apart at different locations on the hull 101. Each imaging light receiver 1a includes a monocular camera including an imaging device such as a CCD sensor or a CMOS sensor. - The water area object detection system 103 (controller 3) measures a distance from the
imager 1 to the feature point F using the two imaging light receivers 1a. Specifically, the water area object detection system 103 (controller 3) measures the distance to the feature point F corresponding to the object O in a captured image by a triangulation method based on the images captured by the two imaging light receivers 1a. - The "feature point F corresponding to the object O in an image" refers to a specific point shown in a portion of the image in which the object O is located. As an example, the feature point F is set in a portion of the image in which there is a particularly large change in brightness or color tone.
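A minimal sketch of selecting feature points where brightness changes sharply, as described above; the function name, the neighbor-difference metric, and the threshold are hypothetical illustrations, not taken from the patent:

```python
def extract_feature_points(image, threshold=50):
    """Return sparse feature points: pixel coordinates (x, y) whose
    brightness differs sharply from the right or lower neighbor.
    `image` is a 2D list of grayscale values in 0..255."""
    points = []
    for y in range(len(image) - 1):
        for x in range(len(image[0]) - 1):
            gx = abs(image[y][x + 1] - image[y][x])  # horizontal brightness change
            gy = abs(image[y + 1][x] - image[y][x])  # vertical brightness change
            if max(gx, gy) > threshold:
                points.append((x, y))
    return points

# A dark frame with one bright block: only the block's edge pixels qualify
img = [[0] * 6 for _ in range(6)]
for y in range(2, 4):
    for x in range(2, 4):
        img[y][x] = 200
features = extract_feature_points(img)   # a handful of edge pixels, not all 36
```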
- The water area object detection system 103 (controller 3) preliminarily performs distortion correction of images captured by the two
imaging light receivers 1a, rectification to associate the images with each other, and parallax estimation by matching corresponding feature points F on the images, for example, as preprocessing for distance measurement by the triangulation method. - Measurement of the distance to the feature point F by the triangulation method performed by the water area object detection system 103 (controller 3) is now described with reference to
FIG. 5. - Assuming that L1 represents a distance in a horizontal direction between the two
imaging light receivers 1a, x represents a distance in a direction perpendicular to the horizontal direction between the two imaging light receivers 1a, d represents the parallax between the two imaging light receivers 1a, p represents the element pitch of the imaging devices of the imaging light receivers 1a, and f represents the focal length of one of the imaging light receivers 1a, the distance L in the horizontal direction from the other of the imaging light receivers 1a to the feature point F (object O) is obtained by the following formula. When the two imaging light receivers 1a are provided side by side like a stereo camera, L1 = 0.
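Under the definitions above, the triangulation relation (assumed here in the standard stereo form L = (f × x)/(p × d) + L1) and the one-pixel distance resolution ΔL can be sketched in code; the focal length, baseline, and pixel pitch values are illustrative, not from the patent:

```python
def triangulate_distance(f, x, p, d, L1=0.0):
    """Distance L [m] to a feature point from its parallax d [pixels].
    f: focal length [m], x: baseline [m], p: element pitch [m],
    L1: horizontal offset between the receivers (0 when side by side)."""
    return (f * x) / (p * d) + L1

def distance_resolution(f, x, p, d, L1=0.0):
    """Displacement ΔL of the measured distance when the parallax changes
    by one pixel: the distance resolution at parallax d."""
    return triangulate_distance(f, x, p, d, L1) - triangulate_distance(f, x, p, d + 1, L1)

# Illustrative setup: 4 mm focal length, 0.5 m baseline, 3 µm element pitch
f, x, p = 0.004, 0.5, 3e-6
near_res = distance_resolution(f, x, p, d=60)  # large parallax: nearby point
far_res = distance_resolution(f, x, p, d=6)    # small parallax: distant point
# far_res >> near_res: the resolution degrades quadratically with distance
```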
[Mathematical Formula 1]

L = (f × x)/(p × d) + L1

When the parallax changes by one pixel (from d to d + 1), the measurement distance changes by the following displacement amount ΔL:

ΔL = (f × x)/(p × d) − (f × x)/(p × (d + 1)) = (f × x)/(p × d × (d + 1))

- From the above formula showing the displacement amount ΔL of the measurement distance, it is understood that the displacement amount ΔL of the measurement distance increases as the parallax d decreases, and decreases as the parallax d increases. This displacement amount ΔL of the measurement distance corresponds to the distance resolution at the time of measuring the distance L. That is, as the distance L in the horizontal direction from the
imaging light receiver 1a to the feature point F increases, the distance measurement error becomes greater. The distance measurement error grows quadratically as the distance L increases. - As shown in
FIG. 2, the display 2 (see FIG. 1) displays the water area map M created by the controller 3 (see FIG. 1). As an example, the display 2 displays the water area map M within a display image of 450 pixels wide by 600 pixels long, in which a point group is plotted in a world coordinate system. As an example, the size of one pixel of the water area map M corresponds to a size of 0.1 m horizontally and 0.1 m vertically in the world coordinate system. - The
display 2 displays the imager 1 (imaging light receiver 1a) at a predetermined pixel position of the water area map M in a predetermined orientation. The display 2 also displays the hull 101 together with the imager 1 on the water area map M, as a schematic model of the hull 101. - The
display 2 displays the feature points F on the water area map M. Furthermore, the display 2 displays, on the water area map M, the object presence range F1 set around each feature point F and including a likelihood that the object O is present. As a specific example, the controller 3 performs a control to display one feature point F in one pixel P of the display 2 and to set a perfect circular object presence range F1 around the one pixel P in which the feature point F is displayed, so as to display the feature point F and the perfect circular object presence range F1 on the display 2 (see FIG. 6). - The object presence range F1 refers to a range set around the feature point F and including a likelihood that the object O is present. In short, the object presence range F1 refers to a range in which the
object O is probabilistically present. In other words, the object presence range F1 should be avoided when the marine vessel 100 moves, and a movement route R (see FIGS. 2 and 8) for automatic movement is not set in the object presence range F1. - The
display 2 displays, on the water area map M, an imaging area A with a predetermined angle of view indicating the range currently being imaged by the imager 1. - The controller 3 (see
FIG. 1) includes a circuit board including a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), etc., for example. The controller 3 is provided in the hull 101 and is connected to the two imaging light receivers 1a, the display 2, and the marine propulsion device 102 by signal lines. - The
controller 3 performs a control to automatically move the hull 101 by setting the movement route R for automatic movement based on the created water area map M and controlling driving of the marine propulsion device 102. - As described above, the
controller 3 performs a control to create the water area map M in which the perfect circular object presence range F1 including a likelihood that the object O is present is set around the feature point F by detecting the feature point F corresponding to the object O in the image, together with the distance to the feature point F, based on the image captured by the imager 1. - The
controller 3 creates the two-dimensional water area map M extending in the horizontal direction of the hull 101 by setting the object presence range F1 in a horizontal plane. That is, the controller 3 creates the two-dimensional water area map M extending in the forward-rearward direction and the right-left direction (arrows FWD, BWD, L, and R) of the hull 101. - As shown in
FIGS. 4 and 7, when the two-dimensional water area map M is created, the controller 3 reduces the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases. When the distance from the imager 1 to the feature point F is equal to or less than a predetermined distance d1, the size of the object presence range F1 is set to a lower limit r1. In other words, the controller 3 sets a minimum value for the size of the object presence range F1 such that the object presence range F1 does not become too small. The predetermined distance d1 refers to the distance that determines the range in which the size of the object presence range F1 is set to the lower limit. - The
controller 3 can change the predetermined distance d1 by setting. As an example, the controller 3 sets the predetermined distance d1 to 15 m or more and 25 m or less, and sets the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. As a specific example, the controller 3 sets the predetermined distance d1 to about 21 m, and sets the size of the object presence range F1 (the distance from the feature point F to the outer edge of the perfect circular object presence range F1, i.e., the radius of the object presence range F1) to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1 (about 21 m). That is, in a range relatively close to the hull 101, in which the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to the constant lower limit r1. - When the distance from the
imager 1 to the feature point F is equal to or less than the predetermined distance d1, the lower limit r1 is set such that a plurality of object presence ranges F1 set for a plurality of feature points F corresponding to the same object O partially overlap each other. That is, the controller 3 reduces or prevents the occurrence of a gap between adjacent object presence ranges F1 among the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O. If the lower limit r1 were not set for the size of the object presence range, the object presence range would continue to shrink toward the feature point as the distance from the imager to the feature point decreases, and a gap would be more likely to occur between adjacent object presence ranges. - When the distance from the
imager 1 to the feature point F is larger than the predetermined distance d1, the controller 3 changes the size of the object presence range according to the distance measurement error, which grows quadratically as the distance from the imager 1 to the feature point F increases (see FIG. 7). - As a specific example, the
controller 3 increases the size of the object presence range F1 in proportion to the magnitude of the distance measurement error as the distance from the imager 1 to the feature point F increases, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1. - For example, when the distance from the
imager 1 to the feature point F is d2, larger than the predetermined distance d1, the controller 3 sets the size (radius) of the object presence range F1 to r2, larger than the lower limit r1. Furthermore, the controller 3 sets the size (radius) of the object presence range F1 to r3, larger than r2, when the distance from the imager 1 to the feature point F is d3, larger than the distance d2. - The
controller 3 performs a control to redetect the feature point F corresponding to the object O in the image, together with the distance to the feature point F, for each imaging frame of the imager 1, and to compare the redetected feature point F with the currently used feature point F. The controller 3 performs a control to reject the redetected feature point F and continue to use the currently used feature point F when the degree of change is smaller than a predetermined value, and performs a control to adopt the newly detected feature point F and update the water area map M and the position of the imager 1 in the water area map M when the degree of change is conversely larger than the predetermined value. Furthermore, the controller 3 updates the water area map M every predetermined number of imaging frames (every ten frames, for example). - The
controller 3 updates the water area map M using Bayesian estimation. The controller 3 performs Bayesian estimation for each pixel P and updates a presence probability indicating that the object O is present at that pixel P. For example, the controller 3 assigns a current probability larger than the initial probability and the prior probability to the object presence range F1, and assigns a current probability smaller than the initial probability and the prior probability to the range (absence range) outside the object presence range F1, to calculate a posterior probability using Bayesian estimation. The formula for Bayesian estimation is "posterior probability = (current probability × prior probability)/initial probability". - As a specific example, a presence probability of "0.5" is assigned as the initial probability to all pixels P, a presence probability of "1" is uniformly assigned to each pixel P in the object presence range F1, and a presence probability of "0.35" is uniformly assigned to the range outside the object presence range F1. The
controller 3 sets the current probability and the initial probability to the same value, so that the posterior probability equals the prior probability, for pixels outside the range of the angle of view of the imager 1. - As shown in
FIGS. 2, 3, and 8, the controller 3 controls driving of the marine propulsion device 102 to move the hull 101. - As an example, the
controller 3 automatically docks the hull 101 by automatically moving the hull 101 toward the shore structure O1 (see FIG. 3) corresponding to the object O. When automatically docking the hull 101, if an obstacle corresponding to an object O is present between the hull 101 and the shore structure O1, the controller 3 sets the size of the object presence range F1 to the lower limit r1 or more, sets the movement route R so as to avoid the object presence range F1 around the feature point F corresponding to the obstacle, and automatically moves the hull 101 along the movement route R.
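The sizing rule described above (a constant lower limit r1 up to the predetermined distance d1, then a radius growing with the quadratic measurement error) might be sketched as follows; d1 follows the about-21 m example in the text, while r1 and the proportionality constant are hypothetical:

```python
D1 = 21.0             # predetermined distance d1 [m] (example value from the text)
R1 = 0.5              # lower-limit radius r1 [m] (hypothetical)
K = R1 / (D1 * D1)    # hypothetical factor chosen so the radius is continuous at d1

def presence_range_radius(distance_m):
    """Radius of the object presence range F1 for a feature point at the
    given distance: the constant lower limit r1 up to d1, then a radius
    proportional to the quadratically growing distance measurement error."""
    if distance_m <= D1:
        return R1
    return K * distance_m ** 2    # error ~ L^2, so radius ~ L^2 beyond d1

# Within d1 the radius stays at r1; beyond d1 it grows monotonically
radii = [presence_range_radius(d) for d in (5.0, 21.0, 30.0, 60.0)]
```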
- According to a preferred embodiment, the water area
object detection system 103 includes the controller 3 configured or programmed to create the water area map M in which the object presence range F1 including a likelihood that the object O is present is set. The controller 3 is configured or programmed to reduce the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases, and to set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, when the imager 1 is relatively near the object O such that the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to the lower limit r1, and thus an excessive reduction in the object presence range F1 is prevented. Therefore, the possibility that, in the water area map M, a portion in which the object O is actually present is not included in the object presence range F1 and a gap occurs between adjacent object presence ranges F1 is reduced or prevented. Thus, a water area map M that accurately indicates the object O near the imager 1 is created. - According to a preferred embodiment, the lower limit r1 is set such that the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the
same object O partially overlap each other when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, i.e., when each of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O is relatively small, the plurality of object presence ranges F1 partially overlap each other, and thus the possibility that a portion in which the object O is actually present is not included in the object presence range F1 and a gap occurs between adjacent object presence ranges F1 is more reliably reduced or prevented. Therefore, a water area map M that still more accurately indicates the object O near the imager 1 is created. - According to a preferred embodiment, the
controller 3 is configured or programmed to automatically dock the hull 101 by automatically moving the hull 101 toward the shore structure O1 corresponding to the object O. Accordingly, the hull 101 is easily docked at the shore structure O1. - According to a preferred embodiment, the
controller 3 is configured or programmed to, when an obstacle corresponding to the object O is present between the hull 101 and the shore structure O1 while the hull 101 is automatically docked, set the size of the object presence range F1 to the lower limit r1 or more, set the movement route R so as to avoid the object presence range F1 around the feature point F corresponding to the obstacle, and automatically move the hull 101 along the movement route R. Accordingly, the sizes of the plurality of object presence ranges F1 indicating the shore structure O1 are set to the lower limit r1 or more, and thus gaps between the plurality of object presence ranges F1 indicating the shore structure O1 are filled. Consequently, when the hull 101 is automatically docked at the shore structure O1, contact of the hull 101 with the obstacle is reduced or prevented. - According to a preferred embodiment, the
controller 3 is configured or programmed to change the size of the object presence range F1 according to the distance measurement error, which grows quadratically as the distance from the imager 1 to the feature point F increases, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the size of the object presence range F1 is set to an appropriate size in consideration of the distance measurement error. Therefore, a water area map M that more accurately indicates the object O imaged by the imager 1 is created. - According to a preferred embodiment, the
controller 3 is configured or programmed to create the two-dimensional water area map M extending in the horizontal direction by setting the object presence range F1 in the horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map M is created in consideration of an upward-downward direction (height direction), the processing load on the controller 3 is reduced, and thus the real-time characteristics (responsiveness) of the control are improved. - According to a preferred embodiment, the
controller 3 is configured or programmed to redetect the feature point F corresponding to the object O in the image, together with the distance to the feature point F, for each predetermined number of imaging frames of the imager 1 to update the water area map M. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability converges at an earlier stage (an area in which the object O is present takes a higher probability, and an area in which the object O is absent takes a lower probability). - According to a preferred embodiment, the
controller 3 is configured or programmed to update the water area map M using Bayesian estimation, and to assign a current probability larger than the initial probability and the prior probability to the object presence range F1 and a current probability smaller than the initial probability and the prior probability to the range outside the object presence range F1, to calculate the posterior probability using Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a given frame, Bayesian estimation is repeatedly performed to update the water area map M such that, after the noise disappears, the probability that the noise such as bubbles is present in the water area map M is reduced. Therefore, the noise is reliably removed from the water area map M. - According to a preferred embodiment, the
imager 1 includes the two imaging light receivers 1a spaced apart at different locations on the hull 101, and the controller 3 is configured or programmed to measure the distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Accordingly, two different images are captured simultaneously by the two imaging light receivers 1a to measure the distance to the feature point F by the triangulation method, and thus, as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point F by the triangulation method, the accuracy of the distance measurement is improved. - According to a preferred embodiment, the water area
object detection system 103 further includes the display 2 provided on the hull 101 to display the water area map M, and the controller 3 is configured or programmed to perform a control to display one feature point F in one pixel P of the display 2 and to set the perfect circular object presence range F1 around the one pixel P in which the feature point F is displayed, so as to display the feature point F and the perfect circular object presence range F1 on the display 2. Accordingly, the boundary of the object presence range F1 is set at positions equidistant from the feature point F, and a water area map M that more accurately indicates the object O imaged by the imager 1 is created. - According to a preferred embodiment, the
controller 3 is configured or programmed to set the predetermined distance d1 to 15 m or more and 25 m or less, and to set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, the size of the object presence range F1 is set to the lower limit r1 within a range, relatively close to the hull 101, in which the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1 of 15 m or more and 25 m or less, and a water area map M that more accurately indicates the object O is created. - The preferred embodiments described above are illustrative of the present teaching, but the present teaching also relates to modifications of the preferred embodiments.
- For example, while the present teaching is preferably applied to a marine vessel in preferred embodiments described above, the present teaching is not restricted to this. The present teaching may alternatively be applied to a
vehicle 201 shown in FIG. 9. The vehicle 201 includes a surrounding object detection system 203, which includes two imaging light receivers 1a, a display 2, and a controller 3 to create a surrounding map. In addition, the present teaching may alternatively be applied to a flying object such as a drone. The vehicle 201 and the flying object are examples of a "mobile body". - While the marine vessel is preferably an outboard motor boat in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the marine vessel may alternatively be a marine vessel other than an outboard motor boat. For example, the marine vessel may be a marine vessel including an inboard motor, an inboard-outboard motor, or a jet propulsion device.
- While the imager preferably includes the two imaging light receivers each including a monocular camera in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the imager may alternatively include stereo cameras, for example. Alternatively, the imager may include only one monocular camera. In such a case, the marine vessel preferably includes a highly accurate GPS to detect the position of the hull and a highly accurate inertial measurement unit (IMU) to detect the attitude of the hull.
- While the size of one pixel indicating the water area map preferably corresponds to a size of 0.1 m vertically and 0.1 m horizontally in the world coordinate system in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the size of one pixel indicating the water area map may alternatively correspond to a size different from 0.1 m vertically and 0.1 m horizontally in the world coordinate system.
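For the 0.1 m-per-pixel example, plotting a world-coordinate point into the 450 × 600 pixel map is a simple quantization; the origin placement and function name below are hypothetical choices for illustration:

```python
PIXELS_PER_METER = 10      # 0.1 m per pixel, as in the example above
MAP_W, MAP_H = 450, 600    # map display size in pixels

def world_to_pixel(x_m, y_m, origin_px=(225, 300)):
    """Convert world coordinates [m] to a (col, row) map pixel, assuming a
    hypothetical origin at the center of the map and rows growing downward."""
    col = origin_px[0] + round(x_m * PIXELS_PER_METER)
    row = origin_px[1] - round(y_m * PIXELS_PER_METER)
    return col, row

# A feature point 5 m to starboard and 10 m ahead of the origin
print(world_to_pixel(5.0, 10.0))   # -> (275, 200)
```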
- While the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit is preferably 15 m or more and 25 m or less (about 21 m) in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit may alternatively be different from 15 m or more and 25 m or less.
- While a presence probability of "1" is preferably uniformly assigned to each pixel in the object presence range in Bayesian estimation in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, a presence probability of "1" may not be uniformly assigned to each pixel in the object presence range in Bayesian estimation. For example, the presence probability at the feature point may be set to "1", and the presence probability may be decreased according to a Gaussian distribution as the distance from the feature point increases within the object presence range.
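Combining the Bayesian update formula from the description (posterior = current × prior / initial, with 0.5 initial, 1 inside the presence range, 0.35 outside) with the Gaussian-decay variant suggested here, a per-pixel sketch might look as follows; clamping the posterior to 1 and the Gaussian width are added assumptions, not from the patent:

```python
import math

INITIAL = 0.5    # initial presence probability assigned to all pixels

def current_probability(dist_from_feature, radius, sigma=0.5):
    """Presence probability observed at a pixel: 1 at the feature point,
    decaying along a Gaussian inside the presence range (given radius),
    and 0.35 in the absence range outside it."""
    if dist_from_feature > radius:
        return 0.35
    return math.exp(-(dist_from_feature ** 2) / (2 * (sigma * radius) ** 2))

def bayes_update(prior, current):
    """One Bayesian update: posterior = current * prior / initial,
    clamped to 1 (an added assumption, since the raw ratio can exceed 1)."""
    return min(1.0, current * prior / INITIAL)

# Repeated observations drive a pixel at the feature point toward 1,
# and a pixel outside the range (e.g. after bubbles vanish) toward 0
p_in, p_out = INITIAL, INITIAL
for _ in range(3):
    p_in = bayes_update(p_in, current_probability(0.0, radius=1.0))
    p_out = bayes_update(p_out, current_probability(2.0, radius=1.0))
```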
- While the object presence range is preferably indicated by a perfect circle in the water area map in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the object presence range may alternatively be indicated by a shape different from a perfect circle, such as an ellipse.
Claims (11)
- A water area object detection system (103) comprising: an imager (1) configured to be provided on a hull (101) and configured to capture an image around the hull (101); and a controller (3) configured or programmed to perform a control to detect a feature point (F) corresponding to an object (O) in the image together with a distance to the feature point (F) based on the image captured by the imager (1) to create a water area map (M) in which an object presence range (F1) including a likelihood that the object (O) is present is set around the feature point (F); wherein the controller (3) is configured or programmed to reduce the object presence range (F1) as the distance from the imager (1) of the hull (101) to the feature point (F) corresponding to the object (O) decreases, and set a size of the object presence range (F1) to a lower limit (r1) when the distance from the imager (1) to the feature point (F) is equal to or less than a predetermined distance.
- The water area object detection system according to claim 1, wherein the lower limit (r1) is set such that a plurality of the object presence ranges (F1) set for a plurality of the feature points (F) corresponding to the same object (O) partially overlap each other when the distance from the imager (1) to the feature point (F) is equal to or less than the predetermined distance.
- The water area object detection system according to claim 1 or 2, wherein the controller (3) is configured or programmed to automatically dock the hull (101) by automatically moving the hull (101) toward a shore structure (O1) corresponding to the object (O).
- The water area object detection system according to claim 3, wherein the controller (3) is configured or programmed to, when an obstacle corresponding to the object (O) is present between the hull (101) and the shore structure (O1) when the hull (101) is automatically docked, set the size of the object presence range (F1) to the lower limit (r1) or more, set a movement route (R) configured to avoid the object presence range (F1) around the feature point (F) corresponding to the obstacle, and automatically move the hull (101) along the movement route (R).
- The water area object detection system according to any one of claims 1 to 4, wherein the controller (3) is configured or programmed to change the size of the object presence range (F1) according to a distance measurement error that quadratically becomes greater as the distance from the imager (1) to the feature point (F) increases when the distance from the imager (1) to the feature point (F) is larger than the predetermined distance.
- The water area object detection system according to any one of claims 1 to 5, wherein the controller (3) is configured or programmed to create a two-dimensional water area map extending horizontally by setting the object presence range (F1) in a horizontal plane.
- The water area object detection system according to any one of claims 1 to 6, wherein the controller (3) is configured or programmed to redetect the feature point (F) corresponding to the object (O) in the image together with the distance to the feature point (F) for each predetermined number of imaging frames of the imager (1) to update the water area map.
- The water area object detection system according to claim 7, wherein the controller (3) is configured or programmed to: update the water area map using Bayesian estimation; and assign a current probability larger than an initial probability and a prior probability to the object presence range (F1) and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range (F1) to calculate a posterior probability using the Bayesian estimation.
- The water area object detection system according to any one of claims 1 to 8, wherein the imager (1) includes two imaging light receivers (1a) configured to be spaced apart at different locations on the hull (101); and the controller (3) is configured or programmed to measure the distance from the imager (1) to the feature point (F) using the two imaging light receivers (1a).
- The water area object detection system according to any one of claims 1 to 9, further comprising: a display (2) configured to be provided on the hull (101) and configured to display the water area map; wherein the controller (3) is configured or programmed to perform a control to display one feature point (F) in one pixel (P) of the display (2) and set the object presence range (F1) having a perfect circular shape around the one pixel in which the one feature point (F) is displayed to display the one feature point and the object presence range (F1) on the display (2).
- The water area object detection system according to any one of claims 1 to 10, wherein the controller (3) is configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range (F1) to the lower limit (r1) when the distance from the imager (1) to the feature point (F) is equal to or less than the predetermined distance.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2022004596A JP7328378B2 (en) | 2022-01-14 | 2022-01-14 | Aquatic Object Detection System, Vessel and Peripheral Object Detection System |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4213111A1 (en) | 2023-07-19 |
Family
ID=84888758
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP23150776.5A Pending EP4213111A1 (en) | 2022-01-14 | 2023-01-09 | System for detecting objects on a water surface |
Country Status (3)
Country | Link |
---|---|
US (1) | US20230228575A1 (en) |
EP (1) | EP4213111A1 (en) |
JP (1) | JP7328378B2 (en) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190308712A1 (en) * | 2016-12-02 | 2019-10-10 | Yamaha Hatsudoki Kabushiki Kaisha | Boat |
US20210406560A1 (en) * | 2020-06-25 | 2021-12-30 | Nvidia Corporation | Sensor fusion for autonomous machine applications using machine learning |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5472538B2 (en) | 2011-06-14 | 2014-04-16 | 日産自動車株式会社 | Distance measuring device and environmental map generating device |
JP7232089B2 (en) | 2019-03-19 | 2023-03-02 | ヤマハ発動機株式会社 | Display device for ship, ship and image display method for ship |
2022
- 2022-01-14 JP JP2022004596A patent/JP7328378B2/en active Active

2023
- 2023-01-04 US US18/092,964 patent/US20230228575A1/en active Pending
- 2023-01-09 EP EP23150776.5A patent/EP4213111A1/en active Pending
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190308712A1 (en) * | 2016-12-02 | 2019-10-10 | Yamaha Hatsudoki Kabushiki Kaisha | Boat |
US20210406560A1 (en) * | 2020-06-25 | 2021-12-30 | Nvidia Corporation | Sensor fusion for autonomous machine applications using machine learning |
Non-Patent Citations (2)
Title |
---|
SE S ET AL: "Mobile Robot Localization and Mapping with Uncertainty using Scale-Invariant Visual Landmarks", THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH,, vol. 21, no. 8, 31 August 2002 (2002-08-31), pages 735 - 758, XP002693739, DOI: 10.1177/027836402761412467 * |
TATSUNORI KOU; KENJI SUZUKI; SHUJI HASHIMOTO: "3D Environment Recognition and Mapping for Autonomous Mobile Robots", THE 66TH NATIONAL MEETING OF THE INFORMATION PROCESSING SOCIETY OF JAPAN, Waseda University, pages 2 - 453 |
Also Published As
Publication number | Publication date |
---|---|
JP7328378B2 (en) | 2023-08-16 |
US20230228575A1 (en) | 2023-07-20 |
JP2023103836A (en) | 2023-07-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11029686B2 (en) | Automatic location placement system | |
JP7258062B2 (en) | Automatic positioning system | |
KR101683274B1 (en) | System for supporting vessel berth using unmanned aerial vehicle and the method thereof | |
KR102530691B1 (en) | Device and method for monitoring a berthing | |
JP5000244B2 (en) | Docking support device and ship equipped with the same | |
Park et al. | Development of an unmanned surface vehicle system for the 2014 Maritime RobotX Challenge | |
Hurtós et al. | Autonomous detection, following and mapping of an underwater chain using sonar | |
JP2018177074A (en) | Autonomous type underwater robot and control method for the same | |
US11480965B2 (en) | Automatic location placement system | |
KR102530847B1 (en) | Method and device for monitoring harbor and ship | |
US20220122465A1 (en) | Unmanned aircraft system, a control system of a marine vessel and a method for controlling a navigation system of a marine vessel | |
EP4053822A1 (en) | Ship docking assistance device | |
EP3860908A1 (en) | System and method for assisting docking of a vessel | |
CN115131720A (en) | Ship berthing assisting method based on artificial intelligence | |
EP4213111A1 (en) | System for detecting objects on a water surface | |
KR20230074438A (en) | Device and method for monitoring ship and port | |
US11741193B2 (en) | Distance recognition system for use in marine vessel, control method thereof, and marine vessel | |
EP4213110A1 (en) | System for detecting objects on a water surface and marine vessel with a system for detecting objects on a water surface | |
Hurtos et al. | Sonar-based chain following using an autonomous underwater vehicle | |
Kang et al. | Development of USV autonomy for the 2014 maritime RobotX challenge | |
TWI835431B (en) | Ship docking system and ship docking method | |
Harada et al. | Experimental study on collision avoidance procedures for plastic waste cleaner USV | |
CN116149316A (en) | Visual guidance method for unmanned ship dynamic recovery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED |
|
AK | Designated contracting states |
Kind code of ref document: A1 |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
P01 | Opt-out of the competence of the unified patent court (upc) registered |
Effective date: 20230807 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20240117 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR |