EP4213111A1 - System for detecting objects on a water surface - Google Patents


Info

Publication number
EP4213111A1
Authority
EP
European Patent Office
Prior art keywords
feature point
water area
distance
imager
controller
Prior art date
Legal status
Pending
Application number
EP23150776.5A
Other languages
German (de)
French (fr)
Inventor
Kazumichi Yoshida
Current Assignee
Yamaha Motor Co Ltd
Original Assignee
Yamaha Motor Co Ltd
Priority date
Filing date
Publication date
Application filed by Yamaha Motor Co Ltd filed Critical Yamaha Motor Co Ltd


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/203 Specially adapted for sailing ships
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B49/00 Arrangements of nautical instruments or navigational aids
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B63 SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63B SHIPS OR OTHER WATERBORNE VESSELS; EQUIPMENT FOR SHIPPING
    • B63B79/00 Monitoring properties or operating parameters of vessels in operation
    • B63B79/40 Monitoring properties or operating parameters of vessels in operation for controlling the operation of vessels, e.g. monitoring their speed, routing or maintenance schedules
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/0206 Control of position or course in two dimensions specially adapted to water vehicles
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/771 Feature selection, e.g. selecting representative features from a multi-dimensional feature space

Definitions

  • the present invention relates to a water area object detection system, and more particularly, it relates to a water area object detection system that includes an imager and creates a surrounding map based on an image captured by the imager.
  • An autonomous mobile robot that includes an imager and creates an environment map (surrounding map) based on an image captured by the imager is known in general.
  • Such an autonomous mobile robot is disclosed by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454 , for example.
  • The cited document discloses an autonomous mobile robot including an imager and a controller.
  • the controller of the autonomous mobile robot creates a surrounding environment map (surrounding map) by detecting the posture and position of the autonomous mobile robot and a distance to an object around the autonomous mobile robot based on an image captured by the imager.
  • the controller detects a measurement point (feature point) indicating an object, sets, around the measurement point, a probability distribution range (object presence range) in which the object around the autonomous mobile robot is probabilistically distributed, and shows the probability distribution range on the environment map. That is, a range in which an object that interferes with movement of the autonomous mobile robot is probably present, which is the probability distribution range in which the object is probabilistically distributed, is shown on the environment map.
  • the controller increases the probability distribution range (object presence range) as the distance from the imager to the object increases, taking into account a distance measurement error. Specifically, the distance measurement error becomes greater as the distance from the imager to the object increases, and thus the controller also increases the probability distribution range (object presence range) according to the distance measurement error. Conversely, the distance measurement error becomes smaller as the distance from the imager to the object decreases, and thus the controller also reduces the probability distribution range (object presence range) according to the distance measurement error.
  • Techniques to create an environment map (surrounding map) around a mobile body using an imager include a technique to create an environment map in consideration of the real-time characteristics (responsiveness) of control by detecting a small number of measurement points (feature points) indicating objects from a captured image.
  • a probability distribution range (object presence range) in which an object around the autonomous mobile robot is probabilistically distributed becomes smaller according to a distance measurement error as the imager is closer to the object, and thus the probability distribution range (object presence range) may become too small.
  • the probability distribution range may not include a portion in which the object is originally present, and a gap may occur between adjacent probability distribution ranges (object presence ranges). Therefore, it is desired to create an environment map that more accurately indicates an object near the imager.
  • said object is solved by a water area object detection system having the features of independent claim 1. Preferred embodiments are laid down in the dependent claims.
  • a water area object detection system includes an imager configured to be provided on a hull and configured to capture an image around the hull, and a controller configured or programmed to perform a control to detect a feature point corresponding to an object in the image together with a distance to the feature point based on the image captured by the imager to create a water area map in which an object presence range including a likelihood that the object is present is set around the feature point.
  • the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set a size of the object presence range to a lower limit when the distance from the imager to the feature point is equal to or less than a predetermined distance.
  • a water area object detection system includes the controller configured or programmed to create the water area map in which the object presence range including a likelihood that the object is present is set, and the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the imager is relatively near the object such that the distance from the imager to the feature point is equal to or less than the predetermined distance, the size of the object presence range is set to the lower limit, and thus an excessive reduction in the object presence range is prevented.
  • the lower limit is preferably set such that a plurality of the object presence ranges set for a plurality of the feature points corresponding to the same object partially overlap each other when the distance from the imager to the feature point is equal to or less than the predetermined distance.
  • When the distance from the imager to the feature point is equal to or less than the predetermined distance, i.e., when each of the plurality of object presence ranges set for the plurality of feature points corresponding to the same object is relatively small, the plurality of object presence ranges partially overlap each other, and thus the possibility that a portion in which the object is originally present is not included in the object presence range and that a gap occurs between the adjacent object presence ranges is more reliably reduced or prevented. Therefore, the water area map that still more accurately indicates the object near the imager is created.
  • the controller is preferably configured or programmed to automatically dock the hull by automatically moving the hull toward a shore structure corresponding to the object. Accordingly, the hull is easily docked at the shore structure.
  • the controller is preferably configured or programmed to, when an obstacle corresponding to the object is present between the hull and the shore structure when the hull is automatically docked, set the size of the object presence range to the lower limit or more, set a movement route configured to avoid the object presence range around the feature point corresponding to the obstacle, and automatically move the hull along the movement route.
  • the sizes of the plurality of object presence ranges indicating the shore structure are set to the lower limit or more, and thus a gap between the plurality of object presence ranges indicating the shore structure is filled. Consequently, when the hull is automatically docked at the shore structure, contact of the hull with the obstacle is reduced or prevented.
  • the controller is preferably configured or programmed to change the size of the object presence range according to a distance measurement error that quadratically becomes greater as the distance from the imager to the feature point increases when the distance from the imager to the feature point is larger than the predetermined distance. Accordingly, when the distance from the imager to the feature point is larger than the predetermined distance, the size of the object presence range is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map that more accurately indicates the object imaged by the imager is created.
  • the controller is preferably configured or programmed to create the horizontally extending two-dimensional water area map by setting the object presence range in a horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map is created in consideration of an upward-downward direction (height direction), the processing load on the controller is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
  • the controller is preferably configured or programmed to redetect the feature point corresponding to the object in the image together with the distance to the feature point for each predetermined number of imaging frames of the imager to update the water area map. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability converges at an earlier stage (an area in which the object is present has a higher probability, and an area in which the object is absent has a lower probability).
  • the controller is preferably configured or programmed to update the water area map using Bayesian estimation, and assign a current probability larger than an initial probability and a prior probability to the object presence range and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range to calculate a posterior probability using the Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map such that the probability that the noise such as bubbles is present in the water area map is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map.
  • the imager preferably includes two imaging light receivers configured to be spaced apart at different locations on the hull, and the controller is preferably configured or programmed to measure the distance from the imager to the feature point using the two imaging light receivers. Accordingly, two different images are captured simultaneously by the two imaging light receivers to measure the distance to the feature point by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point by the triangulation method, the accuracy of distance measurement is improved.
  • a water area object detection system preferably further includes a display configured to be provided on the hull and configured to display the water area map, and the controller is preferably configured or programmed to perform a control to display the one feature point in one pixel of the display and set the object presence range having a perfect circular shape around the one pixel in which the one feature point is displayed to display the one feature point and the object presence range on the display. Accordingly, the boundary of the object presence range is set at positions to which distances from the feature point are equal to each other, and the water area map that more accurately indicates the object imaged by the imager is created.
  • the controller is preferably configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, the size of the object presence range is set to the lower limit within a predetermined range from the imager to the feature point with the predetermined distance of 15 m or more and 25 m or less, which is relatively close to the hull, and the water area map that more accurately indicates the object is created.
  • the structure of a marine vessel 100 including a water area object detection system 103 is now described with reference to FIGS. 1 to 8 .
  • the water area object detection system 103 is an example of a "surrounding object detection system".
  • arrow FWD represents the forward movement direction of the marine vessel 100 (front side with reference to a hull 101)
  • arrow BWD represents the reverse movement direction of the marine vessel 100 (rear side with reference to the hull 101).
  • the hull 101 is an example of a "mobile body".
  • arrow L represents the portside direction of the marine vessel 100 (left side with respect to the hull 101), and arrow R represents the starboard direction of the marine vessel 100 (right side with respect to the hull 101).
  • the marine vessel 100 includes the hull 101, a marine propulsion device 102 provided on the hull 101, and the water area object detection system 103 provided on or in the hull 101.
  • the marine propulsion device 102 is attached to a transom of the hull 101 from behind. That is, in preferred embodiments, the marine propulsion device 102 is an outboard motor, and the marine vessel 100 is an outboard motor boat.
  • the marine vessel 100 performs a control to create a horizontally extending two-dimensional water area map M (see FIG. 2 ) around the hull 101 while estimating the self-position of the hull 101 in the water area map M, using the water area object detection system 103.
  • the water area map M is an example of a "surrounding map".
  • The control described above is achieved by simultaneous localization and mapping (SLAM).
  • the SLAM is a technique to simultaneously create an environment map around a mobile device and estimate the self-position of the mobile device in the environment map, using an image captured by a camera installed on the mobile device, for example.
  • estimation of the self-position of the mobile device using the SLAM is able to be performed even in an environment such as indoors in which a GPS or the like is not usable.
  • the SLAM enables the mobile device to move while avoiding surrounding objects so as to not collide with the objects, and to move along an optimal movement route without duplication of routes, for example.
  • the SLAM includes passive SLAM (such as so-called visual SLAM) that uses an image sensor such as a camera to image a surrounding object, and active SLAM (such as so-called LiDAR SLAM) performed by irradiating a surrounding object with a laser beam of a laser device and detecting the reflected laser beam.
  • the marine vessel 100 performs a control using the former means, i.e., passive SLAM.
  • the passive SLAM using an image sensor such as a camera includes a means to acquire dense detection data and a means to acquire sparse detection data.
  • the means to acquire dense detection data requires a larger amount of data processing in a controller as compared with the means to acquire sparse detection data.
  • the sparse detection data refers to data obtained by extracting a feature point that is a portion of an image captured by an image sensor such as a camera, for example.
  • the marine vessel 100 performs a control using the SLAM means that acquires the latter, sparse detection data. Consequently, the marine vessel 100 computes acquired detection data more quickly, and performs a real-time control according to movement of the marine vessel 100. That is, the marine vessel 100 performs a highly responsive (real-time) control.
  • the marine vessel 100 performs a control to automatically move while avoiding obstacles (objects O) along a movement route R using the water area map M created using the water area object detection system 103 and a control to automatically dock the hull 101 at a shore structure O1 such as a floating pier.
  • the marine vessel 100 uses the water area map M as a means to know the positions of the obstacles (objects O) not only when the marine vessel 100 automatically moves but also when a user manually maneuvers the marine vessel 100. That is, the water area map M is a so-called cost map that indicates the positions of the obstacles (objects O) present around the marine vessel 100, for example.
  • the water area object detection system 103 includes an imager 1 provided on the hull 101, a display 2 provided on the hull 101, and a controller 3 (see FIG. 1 ).
  • the water area object detection system 103 (controller 3) performs a control to create the water area map M, in which an object presence range F1 including a likelihood that an object O is present is set around a feature point F, by detecting, based on an image captured by the imager 1, the feature point F corresponding to the object O in the image together with a distance to the feature point F.
  • the imager 1 captures an image around the hull 101.
  • the imager 1 includes two imaging light receivers 1a spaced apart at different locations on the hull 101.
  • Each imaging light receiver 1a includes a monocular camera including an imaging device such as a CCD sensor or a CMOS sensor.
  • the water area object detection system 103 (controller 3) measures a distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Specifically, the water area object detection system 103 (controller 3) measures the distance to the feature point F corresponding to the object O in an image captured by a triangulation method based on images captured by the two imaging light receivers 1a.
  • the "feature point F corresponding to the object O in an image” refers to a specific point shown in a portion of the image in which the object O is located.
  • the feature point F is set in a portion of the image in which there is a particularly large change in brightness or color tone.
  • the water area object detection system 103 preliminarily performs distortion correction of images captured by the two imaging light receivers 1a, rectification to associate the images with each other, and parallax estimation by matching corresponding feature points F on the images, for example, as preprocessing for distance measurement by the triangulation method.
  • a distance L in the horizontal direction from the imaging light receiver 1a to the feature point F (object O) is obtained by the following formula, where f is the focal length of the imaging light receiver 1a, x is the distance (baseline) between the two imaging light receivers 1a, p is the pixel pitch of the imaging device, and d is the parallax in pixels.
  • L = f·x / (p·d), and the displacement amount ΔL of the measurement distance for a one-pixel change in the parallax is ΔL = f·x / (p·(d − 1)) − f·x / (p·d)
  • the displacement amount ⁇ L of the measurement distance increases as the parallax d decreases, and the displacement amount ⁇ L of the measurement distance decreases as the parallax d increases.
  • This displacement amount ⁇ L of the measurement distance corresponds to a distance resolution at the time of measuring the distance L. That is, as the distance L in the horizontal direction from the imaging light receiver 1a to the feature point F increases, a distance measurement error becomes greater. The distance measurement error quadratically becomes greater as the distance L increases.
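The triangulation relationship and its quantization error can be sketched as follows. The function names and the numerical parameters (focal length, pixel pitch, baseline) are illustrative assumptions, not values from the patent.

```python
def stereo_distance(f: float, x: float, p: float, d: float) -> float:
    """Distance L = f*x / (p*d) from the pinhole stereo model.

    f: focal length [m], x: baseline between the two receivers [m],
    p: pixel pitch [m/px], d: parallax (disparity) [px].
    """
    return f * x / (p * d)

def distance_resolution(f: float, x: float, p: float, d: float) -> float:
    """Displacement dL of the measured distance for a one-pixel
    parallax step: L(d - 1) - L(d) = f*x / (p * d * (d - 1))."""
    return stereo_distance(f, x, p, d - 1) - stereo_distance(f, x, p, d)

# Illustrative parameters: 4 mm lens, 3 um pixels, 0.5 m baseline.
F, X, P = 0.004, 0.5, 3e-6
print(stereo_distance(F, X, P, 50.0))   # ~13.3 m at 50 px parallax
# The resolution step grows roughly quadratically with distance:
# halving the parallax (doubling L) roughly quadruples the step.
print(distance_resolution(F, X, P, 20.0) / distance_resolution(F, X, P, 40.0))
```

Because ΔL ≈ L²·p/(f·x), this matches the statement that the distance measurement error quadratically becomes greater as L increases.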
  • the display 2 displays the water area map M created by the controller 3 (see FIG. 1 ).
  • the display 2 displays the water area map M within a display image of 450 pixels wide by 600 pixels long.
  • the display image of 450 pixels wide by 600 pixels long is a display image in which a point group is plotted in a world coordinate system.
  • the size of one pixel of the water area map M corresponds to a size of 0.1 m horizontally and 0.1 m vertically in the world coordinate system.
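With the 450 x 600 pixel display image and the 0.1 m/pixel scale above, mapping a world-coordinate position to a map pixel is a simple scaling. Placing the map origin at the image centre and orienting positive y toward the top of the screen are assumptions made for illustration.

```python
MAP_W, MAP_H = 450, 600   # display image size in pixels
M_PER_PX = 0.1            # one pixel covers 0.1 m x 0.1 m

def world_to_pixel(x_m: float, y_m: float) -> tuple:
    """Map a world-coordinate point (metres) to a (col, row) pixel.

    Origin at the map centre, y increasing upward on screen
    (both are assumptions, not specified in the patent)."""
    col = MAP_W // 2 + round(x_m / M_PER_PX)
    row = MAP_H // 2 - round(y_m / M_PER_PX)
    return col, row

print(world_to_pixel(0.0, 0.0))   # (225, 300): the map centre
print(world_to_pixel(1.0, 2.0))   # (235, 280): 1 m right, 2 m ahead
```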
  • the display 2 displays the imager 1 (imaging light receiver 1a) at a predetermined pixel position of the water area map M in a predetermined orientation.
  • the display 2 also displays the hull 101 together with the imager 1 on the water area map M.
  • the display 2 displays a schematic model of the hull 101 on the water area map M.
  • the display 2 displays the feature points F on the water area map M. Furthermore, the display 2 displays, on the water area map M, the object presence range F1 set around the feature point F and including a likelihood that the object O is present.
  • the controller 3 performs a control to display one feature point F in one pixel P of the display 2 and set a perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2 (see FIG. 6 ).
  • the object presence range F1 refers to a range set around the feature point F and including a likelihood that the object O is present.
  • the object presence range F1 refers to a range in which the object O is probabilistically present.
  • the object presence range F1 should be avoided when the marine vessel 100 moves, and a movement route R (see FIGS. 2 and 8 ) for automatic movement is not set in the object presence range F1.
  • the display 2 displays an imaging area A with a predetermined angle of view indicating a range currently being imaged by the imager 1 on the water area map M.
  • the controller 3 (see FIG. 1 ) includes a circuit board including a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), etc., for example.
  • the controller 3 is provided in the hull 101.
  • the controller 3 is connected to the two imaging light receivers 1a, the display 2, and the marine propulsion device 102 by signal lines.
  • the controller 3 performs a control to automatically move the hull 101 by setting the movement route R for automatic movement based on the created water area map M and controlling driving of the marine propulsion device 102.
  • the controller 3 performs a control to create the water area map M in which the perfect circular object presence range F1 including a likelihood that the object O is present is set around the feature point F by detecting the feature point F corresponding to the object O in the image together with the distance to the feature point F based on the image captured by the imager 1.
  • the controller 3 creates the two-dimensional water area map M extending in the horizontal direction of the hull 101 by setting the object presence range F1 in a horizontal plane. That is, the controller 3 creates the two-dimensional water area map M extending in the forward-rearward direction and the right-left direction (arrows FWD, BWD, L, and R) of the hull 101.
  • the controller 3 reduces the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases.
  • When the distance from the imager 1 to the feature point F is equal to or less than a predetermined distance d1, the size of the object presence range F1 is set to a lower limit r1.
  • the controller 3 sets a minimum value for the size of the object presence range F1 such that the size of the object presence range F1 does not become too small.
  • the predetermined distance d1 refers to a distance to determine a range in which the size of the object presence range F1 is set to the lower limit.
  • the controller 3 can change the predetermined distance d1 via a setting.
  • the controller 3 sets the predetermined distance d1 to 15 m or more and 25 m or less, and sets the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
  • the controller 3 sets the predetermined distance d1 to about 21 m, and sets the size of the object presence range F1 (a distance from the feature point F to the outer edge of the perfect circular object presence range F1, i.e., the radius of the object presence range F1) to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1 (about 21 m). That is, in a range relatively close to the hull 101 in which the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to a constant lower limit r1.
  • the lower limit r1 is set such that a plurality of object presence ranges F1 set for a plurality of feature points F corresponding to the same object O partially overlap each other. That is, the controller 3 reduces or prevents the occurrence of a gap between adjacent object presence ranges F1 of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O. If the lower limit r1 is not set for the size of the object presence range, the object presence range continues to become smaller toward the feature point as the distance from the imager to the feature point decreases, and thus a gap is more likely to occur between the adjacent object presence ranges.
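The overlap condition behind the lower limit can be checked directly: two equal circular ranges centred on neighbouring feature points overlap exactly when their spacing is below twice the radius. The spacing and radius values below are illustrative assumptions.

```python
def ranges_overlap(pa, pb, radius: float) -> bool:
    """Two equal circular presence ranges overlap iff the distance
    between their feature points is less than twice the radius."""
    dx, dy = pb[0] - pa[0], pb[1] - pa[1]
    return (dx * dx + dy * dy) ** 0.5 < 2.0 * radius

def min_lower_limit(max_spacing: float) -> float:
    """Radius at which ranges for feature points max_spacing apart
    just touch; any larger lower limit r1 makes them overlap."""
    return max_spacing / 2.0

# Feature points 0.8 m apart on the same object, lower limit 0.5 m:
print(ranges_overlap((0.0, 0.0), (0.8, 0.0), 0.5))   # True: 0.8 < 1.0
```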
  • When the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the controller 3 changes the size of the object presence range F1 according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases (see FIG. 7 ).
  • the controller 3 increases the size of the object presence range F1 in proportion to the magnitude of the distance measurement error as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1.
  • When the distance from the imager 1 to the feature point F is a distance d2 larger than the predetermined distance d1, the controller 3 sets the size (radius) of the object presence range F1 to r2, which is larger than the lower limit r1. Furthermore, when the distance from the imager 1 to the feature point F is a distance d3 larger than d2, the controller 3 sets the size (radius) of the object presence range F1 to r3, which is larger than r2.
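The sizing rule described above (constant lower limit near the hull, error-tracking quadratic growth beyond d1) can be sketched as follows. The lower limit of 0.5 m and the continuity-based quadratic gain are illustrative assumptions; the patent specifies only that d1 is about 21 m and that the size tracks the quadratically growing error.

```python
D1_M = 21.0   # predetermined distance d1 (patent: 15 m to 25 m, about 21 m)
R1_M = 0.5    # lower limit r1 (illustrative assumption)

def presence_radius(dist_m: float) -> float:
    """Radius of the circular object presence range around a feature point.

    At or below d1 the radius is clamped to the lower limit r1; beyond d1
    it grows quadratically, mirroring the distance measurement error.
    The gain is chosen so the curve is continuous at d1 (an assumption).
    """
    if dist_m <= D1_M:
        return R1_M
    k = R1_M / (D1_M ** 2)
    return k * dist_m ** 2

print(presence_radius(10.0))   # 0.5: clamped to the lower limit
print(presence_radius(42.0))   # about 2.0: four times r1 at twice d1
```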
  • the controller 3 performs a control to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each imaging frame of the imager 1 and compare the detected feature point F with the currently used feature point F.
  • the controller 3 performs a control to reject the detected feature point F and continue to use the currently used feature point F when the degree of change is smaller than a predetermined value, and performs a control to update the result to the newly detected feature point F and update the water area map M and the position of the imager 1 in the water area map M when the degree of change is conversely larger than the predetermined value.
  • the controller 3 updates the water area map M every predetermined imaging frame (every ten frames, for example).
  • the controller 3 updates the water area map M using Bayesian estimation.
  • the controller 3 performs Bayesian estimation for each pixel P and updates a presence probability indicating that the object O is present at each pixel P. For example, the controller 3 assigns a current probability larger than an initial probability and a prior probability to the object presence range F1 and assigns a current probability smaller than the initial probability and the prior probability to a range (absence range) outside the object presence range F1 to calculate a posterior probability using Bayesian estimation.
  • a presence probability of "0.5" is assigned as the initial probability to all pixels P, a presence probability of "1" is uniformly assigned to each pixel P in the object presence range F1, and a presence probability of "0.35" is uniformly assigned to the range outside the object presence range F1.
  • the controller 3 sets the current probability and the initial probability to the same value to make the posterior probability equal to the prior probability in Bayesian estimation outside the range of the angle of view of the imager 1.
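One update step of the per-pixel estimation described above can be sketched as a binary Bayes filter in log-odds form. The 0.5 initial and 0.35 outside probabilities follow the values quoted above; an inside value of 0.9 is substituted for the quoted "1", because a measurement probability of exactly 1 saturates the log-odds update (an assumption made for illustration).

```python
import math

def logit(p: float) -> float:
    """Log-odds of a probability."""
    return math.log(p / (1.0 - p))

def bayes_update(prior: float, meas: float, init: float = 0.5) -> float:
    """Posterior presence probability for one pixel:
    l_post = l_prior + logit(meas) - logit(init), converted back."""
    l = logit(prior) + logit(meas) - logit(init)
    return 1.0 / (1.0 + math.exp(-l))

p = 0.5                       # initial presence probability
for _ in range(3):            # pixel observed inside the range 3 times
    p = bayes_update(p, 0.9)
print(round(p, 3))            # 0.999: repeated hits drive the probability up

q = bayes_update(0.5, 0.35)   # one observation outside the range
print(round(q, 3))            # 0.35: a miss lowers the probability
```

Outside the imager's angle of view, passing meas equal to init leaves the posterior equal to the prior, matching the behaviour described above; repeated misses after noise such as bubbles disappears steadily drive the probability back down.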
  • the controller 3 controls driving of the marine propulsion device 102 to move the hull 101.
  • the controller 3 automatically docks the hull 101 by automatically moving the hull 101 toward the shore structure O1 (see FIG. 3) corresponding to the object O.
  • the controller 3 sets the size of the object presence range F1 to the lower limit r1 or more, sets the movement route R that avoids the object presence range F1 around the feature point F corresponding to an obstacle (object O), and automatically moves the hull 101 along the movement route R when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1.
  • the water area object detection system 103 includes the controller 3 configured or programmed to create the water area map M in which the object presence range F1 including a likelihood that the object O is present is set.
  • the controller 3 is configured or programmed to reduce the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
  • the size of the object presence range F1 is set to the lower limit r1, and thus an excessive reduction in the object presence range F1 is prevented. Therefore, the possibility that in the water area map M, a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is reduced or prevented. Therefore, the water area map M that accurately indicates the object O near the imager 1 is created.
  • the lower limit r1 is set such that the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O partially overlap each other when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1.
  • when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, i.e., when each of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O is relatively small, the plurality of object presence ranges F1 partially overlap each other, and thus the possibility that a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is more reliably reduced or prevented. Therefore, the water area map M that still more accurately indicates the object O near the imager 1 is created.
  • the controller 3 is configured or programmed to automatically dock the hull 101 by automatically moving the hull 101 toward the shore structure O1 corresponding to the object O. Accordingly, the hull 101 is easily docked at the shore structure O1.
  • the controller 3 is configured or programmed to, when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1 when the hull 101 is automatically docked, set the size of the object presence range F1 to the lower limit r1 or more, set the movement route R that avoids the object presence range F1 around the feature point F corresponding to the obstacle, and automatically move the hull 101 along the movement route R. Accordingly, the sizes of the plurality of object presence ranges F1 indicating the shore structure O1 are set to the lower limit r1 or more, and thus a gap between the plurality of object presence ranges F1 indicating the shore structure O1 is filled. Consequently, when the hull 101 is automatically docked at the shore structure O1, contact of the hull 101 with the obstacle is reduced or prevented.
  • the controller 3 is configured or programmed to change the size of the object presence range F1 according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the size of the object presence range F1 is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
  • the controller 3 is configured or programmed to create the two-dimensional water area map M extending in the horizontal direction by setting the object presence range F1 in the horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map M is created in consideration of an upward-downward direction (height direction), the processing load on the controller 3 is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
  • the controller 3 is configured or programmed to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each predetermined number of imaging frames of the imager 1 to update the water area map M. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability is converged in an earlier stage (an area in which the object O is present has a higher probability, and an area in which the object O is absent has a lower probability).
  • the controller 3 is configured or programmed to update the water area map M using Bayesian estimation, and assign the current probability larger than the initial probability and the prior probability to the object presence range F1 and assign the current probability smaller than the initial probability and the prior probability to the range outside the object presence range F1 to calculate the posterior probability using Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map M such that the probability that the noise such as bubbles is present in the water area map M is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map M.
  • the imager 1 includes the two imaging light receivers 1a spaced apart at different locations on the hull 101, and the controller 3 is configured or programmed to measure the distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Accordingly, two different images are captured simultaneously by the two imaging light receivers 1a to measure the distance to the feature point F by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point F by the triangulation method, the accuracy of distance measurement is improved.
  • the water area object detection system 103 further includes the display 2 provided on the hull 101 to display the water area map M, and the controller 3 is configured or programmed to perform a control to display one feature point F in one pixel P of the display 2 and set the perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2. Accordingly, the boundary of the object presence range F1 is set at positions to which distances from the feature point F are equal to each other, and the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
  • the controller 3 is configured or programmed to set the predetermined distance d1 to 15 m or more and 25 m or less, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, the size of the object presence range F1 is set to the lower limit r1 within a predetermined range from the imager 1 to the feature point F with the predetermined distance d1 of 15 m or more and 25 m or less, which is relatively close to the hull 101, and the water area map M that more accurately indicates the object O is created.
  • While the present teaching is preferably applied to a marine vessel in preferred embodiments described above, the present teaching is not restricted to this.
  • the present teaching may alternatively be applied to a vehicle 201 shown in FIG. 9 .
  • the vehicle 201 includes a surrounding object detection system 203.
  • the surrounding object detection system 203 includes two imaging light receivers 1a, a display 2, and a controller 3 to create a surrounding map.
  • the present teaching may alternatively be applied to a flying object such as a drone.
  • the vehicle 201 and the flying object are examples of a "mobile body".
  • While the marine vessel is preferably an outboard motor boat in preferred embodiments described above, the present teaching is not restricted to this.
  • the marine vessel may alternatively be a marine vessel other than an outboard motor boat.
  • the marine vessel may be a marine vessel including an inboard motor, an inboard-outboard motor, or a jet propulsion device.
  • While the imager preferably includes the two imaging light receivers each including a monocular camera in preferred embodiments described above, the present teaching is not restricted to this.
  • the imager may alternatively include stereo cameras, for example.
  • the imager may include only one monocular camera.
  • the marine vessel preferably includes a highly accurate GPS to detect the position of the hull and a highly accurate inertial measurement unit (IMU) to detect the attitude of the hull.
  • While the size of one pixel indicating the water area map preferably corresponds to a size of 0.1 m vertically and 0.1 m horizontally in the world coordinate system in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the size of one pixel indicating the water area map may alternatively correspond to a size different from 0.1 m vertically and 0.1 m horizontally in the world coordinate system.
  • While the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit is preferably 15 m or more and 25 m or less (about 21 m) in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit may alternatively be different from 15 m or more and 25 m or less.
  • While a presence probability of "1" is preferably uniformly assigned to each pixel in the object presence range in Bayesian estimation in preferred embodiments described above, the present teaching is not restricted to this.
  • a presence probability of "1" may not be uniformly assigned to each pixel in the object presence range in Bayesian estimation.
  • the presence probability of the feature point may be set to "1", and the presence probability may be decreased according to a Gaussian distribution, for example, as the distance from the feature point increases in the object presence range.
  • While the object presence range is preferably indicated by a perfect circle in the water area map in preferred embodiments described above, the present teaching is not restricted to this.
  • the object presence range may alternatively be indicated by a shape different from a perfect circle, such as an ellipse.
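The presence-range sizing rule summarized in the points above can be sketched numerically as follows. This is a sketch under stated assumptions: the predetermined distance d1 and the quadratically growing distance measurement error come from the description, while the concrete values of the lower limit r1 and the error coefficient, and all names, are hypothetical placeholders, not values disclosed in the text.

```python
D1 = 21.0   # predetermined distance d1 in metres (15 m to 25 m per the text)
R1 = 0.5    # lower limit r1 of the presence-range radius in metres (hypothetical)
K = 0.001   # quadratic growth coefficient of the distance error (hypothetical)

def presence_range_radius(distance_m: float) -> float:
    """Radius of the perfect circular object presence range F1 set around
    a feature point F, as a function of the measured distance from the
    imager to the feature point.

    At or below d1 the radius is clamped to the lower limit r1, which keeps
    presence ranges of neighbouring feature points on the same object
    overlapping; beyond d1 the radius grows with the distance measurement
    error, which increases quadratically with distance."""
    if distance_m <= D1:
        return R1
    return R1 + K * (distance_m ** 2 - D1 ** 2)
```

Because the radius equals r1 exactly at d1 and only then starts to grow, the size of the presence range changes continuously as a detected object approaches the hull.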

Abstract

A water area object detection system includes an imager (1a) configured to capture an image around a hull (101), and a controller configured or programmed to perform a control to detect a feature point (F) corresponding to an object (O) in the image together with a distance to the feature point based on the image captured by the imager to create a water area map (M) in which an object presence range (F1) including a likelihood that the object is present is set around the feature point. The controller is configured or programmed to reduce the object presence range as the distance from the imager (1a) to the feature point (F) decreases, and set a size of the object presence range to a lower limit (r1) when the distance from the imager to the feature point is equal to or less than a predetermined distance (d1).

Description

  • The present invention relates to a water area object detection system, and more particularly, it relates to a water area object detection system that includes an imager and creates a surrounding map based on an image captured by the imager.
  • An autonomous mobile robot that includes an imager and creates an environment map (surrounding map) based on an image captured by the imager is known in general. Such an autonomous mobile robot is disclosed by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, for example.
  • Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454 disclose an autonomous mobile robot including an imager and a controller. The controller of the autonomous mobile robot creates a surrounding environment map (surrounding map) by detecting the posture and position of the autonomous mobile robot and a distance to an object around the autonomous mobile robot based on an image captured by the imager. At this time, the controller detects a measurement point (feature point) indicating an object, sets, around the measurement point, a probability distribution range (object presence range) in which the object around the autonomous mobile robot is probabilistically distributed, and shows the probability distribution range on the environment map. That is, a range in which an object that interferes with movement of the autonomous mobile robot is probably present, which is the probability distribution range in which the object is probabilistically distributed, is shown on the environment map.
  • The controller increases the probability distribution range (object presence range) as the distance from the imager to the object increases, taking into account a distance measurement error. Specifically, the distance measurement error becomes greater as the distance from the imager to the object increases, and thus the controller also increases the probability distribution range (object presence range) according to the distance measurement error. Conversely, the distance measurement error becomes smaller as the distance from the imager to the object decreases, and thus the controller also reduces the probability distribution range (object presence range) according to the distance measurement error.
  • Although not clearly described by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, techniques to create an environment map (surrounding map) around a mobile body using an imager include a technique to create an environment map in consideration of the real-time characteristics (responsiveness) of control by detecting a small number of measurement points (feature points) indicating objects from a captured image.
  • When the environment map creation technique described above to decrease the number of measurement points in consideration of the real-time characteristics of control is used in the autonomous mobile robot described by Tatsunori Kou, Kenji Suzuki, and Shuji Hashimoto, Department of Applied Physics, Waseda University, "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, pages 2-453 to 2-454, a probability distribution range (object presence range) in which an object around the autonomous mobile robot is probabilistically distributed becomes smaller according to a distance measurement error as the imager gets closer to the object, and thus the probability distribution range (object presence range) may become too small. Consequently, the probability distribution range (object presence range) may not include a portion in which the object is originally present, and a gap may occur between adjacent probability distribution ranges (object presence ranges). Therefore, it is desired to create an environment map that more accurately indicates an object near the imager.
  • It is an object of the present invention to provide a water area object detection system that creates a water area map (surrounding map) that more accurately indicates an object near an imager. According to the present invention, said object is solved by a water area object detection system having the features of independent claim 1. Preferred embodiments are laid down in the dependent claims.
  • A water area object detection system according to a preferred embodiment includes an imager configured to be provided on a hull and configured to capture an image around the hull, and a controller configured or programmed to perform a control to detect a feature point corresponding to an object in the image together with a distance to the feature point based on the image captured by the imager to create a water area map in which an object presence range including a likelihood that the object is present is set around the feature point. The controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set a size of the object presence range to a lower limit when the distance from the imager to the feature point is equal to or less than a predetermined distance.
  • A water area object detection system according to a preferred embodiment includes the controller configured or programmed to create the water area map in which the object presence range including a likelihood that the object is present is set, and the controller is configured or programmed to reduce the object presence range as the distance from the imager of the hull to the feature point corresponding to the object decreases, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the imager is relatively near the object such that the distance from the imager to the feature point is equal to or less than the predetermined distance, the size of the object presence range is set to the lower limit, and thus an excessive reduction in the object presence range is prevented. Therefore, the possibility that in the water area map, a portion in which the object is originally present is not included in the object presence range, and a gap occurs between adjacent object presence ranges is reduced or prevented. Therefore, the water area map that more accurately indicates the object near the imager is created.
  • In a water area object detection system according to a preferred embodiment, the lower limit is preferably set such that a plurality of the object presence ranges set for a plurality of the feature points corresponding to the same object partially overlap each other when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, when the distance from the imager to the feature point is equal to or less than the predetermined distance, i.e., when each of the plurality of object presence ranges set for the plurality of feature points corresponding to the same object is relatively small, the plurality of object presence ranges partially overlap each other, and thus the possibility that a portion in which the object is originally present is not included in the object presence range, and a gap occurs between the adjacent object presence ranges is more reliably reduced or prevented. Therefore, the water area map that still more accurately indicates the object near the imager is created.
  • In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to automatically dock the hull by automatically moving the hull toward a shore structure corresponding to the object. Accordingly, the hull is easily docked at the shore structure.
  • In such a case, the controller is preferably configured or programmed to, when an obstacle corresponding to the object is present between the hull and the shore structure when the hull is automatically docked, set the size of the object presence range to the lower limit or more, set a movement route configured to avoid the object presence range around the feature point corresponding to the obstacle, and automatically move the hull along the movement route. Accordingly, the sizes of the plurality of object presence ranges indicating the shore structure are set to the lower limit or more, and thus a gap between the plurality of object presence ranges indicating the shore structure is filled. Consequently, when the hull is automatically docked at the shore structure, contact of the hull with the obstacle is reduced or prevented.
  • In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to change the size of the object presence range according to a distance measurement error that quadratically becomes greater as the distance from the imager to the feature point increases when the distance from the imager to the feature point is larger than the predetermined distance. Accordingly, when the distance from the imager to the feature point is larger than the predetermined distance, the size of the object presence range is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map that more accurately indicates the object imaged by the imager is created.
  • In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to create the two-dimensional water area map horizontally extending by setting the object presence range in a horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map is created in consideration of an upward-downward direction (height direction), the processing load on the controller is reduced, and thus the real-time characteristics (responsiveness) of control are improved.
  • In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to redetect the feature point corresponding to the object in the image together with the distance to the feature point for each predetermined number of imaging frames of the imager to update the water area map. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability is converged in an earlier stage (an area in which the object is present has a higher probability, and an area in which the object is absent has a lower probability).
  • In such a case, the controller is preferably configured or programmed to update the water area map using Bayesian estimation, and assign a current probability larger than an initial probability and a prior probability to the object presence range and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range to calculate a posterior probability using the Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map such that the probability that the noise such as bubbles is present in the water area map is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map.
  • In a water area object detection system according to a preferred embodiment, the imager preferably includes two imaging light receivers configured to be spaced apart at different locations on the hull, and the controller is preferably configured or programmed to measure the distance from the imager to the feature point using the two imaging light receivers. Accordingly, two different images are captured simultaneously by the two imaging light receivers to measure the distance to the feature point by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point by the triangulation method, the accuracy of distance measurement is improved.
  • A water area object detection system according to a preferred embodiment preferably further includes a display configured to be provided on the hull and configured to display the water area map, and the controller is preferably configured or programmed to perform a control to display the one feature point in one pixel of the display and set the object presence range having a perfect circular shape around the one pixel in which the one feature point is displayed to display the one feature point and the object presence range on the display. Accordingly, the boundary of the object presence range is set at positions to which distances from the feature point are equal to each other, and the water area map that more accurately indicates the object imaged by the imager is created.
  • In a water area object detection system according to a preferred embodiment, the controller is preferably configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range to the lower limit when the distance from the imager to the feature point is equal to or less than the predetermined distance. Accordingly, the size of the object presence range is set to the lower limit within a predetermined range from the imager to the feature point with the predetermined distance of 15 m or more and 25 m or less, which is relatively close to the hull, and the water area map that more accurately indicates the object is created.
  • The above and other elements, features, steps, characteristics and advantages of preferred embodiments will become more apparent from the following detailed description of the preferred embodiments with reference to the attached drawings.
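The per-pixel Bayesian update described in the preferred embodiments can be sketched as follows. The probability values (initial 0.5, roughly 1 inside the object presence range, 0.35 outside) are taken from the description; clamping the in-range probability slightly below 1 is an added practical safeguard, not something stated in the text, so that later observations can still revise a pixel, and all function and variable names are hypothetical.

```python
INITIAL_P = 0.5   # initial presence probability assigned to every pixel
P_PRESENT = 0.99  # "1" in the preferred embodiments, clamped below 1 (assumption)
P_ABSENT = 0.35   # presence probability assigned outside the object presence range

def bayes_update(prior: float, current: float) -> float:
    """Binary Bayes filter for one pixel of the water area map:
    fuse the prior presence probability with the probability suggested
    by the current imaging frame to obtain the posterior."""
    num = current * prior
    return num / (num + (1.0 - current) * (1.0 - prior))

def update_pixel(prior: float, in_presence_range: bool, in_view: bool) -> float:
    """One frame's update for a single pixel of the map."""
    if not in_view:
        # Outside the imager's angle of view the current probability is set
        # to the initial probability, so the posterior equals the prior.
        return bayes_update(prior, INITIAL_P)
    return bayes_update(prior, P_PRESENT if in_presence_range else P_ABSENT)
```

Repeating the update with `P_ABSENT` after transient noise such as bubbles drives the affected pixel's probability back down, which is how the noise is removed from the map over successive frames. As an alternative to the uniform in-range probability, the current probability could be set to 1 at the feature point and decreased with a Gaussian falloff toward the edge of the presence range, as the modified examples mention.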
  • BRIEF DESCRIPTION OF THE DRAWINGS
    • FIG. 1 is a side view showing a marine vessel including a water area object detection system according to a preferred embodiment.
    • FIG. 2 is a diagram showing a water area map created by a water area object detection system according to a preferred embodiment.
    • FIG. 3 is a plan view illustrating an actual object (such as a structure) in a range corresponding to a water area map.
    • FIG. 4 is a schematic view illustrating the size of an object presence range on a water area map.
    • FIG. 5 is a diagram illustrating a triangulation method using two imaging light receivers.
    • FIG. 6 is an enlarged view showing a feature point and an object presence range on a water area map.
    • FIG. 7 is a graph showing a relationship between a distance between an imager and a feature point and the size of an object presence range.
    • FIG. 8 is a diagram illustrating a movement route that avoids an object (obstacle).
    • FIG. 9 is a plan view showing a vehicle including a surrounding object detection system according to a modified example.
    DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Preferred embodiments are hereinafter described with reference to the drawings.
  • The structure of a marine vessel 100 including a water area object detection system 103 according to preferred embodiments is now described with reference to FIGS. 1 to 8. The water area object detection system 103 is an example of a "surrounding object detection system".
  • In the figures, arrow FWD represents the forward movement direction of the marine vessel 100 (front side with reference to a hull 101), and arrow BWD represents the reverse movement direction of the marine vessel 100 (rear side with reference to the hull 101). The hull 101 is an example of a "mobile body".
  • In the figures, arrow L represents the portside direction of the marine vessel 100 (left side with respect to the hull 101), and arrow R represents the starboard direction of the marine vessel 100 (right side with respect to the hull 101).
  • As shown in FIG. 1, the marine vessel 100 includes the hull 101, a marine propulsion device 102 provided on the hull 101, and the water area object detection system 103 provided on or in the hull 101.
  • The marine propulsion device 102 is attached to a transom of the hull 101 from behind. That is, in preferred embodiments, the marine propulsion device 102 is an outboard motor, and the marine vessel 100 is an outboard motor boat.
  • The marine vessel 100 performs a control to estimate the self-position of the hull 101 in the water area map M while creating a two-dimensional water area map M (see FIG. 2) around the hull 101 that extends horizontally, using the water area object detection system 103. The water area map M is an example of a "surrounding map".
  • As an example, the control described above (the control to estimate the self-position of the hull 101 in the water area map M while creating the water area map M) is achieved by simultaneous localization and mapping (SLAM).
  • The SLAM is a technique to simultaneously create an environment map around a mobile device and estimate the self-position of the mobile device in the environment map, using an image captured by a camera installed on the mobile device, for example. Unlike estimation of a self-position on a map using a global positioning system (GPS), for example, estimation of the self-position of the mobile device using the SLAM is able to be performed even in an environment such as indoors in which a GPS or the like is not usable.
  • The SLAM enables the mobile device to move while avoiding surrounding objects so as to not collide with the objects, and to move along an optimal movement route without duplication of routes, for example.
  • The SLAM includes passive SLAM (such as so-called visual SLAM) that uses an image sensor such as a camera to image a surrounding object, and active SLAM (such as so-called LiDAR SLAM) performed by irradiating a surrounding object with a laser beam of a laser device and detecting the reflected laser beam. The marine vessel 100 according to preferred embodiments performs a control using a means such as the former passive SLAM.
  • The passive SLAM using an image sensor such as a camera includes a means to acquire dense detection data and a means to acquire sparse detection data. The means to acquire dense detection data requires a larger amount of data processing in a controller as compared with the means to acquire sparse detection data. The sparse detection data refers to data obtained by extracting a feature point that is a portion of an image captured by an image sensor such as a camera, for example.
  • The marine vessel 100 according to preferred embodiments performs a control using the means such as the SLAM to acquire the latter sparse detection data. Consequently, the marine vessel 100 computes acquired detection data more quickly, and performs a real-time control according to movement of the marine vessel 100. That is, the marine vessel 100 performs a highly responsive (real-time) control.
  • As shown in FIGS. 2 and 3, the marine vessel 100 performs a control to automatically move while avoiding obstacles (objects O) along a movement route R using the water area map M created using the water area object detection system 103 and a control to automatically dock the hull 101 at a shore structure O1 such as a floating pier.
  • The marine vessel 100 uses the water area map M as a means to know the positions of the obstacles (objects O) not only when the marine vessel 100 automatically moves but also when a user manually maneuvers the marine vessel 100. That is, the water area map M is a so-called cost map to indicate the positions of the obstacles (objects O) that are present around the marine vessel 100, for example.
  • The water area object detection system 103 includes an imager 1 provided on the hull 101, a display 2 provided on the hull 101, and a controller 3 (see FIG. 1).
  • As shown in FIG. 4, the water area object detection system 103 (controller 3) performs a control to create the water area map M in which an object presence range F1 including a likelihood that an object O is present is set around a feature point F by detecting, based on an image captured by the imager 1, the feature point F corresponding to the object O in the image together with a distance to the feature point F.
  • The imager 1 captures an image around the hull 101. The imager 1 includes two imaging light receivers 1a spaced apart at different locations on the hull 101. Each imaging light receiver 1a includes a monocular camera including an imaging device such as a CCD sensor or a CMOS sensor.
  • The water area object detection system 103 (controller 3) measures a distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Specifically, the water area object detection system 103 (controller 3) measures, by a triangulation method, the distance to the feature point F corresponding to the object O based on images captured by the two imaging light receivers 1a.
  • The "feature point F corresponding to the object O in an image" refers to a specific point shown in a portion of the image in which the object O is located. As an example, the feature point F is set in a portion of the image in which there is a particularly large change in brightness or color tone.
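  • The extraction of such feature points can be illustrated with a minimal sketch: a pixel qualifies as a feature point F when the local brightness change exceeds a threshold. The gradient measure and the threshold value below are illustrative assumptions, not specifics from the description.

```python
def detect_feature_points(gray, threshold=40):
    """Return pixels whose local brightness change exceeds a threshold.

    gray: a grayscale image as a list of rows of integer intensities.
    This crude central-difference gradient stands in for the (unspecified)
    feature detector; real systems would use a corner detector instead.
    """
    h, w = len(gray), len(gray[0])
    points = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = gray[y][x + 1] - gray[y][x - 1]  # horizontal brightness change
            gy = gray[y + 1][x] - gray[y - 1][x]  # vertical brightness change
            if gx * gx + gy * gy >= threshold * threshold:
                points.append((x, y))
    return points
```

For example, on an image with a sharp vertical edge, only pixels straddling the edge are reported as feature points.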
  • The water area object detection system 103 (controller 3) preliminarily performs distortion correction of images captured by the two imaging light receivers 1a, rectification to associate the images with each other, and parallax estimation by matching corresponding feature points F on the images, for example, as preprocessing for distance measurement by the triangulation method.
  • Measurement of the distance to the feature point F by the triangulation method performed by the water area object detection system 103 (controller 3) is now described with reference to FIG. 5.
  • Assuming that L1 represents a distance in a horizontal direction between the two imaging light receivers 1a, x represents a distance in a direction perpendicular to the horizontal direction between the two imaging light receivers 1a, d represents the parallax of the two imaging light receivers 1a, p represents the element pitch of the imaging devices of the imaging light receivers 1a, and f represents a focal length of one of the imaging light receivers 1a, a distance L in the horizontal direction from the other of the imaging light receivers 1a to the feature point F (object O) is obtained by the following formula. When the two imaging light receivers 1a are provided side by side like a stereo camera, L1 = 0.
    [Mathematical Formula 1] L = fx/(pd) + L1
  • A displacement amount ΔL of a measurement distance in a case in which the parallax d is shifted by one pixel in the images captured by the two imaging light receivers 1a is obtained by the following formula.
    [Mathematical Formula 2] ΔL = fx/(pd(d + 1))
  • From the above formula showing the displacement amount ΔL of the measurement distance, it is understood that the displacement amount ΔL of the measurement distance increases as the parallax d decreases, and the displacement amount ΔL of the measurement distance decreases as the parallax d increases. This displacement amount ΔL of the measurement distance corresponds to a distance resolution at the time of measuring the distance L. That is, as the distance L in the horizontal direction from the imaging light receiver 1a to the feature point F increases, a distance measurement error becomes greater. The distance measurement error quadratically becomes greater as the distance L increases.
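  • The two formulas above can be sketched directly in code; the numeric values in the usage below (4 mm focal length, 0.3 m receiver separation, 3 µm element pitch) are illustrative assumptions, not values from the description.

```python
def distance_to_feature(f, x, p, d, L1=0.0):
    """Distance L to the feature point by triangulation: L = f*x/(p*d) + L1.

    f: focal length, x: separation between the two imaging light receivers,
    p: element pitch of the imaging device, d: parallax in pixels,
    L1: horizontal offset between the receivers (0 for a side-by-side pair).
    """
    return f * x / (p * d) + L1


def one_pixel_error(f, x, p, d):
    """Displacement dL of the measured distance when the parallax shifts by one pixel."""
    return f * x / (p * d * (d + 1))
```

With f = 0.004 m, x = 0.3 m, p = 3e-6 m and d = 20 pixels, the measured distance is 20 m; halving the parallax (i.e., roughly doubling the distance) quadruples the one-pixel error, which is the quadratic growth of the measurement error described above.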
  • As shown in FIG. 2, the display 2 (see FIG. 1) displays the water area map M created by the controller 3 (see FIG. 1). As an example, the display 2 displays the water area map M within a display image of 450 pixels wide by 600 pixels high. The display image of 450 pixels wide by 600 pixels high is a display image in which a point group is plotted in a world coordinate system. As an example, the size of one pixel of the water area map M corresponds to a size of 0.1 m horizontally and 0.1 m vertically in the world coordinate system.
  • The display 2 displays the imager 1 (imaging light receiver 1a) at a predetermined pixel position of the water area map M in a predetermined orientation. The display 2 also displays the hull 101 together with the imager 1 on the water area map M. The display 2 displays a schematic model of the hull 101 on the water area map M.
  • The display 2 displays the feature points F on the water area map M. Furthermore, the display 2 displays, on the water area map M, the object presence range F1 set around the feature point F and including a likelihood that the object O is present. As a specific example, the controller 3 performs a control to display one feature point F in one pixel P of the display 2 and set a perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2 (see FIG. 6).
  • The object presence range F1 refers to a range set around the feature point F and including a likelihood that the object O is present. In short, the object presence range F1 refers to a range in which the object O is probabilistically present. In other words, the object presence range F1 should be avoided when the marine vessel 100 moves, and a movement route R (see FIGS. 2 and 8) for automatic movement is not set in the object presence range F1.
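  • As a concrete illustration, a perfect circular object presence range can be rasterized around a feature-point pixel on the grid of the water area map; cells inside the range are then treated as cost to be avoided when routing. The grid size and radius used below are illustrative assumptions.

```python
import math


def stamp_presence_range(grid, cx, cy, radius_px):
    """Mark a circular object presence range around a feature-point pixel (cx, cy).

    grid: a 2D list of 0/1 cells (the cost-map grid of the water area map);
    cells inside the circle of radius_px pixels are set to 1 (avoid).
    """
    r = int(math.ceil(radius_px))
    for y in range(max(0, cy - r), min(len(grid), cy + r + 1)):
        for x in range(max(0, cx - r), min(len(grid[0]), cx + r + 1)):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius_px ** 2:
                grid[y][x] = 1  # inside the presence range
    return grid
```

Stamping overlapping circles for neighboring feature points of the same object fills the gaps between them, which is the purpose of the lower limit r1 discussed below.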
  • The display 2 displays an imaging area A with a predetermined angle of view indicating a range currently being imaged by the imager 1 on the water area map M.
  • The controller 3 (see FIG. 1) includes a circuit board including a central processing unit (CPU), a read-only memory (ROM), a random access memory (RAM), etc., for example. The controller 3 is provided in the hull 101. The controller 3 is connected to the two imaging light receivers 1a, the display 2, and the marine propulsion device 102 by signal lines.
  • The controller 3 performs a control to automatically move the hull 101 by setting the movement route R for automatic movement based on the created water area map M and controlling driving of the marine propulsion device 102.
  • As described above, the controller 3 performs a control to create the water area map M in which the perfect circular object presence range F1 including a likelihood that the object O is present is set around the feature point F by detecting the feature point F corresponding to the object O in the image together with the distance to the feature point F based on the image captured by the imager 1.
  • The controller 3 creates the two-dimensional water area map M extending in the horizontal direction of the hull 101 by setting the object presence range F1 in a horizontal plane. That is, the controller 3 creates the two-dimensional water area map M extending in the forward-rearward direction and the right-left direction (arrows FWD, BWD, L, and R) of the hull 101.
  • As shown in FIGS. 4 and 7, when the two-dimensional water area map M is created, the controller 3 reduces the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases. When the distance from the imager 1 to the feature point F is equal to or less than a predetermined distance d1, the size of the object presence range F1 is set to a lower limit r1. In other words, the controller 3 sets a minimum value for the size of the object presence range F1 such that the size of the object presence range F1 does not become too small. The predetermined distance d1 refers to a distance to determine a range in which the size of the object presence range F1 is set to the lower limit.
  • The controller 3 can change the predetermined distance d1 via a setting. As an example, the controller 3 sets the predetermined distance d1 to 15 m or more and 25 m or less, and sets the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. As a specific example, the controller 3 sets the predetermined distance d1 to about 21 m, and sets the size of the object presence range F1 (a distance from the feature point F to the outer edge of the perfect circular object presence range F1, i.e., the radius of the object presence range F1) to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1 (about 21 m). That is, in a range relatively close to the hull 101 in which the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to a constant lower limit r1.
  • When the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the lower limit r1 is set such that a plurality of object presence ranges F1 set for a plurality of feature points F corresponding to the same object O partially overlap each other. That is, the controller 3 reduces or prevents the occurrence of a gap between adjacent object presence ranges F1 of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O. If the lower limit r1 is not set for the size of the object presence range, the object presence range continues to become smaller toward the feature point as the distance from the imager to the feature point decreases, and thus a gap is more likely to occur between the adjacent object presence ranges.
  • When the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the controller 3 changes the size of the object presence range according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases (see FIG. 7).
  • As a specific example, the controller 3 increases the size of the object presence range F1 in proportion to the magnitude of the distance measurement error as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1.
  • For example, when the distance from the imager 1 to the feature point F is d2 larger than the predetermined distance d1, the controller 3 sets the size (radius) of the object presence range F1 to r2 larger than the lower limit r1. Furthermore, the controller 3 sets the size (radius) of the object presence range F1 to r3 larger than r2 when the distance from the imager 1 to the feature point F is d3 larger than the distance d2.
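  • The sizing rule described above can be sketched as a function of the feature-point distance. The description gives d1 (about 21 m) but not the exact mapping from measurement error to radius, so the lower limit r1, the growth coefficient k, and the continuous quadratic form below are illustrative assumptions.

```python
def presence_range_radius(dist, d1=21.0, r1=1.0, k=0.002):
    """Radius of the object presence range for a feature point at distance dist.

    At or below the predetermined distance d1 the radius is clamped to the
    lower limit r1; beyond d1 it grows following the quadratic growth of the
    distance measurement error. r1 and k are illustrative values.
    """
    if dist <= d1:
        return r1
    return r1 + k * (dist ** 2 - d1 ** 2)
```

This reproduces the behavior of FIG. 7 as described: a constant radius r1 near the hull, then r2 at distance d2 > d1 and a larger r3 at distance d3 > d2.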
  • The controller 3 performs a control to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each imaging frame of the imager 1 and compare the detected feature point F with the currently used feature point F. The controller 3 performs a control to reject the newly detected feature point F and continue to use the currently used feature point F when the degree of change is smaller than a predetermined value, and to adopt the newly detected feature point F and update the water area map M and the position of the imager 1 in the water area map M when the degree of change is larger than the predetermined value. Furthermore, the controller 3 updates the water area map M every predetermined number of imaging frames (every ten frames, for example).
  • The controller 3 updates the water area map M using Bayesian estimation. The controller 3 performs a Bayesian estimation for each pixel P and updates a presence probability indicating that the object O will be present at each pixel P. For example, the controller 3 assigns a current probability larger than an initial probability and a prior probability to the object presence range F1 and assigns a current probability smaller than the initial probability and the prior probability to a range (absence range) outside the object presence range F1 to calculate a posterior probability using Bayesian estimation. A formula for Bayesian estimation is "posterior probability = (current probability × prior probability)/initial probability".
  • As a specific example, a presence probability of "0.5" is assigned as the initial probability to all pixels P, and a presence probability of "1" is uniformly assigned to each pixel P in the object presence range F1, and a presence probability of "0.35" is uniformly assigned to the range outside the object presence range F1. The controller 3 sets the current probability and the initial probability to the same value to make the posterior probability equal to the prior probability in Bayesian estimation outside the range of the angle of view of the imager 1.
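  • The per-pixel update described above can be sketched with the stated example values (initial probability 0.5, current probability 1 inside the object presence range and 0.35 outside). The clamp to 1.0 is an added numerical safeguard, not part of the described formula.

```python
def bayes_update(prior, current, initial=0.5):
    """Posterior presence probability: (current * prior) / initial, clamped to 1."""
    return min(1.0, current * prior / initial)


def update_pixel(prior, inside_presence_range, in_view=True):
    """One Bayesian update of a pixel's presence probability.

    Outside the imager's angle of view, current == initial, so the posterior
    equals the prior; inside, the stated example probabilities are applied.
    """
    if not in_view:
        return prior
    current = 1.0 if inside_presence_range else 0.35
    return bayes_update(prior, current)
```

Repeated updates with current = 0.35 drive the probability of a pixel downward (1.0 → 0.7 → 0.49 → ...), which is how transient noise such as bubbles fades out of the water area map once it disappears.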
  • As shown in FIGS. 2, 3, and 8, the controller 3 controls driving of the marine propulsion device 102 to move the hull 101.
  • As an example, the controller 3 automatically docks the hull 101 by automatically moving the hull 101 toward the shore structure O1 (see FIG. 3) corresponding to the object O. When automatically docking the hull 101, the controller 3 sets the size of the object presence range F1 to the lower limit r1 or more, sets the movement route R that avoids the object presence range F1 around the feature point F corresponding to an obstacle (object O), and automatically moves the hull 101 along the movement route R when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1.
  • According to the various preferred embodiments described above, the following advantageous effects are achieved.
  • According to a preferred embodiment, the water area object detection system 103 includes the controller 3 configured or programmed to create the water area map M in which the object presence range F1 including a likelihood that the object O is present is set. The controller 3 is configured or programmed to reduce the object presence range F1 as the distance from the imager 1 of the hull 101 to the feature point F corresponding to the object O decreases, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, when the imager 1 is relatively near the object O such that the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, the size of the object presence range F1 is set to the lower limit r1, and thus an excessive reduction in the object presence range F1 is prevented. Therefore, the possibility that in the water area map M, a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is reduced or prevented. Therefore, the water area map M that accurately indicates the object O near the imager 1 is created.
  • According to a preferred embodiment, the lower limit r1 is set such that the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O partially overlap each other when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1, i.e., when each of the plurality of object presence ranges F1 set for the plurality of feature points F corresponding to the same object O is relatively small, the plurality of object presence ranges F1 partially overlap each other, and thus the possibility that a portion in which the object O is originally present is not included in the object presence range F1, and a gap occurs between the adjacent object presence ranges F1 is more reliably reduced or prevented. Therefore, the water area map M that still more accurately indicates the object O near the imager 1 is created.
  • According to a preferred embodiment, the controller 3 is configured or programmed to automatically dock the hull 101 by automatically moving the hull 101 toward the shore structure O1 corresponding to the object O. Accordingly, the hull 101 is easily docked at the shore structure O1.
  • According to a preferred embodiment, the controller 3 is configured or programmed to, when the obstacle corresponding to the object O is present between the hull 101 and the shore structure O1 when the hull 101 is automatically docked, set the size of the object presence range F1 to the lower limit r1 or more, set the movement route R that avoids the object presence range F1 around the feature point F corresponding to the obstacle, and automatically move the hull 101 along the movement route R. Accordingly, the sizes of the plurality of object presence ranges F1 indicating the shore structure O1 are set to the lower limit r1 or more, and thus a gap between the plurality of object presence ranges F1 indicating the shore structure O1 is filled. Consequently, when the hull 101 is automatically docked at the shore structure O1, contact of the hull 101 with the obstacle is reduced or prevented.
  • According to a preferred embodiment, the controller 3 is configured or programmed to change the size of the object presence range F1 according to the distance measurement error that quadratically becomes greater as the distance from the imager 1 to the feature point F increases when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1. Accordingly, when the distance from the imager 1 to the feature point F is larger than the predetermined distance d1, the size of the object presence range F1 is set to an appropriate size in consideration of the distance measurement error. Therefore, the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
  • According to a preferred embodiment, the controller 3 is configured or programmed to create the two-dimensional water area map M extending in the horizontal direction by setting the object presence range F1 in the horizontal plane. Accordingly, as compared with a case in which a three-dimensional water area map M is created in consideration of an upward-downward direction (height direction), the processing load on the controller 3 is reduced, and thus the real-time characteristics (responsiveness) of the control are improved.
  • According to a preferred embodiment, the controller 3 is configured or programmed to redetect the feature point F corresponding to the object O in the image together with the distance to the feature point F for each predetermined number of imaging frames of the imager 1 to update the water area map M. Accordingly, when an estimation method such as Bayesian estimation is performed, the posterior probability converges at an earlier stage (an area in which the object O is present has a higher probability, and an area in which the object O is absent has a lower probability).
  • According to a preferred embodiment, the controller 3 is configured or programmed to update the water area map M using Bayesian estimation, and assign the current probability larger than the initial probability and the prior probability to the object presence range F1 and assign the current probability smaller than the initial probability and the prior probability to the range outside the object presence range F1 to calculate the posterior probability using Bayesian estimation. Accordingly, even when noise such as bubbles occurs in a predetermined frame, Bayesian estimation is repeatedly performed to update the water area map M such that the probability that the noise such as bubbles is present in the water area map M is reduced after the noise disappears. Therefore, the noise is reliably removed from the water area map M.
  • According to a preferred embodiment, the imager 1 includes the two imaging light receivers 1a spaced apart at different locations on the hull 101, and the controller 3 is configured or programmed to measure the distance from the imager 1 to the feature point F using the two imaging light receivers 1a. Accordingly, two different images are captured simultaneously by the two imaging light receivers 1a to measure the distance to the feature point F by the triangulation method, and thus as compared with a case in which two images are captured from different locations at different times by a single imaging apparatus to measure the distance to the feature point F by the triangulation method, the accuracy of distance measurement is improved.
  • According to a preferred embodiment, the water area object detection system 103 further includes the display 2 provided on the hull 101 to display the water area map M, and the controller 3 is configured or programmed to perform a control to display one feature point F in one pixel P of the display 2 and set the perfect circular object presence range F1 around one pixel P in which the feature point F is displayed to display the feature point F and the perfect circular object presence range F1 on the display 2. Accordingly, the boundary of the object presence range F1 is set at positions to which distances from the feature point F are equal to each other, and the water area map M that more accurately indicates the object O imaged by the imager 1 is created.
  • According to a preferred embodiment, the controller 3 is configured or programmed to set the predetermined distance d1 to 15 m or more and 25 m or less, and set the size of the object presence range F1 to the lower limit r1 when the distance from the imager 1 to the feature point F is equal to or less than the predetermined distance d1. Accordingly, the size of the object presence range F1 is set to the lower limit r1 within a predetermined range from the imager 1 to the feature point F with the predetermined distance d1 of 15 m or more and 25 m or less, which is relatively close to the hull 101, and the water area map M that more accurately indicates the object O is created.
  • The preferred embodiments described above are illustrative of the present teaching, and the present teaching also relates to modifications of the preferred embodiments.
  • For example, while the present teaching is preferably applied to a marine vessel in preferred embodiments described above, the present teaching is not restricted to this. The present teaching may alternatively be applied to a vehicle 201 shown in FIG. 9. The vehicle 201 includes a surrounding object detection system 203. The surrounding object detection system 203 includes two imaging light receivers 1a, a display 2, and a controller 3 to create a surrounding map. In addition, the present teaching may alternatively be applied to a flying object such as a drone. The vehicle 201 and the flying object are examples of a "mobile body".
  • While the marine vessel is preferably an outboard motor boat in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the marine vessel may alternatively be a marine vessel other than an outboard motor boat. For example, the marine vessel may be a marine vessel including an inboard motor, an inboard-outboard motor, or a jet propulsion device.
  • While the imager preferably includes the two imaging light receivers each including a monocular camera in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the imager may alternatively include stereo cameras, for example. Alternatively, the imager may include only one monocular camera. In such a case, the marine vessel preferably includes a highly accurate GPS to detect the position of the hull and a highly accurate inertial measurement unit (IMU) to detect the attitude of the hull.
  • While the size of one pixel indicating the water area map preferably corresponds to a size of 0.1 m vertically and 0.1 m horizontally in the world coordinate system in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the size of one pixel indicating the water area map may alternatively correspond to a size different from 0.1 m vertically and 0.1 m horizontally in the world coordinate system.
  • While the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit is preferably 15 m or more and 25 m or less (about 21 m) in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the predetermined distance to determine a range in which the size of the object presence range is set to the lower limit may alternatively be different from 15 m or more and 25 m or less.
  • While a presence probability of "1" is preferably uniformly assigned to each pixel in the object presence range in Bayesian estimation in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, a presence probability of "1" may not be uniformly assigned to each pixel in the object presence range in Bayesian estimation. For example, the presence probability of the feature point may be set to "1", and the presence probability may be decreased according to a Gaussian distribution, for example, as the distance from the feature point increases in the object presence range.
  • While the object presence range is preferably indicated by a perfect circle in the water area map in preferred embodiments described above, the present teaching is not restricted to this. In the present teaching, the object presence range may alternatively be indicated by a shape different from a perfect circle, such as an ellipse.

Claims (11)

  1. A water area object detection system (103) comprising:
    an imager (1) configured to be provided on a hull (101) and configured to capture an image around the hull (101); and
    a controller (3) configured or programmed to perform a control to detect a feature point (F) corresponding to an object (O) in the image together with a distance to the feature point (F) based on the image captured by the imager (1) to create a water area map (M) in which an object presence range (F1) including a likelihood that the object (O) is present is set around the feature point (F); wherein
    the controller (3) is configured or programmed to reduce the object presence range (F1) as the distance from the imager (1) of the hull (101) to the feature point (F) corresponding to the object (O) decreases, and set a size of the object presence range (F1) to a lower limit (r1) when the distance from the imager (1) to the feature point (F) is equal to or less than a predetermined distance.
  2. The water area object detection system according to claim 1, wherein the lower limit (r1) is set such that a plurality of the object presence ranges (F1) set for a plurality of the feature points (F) corresponding to the same object (O) partially overlap each other when the distance from the imager (1) to the feature point (F) is equal to or less than the predetermined distance.
  3. The water area object detection system according to claim 1 or 2, wherein the controller (3) is configured or programmed to automatically dock the hull (101) by automatically moving the hull (101) toward a shore structure (O1) corresponding to the object (O).
  4. The water area object detection system according to claim 3, wherein the controller (3) is configured or programmed to, when an obstacle corresponding to the object (O) is present between the hull (101) and the shore structure (O1) when the hull (101) is automatically docked, set the size of the object presence range (F1) to the lower limit (r1) or more, set a movement route (R) configured to avoid the object presence range (F1) around the feature point (F) corresponding to the obstacle, and automatically move the hull (101) along the movement route (R).
  5. The water area object detection system according to any one of claims 1 to 4, wherein the controller (3) is configured or programmed to change the size of the object presence range (F1) according to a distance measurement error that quadratically becomes greater as the distance from the imager (1) to the feature point (F) increases when the distance from the imager (1) to the feature point (F) is larger than the predetermined distance.
  6. The water area object detection system according to any one of claims 1 to 5, wherein the controller (3) is configured or programmed to create a two-dimensional water area map extending horizontally by setting the object presence range (F1) in a horizontal plane.
  7. The water area object detection system according to any one of claims 1 to 6, wherein the controller (3) is configured or programmed to redetect the feature point (F) corresponding to the object (O) in the image together with the distance to the feature point (F) for each predetermined number of imaging frames of the imager (1) to update the water area map.
  8. The water area object detection system according to claim 7, wherein the controller (3) is configured or programmed to:
    update the water area map using Bayesian estimation; and
    assign a current probability larger than an initial probability and a prior probability to the object presence range (F1) and assign a current probability smaller than the initial probability and the prior probability to a range outside the object presence range (F1) to calculate a posterior probability using the Bayesian estimation.
  9. The water area object detection system according to any one of claims 1 to 8, wherein
    the imager (1) includes two imaging light receivers (1a) configured to be spaced apart at different locations on the hull (101); and
    the controller (3) is configured or programmed to measure the distance from the imager (1) to the feature point (F) using the two imaging light receivers (1a).
  10. The water area object detection system according to any one of claims 1 to 9, further comprising:
    a display (2) configured to be provided on the hull (101) and configured to display the water area map; wherein
    the controller (3) is configured or programmed to perform a control to display the one feature point (F) in one pixel (P) of the display (2) and set the object presence range (F1) having a perfect circular shape around the one pixel in which the one feature point (F) is displayed to display the one feature point and the object presence range (F1) on the display (2).
  11. The water area object detection system according to any one of claims 1 to 10, wherein the controller (3) is configured or programmed to set the predetermined distance to 15 m or more and 25 m or less, and set the size of the object presence range (F1) to the lower limit (r1) when the distance from the imager (1) to the feature point (F) is equal to or less than the predetermined distance.
EP23150776.5A 2022-01-14 2023-01-09 System for detecting objects on a water surface Pending EP4213111A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2022004596A JP7328378B2 (en) 2022-01-14 2022-01-14 Aquatic Object Detection System, Vessel and Peripheral Object Detection System

Publications (1)

Publication Number Publication Date
EP4213111A1 true EP4213111A1 (en) 2023-07-19

Family

ID=84888758

Family Applications (1)

Application Number Title Priority Date Filing Date
EP23150776.5A Pending EP4213111A1 (en) 2022-01-14 2023-01-09 System for detecting objects on a water surface

Country Status (3)

Country Link
US (1) US20230228575A1 (en)
EP (1) EP4213111A1 (en)
JP (1) JP7328378B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190308712A1 (en) * 2016-12-02 2019-10-10 Yamaha Hatsudoki Kabushiki Kaisha Boat
US20210406560A1 (en) * 2020-06-25 2021-12-30 Nvidia Corporation Sensor fusion for autonomous machine applications using machine learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5472538B2 (en) 2011-06-14 2014-04-16 日産自動車株式会社 Distance measuring device and environmental map generating device
JP7232089B2 (en) 2019-03-19 2023-03-02 ヤマハ発動機株式会社 Display device for ship, ship and image display method for ship

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SE S ET AL: "Mobile Robot Localization and Mapping with Uncertainty using Scale-Invariant Visual Landmarks", THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, vol. 21, no. 8, 31 August 2002 (2002-08-31), pages 735 - 758, XP002693739, DOI: 10.1177/027836402761412467 *
TATSUNORI KOU; KENJI SUZUKI; SHUJI HASHIMOTO: "3D Environment Recognition and Mapping for Autonomous Mobile Robots", the 66th National Meeting of the Information Processing Society of Japan, Waseda University, pages 2 - 453

Also Published As

Publication number Publication date
JP7328378B2 (en) 2023-08-16
US20230228575A1 (en) 2023-07-20
JP2023103836A (en) 2023-07-27

Similar Documents

Publication Publication Date Title
US11029686B2 (en) Automatic location placement system
JP7258062B2 (en) Automatic positioning system
KR101683274B1 (en) System for supporting vessel berth using unmanned aerial vehicle and the method thereof
KR102530691B1 (en) Device and method for monitoring a berthing
JP5000244B2 (en) Docking support device and ship equipped with the same
Park et al. Development of an unmanned surface vehicle system for the 2014 Maritime RobotX Challenge
Hurtós et al. Autonomous detection, following and mapping of an underwater chain using sonar
JP2018177074A (en) Autonomous type underwater robot and control method for the same
US11480965B2 (en) Automatic location placement system
KR102530847B1 (en) Method and device for monitoring harbor and ship
US20220122465A1 (en) Unmanned aircraft system, a control system of a marine vessel and a method for controlling a navigation system of a marine vessel
EP4053822A1 (en) Ship docking assistance device
EP3860908A1 (en) System and method for assisting docking of a vessel
CN115131720A (en) Ship berthing assisting method based on artificial intelligence
EP4213111A1 (en) System for detecting objects on a water surface
KR20230074438A (en) Device and method for monitoring ship and port
US11741193B2 (en) Distance recognition system for use in marine vessel, control method thereof, and marine vessel
EP4213110A1 (en) System for detecting objects on a water surface and marine vessel with a system for detecting objects on a water surface
Hurtos et al. Sonar-based chain following using an autonomous underwater vehicle
Kang et al. Development of USV autonomy for the 2014 maritime RobotX challenge
TWI835431B (en) Ship docking system and ship docking method
Harada et al. Experimental study on collision avoidance procedures for plastic waste cleaner USV
CN116149316A (en) Visual guidance method for unmanned ship dynamic recovery

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230807

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240117

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR