WO2023162561A1 - Target monitoring device, ship maneuvering system, target monitoring method, and program - Google Patents

Target monitoring device, ship maneuvering system, target monitoring method, and program

Info

Publication number
WO2023162561A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target
captured
daytime
nighttime
Prior art date
Application number
PCT/JP2023/002267
Other languages
English (en)
Japanese (ja)
Inventor
正也 能瀬
トロン ミン トラン
博紀 村上
Original Assignee
古野電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 古野電気株式会社
Publication of WO2023162561A1

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G3/00 Traffic control systems for marine craft

Definitions

  • the present invention relates to a target monitoring device, a ship maneuvering system, a target monitoring method, and a program.
  • In the prior art, a camera device is installed to sequentially acquire image data of the water area surrounding a construction vessel, and the image data are sequentially input to a computing device.
  • A prediction model, machine-learned in advance using image data of multiple types of ships (including other construction ships) and non-ship image data as training data, is stored, and a technique is disclosed in which the computing device uses the prediction model and the image data of the surrounding waters to sequentially determine whether a vessel of a preset type to be monitored is present in the image data.
  • The present invention has been made in view of the above problems, and its main purpose is to provide a target monitoring device, a ship maneuvering system, a target monitoring method, and a program capable of image recognition suited to the scene.
  • A target monitoring apparatus according to one aspect of the present invention includes: an image acquisition unit that acquires an image, captured by a camera installed on a ship, containing targets on the sea; a scene determination unit that determines whether the image was captured in the daytime or at night; a daytime image recognition unit that detects the target included in the image using a trained model for daytime when the image is determined to have been captured in the daytime; and a nighttime image recognition unit that detects the target included in the image using a trained model for nighttime when the image is determined to have been captured at night. This makes image recognition suited to the scene possible.
  • In the above aspect, the scene determination unit may use a trained model for scene determination to determine whether the image was captured in the daytime or at night. Using a trained model for scene determination can improve the accuracy of the determination.
  • In the above aspect, the nighttime image recognition unit may detect, as a target, a target candidate that has been detected by the trained model for nighttime and that has a brightness of a predetermined level or higher. The trained model for nighttime alone risks failing to detect a single, independent light as a target; adding the brightness condition improves the accuracy of target detection.
  • In the above aspect, the daytime image recognition unit may detect the target included in the image and determine the type of the target using the trained model for daytime. This makes it possible not only to detect the target but also to distinguish its type.
  • In the above aspect, the nighttime image recognition unit may use the trained model for nighttime to detect objects included in the image and determine whether or not each is a target. Since it is difficult to distinguish target types in images captured at night, a trained model that discriminates only target/non-target, rather than the type, is suited to nighttime images.
  • In the above aspect, the scene determination unit may further determine whether the image was captured during a sunrise or sunset time zone, and a sunrise/sunset image recognition unit may be further provided that detects the target included in the image using a trained model for sunrise/sunset when it is determined that the image was captured during such a time zone. Because reflections from the sea surface are strong around sunrise and sunset, the trained models for daytime or nighttime may not detect targets accurately enough; providing a separate sunrise/sunset model improves detection accuracy even for an image captured during those periods.
  • In the above aspect, the sunrise/sunset image recognition unit may detect the target included in the image and determine the type of the target using the trained model for sunrise/sunset. This makes it possible not only to detect the target but also to distinguish its type.
  • In the above aspect, the scene determination unit may further determine whether the image was shot against backlight, and a preprocessing unit may be further provided that performs gamma correction or contrast adjustment on the image before it is input to the trained model when backlight is determined. This improves the accuracy of target detection even for an image captured against backlight.
  • In the above aspect, the scene determination unit may further determine whether the image was captured in fog, and a preprocessing unit may be further provided that performs image sharpening processing on the image before it is input to the trained model when fog is determined. This improves the accuracy of target detection even for an image captured in fog.
  • In the above aspect, the trained model for daytime may be a model generated by machine learning using, as input data, learning images including images captured in the daytime, and using, as teacher data, the in-image positions and types of the targets included in the learning images. This provides a trained model suited to images captured in the daytime.
  • In the above aspect, the trained model for nighttime may be a model generated by machine learning using, as input data, learning images including images captured at night, and using, as teacher data, the in-image positions of the targets included in the learning images. This provides a trained model suited to images captured at night.
  • In the above aspect, the trained model for sunrise/sunset may be a model generated by machine learning using, as input data, learning images including images captured during a sunrise or sunset time zone, and using, as teacher data, the in-image positions and types of the targets included in the learning images. This provides a trained model suited to images captured during those time zones.
  • A ship maneuvering system according to another aspect of the present invention includes the above target monitoring device, a ship maneuvering determination unit that makes ship maneuvering decisions based on the targets detected by the target monitoring device, and a ship maneuvering control unit that controls the ship's maneuvering based on those decisions. This makes it possible to decide and control ship maneuvering based on targets detected by image recognition suited to the scene.
  • A target monitoring method according to another aspect of the present invention acquires an image, captured by a camera installed on a ship, containing a target on the sea; determines whether the image was captured during the daytime or at night; detects the target included in the image using the trained model for daytime when the image is determined to have been captured in the daytime; and detects the target using the trained model for nighttime when the image is determined to have been captured at night. This makes image recognition suited to the scene possible.
  • A program according to another aspect of the present invention causes a computer to acquire an image, captured by a camera installed on a ship, including a marine target; determine whether the image was captured during the daytime or at night; detect the target included in the image using the trained model for daytime when the image is determined to have been captured in the daytime; and detect the target using the trained model for nighttime when the image is determined to have been captured at night. This makes image recognition suited to the scene possible.
  • FIG. 5 is a diagram showing an example of recognition by the trained model for scene determination.
  • FIG. 6 is a diagram showing an example of recognition by the trained model for daytime.
  • FIG. 7 is a diagram showing an example of recognition by the trained model for nighttime.
  • FIG. 8 is a diagram showing an example of recognition by the trained model for sunrise/sunset.
  • FIG. 9 is a diagram showing a procedure example of the target monitoring method. FIG. 10 is a diagram showing a procedure example of preprocessing. FIG. 11 is a diagram showing a procedure example of image recognition processing.
  • FIG. 1 is a block diagram showing a configuration example of the target monitoring system 100.
  • The target monitoring system 100 is a system mounted on a ship.
  • the ship equipped with the target monitoring system 100 is called “own ship”, and the other ships are called “other ships”.
  • The target monitoring system 100 includes a target monitoring device 1, a display unit 2, a radar 3, an AIS 4, a camera 5, a GNSS receiver 6, a gyrocompass 7, an ECDIS 8, a wireless communication unit 9, and a ship maneuvering control unit 10. These devices are connected to a network N such as a LAN and can communicate with each other over the network.
  • the target monitoring device 1 is a computer including a CPU, RAM, ROM, non-volatile memory, input/output interface, and the like.
  • the CPU of the target monitoring device 1 executes information processing according to a program loaded from the ROM or nonvolatile memory to the RAM.
  • the program may be supplied via an information storage medium such as an optical disc or memory card, or may be supplied via a communication network such as the Internet or LAN.
  • the display unit 2 displays the display image generated by the target monitoring device 1.
  • the display unit 2 also displays radar images, camera images, electronic charts, and the like.
  • the display unit 2 is, for example, a display device with a touch sensor, a so-called touch panel.
  • the touch sensor detects a position within the screen indicated by a user's finger or the like.
  • Input of the designated position is not limited to the touch sensor; a trackball or the like may be used instead.
  • the radar 3 emits radio waves around its own ship, receives the reflected waves, and generates echo data based on the received signals. Also, the radar 3 identifies a target from the echo data and generates TT data (Target Tracking Data) representing the position and speed of the target.
  • the AIS (Automatic Identification System) 4 receives AIS data from other ships around the ship or from land control. Not limited to AIS, VDES (VHF Data Exchange System) may be used.
  • the AIS data includes identification codes of other ships, ship names, positions, courses, ship speeds, ship types, hull lengths, destinations, and the like.
  • the camera 5 is a digital camera that captures images of the outside from the own ship and generates image data.
  • the camera 5 is installed, for example, on the bridge of the own ship facing the heading.
  • the camera 5 may be a camera having a pan/tilt function and an optical zoom function, a so-called PTZ camera.
  • the GNSS receiver 6 detects the position of the own ship based on radio waves received from the GNSS (Global Navigation Satellite System).
  • the gyrocompass 7 detects the heading of the own ship.
  • a GPS compass may be used instead of the gyro compass.
  • the ECDIS (Electronic Chart Display and Information System) 8 acquires the ship's position from the GNSS receiver 6 and displays the ship's position on the electronic chart.
  • the ECDIS 8 also displays the planned route of the own ship on the electronic chart.
  • a GNSS plotter may be used.
  • The wireless communication unit 9 includes various radio equipment for communicating with other ships or with land-based stations, for example ultrashort-wave, medium-shortwave, or shortwave radio equipment.
  • The ship maneuvering control unit 10 is a control device for realizing automatic ship maneuvering and controls the steering gear of the own ship. The ship maneuvering control unit 10 may also control the engine of the own ship.
  • In the present embodiment, the target monitoring device 1 is an independent device, but it is not limited to this and may be integrated with another device such as the ECDIS 8; that is, the functions of the target monitoring device 1 may be realized by another device.
  • In the present embodiment, the target monitoring device 1 is mounted on the own ship and used to monitor targets such as other ships existing around the own ship, but the application is not limited to this.
  • For example, the target monitoring device 1 may be installed in a land-based control facility and used to monitor vessels existing in the controlled sea area.
  • FIG. 2 is a block diagram showing a configuration example of the target monitoring device 1.
  • the control unit 20 of the target monitoring device 1 includes an image acquisition unit 11 , an image processing unit 12 , a display control unit 13 and a ship maneuvering determination unit 14 . These functional units are implemented by the control unit 20 executing information processing according to programs. Note that the ship maneuvering determination unit 14 may be located outside the target monitoring device 1 .
  • the control unit 20 of the target monitoring device 1 further includes a target management DB (database) 19.
  • a target management DB 19 is provided in the memory of the target monitoring device 1 .
  • the image acquisition unit 11 acquires images including marine targets such as other ships captured by the camera 5 installed on the own ship.
  • the image acquisition unit 11 sequentially acquires time-series images from the camera 5 and sequentially provides the images to the image processing unit 12 .
  • Time-series images are, for example, still images (frames) included in moving image data.
  • The image processing unit 12 performs predetermined image processing such as image recognition on the image acquired by the image acquisition unit 11, generates target data for the targets recognized from the image, and registers the data in the target management DB 19. Details of the image processing unit 12 will be described later.
  • the target management DB 19 is a database that manages target data generated by the image processing unit 12 .
  • In the target management DB 19, not only the target data generated by the image processing unit 12 but also other target data, such as TT data generated by the radar 3 or AIS data received by the AIS 4, may be integrated.
  • the display control unit 13 generates a display image including an object representing the target based on the target data registered in the target management DB 19 and outputs the display image to the display unit 2 .
  • the display image is, for example, a radar image, an electronic chart, or a composite image thereof, and the object representing the target is placed at a position in the image corresponding to the actual position of the target.
  • The ship maneuvering determination unit 14 makes ship maneuvering decisions based on the target data registered in the target management DB 19 and causes the ship maneuvering control unit 10 to perform an avoidance maneuver when it determines that a target must be avoided. Specifically, the ship maneuvering control unit 10 calculates an avoidance route for avoiding the target using an avoidance maneuvering algorithm and controls the steering gear or the engine so that the own ship follows the avoidance route.
  • FIG. 3 is a diagram showing an example of the contents of the target management DB 19.
  • the target management DB 19 includes fields such as "target ID”, “type”, “position in image”, “actual position”, “speed”, and “course”.
  • the target management DB 19 may further include, for example, the size of the target and the elapsed time since detection.
  • "Type" represents the type of the target determined from the image captured by the camera 5.
  • the type of target is, for example, a ship type such as a tanker, a pleasure boat, or a fishing boat.
  • Target types may further include offshore installations, such as buoys.
  • "Position in image" represents the position where the target exists in the image.
  • "Actual position" represents the position of the target in physical space, calculated from the target's position in the image.
  • The actual position is calculated by first deriving the target's position relative to the own ship from its in-image position and then converting it into an absolute position using the own ship's position.
  • The actual position may also be calculated by integrating the relative position of the target detected by the radar 3 or the actual position of the target received by the AIS 4, or one of these may be used instead.
  • "Speed" and "Course" represent the speed and course of the target, calculated from the change in its actual position over time.
  • The target management DB 19 may register not only target data of targets recognized from images captured by the camera 5 but also target data of targets recognized from images captured by a separately installed PTZ camera, fixed-point camera, 360-degree camera, or infrared camera.
  • FIG. 4 is a diagram showing a configuration example of the image processing unit 12.
  • the image processing unit 12 includes a scene determination unit 21 , a preprocessing unit 22 , a daytime image recognition unit 23 , a nighttime image recognition unit 24 , and a sunrise/sunset image recognition unit 25 . These functional units are implemented by the control unit 20 executing information processing according to programs.
  • the image processing unit 12 further includes a determination model holding unit 31, a daytime model holding unit 33, a nighttime model holding unit 34, and a sunrise/sunset model holding unit 35. These storage units are provided in the memory of the target monitoring device 1 .
  • The scene determination unit 21 uses the trained model for scene determination held in the determination model holding unit 31 to determine whether the image acquired by the image acquisition unit 11 was captured during the daytime, at night, or during a sunrise or sunset time zone. The scene determination unit 21 further determines whether the image was captured in backlight or in fog.
  • The trained model DM for scene determination is, for example, an image classification model such as a convolutional neural network (CNN).
  • The trained model DM for scene determination is generated by machine learning using learning images as input data and the class associated with each learning image as teacher data.
  • The learning images include images of the sea captured during the day, images of the sea captured at night, images of the sea captured during a sunrise or sunset time zone (hereinafter also "sunrise/sunset"), images of the sea captured in backlight, images of the sea captured in fog, and the like.
  • the training images may include images of the sea generated by Generative Adversarial Networks (GAN) or 3 Dimensional Computer Graphics (3DCG).
  • The classes associated with the learning images include "daytime", "nighttime", "sunrise/sunset", "backlight", and "fog".
  • The output layer of the trained model DM for scene determination has one element per class. The elements corresponding to "daytime", "nighttime", and "sunrise/sunset" are set so that their probabilities sum to 1, for example by a softmax function.
  • The scene determination unit 21 adopts whichever of "daytime", "nighttime", and "sunrise/sunset" has the highest probability: it determines that the image P was captured in the daytime when "daytime" has the highest probability, at night when "nighttime" has the highest probability, and at sunrise/sunset when "sunrise/sunset" has the highest probability.
  • The elements corresponding to "backlight" and "fog" are set to output independent probabilities between 0 and 1, for example by a sigmoid function.
  • The scene determination unit 21 determines that the image P was captured in backlight when the probability of "backlight" is equal to or higher than a threshold, and in fog when the probability of "fog" is equal to or higher than a threshold.
  • Alternatively, the scene determination unit 21 may determine whether the image P was captured during the daytime, at night, or during a sunrise or sunset time zone according to sunrise and sunset times calculated from the capture time of the image P and the current position of the own ship.
  • The sunrise time zone is a period of predetermined length that includes the sunrise time, and the sunset time zone is a period of predetermined length that includes the sunset time.
  • Daytime is the period from sunrise to sunset, excluding the sunrise and sunset time zones; nighttime is the period from sunset to sunrise, excluding those time zones.
  • The scene determination unit 21 may also make this determination based on the ambient brightness detected by an illuminance sensor provided on the ship.
  • When the scene determination unit 21 determines that the image P was captured against backlight, the preprocessing unit 22 performs gamma correction or contrast adjustment on the image P, processing it into an image suited for input to the subsequent trained model for daytime, nighttime, or sunrise/sunset.
  • When it is determined that the image P was captured in fog, the preprocessing unit 22 performs image sharpening processing such as defog processing on the image P, likewise processing it into an image suited for input to the subsequent trained model.
  • When it is determined that the image P was captured in the daytime, the daytime image recognition unit 23 detects the targets included in the image P using the trained model for daytime held in the daytime model holding unit 33.
  • the daytime image recognition unit 23 determines the type of target detected from the image P.
  • the type of target is, for example, a ship type such as a tanker, a pleasure boat, or a fishing boat.
  • the type of target may be, for example, an offshore installation such as a buoy.
  • the trained model for daytime is, for example, an object detection model such as SSD (Single Shot MultiBox Detector) or YOLO (You Only Look Once), and outputs a bounding box surrounding the target included in the image.
  • the trained model for daytime may be a segmentation model such as Semantic Segmentation or Instance Segmentation.
  • The trained model for daytime is a model generated by machine learning using, as input data, learning images including images of the sea captured in the daytime, and using, as teacher data, the in-image positions and types of the targets included in the learning images.
  • Training images may include daytime maritime images generated by a generative adversarial network (GAN) or 3DCG.
  • the position of the target within the image is specified by the coordinates of a rectangular area containing the target within the image P.
  • The in-image position of the target is associated with a class representing the type of the target, such as "tanker", "pleasure boat", "fishing boat", or "buoy", and with an estimation confidence.
  • FIG. 6 is a diagram showing an example of recognition of an image DP captured during the daytime by a trained model for daytime use.
  • a target SH such as another ship included in the image DP captured in the daytime is surrounded by a rectangular bounding box BB.
  • a label CF describing the type of target and the accuracy of estimation is added to the bounding box BB.
  • When it is determined that the image P was captured at night, the nighttime image recognition unit 24 detects the targets included in the image P using the trained model for nighttime held in the nighttime model holding unit 34.
  • The nighttime image recognition unit 24 determines not the type of a target detected from the image P but whether it is a target at all. That is, the nighttime image recognition unit 24 judges that an object is a target when the estimation confidence output by the trained model for nighttime is equal to or higher than a threshold.
  • Like the trained model for daytime, the trained model for nighttime may be an object detection model such as SSD or YOLO, or a segmentation model such as semantic segmentation or instance segmentation.
  • The trained model for nighttime is a model generated by machine learning using, as input data, learning images including images of the sea captured at night, and using, as teacher data, the in-image positions of the targets included in the learning images.
  • the trained model for nighttime also learns the arrangement pattern of lights as a parameter.
  • Training images may include nighttime ocean images generated by a generative adversarial network (GAN) or 3DCG.
  • a class representing the target is associated with the position in the image of the target.
  • FIG. 7 shows an example of recognition of an image NP captured at night by the trained model for nighttime. As shown in the figure, in an image NP captured at night, only the lights L emitted by targets such as other ships are visible. When the trained model for nighttime is applied to such an image NP, a target's light L is surrounded by a rectangular bounding box BB, and a label CF describing the target and the estimation confidence is attached to the bounding box BB.
  • Furthermore, the nighttime image recognition unit 24 detects, as targets, target candidates that were detected by the trained model for nighttime and that have a brightness of a predetermined level or higher. Simply applying the trained model for nighttime risks failing to detect a single independent light as a target; adding the brightness condition improves the accuracy of target detection even in an image NP captured at night.
  • When it is determined that the image P was captured during a sunrise or sunset time zone, the sunrise/sunset image recognition unit 25 detects the targets included in the image P using the trained model for sunrise/sunset held in the sunrise/sunset model holding unit 35. The sunrise/sunset image recognition unit 25 also determines the type of each target detected from the image P, in the same manner as the daytime image recognition unit 23.
  • Like the trained models for daytime and nighttime, the trained model for sunrise/sunset may be an object detection model such as SSD or YOLO, or a segmentation model such as semantic segmentation or instance segmentation.
  • The trained model for sunrise/sunset is a model generated by machine learning using, as input data, learning images including images of the sea captured at sunrise or sunset, and using, as teacher data, the in-image positions and types of the targets included in the learning images.
  • the training images may include sunrise and sunset images of the ocean generated by a generative adversarial network (GAN) or 3DCG.
  • FIG. 8 is a diagram showing an example of recognition of an image SP captured at sunrise/sunset by a trained model for sunrise/sunset.
  • targets SH such as other ships included in the image SP captured at sunrise and sunset are surrounded by a rectangular bounding box BB.
  • a label CF describing the type of target and the accuracy of estimation is added to the bounding box BB.
  • Because reflections from the sea surface are strong at sunrise and sunset, the trained models for daytime or nighttime may not be accurate enough for target detection; by separately preparing a trained model for sunrise/sunset, the accuracy of target detection can be improved even in an image SP captured during those time zones.
  • FIG. 9 is a diagram showing a procedure example of a target monitoring method implemented in the target monitoring system 100.
  • FIG. 10 is a diagram showing a procedure example of the preprocessing routine.
  • FIG. 11 is a diagram showing a procedure example of an image recognition processing routine.
  • the control unit 20 of the target monitoring device 1 executes the information processing shown in the figure according to the program.
  • control unit 20 acquires the image P generated by the camera 5 (S11, processing as the image acquisition unit 11).
  • The control unit 20 uses the trained model for scene determination to determine whether the acquired image P was captured during the daytime, at night, or during a sunrise or sunset time zone, and further determines whether it was captured in backlight or in fog (S12, processing as the scene determination unit 21).
  • control unit 20 executes a preprocessing routine (S13, processing as the preprocessing unit 22).
  • If the image P is determined to have been captured against backlight, the control unit 20 performs gamma correction or contrast adjustment on the image P (S22).
  • If the image P is determined to have been captured in fog, the control unit 20 performs image sharpening processing such as defog processing on the image P (S24).
  • control unit 20 executes an image recognition processing routine (S14).
  • If the image P is determined to have been captured in the daytime, the control unit 20 detects the targets included in the image P using the trained model for daytime and determines the types of the targets (S32, processing as the daytime image recognition unit 23).
  • If the image P is determined to have been captured at night, the control unit 20 detects target candidates included in the image P using the trained model for nighttime and extracts, as targets, the candidates having a brightness of a predetermined level or higher (S34, S35, processing as the nighttime image recognition unit 24).
  • If the image P is determined to have been captured during a sunrise or sunset time zone, the control unit 20 detects the targets included in the image P using the trained model for sunrise/sunset and determines the types of the targets (S37, processing as the sunrise/sunset image recognition unit 25).
  • When the image recognition processing routine ends, the main routine shown in FIG. 9 also ends.
  • the control unit 20 generates target data of the target detected from the image P and registers it in the target management DB 19 .
  • 1 target monitoring device, 2 display unit, 3 radar, 4 AIS, 5 camera, 6 GNSS receiver, 7 gyrocompass, 8 ECDIS, 9 wireless communication unit, 10 ship maneuvering control unit, 11 image acquisition unit, 12 image processing unit, 13 display control unit, 14 ship maneuvering determination unit, 19 target management DB, 20 control unit, 21 scene determination unit, 22 preprocessing unit, 23 daytime image recognition unit, 24 nighttime image recognition unit, 25 sunrise/sunset image recognition unit, 31 determination model holding unit, 33 daytime model holding unit, 34 nighttime model holding unit, 35 sunrise/sunset model holding unit, 100 target monitoring system

Abstract

[Problem] To provide a target monitoring device capable of image recognition suited to the scene. [Solution] This target monitoring device comprises: an image acquisition unit that acquires an image, captured by a camera installed on a ship, containing a target on the sea; a scene determination unit that determines whether the image was captured during the daytime or at night; a daytime image recognition unit that detects the target included in the image using a trained daytime model when the image is determined to have been captured in the daytime; and a nighttime image recognition unit that detects the target included in the image using a trained nighttime model when the image is determined to have been captured at night.
PCT/JP2023/002267 2022-02-25 2023-01-25 Target monitoring device, ship maneuvering system, target monitoring method, and program WO2023162561A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022027914A JP2023124259A (ja) 2022-02-25 2022-02-25 Target monitoring device, ship maneuvering system, target monitoring method, and program
JP2022-027914 2022-02-25

Publications (1)

Publication Number Publication Date
WO2023162561A1 (fr)

Family

ID=87765441

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/002267 WO2023162561A1 (fr) 2022-02-25 2023-01-25 Target monitoring device, ship maneuvering system, target monitoring method, and program

Country Status (2)

Country Link
JP (1) JP2023124259A (fr)
WO (1) WO2023162561A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018037061A (ja) * 2016-09-01 2018-03-08 三星電子株式会社Samsung Electronics Co.,Ltd. 自律走行車両のためのビジョンセンサの制御方法及び装置
WO2020090251A1 (fr) * 2018-10-30 2020-05-07 日本電気株式会社 Dispositif, procédé et programme de reconnaissance d'objets
JP2020170319A (ja) * 2019-04-02 2020-10-15 Kyb株式会社 検出装置
JP2021187282A (ja) * 2020-05-29 2021-12-13 東亜建設工業株式会社 工事用船舶の航行監視システムおよび航行監視方法


Also Published As

Publication number Publication date
JP2023124259A (ja) 2023-09-06

Similar Documents

Publication Publication Date Title
US20190204416A1 (en) Target object detecting device, method of detecting a target object and computer readable medium
US20220024549A1 (en) System and method for measuring the distance to an object in water
WO2021132437A1 (fr) Serveur administratif dans un système d'aide à la navigation de navire, procédé d'aide à la navigation de navire et programme d'aide à la navigation de navire
Yu et al. Object detection-tracking algorithm for unmanned surface vehicles based on a radar-photoelectric system
EP3926364A1 (fr) Système de détection d'objet cible de navire, procédé de détection d'objet cible de navire et dispositif d'estimation de la fiabilité
Yu Development of real-time acoustic image recognition system using by autonomous marine vehicle
CN113933828A (zh) 一种无人艇环境自适应多尺度目标检测方法及系统
WO2023162561A1 (fr) Dispositif de surveillance de points de repère, système de direction de navire, procédé de surveillance de points de repère et programme
US20230351764A1 (en) Autonomous cruising system, navigational sign identifying method, and non-transitory computer-readable medium
CN115830140A (zh) 一种海上近程光电监控方法、系统、介质、设备及终端
US20220171043A1 (en) Sonar display features
WO2023112348A1 (fr) Dispositif de surveillance de cible, procédé de surveillance de cible et programme
CN107941220B (zh) 一种基于视觉的无人船海天线检测与导航方法和系统
WO2023162562A1 (fr) Système de surveillance de cible, procédé de surveillance de cible et programme
WO2023286360A1 (fr) Dispositif de collecte de données d'apprentissage, procédé de collecte de données d'apprentissage et programme
WO2023112349A1 (fr) Dispositif de surveillance de cible, procédé de surveillance de cible et programme
WO2023112347A1 (fr) Dispositif de surveillance de cible, procédé de surveillance de cible et programme
US20240104746A1 (en) Vessel tracking and monitoring system and operating method thereof
WO2023074014A1 (fr) Dispositif de surveillance de navire, procédé de surveillance de navire et programme
JP7486355B2 (ja) 船舶用物標検出システム、船舶用物標検出方法、信頼度推定装置、及びプログラム
WO2022137953A1 (fr) Dispositif d'identification d'amer, système de navigation autonome, procédé d'identification d'amer et programme
Cafaro et al. Towards Enhanced Support for Ship Sailing
WO2023286359A1 (fr) Appareil d'aide à l'accostage, procédé d'aide à l'accostage, et programme
WO2022137931A1 (fr) Dispositif d'identification de balise de chenal, système de navigation autonome, procédé d'identification de balise de chenal et programme
Tulchinskii et al. Automatic assistance system for visual control of targets

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23759567

Country of ref document: EP

Kind code of ref document: A1