WO2023100389A1 - Information processing method, information processing device, and computer program

Information processing method, information processing device, and computer program

Info

Publication number
WO2023100389A1
Authority
WO
WIPO (PCT)
Prior art keywords
water
information processing
image
camera
ship
Prior art date
Application number
PCT/JP2022/015560
Other languages
English (en)
Japanese (ja)
Inventor
奈緒美 藤澤
勝幸 柳
正也 能瀬
ミン トラン
悠太 高橋
博紀 村上
Original Assignee
古野電気株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 古野電気株式会社
Publication of WO2023100389A1


Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 3/00 Traffic control systems for marine craft
    • G08G 3/02 Anti-collision systems

Definitions

  • the present invention relates to an information processing method, an information processing device, and a computer program for determining water objects around a ship.
  • Patent Literature 1 discloses a technique for facilitating identification of an object on the water by displaying a photographed image of the exterior of a ship and a radar determination result in association with each other.
  • the present invention has been made in view of such circumstances, and its object is to provide an information processing method, an information processing apparatus, and a computer program that enable effective discrimination of objects on the water.
  • An information processing method according to the present invention acquires photographed images of the surroundings of a ship taken by a plurality of photographing devices of different types installed on the ship, inputs the photographed image acquired from each of the plurality of photographing devices to a trained model that, when a photographed image is input, outputs a discrimination result of the water objects included in the photographed image, acquires the discrimination results of the water objects output by the trained model, and outputs the discrimination results of the water objects corresponding to each of the plurality of photographing devices.
  • An information processing apparatus according to the present invention includes an image acquisition unit that acquires photographed images of the surroundings of a ship taken by a plurality of photographing devices of different types; a discrimination result acquisition unit that inputs the photographed image acquired from each of the plurality of photographing devices to a trained model that, when a photographed image is input, outputs a discrimination result of the water objects included in the photographed image, and acquires the discrimination results of the water objects output by the trained model; and an output unit that outputs the discrimination results corresponding to each of the plurality of photographing devices.
  • A computer program according to the present invention causes a computer to execute processing that acquires photographed images of the surroundings of a ship taken by a plurality of photographing devices of different types, inputs the photographed image acquired from each of the plurality of photographing devices to a trained model that, when a photographed image is input, outputs a discrimination result of the water objects included in the photographed image, acquires the discrimination results of the water objects output by the trained model, and outputs the discrimination results corresponding to each of the plurality of photographing devices.
  • In the present invention, captured images taken by a plurality of imaging devices installed on a ship are input to a trained model, and the trained model outputs discrimination results of water objects.
  • By using the trained model, it is possible to discriminate objects on the water without relying on the crew's visual observation.
  • By using a plurality of imaging devices installed at a plurality of positions on the vessel, water objects present in a wide area around the vessel can be effectively identified.
  • By using a plurality of imaging devices of different types, it is possible to effectively identify more water objects than can be identified using a single imaging device.
  • FIG. 1 is a schematic diagram showing a ship;
  • FIG. 2 is a block diagram showing a configuration example of an information processing system for determining water objects around a ship;
  • FIG. 3 is a block diagram showing an example of the internal functional configuration of the information processing device;
  • FIG. 4 is a conceptual diagram showing the functions of the trained model;
  • FIG. 5 is a flowchart showing an example of processing for outputting a discrimination result of a water object;
  • FIG. 6 is a schematic diagram showing an example of a captured image;
  • FIG. 7 is a schematic diagram showing an example of outputting a photographed image and a discrimination result of an object on the water;
  • FIG. 8 is a schematic diagram showing an example of outputting a radar image, a photographed image, and a discrimination result of an object on the water;
  • FIG. 9 is a schematic diagram showing an example of a radar image with a designated portion and a captured image with the corresponding portion emphasized;
  • FIG. 10 is a flowchart illustrating an example of processing for re-learning a trained model;
  • FIG. 1 is a schematic diagram showing a ship.
  • a ship 1 is equipped with a plurality of photographing devices.
  • a plurality of photographing devices photograph the surroundings of the ship 1 and identify objects on the water included in the photographed images.
  • a water object is an object that is on the water around the ship 1, such as another ship or a buoy.
  • the multiple imaging devices include a front camera 11, an infrared camera 12, a PTZ camera 13 capable of pan, tilt and zoom, a starboard camera 14, and a port camera 15.
  • the front camera 11 captures an area in front of the ship 1 using visible light.
  • the infrared camera 12 captures an area in front of the ship 1 using infrared rays. The infrared camera 12 makes it easy to photograph objects on the water at night, which are difficult to photograph using visible light.
  • the PTZ camera 13 can photograph an area in any direction at any magnification.
  • a front camera 11 , an infrared camera 12 , and a PTZ camera 13 are installed in front of the ship 1 .
  • the starboard camera 14 is installed on the starboard side of the ship 1
  • the port camera 15 is installed on the port side of the ship 1 .
  • the starboard camera 14 uses visible light to capture an area facing the starboard side of the ship 1 .
  • the port camera 15 uses visible light to capture an area facing the port side of the ship 1 .
  • the relative positions of the areas photographed by the front camera 11, the infrared camera 12, the starboard camera 14, and the port camera 15 with respect to the ship 1 are fixed.
  • the plurality of photographing devices include photographing devices installed at a plurality of positions within the ship 1, and include photographing devices of different types.
  • the angles of view of the front camera 11, the starboard camera 14, and the port camera 15 are preferably wide.
  • For example, when the orientation in the horizontal plane is represented by an angle with the front of the ship 1 at 0 degrees, the front camera 11 photographs the areas of 300 degrees to 360 degrees and 0 degrees to 60 degrees around the ship 1, the starboard camera 14 photographs the area of 30 degrees to 150 degrees, and the port camera 15 photographs the area of 210 degrees to 330 degrees. Since the front camera 11, the starboard camera 14, and the port camera 15 have wide angles of view, a wide area can be photographed with one camera, so objects on the water can be discriminated efficiently.
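  • As a minimal illustration (not part of the patent), the example coverage sectors above can be expressed as a small lookup that returns which cameras cover a given relative bearing; the camera names and the treatment of sector boundaries are assumptions for this sketch:

      # Example coverage sectors from the text, in degrees clockwise from the bow.
      CAMERA_COVERAGE = {
          "front":     [(300.0, 360.0), (0.0, 60.0)],  # wraps around the bow
          "starboard": [(30.0, 150.0)],
          "port":      [(210.0, 330.0)],
      }

      def cameras_covering(bearing: float) -> list[str]:
          """Names of the cameras whose field of view includes the bearing."""
          bearing %= 360.0
          return [name for name, sectors in CAMERA_COVERAGE.items()
                  if any(lo <= bearing <= hi for lo, hi in sectors)]

      print(cameras_covering(45.0))   # ['front', 'starboard'] -- overlapping sectors
      print(cameras_covering(180.0))  # [] -- directly astern is not covered here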
  • the front camera 11, the infrared camera 12, the starboard camera 14, or the port camera 15 may be configured so that the position of the area to be photographed can be changed.
  • the plurality of imaging devices provided on the ship 1 may include cameras other than the front camera 11 , the infrared camera 12 , the PTZ camera 13 , the starboard camera 14 and the port camera 15 . Any one of the front camera 11, the infrared camera 12, the PTZ camera 13, the starboard camera 14, and the port camera 15 may not be included in the plurality of imaging devices.
  • FIG. 2 is a block diagram showing a configuration example of an information processing system 200 for identifying water objects around the ship 1.
  • the information processing system 200 includes an information processing device 2 .
  • the information processing device 2 is a computer.
  • a plurality of photographing devices including a front camera 11 , an infrared camera 12 , a PTZ camera 13 , a starboard camera 14 and a port camera 15 are connected to the information processing device 2 .
  • a radar 16 and an AIS (Automatic Identification System) 17 provided on the ship 1 are also connected to the information processing device 2 .
  • the radar 16 scans the surroundings of the vessel 1 with radio waves, discriminates objects on the water based on the reflected waves of the radio waves, and generates a radar image including the discrimination result.
  • the AIS 17 is a device for exchanging ship information about ships between ships, and receives ship information including information for identifying other ships from other ships by wireless communication.
  • the information processing system 200 includes a display device 31 and an operation device 32 .
  • the display device 31 displays images.
  • the display device 31 is, for example, a liquid crystal display or an EL display (Electroluminescent Display).
  • the operation device 32 receives input of information by receiving an operation from a user such as a crew member of the ship 1 .
  • the operating device 32 is, for example, a touch panel, keyboard, or pointing device.
  • the display device 31 and the operation device 32 may be integrated.
  • FIG. 3 is a block diagram showing an example of the internal functional configuration of the information processing device 2.
  • the information processing device 2 executes an information processing method for identifying objects on the water around the ship 1 .
  • the information processing device 2 includes an arithmetic unit 21 , a memory 22 , a drive unit 23 , a storage unit 24 and an interface unit 25 .
  • the calculation unit 21 is configured using, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or a multi-core CPU.
  • a display device 31 and an operation device 32 are connected to the calculation unit 21 .
  • the memory 22 stores temporary data generated along with computation.
  • the memory 22 is, for example, a RAM (Random Access Memory).
  • a drive unit 23 reads information from a recording medium 20 such as an optical disc or a portable memory.
  • the storage unit 24 is nonvolatile, such as a hard disk or nonvolatile semiconductor memory.
  • the calculation unit 21 causes the drive unit 23 to read the computer program 241 recorded on the recording medium 20 and causes the storage unit 24 to store the read computer program 241 .
  • the calculation unit 21 executes processing necessary for the information processing device 2 according to the computer program 241 .
  • Computer program 241 may be a computer program product.
  • the computer program 241 may be downloaded from outside the information processing device 2 .
  • the computer program 241 may be pre-stored in the information processing device 2 . In these cases, the information processing device 2 does not have to include the drive section 23 .
  • the front camera 11, the infrared camera 12, the PTZ camera 13, the starboard camera 14, and the port camera 15 are connected to the interface unit 25.
  • the forward camera 11 , the infrared camera 12 , the PTZ camera 13 , the starboard camera 14 and the port side camera 15 create captured images of areas around the ship 1 that they are in charge of, and input the captured images to the information processing device 2 .
  • the interface unit 25 accepts an input photographed image.
  • the information processing device 2 acquires the captured image by receiving the captured image at the interface unit 25 .
  • the front camera 11, the infrared camera 12, the PTZ camera 13, the starboard camera 14, and the port camera 15 continuously create captured images, or periodically and repeatedly create captured images.
  • the information processing device 2 acquires captured images continuously or acquires captured images repeatedly on a regular basis.
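  • As a sketch of this periodic acquisition (an illustrative assumption, not the patent's implementation; the OpenCV device indices stand in for the actual camera interfaces):

      import time
      import cv2  # OpenCV

      CAMERAS = {"front": 0, "starboard": 1, "port": 2}  # name -> device index (placeholders)

      def acquire_frames(period_s: float = 1.0, cycles: int = 10) -> None:
          """Poll each camera on a fixed period, keeping the latest frame per camera."""
          captures = {name: cv2.VideoCapture(idx) for name, idx in CAMERAS.items()}
          try:
              for _ in range(cycles):
                  frames = {}
                  for name, cap in captures.items():
                      ok, frame = cap.read()
                      if ok:
                          frames[name] = frame  # hand the frame to the discrimination step
                  time.sleep(period_s)
          finally:
              for cap in captures.values():
                  cap.release()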
  • the interface unit 25 is connected to the radar 16 and the AIS 17.
  • the radar 16 generates a radar image and inputs it to the information processing device 2 .
  • the information processing device 2 acquires the radar image by receiving the input radar image at the interface unit 25 .
  • the radar 16 repeatedly creates radar images, and the information processing device 2 repeatedly acquires radar images.
  • the AIS 17 receives vessel information from other vessels and inputs AIS data based on the received vessel information to the information processing device 2 .
  • AIS data includes information indicating the type of other vessels.
  • the information processing device 2 acquires the AIS data by receiving the input AIS data at the interface unit 25 . Each time the AIS 17 receives ship information, the AIS 17 inputs the AIS data to the information processing device 2, and the information processing device 2 acquires the AIS data.
  • the information processing device 2 may be configured by a plurality of computers, and data may be distributed and stored by the plurality of computers, and processing may be performed by the plurality of computers in a distributed manner.
  • the information processing device 2 may be realized by a plurality of virtual machines provided within one computer.
  • the information processing device 2 is equipped with a trained model 242 that has been trained.
  • the learned model 242 is implemented by the computing unit 21 executing information processing according to the computer program 241 .
  • the storage unit 24 stores data recording the parameters of the trained model 242 learned in advance, and the calculation unit 21 executes information processing according to the computer program 241 using those parameters, thereby realizing the trained model 242 .
  • the trained model 242 may be configured by hardware. Alternatively, the trained model 242 may be provided outside the information processing device 2 , and the information processing device 2 may execute processing using the external trained model 242 .
  • FIG. 4 is a conceptual diagram showing the functions of the trained model 242.
  • Captured images created by the front camera 11 , the infrared camera 12 , the PTZ camera 13 , the starboard camera 14 or the port camera 15 are input to the learned model 242 .
  • the learned model 242 is trained in advance by machine learning so that when a photographed image is input, it outputs a result of judging a water object included in the photographed image.
  • the discrimination results output by the trained model 242 include the position of the object on the water in the captured image, the type of the object on the water, and the degree of certainty of the type of the object on the water.
  • the trained model 242 is configured using YOLO (You Only Look Once).
  • the trained model 242 may be a model using a method other than YOLO, such as R-CNN (Region Based Convolutional Neural Networks) or segmentation.
  • the learned model 242 may include a plurality of learned models in one-to-one correspondence with the front camera 11, the infrared camera 12, the PTZ camera 13, the starboard camera 14, and the port camera 15. Each of the plurality of learned models is individually learned in advance so as to output a determination result of a water object when a photographed image created by the corresponding photographing device is input.
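  • A minimal sketch of this per-camera inference step, using the ultralytics YOLO package as one possible implementation; the weights file name, camera names, and class labels are illustrative assumptions, not details of the trained model 242:

      from ultralytics import YOLO

      # One trained model per imaging device, as described above (weights are placeholders).
      CAMERA_NAMES = ("front", "infrared", "ptz", "starboard", "port")
      MODELS = {name: YOLO("water_objects.pt") for name in CAMERA_NAMES}

      def discriminate(camera: str, image) -> list[dict]:
          """Run the camera's model and collect (position, type, confidence) per object."""
          result = MODELS[camera](image)[0]  # first (and only) image in the batch
          return [
              {
                  "bbox": box.xyxy[0].tolist(),        # [x1, y1, x2, y2] in pixels
                  "type": result.names[int(box.cls)],  # e.g. "fishing_vessel", "buoy"
                  "confidence": float(box.conf),       # certainty of the type
              }
              for box in result.boxes
          ]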
  • the information processing device 2 performs a process of identifying water objects existing around the ship 1 and outputting the identification result of the water objects.
  • FIG. 5 is a flow chart showing an example of a process for outputting a determination result of an object on the water. A step is abbreviated as S below.
  • the computing unit 21 of the information processing device 2 executes processing according to the computer program 241 .
  • the front camera 11 , the infrared camera 12 , the PTZ camera 13 , the starboard camera 14 and the port camera 15 create captured images by capturing images, and input the captured images to the information processing device 2 .
  • the information processing device 2 acquires a captured image by receiving the captured image input from each imaging device at the interface unit 25 (S101).
  • the information processing device 2 may acquire only the captured images created by some of the imaging devices.
  • the calculation unit 21 stores the captured image in the storage unit 24 .
  • FIG. 6 is a schematic diagram showing an example of a captured image.
  • FIG. 6 shows an example of a photographed image 41 created by the front camera 11.
  • a captured image 41 includes a portion of the hull 51 of the ship 1 .
  • the photographed image 41 also includes a plurality of water objects 52 .
  • the processing of S101 corresponds to the image acquisition unit.
  • the information processing device 2 inputs the captured image to the learned model 242 (S102).
  • the calculation unit 21 inputs the captured image to the learned model 242, and causes the learned model 242 to execute processing.
  • when the trained model 242 includes a plurality of trained models corresponding one-to-one to the plurality of imaging devices, the calculation unit 21 inputs the captured image from each imaging device into the trained model corresponding to that imaging device.
  • in response to the input of a photographed image, the learned model 242 outputs a determination result of the water objects included in the input photographed image.
  • the information processing device 2 acquires the determination result of the object on the water output by the learned model 242 (S103).
  • the calculation unit 21 stores the determination result of the object on the water in the storage unit 24 .
  • the determination result includes the position of the aquatic object in the captured image, the type of the aquatic object, and the degree of certainty of the type of the aquatic object.
  • the information processing device 2 determines whether or not the degree of certainty of the type of object on the water included in the determination result is within a predetermined range (S104).
  • the predetermined range is a range in which the degree of certainty is somewhat low. For example, the predetermined range is the range equal to or less than a predetermined threshold. Alternatively, a second threshold may be set to a value smaller than the threshold, and the predetermined range may be the range from the second threshold to the threshold, inclusive.
  • the predetermined range is stored in advance in the storage unit 24 or included in the computer program 241 .
  • the calculation unit 21 compares the certainty with a predetermined range.
  • the calculation unit 21 determines whether or not the certainty factors for each aquatic object are within a predetermined range. When the same object on the water is included in a plurality of photographed images, the calculation unit 21 determines whether the maximum degree of certainty among the degrees of certainty obtained based on the plurality of photographed images is included in a predetermined range. Alternatively, it may be determined whether or not the average value or the variance value of the certainty factors is within a predetermined range.
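  • A minimal sketch of the S104 test, assuming the predetermined range is bounded below by the second threshold and above by the threshold; the patent gives no numeric values, so those below are illustrative:

      THRESHOLD = 0.6         # upper bound of the uncertain range (illustrative)
      SECOND_THRESHOLD = 0.2  # lower bound of the uncertain range (illustrative)

      def needs_zoom(confidences: list[float]) -> bool:
          """True if the object's best confidence falls in the uncertain range.

          When the same object appears in several captured images, the maximum
          confidence over those images is compared against the range, as above.
          """
          return SECOND_THRESHOLD <= max(confidences) <= THRESHOLD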
  • when the certainty factor is within the predetermined range (S104: YES), the information processing device 2 controls the PTZ camera 13 so as to magnify and photograph the water object whose type certainty is within the predetermined range (S105).
  • the calculation unit 21 identifies, based on the captured image, the position of the water object whose type certainty is within the predetermined range, and a control signal for causing the PTZ camera 13 to photograph the water object at the identified position is transmitted from the interface unit 25 to the PTZ camera 13 .
  • the PTZ camera 13 adjusts its orientation in accordance with the control signal, and enlarges and photographs the water objects whose type certainty is within a predetermined range.
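  • One way such a control signal could be derived (a hedged sketch assuming a simple pinhole-camera model and an illustrative horizontal field of view, neither of which the patent specifies) is to convert the object's pixel position into an absolute pan angle for the PTZ camera 13:

      import math

      def pan_angle_for(x_px: float, image_width: int,
                        camera_heading_deg: float, hfov_deg: float = 120.0) -> float:
          """Absolute pan angle (degrees, 0 = bow) pointing the PTZ camera at column x_px."""
          half_w = image_width / 2.0
          focal_px = half_w / math.tan(math.radians(hfov_deg / 2.0))  # pinhole focal length
          offset_deg = math.degrees(math.atan((x_px - half_w) / focal_px))
          return (camera_heading_deg + offset_deg) % 360.0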
  • the PTZ camera 13 creates a photographed image by photographing, and inputs the photographed image to the information processing device 2 .
  • the information processing apparatus 2 acquires the captured image by receiving the captured image input from the PTZ camera 13 at the interface unit 25 (S106).
  • the calculation unit 21 stores the captured image in the storage unit 24 .
  • the photographed image includes an enlarged water object whose type confidence is within a predetermined range.
  • the captured image acquired in S101 corresponds to the first captured image, and the imaging device that created the first captured image corresponds to the first camera.
  • the captured image acquired in S106 corresponds to the second captured image.
  • the information processing device 2 next inputs the captured image acquired in S106 to the learned model 242 (S107).
  • the trained model 242 includes a plurality of trained models in one-to-one correspondence with a plurality of photographing devices
  • the calculation unit 21 inputs the photographed image to the trained model for the PTZ camera 13 .
  • the learned model 242 outputs a determination result of determining a water object included in the input photographed image.
  • the information processing device 2 acquires the determination result of the object on the water output by the learned model 242 (S108).
  • in S108, the calculation unit 21 stores the determination result of the object on the water in the storage unit 24 . Since the object on the water is magnified, its features become more prominent in the captured image.
  • a determination result with a higher degree of certainty can be obtained based on the magnified photographed image of the object on the water.
  • a determination result may be obtained that the photographed image does not include any object on the water.
  • the processing of S107-S108 corresponds to the determination result acquisition unit.
  • when the certainty factor is not within the predetermined range in S104 (S104: NO), or after S108 ends, the information processing device 2 outputs the captured image and the water object discrimination result (S109).
  • the calculation unit 21 outputs the photographed image and the water object discrimination result by displaying the image including the photographed image and the water object discrimination result on the display device 31 .
  • the calculation unit 21 outputs a plurality of photographed images and the discrimination result of the object on the water included in each photographed image. The processing of S109 corresponds to the output unit.
  • FIG. 7 is a schematic diagram showing an example of outputting the photographed image and the determination result of the object on the water.
  • FIG. 7 shows an example of an image displayed on the display device 31.
  • the displayed images include a photographed image 41 created by the front camera 11, a photographed image 42 created by the infrared camera 12, a photographed image 43 created by the PTZ camera 13, a photographed image 44 created by the starboard camera 14, and a photographed image 45 created by the port camera 15.
  • Each photographed image is attached with a label 46 that indicates the type of photographing device that created it. By visually recognizing the label 46, the user can confirm which photographing device each photographed image was created by.
  • a photographed image input by each photographing device to the information processing device 2 is associated with identification information for identifying the photographing device. Based on the identification information associated with the captured image, the calculation unit 21 generates a label 46 indicating the type of the photographing device that created the captured image, and displays the image with the label 46 attached to each captured image on the display device 31 .
  • a mark indicating the position of the water object in the captured image and information indicating the type of the water object are output. For example, as shown in FIG. 7, a bounding box 53 surrounding an aquatic object 52 included in the captured image is displayed as a mark indicating the position of the aquatic object within the captured image. A mark other than the bounding box 53, such as an arrow, may be output. A user can know that a water object exists around the ship 1 by visually recognizing the bounding box 53 .
  • character information 54 representing the type of the aquatic object 52 in characters is displayed as information indicating the type of the aquatic object.
  • the types of aquatic objects are determined to be either ships, aids to navigation, lighthouses, aquaculture facilities, buoys, bridge girders, piers, other non-vessel objects, or unknown objects.
  • for example, the type of a floating object is determined to be an unknown object.
  • a water object identified as a vessel may be further identified as a merchant vessel, cruise ship, crane vessel, tugboat, gravel carrier, fishing vessel, pleasure vessel, other vessel, or unknown vessel. Aids to navigation or buoys may likewise be further classified.
  • the learned model 242 may be trained so that the type of a water object 52 photographed below a predetermined size is determined to be unknown, with a certainty factor within the predetermined range.
  • the user can recognize the types of water objects existing around the ship 1 by visually recognizing the character information 54 .
  • Information indicating the type of object on the water may be output in addition to the character information 54 . For example, by changing the color of the bounding box 53 or the character information 54 according to the type of water object, different types of water objects may be represented.
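  • A minimal sketch of this overlay output, assuming the detection dictionaries from the inference sketch above and using OpenCV drawing calls; the colour table is an illustrative choice:

      import cv2

      TYPE_COLOURS = {"vessel": (0, 255, 0), "buoy": (0, 165, 255), "unknown": (0, 0, 255)}

      def draw_result(image, detection: dict):
          """Draw the bounding box 53 and character information 54 for one detection."""
          x1, y1, x2, y2 = (int(v) for v in detection["bbox"])
          colour = TYPE_COLOURS.get(detection["type"], (255, 255, 255))
          cv2.rectangle(image, (x1, y1), (x2, y2), colour, 2)
          label = f'{detection["type"]} {detection["confidence"]:.2f}'
          cv2.putText(image, label, (x1, y1 - 5),
                      cv2.FONT_HERSHEY_SIMPLEX, 0.5, colour, 1)
          return image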
  • the captured image 41 created by the front camera 11 and the captured image 42 created by the infrared camera 12 capture almost the same area. However, since the light used is different, the water object 52 included in one captured image may not be included in the other captured image.
  • the captured image 41 created by the front camera 11, the captured image 44 created by the starboard camera 14, and the captured image 45 created by the port camera 15 are captured in different areas. However, if there is an overlapping portion in the areas photographed by the two photographing devices, the same aquatic object 52 may be included in the two photographed images.
  • FIG. 7 shows an example in which the captured image 41 created by the front camera 11 and the captured image 44 created by the starboard camera 14 include the same water object 52, which is a fishing boat.
  • FIG. 7 shows an example in which a photographed image 41 created by the front camera 11 and a photographed image 45 created by the port side camera 15 include the same aquatic object 52, which is a buoy.
  • the image shown in FIG. 7 also includes a photographed image 43 obtained by enlarging and photographing one water object 52 included in the photographed image 41 created by the front camera 11 with the PTZ camera 13 .
  • One aquatic object 52 included in the photographed image 41 appears very small, the type is determined to be unknown, and the degree of certainty is within a predetermined range.
  • the photographed image 43 obtained by enlarging and photographing the object on the water by the PTZ camera 13 is created, and the determination result of the object on the water is obtained.
  • the captured image 43 created by the PTZ camera 13 one aquatic object 52 included in the captured image 41 is enlarged.
  • the type of the aquatic object 52 that was determined as unknown in the determination result based on the captured image 41 is determined as a cruise ship in the determination result based on the captured image 43 .
  • in the photographed image 41, character information 54 indicating that this water object 52 is an unknown object is displayed.
  • when the determination result indicates that the photographed image 43 does not include a water object, the calculation unit 21 displays information indicating that the photographed image 43 does not contain the object on the water on the display device 31 . At this time, the calculation unit 21 may display this information without displaying the captured image 43 .
  • since the regions photographed by the photographing devices differ from one another, it is possible to photograph a wider region than can be photographed by a single photographing device. Therefore, it is possible to effectively discriminate water objects present in a wide area around the ship 1 .
  • by using a plurality of imaging devices of different types, water objects that cannot be photographed with one imaging device, or are photographed only unclearly, can sometimes be clearly photographed using another type of imaging device. Therefore, it is possible to effectively identify more water objects than can be identified using a single imaging device. For example, by using the PTZ camera 13 to enlarge and photograph an object on the water that appears unclear when photographed by another photographing device, it is possible to obtain an accurate determination result.
  • the information processing device 2 then outputs the radar image, the captured image, and the determination result of the object on the water (S110).
  • the radar 16 scans the surroundings of the vessel 1 with radio waves, discriminates objects on the water based on the reflected waves of the radio waves, generates a radar image including the discrimination result, and inputs the radar image to the information processing device 2 .
  • the information processing device 2 acquires the radar image by receiving the input radar image at the interface unit 25 .
  • the calculation unit 21 stores the acquired radar image in the storage unit 24 . In S110, the calculation unit 21 displays on the display device 31 an image including the radar image, the photographed image, and the determination result of the object on the water.
  • FIG. 8 is a schematic diagram showing an example of outputting a radar image, a photographed image, and a determination result of an object on the water.
  • FIG. 8 shows an example of an image displayed on the display device 31, including the radar image 6, the photographed image, and the determination result of the object on the water.
  • the displayed images include the radar image 6, the captured image 41 created by the front camera 11, the captured image 44 created by the starboard camera 14, and the captured image 45 created by the port camera 15.
  • the captured image 41 is placed in front of the radar image 6 , the captured image 44 is placed on the right side of the radar image 6 , and the captured image 45 is placed on the left side of the radar image 6 .
  • each photographed image and the radar image 6 are displayed in comparison, making it easy to understand the correspondence between the contents of the radar image 6 and the contents of the photographed image.
  • Other captured images may be displayed in contrast to the radar image 6 .
  • the radar image 6 includes images 61 of water objects present around the ship 1 .
  • the radar image 6 includes the determination result of the object on the water determined by the radar 16 .
  • the determination result of the water object by the radar 16 and the determination result based on the photographed images are output in the same manner. That is, as the discrimination result of a water object included in the radar image 6, a mark indicating the position of the water object in the radar image 6 and information indicating the type of the water object are output. For example, a bounding box 53 surrounding an image 61 of an object on the water and character information 63 indicating the type of the object 52 on the water are displayed in the same manner as the determination result of the object on the water superimposed on the captured image.
  • the contents of the character information 63 representing the type of the aquatic object are the same as the contents of the character information 54 representing the type of the corresponding aquatic object 52 .
  • the color of the bounding box 53 or the text information 54 related to the image 61 of the water object may be adjusted so as to be the same as the color of the bounding box 53 or the text information 54 related to the corresponding water object 52 .
  • as the discrimination result of the object on the water superimposed on the radar image 6, the discrimination result based on the AIS data obtained from the AIS 17 may be used.
  • the water object discrimination results obtained by the two methods can thus be compared.
  • whether the discrimination results of a water object obtained by the two types of methods match can easily be ascertained. It is also possible to detect errors in the discrimination results of objects on the water. For example, if the image 61 of an object on the water is included in the radar image 6 but the corresponding object 52 on the water is not included in the captured image, it can be determined that the image 61 of the object on the water is a false image. In this way, objects on the water are discriminated more reliably.
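  • A minimal sketch of this cross-check, assuming detections from both sources have been reduced to relative bearings and using an illustrative matching tolerance:

      def flag_false_images(radar_bearings: list[float],
                            camera_bearings: list[float],
                            tolerance_deg: float = 5.0) -> list[float]:
          """Radar contacts with no camera detection nearby are possible false images."""
          def angular_diff(a: float, b: float) -> float:
              return abs((a - b + 180.0) % 360.0 - 180.0)  # shortest angular distance

          return [rb for rb in radar_bearings
                  if not any(angular_diff(rb, cb) <= tolerance_deg
                             for cb in camera_bearings)]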
  • the captured image 42 created by the infrared camera 12 and the captured image 43 created by the PTZ camera 13 may be further displayed.
  • the information processing device 2 next accepts designation of a part of the radar image 6 (S111).
  • the calculation unit 21 receives designation by the user operating the operation device 32 .
  • the calculation unit 21 displays on the display device 31 a designating figure such as a cursor or a bounding box for designating a part of the radar image 6 superimposed on the radar image 6 .
  • the calculation unit 21 changes the position of the designating graphic according to the user's operation received by the operating device 32 and receives the designation of a part of the radar image 6 .
  • the information processing device 2 then outputs the determination result of the object on the water included in the corresponding part in the captured image corresponding to the part in the designated radar image 6 (S112).
  • an emphasis figure such as a bounding box for emphasizing the corresponding portion in the captured image is superimposed on the captured image and displayed on the display device 31.
  • FIG. 9 is a schematic diagram showing an example of the radar image 6 with a designated portion and the captured image with the corresponding portion emphasized. Overlaid on the radar image 6, a designation image 64 is displayed for designating a portion including an image 61 of one aquatic object.
  • the designated image 64 is a graphic enclosing the designated portion with a double line.
  • photographed images 41 and 43 including corresponding portions are displayed, and an emphasis figure 55 for emphasizing the corresponding portions is displayed superimposed on the photographed images 41 and 43 .
  • the emphasized graphic 55 is a graphic enclosing the corresponding portion with a double line. Character information 54 is displayed as the determination result of the aquatic object 52 included in the corresponding portion.
  • a part in the radar image 6 and the corresponding part in the captured image are output in comparison, and the user can confirm how each part in the radar image 6 looks visually.
  • the user can confirm the water object 52 corresponding to the image 61 included in the radar image 6 in the captured image.
  • the user can easily compare the result of identifying the object on the water using the radar 16 and the result of identifying the object on the water based on the photographed image.
  • a photographed image that does not include a portion corresponding to the designated portion of the radar image 6 may be further displayed. After S112 ends, the information processing device 2 ends the process of outputting the determination result of the object on the water.
  • the processing of S109, the processing of S110, and the processing of S111-S112 may be executed in an order different from the example shown in FIG. 5 , or may be executed alternately and repeatedly. Any of the processing of S109, the processing of S110, and the processing of S111-S112 may be omitted.
  • the processing of S101 to S109 is executed as needed, and the determination result of the object on the water is used for navigation of the ship 1.
  • FIG. 10 is a flowchart showing an example of processing for re-learning the trained model 242 .
  • the information processing device 2 generates training data that associates a photographed image created by one of the photographing devices with the discrimination result of the object on the water (S21).
  • the calculation unit 21 prepares training data including a data set that associates a photographed image with a water object discrimination result based on a photographed image taken by a photographing device different from the one that created the associated photographed image.
  • the photographed image 41 created by the front camera 11 and the determination result of the water object based on the photographed image 43 created by the PTZ camera 13 are associated.
  • the calculation unit 21 also generates training data including a data set that associates the captured image with the determination result of the object on the water obtained using the radar 16 , and training data including a data set that associates the captured image with the determination result of the object on the water obtained using the AIS 17 . In S21, the calculation unit 21 generates training data including a plurality of data sets in which the photographed images and the discrimination results of the objects on the water are associated with each other, and stores the training data in the storage unit 24 .
  • the information processing device 2 next re-learns the learned model 242 using the training data (S22).
  • the calculation unit 21 inputs the captured image included in the training data to the learned model 242 and re-learns the learned model 242 .
  • the learned model 242 outputs a discrimination result of a water object according to the input of the photographed image.
  • the calculation unit 21 acquires the determination result of the object on the water output by the trained model 242, and adjusts the calculation parameters of the trained model 242 so that the difference between the determination result associated in the training data with the captured image input to the trained model 242 and the determination result output by the trained model 242 becomes small. That is, the parameters are adjusted so that a determination result substantially the same as the determination result associated with the captured image is output.
  • the computing unit 21 re-learns the learned model 242 by repeating the process using a plurality of data sets included in the training data and adjusting the parameters of the learned model 242 .
  • the calculation unit 21 stores data recording the adjusted final parameters in the storage unit 24 . In this way, a relearned trained model 242 is generated.
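  • A minimal sketch of this re-learning loop, assuming the trained model is a torch.nn.Module and that a suitable detection loss and data pipeline exist; none of these details are fixed by the patent:

      import torch

      def relearn(model: torch.nn.Module, dataset, loss_fn, epochs: int = 5):
          """Adjust the model's parameters to shrink the gap to the training labels."""
          loader = torch.utils.data.DataLoader(dataset, batch_size=8, shuffle=True)
          optimiser = torch.optim.SGD(model.parameters(), lr=1e-4)
          model.train()
          for _ in range(epochs):
              for images, labels in loader:  # labels come from another device (PTZ/radar/AIS)
                  optimiser.zero_grad()
                  loss = loss_fn(model(images), labels)  # difference between output and label
                  loss.backward()
                  optimiser.step()
          return model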
  • the information processing device 2 ends the process of re-learning the trained model 242 .
  • since the learned model 242 is re-learned using, as training data, captured images and the discrimination results of objects on the water based on captured images taken by different imaging devices, the trained model 242 can be adjusted so as to output more accurate discrimination results.
  • the learned model 242 can also be adjusted using discrimination results different from the discrimination results based on the captured images, such as those obtained using the radar 16 or the AIS 17 . Note that the processes of S21 and S22 may be executed by a device other than the information processing device 2 .
  • As described above, in this embodiment, captured images taken by a plurality of imaging devices installed on the ship 1 are input to the trained model 242, and the trained model 242 outputs the water object discrimination results.
  • By using the learned model 242, it is possible to discriminate objects on the water without depending on the visual observation of the crew.
  • By using a plurality of photographing devices installed at a plurality of positions on the ship 1, it is possible to photograph a wide area, and to effectively discriminate water objects existing in a wide area around the ship 1 .
  • By using a plurality of photographing devices of different types, it is possible to effectively discriminate many water objects. Effective discrimination of water objects can facilitate safe navigation of the vessel 1 .
  • Although the radar 16 and the AIS 17 are connected to the information processing device 2 in this embodiment, the radar 16 and the AIS 17 may be independent of the information processing device 2 .
  • For example, the radar 16 and the AIS 17 may be connected to the display device 31 without going through the information processing device 2, and the radar image 6 and the discrimination results of water objects obtained using the radar 16 or the AIS 17 may be displayed independently of the processing by the information processing device 2 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Ocean & Marine Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)

Abstract

The problem addressed by the present invention is to provide an information processing method, an information processing device, and a computer program that enable effective identification of objects on the water. The solution according to the invention is an information processing method for identifying a water object around a ship, the method comprising: acquiring photographed images obtained by photographing the surroundings of the ship with a plurality of photographing devices of different types installed on the ship; inputting the photographed images acquired from each of the plurality of photographing devices into a trained model that, when the photographed images are input, outputs identification results of the water objects included in the photographed images; acquiring the identification results of the water objects output from the trained model; and outputting the identification results of the water objects corresponding to each of the plurality of photographing devices.
PCT/JP2022/015560 2021-12-02 2022-03-29 Information processing method, information processing device, and computer program WO2023100389A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-196343 2021-12-02
JP2021196343 2021-12-02

Publications (1)

Publication Number Publication Date
WO2023100389A1 (fr) 2023-06-08

Family

ID=86611781

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/015560 WO2023100389A1 (fr) 2021-12-02 2022-03-29 Information processing method, information processing device, and computer program

Country Status (1)

Country Link
WO (1) WO2023100389A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6236549B1 (ja) * 2016-06-02 2017-11-22 日本郵船株式会社 船舶航行支援装置
JP2021103396A (ja) * 2019-12-25 2021-07-15 知洋 下川部 船舶の航行支援システムにおける管理サーバ、船舶の航行支援方法、及び船舶の航行支援プログラム


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 22900825; Country of ref document: EP; Kind code of ref document: A1)

ENP Entry into the national phase (Ref document number: 2023564729; Country of ref document: JP; Kind code of ref document: A)