WO2018140549A1 - Line array cameras for a man over board detection system - Google Patents
- Publication number
- WO2018140549A1 (PCT/US2018/015139)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line array
- array camera
- control system
- response
- data sets
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B63—SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
- B63C—LAUNCHING, HAULING-OUT, OR DRY-DOCKING OF VESSELS; LIFE-SAVING IN WATER; EQUIPMENT FOR DWELLING OR WORKING UNDER WATER; MEANS FOR SALVAGING OR SEARCHING FOR UNDERWATER OBJECTS
- B63C9/00—Life-saving in water
- B63C9/0005—Life-saving in water by means of alarm devices for persons falling into the water, e.g. by signalling, by controlling the propulsion or manoeuvring means of the boat
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/12—Details of acquisition arrangements; Constructional details thereof
- G06V10/14—Optical characteristics of the device performing the acquisition or on the illumination arrangements
- G06V10/147—Details of sensors, e.g. sensor lenses
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/02—Alarms for ensuring the safety of persons
- G08B21/08—Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool; responsive to an abnormal condition of a body of water
- G08B21/086—Alarms for ensuring the safety of persons responsive to the presence of persons in a body of water, e.g. a swimming pool; responsive to an abnormal condition of a body of water by monitoring a perimeter outside the body of the water
Definitions
- the subject matter disclosed herein relates generally to the field of object detection, and specifically to a method and apparatus for man overboard detection.
- man overboard detection systems rely on eye-witness accounts or use thermal imaging cameras to detect a body in the water once someone has fallen overboard.
- a person's body temperature must be significantly different from that of the water to be visible on the thermal imaging camera and even small waves can obscure the camera's view of a person in the water.
- a more reliable solution for man overboard detection is desired.
- an object detection system comprising: at least one line array camera configured to capture one or more data sets within a field of view of the line array camera using a one-dimensional array of pixels within each line array camera, each data set being a one-dimensional output from the one-dimensional array of pixels at a point in time; and a control system configured to determine when an object has moved through a field of view in response to the one or more data sets and determine whether the object is a person in response to the one or more data sets; wherein the control system is configured to determine that a person has moved through a field of view in response to at least one of variations within each data set and variations between each data set.
- further embodiments of the system may include where the at least one line array camera further comprises: a first line array camera; and a second line array camera located at a first selected distance below the first line array camera; wherein the control system is configured to determine the velocity of the object in response to the one or more data sets captured by the first line array camera, the one or more data sets captured by the second line array camera, and the first selected distance.
- control system is configured to determine that the object is not a human being in response to the velocity of the object.
- control system is configured to compile a two-dimensional image in response to the velocity of the object and at least one of the one or more data sets captured by the first line array camera and the one or more data sets captured by the second line array camera.
- control system is configured to analyze the two-dimensional image to determine whether the object is a person or not.
- control system is configured to transmit the two-dimensional image for human verification.
- control system is configured to activate a response protocol when a person is detected.
- further embodiments of the system may include where the object detection system is located on a ship and the response protocol includes at least one of an alarm, a notification message to crew, a buoy drop, an engine stop or a coast guard call.
- the at least one line array camera further comprises: a first line array camera; and a third line array camera located at a second selected distance away from the first line array camera, the third line array camera facing the first line array camera in such a way that a field of view of the third line array camera overlaps the field of view of the first line array camera; wherein the control system is configured to determine at least one of a size and a location of the object in response to the second selected distance and at least one of the one or more data sets captured by the first line array camera, and the one or more data sets captured by the third line array camera.
- control system is configured to determine that the object is not a human being in response to the size of the object.
- further embodiments of the system may include where at least one of the time and location of the object is transmitted for human verification of the object.
- further embodiments of the system may include where the first line array camera is a short wave infrared camera.
- further embodiments of the system may include where the first line array camera utilizes active illumination.
- a method of object detection comprising: capturing one or more data sets within a field of view of at least one line array camera using a one-dimensional array of pixels within each line array camera, each data set being a one-dimensional output from the one-dimensional array of pixels at a point in time; determining, using a control system, when an object has moved through a field of view in response to the one or more data sets; and determining whether the object is a person in response to the one or more data sets; wherein the control system is configured to determine that a person has moved through a field of view in response to at least one of variations within each data set and variations between each data set.
- the at least one line array camera further comprises: a first line array camera; and a second line array camera located at a first selected distance below the first line array camera; wherein the control system is configured to determine the velocity of the object in response to the one or more data sets captured by the first line array camera, the one or more data sets captured by the second line array camera, and the first selected distance; wherein the control system is configured to determine that a person has moved through a field of view in response to at least one of variations within each data set and variations between each data set.
- further embodiments of the system may include determining, using the control system, that the object is not a human being in response to the velocity of the object.
- further embodiments of the system may include compiling, using the control system, a two-dimensional image in response to the velocity of the object and at least one of the one or more data sets captured by the first line array camera and the one or more data sets captured by the second line array camera.
- further embodiments of the system may include analyzing, using the control system, the two-dimensional image to determine whether the object is a person or not.
- further embodiments of the system may include transmitting, using the control system, the two-dimensional image for human verification.
- further embodiments of the system may include activating, using the control system, a response protocol when a person is detected.
- FIGs. 1a-1e illustrate a schematic view of a line array camera capturing multiple one-dimensional data sets of an object to compile a two-dimensional image, in accordance with an embodiment of the disclosure
- FIG. 2 illustrates a schematic view of an object overboard detection system for use on a ship, in accordance with an embodiment of the disclosure
- FIG. 3 illustrates a side view of a ship incorporating the object overboard detection system of FIG. 1, in accordance with an embodiment of the disclosure
- FIG. 4 illustrates a top view of a ship incorporating the object overboard detection system of FIG. 1, in accordance with an embodiment of the disclosure
- FIG. 5 is a flow chart of a method of operating an object overboard detection system, in accordance with an embodiment of the disclosure.
- FIG. 6 is a flow chart of a method of response protocol after detecting an object falling overboard, in accordance with an embodiment of the disclosure.
- FIGs. 1a-1e show a schematic view of a line array camera 110 capturing multiple one-dimensional data sets 300a of an object 200 to compile a two-dimensional image 300. While FIGs. 1a-1e display a single line array camera 110, FIGs. 2-4 contain multiple line array cameras with various names, including a first upper line array camera 110a, a second upper line array camera 110b, a first lower line array camera 120a, a second lower line array camera 120b, etc.
- the nomenclature (first, second, upper, lower) in FIGs. 2-4 is to differentiate between the organization of various line array cameras and each line array camera has the same capabilities as the line array camera 110 discussed in references to FIGs. la-le.
- the nomenclature first, second, upper, lower is non-limiting and other terms may be used, such as, for example, first, second, third, and fourth.
- the line array camera 110 may be any camera capable of capturing one-dimensional data sets using a one-dimensional array of pixels 111 such as, for example, a line scan camera, a line array camera, or a one-dimensional array camera. As shown in FIGs. 1a-1e, the line array camera 110 has a very narrow field of view due to a first one-dimensional array of pixels 111. As an object 200 moves across a field of view 112 of the line array camera 110, a one-dimensional data set 300a of the object 200 is recorded. Note: the line array camera 110 is continuously capturing data sets even when there is no object 200 moving through the first field of view 112.
- the one-dimensional data set 300a may be a "slice" of the complete image 300 of the object 200, as seen in FIGs. 1a-1e. These data sets 300a can then be compiled to create a two-dimensional image 300 of the object 200 that has moved across the field of view 112 of the line array camera 110. Detection of an object 200 within the field of view 112 may be determined in response to at least one of variations within each data set and variations between each data set.
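The slice-compilation step described above can be sketched as follows. This is an illustrative reconstruction, not the patent's implementation: the function name `compile_image` and the NumPy representation are assumptions. Each one-dimensional data set becomes one column of the resulting two-dimensional image.

```python
import numpy as np

def compile_image(data_sets):
    """Stack successive one-dimensional data sets ("slices") into a
    two-dimensional image.  Each data set is the pixel output of the
    line array at one point in time; placing the slices side by side
    reconstructs the object that moved across the field of view."""
    columns = [np.asarray(ds, dtype=float) for ds in data_sets]
    # axis=1 makes each slice one column: shape (pixels, time steps)
    return np.stack(columns, axis=1)
```

For example, three slices of three pixels each yield a 3x3 image whose middle column is the middle slice.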
- a second line array camera can be used in order to calculate the velocity and/or direction of a moving object 200.
- line array cameras are being used to detect an object 200 (specifically a person) falling overboard from a ship 400.
- the overboard detection system 100 is illustrated in reference to a ship 400, the ship 400 shall not be considered limiting and the overboard detection system 100 may be utilized on other structures where detection of falling objects is desired, such as, for example a bridge or pier.
- the one-dimensional array of pixels that produces one- dimensional data sets helps reduce privacy concerns by only imaging slices of an object 200 that moves through the field of view of the line array camera, and does not include the background.
- the narrow field of view 112 of the line array camera can help reduce nuisance/false alarm risks common in video (2D) systems.
- the line array cameras may be short wave infrared (SWIR) cameras.
- SWIR cameras are beneficial for detection purposes because they do not need visible light and can therefore do discreet detection.
- SWIR wavelengths can also penetrate common atmospheric conditions such as haze which would normally obscure the view of a visible camera.
- FIG. 2 shows a schematic view of an object overboard detection system 100 for a ship 400, in accordance with an embodiment of the disclosure.
- the object overboard detection system 100 includes a first upper line array camera 110a, a first lower line array camera 120a, a second upper line array camera 110b, a second lower line array camera 120b, and a control system 150.
- the line array cameras 110a, 110b, 120a, 120b are organized in pairs of line array cameras 108a, 108b.
- the first line array camera pair 108a includes a first upper line array camera 110a and a first lower line array camera 120a.
- the first upper line array camera 110a is located vertically above the first lower line array camera 120a.
- the first lower line array camera 120a is located at a first distance D1 away from the first upper line array camera 110a.
- the first lower field of view 122a of the first lower line array camera 120a is parallel to the first upper field of view 112a of the first upper camera 110a.
- the first fields of view 112a, 122a may also be parallel to a deck 402 of the ship 400.
- the first upper line array camera 110a is configured to capture one or more primary upper data sets within the first upper field of view 112a using a first upper one-dimensional array of pixels 111a. Each primary upper data set is the one-dimensional digital output of what is captured by the first upper one-dimensional array of pixels 111a of the first upper line array camera 110a.
- the first lower line array camera 120a operates similar to the first upper line array camera 110a.
- the first lower line array camera 120a is configured to capture one or more primary lower data sets within the first lower field of view 122a. Each primary lower data set is the one-dimensional digital output of what is captured by a first lower one-dimensional array of pixels 121a of the first lower line array camera 120a.
- the second line array camera pair 108b includes a second upper line array camera 110b and a second lower line array camera 120b.
- the second upper line array camera 110b is located vertically above the second lower line array camera 120b.
- the second lower line array camera 120b is located at a first distance D1 away from the second upper line array camera 110b.
- the second lower field of view 122b of the second lower line array camera 120b is parallel to the second upper field of view 112b of the second upper camera 110b.
- the second fields of view 112b, 122b may also be parallel to a deck 402 of the ship 400.
- the second upper line array camera 110b is configured to capture one or more secondary upper data sets within the second upper field of view 112b using a second upper one-dimensional array of pixels 111b.
- Each secondary upper data set is the one-dimensional digital output of what is captured by the second upper one-dimensional array of pixels 111b of the second upper line array camera 110b.
- the second lower line array camera 120b operates similar to the second upper line array camera 110b.
- the second lower line array camera 120b is configured to capture one or more secondary lower data sets within the second lower field of view 122b.
- Each secondary lower data set is the one-dimensional digital output of what is captured by a second lower one-dimensional array of pixels 121b of the second lower line array camera 120b.
- the second upper line array camera 110b is facing the first upper line array camera 110a and the second lower line array camera 120b is facing the first lower line array camera 120a.
- the second pair of line array cameras 108b is located at a second selected distance away from the first pair of line array cameras 108a. Since the pairs of line array cameras 108a, 108b are facing each other, the first upper field of view 112a overlaps with the second upper field of view 112b and the first lower field of view 122a overlaps with the second lower field of view 122b, as seen in FIG. 2.
- the second fields of view 112b, 122b are parallel to the first fields of view 112a, 122a. Thus an object 200 falling overboard would have to fall through four fields of view 112a, 112b, 122a, 122b.
- the direction of the motion of the object 200 can be determined based on which field of view 112a, 112b, 122a, 122b the object 200 enters first. For example, if the object 200 crosses the upper fields of view 112a, 112b at a first point in time and then the lower fields of view 122a, 122b at a second point in time, it may be determined that the object 200 is heading in a first direction XI and thus falling overboard off the ship 400.
- knowing the direction in which the object 200 is moving would help differentiate between a person falling overboard through the fields of view 112a, 112b, 122a, 122b and a bird flying up through the fields of view 112a, 112b, 122a, 122b.
- the velocity of the object 200 can be calculated and used to build up a two-dimensional image (as seen in FIGs. 1a-1e).
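The velocity and direction determination from two vertically separated, parallel fields of view can be sketched as below. The function and its arguments are illustrative assumptions; the description specifies only that velocity follows from the crossing times and the known separation D1.

```python
def velocity_and_direction(t_upper, t_lower, d1):
    """Estimate speed and direction of an object crossing two parallel
    fields of view separated by a known vertical distance d1 (meters).

    t_upper / t_lower: times (seconds) at which the object first appears
    in the upper and lower cameras' data sets.  "down" (upper field seen
    first) suggests a fall; "up" suggests e.g. a bird flying upward.
    """
    dt = t_lower - t_upper
    if dt == 0:
        raise ValueError("simultaneous detection; velocity undefined")
    speed = d1 / abs(dt)                    # meters per second
    direction = "down" if dt > 0 else "up"
    return speed, direction
```

The estimated speed also sets the column spacing when compiling the one-dimensional slices into a two-dimensional image.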
- the line array cameras can be arranged facing each other such that the first upper field of view 112a overlaps with the second upper field of view 112b and the first lower field of view 122a overlaps with the second lower field of view 122b.
- having the two line array cameras face each other could provide additional information about the size of the object 200 and location of the object 200 in reference to the two cameras.
- a distance D3 from the first pair of line array cameras 108a to the object may be determined and a distance D4 from the second pair of line array cameras 108b to the object 200 may be determined.
- the two data sets from the facing line array cameras 110a, 110b are compared to each other, specifically the number of pixels that the object 200 takes up within each data set ("pixels on target"). For example, if an object takes up 100 pixels on target for the first upper line array camera 110a and 2 pixels for the facing second upper line array camera 110b, then the object is closer to the first upper line array camera 110a.
- the actual location of the object 200 between the facing line array cameras 110a, 110b can then be calculated in response to the number of pixels in each data set, the angular field of view of each line array camera, and the known distance between the facing cameras.
- the size of the object 200 may then be determined using the location of the object 200 between the two facing line array cameras and the number of "pixels on target".
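The location and size determination described above can be sketched as a simple triangulation. This is a hedged reconstruction: the function name, the assumption of equal per-pixel angular resolution in both cameras, and the small-angle approximation are illustrative choices, not taken from the patent.

```python
def locate_and_size(n_a, n_b, baseline, pixel_angle_rad):
    """Triangulate an object between two facing line array cameras.

    n_a, n_b: "pixels on target" seen by camera A and camera B.
    baseline: known distance between the facing cameras (meters).
    pixel_angle_rad: angular field of view of one pixel (radians),
    assumed equal for both cameras.

    Pixels on target scale inversely with distance, so n_a/n_b = d_b/d_a
    with d_a + d_b = baseline; object size then follows from the
    small-angle approximation size ~ pixels * pixel_angle * distance.
    Returns (distance from camera A, approximate object size in meters).
    """
    if n_a <= 0 or n_b <= 0:
        raise ValueError("object must be visible to both cameras")
    d_a = baseline * n_b / (n_a + n_b)   # distance from camera A
    size = n_a * pixel_angle_rad * d_a   # physical extent of the object
    return d_a, size
```

An object showing 100 pixels in one camera and 2 in the facing camera resolves, as in the example above, to a position much closer to the first camera.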
- the alarm 170 may only be activated when the size of the object is within a selected size range.
- the control system 150 is configured to control the operation of the object overboard detection system 100 and determine whether an object has fallen overboard, what that object 200 might be, and the location where the object 200 fell overboard.
- the control system 150 is configured to perform the velocity, trajectory, size, and location calculations that were described above.
- the control system 150 is in communication with each line array camera 110a, 120a, 110b, 120b.
- the control system 150 may include a processor and an associated memory.
- the processor may be, but is not limited to, a single-processor or multi-processor system of any of a wide array of possible architectures, including field programmable gate array (FPGA), central processing unit (CPU), application specific integrated circuits (ASIC), digital signal processor (DSP) or graphics processing unit (GPU) hardware arranged homogenously or heterogeneously.
- the memory may be but is not limited to a random access memory (RAM), read only memory (ROM), or other electronic, optical, magnetic or any other computer readable medium.
- each data set is captured by the line array cameras 110a, 120a, 110b, 120b.
- the data sets are processed by the control system 150 to determine if an object 200 passed through any of the four fields of views 112a, 122a, 112b, 122b and if that object 200 was a person.
- an alarm 170 may be activated.
- the alarm 170 may be visual and/or audible.
- a response protocol may be activated when the control system has detected a man overboard event.
- the response protocol may include at least one of an alarm 170, a notification message to crew, a buoy drop 320, an engine stop or a coast guard call.
- response protocols may be initiated automatically or by a member of the crew on the ship 400 once the detection has been made. Additionally, the two-dimensional image compiled by one of the line array cameras 110a, 120a, 110b, 120b may also be transmitted for human verification. The crew could then take action based on the visual image from the camera 310 or the two-dimensional image compiled by one of the line array cameras 110a, 120a, 110b, 120b. If the object 200 is not a person, the crew may deactivate the alarm; however, if the object 200 is a person, the crew may initiate a rescue effort to save the person. Alternatively, the control system 150 may be configured, through visual recognition, to determine an identity of the object 200.
- the alarm 170 may be deactivated if the control system 150 determines that the identity of the object 200 is a bottle, but the control system 150 may activate the alarm 170 if the identity of the object 200 is a person.
- the time and location information can be sent to crew members in order for them to quickly search through corresponding security footage (from security cameras on board the ship) for further verification of the man overboard event.
- FIGs. 3 and 4 show the object overboard detection system 100 incorporated on a ship 400, according to an embodiment of the present disclosure.
- the configuration shown in FIGs. 3 and 4 is an example of a possible implementation of the overboard detection system 100 on a ship.
- the line array cameras are organized in pairs of line array cameras 108a-108n.
- Each line array camera pair 108a-108n includes an upper line array camera 110 and a lower line array camera 120.
- each line array camera pair 108a-108f has the upper line array camera 110a-110f located vertically above the lower line array camera 120a-120f, and the same is true for the remaining pairs of line array cameras 108g-108n not pictured.
- the pairs of line array cameras 108a-108n are arranged systematically around the ship 400 so that the full perimeter 470 of the ship 400 is within an upper field of view 112a-112n, as seen in FIG. 3.
- if an object 200 were to fall off the ship 400, it would have to pass through an upper field of view 112a-112n and subsequently the respective lower field of view (not pictured).
- the disclosure is not limited by the number or arrangement of pairs or single units of line array cameras 108a-108n, which may vary depending on the size and the shape of the ship 400.
- the arrangement of the cameras may need to be designed differently for each ship and may also depend on the quality of each line array camera. For example, line array cameras with lower pixel counts may require more to be placed around a ship. Additional line array cameras may be required if there are protrusions from the ship 400. Fewer line array cameras may be needed in an area where it is impossible to fall off the ship 400 (i.e. no balconies or windows, there is no upper deck, or areas of the ship inaccessible to passengers and/or crew).
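A first-order estimate of the camera count implied by the placement considerations above can be sketched as follows. This is purely illustrative: the function, the notion of a per-pair "usable range", and the uniform-coverage assumption are inventions for the sketch, and real placement depends on hull shape, protrusions, and inaccessible areas as the description notes.

```python
import math

def pairs_needed(perimeter_m, camera_range_m):
    """Rough estimate of how many line-array-camera pairs are needed to
    cover a ship's perimeter, assuming each facing pair covers one
    straight run up to the cameras' usable range.  Lower-quality cameras
    (shorter usable range) therefore require more pairs."""
    if camera_range_m <= 0:
        raise ValueError("camera range must be positive")
    return math.ceil(perimeter_m / camera_range_m)
```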
- FIG. 5 shows a flow chart of a method 500 of detecting an object 200 falling overboard, in accordance with an embodiment of the disclosure.
- at block 504 at least one line array camera 110a, 110b, 120a, 120b captures one or more data sets within a field of view 112a, 112b, 122a, 122b of the line array camera 110a, 110b, 120a, 120b using a one-dimensional array of pixels 111a, 111b, 121a, 121b within each line array camera 110a, 110b, 120a, 120b.
- Each data set is a one- dimensional output from the one-dimensional array of pixels at a point in time.
- a control system 150 determines when an object 200 has moved through a field of view in response to the one or more data sets.
- the control system 150 determines whether the object 200 is a person in response to the one or more data sets.
- the detection system 100 is constantly capturing data sets from all of the line array cameras 110a, 110b, 120a, 120b and analyzing the data sets to determine when an object 200 has moved into any of the fields of view 112a, 112b, 122a, 122b (it does this by comparing the pixels within the same data set as well as comparing different data sets from the same camera). Once the detection system 100 has determined that an object 200 has moved into the field of view of one or more of the cameras, it can analyze the data sets from other cameras to get further information about whether or not the object 200 is a falling person.
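The two comparison criteria described above (variation within a data set, and change between consecutive data sets from the same camera) can be sketched as a simple threshold test. The function name and threshold value are illustrative assumptions, not from the patent.

```python
import numpy as np

def object_in_view(previous_set, current_set, threshold=10.0):
    """Flag an object entering the field of view by checking both
    criteria described above: (a) pixel variation within the current
    data set, and (b) change between consecutive data sets from the
    same camera."""
    prev = np.asarray(previous_set, dtype=float)
    curr = np.asarray(current_set, dtype=float)
    within = np.ptp(curr)                 # max minus min inside one slice
    between = np.abs(curr - prev).max()   # frame-to-frame change
    return bool(within > threshold or between > threshold)
```

A uniform background produces no trigger; an object crossing the line array changes some pixels and trips either criterion.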
- utilizing multiple line array cameras allows for multiple layers of detection for falling objects and verification, thus increasing redundancy and reducing false alarm rates.
- the line array cameras are small and may not protrude more than 600 mm from the side of the ship or other structure.
- the line array cameras, as described above, will be able to provide images of the object within seconds after detection for human verification.
- Active illumination may also be implemented.
- FIG. 6 shows a flow chart of a method 600 of a response protocol after detecting an object 200 falling overboard, in accordance with an embodiment of the disclosure.
- an object 200 is detected by the line array detection system 100.
- a low regret response may be initiated in block 620.
- a buoy 320 may be dropped and/or a two-dimensional image of the object 200 sent for human verification.
- the ship 400 may return to normal operation at block 622.
- in a medium regret response at block 630, the engines of the ship 400 may be stopped and/or security video may be reviewed near where the object 200 was detected falling off the ship 400. If the medium regret response at block 630 turns out to be a false alarm then the ship 400 may return to normal operations.
- in a high regret scenario at block 640, the ship's course may be reversed, passengers notified, the coast guard notified, and a safety crew may be sent into the water.
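The escalating response tiers of method 600 can be sketched as a simple mapping. The confidence thresholds and the exact action lists are illustrative placeholders; the patent defines the tiers by regret (how costly the response is to undo), not by specific numeric cutoffs.

```python
RESPONSES = {
    "low":    ["drop buoy", "send image for human verification"],
    "medium": ["stop engines", "review security video near detection point"],
    "high":   ["reverse course", "notify passengers",
               "notify coast guard", "deploy safety crew"],
}

def response_actions(confidence):
    """Map detection confidence to a regret tier: low-regret actions are
    cheap to undo on a false alarm, high-regret actions are not."""
    if confidence < 0.3:      # thresholds are illustrative assumptions
        tier = "low"
    elif confidence < 0.7:
        tier = "medium"
    else:
        tier = "high"
    return tier, RESPONSES[tier]
```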
- embodiments can be in the form of processor-implemented processes and devices for practicing those processes, such as a processor.
- Embodiments can also be in the form of computer program code containing instructions embodied in tangible media, such as network cloud storage, SD cards, flash drives, floppy diskettes, CD ROMs, hard drives, or any other computer-readable storage medium, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a device for practicing the embodiments.
- Embodiments can also be in the form of computer program code, for example, whether stored in a storage medium, loaded into and/or executed by a computer, or transmitted over some transmission medium, such as over electrical wiring or cabling, through fiber optics, or via electromagnetic radiation, wherein, when the computer program code is loaded into and executed by a computer, the computer becomes a device for practicing the embodiments.
- the computer program code segments configure the microprocessor to create specific logic circuits.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18704685.9A EP3574648B1 (en) | 2017-01-25 | 2018-01-25 | Line array cameras for a man over board detection system |
ES18704685T ES2966187T3 (en) | 2017-01-25 | 2018-01-25 | Linear distribution chambers for a man overboard detection system |
KR1020197020335A KR102547931B1 (en) | 2017-01-25 | 2018-01-25 | Line Array Camera for Overboard Person Detection System |
JP2019560070A JP7123073B2 (en) | 2017-01-25 | 2018-01-25 | Line array camera for person overboard detection system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201762450443P | 2017-01-25 | 2017-01-25 | |
US62/450,443 | 2017-01-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018140549A1 true WO2018140549A1 (en) | 2018-08-02 |
Family
ID=61193064
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2018/015139 WO2018140549A1 (en) | 2017-01-25 | 2018-01-25 | Line array cameras for a man over board detection system |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP3574648B1 (en) |
JP (1) | JP7123073B2 (en) |
KR (1) | KR102547931B1 (en) |
ES (1) | ES2966187T3 (en) |
WO (1) | WO2018140549A1 (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3671682A1 (en) * | 2018-12-17 | 2020-06-24 | SOS Ltd. | Man over board detection system |
WO2020160875A1 (en) * | 2019-02-06 | 2020-08-13 | Robert Bosch Gmbh | Monitoring device and method for monitoring a man-overboard event in a ship section |
US10771948B2 (en) | 2018-12-03 | 2020-09-08 | Rohde & Schwarz Gmbh & Co. Kg | System and method for monitoring a spatial position of a mobile transmitter, man-over-board detection system |
WO2022167823A1 (en) * | 2021-02-08 | 2022-08-11 | Offshore Survival Systems Limited | Location apparatus |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120229282A1 (en) * | 2011-03-10 | 2012-09-13 | Security Identification Systems Corporation a Florida | Maritime Overboard Detection and Tracking System |
GB2493390A (en) * | 2011-08-05 | 2013-02-06 | Marine & Remote Sensing Solutions Ltd | System for detecting a person overboard event |
CN104268882A (en) * | 2014-09-29 | 2015-01-07 | 深圳市热活力科技有限公司 | High-speed moving object detecting and speed measuring method and system based on double-linear-array cameras |
US9106810B1 (en) * | 2013-06-09 | 2015-08-11 | MTN Satellite Communications Inc. | Maritime safety systems for crew and passengers |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5189638A (en) * | 1975-02-03 | 1976-08-05 | ||
JPS6332677A (en) * | 1986-04-30 | 1988-02-12 | Hochiki Corp | Measuring instrument for traveling object value |
JP2558198B2 (en) * | 1992-01-09 | 1996-11-27 | 株式会社コア | Mobile object analyzer |
JPH08329213A (en) * | 1995-05-31 | 1996-12-13 | Yazaki Corp | Detector and counter for passing object |
JP3403697B2 (en) * | 1999-05-28 | 2003-05-06 | 日本電信電話株式会社 | Image processing method and apparatus |
JP2010118039A (en) | 2008-10-16 | 2010-05-27 | Mitsubishi Electric Corp | Mobile object detector |
- 2018
- 2018-01-25 ES ES18704685T patent/ES2966187T3/en active Active
- 2018-01-25 EP EP18704685.9A patent/EP3574648B1/en active Active
- 2018-01-25 JP JP2019560070A patent/JP7123073B2/en active Active
- 2018-01-25 WO PCT/US2018/015139 patent/WO2018140549A1/en unknown
- 2018-01-25 KR KR1020197020335A patent/KR102547931B1/en active IP Right Grant
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10771948B2 (en) | 2018-12-03 | 2020-09-08 | Rohde & Schwarz Gmbh & Co. Kg | System and method for monitoring a spatial position of a mobile transmitter, man-over-board detection system |
EP3671682A1 (en) * | 2018-12-17 | 2020-06-24 | SOS Ltd. | Man over board detection system |
JP2022513926A (en) * | 2018-12-17 | 2022-02-09 | エスオーエス リミテッド | Crossing the Line Detection System |
WO2020160875A1 (en) * | 2019-02-06 | 2020-08-13 | Robert Bosch Gmbh | Monitoring device and method for monitoring a man-overboard event in a ship section |
US11823550B2 (en) | 2019-02-06 | 2023-11-21 | Robert Bosch Gmbh | Monitoring device and method for monitoring a man-overboard in a ship section |
WO2022167823A1 (en) * | 2021-02-08 | 2022-08-11 | Offshore Survival Systems Limited | Location apparatus |
Also Published As
Publication number | Publication date |
---|---|
ES2966187T3 (en) | 2024-04-18 |
EP3574648A1 (en) | 2019-12-04 |
JP2020507175A (en) | 2020-03-05 |
EP3574648B1 (en) | 2023-12-13 |
KR20190110536A (en) | 2019-09-30 |
JP7123073B2 (en) | 2022-08-22 |
KR102547931B1 (en) | 2023-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3574648B1 (en) | Line array cameras for a man over board detection system | |
CN108027874B (en) | Computer vision based security system using depth camera | |
US10922552B2 (en) | System and method for man overboard incident detection | |
US9972188B2 (en) | Sonar based drowning detection system, method and kit | |
US7542588B2 (en) | System and method for assuring high resolution imaging of distinctive characteristics of a moving object | |
US10007836B2 (en) | Bird detection device, bird detection system, bird detection method, and program extracting a difference between the corrected images | |
US9896170B1 (en) | Man overboard detection system | |
KR20180133745A (en) | Flying object identification system using lidar sensors and pan/tilt zoom cameras and method for controlling the same | |
US11288517B2 (en) | System and method for deep learning enhanced object incident detection | |
KR102479959B1 (en) | Artificial intelligence based integrated alert method and object monitoring device | |
US11209517B2 (en) | Mobile body detection device, mobile body detection method, and mobile body detection program | |
KR102456190B1 (en) | Black box system for offshore fishing vessels | |
AU2018286646A1 (en) | A system and a method for monitoring a predetermined region in a water body | |
US11126857B1 (en) | System and method for object falling and overboarding incident detection | |
CN114882661A (en) | Outdoor early warning method, device, system and computer readable storage medium | |
JP7157763B2 (en) | Line array detection and imaging system | |
KR20140118631A (en) | Context awareness system for vessel | |
JP7062879B2 (en) | Display control device and display control method | |
WO2023286295A1 (en) | Intrusion determination device, intrusion detection system, intrusion determination method, and program storage medium | |
Long et al. | An Image-based Fall Detection System using You Only Look Once (YOLO) Algorithm to Monitor Elders’ Fall Events | |
KR102571490B1 (en) | Apparatus for counting the number of people | |
JP2023012601A (en) | Guidance device, guidance system, guidance method, and computer program | |
CN116152705A (en) | Perimeter intrusion monitoring method and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18704685 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 20197020335 Country of ref document: KR Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 2019560070 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
ENP | Entry into the national phase |
Ref document number: 2018704685 Country of ref document: EP Effective date: 20190826 |