WO2023171068A1 - Monitoring System - Google Patents


Info

Publication number
WO2023171068A1
WO2023171068A1 (PCT/JP2022/045759)
Authority
WO
WIPO (PCT)
Prior art keywords
crossing
moving object
monitoring system
railroad crossing
state analysis
Prior art date
Application number
PCT/JP2022/045759
Other languages
English (en)
Japanese (ja)
Inventor
圭吾 長谷川
友輔 生内
Original Assignee
株式会社日立国際電気
Priority date
Filing date
Publication date
Application filed by 株式会社日立国際電気
Publication of WO2023171068A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/54 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 25/00 - Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B 25/01 - Alarm systems characterised by the transmission medium
    • G08B 25/04 - Alarm systems using a single signalling line, e.g. in a closed loop
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 - Television systems
    • H04N 7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a monitoring system, and particularly to a monitoring system that detects a dangerous event based on the results of detecting and tracking a detection target in an image.
  • Patent Document 1 discloses an apparatus that, when it is determined that a vehicle is approaching a crossing road, determines whether or not a detected object is a moving object. If it is determined that there is a moving object, the apparatus predicts the change in the position of the moving object and, based on the predicted change, predicts whether the moving object is likely to interfere with the vehicle's travel.
  • Patent Document 2 discloses a level crossing passage support system that determines whether a passerby can pass through a level crossing by comparing the passerby's position and passing speed with the level crossing information, and notifies the passerby of the result of the determination.
  • Patent Document 1 assumes the use of a laser irradiation type sensor.
  • the range that can be detected by laser irradiation is limited, and the information obtained from the laser is also limited.
  • laser irradiation type sensors are more expensive than cameras. Furthermore, it is necessary to install a separate camera in order for the guard to visually check the situation inside the railroad crossing.
  • Patent Document 2 assumes that the location of a passerby is acquired by the passerby holding a mobile terminal. Therefore, if a passerby does not have a corresponding mobile terminal, the passerby cannot be detected.
  • the present invention aims to provide a monitoring system that can more accurately detect dangerous events within railroad crossings.
  • one of the typical monitoring systems of the present invention includes an imaging device and an analysis device, and the analysis device has a state analysis control section and a state analysis section.
  • the condition analysis control section detects that a train is approaching based on the analysis of the image acquired by the imaging device or the information notified from the condition notification section
  • the condition analysis control section instructs the condition analysis section to start analyzing the image inside the railroad crossing.
  • the state analysis unit detects and tracks a moving object within the railroad crossing using the image acquired by the imaging device and, using at least the current moving speed of the moving object, determines whether the moving object can reach a set virtual line by the time the closure of the railroad crossing is completed. If it is determined that the set virtual line cannot be reached, it is determined that a dangerous event has occurred and alarm processing is performed.
  • a dangerous event within a railroad crossing can be detected more accurately in a monitoring system.
  • FIG. 1 is a block diagram of a computer system for implementing aspects according to embodiments of the present disclosure.
  • FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
  • FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
  • FIG. 4 shows an example of a flowchart of processing of the monitoring system of the present invention.
  • FIG. 5 is a diagram for explaining the determination method of the monitoring system of the present invention.
  • FIG. 6 is a diagram showing a first display example in the monitoring system of the present invention.
  • FIG. 7 is a diagram showing a second display example in the monitoring system of the present invention.
  • FIG. 1 is a block diagram of a computer system 1 for implementing aspects according to embodiments of the present disclosure.
  • the mechanisms and apparatus of the various embodiments disclosed herein may be applied to any suitable computing system.
  • the main components of computer system 1 include one or more processors 2 , memory 4 , terminal interface 12 , storage interface 14 , I/O (input/output) device interface 16 , and network interface 18 . These components may be interconnected via a memory bus 6, an I/O bus 8, a bus interface unit 9, and an I/O bus interface unit 10.
  • the computer system 1 may include one or more processing devices 2A and 2B, collectively referred to as processors 2. Each processor 2 executes instructions stored in memory 4 and may include an onboard cache. In some embodiments, computer system 1 may include multiple processors, and in other embodiments, computer system 1 may be a single processing unit system. A CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), etc. can be applied as the processing devices.
  • memory 4 may include random access semiconductor memory, storage devices, or storage media (either volatile or non-volatile) for storing data and programs.
  • memory 4 represents the entire virtual memory of computer system 1 and may include virtual memory of other computer systems connected to computer system 1 via a network.
  • although this memory 4 may be conceptually considered a single entity, in other embodiments this memory 4 may be a more complex arrangement, such as a hierarchy of caches and other memory devices.
  • memory may exist as multiple levels of caches, and these caches may be divided by function. As a result, one cache may hold instructions while the other cache holds non-instruction data used by the processor.
  • memory may be distributed and associated with a variety of different processing devices, as in a so-called NUMA (Non-Uniform Memory Access) computer architecture.
  • Memory 4 may store all or some of the programs, modules, and data structures that perform the functions described herein.
  • the memory 4 may store a latent factor identification application 50.
  • latent factor identification application 50 may include instructions or writings that perform functions described below on processor 2, or may include instructions or writings that are interpreted by other instructions or writings.
  • latent factor identification application 50 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices, instead of or in addition to a processor-based system.
  • latent factor identification application 50 may include data other than instructions or descriptions.
  • cameras, sensors, or other data input devices may be provided to communicate directly with bus interface unit 9, processor 2, or other hardware of computer system 1. Such an arrangement may reduce the need for processor 2 to access memory 4 and the latent factor identification application 50.
  • the computer system 1 may include a bus interface unit 9 for communication among the processor 2, the memory 4, the display system 24, and the I/O bus interface unit 10.
  • I/O bus interface unit 10 may be coupled to I/O bus 8 for transferring data to and from various I/O units.
  • I/O bus interface unit 10 connects via I/O bus 8 to a plurality of I/O interface units 12, 14, 16, and 18, also known as I/O processors (IOPs) or I/O adapters (IOAs).
  • Display system 24 may include a display controller, display memory, or both. A display controller may provide video, audio, or both data to display device 26.
  • Computer system 1 may also include devices such as one or more sensors configured to collect data and provide the data to processor 2 .
  • computer system 1 may include environmental sensors that collect humidity data, temperature data, pressure data, etc., motion sensors that collect acceleration data, motion data, etc., and the like. Other types of sensors can also be used.
  • the display memory may be a dedicated memory for buffering video data.
  • Display system 24 may be connected to a display device 26, such as a standalone display screen, a television, a tablet, or a handheld device.
  • display device 26 may include speakers to render audio.
  • a speaker for rendering audio may be connected to the I/O interface unit.
  • the functionality provided by display system 24 may be implemented by an integrated circuit that includes processor 2.
  • the functionality provided by the bus interface unit 9 may be realized by an integrated circuit including the processor 2.
  • the I/O interface unit has the ability to communicate with various storage or I/O devices.
  • the terminal interface unit 12 can attach a user I/O device 20, such as a user output device (a video display device, a speaker, a television) or a user input device (a keyboard, mouse, keypad, touch pad, trackball, button, light pen, or other pointing device). Using the user interface, a user may operate the user input device to input data and instructions to the user I/O device 20 and the computer system 1, and may receive output data from the computer system 1.
  • the user interface may be displayed on a display device, played via a speaker, or printed via a printer, for example, via the user I/O device 20.
  • Storage interface 14 can attach one or more disk drives or direct access storage devices 22 (typically magnetic disk drive storage devices, although they may be an array of disk drives or other storage devices configured to appear as a single disk drive).
  • storage device 22 may be implemented as any secondary storage device.
  • the contents of the memory 4 may be stored in the storage device 22 and read from the storage device 22 as needed.
  • Network interface 18 may provide a communication path so that computer system 1 and other devices can communicate with each other. This communication path may be, for example, the network 30.
  • although the computer system 1 shown in FIG. 1 includes a bus structure that provides a direct communication path between the processor 2, memory 4, bus interface 9, display system 24, and I/O bus interface unit 10, in other embodiments computer system 1 may include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, and parallel or redundant communication paths.
  • although the I/O bus interface unit 10 and I/O bus 8 are shown as a single unit, in reality computer system 1 may include multiple I/O bus interface units 10 or multiple I/O buses 8.
  • although various I/O interface units are shown separating the I/O bus 8 from the various communication paths connecting to the various I/O devices, in other embodiments some or all of the I/O devices may be directly connected to one system I/O bus.
  • computer system 1 may be a device that receives requests from other computer systems (clients) that do not have a direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer. In other embodiments, computer system 1 may be a desktop computer, a portable computer, a laptop, a tablet computer, a pocket computer, a telephone, a smartphone, or any other suitable electronic device.
  • FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
  • the monitoring system 100 shown in FIG. 2 includes an imaging device 101, an analysis device 102, an alarm device 103, a display device 104, and a status notification section 105.
  • the imaging device 101 is configured by, for example, a network camera, and is used to acquire images of monitoring areas such as railroad crossings.
  • the imaging device 101 can have a camera configuration that obtains information by forming an image of incident light on an image sensor through a lens or an aperture.
  • Examples of the image sensor here include a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and the like.
  • the imaging device 101 captures video at, for example, 3 frames per second (3 fps) or more, and the information is sent to the analysis device 102 .
  • a plurality of imaging devices 101 can be installed depending on the situation.
  • As the camera, a visible light camera, an infrared camera, or the like can be used.
  • the analysis device 102 includes a state analysis control section 106 and a state analysis section 107.
  • the analysis device 102 is realized by, for example, a computer equipped with a processor such as a CPU, GPU, FPGA, or DSP, and the computer system 1 of FIG. 1 can be applied, for example.
  • the condition analysis control section 106 controls the start of detection of a dangerous event by the condition analysis section 107 based on the image obtained by the imaging device 101 and the train approach information notified from the condition notification section 105.
  • the state analysis unit 107 analyzes the image obtained by the imaging device 101 and predicts whether a dangerous event such as a moving object such as a person or a car being left behind at a railroad crossing will occur. A specific determination method will be described later.
  • the warning device 103 is composed of, for example, a speaker, and when the state analysis unit 107 detects a dangerous event, it notifies the area surrounding the railroad crossing of the danger by sound. As another aspect, the warning device 103 may be configured as a protective radio device that transmits an emergency stop signal to the relevant train, etc., when the state analysis unit 107 detects a dangerous event.
  • the display device 104 is a display device that can display surveillance video captured by the imaging device 101. At this time, the details of the dangerous event detected by the state analysis unit 107 can be displayed.
  • the display device 104 can be configured with, for example, a liquid crystal display, an organic EL (OLED) display, or the like. Further, operation means such as a switcher, a keyboard, a mouse, etc. may be attached.
  • the display device 104 may aggregate images from each imaging device 101 to a monitoring center or the like.
  • the display device 104 may be configured at the same location as the analysis device 102, or may be configured at a different location.
  • the display device 26 in the example of FIG. 1 can be applied as the display device 104.
  • the status notification unit 105 is configured using a railroad crossing controller, an automatic train stop (ATS), and the like.
  • the status notification unit 105 detects the approach of a train to the relevant railroad crossing, and notifies the status analysis control unit 106 of the approach information.
  • This approach information can be determined based on, for example, the position of the train to the relevant level crossing, whether the estimated time until reaching the relevant level crossing is within a predetermined value, etc.
  • FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
  • a moving object 130, exemplified by a wheelchair, is shown crossing a crossing area 160 of a railroad crossing 150.
  • the crossing area 160 is the area between the front and rear blocking rods 151 and crosses the track 170.
  • a boundary line 161 of the crossing area 160 is a boundary line near the front blocking rod 151.
  • the imaging device 101 can be installed on a railroad crossing warning device 152 or the like, positioned so that it can photograph the entire crossing area 160. For this reason, it is preferably installed above a certain height, for example 2 m or more, which is higher than a person's height. Further, instead of just one imaging device 101, a plurality of imaging devices 101 may be installed diagonally, etc., to photograph the entire railroad crossing 150 including the crossing area 160 from multiple angles.
  • the warning device 103 is installed near the railroad crossing 150, and can issue a warning to the railroad crossing 150 and people near the railroad crossing 150. In the example of FIG. 3, it is installed in a railroad crossing warning device 152.
  • the analysis device 102 can be installed at a location different from the railroad crossing 150. Furthermore, the condition analysis unit 107 may be installed near the railroad crossing 150 or may be installed integrally with the analysis device 102.
  • the display device 104 can be installed at a location different from the railroad crossing 150. For example, it is possible to aggregate the information together with information from the imaging devices 101 at other railroad crossings at a monitoring center or the like.
  • FIG. 3 shows a case where a dangerous event occurs in which the moving object 130 is likely to be left behind in the crossing area 160 of the railroad crossing 150.
  • the information is analyzed by the analysis device 102 based on the image photographed by the imaging device 101, and the event is determined to be a dangerous event. Based on this, the alarm device 103 issues a predetermined alarm. Further, on the display device 104, a supervisor can check the video of the event determined to be a dangerous event.
  • FIG. 4 shows an example of a flowchart of processing of the monitoring system of the present invention. The processing flow of the monitoring system 100 will be explained using FIG. 4.
  • the state analysis control unit 106 detects the approach of a train.
  • the detection here can be carried out by detecting the start of the warning bell or the start of barrier closure at the railroad crossing 150.
  • the imaging device 101 constantly transmits railroad crossing surveillance video to the state analysis control unit 106, so the state analysis control unit 106 can detect the start of closure by image analysis. Examples of methods for detecting the start of closure from surveillance video include detecting the blinking of the crossing warning light on the railroad crossing alarm 152 based on changes in color and brightness, and detecting the movement of the blocking rod 151.
  • the approach of a train may be detected by the status notification unit 105 acquiring information from a level crossing controller, an ATS, or the like.
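As an illustrative sketch only, not part of the disclosed apparatus, the detection of closure start from the blinking warning light based on brightness changes could look like the following; the function name, ROI layout, frame rate, and thresholds are all assumptions:

```python
import numpy as np

def warning_light_active(frames, roi, fps=10.0, power_ratio=3.0):
    """Detect a blinking crossing warning light from the mean brightness
    of a region of interest (ROI) over a short window of frames.

    frames: sequence of grayscale frames (2-D numpy arrays)
    roi: (y0, y1, x0, x1) region covering one warning lamp (assumed layout)
    A lamp blinking at a typical 0.5-3 Hz rate produces a strong peak in
    the brightness power spectrum; compare that peak to the average power.
    """
    y0, y1, x0, x1 = roi
    sig = np.array([f[y0:y1, x0:x1].mean() for f in frames], dtype=float)
    sig -= sig.mean()                          # remove the DC component
    spec = np.abs(np.fft.rfft(sig)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    band = (freqs > 0.5) & (freqs < 3.0)       # plausible blink frequencies
    if not band.any() or spec[1:].mean() == 0:
        return False                           # flat signal: no blinking
    return bool(spec[band].max() > power_ratio * spec[1:].mean())
```

In practice the ROI would be calibrated per installation, and the blocking-rod movement check mentioned above would be combined with this as a second cue.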
  • the condition analysis control section 106 instructs the condition analysis section 107 to start analyzing the level crossing condition.
  • the state analysis unit 107 detects and tracks moving objects 130 such as people, cars, bicycles, motorcycles, wheelchairs, strollers, canes, and white canes inside the railroad crossing 150.
  • the detection and tracking here can be realized by combining machine learning technology, pattern matching, etc. based on the video captured by the imaging device 101.
  • Machine learning technology can be performed by image analysis using AI (artificial intelligence), deep learning, etc.
  • Pattern matching can be performed, for example, by image analysis, such as by comparing with a template image.
  • once the moving object 130 is detected, its position changes frame by frame on the screen. The movement history of the detected moving object 130 can be tracked by using machine learning or pattern matching to determine whether an object in each frame is the same as the detected moving object 130.
  • algorithms other than machine learning and pattern matching may be used to detect and track the moving object 130.
  • a detection object type ID may be attached to each moving object to identify the type of the object, such as a person, a car, a bicycle, a motorcycle, a wheelchair, a stroller, a cane, a white cane, etc.
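The detection-and-tracking step can be illustrated with a deliberately simplified nearest-neighbour tracker; the actual system would use machine learning or pattern matching as described above, and every class name, field, and threshold below is an assumption for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    object_id: int
    type_id: str              # detection object type, e.g. "person", "wheelchair"
    history: list = field(default_factory=list)  # [(t, x, y), ...]

class NearestNeighbourTracker:
    """Associate per-frame detections (x, y, type_id) with existing tracks
    by nearest distance, creating a new track when nothing is close enough."""

    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist
        self.tracks = []
        self._next_id = 0

    def update(self, t, detections):
        unmatched = list(self.tracks)
        for x, y, type_id in detections:
            best, best_d = None, self.max_dist
            for tr in unmatched:
                if tr.type_id != type_id:      # only match same detection type
                    continue
                _, px, py = tr.history[-1]
                d = ((x - px) ** 2 + (y - py) ** 2) ** 0.5
                if d < best_d:
                    best, best_d = tr, d
            if best is None:                   # no nearby track: start a new one
                best = Track(self._next_id, type_id)
                self._next_id += 1
                self.tracks.append(best)
            else:
                unmatched.remove(best)
            best.history.append((t, x, y))
        return self.tracks
```

The per-track history is what the later determination methods consume for speed and movement-amount estimates.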
  • in step S203, the speed of the moving object 130 is estimated.
  • the speed is estimated by the state analysis unit 107 through image analysis of the video captured by the imaging device 101.
  • a specific example of the speed estimation method will be explained in the determination method described later.
  • in step S204, it is determined whether the moving object 130 can cross the railroad crossing 150. This determination can be made by the state analysis unit 107 using the speed estimated in step S203. Whether the moving object 130 can cross the railroad crossing 150 can be determined by, for example, whether it can pass through the crossing area 160 before the closure is completed. A specific example of the determination method will be described later.
  • if it is determined in step S204 that the moving object 130 can cross the railroad crossing 150, the process ends. If crossing is not possible (true), the process advances to step S205, and the state analysis unit 107 transmits information on the occurrence of a dangerous event.
  • in step S205, alarm processing is performed.
  • the warning device 103 alerts the area around the railroad crossing 150 of danger by voice or the like.
  • the display device 104 may display a warning about the occurrence of a dangerous event. These operations can be performed based on information received from the state analysis unit 107 regarding the occurrence of a dangerous event.
  • FIG. 5 is a diagram for explaining the determination method of the monitoring system of the present invention. The first determination method will be explained using FIG. 5.
  • FIG. 5 shows a state in which a moving object 130 (a wheelchair is exemplified in FIG. 5) is crossing a railroad crossing 150.
  • A virtual line 301 is set near the rear boundary, in the crossing direction, of the crossing area 160 of the railroad crossing 150, and a virtual line 302 is set near the front boundary. These can be determined by image analysis.
  • the moving object 130 is present in the traversal area 160, the current time is t n , and the current position is P n (x(t n ), y(t n )).
  • the first determination method determines that the moving object 130 cannot cross the railroad crossing (true) if the time it is predicted to take to cross the railroad crossing 150 is longer than the remaining time until the blocking rod 151 of the railroad crossing 150 closes.
  • let the destination point where the moving object 130 reaches the forward virtual line 302 be P_G(x_G, y_G). The distance d(t_n) from the current position P_n(x(t_n), y(t_n)) to P_G(x_G, y_G) is expressed by Equation 1 below:
  • d(t_n) = sqrt((x_G - x(t_n))^2 + (y_G - y(t_n))^2) ... (Equation 1)
  • the destination point P_G(x_G, y_G) can be taken, for example, as the position on the virtual line 302 at the shortest distance ahead of the current position P_n(x(t_n), y(t_n)).
  • alternatively, a movement vector may be calculated from the movement history, and the intersection of its extension with the virtual line 302 may be set as the destination point P_G.
  • the current moving speed v(t_n) is expressed by Equation 2 below:
  • v(t_n) = sqrt((x(t_n) - x(t_{n-1}))^2 + (y(t_n) - y(t_{n-1}))^2) / (t_n - t_{n-1}) ... (Equation 2)
  • the immediately preceding time t n-1 is a predetermined time before the current time t n , and may be, for example, a predetermined frame interval such as a difference from one frame before or several frames before.
  • Equation 3 is the determination condition. Using the distance d(t_n) obtained from Equation 1 and the moving speed v(t_n) obtained from Equation 2, it is determined whether Equation 3 is true or false:
  • d(t_n) / v(t_n) > τ ... (Equation 3)
  • τ is the remaining time from the current time t_n until the closure of the railroad crossing 150 is completed. This τ can be calculated by the state analysis unit 107 based on the railroad crossing information, since the time from the start of closure to the completion of closure is determined in advance.
  • the above example shows a determination made only for the virtual line 302 located ahead in the moving direction of the moving object 130, but the determination may also be made for the virtual line 301 located behind the moving object 130 in the moving direction.
  • that is, the determination using the distance d(t_n) to the virtual line may be applied to both the backward virtual line 301 and the forward virtual line 302, and crossing may be determined to be impossible only if both are true. This takes into account the possibility that the moving object 130 can turn back.
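The first determination method (Equations 1 to 3), including the check against both virtual lines, can be sketched as follows; `cannot_cross`, its parameter names, and the stationary-object guard are hypothetical choices for illustration:

```python
import math

def cannot_cross(pos, prev_pos, dt, goal_back, goal_front, tau, eps=1e-6):
    """First determination method (Equations 1-3), as an illustrative sketch.

    pos, prev_pos: (x, y) at times t_n and t_{n-1}; dt = t_n - t_{n-1}
    goal_back / goal_front: destination points P_G on virtual lines 301 / 302
    tau: remaining time until the crossing closure is complete
    Returns True (dangerous event) when neither virtual line is reachable
    in time at the current speed.
    """
    # Equation 2: current moving speed v(t_n)
    v = math.dist(pos, prev_pos) / dt
    if v < eps:            # effectively stationary: cannot reach either line
        return True
    # Equation 1 + Equation 3, applied to both virtual lines
    too_slow_front = math.dist(pos, goal_front) / v > tau
    too_slow_back = math.dist(pos, goal_back) / v > tau
    return too_slow_front and too_slow_back
```

Requiring both lines to be unreachable mirrors the turn-back consideration above: the object is only flagged when it can reach neither exit in time.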
  • the second determination method determines whether the amount of movement of the detected object (moving object 130) over a certain time is less than or equal to a threshold value under predetermined conditions. In such a case, it can be assumed that the object is turning back, wandering around, or moving at low speed, so this method determines that crossing is not possible (true). Specifically, it can be expressed by Equation 4 below.
  • Equation 4-1 which is the first condition shown below, will be explained.
  • v(t n ) is the speed at the current time calculated using Equation 2 above.
  • the time t m is a time before the current time t n , for example, 2 seconds ago, further 5 seconds ago, or even 10 seconds ago. In other words, it can be a value between about 1 second and 10 seconds ago.
  • Equation 4-1 indicates that the total amount of movement from the earlier time t_m to the current time t_n is less than or equal to a predetermined movement amount threshold V_th:
  • Σ_{t_m < t_k ≤ t_n} v(t_k) · (t_k - t_{k-1}) ≤ V_th ... (Equation 4-1)
  • Equation 4-1 uses the total amount of movement for simplicity, but a velocity vector that takes the direction of movement into consideration may be used.
  • Equation 4-2, the next condition, will be explained:
  • τ ≤ T_th1 ... (Equation 4-2)
  • τ is the remaining time from the current time t_n until the closure of the railroad crossing 150 is completed, and T_th1 is a threshold for that remaining time. In other words, if little time remains until the closure of the railroad crossing 150 is completed, the situation is dangerous, so the condition of this equation is satisfied.
  • Equation 4-3 is satisfied when the detection target (moving object 130) is within the detection area M, that is, when M(x(t_n), y(t_n)) = 1:
  • M(x(t_n), y(t_n)) = 1 ... (Equation 4-3)
  • the detection area M(x, y) may be set in a predetermined area within the railroad crossing, but may also be set to include a prohibited area such as a track area outside the railroad crossing.
  • Equation 4 is determined to be true when all of Equations 4-1, 4-2, and 4-3 are satisfied, in which case it can be determined that a dangerous event has occurred. However, the degree of risk may be indicated in stages when only some of these three equations are satisfied. For example, if Equations 4-1 and 4-3 are satisfied, it may be determined that a state with a high possibility of becoming a dangerous event has occurred, and if Equation 4-2 is additionally satisfied, it may be determined that a dangerous event has occurred.
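A sketch of the second determination method, checking Equations 4-1 to 4-3 together; the roughly 5-second look-back window, the function name, and all parameter names are illustrative assumptions:

```python
import math

def stagnation_danger(history, t_now, tau, v_th, t_th1, in_detection_area):
    """Second determination method (Equations 4-1 to 4-3), as a sketch.

    history: [(t, x, y), ...] for the tracked object, oldest first
    tau: remaining time until the crossing closure completes
    v_th: movement-amount threshold over the look-back window
    t_th1: remaining-time threshold
    in_detection_area: the M(x, y) = 1 test for the current position
    """
    # Equation 4-1: total movement over the window <= V_th
    window = [(t, x, y) for t, x, y in history if t_now - t <= 5.0]  # ~5 s (example value)
    moved = sum(math.dist((x0, y0), (x1, y1))
                for (_, x0, y0), (_, x1, y1) in zip(window, window[1:]))
    cond1 = moved <= v_th
    # Equation 4-2: little time left before the closure completes
    cond2 = tau <= t_th1
    # Equation 4-3: object currently inside the detection area M
    _, x, y = history[-1]
    cond3 = in_detection_area(x, y)
    return cond1 and cond2 and cond3
```

The staged-risk variant described above would report a lower risk level when only `cond1` and `cond3` hold.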
  • in the third determination method, it is determined that a dangerous event has occurred when the detection target remains stopped for a predetermined time T_th2; this predetermined time T_th2 can be determined in advance.
  • the fourth determination method is the determination using the detection target type described in step S202.
  • the detection target type can be determined by image analysis using AI (artificial intelligence), deep learning, or the like. At this time, a determination is made according to the type of detection target. Examples of detection target types include people (adults, children, elderly people), cars, bicycles, motorbikes, wheelchairs, strollers, canes, white canes, and the like.
  • the threshold values (V th , T th1 , T th2 ) used in the second and third determination methods may be changed and set in advance for each detection target type.
  • in the fourth determination method, the time at which the moving object will reach the virtual line 301 or 302 is predicted based on machine learning. If the expected arrival time is later than the closure completion time of the railroad crossing 150, or if the probability that it will be later is greater than a predetermined value, it may be determined that a dangerous event has occurred. In this case, the arrival time for each type can be predicted by using past data at the relevant level crossing and data at other level crossings.
  • further, if the detection target is stopped, it may be determined that a dangerous event has occurred. For example, in the case of a car stopped at a railroad crossing for a predetermined period of time, the car ahead may be stuck in a traffic jam, or the car itself may be unable to move due to a problem such as a breakdown. In the case of a wheelchair stopped near a track groove, or facing the direction of the track in the crossing area, a wheel may have fallen into the groove, leaving the wheelchair unable to move. In such cases, the event may be determined to be a dangerous event or a state with a high possibility of becoming a dangerous event.
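The per-type threshold setting mentioned for the second and third determination methods could be represented as a simple lookup; the type names and every numeric value below are purely illustrative assumptions, not values from the disclosure:

```python
# Hypothetical per-type thresholds (V_th, T_th1, T_th2); real values would be
# tuned per site and per detection target type. These numbers are examples only.
TYPE_THRESHOLDS = {
    "adult":      {"v_th": 1.5, "t_th1": 5.0, "t_th2": 10.0},
    "elderly":    {"v_th": 0.8, "t_th1": 8.0, "t_th2": 15.0},
    "wheelchair": {"v_th": 0.6, "t_th1": 10.0, "t_th2": 20.0},
    "car":        {"v_th": 3.0, "t_th1": 5.0, "t_th2": 8.0},
}

def thresholds_for(type_id):
    # Fall back to the most conservative entry for unknown detection types
    return TYPE_THRESHOLDS.get(type_id, TYPE_THRESHOLDS["wheelchair"])
```

Keyed on the detection object type ID attached in step S202, this lets slower-moving types (wheelchairs, elderly people) be flagged earlier than cars.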
  • FIG. 6 is a diagram showing a first display example in the monitoring system of the present invention.
  • FIG. 7 is a diagram showing a second display example in the monitoring system of the present invention.
  • the first screen shown in FIG. 6 and the second screen shown in FIG. 7 are displayed by the display device 104. These show examples of highlighting or enlarging a video of interest among a plurality of displayed surveillance videos. Although any display method is possible, the following configurations are intended to help the supervisor notice the danger.
  • the first display method is the method shown in FIG. 6.
  • a main display 410 in which a plurality of images for each imaging device 101 are arranged in tiles is displayed on the left side. In other words, images of different railroad crossings are displayed side by side.
  • the main display 410 in FIG. 6 shows an example in which nine videos are displayed in a grid of three columns by three rows.
  • a thumbnail display 420 is provided on the right side of the main display 410.
  • the thumbnail display 420 is displayed in a smaller area than the main display 410, and each video display is also smaller than the main display 410.
  • images from imaging devices 101 that cannot be displayed on the main display 410 are displayed there at a reduced size.
  • the thumbnail display 420 in FIG. 6 shows an example in which three videos are displayed vertically side by side.
  • the trigger of step S201 in the flowchart may also be the start of the closing (shutoff) of the railroad crossing.
  • in step S204 of the flowchart, the display can be emphasized differently at the stage where a train is approaching and at the stage where a dangerous event has been determined, as compared with the normal stage.
  • the image from the corresponding imaging device 101 may be emphasized by some kind of highlight display.
  • the corresponding video may be moved from the thumbnail display 420 in FIG. 6 to the main display 410.
  • a specific highlight display 411 is performed at the stage where the event is determined to be a dangerous event.
  • the highlight display 411 uses color-coded frames to warn the observer, for example a yellow frame for an approaching train and a red frame for a dangerous event.
  • images that cannot be displayed on the main display 410 may be highlighted and displayed on the thumbnail display 420 in the same manner as the highlight display 411.
  • images showing an approaching train or a dangerous event may be sorted so that they are displayed above other images.
  • images may be arranged from the left side of the upper row of the main display 410.
  • the video of the dangerous event may be displayed at a higher priority than the video of the approaching train.
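The ordering rule above, dangerous events first, then approaching trains, then normal video, filling the main display from the upper left, could be sketched as follows (hypothetical data shape; the patent does not prescribe an implementation):

```python
def order_tiles(videos):
    """Sort (camera_id, state) pairs for tiled display.

    state is one of "dangerous", "approaching", "normal"; dangerous-event
    videos come first, approaching-train videos next, the rest last.
    Python's sort is stable, so ties keep their original camera order.
    """
    priority = {"dangerous": 0, "approaching": 1, "normal": 2}
    return sorted(videos, key=lambda v: priority[v[1]])
```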
  • in the second display method, a highlight display 411 of an image of an approaching train, as shown in FIG. 6, is first performed in the same way as in the first display method.
  • the screen then changes to a second display screen 500 as shown in FIG. 7.
  • the video determined to be a dangerous event is displayed as an enlarged display 511 on the main display 510, as shown in FIG. 7.
  • the enlarged display 511 uses four normal image display spaces to display one image in an enlarged manner.
  • the thumbnail display 420 on the right side in FIG. 7 is displayed in the same manner as in FIG. 6 .
  • the first display method and the second display method described above may have a function of canceling the emphasis or enlarged display of the dangerous event after detecting the dangerous event. This assumes that safety has been confirmed by a supervisor or the like.
  • the display device 104 may be provided with an interface for this cancellation operation. In this case, once the highlighted or enlarged display is started, the highlighted or enlarged display may be continued until it is manually canceled.
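The latched behaviour described above, where the emphasis persists until an operator cancels it, might look like this (hypothetical class, not from the patent text):

```python
class HighlightLatch:
    """Keep a highlight/enlarged display active until manually canceled."""

    def __init__(self):
        self.active = False

    def on_dangerous_event(self):
        self.active = True          # start the emphasized display

    def on_event_cleared(self):
        pass                        # detection clearing alone does not remove the emphasis

    def cancel(self):
        self.active = False         # operator confirms safety via the UI
```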
  • videos in which the crossing is starting to close and videos of dangerous events may be preferentially displayed on the main displays 410 and 510, with videos in other states shown in the thumbnail display 420. This allows videos of particular interest to be displayed on the main displays 410 and 510.
  • the main displays 410 and 510 may exclusively allocate videos of crossings that are starting to close and videos in other states to individual tile-shaped display portions. Furthermore, each tile-shaped display portion can be allocated for cyclic display.
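Cyclic allocation of the shared tile portions could be sketched as follows (hypothetical function; the patent does not prescribe an algorithm):

```python
def cyclic_assignment(other_cameras, n_tiles, step):
    """Return which "other state" cameras occupy the shared tiles at cycle `step`.

    Cameras rotate through the available tiles round-robin, so every
    camera eventually appears even when tiles are scarce.
    """
    if not other_cameras:
        return []
    n = len(other_cameras)
    return [other_cameras[(step * n_tiles + i) % n] for i in range(min(n_tiles, n))]
```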
  • <Effect> by analyzing the image captured by the imaging device, a dangerous event at a railroad crossing can be detected accurately. By detecting the approach of a train, the event can be detected before the closing of the crossing is completed, and by determining whether a moving object can finish crossing, accurate detection is possible. Since an imaging device is used, the entire condition inside the railroad crossing can be observed, and the system can be realized without other sensors such as laser irradiation type sensors, reducing cost. The images from the imaging device can also be viewed directly by surveillance personnel to confirm the situation visually, and by emphasizing or enlarging a video of interest on the display device, the video can be presented so that surveillance staff easily notice it.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the embodiments described above are described in detail to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to having all the configurations described.
  • it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • although the above embodiment has been described with respect to a railroad crossing, the invention is also applicable to other situations in which the predicted arrival time to a virtual line is used, for example, signal control at intersections, automatic door opening/closing control, elevator door opening/closing control, and the like.
  • although the above embodiment shows an example in which the detection target is tracked using images from a camera, a configuration in which various sensors such as LiDAR and millimeter-wave radar are additionally used for detection and tracking may be adopted to improve detection accuracy.
  • ...status analysis unit, 130...moving object, 150...level crossing, 151...blocking rod, 152...level crossing alarm, 160...crossing area, 161...boundary line, 170...railway, 301, 302...virtual line, 400...display screen, 410...main display, 411...highlight display, 420...thumbnail display, 500...second display screen, 510...main display, 511...enlarged display

Abstract

The purpose of the present invention is to provide a monitoring system capable of detecting dangerous events at railroad crossings more accurately. The monitoring system comprises an imaging device and an analysis device. The analysis device comprises a state analysis control unit and a state analysis unit. Upon detecting the approach of a train, either by analyzing an image acquired by the imaging device or on the basis of information notified from a state notification unit, the state analysis control unit instructs the state analysis unit to start analyzing the image of the railroad crossing. The state analysis unit uses the image acquired by the imaging device to detect and track a moving object inside the railroad crossing. When it determines, using at least the current moving speed of the moving object, that a detection target cannot reach a defined virtual line before the closing of the railroad crossing is completed, the state analysis unit determines that a dangerous event is occurring and performs alarm processing.
PCT/JP2022/045759 2022-03-07 2022-12-13 Monitoring system WO2023171068A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022-034071 2022-03-07
JP2022034071 2022-03-07

Publications (1)

Publication Number Publication Date
WO2023171068A1 true WO2023171068A1 (fr) 2023-09-14

Family

ID=87936587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/045759 WO2023171068A1 (fr) Monitoring system

Country Status (1)

Country Link
WO (1) WO2023171068A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006111177A (ja) * 2004-10-15 2006-04-27 Central Japan Railway Co Moving object detection device for a crossing path, information communication device for a crossing path, and program
JP2006118914A (ja) * 2004-10-20 2006-05-11 West Japan Railway Co Object detection device
JP2010181614A (ja) * 2009-02-05 2010-08-19 Yahoo Japan Corp Movement simulation device and operation method of movement simulation device
JP2015070401A (ja) * 2013-09-27 2015-04-13 NEC Corporation Video processing device, video processing method, and video processing program
JP2015182556A (ja) * 2014-03-24 2015-10-22 Railway Technical Research Institute Monitoring system and monitoring program for persons passing through a railroad crossing
WO2018084191A1 (fr) * 2016-11-07 2018-05-11 Hitachi Kokusai Electric Inc. Congestion state analysis system
JP2021064398A (ja) * 2021-01-04 2021-04-22 NEC Corporation Control method, program, and system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22931038

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024505904

Country of ref document: JP