WO2023171068A1 - Surveillance system - Google Patents

Surveillance system

Info

Publication number
WO2023171068A1
WO2023171068A1 (application PCT/JP2022/045759)
Authority
WO
WIPO (PCT)
Prior art keywords
crossing
moving object
monitoring system
railroad crossing
state analysis
Prior art date
Application number
PCT/JP2022/045759
Other languages
French (fr)
Japanese (ja)
Inventor
圭吾 長谷川
友輔 生内
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気 filed Critical 株式会社日立国際電気
Publication of WO2023171068A1 publication Critical patent/WO2023171068A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54 Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B25/00 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08B25/01 Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems, characterised by the transmission medium
    • G08B25/04 Alarm systems in which the location of the alarm condition is signalled to a central station, characterised by the transmission medium, using a single signalling line, e.g. in a closed loop
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a monitoring system, and particularly to a monitoring system that detects a dangerous event based on the results of detecting and tracking a detection target in an image.
  • In Patent Document 1, an apparatus is disclosed that, when it is determined that a vehicle is approaching a crossing road, determines whether or not the detected object is a moving object. If it is determined that there is a moving object, the apparatus predicts the change in the moving object's position and, based on that predicted change, predicts whether the moving object is likely to interfere with the vehicle's travel.
  • Patent Document 2 discloses a level-crossing passage support system that determines whether a passerby can pass through a level crossing by comparing the passerby's position and passing speed with the level-crossing information, and notifies the passerby of the result of the determination.
  • Patent Document 1 assumes the use of a laser irradiation type sensor.
  • the range that can be detected by laser irradiation is limited, and the information obtained from the laser is also limited.
  • laser irradiation type sensors are more expensive than cameras. Furthermore, it is necessary to install a separate camera in order for the guard to visually check the situation inside the railroad crossing.
  • Patent Document 2 assumes that the location of a passerby is acquired by the passerby holding a mobile terminal. Therefore, if a passerby does not have a corresponding mobile terminal, the passerby cannot be detected.
  • the present invention aims to provide a monitoring system that can more accurately detect dangerous events within railroad crossings.
  • one of the typical monitoring systems of the present invention includes an imaging device and an analysis device, and the analysis device has a state analysis control section and a state analysis section.
  • when the state analysis control section detects that a train is approaching, based on analysis of the image acquired by the imaging device or on information notified from the status notification section, it instructs the state analysis section to start analyzing the image inside the railroad crossing.
  • the state analysis unit detects and tracks a moving object within the railroad crossing using the image acquired by the imaging device; if, using at least the current moving speed of the moving object, it determines that the object cannot reach a preset virtual line by the time the closing of the railroad crossing is completed, it determines that a dangerous event has occurred and performs alarm processing.
  • a dangerous event within a railroad crossing can be detected more accurately in a monitoring system.
  • FIG. 1 is a block diagram of a computer system for implementing aspects according to embodiments of the present disclosure.
  • FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
  • FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
  • FIG. 4 shows an example of a flowchart of processing of the monitoring system of the present invention.
  • FIG. 5 is a diagram for explaining the determination method of the monitoring system of the present invention.
  • FIG. 6 is a diagram showing a first display example in the monitoring system of the present invention.
  • FIG. 7 is a diagram showing a second display example in the monitoring system of the present invention.
  • FIG. 1 is a block diagram of a computer system 1 for implementing aspects according to embodiments of the present disclosure.
  • the mechanisms and apparatus of the various embodiments disclosed herein may be applied to any suitable computing system.
  • the main components of computer system 1 include one or more processors 2 , memory 4 , terminal interface 12 , storage interface 14 , I/O (input/output) device interface 16 , and network interface 18 . These components may be interconnected via a memory bus 6, an I/O bus 8, a bus interface unit 9, and an I/O bus interface unit 10.
  • the computer system 1 may include one or more processing devices 2A and 2B, collectively referred to as processors 2. Each processor 2 executes instructions stored in memory 4 and may include an onboard cache. In some embodiments, computer system 1 may include multiple processors, and in other embodiments, computer system 1 may be a single-processing-unit system. A CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), or the like can be applied as the processing device.
  • memory 4 may include random access semiconductor memory, storage devices, or storage media (either volatile or non-volatile) for storing data and programs.
  • memory 4 represents the entire virtual memory of computer system 1 and may include virtual memory of other computer systems connected to computer system 1 via a network.
  • While this memory 4 may be conceptually considered a single entity, in other embodiments it may be a more complex arrangement, such as a hierarchy of caches and other memory devices.
  • memory may exist as multiple levels of caches, and these caches may be divided by function. As a result, one cache may hold instructions while the other cache holds non-instruction data used by the processor.
  • Memory may also be distributed and associated with a variety of different processing devices, as in a so-called NUMA (Non-Uniform Memory Access) computer architecture.
  • Memory 4 may store all or some of the programs, modules, and data structures that perform the functions described herein.
  • the memory 4 may store a latent factor identification application 50.
  • latent factor identification application 50 may include instructions or statements that perform the functions described below when executed on processor 2, or instructions or statements that are interpreted by other instructions or statements.
  • latent factor identification application 50 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices, instead of or in addition to a processor-based system.
  • latent factor identification application 50 may include data other than instructions or descriptions.
  • cameras, sensors, or other data input devices may be provided to communicate directly with bus interface unit 9, processor 2, or other hardware of computer system 1. Such an arrangement may reduce the need for processor 2 to access memory 4 and the latent factor identification application 50.
  • the computer system 1 may include a processor 2, a memory 4, a display system 24, and a bus interface unit 9 for communication with the I/O bus interface unit 10.
  • I/O bus interface unit 10 may be coupled to I/O bus 8 for transferring data to and from various I/O units.
  • I/O bus interface unit 10 communicates via I/O bus 8 with a plurality of I/O interface units 12, 14, 16, and 18, also known as I/O processors (IOPs) or I/O adapters (IOAs).
  • Display system 24 may include a display controller, display memory, or both. A display controller may provide video, audio, or both data to display device 26.
  • Computer system 1 may also include devices such as one or more sensors configured to collect data and provide the data to processor 2 .
  • computer system 1 may include environmental sensors that collect humidity data, temperature data, pressure data, etc., motion sensors that collect acceleration data, motion data, etc., and the like. Other types of sensors can also be used.
  • the display memory may be a dedicated memory for buffering video data.
  • Display system 24 may be connected to a display device 26, such as a standalone display screen, a television, a tablet, or a handheld device.
  • display device 26 may include speakers to render audio.
  • a speaker for rendering audio may be connected to the I/O interface unit.
  • the functionality provided by display system 24 may be implemented by an integrated circuit that includes processor 2.
  • the functionality provided by the bus interface unit 9 may be realized by an integrated circuit including the processor 2.
  • the I/O interface unit has the ability to communicate with various storage or I/O devices.
  • the terminal interface unit 12 may attach a user I/O device 20 such as a user output device (a video display device, a speaker, or a television) or a user input device (a keyboard, mouse, keypad, touch pad, trackball, button, light pen, or other pointing device). Using the user interface, a user may operate the user input device to input data and instructions to the user I/O device 20 and the computer system 1, and may receive output data from the computer system 1.
  • the user interface may be displayed on a display device, played via a speaker, or printed via a printer, for example, via the user I/O device 20.
  • Storage interface 14 may attach one or more disk drives or direct-access storage devices 22 (typically magnetic disk drive storage devices, although they may be an array of disk drives or other storage devices configured to appear as a single disk drive).
  • storage device 22 may be implemented as any secondary storage device.
  • the contents of the memory 4 may be stored in the storage device 22 and read from the storage device 22 as needed.
  • Network interface 18 may provide a communication path so that computer system 1 and other devices can communicate with each other. This communication path may be, for example, the network 30.
  • the computer system 1 shown in FIG. 1 includes a bus structure that provides a direct communication path between processor 2, memory 4, bus interface 9, display system 24, and I/O bus interface unit 10; in other embodiments, computer system 1 may include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel and redundant communication paths.
  • While I/O bus interface unit 10 and I/O bus 8 are shown as single units, in reality computer system 1 may include multiple I/O bus interface units 10 or multiple I/O buses 8.
  • While various I/O interface units are shown separating the I/O bus 8 from the various communication paths that connect the various I/O devices, in other embodiments some or all of the I/O devices may be connected directly to one system I/O bus.
  • computer system 1 may be a device that receives requests from other computer systems (clients) and has no direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer. In other embodiments, computer system 1 may be a desktop computer, a portable computer, a laptop, a tablet computer, a pocket computer, a telephone, a smartphone, or any other suitable electronic device.
  • FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
  • the monitoring system 100 shown in FIG. 2 includes an imaging device 101, an analysis device 102, an alarm device 103, a display device 104, and a status notification section 105.
  • the imaging device 101 is configured by, for example, a network camera, and is used to acquire images of monitoring areas such as railroad crossings.
  • the imaging device 101 can have a camera configuration that obtains information by forming an image of incident light on an image sensor through a lens or an aperture.
  • Examples of the image sensor here include a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, and the like.
  • the imaging device 101 captures video at, for example, 3 frames per second (3 fps) or more, and the information is sent to the analysis device 102 .
  • a plurality of imaging devices 101 can be installed depending on the situation.
  • As the camera a visible light camera, an infrared camera, or the like can be used.
  • the analysis device 102 includes a state analysis control section 106 and a state analysis section 107.
  • the analysis device 102 is realized by, for example, a computer equipped with a processor such as a CPU, GPU, FPGA, or DSP, and the computer system 1 of FIG. 1 can be applied, for example.
  • the condition analysis control section 106 controls the start of detection of a dangerous event by the condition analysis section 107 based on the image obtained by the imaging device 101 and the train approach information notified from the condition notification section 105.
  • the state analysis unit 107 analyzes the image obtained by the imaging device 101 and predicts whether a dangerous event such as a moving object such as a person or a car being left behind at a railroad crossing will occur. A specific determination method will be described later.
  • the warning device 103 is composed of, for example, a speaker, and when the state analysis unit 107 detects a dangerous event, it notifies the area around the railroad crossing of the danger by sound. As another aspect, it may be configured as a protective radio device that, when the state analysis unit 107 detects a dangerous event, transmits an emergency stop signal to the relevant train or the like.
  • the display device 104 is a display device that can display surveillance video captured by the imaging device 101. At this time, the details of the dangerous event detected by the state analysis unit 107 can be displayed.
  • the display device 104 can be configured with, for example, a liquid crystal display, an organic EL (OLED) display, or the like. Further, operation means such as a switcher, a keyboard, a mouse, etc. may be attached.
  • the display device 104 may aggregate images from each imaging device 101 to a monitoring center or the like.
  • the display device 104 may be configured at the same location as the analysis device 102, or may be configured at a different location.
  • the display device 104 can be applied as the display device 26 in the example of FIG. 1.
  • the status notification unit 105 is configured using a railroad crossing controller, an automatic train stop (ATS), and the like.
  • the status notification unit 105 detects the approach of a train to the relevant railroad crossing, and notifies the status analysis control unit 106 of the approach information.
  • This approach information can be determined based on, for example, the train's position relative to the relevant level crossing, or whether the estimated time until the train reaches the level crossing is within a predetermined value.
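  • The time-based approach decision above can be sketched as a simple time-to-arrival check. The function name and the threshold value below are illustrative assumptions, not values taken from this patent:

```python
def train_approaching(distance_m, speed_mps, time_threshold_s=30.0):
    """Report 'approaching' when the estimated time for the train to reach
    the crossing (distance / speed) is within a predetermined threshold.

    distance_m:       distance from the train to the crossing, in metres
    speed_mps:        current train speed, in metres per second
    time_threshold_s: assumed example value; the patent leaves it unspecified
    """
    if speed_mps <= 0:
        return False  # a stopped train is not treated as approaching here
    return distance_m / speed_mps <= time_threshold_s
```

  • In practice the status notification unit 105 would obtain these quantities from the level crossing controller or ATS rather than compute them itself.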
  • FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
  • a moving object 130, exemplified by a wheelchair, is shown crossing the crossing area 160 of a railroad crossing 150.
  • the crossing area 160 is the area between the front and rear blocking rods 151 and crosses the track 170.
  • a boundary line 161 of the crossing area 160 is a boundary line near the front blocking rod 151.
  • the imaging device 101 can be installed on a railroad crossing warning device 152 or the like, positioned so that it can photograph the entire crossing area 160. For this reason, it is preferably installed above a certain height, for example 2 m or more, which is higher than a person's height. Instead of a single imaging device 101, a plurality of imaging devices 101 may be installed, for example diagonally, to photograph the entire railroad crossing 150 including the crossing area 160 from multiple angles.
  • the warning device 103 is installed near the railroad crossing 150, and can issue a warning to the railroad crossing 150 and people near the railroad crossing 150. In the example of FIG. 3, it is installed in a railroad crossing warning device 152.
  • the analysis device 102 can be installed at a location different from the railroad crossing 150. Furthermore, the condition analysis unit 107 may be installed near the railroad crossing 150 or may be installed integrally with the analysis device 102.
  • the display device 104 can be installed at a location different from the railroad crossing 150. For example, it is possible to aggregate the information together with information from the imaging devices 101 at other railroad crossings at a monitoring center or the like.
  • FIG. 3 shows a case where a dangerous event occurs in which the moving object 130 is likely to be left behind in the crossing area 160 of the railroad crossing 150.
  • the information is analyzed by the analysis device 102 based on the image photographed by the imaging device 101, and the event is determined to be a dangerous event. Based on this, the alarm device 103 issues a predetermined alarm. Further, on the display device 104, a supervisor can check the video of the event determined to be a dangerous event.
  • FIG. 4 shows an example of a flowchart of processing of the monitoring system of the present invention. The processing flow of the monitoring system 100 will be explained using FIG. 4.
  • the state analysis control unit 106 detects the approach of a train.
  • the detection here can be carried out by detecting the start of the warning bell or the start of barrier closing at the railroad crossing 150.
  • the imaging device 101 constantly transmits railroad crossing surveillance video to the state analysis control unit 106, so the state analysis control unit 106 can detect the start of barrier closing by image analysis. Methods for detecting the start of closing from surveillance video include detecting the blinking of the warning lights on the railroad crossing alarm 152 from changes in color and brightness, and detecting movement of the blocking rod 151.
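  • As one hedged sketch of the brightness-based method, the blinking of the warning light can be detected from the peak-to-peak swing of the mean brightness inside a fixed region of interest; the function name, ROI layout, and threshold below are illustrative assumptions:

```python
def is_warning_flashing(frames, roi, amp_threshold=30.0):
    """Detect a flashing crossing warning light from brightness changes.

    frames:        sequence of grayscale frames, each a 2-D list of pixel values
    roi:           (y0, y1, x0, x1) region covering the warning lamp
    amp_threshold: assumed minimum peak-to-peak brightness swing

    A flashing lamp alternates between bright and dark states, so the
    peak-to-peak swing of the mean ROI brightness over the recent frames
    exceeds the threshold; a steady scene does not.
    """
    y0, y1, x0, x1 = roi
    levels = []
    for frame in frames:
        region = [px for row in frame[y0:y1] for px in row[x0:x1]]
        levels.append(sum(region) / len(region))
    return max(levels) - min(levels) >= amp_threshold
```

  • A deployed system would also check the blinking period and the lamp color to reject brightness changes caused by passing headlights or weather.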
  • the approach of a train may be detected by the status notification unit 105 acquiring information from a level crossing controller, an ATS, or the like.
  • the condition analysis control section 106 instructs the condition analysis section 107 to start analyzing the level crossing condition.
  • the state analysis unit 107 detects and tracks moving objects 130 such as people, cars, bicycles, motorcycles, wheelchairs, strollers, canes, and white canes inside the railroad crossing 150.
  • the detection and tracking here can be realized by combining machine learning technology, pattern matching, etc. based on the video captured by the imaging device 101.
  • Machine learning technology can be performed by image analysis using AI (artificial intelligence), deep learning, etc.
  • Pattern matching can be performed, for example, by image analysis, such as by comparing with a template image.
  • Once the moving object 130 is detected, its position changes frame by frame on the screen. The movement history of the detected moving object 130 can be tracked by using machine learning or pattern matching to determine whether an object in a new frame is the same as the previously detected moving object 130.
  • algorithms other than machine learning and pattern matching may be used to detect and track the moving object 130.
  • a detection object type ID may be attached to each moving object to identify the type of the object, such as a person, a car, a bicycle, a motorcycle, a wheelchair, a stroller, a cane, a white cane, etc.
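  • The frame-to-frame association step can be illustrated with a minimal nearest-neighbour tracker that also carries the detection object type ID. This is a simplified stand-in for the machine-learning or pattern-matching tracking described above, and all names here are illustrative:

```python
import math
from itertools import count

class CentroidTracker:
    """Minimal frame-to-frame tracker: each detection is matched to the
    nearest existing track of the same type within max_dist pixels;
    otherwise a new track with a fresh ID is created."""

    def __init__(self, max_dist=50.0):
        self.max_dist = max_dist
        self.tracks = {}        # track_id -> (x, y, type_id)
        self._ids = count(1)

    def update(self, detections):
        """detections: list of (x, y, type_id), e.g. type_id 'person',
        'wheelchair'. Returns {track_id: (x, y, type_id)} for this frame."""
        assigned = {}
        free = dict(self.tracks)
        for x, y, t in detections:
            best, best_d = None, self.max_dist
            for tid, (px, py, pt) in free.items():
                d = math.hypot(x - px, y - py)
                if d < best_d and pt == t:
                    best, best_d = tid, d
            if best is None:
                best = next(self._ids)   # unmatched detection: new track
            else:
                free.pop(best)           # matched: continue existing track
            assigned[best] = (x, y, t)
        self.tracks = assigned
        return assigned
```

  • The movement history accumulated per track ID is what the later speed estimation and determination steps consume.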
  • In step S203, the speed of the moving object 130 is estimated.
  • the speed is estimated by the state analysis unit 107 through image analysis of the video captured by the imaging device 101.
  • a specific example of the speed estimation method will be explained in the determination method described later.
  • In step S204, it is determined whether the moving object 130 can cross the railroad crossing 150. This determination can be made by the state analysis unit 107 using the speed estimated in step S203, for example by judging whether the moving object 130 can pass through the crossing area 160 before barrier closing is completed. A specific example of the determination method will be described later.
  • In step S204, if the moving object 130 can cross the railroad crossing 150, the process ends. If crossing is not possible (true), the process advances to step S205, and the state analysis unit 107 transmits information on the occurrence of a dangerous event.
  • In step S205, alarm processing is performed.
  • the warning device 103 alerts the area around the railroad crossing 150 of danger by voice or the like.
  • the display device 104 may display a warning about the occurrence of a dangerous event. These operations can be performed based on information received from the state analysis unit 107 regarding the occurrence of a dangerous event.
  • FIG. 5 is a diagram for explaining the determination method of the monitoring system of the present invention. The first determination method will be explained using FIG. 5.
  • FIG. 5 shows a state in which a moving object 130 (a wheelchair is exemplified in FIG. 5) is crossing a railroad crossing 150.
  • A virtual line 301 is set near the rear boundary, in the crossing direction, of the crossing area 160 of the railroad crossing 150, and a virtual line 302 is set near the front boundary. These can be determined by image analysis.
  • Assume the moving object 130 is present in the crossing area 160, the current time is t_n, and the current position is P_n(x(t_n), y(t_n)).
  • The first determination method determines that the moving object 130 cannot cross the railroad crossing (true) when the time it is predicted to take to cross the railroad crossing 150 is longer than the remaining time until the barrier 151 of the railroad crossing 150 closes.
  • The destination point at which the moving object 130 reaches the forward virtual line 302 is denoted P_G(x_G, y_G). The distance d(t_n) from the current position P_n(x(t_n), y(t_n)) to P_G(x_G, y_G) is given by Equation 1 below, the Euclidean distance between the two points:

    d(t_n) = √((x_G − x(t_n))² + (y_G − y(t_n))²)  (Equation 1)

  • For example, the point on the virtual line 302 at the shortest distance ahead of the current position P_n(x(t_n), y(t_n)) can be taken as the destination point P_G. Alternatively, when the movement history of the moving object 130 is available, the intersection of the virtual line 302 with the extension of the movement vector calculated from that history may be set as the destination point P_G.
  • The current moving speed v(t_n) is given by Equation 2 below, the distance moved between the immediately preceding time t_{n-1} and the current time t_n divided by the elapsed time:

    v(t_n) = √((x(t_n) − x(t_{n-1}))² + (y(t_n) − y(t_{n-1}))²) / (t_n − t_{n-1})  (Equation 2)

  • The immediately preceding time t_{n-1} is a predetermined time before the current time t_n; for example, it may be a fixed frame interval, such as one or several frames earlier.
  • Equation 3 is the determination condition. It is evaluated as true or false using the distance d(t_n) obtained from Equation 1 and the moving speed v(t_n) obtained from Equation 2:

    d(t_n) / v(t_n) > τ  (Equation 3)

  • Here τ is the remaining time from the current time t_n until the closing of the railroad crossing 150 is completed. Since the time from the start of closing to its completion is determined in advance, the state analysis unit 107 can calculate τ from information on the railroad crossing.
  • The above example makes the determination only for the virtual line 302 ahead of the moving object 130 in its moving direction, but the determination may also be made for the virtual line 301 behind it.
  • In that case, the distance d(t_n) is computed for both the backward virtual line 301 and the forward virtual line 302, and crossing is judged impossible only when Equation 3 is true for both. This takes into account the possibility that the moving object 130 may turn back.
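  • Under the definitions above, the first determination method can be sketched as follows. The forms of Equations 1 to 3 are assumed here to be the Euclidean distance, the displacement-based speed, and the time-to-reach comparison, and the both-lines variant is used:

```python
import math

def cannot_cross(pos_now, pos_prev, dt, line_front, line_back, tau):
    """First determination method (sketch): crossing is judged impossible
    (dangerous) when the predicted time to reach BOTH the forward virtual
    line 302 and the backward virtual line 301 exceeds the remaining time
    tau until the barriers finish closing.

    pos_now, pos_prev: (x, y) positions at times t_n and t_{n-1}
    dt:                t_n - t_{n-1} in seconds
    line_front, line_back: destination points on virtual lines 302 and 301
    tau:               remaining seconds until closing is completed
    """
    # Equation 2 (assumed form): current speed from displacement over dt.
    v = math.dist(pos_now, pos_prev) / dt
    if v == 0.0:
        return True  # a stationary object can reach neither line in time
    # Equation 1: straight-line distance to each virtual line, and
    # Equation 3: time-to-reach d/v compared with the remaining time tau.
    return all(math.dist(pos_now, g) / v > tau for g in (line_front, line_back))
```

  • For example, an object 2 m from the front line moving at 1 m/s can cross when 5 s remain, but not when only 1 s remains.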
  • The second determination method determines whether the amount of movement of the detected object (moving object 130) over a certain time is less than or equal to a threshold value under predetermined conditions. In such a case, the object can be assumed to be turning back, wandering, or moving at low speed, so this method determines that crossing is not possible (true). Specifically, it can be expressed by Equation 4 below.
  • Equation 4-1, the first condition, is as follows:

    Σ v(t_i)·(t_i − t_{i-1}) ≤ V_th, summed over t_m < t_i ≤ t_n  (Equation 4-1)
  • v(t n ) is the speed at the current time calculated using Equation 2 above.
  • the time t m is a time before the current time t n , for example, 2 seconds ago, further 5 seconds ago, or even 10 seconds ago. In other words, it can be a value between about 1 second and 10 seconds ago.
  • Equation 4-1 indicates that the total amount of movement from the previous time t m to the current time t n is less than or equal to a predetermined movement amount threshold.
  • Equation 4-1 uses the total amount of movement for simplicity, but a velocity vector that takes the direction of movement into consideration may be used.
  • Next, Equation 4-2, the second condition, will be explained:

    τ ≤ T_th1  (Equation 4-2)

  • Here τ is the remaining time from the current time t_n until the closing of the railroad crossing 150 is completed, and T_th1 is a threshold value for that remaining time. In other words, if little time remains until the closing of the level crossing 150 is completed, the situation is dangerous and the condition of this equation is satisfied.
  • The condition of Equation 4-3 is satisfied when the detection target (moving object 130) is within the detection area M, that is, when M(x(t_n), y(t_n)) = 1.
  • the detection area M(x, y) may be set in a predetermined area within the railroad crossing, but may also be set to include a prohibited area such as a track area outside the railroad crossing.
  • Equation 4 as a whole is determined to be true, and a dangerous event is judged to have occurred, when all of Equations 4-1, 4-2, and 4-3 are satisfied. However, the degree of risk may be indicated in stages when only some of the three equations are satisfied; for example, if Equations 4-1 and 4-3 are satisfied, a state with a high possibility of danger may be indicated, and if Equation 4-2 is additionally satisfied, a dangerous event may be determined to have occurred.
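  • A minimal sketch of the second determination method, assuming Equation 4-1 is the summed movement over the recent position history and that all three conditions must hold:

```python
import math

def low_motion_danger(history, v_th, tau, t_th1, in_area):
    """Second determination method (sketch of Equations 4-1 to 4-3).

    history: positions (x, y) sampled from time t_m up to the current t_n
    v_th:    movement-amount threshold (Equation 4-1)
    tau:     remaining seconds until the crossing finishes closing
    t_th1:   remaining-time threshold (Equation 4-2)
    in_area: True when the object is inside detection area M (Equation 4-3)
    """
    # Equation 4-1: total movement over the history window.
    total_movement = sum(math.dist(a, b) for a, b in zip(history, history[1:]))
    # All three conditions must hold for a dangerous event.
    return total_movement <= v_th and tau <= t_th1 and in_area
```

  • A staged-risk variant would report an intermediate level when only a subset of the conditions holds, as the text suggests.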
  • the predetermined time T th2 can be determined in advance.
  • the fourth determination method is the determination using the detection target type described in step S202.
  • the detection target type can be determined by image analysis using AI (artificial intelligence), deep learning, or the like. At this time, a determination is made according to the type of detection target. Examples of detection target types include people (adults, children, elderly people), cars, bicycles, motorbikes, wheelchairs, strollers, canes, white canes, and the like.
  • the threshold values (V_th, T_th1, T_th2) used in the second and third determination methods may be set in advance for each detection target type.
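  • One way to realize such per-type thresholds is a lookup table keyed by the detection object type ID; the type names and numeric values below are invented placeholders, not values from this patent:

```python
# Hypothetical per-type threshold table: slower detection types get a
# smaller movement threshold and earlier warning times.
#   type          V_th  T_th1  T_th2
TYPE_THRESHOLDS = {
    "adult":      (1.0,  5.0,  10.0),
    "elderly":    (0.5,  8.0,  15.0),
    "wheelchair": (0.5,  8.0,  15.0),
    "car":        (2.0,  5.0,  20.0),
}

def thresholds_for(detected_type):
    """Return (V_th, T_th1, T_th2) for a type, falling back to the most
    conservative assumed defaults for unknown types."""
    default = (0.5, 8.0, 20.0)
    return TYPE_THRESHOLDS.get(detected_type, default)
```

  • The determination functions then simply receive the tuple for the type reported by the detector.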
  • As another aspect, the arrival time at the virtual line 301 or 302 may be predicted based on machine learning. If the expected arrival time is later than the closing completion time of the railroad crossing 150, or if the probability that it will be later is greater than a predetermined value, it may be determined that a dangerous event has occurred. In this case, the arrival time for each detection target type can be predicted using past data at the relevant level crossing and data at other level crossings.
  • If the detected object remains stopped under certain conditions, this may indicate that a dangerous event has occurred. For example, in the case of a car, if the car is stopped at the railroad crossing for a predetermined period, the car ahead may be stuck in a traffic jam, or the car may be unable to move due to a breakdown or other problem. In the case of a wheelchair, if it is stopped near a rail groove or facing along the track in the crossing area, a wheel may have dropped into the groove, leaving it unable to move. In such cases, the event may be determined to be a dangerous event or a state with a high possibility of becoming a dangerous event.
  • FIG. 6 is a diagram showing an example of display on the first screen in the monitoring system of the present invention.
  • FIG. 7 is a diagram showing an example of display on the second screen in the monitoring system of the present invention.
  • the first screen shown in FIG. 6 and the second screen shown in FIG. 7 are displayed by the display device 104. These examples highlight or enlarge a video of interest from among a plurality of displayed surveillance videos. Any display method is possible, but the following configurations are designed so that the observer notices the danger.
  • The first display method is the method shown in FIG. 6.
  • a main display 410 in which a plurality of images for each imaging device 101 are arranged in tiles is displayed on the left side. In other words, images of different railroad crossings are displayed side by side.
  • The main display 410 in FIG. 6 shows an example in which nine videos are displayed in a grid of three columns by three rows.
  • a thumbnail display 420 is provided on the right side of the main display 410.
  • the thumbnail display 420 is displayed in a smaller area than the main display 410, and each video display is also smaller than the main display 410.
  • Images from the imaging devices 101 that cannot be shown on the main display 410 are displayed in the thumbnail display 420.
  • the thumbnail display 420 in FIG. 6 shows an example in which three videos are displayed vertically side by side.
  • The detection in step S201 of FIG. 4 may also be triggered by the start of the blocking.
  • From step S204 in FIG. 4 onward, the display can be emphasized differently at the stage where a train is approaching and at the stage where a dangerous event has been determined, as compared with the normal stage.
  • the image from the corresponding imaging device 101 may be emphasized by some kind of highlight display.
  • the corresponding video may be moved from the thumbnail display 420 in FIG. 6 to the main display 410.
  • a specific highlight display 411 is performed at the stage where the event is determined to be a dangerous event.
  • The highlight display 411 may be color-coded, for example a yellow frame while a train is approaching and a red frame once a dangerous event has been determined, warning with differently colored frames.
  • images that cannot be displayed on the main display 410 may be highlighted and displayed on the thumbnail display 420 in the same manner as the highlight display 411.
  • Videos showing an approaching train or a dangerous event may be arranged so that they are displayed above the other videos.
  • images may be arranged from the left side of the upper row of the main display 410.
  • the video of the dangerous event may be displayed at a higher priority than the video of the approaching train.
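The priority ordering described above (dangerous event > approaching train > other states) can be sketched as a stable sort over the tiles before layout; the status labels and data structure are illustrative assumptions:

```python
# Lower number = higher priority on the main display 410.
PRIORITY = {"dangerous": 0, "train_approaching": 1, "normal": 2}

def order_for_main_display(videos):
    """videos: list of (camera_id, status) pairs. Returns the camera ids
    ordered so higher-priority videos are placed first (filling the tile
    grid from the top-left). The sort is stable, so ties keep their
    original order."""
    return [cam for cam, status in
            sorted(videos, key=lambda v: PRIORITY.get(v[1], 99))]
```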
  • In the second display method, a highlight display 411 of a video of an approaching train, as shown in FIG. 6, is performed in the same way as in the first display method.
  • The screen then changes to the second display screen 500 shown in FIG. 7.
  • the video determined to be a dangerous event is displayed as an enlarged display 511 on the main display 510 as shown in FIG.
  • the enlarged display 511 uses four normal image display spaces to display one image in an enlarged manner.
  • the thumbnail display 420 on the right side in FIG. 7 is displayed in the same manner as in FIG. 6 .
  • the first display method and the second display method described above may have a function of canceling the emphasis or enlarged display of the dangerous event after detecting the dangerous event. This assumes that safety has been confirmed by a supervisor or the like.
  • the display device 104 may be provided with an interface for this cancellation operation. In this case, once the highlighted or enlarged display is started, the highlighted or enlarged display may be continued until it is manually canceled.
  • Videos in which the blocking is starting and videos of dangerous events may be preferentially displayed on the main displays 410 and 510, while videos in other states are shown in the thumbnail display 420. This allows the videos of particular interest to be displayed on the main displays 410 and 510.
  • The main displays 410 and 510 may exclusively allocate videos in which the blocking is starting and videos in other states to separate tile-shaped display portions. Each tile-shaped display portion can also be allocated for cyclic display.
  • <Effect> By analyzing the image captured by the imaging device, a dangerous event at a railroad crossing can be detected accurately. Since the approach of a train is detected, detection is possible before the blocking is completed, and accuracy is further improved by determining whether a moving object can finish crossing the railroad crossing. Because an imaging device is used, the entire situation inside the crossing can be captured, and the system can be realized without other sensors such as laser irradiation type sensors, reducing cost. The video from the imaging device can also be viewed directly by surveillance personnel to confirm the situation visually, and by emphasizing or enlarging videos of interest on the display device, the display makes it easy for surveillance staff to notice them.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the embodiments described above are described in detail to explain the present invention in an easy-to-understand manner, and the present invention is not necessarily limited to having all the configurations described.
  • It is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • Although the above embodiment has been described with respect to a railroad crossing, the invention is also applicable to other situations in which the predicted arrival time at a virtual line is used; for example, signal control at intersections, automatic door opening/closing control, and elevator door opening/closing control.
  • Although the above embodiment shows an example in which the detection target is tracked using images from a camera, a configuration in which various sensors such as LiDAR and millimeter wave radar are additionally used for detection and tracking may be adopted to improve detection accuracy.
  • ...state analysis unit, 130...moving object, 150...level crossing, 151...blocking rod, 152...crossing alarm, 160...crossing area, 161...boundary line, 170...railway, 301, 302...virtual lines, 400...display screen, 410...main display, 411...highlight display, 420...thumbnail display, 500...second display screen, 510...main display, 511...enlarged display

Abstract

The purpose of the present invention is to provide a surveillance system capable of more accurately detecting dangerous events at railroad crossings. The surveillance system comprises an imaging device and an analysis device. The analysis device includes a state analysis control unit and a state analysis unit. When the state analysis control unit detects the approach of a train, either by analyzing an image acquired by the imaging device or on the basis of information notified from a state notification unit, it instructs the state analysis unit to start analyzing the image within the railroad crossing. The state analysis unit uses the image acquired by the imaging device to detect and track a moving object within the railroad crossing. When it determines, using at least the current moving speed of the moving object, that the detection target cannot reach a set virtual line before the blocking of the railroad crossing is completed, the state analysis unit determines that a dangerous event has occurred and performs alarm processing.

Description

Monitoring system
The present invention relates to a monitoring system, and particularly to a monitoring system that detects a dangerous event based on the results of detecting and tracking a detection target in an image.
For example, automatic crossing alarms and traffic signals are installed at railroad crossings to ensure traffic safety by separating the passage of pedestrians, cars, and trains. However, accidents still occur in which people, cars, wheelchairs, and the like are left inside a crossing and collide with a train. For this reason, laser sensors for detecting pedestrians, cars, and the like remaining inside a crossing are commonly installed to improve safety.
Patent Document 1 discloses a moving object detection device that, when it is determined that a vehicle is approaching a crossing road, determines whether a detected object is a moving object; if the object is determined to be moving, the device predicts the change in the moving object's position and, based on the predicted change, predicts whether the moving object is likely to obstruct the vehicle's travel.
Patent Document 2 discloses a crossing passability judgment support system that determines whether a passerby can pass through a level crossing by comparing the passerby's position and walking speed with level crossing information, and notifies the passerby of the judgment result.
Patent Document 1: Japanese Patent Application Publication No. 2006-111177
Patent Document 2: Japanese Patent Application Publication No. 2016-155405
However, since the laser sensor method detects a dangerous event only after the barrier rod has been lowered, there is little time between detection and alarm before the train must be stopped. Installing surveillance cameras and visually checking the video is another option, but video monitoring places a heavy burden on observers, who must watch many crossings at the same time.
Patent Document 1, on the other hand, presupposes the use of a laser irradiation type sensor. In this case, the range that can be detected by laser irradiation is limited, and the information obtained from the laser is also limited. In addition, laser irradiation type sensors are more expensive than cameras. Furthermore, a separate camera must be installed for an observer to visually check the situation inside the crossing.
Patent Document 2 presupposes that the passerby's position is acquired from a mobile terminal carried by the passerby. A passerby who does not carry a compatible mobile terminal therefore cannot be detected.
In view of the above problems, an object of the present invention is to provide a monitoring system that can more accurately detect dangerous events inside a railroad crossing.
To achieve the above object, a representative monitoring system of the present invention includes an imaging device and an analysis device. The analysis device has a state analysis control unit and a state analysis unit. When the state analysis control unit detects that a train is approaching, either by analyzing the image acquired by the imaging device or from information notified by a state notification unit, it instructs the state analysis unit to start analyzing the image inside the railroad crossing. The state analysis unit detects and tracks a moving object within the railroad crossing using the image acquired by the imaging device, and when it determines, using at least the current moving speed of the moving object, that the detection target cannot reach a set virtual line before the blocking of the railroad crossing is completed, it determines that a dangerous event has occurred and performs alarm processing.
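The core determination can be sketched as follows. A constant-speed model is assumed here for simplicity (the description also mentions other determination methods), and all names are illustrative:

```python
def dangerous_event(distance_to_virtual_line_m: float,
                    current_speed_mps: float,
                    time_until_blocking_complete_s: float) -> bool:
    """Dangerous if, at its current moving speed, the moving object cannot
    reach the set virtual line before the blocking of the railroad
    crossing is completed."""
    if current_speed_mps <= 0.0:
        return True  # stopped (or moving backwards): cannot clear in time
    time_to_reach_s = distance_to_virtual_line_m / current_speed_mps
    return time_to_reach_s > time_until_blocking_complete_s
```

For example, an object 10 m from the virtual line moving at 0.5 m/s needs 20 s; if the blocking completes in 10 s, a dangerous event is reported.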
According to the present invention, a dangerous event inside a railroad crossing can be detected more accurately by a monitoring system.
Problems, configurations, and effects other than those described above will be clarified by the following embodiments.
FIG. 1 is a block diagram of a computer system for implementing aspects according to embodiments of the present disclosure.
FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
FIG. 4 shows an example of a flowchart of the processing of the monitoring system of the present invention.
FIG. 5 is a diagram for explaining the determination method of the monitoring system of the present invention.
FIG. 6 is a diagram showing a first display example in the monitoring system of the present invention.
FIG. 7 is a diagram showing a second display example in the monitoring system of the present invention.
A mode for carrying out the present invention will be described below.
<Computer system for implementing aspects according to embodiments>
FIG. 1 is a block diagram of a computer system 1 for implementing aspects according to embodiments of the present disclosure. The mechanisms and apparatus of the various embodiments disclosed herein may be applied to any suitable computing system. The main components of the computer system 1 include one or more processors 2, a memory 4, a terminal interface 12, a storage interface 14, an I/O (input/output) device interface 16, and a network interface 18. These components may be interconnected via a memory bus 6, an I/O bus 8, a bus interface unit 9, and an I/O bus interface unit 10.
The computer system 1 may include one or more processing devices 2A and 2B, collectively referred to as processors 2. Each processor 2 executes instructions stored in the memory 4 and may include an onboard cache. In some embodiments, the computer system 1 may include multiple processors, and in other embodiments it may be a system with a single processing device. As the processing device, a CPU (Central Processing Unit), FPGA (Field-Programmable Gate Array), GPU (Graphics Processing Unit), DSP (Digital Signal Processor), or the like can be applied.
In some embodiments, the memory 4 may include random access semiconductor memory, storage devices, or storage media (either volatile or non-volatile) for storing data and programs. In some embodiments, the memory 4 represents the entire virtual memory of the computer system 1 and may include the virtual memory of other computer systems connected to the computer system 1 via a network. Although the memory 4 may be conceptually regarded as a single entity, in other embodiments it may be a more complex arrangement, such as a hierarchy of caches and other memory devices. For example, the memory may exist as multiple levels of caches, and these caches may be divided by function, so that one cache holds instructions while another holds non-instruction data used by the processor. The memory may also be distributed and associated with a variety of different processing devices, as in a so-called NUMA (Non-Uniform Memory Access) computer architecture.
The memory 4 may store all or some of the programs, modules, and data structures that perform the functions described herein. For example, the memory 4 may store a latent factor identification application 50. In some embodiments, the latent factor identification application 50 may include instructions or descriptions that execute the functions described below on the processor 2, or instructions or descriptions that are interpreted by other instructions or descriptions. In some embodiments, the latent factor identification application 50 may be implemented in hardware via semiconductor devices, chips, logic gates, circuits, circuit cards, and/or other physical hardware devices, instead of or in addition to a processor-based system. In some embodiments, the latent factor identification application 50 may include data other than instructions or descriptions. In some embodiments, a camera, sensor, or other data input device (not shown) may be provided to communicate directly with the bus interface unit 9, the processor 2, or other hardware of the computer system 1. Such a configuration may reduce the need for the processor 2 to access the memory 4 and the latent factor identification application.
The computer system 1 may include a bus interface unit 9 that handles communication among the processor 2, the memory 4, the display system 24, and the I/O bus interface unit 10. The I/O bus interface unit 10 may be coupled to an I/O bus 8 for transferring data to and from the various I/O units, and may communicate via the I/O bus 8 with the plurality of I/O interface units 12, 14, 16, and 18, also known as I/O processors (IOPs) or I/O adapters (IOAs). The display system 24 may include a display controller, a display memory, or both. The display controller can provide video data, audio data, or both to a display device 26. The computer system 1 may also include devices such as one or more sensors configured to collect data and provide that data to the processor 2; for example, environmental sensors that collect humidity, temperature, and pressure data, and motion sensors that collect acceleration and motion data. Other types of sensors can also be used. The display memory may be a dedicated memory for buffering video data. The display system 24 may be connected to a display device 26 such as a standalone display screen, a television, a tablet, or a handheld device. In some embodiments, the display device 26 may include a speaker for rendering audio; alternatively, a speaker for rendering audio may be connected to an I/O interface unit. In other embodiments, the functions provided by the display system 24 may be realized by an integrated circuit that includes the processor 2. Similarly, the functions provided by the bus interface unit 9 may be realized by an integrated circuit that includes the processor 2.
Each I/O interface unit has the function of communicating with various storage or I/O devices. For example, the terminal interface unit 12 allows the attachment of user I/O devices 20, such as user output devices (a video display device, speakers, a television, or the like) and user input devices (a keyboard, mouse, keypad, touch pad, trackball, buttons, light pen, or other pointing device). By operating the user input devices through the user interface, a user can enter input data and instructions into the user I/O device 20 and the computer system 1, and receive output data from the computer system 1. The user interface may be, for example, displayed on a display device, played through a speaker, or printed via a printer through the user I/O device 20.
The storage interface 14 allows the attachment of one or more disk drives or a direct access storage device 22 (typically a magnetic disk drive storage device, although it may be an array of disk drives or another storage device configured to appear as a single disk drive). In some embodiments, the storage device 22 may be implemented as any secondary storage device. The contents of the memory 4 may be stored in the storage device 22 and read from the storage device 22 as needed. The network interface 18 may provide a communication path so that the computer system 1 and other devices can communicate with each other; this communication path may be, for example, a network 30.
The computer system 1 shown in FIG. 1 has a bus structure that provides a direct communication path between the processor 2, the memory 4, the bus interface 9, the display system 24, and the I/O bus interface unit 10, but in other embodiments the computer system 1 may include point-to-point links in hierarchical, star, or web configurations, multiple hierarchical buses, or parallel or redundant communication paths. Furthermore, although the I/O bus interface unit 10 and the I/O bus 8 are shown as single units, the computer system 1 may in practice include multiple I/O bus interface units 10 or multiple I/O buses 8. Although multiple I/O interface units are shown separating the I/O bus 8 from the various communication paths leading to the various I/O devices, in other embodiments some or all of the I/O devices may be connected directly to a single system I/O bus.
In some embodiments, the computer system 1 may be a device that receives requests from other computer systems (clients) and has no direct user interface, such as a multi-user mainframe computer system, a single-user system, or a server computer. In other embodiments, the computer system 1 may be a desktop computer, portable computer, laptop, tablet computer, pocket computer, telephone, smartphone, or any other suitable electronic device.
<Block diagram of the monitoring system>
FIG. 2 is a block diagram showing an example of the configuration of the monitoring system of the present invention.
The monitoring system 100 shown in FIG. 2 includes an imaging device 101, an analysis device 102, an alarm device 103, a display device 104, and a state notification unit 105.
The imaging device 101 is configured by, for example, a network camera, and is used to acquire video of a monitored area such as a railroad crossing. The imaging device 101 can use a camera configuration in which incident light is focused onto an image sensor through a lens and an aperture to obtain information. Examples of the image sensor include a CCD (Charge-Coupled Device) image sensor and a CMOS (Complementary Metal Oxide Semiconductor) image sensor. The imaging device 101 captures video at, for example, 3 frames per second (3 fps) or more, and the video is sent to the analysis device 102. A plurality of imaging devices 101 can be installed depending on the situation, and visible light cameras, infrared cameras, and the like can be used.
The analysis device 102 includes a state analysis control unit 106 and a state analysis unit 107. The analysis device 102 is realized, for example, by a computer equipped with a processor such as a CPU, GPU, FPGA, or DSP; the computer system 1 of FIG. 1 can be applied, for example. The state analysis control unit 106 controls when the state analysis unit 107 starts detecting dangerous events, based on the video obtained by the imaging device 101 and the train approach information notified from the state notification unit 105. The state analysis unit 107 analyzes the video obtained by the imaging device 101 and predicts whether a dangerous event will occur, such as a moving object (a person, a car, or the like) being left inside the crossing. Specific determination methods will be described later.
The alarm device 103 is configured by, for example, a speaker, and when the state analysis unit 107 detects a dangerous event, it audibly notifies the area around the crossing of the danger. In another aspect, it is configured by a protective radio device and transmits an emergency stop signal to the relevant train or the like when the state analysis unit 107 detects a dangerous event.
The display device 104 is a display device that can display the surveillance video captured by the imaging device 101, and can also display the details of a dangerous event detected by the state analysis unit 107. The display device 104 can be configured with, for example, a liquid crystal display or an organic EL (OLED) display, and may be accompanied by operation means such as a switcher, keyboard, or mouse. The display device 104 may aggregate the video from each imaging device 101 at a monitoring center or the like, and may be located either at the same place as the analysis device 102 or elsewhere. In the example of FIG. 1, the display device 26 can serve as the display device 104.
The state notification unit 105 is configured using a crossing controller, an ATS (Automatic Train Stop), or the like. The state notification unit 105 detects the approach of a train to the relevant crossing and notifies the state analysis control unit 106 of the approach information. The approach can be determined, for example, from the train's position relative to the crossing, or from whether the estimated time until the train reaches the crossing has fallen within a predetermined value.
<Application example of the monitoring system>
FIG. 3 is a conceptual diagram showing an example of application of the monitoring system of the present invention.
FIG. 3 shows a moving object 130, exemplified by a wheelchair, crossing the crossing area 160 of a railroad crossing 150. The crossing area 160 is the area between the front and rear blocking rods 151 and crosses the track 170. When the moving object 130 passes the forward boundary line 161, it leaves the crossing area 160 and reaches a safe area. The boundary line 161 of the crossing area 160 is the boundary near the forward blocking rod 151.
The imaging device 101 can be installed on the crossing alarm 152 or the like, and is installed so that it can capture the entire crossing area 160. For this reason, it is preferably installed at a relatively high position, for example 2 m or more, above a person's height. Instead of a single unit, multiple imaging devices 101 may be installed, for example diagonally opposite each other, to capture the entire railroad crossing 150 including the crossing area 160 from multiple angles.
The alarm device 103 is installed near the railroad crossing 150 and can issue an alarm toward the crossing 150 and people in its vicinity. In the example of FIG. 3, it is installed on the crossing warning device 152.
The analysis device 102 can be installed at a location away from the railroad crossing 150. The state analysis unit 107 may be installed near the crossing 150 or integrated with the analysis device 102.
The display device 104 can also be installed at a location away from the railroad crossing 150; for example, its output can be aggregated at a monitoring center or the like together with the video from the imaging devices 101 at other crossings.
FIG. 3 illustrates a case in which a dangerous event occurs: the moving object 130 is at risk of being stranded in the crossing area 160 of the railroad crossing 150. In this case, the analysis device 102 analyzes the video captured by the imaging device 101 and determines that a dangerous event is occurring. Based on this, the alarm device 103 issues a predetermined alarm, and a supervisor can check the video determined to show the dangerous event on the display device 104.
<Flowchart>
FIG. 4 shows an example of a flowchart of the processing of the monitoring system of the present invention. The processing flow of the monitoring system 100 will be explained with reference to FIG. 4.
First, in step S201, the state analysis control unit 106 detects the approach of a train. One method of detection here is to detect the start of the warning bell or the start of barrier lowering at the railroad crossing 150. Since the imaging device 101 constantly transmits crossing surveillance video to the state analysis control unit 106, the unit can detect the start of barrier lowering by image analysis. Examples of detecting it from the surveillance video include detecting the flashing of the warning lights of the crossing warning device 152 from changes in color and brightness, and detecting the movement of the barrier 151. Alternatively, the train's approach may be detected by the status notification unit 105 acquiring information from a crossing controller, ATS, or the like. When a train approach is detected, the state analysis control unit 106 instructs the state analysis unit 107 to start analyzing the crossing state.
Next, in step S202, the state analysis unit 107 detects and tracks moving objects 130 inside the railroad crossing 150, such as people, cars, bicycles, motorcycles, wheelchairs, strollers, canes, and white canes. Detection and tracking here can be realized by combining machine learning techniques, pattern matching, and so on, based on the video captured by the imaging device 101. The machine learning can be performed by image analysis using AI (artificial intelligence), for example with deep learning; pattern matching can be performed by image analysis, for example by comparison against template images. Once a moving object 130 has been detected, its position shifts from frame to frame on the screen; machine learning or pattern matching can then be used to judge whether an object in a later frame is the same as the detected moving object 130 and to track its movement history. Algorithms other than machine learning and pattern matching may also be used to detect and track the moving object 130.
At this time, it is preferable to assign each moving object an identification ID that identifies the detection target. Furthermore, each moving object may be assigned a detection target type ID specifying its type, such as a person, car, bicycle, motorcycle, wheelchair, stroller, cane, or white cane. Associating IDs with detection targets in this way is useful for the determinations described later and for subsequent analysis of dangerous events.
Next, in step S203, the speed of the moving object 130 is estimated. The speed is estimated by the state analysis unit 107 through image analysis of the video captured by the imaging device 101. Specific examples of the speed estimation are given in the determination methods described later.
Next, in step S204, it is determined whether the moving object 130 can cross the railroad crossing 150. This determination can be made by the state analysis unit 107, for example using the speed estimated in step S203. Whether the moving object 130 can cross can be determined by, for example, whether it can pass through the crossing area 160 before the barriers finish closing. Specific examples of determination methods are described later. If step S204 determines that the moving object 130 can cross the railroad crossing 150, the process ends; if crossing is impossible (true), the process advances to step S205, and the state analysis unit 107 transmits information that a dangerous event has occurred.
In step S205, alarm processing is performed. For example, the alarm device 103 announces the danger around the railroad crossing 150 by voice or the like. In addition, the display device 104 may display a warning that a dangerous event has occurred. These actions can be performed upon receiving the dangerous-event information from the state analysis unit 107.
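The flow of steps S201 through S205 can be sketched as follows. This is a minimal illustration only: `run_cycle`, `judge`, and `raise_alarm` are hypothetical stand-ins for the processing performed by the state analysis control unit 106 and state analysis unit 107, not functions defined in this disclosure.

```python
# Skeleton of the S201-S205 loop; detection and judgment internals are
# injected as callables, since the real system performs them by image analysis.

def run_cycle(train_approaching, tracked_objects, judge, raise_alarm):
    """One monitoring pass for a single crossing.

    train_approaching: result of the S201 train-approach detection.
    tracked_objects:   moving objects from the S202 detection/tracking step.
    judge(obj):        S203/S204 test -- True if the object can finish crossing.
    raise_alarm(obj):  S205 action (alarm device 103 / display device 104).
    """
    if not train_approaching:                  # S201: no train -> nothing to do
        return []
    flagged = [obj for obj in tracked_objects  # S202: per tracked object
               if not judge(obj)]              # S203/S204: crossing test failed
    for obj in flagged:
        raise_alarm(obj)                       # S205: report the dangerous event
    return flagged
```

The skeleton returns the flagged objects so that a caller (for example, the display side) can highlight the corresponding video feeds.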
<First Determination Method>
FIG. 5 is a diagram for explaining a determination method of the monitoring system of the present invention. The first determination method will be explained with reference to FIG. 5.
FIG. 5 shows a state in which a moving object 130 (a wheelchair in this example) is partway through crossing the railroad crossing 150. A virtual line 301 is defined near the rear boundary, in the crossing direction, of the crossing area 160 of the railroad crossing 150, and a virtual line 302 near the front boundary; these can be identified by image analysis. The moving object 130 is within the crossing area 160, the current time is t_n, and its current position is P_n(x(t_n), y(t_n)).
The first determination method judges crossing to be impossible (true) when the time for the moving object 130 to finish crossing the railroad crossing 150 is predicted to be longer than the remaining time until the barrier 151 of the crossing 150 closes.
Let P_G(x_G, y_G) be the point at which the moving object 130 reaches the front virtual line 302. The distance d(t_n) from the current position P_n(x(t_n), y(t_n)) at time t_n to the goal point P_G(x_G, y_G) is given by Equation 1:

    d(t_n) = √{(x_G − x(t_n))² + (y_G − y(t_n))²}    (Equation 1)

Here, the goal point P_G(x_G, y_G) can be taken, for example, as the point on the front virtual line 302 nearest to the current position P_n(x(t_n), y(t_n)). Alternatively, since the most recent direction of movement can be calculated from the displacement from the preceding position, the intersection of the extension of that movement vector with the virtual line 302 may be taken as the goal point P_G.
Next, the current moving speed v(t_n) is given by Equation 2:

    v(t_n) = √{(x(t_n) − x(t_{n−1}))² + (y(t_n) − y(t_{n−1}))²} / (t_n − t_{n−1})    (Equation 2)

That is, v(t_n) is obtained from the distance moved between the position P_{n−1}(x(t_{n−1}), y(t_{n−1})) at the immediately preceding time t_{n−1} and the position P_n(x(t_n), y(t_n)) at the current time t_n, divided by the time difference. The preceding time t_{n−1} is a predetermined time before the current time t_n, and may for example be a predetermined frame interval, such as one frame or several frames earlier.
Equation 3 is the determination condition:

    d(t_n) / v(t_n) > τ    (Equation 3)

Whether Equation 3 is true or false is judged using the distance d(t_n) from Equation 1 and the moving speed v(t_n) from Equation 2. Here τ is the remaining time from the current time t_n until the railroad crossing 150 finishes closing. Because the time from the start of barrier lowering to its completion is fixed in advance for a given crossing, τ can be calculated by the state analysis unit 107 from that crossing's information.
When the condition of Equation 3 is satisfied, the moving object 130 will not reach the virtual line 302 before the railroad crossing 150 finishes closing. The condition is therefore true, the moving object 130 cannot finish crossing the railroad crossing 150, and a dangerous event can be determined to have occurred.
Note that the above example performs the determination only for the virtual line 302 ahead of the moving object 130 in its direction of movement, but the determination may also be performed for the virtual line 301 behind it. Moreover, the distance d(t_n) may be computed to both the rear virtual line 301 and the front virtual line 302, with crossing judged impossible only when the condition is true for both; this also accounts for the possibility that the moving object 130 could turn back.
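A minimal sketch of the first determination method (Equations 1 to 3) follows. It assumes the goal point P_G on the virtual line is supplied by the caller; the function names are illustrative, and in practice the positions would come from the image-analysis tracking of step S202.

```python
import math

def distance_to_goal(pos, goal):
    """Equation 1: Euclidean distance d(t_n) from P_n to the goal point P_G."""
    return math.hypot(goal[0] - pos[0], goal[1] - pos[1])

def current_speed(prev_pos, pos, dt):
    """Equation 2: speed v(t_n) from the displacement over the interval dt."""
    return math.hypot(pos[0] - prev_pos[0], pos[1] - prev_pos[1]) / dt

def cannot_cross(pos, prev_pos, goal, dt, tau):
    """Equation 3: true (crossing impossible) if d(t_n) / v(t_n) > tau,
    where tau is the remaining time until the crossing finishes closing.
    A lost or stationary target (v = 0) is handled as in the third method."""
    d = distance_to_goal(pos, goal)
    v = current_speed(prev_pos, pos, dt)
    if v == 0:
        return d > 0  # time to reach the line is effectively infinite
    return d / v > tau
```

For example, an object 6 m from the virtual line moving at 0.5 m/s needs 12 s to reach it; with τ = 10 s remaining, d/v > τ holds and the case is judged a dangerous event.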
<Second Determination Method>
The second determination method judges whether the amount of movement of the detected object (the moving object 130) over a certain period is at or below a threshold under predetermined conditions. In that case the object can be assumed to be turning back, wandering, or moving slowly, so crossing is judged impossible (true). Specifically, this can be expressed by Equation 4, the conjunction of the three conditions explained below as Equations 4-1 to 4-3:

    Σ_{t=t_m}^{t_n} v(t) ≤ V_th  ∧  τ ≤ T_th1  ∧  M(x(t_n), y(t_n)) = 1    (Equation 4)
Of Equation 4, the first condition, Equation 4-1, is as follows:

    Σ_{t=t_m}^{t_n} v(t) ≤ V_th    (Equation 4-1)

Here, v(t_n) is the speed at the current time obtained from Equation 2, and V_th is a threshold on the amount of movement from time t = t_m to t = t_n. The time t_m is a time before the current time t_n, for example 2 seconds, 5 seconds, or even 10 seconds earlier; that is, a value roughly between 1 and 10 seconds before the current time can be used. Equation 4-1 states that the total amount of movement from the earlier time t_m to the current time t_n is at or below a predetermined threshold; the condition is met when the movement within the given period is at or below that amount. For simplicity, Equation 4-1 uses the total amount of movement, but a velocity vector that takes the direction of movement into account may be used instead. In that case the resultant vector represents the direction and distance of movement between the position P_m(x(t_m), y(t_m)) at time t = t_m and the position P_n(x(t_n), y(t_n)) at time t = t_n, so when the travel distance is short the object can be regarded as lingering near the position P_m.
Of Equation 4, the next condition, Equation 4-2, is as follows:

    τ ≤ T_th1    (Equation 4-2)

Here, τ is the time from the current time t_n until the railroad crossing 150 finishes closing, and T_th1 is a threshold on that remaining time. That is, when little time remains before the crossing 150 is fully closed the situation is dangerous, and this condition is satisfied.
Of Equation 4, the last condition, Equation 4-3, is as follows:

    M(x(t_n), y(t_n)) = 1    (Equation 4-3)

Here, M(x(t_n), y(t_n)) indicates whether the position of the detection target at the current time lies within the detection area. The detection area M is assumed to be, for example, the crossing area 160, i.e. the region between the virtual lines 301 and 302 shown in FIG. 5. M(x, y) takes the value 1 inside the detection area and 0 elsewhere, so Equation 4-3 holds, and the condition is satisfied, when the detection target (the moving object 130) is within the detection area M. Note that the detection area M(x, y) may be set to a predetermined area within the crossing, or may be set to also include prohibited areas such as track areas outside the crossing.
Equation 4 is judged true, and a dangerous event is determined to have occurred, when Equations 4-1, 4-2, and 4-3 are all satisfied. However, the degree of danger may be indicated in stages when only some of the three conditions are satisfied: for example, satisfying Equations 4-1 and 4-3 may be treated as a pre-danger stage, with a dangerous event determined only when Equation 4-2 is additionally satisfied.
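The three conditions of Equation 4 can be sketched as follows. The threshold values `v_th` and `t_th1` are illustrative placeholders (the disclosure leaves them to be predetermined), and a per-frame speed history stands in for the accumulated movement of Equation 4-1.

```python
def lingering_danger(speed_history, tau, in_detection_area,
                     v_th=1.0, t_th1=15.0):
    """Equation 4: a dangerous event requires all three sub-conditions.

    speed_history:     per-frame movement amounts v(t) over [t_m, t_n].
    tau:               remaining seconds until the crossing finishes closing.
    in_detection_area: whether P_n lies inside the area M (Equation 4-3).
    """
    cond_41 = sum(speed_history) <= v_th  # Eq. 4-1: little total movement
    cond_42 = tau <= t_th1                # Eq. 4-2: closing completes soon
    cond_43 = in_detection_area           # Eq. 4-3: M(x(t_n), y(t_n)) = 1
    return cond_41 and cond_42 and cond_43
```

The staged-danger variant described above would report a warning level when only `cond_41` and `cond_43` hold, escalating when `cond_42` also becomes true.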
<Third Determination Method>
The third determination method judges crossing impossible (true), on the assumption that some abnormality has occurred, when tracking of the object is lost. This corresponds, for example, to the moving object 130 leaving the detection area by entering the track outside the crossing area 160 of the railroad crossing 150, or being hidden behind a structure such as a fence or utility pole. In such cases the imaging device 101 cannot capture the detection target and tracking may be lost. The target is then regarded as not having moved from the position where it was last detected and is treated as v = 0. When the state v = 0 continues for a predetermined time T_th2, crossing is judged impossible (true) and a dangerous event is determined to have occurred. The predetermined time T_th2 can be set in advance.
The approach of treating lost tracking as v = 0 can also be applied to the first and second determination methods. In the first determination method, setting v(t_n) = 0 in Equation 3 makes the left-hand side infinite, so the condition of Equation 3 is satisfied and a dangerous event can be determined. In the second determination method, setting v(t_n) = 0 in Equation 4 makes the left-hand side of Equation 4-1 zero, so the condition of Equation 4 is satisfied and a dangerous event can likewise be determined.
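A minimal sketch of the tracking-loss timer of the third determination method; the default `t_th2` is an illustrative value, since the disclosure only states that T_th2 is set in advance.

```python
class LossTimer:
    """Third method: when tracking is lost, hold the last known position,
    treat the speed as v = 0, and flag danger once that state has
    persisted for T_th2 seconds."""

    def __init__(self, t_th2=5.0):   # T_th2: illustrative preset value
        self.t_th2 = t_th2
        self.lost_for = 0.0

    def update(self, detected, dt):
        """Advance by one frame of duration dt; return True on danger."""
        self.lost_for = 0.0 if detected else self.lost_for + dt
        return self.lost_for >= self.t_th2
```

Re-detecting the target resets the timer, so a brief occlusion behind a pole does not by itself trigger an alarm.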
<Fourth Determination Method>
The fourth determination method uses the detection target type described in step S202. The detection target type can be determined by image analysis using AI (artificial intelligence), for example with deep learning, and the determination is then made according to the type. Examples of detection target types include a person (adult, child, elderly person), car, bicycle, motorcycle, wheelchair, stroller, cane, white cane, and so on. For example, the thresholds used in the second and third determination methods (V_th, T_th1, T_th2) may be set in advance to different values for each detection target type.
Furthermore, for each detection target type, the time to reach the virtual line 301 or 302 may be predicted by machine learning from the current moving speed v(t_n) and position P_n(x(t_n), y(t_n)). If the predicted arrival time is later than the time at which the railroad crossing 150 finishes closing, or if the probability of its being later exceeds a predetermined value, a dangerous event may be determined to have occurred. In this case, the arrival time for a given type can be predicted using past data from the crossing in question or data from other crossings.
A dangerous event may also be determined when a specific event occurs for a given detection target type. For a car, for example, if it remains stopped within the crossing for a predetermined time, the vehicle ahead may be stuck in congestion or the car itself may be unable to move due to a breakdown or other trouble. For a wheelchair, if it has stopped near a rail groove or at the track-side edge of the crossing zone, a wheel may have dropped into the groove, leaving it stuck. For a sufficiently small child alone in the crossing, the child may be unable to finish crossing on their own judgment. In such cases, regardless of the first to third determination methods, the situation may be judged a dangerous event, or a state with a high likelihood of becoming one.
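The per-type threshold selection of the fourth determination method might be sketched as a simple lookup table; all numeric values here are illustrative assumptions, not values from the disclosure.

```python
# Per-type thresholds (V_th, T_th1, T_th2) for the second and third methods.
# Slower types such as wheelchairs get a looser movement threshold and an
# earlier warning time -- the specific numbers are invented for illustration.
TYPE_THRESHOLDS = {
    "adult":      {"v_th": 1.0, "t_th1": 10.0, "t_th2": 5.0},
    "wheelchair": {"v_th": 0.5, "t_th1": 20.0, "t_th2": 8.0},
    "car":        {"v_th": 2.0, "t_th1": 10.0, "t_th2": 5.0},
}

def thresholds_for(target_type):
    """Look up the threshold set for a detected object type,
    falling back to the adult profile for unknown types."""
    return TYPE_THRESHOLDS.get(target_type, TYPE_THRESHOLDS["adult"])
```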
<Display Examples>
FIG. 6 is a diagram showing a display example on a first screen in the monitoring system of the present invention. FIG. 7 is a diagram showing a display example on a second screen in the monitoring system of the present invention.
The first screen shown in FIG. 6 and the second screen shown in FIG. 7 are displayed by the display device 104. They show examples of emphasizing or enlarging the video requiring attention among a plurality of displayed surveillance videos. Any display method is possible, but the following configurations help a supervisor notice danger.
The first display method is shown in FIG. 6. On the display screen 400 of FIG. 6, a main display 410 in which a plurality of videos, one per imaging device 101, are arranged in tiles is shown on the left; that is, videos of different crossings are displayed side by side. The main display 410 of FIG. 6 shows an example with nine videos in a three-by-three grid. To the right of the main display 410 is a thumbnail display 420, which occupies a smaller area than the main display 410 and in which each individual video is also smaller. The thumbnail display 420 shows, for example, the videos from imaging devices 101 that cannot all be shown on the main display 410. In FIG. 6, the thumbnail display 420 shows an example with three videos arranged vertically.
Two display stages can be considered. One is the stage at which the approach of a train is detected in step S201 of FIG. 4 (alternatively, the start of barrier lowering); the other is the stage at which a dangerous event is determined in step S204 of FIG. 4. Relative to the normal state, the train-approach stage and the dangerous-event stage can each be highlighted differently. For example, when a train approach is detected, the video from the corresponding imaging device 101 may be emphasized with some form of highlight, and that video may be moved from the thumbnail display 420 of FIG. 6 to the main display 410. Then, when a dangerous event is determined, a specific highlight display 411 is applied. The highlight display 411 distinguishes the warnings by colored frames, for example a yellow frame for a train approach and a red frame for a dangerous event.
Videos that do not fit on the main display 410 may likewise be highlighted on the thumbnail display 420 in the same manner as the highlight display 411. Videos in the train-approach or dangerous-event state may also be arranged so that they appear above the other videos, for example starting from the upper left of the main display 410; in this case dangerous-event videos may be given higher priority than train-approach videos.
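The priority ordering described above (dangerous-event videos above train-approach videos, which in turn rank above normal videos) can be sketched as a stable sort; the state labels and function name are hypothetical.

```python
# Lower number = higher display priority on the tile grid (assumed scheme).
PRIORITY = {"danger": 0, "approaching": 1, "normal": 2}

def order_feeds(feeds):
    """feeds: list of (camera_id, state) pairs. Returns camera IDs ordered
    highest-priority first, i.e. starting from the top-left tile.
    sorted() is stable, so feeds with equal priority keep their order."""
    return [cam for cam, state in sorted(feeds, key=lambda f: PRIORITY[f[1]])]
```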
In the second display method, the highlight display 411 of videos with an approaching train is performed as in the first display method, as shown in FIG. 6. When a dangerous event is determined, however, the screen transitions to a second display screen 500 as shown in FIG. 7, on which the video determined to show the dangerous event appears on the main display 510 as an enlarged display 511. The enlarged display 511 shows one video enlarged, for example across the space of four normal video tiles as in FIG. 7. The thumbnail display 420 on the right of FIG. 7 is shown in the same way as in FIG. 6.
In both the first and second display methods, a function may be provided for canceling the emphasis or enlargement after a dangerous event has been detected, for example when safety has been confirmed by a supervisor. The display device 104 may provide an interface for this cancellation operation; in that case, once emphasis or enlargement has started, it may be continued until it is manually canceled.
Also, in both display methods, videos of crossings whose barriers are closing and videos of dangerous events may be shown preferentially on the main displays 410 and 510, with videos in other states shown on the thumbnail display 420, so that the videos deserving particular attention appear on the main displays. The main displays 410 and 510 may also dedicate specific tile positions to closing-state videos and to videos in other states, and individual tile positions can likewise be assigned to cyclic display.
<Effects>
According to the above embodiment, dangerous events at a railroad crossing can be accurately detected by image analysis of the video from an imaging device. Detecting the approach of a train allows detection before the crossing has finished closing, and determining whether a moving object can finish crossing makes the detection accurate. Because an imaging device is used, the state of the entire inside of the crossing can be detected, and the system can be realized without other sensors such as laser-scanning sensors, keeping costs down. Furthermore, a supervisor can directly confirm the situation visually from the imaging device's video, and by emphasizing or enlarging the noteworthy video among the surveillance videos, the display device makes it easy for the supervisor to notice that video quickly.
The present invention is not limited to the embodiments described above and includes various modifications. For example, the above embodiments are described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment can be replaced with the configuration of another, the configuration of one embodiment can be added to that of another, and part of the configuration of each embodiment can have other configurations added, deleted, or substituted.
 For example, although the embodiments above describe a railroad crossing, the invention is also applicable to other situations that use a predicted arrival time to a virtual line, such as signal control at intersections, automatic door opening/closing control, and elevator door opening/closing control.
 Furthermore, although the embodiments above show an example in which the detection target is tracked using video from a camera, various sensors such as LiDAR or millimeter-wave radar may additionally be used for detection and tracking to improve detection accuracy.
DESCRIPTION OF SYMBOLS: 1...computer system, 2...processor, 2A, 2B...processing device, 4...memory, 6...memory bus, 8...I/O bus, 9...bus interface unit, 10...I/O bus interface unit, 12...terminal interface unit, 14...storage interface, 16...I/O device interface, 18...network interface, 20...user I/O device, 22...storage device, 24...display system, 26...display device, 30...network, 50...latent factor identification application, 100...monitoring system, 101...imaging device, 102...analysis device, 103...alarm device, 104...display device, 105...status notification unit, 106...state analysis control unit, 107...state analysis unit, 130...moving object, 150...railroad crossing, 151...crossing barrier rod, 152...crossing alarm, 160...crossing area, 161...boundary line, 170...railway track, 301, 302...virtual line, 400...display screen, 410...main display, 411...highlight display, 420...thumbnail display, 500...second display screen, 510...main display, 511...enlarged display

Claims (6)

  1.  A monitoring system comprising an imaging device and an analysis device,
     wherein the analysis device includes a state analysis control unit and a state analysis unit,
     the state analysis control unit, upon detecting that a train is approaching based on analysis of the video acquired by the imaging device or on information notified from a status notification unit, instructs the state analysis unit to start analyzing the video inside the railroad crossing, and
     the state analysis unit detects and tracks a moving object inside the railroad crossing using the video acquired by the imaging device, and, using at least the current moving speed of the moving object, determines that a dangerous event has occurred and performs alarm processing when it determines that the detection target cannot reach a preset virtual line before the closing of the crossing barrier is completed.
  2.  The monitoring system according to claim 1, wherein
     the state analysis unit tracks, for each detection target, an ID identifying the detection target linked to an ID identifying its type, the type being determined by image analysis, and
     the types include at least a person, an automobile, a bicycle, a wheelchair, a stroller, a cane, and a white cane.
  3.  The monitoring system according to claim 1, wherein
     the state analysis unit calculates the time for the moving object to reach the virtual line from the current moving speed of the moving object and the distance from the current position of the moving object to the virtual line, and determines that a dangerous event has occurred when this time is greater than the remaining time until the closing of the crossing barrier is completed.
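The judgment in claim 3 can be illustrated with a minimal sketch (not taken from the patent; all names, units, and the handling of a stopped object are illustrative assumptions):

```python
def is_dangerous(distance_to_line_m: float,
                 current_speed_mps: float,
                 time_until_barrier_closed_s: float) -> bool:
    """Danger check per claim 3: predicted arrival time at the virtual
    line (distance / current speed) is compared against the remaining
    time until the crossing barrier finishes closing."""
    if current_speed_mps <= 0.0:
        # Assumption: a stopped or retreating object cannot reach the
        # virtual line in time, so it is treated as a dangerous event.
        return True
    arrival_time_s = distance_to_line_m / current_speed_mps
    return arrival_time_s > time_until_barrier_closed_s

# Example: an object 8 m from the virtual line moving at 1.0 m/s,
# with the barrier closing in 5 s, cannot exit in time (8 s > 5 s).
print(is_dangerous(8.0, 1.0, 5.0))
```

In the actual system the distance and speed would come from the state analysis unit's per-frame tracking of the moving object, and the remaining barrier-closing time from the crossing control equipment.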
  4.  The monitoring system according to claim 1, wherein
     the state analysis unit determines that a dangerous event has occurred when the moving object is inside the railroad crossing, the amount of movement of the moving object at the current time is less than or equal to a predetermined threshold, and the time until the closing of the crossing barrier is completed is within a predetermined threshold.
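Claim 4 covers the case of an object that has stalled inside the crossing. A sketch of this three-condition check (threshold values and names are illustrative assumptions, not specified in the patent):

```python
def is_stalled_danger(inside_crossing: bool,
                      movement_m: float,
                      time_until_barrier_closed_s: float,
                      movement_threshold_m: float = 0.1,
                      time_threshold_s: float = 10.0) -> bool:
    """Danger check per claim 4: the object is inside the crossing,
    its recent movement is at or below a threshold (i.e. it appears
    stopped), and barrier closing is imminent."""
    return (inside_crossing
            and movement_m <= movement_threshold_m
            and time_until_barrier_closed_s <= time_threshold_s)

# Example: an object inside the crossing that moved only 0.05 m this
# interval, with the barrier closing in 8 s, triggers the alarm.
print(is_stalled_danger(True, 0.05, 8.0))
```

Unlike the claim 3 check, no arrival-time prediction is made here: a near-stationary object is flagged directly, which catches cases such as a fallen pedestrian or a stuck wheelchair where a speed-based prediction would be unreliable.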
  5.  The monitoring system according to claim 1, wherein
     the state analysis unit predicts the arrival time to the virtual line from the current position and speed of the moving object according to its type, and determines that a dangerous event has occurred when the predicted arrival time is longer than the time until the closing of the crossing barrier is completed.
  6.  The monitoring system according to claim 1, further comprising a display device, wherein
     the display device displays videos from a plurality of the imaging devices side by side, and highlights or enlarges the video of the imaging device for which the state analysis unit has determined that a dangerous event has occurred.
PCT/JP2022/045759 2022-03-07 2022-12-13 Surveillance system WO2023171068A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022034071 2022-03-07
JP2022-034071 2022-03-07

Publications (1)

Publication Number Publication Date
WO2023171068A1 true WO2023171068A1 (en) 2023-09-14

Family

ID=87936587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/045759 WO2023171068A1 (en) 2022-03-07 2022-12-13 Surveillance system

Country Status (1)

Country Link
WO (1) WO2023171068A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006111177A (en) * 2004-10-15 2006-04-27 Central Japan Railway Co Detector for detecting moving object in crosscut, crosscut information communication device and program
JP2006118914A (en) * 2004-10-20 2006-05-11 West Japan Railway Co Object detector
JP2010181614A (en) * 2009-02-05 2010-08-19 Yahoo Japan Corp Movement simulation apparatus, and method of operating the same
JP2015070401A (en) * 2013-09-27 2015-04-13 日本電気株式会社 Image processing apparatus, image processing method, and image processing program
JP2015182556A (en) * 2014-03-24 2015-10-22 公益財団法人鉄道総合技術研究所 Monitoring system for railroad-crossing passer and monitoring program
WO2018084191A1 (en) * 2016-11-07 2018-05-11 株式会社日立国際電気 Congestion state analysis system
JP2021064398A (en) * 2021-01-04 2021-04-22 日本電気株式会社 Control method, program, and system


Similar Documents

Publication Publication Date Title
CN111144247B (en) Escalator passenger reverse detection method based on deep learning
EP2801956B1 (en) Passenger counter
JP3876288B2 (en) State recognition system and state recognition display generation method
US8693725B2 (en) Reliability in detecting rail crossing events
WO2019191142A1 (en) Smart area monitoring with artificial intelligence
WO2019178036A1 (en) Exposure coordination for multiple cameras
US11945435B2 (en) Devices and methods for predicting collisions and/or intersection violations
JP6127659B2 (en) Driving support device and driving support method
CA2610965A1 (en) Method and image evaluation unit for scene analysis
JP5917327B2 (en) Escalator monitoring system
CN103886755B (en) Crossing exception parking rapid alarm system and method with the camera function that makes a dash across the red light
JPWO2017047687A1 (en) Monitoring system
JP4600929B2 (en) Stop low-speed vehicle detection device
US20210312193A1 (en) Devices and methods for predicting intersection violations and/or collisions
US20230166743A1 (en) Devices and methods for assisting operation of vehicles based on situational assessment fusing expoential risks (safer)
JPH06223157A (en) Moving body detection system by image sensor
JPWO2020178926A1 (en) Left-behind object detection device and left-behind object detection method
WO2014199817A1 (en) Image processing device, image processing method, and image processing program
Chen et al. Traffic extreme situations detection in video sequences based on integral optical flow
JP2009086748A (en) Monitoring device and program
WO2023171068A1 (en) Surveillance system
JP5587068B2 (en) Driving support apparatus and method
US20210309221A1 (en) Devices and methods for determining region of interest for object detection in camera images
US20230083156A1 (en) Surveillance system, surveillance apparatus, surveillance method, and non-transitory computer-readable storage medium
Rayi et al. Object’s action detection using GMM algorithm for smart visual surveillance system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22931038

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2024505904

Country of ref document: JP