WO2017047687A1 - Monitoring system (Système de surveillance) - Google Patents

Monitoring system (Système de surveillance)

Info

Publication number
WO2017047687A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
speed
camera
monitoring
abnormality
Prior art date
Application number
PCT/JP2016/077236
Other languages
English (en)
Japanese (ja)
Inventor
佑一郎 小宮
佐々 敦
山口 宗明
雅俊 近藤
Original Assignee
株式会社日立国際電気
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社日立国際電気 filed Critical 株式会社日立国際電気
Priority to JP2017539963A priority Critical patent/JP6584024B2/ja
Publication of WO2017047687A1 publication Critical patent/WO2017047687A1/fr

Classifications

    • G08B 13/196 — Burglar, theft or intruder alarms; actuation by interference with heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 25/00 — Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
    • G08G 1/04 — Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G08G 1/052 — Detecting movement of traffic to be counted or controlled with provision for determining speed or overspeed
    • H04N 7/18 — Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • the present invention relates to a monitoring system that detects an abnormality occurring on a road based on the movement of a vehicle from a video of a monitoring camera installed on the road.
  • a monitoring system is required to have a function of automatically detecting an abnormality occurring in the field from a video of a monitoring camera and issuing an alarm. Prompt confirmation of abnormalities leads to prompt response after an accident and prevention of serious accidents.
  • Such a function is particularly effective when a large number of surveillance cameras are used, such as in a large-scale facility, and the supervisor cannot check all the images simultaneously.
  • One of the facilities that requires such an automatic detection function is an expressway. Abnormalities on an expressway include road damage, fallen obstacles, and vehicles stopped urgently because of breakdowns, all of which can lead to serious accidents. Techniques for automatically detecting such abnormal situations from the video of surveillance cameras installed on an expressway have therefore been considered.
  • As a technique related to the present invention, an on-vehicle device that detects the movement of a preceding vehicle avoiding an obstacle or the like is known (see, for example, Patent Documents 1 and 2).
  • an object of the present invention is to provide a monitoring system that performs systematic image processing on a number of surveillance camera images and realizes abnormality detection with limited computational resources.
  • In one aspect, a monitoring system comprises monitoring cameras each having a function of detecting a speed abnormality of a vehicle in its video, and a central control unit that determines a priority order from the speed abnormality information reported by the monitoring cameras and the positional relationship between the monitoring cameras, and that analyzes the video of the high-priority surveillance cameras.
  • the setting of the monitoring camera can be dynamically switched depending on whether or not it is a target of video analysis and whether or not an abnormality is detected.
  • an abnormality in the monitoring area can be efficiently detected from the video of the monitoring camera with a small amount of calculation.
  • FIG. 1 is a block diagram illustrating an example of a logical configuration of a monitoring system 1 according to a first embodiment.
  • FIG. 2 is a schematic state transition diagram of the operation of the monitoring system 1.
  • FIG. 3 is a schematic diagram explaining the principle of estimating a traffic jam start point.
  • FIG. 4 is a functional block diagram of the surveillance camera 2.
  • FIG. 5 is a schematic diagram showing the principle of calculating an actual speed.
  • FIG. 6 is a flowchart of speed calculation by the speed analysis unit 22.
  • FIG. 7 is a flowchart of obstacle alarm issuing performed by the system control unit 7.
  • The monitoring system of the present embodiment estimates the presence of an abnormality based on the trajectory (route) of a vehicle traveling on the road, or on the vehicle speed and passing frequency obtained in the process of obtaining the trajectory, and notifies the abnormality.
  • an operation for directly detecting an obstacle causing the abnormality from the video is performed in cooperation with the estimation of the abnormality.
  • FIG. 1 is a block diagram showing an example of a logical configuration of the monitoring system 1 according to the first embodiment of the present invention.
  • The monitoring system 1 of this example is a centralized system comprising an arbitrary number N of monitoring camera devices 2-1 to 2-N arranged at predetermined intervals along an expressway, and a central processing system 3 located at a base such as a traffic control center.
  • The central processing system 3 includes a video analysis unit 4, a video display unit 5, a video recording unit 6, and a system control unit 7 that controls them.
  • The monitoring camera devices 2-1 to 2-N are, for example, PTZ (pan-tilt-zoom) cameras equipped with an electric pan head and an electric zoom lens.
  • a high-sensitivity color camera is used as the camera body.
  • the pan, tilt, and zoom to be applied are given as camera control information.
  • the surveillance camera 2 encodes and outputs a video signal obtained by the camera body as a surveillance camera video, and measures the speed of the vehicle in the video by a known technique such as Non-Patent Document 1.
  • the measured speed is output as speed abnormality detection information at least when the speed is abnormal.
  • the monitoring camera video can be subjected to processing such as shake correction, gradation correction, and super-resolution before being encoded in order to facilitate processing in the video analysis unit 4.
  • the video analysis unit 4 analyzes the monitoring camera video received from the monitoring camera 2 and outputs the analysis result.
  • The video analysis unit 4 only needs to process at most M channels (M is an integer smaller than N) designated by the system control unit 7, and does not need the capability to simultaneously process the monitoring camera videos from all N monitoring cameras 2 in real time.
  • The video analysis unit 4 has the capability of recognizing vehicles: it accumulates or learns normal vehicle travel trajectories in advance, generates the trajectory of a vehicle recognized in a given video, and identifies the occurrence of a trajectory that is not normal.
  • For example, the trajectory of a vehicle that overtakes a stopped vehicle, a trajectory that avoids a fallen object on the road, or the trajectories of a plurality of vehicles that decelerate and accelerate before and after the same point are detected as abnormal.
  • Position information of such fallen objects or points is output. Further, in response to a request from the system control unit 7, the event (object) itself that causes the abnormal traveling trajectory is detected.
  • the video display unit 5 displays the monitoring camera video received from the monitoring camera 2 according to the display control information.
  • For example, L (L is an integer smaller than N) flat panel displays are arranged vertically and horizontally, L channels of video are input, and each video is displayed together with surveillance camera information (shooting location name) and the like.
  • The video display unit 5 preferentially displays video in which an abnormality has been detected, or displays an issued alarm, in accordance with a command from the system control unit 7.
  • the video recording unit 6 records the surveillance camera video received from the surveillance camera 2 on a hard disk or the like according to the recording control information.
  • the video recording unit 6 may change the video recording method based on the result of abnormality detection. For example, in the case of recording video after compressing it, the video of the surveillance camera in which an abnormality has been detected is saved at a reduced compression rate.
  • The system control unit 7 has a function of issuing an alarm based on the speed abnormality detection information from the connected N monitoring cameras 2 and the analysis results of the video analysis unit 4, and of notifying the video display unit 5. As in a general CCTV system, it also generates display control information so that video is displayed on the video display unit 5 while cyclically switching among the surveillance cameras 2, and generates recording control information that controls the recording rate of the video recording unit 6.
  • the system control unit 7 can be implemented as a part of a process control system (also called a distributed control system (DCS)) that performs traffic control.
  • Traffic control is mainly managed using traffic counters and traffic data measured by ETC 2.0.
  • FIG. 2 shows a schematic state transition diagram of the operation of the monitoring system 1.
  • the monitoring system 1 of this example generally has four states: steady state, congestion cause search, obstacle search, and obstacle observation.
  • The steady state is a state in which there is no traffic jam and no obstacle; the N monitoring cameras 2 are used like traffic counters to monitor whether a speed abnormality has occurred.
  • In addition, video is analyzed while the surveillance cameras 2 are selected in a cyclic manner, and an attempt is made to detect vehicle avoidance (a travel trajectory that avoids an obstacle) or the obstacle itself.
  • When a speed abnormality is detected, the state transitions to the congestion cause search state; when avoidance is detected, to the obstacle search state; and when an obstacle is detected, to the obstacle observation state.
  • The congestion cause search state is a state in which the start point and cause of the congestion are searched for: the start point of the congestion is estimated from the shooting points of the videos in which the speed abnormality is detected and from their changes, and detection of a vehicle showing avoidance at that point is attempted.
  • At places other than the estimated start point, speed abnormalities continue to be monitored as in the steady state. As shown in FIG. 3, if there is a section in which speed abnormalities (low speed) continue, it is estimated to be a traffic jam section, and the traffic jam start point can be estimated to lie at the end of that section on the vehicle traveling direction side (that is, between the last camera reporting a speed abnormality and the next camera reporting none).
  • When the speed of traveling vehicles suddenly decreases at a certain camera, or reaches a low speed that can be regarded as a traffic jam, and there is no speed abnormality at the cameras before and after it, that point can be estimated to be the start point of a newly generated traffic jam. If the estimated start point is a location where natural traffic jams frequently occur, or if another cause is already known, a transition is made back to the steady state. Otherwise, high-magnification video near the start point is acquired cyclically by PTZ control and detection of a vehicle showing avoidance is attempted; if such a vehicle is detected, the state transitions to the obstacle search state and an alarm (obstacle alarm) is issued.
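As a rough illustration of the start-point estimate described above, the following Python sketch assumes that cameras are indexed in the driving direction and that each reports a Boolean low-speed flag; the function and variable names are hypothetical, not taken from the patent.

```python
def estimate_jam_head(cameras, low_speed):
    """Return the pair of cameras between which the jam start point is presumed to lie.

    cameras   -- camera IDs ordered along the driving direction (upstream to downstream)
    low_speed -- dict mapping camera ID to True when that camera reported a
                 speed abnormality (low speed) in the last period
    """
    for i in range(len(cameras) - 1):
        here, ahead = cameras[i], cameras[i + 1]
        # The jam head lies between the last camera still reporting low speed and
        # the first camera further along the driving direction reporting normal speed.
        if low_speed.get(here) and not low_speed.get(ahead):
            return here, ahead
    return None

# Example: C3 and C4 report low speed, C5 does not, so the head is presumed between C4 and C5.
print(estimate_jam_head(["C1", "C2", "C3", "C4", "C5"],
                        {"C3": True, "C4": True, "C5": False}))
```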
  • In the obstacle search state, the video of the point where avoidance was detected is analyzed to try to detect the obstacle itself.
  • Monitoring of speed abnormalities is continued at points other than the detection point.
  • For this purpose, it is necessary to create in advance an image (background image) of the scene when no obstacle is present.
  • When the obstacle is detected, an alarm is issued as appropriate and a transition to the obstacle observation state is made.
  • The obstacle observation state is a state in which the detected obstacle is continuously monitored. When the obstacle is caught by a vehicle and disappears from the spot, the state transitions to the congestion cause search state, and a traffic jam caused by the obstacle being dropped again further along the road in the driving direction is searched for. When removal of the obstacle by a worker who has visited the site is completed, the state transitions to the steady state.
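The four operating states and the transitions described above can be summarized, purely as an illustrative sketch, by a small transition table. The event labels below are hypothetical names for the conditions in the text, not identifiers from the patent.

```python
# Hypothetical event labels for the conditions described in the text.
TRANSITIONS = {
    ("steady", "speed_abnormality"): "congestion cause search",
    ("steady", "avoidance_detected"): "obstacle search",
    ("steady", "obstacle_detected"): "obstacle observation",
    ("congestion cause search", "known_cause"): "steady",
    ("congestion cause search", "avoidance_detected"): "obstacle search",
    ("obstacle search", "obstacle_detected"): "obstacle observation",
    ("obstacle observation", "obstacle_moved_on"): "congestion cause search",
    ("obstacle observation", "obstacle_removed"): "steady",
}

def next_state(state, event):
    # Events without a listed transition leave the state unchanged.
    return TRANSITIONS.get((state, event), state)

state = "steady"
for event in ("speed_abnormality", "avoidance_detected", "obstacle_detected", "obstacle_removed"):
    state = next_state(state, event)
    print(event, "->", state)
```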
  • FIG. 4 shows a functional block diagram of the surveillance camera 2 of the present embodiment.
  • the surveillance camera 2 includes an imaging unit 21, a speed analysis unit 22, an image quality enhancement processing unit 23, and a camera control unit 24.
  • the monitoring camera 2 of this example calculates the speed of the subject from the captured video, detects a speed abnormality, and notifies the central processing system as speed abnormality detection information.
  • the imaging unit 21 includes a color imaging element, and outputs a subject image formed on the imaging surface as an imaging unit output video.
  • the speed analysis unit 22 calculates and outputs the actual speed of the subject (vehicle) in the input video.
  • the high image quality processing unit 23 performs high image quality processing such as super-resolution and contrast correction on the input video and outputs the result.
  • The speed analysis unit 22 and the image quality improvement processing unit 23 can be realized by software running on a common processor. Alternatively, they can be realized by hardware such as a common FPGA (Field-Programmable Gate Array) in which resources such as multipliers are used exclusively by one unit at a time.
  • the camera control unit 24 controls operations of the PTZ, the speed analysis unit 22, and the image quality improvement processing unit 23 based on the camera control information from the central processing system 3.
  • The monitoring camera 2 measures whether or not a speed abnormality has occurred within a certain period and reports the result as speed abnormality detection information.
  • For example, two threshold values TL and TH are set for the speed and for its temporal change, and an abnormality is determined when the vehicle speed is equal to or lower than TL or equal to or higher than TH.
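A minimal sketch of the threshold test described above, assuming the camera reports vehicle speeds in km/h over the measurement period; the threshold values used here are illustrative only and not specified in the text.

```python
def speed_abnormal(speeds_kmh, t_low=20.0, t_high=140.0):
    """Return True when any speed measured in the period is at or below TL or at or above TH.

    speeds_kmh    -- vehicle speeds measured by the camera within the reporting period
    t_low, t_high -- thresholds TL and TH in km/h; the values here are illustrative,
                     real values are site dependent and not given in the text
    """
    return any(v <= t_low or v >= t_high for v in speeds_kmh)

print(speed_abnormal([82.0, 95.5, 15.2]))   # True: one vehicle at a jam-like speed
print(speed_abnormal([78.0, 102.0, 96.4]))  # False: all speeds within the normal range
```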
  • the camera control unit 24 controls the speed analysis unit and the image quality improvement processing unit in accordance with a request from the central processing unit. For example, only one function may be operated according to the request.
  • When the image quality improvement processing unit 23 is operated, its output is output as the monitoring camera video; otherwise, the imaging unit output video is output as the monitoring camera video.
  • the electric camera platform 25 changes the imaging direction of the imaging unit 21 in accordance with the camera control information received via the camera control unit 24.
  • the electric zoom lens 26 changes the imaging range (zoom magnification) of the imaging unit 21 according to the received camera control information.
  • The speed analysis unit 22, the image quality improvement processing unit 23, and the camera control unit 24 may be provided inside the housing of the imaging unit 21.
  • Fig. 5 shows the principle of calculating the actual speed in the real world.
  • a speed detection range that is a range in which the speed is calculated in the video is determined.
  • the actual distance of the speed detection range can be obtained from the height and angle at which the camera is installed. Alternatively, it may be set using a reference distance in the video such as a white line on the road.
  • the speed of the vehicle can be obtained, for example, by measuring how many frames it takes for the vehicle to pass through the speed detection range set above.
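The conversion from a frame count to a real-world speed is simple arithmetic; the following sketch assumes a known frame rate and a speed detection range whose real length has been determined as described above. The names are hypothetical.

```python
def vehicle_speed_kmh(frames_to_pass, range_length_m, fps=30.0):
    """Convert a frame count over the speed detection range into a real-world speed.

    frames_to_pass -- frames counted while the vehicle crosses the detection range
    range_length_m -- actual length of the detection range in metres, obtained from
                      the camera height and angle or from a reference such as lane markings
    fps            -- camera frame rate (30 fps is assumed here)
    """
    seconds = frames_to_pass / fps
    return (range_length_m / seconds) * 3.6  # m/s to km/h

# Example: a 20 m detection range crossed in 24 frames at 30 fps gives 90 km/h.
print(vehicle_speed_kmh(24, 20.0))
```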
  • FIG. 6 shows a flow of speed calculation by the speed analysis unit 22.
  • In step S11, one frame of the imaging unit output video is captured.
  • In step S12, a moving object region is detected in the input frame.
  • The moving object region can be obtained by a known method, for example from differences in pixel values between frames. If it is determined in S13 that a moving object exists, feature points are detected in the moving object region in S14 by, for example, Harris corner detection or the SUSAN operator.
  • In S15, the moving object is tracked by associating the feature points of the moving object region over time.
  • A trajectory can be obtained using well-known techniques such as mean shift, particle filters, or Kalman filters. Tracking can be performed in units of moving object regions, feature points, or clusters of feature points that are close in distance and similar in motion; in addition to the difference in position of the feature points to be matched, the difference in the direction and type of corners (edges) may also be considered.
  • When there are feature points that are always detected at the same position, they may be learned and masked in the feature point detection.
  • A train of vehicles packed closely together in a traffic jam may be detected as a single moving object region in S12, but this can be handled by appropriately grouping the feature points.
  • Finally, the number of frames required to pass through the speed detection range is measured from the tracking result.
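The steps from frame capture to tracking can be sketched with a general-purpose vision library. The snippet below uses OpenCV as an assumed implementation substrate; the file name is hypothetical, Lucas-Kanade optical flow stands in for the mean-shift, particle-filter, or Kalman-filter trackers named in the text, and the thresholds are illustrative.

```python
import cv2

cap = cv2.VideoCapture("camera_output.mp4")  # hypothetical file name
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()                       # S11: capture one frame
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # S12: moving object region from inter-frame pixel differences
    diff = cv2.absdiff(prev_gray, gray)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)

    # S13/S14: if motion exists, detect corner feature points inside the region
    if cv2.countNonZero(motion_mask) > 500:
        corners = cv2.goodFeaturesToTrack(
            gray, maxCorners=100, qualityLevel=0.01, minDistance=7,
            mask=motion_mask, useHarrisDetector=True)
        if corners is not None:
            # S15: track the feature points into the next frame (Lucas-Kanade optical
            # flow is used here as a simple stand-in for the trackers named in the text).
            next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, corners, None)
            # The tracked positions would then be accumulated to count how many frames
            # a vehicle needs to cross the speed detection range.

    prev_gray = gray

cap.release()
```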
  • FIG. 7 shows a flow of obstacle alarm issuing performed by the system control unit 7.
  • In step S31, speed abnormality information is collected from each monitoring camera 2.
  • The speed abnormality information indicates whether or not the speed of the subject of each monitoring camera has gone outside the set range within a certain period. Since speed monitoring information cannot be obtained from a surveillance camera 2 that is already a video analysis target and is operating in the high-image-quality mode, the trajectory information obtained by the video analysis unit 4 can be used instead.
  • The calculation of speed can be realized by image analysis at a low computational cost compared with video analysis using recognition, learning, or the like. In the vicinity of a monitoring camera where a speed abnormality is detected, some abnormality may have occurred, such as a traffic jam or vehicles braking.
  • In step S32, the video of the surveillance cameras 2 to be analyzed by the video analysis unit 4 is selected.
  • The system control unit 7 gives priority to the video of a monitoring camera 2 in which a speed abnormality has been detected and transfers it to the video analysis unit 4. If the cameras in which the speed abnormality is detected are adjacent to one another, only the head (downstream) monitoring camera 2 is given increased priority, as shown in FIG. 3. This is because multiple consecutive cameras along the road are expected to detect speed abnormalities over a traffic jam section, and the intent is to prioritize the camera corresponding to the head of the traffic jam. When there are more high-priority cameras 2 than M, analysis is performed giving priority to the cameras with the longest period from their last analysis to the current time.
  • For cameras geographically adjacent to the high-priority cameras, and for the other surveillance cameras, analysis may likewise be performed giving priority to those with the longest period since their last analysis. The history of selection as a video analysis target is retained for a certain period.
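A possible reading of this selection policy, expressed as a sketch: among adjacent cameras reporting abnormalities, only the most downstream one is kept at high priority, and the at most M analysis slots are then filled in order of how long each camera has waited since its last analysis. All names and the tie-breaking details below are assumptions.

```python
def select_analysis_targets(cameras, abnormal, last_analyzed, now, m):
    """Pick at most m cameras for video analysis.

    cameras       -- camera IDs ordered along the driving direction
    abnormal      -- set of camera IDs currently reporting a speed abnormality
    last_analyzed -- dict mapping camera ID to the time of its last analysis
    now           -- current time
    m             -- maximum number of channels the video analysis unit can handle
    """
    # Among runs of adjacent abnormal cameras, keep only the head (most downstream) one.
    heads = [c for i, c in enumerate(cameras)
             if c in abnormal and (i + 1 == len(cameras) or cameras[i + 1] not in abnormal)]
    # Fill remaining capacity, and break ties, by how long a camera has waited
    # since its last analysis (stalest first).
    staleness = lambda c: now - last_analyzed.get(c, 0.0)
    heads.sort(key=staleness, reverse=True)
    others = sorted((c for c in cameras if c not in heads), key=staleness, reverse=True)
    return (heads + others)[:m]

# Example: C2 and C3 are abnormal, so head C3 comes first; the spare slot goes to the
# camera that has waited longest since its last analysis.
print(select_analysis_targets(
    ["C1", "C2", "C3", "C4"], {"C2", "C3"},
    {"C1": 100.0, "C2": 200.0, "C3": 150.0, "C4": 50.0}, now=300.0, m=2))
```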
  • the system control unit 7 transmits the information to the target monitoring camera 2 in S33.
  • The monitoring camera 2 to be analyzed stops the speed abnormality detection function and uses the freed computational resources to perform processing for improving the image quality of the video. Further, when the surveillance camera 2 compresses and outputs video, the bit rate may be increased only during analysis to improve image quality.
  • the video analysis unit 4 continues to analyze the video from the monitoring camera 2 that is the analysis target, and sequentially returns the analysis results to the system control unit 7.
  • In S35, the system control unit 7 determines whether or not the received analysis result (the latest one) indicates an abnormality such as avoidance. Since analysis results may include false detections, results from a certain past period may be retained and the judgment based on them.
  • In S36, the system control unit 7 searches the history for cameras that have been selected as video analysis targets more than once in the past fixed period. In S37, based on the search of S36, it is determined whether or not there is a camera that has been analyzed frequently. If there is such a camera, the process proceeds to the same processing (alarm issuing) as when an abnormality is determined in S35; if not, the process returns to S31.
  • the system control unit 7 issues an alarm in S38.
  • The alarm is issued by transmitting the video of the camera in which the abnormality was detected, together with information such as its position and time, to the video display unit 5.
  • The information is also transmitted to the monitoring camera 2 in which the abnormality was detected. Thereby, on the surveillance camera 2 side, the speed abnormality detection function, if it is still running, can be stopped, and the freed computational resources can be used for processing to improve the image quality of the video.
  • The system control unit 7 also transmits to the corresponding monitoring camera 2 camera control information so that the position of the abnormality output from the video analysis unit 4 is captured clearly at a higher magnification.
  • the video analysis unit 4 is instructed to detect the cause of the abnormality itself.
  • the video analysis unit 4 is also provided with information on the shooting direction and range changed by the camera control information.
  • the video analysis unit 4 may perform road damage detection, obstacle recognition (detection), vehicle emergency stop detection, and the like using the above-described recognition / learning technique or the like.
  • Contrast correction, fluctuation correction, super-resolution processing, or the like may be performed as preprocessing.
  • the number of passing vehicles per predetermined time may be measured, and the traffic jam start point may be searched from the change.
  • the video analysis unit 4 may detect the boundary line of the traveling lane drawn on the road surface by Hough transform or the like, and track the moving object using the detected line. For example, it is possible to reduce the amount of processing by excluding movements greatly different from the direction of the traveling lane from the candidates, and it is possible to determine the apparent size of the vehicle and the degree of avoidance based on the width of the lane. Also, when an obstacle is detected from the video, an alarm with a different severity can be issued based on the size and the positional relationship with the travel lane. For example, when an obstacle initially detected in the roadside zone moves to the traveling lane, an alarm with a higher severity is issued again.
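As an illustrative sketch of the lane-boundary idea, the snippet below detects straight segments with OpenCV's probabilistic Hough transform and uses their directions to filter tracked motions; parameter values are illustrative and not from the patent.

```python
import cv2
import numpy as np

def lane_directions(gray_frame):
    """Detect lane boundary lines on the road surface and return their angles in degrees.

    A rough sketch: edges are extracted with Canny and straight segments with the
    probabilistic Hough transform; the returned angles can be used to discard tracked
    motions whose direction differs greatly from the traveling lane.
    """
    edges = cv2.Canny(gray_frame, 50, 150)
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=80,
                               minLineLength=60, maxLineGap=10)
    angles = []
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            angles.append(np.degrees(np.arctan2(y2 - y1, x2 - x1)))
    return angles

def motion_follows_lane(motion_angle_deg, lane_angles_deg, tolerance_deg=25.0):
    # Keep a tracked motion only if it roughly follows some detected lane direction.
    return any(abs(motion_angle_deg - a) <= tolerance_deg for a in lane_angles_deg)
```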
  • the manner of sharing image processing between the monitoring camera 2 and the video analysis unit 4 includes various modes other than those described in this example.
  • the speed analysis unit 22 of the monitoring camera 2 may always calculate the trajectory of the moving area (vehicle) and pass the data to the video analysis unit 4 to perform the subsequent processing.
  • The trajectory data is converted from screen coordinates to a global coordinate system, normalized into pass points that do not depend on speed, and averaged (accumulated); avoidance can then be determined based on the difference from the averaged data.
  • a known technique such as Patent Document 3 can be used for coordinate system conversion.
  • A certain number of trajectory data items (pass point data, for example ten vehicles' worth) are required for averaging, and the video analysis unit 4 holds the averaged data for each shooting angle of view of each monitoring camera 2.
  • A well-known algorithm for detecting non-stationary data, for example a one-class SVM or singular spectrum analysis, can also be used to detect avoidance.
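As one possible sketch of this idea, the snippet below trains a one-class SVM (scikit-learn is assumed as the library) on flattened, speed-normalized pass-point vectors and flags a swerving trajectory as non-stationary; the trajectories are synthetic placeholders, not data from the patent.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def flatten(trajectory_xy):
    # Each trajectory is a list of speed-normalized pass points (x, y) in global
    # coordinates; it is flattened into one feature vector per vehicle.
    return np.asarray(trajectory_xy, dtype=float).ravel()

rng = np.random.default_rng(0)
# Ten vehicles' worth of near-straight normal trajectories, eight pass points each
# (synthetic placeholder data, not measurements).
normal = [flatten([(x, 3.5 + rng.normal(0.0, 0.1)) for x in range(8)]) for _ in range(10)]

detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(normal)

# A trajectory that swerves sideways around an obstacle near x = 4.
avoiding = flatten([(x, 6.0 if 3 <= x <= 5 else 3.5) for x in range(8)])

print(detector.predict([avoiding]))    # -1 expected: non-stationary, avoidance-like
print(detector.predict([normal[0]]))   # +1 expected: consistent with learned trajectories
```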
  • 1 monitoring system, 2 monitoring camera device, 3 central processing system (center), 4 video analysis unit, 5 video display unit, 6 video recording unit, 7 system control unit, 21 imaging unit, 22 speed analysis unit, 23 image quality enhancement processing unit, 24 camera control unit, 25 electric pan head, 26 electric zoom lens.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention relates to a system that detects, with a low processing load, an obstacle appearing in any of a large number of camera videos. A plurality of cameras installed at the roadside routinely detect the trajectory of vehicles traveling along the road, or the vehicle speed and passing frequency obtained in the course of trajectory acquisition, and notify a center when an abnormality exists. At the center, the cameras to be analyzed are narrowed down to the vicinity of a congestion start point on the basis of the notification, advanced analysis of the vehicle trajectories and the like is performed, and a non-steady state is detected from the analyzed data. A camera selected for analysis preferentially performs image quality enhancement and other processing on the vehicle trajectory or speed region so as to deliver sharper video to the center.
PCT/JP2016/077236 2015-09-17 2016-09-15 Système de surveillance WO2017047687A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017539963A JP6584024B2 (ja) 2015-09-17 2016-09-15 監視システム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2015-184124 2015-09-17
JP2015184124 2015-09-17

Publications (1)

Publication Number Publication Date
WO2017047687A1 true WO2017047687A1 (fr) 2017-03-23

Family

ID=58288964

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/077236 WO2017047687A1 (fr) 2015-09-17 2016-09-15 Système de surveillance

Country Status (2)

Country Link
JP (1) JP6584024B2 (fr)
WO (1) WO2017047687A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113923449B (zh) * 2021-12-15 2022-03-15 深圳市光网视科技有限公司 一种预判安防监控终端故障的方法及系统

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03118696A (ja) * 1989-09-29 1991-05-21 Omron Corp 異常走行車両の検出装置
JPH10307987A (ja) * 1997-05-02 1998-11-17 Mitsubishi Heavy Ind Ltd 交通流計測装置
JPH1139589A (ja) * 1997-07-18 1999-02-12 Fuji Electric Co Ltd 交通監視装置および交通監視方法
JPH1196494A (ja) * 1997-09-22 1999-04-09 Hitachi Ltd 交通流監視方法および装置
JPH11288495A (ja) * 1998-04-02 1999-10-19 Sumitomo Electric Ind Ltd 交通異常検出装置
JP2002083394A (ja) * 2000-06-30 2002-03-22 Sumitomo Electric Ind Ltd 交通流の異常検知装置及び方法
JP2002190090A (ja) * 2000-10-13 2002-07-05 Sumitomo Electric Ind Ltd 交通流の異常検知装置及び方法(スペクトル)
JP2005073218A (ja) * 2003-08-07 2005-03-17 Matsushita Electric Ind Co Ltd 画像処理装置
JP2009284371A (ja) * 2008-05-26 2009-12-03 Meidensha Corp 映像監視装置
JP2011071920A (ja) * 2009-09-28 2011-04-07 Hitachi Kokusai Electric Inc 遠隔映像監視システム
JP2012234377A (ja) * 2011-05-02 2012-11-29 Mitsubishi Electric Corp 映像監視システム
JP2014192713A (ja) * 2013-03-27 2014-10-06 Hitachi Industry & Control Solutions Ltd 監視システム

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102011119B1 (ko) * 2017-12-28 2019-10-21 (주)에스비네트워크 터널 내 유고상황 이벤트를 파노라마 영상에 표시하는 시스템 및 표시 방법
KR20190080142A (ko) * 2017-12-28 2019-07-08 (주) 에스비네트워크 터널 내 유고상황 이벤트를 파노라마 영상에 표시하는 시스템 및 표시 방법
JP7326029B2 (ja) 2019-05-31 2023-08-15 株式会社東芝 交通映像処理装置
JP2020198497A (ja) * 2019-05-31 2020-12-10 株式会社東芝 交通映像処理装置
WO2021020304A1 (fr) * 2019-07-29 2021-02-04 京セラ株式会社 Station de base, dispositif de bord de route, système de communication de trafic, procédé de gestion de trafic et procédé de génération de données d'apprentissage
JP2021022221A (ja) * 2019-07-29 2021-02-18 京セラ株式会社 基地局、交通通信システム、及び交通管理方法
JP7401217B2 (ja) 2019-07-29 2023-12-19 京セラ株式会社 基地局、交通通信システム、及び交通管理方法
JP2021077072A (ja) * 2019-11-08 2021-05-20 富士通株式会社 情報処理プログラム、情報処理方法、および情報処理装置
JPWO2021171338A1 (fr) * 2020-02-25 2021-09-02
WO2021171338A1 (fr) * 2020-02-25 2021-09-02 日本電信電話株式会社 Dispositif, procédé et système de suivi d'objet en mouvement, dispositif d'apprentissage, et programme
US11983891B2 (en) 2020-02-25 2024-05-14 Nippon Telegraph And Telephone Corporation Moving target tracking device, moving target tracking method, moving target tracking system, learning device, and program
JP7255745B2 (ja) 2020-02-25 2023-04-11 日本電信電話株式会社 移動対象追跡装置、移動対象追跡方法、移動対象追跡システム、および、学習装置、並びに、プログラム
WO2021186854A1 (fr) * 2020-03-19 2021-09-23 日本電気株式会社 Dispositif de traitement de données, dispositif de transmission, procédé de traitement de données et programme
JP2021175033A (ja) * 2020-04-21 2021-11-01 株式会社日立製作所 事象解析システムおよび事象解析方法
JP7440332B2 (ja) 2020-04-21 2024-02-28 株式会社日立製作所 事象解析システムおよび事象解析方法
CN113744550A (zh) * 2020-05-15 2021-12-03 丰田自动车株式会社 信息处理装置以及信息处理系统
WO2022024208A1 (fr) * 2020-07-28 2022-02-03 日本電気株式会社 Dispositif de surveillance de trafic, système de surveillance de trafic, procédé de surveillance de trafic et programme
JPWO2022024208A1 (fr) * 2020-07-28 2022-02-03
JPWO2022074969A1 (fr) * 2020-10-05 2022-04-14
WO2022074969A1 (fr) * 2020-10-05 2022-04-14 株式会社デンソー Dispositif de traitement d'informations, dispositif de commande de véhicule et procédé de distribution d'informations routières
CN112637492A (zh) * 2020-12-19 2021-04-09 中建浩运有限公司 一种智能实体展会系统
WO2022269980A1 (fr) * 2021-06-24 2022-12-29 日立Astemo株式会社 Dispositif et procédé de reconnaissance d'environnement externe

Also Published As

Publication number Publication date
JP6584024B2 (ja) 2019-10-02
JPWO2017047687A1 (ja) 2018-08-02

Similar Documents

Publication Publication Date Title
JP6584024B2 (ja) 監視システム
KR101942491B1 (ko) 도로교통 상황 모니터링 관제·실시간 교통정보 분석·교통신호제어로 이루어진 인공지능형 cctv통합관제중개제어모듈을 갖는 cctv통합관제센터시스템
US20180204335A1 (en) System for tracking object, and camera assembly therefor
JP4984575B2 (ja) 画像処理による侵入者検知装置
KR102105162B1 (ko) 전자파 센서를 이용해 차량 속도, 차량 위치 및 차량 통행량을 파악하고 규칙을 위반하는 차량을 구분하여 이에 대한 정보는 동영상 또는 사진으로 저장하는 스마트 속도위반단속 장치, 스마트 신호위반단속 장치 및 스마트시티 솔루션 장치
KR101496390B1 (ko) 차량번호인식 시스템
KR101187908B1 (ko) 복수의 이벤트 영상을 표시하는 통합 감시 시스템 및 이를이용한 복수의 이벤트 영상 표시 방법
KR20200103194A (ko) 영상 기반 비정상 상황 모니터링 방법 및 시스템
KR100690279B1 (ko) 다목적 영상감지 시스템
CN112241974A (zh) 交通事故检测方法及处理方法、系统、存储介质
KR100871833B1 (ko) 자동 추적 카메라 장치
Prommool et al. Vision-based automatic vehicle counting system using motion estimation with Taylor series approximation
KR102434154B1 (ko) 영상감시시스템에서의 고속 이동물체의 위치 및 모션 캡쳐 방법
JP3426068B2 (ja) テレビカメラ切替監視方式
CN112906428B (zh) 影像侦测区域取得方法及空间使用情况的判定方法
KR20180068462A (ko) 신호등 제어 시스템 및 방법
KR101859329B1 (ko) 주정차 단속 시스템
JPH0676195A (ja) 異常事象検出装置
JP4675217B2 (ja) 追尾型監視システム
JPH0460880A (ja) 動体識別解析管理システム
Heo et al. Autonomous reckless driving detection using deep learning on embedded GPUs
JP2018192844A (ja) 監視装置、監視システム、監視プログラム、および、記憶媒体
KR20230020184A (ko) 고정 카메라 및 이동 카메라를 이용한 영상 분석 장치
JP4209826B2 (ja) 交通流監視装置
JP4697761B2 (ja) 待ち行列検出方法及び待ち行列検出装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16846558

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017539963

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16846558

Country of ref document: EP

Kind code of ref document: A1