CN111539301B - Scene chaos degree discrimination method based on video analysis technology - Google Patents

Scene chaos degree discrimination method based on video analysis technology

Info

Publication number
CN111539301B
CN111539301B
Authority
CN
China
Prior art keywords
frame
optical flow
scene
value
inter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010312148.4A
Other languages
Chinese (zh)
Other versions
CN111539301A (en
Inventor
犹津
徐勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guizhou Siso Electronics Co ltd
Guizhou Security Engineering Technology Research Center Co ltd
Original Assignee
Guizhou Siso Electronics Co ltd
Guizhou Security Engineering Technology Research Center Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guizhou Siso Electronics Co ltd, Guizhou Security Engineering Technology Research Center Co ltd filed Critical Guizhou Siso Electronics Co ltd
Priority to CN202010312148.4A priority Critical patent/CN111539301B/en
Publication of CN111539301A publication Critical patent/CN111539301A/en
Application granted granted Critical
Publication of CN111539301B publication Critical patent/CN111539301B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/269 Analysis of motion using gradient-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/53 Recognition of crowd images, e.g. recognition of crowd congestion

Abstract

The invention discloses a scene chaos degree judging method based on video analysis technology, and relates to video monitoring technology. The method aims to provide a reasonable judgment basis for the degree of chaos in complex environments. It detects the horizontal direction difference and the vertical direction difference of the frame optical flow of a single frame; detects the multi-frame average energy, the inter-frame maximum optical flow differential and the inter-frame median optical flow differential of k consecutive frames; obtains the scene chaos degree from these quantities; and sends out prompt information when the scene chaos degree exceeds a threshold value. The method not only frees security personnel engaged in video monitoring from the tedious and inefficient burden of watching video in real time, but also provides an effective technical guarantee for maintaining social stability and protecting people's lives and property. It can be applied when the main targets in the scene are people, when they are objects, or when many people and objects are present simultaneously.

Description

Scene chaos degree judging method based on video analysis technology
Technical Field
The invention relates to video monitoring technology, and in particular to a scene chaos degree judging method based on video analysis technology.
Background
When the degree of chaos in a scene is higher, the probability that the scene reflects an abnormal situation is higher, and, from the perspective of public safety, it deserves closer attention and monitoring. For example, a crowd in a scene with a high degree of chaos is likely to be at some risk: it may be in a dangerously crowded state, or a moving object in the scene may be behaving abnormally. Therefore, judging the degree of chaos has great application prospects and important technical significance. In real urban life, supervision departments can acquire video of public areas through the cameras widely deployed in cities, focus their monitoring on the states of crowds and moving objects in locations with a high scene chaos degree, and respond in a timely manner. When the main targets in a scene are objects, or when many people and objects are present simultaneously, the prior art cannot accurately give a prompt signal to monitoring personnel. How to provide a reasonable judgment basis in a complex environment is a problem urgently to be solved by those skilled in the art.
Disclosure of Invention
The invention aims to provide a scene chaos degree judging method based on video analysis technology that can be applied when the main targets in a scene are people, when they are objects, or when many people and objects are present simultaneously, so as to make up for the deficiencies of the prior art.
The scene chaos degree judging method based on video analysis technology according to the invention is characterized in that a horizontal direction difference u and a vertical direction difference v of the frame optical flow of a single frame are detected; the multi-frame average energy f̄, the inter-frame maximum optical flow differential dif_max and the inter-frame median optical flow differential dif_mean of k consecutive frames are detected; and the scene chaos degree conf is obtained as

conf = a·u + b·v + c·f̄ + d·dif_max + e·dif_mean

Prompt information is sent out when the scene chaos degree exceeds a threshold value; a, b, c, d and e are coefficients, and k is a natural number not less than 3.
The horizontal direction difference u of the frame optical flow is obtained from a formula, shown as an image in the publication, that accumulates the absolute differences |u_mn - u_pq| between the horizontal optical flow values of any two different pixels of the same frame; wherein m and n are the coordinates of any pixel in the frame, p and q are the coordinates of any other pixel in the same frame, and M and N are the maximum values of the coordinates in the frame; u_mn is the horizontal direction optical flow value of the pixel with coordinates (m, n), and u_pq is the horizontal direction optical flow value of the pixel with coordinates (p, q).
The vertical direction difference v of the frame optical flow is obtained from the corresponding formula over the absolute differences |v_mn - v_pq|; v_mn is the vertical direction optical flow value of the pixel with coordinates (m, n), and v_pq is the vertical direction optical flow value of the pixel with coordinates (p, q).
The multi-frame average energy f̄ is obtained as

f̄ = (1/(M·N)) Σ_{m=1}^{M} Σ_{n=1}^{N} f̄_mn,   where   f̄_mn = (1/k) Σ_{i=1}^{k} f_mn^(i)

is the average of the optical flow magnitude values f_mn of the pixel at (m, n) over the k consecutive frames.
The optical flow magnitude value f_mn is obtained from the formula f_mn = |u_mn| + |v_mn|.
The inter-frame maximum optical flow differential dif_max is obtained as

dif_max = max over all (m, n) of dif_mn

where dif_mn is the inter-frame optical flow differential at coordinate point (m, n), obtained from the k per-frame magnitude values at that point by a formula shown as an image in the publication.
The inter-frame median optical flow differential dif_mean is the median of the inter-frame optical flow differentials. Specifically, all inter-frame optical flow differentials dif_mn are sorted by magnitude; if there is an odd number of them, the middle value is assigned as dif_mean; if there is an even number, the average of the two middle values is assigned as dif_mean.
The values of the coefficients a, b, c and d are all 1, and the value of the coefficient e is 0.5.
The value of k ranges from 15 to 40.
The scene chaos degree judging method based on video analysis technology according to the invention not only frees security personnel engaged in video monitoring from the tedious and inefficient burden of watching video in real time, but also provides an effective technical guarantee for maintaining social stability and protecting people's lives and property. The method can be applied when the main targets in a scene are people, when they are objects, or when many people and objects are present simultaneously.
Drawings
Fig. 1 is a schematic flow chart of the scene chaos degree judging method based on video analysis technology according to the present invention.
Detailed Description
As shown in Fig. 1, the principle and implementation of the scene chaos degree judging method based on video analysis technology according to the present invention are as follows.
Under normal conditions, even a large number of people are usually in a fairly orderly waiting or moving state. Under abnormal conditions, people's states and movements are relatively disordered. For example, people in a chaotic state typically move in disorderly directions, or exhibit relatively rapid movements or limb motions.
Aiming at these characteristics of people under abnormal conditions, the invention proposes a scene chaos degree judging method based on physical quantities derived from video analysis. The proposed physical quantities include the frame optical flow horizontal direction difference, the frame optical flow vertical direction difference, the multi-frame average energy, the inter-frame maximum optical flow differential, and the inter-frame median optical flow differential. A scene chaos degree calculation formula based on all of these physical quantities is also proposed. The rationale is that the larger the optical flow differences between pixels in a video segment, the larger the inter-frame maximum optical flow differential, and the larger the multi-frame average energy, the higher the degree of chaos in that segment, and the more likely it is that overcrowding or another abnormal situation is present.
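The per-pixel optical flow values u_mn and v_mn are the raw input to all of these quantities. The publication does not name a particular optical flow algorithm, so the sketch below uses OpenCV's Farneback dense optical flow purely for illustration; the function name and parameter values are assumptions, not taken from the publication.

```python
# Minimal sketch: per-pixel horizontal (u_mn) and vertical (v_mn) optical flow
# for one pair of grayscale frames. The Farneback algorithm and its parameters
# are assumed here; the publication does not specify the optical flow method.
import cv2
import numpy as np

def dense_flow(prev_gray: np.ndarray, curr_gray: np.ndarray):
    """Return (u, v) arrays of shape (M, N) for one pair of grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return flow[..., 0], flow[..., 1]  # horizontal component u_mn, vertical component v_mn
```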
The frame optical flow horizontal direction difference is defined as follows. Let the frame image be I, and let I_mn be the pixel value of the element with coordinates (m, n) in image I. The horizontal optical flow value corresponding to pixel I_mn is u_mn. The average of the horizontal optical flow values of all pixels of the whole frame is

ū = (1/(M·N)) Σ_{m=1}^{M} Σ_{n=1}^{N} u_mn

The frame optical flow horizontal direction difference u is then defined by a formula, shown as an image in the publication, that accumulates the absolute differences |u_mn - u_pq| between the horizontal optical flow values of any two different pixel points of the same frame. M and N are the maximum values of the coordinates in the frame, and p and q are the coordinates of another arbitrary pixel in the same frame; u_mn is the horizontal optical flow value of the pixel with coordinates (m, n), and u_pq is that of the pixel with coordinates (p, q).
The frame optical flow vertical direction difference is defined analogously. The vertical optical flow value corresponding to I_mn is v_mn. The average of the vertical optical flow values of all pixels of the whole frame is

v̄ = (1/(M·N)) Σ_{m=1}^{M} Σ_{n=1}^{N} v_mn

and the frame optical flow vertical direction difference v is defined by the corresponding formula over the absolute differences |v_mn - v_pq|; v_mn is the vertical optical flow value of the pixel with coordinates (m, n), and v_pq is that of the pixel with coordinates (p, q).
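As a concrete reading of the two direction differences, the sketch below averages the absolute differences over all pixel pairs of one flow component; since the exact normalisation in the formula image is not reproduced in the text, dividing by the number of pairs is an assumption.

```python
# Sketch of the frame optical flow direction differences u and v.
# A sorted-prefix identity replaces the O((M*N)^2) double loop over pixel pairs;
# the normalisation by the number of pairs is assumed.
import numpy as np

def direction_difference(flow_component: np.ndarray) -> float:
    """Mean absolute difference |x_i - x_j| over all pixel pairs of one flow component."""
    x = np.sort(flow_component.ravel().astype(np.float64))
    n = x.size
    # sum of |x_i - x_j| over all unordered pairs, via the sorted-prefix identity
    pair_sum = float(np.dot(2.0 * np.arange(n) - (n - 1), x))
    return pair_sum / (n * (n - 1) / 2.0)

# usage: u_diff = direction_difference(u); v_diff = direction_difference(v)
```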
The inter-frame maximum optical flow differential is defined as follows. Backtracking from the current frame, k consecutive frames are taken from the video segment, and the optical flow magnitude values of all pixel points of each frame are first calculated. For a given frame, the optical flow magnitude value of the pixel at coordinate point (m, n) is defined as f_mn = |u_mn| + |v_mn|, where |u_mn| denotes the absolute value of u_mn, and u_mn and v_mn are the optical flow values in the horizontal and vertical directions, respectively. Because the k consecutive frames are taken from the same video segment, every frame has the same number of pixel points. Therefore, for each coordinate point (m, n), the k magnitude values of that point across the k frames are extracted and recorded as f_mn^(1), f_mn^(2), ..., f_mn^(k). For these k values at coordinate point (m, n), the average of the optical flow magnitude values is

f̄_mn = (1/k) Σ_{i=1}^{k} f_mn^(i)

The value of k is determined empirically and ranges from 15 to 40. The inter-frame optical flow differential dif_mn with respect to coordinate point (m, n) is then defined by a formula, shown as an image in the publication, computed from the k magnitude values at that point. The inter-frame maximum optical flow differential is

dif_max = max over all (m, n) of dif_mn

The inter-frame median optical flow differential dif_mean is the median of the inter-frame optical flow differentials, obtained as follows: all inter-frame optical flow differentials are sorted in descending order; if there is an odd number of them, the middle value is taken as dif_mean; if there is an even number, the average of the two middle values is taken as dif_mean. Finally, the multi-frame average energy is

f̄ = (1/(M·N)) Σ_{m=1}^{M} Σ_{n=1}^{N} f̄_mn

that is, the mean of the per-pixel averages f̄_mn over the whole frame.
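The window-level quantities can then be computed from the stacked flow fields of the k frames. In the sketch below dif_mn is taken as the mean absolute deviation of the k per-pixel magnitudes from their average f̄_mn; this particular form is an assumption, since the corresponding formula appears only as an image in the publication.

```python
# Sketch of the multi-frame quantities over a window of k consecutive frames.
# u_stack, v_stack: arrays of shape (k, M, N) with the per-frame flow components.
import numpy as np

def window_statistics(u_stack: np.ndarray, v_stack: np.ndarray):
    f = np.abs(u_stack) + np.abs(v_stack)       # f_mn = |u_mn| + |v_mn|, per frame
    f_bar_mn = f.mean(axis=0)                   # per-pixel average over the k frames
    dif_mn = np.abs(f - f_bar_mn).mean(axis=0)  # assumed form of the inter-frame differential
    dif_max = float(dif_mn.max())               # inter-frame maximum optical flow differential
    dif_median = float(np.median(dif_mn))       # inter-frame median optical flow differential
    f_bar = float(f_bar_mn.mean())              # multi-frame average energy
    return f_bar, dif_max, dif_median
```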
Based on the above physical quantities, the method calculates and outputs the scene chaos degree for every frame, starting from a certain frame of the video. To avoid the influence of inter-frame jumps caused by the starting frame of the video segment, the calculation does not start from the first frame of the video. The calculation formula is as follows:
conf = a·u + b·v + c·f̄ + d·dif_max + e·dif_mean

where a, b, c, d and e are coefficients and d > e. The main basis for defining the scene chaos degree conf in this way is that it is directly proportional to the horizontal direction difference and the vertical direction difference of the optical flow, and also directly proportional to the multi-frame average energy, the inter-frame maximum optical flow differential and the inter-frame median optical flow differential. Preferred values of the coefficients are a = b = c = d = 1 and e = 0.5.
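Under the weighted-sum reading of the formula above, the chaos degree of one window reduces to a single expression; the defaults below use the preferred coefficient values from the description.

```python
# Sketch of the scene chaos degree as a weighted sum of the five quantities,
# with the preferred coefficients a = b = c = d = 1 and e = 0.5 as defaults.
def scene_chaos_degree(u_diff, v_diff, f_bar, dif_max, dif_median,
                       a=1.0, b=1.0, c=1.0, d=1.0, e=0.5):
    return a * u_diff + b * v_diff + c * f_bar + d * dif_max + e * dif_median
```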
Based on this calculation, when conf is larger than γ, the real-time monitoring system considers that the scene chaos degree has reached the early-warning level and feeds back a prompt such as "scene chaos degree exceeds the threshold, attention required" to the monitoring personnel. The threshold γ is given an initial value in advance and can be adjusted for a specific scene.
Because most abnormal conditions in video have a certain continuity, both a light alarm, for example a blue symbol reminder on the screen together with a soft voice prompt, and a severe alarm, for example a red symbol reminder on the screen together with a voice prompt, can be given. A severe alarm means that the degree of abnormality in the scene is high and that timely attention and response are required. Within a specified time interval t, if the chaos degree in the scene exceeds the threshold only once, a light alarm is given. Within the specified time interval t, when the chaos degree exceeds the threshold three or more times, a severe alarm is given. Obviously, in practical applications, a severe alarm will only be given after a light alarm has already occurred.
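The two-level alarm policy only needs to remember the timestamps of threshold exceedances inside the interval t, as in the sketch below. The class and variable names are illustrative; the publication does not say how exactly two exceedances within t should be treated, so they are kept at the light-alarm level here.

```python
# Sketch of the two-level alarm policy over a sliding time window of t seconds.
from collections import deque
import time

class ChaosAlarm:
    def __init__(self, gamma, t_seconds):
        self.gamma = gamma            # scene chaos degree threshold
        self.t = t_seconds            # specified time interval t
        self.exceedances = deque()    # timestamps at which conf exceeded gamma

    def update(self, conf, now=None):
        now = time.time() if now is None else now
        # drop exceedances that have fallen out of the window
        while self.exceedances and now - self.exceedances[0] > self.t:
            self.exceedances.popleft()
        if conf > self.gamma:
            self.exceedances.append(now)
        if len(self.exceedances) >= 3:
            return "severe"   # e.g. red symbol on screen plus voice prompt
        if len(self.exceedances) >= 1:
            return "light"    # e.g. blue symbol on screen plus soft voice prompt
        return "none"
```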
The initial value of the threshold γ is determined as follows. A collection of video segments is first gathered, containing two types of video: segments with a general scene chaos degree and segments with a high scene chaos degree. Whether the scene chaos degree of a video segment is general or high is determined manually, and there should be more general segments than high ones, ideally at a ratio of more than 5 to 1. The method of the invention is then used to calculate the scene chaos degree of each video segment. Experienced persons determine the initial value of the scene chaos degree threshold γ by actually viewing the videos together with the calculated chaos degree of each segment. In practice, each person proposes an initial value of the threshold γ, and the average of these proposals is taken as the final initial value of γ.
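The final initialisation of γ is then just the average of the values proposed by the individual reviewers; the numbers below are placeholders for illustration only, not values from the publication.

```python
# Sketch: initial threshold gamma as the average of per-reviewer proposals
# (placeholder values, for illustration only).
reviewer_gammas = [0.8, 0.7, 0.9]
gamma_initial = sum(reviewer_gammas) / len(reviewer_gammas)
```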
In practical applications, some scenes, such as busy subway entrances and exits, should be allowed a relatively high initial value of the scene chaos degree threshold γ, because they are normally crowded. Other scenes, such as self-service banking outlets, should be allowed a relatively low initial value of γ, because they normally contain few people. Therefore, when the system is deployed, the administrator is allowed to adjust the initial value of the scene chaos degree threshold γ for the specific scene, so that the prompts on scene chaos degree better match actual needs.
It will be apparent to those skilled in the art that various other changes and modifications may be made in the above-described embodiments and concepts and all such changes and modifications are intended to be within the scope of the appended claims.

Claims (3)

1. A scene chaos degree discrimination method based on video analysis technology, characterized in that a horizontal direction difference u and a vertical direction difference v of the frame optical flow of a single frame are detected; the multi-frame average energy f̄, the inter-frame maximum optical flow differential dif_max and the inter-frame median optical flow differential dif_mean of k consecutive frames are detected; the scene chaos degree conf is obtained as

conf = a·u + b·v + c·f̄ + d·dif_max + e·dif_mean

and prompt information is sent out when the scene chaos degree exceeds a threshold value; wherein a, b, c, d and e are coefficients, and k is a natural number not less than 3;

the horizontal direction difference u of the frame optical flow is obtained from a formula, shown as an image in the publication, that accumulates the absolute differences |u_mn - u_pq| between the horizontal optical flow values of any two different pixels of the same frame; wherein m and n are the coordinates of any pixel in the frame, p and q are the coordinates of any other pixel in the same frame, and M and N are the maximum values of the coordinates in the frame; u_mn is the horizontal direction optical flow value of the pixel with coordinates (m, n), and u_pq is the horizontal direction optical flow value of the pixel with coordinates (p, q);

the vertical direction difference v of the frame optical flow is obtained from the corresponding formula over the absolute differences |v_mn - v_pq|; v_mn is the vertical direction optical flow value of the pixel with coordinates (m, n), and v_pq is the vertical direction optical flow value of the pixel with coordinates (p, q);

the detected multi-frame average energy f̄ is obtained as

f̄ = (1/(M·N)) Σ_{m=1}^{M} Σ_{n=1}^{N} f̄_mn,   where   f̄_mn = (1/k) Σ_{i=1}^{k} f_mn^(i)

is the average of the optical flow magnitude values f_mn of the pixel at (m, n) over the k consecutive frames;

the optical flow magnitude value f_mn is obtained from the formula f_mn = |u_mn| + |v_mn|;

the inter-frame maximum optical flow differential dif_max is obtained as dif_max = max over all (m, n) of dif_mn, wherein dif_mn is the inter-frame optical flow differential at coordinate point (m, n), obtained from the k per-frame magnitude values at that point by a formula shown as an image in the publication;

the inter-frame median optical flow differential dif_mean is obtained as follows: all inter-frame optical flow differentials dif_mn are sorted by magnitude; if there is an odd number of them, the middle value is assigned as dif_mean; if there is an even number, the average of the two middle values is assigned as dif_mean.
2. The scene chaos degree discrimination method based on video analysis technology as claimed in claim 1, wherein the coefficients a, b, c and d all take the value 1, and the coefficient e takes the value 0.5.
3. The method as claimed in claim 2, wherein the k value ranges from 15 to 40.
CN202010312148.4A 2020-04-20 2020-04-20 Scene chaos degree discrimination method based on video analysis technology Active CN111539301B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010312148.4A CN111539301B (en) 2020-04-20 2020-04-20 Scene chaos degree discrimination method based on video analysis technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010312148.4A CN111539301B (en) 2020-04-20 2020-04-20 Scene chaos degree discrimination method based on video analysis technology

Publications (2)

Publication Number Publication Date
CN111539301A CN111539301A (en) 2020-08-14
CN111539301B true CN111539301B (en) 2023-04-18

Family

ID=71975134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010312148.4A Active CN111539301B (en) 2020-04-20 2020-04-20 Scene chaos degree discrimination method based on video analysis technology

Country Status (1)

Country Link
CN (1) CN111539301B (en)

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5400718B2 (en) * 2010-07-12 2014-01-29 株式会社日立国際電気 Monitoring system and monitoring method

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003051016A (en) * 2001-05-11 2003-02-21 Honda Motor Co Ltd System, method and program for detecting approach
CN101751553A (en) * 2008-12-03 2010-06-23 中国科学院自动化研究所 Method for analyzing and predicting large-scale crowd density
CN102708571A (en) * 2011-06-24 2012-10-03 杭州海康威视软件有限公司 Method and device for detecting strenuous motion in video
CN103384331A (en) * 2013-07-19 2013-11-06 上海交通大学 Video inter-frame forgery detection method based on light stream consistency
CN103500324A (en) * 2013-09-29 2014-01-08 重庆科技学院 Violent behavior recognition method based on video monitoring
CN105389567A (en) * 2015-11-16 2016-03-09 上海交通大学 Group anomaly detection method based on a dense optical flow histogram
CN205334277U (en) * 2016-02-18 2016-06-22 贵州思索电子有限公司 Visual remote monitoring governing system of storehouse humiture
RU168516U1 (en) * 2016-02-26 2017-02-07 Общество с ограниченной ответственностью "Вебзавод" AUTONOMOUS OPTICAL LIQUID FLOW METER FOR MEDICAL DROPS
CN106022234A (en) * 2016-05-13 2016-10-12 中国人民解放军国防科学技术大学 Abnormal crowd behavior detection algorithm based on optical flow computation
CN107330372A (en) * 2017-06-05 2017-11-07 四川大学 A kind of crowd density based on video and the analysis method of unusual checking system
CN108288021A (en) * 2017-12-12 2018-07-17 深圳市深网视界科技有限公司 A kind of crowd's accident detection method, electronic equipment and storage medium
CN108596157A (en) * 2018-05-14 2018-09-28 三峡大学 A kind of crowd's agitation scene detection method and system based on motion detection
CN110111357A (en) * 2019-04-03 2019-08-09 天津大学 A kind of saliency detection method
CN110334665A (en) * 2019-07-10 2019-10-15 贵州安防工程技术研究中心有限公司 A kind of face identification system and method for 3D identification
CN110300030A (en) * 2019-07-11 2019-10-01 贵州安防工程技术研究中心有限公司 Intelligent video operation management system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Anomaly Detection using Context Dependent Optical Flow; Mondal, Ranjan et al.; 11th Indian Conference on Computer Vision, Graphics and Image Processing; 2018-12-22; full text *
Detecting anomalous crowd behavior using correlation analysis of optical flow; Nayan, Navneet et al.; Signal, Image and Video Processing; 2019-09-04; Vol. 13, No. 6; full text *
基于人群密度的异常行为检测与分级研究 (Abnormal behavior detection and grading based on crowd density); 韦招静 et al.; 《电视技术》; 2018-03-05 (No. 03); full text *
基于加权光流能量的异常行为检测 (Abnormal behavior detection based on weighted optical flow energy); 傅博 et al.; 《吉林大学学报(工学版)》; 2013-11-15 (No. 06); full text *

Also Published As

Publication number Publication date
CN111539301A (en) 2020-08-14

Similar Documents

Publication Publication Date Title
CN110428522B (en) Intelligent security system of wisdom new town
US11721187B2 (en) Method and system for displaying video streams
WO2022011828A1 (en) System and method for detecting object that gets in and out of elevator, object detection system, elevator light curtain, and elevator device
CN104123544B (en) Anomaly detection method and system based on video analysis
CN105447458B (en) A kind of large-scale crowd video analytic system and method
JP4673849B2 (en) Computerized method and apparatus for determining a visual field relationship between a plurality of image sensors
US9240051B2 (en) Object density estimation in video
CN104581081B (en) Passenger flow analysing method based on video information
US20030043160A1 (en) Image data processing
CN201936415U (en) Automatic forest fire identification and alarm system
CN110021133B (en) All-weather fire-fighting fire patrol early-warning monitoring system and fire image detection method
CN103106766A (en) Forest fire identification method and forest fire identification system
CN102013009A (en) Smoke image recognition method and device
KR20060031832A (en) A smart visual security system based on real-time behavior analysis and situation cognizance
CN103686086A (en) Method for carrying out video monitoring on specific area
CN109614875B (en) Intelligent security alarm system based on motion rule
JP2013127716A (en) Abnormal state detection system for congestion
CN106251363A (en) A kind of wisdom gold eyeball identification artificial abortion's demographic method and device
CN107480653A (en) passenger flow volume detection method based on computer vision
CN102025975A (en) Automatic monitoring method and system
CN110619735A (en) System for monitoring and alarming falling object
CN108288361A (en) A kind of passageway for fire apparatus door state detection method
CN109948474A (en) AI thermal imaging all-weather intelligent monitoring method
CN115171022A (en) Method and system for detecting wearing of safety helmet in construction scene
CN114917519B (en) Building intelligent fire control system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant