WO2020221031A1 - Behavior heat map generation method, alarm device, electronic device, and storage medium - Google Patents

Behavior heat map generation method, alarm device, electronic device, and storage medium

Info

Publication number
WO2020221031A1
Authority
WO
WIPO (PCT)
Prior art keywords
behavior
sub
designated
target
area
Application number
PCT/CN2020/085423
Other languages
English (en)
Chinese (zh)
Inventor
赵飞
Original Assignee
杭州海康威视数字技术股份有限公司
Application filed by 杭州海康威视数字技术股份有限公司 (Hangzhou Hikvision Digital Technology Co., Ltd.)
Publication of WO2020221031A1

Classifications

    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00: Fire alarms; Alarms responsive to explosion
    • G08B17/12: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G08B17/125: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions, by using a video camera to detect fire or smoke
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/24: Aligning, centring, orientation detection or correction of the image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/25: Determination of region of interest [ROI] or a volume of interest [VOI]
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B17/00: Fire alarms; Alarms responsive to explosion
    • G08B17/12: Actuation by presence of radiation or particles, e.g. of infrared radiation or of ions
    • G: PHYSICS
    • G08: SIGNALLING
    • G08B: SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00: Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02: Alarms for ensuring the safety of persons

Definitions

  • This application relates to the field of image processing technology, and in particular to behavior heat map generation and alarm methods, devices, electronic equipment, and storage media.
  • The purpose of the embodiments of the present application is to provide methods, devices, electronic devices, and storage media for generating behavior heat maps and raising alarms, so as to realize intuitive monitoring of a large area.
  • The specific technical solutions are as follows:
  • In a first aspect, an embodiment of the present application provides an alarm method. The method includes: displaying a behavior heat map of an area to be counted, wherein the behavior heat map represents the frequency of occurrence of a specified behavior in each sub-region of the area to be counted; and, when a sub-region of the behavior heat map meets a preset alarm condition, triggering an alarm for the sub-region meeting the preset alarm condition.
  • Optionally, triggering an alarm for the sub-region meeting the preset alarm condition includes: comparing the frequency of occurrence of the specified behavior in each sub-region of the behavior heat map with a preset frequency threshold; and, for a target sub-region whose frequency of occurrence of the specified behavior is greater than the preset frequency threshold, triggering an alarm for that target sub-region.
  • Optionally, the sub-regions of the behavior heat map include thermal colors, where a thermal color represents the frequency of occurrence of the specified behavior in its sub-region: the higher the frequency of occurrence of the specified behavior in a sub-region, the higher the heat value of that sub-region's thermal color. In this case, triggering an alarm for the sub-region meeting the preset alarm condition includes: comparing the heat value of each sub-region's thermal color with a preset heat threshold; and, for a sub-region to be alarmed whose heat value is greater than the preset heat threshold, triggering an alarm for that sub-region.
  • Optionally, the sub-regions of the behavior heat map include thermal colors; the designated behaviors are multiple designated behaviors, and different designated behaviors correspond to different thermal colors; the depth of a thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color; and each thermal color corresponds to an alarm linkage. In this case, triggering an alarm for the sub-region meeting the preset alarm condition includes: for each thermal color in each sub-region, comparing the depth of the thermal color with the preset depth threshold corresponding to that thermal color, and, for a target thermal color whose depth is greater than its preset depth threshold, triggering the alarm linkage corresponding to the target thermal color for the sub-region where it is located.
  • Optionally, the alarm method further includes: obtaining a user's display instruction for a sub-region to be displayed, and displaying the image data of the sub-region to be displayed according to the display instruction, wherein the image data of the sub-region to be displayed is a video stream of the monitoring area within that sub-region.
  • In a second aspect, an embodiment of the present application provides a method for generating a behavior heat map, which is applied to a back-end device. The method includes: obtaining a behavior analysis result of each designated target in the image data of each preset monitoring area, wherein the preset monitoring area is an area within the area to be counted; determining the frequency of occurrence of a designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target, wherein the sub-regions intersect the preset monitoring areas; and generating a behavior heat map of the designated behavior for the area to be counted according to the frequency of occurrence of the designated behavior in each sub-region.
  • Optionally, obtaining the behavior analysis result of each designated target in the image data of each preset monitoring area includes: acquiring the image data of each preset monitoring area; using computer vision technology to track and detect the designated targets in each piece of image data and extract the pixel area sequence of each designated target; and analyzing the pixel area sequence of each designated target to obtain the behavior analysis result of each designated target.
  • Optionally, the pixel area sequence of each designated target is the pixel area sequence of each sampled designated target, and using computer vision technology to track and detect the designated targets in each piece of image data and extract the pixel area sequence of each designated target includes: determining each designated target and its position in each piece of image data through a preset target detection algorithm and a preset target tracking algorithm; sampling the designated targets in each piece of image data through a preset target sampling algorithm to obtain the sampled designated targets; and performing target behavior sequence extraction on each piece of image data according to the positions of the sampled designated targets to obtain the pixel area sequence of each sampled designated target.
  • Optionally, one sub-region includes at least one preset monitoring area, and determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target includes: obtaining the inclusion relationship between each sub-region and each preset monitoring area; and determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the inclusion relationship, the behavior analysis result of each designated target, and the preset monitoring area where each designated target is located.
  • Optionally, the method further includes: acquiring the actual position of each designated target in the preset monitoring area; and determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target includes: determining the frequency of occurrence of the designated behavior in each sub-region according to the actual position of each designated target and the behavior analysis result of each designated target.
  • Optionally, acquiring the actual position of each designated target in the preset monitoring area includes: determining the position of each designated target in the image data according to the pixel area sequence of each designated target; and determining the actual position of each designated target in the preset monitoring area according to the position of each designated target in the image data.
  • Optionally, determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the actual position of each designated target and the behavior analysis result of each designated target includes: classifying the designated targets according to their behavior analysis results to obtain multiple behavior lists, wherein the designated targets in the same behavior list have the same behavior type; determining the target behavior list corresponding to the designated behavior; and determining the frequency of occurrence of the designated behavior in each sub-region according to the actual positions of the designated targets in the target behavior list.
  • Optionally, before determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted, the method further includes: acquiring a granularity setting instruction input by a user, wherein the granularity setting instruction represents the size attribute of the sub-regions; and determining each sub-region in the area to be counted according to the granularity setting instruction.
  • Optionally, there are multiple designated behaviors, and generating the behavior heat map of the designated behaviors for the area to be counted according to the frequency of occurrence of the designated behaviors in each sub-region includes: obtaining an electronic map of the area to be counted and the frequency of occurrence of each designated behavior in each sub-region; determining the thermal color corresponding to each designated behavior; and, for any sub-region in the electronic map, displaying on the map of that sub-region the thermal color corresponding to each designated behavior according to the frequency of occurrence of each designated behavior in the sub-region, wherein the depth of any thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color.
  • Optionally, the behavior analysis result of each designated target is a list of designated targets that trigger each designated behavior, and obtaining the behavior analysis result of each designated target in the image data of each preset monitoring area includes: receiving the behavior lists sent by the smart devices, wherein a behavior list includes the identifications of designated targets and the designated targets in the same behavior list have the same behavior type; and assembling the behavior lists to obtain, for each designated behavior, the list of designated targets that trigger it.
  • In a third aspect, an embodiment of the present application provides a method for sending a behavior list, which is applied to a front-end smart device. The method includes: acquiring image data of a preset monitoring area; analyzing the image data through computer vision technology to obtain the behavior analysis result of each designated target in the image data; classifying the designated targets according to their behavior analysis results to obtain multiple behavior lists, wherein the designated targets in the same behavior list have the same behavior type; and sending each behavior list.
  • Optionally, the behavior analysis result of each designated target is the behavior analysis result of each sampled designated target, and analyzing the image data through computer vision technology to obtain the behavior analysis result of each designated target in the image data includes: determining each designated target and its position in the image data through a preset target detection algorithm and a preset target tracking algorithm; sampling the designated targets in the image data through a preset target sampling algorithm to obtain the sampled designated targets; performing target behavior sequence extraction on the image data according to the positions of the sampled designated targets to obtain the pixel area sequence of each sampled designated target; and analyzing the pixel area sequence of each sampled designated target to obtain the behavior analysis result of each sampled designated target.
  • an embodiment of the present application provides an alarm device, which includes:
  • the heat map display module is used to display the behavior heat map of the area to be counted, wherein the behavior heat map represents the frequency of occurrence of a specified behavior in each sub-region of the area to be counted;
  • the alarm triggering module is used to trigger an alarm for the sub-area meeting the preset alarm condition when the sub-area of the behavior heat map meets the preset alarm condition.
  • the alarm trigger module includes:
  • a frequency comparison sub-module for comparing the frequency of occurrence of a specified behavior in each sub-region of the behavior heat map with a preset frequency threshold
  • the sub-area alarm sub-module is used for triggering an alarm for the target sub-area for the target sub-area whose frequency of occurrence of the specified behavior is greater than the preset frequency threshold.
  • Optionally, the sub-regions of the behavior heat map include thermal colors, where a thermal color represents the frequency of occurrence of the specified behavior in its sub-region: the higher the frequency of occurrence of the specified behavior in a sub-region, the higher the heat value of that sub-region's thermal color;
  • the alarm trigger module includes:
  • a heat value comparison sub-module, configured to compare the heat value of each sub-region's thermal color with a preset heat threshold; and
  • a triggering alarm sub-module, configured to trigger an alarm for a sub-region to be alarmed whose heat value is greater than the preset heat threshold.
  • Optionally, the sub-regions of the behavior heat map include thermal colors; the designated behaviors are multiple designated behaviors, and different designated behaviors correspond to different thermal colors; the depth of a thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color; and each thermal color corresponds to an alarm linkage.
  • In this case, the alarm trigger module is specifically configured to: for each thermal color in each sub-region, compare the depth of the thermal color with the preset depth threshold corresponding to that thermal color, and, for a target thermal color whose depth is greater than its preset depth threshold, trigger the alarm linkage corresponding to the target thermal color for the sub-region where it is located.
  • the alarm device in the embodiment of the present application further includes:
  • a display instruction receiving module, configured to obtain a user's display instruction for a sub-region to be displayed; and
  • an image data display module, configured to display the image data of the sub-region to be displayed according to the display instruction, wherein the image data of the sub-region to be displayed is a video stream of the monitoring area within that sub-region.
  • an embodiment of the present application provides an apparatus for generating a behavioral heat map, which is applied to a back-end device, and the apparatus includes:
  • the analysis result obtaining module is used to obtain the behavior analysis result of each designated target in the image data of each preset monitoring area, wherein the preset monitoring area is the area in the area to be counted;
  • a sub-region frequency statistics module, configured to determine the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target, wherein the sub-regions intersect the preset monitoring areas; and
  • the behavior heat map generating module is configured to generate a behavior heat map of the designated behavior of the region to be counted according to the frequency of occurrence of the designated behavior in each of the sub-regions.
  • the analysis result obtaining module includes:
  • an image data acquisition sub-module, configured to acquire the image data of each preset monitoring area; and
  • a behavior analysis sub-module, wherein the behavior analysis sub-module includes:
  • An area sequence determination unit configured to track and detect the designated targets in each of the image data using computer vision technology, and extract the pixel region sequence of each of the designated targets;
  • the area sequence analysis unit is used to analyze the pixel area sequence of each designated target to obtain the behavior analysis result of each designated target.
  • Optionally, the pixel area sequence of each designated target is the pixel area sequence of each sampled designated target, and the area sequence determination unit includes:
  • a position determination subunit, configured to determine each designated target and its position in each piece of image data through a preset target detection algorithm and a preset target tracking algorithm;
  • a sampling subunit, configured to sample the designated targets in each piece of image data through a preset target sampling algorithm to obtain the sampled designated targets; and
  • a region extraction subunit, configured to perform target behavior sequence extraction on each piece of image data according to the position of each sampled designated target, to obtain the pixel area sequence of each sampled designated target.
  • one of the sub-regions includes at least one of the preset monitoring regions, and the sub-region frequency statistics module includes:
  • An inclusion relationship determination sub-module for obtaining the inclusion relationship between each of the sub-regions and each of the preset monitoring regions
  • a behavior frequency statistics sub-module, configured to determine the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the inclusion relationship, the behavior analysis result of each designated target, and the preset monitoring area where each designated target is located.
  • the device for generating a behavioral heat map in this embodiment of the present application further includes:
  • An actual position acquisition module configured to acquire the actual position of each designated target in the preset monitoring area
  • the sub-region frequency statistics module is specifically configured to determine the frequency of occurrence of the designated behavior in each sub-region of the region to be counted according to the actual location of each designated target and the behavior analysis result of each designated target.
  • the actual location acquisition module includes:
  • An image position acquisition sub-module configured to determine the position of each designated target in the image data according to the pixel area sequence of each designated target
  • the actual position mapping sub-module is used to determine the actual position of each designated target in the preset monitoring area according to the position of each designated target in the image data.
  • the sub-region frequency statistics module includes:
  • the designated target classification sub-module is used to classify each designated target according to the behavior analysis result of each designated target to obtain multiple behavior lists, wherein the behavior types of the designated targets in the same behavior list are the same;
  • the target list determination sub-module is used to determine the target behavior list corresponding to the specified behavior
  • the frequency determination sub-module is used to determine the frequency of occurrence of the specified behavior in each sub-region of the area to be counted according to the actual position of each designated target in the target behavior list.
  • the device for generating a behavioral heat map in this embodiment of the present application further includes:
  • a setting instruction acquisition module configured to acquire a granularity setting instruction input by a user, wherein the granularity setting instruction represents the size attribute of the sub-region;
  • the sub-region setting module is used to determine each sub-region in the area to be counted according to the granularity setting instruction.
  • the specified behavior has multiple specified behaviors
  • the behavior heat map generating module includes:
  • the multi-frequency statistics sub-module is used to obtain an electronic map of the area to be counted, and obtain the frequency of occurrence of each designated behavior in each sub-area;
  • the thermal color corresponding sub-module is used to determine the thermal color corresponding to each specified behavior
  • a map coloring sub-module, configured to, for any sub-region in the electronic map, display on the map of that sub-region the thermal color corresponding to each designated behavior according to the frequency of occurrence of each designated behavior in the sub-region, wherein the depth of any thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color.
  • the behavior analysis result of each designated target is a list of designated targets that trigger each designated behavior; the analysis result obtaining module includes:
  • the behavior list receiving sub-module is configured to receive the behavior list sent by each smart device, wherein the behavior list includes the identification of the designated target, and the behavior types of the designated targets in the same behavior list are the same;
  • the behavior list assembling sub-module is used to assemble each of the behavior lists to obtain the designated target lists that trigger each designated behavior.
  • an embodiment of the present application provides a behavior list sending device, which is applied to a front-end smart device, and the device includes:
  • the image data acquisition module is used to acquire the image data of the preset monitoring area
  • the target behavior analysis module is used to analyze the image data through computer vision technology to obtain the behavior analysis result of each designated target in the image data;
  • the designated target classification module is configured to classify each designated target according to the behavior analysis result of each designated target to obtain multiple behavior lists, wherein the behavior types of the designated targets in the same behavior list are the same;
  • the behavior list sending module is used to send each of the behavior lists.
  • the behavior analysis result of each designated target is the behavior analysis result of each sampled designated target
  • the target behavior analysis module includes:
  • the target position determination sub-module is used to determine each designated target in the image data and the position of each designated target through a preset target detection algorithm and a preset target tracking algorithm;
  • the designated target sampling sub-module is used to sample each designated target in the image data by using a preset target sampling algorithm to obtain each sampling designated target;
  • the pixel region interception sub-module is configured to extract the target behavior sequence of the image data according to the position of each of the sampled designated targets to obtain the pixel region sequence of each of the sampled designated targets;
  • the target behavior analysis sub-module is used to analyze the pixel area sequence of each of the sampling designated targets to obtain the behavior analysis result of each of the sampling designated targets.
  • an embodiment of the present application provides an electronic device, including a processor and a memory;
  • the memory is used to store computer programs
  • the processor is configured to implement the alarm method described in any one of the first aspects when executing the program stored in the memory.
  • an embodiment of the present application provides an electronic device, including a processor and a memory;
  • the memory is used to store computer programs
  • the processor is configured to implement the method for generating a behavioral heat map according to any one of the second aspects when executing the program stored in the memory.
  • an embodiment of the present application provides an electronic device including a processor and a memory
  • the memory is used to store computer programs
  • the processor is configured to implement the behavior list sending method of any one of the foregoing third aspects when executing the program stored in the memory.
  • An embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the alarm method described in any one of the foregoing first aspect is implemented.
  • An embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the behavior heat map generation method described in any one of the foregoing second aspect is implemented.
  • An embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, the behavior list sending method described in any one of the foregoing third aspect is implemented.
  • The behavior heat map generation and alarm methods, devices, electronic equipment, and storage media provided in the embodiments of the application obtain the behavior analysis result of each designated target in the image data of each preset monitoring area, where the preset monitoring area is an area within the area to be counted; determine, according to the behavior analysis results of the designated targets, the frequency of occurrence of the designated behavior in each sub-region of the area to be counted, where the sub-regions intersect the preset monitoring areas; and generate the behavior heat map of the designated behavior for the area to be counted according to the frequency of occurrence of the designated behavior in each sub-region.
  • FIG. 1a is a first schematic diagram of a method for generating a behavioral heat map according to an embodiment of this application
  • FIG. 1b is a second schematic diagram of a method for generating a behavioral heat map according to an embodiment of the application
  • FIG. 2 is a third schematic diagram of a method for generating a behavioral heat map according to an embodiment of the application
  • FIG. 3 is a schematic diagram of an alarm method according to an embodiment of the application.
  • FIG. 4 is a schematic diagram of a deep learning algorithm training process according to an embodiment of the application.
  • FIG. 5 is a fourth schematic diagram of a method for generating a behavioral heat map according to an embodiment of the application.
  • FIG. 6 is a schematic diagram of a method for sending a behavior list according to an embodiment of the application.
  • FIG. 7 is a schematic diagram of an apparatus for generating a behavioral heat map according to an embodiment of the application.
  • FIG. 8 is a schematic diagram of an apparatus for sending a behavior list according to an embodiment of this application.
  • FIG. 9 is a schematic diagram of an electronic device according to an embodiment of the application.
  • an embodiment of the present application provides a method for generating a behavioral heat map. See FIG. 1a, which is applied to a back-end device. The method includes:
  • S101 Obtain a behavior analysis result of each designated target in the image data of each preset monitoring area, where the foregoing preset monitoring area is an area in the area to be counted.
  • the method for generating a behavioral heat map in the embodiment of the present application is applied to a back-end device, so it can be executed by a back-end device.
  • the back-end device may be a server, a personal computer, or a hard disk video recorder.
  • the image data in the embodiments of the present application may be a video stream, and in some application scenarios that only include target recognition, it may also be a single-frame video frame.
  • the preset monitoring area is the monitoring area designated by the user in the area to be counted.
  • The back-end device can directly obtain the behavior analysis results of the designated targets in each piece of image data through smart cameras installed in the preset monitoring areas: each smart camera analyzes the image data it collects using computer vision technology, obtains the behavior analysis result of each designated target in that image data, and sends it to the back-end device.
  • the smart camera can send the behavior analysis results of each specified target through a behavior list.
  • The smart camera establishes a behavior list for each behavior type and adds the identification of any designated target that triggers a behavior of that type to the corresponding behavior list.
  • In this case, obtaining the behavior analysis result of each designated target in the image data of each preset monitoring area includes: receiving the behavior lists sent by the smart devices, wherein a behavior list includes the identifications of designated targets and the designated targets in the same behavior list have the same behavior type; and assembling the behavior lists to obtain, for each designated behavior, the list of designated targets that trigger it, the behavior analysis results of the designated targets being expressed in the form of these designated target lists.
  • In this way, the back-end device only needs to summarize the analysis results of the image data of multiple preset monitoring areas, while the analysis itself is carried out by the front-end smart devices, which reduces the processing pressure on the back-end device and improves flexibility.
  • the smart device here may be a smart camera or a hard disk video recorder, etc.
  • the image data analysis process is executed by a back-end device. See FIG. 1b.
  • the above-mentioned acquiring behavior analysis results of each designated target in the image data of each preset monitoring area includes:
  • the back-end device receives the image data of each preset monitoring area sent by each camera.
  • S1012 Analyze each of the above-mentioned image data through computer vision technology to obtain a behavior analysis result of the designated target in each of the above-mentioned image data.
  • the back-end equipment uses computer vision technology to obtain the behavior analysis results of the specified targets in each image data.
  • the designated target is the target that the user wants to pay attention to, which can be a person, a vehicle, or an animal, etc., which can be set according to actual requirements.
  • the computer vision technology is a pre-trained deep learning algorithm, and the back-end device uses the deep learning algorithm to analyze the image data to obtain the behavior analysis result of the specified target in the image data.
  • The training process of the deep learning algorithm can be as shown in Figure 4, including: determining the behavior types of interest, labeling the behavior type of the designated target in each piece of image data containing a designated target to obtain sample image data, feeding the sample image data into the deep learning algorithm for training, and obtaining the pre-trained deep learning algorithm after convergence (a minimal training sketch is given below).
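  • The following is a minimal, hypothetical sketch of that training flow in Python (PyTorch-style): label behavior types, feed the labelled samples to a network, and iterate until the loss converges. The tiny model, the behavior class names, and the synthetic tensors are placeholders for illustration only, not the application's actual algorithm or data.

```python
# Hypothetical sketch of the Figure 4 training flow: labelled samples -> training -> convergence.
# The model, behavior classes, and random data below are placeholders, not the patent's setup.
import torch
import torch.nn as nn

BEHAVIOR_CLASSES = ["queuing", "falling", "fighting"]   # assumed behavior types of interest

model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 128), nn.ReLU(),
                      nn.Linear(128, len(BEHAVIOR_CLASSES)))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in for the calibrated sample image data (crops of designated targets + behavior labels).
images = torch.randn(256, 3, 64, 64)
labels = torch.randint(0, len(BEHAVIOR_CLASSES), (256,))

for epoch in range(20):                  # in practice, train until the loss stops improving
    loss = loss_fn(model(images), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```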
  • the foregoing analysis of each of the foregoing image data by computer vision technology to obtain the behavior analysis result of the specified target in each of the foregoing image data includes:
  • Step 1 Using computer vision technology, the designated targets in each of the above-mentioned image data are respectively tracked and detected, and the pixel region sequence of each of the above-mentioned designated targets is extracted.
  • Computer vision technology can include target detection algorithms and target tracking algorithms.
  • The back-end device recognizes the designated targets in each piece of image data through the target detection algorithm and tracks them through the target tracking algorithm, so as to obtain the position of each designated target in the image data; the pixel area sequence of each designated target is then extracted according to its position in the image data.
  • the pixel area sequence of the designated target may be sampled.
  • the computer vision technology is used to track and detect the designated targets in each of the above-mentioned image data, and extract the pixel region sequence of each of the above-mentioned designated targets, including:
  • Step A: Determine each designated target and the position of each designated target in each piece of image data through a preset target detection algorithm and a preset target tracking algorithm.
  • The back-end device recognizes the designated targets in each piece of image data through the target detection algorithm and tracks them through the target tracking algorithm, so as to obtain the position of each designated target in the image data.
  • A unique ID can be set for each designated target.
  • The target detection algorithm can include pedestrian target detection, such as HOG (Histogram of Oriented Gradients), DPM (Deformable Parts Models), FRCNN (Faster Region-based Convolutional Neural Network), YOLO (You Only Look Once), or SSD (Single Shot MultiBox Detector); the target tracking algorithm can be a multi-target tracking algorithm (a minimal detection sketch is given below).
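  • As a concrete illustration of the detection part of Step A, the sketch below runs OpenCV's built-in HOG pedestrian detector on a single frame; the patent leaves the concrete detector and tracker open (HOG, DPM, FRCNN, YOLO, SSD, multi-target tracking, etc.), so this is an assumption rather than the application's actual pipeline, and the tracking stage is reduced to simply numbering the detections.

```python
# Minimal sketch of Step A's detection stage using OpenCV's default HOG people detector.
# "frame.jpg" is a placeholder for one video frame of a preset monitoring area.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("frame.jpg")
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))

# A real implementation would pass these boxes to a multi-target tracker so that each designated
# target keeps a stable ID across frames; here each detection simply gets an index as its ID.
targets = [{"id": i, "box": tuple(int(v) for v in box)} for i, box in enumerate(boxes)]
print(targets)
```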
  • Step B: Sample the designated targets in each piece of image data through a preset target sampling algorithm to obtain the sampled designated targets.
  • The back-end device also performs target sampling on the designated targets, thereby reducing the processing pressure on the back-end device.
  • The back-end device can sample the designated targets through any relevant sampling algorithm, for example, target sparse sampling of the designated targets in each piece of image data.
  • Target sparse sampling methods include, but are not limited to, uniform target sampling, uniform point sampling, weighted point sampling, and sampling based on the number of targets in a region; sampling yields an appropriate number of designated targets, that is, the sampled designated targets.
  • Step C: Perform target behavior sequence extraction on each piece of image data according to the position of each sampled designated target, to obtain the pixel area sequence of each sampled designated target.
  • In Step A the position of each designated target was determined, and each sampled designated target is one of those designated targets, so the position of each sampled designated target is known.
  • The back-end device performs target behavior sequence extraction on each piece of image data according to the position of each sampled designated target, intercepting image regions according to a certain structure, such as a tubelet, to obtain the pixel area sequence of each sampled designated target; the pixel area sequences of the sampled designated targets are used instead of the pixel area sequences of all designated targets (see the sketch below).
  • In this way, the pixel area sequences are sparsely sampled according to the density of the targets, which preserves the distribution characteristics of the designated targets across different preset monitoring areas while reducing the amount of data processed for behavior type recognition, improving the practicability of the overall scheme.
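  • A minimal sketch of Steps B and C is given below: the tracked designated targets are sparsely sampled (here by uniform random selection), and each sampled target's bounding boxes are cut out of consecutive frames to form its pixel area sequence (a tubelet-like structure). The track format and the sampling rule are assumptions for illustration.

```python
# Sketch of Steps B and C: sample the tracked designated targets, then crop a per-target
# sequence of pixel areas from consecutive frames. Data layout is assumed for illustration.
import random

def sample_targets(tracks, keep):
    """Step B: uniformly sample `keep` tracked targets (one form of target sparse sampling)."""
    ids = list(tracks)
    return {tid: tracks[tid] for tid in random.sample(ids, min(keep, len(ids)))}

def extract_tubelets(frames, sampled_tracks):
    """Step C: cut each sampled target's per-frame box out of the frames -> pixel area sequence."""
    tubelets = {}
    for tid, boxes in sampled_tracks.items():      # boxes: per-frame (x, y, w, h)
        sequence = []
        for frame, (x, y, w, h) in zip(frames, boxes):
            sequence.append(frame[y:y + h, x:x + w])   # pixel area of this target in this frame
        tubelets[tid] = sequence
    return tubelets
```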
  • Step 2 Analyze the pixel area sequence of each of the above-mentioned designated targets to obtain the behavior analysis result of each of the above-mentioned designated targets.
  • The pixel area sequence of each designated target is analyzed using an image sequence behavior recognition framework, for example LSTM (Long Short-Term Memory), a two-stream network, C3D (3D ConvNets), P3D (Pseudo-3D Residual Networks), ARTNet, PointNet, or PointSIFT, combined with a classification neural network to extract sequence behavior features, so as to obtain the behavior analysis result for the pixel area sequence of each designated target.
  • Classification neural networks include, but are not limited to, ResNet-18, ResNet-50, ResNet-101, ResNet-152 (residual neural networks), Inception-v1, VGG (Visual Geometry Group network), etc.
  • the behavior analysis result includes behavior category and confidence.
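  • The sketch below illustrates this step with a toy sequence classifier: a small convolutional backbone extracts a feature per frame of the pixel area sequence, an LSTM aggregates the sequence, and a softmax head yields a behavior category and a confidence. The tiny backbone, the two-class head, and the random input are placeholders and not the networks named above.

```python
# Toy illustration of sequence behavior recognition: CNN features per frame -> LSTM -> softmax,
# returning (behavior category, confidence). Sizes and classes are placeholders.
import torch
import torch.nn as nn

class SequenceBehaviorClassifier(nn.Module):
    def __init__(self, num_behaviors=2, feat_dim=64):
        super().__init__()
        self.backbone = nn.Sequential(nn.Conv2d(3, 8, 3, stride=2, padding=1), nn.ReLU(),
                                      nn.AdaptiveAvgPool2d(4), nn.Flatten(),
                                      nn.Linear(8 * 4 * 4, feat_dim))
        self.lstm = nn.LSTM(feat_dim, feat_dim, batch_first=True)
        self.head = nn.Linear(feat_dim, num_behaviors)

    def forward(self, clip):                            # clip: (T, 3, H, W) pixel area sequence
        feats = self.backbone(clip).unsqueeze(0)        # (1, T, feat_dim)
        _, (h, _) = self.lstm(feats)
        return torch.softmax(self.head(h[-1]), dim=-1)  # per-behavior probabilities

clip = torch.randn(16, 3, 64, 64)                       # one sampled target's pixel area sequence
probs = SequenceBehaviorClassifier()(clip)[0]
category, confidence = int(probs.argmax()), float(probs.max())
print(category, confidence)
```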
  • S102 Determine the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each of the above-mentioned designated targets, where the above-mentioned sub-region and the above-mentioned preset monitoring area have an intersection;
  • the area to be counted can be a preset area or a user-designated area.
  • the above method further includes: acquiring a region to be counted selection instruction input by the user; and determining the region to be counted according to the above-mentioned region to be counted selection instruction.
  • the selection command of the area to be counted represents the range of the area to be counted.
  • Each sub-region of the area to be counted may be predetermined, for example, a plurality of area intervals are divided according to the area size in advance, and the size and division method of the sub-regions are set for each area interval.
  • Alternatively, sub-regions may be pre-divided at different granularities, for example roads, floors, communities, urban districts, cities, or provinces, and the sub-regions included in the area to be counted are then determined from the pre-divided sub-regions.
  • the above method further includes:
  • Step 1 Obtain a granularity setting instruction input by a user, wherein the granularity setting instruction represents the size attribute of the sub-region.
  • the granularity setting instruction represents the size of the sub-region.
  • the granularity setting instruction represents the sub-region as a road, a floor, a district, an urban area, a city, or a province.
  • Step 2: Determine each sub-region in the area to be counted according to the granularity setting instruction. For example, when the granularity setting instruction characterizes the sub-regions as residential communities, each sub-region is determined to be a community; when it characterizes the sub-regions as streets, each sub-region is determined to be a street (a selection sketch is given below).
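  • A minimal sketch of these two steps is shown below: pre-divided sub-regions are stored per granularity, and the sub-regions matching the user's granularity setting that fall inside the area to be counted are selected. The data layout, names, and rectangular bounds are assumptions for illustration.

```python
# Sketch of the granularity setting: pick pre-divided sub-regions of the requested granularity
# that lie inside the area to be counted. All names and bounds are made-up examples.
PRE_DIVIDED = {
    "street":    [{"name": "street-01", "bounds": (0, 0, 500, 500)},
                  {"name": "street-02", "bounds": (500, 0, 1000, 500)}],
    "community": [{"name": "community-A", "bounds": (0, 0, 1000, 500)}],
}

def subregions_for(granularity, area_bounds):
    """Return the pre-divided sub-regions of this granularity whose bounds lie inside the area."""
    ax1, ay1, ax2, ay2 = area_bounds
    chosen = []
    for region in PRE_DIVIDED.get(granularity, []):
        x1, y1, x2, y2 = region["bounds"]
        if x1 >= ax1 and y1 >= ay1 and x2 <= ax2 and y2 <= ay2:
            chosen.append(region)
    return chosen

print(subregions_for("street", (0, 0, 1000, 1000)))     # both streets fall inside the counted area
```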
  • the behavior analysis results of different granularities can be summarized, and the regional behavior can be visually displayed in different colors and color shades in combination with the electronic map, which is intuitive and easy to use.
  • the specified behavior can be a preset behavior type or a behavior type selected by the user in real time.
  • the above method further includes: acquiring a specified behavior selection instruction input by a user, wherein the specified behavior selection instruction represents a behavior type of the specified behavior; and the specified behavior is determined according to the specified behavior selection instruction.
  • the back-end equipment determines the sub-region where each designated target belongs according to the preset monitoring area to which each designated target belongs; according to the behavior analysis results of each designated target, separately counts the frequency of occurrence of the designated behavior in each sub-region.
  • the size of the preset monitoring area is smaller than the size of the sub-area.
  • Optionally, when a sub-region includes the preset monitoring area, determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target includes:
  • Step 1 Obtain the inclusion relationship between each of the aforementioned sub-areas and each of the aforementioned preset monitoring areas.
  • the preset monitoring areas included in each sub-area are respectively determined.
  • Step 2 Determine the frequency of occurrence of the specified behavior in each of the sub-regions of the area to be counted according to the inclusion relationship, the behavior analysis result of each of the specified targets, and the preset monitoring area where each of the specified targets is located.
  • The image data is a video image of a preset monitoring area, and a designated target in any piece of image data is a designated target in the preset monitoring area corresponding to that image data. If a sub-region includes a preset monitoring area, a designated target in that preset monitoring area is also a designated target in the sub-region. The frequency of occurrence of the designated behavior in each sub-region is then counted from the behavior analysis results of the designated targets (a counting sketch is given below).
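  • The counting rule just described can be sketched as follows: given the inclusion relation from Step 1 and per-target behavior analysis results tagged with the preset monitoring area they came from, every target that triggered the designated behavior is counted for each sub-region that includes its monitoring area. The input structures are assumptions for illustration.

```python
# Sketch of Steps 1-2: count occurrences of the designated behavior per sub-region using the
# sub-region / monitoring-area inclusion relation. Data structures are illustrative assumptions.
from collections import Counter

INCLUSION = {"sub-1": {"cam-A", "cam-B"}, "sub-2": {"cam-C"}}   # Step 1's inclusion relation

def behavior_frequency(analysis_results, designated_behavior):
    """analysis_results: iterable of (target_id, monitoring_area, behavior_type)."""
    freq = Counter()
    for _, area, behavior in analysis_results:
        if behavior != designated_behavior:
            continue
        for sub, areas in INCLUSION.items():
            if area in areas:               # a target seen by cam-A also counts for sub-1
                freq[sub] += 1
    return freq

results = [("t1", "cam-A", "fighting"), ("t2", "cam-C", "fighting"), ("t3", "cam-B", "queuing")]
print(behavior_frequency(results, "fighting"))          # Counter({'sub-1': 1, 'sub-2': 1})
```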
  • Optionally, the above method further includes: classifying the designated targets according to their behavior analysis results to obtain multiple behavior lists, where the designated targets in the same behavior list have the same behavior type. That is, according to the behavior analysis results, designated targets of the same behavior type are grouped into one behavior list. In addition to recording the corresponding behavior type and the identifications of the designated targets, a behavior list can also record the location of each designated target, which can be the image data or preset monitoring area to which the designated target belongs, or the actual coordinates of the designated target, etc.
  • S103 Generate a behavior heat map of the designated behavior in the region to be counted according to the frequency of occurrence of the designated behavior in each subregion.
  • The back-end device colors each sub-region of the area to be counted in the electronic map according to the frequency of occurrence of the designated behavior in that sub-region, so as to obtain the behavior heat map of the designated behavior for the area to be counted.
  • Cool and warm colors can be used to indicate the frequency of occurrence of the designated behavior in a sub-region: the higher the frequency of the designated behavior in a sub-region, the closer its color is to the warm end; the lower the frequency, the closer its color is to the cool end (a coloring sketch is given below).
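  • The coloring rule can be sketched as a simple cool-to-warm blend, as below: each sub-region's frequency is normalized against the highest frequency and mapped to a color between blue (cool) and red (warm), which could then be painted onto the electronic map. The frequencies and the linear blend are illustrative assumptions.

```python
# Sketch of S103's coloring rule: map each sub-region's frequency of the designated behavior to a
# cool-to-warm RGB color. The example frequencies and the linear blend are assumptions.
COOL, WARM = (0, 0, 255), (255, 0, 0)            # RGB endpoints: coolest and warmest color

def heat_color(frequency, max_frequency):
    """Blend linearly from cool to warm as the frequency approaches the maximum."""
    t = 0.0 if max_frequency == 0 else min(frequency / max_frequency, 1.0)
    return tuple(round(c + t * (w - c)) for c, w in zip(COOL, WARM))

frequencies = {"sub-1": 3, "sub-2": 18, "sub-3": 40}
peak = max(frequencies.values())
print({sub: heat_color(f, peak) for sub, f in frequencies.items()})   # sub-3 is rendered warmest
```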
  • the user may wish to monitor multiple types of behaviors.
  • Optionally, the designated behaviors are multiple designated behaviors, and generating the behavior heat map of the designated behaviors for the area to be counted according to the frequency of occurrence of the designated behaviors in each sub-region includes:
  • Step 1 Obtain an electronic map of the area to be counted, and obtain the frequency of occurrence of each designated behavior in each of the sub-areas.
  • Step 2: Determine the thermal color corresponding to each designated behavior.
  • the designated behavior includes a variety of designated behaviors, and different thermal colors can be set for different designated behaviors.
  • the thermal color corresponding to each specified behavior can be randomly determined or specified by the user, which will not be repeated here.
  • Step 3: For any sub-region in the electronic map, display on the map of that sub-region the thermal color corresponding to each designated behavior according to the frequency of occurrence of each designated behavior in the sub-region, where the depth of any thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color.
  • That is, the thermal colors of the designated behaviors occurring in a sub-region are displayed at that sub-region's position in the electronic map, and the higher the frequency of a designated behavior in the sub-region, the darker the thermal color corresponding to that behavior.
  • the behavioral heat map can be zoomed in or out, and the frequency statistics data can be updated according to the scale of the electronic map.
  • the user can select the image data in the sub-area from the electronic map for video preview to observe the actual situation more realistically.
  • the above method further includes obtaining an image display instruction for the designated preset monitoring area; and displaying the image data of the designated preset monitoring area according to the image display instruction. For example, the user can click the designated preset monitoring area in the sub-area through the mouse or touch screen, and the back-end device displays the image data of the designated preset monitoring area after detecting the click instruction for the designated preset monitoring area.
  • the frequency of occurrence of designated behaviors in each sub-region in the region to be counted is counted by image data, and then a behavior heat map of the region to be counted is generated, which can realize intuitive monitoring of a large area.
  • Optionally, the above method further includes: acquiring the actual position of each designated target in the preset monitoring area.
  • The actual position of a designated target can be reported by a front-end smart device such as a smart camera, or determined by the back-end device based on the image data.
  • the foregoing obtaining the actual position of each of the specified targets in the preset monitoring area includes:
  • S201 Determine the position of each designated target in the image data according to the pixel area sequence of each designated target.
  • The pixel area of a designated target can be the pixel area enclosed by the designated target's bounding box.
  • According to the pixel area sequence, the position sequence of the designated target in the image data is determined; it can be a sequence of coordinate areas over multiple consecutive time points.
  • S202 Determine the actual position of each designated target in the preset monitoring area according to the position of each designated target in the image data.
  • The position of a designated target in the image data is converted into the actual position of the designated target in the preset monitoring area; the actual position can be global positioning system coordinates or custom area coordinates (a coordinate mapping sketch is given below).
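  • One common way to implement this conversion, sketched below, is a planar homography fitted from a few calibration points whose pixel and ground coordinates are both known; the patent does not prescribe this particular mapping, and the calibration values here are made up for illustration.

```python
# Sketch of S202: map a target's image position to an actual position via a planar homography.
# The four calibration point pairs are placeholders; the mapping method itself is an assumption.
import numpy as np
import cv2

image_pts = np.float32([[100, 400], [500, 400], [520, 100], [80, 100]])   # pixel coordinates
ground_pts = np.float32([[0, 0], [10, 0], [10, 20], [0, 20]])             # e.g. metres on the ground

H, _ = cv2.findHomography(image_pts, ground_pts)

def to_ground(pixel_xy):
    """Map one image point (e.g. the bottom centre of a target's bounding box) to ground coords."""
    pt = np.float32([[pixel_xy]])                      # shape (1, 1, 2) as OpenCV expects
    return cv2.perspectiveTransform(pt, H)[0, 0]

print(to_ground((300, 250)))
```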
  • the actual location of each designated target may be sent to the back-end device by a front-end device such as a smart camera, and the back-end device may directly obtain the actual location of each designated target.
  • In this case, determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each designated target includes: determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the actual position of each designated target and the behavior analysis result of each designated target.
  • By determining the actual position of each designated target, this approach can be applied when a sub-region does not contain a complete preset monitoring area, and even when a sub-region is smaller than a preset monitoring area, that is, to behavior heat maps with a fine granularity (small sub-regions); in theory the smallest sub-region can be a single coordinate point, which can greatly increase the monitoring accuracy of the behavior heat map.
  • Optionally, determining the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the actual position of each designated target and the behavior analysis result of each designated target includes:
  • S1021 According to the behavior analysis result of each of the specified targets, classify each of the specified targets to obtain multiple behavior lists, wherein the behavior types of the specified targets in the same behavior list are the same.
  • each designated target of the same behavior type is divided into a behavior list.
  • The behavior type of the behavior list, the identification of each designated target included in the behavior list, and the actual position of each designated target included in the behavior list are recorded in the behavior list.
  • S1022 Determine a target behavior list corresponding to the specified behavior.
  • S1023 Determine the frequency of occurrence of the specified behavior in each subregion of the area to be counted according to the actual position of each designated target in the target behavior list.
  • That is, according to the actual positions of the designated targets in the target behavior list, the number of designated targets appearing in each sub-region is counted, which gives the frequency of occurrence of the designated behavior in each sub-region (see the sketch below).
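  • A minimal sketch of S1021 to S1023 follows: designated targets are grouped into behavior lists by behavior type, the list for the designated behavior is selected, and its targets are counted per sub-region according to their actual positions. The data structures and rectangular sub-regions are assumptions for illustration.

```python
# Sketch of S1021-S1023: group targets into behavior lists, then count the designated behavior's
# targets per sub-region by their actual positions. All values are illustrative assumptions.
from collections import defaultdict

targets = [  # (target_id, behavior_type, actual (x, y) position)
    ("t1", "fighting", (3, 4)), ("t2", "queuing", (8, 1)), ("t3", "fighting", (9, 9)),
]
SUBREGIONS = {"sub-1": (0, 0, 5, 5), "sub-2": (5, 0, 10, 10)}    # (x1, y1, x2, y2) bounds

behavior_lists = defaultdict(list)                    # S1021: one list per behavior type
for tid, behavior, pos in targets:
    behavior_lists[behavior].append((tid, pos))

frequency = defaultdict(int)                          # S1022/S1023: count the target behavior list
for _, (x, y) in behavior_lists["fighting"]:
    for sub, (x1, y1, x2, y2) in SUBREGIONS.items():
        if x1 <= x < x2 and y1 <= y < y2:
            frequency[sub] += 1
print(dict(frequency))                                # {'sub-1': 1, 'sub-2': 1}
```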
  • the embodiment of the present application also provides an alarm method. Referring to FIG. 3, the method includes:
  • the alarm method in the embodiment of the present application may be executed by a back-end device.
  • the back-end device may be a server, a personal computer, or a hard disk video recorder.
  • the behavioral heatmap can be obtained by any of the above-mentioned methods for generating the behavioral heatmap, and will not be repeated here.
  • The preset alarm condition can be set according to the actual situation, for example, the frequency of the designated behavior being greater than a preset frequency threshold, or the heat value of the thermal color being greater than a preset heat threshold.
  • Optionally, triggering an alarm for the sub-regions that meet the preset alarm condition includes: comparing the frequency of occurrence of the designated behavior in each sub-region of the behavior heat map with the preset frequency threshold; and, for a target sub-region whose frequency of occurrence of the designated behavior is greater than the preset frequency threshold, triggering an alarm for that target sub-region.
  • Optionally, the sub-regions of the behavior heat map include thermal colors, and a thermal color represents the frequency of occurrence of the designated behavior in its sub-region: the higher the frequency of occurrence of the designated behavior in a sub-region, the higher the heat value of that sub-region's thermal color. In this case, when a sub-region of the behavior heat map meets the preset alarm condition, triggering an alarm for the sub-region meeting the preset alarm condition includes: comparing the heat value of each sub-region's thermal color with the preset heat threshold; and, for a sub-region to be alarmed whose heat value is greater than the preset heat threshold, triggering an alarm for that sub-region (both conditions are sketched below).
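  • Both alarm conditions can be sketched as simple threshold checks, as below; the threshold values and the alarm action (a printed message) are placeholders, not the application's actual linkage.

```python
# Sketch of the two preset alarm conditions: frequency above a preset frequency threshold, or
# heat value above a preset heat threshold. Thresholds and the alarm action are assumptions.
FREQ_THRESHOLD = 20
HEAT_THRESHOLD = 0.8

def check_alarms(heat_map):
    """heat_map: {sub_region: {'frequency': int, 'heat': float in [0, 1]}}."""
    for sub, stats in heat_map.items():
        if stats["frequency"] > FREQ_THRESHOLD or stats["heat"] > HEAT_THRESHOLD:
            print(f"ALARM for {sub}: frequency={stats['frequency']}, heat={stats['heat']}")

check_alarms({"sub-1": {"frequency": 3, "heat": 0.2},
              "sub-2": {"frequency": 35, "heat": 0.9}})   # only sub-2 triggers an alarm
```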
  • the method for generating a behavioral heat map of the embodiment of the present application may be specifically as shown in FIG. 5.
  • The user can set the behavior categories of interest, and the back-end device monitors the frequency of each designated behavior in the area to be counted in real time. Once the heat value of a designated behavior category increases significantly and reaches the preset heat threshold, the early warning linkage can be actively triggered, and the user can view the live video or live examples of that behavior category and respond in time according to the situation.
  • the alarm method in the embodiments of the present application can be widely used in various fields. Examples are as follows:
  • For example, when the heat value of a designated behavior in a certain area rises sharply and reaches the upper threshold, an alert is triggered and the live video or a short video of the on-site behavior is pushed to the manager. If the manager confirms that it is a fire in a shopping mall, fire-fighting support can be dispatched quickly; when the heat value of behaviors such as people falling to the ground or people fighting in a certain area suddenly rises and reaches the upper threshold, and the manager checks and finds a violent terrorist incident at the railway station, staff can be quickly sent to support.
  • As another example, behavior types such as people queuing, people staying, and people dragging suitcases can be preset. If the manager checks the pushed live video or short video and finds that a large number of passengers have gathered in a square at the train station, transportation resources or evacuation personnel can be quickly dispatched to evacuate the crowd.
  • In a teaching scenario, the linkage strategy is triggered to alert the teaching administrator. If the teaching administrator checks the pushed video or short behavior video and finds that some classrooms have a low teaching atmosphere, the teaching situation can be understood in time and the quality of teaching improved.
  • In a ranch scenario, the linkage strategy is triggered to alert the ranch manager. If, after checking the pushed live video or short video, the ranch manager finds that the herd is suspected of being poisoned or affected by an epidemic, disease control and sanitation work can be carried out quickly.
  • With the embodiments of this application, multi-functional combined use can easily be performed based on the statistical results of the behavior lists of the designated monitoring areas, including viewing the distribution characteristics of a single behavior or any combination of behaviors, and viewing the behavior distribution characteristics of a single area or any combination of areas.
  • the interactive operation is simple.
  • the behavioral heat map is used for behavior preview and scheduling, which is convenient for users to quickly pay attention to the on-site situation, facilitate evidence collection, and quickly make system scheduling, which improves the level of intelligence.
  • the sub-regions of the above-mentioned behavior heat map include thermal colors, and the above-mentioned designated behaviors have multiple designated behaviors, and different designated behaviors correspond to different thermal colors.
  • the intensity of the aforementioned thermal color is positively correlated with the frequency of occurrence of the specified behavior corresponding to the aforementioned thermal color, and each of the aforementioned thermal colors corresponds to a corresponding alarm linkage;
  • triggering an alarm for the sub-area meeting the preset alarm condition includes:
  • Step 1: For each thermal color in each sub-region, compare the depth of the thermal color with the preset depth threshold corresponding to that thermal color.
  • Step 2: For a target thermal color whose depth is greater than its preset depth threshold, trigger the alarm linkage corresponding to the target thermal color for the sub-region where the target thermal color is located.
  • A preset depth threshold is set for each thermal color in advance; the preset depth thresholds of different thermal colors may be the same or different and are set according to actual needs.
  • Different alarm linkages can be set for the thermal colors of different sub-regions, or the same alarm linkage can be set, according to actual needs.
  • The back-end device analyzes the thermal colors of each sub-region and, for any thermal color, compares the depth of the thermal color with the preset depth threshold corresponding to that thermal color. When the depth of a thermal color is greater than its preset depth threshold, the alarm linkage corresponding to that thermal color is executed for the sub-region where it is located (see the sketch below).
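  • The per-color check can be sketched as follows: each thermal color (one per designated behavior) carries its own preset depth threshold and its own alarm linkage, and the linkage is triggered for a sub-region whenever that color's depth exceeds its threshold. The colors, thresholds, and linkage actions are placeholders.

```python
# Sketch of Steps 1-2: compare each thermal color's depth with its own preset depth threshold and
# trigger the corresponding alarm linkage. All values and linkage actions are assumptions.
DEPTH_THRESHOLDS = {"red": 0.7, "orange": 0.5}        # preset depth threshold per thermal color
ALARM_LINKAGE = {"red": "notify fire control",        # alarm linkage per thermal color
                 "orange": "notify security staff"}

def check_color_alarms(sub_region_colors):
    """sub_region_colors: {sub_region: {thermal_color: depth in [0, 1]}}."""
    for sub, colors in sub_region_colors.items():
        for color, depth in colors.items():
            if depth > DEPTH_THRESHOLDS[color]:
                print(f"{sub}: {ALARM_LINKAGE[color]} (depth of {color} = {depth})")

check_color_alarms({"sub-1": {"red": 0.9, "orange": 0.3},
                    "sub-2": {"red": 0.2, "orange": 0.6}})
```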
  • the detection and alarm of multiple specified behaviors can be simultaneously realized based on the behavior heat map, which can meet various needs of users.
  • the alarm method in the embodiment of the present application further includes:
  • Step 1 Obtain the user's display instruction for the sub-region to be displayed.
  • Step 2 According to the display instruction, display the image data in the sub-region to be displayed, wherein the image data in the sub-region to be displayed is the video stream of the monitoring area in the sub-region to be displayed.
  • the image data is the video stream of each monitoring area collected by the monitoring equipment, and the user can display the image data in the sub-area to be displayed through display instructions.
  • the sub-region to be displayed includes multiple image data, and a preview window of each image data may be generated first for the user to choose to display.
  • the display of the image data of the actual monitoring scene is realized, which can help the user to fully understand the actual situation and meet the various needs of the user.
  • the embodiment of the present application also provides a method for sending a behavior list. See FIG. 6, which is applied to a front-end smart device.
  • the method includes:
  • S601 Acquire image data of a preset monitoring area.
  • the behavior list sending method of the embodiment of the present application is applied to a front-end smart device, and therefore can be implemented by a front-end smart device.
  • the front-end smart device may be a smart camera or a hard disk video recorder.
  • the smart camera can directly collect the image data of the preset monitoring area to obtain the image data of the preset monitoring area.
  • the hard disk video recorder can obtain the image data of the preset monitoring area through the connected camera.
  • S602 Analyze the above-mentioned image data through computer vision technology to obtain a behavior analysis result of each designated target in the above-mentioned image data.
  • The front-end smart device recognizes the designated targets in the image data through the target detection algorithm and tracks each designated target through the target tracking algorithm, so as to obtain the position of each designated target in the image data; behavior recognition is then performed on each designated target according to its position in the image data, and the behavior analysis result of each designated target is obtained.
  • S603 According to the behavior analysis result of each of the specified targets, classify each of the specified targets to obtain multiple behavior lists, wherein the behavior types of the specified targets in the same behavior list are the same.
  • each specified target of the same behavior type is divided into a behavior list.
  • The behavior list may also record the location of each designated target.
  • The location of a designated target may be the image data / preset monitoring area to which the designated target belongs, or it may be the actual coordinates of the designated target, and so on.
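To make step S603 concrete, the grouping of designated targets into per-behavior lists might look like the following minimal Python sketch; the field names (target_id, behavior, position) are assumptions made for the illustration, not names used by the embodiment.

```python
# Illustrative sketch of step S603: group designated targets into behavior
# lists keyed by behavior type, so each list holds only one behavior type.
from collections import defaultdict

analysis_results = [
    {"target_id": 1, "behavior": "smoking",  "position": (12.3, 4.5)},
    {"target_id": 2, "behavior": "fighting", "position": (7.8, 9.1)},
    {"target_id": 3, "behavior": "smoking",  "position": (3.2, 6.6)},
]


def build_behavior_lists(results):
    """Return {behavior_type: [targets]}; targets in the same list share a behavior type."""
    lists = defaultdict(list)
    for result in results:
        lists[result["behavior"]].append(
            {"target_id": result["target_id"], "position": result["position"]}
        )
    return dict(lists)


behavior_lists = build_behavior_lists(analysis_results)
# e.g. {"smoking": [two targets], "fighting": [one target]}
```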
  • the smart camera or hard disk video recorder sends each behavior list to the server so that the server generates a behavior heat map according to the behavior list.
  • the generation process of the behavior heat map is the same as the above-mentioned behavior heat map generation method, and will not be repeated here.
  • the behavior analysis result of each designated target is the behavior analysis result of each sampled designated target.
  • the foregoing image data is analyzed through computer vision technology to obtain the behavior analysis result of each designated target in the image data, including:
  • Step 1 Determine each designated target in the image data and the position of each designated target through a preset target detection algorithm and a preset target tracking algorithm.
  • the designated target in the image data is recognized, and the designated target is tracked through the target tracking algorithm, so as to obtain the position of each designated target in the image data.
  • The target detection algorithm may be a pedestrian detection algorithm, for example HOG, DPM, FRCNN, YOLO or SSD, and the target tracking algorithm may be a multi-object tracking algorithm; a rough sketch of this step is given below.
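As one hedged illustration (not the embodiment's actual implementation), the detection step could use OpenCV's built-in HOG pedestrian detector, with a simple greedy IoU matcher standing in for a real multi-target tracking algorithm; the threshold values are assumptions.

```python
# Sketch of step 1: detect pedestrians per frame, then associate detections
# across frames so each designated target keeps a stable track id / position.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())


def detect_pedestrians(frame):
    """Return (x, y, w, h) boxes of designated targets in one frame."""
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    return [tuple(int(v) for v in b) for b in boxes]


def iou(a, b):
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    return inter / float(aw * ah + bw * bh - inter or 1)


def track(prev_tracks, detections, next_id=0, iou_thr=0.3):
    """Greedy IoU association; returns ({track_id: box}, next unused id)."""
    tracks = {}
    for det in detections:
        best = max(prev_tracks.items(), key=lambda kv: iou(kv[1], det), default=None)
        if best is not None and iou(best[1], det) > iou_thr and best[0] not in tracks:
            tracks[best[0]] = det          # same designated target as previous frame
        else:
            tracks[next_id] = det          # newly appeared designated target
            next_id += 1
    return tracks, next_id
```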
  • Step 2: Sample each designated target in the above image data using a preset target sampling algorithm to obtain the sampled designated targets.
  • Target sampling of the designated targets is, for example, sparse sampling of the designated targets in each piece of image data.
  • Target sparse sampling methods include, but are not limited to, uniform target sampling, uniform point sampling, weighted point sampling, sampling based on the number of targets per area, and so on; through sampling, an appropriate number of designated targets at the target scale is obtained, namely the sampled designated targets.
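Two of the sampling strategies named above could be sketched as follows; the sampling budget and the use of detection scores as weights are assumptions made for the illustration.

```python
# Minimal sketch of target sparse sampling: keep at most `budget` designated
# targets per frame, either evenly spaced or weighted by detection score.
import random


def uniform_sample(targets, budget):
    """Uniform target sampling: an evenly spaced subset of the detected targets."""
    if len(targets) <= budget:
        return list(targets)
    step = len(targets) / float(budget)
    return [targets[int(i * step)] for i in range(budget)]


def weighted_sample(targets, scores, budget):
    """Weighted sampling: higher-scoring targets are more likely to be kept
    (drawn with replacement here, purely for brevity)."""
    if len(targets) <= budget:
        return list(targets)
    return random.choices(targets, weights=scores, k=budget)
```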
  • Step 3: Extract a target behavior sequence from the image data according to the position of each sampled designated target to obtain the pixel region sequence of each sampled designated target.
  • In step 1 the position of each designated target is determined, and each sampled designated target is one of those designated targets, so the position of each sampled designated target is known.
  • The front-end smart device extracts the target behavior sequence of each piece of image data according to the position of each sampled designated target, and crops image regions from the image data according to a certain structure, such as a Tubelet, to obtain the pixel region sequence of each sampled designated target.
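A minimal sketch of such a tubelet-style crop, assuming the frames are NumPy image arrays and the tracked boxes are (x, y, w, h) tuples keyed by track id (both assumptions for the example):

```python
# Sketch: crop the tracked box from each frame to form the pixel region
# sequence of one sampled designated target.

def extract_pixel_region_sequence(frames, boxes_per_frame, track_id):
    """frames: list of H x W x 3 arrays; boxes_per_frame: list of {track_id: (x, y, w, h)}."""
    sequence = []
    for frame, boxes in zip(frames, boxes_per_frame):
        box = boxes.get(track_id)
        if box is None:
            continue                       # target not visible in this frame
        x, y, w, h = map(int, box)
        sequence.append(frame[y:y + h, x:x + w])
    return sequence
```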
  • Step 4: Analyze the pixel region sequence of each sampled designated target to obtain the behavior analysis result of each sampled designated target, for example by feeding the pixel region sequence into a pre-trained classification neural network.
  • The classification neural network includes, but is not limited to, Resnet18, Resnet50, Resnet101, Resnet152, Inception-v1, VGG, etc.
  • The behavior analysis result includes a behavior category and a confidence.
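For instance, assuming a ResNet-18 classifier (instantiated untrained here, standing in for a properly trained model) and an invented behavior label set, averaging per-frame scores over the pixel region sequence yields a (behavior category, confidence) pair:

```python
# Hedged sketch: classify a pixel region sequence with a ResNet-18 backbone
# and average the per-frame scores into one behavior analysis result.
import torch
import torch.nn.functional as F
from torchvision import models, transforms

BEHAVIORS = ["walking", "smoking", "fighting"]          # assumed label set

model = models.resnet18(num_classes=len(BEHAVIORS)).eval()
to_tensor = transforms.Compose([
    transforms.ToPILImage(),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])


@torch.no_grad()
def analyze_sequence(pixel_region_sequence):
    """Return (behavior_category, confidence) for one sampled designated target."""
    probs = []
    for crop in pixel_region_sequence:                   # crop: H x W x 3 uint8 array
        logits = model(to_tensor(crop).unsqueeze(0))
        probs.append(F.softmax(logits, dim=1))
    mean_probs = torch.cat(probs).mean(dim=0)
    confidence, index = mean_probs.max(dim=0)
    return BEHAVIORS[index.item()], confidence.item()
```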
  • the embodiment of the application provides an alarm device, which includes:
  • the heat map display module is used to display the behavior heat map of the area to be counted, wherein the behavior heat map represents the frequency of the specified behavior in each sub-region of the area to be counted;
  • the alarm trigger module is used to trigger an alarm for the sub-area meeting the preset alarm condition when the sub-area of the above-mentioned behavioral heat map meets the preset alarm condition.
  • the above alarm trigger module includes:
  • the frequency comparison sub-module is used to compare the frequency of the specified behavior in each sub-region of the behavior heat map with the preset frequency threshold;
  • the sub-area alarm sub-module is used to trigger an alarm for the target sub-area for the target sub-area whose frequency of occurrence of the specified behavior is greater than the preset frequency threshold.
  • The sub-regions of the behavior heat map include thermal colors, and a thermal color represents the frequency of occurrence of the specified behavior in the sub-region: the higher the frequency of the specified behavior in a sub-region, the higher the thermal value of that sub-region's thermal color;
  • the above alarm trigger module includes:
  • The thermal value comparison sub-module is used to compare the thermal value of the thermal color of each of the above sub-regions with a preset thermal threshold;
  • The triggering alarm sub-module is used to trigger an alarm for any sub-region to be alarmed whose thermal value is greater than the preset thermal threshold.
  • The sub-regions of the above-mentioned behavior heat map include thermal colors; the designated behaviors are multiple designated behaviors, and different designated behaviors correspond to different thermal colors.
  • The depth of a thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color, and each thermal color corresponds to its own alarm linkage;
  • The above alarm trigger module is specifically configured to compare, for each thermal color, the depth of the thermal color with the preset threshold corresponding to that thermal color and, when the threshold is exceeded, to execute the alarm linkage corresponding to that thermal color for the sub-region in which it is located.
  • the alarm device in the embodiment of the present application further includes:
  • the display instruction receiving module is used to obtain the user's display instruction for the sub-area to be displayed
  • the image data display module is used to display the image data in the sub-area to be displayed according to the display instruction, wherein the image data in the sub-area to be displayed is the video stream of the monitoring area in the sub-area to be displayed .
  • An embodiment of the present application also provides a device for generating a behavior heat map; see FIG. 7. The device is applied to a back-end device.
  • the device includes:
  • the analysis result obtaining module 701 is configured to obtain the behavior analysis result of each designated target in the image data of each preset monitoring area, where the foregoing preset monitoring area is an area to be counted;
  • the sub-region frequency statistics module 702 is configured to determine the frequency of occurrence of the designated behavior in each sub-region of the area to be counted according to the behavior analysis result of each of the above-mentioned designated targets, wherein the above-mentioned sub-region and the above-mentioned preset monitoring area have an intersection;
  • the behavior heat map generating module 703 is configured to generate the behavior heat map of the designated behavior in the region to be counted according to the frequency of occurrence of the designated behavior in each of the above sub-regions.
  • the aforementioned analysis result obtaining module 701 includes:
  • Image data acquisition sub-module for acquiring image data of each preset monitoring area
  • the behavior analysis sub-module is used to analyze each of the above-mentioned image data through computer vision technology to obtain the behavior analysis result of the designated target in each of the above-mentioned image data.
  • the above behavior analysis sub-module includes:
  • the region sequence determination unit is used to track and detect the designated targets in each of the above-mentioned image data by computer vision technology, and extract the pixel region sequence of each of the above-mentioned designated targets;
  • the area sequence analysis unit is used to analyze the pixel area sequence of each of the designated targets to obtain the behavior analysis results of each of the designated targets.
  • The pixel region sequence of each designated target is the pixel region sequence of each sampled designated target;
  • the above-mentioned region sequence determining unit includes:
  • The position determination subunit is used to determine each designated target in each of the aforementioned image data and the position of each designated target through a preset target detection algorithm and a preset target tracking algorithm;
  • The sampling subunit is used to sample each designated target in each of the above-mentioned image data using a preset target sampling algorithm to obtain the sampled designated targets;
  • The region interception determining subunit is used to extract the target behavior sequence of each of the above-mentioned image data according to the position of each sampled designated target to obtain the pixel region sequence of each sampled designated target.
  • the above-mentioned behavioral heat map generating device further includes:
  • the actual position acquisition module is used to acquire the actual position of each of the above-mentioned designated targets in the above-mentioned preset monitoring area;
  • the sub-region frequency statistics module is specifically configured to determine the frequency of occurrence of the designated behavior in each sub-region of the region to be counted based on the actual location of each designated target and the behavior analysis result of each designated target.
  • the aforementioned actual location acquisition module includes:
  • the image position acquisition sub-module is used to determine the position of each designated target in the image data according to the pixel area sequence of each designated target;
  • the actual location mapping sub-module is used to determine the actual location of each specified target in the preset monitoring area according to the location of each specified target in the image data.
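One common way to realize such a mapping, offered here only as an assumption-laden illustration, is a planar homography between image coordinates and ground coordinates; the four calibration point pairs below are made up and would in practice come from camera calibration.

```python
# Sketch: map an image-plane position to an actual position in the preset
# monitoring area using a planar homography.
import cv2
import numpy as np

image_pts = np.float32([[100, 700], [1800, 700], [1500, 300], [400, 300]])
ground_pts = np.float32([[0, 0], [20, 0], [20, 15], [0, 15]])   # metres, assumed

H = cv2.getPerspectiveTransform(image_pts, ground_pts)


def to_actual_position(image_xy):
    """Map one (x, y) pixel position (e.g. a target's foot point) to ground coordinates."""
    pt = np.float32([[image_xy]])                       # shape (1, 1, 2)
    return tuple(cv2.perspectiveTransform(pt, H)[0, 0])


print(to_actual_position((960, 520)))
```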
  • the aforementioned sub-region frequency statistics module 702 includes:
  • the designated target classification sub-module is used to classify the designated targets according to the behavior analysis results of the designated targets to obtain multiple behavior lists, wherein the behavior types of the designated targets in the same behavior list are the same;
  • the target list determination sub-module is used to determine the target behavior list corresponding to the specified behavior
  • the frequency determination sub-module is used to determine the frequency of occurrence of the above-mentioned designated behavior in each sub-region of the area to be counted according to the actual position of each designated target in the above-mentioned target behavior list.
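The frequency determination could, under the assumption of rectangular sub-regions, be as simple as counting which sub-region each actual position falls into; the sub-region definitions below are invented for the example.

```python
# Illustrative sketch: count how many targets in the target behavior list
# fall inside each rectangular sub-region of the area to be counted.

SUB_REGIONS = {                      # sub_region_id: (x_min, y_min, x_max, y_max)
    "A": (0.0, 0.0, 10.0, 10.0),
    "B": (10.0, 0.0, 20.0, 10.0),
}


def count_frequency(target_behavior_list, sub_regions=SUB_REGIONS):
    """target_behavior_list: iterable of targets with an actual (x, y) position."""
    counts = {region_id: 0 for region_id in sub_regions}
    for target in target_behavior_list:
        x, y = target["position"]
        for region_id, (x0, y0, x1, y1) in sub_regions.items():
            if x0 <= x < x1 and y0 <= y < y1:
                counts[region_id] += 1
                break
    return counts


print(count_frequency([{"position": (3.2, 6.6)}, {"position": (12.3, 4.5)}]))
```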
  • the above-mentioned behavioral heat map generating device further includes:
  • a setting instruction acquisition module configured to acquire a granularity setting instruction input by a user, wherein the granularity setting instruction represents the size attribute of the sub-region;
  • the sub-region setting module is used to determine each sub-region in the area to be counted according to the above-mentioned granularity setting instruction.
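As an illustration of the granularity setting, and assuming the granularity instruction simply specifies a cell edge length (an assumption for this sketch), the sub-regions could be generated as a grid covering the area to be counted:

```python
# Sketch: split the area to be counted into grid-shaped sub-regions whose
# cell size is taken from the user's granularity setting instruction.

def make_sub_regions(width, height, cell_size):
    """Return {(row, col): (x_min, y_min, x_max, y_max)} covering the area."""
    regions = {}
    row, y = 0, 0.0
    while y < height:
        col, x = 0, 0.0
        while x < width:
            regions[(row, col)] = (x, y, min(x + cell_size, width), min(y + cell_size, height))
            col, x = col + 1, x + cell_size
        row, y = row + 1, y + cell_size
    return regions


grid = make_sub_regions(width=50.0, height=30.0, cell_size=10.0)   # 5 x 3 sub-regions
```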
  • the aforementioned sub-area includes the aforementioned preset monitoring area
  • the aforementioned sub-area frequency statistics module 702 includes:
  • An inclusion relationship determination sub-module for acquiring the inclusion relationship between each of the aforementioned sub-areas and each of the aforementioned preset monitoring areas;
  • the behavior frequency statistics sub-module is used to determine the frequency of occurrence of the specified behavior in each of the above sub-regions of the area to be counted according to the above inclusion relationship, the behavior analysis result of each of the above specified targets, and the preset monitoring area where each of the specified targets is located.
  • the above-mentioned designated behaviors are multiple designated behaviors
  • the above-mentioned behavior heat map generating module 703 includes:
  • The multi-frequency statistics sub-module is used to obtain an electronic map of the area to be counted and to obtain the frequency of occurrence of each designated behavior in each of the above sub-regions;
  • The thermal color corresponding sub-module is used to determine the thermal color corresponding to each of the above designated behaviors;
  • The map coloring sub-module is used, for any sub-region in the electronic map, to display on the map of that sub-region the thermal color corresponding to each designated behavior according to the frequency of occurrence of that designated behavior in the sub-region, where the depth of any thermal color is positively correlated with the frequency of occurrence of the designated behavior corresponding to that thermal color.
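A rough sketch of such coloring, assuming a raster map image, axis-aligned sub-region boxes in pixel coordinates, invented behavior colors, and a simple frequency normalisation (all assumptions for the example):

```python
# Sketch: blend each behavior's thermal color onto the electronic map, with
# opacity proportional to that behavior's frequency in the sub-region.
import numpy as np

BEHAVIOR_COLOR = {"smoking": (0, 0, 255), "fighting": (255, 0, 0)}   # assumed RGB colors


def color_heat_map(base_map, sub_region_boxes, frequencies, max_freq):
    """base_map: H x W x 3 uint8 image; frequencies: {region_id: {behavior: count}}."""
    out = base_map.astype(np.float32)
    for region_id, per_behavior in frequencies.items():
        x0, y0, x1, y1 = sub_region_boxes[region_id]          # pixel coordinates
        for behavior, count in per_behavior.items():
            alpha = min(count / float(max_freq), 1.0) * 0.6   # deeper color = higher frequency
            color = np.array(BEHAVIOR_COLOR[behavior], dtype=np.float32)
            out[y0:y1, x0:x1] = (1 - alpha) * out[y0:y1, x0:x1] + alpha * color
    return out.astype(np.uint8)


heat_map = color_heat_map(
    base_map=np.full((300, 500, 3), 255, dtype=np.uint8),
    sub_region_boxes={"A": (0, 0, 250, 300)},
    frequencies={"A": {"smoking": 7}},
    max_freq=10,
)
```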
  • the above-mentioned behavioral heat map generating device further includes:
  • the linkage strategy module is used to execute the linkage strategy of the specified behavior corresponding to the thermal color meeting the preset linkage rule when the thermal color of the specified monitoring area of the behavior heat map meets the preset linkage rule.
  • the behavior analysis result of each designated target mentioned above is a list of designated targets that trigger each designated behavior; the above analysis result obtaining module includes:
  • the behavior list receiving sub-module is used to receive the behavior list sent by each smart device, wherein the above behavior list includes the identification of the designated target, and the behavior types of the designated targets in the same behavior list are the same;
  • the behavior list assembly sub-module is used to assemble each of the above-mentioned behavior lists to obtain the designated target lists that trigger each designated behavior.
  • An embodiment of the present application also provides an apparatus for sending a behavior list; see FIG. 8. The apparatus is applied to a front-end smart device.
  • the apparatus includes:
  • the image data acquisition module 801 is used to acquire image data of a preset monitoring area
  • the target behavior analysis module 802 is used to analyze the above-mentioned image data through computer vision technology to obtain the behavior analysis result of each specified target in the above-mentioned image data;
  • the designated target classification module 803 is configured to classify the designated targets according to the behavior analysis results of the designated targets to obtain multiple behavior lists, wherein the behavior types of the designated targets in the same behavior list are the same;
  • the behavior list sending module 804 is configured to send each of the above-mentioned behavior lists.
  • the behavior analysis result of each designated target is the behavior analysis result of each sampled designated target.
  • the aforementioned target behavior analysis module 802 includes:
  • the target position determination sub-module is used to determine each designated target in the image data and the position of each designated target through a preset target detection algorithm and a preset target tracking algorithm;
  • The designated target sampling sub-module is used to sample each designated target in the above-mentioned image data using a preset target sampling algorithm to obtain the sampled designated targets;
  • The pixel region interception sub-module is used to extract the target behavior sequence of the image data according to the position of each sampled designated target to obtain the pixel region sequence of each sampled designated target;
  • The target behavior analysis sub-module is used to analyze the pixel region sequence of each sampled designated target to obtain the behavior analysis result of each sampled designated target.
  • the embodiment of the present application also provides an electronic device, including: a processor and a memory;
  • the aforementioned memory is used to store computer programs
  • The processor is used to implement any one of the above-mentioned behavior heat map generating methods when executing the computer program stored in the above-mentioned memory.
  • the electronic device of the embodiment of the present application further includes a communication interface 902 and a communication bus 904.
  • the processor 901, the communication interface 902, and the memory 903 communicate with each other through the communication bus 904.
  • the electronic device may be a server or a hard disk video recorder.
  • The embodiment of the present application also provides an electronic device, including: a processor and a memory;
  • The aforementioned memory is used to store a computer program;
  • The processor is used to implement any one of the above-mentioned behavior list sending methods when executing the computer program stored in the memory;
  • The electronic device can be a smart camera or a hard disk video recorder.
  • the embodiment of the present application also provides an electronic device, including: a processor and a memory;
  • the aforementioned memory is used to store computer programs
  • the processor is used to execute any of the above alarm methods when executing the computer program stored in the memory.
  • the communication bus mentioned in the above electronic device may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus.
  • the communication bus can be divided into address bus, data bus, control bus and so on. For ease of representation, only one thick line is used in the figure, but it does not mean that there is only one bus or one type of bus.
  • the communication interface is used for communication between the aforementioned electronic device and other devices.
  • the memory may include random access memory (Random Access Memory, RAM), and may also include non-volatile memory (Non-Volatile Memory, NVM), such as at least one disk storage.
  • the memory may also be at least one storage device located far away from the foregoing processor.
  • The above-mentioned processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
  • An embodiment of the present application also provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, any one of the above-mentioned behavioral heat map generation methods is implemented.
  • An embodiment of the present application also provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, any one of the foregoing behavior list sending methods is implemented.
  • An embodiment of the present application also provides a computer-readable storage medium, and the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, any of the foregoing alarm methods is implemented.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Alarm Systems (AREA)

Abstract

The present invention relates to a behavior heat map generation method, as well as an alarm device, an electronic device and a storage medium. The method comprises: acquiring a behavior analysis result of each designated target in image data of each preset monitored region, the preset monitored region being a region within a region to be counted; determining, according to the behavior analysis result of each designated target, the frequency at which the designated behavior occurs in each sub-region of the region to be counted, the sub-region and the preset monitored region having an intersection; and generating, according to the frequency at which the designated behavior occurs in each sub-region, a behavior heat map of the designated behavior in the region to be counted. The behavior heat map generation method in the embodiments of the present invention counts, by means of image data, the frequency at which a designated behavior occurs in each sub-region of a region to be counted and then generates a behavior heat map of the region to be counted, thereby realizing intuitive monitoring of a large-area region.
PCT/CN2020/085423 2019-04-28 2020-04-17 Procédé de génération de diagramme thermodynamique de comportement, et dispositif d'alarme, dispositif électronique et support de stockage WO2020221031A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910351634.4A CN111862521B (zh) 2019-04-28 2019-04-28 行为热力图生成及报警方法、装置、电子设备及存储介质
CN201910351634.4 2019-04-28

Publications (1)

Publication Number Publication Date
WO2020221031A1 true WO2020221031A1 (fr) 2020-11-05

Family

ID=72965214

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/085423 WO2020221031A1 (fr) 2019-04-28 2020-04-17 Procédé de génération de diagramme thermodynamique de comportement, et dispositif d'alarme, dispositif électronique et support de stockage

Country Status (2)

Country Link
CN (1) CN111862521B (fr)
WO (1) WO2020221031A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112560715A (zh) * 2020-12-21 2021-03-26 北京市商汤科技开发有限公司 操作记录展示方法、装置、电子设备及存储介质
CN112749461A (zh) * 2020-12-25 2021-05-04 深圳供电局有限公司 负荷数据监控方法、电力系统、计算机设备和存储介质
CN112906594A (zh) * 2021-03-03 2021-06-04 杭州海康威视数字技术股份有限公司 一种布防区域生成方法、装置、设备及存储介质
CN113010829A (zh) * 2021-03-31 2021-06-22 建信金融科技有限责任公司 一种数据分区可视化方法、装置、计算机设备及存储介质
CN114474091A (zh) * 2022-01-26 2022-05-13 北京声智科技有限公司 机器人消杀方法、消杀机器人、消杀设备及存储介质
CN114579889A (zh) * 2022-04-26 2022-06-03 阿里巴巴(中国)有限公司 订单热力图的推荐方法、装置、设备及存储介质
CN115099620A (zh) * 2022-06-23 2022-09-23 中国建筑第五工程局有限公司 一种基于bim的智能房建施工信息收集分析系统

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112488913A (zh) * 2020-11-27 2021-03-12 杭州海康威视数字技术股份有限公司 数据处理方法、装置及电子设备
CN113592320B (zh) * 2021-06-16 2023-10-03 成都世纪光合作用科技有限公司 一种就餐意图识别方法、装置及电子设备
CN115083112B (zh) * 2022-08-22 2022-11-22 枫树谷(成都)科技有限责任公司 一种智能预警应急管理系统及其部署方法
CN115648630B (zh) * 2022-10-25 2024-05-31 上海复志信息科技股份有限公司 离型膜打印区域的处理方法、装置及光固化3d打印设备

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447458A (zh) * 2015-11-17 2016-03-30 深圳市商汤科技有限公司 一种大规模人群视频分析系统和方法
CN106251578A (zh) * 2016-08-19 2016-12-21 深圳奇迹智慧网络有限公司 基于探针的人流预警分析方法和系统
US20170193792A1 (en) * 2015-12-31 2017-07-06 International Business Machines Corporation Visitor Flow Management
CN108846389A (zh) * 2018-08-13 2018-11-20 树蛙信息科技(南京)有限公司 一种客流分析系统及其方法

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20150080863A (ko) * 2014-01-02 2015-07-10 삼성테크윈 주식회사 히트맵 제공 장치 및 방법
RU2611959C2 (ru) * 2015-02-27 2017-03-01 Общество С Ограниченной Ответственностью "Яндекс" Способ (варианты) и система (варианты) создания тепловой карты
CN106485868B (zh) * 2015-08-27 2019-07-16 杭州海康威视数字技术股份有限公司 火情的监测方法、系统和火情的监测服务器
US10140832B2 (en) * 2016-01-26 2018-11-27 Flir Systems, Inc. Systems and methods for behavioral based alarms
KR20180130621A (ko) * 2017-05-29 2018-12-10 전자부품연구원 서비스 공간 내 외현적 행동 반응 분석 장치 및 방법
CN107292271B (zh) * 2017-06-23 2020-02-14 北京易真学思教育科技有限公司 学习监控方法、装置及电子设备
CN108038116A (zh) * 2017-10-24 2018-05-15 安徽四创电子股份有限公司 一种基于gis的人流密度监控方法
CN108229407A (zh) * 2018-01-11 2018-06-29 武汉米人科技有限公司 一种视频分析中的行为检测方法与系统
CN108419045B (zh) * 2018-02-11 2020-08-04 浙江大华技术股份有限公司 一种基于红外热成像技术的监控方法及装置
CN108399591A (zh) * 2018-02-12 2018-08-14 北京天时前程自动化工程技术有限公司 热力站供热区域的热网数据可视化监控方法及系统
CN108428326A (zh) * 2018-03-14 2018-08-21 海南师范大学 一种生态旅游评估管理预警信息系统
CN108764047A (zh) * 2018-04-27 2018-11-06 深圳市商汤科技有限公司 群体情绪行为分析方法和装置、电子设备、介质、产品
CN109034355B (zh) * 2018-07-02 2022-08-02 百度在线网络技术(北京)有限公司 致密人群的人数预测方法、装置、设备以及存储介质
CN109508657B (zh) * 2018-10-29 2022-04-26 重庆中科云从科技有限公司 人群聚集分析方法、系统、计算机可读存储介质及设备
CN109635769B (zh) * 2018-12-20 2023-06-23 天津天地伟业信息系统集成有限公司 一种用于球型摄像机的行为识别统计方法

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447458A (zh) * 2015-11-17 2016-03-30 深圳市商汤科技有限公司 一种大规模人群视频分析系统和方法
US20170193792A1 (en) * 2015-12-31 2017-07-06 International Business Machines Corporation Visitor Flow Management
CN106251578A (zh) * 2016-08-19 2016-12-21 深圳奇迹智慧网络有限公司 基于探针的人流预警分析方法和系统
CN108846389A (zh) * 2018-08-13 2018-11-20 树蛙信息科技(南京)有限公司 一种客流分析系统及其方法

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112560715A (zh) * 2020-12-21 2021-03-26 北京市商汤科技开发有限公司 操作记录展示方法、装置、电子设备及存储介质
CN112749461A (zh) * 2020-12-25 2021-05-04 深圳供电局有限公司 负荷数据监控方法、电力系统、计算机设备和存储介质
CN112906594A (zh) * 2021-03-03 2021-06-04 杭州海康威视数字技术股份有限公司 一种布防区域生成方法、装置、设备及存储介质
CN112906594B (zh) * 2021-03-03 2022-06-03 杭州海康威视数字技术股份有限公司 一种布防区域生成方法、装置、设备及存储介质
CN113010829A (zh) * 2021-03-31 2021-06-22 建信金融科技有限责任公司 一种数据分区可视化方法、装置、计算机设备及存储介质
CN113010829B (zh) * 2021-03-31 2023-01-20 中国建设银行股份有限公司 一种数据分区可视化方法、装置、计算机设备及存储介质
CN114474091A (zh) * 2022-01-26 2022-05-13 北京声智科技有限公司 机器人消杀方法、消杀机器人、消杀设备及存储介质
CN114474091B (zh) * 2022-01-26 2024-02-27 北京声智科技有限公司 机器人消杀方法、消杀机器人、消杀设备及存储介质
CN114579889A (zh) * 2022-04-26 2022-06-03 阿里巴巴(中国)有限公司 订单热力图的推荐方法、装置、设备及存储介质
CN115099620A (zh) * 2022-06-23 2022-09-23 中国建筑第五工程局有限公司 一种基于bim的智能房建施工信息收集分析系统
CN115099620B (zh) * 2022-06-23 2023-12-05 中国建筑第五工程局有限公司 一种基于bim的智能房建施工信息收集分析系统

Also Published As

Publication number Publication date
CN111862521B (zh) 2022-07-05
CN111862521A (zh) 2020-10-30

Similar Documents

Publication Publication Date Title
WO2020221031A1 (fr) Procédé de génération de diagramme thermodynamique de comportement, et dispositif d'alarme, dispositif électronique et support de stockage
US20240037953A1 (en) Methods and systems for determining object activity within a region of interest
US9792434B1 (en) Systems and methods for security data analysis and display
US11393212B2 (en) System for tracking and visualizing objects and a method therefor
US9760792B2 (en) Object detection and classification
AU2014214545B2 (en) A surveillance system
CN110428522A (zh) 一种智慧新城的智能安防系统
US20180189532A1 (en) Object Detection for Video Camera Self-Calibration
US9407879B2 (en) Systems and methods for automated cloud-based analytics and 3-dimensional (3D) playback for surveillance systems
Wang et al. Tweeting cameras for event detection
US20140355823A1 (en) Video search apparatus and method
JP2012518846A (ja) 異常挙動を予測するためのシステムおよび方法
CN105554440A (zh) 监控方法和设备
US20180150683A1 (en) Systems, methods, and devices for information sharing and matching
US11727317B2 (en) Systems and methods for coherent monitoring
US11887374B2 (en) Systems and methods for 2D detections and tracking
US11410371B1 (en) Conversion of object-related traffic sensor information at roadways and intersections for virtual dynamic digital representation of objects
Donratanapat et al. A national scale big data analytics pipeline to assess the potential impacts of flooding on critical infrastructures and communities
EP3940666A1 (fr) Procédé de reconstruction numérique, appareil et système pour une route de circulation
Shu et al. Small moving vehicle detection via local enhancement fusion for satellite video
CN110505438B (zh) 一种排队数据的获取方法和摄像机
CN113869427A (zh) 一种场景分析方法、装置、电子设备及存储介质
US20180278573A1 (en) Pubic safety camera identification system and method
US10506201B2 (en) Public safety camera identification and monitoring system and method
Yao et al. Integrating AI into CCTV Systems: A Comprehensive Evaluation of Smart Video Surveillance in Community Space

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20798242

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20798242

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 19.05.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 20798242

Country of ref document: EP

Kind code of ref document: A1