CN112333431B - Scene monitoring method and device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN112333431B
CN112333431B (application CN202011190695.6A)
Authority
CN
China
Prior art keywords
monitoring
people
event
time period
preset time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011190695.6A
Other languages
Chinese (zh)
Other versions
CN112333431A (en)
Inventor
柴龙龙
龚超
周静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd filed Critical Shenzhen Sensetime Technology Co Ltd
Priority to CN202011190695.6A priority Critical patent/CN112333431B/en
Priority to CN202210655667.XA priority patent/CN114900669A/en
Publication of CN112333431A publication Critical patent/CN112333431A/en
Priority to PCT/CN2021/094699 priority patent/WO2022088653A1/en
Priority to JP2021576933A priority patent/JP7305808B2/en
Priority to KR1020217042832A priority patent/KR20220058859A/en
Application granted granted Critical
Publication of CN112333431B publication Critical patent/CN112333431B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/53 Recognition of crowd images, e.g. recognition of crowd congestion
    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 13/00 Burglar, theft or intruder alarms
    • G08B 13/18 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B 13/189 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B 13/194 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B 13/196 Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B 13/19602 Image analysis to detect motion of the intruder, e.g. by frame subtraction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Abstract

The present disclosure provides a scene monitoring method and apparatus, an electronic device, and a storage medium. The method includes: acquiring a monitoring video captured by monitoring devices deployed at at least one monitoring point; determining, based on the monitoring video, whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point; when a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point, acquiring people-count monitoring data matching the monitoring event within a preset time period; and determining people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period.

Description

Scene monitoring method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer vision technologies, and in particular, to a scene monitoring method and apparatus, an electronic device, and a storage medium.
Background
As living standards improve, more and more large-scale events are held in all kinds of venues. During a large-scale event, crowds become dense, so accidents such as stampedes and congestion are more likely to occur at the hosting venue. Effective monitoring of people flow is therefore increasingly important for ensuring the safety of these venues.
Disclosure of Invention
In view of the above, the present disclosure at least provides a scene monitoring method, an apparatus, an electronic device and a storage medium.
In a first aspect, the present disclosure provides a scene monitoring method, including:
acquiring a monitoring video captured by monitoring devices deployed at at least one monitoring point;
determining, based on the monitoring video, whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point;
when a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point, acquiring people-count monitoring data matching the monitoring event within a preset time period; and
determining people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period.
In the above method, a monitoring video captured by the monitoring devices is acquired; when it is determined from that video that a monitoring event occurs in the monitoring area corresponding to at least one monitoring point, people-count monitoring data matching the monitoring event within a preset time period is acquired, and people-flow state data of the at least one monitoring device is determined from that data. The determined people-flow state data characterizes the state of the monitoring event, thereby realizing monitoring of the scene.
In a possible embodiment, after determining the people-flow state data of the at least one monitoring device, the method further includes:
generating people-flow state alarm information when the people-flow state data of the at least one monitoring device is determined to meet an alarm condition.
When the determined people-flow state data meets the alarm condition, people-flow state alarm information is generated, and the target monitoring area can be regulated based on this information, so that safety accidents are avoided and the safety of crowds in the target monitoring area is ensured.
In a possible implementation, in a case that the monitoring event is a line-crossing event, determining whether a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point based on the monitoring video includes:
determining, based on the monitoring video, whether a target object crosses the position matching a pre-drawn entrance/exit boundary line in the monitoring area corresponding to the at least one monitoring point;
if so, determining that a line-crossing event occurs in the monitoring area corresponding to the at least one monitoring point.
In this embodiment, when the monitoring video shows a target object crossing the position matching the entrance/exit boundary line in the monitoring area corresponding to at least one monitoring point, a line-crossing event is determined to occur there, realizing real-time monitoring of line-crossing events and improving the accuracy of such monitoring.
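The line-crossing check described above can be sketched as a segment-intersection test: a target object crosses the boundary when its tracked positions in two consecutive frames lie on opposite sides of the pre-drawn line. This is an illustrative sketch under assumed 2D point coordinates, not the patent's claimed implementation; all function names are hypothetical.

```python
def _ccw(a, b, c):
    # Cross product of (b - a) and (c - a): positive when a->b->c turns
    # counter-clockwise, negative when clockwise, zero when collinear.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def crosses_boundary(prev_pos, cur_pos, line_start, line_end):
    """True if the movement segment prev_pos->cur_pos strictly intersects
    the pre-drawn boundary segment line_start->line_end."""
    d1 = _ccw(line_start, line_end, prev_pos)
    d2 = _ccw(line_start, line_end, cur_pos)
    d3 = _ccw(prev_pos, cur_pos, line_start)
    d4 = _ccw(prev_pos, cur_pos, line_end)
    # Opposite signs on both pairings means proper intersection.
    return (d1 * d2 < 0) and (d3 * d4 < 0)
```

The sign of `d1` versus `d2` could further distinguish the entering direction from the exiting direction relative to the boundary line.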
In a possible embodiment, in a case that the monitoring event is a line-crossing event, acquiring the people-count monitoring data matching the monitoring event within the preset time period includes:
acquiring the number of entering people and the number of exiting people at different collection time points within the preset time period, where the number of entering people at a collection time point is the number of people crossing the pre-drawn entrance/exit boundary line along the pre-drawn entering direction at that time point, and the number of exiting people is the number of people crossing the boundary line along the pre-drawn exiting direction at that time point.
In this way, when the monitoring event is a line-crossing event, the entering and exiting counts at different collection time points within the preset time period can be acquired, providing data support for subsequently determining the people-flow state data corresponding to the line-crossing event.
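The per-time-point tallying above can be sketched as follows. The event representation (a `(time_point, direction)` pair per detected crossing) and the function name are assumptions made for illustration.

```python
from collections import Counter

def count_flows(crossing_events):
    """Tally entering and exiting people per collection time point.

    crossing_events: iterable of (time_point, direction) pairs, where
    direction is 'in' (crossed along the pre-drawn entering direction)
    or 'out' (crossed along the exiting direction).
    """
    in_counts, out_counts = Counter(), Counter()
    for time_point, direction in crossing_events:
        if direction == "in":
            in_counts[time_point] += 1
        else:
            out_counts[time_point] += 1
    return in_counts, out_counts
```

A `Counter` returns 0 for time points with no crossings, which keeps downstream totals simple.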
In a possible implementation, in a case that there is one monitoring point, determining the people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period includes:
determining the total number of entering people and the total number of exiting people within the preset time period in the monitoring area corresponding to the monitoring point, based on the entering and exiting counts at different collection time points within the preset time period.
Generating the people-flow state alarm information when the people-flow state data of the at least one monitoring device is determined to meet the alarm condition then includes:
generating people-flow state alarm information when the total number of entering people within the preset time period is determined to be greater than a set first people-flow threshold and/or the total number of exiting people within the preset time period is determined to be greater than a set second people-flow threshold.
Here, when there is one monitoring point, the totals of entering and exiting people within the preset time period are determined from the per-time-point counts, and alarm information is generated when either total exceeds its threshold. This realizes early warning on the entering and exiting counts, so that crowds can be channeled based on the generated alarm information and safety accidents caused by a large number of people entering or exiting within a short time can be avoided.
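A minimal sketch of the single-point totals and the threshold check above; the function name and the list-of-counts inputs are illustrative assumptions, not the patent's implementation.

```python
def flow_alarm(in_counts, out_counts, first_threshold, second_threshold):
    """Total the entering/exiting counts over the preset period and decide
    whether people-flow state alarm information should be generated.

    in_counts / out_counts: per-collection-time-point counts in the period.
    The thresholds mirror the 'first/second people-flow threshold' above.
    """
    total_in = sum(in_counts)
    total_out = sum(out_counts)
    alarm = total_in > first_threshold or total_out > second_threshold
    return total_in, total_out, alarm
```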
In a possible embodiment, in a case that there is one monitoring point, determining the people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period includes:
determining the entering flow speed and the exiting flow speed in the monitoring area corresponding to the monitoring point, based on the entering and exiting counts at different collection time points within the preset time period.
In this way, the entering and exiting flow speeds in the monitoring area corresponding to the monitoring point can be determined and monitored, avoiding safety accidents caused by excessively fast entering or exiting flows.
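One plausible reading of "flow speed" is the total count divided by the length of the period; the disclosure does not pin this down, so the definition below is an assumption for illustration.

```python
def flow_speeds(in_counts, out_counts, period_seconds):
    """Average entering and exiting flow speed (people per second) over the
    preset time period, assuming speed = total count / period length."""
    return sum(in_counts) / period_seconds, sum(out_counts) / period_seconds
```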
In a possible embodiment, in a case that there are multiple monitoring points, determining the people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period includes:
for each monitoring point, determining the total number of entering people and the total number of exiting people within the preset time period in the monitoring area corresponding to that monitoring point, based on the entering and exiting counts at different collection time points within the preset time period;
determining the net number of people in the target monitoring area based on the historical number of people in the target monitoring area within the preset time period and the totals of entering and exiting people within the preset time period corresponding to the respective monitoring points.
Generating the people-flow state alarm information when the people-flow state data of the at least one monitoring device is determined to meet the alarm condition then includes:
generating people-flow state alarm information when the net number of people in the target monitoring area is determined to be greater than a set net-headcount threshold.
After the per-point totals of entering and exiting people within the preset time period are determined, the net number of people in the target monitoring area is computed from the historical headcount and those totals, and alarm information is generated when the net headcount exceeds the set threshold. This realizes early warning on the net number of people in the target monitoring area, so that when it is large, crowds can be channeled based on the generated alarm information, avoiding safety accidents caused by too many people in the area.
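The net-headcount bookkeeping above can be sketched as the historical headcount plus the net inflow summed over all monitoring points. This is an illustrative sketch; the per-point tuple format and both function names are assumptions.

```python
def net_headcount(historical_count, per_point_totals):
    """Net number of people currently inside the target monitoring area.

    historical_count: headcount baseline for the target monitoring area.
    per_point_totals: list of (total_in, total_out) pairs, one per
    monitoring point, accumulated over the preset time period.
    """
    return historical_count + sum(t_in - t_out for t_in, t_out in per_point_totals)

def net_stock_alarm(historical_count, per_point_totals, threshold):
    """True when the net headcount exceeds the set net-headcount threshold."""
    return net_headcount(historical_count, per_point_totals) > threshold
```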
In a possible implementation, in a case that the monitoring event is an over-dense event, determining whether a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point based on the monitoring video includes:
determining, based on the monitoring video, whether the number of target objects in the monitoring area corresponding to the at least one monitoring point exceeds an over-density threshold;
if so, determining that an over-dense event occurs in the monitoring area corresponding to the at least one monitoring point.
In this way, when the number of target objects in the monitoring area corresponding to the at least one monitoring point is determined from the monitoring video to exceed the over-density threshold, an over-dense event is determined to occur there, realizing real-time monitoring of over-dense events and improving the accuracy of such monitoring.
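Assuming a detector that yields a per-frame count of target objects (the detector itself is outside the scope of this sketch), the over-dense determination reduces to a threshold test, sketched here with hypothetical names:

```python
def detect_over_dense(frame_counts, density_threshold):
    """Return (True, frame_index) for the first frame whose detected target
    object count exceeds the over-density threshold, else (False, None).

    frame_counts: per-frame target object counts from the monitoring video,
    standing in for the output of a person detector (an assumption here).
    """
    for i, count in enumerate(frame_counts):
        if count > density_threshold:
            return True, i
    return False, None
```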
In a possible embodiment, in a case that the monitoring event is an over-dense event, acquiring the people-count monitoring data matching the monitoring event within the preset time period includes:
counting the number of target objects at different collection time points within the preset time period.
In this way, when the monitoring event is an over-dense event, the number of target objects at different collection time points within the preset time period can be counted, providing data support for subsequently determining the people-flow state data corresponding to the over-dense event.
In a possible embodiment, in a case that there is one monitoring point, determining the people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period includes:
determining the average number of people in the monitoring area corresponding to the monitoring point within the preset time period, based on the numbers of target objects at different collection time points within the preset time period.
Generating the people-flow state alarm information when the people-flow state data of the at least one monitoring device is determined to meet the alarm condition then includes:
generating people-flow state alarm information when the average number of people in the monitoring area corresponding to the monitoring point within the preset time period is determined to be greater than a set first number threshold.
Here, with one monitoring point, the average number of people in its monitoring area within the preset time period is determined from the per-time-point target object counts, and alarm information is generated when that average exceeds the set first number threshold. This realizes monitoring of the average headcount in the monitored area, so that crowds can be dispersed based on the generated alarm information and safety accidents in densely crowded areas can be avoided.
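The single-point average and its alarm check above amount to a mean over the collection time points followed by a threshold comparison; a minimal sketch with assumed names:

```python
def average_headcount(counts):
    """Average number of people over the collection time points in the
    preset time period (counts: per-time-point target object counts)."""
    return sum(counts) / len(counts)

def density_alarm(counts, first_number_threshold):
    """True when the period's average headcount exceeds the set first
    number threshold (single-monitoring-point case)."""
    return average_headcount(counts) > first_number_threshold
```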
In a possible embodiment, in a case that there are multiple monitoring points, determining the people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period includes:
for each monitoring point, determining the average number of people in the monitoring area corresponding to that monitoring point within the preset time period, based on the numbers of target objects at different collection time points within the preset time period;
determining the total real-time number of people in the target monitoring area based on the average numbers of people corresponding to the respective monitoring points.
Generating the people-flow state alarm information when the people-flow state data of the at least one monitoring device is determined to meet the alarm condition then includes:
generating people-flow state alarm information when the total real-time number of people in the target monitoring area is determined to be greater than a set second number threshold.
Here, after the per-point average headcounts within the preset time period are determined, the total real-time number of people in the target monitoring area can be computed from them, and alarm information is generated when that total exceeds the set second number threshold. This realizes early warning on the total real-time headcount, so that when it is large, crowds in the target monitoring area can be channeled based on the generated alarm information, avoiding safety accidents caused by too many people in the area.
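Summing the per-point averages gives the total real-time headcount described above; the dictionary input format is an assumption for illustration.

```python
def total_realtime_headcount(per_point_counts):
    """Sum the per-period average headcounts of all monitoring points to get
    the total real-time number of people in the target monitoring area.

    per_point_counts: {point_id: per-collection-time-point counts}.
    """
    return sum(sum(counts) / len(counts) for counts in per_point_counts.values())
```

This assumes the monitoring areas do not overlap; overlapping coverage would double-count people.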
In a possible embodiment, the method further includes:
averaging the people-flow state data at the same collection time point over several recent historical dates to obtain predicted people-flow state data for each collection time point;
the predicted people-flow state data for the collection time points together form prediction data of the people-flow state for a future date, where the prediction data is used to generate a people-flow grooming plan.
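The prediction step above is a per-time-point mean across recent historical dates; a minimal sketch under the assumption that each date contributes a series aligned on the same collection time points:

```python
def predict_flow(recent_days):
    """Average the people-flow state value at each collection time point
    across several recent historical dates, producing a predicted series
    for a future date (usable as input to a people-flow grooming plan).

    recent_days: one list of values per historical date, all aligned on
    the same collection time points.
    """
    n_days = len(recent_days)
    n_points = len(recent_days[0])
    return [sum(day[i] for day in recent_days) / n_days for i in range(n_points)]
```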
For the effects of the apparatus, the electronic device, the storage medium, and the like described below, reference is made to the description of the above method; details are not repeated here.
In a second aspect, the present disclosure provides a scene monitoring device, including:
a first acquisition module, configured to acquire a monitoring video captured by monitoring devices deployed at at least one monitoring point;
a detection module, configured to determine, based on the monitoring video, whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point;
a second acquisition module, configured to acquire people-count monitoring data matching the monitoring event within a preset time period when a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point; and
a determining module, configured to determine people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period.
In a third aspect, the present disclosure provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor; when the electronic device runs, the processor and the memory communicate through the bus, and when the processor executes the machine-readable instructions, the steps of the scene monitoring method according to the first aspect or any one of its embodiments are performed.
In a fourth aspect, the present disclosure provides a computer-readable storage medium on which a computer program is stored, where the computer program, when executed by a processor, performs the steps of the scene monitoring method according to the first aspect or any one of its embodiments.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings used in the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain its technical solutions. The drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those skilled in the art can derive further related drawings from them without inventive effort.
Fig. 1 is a schematic flowchart illustrating a scene monitoring method according to an embodiment of the present disclosure;
FIG. 2a is an interface diagram of a surveillance video frame with a monitoring mark drawn on it, according to an embodiment of the present disclosure;
FIG. 2b is another interface diagram of a surveillance video frame with a monitoring mark drawn on it, according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating an interface showing detailed information of a people stream status alarm according to an embodiment of the present disclosure;
FIG. 4a is a schematic diagram illustrating an interface showing detailed information of a people stream status alarm according to an embodiment of the present disclosure;
FIG. 4b is a schematic diagram of another interface for displaying detailed information of a people flow state alarm provided by the embodiment of the present disclosure;
FIG. 4c is a schematic diagram of an interface for displaying details of an alarm according to an embodiment of the present disclosure;
FIG. 5 is a schematic diagram illustrating an interface showing detailed information of a people stream status alarm according to an embodiment of the present disclosure;
FIG. 6a is a schematic diagram illustrating an interface showing detailed information of a people stream status alarm according to an embodiment of the present disclosure;
FIG. 6b is a schematic diagram of an interface for displaying details of an alarm according to an embodiment of the present disclosure;
fig. 7 is a schematic diagram illustrating an architecture of a scene monitoring apparatus provided in an embodiment of the present disclosure;
fig. 8 shows a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions of the embodiments are described clearly and completely below with reference to the drawings. The described embodiments are obviously only some, not all, of the embodiments of the present disclosure. The components of the embodiments, as generally described and illustrated in the figures, can be arranged and designed in a wide variety of configurations. The following detailed description is therefore not intended to limit the scope of the claimed disclosure but is merely representative of selected embodiments. All other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the disclosure.
As living standards improve, more and more large-scale events are held in all kinds of venues. During a large-scale event, crowds become dense, so accidents such as stampedes and congestion are more likely to occur at the hosting venue. Effective monitoring of people flow is therefore increasingly important for ensuring venue safety. To address these problems and improve the security of such venues, embodiments of the present disclosure provide a scene monitoring method and apparatus, an electronic device, and a storage medium.
To facilitate understanding of the embodiments of the present disclosure, the scene monitoring method disclosed herein is first described in detail. The execution subject of the scene monitoring method provided by the embodiments of the present disclosure is generally a computer device with certain computing capability, such as a terminal device, a server, or another processing device. The terminal device may be user equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a personal digital assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, or a wearable device. In some possible implementations, the scene monitoring method may be implemented by a processor calling computer-readable instructions stored in a memory.
Referring to fig. 1, a schematic flowchart of the scene monitoring method provided in an embodiment of the present disclosure is shown. The method includes steps S101-S104:
S101: acquire a monitoring video captured by monitoring devices deployed at at least one monitoring point.
S102: determine, based on the monitoring video, whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point.
S103: when a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point, acquire people-count monitoring data matching the monitoring event within a preset time period.
S104: determine people-flow state data of the at least one monitoring device based on the people-count monitoring data matching the monitoring event within the preset time period.
In this method, a monitoring video captured by the monitoring devices is acquired; when it is determined from that video that a monitoring event occurs in the monitoring area corresponding to at least one monitoring point, people-count monitoring data matching the monitoring event within a preset time period is acquired, and people-flow state data of the at least one monitoring device is determined from that data. The determined people-flow state data characterizes the state of the monitoring event, thereby realizing monitoring of the scene.
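The S101-S104 flow can be sketched as follows. The three callables stand in for components the disclosure leaves unspecified (the event detector, the people-count collector, and the flow-state computation), so every name here is illustrative rather than the patent's implementation.

```python
def monitor_scene(videos, detect_event, collect_counts, derive_state):
    """Minimal sketch of the S101-S104 flow.

    videos: {monitoring_point: surveillance video for that point}  (S101)
    detect_event(video) -> bool: whether a monitoring event occurs  (S102)
    collect_counts(point) -> people-count monitoring data           (S103)
    derive_state(counts) -> people-flow state data                  (S104)
    """
    states = {}
    for point, video in videos.items():
        if detect_event(video):
            counts = collect_counts(point)
            states[point] = derive_state(counts)
    return states
```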
S101 to S104 will be specifically described below.
For S101:
In a specific implementation, the method can be used to monitor a target monitoring area, which can be any area in a real scene, for example a shopping mall, a beach, a park, or a subway station.
For example, multiple monitoring points may be set in the target monitoring area, with one monitoring device installed at each point, so that each device monitors its corresponding monitoring area and together they cover the target monitoring area. Monitoring points can be chosen according to actual needs; for example, when the target monitoring area is a shopping mall, a monitoring point can be set at each entrance and/or at each elevator doorway.
In a specific implementation, the monitoring device may be a surveillance camera or the like. A monitoring device is arranged at each monitoring point and collects the monitoring video of its corresponding monitoring area, so the monitoring video collected by each device can be obtained, i.e., the monitoring video collected by the monitoring devices deployed at the at least one monitoring point.
For S102:
here, for the monitoring video collected by each monitoring device, based on the monitoring video, it may be determined whether a monitoring event occurs in the monitoring area corresponding to the monitoring point, and further, it may be determined whether a monitoring event occurs in the monitoring area corresponding to each monitoring point in at least one monitoring point. Wherein, the monitoring event can comprise an overt event and/or a line crossing event; the over-dense event means that the number density of people in the area is greater than a set value, namely the number density of people in the area is greater; the cross line refers to a reference line which is arranged in a crossing way by pedestrians in the area.
The crossing event and the dense event are described with reference to a specific scene, for example, for the crossing event, in a subway station, a reference line may be set at a position on the platform that is a preset distance (for example, 1 meter) from the subway, and whether a person crosses the reference line (i.e., whether a person crosses the reference line to enter the subway or exit the subway) is monitored, if so, the crossing event occurs. For example, for an over-dense event, on a beach, a target monitoring area may be set, and when the number of people in the target monitoring area is greater than the set number of people, it is determined that the over-dense event occurs.
In another optional implementation manner, a function button may further be provided for each monitoring video, and the monitoring event of the monitoring area of the monitoring point location is determined by triggering the function button. For example, a first function button corresponding to the line crossing event may be set (used for line crossing event monitoring of a single surveillance video): after the first function button corresponding to surveillance video A is triggered, it is determined that line crossing event monitoring is performed on surveillance video A. Alternatively, a second function button corresponding to the line crossing event may be set (used for line crossing event monitoring of a surveillance video group): after the second function button corresponding to surveillance video group A, composed of surveillance video A, surveillance video B and the like, is triggered, it is determined that line crossing event monitoring is performed on surveillance video group A.
For another example, a third function button corresponding to the over-dense event may be set (used for over-dense event monitoring of a single surveillance video): after the third function button corresponding to surveillance video A is triggered, it is determined that over-dense event monitoring is performed on surveillance video A. Alternatively, a fourth function button corresponding to the over-dense event may be set (used for over-dense event monitoring of a surveillance video group): after the fourth function button corresponding to surveillance video group A, composed of surveillance video A, surveillance video B and the like, is triggered, it is determined that over-dense event monitoring is performed on surveillance video group A.
In specific implementation, before determining whether a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point location based on the monitoring video, a monitoring identifier corresponding to the monitoring video may be drawn. For a line crossing event, the monitoring identifier can be a pre-drawn entrance and exit boundary line together with an entrance direction and an exit direction; for an over-dense event, the monitoring identifier may be any pre-drawn polygon, or no corresponding monitoring identifier may be set. The monitoring identifiers corresponding to different monitoring videos are different; that is, for each monitoring video, a corresponding monitoring identifier (a monitoring identifier corresponding to the line crossing event and/or a monitoring identifier corresponding to the over-dense event) can be drawn.
In specific implementation, for each monitoring video, one frame of video screen capture can be taken from the monitoring video and displayed, so that the user can draw the monitoring identifier on the video screen capture according to actual needs. Then, by acquiring the video picture screenshot on which the monitoring identifier is pre-drawn, the position information of the monitoring identifier in the video picture screenshot is determined; the position information can be the coordinate set of the monitoring identifier in the pixel coordinate system corresponding to the video picture screenshot, for example, the position information of the boundary line. Further, the target position information matching the monitoring identifier can be determined in the video picture of the surveillance video. When installation information such as the position and orientation of the monitoring device is unchanged, the position information of the monitoring identifier in the screenshot of the video picture can serve as the target position information of the monitoring identifier in the video picture of the monitoring video. Whether a monitoring event occurs in the monitoring area corresponding to the monitoring point location can then be determined based on the monitoring video collected by the monitoring device and the determined target position information.
Illustratively, when the monitoring identifier includes a monitoring identifier corresponding to a line crossing event, refer to the interface diagram shown in fig. 2a, which shows a video screen capture on which the monitoring identifier is drawn; fig. 2a includes a pre-drawn monitoring identifier 21, which includes a drawn entrance and exit boundary line and an arrow identifier indicating the entrance and exit direction. When the monitoring identifier is drawn, an incoming flow threshold (namely, the first people stream threshold) and/or an outgoing flow threshold (namely, the second people stream threshold) can be set on the displayed interface, so that the monitoring video can be monitored based on the set incoming flow threshold and/or outgoing flow threshold. The figure also comprises prompt information for line crossing event setting, located above the screenshot of the video picture, so that the user can draw the monitoring identifier according to the displayed prompt information. While the monitoring identifier is being drawn, a redraw button can be triggered to delete the drawn monitoring identifier and redraw a new one.
When the monitoring identifier includes a monitoring identifier corresponding to an over-dense event, refer to the interface schematic diagram of a video screen screenshot drawn with the monitoring identifier shown in fig. 2b; the diagram includes a pre-drawn monitoring identifier 21, which includes a polygon indicating a detection area, where the number of detection areas may be multiple. When the monitoring identifier is drawn, the graded early-warning people numbers, namely the early-warning people number corresponding to general risk, the early-warning people number corresponding to greater risk and the early-warning people number corresponding to major risk, can be set on the displayed interface, so that the monitoring video can be monitored based on the set graded early-warning people numbers. The diagram also comprises prompt information for over-dense event setting, located above the screenshot of the video picture, so that the user can draw a monitoring identifier indicating the detection area according to the displayed prompt information. While the monitoring identifier is being drawn, a redraw button can be triggered to delete the drawn monitoring identifier and redraw a new one. After the monitoring identifier corresponding to the monitoring video is drawn, the drawn monitoring identifier can be stored in the multiplexing area, so that the next time a monitoring identifier is determined, the function button of the multiplexing area can be triggered directly and the monitoring identifier reused.
The function button of the human body label in fig. 2b is used for displaying the setting information of the human body label. The area occupied by a human body in the monitoring video picture is related to the height and angle of the monitoring device: the same human body occupies areas of different sizes at different distances from the monitoring device, and the shorter the distance between the human body and the monitoring device, the larger the area it occupies. For this reason, the human body label is a basic setting for both the line crossing event and the over-dense event.
Specifically, in the screenshot of the video picture, human body frames are marked for a plurality of pedestrians at different depth positions, and the area and the depth information of the human body frame of each pedestrian are estimated. This allows the algorithm (for example, an image recognition algorithm for recognizing the human body) to use the human body labeling results on different monitoring devices under different conditions, thereby improving the recognition precision; the more human body frames there are, the higher the precision. In specific implementation, the number of marked pedestrian frames can be set as needed, for example, to a range of 3-10. The areas of the human body frames of the plurality of pedestrians and the depth information of each pedestrian can then be used to detect, per second, the real-time number of people in the detection area of the video picture of the monitoring video.
For an over-dense event, after the detection area (monitoring identifier) is drawn, the predicted area of the drawn region in the real scene can be calculated based on the human body samples marked in the human body label; the predicted area is displayed at the area-estimation position below in fig. 2b, and the personnel density in the detection area can be calculated from it in subsequent point-location over-density alarms, video-group over-density alarms and the like. Fig. 2b also includes a "correct area" function button; after this button is triggered, the predicted area displayed at the estimated area of the region can be corrected.
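One possible way to estimate the predicted real-world area of the drawn detection region from a labeled human body frame is sketched below. The shoelace formula for the polygon's pixel area and the nominal real-world area of one body frame (0.85 m²) are illustrative assumptions, not the disclosure's actual algorithm:

```python
def polygon_pixel_area(points):
    """Shoelace formula for the pixel area of the drawn detection polygon."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def estimate_region_area(polygon, body_box_px, body_area_m2=0.85):
    """Predict the region's real-world area from one labeled body frame.

    body_box_px: (width, height) of a labeled pedestrian box in pixels.
    body_area_m2: assumed real-world area covered by that box (illustrative).
    """
    body_px = body_box_px[0] * body_box_px[1]
    m2_per_px = body_area_m2 / body_px  # pixel-to-square-metre scale from the label
    return polygon_pixel_area(polygon) * m2_per_px

# A 100x100 px square region; one labeled body box of 20x50 px.
area = estimate_region_area([(0, 0), (100, 0), (100, 100), (0, 100)], (20, 50))
```

In practice the per-pixel scale would vary with depth across the frame, which is presumably why the disclosure marks several body frames at different depth positions and offers a "correct area" button.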
In an optional implementation manner, in a case that a monitoring event is a line crossing event, determining whether a monitoring event occurs in a monitoring area corresponding to at least one monitoring point location based on a monitoring video includes:
determining whether a target object crossing a target position matched with a pre-drawn access boundary line exists in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
and if so, determining that a cross-line event occurs in a monitoring area corresponding to the at least one monitoring point.
When the monitoring event is a cross-line event, for the monitoring video acquired by each monitoring point location, whether a target object crossing a target position matched with an in-out boundary line exists in a monitoring area corresponding to the monitoring point location or not can be determined based on the monitoring video, for example, whether a pedestrian crosses a drawn in-out boundary line is detected in the monitoring video, and if yes, the cross-line event occurs in the monitoring area corresponding to the monitoring point location; and if not, determining that no cross-line event occurs in the monitoring area corresponding to the monitoring point.
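A geometric check of the kind described, deciding whether a tracked target's step from its previous position to its current position crosses the drawn entrance and exit boundary line and on which side it ends up, could be sketched as follows; the sign convention for the "in" direction is an assumption:

```python
def cross(o, a, b):
    """2-D cross product of vectors OA and OB."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def crossed_line(prev_pos, cur_pos, line_a, line_b):
    """True if the step prev_pos -> cur_pos strictly crosses segment line_a-line_b."""
    d1 = cross(line_a, line_b, prev_pos)
    d2 = cross(line_a, line_b, cur_pos)
    if d1 * d2 >= 0:          # both positions on the same side (or touching the line)
        return False
    d3 = cross(prev_pos, cur_pos, line_a)
    d4 = cross(prev_pos, cur_pos, line_b)
    return d3 * d4 < 0        # the line's endpoints straddle the step as well

def crossing_direction(prev_pos, cur_pos, line_a, line_b):
    """'in' when the target ends on the positive side of a->b (convention assumed)."""
    return "in" if cross(line_a, line_b, cur_pos) > 0 else "out"
```

For a boundary line from (0, 0) to (10, 0), a pedestrian tracked from (5, -1) to (5, 1) crosses it, while one moving from (5, 1) to (5, 2) stays on the same side and does not.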
The monitoring area corresponding to the monitoring point location can be a detection area which can be monitored by the monitoring equipment arranged at the monitoring point location; the monitoring area corresponding to the monitoring point location is related to the installation position and the installation angle of the monitoring equipment, and different installation positions and/or installation angles correspond to different monitoring areas.
In the above embodiment, when a target object crossing a target position matched with the access boundary line exists in the monitoring area corresponding to at least one monitoring point location determined based on the monitoring video, it is determined that a line crossing event occurs in the monitoring area corresponding to at least one monitoring point location, so that real-time monitoring of the line crossing event is realized, and accuracy of monitoring the line crossing event is improved.
In an optional embodiment, in a case that the monitoring event is an over-dense event, determining whether a monitoring event occurs in a monitoring area corresponding to at least one monitoring point location based on the monitoring video includes:
determining whether the number of target objects in a monitoring area corresponding to the at least one monitoring point location exceeds an over-dense threshold value or not based on the monitoring video;
and if so, determining that an over-dense event occurs in a monitoring area corresponding to the at least one monitoring point location.
When the monitoring event is an over-dense event, whether the number of target objects in the monitoring area corresponding to each monitoring point location exceeds the over-density threshold is determined based on the monitoring video; for example, the number of people in the monitoring area in the monitoring video is determined, and whether that number is greater than the set over-density threshold is judged. If so, it is determined that an over-dense event occurs in the monitoring area corresponding to the monitoring point location; if not, it is determined that no over-dense event has occurred in the monitoring area corresponding to the monitoring point location. Here, when the monitoring event is an over-dense event, the monitoring area corresponding to the monitoring point location may be the detection area matched with the drawn polygon; when no monitoring identifier is drawn, the monitoring area corresponding to the monitoring point location is the detection area that can be monitored by the monitoring device arranged at the monitoring point location (that is, the whole area corresponding to the monitoring interface of the monitoring video is the monitoring area).
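Combining this count check with the graded early-warning people numbers set in fig. 2b (general, greater, and major risk), a possible grading function might look like this; the threshold values are illustrative:

```python
def density_alarm_level(people_count, general, greater, major):
    """Map the detected head count in the drawn region to a risk grade.

    The three thresholds correspond to the graded early-warning people
    numbers set on the interface (general / greater / major risk).
    """
    if people_count >= major:
        return "major"
    if people_count >= greater:
        return "greater"
    if people_count >= general:
        return "general"
    return None  # below all thresholds: no over-dense event

level = density_alarm_level(people_count=120, general=50, greater=100, major=200)
```

A count of 120 against thresholds of 50/100/200 falls into the "greater risk" grade, while a count below 50 raises no over-dense event at all.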
According to the method, when the number of the target objects in the monitoring area corresponding to the at least one monitoring point is determined to exceed the over-density threshold value based on the monitoring video, the over-density event is determined to occur in the monitoring area corresponding to the at least one monitoring point, so that the real-time monitoring of the over-density event is realized, and the accuracy of monitoring the over-density event is improved.
For S103 and S104:
When a monitoring event occurs in a monitoring area corresponding to at least one monitoring point location, people number monitoring data matched with the monitoring event within a preset time period is acquired; the people number monitoring data comprises people number monitoring data corresponding to the line crossing event and/or people number monitoring data corresponding to the over-dense event. Further, the people flow state data of the at least one monitoring device is determined based on the people number monitoring data matched with the monitoring event within the preset time period; the people stream state data comprises people stream state data corresponding to the line crossing event and/or people stream state data corresponding to the over-dense event. For example, the preset time period may be a one-hour period starting from the moment when the monitoring event is determined to occur: if that moment is 13:10:00, the preset time period is the period from 13:10:00 to 14:10:00. Alternatively, the preset time period may be a one-minute period starting from that moment: if the moment is 13:10:00, the preset time period is the period from 13:10:00 to 13:11:00.
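The preset-time-period examples above reduce to simple datetime arithmetic; the helper name is illustrative:

```python
from datetime import datetime, timedelta

def preset_period(event_time, length):
    """Preset time period starting at the moment the event is detected."""
    return event_time, event_time + length

# The one-hour example from the text: event detected at 13:10:00.
start, end = preset_period(datetime(2020, 10, 30, 13, 10, 0), timedelta(hours=1))
```

The same helper yields the one-minute variant with `timedelta(minutes=1)`.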
Aiming at the cross-line event, the number of people monitoring data matched with the cross-line event in a preset time period can be acquired; and determining the people flow state data, matched with the cross-line event, of at least one monitoring device based on the people number monitoring data matched with the cross-line event in a preset time period.
Aiming at the over-dense event, the people number monitoring data matched with the over-dense event within a preset time period can be acquired; and the people flow state data, matched with the over-dense event, of the at least one monitoring device is determined based on the people number monitoring data matched with the over-dense event within the preset time period.
In an optional embodiment, after determining the people flow state data of the at least one monitoring device, the method further includes: generating people stream state alarm information in the case that the people flow state data of the at least one monitoring device meets an alarm condition.
In specific implementation, after the people flow state data is determined, it can be judged whether the data meets the alarm condition; if so, people stream state alarm information is generated, so that a user can generate a dispersion plan based on the people stream state alarm information and events such as trampling and congestion in the target monitoring area can be avoided.
And when the determined people flow state data meet the alarm condition, people flow state alarm information is generated, and the target monitoring area can be regulated and controlled based on the generated people flow state alarm information, so that safety accidents are avoided, and the safety of people flow under the target monitoring area is ensured.
The following describes the alarm process of the line crossing event and the alarm process of the over-dense event in detail.
First, the alarm process of the line crossing event will be explained.
Under the condition that the monitoring event is a cross-line event, acquiring the number monitoring data matched with the monitoring event in a preset time period, wherein the number monitoring data comprises the following steps:
acquiring the number of people entering and the number of people leaving at different acquisition time points within a preset time period, wherein the number of people entering at different acquisition time points refers to the number of people crossing a pre-drawn entrance and exit boundary line along a pre-drawn entrance direction at different acquisition time points; the number of people leaving at different acquisition time points refers to the number of people crossing the pre-drawn entrance and exit boundary line along the pre-drawn exit direction at different acquisition time points.
Here, the monitoring identifier corresponding to the cross-line event may include a preset ingress and egress boundary line and an ingress and egress direction (an ingress direction and/or an egress direction, where the egress direction is an opposite direction of the ingress direction), where the ingress and egress boundary line may divide a monitoring area corresponding to the monitoring video into an ingress area and an egress area, the ingress and egress direction in the ingress and egress direction may be a direction from the egress area into the ingress area, and the egress direction in the ingress and egress direction may be a direction from the ingress area into the egress area.
Further, the number of people entering the flow (i.e. the number of people entering the flow) and the number of people exiting the flow at each acquisition time point in a preset time period in the monitoring video can be determined based on the set entering and exiting boundary line, entering and exiting direction and the monitoring video, wherein the number of people entering the flow at different acquisition time points refers to the number of people crossing the entering and exiting boundary line along the entering direction at different acquisition time points; the number of people who flow out at different collection time points refers to the number of people crossing the in-and-out boundary line along the out direction at different collection time points.
Illustratively, a trained target tracking algorithm can be used to detect the monitoring video based on the set monitoring identifier, outputting a detection result once every preset interval within the preset time period. The multiple detection results within the preset time period constitute the number of people entering and the number of people leaving at different acquisition time points within the preset time period; each detection result is associated with an output time (the output time being an acquisition time point), so the number of people entering and the number of people leaving at different acquisition time points within the preset time period can be obtained.
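The collection of per-interval detection results within the preset time period might be sketched as follows, with simulated tracker output (one result every 3 seconds) standing in for the trained target tracking algorithm:

```python
from datetime import datetime, timedelta

def collect_counts(detections, period_start, period_end):
    """Keep the per-interval (time, in, out) detection results whose output
    time (the acquisition time point) falls inside the preset period."""
    return [
        (t, n_in, n_out)
        for t, n_in, n_out in detections
        if period_start < t <= period_end
    ]

t0 = datetime(2020, 10, 30, 8, 10, 0)
# Simulated tracker output: one result every 3 seconds, 20 in / 50 out each.
detections = [(t0 + timedelta(seconds=3 * k), 20, 50) for k in range(1, 5)]
samples = collect_counts(detections, t0, t0 + timedelta(minutes=1))
```

Each retained tuple is one acquisition time point with its incoming and outgoing counts, ready for the totalling and rate calculations described next.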
According to the method, when the monitoring event is the cross-line event, the number of people entering the stream and the number of people exiting the stream at different acquisition time points in the preset time period can be acquired, and data support is provided for subsequently determining the people stream state data corresponding to the cross-line event.
In an optional embodiment, in a case that the monitoring point is located at one point, determining people flow state data of at least one monitoring device based on the number of people monitoring data matched with the monitoring event within a preset time period includes: and determining the total number of people entering and the total number of people exiting in the preset time period in the monitoring area corresponding to the monitoring point location based on the number of people entering and the number of people exiting at different acquisition time points in the preset time period.
Generating people stream state alarm information under the condition that the people stream state data of at least one monitoring device is determined to meet the alarm condition, wherein the people stream state alarm information comprises the following steps: and generating people stream state alarm information under the condition that the total incoming people stream number in the preset time period is determined to be larger than the set first people stream threshold value and/or the total outgoing people stream number in the preset time period is determined to be larger than the set second people stream threshold value.
After the number of people monitoring data matched with the monitoring event (cross-line event) within the preset time period is obtained, that is, after the number of people entering the flow and the number of people exiting the flow at different acquisition time points within the preset time period are obtained for the cross-line event, the total number of people entering the flow and the total number of people exiting the flow within the preset time period in the monitoring area corresponding to the monitoring point location can be determined based on the number of people entering the flow and the number of people exiting the flow at different acquisition time points within the preset time period.
Continuing with the above embodiment, the trained target tracking algorithm may output a detection result every 3 seconds (determining an acquisition time point every 3 seconds), where the detection result may be the number of incoming flows and the number of outgoing flows within those 3 seconds. For example, the detection result may be: the number of incoming flows between 08:10:01 and 08:10:03 (including 08:10:01 and 08:10:03) is 20, the number of outgoing flows is 50, and the associated output time (acquisition time point) is 08:10:03. In this way, multiple detection results within the preset time period can be obtained, and thus the number of people entering and the number of people leaving at different acquisition time points within the preset time period.
After the number of incoming streams and the number of outgoing streams at different acquisition time points in the preset time period are obtained, the number of incoming streams at different acquisition time points can be added to obtain the total number of incoming streams in the preset time period; and the number of the people coming out at different acquisition time points can be added to obtain the total number of the people coming out in the preset time period.
Here, the first and second people stream thresholds are preset, and may be set according to actual needs. After the total number of incoming flows and the total number of outgoing flows within the preset time period are obtained, whether the total number of incoming flows within the preset time period is greater than a set first people flow threshold value or not and/or whether the total number of outgoing flows within the preset time period is greater than a set second people flow threshold value or not can be judged.
When judging whether the total incoming flow number in the preset time period is greater than the set first people stream threshold and whether the total outgoing flow number in the preset time period is greater than the set second people stream threshold, if the total incoming flow number is greater than the set first people stream threshold and/or the total outgoing flow number is greater than the set second people stream threshold, people stream state warning information is generated. The generated people stream state warning information may be information in a format such as text, voice or video; for example, the generated people stream state warning information may be "attention, large people flow". In this case, the alarm event type of the people stream state alarm information is: the point location line-crossing alarm.
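The totalling and two-threshold check described above can be sketched in a few lines; the function name, sample counts, and thresholds are illustrative:

```python
def line_cross_alarm(in_counts, out_counts, in_threshold, out_threshold):
    """Sum per-acquisition-point counts and check the two people-stream thresholds."""
    total_in = sum(in_counts)
    total_out = sum(out_counts)
    alarms = []
    if total_in > in_threshold:          # first people stream threshold
        alarms.append("incoming flow too large")
    if total_out > out_threshold:        # second people stream threshold
        alarms.append("outgoing flow too large")
    return total_in, total_out, alarms

total_in, total_out, alarms = line_cross_alarm(
    in_counts=[20, 35, 50], out_counts=[10, 15, 20],
    in_threshold=100, out_threshold=60)
```

Here the incoming total (105) exceeds its threshold while the outgoing total (45) does not, so only one alarm is raised, matching the "and/or" condition in the text.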
Further, after the generated people stream state alarm information is triggered, the detailed information of the people stream state alarm may be displayed, where the detailed information includes, but is not limited to, the alarm point location (i.e., the name of the monitoring device that raises the alarm, etc.), the alarm time, and the alarm event type; when the alarm event type is a point location line-crossing alarm, the detailed information further includes the number of people entering per unit time, the number of people exiting per unit time, and the like.
Here, when the number of monitoring points is one, the total incoming people flow number and the total outgoing people flow number within the preset time period in the monitoring area corresponding to the monitoring point are determined based on the number of people entering and the number of people leaving at different acquisition time points within the preset time period. People stream state warning information is generated when the total incoming people flow number in the preset time period is greater than the set first people stream threshold and/or the total outgoing people flow number is greater than the set second people stream threshold, so that early warning of the incoming and outgoing people flow numbers is realized, the people flow can be dredged based on the generated warning information, and safety accidents caused by a large incoming or outgoing people flow in a short time are avoided.
In an optional embodiment, in a case that the monitoring point is located at one point, determining people flow state data of at least one monitoring device based on the number of people monitoring data matched with the monitoring event within a preset time period includes: and determining the people entering flow speed and the people exiting flow speed in the monitoring area corresponding to the monitoring point location based on the people entering flow quantity and the people exiting flow quantity of different acquisition time points in a preset time period.
Here, the people entering flow speed and the people exiting flow speed in the monitoring area corresponding to the monitoring point location can be determined based on the people entering flow number and the people exiting flow number of different acquisition time points in the preset time period.
In specific implementation, after the number of people entering and the number of people exiting at different collection time points within a preset time period are obtained, the detection results for multiple times can be classified and integrated according to output time (collection time points) to obtain the number of people entering and the number of people exiting within unit time (for example, one minute), and then the speed of people entering and the speed of people exiting can be obtained.
For example, the output results within the output time range of 08:10:00 to 08:11:00 (excluding 08:10:00 and including 08:11:00) may be classified and integrated: the output results obtained at 08:10:03, 08:10:06, ..., 08:10:57 and 08:11:00 are classified into one class, and the detection results within the class are integrated to obtain the number of incoming flows and the number of outgoing flows within 1 minute (within unit time) between 08:10:00 and 08:11:00, that is, the incoming flow speed (unit: persons/minute) and the outgoing flow speed (unit: persons/minute) corresponding to 08:10.
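The classification and integration of 3-second detection results into per-minute rates, with a result stamped at time t assigned to the minute interval it closes (start excluded, end included, as in the example above), might be sketched as:

```python
from datetime import datetime, timedelta

def per_minute_rates(detections):
    """Group interval detection results into whole minutes.

    A result stamped at time t is assigned to the minute bucket containing
    (t - 1 second), so a result at 08:11:00 closes the 08:10 minute,
    matching the half-open interval in the text.
    """
    buckets = {}
    for t, n_in, n_out in detections:
        key = (t - timedelta(seconds=1)).replace(second=0, microsecond=0)
        acc = buckets.setdefault(key, [0, 0])
        acc[0] += n_in   # incoming count for this minute
        acc[1] += n_out  # outgoing count for this minute
    return buckets

t0 = datetime(2020, 10, 30, 8, 10, 0)
# 20 results at 08:10:03, 08:10:06, ..., 08:11:00, each 1 in / 2 out.
detections = [(t0 + timedelta(seconds=3 * k), 1, 2) for k in range(1, 21)]
rates = per_minute_rates(detections)
```

All twenty results land in the single 08:10 bucket, giving an incoming speed of 20 persons/minute and an outgoing speed of 40 persons/minute for that minute.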
In the method, the people entering flow speed and the people exiting flow speed in the monitoring area corresponding to the monitoring point location can be determined based on the people entering flow quantity and the people exiting flow quantity of different acquisition time points in the preset time period, so that the people entering flow speed and the people exiting flow speed can be monitored, and the occurrence of safety accidents caused by the fact that the people entering flow speed is high or the people exiting flow speed is high is avoided.
Referring to fig. 3, an interface diagram showing detailed information of a people flow state alarm is shown. The interface includes alarm details and same-day cross-line event statistics, where the alarm details include the alarm point location, event type, alarm time, duration (how long the cross-line event lasted), inflow peak value, outflow peak value, and the like, and the same-day statistics count the cross-line event alarms from zero o'clock of the day up to the time of the statistics. The figure also includes a video picture screenshot, which displays the corresponding outgoing flow information (number of outgoing people and outgoing flow rate) and incoming flow information (number of incoming people and incoming flow rate) at the current moment. Multiple frames of alarm pictures are displayed below the video screenshot, and the number of alarm pictures is related to the duration of the alarm; for example, when the cross-line event lasts 17 minutes, one frame of alarm picture can be extracted every minute as an alarm record, that is, 17 frames of alarm pictures can be displayed below the video screenshot.
In a specific implementation, point location information such as the name and installation position of the monitoring device and the collected monitoring video, together with information such as the numbers of outgoing and incoming people per unit time, may be persistently stored in a search server (for example, Elasticsearch) for subsequent search and query.
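A record to be persisted in such a search server might be assembled as below. This is only a sketch: the field names and values are invented for illustration, and the comment about indexing via the Elasticsearch client is an assumption about one plausible storage path, not a detail from the text.

```python
import json
from datetime import datetime

def build_point_record(device_name, install_position, minute, n_in, n_out):
    """Build one per-minute record for a monitoring point location, in a shape
    suitable for persisting into a search server such as Elasticsearch
    (e.g. via the official client's index() call); all field names here
    are hypothetical."""
    return {
        "device_name": device_name,
        "install_position": install_position,
        "minute": minute.isoformat(timespec="minutes"),
        "incoming_per_minute": n_in,
        "outgoing_per_minute": n_out,
    }

record = build_point_record("gate-A-cam", "east entrance",
                            datetime(2020, 10, 30, 8, 10), 40, 20)
doc = json.dumps(record)  # serialized form that would be indexed
```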
In an optional embodiment, in a case that there are a plurality of monitoring point locations, determining people flow state data of at least one monitoring device based on people number monitoring data matched with a monitoring event within a preset time period includes:
step one, aiming at each monitoring point location, determining the total people entering flow quantity and the total people exiting flow quantity within the preset time period in the monitoring area corresponding to the monitoring point location based on the people entering flow quantity and the people exiting flow quantity of different acquisition time points within the preset time period.
And step two, determining the personnel net stock in the target monitoring area based on the historical number of people in the target monitoring area in the preset time period and the total number of people entering the target monitoring area and the total number of people leaving the target monitoring area in the preset time period corresponding to the plurality of monitoring point positions respectively.
Generating people stream state alarm information under the condition that the people stream state data of at least one monitoring device meets the alarm condition, wherein the generation comprises the following steps: and generating the people stream state warning information under the condition that the personnel net stock in the target monitoring area is determined to be larger than the set net stock threshold.
Here, considering that a plurality of monitoring devices may be installed in one site or place, people flow analysis may be performed on the monitoring videos respectively collected by the plurality of monitoring devices to obtain people flow state data for those videos. The monitoring videos respectively collected by the plurality of monitoring devices form a video group; that is, people flow analysis can be performed on the video group to obtain the people flow state data corresponding to the video group. In a specific implementation, the cross-line analysis function of each monitoring video in the video group can be started by triggering the start button of the cross-line event corresponding to the video group on the display interface. Meanwhile, specific information for graded early warning of the total people flow stock can be set on the display interface, for example, by filling in the number of people for the primary warning level (stock increasing), the secondary warning level (stock warning), and the tertiary warning level (stock overheating).
In the first step, for each monitoring point location, after the numbers of incoming and outgoing people at the different collection time points within the preset time period are acquired, the numbers of incoming people at the different collection time points may be added to obtain the total number of incoming people within the preset time period for that monitoring point location, and the numbers of outgoing people may be added to obtain the total number of outgoing people within the preset time period for that monitoring point location. In this way, the total incoming and outgoing numbers within the preset time period can be obtained for each monitoring point location.
For example, the total number of incoming people and the total number of outgoing people from 08:11:00 to 08:12:00 may be obtained for each monitoring device, the time period from 08:11:00 to 08:12:00 being the preset time period.
Further, the net stock of people in the target monitoring area is determined based on the historical number of people in the target monitoring area within the preset time period and the total numbers of incoming and outgoing people within the preset time period corresponding to each of the plurality of monitoring point locations. For example, for each monitoring video in the video group, the total number of outgoing people may be subtracted from the total number of incoming people within the preset time period to obtain the flow variation of that monitoring video within the preset time period; the flow variations of all monitoring videos in the video group are added to obtain the total flow variation corresponding to the video group (i.e., to the site or place corresponding to the video group); and the total flow variation is then added to the historical number of people in the target monitoring area within the preset time period to obtain the net stock of people in the target monitoring area (i.e., the current number of people at the current time point in the site or place corresponding to the video group).
For example, the preset time period may be from 08:11:00 to 08:12:00. The current number of people corresponding to 08:11:00 (that is, the historical number of people in the target monitoring area for the preset time period) may be obtained, together with the total numbers of incoming and outgoing people from 08:11:00 to 08:12:00 (the preset time period) for each monitoring video in the video group, and the net stock of people in the target monitoring area at 08:12:00 may then be determined based on the historical number of people and the per-video total incoming and outgoing numbers within the preset time period.
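The net stock computation just described can be sketched in a few lines. The function name and the example figures are assumptions for illustration; the arithmetic follows the text: historical count at the start of the preset time period plus the sum of (total incoming minus total outgoing) over every monitoring point.

```python
def net_stock(historical_count, point_totals):
    """Net stock of people in the target monitoring area: the historical
    count at the start of the preset time period plus the summed flow
    variation (total_in - total_out) of every monitoring point.
    point_totals: list of (total_in, total_out) pairs, one per point."""
    variation = sum(t_in - t_out for t_in, t_out in point_totals)
    return historical_count + variation

# hypothetical two-gate site: gate A saw 30 in / 12 out, gate B 25 in / 18 out,
# with 120 people already present at the start of the preset time period
current = net_stock(120, [(30, 12), (25, 18)])
# -> 120 + (18 + 7) = 145
```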
After the net stock of people in the target monitoring area is obtained, it can be monitored, and when the net stock is detected to be greater than the preset net stock threshold, people flow state alarm information is generated. For example, the generated alarm message may be "Attention: at the current time, the number of people in venue xx is high." In this case, the alarm event type of the people flow state alarm information is: video group cross-line alarm.
In a specific implementation, multiple levels of alarm risk may be set for the video group cross-line alarm, for example: stock increasing, stock warning, and stock overheating, with a different net stock threshold for each level; for example, the threshold corresponding to stock increasing may be 100, that corresponding to stock warning may be 200, and that corresponding to stock overheating may be 500. Different people flow state alarm information can also be set for the different alarm risks; for example, the alarm information corresponding to stock increasing may be in text format, that corresponding to stock warning in voice format, and that corresponding to stock overheating in video format.
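The graded net-stock warning can be sketched as a simple threshold ladder. The function name is hypothetical; the three levels and the 100/200/500 thresholds are the example values given in the text (stock increasing / stock warning / stock overheating).

```python
def stock_alert_level(net_stock,
                      thresholds=((500, "stock overheating"),
                                  (200, "stock warning"),
                                  (100, "stock increasing"))):
    """Map a net stock value to the highest alarm level whose threshold it
    exceeds; returns None when no alarm fires. Thresholds are checked from
    highest to lowest so the most severe applicable level wins."""
    for limit, level in thresholds:
        if net_stock > limit:
            return level
    return None

level = stock_alert_level(145)
# 145 exceeds only the first-level threshold -> "stock increasing"
```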
Further, after the generated people flow state alarm information is triggered, the detailed information of the people flow state alarm may be displayed, including but not limited to the alarm point location (i.e., the name of the monitoring device raising the alarm, etc.), the alarm time, and the alarm event type; when the alarm event type is a video group cross-line alarm, the detailed information may further include the net stock of people at the current time point.
In this way, after the total numbers of incoming and outgoing people within the preset time period are determined for each monitoring point location, the net stock of people in the target monitoring area can be determined based on the historical number of people in the target monitoring area within the preset time period and the per-point totals, and people flow state alarm information is generated when the net stock is greater than the set net stock threshold. Early warning of the net stock of people in the target monitoring area is thus realized: when the net stock is large, people can be diverted based on the generated alarm information, avoiding safety accidents caused by a large number of people in the target monitoring area.
Referring to fig. 4a, an interface diagram showing detailed information of the people flow state alarm is shown; fig. 4a shows the details in a map mode. Referring to another interface diagram shown in fig. 4b, the details are shown in a list mode, where the list in fig. 4b includes over-dense events and cross-line events. Specifically, after the information of the cross-line event shown in fig. 4a or fig. 4b is triggered, the alarm details shown in fig. 4c may be displayed, including the group name (i.e., the name corresponding to the video group), event type, alarm time, duration, peak value of the total people flow stock, and same-day statistics of the total people flow stock.
Next, the alarm process of the over-dense event may be explained in detail.
In an optional embodiment, in the case that the monitoring event is an over-dense event, acquiring the number-of-people monitoring data matched with the monitoring event within a preset time period includes: and counting the number of the target objects at different acquisition time points in a preset time period.
Here, when a monitoring identifier corresponding to an over-dense event exists in the monitoring video, the target objects (people) in the detection area corresponding to that monitoring identifier may be detected based on the monitoring identifier and the monitoring video, so as to obtain the number of target objects in the detection area at each collection time point. When no monitoring identifier corresponding to an over-dense event exists in the monitoring video, the entire monitoring picture of the monitoring video is regarded as the detection area; the monitoring video can then be detected to obtain the number of target objects in the detection area at each collection time point.
In a specific implementation, a trained deep learning algorithm for identifying the target objects can be used to detect the detection area in the monitoring video and output detection results in real time; a detection result may be the number of people in the detection area at a given collection time point in the monitoring video. The deep learning algorithm may output the detection results periodically, for example once every second or once every two seconds. For example, a detection result may be: the number of people in the detection area at 08:10:00 (collection time point) is 50; the number of people at 08:10:01 is 54; and so on.
Furthermore, the numbers of target objects at the different collection time points within the preset time period may be counted. For example, with a preset time period from 08:10:00 to 08:11:00 and each second within it taken as one collection time point, the number of target objects at 08:10:00 (collection time point 1), at 08:10:01 (collection time point 2), ..., and at 08:10:59 (collection time point 60) may be counted.
In this way, when the monitoring event is an over-dense event, the numbers of target objects at different collection time points within the preset time period can be counted, providing data support for subsequently determining the people flow state data corresponding to the over-dense event.
In an optional embodiment, in a case that there is one monitoring point location, determining the people flow state data of the at least one monitoring device based on the people-number monitoring data matched with the monitoring event within the preset time period includes: determining the average number of people within the preset time period in the monitoring area corresponding to the monitoring point location based on the numbers of target objects at different collection time points within the preset time period.
Generating people stream state alarm information under the condition that the people stream state data of at least one monitoring device is determined to meet the alarm condition, wherein the people stream state alarm information comprises the following steps: and generating people stream state alarm information under the condition that the average number of people in a monitoring area corresponding to the monitoring point location in a preset time period is greater than a set first number of people threshold value.
Here, the numbers of target objects at the different collection time points within the preset time period may be averaged to obtain the average number of people in the monitoring area corresponding to the monitoring point location within the preset time period. The average number of people within the preset time period is monitored, and people flow state alarm information is generated when it is greater than the set first people-number threshold. The length of the preset time period may be set as required, for example 5 seconds, 10 seconds, 60 seconds, 5 minutes, and the like. The lengths of the preset time periods corresponding to the cross-line event and the over-dense event may be the same or different.
For example, the numbers of target objects at the different collection time points within the preset time period include: 50 at 08:10:01, 53 at 08:10:02, 52 at 08:10:03, 51 at 08:10:04, and 54 at 08:10:05. The five detection results may then be averaged to obtain 52, so the average number of people in the monitoring area corresponding to the monitoring point location from 08:10:01 to 08:10:05 is determined to be 52.
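The averaging and first-threshold check can be sketched as follows, using the five counts from the example above; the function name and the threshold value of 40 are assumptions made for illustration.

```python
def average_count_alarm(counts, first_threshold):
    """Average the per-collection-time-point head counts over the preset
    time period and report whether the point over-dense alarm fires
    (average strictly greater than the first people-number threshold)."""
    avg = sum(counts) / len(counts)
    return avg, avg > first_threshold

avg, fires = average_count_alarm([50, 53, 52, 51, 54], first_threshold=40)
# -> (52.0, True): the average of the five detections exceeds the threshold
```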
Furthermore, the average number of people in the monitoring area corresponding to the monitoring point location within the preset time period can be monitored, and people flow state alarm information is generated when the average number of people is greater than the set first people-number threshold. For example, the generated alarm message may be "Attention: at the current time, the number of people in area xx is high." In this case, the alarm event type of the people flow state alarm information is: point over-dense alarm.
Further, after triggering the generated people flow state alarm information, the detailed information of the people flow state alarm may be displayed, where the detailed information includes, but is not limited to, an alarm point location (i.e., a name of a monitoring device that alarms, etc.), an alarm time, and an alarm event type, and when the alarm event type is a point-over-dense alarm, the detailed information may further include: the number of real-time people in the detection area at the current time point.
Referring to fig. 5, an interface diagram showing detailed information of a people flow state alarm is shown. The interface includes alarm details and same-day over-dense event statistics, where the alarm details include the alarm point location, event type, alarm time, over-dense duration, peak number of people, peak density value, and the like, and the same-day statistics count the over-dense event alarms from zero o'clock of the day up to the time of the statistics. The figure also includes a video screenshot, below which multiple frames of alarm pictures are displayed; the number of alarm pictures is related to the duration of the over-dense event. For example, when the over-dense event lasts 17 minutes, one frame of alarm picture can be extracted every minute as an alarm record, that is, 17 frames of alarm pictures can be displayed below the video screenshot.
In a specific implementation, point location information such as the name and installation position of the monitoring device and the collected monitoring video, together with information such as the real-time number of people and the per-minute maximum and minimum of the real-time number of people for the monitoring device, may be persistently stored in a search server (e.g., Elasticsearch) for subsequent search and query.
In this way, when there is one monitoring point location, the average number of people in the monitoring area corresponding to that monitoring point location within the preset time period is determined based on the numbers of target objects at the different collection time points within the preset time period, and people flow state alarm information is generated when that average is greater than the set first people-number threshold. Monitoring of the average number of people in the monitoring area of the monitoring video is thus realized, so that the crowd in the monitoring area can be dispersed based on the generated alarm information, avoiding safety accidents when people in the monitoring area are dense.
In an optional embodiment, in a case that there are a plurality of monitoring points, determining people flow state data of the at least one monitoring device based on the number of people monitoring data matched with the monitoring event within a preset time period includes:
step one, aiming at each monitoring point location, determining the average number of people in a monitoring area corresponding to the monitoring point location in a preset time period based on the number of the target objects at different acquisition time points in the preset time period.
And secondly, determining the total real-time number of people in the target monitoring area based on the average number of people corresponding to the plurality of monitoring point positions respectively.
Generating people stream state alarm information under the condition that the people stream state data of at least one monitoring device is determined to meet the alarm condition, wherein the people stream state alarm information comprises the following steps: and generating people flow state alarm information under the condition that the total real-time people number in the target monitoring area is determined to be larger than the set second people number threshold value.
Here, for each monitoring point location, the average number of people in the preset time period in the monitoring area corresponding to the monitoring point location may be determined based on the number of target objects at different acquisition time points in the preset time period; and then, the average number of people corresponding to the plurality of monitoring point positions can be added to determine the total real-time number of people in the target monitoring area.
After the total real-time number of people in the target monitoring area is determined, the total real-time number of people can be monitored, and when the total real-time number of people in the target monitoring area is determined to be larger than a set second number threshold value, people flow state warning information is generated.
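The two steps above, summing the per-point averages and checking them against the second people-number threshold, can be sketched together; the function name, example averages, and the threshold of 150 are assumptions for illustration.

```python
def video_group_over_dense_alarm(point_averages, second_threshold):
    """Add up the per-point average head counts to get the total real-time
    number of people in the target monitoring area, and report whether the
    video group over-dense alarm fires (total strictly greater than the
    second people-number threshold)."""
    total = sum(point_averages)
    return total, total > second_threshold

# three monitoring point locations with averages 52, 48 and 61 people
total, fires = video_group_over_dense_alarm([52, 48, 61], second_threshold=150)
# -> (161, True)
```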
In a specific implementation, the over-dense analysis function of each monitoring video in the video group can be started by triggering the start button of the over-dense event corresponding to the video group on the display interface. Meanwhile, specific information for graded early warning of the real-time head count can be set on the display interface, for example, by filling in the number of people for the first-level warning (general risk), the second-level warning (greater risk), and the third-level warning (major risk).
Here, when there are a plurality of monitoring devices, the monitoring videos respectively collected by them constitute a video group. For the monitoring video collected by each monitoring device (i.e., for each monitoring video in the video group), a trained deep learning algorithm for identifying target objects can be used to detect the detection area indicated by the monitoring identifier in the monitoring video and output detection results in real time, where a detection result is the number of target objects in the detection area at a collection time point in the monitoring video. Further, the average number of people in the monitoring area corresponding to the monitoring point location within the preset time period can be determined based on the periodically obtained detection results.
After the average number of people corresponding to each monitoring video in the video group is obtained, these averages can be added to determine the total real-time number of people in the target monitoring area. The total real-time number of people can then be monitored, and people flow state alarm information is generated when it is greater than the set second people-number threshold. For example, the generated alarm message may be "Attention: at the current time, the number of people in scene xx is high." In this case, the alarm event type of the people flow state alarm information is: video group over-dense alarm.
In a specific implementation, multiple levels of alarm risk may be set for the video group over-dense alarm, for example: general risk, greater risk, and major risk, with a different second people-number threshold for each level; for example, the threshold corresponding to the first-level (general) risk may be 100, that corresponding to the second-level (greater) risk may be 200, and that corresponding to the major risk may be 500. Different people flow state alarm information can also be set for the different alarm risks; for example, the alarm information corresponding to the first-level risk may be in text format, that corresponding to the second-level risk in voice format, and that corresponding to the third-level risk in video format.
Further, after triggering the generated people stream state alarm information, the detailed information of the people stream state alarm may be displayed, where the detailed information includes, but is not limited to, an alarm point location (i.e., a name of a monitoring device that alarms, etc.), an alarm time, and an alarm event type, and when the alarm event type is a video group over-dense alarm, the detailed information may further include: total real-time population of real-world scenes.
Referring to fig. 6a, which is a schematic diagram of an interface for displaying detailed information of a people stream state alarm, fig. 6a shows the detailed information of the people stream state alarm in a map mode; and referring to another interface schematic diagram showing detailed information of the people flow state alarm shown in fig. 4b, the detailed information of the people flow state alarm is shown in fig. 4b in a list mode, wherein the list shown in fig. 4b includes an over-dense event and an over-line event. Specifically, after triggering the information of the over-dense event shown in fig. 6a, or after triggering the information of the over-dense event shown in fig. 4b, the alarm details shown in fig. 6b may be shown, where the alarm details shown in fig. 6b include a group name (i.e., a name corresponding to a video group), an event type, an alarm time, a duration, a people number peak, a density peak, a current day real-time people number count, and a video source count.
After the people flow state data in the real scene is determined based on the monitoring video collected by the at least one monitoring device and the pre-drawn monitoring identifier matched with the target position in the video picture, a schematic diagram of the change of the people flow state data over time can be generated to visually display the day's people flow state data. Specifically, the diagram includes a first change diagram of the real-time total number of people over time, which shows how the peak and valley head counts change over time; and/or a second change diagram of the total people flow stock over time, which shows how the total number of outgoing people, the total number of incoming people, and the total people flow stock each change over time. The time interval used by the first and second change diagrams may be 5 minutes, 10 minutes, 30 minutes, 1 hour, and the like.
In this way, after the average number of people in the monitoring area corresponding to each monitoring point location within the preset time period is determined, the total real-time number of people in the target monitoring area can be determined from the per-point averages, and people flow state alarm information is generated when that total is greater than the set second people-number threshold. Early warning of the total real-time number of people in the target monitoring area is thus realized: when the total is large, people in the target monitoring area can be diverted based on the generated alarm information, avoiding safety accidents caused by a large number of people in the target monitoring area.
In an alternative embodiment, the method further comprises: averaging the people flow state data at the same collection time point over a plurality of recent historical dates to obtain predicted people flow state data corresponding to each collection time point, where the predicted data for all collection time points together form the prediction of the people flow state data for a future date; the prediction data is used for generating a people flow diversion plan.
Here, the number of historical dates may be set as needed. For example, the people flow state data of the last 7 days may be used (one historical date corresponding to one day), say from 00:00 on 10/1 to 00:00 on 10/8 (7 historical dates): the people flow state data at the same collection time point across the 7 historical dates is averaged to obtain the predicted people flow state data corresponding to each collection time point, and the predicted data for all collection time points form the prediction of the people flow state data for the future date.
For example, prediction data of the total number of incoming people, the total number of outgoing people, and the stock of people on a future date (the next day) may be generated.
Furthermore, a people flow diversion plan may be generated based on the prediction data for the future date; for example, if the total number of people is predicted to be largest at 15 o'clock, the number of people entering the target monitoring area may be controlled at 15 o'clock.
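The prediction step, averaging each collection time point over the recent historical dates, can be sketched as follows. The function name, the choice of net stock as the quantity being predicted, and the sample figures are assumptions for illustration; the text applies the same averaging to any people flow state data.

```python
from statistics import mean

def predict_day(history):
    """Predict the next day's people flow state data by averaging the value
    at each collection time point over the recent historical dates.
    history: list of per-day dicts mapping collection time point -> value."""
    time_points = history[0].keys()
    return {tp: mean(day[tp] for day in history) for tp in time_points}

# hypothetical net stock observed at two collection time points over 3 days
history = [
    {"08:10": 120, "15:00": 300},
    {"08:10": 130, "15:00": 320},
    {"08:10": 110, "15:00": 310},
]
forecast = predict_day(history)
# -> {"08:10": 120, "15:00": 310}: the 15:00 peak would drive the diversion plan
```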
In an actual application scenario, the method can be applied to scenes such as a shopping mall or a hall. Taking a shopping mall as an example, the line-crossing event of a single monitoring video and the line-crossing event of a video group are described respectively. Assume the mall has two doors, and a monitoring device is disposed at each door position (monitoring point location): a first monitoring device disposed at the first monitoring point location collects the monitoring video of door A, and a second monitoring device disposed at the second monitoring point location collects the monitoring video of door B, so that the two monitoring devices can monitor pedestrians entering and exiting through the doors.
In specific implementation, the first monitoring video collected by the first monitoring device and the second monitoring video collected by the second monitoring device can be obtained. For the first monitoring video, an entrance/exit boundary line and entrance/exit directions are drawn on a video picture screenshot of the first monitoring video, and the total number of people entering and the total number of people exiting within a preset time period in the monitoring area corresponding to the first monitoring point location are determined; people flow state alarm information is generated when the total number of people entering within the preset time period is larger than the set first people flow threshold and/or the total number of people exiting within the preset time period is larger than the set second people flow threshold. The second monitoring video is processed in the same way: an entrance/exit boundary line and entrance/exit directions are set on a video picture screenshot of the second monitoring video, the total numbers of people entering and exiting within the preset time period in the monitoring area corresponding to the second monitoring point location are determined, and people flow state alarm information is generated under the same threshold conditions.
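The per-video line-crossing check above can be sketched as follows; the function and variable names are illustrative, not taken from the disclosure:

```python
def check_line_crossing(entries, exits, first_threshold, second_threshold):
    # entries/exits: people counts per collection time point within the
    # preset time period, counted against the drawn entrance/exit
    # boundary line and directions
    total_in, total_out = sum(entries), sum(exits)
    # alarm if either total exceeds its set people flow threshold
    alarm = total_in > first_threshold or total_out > second_threshold
    return total_in, total_out, alarm

# Door A: counts at three collection time points (hypothetical values)
result = check_line_crossing([10, 20, 30], [5, 5, 5], 50, 20)
```

The same function would be applied independently to each monitoring video (door A and door B) in this example.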
Meanwhile, the first monitoring video and the second monitoring video form a video group, and the video group can be analyzed to determine the people flow state data in the target monitoring area corresponding to the first monitoring device and the second monitoring device. In specific implementation, for the first monitoring video, the total number of people entering and the total number of people exiting within a preset time period in the monitoring area corresponding to the first monitoring point location are determined; for the second monitoring video, the total number of people entering and the total number of people exiting within the preset time period in the monitoring area corresponding to the second monitoring point location are determined. Further, the personnel inventory in the target monitoring area is determined based on the historical number of people in the target monitoring area within the preset time period and the total numbers of people entering and exiting within the preset time period corresponding to the plurality of monitoring point locations respectively; that is, the personnel inventory within the mall is determined. People flow state warning information is generated when the personnel inventory is larger than the set inventory threshold, so that after the warning information is received, pedestrians in the mall can be regulated and a congestion event can be avoided.
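The inventory computation for the video group can be sketched as follows; the names and example numbers are illustrative assumptions:

```python
def personnel_inventory(historical_count, per_point_totals):
    # per_point_totals: one (total_in, total_out) pair per monitoring
    # point location within the preset time period
    total_in = sum(i for i, _ in per_point_totals)
    total_out = sum(o for _, o in per_point_totals)
    # inventory = historical count + everyone who entered - everyone who left
    return historical_count + total_in - total_out

# Two doors: (total_in, total_out) per monitoring point location
inventory = personnel_inventory(100, [(30, 10), (20, 25)])
alarm = inventory > 110  # set inventory threshold (example value)
```

Summing the per-door totals before comparing against a single threshold is what distinguishes the video-group check from the per-video check.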
The following description takes a hall as an example to respectively illustrate the over-dense event of a single monitoring video and the over-dense event of a video group. It is assumed that monitoring devices are respectively disposed at the four corners (four monitoring point locations) of the hall, that is, four monitoring devices monitor four detection areas of the hall, and the monitoring videos collected by the four monitoring devices form a video group.
In specific implementation, for each monitoring video in the video group, the average number of people within a preset time period in the monitoring area corresponding to the monitoring point location is determined based on the numbers of target objects at different collection time points within the preset time period, and people flow state alarm information corresponding to the monitoring video is generated when the average number of people corresponding to the monitoring video is larger than the set first people number threshold. That is, over-dense event monitoring is realized for each monitoring video in the video group.
Meanwhile, a monitoring identifier may be drawn on the video picture screenshot, in which case the area corresponding to the monitoring identifier is the detection area; alternatively, no monitoring identifier is drawn on the video picture screenshot, i.e. the monitoring video has no corresponding reference surface identifier, and in this case the whole video picture is taken as the detection area by default.
Meanwhile, the video group can also be monitored for over-dense events, and the total real-time number of people in the target monitoring area corresponding to the video group can be determined. In specific implementation, for each monitoring video in the video group, the average number of people within a preset time period in the monitoring area corresponding to the monitoring point location is determined; the total real-time number of people in the target monitoring area is then determined based on the average numbers of people corresponding to the four monitoring point locations respectively, i.e. the total real-time number of people in the plurality of detection areas of the hall is determined. People flow state alarm information is generated when the total real-time number of people in the target monitoring area is larger than the set second people number threshold, so that after the alarm information is received, the dense areas in the hall can be dispersed and accidents caused by dense crowds can be avoided.
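The over-dense check for the video group can be sketched as follows; the function name and sample counts are illustrative assumptions:

```python
def total_realtime_count(samples_per_point):
    # samples_per_point: one list per monitoring point location, holding
    # the target-object counts at each collection time point within the
    # preset time period; average each point, then sum the averages
    return sum(sum(s) / len(s) for s in samples_per_point)

# Four corners of the hall, two collection time points each (hypothetical)
total = total_realtime_count([[10, 20], [30, 30], [5, 15], [40, 40]])
alarm = total > 90  # set second people number threshold (example value)
```

Averaging within each detection area before summing smooths out momentary spikes at a single collection time point.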
It will be understood by those skilled in the art that, in the method of the present disclosure, the order in which the steps are written does not imply a strict order of execution or any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Based on the same concept, an embodiment of the present disclosure further provides a scene monitoring device, as shown in fig. 7, which is an architecture schematic diagram of the scene monitoring device provided in the embodiment of the present disclosure, and includes a first obtaining module 701, a detecting module 702, a second obtaining module 703, and a determining module 704, specifically:
a first obtaining module 701, configured to obtain a monitoring video collected by a monitoring device disposed at least one monitoring point location;
a detection module 702, configured to determine whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
the second obtaining module 703 is configured to obtain, when a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location, number-of-people monitoring data matched with the monitoring event within a preset time period;
a determining module 704, configured to determine people stream status data of the at least one monitoring device based on the number of people monitoring data matching the monitoring event within a preset time period.
In a possible implementation, the apparatus further includes an alarm module 705 configured to:
and generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition.
In a possible implementation manner, in a case that the monitoring event is a line crossing event, the detecting module 702, when determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point based on the monitoring video, is configured to:
determining whether a target object crossing a target position matched with a pre-drawn access boundary line exists in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
and if so, determining that a cross-line event occurs in a monitoring area corresponding to the at least one monitoring point.
In a possible implementation manner, in a case that the monitoring event is a line-crossing event, the second obtaining module 703, when obtaining the number-of-people monitoring data matched with the monitoring event within a preset time period, is configured to:
acquiring the number of people entering and the number of people exiting at different acquisition time points within a preset time period, wherein the number of people entering at different acquisition time points refers to the number of people crossing a pre-drawn entrance/exit boundary line along a pre-drawn entrance direction at the different acquisition time points, and the number of people exiting at different acquisition time points refers to the number of people crossing the pre-drawn entrance/exit boundary line along a pre-drawn exit direction at the different acquisition time points.
In a possible implementation manner, in a case that the monitoring point is one, the determining module 704, when determining people flow status data of the at least one monitoring device based on the number of people monitoring data matching the monitoring event within a preset time period, is configured to:
determining the total number of people entering and the total number of people exiting in the preset time period in the monitoring area corresponding to the monitoring point location based on the number of people entering and the number of people exiting at different acquisition time points in the preset time period;
the alarm module 705, when generating the people stream state alarm information under the condition that it is determined that the people stream state data of the at least one monitoring device satisfies the alarm condition, is configured to:
and generating people stream state warning information under the condition that the total incoming people stream number in the preset time period is determined to be larger than a set first people stream threshold value and/or the total outgoing people stream number in the preset time period is determined to be larger than a set second people stream threshold value.
In a possible implementation, in a case that the monitoring point is one, the determining module 704, when determining the people flow status data of the at least one monitoring device based on the number-of-people monitoring data matching the monitoring event within a preset time period, is configured to:
and determining the incoming flow speed and the outgoing flow speed in the monitoring area corresponding to the monitoring point position based on the incoming flow quantity and the outgoing flow quantity of different acquisition time points in a preset time period.
In a possible implementation manner, in a case that there are a plurality of monitoring points, the determining module 704, when determining the people flow status data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period, is configured to:
for each monitoring point location, determining the total number of people entering and the total number of people exiting within a preset time period in a monitoring area corresponding to the monitoring point location based on the number of people entering and the number of people exiting at different acquisition time points within the preset time period;
determining the personnel inventory in the target monitoring area based on the historical number of people in the target monitoring area in the preset time period and the total number of people entering the preset time period and the total number of people leaving the preset time period which are respectively corresponding to the plurality of monitoring point locations;
the alarm module 705, when generating the people stream state alarm information under the condition that it is determined that the people stream state data of the at least one monitoring device satisfies the alarm condition, is configured to:
and generating people stream state alarm information under the condition that the personnel net stock in the target monitoring area is determined to be larger than the set net stock threshold value.
In a possible implementation manner, in a case that the monitoring event is an over-dense event, the detecting module 702, when determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video, is configured to:
determining whether the number of target objects in a monitoring area corresponding to the at least one monitoring point location exceeds an over-dense threshold value or not based on the monitoring video;
and if so, determining that an over-dense event occurs in a monitoring area corresponding to the at least one monitoring point location.
In a possible implementation manner, in a case that the monitoring event is an over-dense event, the second obtaining module 703, when obtaining the number-of-people monitoring data matched with the monitoring event within a preset time period, is configured to:
and counting the number of the target objects at different acquisition time points in a preset time period.
In a possible implementation, in a case that the monitoring point is one, the determining module 704, when determining the people flow status data of the at least one monitoring device based on the number-of-people monitoring data matching the monitoring event within a preset time period, is configured to:
determining the average number of people in a monitoring area corresponding to the monitoring point location in a preset time period based on the number of the target objects at different acquisition time points in the preset time period;
the alarm module 705, when generating the people stream state alarm information under the condition that it is determined that the people stream state data of the at least one monitoring device satisfies the alarm condition, is configured to:
and generating people stream state alarm information under the condition that the average number of people in the monitoring area corresponding to the monitoring point location in the preset time period is larger than a set first number threshold value.
In a possible implementation manner, in a case that there are a plurality of monitoring points, the determining module 704, when determining the people flow status data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period, is configured to:
for each monitoring point location, determining the average number of people in the monitoring area corresponding to the monitoring point location in a preset time period based on the number of the target objects at different acquisition time points in the preset time period;
determining the total real-time number of people in the target monitoring area based on the average number of people corresponding to the monitoring point positions respectively;
the alarm module 705, when generating the people stream state alarm information under the condition that it is determined that the people stream state data of the at least one monitoring device satisfies the alarm condition, is configured to:
and generating people stream state alarm information under the condition that the total real-time people number in the target monitoring area is determined to be larger than a set second people number threshold value.
In a possible embodiment, the device further comprises: an early warning module 706 configured to:
averaging the people stream state data of the same acquisition time point in a plurality of recent historical dates to obtain the predicted people stream state data corresponding to each acquisition time point;
predicting the people stream state data corresponding to each acquisition time point to form the prediction data of the people stream state data in the future date; wherein the prediction data is used to generate a people stream dispersion plan.
In some embodiments, the functions of the apparatus provided in the embodiments of the present disclosure, or the modules included therein, may be used to execute the methods described in the above method embodiments; for specific implementation, reference may be made to the description of the above method embodiments, which is not repeated here for brevity.
Based on the same technical concept, the embodiment of the present disclosure also provides an electronic device. Referring to fig. 8, a schematic structural diagram of an electronic device provided in the embodiment of the present disclosure includes a processor 801, a memory 802, and a bus 803. The memory 802 is used for storing execution instructions and includes an internal memory 8021 and an external memory 8022; the internal memory 8021 temporarily stores operation data in the processor 801 and data exchanged with the external memory 8022 such as a hard disk, and the processor 801 exchanges data with the external memory 8022 through the internal memory 8021. When the electronic device 800 operates, the processor 801 communicates with the memory 802 through the bus 803, causing the processor 801 to execute the following instructions:
acquiring a monitoring video acquired by monitoring equipment arranged at least one monitoring point location;
determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
acquiring the number of people monitoring data matched with the monitoring event in a preset time period under the condition that the monitoring event occurs in the monitoring area corresponding to the at least one monitoring point location;
and determining the people flow state data of the at least one monitoring device based on the people number monitoring data matched with the monitoring event in a preset time period.
In addition, the present disclosure also provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the scene monitoring method described in the above method embodiments are performed.
The computer program product of the scene monitoring method provided in the embodiments of the present disclosure includes a computer-readable storage medium storing a program code, where instructions included in the program code may be used to execute the steps of the scene monitoring method described in the above method embodiments, which may be referred to specifically for the above method embodiments, and are not described herein again.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in software functional units and sold or used as a stand-alone product, may be stored in a non-transitory computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above are only specific embodiments of the present disclosure, but the scope of the present disclosure is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present disclosure, and shall be covered by the scope of the present disclosure. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (15)

1. A method for monitoring a scene, comprising:
acquiring a monitoring video acquired by monitoring equipment arranged at least one monitoring point position;
determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
acquiring the number of people monitoring data matched with the monitoring event in a preset time period under the condition that the monitoring event occurs in the monitoring area corresponding to the at least one monitoring point location;
determining people flow state data of the at least one monitoring device based on the number of people monitoring data matched with the monitoring event in a preset time period; wherein the monitoring event comprises a line crossing event and/or an over-dense event; the people flow state data is used for representing the state of the monitoring event.
2. The method of claim 1, after determining the people flow status data of the at least one monitoring device, further comprising:
and generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition.
3. The method according to claim 1 or 2, wherein in a case that the monitoring event is a line crossing event, determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point based on the monitoring video includes:
determining whether a target object crossing a target position matched with a pre-drawn access boundary line exists in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video;
and if so, determining that a cross-line event occurs in a monitoring area corresponding to the at least one monitoring point.
4. The method according to claim 2, wherein in the case that the monitoring event is a line crossing event, acquiring the number-of-people monitoring data matched with the monitoring event in a preset time period comprises:
acquiring the number of people entering and the number of people exiting at different acquisition time points within a preset time period, wherein the number of people entering at different acquisition time points refers to the number of people crossing a pre-drawn entrance and exit boundary line along a pre-drawn entrance direction at the different acquisition time points; the number of people exiting at different acquisition time points refers to the number of people crossing the pre-drawn entrance and exit boundary line along a pre-drawn exit direction at the different acquisition time points.
5. The method of claim 4, wherein in the case that the monitoring point is one, determining the people flow state data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period comprises:
determining the total number of people entering and the total number of people exiting in the preset time period in the monitoring area corresponding to the monitoring point location based on the number of people entering and the number of people exiting at different acquisition time points in the preset time period;
generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition, wherein the people stream state alarm information comprises the following steps:
and generating people stream state warning information under the condition that the total incoming people stream number in the preset time period is determined to be larger than a set first people stream threshold value and/or the total outgoing people stream number in the preset time period is determined to be larger than a set second people stream threshold value.
6. The method of claim 4, wherein in the case that the monitoring point is one, determining the people flow state data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period comprises:
and determining the people entering flow speed and the people exiting flow speed in the monitoring area corresponding to the monitoring point location based on the people entering flow quantity and the people exiting flow quantity of different acquisition time points in a preset time period.
7. The method of claim 4, wherein in the case that the monitoring point location is multiple, determining the people flow state data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period comprises:
for each monitoring point location, determining the total number of people entering and the total number of people exiting within a preset time period in a monitoring area corresponding to the monitoring point location based on the number of people entering and the number of people exiting at different acquisition time points within the preset time period;
determining the personnel inventory in the target monitoring area based on the historical number of people in the target monitoring area in the preset time period and the total number of people entering the preset time period and the total number of people leaving the preset time period which are respectively corresponding to the plurality of monitoring point locations;
generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition, wherein the generation of the people stream state alarm information comprises the following steps:
and generating people stream state alarm information under the condition that the personnel net stock in the target monitoring area is determined to be larger than the set net stock threshold value.
8. The method according to claim 1 or 2, wherein in a case that the monitoring event is an excessive event, determining whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location based on the monitoring video includes:
determining whether the number of target objects in a monitoring area corresponding to the at least one monitoring point location exceeds an over-dense threshold value or not based on the monitoring video;
and if so, determining that an over-dense event occurs in a monitoring area corresponding to the at least one monitoring point location.
9. The method according to claim 2, wherein in the case that the monitoring event is an excessive event, acquiring the number-of-people monitoring data matched with the monitoring event within a preset time period comprises:
and counting the number of the target objects at different acquisition time points in a preset time period.
10. The method of claim 9, wherein determining the people flow state data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period in the case that the monitoring point is one comprises:
determining the average number of people in a monitoring area corresponding to the monitoring point location in a preset time period based on the number of the target objects at different acquisition time points in the preset time period;
generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition, wherein the generation of the people stream state alarm information comprises the following steps:
and generating people stream state alarm information under the condition that the average number of people in the monitoring area corresponding to the monitoring point location in the preset time period is larger than a set first number threshold value.
11. The method of claim 9, wherein in a case that the monitoring point location is multiple, determining the people flow state data of the at least one monitoring device based on the people monitoring data matched with the monitoring event within a preset time period comprises:
for each monitoring point location, determining the average number of people in the monitoring area corresponding to the monitoring point location in a preset time period based on the number of the target objects at different acquisition time points in the preset time period;
determining the total real-time number of people in the target monitoring area based on the average number of people corresponding to the monitoring point positions respectively;
generating people stream state alarm information under the condition that the people stream state data of the at least one monitoring device is determined to meet the alarm condition, wherein the people stream state alarm information comprises the following steps:
and generating people stream state alarm information under the condition that the total real-time people number in the target monitoring area is determined to be larger than a set second people number threshold value.
12. The method of any one of claims 1 to 11, further comprising:
averaging the people flow state data at the same acquisition time point over a plurality of recent historical dates to obtain predicted people flow state data corresponding to each acquisition time point;
taking the predicted people flow state data corresponding to the acquisition time points as prediction data of the people flow state data for a future date; wherein the prediction data is used to generate a people flow diversion plan.
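The historical-averaging prediction in this claim can be sketched in a few lines; the per-date dict representation and time-point keys are assumptions standing in for the claimed people flow state data:

```python
def predict_people_flow(history_by_date):
    """history_by_date: one dict per recent historical date, each
    mapping an acquisition time point (e.g. "09:00") to the people
    flow value recorded then. Returns the per-time-point average,
    usable as prediction data for a future date (e.g. to prepare a
    crowd diversion plan)."""
    prediction = {}
    time_points = set().union(*(d.keys() for d in history_by_date))
    for tp in sorted(time_points):
        values = [d[tp] for d in history_by_date if tp in d]
        prediction[tp] = sum(values) / len(values)
    return prediction
```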
13. A scene monitoring device, comprising:
the first acquisition module is configured to acquire a surveillance video captured by monitoring equipment arranged at at least one monitoring point location;
the detection module is configured to determine, based on the surveillance video, whether a monitoring event occurs in a monitoring area corresponding to the at least one monitoring point location;
the second acquisition module is configured to acquire, in a case where a monitoring event occurs in the monitoring area corresponding to the at least one monitoring point location, people-count monitoring data matched with the monitoring event within a preset time period;
the determining module is configured to determine people flow state data of the at least one monitoring device based on the people-count monitoring data matched with the monitoring event within the preset time period; wherein the monitoring event comprises a line crossing event and/or an over-dense event, and the people flow state data is used to represent the state of the monitoring event.
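As a hedged sketch only, the four claimed modules could be wired together as below; the callables `detect_event` and `count_people` are hypothetical stand-ins for the claimed detection and people-counting logic, not the actual device:

```python
class SceneMonitoringDevice:
    """Illustrative wiring of the claimed modules: first acquisition,
    detection, second acquisition, and determining."""

    def __init__(self, detect_event, count_people, alarm_threshold):
        self.detect_event = detect_event    # detection module logic
        self.count_people = count_people    # second acquisition module logic
        self.alarm_threshold = alarm_threshold

    def process(self, video_frames):
        # First acquisition module: receive the surveillance video.
        if not self.detect_event(video_frames):
            return None  # no monitoring event in the monitoring area
        # Second acquisition module: people-count data for the event.
        counts = self.count_people(video_frames)
        # Determining module: derive people flow state data.
        average = sum(counts) / len(counts) if counts else 0
        return {"average_people": average,
                "alarm": average > self.alarm_threshold}
```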
14. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, and the machine-readable instructions, when executed by the processor, performing the steps of the scene monitoring method according to any one of claims 1 to 12.
15. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the scene monitoring method according to any one of claims 1 to 12.
CN202011190695.6A 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium Active CN112333431B (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN202011190695.6A CN112333431B (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium
CN202210655667.XA CN114900669A (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium
PCT/CN2021/094699 WO2022088653A1 (en) 2020-10-30 2021-05-19 Scene monitoring method and apparatus, electronic device, storage medium, and program
JP2021576933A JP7305808B2 (en) 2020-10-30 2021-05-19 On-site monitoring method and device, electronic device, storage medium and program
KR1020217042832A KR20220058859A (en) 2020-10-30 2021-05-19 Scenario monitoring methods, devices, electronic devices, storage media and programs

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011190695.6A CN112333431B (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202210655667.XA Division CN114900669A (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112333431A CN112333431A (en) 2021-02-05
CN112333431B true CN112333431B (en) 2022-06-07

Family

ID=74297411

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202210655667.XA Pending CN114900669A (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium
CN202011190695.6A Active CN112333431B (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202210655667.XA Pending CN114900669A (en) 2020-10-30 2020-10-30 Scene monitoring method and device, electronic equipment and storage medium

Country Status (4)

Country Link
JP (1) JP7305808B2 (en)
KR (1) KR20220058859A (en)
CN (2) CN114900669A (en)
WO (1) WO2022088653A1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114900669A (en) * 2020-10-30 2022-08-12 深圳市商汤科技有限公司 Scene monitoring method and device, electronic equipment and storage medium
CN113507588A (en) * 2021-06-03 2021-10-15 山西三友和智慧信息技术股份有限公司 Wisdom campus visitor flow monitoring system based on artificial intelligence
CN113536932A (en) * 2021-06-16 2021-10-22 中科曙光国际信息产业有限公司 Crowd gathering prediction method and device, computer equipment and storage medium
CN113762169A (en) * 2021-09-09 2021-12-07 北京市商汤科技开发有限公司 People flow statistical method and device, electronic equipment and storage medium
TWI796033B (en) * 2021-12-07 2023-03-11 巨鷗科技股份有限公司 People flow analysis and identification system
CN114724360A (en) * 2022-03-14 2022-07-08 江上(上海)软件科技有限公司 Application early warning system and early warning method based on smart city
CN114694285B (en) * 2022-03-29 2023-09-01 重庆紫光华山智安科技有限公司 People flow alarming method and device, electronic equipment and storage medium
CN115471978A (en) * 2022-08-18 2022-12-13 北京声智科技有限公司 Swimming place monitoring method and device
CN116188357A (en) * 2022-09-27 2023-05-30 珠海视熙科技有限公司 Entrance and exit human body detection method, imaging equipment, device and storage medium
CN115474005A (en) * 2022-10-28 2022-12-13 通号通信信息集团有限公司 Data processing method, data processing device, electronic apparatus, and storage medium
CN116012776B (en) * 2022-12-09 2024-02-23 北京数原数字化城市研究中心 Method and device for monitoring number of people, electronic equipment and readable storage medium
CN117238092B (en) * 2023-11-16 2024-01-30 建龙西林钢铁有限公司 Industrial factory risk early warning method based on oblique photography and man-vehicle positioning

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106778688A (en) * 2017-01-13 2017-05-31 辽宁工程技术大学 The detection method of crowd's throat floater event in a kind of crowd scene monitor video
CN107133607A (en) * 2017-05-27 2017-09-05 上海应用技术大学 Demographics' method and system based on video monitoring
WO2018059408A1 (en) * 2016-09-29 2018-04-05 北京市商汤科技开发有限公司 Cross-line counting method, and neural network training method and apparatus, and electronic device
CN107911653A (en) * 2017-11-16 2018-04-13 王磊 The module of intelligent video monitoring in institute, system, method and storage medium
CN111274340A (en) * 2020-01-15 2020-06-12 中国联合网络通信集团有限公司 People flow density monitoring processing method, equipment and storage medium

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007243342A (en) 2006-03-06 2007-09-20 Yokogawa Electric Corp Image-monitoring apparatus and image-monitoring system
US9197861B2 (en) * 2012-11-15 2015-11-24 Avo Usa Holding 2 Corporation Multi-dimensional virtual beam detection for video analytics
WO2014174737A1 (en) 2013-04-26 2014-10-30 日本電気株式会社 Monitoring device, monitoring method and monitoring program
CN104239908A (en) * 2014-07-28 2014-12-24 中国科学院自动化研究所 Intelligent ridership automatic statistical method based on self-adaptive threshold value
US11019268B2 (en) * 2015-03-27 2021-05-25 Nec Corporation Video surveillance system and video surveillance method
US9840166B2 (en) * 2015-04-13 2017-12-12 Verizon Patent And Licensing Inc. Determining the number of people in a vehicle
CN105139425B (en) * 2015-08-28 2018-12-07 浙江宇视科技有限公司 A kind of demographic method and device
CN105447458B (en) * 2015-11-17 2018-02-27 深圳市商汤科技有限公司 A kind of large-scale crowd video analytic system and method
CN205354276U (en) * 2015-12-24 2016-06-29 上海市水利工程设计研究院有限公司 Pressure sensing formula people current density alarm device
WO2017122258A1 (en) 2016-01-12 2017-07-20 株式会社日立国際電気 Congestion-state-monitoring system
CN105763853A (en) * 2016-04-14 2016-07-13 北京中电万联科技股份有限公司 Emergency early warning method for stampede accident in public area
CN106211065A (en) * 2016-06-30 2016-12-07 北京奇虎科技有限公司 The monitoring method and device of personnel's data on flows
WO2018025831A1 (en) 2016-08-04 2018-02-08 日本電気株式会社 People flow estimation device, display control device, people flow estimation method, and recording medium
CN107844848B (en) * 2016-09-20 2020-12-29 中国移动通信集团湖北有限公司 Regional pedestrian flow prediction method and system
CN109428938A (en) * 2017-09-04 2019-03-05 上海仪电(集团)有限公司中央研究院 A kind of linkage control intelligence system based on video analysis
JP2019117425A (en) * 2017-12-26 2019-07-18 キヤノンマーケティングジャパン株式会社 Information processing device, control method therefor, and program
CN109087478A (en) * 2018-08-22 2018-12-25 徐自远 A kind of early warning of the anti-swarm and jostlement of intelligence and method of river diversion and system
CN109272153A (en) * 2018-09-10 2019-01-25 合肥巨清信息科技有限公司 A kind of tourist attraction stream of people early warning system
CN109685009A (en) * 2018-12-20 2019-04-26 天和防务技术(北京)有限公司 A kind of method of region crowd density video detection
CN110708518B (en) * 2019-11-05 2021-05-25 北京深测科技有限公司 People flow analysis early warning dispersion method and system
CN110929648B (en) * 2019-11-22 2021-03-16 广东睿盟计算机科技有限公司 Monitoring data processing method and device, computer equipment and storage medium
CN111652161A (en) * 2020-06-08 2020-09-11 上海商汤智能科技有限公司 Crowd excess density prediction method and device, electronic equipment and storage medium
CN114900669A (en) * 2020-10-30 2022-08-12 深圳市商汤科技有限公司 Scene monitoring method and device, electronic equipment and storage medium

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018059408A1 (en) * 2016-09-29 2018-04-05 北京市商汤科技开发有限公司 Cross-line counting method, and neural network training method and apparatus, and electronic device
CN106778688A (en) * 2017-01-13 2017-05-31 辽宁工程技术大学 The detection method of crowd's throat floater event in a kind of crowd scene monitor video
CN107133607A (en) * 2017-05-27 2017-09-05 上海应用技术大学 Demographics' method and system based on video monitoring
CN107911653A (en) * 2017-11-16 2018-04-13 王磊 The module of intelligent video monitoring in institute, system, method and storage medium
CN111274340A (en) * 2020-01-15 2020-06-12 中国联合网络通信集团有限公司 People flow density monitoring processing method, equipment and storage medium

Also Published As

Publication number Publication date
CN114900669A (en) 2022-08-12
KR20220058859A (en) 2022-05-10
CN112333431A (en) 2021-02-05
JP7305808B2 (en) 2023-07-10
WO2022088653A1 (en) 2022-05-05
JP2023502816A (en) 2023-01-26

Similar Documents

Publication Publication Date Title
CN112333431B (en) Scene monitoring method and device, electronic equipment and storage medium
Adam et al. Robust real-time unusual event detection using multiple fixed-location monitors
CN108965826B (en) Monitoring method, monitoring device, processing equipment and storage medium
CN111860230B (en) Automatic detection system and method based on behavior of video monitoring personnel not wearing mask
US11752962B2 (en) Automatic accident detection
CN111325954B (en) Personnel loss early warning method, device, system and server
CN109803127A (en) Urban safety building site monitoring system and method based on big data and technology of Internet of things
US11763662B2 (en) Systems and methods of enforcing dynamic thresholds of social distancing rules
CN110544312A (en) Video display method and device in virtual scene, electronic equipment and storage device
CN111127066A (en) Mining application method and device based on user information
JP6621092B1 (en) Risk determination program and system
CN114170272A (en) Accident reporting and storing method based on sensing sensor in cloud environment
CN113869220A (en) Monitoring method and system for major traffic accidents
CN113112744A (en) Security management method and device, electronic equipment and storage medium
CN111914050A (en) Visual 3D monitoring platform based on specific places
CN112699328A (en) Network point service data processing method, device, system, equipment and storage medium
CN112330742A (en) Method and device for recording activity routes of key personnel in public area
JP2015056697A (en) Monitor system and control method thereof
CN114067248A (en) Behavior detection method and device, electronic equipment and storage medium
CN111372197B (en) Early warning method and related device
JP6739119B6 (en) Risk determination program and system
CN113052049A (en) Off-duty detection method and device based on artificial intelligence tool identification
CN112507928A (en) Visitor management system and method based on face recognition
CN113128294A (en) Road event evidence obtaining method and device, electronic equipment and storage medium
CN111611938B (en) Retrograde direction determining method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40040125

Country of ref document: HK

GR01 Patent grant