CN111127830A - Alarm method, alarm system and readable storage medium based on monitoring equipment - Google Patents

Alarm method, alarm system and readable storage medium based on monitoring equipment

Info

Publication number
CN111127830A
Authority
CN
China
Prior art keywords
face image
monitoring
alarm
current
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201811296613.9A
Other languages
Chinese (zh)
Inventor
吴炽强
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qiku Internet Technology Shenzhen Co Ltd
Original Assignee
Qiku Internet Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qiku Internet Technology Shenzhen Co Ltd filed Critical Qiku Internet Technology Shenzhen Co Ltd
Priority to CN201811296613.9A priority Critical patent/CN111127830A/en
Publication of CN111127830A publication Critical patent/CN111127830A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 Alarms for ensuring the safety of persons
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources

Abstract

The application discloses an alarm method based on a monitoring device, a monitoring device, an alarm system and a computer-readable storage medium. The alarm method comprises: collecting a monitoring picture and acquiring a current face image from the monitoring picture; sending the current face image to a monitoring server so that the monitoring server obtains, based on the current face image, corresponding identity information and a historical face image corresponding to that identity information, and receiving the historical face image returned by the monitoring server; and comparing the current face image with the historical face image to judge whether the current face image meets an alarm condition. Based on the monitoring device, the alarm method can effectively recognize face images, compare them with historical face images to judge facial emotion, and raise an alarm immediately when an abnormal condition is detected, so that crimes or dangerous situations can be discovered in time.

Description

Alarm method, alarm system and readable storage medium based on monitoring equipment
Technical Field
The present application relates to the field of electronic technologies, and in particular, to an alarm method, an alarm system, and a readable storage medium based on a monitoring device.
Background
Modern science and technology are well developed, and to protect people's personal and property safety, monitoring devices are often installed in places with heavy foot traffic, such as entrances and exits of public places, overpasses and residential-area gates. Existing monitoring devices come in many types; a common monitoring device records the on-site situation on a memory card or transmits the picture to the display device of a monitor, and the monitor judges whether a dangerous situation has occurred and raises an alarm.
With the current monitoring approach, dangerous situations can be discovered in time only if the monitoring picture is watched constantly; in places with heavy foot traffic the picture shows many people, and it is difficult for a monitor to analyze each person's situation one by one.
Disclosure of Invention
The present application provides an alarm method, an alarm system and a readable storage medium based on a monitoring device, so as to solve the prior-art problems that dangerous situations cannot be discovered in time and that dense crowds cannot be monitored simultaneously.
In order to solve the technical problem, the present application provides an alarm method based on a monitoring device, comprising: collecting a monitoring picture and acquiring a current face image from the monitoring picture; sending the current face image to a monitoring server, so that the monitoring server obtains, based on the current face image, corresponding identity information and a historical face image corresponding to that identity information, and receiving the historical face image returned by the monitoring server; comparing the current face image with the historical face image to judge whether the current face image meets an alarm condition; and, if so, sending out alarm information.
In order to solve the above technical problem, the present application provides a monitoring device comprising a camera assembly, a memory and a processor, wherein the camera assembly is used for collecting monitoring pictures, the memory is used for storing a computer program, and the processor is used for executing the computer program to implement the above alarm method.
In order to solve the above technical problem, the present application provides an alarm system comprising a monitoring device, a monitoring server and a police server, wherein the monitoring device is the monitoring device described above and is used for sending alarm information to the monitoring server, and the monitoring server is used for receiving the alarm information sent by the monitoring device and forwarding it to the police server.
In order to solve the above technical problem, the present application provides a computer-readable storage medium, in which a computer program is stored, and the computer program is used for implementing the alarm method of any one of the above items when the computer program is executed by a processor.
In the present application, a monitoring picture is collected, the current face image in the monitoring picture is acquired, and the current face image is compared with the historical face image to judge whether the current face image meets the alarm condition; if so, alarm information is sent out. In this way, a monitor no longer needs to watch the monitoring picture constantly, because the monitoring device can automatically judge dangerous situations and raise an alarm according to the alarm condition. Moreover, the monitoring device can recognize several face images at the same time, so the situation of each person can be monitored well even in crowded places. People therefore no longer need to keep watching the monitoring picture, dangerous situations can be discovered in time, people's safety is protected, and the monitoring device is made intelligent.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and other drawings can be obtained from them by those skilled in the art without creative effort.
FIG. 1 is a schematic flow chart of an embodiment of an alarm method based on a monitoring device according to the present application;
FIG. 2 is a schematic flow chart of another embodiment of the alarm method based on the monitoring device according to the present application;
FIG. 3 is a schematic flowchart of an embodiment of comparing a current face image with a historical face image according to the present application;
FIG. 4 is a schematic flow chart of an embodiment of obtaining a face emotion index according to the present application;
FIG. 5 is a schematic flow chart illustrating a monitoring device based alarm method according to another embodiment of the present application;
FIG. 6 is a schematic block diagram of an embodiment of a monitoring device of the present application;
FIG. 7 is a schematic diagram of an embodiment of the alarm system of the present application;
FIG. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium of the present application.
Detailed Description
In order to help those skilled in the art better understand the technical solution of the present application, the alarm method based on a monitoring device, the monitoring device, the alarm system and the readable storage medium provided by the present application are described in further detail below with reference to the accompanying drawings and specific embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart illustrating an embodiment of an alarm method based on a monitoring device according to the present application.
S11: and collecting a monitoring picture, and acquiring a current face image in the monitoring picture.
The monitoring device in this embodiment may be a camera module, which obtains a monitoring picture of a set area. Optionally, in an embodiment, the monitoring device may include a plurality of camera modules, and lenses of the plurality of camera modules correspond to different directions or angles, respectively, so as to increase a field of view of the collected monitoring picture.
In addition, in other embodiments the monitoring device is rotatable; for example, it rotates 360 degrees at a set speed and stops rotating once a face image is captured in the monitoring picture. Furthermore, the monitoring device can track a face: it collects feature points in the face image and then tracks them, i.e. when the face moves, the monitoring device rotates along with the feature points so that the face image stays at the center of the monitoring picture, as sketched below.
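The following is an illustrative sketch only, not the implementation of this application: one way to follow a detected face by its feature points with OpenCV optical flow, assuming the face bounding box (x, y, w, h) has already been obtained. The rotate_towards() call stands in for the device's pan/tilt motor interface and is a hypothetical placeholder.

```python
import cv2
import numpy as np

def track_face(cap, face_box, rotate_towards):
    ok, prev = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    x, y, w, h = face_box
    mask = np.zeros_like(prev_gray)
    mask[y:y + h, x:x + w] = 255
    # Collect feature points inside the face region, then follow them.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=50,
                                  qualityLevel=0.01, minDistance=5, mask=mask)
    while pts is not None and len(pts) > 0:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
        pts = new_pts[status.flatten() == 1].reshape(-1, 1, 2)
        if len(pts) == 0:
            break
        # The offset of the tracked points from the frame centre drives the
        # rotation so that the face stays centred in the monitoring picture.
        cx, cy = pts.reshape(-1, 2).mean(axis=0)
        rotate_towards(cx - gray.shape[1] / 2, cy - gray.shape[0] / 2)
        prev_gray = gray
```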
The face recognition mainly comprises face image acquisition, face image detection and face image preprocessing.
The face image acquisition includes:
Acquiring face images: images are collected mainly through the camera lens, including static images, dynamic images, and images taken from different positions and with different expressions, so that good samples can be obtained.
The face image detection includes: in practice, face detection mainly serves as preprocessing for face recognition, that is, accurately locating the position and size of the face in the image. A face image contains rich pattern features, such as histogram features, color features, template features, structural features and Haar features; face detection extracts this useful information and uses these features to locate the face.
Face detection can adopt the feature-based Adaboost learning algorithm. Adaboost is a classification method that combines weak classifiers into a new, strong classifier.
During face detection, the Adaboost algorithm picks out the rectangular features (weak classifiers) that best represent the face, builds them into a strong classifier by weighted voting, and then concatenates several trained strong classifiers into a cascade classifier, which effectively increases the detection speed. A minimal sketch using a pre-trained cascade of this kind is given below.
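A minimal sketch, assuming OpenCV: face detection with a pre-trained Haar cascade, which is itself an Adaboost-trained cascade of weak classifiers of the kind described above. It illustrates the idea only and is not the classifier trained in this application.

```python
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def detect_faces(frame):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Early cascade stages cheaply reject non-face windows; only promising
    # windows reach the later, stronger stages, which speeds up detection.
    return face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
```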
The face image preprocessing includes:
Preprocessing the face image: image preprocessing for faces processes the image based on the face detection result and ultimately serves feature extraction. The original image acquired by the system is limited by various conditions and subject to random interference, so it usually cannot be used directly; it must first undergo image preprocessing such as gray-scale correction and noise filtering. For a face image, the preprocessing mainly includes light compensation, gray-level transformation, histogram equalization, normalization, geometric correction, filtering and sharpening, as sketched below.
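A brief sketch, assuming OpenCV, of some of the preprocessing steps named above (gray-level transformation, histogram equalization, noise filtering, normalization); the exact pipeline used by the device is not specified in this application.

```python
import cv2

def preprocess_face(face_bgr, size=(112, 112)):
    gray = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2GRAY)  # gray-level transformation
    gray = cv2.equalizeHist(gray)                      # histogram equalization
    gray = cv2.GaussianBlur(gray, (3, 3), 0)           # simple noise filtering
    gray = cv2.resize(gray, size)                      # geometric normalization
    return gray.astype("float32") / 255.0              # intensity normalization
```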
S12: and sending the current face image to a monitoring server, and acquiring a historical face image sent by the monitoring server.
Data transmission between the monitoring device and the monitoring server is generally performed over a wire, for example an optical cable. Of course, this embodiment does not exclude wireless data transmission, such as sending the data to a nearby base station over a cellular connection, with the base station then uploading the data to the server.
The database of the monitoring server stores a large number of face images corresponding to different identity information, so identity information can be confirmed through face recognition. Specifically, the feature data extracted from the face image is searched and matched against the feature templates stored in the database, and a threshold is set; when the similarity exceeds the threshold, the matching result is output. In other words, face recognition compares the face features to be recognized with the stored face feature templates and judges the identity of the face according to the degree of similarity, as sketched below.
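A sketch of the threshold-based matching described above: the feature vector extracted from the current face image is compared with the stored feature templates, and an identity is returned only when the best similarity exceeds the set threshold. The feature extractor and the 0.8 threshold are illustrative assumptions, not values given in this application.

```python
import numpy as np

def identify(features, templates, threshold=0.8):
    """templates: dict mapping identity information -> stored feature vector."""
    best_id, best_sim = None, -1.0
    for identity, template in templates.items():
        sim = float(np.dot(features, template) /
                    (np.linalg.norm(features) * np.linalg.norm(template) + 1e-9))
        if sim > best_sim:
            best_id, best_sim = identity, sim
    return best_id if best_sim >= threshold else None
```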
The current face image is sent to the monitoring server; based on the current face image, the monitoring server obtains the corresponding identity information and the historical face image corresponding to that identity information from its own database or a cloud database, and sends the historical face image to the monitoring device. Specifically, the identity information may include name, date of birth, gender, identification number and so on.
S13: and comparing the current face image with the historical face image to judge whether the current face image meets the alarm condition.
Optionally, the change in expression can be obtained by comparing the current face image with the historical face image. It can be understood that the historical face image shows a person's normal expression, and when the person is in danger the expression often changes greatly, for example to sadness, crying or fierceness; whether the alarm condition is met can therefore be judged by comparing the expression in the current face image with that in the historical face image.
Furthermore, a sound pickup device can be arranged in the monitoring device to listen to the surrounding sound and assist in judging dangerous situations; the monitoring device automatically collects keywords such as "help" and screams. For example, if the monitoring device at a residential-area gate detects that the expression in the current face image is pained, that this expression differs greatly from the previous face image, and that a call for help is heard, it can judge that the person is currently unwell and needs help, so the alarm condition is met.
And the monitoring equipment compares the current face image with the historical face image to judge whether the current face image meets the alarm condition. If not, the monitoring of the current face image is finished. If so, the next step of operation is carried out.
S14: and sending alarm information.
When the monitoring device detects that the current face image meets the alarm condition, it sends alarm information to a police server. Specifically, the monitoring device takes the current time information, the current location information and the monitoring picture as the alarm information and sends it to the monitoring server; the monitoring server merges the previously obtained identity information into the alarm information and then sends it to the corresponding police server. In practice, monitoring devices are generally installed in crowded public areas such as overpasses, squares and residential-area gates, while the police server is generally placed in a security room or guard room, so that when a dangerous situation occurs a security guard can learn of it at the first moment. A sketch of how the alarm information might be assembled is given below.
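An illustrative sketch of the alarm information described above: the monitoring device packages the current time, location and monitoring picture, and the monitoring server merges the identity information before forwarding it to the police server. The function and field names are assumptions for illustration, not interfaces defined in this application.

```python
import time

def build_alarm_info(location, monitoring_picture):
    return {
        "time": time.strftime("%Y-%m-%d %H:%M:%S"),
        "location": location,
        "monitoring_picture": monitoring_picture,
    }

def forward_to_police(alarm_info, identity_info, send):
    # The monitoring server adds the identity information it looked up earlier
    # and passes the merged alarm information on to the police server.
    merged = dict(alarm_info, identity=identity_info)
    send("police_server", merged)  # hypothetical transport call
```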
In another case, the monitoring device may further include an alarm; when it is determined in step S13 that the user is in danger, the alarm sounds to attract the attention of people nearby.
This embodiment provides an alarm method based on a monitoring device: the monitoring device automatically monitors the current face image in the picture and, when an abnormality is found by comparing the current face image with the historical face image, automatically sends out alarm information. The alarm method of this embodiment can discover people's dangerous situations in time and summon help from nearby security guards, thereby protecting people's lives and property and giving them peace of mind.
Further, please refer to fig. 2, fig. 2 is a schematic flowchart illustrating another embodiment of the alarm method based on the monitoring device according to the present application. The embodiment will describe in detail how the monitoring device implements automatic alarm.
S21: and collecting a monitoring picture, and acquiring a current face image in the monitoring picture.
S22: and sending the current face image to a monitoring server, and acquiring a historical face image sent by the monitoring server.
S21 and S22 are similar to S11 and S12 in the above embodiments, and are not described herein again.
S23: and comparing the emotion index of the current face image with the emotion index obtained by the historical face image to judge whether the current face image meets the alarm condition.
To compare the current face image with the historical face image, this embodiment may compare the emotion index of the current face image with the emotion index of the historical face image; specifically, please refer to fig. 3, which is a schematic flow diagram of an embodiment of comparing the current face image with the historical face image in the present application.
S231: and obtaining a corresponding current emotion index based on the current face image, and obtaining an average emotion index of the historical face image based on the historical face image.
A method for detecting the emotion index corresponding to a face image is built into the monitoring device; it works by extracting multiple feature data corresponding to different facial organs from the face image. Specifically, different weights are assigned to the facial muscles, mouth, eyes and eyebrows, the change of each is judged separately, and a final judgment is given from the combined result; for example, the eyebrows account for 10%, the facial muscles for 20%, the mouth for 30% and the eyes for 40%, and the emotion index of the face image is finally obtained from each organ's degree of change on a scale of 1 to 10. Referring to fig. 4, fig. 4 is a schematic flow chart of an embodiment of obtaining a face emotion index in the present application.
S2311: a plurality of feature data corresponding to different facial organs are extracted from the face image.
Usable features are generally classified into visual features, pixel statistical features, face image transform coefficient features, face image algebraic features, and so on. Face feature extraction, also known as face characterization, is the process of modeling the features of a face. Methods for extracting face features fall into two main categories: knowledge-based characterization methods, and characterization methods based on algebraic features or statistical learning.
Knowledge-based characterization methods mainly derive feature data useful for face classification from the shape descriptions of the facial organs and the distances between them; the feature components typically include Euclidean distances, curvatures and angles between feature points. The face consists of parts such as the eyes, nose, mouth and chin, and geometric descriptions of these parts and of their structural relationships can serve as important features for recognizing a face; such features are called geometric features. Knowledge-based face characterization mainly includes geometric-feature-based methods and template matching methods.
Multiple feature data corresponding to different facial organs are extracted from the face image; for example, if the detected expression involves twitching facial muscles, a wide-open mouth, widened eyes and furrowed eyebrows, the extracted data are feature data such as the position of the facial muscles, the size of the mouth, the size of the eyes and the position of the eyebrows.
S2312: the plurality of feature data is compared with standard data of the corresponding facial organ.
The feature data of the facial muscles, mouth, eyes, eyebrows and so on are each compared with the standard data of the corresponding facial organ, where the standard data of the facial organs are obtained from statistics gathered on the Internet. Further, the standard data of the facial organs may differ with age and gender: the monitoring device analyzes the age and gender of the current face image and then looks up the standard data of the corresponding facial organs for that age and gender.
S2313: and determining the emotion index of the face image based on the difference between the feature data and the standard data.
The emotion index of each facial organ is determined based on the difference between the feature data and the standard data of that organ. The table below shows the correspondence between this difference and the emotion index: the larger the difference between the feature data of a facial organ and the standard data, the higher the emotion index. The difference is calculated as: difference = (feature data of the facial organ - standard data) / standard data × 100%. It should be noted that, since the facial data are only roughly detected, high precision is not required for this calculation; in this embodiment the result is kept to one decimal place, by rounding.
[Table: correspondence between the percentage difference from the standard data and the emotion index; provided only as an image in the original publication.]
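A sketch of the per-organ difference calculation given above. The mapping from the percentage difference to a 1-10 emotion index follows a table that is only available as an image in the original publication, so the bands used here are illustrative assumptions.

```python
def organ_difference(feature_value, standard_value):
    diff = (feature_value - standard_value) / standard_value * 100.0
    return round(abs(diff), 1)  # only one decimal place is kept, by rounding

def organ_emotion_index(diff_percent, bands=(5, 10, 15, 20, 25, 30, 35, 40, 45)):
    # A larger deviation from the standard data gives a higher emotion index.
    index = 1
    for bound in bands:
        if diff_percent > bound:
            index += 1
    return index  # value in the range 1-10
```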
Further, according to the emotion index of each facial organ and the preset weight of the corresponding organ, the weighted sum of the emotion indexes of the several facial organs is taken as the emotion index of the face image. Specifically, different weights are assigned to the facial muscles, mouth, eyes and eyebrows, the change of each is judged separately, and the emotion index of the current person is then given from the combined result, for example with the eyebrows weighted 10%, the facial muscles 20%, the mouth 30% and the eyes 40%. The emotion index of the current person is therefore calculated as: eyebrow emotion index × eyebrow weight + facial muscle emotion index × facial muscle weight + mouth emotion index × mouth weight + eye emotion index × eye weight. For example, when the emotion indexes of the facial organs are detected as eyebrows 3, facial muscles 5, mouth 5 and eyes 8, the measured emotion index of the current person is 3 × 10% + 5 × 20% + 5 × 30% + 8 × 40% = 6.0.
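A sketch of the weighted combination described above, reproducing the worked example in the text (eyebrows 3, facial muscles 5, mouth 5, eyes 8 with weights 10%/20%/30%/40%).

```python
WEIGHTS = {"eyebrows": 0.10, "facial_muscles": 0.20, "mouth": 0.30, "eyes": 0.40}

def face_emotion_index(organ_indexes, weights=WEIGHTS):
    return sum(organ_indexes[organ] * weight for organ, weight in weights.items())

print(face_emotion_index({"eyebrows": 3, "facial_muscles": 5, "mouth": 5, "eyes": 8}))
# -> 6.0
```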
This embodiment provides a method for obtaining the emotion index of a face image, which turns a facial expression into a quantified emotion index that the monitoring device can conveniently compare and judge. The emotion index can be stored together with the person's identity information, so that when the monitoring device needs to retrieve a historical face image the corresponding emotion index can be retrieved as well, without the monitoring device having to compute it again.
S232: and judging whether the current emotion index is larger than the average emotion index or not.
Whether the current emotion index is greater than the average emotion index is judged. The average emotion index can be derived from the emotion index of the historical face image, which the monitoring device obtains from the monitoring server and which can be computed by the same steps used for the current face image. Further, a threshold is set, for example 10% of the emotion index of the historical face image, and the historical emotion index plus this threshold gives the average emotion index. Following the example above, if the measured emotion index of the historical face image is 5.3, the average emotion index is 5.3 + 5.3 × 10% = 5.83. In this example the current face emotion index is 6.0 and the average emotion index is 5.83; since 6.0 is greater than 5.83, the next operation is carried out. In other embodiments, if the current emotion index is not greater than the average emotion index, the method returns to step S21. A sketch of this check is given below.
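A sketch of the check in S232/S233: the historical emotion index plus a tolerance (10% in the example above) gives the average emotion index, and the alarm condition is met when the current index exceeds it.

```python
def meets_alarm_condition(current_index, historical_index, tolerance=0.10):
    average_index = historical_index * (1.0 + tolerance)
    return current_index > average_index

print(meets_alarm_condition(6.0, 5.3))  # 6.0 > 5.83 -> True
```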
S233: and judging that the current face image meets the alarm condition.
If the current emotion index is greater than the average emotion index, it indicates that the emotion of the face captured by the monitoring device is out of control and the person is in a dangerous state, so the alarm condition is met.
It should be noted that the historical face images in the monitoring server can be updated continuously: when the monitoring device monitors a person's current face image and no alarm is triggered, that current face image can serve as the historical face image for the next monitoring. In addition, a person who has previously triggered an alarm can be treated more strictly than an ordinary person who has not, specifically by lowering the threshold used to compute the average emotion index, for example 10% for an ordinary person and 5% for a person who has triggered an alarm before.
This embodiment describes in detail how the monitoring device determines that the alarm condition is met, mainly by detecting whether the emotion index of the current face image is greater than the emotion index of the historical face image. If it is, the emotion of the currently detected person is out of control and an alarm is needed. By selecting the emotion index of the historical face image as the reference, this embodiment can monitor changes in a person's emotion more accurately.
S24: and sending alarm information.
When a person's emotion index on a given day deviates greatly from its usual value, for example through abnormal anger or distress, the monitoring device sends alarm information to the police server in time. In some embodiments, the monitoring device may also track the person in real time and send the tracking information to the police server. In addition, the monitoring device can listen to the surrounding sound to assist in judging dangerous situations, and can deter criminals by playing a broadcast or sounding an alarm when a crime occurs.
Compared with the previous embodiments, this embodiment further provides a way for the monitoring device to alarm automatically: the emotion index of the current face image is compared with the average emotion index, and when the current emotion index is higher, the monitoring device sends alarm information to the police server. The monitoring device may also be equipped with a player and a sound pickup device. The sound pickup device collects sound information from the environment and extracts keywords, such as "help" or roaring, as an auxiliary judgment of dangerous situations, while the player can deter crimes by playing a broadcast or sounding an alarm. This embodiment can effectively monitor the emotional changes of the people in the monitoring picture and can also discover dangerous situations in time and raise an alarm.
Further, please refer to fig. 5, wherein fig. 5 is a schematic flowchart of another embodiment of the alarm method based on the monitoring device according to the present application.
S51: and collecting a monitoring picture, and acquiring the current face image and the number of the face images in the monitoring picture.
The method for acquiring the face images is similar to S11 in the above embodiment and is not repeated here. Whether the number of face images is greater than or equal to two is then judged; if so, the emotion index corresponding to each face image is obtained by carrying out the corresponding steps of S2311 to S2313 above, which are likewise not repeated here.
S52: and judging whether the difference of the emotion indexes corresponding to at least two face images in each face image is larger than a set threshold value.
Whether the difference of the emotion indexes corresponding to at least two of the face images is greater than a set threshold is judged; if so, emotion recognition is further performed on those current faces and their identities are judged. Generally, the monitoring device distinguishes victims, suspects and unknown persons; at least two people present at a crime scene may include at least one suspect and at least one victim. If sadness, fright and the like are recognized in a current face image, the person is identified as a victim; if anger, agitation and the like are recognized, the person is identified as a suspect; and if the monitoring device cannot simply classify the person as a victim or a suspect, the person is identified as unknown. The numbers of victims, suspects and unknown persons in the picture are recorded in the monitoring device.
It should be noted that the difference of the emotion indexes means the emotion index of the current face image minus the emotion index of the standard face image; the standard emotion index differs with age and gender, and the monitoring device can obtain the most appropriate standard emotion index according to the recognized current face image. The set threshold measures the degree of emotional change in the current face image; it can be set according to actual needs and is generally set to 0.8. A sketch of this check is given below.
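A sketch of the multi-person check in S52: each face whose emotion index exceeds the standard emotion index by more than the set threshold (0.8 in the text) is labelled victim, suspect or unknown from its recognized emotion. The emotion-recognition call is assumed to exist elsewhere, and the emotion labels are illustrative.

```python
THRESHOLD = 0.8

def classify_people(faces, standard_index, recognize_emotion):
    """faces: list of (face_image, emotion_index) pairs from one picture."""
    flagged = [face for face, index in faces if index - standard_index > THRESHOLD]
    if len(flagged) < 2:
        return None  # fewer than two abnormal faces: no further analysis
    counts = {"victim": 0, "suspect": 0, "unknown": 0}
    for face in flagged:
        emotion = recognize_emotion(face)
        if emotion in ("sad", "frightened", "crying"):
            counts["victim"] += 1
        elif emotion in ("angry", "agitated"):
            counts["suspect"] += 1
        else:
            counts["unknown"] += 1
    return counts
```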
S53: and sending the current face image to a server, and acquiring a historical face image sent by the server.
S54: and comparing the current face image with the historical face image to judge whether the current face image meets the alarm condition, and if so, sending alarm information.
S53-S54 are similar to S12-S14, and are not repeated herein. In this embodiment, the alarm message further includes the number of suspects, victims, and unknown persons identified in the monitoring screen.
This embodiment is suitable for situations where several people are active in the monitoring picture. When several people in the picture are identified as meeting the alarm condition, the monitoring terminal analyzes the real-time monitoring picture to obtain the numbers of suspects, victims and unknown persons at the crime scene. From the alarm information sent by the monitoring terminal, nearby security guards can learn the personnel situation at the crime scene and plan their actions accordingly.
For example, if the monitoring device recognizes that the current emotion indexes of a child and an adult both exceed their historical emotion indexes, it further judges their expressions; if the child keeps crying and screaming while the adult's brows are locked and expression is tense, the monitoring device can identify the child as a victim and the adult as a suspect and infer that the child may have been abducted. The monitoring device sends this analysis together with the alarm information to the police server.
Referring to fig. 6, fig. 6 is a schematic structural diagram of an embodiment of a monitoring device according to the present application. Wherein the monitoring device 60 comprises a camera assembly 61, a memory 62 and a processor 63. The camera assembly 61 is used for acquiring monitoring pictures, the memory 62 is used for storing computer programs, and the processor 63 is used for executing the computer programs to realize the steps of any embodiment of the alarm method. The principle and the steps are similar, and are not described in detail herein.
Referring to fig. 7, fig. 7 is a schematic structural diagram of an embodiment of an alarm system according to the present application. Wherein the alarm system 70 comprises a monitoring device 71, a monitoring server 72 and a police server 73. Monitoring device 71 may be monitoring device 60 in the above-described embodiment of fig. 6, where monitoring device 71 is configured to send alarm information to monitoring server 72, and monitoring server 72 is configured to receive the alarm information sent by monitoring device 71 and send the alarm information to police server 73.
Optionally, the alarm system 70 may include a plurality of monitoring devices 71, and the route a user takes can be determined by correlating the face images acquired by the multiple cameras. For example, if the same face image is obtained continuously and in sequence by several cameras leading to the roof, it can be determined that the user is heading for the roof.
Referring to fig. 8, fig. 8 is a schematic structural diagram of an embodiment of a computer-readable storage medium according to the present application. The computer-readable storage medium 80 has stored therein a computer program 81, the computer program 81 being adapted to carry out the steps of any of the embodiments of the alarm method described above when executed by a processor. The principle and the steps are similar, and are not described in detail herein. The computer storage medium may be provided in the monitoring apparatus in the above embodiments.
Embodiments of the present application may be implemented as software functional units and, when sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. An alarm method based on monitoring equipment is characterized in that,
collecting a monitoring picture, and acquiring a current face image in the monitoring picture;
sending the current face image to a monitoring server, so that the monitoring server acquires corresponding identity information and a historical face image corresponding to the identity information based on the current face image, and acquires the historical face image sent by the monitoring server;
comparing the current face image with the historical face image to judge whether the current face image meets an alarm condition;
if yes, alarm information is sent out.
2. The alarm method of a monitoring device according to claim 1,
the step of comparing the current face image with the historical face image to judge whether the current face image meets an alarm condition comprises the following steps:
obtaining a corresponding current emotion index based on the current face image, and obtaining an average emotion index of a historical face image based on the historical face image;
judging whether the current emotion index is larger than the average emotion index or not;
and if so, judging that the current face image meets the alarm condition.
3. The alarm method of a monitoring device according to claim 2,
the steps of obtaining a corresponding current emotion index based on the current face image and obtaining an average emotion index of a historical face image based on the historical face image comprise:
extracting a plurality of characteristic data corresponding to different facial organs from the face image;
comparing the plurality of feature data with standard data of corresponding facial organs;
and determining the emotion index of the face image based on the difference between the characteristic data and the standard data.
4. The alarm method of a monitoring device according to claim 3,
the step of determining an emotion index of the face image based on the difference between the feature data and the standard data includes:
determining an emotional index of each facial organ based on a difference between the feature data and the standard data of each facial organ;
and taking the weighted sum of the emotion indexes of the plurality of facial organs as the emotion index of the face image according to the emotion index of each facial organ and the preset weight of the corresponding facial organ.
5. The alarm method of a monitoring device according to claim 4,
the alarm method further comprises the following steps:
acquiring corresponding age and gender based on the face image;
and acquiring standard data of the facial organs corresponding to the ages and the sexes and preset weights of the corresponding facial organs.
6. The alarm method of a monitoring device according to claim 1,
the step of collecting the monitoring picture and acquiring the current face image in the monitoring picture comprises the following steps:
collecting a monitoring picture;
acquiring the current face images and the number of the face images in the monitoring picture;
judging whether the number of the face images is more than or equal to two;
if so, acquiring an emotion index corresponding to each face image;
judging whether the difference of the emotion indexes corresponding to at least two face images in each face image is larger than a set threshold value or not;
and if so, executing the step of sending the current face image to a server so that the server acquires corresponding identity information and a historical face image corresponding to the identity information based on the current face image, and acquiring the historical face image sent by the server.
7. The alarm method of a monitoring device according to claim 1,
the step of sending alarm information comprises the following steps:
acquiring current time information, current place information and a monitoring picture as alarm information;
and sending the alarm information to a monitoring server so that the monitoring server sends the alarm information to a corresponding police server.
8. A monitoring device, comprising a camera assembly, a memory, and a processor;
wherein the camera is used for collecting monitoring pictures, the memory is used for storing computer programs, and the processor is used for executing the computer programs to realize the alarm method according to any one of claims 1-7.
9. An alarm system is characterized by comprising monitoring equipment, a monitoring server and a police server;
the monitoring device is the monitoring device according to claim 8, the monitoring device is configured to send alarm information to a monitoring server, and the monitoring server is configured to receive the alarm information sent by the monitoring device and send the alarm information to a police server.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, is adapted to carry out the alarm method according to any one of claims 1 to 7.
CN201811296613.9A 2018-11-01 2018-11-01 Alarm method, alarm system and readable storage medium based on monitoring equipment Withdrawn CN111127830A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811296613.9A CN111127830A (en) 2018-11-01 2018-11-01 Alarm method, alarm system and readable storage medium based on monitoring equipment

Publications (1)

Publication Number Publication Date
CN111127830A true CN111127830A (en) 2020-05-08

Family

ID=70494883

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811296613.9A Withdrawn CN111127830A (en) 2018-11-01 2018-11-01 Alarm method, alarm system and readable storage medium based on monitoring equipment

Country Status (1)

Country Link
CN (1) CN111127830A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102566740A (en) * 2010-12-16 2012-07-11 富泰华工业(深圳)有限公司 Electronic device with emotion recognition function, and output control method of such electronic device
CN107944434A (en) * 2015-06-11 2018-04-20 广东欧珀移动通信有限公司 A kind of alarm method and terminal based on rotating camera
CN106469297A (en) * 2016-08-31 2017-03-01 北京小米移动软件有限公司 Emotion identification method, device and terminal unit
WO2018135502A1 (en) * 2017-01-20 2018-07-26 シャープ株式会社 Household electric appliance and household electric appliance control system
CN107895146A (en) * 2017-11-01 2018-04-10 深圳市科迈爱康科技有限公司 Micro- expression recognition method, device, system and computer-readable recording medium
CN108564007A (en) * 2018-03-27 2018-09-21 深圳市智能机器人研究院 A kind of Emotion identification method and apparatus based on Expression Recognition

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111639209A (en) * 2020-05-20 2020-09-08 广东小天才科技有限公司 Book content searching method, terminal device and storage medium
CN111639209B (en) * 2020-05-20 2023-12-22 广东小天才科技有限公司 Book content searching method, terminal equipment and storage medium
CN111814583A (en) * 2020-06-17 2020-10-23 北京航天时代光电科技有限公司 Information scanning system for information security
CN111770310A (en) * 2020-07-02 2020-10-13 广州博冠智能科技有限公司 Lost child identification and positioning method and device
CN112000293A (en) * 2020-08-21 2020-11-27 饶志昌 Monitoring data storage method, device, equipment and storage medium based on big data
CN112070011A (en) * 2020-09-08 2020-12-11 安徽兰臣信息科技有限公司 Noninductive face recognition camera shooting snapshot machine for finding lost children
CN112115847A (en) * 2020-09-16 2020-12-22 深圳印像数据科技有限公司 Method for judging face emotion joyfulness
CN113191275A (en) * 2021-05-02 2021-07-30 李凤华 Internet of things monitoring system based on intelligent fire fighting and monitoring method thereof
CN114360085A (en) * 2021-11-25 2022-04-15 中国人民人寿保险股份有限公司 Method for identifying attendance cheating behaviors, service system and terminal equipment thereof
CN117218324A (en) * 2023-10-17 2023-12-12 广东迅扬科技股份有限公司 Camera regulation and control system and method based on artificial intelligence


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (Application publication date: 20200508)