CN110598632A - Target object monitoring method and device, electronic equipment and storage medium - Google Patents


Publication number
CN110598632A
Authority
CN
China
Prior art keywords
target object, monitoring, result, state, data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910863589.0A
Other languages
Chinese (zh)
Other versions
CN110598632B (en)
Inventor
夏俊
戴娟
吴军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Sensetime Technology Co Ltd
Original Assignee
Shenzhen Sensetime Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Sensetime Technology Co Ltd
Priority to CN201910863589.0A
Publication of CN110598632A
Application granted
Publication of CN110598632B
Active legal status
Anticipated expiration legal status

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Abstract

The disclosure relates to a target object monitoring method and apparatus, an electronic device, and a storage medium. The monitoring method comprises: acquiring monitoring data of a target object; and performing state evaluation on the target object according to the monitoring data to obtain a monitoring result, wherein the monitoring data comprises emotion recognition data and/or behavior recognition data obtained by performing image recognition on the target object. Through this process, the state of the target object can be comprehensively evaluated based on its monitoring data to obtain a monitoring result. Because the monitoring data includes at least one of emotion recognition data and behavior recognition data obtained by image recognition, the target object is recognized along multiple dimensions, so a monitoring result obtained by evaluating such multi-dimensional monitoring data is more reliable and reflects the overall state of the target object more accurately.

Description

Target object monitoring method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of data analysis, and in particular, to a method and an apparatus for monitoring a target object, an electronic device, and a storage medium.
Background
With the development of society and technology, a single venue can accommodate more and more objects. For a venue supervisor, grasping the state of all objects in the venue in a timely and comprehensive manner makes it easier to maintain order and ensure safety, so that the venue is supervised effectively. However, how to determine the state of an object in a venue more accurately remains an urgent problem to be solved.
Disclosure of Invention
The present disclosure presents a monitoring scheme for a target object.
According to an aspect of the present disclosure, there is provided a target object monitoring method, including:
acquiring monitoring data of a target object; and performing state evaluation on the target object according to the monitoring data to obtain a monitoring result; wherein the monitoring data includes emotion recognition data and/or behavior recognition data obtained by performing image recognition on the target object.
Through this process, the state of the target object can be comprehensively evaluated based on its monitoring data to obtain a monitoring result. Because the monitoring data includes at least one of emotion recognition data and behavior recognition data obtained by image recognition, the target object is recognized along multiple dimensions, so a monitoring result obtained by evaluating such multi-dimensional monitoring data is more reliable and reflects the overall state of the target object more accurately.
In a possible implementation manner, performing state evaluation on the target object according to the monitoring data to obtain a monitoring result includes: acquiring a state of the target object to be evaluated; counting the monitoring data corresponding to the state to obtain a statistical result; and performing state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result.
Through this process, the acquired monitoring data can be used effectively to evaluate different states of the target object, so that the final monitoring result can include multiple state evaluation results of the target object, further improving the reliability of the monitoring result.
In a possible implementation manner, performing state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result includes: performing a weighted summation of the statistical result with preset weights to obtain a weighted result; converting the weighted result into a preset value range, and taking the converted result as a state statistic; and averaging multiple state statistics of the target object within a monitoring period, taking the averaged result as the state evaluation result of the target object.
Performing a weighted summation of the statistical result with preset weights quantizes the monitoring data of the target object, converting the otherwise subjective process of state evaluation into an objective numerical value, which makes the monitoring result more objective and easier to analyze. Converting the weighted result into a preset value range and taking the converted result as a state statistic places all final monitoring results in the same value range, which makes them easier to compare and analyze. Averaging multiple state statistics of the target object within the monitoring period and taking the averaged result as the state evaluation result makes the monitoring result more reliable, reducing the possibility that a short monitoring time yields a result with no reference value.
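The weighting, value-range conversion, and averaging described above can be sketched as follows. This is an illustrative reading only: the function names, the per-category weights, and the [0, 100] value range are assumptions, not fixed by the disclosure.

```python
def state_statistic(counts, weights, value_range=(0.0, 100.0)):
    """Weighted sum of per-category counts, mapped into a preset value range.

    `counts` maps a recognition category (e.g. "happy", "yawning") to how
    often it was observed; `weights` maps each category to an assumed
    weight in [0, 1]. Both keyings are hypothetical.
    """
    total = sum(counts.values())
    lo, hi = value_range
    if total == 0:
        return lo
    weighted = sum(n * weights.get(category, 0.0) for category, n in counts.items())
    # Normalise by the total count, then rescale onto the preset value range.
    return lo + (weighted / total) * (hi - lo)


def state_evaluation(statistics_in_period):
    """Average the state statistics collected over a monitoring period."""
    if not statistics_in_period:
        return None
    return sum(statistics_in_period) / len(statistics_in_period)
```

For example, 3 "happy" observations at weight 1.0 and 1 "sad" observation at weight 0.0 yield a state statistic of 75.0 on the assumed [0, 100] range.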
In a possible implementation manner, before performing state evaluation on the target object according to the statistical result to obtain the state evaluation result as the monitoring result, the method further includes: judging whether at least one statistical result exists for the target object within the monitoring period, and if so, performing the state evaluation on the target object according to the statistical result to obtain the state evaluation result of the target object as the monitoring result.
Through this process, state evaluation is performed only when a statistical result exists for the target object, which avoids unnecessary operations in the absence of data and improves the efficiency and accuracy of the monitoring process.
In one possible implementation, the behavior recognition data includes: at least one of sitting posture recognition data, action recognition data, and face orientation recognition data.
By receiving emotion recognition data and behavior recognition data obtained by performing image recognition on the target object, many kinds of emotions and behaviors can be recognized, because the recognition data covers multiple dimensions. When evaluation is performed based on both emotion recognition data and behavior recognition data, the overall state of the target object can therefore be analyzed comprehensively and accurately, greatly improving the reliability of the monitoring result.
In one possible implementation, the state includes: at least one of an emotional state, an energy state, and a learning state.
By evaluating the emotional state, energy state, and learning state of the target object to obtain the monitoring result, the target object's condition can be comprehensively assessed in terms of its emotion, energy, and learning, which improves the comprehensiveness, reliability, and reference value of the monitoring result and makes the target object's condition easier to grasp as a whole.
In a possible implementation manner, the counting the monitoring data corresponding to the state includes: counting emotion recognition data corresponding to the emotional states; and/or counting emotion recognition data, sitting posture recognition data and motion recognition data corresponding to the energy state; and/or counting emotion recognition data, sitting posture recognition data, motion recognition data and face orientation recognition data corresponding to the learning state.
Through this process, different recognition data can be consulted to obtain the emotional state, energy state, and learning state of the target object respectively. Different states draw on different reference data; that is, obtaining different types of state requires different recognition data, so the results for different states are distinguishable, which improves the comprehensiveness and reliability of the monitoring result.
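One way to read the per-state counting above is a fixed mapping from each state to the recognition streams consulted for it. The stream names and state names below are illustrative assumptions, not identifiers from the disclosure:

```python
# Hypothetical mapping from each state to the recognition-data streams
# counted for it, following the combinations described above.
STATE_SOURCES = {
    "emotional": ["emotion"],
    "energy":    ["emotion", "sitting_posture", "action"],
    "learning":  ["emotion", "sitting_posture", "action", "face_orientation"],
}


def count_for_state(state, monitoring_data):
    """Tally only the recognition streams relevant to the given state.

    `monitoring_data` maps a stream name to the list of labels recognised
    from the images during the current interval.
    """
    counts = {}
    for stream in STATE_SOURCES[state]:
        for label in monitoring_data.get(stream, []):
            counts[label] = counts.get(label, 0) + 1
    return counts
```

So an "emotional" evaluation ignores action labels, while an "energy" evaluation counts them as well.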
In a possible implementation manner, performing state evaluation on the target object according to the monitoring data to obtain a monitoring result further includes: acquiring a first normal state evaluation range of the target object; comparing the state evaluation result of the target object with the first normal state evaluation range; and judging that the target object is in an abnormal state when its state evaluation result falls outside the first normal state evaluation range.
By acquiring the first normal state evaluation range of the target object, comparing the target object's state evaluation result with this range, and determining that the target object is in an abnormal state when the result falls outside the range, abnormal conditions of the target object can be discovered in time, further improving the pertinence and reliability of monitoring.
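The range comparison reduces to a bounds check. Representing the first normal state evaluation range as a (low, high) pair is an assumption for illustration:

```python
def is_abnormal(state_evaluation_result, normal_range):
    """True when the state evaluation result falls outside the normal range."""
    lo, hi = normal_range
    return not (lo <= state_evaluation_result <= hi)
```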
In a possible implementation manner, the obtaining a first normal state evaluation range of the target object includes: determining the first normal state evaluation range according to the historical monitoring result of the target object; or reading a preset first normal state evaluation range.
The first normal state evaluation range may be determined from the historical monitoring results of the target object, or a preset range may be read directly. Determining the range flexibly through these two approaches improves the flexibility of evaluating the target object's abnormal state, and thus the flexibility of target monitoring.
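The two ways of obtaining the first normal state evaluation range might be combined as below. Deriving the range as the mean plus or minus k standard deviations of historical results is one plausible reading of "determined according to the historical monitoring result"; the preset bounds and k are placeholder assumptions.

```python
import statistics


def first_normal_range(history=None, preset=(40.0, 90.0), k=2.0):
    """Derive the first normal state evaluation range.

    When historical monitoring results are available, take mean +/- k
    standard deviations; otherwise fall back to the preset range.
    """
    if history and len(history) >= 2:
        mu = statistics.fmean(history)
        sigma = statistics.stdev(history)
        return (mu - k * sigma, mu + k * sigma)
    return preset
```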
In a possible implementation manner, the performing state evaluation on the target object according to the monitoring data to obtain a monitoring result further includes: obtaining state evaluation results of a plurality of objects in a set to which the target object belongs; and obtaining the state evaluation result of the set to which the target object belongs as the monitoring result according to the obtained state evaluation results of the plurality of objects.
By obtaining the state evaluation results of multiple objects in the set to which the target object belongs, and deriving from them a state evaluation result for the set as the monitoring result, the monitoring result becomes useful for analyzing collective emotion or behavior, which broadens the application range of the monitoring method and improves its practicality.
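A minimal sketch of the set-level evaluation described above. Averaging the member results is one plausible aggregation; the disclosure leaves the exact combination open.

```python
def group_evaluation(member_results):
    """Aggregate per-object state evaluation results into a set-level result.

    Members without a result in the period (None) are skipped.
    """
    valid = [r for r in member_results if r is not None]
    if not valid:
        return None
    return sum(valid) / len(valid)
```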
In a possible implementation manner, performing state evaluation on the target object according to the monitoring data to obtain a monitoring result further includes: acquiring a second normal state evaluation range of the set to which the target object belongs; comparing the state evaluation result of the set with the second normal state evaluation range; and judging that the set is in an abnormal state when its state evaluation result falls outside the second normal state evaluation range.
Through the process, the abnormal state of the set to which the target object belongs can be further evaluated, and the application range of the monitoring method is further expanded.
In one possible implementation, the method further includes: and sending the monitoring result to a target terminal, wherein the target terminal is used for displaying the monitoring result.
By sending the monitoring result to a target terminal capable of displaying it, the monitoring result can be presented effectively, making it convenient to observe the monitoring condition of the target object comprehensively and accurately, and improving the timeliness, visibility, and summarizability of monitoring.
In one possible implementation, the method further includes: and sending alarm information according to the monitoring result.
By sending alarm information according to the monitoring result, an alarm can be issued to the relevant parties in time when an abnormality of the monitored target object occurs, further improving the reliability, safety, and practicality of monitoring.
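Result forwarding and alarming might be wired together as below; `send` and `alarm` stand in for the target-terminal and alerting channels, which the disclosure does not specify.

```python
def dispatch(monitoring_result, normal_range, send, alarm):
    """Forward a monitoring result and emit alarm information when abnormal.

    `send` delivers the result to the target terminal; `alarm` delivers
    alarm information. Both are placeholders for unspecified channels.
    """
    send(monitoring_result)
    lo, hi = normal_range
    if not (lo <= monitoring_result <= hi):
        alarm(f"state evaluation {monitoring_result} outside [{lo}, {hi}]")
```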
According to an aspect of the present disclosure, there is provided a target object monitoring apparatus including:
the monitoring data acquisition module is used for acquiring monitoring data of a target object; the monitoring result generation module is used for carrying out state evaluation on the target object according to the monitoring data to obtain a monitoring result; wherein the monitoring data includes emotion recognition data and/or behavior recognition data obtained by image recognition of the target object.
In one possible implementation manner, the monitoring result generating module includes: the state acquisition unit is used for acquiring the state to be evaluated of the target object; the statistical unit is used for counting the monitoring data corresponding to the state to obtain a statistical result; and the state evaluation unit is used for carrying out state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result.
In one possible implementation, the state evaluation unit is configured to: carrying out weighted summation on the statistical result and a preset weight to obtain a weighted result; converting the weighting result into a preset value domain, and taking the converted result as a state statistic value; and averaging a plurality of state statistics values of the target object in a monitoring period, and taking an averaged result as a state evaluation result of the target object.
In one possible implementation, the state evaluation unit is further configured to: and judging whether the target object comprises at least one statistical result in a monitoring period, and if so, executing the state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result.
In one possible implementation, the behavior recognition data includes: at least one of sitting posture identification data, motion identification data, and face orientation identification data.
In one possible implementation, the state includes: at least one of an emotional state, an energy state, and a learning state.
In a possible implementation manner, the statistical unit is configured to: counting emotion recognition data corresponding to the emotional states; and/or counting emotion recognition data, sitting posture recognition data and motion recognition data corresponding to the energy state; and/or counting emotion recognition data, sitting posture recognition data, motion recognition data and face orientation recognition data corresponding to the learning state.
In a possible implementation manner, the monitoring result generating module further includes an abnormal state determining unit, where the abnormal state determining unit is configured to: acquiring a first normal state evaluation range of a target object; and comparing the state evaluation result of the target object with the first normal state evaluation range, and judging that the target object is in an abnormal state under the condition that the state evaluation result of the target object exceeds the first normal state evaluation range.
In a possible implementation manner, the abnormal state determination unit is further configured to: determining the first normal state evaluation range according to the historical monitoring result of the target object; or reading a preset first normal state evaluation range.
In one possible implementation manner, the monitoring result generation module is further configured to: obtaining state evaluation results of a plurality of objects in a set to which the target object belongs; and obtaining the state evaluation result of the set to which the target object belongs as the monitoring result according to the obtained state evaluation results of the plurality of objects.
In one possible implementation manner, the monitoring result generation module is further configured to: acquire a second normal state evaluation range of the set to which the target object belongs; compare the state evaluation result of the set with the second normal state evaluation range; and judge that the set is in an abnormal state when its state evaluation result falls outside the second normal state evaluation range.
In one possible implementation, the apparatus is further configured to: and sending the monitoring result to a target terminal, wherein the target terminal is used for displaying the monitoring result.
In one possible implementation, the apparatus is further configured to: and sending alarm information according to the monitoring result.
According to an aspect of the present disclosure, there is provided a monitoring system of a target object, including:
the monitoring data generating device is used for generating monitoring data of the target object; a target object monitoring device as claimed in any one of the preceding claims; wherein the monitoring data generation device is connected with the monitoring device of the target object.
In one possible implementation manner, the monitoring data generating device includes: an image acquisition device for acquiring an image of a target object; the image recognition device is used for obtaining monitoring data of the target object through image recognition; the image recognition equipment is respectively connected with the image acquisition equipment and the monitoring device of the target object.
In one possible implementation, the system further includes: the target terminal is used for displaying the monitoring result of the monitoring device of the target object; and the target terminal is connected with the monitoring device of the target object.
According to an aspect of the present disclosure, there is provided an electronic device including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to: the above-described target object monitoring method is performed.
According to an aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described target object monitoring method.
In the embodiments of the present disclosure, monitoring data obtained by performing image recognition on a target object is received, state evaluation is performed on the target object according to the monitoring data, and a monitoring result is obtained, where the monitoring data includes emotion recognition data and/or behavior recognition data. Through this process, the state of the target object can be comprehensively evaluated based on its monitoring data to obtain a monitoring result. Because the monitoring data includes at least one of emotion recognition data and behavior recognition data obtained by image recognition, the target object is recognized along multiple dimensions, so a monitoring result obtained by evaluating such multi-dimensional monitoring data is more reliable and reflects the overall state of the target object more accurately.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure. Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 shows a flow chart of a method of monitoring a target object according to an embodiment of the present disclosure.
Fig. 2 shows a block diagram of a monitoring device of a target object according to an embodiment of the present disclosure.
Fig. 3 illustrates a block diagram of a monitoring system of a target object according to an embodiment of the present disclosure.
Fig. 4 shows a schematic diagram of an application example according to the present disclosure.
Fig. 5 shows a block diagram of an electronic device in accordance with an embodiment of the disclosure.
Fig. 6 illustrates a block diagram of an electronic device in accordance with an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality, or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the set formed by A, B, and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 shows a flowchart of a target object monitoring method according to an embodiment of the present disclosure, which may be applied to a terminal device, a server or other processing device, and the like. The terminal device may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. In one example, the target object monitoring method can be applied to a cloud server or a local server, the cloud server can be a public cloud server or a private cloud server, and the method can be flexibly selected according to actual conditions.
In some possible implementations, the method for monitoring the target object may also be implemented by a processor calling computer readable instructions stored in a memory.
As shown in fig. 1, the monitoring method of the target object may include:
step S11, acquiring monitoring data of the target object, wherein the monitoring data includes emotion recognition data and/or behavior recognition data obtained by image recognition of the target object.
And step S12, performing state evaluation on the target object according to the monitoring data to obtain a monitoring result.
In the above disclosed embodiment, the identity of the target object to be monitored can be selected flexibly according to the actual situation. In a possible implementation manner, the method provided in the embodiments of the present disclosure may be applied to health analysis of students on a campus, in which case the target object may be a student. In another possible implementation manner, the method may be applied to emotional and behavioral analysis of experimental subjects, in which case the target object may be a subject under test. As the application environment differs, the identity of the target object changes accordingly; further examples are not listed here.
Monitoring data obtained by performing image recognition on the target object is received, and state evaluation is performed on the target object according to the monitoring data to obtain a monitoring result, where the monitoring data includes emotion recognition data and/or behavior recognition data. Through this process, the state of the target object can be comprehensively evaluated based on its monitoring data to obtain a monitoring result. Because the monitoring data includes at least one of emotion recognition data and behavior recognition data obtained by image recognition, the target object is recognized along multiple dimensions, so a monitoring result obtained by evaluating such multi-dimensional monitoring data is more reliable and reflects the overall state of the target object more accurately.
In a possible implementation manner, the monitoring data may include both emotion recognition data and behavior recognition data, so that both the behavior and the emotion of the target object can be recognized; with more recognition dimensions, the reliability of the monitoring result is further improved. Moreover, when the monitoring method provided in the embodiments of the present disclosure is applied to health analysis of students on a campus, essentially no intervention by teachers or school administrators is required: a camera deployed in the classroom collects images or videos, and analysis of the collected material yields the overall state of the students in the classroom while continuously tracking the state changes of each child. This helps teachers learn in time whether students are attentive in class, whether they are energetic, and whether their learning state is good, and, combined with the related statistical analysis, provides data support for teaching work.
The monitoring data obtained in step S11 may come from a recognition apparatus that performs recognition on the target object; for example, it may come from one recognition apparatus, or from two or more recognition apparatuses at the same time, which is not limited in the embodiments of the present disclosure. The implementation of the recognition apparatus is likewise not limited: any terminal device, server, or other processing device capable of performing image recognition on the target object may serve as the recognition apparatus. In a possible implementation manner, the recognition apparatus may include an image acquisition module and a recognition module, which may be integrated in the same hardware device or implemented on different hardware; the embodiments of the present disclosure are not limited in this respect. The image acquisition module acquires image data of the target object, which may be video data or static picture data. In one example, video data may be parsed into frames, from which individual frame pictures are extracted. The acquired image data is sent to the recognition module, which performs image recognition on the target object and obtains the corresponding recognition result as the monitoring data.
By performing image recognition on the target object, the recognition module can obtain emotion recognition data and behavior recognition data, where which recognition data are obtained can be flexibly selected according to the actual situation. The emotion types that the emotion recognition data can identify may be flexibly determined according to the specific implementation of emotion recognition, which is not specifically limited in the embodiments of the present disclosure; in one example, the emotion recognition data may identify 7 emotion types: happiness, calmness, surprise, disgust, anger, sadness, and fear. The behavior recognition data may be further classified according to the category of behavior, and in a possible implementation, the behavior recognition data may include: at least one of sitting posture recognition data, action recognition data, and face orientation recognition data. The types of behaviors that these behavior recognition data can identify are also not limited in the embodiments of the present disclosure. In one example, the sitting posture recognition data may identify 5 sitting posture types: sitting, lying on the desk, standing, leaning against the wall, and leaning on the desk; in one example, the action recognition data may identify 7 action types: yawning, stretching, playing with a mobile phone, raising a hand, speaking, taking notes, and resting the head on the hands; in one example, the face orientation recognition data may identify 3 face orientation types: head down, head up, and head turned.
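For illustration, the recognizable categories listed above can be laid out as simple label sets; the English label names below are assumptions for readability and are not identifiers from the source:

```python
# Illustrative label sets for the recognizable categories described in the
# text; the English label names are assumptions, not source identifiers.
EMOTIONS = ["happy", "calm", "surprised", "disgusted", "angry", "sad", "afraid"]
SITTING_POSTURES = ["sitting", "lying_on_desk", "standing",
                    "leaning_on_wall", "leaning_on_desk"]
ACTIONS = ["yawning", "stretching", "playing_phone", "raising_hand",
           "speaking", "taking_notes", "resting_head_on_hands"]
FACE_ORIENTATIONS = ["head_down", "head_up", "head_turned"]

# 7 + 5 + 7 + 3 = 22 recognizable results in total, consistent with the
# later claim that more than 20 emotions and behaviors can be recognized.
TOTAL_LABELS = (len(EMOTIONS) + len(SITTING_POSTURES)
                + len(ACTIONS) + len(FACE_ORIENTATIONS))
```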
In the embodiments of the present disclosure, emotion recognition data and behavior recognition data obtained by performing image recognition on the target object are received. Because these recognition data cover many recognition dimensions (more than 20 recognizable emotions and behaviors), an evaluation based on the emotion recognition data and the behavior recognition data can analyze the overall state of the target object relatively comprehensively and accurately, greatly improving the reliability of the monitoring result.
In the above-described disclosed embodiment, the manner in which the recognition module performs image recognition on the target image is also not limited in the present disclosure, and any implementation manner that can perform image recognition on the target object from the received image data can be used as the implementation manner of the recognition module. In one example, the image recognition may include one or more of face detection, face recognition, behavior recognition, and expression recognition, and the specific selection of which recognition modes may be flexibly determined according to requirements.
In one possible implementation, the recognition process may be as follows: first, the target object is detected in each frame of image in the image data; for example, when the target object is a student, a human body is detected in each frame. After the target object is detected in each frame, its emotion and behavior can be classified through model operation to obtain the final recognition result, where the model operation may perform emotion recognition, behavior recognition, or the like on the detected target object based on a trained neural network model; no specific limitation is made here, and the model can be flexibly selected according to the actual situation. In one example, if sitting posture recognition is performed for the "lying on the desk" posture, the specific recognition process may be: first define the characteristics of lying on the desk, such as the spine being bent at a large angle, the trunk being roughly parallel to the desktop, the arms being folded on the desk or under the head, the head/face being in contact with the hands or arms, or the hands or arms lying flat on the desk. When the frames parsed from the video are processed, the relevant human body can be recognized first, and model calculation then judges whether these behavioral characteristics of lying on the desk are met; if so, the behavior is judged to be "lying on the desk". In one example, if action recognition is performed for the "playing with a mobile phone" action, the specific recognition process may be: first define the behavioral characteristics of playing with a mobile phone, such as a mobile phone or tablet (e.g., an iPad) being visible and operated in the hand.
When the frames parsed from the video are processed, the relevant human body is recognized first, and model calculation judges whether the behavioral characteristics of playing with a mobile phone are met; if so, the behavior is judged to be "playing with a mobile phone".
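The rule-style judgments described above (detect the person, then test whether defined behavioral characteristics are met) can be sketched as follows. The feature names (spine_angle_deg, torso_parallel_to_desk, arms_on_desk, device_in_hand) and the 60-degree threshold are hypothetical stand-ins for outputs of an upstream pose/detection model, not values from the source:

```python
def is_lying_on_desk(person):
    """Sketch of the 'lying on the desk' decision. The feature names are
    hypothetical outputs of an upstream pose-estimation model."""
    spine_bent = person.get("spine_angle_deg", 0.0) > 60.0    # spine bent at a large angle
    torso_flat = person.get("torso_parallel_to_desk", False)  # trunk roughly parallel to desktop
    arms_down = person.get("arms_on_desk", False)             # arms folded/flat on the desk
    return spine_bent and torso_flat and arms_down

def is_playing_phone(person):
    """Sketch of the 'playing with a mobile phone' decision: a phone or
    tablet is visible and operated in the person's hand."""
    return person.get("device_in_hand", False)
```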
Through an identification apparatus including an image acquisition module and a recognition module, image recognition is performed on the target object to obtain monitoring data including emotion recognition data and behavior recognition data, and the monitoring data is sent to a server capable of implementing the target object monitoring method provided in the embodiments of the present disclosure. Through this implementation, the emotion and behavior recognition of the target object can be completed locally, which reduces the computational pressure on the server, improves the server's computing and service capacity, and improves the efficiency of the monitoring method provided in the embodiments of the present disclosure.
In a possible implementation manner, the monitoring data obtained in step S11 may also be directly obtained in the server, that is, after the image of the target object is obtained, the image may be uploaded to the server, and image recognition is performed on the server, so as to obtain the monitoring data, and the monitoring result is continuously obtained in the server by using the monitoring data.
After the monitoring data of the target object is acquired, the target object may be subjected to state evaluation according to the monitoring data through step S12, so as to obtain a monitoring result. In one possible implementation, step S12 may include:
step S121, obtaining the state to be evaluated of the target object.
Step S122, counting the monitoring data corresponding to the state to obtain a statistical result.
Step S123, performing state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result.
The state of the target object to be evaluated is obtained, the monitoring data corresponding to the state are counted to obtain a statistical result, and the state of the target object is evaluated according to the statistical result, so that the state evaluation result of the target object is obtained as the monitoring result.
In step S121, a state to be evaluated of the target object is obtained, a state type included in the state may be flexibly determined according to an actual situation in the embodiment of the present disclosure, and in a possible implementation manner, the state may include: at least one of an emotional state, an energy state, and a learning state. Therefore, the implementation manner of step S121 may be determined according to the type of the state to be evaluated, and in one example, the emotional state, the energy state, and the learning state of the target object may be obtained at the same time; in one example, only one or two of the three states of the target object may be acquired; in an example, on the basis of obtaining the three states of the target object, other states of the target object may be further obtained, and the types and evaluation manners of the other states may be flexibly determined according to requirements.
By evaluating the emotional state, energy state, and learning state of the target object to obtain the monitoring result, the monitoring situation of the target object can be evaluated comprehensively from its emotion, energy, and learning conditions, which improves the comprehensiveness, reliability, and reference value of the monitoring result and facilitates a full grasp of the target object's monitoring situation.
After the state to be evaluated of the target object is obtained in step S121, the monitoring data corresponding to the state may be counted in step S122, so as to obtain a statistical result. It can be seen that, when step S122 is implemented, since the statistical monitoring data corresponds to the state to be evaluated, the statistical content may change correspondingly with the different states to be evaluated, so that the implementation manner of step S122 is also different. In one possible implementation, step S122 may include:
counting the emotion recognition data corresponding to the emotional state; and/or
counting the emotion recognition data, sitting posture recognition data, and action recognition data corresponding to the energy state; and/or
counting the emotion recognition data, sitting posture recognition data, action recognition data, and face orientation recognition data corresponding to the learning state.
According to the disclosed embodiments, when the emotional state of the target object is evaluated, the emotion recognition data may be used as reference data for statistics and evaluation, yielding an emotional state evaluation result as the monitoring result; when the energy state is evaluated, the emotion recognition data, together with the sitting posture recognition data and action recognition data from the behavior recognition data, may be used as reference data, yielding an energy state evaluation result as the monitoring result; when the learning state is evaluated, the emotion recognition data, together with the sitting posture recognition data, action recognition data, and face orientation recognition data from the behavior recognition data, may be used as reference data, yielding a learning state evaluation result as the monitoring result. In a possible implementation manner, the emotion recognition data, sitting posture recognition data, action recognition data, and face orientation recognition data may each be counted first, the statistical results to combine are then selected according to the state to be evaluated, and the state evaluation result of the target object is obtained through step S123 as the monitoring result. In another possible implementation manner, the corresponding monitoring data may be obtained according to the state to be evaluated, so as to obtain the corresponding statistical result directly.
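The mapping above, from each state to be evaluated to the recognition-data types it references, can be written down as a small configuration table; the key and value names are illustrative:

```python
# Which recognition-data types feed each state evaluation, per the text:
# the emotional state references emotion data only; the energy state adds
# sitting posture and action data; the learning state adds face orientation.
STATE_INPUTS = {
    "emotional_state": ["emotion"],
    "energy_state":    ["emotion", "sitting_posture", "action"],
    "learning_state":  ["emotion", "sitting_posture", "action", "face_orientation"],
}

def required_data(state):
    """Return the recognition-data types that must be counted for a state."""
    return STATE_INPUTS[state]
```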
In an example, when the method is applied to the health status analysis of students, the emotion recognition data, sitting posture recognition data, action recognition data, and face orientation recognition data of a student may each be counted. A statistical result for the emotional state may then be obtained from the student's emotion recognition data; a statistical result for the energy state from the student's emotion, sitting posture, and action recognition data; and a statistical result for the learning state from the student's emotion, sitting posture, action, and face orientation recognition data. After the three statistical results are obtained, the student's emotional state, energy state, and learning state may be evaluated respectively through step S123 to obtain the monitoring results.
It can be seen from the above that, as the type of state to be evaluated varies, the process of counting the monitoring data varies accordingly. In one possible implementation, the statistical approach may be: since the type of monitoring data to count is determined by the state to be evaluated, each recognizable result within that type of recognition data is counted to obtain the statistical result. For example, if the state to be evaluated is the emotional state, the monitoring data to count is the emotion recognition data; as noted in the disclosed embodiments above, the recognizable results in the emotion recognition data include 7 types (happiness, calmness, surprise, disgust, anger, sadness, and fear), so during counting, the number of occurrences of each of these 7 types for the target object may be recorded within a statistical period to obtain the statistical result. If the state to be evaluated is the energy state, then in addition to counting the 7 emotion types above, the 5 types in the sitting posture recognition data (e.g., the number of times sitting and the number of times lying on the desk) and the 7 types in the action recognition data (e.g., the number of times yawning and the number of times stretching) also need to be counted to obtain the statistical result. The statistical manner for the learning state is similar and is not repeated here.
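The counting step described above (record, per statistical period, how many times each recognizable result occurred) is a plain tally; a minimal sketch, assuming each recognition record arrives as a label string:

```python
def count_results(records, labels):
    """Step S122 sketch: tally how many times each recognizable result
    occurs within a statistical period, keeping zeros for unseen labels."""
    counts = {label: 0 for label in labels}
    for record in records:
        if record in counts:
            counts[record] += 1
    return counts
```

For instance, counting the emotion records of one period against the 7 emotion labels yields the per-emotion occurrence counts used later in the weighted summation.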
By counting the emotion recognition data corresponding to the emotional state; the emotion, sitting posture, and action recognition data corresponding to the energy state; and the emotion, sitting posture, action, and face orientation recognition data corresponding to the learning state, different recognition data can be referenced to obtain the emotional state, energy state, and learning state of the target object respectively. Different states reference different data; that is, when different types of states are obtained, the specific recognition data consulted differ, so the results for different states differ, which improves the comprehensiveness and reliability of the monitoring result.
After the statistical result is obtained in step S122, the monitoring result may be obtained in step S123 based on the statistical result. The implementation manner of step S123, that is, the manner of performing state evaluation on the target object, is not limited. In one possible implementation, the correspondence between the statistical result and the state evaluation result may be determined by weighted calculation; in another possible implementation, it may be determined by other calculation manners, such as proportional relationships between statistical results, and can be flexibly selected according to the actual situation. Accordingly, in one possible implementation, step S123 may include:
and S1231, carrying out weighted summation on the statistical result and a preset weight to obtain a weighted result.
And step S1232, converting the weighting result into a preset value domain, and taking the converted result as a state statistic value.
Step S1233, averaging the multiple state statistics of the target object in the monitoring period, and taking the averaged result as the state evaluation result of the target object.
Performing weighted summation on the statistical result and a preset weight to obtain a weighted result quantifies the monitoring data of the target object, converting the otherwise subjective process of state evaluation into an objective numerical value, so the monitoring result is more objective and easier to analyze. Converting the weighted result into a preset value domain and taking the converted result as the state statistic places the final monitoring results in the same value range, further improving the ease of comparing and analyzing them. Averaging the multiple state statistics of the target object within the monitoring period and taking the averaged result as the state evaluation result makes the monitoring result more reliable and reduces the possibility that an overly short monitoring time deprives the result of reference value.
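Steps S1231 and S1232 can be sketched as follows. The linear mapping in the second function is an assumption (the text fixes only the target range, e.g., [0, 100]); the raw_min/raw_max bounds would be derived from the smallest and largest weighted results the target object could possibly obtain:

```python
def weighted_result(counts, weights):
    """Step S1231: weighted sum of per-result counts and preset weights
    (positive results carry positive weights, negative results negative)."""
    return sum(counts.get(label, 0) * w for label, w in weights.items())

def to_value_domain(raw, raw_min, raw_max, low=0.0, high=100.0):
    """Step S1232 sketch: linearly map a weighted result from its possible
    range [raw_min, raw_max] into the preset value domain [low, high]."""
    return (raw - raw_min) / (raw_max - raw_min) * (high - low) + low
```

For example, two "happy" records (weight +1) and one "yawning" record (weight -2) give a weighted result of 0, which a mapping from [-8, 3] into [0, 100] converts to roughly 72.7.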
In step S1231, an important reference for the weighted summation is the preset weight, which can be flexibly determined according to the actual situation. In a possible implementation manner, the corresponding recognition data types may be determined according to the state to be evaluated, and a weight is then defined for each recognizable result within those types, where a positive recognition result may be given a positive weight and a negative recognition result a negative weight. In one example, if the energy state of the target object is to be evaluated, it can be known from the above disclosed implementation that the recognition data types involved include emotion recognition data, sitting posture recognition data, and action recognition data. Accordingly, one may define, for example, the weight of happiness in the emotion recognition data as +1; in the sitting posture recognition data, the weight of lying on the desk as -2 and the weight of standing as +1; and, in the action recognition data, the weight of yawning as -2, stretching as -2, resting the head on the hands as -2, and raising a hand as +1. The weights of the remaining recognition results are not listed one by one and are defined according to the actual situation. The weight definitions for the emotional state and the learning state may follow this example and are not described again here.
It should be noted that, as proposed in the above embodiments, the reference data for different states to be evaluated may overlap: the emotional state, energy state, and learning state all reference the emotion recognition data, and the energy state and learning state both also reference the sitting posture recognition data and action recognition data. However, when a recognition result in the same type of recognition data is used for evaluating different states, its weight may be determined according to the actual situation and may be the same or different across states, which is not limited here. For example, when the emotional state, energy state, and learning state of a target object are evaluated at the same time, the recognition result "happy" in the emotion recognition data may carry a weight of +1 in the energy state, likewise +1 in the emotional state, and +2 in the learning state.
After the weight of the recognition result in the recognition data under each type is determined, the statistical result and the preset weight may be weighted and summed to obtain a weighted result. The specific calculation method of the weighted sum can be flexibly determined according to actual situations, and in a possible implementation manner, the calculation process of the weighted sum may be as follows:
Weighted result = Σ (statistical result of the target object under the state to be evaluated in the statistical period × corresponding weight)
The time length of the statistical period may be flexibly determined according to the monitoring requirement and is not limited in the embodiments of the present disclosure. In one example, when the method is applied to the health analysis of students, the target object is a student, and the statistical period may be defined as the duration of each class. Denoting the weighted result of a student's energy state in one class as A, then:
A = Σ (count of each recognition result in the student's emotion recognition data, sitting posture recognition data, and action recognition data for that class × its corresponding weight)
The weighted results of a student's emotional state and learning state in one class can be calculated by analogy with the above example and are not described again here.
Through the above process, the weighted result of the target object can be obtained. Further, through step S1232, the weighted result can be converted into a preset value domain, with the conversion result used as the state statistic. The preset value domain may be flexibly determined according to the actual situation: its range may be defined directly, or determined from all the weighted results that the target object could possibly obtain, which is not limited here. In one example, the process of converting the weighted result into the preset value domain to obtain the state statistic may be as follows. Still taking the weighted result A of a student's energy state in one class from the disclosed embodiment above, in this application example the value range of the energy state's weighted result may be determined from the weight of each recognition result contributing to the energy state; the specific determination process can be calculated flexibly according to the actual situation and is not limited here. Further, the embodiments of the present disclosure may define the preset value domain as [0, 100]; then, according to the value range of the energy state's weighted result, a conversion formula mapping it into the preset value domain can be determined. In this example of the present disclosure, the conversion relationship between the energy state statistic and the energy state's weighted result may be obtained by calculation and may satisfy:
Energy state statistic = (weighted result of the energy state + 8X) / (11X) × 100
where X is the number of recognition records output for the target object within the period.
Through the above formula, the weighted result of each energy state can be converted into an energy state statistic value between 0 and 100, and similarly, the above process can be referred to for the calculation of the emotional state statistic value and the learning state statistic value, and details are not repeated here.
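The conversion above can be sketched directly. Note an assumption in the comment: the constants 8 and 11 presumably arise from a per-record weight range of [-8, +3] (span 11), which the text does not state explicitly:

```python
def energy_state_statistic(weighted, x):
    """Convert a class period's energy-state weighted result into a 0-100
    statistic per the formula in the text. X is the number of recognition
    records in the period; the constants 8 and 11 presumably reflect a
    per-record weight range of [-8, +3], an assumption not stated there."""
    if x <= 0:
        raise ValueError("X must be a positive record count")
    return (weighted + 8 * x) / (11 * x) * 100
```

With this mapping, the worst possible weighted result (-8 per record) converts to 0 and the best (+3 per record) converts to 100.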
As can be seen from the above disclosure, after the state statistics are obtained, in step S1233 the multiple state statistics of the target object within the monitoring period are averaged, and the averaged result is taken as the state evaluation result of the target object, i.e., the final monitoring result. The specific number of state statistics participating in the averaging is not limited in the embodiments of the present disclosure: it may be all the state statistics of the target object within the monitoring period, or a subset screened according to requirements, and the specific screening requirement may be flexibly determined according to the actual situation. The time length of the monitoring period can also be flexibly determined and is not limited to the following disclosed embodiments. In a possible implementation manner, when the monitoring period is short enough, the method may be regarded as obtaining a real-time monitoring result of the target object, in which case the monitoring method proposed in the embodiments of the present disclosure may also be regarded as a method for monitoring the target object in real time. Meanwhile, since the monitoring result is obtained from the statistical result, the statistical period in this case may be directly regarded as the monitoring period, so the state statistic obtained from the statistical result may be used directly as the state evaluation result, without averaging multiple state statistics within the monitoring period.
In a possible implementation manner, when the method provided above is applied to the health analysis of students, the statistical period may be defined as the duration of each class, and the monitoring period may further be defined as a natural day with teaching activity (a teaching day), so that a health analysis of each student can be performed for each teaching day; furthermore, after the monitoring result of each student for each teaching day is obtained, the monitoring results can be further counted and analyzed in units of weeks, months, years, and so on.
In a possible implementation manner, before step S1233, the method further includes: judging whether the target object has at least one statistical result within the monitoring period, and if so, performing state evaluation on the target object according to the statistical result to obtain the state evaluation result of the target object as the monitoring result. That is, if the target object has one or more statistical results within the monitoring period, step S1233 may be performed to obtain the state evaluation result; if not, there is no state statistic that can be averaged, and what to do can be flexibly determined according to the actual situation. As can be seen from the foregoing disclosure, in a possible implementation manner, step S1233 is premised on the target object having a weighted result for at least one statistical period within the monitoring period. For example, in the student health analysis described above, the monitoring result of a student on a teaching day is calculated on the premise that the student has at least one valid class-period weighted result on that day. Whether a weighted result is valid may be judged by checking whether the location of the class is consistent with the location where the image data was collected: if consistent, the weighted result may be considered valid; otherwise the collected image data may not pertain to that student, the resulting weighted result is inaccurate, and it may be recorded as invalid.
When the above premise is met, all the state statistics of the target object within the monitoring period can be averaged, and the averaged result taken as the state evaluation result of the target object, that is, the monitoring result. The specific averaging process can be flexibly determined according to the actual situation; in one example, when health analysis is performed on students, the state evaluation result of a student on a certain teaching day may satisfy the following formula:
State evaluation result = (S1 × T1 + S2 × T2 + S3 × T3 + …) / (T1 + T2 + T3 + …)
where S1, S2, S3, … are the student's state statistics for each statistical period within the teaching day, and T1, T2, T3, … are the durations of the statistical periods included in the teaching day. In the embodiments of the present disclosure, the student's daily emotional state, energy state, and learning state can each be obtained according to the above formula.
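The time-weighted average above can be sketched as a short function taking the per-period statistics and their durations:

```python
def daily_state_evaluation(stats, durations):
    """Step S1233 in the student example: time-weighted average of the
    per-period state statistics over one teaching day,
    (S1*T1 + S2*T2 + ...) / (T1 + T2 + ...)."""
    if not stats or len(stats) != len(durations):
        raise ValueError("need one duration per statistic")
    return sum(s * t for s, t in zip(stats, durations)) / sum(durations)
```

For example, statistics of 100 and 50 over classes lasting 10 and 30 minutes yield an evaluation of 62.5, weighted toward the longer class.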
Through the disclosed embodiments of step S12, the state evaluation result of the target object can be obtained as the monitoring result. Further, after step S12, whether the state of the target object is normal may be determined by comparing the monitoring result with a first normal state evaluation range, through the following steps:
acquiring a first normal state evaluation range of a target object;
and comparing the state evaluation result of the target object with the first normal state evaluation range, and judging that the target object is in an abnormal state under the condition that the state evaluation result of the target object exceeds the first normal state evaluation range.
In the above-described disclosed embodiment, the first normal state evaluation range may be a range value used for evaluating whether the state of the target object is normal, and a specific implementation manner and a numerical range thereof are not limited in the embodiment of the present disclosure. In a possible implementation manner, it may be considered that if the state evaluation result of the target object is within the first normal state evaluation range, the target object is in a normal state, otherwise, the target object is in an abnormal state; in a possible implementation manner, it may also be considered that the target object is in a normal state if the state evaluation result of the target object is outside the first normal state evaluation range, and otherwise, the target object is in an abnormal state. In the present disclosure, each of the following disclosure embodiments is developed by taking an example in which the state evaluation result indicates that the target object is normal when it is within the first normal state evaluation range.
By acquiring the first normal state evaluation range of the target object, comparing the state evaluation result of the target object with the first normal state evaluation range and determining that the state of the target object is in an abnormal state when the state evaluation result exceeds the first normal state evaluation range, the abnormal condition of the target object can be found in time, and the monitoring pertinence and reliability are further improved.
In the above-described disclosed embodiment, the manner of obtaining the first normal state evaluation range of the target object is not limited. In one possible implementation manner, the obtaining the first normal state evaluation range of the target object may include: determining a first normal state evaluation range according to a historical monitoring result of a target object; or reading a preset first normal state evaluation range.
In an example, before the current monitoring period, monitoring results of the target object in multiple monitoring periods may have been obtained, and these monitoring results may be regarded as historical monitoring results with respect to the current monitoring period, and the monitoring results in the current monitoring period may be compared with the historical monitoring results, so as to determine whether the state of the target object is normal in the current monitoring period. It has been proposed in the above-described disclosed embodiment that the state of the target object may include an emotional state, an energy state, and a learning state, and therefore, when determining whether the target object is normal, it is also possible to determine whether the emotional state, the energy state, and the learning state of the target object are normal based on the emotional state history monitoring data, the energy state history monitoring data, and the learning state history monitoring data of the target object, respectively.
The specific determination process is not limited in the embodiments of the present disclosure. In one possible implementation, the determination process may be as follows: first, a reference range of the normal state of the target object (i.e., the first normal state evaluation range) is obtained according to the historical monitoring results of the target object; how this reference range is specifically obtained may be analyzed flexibly according to the duration and the specific content of the historical monitoring results, and is not specifically limited herein. For example, the historical monitoring results of the target object may be averaged to obtain an average monitoring result, and a floating value may be set, so that the reference range of the normal state of the target object is the range that floats up and down by the floating value around the average monitoring result. If the current monitoring result of the target object is outside the reference range of the normal state and the duration exceeds a set time threshold (the specific value of the time threshold may be defined according to the actual situation), the target object may be considered to be in an abnormal state; otherwise, the target object is not considered to be in an abnormal state.
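The averaging-plus-floating-value scheme described above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: the function names are made up, and the time threshold is assumed to be counted in consecutive monitoring periods.

```python
def normal_range(history, floating_value):
    # Reference range (first normal state evaluation range): the mean of the
    # historical monitoring results, floating up and down by floating_value.
    mean = sum(history) / len(history)
    return (mean - floating_value, mean + floating_value)

def is_abnormal(current_results, history, floating_value, time_threshold):
    # The target object is judged abnormal only when its monitoring results
    # stay outside the reference range for longer than the time threshold
    # (here counted in consecutive monitoring periods).
    low, high = normal_range(history, floating_value)
    consecutive = 0
    for value in current_results:
        if value < low or value > high:
            consecutive += 1
            if consecutive > time_threshold:
                return True
        else:
            consecutive = 0
    return False
```

For instance, with a history averaging 70 and a floating value of 10, the reference range is (60, 80), and only a run of out-of-range results longer than the threshold is flagged.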
In an example, the normal state evaluation range of the target object may also be set according to other reference data or experience, and the set result is used as the first normal state evaluation range, and a value of the specific set result is not limited in the embodiment of the present disclosure and may be flexibly determined according to an actual situation.
When the method provided in the embodiments of the present disclosure is applied to student health analysis, in order to analyze the health state of the student more comprehensively, in addition to determining whether the emotional state, the energy state, and the learning state of the target object are abnormal, other abnormal state determinations may be added. For example, the other abnormal states may include an abnormal sitting posture state, an abnormal not-speaking-in-class state, and an abnormal continuously-not-leaving-the-seat state. The abnormal sitting posture state may be used to judge the sitting posture of the student based on the proportion of the duration of passive sitting postures (such as lying prone) in the student's in-class sitting posture identification data to the total class duration; the abnormal not-speaking-in-class state may be used to judge the student's speaking behavior in class based on the student's in-class speaking situation (such as raising hands and speaking in the action identification data); and the abnormal continuously-not-leaving-the-seat state may be used to evaluate the student's behavior of not leaving the seat during breaks (such as sitting and lying prone in the sitting posture identification data).
In a possible implementation manner, in order to ensure the reliability of data, when determining whether a target object is abnormal, the premise of the determination may be set to be that the overall state of the set to which the target object belongs is normal, that is, when the overall state of the set to which the target object belongs is normal, it is determined whether the target object is abnormal, otherwise, it is not determined whether the target object is abnormal. Specifically, how to judge the state of the set to which the target object belongs may refer to the following disclosure embodiments, which are not expanded herein.
The first normal state evaluation range is determined according to the historical monitoring result of the target object, or the preset first normal state evaluation range is directly read, the first normal state evaluation range is flexibly determined through the two modes, the flexibility of evaluation on the abnormal state of the target object can be improved, and the flexibility of target monitoring is improved.
Still further, step S12 may further include:
and acquiring the state evaluation results of a plurality of objects in the set to which the target object belongs.
And obtaining a state evaluation result of the set to which the target object belongs, as a monitoring result, according to the obtained state evaluation results of the plurality of objects.
In the above disclosure, it has been proposed that the monitoring method for a target object provided in the present disclosure may be applied to health analysis of students or to emotional behavior analysis of experimental subjects; in either case, the target object belongs to a certain set.
The set to which the target object belongs may be flexibly determined according to the identity of the target object and is not limited to the following embodiments. When obtaining the state evaluation results of a plurality of objects in the set to which the target object belongs, the specific number of objects is not limited in the embodiments of the present disclosure: it may be all the objects in the set, or some objects screened according to the actual situation, and may be flexibly selected as appropriate. In one example, when the method is applied to student health analysis, the target object may be an individual student, and the set to which the target object belongs may be the learning group or class of that student, may be expanded to the student's grade, or may be another set divided according to the identity of the student. Taking the set to which the target object belongs as the student's class, in one example, the process of obtaining the daily monitoring result of the class may be as follows: in this example of the present disclosure, the energy state of the class may be used as the monitoring result of the class; the energy state evaluation result of each student in the class is first obtained, and the daily monitoring result of the class is then the sum of the energy state evaluation results of all students in the class divided by the number of students.
The number of students of the class here is not necessarily equal to the number of students enrolled in the class, but is determined by the number of students in the class who have an energy state evaluation result on the day; that is, if N students have an energy state evaluation result on the day, the number of students equals N. It should be noted that if N is no more than 50% of the number of students enrolled in the class, that is, if the number of students having an energy state evaluation result on the day does not exceed half of the enrolled students, the state evaluation result of the class may be considered to have no computational value, and the monitoring result of the class is therefore not computed.
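The class-level aggregation rule above, including the 50% validity check, might be sketched as follows. The function name and the convention of returning None for "no computational value" are assumptions for illustration only.

```python
def class_monitoring_result(student_results, class_size):
    # Average the energy state evaluation results of the students who have a
    # result on the day. When no more than half of the enrolled students have
    # a result, the class result is treated as having no computational value
    # and None is returned.
    n = len(student_results)
    if n <= 0.5 * class_size:
        return None
    return sum(student_results) / n
```

With results for 3 of 4 enrolled students the class result is their average; with results for only 2 of 4 (exactly half), no result is computed.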
On this basis, the above process may further be used to determine whether the set to which the target object belongs is in an abnormal state, and the specific determination process may be obtained by flexibly combining the implementations proposed in the foregoing disclosed embodiments. Therefore, in a possible implementation, step S12 may further include:
acquiring a second normal state evaluation range of a set to which the target object belongs;
and comparing the state evaluation result of the set to which the target object belongs with the second normal state evaluation range, and judging that the set to which the target object belongs is in an abnormal state under the condition that the state evaluation result of the set to which the target object belongs exceeds the second normal state evaluation range.
In the above-described disclosed embodiment, the second normal state evaluation range may be a range value used to evaluate whether the state of the set to which the target object belongs is normal, and its specific implementation manner and numerical range are not limited in this embodiment. In one possible implementation manner, the set to which the target object belongs may be considered to be in a normal state as a whole if its state evaluation result is within the second normal state evaluation range, and in an abnormal state otherwise; in another possible implementation manner, the set may instead be considered to be in a normal state as a whole if its state evaluation result is outside the second normal state evaluation range, and in an abnormal state otherwise. In the embodiments of the present disclosure, the following disclosed embodiments are described taking as an example the case in which a state evaluation result within the second normal state evaluation range indicates that the set to which the target object belongs is normal as a whole.
In the above-described disclosed embodiments, the obtaining manner of the second normal state evaluation range may refer to the obtaining manner of the first normal state evaluation range, and the target object may be replaced by the set to which the target object belongs, which is not described herein again, and in the same way, the comparison and determination process may also be obtained by flexible transformation of the above-described disclosed embodiments.
The state evaluation results of the plurality of objects in the set to which the target object belongs are obtained, and the state evaluation results of the set to which the target object belongs are obtained according to the obtained results and serve as monitoring results, so that the monitoring results are beneficial to analyzing the collective emotion or behavior, the application range of the whole monitoring method is expanded, and the practicability of the monitoring method can be improved.
By the target object monitoring method provided by the embodiment of the disclosure, the monitoring result of the target object can be automatically obtained, and the monitoring reliability is improved. Further, further operations may also be performed based on the monitoring result. Therefore, in a possible implementation manner, the method for monitoring a target object provided in the embodiment of the present disclosure may further include:
and step S13, sending the monitoring result to a target terminal, wherein the target terminal is used for displaying the monitoring result.
The implementation manner of the target terminal is not limited in the embodiment of the present disclosure, and any terminal device that can display the monitoring result can be used as the implementation manner of the target terminal. In one possible implementation manner, the target terminal may be a visual display platform, so that the monitoring result can be visually displayed; in a possible implementation manner, the target terminal may also be a voice broadcast platform, so that a monitoring result can be displayed in a voice manner.
In one example, when the target terminal is a visual display platform, the target terminal may visually display the monitoring data in a predetermined display mode. The predetermined display mode may be defined according to the actual situation; in one example, it may be to display the monitoring result in the form of a report. Further, when the present method is applied to student health analysis, in one example, the predetermined display mode may include both a health reminder part and a health report part. The health reminder may be divided into a class health reminder and a student health reminder; the above-described disclosed embodiments have already proposed that, when a monitoring result is obtained, whether the target object is in an abnormal state may be judged according to the state evaluation result of the target object in combination with its historical monitoring results, so in this example the health reminder may take the historical monitoring results as reference and generate a real-time health reminder when the student or the class is in an abnormal state. The health report part may be divided into a class health report and a student health report, in which the monitoring results of the class and the students are summarized according to a user-defined time stage and displayed on the visual display platform in the form of a report.
By sending the monitoring result to a target terminal capable of displaying it, the monitoring result can be effectively presented, which facilitates comprehensive and accurate observation of the monitoring situation of the target object and improves the timeliness, intuitiveness, and summarizing capability of the monitoring.
Further, in a possible implementation manner, the method for monitoring a target object provided in the embodiment of the present disclosure may further include:
and step S14, sending alarm information according to the monitoring result.
It has been proposed in the above-mentioned embodiments that the monitoring method of the present disclosure can determine whether the target object is in an abnormal state. Therefore, when the target object, or the set to which it belongs, is in an abnormal state, warning information may further be sent to a corresponding object; the recipient of the warning information may be specified as needed, such as a relevant teacher or experimenter, and is not limited herein. The content of the warning information may be flexibly determined according to the warning condition and is likewise not limited herein. The condition for sending the warning information may be set according to the actual situation and is not limited to the following disclosed embodiments; once the condition for sending the warning information is met, the server automatically sends the warning information to the designated object. In one example, the implementation process of step S14 may be: sending warning information when the energy state, learning state, or emotional state of the class or of a student improves or deteriorates, where the warning information is sent under the system's default settings, for example when a certain state is below (or above) the normal range of the previous three months for N consecutive days, the value of N being configurable, and the same reminder content is sent at intervals of at least one day. In one example, the implementation process of step S14 may also be: a user with the authority to view or modify data in the cloud may set the sending rules of the warning information, such as the content name of the warning, the indexes or behaviors related to the warning, the sending frequency of the warning, and the warning rules.
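The default warning rule described above (a state outside the three-month normal range for N consecutive days, with a minimum one-day resend interval) could be sketched like this; the function name, parameters, and return convention are illustrative assumptions.

```python
from datetime import date, timedelta

def should_alarm(daily_values, normal_low, normal_high, n_days,
                 last_sent, today, min_interval_days=1):
    # Fire a warning when the state has been outside the normal range (e.g.
    # derived from the previous three months) for N consecutive days, and the
    # same reminder was not already sent within the minimum resend interval.
    if len(daily_values) < n_days:
        return False
    if not all(v < normal_low or v > normal_high for v in daily_values[-n_days:]):
        return False
    if last_sent is not None and today - last_sent < timedelta(days=min_interval_days):
        return False
    return True
```

A server could evaluate this once per day per state and per object, suppressing repeats that fall inside the resend interval.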
By sending the alarm information according to the monitoring result, the alarm can be timely sent to the related object when the monitored target object is abnormal, so that the reliability, the safety and the practicability of monitoring are further improved.
It can be seen from the above disclosure that after the monitoring result is obtained, the alarm information may be sent according to the monitoring result, but the process of sending the alarm information is not necessary, that is, in a possible implementation manner, the obtained monitoring result may represent the end of the monitoring process, and the alarm information does not need to be sent.
Fig. 2 shows a block diagram of a monitoring device of a target object according to an embodiment of the present disclosure. As shown, the monitoring device 20 may include: a monitoring data acquisition module 21, configured to acquire monitoring data of a target object; the monitoring result generation module 22 is used for performing state evaluation on the target object according to the monitoring data to obtain a monitoring result; wherein the monitoring data comprises emotion recognition data and/or behavior recognition data obtained by image recognition of the target object.
In one possible implementation manner, the monitoring result generating module includes: the state acquisition unit is used for acquiring the state to be evaluated of the target object; the statistical unit is used for counting the monitoring data corresponding to the state to obtain a statistical result; and the state evaluation unit is used for carrying out state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as a monitoring result.
In one possible implementation, the state evaluation unit is configured to: carrying out weighted summation on the statistical result and a preset weight to obtain a weighted result; converting the weighted result into a preset value domain, and taking the converted result as a state statistic value; and averaging a plurality of state statistics values of the target object in the monitoring period, and taking an averaged result as a state evaluation result of the target object.
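The pipeline described for the state evaluation unit (weighted summation, conversion into a preset value domain, then averaging over the monitoring period) might be sketched as follows; the 0-100 value domain, clipping as the conversion step, and all weights are assumptions for illustration.

```python
def state_statistic(counts, weights, value_range=(0, 100)):
    # Weighted summation of the statistical result with preset weights,
    # then conversion into a preset value domain (here: clipping to 0-100).
    weighted = sum(counts[k] * weights.get(k, 0) for k in counts)
    low, high = value_range
    return max(low, min(high, weighted))

def state_evaluation(statistics):
    # Average the state statistic values within the monitoring period to
    # obtain the state evaluation result of the target object.
    return sum(statistics) / len(statistics)
```

A real system might use a smoother mapping (e.g. a sigmoid) into the value domain; clipping is chosen here only to keep the sketch minimal.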
In one possible implementation, the state evaluation unit is further configured to: and judging whether the target object comprises at least one statistical result in the monitoring period, and if so, executing state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as a monitoring result.
In one possible implementation, the behavior recognition data includes: at least one of sitting posture identification data, motion identification data, and face orientation identification data.
In one possible implementation, the states include: at least one of an emotional state, an energy state, and a learning state.
In one possible implementation, the statistical unit is configured to: counting emotion recognition data corresponding to the emotional states; and/or counting emotion recognition data, sitting posture recognition data and motion recognition data corresponding to the energy state; and/or counting emotion recognition data, sitting posture recognition data, motion recognition data, and face orientation recognition data corresponding to the learning state.
In a possible implementation manner, the monitoring result generating module further includes an abnormal state determining unit, where the abnormal state determining unit is configured to: acquiring a first normal state evaluation range of a target object; and comparing the state evaluation result of the target object with the first normal state evaluation range, and judging that the target object is in an abnormal state under the condition that the state evaluation result of the target object exceeds the first normal state evaluation range.
In a possible implementation manner, the abnormal state determination unit is further configured to: determining a first normal state evaluation range according to a historical monitoring result of a target object; or reading a preset first normal state evaluation range.
In one possible implementation, the monitoring result generation module is further configured to: obtaining state evaluation results of a plurality of objects in a set to which a target object belongs; and obtaining the state evaluation result of the set to which the target object belongs as a monitoring result according to the obtained state evaluation results of the plurality of objects.
In one possible implementation, the monitoring result generation module is further configured to: acquiring a second normal state evaluation range of a set to which the target object belongs; and comparing the state evaluation result of the set to which the target object belongs with the second normal state evaluation range, and judging that the set to which the target object belongs is in an abnormal state under the condition that the state evaluation result of the set to which the target object belongs exceeds the second normal state evaluation range.
In one possible implementation, the apparatus is further configured to: and sending the monitoring result to a target terminal, wherein the target terminal is used for displaying the monitoring result.
In one possible implementation, the apparatus is further configured to: and sending alarm information according to the monitoring result.
Fig. 3 shows a block diagram of a monitoring system of a target object according to an embodiment of the present disclosure. As shown, the monitoring system 30 may include:
a monitoring data generating device 31 for generating monitoring data of the target object; and a monitoring device 20 of the target object as described in any of the above; wherein the monitoring data generating device 31 is connected to the monitoring device 20 of the target object.
In the above disclosed embodiment, the implementation manner of the monitoring data generating device is not limited, and reference may be made to the identification device provided in the above disclosed embodiment. In one possible implementation, the monitoring data generating device includes:
an image acquisition device for acquiring an image of a target object; the image recognition device is used for obtaining monitoring data of the target object through image recognition; the image recognition equipment is respectively connected with the image acquisition equipment and the monitoring device of the target object.
In the above-mentioned embodiment, implementation manners of the image acquisition device and the image recognition device are not limited in this embodiment, and may be flexibly selected according to actual situations, and further, in a possible implementation manner, the implementation manner of the image acquisition device may refer to the image acquisition module in the above-mentioned embodiment, and the implementation manner of the image recognition device may refer to the recognition module in the above-mentioned embodiment, which is not described herein again.
In one possible implementation, the system further includes: the target terminal is used for displaying the monitoring result of the monitoring device of the target object; and the target terminal is connected with the monitoring device of the target object.
The implementation manner of the target terminal is not limited, and has been illustrated in the above disclosed embodiments, and is not described in detail herein.
Application scenario example
How to effectively evaluate the health state of students is of great importance to school teaching management work. For the analysis of health behaviors of students in a campus, no universal analysis method exists at present.
Therefore, the monitoring method of the target object with higher reliability can be effectively applied to the health state analysis of students, and has higher practical value.
Fig. 4 is a schematic diagram illustrating an application example according to the present disclosure, and as shown in the drawing, the present disclosure provides a set of student health analysis system, which can implement analysis and display of the health status of students by the monitoring method of the target object provided in the above disclosed embodiment.
As shown in the figure, the system is mainly implemented by three modules. The first module serves as the input module and is used to acquire the emotional behavior recognition data of the students. In the application example of the present disclosure, the input module further includes two units: an image acquisition unit, implemented by video capture cameras deployed in the classroom, and a recognition unit, implemented by an artificial intelligence device with recognition capability (which may be referred to as a "box" in this application example). The box can parse the video data and automatically recognize the emotions and behaviors of the students using artificial intelligence technologies such as face detection, face recognition, behavior recognition, and expression recognition. In the application example of the present disclosure, the recognition data obtained by the box can be mainly classified into four categories, namely emotion, sitting posture, action, and face orientation, as follows:
the emotions may include 7 emotions, such as: happiness, calmness, surprise, disgust, anger, sadness, and fear; the sitting postures may include 5 sitting postures, such as: sitting, lying prone, standing, leaning against the wall, and leaning on the table; the actions may include 7 actions, such as: yawning, stretching, playing with a mobile phone, raising hands, speaking, taking notes, and resting the head on the hands; the face orientations may include 3 orientations, such as: head lowered, head raised, and head turned.
After acquiring the emotional behavior recognition data, the input module may upload the acquired emotional behavior recognition data to a second module in the system, that is, an analysis module, where the analysis module is configured to analyze the health status of the student through a health status analysis scheme in the module based on the emotional behavior recognition data. In an application example of the present disclosure, the health analysis implementation manner of the analysis module is specifically as follows:
first, the health state of the student is divided into three types: the energy state, the emotional state, and the learning state. The energy state mainly counts the student's emotions, sitting postures, actions, and other behaviors: positive behaviors (such as happiness, standing, and raising hands) are weighted positively, negative behaviors (such as yawning, lying prone, stretching, and resting the head on the hands) are weighted negatively, and the energy state value of the student is finally output. The emotional state mainly counts the student's emotional behaviors: positive behaviors (such as happiness) are weighted positively, negative behaviors (such as surprise, disgust, and sadness) are weighted negatively, and the emotional state value of the student is finally output. The learning state mainly counts behaviors such as the student's emotions, sitting postures, actions, and face orientation: positive behaviors (such as raising the head, sitting, standing, raising hands, and taking notes) are weighted positively, negative behaviors (such as lying prone, lowering the head, playing with a mobile phone, turning the head, and speaking) are weighted negatively, and the learning state value of the student is finally output. In addition, in order to ensure the reliability of the health state data, the analysis module may also take the student's behavior and posture change frequency into account; the specific manner of doing so is not limited in this application example, and in one example, a corresponding deduction may be applied to the student's learning state value when the behavior and posture change frequency exceeds a preset threshold.
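The positive/negative weighting per health state, plus the frequency-based deduction, could be sketched as follows. The behavior sets, weight values, frequency threshold, and penalty are all hypothetical; the disclosure does not specify them.

```python
# Hypothetical positive/negative behavior sets for each health state; the
# behavior names, weights, and the frequency penalty are illustrative only.
STATE_BEHAVIORS = {
    "energy":   ({"happy", "standing", "raising_hand"},
                 {"yawning", "lying_prone", "stretching", "resting_head"}),
    "emotion":  ({"happy"},
                 {"surprise", "disgust", "sadness"}),
    "learning": ({"head_up", "sitting", "standing", "raising_hand", "taking_notes"},
                 {"lying_prone", "head_down", "playing_phone", "speaking"}),
}

def state_value(counts, state, pos_weight=1.0, neg_weight=1.0,
                change_freq=0.0, freq_threshold=10.0, penalty=5.0):
    # Positive behaviors weighted positively, negative behaviors weighted
    # negatively; a deduction is applied when the behavior/posture change
    # frequency exceeds a preset threshold.
    positive, negative = STATE_BEHAVIORS[state]
    value = (pos_weight * sum(counts.get(b, 0) for b in positive)
             - neg_weight * sum(counts.get(b, 0) for b in negative))
    if change_freq > freq_threshold:
        value -= penalty
    return value
```

In practice the per-behavior weights would differ (a raised hand and a yawn need not cancel exactly), and the education team's guidance mentioned later would determine them.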
In addition, the analysis module can also judge whether the student's behavior is normal or abnormal according to the following criterion: a reference range of the student's normal state is obtained based on the student's historical health state data, and if the health state of the student is outside the normal range for a period of time, the student's state is considered abnormal for that period.
Furthermore, the analysis module can also analyze the health state of the class, where the health state of the class is the average of the health states of all students in the class. The basis for judging whether the class behavior is normal or abnormal follows the same logic as that for the students: a reference range of the normal state of the class is obtained based on the historical health state data of the class, and if the health state of the class is outside the normal range for a period of time, the state of the class is considered to be abnormal for that period.
In order to analyze the health state of the students more comprehensively, in addition to the three health states above, the analysis module also defines that student abnormalities may include abnormal sitting posture, not speaking in class, and continuously not leaving the seat. Abnormal sitting posture judges the student's sitting posture based on the proportion of the duration of passive sitting postures (such as lying prone) in class to the total class duration; not speaking in class judges the student's in-class speaking behavior based on the student's speaking situation in class; and continuously not leaving the seat means that the student does not leave the seat during breaks. In order to ensure the reliability of the data, all of these judgments on individual students are performed only when the overall situation of the class is normal.
Through the process, the analysis module can obtain the analysis result of the health of the student, and then the analysis result is sent to the third module of the system, namely the visual report form display module, so that the health analysis result of the student is displayed on the platform end in a visual mode for a teacher in the school to check and reference. The health reminding is mainly divided into class health reminding and student health reminding, and the real-time health reminding is generated by taking historical data as reference. While the health report is mainly divided into a class report and a student report, which are generated according to stages and are the summary and summarization of stage data.
In addition, the system can also set warning rules for the students' health states; once a situation specified by the warning rules occurs, the system automatically sends warning information to designated personnel. In the application example of the present disclosure, the system can set reminders for the energy state, learning state, and emotional state of the class and of the students improving or deteriorating. These reminder rules are default settings of the system, for example a certain state being outside the normal range of the previous three months for N consecutive days, with the same reminder content sent at intervals of at least one day. Furthermore, the system also allows the user to set reminder rules, including the content name of the reminder, the indexes or behaviors related to the reminder, the sending frequency of the reminder, and the reminder rules themselves.
By the student health analysis system provided in the application example of the present disclosure, hardware dependence in the student health analysis process can be reduced, since only cameras need to be deployed in the classroom to acquire video; manual dependence can be reduced, since essentially no intervention by teachers or school administrators is required; and cost dependence is reduced, avoiding excessive hardware and labor investment, so the cost is low. Meanwhile, the student health analysis system provided in this application example has a high emotional behavior recognition dimensionality, recognizing more than 20 emotional behavior actions with mature and reliable recognition results; the implementation of the health state analysis is determined under the guidance of a professional education team and has high reference value and reliability; and the system supports custom data analysis according to teachers' needs, offering high flexibility.
The system provided in the application example of the disclosure can be applied to student health analysis, and can also be applied to other places where people gather, such as hospitals, so that the emotion and behavior recognition data of those people can be analyzed and used for early warning.
It is understood that the above method embodiments of the present disclosure can be combined with each other to form combined embodiments without departing from the underlying principles and logic; for brevity, the details are not repeated in the present disclosure.
It will be understood by those skilled in the art that, in the above method, the order in which the steps are written does not imply a strict execution order or any limitation on the implementation; the specific execution order of the steps should be determined by their functions and possible inherent logic.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a volatile computer readable storage medium or a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; and a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
In practical applications, the memory may be a volatile memory, such as a Random Access Memory (RAM); or a non-volatile memory, such as a Read-Only Memory (ROM), a flash memory, a Hard Disk Drive (HDD), or a Solid-State Drive (SSD); or a combination of the above types of memories, and it provides instructions and data to the processor.
The processor may be at least one of an ASIC, a DSP, a DSPD, a PLD, an FPGA, a CPU, a controller, a microcontroller, or a microprocessor. It is understood that other electronic devices may also be used to implement the above processor functions, and the embodiments of the present disclosure are not particularly limited in this regard.
The electronic device may be provided as a terminal, server, or other form of device.
Based on the same technical concept of the foregoing embodiments, the embodiments of the present disclosure also provide a computer program, which when executed by a processor implements the above method.
Fig. 5 is a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure. For example, the electronic device 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, a personal digital assistant, or any other such terminal.
Referring to fig. 5, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 may include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 may include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides, and gestures on the touch panel. The touch sensors may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front-facing camera and/or a rear-facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or may have focusing and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in the position of the electronic device 800 or a component thereof, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communication. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 6 is a block diagram of an electronic device 1900 according to an embodiment of the disclosure. For example, the electronic device 1900 may be provided as a server. Referring to fig. 6, the electronic device 1900 includes a processing component 1922, which further includes one or more processors, and memory resources represented by a memory 1932 for storing instructions, e.g., application programs, executable by the processing component 1922. The application programs stored in the memory 1932 may include one or more modules, each corresponding to a set of instructions. Further, the processing component 1922 is configured to execute the instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute the computer-readable program instructions to implement various aspects of the present disclosure by utilizing state information of the computer-readable program instructions to personalize the electronic circuitry.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary rather than exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or improvements over technologies in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (10)

1. A method of monitoring a target object, comprising:
acquiring monitoring data of a target object;
according to the monitoring data, carrying out state evaluation on the target object to obtain a monitoring result;
wherein the monitoring data includes emotion recognition data and/or behavior recognition data obtained by image recognition of the target object.
2. The method according to claim 1, wherein said performing a state evaluation on the target object according to the monitoring data to obtain a monitoring result comprises:
acquiring the state to be evaluated of the target object;
counting the monitoring data corresponding to the state to obtain a statistical result;
and according to the statistical result, performing state evaluation on the target object to obtain a state evaluation result of the target object as the monitoring result.
3. The method according to claim 2, wherein the performing the state evaluation on the target object according to the statistical result to obtain a state evaluation result of the target object as the monitoring result comprises:
performing weighted summation on the statistical results with preset weights to obtain a weighted result;
converting the weighted result into a preset value domain, and taking the converted result as a state statistic value;
and averaging a plurality of state statistic values of the target object within a monitoring period, and taking the averaged result as the state evaluation result of the target object.
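The evaluation in this claim (weighted summation of statistical results, conversion into a preset value domain, and averaging over a monitoring period) can be sketched as follows. The sketch assumes each statistic is already normalized to [0, 1] and uses a linear mapping into the value domain; the function names and the normalization choice are illustrative assumptions, not part of the claim.

```python
def evaluate_state(statistics, weights, value_range=(0.0, 100.0)):
    """Weighted summation of the statistical results, mapped into a preset
    value domain; the mapped result is one state statistic value.

    Assumes each statistic is normalized to [0, 1] (an assumption of this
    sketch, not stated in the claim).
    """
    weighted = sum(s * w for s, w in zip(statistics, weights))
    low, high = value_range
    # Linear conversion of the weighted result into the preset value domain.
    return low + (weighted / sum(weights)) * (high - low)

def state_evaluation_result(per_slot_statistics, weights):
    """Average the state statistic values over a monitoring period to obtain
    the state evaluation result of the target object."""
    values = [evaluate_state(stats, weights) for stats in per_slot_statistics]
    return sum(values) / len(values)
```

For example, two equally weighted statistics of 1.0 and 0.5 map to 75.0 in the default [0, 100] domain, and averaging that slot with a second slot mapping to 50.0 yields a state evaluation result of 62.5.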
4. The method according to claim 2 or 3, wherein before the performing the state evaluation on the target object according to the statistical result to obtain the state evaluation result of the target object as the monitoring result, the method further comprises:
determining whether at least one statistical result exists for the target object within a monitoring period, and if so, performing the state evaluation on the target object according to the statistical result to obtain the state evaluation result of the target object as the monitoring result.
5. The method of any of claims 1 to 4, wherein the behavior recognition data comprises:
at least one of sitting posture identification data, motion identification data, and face orientation identification data.
6. The method according to any one of claims 1 to 5, wherein the state comprises:
at least one of an emotional state, an energy state, and a learning state.
7. A target object monitoring device, comprising:
the monitoring data acquisition module is used for acquiring monitoring data of a target object;
the monitoring result generation module is used for carrying out state evaluation on the target object according to the monitoring data to obtain a monitoring result;
wherein the monitoring data includes emotion recognition data and/or behavior recognition data obtained by image recognition of the target object.
8. A monitoring system for a target object, comprising:
the monitoring data generating device is used for generating monitoring data of the target object;
a target object monitoring device according to claim 7;
wherein the monitoring data generating device is connected to the target object monitoring device.
9. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the memory-stored instructions to perform the method of any of claims 1 to 6.
10. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 6.
CN201910863589.0A 2019-09-12 2019-09-12 Target object monitoring method and device, electronic equipment and storage medium Active CN110598632B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910863589.0A CN110598632B (en) 2019-09-12 2019-09-12 Target object monitoring method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910863589.0A CN110598632B (en) 2019-09-12 2019-09-12 Target object monitoring method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN110598632A true CN110598632A (en) 2019-12-20
CN110598632B CN110598632B (en) 2022-09-09

Family

ID=68859496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910863589.0A Active CN110598632B (en) 2019-09-12 2019-09-12 Target object monitoring method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN110598632B (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107169900A (en) * 2017-05-19 2017-09-15 南京信息工程大学 A kind of student listens to the teacher rate detection method
CN108121785A (en) * 2017-12-15 2018-06-05 华中师范大学 A kind of analysis method based on education big data
CN108694679A (en) * 2018-05-15 2018-10-23 北京中庆现代技术股份有限公司 A kind of method student's learning state detection and precisely pushed
CN108805009A (en) * 2018-04-20 2018-11-13 华中师范大学 Classroom learning state monitoring method based on multimodal information fusion and system
CN109034036A (en) * 2018-07-19 2018-12-18 青岛伴星智能科技有限公司 A kind of video analysis method, Method of Teaching Quality Evaluation and system, computer readable storage medium
CN109035089A (en) * 2018-07-25 2018-12-18 重庆科技学院 A kind of Online class atmosphere assessment system and method
CN109344682A (en) * 2018-08-02 2019-02-15 平安科技(深圳)有限公司 Classroom monitoring method, device, computer equipment and storage medium
CN109359606A (en) * 2018-10-24 2019-02-19 江苏君英天达人工智能研究院有限公司 A kind of classroom real-time monitoring and assessment system and its working method, creation method
CN109635725A (en) * 2018-12-11 2019-04-16 深圳先进技术研究院 Detect method, computer storage medium and the computer equipment of student's focus
CN109684984A (en) * 2018-12-19 2019-04-26 山东旭兴网络科技有限公司 A kind of expression recognition method suitable for classroom instruction
CN109727167A (en) * 2019-01-07 2019-05-07 北京汉博信息技术有限公司 A kind of teaching auxiliary system
CN109815795A (en) * 2018-12-14 2019-05-28 深圳壹账通智能科技有限公司 Classroom student's state analysis method and device based on face monitoring
CN109859078A (en) * 2018-12-24 2019-06-07 山东大学 A kind of student's Learning behavior analyzing interference method, apparatus and system
CN109919816A (en) * 2019-03-21 2019-06-21 北京旷视科技有限公司 Cource arrangement method and device, storage medium and electronic equipment based on data analysis
CN110059614A (en) * 2019-04-16 2019-07-26 广州大学 A kind of intelligent assistant teaching method and system based on face Emotion identification


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428686A (en) * 2020-04-14 2020-07-17 北京易华录信息技术股份有限公司 Student interest preference evaluation method, device and system
CN111539339A (en) * 2020-04-26 2020-08-14 北京市商汤科技开发有限公司 Data processing method and device, electronic equipment and storage medium
WO2021218194A1 (en) * 2020-04-26 2021-11-04 北京市商汤科技开发有限公司 Data processing method and apparatus, electronic device, and storage medium
JP2022534345A (en) * 2020-04-26 2022-07-29 ベイジン・センスタイム・テクノロジー・デベロップメント・カンパニー・リミテッド Data processing method and device, electronic equipment and storage medium
CN111680558A (en) * 2020-04-29 2020-09-18 北京易华录信息技术股份有限公司 Learning special attention assessment method and device based on video images
CN111601088A (en) * 2020-05-27 2020-08-28 大连成者科技有限公司 Sitting posture monitoring system based on monocular camera sitting posture identification technology
CN111601088B (en) * 2020-05-27 2021-12-21 大连成者科技有限公司 Sitting posture monitoring system based on monocular camera sitting posture identification technology
CN112528890A (en) * 2020-12-15 2021-03-19 北京易华录信息技术股份有限公司 Attention assessment method and device and electronic equipment
CN112528890B (en) * 2020-12-15 2024-02-13 北京易华录信息技术股份有限公司 Attention assessment method and device and electronic equipment
CN112842267A (en) * 2020-12-31 2021-05-28 深圳联达技术实业有限公司 Intelligent management system of sleep monitoring pillow
CN113065441A (en) * 2021-03-25 2021-07-02 开放智能机器(上海)有限公司 Image processing system and method based on edge device

Also Published As

Publication number Publication date
CN110598632B (en) 2022-09-09

Similar Documents

Publication Publication Date Title
CN110598632B (en) Target object monitoring method and device, electronic equipment and storage medium
CN112287844B (en) Student situation analysis method and device, electronic device and storage medium
KR20210144658A (en) Video processing method and apparatus, electronic device and storage medium
KR102039848B1 (en) Personal emotion-based cognitive assistance systems, methods of providing personal emotion-based cognitive assistance, and non-transitory computer readable media for improving memory and decision making
CN105960672B (en) Variable component deep neural network for Robust speech recognition
US10431116B2 (en) Orator effectiveness through real-time feedback system with automatic detection of human behavioral and emotional states of orator and audience
CN111310665A (en) Violation event detection method and device, electronic equipment and storage medium
Ouchi et al. Smartphone-based monitoring system for activities of daily living for elderly people and their relatives etc.
CN110013261B (en) Emotion monitoring method and device, electronic equipment and storage medium
JP2019058625A (en) Emotion reading device and emotion analysis method
US20160104385A1 (en) Behavior recognition and analysis device and methods employed thereof
US20210287561A1 (en) Lecture support system, judgement apparatus, lecture support method, and program
JP6715410B2 (en) Evaluation method, evaluation device, evaluation program, and evaluation system
US20210304339A1 (en) System and a method for locally assessing a user during a test session
CN113576451A (en) Respiration rate detection method and device, storage medium and electronic equipment
JP2009267621A (en) Communication apparatus
WO2021218194A1 (en) Data processing method and apparatus, electronic device, and storage medium
TWI642026B (en) Psychological and behavioral assessment and diagnostic methods and systems
JPWO2021044249A5 (en)
WO2017175447A1 (en) Information processing apparatus, information processing method, and program
US20200177537A1 (en) Control system and control method for social network
JP6891732B2 (en) Information processing equipment, information processing methods, and programs
US20210142047A1 (en) Salient feature extraction using neural networks with temporal modeling for real time incorporation (sentri) autism aide
Sukreep et al. Recognizing Falls, Daily Activities, and Health Monitoring by Smart Devices.
WO2019186676A1 (en) Device for behavior estimation and change detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant