CN110738120B - Environment data adjustment method and device, electronic equipment and medium - Google Patents


Info

Publication number
CN110738120B
CN110738120B (application CN201910875554.9A)
Authority
CN
China
Prior art keywords
pattern recognition
recognition model
grade
data
state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910875554.9A
Other languages
Chinese (zh)
Other versions
CN110738120A (en)
Inventor
陈颖聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology (Shenzhen) Co., Ltd.
Priority to CN201910875554.9A
Publication of CN110738120A
Application granted
Publication of CN110738120B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01D MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00 Measuring or testing not otherwise provided for
    • G01D21/02 Measuring two or more variables by means not covered by a single other subclass
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to data analysis and provides an environment data adjustment method comprising the following steps: collecting state data of a monitored object through a plurality of cameras; collecting environmental data of the environment where the monitored object is located through a plurality of sensors; constructing a first pattern recognition model and obtaining the state grade of the monitored object from the first pattern recognition model and the state data, the state grades including at least an abnormal grade; constructing a second pattern recognition model and obtaining the efficiency grade of the monitored object from the second pattern recognition model, the environmental data and the state grade, the efficiency grades including at least an inefficient grade; and, when the state grade is the abnormal grade or the efficiency grade is the inefficient grade, adjusting the environmental data until the state grade is no longer abnormal and the efficiency grade is no longer inefficient. The invention also provides an adjustment device, an electronic device and a storage medium. The invention combines working state with working efficiency to obtain optimal environmental data.

Description

Environment data adjustment method and device, electronic equipment and medium
Technical Field
The present invention relates to the field of data analysis technologies, and in particular, to an environmental data adjustment method and apparatus, an electronic device, and a medium.
Background
At present, an office is equipped with independently operating systems such as a fire alarm system, a monitoring system, an air-conditioning refrigeration system and a ventilation system, each of which can adjust the office environment. The state of an office worker, on the other hand, can only be obtained through human observation or subjective description.
The prior art therefore cannot objectively obtain the state of office staff, nor combine that state with the environmental state of the office to adjust the office environment.
Disclosure of Invention
In view of the foregoing, it is an object of the present invention to provide an environment data adjustment method and apparatus, an electronic device, and a storage medium that obtain optimal environmental data by combining working state with working efficiency.
To achieve the above object, the present invention provides an environment data adjustment method, including:
collecting state data of a monitored object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
collecting environmental data of the environment in which the monitored object is located by a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
constructing a first pattern recognition model, and obtaining a state grade of a monitoring object according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade;
constructing a second pattern recognition model, and obtaining the efficiency level of the monitored object according to the second pattern recognition model, the environment data and the state level, wherein the efficiency level at least comprises an inefficiency level;
and when the state grade is an abnormal grade or the efficiency grade is an inefficient grade, adjusting the environment data until the state grade is judged to be not the abnormal grade and the efficiency grade is not the inefficient grade.
In one embodiment, the facial feature data comprises feature data of at least one facial feature; the behavior feature data includes micro-behavior feature data and interactive behavior feature data.
In one embodiment, the first pattern recognition model is a neural network model comprising an input layer, a hidden layer, and an output layer;
the output value of the hidden layer of the neural network model is obtained by the formula
h_j = f( Σ_{i=1}^{n} ω_ij · x_i − a_j ),
wherein h_j is the output value of the j-th node of the hidden layer, f is the excitation function, f(s) = 1 / (1 + e^(−s)), ω_ij is the connection weight between the i-th node of the input layer and the j-th node of the hidden layer, x_i is the variable of the i-th node of the input layer, a_j is the threshold of the j-th node of the hidden layer, i = 1, 2, …, n, and n is the number of nodes of the input layer;
the output value of the output layer of the neural network model is obtained by the formula
o_k = Σ_{j=1}^{l} h_j · ω_jk − b_k,
wherein o_k is the output value of the k-th node of the output layer, ω_jk is the connection weight between the j-th node of the hidden layer and the k-th node of the output layer, b_k is the threshold of the k-th node of the output layer, j = 1, 2, …, l, l is the number of nodes of the hidden layer, k = 1, 2, …, m, and m is the number of nodes of the output layer.
In one embodiment, the training step of the first pattern recognition model or the second pattern recognition model comprises:
constructing a training set, and taking historical state data corresponding to different state levels as a first training set of a first pattern recognition model; taking the historical environment data and the historical state data corresponding to different efficiency levels as a second training set of a second pattern recognition model;
obtaining the first pattern recognition model according to the first training set and a first initial model;
and obtaining the second pattern recognition model according to the second training set and a second initial model.
In one embodiment, the training step of the first pattern recognition model comprises:
sequentially inputting historical state data of each sample of the first training set into a first pattern recognition model to obtain output values of each sample at each node of an output layer;
by the formulas
ω_ij′ = ω_ij + α · h_j · (1 − h_j) · x_i
ω_jk′ = ω_jk + α · h_j · e_k
a_j′ = a_j + α · h_j · (1 − h_j)
b_k′ = b_k + e_k
sequentially updating the parameters of the first pattern recognition model in combination with the error between the output value and the actual value at each node of the output layer for each sample, wherein ω_ij, ω_jk, a_j and b_k are the neural network parameters before updating, ω_ij′, ω_jk′, a_j′ and b_k′ are the updated parameters, α is the learning rate, and e_k is the error at the k-th node of the output layer;
sequentially inputting the historical state data of each sample of the first training set into the first pattern recognition model with updated parameters, and obtaining the training error by the formula
MSE = (1/N) · Σ_{Z=1}^{N} Σ_{k=1}^{m} ( ŷ_k^Z − y_k^Z )²,
wherein N is the number of samples of the first training set, ŷ_k^Z is the predicted value at the k-th node of the output layer for sample Z, and y_k^Z is the actual value at the k-th node for sample Z;
repeating the above steps for a set number of training iterations to obtain successive training errors of the first pattern recognition model;
and ending training when the change in the training error is smaller than a set threshold.
In one embodiment, each monitoring object has a corresponding weight, and the step of deriving the status level of the monitoring object from the first pattern recognition model and the status data includes:
generating weighted state data according to the weight of the monitoring object and the state data of the monitoring object;
and obtaining the state grade of the monitoring object according to the first pattern recognition model and the weighted state data.
In one embodiment, each monitoring object has a corresponding weight, and the step of deriving an efficiency level of the monitoring object from the second pattern recognition model, the environmental data, and the status level includes:
grouping the monitored objects according to set conditions, wherein the sum of the weights of the monitored objects in each group is 1;
acquiring the state grade of each group according to the weight of the monitoring object of each group and the state grade thereof;
and obtaining the efficiency level of each group according to the second pattern recognition model and the state level of each group and the environment data thereof.
In addition, in order to achieve the above object, the present invention also provides an electronic device including a memory and a processor, wherein the memory stores an environment data adjustment program, and the environment data adjustment program implements the above environment data adjustment method when executed by the processor.
In addition, in order to achieve the above object, the present invention also provides a computer-readable storage medium including an environment data adjustment program, which when executed by a processor, implements the above-described environment data adjustment method.
In addition, in order to achieve the above object, the present invention also provides an environmental data adjustment device, including:
the system comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module acquires state data of a monitored object through a plurality of cameras, and the state data comprises facial feature data and behavior feature data;
the second acquisition module acquires environmental data of the environment where the monitored object is located through a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
the first construction module is used for constructing a first pattern recognition model, and obtaining the state grade of the monitoring object according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade;
the second construction module is used for constructing a second pattern recognition model, and obtaining the efficiency grade of the monitoring object according to the second pattern recognition model, the environment data and the state grade, wherein the efficiency grade at least comprises a low efficiency grade;
and the adjusting module is used for adjusting the environment data until judging that the state grade is not an abnormal grade and the efficiency grade is not an inefficient grade when the state grade is an abnormal grade or the efficiency grade is an inefficient grade.
According to the environment data adjustment method and device, the electronic equipment and the computer readable storage medium, the working state of the staff can be intelligently estimated by utilizing the first mode identification model through the acquisition of the staff state data by the camera and the acquisition of the environment data by the sensor, objective estimation of the working efficiency of the staff can be realized by utilizing the second mode identification model, the requirements of the staff are monitored in real time, a more comfortable working environment is provided, and the working efficiency of the staff is improved.
Drawings
FIG. 1 is a flow chart of a method for adjusting environmental data according to the present invention;
FIG. 2 is a schematic view of an application environment of a preferred embodiment of the method for adjusting environmental data according to the present invention;
fig. 3 is a schematic diagram of an environment data adjustment device according to the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Specific embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a flowchart of an environment data adjustment method according to the present invention, as shown in fig. 1, the environment data adjustment method includes:
step S1, collecting state data of a monitoring object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
s2, acquiring environmental data of the environment where the monitored object is located by a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
step S3, a first pattern recognition model is constructed, a state grade of the monitoring object is obtained according to the first pattern recognition model and the state data, the state grade at least comprises an abnormal grade, for example, the state grade at least comprises unhappy, normal and happy in sequence from low to high, and the unhappy can be used as the abnormal grade;
step S4, a second pattern recognition model is constructed, and the efficiency level of the monitoring object is obtained according to the second pattern recognition model, the environment data and the state level, wherein the efficiency level at least comprises an inefficiency level, for example, the efficiency level at least comprises inefficiency, general and high efficiency in sequence from low to high;
and step S5, when the state grade is the abnormal grade or the efficiency grade is the inefficient grade, adjusting the environmental data until the state grade is no longer the abnormal grade and the efficiency grade is no longer the inefficient grade.
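The flow of steps S1 to S5 can be sketched as a simple control loop. This is an illustrative sketch only: the helper callables (collect_state, collect_env, state_model, efficiency_model, adjust) and the grade labels are assumptions for the demonstration, not interfaces defined by this disclosure.

```python
# Hedged sketch of the S1-S5 adjustment loop described above.
ABNORMAL = "unhappy"          # example abnormal state grade
INEFFICIENT = "inefficient"   # example inefficient efficiency grade

def adjustment_loop(collect_state, collect_env,
                    state_model, efficiency_model, adjust):
    """Repeat S1-S5 until the state grade is not abnormal
    and the efficiency grade is not inefficient."""
    while True:
        state_data = collect_state()                         # S1: cameras
        env_data = collect_env()                             # S2: sensors
        state_grade = state_model(state_data)                # S3
        eff_grade = efficiency_model(env_data, state_grade)  # S4
        if state_grade != ABNORMAL and eff_grade != INEFFICIENT:
            return env_data                                  # acceptable environment
        adjust(env_data)                                     # S5: tune temperature etc.
```

In practice `adjust` would drive the air-conditioning, lighting and ventilation systems mentioned in the Background, and the loop would run with a sampling delay between iterations.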
Preferably, the facial feature data comprise feature data of at least one facial feature. The behavior feature data comprise micro-behavior feature data and interactive behavior feature data. The micro-behavior feature data include one or more of eyebrow, eye and nose features, such as eyebrow shape, degree of eye opening and nose shape. The interactive behavior feature data include body-motion features and other behavior features: the body-motion features include one or more of whether the legs are shaking, the frequency of leg shaking, whether the head is shaking, and the amplitude and frequency of head shaking; the other behavior features include one or more of the frequency of restroom visits, the frequency of tea breaks, and the tone and mood of speech.
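The state data above can be flattened into one numeric feature vector per observation before being fed to the first pattern recognition model. The field names and encodings below are hypothetical; the patent does not prescribe a concrete representation.

```python
# Minimal sketch: concatenate micro-behaviour, body-motion and other
# behaviour features into a single numeric vector (booleans become 0/1).
def build_state_vector(sample: dict) -> list:
    micro = [sample["eyebrow_arc"], sample["eye_openness"], sample["nose_shape"]]
    body = [float(sample["leg_shaking"]), sample["leg_shake_freq"],
            float(sample["head_shaking"]), sample["head_shake_amplitude"]]
    other = [sample["restroom_freq"], sample["tea_break_freq"], sample["speech_tone"]]
    return micro + body + other
```

The resulting vector would serve as the x_i inputs of the neural network described below.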
According to the environment data adjustment method, optimal parameters can be obtained through multiple data acquisition channels followed by big data analysis, and the working efficiency of staff is improved through environmental adjustment of the office. Specifically: the data acquisition system (sensors, cameras and the like) obtains the working state of each employee through big data analysis of facial expressions, actions and the like; combined with environmental data such as the current temperature and humidity of the office, the optimal illumination intensity, temperature and the like for each employee are intelligently calculated, energy is used rationally (avoiding waste such as lights left on after people leave), and the environment is adjusted intelligently so that employees work in their optimal working environment with improved efficiency; and through data accumulated over a period of time, the system can analyze and evaluate the working state of each employee, providing a data basis for supervisors' analysis and evaluation.
In an alternative embodiment, the facial feature data may be the degree of eyebrow raising or frowning, the smile of the lips, and so on, and the state grade may be classified by a statistical algorithm, with manual correction, into three grades: unhappy, standard and happy. For example, the camera may collect the curvature of the eyebrows and lips under the standard grade, and the unhappy and happy grades may be defined by set curvature changes relative to the standard grade.
In an alternative embodiment, the state data are normalized before step S3 to eliminate differences arising from the different behavioral habits of different persons.
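One plausible reading of this normalization step is per-employee z-scoring, so each feature is measured against that employee's own historical baseline. The patent says only that the data are normalized; z-scoring is an assumption.

```python
# Per-employee z-score normalization: each feature of the current
# observation is centred and scaled by that employee's own history.
from statistics import mean, pstdev

def normalize_per_employee(history, current):
    """history: past feature vectors of one employee; current: new vector."""
    normalized = []
    for value, column in zip(current, zip(*history)):
        mu, sigma = mean(column), pstdev(column)
        normalized.append((value - mu) / sigma if sigma else 0.0)
    return normalized
```

A feature that never varies for an employee (sigma = 0) is mapped to 0, i.e. treated as uninformative for that person.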
In an alternative embodiment, in step S3, the first pattern recognition model is a neural network model, preferably a BP neural network model, comprising an input layer, a hidden layer, and an output layer. The output values of the hidden layer and the output layer of the neural network model are obtained by formulas (1) and (2):
h_j = f( Σ_{i=1}^{n} ω_ij · x_i − a_j ) (1)
o_k = Σ_{j=1}^{l} h_j · ω_jk − b_k (2)
wherein h_j is the output value of the j-th node of the hidden layer; f is the excitation function, f(s) = 1 / (1 + e^(−s)); ω_ij is the connection weight between the i-th node of the input layer and the j-th node of the hidden layer; x_i is the variable of the i-th node of the input layer; a_j is the threshold of the j-th node of the hidden layer; i = 1, 2, …, n, where n is the number of nodes of the input layer; o_k is the output value of the k-th node of the output layer; ω_jk is the connection weight between the j-th node of the hidden layer and the k-th node of the output layer; b_k is the threshold of the k-th node of the output layer; j = 1, 2, …, l, where l is the number of nodes of the hidden layer; and k = 1, 2, …, m, where m is the number of nodes of the output layer.
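Formulas (1) and (2) can be transcribed directly: a sigmoid hidden layer with thresholds a_j and a linear output layer with thresholds b_k. Any concrete weight values used with this sketch are illustrative; the patent fixes only the formulas.

```python
# Forward pass of the BP network in formulas (1) and (2).
import math

def forward(x, w_in, a, w_out, b):
    """x: input vector; w_in[i][j]: weight input i -> hidden j;
    a: hidden thresholds; w_out[j][k]: hidden j -> output k; b: output thresholds."""
    f = lambda s: 1.0 / (1.0 + math.exp(-s))              # excitation function
    h = [f(sum(x[i] * w_in[i][j] for i in range(len(x))) - a[j])
         for j in range(len(a))]                          # formula (1)
    o = [sum(h[j] * w_out[j][k] for j in range(len(h))) - b[k]
         for k in range(len(b))]                          # formula (2)
    return h, o
```

With a single zero-weight input node, zero threshold, and unit output weight, the hidden output is f(0) = 0.5 and the network output is 0.5 minus the output threshold.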
In step S4, the second pattern recognition model may be a classification model such as a neural network model, an LSTM model, or the like.
Preferably, the first pattern recognition model and the second pattern recognition model may be combined into one pattern recognition model, such as a single neural network model in which one portion of the nodes takes the state data as input and outputs the state grade, while another portion takes the environmental data and the state grade as input and outputs the efficiency grade.
Because faces differ from person to person, separate first and second pattern recognition models can be established for different employees, as follows:
constructing a training set, and taking historical state data corresponding to different state levels as a first training set of a first pattern recognition model; taking the historical environment data and the historical state data corresponding to different efficiency levels as a second training set of a second pattern recognition model;
obtaining the first pattern recognition model according to the first training set and a first initial model;
and obtaining the second pattern recognition model according to the second training set and a second initial model.
The above training process will be described by taking a training step of a first pattern recognition model as an example, where the training step of the first pattern recognition model includes:
sequentially inputting historical state data of each sample of the first training set into a first pattern recognition model to obtain output values of each sample at each node of an output layer;
sequentially updating the parameters of the first pattern recognition model through formulas (3) to (6), in combination with the error between the output value and the actual value at each node of the output layer for each sample:
ω_ij′ = ω_ij + α · h_j · (1 − h_j) · x_i (3)
ω_jk′ = ω_jk + α · h_j · e_k (4)
a_j′ = a_j + α · h_j · (1 − h_j) (5)
b_k′ = b_k + e_k (6)
wherein ω_ij, ω_jk, a_j and b_k are the neural network parameters before updating, ω_ij′, ω_jk′, a_j′ and b_k′ are the updated parameters, α is the learning rate, and e_k is the error at the k-th node of the output layer;
sequentially inputting the historical state data of each sample of the first training set into the first pattern recognition model with updated parameters, and obtaining the training error by formula (7):
MSE = (1/N) · Σ_{Z=1}^{N} Σ_{k=1}^{m} ( ŷ_k^Z − y_k^Z )² (7)
wherein MSE is the training error, N is the number of samples of the first training set, ŷ_k^Z is the predicted value at the k-th node of the output layer for sample Z, and y_k^Z is the actual value at the k-th node for sample Z;
repeating the above steps for a set number of training iterations to obtain successive training errors, and ending training when the change in the training error is smaller than the set threshold.
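The training procedure around formulas (3) to (7) can be sketched as the loop below: per-sample parameter updates followed by an MSE check, stopping when the change in training error falls below the set threshold. The callables `update` (formulas (3) to (6)) and `mse` (formula (7)) are assumed to be supplied; the default iteration cap and tolerance are invented for the sketch.

```python
# Sketch of the training loop: update per sample, then test convergence
# on the change in MSE, as described in the text above.
def train(model, samples, update, mse, max_epochs=1000, tol=1e-4):
    """samples: list of (state_data, actual_values) pairs."""
    prev_error = float("inf")
    for _ in range(max_epochs):
        for x, y in samples:
            update(model, x, y)            # apply formulas (3)-(6)
        error = mse(model, samples)        # formula (7)
        if abs(prev_error - error) < tol:  # change smaller than set threshold
            break
        prev_error = error
    return model
```

Note that convergence is judged on the *change* in error, matching the patent's stopping criterion, rather than on the error's absolute value.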
In most companies several employees share one office, so it is difficult for one set of environmental data to suit every employee. Preferably, therefore, each monitored object has a corresponding weight, and the step of obtaining the state grade of the monitored object from the first pattern recognition model and the state data includes:
generating weighted state data according to the weight of the monitoring object and the state data of the monitoring object;
and obtaining the state grade of the monitoring object according to the first pattern recognition model and the weighted state data.
For example, in step S3 different employees may be given different weights; the state data of the employees in the office, each multiplied by its weight, are summed and input to the first pattern recognition model as the state data of the office, yielding the office's state grade, with the state data of absent employees taken as 0. When the first pattern recognition model is trained, the working state of the highest-weighted employees may be taken as the working state of the entire office, or the working state of employees holding a set proportion of the weight may be taken as the working state of the entire office. For example, an office may have four ordinary employees with a weight of 0.05 each, two group leaders with a weight of 0.1 each, and two bosses with a weight of 0.3 each; a weighted normal-state value of 0.5 then corresponds to the normal state of the entire office, a value above 0.5 to the happy state of the office, and a value below 0.5 to the unhappy state of the office.
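The weighting can be illustrated as follows. The employee names and the single-number state values are invented for the demonstration; only the weight scheme (four employees at 0.05, two group leaders at 0.1, two bosses at 0.3, summing to 1) follows the example above.

```python
# Weighted office-level state data: present employees' state values are
# multiplied by their weights and summed; absent employees contribute 0.
WEIGHTS = {"emp1": 0.05, "emp2": 0.05, "emp3": 0.05, "emp4": 0.05,
           "lead1": 0.1, "lead2": 0.1, "boss1": 0.3, "boss2": 0.3}

def office_state_data(weights, state_data, present):
    return sum(weights[e] * state_data.get(e, 0.0) for e in present)
```

With everyone present and every state value at 1.0, the office-level value is the sum of the weights, i.e. 1.0, matching the 0.5 threshold interpretation above (values above 0.5 read as happy, below as unhappy).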
In step S4, the environmental data and the state grade of the office are input into the second pattern recognition model to obtain the efficiency grade of the office. When the second pattern recognition model is trained, the efficiency grade of the highest-weighted employees may be taken as the efficiency grade of the entire office, or the efficiency grade of employees holding a set proportion of the weight may be taken as that of the entire office. With the same weights as above (four employees at 0.05 each, two group leaders at 0.1 each, two bosses at 0.3 each), a weighted general-efficiency value of 0.5 corresponds to the general state of the entire office, a value above 0.5 to the efficient state of the office, and a value below 0.5 to the inefficient state of the office.
Preferably, in step S3, the difference between an employee's current state data and the state data of that employee's normal working state, multiplied by the employee's weight, is used as that employee's weighted state data, and the sum over all employees is used as the state data of the office, reducing errors caused by the different dispositions and habits of different employees.
In an alternative embodiment, each monitoring object has a corresponding weight, and the step of deriving the efficiency level of the monitoring object from the second pattern recognition model, the environmental data and the status level comprises:
grouping the monitored objects according to set conditions, wherein the sum of the weights of the monitored objects in each group is 1;
acquiring the state grade of each group according to the weight of the monitoring object of each group and the state grade thereof;
and obtaining the efficiency level of each group according to the second pattern recognition model and the state level of each group and the environment data thereof.
For example, in step S3, different weights are given to the status levels of different employees, the weight of the employees not in the office is 0, and the sum of the weights of the employees in the office is 1;
in step S4, the sum of the products of each in-office employee's weight and state grade is used as the state grade of the office, and the office's state grade and environmental data are input into the second pattern recognition model to obtain the office's efficiency grade; when the second pattern recognition model is trained, the efficiency grade of the highest-weighted employees, or of employees holding a set proportion of the weight, may be taken as the efficiency grade of the entire office.
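The grouped computation above can be sketched as follows: within each group the weights sum to 1 and the group's state grade is the weighted sum of member grades, which is then paired with the environmental data for the second model. The numeric grade encoding (0 = unhappy, 1 = normal, 2 = happy) is an illustrative assumption.

```python
# Group-level state grade as a weighted sum of member state grades,
# enforcing the constraint that each group's weights sum to 1.
def group_state_grade(members):
    """members: list of (weight, state_grade) pairs for one group."""
    total_weight = sum(w for w, _ in members)
    if abs(total_weight - 1.0) > 1e-9:
        raise ValueError("group weights must sum to 1")
    return sum(w * grade for w, grade in members)
```

A group of one happy member (grade 2, weight 0.5) and one unhappy member (grade 0, weight 0.5) thus lands at grade 1.0, i.e. normal.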
In the above embodiments, the state grade, efficiency grade and effective working time of each employee (time at or above the general efficiency grade counts as effective working time) may be obtained from the first and second pattern recognition models to generate a work report for each employee. Parameters such as the number of interactions with the employee and the effective state time (time at or above the normal state grade) may be added to the report. The report may be sent to the employee's supervisor to help the supervisor discover abnormal working states in time, and counselling or conversation can be arranged during abnormal periods to increase employee wellbeing.
The invention provides an environment data adjustment method which is applied to electronic equipment 1. Referring to fig. 2, an application environment of a preferred embodiment of the method for adjusting environmental data according to the present invention is shown.
In this embodiment, the electronic device 1 may be a terminal device with computing capability, such as a server, a mobile phone, a tablet computer, a portable computer, or a desktop computer.
The electronic device 1 comprises a plurality of cameras 11, a plurality of sensors 12, a network interface 13, a memory 14, a processor 15 and a communication bus 16, wherein:
the plurality of cameras 11 collect status data of the monitoring object, including facial feature data and behavioral feature data, which may be disposed near each station of the office.
A plurality of sensors 12 are provided in an office (e.g., a wall, a roof, etc. of the office) for measuring environmental data of the office, including one or more of temperature, humidity, light brightness, and ventilation level, that is, the plurality of sensors 12 may include a temperature sensor, a humidity sensor, a brightness sensor, and a ventilation sensor.
The network interface 13, which may optionally comprise a standard wired interface, a wireless interface (e.g. WI-FI interface), is in wireless connection with the plurality of cameras 11 and the plurality of sensors 12, transfers status data collected by the plurality of cameras and environmental data measured by the plurality of sensors to the memory 14 and the processor 15, and may also be used to establish a communication connection between the electronic device 1 and other electronic clients.
The readable storage medium of the memory 14 is generally used for storing the environment data adjustment program 10 and the like installed in the electronic device 1. The memory 14 may also be used to temporarily store data that has been output or is to be output.
The processor 15 may in some embodiments be a central processing unit (CPU), microprocessor or other data processing chip for running program code stored in the memory 14 and for processing the state data acquired by the plurality of cameras and the environmental data measured by the plurality of sensors, for example by executing the environment data adjustment program 10.
The communication bus 16 is used to enable connection communication between the components of the network interface 13, the memory 14 and the processor 15.
The memory 14 includes at least one type of readable storage medium. The at least one type of readable storage medium may be a non-volatile storage medium such as flash memory, a hard disk, a multimedia card, a card memory, etc. In some embodiments, the readable storage medium may be an internal storage unit of the electronic device 1, such as a hard disk of the electronic device 1. In other embodiments, the readable storage medium may also be an external memory of the electronic device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the electronic device 1.
Fig. 1 shows only an electronic device 1 with components 11-16, but it is understood that not all shown components are required to be implemented, and that more or fewer components may be implemented instead.
Optionally, the electronic device 1 may further comprise a user interface, which may comprise an input unit such as a keyboard, a voice input device such as a microphone with voice recognition functionality, and a voice output device such as a speaker or headset; the user interface may optionally comprise a standard wired interface and a wireless interface.
Alternatively, the electronic device 1 may also comprise a display, which may also be referred to as a display screen or display unit.
In some embodiments, the display may be an LED display, a liquid crystal display, a touch-control liquid crystal display, an Organic Light-Emitting Diode (OLED) touch device, or the like. The display is used for displaying information processed in the electronic device 1 and for displaying a visualized user interface.
Optionally, the electronic device 1 further comprises a touch sensor. The area provided by the touch sensor for the user to perform a touch operation is referred to as a touch area. Further, the touch sensors described herein may be resistive touch sensors, capacitive touch sensors, and the like. The touch sensor may include not only a contact type touch sensor but also a proximity type touch sensor. Furthermore, the touch sensor may be a single sensor or may be a plurality of sensors arranged in an array, for example.
Optionally, the electronic device 1 may further include logic gates, sensors, audio circuits, etc., which are not described herein.
In the embodiment of the apparatus shown in fig. 1, the memory 14, as a kind of computer storage medium, may include an operating system and the environment data adjustment program 10; the processor 15 implements the following steps when executing the environment data adjustment program 10 stored in the memory 14:
collecting state data of a monitored object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
collecting environmental data of the environment in which the monitored object is located by a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
constructing a first pattern recognition model, and obtaining a state grade of a monitoring object according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade;
constructing a second pattern recognition model, and obtaining the efficiency level of the monitored object according to the second pattern recognition model, the environment data and the state level, wherein the efficiency level at least comprises an inefficiency level;
and when the state grade is an abnormal grade or the efficiency grade is an inefficient grade, adjusting the environment data until the state grade is judged to be not the abnormal grade and the efficiency grade is not the inefficient grade.
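For illustration only, the adjust-until-normal loop described in the steps above can be sketched as follows; the grade labels, the toy classifiers standing in for the two pattern recognition models, and the one-degree adjustment policy are illustrative assumptions, not part of the patent text.

```python
# Hypothetical sketch of the control loop: classify state and efficiency,
# then keep adjusting the environment until neither grade is abnormal/inefficient.
ABNORMAL, NORMAL = "abnormal", "normal"
INEFFICIENT, EFFICIENT = "inefficient", "efficient"

def regulate(env, classify_state, classify_eff, adjust, max_steps=100):
    """Repeatedly adjust `env` until the state grade is not abnormal
    and the efficiency grade is not inefficient (bounded by max_steps)."""
    for _ in range(max_steps):
        state = classify_state(env)      # first model: state grade
        eff = classify_eff(env, state)   # second model: efficiency grade
        if state != ABNORMAL and eff != INEFFICIENT:
            break                        # both grades acceptable: stop adjusting
        adjust(env)                      # e.g., raise temperature by 1 degree
    return env

# Toy demo: the office is graded abnormal while it is colder than 22 degrees.
office = {"temperature": 18}
classify_state = lambda e: ABNORMAL if e["temperature"] < 22 else NORMAL
classify_eff = lambda e, s: EFFICIENT if s == NORMAL else INEFFICIENT
raise_temp = lambda e: e.update(temperature=e["temperature"] + 1)
regulate(office, classify_state, classify_eff, raise_temp)
```

In this toy run the loop raises the temperature one degree at a time until both grades clear, then stops.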
In other embodiments, the environment data adjustment program 10 may be further divided into one or more modules, which are stored in the memory 14 and executed by the processor 15 to complete the present invention. A module in the present invention refers to a series of computer program instruction segments capable of performing a specified function.
Fig. 3 is a schematic diagram of an environmental data adjustment device according to the present invention, as shown in fig. 3, the environmental data adjustment device includes:
the first acquisition module 110 acquires state data of a monitoring object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
the second acquisition module 120 acquires environmental data of an environment in which the monitoring object is located, including one or more of temperature, humidity, brightness, and ventilation level, through a plurality of sensors;
the first construction module 130 constructs a first pattern recognition model and obtains a state grade of the monitored object according to the first pattern recognition model and the state data, where the state grade at least includes an abnormal grade; the first pattern recognition model may be a neural network structure and/or a classification model, for example, a BP neural network, logistic regression, a decision tree, a random forest, a support vector machine, and the like;
the second construction module 140 constructs a second pattern recognition model, and obtains an efficiency level of the monitored object according to the second pattern recognition model, the environmental data and the state level, wherein the efficiency level at least comprises an inefficiency level;
the adjusting module 150 adjusts the environmental data until the state grade is determined not to be an abnormal grade and the efficiency grade not to be an inefficient grade. For example, when the state grade or the efficiency grade is lower than normal: if an employee is observed putting on clothing or rubbing an arm, an instruction to raise the temperature may be sent to the air conditioner, e.g., raising the indoor temperature by 1° or by a set increment; if an employee is observed covering the nose, an instruction to turn on ventilation or raise the ventilation level may be sent to a ventilation device (e.g., an air conditioner).
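For illustration, the behaviour-to-command examples above (putting on clothing or rubbing an arm → raise the temperature; covering the nose → increase ventilation) can be captured in a small rule table; the behaviour labels and device names below are hypothetical, not taken from the patent.

```python
# Hypothetical rule table mapping detected behaviours to device commands.
RULES = {
    "adds_clothing": ("air_conditioner", "temperature", +1),
    "rubs_arms":     ("air_conditioner", "temperature", +1),
    "covers_nose":   ("ventilator", "ventilation_level", +1),
}

def commands_for(behaviours):
    """Translate detected behaviours into (device, setting, delta) commands,
    silently skipping behaviours with no associated rule."""
    return [RULES[b] for b in behaviours if b in RULES]
```

A behaviour with no rule (e.g., yawning) simply produces no command, so the adjusting module only acts on the cues it recognizes.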
In addition, an embodiment of the present invention also proposes a computer-readable storage medium, in which an environment data adjustment program is included, the environment data adjustment program implementing the following steps when executed by a processor:
step S1, collecting state data of a monitoring object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
s2, acquiring environmental data of the environment where the monitored object is located by a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
step S3, a first pattern recognition model is constructed, and the state grade of the monitored object is obtained according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade;
s4, constructing a second pattern recognition model, and obtaining the efficiency level of the monitored object according to the second pattern recognition model, the environment data and the state level, wherein the efficiency level at least comprises a low-efficiency level;
and S5, when the state grade is an abnormal grade or the efficiency grade is an inefficient grade, adjusting the environment data until the state grade is judged to be not the abnormal grade and the efficiency grade is not the inefficient grade.
The embodiment of the computer readable storage medium of the present invention is substantially the same as the embodiment of the above-mentioned environmental data adjustment method and apparatus, and the electronic device, and will not be described herein.
The environment data adjustment method and apparatus, electronic device, and computer-readable storage medium can provide a more comfortable working environment and improve employees' working efficiency; they can monitor personnel needs in real time to avoid wasting resources; and they can intelligently evaluate employees' working state, providing scientific data analysis.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, apparatus, article, or method that comprises the element.
The foregoing embodiment numbers of the present invention are merely for description and do not represent the relative merits of the embodiments. From the above description of the embodiments, it will be clear to those skilled in the art that the above-described method may be implemented by means of software plus a necessary general hardware platform, or of course by means of hardware, though in many cases the former is preferred. Based on such understanding, the technical solution of the present invention may be embodied essentially, or in the part contributing to the prior art, in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) as described above, comprising several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The foregoing description is only of the preferred embodiments of the present invention, and is not intended to limit the scope of the invention, but rather is intended to cover any equivalents of the structures or equivalent processes disclosed herein or in the alternative, which may be employed directly or indirectly in other related arts.

Claims (6)

1. An environmental data adjustment method, comprising:
collecting state data of a monitored object through a plurality of cameras, wherein the state data comprises facial feature data and behavior feature data;
collecting environmental data of the environment in which the monitored object is located by a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
constructing a first pattern recognition model, and obtaining a state grade of a monitoring object according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade; wherein each monitoring object has a corresponding weight, and the step of obtaining the state grade of the monitoring object according to the first pattern recognition model and the state data includes: generating weighted state data according to the weight of the monitoring object and the state data of the monitoring object; obtaining a state grade of the monitoring object according to the first pattern recognition model and the weighted state data;
constructing a second pattern recognition model, and obtaining the efficiency level of the monitored object according to the second pattern recognition model, the environment data and the state level, wherein the efficiency level at least comprises an inefficiency level;
when the state grade is an abnormal grade or the efficiency grade is an inefficient grade, adjusting the environment data until the state grade is judged not to be the abnormal grade and the efficiency grade not to be the inefficient grade; the first pattern recognition model is a neural network model and comprises an input layer, a hidden layer and an output layer;
by the formula

h_j = f( Σ_{i=1..n} ω_ij · x_i − a_j )

deriving the output value of the hidden layer of the neural network model, where h_j is the output value of the j-th node of the hidden layer, f is the excitation function, ω_ij is the connection weight of the i-th node of the input layer and the j-th node of the hidden layer, x_i is the variable of the i-th node of the input layer, a_j is the threshold of the j-th node of the hidden layer, and i = 1, 2 … n, n being the number of nodes of the input layer;
by the formula

o_k = Σ_{j=1..l} h_j · ω_jk − b_k

deriving the output value of the output layer of the neural network model, where o_k is the output value of the k-th node of the output layer, ω_jk is the connection weight of the j-th node of the hidden layer and the k-th node of the output layer, b_k is the threshold of the k-th node of the output layer, j = 1, 2 … l, l being the number of nodes of the hidden layer, and k = 1, 2 … m, m being the number of nodes of the output layer;
the training step of the first pattern recognition model or the second pattern recognition model comprises the following steps:
constructing a training set, and taking historical state data corresponding to different state levels as a first training set of a first pattern recognition model; taking the historical environment data and the historical state data corresponding to different efficiency levels as a second training set of a second pattern recognition model;
obtaining the first pattern recognition model according to the first training set and a first initial model;
obtaining the second pattern recognition model according to the second training set and a second initial model;
the training step of the first pattern recognition model comprises the following steps:
sequentially inputting historical state data of each sample of the first training set into a first pattern recognition model to obtain output values of each sample at each node of an output layer;
by the formulas

ω_ij' = ω_ij + α · h_j · (1 − h_j) · x_i
ω_jk' = ω_jk + α · h_j · e_k
a_j' = a_j + α · h_j · (1 − h_j)
b_k' = b_k + e_k

sequentially updating the parameters of the first pattern recognition model in combination with the error between the output value and the actual value at each node of the output layer for each sample, wherein ω_ij, ω_jk, a_j and b_k are the neural network parameters before updating, ω_ij', ω_jk', a_j' and b_k' are the neural network parameters after updating, e_k is the error at the k-th node of the output layer, and α is the update step in the parameter updating process;
sequentially inputting the historical state data of each sample of the first training set into the first pattern recognition model with updated parameters, and by the formula

E = (1/N) · Σ_{Z=1..N} Σ_{k=1..m} ( ŷ_{Z,k} − y_{Z,k} )²

obtaining the training error, wherein N is the number of samples of the first training set, ŷ_{Z,k} is the predicted value of sample Z at the k-th node of the output layer, and y_{Z,k} is the actual value of sample Z at the k-th node;
repeating the above steps for a set number of consecutive training iterations to obtain the training errors of the first pattern recognition model; and ending training when the change of the training error is smaller than a set threshold value.
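For illustration only (not part of the claimed subject matter), a minimal sketch of the forward pass and per-sample parameter update described in claim 1. It assumes a sigmoid excitation function, which the claim does not fix, and follows the update formulas exactly as printed.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(x, w_in, a, w_out, b, f=sigmoid):
    """Forward pass: h_j = f(sum_i w_in[i][j]*x_i - a_j),
    o_k = sum_j h_j*w_out[j][k] - b_k."""
    n, l, m = len(x), len(a), len(b)
    h = [f(sum(w_in[i][j] * x[i] for i in range(n)) - a[j]) for j in range(l)]
    o = [sum(h[j] * w_out[j][k] for j in range(l)) - b[k] for k in range(m)]
    return h, o

def update(x, h, e, w_in, a, w_out, b, alpha):
    """Per-sample parameter update, following the printed claim formulas;
    e[k] is the error at output node k, alpha the update step."""
    n, l, m = len(x), len(h), len(e)
    for j in range(l):
        for k in range(m):
            w_out[j][k] += alpha * h[j] * e[k]          # w_jk' = w_jk + a*h_j*e_k
        for i in range(n):
            w_in[i][j] += alpha * h[j] * (1 - h[j]) * x[i]  # w_ij' = w_ij + a*h_j(1-h_j)*x_i
        a[j] += alpha * h[j] * (1 - h[j])               # a_j' = a_j + a*h_j(1-h_j)
    for k in range(m):
        b[k] += e[k]                                    # b_k' = b_k + e_k
```

With a 1-1-1 network, zero weights and thresholds, the hidden output is sigmoid(0) = 0.5, so an output weight of 2.0 yields an output of 1.0.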
2. The method for adjusting environmental data according to claim 1, wherein,
the facial feature data includes feature data of at least one facial feature;
the behavior feature data includes micro-behavior feature data and interactive behavior feature data.
3. The method of claim 1, wherein each monitored object has a corresponding weight, and wherein deriving an efficiency level for the monitored object from the second pattern recognition model, the environmental data, and the status level comprises:
grouping the monitored objects according to set conditions, wherein the sum of the weights of the monitored objects in each group is 1;
obtaining the state grade of each group according to the weights of the monitored objects of each group and their state grades;
and obtaining the efficiency level of each group according to the second pattern recognition model and the state level of each group and the environment data thereof.
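For illustration, the per-group weighted aggregation in claim 3 can be sketched as below; encoding state grades as numbers is an assumption made here for the example, not something the claim specifies.

```python
def group_state_grade(members):
    """Weighted state grade of one group.

    `members` is a list of (weight, numeric_state_grade) pairs whose
    weights sum to 1, as required by the claim; the group grade is the
    weight-weighted combination of its members' grades."""
    return sum(w * g for w, g in members)
```

For example, two equally weighted members with grades 2 and 4 yield a group grade of 3.0, which would then be fed with the environment data into the second pattern recognition model.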
4. An environmental data adjustment apparatus, comprising:
the system comprises a first acquisition module, a second acquisition module and a control module, wherein the first acquisition module acquires state data of a monitored object through a plurality of cameras, and the state data comprises facial feature data and behavior feature data;
the second acquisition module acquires environmental data of the environment where the monitored object is located through a plurality of sensors, wherein the environmental data comprises one or more of temperature, humidity, brightness and ventilation level;
the first construction module is used for constructing a first pattern recognition model, and obtaining the state grade of the monitoring object according to the first pattern recognition model and the state data, wherein the state grade at least comprises an abnormal grade; wherein each monitoring object has a corresponding weight, and the step of obtaining the state grade of the monitoring object according to the first pattern recognition model and the state data includes: generating weighted state data according to the weight of the monitoring object and the state data of the monitoring object; obtaining a state grade of the monitoring object according to the first pattern recognition model and the weighted state data;
the second construction module is used for constructing a second pattern recognition model, and obtaining the efficiency grade of the monitoring object according to the second pattern recognition model, the environment data and the state grade, wherein the efficiency grade at least comprises a low efficiency grade;
the adjustment module is used for adjusting the environment data until judging that the state grade is not an abnormal grade and the efficiency grade is not an inefficient grade when the state grade is an abnormal grade or the efficiency grade is an inefficient grade;
the first pattern recognition model is a neural network model and comprises an input layer, a hidden layer and an output layer;
by the formula

h_j = f( Σ_{i=1..n} ω_ij · x_i − a_j )

deriving the output value of the hidden layer of the neural network model, where h_j is the output value of the j-th node of the hidden layer, f is the excitation function, ω_ij is the connection weight of the i-th node of the input layer and the j-th node of the hidden layer, x_i is the variable of the i-th node of the input layer, a_j is the threshold of the j-th node of the hidden layer, and i = 1, 2 … n, n being the number of nodes of the input layer;
by the formula

o_k = Σ_{j=1..l} h_j · ω_jk − b_k

deriving the output value of the output layer of the neural network model, where o_k is the output value of the k-th node of the output layer, ω_jk is the connection weight of the j-th node of the hidden layer and the k-th node of the output layer, b_k is the threshold of the k-th node of the output layer, j = 1, 2 … l, l being the number of nodes of the hidden layer, and k = 1, 2 … m, m being the number of nodes of the output layer;
the training step of the first pattern recognition model or the second pattern recognition model comprises the following steps:
constructing a training set, and taking historical state data corresponding to different state levels as a first training set of a first pattern recognition model; taking the historical environment data and the historical state data corresponding to different efficiency levels as a second training set of a second pattern recognition model;
obtaining the first pattern recognition model according to the first training set and a first initial model;
obtaining the second pattern recognition model according to the second training set and a second initial model;
the training step of the first pattern recognition model comprises the following steps:
sequentially inputting historical state data of each sample of the first training set into a first pattern recognition model to obtain output values of each sample at each node of an output layer;
by the formulas

ω_ij' = ω_ij + α · h_j · (1 − h_j) · x_i
ω_jk' = ω_jk + α · h_j · e_k
a_j' = a_j + α · h_j · (1 − h_j)
b_k' = b_k + e_k

sequentially updating the parameters of the first pattern recognition model in combination with the error between the output value and the actual value at each node of the output layer for each sample, wherein ω_ij, ω_jk, a_j and b_k are the neural network parameters before updating, ω_ij', ω_jk', a_j' and b_k' are the neural network parameters after updating, e_k is the error at the k-th node of the output layer, and α is the update step in the parameter updating process;
sequentially inputting the historical state data of each sample of the first training set into the first pattern recognition model with updated parameters, and by the formula

E = (1/N) · Σ_{Z=1..N} Σ_{k=1..m} ( ŷ_{Z,k} − y_{Z,k} )²

obtaining the training error, wherein N is the number of samples of the first training set, ŷ_{Z,k} is the predicted value of sample Z at the k-th node of the output layer, and y_{Z,k} is the actual value of sample Z at the k-th node;
repeating the above steps for a set number of consecutive training iterations to obtain the training errors of the first pattern recognition model; and ending training when the change of the training error is smaller than a set threshold value.
5. An electronic device comprising a memory and a processor, wherein the memory has stored therein an environmental data adjustment program that when executed by the processor implements the environmental data adjustment method of any of claims 1-3.
6. A computer-readable storage medium, wherein an environment data adjustment program is included in the computer-readable storage medium, and the environment data adjustment program, when executed by a processor, implements the environment data adjustment method according to any one of claims 1 to 3.
CN201910875554.9A 2019-09-17 2019-09-17 Environment data adjustment method and device, electronic equipment and medium Active CN110738120B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910875554.9A CN110738120B (en) 2019-09-17 2019-09-17 Environment data adjustment method and device, electronic equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910875554.9A CN110738120B (en) 2019-09-17 2019-09-17 Environment data adjustment method and device, electronic equipment and medium

Publications (2)

Publication Number Publication Date
CN110738120A CN110738120A (en) 2020-01-31
CN110738120B true CN110738120B (en) 2024-02-09

Family

ID=69267976

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910875554.9A Active CN110738120B (en) 2019-09-17 2019-09-17 Environment data adjustment method and device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN110738120B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344417A (en) * 2021-06-23 2021-09-03 武汉虹信技术服务有限责任公司 Method, system, computer equipment and readable medium for checking houses of individual workshops in residential building
CN116341823B (en) * 2023-01-12 2023-11-14 江苏领视达智能科技有限公司 Remote processing monitoring system and monitoring method for peep-proof flat panel display
CN116257024B (en) * 2023-02-06 2023-11-14 南京乐汇光电科技有限公司 Alarm device and method for processing low-power-consumption display based on Internet of things

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104374053A (en) * 2014-11-25 2015-02-25 珠海格力电器股份有限公司 Intelligent control method, device and system
CN107835324A (en) * 2017-12-13 2018-03-23 维沃移动通信有限公司 A kind of back light brightness regulating method and mobile terminal
CN109669837A (en) * 2018-10-31 2019-04-23 平安科技(深圳)有限公司 Equipment state method for early warning, system, computer installation and readable storage medium storing program for executing
CN109686447A (en) * 2019-01-28 2019-04-26 远光软件股份有限公司 A kind of employee status's monitoring system based on artificial intelligence
CN109901406A (en) * 2019-02-22 2019-06-18 珠海格力电器股份有限公司 Monitoring method, device, equipment and the storage medium of building environment


Also Published As

Publication number Publication date
CN110738120A (en) 2020-01-31

Similar Documents

Publication Publication Date Title
CN110738120B (en) Environment data adjustment method and device, electronic equipment and medium
JP6386107B2 (en) Localized learning from global models
US11551103B2 (en) Data-driven activity prediction
CN108229667B (en) Trimming based on artificial neural network classification
CN110163082A (en) A kind of image recognition network model training method, image-recognizing method and device
US11164565B2 (en) Unsupervised learning system and method for performing weighting for improvement in speech recognition performance and recording medium for performing the method
CN107679475B (en) Store monitoring and evaluating method and device and storage medium
CN109891436A (en) Security system and its control method based on deep learning neural network
US20180307894A1 (en) Neural network systems
CN110399476A Generation method, device, equipment and storage medium of talent's portrait
KR102168558B1 (en) Training data selection method for active learning, training data selection device for active learning and image analysis method using active learning
JP2008113442A (en) Event-detection in multi-channel sensor-signal stream
CN110136832B (en) Cognitive ability assessment system and method based on daily behaviors of old people
CN106796693A (en) For the apparatus and method of processing environment smell
JP6817974B2 (en) Computer system
CN109783859A (en) Model building method, device and computer readable storage medium
US20210216914A1 (en) Information processing device, information processing method, and information processing program
CN114901066B (en) Auxiliary system and auxiliary method
CN109741818A (en) Resource allocation management method and device are intervened in medical inferior health based on artificial intelligence
CN115797868A (en) Behavior early warning method, system, device and medium for monitoring object
CN109741108A (en) Streaming application recommended method, device and electronic equipment based on context aware
KR20200080418A (en) Terminla and operating method thereof
CN113222170A (en) Intelligent algorithm and model for IOT (Internet of things) AI (Artificial Intelligence) collaborative service platform
Alsalemi et al. Boosting domestic energy efficiency through accurate consumption data collection
Effendy et al. Forest quality assessment based on bird sound recognition using convolutional neural networks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant