CN113361469A - Method, device and equipment for identifying coverage state and storage medium - Google Patents
- Publication number
- CN113361469A (application CN202110739715.9A)
- Authority
- CN
- China
- Prior art keywords
- coverage
- target user
- target
- probability
- identifying
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F18/24—Pattern recognition; analysing; classification techniques
- G06N3/045—Neural networks; architecture, e.g. interconnection topology; combinations of networks
- G06N3/08—Neural networks; learning methods
Abstract
The disclosure provides a method, apparatus, device, and storage medium for identifying a coverage state. It relates to the field of artificial intelligence, in particular to computer vision and deep learning technology, and can be used in intelligent prevention and control scenarios. The implementation scheme is as follows: acquire a target image of the sleeping scene in which a target user is located; detect the target image to obtain the human-body region and the covering region in the image; and identify the coverage state of the target user according to the degree of overlap between the human-body region and the covering region. The disclosed technology provides a coverage-state identification scheme with higher identification accuracy.
Description
Technical Field
The present disclosure relates to the field of artificial intelligence technology, in particular to computer vision and deep learning technology, and can be used in intelligent prevention and control scenarios.
Background
During sleep, children, elderly people with limited mobility, and similar users may kick off their coverings (such as quilts or blankets). If this is not handled in time, they may catch a cold, so accurately identifying their coverage condition is very important.
Disclosure of Invention
The disclosure provides a method, an apparatus, a device and a storage medium for identifying a coverage status.
According to an aspect of the present disclosure, there is provided a coverage status recognition method, including:
acquiring a target image of a sleeping scene where a target user is located;
detecting the target image to obtain a human body area and a covering area in the target image;
and identifying the coverage state of the target user according to the overlapping degree between the human body area and the coverage area.
According to another aspect of the present disclosure, there is provided a coverage state recognition apparatus including:
the image acquisition module is used for acquiring a target image of a sleeping scene where a target user is located;
the detection module is used for detecting the target image to obtain a human body area and a covering area in the target image;
and the coverage state identification module is used for identifying the coverage state of the target user according to the overlapping degree between the human body area and the coverage area.
According to another aspect of the present disclosure, there is provided an electronic device including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a coverage status identification method according to any one of the embodiments of the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the coverage state identification method according to any one of the embodiments of the present disclosure.
According to another aspect of the present disclosure, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the coverage status identification method according to any of the embodiments of the present disclosure.
The disclosed technology provides a coverage-state identification scheme with higher identification accuracy.
It should be understood that the statements in this section do not necessarily identify key or critical features of the embodiments of the present disclosure, nor do they limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The drawings are included to provide a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
fig. 1 is a flowchart of a coverage status identification method provided according to an embodiment of the present disclosure;
FIG. 2 is a flow chart of another coverage status identification method provided in accordance with an embodiment of the present disclosure;
FIG. 3 is a flow chart of yet another method for identifying a coverage status provided in accordance with an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a coverage status recognition apparatus according to an embodiment of the present disclosure;
fig. 5 is a block diagram of an electronic device for implementing the coverage status identification method of the embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below with reference to the accompanying drawings, in which various details of the embodiments of the disclosure are included to assist understanding, and which are to be considered as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Fig. 1 is a flowchart of a coverage status identification method according to an embodiment of the present disclosure. The embodiment is suitable for identifying the coverage state, in particular the state of users such as children or elderly people with limited mobility being covered during sleep, where the covering is an article used to keep warm while sleeping, such as a quilt or a blanket. The method can be executed by a coverage status recognition apparatus, which can be implemented in software and/or hardware and integrated into an electronic device carrying the coverage-status recognition function, such as a capture device (for example, an intelligent camera). As shown in fig. 1, the coverage status identification method provided in this embodiment may include:
s101, acquiring a target image of a sleeping scene where a target user is located.
The target user is a user whose coverage status needs to be identified, for example a child or an elderly person with limited mobility. The sleeping scene is the scene in which the user is sleeping, for example a bed or a sofa. The target image is an image containing the target user and the surrounding scene, for example the target user together with a set area around the bed, or the target user together with a set area around the sofa.
If the execution subject of this embodiment is a capture device, the capture device (for example, an intelligent camera) may capture the target image of the sleeping scene in real time. For example, to avoid a baby catching a cold after kicking off the quilt during sleep, an intelligent camera can be installed above the crib to capture images of the baby sleeping in it. As another example, to monitor the state of an elderly person with limited mobility at home, an intelligent camera can be installed to capture images of the person resting and falling asleep on a sofa.

The execution subject may also be another electronic device with computing capability, such as a computer; in that case the capture device sends the captured target image to the computer. However, since data transmission costs time and network bandwidth, the execution subject is preferably the capture device itself, so that the coverage status of the target user can be known quickly.

Optionally, the capture device captures the target image when it monitors a sleep event of the target user. The sleep event can be triggered by the capture device receiving monitoring information sent from the terminal device associated with the target user (for example, a device held by the target user's guardian), or by the current time reaching a preset image-capture period.
S102: detect the target image to obtain the human-body region and the covering region in the target image.
Optionally, the target image may be detected with a target detection model, and the human-body region and the covering region in the target image are marked with labeling boxes. The target detection model is pre-trained and may be, for example, a YOLOv5 model, a Faster R-CNN model, a Cascade R-CNN model, or a CenterNet v2 model.

For example, the target image may be detected with a Human-Object Interaction (HOI) model to obtain the human-body region and the covering region. An HOI model locates people and objects and recognizes the interactions between them; it can be trained on a large number of images pre-labeled with human-body regions and covering regions. Because the human body is partly occluded by the covering, the human-body regions in the training images are labeled from features such as the user's head region and the user's height and width.
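Whatever detector is used, its raw output still has to be reduced to the two regions S103 needs. A minimal post-processing sketch (the `person`/`cover` label names and the list-of-dicts output format are illustrative assumptions, not the patent's specification):

```python
def pick_regions(detections):
    """Keep the highest-scoring person box and covering box from generic
    detector output: a list of {"label", "score", "box"} dicts."""
    best = {}
    for det in detections:
        label = det["label"]
        if label in ("person", "cover"):
            if label not in best or det["score"] > best[label]["score"]:
                best[label] = det
    return best.get("person"), best.get("cover")
```

Returning `None` for a missing class gives the caller a natural hook for the fallback described below, where the capture device re-captures when no target user is detected.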
In an optional implementation of this embodiment, if the target user is not detected in the target image, the angle, the shooting parameters, and other settings of the capture device may be adjusted until a usable target image is captured; S102 and S103 are then performed on the acquired image.
S103: identify the coverage state of the target user according to the degree of overlap between the human-body region and the covering region.
The coverage state describes whether the user is covered by the covering, and may be either covered or uncovered.

The degree of overlap between the human-body region and the covering region is their Intersection-over-Union (IoU): the intersection and the union of the human-body labeling box and the covering labeling box are determined, and the ratio of intersection to union is taken as the overlap.

Specifically, the overlap is compared with an overlap threshold: if the overlap is smaller than the threshold, the target user is considered uncovered; if it is greater than the threshold, the target user is considered covered. The overlap threshold can be set flexibly by the skilled person according to the actual situation.
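The IoU computation and the threshold rule above can be sketched in a few lines. The `(x1, y1, x2, y2)` box format and the 0.5 default threshold are illustrative assumptions; the patent leaves the threshold to the practitioner:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def coverage_state(body_box, cover_box, threshold=0.5):
    """S103: the user counts as covered when the overlap reaches the threshold."""
    return "covered" if iou(body_box, cover_box) >= threshold else "uncovered"
```

Because IoU is a ratio of areas, scaling both boxes by the same factor leaves it unchanged, which is exactly the shooting-distance invariance the embodiment relies on.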
If the identified coverage state is that the target user is uncovered, an early warning prompt may be issued on the terminal device associated with the target user, for example in voice form, and the target image may be transmitted to that device. If the target user is a baby, the associated terminal device may be the device (for example, a mobile phone or a tablet) of the baby's caregiver (for example, a parent).

It should be noted that this embodiment identifies the coverage state from the overlap degree, and since the overlap does not change with the shooting distance of the capture device (for example, a camera), misidentification caused by that distance is avoided.

According to the technical scheme of this embodiment, a target image of the sleeping scene is captured, the human-body region and the covering region are detected in it, and the coverage state of the target user is identified from the degree of overlap between the two regions. This not only identifies the coverage state from the overlap, but also avoids misidentification caused by the capture device's shooting distance. Compared with manual identification, the scheme greatly improves both the accuracy and the efficiency of coverage-state identification. In addition, adding a coverage-state identification function to the electronic device (such as the capture device) increases the device's degree of intelligence.
In actual scenarios, the environmental information differs across sleeping scenes, so the requirements on the sleeping state differ as well. On the basis of the above embodiment, as an optional mode, the overlap threshold may be determined according to the environmental information of the sleeping scene in which the target user is located, and the coverage state then identified from the overlap degree and that threshold. The environmental information includes at least one of the current time, temperature, humidity, and light.

Because different regions and seasons affect sleep differently, the overlap threshold can be determined flexibly from the environment. Temperature and humidity differ between seasons in the same region, and between regions in the same season, so the threshold can be set from the temperature and humidity of the user's scene. For example, in southern areas with a humid climate, the threshold may be set lower in summer because of the higher temperature; in dry northern areas, people usually turn on the air conditioner when sleeping in summer, so the threshold may be set larger than the southern one.

The overlap threshold may also be determined from the current time and temperature of the scene. At night the temperature varies more, so the threshold can be set larger; in the daytime the temperature is relatively higher than at night, so the threshold can be set smaller.

The threshold may also be determined from the light of the scene, for example adjusted in real time according to the light intensity and the temperature.
Optionally, in this embodiment, the environment information of the sleep scene where the target user is located may be input into a pre-trained overlap threshold determination model to determine the overlap threshold.
It should be noted that, to guarantee the target user's sleep quality, a minimum overlap threshold is set; when the threshold is determined from the environmental information of the sleeping scene, it must not fall below this minimum.
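As a hedged sketch, a rule-based environment-dependent threshold with a floor might look like this. The concrete adjustments, cut-offs, and the 0.3 floor are invented for illustration; the text fixes only the qualitative directions and the existence of the minimum:

```python
def overlap_threshold(temperature_c, humidity_pct, is_night, floor=0.3):
    """Adjust a base overlap threshold by environment; never drop below the floor."""
    t = 0.5                    # base value (assumed)
    if temperature_c >= 26:    # warm scene: a lower threshold is tolerable
        t -= 0.1
    if humidity_pct >= 70:     # humid climate, e.g. a southern summer
        t -= 0.05
    if is_night:               # larger night-time temperature swings
        t += 0.1
    return max(t, floor)       # enforce the minimum overlap threshold
```

The patent also suggests the alternative of a pre-trained model that maps the environmental information directly to a threshold, which this rule table merely approximates.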
It can be understood that, by combining the actual scene, this embodiment introduces the environmental information of the sleeping scene to determine the overlap threshold flexibly, and then identifies the coverage state from the overlap degree and the determined threshold. This increases the accuracy of coverage-state identification, further increases the device's degree of intelligence, and greatly improves user satisfaction.
Fig. 2 is a flowchart of another coverage status identification method provided according to an embodiment of the present disclosure; on the basis of the above embodiment, this embodiment explains in more detail how the coverage state of the target user is identified. As shown in fig. 2, the method may include:
s201, acquiring a target image of a sleeping scene where a target user is located.
S202, detecting the target image to obtain a human body area and a covering area in the target image.
And S203, processing the target image by adopting the classification model to obtain the predicted coverage probability and the predicted non-coverage probability of the target user.
In this embodiment, the predicted coverage probability may be a probability that the classification model predicts the coverage of the target user; correspondingly, the predicted uncovering probability is the probability that the target user predicted by the classification model uncovers the cover.
Specifically, the classification model is adopted to perform feature learning on the target image, so as to obtain the predicted coverage probability and the predicted non-coverage probability of the target user. The classification model is a trained neural network model based on pre-labeled overlay images and non-overlay images, such as a Resnet34 model.
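The two predicted probabilities are simply the classifier's two-class softmax outputs. A minimal sketch of that head (the logit names are assumptions; a real ResNet-34 would produce the two logits from the image):

```python
import math

def predict_probs(logit_covered, logit_uncovered):
    """Two-class softmax: returns (p_covered, p_uncovered), which sum to 1."""
    m = max(logit_covered, logit_uncovered)   # subtract the max for numerical stability
    e_c = math.exp(logit_covered - m)
    e_u = math.exp(logit_uncovered - m)
    return e_c / (e_c + e_u), e_u / (e_c + e_u)
```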
S204: identify the coverage state of the target user according to the predicted coverage probability, the predicted non-coverage probability, and the overlap degree.

In this embodiment, the predicted coverage probability, the predicted non-coverage probability, the overlap degree, and the target image may be fed together into a pre-trained state-determination model to obtain the coverage state of the target user.

Optionally, a target coverage probability may be obtained from the predicted coverage probability and the overlap degree, a target non-coverage probability may be obtained from the predicted non-coverage probability and the overlap degree, and the coverage state of the target user identified by comparing the two.

Specifically, the product of the predicted coverage probability and the overlap degree is used as the target coverage probability, and the product of the predicted non-coverage probability and the overlap degree is used as the target non-coverage probability. The two are then compared: if the target coverage probability is greater, the target user is identified as covered; if the target non-coverage probability is greater, the target user is identified as uncovered.

It can be understood that performing coverage-state identification with data of two dimensions (image features and overlap degree) improves identification accuracy.

Further, the coverage state of the target user can be identified from the target coverage probability, the target non-coverage probability, the overlap degree, and the overlap threshold. For example, if the target coverage probability is greater than the target non-coverage probability and the overlap degree is greater than the overlap threshold, the target user is identified as covered. Introducing the overlap threshold in this way increases identification accuracy and reduces the probability of misidentification.
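Taken literally, S204 with the threshold variant of the last paragraph reduces to the following decision rule (a sketch; the 0.5 default threshold is an assumed value):

```python
def identify_state(p_covered, p_uncovered, overlap, threshold=0.5):
    """Fuse classifier probabilities with the region overlap (S204)."""
    target_covered = p_covered * overlap      # products as described in the text
    target_uncovered = p_uncovered * overlap
    if target_covered > target_uncovered and overlap > threshold:
        return "covered"
    return "uncovered"
```

Note that because both products share the factor `overlap`, the probability comparison reduces to comparing the raw classifier outputs; in this literal reading the overlap enters the decision through the threshold check.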
According to the technical scheme of this embodiment, a target image of the sleeping scene is captured and detected to obtain the human-body region and the covering region, the target image is processed with a classification model to obtain the predicted coverage and non-coverage probabilities, and the coverage state is identified from those probabilities together with the overlap degree. Combining image features in this way further improves the accuracy of coverage-state identification.
Fig. 3 is a flowchart of another coverage status identification method provided according to an embodiment of the present disclosure, and the embodiment adds an early warning function on the basis of the above embodiment. As shown in fig. 3, the coverage status identification method provided in this embodiment may include:
s301, acquiring a target image of a sleeping scene where a target user is located.
S302, detecting the target image to obtain a human body area and a covering area in the target image.
And S303, identifying the coverage state of the target user according to the overlapping degree between the human body area and the coverage area.
S304: if the coverage state is that the target user is uncovered, determine from the environmental information of the sleeping scene whether an early warning prompt is needed.

The environmental information includes at least one of the current time, temperature, humidity, and light.

In this embodiment, if the target user is uncovered but the temperature of the sleeping scene is relatively high, no warning is needed. For example, the material of the sleeping surface (such as sponge) may cause the body temperature to rise, so no warning is required even though the user is uncovered.

If the target user is uncovered and the temperature of the scene is low, a warning is needed. For example, when the user sleeps on a sofa whose temperature is constant and will not warm the body, a warning must be issued to protect sleep quality.

If the target user is uncovered and the current time is night, a warning is needed. For example, when a baby is uncovered at night, the large night-time temperature change means a warning is needed to protect its sleep quality.

If the target user is uncovered but the humidity of the scene is relatively high, no warning is required.

If the target user is uncovered but the light of the scene is strong, the temperature keeps rising, so no warning is required.
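The examples in S304 can be condensed into one rule set. This sketch invents concrete cut-offs (18 °C, 80% humidity) that the text leaves qualitative:

```python
def needs_warning(state, temperature_c, is_night, humidity_pct=50, strong_light=False):
    """Decide whether to alert the guardian's terminal (S304/S305)."""
    if state != "uncovered":
        return False               # only an uncovered user can trigger an alert
    if humidity_pct >= 80 or strong_light:
        return False               # scene stays warm enough; no alert needed
    if is_night:
        return True                # large night-time temperature drops
    return temperature_c < 18      # daytime: alert only when the scene is cold
```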
S305: if it is determined that a warning is needed, send early warning information to the terminal device associated with the target user.

In this embodiment, the warning may be delivered to the associated terminal device in voice form, and the target image may be transmitted to that device as well.

According to the technical scheme of this embodiment, a target image of the sleeping scene is captured and detected to obtain the human-body region and the covering region, the coverage state of the target user is identified from their overlap, and if the user is uncovered, the environmental information of the sleeping scene decides whether to send early warning information to the associated terminal device. Introducing the environmental information into the warning decision improves the flexibility of the scheme, further increases the product's intelligence, and improves user satisfaction.
Fig. 4 is a schematic structural diagram of a coverage state recognition apparatus according to an embodiment of the present disclosure. The embodiment of the disclosure is suitable for identifying the coverage state, and is particularly suitable for identifying the state of covering objects during sleep for users such as children or elderly people with limited mobility, where a covering object is an article used to keep warm during sleep, such as a quilt or a blanket. The apparatus can be implemented in software and/or hardware, and can be integrated into an electronic device carrying a coverage state recognition function, such as an acquisition device (e.g., an intelligent camera).
As shown in fig. 4, the coverage status recognition apparatus 400 provided in the present embodiment may include an image acquisition module 401, a detection module 402, and a coverage status recognition module 403, wherein,
the image acquisition module 401 is configured to acquire a target image of a sleep scene where a target user is located;
a detection module 402, configured to detect a target image to obtain a human body region and a coverage region in the target image;
a coverage status identification module 403, configured to identify a coverage status of the target user according to an overlap between the human body area and the coverage area.
According to the technical solution provided by this embodiment of the disclosure, the target image of the sleeping scene where the target user is located is acquired, the target image is detected to obtain the human body region and the covering region in the target image, and the coverage state of the target user is then identified according to the degree of overlap between the human body region and the covering region. This solution can not only identify the user's coverage state based on the degree of overlap, but also avoid misidentification caused by the shooting distance of the acquisition device (for example, a camera). Compared with identifying the coverage state manually, the solution greatly improves the accuracy of coverage state identification and improves the identification efficiency. In addition, adding a coverage state identification function to the electronic device (such as the acquisition device) increases the intelligence of the device.
Further, the coverage status identification module 403 comprises a processing unit and a coverage status identification unit, wherein,
the processing unit is used for processing the target image by adopting the classification model to obtain the predicted coverage probability and the predicted non-coverage probability of the target user;
and the coverage state identification unit is used for identifying the coverage state of the target user according to the predicted coverage probability, the predicted non-coverage probability and the overlapping degree.
Further, the coverage status identification unit is specifically configured to:
obtaining a target coverage probability according to the predicted coverage probability and the overlapping degree;
obtaining target uncovered probability according to the predicted uncovered probability and the overlapping degree;
and identifying the coverage state of the target user according to the target coverage probability and the target non-coverage probability.
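The module description above combines the classifier's predicted probabilities with the overlap degree to obtain target probabilities, but leaves the combination rule unspecified. One plausible reading is a convex blend; the weight `alpha`, the function names, and the tie-breaking rule below are all hypothetical.

```python
def fuse_probabilities(p_covered, p_uncovered, overlap, alpha=0.6):
    """Blend classifier probabilities with the region overlap degree.

    alpha weights the classifier against the geometric evidence; its value
    and the linear form are illustrative assumptions, not from the patent.
    """
    target_covered = alpha * p_covered + (1 - alpha) * overlap
    target_uncovered = alpha * p_uncovered + (1 - alpha) * (1 - overlap)
    return target_covered, target_uncovered


def identify_state(p_covered, p_uncovered, overlap):
    """Pick the state with the larger fused (target) probability."""
    tc, tu = fuse_probabilities(p_covered, p_uncovered, overlap)
    return "covered" if tc >= tu else "uncovered"
```

The design intent sketched here is that a confident classifier can override a noisy overlap estimate and vice versa, which matches the patent's claim that combining the two signals reduces misidentification.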
Further, the coverage status identification module 403 is specifically configured to:
determining an overlapping threshold value according to the environmental information of the sleeping scene of the target user; the environmental information includes at least one of current time, temperature, humidity, and light;
and identifying the coverage state of the target user according to the overlapping degree and the overlapping threshold value.
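The module description above makes the overlap threshold a function of the environment without giving concrete rules. The sketch below shows one way such a mapping could look; the baseline of 0.5, the temperature and time cutoffs, and the dictionary keys are all assumptions for illustration.

```python
def overlap_threshold(env):
    """Map environmental readings to an overlap threshold.

    The baseline and adjustment rules are illustrative; the patent only
    states that the threshold depends on time, temperature, humidity,
    and light.
    """
    threshold = 0.5  # assumed baseline
    temp = env.get("temperature_c", 20)
    if temp < 15:
        threshold += 0.1  # cold room: demand more coverage
    elif temp > 28:
        threshold -= 0.1  # warm room: accept less coverage
    hour = env.get("hour", 12)
    if hour >= 22 or hour < 6:
        threshold += 0.05  # night hours: be stricter
    return min(max(threshold, 0.0), 1.0)


def identify_by_threshold(overlap, env):
    """Identify the coverage state by comparing overlap to the threshold."""
    return "covered" if overlap >= overlap_threshold(env) else "uncovered"
```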
Further, the apparatus also comprises an early warning identification module, wherein the early warning identification module is configured to:
if the target user is identified not to cover the covering object, determining whether early warning prompt is needed or not according to the environmental information of the sleeping scene of the target user; the environmental information includes at least one of current time, temperature, humidity, and light.
Further, the apparatus further comprises: an information sending module, the information sending module configured to:
and if the early warning prompt is determined to be needed, sending early warning information to the terminal equipment associated with the target user.
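The early-warning path of the two modules above can be summarized as the check below. The 26 °C cutoff is a hypothetical comfort bound, and `send_to_terminal` stands in for whatever push channel (voice prompt, image transmission) the associated terminal device actually uses; neither comes from the patent.

```python
def needs_warning(coverage_state, env):
    """Decide whether to warn when the user is found uncovered.

    The 26 degree cutoff is an assumed comfort bound, not a value
    from the patent.
    """
    if coverage_state != "uncovered":
        return False
    # In a warm room an uncovered sleeper is not at risk, so skip the prompt.
    return env.get("temperature_c", 20) < 26


def warn_if_needed(coverage_state, env, send_to_terminal):
    """Send early warning information to the associated terminal if required."""
    if needs_warning(coverage_state, env):
        send_to_terminal({"type": "uncovered_warning", "env": env})
        return True
    return False
```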
The coverage state identification apparatus can execute the coverage state identification method provided by the embodiments of the present disclosure, and has the functional modules corresponding to the method and the corresponding beneficial effects.
In the technical solution of the present disclosure, the acquisition, storage, and application of the personal information of the users involved all comply with the provisions of relevant laws and regulations, and do not violate public order and good customs.
The present disclosure also provides an electronic device, a readable storage medium, and a computer program product according to embodiments of the present disclosure.
Fig. 5 is a block diagram of an electronic device for implementing the coverage status identification method of the embodiment of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 5, the electronic device 500 includes a computing unit 501, which can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 502 or a computer program loaded from a storage unit 508 into a Random Access Memory (RAM) 503. In the RAM 503, various programs and data required for the operation of the electronic device 500 can also be stored. The computing unit 501, the ROM 502, and the RAM 503 are connected to each other by a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
A number of components in the electronic device 500 are connected to the I/O interface 505, including: an input unit 506 such as a keyboard, a mouse, or the like; an output unit 507 such as various types of displays, speakers, and the like; a storage unit 508, such as a magnetic disk, optical disk, or the like; and a communication unit 509 such as a network card, modem, wireless communication transceiver, etc. The communication unit 509 allows the electronic device 500 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 501 may be a variety of general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the computing unit 501 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, and so forth. The computing unit 501 executes the respective methods and processes described above, such as the coverage state identification method. For example, in some embodiments, the coverage state identification method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 508. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 500 via the ROM 502 and/or the communication unit 509. When the computer program is loaded into the RAM 503 and executed by the computing unit 501, one or more steps of the coverage state identification method described above may be performed. Alternatively, in other embodiments, the computing unit 501 may be configured to perform the coverage state identification method in any other suitable manner (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuitry, Field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on chip (SOCs), Complex Programmable Logic Devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, receiving data and instructions from, and transmitting data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program codes may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program codes, when executed by the processor or controller, cause the functions/operations specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, partly on the machine and partly on a remote machine as a stand-alone software package, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), Wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server combined with a blockchain.
It should be understood that various forms of the flows shown above may be used, with steps reordered, added, or deleted. For example, the steps described in the present disclosure may be executed in parallel, sequentially, or in different orders, as long as the desired results of the technical solutions disclosed in the present disclosure can be achieved; no limitation is imposed herein.
The above detailed description should not be construed as limiting the scope of the disclosure. It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and substitutions may be made in accordance with design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present disclosure should be included in the scope of protection of the present disclosure.
Claims (15)
1. A coverage status identification method, comprising:
acquiring a target image of a sleeping scene where a target user is located;
detecting the target image to obtain a human body area and a covering area in the target image;
and identifying the coverage state of the target user according to the overlapping degree between the human body area and the coverage area.
2. The method of claim 1, wherein the identifying the coverage status of the target user according to the degree of overlap between the body region and the coverage region comprises:
processing the target image by adopting a classification model to obtain the predicted coverage probability and the predicted non-coverage probability of the target user;
and identifying the coverage state of the target user according to the predicted coverage probability, the predicted non-coverage probability and the overlapping degree.
3. The method of claim 2, wherein said identifying the coverage status of the target user based on the predicted coverage probability, the predicted non-coverage probability, and the degree of overlap comprises:
obtaining a target coverage probability according to the predicted coverage probability and the overlapping degree;
obtaining a target uncovered probability according to the predicted uncovered probability and the overlapping degree;
and identifying the coverage state of the target user according to the target coverage probability and the target non-coverage probability.
4. The method of claim 1, wherein the identifying the coverage status of the target user according to the degree of overlap between the body region and the coverage region comprises:
determining an overlapping threshold value according to the environmental information of the sleeping scene of the target user; the environmental information includes at least one of current time, temperature, humidity, and light;
and identifying the coverage state of the target user according to the overlapping degree and the overlapping threshold value.
5. The method according to any one of claims 1-4, further comprising, after identifying the coverage status of the target user according to the degree of overlap between the body region and the cover region:
if the coverage state is that the target user does not cover the covering object, determining whether early warning prompt is needed or not according to the environmental information of the sleeping scene of the target user; the environmental information includes at least one of a current time, a temperature, a humidity, and a light.
6. The method of claim 5, further comprising:
and if the early warning prompt is determined to be needed, sending early warning information to the terminal equipment associated with the target user.
7. A coverage status recognition apparatus comprising:
the image acquisition module is used for acquiring a target image of a sleeping scene where a target user is located;
the detection module is used for detecting the target image to obtain a human body area and a covering area in the target image;
and the coverage state identification module is used for identifying the coverage state of the target user according to the overlapping degree between the human body area and the coverage area.
8. The apparatus of claim 7, wherein the coverage status identification module comprises:
the processing unit is used for processing the target image by adopting a classification model to obtain the predicted coverage probability and the predicted non-coverage probability of the target user;
and the coverage state identification unit is used for identifying the coverage state of the target user according to the predicted coverage probability, the predicted non-coverage probability and the overlapping degree.
9. The apparatus according to claim 8, wherein the coverage status identification unit is specifically configured to:
obtaining a target coverage probability according to the predicted coverage probability and the overlapping degree;
obtaining a target uncovered probability according to the predicted uncovered probability and the overlapping degree;
and identifying the coverage state of the target user according to the target coverage probability and the target non-coverage probability.
10. The apparatus of claim 7, wherein the coverage status identification module is specifically configured to:
determining an overlapping threshold value according to the environmental information of the sleeping scene of the target user; the environmental information includes at least one of current time, temperature, humidity, and light;
and identifying the coverage state of the target user according to the overlapping degree and the overlapping threshold value.
11. The apparatus of any of claims 7-10, further comprising:
the early warning identification module is used for determining whether early warning prompt is needed or not according to the environmental information of the sleeping scene of the target user if the target user is identified not to cover the covering object; the environmental information includes at least one of a current time, a temperature, a humidity, and a light.
12. The apparatus of claim 11, further comprising:
and the information sending module is used for sending early warning information to the terminal equipment associated with the target user if the early warning prompt is determined to be needed.
13. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the coverage status identification method of any one of claims 1-6.
14. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the coverage status identification method according to any one of claims 1-6.
15. A computer program product comprising a computer program which, when executed by a processor, implements a coverage status identification method according to any one of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110739715.9A CN113361469A (en) | 2021-06-30 | 2021-06-30 | Method, device and equipment for identifying coverage state and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113361469A true CN113361469A (en) | 2021-09-07 |
Family
ID=77537599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110739715.9A Pending CN113361469A (en) | 2021-06-30 | 2021-06-30 | Method, device and equipment for identifying coverage state and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113361469A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114699046A (en) * | 2022-04-25 | 2022-07-05 | 深圳市华屹医疗科技有限公司 | Sleep monitoring method, monitor and monitoring system |
WO2023087677A1 (en) * | 2021-11-19 | 2023-05-25 | 青岛海尔空调器有限总公司 | Air conditioner sitting posture auxiliary control method, control device, and air conditioner |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104776551A (en) * | 2014-12-22 | 2015-07-15 | 珠海格力电器股份有限公司 | Sleep state monitoring method and device under air conditioner and air conditioner |
CN108209871A (en) * | 2017-12-27 | 2018-06-29 | 深圳信息职业技术学院 | Sleep monitor method, apparatus, system and electronic equipment |
CN109691986A (en) * | 2018-12-25 | 2019-04-30 | 合肥镭智光电科技有限公司 | Sleep detection system and detection method |
CN110580466A (en) * | 2019-09-05 | 2019-12-17 | 深圳市赛为智能股份有限公司 | infant quilt kicking behavior recognition method and device, computer equipment and storage medium |
CN112712020A (en) * | 2020-12-29 | 2021-04-27 | 文思海辉智科科技有限公司 | Sleep monitoring method, device and system |
CN113012176A (en) * | 2021-03-17 | 2021-06-22 | 北京百度网讯科技有限公司 | Sample image processing method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109508688B (en) | Skeleton-based behavior detection method, terminal equipment and computer storage medium | |
US9396400B1 (en) | Computer-vision based security system using a depth camera | |
AU2019213309A1 (en) | System and method for generating an activity summary of a person | |
US20210124914A1 (en) | Training method of network, monitoring method, system, storage medium and computer device | |
CN109040693B (en) | Intelligent alarm system and method | |
CN112712020B (en) | Sleep monitoring method, device and system | |
CN109887234B (en) | Method and device for preventing children from getting lost, electronic equipment and storage medium | |
WO2015133195A1 (en) | Information processing device, information processing method, and program | |
CN113361469A (en) | Method, device and equipment for identifying coverage state and storage medium | |
US11631306B2 (en) | Methods and system for monitoring an environment | |
CN110737201B (en) | Monitoring method and device, storage medium and air conditioner | |
CN112733690A (en) | High-altitude parabolic detection method and device and electronic equipment | |
CN109543607A (en) | Object abnormal state detection method, system, monitor system and storage medium | |
US20130147917A1 (en) | Computing device and household monitoring method using the computing device | |
CN114469076A (en) | Identity feature fused old solitary people falling identification method and system | |
CN114022896A (en) | Target detection method and device, electronic equipment and readable storage medium | |
WO2018168604A1 (en) | Method, system, storage medium and computer system for determining fall response of subject | |
JP6822326B2 (en) | Watching support system and its control method | |
JP2021007055A (en) | Discriminator learning device, discriminator learning method, and computer program | |
CN116189232A (en) | Machine vision-based method and system for detecting abnormal behaviors of aged and elderly in nursing homes | |
CN113705284A (en) | Climbing identification method and device and camera | |
JP2015046811A (en) | Image sensor | |
CN108198203B (en) | Motion alarm method, device and computer readable storage medium | |
CN118762305A (en) | Machine vision-based illegal behavior identification and alarm method, system and equipment | |
CN118038492A (en) | Personnel drop detection method, device, equipment and medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||