CN114532923B - Health detection method and device, sweeping robot and storage medium - Google Patents

Health detection method and device, sweeping robot and storage medium

Info

Publication number
CN114532923B
Authority
CN
China
Prior art keywords
target object
image
amount
time period
preset time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210130164.0A
Other languages
Chinese (zh)
Other versions
CN114532923A (en)
Inventor
张瑞洁
李绍斌
宋德超
陈翀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai and Zhuhai Lianyun Technology Co Ltd
Priority to CN202210130164.0A
Publication of CN114532923A
Application granted
Publication of CN114532923B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462 Approximate or statistical queries

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Urology & Nephrology (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Hematology (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Fuzzy Systems (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Embodiments of the invention relate to a health detection method and device, a sweeping robot and a storage medium. The method includes: acquiring an image collected by the sweeping robot in the process of cleaning the active area of a target object; determining, from the image, the hair loss amount of the target object within a first preset time period; and determining the health state of the target object according to the hair loss amount of the target object within the first preset time period. In this way, the sweeping robot can clean rooms and, at the same time, determine the health state of the target object from the number of hairs swept up, thereby monitoring the health state of the target object.

Description

Health detection method and device, sweeping robot and storage medium
Technical Field
Embodiments of the invention relate to the technical field of smart homes, and in particular to a health detection method and device, a sweeping robot, and a storage medium.
Background
A sweeping robot, also called an automatic floor sweeper, intelligent vacuum, robot vacuum cleaner and the like, is a type of smart household appliance that can automatically clean an area to be cleaned.
During cleaning, the sweeping robot may sweep up hairs. Hair is one of the indicators of human health, and hair loss can mean that the body is unhealthy. Hair loss may be caused by, for example, nutritional deficiency or mental stress; nutrition-related hair loss can usually be improved through adjustment, whereas stress-related hair loss is more serious and, if not handled in time, may become permanent. Therefore, when a user's hair loss is severe, giving the user a timely health warning has significant practical value.
Disclosure of Invention
In view of this, in order to enable the sweeping robot to issue a timely health warning to a user according to the user's hair loss amount, embodiments of the invention provide a health detection method, a health detection device, a sweeping robot and a storage medium.
In a first aspect, an embodiment of the present invention provides a health detection method, including:
acquiring an image acquired by a sweeping robot in the process of sweeping an active area of a target object;
determining the alopecia amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the alopecia amount of the target object in a first preset time period.
In one possible embodiment, acquiring an image acquired by the sweeping robot during sweeping of an active area of a target object includes:
acquiring, for each area to be cleaned, an image of that area collected by the sweeping robot before it cleans the area, wherein the area to be cleaned is a preset area within the active area of the target object;
and/or,
acquiring an image of the objects swept into the built-in garbage box, collected by the sweeping robot in the process of cleaning the active area of the target object.
In one possible embodiment, determining the amount of hair loss of the target object within a first preset time period from the image includes:
inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
and determining the alopecia amount of the target object in a first preset time period according to the hair quantity.
In one possible embodiment, the determining, according to the number of hairs, an amount of hair loss of the target object within a first preset period of time includes:
acquiring a first time at which the sweeping robot last cleaned the active area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair quantity and the time interval.
In one possible embodiment, the determining the health status of the target object according to the amount of hair loss of the target object in the first preset time period includes:
determining a normal hair loss amount range of the target object in the first preset time period from a preset database, wherein the preset database comprises normal hair loss amount ranges of a plurality of objects in the first preset time period;
Determining whether the amount of hair loss of the target object within the first preset time period is within the determined normal hair loss amount range;
if the alopecia amount of the target object in the first preset time period exceeds the normal alopecia amount range, determining that the target object is in an unhealthy state;
and if the alopecia amount of the target object in the first preset time period is within the normal alopecia amount range, determining that the target object is in a healthy state.
In one possible embodiment, the method further comprises:
and, in the case where the target object is determined to be in an unhealthy state according to the hair loss amount of the target object within a first preset time period and the target object is determined to have corrected the unhealthy state to a healthy state, resetting the normal hair loss amount range corresponding to the target object in the preset database according to that hair loss amount.
In one possible embodiment, in the case that the target object is determined to be in an unhealthy state according to the amount of hair loss of the target object, the method further comprises:
generating an alopecia amount change graph of the target object within a second preset time period, wherein the second preset time period comprises a plurality of first preset time periods;
Outputting the alopecia amount change graph, and outputting alarm information for indicating that the target object is in an unhealthy state.
In a second aspect, embodiments of the present invention provide a health detection device, the device including:
the acquisition module is used for acquiring images acquired by the sweeping robot in the process of sweeping the active area of the target object;
a first determining module, configured to determine, according to the image, an amount of hair loss of the target object within a first preset time period;
and the second determining module is used for determining the health state of the target object according to the alopecia amount of the target object in the first preset time period.
In one possible implementation manner, the acquiring module is specifically configured to:
acquiring, for each area to be cleaned, an image of that area collected by the sweeping robot before it cleans the area, wherein the area to be cleaned is a preset area within the active area of the target object;
and/or,
acquiring an image of the objects swept into the built-in garbage box, collected by the sweeping robot in the process of cleaning the active area of the target object.
In one possible embodiment, the first determining module includes:
The model input sub-module is used for inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
and the alopecia amount determination submodule is used for determining the alopecia amount of the target object in a first preset time period according to the hair amount.
In one possible embodiment, the alopecia amount determination submodule is specifically configured to:
acquiring a first time at which the sweeping robot last cleaned the active area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair quantity and the time interval.
In one possible embodiment, the second determining module includes:
a range determining sub-module, configured to determine a normal hair loss amount range of the target object in the first preset time period from a preset database, where the preset database includes normal hair loss amount ranges of a plurality of objects in the first preset time period;
a judging sub-module, configured to determine whether the amount of hair loss of the target object in the first preset time period is within the determined normal hair loss amount range;
A first determining submodule, configured to determine that the target object is in an unhealthy state if an amount of hair loss of the target object in the first preset time period exceeds the normal hair loss amount range;
and the second determining submodule is used for determining that the target object is in a healthy state if the alopecia amount of the target object in the first preset time period is within the normal alopecia amount range.
In one possible embodiment, the apparatus further comprises:
the resetting module is used for resetting the normal hair loss amount range corresponding to the target object in the preset database according to the hair loss amount, in the case where the target object is determined to be in an unhealthy state according to its hair loss amount within a first preset time period and the target object is determined to have corrected the unhealthy state to a healthy state.
In one possible embodiment, the apparatus further comprises:
a generation module, configured to generate a hair loss amount change chart of the target object in a second preset time period when the target object is determined to be in an unhealthy state according to the hair loss amount of the target object, where the second preset time period includes a plurality of first preset time periods;
And the output module is used for outputting the alopecia amount change graph and outputting alarm information for indicating that the target object is in an unhealthy state.
In a third aspect, an embodiment of the present invention provides a sweeping robot, including: the device comprises an image acquisition module, a processor and a memory;
the image acquisition module is used for acquiring images in the process of cleaning the active area of the target object;
the processor is configured to execute a health detection program stored in the memory to implement the health detection method of any one of the first aspects.
In a fourth aspect, an embodiment of the present invention provides a cloud server, including: a processor and a memory, the processor being configured to execute a health detection program stored in the memory to implement the health detection method of any one of the first aspects.
In a fifth aspect, an embodiment of the present invention provides a health detection system, including: the system comprises a sweeping robot, a cloud server and a terminal;
the sweeping robot collects images in the process of sweeping the moving area of the target object and sends the images to the cloud server;
the cloud server determines the alopecia amount of the target object in a first preset time period according to the image; determining the health state of the target object according to the alopecia amount of the target object in a first preset time period; sending the health state of the target object to the terminal;
And the terminal outputs the health state of the target object.
In a sixth aspect, an embodiment of the present invention provides a storage medium storing one or more programs executable by one or more processors to implement the health detection method of any one of the first aspects.
According to the technical solution provided by the embodiments of the invention, an image collected by the sweeping robot while it cleans the active area of a target object is acquired, the hair loss amount of the target object within a first preset time period is determined from the image, and the health state of the target object is determined from that hair loss amount. A normal hair loss amount range for the target object is preset in the database of the sweeping robot; after each cleaning, once the hair loss amount of the target object has been obtained by recognizing the images, it is compared with the normal hair loss amount range, and if it exceeds that range the target object is judged to be in an unhealthy state and can be warned in time. In this way, the sweeping robot cleans the room and, at the same time, determines the health state of the target object from the number of hairs swept up, thereby monitoring the health state of the target object.
Drawings
Fig. 1 is a schematic diagram of a health detection system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention;
FIG. 3 is a flowchart of an embodiment of a health detection method according to an embodiment of the present invention;
FIG. 4 is a flowchart of another embodiment of a health detection method according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of image recognition by using a CNN neural network according to an embodiment of the present invention;
FIG. 6 is a flowchart of another embodiment of a health detection method according to an embodiment of the present invention;
FIG. 7 is a block diagram of an embodiment of a health detection device according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of another sweeping robot according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of a cloud server according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Referring to fig. 1, a schematic architecture diagram of a health detection system according to an embodiment of the present invention is provided.
The health detection system shown in fig. 1 includes: a terminal 101, a cloud server 102, and a sweeping robot 103. The terminal 101, the cloud server 102 and the sweeping robot 103 are connected through network communication.
The terminal 101 may be a hardware device or software that supports network connections to provide various network services. When the terminal 101 is hardware, it may be any of a variety of electronic devices with a display screen, including but not limited to a smart phone, tablet, laptop computer, desktop computer, etc., as illustrated in fig. 1. When the terminal 101 is software, it can be installed in any of the electronic devices listed above. In the embodiment of the present invention, the terminal 101 may establish communication with the cloud server 102 and the sweeping robot 103 by installing corresponding application programs.
The cloud server 102 may be implemented by one server, or may be implemented by a server cluster formed by a plurality of servers, which is not limited in this embodiment of the present invention.
The sweeping robot 103 may include an image acquisition module therein for acquiring an image during a process of sweeping an active area of a target object and transmitting the acquired image to the cloud server 102 through a network.
In one embodiment, the user may control the sweeping robot 103 through the terminal 101 to clean the active area of the target object. While cleaning the active area of the target object, the sweeping robot 103 can collect images of the areas to be cleaned or of the swept-up objects, and then transmits the collected images to the cloud server 102. The cloud server 102 may apply the health detection method provided by the embodiment of the present invention to detect the health state of the target object.
In another embodiment, the user may control the sweeping robot 103 to sweep the active area of the target object through the terminal 101. When the sweeping robot 103 cleans the active area of the target object, the health detection method provided by the embodiment of the invention can be applied to detect the health state of the target object.
The health detection method provided by the invention is further explained below with reference to the drawings and specific embodiments; these embodiments do not limit the embodiments of the invention.
Referring to fig. 3, a flowchart of an embodiment of a health detection method is provided in an embodiment of the present invention. As one example, the flow illustrated in fig. 3 may be applied to a cloud server, such as cloud server 102 illustrated in fig. 1. As another example, the flow shown in fig. 3 may also be applied to a sweeping robot, such as the sweeping robot 103 illustrated in fig. 1. The flow shown in fig. 3 is only applied to the cloud server 102 as an example, and the health detection method provided by the embodiment of the present invention is described. As shown in fig. 3, the process may include the steps of:
Step 301, acquiring an image acquired by the sweeping robot in the process of sweeping the active area of the target object.
Step 302, determining the alopecia amount of the target object in a first preset time period according to the image.
The following collectively describes steps 301 and 302:
in real life, users typically have a fixed active area. For example, in daily life, the user may have his own independent bedroom, in which case the active area of the target object may refer to the bedroom of the target object.
Taking the schematic architecture diagram of the health detection system shown in fig. 1 as an example, in the embodiment of the present invention, the sweeping robot 103 may collect an image during the process of sweeping the active area of the target object, and upload the collected image to the cloud server 102, where the cloud server 102 determines, according to the collected image, the amount of hair loss of the target object in the first preset period. The first preset time period may be 1 day, 2 days, or 3 days, which is not limited in the embodiment of the present invention.
As one possible implementation, the cloud server 102 acquires an image of the area to be cleaned acquired by the sweeping robot 103 before sweeping each area to be cleaned. The area to be cleaned may be a predetermined area in the target object active area.
Specifically, before each cleaning of an area to be cleaned by the cleaning robot 103, an image of the area to be cleaned may be acquired first. The image is then uploaded to cloud server 102. In this way, the cloud server 102 may acquire images of a plurality of areas to be cleaned sequentially.
Then, the cloud server 102 inputs each acquired image into the trained image recognition model to obtain the number of hairs contained in that image as output by the model. When the sweeping robot 103 has cleaned the whole active area of the target object, the cloud server 102 sums the numbers of hairs contained in the images collected during this cleaning to obtain the total number of hairs swept up by the sweeping robot 103, and can then determine the hair loss amount of the target object within the first preset time period from that total.
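By way of illustration only, the per-image counting and summation described above might be organized as in the following Python sketch. The helper name count_hairs_in_image and the stand-in model are hypothetical placeholders, not part of the patent; the patent only specifies that a trained image recognition model returns the number of hairs in each image and that the per-image counts are summed.

    from typing import Callable, Iterable

    def total_hairs_for_cleaning(images: Iterable[bytes],
                                 count_hairs_in_image: Callable[[bytes], int]) -> int:
        # Sum the per-image hair counts returned by the recognition model
        # for every image collected during one cleaning run.
        return sum(count_hairs_in_image(img) for img in images)

    # Usage with a stand-in "model" that always reports 3 hairs per image:
    fake_model = lambda img: 3
    print(total_hairs_for_cleaning([b"img-1", b"img-2"], fake_model))  # prints 6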
As another possible implementation, the image acquired by the sweeping robot 103 may be an image of the swept-up objects in the built-in garbage box, collected in the process of cleaning the active area of the target object, and the cloud server 102 determines the hair loss amount of the target object within the first preset time period from that image.
Specifically, fig. 2 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention. As shown in fig. 2, a temporary garbage box and a garbage box are arranged inside the sweeping robot, and an image acquisition module is arranged above the temporary garbage box. While cleaning the active area of the target object, the sweeping robot places the swept-up objects in the temporary garbage box, and the image acquisition module above it can collect images of the swept-up objects in the temporary garbage box.
Optionally, the image acquisition module may periodically collect images of the swept-up objects in the temporary garbage box and, after the cleaning is completed, transmit them to the cloud server 102. The cloud server 102 may input the received images into the trained image recognition model to obtain the number of hairs contained in each image, sum those numbers to obtain the total number of hairs swept up by the sweeping robot 103 during this cleaning, and then determine the hair loss amount of the target object within the first preset time period from that total.
Alternatively, the image acquisition module may collect an image of the swept-up objects in the temporary garbage box only once, after the sweeping robot 103 has finished cleaning, and send it to the cloud server 102. The cloud server 102 then inputs this image into the trained image recognition model to obtain the number of hairs it contains; that number is the number of hairs swept up by the sweeping robot 103 during this cleaning, from which the cloud server 102 can determine the hair loss amount of the target object within the first preset time period.
As to how the cloud server 102 determines the amount of hair loss of the target user in the first preset period of time based on the number of hairs cleaned during the cleaning process herein, the flow shown in fig. 4 will be described hereinafter, and will not be described in detail.
Optionally, the image recognition model may be a CNN (Convolutional Neural Network) model. Fig. 5 is a schematic flow chart of image recognition using a CNN according to an embodiment of the present invention.
Step 303, determining the health state of the target object according to the alopecia amount of the target object in the first preset time period.
In the embodiment of the present invention, a database may be preset on the cloud server 102, and the preset database includes the normal hair loss amount ranges of a plurality of objects within a first preset time period. In one embodiment, the cloud server 102 may initially set the normal hair loss amount range of each object within the first preset time period according to preset standard data, and thereafter update that range according to the cleaning records of the sweeping robot 103.
Based on the above, in an embodiment, when determining the health state of the target object according to its hair loss amount within the first preset time period, the cloud server 102 may first determine the normal hair loss amount range of the target object within the first preset time period from the preset database, and then determine whether the hair loss amount of the target object within the first preset time period falls within that normal range. If the hair loss amount exceeds the normal hair loss amount range, the target object is determined to be in an unhealthy state; if it is within the normal hair loss amount range, the target object is determined to be in a healthy state.
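A minimal sketch of this range comparison, under the assumption that the preset database can be reduced to a plain dictionary keyed by object, is given below; the key names and the example range are illustrative only.

    # Hypothetical per-object normal ranges (hairs lost per first preset time period).
    normal_ranges = {"object_A": (20, 50)}

    def health_state(object_id: str, hair_loss_amount: float) -> str:
        low, high = normal_ranges[object_id]
        # An amount exceeding the normal range is treated as unhealthy and an
        # amount within it as healthy; behaviour below the lower bound is not
        # specified in the description above.
        return "unhealthy" if hair_loss_amount > high else "healthy"

    print(health_state("object_A", 70))  # prints unhealthy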
In addition, in the embodiment of the invention, when the target object is determined to be in an unhealthy state, a hair loss amount change chart of the target object over a second preset time period can be generated and output, together with alarm information indicating that the target object is in an unhealthy state. The second preset time period includes a plurality of first preset time periods and may be, for example, one week, one month, or two months, which is not limited by the embodiments of the invention.
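As a sketch only, the change chart can be reduced to a per-day series such as the one below; the dates, the dictionary of recorded amounts and the seven-day window are assumptions, and rendering the series as a graph on the terminal is outside the example.

    from datetime import date, timedelta

    def hair_loss_series(daily_amounts: dict, end: date, days: int = 7):
        # Collect the recorded per-day hair loss amounts over the last `days`
        # days (the second preset time period) as (date, amount) pairs.
        start = end - timedelta(days=days - 1)
        return [(start + timedelta(days=i),
                 daily_amounts.get(start + timedelta(days=i), 0))
                for i in range(days)]

    series = hair_loss_series({date(2022, 2, 12): 70}, end=date(2022, 2, 12))
    print(series[-1])  # prints (datetime.date(2022, 2, 12), 70)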
In addition, in the embodiment of the present invention, when the cloud server 102 determines that the target object is in an unhealthy state according to the amount of hair loss of the target object in the first preset period, the cloud server 102 outputs the amount of hair loss of the target object in the first preset period and the current unhealthy state of the target object to the target object through the terminal 101. In this way, the target object can learn about its own health state according to the output content of the terminal 101.
Further, if the health state determined by the cloud server 102 is wrong, the target object may correct it, for example correcting a determined unhealthy state to a healthy state. When it determines that the target object has corrected the determined unhealthy state to a healthy state, the cloud server 102 may reset the normal hair loss amount range corresponding to the target object in the preset database.
For example, assume that the first preset time period is 1 day, that the hair loss amount of the target object obtained by the cloud server 102 within 1 day is 70, and that the normal hair loss amount range of the target object in the preset database is set to 20-50. The cloud server 102 finds that the hair loss amount within the first preset time period exceeds the normal hair loss amount range and determines that the target object is in an unhealthy state. The cloud server 102 then outputs, through the terminal 101, the hair loss amount within the first preset time period and the current unhealthy state to the target object. If the target object considers this judgment to be wrong, it can correct the health state judged by the cloud server 102. Based on the correction and the observed hair loss of 70 per day, the cloud server 102 can reset the normal hair loss amount range corresponding to the target object in the preset database to 50-80.
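How the corrected range is derived is not specified beyond the 70 to 50-80 example above; the sketch below simply re-centres a window of assumed width around the user-confirmed daily amount, with margins chosen so that it reproduces that example.

    def reset_normal_range(confirmed_daily_amount: float,
                           margin_below: float = 20, margin_above: float = 10):
        # Re-centre the stored normal range around the user-confirmed amount.
        # The margins are illustrative assumptions only; 70 yields (50, 80),
        # matching the example in the description.
        return (confirmed_daily_amount - margin_below,
                confirmed_daily_amount + margin_above)

    print(reset_normal_range(70))  # prints (50, 80)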
Further, even when the hair loss amount of the target object is within the normal hair loss amount range, the cloud server 102 may additionally determine the health state of the target object from the colour of the hairs swept up during cleaning. For example, the proportion of yellow or white hairs among the hairs swept up during cleaning is obtained, a normal range for the proportion of yellow or white in the target object's hair colour is set in the preset database, and if the obtained proportion exceeds that normal range, the target object is determined to be in an unhealthy state.
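The colour check is only described at this level of detail; purely as an assumption, it could look like the following sketch, where the per-hair colour labels are taken to come from some unspecified recognition step and the threshold stands in for the normal ratio range stored in the preset database.

    def colour_state(colour_labels: list, normal_max_ratio: float = 0.1) -> str:
        # Flag the target object when the share of yellow or white hairs
        # exceeds the assumed normal ratio threshold.
        if not colour_labels:
            return "healthy"
        flagged = sum(1 for c in colour_labels if c in ("yellow", "white"))
        return "unhealthy" if flagged / len(colour_labels) > normal_max_ratio else "healthy"

    print(colour_state(["black"] * 8 + ["white"] * 2))  # 20% white, prints unhealthy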
This completes the description of the flow shown in fig. 3.
According to the technical solution provided by the embodiments of the invention, an image collected by the sweeping robot while it cleans the active area of a target object is acquired, the hair loss amount of the target object within a first preset time period is determined from the image, and the health state of the target object is determined from that hair loss amount. A normal hair loss amount range for the target object is preset in the database of the sweeping robot; after each cleaning, once the hair loss amount of the target object has been obtained by recognizing the images, it is compared with the normal hair loss amount range, and if it exceeds that range the target object is judged to be in an unhealthy state and can be warned in time. In this way, the sweeping robot cleans the room and, at the same time, determines the health state of the target object from the number of hairs swept up, thereby monitoring the health state of the target object.
Referring to fig. 4, a flowchart of another embodiment of a health detection method according to an embodiment of the present invention is provided. Building on the flow shown in fig. 3, the flow shown in fig. 4 describes how the cloud server 102 determines the hair loss amount of the target object within the first preset time period according to the number of hairs swept up during cleaning, and may include the following steps:
Step 401, obtaining a first time at which the sweeping robot last cleaned the active area of the target object.
Step 402, determining a time interval between the first time and the current time.
The following collectively describes steps 401 and 402:
taking the architecture of the health detection system shown in fig. 1 as an example, each time the cloud server 102 detects that the sweeping robot 103 has cleaned the active area of the target object, it may add a cleaning record to the personal database of the target object, where the record may include: the cleaning time, the cleaned area, and the number of hairs swept up.
Then, the next time the sweeping robot 103 cleans the active area of the target object, the cloud server 102 may, after that cleaning is completed, acquire from the personal database the first time at which the active area was last cleaned, and determine the time interval between the first time and the current time.
For example, if the sweeping robot 103 last cleaned the target object's active area at 12:00 noon on December 1 and the current cleaning takes place at 12:00 noon on December 3 of the same year, the interval between the current cleaning and the previous cleaning is 2 days.
Step 403, determining the alopecia amount of the target object in the first preset time period according to the hair number and the time interval.
As described above, the cloud server 102 has already determined the number of hairs swept up during the cleaning, and can determine the hair loss amount of the target object within the first preset time period based on that number and the above time interval.
As an example, assume that the first preset time period is 1 day and the time interval is 3 days; dividing the number of hairs by the time interval then gives an average that can be taken as the hair loss amount of the target object within the first preset time period.
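Written out, this averaging step amounts to the following; the function name and the example numbers are illustrative only, the description above merely divides the number of swept-up hairs by the number of first preset time periods contained in the interval.

    def hair_loss_per_period(hair_count: int, interval_days: int,
                             period_days: int = 1) -> float:
        # Average the hairs swept up in one cleaning over the elapsed interval,
        # expressed per first preset time period (per day by default).
        return hair_count / (interval_days / period_days)

    print(hair_loss_per_period(hair_count=210, interval_days=3))  # prints 70.0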
Further, the cloud server 102 may output to the target object, through the visual interface of the terminal 101: the number of hairs swept up during this cleaning, the time interval, the calculated hair loss amount of the target object within the first preset time period, and the health state of the target object. After the cloud server 102 receives feedback from the target object, if the feedback confirms that the hair loss amount is correct, the calculated hair loss amount within the first preset time period is taken as the correct value; if the target object modifies the number of hairs or the time interval in the feedback, for example changing the time interval from 3 days to 2 days, the hair loss amount of the target object within the first preset time period is recalculated according to the modified content.
According to the technical solution provided by this embodiment, the first time at which the sweeping robot last cleaned the active area of the target object is obtained, the time interval between the first time and the current time is determined, and the hair loss amount of the target object within a first preset time period is then determined from the number of hairs swept up and that time interval. By acquiring the time of the most recent cleaning and the interval since then, the hair loss amount within the first preset time period can be calculated accurately from the number of hairs swept up, so that while the sweeping robot cleans the room, the health state of the target object is also determined from the number of hairs swept up, the health state is monitored, and the target object can be warned in time when it is in an unhealthy state.
Referring to fig. 6, which is a flowchart of a health detection method according to another embodiment of the present invention, the flow may include the following steps:
first, after the sweeping robot receives a cleaning command, it starts cleaning the room, that is, the active area of the target object. (The target object can issue the cleaning command by setting the cleaning mode of the sweeping robot through the terminal, for example: a whole-house mode that cleans each room in turn, a designated-room mode that cleans a specific room, or an area mode that cleans a specific region.) After cleaning is completed, an image of the swept-up objects in the garbage box is collected and image recognition is performed to obtain the number of hairs among the swept-up objects.
Then, the cloud server judges whether the target object has a personal database (the personal database contains the sweeping robot's cleaning records for the target object's active area and the target object's normal hair loss amount range). If so, the normal hair loss amount range of the target object is obtained from the personal database; if the target object has not established a personal database, the normal hair loss amount range of the target object is obtained from sample standard data.
Next, the cloud server judges whether the target object is in a healthy state according to the normal hair loss amount range. If the number of hairs among the swept-up objects is within the normal hair loss amount range, the target object is in a healthy state, and the relevant data can be displayed through the terminal of the sweeping robot (for example, a weekly, monthly or yearly statistical graph of the target object's hair loss amount is shown in a function module of the terminal). If the number of hairs among the swept-up objects exceeds the normal hair loss amount range, the target object is in an unhealthy state; in that case, the terminal can push early-warning information to the target object (for example, a phone vibration prompt or a yellow warning with an exclamation mark) and display related data (for example, the target object's hair loss amounts over the last half month drawn as a line graph).
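Purely as an illustration of the flow in fig. 6, the steps above could be strung together as in the sketch below. Every helper it relies on is hypothetical: the hair-counting callable, the dictionary standing in for the personal database, and the default range standing in for sample standard data; the chart display and the push of early-warning information are omitted.

    def run_health_check(object_id, images, interval_days, count_hairs_in_image,
                         personal_db, default_range=(20, 50)):
        # Count hairs in the collected images, average per day, compare with the
        # object's normal range (or the default when no personal database entry
        # exists), and return the resulting health state.
        swept = sum(count_hairs_in_image(img) for img in images)
        daily_loss = swept / interval_days
        low, high = personal_db.get(object_id, {}).get("normal_range", default_range)
        state = "unhealthy" if daily_loss > high else "healthy"
        personal_db.setdefault(object_id, {})["last_daily_loss"] = daily_loss
        return state, daily_loss

    db = {}
    print(run_health_check("object_A", [b"i1", b"i2", b"i3"], 3, lambda img: 70, db))
    # prints ('unhealthy', 70.0)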
According to the technical solution provided by the embodiments of the invention, an image collected by the sweeping robot while it cleans the active area of a target object is acquired, the hair loss amount of the target object within a first preset time period is determined from the image, and the health state of the target object is determined from that hair loss amount. A normal hair loss amount range for the target object is preset in the database of the sweeping robot; after each cleaning, once the hair loss amount of the target object has been obtained by recognizing the images, it is compared with the normal hair loss amount range, and if it exceeds that range the target object is judged to be in an unhealthy state and can be warned in time. In this way, the sweeping robot cleans the room and, at the same time, determines the health state of the target object from the number of hairs swept up, thereby monitoring the health state of the target object and enabling a timely warning when the target object is in an unhealthy state.
Referring to fig. 7, a block diagram of an embodiment of a health detection device according to an embodiment of the present invention is provided.
As shown in fig. 7, the apparatus includes:
An acquisition module 71, configured to acquire an image acquired by the sweeping robot during a process of sweeping a moving area of a target object;
a first determining module 72 for determining an amount of hair loss of the target object within a first preset time period from the image;
a second determining module 73, configured to determine a health state of the target object according to an amount of hair loss of the target object in a first preset time period.
In one possible implementation, the obtaining module 71 is specifically configured to:
acquiring, for each area to be cleaned, an image of that area collected by the sweeping robot before it cleans the area, wherein the area to be cleaned is a preset area within the active area of the target object;
and/or,
acquiring an image of the objects swept into the built-in garbage box, collected by the sweeping robot in the process of cleaning the active area of the target object.
In one possible implementation, the first determining module 72 includes (not shown in the figure):
the model input sub-module is used for inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
and the alopecia amount determination submodule is used for determining the alopecia amount of the target object in a first preset time period according to the hair amount.
In one possible embodiment, the alopecia amount determination submodule is specifically configured to:
acquiring a first time at which the sweeping robot last cleaned the active area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair quantity and the time interval.
In one possible embodiment, the second determining module 73 includes (not shown in the figure):
a range determining sub-module, configured to determine a normal hair loss amount range of the target object in the first preset time period from a preset database, where the preset database includes normal hair loss amount ranges of a plurality of objects in the first preset time period;
a judging sub-module, configured to determine whether the amount of hair loss of the target object in the first preset time period is within the determined normal hair loss amount range;
a first determining submodule, configured to determine that the target object is in an unhealthy state if an amount of hair loss of the target object in the first preset time period exceeds the normal hair loss amount range;
And the second determining submodule is used for determining that the target object is in a healthy state if the alopecia amount of the target object in the first preset time period is within the normal alopecia amount range.
In one possible embodiment, the apparatus further comprises (not shown in the figures):
the resetting module is used for resetting the normal hair loss amount range corresponding to the target object in the preset database according to the hair loss amount, in the case where the target object is determined to be in an unhealthy state according to its hair loss amount within a first preset time period and the target object is determined to have corrected the unhealthy state to a healthy state.
In one possible embodiment, the apparatus further comprises (not shown in the figures):
a generation module, configured to generate a hair loss amount change chart of the target object in a second preset time period when the target object is determined to be in an unhealthy state according to the hair loss amount of the target object, where the second preset time period includes a plurality of first preset time periods;
and the output module is used for outputting the alopecia amount change graph and outputting alarm information for indicating that the target object is in an unhealthy state.
Referring to fig. 8, a schematic structural diagram of another sweeping robot according to an embodiment of the present invention, the sweeping robot 800 shown in fig. 8 includes: an image acquisition module 806, at least one processor 801, memory 802, at least one network interface 804, and other user interfaces 803. The various components in the sweeping robot 800 are coupled together by a bus system 805. It is appreciated that the bus system 805 is used to enable connected communications between these components. The bus system 805 includes a power bus, a control bus, and a status signal bus in addition to the data bus. But for clarity of illustration, the various buses are labeled as bus system 805 in fig. 8.
Wherein, the image acquisition module 806 is configured to acquire an image during a process of cleaning an active area of the target object.
The user interface 803 may include a display, keyboard, or pointing device (e.g., mouse, trackball, touch pad, or touch screen, etc.).
It will be appreciated that the memory 802 in embodiments of the invention can be volatile memory or non-volatile memory, or can include both volatile and non-volatile memory. The non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory, among others. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 802 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 802 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 8021 and application programs 8022.
The operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 8022 includes various application programs such as a media player (MediaPlayer), a Browser (Browser), and the like for realizing various application services. The program for implementing the method of the embodiment of the present invention may be contained in the application program 8022.
In the embodiment of the present invention, by calling a program or an instruction stored in the memory 802, specifically, a program or an instruction stored in the application program 8022, the processor 801 is configured to perform method steps provided by each method embodiment, for example, including:
acquiring an image acquired by a sweeping robot in the process of sweeping an active area of a target object;
determining the alopecia amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the alopecia amount of the target object in a first preset time period.
The method disclosed in the above embodiments of the present invention may be applied to, or implemented by, the processor 801. The processor 801 may be an integrated circuit chip with signal processing capabilities. In implementation, the steps of the above method may be performed by integrated hardware logic circuits in the processor 801 or by instructions in the form of software. The processor 801 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly as being executed by a hardware decoding processor, or executed by a combination of hardware and software units in a decoding processor. The software unit may be located in a random access memory, flash memory, read-only memory, programmable read-only memory, electrically erasable programmable memory, registers, or another storage medium well known in the art. The storage medium is located in the memory 802, and the processor 801 reads the information in the memory 802 and, in combination with its hardware, performs the steps of the above method.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or a combination thereof. For a hardware implementation, the processing units may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The sweeping robot provided in this embodiment may be a sweeping robot as shown in fig. 8, and may perform all steps of the health detection method shown in fig. 3 to 4, so as to achieve the technical effects of the health detection method shown in fig. 3 to 4, and the detailed description of fig. 3 to 4 is omitted herein for brevity.
Referring to fig. 9, for a schematic structural diagram of a cloud server provided in an embodiment of the present invention, a cloud server 900 shown in fig. 9 includes: at least one processor 901, memory 902, at least one network interface 904, and other user interfaces 903. The various components in cloud server 900 are coupled together by bus system 905. It is appreciated that the bus system 905 is employed to enable connected communications between these components. The bus system 905 includes a power bus, a control bus, and a status signal bus in addition to the data bus. But for clarity of illustration the various buses are labeled as bus system 905 in fig. 9.
The user interface 903 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, a trackball, a touch pad, or a touch screen, etc.).
It will be appreciated that the memory 902 in embodiments of the invention can be volatile memory or non-volatile memory, or can include both volatile and non-volatile memory. The non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory, among others. The volatile memory may be random access memory (RAM), which acts as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). The memory 902 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some implementations, the memory 902 stores the following elements, executable units or data structures, or a subset thereof, or an extended set thereof: an operating system 9021 and application programs 9022.
The operating system 9021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, for implementing various basic services and processing hardware-based tasks. The application 9022 includes various application programs such as a media player (MediaPlayer), a Browser (Browser), and the like for realizing various application services. A program for implementing the method of the embodiment of the present invention may be included in the application 9022.
In the embodiment of the present invention, by calling a program or instructions stored in the memory 902, specifically a program or instructions stored in the application 9022, the processor 901 is configured to execute the method steps provided by the method embodiments, for example, including:
acquiring an image collected by the sweeping robot in the process of cleaning the active area of the target object;
determining the hair loss amount of the target object within a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object within the first preset time period.
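As a rough illustration of the first two steps, the minimal sketch below counts the hairs in an image of the temporary garbage box and extrapolates a hair loss amount for the first preset time period. The function names, the data shapes, and the time-interval scaling (which follows the idea elaborated in claim 3 of the claims below) are illustrative assumptions rather than the patent's own implementation.

# Minimal sketch (hypothetical names throughout): a stand-in for the trained image
# recognition model plus a simple time-interval extrapolation.

def count_hairs(image_bytes: bytes) -> int:
    """Stand-in for the trained image recognition model that outputs the number of
    hairs contained in an image of the swept objects in the temporary garbage box."""
    raise NotImplementedError("replace with an actual model inference call")

def estimate_hair_loss_amount(image_bytes: bytes,
                              period_hours: float,
                              hours_since_last_clean: float) -> float:
    """Scale the hair count from one cleaning run up to the first preset time period."""
    hairs = count_hairs(image_bytes)
    # Guard against a zero interval when two cleaning runs happen back to back.
    interval = max(hours_since_last_clean, 1e-6)
    return hairs * (period_hours / interval)

For example, 30 hairs collected 12 hours after the previous cleaning run extrapolate to roughly 60 hairs for a 24-hour first preset time period.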
For the connection manner and the function of each component in the cloud server, please refer to the related description of fig. 8; details are not repeated here.
The cloud server provided in this embodiment may be the cloud server shown in fig. 9 and may perform all the steps of the health detection method shown in fig. 3 to fig. 4, thereby achieving the corresponding technical effects; for brevity, the detailed description given with reference to fig. 3 to fig. 4 is not repeated here.
The embodiment of the invention also provides a storage medium (a computer readable storage medium). The storage medium stores one or more programs. The storage medium may include volatile memory, such as random access memory; it may also include non-volatile memory, such as read-only memory, flash memory, a hard disk, or a solid-state disk; it may also include a combination of the above types of memory.
When the one or more programs in the storage medium are executed by one or more processors, the above-described health detection method performed on the sweeping robot side is implemented.
The processor is configured to execute the health detection program stored in the memory to implement the following steps of the health detection method performed on the sweeping robot side:
acquiring an image collected by the sweeping robot in the process of cleaning the active area of the target object;
determining the hair loss amount of the target object within a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object within the first preset time period.
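To make the health-state step concrete, the sketch below walks through the decision logic that claim 1 below spells out in more detail: a lookup of the normal hair loss amount range in the preset database, then a check of the proportion of hairs in a target color when the amount is within range. The dictionary contents, the threshold value, and the treatment of the case where the color ratio is not below the threshold are assumptions made for illustration only.

# Illustrative decision logic only; the range values, the threshold, and the behaviour
# when the color ratio is not below the threshold are assumptions, not the patent's.

NORMAL_RANGES = {"pet_cat": (40.0, 120.0)}   # preset database: normal hair loss range per object
TARGET_COLOR_RATIO_THRESHOLD = 0.2           # preset ratio threshold for hairs in the target color

def determine_health_state(object_id: str,
                           hair_loss_amount: float,
                           target_color_hair_count: int,
                           total_hair_count: int) -> str:
    low, high = NORMAL_RANGES[object_id]

    # Hair loss amount outside the normal range for the first preset time period -> unhealthy.
    if hair_loss_amount < low or hair_loss_amount > high:
        return "unhealthy"

    # Within range: additionally check the share of hairs whose color is the target color
    # (for example grey or white hairs); a small share is treated as healthy.
    if total_hair_count == 0:
        return "healthy"
    ratio = target_color_hair_count / total_hair_count
    return "healthy" if ratio < TARGET_COLOR_RATIO_THRESHOLD else "unhealthy"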
Those skilled in the art will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or as software depends upon the particular application and the design constraints imposed on the technical solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. The software module may be disposed in random access memory (RAM), memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The foregoing description of the embodiments is provided to illustrate the general principles of the invention and is not intended to limit the scope of the invention; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the invention are intended to be included within the scope of the invention.

Claims (10)

1. A method of health detection, the method comprising:
acquiring an image collected by a sweeping robot in the process of cleaning an active area of a target object, wherein the sweeping robot is provided with a temporary garbage box and a garbage box, and an image acquisition module is arranged above the temporary garbage box, and wherein acquiring the image collected by the sweeping robot in the process of cleaning the active area of the target object comprises: acquiring, by the image acquisition module, an image of swept objects in the temporary garbage box, wherein the sweeping robot places the swept objects into the temporary garbage box in the process of cleaning the active area of the target object;
determining an amount of hair loss of the target object within a first preset time period according to the image, comprising: inputting the image into a trained image recognition model to obtain a number of hairs contained in the image output by the image recognition model; and determining the amount of hair loss of the target object within the first preset time period according to the number of hairs;
determining a health state of the target object according to the amount of hair loss of the target object within the first preset time period, comprising: determining a normal hair loss amount range of the target object within the first preset time period from a preset database, wherein the preset database comprises normal hair loss amount ranges of a plurality of objects within the first preset time period; determining whether the amount of hair loss of the target object within the first preset time period is within the determined normal hair loss amount range; if the amount of hair loss of the target object within the first preset time period exceeds the normal hair loss amount range, determining that the target object is in an unhealthy state;
and if the amount of hair loss of the target object within the first preset time period is within the normal hair loss amount range, identifying the hair colors contained in the image, determining a ratio of the number of hairs whose color is a target color to the number of hairs contained in the image, and determining that the target object is in a healthy state in a case where the ratio is smaller than a preset ratio threshold.
2. The method of claim 1, wherein acquiring the image collected by the sweeping robot in the process of cleaning the active area of the target object comprises:
acquiring an image of each area to be cleaned, collected by the sweeping robot before cleaning that area to be cleaned, wherein the area to be cleaned is a preset area within the active area of the target object;
and/or,
acquiring an image of swept objects in the built-in garbage box, collected by the sweeping robot in the process of cleaning the active area of the target object.
3. The method of claim 1, wherein determining the amount of hair loss of the target object within the first preset time period according to the number of hairs comprises:
acquiring a first time at which the sweeping robot last cleaned the active area of the target object;
determining a time interval between the first time and a current time;
and determining the amount of hair loss of the target object within the first preset time period according to the number of hairs and the time interval.
4. The method according to claim 1, wherein the method further comprises:
and in a case where the target object is determined to be in an unhealthy state according to the amount of hair loss of the target object within a first preset time period and the target object is subsequently determined to have returned from the unhealthy state to a healthy state, resetting, according to that amount of hair loss, the normal hair loss amount range corresponding to the target object in the preset database.
5. The method according to any one of claims 1 to 3, wherein, in a case where it is determined that the target object is in an unhealthy state based on the amount of hair loss of the target object, the method further comprises:
generating a hair loss amount change graph of the target object within a second preset time period, wherein the second preset time period comprises a plurality of the first preset time periods;
and outputting the hair loss amount change graph, and outputting alarm information for indicating that the target object is in an unhealthy state.
6. A health detection device, the device comprising:
an acquisition module, configured to acquire an image collected by a sweeping robot in the process of cleaning an active area of a target object, wherein the sweeping robot is provided with a temporary garbage box and a garbage box, and an image acquisition module is arranged above the temporary garbage box, and wherein acquiring the image collected by the sweeping robot in the process of cleaning the active area of the target object comprises: acquiring, by the image acquisition module, an image of swept objects in the temporary garbage box, wherein the sweeping robot places the swept objects into the temporary garbage box in the process of cleaning the active area of the target object;
a first determining module, configured to determine, according to the image, an amount of hair loss of the target object within a first preset time period, including: inputting the image into a trained image recognition model to obtain a number of hairs contained in the image output by the image recognition model; and determining the amount of hair loss of the target object within the first preset time period according to the number of hairs;
a second determining module, configured to determine a health state of the target object according to the amount of hair loss of the target object within the first preset time period, including: determining a normal hair loss amount range of the target object within the first preset time period from a preset database, wherein the preset database comprises normal hair loss amount ranges of a plurality of objects within the first preset time period; determining whether the amount of hair loss of the target object within the first preset time period is within the determined normal hair loss amount range; if the amount of hair loss of the target object within the first preset time period exceeds the normal hair loss amount range, determining that the target object is in an unhealthy state;
and if the amount of hair loss of the target object within the first preset time period is within the normal hair loss amount range, identifying the hair colors contained in the image, determining a ratio of the number of hairs whose color is a target color to the number of hairs contained in the image, and determining that the target object is in a healthy state in a case where the ratio is smaller than a preset ratio threshold.
7. A sweeping robot, comprising: a temporary garbage box, a garbage box, an image acquisition module, a processor, and a memory;
the image acquisition module is configured to collect an image of swept objects in the temporary garbage box in the process of cleaning an active area of a target object, wherein the sweeping robot places the swept objects into the temporary garbage box in the process of cleaning the active area of the target object;
the processor is configured to execute a health detection program stored in the memory to implement the health detection method of any one of claims 1 to 5.
8. A cloud server, comprising: a processor and a memory, the processor being configured to execute a health detection program stored in the memory to implement the health detection method of any one of claims 1 to 5.
9. A health detection system, comprising: a sweeping robot, a cloud server, and a terminal;
the sweeping robot collects an image of swept objects in a temporary garbage box provided on the sweeping robot in the process of cleaning an active area of a target object, and sends the image to the cloud server;
the cloud server determines the hair loss amount of the target object within a first preset time period according to the image, determines the health state of the target object according to the hair loss amount of the target object within the first preset time period, and sends the health state of the target object to the terminal;
and the terminal outputs the health state of the target object.
10. A storage medium storing one or more programs executable by one or more processors to implement the health detection method of any of claims 1-5.
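As a companion to claims 4 and 5 above, the sketch below shows one possible way to assemble the hair loss amount change graph over the second preset time period, raise the alarm, and reset the normal range once the target object is confirmed to have recovered. The data shapes, the matplotlib-based plotting, and the min/max recalibration heuristic are assumptions for illustration, not part of the claimed method.

# Hypothetical wiring for claims 4 and 5: trend chart over the second preset time period
# (made up of several first preset time periods), an alarm, and a heuristic range reset.

from typing import Sequence, Tuple
import matplotlib.pyplot as plt

def plot_hair_loss_trend(per_period_amounts: Sequence[float],
                         path: str = "hair_loss_trend.png") -> None:
    """Render the hair loss amount change graph for the second preset time period."""
    plt.figure()
    plt.plot(range(1, len(per_period_amounts) + 1), per_period_amounts, marker="o")
    plt.xlabel("first preset time period index")
    plt.ylabel("hair loss amount")
    plt.title("Hair loss over the second preset time period")
    plt.savefig(path)
    plt.close()

def alarm_and_maybe_recalibrate(per_period_amounts: Sequence[float],
                                normal_range: Tuple[float, float],
                                recovered: bool) -> Tuple[bool, Tuple[float, float]]:
    """Return (alarm raised, possibly updated normal hair loss range)."""
    low, high = normal_range
    unhealthy = any(amount < low or amount > high for amount in per_period_amounts)
    if unhealthy:
        # Claim 5: output the change graph together with the alarm information.
        plot_hair_loss_trend(per_period_amounts)
    if unhealthy and recovered:
        # Claim 4: once the object is confirmed back in a healthy state, reset the stored
        # normal range; bracketing the recently observed amounts is a heuristic here.
        low, high = min(per_period_amounts), max(per_period_amounts)
    return unhealthy, (low, high)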
CN202210130164.0A 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium Active CN114532923B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210130164.0A CN114532923B (en) 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium


Publications (2)

Publication Number Publication Date
CN114532923A CN114532923A (en) 2022-05-27
CN114532923B true CN114532923B (en) 2023-09-12

Family

ID=81674060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210130164.0A Active CN114532923B (en) 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium

Country Status (1)

Country Link
CN (1) CN114532923B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103268312A (en) * 2013-05-03 2013-08-28 同济大学 Training corpus collection system and method based on user feedback
CN110765895A (en) * 2019-09-30 2020-02-07 北京鲲鹏神通科技有限公司 Method for distinguishing object by robot
CN112862789A (en) * 2021-02-10 2021-05-28 上海大学 Interactive image segmentation method based on machine learning
CN113095230A (en) * 2021-04-14 2021-07-09 北京深睿博联科技有限责任公司 Method and device for helping blind person to search for articles
CN113468919A (en) * 2020-03-31 2021-10-01 青岛海尔智能技术研发有限公司 Method, device and equipment for providing life advice
CN113591512A (en) * 2020-04-30 2021-11-02 青岛海尔智能技术研发有限公司 Method, device and equipment for hair identification


Also Published As

Publication number Publication date
CN114532923A (en) 2022-05-27

Similar Documents

Publication Publication Date Title
CN1774679B (en) Process control system and method for configuring a process control system
US7430494B2 (en) Dynamic data stream histograms for no loss of information
CN107491474A (en) Information recommendation method and device
Cornou et al. Classification of sows’ activity types from acceleration patterns using univariate and multivariate models
CN108389631A (en) Varicella morbidity method for early warning, server and computer readable storage medium
CN111643011A (en) Cleaning robot control method and device, cleaning robot and storage medium
CN114414935A (en) Automatic positioning method and system for feeder fault area of power distribution network based on big data
CN106357480A (en) Method and device for monitoring network performance of application and mobile terminal
CN115545058A (en) Water meter data analysis method and system and readable storage medium
CN114532923B (en) Health detection method and device, sweeping robot and storage medium
CN117235873B (en) Smart home layout method and system based on historical work record
JP6870312B2 (en) Measure introduction effect prediction device, measure introduction effect prediction program and measure introduction effect prediction method
CN111310351A (en) Method and device for monitoring state of workshop equipment, terminal equipment and storage medium
CN117077854A (en) Building energy consumption monitoring method and system based on sensor network
CN109407526B (en) Equipment detection method and device and household appliance
CN103631232A (en) Data monitoring control method and data monitoring control device
CN110838074A (en) Analysis method and device
Vignesh et al. Deep Reinforcement Learning Based Weather Monitoring System using Arduino for Smart Environment
CN112205927B (en) Intelligent sweeping method and device of sweeping robot
JP2010108102A (en) Action pattern extraction device, method, and program
CN115177184A (en) Water adding method and system for water tank of cleaning robot
CN109719735B (en) Environment data recording method and system and robot
CN117522019A (en) Mulberry management method, device, equipment and readable storage medium
CN115390690A (en) Coordinate calibration method and device, storage medium and computer equipment
CN116264950A (en) Setting method of sweeping robot, sweeping robot and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant