CN114532923A - Health detection method and device, sweeping robot and storage medium - Google Patents

Health detection method and device, sweeping robot and storage medium

Info

Publication number
CN114532923A
Authority
CN
China
Prior art keywords
target object
time period
preset time
alopecia
determining
Prior art date
Legal status
Granted
Application number
CN202210130164.0A
Other languages
Chinese (zh)
Other versions
CN114532923B (en)
Inventor
张瑞洁
李绍斌
宋德超
陈翀
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Zhuhai Lianyun Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai and Zhuhai Lianyun Technology Co Ltd
Priority to CN202210130164.0A
Publication of CN114532923A
Application granted
Publication of CN114532923B
Legal status: Active (current)
Anticipated expiration

Classifications

    • A HUMAN NECESSITIES
    • A47 FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00 Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24 Floor-sweeping machines, motor-driven
    • A47L11/40 Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4002 Installations of electric equipment
    • A47L11/4011 Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L2201/00 Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/2458 Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462 Approximate or statistical queries

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Chemical & Material Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Urology & Nephrology (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • Hematology (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Fuzzy Systems (AREA)
  • Food Science & Technology (AREA)
  • Medicinal Chemistry (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Manipulator (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

Embodiments of the invention relate to a health detection method and device, a sweeping robot, and a storage medium. The health detection method includes: acquiring an image collected by the sweeping robot while it cleans the moving area of a target object; determining, from the image, the hair loss amount of the target object within a first preset time period; and determining the health state of the target object from the hair loss amount of the target object within the first preset time period. In this way, the sweeping robot can determine the health state of the target object from the number of hairs it has swept up while cleaning a room, so that the health state of the target object can be monitored.

Description

Health detection method and device, sweeping robot and storage medium
Technical Field
Embodiments of the invention relate to the technical field of smart homes, and in particular to a health detection method and device, a sweeping robot, and a storage medium.
Background
The floor sweeping robot, also known as an automatic sweeper, intelligent vacuum cleaner or robot vacuum, is a type of smart household appliance that can automatically clean an area to be cleaned.
While cleaning, the sweeping robot may sweep up hair. Hair is one of the health indicators of the human body, and hair loss can mean that the body is in an unhealthy state. Hair loss can be roughly divided into nutritional alopecia, psychogenic stress alopecia, and pressure alopecia. Of these, nutritional alopecia and psychogenic stress alopecia can be improved through conditioning, whereas pressure alopecia is more severe and, if not addressed in time, can lead to permanent hair loss. Therefore, warning the user about their health in time when hair loss becomes serious is of great practical significance.
Disclosure of Invention
In view of this, and so that the sweeping robot can give a timely health warning based on the user's hair loss amount, embodiments of the present invention provide a health detection method and apparatus, a sweeping robot, and a storage medium.
In a first aspect, an embodiment of the present invention provides a health detection method, including:
acquiring an image acquired by a sweeping robot in the process of sweeping the moving area of a target object;
determining the hair loss amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object in a first preset time period.
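By way of illustration only, the three steps above can be read as a small processing pipeline. The sketch below is an assumption: the helper names (count_hairs, detect_health), the data types, and the normalisation to the first preset time period are illustrative and are not prescribed by this disclosure.

    from dataclasses import dataclass
    from typing import Callable, Iterable, Tuple

    @dataclass
    class HealthReport:
        hair_loss_in_period: float
        healthy: bool

    def detect_health(images: Iterable[bytes],
                      count_hairs: Callable[[bytes], int],
                      interval_days: float,
                      period_days: float,
                      normal_range: Tuple[float, float]) -> HealthReport:
        # Step 1: the images were collected by the sweeping robot while it
        # cleaned the moving area of the target object.
        # Step 2: count the hairs in every image and scale the total to the
        # first preset time period (e.g. per day).
        total = sum(count_hairs(image) for image in images)
        hair_loss = total / interval_days * period_days
        # Step 3: compare the amount against the subject's normal range.
        low, high = normal_range
        return HealthReport(hair_loss, low <= hair_loss <= high)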
In one possible embodiment, acquiring an image acquired by the sweeping robot during sweeping of the moving area of the target object includes:
acquiring an image of each to-be-cleaned area acquired by a sweeping robot before each to-be-cleaned area is cleaned, wherein the to-be-cleaned area is a preset area in the target object moving area;
and/or,
and acquiring images of the cleaning objects in the built-in garbage box, which are acquired by the sweeping robot in the process of sweeping the moving area of the target object.
In one possible embodiment, determining the hair loss amount of the target object in the first preset time period according to the image comprises:
inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
determining the alopecia amount of the target object in a first preset time period according to the hair quantity.
In one possible embodiment, the determining the hair loss amount of the target object in the first preset time period according to the hair amount includes:
acquiring a first time at which the sweeping robot last cleaned the moving area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair number and the time interval.
In one possible embodiment, the determining the health status of the target subject according to the hair loss amount of the target subject within a first preset time period includes:
determining the normal alopecia volume range of the target subject in the first preset time period from a preset database, wherein the preset database comprises the normal alopecia volume ranges of a plurality of subjects in the first preset time period;
determining whether the alopecia amount of the target object in the first preset time period is within the determined normal alopecia amount range;
if the alopecia amount of the target subject in the first preset time period exceeds the normal alopecia amount range, determining that the target subject is in an unhealthy state;
and if the alopecia volume of the target subject in the first preset time period is within the normal alopecia volume range, determining that the target subject is in a healthy state.
In one possible embodiment, the method further comprises:
and resetting the normal alopecia amount range corresponding to the target object in the preset database according to the alopecia amount under the condition that the target object is determined to be in an unhealthy state according to the alopecia amount of the target object in a first preset time period and the target object is determined to correct the unhealthy state into a healthy state.
In one possible embodiment, in case that it is determined that the target subject is in an unhealthy state according to the hair loss amount of the target subject, the method further comprises:
generating a pattern of hair loss variation of the target object in a second preset time period, wherein the second preset time period comprises a plurality of first preset time periods;
and outputting the alopecia volume change graph and outputting alarm information for indicating that the target object is in an unhealthy state.
In a second aspect, an embodiment of the present invention provides a health detection apparatus, including:
the acquisition module is used for acquiring an image acquired by the sweeping robot in the process of sweeping the moving area of the target object;
the first determining module is used for determining the alopecia amount of the target object in a first preset time period according to the image;
and the second determination module is used for determining the health state of the target object according to the hair loss amount of the target object in a first preset time period.
In a possible implementation manner, the obtaining module is specifically configured to:
acquiring an image of each to-be-cleaned area acquired by a sweeping robot before each to-be-cleaned area is cleaned, wherein the to-be-cleaned area is a preset area in the target object moving area;
and/or,
and acquiring images of the cleaning objects in the built-in garbage box, which are acquired by the sweeping robot in the process of sweeping the moving area of the target object.
In one possible embodiment, the first determining module includes:
the model input submodule is used for inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
and the hair loss amount determining submodule is used for determining the hair loss amount of the target object in a first preset time period according to the hair quantity.
In one possible embodiment, the alopecia amount determination submodule is specifically configured to:
acquiring a first time at which the sweeping robot last cleaned the moving area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair number and the time interval.
In one possible implementation, the second determining module includes:
the range determining submodule is used for determining the normal alopecia volume range of the target object in the first preset time period from a preset database, and the preset database comprises the normal alopecia volume ranges of a plurality of objects in the first preset time period;
the judgment sub-module is used for determining whether the alopecia amount of the target object in the first preset time period is within the determined normal alopecia amount range;
the first determining submodule is used for determining that the target object is in an unhealthy state if the alopecia amount of the target object in the first preset time period exceeds the normal alopecia amount range;
and the second determining submodule is used for determining that the target object is in a healthy state if the alopecia volume of the target object in the first preset time period is within the normal alopecia volume range.
In one possible embodiment, the apparatus further comprises:
the resetting module is used for resetting the normal alopecia volume range corresponding to the target object in the preset database according to the alopecia volume of the target object in a first preset time period under the condition that the target object is determined to be in an unhealthy state and the target object is determined to correct the unhealthy state to a healthy state.
In one possible embodiment, the apparatus further comprises:
the generating module is used for generating an alopecia volume change chart of the target object in a second preset time period under the condition that the target object is determined to be in an unhealthy state according to the alopecia volume of the target object, wherein the second preset time period comprises a plurality of first preset time periods;
and the output module is used for outputting the alopecia volume change chart and outputting alarm information for indicating that the target object is in an unhealthy state.
In a third aspect, an embodiment of the present invention provides a sweeping robot, including: the system comprises an image acquisition module, a processor and a memory;
the image acquisition module is used for acquiring images in the process of cleaning the moving area of the target object;
the processor is configured to execute the health detection program stored in the memory to implement the health detection method of any one of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a cloud server, including: a processor and a memory, the processor being configured to execute a health detection program stored in the memory to implement the health detection method of any of the first aspects.
In a fifth aspect, an embodiment of the present invention provides a health detection system, including: the sweeping robot, the cloud server and the terminal;
the sweeping robot collects images in the process of sweeping the moving area of the target object and sends the images to the cloud server;
the cloud server determines the hair loss amount of the target object in a first preset time period according to the image; determining the health state of the target object according to the hair loss amount of the target object in a first preset time period; sending the health state of the target object to the terminal;
the terminal outputs the health state of the target object.
A sixth aspect of the present invention provides a storage medium storing one or more programs, where the one or more programs are executable by one or more processors to implement the health detection method according to any one of the first aspects.
According to the technical scheme provided by the embodiment of the invention, the image acquired by the sweeping robot in the process of sweeping the moving area of the target object is acquired, and then the alopecia amount of the target object in a first preset time period is determined according to the image; and determining the health status of the target subject according to the alopecia amount. The normal alopecia volume range of the target object is preset in the database of the sweeping robot, after the alopecia volume of the target object is obtained through the recognition image after cleaning is completed each time, the normal alopecia volume range of the target object is compared, when the alopecia volume of the target object exceeds the normal alopecia volume range of the target object, the target object is judged to be in an unhealthy state, and the target object can be alarmed in time. The cleaning robot can determine the health state of the target object according to the number of the cleaned hairs while cleaning a room, so that the health state of the target object can be monitored.
Drawings
Fig. 1 is a schematic diagram of a health detection system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of a sweeping robot according to an embodiment of the present invention;
FIG. 3 is a flowchart illustrating an embodiment of a health detection method according to the present invention;
FIG. 4 is a flowchart illustrating another embodiment of a health detection method according to the present invention;
fig. 5 is a schematic flowchart of an image recognition process performed by a CNN neural network according to an embodiment of the present invention;
FIG. 6 is a flowchart illustrating another embodiment of a health detection method according to the present invention;
FIG. 7 is a block diagram of an embodiment of a health detection apparatus according to the present invention;
fig. 8 is a schematic structural diagram of another sweeping robot provided in the embodiment of the present invention;
fig. 9 is a schematic structural diagram of a cloud server according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, a schematic diagram of a health detection system according to an embodiment of the present invention is shown.
The health detection system shown in fig. 1 includes: a terminal 101, a cloud server 102, and a sweeping robot 103. The terminal 101, the cloud server 102 and the sweeping robot 103 are connected through network communication.
The terminal 101 may be hardware or software that supports a network connection and provides various network services. When the terminal 101 is hardware, it may be any of various electronic devices with a display screen, including but not limited to smartphones, tablet computers, laptop computers, desktop computers, and the like; only a smartphone is shown in fig. 1 as an example. When the terminal 101 is software, it may be installed in any of the electronic devices listed above. In the embodiment of the present invention, the terminal 101 may establish communication with the cloud server 102 and the sweeping robot 103 by installing a corresponding application.
The cloud server 102 may be implemented by using one server, or may be implemented in a form of a server cluster formed by multiple servers, which is not limited in this embodiment of the present invention.
The sweeping robot 103 may include an image acquisition module, which is configured to acquire an image during a process of sweeping a moving area of a target object, and send the acquired image to the cloud server 102 through a network.
In one embodiment, the user may control the sweeping robot 103 to sweep the active area of the target object through the terminal 101. When the sweeping robot 103 sweeps the moving area of the target object, the image of the area to be swept or the swept object can be collected. Then, the sweeping robot 103 sends the acquired image to the cloud server 102. The cloud server 102 may apply the health detection method provided by the embodiment of the present invention to detect the health status of the target object.
In another embodiment, the user may control the sweeping robot 103 to sweep the active area of the target object through the terminal 101. When the sweeping robot 103 cleans the moving area of the target object, the health state of the target object can be detected by applying the health detection method provided by the embodiment of the invention.
The health detection method provided by the present invention is further explained with reference to the following embodiments, which are not to be construed as limiting the embodiments of the present invention.
Referring to fig. 3, a flowchart of an embodiment of a health detection method according to an embodiment of the present invention is provided. For one embodiment, the process illustrated in FIG. 3 may be applied to a cloud server, such as cloud server 102 illustrated in FIG. 1. As another embodiment, the process shown in fig. 3 can also be applied to a sweeping robot, such as the sweeping robot 103 illustrated in fig. 1. The flow shown in fig. 3 is only applied to the cloud server 102 as an example, and the health detection method provided by the embodiment of the present invention is described. As shown in fig. 3, the process may include the following steps:
step 301, acquiring an image acquired by the sweeping robot in the process of sweeping the moving area of the target object.
Step 302, determining the alopecia amount of the target object in a first preset time period according to the image.
Step 301 and step 302 are collectively described below:
in real life, users often have fixed active areas. For example, in daily life, a user may own a separate bedroom, and in this example, the active region of the target object may refer to the bedroom of the target object.
Taking the schematic architecture diagram of the health detection system shown in fig. 1 as an example, in the embodiment of the present invention, the sweeping robot 103 may collect an image during the process of cleaning the moving area of the target object, and upload the collected image to the cloud server 102, and the cloud server 102 determines the hair loss amount of the target object in the first preset time period according to the collected image. The first preset time period may be 1 day, 2 days, or 3 days, which is not limited in the embodiment of the present invention.
As a possible implementation manner, the cloud server 102 acquires an image of the area to be cleaned, which is acquired by the sweeping robot 103 before each area to be cleaned is cleaned. The area to be cleaned may be a predetermined area in the target object movement area.
Specifically, each time the cleaning robot 103 cleans an area to be cleaned, an image of the area to be cleaned may be collected first. The image is then uploaded to the cloud server 102. In this way, the cloud server 102 may sequentially acquire images of a plurality of areas to be cleaned.
Then, the cloud server 102 inputs each acquired image into the trained image recognition model and obtains the number of hairs contained in that image as output by the model. When the sweeping robot 103 has cleaned the entire active area of the target object, the cloud server 102 may sum the hair counts of all the images collected during the cleaning run to obtain the total number of hairs swept up by the sweeping robot 103 in that run, and from this total the cloud server 102 can determine the hair loss amount of the target object within the first preset time period.
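As a minimal sketch of this accumulation step (the function names, the per-area breakdown, and the use of a callable in place of the trained model are assumptions for illustration only):

    from collections import defaultdict
    from typing import Callable, Iterable, Tuple

    def hairs_swept_this_run(area_images: Iterable[Tuple[str, bytes]],
                             recognise_hair_count: Callable[[bytes], int]):
        """Sum the hair counts recognised in the images captured before each
        area to be cleaned is cleaned; returns the total plus a per-area map."""
        per_area = defaultdict(int)
        for area_id, image in area_images:
            per_area[area_id] += recognise_hair_count(image)
        return sum(per_area.values()), dict(per_area)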
As another possible implementation manner, when acquiring the image acquired by the sweeping robot 103, the cloud server 102 may acquire an image of a swept object in a built-in garbage box of the sweeping robot 103, which is acquired during sweeping of an active area of the target object, and determine an alopecia amount of the target object within a first preset time period according to the image.
Specifically, fig. 2 is a schematic structural diagram of a sweeping robot provided in an embodiment of the present invention. As shown in fig. 2, the sweeping robot is provided with a temporary garbage box and a garbage box, with an image acquisition module arranged above the temporary garbage box. While cleaning the moving area of the target object, the sweeping robot places the swept objects in the temporary garbage box, and the image acquisition module above the temporary garbage box can capture images of the swept objects held there.
Optionally, the image acquisition module may periodically capture images of the swept objects in the temporary garbage box and, once cleaning is complete, send the captured images to the cloud server 102. The cloud server 102 may input the received images into the trained image recognition model to obtain the number of hairs contained in each of them, then sum these counts to obtain the total number of hairs swept up by the sweeping robot 103 during the cleaning run, from which it can determine the hair loss amount of the target object within the first preset time period.
Optionally, the image acquisition module may instead capture a single image of the swept objects in the temporary garbage box after the sweeping robot 103 finishes cleaning, and send that image to the cloud server 102. The cloud server 102 then inputs it into the trained image recognition model to obtain the number of hairs it contains; this count is the number of hairs swept up during the cleaning run, from which the cloud server 102 can determine the hair loss amount of the target object within the first preset time period.
As for how the cloud server 102 determines the hair loss amount of the target user in the first preset time period according to the hair amount cleaned in the cleaning process, the following is explained by the flow shown in fig. 4, and detailed description is not given here.
Optionally, the image recognition model may be a CNN (Convolutional Neural Network) model. As shown in fig. 5, a schematic flow chart of image recognition for a CNN neural network provided in an embodiment of the present invention is shown.
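The disclosure states only that a trained CNN outputs the number of hairs in an image and does not give an architecture; the following PyTorch model is therefore purely an assumed example of what such a hair-count regressor could look like, not the patented network.

    import torch
    import torch.nn as nn

    class HairCountCNN(nn.Module):
        """Toy CNN that regresses a hair count from a 3x128x128 image."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),
            )
            self.head = nn.Linear(64, 1)

        def forward(self, x):
            x = self.features(x).flatten(1)
            return self.head(x).squeeze(-1)  # predicted (fractional) hair count

    # Usage: counts = HairCountCNN()(torch.rand(4, 3, 128, 128)).round()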
Step 303, determining the health status of the target subject according to the hair loss amount of the target subject in the first preset time period.
In the embodiment of the present invention, the cloud server 102 may be preset with a database, where the preset database includes a normal alopecia amount range of a plurality of subjects in a first preset time period. As an embodiment, in an initial situation, the cloud server 102 may set a normal hair loss amount range of each subject in a first preset time period according to preset standard data; thereafter, the cloud server 102 may update the normal alopecia amount range of the subject according to the cleaning record of the cleaning robot 103.
Based on the above description, in an embodiment, when determining the health status of the target subject according to the hair loss amount of the target subject within the first preset time period, the cloud server 102 may first determine a normal hair loss amount range of the target subject within the first preset time period from the preset database; then, whether the alopecia amount of the target object in the first preset time period is within the determined normal alopecia amount range is determined. If the alopecia volume of the target subject in the first preset time period exceeds the normal alopecia volume range, determining that the target subject is in an unhealthy state; if the alopecia amount of the target subject in the first preset time period is within the normal alopecia amount range, the target subject can be determined to be in a healthy state.
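A minimal sketch of this comparison, with the preset database stood in for by a plain dictionary and the numeric ranges invented for illustration:

    # Hypothetical preset database of normal hair-loss ranges for the first
    # preset time period (here: one day), keyed by subject identifier.
    NORMAL_RANGES = {"subject_a": (20, 50), "subject_b": (30, 80)}

    def assess_health(subject_id: str, hair_loss: float,
                      ranges: dict = NORMAL_RANGES) -> str:
        low, high = ranges[subject_id]
        # Inside the normal range -> healthy; outside it -> unhealthy.
        return "healthy" if low <= hair_loss <= high else "unhealthy"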
In addition, in the embodiment of the present invention, in a case where it is determined that the target object is in an unhealthy state, a hair loss amount variation graph of the target object over a second preset time period may be generated and output, together with alarm information indicating that the target object is in an unhealthy state. The second preset time period may include a plurality of first preset time periods, for example one week, one month, or two months, which is not limited in the embodiment of the present invention.
In addition, in the embodiment of the present invention, when the cloud server 102 determines that the target object is in the unhealthy state according to the hair loss amount of the target object within the first preset time period, the cloud server 102 outputs the hair loss amount of the target object within the first preset time period and the current unhealthy state of the target object to the target object through the terminal 101. In this way, the target object can know the health status of itself according to the output content of the terminal 101.
Further, when the target object determines that the cloud server 102 has determined the health status of the target object incorrectly, the target object may correct the health status determined by the cloud server 102, for example, correct the determined unhealthy status to a healthy status. The cloud server 102 may reset the normal hair loss amount range corresponding to the target object in the preset database when it is determined that the target object corrects the determined unhealthy state to a healthy state.
For example, assume that the first preset time period is 1 day and that the hair loss amount of the target object obtained by the cloud server 102 for that day is 70 hairs, while the normal hair loss range of the target object in the preset database is set to 20-50. The cloud server 102 finds that the hair loss amount in the first preset time period exceeds the normal range and determines that the target object is in an unhealthy state. The cloud server 102 then outputs, through the terminal 101, the hair loss amount of the target object in the first preset time period together with the determined unhealthy state; if the target object considers this judgement wrong, it may correct the health state determined by the cloud server 102. In that case, the cloud server 102 can reset the normal hair loss range corresponding to the target object in the preset database to 50-80, based on the confirmed daily hair loss amount of 70.
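The disclosure gives only this worked example (a confirmed 70 hairs per day leading to a reset range of 50-80) and does not state a general resetting rule; the margins in the sketch below are chosen solely so that the example is reproduced and are an assumption, not the patented formula.

    def reset_normal_range(confirmed_daily_loss: float,
                           lower_margin: float = 20.0,
                           upper_margin: float = 10.0) -> tuple:
        """Reset the subject's normal range around a hair-loss amount the
        subject has confirmed as healthy, e.g. 70 -> (50, 80)."""
        return (confirmed_daily_loss - lower_margin,
                confirmed_daily_loss + upper_margin)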
In addition, when the hair loss amount of the target object is within the normal hair loss amount range, the cloud server 102 may further assess the health status of the target object from the colour of the hair swept up during cleaning. For example, it may obtain the proportion of yellow or white hairs among the swept hair, with a normal proportion range of yellow or white in the hair colour of the target object stored in the preset database; if the proportion exceeds that normal range, the target object is determined to be in an unhealthy state.
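A sketch of this colour check, assuming an upstream classifier has already labelled each swept hair with a colour; the labels and the helper name are illustrative:

    def grey_hair_ratio(hair_colours: list) -> float:
        """Fraction of swept hairs labelled yellow or white by an assumed
        upstream colour classifier."""
        if not hair_colours:
            return 0.0
        flagged = sum(colour in ("yellow", "white") for colour in hair_colours)
        return flagged / len(hair_colours)

    # If the ratio exceeds the normal proportion range stored for the subject
    # in the preset database, the subject is likewise treated as unhealthy.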
This completes the description of the flow shown in fig. 3.
According to the technical scheme provided by the embodiment of the invention, the image acquired by the sweeping robot in the process of sweeping the moving area of the target object is acquired, and then the alopecia amount of the target object in a first preset time period is determined according to the image; and determining the health status of the target subject according to the alopecia amount. The normal alopecia amount range of the target object is preset in a database of the sweeping robot, after the alopecia amount of the target object is obtained through the recognition image after cleaning is completed each time, the normal alopecia amount range of the target object is compared, when the alopecia amount of the target object exceeds the normal alopecia amount range of the target object, the target object is judged to be in an unhealthy state, and the target object can be warned in time. The cleaning robot can determine the health state of the target object according to the number of the cleaned hairs while cleaning a room, so that the health state of the target object can be monitored.
Referring to fig. 4, a flowchart of another embodiment of a health detection method according to an embodiment of the present invention is provided. The flow shown in fig. 4 is based on the flow shown in fig. 3, and describes how the cloud server 102 determines the alopecia amount of the target user in the first preset time period according to the hair amount cleaned in the cleaning process, as shown in fig. 4, the flow may include the following steps:
step 401, obtaining the first time when the sweeping robot cleans the active area of the target object last time.
Step 402, determining a time interval between the first time and the current time.
Step 401 and step 402 are collectively described below:
Taking the architecture of the health detection system shown in fig. 1 as an example, each time the cloud server 102 detects that the sweeping robot 103 has cleaned the active area of the target object, it may add a cleaning record to the personal database of the target object, where the cleaning record may include the cleaning time, the cleaned area, and the number of hairs swept up.
Thereafter, when the sweeping robot 103 performs the next sweeping, after the sweeping is completed, the cloud server 102 may obtain the first time of the last sweeping of the active area of the target object from the personal database. A time interval between the first time and the current time may be determined based on the first time.
For example, if the sweeping robot 103 last cleaned the moving area of the target object at 12:00 noon on December 1 and the current cleaning takes place at 12:00 noon on December 3 of the same year, the time interval between the two cleanings is 2 days.
And step 403, determining the alopecia amount of the target object in a first preset time period according to the hair quantity and the time interval.
In the above description, the cloud server 102 has determined the number of hairs cleaned during the cleaning process, and based on the number of hairs cleaned and the time interval, the alopecia amount of the target object in the first preset time period may be determined.
For example, assuming that the first preset time period is 1 day, and continuing to assume that the time interval is 3 days, the average value obtained by dividing the hair number by the time interval can be used as the hair loss amount of the target object in the first preset time period.
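In other words, the hair loss amount is a simple rate calculation; the sketch below uses invented numbers (150 hairs counted over a 3-day interval gives 50 hairs per day):

    def hair_loss_per_period(hair_count: int, interval_days: float,
                             period_days: float = 1.0) -> float:
        """Average the counted hairs over the interval since the last cleaning
        and scale the result to the first preset time period."""
        return hair_count / interval_days * period_days

    # Invented numbers: 150 hairs counted over a 3-day interval -> 50 per day.
    assert hair_loss_per_period(150, 3) == 50.0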
In addition, the cloud server 102 may output, to the target object, through the visual interface of the terminal 101: the number of hairs cleaned in the cleaning process, the time interval, the hair loss amount of the target object in the first preset time period obtained through calculation, and the health state of the target object. After the cloud server 102 receives the feedback information of the target object, if the alopecia amount of the target object is determined to be correct in the feedback information, determining that the calculated alopecia amount of the target object in a first preset time period is a correct value; if the target object modifies the number of hairs cleaned in the cleaning process or the time interval in the feedback information, for example, the time interval is modified from 3 days to 2 days, the hair loss amount of the target object in the first preset time period is recalculated according to the modified content of the target object.
According to the technical scheme provided by the embodiment of the invention, the first time when the sweeping robot cleans the active area of the target object for the last time is obtained, the time interval between the first time and the current time is determined, and then the alopecia volume of the target object in the first preset time period is determined according to the number of the hair cleaned and the time interval. Due to the fact that the first time of the last cleaning is obtained and the time interval between the first time and the current time is determined, the hair loss amount of the target object in the first preset time period can be accurately calculated according to the number of the cleaned hairs and the time interval, the cleaning robot can determine the health state of the target object according to the number of the cleaned hairs while cleaning a room, the health state of the target object can be monitored, and an alarm can be given in time when the target object is in an unhealthy state.
Referring to fig. 6, a flowchart of another embodiment of a health detection method provided by the embodiment of the present invention is shown in fig. 6, where the flowchart may include the following steps:
First, after receiving a cleaning command, the sweeping robot starts to clean the room, that is, the active area of the target object. (When issuing the cleaning command through the terminal, the target object can set the cleaning mode of the sweeping robot, for example a whole-house mode that cleans each room in turn, a designated-room mode that cleans a specific room, or a designated-region mode that cleans a specific region.) After cleaning is finished, an image of the swept objects in the garbage box is acquired and image recognition is performed on it to obtain the number of hairs among the swept objects.
Then, the cloud server judges whether a personal database is established for the target object (the personal database comprises a cleaning record of the sweeping robot on the activity area of the target object and a normal alopecia amount range of the target object), and if so, the normal alopecia amount range of the target object is obtained from the personal database; and if the target object does not establish a personal database, acquiring the normal alopecia amount range of the target object according to the sample standard data.
Secondly, the cloud server judges whether the target object is in a healthy state or not according to the normal alopecia amount range, if the number of hairs in the cleaning object is within the normal alopecia amount range, the target object is in the healthy state, and at the moment, the terminal of the sweeping robot can display related data (for example, a weekly statistical chart, a monthly statistical chart or a yearly statistical chart of the alopecia amount of the target object is displayed on a functional module of the terminal); if the number of hairs in the above-mentioned swept object exceeds the normal alopecia amount range, the target object is in an unhealthy state. At this time, early warning information (for example, a mobile phone vibration prompt, a yellow early warning with an exclamation mark and the like) can be pushed to the target object through the terminal of the sweeping robot, and relevant data is displayed (for example, hair loss data of the target object in the last half month is drawn into a line drawing and displayed).
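Putting the flow of fig. 6 together as a sketch (the database shape, the fall-back to sample standard data, and the returned actions are assumptions used only to make the control flow concrete):

    def run_health_check(subject_id: str, swept_hair_count: int,
                         personal_db: dict, sample_standard_range: tuple) -> dict:
        # Use the subject's personal range if a personal database exists,
        # otherwise fall back to the sample standard data.
        low, high = personal_db.get(subject_id, sample_standard_range)
        if low <= swept_hair_count <= high:
            # Healthy: the terminal simply displays the statistics charts.
            return {"state": "healthy", "action": "show statistics chart"}
        # Unhealthy: push a warning and a recent hair-loss line chart.
        return {"state": "unhealthy",
                "action": "push warning and half-month line chart"}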
According to the technical scheme provided by the embodiment of the invention, the image acquired by the sweeping robot in the process of sweeping the moving area of the target object is acquired, and then the alopecia amount of the target object in a first preset time period is determined according to the image; and determining the health status of the target subject according to the alopecia amount. The normal alopecia volume range of the target object is preset in the database of the sweeping robot, after the alopecia volume of the target object is obtained through the recognition image after cleaning is completed each time, the normal alopecia volume range of the target object is compared, when the alopecia volume of the target object exceeds the normal alopecia volume range of the target object, the target object is judged to be in an unhealthy state, and the target object can be alarmed in time. The cleaning robot can determine the health state of the target object according to the hair falling amount cleaned while cleaning a room, so that the health state of the target object can be monitored, and an alarm can be given in time when the target object is in an unhealthy state.
Referring to fig. 7, a block diagram of an embodiment of a health detection apparatus according to an embodiment of the present invention is provided.
As shown in fig. 7, the apparatus includes:
the acquisition module 71 is configured to acquire an image acquired by the sweeping robot in a process of sweeping a moving area of a target object;
a first determining module 72, configured to determine, according to the image, an alopecia amount of the target object within a first preset time period;
and the second determining module 73 is used for determining the health state of the target object according to the hair loss amount of the target object in the first preset time period.
In a possible implementation, the obtaining module 71 is specifically configured to:
acquiring an image of each to-be-cleaned area acquired by a sweeping robot before each to-be-cleaned area is cleaned, wherein the to-be-cleaned area is a preset area in the target object moving area;
and/or,
and acquiring images of the cleaning objects in the built-in garbage box, which are acquired by the sweeping robot in the process of sweeping the moving area of the target object.
In a possible embodiment, said first determination module 72 comprises (not shown in the figures):
the model input submodule is used for inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
and the alopecia amount determining sub-module is used for determining the alopecia amount of the target object in a first preset time period according to the hair quantity.
In one possible embodiment, the alopecia amount determination submodule is specifically configured to:
acquiring a first time at which the sweeping robot last cleaned the moving area of the target object;
determining a time interval between the first time and a current time;
and determining the alopecia amount of the target object in a first preset time period according to the hair number and the time interval.
In a possible embodiment, said second determination module 73 comprises (not shown in the figures):
the range determining submodule is used for determining the normal alopecia volume range of the target object in the first preset time period from a preset database, and the preset database comprises the normal alopecia volume ranges of a plurality of objects in the first preset time period;
the judgment sub-module is used for determining whether the alopecia amount of the target object in the first preset time period is within the determined normal alopecia amount range;
the first determining submodule is used for determining that the target object is in an unhealthy state if the alopecia amount of the target object in the first preset time period exceeds the normal alopecia amount range;
and the second determining submodule is used for determining that the target object is in a healthy state if the alopecia volume of the target object in the first preset time period is within the normal alopecia volume range.
In a possible embodiment, the device further comprises (not shown in the figures):
the resetting module is used for resetting the normal alopecia amount range corresponding to the target object in the preset database according to the alopecia amount under the condition that the target object is determined to be in an unhealthy state according to the alopecia amount of the target object in a first preset time period and the target object is determined to correct the unhealthy state to a healthy state.
In a possible embodiment, the device further comprises (not shown in the figures):
the generating module is used for generating an alopecia volume change chart of the target object in a second preset time period under the condition that the target object is determined to be in an unhealthy state according to the alopecia volume of the target object, wherein the second preset time period comprises a plurality of first preset time periods;
and the output module is used for outputting the alopecia volume change chart and outputting alarm information for indicating that the target object is in an unhealthy state.
Referring to fig. 8, in order to provide a schematic structural diagram of another sweeping robot according to an embodiment of the present invention, a sweeping robot 800 shown in fig. 8 includes: an image acquisition module 806, at least one processor 801, a memory 802, at least one network interface 804, and other user interfaces 803. The various components in the sweeping robot 800 are coupled together by a bus system 805. It is understood that the bus system 805 is used to enable communications among the components connected. The bus system 805 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 805 in fig. 8.
The image acquisition module 806 is configured to acquire an image during sweeping of the moving area of the target object.
The user interface 803 may include a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen, etc.).
It will be appreciated that the memory 802 in embodiments of the invention may be volatile memory or non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct rambus RAM (DRRAM). The memory 802 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 802 stores elements, executable units or data structures, or a subset thereof, or an expanded set thereof as follows: an operating system 8021 and application programs 8022.
The operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application 8022 includes various applications, such as a media player (MediaPlayer), a Browser (Browser), and the like, for implementing various application services. A program implementing a method according to an embodiment of the present invention may be included in application program 8022.
In the embodiment of the present invention, the processor 801 is configured to execute the method steps provided by each method embodiment by calling the program or instruction stored in the memory 802, specifically, the program or instruction stored in the application 8022, and for example, includes:
acquiring an image acquired by a sweeping robot in the process of sweeping the moving area of a target object;
determining the hair loss amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object in a first preset time period.
The methods disclosed in the embodiments of the present invention described above may be implemented in or by the processor 801. The processor 801 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated hardware logic circuits in the processor 801 or by instructions in the form of software. The processor 801 may be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the methods, steps and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of the method disclosed in connection with the embodiments of the present invention may be embodied directly in a hardware decoding processor, or in a combination of hardware and software modules in a decoding processor. The software modules may reside in RAM, flash memory, ROM, PROM, EPROM, registers, or other storage media well known in the art. The storage medium is located in the memory 802, and the processor 801 reads the information in the memory 802 and completes the steps of the method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the processing units may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), general-purpose processors, controllers, microcontrollers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented by means of units performing the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
The sweeping robot provided in this embodiment may be the sweeping robot shown in fig. 8, and may perform all the steps of the health detection method shown in fig. 3 to 4, so as to achieve the technical effects of the health detection method shown in fig. 3 to 4, and for brevity, the description is specifically referred to fig. 3 to 4, which is not repeated herein.
Referring to fig. 9, which is a schematic structural diagram of a cloud server according to an embodiment of the present invention, a cloud server 900 shown in fig. 9 includes: at least one processor 901, memory 902, at least one network interface 904, and other user interfaces 903. The various components in cloud server 900 are coupled together by a bus system 905. It is understood that the bus system 905 is used to enable communications among the components. The bus system 905 includes a power bus, a control bus, and a status signal bus, in addition to a data bus. For clarity of illustration, however, the various buses are labeled in fig. 9 as bus system 905.
The user interface 903 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, trackball, touch pad, or touch screen, etc.).
It is to be understood that the memory 902 in embodiments of the present invention may be volatile memory or non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example, but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct rambus RAM (DRRAM). The memory 902 described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 902 stores the following elements, executable units or data structures, or a subset thereof, or an expanded set thereof: an operating system 9021, and application programs 9022.
The operating system 9021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is configured to implement various basic services and process hardware-based tasks. The application 9022 includes various applications, such as a media player (MediaPlayer), a Browser (Browser), and the like, for implementing various application services. A program implementing the method of an embodiment of the present invention may be included in application 9022.
In the embodiment of the present invention, by calling a program or an instruction stored in the memory 902, specifically, a program or an instruction stored in the application 9022, the processor 901 is configured to execute the method steps provided by the method embodiments, for example, including:
acquiring an image acquired by the sweeping robot in the process of sweeping the moving area of a target object;
determining the hair loss amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object in a first preset time period.
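As a minimal sketch of the three steps above, the following Python fragment shows one possible cloud-server-side flow; the function names, the placeholder count_hairs stub, and the range boundaries are assumptions of this illustration and do not appear in the patent.

```python
# Illustrative sketch only; count_hairs stands in for the trained image
# recognition model, and the range boundaries are arbitrary example values.

def count_hairs(image_bytes: bytes) -> int:
    # Placeholder for model inference on the image uploaded by the sweeping robot.
    return 0

def determine_health_state(image_bytes: bytes,
                           normal_range: tuple = (40.0, 100.0)) -> str:
    # Step 1: the image collected during cleaning has been received as image_bytes.
    # Step 2: take the recognized hair count as the hair loss amount for the
    #         first preset time period.
    hair_loss_amount = count_hairs(image_bytes)
    # Step 3: compare against the normal range to decide the health state.
    low, high = normal_range
    return "healthy" if low <= hair_loss_amount <= high else "unhealthy"
```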
For the connection mode and the function of each component in the cloud server, reference may be made to the related description of fig. 8, which is not repeated here.
The cloud server provided in this embodiment may be the cloud server shown in fig. 9 and can perform all the steps of the health detection method shown in fig. 3 to 4, thereby achieving the technical effects of that method; for brevity, reference is made to the description of fig. 3 to 4, which is not repeated here.
The embodiment of the invention also provides a storage medium (computer-readable storage medium). The storage medium stores one or more programs. The storage medium may include a volatile memory, such as a random access memory; it may also include a non-volatile memory, such as a read-only memory, a flash memory, a hard disk, or a solid state disk; it may also include a combination of the above kinds of memories.
When executed by one or more processors, the one or more programs in the storage medium implement the health detection method executed on the sweeping robot side.
The processor is used for executing the health detection program stored in the memory so as to realize the following steps of the health detection method executed on the sweeping robot side:
acquiring an image acquired by a sweeping robot in the process of sweeping the moving area of a target object;
determining the hair loss amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object in a first preset time period.
Those skilled in the art will further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the components and steps of the various examples have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above embodiments are intended to further illustrate the objects, technical solutions, and advantages of the present invention in detail. It should be understood that the above are merely exemplary embodiments of the present invention and are not intended to limit the scope of the present invention; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principle of the present invention shall fall within the scope of the present invention.

Claims (12)

1. A method of health detection, the method comprising:
acquiring an image collected by a sweeping robot in the process of sweeping the moving area of a target object;
determining the hair loss amount of the target object in a first preset time period according to the image;
and determining the health state of the target object according to the hair loss amount of the target object in the first preset time period.
2. The method of claim 1, wherein the acquiring of the image collected by the sweeping robot in the process of sweeping the moving area of the target object comprises:
acquiring an image of each to-be-cleaned area, collected by the sweeping robot before the sweeping robot cleans the to-be-cleaned area, wherein the to-be-cleaned area is a preset area in the moving area of the target object;
and/or,
acquiring images of the swept-up debris in a built-in garbage box, collected by the sweeping robot in the process of sweeping the moving area of the target object.
3. The method according to claim 1, wherein the determining of the hair loss amount of the target object in the first preset time period according to the image comprises:
inputting the image into a trained image recognition model to obtain the number of hairs contained in the image output by the image recognition model;
determining the hair loss amount of the target object in the first preset time period according to the number of hairs.
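Purely as an illustration of the counting step in claim 3 above, the sketch below substitutes a simple OpenCV contour count for the trained image recognition model; the threshold and the contour-length heuristic are assumptions of this example, not values from the patent.

```python
# Stand-in for the trained image recognition model of claim 3: a naive
# contour-based count of thin, dark strands. All thresholds are assumed.
import cv2

def count_hairs_in_image(path: str) -> int:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        raise FileNotFoundError(path)
    # Dark strands on a lighter floor: inverse Otsu threshold, then count contours.
    _, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only elongated contours; short blobs are unlikely to be hair strands.
    return sum(1 for c in contours if cv2.arcLength(c, False) > 50)
```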
4. The method according to claim 3, wherein the determining the hair loss amount of the target object in the first preset time period according to the number of hairs comprises:
acquiring a first time at which the sweeping robot last cleaned the moving area of the target object;
determining a time interval between the first time and a current time;
and determining the hair loss amount of the target object in the first preset time period according to the number of hairs and the time interval.
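To illustrate the use of the time interval in claim 4 above, one plausible reading is that the hairs counted since the last cleaning are scaled to the length of the first preset time period; the sketch below encodes that assumption, and all names and the linear-scaling policy are illustrative.

```python
# Hedged sketch of claim 4: scale the hairs counted since the last cleaning to
# one first preset time period. Linear scaling is an assumption of this sketch.
from datetime import datetime, timedelta
from typing import Optional

def hair_loss_per_period(hair_count: int,
                         last_clean_time: datetime,
                         first_preset_period: timedelta = timedelta(days=1),
                         now: Optional[datetime] = None) -> float:
    now = now or datetime.now()
    interval = now - last_clean_time            # time since the previous cleaning
    if interval <= timedelta(0):
        return float(hair_count)
    return hair_count * (first_preset_period / interval)
```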
5. The method according to claim 1, wherein the determining the health state of the target object according to the hair loss amount of the target object in the first preset time period comprises:
determining a normal hair loss amount range of the target object in the first preset time period from a preset database, wherein the preset database comprises normal hair loss amount ranges of a plurality of objects in the first preset time period;
determining whether the hair loss amount of the target object in the first preset time period is within the determined normal hair loss amount range;
if the hair loss amount of the target object in the first preset time period exceeds the normal hair loss amount range, determining that the target object is in an unhealthy state;
and if the hair loss amount of the target object in the first preset time period is within the normal hair loss amount range, determining that the target object is in a healthy state.
6. The method of claim 5, further comprising:
and in the case that the target object is determined to be in an unhealthy state according to the hair loss amount of the target object in the first preset time period, and it is determined that the target object has corrected the unhealthy state to a healthy state, resetting the normal hair loss amount range corresponding to the target object in the preset database according to the hair loss amount.
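A combined sketch of claims 5 and 6 above follows: the preset database is modeled as an in-memory dict, and the reset policy (re-centering the range around the corrected amount with a fixed margin) is an assumption of this example rather than a rule stated in the claims.

```python
# Illustrative stand-in for the preset database of claims 5 and 6.
NORMAL_RANGES = {
    "target_object_1": (40.0, 100.0),   # example normal hair loss range per period
}

def classify_health_state(target_id: str, hair_loss_amount: float) -> str:
    low, high = NORMAL_RANGES[target_id]
    return "healthy" if low <= hair_loss_amount <= high else "unhealthy"

def reset_normal_range(target_id: str, corrected_amount: float, margin: float = 0.2) -> None:
    # Claim 6: once the unhealthy state is confirmed corrected, store a new range
    # built around the newly observed amount (the margin is an assumed policy choice).
    NORMAL_RANGES[target_id] = (corrected_amount * (1 - margin),
                                corrected_amount * (1 + margin))
```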
7. The method according to any one of claims 1 to 5, wherein, in the case that the target object is determined to be in an unhealthy state according to the hair loss amount of the target object, the method further comprises:
generating a hair loss variation graph of the target object in a second preset time period, wherein the second preset time period comprises a plurality of first preset time periods;
and outputting the hair loss variation graph, and outputting alarm information indicating that the target object is in an unhealthy state.
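As one way of realizing claim 7 above, the sketch below plots per-period hair loss amounts over the second preset time period with matplotlib and emits a plain-text alarm; the output path, the plot labels, and the alarm message format are assumptions of this illustration.

```python
# Hedged sketch of claim 7: plot the hair loss variation and raise an alarm.
import matplotlib.pyplot as plt

def output_variation_graph(per_period_amounts: list, out_path: str = "hair_loss_trend.png") -> None:
    periods = list(range(1, len(per_period_amounts) + 1))  # first preset period indices
    plt.figure()
    plt.plot(periods, per_period_amounts, marker="o")
    plt.xlabel("first preset time period index")
    plt.ylabel("hair loss amount")
    plt.title("Hair loss variation over the second preset time period")
    plt.savefig(out_path)
    plt.close()
    print(f"ALARM: target object may be in an unhealthy state; see {out_path}")
```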
8. A health detection device, characterized in that the device comprises:
the acquisition module is used for acquiring an image acquired by the sweeping robot in the process of sweeping the moving area of the target object;
the first determining module is used for determining the hair loss amount of the target object in a first preset time period according to the image;
and the second determining module is used for determining the health state of the target object according to the hair loss amount of the target object in the first preset time period.
9. A sweeping robot, characterized by comprising: an image acquisition module, a processor, and a memory;
the image acquisition module is used for acquiring images in the process of cleaning the moving area of the target object;
the processor is used for executing the health detection program stored in the memory to realize the health detection method of any one of claims 1-7.
10. A cloud server, comprising: a processor and a memory, the processor being configured to execute a health detection program stored in the memory to implement the health detection method of any one of claims 1-7.
11. A health detection system, comprising: a sweeping robot, a cloud server, and a terminal;
the sweeping robot collects images in the process of sweeping the moving area of the target object and sends the images to the cloud server;
the cloud server determines the hair loss amount of the target object in a first preset time period according to the image, determines the health state of the target object according to the hair loss amount of the target object in the first preset time period, and sends the health state of the target object to the terminal;
and the terminal outputs the health state of the target object.
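The end-to-end flow of claim 11 above can be pictured with the in-process stand-ins below; a real deployment would replace the direct method calls with network transport between the sweeping robot, the cloud server, and the terminal, and every class name and value here is illustrative.

```python
# In-process sketch of the claim 11 system; transport and persistence are omitted.

class SweepingRobot:
    def sweep_and_capture(self) -> bytes:
        # Image captured while cleaning the moving area of the target object.
        return b"<jpeg bytes>"

class CloudServer:
    def evaluate(self, image: bytes) -> str:
        hair_loss_amount = 120.0          # stand-in for model inference on the image
        normal_low, normal_high = 40.0, 100.0
        return "healthy" if normal_low <= hair_loss_amount <= normal_high else "unhealthy"

class Terminal:
    def show(self, state: str) -> None:
        print(f"Health state of the target object: {state}")

if __name__ == "__main__":
    Terminal().show(CloudServer().evaluate(SweepingRobot().sweep_and_capture()))
```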
12. A storage medium storing one or more programs executable by one or more processors to implement the health detection method of any one of claims 1-7.
CN202210130164.0A 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium Active CN114532923B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210130164.0A CN114532923B (en) 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium

Publications (2)

Publication Number Publication Date
CN114532923A (en) 2022-05-27
CN114532923B CN114532923B (en) 2023-09-12

Family

ID=81674060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210130164.0A Active CN114532923B (en) 2022-02-11 2022-02-11 Health detection method and device, sweeping robot and storage medium

Country Status (1)

Country Link
CN (1) CN114532923B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118379281A (en) * 2024-06-21 2024-07-23 北京大学第三医院(北京大学第三临床医学院) Method for analyzing skin image and method for evaluating efficacy of medication

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103268312A (en) * 2013-05-03 2013-08-28 同济大学 Training corpus collection system and method based on user feedback
CN110765895A (en) * 2019-09-30 2020-02-07 北京鲲鹏神通科技有限公司 Method for distinguishing object by robot
CN112862789A (en) * 2021-02-10 2021-05-28 上海大学 Interactive image segmentation method based on machine learning
CN113095230A (en) * 2021-04-14 2021-07-09 北京深睿博联科技有限责任公司 Method and device for helping blind person to search for articles
CN113468919A (en) * 2020-03-31 2021-10-01 青岛海尔智能技术研发有限公司 Method, device and equipment for providing life advice
CN113591512A (en) * 2020-04-30 2021-11-02 青岛海尔智能技术研发有限公司 Method, device and equipment for hair identification

Similar Documents

Publication Publication Date Title
RU2624737C2 (en) Method and device for cleaning waste
CN105380575B (en) Control method, system, Cloud Server and the sweeping robot of sweeping robot
CN111012261A (en) Sweeping method and system based on scene recognition, sweeping equipment and storage medium
CN111643011B (en) Cleaning robot control method and device, cleaning robot and storage medium
JP6214824B2 (en) Automatic test equipment
WO2022041484A1 (en) Human body fall detection method, apparatus and device, and storage medium
CN106357480A (en) Method and device for monitoring network performance of application and mobile terminal
CN114532923A (en) Health detection method and device, sweeping robot and storage medium
CN107102928A (en) Application crash information reporting method and device
CN111090593A (en) Method, device, electronic equipment and storage medium for determining crash attribution
CN113448834A (en) Buried point testing method and device, electronic equipment and storage medium
CN111310351A (en) Method and device for monitoring state of workshop equipment, terminal equipment and storage medium
CN110505438B (en) Queuing data acquisition method and camera
JP2009545223A (en) Event detection method and video surveillance system using the method
CN111260876B (en) Image processing method and device
CN110827194A (en) Image processing method, device and computer storage medium
CN113673318B (en) Motion detection method, motion detection device, computer equipment and storage medium
CN105339974A (en) Simulating sensors
CN113283939A (en) Advertisement exposure monitoring method, device and system
CN112205927B (en) Intelligent sweeping method and device of sweeping robot
CN111694805A (en) Method and device for processing logs of sweeper
CN110838074A (en) Analysis method and device
JP2010108102A (en) Action pattern extraction device, method, and program
CN114831554B (en) Water seepage detection method, device, robot, system and storage medium
CN113872947B (en) Data reporting method and device, electronic equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant