CN112443020A - Intelligent closestool and data acquisition method and device thereof - Google Patents
Intelligent closestool and data acquisition method and device thereof
- Publication number
- CN112443020A (Application No. CN201910838993.2A)
- Authority
- CN
- China
- Prior art keywords
- user
- weight
- intelligent
- acquiring
- parameters
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- E—FIXED CONSTRUCTIONS
- E03—WATER SUPPLY; SEWERAGE
- E03D—WATER-CLOSETS OR URINALS WITH FLUSHING DEVICES; FLUSHING VALVES THEREFOR
- E03D9/00—Sanitary or other accessories for lavatories ; Devices for cleaning or disinfecting the toilet room or the toilet bowl; Devices for eliminating smells
Landscapes
- Health & Medical Sciences (AREA)
- Public Health (AREA)
- Epidemiology (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Hydrology & Water Resources (AREA)
- Water Supply & Treatment (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
A data acquisition method of an intelligent closestool comprises the following steps: acquiring an infrared image set in a preset time period before the closestool is used, and acquiring the weight of a user using the intelligent closestool currently; analyzing the posture characteristics of the user according to the infrared image set, and identifying the user who uses the intelligent closestool according to the posture characteristics and the user weight; searching historical sign parameters corresponding to the user; and acquiring current sign parameters of the user, and generating a sign parameter sequence corresponding to the user according to the historical sign parameters and the current sign parameters. The user can be accurately and effectively identified, the sign parameter sequence of the user is generated according to the identified user, and the data of the user can be analyzed and tracked.
Description
Technical Field
The application belongs to the field of cleaning tools, and particularly relates to an intelligent closestool and a data acquisition method and device thereof.
Background
With the progress of science and technology and the improvement of people's living standards, the demand for quality of life keeps rising. For example, to improve the user experience, toilets now integrate functions such as buttocks cleaning, lower-body cleaning, moving cleaning, seat-ring heating, warm-air drying and automatic deodorization, which improve cleanliness and make using the toilet more comfortable.
Although current intelligent toilets can further collect physical sign data of users, a single toilet in a household is usually shared by several users, so the data collected by the toilet may not be attributable to a specific user, and the data of each user cannot be effectively analyzed or tracked.
Disclosure of Invention
In view of this, the embodiments of the present application provide an intelligent toilet and a data acquisition method and device thereof, so as to solve the problems in the prior art that data acquired by a toilet may not be attributed to the corresponding user and therefore cannot be effectively analyzed and tracked.
A first aspect of an embodiment of the present application provides a data acquisition method for an intelligent toilet, where the data acquisition method for the intelligent toilet includes:
acquiring an infrared image set in a preset time period before the closestool is used, and acquiring the weight of a user using the intelligent closestool currently;
analyzing the posture characteristics of the user according to the infrared image set, and identifying the user who uses the intelligent closestool according to the posture characteristics and the user weight;
searching historical sign parameters corresponding to the user;
and acquiring current sign parameters of the user, and generating a sign parameter sequence corresponding to the user according to the historical sign parameters and the current sign parameters.
With reference to the first aspect, in a first possible implementation manner of the first aspect, the physical sign parameter includes one or more of a blood glucose parameter, a blood lipid parameter, a heart rate parameter, a heart beat parameter, a stool routine parameter, or a urine routine parameter.
With reference to the first aspect, in a second possible implementation manner of the first aspect, the identifying, according to the posture feature and the user weight, a user who is currently using the intelligent toilet further includes:
acquiring positioning information of a mobile terminal, wherein the mobile terminal is bound with a user in advance;
and identifying the current user using the intelligent closestool according to the posture characteristic and the bearing weight by combining the positioning information of the mobile terminal.
With reference to the first aspect, in a third possible implementation manner of the first aspect, the data acquisition method of the intelligent toilet further includes:
estimating the expected use period of the user according to the use records of different users;
when the usage of the user is not detected in the estimated expected usage period, acquiring the positioning information of the mobile terminal bound by the user;
and if the positioning information of the mobile terminal bound by the user is matched with the position of the intelligent closestool, sending an abnormal prompt to the mobile terminal.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, the step of obtaining the weight of the user currently using the intelligent toilet includes:
acquiring the user posture of the user when the user uses the intelligent closestool, and acquiring the bearing mass of a gasket of the intelligent closestool;
and determining the weight of the user according to the posture of the user and the bearing weight.
With reference to the fourth possible implementation manner of the first aspect, in a fifth possible implementation manner of the first aspect, the determining the user weight according to the user posture and the bearing weight includes:
searching a calibration parameter corresponding to the current user gesture according to the corresponding relation between the preset user gesture and the calibration parameter;
and calculating the weight of the user according to the calibration parameters and the mass borne by the gasket of the intelligent closestool.
With reference to the first aspect, in a sixth possible implementation manner of the first aspect, the step of analyzing a posture feature of the user according to the infrared image set, and identifying a user who is currently using the intelligent toilet according to the posture feature and the user weight includes:
searching one or more users corresponding to the currently acquired user weight according to the corresponding relation between the preset user weight and the users;
and determining the user corresponding to the user gesture in the searched one or more users according to the preset corresponding relation between the user gesture characteristics and the users.
A second aspect of the embodiments of the present application provides a data acquisition device of an intelligent toilet, including:
the data acquisition unit is used for acquiring an infrared image set in a preset time period before the closestool is used and acquiring the weight of a user using the intelligent closestool at present;
the identity recognition unit is used for analyzing the posture characteristics of the user according to the infrared image set and recognizing the user who uses the intelligent closestool at present according to the posture characteristics and the weight of the user;
the historical parameter searching unit is used for searching the historical sign parameters corresponding to the user according to the user;
and the sequence generating unit is used for acquiring the current physical sign parameters of the user and generating a physical sign parameter sequence corresponding to the user according to the historical physical sign parameters and the current physical sign parameters.
A third aspect of the embodiments of the present application provides an intelligent toilet, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor, when executing the computer program, implements the steps of the data acquisition method of the intelligent toilet according to any one of the first aspect.
A fourth aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the data collection method for an intelligent toilet according to any one of the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: the infrared image set in the predetermined time period before the toilet is used and the weight of the user currently using the intelligent toilet are acquired, the posture features of the user are analyzed from the infrared image set, the user currently using the toilet is identified according to the posture features and the user weight, the historical physical sign parameters corresponding to the identified user are searched, and a physical sign parameter sequence of the user is generated from the searched historical parameters and the current parameters. The user can thus be identified accurately and effectively, the physical sign parameter sequence is generated for the identified user, which facilitates analysis and tracking of the user's data, and the privacy concerns raised by ordinary image acquisition can be effectively avoided.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
Fig. 1 is a schematic flow chart illustrating an implementation of a data acquisition method for an intelligent toilet according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of an implementation of a user weight calculation method provided in an embodiment of the present application;
fig. 3 is a schematic flow chart of an implementation of determining an abnormal use state of a user according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of a data acquisition device of an intelligent toilet according to an embodiment of the present disclosure;
fig. 5 is a schematic view of an intelligent toilet provided in an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Fig. 1 is a schematic implementation flow diagram of a data acquisition method for an intelligent toilet provided in an embodiment of the present application, which is detailed as follows:
in step S101, acquiring an infrared image set in a predetermined time period before the toilet is used, and acquiring a weight of a user currently using the intelligent toilet;
specifically, the infrared image set according to the embodiment of the present application may be acquired by an infrared thermal imaging sensor. The human body is a natural characteristic infrared radiation source, the temperature range is 35-37 ℃, and the central wavelength of the radiated infrared ray is 9-10 um. The infrared light radiated by the human body is usually different from the background, and the difference between the gray values of the human body target and the surrounding environment is shown in the infrared image. And the method is insensitive to factors such as articles carried by the user, weather conditions, illumination changes and the like, and can acquire the image of the body state of the user more accurately. Acquiring a plurality of images including body states according to a predetermined time interval to form an infrared image set, and obtaining dynamic posture characteristics of a user formed by the plurality of body states of the user.
The user weight may be indirectly obtained through a weight sensor provided at a gasket of the intelligent toilet. Specifically, as shown in fig. 2, the method includes:
In step S201, acquiring a user posture of the user when using the intelligent toilet, and acquiring a bearing mass of a gasket of the intelligent toilet;
The user posture when using the intelligent toilet may include the inclination angle of the user and whether the user's arms rest on the knees. The correspondence between user postures and calibration parameters can be obtained in advance from statistics of how different inclination angles and different supporting positions influence the measured weight. For example, if the calibration parameter corresponding to a user leaning at angle A and supporting at thigh position B is 1.2, the user weight can be determined from the load currently detected by the weight sensor arranged at the gasket of the intelligent toilet, combined with the calibration parameter corresponding to the user posture, which may specifically include the following steps S202-S203.
In step S202, according to a preset correspondence between the user gesture and the calibration parameter, a calibration parameter corresponding to the current user gesture is searched;
for example, calibration parameters corresponding to different postures can be determined in advance according to the actual weight of the user, various postures of the user and the bearing weight detected by the gasket of the intelligent closestool. According to the calculated calibration parameters, the calibration parameters corresponding to the current user posture can be searched by combining the current detected user posture.
In step S203, the user weight is calculated according to the calibration parameter and the mass borne by the gasket of the intelligent toilet.
From the calibration parameter and the bearing weight, the weight of the user currently using the intelligent toilet can be calculated. For example, when the currently detected load is 50 kg and the looked-up calibration parameter is 1.2, the actual user weight is 50 × 1.2 = 60 kg.
In this application, the posture of the user using the intelligent toilet may be detected by an infrared sensor.
In addition, for a child or another user whose height does not reach a certain value, the feet cannot be effectively supported on the floor, so the user posture does not change the calibration parameter. Therefore, the height of the user can be detected in advance, and when the height is smaller than a preset value and the feet cannot be effectively supported, the detected bearing weight can be used directly as the user weight.
For example, child A is 1.2 meters tall. When child A sits on the toilet, the feet cannot be effectively supported because of the limited height, so the posture detection for child A can be ignored; the bearing weight is obtained directly from the weight sensor arranged at the gasket of the intelligent toilet and is used as the weight of child A.
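The following Python sketch illustrates steps S201-S203 together with the height exception described above. The calibration table, the posture labels and the 1.4 m height threshold are made-up values for illustration only, not values prescribed by the patent.

```python
# Hypothetical calibration table: (lean-angle bucket, support position) -> factor.
CALIBRATION = {
    ("lean_angle_A", "hands_on_thighs"): 1.2,
    ("upright", "no_support"): 1.0,
}

def estimate_user_weight(load_kg, posture, height_m, min_height_m=1.4):
    """Estimate the user weight from the seat load and the user posture.

    Users below `min_height_m` (e.g. children) cannot support their feet on
    the floor, so the measured load is taken directly as the user weight;
    otherwise the load is scaled by the calibration factor for the posture.
    """
    if height_m < min_height_m:
        return load_kg                      # feet unsupported: load == weight
    factor = CALIBRATION.get(posture, 1.0)  # default: no correction
    return load_kg * factor

# Example from the description: a 50 kg load with factor 1.2 gives 60 kg.
print(estimate_user_weight(50.0, ("lean_angle_A", "hands_on_thighs"), 1.75))
```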
In step S102, analyzing a posture feature of a user according to the infrared image set, and identifying a user currently using the intelligent toilet according to the posture feature and the user weight;
the posture characteristics of the user obtained by the infrared image set analysis in the embodiment of the application can include static posture characteristics of the user and dynamic posture characteristics of the user. The static posture characteristics can comprise the height, the body shape and the like of the user, and the dynamic posture characteristics can comprise the walking action posture of the user, the hand action posture of the user and the like.
In order to effectively identify the user who uses the intelligent closestool according to the posture characteristics and the weight of the user, the corresponding relation between the posture characteristics and the user and the corresponding relation between the weight of the user and the user need to be preset.
For example, one or more users corresponding to the currently detected user weight may first be searched according to the preset correspondence between user weight and user; then, according to the correspondence between posture features and users, the user corresponding to the currently detected posture features may be further searched among those one or more users. If the user weight and the posture features point to the same user, that user is the current user; if no user can be found from the user weight and the posture features, the current person is a new user, such as a guest or a visiting friend.
As a preferred embodiment of the present application, when identifying the user who is currently using the intelligent toilet, the positioning information of a mobile terminal that has been bound with a user in advance may also be obtained. The mobile terminal includes, but is not limited to, a smart phone. The identified user can be further confirmed through the positioning information of the mobile terminal: if the position of the mobile terminal corresponding to the identified user matches the position of the intelligent toilet, the current identification result is determined to be valid; if it does not match, the current identification result is determined to be invalid, that is, an identification error may exist.
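One plausible reading of this identification flow is sketched below in Python. The weight tolerance, the toy posture-distance measure, the `weight_table`/`posture_table` structures and the `phone_near_toilet` lookup are assumptions introduced for the example and are not specified by the patent.

```python
def posture_distance(a, b):
    """Toy Euclidean distance between two posture-feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def identify_user(measured_weight, posture_feature, phone_near_toilet,
                  weight_table, posture_table, tolerance_kg=3.0):
    """Identify the current user from weight, posture features and,
    optionally, whether the user's bound mobile terminal is near the toilet.

    `weight_table` maps user id -> registered weight, `posture_table` maps
    user id -> stored posture-feature vector, and `phone_near_toilet` maps
    user id -> bool (or is None to skip the check).
    """
    # 1. Candidates whose registered weight matches the measured weight.
    candidates = [uid for uid, w in weight_table.items()
                  if abs(w - measured_weight) <= tolerance_kg]
    # 2. Among the candidates, pick the closest posture-feature match.
    best, best_dist = None, float("inf")
    for uid in candidates:
        dist = posture_distance(posture_feature, posture_table[uid])
        if dist < best_dist:
            best, best_dist = uid, dist
    if best is None:
        return None  # unknown person, e.g. a visiting guest
    # 3. Optional confirmation via the bound mobile terminal's position.
    if phone_near_toilet is not None and not phone_near_toilet.get(best, False):
        return None  # identification result treated as invalid
    return best
```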
In addition, as shown in fig. 3, as a further optimized embodiment of the present application, the data collection method of the intelligent toilet may include:
in step S301, estimating the expected use period of the user according to the use records of different users;
by counting the usage records of the user, the time law of the user using the intelligent closestool is searched, for example, the time law can be a predetermined time period every day, or a predetermined time period every other day, and the like. According to the statistical result, different usage rules corresponding to different users can be determined, and according to the usage rules, the preset usage period of the users can be determined.
For example, according to the usage record, the usage rule of the user a is about 8 points per day, and the usage period is 8:10 to 8: within a period of 50 deg.f. The expected usage period may be obtained according to the statistical usage period, for example, the expected usage period may be the statistical usage period, or a period that extends for a certain time length on the basis of the statistical usage period.
In step S302, when the usage of the user is not detected within the estimated expected usage period, obtaining the location information of the mobile terminal bound by the user;
if the user usage is not detected within the estimated expected usage period, it indicates that there may be an abnormality in the current usage data of the user, and the status of the user may be further confirmed according to the location information of the mobile terminal to which the user is bound.
In step S303, if the location information of the mobile terminal bound by the user matches the location of the intelligent toilet, an exception alert is sent to the mobile terminal.
If the positioning information of the mobile terminal bound by the user is matched with the position of the intelligent closestool, the situation that the user is located near the intelligent closestool currently and cannot use the intelligent closestool normally is indicated, and an abnormal prompt can be sent to the mobile terminal bound by the user or the monitoring terminal corresponding to the user.
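The sketch below shows one possible implementation of steps S301-S303. How the expected window is padded, how "near the toilet" is decided and how the alert is delivered (`send_alert`) are assumptions of the example, not details given in the patent.

```python
from datetime import datetime

def expected_window(usage_times, pad_minutes=20):
    """Estimate an expected usage window (minutes of the day) from past
    usage timestamps, padded by `pad_minutes` on each side."""
    minutes = [t.hour * 60 + t.minute for t in usage_times]
    return min(minutes) - pad_minutes, max(minutes) + pad_minutes

def check_abnormal_usage(now, usage_times, phone_near_toilet, send_alert):
    """If the expected window has passed with no usage today while the
    user's bound phone is located near the toilet, push an abnormality
    alert via the `send_alert` callback."""
    _, end = expected_window(usage_times)
    now_min = now.hour * 60 + now.minute
    used_today = any(t.date() == now.date() for t in usage_times)
    if now_min > end and not used_today and phone_near_toilet:
        send_alert("No toilet use detected in the expected period")

# Example: records around 08:20 on previous days, nothing yet today.
history = [datetime(2019, 9, d, 8, 20) for d in range(1, 5)]
check_abnormal_usage(datetime(2019, 9, 5, 9, 30), history, True, print)
```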
In step S103, searching for a historical sign parameter corresponding to the user;
after the current user who uses intelligent closestool is discerned, can be according to the historical sign parameter that the user that discerns corresponds, historical sign parameter can include blood glucose parameter, blood lipid parameter, heart rate parameter, heartbeat parameter, conventional parameter of excrement or the conventional parameter of urine one or more. The routine parameters of the excrement can comprise the existence of bacteria in the digestive tract, virus infection and parasite infection, the existence of red blood cells and white blood cells in excrement, bacteria sensitivity test, Occult Blood Test (OBT), existence of insect eggs and the like. Urine conventional parameters may include leukocyte, ketone body, nitrite, urobilinogen, bilirubin, protein, glucose, urine specific gravity, occult blood, pH, vitamin C, etc.
In step S104, collecting current sign parameters of the user, and generating a sign parameter sequence corresponding to the user according to the historical sign parameters and the current sign parameters.
When a user uses the intelligent toilet, the physical sign parameters during use are collected and the correspondence between the user and the physical sign parameters is established. From the currently collected physical sign parameters combined with the historical physical sign parameters of the user, the physical sign parameter sequence corresponding to the user can be obtained. From this sequence, the physical state of the user can be known, and the detected physical sign parameters can be displayed on a display screen of the intelligent toilet or pushed to the mobile terminal corresponding to the user for display.
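A minimal sketch of step S104 follows. The `SignRecord` structure and its field names are illustrative only; the patent does not define a concrete record format.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class SignRecord:
    """One measurement of physical sign parameters; the field names are
    illustrative (the patent lists blood glucose, blood lipid, heart rate,
    stool-routine and urine-routine items)."""
    timestamp: datetime
    values: Dict[str, float]  # e.g. {"blood_glucose": 5.4, "heart_rate": 72}

def build_sign_sequence(history: List[SignRecord],
                        current: SignRecord) -> List[SignRecord]:
    """Append the current measurement to the user's history and return the
    time-ordered physical-sign parameter sequence."""
    return sorted(history + [current], key=lambda r: r.timestamp)
```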
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Fig. 4 is a schematic structural diagram of a data acquisition device of an intelligent toilet according to an embodiment of the present application, where the data acquisition device of the intelligent toilet includes:
a data acquisition unit 401, configured to acquire an infrared image set in a predetermined time period before the toilet is used, and acquire a weight of a user currently using the intelligent toilet;
the identity recognition unit 402 is used for analyzing the posture characteristics of the user according to the infrared image set and recognizing the user who uses the intelligent closestool at present according to the posture characteristics and the weight of the user;
a historical parameter searching unit 403, configured to search, according to the user, a historical sign parameter corresponding to the user;
and the sequence generating unit 404 is configured to acquire current sign parameters of the user, and generate a sign parameter sequence corresponding to the user according to the historical sign parameters and the current sign parameters.
The data acquisition device of the intelligent closestool shown in fig. 4 corresponds to the data acquisition method of the intelligent closestool shown in fig. 1.
Fig. 5 is a schematic view of an intelligent toilet provided by an embodiment of the present application. As shown in fig. 5, the intelligent toilet 5 of this embodiment includes: a processor 50, a memory 51 and a computer program 52 stored in said memory 51 and executable on said processor 50, such as a data acquisition program of an intelligent toilet. The processor 50, when executing the computer program 52, implements the steps in the above-described data collection method embodiments of each intelligent toilet. Alternatively, the processor 50 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 52.
Illustratively, the computer program 52 may be partitioned into one or more modules/units, which are stored in the memory 51 and executed by the processor 50 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 52 in the intelligent toilet 5. For example, the computer program 52 may be divided into:
the data acquisition unit is used for acquiring an infrared image set in a preset time period before the closestool is used and acquiring the weight of a user using the intelligent closestool at present;
the identity recognition unit is used for analyzing the posture characteristics of the user according to the infrared image set and recognizing the user who uses the intelligent closestool at present according to the posture characteristics and the weight of the user;
the historical parameter searching unit is used for searching the historical sign parameters corresponding to the user according to the user;
and the sequence generating unit is used for acquiring the current physical sign parameters of the user and generating a physical sign parameter sequence corresponding to the user according to the historical physical sign parameters and the current physical sign parameters.
The intelligent toilet may include, but is not limited to, a processor 50, a memory 51. Those skilled in the art will appreciate that fig. 5 is merely an example of a smart toilet 5, and does not constitute a limitation of the smart toilet 5, and may include more or less components than shown, or combine certain components, or different components, for example, the smart toilet may also include input-output devices, network access devices, buses, etc.
The Processor 50 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the intelligent toilet 5, such as a hard disk or a memory of the intelligent toilet 5. The memory 51 may also be an external storage device of the intelligent toilet 5, such as a plug-in hard disk provided on the intelligent toilet 5, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like. Further, the memory 51 may also include both an internal storage unit and an external storage device of the intelligent toilet 5. The memory 51 is used to store the computer program and other programs and data required by the intelligent toilet. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow in the methods of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the method embodiments described above. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunication signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be appropriately increased or decreased as required by legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunication signals according to legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (10)
1. A data acquisition method of an intelligent closestool is characterized by comprising the following steps:
acquiring an infrared image set in a preset time period before the closestool is used, and acquiring the weight of a user using the intelligent closestool currently;
analyzing the posture characteristics of the user according to the infrared image set, and identifying the user who uses the intelligent closestool according to the posture characteristics and the user weight;
searching historical sign parameters corresponding to the user;
and acquiring current sign parameters of the user, and generating a sign parameter sequence corresponding to the user according to the historical sign parameters and the current sign parameters.
2. The data collection method of the intelligent toilet according to claim 1, wherein the physical parameters include one or more of blood glucose parameters, blood lipid parameters, heart rate parameters, heart beat parameters, stool routine parameters or urine routine parameters.
3. The intelligent toilet data collection method of claim 1, wherein the step of identifying the user currently using the intelligent toilet based on the posture characteristic and the user weight further comprises:
acquiring positioning information of a mobile terminal, wherein the mobile terminal is bound with a user in advance;
and identifying the current user using the intelligent closestool according to the posture characteristic and the bearing weight by combining the positioning information of the mobile terminal.
4. The data collection method for intelligent toilets of claim 1, further comprising:
estimating the expected use period of the user according to the use records of different users;
when the usage of the user is not detected in the estimated expected usage period, acquiring the positioning information of the mobile terminal bound by the user;
and if the positioning information of the mobile terminal bound by the user is matched with the position of the intelligent closestool, sending an abnormal prompt to the mobile terminal.
5. The data collecting method of an intelligent toilet according to claim 1, wherein the step of acquiring the weight of a user currently using the intelligent toilet comprises:
acquiring the user posture of the user when the user uses the intelligent closestool, and acquiring the bearing mass of a gasket of the intelligent closestool;
and determining the weight of the user according to the posture of the user and the bearing weight.
6. The intelligent toilet data collection method of claim 5, wherein the step of determining the user weight based on the user posture and the bearing weight comprises:
searching a calibration parameter corresponding to the current user gesture according to the corresponding relation between the preset user gesture and the calibration parameter;
and calculating the weight of the user according to the calibration parameters and the mass borne by the gasket of the intelligent closestool.
7. The data collection method of the intelligent toilet according to claim 1, wherein the step of analyzing the posture feature of the user according to the infrared image set, and identifying the user who is currently using the intelligent toilet according to the posture feature and the user weight comprises:
searching one or more users corresponding to the currently acquired user weight according to the corresponding relation between the preset user weight and the users;
and determining the user corresponding to the user gesture in the searched one or more users according to the preset corresponding relation between the user gesture characteristics and the users.
8. A data acquisition device of an intelligent closestool, characterized in that the data acquisition device of the intelligent closestool comprises:
the data acquisition unit is used for acquiring an infrared image set in a preset time period before the closestool is used and acquiring the weight of a user using the intelligent closestool at present;
the identity recognition unit is used for analyzing the posture characteristics of the user according to the infrared image set and recognizing the user who uses the intelligent closestool at present according to the posture characteristics and the weight of the user;
the historical parameter searching unit is used for searching the historical sign parameters corresponding to the user according to the user;
and the sequence generating unit is used for acquiring the current physical sign parameters of the user and generating a physical sign parameter sequence corresponding to the user according to the historical physical sign parameters and the current physical sign parameters.
9. An intelligent toilet comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor when executing the computer program implements the steps of the data collection method of the intelligent toilet according to any one of claims 1 to 7.
10. A computer-readable storage medium, in which a computer program is stored, which, when being executed by a processor, carries out the steps of the data acquisition method of an intelligent toilet according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910838993.2A CN112443020A (en) | 2019-09-05 | 2019-09-05 | Intelligent closestool and data acquisition method and device thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910838993.2A CN112443020A (en) | 2019-09-05 | 2019-09-05 | Intelligent closestool and data acquisition method and device thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112443020A (en) | 2021-03-05
Family
ID=74733011
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910838993.2A Pending CN112443020A (en) | 2019-09-05 | 2019-09-05 | Intelligent closestool and data acquisition method and device thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112443020A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104499547A (en) * | 2014-12-19 | 2015-04-08 | 钟剑威 | Cloud health toilet and use method thereof |
US20170370936A1 (en) * | 2015-01-30 | 2017-12-28 | Toto Ltd. | Biological information measurement system |
CN204456360U (en) * | 2015-03-09 | 2015-07-08 | 九牧厨卫股份有限公司 | Intelligent closestool |
CN209136623U (en) * | 2018-04-12 | 2019-07-23 | 北京几何科技有限公司 | It is a kind of to provide the detection system of health detection strategy based on user identity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20210305 |