WO2023183660A1 - System, method and apparatus for forming machine learning (ML) sessions - Google Patents

System, method and apparatus for forming machine learning (ML) sessions

Info

Publication number
WO2023183660A1
WO2023183660A1 (PCT/US2023/021422)
Authority
WO
WIPO (PCT)
Prior art keywords
user
frames
frame
sequences
session
Prior art date
Application number
PCT/US2023/021422
Other languages
English (en)
Inventor
Vikram KASHYAP
Dan Cheung
Ting-Hsuan Lin
Gabrielle Kim
Parmoon Sarmadi
Original Assignee
Toi Labs, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/701,799 (published as US20220211354A1)
Application filed by Toi Labs, Inc.
Publication of WO2023183660A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/20 Measuring for diagnostic purposes; Identification of persons for measuring urological functions restricted to the evaluation of the urinary system
    • A61B5/207 Sensing devices adapted to collect urine
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems involving training the classification device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/483 Physical analysis of biological material
    • G01N33/487 Physical analysis of biological material of liquid biological material
    • G01N33/493 Physical analysis of biological material of liquid biological material urine
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6891 Furniture

Definitions

  • a number of health conditions can be detected from the visual characteristics of human excreta. The best way to lower costs and treat these conditions effectively is early diagnosis and treatment. Current clinical management for these conditions, as well as many others, includes monitoring and documenting specific excreta characteristics such as urine color, urination frequency, urine duration, stool color, stool frequency and stool consistency.
  • Fig. 1 is an exemplary toilet seat according to one embodiment of the disclosure
  • FIG. 2 illustrates a data collection flow diagram from a user’s toileting session
  • FIG. 3 schematically illustrates an exemplary ML annotation and reporting process
  • FIG. 4 is an exemplary flow diagram illustrating the machine learning pipeline
  • FIG. 5A illustrates an exemplary sessionization process according to one embodiment of the disclosure
  • FIG. 5B is a continuation of Fig. 5A and illustrates an exemplary machine labeling (e.g., using ML) or scoring;
  • Fig. 5C illustrates the application of the disclosed principles to forming ML labeling to label the user’s sessions
  • FIG. 6 schematically illustrates dynamic sessionization according to one embodiment of the disclosure
  • Fig. 7 illustrates an exemplary embodiment of the disclosure for providing information (e.g., training) to the machine learning system
  • Fig. 8 shows exemplary session rules to account for user’s behavior
  • Fig. 9 schematically illustrates an exemplary implementation of personalization rules.
  • Such circuitry may also be used for wireless communication from the toilet to one or more remote servers configured to receive the acquired information to obtain the desired information.
  • the machine learning or artificial intelligence disclosed herein may run at a remote location (e.g., server or processor) or may be integrated within the toilet as disclosed herein.
  • references to “one embodiment,” “an embodiment,” “example embodiment,” “various embodiments,” etc., indicate that the embodiment(s) of the invention so described may include particular features, structures, or characteristics, but not every embodiment necessarily includes the particular features, structures, or characteristics. Further, some embodiments may have some, all, or none of the features described for other embodiments.
  • Coupled is used to indicate that two or more elements are in direct physical or electrical contact with each other.
  • Connected is used to indicate that two or more elements are in direct physical or electrical contact with each other.
  • Coupled is used to indicate that two or more elements co-operate or interact with each other, but they may or may not have intervening physical or electrical components between them.
  • Discussions herein utilizing terms such as, for example, “processing,” “computing,” “calculating,” “determining,” “establishing,” “analyzing,” “checking,” or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, or other electronic computing device, that manipulate and/or transform data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information storage medium that may store instructions to perform operations and/or processes.
  • the disclosed embodiments relate to a system, method and apparatus for forming ML sessions so that an AI system can identify, track and diagnose a user through the user’s bathroom sessions (as used herein, “event(s)” or “session(s)”).
  • a session is a period of time during which a user is active in the toilet. The session may include certain inactivity if such inactivity lasts only for a predefined duration.
  • a system is configured to capture images of excreta and, using ML algorithms, classifies the toileting session using Digital Biomarkers (DBMs).
  • the ability to create an excreta log to accurately deliver detailed information to doctors and healthcare providers can revolutionize healthcare by notifying when further urine or fecal screening is necessary.
  • the disclosed embodiments of ML and image identification through AI require no user behavior change and provide a link between medical records and specific excreta patterns.
  • the system may be able to determine links between these excreta logs and the onset of specific diseases.
  • the exemplary system excreta records were collected and were compared against each other and against patients’ deidentified medical records to provide correlative data between excreta logs and acute episodes, such as cloudy urine, suspected blood, diarrhea and constipation.
  • the correlation can establish condition thresholds for each type of acute episode and establish individual condition thresholds for reporting to healthcare professionals.
  • the disclosure uses an Internet-connected replacement toilet seat that continuously captures bowel movements and urinations using time-lapse imaging.
  • the toilet seat may comprise one or more sensors.
  • the sensor instrument may incorporate hardware (i.e., the toilet seat), as well as a software component (i.e., analysis software created via machine learning models). After the toilet sensor captures time-lapse images of the contents going into the toilet bowl, these images are redirected to the system’s software component.
  • the software component analyzes the captured images to report stool and urine characteristics such as frequency of excretions, stool consistency, stool color and urine color that fall outside normal ranges.
  • Data captured by the toilet sensor may be transferred to a cloud server.
  • the data findings may be then analyzed further on a cloud server and then by human reviewers to validate the identified stool and urine characteristics. While in certain embodiments reported herein, data is analyzed on one or more cloud servers, the disclosed principles are not limited thereto. For example, data may be collected and locally analyzed and stored at the instrument.
  • the detection system may comprise one or more of an illumination source, lens, optical train, toilet seat, sensor (e.g., temperature, range, capacitive, bioelectrical impedance), fixture bracket, analysis circuitry and communication circuitry.
  • the bioelectrical impedance refers to bioelectrical impedance through the user’s body with the user sitting on the toilet. Conventional bioelectrical impedance measurement may be used for this purpose.
  • the circuitry may comprise one or more electronic circuits (e.g., microprocessors and memory circuitry) configured to receive optical images from the lens and the optical train, convert the images to digital data and optionally analyze the converted digital data according to a predefined algorithm.
  • the detection system which may include optical components, software/firmware and sensors may be integrated with the toilet seat or may be affixed to the toilet seat or the toilet with fixture and brackets.
  • the toilet seat may comprise circuitry including hardware, software and firmware configured to collect data, analyze data and store the data and its results.
  • a communication module added to the detection instrument may further comprise communication circuitry to communicate, for example wirelessly, data results intermittently or on as needed basis.
  • Fig. 1 is an exemplary toilet seat according to one embodiment of the disclosure.
  • Fig. 1 shows toilet seat assembly 100, having a lid 110, rim 112 and detection system 120.
  • Assembly 100 may be integrated with a toilet (not shown) or may be added to a toilet (not shown).
  • Lid 110 may further comprise sensor components (not shown) to open when a user is nearby.
  • Detection system 120 may comprise various modules to implement the disclosed steps.
  • detection system 120 may comprise optical detection modules (not shown) including one or more lenses (as an optical train) and/or an IR detection module to detect or view urine or excrement.
  • Detection system 120 may also comprise an illumination source, one or more sensors (e.g., temperature), analysis circuitry and communication circuitry.
  • Detection system 120 may also include one or more electronic circuitries (e.g., microprocessors and memory circuitry) configured to receive optical images from the optical train (not shown) and convert the optical images to digital data.
  • the electronic circuitry(ies) may be configured according to the disclosed principles to analyze the converted digital data according to a predefined algorithm.
  • Detection system 120 may comprise memory components to store and implement the predefined algorithms. As discussed below, the algorithms may be implemented as AI to identify disease or anomalies of interest. In one embodiment, the AI algorithms are stored in memory circuitry as software or firmware (not shown) integrated with detection system 120.
  • PCT/US2018/026618 filed April 6, 2018
  • PCT/US2020/019383 filed August 23, 2021
  • the toilet may include a so-called guest button 130 which may be used by a guest (non-regular user) to activate a guest session.
  • the guest button may be pressed or otherwise engaged to start obtaining and storing data from the guest user.
  • Additional interface may be provided (not shown) to allow the guest to enter data thereby associating the guest data with the guest’s identity.
  • the guest information may also be detected from the guest’s electronic devices (e.g., smart phone or smart watch) by using near-field communication and Bluetooth.
  • the toilet system may communicate directly with the user’s smart devices and request permission to identify the guest user. The user may then optionally grant permission to the system to associate and identify the recorded guest data.
  • the disclosure relates to algorithms that may be implemented as an AI to identify disease or anomalies of interest.
  • the AI application may be implemented as software or firmware on one or more electronic circuitries to identify and detect disease or health anomalies of the toilet user.
  • the user’s bathroom habits and excrement may be used to track the user’s health over time.
  • the AI may be trained using ML according to the disclosed embodiments.
  • Fig. 2 illustrates a data collection flow diagram from a user’s toileting session.
  • the flow diagram may be used to allow the ML system to call out (filter) events of medical significance.
  • the ML system may be trained to identify and label the significant event and produce a labeled result to show the event of significance.
  • Fig. 2 shows the collection of frames as a sample image when the user starts using the toilet.
  • the user’s session is recorded when a range sensor 111 in toilet lid 110 (Fig. 1) is activated or if a capacitive sensor (not shown) inside rim 112 is activated.
  • Frames may be recorded at a predefined rate, for example, at 30-60 frames per second.
  • a sequence may be considered as a group of frames that are tied to the start and end of the frame capture. Thus, a sequence may include several frames from when a user sits on the toilet, gets up to retrieve something (e.g., toilet paper) and returns to the toilet.
  • Sessions 230 are a group of sequences whose inactivity time is less than two minutes
  • episodes 240 are collected from various sessions. Episodes 240 comprise groups of sessions that have clinical significance. As discussed below, this data is among others then used to train the algorithm vis-a-vis ML. Heuristic ML algorithms may be applied to each level. This allows the ML system to filter events of significance or interest, label such events and generate a report.
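The frame/sequence/session/episode hierarchy described above can be sketched as a set of nested containers; the class and field names below are illustrative assumptions, not from the patent:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Frame:
    """One captured image and its capture time."""
    timestamp: float
    labels: List[str] = field(default_factory=list)

@dataclass
class Sequence:
    """Frames tied to one start/end of frame capture."""
    frames: List[Frame]

@dataclass
class Session:
    """Sequences whose inactivity gaps stay below the timeout (e.g., 2 min)."""
    sequences: List[Sequence]

@dataclass
class Episode:
    """A group of sessions having clinical significance."""
    sessions: List[Session]
```

Heuristic ML algorithms can then be attached at any level of this hierarchy, as the text notes.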
  • Fig. 3 schematically illustrates an exemplary ML annotation and reporting process.
  • the ML system filters events of interest and labels the events before reporting the filtered and labeled events to human reviewers. It should be noted that while the exemplary embodiment of Fig. 3 suggests human reviewers, the reviewers may be replaced by an AI system that implements the review based on prior training.
  • frames 308, sequences 312 and sessions 314 may be collected from toileting sessions of a particular user. Each of frames 308, sequences 312 and sessions 314 is then annotated by human observers. Next, an observation type and an optional node is assigned to each subject (frame, session or sequence) to describe a clinical or non-clinical condition. This is schematically represented as nodes 320. The nodes are then annotated or combined to form reviews 330. Each review is an observation and is assigned to a frame, sequence or session. Parameters 340 define a group of reviews of events during a fixed time window that show clinical significance. Finally, reports 350 are a group of parameters that are captured in a moment in time.
  • the relevant frames, sequences and sessions may be labeled as showing a cloudy urine and a report may be generated therefrom.
  • the start or the stop of the cloudy urine appearance is denoted by parameter change 340.
  • the parameter change may include such parameter changes as changes denoting clinical (e.g., cloudy urine or blood in the stool) or non-clinical (e.g., regular urination to no urination) characteristics. Additional alerts may be generated when a parameter change is detected.
  • the report may denote an aggregate of all of the parameter changes.
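A parameter change of this kind (an observation such as cloudy urine starting or stopping between reviews) can be detected with a simple diff over time-ordered observation sets; the data layout below is a hypothetical sketch, not the patent's format:

```python
def parameter_changes(reviews):
    """Detect start/stop transitions across time-ordered reviews.

    reviews: list of (timestamp, set_of_observations). A change is
    reported whenever an observation appears or disappears relative
    to the previous review.
    """
    changes = []
    prev = set()
    for ts, obs in reviews:
        for started in obs - prev:      # observation newly present
            changes.append((ts, started, "start"))
        for stopped in prev - obs:      # observation no longer present
            changes.append((ts, stopped, "stop"))
        prev = obs
    return changes
```

A report could then aggregate all such changes, and an alert could be raised whenever a clinically significant change is detected.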
  • Fig. 4 is an exemplary flow diagram illustrating the machine learning pipeline.
  • Fig. 4 shows that in one embodiment, the ML pipeline is hierarchical.
  • frames are shown as group 402.
  • Frames 402 are collected as the user uses the toilet.
  • the frames are numbered 1 through n to denote the time sequence.
  • the frames are labeled by ML to denote a detected characteristic.
  • Hierarchy 402 and 404 show a multi-label classification of biomarkers at the individual frame level using neural networks such as Perceptron, Feed Forward Networks, Multi-Layer Perceptron, Radial Basis Networks, Convolutional Neural Networks, Recurrent Neural Networks, and Long Short-Term Memory Networks.
  • Classification based on the previous stage’s output may be used to determine clinical session labels (traditional ML variants such as random forests are outperforming neural network based approaches at this stage).
  • various biomarkers are added as labels to the frames. The process may be extended to identify sessions with biomarkers 408. Session level classification based on low level biomarkers and clinical event labels is shown at 410. That data may be combined at a session level to determine the clinically relevant session labels: frame level (multi-label classification of biomarkers at the individual frames) and session level (classification based on the frame level output to determine clinical session labels using ML variants such as random forests).
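As a sketch of the hand-off between the two levels, per-frame biomarker probabilities from the frame-level networks can be aggregated into session-level features of the kind a random-forest session classifier would consume; the chosen statistics (mean, max, fraction above 0.5) are illustrative assumptions:

```python
def session_features(frame_probs, labels):
    """Aggregate per-frame biomarker probabilities into session features.

    frame_probs: list of per-frame probability vectors, one entry per label.
    labels: biomarker names, in the same order as the probability vectors.
    Returns a flat feature dict suitable as input to a session classifier.
    """
    feats = {}
    for i, label in enumerate(labels):
        col = [p[i] for p in frame_probs]
        feats[f"{label}_mean"] = sum(col) / len(col)            # average evidence
        feats[f"{label}_max"] = max(col)                        # strongest frame
        feats[f"{label}_frac"] = sum(c > 0.5 for c in col) / len(col)  # persistence
    return feats
```

A session-level model (e.g., a random forest, which the text reports outperforming neural approaches at this stage) would be trained on such feature vectors.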
  • a two stage process is used to identify patient anomalies of interest.
  • the process may comprise the steps of sessionization and machine labeling processes as schematically illustrated in Figs. 5(A) and 5(B), respectively.
  • Fig. 5A shows an exemplary sessionization process as the first of the two-part process.
  • sessionization is the process of grouping frames into sequences and sessions based on time series data, behavioral data and personalization data.
  • Time-based sessionization may be static or dynamic.
  • a static time interval or session timeout interval is used. The interval may be, for example, 1, 2, 5 or 10 mins.
  • In Fig. 5A, data pipeline 500 is sessionized and directed to ML system 550 (Fig. 5B) for ML labeling.
  • frames 502 are directed to a sessionization processor 505.
  • Sessionization processor 505 may comprise circuitry and software configured to receive frames 502 and define sequences and sessions based on predefined parameters.
  • Each frame may have a time stamp associated therewith.
  • the time stamp may denote the frames capture time relative to a clock.
  • the time lapse between the frames’ collection is denoted as Δ in Fig. 5A.
  • the frames are placed into sequences (e.g., Seq. 1-4) according to their time stamp. For example, frame 2 is collected 1 min and 50 seconds after frame 1 is received and frame 3 is collected nearly immediately after frame 2.
  • a lapse of 2 mins and 30 seconds occurs between the collection of frames 3 and 4.
  • Frames 4, 5 and 6 are collected substantially immediately and there is a lapse of 1 min and 55 seconds before frames 7 and 8 are collected.
  • Processor 505 forms sequences as a function of the time stamps. Thus, sequence 1 contains only frame 1; sequence 2 contains frames 2 and 3, sequence 3 contains frames 4, 5 and 6; and sequence 4 contains frames 7 and 8.
  • processor 505 may sessionize the frames and sequences by comparing the Δ values against a threshold value (e.g., 2 minutes).
  • session 1 includes sequences 1 and 2
  • session 2 includes sequences 3 and 4 (frames 4 through 8).
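The grouping just described can be sketched as a two-threshold pass over frame timestamps. The 2-minute session timeout comes from the example above; the 60-second sequence gap is an assumed value, since the patent does not state an explicit sequence threshold:

```python
def sessionize(timestamps, seq_gap=60, session_timeout=120):
    """Group frame timestamps (seconds) into sequences and sessions.

    seq_gap: max gap between frames of one sequence (assumed value).
    session_timeout: inactivity that terminates a session (2 min here).
    Returns (sequences, sessions); each session is a list of sequence indices.
    """
    sequences = [[timestamps[0]]]
    sessions = [[0]]
    for prev, cur in zip(timestamps, timestamps[1:]):
        gap = cur - prev
        if gap > seq_gap:                        # gap too long: new sequence
            sequences.append([cur])
            if gap > session_timeout:            # inactivity too long: new session
                sessions.append([len(sequences) - 1])
            else:
                sessions[-1].append(len(sequences) - 1)
        else:
            sequences[-1].append(cur)
    return sequences, sessions
```

With timestamps mirroring the example gaps (1:50, near-zero, 2:30, near-zero, near-zero, 1:55, near-zero), this yields four sequences, with the first two sequences grouped into one session and the last two into another.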
  • the threshold may be dynamic or other dynamic session timeout threshold may be used.
  • Dynamic timeout interval may be devised for a session as a function of behavioral time period for toileting sessions (i.e., start, end). For example, certain behavioral characteristics of the user may be recognized to define a dynamic threshold. The behavioral characteristics may include, for example, whether a person gets up during a session or falls over without assistance, etc.
  • Fig. 5B is a continuation of Fig. 5A and illustrates an exemplary machine labeling or scoring by using the ML system according to one embodiment of the disclosure. Specifically, Fig. 5B shows an embodiment where each of the frames, sequences and sessions are labeled using ML trained to identify certain anomalies (e.g., cloudy urine or bloody stool) in the toilet bowl.
  • both time and personal data may be used to label the collected data (frames, sequences, sessions).
  • Fig. 5B illustrates the unlabeled data as frames and sequences
  • the left hand side of Fig. 5B shows the ML labeling process
  • the ML process is schematically illustrated as 550.
  • Optical data collected from the user’s toilet session arrives as frames 502.
  • the frames are received at frame labeling processor 552.
  • the frame labeling processor 552 may comprise circuitry and software to label the optical data frames with predefined labels.
  • the received frames are organized from left to right in the order of their arrival with the earliest frame on the left (clean bowl), followed by urine stream, cloudy urine, formed stool, formed stool and the most recent frame on the right (unformed stool).
  • Row 554 illustrates labels that match the optical characteristics of each frame.
  • the labels are selected by frame labeling processor 552.
  • frame labeling processor 552 may detect a clean bowl from the corresponding frame.
  • each frame is labeled with its corresponding label to form a labeled frame.
  • the labeled frames are schematically illustrated as row 556. This step may also be implemented by frame labeling processor 552.
  • the labeled frames 556 are grouped into sequences 558.
  • the grouping may be implemented based on a static or dynamic threshold as described in relation to Fig. 5A.
  • One or more of the processors may be used to implement the grouping.
  • the formed sequences may be matched against selected labels as shown at row 560.
  • the formed stool and the unformed stool are combined in one sequence.
  • the formed sequences 558 may be directed to sequence labeling processor 562, which may then match the sequences with sequence labels 564 to form labeled sequences 566.
  • labeled sequences 566 include one or more labeled frames (LF) as illustrated in Fig. 5B.
  • a combination of labeled frames and labeled sequences allows a healthcare professional or an AI to readily compile different frames and sessions in which the user’s stool showed an anomaly. It should be also noted that even though frames 4, 5 and 6 include different physical characteristics (i.e., formed and unformed stool), the sequence containing these frames is labeled as stool. A healthcare professional (or AI) reviewing the labeled sequence 3 may be alerted that the physical characteristics of the stool changed within a few frames.
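Deriving a sequence label from its frame labels, and flagging a mid-sequence change of the kind just noted, might look like the following sketch; the coarse-label mapping is a hypothetical illustration, not the patent's label set:

```python
# hypothetical mapping from fine-grained frame labels to coarse sequence labels
COARSE = {
    "formed stool": "stool",
    "unformed stool": "stool",
    "urine stream": "urine",
    "cloudy urine": "urine",
    "clean bowl": "clean bowl",
}

def label_sequence(frame_labels):
    """Return (sequence_label, changed) for a list of frame labels.

    sequence_label: majority coarse label across the frames.
    changed: True when the fine-grained characteristics varied
    within the sequence (e.g., formed -> unformed stool).
    """
    coarse = [COARSE.get(label, label) for label in frame_labels]
    sequence_label = max(set(coarse), key=coarse.count)
    changed = len(set(frame_labels)) > 1
    return sequence_label, changed
```

For the frames 4-6 example, the sequence is labeled "stool" while the change flag alerts a reviewer that the stool's physical characteristics shifted within the sequence.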
  • Fig. 5C illustrates the application of the disclosed principles to forming ML labeling to label the user’s sessions.
  • machine labeling processor 570 receives sessions 1 to n as inputs. These sessions are also illustrated at row 572 as clean bowl, cloudy urine, urine stream, formed stool and clean bowl. While a clean bowl label has been added to Fig. 5C, one of ordinary skill in the art would readily understand the application of a clean bowl label to the frames and sequences of Figs. 5A and 5B, respectively.
  • Sessions 572 are matched with session labels 574
  • Machine labeling processor 570 may provide additional labels as above and match them with session labels 574.
  • Row 576 illustrates labeled sessions identified as Sessions 1, 2 and 3. Each session may have one or more sequences and one or more frames.
  • the labeled sessions, labeled sequences and labeled frames may be stored at a memory circuit and associated with the user.
  • a user’s labeled sessions, for example, over several days may be used to track the user’s health or to identify health anomalies.
  • the sessionization may be static or dynamic. In static sessionization, the ML system determines the start and stop of the session as a function of a predetermined threshold. For example, if the threshold is set at 2 minutes (e.g., timeout), then the session is considered terminated if there is no user activity for more than 2 minutes.
  • Fig. 6 schematically illustrates dynamic sessionization according to one embodiment of the disclosure.
  • the dynamic sessionization may be considered a variable time window that changes depending on external variables including the user’s particular characteristics.
  • dynamic sessionization may be viewed as a series of contiguous toileting actions of a user during a given time interval.
  • the duration of the start and the ending of the toileting session may be dynamically changed based on external characteristics, including the user’ s behavior or health peculiarities (e.g., disabled users may require a longer start duration than healthy adults or children).
  • the start of the toileting session may be set to 3 minutes as illustrated by the dynamic start session 602.
  • the start of the toileting session may be triggered by a sensor that detects the user’s presence at or near the toilet.
  • the sensor may be configured to change the start of the toileting session depending on the user and the user’s specific needs.
  • the actual toileting time, indicated as 604, may also be static or dynamic.
  • toileting time 604 is shown with a static time interval (Δ) of 2 minutes but this may change according to certain predefined parameters.
  • the end of toileting session 606 may also define a dynamic period. As discussed, this portion may be extended or reduced according to the user’s needs or abilities.
  • the illustrative representation of Fig. 6 shows Session 1 as having a total of 8 minutes, which may be shortened or extended according to the user’s needs.
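The dynamic window of Fig. 6 can be sketched as per-user durations summed into a total session window. The profile keys and the rules (limited mobility lengthens the start period; needing assistance lengthens the end period) are illustrative assumptions:

```python
def session_window(profile):
    """Return (start, toileting, end) durations in minutes for a user.

    profile: dict of user characteristics; keys are hypothetical.
    """
    # start duration: longer for users with limited mobility (assumed rule)
    start = 3.0 if profile.get("mobility") == "limited" else 1.0
    # toileting time: static interval as in Fig. 6; could itself be dynamic
    toileting = 2.0
    # end duration: longer when assistance is needed (assumed rule)
    end = 3.0 if profile.get("needs_assistance") else 2.0
    return start, toileting, end
```

With a limited-mobility user who needs assistance, the window totals 8 minutes, matching the Session 1 illustration; other profiles shorten it.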
  • Fig. 7 illustrates an exemplary embodiment of the disclosure for providing information (e.g., training) to the machine learning system.
  • various sensors associated with the toilet provide sensor data 704 to ML system 700.
  • Sensor data may be obtained from sensors gathering time of flight (TOF) information 706, capacitive information 708, and information from the optical train 710.
  • TOF may comprise information relating to the length of an active session.
  • Capacitive sensor may relay information provided from a sensor located on the toilet seat which may be activated when the user is situated on the toilet.
  • Optical train information may comprise frames provided by the optical train situated in the toilet (see Fig. 1).
  • An exemplary TOF sensor may detect whether a user is standing in front of the toilet or sitting on the toilet or if someone is sitting on the commode riser of the toilet.
  • the TOF sensor may be located on the lid 110 of the toilet.
  • Certain toilets have a commode riser (e.g., for individuals who may be unable to squat on the toilet).
  • the TOF sensor may be placed (e.g., near the toilet) to detect presence of a user on a commode riser.
  • ML system 700 also receives annotations from the observers.
  • Annotations may comprise frames which are labeled by human observers (e.g., blood in the stool, cloudy urine, clean toilet, etc.)
  • the output of ML system 700 comprises labels and is illustrated as labels for frames, sequences and sessions.
  • the exemplary embodiment of Fig. 7 shows four label categories: base 720, behavioral 722, system 724 and clinical 726.
  • Other labels may be added without departing from the disclosed principles.
  • the labels may be added to each frame, sequence and session as needed to provide a comprehensive understanding of the user’s toileting behavior and to aid in a potential diagnosis.
  • the base session 720 may comprise labels for base observations during the toileting session.
  • base session may comprise formed stool, unformed stool, urine present, urine stream present, etc.
  • Behavioral labels 722 may include labels for the user’s behavior during a toileting session. This group may comprise labels including, standing urination, sitting urination, non-flushing, flushing, etc.
  • System labels 724 relate to the state of the system during the toileting session. System session may comprise labels including clean bowl, clean lens, dirty bowl, dirty lens, etc.
  • Clinical labels 726 comprise labels for clinical association during the toileting session and may comprise, for example, scatolia, blood present, cloudy urine, etc. Referring now to Figs. 5A-5C and Fig. 7, labels may be added by each of ML processors 552, 562 and 570 along with the various labels outlined in Fig. 7.
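The four label categories can be represented as an enumeration, with a labeled frame carrying labels from several categories at once; the example labels come from the groups listed above, while the data layout itself is an illustrative assumption:

```python
from enum import Enum

class LabelCategory(Enum):
    BASE = "base"              # e.g., formed stool, urine present
    BEHAVIORAL = "behavioral"  # e.g., standing urination, non-flushing
    SYSTEM = "system"          # e.g., clean bowl, dirty lens
    CLINICAL = "clinical"      # e.g., blood present, cloudy urine

# a labeled frame (or sequence, or session) may carry labels
# from several categories simultaneously
frame_labels = {
    LabelCategory.BASE: ["urine present"],
    LabelCategory.SYSTEM: ["clean lens"],
    LabelCategory.CLINICAL: ["cloudy urine"],
}
```

Keeping the categories distinct lets a report separate system-state observations (e.g., a dirty lens that may degrade image quality) from clinical findings.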
  • By way of an additional example, a user’s movement during toileting, primarily due to how the user wipes with toilet paper, can cause multiple sequences. Using multiple sensors and time-based sessionization, this user behavior can be correctly sessionized. This may be of particular interest for a senior population where assistance in wiping and cleaning may be required. Behaviors such as a delayed or forgotten flush can be sessionized using the ML system and sequence labels. Similarly, failing to flush is a user behavior that can cause incorrect sessionization. To resolve this issue, the prior session data may be fed into the current session’s analysis to identify only the new excreta of the new session. In certain embodiments, sessionization may utilize multiple sensor data sets to account for other user behavior such as walking in front of the toilet or drying after showering (where the shower is near the toilet).
  • Fig. 8 shows exemplary session rules to account for the user’s behavior. Specifically, Fig. 8 shows various examples when user movement (or non-movement) or flushing may affect the label processing. Additional rules for labeling or processing the frame information prior to labeling may be added without departing from the disclosed principles.
  • Fig. 9 schematically illustrates an exemplary implementation of personalization rules.
  • The left-hand side of Fig. 9 shows inputs that may be used as user identification metrics.
  • The inputs may comprise the user’s personal attributes; one or more of the attributes may be used to identify the user.
  • The exemplary metrics of Fig. 9 include weight (902), height (904), non-flush (805) and bioelectrical impedance (908).
  • Other exemplary metrics, which may be known from the user’s profile, include: age, continence status, current medications, mobility, body temperature (as measured by the toilet seat or a thermal scanner), or any other available health information.
  • The user’s urine voiding volume may be measured and used as a metric.
  • The results of a dipstick (or other litmus) test of the user’s urine may be used as a metric.
  • A watery stool, or a consistently watery stool, may be identified by measuring the bioelectrical impedance of the bowl constituents. Another user metric may be the user’s voice (not shown) or fingerprint (not shown). The user may personalize the session upon activating it. In addition, sessionization may be personalized for a user who does not flush regularly (910). Similar user recognition may be used for a user who needs to sit on the toilet after showering in order to dry off. Sessionization may also be based on the user’s toileting habits; for example, a user who regularly uses the toilet in the morning or in the evening may be identified based on the time of use.
  • Fig. 9 also relates to actions taken by the sessionization processor; the labeling actions comprise, by way of example, a non-flush label (910), a user movement label (912) and other exemplary labels (914).
  • A report may be generated for the user’s toileting session.
  • The report may cover the most recent session or several sessions.
  • The report may trigger an alert if certain parameters are observed in the session. For example, if the user’s session indicates significant blood in the stool, a healthcare professional may be notified.
  • An annotated report may be generated by studying the user’s sessions.
  • A filtering step may generate frames of interest for an annotation system.
  • The annotation system may compile the relevant frames of interest for a report. Errors may be detected and corrected before generating the report. For example, if a frame identifies a health anomaly (e.g., bloody stool) but the associated sequence and session labels suggest a dirty lens, the ML filtering may weigh the anomaly against these other physical issues. Finally, the generated sequences may be made searchable for easy lookup of a patient’s trending conditions.
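The filtering step above can be sketched as a consistency check between frame-level labels and their sequence- and session-level context. This is only an illustrative sketch: the label names, the dictionary layout, and the `frames_of_interest` function are invented for the example, not the disclosure's actual data model.

```python
def frames_of_interest(frames, condition="blood_present"):
    """Select frames labeled with the condition of interest, discarding
    candidates whose enclosing sequence or session carries a confounding
    system label (e.g. dirty lens) that could explain the anomaly."""
    confounders = {"dirty_lens", "dirty_bowl"}
    selected = []
    for frame in frames:
        if condition not in frame["frame_labels"]:
            continue
        context = set(frame["sequence_labels"]) | set(frame["session_labels"])
        if context & confounders:
            continue  # anomaly likely explained by the system state, skip it
        selected.append(frame)
    return selected
```

An annotation system could then compile the returned frames into a report, or escalate to a healthcare provider when the number of retained anomaly frames exceeds a threshold.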
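The two-level grouping claimed in the examples below — frames into sequences by a roughly constant capture interval, and sequences into sessions by a larger differential interval — can be sketched as follows. This is a minimal illustration: the `Frame` type, the jitter tolerance, and all interval values are assumptions, not the disclosed implementation.

```python
from dataclasses import dataclass

@dataclass
class Frame:
    t: float  # capture time in seconds

def form_sequences(frames, frame_interval, jitter=0.25):
    """Group frames separated by a roughly constant capture interval;
    a gap noticeably larger than that interval starts a new sequence."""
    sequences, current = [], []
    for f in sorted(frames, key=lambda f: f.t):
        if current and f.t - current[-1].t > frame_interval * (1 + jitter):
            sequences.append(current)
            current = []
        current.append(f)
    if current:
        sequences.append(current)
    return sequences

def form_sessions(sequences, session_gap):
    """Group sequences into sessions: a differential interval between the
    end of one sequence and the start of the next that exceeds the
    threshold closes the current session."""
    sessions, current = [], []
    for seq in sequences:
        if current and seq[0].t - current[-1][-1].t > session_gap:
            sessions.append(current)
            current = []
        current.append(seq)
    if current:
        sessions.append(current)
    return sessions
```

With a 1 s capture interval and a 30 s session gap, frames at t = 0-2 s and t = 10-11 s would fall into two sequences of a single session (a short pause, e.g. for wiping), while frames at t = 100-101 s would start a new session.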
  • Example 1 relates to a method to automatically label toileting acts of a user, the method comprising: activating a sensor to record the toileting acts in a plurality of frames, each frame capturing one or more activities during a discrete time interval and each frame separated from a subsequent frame by a time interval; forming one or more sequences by grouping the frames separated from each other by a substantially constant time interval; forming one or more sessions by grouping the sequences as a function of a differential interval; labeling each frame by identifying one or more conditions present in the frame; labeling each sequence by identifying one or more conditions present in the plurality of frames in the respective sequence; and labeling each session by identifying one or more conditions present in the plurality of sequences in the respective session.
  • Example 2 relates to the method of example 1, wherein activating a sensor to record the toileting acts further comprises activating a capacitive sensor to mark the start of a toileting session and activating a time-of-flight sensor to engage an optical system to record the toileting session.
  • Example 3 relates to the method of example 1, further comprising filtering the plurality of frames, sequences and sessions as a function of a predefined condition to generate a condition report.
  • Example 4 relates to the method of example 3, further comprising directing the condition report to a provider when the condition exceeds a threshold.
  • Example 5 relates to the method of example 1, wherein the one or more conditions are selected to identify a biomarker or a condition marker.
  • Example 6 relates to the method of example 5, wherein the condition marker is selected from the group consisting of blood in the stool, unformed stool, cloudy urine and blood in the urine.
  • Example 7 relates to the method of example 1, wherein the condition comprises a physical characteristic of an excrement.
  • Example 8 relates to the method of example 1, wherein the threshold is determined dynamically as a function of one or more user characteristics.
  • Example 9 relates to the method of example 8, wherein the one or more characteristics comprises age, weight and mobility.
  • Example 10 relates to the method of example 1, wherein the sensors capture the user’s activities on a toilet using an optical train and circuitry to receive and store image data.
  • Example 11 relates to the method of example 1, wherein the frame comprises an optical image of one or more of the user’s excrement or urine.
  • Example 12 relates to a system to automatically label toileting acts of a user, the system comprising: a processor circuitry; a memory circuitry in communication with the processor circuitry, the memory circuitry configured with instructions to cause the processor circuitry to automatically label toileting acts of the user by: activate a sensor to record the toileting acts in a plurality of frames, each frame capturing one or more activities during a discrete time interval and each frame separated from a subsequent frame by a time interval; form one or more sequences by grouping the frames separated from each other by a substantially constant time interval; form one or more sessions by grouping the sequences as a function of a differential interval; label each frame by identifying one or more conditions present in the frame; label each sequence by identifying one or more conditions present in the plurality of frames in the respective sequence; and label each session by identifying one or more conditions present in the plurality of sequences in the respective session.
  • Example 13 is directed to the system of example 12, wherein the memory circuitry further comprises instructions to activate the sensor to record the toileting acts by activating a capacitive sensor to mark the start of a toileting session and activating a time-of-flight sensor to engage an optical system to record the toileting session.
  • Example 14 is directed to the system of example 12, further comprising filtering the plurality of frames, sequences and sessions as a function of a predefined condition to generate a condition report.
  • Example 15 is directed to the system of example 14, further comprising directing the condition report to a provider when the condition exceeds a threshold.
  • Example 16 is directed to the system of example 12, wherein the one or more conditions are selected to identify a biomarker or a condition marker.
  • Example 17 is directed to the system of example 16, wherein the condition marker is selected from the group consisting of blood in the stool, unformed stool, cloudy urine and blood in the urine.
  • Example 18 is directed to the system of example 12, wherein the condition comprises a physical characteristic of an excrement.
  • Example 19 is directed to the system of example 12, wherein the threshold is determined dynamically as a function of one or more user characteristics.
  • Example 20 is directed to the system of example 19, wherein the one or more characteristics is selected from the group consisting of the user’s age, continence status, current medications, mobility, body temperature, or any other available health information.
  • Example 21 is directed to the system of example 12, wherein the sensors capture the user’s activities on a toilet using an optical train and circuitry to receive and store image data.
  • Example 22 is directed to the system of example 12, wherein the frame comprises an optical image of one or more of the user’s excrement or urine.
  • Example 23 is directed to a method to form machine learning (ML) sessions from a user’s activities, the method comprising: receiving a plurality of frames from a sensor, each frame capturing one or more activities during a discrete time interval and each frame separated from a subsequent frame by a time interval; forming one or more sequences by grouping the frames separated from each other by a substantially constant time interval; and forming one or more sessions by grouping the sequences as a function of a differential interval that exceeds a threshold.
  • Example 24 is directed to the method of example 23, wherein the threshold is static and is determined a priori.
  • Example 25 is directed to the method of example 23, wherein the threshold is dynamic.
  • Example 26 is directed to the method of example 25, wherein the threshold is determined dynamically as a function of one or more user characteristics.
  • Example 27 is directed to the method of example 26, wherein the one or more characteristics is selected from the group consisting of the user’s age, continence status, current medications, mobility, body temperature, or any other available health information.
  • Example 28 is directed to the method of example 23, wherein the frames are collected by one or more sensors to capture the user’s activities on a toilet.
  • Example 29 is directed to the method of example 23, wherein the frame comprises an optical image of one or more of the user’s excrement or urine.
  • Example 30 is directed to a system to form machine learning (ML) sessions from a user’s activities, the system comprising: a processor circuitry; a memory circuitry in communication with the processor circuitry, the memory circuitry configured with instructions to cause the processor circuitry to: receive a plurality of frames from a sensor, each frame capturing one or more activities during a discrete time interval and each frame separated from a subsequent frame by a substantially constant time interval; form one or more sequences by grouping the frames separated from each other by a differential interval; and form one or more sessions by grouping the sequences as a function of a differential interval that exceeds a threshold.
  • Example 31 is directed to the system of example 30, wherein the threshold is static and is determined a priori.
  • Example 32 is directed to the system of example 30, wherein the threshold is dynamic.
  • Example 33 is directed to the system of example 32, wherein the threshold is determined dynamically as a function of one or more user characteristics.
  • Example 34 is directed to the system of example 33, wherein the one or more characteristics is selected from the group consisting of the user’s age, continence status, current medications, mobility, body temperature, or any other available health information.
  • Example 35 is directed to the system of example 30, wherein the frames are collected by one or more sensors to capture the user’s activities on a toilet.
  • Example 36 is directed to the system of example 30, wherein the frame comprises an optical image of one or more of the user’s excrement or urine.
  • Example 37 is directed to a non-transitory computer-readable medium comprising a processor circuitry and a memory circuitry in communication with the processor circuitry and including instructions to form machine learning (ML) sessions from a user’s activities, the memory circuitry further comprising instructions to cause the processor to: receive a plurality of frames from a sensor, each frame capturing one or more activities during a discrete time interval and each frame separated from a subsequent frame by a substantially constant time interval; form one or more sequences by grouping the frames separated from each other by a differential interval; and form one or more sessions by grouping the sequences as a function of a differential interval that exceeds a threshold.
  • Example 38 is directed to the medium of example 37, wherein the threshold is static and is determined a priori.
  • Example 39 is directed to the medium of example 37, wherein the threshold is dynamic.
  • Example 40 is directed to the medium of example 39, wherein the threshold is determined dynamically as a function of one or more user characteristics.
  • Example 41 is directed to the medium of example 40, wherein the one or more characteristics is selected from the group consisting of the user’s age, continence status, current medications, mobility, body temperature, or any other available health information.
  • Example 42 is directed to the medium of example 37, wherein the frames are collected by one or more sensors to capture the user’s activities on a toilet.
  • Example 43 is directed to the medium of example 37, wherein the frame comprises an optical image of one or more of the user’s excrement or urine.
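Examples 8, 19, 26, 33 and 40 above recite a threshold determined dynamically as a function of user characteristics. One way such a threshold might be derived from a user profile is sketched below; the profile keys, the base gap, and the multipliers are all invented for illustration, not values from the disclosure.

```python
def session_gap_threshold(profile, base_gap=60.0):
    """Start from a static (a priori) session gap in seconds and widen it
    for users whose characteristics suggest longer mid-session pauses,
    e.g. advanced age, assisted mobility, or incontinence."""
    gap = base_gap
    if profile.get("age", 0) >= 75:
        gap *= 1.5   # older users may pause longer between acts
    if profile.get("mobility") == "assisted":
        gap *= 2.0   # waiting for a caregiver's help with wiping
    if profile.get("continence") == "incontinent":
        gap *= 1.25
    return gap
```

A senior user who needs assistance would thus receive a wider gap, so that a pause while a caregiver helps with wiping does not split one toileting session in two.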


Abstract

The disclosed embodiments relate to a system, method and apparatus for forming ML sessions, such that an artificial intelligence (AI) system can identify, track and diagnose a user through the user's bathroom sessions. As used herein, a session corresponds to a period of time during which a user is active in the bathroom. The session may include some inactivity, provided such inactivity lasts only for a predefined duration. In an exemplary embodiment, a system is configured to capture images of excreta and, using ML algorithms, classifies the toileting session using digital biomarkers (DBM). The ability to create an excreta log that accurately provides detailed information to physicians and healthcare providers can revolutionize healthcare by signaling when further urine or stool analysis is needed. The disclosed ML and AI image-identification embodiments require no change in user behavior and provide a link between medical records and specific excreta patterns.
PCT/US2023/021422 2022-03-23 2023-05-08 Système, procédé et appareil de formation de sessions d'apprentissage automatique (ml) WO2023183660A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US17/701,799 2022-03-23
US17/701,799 US20220211354A1 (en) 2017-04-07 2022-03-23 Biomonitoring devices, methods, and systems for use in a bathroom setting
US202263339407P 2022-05-06 2022-05-06
US63/339,407 2022-05-06

Publications (1)

Publication Number Publication Date
WO2023183660A1 (fr)

Family

ID=88101985

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/021422 WO2023183660A1 (fr) 2022-03-23 2023-05-08 Système, procédé et appareil de formation de sessions d'apprentissage automatique (ml)

Country Status (1)

Country Link
WO (1) WO2023183660A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5986701A (en) * 1996-09-26 1999-11-16 Flashpoint Technology, Inc. Method and system of grouping related images captured with a digital image capture device
US6263089B1 (en) * 1997-10-03 2001-07-17 Nippon Telephone And Telegraph Corporation Method and equipment for extracting image features from image sequence
WO2016135735A1 (fr) * 2015-02-25 2016-09-01 Outsense Diagnostics Ltd. Analyse de substance corporelle
US20190298316A1 (en) * 2017-04-07 2019-10-03 Toi Labs, Inc. Biomonitoring devices, methods, and systems for use in a bathroom setting
US20210310227A1 (en) * 2018-08-01 2021-10-07 Karl-Josef Kramer Toilet device with support functions



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23775772

Country of ref document: EP

Kind code of ref document: A1