WO2023242873A1 - Device system and method to monitor human activity


Info

Publication number
WO2023242873A1
WO2023242873A1 (application PCT/IN2023/050568)
Authority
WO
WIPO (PCT)
Prior art keywords
activity
human activity
module
environment
monitor
Prior art date
2022-06-17
Application number
PCT/IN2023/050568
Other languages
French (fr)
Inventor
Suresh Kumar MATHA
Ashish Arora
Nikhil Sharma
Original Assignee
Matdun Labs India Private Limited
Priority date
2022-06-17
Filing date
2023-06-15
Publication date
2023-12-21
Application filed by Matdun Labs India Private Limited
Publication of WO2023242873A1

Classifications

    • G PHYSICS > G06 COMPUTING; CALCULATING OR COUNTING > G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V10/143: Sensing or illuminating at different wavelengths
    • G06V40/161: Human faces; Detection; Localisation; Normalisation
    • G06V40/172: Human faces; Classification, e.g. identification
    • G06V40/23: Recognition of whole body movements, e.g. for sport training

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Social Psychology (AREA)
  • Psychiatry (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The present invention relates to the maintenance and cleaning industry (hospitality industry) and more particularly to a smart system and method for monitoring, information collection, decision making, training, providing services, and human interaction in the real world. The invention provides a maid activity monitoring system and method. The system provides a new AI model that uses a wearable device to see and understand human interactions with different objects in the real world. The AI model may have more than 90% accuracy in identifying/understanding/recognizing how a maid interacts with different objects while cleaning a room. This new AI model may be successfully applied to any hospitality industry and residential places where each task can be identified uniquely, and data analytics are provided accordingly.

Description

DEVICE SYSTEM AND METHOD TO MONITOR HUMAN ACTIVITY
FIELD OF INVENTION:
[0001] The present invention relates to the maintenance and cleaning industry (hospitality industry) and more particularly to a smart system and method for monitoring, information collection, decision making, training, providing services, and human interaction in the real world.
BACKGROUND OF THE INVENTION
[0002] In the present world, the maintenance and cleaning industry has minimal computer infrastructure for supporting efficiency and monitoring. Today, maids/room-service personnel undertake the task of cleaning rooms, which is generally an unmonitored and unmeasured activity. When a customer would like to move into a room, the customer typically calls the maid/room service, who respond upon availability/leisure. The way maids/room service select rooms for cleaning, determine the priority of which rooms will need to be available first, and optimize the floor for services is mainly manual today. Beyond maid/room service and cleaning, other services in the cleaning and maintenance industry, or any other industry where human activity can be monitored, can also benefit from a more systematic process in service training, coaching, tracking and delivery.
[0003] From the aforementioned standpoint, there has been a need for a smart system and method for monitoring, information collection, decision making, training, providing services, and human interaction in the real world. Therefore, there is a genuine need to overcome all the above problems by developing an improved system and method which is accurate, efficient, dynamic, and simple.
OBJECTS OF THE INVENTION
[0004] It is an objective of the present invention to provide an AI model(s) that uses a wearable device to see and understand human interactions with different objects in the real world.
[0005] It is an objective of the present invention to provide a system to understand how a maid interacts with different objects while cleaning a room.
[0006] It is an objective of the present invention to provide a system to understand how much time a user/maid/person takes to perform a task such as cleaning a bathtub, changing bed sheets, vacuuming, etc., either linearly or in random fashion.
SUMMARY OF THE INVENTION
[0007] The present invention relates to the maintenance and cleaning industry and more particularly to a smart system and method for monitoring, information collection, decision making, training, providing services, and human interaction in the real world.
[0008] Although the present disclosure is explained with respect to a maid activity monitoring system, it will be appreciated that this application is used for illustration purposes and is not a limitation.
[0009] The present invention provides a maid activity monitoring system and method. The disclosure provides a new AI model that uses a wearable device to see and understand human interactions with different objects in the real world. The AI model may have more than 90% accuracy in identifying/understanding/recognizing how a maid interacts with different objects while cleaning a room. This novel AI model may be successfully applied to any hospitality industry and residential places where each task can be identified uniquely, and data analytics are provided accordingly.
[00010] The system understands how much time a user/maid/person takes to perform a task such as cleaning a bathtub, changing bed sheets, vacuuming, etc., either linearly or in random fashion, to provide data analytics for optimization of the workforce and their daily tasks.
[00011] The present invention provides an easy repair mechanism. The mechanism provided is easy to work on, thereby making the repair work very user friendly. Since it is a modular type of assembly, whenever a part is damaged it can be easily replaced. This can be done in a short time and at a low cost. The present invention also provides easy assembly.
[00012] Additional aspects of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The aspects of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims. It is to be understood that the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the invention, as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[00013] Other objects, features and advantages will occur to those skilled in the art from the following description of the preferred embodiment and the accompanying drawings, in which:
[00014] Fig. 1 illustrates a schematic representation of the custom hardware and how it interacts with the environment.
[00015] Fig. 2 illustrates a block diagram of modules involved in user recognition, capturing sensory information from environment, activity detection and tracking, metric assessment, and custom report generation.
[00016] Fig. 3 illustrates a computing unit in communication with a database server via a network.
[00017] Fig. 4 illustrates a flowchart describing steps in dealing with user recognition, understanding the nature of ongoing activities, performing performance assessment, and issuing alerts/notifications by generating custom reports.
[00018] Fig. 5 illustrates a schematic of the system interacting with the human activities captured from sensory circuitry via mobile/offline.
[00019] Although specific features of the present invention are shown in some drawings and not in others, this is done for convenience only as each feature may be combined with any or all of the other features in accordance with the present invention.
DETAILED DESCRIPTION
[00020] Embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying figures and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00021] In the following description, numerous specific details are set forth in order to provide a thorough understanding of embodiments of the present invention. It will be apparent to one skilled in the art that embodiments of the present invention may be practiced without some of these specific details.
[00022] Embodiments of the present invention may include various steps, which will be described below. Steps may be performed by hardware components or may be embodied in machine-executable instructions, which may be used to cause a general-purpose or special-purpose processor programmed with the instructions to perform the steps. Alternatively, steps may be performed by a combination of hardware, software, and/or by human operators.
[00023] If the specification states a component or feature “may”, “can”, “could”, or “might” be included or have a characteristic, that component or feature is not required to be included or have the characteristic.
[00024] The foregoing illustrated and described embodiments of the invention are susceptible to various modifications and alternative forms, and it should be understood that the invention generally, as well as the specific embodiments described herein, are not limited to the particular forms or embodiments disclosed, but cover all modifications, equivalents and alternatives falling within the scope of the appended claims. By way of non-limiting example, it will be appreciated by those skilled in the art that particular features or characteristics in one embodiment may be combined as suitable with features or characteristics described in another embodiment. Moreover, the device and method of the present invention can be used along with an automated or even with a manual system.
[00025] A device to monitor human activity may include one or more processors or memories configured to locate and monitor human activity by a camera, microcontroller, illuminator and other sensors, wherein the human activity may be provided by 2D information as an input.
[00026] The device may capture scenes of the environment by an imaging sensor module in frame-by-frame sequence within a permissible range in day and night weather conditions, wherein scenes are extracted in any environment and interpreted by vision-based custom AI models; and generate and interpret the human activity status from a custom report by the activity metric calculation module based on performance.
[00027] The device to monitor human activity may operate the imaging sensor module and the illuminator for capturing activities as and when user movement/face/activity is detected within the captured scene.
[00028] The device to monitor human activity may calculate activity by a calculation module to evaluate the amount/quality of work done in performing each activity as a whole or individually.
[00029] The device to monitor human activity, where a database server manages individual/team activity profiles, which include performance assessment reports, and issues alerts/notifications to an individual and/or a team.
[00030] The device to monitor human activity, where a face recognition module and an activity detection and tracking module recognize individuals based on activity profiles stored in the database server.
[00031] Fig. 1 illustrates a schematic representation of the custom hardware and how it interacts with the environment. A device (100) for capturing human activity may have a microcontroller module (101) and an image sensor module (102) which may capture frames/images of an individual. Activities and operations may be captured in frame-by-frame sequences during the day/night or in various situations; the device (100) may also include an RGB sensor, an IR sensor, and other sensors. The IR sensor and RGB sensor capture user activity in frame sequence both in day and night times under all conditions of weather. The human activity may be provided by 2D information as an input. In an embodiment, custom vision-based AI models may start interacting with the environment, learn what people do daily, and create custom reports via the activity metric calculation module (107) of what is happening in and around them and what could be done more effectively, based on how well they perform the activity assigned to them as an individual or as a team. A database server (108) may be set up to handle all human activity profile(s), including the criteria for evaluating each person or the entire team involved in the activity, custom reports, and the method for generating alerts/notifications as and when required.
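By way of illustration only (not part of the disclosure), a capture loop of this kind might look as follows in Python; the camera indices, darkness threshold, and function names are assumptions introduced here.

```python
# Illustrative sketch only; assumes OpenCV and two camera indices
# (0 = RGB sensor, 1 = IR sensor). Threshold and names are hypothetical.
import cv2

RGB_CAM, IR_CAM = 0, 1          # hypothetical device indices
DARKNESS_THRESHOLD = 40.0       # mean-intensity cutoff for switching to IR

def capture_frames():
    rgb = cv2.VideoCapture(RGB_CAM)
    ir = cv2.VideoCapture(IR_CAM)
    try:
        while True:
            ok, frame = rgb.read()
            if not ok:
                break
            # Switch to the IR sensor when the scene is too dark for RGB.
            if cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).mean() < DARKNESS_THRESHOLD:
                ok, frame = ir.read()
                if not ok:
                    continue
            yield frame   # hand off frame-by-frame to the vision-based AI models
    finally:
        rgb.release()
        ir.release()
```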
[00032] In an embodiment, a face recognition module (105) and an activity detection and tracking module (106) recognize individuals based on activity profiles stored in the database server (108). A performance assessment module (109) may be configured to detect and assess daily performance from the frame-by-frame sequences and the daily environment. An alert/notification module (110) may be configured to trigger an alert and notification in case of any mismatch or incorrect profile identification.
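A minimal sketch of how modules (105), (106) and (110) might interact is given below; the embedding comparison, similarity threshold, and alert sink are assumptions made for illustration, not details of the disclosure.

```python
# Illustrative sketch; embeddings, threshold and alert sink are hypothetical.
import numpy as np

PROFILE_DB: dict[str, np.ndarray] = {}   # user_id -> stored face embedding
MATCH_THRESHOLD = 0.7                    # hypothetical cosine-similarity cutoff

def recognize(embedding: np.ndarray) -> str | None:
    """Return the best-matching user id, or None on a mismatch."""
    best_id, best_score = None, 0.0
    for user_id, stored in PROFILE_DB.items():
        score = float(np.dot(embedding, stored) /
                      (np.linalg.norm(embedding) * np.linalg.norm(stored)))
        if score > best_score:
            best_id, best_score = user_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

def trigger_alert(reason: str) -> None:
    print(f"ALERT: {reason}")   # stand-in for the real notification channel (110)

def on_frame_embedding(embedding: np.ndarray) -> None:
    if recognize(embedding) is None:
        trigger_alert("unrecognized or mismatched profile")
```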
[00033] Fig. 2 illustrates a block diagram of modules involved in user recognition, capturing sensory information from the environment, activity detection and tracking, metric assessment, and custom report generation. Under this module, a user authentication module (201) may be configured to authenticate an individual from the frame-by-frame sequences. A sensory information module (202) may detect and record the environmental information day and night. In an embodiment, activity of a human being may be detected by a human activity detection and tracking module (203). Daily activity of the environment and human detection may be calculated by an activity metric calculation module (204) to generate a daily report. A performance assessment module (205) may be configured to detect and assess daily performance from the frame-by-frame sequences and the daily environment. A customized report may be generated by a custom report generation module (206).
[00034] In an embodiment, the database may have a user profile module (207) which may manage and store the individual user profiles. An activity profile module (208) may be configured to detect and store the daily activity log of the individuals and the environment. A report module (209) may be configured to generate a daily report on daily activities.
[00035] Fig. 3 illustrates a computing unit in communication with the database server via a network. Module (300) has multiple sensors and modules to complete a few activities. A user authentication module (301) may be configured to authenticate an individual from the frame-by-frame sequences. A sensory information module (302) may detect and record the environmental information day and night. In an embodiment, activity of a human being may be detected by a human activity detection and tracking module (303). Daily activity of the environment and human detection may be calculated by an activity metric calculation module (304) to generate a daily report. A performance assessment module (305) may be configured to detect and assess daily performance from the frame-by-frame sequences and the daily environment. A customized report may be generated by a custom report generation module (306).
[00036] In an embodiment, the database may have a user profile module (307) which may manage and store the individual user profiles. An activity profile module (308) may be configured to detect and store the daily activity log of the individuals and the environment. A report module (309) may be configured to generate a daily report on daily activities.
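As an illustration of what the activity metric calculation modules (204/304) and report generation modules (206/306) above might compute, the sketch below aggregates per-task durations from an activity log into a daily report; the record layout, task names, and metric are assumptions introduced here.

```python
# Illustrative sketch; the record layout and metrics are hypothetical.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class ActivityRecord:
    user_id: str
    task: str            # e.g. "clean_bathtub", "change_bed_sheets"
    duration_s: float    # time spent on the task, from the tracker

def daily_report(records: list[ActivityRecord]) -> dict[str, dict[str, float]]:
    """Total seconds per task, per user, for one day's activity log."""
    totals: dict[str, dict[str, float]] = defaultdict(lambda: defaultdict(float))
    for r in records:
        totals[r.user_id][r.task] += r.duration_s
    return {user: dict(tasks) for user, tasks in totals.items()}

log = [ActivityRecord("maid_01", "clean_bathtub", 540.0),
       ActivityRecord("maid_01", "vacuuming", 300.0),
       ActivityRecord("maid_01", "clean_bathtub", 120.0)]
print(daily_report(log))  # {'maid_01': {'clean_bathtub': 660.0, 'vacuuming': 300.0}}
```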
[00037] In an embodiment, all the above-mentioned data may be stored in the database (108) as mentioned in Fig. 1. Module (300) may extract data from the database (108) and communicate the same to the cloud and/or custom cloud hardware.
[00038] Fig. 4 illustrates a flowchart describing steps in dealing with user recognition, understanding the nature of ongoing activities, performing performance assessment, and issuing alerts/notifications by generating custom reports. The process starts at (401) and moves to step (402), where the microcontroller module (101) and the image sensor module (102) capture the frame-by-frame sequences, video frames are extracted, and the process moves to the next step. In step (403), activity of a human being may be detected by the human activity detection and tracking module, and metrics may be assigned to each activity in step (404). Based on the metrics, performance may be assessed in step (405). Based on the metrics and the performance, a custom report may be created. In step (406), the alert/notification module (110) may trigger an alert and notification in case of any mismatch or incorrect profile identification. The process ends at (407).
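Purely as a sketch, the flow of steps (401) to (407) could be arranged as the following pipeline; every function body here is a stub invented for illustration, not the actual detection or assessment logic.

```python
# Illustrative pipeline for steps (401)-(407); all bodies are placeholder stubs.
def detect_activity(frame):
    return {"task": "vacuuming", "user": "maid_01"}        # stub detector

def assign_metrics(activity):
    return {"task": activity["task"], "duration_s": 1.0}   # stub metrics

def assess(metrics):
    return {"score": 1.0, "mismatch": False}               # stub assessment

def build_custom_report(metrics, performance):
    return {"metrics": metrics, "performance": performance}

def trigger_alert(report):
    print("ALERT:", report)                                # module (110)

def run_pipeline(frames):                                  # (401) start
    for frame in frames:                                   # (402) capture frames
        activity = detect_activity(frame)                  # (403) detect/track
        metrics = assign_metrics(activity)                 # (404) assign metrics
        performance = assess(metrics)                      # (405) assess
        report = build_custom_report(metrics, performance)
        if performance["mismatch"]:                        # (406) alert on mismatch
            trigger_alert(report)
                                                           # (407) end
```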
[00039] Fig. 5 illustrates a schematic of the system interacting with the human activities captured from sensory circuitry via mobile/offline. In an embodiment, one or more modules (501) and IR/RGB sensors (504) may be configured to capture and assess the image of the individual frame-by-frame. The images and the profile of the individual may be stored in a database (506). The database (506) may further be connected to custom cloud hardware (505). In an embodiment, the database (506) may further be connected with a cloud network (507). The profile and images may be accessed through a mobile application on a mobile computing device (503).
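One plausible shape for the database (506) and its cloud link (507) is sketched below using SQLite; the schema, file name, and the sync stub are assumptions made for illustration.

```python
# Illustrative sketch of database (506); schema and sync stub are hypothetical.
import sqlite3

def init_db(path: str = "activity.db") -> sqlite3.Connection:
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS activity_log (
                        user_id     TEXT NOT NULL,
                        task        TEXT NOT NULL,
                        duration_s  REAL NOT NULL,
                        captured_at TEXT DEFAULT CURRENT_TIMESTAMP)""")
    return conn

def log_activity(conn, user_id: str, task: str, duration_s: float) -> None:
    conn.execute("INSERT INTO activity_log (user_id, task, duration_s) VALUES (?, ?, ?)",
                 (user_id, task, duration_s))
    conn.commit()

def sync_to_cloud(conn) -> None:
    rows = conn.execute("SELECT * FROM activity_log").fetchall()
    # Stand-in for the upload to cloud network (507); the transport is not
    # specified in the disclosure, so this only reports what would be sent.
    print(f"would sync {len(rows)} records to the cloud")

conn = init_db()
log_activity(conn, "maid_01", "change_bed_sheets", 240.0)
sync_to_cloud(conn)
```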
[00040] The present invention relates to the maintenance and cleaning industry and more particularly to a smart system and method for monitoring, information collection, decision making, training, providing services, and human interaction in the real world.
[00041] Although the present disclosure is explained with respect to a maid activity monitoring system, it will be appreciated that this application is used for illustration purposes and is not a limitation.
[00042] The present invention provides a maid activity monitoring system and method. The disclosure provides a new AI model that uses a wearable device to see and understand human interactions with different objects in the real world. The AI model may have more than 90% accuracy in identifying/understanding/recognizing how a maid interacts with different objects while cleaning a room. This novel AI model may be successfully applied to any hospitality industry and residential places where each task can be identified uniquely, and data analytics are provided accordingly.
[00043] The system understands how much time a user/maid/person takes to perform a task such as cleaning a bathtub, changing bed sheets, vacuuming, etc., either linearly or in random fashion, to provide data analytics for optimization of the workforce and their daily tasks.
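To make the "linearly or in random fashion" point concrete, a timer that accumulates per-task time even when a worker interleaves tasks out of order might look like the following sketch; the TaskTimer API is invented for illustration.

```python
# Illustrative sketch; the TaskTimer API is invented for illustration.
import time

class TaskTimer:
    """Accumulates time per task even when tasks are interleaved randomly."""
    def __init__(self):
        self.totals: dict[str, float] = {}
        self._current: tuple[str, float] | None = None  # (task, start time)

    def switch_to(self, task: str) -> None:
        self.stop()
        self._current = (task, time.monotonic())

    def stop(self) -> None:
        if self._current is not None:
            task, start = self._current
            self.totals[task] = self.totals.get(task, 0.0) + time.monotonic() - start
            self._current = None

timer = TaskTimer()
timer.switch_to("clean_bathtub")   # maid starts the bathtub...
timer.switch_to("vacuuming")       # ...switches to vacuuming...
timer.switch_to("clean_bathtub")   # ...and returns to the bathtub
timer.stop()
print(timer.totals)                # per-task totals in seconds
```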
[00044] In an embodiment, the device may be connected to the Internet, Wi-Fi, or any communication device to send out real-time data/notifications/alerts/updates. The neural engine processes multiple authentication/activity monitoring layers to identify the user/activity, and processing may happen on the device, on the cloud, or on both.
[00045] The present invention provides an easy repair mechanism. The mechanism provided is easy to work on, thereby making the repair work very user friendly. Since it is a modular type of assembly, whenever a part is damaged it can be easily replaced. This can be done in a short time and at a low cost. The present invention provides easy assembly.
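The on-device/cloud split mentioned in paragraph [00044] might be arranged as in the sketch below; the routing rule, load threshold, and notification endpoint are illustrative assumptions rather than details of the disclosure.

```python
# Illustrative sketch; the routing rule and endpoint are hypothetical.
import json
import urllib.request

CLOUD_ENDPOINT = "https://example.invalid/api/notify"  # placeholder URL

def run_on_device(frames):
    return {"source": "device", "n": len(frames)}      # stub result

def run_in_cloud(frames):
    return {"source": "cloud", "n": len(frames)}       # stub result

def infer(frame_batch, device_load: float):
    # Route small batches to the on-device neural engine; offload the rest.
    if device_load < 0.8 and len(frame_batch) <= 8:
        return run_on_device(frame_batch)
    return run_in_cloud(frame_batch)

def send_notification(payload: dict) -> None:
    req = urllib.request.Request(
        CLOUD_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"})
    # urllib.request.urlopen(req)  # disabled: placeholder endpoint only
    print("would POST:", payload)
```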
[00046] While embodiments of the present invention have been illustrated and described, it will be clear that the invention is not limited to these embodiments only. Numerous modifications, changes, variations, substitutions, and equivalents will be apparent to those skilled in the art, without departing from the spirit and scope of the invention, as described in the claims.
ADVANTAGES OF THE INVENTION:
[00047] The present invention provides an AI model that uses a wearable device to see and understand human interactions with different objects in the real world.
[00048] The present invention provides a system to understand how a maid interacts with different objects while cleaning a room.
[00049] The present invention provides a system to understand how much time a user/maid/person takes to perform a task such as cleaning a bathtub, changing bed sheets, vacuuming, etc., either linearly or in random fashion.

Claims

CLAIMS
We Claim:
1. A device (100) to monitor human activity comprising: one or more processors or memories configured to: locate and monitor human activity by a Microcontroller and other sensors (101), wherein the human activity is provided by 2D information as an input; capture scenes of environment by an imaging sensor module (102) in frame-by-frame sequence within a permissible range in day and night weather conditions, wherein scenes are extracted in any environment and interpreted by vision based custom AI models; and generate and interpret the human activity status from custom report by activity metric calculation module (107) based on performance.
2. The device (100) to monitor human activity as claimed in claim 1, wherein the image sensing module (102) includes IR sensors and RGB sensors to capture user activity in frame sequence both in day and night times under all conditions of weather.
3. The device (100) to monitor human activity as claimed in claim 1, wherein the device (100) operates the imaging sensor module (102) and the Microcontroller module (101) for capturing activities as and when user movement/face is detected within the scene captured.
4. The device (100) to monitor human activity as claimed in claim 1, wherein the activity calculation module (107) evaluates the amount/quality of work done in performing each activity as a whole or as an individual.
5. The device (100) to monitor human activity as claimed in claim 1, wherein database server (108) manages individual /team activity profiles, which includes performance assessment reports and issues alerts/notifications to an individual and/or a team.
6. The device (100) to monitor human activity as claimed in claim 1, wherein a face recognition module (105) and an activity detection and tracking module (106) recognizes individuals based on activity profile stored in the database server (108).
7. A system (100) to monitor human activity comprising: one or more processors or memories configured to: locate and monitor human activity by a microcontroller module (101), wherein the human activity is provided by 2D information as an input; capture scenes of environment by an imaging sensor module (102) in frame-by-frame sequence within a permissible range in day and night weather conditions, wherein scenes are extracted in any environment and interpreted by vision based custom AI models; and generate and interpret the human activity status from custom report by activity metric calculation module (107) based on performance.
8. A method (100) to monitor human activity comprising: locating and monitoring human activity by a Microcontroller module (101), wherein the human activity is provided by 2D information as an input; capturing scenes of environment by an imaging sensor module (102) in frame-by-frame sequence within a permissible range in day and night weather conditions, wherein scenes are extracted in any environment and interpreted by vision based custom AI models; and generating and interpreting the human activity status from custom report by activity metric calculation module (107) based on performance.
PCT/IN2023/050568 2022-06-17 2023-06-15 Device system and method to monitor human activity WO2023242873A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202241014801 2022-06-17
IN202241014801 2022-06-17

Publications (1)

Publication Number Publication Date
WO2023242873A1 true WO2023242873A1 (en) 2023-12-21

Family

ID=89192564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2023/050568 WO2023242873A1 (en) 2022-06-17 2023-06-15 Device system and method to monitor human activity

Country Status (1)

Country Link
WO (1) WO2023242873A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200250774A1 (en) * 2017-09-15 2020-08-06 Smartclean Technologies, Pte. Ltd. System and method for predictive cleaning


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23823423

Country of ref document: EP

Kind code of ref document: A1