WO2022019826A1 - Scheduling system, methods of operating and forming the same - Google Patents

Scheduling system, methods of operating and forming the same

Info

Publication number
WO2022019826A1
Authority
WO
WIPO (PCT)
Prior art keywords
respective user
users
unit
officer
task
Prior art date
Application number
PCT/SG2020/050420
Other languages
French (fr)
Inventor
Mohan Kashyap PARGI
Original Assignee
Hitachi, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd. filed Critical Hitachi, Ltd.
Priority to PCT/SG2020/050420 priority Critical patent/WO2022019826A1/en
Publication of WO2022019826A1 publication Critical patent/WO2022019826A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/01Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/02Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024Detecting, measuring or recording pulse rate or heart rate
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/08Detecting, measuring or recording devices for evaluating the respiratory organs
    • A61B5/0816Measuring devices for examining respiratory frequency
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/165Evaluating the state of mind, e.g. depression, anxiety
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/68Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface

Definitions

  • Various aspects of this disclosure relate to a scheduling system. Various aspects of this disclosure relate to a method of forming a scheduling system. Various aspects of this disclosure relate to a method of operating a scheduling system.
  • a problem to be solved may be the need to design a method to identify and handle uncertain situations, such as delays in completing event tasks, and to efficiently choose which officers, workers etc. to dispatch to resolve those events.
  • the scheduling system may include a backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the scheduling system may also include a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task.
  • the scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
  • Various embodiments may provide a method of forming a scheduling system.
  • the method may include providing a backend unit of the scheduling system, the backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the method may also include providing a scheduling unit of the scheduling system, the scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task.
  • the scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
  • Various embodiments may provide a method of operating a scheduling system.
  • the method may include using a backend unit of the scheduling system to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the method may also include using a scheduling unit of the scheduling system to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotion or emotional score of the candidate when the candidate is performing the task.
  • the scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
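  • The selection and reassignment behavior described above can be summarized, purely as an illustrative sketch (the function and parameter names below are assumptions, not the patent's API): another candidate is chosen only when the elapsed time exceeds the estimated limit and the candidate's emotional score falls below the threshold.
```python
# Illustrative sketch only: reassign a task when BOTH the estimated time
# limit has been exceeded AND the current candidate's emotional score has
# dropped below a predetermined threshold; the replacement is the next
# best-ranked user by capability matrix score.

def needs_reassignment(elapsed_s: float, estimated_limit_s: float,
                       emotional_score: float, threshold: float = 0.0) -> bool:
    """True when the task should be handed to another candidate."""
    return elapsed_s > estimated_limit_s and emotional_score < threshold

def next_candidate(capability_scores: dict, exclude: set):
    """Highest capability-matrix score among users not excluded."""
    remaining = {u: s for u, s in capability_scores.items() if u not in exclude}
    return max(remaining, key=remaining.get) if remaining else None
```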
  • FIG. 1A is a schematic of a scheduling system according to various embodiments.
  • FIG. 1B is another schematic of the scheduling system according to various embodiments.
  • FIG. 2A is a schematic illustrating emotion detection and classification performed by the system according to various embodiments.
  • FIG. 2B shows the profile of a signal representing a user or officer with calm emotion according to various embodiments.
  • FIG. 2C shows the profile of a signal representing a user or officer with abnormal emotion conditions, e.g. fear or panic, according to various embodiments.
  • FIG. 3A shows a capability matrix according to various embodiments.
  • FIG. 3B shows a table showing the score values given to officers of various age groups in the capability matrix according to various embodiments.
  • FIG. 3C shows a table showing the score values given to officers of different experience in the capability matrix according to various embodiments.
  • FIG. 3D shows a table showing the score values given to senior officers and to junior officers in the capability matrix according to various embodiments.
  • FIG. 3E shows a table showing the score values given to different officers of different suitability/conformity to the tasks based on the skill sets of the officers according to various embodiments.
  • FIG. 3F shows a table showing the score values based on emotion exhibited by the officers when interacting with other people according to various embodiments.
  • FIG. 3G shows a table showing the score values based on emotion exhibited by the officers performing a task according to various embodiments.
  • FIG. 3H shows a table showing the score values based on health conditions of the officers according to various embodiments.
  • FIG. 4 shows a flow chart showing various steps involved in the scheduling system according to various embodiments.
  • FIG. 5 is a diagram illustrating processes between the data input unit / the user interface unit, the backend unit and the optimizer/scheduler unit of the scheduling system according to various embodiments.
  • FIG. 6A is a diagram illustrating processes between the system, officer 1 entity and officer 2 entity according to various embodiments.
  • FIG. 6B is a diagram illustrating processes between the system, officer 1 entity, officer 2 entity and officer 3 entity according to various embodiments.
  • FIG. 7A shows the map input page of the graphics user interface of the user interface unit according to various embodiments.
  • FIG. 7B shows the map display page of the graphics user interface of the user interface unit according to various embodiments when tasks have been assigned to the first officer and the second officer.
  • FIG. 7C shows the map display page of the graphics user interface of the user interface unit 140 according to various embodiments when the first officer and the second officer are performing the tasks.
  • FIG. 7D shows the map display page of the graphics user interface of the user interface unit according to various embodiments when the third officer is sent to perform the tasks previously assigned to the first officer.
  • FIG. 7E shows the officer management page of the graphics user interface of the user interface unit 140 according to various embodiments.
  • FIG. 7F shows the event management page of the graphics user interface of the user interface unit 140 according to various embodiments.
  • FIG. 8A is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) according to various embodiments.
  • FIG. 8B is a schematic illustrating the revised scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments.
  • FIG. 8C is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) with the estimated time limits according to various embodiments.
  • FIG. 8D is a schematic comparing the scheduled task lists and the revised task lists when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments.
  • FIG. 9 is a general illustration of a scheduling system according to various embodiments.
  • FIG. 10 is a general illustration of a method of forming a scheduling system according to various embodiments.
  • FIG. 11 is a general illustration of a method of operating a scheduling system according to various embodiments.
  • Embodiments described in the context of one of the methods or systems are analogously valid for the other methods or systems. Similarly, embodiments described in the context of a method are analogously valid for a system, and vice versa.
  • One prior reference describes creating a job list schedule for all different workers, and monitoring the stress level of workers performing tasks using wearable sensors.
  • the jobs may be reassigned manually, e.g. by a co-worker volunteering to take over the job. Based on the group activity of the workers, an efficiency index is designed and a group report is generated for that worker group.
  • such a system does not take into account the capability of the workers, e.g. experience, as well as the emotions of the workers when assigning tasks. Further, the system does not impose a constraint on the task completion time of the workers. Further, the system does not automatically identify situations such as delays and reassign tasks to other workers based on capability matrix scores.
  • Various embodiments may relate to a scheduling system. Various embodiments may seek to address one or more issues described above by creating an effective dispatch schedule of users (such as officers, workers etc.) by considering capability matrix scores determined or calculated by the scheduling system.
  • A “user” as described herein may generally be taken as any person who uses the system. A user may, for instance, be a police officer, a firefighter, or an electrical maintenance worker who uses the system to respond to an event and to undertake tasks.
  • Events may, for instance, include detecting a device failure in a building system.
  • Tasks are the individual activities carried out in relation to an event. For example, in this case, checking the condition of the device and troubleshooting it may be one task, while the next task may be changing wires and replacing the wiring of the device system and the device.
  • the tasks may be performed by different officers, workers etc.
  • Various embodiments may reduce event task times (e.g. reduce the time to complete tasks to a minimum).
  • Various embodiments may be configured to select a candidate from a plurality of users, e.g. officers, to perform a task.
  • Various embodiments may monitor event completion times and emotion exhibited by the users.
  • Various embodiments may be configured to select the next best ranked candidate based on the capability matrix scores in the event that there is a delay and the system detects negative emotions of the original candidate being assigned to perform that task.
  • the system may monitor the emotions of the users, e.g. officers, through image capture devices capturing facial expressions of the users, as well as through wearable sensors which monitor a body condition (e.g. heart beat rate, respiratory rate, skin temperature etc.) of the users.
  • the system may also monitor a time taken by the users to complete tasks.
  • the system may generate a capability matrix including emotional scores reflecting emotions of the users, as well as other information, such as age, hierarchy, experience, relevance, event task completion times and overall health condition etc. to select a candidate from the plurality of users to perform a task.
  • a capability matrix score of each user e.g. officer, may be calculated or determined based on the emotional scores as well as from the other information.
  • the system may be configured to select another candidate based on the capability matrix scores in the event that there is a delay and the system detects negative emotions (e.g. fear or extremely vulnerable mental state) of the original candidate being assigned to perform that task, so as to reduce the time to complete the task.
  • an “emotional score” of a user reflects the probabilities that the user is experiencing different emotions such as (but not limited to) happy, relaxed, angry and fearful, as determined by the scheduling system.
  • the different emotions may be different classifications as predefined in the scheduling system.
  • a high probability of positive emotions such as happy and relaxed and a low probability of negative emotions such as angry or fearful may be reflected by a positive emotional score.
  • a high probability of negative emotions and a low probability of positive emotions may be reflected by a negative emotional score.
  • the scheduling system may be configured to determine or detect different emotions by means such as detecting body conditions (e.g. heart rate, respiratory rate etc.) and/or detecting outward behaviors or expressions, such as facial expressions.
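  • As a minimal sketch (assuming, as one possibility, that the emotional score is obtained by offsetting the probabilities of negative emotions against those of positive emotions), the mapping from per-emotion probabilities to a signed score could look like this:
```python
# Hedged sketch: combine per-emotion probabilities into one signed
# emotional score. Positive emotions push the score towards +1 and
# negative emotions towards -1; the positive/negative split and the
# simple difference below are illustrative assumptions.
POSITIVE = {"happy", "relaxed"}
NEGATIVE = {"angry", "fearful"}

def emotional_score(probs: dict) -> float:
    """probs maps predefined emotion classifications to probabilities."""
    pos = sum(p for e, p in probs.items() if e in POSITIVE)
    neg = sum(p for e, p in probs.items() if e in NEGATIVE)
    return pos - neg  # roughly in [-1, 1]

print(emotional_score({"happy": 0.6, "relaxed": 0.2, "angry": 0.1, "fearful": 0.1}))  # 0.6
```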
  • FIG. 1A is a schematic of a scheduling system 100 according to various embodiments.
  • FIG. 1B is another schematic of the scheduling system 100 according to various embodiments.
  • the scheduling system 100 may be configured to dispatch users, e.g. officers or workers, in response to various events to solve issues.
  • the event may be any suitable event and the dispatching of the users may not be limited to any specific application.
  • the event may relate to fire issues, electrical issues, security issues, water issues etc.
  • the system may be used for scheduled events or issues, or for automated events/issues.
  • a scheduled event may be one event of a series of predefined or prescheduled, periodic events, e.g. a routine patrol stop of a police officer, or routine maintenance of power lines by an electrician.
  • an automated event may be a random event which can occur at any time and which is scheduled automatically by the system 100, e.g. the report of a burglary which a police officer is assigned to investigate, or report of a fire which a fire fighter is assigned to resolve.
  • the users or officers may be required to be dispatched to resolve automated events which are separate from their normal schedules.
  • the system 100 may include a data input unit 110, a backend unit 120, an optimizer /scheduler unit 130 (also referred to as a scheduling unit), and a user interface unit 140.
  • the units 110, 120, 130 and 140 may be individual computational modules/units or may be individual running process instances that are programmed in one or more processing devices such as a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a microcontroller or other devices.
  • the units 110, 120, 130 and 140 may be part of a single processing device, while in various other embodiments, the units 110, 120, 130 and/or 140 may be included in different processing devices.
  • the units 110, 120, 130 and 140 may be connected to one another.
  • the backend unit 120 may be connected to the data input unit 110, the user interface unit 140, and the optimizer/scheduler unit 130.
  • the units may be connected directly or indirectly via wired means (e.g. bus lines, fiber optics cables etc.), via wireless means (e.g. WiFi, Bluetooth etc.) or a combination of both wired and wireless means.
  • the system 100 may be a computer system in the form of a personal computer (PC) or a server computer.
  • the computer may have a running operating system (OS) on top of hardware that includes a central processing unit (CPU), a storage device, an input device (e.g. keyboard or mouse), an output device (e.g. a display), and a communication interface (e.g. an ethernet port to connect to a Local Area Network (LAN), which may in turn be connected to the Internet or a Wide Area Network (WAN) to which other related devices may also be connected).
  • the communications between units 110, 120, 130 and/or 140 may be carried out through, for instance but not limited to, an Inter-Process Communication (IPC) mechanism.
  • the data input unit block 110 may be a computational module/unit of the system 100.
  • Data input unit 110 may include an event monitoring system which contains information about the number of base stations, the number of events occurring, the number of users or officers available, the corresponding tasks required to be completed for the various events, and the estimated time to complete the tasks of the events.
  • the input information about the users or officers may inherently contain information about the location of the users or officers, and events information may inherently include information of the locations of various events.
  • the event monitoring system may collect, manage and store data about number of users/officers, events, tasks, event task times etc. at each time instance.
  • the data input unit 110 may also include a centralized commander system for monitoring a region or area using optical cameras and other sensors to trigger automated and scheduled events.
  • the data input unit 110 may also be configured to collect data from wearable sensors worn by the users.
  • the wearable sensors may, for instance, include or contain a Global Positioning System (GPS) sensor, various body sensors for monitoring the body condition of the users, and/or an image capture device, e.g. an optical camera sensor.
  • the wearable sensors may be attached to the users, for example, but not limited to, their bodies/outfits.
  • the data input unit may be configured to obtain data from the wearable sensors through a communication network, for example, but not limited to WiFi, Bluetooth etc.
  • the backend unit 120 may be another computational module/unit of the system 100 connected to the data input unit.
  • the backend unit 120 may include a data processing unit or database (DB) 122, which may be configured to receive input data such as information pertaining to the number of base stations, number of events, number of tasks in events, number of users/officers, estimated event task completion times and/or event locations from the data input unit block 110.
  • the received input data may be stored in the database block 122, which may be, for example but not limited to, PostgresDB, MongoDB, etc., in the form of one or more database tables.
  • the one or more database tables may contain information, for example, but not limited to the type of event, number of tasks in event, type of user/officer, type of base station etc.
  • the data processing unit 122 may additionally obtain information about the optimal schedules for each user/officer from the optimizer/scheduler unit 130. Once the schedule of the users or officers is generated, the routing information may be generated from the current location of the users or officers to the event locations. This information may be updated in the data processing unit 122.
  • the backend unit 120 may also be connected to the user interface unit block 140 to display different aspects related to the graphics user interface (GUI) as described below.
  • the backend unit 120 may act as an interface between the data input unit 110, the user interface unit 140 and the optimizer/scheduler unit 130.
  • the data processing unit 122 in backend unit 120 may collect the data from data input block unit 110 in the form of signals (e.g. images, videos, GPS coordinates, signals from body sensors etc.).
  • the backend unit 120 may store the information in, for example, a data storage device, e.g. a hard disk drive, or the cloud of the block unit 122.
  • the data processing unit 122 may be, for example but not limited to, an independent server system running a service process etc., which segregates the data according to different kinds of modalities and extracts features which are relevant for the emotion detection system and the event task time detection system.
  • the extracted features and relevant data may be sent to the common processing unit 121 and the emotion detection unit 123.
  • the emotion detection unit 123 may utilize deep learning or machine learning models to detect emotions of a user or officer, i.e. determine the probability that the officer or user is happy, relaxed/calm, angry, fearful etc. Once the emotion of the user or officer is detected, the information may continuously be stored in the data processing unit 122, i.e. in the one or more database tables.
  • the common processing unit 121 may perform different processes for the backend unit 120 such as starting of backend service, resetting the service and stopping of different services.
  • the common processing unit 121 may also perform the continuous monitoring of the event task times of the users or officers from the data received (via the data processing unit 122 and the data input unit 110) from the wearable sensors.
  • the optimizer/scheduler block unit 130 may continuously interact with the database block unit 122 to fetch all the information in the DB table into the optimizer/scheduler unit 130.
  • the capability matrix scores may be updated based on this information and corresponding scheduling and re-scheduling of users or officers may be performed through the optimizer/scheduler unit 130.
  • the updated information may be transmitted back to the database block unit 122.
  • the optimizer/scheduler unit 130 may include a computation unit system, which may run continuously as a software service.
  • the optimizer/scheduler unit 130 may be connected to other units 110, 140 in the system 100 through the backend system unit 120.
  • the unit 130 may be configured to reduce or minimize time taken to perform certain set of tasks based on some constraints, such as the capability matrix scores.
  • the unit 130 may use a greedy optimizer, a mathematically designed constraint optimizer etc., metaheuristic algorithms like genetic algorithms, ant colony optimization etc., and/or any other optimizer or optimization method.
  • in case of delays, i.e. when the actual time taken exceeds the estimated time limit, the system 100 may monitor the emotion of the user or officer assigned to perform the task. If the user or officer exhibits negative emotion, the unit 130 may modify the schedule using the optimizer and the information may be sent to the scheduler.
  • the scheduler may change the schedules of the users or officers based on the input from the optimizer and may send that information to the database block unit 122 in the backend block unit 120.
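  • For illustration, a greedy pass (one of the optimizer options named above; the data structures and names below are assumptions, not the patent's implementation) could assign each task to the eligible user whose queue would finish earliest, favoring higher capability matrix scores on ties:
```python
# Hedged sketch of a greedy schedule builder: longest tasks first, each
# assigned to the eligible user with the earliest projected finish time;
# ties go to the higher capability matrix score.
def greedy_schedule(tasks, users, capability_scores, est_times, min_score=0.0):
    schedule = {u: [] for u in users}
    finish = {u: 0.0 for u in users}                     # projected busy time per user
    for task in sorted(tasks, key=est_times.get, reverse=True):
        eligible = [u for u in users if capability_scores[u] >= min_score] or list(users)
        best = min(eligible,
                   key=lambda u: (finish[u] + est_times[task], -capability_scores[u]))
        schedule[best].append(task)
        finish[best] += est_times[task]
    return schedule

# Example: two officers, three event tasks with estimated completion times (minutes).
print(greedy_schedule(["T1", "T2", "T3"], ["Officer1", "Officer2"],
                      {"Officer1": 0.8, "Officer2": 0.6},
                      {"T1": 30, "T2": 20, "T3": 10}))
```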
  • the user interface unit 140 may be a Graphical User Interface (GUI) unit of a device or a computer which presents graphics to show information to a user, including text, icons, images, diagrams, charts, shapes, movie playback, etc., and which also accepts user actions input through GUI objects such as buttons, forms, text boxes, checkboxes, radio buttons, mouse movement, mouse selection, mouse release, etc. These graphics may be physically presented by an image display such as a Liquid Crystal Display (LCD) or a plasma display.
  • the device or computer may also include input devices such as keyboards, mouse, touch screen, etc. to accept inputs from the users or officers.
  • Various embodiments may include a plurality of devices or computers with each device or computer having a user interface unit 140. Different devices or computers may be accessed by different users.
  • a user who accesses the computer may be an officer or worker etc. sent to complete the task, an administrator/operator who coordinates the dispatching, or a commander etc.
  • the computer may be accessed by the officer or worker sent to complete a task, by the administrator/operator, or the commander.
  • FIG. 2A is a schematic illustrating emotion detection and classification performed by the system 100 according to various embodiments.
  • data may continuously be received in real time from the wearable sensors which are worn by the users or officers (e.g. through an Internet of Things (IOT) device interface).
  • the data may include images or videos of the users or officers captured by image capture devices and data relating to one or more body conditions (e.g. heart beat rate, perspiration rate, skin temperature, etc.) of the users or officers.
  • the data may be received by unit 110 and may be transmitted by unit 110 to unit 122.
  • the unit 122 may extract, from the images or videos, the features indicating emotions, e.g. a facial expression of the user or the officer (e.g. whether the user or officer is smiling or frowning), or behavior of the user or the officer.
  • the unit 122 i.e. the emotion detector 123, may then use machine learning (ML) /deep learning (DL) models to classify the different emotions, e.g. based on the extracted features.
  • the emotions may be determined based on one or more body conditions (e.g. heart beat rate, respiratory rate, skin temperature etc.) of the user or officer.
  • the emotions may also be determined based on the extracted features, e.g. the facial expression or behavior of the user.
  • the unit 122 may be configured to determine a score (or quotient) for each of the emotions.
  • the emotions may, for instance, have the classifications happy, relaxed/calm, angry, fearful, disgust, anxiety etc.
  • the score or quotient for each of the emotions may reflect the probability that the user or officer is having that emotion. For instance, if the score or quotient for “happy” is high, there is high likelihood that the user is happy.
  • the score or quotients for the different emotions detected for each of the user or officer may be stored in the backend unit 122.
  • the unit 122 may determine an (overall) emotional score (or valence score) of the user or officer based on the scores of the individual emotions, which in turn may be based on the one or more body conditions of the respective user or officer.
  • the unit may determine the emotional score also based on extracted features, e.g. the facial expression or behavior of the respective user or officer.
  • the unit 122 may be configured to determine an emotional score or emotion of the user or the officer.
  • the emotional score or emotion may be based on the one or more body conditions of the user or officer.
  • the emotional score or emotion may be based on the one or more body conditions of the user or officer, and the extracted features, e.g. the facial expression or behavior of the user.
  • At step 201, information may be received by data input unit 110 continuously from wearable sensors, which may, for instance, include Global Positioning System (GPS) sensors, various body sensors for monitoring the body condition of the users (e.g. heart rate sensors), and/or image capture devices, e.g. optical camera sensors.
  • the information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network which may include, but is not limited to, Bluetooth, WiFi etc.
  • the information such as data from the body sensor, and images/videos etc. may be stored by the data processing unit 122 in a local storage device, in the cloud etc. for further processing.
  • the data processing unit 122 may extract, from the images or videos, the features indicating a facial expression of the user or the officer (e.g. whether the user or officer is smiling or frowning), or behavior of the user or the officer. For instance, key points present in the face of the user or the officer as contained in an image or video frame may be identified. The key points may, for instance, be points indicating the mouth, eyes etc. The movements or displacement of the key points may be tracked or monitored, and information relating to the movements or displacement may be sent from the data processing unit 122 to the emotion detector 123. The movements or displacement may be passed through a deep neural network, for example ResNet, DenseNet etc., to extract more complex feature information.
  • the other signals from the wearable sensors may be processed to extract different signal features, for example, but not limited to, wavelet transformed signals, chirplet transformed signals, Fourier transformed signals etc.
  • Complex signal features may be generated by considering the combination of these extracted features as input to generate more discriminative features utilizing a deep learning network feature extractor.
  • the emotion detector 123 may process the different complex features related to classification of different emotions. These learnt features may be passed through a deep learning classifier such as, for example but not limited to, an ANN, ConvNet, ResNet etc., or machine learning classifiers such as, for example but not limited to, random forest classifiers, decision tree classifiers etc. These classifiers may classify the extracted emotional features into happy, relaxed/calm, angry, fearful etc.
  • the emotion detector 123 may determine the probability of different outcomes based on the body condition of the respective user or officer. In various embodiments, the emotion detector 123 may also determine the probability of different outcomes also based on the extracted features. The emotion detector 123 may be configured to determine a score (or quotient) for each of the emotions. The score or quotient for each of the emotions may reflect the probability that the user or officer is having that emotion. For instance, if the score or quotient for “happy” is high, there is high likelihood that the user is happy. The score or quotients for the different emotions detected for each of the user or officer may be stored in the backend unit 122.
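  • By way of illustration only (the feature vectors, labels and model below are synthetic placeholders, not the patent's trained model), per-emotion probabilities of the kind described above could be obtained from a random forest classifier, one of the classifier types named earlier, over the extracted features:
```python
# Hedged sketch: a random forest classifier mapping extracted features to
# per-emotion probabilities. Training data here is synthetic; real
# features would come from the wearable-sensor signals and facial key
# points described earlier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "relaxed", "angry", "fearful"]

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 16))                  # placeholder feature vectors
y_train = rng.integers(0, len(EMOTIONS), size=200)    # placeholder emotion labels

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

x_now = rng.normal(size=(1, 16))                      # features for one officer, one time step
probs = {EMOTIONS[c]: p for c, p in zip(clf.classes_, clf.predict_proba(x_now)[0])}
print(probs)  # per-emotion probabilities summing to 1
```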
  • the unit 122 may determine an (overall) emotional score (or valence score) or emotion of the user or officer based on the scores of the individual emotions.
  • the unit 122 may be configured to determine the emotional score or emotion of the user or officer based on the body condition (e.g. heat beat rate, respiratory rate, skin temperature etc.) of the respective user or officer.
  • the unit 122 may be configured to determine the emotional score or emotion of the user or officer based on the body condition and also based on the extracted features.
  • the valence score may range from -1 to 0 representing overall negative emotion, and from 0 to 1 representing overall positive emotion.
  • FIG. 2B shows the profile of a signal representing a user or officer with calm emotion according to various embodiments.
  • FIG. 2C shows the profile of a signal representing a user or officer with abnormal emotion conditions, e.g. fear or panic, according to various embodiments.
  • the signals may, for instance, represent the respiration of a user or officer.
  • FIG. 3A shows a capability matrix 300 according to various embodiments.
  • the capability matrix 300 as shown in FIG. 3A includes the capability matrix scores for a team of officers.
  • the capability matrix 300 may include information under different categories or variables, such as serial number/rank (SL. NO/Rank), age, hierarchy, experience, relevance, emotion exhibited by the officers during interaction with people, emotion expressed while performing different tasks, event task completion times, overall health condition of the officer, and overall capability matrix scores of each officer.
  • the overall capability matrix score (alternatively referred to as capability index) of each officer may be calculated based on event task completion time, emotion exhibited by the officer when performing those tasks, emotion exhibited during interaction with people, relevance, overall health condition of the officer and/or hierarchy, which may be critical variables required for the calculation of capability matrix scores.
  • the optional variables may be age and/or experience, which when specified may be used. Otherwise, the default values may be considered.
  • the capability matrix 300 may be generated by the optimizer/scheduler unit 130.
  • FIG. 3B shows a table 301 showing the score values given to officers of various age groups in the capability matrix 300 according to various embodiments.
  • Table 301 shows that an officer whose age is in the range of 20 - 40 years old may be given a score of 0.4, while an officer whose age is 40 years and above may be given a score of 0.6. When no age has been specified, the default value of 0.5 may be set.
  • FIG. 3C shows a table 302 showing the score values given to officers of different experience in the capability matrix 300 according to various embodiments.
  • Table 302 shows that an officer who has more than 5 years of experience may be given a score of 0.6, while an officer who has less than 5 years of experience may be given a score of 0.4. When no experience has been specified, the default value of 0.5 may be set.
  • FIG. 3D shows a table 303 showing the score values given to senior officers and to junior officers in the capability matrix 300 according to various embodiments.
  • Table 303 shows that a senior officer may be given a score value of 0.6, while a junior officer may be given a score value of 0.5.
  • FIG. 3E shows a table 304 showing the score values given to different officers of different suitability/conformity to the tasks based on the skill sets of the officers according to various embodiments.
  • Table 304 shows that officers with skill sets of high relevance to the task are given a score of 1, while officers with skill sets of low relevance to the task may be given a score of 0.
  • Officers with skill sets of intermediate relevance, falling between the two ends, may be given values between 0 and 1.
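  • The example score tables of FIGs. 3B to 3E can be written directly as lookup helpers (a sketch; the function and parameter names are illustrative):
```python
# Score lookups transcribing the example values of FIGs. 3B-3E.
def age_score(age=None):                 # FIG. 3B: default 0.5 when unspecified
    if age is None:
        return 0.5
    return 0.6 if age >= 40 else 0.4     # 40 and above -> 0.6, 20-40 -> 0.4

def experience_score(years=None):        # FIG. 3C: default 0.5 when unspecified
    if years is None:
        return 0.5
    return 0.6 if years > 5 else 0.4     # more than 5 years -> 0.6, otherwise 0.4

def hierarchy_score(is_senior):          # FIG. 3D: senior 0.6, junior 0.5
    return 0.6 if is_senior else 0.5

def relevance_score(skill_fit):          # FIG. 3E: 0 (low relevance) .. 1 (high relevance)
    return max(0.0, min(1.0, skill_fit))
```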
  • FIG. 3F shows a table 305 showing the score values based on emotion exhibited by the officers when interacting with other people according to various embodiments.
  • This metric may provide a measure of emotion exhibited by an officer by analyzing his behavior / facial expression using wearable sensors and/or image capture devices (e.g. cameras) worn by the officer, as well as the behavior /facial expression of the person the officer is interacting with.
  • Each row of the right column shows a valence score, which is an overall emotional score. If both the officer and the other person are determined to demonstrate positive behavior (e.g. happy, calm/relaxed), the valence score may range from 0 to 1. On the other hand, if the officer and the other person are determined to show negative behavior (e.g. angry, fearful), the valence score may range from -1 to 0.
  • the facial expression of the other person may be captured by the camera worn by the officer and sent as an input (under 201 in FIG. 2).
  • the different emotions of the officer may be determined based on one or more body conditions (such as detected heart beat rate etc.).
  • the different emotions of the officer may be determined based on the one or more body conditions, as well as based on extracted features, e.g. from the facial expression or behavior of the officer and the other person interacting with the officer.
  • the different emotions shown by the other person may be classified under 203, and the valence score for the officer may be determined.
  • FIG. 3G shows a table 306 showing the score values based on emotion exhibited by the officers performing a task according to various embodiments. If an officer is determined to demonstrate positive emotion when performing a task, the officer may be given a score from 0 to 1. On the other hand, if the officer is determined to demonstrate negative emotion when performing the task, the officer may be given a score from -1 to 0. When the system determines that the information provided is not sufficient to determine the emotion of the officer, a default value of 0 may be provided. The receiving of various data (e.g. images/videos, information on body condition etc.) as well as the determination and classification of different emotions may be carried out as described in relation to FIG. 2.
  • FIG. 3H shows a table 307 showing the score values based on health conditions of the officers according to various embodiments.
  • Various health data of the officers (e.g. body temperature, heart beat rate etc.) may be collected.
  • the backend unit 120 may receive the health data from the data input unit 110, and may make a determination on the health condition of each officer based on the health data provided. If an officer is feeling unwell, a score of -1 may be assigned. On the other hand, if the officer is feeling extremely well, a score of +1 may be assigned. A score of 0 may be the default.
  • a score value may also be provided to each officer based on information of the time taken by the officers to perform different event tasks assigned to them during a particular duration of time, in a day, in a week etc.
  • a score of the average completion time may be determined by averaging the individual scores for each task over the number of tasks ((Task1 + Task2 + ... + TaskN)/N), where N is the total number of tasks.
  • An overall capability matrix score may be determined or calculated for each user or officer based on score values under different categories or variables.
  • the different categories or variables may have different weightage.
  • the weightage of the different categories may be assigned randomly, based on past experience, or using an optimization method.
  • An advantage of using different weightages is that they may help to penalize or increase the influence of one category or variable in the outcome evaluation of the overall capability matrix score.
  • the overall capability matrix score may be determined as follows:
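  • The exact formula is not reproduced in this text; as a hedged sketch, one possible formulation is a weighted sum of the per-category scores (the weights and category names below are illustrative assumptions, not values from the original):
```python
# Assumed formulation only: overall capability matrix score as a weighted
# sum of category scores; weights may instead be set from past experience
# or by an optimization method, as noted above.
DEFAULT_WEIGHTS = {
    "age": 0.05, "experience": 0.05, "hierarchy": 0.10, "relevance": 0.25,
    "emotion_interaction": 0.15, "emotion_task": 0.15,
    "completion_time": 0.15, "health": 0.10,
}

def avg_completion_time_score(task_scores):
    # (Task1 + Task2 + ... + TaskN) / N, as described above
    return sum(task_scores) / len(task_scores)

def capability_matrix_score(category_scores, weights=DEFAULT_WEIGHTS):
    return sum(w * category_scores.get(cat, 0.0) for cat, w in weights.items())

def rank_officers(matrix_scores):
    """Officers ordered from highest to lowest capability matrix score."""
    return sorted(matrix_scores, key=matrix_scores.get, reverse=True)
```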
  • the officers may be ranked accordingly, with the officer having the highest score being the highest ranked, and the officer with the lowest score being the lowest ranked.
  • the remaining or available officers with the best capability matrix scores, i.e. the highest ranked officers among the remaining or available officers, may be dispatched to support and complete the task.
  • the capability matrix may be updated. For instance, the “actual completion time relative to estimated completion time” category may be updated based on whether the actual time taken by the officer to complete the task is less than or equal to the estimated time, and the “emotions expressed by officers when completing tasks” category may be updated based on the emotions exhibited by the officer when performing the task.
  • the assignment of new tasks may be based on the updated capability matrix.
  • FIG. 4 shows a flow chart 400 showing various steps involved in the scheduling system 100 according to various embodiments.
  • the flow chart 400 shows high-level processes of the system 100.
  • At step 401, the system 100 receives information regarding the number of users, e.g. officers (O), events (E) and tasks (T).
  • the system 100 may also receive estimated event completion times (EECT), which may be the estimated time limit to complete each task.
  • the information may be provided through input via the user interface unit 140, and/or the data input unit 110.
  • the information may be transmitted from user interface unit 140, and/or data input unit 110 to the backend unit 120.
  • the backend unit 120 may provide the information to an optimizer/scheduler unit 130 which may generate a capability matrix (e.g. the capability matrix 300 shown in FIG. 3A).
  • the optimizer/scheduler unit 130 may select a candidate to perform a task based on the overall capability matrix scores of the officers, i.e. selecting the highest ranked officer based on the overall capability matrix scores.
  • the unit 130 may seek to reduce or minimize event task time required to perform the tasks.
  • the system 100, i.e. the data processing unit 122 and/or the common processing unit 121 of the backend system 120, may start to monitor a task, e.g. monitoring the actual time taken by the candidate (i.e. the officer selected under step 402) to perform the task, upon receiving a trigger, e.g. upon receiving GPS information (via data input unit 110) that the candidate has reached the location of the event.
  • the backend unit 120 may function as a software as a service system.
  • At step 404, the data processing unit 122 and/or the common processing unit 121 of the backend system 120 may run continuously to monitor the time taken by the candidate to perform the task. The time taken may start from when the system 100 determines that the candidate has reached the location of the event, as described earlier under step 403.
  • if the actual time taken exceeds the estimated time limit, the data processing unit 122 may trigger step 405.
  • At step 406, the data processing unit 122 may determine whether the task has been completed, e.g. through input by the candidate via the user interface unit 140.
  • step 404 and step 406 may run continuously as a loop unless the data processing unit 122 determines that the actual time taken has exceeded the estimated time limit, or that the task has been completed.
  • the data processing unit 122 and/or the common processing unit 121 may together with the emotion detector 123, determine the emotion or emotional score of the candidate.
  • the emotion or emotional score of the candidate may be determined based on data signals pertaining to body conditions (e.g. heart beat rate, respiratory rate etc.).
  • the emotion or emotional score of the candidate may be determined based on the data signals pertaining to body conditions, as well as the extracted features of the facial expression or behavior of the candidate captured by the image capture device worn by the candidate. If the candidate shows negative emotions (e.g. fear, anger etc.), the data processing unit 122 may trigger the rescheduling process, i.e. step 407.
  • otherwise, the system 100 may go from step 405 back to step 406, i.e. the data processing unit 122 and/or the common processing unit 121 may continuously run through step 406 (checking whether the task has been completed), step 404 and step 405, until the system 100 determines that the task has been completed at step 406, or until the system 100 determines to trigger the rescheduling process due to the emotion of the candidate.
  • if the emotional score of the candidate falls below a predetermined threshold (e.g. 0), the data processing unit 122 and/or the common processing unit 121 may trigger step 407.
  • the data processing unit 122 may trigger the optimizer/scheduler unit 130 to update the capability matrix. For instance, the score for the “actual completion time relative to estimated completion time” category may be updated based on whether the actual time taken by the officer (the candidate, or an officer selected by the system 100 to replace him to perform the task) to complete the task is less than or equal to the estimated time, and the score for the “emotions expressed by officers when completing tasks” category may be updated based on the emotions exhibited by the officer when performing the task.
  • the data processing unit 122 may trigger the optimizer/scheduler unit 130 to update the capability matrix at regular intervals, e.g. per day, per week etc. The updated capability matrix may then be used to select the candidate for a future task.
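  • Steps 403 to 407 can be pictured as the monitoring loop sketched below (the callbacks and their names are illustrative assumptions, not the patent's API):
```python
# Hedged sketch of the monitoring loop of FIG. 4: poll until the task is
# completed; once the estimated time limit is exceeded, check the
# candidate's emotional score and trigger rescheduling (step 407) if it
# falls below the predetermined threshold.
import time

def monitor_task(task, candidate, est_limit_s, is_completed,
                 get_emotional_score, reschedule, threshold=0.0, poll_s=5.0):
    start = time.monotonic()                          # step 403: candidate reached the location
    while not is_completed(task):                     # step 406: completion check
        elapsed = time.monotonic() - start            # step 404: time monitoring
        if elapsed > est_limit_s:                     # delay detected
            if get_emotional_score(candidate) < threshold:   # step 405: emotion check
                reschedule(task, exclude=candidate)   # step 407: next-best officer takes over
                return "rescheduled"
        time.sleep(poll_s)
    return "completed"
```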
  • FIG. 5 is a diagram 500 illustrating processes between the data input unit 110/ the user interface unit 140, the backend unit 120 and the optimizer/scheduler unit 130 of the scheduling system according to various embodiments.
  • FIG. 5 illustrates function calls between the data input unit 110/ the user interface unit 140, the backend system 120, and optimizer/scheduler 130 unit.
  • the backend unit 120 may start service and may send the request to the data input unit 110/ the user interface unit 140 to receive the data.
  • the data input unit 110/ the user interface unit 140 may send an acknowledgement (acknowledging that the service is connected) to the backend unit 120.
  • the backend unit 120 may send a request to the data input unit 110/ the user interface unit 140 for information relating to the officers, the events and the tasks (officers, events, tasks list).
  • the data input unit 110/ the user interface unit 140 may in response to the request, send the information to the backend unit 120.
  • the backend unit 120 may store the information in the data processing unit 122. These processes may correspond to step 401 in FIG. 4.
  • the backend unit 120 may then send a request to the optimizer/scheduler unit 130 to generate a schedule.
  • the optimizer/scheduler unit 130 may generate a schedule based on the capability matrix 300, and may transmit the schedule to the backend unit 120. These processes may correspond to step 402 in FIG. 4.
  • the backend unit 120 may then assign tasks to the officers based on the schedule received. For instance, the backend unit 120 may through the user interface unit 140 inform individual officers on their tasks and associated information such as location. The backend unit 120 may start monitoring on whether the officers have reached the location.
  • the backend unit 120 may initiate the “Start tasks” process and may monitor the actual time being taken by the officer to perform the task (step 403).
  • the backend unit 120 may run the service, and may monitor whether the actual time taken exceeds the estimated time limit (step 404).
  • the backend unit 120 may make a function call (i.e. send a request) to the optimizer/scheduler unit 130.
  • the optimizer/scheduler unit 130 may then select another candidate for the task based on the capability matrix scores.
  • the optimizer/scheduler unit 130 may send the rescheduled task lists to the officers.
  • the backend unit 120 may continuously monitor until the backend unit 120 determines that the task has been completed (step 406). As mentioned above, the backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix.
  • the capability matrix may be updated based on information obtained from the task that has just been completed. In the next iteration, the candidate chosen to perform a task may be based on the updated capability matrix scores.
  • the schedules of the officers may then be cleared from the software service system and the information may be stored in a backup table in the data processing unit 122 of the backend unit 120.
  • the software service may then be stopped.
  • FIG. 6A is a diagram 600a illustrating processes between the system 100, officer 1 entity and officer 2 entity according to various embodiments.
  • Officer 1 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a first officer.
  • officer 2 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a second officer.
  • a set of tasks may be generated for officers based on inputs from data input unit 110 and/or user interface unit 140. The first officer and the second officer may be ranked the highest based on the capability matrix.
  • the optimizer/scheduler may generate a schedule of the two officers based on the inputs from data input unit 110 and/or user interface unit 140.
  • the task lists may be generated based on the schedule.
  • the system 100 may assign tasks to the officers, and may continuously monitor the progress of the completion of the tasks by the two officers.
  • the event task table at the data processing unit 122 of backend unit, based on the generated schedule, may continuously be updated when information is received by the backend unit 120 and the optimizer/scheduler unit 130.
  • the wearable sensor worn by the first officer may monitor the emotion of the first officer.
  • the image capture device of the wearable sensor may capture one or more images of the first officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the first officer.
  • the system 100 may determine an emotional score or emotion of the first officer, e.g. based on the body conditions of the first officer.
  • the system 100 may determine the emotional score or emotion of the first officer based on the body conditions of the first officer and the facial expression or behavior of the first officer.
  • the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the first officer.
  • the wearable sensor worn by the second officer may monitor the emotion of the second officer.
  • the image capture device of the wearable sensor may capture one or more images of the second officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the second officer.
  • the system 100 may determine an emotional score or emotion of the second officer, e.g. based on the body conditions of the second officer.
  • the system 100 may determine the emotional score or emotion of the second officer based on the body conditions of the second officer and the facial expression/behavior of the second officer.
  • the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the second officer.
  • the information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network.
  • the information may be transmitted through the cloud, or via a server.
  • the information may continuously be provided from the wearable sensors to the data input unit 110, and information may also be sent from the data input unit 110 to the wearable sensor.
  • the system 100 may send the task list to officer 1 entity based on the schedule generated.
  • the officer 1 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
  • the system 100 may send the task list to officer 2 entity based on the schedule generated.
  • the officer 2 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
  • S100, O100, S200 and O200 may occur between step 402 and step 403 shown in FIG. 4.
  • the system 100 may continuously track the position of the officer 1 entity and may check whether the first officer has reached the location of event tasks using the wearable sensor worn by the first officer.
  • the wearable sensor worn by the first officer may detect that the first officer has reached the location and officer 1 entity may send an acknowledgement to system 100 that the first officer has reached the location.
  • the system may start to monitor the actual time spent by the first officer in completing the task.
  • the system 100 may continuously track the position of the officer 2 entity and may check whether the second officer has reached the location of event tasks using the wearable sensor worn by the second officer (which may be different from the location of event tasks that the first officer is sent to).
  • the wearable sensor worn by the second officer may detect that the second officer has reached the location and officer 2 entity may send an acknowledgement to system 100 that the second officer has reached the location.
  • the system may start to monitor the actual time spent by the second officer in completing the task.
  • S300, O300, S400 and O400 may correspond to step 403 shown in FIG. 4.
  • the system 100 may continuously monitor the progress of the task.
  • the wearable sensor may detect completion of task (e.g. by officer 1 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed.
  • the first officer may through the graphics user interface of the device indicate that the task has been completed.
  • the system 100 may continuously monitor the progress of the task.
  • the wearable sensor may detect completion of task (e.g. by officer 2 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed.
  • the second officer may through the graphics user interface of the device indicate that the task has been completed.
  • S500, O500, S600 and O600 may correspond to steps 404 and 406 as illustrated in FIG. 4.
  • the backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix.
  • the capability matrix may be updated based on information obtained from the task that has just been completed.
  • the candidate chosen to perform a task may be based on the updated capability matrix scores.
  • the service may then be stopped.
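Purely as a non-limiting illustration of the FIG. 6A sequence summarised in the bullets above (task list dispatch, acknowledgement, arrival at the event location, tracking of the actual event time, and the completion acknowledgement that triggers a capability matrix update), a minimal Python sketch is given below. The OfficerEntity class, the method names and the printed messages are assumptions introduced only for illustration and do not form part of the disclosed system.

```python
# Illustrative sketch of the FIG. 6A exchange between the system and an
# officer entity: send task list -> acknowledge -> report arrival ->
# start tracking the actual event time (AET) -> report completion ->
# trigger a capability matrix update. All names are assumptions.
import time


class OfficerEntity:
    """Stands in for the wearable sensor / GUI device carried by an officer."""

    def __init__(self, name):
        self.name = name
        self.task_list = []

    def receive_task_list(self, tasks):
        self.task_list = list(tasks)
        return f"{self.name}: task list received"      # acknowledgement to the system

    def report_arrival(self):
        return f"{self.name}: reached event location"  # acknowledgement to the system

    def report_completion(self):
        return f"{self.name}: task completed"          # acknowledgement to the system


def dispatch_and_monitor(entity, tasks):
    print(entity.receive_task_list(tasks))
    print(entity.report_arrival())
    start = time.monotonic()                            # AET timer starts on arrival
    print(entity.report_completion())
    aet_seconds = time.monotonic() - start
    print(f"AET recorded: {aet_seconds:.3f} s; capability matrix update triggered")


if __name__ == "__main__":
    dispatch_and_monitor(OfficerEntity("Officer1"), ["E1T1", "E1T2"])
    dispatch_and_monitor(OfficerEntity("Officer2"), ["E2T1", "E2T2"])
```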
  • FIG. 6B is a diagram 600b illustrating processes between the system 100, officer 1 entity, officer 2 entity and officer 3 entity according to various embodiments. Similar to FIG. 6A, officer 1 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a first officer, officer 2 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a second officer, while officer 3 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a third officer.
  • a set of tasks may be generated for officers based on inputs from data input unit 110 and/or user interface unit 140.
  • the first officer and the second officer may be ranked the highest based on the capability matrix.
  • the third officer may be ranked the third highest.
  • the system 100 may assign tasks to the first officer and the second officer, and may continuously monitor the progress of the completion of the tasks by the two officers.
  • the third officer may remain in base as a standby.
  • each officer may wear a wearable sensor for monitoring the emotion of the respective officer.
  • the image capture device of the wearable sensor may capture one or more images of the respective officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the respective officer.
  • the system 100 may determine an emotional score or emotion of the respective officer based on the body conditions of the respective officer.
  • the system 100 may determine the emotional score or emotion of the respective officer based on the facial expression/behavior of the respective officer and the body conditions of the respective officer. Further, the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the respective officer.
  • a set of tasks may be generated for officers based on inputs from data input unit 110 and/or user interface unit 140.
  • the first officer and the second officer may be ranked the highest based on the capability matrix.
  • the optimizer/scheduler unit 130 may generate a schedule of the two officers based on the inputs from data input unit 110 and/or user interface unit 140.
  • the system 100 may assign tasks to the officers, and may continuously monitor the progress of the completion of the tasks by the two officers.
  • the task lists may be generated based on the schedule.
  • the event task table at the data processing unit 122 of the backend unit 120 may, based on the generated schedule, be continuously updated when information is received by the backend unit 120 and the optimizer/scheduler unit 130.
  • the information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network.
  • the information may be transmitted through the cloud, or via a server.
  • the information may continuously be provided from the wearable sensors to the data input unit 110, and information may also be sent from the data input unit 110 to the wearable sensor.
  • the system 100 may send the task list to officer 1 entity based on the schedule generated.
  • the officer 1 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
  • the system 100 may send the task list to officer 2 entity based on the schedule generated.
  • the officer 2 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
  • A100, B100, A200 and B200 may occur between step 402 and step 403 shown in FIG. 4.
  • the system 100 may continuously track the position of the officer 1 entity and may check whether the first officer has reached the location of event tasks using the wearable sensor worn by the first officer.
  • the wearable sensor worn by the first officer may detect that the first officer has reached the location and officer 1 entity may send an acknowledgement to system 100 that the first officer has reached the location.
  • the system may start to monitor the actual time spent by the first officer in completing the task.
  • the system 100 may continuously track the position of the officer 2 entity and may check whether the second officer has reached the location of event tasks using the wearable sensor worn by the second officer (which may be different from the location of event tasks that the first officer is sent to).
  • the wearable sensor worn by the second officer may detect that the second officer has reached the location and officer 2 entity may send an acknowledgement to system 100 that the second officer has reached the location.
  • the system may start to monitor the actual time spent by the second officer in completing the task.
  • S300, O300, S400 and O400 may correspond to step 403 shown in FIG. 4.
  • the system 100 may continuously monitor the progress of the task.
  • the wearable sensor may detect that the task is still in progress (e.g. by officer 1 entity still at location or through image capture device) and may send an acknowledgement indicating that the task is still in progress.
  • the system may determine that the actual time spent has exceeded the estimated time limit, i.e. delayed.
  • the wearable sensor may transmit information (e.g. images of the first officer, signals indicating body conditions of the first officer) which indicate negative emotion of the first officer.
  • the emotion detector 123 of the backend unit 120 may determine that the emotion score of the first officer has fallen below a predetermined threshold (e.g. 0).
  • A500 may correspond to step 404 shown in FIG. 4 while B500 may correspond to step 405 shown in FIG. 4.
  • the system 100 may continuously monitor the progress of the task.
  • the wearable sensor may detect completion of task (e.g. by officer 2 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed.
  • the second officer may through the graphics user interface of the device indicate that the task has been completed.
  • A600 and B600 may correspond to steps 404 and 406 as illustrated in FIG. 4.
  • the system 100 may select another candidate for the task.
  • the third officer, being the next highest ranked in the capability matrix, may be selected (step 402).
  • the task list of the third officer may also be updated.
  • the system may send information to officer 1 entity to inform the first officer to terminate the task as the officer has exhibited negative emotions while performing the set of tasks.
  • the information may, for instance, be conveyed via the graphics user interface of the device.
  • officer 1 entity may acknowledge that it has received the information.
  • A750 and B750 may occur between step 402 and step 403.
  • the system 100 may send the task list to officer 3 entity based on the updated schedule generated by the optimizer/scheduler unit 130.
  • the officer 3 entity sends an acknowledgement that the task list has been received.
  • the system 100 may continuously track the position of the officer 3 entity and may check whether the third officer has reached the location of event tasks using the wearable sensor worn by the third officer.
  • the wearable sensor worn by the third officer may detect that the third officer has reached the location and officer 3 entity may send an acknowledgement to system 100 that the third officer has reached the location.
  • the system may start to monitor the actual time spent by the third officer in completing the task.
  • A900 and B900 may correspond to step 403 shown in FIG. 4.
  • the system 100 may continuously monitor the progress of the task that the third officer is assigned to.
  • the wearable sensor may detect completion of task (e.g. by officer 3 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed.
  • the third officer may through the graphics user interface of the device indicate that the task has been completed.
  • A1000 and B1000 may correspond to steps 404 and 406 as illustrated in FIG. 4.
  • the backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix.
  • the capability matrix may be updated based on information obtained from the task that has just been completed. In the next iteration, the candidate chosen to perform a task may be based on the updated capability matrix scores.
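As a non-limiting sketch of the reassignment behaviour illustrated by FIG. 6B (a delayed task combined with a negative emotional score leads to the next-ranked officer taking over the task), consider the following Python fragment. The function names, data structures and numeric values are assumptions for illustration only.

```python
# Sketch of the FIG. 6B reassignment rule: if the actual event time (AET)
# exceeds the estimated event time (EET) and the assigned officer's
# emotional score has fallen below the threshold, hand the task to the
# next-ranked available officer. All names and values are assumptions.


def should_reassign(aet_minutes, eet_minutes, emotion_score, threshold=0.0):
    return aet_minutes > eet_minutes and emotion_score < threshold


def next_candidate(ranked_officers, unavailable):
    """Pick the highest-ranked officer who is neither assigned nor released."""
    for officer in ranked_officers:
        if officer not in unavailable:
            return officer
    return None


if __name__ == "__main__":
    ranked = ["Officer1", "Officer2", "Officer3"]      # capability-matrix ranking
    assigned = {"Officer1": "E1T2", "Officer2": "E2T1"}
    if should_reassign(aet_minutes=70, eet_minutes=20, emotion_score=-0.6):
        released = "Officer1"                          # told to terminate the task
        task = assigned.pop(released)
        replacement = next_candidate(ranked, set(assigned) | {released})
        assigned[replacement] = task                   # e.g. Officer3 takes over
    print(assigned)
```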
  • FIG. 7A shows the map input page 710 of the graphics user interface 700 of the user interface unit 140 according to various embodiments.
  • the map input page 710 may contain fields to accept the number of officers, event-tasks, base stations and event task time as input.
  • the map input page 710 may include a display of a map, for instance, a part of the city, or the country. Additionally, the map input page 710 may include three user interface (UI) blocks: base station creator 711, officer creator 712, and event creator 713.
  • the graphics user interface 700 may also include the map display page 720, the officer management page 730, and the event management page 740.
  • the graphics user interface 700 may, for instance, be hosted in a personal computer, or may be hosted on a cloud/standalone server system, and may be accessed by the officer or worker sent to complete a task, by the administrator/operator, or by the commander etc.
  • FIG. 7B shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when tasks have been assigned to the first officer and the second officer.
  • the map display page 720 may display the information about the current location of the base stations, officers and the event locations as provided to the map input page 710.
  • the information provided to the map input page 710 may be processed by the backend unit 120 and optimizer/scheduler unit 130 to generate the task lists for the officers. Further, the map display page 720 may also display the route in which the officers have traversed to the event locations.
  • the map display page 720 may continuously display information in the form of UI blocks about, for example but not limited to, the location of the officers, the officers’ task lists, the location of the base station, the event locations, actual event times (AET) (the actual time taken to complete tasks), estimated event times (EET) (the estimated time taken to complete tasks), emotion information of the officers etc.
  • UI block 721 may be a part of the map display page and may contain information about the number of officers at the base station, and an indication of the officers with the assigned tasks.
  • UI block 722 may contain routing information provided to the officers to navigate to the event locations and also the time estimated to resolve those events.
  • the first officer (Officer1) assigned to perform tasks at event location 1 (E1T1 and E1T2) may be shown the best route to event location 1.
  • the second officer (Officer2) assigned to perform tasks at event location 2 (E2T1 and E2T2) may be shown the best route to event location 2.
  • FIG. 7C shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when the first officer and the second officer are performing the tasks.
  • the first officer (Officer1) and the second officer (Officer2) may have reached their event locations respectively, and may have started performing the tasks.
  • the third officer (Officer3) and the fourth officer (Officer4) may remain at the base station on standby to be assigned.
  • UI block 723 may provide information about the third officer (Officer3) and the fourth officer (Officer4) present at the base station.
  • the standby officers may support existing events or may be assigned to perform tasks at new events when these events occur.
  • UI block 724 shows that the first officer (Officer1) and the second officer (Officer2) have reached the event locations.
  • the wearable sensor may send an acknowledgement to the backend unit 120 to trigger the timer for tracking the actual time spent (AET) by the officer to complete a task.
  • the actual time spent by the officer (AET) as well as the emotion of the officer may be continuously tracked via the wearable sensor/image capture device as the officer performs the assigned tasks.
  • the estimated event times (EET) may be previously provided to the system 100 by a user such as an operator.
  • the actual time (AET) spent by the first officer (Officer1) has exceeded the estimated time (EET).
  • the system 100 detects that the first officer is fearful. The system 100 may then be triggered to assign another candidate to perform the task that has been assigned to the first officer, since the first officer may no longer be able to perform the task. The emotional conditions of the officers may be displayed as bar graphs or charts etc.
  • the system 100 may determine that the second officer (Officer 2) is performing tasks on time, as shown in FIG. 7C.
  • FIG. 7D shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when the third officer is sent to perform the tasks previously assigned to the first officer.
  • the first officer (Officer1) is sent back to the base station.
  • UI block 724’ shows the third officer (Officer3) resolving the delay.
  • FIG. 7E shows the officer management page 730 of the graphics user interface 700 of the user interface unit 140 according to various embodiments.
  • the officer management page 730 may include information about each officer, the progress of the current set of event tasks they are performing and also the capability matrix scores of each of the officers together with their rankings.
  • UI block 731 shows information about the available officers, e.g. name, role, type (e.g. fire, water, electrical, security etc.), and the associated base station. Clicking on a row associated with an officer in UI block 731 may cause the GUI to display UI block 732, which shows detailed information pertaining to the officer.
  • the information may include, for instance, the skill sets, employee’s identification (ID) etc.
  • the task list of the officer may also be displayed in UI block 732, with indications showing whether the tasks are completed, in progress, or to be done.
  • UI block 733 shows the capability matrix scores of the officers, as well as their ranks with respect to the capability matrix scores.
  • FIG. 7F shows the event management page 740 of the graphics user interface 700 of the user interface unit 140 according to various embodiments.
  • the event management page 740 may display the progress of different events and tasks, and may contain information about estimated and actual task completion times, the type of event the tasks are associated with, the priority of the event, as well as the event category (whether the event is a scheduled event or an automated event).
  • the event management page 740 may contain a UI block 741 of a table showing columns such as reference number (no.), priority, type, category, location, date/time and progress.
  • the “priority” column may indicate the priority level of each event, and may be indicated by different icons representing different priority levels.
  • the “type” column may indicate the type of event, e.g. fire, water, security etc., which may be represented by different icons.
  • the “category” column may indicate whether the event is a scheduled event or an automated event.
  • the “location” column may indicate the locations of the various events, the “date/time” column may indicate the date and time at which the event is reported, and the “progress” column may indicate the progress of resolving the event.
  • Clicking on a row associated with an event may trigger UI block 742 to be displayed.
  • UI block 742 may show detailed information of the event, such as the number of tasks and the status of each task. The estimated time to complete each task is also indicated. When the task is completed, the actual time to complete the task is also shown in block 742.
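To make the layout of the event management table (UI block 741) concrete, the short sketch below models one table row using the columns listed above; the field types and example values are assumptions introduced only for illustration.

```python
# Illustrative model of one row of the event management table (UI block 741).
# Column names follow the description above; the types are assumptions.
from dataclasses import dataclass


@dataclass
class EventRow:
    no: int            # reference number
    priority: str      # e.g. "high", "medium", "low"
    type: str          # e.g. "fire", "water", "security"
    category: str      # "scheduled" or "automated"
    location: str
    date_time: str     # date/time at which the event is reported
    progress: str      # e.g. "to do", "in progress", "completed"


if __name__ == "__main__":
    row = EventRow(no=1, priority="high", type="fire", category="automated",
                   location="Event location 1", date_time="2020-07-20 10:15",
                   progress="in progress")
    print(row)
```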
  • FIG. 8A is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) according to various embodiments. As shown, the first officer is assigned the tasks E1T1 and E1T2, while the second officer is assigned the tasks E2T1 and E2T2.
  • FIG. 8B is a schematic illustrating the revised scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments. As shown in FIG. 8B, the task E1T2 may be reallocated from the first officer to the second officer.
  • FIG. 8C is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) with the estimated time limits according to various embodiments.
  • FIG. 8D is a schematic comparing the scheduled task lists and the revised task lists when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments. As shown in FIG. 8D, the first officer spends 70 minutes on E1T1, which is longer than the estimated time limit of 20 minutes. The second officer is able to complete his tasks (E2T1 and E2T2) within the estimated time limits and is re-assigned to complete E1T2, to which the first officer was originally assigned.
  • FIG. 9 is a general illustration of a scheduling system 900 according to various embodiments.
  • the scheduling system 900 may include a backend unit 920 configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit 920 determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the scheduling system 900 may additionally include a scheduling unit 930 (also referred to as an optimizer/scheduler unit) configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit 920 may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task.
  • the backend unit 920 may be configured to select another candidate for the task upon the scheduling system 900 determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
  • the scheduling system 900 may further include a data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users.
  • the backend unit 920 may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
  • the body condition of the respective user may be, for instance, a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user, or a skin temperature of the respective user.
  • the one or more data signals indicating the body condition of the respective user may be determined by a wearable sensor, i.e. one or more body sensors included in the wearable sensor.
  • the data input unit may be configured to receive one or more images of the respective user of the plurality of users.
  • the backend unit 920 may be configured to extract features indicating a facial expression or behavior of the respective user, and may be further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
  • the one or more images may, for instance, be still images or a video feed of the respective user captured by an image capture device, e.g. a camera or a video camera.
  • the image capture device may be attached to or worn by the respective user.
  • the image capture device may be attached to clothing of the respective user.
  • the wearable sensor may include the one or more body sensors, and the image capture device. In various other embodiments, the wearable sensor and the image capture device may be separate devices.
  • the scheduling system 900 may further include a user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users.
  • the scheduling unit 930 may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
  • the scheduling unit 930 may be configured to select the candidate based on the information under the one or more different categories received by the user interface unit, as well as from the one or more data signals received by the data input unit.
  • the scheduling unit 930 may be configured to select the candidate based on the information under the one or more different categories received by the user interface unit, as well as from the one or more data signals received by the data input unit and from the one or more images.
  • the one or more different categories may be an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and/or an overall health score of the respective user.
  • the emotional score or emotion of the respective user may also be based on the body condition of one or more persons interacting with the respective user. In various embodiments, the emotional score or emotion of the respective user may also be based on the extracted features and the body condition of the one or more persons interacting with the respective user.
  • the user interface unit may be configured to display a graphics user interface (GUI) including a map input page, a map display page, an event management page for displaying the task, and an officer management page.
  • one category of the one or more different categories may be assigned a weightage different from a weightage assigned to another category of the one or more different categories.
  • the scheduling unit 930 may be configured to generate a capability matrix including the plurality of emotional scores or emotions pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users.
  • the capability matrix may also include scores determined by whether the different users of the plurality of users are able to complete previous tasks within predefined estimated time limits.
  • the capability matrix may be updated at regular time intervals. For instance, the capability matrix may be updated once a day, once a month, or once a year. The capability matrix may be updated based on the plurality of emotional scores pertaining to the different users of the plurality of users during previous tasks as well as based on scores determined by whether the different users of the plurality of users are able to complete previous tasks within predefined estimated time limits.
  • the other candidate may be selected from the remaining users of the plurality of users based on the emotional scores or emotions as well as information received under the one or more different categories in relation to the remaining users.
  • the backend unit 920 may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the body condition of the respective user.
  • the backend unit 920 may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression/behavior of the respective user, and based on the body condition of the respective user.
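Purely by way of illustration of the probability-based determination described in the preceding bullets, the sketch below combines the four emotion probabilities into a single signed emotional score and compares it against a predetermined threshold such as 0. The specific combination rule (positive emotions add, negative emotions subtract) is one possible interpretation and is an assumption made only for this sketch.

```python
# Illustrative sketch only: one possible way to collapse the four emotion
# probabilities (happy, relaxed, angry, fearful) into a signed score.


def emotional_score(p_happy, p_relaxed, p_angry, p_fearful):
    """Positive emotions raise the score; negative emotions lower it."""
    return (p_happy + p_relaxed) - (p_angry + p_fearful)


def below_threshold(score, threshold=0.0):
    """True if the emotional score has fallen below the predetermined threshold."""
    return score < threshold


if __name__ == "__main__":
    # Example probabilities as they might be reported for a respective user.
    score = emotional_score(p_happy=0.1, p_relaxed=0.1, p_angry=0.3, p_fearful=0.5)
    print(round(score, 2), below_threshold(score))   # -0.6 True
```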
  • FIG. 10 is a general illustration of a method 1000 of forming a scheduling system according to various embodiments.
  • the method may also include, in 1020, providing a backend unit of the scheduling system, the backend unit further configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the method may further include, in 1030, providing a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task.
  • the scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
  • the method may include providing a data input unit of the scheduling system, the data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users.
  • the backend unit may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
  • the data input unit may be configured to receive one or more images of the respective user of the plurality of users.
  • the backend unit may be configured to extract features indicating a facial expression/behavior of the respective user, and further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
  • the method may additionally include providing a user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users.
  • the scheduling unit may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
  • the one or more different categories may be an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and an overall health score of the respective user.
  • the user interface unit may be configured to display a graphics user interface (GUI) comprising a map input page, a map display page, an event management page for displaying the task, and an officer management page.
  • one category of the one or more different categories may be assigned a weightage different from a weightage assigned to another category of the one or more different categories.
  • the scheduling unit may be configured to generate a capability matrix comprising the plurality of emotional scores pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users.
  • the capability matrix may be updated at regular time intervals.
  • the body condition of the respective user may be a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user or a skin temperature of the respective user.
  • the backend unit may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the body condition of the respective user.
  • the backend unit may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression/behavior of the respective user, and based on the body condition of the respective user.
  • the one or more images of the respective user may be still images or a video of the respective user captured by an image capture device.
  • the one or more data signals indicating the body condition of the respective user may be determined by a wearable sensor.
  • FIG. 11 is a general illustration of a method 1100 of operating a scheduling system according to various embodiments.
  • the method may include, in 1120, using a backend unit of the scheduling system to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the method may also include, in 1130, using a scheduling unit of the scheduling system to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.
  • the backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task.
  • the scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
  • the method may include using a data input unit of the scheduling system to receive one or more images and one or more data signals indicating a body condition of a respective user of a plurality of users.
  • the backend unit may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
  • the data input unit may be configured to receive one or more images of the respective user of the plurality of users.
  • the backend unit may be configured to extract features indicating a facial expression or behavior of the respective user, and further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
  • the method may also include using a user interface unit to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users.
  • the scheduling unit may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.

Abstract

Various embodiments may provide a scheduling system. The scheduling system may include a backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The scheduling system may also include a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users.

Description

SCHEDULING SYSTEM, METHODS OF OPERATING AND FORMING THE SAME
TECHNICAL FIELD
[0001] Various aspects of this disclosure relate to a scheduling system. Various aspects of this disclosure relate to a method of forming a scheduling system. Various aspects of this disclosure relate to a method of operating a scheduling system.
BACKGROUND
[0002] Recent advancements in the development of the smart city concept require efficient systems which can be used to manage critical infrastructure effectively. The systems may allow for continuous monitoring of different activities, such as identifying and tracking a suspicious person, or detecting a device failure in building management systems. Further, in order to resolve these issues quickly after such events have been detected, the design of systems to enable effective dispatch of officers, workers etc. is required.
[0003] Additionally, certain tasks which are repetitive in nature, such as cleaning a place, replacing some objects etc. (i.e. scheduled events), also require the effective dispatch of officers or workers to resolve them.
[0004] The effective design of such a system is complex, especially in scenarios involving the dispatching of different officers, workers etc. to resolve different events and their corresponding tasks, as this would involve creating and optimizing their schedules. The smart city system would need to create an effective dispatch schedule of the officers, workers etc. so they can respond to and resolve the event tasks with a minimal amount of time and cost. However, due to real world uncertainties such as delays, the prioritizing of different event tasks in relation to different events, the anticipation of event tasks, and the choosing of the right set of officers, workers etc. to perform these tasks become more challenging.
[0005] A problem to be solved may be the need to design a method to identify and handle uncertain situations, such as delays in completing event tasks, and to efficiently choose which officers, workers etc. to dispatch to resolve those events.
SUMMARY
[0006] Various embodiments may provide a scheduling system. The scheduling system may include a backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The scheduling system may also include a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task. The scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
[0007] Various embodiments may provide a method of forming a scheduling system. The method may include providing a backend unit of the scheduling system, the backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The method may also include providing a scheduling unit of the scheduling system, the scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task. The scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
[0008] Various embodiments may provide a method of operating a scheduling system. The method may include using a backend unit of the scheduling system to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The method may also include using a scheduling unit of the scheduling system to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotion or emotional score of the candidate when the candidate is performing the task. The scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] The invention will be better understood with reference to the detailed description when considered in conjunction with the non-limiting examples and the accompanying drawings, in which:
FIG. 1A is a schematic of a scheduling system according to various embodiments.
FIG. 1B is another schematic of the scheduling system according to various embodiments.
FIG. 2A is a schematic illustrating emotion detection and classification performed by the system according to various embodiments.
FIG. 2B shows the profile of a signal representing a user or officer with calm emotion according to various embodiments.
FIG. 2C shows the profile of a signal representing a user or officer with abnormal emotion conditions, e.g. fear or panic, according to various embodiments.
FIG. 3 A shows a capability matrix according to various embodiments.
FIG. 3B shows a table showing the score values given to officers of various age groups in the capability matrix according to various embodiments.
FIG. 3C shows a table showing the score values given to officers of different experience in the capability matrix according to various embodiments.
FIG. 3D shows a table showing the score values given to senior officers and to junior officers in the capability matrix according to various embodiments.
FIG. 3E shows a table showing the score values given to different officers of different suitability/conformity to the tasks based on the skill sets of the officers according to various embodiments.
FIG. 3F shows a table showing the score values based on emotion exhibited by the officers when interacting with other people according to various embodiments.
FIG. 3G shows a table showing the score values based on emotion exhibited by the officers performing a task according to various embodiments.
FIG. 3H shows a table showing the score values based on health conditions of the officers according to various embodiments.
FIG. 4 shows a flow chart showing various steps involved in the scheduling system according to various embodiments.
FIG. 5 is a diagram illustrating processes between the data input unit / the user interface unit, the backend unit and the optimizer/scheduler unit of the scheduling system according to various embodiments.
FIG. 6A is a diagram illustrating processes between the system, officer 1 entity and officer 2 entity according to various embodiments.
FIG. 6B is a diagram illustrating processes between the system, officer 1 entity, officer 2 entity and officer 3 entity according to various embodiments.
FIG. 7A shows the map input page of the graphics user interface of the user interface unit according to various embodiments.
FIG. 7B shows the map display page of the graphics user interface of the user interface unit according to various embodiments when tasks have been assigned to the first officer and the second officer.
FIG. 7C shows the map display page of the graphics user interface of the user interface unit 140 according to various embodiments when the first officer and the second officer are performing the tasks.
FIG. 7D shows the map display page of the graphics user interface of the user interface unit according to various embodiments when the third officer is sent to perform the tasks previously assigned to the first officer.
FIG. 7E shows the officer management page of the graphics user interface of the user interface unit 140 according to various embodiments.
FIG. 7F shows the event management page of the graphics user interface of the user interface unit 140 according to various embodiments.
FIG. 8A is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) according to various embodiments.
FIG. 8B is a schematic illustrating the revised scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments.
FIG. 8C is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) with the estimated time limits according to various embodiments.
FIG. 8D is a schematic comparing the scheduled task lists and the revised task lists when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments.
FIG. 9 is a general illustration of a scheduling system according to various embodiments.
FIG. 10 is a general illustration of a method of forming a scheduling system according to various embodiments.
FIG. 11 is a general illustration of a method of operating a scheduling system according to various embodiments.
DETAILED DESCRIPTION
[0010] The following detailed description refers to the accompanying drawings that show, by way of illustration, specific details and embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention. Other embodiments may be utilized and structural, and logical changes may be made without departing from the scope of the invention. The various embodiments are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments.
[0011] Embodiments described in the context of one of the methods or systems is analogously valid for the other methods or systems. Similarly, embodiments described in the context of a method are analogously valid for a system, and vice versa.
[0012] Features that are described in the context of an embodiment may correspondingly be applicable to the same or similar features in the other embodiments. Features that are described in the context of an embodiment may correspondingly be applicable to the other embodiments, even if not explicitly described in these other embodiments. Furthermore, additions and/or combinations and/or alternatives as described for a feature in the context of an embodiment may correspondingly be applicable to the same or similar feature in the other embodiments.
[0013] In the context of various embodiments, the articles “a”, “an” and “the” as used with regard to a feature or element include a reference to one or more of the features or elements.
[0014] In the context of various embodiments, the term “about” or “approximately” as applied to a numeric value encompasses the exact value and a reasonable variance.
[0015] As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0016] One prior reference describes creating a job list schedule for all different workers, and monitoring the stress level of workers performing tasks using wearable sensors. In the case of situations such as a delay in task completion, the jobs may be reassigned manually, e.g. by a co-worker volunteering to take over the job. Based on the group activity of the workers, an efficiency index is designed and a group report is generated for that worker group. However, such a system does not take into account the capability of the workers, e.g. experience, or the emotions of the workers when assigning tasks. Further, the system does not impose a constraint on the task completion time of the workers. Further, the system does not automatically identify situations such as delays and reassign tasks to other workers based on capability matrix scores.
[0017] Various embodiments may relate to a scheduling system. Various embodiments may seek to address one or more issues described above by creating an effective dispatch schedule of users (such as officers, workers etc.) by considering capability matrix scores determined or calculated by the scheduling system. A “user” as described herein may generally be taken as any person that uses the system. A user may, for instance, be a police officer, a fire fighter, or an electrical maintenance worker, who uses the system to respond to an event and to undertake tasks.
[0018] Events may, for instance, include detecting a device failure in a building system. Tasks are the component activities performed in relation to an event. For example, in this case, checking the condition of the device and troubleshooting it may be one task, and the next task may be changing the wires and replacing the wiring of the device system and the device. The tasks may be performed by different officers, workers etc.
[0019] Various embodiments may reduce event-task times (e.g. reduce the time to complete tasks to a minimum). Various embodiments may be configured to select a candidate from a plurality of users, e.g. officers, to perform a task. Various embodiments may monitor event completion times and emotion exhibited by the users. Various embodiments may be configured to select the next best ranked candidate based on the capability matrix scores in the event that there is a delay and the system detects negative emotions of the candidate originally assigned to perform that task.
[0020] The system may monitor the emotions of the users, e.g. officers, through image capture devices capturing facial expressions of the users, as well as through wearable sensors which monitor a body condition (e.g. heart beat rate, respiratory rate, skin temperature etc.) of the users. The system may also monitor a time taken by the users to complete tasks.
[0021] The system may generate a capability matrix including emotional scores reflecting emotions of the users, as well as other information, such as age, hierarchy, experience, relevance, event task completion times and overall health condition etc., to select a candidate from the plurality of users to perform a task. A capability matrix score of each user, e.g. officer, may be calculated or determined based on the emotional scores as well as from the other information. The system may be configured to select another candidate based on the capability matrix scores in the event that there is a delay and the system detects negative emotions (e.g. fear or an extremely vulnerable mental state) of the candidate originally assigned to perform that task, so as to reduce the time to complete the task.
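As one non-limiting illustration of how a capability matrix score might be calculated from the emotional scores and the other categories of information mentioned in paragraph [0021], the following Python sketch computes a weighted score per officer and ranks the officers. The category names, the per-category scores and the weightages are placeholder assumptions, not values prescribed by this disclosure.

```python
# Minimal sketch: weighted capability matrix scoring and ranking.
# Category names, per-category scores and weightages are assumptions only.

CATEGORY_WEIGHTS = {
    "age": 0.10,
    "hierarchy": 0.10,
    "experience": 0.20,
    "skill_relevance": 0.25,
    "task_completion": 0.20,
    "health": 0.05,
    "emotion": 0.10,
}


def capability_score(category_scores):
    """Weighted sum of per-category scores for one officer."""
    return sum(CATEGORY_WEIGHTS[c] * s for c, s in category_scores.items())


def rank_officers(capability_matrix):
    """Return officer names ordered from highest to lowest capability score."""
    return sorted(capability_matrix,
                  key=lambda name: capability_score(capability_matrix[name]),
                  reverse=True)


if __name__ == "__main__":
    capability_matrix = {
        "Officer1": {"age": 3, "hierarchy": 2, "experience": 4,
                     "skill_relevance": 5, "task_completion": 4,
                     "health": 3, "emotion": 4},
        "Officer2": {"age": 4, "hierarchy": 3, "experience": 3,
                     "skill_relevance": 4, "task_completion": 5,
                     "health": 4, "emotion": 3},
        "Officer3": {"age": 2, "hierarchy": 1, "experience": 2,
                     "skill_relevance": 3, "task_completion": 3,
                     "health": 5, "emotion": 5},
    }
    for name in rank_officers(capability_matrix):
        print(name, round(capability_score(capability_matrix[name]), 2))
```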
[0022] In the current context, an “emotional score” of a user reflects the probabilities that the user is experiencing different emotions such as (but not limited to) happy, relaxed, angry and fearful, as determined by the scheduling system. The different emotions may be different classifications as predefined in the scheduling system. A high probability of positive emotions such as happy and relaxed, and a low probability of negative emotions such as angry or fearful, may be reflected by a positive emotional score. On the other hand, a high probability of negative emotions and a low probability of positive emotions may be reflected by a negative emotional score. The scheduling system may be configured to determine or detect different emotions by means such as detecting body conditions (e.g. heart rate, respiratory rate etc.) and/or detecting outward behaviors or expressions, such as facial expressions.
[0023] FIG. 1A is a schematic of a scheduling system 100 according to various embodiments. FIG. IB is another schematic of the scheduling system 100 according to various embodiments. The scheduling system 100 may be configured to dispatch users, e.g. officers or workers, in response to various events to solve issues. The event may be any suitable event and the dispatching of the users may not be limited to any specific application. In various embodiments, the event may relate to fire issues, electrical issues, security issues, water issues etc. The system may be used for scheduled events or issues, or for automated events/issues. A scheduled event may be one event of a series of predefined or prescheduled, periodic events, e.g. a routine patrol stop of a police officer, or routine maintenance of power lines by an electrician. On the other hand, an automated event may be a random event which can occur at any time and which is scheduled automatically by the system 100, e.g. the report of a burglary which a police officer is assigned to investigate, or report of a fire which a fire fighter is assigned to resolve. The users or officers may be required to be dispatched to resolve automated events which are separate from their normal schedules.
[0024] The system 100 may include a data input unit 110, a backend unit 120, an optimizer /scheduler unit 130 (also referred to as a scheduling unit), and a user interface unit 140.
[0025] The units 110, 120, 130 and 140 may be individual computational modules/units or may be individual running process instances that are programmed in one or more processing devices such as a Central Processing Unit (CPU), a Graphical Processing Unit (GPU), a microcontroller or other devices. In various embodiments, the units 110, 120, 130 and 140 may be part of a single processing device, while in various other embodiments, the units 110, 120, 130 and/or 140 may be included in different processing devices. The units 110, 120, 130 and 140 may be connected to one another. For instance, as shown in FIG. 1A, the backend unit 120 may be connected to the data input unit 110, the user interface unit 140, and the optimizer/scheduler unit 130. The units may be connected directly or indirectly via wired means (e.g. bus lines, fiber optic cables etc.), via wireless means (e.g. WiFi, Bluetooth etc.) or a combination of both wired and wireless means.
[0026] In various embodiments, the system 100 may be a computer system in the form of a personal computer (PC) or a server computer. The computer may have a running operating system (OS) on top of hardware that includes a central processing unit (CPU), a storage device, an input device (e.g. keyboard or mouse), an output device (e.g. a display), and a communication interface (e.g. an ethernet port to connect to a Local Area Network (LAN), which may be connected to the Internet or a Wide Area Network (WAN), to which other related devices may also be connected). The communications between units 110, 120, 130 and/or 140 may be carried out using, for instance but not limited to, an Inter-Process Communication (IPC) mechanism.
[0027] As mentioned above, the data input unit block 110 may be a computational module/unit of the system 100.
[0028] Data input unit 110 may include an event monitoring system which contains information about the number of base stations, the number of events occurring, the number of users or officers available, the corresponding tasks required to be completed for the various events, and the estimated time to complete the tasks of the events. The input information about the users or officers may inherently contain information about the location of the users or officers, and the events information may inherently include information on the locations of the various events. The event monitoring system may collect, manage and store data about the number of users/officers, events, tasks, event task times etc. at each time instance. The data input unit 110 may also include a centralized commander system for monitoring a region of area using optical cameras and other sensors to trigger automated and scheduled events. The base stations, events and manpower required (i.e. the users/officers involved) may relate to, for example but not limited to, fire, water, electrical, security etc. In various embodiments, the different events may be prioritized differently. The data input unit 110 may also be configured to collect data from wearable sensors worn by the users. The wearable sensors may, for instance, include or contain a Global Positioning System (GPS) sensor, various body sensors for monitoring the body condition of the users, and/or an image capture device, e.g. an optical camera sensor. The wearable sensors may be attached to the users, for example but not limited to, their bodies/outfits. The data input unit may be configured to obtain data from the wearable sensors through a communication network, for example but not limited to, WiFi, Bluetooth etc.
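As a non-authoritative illustration of the kind of payload that the data input unit 110 might receive from a wearable sensor over the communication network described in paragraph [0028], consider the sketch below. The field names and the use of JSON as the transport format are assumptions introduced only for illustration.

```python
# Illustrative sketch of a wearable-sensor payload as it might arrive at the
# data input unit 110. Field names and JSON transport are assumptions only.
import json


def parse_sensor_payload(raw):
    """Decode one reading forwarded by a wearable sensor."""
    payload = json.loads(raw)
    # GPS position, body-sensor readings and an optional image reference.
    return {
        "officer_id": payload["officer_id"],
        "gps": tuple(payload["gps"]),              # (latitude, longitude)
        "heart_beat_rate": payload.get("heart_beat_rate"),
        "respiratory_rate": payload.get("respiratory_rate"),
        "skin_temperature": payload.get("skin_temperature"),
        "image_ref": payload.get("image_ref"),     # e.g. a frame captured on device
    }


if __name__ == "__main__":
    raw = json.dumps({"officer_id": "Officer1", "gps": [1.3521, 103.8198],
                      "heart_beat_rate": 96, "respiratory_rate": 18,
                      "skin_temperature": 36.8}).encode()
    print(parse_sensor_payload(raw))
```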
[0029] The backend unit 120 may be another computational module/unit of the system 100 connected to the data input unit.
[0030] The backend unit 120 may include a data processing unit or database (DB) 122, which may be configured to receive input data such as information pertaining to the number of base stations, number of events, number of tasks in events, number of users/officers, estimated event task completion times and/or event locations from the data input unit block 110. The received input data may be stored in the database block 122, which may be for example, but not limited to PostgresDB, Mongo DB etc. in the form of one or more database tables. The one or more database tables may contain information, for example, but not limited to the type of event, number of tasks in event, type of user/officer, type of base station etc. The data processing unit 122 may additionally obtain information about the optimal schedules for each user/officer from the optimizer/scheduler unit 130. Once the schedule of the users or officers is generated, the routing information may be generated from the current location of the users or officers to the event locations. This information may be updated in the data processing unit
122 and may also be displayed on the map display page in the user interface unit 140.
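By way of non-limiting illustration only, the following Python sketch indicates the kind of records such database tables might hold; the class and field names (Officer, Task, Event etc.) are assumptions made for this illustration and are not prescribed by the present disclosure.

```python
# Illustrative sketch only: possible record shapes for the database tables of unit 122.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Officer:
    officer_id: int
    officer_type: str                 # e.g. "fire", "water", "electrical", "security"
    base_station_id: int
    location: Tuple[float, float]     # (latitude, longitude) from the GPS sensor

@dataclass
class Task:
    task_id: str                      # e.g. "E1T1"
    estimated_completion_min: float   # estimated event task completion time

@dataclass
class Event:
    event_id: int
    event_type: str
    priority: int
    location: Tuple[float, float]
    tasks: List[Task] = field(default_factory=list)
```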
[0031] The backend unit 120 may also be connected to the user interface unit block 140 to display different aspects related to the graphics user interface (GUI) as described below. [0032] The backend unit 120 may act as an interface between the data input unit 110, the user interface unit 140 and the optimizer/scheduler unit 130. The data processing unit 122 in the backend unit 120 may collect the data from the data input block unit 110 in the form of signals (e.g. images, videos, GPS coordinates, signals from body sensors etc.). The backend unit 120 may store the information in the block unit 122, for example in a data storage device, e.g. a hard disk drive, or in the cloud. The data processing unit 122 may be, for example but not limited to, an independent server system running a service process etc. which segregates the data according to different kinds of modalities and extracts features relevant for the emotion detection system and the event task time detection system. The extracted features and relevant data may be sent to the common processing unit 121 and the emotion detection unit
123 also included in the backend unit 120.
[0033] The emotion detection unit 123 may utilize deep learning or machine learning models to detect emotions of a user or officer, i.e. determine the probability that the officer or user is happy, relaxed/calm, angry, fearful etc. Once the emotion of the user or officer is detected, the information may continuously be stored in the data processing unit 122, i.e. in the one or more database tables.
[0034] The common processing unit 121 may perform different processes for the backend unit 120 such as starting of backend service, resetting the service and stopping of different services. The common processing unit 121 may also perform the continuous monitoring of the event task times of the users or officers from the data received (via the data processing unit 122 and the data input unit 110) from the wearable sensors.
[0035] The optimizer/scheduler block unit 130 may continuously interact with the database block unit 122 to fetch all the information in the DB table into the optimizer/scheduler unit 130. The capability matrix scores may be updated based on this information, and the corresponding scheduling and re-scheduling of users or officers may be performed through the optimizer/scheduler unit 130. The updated information may be transmitted back to the database block unit 122.
[0036] The optimizer/scheduler unit 130 may include a computation unit system. This system may run continuously as a software service. The optimizer/scheduler unit 130 may be connected to the other units 110, 140 in the system 100 through the backend system unit 120. The unit 130 may be configured to reduce or minimize the time taken to perform a certain set of tasks based on some constraints, such as the capability matrix scores. The unit 130 may use a greedy optimizer, a mathematically designed constraint optimizer etc., metaheuristic algorithms like genetic algorithms, ant colony optimization etc., and/or any other optimizer or optimization method. [0037] Additionally, in the event of situations such as delays (i.e. the time taken by a user/officer to complete the task is beyond an estimated time limit), the system 100 may monitor the emotion of the user or officer assigned to perform the task. If the user or officer exhibits negative emotion, the unit 130 may modify the schedule using the optimizer and the information may be sent to the scheduler. The scheduler may change the schedules of the users or officers based on the input from the optimizer and may send that information to the database block unit 122 in the backend block unit 120.
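As a non-limiting sketch of the greedy optimization mentioned above, the Python example below assigns each task to the available user/officer with the highest capability matrix score; the data structures, the function name greedy_schedule and the example scores are assumptions for illustration and do not represent the claimed optimizer.

```python
# Illustrative greedy assignment: each task goes to the highest-scoring available officer.
def greedy_schedule(tasks, capability_score):
    schedule = {}
    available = set(capability_score)
    for task in tasks:
        if not available:
            break
        best = max(available, key=lambda officer: capability_score[officer])
        schedule[task] = best
        available.remove(best)
    return schedule

# Hypothetical capability matrix scores (cf. FIG. 3A-style values).
scores = {"Officer1": 2.47, "Officer2": 2.05, "Officer3": 1.90}
print(greedy_schedule(["E1T1", "E2T1"], scores))
# {'E1T1': 'Officer1', 'E2T1': 'Officer2'}
```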
[0038] The user interface unit 140 may be a Graphical User Interface (GUI) unit of a device or a computer which presents graphics to show information to a user, including texts, icons, images, diagrams, charts, shapes, movie playback, etc., and which also accepts user actions input through GUI objects such as buttons, forms, text boxes, checkboxes, radio buttons, mouse movement, mouse selection, mouse release, etc. These graphics may be physically presented by an image display such as a Liquid Crystal Display (LCD) or a plasma display. The device or computer may also include input devices such as keyboards, mouse, touch screen, etc. to accept inputs from the users or officers. Various embodiments may include a plurality of devices or computers, with each device or computer having a user interface unit 140. Different devices or computers may be accessed by different users. A user who accesses the computer may be an officer or worker etc. sent to complete the task, an administrator/operator who coordinates the dispatching, or a commander etc.
[0039] FIG. 2A is a schematic illustrating emotion detection and classification performed by the system 100 according to various embodiments. [0040] In 200, data may continuously be received in real time from the wearable sensors which are worn by the users or officers (e.g. through an Internet of Things (IoT) device interface). The data may include images or videos of the users or officers captured by image capture devices and data relating to one or more body conditions (e.g. heart beat rate, perspiration rate, skin temperature etc.) of the users or officers. The data may be received by unit 110 and may be transmitted by unit 110 to unit 122.
[0041] The unit 122 may extract, from the images or videos, the features indicating emotions, e.g. a facial expression of the user or the officer (e.g. whether the user or officer is smiling or frowning), or a behavior of the user or the officer. The emotion detector 123 may then use machine learning (ML)/deep learning (DL) models to classify the different emotions, e.g. based on the extracted features. In various embodiments, the emotions may be determined based on one or more body conditions (e.g. heart beat rate, respiratory rate, skin temperature etc.) of the user or officer. In various embodiments, the emotions may also be determined based on the extracted features, e.g. the facial expression or behavior of the user. The unit 122 may be configured to determine a score (or quotient) for each of the emotions. The emotions may, for instance, have the classifications happy, relaxed/calm, angry, fearful, disgust, anxiety etc. The score or quotient for each of the emotions may reflect the probability that the user or officer is having that emotion. For instance, if the score or quotient for “happy” is high, there is a high likelihood that the user is happy. The scores or quotients for the different emotions detected for each user or officer may be stored in the unit 122. The unit 122 may determine an (overall) emotional score (or valence score) of the user or officer based on the scores of the individual emotions, which in turn may be based on the one or more body conditions of the respective user or officer. In various embodiments, the unit may determine the emotional score also based on the extracted features, e.g. the facial expression or behavior of the respective user or officer. In other words, the unit 122 may be configured to determine an emotional score or emotion of the user or the officer. The emotional score or emotion may be based on the one or more body conditions of the user or officer. In various embodiments, the emotional score or emotion may be based on the one or more body conditions of the user or officer, and the extracted features, e.g. the facial expression or behavior of the user.
[0042] Under step 201, information may be received by the data input unit 110 continuously from the wearable sensors, which may, for instance, include Global Positioning System (GPS) sensors, various body sensors for monitoring the body condition of the users (e.g. a heart rate sensor), and/or image capture devices, e.g. optical camera sensors. The information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network which may include, but is not limited to, Bluetooth, WiFi etc. The information such as data from the body sensors, images/videos etc. may be stored by the data processing unit 122 in a local storage device, in the cloud etc. for further processing. [0043] Under step 202, the data processing unit 122 may extract, from the images or videos, the features indicating a facial expression of the user or the officer (e.g. whether the user or officer is smiling or frowning), or a behavior of the user or the officer. For instance, key points present in the face of the user or the officer as contained in an image or video frame may be identified. The key points may, for instance, be points indicating the mouth, eyes etc. The movements or displacements of the key points may be tracked or monitored, and information relating to the movements or displacements may be sent from the data processing unit 122 to the emotion detector 123. The movements or displacements may be passed through a deep neural network, for example ResNet, DenseNet etc., to extract more complex feature information. The other signals from the wearable sensors may be processed to extract different signal features, for example, but not limited to, wavelet transformed signals, chirplet transformed signals, Fourier transformed signals etc. Complex signal features may be generated by considering the combination of these extracted features as input to generate more discriminative features utilizing a deep learning network feature extractor.
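As a simple, non-limiting illustration of one of the signal feature types mentioned above (Fourier transformed signals), the Python sketch below computes low-order Fourier magnitude features from a body-sensor signal; the synthetic signal, sampling choices and number of coefficients are arbitrary assumptions for this example.

```python
# Illustrative only: Fourier magnitude features of a 1-D body-sensor signal.
import numpy as np

def fourier_features(signal, n_coeffs=8):
    """Return the magnitudes of the first n_coeffs Fourier coefficients."""
    spectrum = np.fft.rfft(np.asarray(signal, dtype=float))
    return np.abs(spectrum[:n_coeffs])

# Hypothetical respiration-like signal sampled from a wearable sensor.
t = np.linspace(0, 10, 500)
respiration = np.sin(2 * np.pi * 0.3 * t) + 0.05 * np.random.randn(t.size)
print(fourier_features(respiration))
```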
[0044] Under step 203, the emotion detector 123 may process the different complex features related to the classification of different emotions. These learnt features may be passed through a deep learning classifier such as, for example but not limited to, an artificial neural network (ANN), ConvNet, ResNet etc., or machine learning classifiers such as, for example but not limited to, random forest classifiers, decision tree classifiers etc. These classifiers may classify the extracted emotional features into happy, relaxed/calm, angry, fearful etc.
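A hedged sketch of the machine learning branch is shown below, using a random forest classifier on pre-extracted feature vectors; the feature dimensionality, label set, number of trees and the randomly generated training data are illustrative assumptions only.

```python
# Illustrative only: classifying extracted emotional features with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

EMOTIONS = ["happy", "relaxed/calm", "angry", "fearful"]

# Hypothetical training data: rows are extracted feature vectors, labels index EMOTIONS.
X_train = np.random.rand(200, 16)
y_train = np.random.randint(0, len(EMOTIONS), size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
probabilities = clf.predict_proba(np.random.rand(1, 16))[0]  # one probability per class
print(dict(zip(EMOTIONS, probabilities.round(2))))
```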
[0045] Under step 204, the emotion detector 123 may determine the probability of different outcomes based on the body condition of the respective user or officer. In various embodiments, the emotion detector 123 may determine the probability of different outcomes also based on the extracted features. The emotion detector 123 may be configured to determine a score (or quotient) for each of the emotions. The score or quotient for each of the emotions may reflect the probability that the user or officer is having that emotion. For instance, if the score or quotient for “happy” is high, there is a high likelihood that the user is happy. The scores or quotients for the different emotions detected for each user or officer may be stored in the unit 122. The unit 122 may determine an (overall) emotional score (or valence score) or emotion of the user or officer based on the scores of the individual emotions. The unit 122 may be configured to determine the emotional score or emotion of the user or officer based on the body condition (e.g. heart beat rate, respiratory rate, skin temperature etc.) of the respective user or officer. In various embodiments, the unit 122 may be configured to determine the emotional score or emotion of the user or officer based on the body condition and also based on the extracted features. The valence score may range from -1 to 0, representing overall negative emotion, and from 0 to 1, representing overall positive emotion. FIG. 2B shows the profile of a signal representing a user or officer with calm emotion according to various embodiments. FIG. 2C shows the profile of a signal representing a user or officer with abnormal emotion conditions, e.g. fear or panic, according to various embodiments. The signals may, for instance, represent the respiration of a user or officer.
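The mapping from per-emotion probabilities to a single valence score in the range -1 to 1 could, purely as a non-limiting sketch, be carried out as below; the sign convention follows the description above, while the specific weighting (sum of positive probabilities minus sum of negative probabilities) is an illustrative assumption.

```python
# Illustrative only: collapsing per-emotion probabilities into a valence score in [-1, 1].
POSITIVE = {"happy", "relaxed/calm"}
NEGATIVE = {"angry", "fearful", "disgust", "anxiety"}

def valence_score(emotion_probs):
    pos = sum(p for emotion, p in emotion_probs.items() if emotion in POSITIVE)
    neg = sum(p for emotion, p in emotion_probs.items() if emotion in NEGATIVE)
    return max(-1.0, min(1.0, pos - neg))

print(valence_score({"happy": 0.6, "relaxed/calm": 0.2, "angry": 0.1, "fearful": 0.1}))  # 0.6
```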
[0046] FIG. 3A shows a capability matrix 300 according to various embodiments. The capability matrix 300 as shown in FIG. 3A includes the capability matrix scores for a team of officers. The capability matrix 300 may include information under different categories or variables, such as serial number/rank (SL. NO/Rank), age, hierarchy, experience, relevance, emotion exhibited by the officers during interaction with people, emotion expressed while performing different tasks, event task completion times, overall health condition of the officer, and overall capability matrix scores of each officer. The overall capability matrix score (alternatively referred to as capability index) of each officer may be calculated based on event task completion time, emotion exhibited by the officer during the performance of those tasks, emotion exhibited during interaction with people, relevance, overall health condition of the officer and/or hierarchy, which may be critical variables required for the calculation of the capability matrix scores. The optional variables may be age and/or experience, which when specified may be used. Otherwise, the default values may be considered. The capability matrix 300 may be generated by the optimizer/scheduler unit 130.
[0047] FIG. 3B shows a table 301 showing the score values given to officers of various age groups in the capability matrix 300 according to various embodiments. Table 301 shows that an officer whose age is in the range of 20 - 40 years old may be given a score of 0.4, while an officer whose age is 40 years and above may be given a score of 0.6. When no age is specified, the default value of 0.5 may be set.
[0048] FIG. 3C shows a table 302 showing the score values given to officers of different experience in the capability matrix 300 according to various embodiments. Table 302 shows that an officer who has more than 5 years of experience may be given a score of 0.6, while an officer who has less than 5 years of experience may be given a score of 0.4. When no experience is specified, the default value of 0.5 may be set.
[0049] FIG. 3D shows a table 303 showing the score values given to senior officers and to junior officers in the capability matrix 300 according to various embodiments. Table 303 shows that a senior officer may be given a score value of 0.6, while a junior officer may be given a score value of 0.5.
[0050] FIG. 3E shows a table 304 showing the score values given to officers of different suitability/conformity to the tasks based on the skill sets of the officers according to various embodiments. Table 304 shows that officers with skill sets of high relevance to the task may be given a score of 1, while officers with skill sets of low relevance to the task may be given a score of 0. Officers with skill sets whose relevance falls between the two ends may be given values between 0 and 1.
[0051] FIG. 3F shows a table 305 showing the score values based on emotion exhibited by the officers when interacting with other people according to various embodiments. This metric may provide a measure of the emotion exhibited by an officer by analyzing his behavior/facial expression using wearable sensors and/or image capture devices (e.g. cameras) worn by the officer, as well as the behavior/facial expression of the person the officer is interacting with. Each row of the right column shows a valence score, which is an overall emotional score. If both the officer and the other person are determined to demonstrate positive behavior (e.g. happy, calm/relaxed), the valence score may range from 0 to 1. On the other hand, if the officer and the other person are determined to show negative behavior (e.g. angry, fearful, sad), the valence score may range from -1 to 0. The facial expression of the other person may be captured by the camera worn by the officer and sent as an input (under 201 in FIG. 2). In various embodiments, the different emotions of the officer may be determined based on one or more body conditions (such as a detected heart beat rate etc.). In various embodiments, the different emotions of the officer may be determined based on the one or more body conditions, as well as based on extracted features, e.g. from the facial expression or behavior of the officer and the other person interacting with the officer. The different emotions shown by the other person may be classified under 203, and the valence score for the officer may be determined.
[0052] FIG. 3G shows a table 306 showing the score values based on emotion exhibited by the officers when performing a task according to various embodiments. If an officer is determined to demonstrate positive emotion when performing a task, the officer may be given a score from 0 to 1. On the other hand, if the officer is determined to demonstrate negative emotion when performing the task, the officer may be given a score from -1 to 0. When the system determines that the information provided is not sufficient to determine the emotion of the officer, a default value of 0 may be provided. The receiving of various data (e.g. images/videos, information on body condition etc.) as well as the determination and classification of different emotions may be carried out as described in relation to FIG. 2.
[0053] FIG. 3H shows a table 307 showing the score values based on health conditions of the officers according to various embodiments. Various health data of the officers (e.g. body temperature, heart beat rate etc.) may be detected by the wearable sensors worn by the officers and provided to the data input unit 110. The backend unit 120 may receive the health data from the data input unit 110, and may make a determination on the health condition of each officer based on the health data provided. If an officer is feeling unwell, a score of -1 may be assigned. On the other hand, if the officer is feeling extremely well, a score of +1 may be assigned. 0 may be the default score.
[0054] A score value may also be provided to each officer based on information of the time taken by the officer to perform different event tasks assigned to him during a particular duration of time, e.g. in a day, in a week etc. The score may be provided by comparing the actual task completion time with the estimated task completion time (which may be a time limit entered into the system 100 for the task to be completed). If the actual completion time for a task is greater than the estimated completion time (Actual task completion time - Estimated task completion time > 0), a score of 0 may be provided. In contrast, if the actual completion time for a task is less than or equal to the estimated completion time (Actual task completion time - Estimated task completion time <= 0), a score of 1 may be provided. A score of the average completion time may be determined by averaging the individual scores for each task over the number of tasks ((Task1 + Task2 + ... + TaskN)/N), where N is the total number of tasks.
[0055] For example, consider a scenario in which an officer has 5 tasks and the officer has completed 3 tasks within the stipulated time (i.e. the actual time taken by the task is less than or equal to the estimated time). In 2 of the tasks, the officer completes the task with some delay, where the actual time of completion of the task has exceeded the estimated task time. The average score may be calculated as follows: (1 + 1 + 1 + 0 + 0)/5 = 3/5 = 0.6.
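The rule of paragraphs [0054] and [0055] can be written directly as the following non-limiting Python sketch; the function names and the example minute values are assumptions made for illustration.

```python
# Illustrative transcription of the event task completion time scoring rule.
def task_time_score(actual_minutes, estimated_minutes):
    """1 if the task finished within the estimated time limit, 0 otherwise."""
    return 1 if actual_minutes - estimated_minutes <= 0 else 0

def average_task_time_score(task_times):
    """task_times: list of (actual, estimated) pairs for the officer's N tasks."""
    scores = [task_time_score(actual, estimated) for actual, estimated in task_times]
    return sum(scores) / len(scores)

# The worked example above: 3 tasks on time, 2 tasks delayed (minute values are hypothetical).
print(average_task_time_score([(10, 20), (15, 20), (20, 20), (30, 20), (25, 20)]))  # 0.6
```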
[0056] An overall capability matrix score may be determined or calculated for each user or officer based on score values under different categories or variables. The different categories or variables may have different weightage. The weightage of the different categories may be assigned randomly, based on past experience, or using an optimization method. An advantage of using different weightages is that they may help to penalize or increase the influence of one category or variable in the outcome evaluation of the overall capability matrix score.
[0057] For example, the weightage of different categories may be as below:
weightage given to age category (weighted_age) = 0.1,
weightage given to hierarchy category (weighted_hierarchy) = 0.3,
weightage given to experience category (weighted_experience) = 0.4,
weightage given to relevance category (weighted_relevance) = 0.4,
weightage given to emotions exhibited by officers during interaction with people category (weighted_valence) = 0.5,
weightage given to emotions expressed by officers when completing tasks category (weighted_emotion_exhibited_tasks) = 0.6,
weightage given to actual completion time relative to estimated completion time category (weighted_task_times) = 0.7,
weightage given to health condition category (weighted_health_condition) = 0.4.
[0058] The overall capability matrix score may be determined as follows:
Overall capability matrix score = weighted_age * age_score + weighted_hierarchy * hierarchy_score + weighted_experience * experience_score + weighted_relevance * relevance_score + weighted_valence * valence_score + weighted_emotion_exhibited_tasks * emotion_exhibited_tasks_score + weighted_task_times * event_task_completion_time_score + weighted_health_condition * health_condition_score, where age_score is the score under the age category, hierarchy_score is the score under the hierarchy category, experience_score is the score under the experience category, relevance_score is the score under the relevance category, valence_score is the score under the emotions exhibited by officers during interaction with people category, emotion_exhibited_tasks_score is the score under the emotions expressed by officers when completing tasks category, event_task_completion_time_score is the score under the actual completion time relative to estimated completion time category, and health_condition_score is the score under the health condition category. [0059] The overall capability matrix scores for officer 1 and officer 2 as shown in FIG. 3A may be as follows:
Calculation of score for officer 1 = 0.1 * 0.6 + 0.3 * 0.6 + 0.4 * 0.6 + 0.4 * 1 + 0.5 * 0.2 + 0.6 * 1 + 0.7 * 0.7 + 0.4 * 1 = 2.47
Calculation of score for officer 2 = 0.1 * 0.6 + 0.3 * 0.6 + 0.4 * 0.5 + 0.4 * 1 + 0.5 * 0.5 + 0.6 * 0 + 0.7 * 0.8 + 0.4 * 1 = 2.05
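A non-limiting Python sketch reproducing the weighted sum of paragraph [0058] with the example weightages of paragraph [0057] is shown below; the dictionary keys are assumptions made for this illustration, while the two example inputs mirror the score values used for officer 1 and officer 2 above.

```python
# Illustrative only: overall capability matrix score as a weighted sum of category scores.
WEIGHTS = {
    "age": 0.1, "hierarchy": 0.3, "experience": 0.4, "relevance": 0.4,
    "valence": 0.5, "emotion_exhibited_tasks": 0.6, "task_times": 0.7,
    "health_condition": 0.4,
}

def overall_capability_score(category_scores):
    return sum(WEIGHTS[category] * category_scores[category] for category in WEIGHTS)

officer1 = {"age": 0.6, "hierarchy": 0.6, "experience": 0.6, "relevance": 1.0,
            "valence": 0.2, "emotion_exhibited_tasks": 1.0, "task_times": 0.7,
            "health_condition": 1.0}
officer2 = {"age": 0.6, "hierarchy": 0.6, "experience": 0.5, "relevance": 1.0,
            "valence": 0.5, "emotion_exhibited_tasks": 0.0, "task_times": 0.8,
            "health_condition": 1.0}

scores = {"Officer1": overall_capability_score(officer1),   # ~2.47
          "Officer2": overall_capability_score(officer2)}   # ~2.05
# Ranking in descending order of the overall score, as described below.
print(sorted(scores, key=scores.get, reverse=True))          # ['Officer1', 'Officer2']
```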
[0060] After obtaining the capability matrix scores, the officers may be ranked accordingly, with the officer having the highest score being ranked highest, and the officer with the lowest score being ranked lowest.
[0061] During abnormal situations, such as long delays in the completion of the event task times, the remaining or available officers with the best capability matrix scores, i.e. the highest ranked officers among the remaining or available officers, may be dispatched to support and complete the task. Additionally, when tasks are completed by the officers, the capability matrix may be updated. For instance, the “actual completion time relative to estimated completion time” category may be updated based on whether the actual time taken by the officer to complete the task is less than or equal to the estimated time, and the “emotions expressed by officers when completing tasks” category may be updated based on the emotions exhibited by the officer when performing the task. The assignment of new tasks may be based on the updated capability matrix.
[0062] FIG. 4 shows a flow chart 400 showing various steps involved in the scheduling system 100 according to various embodiments. The flow chart 400 shows high-level processes of the system 100. In step 401, the system 100 receives information regarding the number of users, e.g. officers (O), events (E) and tasks (T). The system 100 may also receive estimated event completion times (EECT), which may be the estimated time limit to complete each task. The information may be provided through the user interface unit 140 and/or the data input unit 110.
[0063] Under step 402, the information may be transmitted from user interface unit 140, and/or data input unit 110 to the backend unit 120. The backend unit 120 may provide the information to an optimizer/scheduler unit 130 which may generate a capability matrix (e.g. the capability matrix 300 shown in FIG. 3A). The optimizer/scheduler unit 130 may select a candidate to perform a task based on the overall capability matrix scores of the officers, i.e. selecting the highest ranked officer based on the overall capability matrix scores. The unit 130 may seek to reduce or minimize event task time required to perform the tasks.
[0064] Under step 403, the system 100, i.e. the data processing unit 122 and/or the common processing unit 121 of the backend system 120, may start to monitor a task, e.g. monitoring the actual time taken by the candidate (i.e. the officer selected under step 402) to perform the task, upon receiving a trigger, e.g. upon receiving GPS information (via the data input unit 110) that the candidate has reached the location of the event. In this manner, the backend unit 120 may function as a software as a service system.
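The arrival trigger described under step 403 could, as a non-limiting sketch, be implemented by comparing the GPS position reported by the wearable sensor against the event location; the haversine helper, the function names and the 50 m radius below are assumptions made for illustration.

```python
# Illustrative only: deciding whether an officer has reached the event location from GPS data.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points given in degrees."""
    earth_radius_m = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def has_reached(officer_pos, event_pos, radius_m=50):
    return haversine_m(*officer_pos, *event_pos) <= radius_m

print(has_reached((1.3521, 103.8198), (1.3522, 103.8199)))  # True (roughly 15 m apart)
```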
[0065] Under step 404, the data processing unit 122 and/or the common processing unit 121 of the backend system 120 may run continuously to monitor the time taken by the candidate to perform the task. The time taken may start from when the system 100 determines that the candidate has reached the location of the event, as described earlier under step 403. When the data processing unit 122 and/or the common processing unit 121 determines that the actual time taken by the candidate has exceeded the estimated time limit, the data processing unit 122 may trigger step 405. Alternatively, going from step 404, if the actual time taken to perform the task does not exceed the estimated time limit, step 406 may be triggered; at step 406, the data processing unit 122 may determine whether the task has been completed, e.g. through input by the candidate via the user interface unit 140. If the task has not been completed, the data processing unit 122 may trigger step 404. In such a manner, step 404 and step 406 may run continuously as a loop unless the data processing unit 122 determines that the actual time taken has exceeded the estimated time limit, or that the task has been completed.
[0066] Under step 405, the data processing unit 122 and/or the common processing unit 121 may, together with the emotion detector 123, determine the emotion or emotional score of the candidate. The emotion or emotional score of the candidate may be determined based on data signals pertaining to body conditions (e.g. heart beat rate, respiratory rate etc.). In various embodiments, the emotion or emotional score of the candidate may be determined based on the data signals pertaining to body conditions, as well as the extracted features of the facial expression or behavior of the candidate captured by the image capture device worn by the candidate. If the candidate shows negative emotions (e.g. fear, anger etc.), the data processing unit 122 may trigger the rescheduling process, i.e. the data processing unit 122 may trigger the optimizer/scheduler unit 130 to select another candidate for the task under step 402. If the emotion of the candidate appears to be still good, i.e. the emotional score of the candidate has not fallen below a predetermined threshold (e.g. 0), the system 100 may from step 405 go to step 406 (i.e. the data processing unit 122 and/or the common processing unit 121 may continuously run step 406 to check whether the task has been completed, step 404, and step 405) until the system 100 determines that the task has been completed at step 406 or until the system 100 determines to trigger the rescheduling process due to the emotion of the candidate. [0067] At step 406, upon the data processing unit 122 determining that the task is completed, the data processing unit 122 and/or the common processing unit 121 may trigger step 407. At step 407, the data processing unit 122 may trigger the optimizer/scheduler unit 130 to update the capability matrix. For instance, the score for the “actual completion time relative to estimated completion time” category may be updated based on whether the actual time taken by the officer (the candidate or an officer selected by the system 100 to replace him to perform the task) to complete the task is less than or equal to the estimated time, and the score for the “emotions expressed by officers when completing tasks” category may be updated based on the emotions exhibited by the officer when performing the task. In various embodiments, the data processing unit 122 may trigger the optimizer/scheduler unit 130 to update the capability matrix at regular intervals, e.g. per day, per week etc. The updated capability matrix may then be used to select the candidate for a future task.
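Steps 404 to 406 can be summarised, purely as a schematic and non-limiting sketch, by the polling loop below; get_elapsed_minutes, is_task_completed, current_emotional_score and reschedule are hypothetical callbacks standing in for the backend unit 120 and the optimizer/scheduler unit 130, and the poll interval is an assumption.

```python
# Illustrative only: monitoring loop for steps 404 (delay check), 405 (emotion check)
# and 406 (completion check).
import time

def monitor_task(estimated_limit_min, get_elapsed_minutes, is_task_completed,
                 current_emotional_score, reschedule, threshold=0.0, poll_s=1.0):
    while True:
        if is_task_completed():                             # step 406
            return "completed"
        if get_elapsed_minutes() > estimated_limit_min:     # step 404: delay detected
            if current_emotional_score() < threshold:       # step 405: negative emotion
                reschedule()                                # back to step 402
                return "rescheduled"
        time.sleep(poll_s)                                  # otherwise keep looping

# Hypothetical usage: 25 minutes elapsed against a 20 minute limit, emotional score -0.4.
print(monitor_task(20, lambda: 25, lambda: False, lambda: -0.4, lambda: None, poll_s=0))
# 'rescheduled'
```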
[0068] FIG. 5 is a diagram 500 illustrating processes between the data input unit 110/ the user interface unit 140, the backend unit 120 and the optimizer/scheduler unit 130 of the scheduling system according to various embodiments. FIG. 5 illustrates function calls between the data input unit 110/ the user interface unit 140, the backend system 120, and optimizer/scheduler 130 unit.
[0069] The backend unit 120 may start service and may send the request to the data input unit 110/ the user interface unit 140 to receive the data. The data input unit 110/ the user interface unit 140 may send an acknowledgement (acknowledging that the service is connected) to the backend unit 120. The backend unit 120 may send a request to the data input unit 110/ the user interface unit 140 for information relating to the officers, the events and the tasks (officers, events, tasks list). The data input unit 110/ the user interface unit 140 may in response to the request, send the information to the backend unit 120. The backend unit 120 may store the information in the data processing unit 122. These processes may correspond to step 401 in FIG. 4. [0070] The backend unit 120 may then send a request to the optimizer/scheduler unit 130 to generate a schedule. The optimizer/scheduler unit 130 may generate a schedule based on the capability matrix 300, and may transmit the schedule to the backend unit 120. These processes may correspond to step 402 in FIG. 4.
[0071] The backend unit 120 may then assign tasks to the officers based on the schedule received. For instance, the backend unit 120 may, through the user interface unit 140, inform individual officers of their tasks and associated information such as location. The backend unit 120 may start monitoring whether the officers have reached the location.
[0072] Once the backend unit 120 determines that the officer has reached the location, the backend unit 120 may initiate the “Start tasks” process and may monitor the actual time being taken by the officer to perform the task (step 403). The backend unit 120 may run the service, and may monitor whether the actual time taken exceeds the estimated time limit (step 404). Upon determining there is a delay and the data processing unit 122 determining to select another candidate to perform the task (step 405), the backend unit 120 may make a function call (i.e. send a request) to the optimizer/scheduler unit 130. The optimizer/scheduler unit 130 may then select another candidate for the task based on the capability matrix scores. The optimizer/scheduler unit 130 may send the rescheduled task lists to the officers.
[0073] The backend unit 120 may continuously monitor until the backend unit 120 determines that the task has been completed (step 406). As mentioned above, the backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix. The capability matrix may be updated based on information obtained from the task that has just been completed. In the next iteration, the candidate chosen to perform a task may be based on the updated capability matrix scores.
[0074] The schedules of the officers may then be cleared from the software service system and the information may be stored in a backup table in the data processing unit 122 of the backend unit 120. The software service may then be stopped.
[0075] FIG. 6A is a diagram 600a illustrating processes between the system 100, officer 1 entity and officer 2 entity according to various embodiments. Officer 1 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a first officer. Similarly, officer 2 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a second officer. [0076] A set of tasks may be generated for the officers based on inputs from the data input unit 110 and/or the user interface unit 140. The first officer and the second officer may be ranked the highest based on the capability matrix. The optimizer/scheduler may generate a schedule of the two officers based on the inputs from the data input unit 110 and/or the user interface unit 140. The task lists may be generated based on the schedule. The system 100 may assign tasks to the officers, and may continuously monitor the progress of the completion of the tasks by the two officers. The event task table at the data processing unit 122 of the backend unit 120 may, based on the generated schedule, be continuously updated when information is received by the backend unit 120 and the optimizer/scheduler unit 130.
[0077] The wearable sensor worn by the first officer may monitor the emotion of the first officer. For instance, the image capture device of the wearable sensor may capture one or more images of the first officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the first officer. The system 100 may determine an emotional score or emotion of the first officer, e.g. based on the body conditions of the first officer. In various embodiments, the system 100 may determine the emotional score or emotion of the first officer based on the body conditions of the first officer and the facial expression or behavior of the first officer. Further, the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the first officer.
[0078] Likewise, the wearable sensor worn by the second officer may monitor the emotion of the second officer. For instance, the image capture device of the wearable sensor may capture one or more images of the second officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the second officer. The system 100 may determine an emotional score or emotion of the second officer, e.g. based on the body conditions of the second officer. In various embodiments, the system 100 may determine the emotional score or emotion of the second officer based on the body conditions of the second officer and the facial expression/behavior of the second officer. Further, the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the second officer.
[0079] The information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network. In various embodiments, the information may be transmitted through the cloud, or via a server. The information may continuously be provided from the wearable sensors to the data input unit 110, and information may also be sent from the data input unit 110 to the wearable sensor.
[0080] In S100 shown in FIG. 6A, the system 100 may send the task list to officer 1 entity based on the schedule generated. In 0100, the officer 1 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
[0081] In S200 shown in FIG. 6A, the system 100 may send the task list to officer 2 entity based on the schedule generated. In 0200, the officer 2 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
[0082] S100, 0100, S200 and 0200 may occur between step 402 and step 403 shown in FIG. 4.
[0083] In S300 shown in FIG. 6A, the system 100 may continuously track the position of the officer 1 entity and may check whether the first officer has reached the location of event tasks using the wearable sensor worn by the first officer. In 0300, the wearable sensor worn by the first officer may detect that the first officer has reached the location and officer 1 entity may send an acknowledgement to the system 100 that the first officer has reached the location. Upon receiving the acknowledgement from officer 1 entity, the system may start to monitor the actual time spent by the first officer in completing the task.
[0084] Likewise, in S400, the system 100 may continuously track the position of the officer 2 entity and may check whether the second officer has reached the location of event tasks using the wearable sensor worn by the second officer (which may be different from the location of event tasks that the first officer is sent to). In 0400, the wearable sensor worn by the second officer may detect that the second officer has reached the location and officer 2 entity may send an acknowledgement to system 100 that the second officer has reached the location. Upon receiving the acknowledgement from officer 2 entity, the system may start to monitor the actual time spent by the second officer in completing the task.
[0085] S300, 0300, S400 and 0400 may correspond to step 403 shown in FIG. 4.
[0086] In S500, the system 100 may continuously monitor the progress of the task. In 0500, the wearable sensor may detect completion of task (e.g. by officer 1 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed. Alternatively, the first officer may through the graphics user interface of the device indicate that the task has been completed. [0087] In S600, the system 100 may continuously monitor the progress of the task. In 0600, the wearable sensor may detect completion of task (e.g. by officer 2 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed. Alternatively, the second officer may through the graphics user interface of the device indicate that the task has been completed.
[0088] S500, 0500, S600 and 0600 may correspond to steps 404 and 406 as illustrated in
FIG. 4.
[0089] As mentioned above, the backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix. The capability matrix may be updated based on information obtained from the task that has just been completed. In the next iteration, the candidate chosen to perform a task may be based on the updated capability matrix scores. The service may then be stopped.
[0090] FIG. 6B is a diagram 600b illustrating processes between the system 100, officer 1 entity, officer 2 entity and officer 3 entity according to various embodiments. Similar to FIG. 6A, officer 1 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a first officer, officer 2 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a second officer, while officer 3 entity may refer to the wearable sensor and/or a device displaying the graphics user interface for communication with the system held by a third officer.
[0091] A set of tasks may be generated for officers based on inputs from data input unit 110 and/or user interface unit 140. The first officer and the second officer may be ranked the highest based on the capability matrix. The third officer may be ranked the third highest. The system 100 may assign tasks to the first officer and the second officer, and may continuously monitor the progress of the completion of the tasks by the two officers. The third officer may remain in base as a standby.
[0092] Similar to what is described earlier, each officer may wear a wearable sensor for monitoring the emotion of the respective officer. For instance, the image capture device of the wearable sensor may capture one or more images of the respective officer and/or body sensors of the wearable sensor may detect body conditions such as heart beat rate etc. of the respective officer. In various embodiments, the system 100 may determine an emotional score or emotion of the respective officer based on the body conditions of the respective officer. In various embodiments, the system 100 may determine the emotional score or emotion of the respective officer based on the facial expression/behavior of the respective officer and the body conditions of the respective officer. Further, the progress of the task may be monitored through the wearable sensor and/or through the graphics user interface of the device carried by the respective officer.
[0093] A set of tasks may be generated for the officers based on inputs from the data input unit 110 and/or the user interface unit 140. The first officer and the second officer may be ranked the highest based on the capability matrix. The optimizer/scheduler unit 130 may generate a schedule of the two officers based on the inputs from the data input unit 110 and/or the user interface unit 140. The system 100 may assign tasks to the officers, and may continuously monitor the progress of the completion of the tasks by the two officers. The task lists may be generated based on the schedule. The event task table at the data processing unit 122 of the backend unit 120 may, based on the generated schedule, be continuously updated when information is received by the backend unit 120 and the optimizer/scheduler unit 130.
[0094] The information may be transmitted from the wearable sensors directly or indirectly to the data input unit 110 via a communication network. In various embodiments, the information may be transmitted through the cloud, or via a server. The information may continuously be provided from the wearable sensors to the data input unit 110, and information may also be sent from the data input unit 110 to the wearable sensor.
[0095] In A100 shown in FIG. 6B, the system 100 may send the task list to officer 1 entity based on the schedule generated. In B100, the officer 1 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
[0096] In A200 shown in FIG. 6B, the system 100 may send the task list to officer 2 entity based on the schedule generated. In B200, the officer 2 entity sends an acknowledgement to the system 100 acknowledging that the task list has been received.
[0097] A100, B100, A200 and B200 may occur between step 402 and step 403 shown in
FIG. 4.
[0098] In A300 shown in FIG. 6B, the system 100 may continuously track the position of the officer 1 entity and may check whether the first officer has reached the location of event tasks using the wearable sensor worn by the first officer. In B300, the wearable sensor worn by the first officer may detect that the first officer has reached the location and officer 1 entity may send an acknowledgement to system 100 that the first officer has reached the location. Upon receiving the acknowledgement from officer 1 entity, the system may start to monitor the actual time spent by the first officer in completing the task.
[0099] Likewise, in A400, the system 100 may continuously track the position of the officer 2 entity and may check whether the second officer has reached the location of event tasks using the wearable sensor worn by the second officer (which may be different from the location of event tasks that the first officer is sent to). In B400, the wearable sensor worn by the second officer may detect that the second officer has reached the location and officer 2 entity may send an acknowledgement to system 100 that the second officer has reached the location. Upon receiving the acknowledgement from officer 2 entity, the system may start to monitor the actual time spent by the second officer in completing the task.
[00100] A300, B300, A400 and B400 may correspond to step 403 shown in FIG. 4.
[00101] In A500, the system 100 may continuously monitor the progress of the task. In B500, the wearable sensor may detect that the task is still in progress (e.g. by officer 1 entity still at location or through image capture device) and may send an acknowledgement indicating that the task is still in progress. The system may determine that the actual time spent has exceeded the estimated time limit, i.e. delayed. Further, the wearable sensor may transmit information (e.g. images of the first officer, signals indicating body conditions of the first officer) which indicate negative emotion of the first officer. The emotion detector 123 of the backend unit 120 may determine that the emotion score of the first officer has fallen below a predetermined threshold (e.g. 0). A500 may correspond to step 404 shown in FIG. 4 while B500 may correspond to step 405 shown in FIG. 4.
[00102] In A600, the system 100 may continuously monitor the progress of the task. In B600, the wearable sensor may detect completion of task (e.g. by officer 2 entity leaving location or through image capture device) and may send an acknowledgement to the system that the task has been completed. Alternatively, the second officer may through the graphics user interface of the device indicate that the task has been completed. A600 and B600 may correspond to steps 404 and 406 as illustrated in FIG. 4.
[00103] In A650, the system 100 may select another candidate for the task. The third officer, being the next highest ranked officer in the capability matrix, may be selected (step 402). The task list of the third officer may also be updated.
[00104] In A750, the system may send information to officer 1 entity to inform the first officer to terminate the task as the officer has exhibited negative emotions while performing the set of tasks. The information may, for instance, be conveyed via the graphics user interface of the device. In B750, officer 1 entity may acknowledge that it has received the information. A750 and B750 may occur between step 402 and step 403.
[00105] In A800, the system 100 may send the task list to officer 3 entity based on the updated schedule generated by the optimizer/scheduler unit 130. In B800, the officer 3 entity sends an acknowledgement that the task list has been received.
[00106] In A900, the system 100 may continuously track the position of the officer 3 entity and may check whether the third officer has reached the location of event tasks using the wearable sensor worn by the third officer. In B900, the wearable sensor worn by the third officer may detect that the third officer has reached the location and officer 3 entity may send an acknowledgement to system 100 that the third officer has reached the location. Upon receiving the acknowledgement from officer 3 entity, the system may start to monitor the actual time spent by the third officer in completing the task.
[00107] A900 and B900 may correspond to step 403 shown in FIG. 4.
[00108] In A1000, the system 100 may continuously monitor the progress of the task that the third officer is assigned to. In B1000, the wearable sensor may detect completion of the task (e.g. by officer 3 entity leaving the location or through the image capture device) and may send an acknowledgement to the system that the task has been completed. Alternatively, the third officer may through the graphics user interface of the device indicate that the task has been completed. A1000 and B1000 may correspond to steps 404 and 406 as illustrated in FIG. 4. [00109] The backend unit 120 may trigger the optimizer/scheduler unit 130 to update the capability matrix. The capability matrix may be updated based on information obtained from the task that has just been completed. In the next iteration, the candidate chosen to perform a task may be based on the updated capability matrix scores. The service may then be stopped. [00110] FIG. 7A shows the map input page 710 of the graphics user interface 700 of the user interface unit 140 according to various embodiments. The map input page 710 may contain fields to accept the number of officers, event-tasks, base stations and event task time as input. [00111] As shown in FIG. 7A, the map input page 710 may include a display of a map, for instance, of a part of a city or a country. Additionally, the map input page 710 may include three user interface (UI) blocks: base station creator 711, officer creator 712, and event creator 713. [00112] In addition to the map input page 710, the graphics user interface 700 may also include the map display page 720, the officer management page 730, and the event management page 740. The graphics user interface 700 may, for instance, be hosted on a personal computer, or may be hosted on a cloud/standalone server system, and may be accessed by the officer or worker sent to complete a task, by the administrator/operator, or the commander etc.
[00113] FIG. 7B shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when tasks have been assigned to the first officer and the second officer. The map display page 720 may display the information about the current location of the base stations, officers and the event locations as provided to the map input page 710. The information provided to the map input page 710 may be processed by the backend unit 120 and the optimizer/scheduler unit 130 to generate the task lists for the officers. Further, the map display page 720 may also display the route in which the officers have traversed to the event locations. The map display page 720 may continuously display information in the form of UI blocks about, for example but not limited to, the location of the officers, the officers’ task lists, the location of the base station, the event location, actual event times (AET) (actual time taken to complete tasks), estimated event times (EET) (estimated time taken to complete tasks), emotion information of the officer etc.
[00114] UI block 721 may be a part of the map display page and may contain information about the number of officers at the base station, and an indication of the officers with the assigned tasks. UI block 722 may contain routing information provided to the officers to navigate to the event locations and also the time estimated to resolve those events. The first officer (Officer1) assigned to perform tasks at event location 1 (E1T1 and E1T2) may be shown the best route to event location 1. Likewise, the second officer (Officer2) assigned to perform tasks at event location 2 (E2T1 and E2T2) may be shown the best route to event location 2.
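By way of non-limiting illustration of how the routing information in UI block 722 might be derived, the Python sketch below finds the quickest route over a simple road graph using Dijkstra's algorithm; the graph, node names and edge weights (travel minutes) are assumptions made for this example.

```python
# Illustrative only: shortest route from a base station to an event location.
import heapq

def shortest_route(graph, start, goal):
    """graph: {node: [(neighbour, minutes), ...]}. Returns (total_minutes, path)."""
    queue, visited = [(0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbour, minutes in graph.get(node, []):
            if neighbour not in visited:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return float("inf"), []

road_graph = {"Base": [("A", 5), ("B", 9)], "A": [("Event1", 7)], "B": [("Event1", 2)]}
print(shortest_route(road_graph, "Base", "Event1"))  # (11, ['Base', 'B', 'Event1'])
```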
[00115] FIG. 7C shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when the first officer and the second officer are performing the tasks. As shown in FIG. 7C, the first officer (Officer1) and the second officer (Officer2) may have reached their event locations respectively, and may have started performing the tasks. On the other hand, the third officer (Officer3) and the fourth officer (Officer4) may remain at the base station on standby to be assigned. UI block 723 may provide information about the third officer (Officer3) and the fourth officer (Officer4) present at the base station. The standby officers may support existing events or may be assigned to perform tasks at new events when these events occur.
[00116] UI block 724 shows that the first officer (Officer1) and the second officer (Officer2) have reached the event locations. When an officer reaches the event location, the wearable sensor may send an acknowledgement to the backend unit 120 to trigger the timer for tracking the actual time spent (AET) by the officer to complete a task. The actual time spent by the officer (AET) as well as the emotion of the officer may be continuously tracked via the wearable sensor/image capture device as he performs the tasks assigned to him. The estimated event times (EET) may be previously provided to the system 100 by a user such as an operator. [00117] As shown in FIG. 7C, the actual time (AET) spent by the first officer (Officer1) has exceeded the estimated time (EET). Further, the system 100 detects that the first officer is fearful. The system 100 may then be triggered to assign another candidate to perform the task that has been assigned to the first officer, since the first officer may no longer be able to perform the task. The emotional conditions of the officers may be displayed as bar graphs or charts etc. On the other hand, based on information from the wearable sensor transmitted by the data input unit 110 or through the graphics user interface unit 140 (based on inputs provided by the second officer), the system 100 may determine that the second officer (Officer2) is performing tasks on time, as shown in FIG. 7C.
[00118] FIG. 7D shows the map display page 720 of the graphics user interface 700 of the user interface unit 140 according to various embodiments when the third officer is sent to perform the tasks previously assigned to the first officer. As shown in UI block 723’, the first officer (Officer1) is sent back to the base station. UI block 724’ shows the third officer (Officer3) resolving the delay.
[00119] FIG. 7E shows the officer management page 730 of the graphics user interface 700 of the user interface unit 140 according to various embodiments. The officer management page 730 may include information about each officer, the progress of the current set of event tasks they are performing and also the capability matrix scores of each of the officers together with their rankings. As shown in FIG. 7E, UI block 731 shows information about the available officers, e.g. name, role, type (e.g. fire, water, electrical, security etc.), and the associated base station. Clicking on a row associated with an officer in UI block 731 may result in the GUI displaying UI block 732, which shows detailed information pertaining to the officer. The information may include, for instance, the skill sets, the employee’s identification (ID) etc. The task list of the officer may also be displayed in UI block 732, with indications showing whether the tasks are completed, in progress, or to be done. UI block 733 shows the capability matrix scores of the officers, as well as their ranks with respect to the capability matrix scores. [00120] FIG. 7F shows the event management page 740 of the graphics user interface 700 of the user interface unit 140 according to various embodiments. The event management page block 740 may display the progress of different events and tasks, and may contain information about estimated and actual task completion times, the type of event the tasks are associated with, the priority of the event, as well as the event category (whether the event is a scheduled event or an automated event).
[00121] As shown in FIG. 7F, the event management page 740 may contain a UI block 741 of a table showing columns such as reference number (no.), priority, type, category, location, date/time, and progress. The “priority” column may indicate the priority level of each event, and may be indicated by different icons representing different priority levels. The “type” column may indicate the type of event, e.g. fire, water, security etc., which may be represented by different icons. The “category” column may indicate whether the event is a scheduled event or an automated event. The “location” column may indicate the locations of the various events, the “date/time” column may indicate the date and time at which the event is reported, and the “progress” column may indicate the progress in which the event is being resolved. Clicking on a row associated with an event may trigger UI block 742 to be displayed. UI block 742 may show detailed information of the event, such as the number of tasks and the status of each task. The estimated time to complete each task is also indicated. When the task is completed, the actual time to complete the task is also shown in block 742.
[00122] FIG. 8A is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) according to various embodiments. As shown, the first officer is assigned the tasks E1T1 and E1T2, while the second officer is assigned the tasks E2T1 and E2T2. FIG. 8B is a schematic illustrating the revised scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments. As shown in FIG. 8B, the task E1T2 may be reallocated from the first officer to the second officer.
[00123] FIG. 8C is a schematic illustrating the scheduled task lists generated for the first officer (Officer1) and the second officer (Officer2) with the estimated time limits according to various embodiments. FIG. 8D is a schematic comparing the scheduled task lists and the revised task lists when the scheduling system detects a delay and a change in emotion of the first officer (Officer1) according to various embodiments. As shown in FIG. 8D, the first officer spends 70 minutes on E1T1, which is longer than the estimated time limit of 20 minutes. The second officer is able to complete his tasks (E2T1 and E2T2) within the estimated time limits and is re-assigned to complete E1T2, to which the first officer was originally assigned.
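The comparison in FIG. 8D amounts to checking each task’s actual time against its estimated time limit. A minimal sketch follows, using the figure’s numbers for E1T1 (70 minutes actual against a 20-minute estimate); the values assumed for the second officer’s tasks and the function name find_overruns are illustrative only.

```python
# Scheduled task lists with (estimated, actual) minutes, loosely mirroring FIG. 8C/8D.
schedule = {
    "Officer1": {"E1T1": (20, 70), "E1T2": (30, None)},   # E1T1 overran; E1T2 not started
    "Officer2": {"E2T1": (25, 22), "E2T2": (15, 14)},     # assumed values, completed on time
}


def find_overruns(task_lists):
    """Return (officer, task) pairs whose actual time exceeded the estimated limit."""
    overruns = []
    for officer, tasks in task_lists.items():
        for task, (estimated, actual) in tasks.items():
            if actual is not None and actual > estimated:
                overruns.append((officer, task))
    return overruns


print(find_overruns(schedule))   # [('Officer1', 'E1T1')]
```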
[00124] FIG. 9 is a general illustration of a scheduling system 900 according to various embodiments. The scheduling system 900 may include a backend unit 920 configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit 920 determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The scheduling system 900 may additionally include a scheduling unit 930 (also referred to as an optimizer/scheduler unit) configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit 920 may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task. The scheduling system 900 may be configured to select another candidate for the task upon the scheduling system 900 determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
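The replacement condition described above, namely an overrun of the estimated time limit combined with an emotional score below the predetermined threshold, can be sketched as follows. The threshold value, the Candidate fields, and the tie-breaking rule for picking the other candidate are assumptions for illustration and are not taken from the disclosure.

```python
from dataclasses import dataclass

# Illustrative threshold; the disclosure leaves the actual value to the implementation.
EMOTION_THRESHOLD = 0.5


@dataclass
class Candidate:
    name: str
    emotional_score: float    # e.g. as determined by the backend unit
    capability_score: float   # e.g. as taken from the capability matrix


def reassign_if_needed(elapsed_min: float,
                       estimated_limit_min: float,
                       current: Candidate,
                       remaining: list[Candidate]) -> Candidate:
    """Return the candidate who should continue the task.

    The current candidate is replaced only when BOTH conditions hold:
    the task has overrun its estimated time limit AND the candidate's
    emotional score has fallen below the predetermined threshold.
    """
    overrun = elapsed_min > estimated_limit_min
    low_emotion = current.emotional_score < EMOTION_THRESHOLD
    if overrun and low_emotion and remaining:
        # Pick the best remaining user, here by a simple combined score.
        return max(remaining,
                   key=lambda c: (c.emotional_score, c.capability_score))
    return current


if __name__ == "__main__":
    officer1 = Candidate("Officer1", emotional_score=0.2, capability_score=0.8)
    officer2 = Candidate("Officer2", emotional_score=0.7, capability_score=0.6)
    print(reassign_if_needed(70, 20, officer1, [officer2]).name)   # Officer2
```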
[00125] In various embodiments, the scheduling system 900 may further include a data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users. The backend unit 920 may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
[00126] The body condition of the respective user may be, for instance, a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user, or a skin temperature of the respective user. The one or more data signals indicating the body condition of the respective user may be determined by a wearable sensor, i.e. one or more body sensors included in the wearable sensor.
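A reading from such a wearable sensor could be represented, for instance, as the record sketched below; the field names and units are assumptions, since the disclosure only requires that the signals indicate heart beat rate, respiratory rate, perspiration rate and/or skin temperature.

```python
from dataclasses import dataclass


@dataclass
class BodyConditionSignal:
    """One reading reported by a wearable sensor for a user.

    The field names and units are illustrative assumptions.
    """
    user_id: str
    heart_rate_bpm: float
    respiratory_rate_bpm: float
    perspiration_rate: float     # arbitrary sensor units
    skin_temperature_c: float


reading = BodyConditionSignal("Officer1", 96.0, 18.0, 0.4, 36.9)
print(reading)
```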
[00127] In various embodiments, the data input unit may be configured to receive one or more images of the respective user of the plurality of users. The backend unit 920 may be configured to extract features indicating a facial expression or behavior of the respective user, and may be further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
[00128] The one or more images may, for instance, be still images or a video feed of the respective user captured by an image capture device, e.g. a camera or a video camera. The image capture device may be attached to or worn by the respective user. For instance, the image capture device may be attached to clothing of the respective user. In various embodiments, the wearable sensor may include the one or more body sensors, and the image capture device. In various other embodiments, the wearable sensor and the image capture device may be separate devices.
[00129] In various embodiments, the scheduling system 900 may further include a user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users. The scheduling unit 930 may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users. In other words, in various embodiments, the scheduling unit 930 may be configured to select the candidate based on the information under the one or more different categories received by the user interface unit, as well as on the one or more data signals received by the data input unit, and optionally also on the one or more images.
[00130] In various embodiments, the one or more different categories may be an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and/or an overall health score of the respective user.
[00131] In various embodiments, the emotional score or emotion of the respective user may also be based on the body condition of one or more persons interacting with the respective user. In various embodiments, the emotional score or emotion of the respective user may also be based on the extracted features and the body condition of the one or more persons interacting with the respective user.
[00132] In various embodiments, the user interface unit may be configured to display a graphics user interface (GUI) including a map input page, a map display page, an event management page for displaying the task, and an officer management page.
[00133] In various embodiments, one category of the one or more different categories may be assigned a weightage different from a weightage assigned to another category of the one or more different categories.
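One simple way to realise different weightages, sketched below under assumed category names and weight values, is to collapse the per-category information into a single weighted score; the disclosure does not prescribe the weights or this particular combination rule.

```python
# Illustrative weightages; the disclosure only requires that categories
# may be weighted differently, not these particular values.
WEIGHTS = {
    "experience": 0.3,
    "skill_relevance": 0.3,
    "on_time_history": 0.2,
    "overall_health": 0.2,
}


def weighted_category_score(category_scores: dict) -> float:
    """Combine normalised per-category scores (0..1) into one weighted score."""
    return sum(WEIGHTS[c] * category_scores.get(c, 0.0) for c in WEIGHTS)


print(weighted_category_score(
    {"experience": 0.8, "skill_relevance": 0.9,
     "on_time_history": 0.6, "overall_health": 0.7}
))   # 0.77
```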
[00134] In various embodiments, the scheduling unit 930 may be configured to generate a capability matrix including the plurality of emotional scores or emotions pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users. The capability matrix may also include scores determined by whether the different users of the plurality of users are able to complete previous tasks within predefined estimated time limits.
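Such a capability matrix can be pictured as one row per user combining the emotional score, the weighted category information, and an on-time completion score, with users ranked by the combined value. The row layout, the equal averaging of the three scores, and the ranking rule in the sketch below are illustrative assumptions.

```python
def build_capability_matrix(users):
    """users: {name: {"emotional_score": float,
                      "category_score": float,    # weighted category information
                      "on_time_score": float}}    # share of previous tasks finished in time
    Returns rows sorted so that rank 1 is the most capable user."""
    rows = []
    for name, info in users.items():
        total = (info["emotional_score"]
                 + info["category_score"]
                 + info["on_time_score"]) / 3.0
        rows.append({"user": name, **info, "capability_score": round(total, 3)})
    rows.sort(key=lambda r: r["capability_score"], reverse=True)
    for rank, row in enumerate(rows, start=1):
        row["rank"] = rank
    return rows


matrix = build_capability_matrix({
    "Officer1": {"emotional_score": 0.4, "category_score": 0.8, "on_time_score": 0.5},
    "Officer2": {"emotional_score": 0.7, "category_score": 0.6, "on_time_score": 0.9},
})
print([(r["user"], r["rank"]) for r in matrix])   # [('Officer2', 1), ('Officer1', 2)]
```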
[00135] In various embodiments, the capability matrix may be updated at regular time intervals. For instance, the capability matrix may be updated once a day, once a month, or once a year. The capability matrix may be updated based on the plurality of emotional scores pertaining to the different users of the plurality of users during previous tasks, as well as based on scores determined by whether the different users of the plurality of users are able to complete previous tasks within predefined estimated time limits.
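As an illustration of such a regular update, the sketch below blends each stored row with the scores observed during the latest interval using a simple moving average; the smoothing factor and update rule are assumptions rather than part of the disclosure.

```python
def update_capability_row(row, recent_emotional_scores, tasks_on_time, tasks_total,
                          alpha=0.3):
    """Blend a user's stored scores with observations from the latest interval.

    alpha is an assumed smoothing factor; the disclosure only states that the
    matrix is refreshed at regular intervals (e.g. daily, monthly or yearly).
    """
    if recent_emotional_scores:
        observed = sum(recent_emotional_scores) / len(recent_emotional_scores)
        row["emotional_score"] = (1 - alpha) * row["emotional_score"] + alpha * observed
    if tasks_total:
        observed_on_time = tasks_on_time / tasks_total
        row["on_time_score"] = (1 - alpha) * row["on_time_score"] + alpha * observed_on_time
    return row


row = {"user": "Officer1", "emotional_score": 0.4, "on_time_score": 0.5}
print(update_capability_row(row, [0.6, 0.8], tasks_on_time=1, tasks_total=2))
```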
[00136] In various embodiments, the other candidate may be selected from the remaining users of the plurality of users based on the emotional scores or emotions as well as information received under the one or more different categories in relation to the remaining users.
[00137] In various embodiments, the backend unit 920 may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the body condition of the respective user.
[00138] In various embodiments, the backend unit 920 may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression/behavior of the respective user, and based on the body condition of the respective user.
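The four probabilities (happy, relaxed, angry, fearful) could be produced by any classifier over the body-condition signals and the extracted facial features. The sketch below stands in for such a classifier with hand-written heuristic scores and a softmax normalisation; the scoring rules, the smile_intensity feature, and the signal ranges are assumptions, not the disclosed model.

```python
import math


def emotion_probabilities(heart_rate_bpm, respiratory_rate_bpm,
                          skin_temperature_c, smile_intensity):
    """Return probabilities for happy, relaxed, angry and fearful.

    The inputs mix body-condition signals with one extracted facial feature
    (smile_intensity in 0..1). The scoring rules are placeholders; a trained
    model would normally sit here.
    """
    arousal = (heart_rate_bpm - 60) / 60 + (respiratory_rate_bpm - 12) / 12
    scores = {
        "happy":   2.0 * smile_intensity + 0.5 * max(arousal, 0.0),
        "relaxed": 1.5 * (1.0 - smile_intensity) - arousal,
        "angry":   arousal + 0.5 * (skin_temperature_c - 36.5) - smile_intensity,
        "fearful": arousal - smile_intensity,
    }
    # Softmax so the four probabilities sum to one.
    exp = {k: math.exp(v) for k, v in scores.items()}
    total = sum(exp.values())
    return {k: round(v / total, 3) for k, v in exp.items()}


print(emotion_probabilities(95, 20, 37.1, smile_intensity=0.1))
```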
[00139] Various embodiments may relate to a network including the scheduling system as well as the wearable sensors and/or the image capture devices worn by or attached to the respective users. Various embodiments may refer to a method of forming and/or operating the network.
[00140] FIG. 10 is a general illustration of a method 1000 of forming a scheduling system according to various embodiments. The method may include, in 1020, providing a backend unit of the scheduling system, the backend unit configured to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The method may further include, in 1030, providing a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task. The scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
[00141] In various embodiments, the method may include providing a data input unit of the scheduling system, the data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users. The backend unit may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
[00142] In various embodiments, the data input unit may be configured to receive one or more images of the respective user of the plurality of users. The backend unit may be configured to extract features indicating a facial expression/behavior of the respective user, and further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
[00143] In various embodiments, the method may additionally include providing a user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users. The scheduling unit may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
[00144] In various embodiments, the one or more different categories may be an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and an overall health score of the respective user.
[00145] In various embodiments, the user interface unit may be configured to display a graphics user interface (GUI) comprising a map input page, a map display page, an event management page for displaying the task, and an officer management page.
[00146] In various embodiments, one category of the one or more different categories may be assigned a weightage different from a weightage assigned to another category of the one or more different categories.
[00147] In various embodiments, the scheduling unit may be configured to generate a capability matrix comprising the plurality of emotional scores pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users.
[00148] In various embodiments, the capability matrix may be updated at regular time intervals.
[00149] In various embodiments, the body condition of the respective user may be a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user or a skin temperature of the respective user.
[00150] In various embodiments, the backend unit may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the body condition of the respective user.
[00151] In various embodiments, the backend unit may be configured to determine the emotional score or emotion of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression/behavior of the respective user, and based on the body condition of the respective user.
[00152] In various embodiments, the one or more images of the respective user may be still images or a video of the respective user captured by an image capture device.
[00153] In various embodiments, the one or more data signals indicating the body condition of the respective user may be determined by a wearable sensor.
[00154] FIG. 11 is a general illustration of a method 1100 of operating a scheduling system according to various embodiments. The method may include, in 1120, using a backend unit of the scheduling system to determine an emotional score or emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores or emotions pertaining to different users of the plurality of users. The method may also include, in 1130, using a scheduling unit of the scheduling system to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores or emotions pertaining to different users of the plurality of users. The backend unit may be configured to monitor a time taken by the candidate to complete the task and to determine the emotional score or emotion of the candidate when the candidate is performing the task. The scheduling system may be configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score or emotion of the candidate falling below a predetermined threshold.
[00155] In various embodiments, the method may include using a data input unit of the scheduling system to receive one or more images and one or more data signals indicating a body condition of a respective user of a plurality of users. The backend unit may be configured to determine the emotional score or emotion pertaining to the respective user based on the body condition of the respective user.
[00156] In various embodiments, the data input unit may be configured to receive one or more images of the respective user of the plurality of users. The backend unit may be configured to extract features indicating a facial expression or behavior of the respective user, and further configured to determine the emotional score or emotion pertaining to the respective user also based on the extracted features.
[00157] In various embodiments, the method may also include using a user interface unit to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users. The scheduling unit may be configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
[00158] While the invention has been particularly shown and described with reference to specific embodiments, it should be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The scope of the invention is thus indicated by the appended claims and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced.

Claims
1. A scheduling system comprising:
a backend unit configured to determine an emotional score pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores pertaining to different users of the plurality of users; and
a scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores pertaining to different users of the plurality of users;
wherein the backend unit is configured to monitor a time taken by the candidate to complete the task and to determine the emotional score of the candidate when the candidate is performing the task; and
wherein the scheduling system is configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score of the candidate falling below a predetermined threshold.
2. The scheduling system according to claim 1, further comprising: a data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users; wherein the backend unit is configured to determine the emotional score pertaining to the respective user based on the body condition of the respective user.
3. The scheduling system according to claim 2, wherein the body condition of the respective user is a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user, or a skin temperature of the respective user.
4. The scheduling system according to claim 2 or claim 3, wherein the one or more data signals indicating the body condition of the respective user are determined by a wearable sensor.
5. The scheduling system according to any one of claims 2 to 4, wherein the data input unit is configured to receive one or more images of the respective user of the plurality of users; and wherein the backend unit is configured to extract features indicating a facial expression of the respective user, and further configured to determine the emotional score pertaining to the respective user also based on the extracted features.
6. The scheduling system according to claim 5, wherein the one or more images of the respective user are still images or a video of the respective user captured by an image capture device.
7. The scheduling system according to any one of claims 1 to 6, further comprising: a user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users; wherein the scheduling unit is configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
8. The scheduling system according to claim 7, wherein the one or more different categories are an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and an overall health score of the respective user.
9. The scheduling system according to claim 7 or claim 8, wherein the user interface unit is configured to display a graphics user interface (GUI) comprising a map input page, a map display page, an event management page for displaying the task, and an officer management page.
10. The scheduling system according to any one of claims 7 to 9, wherein one category of the one or more different categories is assigned a weightage different from a weightage assigned to another category of the one or more different categories.
11. The scheduling system according to any one of claims 7 to 10, wherein the scheduling unit is configured to generate a capability matrix comprising the plurality of emotional scores pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users.
12. The scheduling system according to claim 11, wherein the capability matrix is updated at regular time intervals.
13. The scheduling system according to any one of claims 1 to 12, wherein the backend unit is configured to determine the emotional score of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression of the respective user, and based on the body condition of the respective user.
14. A method of forming a scheduling system, the method comprising:
providing a backend unit of the scheduling system, the backend unit configured to determine an emotional score pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotional scores pertaining to different users of the plurality of users; and
providing a scheduling unit of the scheduling system, the scheduling unit configured to select a candidate from the plurality of users to perform a task based on the plurality of emotional scores pertaining to different users of the plurality of users;
wherein the backend unit is configured to monitor a time taken by the candidate to complete the task and to determine the emotional score of the candidate when the candidate is performing the task; and
wherein the scheduling system is configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotional score of the candidate falling below a predetermined threshold.
15. The method according to claim 14, further comprising: providing a data input unit of the scheduling system, the data input unit configured to receive one or more data signals indicating a body condition of the respective user of the plurality of users; wherein the backend unit is configured to determine the emotional score pertaining to the respective user based on the body condition of the respective user.
16. The method according to claim 15, wherein the body condition of the respective user is a heart beat rate of the respective user, a respiratory rate of the respective user, a perspiration rate of the respective user, or a skin temperature of the respective user.
17. The method according to claim 15 or claim 16, wherein the one or more data signals indicating the body condition of the respective user are determined by a wearable sensor.
18. The method according to any one of claims 15 to 17, wherein the data input unit is configured to receive one or more images of the respective user of the plurality of users; and wherein the backend unit is configured to extract features indicating a facial expression of the respective user, and further configured to determine the emotional score pertaining to the respective user also based on the extracted features.
19. The method according to claim 18, wherein the one or more images of the respective user are still images or a video of the respective user captured by an image capture device.
20. The method according to any one of claims 14 to 19, further comprising: providing a user interface unit of the scheduling system, the user interface unit configured to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users; wherein the scheduling unit is configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
21. The method according to claim 20, wherein the one or more different categories are an age of the respective user, a position of the respective user, experience of the respective user, a relevance of a skillset of the respective user for the task, actual completion times in relation to estimated completion times of previous tasks allocated to the respective user, and an overall health score of the respective user.
22. The method according to claim 20 or claim 21, wherein the user interface unit is configured to display a graphics user interface (GUI) comprising a map input page, a map display page, an event management page for displaying the task, and an officer management page.
23. The method according to any one of claims 20 to 22, wherein one category of the one or more different categories is assigned a weightage different from a weightage assigned to another category of the one or more different categories.
24. The method according to any one of claims 20 to 23, wherein the scheduling unit is configured to generate a capability matrix comprising the plurality of emotional scores pertaining to different users of the plurality of users, as well as information received under the one or more different categories in relation to different users of the plurality of users.
25. The method according to claim 24, wherein the capability matrix is updated at regular time intervals.
26. The method according to any one of claims 14 to 25, wherein the backend unit is configured to determine the emotional score of the respective user by determining a probability that the respective user is happy, a probability that the respective user is relaxed, a probability that the respective user is angry, and a probability that the respective user is fearful based on the extracted features of the facial expression of the respective user, and based on the body condition of the respective user.
27. A method of operating a scheduling system, the method comprising:
using a backend unit of the scheduling system to determine an emotion pertaining to a respective user of a plurality of users, such that the backend unit determines a plurality of emotions pertaining to different users of the plurality of users; and
using a scheduling unit of the scheduling system to select a candidate from the plurality of users to perform a task based on the plurality of emotions pertaining to different users of the plurality of users;
wherein the backend unit is configured to monitor a time taken by the candidate to complete the task and to determine the emotion of the candidate when the candidate is performing the task; and
wherein the scheduling system is configured to select another candidate for the task upon the scheduling system determining that the time taken by the candidate to complete the task is beyond an estimated time limit and upon the emotion of the candidate falling below a predetermined threshold.
28. The method according to claim 27, further comprising: using a data input unit of the scheduling system to receive one or more images and one or more data signals indicating a body condition of a respective user of a plurality of users; wherein the backend unit is configured to determine the emotion pertaining to the respective user based on the body condition of the respective user.
29. The method according to claim 27 or claim 28, further comprising: using a user interface unit of the scheduling system to receive information under one or more different categories in relation to the respective user, such that the user interface unit is configured to receive information under the one or more different categories in relation to different users of the plurality of users; wherein the scheduling unit is configured to select the candidate also based on information received under the one or more different categories in relation to different users of the plurality of users.
