WO2020008577A1 - System, method, and program for predicting presence and absence - Google Patents

System, method, and program for predicting presence and absence

Info

Publication number
WO2020008577A1
WO2020008577A1 (PCT/JP2018/025417)
Authority
WO
WIPO (PCT)
Prior art keywords
employee
attendance
prediction
image
face image
Prior art date
Application number
PCT/JP2018/025417
Other languages
English (en)
Japanese (ja)
Inventor
圭祐 内田
和矢 吉田
Original Assignee
株式会社スペイシー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社スペイシー filed Critical 株式会社スペイシー
Priority to PCT/JP2018/025417 priority Critical patent/WO2020008577A1/fr
Publication of WO2020008577A1 publication Critical patent/WO2020008577A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/10Office automation; Time management

Definitions

  • the present invention relates to an attendance prediction system, method, and program for predicting the attendance of an employee.
  • Patent Document 1 discloses a technique for indicating the health condition of an employee based on overtime hours, the number of days away from work, late leaving times, and the like.
  • Patent Document 1 has an index value calculation unit that calculates an index value indicating the probability that a prediction result obtained by machine learning is correct, by referring to a database in which information having a plurality of items is stored; a contribution calculation unit that calculates the degree to which each item contributes to the index value; and an output unit that outputs information including the index value and the contributions.
  • Japanese Patent Application Laid-Open No. 2003-189097 proposes presenting, among the index values calculated from the values of a plurality of items, the degree to which each item contributes to the index value.
  • However, Patent Document 1 has the problem that it cannot predict the attendance of an employee from a change in complexion.
  • the present invention focuses on the above points, and its object is to obtain a face image of an employee and predict attendance from changes in the face image.
  • the present invention is an attendance prediction system that predicts attendance from an image of an employee, comprising: first acquisition means for acquiring the image of the employee; second acquisition means for acquiring a face image of the employee from the acquired image; means for identifying the employee by the face image; third acquisition means for acquiring attendance information associated with the employee; learning means for learning using images of the employee for a plurality of days and the attendance information; and prediction means for predicting the subsequent attendance of the employee according to the learned result.
  • the present invention also relates to an attendance prediction method for predicting attendance from an image of an employee, the method comprising: a first acquisition step of acquiring an image of the employee; a second acquisition step of acquiring a face image of the employee from the acquired image; a step of identifying the employee by the face image; a third acquisition step of acquiring attendance information associated with the employee; a step of learning using images of the employee for a plurality of days and the attendance information; and a step of predicting the subsequent attendance of the employee according to the learned result.
  • the present invention is also a program for causing a computer to execute attendance prediction processing from an image of an employee, the processing comprising: a first acquisition step of acquiring the image of the employee; a second acquisition step of acquiring a face image of the employee; a step of identifying the employee by the face image; a third acquisition step of acquiring attendance information associated with the employee; a step of learning using images of the employee and the attendance information; and a step of predicting the subsequent attendance of the employee according to the learned result.
  • according to the present invention, by learning the face image captured at clock-in by face authentication in association with attendance information, it is possible to predict subsequent attendance from changes in the face image.
  • FIG. 5 is a diagram illustrating an example of the learning data of the embodiment in time series. FIG. 6 is a flowchart showing an example of the attendance prediction process of the embodiment. FIG. 7 is a diagram showing another example of acquiring an image of an employee in the embodiment.
  • the present invention learns the face image captured at clock-in by face authentication in association with attendance information, and predicts subsequent attendance from the face image. Then, if necessary, a notice is sent to the employee, a related person (such as a boss), or an industrial physician.
  • the best mode for carrying out the present invention will be described in detail based on examples.
  • FIG. 1 is a conceptual diagram showing an outline of the attendance prediction system according to the present embodiment.
  • the attendance prediction system 100 includes a server 10 that learns a face image at the time of time stamping by face authentication in association with attendance information and performs subsequent attendance prediction from the face image, and an imaging device 60.
  • the imaging device 60 captures a face image of the employee 70, and may be a tablet 62 or a monitoring camera 66 as shown in FIG. 1. In the case of the tablet 62, a predetermined area (an area whose four corners are indicated by L-shaped marks) of the image displayed on the display unit 64 is acquired as the face image.
  • the server 10 learns the face image captured at clock-in by face authentication in association with the attendance information, and performs subsequent attendance prediction from the face image. Then, if necessary, the result of the attendance prediction is notified to the terminals of the employee 70, the related person 72 of the employee 70, the doctor 74, and the like.
  • the related person 72 may be the boss of the employee 70 or a family member.
  • the doctor 74 may be, for example, an industrial physician or a family doctor of the employee 70.
  • FIG. 2 is a block diagram showing a hardware configuration of a server of the attendance prediction system according to the present embodiment.
  • the server 10 includes a processor 12, a memory 14, a storage 16, and a communication unit 18, which are connected by a bus (not shown).
  • the processor 12 includes, for example, a CPU (Central Processing Unit) and performs various processes by reading and executing various programs stored in the memory 14.
  • the memory 14 stores programs to be executed by the processor 12, and includes, for example, a ROM (Read Only Memory) and a RAM (Random Access Memory). For example, the various means shown in FIG. 4 are stored.
  • the storage 16 stores employee data 16A, acquired data 16B, a control program (not shown), and the like. In addition, learning data and the like may be stored.
  • the communication unit 18 performs various data communications with external storage means (not shown), terminals of the employees 70, related persons 72, doctors 74, and the like via a network.
  • FIG. 3 shows an example of the employee data 16A.
  • the employee data 16A stores data for each employee, such as employee data 20A for employee A, employee data for employee B, and so on.
  • for each employee, basic information 22, a face image 24, attendance information 26, and health information 28 are stored.
  • the basic information 22 stores the name, gender, age, date of birth, blood type, weight, and the like of the employee A.
  • the face image 24 is a face image registered in advance to identify the employee A.
  • the attendance information 26 stores the attendance status of the employee A, for example, a date, a joining time, a leaving time, a working time, and the like.
  • the health information 28 stores hospital-visit information of the employee A, medical chart information, and the like.
  • the data structure is the same for the employee B and other employees.
  • meal data such as how many calories have been eaten in the last few days or meal contents in the morning, day and night may be stored as employee data.
  • sleep data such as how many hours of sleep or the depth of sleep may be stored as employee data.
  • an action log indicating where the employee went, obtained by GPS, may be stored as employee data.
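The per-employee record described above (basic information 22, face image 24, attendance information 26, health information 28, plus the optional meal, sleep, and GPS logs) can be sketched as a simple data structure. This is an illustrative sketch only; all field and class names are assumptions, not taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class AttendanceEntry:
    date: str       # e.g. "2018-04-03"
    clock_in: str   # joining time
    clock_out: str  # leaving time
    status: str     # "normal", "late", "half-day off", "full-day off"

@dataclass
class EmployeeRecord:
    name: str
    registered_face: bytes = b""                      # face image 24
    attendance: list = field(default_factory=list)    # attendance information 26
    health_notes: list = field(default_factory=list)  # health information 28
    meal_log: list = field(default_factory=list)      # optional meal data
    sleep_log: list = field(default_factory=list)     # optional sleep data
    gps_log: list = field(default_factory=list)       # optional action log

# Example: record a late arrival for employee A
emp = EmployeeRecord(name="Employee A")
emp.attendance.append(AttendanceEntry("2018-04-03", "10:00", "18:00", "late"))
```

The optional logs default to empty lists, matching the patent's statement that meal, sleep, and action data "may be stored" in addition to the core fields.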
  • FIG. 4 is a block diagram showing the functional configuration of the server of the attendance prediction system according to the present embodiment.
  • the server 10 includes a first acquisition unit 30, a second acquisition unit 32, a third acquisition unit 34, an employee identification unit 36, a learning unit 38, an attendance prediction unit 40, a prediction result notification unit 42, a disease determination unit 44, overtime control means 46, and interview setting means 48.
  • the first acquisition unit 30 acquires an image of the employee 70, and acquires an image of the employee captured by the imaging device 60 such as the tablet 62 or the monitoring camera 66.
  • the image captured by the imaging device 60 may be a still image, or may include a moving image of an employee 70 walking.
  • the acquired image is stored in the acquired data 16B of the storage 16.
  • the second acquisition unit 32 acquires a face image of the employee 70 from the images acquired by the first acquisition unit 30. This is to extract only the face image when the image acquired by the first acquisition unit 30 is an image of the entire body of the employee 70, as described later.
  • the employee identifying means 36 identifies the employee by the face image acquired by the second acquiring means 32, and compares the acquired face image with the face image 24 previously stored in the employee data 16A. The employee 70 is identified.
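The identification step above can be sketched as a comparison between an embedding of the captured face and pre-registered embeddings, taking the closest match above a similarity threshold. The embedding function itself (a face-recognition model) is assumed and not shown; the vectors and threshold are illustrative.

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def identify(captured, registered, threshold=0.8):
    """Return the best-matching employee id, or None if no match clears the threshold."""
    best_id, best_sim = None, threshold
    for emp_id, ref in registered.items():
        sim = cosine_similarity(captured, ref)
        if sim > best_sim:
            best_id, best_sim = emp_id, sim
    return best_id

# Toy registered face embeddings (stand-ins for face image 24):
registered = {"A": [1.0, 0.0, 0.2], "B": [0.0, 1.0, 0.1]}
print(identify([0.9, 0.1, 0.2], registered))  # → "A"
```

Rejecting matches below a threshold, rather than always taking the nearest face, is what lets the system avoid misidentifying visitors who are not in the employee data 16A.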
  • the third acquisition unit 34 acquires the attendance information 26 associated with the identified employee 70 from the employee data 16A.
  • the learning means 38 performs learning using the images of the employee 70 for a plurality of days and the attendance information 26. Specifically, feature points of the face image are extracted using image analysis, and the feature points are associated with attendance information. Further, not only the face image but also the image of the body posture and the image of the walking speed may be similarly extracted with feature points and associated with attendance information. For learning, machine learning or deep learning is used, but learning techniques other than these may be used.
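The learning step pairs facial feature points with attendance labels over several days. As a minimal stand-in for the machine or deep learning the patent leaves open, the sketch below uses a nearest-centroid model over a hypothetical 2-vector (eyebrow position, mouth-corner angle); the feature values and labels are invented for the example.

```python
from collections import defaultdict

def train_centroids(samples):
    """samples: list of ((x, y) feature vector, attendance label).
    Returns the mean feature vector per label."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for (x, y), label in samples:
        s = sums[label]
        s[0] += x
        s[1] += y
        s[2] += 1
    return {lbl: (s[0] / s[2], s[1] / s[2]) for lbl, s in sums.items()}

# Feature points over a plurality of days, paired with attendance info:
days = [
    ((0.1, 0.1), "normal"),        # relaxed brow, neutral mouth
    ((0.2, 0.0), "normal"),
    ((0.8, 0.7), "late"),          # frowning brow
    ((0.9, 0.9), "half-day off"),  # frown plus lowered mouth corner
]
model = train_centroids(days)
```

A production system would replace the centroid model with the machine learning or deep learning the patent names, but the training contract is the same: features extracted from face images in, a model keyed by attendance outcomes out.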
  • FIG. 5 shows the learning data 50, in which the face images of the employee 70 for a plurality of days, captured at clock-in, are associated with the attendance information.
  • the learning data 50 shown in FIG. 5 includes a date column 52, a face image column 54, and an attendance information column 56.
  • the attendance information for 2018/4/1, 2018/4/2, and 2018/5/1 is "normal work", and the face image is also a normal expression.
  • the attendance information for 2018/4/3 is "1 hour late", and the facial expression is frowning.
  • the attendance information for 2018/4/4 is "AM half-day off", and the face image shows a frowning expression with lowered mouth corners.
  • when the attendance information is "full-day off", as on 2018/4/5, no face image exists.
  • the attendance prediction unit 40 predicts the attendance of the employee 70 based on the result learned by the learning unit 38. Specifically, the subsequent attendance of the employee 70 is predicted from the newly acquired face image of the employee 70. This prediction may be performed based on the temporal change of the image for a plurality of days, or may be performed regardless of the temporal change. For example, in many cases, it is expected that the attendance will deteriorate when the complexion deteriorates, and it is predicted that the lateness will increase in the near future.
  • the prediction may be performed based on the expression of the face image instead of the complexion, or the prediction may be performed in consideration of both the complexion and the facial expression.
  • the attendance predicting means 40 may perform the subsequent attendance prediction of the employee based on the employee's health information 28 in addition to the face image and the attendance information 26, and may include meal data, sleep data, Attendance prediction may be performed using the action log.
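The prediction step above reduces a newly acquired face image to the same feature vector used at training time and reads the forecast off the closest learned pattern. The centroids and labels below are invented for the example, continuing the nearest-centroid stand-in for the patent's unspecified learning method.

```python
def predict(features, centroids):
    """Return the attendance label of the nearest learned centroid."""
    def dist2(a, b):
        # Squared Euclidean distance; the monotone ordering is all we need.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(features, centroids[lbl]))

# Learned patterns (hypothetical): relaxed face vs. frowning face.
centroids = {
    "normal work": (0.15, 0.05),
    "likely late": (0.85, 0.80),
}
print(predict((0.9, 0.7), centroids))  # → "likely late"
```

Predicting from the temporal change over several days, as the patent also allows, would amount to feeding a sequence of such vectors rather than a single day's vector.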
  • the prediction result notifying means 42 is for notifying the employee 70, his or her related person 72, or the doctor 74 of the attendance prediction obtained by the attendance predicting means 40.
  • as a means of notification, for example, an e-mail may be transmitted to the terminal of the employee 70, the related person 72, or the doctor 74, or a message may be displayed on a display unit of the terminal.
  • if the attendance prediction indicates that the physical condition of the employee 70 is deteriorating, a gentle message such as "Is your physical condition OK recently?" is sent to the employee 70.
  • a notification is sent to the related person 72, who is the boss of the employee 70, such as "It is highly probable that Mr. A will not be able to come to the office for about three days next week".
  • the disease determining means 44 compares a learned image of a patient with the image of the employee 70 to determine whether the employee has a predetermined disease. For example, in cooperation with a hospital or the like, images of patients suffering from the disease are learned in advance, and whether the employee is sick is determined from the image acquired at clock-in. The determination result by the disease determination means 44 may be notified to the employee 70, the related person 72, and the doctor 74.
  • the overtime control means 46 controls overtime of the employee 70 in accordance with the attendance prediction obtained by the attendance prediction means 40. More specifically, a message such as "Take care of overtime" or "It is time to leave the office" is sent to the terminal of the employee 70 who appears to be in poor health. Further, a message such as "The terminal will be unavailable in 30 minutes" may be sent. By sending such messages to reduce the overtime of an employee 70 whose condition is predicted to worsen, the employee can be made conscious of reducing working hours and of securing time to recover.
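The overtime-control idea above can be sketched as a mapping from a predicted health risk to one of the escalating messages described in the text. The thresholds are illustrative assumptions; only the message wording comes from the description.

```python
def overtime_message(risk):
    """risk: assumed predicted probability (0..1) that the employee's
    condition is deteriorating; returns the intervention message, if any."""
    if risk >= 0.8:
        return "The terminal will be unavailable in 30 minutes"
    if risk >= 0.5:
        return "It is time to leave the office"
    if risk >= 0.3:
        return "Take care of overtime"
    return None  # no intervention needed

print(overtime_message(0.6))  # → "It is time to leave the office"
```

Escalating from a reminder to forced terminal shutdown keeps the lightest intervention for the common case while still guaranteeing a hard stop for high-risk predictions.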
  • the interview setting means 48 sets a schedule for an interview between the employee 70 and the related person 72 or the doctor 74 according to the attendance prediction obtained by the attendance prediction means 40. Then, the set interview schedule is notified, by e-mail or message, both to the terminal of the related person 72 or the doctor 74 and to the terminal of the employee 70.
  • FIG. 6 is a flowchart illustrating an example of the attendance prediction process according to the present embodiment.
  • the first acquisition unit 30 of the server 10 acquires an image of the employee 70 captured by the imaging unit 60 such as the tablet 62 or the monitoring camera 66 (Step S10).
  • the second acquisition unit 32 acquires a face image of the employee 70 from among the acquired images of the employee 70 (Step S12).
  • These acquired data are stored in the storage 16 of the server 10 as acquired data 16B. The same applies to the data acquired by the server 10 in the following procedures.
  • the employee identifying means 36 identifies the employee 70 based on the acquired face image and the face image 24 stored in the employee data 16A (step S14).
  • the third obtaining unit 34 obtains the attendance information linked to the identified employee 70 (Step S16).
  • the learning unit 38 learns using the images of the employee 70 for a plurality of days and the attendance information acquired by the third acquiring unit 34 (step S18). Specifically, feature points of the face image are extracted using image analysis, and the feature points are associated with attendance information.
  • the feature point of the face image may be, for example, a complexion or a facial expression. In the case of a facial expression, the position of the eyebrows or the position of the corner of the mouth can be used as a feature point.
  • feature points may be extracted for posture and walking speed from images of the entire body as well as face images, and may be associated with attendance information.
  • the walking speed can be calculated, for example, by confirming the time required to move a predetermined distance from a moving image in which the employee 70 is walking. For learning, machine learning, deep learning, or the like is used, but other learning methods may be used.
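The walking-speed calculation described above reduces to distance over elapsed time: given the video frames at which the employee crosses two reference points a known distance apart, the speed follows directly. Frame rate and distance below are example values, not figures from the patent.

```python
def walking_speed(frame_enter, frame_exit, fps, distance_m):
    """Speed in m/s over a known distance between two video frames."""
    elapsed_s = (frame_exit - frame_enter) / fps
    return distance_m / elapsed_s

# Employee crosses a 5 m corridor between frames 30 and 150 of 30 fps video:
speed = walking_speed(30, 150, fps=30.0, distance_m=5.0)
print(round(speed, 2))  # → 1.25
```

The resulting scalar can then be appended to the feature vector alongside the facial feature points, so that slower-than-usual walking contributes to a poor-condition prediction.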
  • FIG. 5 shows learning data 50 in which images of an employee 70 for a plurality of days are associated with attendance information.
  • the learning unit 38 extracts feature points of the face images and associates them with the attendance information.
  • the attendance prediction unit 40 predicts the subsequent attendance of the employee 70 according to the result learned by the learning unit 38 (step S20). Specifically, the subsequent attendance of the employee 70 is predicted from a newly acquired face image. For example, in FIG. 5, the face image of 2018/5/1 is the newly obtained face image, and the subsequent attendance is predicted from that face image and the learning result, for example, "likely to be late in the near future". This attendance prediction may be performed from the temporal change of the images over a plurality of days, or regardless of the temporal change.
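The steps S10 through S20 above can be sketched as one pipeline. Every component below (capture, face extraction, identification, attendance lookup, learning, prediction) is a trivial stand-in for the corresponding means in the patent; only the control flow between the steps is the point of the example.

```python
def run_pipeline(raw_image, employee_db, extract_face, identify,
                 fetch_attendance, learn, predict):
    face = extract_face(raw_image)        # S12: second acquisition
    emp_id = identify(face, employee_db)  # S14: employee identification
    attendance = fetch_attendance(emp_id) # S16: third acquisition
    model = learn(emp_id, attendance)     # S18: learning
    return emp_id, predict(model, face)   # S20: attendance prediction

# Trivial stand-ins to exercise the flow end to end:
result = run_pipeline(
    raw_image="<frame>",
    employee_db={"A": "face-A"},
    extract_face=lambda img: "face-A",
    identify=lambda face, db: "A",
    fetch_attendance=lambda eid: ["normal", "late"],
    learn=lambda eid, att: {"trend": att[-1]},
    predict=lambda model, face: f"likely {model['trend']}",
)
print(result)  # → ('A', 'likely late')
```

Structuring the server this way keeps each acquisition means swappable, so the tablet 62, the monitoring camera 66, or the gate camera of the modified example can all feed the same downstream steps.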
  • the prediction result notifying means 42 notifies the employee 70, his or her related person 72, or the doctor 74 of the attendance prediction obtained by the attendance prediction means 40 (step S22).
  • the notification is performed by sending an e-mail to the terminal of the employee 70, the related person 72, or the doctor 74, or by displaying a message on the display unit of the terminal. If the attendance prediction indicates that the physical condition of the employee 70 is deteriorating, a gentle message such as "Is your physical condition OK recently?" is sent to the employee 70. The related person 72, who is the boss of the employee 70, is notified that "It is highly probable that Mr. A will not be able to come to the office for about three days next week". In addition, the doctor 74 is notified that "There is a high possibility that Mr. A will see a doctor next week".
  • a hospital or the like may cooperate to provide a sick patient's face image and add the provided sick patient's face image to the learning data.
  • the learning means 38 learns the provided images of patients suffering from a predetermined disease, and the disease judgment means 44 may compare the learned patient images with the image of the employee 70 to determine whether the employee has the disease. This determination result may be notified to the employee 70, the related person 72, or the doctor 74 together with the attendance prediction, or may be notified separately. Further, the disease determination itself may be used for the attendance prediction, and that result may be notified.
  • the overtime control means 46 may control the overtime of the employee according to the attendance prediction. More specifically, a message such as "Take care of overtime" or "It is time to leave the office" is sent to the terminal of the employee 70 who appears to be in poor health. Further, a message such as "The terminal will be unavailable in 30 minutes" may be sent. By sending such messages to reduce overtime, the employee 70 can be made conscious of reducing working hours and of securing time to recover.
  • the schedule of the interview between the employee 70 and the related person 72 or the doctor 74 may be set by the interview setting means 48 in accordance with the predicted attendance. Then, the set interview schedule may be notified, by e-mail or message, both to the terminal of the related person 72 or the doctor 74 and to the terminal of the employee 70.
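The interview-setting step above can be sketched as follows: when the prediction crosses a risk threshold, pick the next weekday and produce identical notification payloads for both terminals. The threshold, the next-weekday rule, and the message format are all assumptions made for the example.

```python
from datetime import date, timedelta

def schedule_interview(risk, today, threshold=0.7):
    """Return notification payloads for both terminals, or None if
    the predicted risk does not warrant an interview."""
    if risk < threshold:
        return None
    day = today + timedelta(days=1)
    while day.weekday() >= 5:  # skip Saturday (5) and Sunday (6)
        day += timedelta(days=1)
    msg = f"Interview scheduled on {day.isoformat()}"
    return {"employee_terminal": msg, "related_person_terminal": msg}

# A Friday prediction rolls the interview over the weekend to Monday:
print(schedule_interview(0.9, date(2018, 7, 6)))
```

Sending the same schedule to both terminals mirrors the patent's requirement that the employee 70 and the related person 72 or doctor 74 each receive the set appointment.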
  • the attendance prediction means 40 may predict the employee's subsequent attendance by learning, in addition to the attendance information 26, the health information 28 of the employee 70, meal data, sleep data, an action log, and the like.
  • as described above, the server 10 includes the first acquisition unit 30 for acquiring the image of the employee 70, the second acquisition unit 32 for acquiring the face image of the employee 70 from the acquired image, the employee identification means 36 for identifying the employee 70 by the face image, the third acquisition means 34 for acquiring the attendance information associated with the employee 70, the learning means 38 for learning using the images of the employee 70 for a plurality of days and the attendance information, and the attendance prediction unit 40 for predicting the subsequent attendance of the employee 70 according to the result of the learning.
  • the above-described embodiment is an example, and can be appropriately changed within a range in which a similar effect is obtained. Further, a form in which the following modified examples are combined may be employed.
  • the imaging device 60 shown in the above embodiment is an example, and the attendance prediction may be performed from images captured by the monitoring camera 66 or a camera installed at the gate of the company. In the example shown in FIG. 7, a surveillance camera 66 installed near an entrance 82 of a company building 80 captures a whole body image of an employee 70 passing through the entrance 82, and obtains a face image from the whole body image. Then, the attendance prediction similar to that of the above-described embodiment can be performed.
  • in the above embodiment, the attendance prediction is performed only from still images captured by the tablet 62, but attendance prediction may also be performed from a moving image of the employee 70 walking.
  • the attendance prediction is performed from the face image of the employee 70, but the attendance prediction may be performed from the whole body image in addition to the face image. For example, attendance prediction is performed based on the posture of the body and the walking speed. In many cases, poor physical condition is predicted when the posture of the body is poor or when walking is slow.
  • the employee data 16A shown in the above-described embodiment is also an example; meal data such as how many calories were consumed in the past several days or the contents of meals in the morning, daytime, and evening may be stored as employee data. Further, sleep data such as hours of sleep or depth of sleep may be stored. An action log indicating where the employee went, obtained by GPS, may also be stored. Attendance prediction may be performed using employee data including these pieces of information.
  • the present invention may be provided as a program executed by the server 10. This program may be provided in a state where it can be read by a computer and recorded on a recording medium, or may be downloaded via a network. Further, the present invention may be provided as a method invention.
  • as described above, according to the present invention, the face image captured at clock-in by face authentication is learned in association with attendance information, and subsequent attendance prediction is performed from the face image. The invention is therefore suitable as a system for predicting attendance.


Abstract

The problem addressed by the present invention is to acquire a face image of an employee and predict attendance from a change in the face image. The solution according to the invention is a server 10 comprising: first acquisition means for acquiring an image of an employee 70; second acquisition means for acquiring a face image of the employee 70 from the acquired images; employee identification means for identifying the employee 70 by the face image; third acquisition means for acquiring attendance information associated with the employee 70; learning means for learning using the images and attendance information of the employee 70 for a plurality of days; and attendance prediction means for predicting the subsequent attendance of the employee 70 according to the learned result. A face image captured when attendance is clocked by face authentication is learned in association with the attendance information, and subsequent attendance is predicted from the face image. The attendance prediction is notified to the employee 70, the related person 72, or a doctor 74 as needed.
PCT/JP2018/025417 2018-07-04 2018-07-04 System, method, and program for predicting presence and absence WO2020008577A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/025417 WO2020008577A1 (fr) 2018-07-04 2018-07-04 System, method, and program for predicting presence and absence

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/025417 WO2020008577A1 (fr) 2018-07-04 2018-07-04 System, method, and program for predicting presence and absence

Publications (1)

Publication Number Publication Date
WO2020008577A1 (fr) 2020-01-09

Family

ID=69060523

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/025417 WO2020008577A1 (fr) 2018-07-04 2018-07-04 System, method, and program for predicting presence and absence

Country Status (1)

Country Link
WO (1) WO2020008577A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016035336A1 * 2014-09-03 2016-03-10 日本電気株式会社 Leave-of-absence prediction system, prediction rule learning device, prediction device, leave-of-absence prediction method, and computer-readable recording medium
JP2016139397A * 2015-01-23 2016-08-04 パナソニックIpマネジメント株式会社 Image processing device, image processing method, image display device, and computer program
JP2017100039A * 2015-12-01 2017-06-08 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Physical condition estimation device, physical condition estimation system, and processor
JP2017117147A * 2015-12-24 2017-06-29 前田建設工業株式会社 Construction management method for structures


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18925248

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 26/03/2021)

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 18925248

Country of ref document: EP

Kind code of ref document: A1