WO2022196471A1 - Computer program, information processing method, information processing device and information processing system - Google Patents
Computer program, information processing method, information processing device and information processing system
- Publication number
- WO2022196471A1 (PCT/JP2022/010172)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- heart
- displacement
- computer
- sensor
- artery
- Prior art date
Classifications
- A61B5/02405—Determining heart rate variability
- A61B5/1102—Ballistocardiography
- A61B10/00—Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
- A61B5/0064—Body surface scanning
- A61B5/02416—Detecting, measuring or recording pulse rate or heart rate using photoplethysmograph signals, e.g. generated by infrared radiation
- A61B5/0507—Detecting, measuring or recording for diagnosis using microwaves or terahertz waves
- A61B5/1126—Measuring movement of the entire body or parts thereof using a particular sensing technique
- A61B5/1128—Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
- A61B5/489—Locating particular structures in or on the body: blood vessels
- G16H30/40—ICT specially adapted for the handling or processing of medical images, e.g. editing
- G16H40/63—ICT specially adapted for the operation of medical equipment or devices, for local operation
- G16H40/67—ICT specially adapted for the operation of medical equipment or devices, for remote operation
- G16H50/20—ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
- A61B5/02028—Determining haemodynamic parameters not otherwise provided for, e.g. cardiac contractility or left ventricular ejection fraction
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/7235—Details of waveform analysis
- A61B5/7267—Classification of physiological signals or data involving training the classification device
- G16H50/70—ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
Definitions
- the present invention relates to a computer program, an information processing method, an information processing device, and an information processing system.
- Patent Document 1 discloses a biological information monitoring device that monitors biological information such as the pulse by radiating high-frequency electromagnetic waves, detecting the reflected waves scattered at the surface of a person's body, and calculating temporal variations of the body surface.
- Patent Document 1, however, does not disclose a technique for determining specific abnormalities of the heart or blood vessels.
- An object of the present invention is to provide a computer program, an information processing method, an information processing device, and an information processing system capable of detecting the pulsation of the heart or blood vessels of a living body and determining abnormalities of the heart or blood vessels.
- A computer program according to one aspect is a computer program for causing a computer to determine an abnormality of the heart or a blood vessel of a living body. It causes the computer to execute processing that: identifies a displacement site, detectable from a first direction, where the surface of the living body is displaced by the heart or the blood vessel; identifies a displacement site, detectable from a second direction, where the body surface is displaced by the heart or the blood vessel; detects the pulsation of the heart or the blood vessel based on the displacement of the body surface at the identified displacement sites; and determines an abnormality of the heart or the blood vessel based on information related to the detected pulsation.
- An information processing method according to one aspect is a method for determining abnormalities of the heart or blood vessels of a living body, in which: a displacement site detectable from a first direction, where the surface of the living body is displaced by the heart or a blood vessel, is identified; a displacement site detectable from a second direction, where the body surface is displaced by the heart or the blood vessel, is identified; the pulsation of the heart or the blood vessel is detected based on the displacement of the body surface at the identified displacement sites; and an abnormality of the heart or the blood vessel is determined based on information related to the detected pulsation.
- An information processing device according to one aspect is a device for determining an abnormality of the heart or a blood vessel of a living body, and includes: a first identifying unit that identifies a displacement site, detectable from a first direction, where the surface of the living body is displaced by the heart or the blood vessel; a second identifying unit that identifies a displacement site, detectable from a second direction, where the body surface is displaced by the heart or the blood vessel; a detection unit that detects the pulsation of the heart or the blood vessel based on the displacement of the body surface at the displacement sites identified by the first and second identifying units; and a determination unit that determines an abnormality of the heart or the blood vessel based on information related to the detected pulsation.
- An information processing system according to one aspect includes an information processing device and a sensor device for determining abnormalities of the heart or blood vessels of a living body. The information processing device includes:
- a first infrared sensor or visible light sensor for identifying, from a first direction, a displacement site where the surface of the living body is displaced by the heart or a blood vessel; and
- a first millimeter wave sensor or terahertz band sensor that irradiates the living body with millimeter waves or terahertz band electromagnetic waves from the first direction and receives the reflected waves.
- The sensor device includes:
- a second infrared sensor or visible light sensor; and
- a second millimeter wave sensor or terahertz band sensor that irradiates the living body with millimeter waves or terahertz band electromagnetic waves from a second direction and receives the reflected waves from the displacement site of the living body.
- The system further includes: an identifying unit that identifies the displacement site based on the signal data output from the first and second infrared or visible light sensors; a detection unit that irradiates the identified displacement site with millimeter waves or terahertz band electromagnetic waves from the first and second millimeter wave or terahertz band sensors and detects the pulsation of the heart or the blood vessel based on the signal data output from those sensors; and a determination unit that determines an abnormality of the heart or the blood vessel based on information related to the detected pulsation.
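The claimed flow above — identify a displacement site from image data, detect pulsation from the surface displacement, judge abnormality — can be illustrated by the following minimal sketch. Everything here (the per-pixel variance heuristic, the FFT-peak rate estimate, the rate bounds) is a hypothetical stand-in, not the patent's actual method:

```python
import numpy as np

def identify_displacement_site(frame_stack):
    """Pick the pixel whose value varies most over time as the candidate
    displacement site (hypothetical heuristic over a stack of frames)."""
    variance = frame_stack.var(axis=0)  # per-pixel temporal variance
    return tuple(int(i) for i in
                 np.unravel_index(variance.argmax(), variance.shape))

def detect_pulsation(displacement, fs):
    """Estimate the dominant pulsation frequency (Hz) of a displacement
    waveform sampled at fs Hz via an FFT peak."""
    spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
    freqs = np.fft.rfftfreq(len(displacement), d=1.0 / fs)
    return freqs[spectrum.argmax()]

def judge_abnormality(rate_hz, lo=0.8, hi=2.5):
    """Flag rates outside a plausible resting range (hypothetical bounds)."""
    return not (lo <= rate_hz <= hi)

# Synthetic data: one pixel of a 4x4 frame oscillates at 1.2 Hz (72 bpm)
fs = 50
t = np.arange(0, 10, 1 / fs)
frames = np.zeros((len(t), 4, 4))
frames[:, 2, 1] = np.sin(2 * np.pi * 1.2 * t)

site = identify_displacement_site(frames)
rate = detect_pulsation(frames[:, site[0], site[1]], fs)
print(site, round(rate, 1), judge_abnormality(rate))  # → (2, 1) 1.2 False
```

A real system would fuse signals from two sensing directions and two sensor types, as the claim describes; this sketch only shows the shape of the identify–detect–judge pipeline.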
- FIG. 1 is an explanatory diagram showing a configuration example of an information processing system according to Embodiment 1;
- FIG. 1 is a block diagram showing a configuration example of an information processing apparatus according to Embodiment 1;
- FIG. 4 is a conceptual diagram showing an example of a learning model according to Embodiment 1;
- FIG. 2 is a conceptual diagram showing a displacement site, heart and blood vessels;
- 1 is a block diagram showing a configuration example of a sensor device according to Embodiment 1;
- Flowcharts showing an information processing procedure according to Embodiment 1.
- Flow charts showing the processing procedure for identifying a displaced portion.
- A flow chart showing the processing procedure of the sensor device according to Embodiment 1.
- A flowchart showing an abnormality determination processing procedure.
- Schematic diagrams showing examples of the determination result display image.
- FIG. 1 is an explanatory diagram showing a configuration example of an information processing system according to the first embodiment.
- the information processing system includes the information processing device 1 and the sensor device 9 according to the first embodiment.
- the information processing device 1 is wirelessly connected to the sensor device 9, the first communication terminal 2, the second communication terminal 3, the server 4, and the like, and can transmit and receive various information.
- the information processing device 1 is a device that uses infrared rays and millimeter waves to detect heartbeats and blood vessel pulsations of a user (living body), and determines the presence or absence of abnormalities in the heart and blood vessels. It is assumed that the information processing device 1 is installed in the room R where the user is.
- the sensor device 9 is a device that detects the pulsation of the user's (living body's) heart and blood vessels in the same way as the information processing device 1 and transmits the detection result to the information processing device 1 .
- the sensor device 9 is provided in a lighting device L installed on the ceiling of the room R, for example.
- the first communication terminal 2 is a communication device used by the user's family.
- the second communication terminal 3 is a communication terminal used by medical personnel.
- the server 4 is a device that provides information related to the environment, such as temperature and humidity, that affects the user's heartbeat. Note that each device may be configured to be connected by a wired cable.
- an abnormality of the heart in this embodiment refers to a disease of the heart itself. Abnormalities of blood vessels include abnormalities of the blood vessels themselves, such as arteriosclerosis, and abnormalities of organs or body sites caused by abnormal blood flow, such as cerebral infarction and infarction of the foot (critical limb ischemia).
- FIG. 2 is a block diagram showing a configuration example of the information processing device 1 according to the first embodiment.
- the information processing device 1 is a computer including a processing unit 11 , a storage unit 12 , an infrared sensor 13 , a millimeter wave sensor 14 , a communication unit 15 , an operation unit 16 and a display unit 17 .
- the information processing apparatus 1 may be a multicomputer composed of a plurality of computers, or a virtual machine constructed by software.
- the processing unit 11 is an arithmetic processing unit having one or more CPUs (Central Processing Unit), MPUs (Micro-Processing Unit), GPUs (Graphics Processing Unit), GPGPUs (General-Purpose computing on Graphics Processing Units), TPUs (Tensor Processing Unit), or the like. The processing unit 11 reads out and executes the computer program P1 stored in the storage unit 12, thereby executing processing for determining abnormalities in the user's heart and blood vessels.
- the storage unit 12 is a storage device such as a hard disk, EEPROM (Electrically Erasable Programmable ROM), flash memory, or the like.
- the storage unit 12 stores a computer program P1, a user DB 18, and a learning model 19 for causing the processing unit 11 to execute processing for causing a computer to determine abnormalities in the user's heart and blood vessels.
- the computer program P1 is a program for causing a computer to function as the information processing apparatus 1 according to the first embodiment and executing the information processing method according to the first embodiment.
- the computer program P1 causes a computer to execute processing that identifies a displacement site where the body surface is displaced by the user's heart or blood vessels, detects the pulsation of the heart or blood vessels based on the displacement of the body surface at the identified displacement site, and determines an abnormality of the heart or blood vessels based on information related to the detected pulsation.
- the computer program P1 may be recorded on the recording medium 10 in a computer-readable manner.
- the storage unit 12 stores a computer program P1 read from the recording medium 10 by a reading device (not shown).
- a recording medium 10 is a semiconductor memory such as a flash memory, an optical disk, a magnetic disk, a magneto-optical disk, or the like.
- the computer program P1 according to the present embodiment may be downloaded from a program providing server (not shown) connected to a communication network and stored in the storage unit 12 .
- the user DB 18 stores basic user information such as identification information for identifying users, authentication information for authenticating individual users, name, gender, and age.
- the user DB 18 stores environment information such as the user's pulse and heartbeat, detection date and time, temperature and humidity detected by the information processing apparatus 1, in association with the user's identification information.
- the user DB 18 stores the determination result by the information processing apparatus 1, that is, information indicating whether the user's heart or blood vessels are normal, in association with the user's identification information.
- the user DB 18 may be a cloud database.
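As an illustration only (the patent does not specify a schema), the contents of the user DB 18 described above — basic user information, per-detection environment information, and determination results keyed to the user's identification information — might map to tables like these. All table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user (
    user_id   TEXT PRIMARY KEY,   -- identification information
    auth_info TEXT,               -- authentication information
    name      TEXT,
    gender    TEXT,
    age       INTEGER
);
CREATE TABLE measurement (
    user_id     TEXT REFERENCES user(user_id),
    detected_at TEXT,             -- detection date and time
    pulse_bpm   REAL,             -- detected pulse
    temperature REAL,             -- environment information
    humidity    REAL,
    normal      INTEGER           -- determination result: 1 = normal
);
""")
conn.execute("INSERT INTO user VALUES ('u001', 'hash', 'Taro', 'M', 64)")
conn.execute("INSERT INTO measurement VALUES "
             "('u001', '2022-03-10T08:00', 68.0, 22.5, 45.0, 1)")
row = conn.execute(
    "SELECT name, pulse_bpm, normal FROM measurement "
    "JOIN user USING (user_id)").fetchone()
print(row)  # → ('Taro', 68.0, 1)
```

The same layout transfers directly to a cloud database, which the patent notes is also possible.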
- the infrared sensor 13 is, for example, an infrared laser such as a LiDAR, an infrared camera, or the like, and is a sensor for identifying, without contact and using infrared rays, each part of the user's body and the displacement sites where the body surface is displaced by the pulsation of the heart or blood vessels. The infrared sensor 13 is an example of a non-contact sensor for identifying the displacement site on the surface of the living body from the first direction.
- the first direction is, for example, a substantially horizontal direction.
- the infrared sensor 13 is a sensor for identifying the user's displaced portion in a substantially horizontal direction.
- the infrared sensor 13 has high spatial resolution and is well suited to capturing the structure of an object. A drawback, however, is that infrared light is easily absorbed by clothing, so displacement of the body surface hidden under clothing cannot be detected.
- the substantially horizontal direction need not be strictly horizontal; it may be inclined relative to the horizontal plane as long as the front side of an upright user can be detected.
- An infrared camera is a camera equipped with a lens and a CMOS image sensor that receives infrared rays reflected by the user's body surface or clothing, and it outputs infrared image data (signal data) as two-dimensional information about the user.
- Infrared image data is image data composed of a plurality of pixels arranged substantially horizontally and vertically.
- when the infrared sensor 13 is a LiDAR, it includes a light-emitting element that emits infrared rays toward the user and a light-receiving element that receives the infrared rays reflected by the user.
- the light emitting element is, for example, an infrared laser such as a vertical cavity surface emitting laser (VCSEL: Vertical Cavity Surface Emitting LASER), and irradiates a user with a dot pattern arranged vertically and horizontally.
- the light receiving element is, for example, a CMOS image sensor.
- the infrared sensor 13 calculates the distance to the user based on the round-trip time from when light is emitted toward the user until the reflected light returns.
- the infrared sensor 13 calculates the distance to each dot pattern and outputs point cloud data (signal data), which is three-dimensional information of the user.
- the point cloud data represents, for example, a large number of points on the user's body surface or clothing surface by three-dimensional coordinates.
- the processing unit 11 can convert the point cloud data into voxel data.
- the processing unit 11 can also convert point cloud data or voxel data into two-dimensional infrared image data.
- two-dimensional infrared image data is used to identify each part of the user's body and the displacement sites where the body surface is displaced by the pulsation of the heart and blood vessels, as described below.
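The conversions mentioned above (point cloud to voxel data, and on to two-dimensional image data) can be sketched as follows. The voxel and pixel sizes are assumed parameters, and a simple orthographic projection stands in for whatever camera model the device actually uses.

```python
import numpy as np

def voxelize(points, voxel_size=0.01):
    """Quantize 3-D points (N x 3, in metres) into occupied voxel indices."""
    idx = np.floor(points / voxel_size).astype(np.int64)
    return np.unique(idx, axis=0)  # one entry per occupied voxel

def project_to_image(points, px_size=0.01):
    """Drop the depth axis to obtain 2-D pixel coordinates (an orthographic
    projection, used here only to illustrate the 3-D to 2-D conversion)."""
    return np.unique(np.floor(points[:, :2] / px_size).astype(np.int64), axis=0)
```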
- the millimeter wave sensor 14 is a sensor that uses millimeter waves to detect the pulsation of the displaced portion of the user.
- the millimeter wave sensor 14 is inferior in spatial resolution to the infrared sensor 13, but millimeter-wave electromagnetic waves pass through the user's clothing without being absorbed and are reflected by the body surface, so the sensor is suitable for capturing displacement of the surface of the living body.
- the millimeter wave sensor 14 irradiates the user with millimeter waves from the first direction and receives reflected waves from the user's displaced site, thereby detecting the pulsation at the displaced site.
- the first direction is substantially horizontal.
- the millimeter wave sensor 14 emits millimeter waves toward the user in a substantially horizontal direction and receives reflected waves from the user's displaced parts, thereby detecting pulsations at the displaced parts.
- the millimeter wave sensor 14 includes a synthesizer that generates millimeter wave signals, a transmitting antenna, a receiving antenna, a mixer, and the like.
- the transmitting antenna transmits millimeter-wave electromagnetic waves generated by the synthesizer.
- the receiving antenna receives millimeter-wave electromagnetic waves reflected by the user's biological surface.
- a mixer is a circuit that mixes a transmitted wave and a received wave to generate an intermediate frequency signal.
- the processing unit 11 can calculate the distance to the user based on the data of the intermediate frequency signal.
- the processing unit 11 can calculate the variation in the distance to the displacement site of the user, that is, the displacement of the surface of the living body, and can detect the pulsation of the heart or blood vessels at the displacement site.
- the millimeter wave sensor 14 can irradiate millimeter waves by aiming at an arbitrary displacement site by an electronic scanning method, and the processing unit 11 can detect the pulsation at the displacement site.
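The distance and displacement computations from the intermediate-frequency signal can be illustrated with the standard FMCW radar relations. The chirp bandwidth, chirp duration, and wavelength below are assumed example values, not parameters taken from this disclosure.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def fmcw_distance(f_beat_hz, bandwidth_hz, chirp_time_s):
    """Target distance from the beat (intermediate) frequency of an FMCW
    radar: d = c * f_beat * T_chirp / (2 * B)."""
    return C * f_beat_hz * chirp_time_s / (2.0 * bandwidth_hz)

def phase_displacement(delta_phase_rad, wavelength_m):
    """Sub-wavelength surface displacement from the phase change of the
    received wave between chirps: dx = lambda * dphi / (4 * pi)."""
    return wavelength_m * delta_phase_rad / (4.0 * math.pi)
```

For a 60 GHz carrier (wavelength about 5 mm), a phase change of pi between chirps corresponds to a surface displacement of about 1.25 mm, which is why the phase term, rather than the coarse range, captures the pulsation.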
- the communication unit 15 includes a processing circuit for wireless communication, a communication circuit, and the like, and transmits and receives various information to and from the sensor device 9, the first communication terminal 2, the second communication terminal 3, and the server 4 via a router (not shown).
- the operation unit 16 is an input device that receives operations of the information processing device 1 by the user.
- the input device is, for example, a pointing device such as a touch panel, or a keyboard.
- the display unit 17 is an output device that outputs the determination result of abnormality of the user's heart or blood vessels.
- the output device is, for example, a liquid crystal display or an EL display.
- FIG. 3 is a conceptual diagram showing an example of the learning model 19 according to the first embodiment.
- the learning model 19 is a model for recognizing a predetermined object included in the infrared image.
- the learning model 19 can, for example, classify objects pixel by pixel using image recognition based on semantic segmentation, and can recognize each part of the human body included in the infrared image as an object.
- the learning model 19 recognizes, pixel by pixel, the face, the right and left temporal regions, the right and left sides of the neck, the right and left carotid triangles, the right and left chest, the right and left upper arms, the right and left forearms, the right and left wrists, the right and left palms, the right and left backs of the hands, the right and left dorsa of the feet, and so on.
- the upper arm, forearm, wrist, palm, and back of the hand constitute the upper limb.
- the learning model 19 is, for example, a convolutional neural network (CNN) that has been trained by deep learning.
- the learning model 19 has an input layer 19a to which infrared image data is input, an intermediate layer 19b that extracts and restores the feature amounts of the infrared image, and an output layer 19c that outputs part-extraction image data indicating, pixel by pixel, the objects included in the infrared image.
- the learning model 19 is U-Net, for example.
- the input layer 19a of the learning model 19 has a plurality of neurons that receive input of infrared image data, that is, the pixel values of each pixel that constitutes the infrared image, and passes the input pixel values to the intermediate layer 19b.
- the intermediate layer 19b has a convolution layer (CONV layer) and a deconvolution layer (DECONV layer).
- a convolutional layer is a layer that dimensionally compresses the infrared image data. Dimensional compression extracts the features of the object.
- the deconvolution layer performs the deconvolution process to restore the original dimensions. Restoration processing in the deconvolution layer generates part extraction image data in which each pixel has a pixel value (class data) corresponding to the class of the object.
- the output layer 19c has a plurality of neurons that output part extraction image data.
- the part-extraction image is an image in which each pixel is classified according to the part of the human body to which it belongs.
- the learning model 19 can be generated by preparing training data consisting of infrared image data obtained by the infrared sensor 13 and part-extraction image data in which each pixel of the infrared image is given class data according to the type of the corresponding body part, and by machine-learning an untrained neural network using that training data.
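Downstream handling of the output layer can be sketched as follows: the per-pixel class scores are reduced to a part-extraction image by taking the class of maximum score for each pixel. The class ids here are illustrative placeholders, not the label set actually used for training.

```python
import numpy as np

# Illustrative class ids only; the real label set follows the training data.
PART_CLASSES = {0: "background", 1: "face",
                2: "right_carotid_triangle", 3: "left_carotid_triangle"}

def logits_to_part_image(logits):
    """Convert per-pixel class scores (H x W x C) from the output layer into
    a part-extraction image (H x W) holding one class id per pixel."""
    return np.argmax(logits, axis=-1)

def pixels_of(part_image, class_id):
    """Row/column coordinates of all pixels assigned to one body part."""
    return np.argwhere(part_image == class_id)
```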
- as shown in FIG. , inputting infrared image data of the human body obtained by the infrared sensor 13 into the trained learning model classifies each part of the human body pixel by pixel, and part-extraction image data is obtained.
- the processing unit 11 converts point cloud data into voxel data, generates a plurality of two-dimensional image data based on the voxel data, performs image recognition processing on each of the two-dimensional image data in the same manner as described above, and, by inversely converting the resulting plurality of two-dimensional part-extraction image data into voxel data or point cloud data, can obtain data indicating the type of each part of the human body as three-dimensional data in voxel units or point units.
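The inverse conversion described above can be sketched by projecting each 3-D point to its pixel and copying that pixel's class back onto the point. The orthographic projection and pixel size are simplifying assumptions for illustration.

```python
import numpy as np

def label_points(points, part_image, px_size=0.01):
    """Assign each 3-D point (N x 3) the class of the pixel it projects to,
    yielding per-point body-part labels. A simple orthographic projection
    (x -> column, y -> row) is assumed."""
    cols = np.floor(points[:, 0] / px_size).astype(int)
    rows = np.floor(points[:, 1] / px_size).astype(int)
    rows = np.clip(rows, 0, part_image.shape[0] - 1)
    cols = np.clip(cols, 0, part_image.shape[1] - 1)
    return part_image[rows, cols]
```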
- each part of the human body may be recognized using a learning model 19 such as 3D U-Net that can directly recognize each part of the user in voxel data.
- each part of the human body in three-dimensional information may be recognized using a known machine learning method.
- the processing unit 11 can recognize each part of the user's human body in the infrared image.
- the body surface has displacement sites where the pulsation of the heart or blood vessels propagates particularly well and appears as a periodic displacement of the body surface.
- the processing unit 11 identifies the displacement site based on the recognition result using the learning model 19 .
- FIG. 4 is a conceptual diagram showing a displacement site, heart and blood vessels.
- the displaced parts are, for example, the neck, temporal region, upper arm, the inner side of the wrist near the thumb, the inner side of the wrist near the little finger, the dorsum of the foot, the chest, or the like.
- the carotid artery runs in the neck, the superficial temporal artery in the temporal region, the brachial artery in the upper arm, the radial artery on the thumb side of the inner wrist, the ulnar artery on the little-finger side of the inner wrist, the dorsalis pedis artery on the dorsum of the foot, and the heart lies beneath the chest.
- FIG. 5 is a block diagram showing a configuration example of the sensor device 9 according to the first embodiment.
- the sensor device 9 is a computer including a processing unit 91 , a storage unit 92 , an infrared sensor 93 , a millimeter wave sensor 94 and a communication unit 95 similar to those of the information processing device 1 .
- the processing unit 91 is an arithmetic processing device having one or more CPUs, GPUs, GPGPUs, TPUs, and the like.
- the processing unit 91 reads and executes the computer program P2 stored in the storage unit 92, thereby executing processing for detecting the heartbeat and blood vessel pulsation of the user.
- the storage unit 92 is a storage device such as a hard disk, EEPROM, or flash memory.
- the storage unit 92 stores a computer program P2 for causing the processing unit 91 to execute processing for detecting the user's heartbeat and blood-vessel pulsation, and a learning model 99.
- the structure of learning model 99 is similar to learning model 19 .
- the learning model 99 is preferably learned so that each part of the human body can be recognized from an infrared image of the user captured from above.
- the learning model 99 may be configured to recognize the head, ears, shoulders, etc. viewed from above, in addition to the parts that the learning model 19 can recognize.
- the computer program P2 is a program for causing a computer to function as the sensor device 9 according to the first embodiment.
- the computer program P2 is for causing a computer to execute processing that identifies a displacement site where the body surface is displaced by the user's heart or blood vessels and detects the heartbeat or blood-vessel pulsation based on the displacement of the body surface at the identified site.
- the computer program P2 may be recorded on the recording medium 90 in a computer-readable manner.
- the storage unit 92 stores a computer program P2 read from the recording medium 90 by a reading device (not shown).
- a recording medium 90 is a semiconductor memory such as a flash memory, an optical disk, a magnetic disk, a magneto-optical disk, or the like.
- the computer program P2 according to the present embodiment may be downloaded from a program providing server (not shown) connected to a communication network and stored in the storage unit 92 .
- the infrared sensor 93 is, for example, a LiDAR using an infrared laser, an infrared camera, or the like, and is a sensor for identifying the user's displacement site from the second direction without contact.
- the second direction is, for example, a substantially vertical direction.
- the infrared sensor 93 is a sensor for specifying the user's displaced portion from above.
- the infrared sensor 93 is an example of a non-contact sensor for identifying a displacement site on the surface of the living body, and has the same structure as the infrared sensor 13 .
- the substantially vertical direction need not be strictly vertical; it may be inclined with respect to the horizontal plane as long as the parts of the upright user can be detected from above the head.
- the millimeter wave sensor 94 is a sensor that uses millimeter waves to detect the pulsation of the user's displacement site.
- the structure of the millimeter wave sensor 94 is similar to that of the millimeter wave sensor 14 .
- the millimeter wave sensor 94 irradiates the user with millimeter waves from the second direction and receives reflected waves from the user's displaced site, thereby detecting pulsation at the displaced site.
- the second direction is substantially vertical.
- the millimeter wave sensor 94 irradiates the user with millimeter waves downward from above and receives reflected waves from the user's displaced portion, thereby detecting the pulsation at the displaced portion.
- the communication unit 95 includes a processing circuit for performing wireless communication processing, a communication circuit, etc., and transmits and receives various types of information to and from the information processing device 1 via a router (not shown).
- the information processing apparatus 1 periodically executes the following process at arbitrary timing, for example, three times a day.
- the processing unit 11 transmits to the sensor device 9 an instruction signal for instructing the process of identifying the user's displacement site and detecting the pulsation (step S110).
- the processing unit 11 requests the sensor device 9 to identify the user's displacement sites as captured from above, detect the pulsation at the identified sites, and transmit pulsation detection result information indicating the detected pulsation.
- the details of the processing on the sensor device 9 side will be described later.
- the processing unit 11 uses the infrared sensor 13 to infrared-detect the user (step S111). Specifically, when the infrared sensor 13 is an infrared camera, the processing unit 11 captures an image of the user using the infrared camera and acquires infrared image data of the user. When the infrared sensor 13 is a LiDAR, the processing unit 11 uses the LiDAR to acquire point cloud data of the user. The processing unit 11 converts the point cloud data into two-dimensional infrared image data.
- the processing unit 11 identifies an individual by face authentication processing using the results of infrared detection (step S112). For example, the processing unit 11 extracts a predetermined feature amount from the user's infrared image data or point cloud data, and identifies the individual user by comparing the extracted feature amount with the authentication information registered in the user DB 18.
- the processing unit 11 recognizes each part of the user's human body in the infrared image by inputting the infrared image data into the learning model 19 (step S113). Then, the processing unit 11 identifies a displaced portion where the surface of the living body is periodically displaced by the pulsation of the heart or blood vessels (step S114). The details of the process of identifying the displacement site will be described later. Note that the processing unit 11 that executes the process of step S114 functions as a first specifying unit that specifies a displacement site where the body surface is displaced by the heart or blood vessels.
- the processing unit 11 successively aims at each identified displacement site, irradiates it with millimeter waves, receives the reflected waves (step S115), and detects the pulsation of the heart or blood vessels at each displacement site (step S116).
- the processing unit 11 detects the temporal change in pulsation, the pulse rate, the heart rate, the pulse rhythm, the amplitude of pulse fluctuation, the pulsation peak point, and the like.
- the change in pulsation over time is the change over time in the amount of displacement of the living body surface at the displaced site.
- the pulse rate is, for example, the number of arterial beats per minute.
- Heart rate is, for example, the number of heart beats per minute.
- the pulse rhythm is, for example, a pulsation period at a displacement site or a numerical value indicating the regularity of the pulsation period.
- the amplitude of pulse deflection is the amplitude of displacement of the biological surface at the displacement site.
- the pulsation peak point is the point at which the amount of displacement at the displacement site becomes maximum. It should be noted that it is also possible to detect the point in time when the amount of displacement becomes minimal.
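The quantities above can all be derived from the time series of surface displacement at one site. The following sketch uses a naive local-maximum peak detector, which assumes a clean waveform; a real implementation would need filtering against noise and body motion.

```python
import numpy as np

def pulse_metrics(disp, fs):
    """Derive pulse rate, rhythm regularity, amplitude of fluctuation, and
    peak points from a body-surface displacement waveform sampled at fs Hz."""
    disp = np.asarray(disp, dtype=float)
    thresh = disp.mean() + 0.5 * disp.std()
    # local maxima above the threshold = pulsation peak points
    peaks = [i for i in range(1, len(disp) - 1)
             if disp[i] > thresh and disp[i] > disp[i - 1] and disp[i] >= disp[i + 1]]
    intervals = np.diff(peaks) / fs  # beat periods in seconds
    return {
        "peak_times_s": [i / fs for i in peaks],
        "rate_per_min": 60.0 / intervals.mean() if len(intervals) else None,
        "rhythm_sd_s": float(intervals.std()) if len(intervals) else None,
        "amplitude": float(disp.max() - disp.min()),
    }
```

On a synthetic 1.2 Hz displacement waveform this yields a rate near 72 beats per minute and a near-zero rhythm standard deviation, matching the definitions in the preceding paragraphs.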
- the processing unit 11 that executes the processes of steps S115 and S116 functions as a detection unit that detects the pulsation of the heart or blood vessels based on the displacement of the living body surface at the specified displacement site.
- the processing unit 11 determines whether or not the user is in a resting state based on the detected beat cycle (step S117). For example, the processing unit 11 reads from the user DB 18 information about the past beats of the individual identified by face authentication, and can determine whether the user is in a resting state by comparing the currently detected beat cycle with the past beat cycle. If the current beat cycle is much shorter than the past beat cycle, it may be determined that the user is not in a resting state.
- If it is determined that the user is not in a resting state (step S117: NO), the processing unit 11 returns the process to step S113. If it is determined that the user is in a resting state (step S117: YES), the processing unit 11 determines whether or not the pulsation of each displacement site has been detected for a predetermined time (step S118).
- the predetermined time is, for example, several times longer than the average heart and blood vessel pulsation period. When determining that it is less than the predetermined time (step S118: NO), the processing unit 11 returns the process to step S113. If it is determined that the pulsation has been detected for the predetermined time or longer (step S118: YES), the processing unit 11 accesses the server 4 and acquires environmental information such as the current temperature and humidity (step S119).
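The resting-state check can be sketched as a tolerance comparison of the current beat period against the stored past period; the 20% tolerance is an assumed value, since the disclosure does not specify one.

```python
def is_resting(current_period_s, past_period_s, tolerance=0.2):
    """Treat the user as resting when the current beat period is within
    +/- tolerance (as a fraction) of the stored past resting period."""
    return abs(current_period_s - past_period_s) <= tolerance * past_period_s
```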
- the processing unit 11 receives information on the pulse detection result transmitted from the sensor device 9 (step S120).
- the processing unit 11 associates the pulsation detection result indicating the pulsation at each displacement site, the date and time when the pulsation was detected, and the environmental information acquired in step S119 with the user identification information.
- the pulse detection result includes the pulse detection result detected by the information processing device 1 and the pulse detection result detected by the sensor device 9 .
- the processing unit 11 determines abnormality of the heart or blood vessels based on the detection results of pulsation at each of the plurality of identified displacement sites (step S122). Specifically, the processing unit 11 determines cardiac or vascular abnormalities by comparing, among the identified displacement sites, the temporal change in pulsation, the pulse rate, the heart rate, the pulse rhythm, the amplitude of pulse fluctuation, the pulsation peak point, and the like. Details of the abnormality determination process will be described later. Note that the processing unit 11 that executes the process of step S122 functions as a determination unit that determines an abnormality of the heart or blood vessels based on the information regarding the detected heartbeat or blood-vessel pulsation.
- the processing unit 11 determines whether or not there is an abnormality in the heart or blood vessels (step S123). When it is determined that there is no abnormality (step S123: NO), the processing unit 11 determines whether or not it is a predetermined notification timing (step S124). If the user is normal, it is considered unnecessary to display the determination result and notify the relevant parties every time the determination is made, so the notification timing is checked in step S124. Note that display and notification may instead be performed each time the determination process is performed.
- If it is determined that it is not the notification timing (step S124: NO), the process ends. If it is determined that it is the notification timing (step S124: YES), the processing unit 11 generates a determination result display image 171 (see FIG. 12) indicating normality and displays it on the display unit 17 (step S125). Then, the processing unit 11 transmits the determination result to the first communication terminal 2 of the family and the second communication terminal 3 of the medical staff (step S126). The processing unit 11 may transmit the determination result to the first communication terminal 2 and the second communication terminal 3 together with information such as the user's name, identification information, and contact information. Personal information such as the user's name may be configured not to be transmitted to the second communication terminal 3 of the medical staff.
- If it is determined in step S123 that there is an abnormality (step S123: YES), the processing unit 11 generates a determination result display image 171 (see FIGS. 13 and 14) indicating that there is an abnormality and displays it on the display unit 17 (step S127). Then, the processing unit 11 transmits the determination result to the first communication terminal 2 of the family and the second communication terminal 3 of the medical staff (step S128).
- the medical personnel can receive the information on the determination result at the second communication terminal 3 and transmit the findings information indicating the findings on the determination result to the information processing device 1 .
- the processing unit 11 of the information processing device 1 receives, at the communication unit 15, the findings information transmitted from the second communication terminal 3 of the medical staff (step S129), and displays the received findings information on the display unit 17 (step S130). The processing unit 11 then transmits the medical staff's findings information to the first communication terminal 2 of the family (step S131) and ends the process.
- FIGS. 8 and 9 are flow charts showing the displacement part identification processing procedure.
- the processing unit 11 determines whether or not the region of the carotid artery triangle is recognized by the process of step S113 (step S151). When the region of the carotid triangle is recognized (step S151: YES), the processing unit 11 identifies the region as a displaced site including the carotid artery and jugular vein (step S152).
- When the process of step S152 is finished, or when it is determined in step S151 that the carotid triangle region is not recognized (step S151: NO), the processing unit 11 determines whether or not the user's face has been recognized by the process of step S113 (step S153). If it is determined that the user's face has been recognized (step S153: YES), the processing unit 11 detects any of the contour of the face, the eyes, the eyebrows, the nose, the mouth, the nasolabial folds, the ears, and the chin, and identifies the displacement site of the carotid artery based on the amount of deviation of each detected part, such as the eyes, from the midline (step S154).
- the processing unit 11 uses the learning model 19, for example, to detect the contour of the face, the eyes, the eyebrows, the nose, the mouth, the nasolabial folds, the ears, or the chin. It is also possible to extract the image portion of the face in the infrared image and detect the eyes and other facial features on a rule basis by binarization, pattern matching, or the like. The processing unit 11 can also recognize left and right parts of the body, for example the right chest and left chest or the right upper arm and left upper arm, by the process of step S113, and can identify the midline as the line passing between them.
- If the face is facing right, the left side of the neck recognized by the learning model 19 may be identified as the displacement site where the carotid artery is located. If the face is facing left, the right side of the neck can be identified as the displacement site where the carotid artery is located. The displacement site may also be further narrowed down from the portion corresponding to the right or left carotid artery according to the amount of rotation of the neck.
- When the process of step S154 is finished, the processing unit 11 determines whether or not the user's temporal region has been recognized by the process of step S113 (step S155). When it is determined that the temporal region has been recognized (step S155: YES), the processing unit 11 identifies the temporal region as a displacement site having the superficial temporal artery (step S156).
- When the process of step S156 is finished, the processing unit 11 determines whether or not the upper arm has been recognized by the process of step S113 (step S157). If it is determined that the upper arm has been recognized (step S157: YES), the processing unit 11 identifies the upper arm as a displacement site having the brachial artery (step S158).
- When the process of step S158 is finished, the processing unit 11 determines whether or not the palm has been recognized by the process of step S113 (step S159). If it is determined that the palm has been recognized (step S159: YES), the processing unit 11 recognizes the position of the thumb from the image portion of the palm (step S160), and identifies the thumb-side portion of the wrist recognized by the process of step S113 as a displacement site having the radial artery (step S161). Further, the processing unit 11 identifies the little-finger-side portion of the wrist recognized by the process of step S113 as a displacement site having the ulnar artery (step S162).
- the learning model 19 recognizes both the inner side and the outer side of the wrist as the wrist. The inner side of the wrist is the site where the body surface is displaced by the pulsation of the radial and ulnar arteries.
- if the learning model 19 is machine-learned so as to distinguish and recognize the thumb side of the inner wrist, the little-finger side of the inner wrist, and the outer side of the wrist, recognition processing of the palm orientation and the thumb position is unnecessary.
- the learning model 19 can directly recognize the displacement site with the radial artery and the displacement site with the ulnar artery.
- When the process of step S162 is finished, the processing unit 11 determines whether or not the dorsum of the foot has been recognized by the process of step S113 (step S163). If it is determined that the dorsum of the foot has been recognized (step S163: YES), the processing unit 11 identifies the dorsum of the foot as a displacement site having the dorsalis pedis artery (step S164).
- When the process of step S164 is finished, the processing unit 11 determines whether or not the chest has been recognized by the process of step S113 (step S165). If it is determined that the chest is not recognized (step S165: NO), the processing unit 11 ends the displacement site identification process. If it is determined that the chest is recognized (step S165: YES), the processing unit 11 identifies the chest as a displacement site where the heart is located (step S166) and ends the displacement site identification process.
- through the above processing, the processing unit 11 can identify displacement sites where the body surface is displaced by the carotid artery, the superficial temporal artery, the brachial artery, the radial artery, the ulnar artery, the dorsalis pedis artery, and the heartbeat.
- the processing unit 11 can identify the site where the jugular vein is present.
- FIG. 10 is a flow chart showing the processing procedure of the sensor device 9 according to the first embodiment.
- the processing unit 91 of the sensor device 9 executes the following processing when receiving the detection instruction transmitted from the information processing device 1 .
- the processing unit 91 uses the infrared sensor 93 to infrared-detect the user (step S141). Specifically, when the infrared sensor 93 is an infrared camera, the processing unit 91 captures an image of the user using the infrared camera and acquires infrared image data of the user.
- when the infrared sensor 93 is a LiDAR, the processing unit 91 uses the LiDAR to acquire point cloud data of the user.
- the processing unit 91 converts the point cloud data into two-dimensional infrared image data.
- the processing unit 91 recognizes each part of the user's human body in the infrared image by inputting the infrared image data into the learning model 99 (step S142). Then, the processing unit 91 identifies a displacement site where the surface of the living body is periodically displaced by the pulsation of the heart or blood vessels (step S143). The details of the processing for specifying the displacement portion are the same as the processing for specifying the displacement portion by the information processing apparatus 1 . It should be noted that the processing unit 91 that executes the process of step S143 functions as a second specifying unit that specifies a displacement site where the body surface is displaced by the heart or blood vessels.
- the processing unit 91 sequentially aims at each identified displacement site, irradiates it with millimeter waves, receives the reflected waves (step S144), and detects the pulsation of the heart or blood vessels at each displacement site (step S145).
- the processing unit 91 that executes the processes of steps S144 and S145 functions as a detection unit that detects the pulsation of the heart or blood vessels based on the displacement of the living body surface at the specified displacement site.
- the processing unit 91 determines whether or not the user is in a resting state based on the detected beat cycle (step S146).
- If it is determined that the user is not in a resting state (step S146: NO), the processing unit 91 returns the process to step S142. If it is determined that the user is in a resting state (step S146: YES), the processing unit 91 determines whether or not the pulsation of each displacement site has been detected for a predetermined time (step S147). The predetermined time is, for example, several times the average pulsation period of the heart and blood vessels. If it is determined that the time is less than the predetermined time (step S147: NO), the processing unit 91 returns the process to step S142. When determining that the pulsation has been detected for the predetermined time or longer (step S147: YES), the processing unit 91 transmits information on the pulsation detection result to the information processing device 1 (step S148) and ends the process.
- FIG. 11 is a flowchart showing an abnormality determination processing procedure.
- the processing unit 11 determines an abnormality of the blood vessels or the heart based on the amount of pulse rhythm deviation and the difference in amplitude between the left carotid artery and the right carotid artery (step S171). If the deviation amount of the pulse rhythm is equal to or greater than a predetermined threshold, the processing unit 11 determines an abnormality of the blood vessels or the heart. In other words, when the time difference between the peak time of pulsation at the first displacement site and the peak time of pulsation at the second displacement site is equal to or greater than a predetermined threshold, the processing unit 11 determines an abnormality of the blood vessels or the heart.
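The left-right peak-time and amplitude comparisons in this step can be sketched as threshold tests. Both threshold values below are assumed examples, since the disclosure only states that predetermined thresholds are used.

```python
def rhythm_deviation_abnormal(left_peak_s, right_peak_s, threshold_s=0.05):
    """Flag a possible vascular or cardiac abnormality when the pulsation
    peak at one displacement site lags the contralateral site by
    threshold_s or more."""
    return abs(left_peak_s - right_peak_s) >= threshold_s

def amplitude_difference_abnormal(left_amp, right_amp, threshold=0.3):
    """Flag when the left/right amplitudes of pulse fluctuation differ by
    a fraction of the larger amplitude at or above the threshold."""
    larger = max(left_amp, right_amp)
    return larger > 0 and abs(left_amp - right_amp) / larger >= threshold
```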
- likewise, if the difference in the amplitude of pulse fluctuation is equal to or greater than a predetermined threshold, the processing unit 11 determines an abnormality of the blood vessels or the heart. If the amount of pulse rhythm deviation or the difference in the amplitude of pulse fluctuation is large, abnormalities such as arteriosclerosis and vascular stenosis are suspected. For example, the processing unit 11 determines ischemic or hemorrhagic cerebrovascular abnormalities, that is, abnormalities related to stroke, cerebral infarction, and cerebral hemorrhage. The same applies hereinafter.
- the processing unit 11 determines abnormalities in blood vessels or the heart based on the amount of pulse rhythm deviation and the difference in amplitude between the left temporal artery and the right temporal artery (step S172). For example, the processing unit 11 determines ischemic or hemorrhagic cerebrovascular abnormalities. That is, the processing unit 11 determines abnormalities related to stroke, cerebral infarction, and cerebral hemorrhage.
- the processing unit 11 determines abnormalities in blood vessels or the heart based on the amount of pulse rhythm deviation and the difference in amplitude between the left brachial artery and the right brachial artery (step S173).
- The processing unit 11 determines abnormalities in the blood vessels or heart based on the amount of pulse rhythm deviation and the difference in pulse amplitude between the left dorsalis pedis artery and the right dorsalis pedis artery (step S174). For example, the processing unit 11 determines abnormalities in the blood vessels of the legs.
- the processing unit 11 determines a blood vessel or heart abnormality based on the difference between the pulsation propagation velocity between the heart and the left carotid artery and the pulsation propagation velocity between the heart and the right carotid artery (step S175). If the difference between the propagation velocities is greater than or equal to a predetermined threshold, some cardiac or vascular abnormality is suspected. For example, the processing unit 11 determines ischemic or hemorrhagic cerebrovascular abnormalities. That is, the processing unit 11 determines abnormalities related to stroke, cerebral infarction, and cerebral hemorrhage.
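The step S175 style comparison of propagation velocities can be sketched as follows (path lengths, peak times, and the asymmetry threshold are hypothetical placeholders, not values from the disclosure):

```python
# Illustrative sketch of the step S175 comparison: pulse wave propagation
# velocity from the heart to each carotid artery. Path lengths and the
# asymmetry threshold are hypothetical values.

def pulse_wave_velocity(path_length_m, heart_peak_t, artery_peak_t):
    """Propagation velocity = path length / transit time between the
    heartbeat peak and the arterial pulsation peak."""
    transit = artery_peak_t - heart_peak_t
    if transit <= 0:
        raise ValueError("arterial peak must follow the heart peak")
    return path_length_m / transit

def velocity_asymmetry(v_left, v_right, threshold_m_s=1.0):
    """Suspect a cardiac or vascular abnormality when the left-right
    velocity difference is at or above the threshold."""
    return abs(v_left - v_right) >= threshold_m_s
```

A large left-right difference between the two velocities is what the step S175 determination treats as suspicious.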
- The processing unit 11 determines an abnormality related to critical lower limb ischemia based on the pulse amplitude of the dorsalis pedis artery (step S176). If the pulse amplitude of the dorsalis pedis artery is less than a predetermined threshold, the processing unit 11 determines that there is an abnormality of critical lower limb ischemia.
- The processing unit 11 analyzes infrared image data of the carotid triangle region, or analyzes point cloud data of the carotid triangle region, to execute a process of detecting jugular vein distension, and determines an abnormality related to heart failure based on the presence or absence of the distension (step S177).
- A heart or blood vessel abnormality may also be determined based on, for example, the temporal change of any two of the heartbeat and the pulsations of the temporal artery, carotid artery, brachial artery, radial artery, ulnar artery, and dorsalis pedis artery, or based on the pulse rate, heart rate, pulse rhythm, deviation of the pulsation peak times, difference in pulse amplitude, or the like.
- an abnormality related to heart failure may be determined based on the size and shape of swelling of the jugular vein.
- Alternatively, the presence or absence of jugular vein distension, and an abnormality related to heart failure, may be determined by comparison with the fluctuation of the arterial pulse or heartbeat.
- In the example above, abnormality determination is performed based on the current heart and blood vessel pulsation; however, the system may be configured to determine an abnormality of the heart or blood vessels by comparing information on past pulsations with information on the current pulsations. For example, the processing unit 11 may determine an increase or decrease in the user's heart rate as a heart or blood vessel abnormality. Furthermore, it is preferable to compare information about past pulsations detected in an environment similar to the current environment with information about the current pulsations, since abnormalities in the heart or blood vessels can then be determined with higher accuracy.
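A minimal sketch of such a comparison against past beats recorded in a similar environment (the record format, field names, and tolerances are assumptions for illustration only):

```python
# Illustrative sketch of comparing the current pulsation with past beats
# recorded in a similar environment. The record fields and tolerances are
# assumptions for illustration only.

def matches_environment(record, temp_c, humidity, temp_tol=3.0, hum_tol=10.0):
    """True when a past record was taken under a similar temperature and
    humidity to the current environment."""
    return (abs(record["temp_c"] - temp_c) <= temp_tol
            and abs(record["humidity"] - humidity) <= hum_tol)

def heart_rate_deviation(history, current_bpm, temp_c, humidity):
    """Relative deviation of the current heart rate from the average of
    past measurements in a similar environment; None if no comparable
    record exists."""
    similar = [r["bpm"] for r in history
               if matches_environment(r, temp_c, humidity)]
    if not similar:
        return None
    baseline = sum(similar) / len(similar)
    return (current_bpm - baseline) / baseline
```

A large positive or negative deviation from the environment-matched baseline would then be treated as a candidate abnormality.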
- FIGS. 12 to 14 are schematic diagrams showing an example of the determination result display image 171.
- The processing unit 11 generates the determination result display image 171 shown in FIGS. 12 to 14 by the processing of steps S125 and S127.
- The determination result display image 171 displays, for example, a human body image 172 depicting the human body together with the various arteries and the heart to be detected.
- The human body image 172 includes character images indicating the names of the various arteries: "(1) temporal artery", "(2) carotid artery", "(3) brachial artery", "(4) radial artery", "(5) ulnar artery", and "(6) dorsalis pedis artery".
- the processing unit 11 may display character images corresponding to arteries for which pulsation could be detected and character images corresponding to arteries for which pulsation could not be detected in different modes. For example, the processing unit 11 may highlight character images corresponding to arteries for which pulsation could be detected, and display character images corresponding to arteries for which pulsation could not be detected in light characters.
- the determination result display image 171 includes graphs 173a and 173b showing temporal changes in pulsation of a plurality of arteries.
- two graphs 173a and 173b are displayed that show temporal changes in pulsation of the carotid artery and radial artery.
- the graphs 173a and 173b may display the state of pulsation in real time, or may statically display the state of pulsation for a certain period of time.
- The plurality of graphs 173a and 173b showing the pulsation of each artery may be displayed at predetermined screen positions corresponding to each artery and the heart, or only the graphs 173a and 173b showing the pulsation of the detected arteries may be displayed.
- The processing unit 11 may receive a selection of the graph to be displayed via the operation unit 16 and display the graphs 173a and 173b showing the pulsation of the selected artery. It is also preferable to display the pulsation peak as the zero point.
- The processing unit 11 may display graphs 173a and 173b showing the pulsation states of one or more representative arteries, as shown in FIG. For example, graphs 173a and 173b showing the pulsation states of the carotid artery and the radial artery are displayed. The processing unit 11 may also display the character image corresponding to an artery shown in a graph in a manner different from the other character images; for example, it may be highlighted.
- As shown in FIGS., the processing unit 11 displays, in the determination result display image 171, graphs 173a and 173b showing temporal changes in the pulsation of the two arteries that are the basis for the abnormality determination.
- the processing unit 11 preferably displays the character image indicating the artery, which is the basis for the abnormality determination, in a manner that is different from the normal state. For example, the processing unit 11 may highlight the character image indicating the detected artery in green when the determination is normal, and highlight the artery used as the basis for the determination of abnormality in red.
- the determination result display image 171 includes a message image 174 indicating whether the determination result was normal.
- the determination result display image 171 includes a finding message image 175 indicating finding information, as shown in FIG.
- the user can know the beating state of the heart and blood vessels and whether the heart or blood vessels are normal. Needless to say, the determination result display image 171 may be transmitted to the first communication terminal 2 and the second communication terminal 3 as the determination result.
- determination result display image 171 described above is an example, and may be configured to display other information.
- graphs showing temporal changes in heart or blood vessel beats one day ago, one week ago, or one year ago may be displayed side by side or superimposed for comparison with the current graph.
- Information such as heart rate and pulse rate may also be displayed.
- The processing unit 11 can determine heart or blood vessel abnormalities such as arteriosclerosis and stenosis by comparing the pulse rate, heart rate, pulse rhythm, pulse amplitude, and the like at a plurality of displacement sites. More specifically, a heart or blood vessel abnormality can be determined based on the time difference between the pulsation peaks of the carotid artery and the radial artery, or of paired left and right arteries, and on the difference in pulse amplitude.
- Furthermore, ischemic or hemorrhagic cerebrovascular abnormalities can be determined based on the pulse amplitude of the temporal artery, carotid artery, or the like; that is, abnormalities related to stroke, cerebral infarction, and cerebral hemorrhage can be determined. It is also possible to determine the presence or absence of critical lower limb ischemia based on the pulse amplitude of the dorsalis pedis artery. Further, by detecting jugular vein distension, abnormalities associated with heart failure can be determined.
- Since the infrared sensor 13 is used to identify the displacement site before the displacement site is irradiated with millimeter waves, the pulsation of the blood vessels and heart at the displacement site can be detected more accurately and efficiently.
- the infrared sensor 13 can identify the displacement site with higher accuracy than millimeter waves.
- The millimeter wave sensor 14 can also detect pulsations at displacement sites hidden by clothes, which cannot be detected by the infrared sensor 13.
- The information processing system recognizes each part of the user's body from two different directions and irradiates millimeter waves from those two directions to detect the heart and blood vessel pulsations, so abnormalities of the heart or blood vessels can be detected with high accuracy.
- Since the information processing apparatus 1 is configured to irradiate millimeter waves while aiming at the displacement sites specified by the infrared sensor 13, the pulsation of the heart or of each artery can be detected efficiently and accurately.
- the information processing apparatus 1 can display the medical staff's findings information on the display unit 17 when there is an abnormality in the heart or blood vessels and there is medical staff's findings information.
- Medical staff's finding information can also be transmitted to the family's first communication terminal 2. Therefore, it is possible to detect an abnormality of the heart or blood vessels at an early stage and notify the user and the family of highly reliable information provided by medical personnel.
- In Embodiments 1 to 3, the infrared sensors 13 and 93 are used to recognize the various parts and displacement parts of the user's body, but a visible light camera device may be provided instead. The processing unit 11 can similarly recognize each part of the human body and specify the displacement parts based on image data of the user captured with visible light.
- Instead of the millimeter wave sensors 14 and 94, a sensor that transmits and receives electromagnetic waves in the terahertz band may be provided. Such a sensor can detect the pulsation at the displacement site in the same manner as the millimeter wave sensors 14 and 94.
- Although the information processing apparatus 1 including the infrared sensors 13, 93 and the millimeter wave sensors 14, 94 has been described, the infrared sensors 13, 93 or the millimeter wave sensors 14, 94 may be devices externally connected by wire or wirelessly.
- In the embodiments above, the infrared sensors 13, 93 and the millimeter wave sensors 14, 94 are used to detect the displacement parts and pulsations of the user's body from the horizontal and vertical directions, that is, from two directions, but the configuration may be such that the displacement parts are specified and the pulsations are detected from three or more directions.
- In Embodiment 1, an example in which a home computer executes the computer programs P1 and P2 has been described, but a cloud computer may execute the computer programs P1 and P2 according to Embodiment 1 to implement the information processing method. Further, needless to say, the computer programs P1 and P2 may be distributed to and executed on a plurality of server computers.
- the information processing device 1 and the sensor device 9 share the processing and execute it, but the sensor device 9 may execute all or part of the processing based on the computer program P1.
- the information processing apparatus 1 may execute all or part of the processing based on the computer program P2.
- In Embodiment 1, the information processing device 1 includes the first specifying unit, the detecting unit, and the determining unit, and the sensor device 9 includes the second specifying unit and the detecting unit, but this arrangement of the units is only an example. Either or both of the information processing device 1 and the sensor device 9 may be appropriately provided with the first specifying unit, the second specifying unit, the detecting unit, and the determining unit.
- (Embodiment 2) The information processing system according to the second embodiment differs from the first embodiment in the information processing procedure. Since the other configurations of the information processing system are the same as those of the information processing system according to the first embodiment, similar portions are denoted by the same reference numerals, and detailed description thereof is omitted.
- FIG. 15 is a flowchart showing an information processing procedure according to the second embodiment.
- the information processing device 1 executes the following processes at arbitrary timing.
- First, the processing unit 11 transmits to the sensor device 9 an instruction signal instructing the process of identifying the user's displacement sites and detecting the pulsation (step S210), and detects the user by infrared rays using the infrared sensor 13 (step S211).
- Then, the processing unit 11 identifies the individual by face authentication processing using the infrared detection results (step S212), and recognizes each part of the user's body in the infrared image by inputting the infrared image data to the learning model 19 (step S213).
- the processing unit 11 recognizes each part of the user's human body viewed from the substantially horizontal direction. In other words, the processing unit 11 performs image recognition of each part of the user in an infrared image obtained by detecting from a substantially horizontal direction using the infrared sensor 13 .
- the processing unit 91 of the sensor device 9 receives the instruction signal transmitted from the information processing device 1 (step S214).
- The processing unit 91 that has received the instruction signal detects the user by infrared rays using the infrared sensor 93 (step S215), and recognizes each part of the user's body in the infrared image by inputting the infrared image data to the learning model 99 (step S216). That is, the processing unit 91 recognizes each part of the user's human body viewed from above. In other words, the processing unit 91 performs image recognition of each part of the user in an infrared image obtained by detection from substantially vertically above using the infrared sensor 93. Then, the processing unit 91 transmits the information of the recognition result obtained by the process of step S216 to the information processing apparatus 1 through the communication unit 95 (step S217).
- The processing unit 11 of the information processing device 1 receives the recognition result information transmitted from the sensor device 9 at the communication unit 15 (step S218). Then, based on the recognition result of each part of the user by the information processing device 1 and the recognition result of each part of the user by the sensor device 9, the processing unit 11 identifies the displacement sites where the body surface is periodically displaced by the pulsation of the heart or blood vessels (step S219).
- the processing unit 11 may use only the recognition result from the information processing device 1 to identify one displaced portion, or may use only the recognition result from the sensor device 9 to identify one displaced portion.
- the method for specifying the displacement site is the same as in the first embodiment.
- For example, when the sensor device 9 can recognize the dorsum of the foot and the information processing device 1 cannot, the processing unit 11 identifies the dorsum of the foot recognized by the sensor device 9 as a displacement site having the dorsalis pedis artery. In this case, the processing unit 11 temporarily stores information indicating that it is the sensor device 9 that can detect the pulsation of the dorsalis pedis artery. Conversely, when the information processing device 1 can recognize the temporal region and the sensor device 9 cannot, the processing unit 11 identifies the temporal region recognized by the information processing device 1 as a displacement site having the superficial temporal artery.
- In this case, the processing unit 11 temporarily stores information indicating that it is the information processing apparatus 1 that can detect the pulsation of the superficial temporal artery. In this way, when only one of the information processing device 1 and the sensor device 9 can recognize a site corresponding to a displacement site having an artery whose pulsation is to be detected, the processing unit 11 specifies the site recognized by that device as the displacement site. The processing unit 11 temporarily stores information indicating the information processing device 1 or the sensor device 9 that was able to recognize the site as the device capable of detecting the pulsation at that site.
- When both devices can recognize a corresponding site, the displacement site may be specified based on the recognition result of the information processing device 1. Further, the size of the site recognized by the information processing device 1 may be compared with the size of the corresponding site recognized by the sensor device 9, and the displacement site may be specified using the recognition result of the device that recognized the larger site.
- The information processing device 1 may specify the displacement site using both the recognition results of the information processing device 1 and the sensor device 9.
- For example, a displacement site having the carotid artery can be recognized using both recognition results. The twist amount of the neck can be estimated based on the amount of deviation of each part of the face from the midline, and the displacement site where the carotid artery is located can thereby be specified. Alternatively, the twist amount of the neck may be estimated directly, and based on that twist amount, the displacement site in the neck recognized by the information processing device 1 can be identified.
- Although the method for recognizing a displacement site having the carotid artery is exemplified here, the displacement sites may be identified by combining the recognition results of the information processing device 1 and the sensor device 9 as appropriate.
- After step S219, the processing unit 11 transmits, through the communication unit 15, information indicating the displacement sites where the sensor device 9 should detect the pulsation to the sensor device 9 (step S220).
- Then, the processing unit 11 successively aims at each displacement site identified as a site where the pulsation should be detected on the information processing apparatus 1 side, irradiates millimeter waves and receives the reflected waves (step S221), and detects the pulsation of the heart or blood vessels at each displacement site (step S222).
- The sensor device 9 receives the information indicating the displacement sites at the communication section 95 (step S223). Based on the information, the sensor device 9 sequentially aims at each displacement site specified as a site where the pulsation should be detected on the sensor device 9 side, irradiates millimeter waves and receives the reflected waves (step S224), and detects the heart or blood vessel pulsation at each displacement site (step S225). Then, the processing unit 91 transmits the information of the pulsation detection result indicating the pulsation at each detected displacement site to the information processing device 1 through the communication unit 95 (step S226).
- The information processing device 1 receives the information on the pulsation detection result transmitted from the sensor device 9 at the communication unit 15 (step S227). Thereafter, by executing the same processing as steps S121 to S131 described in the first embodiment, the information processing device 1 determines an abnormality of the user's heart and blood vessels and executes a process of notifying it.
- In Embodiment 2 as well, the displacement sites to which millimeter waves should be applied can be identified, the pulsations at each displacement site can be detected, and abnormalities in the user's heart or blood vessels can be determined.
- Moreover, since the information processing apparatus 1 side specifies the displacement sites whose pulsation it should detect, and the sensor device 9 side specifies the displacement sites whose pulsation it should detect, the pulsation of each site can be detected efficiently. That is, each of the information processing device 1 and the sensor device 9 recognizes the displacement sites that it should itself detect and selectively detects the pulsation at those displacement sites, so the pulsation of the user's heart and blood vessels can be detected efficiently.
- the information processing system according to the third embodiment differs from the first embodiment in the information processing procedure. Since other configurations of the information processing system are the same as those of the information processing system according to the first embodiment, similar portions are denoted by the same reference numerals, and detailed description thereof is omitted.
- The infrared sensor 13 and the infrared sensor 93 according to the third embodiment are devices capable of three-dimensionally detecting the user's body surface and clothing surface, for example, infrared laser devices such as LiDAR. The information processing apparatus 1 recognizes each part of the user's body in three dimensions using three-dimensional information, that is, point cloud data or voxel data.
- FIG. 16 is a flowchart showing an information processing procedure according to the third embodiment.
- the information processing device 1 executes the following processes at arbitrary timing.
- First, the processing unit 11 transmits to the sensor device 9 an instruction signal instructing the process of identifying the user's displacement sites and detecting the pulsation (step S310), detects the user by infrared rays using the infrared sensor 13 (step S311), and identifies the individual by face authentication processing using the infrared detection results (step S312).
- the processing unit 91 of the sensor device 9 receives the instruction signal transmitted from the information processing device 1 (step S313).
- The processing unit 91 that has received the instruction signal detects the user by infrared rays using the infrared sensor 93 (step S314), and transmits the infrared signal data output from the infrared sensor 93 to the information processing apparatus 1 through the communication unit 95 (step S315).
- the infrared signal data is, for example, point cloud data that is three-dimensional information of the user.
- the processing unit 11 of the information processing device 1 receives the infrared signal data transmitted from the sensor device 9 at the communication unit 15 (step S316). Then, the processing unit 11 integrates the infrared signal data output from the infrared sensor 13 and the infrared signal data received in step S316 (step S317).
- The point cloud data obtained by the infrared sensor 13 represents the positions of a large number of points on the user's body surface or clothing surface in three-dimensional coordinates in a coordinate system based on the position of the infrared sensor 13.
- The point cloud data obtained by the infrared sensor 93 represents the positions of a large number of points on the user's body surface or clothing surface in three-dimensional coordinates in a coordinate system based on the position of the infrared sensor 93.
- The point cloud data obtained by the infrared sensor 13 and the point cloud data obtained by the infrared sensor 93 are represented in different coordinate systems, but if the coordinates are transformed so that a plurality of points detected by both infrared sensors 13 and 93 match, the point cloud data can be integrated, and more detailed three-dimensional information on the user's body surface or clothing surface can be obtained.
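A simplified sketch of this coordinate integration, assuming the two sensors' orientations are already calibrated so that only a translation offset between the coordinate systems remains (a full registration would also solve for rotation, e.g. with the Kabsch algorithm):

```python
# Simplified sketch of integrating the two sensors' point clouds. It
# assumes sensor orientations are pre-calibrated, so aligning matched
# point pairs reduces to estimating a translation (centroid difference).

def estimate_translation(points_a, points_b):
    """Offset that maps sensor-93 coordinates onto sensor-13 coordinates,
    estimated from matched point pairs."""
    n = len(points_a)
    centroid_a = [sum(p[i] for p in points_a) / n for i in range(3)]
    centroid_b = [sum(p[i] for p in points_b) / n for i in range(3)]
    return tuple(centroid_a[i] - centroid_b[i] for i in range(3))

def merge_point_clouds(cloud_a, cloud_b, offset):
    """Transform cloud_b into cloud_a's coordinate system and concatenate,
    yielding a denser combined cloud of the body/clothing surface."""
    transformed = [tuple(p[i] + offset[i] for i in range(3)) for p in cloud_b]
    return list(cloud_a) + transformed
```

After merging, the combined cloud covers surfaces seen by either sensor, which is what enables the more detailed recognition described above.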
- the processing unit 11 uses the point cloud data obtained in step S317 to recognize each part of the user's body (step S318).
- the processing unit 11 may recognize each part of the user's body two-dimensionally or three-dimensionally.
- The processing unit 11 converts, for example, the point cloud data into three-dimensional voxel data or a two-dimensional infrared image, and recognizes each part of the user's body by inputting the converted voxel data or infrared image to the learning model 19.
- The two-dimensional infrared images include an image obtained when the user is detected in a substantially horizontal direction using the infrared sensor 13 and an image obtained when the user is detected in a substantially vertical direction using the infrared sensor 93.
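The point-cloud-to-voxel conversion mentioned above can be sketched as a simple occupancy grid (the 5 cm voxel size is an illustrative assumption; the disclosure does not specify one):

```python
# Minimal occupancy-grid sketch of the point-cloud-to-voxel conversion.
# The 5 cm voxel size is an illustrative assumption.

def voxelize(points, voxel_size=0.05):
    """Map 3-D points to the set of occupied voxel indices."""
    return {tuple(int(c // voxel_size) for c in p) for p in points}
```

Such an occupancy representation is one common input form for 3-D recognition models like the 3D U-Net mentioned elsewhere in the disclosure.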
- the processing unit 11 identifies a displacement site where the body surface is periodically displaced by the heartbeat or blood vessel pulsation, based on the recognition result of each site of the user by the information processing device 1 (step S319).
- Thereafter, the processing unit 11 performs the same processes as steps S220 to S227 of the second embodiment and steps S121 to S131 described in the first embodiment to determine abnormalities in the user's heart and blood vessels and execute the notification process.
- In Embodiment 3, since the infrared signal data output from the infrared sensor 13 and the infrared sensor 93 are integrated to recognize each part of the user's body, identify the displacement sites, and detect the pulsation, it is possible to determine an abnormality in the user's heart or blood vessels more accurately.
- The system may also be configured to detect the displacement of the body surface at each displacement site in three dimensions, which makes it possible to detect the pulsation of the heart or blood vessels at the displacement site with higher accuracy.
- the information processing apparatus 1 according to the fourth embodiment differs from the first to third embodiments in that the acceleration sensor 5 and the contact sensor 6 are used to detect the user's body motion and pulse. Since other configurations of the information processing apparatus 1 are the same as those of the information processing apparatus 1 according to Embodiments 1 to 3, similar portions are denoted by the same reference numerals, and detailed description thereof is omitted.
- FIG. 17 is an explanatory diagram showing a configuration example of an information processing system according to the fourth embodiment.
- The information processing system according to Embodiment 4 includes an information processing device 1 similar to that of Embodiments 1 to 3, and further includes an acceleration sensor 5 attached to the user and a contact sensor 6 for detecting heart or blood vessel pulsations. The acceleration sensor 5 transmits to the information processing device 1 acceleration signal data indicating acceleration according to the movement of the user's body.
- The contact sensor 6 is attached to a site where the body surface is displaced by the pulsation of the heart or blood vessels, and transmits pulsation signal data indicating the pulsation to the information processing apparatus 1.
- The contact sensor 6 is preferably attached to a site that is difficult to irradiate with millimeter waves from the information processing apparatus 1.
- FIG. 18 is a flowchart showing an information processing procedure according to the fourth embodiment.
- the processing unit 11 of the information processing device 1 determines whether or not it is a predetermined monitoring timing (step S441).
- the predetermined monitoring timing is arbitrary timing and can be set by the user as appropriate. If it is determined that it is not the monitoring timing (step S441: NO), the processing unit 11 returns the process to step S441 and waits.
- When it is determined that it is the monitoring timing (step S441: YES), the processing unit 11 receives the acceleration signal data transmitted from the acceleration sensor 5 (step S442), and receives the pulsation signal data transmitted from the contact sensor 6 (step S443).
- the processing unit 11 determines whether or not the user is in a resting state by determining whether or not the magnitude of body movement is less than a predetermined value based on the acceleration signal data (step S444). If it is determined that the amount of body movement is greater than or equal to the predetermined value and the subject is not in a resting state (step S444: NO), the processing unit 11 returns the process to step S442.
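The step S444 rest-state judgment can be sketched as follows (taking body motion as the deviation of the acceleration norm from gravity; the threshold value is an illustrative assumption, since the disclosure only states that body movement is compared against a predetermined value):

```python
# Illustrative sketch of the step S444 rest-state judgment: body motion is
# taken as the deviation of the acceleration norm from gravity. The
# threshold value is an assumption.

def is_resting(samples, gravity=9.8, threshold=0.5):
    """True when every (ax, ay, az) sample deviates from gravity by less
    than the threshold, i.e. body movement is below the predetermined value."""
    for ax, ay, az in samples:
        norm = (ax * ax + ay * ay + az * az) ** 0.5
        if abs(norm - gravity) >= threshold:
            return False
    return True
```

When the check fails, processing returns to receiving fresh sensor data, mirroring the step S444: NO branch.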
- When it is determined that the body movement is less than the predetermined value and the user is in a resting state (step S444: YES), the processing unit 11 identifies the displacement sites by the same processing procedure as in the first to third embodiments, detects the pulsation of the heart and blood vessels, and performs heart and blood vessel abnormality determination processing. However, the processing unit 11 according to the fourth embodiment uses both the pulsation detected by the millimeter wave sensor 14, which is a non-contact sensor, and the pulsation indicated by the pulsation signal data transmitted from the contact sensor 6 to determine heart or blood vessel abnormalities.
- According to Embodiment 4, the pulse of arteries at more sites can be detected, so heart or blood vessel abnormalities can be determined more reliably. For example, the pulsation of the axillary artery can be detected by attaching the contact sensor 6 to the armpit. Further, by attaching the contact sensor 6 to the base of the thigh, the back of the knee, and the back of the inner malleolus, it is possible to detect the pulsation of the femoral artery, the popliteal artery, the posterior tibial artery, and the like, and heart or blood vessel abnormalities can be determined based on these pulsations.
- Moreover, since the displacement sites are specified and the pulsation is detected only when the user is in a resting state, the abnormality determination process can be executed efficiently.
Description
FIG. 1 is an explanatory diagram showing a configuration example of the information processing system according to Embodiment 1. The information processing system includes the information processing device 1 and the sensor device 9 according to Embodiment 1. The information processing device 1 is wirelessly connected to the sensor device 9, the first communication terminal 2, the second communication terminal 3, the server 4, and the like, and can transmit and receive various kinds of information.
The information processing device 1 is a device that detects the pulsation of the heart and blood vessels of the user (living body) using infrared rays and millimeter waves and determines the presence or absence of heart and blood vessel abnormalities. The information processing device 1 is assumed to be installed in a room R where the user is present. The sensor device 9 is a device that detects the pulsation of the heart and blood vessels of the user (living body) in the same manner as the information processing device 1 and transmits the detection results to the information processing device 1. The sensor device 9 is provided, for example, on a lighting device L installed on the ceiling of the room R. The first communication terminal 2 is a communication device used by the user's family. The second communication terminal 3 is a communication terminal used by medical personnel. The server 4 is a device that provides information on the environment that affects the user's heartbeat, such as temperature and humidity. Note that the devices may also be connected by wired cables.
Note that a heart abnormality in the present embodiment refers to a disease of the heart itself. Blood vessel abnormalities include abnormalities of the blood vessels themselves, such as arteriosclerosis, and abnormalities of organs and body parts caused by abnormal blood flow, such as cerebral infarction and foot infarction (critical lower limb ischemia).
Note that the substantially horizontal direction need not be a strictly horizontal direction, and may be oblique to the vertical direction and the horizontal plane as long as the front-side parts of an upright user can be detected from that direction.
Meanwhile, the infrared sensor 13, which is a LiDAR, includes a light-emitting element that irradiates the user with infrared rays and a light-receiving element that receives the infrared rays reflected by the user. The light-emitting element is an infrared laser such as a vertical cavity surface emitting laser (VCSEL), and irradiates the user with a dot pattern arranged vertically and horizontally. The light-receiving element is, for example, a CMOS image sensor. The infrared sensor 13 calculates the distance to the user based on the round-trip time from irradiation toward the user until the reflected light returns. The infrared sensor 13 calculates the distance to each dot of the pattern and outputs point cloud data (signal data), which is three-dimensional information of the user. The point cloud data represents, for example, a large number of points on the user's body surface or clothing surface in three-dimensional coordinates. The processing unit 11 can convert the point cloud data into voxel data. The processing unit 11 can also convert the point cloud data or voxel data into two-dimensional infrared image data.
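The time-of-flight range calculation performed for each dot can be sketched as follows (per-dot emit directions, which a real sensor combines with range to form 3-D points, are omitted for brevity):

```python
# Sketch of the LiDAR time-of-flight calculation: the range to each dot is
# (speed of light x round-trip time) / 2. Per-dot emit directions, which a
# real sensor combines with range to form 3-D points, are omitted here.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_s):
    """Range in metres from the measured round-trip time."""
    return C * round_trip_s / 2.0

def ranges_from_dots(round_trips):
    """One range value per projected dot of the pattern."""
    return [tof_distance(t) for t in round_trips]
```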
In the following, for simplicity, an example is described in which the two-dimensional infrared image data is used to identify the parts of the user's body and the displacement sites where the body surface is displaced by the pulsation of the heart and blood vessels.
The millimeter-wave sensor 14 includes a synthesizer that generates a millimeter-wave signal, a transmitting antenna, a receiving antenna, a mixer, and the like. The transmitting antenna transmits the millimeter-wave electromagnetic wave generated by the synthesizer, and the receiving antenna receives the millimeter wave reflected from the user's body surface. The mixer is a circuit that mixes the transmitted wave with the received wave to generate an intermediate-frequency signal. From the data of the intermediate-frequency signal, the processing unit 11 can calculate the distance to the user; in particular, it can calculate variations in the distance to a displacement site of the user, that is, the displacement of the body surface, and can thereby detect the pulsation of the heart or a blood vessel at that displacement site. By electronic scanning, the millimeter-wave sensor 14 can aim at any displacement site and irradiate it with millimeter waves, allowing the processing unit 11 to detect the pulsation at that site.
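The patent text gives no formulas, but for a typical FMCW millimeter-wave radar the intermediate-frequency signal encodes coarse range in its beat frequency and sub-wavelength surface displacement in its phase. A sketch under that assumption (the function names and the 77 GHz carrier used below are illustrative, not from the patent):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_beat_m(beat_hz: float, chirp_slope_hz_per_s: float) -> float:
    """Coarse range from the beat (intermediate) frequency of one FMCW chirp."""
    return C * beat_hz / (2.0 * chirp_slope_hz_per_s)

def displacement_from_phase_m(delta_phase_rad: float, carrier_hz: float) -> float:
    """Tiny surface displacement between chirps from the IF phase change:
    a half wavelength of extra round-trip path shifts the phase by 2*pi."""
    wavelength = C / carrier_hz
    return wavelength * delta_phase_rad / (4.0 * math.pi)
```

At a 77 GHz carrier, a phase change of pi between chirps corresponds to a quarter wavelength, about 1 mm, which is the scale of the skin displacement caused by a pulse wave.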
Alternatively, the parts of the human body may be recognized using a learning model 19, such as a 3D U-Net, that can directly recognize each part of the user in the voxel data. Any other known machine learning technique may also be used to recognize the parts of the human body in the three-dimensional information.
The substantially vertical direction need not be strictly vertical; it may be oblique to the vertical direction and the horizontal plane, as long as the parts of an upright user can be detected from above the top of the head.
For example, the processing unit 11 detects the temporal change of the pulsation, the pulse rate, the heart rate, the pulse rhythm, the pulse amplitude, the peak time of the pulsation, and the like. The temporal change of the pulsation is the temporal change in the amount of displacement of the body surface at a displacement site. The pulse rate is, for example, the number of arterial pulsations per minute, and the heart rate is, for example, the number of heartbeats per minute. The pulse rhythm is, for example, a numerical value indicating the pulsation period at a displacement site or the regularity of that period. The pulse amplitude is the amplitude of the displacement of the body surface at a displacement site. The peak time of the pulsation is the time at which the displacement at a displacement site reaches a local maximum; the time at which it reaches a local minimum may be detected instead.
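The quantities listed above can be derived from a sampled displacement trace. The following is a minimal sketch (simple local-maximum peak picking; the names are invented for illustration, and a real system would band-pass filter the signal first):

```python
def pulse_metrics(displacement, fs_hz):
    """Pulse rate, rhythm regularity and amplitude from a displacement trace.
    displacement: body-surface displacement samples; fs_hz: sampling rate."""
    # peak times: simple local maxima of the displacement waveform
    peaks = [i for i in range(1, len(displacement) - 1)
             if displacement[i - 1] < displacement[i] >= displacement[i + 1]]
    if len(peaks) < 2:
        return None
    intervals = [(b - a) / fs_hz for a, b in zip(peaks, peaks[1:])]
    mean_ivl = sum(intervals) / len(intervals)
    var = sum((x - mean_ivl) ** 2 for x in intervals) / len(intervals)
    return {
        "rate_bpm": 60.0 / mean_ivl,                 # pulsations per minute
        "rhythm_sd_s": var ** 0.5,                   # 0 = perfectly regular
        "amplitude": max(displacement) - min(displacement),
        "peak_indices": peaks,
    }

def peak_lag_s(peaks_a, peaks_b, fs_hz):
    """Time difference between the first pulsation peaks at two sites."""
    return abs(peaks_a[0] - peaks_b[0]) / fs_hz
```

The second helper mirrors the claim-3-style determination: the lag between pulsation peaks at two displacement sites can be compared against a threshold.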
The processing unit 11 that executes the processing of steps S115 to S116 functions as a detection unit that detects the pulsation of the heart or a blood vessel based on the displacement of the body surface at the identified displacement sites.
When the learning model 19 has been trained to distinguish the thumb side of the inner wrist, the little-finger side of the inner wrist, and the outer wrist, the above recognition of the palm orientation and the thumb position is unnecessary: the learning model 19 can directly recognize the displacement site over the radial artery and the displacement site over the ulnar artery.
The processing unit 91 that executes the processing of steps S144 to S145 functions as a detection unit that detects the pulsation of the heart or a blood vessel based on the displacement of the body surface at the identified displacement sites.
The processing unit 11 preferably displays the character images corresponding to arteries whose pulsation was detected in a manner different from those corresponding to arteries whose pulsation was not detected; for example, it may highlight the former and render the latter in faint characters.
The graphs 173a, 173b showing the pulsation of each artery may be displayed at predetermined screen positions corresponding to each artery and the heart, or only the graphs 173a, 173b for arteries whose pulsation was detected may be displayed. The processing unit 11 may also accept, via the operation unit 16, a selection of which graphs to display and show the graphs 173a, 173b for the selected arteries. The graphs may be displayed with the pulsation peak set to the zero point.
In Embodiment 1, the information processing device 1 includes the first identification unit, the detection unit, and the determination unit, while the sensor device 9 includes the second identification unit and a detection unit, but this arrangement is only an example. The first identification unit, the second identification unit, the detection unit, and the determination unit may be provided in either or both of the information processing device 1 and the sensor device 9 as appropriate.
The information processing system according to Embodiment 2 differs from that of Embodiment 1 in its information processing procedure. Since the other components are the same as in Embodiment 1, the same reference numerals are given to the same parts and detailed description thereof is omitted.
The processing unit 91 then transmits information on the recognition result of step S216 to the information processing device 1 via the communication unit 95 (step S217).
Conversely, when the information processing device 1 recognizes the temporal region but the sensor device 9 does not, the processing unit 11 identifies the temporal region recognized by the information processing device 1 as the displacement site over the superficial temporal artery. In this case, the processing unit 11 temporarily stores information indicating that the device capable of detecting the pulsation of the superficial temporal artery is the information processing device 1.
In this way, when either the information processing device 1 or the sensor device 9 recognizes the body part corresponding to the displacement site over an artery whose pulsation is to be detected, the processing unit 11 can identify the part recognized by that device as the displacement site. The processing unit 11 temporarily stores information indicating which of the information processing device 1 and the sensor device 9 recognized the part, as the device capable of detecting the pulsation there.
Although the method of recognizing the displacement site over the carotid artery has been illustrated here, the recognition results of the information processing device 1 and the sensor device 9 may be combined as appropriate to identify any displacement site.
The information processing system according to Embodiment 3 differs from that of Embodiment 1 in its information processing procedure. Since the other components are the same as in Embodiment 1, the same reference numerals are given to the same parts and detailed description thereof is omitted.
The point cloud data obtained by the infrared sensor 13 represents the positions of many points on the user's body surface or clothing surface as three-dimensional coordinates in a coordinate system referenced to the position of the infrared sensor 13; the point cloud data obtained by the infrared sensor 93 likewise represents those positions in a coordinate system referenced to the position of the infrared sensor 93. Although the two point clouds are expressed in different coordinate systems, applying a coordinate transformation that brings the points detected by both infrared sensors 13, 93 into coincidence allows the point clouds to be merged, yielding more detailed three-dimensional information about the user's body surface or clothing surface.
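As a toy illustration of the coordinate alignment described above, assuming for simplicity that the two sensor frames differ only by a translation (a full rigid registration would also estimate rotation, for example with the Kabsch algorithm or ICP; this simplification and all names are the editor's, not the patent's):

```python
def merge_point_clouds(cloud_a, cloud_b, matches):
    """Merge two point clouds given in different sensor coordinate systems.
    cloud_a, cloud_b: lists of (x, y, z) points from the two sensors;
    matches: index pairs (i, j) of points detected by both sensors.
    Assumption: the two frames differ by a pure translation."""
    n = len(matches)
    # average offset that maps matched points of cloud_b onto cloud_a
    offset = [sum(cloud_a[i][k] - cloud_b[j][k] for i, j in matches) / n
              for k in range(3)]
    moved_b = [tuple(p[k] + offset[k] for k in range(3)) for p in cloud_b]
    return cloud_a + moved_b
```

Points seen by only one sensor (here the third point of `cloud_b`) are carried into the merged cloud after the transformation, which is what yields the more detailed combined surface.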
The information processing device 1 according to Embodiment 4 differs from Embodiments 1 to 3 in that it detects the user's body motion and pulsation using an acceleration sensor 5 and a contact sensor 6. Since the other components are the same as in Embodiments 1 to 3, the same reference numerals are given to the same parts and detailed description thereof is omitted.
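The body-motion gating used with the acceleration sensor (see also claim 12) can be sketched as follows; the threshold value and function names are invented for the example:

```python
def body_motion_small(accel_samples, threshold_ms2=0.5):
    """True when body motion stays below a threshold, so that pulsation
    detection can proceed. The 0.5 m/s^2 threshold is illustrative only.
    accel_samples: (ax, ay, az) readings in m/s^2 with gravity removed."""
    peak = max((ax * ax + ay * ay + az * az) ** 0.5
               for ax, ay, az in accel_samples)
    return peak < threshold_ms2

def detect_if_still(accel_samples, displacement, detector):
    """Run the pulsation detector only while the user is sufficiently still."""
    return detector(displacement) if body_motion_small(accel_samples) else None
```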
2 First communication terminal
3 Second communication terminal
4 Server
5 Acceleration sensor
6 Contact sensor
9 Sensor device
10, 90 Recording medium
11, 91 Processing unit
12, 92 Storage unit
13, 93 Infrared sensor
14, 94 Millimeter-wave sensor
15, 95 Communication unit
16 Operation unit
17 Display unit
18 User DB
19, 99 Learning model
19a Input layer
19b Intermediate layer
19c Output layer
171 Determination result display image
P1, P2 Computer program
R Room
L Lighting device
Claims (30)
- A computer program for causing a computer to determine an abnormality of a heart or blood vessel of a living body, the computer program causing the computer to execute a process of:
identifying a displacement site, detectable from a first direction, at which a surface of the living body is displaced by the heart or a blood vessel;
identifying a displacement site, detectable from a second direction, at which the surface of the living body is displaced by the heart or a blood vessel;
detecting pulsation of the heart or a blood vessel based on displacement of the surface of the living body at the identified displacement sites; and
determining an abnormality of the heart or a blood vessel based on information on the detected pulsation of the heart or the blood vessel.
- The computer program according to claim 1, causing the computer to further execute a process of:
identifying a plurality of the displacement sites; and
determining an abnormality of the heart or a blood vessel by comparing the pulse rate, pulse rhythm, or pulse amplitude at each of the identified plurality of displacement sites.
- The computer program according to claim 1 or claim 2, causing the computer to further execute a process of:
identifying a plurality of the displacement sites; and
determining that the heart or a blood vessel is abnormal when the time difference between a pulsation peak at a first one of the displacement sites and a pulsation peak at a second one of the displacement sites is equal to or greater than a threshold.
- The computer program according to any one of claims 1 to 3, causing the computer to further execute a process of identifying the displacement sites based on signal data output from a non-contact sensor.
- The computer program according to any one of claims 1 to 4, causing the computer to further execute a process of:
identifying the displacement sites using infrared or visible light; and
detecting pulsation of the heart or a blood vessel by irradiating the living body with millimeter-wave or terahertz-band electromagnetic waves and using the waves reflected from the displacement sites of the living body.
- The computer program according to any one of claims 1 to 5, causing the computer to further execute a process of:
identifying a displacement site based on signal data output from a first infrared or visible-light sensor facing the living body in the first direction; and
identifying a displacement site based on signal data output from a second infrared or visible-light sensor facing the living body in the second direction.
- The computer program according to any one of claims 1 to 6, causing the computer to further execute a process of identifying the displacement sites based on both the signal data output from the first infrared or visible-light sensor facing the living body in the first direction and the signal data output from the second infrared or visible-light sensor facing the living body in the second direction.
- The computer program according to any one of claims 1 to 7, causing the computer to further execute a process of:
detecting pulsation of the heart or a blood vessel based on signal data output from a first millimeter-wave or terahertz-band sensor that irradiates the living body with millimeter-wave or terahertz-band electromagnetic waves in the first direction and receives the waves reflected from the displacement sites of the living body; and
detecting pulsation of the heart or a blood vessel based on signal data output from a second millimeter-wave or terahertz-band sensor that irradiates the living body with millimeter-wave or terahertz-band electromagnetic waves in the second direction and receives the waves reflected from the displacement sites of the living body.
- The computer program according to any one of claims 1 to 8, wherein the first direction is a substantially horizontal direction and the second direction is a substantially vertical direction.
- The computer program according to any one of claims 1 to 9, causing the computer to further execute a process of detecting pulsation of the heart or a blood vessel by aiming millimeter-wave or terahertz-band electromagnetic waves at an identified displacement site and using the waves reflected from that displacement site.
- The computer program according to any one of claims 1 to 10, causing the computer to further execute a process of detecting pulsation of the heart or a blood vessel at one or more of the displacement sites based on signal data output from a contact sensor attached to the living body, and detecting pulsation of the heart or a blood vessel at one or more of the displacement sites based on signal data output from a non-contact sensor.
- The computer program according to any one of claims 1 to 11, causing the computer to further execute a process of:
determining, based on signal data output from an acceleration sensor attached to the living body, whether the magnitude of body motion is below a predetermined value; and
detecting pulsation of the heart or a blood vessel based on displacement of the surface of the living body at the identified displacement sites when the magnitude of body motion is below the predetermined value.
- The computer program according to any one of claims 1 to 12, wherein the displacement sites are the neck, the temporal region, an upper limb, the dorsum of a foot, or the chest, and the computer is caused to detect pulsation of the carotid artery in the neck, the superficial temporal artery in the temporal region, the brachial, radial, or ulnar artery in the upper limb, the dorsalis pedis artery in the dorsum of the foot, or the heart in the chest.
- The computer program according to claim 13, causing the computer to further execute a process of:
identifying the region of the carotid triangle as a displacement site; and
detecting the pulse of the carotid artery.
- The computer program according to claim 13 or claim 14, causing the computer to further execute a process of:
detecting any one of the facial contour, an eye, an eyebrow, the nose, the mouth, a nasolabial fold, an ear, and the chin;
identifying the displacement site over the carotid artery based on the amount of deviation of the detected part from the median line; and
detecting the pulse of the carotid artery.
- The computer program according to any one of claims 13 to 15, causing the computer to further execute a process of identifying the upper arm as a displacement site and detecting pulsation of the brachial artery.
- The computer program according to any one of claims 13 to 16, causing the computer to further execute a process of identifying the thumb-side portion of the inner wrist as a displacement site and detecting pulsation of the radial artery.
- The computer program according to any one of claims 13 to 17, causing the computer to further execute a process of identifying the little-finger-side portion of the inner wrist as a displacement site and detecting pulsation of the ulnar artery.
- The computer program according to any one of claims 13 to 18, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on any two of: the pulses of the carotid, superficial temporal, brachial, radial, ulnar, and dorsalis pedis arteries, and the heartbeat.
- The computer program according to any one of claims 13 to 19, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the heartbeat and the pulse of the carotid, superficial temporal, brachial, radial, ulnar, or dorsalis pedis artery.
- The computer program according to any one of claims 13 to 20, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the shift between the pulse peaks of the carotid, superficial temporal, brachial, radial, ulnar, or dorsalis pedis arteries.
- The computer program according to any one of claims 13 to 21, causing the computer to further execute a process of determining an abnormality related to critical limb ischemia based on the pulse amplitude of the dorsalis pedis artery.
- The computer program according to any one of claims 13 to 22, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the pulse of the left carotid artery and the pulse of the right carotid artery.
- The computer program according to any one of claims 13 to 23, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the pulse of the left temporal artery and the pulse of the right temporal artery.
- The computer program according to any one of claims 13 to 24, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the pulse of the left brachial artery and the pulse of the right brachial artery.
- The computer program according to any one of claims 13 to 25, causing the computer to further execute a process of determining an abnormality of the heart or a blood vessel based on the pulse of the left dorsalis pedis artery and the pulse of the right dorsalis pedis artery.
- The computer program according to any one of claims 1 to 26, causing the computer to further execute a process of:
identifying the region of the carotid triangle as a displacement site;
detecting jugular venous distension; and
determining an abnormality related to heart failure.
- An information processing method for determining an abnormality of a heart or blood vessel of a living body, the method comprising:
identifying a displacement site, detectable from a first direction, at which a surface of the living body is displaced by the heart or a blood vessel;
identifying a displacement site, detectable from a second direction, at which the surface of the living body is displaced by the heart or a blood vessel;
detecting pulsation of the heart or a blood vessel based on displacement of the surface of the living body at the identified displacement sites; and
determining an abnormality of the heart or a blood vessel based on information on the detected pulsation of the heart or the blood vessel.
- An information processing device for determining an abnormality of a heart or blood vessel of a living body, comprising:
a first identification unit that identifies a displacement site, detectable from a first direction, at which a surface of the living body is displaced by the heart or a blood vessel;
a second identification unit that identifies a displacement site, detectable from a second direction, at which the surface of the living body is displaced by the heart or a blood vessel;
a detection unit that detects pulsation of the heart or a blood vessel based on displacement of the surface of the living body at the displacement sites identified by the first identification unit and the second identification unit; and
a determination unit that determines an abnormality of the heart or a blood vessel based on information on the detected pulsation of the heart or the blood vessel.
- An information processing system comprising an information processing device and a sensor device for determining an abnormality of a heart or blood vessel of a living body, wherein
the information processing device comprises:
a first infrared or visible-light sensor for identifying, from a first direction, a displacement site at which a surface of the living body is displaced by the heart or a blood vessel; and
a first millimeter-wave or terahertz-band sensor that irradiates the living body with millimeter-wave or terahertz-band electromagnetic waves in the first direction and receives the waves reflected from the displacement site of the living body;
the sensor device comprises:
a second infrared or visible-light sensor for identifying, from a second direction, a displacement site at which the surface of the living body is displaced by the heart or a blood vessel; and
a second millimeter-wave or terahertz-band sensor that irradiates the living body with millimeter-wave or terahertz-band electromagnetic waves in the second direction and receives the waves reflected from the displacement site of the living body; and
the information processing system comprises:
an identification unit that identifies the displacement sites based on signal data output from the first infrared or visible-light sensor and the second infrared or visible-light sensor;
a detection unit that causes the first millimeter-wave or terahertz-band sensor and the second millimeter-wave or terahertz-band sensor to irradiate the displacement sites identified by the identification unit with millimeter-wave or terahertz-band electromagnetic waves, and detects pulsation of the heart or a blood vessel based on signal data output from the first millimeter-wave or terahertz-band sensor and the second millimeter-wave or terahertz-band sensor; and
a determination unit that determines an abnormality of the heart or a blood vessel based on information on the detected pulsation of the heart or the blood vessel.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP22771223.9A EP4302705A1 (en) | 2021-03-16 | 2022-03-09 | Computer program, information processing method, information processing device, and information processing system |
JP2023507012A JPWO2022196471A1 (ja) | 2021-03-16 | 2022-03-09 | |
US18/354,698 US20230355120A1 (en) | 2021-03-16 | 2023-07-19 | Computer program, information processing method, information processing device, and information processing system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021042842 | 2021-03-16 | ||
JP2021-042842 | 2021-03-16 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/354,698 Continuation US20230355120A1 (en) | 2021-03-16 | 2023-07-19 | Computer program, information processing method, information processing device, and information processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022196471A1 true WO2022196471A1 (ja) | 2022-09-22 |
Family
ID=83320529
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/010172 WO2022196471A1 (ja) | 2021-03-16 | 2022-03-09 | コンピュータプログラム、情報処理方法、情報処理装置及び情報処理システム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230355120A1 (ja) |
EP (1) | EP4302705A1 (ja) |
JP (1) | JPWO2022196471A1 (ja) |
WO (1) | WO2022196471A1 (ja) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000083927A * | 1998-09-11 | 2000-03-28 | Nippon Avionics Co Ltd | Non-contact cardiopulmonary function monitoring device |
JP2016174872A * | 2015-03-23 | 2016-10-06 | Kyushu Institute of Technology | Signal source identification device and signal source identification method for biological information |
JP2019180451A * | 2018-04-02 | 2019-10-24 | TS Tech Co., Ltd. | Arrangement structure of biosensors and seat |
JP2019201698A * | 2018-05-21 | 2019-11-28 | Soken, Inc. | Living body detection device |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11445929B2 (en) * | 2018-12-18 | 2022-09-20 | Movano Inc. | Systems for radio wave based health monitoring that utilize amplitude and phase data |
2022
- 2022-03-09 WO PCT/JP2022/010172 patent/WO2022196471A1/ja active Application Filing
- 2022-03-09 EP EP22771223.9A patent/EP4302705A1/en active Pending
- 2022-03-09 JP JP2023507012A patent/JPWO2022196471A1/ja active Pending
2023
- 2023-07-19 US US18/354,698 patent/US20230355120A1/en active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP4302705A4 * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022196471A1 (ja) | 2022-09-22 |
EP4302705A4 (en) | 2024-01-10 |
US20230355120A1 (en) | 2023-11-09 |
EP4302705A1 (en) | 2024-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11986286B2 (en) | Gait-based assessment of neurodegeneration | |
AU2018250385B2 (en) | Motor task analysis system and method | |
Zhao et al. | Heart rate sensing with a robot mounted mmwave radar | |
Liu et al. | MetaPhys: few-shot adaptation for non-contact physiological measurement | |
US9974466B2 (en) | Method and apparatus for detecting change in health status | |
Leightley et al. | Automated analysis and quantification of human mobility using a depth sensor | |
US20150320343A1 (en) | Motion information processing apparatus and method | |
US20140299775A1 (en) | Method and apparatus for monitoring individuals while protecting their privacy | |
do Carmo Vilas-Boas et al. | Movement quantification in neurological diseases: Methods and applications | |
CN109480908A (zh) | 换能器导航方法及成像设备 | |
JP2011115460A (ja) | 視線制御装置、視線制御方法、及びそのプログラム | |
KR101310464B1 (ko) | 생체 정보 감시 시스템 및 그 시스템을 이용한 생체 정보 감시 방법 | |
JP2011251114A (ja) | 3次元超音波診断装置およびその動作方法 | |
Colantonio et al. | Computer vision for ambient assisted living: Monitoring systems for personalized healthcare and wellness that are robust in the real world and accepted by users, carers, and society | |
Gupta et al. | Automatic contact-less monitoring of breathing rate and heart rate utilizing the fusion of mmWave radar and camera steering system | |
CN111563454A (zh) | 一种双重活体验证的手部静脉识别方法及装置 | |
WO2022196471A1 (ja) | 2022-09-22 | Computer program, information processing method, information processing device, and information processing system | |
WO2020019346A1 (zh) | 生物特征识别方法、装置、系统及终端设备 | |
WO2022196469A1 (ja) | 2022-09-22 | Computer program, information processing method, and information processing device | |
CN117918021A (zh) | 从摄像头观察结果中提取信号 | |
JP7372966B2 (ja) | 骨格モデルを提供するための装置、システム、装置の作動方法及びコンピュータプログラム | |
Beiderman et al. | Automatic solution for detection, identification and biomedical monitoring of a cow using remote sensing for optimised treatment of cattle | |
CN112233769A (zh) | 一种基于数据采集的患后康复系统 | |
JP2020098474A (ja) | 属性決定装置、属性決定システム、属性決定方法、プログラムおよび記録媒体 | |
US20230082016A1 (en) | Mask for non-contact respiratory monitoring |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22771223 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2023507012 Country of ref document: JP Kind code of ref document: A |
WWE | Wipo information: entry into national phase |
Ref document number: 2022771223 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2022771223 Country of ref document: EP Effective date: 20231002 |
NENP | Non-entry into the national phase |
Ref country code: DE |