WO2023095321A1 - Information processing device, information processing system, and information processing method - Google Patents

Information processing device, information processing system, and information processing method

Info

Publication number
WO2023095321A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sensor
information processing
head
distance
Prior art date
Application number
PCT/JP2021/043586
Other languages
English (en)
Japanese (ja)
Inventor
治 川前
義憲 岡田
Original Assignee
マクセル株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by マクセル株式会社 filed Critical マクセル株式会社
Priority to PCT/JP2021/043586 priority Critical patent/WO2023095321A1/fr
Publication of WO2023095321A1 publication Critical patent/WO2023095321A1/fr

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Definitions

  • the present invention relates to an information processing device, an information processing system, and an information processing method, and is particularly suitable for detecting symptoms, signs, and precursors of diseases such as dementia, and for detecting the effects of prescriptions such as treatment and medication.
  • the present invention relates to an information processing device, an information processing system, and an information processing method.
  • by using the finger tapping motion as a "measure" that indicates the state of brain health, the fine motor function of the fingers can be quantified, so it can be applied in various fields such as healthcare, rehabilitation, and life support.
  • Patent Document 1 describes "a motor function measuring device that calculates motion data based on the relative distance between a pair consisting of a transmitting coil and a receiving coil attached to a movable part of a living body, and an evaluation device that evaluates the motor function of the living body based on the motion data received from the motor function measuring device."
  • Patent Document 1 describes measuring and evaluating the tapping motion.
  • the tapping motion is measured by attaching a magnetic sensor to the finger.
  • the present invention has been made in view of the above problems, and provides an information processing device, an information processing system, and an information processing method that eliminate the preparation required for measuring the tapping motion and allow the tapping motion to be performed and measured more accurately.
  • an information processing apparatus includes a processor, and is connected to a finger tap distance sensor that measures and detects the distance between two fingers during a tapping motion of the subject's two fingers and outputs two-finger distance information, and to a head and neck region state change sensor that measures and detects changes in the state of mainly the head and neck region other than the fingers and outputs head and neck state change amount information. Synchronization processing is executed between the finger tap distance sensor and the head and neck region state change sensor, and the apparatus is characterized in that the finger tap distance sensor and the head and neck region state change sensor perform synchronized measurement operations to acquire the two-finger distance information and the head and neck state change amount information.
  • FIG. 1 is a diagram showing the appearance of the information processing apparatus according to the first embodiment.
  • FIG. 2 is a diagram showing, over time, the results of tapping motion measurement by a distance measuring sensor and the eye movement measurement results by a left-eye line-of-sight sensor and a right-eye line-of-sight sensor.
  • FIG. 3 is a diagram showing the appearance of the information processing apparatus according to the second embodiment.
  • FIG. 4 is a diagram showing, over time, the tapping motion measurement results by an out-camera and a distance measuring sensor, together with the eye movement, cheek muscle movement, and face-surface luminance change measurement results by an in-camera.
  • FIG. 5 is a diagram showing an image displayed on the display screen of the smartphone during the tapping exercise.
  • FIGS. 6A and 6B are diagrams showing the appearance of the information processing apparatus according to the third embodiment.
  • A further diagram shows the display screen of the information processing apparatus according to the third embodiment, and a flow chart shows the flow of processing of the information processing system, including an example of processing for synchronizing time between different devices.
  • FIG. 8 is a block diagram showing a configuration example of a head-mounted display as an example of an information processing apparatus according to an embodiment.
  • FIG. 9 is a block diagram showing a configuration example of a mobile information terminal as an example of an information processing apparatus according to an embodiment.
  • Diagrams schematically show the appearance of an embodiment in which a cloud is configured as an information processing system according to the present embodiment.
  • in the first embodiment, the information processing apparatus is a head-mounted display (hereinafter referred to as HMD) that is worn on the head of the subject 10 and displays real space information and virtual space information for visual recognition.
  • FIG. 1 is a diagram showing the appearance of the information processing apparatus according to the first embodiment.
  • the subject 10 whose tapping motion is to be measured wears the HMD 100 on its head.
  • the subject 10 is also the user of the HMD 100 .
  • the HMD 100 includes, as a finger tap distance sensor that measures the distance between two fingers during the tapping motion (opening/closing motion) of the subject's two fingers, a distance measuring sensor 101 that measures and detects the distance and angle to an object.
  • the HMD 100 is provided with a distance measurement sensor 101 for the purpose of detecting gesture operations.
  • the distance measuring sensor 101 is used as a finger tapping distance sensor for detecting tapping motion.
  • the distance measuring sensor 101 is a sensor that can measure the distance and angle to a target object and perceive its shape as a three-dimensional object.
  • for example, LiDAR (Light Detection and Ranging), a TOF (Time of Flight) sensor, a millimeter-wave radar, or the like is used to detect the state of the target object.
  • the distance measuring sensor 101 detects the distance and angle to the tip of the index finger 103 and the tip of the thumb 104 of the left hand 102 performing the tapping motion, as indicated by the solid line 108. Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 103 and the tip of the thumb 104 during the tapping exercise over time. Since the depth distance can also be measured, the distance between two fingers can be measured accurately even if the fingers are oblique to the distance measuring sensor 101.
  • the distance measuring sensor 101 detects the distance and angle to the tip of the index finger 106 and the tip of the thumb 107 of the right hand 105 performing the tapping motion, as indicated by the solid line 109. Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 106 and the tip of the thumb 107 during the tapping exercise over time.
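  • as an illustration of how the two-finger distance can be derived from distance-and-angle readings of a ranging sensor such as the distance measuring sensor 101, the following sketch (not from the patent; the (distance, azimuth, elevation) reading format and all names are assumptions) converts each fingertip reading to Cartesian coordinates and takes the Euclidean distance, which is why the result stays correct even for fingers oblique to the sensor:

```python
import math

def to_cartesian(r, azimuth, elevation):
    # Convert one ranging-sensor reading (distance r in metres,
    # angles in radians) to Cartesian coordinates at the sensor origin.
    x = r * math.cos(elevation) * math.sin(azimuth)
    y = r * math.sin(elevation)
    z = r * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)

def finger_gap(index_reading, thumb_reading):
    # Euclidean distance between two fingertips, each given as a
    # (distance, azimuth, elevation) reading; the full 3D position
    # is used, so depth is accounted for automatically.
    return math.dist(to_cartesian(*index_reading),
                     to_cartesian(*thumb_reading))

# Both fingertips 0.40 m away, separated by 0.1 rad of azimuth:
gap = finger_gap((0.40, 0.00, 0.0), (0.40, 0.10, 0.0))
```

With these numbers the gap equals the chord length 2 · 0.40 · sin(0.05), roughly 4 cm.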
  • the HMD 100 also includes an out-camera 110 that captures the surrounding external world in front.
  • the out-camera 110 acquires a photographed image by converting the light incident from the lens into an electric signal with an imaging element.
  • the left and right fingers performing the tapping exercise may be photographed with the out-camera 110, and the distance between the two fingers may be calculated and measured from the photographed image (out-camera image).
  • the processor 802 (see FIG. 8) of the HMD 100 detects the shape of the fingers from the out-camera image and recognizes which hand is left and which is right. It can obtain the pixel positions of the fingertips in the out-camera image and detect how the positions of the two fingertips change over time.
  • the processor 802 can also estimate the distance between the fingertips from how large or small the fingertips appear, and can estimate the inclination of the two fingers with respect to the out-camera 110 from how the inner and outer sides of the two fingers appear. It is also possible to measure the distance between two fingers by accumulating data on the relationship between finger inclination and distance measured in advance and applying machine learning or the like.
  • the HMD 100 can be used to conveniently measure and detect the distance between two fingers during the tapping exercise.
  • based on the measured and detected information, the processor 802 can analyze and extract, as biometric information, feature amounts leading to brain function evaluation of the subject 10 using the arithmetic function in the HMD 100.
  • this can contribute to early detection of dementia such as Alzheimer's disease, cerebrovascular disease, and Lewy body disease, as well as Parkinson's disease and developmental coordination disorder (inability to skip or jump rope, etc.).
  • because the fine motor function of the fingers can be quantified and detected as a "measure" indicating the state of brain health, it can also be used as a training and rehabilitation menu for improving brain function.
  • the case where the tapping motion distance between the index finger and the thumb is measured over time has been described.
  • the tapping movement can be measured even if a plurality of fingers are opened and closed, and it goes without saying that the same action and effect can be obtained.
  • the case where the tapping exercise is performed with the index finger will mainly be described, but the same applies to fingers other than the index finger.
  • the HMD 100 also includes a left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 that detect eye movement and line-of-sight by capturing the movements and orientations of the left and right eyes, respectively.
  • a left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 measure movements of the pupil and eyeball, blinking, and the like.
  • the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 detect the movement and orientation of the right and left eyes, respectively, and can capture the eye movement and line of sight.
  • the process of detecting the movement of the eyeball may use a well-known technique that is generally used as an eye tracking process.
  • for example, a method is known that irradiates the eye with light from an infrared LED (Light Emitting Diode) and detects the line of sight from the reflected light photographed with a camera.
  • as another method for measuring and detecting eyeball movements, a method is also known in which the eye is photographed with a visible-light camera and the line of sight is detected based on the positions of the pupil and iris.
  • in line-of-sight input processing using the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 as input devices of the HMD 100, the processor 802 detects the gaze point of the line of sight. The direction of the human gaze moves quickly, and it is difficult to determine the gaze point if the fine movements of the eyeballs are followed directly. Therefore, the followability of the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 is slowed down.
  • the processor 802 applies a smoothing filter along the time axis to the sensor information from the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 to detect the gaze point.
  • on the other hand, when measuring eye movements as biometric information, the processor 802 uses the sensor information as it is without applying the smoothing filter, or switches to a filter with a higher time resolution than the smoothing filter used for line-of-sight input processing, so as to detect minute movements of the line of sight.
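  • the switch between a smoothing filter for gaze-point input and a (near-)raw signal for biometric measurement can be sketched as follows. This is a minimal illustration, not the patent's implementation: the moving-average filter, the window sizes, and the mode names are assumptions.

```python
def smooth(samples, window):
    # Moving average along the time axis; window=1 returns the raw signal.
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def gaze_signal(samples, mode):
    # Hypothetical mode switch: a wide smoothing window slows the
    # followability for stable gaze-point input, while measurement
    # mode keeps the raw samples so minute movements survive.
    return smooth(samples, 8) if mode == "input" else smooth(samples, 1)

raw = [0, 0, 10, 0, 0, 0, 0, 0]         # a one-sample spike, e.g. a micro-saccade
pointer = gaze_signal(raw, "input")     # spike attenuated for pointing
analysis = gaze_signal(raw, "measure")  # spike preserved for biometric analysis
```

In "input" mode the spike is averaged away; in "measure" mode it passes through unchanged, which is the behavior the paragraph above describes.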
  • the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, which detect minute pupil and eyeball movements at high speed, can detect involuntary eyeball and pupil movements unrelated to the subject's intention, rapid blinking, and the like.
  • with the HMD 100, it becomes possible to analyze and extract, as biometric information, feature amounts leading to evaluation of an abnormal state of the subject using the arithmetic function in the HMD 100.
  • examples include dementia such as Alzheimer's disease, cerebrovascular disease, and Lewy body disease, and Parkinson's disease.
  • this can be applied to the detection of signs, precursors, and symptoms of such diseases.
  • by analyzing and extracting features based on this, it may be possible to discriminate signs of oculomotor dysfunction.
  • while the distance between two fingers is being measured, the movement of the pupils/eyeballs is detected by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 at the same time.
  • measurement of the distance between two fingers alone, or detection of pupil/eyeball movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 alone, has limits and variations in the detection accuracy of biometric information indicating signs, precursors, and symptoms of diseases such as dementia. Therefore, by measuring these simultaneously, it becomes possible to detect biometric information indicating a sign, precursor, or symptom of a disease with higher accuracy.
  • the screen 121 is displayed on the display 113 so that it can be confirmed whether or not the entire figure of both fingers is shown.
  • when the measurement starts, the display is switched to an image different from the image of the entire two fingers. For example, during tapping motion measurement, a relaxation image 122 for relaxing the subject 10, an operation guide image 123 for finger tap distance measurement, and the like are displayed.
  • the subject 10 can measure the tapping motion in a relaxed state.
  • FIG. 2 is a diagram showing the results of tapping movement measurement by the distance measuring sensor 101 and the measurement results of eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 over time.
  • the distance between two fingers, speed, and acceleration are taken as examples of the items indicating the feature amount of the tapping motion, and the transition of these items over time is shown.
  • the information including the measurement results of the eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 corresponds to head-and-neck state variation information.
  • the distance between the two fingers is obtained by detecting the positions of the two fingers performing the tapping motion with the distance measuring sensor 101 and measuring the distance between them during the tapping motion. Velocity and acceleration are calculated and analyzed based on the information on the distance between the two fingers, which changes over time with the tapping motion.
  • the distance between two fingers is maximum, the velocity is zero, and the acceleration is the maximum negative value.
  • feature amounts indicating features of finger movement are analyzed and extracted for each item such as distance, speed, acceleration, tap interval, and phase difference.
  • the processor 802 calculates the maximum swing width of the finger, the total moving distance, etc. from the time transition of the distance between the two fingers, and evaluates how the finger movement has changed.
  • the maximum speed, opening speed, closing speed, etc. are calculated from the change in speed over time, and how quickly and stably the finger moves is evaluated.
  • the maximum amplitude of the acceleration, the force at which the finger starts to open, the force at which it finishes opening, the force at which it finishes closing, the force at which it starts to close, the finger contact time, etc. are calculated from the transition of the acceleration over time, and the force of finger movement is evaluated.
  • the processor 802 calculates the number of taps, the tap period, the tap frequency, and the like for the tap interval item of the feature amount, and evaluates the timing of tapping.
  • the processor 802 calculates, for the phase difference item of the feature amount, the timing difference between both hands, the degree of similarity between both hands, the balance of alternating taps, and the like, and evaluates the coordination between both hands. By analyzing these, the extracted feature amounts can be used as clues to the state of the brain leading to brain function evaluation of the subject 10, and can serve as one means of testing for early detection of diseases such as Alzheimer's-type, cerebrovascular, and Lewy body-type dementia, Parkinson's disease, developmental coordination disorder, and the like.
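  • a minimal sketch of how some of these feature amounts (amplitude, opening/closing speed, tap count) might be computed from a sampled two-finger distance waveform is shown below; the finite-difference derivative and the 20%-of-maximum closure threshold are illustrative assumptions, not the patent's method:

```python
def diff(series, dt):
    # Finite-difference derivative of a waveform sampled every dt seconds.
    return [(b - a) / dt for a, b in zip(series, series[1:])]

def tap_features(distance, dt):
    # Illustrative feature extraction from an inter-finger distance
    # waveform: amplitude, peak opening/closing speed, and tap count.
    # A closure (one tap) is counted each time the distance falls
    # below a hypothetical threshold of 20% of the maximum opening.
    velocity = diff(distance, dt)
    threshold = 0.2 * max(distance)
    taps = sum(1 for a, b in zip(distance, distance[1:])
               if a >= threshold > b)
    return {
        "max_amplitude": max(distance) - min(distance),
        "max_opening_speed": max(velocity),
        "max_closing_speed": min(velocity),
        "tap_count": taps,
    }

# Two open/close cycles, sampled every 10 ms:
wave = [0.0, 0.03, 0.06, 0.03, 0.0, 0.03, 0.06, 0.03, 0.0]
feats = tap_features(wave, 0.01)
```

Items such as acceleration, tap period, and the phase difference between hands would be derived from the same waveform (and a second one for the other hand) in the same fashion.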
  • the movement distance in the eyeball movement measurement results is obtained by detecting the position of the eyeball with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 and measuring the distance the eyeball moved over time during the tapping motion.
  • the movement distance is detected by determining in which direction and how much distance the eyeball has moved from a certain line-of-sight direction as an initial position.
  • although FIG. 2 shows the movement distance in the horizontal direction, it may be represented by a distance in the vertical direction, by a distance vector from the initial position, or by an index in which the characteristic appears conspicuously.
  • the average value of both eyes may be indicated, or only the left or right eye in which the characteristic appears prominently may be indicated; of course, values for both eyes may be displayed.
  • the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 measure the high-speed and minute movements of both eyes.
  • the parameters include increasing the frame rate of the eye measurement camera, reducing the number of frames averaged in detection, and performing detection on every frame.
  • the detection range is adapted to the magnitude of the movement.
  • smoothing of the waveform when displaying the graph should be minimized so that the state of the movement is displayed as it is.
  • abnormal eye movements may be symptoms that are likely to occur in people with specific diseases. Therefore, based on the above-described tapping movement and the detected eyeball movement information, the arithmetic function in the information processing apparatus extracts feature amounts leading to an abnormal-state evaluation of the subject. If the analysis results indicate signs, precursors, or symptoms of illness, the tendency can be detected at an early stage by a simple method. In addition, there is a possibility of early detection of dementia such as Alzheimer's disease, cerebrovascular disease, and Lewy body disease, as well as Parkinson's disease and disorders of eye movement function.
  • in the second embodiment, the information processing device is implemented in a widely used portable information terminal, such as a smartphone or tablet, that has a function of photographing the external world in front and the subject, and a function of displaying information.
  • FIG. 3 is a diagram showing the appearance of an information processing apparatus according to the second embodiment.
  • a smart phone 300 will be described as a specific example of a portable information terminal.
  • the smartphone 300 is provided, on the back of its housing 305, with an out-camera 301 that captures an image of the outside world in front and a distance measuring sensor 303 that measures and detects the distance and angle to an object.
  • a display 908 (see FIG. 9) having a display screen is provided on the front of the housing 305 .
  • the smartphone 300 is supported by a stand 304.
  • the subject 10 positions the smartphone 300 and the hand so that the finger to be measured is within the angle of view of the out-camera 301 .
  • the left hand 102 and the right hand 105 performing the tapping motion are photographed by the out-camera 301 as indicated by the solid line 108, and the distance between the index finger 103 and the thumb 104 and the distance between the index finger 106 and the thumb 107 (the distance between two fingers) are calculated from the out-camera image.
  • with the out-camera 301 alone, finger movements can only be captured in two dimensions; for example, when the hand is tilted with respect to the out-camera 301, accurate measurement may not be possible. In this case, the depth can be detected by also using the distance measuring sensor 303, and the distance between the two fingers during the tapping motion can be measured accurately.
  • when measuring the distance between two fingers, only the out-camera 301 or only the distance measuring sensor 303 may be used, or the out-camera 301 and the distance measuring sensor 303 may be used together, in consideration of the balance between detection accuracy and cost.
  • the feature amount can be extracted by the arithmetic function in the information processing apparatus, and the same effect as described in the first embodiment can be obtained.
  • the smartphone 300 also has an in-camera 302 on the front of the housing 305 .
  • the in-camera 302 photographs the eyes, mouth, cheeks, temples, neck, and other head and neck regions of the subject 10 .
  • the processor 920 measures, from the in-camera image, the entire face, the movement of the eyeballs/pupils, blinking, and head and neck regions such as the mouth, cheeks, temples, and neck.
  • the in-camera 302 can observe not only the abnormal movement of the eyeballs and pupils described in the first embodiment, but also the spasms of the muscles of the entire face, mouth, cheeks, and temples, and the blood flow of the cheeks, temples, neck, and the like.
  • the movement of the eyeball and pupil can be detected based on the movement of the pupil and the area of the white of the eye. It is also possible to detect the opening and closing of the pupil and the iris.
  • the processor 920 can extract features from eyeball and pupil movements.
  • an image of a part of the face of the subject 10 captured by the in-camera 302, such as the mouth, cheeks, or temples (this may be a face image of the entire face of the subject 10, or a partial image of an image of the head and neck captured from the neck up), may also be used.
  • from such images, spasms and convulsions of the muscles of the mouth, cheeks, temples, and the like can be detected, and biometric information indicating signs, precursors, and symptoms of illness can be obtained.
  • the average value of each color component (red, green, blue) of the face area is calculated for each acquired frame.
  • the noise common to the three color components is removed, and the luminance waveform is extracted from the green component.
  • the pulse wave can be measured.
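  • the pulse-wave extraction described above can be sketched as follows. This is an illustrative assumption of one possible implementation: the noise common to the three color components is approximated by the per-frame channel mean and subtracted, and the pulse rate is estimated by counting local maxima of the green residual.

```python
import math

def pulse_rate(frames_rgb, fps):
    # Per-frame mean face-region colour values -> pulse rate in
    # beats/minute. The component common to R, G and B (illumination
    # changes) is approximated by the channel mean and removed; the
    # green residual carries the blood-volume pulse, whose local
    # maxima are counted as beats.
    green = [g - (r + g + b) / 3.0 for r, g, b in frames_rgb]
    beats = sum(1 for i in range(1, len(green) - 1)
                if green[i - 1] < green[i] >= green[i + 1])
    return beats * 60.0 * fps / len(frames_rgb)

# Synthetic 10 s clip at 30 fps with a 1.2 Hz (72 bpm) pulse in green:
frames = [(100.0,
           105.0 + 2.0 * math.sin(2 * math.pi * 1.2 * i / 30.0),
           95.0) for i in range(300)]
bpm = pulse_rate(frames, 30.0)
```

The peak points mentioned below (for example, points 403 and 404 in FIG. 4) correspond to the local maxima counted here.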
  • the biometric information may be obtained by measuring another state change in the head and neck region such as the eyes, mouth, cheeks, temples, and neck, or by measuring the state change in a region other than the head and neck region.
  • in the second embodiment, the out-camera 301 and the distance measuring sensor 303, as finger tap distance sensors, detect the distance between two fingers during the two-finger tapping motion, while the in-camera 302, as a head and neck region state change sensor, simultaneously images the head and neck region and detects changes in its state.
  • FIG. 4 is an example of a diagram showing, over time, the tapping motion measurement results by the out-camera 301 and the distance measuring sensor 303, the eyeball movement measurement results by the in-camera 302, the cheek muscle movement measurement results by the in-camera 302, and the face-surface luminance change measurement results by the in-camera 302.
  • in FIG. 4, the description of the tapping motion measurement results and the eyeball movement measurement results is the same as that of FIG. 2, and is therefore omitted.
  • small movement fluctuations detected in the cheek can be captured as biological information indicating spasms and convulsions of the cheek.
  • the measurement result of the luminance change on the face surface shows the luminance waveform extracted from the green component; the pulse rate is calculated from the peak values of this luminance waveform (for example, points 403 and 404), and the pulse wave can be obtained as biological information.
  • FIG. 5 is an example of a diagram showing an image displayed on the display screen 501 of the smartphone 300 during tapping exercise.
  • display screen 501a is the display image before finger tap measurement, and display screens 501b and 501c are screens displayed on the display 908 (see FIG. 10) of the smartphone 300 during measurement of the distance between two fingers in the tapping exercise.
  • the display screen 501a displays the appearance of the entire fingers photographed by the out-camera 301 and a head and neck image 505, mainly of the subject's face, photographed by the in-camera 302.
  • this allows it to be confirmed before measurement that the two fingers performing the tapping exercise are within the photographing range of the out-camera 301.
  • from the head and neck image 505, it can be confirmed before measuring the distance between two fingers that the head and neck region, mainly including the face of the subject 10, is within the photographing range of the in-camera 302.
  • the head and neck may be photographed from the state before the tapping motion is measured. This is because attention is focused on the fingers during measurement of the tapping motion, so recording the state of the head and neck region when the subject is not concentrating on it may complement the biological information.
  • during the measurement, the display 908 displays an image different from the overall appearance of both fingers performing the tapping motion. As a result, the influence of visual feedback on the subject 10 from watching the finger tapping motion during measurement can be suppressed, enabling more accurate tapping motion measurement.
  • the display screen 501b displays an operation guide image 502 for measuring the distance between two fingers during the tapping exercise and a head and neck image 503, mainly including the face of the subject 10, photographed by the in-camera 302.
  • with the operation guide image 502, the tapping motion can be measured while observing guidance for the correct tapping motion and the progress during the measurement.
  • with the head and neck image 503, it can be visually confirmed that head and neck regions 504 such as the eyes, mouth, cheeks, temples, and neck are being photographed by the in-camera 302 without any problem.
  • a relaxation image for relaxing the subject 10 is displayed on the display screen 501c.
  • the subject 10 can measure the tapping motion in a relaxed state.
  • the third embodiment links a first information processing device, which simultaneously measures the distance between two fingers during tapping and detects eyeball movement, with other devices that measure and detect other biological information (referred to as "biological information sensors"), and the biological information measured and detected by the first information processing device and the other devices is displayed on a second information processing device with the time stamps synchronized.
  • FIGS. 6A and 6B are diagrams showing the appearance of an information processing system according to the third embodiment.
  • the information processing system 1 shown in FIG. 6A includes an HMD 100 as the first information processing device, a smartphone 600 as the second information processing device, and, as biological information sensors, a heart rate sensor 601 worn on the chest of the subject 10, an electroencephalograph 602 worn on the head of the subject 10, a smartwatch 603 worn on the forearm of the subject 10, a blood sugar level sensor 604, a blood pressure monitor 605 worn on the upper arm of the subject 10, and an audiometer 606 worn on the ears of the subject 10.
  • the smartphone 600, the HMD 100, and each biological information sensor are configured to be communicably connected.
  • Short-range radio such as Bluetooth (registered trademark) may be used for communication.
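  • one conventional way to synchronize time between such devices (an assumption for illustration; the patent does not specify the method) is an NTP-style offset estimate obtained from a single request/response round trip over the communication link:

```python
def clock_offset(t_send, t_peer_recv, t_peer_send, t_recv):
    # NTP-style estimate of (peer clock - local clock) from one
    # request/response round trip, assuming symmetric network delay.
    # All four timestamps are seconds on the respective device clocks.
    return ((t_peer_recv - t_send) + (t_peer_send - t_recv)) / 2.0

# Local device sends at t=100.000; the peer sensor, whose clock runs
# 0.250 s ahead, receives at 100.270 and replies at 100.280 (on its
# own clock); the reply arrives locally at 100.050 (20 ms each way):
off = clock_offset(100.000, 100.270, 100.280, 100.050)
```

The estimated offset (0.250 s in this example) can then be subtracted from the peer's time stamps so that all biological information shares one timeline.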
  • the HMD 100 measures the distance between two fingers during the tapping motion using the out-camera 110 or the distance measuring sensor 101, and at the same time the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 detect eyeball movements.
  • the smartphone 600 displays each biological information measured and detected by the HMD 100 and each biological information sensor.
  • the heart rate sensor 601 electrically measures the contraction motion of the heart, and the result is charted as an electrocardiogram.
  • the electroencephalograph 602 measures and records, via electrodes attached to the head of the subject 10, the very small currents that the brain continuously produces in association with its activity, making it possible to examine the activity state and active regions of the brain.
  • the smartwatch 603 is a wristwatch-type wearable terminal equipped with a touch screen and a CPU, and has functions for measuring the heart rate, measuring the stress state from the action potential of the skin, and displaying an electrocardiogram.
  • the wristwatch-type blood sugar level sensor 604 uses a mid-infrared laser with a wavelength band of several μm to measure the blood sugar level in the blood.
  • the sphygmomanometer 605 measures the pressure (blood pressure) with which the blood flow in the brachial artery pushes against the vessel wall from the inside, by wrapping a bag-shaped belt called a cuff around the upper arm, and can be used to grasp the health condition and mental condition of the subject 10.
  • the audiometer 606 emits tones at seven frequencies between 125 Hz and 8000 Hz at varying intensities, which the subject 10 listens to through headphones, thereby measuring hearing ability.
  • the various biological information measured and detected by each of the biological information sensors, and the biological information of the fingers and head and neck regions measured and detected by the HMD 100, are recorded together with time information (time stamps).
  • various biological information measured and detected by each biological information sensor and biological information of fingers and head and neck regions measured and detected by the HMD 100 are transmitted to the smartphone 600 mainly via wireless communication.
  • the smartphone 600 displays on one screen the various biological information measured by the biological information sensors and the HMD 100 with their timing matched; that is, various kinds of biological information with synchronized time stamps are displayed on one screen.
  • the smartphone 600 is shown as the second information processing device, but this is not essential; the measurement results may instead be displayed on the screen of the HMD 100 through cooperation between the HMD 100 and each biological information sensor.
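Displaying time-stamped streams from several sensors on one screen presupposes resampling them onto a common time axis. The following is a minimal sketch of such alignment by nearest-sample lookup; the function name, stream layout, and step size are illustrative assumptions, not part of the disclosure:

```python
from bisect import bisect_left

def align_streams(streams, t0, t1, step):
    """Resample each time-stamped stream onto a common time axis by
    nearest-sample lookup, so all signals can be plotted together.
    `streams` maps a name to a sorted list of (timestamp, value) pairs."""
    axis = [t0 + i * step for i in range(int((t1 - t0) / step) + 1)]
    aligned = {}
    for name, samples in streams.items():
        times = [t for t, _ in samples]
        row = []
        for t in axis:
            i = bisect_left(times, t)
            # pick the nearer of the two neighbouring samples
            if i == 0:
                j = 0
            elif i == len(times) or t - times[i - 1] <= times[i] - t:
                j = i - 1
            else:
                j = i
            row.append(samples[j][1])
        aligned[name] = row
    return axis, aligned
```

A real system would additionally interpolate between samples and honour per-sensor sampling rates, but the nearest-sample form is enough to show how one common axis is built.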
  • the information processing system 1A shown in FIG. 6B uses the smartphone 300 as both the first and second information processing devices, and uses the same biological information sensors as those shown in FIG. 6A, each of which is communicably connected to the smartphone 300.
  • the information processing system 1A of FIG. 6B differs from the information processing system 1 shown in FIG. 6A only in that the first information processing device is changed from the HMD 100 to the smartphone 300, and the same operations and effects can be obtained.
  • FIG. 7 is a diagram showing a display screen of an information processing apparatus according to the third embodiment, and shows an example of a display screen on which the various types of biological information measured and detected in the embodiment shown in FIG. 6A are displayed.
  • finger tapping, eye movements, cheek movements, electrocardiograms, and electroencephalograms are measured at the same time, and the results are displayed with the time axes of the measurement operations aligned.
  • the measurement results of finger tapping, eye movement, and temple movement are the same as those described in FIGS. 2 and 4, and their description is omitted.
  • the electrocardiogram measurement results are obtained by the heart rate sensor 601 electrically measuring the contraction motion of the heart; in normal times the same waveform repeats periodically, and the state of the heart can be grasped from the degree of change in the waveform.
  • electroencephalogram measurement results can identify areas where damage is suspected from the state of the waveform, making it possible to diagnose diseases such as episodic loss of consciousness, convulsions, and cognitive dysfunction. Therefore, by displaying not only the results of tapping motion measurement but also various biological information detected at other sites on a single screen, it is possible to accurately detect signs of disease on the same time axis.
  • as the second information processing device, any information processing device having an information communication function and a display function can be used; it is not limited to a smartphone and may be another device such as a personal computer or a monitor device.
  • FIG. 8 is a flow chart showing the processing flow of the information processing system 1 .
  • FIG. 9 shows an example of processing for synchronizing times between different devices.
  • the first information processing device in the following description may be, for example, the smartphone 300, and the biometric information sensor may be the heartbeat sensor 601.
  • synchronization processing is performed between the first information processing device and the biological information sensor (S01).
  • the first information processing device sends a synchronization signal (sig.1) containing its command issue time (TA0) to the biological information sensor in order to synchronize clocks with the biological information sensor.
  • the biological information sensor transmits a synchronization signal (sig.2), including the time TB1 at which it received the synchronization signal (sig.1) from the first information processing device and its own command issue time (TB2), to the first information processing device.
  • the first information processing device that has received the synchronization signal (sig.2) calculates the difference between the time TA0 at which it transmitted the synchronization signal (sig.1) and the time TA3 at which it received the synchronization signal (sig.2); from this difference and the difference between the time TB1 at which the biological information sensor received the synchronization signal (sig.1) and the time TB2 at which it transmitted the synchronization signal (sig.2), the time required for communication between the first information processing device and the biological information sensor and the time deviation between the devices can be grasped.
  • a synchronization signal (sig.3) likewise allows the biological information sensor to grasp the time required for communication between the devices and the time lag between them. By grasping the time difference ΔT between the devices and the time required for communication in this way, the measurement results of each device can be displayed along a common time axis. However, if the first information processing device and the biological information sensor each incorporate a clock indicating a globally common time, such processing is unnecessary.
  • the first information processing device calculates the time lag ΔT.
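The sig.1/sig.2 exchange described above is the standard two-way time transfer: assuming a symmetric link, the one-way delay and the clock offset ΔT follow directly from the four recorded times. A hedged sketch (the function name is an assumption for illustration):

```python
def sync_offset(ta0, tb1, tb2, ta3):
    """Estimate the one-way communication delay and the clock offset dT
    (sensor clock minus device clock) from the four time stamps of the
    sig.1/sig.2 exchange: ta0 = device sends sig.1, tb1 = sensor receives
    sig.1, tb2 = sensor sends sig.2, ta3 = device receives sig.2."""
    round_trip = (ta3 - ta0) - (tb2 - tb1)  # time actually spent on the link
    delay = round_trip / 2                  # assumes symmetric up/down links
    offset = ((tb1 - ta0) + (tb2 - ta3)) / 2
    return delay, offset
```

With the offset known, a sensor time stamp tb maps to device time tb minus offset, which is what allows the measurement results of both devices to be placed on a common time axis.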
  • the first information processing device simultaneously measures the distance between two fingers in the tapping motion and detects the amount of head and neck state change by the head and neck region state change sensor, and the biological information sensor detects the biological information (S02).
  • the biological information is transmitted from the sensor to the first information processing device (S03).
  • the first information processing device combines the measurement result of the distance between two fingers and the head and neck state change amount information that it detected at time TAn-ΔT with the biological information received from the biological information sensor at time TAn, and displays them on its display along a common time axis (S04).
  • Measurement is continued until an instruction to end measurement is input to the first information processing device (S05: No, S02), and if there is an instruction to end measurement (S05: Yes), the measurement ends.
  • FIG. 10 is a block diagram showing a configuration example of a head-mounted display as an example of the information processing apparatus according to this embodiment.
  • in FIG. 10, parts denoted by the same reference numerals as in FIGS. 1 and 3 operate in the same way as already explained with reference to FIGS. 1 and 3, so their detailed explanation is partially omitted.
  • the HMD 100 includes an out-camera 110, a distance measuring sensor 101, a left-eye line-of-sight sensor 111, a right-eye line-of-sight sensor 112, an acceleration sensor 804, a gyro sensor 805, a geomagnetic sensor 806, an operation input interface 807, a display 113, a processor 820, a memory 830 storing a program 831 and information data 832, a vibrator 841, a microphone 842, a speaker 843, and a communication device 844.
  • the acceleration sensor 804 is a sensor that detects acceleration, which is a change in speed per unit time, and can detect movement, vibration, impact, and the like.
  • An acceleration sensor 804 provided in the HMD 100 can detect the tilt and direction of the HMD 100 .
  • the gyro sensor 805 is a sensor that detects the angular velocity in the rotational direction, and can capture the state of vertical, horizontal, and diagonal postures. Therefore, using the acceleration sensor 804 and the gyro sensor 805, it is possible to detect the posture such as the tilt and direction of the HMD 100.
  • the geomagnetic sensor 806 is a sensor that detects the magnetic force of the earth, and detects the direction in which the HMD 100 is facing. By using a three-axis type that detects geomagnetism in the vertical direction as well as the front-rear and horizontal directions, and capturing changes in the geomagnetism accompanying the movement of the HMD 100, the movement of the HMD 100 can also be detected. Together, these sensors make it possible to detect and determine the posture state of the subject 10 wearing the HMD 100, which can contribute to measuring the distance between two fingers during tapping more accurately.
  • since the acceleration sensor 804, the gyro sensor 805, and the geomagnetic sensor 806 can detect the movement of the head and neck of the subject 10, they may also be used as head and neck region state change sensors.
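As one concrete illustration of how the acceleration sensor 804 yields tilt, the pitch and roll of a stationary device can be derived from the measured gravity vector. This is a standard computation; the function name is an assumption for illustration, not part of the disclosure:

```python
import math

def tilt_from_accel(ax, ay, az):
    """Derive pitch and roll (radians) from a 3-axis accelerometer
    reading taken while the device is stationary, i.e. when the
    sensor measures only the gravity vector (ax, ay, az)."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll
```

In practice the gyro sensor 805 would be fused in (for example with a complementary filter) to keep the estimate stable while the head moves, and the geomagnetic sensor 806 supplies the heading that the accelerometer alone cannot observe.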
  • the processor 820 is composed of a CPU or the like, and controls each component of the HMD 100 by executing programs 831, such as an operating system (OS) and an operation control application, stored in the memory 830, thereby realizing functions such as the OS, middleware, and applications.
  • the processor 820 executes a program that implements the functions of the information processing system according to this embodiment, thereby realizing a finger tapping motion analysis processing unit 821, an eye movement detection analysis processing unit 822, a switching control unit 823, and a display control unit 824.
  • the finger tapping motion analysis processing unit 821 analyzes the trajectory of the finger tapping motion from the motion of the fingers detected by the out-camera 110 or the distance measuring sensor 101, and calculates characteristic parameters.
  • the eye movement detection analysis processing unit 822 captures eye movement from the motion and direction of the eyeball and pupil detected by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts features, such as involuntary swaying of the eyeball and pupil, that lead to the detection of signs of an abnormal state of the subject.
  • the switching control unit 823 switches the processing of the eye movement detection analysis processing unit 822 between an eye-tracking operation for pointing at a target object and an eye movement measurement for measuring the state of eyeball movement, which require different processing.
  • minute movement states of the subject's eyeballs are observed and feature quantities are extracted.
  • eye tracking and eye movement measurement use different tracking speeds for changes in gaze direction and eyeball movement, and perform processing such as detecting minute movements to analyze and extract involuntary eye and pupil movements.
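One simple way to flag involuntary sway in a fixation recording, in the spirit of the minute-movement detection described above, is a velocity threshold on the gaze samples. The function name and threshold are illustrative assumptions, not the patented analysis itself:

```python
def detect_sway(gaze, dt, vel_thresh):
    """Return the indices of samples whose gaze velocity (deg/s)
    exceeds vel_thresh, a crude marker of involuntary eye or pupil
    movement. gaze: gaze angles in degrees sampled every dt seconds."""
    events = []
    for i in range(1, len(gaze)):
        velocity = abs(gaze[i] - gaze[i - 1]) / dt
        if velocity > vel_thresh:
            events.append(i)
    return events
```

A production analysis would smooth the signal and use 2-D gaze coordinates, but the velocity-threshold idea is the common starting point for separating fixational drift from abrupt involuntary movements.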
  • the display control unit 824 displays the time transition of the distance between two fingers detected by the finger tap distance sensor, the time transition of the head and neck region state change amount detected by the head and neck region state change sensor, and, when a biological information sensor is present, its biological information, side by side on the same screen of the display 113 with their time axes aligned.
  • the memory 830 is composed of a non-volatile storage device or the like, and stores various programs 831 and information data 832 handled by the processor 820 and the like.
  • as the information data 832, captured image data during the tapping exercise, operation guide image data for finger tap distance measurement, relaxation image data, measured and detected biological information, and the like are stored.
  • the display 113 is provided in the video see-through type HMD 100, and is composed of a display such as a liquid crystal panel that displays the physical object in front of the user photographed by the out-camera 110 and the virtual object together.
  • an optical see-through HMD includes, instead of the display 113, for example a projection unit that projects virtual objects such as the operation guide image for finger tap distance measurement, the relaxation image, and notification information to the subject 10, and a transparent half-mirror that forms and displays the images of these virtual objects.
  • the subject 10 can visually recognize both the real object in the visual field range in front of the subject 10 and the imaged virtual object in a floating form.
  • the operation input interface 807 is an input interface using, for example, gesture operations, voice input, virtual keyboards, touch sensors, etc., and is used to set and input information that the subject 10 wants to input.
  • the operation input interface 807 may be separated from the main body of the HMD 100 and connected by wire or wirelessly.
  • the input operation screen may be displayed within the display screen of the display 113, and the input operation information may be captured according to the position on the input operation screen to which the line of sight is directed, or by operating a pointer on the input operation screen.
  • the subject may utter a sound indicating the input operation, and the microphone 842 may collect the sound to capture the input operation information.
  • the speaker 843 outputs sound from the speaker or headphones based on the sound data, so that the subject 10 can be informed of the notification information by sound.
  • the vibrator 841 generates vibration under the control of the processor 820, and converts the notification instruction information transmitted by the HMD 100 to the subject 10 into vibration.
  • the vibrator 841 can transmit vibration to the head of the subject 10 wearing the HMD 100 to inform the subject 10 of the notification instruction information.
  • examples of information to be notified to the subject 10 include information indicating the start and end of finger tapping measurement and head and neck measurement, information indicating other devices during simultaneous measurement, interrupt notifications, and the like. This notification information can improve usability.
  • the communication device 844 is a communication interface that performs wireless communication between at least the HMD 100, other devices, and the smartphones 300 and 600 by short-range wireless communication, wireless LAN, or base station communication; it includes communication processing circuits, antennas, and the like corresponding to various predetermined communication interfaces, and transmits and receives biological information, image data, control signals, and the like.
  • short-range wireless communication is performed using Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark).
  • the finger tapping motion analysis processing unit 821 analyzes items such as distance, speed, acceleration, tap interval, and phase difference based on the information on the distance between two fingers during the tapping motion measured and detected using the distance sensor 101, and extracts feature quantities that indicate the features of the finger movement. Biological information leading to brain function evaluation of the subject can be obtained from the analyzed and extracted feature quantities.
  • that is, the distance sensor 101 measures the distance between the two fingers during the tapping motion, and the finger tapping motion analysis processing unit 821 extracts feature quantities indicating the features of the finger movement based on the measured distance information.
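A minimal sketch of how such feature quantities could be derived from the two-finger distance waveform; the closure-minimum heuristic and all names are illustrative assumptions, not the patented analysis itself:

```python
def tap_features(dist, dt):
    """Derive simple tapping features from the two-finger distance
    waveform: peak opening distance, peak speed, and tap intervals
    taken as the times between closure minima (local dips)."""
    speed = [(dist[i] - dist[i - 1]) / dt for i in range(1, len(dist))]
    closures = [i for i in range(1, len(dist) - 1)
                if dist[i] <= dist[i - 1] and dist[i] < dist[i + 1]]
    intervals = [(b - a) * dt for a, b in zip(closures, closures[1:])]
    return {
        "max_distance": max(dist),
        "max_speed": max(abs(s) for s in speed),
        "tap_intervals": intervals,
    }
```

Acceleration and the left/right phase difference mentioned above would follow the same pattern (a second difference of the waveform, and a comparison of closure times between the two hands, respectively).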
  • the eye movement detection and analysis processing unit 822 detects eye movement information with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts features that lead to the detection of signs of an abnormal state of the subject, such as involuntary eyeball or pupil sway.
  • with the HMD 100, it is possible to obtain, as biometric information, a plurality of feature quantities analyzed and extracted from these simultaneously measured and detected pieces of information. Further, it is possible to realize an information processing apparatus capable of detecting, simply, conveniently, and with high accuracy, biometric information indicating a sign, omen, or symptom of disease, without using special instruments.
  • FIG. 11 is a block diagram showing a configuration example of a smart phone 300, which is a mobile information terminal, as an example of an information processing apparatus according to this embodiment.
  • in FIG. 11, parts denoted by the same reference numerals as in FIGS. 1 and 3 have the same functions as already explained with reference to FIGS. 1 and 3, so their detailed explanation is omitted.
  • since the smartphone 600 has the same configuration as the smartphone 300, its description is omitted.
  • the smartphone 300 includes an out-camera 301, an in-camera 302, a range sensor 303, an acceleration sensor 904, a gyro sensor 905, a geomagnetic sensor 906, an operation input interface 907, a display 908, a processor 920, a program 931 and information data 932, a vibrator 941, a microphone 942, a speaker 943, a wireless communication device 944, and the like, with the components connected to each other via a bus 950. Duplicate description of elements configured as in the HMD 100 of FIG. 10 is omitted.
  • the processor 920 executes a program that implements the functions of the information processing system according to this embodiment, thereby realizing a finger tapping motion analysis processing unit 921, a head and neck analysis processing unit 922, and a display control unit 924.
  • the display 908 displays content on a highly transparent touch panel composed of liquid crystal or the like, including notification information for the subject 10. Information to be notified to the subject 10 includes information indicating the start and end of finger tapping measurement and head and neck region measurement, information indicating other devices during simultaneous measurement, and the like. The display of the smartphone 300 also shows the various types of measured and detected biological information over time.
  • the operation input interface 907 is used by the subject 10 to input input information to the smartphone 300.
  • a touch panel configured on the display surface of the display 908 enables touch operations with a finger, a touch pen, or the like.
  • Information that the subject 10 wants to input can be set and input by detecting it as an operation input.
  • the input operation information may be captured using a keyboard, key buttons, or the like, or the subject 10 may utter a sound indicating the input operation, and the microphone 942 may collect the sound to capture the input operation information.
  • the wireless communication device 944 is a communication interface that performs wireless communication between at least the HMD 100, other devices, and the smartphone 300 by short-range wireless communication, wireless LAN, or base station communication, and performs communication processing corresponding to various predetermined communication interfaces. It includes circuits, antennas, etc., and transmits and receives biological information, image data, control signals, and the like.
  • short-range wireless communication is performed using Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark).
  • for base station communication, long-distance wireless communication such as W-CDMA (Wideband Code Division Multiple Access, registered trademark) or GSM (Global System for Mobile Communications) may be used.
  • FIG. 12 is a diagram schematically showing the appearance of an embodiment of an information processing system in which feature amounts measured and analyzed by an information processing device or other device are stored in the cloud as biometric information and utilized.
  • an information processing device 1000 comprising an HMD 100 and a smartphone 300 is connected to a server (computer) 1001 such as a cloud server, application server, or data server via a network 1002 such as the Internet.
  • the server 1001 is also connected to an information terminal such as a personal computer 1003 via a network 1002 .
  • biometric information data measured and analyzed by the information processing apparatus 1000 and other devices is stored and saved in the server 1001 forming the cloud via the network 1002 .
  • the server 1001 comprehensively stores a large amount of biometric information data, such as general average value data of biometric information and biometric information data indicating signs, omens, and indications of diseases, organized by age and by underlying disease.
  • a user 1004 such as a doctor or nurse who remotely uses an information terminal such as the personal computer 1003 at a location different from the information processing apparatus 1000 can receive the analyzed biological information data from the server 1001, acquire and compare the large amount of biological information data held by the server 1001, and analyze with high precision the precursors, omens, and signs of illness. In this way, advanced diagnosis can be received remotely from a doctor in a distant location.
  • the smartphone 300 can display on a single screen the biological information measured and detected at the same time by the HMD 100 and other devices, making it possible to accurately ascertain and judge the occurrence of abnormal behavior on the same time axis with high detection accuracy.
  • the present invention is not limited to the above-described embodiments, and includes various modifications.
  • the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and are not necessarily limited to those having all the configurations described.
  • part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • each of the above configurations, functions, processing units, processing means, etc. may be realized in hardware, for example, by designing a part or all of them with an integrated circuit.
  • each of the above configurations, functions, etc. may be realized by software by a processor interpreting and executing a program for realizing each function.
  • information such as programs, tables, and files that implement each function may be stored in a recording device such as a memory, hard disk, or SSD (Solid State Drive), in a recording medium such as an IC card, SD card, or DVD, or in a device on a communication network.
  • the control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines on a product are necessarily shown. In practice, almost all configurations may be considered interconnected.
  • Reference Signs List 1 Information processing system 1A: Information processing system 10: Subject 100: HMD 101 : Ranging sensor 102 : Left hand 103 : Finger 104 : Thumb 105 : Right hand 106 : Finger 107 : Thumb 108 : Solid line 109 : Solid line 110 : Out camera 111 : Left eye sight line sensor 112 : Right eye sight line sensor 113 : Display 121 : Screen 122 : Relax image 123 : Operation guide image 201 : Dotted frame 202 : Dotted frame 300 : Smartphone 301 : Out camera 302 : In camera 303 : Distance sensor 304 : Stand 305 : Housing 401 : Dotted frame 402 : Dotted frame 403 : Point 404: Point 501: Display screen 501a: Display screen 501b: Display screen 501c: Display screen 502: Operation guide image 503: Head and neck image 504: Head and neck region 505: Head and neck image 600: Smartphone 601: Heart rate sensor 60

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Cardiology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to an information processing device comprising a processor. The processor is connected to: a finger tap distance sensor for measuring/detecting the distance between the fingers when a subject performs a two-finger tapping motion and outputting inter-finger distance information; and a head and neck region state change sensor for measuring/detecting a state change not in the fingers but mainly in the head and neck region, and outputting head and neck region state change amount information. The processor also causes synchronization processing to be executed between the finger tap distance sensor and the head and neck region state change sensor, and acquires the inter-finger distance information and the head and neck region state change amount information obtained as a result of the measurement operations of the finger tap distance sensor and the head and neck region state change sensor being executed in synchronization.
PCT/JP2021/043586 2021-11-29 2021-11-29 Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations WO2023095321A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043586 WO2023095321A1 (fr) 2021-11-29 2021-11-29 Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043586 WO2023095321A1 (fr) 2021-11-29 2021-11-29 Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations

Publications (1)

Publication Number Publication Date
WO2023095321A1 true WO2023095321A1 (fr) 2023-06-01

Family

ID=86539293

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/043586 WO2023095321A1 (fr) 2021-11-29 2021-11-29 Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations

Country Status (1)

Country Link
WO (1) WO2023095321A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011229745A (ja) * 2010-04-28 2011-11-17 Hitachi Computer Peripherals Co Ltd 運動機能解析装置
US20140154651A1 (en) * 2012-12-04 2014-06-05 Sync-Think, Inc. Quantifying peak cognitive performance using graduated difficulty
JP2017217144A (ja) * 2016-06-06 2017-12-14 マクセルホールディングス株式会社 手指運動練習メニュー生成システム、方法、及びプログラム
JP2018505759A (ja) * 2015-01-06 2018-03-01 バートン,デイビット 携帯装着可能監視システム
JP2019511067A (ja) * 2016-01-19 2019-04-18 マジック リープ, インコーポレイテッドMagic Leap,Inc. 反射を利用する拡張現実システムおよび方法
JP2020537579A (ja) * 2017-10-17 2020-12-24 ラオ、サティシュ 神経障害を識別及び監視するための機械学習ベースのシステム
WO2021014717A1 (fr) * 2019-07-22 2021-01-28 マクセル株式会社 Dispositif de détection et procédé de détection

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011229745A (ja) * 2010-04-28 2011-11-17 Hitachi Computer Peripherals Co Ltd 運動機能解析装置
US20140154651A1 (en) * 2012-12-04 2014-06-05 Sync-Think, Inc. Quantifying peak cognitive performance using graduated difficulty
JP2018505759A (ja) * 2015-01-06 2018-03-01 バートン,デイビット 携帯装着可能監視システム
JP2019511067A (ja) * 2016-01-19 2019-04-18 マジック リープ, インコーポレイテッドMagic Leap,Inc. 反射を利用する拡張現実システムおよび方法
JP2017217144A (ja) * 2016-06-06 2017-12-14 マクセルホールディングス株式会社 手指運動練習メニュー生成システム、方法、及びプログラム
JP2020537579A (ja) * 2017-10-17 2020-12-24 ラオ、サティシュ 神経障害を識別及び監視するための機械学習ベースのシステム
WO2021014717A1 (fr) * 2019-07-22 2021-01-28 マクセル株式会社 Dispositif de détection et procédé de détection

Similar Documents

Publication Publication Date Title
Cognolato et al. Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances
CN109804331B (zh) 检测和使用身体组织电信号
CN105578954B (zh) 生理参数测量和反馈系统
US20200268296A1 (en) Brain-computer interface with adaptations for high-speed, accurate, and intuitive user interactions
Lamonaca et al. Health parameters monitoring by smartphone for quality of life improvement
Majaranta et al. Eye tracking and eye-based human–computer interaction
JP7106569B2 (ja) ユーザーの健康状態を評価するシステム
US20180206735A1 (en) Head-mounted device for capturing pulse data
EP3064130A1 (fr) Mesure d'activité cérébrale et système de rétroaction
JP2019513516A (ja) 人の視覚パフォーマンスを査定するために視覚データを入手し、集計し、解析する方法およびシステム
US10709328B2 (en) Main module, system and method for self-examination of a user's eye
WO2015123771A1 (fr) Suivi de gestes et commande en réalité augmentée et virtuelle
KR102029219B1 (ko) 뇌 신호를 추정하여 사용자 의도를 인식하는 방법, 그리고 이를 구현한 헤드 마운트 디스플레이 기반 뇌-컴퓨터 인터페이스 장치
US20220071484A1 (en) Virtual reality-based portable nystagmography device and diagnostic test method using same
Amprimo et al. Gmh-d: Combining google mediapipe and rgb-depth cameras for hand motor skills remote assessment
CN111712187A (zh) 生命信息显示设备、生命信息显示方法和程序
JP7209954B2 (ja) 眼振解析システム
WO2023095321A1 (fr) Dispositif de traitement d'informations, système de traitement d'informations et procédé de traitement d'informations
EP4088648A1 (fr) Dispositif d'imagerie, système de traitement de données de mouvement oculaire et procédé de commande
WO2023042343A1 (fr) Terminal de traitement de mesure, procédé et programme d'ordinateur pour effectuer un processus de mesure de mouvement de doigt
JP7449898B2 (ja) 撮像装置、眼球運動データ処理システム、および制御方法
Wilder et al. Eye tracking in virtual environments
KR20190076722A (ko) 모바일 기기를 이용한 vr/ar 기반 다중지능 검사방법 및 다중지능 검사시스템
US20240111380A1 (en) Finger tapping measurement processing terminal, system, method, and computer program
CN115770013B (zh) 辅助弱势人群的眼动测试方法、装置、设备及介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21965690

Country of ref document: EP

Kind code of ref document: A1