WO2023095321A1 - Information processing device, information processing system, and information processing method - Google Patents

Information processing device, information processing system, and information processing method

Info

Publication number
WO2023095321A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
sensor
information processing
head
distance
Prior art date
Application number
PCT/JP2021/043586
Other languages
French (fr)
Japanese (ja)
Inventor
治 川前
義憲 岡田
Original Assignee
マクセル株式会社
Priority date
Filing date
Publication date
Application filed by マクセル株式会社 filed Critical マクセル株式会社
Priority to PCT/JP2021/043586
Publication of WO2023095321A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00 Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb

Definitions

  • The present invention relates to an information processing device, an information processing system, and an information processing method, and particularly to an information processing device, an information processing system, and an information processing method suitable for detecting diseases such as dementia, the signs, precursors, and indications of illness, and the effects and influences of prescriptions such as treatment and medication.
  • By using the finger tapping motion as a "measure" that indicates the state of brain health, the fine motor function of the fingers can be quantified, so it can be used in various fields such as healthcare, rehabilitation, and life support.
  • Patent Document 1 describes "a motor function measuring device that calculates motion data based on the relative distance between a pair of a transmitting coil and a receiving coil attached to a movable part of a living body, and an evaluation device for evaluating the motor function of the living body based on the motion data received from the motor function measuring device."
  • Patent Document 1 thus describes measuring and evaluating the tapping motion.
  • However, in Patent Document 1, the tapping motion is measured by attaching a magnetic sensor to the finger, which requires preparation before measurement.
  • The present invention has been made in view of the above problems, and provides an information processing device, an information processing system, and an information processing method that can save the trouble of preparing for measurement of the tapping motion and can measure the tapping motion more accurately.
  • To solve the above problems, an information processing apparatus according to the present invention includes a processor. The processor is connected to a finger tap distance sensor that measures and detects the distance between two fingers during a tapping motion of the subject's two fingers and outputs inter-finger distance information, and to a head and neck region state change sensor that measures and detects changes in the state of mainly the head and neck region other than the fingers and outputs head and neck state change amount information. Synchronization processing is executed between the finger tap distance sensor and the head and neck region state change sensor, and the inter-finger distance information and the head and neck state change amount information obtained by the synchronized measurement operations of the finger tap distance sensor and the head and neck region state change sensor are acquired.
  • FIG. 1 is a diagram showing the appearance of the information processing apparatus according to the first embodiment.
  • FIG. 2 is a diagram showing, over time, the results of tapping motion measurement by the distance measuring sensor and the eye movement measurement results by the left-eye line-of-sight sensor and the right-eye line-of-sight sensor.
  • FIG. 3 is a diagram showing the appearance of the information processing apparatus according to the second embodiment.
  • FIG. 4 is a diagram showing, over time, the tapping motion measurement results by the out-camera and the distance measuring sensor, and the eye movement, cheek muscle movement, and face surface luminance change measurement results by the in-camera.
  • FIG. 5 is a diagram showing an image displayed on the display screen of the smartphone during the tapping exercise.
  • FIGS. 6A and 6B are diagrams showing the appearance of information processing systems according to the third embodiment.
  • FIG. 7 is a diagram showing a display screen of an information processing apparatus according to the third embodiment.
  • FIG. 8 is a flow chart showing the flow of processing of the information processing system.
  • FIG. 9 is a diagram showing an example of processing for synchronizing time between different devices.
  • FIG. 10 is a block diagram showing a configuration example of a head-mounted display as an example of an information processing apparatus according to an embodiment.
  • FIG. 11 is a block diagram showing a configuration example of a mobile information terminal as an example of an information processing apparatus according to an embodiment.
  • FIG. 12 is a diagram schematically showing the appearance of an embodiment in which a cloud is configured as the information processing system according to the present embodiment.
  • In the first embodiment, the information processing apparatus is a head mounted display (hereinafter referred to as an HMD) that is worn on the head of the subject 10 and displays real space information and virtual space information so that they can be visually recognized.
  • FIG. 1 is a diagram showing the appearance of the information processing apparatus according to the first embodiment.
  • the subject 10 whose tapping motion is to be measured wears the HMD 100 on its head.
  • the subject 10 is also the user of the HMD 100 .
  • The HMD 100 includes a distance measuring sensor 101, which measures and detects the distance and angle to an object, as a finger tap distance sensor that measures the distance between two fingers of the subject during the tapping motion (opening/closing motion).
  • the HMD 100 is provided with a distance measurement sensor 101 for the purpose of detecting gesture operations.
  • the distance measuring sensor 101 is used as a finger tapping distance sensor for detecting tapping motion.
  • The distance measuring sensor 101 is a sensor that can measure the distance and angle to an object and can perceive the shape of an object as a three-dimensional object.
  • As the distance measuring sensor 101, a LiDAR (Light Detection and Ranging) sensor, a TOF (Time of Flight) sensor, a millimeter-wave radar, or the like is used to detect the state of the target object.
  • the distance measuring sensor 101 detects the distance and angle to the tip of the index finger 103 and the tip of the thumb 104 of the left hand 102 performing the tapping motion as indicated by the solid line 108 . Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 103 and the tip of the thumb 104 during the tapping exercise over time. Since the depth distance can also be measured, the distance between two fingers can be accurately measured even if the fingers are oblique to the distance measuring sensor 101 .
  • the distance measuring sensor 101 detects the distance and angle to the tip of the index finger 106 and the tip of the thumb 107 of the right hand 105 performing the tapping motion as indicated by the solid line 109 . Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 106 and the tip of the thumb 107 during the tapping exercise over time.
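  • As a purely illustrative, non-limiting sketch of this computation, each fingertip reading from the distance measuring sensor 101 can be treated as a (range, azimuth, elevation) triple, converted to Cartesian coordinates, and the inter-finger distance taken as the Euclidean distance; the variable names and sample values below are assumptions for illustration only.

```python
import numpy as np

def fingertip_xyz(r, azimuth, elevation):
    # Convert one (range, azimuth, elevation) reading from the ranging sensor
    # into Cartesian coordinates; angles in radians, range in metres.
    x = r * np.cos(elevation) * np.cos(azimuth)
    y = r * np.cos(elevation) * np.sin(azimuth)
    z = r * np.sin(elevation)
    return np.array([x, y, z])

def inter_finger_distance(index_meas, thumb_meas):
    # Euclidean distance between the index fingertip and the thumb tip,
    # each given as a (range, azimuth, elevation) tuple.
    return float(np.linalg.norm(fingertip_xyz(*index_meas) - fingertip_xyz(*thumb_meas)))

# Example with assumed readings: index fingertip at 0.45 m, thumb tip at 0.42 m.
print(inter_finger_distance((0.45, 0.10, -0.05), (0.42, 0.06, -0.08)))
```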
  • the HMD 100 also includes an out-camera 110 that captures the surrounding external world in front.
  • the out-camera 110 acquires a photographed image by converting the light incident from the lens into an electric signal with an imaging element.
  • the left and right fingers performing the tapping exercise may be photographed with the out-camera 110, and the distance between the two fingers may be calculated and measured from the photographed image (out-camera image).
  • The processor 802 (see FIG. 10) of the HMD 100 detects the shape of the fingers from the out-camera image and recognizes which hand, left or right, they belong to. It can obtain the pixel positions of the fingertips in the out-camera image and detect how the positions of the two fingertips change over time.
  • The processor 802 can also estimate the distance to the fingertips from how large or small they appear, and can estimate the inclination of the two fingers with respect to the out-camera 110 from how the inner and outer sides of the two fingers appear. It is also possible to measure the distance between the two fingers by accumulating, in advance, data relating finger inclination and apparent size to distance, for example using machine learning.
  • the HMD 100 can be used to conveniently measure and detect the distance between two fingers during the tapping exercise.
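  • As one possible, purely illustrative realization of such camera-based measurement, an off-the-shelf hand-landmark detector can locate the thumb tip and index fingertip in each out-camera frame; the sketch below assumes the MediaPipe Hands library and returns distances in image-normalized units, which would still have to be calibrated to physical units (for example against the distance measuring sensor 101).

```python
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands
# Video mode: the detector tracks up to two hands across successive frames.
hands = mp_hands.Hands(static_image_mode=False, max_num_hands=2)

def two_finger_distances(rgb_frame):
    # Return the normalized thumb-tip / index-tip distance for each detected
    # hand in one RGB frame (units are image-normalized, not millimetres).
    result = hands.process(rgb_frame)
    distances = {}
    if result.multi_hand_landmarks:
        for handed, marks in zip(result.multi_handedness, result.multi_hand_landmarks):
            thumb = marks.landmark[mp_hands.HandLandmark.THUMB_TIP]
            index = marks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            d = np.linalg.norm([thumb.x - index.x, thumb.y - index.y, thumb.z - index.z])
            distances[handed.classification[0].label] = float(d)  # "Left" / "Right"
    return distances
```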
  • Based on the measured and detected information, the processor 802 can analyze and extract, as biometric information, feature amounts leading to brain function evaluation of the subject 10 by means of the arithmetic function in the HMD 100.
  • This can contribute to detection of dementia such as Alzheimer's disease, cerebrovascular disease, and Lewy body disease, as well as Parkinson's disease and developmental coordination disorder (inability to skip or jump rope, etc.).
  • Furthermore, since the dexterous motor function of the fingers can be quantified and detected as a "measure" indicating the state of brain health, it can also be used for training and rehabilitation menus for improving brain function.
  • The case where the distance between the index finger and the thumb during the tapping motion is measured over time has been described above.
  • However, the tapping motion can also be measured when a plurality of fingers are opened and closed, and it goes without saying that the same actions and effects can be obtained.
  • In the following, the case where the tapping exercise is performed with the index finger is mainly described, but the same applies to fingers other than the index finger.
  • the HMD 100 also includes a left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 that detect eye movement and line-of-sight by capturing the movements and orientations of the left and right eyes, respectively.
  • a left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 measure movements of the pupil and eyeball, blinking, and the like.
  • the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 detect the movement and orientation of the right and left eyes, respectively, and can capture the eye movement and line of sight.
  • the process of detecting the movement of the eyeball may use a well-known technique that is generally used as an eye tracking process.
  • For example, there is a known method in which the eye is irradiated with light from an infrared LED (Light Emitting Diode) and photographed with an infrared camera, and the line of sight is detected from the position of the reflection on the cornea.
  • As another method for measuring and detecting eyeball movements, there is also a known method of photographing the eye with a visible light camera and detecting the line of sight based on the positions of the pupil and iris.
  • In line-of-sight input processing that uses the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 as input devices of the HMD 100, the processor 802 detects the gaze point at the line-of-sight destination. The direction of the human gaze moves quickly, and it is difficult to determine the gaze point if every fine movement of the eyeballs is followed, so the followability of the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 is deliberately slowed down.
  • the processor 802 applies a smoothing filter along the time axis to the sensor information from the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 to detect the gaze point.
  • On the other hand, when measuring eye movement, the processor 802 uses the sensor information as it is without applying the smoothing filter, or switches to a filter with a higher time resolution than the smoothing filter used for line-of-sight input processing, so that minute movements of the line of sight can be detected.
  • In this way, the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, which detect minute pupil and eyeball movements at high speed, can detect involuntary eyeball and pupil movements unrelated to the subject's intention, rapid blinking, and the like.
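  • A minimal sketch of this filter switching is shown below; the smoothing coefficients are assumed values chosen only to illustrate the difference between the line-of-sight input mode and the eye movement measurement mode.

```python
def make_gaze_filter(mode):
    # "pointing":    heavy exponential smoothing, for a stable gaze-input cursor
    # "measurement": no smoothing, so microsaccades and involuntary sway survive
    alpha = {"pointing": 0.1, "measurement": 1.0}[mode]  # assumed coefficients
    state = {"value": None}

    def step(sample):
        if state["value"] is None or alpha >= 1.0:
            state["value"] = sample
        else:
            state["value"] = alpha * sample + (1.0 - alpha) * state["value"]
        return state["value"]

    return step

# Switching modes simply means routing the sensor samples through a different filter.
pointing_filter = make_gaze_filter("pointing")
measurement_filter = make_gaze_filter("measurement")
```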
  • As a result, it becomes possible to analyze and extract, as biometric information, feature quantities leading to evaluation of an abnormal state of the subject by means of the arithmetic function in the HMD 100.
  • This contributes to detection of signs, precursors, and indications of diseases such as dementia (Alzheimer's type, cerebrovascular type, Lewy body type, etc.) and Parkinson's disease.
  • By analyzing and extracting feature quantities based on this information, it may also be possible to discriminate signs of oculomotor dysfunction.
  • In the present embodiment, while the distance between the two fingers is measured, the movement of the pupils and eyeballs is simultaneously detected by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112.
  • Measurement of the distance between two fingers alone, or detection of pupil and eyeball movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 alone, has limits and variations in the detection accuracy of biometric information that indicates signs, precursors, and symptoms of diseases such as dementia. Therefore, by measuring these simultaneously, it becomes possible to detect biometric information that indicates a sign, precursor, or symptom of a disease with higher accuracy.
  • Before measurement, the screen 121 is displayed on the display 113 so that it can be confirmed whether or not both fingers are fully shown.
  • During measurement, the display is switched to an image different from the image of the entire two fingers performing the tapping motion. For example, during tapping motion measurement, a relaxation image 122 for relaxing the subject 10, an operation guide image 123 for finger tap distance measurement, and the like are displayed.
  • the subject 10 can measure the tapping motion in a relaxed state.
  • FIG. 2 is a diagram showing the results of tapping movement measurement by the distance measuring sensor 101 and the measurement results of eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 over time.
  • the distance between two fingers, speed, and acceleration are taken as examples of the items indicating the feature amount of the tapping motion, and the transition of these items over time is shown.
  • the information including the measurement results of the eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 corresponds to head-and-neck state variation information.
  • the distance between the two fingers is obtained by detecting the positions of the two fingers performing the tapping motion with the distance measuring sensor 101 and measuring the distance between the two fingers during the tapping motion. Velocity and acceleration are calculated and analyzed based on information on the distance between two fingers that is temporally displaced by the tapping motion.
  • For example, at the moment the distance between the two fingers is at its maximum, the velocity is zero and the acceleration takes its largest negative value.
  • feature amounts indicating features of finger movement are analyzed and extracted for each item such as distance, speed, acceleration, tap interval, and phase difference.
  • the processor 802 calculates the maximum swing width of the finger, the total moving distance, etc. from the time transition of the distance between the two fingers, and evaluates how the finger movement has changed.
  • the maximum speed, opening speed, closing speed, etc. are calculated from the change in speed over time, and how quickly and stably the finger moves is evaluated.
  • the maximum amplitude of the acceleration, the force at which the finger starts to open, the force at which it finishes opening, the force at which it finishes closing, the force at which it starts to close, the finger contact time, etc. are calculated from the transition of the acceleration over time, and the force of finger movement is evaluated.
  • the processor 802 calculates the number of taps, the tap period, the tap frequency, and the like for the tap interval item of the feature amount, and evaluates the timing of tapping.
  • For the phase difference item of the feature amounts, the processor 802 calculates the timing difference between both hands, the degree of similarity between both hands, the balance of alternating taps, and the like, and evaluates the coordination between both hands. By analyzing these, the extracted feature amounts can be used as clues to the state of the brain leading to brain function evaluation of the subject 10, and can serve as one means of testing for early detection of diseases such as Alzheimer's-type, cerebrovascular, and Lewy-body-type dementia, Parkinson's disease, and developmental coordination disorder.
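  • As a non-limiting illustration of how such feature amounts could be derived from the sampled two-finger distance waveform, the sketch below computes velocity and acceleration by finite differences together with a few of the items listed above; the sampling rate and the contact threshold are assumed values.

```python
import numpy as np

def tapping_features(distance_mm, fs_hz=120.0, contact_mm=2.0):
    # distance_mm : 1-D array of inter-finger distance samples [mm]
    # fs_hz       : sampling rate of the finger tap distance sensor (assumed)
    # contact_mm  : distance below which the fingers count as touching (assumed)
    d = np.asarray(distance_mm, dtype=float)
    dt = 1.0 / fs_hz
    velocity = np.gradient(d, dt)             # mm/s
    acceleration = np.gradient(velocity, dt)  # mm/s^2

    touching = d < contact_mm
    tap_starts = np.flatnonzero(~touching[:-1] & touching[1:])  # closing contacts
    tap_intervals = np.diff(tap_starts) * dt

    return {
        "max_amplitude_mm": float(d.max() - d.min()),
        "total_travel_mm": float(np.abs(np.diff(d)).sum()),
        "max_opening_speed_mm_s": float(velocity.max()),
        "max_closing_speed_mm_s": float(-velocity.min()),
        "max_acceleration_mm_s2": float(np.abs(acceleration).max()),
        "tap_count": int(tap_starts.size),
        "mean_tap_interval_s": float(tap_intervals.mean()) if tap_intervals.size else None,
    }
```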
  • The movement distance in the eyeball movement measurement results is obtained by detecting the position of the eyeball with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 and measuring the distance the eyeball has moved over time during the tapping motion.
  • the movement distance is detected by determining in which direction and how much distance the eyeball has moved from a certain line-of-sight direction as an initial position.
  • Although FIG. 2 shows the movement distance in the horizontal direction, it may instead be represented by the distance in the vertical direction or by a distance vector from the initial position; it is also possible to display an index in which the characteristic appears conspicuously.
  • the average value of both eyes may be indicated, or one of the left and right eyes, in which the characteristic is prominently displayed, may be indicated. Of course, it may be displayed for both eyes.
  • the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 measure the high-speed and minute movements of both eyes.
  • For this purpose, parameters are adjusted, for example by increasing the frame rate of the eye measurement camera, reducing the number of frames averaged per detection, or performing detection on every frame.
  • the detection range is adapted to the magnitude of the movement.
  • the smoothing of the waveform when displaying the graph should be eliminated as much as possible so that the state of exercise can be displayed as it is.
  • Abnormal eye movements may be symptoms that are likely to occur in people with specific diseases. Therefore, based on the above-described tapping motion and the detected eyeball movement information, the arithmetic function in the information processing apparatus extracts feature amounts leading to an abnormal state evaluation of the subject. If the analysis results indicate signs, precursors, or indications of illness, the tendency can be detected at an early stage by a simple method. In addition, there is a possibility of early detection of dementia such as Alzheimer's disease, cerebrovascular disease, and Lewy body disease, as well as Parkinson's disease and disorders of eye movement function.
  • In the second embodiment, the information processing device is implemented in a widely used portable information terminal, such as a smartphone or tablet, that has a function of photographing the external world in front and the subject, and a function of displaying information.
  • FIG. 3 is a diagram showing the appearance of an information processing apparatus according to the second embodiment.
  • a smart phone 300 will be described as a specific example of a portable information terminal.
  • The smartphone 300 is provided, on the back of its housing 305, with an out-camera 301 that captures an image of the outside world in front and a distance measuring sensor 303 that measures and detects the distance and angle to an object.
  • A display 908 (see FIG. 11) having a display screen is provided on the front of the housing 305.
  • the smartphone 300 is supported by a stand 304.
  • the subject 10 positions the smartphone 300 and the hand so that the finger to be measured is within the angle of view of the out-camera 301 .
  • The left hand 102 and the right hand 105 performing the tapping motion are photographed by the out-camera 301 as indicated by the solid line 108, and the distance between the index finger 103 and the thumb 104 and the distance between the index finger 106 and the thumb 107, that is, the distance between the two fingers of each hand, are calculated from the out-camera image.
  • With the out-camera 301 alone, finger movements can only be captured in two dimensions, and, for example, when the hand is tilted with respect to the out-camera 301, accurate measurement may not be possible. In such a case, the depth can be detected by using the distance measuring sensor 303, and the distance between the two fingers during the tapping operation can be measured accurately.
  • When measuring the distance between two fingers, only the out-camera 301 or only the distance measuring sensor 303 may be used, or the out-camera 301 and the distance measuring sensor 303 may be used together, in consideration of the balance between detection accuracy and cost.
  • the feature amount can be extracted by the arithmetic function in the information processing apparatus, and the same effect as described in the first embodiment can be obtained.
  • the smartphone 300 also has an in-camera 302 on the front of the housing 305 .
  • the in-camera 302 photographs the eyes, mouth, cheeks, temples, neck, and other head and neck regions of the subject 10 .
  • From the in-camera image, the processor 920 measures the entire face and the movement of head and neck regions such as the eyeballs/pupils (including blinking), mouth, cheeks, temples, and neck.
  • the in-camera 302 can observe not only the abnormal movement of the eyeballs and pupils described in the first embodiment, but also the spasms of the muscles of the entire face, mouth, cheeks, and temples, and the blood flow of the cheeks, temples, neck, and the like.
  • the movement of the eyeball and pupil can be detected based on the movement of the pupil and the area of the white of the eye. It is also possible to detect the opening and closing of the pupil and the iris.
  • the processor 920 can extract features from eyeball and pupil movements.
  • By using an image of a part of the face of the subject 10 captured by the in-camera 302, such as the mouth, cheeks, or temples (this may be a face image of the entire face of the subject 10, or a partial image cut out from an image of the head and neck region captured from the neck up), spasms and twitches of the muscles of the mouth, cheeks, temples, and the like can be detected, and biometric information indicating signs, precursors, and indications of illness can be obtained.
  • Furthermore, the average value of each color component (red, green, blue) of the face area is calculated for each acquired frame, the noise common to the three color components is removed, and a luminance waveform is extracted from the green component; from this waveform, the pulse wave can be measured.
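  • A minimal sketch of this pulse-wave extraction is shown below; the frame rate, filter order, pass band, and input layout are assumed values for illustration, not those of the embodiment.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def pulse_from_face_rgb(mean_rgb, fps=30.0):
    # mean_rgb : array of shape (n_frames, 3), the mean (R, G, B) of the face
    #            region in each in-camera frame.
    rgb = np.asarray(mean_rgb, dtype=float)
    # Remove the component common to all three colour channels (illumination noise).
    common = rgb.mean(axis=1, keepdims=True)
    green = rgb[:, 1] - common[:, 0]

    # Keep only plausible pulse frequencies (about 0.7-3.0 Hz, i.e. 42-180 bpm).
    b, a = butter(3, [0.7, 3.0], btype="bandpass", fs=fps)
    pulse_wave = filtfilt(b, a, green)

    # Pulse rate from the spacing of the waveform peaks (cf. points 403 and 404 in FIG. 4).
    peaks, _ = find_peaks(pulse_wave, distance=int(fps * 0.4))
    bpm = 60.0 * fps / float(np.mean(np.diff(peaks))) if peaks.size > 1 else None
    return pulse_wave, bpm
```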
  • the biometric information may be obtained by measuring another state change in the head and neck region such as the eyes, mouth, cheeks, temples, and neck, or by measuring the state change in a region other than the head and neck region.
  • In this way, the out-camera 301 and the distance measuring sensor 303, as finger tap distance sensors, detect the distance between the two fingers during the two-finger tapping operation, while at the same time the in-camera 302, as a head and neck region state change sensor, images the head and neck region and detects changes in its state.
  • FIG. 4 is an example of a diagram showing, over time, the result of tapping motion measurement by the out-camera 301 and the distance measuring sensor 303, together with the measurement results of eyeball movement, cheek muscle movement, and face surface luminance change by the in-camera 302.
  • In FIG. 4, the description of the tapping motion measurement results and the eyeball movement measurement results is the same as that given for FIG. 2 and is therefore omitted.
  • The detected small movement fluctuations can capture spasms and twitches of the cheeks as biological information.
  • The measurement result of the luminance change on the face surface shows the luminance waveform extracted from the green component; the pulse rate is calculated from the peak values of this luminance waveform (for example, points 403 and 404), and the pulse wave can be obtained as biological information.
  • FIG. 5 is an example of a diagram showing an image displayed on the display screen 501 of the smartphone 300 during tapping exercise.
  • Display screen 501a is a display image before finger tapping measurement, and display screens 501b and 501c are screens displayed on the display 908 (see FIG. 11) of the smartphone 300 during measurement of the distance between two fingers in the tapping exercise.
  • The display screen 501a displays the appearance of the entire fingers photographed by the out-camera 301 and a head and neck image 505, mainly of the subject's face, photographed by the in-camera 302.
  • This makes it possible to confirm, before measurement, that the two fingers performing the tapping exercise to be measured are within the photographing range of the out-camera 301.
  • Likewise, with the head and neck image 505, it can be confirmed before measuring the distance between the two fingers that the head and neck region, mainly including the face of the subject 10, is within the photographing range of the in-camera 302.
  • The head and neck region may also be photographed from before the tapping motion measurement starts. This is because attention is focused on the fingers during the measurement of the tapping motion, so recording the state of the head and neck region while it is not in a state of concentration may complement the biological information.
  • During measurement, the display 908 displays an image different from the overall appearance of both fingers performing the tapping motion. As a result, the influence of visual feedback on the subject 10 from watching the finger tapping motion during measurement can be suppressed, enabling more accurate tapping motion measurement.
  • On the display screen 501b, an operation guide image 502 for measuring the distance between two fingers during the tapping exercise and a head and neck image 503, mainly including the face of the subject 10, photographed by the in-camera 302 are displayed.
  • With the operation guide image 502, the tapping motion can be measured while the subject observes the guidance for the correct tapping motion and the progress of the tapping motion measurement.
  • With the head and neck image 503, it can be visually confirmed that the head and neck regions 504 such as the eyes, mouth, cheeks, temples, and neck are being photographed by the in-camera 302 without any problem.
  • a relaxation image for relaxing the subject 10 is displayed on the display screen 501c.
  • the subject 10 can measure the tapping motion in a relaxed state.
  • The third embodiment links a first information processing device, which simultaneously measures the distance between two fingers during tapping and detects eyeball movement, with other devices that measure and detect other biological information (referred to as "biological information sensors"), and the pieces of biological information measured and detected by the first information processing device and the other devices are displayed on a second information processing device with their time stamps synchronized.
  • FIGS. 6A and 6B are diagrams showing the appearance of information processing systems according to the third embodiment.
  • The information processing system 1 shown in FIG. 6A includes an HMD 100 as the first information processing device, a smartphone 600 as the second information processing device, and, as biological information sensors, a heart rate sensor 601 worn on the chest of the subject 10, an electroencephalograph 602 worn on the head of the subject 10, a smartwatch 603 and a blood sugar level sensor 604 worn on the forearm of the subject 10, a blood pressure monitor 605 worn on the upper arm of the subject 10, and an audiometer 606 used at the ears of the subject 10.
  • The smartphone 600, the HMD 100, and each biological information sensor are configured to be connected for communication.
  • Short-range radio such as Bluetooth (registered trademark) may be used for communication.
  • The HMD 100 uses the out-camera 110 or the distance measuring sensor 101 to measure the distance between the two fingers during the tapping motion, and at the same time the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 detect eyeball movements.
  • the smartphone 600 displays each biological information measured and detected by the HMD 100 and each biological information sensor.
  • The heart rate sensor 601 electrically measures the contraction motion of the heart, and the result is charted as an electrocardiogram.
  • The electroencephalograph 602 measures and records, from electrodes attached to the head of the subject 10, the very small electrical currents that the brain continuously produces in association with its activity, making it possible to examine the activity state and active regions of the brain.
  • The smartwatch 603 is a wristwatch-type wearable terminal equipped with a touch screen and a CPU, and has a function of measuring the heart rate, a function of measuring the stress state from the action potential of the skin, and a function of displaying an electrocardiogram.
  • The wristwatch-type blood sugar level sensor 604 uses a mid-infrared laser with a wavelength band of several μm to measure the blood sugar level.
  • The sphygmomanometer 605 measures, by wrapping a bag-shaped belt called a cuff around the upper arm, the pressure (blood pressure) with which the blood flow in the brachial artery pushes the vessel from the inside, and can be used to grasp the health condition and mental condition of the subject 10.
  • The audiometer 606 emits sounds of seven frequencies between 125 Hz and 8000 Hz at various intensities, and the subject 10 listens through headphones so that hearing ability can be measured.
  • The various biological information measured and detected by each of the biological information sensors, and the biological information of the fingers and head and neck region measured and detected by the HMD 100, are acquired together with time information (time stamps).
  • The various biological information measured and detected by each biological information sensor and the biological information of the fingers and head and neck region measured and detected by the HMD 100 are transmitted to the smartphone 600, mainly via wireless communication.
  • The smartphone 600 displays, on one screen, the various biometric information measured with the timing of each biometric information sensor and the HMD 100 matched; that is, the various kinds of biological information with synchronized time stamps are displayed on one screen.
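  • One simple, purely illustrative way to line up the differently sampled streams on a common time axis is a nearest-timestamp merge after each stream has been corrected by its device offset; pandas and the column names below are assumptions, not part of the embodiment.

```python
import pandas as pd

def align_streams(finger_df, eye_df, ecg_df, tolerance_ms=20):
    # Each input is a DataFrame with a datetime 'timestamp' column already
    # corrected by the per-device offset obtained from the synchronization processing.
    tol = pd.Timedelta(milliseconds=tolerance_ms)
    merged = finger_df.sort_values("timestamp")
    for other in (eye_df, ecg_df):
        merged = pd.merge_asof(merged, other.sort_values("timestamp"),
                               on="timestamp", direction="nearest", tolerance=tol)
    return merged  # one row per finger-tap sample, ready to plot on one screen
```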
  • In FIG. 6A, the smartphone 600 is shown as the second information processing device, but this is not essential; the measurement results may instead be displayed on the screen of the HMD 100 through cooperation between the HMD 100 and each biological information sensor.
  • The information processing system 1A shown in FIG. 6B uses the smartphone 300 as both the first information processing device and the second information processing device, uses the same biological information sensors as those shown in FIG. 6A, and connects each biological information sensor to the smartphone 300 for communication.
  • The information processing system 1A of FIG. 6B differs from the configuration of the information processing system 1 shown in FIG. 6A only in that the first information processing device is changed from the HMD 100 to the smartphone 300, and the same actions and effects can be obtained.
  • FIG. 7 is a diagram showing a display screen of an information processing apparatus according to the third embodiment, and shows an example of a display screen on which the various types of biological information measured and detected in the embodiment shown in FIG. 6A are displayed.
  • In this example, finger tapping, eye movement, cheek movement, the electrocardiogram, and the electroencephalogram are measured at the same time, and the measurement results are displayed with their time axes aligned.
  • The measurement results of finger tapping, eye movement, and temple movement are the same as those described with reference to FIGS. 2 and 4, and their description is omitted.
  • the measurement results of the electrocardiogram are obtained by electrically measuring the contraction motion of the heart by the heartbeat sensor 601. In normal times, the same waveform is repeated periodically, and the state of the heart can be grasped from the degree of change in the waveform.
  • From the electroencephalogram measurement results, areas where damage is suspected can be determined from the state of the waveform, making it possible to diagnose diseases such as episodic loss of consciousness, convulsions, and cognitive dysfunction. Therefore, by displaying not only the results of tapping motion measurement but also the various biological information detected at other sites on a single screen, signs of disease can be detected accurately on the same time axis.
  • As the second information processing device, an information processing device having an information communication function and a display function is used; it is not limited to a smartphone and may be another device such as a personal computer or a monitor device.
  • FIG. 8 is a flow chart showing the processing flow of the information processing system 1 .
  • FIG. 9 shows an example of processing for synchronizing times between different devices.
  • the first information processing device in the following description may be, for example, the smartphone 300, and the biometric information sensor may be the heartbeat sensor 601.
  • synchronization processing is performed between the first information processing device and the biological information sensor (S01).
  • In order to synchronize its clock with the biological information sensor, the first information processing device sends a synchronization signal (sig.1) containing its command issue time (TA0) to the biological information sensor.
  • The biological information sensor then transmits a synchronization signal (sig.2), including the time TB1 at which the synchronization signal (sig.1) was received from the first information processing device and the command issue time (TB2) of the biological information sensor, to the first information processing device.
  • The first information processing device that has received the synchronization signal (sig.2) can grasp, from the difference between the time TA0 at which it transmitted the synchronization signal (sig.1) and the time TA3 at which it received the synchronization signal (sig.2), and from the difference between the time TB1 at which the biological information sensor received the synchronization signal (sig.1) and the time TB2 at which it transmitted the synchronization signal (sig.2), the time required for communication between the first information processing device and the biological information sensor and the time deviation between the devices.
  • Similarly, a synchronization signal (sig.3) sent back from the first information processing device allows the biological information sensor to grasp the time required for communication between the devices and the time lag between the devices. By grasping the time difference ΔT between the devices and the time required for communication in this way, the measurement results of each device can be displayed along a common time axis. However, if the first information processing device and the biological information sensor each incorporate a clock indicating a time common to the world, such processing is not necessary.
  • In the following, it is assumed that the first information processing device calculates the time lag ΔT.
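  • For reference, the clock offset ΔT and the one-way communication delay follow from the four time stamps TA0, TB1, TB2, and TA3 in the same way as in NTP-style clock synchronization; the sketch below is illustrative and its sign convention is an assumption.

```python
def clock_sync(ta0, tb1, tb2, ta3):
    # ta0: device sends sig.1        tb1: sensor receives sig.1
    # tb2: sensor sends sig.2        ta3: device receives sig.2
    # (each time read from that device's own clock, in seconds)
    round_trip = (ta3 - ta0) - (tb2 - tb1)
    one_way_delay = round_trip / 2.0
    offset_dT = ((tb1 - ta0) + (tb2 - ta3)) / 2.0  # sensor clock minus device clock
    return offset_dT, one_way_delay

# A sensor time stamp tb is mapped onto the device's time axis as tb - offset_dT.
```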
  • the first information processing device simultaneously measures the distance between two fingers in the tapping motion and detects the amount of head and neck state change by the head and neck region state change sensor, and the biological information sensor detects the biological information (S02).
  • the biological information is transmitted from the sensor to the first information processing device (S03).
  • The first information processing device combines the measurement result of the distance between two fingers and the head and neck state change amount information detected by the first information processing device with the biological information received from the biological information sensor, correcting the sensor time stamps by the time lag ΔT (a sensor time stamp TAn corresponds to time TAn - ΔT on the first information processing device), and displays them on the display of the first information processing device along a common time axis (S04).
  • Measurement is continued until an instruction to end measurement is input to the first information processing device (S05: No, S02), and if there is an instruction to end measurement (S05: Yes), the measurement ends.
  • FIG. 10 is a block diagram showing a configuration example of a head-mounted display as an example of the information processing apparatus according to this embodiment.
  • In FIG. 10, parts denoted by the same reference numerals as those shown in FIGS. 1 and 3 perform the same operations as those already explained for FIGS. 1 and 3, so detailed explanation thereof is partially omitted.
  • the HMD 100 includes an out-camera 110, a distance measuring sensor 101, a left-eye line-of-sight sensor 111, a right-eye line-of-sight sensor 112, an acceleration sensor 804, a gyro sensor 805, a geomagnetic sensor 806, an operation input interface 807, a display 113, a processor 820, a program 831 and information data 832 , a vibrator 841 , a microphone 842 , a speaker 843 , and a communication device 844 .
  • the acceleration sensor 804 is a sensor that detects acceleration, which is a change in speed per unit time, and can detect movement, vibration, impact, and the like.
  • An acceleration sensor 804 provided in the HMD 100 can detect the tilt and direction of the HMD 100 .
  • the gyro sensor 805 is a sensor that detects the angular velocity in the rotational direction, and can capture the state of vertical, horizontal, and diagonal postures. Therefore, using the acceleration sensor 804 and the gyro sensor 805, it is possible to detect the posture such as the tilt and direction of the HMD 100.
  • The geomagnetic sensor 806 is a sensor that detects the magnetic force of the earth and detects the direction in which the HMD 100 is facing. It is also possible to detect the movement of the HMD 100 by using a three-axis type that detects geomagnetism in the vertical direction as well as in the front-rear and horizontal directions, and capturing changes in the geomagnetism accompanying the movement of the HMD 100. With these sensors, the posture state of the subject 10 wearing the HMD 100 can be detected and determined, which can contribute to measuring the distance between the two fingers during tapping more accurately.
  • the acceleration sensor 804, the gyro sensor 805, and the geomagnetic sensor 806 can detect the movement of the head and neck of the subject 10, they may be used as head and neck region state change sensors.
  • The processor 820 is composed of a CPU or the like, and controls each component of the HMD 100 by executing programs 831, such as an operating system (OS) and an operation control application, stored in a memory 830, thereby realizing functions such as the OS, middleware, and applications.
  • The processor 820 executes a program that implements the functions of the information processing system according to this embodiment, thereby realizing a finger tapping motion analysis processing unit 821, an eye movement detection analysis processing unit 822, a switching control unit 823, and a display control unit 824.
  • the finger tapping motion analysis processing unit 821 analyzes the trajectory of the finger tapping motion from the motion of the fingers detected by the out-camera 110 or the distance measuring sensor 101, and calculates characteristic parameters.
  • The eye movement detection analysis processing unit 822 captures eye movement from the movement and direction of the eyeballs and pupils detected by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts feature quantities that lead to detection of signs of an abnormal state of the subject, such as involuntary swaying of the eyeballs and pupils.
  • The switching control unit 823 controls switching between the line-of-sight tracking operation for pointing at a target object and the eye movement measurement for measuring the state of eye movement, which are processed differently by the eye movement detection analysis processing unit 822.
  • In the eye movement measurement, minute movement states of the subject's eyeballs are observed and feature quantities are extracted.
  • Between line-of-sight tracking and eye movement measurement, the tracking speed for changes in gaze direction and eye movement is changed, and processing such as detection of minute movements is performed to analyze and extract involuntary eyeball and pupil movements.
  • The display control unit 824 displays the time transition of the distance between two fingers detected by the finger tap distance sensor, the time transition of the head and neck region state change amount detected by the head and neck region state change sensor, and, when a biological information sensor is present, its biological information, side by side on the same screen of the display 113 with their time axes aligned.
  • the memory 830 is composed of a non-volatile storage device or the like, and stores various programs 831 and information data 832 handled by the processor 820 and the like.
  • As the information data 832, captured image data from the tapping exercise, operation guide image data for finger tap distance measurement, relaxation image data, measured and detected biological information, and the like are stored.
  • The display 113 is provided in the video see-through type HMD 100 and is composed of a display such as a liquid crystal panel that displays together the physical objects in front of the user photographed by the out-camera 110 and virtual objects.
  • In the case of an optical see-through HMD, instead of the display 113 there are provided, for example, a projection unit that projects virtual objects such as the operation guide image for finger tap distance measurement, the relaxation image, and notification information for the subject 10, and a transparent half-mirror that forms and displays the images of these virtual objects in front of the eyes.
  • With this, the subject 10 can visually recognize both the real objects in the visual field in front of the subject 10 and the virtual objects formed as floating images.
  • the operation input interface 807 is an input interface using, for example, gesture operations, voice input, virtual keyboards, touch sensors, etc., and is used to set and input information that the subject 10 wants to input.
  • the operation input interface 807 may be separated from the main body of the HMD 100 and connected by wire or wirelessly.
  • the input operation screen may be displayed within the display screen of the display 113 and the input operation information may be captured according to the position on the input operation screen to which the line of sight is directed.
  • Alternatively, the input operation information may be captured by operating a pointer.
  • the subject may utter a sound indicating the input operation, and the microphone 842 may collect the sound to capture the input operation information.
  • the speaker 843 outputs sound from the speaker or headphones based on the sound data, so that the subject 10 can be informed of the notification information by sound.
  • the vibrator 841 generates vibration under the control of the processor 820, and converts the notification instruction information transmitted by the HMD 100 to the subject 10 into vibration.
  • the vibrator 841 can transmit vibration to the head of the subject 10 wearing the HMD 100 to inform the subject 10 of the notification instruction information.
  • Examples of information to be notified to the subject 10 include information indicating the start and end of finger tapping measurement and head and neck measurement, information indicating other equipment during simultaneous measurement, interrupt notifications, and the like. Such notification information can improve usability.
  • the communication device 844 is a communication interface that performs wireless communication between at least the HMD 100, other devices, and the smartphones 300 and 600 by short-range wireless communication, wireless LAN, or base station communication, and is compatible with various predetermined communication interfaces. It includes a processing circuit, an antenna, etc., and transmits and receives biological information, image data, control signals, and the like.
  • Short-range wireless communication is performed using Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark).
  • The finger tapping motion analysis processing unit 821 analyzes, based on the information on the distance between the two fingers during the tapping motion measured and detected using the distance measuring sensor 101, each item such as distance, speed, acceleration, tap interval, and phase difference, and analyzes and extracts feature quantities indicating the features of the finger movement. Biological information leading to brain function evaluation of the subject can be obtained from the analyzed and extracted feature amounts.
  • That is, the distance measuring sensor 101 measures the distance information between the two fingers during the tapping motion, and the finger tapping motion analysis processing unit 821 extracts feature quantities indicating the features of the finger movement based on the measured distance information between the two fingers.
  • The eye movement detection analysis processing unit 822 detects eye movement information with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts feature quantities that lead to detection of signs of an abnormal state of the subject, such as involuntary eyeball or pupil sway.
  • In the HMD 100, it is possible to obtain, as biometric information, a plurality of feature amounts analyzed and extracted based on these pieces of information measured and detected at the same time. Furthermore, it is possible to realize an information processing apparatus capable of detecting, with high accuracy, biometric information indicating a sign, precursor, or symptom of a disease in a simple and easy-to-use manner without using special instruments.
  • FIG. 11 is a block diagram showing a configuration example of the smartphone 300, which is a mobile information terminal, as an example of an information processing apparatus according to this embodiment.
  • In FIG. 11, parts denoted by the same reference numerals as those shown in FIGS. 1 and 3 have the same functions as those already explained with reference to FIGS. 1 and 3, so detailed explanation thereof is omitted.
  • Since the smartphone 600 has the same configuration as the smartphone 300, a description thereof is omitted.
  • smartphone 300 includes out camera 301, in camera 302, range sensor 303, acceleration sensor 904, gyro sensor 905, geomagnetic sensor 906, operation input interface 907, display 908, processor 920, program 931 and information data 932. , a vibrator 941 , a microphone 942 , a speaker 943 , a wireless communication device 944 , etc., and each component is connected to each other via a bus 950 . Duplicate description of the same configuration as that of the HMD 100 in FIG. 10 will be omitted.
  • the processor 920 executes a program that implements the functions of the information processing system according to the present embodiment, thereby performing a finger tapping motion analysis processing unit 921, a head and neck analysis processing unit 922, and a display control unit. 924 is realized.
  • The display 908 displays content via a highly transparent touch panel composed of liquid crystal or the like, and displays information such as notifications to the subject 10. Information to be notified to the subject 10 includes information indicating the start and end of finger tapping measurement and measurement of the head and neck region, information indicating other devices during simultaneous measurement, and the like. The display 908 of the smartphone 300 also displays the various types of measured and detected biological information over time.
  • the operation input interface 907 is used by the subject 10 to input input information to the smartphone 300.
  • A touch panel configured on the display surface of the display 908 enables touch operations with a finger, a touch pen, or the like.
  • Information that the subject 10 wants to input can be set and input by detecting it as an operation input.
  • the input operation information may be captured using a keyboard, key buttons, or the like, or the subject 10 may utter a sound indicating the input operation, and the microphone 942 may collect the sound to capture the input operation information.
  • the wireless communication device 944 is a communication interface that performs wireless communication between at least the HMD 100, other devices, and the smartphone 300 by short-range wireless communication, wireless LAN, or base station communication, and performs communication processing corresponding to various predetermined communication interfaces. It includes circuits, antennas, etc., and transmits and receives biological information, image data, control signals, and the like.
  • Short-range wireless communication is performed using Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark).
  • For base station communication, long-distance wireless communication such as W-CDMA (Wideband Code Division Multiple Access, registered trademark) or GSM (Global System for Mobile Communications) may be used.
  • FIG. 12 is a diagram schematically showing the appearance of an embodiment of an information processing system in which feature amounts measured and analyzed by an information processing device or other device are stored in the cloud as biometric information and utilized.
  • an information processing device 1000 comprising an HMD 100 and a smartphone 300 is connected to a server (computer) 1001 such as a cloud server, application server, or data server via a network 1002 such as the Internet.
  • the server 1001 is also connected to an information terminal such as a personal computer 1003 via a network 1002 .
  • biometric information data measured and analyzed by the information processing apparatus 1000 and other devices is stored and saved in the server 1001 forming the cloud via the network 1002 .
  • The server 1001 comprehensively stores a large amount of biometric information data, such as general average value data of biometric information and biometric information data indicating signs, precursors, and indications of diseases by age and by underlying disease.
  • A user 1004, such as a doctor or nurse, who remotely uses an information terminal such as the personal computer 1003 at a location different from that of the information processing apparatus 1000, can receive the analyzed biological information data from the server 1001, acquire and compare it with the large amount of biological information data held by the server 1001, and analyze signs, precursors, and indications of illness with high precision. In this way, it is possible to receive an advanced diagnosis remotely from a doctor at a distant location.
  • Furthermore, the smartphone 300 can display on a single screen the biological information measured and detected at the same time by the HMD 100 and the other devices, making it possible to accurately ascertain and judge, with high detection accuracy, the occurrence of abnormal behavior on the same time axis.
  • The present invention is not limited to the above-described embodiments and includes various modifications.
  • The above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to embodiments having all of the described configurations.
  • Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment.
  • Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example, by designing part or all of them as an integrated circuit.
  • Each of the above configurations, functions, and the like may be realized by software, with a processor interpreting and executing a program that realizes each function.
  • Information such as programs, tables, and files that implement each function may be stored in a recording device such as a memory, hard disk, or SSD (Solid State Drive), in a recording medium such as an IC card, SD card, or DVD, or in a device on a communication network.
  • The control lines and information lines shown are those considered necessary for explanation, and not all control lines and information lines on a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
  • Reference Signs List 1 Information processing system 1A: Information processing system 10: Subject 100: HMD 101 : Ranging sensor 102 : Left hand 103 : Finger 104 : Thumb 105 : Right hand 106 : Finger 107 : Thumb 108 : Solid line 109 : Solid line 110 : Out camera 111 : Left eye sight line sensor 112 : Right eye sight line sensor 113 : Display 121 : Screen 122 : Relax image 123 : Operation guide image 201 : Dotted frame 202 : Dotted frame 300 : Smartphone 301 : Out camera 302 : In camera 303 : Distance sensor 304 : Stand 305 : Housing 401 : Dotted frame 402 : Dotted frame 403 : Point 404: Point 501: Display screen 501a: Display screen 501b: Display screen 501c: Display screen 502: Operation guide image 503: Head and neck image 504: Head and neck region 505: Head and neck image 600: Smartphone 601: Heart rate sensor 60

Abstract

This information processing device is provided with a processor. The processor is connected to: a finger tap distance sensor that measures and detects the inter-finger distance when a subject performs a two-finger tapping motion and outputs inter-finger distance information; and a head and neck region state change sensor that measures and detects a state change not in the fingers but mainly in the head and neck region and outputs head and neck region state change amount information. The processor also causes synchronization processing to be executed between the finger tap distance sensor and the head and neck region state change sensor, and acquires the inter-finger distance information and the head and neck region state change amount information obtained by the two sensors performing their measurement operations in synchronization.

Description

Information processing device, information processing system, and information processing method
The present invention relates to an information processing device, an information processing system, and an information processing method, and in particular to an information processing device, an information processing system, and an information processing method suitable for detecting signs, precursors, and symptoms of diseases such as dementia, and the effects and influences of prescriptions such as treatment and medication.
With the aging of the population, the number of patients with diseases such as Alzheimer's dementia is increasing year by year. If such diseases can be detected early, their progression can be slowed by medication or other means, and severe cases can be prevented. For example, as a screening test for the early detection of Alzheimer's dementia, a method of evaluation based on measurement of the opening and closing movement (tapping movement) of two fingers, the thumb and index finger of both hands, has been studied in recent years. The aim is to capture, through tapping-movement measurement, the decline in the rhythmic motor function of the fingers of both hands caused by brain atrophy in Alzheimer's dementia. The fingers are also said to be a "second brain", and many areas of the brain are related to the working of the fingers; finger movement is said to be related not only to Alzheimer's dementia but also to cerebrovascular and Lewy body dementia, Parkinson's disease, developmental coordination disorder (inability to skip or jump rope, etc.), and the like. In other words, research is progressing with the aim of learning the state of the brain, as biological information, from finger tapping movements.
Furthermore, by using the finger tapping movement as a "measure" indicating the state of brain health, the fine motor function of the fingers can be quantified, so it can also be used in various fields such as healthcare, rehabilitation, and life support.
As a method for measuring and evaluating the tapping movement, for example, Patent Document 1 below describes "a motor function evaluation system comprising a motor function measuring device that calculates motion data based on the relative distance between a pair of a transmitting coil and a receiving coil attached to a movable part of a living body, and an evaluation device that evaluates the motor function of the living body based on the motion data received from the motor function measuring device".
JP 2016-49123 A
Although Patent Document 1 describes measuring and evaluating the tapping movement, the tapping movement is measured with magnetic sensors attached to the fingers. For elderly people or those with impaired finger motor function, however, attaching magnetic sensors to the fingers can be difficult, and tapping can be difficult while the sensors are worn, so there is a demand for a simpler way to perform the test.
The present invention has been made in view of the above problems, and provides an information processing device, an information processing system, and an information processing method that eliminate the trouble of preparing for measurement of the tapping movement and that allow the tapping movement to be performed and measured more accurately.
To solve the above problems, the present invention has the configurations described in the claims. As one example, an information processing apparatus according to the present invention includes a processor, and the processor is connected to a finger tap distance sensor that measures and detects the distance between two fingers during a two-finger tapping movement of a subject and outputs inter-finger distance information, and to a head and neck region state change sensor that measures and detects a state change mainly in the head and neck region other than the fingers and outputs head and neck state change amount information; the processor causes synchronization processing to be executed between the finger tap distance sensor and the head and neck region state change sensor, and acquires the inter-finger distance information and the head and neck state change amount information obtained by the two sensors performing their measurement operations in synchronization.
According to the present invention, it is possible to provide an information processing device, an information processing system, and an information processing method that eliminate the trouble of preparing for measurement of the tapping movement and that allow the tapping movement to be performed and measured more accurately. Problems, configurations, and effects other than those described above will be clarified by the following description of the embodiments.
FIG. 1 is a diagram showing the appearance of an information processing apparatus according to the first embodiment.
FIG. 2 is a diagram showing, over time, the measurement results of the tapping movement by the distance measuring sensor and the measurement results of the eye movement by the left-eye and right-eye line-of-sight sensors.
FIG. 3 is a diagram showing the appearance of an information processing apparatus according to the second embodiment.
FIG. 4 is a diagram showing, over time, the measurement results of the tapping movement by the out-camera and the distance measuring sensor, the measurement results of the eye movement by the in-camera, the measurement results of the cheek muscle movement by the in-camera, and the measurement results of the luminance change of the face surface by the in-camera.
FIG. 5 is a diagram showing images displayed on the display screen of the smartphone during the tapping exercise.
FIG. 6A is a diagram showing the appearance of an information processing apparatus according to the third embodiment.
FIG. 6B is a diagram showing the appearance of an information processing apparatus according to the third embodiment.
FIG. 7 is a diagram showing a display screen of an information processing apparatus according to the third embodiment.
FIG. 8 is a flowchart showing the flow of processing of the information processing system.
FIG. 9 is a diagram showing an example of processing for synchronizing time between different devices.
FIG. 10 is a block diagram showing a configuration example of a head mounted display as an example of the information processing apparatus according to the present embodiment.
FIG. 11 is a block diagram showing a configuration example of a mobile information terminal as an example of the information processing apparatus according to the present embodiment.
FIG. 12 is a diagram schematically showing the appearance of an embodiment in which a cloud is configured as the information processing system according to the present embodiment.
 以下、本発明の実施形態の例を、図面を用いて説明する。全図を通じて同一の構成には同一の符号を付し、重複説明を省略する。 Hereinafter, examples of embodiments of the present invention will be described with reference to the drawings. The same reference numerals are given to the same configurations throughout the drawings, and redundant explanations are omitted.
<First embodiment>
In the first embodiment, the information processing device is mounted on a head mounted display (HMD; hereinafter referred to as HMD) that is worn on the head of the subject 10 and displays real space information and virtual space information for viewing.
 図1は、第1実施形態に係る情報処理装置の外観を示す図である。 FIG. 1 is a diagram showing the appearance of the information processing apparatus according to the first embodiment.
 タッピング運動を計測する被検体10は、HMD100を頭部に装着する。被検体10はHMD100のユーザでもある。 The subject 10 whose tapping motion is to be measured wears the HMD 100 on its head. The subject 10 is also the user of the HMD 100 .
In FIG. 1, the HMD 100 includes, as a finger tap distance sensor for measuring the distance between two fingers during a two-finger tapping movement (opening/closing movement), a distance measuring sensor 101 that measures and detects the distance and angle to an object. Conventionally, the HMD 100 is provided with the distance measuring sensor 101 for the purpose of detecting gesture operations. In this embodiment, the distance measuring sensor 101 is also used as the finger tap distance sensor for detecting the tapping movement.
The distance measuring sensor 101 is a sensor that measures the distance and angle to an object and can capture the shape of the object three-dimensionally. Examples include LiDAR (Light Detection and Ranging), which irradiates an object with laser light such as infrared light and measures the scattered light that returns, thereby analyzing and detecting the distance to a distant object and its state; a TOF (Time Of Flight) sensor, which measures distance by measuring, for each pixel, the reflection time of pulsed light irradiated onto the subject; and a millimeter-wave radar, which emits millimeter-wave radio waves and receives the reflected waves to detect the distance to and the state of the reflecting object.
 測距センサ101の検出範囲内においては、手や指の形状や骨格の認識とそれらの動き、また奥行きの位置関係も検出することができる。ここで、測距センサ101は、実線108で示すようにタッピング運動を行っている左手102の人差し指103の先端及び親指104の先端までの距離や角度を各々検出する。よって、HMD100では、タッピング運動中の人差し指103の先端と親指104の先端との二指間距離を時間経過とともに計測することができる。奥行きの距離も計測できるため、手指が測距センサ101に対して斜めの状態になっていても、二指間の距離を正確に計測可能である。 Within the detection range of the distance measuring sensor 101, it is possible to recognize the shapes and skeletons of the hands and fingers, their movements, and also detect the positional relationship in depth. Here, the distance measurement sensor 101 detects the distance and angle to the tip of the index finger 103 and the tip of the thumb 104 of the left hand 102 performing the tapping motion as indicated by the solid line 108 . Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 103 and the tip of the thumb 104 during the tapping exercise over time. Since the depth distance can also be measured, the distance between two fingers can be accurately measured even if the fingers are oblique to the distance measuring sensor 101 .
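As a purely illustrative sketch, not part of the patent disclosure, the inter-finger distance could be computed from three-dimensional fingertip coordinates of the kind a ranging sensor such as the distance measuring sensor 101 can provide; the function name and coordinate values below are assumptions.

```python
# Illustrative sketch only: compute the inter-finger distance from 3D fingertip
# coordinates (x, y, z) such as those a ranging sensor could provide.
import math

def inter_finger_distance(index_tip, thumb_tip):
    """Euclidean distance between two fingertip points given in metres.
    Because depth (z) is included, the value stays correct even when the hand
    is tilted with respect to the sensor."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(index_tip, thumb_tip)))

# Hypothetical single measurement frame.
d = inter_finger_distance((0.10, 0.02, 0.35), (0.12, -0.01, 0.36))
print(f"inter-finger distance: {d * 100:.1f} cm")
```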
 同様に、測距センサ101は、実線109で示すようにタッピング運動を行っている右手105の人差し指106の先端及び親指107の先端までの距離や角度を各々検出する。よって、HMD100では、タッピング運動中の人差し指106の先端と親指107の先端との二指間距離を時間経過とともに計測することができる。 Similarly, the distance measuring sensor 101 detects the distance and angle to the tip of the index finger 106 and the tip of the thumb 107 of the right hand 105 performing the tapping motion as indicated by the solid line 109 . Therefore, the HMD 100 can measure the inter-finger distance between the tip of the index finger 106 and the tip of the thumb 107 during the tapping exercise over time.
 また、HMD100は、前方周囲の外界を撮影するアウトカメラ110を具備する。アウトカメラ110は、レンズから入射した光を撮像素子で電気信号に変換して撮影画像を取得する。アウトカメラ110でタッピング運動を行っている左指、右指を撮影し、撮影画像(アウトカメラ画像)から二指間距離を算出して計測してもよい。 The HMD 100 also includes an out-camera 110 that captures the surrounding external world in front. The out-camera 110 acquires a photographed image by converting the light incident from the lens into an electric signal with an imaging element. The left and right fingers performing the tapping exercise may be photographed with the out-camera 110, and the distance between the two fingers may be calculated and measured from the photographed image (out-camera image).
 HMD100のプロセッサ802(図8参照)は、アウトカメラ画像から指の形状を検出し、左右のどの指であるかを認識する。それらの指先がアウトカメラ画像のどの画素位置にあるかを求め、時間の経過により、二つの指先の位置がどのように変化したかを検出することができる。 The processor 802 (see FIG. 8) of the HMD 100 detects the shape of the finger from the out-camera image and recognizes which finger is left or right. It is possible to obtain the pixel positions of those fingertips in the out-camera image, and detect how the positions of the two fingertips have changed over time.
The processor 802 can also estimate the distance to the fingertips from how large or small the fingertip tips appear, and can estimate the inclination of the two fingers with respect to the out-camera 110 from how the inner and outer sides of the two fingers appear. Recently, it has also become possible to measure the distance between two fingers by accumulating, in advance, data measuring the relationship between finger inclination and distance, using machine learning or the like.
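As a hedged illustration of the per-frame bookkeeping described above (not the patent's implementation), the fingertip pixel positions detected in each out-camera frame could be recorded and turned into a time series of pixel distances; the detector and field names are assumptions.

```python
# Illustrative sketch: track detected fingertip pixel positions frame by frame and
# build a time series of their pixel distance. The fingertip detector is assumed.
from dataclasses import dataclass

@dataclass
class FrameSample:
    t: float         # timestamp in seconds
    index_px: tuple  # (u, v) pixel position of the index fingertip
    thumb_px: tuple  # (u, v) pixel position of the thumb tip

def pixel_distance_series(samples):
    """Return a list of (timestamp, pixel distance) pairs for the detected frames."""
    series = []
    for s in samples:
        du = s.index_px[0] - s.thumb_px[0]
        dv = s.index_px[1] - s.thumb_px[1]
        series.append((s.t, (du * du + dv * dv) ** 0.5))
    return series
```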
 こうして、HMD100を用いて、使い勝手よくタッピング運動中の二指間距離を計測検出できる。 Thus, the HMD 100 can be used to conveniently measure and detect the distance between two fingers during the tapping exercise.
 また、プロセッサ802は、計測検出された情報をもとに、HMD100内の演算機能により被検体10の脳機能評価につながる特徴量を生体情報として解析抽出することが可能となる。 In addition, the processor 802 can analyze and extract the feature amount leading to brain function evaluation of the subject 10 as biometric information by the arithmetic function in the HMD 100 based on the measured and detected information.
 これにより、アルツハイマー型、脳血管性、レビー小体型等の認知症や、パーキンソン病、発達性協調運動障害(スキップや縄跳びができない等)等の病気の予兆、前兆の手がかりを得ることができる。 As a result, it is possible to obtain signs and clues about dementia such as Alzheimer's disease, cerebrovascular disease, Lewy body disease, Parkinson's disease, developmental coordination disorder (inability to skip or jump rope, etc.).
Furthermore, if, based on the analyzed and extracted feature amounts, the fine motor function of the fingers can be quantified and detected as a "measure" indicating the state of brain health, it can be used in training and rehabilitation menus for improving brain function.
 例えば、音に合わせて指先の開閉を促してその巧緻性を評価するようなトレーニング・リハビリメニューが考えられる。 For example, a training/rehabilitation menu that encourages the fingertips to open and close in time with the sound and evaluates their dexterity is conceivable.
In the above example, the case where the tapping distance between two fingers, the index finger and the thumb, is measured over time has been described, but the tapping movement of another pair of fingers, for example the middle finger and the thumb, may be used, and the tapping movement can also be measured when multiple fingers are opened and closed; it goes without saying that the same actions and effects are obtained. In the following description as well, the case where the tapping movement is performed with the index finger is mainly described, but the same applies to fingers other than the index finger.
 また、HMD100は左目及び右目のそれぞれの動きや向きを捉えて眼球運動や視線を検出する左目視線センサ111、右目視線センサ112を具備する。左目視線センサ111、右目視線センサ112は、瞳孔や眼球の動き、まばたき等を計測する。 The HMD 100 also includes a left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 that detect eye movement and line-of-sight by capturing the movements and orientations of the left and right eyes, respectively. A left-eye line-of-sight sensor 111 and a right-eye line-of-sight sensor 112 measure movements of the pupil and eyeball, blinking, and the like.
The left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 detect the movement and orientation of the left and right eyes, respectively, making it possible to capture eye movement and the line of sight. The processing for detecting eye movement may use well-known techniques commonly employed for eye tracking. For example, in the method using corneal reflection, an infrared LED (Light Emitting Diode) illuminates the face and an infrared camera captures the eye; the position on the cornea of the light reflected from the infrared LED (the corneal reflection) is used as a reference point, and the eye movement and line of sight are detected based on the position of the pupil relative to the position of the corneal reflection. As another method of measuring and detecting eye movement, a method is also known in which the eye is imaged with a visible-light camera and the line of sight is detected based on the positions of the pupil and iris.
In line-of-sight input processing using the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 as input devices of the HMD 100, the processor 802 detects the gaze point at the end of the line of sight. A person's gaze direction moves quickly, and if the fine movements of the eyeballs are followed, the gaze point is difficult to settle; therefore, in line-of-sight input processing, the responsiveness of the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 is slowed so that they do not follow the fine movements of the eyeballs.
 これに対して、本実施形態でのタッピング運動中の視点の動き検出では、被検体10の右目、左目の動きをより忠実に(空間解像度をより高く)検出したいという要望がある。 On the other hand, in the motion detection of the viewpoint during the tapping motion in this embodiment, there is a demand to detect the motion of the right eye and left eye of the subject 10 more faithfully (higher spatial resolution).
Therefore, for the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, the detection cycle is shortened and the range of detectable movement is reduced so that minute pupil and eyeball movements can be measured at high speed. Specifically, in line-of-sight input processing the processor 802 applies a smoothing filter along the time axis to the sensor information from the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 to detect the gaze point. In contrast, when detecting eye movement during the tapping exercise, the processor 802 uses the sensor information as it is without applying the smoothing filter, or switches to a filter with a higher temporal resolution than the smoothing filter used for line-of-sight input processing, so that minute movements of the line of sight can be detected.
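A minimal sketch of this filter switching is given below; it is illustrative only, and the window lengths and mode names are assumptions rather than values from the patent.

```python
# Illustrative sketch of switching between a smoothed gaze signal (for pointing input)
# and a raw or lightly filtered signal (for measurement during the tapping exercise).
def moving_average(samples, window):
    """Simple moving average over the last `window` samples; window=1 keeps the raw data."""
    smoothed = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed

def gaze_series(raw_samples, mode):
    # "pointing": heavy smoothing for a stable gaze cursor.
    # "measurement": no smoothing, so small and fast eye movements are preserved.
    window = 8 if mode == "pointing" else 1
    return moving_average(raw_samples, window)
```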
A person with a disease such as dementia may show a gaze that wanders in small, random movements, so that the gaze point does not settle. Therefore, with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, which detect minute pupil and eyeball movements at high speed, involuntary wandering movements of the eyeballs and pupils that occur regardless of the subject's intention, rapid blinking, and the like can be detected.
Also, based on the detected information, feature amounts leading to an evaluation of the subject's abnormal state can be analyzed and extracted as biological information by the arithmetic function within the HMD 100. This opens the possibility of early detection of dementia such as Alzheimer's, cerebrovascular, and Lewy body dementia, Parkinson's disease, and the like, of detecting signs, precursors, and symptoms of diseases such as dementia, and, based on the analyzed and extracted feature amounts, of discriminating signs of impaired eye movement function.
The measurement during the two-finger tapping movement and the detection of pupil and eyeball movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 are performed together at the same time. Measuring only the inter-finger distance, or detecting only the pupil and eyeball movements with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, has limits and variability in the detection accuracy of biological information indicating signs, precursors, and symptoms of diseases such as dementia. By measuring both simultaneously, biological information indicating signs, precursors, and symptoms of disease can be detected with higher accuracy.
Before measuring the inter-finger distance during the tapping exercise, the HMD 100 displays on the display 113 a screen 121 on which it can be confirmed that both hands' fingers are fully in view; during the tapping exercise, however, the display is switched to an image other than the view of the fingers performing the tapping movement. For example, during tapping measurement, a relaxation image 122 for relaxing the subject 10, an operation guide image 123 for finger tap distance measurement, and the like are displayed.
This avoids the visual feedback that would be given to the subject 10 by watching the very finger tap movement being measured, suppresses the influence of that feedback on the measurement, and enables more accurate measurement of the tapping movement.
 更に、リラックス画像122の表示により、被検体10は緊張がほぐれた状態でタッピング運動の計測を行うことができる。 Furthermore, by displaying the relaxation image 122, the subject 10 can measure the tapping motion in a relaxed state.
 また、指タップ距離計測の操作ガイド画像123の表示により、正しいタッピング運動の操作案内やタッピング運動計測中の時間経過を見ながら、落ち着いた状態でタッピング運動の計測を行うことができる。 In addition, by displaying the operation guide image 123 for finger tapping distance measurement, it is possible to measure the tapping motion in a calm state while watching the correct operation guidance for the tapping motion and the passage of time during the tapping motion measurement.
 図2は、測距センサ101によるタッピング運動の測定結果、及び左目視線センサ111、右目視線センサ112による眼球運動の測定結果を時間経過とともに示した図である。図2では、タッピング運動の特徴量を示す項目として二指間距離、速度、加速度を例に挙げ、これらの時間推移を示す。左目視線センサ111、右目視線センサ112による眼球運動の測定結果を含む情報は、頭頚部状態変化量情報に相当する。 FIG. 2 is a diagram showing the results of tapping movement measurement by the distance measuring sensor 101 and the measurement results of eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 over time. In FIG. 2, the distance between two fingers, speed, and acceleration are taken as examples of the items indicating the feature amount of the tapping motion, and the transition of these items over time is shown. The information including the measurement results of the eye movement by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 corresponds to head-and-neck state variation information.
 二指間距離は、測距センサ101でタッピング運動を行っている二指の位置を検出して、タッピング運動中の二指の間隔距離を計測したものである。また速度、加速度は、タッピング運動により時間的に変位している算出された二指間距離情報をもとに、算出解析されたものである。 The distance between the two fingers is obtained by detecting the positions of the two fingers performing the tapping motion with the distance measuring sensor 101 and measuring the distance between the two fingers during the tapping motion. Velocity and acceleration are calculated and analyzed based on information on the distance between two fingers that is temporally displaced by the tapping motion.
 例えば、時刻t0では、二指間距離は最大となり、速度はゼロ、加速度は負の最大となる。タッピング運動中の二指間距離情報をもとに、距離、速度、加速度、タップインターバル、位相差などの各項目で指の動きの特徴を示す特徴量が解析抽出される。 For example, at time t0, the distance between two fingers is maximum, the velocity is zero, and the acceleration is the maximum negative value. Based on the information on the distance between two fingers during the tapping motion, feature amounts indicating features of finger movement are analyzed and extracted for each item such as distance, speed, acceleration, tap interval, and phase difference.
 プロセッサ802は、二指間距離の時間推移から、指の最大振れ幅や総移動距離などを算出し、指の動きがどのように変化したかを評価する。また、速度の時間推移から、最大速度、開く速度、閉じる速度などを算出し、指がどれだけ速く安定的に動いたかを評価する。更に加速度の時間推移から、加速度の最大振幅、開き始める勢い、開き終える勢い、閉じ終える勢い、閉じ始める勢い、指接触時間などを算出し、指の動きの勢いを評価する。 The processor 802 calculates the maximum swing width of the finger, the total moving distance, etc. from the time transition of the distance between the two fingers, and evaluates how the finger movement has changed. In addition, the maximum speed, opening speed, closing speed, etc. are calculated from the change in speed over time, and how quickly and stably the finger moves is evaluated. Furthermore, the maximum amplitude of the acceleration, the force at which the finger starts to open, the force at which it finishes opening, the force at which it finishes closing, the force at which it starts to close, the finger contact time, etc. are calculated from the transition of the acceleration over time, and the force of finger movement is evaluated.
 図2で図示していないが、特徴量のタップインターバルの項目では、プロセッサ802は、タップ回数、タップ周期、タップ頻度などを算出し、タッピングのタイミングを評価する。 Although not shown in FIG. 2, the processor 802 calculates the number of taps, the tap period, the tap frequency, and the like for the tap interval item of the feature amount, and evaluates the timing of tapping.
For the phase-difference item of the feature amounts, the processor 802 calculates the timing deviation between the two hands, the similarity between the two hands, the balance of alternating taps, and the like, and evaluates the coordination between the hands. By analyzing these, the extracted feature amounts can serve as clues to the state of the brain that lead to an evaluation of the brain function of the subject 10, and can be used as one means of testing for the early detection of dementia such as Alzheimer's, cerebrovascular, and Lewy body dementia, Parkinson's disease, developmental coordination disorder, and the like.
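As a hedged, purely illustrative sketch of deriving such feature amounts from a sampled inter-finger distance series (the closure threshold and feature names below are assumptions, not values from the patent):

```python
# Illustrative sketch: derive simple tapping feature amounts from a sampled
# inter-finger distance series. Thresholds and feature names are assumptions.
def tapping_features(t, d):
    """t: timestamps in seconds; d: inter-finger distances in millimetres."""
    v = [(d[i + 1] - d[i]) / (t[i + 1] - t[i]) for i in range(len(d) - 1)]  # velocity
    a = [(v[i + 1] - v[i]) / (t[i + 1] - t[i]) for i in range(len(v) - 1)]  # acceleration
    taps = sum(1 for i in range(1, len(d)) if d[i - 1] > 2.0 >= d[i])       # closures below 2 mm
    return {
        "max_amplitude_mm": max(d) - min(d),
        "total_travel_mm": sum(abs(d[i + 1] - d[i]) for i in range(len(d) - 1)),
        "max_opening_speed": max(v, default=0.0),
        "max_closing_speed": -min(v, default=0.0),
        "max_acceleration": max((abs(x) for x in a), default=0.0),
        "tap_count": taps,
    }
```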
 また、図2において、眼球運動測定結果の移動距離は、左目視線センサ111、右目視線センサ112で眼球の位置を検出して、タッピング運動中に眼球が時間とともに移動した距離を計測したものである。移動距離は、ある視線方向を初期位置として、そこから眼球がどの方向にどのくらいの距離を移動したかを検出する。図2では、左右方向の移動距離を示しているが、上下方向の距離、初期位置からの距離ベクトルで表してもよい。特徴が顕著に表れる指標を表示するようにしてもよい。また、両眼の平均値を示したり、左右で、顕著に特徴が表れている一方を示してもよい。もちろん、両眼とも表示しても構わない。 In FIG. 2, the movement distance of the eyeball movement measurement result is obtained by detecting the position of the eyeball with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and measuring the distance the eyeball moved over time during the tapping motion. . The movement distance is detected by determining in which direction and how much distance the eyeball has moved from a certain line-of-sight direction as an initial position. Although FIG. 2 shows the movement distance in the horizontal direction, it may be represented by a distance in the vertical direction or a distance vector from the initial position. It is also possible to display an index in which the characteristic appears conspicuously. In addition, the average value of both eyes may be indicated, or one of the left and right eyes, in which the characteristic is prominently displayed, may be indicated. Of course, it may be displayed for both eyes.
Here, a detection sensor used for eye tracking normally behaves as follows: in a pointing operation by the line of sight, the line of sight moves quickly, so if even minute movements of the line of sight are followed, the pointing position also moves quickly and stable pointing becomes impossible. For this reason, during an operation such as pointing, the sensor is controlled so as to follow only large movements of the line of sight and not fine movements. In measuring eye movement, by contrast, the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112 measure the fast, minute movements of both eyes, so the detection parameters are changed from those used for pointing by eye tracking. For example, the frame rate of the eye-measurement camera is increased, the number of frames averaged for detection is reduced, or detection is performed for every frame. When the eye movement is minute, the detection range is matched to the magnitude of the movement. Smoothing of the waveform when it is displayed as a graph is also eliminated as far as possible, so that the movement can be displayed as it is.
In this way, eye movement can be captured and displayed not only when the eyeball wanders randomly, as shown by the dotted frame 201 of the movement distance in the eye movement measurement results of FIG. 2, but also when the eyeball moves in small, tremor-like vibrations, as shown by the dotted frame 202.
These eye movements can be symptoms that tend to occur in people with particular diseases. Therefore, based on the tapping movement described above and the detected eye movement information, the arithmetic function in the information processing apparatus extracts feature amounts leading to an evaluation of the subject's abnormal state. If the result of the analysis points to a sign, precursor, or symptom of disease, the tendency can be detected early by a simple method. This may make it possible to detect dementia such as Alzheimer's, cerebrovascular, and Lewy body dementia, Parkinson's disease, and furthermore disorders of eye movement function, at an early stage.
<Second embodiment>
In the second embodiment, the information processing device is implemented on a widely used, portable information terminal such as a smartphone or tablet that has a function of photographing the outside world in front of it and the subject, and a function of displaying information.
 図3は、第2実施形態に係る情報処理装置の外観を示す図である。 FIG. 3 is a diagram showing the appearance of an information processing apparatus according to the second embodiment.
 図3では、携帯情報端末の具体例としてスマートフォン300を例にとり説明する。スマートフォン300は、前方周囲の外界を撮影するアウトカメラ301や、対象物までの距離や角度を計測検出する測距センサ303を、スマートフォン300の筐体305の背面に設ける。一方、図3では図示を省略するが、筐体305の正面には、表示画面を有するディスプレイ908(図9参照)が設けられる。 In FIG. 3, a smart phone 300 will be described as a specific example of a portable information terminal. The smartphone 300 is provided with an out-camera 301 that captures an image of the outside world in front and a ranging sensor 303 that measures and detects the distance and angle to an object on the back of a housing 305 of the smartphone 300 . On the other hand, although not shown in FIG. 3, a display 908 (see FIG. 9) having a display screen is provided on the front of the housing 305 .
 スマートフォン300はスタンド304に支持される。 The smartphone 300 is supported by a stand 304.
 被検体10は、アウトカメラ301の画角に計測する指が入るようにスマートフォン300と手の位置を決める。 The subject 10 positions the smartphone 300 and the hand so that the finger to be measured is within the angle of view of the out-camera 301 .
Then, the left hand 102 and the right hand 105 performing the tapping movement as indicated by the solid line 108 are photographed by the out-camera 301, and the inter-finger distance between the index finger 103 and the thumb 104 and the inter-finger distance between the index finger 106 and the thumb 107 are calculated from the out-camera image.
 アウトカメラ301だけでは、2次元でしか指の動きを捉えられず例えば手がアウトカメラ301に対して斜めになったときなどに正確に測れない場合がある。この場合、測距センサ303を用いることで奥行きまで検出でき、タッピング動作中の二指間距離を精度よく計測することが可能になる。二指間距離計測に際しては、検出精度とコストのバランスを考慮し、アウトカメラ301或いは測距センサ303のみを用いてもよいし、アウトカメラ301に測距センサ303を併用してもよい。 With the out-camera 301 alone, finger movements can only be captured in two dimensions, and for example, when the hand is tilted with respect to the out-camera 301, it may not be possible to measure accurately. In this case, the depth can be detected by using the distance measuring sensor 303, and the distance between the two fingers during the tapping operation can be accurately measured. When measuring the distance between two fingers, only the out-camera 301 or the distance measuring sensor 303 may be used, or the out-camera 301 and the distance measuring sensor 303 may be used together in consideration of the balance between detection accuracy and cost.
 このように、特別な器具を装着することなく広く普及している情報処理装置(携帯情報端末)を用いて、簡便な形で使い勝手よくタッピング運動中の二指間距離を計測検出できる。また、計測された情報をもとに、情報処理装置内の演算機能により特徴量を抽出することができ、第1実施形態で説明したと同様の効果が得られる。 In this way, it is possible to measure and detect the distance between two fingers during a tapping exercise in a simple and convenient manner using a widely spread information processing device (portable information terminal) without wearing any special equipment. Moreover, based on the measured information, the feature amount can be extracted by the arithmetic function in the information processing apparatus, and the same effect as described in the first embodiment can be obtained.
The smartphone 300 also has an in-camera 302 on the front of the housing 305. The in-camera 302 photographs head and neck regions of the subject 10 such as the eyes, mouth, cheeks, temples, and neck. From the in-camera image, the processor 920 measures the state of the whole face, the movement of the eyeballs and pupils and blinking, and the state of head and neck regions such as the mouth, cheeks, temples, and neck. With the in-camera 302, not only the abnormal eyeball and pupil movements described in the first embodiment but also spasms of the muscles of the whole face, mouth, cheeks, and temples, and the blood flow in the cheeks, temples, neck, and the like can be observed. From the eye image captured by the in-camera 302, the movement of the eyeball and pupil can be detected based on the movement of the pupil and the area of the white of the eye. The opening and closing of the pupil and the iris can also be detected. The processor 920 can extract feature amounts from the movements of the eyeballs and pupils.
Also, from an image captured by the in-camera 302 of part of the face of the subject 10, for example the mouth, cheeks, or temples (this may be a face image capturing the entire face of the subject 10, or a partial image of a head and neck image capturing the subject from the neck up), spasms and twitches of the muscles of the mouth, cheeks, temples, and the like can be captured and obtained as biological information indicating signs, precursors, and symptoms of disease.
Also, from images of the cheeks, temples, neck, and the like of the subject 10 captured by the in-camera 302, changes in the brightness and color of the cheeks, temples, and neck can be detected, and the pulse wave produced by the blood flow in the cheeks, temples, and neck can be captured and obtained as biological information indicating signs, precursors, and symptoms of disease. Specifically, this focuses on the property of hemoglobin in the blood of absorbing green light, and detects the pulse, i.e. the pulsation of the blood vessels, by capturing the luminance change of the face surface that is thought to arise from the blood flow.
More specifically, first, the average value of each color component (red, green, and blue) of the face region is obtained from the in-camera image for each acquired frame. Then, noise common to the three color components is removed, and a luminance waveform is extracted from the green component. By calculating the pulse rate from the peak values of this luminance waveform, the pulse wave can be measured. Biological information may also be obtained by measuring other state changes in head and neck regions such as the eyes, mouth, cheeks, temples, and neck, or by measuring state changes in regions other than the head and neck.
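The following is a minimal, hedged sketch of those pulse-wave extraction steps, assuming a sequence of face-region frames is available as RGB arrays; the peak-counting estimate and all names are illustrative, not the patent's implementation.

```python
# Illustrative sketch of the green-channel pulse-wave extraction described above.
import numpy as np

def pulse_rate_from_frames(frames, fps):
    """frames: list of HxWx3 uint8 face-region images; fps: capture frame rate.
    Returns a rough pulse-rate estimate in beats per minute."""
    means = np.array([f.reshape(-1, 3).mean(axis=0) for f in frames])  # per-frame R, G, B means
    common = means.mean(axis=1, keepdims=True)                         # noise shared by R, G, B
    green = means[:, 1] - common[:, 0]                                 # green luminance waveform
    green = green - green.mean()
    # Count upward zero crossings as waveform peaks, roughly one per heartbeat.
    beats = int(np.sum((green[:-1] < 0) & (green[1:] >= 0)))
    duration_minutes = len(frames) / fps / 60.0
    return beats / duration_minutes
```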
Furthermore, in this embodiment, the detection of the inter-finger distance during the two-finger tapping movement by the out-camera 301 or the distance measuring sensor 303 serving as the finger tap distance sensor, and the detection of state changes by the in-camera 302 serving as the head and neck region state change sensor, which photographs mainly the head and neck regions other than the fingers, are performed together at the same time.
Measuring only the inter-finger distance with the out-camera 301 and the distance measuring sensor 303, or detecting only the state changes of the head and neck regions by photographing with the in-camera 302, has limits in the detection accuracy of biological information indicating signs, precursors, and symptoms of disease; by simultaneously using both means, i.e. the out-camera 301 and distance measuring sensor 303 together with the in-camera 302, biological information indicating signs, precursors, and symptoms of disease can be detected with higher accuracy.
That is, as in the first embodiment, by measuring other phenomena at parts of the body other than the fingers, and not only at the fingers, the accuracy of identifying and discriminating signs, precursors, and symptoms of disease can be increased. In this way, an information processing device can be realized that, without any special instrument being worn, detects biological information indicating signs, precursors, and symptoms of disease in a simple, convenient form with high precision and high accuracy.
FIG. 4 is an example of a diagram showing, over time, the measurement results of the tapping movement by the out-camera 301 and the distance measuring sensor 303, the measurement results of the eye movement by the in-camera 302, the measurement results of the cheek muscle movement by the in-camera 302, and the measurement results of the luminance change of the face surface by the in-camera 302.
In FIG. 4, the description of the tapping-movement measurement results and the eye-movement measurement results is the same as for FIG. 2 and is therefore omitted. In the measurement results for cheek muscle movement in FIG. 4, as illustrated by the dotted frames 401 and 402, the detected small movement fluctuations allow cheek spasms and twitches to be captured as biological information.
The measurement results for the luminance change of the face surface show the luminance waveform extracted from the green component; the pulse rate is calculated from the peak values of this luminance waveform (for example, points 403 and 404), and the pulse wave can thus be captured as biological information.
 図5は、タッピング運動時のスマートフォン300の表示画面501に表示される画像を示した図の例である。 FIG. 5 is an example of a diagram showing an image displayed on the display screen 501 of the smartphone 300 during tapping exercise.
Display screen 501a is the display image before finger tapping measurement, and display screens 501b and 501c are screens displayed on the display 908 (see FIG. 10) of the smartphone 300 while the inter-finger distance is being measured during the tapping exercise.
 表示画面501aは、アウトカメラ301で撮影された両指全体の姿(指全体風景)と、インカメラ302で撮影された被検体の顔を主体とした頭頚部画像505とが表示される。両指全体の姿表示により、計測前に、アウトカメラ301で計測するタッピング運動時の二指がアウトカメラ301の撮影範囲内に入っていることを確認することができる。また、頭頚部画像505の表示により、二指間距離計測前に、被検体10の顔を主体とした頭頚部がインカメラ302の撮影範囲内に入っていることを確認することができる。 The display screen 501a displays the appearance of the entire fingers (whole finger landscape) photographed by the out-camera 301 and a head and neck image 505 mainly of the subject's face photographed by the in-camera 302 . By displaying the appearance of both fingers as a whole, it can be confirmed that the two fingers during the tapping exercise to be measured by the out-camera 301 are within the photographing range of the out-camera 301 before measurement. Also, by displaying the head-and-neck image 505, it can be confirmed that the head-and-neck region, mainly including the face of the subject 10, is within the photographing range of the in-camera 302 before measuring the distance between two fingers.
 また、タッピング運動の計測前の状態から、頭頚部を撮影することができる。これは、タッピング運動の計測中は、注意力が手指に集中するため、集中状態でない頭頚部の状態を記録することで生体情報の補完となる場合があるからである。 In addition, the head and neck can be photographed from the state before the tapping movement is measured. This is because the attention is focused on the fingers during the measurement of the tapping motion, so that recording the state of the head and neck region, which is not in the state of concentration, may complement the biological information.
On the other hand, while the inter-finger distance is being measured during the tapping exercise, the display 908 (see FIG. 10) shows an image other than the view of the fingers performing the tapping movement. This suppresses the influence on the subject 10 of the visual feedback that would arise from watching the very finger tap movement being measured, and enables more accurate measurement of the tapping movement.
As one example of such a different image, the display screen 501b displays an operation guide image 502 for measuring the inter-finger distance during the tapping exercise and a head and neck image 503, mainly of the face of the subject 10, captured by the in-camera 302. Displaying the operation guide image 502 allows the tapping movement to be measured while viewing guidance for the correct tapping movement and the progress of the tapping measurement.
 また、頭頚部画像503の表示により、眼、口元、頬、こめかみ、首などの頭頚部位504がインカメラ302で支障なく撮影されていることを視認することができる。 In addition, by displaying the head and neck image 503, it is possible to visually confirm that the head and neck region 504 such as the eyes, mouth, cheeks, temples, and neck are being photographed by the in-camera 302 without any trouble.
 また、他例として表示画面501cには、被検体10をリラックスさせるためのリラックス画像が表示される。これにより、被検体10は緊張がほぐれた状態でタッピング運動の計測を行うことができる。 As another example, a relaxation image for relaxing the subject 10 is displayed on the display screen 501c. As a result, the subject 10 can measure the tapping motion in a relaxed state.
<Third Embodiment>
In the third embodiment, a first information processing device that simultaneously measures the inter-finger distance during the tapping movement and detects eye movement is linked with other devices that measure and detect other biological information (referred to as "biological information sensors"), and the biological information measured and detected by the first information processing device and the other devices is displayed on a second information processing device with the time stamps synchronized.
 図6A、図6Bは、第3実施形態に係る情報処理システムの外観を示す図である。 6A and 6B are diagrams showing the appearance of an information processing system according to the third embodiment.
The information processing system 1 shown in FIG. 6A includes the HMD 100 as the first information processing device, a smartphone 600 as the second information processing device, and, as biological information sensors, a heart rate sensor 601 worn on the chest of the subject 10, an electroencephalograph 602 worn on the head of the subject 10, a smartwatch 603 and a blood sugar level sensor 604 worn on the forearm of the subject 10, a blood pressure monitor 605 worn on the upper arm of the subject 10, and an audiometer 606 worn on the ears of the subject 10; the smartphone 300, the HMD 100, and each biological information sensor are connected for communication. Short-range wireless communication such as Bluetooth (registered trademark) may be used for the communication.
 HMD100は、アウトカメラ110もしくは測距センサ101でタッピング運動時の二指間距離計測を行い、左目視線センサ111及び右目視線センサ112で眼球の動きを同時に検出する。 The HMD 100 uses the out-camera 110 or the distance sensor 101 to measure the distance between two fingers during the tapping motion, and the left eye line sensor 111 and the right eye line sensor 112 detect eyeball movements at the same time.
 スマートフォン600は、HMD100及び各生体情報センサで計測検出された各生体情報を表示する。 The smartphone 600 displays each biological information measured and detected by the HMD 100 and each biological information sensor.
 心拍センサ601は、心臓の収縮運動を電気的に測定するもので心電図として図表化される。 The heart rate sensor 601 electrically measures the contraction motion of the heart and is charted as an electrocardiogram.
The electroencephalograph 602 measures and records, from electrodes attached to the head of the subject 10, the electrical activity of the minute currents that the brain continuously produces in the course of its activity, and makes it possible to examine the activity state of the brain and the regions that are active.
The smartwatch 603 is a wristwatch-type wearable terminal equipped with a touch screen and a CPU, and has a function of measuring the heart rate, a function of measuring the state of stress from the action potentials of the skin, and a function of displaying an electrocardiogram.
 また、腕時計タイプの血糖値センサ604は、数μmの波長帯の中赤外レーザを使用して血液中の血糖値を測定する。 In addition, the wristwatch-type blood sugar level sensor 604 uses a mid-infrared laser with a wavelength band of several μm to measure the blood sugar level in the blood.
The blood pressure monitor 605 measures the pressure (blood pressure) with which the flow in the brachial artery pushes the blood vessel from the inside, using a bag-like belt called a cuff wrapped around the upper arm, and allows the health condition and mental state of the subject 10 to be grasped.
 オージオメータ606は、125Hzから8000Hzの間の7つの周波数の音に対し強弱を付けて発し、被検体10がヘッドフォンを通して聞き取りして聴力の測定を行うものである。 The audiometer 606 emits sounds of seven frequencies between 125 Hz and 8000 Hz with strength and weakness, and the subject 10 listens through headphones to measure the hearing ability.
 上記各生体情報センサで計測検出される各種の生体情報、及びHMD100で計測検出される指や頭頚部位の生体情報は、時間情報(タイムスタンプ)を合わせて計測検出される。 The various biological information measured and detected by each of the biological information sensors and the biological information of the fingers and head and neck regions measured and detected by the HMD 100 are measured and detected together with time information (time stamp).
 また、各生体情報センサで計測検出された各種の生体情報、及びHMD100で計測検出された手指や頭頚部位の生体情報は、主に無線通信を介して、スマートフォン600に送信される。 Also, various biological information measured and detected by each biological information sensor and biological information of fingers and head and neck regions measured and detected by the HMD 100 are transmitted to the smartphone 600 mainly via wireless communication.
 スマートフォン600では、各生体情報センサ及びHMD100のタイミングを合わせて計測された各種の生体情報を一つの画面上に表示する。即ち、タイムスタンプを同期させた各種の生体情報を一つの画面上に表示する。 The smart phone 600 displays on one screen various biometric information measured by matching the timing of each biometric information sensor and the HMD 100 . That is, various kinds of biological information with synchronized time stamps are displayed on one screen.
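As a purely illustrative sketch of aligning the time-stamped records from the HMD and the other biological information sensors onto one time axis before such single-screen display (the sensor names, tolerance, and data layout below are assumptions, not the patent's implementation):

```python
# Illustrative sketch: group time-stamped samples from several sensors onto a
# shared time axis so they can be displayed together on one screen.
def merge_on_time_axis(streams, tolerance=0.02):
    """streams: dict of sensor name -> list of (timestamp, value) pairs, where the
    timestamps are already synchronized between devices. Samples whose time stamps
    fall within `tolerance` seconds of each other are grouped into one display row."""
    events = sorted((ts, name, value) for name, records in streams.items()
                    for ts, value in records)
    rows = []
    for ts, name, value in events:
        if rows and ts - rows[-1][0] <= tolerance:
            rows[-1][1][name] = value
        else:
            rows.append((ts, {name: value}))
    return rows

# Hypothetical example with two streams.
merged = merge_on_time_axis({
    "finger_tap_mm": [(0.00, 41.0), (0.05, 33.5)],
    "heart_rate_bpm": [(0.01, 72)],
})
```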
As a result, not only the biological information obtained by the HMD 100 through inter-finger distance measurement and eye movement detection, but also multiple items of biological information measured and detected by other devices in synchronization with the HMD 100's measurement, can be displayed together on the smartphone 600. This makes it possible to capture signs, precursors, and symptoms of disease accurately, from multiple perspectives, and with even higher detection accuracy.
 また、疾患への投薬によって種々の部位に生じる副作用の発見などにも寄与することができる可能性がある。 In addition, it may be possible to contribute to the discovery of side effects that occur in various parts of the body due to medication for diseases.
In the above, the smartphone 600 was shown as the second information processing device, but this is not essential; the measurement results may instead be displayed on the screen of the HMD 100 through cooperation between the HMD 100 and each biological information sensor.
The information processing system 1A shown in FIG. 6B uses the smartphone 300 as both the first information processing device and the second information processing device, uses the same biological information sensors as those shown in FIG. 6A, and is configured by communicatively connecting each of these biological information sensors to the smartphone 300.
The information processing system 1A of FIG. 6B differs from the configuration of the information processing system 1 shown in FIG. 6A only in that the first information processing device is changed from the HMD 100 to the smartphone 300, and the same effects are obtained through the same operation.
FIG. 7 is a diagram showing a display screen of the information processing device according to the third embodiment, and shows an example of a display screen in the case where the various items of biological information measured and detected in the embodiment shown in FIG. 6 are displayed on the smartphone 300 with their time stamps synchronized.
In FIG. 7, on the display screen 701 of the smartphone 600, the measurement results of finger tapping, eye movement, cheek movement, an electrocardiogram, and an electroencephalogram, all measured at the same timing, are displayed with the time axes of the measurement operations aligned. The measurement results of finger tapping, eye movement, and temple movement are the same as those described with reference to FIGS. 2 and 4, and their description is omitted.
The electrocardiogram measurement result is obtained by the heart rate sensor 601 electrically measuring the contraction of the heart; in the normal state the same waveform repeats periodically, and the condition of the heart can be grasped from the degree of change in the waveform.
From the electroencephalogram measurement result, a site where damage is suspected can be determined from the state of the waveform, making it possible to diagnose diseases involving sudden loss of consciousness or convulsions, as well as cognitive dysfunction. Therefore, by displaying on a single screen not only the measurement and detection results of the tapping motion but also the items of biological information detected at other sites, it becomes possible to detect signs of the onset of disease accurately on the same time axis.
Note that the second information processing device that displays on one screen the items of biological information measured and detected at the same timing by the HMD 100 or the smartphone 300 and the biological information sensors may be any information processing device having an information communication function and a display function; it is not limited to a smartphone and may be another device or appliance such as a personal computer or a monitor device.
The processing flow of the information processing system 1 will be described with reference to FIGS. 8 and 9. FIG. 8 is a flowchart showing the processing flow of the information processing system 1. FIG. 9 shows an example of processing for synchronizing the time between different devices.
In the following description, the first information processing device is, for example, the smartphone 300, and the biological information sensor may be the heart rate sensor 601.
As preparation for measurement, synchronization processing is performed between the first information processing device and the biological information sensor (S01). As shown in FIG. 9, in order to align its clock with that of the biological information sensor, the first information processing device transmits to the biological information sensor a synchronization signal (sig.1) containing the command issue time (TA0) of the first information processing device.
The biological information sensor transmits to the first information processing device a synchronization signal (sig.2) containing the time TB1 at which it received the synchronization signal (sig.1) from the first information processing device and the command issue time (TB2) of the biological information sensor.
From the difference between the time TA0 at which it transmitted the synchronization signal (sig.1) and the time TA3 at which it received the synchronization signal (sig.2), and from the difference between the time TB1 at which the biological information sensor received the synchronization signal (sig.1) and the time TB2 at which it transmitted the synchronization signal (sig.2), the first information processing device that has received the synchronization signal (sig.2) can determine the time required for communication between the first information processing device and the biological information sensor, as well as the time offset between the devices.
Similarly, by means of a synchronization signal (sig.3), the biological information sensor can also determine the time required for communication between the devices and the time offset between the devices. By determining the time offset ΔT between the devices and the time required for communication in this way, the measurement results of the respective devices can be displayed with their time axes aligned. However, if the first information processing device and the biological information sensor each incorporate a clock indicating a globally common time, such processing is unnecessary.
The first information processing device calculates the time offset ΔT.
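A minimal sketch of the offset calculation from the four time stamps TA0, TB1, TB2, and TA3 is shown below; it follows the usual two-way time-transfer (NTP-style) formulas under a symmetric-delay assumption, whereas the disclosure itself only states that ΔT and the communication time are derived from these values.

```python
# Estimate the inter-device clock offset dT and the one-way communication
# delay from the four exchanged time stamps (illustrative NTP-style math).

def estimate_offset_and_delay(ta0: float, tb1: float, tb2: float, ta3: float):
    round_trip = (ta3 - ta0) - (tb2 - tb1)      # time spent on the link, both ways
    delay = round_trip / 2.0                    # one-way delay, assumed symmetric
    offset = ((tb1 - ta0) + (tb2 - ta3)) / 2.0  # sensor clock minus device clock
    return offset, delay

dT, delay = estimate_offset_and_delay(ta0=10.000, tb1=10.120, tb2=10.125, ta3=10.010)
# A sample the sensor stamps with time TBn then maps to TBn - dT on the device's axis.
```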
The first information processing device simultaneously performs measurement of the distance between two fingers during the tapping motion and detection of the head and neck state change amount by the head and neck region state change sensor, while the biological information sensor detects the biological information (S02); the biological information sensor then transmits the biological information to the first information processing device (S03).
The first information processing device displays, on its display and aligned to the same time axis, the two-finger distance measurement result and the head and neck state change amount information that it detected at time TAn−ΔT, together with the biological information received from the biological information sensor at time TA (S04).
Measurement continues until an instruction to end the measurement is input to the first information processing device (S05: No, S02); when an instruction to end the measurement is given (S05: Yes), the processing ends.
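The measurement loop S02 to S05 could be organized roughly as in the following sketch, assuming the device-side measurement, the sensor reception, and the display update are available as simple callables; the function and variable names are illustrative only and do not appear in the disclosure.

```python
# Illustrative organization of the measurement loop S02-S05.
def run_measurement(measure_device, receive_sensor, display, stop_requested, dT):
    while not stop_requested():                      # S05: keep going until told to stop
        finger, headneck, t_dev = measure_device()   # S02: two-finger distance + head/neck change
        bio, t_sensor = receive_sensor()             # S03: biological information from the sensor
        # S04: map the sensor's time stamp onto the device's time axis and plot together
        display(t_dev, finger, headneck, t_sensor - dT, bio)
```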
Next, a configuration example of the information processing device according to the present embodiment will be described with reference to FIG. 10. FIG. 10 is a block diagram showing a configuration example of a head-mounted display as an example of the information processing device according to the present embodiment.
In FIG. 10, the parts denoted by the same reference numerals as in FIGS. 1 and 3 perform the same operations as those already described with reference to FIGS. 1 and 3, and their detailed description is therefore partially omitted.
In FIG. 10, the HMD 100 is configured using, as appropriate, an out-camera 110, a distance measuring sensor 101, a left-eye line-of-sight sensor 111, a right-eye line-of-sight sensor 112, an acceleration sensor 804, a gyro sensor 805, a geomagnetic sensor 806, an operation input interface 807, a display 113, a processor 820, a memory 830 storing a program 831 and information data 832, a vibrator 841, a microphone 842, a speaker 843, and a communication device 844, and these components are interconnected via a bus 850.
The acceleration sensor 804 is a sensor that detects acceleration, i.e., the change in velocity per unit time, and can capture movement, vibration, impact, and the like. The acceleration sensor 804 provided in the HMD 100 makes it possible to detect the tilt and orientation of the HMD 100.
The gyro sensor 805 is a sensor that detects angular velocity about the rotational axes and can capture vertical, horizontal, and diagonal posture states. Accordingly, the acceleration sensor 804 and the gyro sensor 805 can be used to detect the attitude of the HMD 100, such as its tilt and orientation.
The geomagnetic sensor 806 is a sensor that detects the earth's magnetic field and detects the direction in which the HMD 100 is facing. By using a three-axis type that detects geomagnetism in the vertical direction in addition to the front-rear and left-right directions, and by capturing the change in geomagnetism accompanying the movement of the HMD 100, the movement of the HMD 100 can also be detected. With these sensors, the posture of the subject 10 wearing the HMD 100 can be detected and determined, which contributes to optimizing the state of the HMD 100 so that the distance between two fingers during the tapping motion can be measured more accurately.
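As an illustration of how the attitude of the HMD 100 might be derived from these sensors, the sketch below computes pitch and roll from the gravity vector reported by the acceleration sensor and a heading from the horizontal geomagnetic components. These are common textbook formulas offered only as an example; the disclosure does not specify the estimation method, and the sketch ignores tilt compensation and gyro fusion.

```python
# Derive tilt from the acceleration sensor and heading from the geomagnetic
# sensor (simplified estimation for illustration).
import math

def tilt_from_accel(ax: float, ay: float, az: float):
    """Pitch and roll (radians) from the gravity vector measured at rest."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

def heading_from_magnetometer(mx: float, my: float) -> float:
    """Yaw (radians) from the horizontal geomagnetic components, ignoring tilt."""
    return math.atan2(my, mx)

pitch, roll = tilt_from_accel(0.0, 0.2, 9.8)
yaw = heading_from_magnetometer(22.0, 5.0)
```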
Since the acceleration sensor 804, the gyro sensor 805, and the geomagnetic sensor 806 can detect the movement of the head and neck of the subject 10, they may also be used as head and neck region state change sensors.
The processor 820 is constituted by a CPU or the like and controls each component of the HMD 100 by executing programs 831 such as an operating system (OS) and operation control applications stored in the memory 830, thereby realizing the functions of the OS, middleware, applications, and other functions.
As an example, when the processor 820 executes a program implementing the functions of the information processing system according to the present embodiment, a finger tapping motion analysis processing unit 821, an eye movement detection and analysis processing unit 822, a switching control unit 823, and a display control unit 824 are realized.
The finger tapping motion analysis processing unit 821 analyzes the trajectory of the finger tapping motion from the finger movement detected by the out-camera 110 or the distance measuring sensor 101, and calculates characteristic parameters.
The eye movement detection and analysis processing unit 822 captures eye movement from the motion and orientation of the eyeballs and pupils detected by the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts feature amounts, such as involuntary wandering of the eyeballs or pupils, that can lead to detection of signs of an abnormal state of the subject.
The switching control unit 823 controls switching so that, in the processing of the eye movement detection and analysis processing unit 822, different processing is performed for the eye-tracking operation used to point at an object and for the eye movement measurement used to measure the state of eye movement. In the eye movement measurement, minute movement states of the subject's eyeballs are observed and feature amounts are extracted. Between the eye-tracking operation and the eye movement measurement, processing such as changing the speed of following changes in the line-of-sight direction and eye movement, or detecting minute movements, is performed to analyze and extract involuntary, wandering movements of the eyeballs and pupils.
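The mode switching performed by the switching control unit 823 could look roughly like the following sketch, in which pointing (eye tracking) applies heavy smoothing while eye movement measurement keeps the signal nearly raw and flags sudden micro-deviations; the filter constants and the threshold are illustrative assumptions, not values taken from the disclosure.

```python
# Two gaze-processing modes: smoothed pointing vs. near-raw micro-movement detection.
def process_gaze(samples, mode):
    """samples: list of (x, y) gaze positions; mode: 'tracking' or 'measurement'."""
    alpha = 0.2 if mode == "tracking" else 0.9   # follow slowly vs. almost raw
    smoothed, events = [], []
    x, y = samples[0]
    for i, (nx, ny) in enumerate(samples):
        x += alpha * (nx - x)
        y += alpha * (ny - y)
        smoothed.append((x, y))
        if mode == "measurement" and abs(nx - x) + abs(ny - y) > 0.5:
            events.append(i)                     # candidate involuntary micro-movement
    return smoothed, events
```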
The display control unit 824 performs control for displaying, side by side on the same screen of the display 113 and with their time axes aligned, the time transition of the distance between two fingers detected by the finger tap distance sensor, the time transition of the head and neck region state change amount detected by the head and neck region state change sensor, and, when a biological information sensor is present, the time transition of the biological information.
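A minimal sketch of such a shared-time-axis display is given below; the use of matplotlib is an assumption for illustration, since the disclosure only requires that the series be shown side by side with their time axes aligned.

```python
# Stack the measured series in separate panels that share one time axis.
import matplotlib.pyplot as plt

def show_aligned(t, series):
    """t: common time axis; series: dict of label -> values sampled on t."""
    fig, axes = plt.subplots(len(series), 1, sharex=True, squeeze=False)
    for ax, (label, values) in zip(axes[:, 0], series.items()):
        ax.plot(t, values)
        ax.set_ylabel(label)
    axes[-1, 0].set_xlabel("time [s]")
    plt.show()
```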
The memory 830 is constituted by a non-volatile storage device or the like, and stores the various programs 831 and information data 832 handled by the processor 820 and the like. Stored as the information data 832 are, for example, captured image data taken during the tapping motion, operation guide image data for the finger tap distance measurement, relaxation image data, and the measured and detected biological information.
The display 113 is provided in the video see-through type HMD 100 and is constituted by a display such as a liquid crystal panel that displays together the real objects in front of the eyes captured by the out-camera 110 and virtual objects and the like. This allows the subject 10 to visually recognize the real objects in the field-of-view image in front of the eyes with the virtual objects and the like superimposed on them.
In the case of an optical see-through type HMD, instead of the display 113, the HMD comprises, for example, a projection unit that projects virtual objects such as an operation guide image for the finger tap distance measurement or a relaxation image, as well as notification information for the subject 10, and a transparent half mirror that forms and displays the projected virtual objects and the like in front of the eyes. This allows the subject 10 to visually recognize the imaged virtual objects, as if they were floating, together with the real objects within the field of view in front of the eyes.
The operation input interface 807 is an input interface using, for example, gesture operation, voice input, a virtual keyboard, or a touch sensor, and is used to set and input information that the subject 10 wishes to enter. The operation input interface 807 may be separated from the main body of the HMD 100 and connected by wire or wirelessly. Alternatively, an input operation screen may be displayed within the display screen of the display 113 and input operation information may be captured from the position on the input operation screen at which the line of sight is directed, or a pointer may be displayed on the input operation screen and operated via the operation input interface 807 to capture the input operation information. The subject may also utter speech indicating an input operation, which is picked up by the microphone 842 to capture the input operation information.
The speaker 843 outputs sound from a speaker or headphones on the basis of audio data, so that notification information for the subject 10 can be conveyed by sound.
The vibrator 841 generates vibration under the control of the processor 820 and converts notification instruction information issued by the HMD 100 for the subject 10 into vibration. The vibrator 841 transmits the vibration to the head of the subject 10 wearing the HMD 100, thereby informing the subject 10 of the notification instruction information. Examples of the notification information for the subject 10 include information indicating the start and end of the finger tapping measurement or the head and neck measurement, information indicating other devices performing simultaneous measurement, and interrupt notifications; this notification information improves usability.
The communication device 844 is a communication interface that performs wireless communication at least among the HMD 100, other devices, and the smartphones 300 and 600 by short-range wireless communication, wireless LAN, or base-station communication, includes communication processing circuits and antennas corresponding to the various predetermined communication interfaces, and transmits and receives biological information, image data, control signals, and the like. Short-range wireless communication is performed using, for example, Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark). For base-station communication, long-distance wireless communication such as W-CDMA (Wideband Code Division Multiple Access, registered trademark) or GSM (Global System for Mobile Communications) may be used.
The finger tapping motion analysis processing unit 821 analyzes and extracts, on the basis of the two-finger distance information measured and detected during the tapping motion using the distance measuring sensor 101, feature amounts that characterize the finger movement for items such as distance, velocity, acceleration, tap interval, and phase difference. From the analyzed and extracted feature amounts, biological information that leads to evaluation of the brain function of the subject can be obtained.
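The feature items named above could be extracted from the two-finger distance time series roughly as in the following sketch; the contact threshold and the peak-picking rule are illustrative assumptions, since the disclosure lists the feature items but not the extraction formulas.

```python
# Extract tapping features (velocity, acceleration, tap count, tap intervals)
# from the time series of the two-finger distance.
import numpy as np

def tapping_features(t, d, closed_threshold=5.0):
    """t: time stamps [s]; d: two-finger distance [mm] sampled at those times."""
    t, d = np.asarray(t, float), np.asarray(d, float)
    v = np.gradient(d, t)                       # opening/closing velocity [mm/s]
    a = np.gradient(v, t)                       # acceleration [mm/s^2]
    closed = d < closed_threshold               # fingers considered to be in contact
    taps = t[1:][closed[1:] & ~closed[:-1]]     # times at which the fingers come together
    intervals = np.diff(taps)                   # tap-to-tap intervals [s]
    return {
        "max_distance": float(d.max()),
        "max_velocity": float(np.abs(v).max()),
        "max_acceleration": float(np.abs(a).max()),
        "tap_count": int(len(taps)),
        "mean_tap_interval": float(intervals.mean()) if len(intervals) else None,
    }
```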
With the above configuration, in the HMD 100, the distance measuring sensor 101 measures the distance information between the two fingers during the tapping motion, and the finger tapping motion analysis processing unit 821 extracts, on the basis of the measured two-finger distance information, feature amounts characterizing the finger movement. Furthermore, the eye movement detection and analysis processing unit 822 detects eye movement information with the left-eye line-of-sight sensor 111 and the right-eye line-of-sight sensor 112, and extracts feature amounts, such as involuntary wandering of the eyeballs or pupils, that can lead to detection of signs of an abnormal state of the subject.
Accordingly, the HMD 100 can obtain, as biological information, a plurality of feature amounts analyzed and extracted on the basis of these items of information measured and detected simultaneously. It is thus possible to realize an information processing device that can detect, with high accuracy, in a simple and convenient manner, and without the need to wear any special instrument, biological information indicating signs, precursors, and symptoms of disease.
Next, another configuration example of the information processing device according to the present embodiment will be described with reference to FIG. 11. FIG. 11 is a block diagram showing a configuration example of the smartphone 300, a portable information terminal, as an example of the information processing device according to the present embodiment. In FIG. 11, the parts denoted by the same reference numerals as in FIGS. 1 and 3 have the same functions as the operations already described with reference to FIGS. 1 and 3, and their detailed description is omitted. The smartphone 600 has the same configuration as the smartphone 300, and its description is also omitted.
In FIG. 11, the smartphone 300 is configured using, as appropriate, an out-camera 301, an in-camera 302, a distance measuring sensor 303, an acceleration sensor 904, a gyro sensor 905, a geomagnetic sensor 906, an operation input interface 907, a display 908, a processor 920, a memory 930 storing a program 931 and information data 932, a vibrator 941, a microphone 942, a speaker 943, a wireless communication device 944, and the like, and these components are interconnected via a bus 950. Duplicate description of components identical to those of the HMD 100 in FIG. 10 is omitted.
Like the processor 820 of the HMD 100, the processor 920 executes a program implementing the functions of the information processing system according to the present embodiment, whereby a finger tapping motion analysis processing unit 921, a head and neck analysis processing unit 922, and a display control unit 924 are realized.
The display 908 displays content via a highly transparent touch panel constituted by a liquid crystal or the like, and displays, for example, an image of the scene during the tapping motion, an operation guide image for the finger tap distance measurement, a relaxation image, various operation buttons, and notification information for the subject. The notification information for the subject 10 includes information indicating the start and end of the finger tapping measurement or the measurement of the head and neck region, and information indicating other devices performing simultaneous measurement. The display device of the smartphone 300 displays the various items of measured and detected biological information as they change over time.
The operation input interface 907 is used by the subject 10 to enter input information into the smartphone 300; in the case of the smartphone 300, a touch panel formed on the display surface of the display 908 detects touch operations with a finger, a stylus, or the like as operation inputs, so that the information the subject 10 wishes to enter can be set and input. Input operation information may also be captured by, for example, a keyboard or key buttons, or the subject 10 may utter speech indicating an input operation, which is picked up by the microphone 942 to capture the input operation information.
The wireless communication device 944 is a communication interface that performs wireless communication at least among the HMD 100, other devices, and the smartphone 300 by short-range wireless communication, wireless LAN, or base-station communication, includes communication processing circuits and antennas corresponding to the various predetermined communication interfaces, and transmits and receives biological information, image data, control signals, and the like. Short-range wireless communication is performed using, for example, Bluetooth (registered trademark), IrDA (Infrared Data Association, registered trademark), Zigbee (registered trademark), HomeRF (Home Radio Frequency, registered trademark), or a wireless LAN such as Wi-Fi (registered trademark). For base-station communication, long-distance wireless communication such as W-CDMA (Wideband Code Division Multiple Access, registered trademark) or GSM (Global System for Mobile Communications) may be used.
Next, FIG. 12 is a diagram schematically showing the external appearance of an embodiment of an information processing system in which feature amounts measured and analyzed by the information processing device and other devices are stored in the cloud as biological information and utilized.
In FIG. 12, an information processing device 1000 comprising the HMD 100 or the smartphone 300 is connected to a server (computer) 1001 such as a cloud server, an application server, or a data server via a network 1002 such as the Internet. The server 1001 is also connected to an information terminal such as a personal computer 1003 via the network 1002.
In the configuration shown in FIG. 12, the biological information data measured and analyzed by the information processing device 1000 and other devices is stored and saved, via the network 1002, in the server 1001 constituting the cloud.
The server 1001, on the other hand, comprehensively holds a large amount of biological information data, such as general average-value data of biological information and biological information data indicating signs, precursors, and symptoms of disease classified by age and by underlying disease.
Data on improvements achieved through rehabilitation and data on the effects and side effects of medication are also accumulated. Accordingly, a user 1004 such as a doctor or a nurse who is at a location different from the information processing device 1000 and remotely uses an information terminal such as the personal computer 1003 can obtain from the server 1001 the analyzed biological information data and the large amount of biological information data held by the server 1001, compare them, and thereby analyze signs, precursors, and symptoms of disease with high accuracy. In this way, it is possible to receive advanced diagnosis remotely from a doctor at a distant location.
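Such a comparison between a subject's extracted feature amounts and the reference data held on the server could be sketched as below; the reference table, the feature names, and the two-standard-deviation rule are illustrative assumptions and are not taken from the disclosure.

```python
# Flag features that deviate from server-held reference statistics by more
# than a chosen number of standard deviations (illustrative comparison).
def flag_deviations(features, reference, n_sigma=2.0):
    """features: name -> measured value; reference: name -> (mean, std)."""
    flags = {}
    for name, value in features.items():
        if name in reference:
            mean, std = reference[name]
            flags[name] = abs(value - mean) > n_sigma * std
    return flags

flags = flag_deviations(
    {"mean_tap_interval": 0.45, "max_velocity": 180.0},
    {"mean_tap_interval": (0.30, 0.05), "max_velocity": (250.0, 60.0)})
# -> {'mean_tap_interval': True, 'max_velocity': False}
```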
In addition, the smartphone 300 can display, on a single screen, the items of biological information measured and detected at the same timing by the HMD 100 and other devices, making it possible to accurately grasp and judge the occurrence of abnormal behavior on the same time axis with high detection accuracy.
The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments described above have been explained in detail in order to describe the present invention in an easily understandable manner, and the invention is not necessarily limited to configurations having all of the described components. Part of the configuration of one embodiment can be replaced with the configuration of another embodiment, and the configuration of another embodiment can be added to the configuration of one embodiment. Furthermore, for part of the configuration of each embodiment, other configurations can be added, deleted, or substituted.
Each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example by designing some or all of them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that realizes the respective function. Information such as programs, tables, and files that realize the respective functions may be stored in a memory, in a recording device such as a hard disk or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD, or may be stored in a device on a communication network. The control lines and information lines shown are those considered necessary for the explanation, and not all control lines and information lines of a product are necessarily shown; in practice, almost all components may be considered to be interconnected.
Reference Signs List
1: Information processing system
1A: Information processing system
10: Subject
100: HMD
101: Distance measuring sensor
102: Left hand
103: Finger
104: Thumb
105: Right hand
106: Finger
107: Thumb
108: Solid line
109: Solid line
110: Out-camera
111: Left-eye line-of-sight sensor
112: Right-eye line-of-sight sensor
113: Display
121: Screen
122: Relaxation image
123: Operation guide image
201: Dotted-line frame
202: Dotted-line frame
300: Smartphone
301: Out-camera
302: In-camera
303: Distance measuring sensor
304: Stand
305: Housing
401: Dotted-line frame
402: Dotted-line frame
403: Point
404: Point
501: Display screen
501a: Display screen
501b: Display screen
501c: Display screen
502: Operation guide image
503: Head and neck image
504: Head and neck region
505: Head and neck image
600: Smartphone
601: Heart rate sensor
602: Electroencephalograph
603: Smartwatch
604: Blood glucose sensor
605: Sphygmomanometer
606: Audiometer
701: Display screen
802: Processor
804: Acceleration sensor
805: Gyro sensor
806: Geomagnetic sensor
807: Operation input interface
820: Processor
821: Finger tapping motion analysis processing unit
822: Eye movement detection and analysis processing unit
823: Switching control unit
824: Display control unit
830: Memory
831: Program
832: Information data
841: Vibrator
842: Microphone
843: Speaker
844: Communication device
850: Bus
904: Acceleration sensor
905: Gyro sensor
906: Geomagnetic sensor
907: Operation input interface
908: Display
920: Processor
921: Finger tapping motion analysis processing unit
922: Head and neck analysis processing unit
924: Display control unit
930: Memory
931: Program
932: Information data
941: Vibrator
942: Microphone
943: Speaker
944: Wireless communication device
950: Bus
1000: Information processing device
1001: Server
1002: Network
1003: Personal computer
1004: User

Claims (12)

  1.  An information processing device comprising:
     a processor,
     wherein the processor
     is connected to each of a finger tap distance sensor that measures and detects a distance between two fingers of a subject during a tapping motion of the two fingers and outputs two-finger distance information, and a head and neck region state change sensor that measures and detects a state change of a region other than the fingers, mainly of the head and neck, and outputs head and neck state change amount information,
     causes synchronization processing to be executed between the finger tap distance sensor and the head and neck region state change sensor, and
     acquires the two-finger distance information and the head and neck state change amount information obtained by the finger tap distance sensor and the head and neck region state change sensor performing measurement operations in synchronization with each other.
  2.  The information processing device according to claim 1, further comprising:
     a display connected to the processor,
     wherein the processor further executes control for displaying the two-finger distance information and the head and neck state change amount information on the display with the time axes of their measurement operations aligned.
  3.  The information processing device according to claim 1,
     wherein the processor analyzes and extracts, as biological information, a feature amount leading to evaluation of a brain function of the subject on the basis of the two-finger distance information, and analyzes and extracts, as biological information, a feature amount leading to evaluation of an abnormal state of the subject on the basis of the head and neck state change amount information.
  4.  The information processing device according to claim 1,
     wherein the information processing device is provided in a head-mounted display that is worn on a head of the subject and displays at least one of real space information and virtual space information, and
     the head-mounted display comprises, as the finger tap distance sensor, a distance measuring sensor that measures a distance to an object, and comprises, as the head and neck region state change sensor, a left-eye line-of-sight sensor that detects movement of a left eye of the subject and a right-eye line-of-sight sensor that detects movement of a right eye of the subject.
  5.  The information processing device according to claim 4,
     wherein the head-mounted display further comprises a display arranged in front of the eyes of the subject,
     the processor is connected to the display, and
     while the finger tap distance sensor is measuring the distance between the two fingers during the tapping motion by the subject, an image whose content differs from the overall view of the fingers during the tapping motion is displayed.
  6.  The information processing device according to claim 4,
     wherein control is performed to switch the left-eye line-of-sight sensor and the right-eye line-of-sight sensor between different processing for line-of-sight input processing and for eye movement measurement.
  7.  The information processing device according to claim 1,
     wherein the information processing device is provided in a portable information terminal,
     the portable information terminal comprises an in-camera provided on a front face of a housing and an out-camera provided on a rear face of the housing,
     the out-camera is used as the finger tap distance sensor, and the processor measures and detects the distance between the two fingers of the subject on the basis of an out-camera image in which the out-camera has photographed the two fingers of the subject, and
     the in-camera is used as the head and neck region state change sensor, and the processor measures and detects the state change of the head and neck region on the basis of an in-camera image in which the in-camera has photographed the head and neck region of the subject.
  8.  The information processing device according to claim 7,
     wherein the portable information terminal further comprises a distance measuring sensor, and
     the distance measuring sensor is used as the finger tap distance sensor in place of the out-camera.
  9.  The information processing device according to claim 7,
     wherein the portable information terminal further comprises a display provided on the front face of the housing,
     the processor is connected to the display, and
     while the finger tap distance sensor is measuring the distance between the two fingers during the tapping motion by the subject, an image whose content differs from the overall view of the fingers during the tapping motion is displayed on the display.
  10.  The information processing device according to claim 9,
     wherein the image whose content differs from the overall view of the fingers during the tapping motion is a head and neck image in which the in-camera has photographed the head and neck of the subject performing the tapping motion, a relaxation image for relaxing the subject, or an operation guide image for the measurement of the distance between the two fingers by the tapping motion.
  11.  An information processing system configured by connecting each of a first information processing device and a biological information sensor to a second information processing device, wherein
     the first information processing device comprises:
     a first processor;
     a finger tap distance sensor that measures and detects a distance between two fingers of a subject during a tapping motion of the two fingers and outputs two-finger distance information;
     a head and neck region state change sensor that measures and detects a state change of a region other than the fingers, mainly of the head and neck, and outputs head and neck state change amount information; and
     a first communication device,
     the first processor causes synchronization processing to be executed between the finger tap distance sensor and the head and neck region state change sensor, acquires the two-finger distance information and the head and neck state change amount information obtained by the finger tap distance sensor and the head and neck region state change sensor performing measurement operations in synchronization, and transmits them to the second information processing device,
     the biological information sensor acquires biological information of the subject in synchronization with the finger tap distance sensor and the head and neck region state change sensor, and transmits the biological information to the second information processing device,
     the second information processing device comprises:
     a second processor;
     a display; and
     a second communication device, and
     the second processor performs synchronization processing between the first information processing device and the biological information sensor, and displays the two-finger distance information, the head and neck state change amount information, and the information from the biological information sensor on the display with the time axes of their measurement operations aligned.
  12.  An information processing method comprising:
     a step of synchronizing a finger tap distance sensor that measures and detects a distance between two fingers of a subject during a tapping motion of the two fingers and outputs two-finger distance information, and a head and neck region state change sensor that measures and detects a state change of a region other than the fingers, mainly of the head and neck, and outputs head and neck state change amount information; and
     a step of acquiring the two-finger distance information and the head and neck state change amount information obtained by the finger tap distance sensor and the head and neck region state change sensor performing measurement operations in synchronization.

PCT/JP2021/043586 2021-11-29 2021-11-29 Information processing device, information processing system, and information processing method WO2023095321A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043586 WO2023095321A1 (en) 2021-11-29 2021-11-29 Information processing device, information processing system, and information processing method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043586 WO2023095321A1 (en) 2021-11-29 2021-11-29 Information processing device, information processing system, and information processing method

Publications (1)

Publication Number Publication Date
WO2023095321A1 true WO2023095321A1 (en) 2023-06-01

Family

ID=86539293

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/043586 WO2023095321A1 (en) 2021-11-29 2021-11-29 Information processing device, information processing system, and information processing method

Country Status (1)

Country Link
WO (1) WO2023095321A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011229745A (en) * 2010-04-28 2011-11-17 Hitachi Computer Peripherals Co Ltd Motor function analyzing apparatus
US20140154651A1 (en) * 2012-12-04 2014-06-05 Sync-Think, Inc. Quantifying peak cognitive performance using graduated difficulty
JP2017217144A (en) * 2016-06-06 2017-12-14 マクセルホールディングス株式会社 System, method and program for generating hand finger movement practice menu
JP2018505759A (en) * 2015-01-06 2018-03-01 バートン,デイビット Portable wearable monitoring system
JP2019511067A (en) * 2016-01-19 2019-04-18 マジック リープ, インコーポレイテッドMagic Leap,Inc. Augmented reality system and method utilizing reflection
JP2020537579A (en) * 2017-10-17 2020-12-24 ラオ、サティシュ Machine learning-based system for identifying and monitoring neuropathy
WO2021014717A1 (en) * 2019-07-22 2021-01-28 マクセル株式会社 Detection device and detection method

Similar Documents

Publication Publication Date Title
Cognolato et al. Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances
CN109804331B (en) Detecting and using body tissue electrical signals
CN105578954B (en) Physiological parameter measurement and feedback system
US20200268296A1 (en) Brain-computer interface with adaptations for high-speed, accurate, and intuitive user interactions
Lamonaca et al. Health parameters monitoring by smartphone for quality of life improvement
Majaranta et al. Eye tracking and eye-based human–computer interaction
JP7106569B2 (en) A system that evaluates the user&#39;s health
US20180206735A1 (en) Head-mounted device for capturing pulse data
EP3064130A1 (en) Brain activity measurement and feedback system
JP2019513516A (en) Methods and systems for acquiring, aggregating and analyzing visual data to assess human visual performance
US10709328B2 (en) Main module, system and method for self-examination of a user&#39;s eye
WO2015123771A1 (en) Gesture tracking and control in augmented and virtual reality
KR102029219B1 (en) Method for recogniging user intention by estimating brain signals, and brain-computer interface apparatus based on head mounted display implementing the method
JP2018196730A (en) Method and system for monitoring eye position
JP2022500801A (en) Devices, methods, and programs for determining a user&#39;s cognitive status on a mobile device.
US20220071484A1 (en) Virtual reality-based portable nystagmography device and diagnostic test method using same
Amprimo et al. Gmh-d: Combining google mediapipe and rgb-depth cameras for hand motor skills remote assessment
JP7209954B2 (en) Nystagmus analysis system
WO2023095321A1 (en) Information processing device, information processing system, and information processing method
EP4088648A1 (en) Imaging device, ocular movement data processing system, and control method
WO2023042343A1 (en) Measurement processing terminal, method, and computer program for performing process of measuring finger movement
JP7449898B2 (en) Imaging device, eye movement data processing system, and control method
Wilder et al. Eye tracking in virtual environments
KR20190076722A (en) Method and system for testing multiple-intelligence based on vr/ar using mobile device
US20240111380A1 (en) Finger tapping measurement processing terminal, system, method, and computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21965690

Country of ref document: EP

Kind code of ref document: A1