WO2017170804A1 - Biometric measurement apparatus, information processing program, and biometric measurement method - Google Patents


Publication number
WO2017170804A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
sensor
subject
image
sensor device
Application number
PCT/JP2017/013125
Other languages
French (fr)
Japanese (ja)
Inventor
Shinji Yoshida (吉田 慎治)
Mina Yoshimura (吉村 美奈)
Kiyoshi Hasegawa (長谷川 清)
Original Assignee
Hitachi High-Technologies Corporation (株式会社日立ハイテクノロジーズ)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Hitachi High-Technologies Corporation (株式会社日立ハイテクノロジーズ)
Priority to JP2018509401A (JPWO2017170804A1)
Publication of WO2017170804A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/02: Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/026: Measuring blood flow
    • A61B10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements

Definitions

  • The present invention relates to a biometric measurement apparatus, an information processing program, and a biometric measurement method.
  • Patent Document 1 describes a brain activity measuring system having a brain activity measuring device and a control unit.
  • The brain activity measuring device includes a wearing tool placed on the head of the subject, a plurality of sensors arranged on the inner side of the wearing tool to measure the blood flow and brain waves of the subject, and a plurality of display units arranged on the outer side of the wearing tool.
  • The control unit measures, via the sensors, the blood flow and brain waves in each region of the brain associated with the subject's brain activity, stores the measurement data (measurement values) in a memory, determines the activity level at each measurement point from the measurement data, and controls the display color and blinking speed of the display unit corresponding to each measurement point according to the determination result (Patent Document 1).
  • The brain activity measurement system described in Patent Document 1 measures blood flow over the entire brain, with sensors arranged over the entire head.
  • It is therefore a large-scale device even when only a specific part of the brain needs to be measured.
  • When only the blood flow and brain waves of a specific part of the brain are to be measured, it is desirable to downsize the measuring device to a size suited to measuring that part.
  • When the device is downsized, however, it is no longer clear on which part of the head it is mounted, so it becomes necessary to specify the mounting position (the position of the sensors on the subject's head).
  • An object of the present invention is to provide a biometric measuring apparatus capable of specifying the position where its sensors are mounted.
  • The first aspect is a biometric measuring device including: a sensor device that is mounted on a subject and has a plurality of sensors for measuring a physical quantity of the subject; an imaging device that images the sensor device and the subject; and a user terminal that calculates, from an image captured by the imaging device, the mounting position of the sensors of the sensor device with respect to the subject.
  • According to this aspect, the biometric measuring device calculates the position of the sensor device attached to the subject from an image of the subject.
  • The disclosed aspect may also be realized by a program executed by an information processing apparatus. That is, the disclosed configuration can be specified as a program that causes the information processing apparatus to execute the processing performed by each unit of the above aspect, or as a computer-readable recording medium on which that program is recorded. The disclosed configuration may also be specified as a method in which the information processing apparatus executes that processing, or as a system including an information processing apparatus that performs it.
  • FIG. 1 is a diagram illustrating a configuration example of a biological measurement apparatus according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating a configuration example of the head-mounted device.
  • FIG. 3 is a diagram illustrating a configuration example of a user terminal.
  • FIG. 4 is a diagram illustrating an appearance of the head-mounted device.
  • FIG. 5 is a diagram illustrating an example in which the head mounting device is mounted on the head.
  • FIG. 6 is a diagram illustrating an example of an operation flow for recognizing the head mounting position of the head mounting device mounted on the head.
  • FIG. 7 is a diagram illustrating an example in which the measurement result is superimposed on the image of the head of the subject.
  • FIG. 8 is a diagram illustrating an example in which the measurement result is superimposed on the simulated image of the head.
  • FIG. 9 is a diagram illustrating an example of the appearance of the head-mounted device.
  • FIG. 10 is a diagram illustrating an example of a columnar housing of the sensor of the head-mounted device of FIG.
  • FIG. 11 is a diagram illustrating an example of the sensor 14 facing the front.
  • FIG. 12 is a diagram illustrating an example of the sensor 14 not facing the front.
  • FIG. 13 is a diagram illustrating an example of a cross section of a head-mounted device that is not mounted on the head.
  • FIG. 14 is a diagram illustrating an example of a cross section of the head mounting device mounted on the head.
  • FIG. 15 is a diagram illustrating an example of the appearance of the head-mounted device.
  • FIG. 1 is a diagram illustrating a configuration example of a biological measurement apparatus according to an embodiment of the present invention.
  • The biometric measurement device 5 has a head mounting device 1 that is mounted on the head of the subject and measures the blood flow of the head, a user terminal 2 that specifies the mounting position of the head mounting device 1 and displays the mounting position and the measurement results, and an imaging device 3 that images where the head mounting device 1 is mounted on the head.
  • the head mounting device 1 and the user terminal 2 will be described in detail with reference to FIGS.
  • the head mounting device 1 and the user terminal 2 may be integrated. Further, the user terminal 2 and the imaging device 3 may be integrated. Blood flow is an example of a physical quantity of a subject.
  • the imaging device 3 is connected to the user terminal 2 by wire or wirelessly, and the captured image data is transferred to the user terminal 2.
  • the imaging device 3 is a photographing means such as a CCD (charge-coupled device) camera or an imaging tube.
  • the size of the imaging device 3 may be any size as long as the head can be imaged.
  • a single imaging device 3 may be used to image a plurality of heads, or a plurality of imaging devices 3 may be used to capture a plurality of heads at a time.
  • the living body measuring device 5 detects measurement data (also referred to as a detection value) indicating a change in blood flow from the user's head, and acquires brain activity information indicating the activity state of the user's brain.
  • the biological measurement device 5 is an example of a biological light measurement device.
  • a user is an example of a subject.
  • FIG. 2 is a diagram illustrating a configuration example of the head-mounted device.
  • As its information processing configuration, the head-mounted device 1 includes a control unit 11, a wireless communication unit 13, sensors 141 to 146, and an output unit 15. When the sensors 141 to 146 need not be distinguished, they are collectively referred to as the sensor 14.
  • the control unit 11 controls measurement and communication of the head-mounted device 1.
  • The control unit 11 includes, for example, a processor such as a CPU (Central Processing Unit) or a DSP (Digital Signal Processor) and a memory, and executes processing by a computer program, firmware, or the like deployed on the memory.
  • The control unit 11 may also be a dedicated hardware circuit, an FPGA (Field Programmable Gate Array), or the like that activates the wireless communication unit 13, the sensor 14, and the output unit 15 and executes cooperative processing with each component.
  • the control unit 11 may be a mixture of a CPU, a DSP, a dedicated hardware circuit, and the like.
  • In FIG. 2, the head-mounted device 1 has six sensors 141 to 146, but it may have more or fewer than six sensors 14.
  • the head mounting device 1 is an example of a sensor device.
  • the wireless communication unit 13 is connected to the control unit 11, the sensor 14, and the output unit 15 through a predetermined interface. However, the wireless communication unit 13 may be configured to acquire data from the sensor 14 via the control unit 11.
  • the wireless communication unit 13 communicates with the user terminal 2 via the network N1.
  • the network N1 is a network according to standards such as Bluetooth (registered trademark), wireless LAN (Local Area Network), and ZigBee (registered trademark).
  • the wireless communication unit 13 is an example of a transfer unit. However, in the information processing system, the standard of the wireless interface of the wireless communication unit 13 is not limited.
  • Identification information is embedded in the header portion or the user data portion (payload portion) of the communication data so that the user terminal 2 can identify the user (subject).
  • Instead of the wireless communication unit 13, a communication unit that performs wired communication may be provided. That is, the head-mounted device 1 and the user terminal 2 may be connected by a wired communication interface.
  • The interface for wired communication is not limited, and various interfaces such as USB (Universal Serial Bus) and PCI Express can be used according to the use of the information processing system.
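  • The description does not fix a concrete payload layout. The sketch below uses an assumed, hypothetical format — a 1-byte subject identifier, a 1-byte sensor identifier, and a big-endian 32-bit float detection value — to show one way the identification information could be embedded in the user data portion:

```python
import struct

# Hypothetical payload layout (not specified in the description):
# 1-byte user (subject) id, 1-byte sensor id, 4-byte big-endian float value.
SAMPLE_FMT = ">BBf"

def pack_sample(user_id: int, sensor_id: int, value: float) -> bytes:
    """Build the user-data portion so the user terminal 2 can tell
    subjects and sensors apart when receiving measurement data."""
    return struct.pack(SAMPLE_FMT, user_id, sensor_id, value)

def unpack_sample(payload: bytes):
    """Recover (user_id, sensor_id, value) on the user terminal side."""
    return struct.unpack(SAMPLE_FMT, payload)
```

Any self-describing format would serve; the point is only that the subject and sensor identifiers travel with each detection value.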
  • The sensor 14 irradiates the head with near-infrared light, receives the near-infrared light that returns after being partially absorbed and scattered near the cerebral cortex, and converts it into an electrical signal.
  • The blood flow rate in the cerebral cortex differs depending on, for example, the activity state of the brain.
  • As the blood flow changes, the amounts of oxygen-bound and unbound hemoglobin in the blood change, and these changes alter the near-infrared absorption and scattering characteristics in the vicinity of the cerebral cortex.
  • The sensor 14 converts into an electrical signal, and outputs, the near-infrared light whose amount has changed with the absorption rate or transmittance according to the blood-flow state near the cerebral cortex.
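  • The conversion from received light to a blood-flow-related quantity is not detailed here; a common approach in near-infrared spectroscopy is the modified Beer-Lambert law, sketched below. The extinction coefficient, path length, and differential pathlength factor are illustrative assumptions, not values from this description:

```python
import math

def absorbance_change(i_baseline: float, i_current: float) -> float:
    """Change in optical density when the detected intensity moves from
    its baseline: dOD = -log10(I / I0). More absorption -> positive dOD."""
    return -math.log10(i_current / i_baseline)

def hemoglobin_change(delta_od: float, extinction: float,
                      path_length_cm: float, dpf: float) -> float:
    """Modified Beer-Lambert law: dC = dOD / (epsilon * L * DPF).
    epsilon, L, and DPF here are illustrative placeholders."""
    return delta_od / (extinction * path_length_cm * dpf)

# A drop in detected intensity yields a positive optical-density change,
# i.e. an increase in the absorbing hemoglobin species.
d_od = absorbance_change(i_baseline=1.0, i_current=0.9)
d_hb = hemoglobin_change(d_od, extinction=2.0, path_length_cm=3.0, dpf=6.0)
```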
  • Each sensor 14 is identified by an identifier, for example.
  • the sensor 14 includes, for example, a near-infrared light source that emits near-infrared light and a light-receiving unit that receives near-infrared light.
  • The near-infrared light source is, for example, an LED (Light-Emitting Diode), an infrared lamp, or the like.
  • the light receiving unit includes a photoelectric element such as a photodiode or a phototransistor, an amplifier, and an AD (Analog Digital) converter. Note that the near-infrared light source and the light receiving unit may not be provided in pairs. For example, a plurality of light receiving units may be provided for one near infrared light source.
  • the light receiving unit is an example of a light detection unit.
  • the sensor 14 may further include a light source for detecting the wearing state.
  • the light source for mounting state detection is, for example, an LED or the like.
  • The near-infrared light source may also serve as the light source for wearing-state detection.
  • Each light source is an example of a light irradiation means.
  • The output unit 15 is, for example, a light-emitting element such as an LED, or a speaker that outputs voice or sound, and presents output information from the control unit 11.
  • the output unit 15 outputs the mounting state of the sensor 14 on the head by light or sound.
  • the output unit 15 may be a vibrator that outputs vibration.
  • the output unit 15 may be provided in each sensor 14.
  • The user terminal 2 acquires, from the head-mounted device 1, data on changes in near-infrared absorption or transmission near the user's cerebral cortex, and provides services including various information processing related to the activity state of the user's brain.
  • The user terminal 2 is an example of an information processing apparatus (computer).
  • The user terminal 2 can be realized using a dedicated or general-purpose computer such as a PC (Personal Computer), a smartphone, a mobile phone, a tablet terminal, a car navigation device, a PDA (Personal Digital Assistant), or a game machine (game device), or using an electronic device equipped with a computer. The user terminal 2 can be installed in, for example, a fitness club or a learning school.
  • the user terminal 2 includes a CPU 21, a memory 22, a wireless communication unit 23, a public line communication unit 24, a display unit 25, an operation unit 26, an output unit 27, an imaging unit 28, and a positioning unit 29. And a physical sensor unit 2A.
  • the CPU 21 executes processing as the user terminal 2 by a computer program that is executed in the memory 22 so as to be executable.
  • the processing as the user terminal 2 is, for example, a service including various information processing related to the activity state of the user's brain.
  • the CPU 21 that executes such a computer program is an example of an analysis unit.
  • the memory 22 stores a computer program executed by the CPU 21, data processed by the CPU 21, and data used.
  • the memory 22 may include volatile memory and non-volatile memory.
  • the wireless communication unit 23 is the same as the wireless communication unit 13 of the head-mounted device 1.
  • the wireless communication unit 23 is an example of a receiving unit.
  • the user terminal 2 may include a communication unit that performs wired communication instead of the wireless communication unit 23 or together with the wireless communication unit 23.
  • the public line communication unit 24 communicates with a server on the network N2, for example, the server (arithmetic unit) 3 and the like (not shown) via the network N2.
  • the network N2 is a public line network, for example, a mobile phone network.
  • the public line communication unit 24 connects to the network N2 via a base station of the mobile phone network.
  • the network N2 may be a network including an access network to the communication device of the Internet service provider and the Internet.
  • the access network to the communication device of the Internet connection company is, for example, an optical network provided by a communication carrier, ADSL (Asymmetric Digital Subscriber Line), or the like.
  • the network N2 is an example of a public wireless network.
  • the public line communication unit 24 is an example of public wireless communication means.
  • the network N2 is not limited to the public line network.
  • a private network such as a local network such as a LAN (Local Area Network), a company, an operator, a government office, a school, a research institution, etc.
  • a wide area network such as VPN (Virtual Private Network).
  • companies, businesses, government offices, schools, research institutions, etc. are also referred to as companies.
  • the display unit 25 is, for example, a liquid crystal display, an EL (Electro-Luminescence) panel, and the like, and displays output information from the CPU 21.
  • the operation unit 26 is, for example, a push button, a touch panel, or the like, and accepts a user operation.
  • The output unit 27 is, for example, a vibrator that outputs vibration, a speaker that outputs voice or sound, or the like.
  • the imaging unit 28 is a camera including a solid-state imaging element, for example.
  • As the solid-state image sensor, a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor, or the like can be used.
  • the positioning unit 29 is, for example, a GPS (Global Positioning System) receiver, receives radio waves from GPS satellites, and calculates the current position (latitude, longitude, etc.), time, and the like.
  • the positioning unit 29 is not limited to the one having a GPS receiver.
  • the positioning unit 29 may perform positioning based on the distance from the mobile phone base station.
  • the physical sensor unit 2A is, for example, an acceleration sensor or an angular acceleration sensor.
  • the physical sensor unit 2A may be a temperature sensor, a humidity sensor, an atmospheric pressure sensor, or a water pressure sensor.
  • FIG. 4 is a diagram illustrating an example of the appearance of the head-mounted device.
  • the head-mounted device 1 has six sensors 141 to 146 that are arranged in 2 rows and 3 columns (2 ⁇ 3) and measure blood flow in the head of the subject.
  • the head mounting device 1 is wound around and fixed to the subject's head by, for example, a mounting belt. Although six sensors 14 are used here, the number of sensors is arbitrarily determined depending on the range to be measured.
  • Each sensor 14 includes a columnar housing having a polygonal or circular cross section.
  • A light source or a light-receiving part of the sensor 14 is attached to one end face of the columnar housing, and the output unit 15 is attached to the other end face.
  • the cross-sectional shape includes, for example, a hexagonal shape, an octagonal shape, a square shape, a circular shape, and the like.
  • the head mounting device 1 includes a reference point 1a that can be recognized from the outside.
  • The reference point 1a is provided at a predetermined position of the head mounting device 1.
  • the reference point 1a may be provided in one of the sensors 14.
  • the reference point 1a is indicated by a predetermined picture, character, light, or the like that can be recognized in an image captured by the imaging device 3.
  • the reference point 1a may be indicated by a light blinking pattern.
  • the head mounting device 1 may have a plurality of reference points. When the head-mounted device 1 has a plurality of reference points, predetermined pictures, characters, lights, etc. of the reference points are different from each other.
  • the position of each sensor 14 and the position of the reference point 1a in the head-mounted device 1 are measured in advance. Therefore, when the position of the head-mounted device 1 and the position of the reference point 1a on the subject's head are specified, the position of each sensor 14 on the subject's head is specified.
  • FIG. 5 is a diagram showing an example in which the head mounting device is mounted on the head.
  • FIG. 5 is an image of the head of the subject wearing the head mounting device 1 imaged by the imaging device 3.
  • the left side of the subject's head is shown, and the head mounting device 1 including a plurality of sensors 14 is mounted above the left ear.
  • FIG. 6 is a diagram showing an example of an operation flow for specifying the mounting position of the head mounting device mounted on the head.
  • the user terminal 2 may display information (image or character information) indicating the position where the head mounting device 1 is to be mounted on the display unit 25.
  • the position where the head mounting device 1 should be mounted is stored in advance in the memory 22 of the user terminal 2 or the like.
  • A subject or the like can check the information (image or character information) displayed on the display unit 25 of the user terminal 2 to confirm the position where the head mounting device 1 should be mounted.
  • the operation flow of FIG. 6 is started after the subject wears the head mounting device 1 on the head.
  • the operation flow of FIG. 6 may be started when the test subject inputs information indicating completion of mounting of the head-mounted device 1 through the operation unit 26 of the user terminal 2.
  • the position where the head-mounted device 1 is to be mounted depends on the purpose of measuring cerebral blood flow.
  • the imaging device 3 images the head of the subject wearing the head mounting device 1.
  • the imaging device 3 is directed to the subject's head by the subject or the like. Also, the subject may point his / her head toward the imaging device 3.
  • The head of the subject is imaged at, for example, five locations: the front of the head including the eyes and nose, the left side of the head (for example, FIG. 5), the right side of the head, the top of the head, and the back of the head.
  • a plurality of locations may be imaged by a single imaging device 3, or a plurality of locations may be captured by a plurality of imaging devices 3.
  • the imaging device 3 transmits the captured image to the user terminal 2.
  • the imaging device 3 may image at least one location including the head of the subject and the head mounting device 1.
  • the image captured by the imaging device 3 may be a still image or a moving image.
  • the user terminal 2 receives an image (head image) captured from the imaging device 3.
  • the received image is stored in the memory 22 or the like.
  • the CPU 21 extracts the outline of the head from the head image.
  • the CPU 21 extracts the contour of the head of the subject and the contour of the head mounting device 1 by a known image recognition method.
  • the CPU 21 extracts the position of the specific part of the head (position on the head surface) from the image of the head by a known image recognition method.
  • Specific sites are, for example, the eyes, nose, mouth, ears, shoulders, top of the head, and the like.
  • The CPU 21 may generate an image of the head viewed from an arbitrary direction.
  • Such an image can be generated by specifying the outline of the head and the three-dimensional position of each part from a plurality of captured images of the head.
  • The CPU 21 may then treat the generated image as the head image.
  • The position of each part of the head may be indicated by coordinates set on the surface of the head with reference to the positions of the eyes, nose, and the like.
  • For example, the position of the left eye is taken as the origin, the direction from the left eye to the right eye as the first axis, and the direction from the left eye to the nose as the second axis.
  • Since the first axis and the second axis are not parallel, every position in the image of the head, including the nose, can be expressed in these coordinates.
  • The same can be done by specifying the positions of both shoulders and the top of the head.
  • The position of the top of the head can be taken as, for example, the uppermost position in the outline of the head.
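  • The coordinate convention above can be sketched as follows (a minimal 2D example; the landmark pixel positions are assumed inputs from the image recognition step):

```python
def head_coords(point, left_eye, right_eye, nose):
    """Express an image point in the head-surface frame described above:
    origin at the left eye, first axis toward the right eye, second axis
    toward the nose. The axes need not be orthogonal, only non-parallel."""
    ax1 = (right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])
    ax2 = (nose[0] - left_eye[0], nose[1] - left_eye[1])
    px = point[0] - left_eye[0]
    py = point[1] - left_eye[1]
    det = ax1[0] * ax2[1] - ax1[1] * ax2[0]  # non-zero iff axes non-parallel
    # Solve u*ax1 + v*ax2 = p by Cramer's rule.
    u = (px * ax2[1] - py * ax2[0]) / det
    v = (ax1[0] * py - ax1[1] * px) / det
    return u, v

# In this frame the right eye always sits at (1, 0) and the nose at (0, 1),
# regardless of head pose in the image.
```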
  • the CPU 21 extracts the head mounting device 1 mounted on the subject's head and the reference point 1a of the head mounting device 1 from the head image.
  • The CPU 21 compares, for example, an image of the head-mounted device 1 stored in advance in the memory 22 or the like with the captured image of the head, and extracts the portion of the head image that matches the stored device image as the head-mounted device 1. Similarly, the CPU 21 extracts from the head image, by a known image recognition method or the like, a picture, character, light, or the like matching the predetermined picture, character, light, or the like of the reference point 1a, and recognizes it as the reference point 1a.
  • the CPU 21 specifies the position of the head-mounted device 1 and the position of the reference point 1a.
  • the CPU 21 specifies the position of the head-mounted device 1 and the position of the reference point 1a as relative positions based on the positions of the eyes and nose of the head.
  • The CPU 21 may specify the three-dimensional positions of the head-mounted device 1 and of the reference point 1a from the head-mounted device 1 included in the head image. The position may be represented by two-dimensional coordinates based on the positions of the eyes, nose, etc. in a single head image, or by three-dimensional coordinates based on the three-dimensional positions of the eyes, nose, etc. identified from a plurality of head images.
  • the CPU 21 stores the identified position of the head-mounted device 1 and the position of the reference point 1a in the memory 22 or the like. It is desirable for the imaging device 3 to image the head-mounted device 1 at a position perpendicular to the reference point 1a.
  • This is because, if the head-mounted device 1 is imaged from a position deviating from the perpendicular to the reference point 1a, the position on the head actually measured by the head-mounted device 1 and the position calculated from the image captured by the imaging device 3 are displaced from each other. If the image is taken from a position perpendicular to the reference point 1a, the position on the head can be recognized more accurately.
  • When it is determined that the imaging device 3 is at a position different from the perpendicular to the reference point 1a, a message may be displayed prompting the subject to image the head-mounted device 1 again so that the imaging device 3 captures the image from a position perpendicular to the reference point 1a.
  • the CPU 21 may generate an image of the head obtained by superimposing the image of the head mounted device 1 mounted on the head on the image of the head of the subject, and display the image on the display unit 25. Further, the CPU 21 can specify the position of each sensor 14 from the position of the head-mounted device 1 and the position of the reference point 1a.
  • By using the reference point 1a, the position of each sensor 14 can be specified even when the shape of the head-mounted device 1 is symmetric (line symmetry, rotational symmetry).
  • Here, the position of each sensor 14 is specified from the position of the head-mounted device 1 and the position of the reference point 1a, but it may instead be specified from the positions of two reference points provided on the head-mounted device 1.
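  • As a sketch of how the sensor positions could follow from the reference point: the description states only that each sensor's position relative to the reference point 1a is measured in advance, so the offsets, pitch, and orientation angle below are hypothetical:

```python
import math

# Hypothetical pre-measured offsets of each sensor from the reference
# point 1a, in device-local millimetres (2 rows x 3 columns, 20 mm pitch).
SENSOR_OFFSETS = {f"sensor{r}{c}": (c * 20.0, r * 20.0)
                  for r in range(2) for c in range(3)}

def sensor_positions(ref_xy, angle_deg, offsets=SENSOR_OFFSETS):
    """Place every sensor in head-image coordinates: rotate its
    pre-measured offset by the device orientation inferred from the
    reference point, then translate by the reference point's position."""
    th = math.radians(angle_deg)
    c, s = math.cos(th), math.sin(th)
    return {name: (ref_xy[0] + dx * c - dy * s,
                   ref_xy[1] + dx * s + dy * c)
            for name, (dx, dy) in offsets.items()}
```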
  • The CPU 21 may display an error on the display unit 25 when the position of the head mounting device 1 and the position of the reference point 1a cannot be specified in the image of the head. In this case, the operation flow in FIG. 6 ends on the assumption that the head-mounted device 1 is not worn.
  • the CPU 21 compares the position where the head mounting device 1 is to be mounted with the position where the identified head mounting device 1 is mounted.
  • the position where the head mounting device 1 is to be mounted is stored in advance in the memory 22 or the like.
  • When the two positions match, the CPU 21 displays on the display unit 25 an indication that the head mounting device 1 is mounted at an appropriate position. Further, the CPU 21 instructs the head mounting device 1 to start measurement.
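  • The comparison between the stored target position and the identified position can be sketched as a simple distance check (the tolerance value and the message strings are assumptions, not from the description):

```python
def placement_ok(target_xy, measured_xy, tol=5.0):
    """True if the identified mounting position lies within `tol`
    (in the same units as the coordinates) of the stored target."""
    dx = measured_xy[0] - target_xy[0]
    dy = measured_xy[1] - target_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= tol

def placement_message(target_xy, measured_xy):
    """Pick the indication to show on the display unit 25."""
    if placement_ok(target_xy, measured_xy):
        return "mounted at the appropriate position; starting measurement"
    return "please re-mount the device at the indicated position"
```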
  • the head-mounted device 1 starts measuring cerebral blood flow.
  • The user terminal may display on the display unit 25 an image in which the measurement result of the cerebral blood flow by each sensor 14 of the head-mounted device 1 is superimposed on the image of the subject's head at the position of each sensor 14.
  • the subject or the like can recognize the measurement result of the cerebral blood flow in association with the measured position of the head by the display.
  • FIG. 7 is a diagram showing an example in which the measurement result is superimposed on the image of the head of the subject.
  • the example of FIG. 7 includes the measurement result when the head-mounted device 1 is mounted on the subject's head and the cerebral blood flow is measured as shown in FIG.
  • the left side of the subject's head is shown, and the measurement result of cerebral blood flow is superimposed and displayed at the position where the head-mounted device 1 that performed the measurement is mounted.
  • The user terminal 2 may generate an image in which the measurement result of cerebral blood flow is superimposed on an image of the head viewed from an arbitrary direction, generated by specifying the outline of the head and the three-dimensional position of each part from a plurality of captured images of the head.
  • the position where the measurement result of the cerebral blood flow is superimposed is a position in the image of the head corresponding to the position where the head mounting device 1 is mounted.
  • the region indicating the measurement result may not coincide with the region where the head-mounted device 1 is worn. Further, the area indicating the measurement result may be smaller or larger than the area where the head-mounted device 1 is worn.
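  • Superimposing a measurement result on the head image can be sketched as alpha-blending a colour-mapped patch at the computed position (a minimal grayscale example; the patch size and blending weight are assumptions):

```python
def overlay_result(head_img, top, left, patch, alpha=0.5):
    """Alpha-blend a measurement patch onto a grayscale head image
    (given as a list of pixel rows) at the pixel position computed for
    the mounted device; pixels outside the patch are left untouched."""
    out = [row[:] for row in head_img]
    for i, patch_row in enumerate(patch):
        for j, p in enumerate(patch_row):
            y, x = top + i, left + j
            out[y][x] = round((1 - alpha) * out[y][x] + alpha * p)
    return out
```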
  • FIG. 8 is a diagram showing an example in which the measurement result is superimposed on the simulated image of the head.
  • In FIG. 8, an image is displayed in which the measurement result for the right side of the head and the measurement result for the forehead are superimposed on a simulated image of the head viewed from above.
  • The simulated image of the head viewed from above includes, for example, both eyes, the nose, and the ears. Including the eyes and nose in the simulated image makes it easier to recognize positions on the head.
  • When the head-mounted device 1 is mounted on the top, back, or left side of the head and the cerebral blood flow is measured, an image in which the measurement result is superimposed on the simulated image of the head can be generated in the same way.
  • the CPU 21 indicates that the head mounting device 1 is appropriate for the display unit 25. Display that it is not attached to the position. Furthermore, the user terminal 2 may display on the display unit 25 a prompt that prompts the subject to wear the head-mounted device 1 at an appropriate position. The subject confirms the display on the display unit 25 and again puts the head-mounted device 1 in an appropriate position. At this time, the CPU 21 may display on the display unit 25 the position where the head mounting device 1 is currently mounted and the position where the head mounting device 1 should be mounted. After the head mounting device 1 is mounted, the operation flow of FIG. 6 is started again.
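The superimposed display described above can be sketched as a simple alpha blend of a color-mapped measurement value over the pixels covered by the device. The following is an illustrative sketch only, not the patent's implementation: the pure-Python image representation, the function name, and the blue-to-red color ramp are all assumptions made for the example.

```python
def overlay_measurement(head_image, region, value, vmin, vmax, alpha=0.5):
    """Alpha-blend a color-mapped measurement value onto a head image.

    head_image: list of rows of (r, g, b) tuples
    region:     set of (row, col) pixel coordinates covered by the device
    value:      scalar cerebral-blood-flow measurement for that region
    """
    # Map the value to a simple blue -> red ramp (a stand-in colormap).
    t = min(max((value - vmin) / (vmax - vmin), 0.0), 1.0)
    color = (int(255 * t), 0, int(255 * (1 - t)))

    # Blend the color into a copy of the image, leaving the input untouched.
    out = [row[:] for row in head_image]
    for (r, c) in region:
        old = out[r][c]
        out[r][c] = tuple(
            int((1 - alpha) * o + alpha * n) for o, n in zip(old, color)
        )
    return out
```

The region passed in would be the head-image area identified as the mounting position of the head-mounted device 1, as described in the text.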
  • The above configurations can be combined wherever possible.
  • There may be a plurality of head-mounted devices 1 to be mounted. When a plurality of head-mounted devices 1 are mounted, the mounting position is specified for each sensor of each head-mounted device 1.
  • When the head-mounted device 1 is mounted on a part other than the head, such as an arm or a leg, the position can be specified in the same manner as in the above example.
  • In that case, the user terminal 2 can specify the position of each sensor using, for example, the position of a joint or a finger instead of the position of the eyes or nose of the head.
  • The reference point 1a may be provided on any sensor 14 of the head-mounted device 1.
  • The head-mounted device 1 may be provided with a plurality of reference points 1a.
  • The output unit 15 may serve the function of the reference point 1a.
  • FIG. 9 is a diagram showing an example of the appearance of the head-mounted device.
  • The head-mounted device 1 in FIG. 9 has fifteen sensors 14 that measure blood flow in the subject's head, arranged at the lattice points of a triangular lattice on the surface of a sheet-like member.
  • Some of the sensors 14 are referred to as sensor 141, sensor 142, and sensor 143.
  • The head-mounted device 1 is wound around and fixed to the subject's head by, for example, a mounting belt.
  • The sheet-like member can be deformed into a curved shape in accordance with the shape of the subject's head.
  • The initial state of the sheet-like member is assumed to be planar.
  • The sheet-like member may instead be curved in advance according to the shape of the subject's head.
  • Each sensor 14 in FIG. 9 includes a columnar housing with a hexagonal cross section.
  • The hexagon of the sensor 14 in FIG. 9 is such that opposite sides are parallel and of equal length.
  • The length of one pair of opposite sides of the hexagon is length a,
  • the length of another pair of opposite sides is length b,
  • and the length of the remaining pair of opposite sides is length c.
  • The light-emitting part (light source) or light-receiving part of the sensor 14 is attached to one end face of the columnar housing, and the output unit 15 is attached to the other end face.
  • The light-emitting part or light-receiving part of the sensor 14 is arranged so as to contact the subject's head.
  • The distance from the center (light-emitting part or light-receiving part) of a sensor 14 to the center of an adjacent sensor 14 is 2 × S2.
  • The midpoint of the line segment connecting the center of a sensor 14 and the center of an adjacent sensor 14 is referred to as an intermediate point.
  • The distance between the center of a sensor 14 and the adjacent intermediate point is S2.
  • The height of the hexagonal columnar housing is S1.
  • FIG. 10 is a diagram showing an example of the columnar housing of a sensor of the head-mounted device of FIG. 9.
  • The sensor 14 is a columnar housing with a hexagonal cross section; a light-emitting unit or a light-receiving unit is provided on one hexagonal face,
  • and the output unit is provided on the other hexagonal face.
  • The face on which the light-emitting unit or light-receiving unit of the sensor 14 is provided is the side that contacts the subject's head,
  • and the face on which the output unit 15 of the sensor 14 is provided faces outward.
  • The length of two of the sides is length a,
  • the length of another two sides is length b,
  • and the length of the remaining two sides is length c.
  • The user terminal 2 acquires a captured image of the head (head image) from the imaging device 3.
  • The received image is stored in the memory 22 or the like.
  • The CPU 21 extracts the outline of the head from the head image.
  • The CPU 21 extracts the contour of the subject's head and the contour of the head-mounted device 1 by a known image recognition method. Furthermore, the CPU 21 extracts the positions of specific parts of the head (positions on the head surface) from the head image by a known image recognition method.
  • Specific parts are, for example, the eyes, nose, mouth, ears, shoulders, and top of the head.
  • The CPU 21 extracts the sensors 14 included in the head-mounted device 1 extracted from the head image.
  • The CPU 21 extracts the sensors 14 based on the shape, color, predetermined marks, and the like of the sensors 14 given in advance.
  • The CPU 21 then extracts, from the extracted sensors 14, a sensor 14 facing the front with respect to the imaging device 3.
  • A sensor 14 facing the front is a sensor 14 whose outer hexagonal face is orthogonal to the optical axis of the imaging device 3.
  • Because the head-mounted device 1 is curved along the head, the outer face of a sensor 14 may not face the front.
  • The three-dimensional positions of the other sensors 14 are specified with the front-facing sensor 14 as the reference.
  • Whether or not a sensor 14 faces the front is determined by whether or not the ratio of three adjacent sides of its hexagon matches a : b : c.
  • A sensor 14 in which the ratio of three adjacent sides of the hexagon matches a : b : c faces the front with respect to the imaging device 3.
  • A sensor 14 in which the ratio of three adjacent sides of the hexagon does not match a : b : c does not face the front with respect to the imaging device 3.
  • FIG. 11 is a diagram illustrating an example of a sensor 14 facing the front. Since the outer hexagonal face of the sensor 14 in FIG. 11 is orthogonal to the optical axis of the imaging device 3, the ratio of three adjacent sides of the hexagon (length a′, length b′, length c′) matches a : b : c.
  • FIG. 12 is a diagram illustrating an example of a sensor 14 not facing the front. Since the outer hexagonal face of the sensor 14 in FIG. 12 is not orthogonal to the optical axis of the imaging device 3, the ratio of three adjacent sides of the hexagon (length a″, length b″, length c″) does not match a : b : c.
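The front-facing test of FIGS. 11 and 12 amounts to comparing the measured side ratio against the known ratio a : b : c. A minimal sketch, assuming the three adjacent side lengths have already been measured in pixels; the tolerance `tol` is an assumption of this example, since the patent does not specify how exact the match must be.

```python
def faces_front(side_a, side_b, side_c, a, b, c, tol=0.05):
    """Judge whether a sensor's hexagonal face faces the camera.

    side_a, side_b, side_c: lengths (in pixels) of three adjacent hexagon
                            sides measured in the image
    a, b, c:                the known physical lengths of those sides
    When the face is orthogonal to the optical axis, the measured ratio
    side_a : side_b : side_c matches a : b : c; foreshortening breaks it.
    """
    # Normalize both triples by their first element and compare ratios
    # within a relative tolerance.
    rb, rc = side_b / side_a, side_c / side_a
    kb, kc = b / a, c / a
    return abs(rb - kb) <= tol * kb and abs(rc - kc) <= tol * kc
```

In the flow above, the CPU 21 would apply such a test to each extracted sensor and take a passing sensor as the reference for the other sensors' positions.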
  • The CPU 21 performs scaling between the number of pixels in the image and the actual length by using the length of one side of a sensor 14 determined to be facing the front. For example, when the side of length a of a front-facing sensor 14 spans P pixels in the image, one pixel in the image is taken to correspond to an actual length of a / P.
  • The length per pixel may be treated as the same at any position in the image,
  • or the actual length per pixel may be determined in consideration of the depth of the image.
  • The CPU 21 thereby converts lengths in the image into actual lengths.
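The pixel-to-length scaling just described can be written out as follows. The function names are illustrative only; the arithmetic (one pixel corresponds to a / P of actual length) follows the text directly, under the stated simplification that the scale is uniform across the image.

```python
def pixel_scale(side_length_a, pixels_p):
    """Actual length represented by one image pixel.

    side_length_a: known physical length of side a of a front-facing sensor
    pixels_p:      number of pixels that side spans in the image
    """
    return side_length_a / pixels_p

def to_actual_length(pixel_distance, scale):
    """Convert a distance measured in pixels to an actual length."""
    return pixel_distance * scale
```

For instance, if side a is 10 mm long and spans 50 pixels, each pixel corresponds to 0.2 mm, and an in-image distance of 25 pixels corresponds to 5 mm.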
  • FIG. 13 is a diagram showing an example of a cross section of the head-mounted device when it is not mounted on the head.
  • FIG. 13 shows a cross section passing through the centers of the sensor 141, the sensor 142, and the sensor 143 of the head-mounted device 1 of FIG. 9.
  • The left side of FIG. 13 is the head side, the right side is the outside (the side where the imaging device 3 exists), and the Z direction in FIG. 13 is the direction of the optical axis of the imaging device 3.
  • The A1 direction is one of the directions orthogonal to the Z direction.
  • The A1 direction in FIG. 13 is the same as the A1 direction in FIG. 9.
  • The height of the sensor 14 is S1,
  • and the distance between a sensor 14 and the intermediate point is S2.
  • The distance between the centers of adjacent sensors 14 is 2 × S2.
  • The distance between the center of the sensor 141 and the center of the sensor 142 obtained from the image is 2 × S2,
  • and the distance between the center of the sensor 142 and the center of the sensor 143 obtained from the image is also 2 × S2.
  • The Z-direction coordinates of the sensors 142 and 143 are the same as the Z-direction coordinate of the sensor 141.
  • When the head-mounted device 1 is mounted on the head, it bends at each intermediate point and deforms along the head.
  • FIG. 14 is a diagram showing an example of a cross section of the head-mounted device when mounted on the head.
  • FIG. 14 shows a cross section passing through the centers of the sensor 141, the sensor 142, and the sensor 143 of the head-mounted device 1 of FIG. 9.
  • The left side of FIG. 14 is the head side, the right side is the outside (the side where the imaging device 3 exists), and the Z direction in FIG. 14 is the direction of the optical axis of the imaging device 3.
  • The face of the sensor 141 is assumed to face the front with respect to the imaging device 3, that is, to be orthogonal to the optical axis of the imaging device 3.
  • The distance between the center of the sensor 141 and the center of the sensor 142, obtained from the captured image of the head, is L1,
  • and the distance between the center of the sensor 142 and the center of the sensor 143 is L2. The sensor 142 is assumed to be bent at an angle θ with respect to the sensor 141 at the intermediate point between the sensor 142 and the sensor 141.
  • The sensor 143 is assumed to be bent at an angle φ with respect to the sensor 142 at the intermediate point between the sensor 143 and the sensor 142.
  • The CPU 21 of the user terminal 2 may obtain the center of the face of a sensor 14 from the positions of the sides of the hexagon on the face of the sensor 14; alternatively, the output unit 15 may be provided in advance at the center of the face of the sensor 14, and the position of the output unit 15 may be taken as the hexagon's center.
  • From these quantities, the CPU 21 can obtain the position of the light-emitting part or light-receiving part of each sensor.
  • The position of the light-emitting part or light-receiving part of the sensor 142 is the position moved by S2 sin θ in the Z direction and by S2 + S2 cos θ in the A1 direction with respect to the position of the light-emitting part or light-receiving part of the sensor 141.
  • The position of the light-emitting part or light-receiving part of the sensor 143 is the position moved by 2 × S2 sin θ + S2 sin(θ + φ) in the Z direction and by S2 + 2 × S2 cos θ + S2 cos(θ + φ) in the A1 direction with respect to the position of the light-emitting part or light-receiving part of the sensor 141.
  • Based on the image of the head, the CPU 21 can likewise obtain, for the other sensors 14 in the A2 direction and the A3 direction as viewed from the sensor 141 in FIG. 9, their positions relative to the light-emitting part or light-receiving part of the sensor 141.
  • The CPU 21 can thus calculate the relative positions of all the sensors 14 (their positions with respect to the light-emitting part or light-receiving part of the sensor 141). Furthermore, by comparing the position of the subject's head with the positions of the sensors 14, the CPU 21 can recognize at which position of the head each sensor 14 is mounted.
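The offset formulas for the sensors 142 and 143, and one way of recovering the bend angles θ and φ from the in-image distances L1 and L2, can be sketched as follows. The offset expressions follow the text directly; the angle-recovery step is our reading of the projection geometry (L1 and L2 taken as projections onto the A1 direction, so L1 = S2 + S2 cos θ and L2 = S2 cos θ + S2 cos(θ + φ)) and is an assumption, not stated in the patent.

```python
import math

def sensor_offsets(S2, theta, phi):
    """Offsets of sensors 142 and 143 relative to sensor 141 (cf. FIG. 14).

    S2:    distance from a sensor center to the adjacent intermediate point
    theta: bend angle of sensor 142 relative to sensor 141 (radians)
    phi:   bend angle of sensor 143 relative to sensor 142 (radians)
    Returns ((z142, a142), (z143, a143)) following the formulas in the text.
    """
    z142 = S2 * math.sin(theta)
    a142 = S2 + S2 * math.cos(theta)
    z143 = 2 * S2 * math.sin(theta) + S2 * math.sin(theta + phi)
    a143 = S2 + 2 * S2 * math.cos(theta) + S2 * math.cos(theta + phi)
    return (z142, a142), (z143, a143)

def bend_angles(S2, L1, L2):
    """Recover theta and phi from the in-image distances L1 and L2.

    Assumes L1 = S2 + S2*cos(theta) and L2 = S2*cos(theta) + S2*cos(theta+phi)
    (our reading of the geometry, not stated in the text).
    """
    theta = math.acos(L1 / S2 - 1.0)
    phi = math.acos(L2 / S2 - math.cos(theta)) - theta
    return theta, phi
```

In the flat configuration of FIG. 13 (θ = φ = 0), the offsets reduce to the 2 × S2 spacing between adjacent sensors, as expected.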
  • FIG. 15 is a diagram showing another example of the appearance of the head-mounted device.
  • The head-mounted device 1 shown in FIG. 15 has fifteen sensors 14 that measure the blood flow of the subject's head, arranged at the lattice points of an orthogonal lattice on the surface of a sheet-like member.
  • The relative positions of these sensors 14 can be obtained in the same manner as described above.
  • The relative positions of all the sensors 14 can be obtained by obtaining the relative positions in the B1 direction and the B2 direction.
  • The imaging device 3 of the biological measurement device 5 images the head including the head-mounted device 1 mounted on the subject's head.
  • The user terminal 2 specifies, based on the captured image of the head, the position on the head where the head-mounted device 1 is mounted.
  • The user terminal 2 also specifies the position of each sensor 14 of the head-mounted device 1.
  • In this way, the biological measurement device 5 identifies the position where the head-mounted device 1 is mounted by photographing the head on which it is mounted, and can identify the mounting position of each sensor 14 included in the head-mounted device 1. Since the mounting position of each sensor 14 can be specified, the cerebral blood flow measurement result of each sensor 14 can easily be associated with its mounting position.


Abstract

Provided is a biometric measurement apparatus capable of identifying the positions where sensors are mounted. This biometric measurement apparatus has: a sensor device mounted on a subject and having a plurality of sensors for measuring a physical quantity of the subject; an imaging device for imaging the sensor device and the subject; and a user terminal that calculates the mounting positions of the sensors of the sensor device with respect to the subject from an image captured by the imaging device.

Description

Biological measurement device, information processing program, and biological measurement method
The present invention relates to a biological measurement device, an information processing program, and a biological measurement method.
There is a brain activity measurement system having a brain activity measurement device and a control unit. The brain activity measurement device has a wearing tool mounted on the subject's head, a plurality of sensors arranged on the inner side of the wearing tool that measure the subject's blood flow and brain waves, and a plurality of display units arranged on the outer side of the wearing tool. The control unit measures, via the sensors, the blood flow and brain waves in each region of the brain accompanying the subject's brain activity, stores the measurement data (measured values) in a memory, determines the activity level at each measurement point of each region based on the measurement data, and controls the display color and blinking speed of the display unit corresponding to each measurement point according to the determination result (Patent Document 1).
JP 2012-161375 A
When measuring the blood flow and brain waves in each region of the brain accompanying the subject's brain activity, there are cases where the blood flow of the entire brain is measured and cases where it suffices to measure a specific part of the brain without measuring the entire brain. The brain activity measurement system described in Patent Document 1 measures the blood flow of the entire brain, with sensors arranged over the entire head. It is therefore a large-scale apparatus when only a specific part of the brain needs to be measured.
If the blood flow or brain waves of a specific part of the brain are to be measured, it is desirable to miniaturize the biological measurement device to a size that can measure that specific part. However, when the device is miniaturized, it becomes unclear on which part of the head it is mounted, so it is necessary to specify the position where the device is mounted (the position of the sensors on the subject's head).
An object of the present invention is to provide a biological measurement device capable of specifying the position where a sensor is mounted.
In order to solve the above problems, the following means are adopted.
That is, the first aspect is a biological measurement device having:
a sensor device that is mounted on a subject and has a plurality of sensors for measuring a physical quantity of the subject;
an imaging device that images the sensor device and the subject; and
a user terminal that calculates, from an image captured by the imaging device, the mounting positions of the sensors of the sensor device with respect to the subject.
According to the first aspect, the biological measurement device calculates the position of the sensor device mounted on the subject from an image of the subject.
The disclosed aspect may be realized by a program executed by an information processing apparatus. That is, the disclosed configuration can be specified as a program for causing an information processing apparatus to execute the processing executed by each means in the above aspect, or as a computer-readable recording medium on which the program is recorded. The disclosed configuration may also be specified as a method by which an information processing apparatus executes the processing executed by each of the above means, or as a system including an information processing apparatus that performs that processing.
According to the present invention, a biological measurement device capable of specifying the position where a sensor is mounted can be provided.
FIG. 1 is a diagram illustrating a configuration example of a biological measurement device according to an embodiment of the present invention. FIG. 2 is a diagram illustrating a configuration example of the head-mounted device. FIG. 3 is a diagram illustrating a configuration example of the user terminal. FIG. 4 is a diagram illustrating the appearance of the head-mounted device. FIG. 5 is a diagram illustrating an example in which the head-mounted device is mounted on the head. FIG. 6 is a diagram illustrating an example of an operation flow for recognizing the mounting position of the head-mounted device mounted on the head. FIG. 7 is a diagram illustrating an example in which measurement results are superimposed on an image of the subject's head. FIG. 8 is a diagram illustrating an example in which measurement results are superimposed on a simulated image of the head. FIG. 9 is a diagram illustrating an example of the appearance of the head-mounted device. FIG. 10 is a diagram illustrating an example of the columnar housing of a sensor of the head-mounted device of FIG. 9. FIG. 11 is a diagram illustrating an example of a sensor 14 facing the front. FIG. 12 is a diagram illustrating an example of a sensor 14 not facing the front. FIG. 13 is a diagram illustrating an example of a cross section of the head-mounted device when not mounted on the head. FIG. 14 is a diagram illustrating an example of a cross section of the head-mounted device when mounted on the head. FIG. 15 is a diagram illustrating an example of the appearance of the head-mounted device.
Hereinafter, embodiments will be described with reference to the drawings. The configuration of each embodiment is an example, and the configuration of the invention is not limited to the specific configurations of the disclosed embodiments. In carrying out the invention, specific configurations according to the embodiments may be adopted as appropriate.
Embodiment
(Configuration example)
FIG. 1 is a diagram illustrating a configuration example of a biological measurement device according to an embodiment of the present invention. As shown in FIG. 1, the biological measurement device 5 has a head-mounted device 1 that is mounted on the subject's head and measures the blood flow of the head, a user terminal 2 that specifies the mounting position of the head-mounted device 1 and displays the result, and an imaging device 3 that captures an image showing at which position on the head the head-mounted device 1 is mounted. The head-mounted device 1 and the user terminal 2 are described in detail with reference to FIGS. 2 and 3. The head-mounted device 1 and the user terminal 2 may be integrated, and the user terminal 2 and the imaging device 3 may be integrated. Blood flow is an example of a physical quantity of the subject.
The imaging device 3 is connected to the user terminal 2 by wire or wirelessly, and captured image data is transferred to the user terminal 2. The imaging device 3 is an imaging means such as a CCD (charge-coupled device) camera or an image pickup tube. The imaging device 3 may be of any size as long as it can image the head. A single imaging device 3 may be used to image multiple locations on the head, or a plurality of imaging devices 3 may be used to image multiple locations on the head at once.
The biological measurement device 5 detects measurement data (also referred to as detection values) indicating changes in blood flow from the user's head, and acquires brain activity information indicating the activity state of the user's brain. The biological measurement device 5 is an example of an optical biological measurement device. The user is an example of a subject.
FIG. 2 is a diagram illustrating a configuration example of the head-mounted device. In terms of information processing, the head-mounted device 1 has a control unit 11, a wireless communication unit 13, sensors 141 to 146, and an output unit 15. When the sensors 141 to 146 are not distinguished, they are collectively referred to as sensors 14. The control unit 11 controls measurement and communication of the head-mounted device 1. The control unit 11 has, for example, a processor such as a CPU (Central Processing Unit) or DSP (Digital Signal Processor) and a memory, and executes processing using a computer program, firmware, or the like loaded into the memory so as to be executable. However, the control unit 11 may be a dedicated hardware circuit, an FPGA (Field Programmable Gate Array), or the like that activates the wireless communication unit 13, the sensors 14, and the output unit 15 and executes cooperative processing with each component. The control unit 11 may also be a mixture of a CPU, a DSP, dedicated hardware circuits, and the like. Here, the head-mounted device 1 has six sensors 141 to 146, but it may have more than six sensors 14 or fewer than six. The head-mounted device 1 is an example of a sensor device.
The wireless communication unit 13 is connected to the control unit 11, the sensors 14, and the output unit 15 through predetermined interfaces. However, the wireless communication unit 13 may be configured to acquire data from the sensors 14 via the control unit 11. The wireless communication unit 13 communicates with the user terminal 2 via the network N1. The network N1 is, for example, a network conforming to standards such as Bluetooth (registered trademark), wireless LAN (Local Area Network), or ZigBee (registered trademark). The wireless communication unit 13 is an example of a transfer means. However, in this information processing system, the standard of the wireless interface of the wireless communication unit 13 is not limited.
When there are two head-mounted devices 1-1 and 1-2 (not shown), an identifier identifying the head-mounted devices 1-1 and 1-2 is embedded in the header portion of the communication header or in the user data portion (payload portion) of the communication data during communication on the network N1, so that the user terminal 2 can identify the user (subject).
In this information processing system, a communication unit that performs wired communication may be provided instead of, or together with, the wireless communication unit 13. That is, the head-mounted device 1 and the user terminal 2 may be connected by a wired communication interface. The wired communication interface in this case is not limited, and various interfaces such as USB (Universal Serial Bus) and PCI Express can be used according to the application of the information processing system.
The sensor 14 irradiates the head with near-infrared light, receives near-infrared light that has been partially absorbed and scattered near the cerebral cortex of the brain, and converts it into an electrical signal. The blood flow in the cerebral cortex varies, for example, with the activity state of the brain. As a result, in each part of the cerebral cortex, the amount of hemoglobin bound to oxygen in the blood and the amount of hemoglobin not bound to oxygen change. Due to changes in the amount of hemoglobin, the amount of oxygen, and so on, the near-infrared absorption or scattering characteristics near the cerebral cortex change. The sensor 14 converts into an electrical signal the near-infrared light whose intensity changes with the change in near-infrared absorptance or transmittance according to the state of blood flow near the cerebral cortex, and outputs it. Each sensor 14 is identified, for example, by an identifier.
The sensor 14 includes, for example, a near-infrared light source that emits near-infrared light and a light-receiving unit that receives near-infrared light. The near-infrared light source is, for example, an LED (Light Emitting Diode) or an infrared lamp. The light-receiving unit includes a photoelectric element such as a photodiode or phototransistor, an amplifier, and an AD (Analog-to-Digital) converter. The near-infrared light source and the light-receiving unit need not be provided in pairs; for example, a plurality of light-receiving units may be provided for one near-infrared light source. The light-receiving unit is an example of a light detection means. The sensor 14 may further have a light source for detecting the wearing state, for example an LED. When the sensor 14 does not have a light source for wearing-state detection, the near-infrared light source is used instead. Each light source is an example of a light irradiation means.
 The output unit 15 is, for example, a light-emitting element such as an LED, or a speaker that outputs sound or voice, and presents output information from the control unit 11. The output unit 15 reports the mounting state of the sensors 14 on the head by light or sound. The output unit 15 may also be a vibrator that outputs vibration. An output unit 15 may be provided for each sensor 14.
 The user terminal 2 acquires, from the head-mounted device 1, data on changes in near-infrared absorptance or transmittance in the vicinity of the user's cerebral cortex, and provides services including various kinds of information processing related to the activity state of the user's brain. The user terminal 2 is an example of an information processing apparatus (computer). The user terminal 2 can be realized using a dedicated or general-purpose computer such as a PC (Personal Computer), smartphone, mobile phone, tablet terminal, car navigation device, PDA (Personal Digital Assistant), or game machine (amusement device), or using an electronic device equipped with a computer. The user terminal 2 may be installed in, for example, a fitness club or a tutoring school.
 The user terminal 2 includes a CPU 21, a memory 22, a wireless communication unit 23, a public line communication unit 24, a display unit 25, an operation unit 26, an output unit 27, an imaging unit 28, a positioning unit 29, and a physical sensor unit 2A. The CPU 21 executes the processing of the user terminal 2 according to a computer program loaded into the memory 22 in executable form. The processing of the user terminal 2 is, for example, a service including the various information processing related to the activity state of the user's brain mentioned above. The CPU 21 executing such a computer program is an example of an analysis means.
 The memory 22 stores the computer program executed by the CPU 21 and the data that the CPU 21 processes and uses. The memory 22 may include volatile memory and nonvolatile memory.
 The wireless communication unit 23 is the same as the wireless communication unit 13 of the head-mounted device 1. The wireless communication unit 23 is an example of a receiving means. The user terminal 2 may also include a communication unit that performs wired communication instead of, or in addition to, the wireless communication unit 23.
 The public line communication unit 24 communicates via a network N2 with a server on the network N2, for example the server (arithmetic device) 3 or the like (not shown). The network N2 is a public line network, for example a mobile phone network. When the network N2 is a mobile phone network, the public line communication unit 24 connects to the network N2 via a base station of the mobile phone network. However, the network N2 may also be a network including the Internet and an access network to the communication equipment of an Internet service provider. The access network to the communication equipment of an Internet service provider is, for example, an optical network or ADSL (Asymmetric Digital Subscriber Line) service provided by a telecommunications carrier. The network N2 is an example of a public wireless network, and the public line communication unit 24 is an example of a public wireless communication means. In this information processing system, however, the network N2 is not limited to a public line network; it may be, for example, a local network such as a LAN (Local Area Network), a dedicated line of a company, business operator, government office, school, research institution, or the like, or a wide area network such as a VPN (Virtual Private Network). Hereinafter, companies, business operators, government offices, schools, research institutions, and the like are also referred to simply as companies.
 The display unit 25 is, for example, a liquid crystal display or an EL (Electro-Luminescence) panel, and displays output information from the CPU 21. The operation unit 26 is, for example, a push button or a touch panel, and accepts user operations. The output unit 27 is, for example, a vibrator that outputs vibration or a speaker that outputs sound or voice. The imaging unit 28 is, for example, a camera including a solid-state imaging element. As the solid-state imaging element, a CCD (Charge-Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like can be used.
 The positioning unit 29 is, for example, a GPS (Global Positioning System) receiver; it receives radio waves from GPS satellites and calculates the current position (latitude, longitude, etc.), the time, and so on. However, the positioning unit 29 is not limited to one having a GPS receiver. For example, when the public line communication unit 24 connects to a mobile phone network, the positioning unit 29 may perform positioning based on distances from mobile phone base stations.
 The physical sensor unit 2A is, for example, an acceleration sensor or an angular acceleration sensor. The physical sensor unit 2A may also be a temperature sensor, a humidity sensor, an atmospheric pressure sensor, or a water pressure sensor.
 <Example of head-mounted device>
 FIG. 4 is a diagram illustrating an example of the appearance of the head-mounted device. The head-mounted device 1 has six sensors 141 to 146, arranged in 2 rows and 3 columns (2 × 3), that measure blood flow in the head of the subject. The head-mounted device 1 is wrapped around and fixed to the subject's head by, for example, a mounting belt. Although six sensors 14 are used here, the number of sensors can be chosen arbitrarily according to the range to be measured.
 Each sensor 14 includes a columnar housing whose cross section is polygonal or circular. The light source or light-receiving unit is attached to one end face of the columnar housing, and the output unit 15 to the other end face. The cross-sectional shape may be, for example, hexagonal, octagonal, square, or circular.
 The head-mounted device 1 includes a reference point 1a that can be recognized from the outside. The reference point 1a is provided at a predetermined position on the head-mounted device 1, and may be provided on one of the sensors 14. The reference point 1a is indicated by a predetermined picture, character, light, or the like that can be recognized in an image captured by the imaging device 3; it may also be indicated by a blinking light pattern. The head-mounted device 1 may have a plurality of reference points; in that case, the pictures, characters, lights, etc. of the reference points are made distinct from one another. The position of each sensor 14 and of the reference point 1a on the head-mounted device 1 is measured in advance. Therefore, once the position of the head-mounted device 1 and of the reference point 1a on the subject's head is determined, the position of each sensor 14 on the subject's head is determined as well.
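A minimal sketch of the last step described above: once the reference point 1a and the device's orientation have been located in the image, each sensor position follows from the pre-measured device geometry. The offset table, the 2 × 3 layout, the units, and the rigid 2-D transform are assumptions for illustration; the description itself does not fix them:

```python
import math

# Pre-measured sensor offsets (mm) relative to the reference point 1a on the
# device -- illustrative values for a 2x3 layout, not taken from the patent.
SENSOR_OFFSETS = {1: (0, 0), 2: (30, 0), 3: (60, 0),
                  4: (0, 30), 5: (30, 30), 6: (60, 30)}

def sensor_positions(ref_xy, angle_deg, scale=1.0):
    """Map device-local sensor offsets into head-image coordinates, given the
    detected reference-point position and the device's in-image rotation."""
    th = math.radians(angle_deg)
    c, s = math.cos(th), math.sin(th)
    out = {}
    for sid, (dx, dy) in SENSOR_OFFSETS.items():
        out[sid] = (ref_xy[0] + scale * (c * dx - s * dy),
                    ref_xy[1] + scale * (s * dx + c * dy))
    return out
```

The same transform applied to every offset preserves the device's internal geometry, which is why one reference point plus an orientation suffices.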
 FIG. 5 is a diagram showing an example in which the head-mounted device is mounted on the head. FIG. 5 is an image, captured by the imaging device 3, of the head of a subject wearing the head-mounted device 1. In the example of FIG. 5, the left side of the subject's head is shown, and the head-mounted device 1 including the plurality of sensors 14 is mounted above the left ear.
 (Operation example)
 Next, an operation of photographing the head-mounted device mounted on the head with the imaging device and identifying the mounting position of the head-mounted device will be described.
 FIG. 6 is a diagram showing an example of an operation flow for identifying the mounting position of the head-mounted device mounted on the head. The user terminal 2 may display on the display unit 25 information (an image or text) indicating the position where the head-mounted device 1 should be mounted. This target mounting position is stored in advance in the memory 22 of the user terminal 2 or the like. A subject who is to wear the head-mounted device 1 can check the information (image or text) displayed on the display unit 25 of the user terminal 2 to confirm where to mount the device. The operation flow of FIG. 6 starts after the subject has mounted the head-mounted device 1 on the head. It may also be started when the subject inputs, via the operation unit 26 of the user terminal 2, information indicating that mounting of the head-mounted device 1 is complete. The position where the head-mounted device 1 should be mounted depends on, for example, the purpose of the cerebral blood flow measurement.
 In S101, the imaging device 3 images the head of the subject wearing the head-mounted device 1. The imaging device 3 is pointed at the subject's head by the subject or another person; alternatively, the subject may turn his or her head toward the imaging device 3. The parts of the head to be imaged are, for example, five: the front of the head including the eyes and nose, the left side of the head (for example, FIG. 5), the right side of the head, the top of the head, and the back of the head. A plurality of locations may be imaged by a single imaging device 3 or by a plurality of imaging devices 3. The imaging device 3 transmits the captured images to the user terminal 2. It suffices for the imaging device 3 to image at least one location that includes the subject's head and the head-mounted device 1. The captured image may be a still image or a moving image.
 In S102, the user terminal 2 receives the captured image (head image) from the imaging device 3. The received image is stored in the memory 22 or the like. The CPU 21 extracts the contour of the head from the head image; it extracts the contour of the subject's head and the contour of the head-mounted device 1 by a known image recognition method.
 In S103, the CPU 21 extracts the positions of specific parts of the head (positions on the head surface) from the head image by a known image recognition method. The specific parts are, for example, the eyes, nose, mouth, ears, shoulders, and top of the head.
 When images of a plurality of parts of the head are available, the CPU 21 may generate an image of the head viewed from an arbitrary direction by a known image synthesis method. For example, based on an image of the front of the head (the front of the face) and an image of the right side of the head, the CPU 21 may generate an image viewed from diagonally forward right of the head. In a known image synthesis method, an image of the head viewed from an arbitrary direction can be generated from a plurality of captured images by identifying the contour of the head and the three-dimensional position of each part. The CPU 21 may treat an image generated in this way as the head image. The position of each part of the head may be expressed by coordinates set on the head surface with reference to the positions of the eyes, nose, and so on.
 For example, in an image of the head including both eyes and the nose (an image of the front of the head), coordinates can be set by taking the position of the left eye as the origin, the direction from the left eye to the right eye as the first axis, and the direction from the left eye to the nose as the second axis. As long as the first axis and the second axis are not parallel, any position in the head image can be expressed in these coordinates. Similarly, in an image of the back of the head, the positions of both shoulders and the top of the head can be identified and used in the same way. The position of the top of the head can be taken as, for example, the uppermost point on the contour of the head.
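The coordinate construction just described can be sketched as a change of basis; the function and landmark names are hypothetical, and the math is exactly the eye/nose frame the paragraph defines, solvable whenever the two axes are not parallel:

```python
def head_coords(point, left_eye, right_eye, nose):
    """Express an image point in the face-anchored frame: origin at the left
    eye, first axis toward the right eye, second axis toward the nose."""
    u = (right_eye[0] - left_eye[0], right_eye[1] - left_eye[1])
    v = (nose[0] - left_eye[0], nose[1] - left_eye[1])
    p = (point[0] - left_eye[0], point[1] - left_eye[1])
    det = u[0] * v[1] - u[1] * v[0]  # zero only if the axes are parallel
    if det == 0:
        raise ValueError("axes are parallel; landmarks are degenerate")
    a = (p[0] * v[1] - p[1] * v[0]) / det
    b = (u[0] * p[1] - u[1] * p[0]) / det
    return a, b  # point = left_eye + a*u + b*v
```

By construction the right eye maps to (1, 0) and the nose to (0, 1), so every other point gets a coordinate pair relative to the same landmarks regardless of camera framing.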
 In S104, the CPU 21 extracts, from the head image, the head-mounted device 1 mounted on the subject's head and the reference point 1a of the head-mounted device 1. The CPU 21, for example, compares the head image with an image of the head-mounted device 1 stored in advance in the memory 22 or the like, and extracts as the head-mounted device 1 the part of the head image that matches the stored image. The CPU 21 also extracts from the head image, by a known image recognition method or the like, a picture, character, light, etc. identical to the predetermined picture, character, light, etc. of the reference point 1a, and recognizes it as the reference point 1a.
 In S105, the CPU 21 identifies the position of the head-mounted device 1 and the position of the reference point 1a. The CPU 21 identifies these positions as, for example, positions relative to the eyes, nose, and other features of the head. The CPU 21 may also identify the three-dimensional positions of the head-mounted device 1 and of the reference point 1a from the head-mounted device 1 included in the head image. A position may be expressed in two-dimensional coordinates referenced to the eyes, nose, etc. in a single head image, or in three-dimensional coordinates referenced to the three-dimensional positions of the eyes, nose, etc. identified from a plurality of head images. The CPU 21 stores the identified positions of the head-mounted device 1 and of the reference point 1a in the memory 22 or the like. It is desirable for the imaging device 3 to image the head-mounted device 1 from a position perpendicular to the reference point 1a. If the head-mounted device 1 is imaged from a position off the perpendicular to the reference point 1a, a discrepancy arises between the head position that the head-mounted device 1 is actually measuring and the head position calculated from the image captured by the imaging device 3. Imaging from a position perpendicular to the reference point 1a therefore allows the head position to be recognized more accurately.
 When it is determined that the imaging device 3 is not perpendicular to the reference point 1a, a message may be displayed asking the subject to image the head-mounted device 1 again so that the imaging device 3 captures it from a position perpendicular to the reference point 1a. The CPU 21 may generate a head image in which an image of the head-mounted device 1 is superimposed on the image of the subject's head, and display it on the display unit 25. The CPU 21 can also identify the position of each sensor 14 from the position of the head-mounted device 1 and the position of the reference point 1a. By using the position of the reference point 1a, the position of each sensor 14 can be identified even when the shape of the head-mounted device 1 has symmetry (line symmetry or rotational symmetry). Here, the position of each sensor 14 is identified from the position of the head-mounted device 1 and the position of the reference point 1a, but it may instead be identified from the positions of two reference points provided on the head-mounted device 1. If the CPU 21 cannot identify the position of the head-mounted device 1 or of the reference point 1a in the head image, it may display an error on the display unit 25. In that case, the operation flow of FIG. 6 ends on the assumption that the head-mounted device 1 is not worn.
 The CPU 21 compares the position where the head-mounted device 1 should be mounted with the position where the identified head-mounted device 1 is actually mounted. The target mounting position is stored in advance in the memory 22 or the like. When the two positions match, the CPU 21 displays on the display unit 25 a message that the head-mounted device 1 is mounted at the proper position, and instructs the head-mounted device 1 to start measurement. The head-mounted device 1 then starts measuring cerebral blood flow. The user terminal may display on the display unit 25 the cerebral blood flow measured by each sensor 14 of the head-mounted device 1, superimposed on the image of the subject's head at the position of that sensor 14 in the head image. From this display, the subject and others can relate the cerebral blood flow measurement results to the measured positions on the head.
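One hedged way to realize the position comparison described above is a simple distance test; the tolerance and the coordinate units are assumptions, since the description does not state how close the two positions must be to "match":

```python
import math

def mounting_ok(target_xy, detected_xy, tol=10.0):
    """Judge whether the detected device position matches the position where
    it should be mounted, within an assumed tolerance (same units as the
    coordinates). The description itself does not specify a tolerance."""
    dist = math.hypot(detected_xy[0] - target_xy[0],
                      detected_xy[1] - target_xy[1])
    return dist <= tol
```

Its boolean result would drive the two branches of the flow: start measurement when it is true, or prompt the subject to remount the device when it is false.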
 FIG. 7 is a diagram showing an example in which measurement results are superimposed on an image of the subject's head. The example of FIG. 7 includes the measurement results obtained when the head-mounted device 1 is mounted on the subject's head as in FIG. 5 and cerebral blood flow is measured. In the example of FIG. 7, the left side of the subject's head is shown, and the cerebral blood flow measurement results are displayed superimposed at the position where the measuring head-mounted device 1 was mounted. The user terminal 2 may also generate an image in which the measurement results are superimposed on an image of the head viewed from an arbitrary direction, itself generated by identifying the contour of the head and the three-dimensional position of each part from a plurality of captured head images. The position at which the measurement results are superimposed is the position in the head image corresponding to where the head-mounted device 1 was mounted. The region showing the measurement results need not coincide with the region where the head-mounted device 1 was worn; it may be smaller or larger than that region.
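A minimal sketch of superimposing a measurement value on the head image at the detected device region, as described above. The pure-Python pixel representation, the red tint, and the blending factor are all illustrative assumptions, not part of this description:

```python
def overlay_measurement(image, region, value, vmax=1.0, alpha=0.5):
    """Blend a red tint, scaled by the measured value, into the rectangular
    image region (x0, y0, x1, y1) where the device was detected.
    `image` is a nested list of (r, g, b) tuples, modified in place."""
    x0, y0, x1, y1 = region
    k = max(0.0, min(value / vmax, 1.0))  # normalized blood-flow value
    for y in range(y0, y1):
        for x in range(x0, x1):
            r, g, b = image[y][x]
            image[y][x] = (round(r + (255 - r) * alpha * k),
                           round(g * (1 - alpha * k)),
                           round(b * (1 - alpha * k)))
    return image
```

A real implementation would use an image library and a smoother color map, but the mapping from "device region in the image" to "tinted pixels" is the essential step.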
 FIG. 8 is a diagram showing an example in which measurement results are superimposed on a simulated image of the head. In the example of FIG. 8, an image is displayed in which the measurement results for the right side of the head and for the forehead are superimposed on a simulated image of the head viewed from above. The simulated top-view image of the head includes, for example, both eyes, the nose, and the ears; including them makes positions on the head easier to recognize. When the head-mounted device 1 is mounted on the top, back, or left side of the head and cerebral blood flow is measured there, an image with the measurement results superimposed on the simulated head image can be generated in the same way. Displaying the measurement results as in FIG. 8 makes it easy for the subject and others to recognize where the cerebral blood flow was measured.
 On the other hand, when the position where the head-mounted device 1 should be mounted and the position where the identified head-mounted device 1 is actually mounted do not match, the CPU 21 displays on the display unit 25 a message that the head-mounted device 1 is not mounted at the proper position. The user terminal 2 may further display on the display unit 25 a prompt urging the subject to mount the head-mounted device 1 at the proper position. The subject checks the display on the display unit 25 and mounts the head-mounted device 1 again at the proper position. At this time, the CPU 21 may display on the display unit 25 both the position where the head-mounted device 1 is currently mounted and the position where it should be mounted. After the head-mounted device 1 is remounted, the operation flow of FIG. 6 is started again.
 The above configurations can be combined wherever possible.
 (Other)
 A plurality of head-mounted devices 1 may be worn. When a plurality of head-mounted devices 1 are worn, the mounting position is identified for each sensor of each head-mounted device 1.
 Even when the head-mounted device 1 is to be mounted on a part other than the head, such as an arm or a leg, the same procedure as in the above example can be used. For example, the user terminal 2 can identify the position of each sensor using the positions of joints or fingers instead of the positions of the eyes, nose, and so on.
 The reference point 1a may be provided on any of the sensors 14 of the head-mounted device 1. A plurality of reference points 1a may be provided on the head-mounted device 1. The output unit 15 may also serve as the reference point 1a.
 (Modification of mounting position identification)
 Here, a modification of the mounting position identification in the operation flow of FIG. 6 is described, mainly concerning S105 of that flow; description of points in common with the operation flow of FIG. 6 is omitted. It is assumed here that the head-mounted device 1 shown in FIG. 9 below is mounted on the subject's head, and that the positions (relative positions) of the light-emitting units (light sources) or light-receiving units of the sensors 14 of the head-mounted device 1 worn by the subject are to be identified.
 FIG. 9 is a diagram showing an example of the appearance of the head-mounted device. The head-mounted device 1 of FIG. 9 has 15 sensors 14, arranged at the lattice points of a triangular lattice on the surface of a sheet-like material, that measure blood flow in the head of the subject. For convenience of explanation, some of the sensors 14 are referred to as sensor 141, sensor 142, and sensor 143. The head-mounted device 1 is wrapped around and fixed to the subject's head by, for example, a mounting belt. The sheet-like material can deform into a curved surface to fit the shape of the subject's head. Here, the initial state of the sheet-like material is assumed to be flat; it may instead be curved in advance to fit the shape of the subject's head.
 Each sensor 14 in FIG. 9 includes a columnar housing with a hexagonal cross section. In the hexagon of the sensor 14 in FIG. 9, opposite sides are parallel and of equal length: one pair of opposite sides has length a, another pair length b, and the remaining pair length c. A light-emitting unit (light source) or a light-receiving unit is attached to one end face of the columnar housing, and an output unit 15 to the other. The light-emitting or light-receiving unit of the sensor 14 is placed in contact with the subject's head. The distance from the center (light-emitting or light-receiving unit) of a sensor 14 to the center of an adjacent sensor 14 is 2 × S2. The midpoint of the line segment connecting the centers of two adjacent sensors 14 is called an intermediate point; the distance from the center of a sensor 14 to an adjacent intermediate point is S2. The height of the hexagonal columnar housing is S1.
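The triangular-lattice geometry above (adjacent centers 2 × S2 apart) can be sketched as follows for the flat, pre-curvature state of the sheet; the row/column counts and the equilateral-row spacing are assumptions for illustration:

```python
def triangular_lattice(rows, cols, s2):
    """Centers of sensors on a triangular lattice in which adjacent sensor
    centers are 2*s2 apart; odd rows are shifted by half the pitch so every
    neighbor pair, within a row or across rows, is at the same distance."""
    pitch = 2 * s2
    row_h = pitch * (3 ** 0.5) / 2  # height of an equilateral-triangle row
    pts = []
    for r in range(rows):
        x0 = s2 if r % 2 else 0.0  # stagger odd rows by s2
        for c in range(cols):
            pts.append((x0 + c * pitch, r * row_h))
    return pts
```

With 3 rows of 5 this reproduces the 15-sensor count of FIG. 9, and intermediate points fall at distance s2 from each center along the segments between neighbors.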
FIG. 10 is a diagram showing an example of the columnar housing of a sensor of the head-mounted device of FIG. 9. In the example of FIG. 10, the sensor 14 has a columnar housing with a hexagonal cross section; a light emitting part or a light receiving part is provided on one hexagonal face, and the output part 15 is provided on the other hexagonal face. The face with the light emitting part or light receiving part is the side that contacts the subject's head, while the face with the output part 15 faces outward. Of the hexagon's sides, two have length a, two have length b, and the remaining two have length c. Any of a = b, b = c, c = a, or a = b = c is allowed.
The user terminal 2 acquires the image captured by the imaging device 3 (an image of the head). The received image is stored in the memory 22 or the like. The CPU 21 extracts the outline of the head from the head image, extracting the contour of the subject's head and the contour of the head-mounted device 1 by a known image recognition method. The CPU 21 further extracts, by a known image recognition method, the positions (on the head surface) of specific parts of the head, such as the eyes, nose, mouth, ears, shoulders, and the top of the head.
The CPU 21 extracts the sensors 14 included in the head-mounted device 1 extracted from the head image, based on the shape, color, predetermined marks, and the like of the sensors 14 given in advance. From the extracted sensors 14, the CPU 21 selects a sensor 14 that faces the imaging device 3 squarely, that is, a sensor 14 whose outer hexagonal face is orthogonal to the optical axis of the imaging device 3. When the head-mounted device 1 is mounted on the head, it curves along the head, so the outer faces of some sensors 14 no longer face the front. Here, the three-dimensional positions of the other sensors 14 are determined with reference to a front-facing sensor 14. Whether a sensor 14 faces the front is determined by whether the ratio of three adjacent sides of its hexagon in the image matches a : b : c: a sensor 14 whose side ratio matches a : b : c faces the imaging device 3 squarely, and a sensor 14 whose side ratio does not match a : b : c does not.
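The side-ratio test just described can be sketched as follows. This is a minimal illustration only; the function name, the tolerance, and the example side lengths are assumptions, not part of the patent.

```python
def is_front_facing(sides_px, sides_true, tol=0.02):
    """Return True when the three adjacent hexagon sides measured in the
    image (a', b', c', in pixels) keep the true ratio a : b : c within
    `tol`. An obliquely viewed sensor is foreshortened, so its ratio drifts."""
    a_p, b_p, c_p = sides_px
    a, b, c = sides_true
    # Normalise both triples by their first element and compare the ratios.
    obs = (b_p / a_p, c_p / a_p)
    ref = (b / a, c / a)
    return all(abs(o - r) <= tol * r for o, r in zip(obs, ref))

front = is_front_facing((40.0, 50.0, 60.0), (4.0, 5.0, 6.0))    # True
oblique = is_front_facing((40.0, 43.0, 60.0), (4.0, 5.0, 6.0))  # False
```

A tolerance is needed in practice because edge detection in the image is never exact; its value would be tuned to the camera and lighting.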
FIG. 11 is a diagram showing an example of a sensor 14 facing the front. Because the outer hexagon of the sensor 14 in FIG. 11 is orthogonal to the optical axis of the imaging device 3, the ratio of three adjacent sides of the hexagon (length a′, length b′, length c′) matches a : b : c.
FIG. 12 is a diagram showing an example of a sensor 14 not facing the front. Because the outer hexagon of the sensor 14 in FIG. 12 is not orthogonal to the optical axis of the imaging device 3, the ratio of three adjacent sides of the hexagon (length a″, length b″, length c″) does not match a : b : c.
Next, the CPU 21 uses a side length of a sensor 14 judged to be facing the front to establish the scaling between pixel counts in the image and actual lengths. For example, if the side of length a of a front-facing sensor 14 spans P pixels in the image, one pixel in the image corresponds to an actual length of a/P. When the distance from the head to the imaging device 3 is sufficiently large compared with the depth of the head, the actual length of one pixel may be treated as the same at any position in the image; alternatively, the actual length per pixel may be determined with the depth of the image taken into account. Hereinafter, the CPU 21 converts lengths in the image into actual lengths in this way.
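A minimal sketch of this scaling step, under the same far-camera assumption; the function name and the example numbers are illustrative, not from the patent:

```python
def length_per_pixel(side_actual, side_pixels):
    """If a side of known actual length spans `side_pixels` pixels on a
    front-facing sensor, one pixel corresponds to side_actual / side_pixels.
    Assumes the camera-to-head distance is large compared with the head's
    depth, so the scale is uniform across the image."""
    return side_actual / side_pixels

scale = length_per_pixel(4.0, 40.0)  # example: side a = 4 mm spans 40 px
distance_mm = scale * 25.0           # a 25 px distance in the image, in mm
```

If the depth of the scene were not negligible, the scale would instead vary with each sensor's distance to the camera.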
FIG. 13 is a diagram showing an example of a cross section of the head-mounted device when not mounted on the head. FIG. 13 shows a cross section through the centers of the sensors 141, 142, and 143 of the head-mounted device 1 of FIG. 9. The left side of FIG. 13 is the head side, the right side is the outside (the side on which the imaging device 3 is located), and the Z direction in FIG. 13 is the direction of the optical axis of the imaging device 3. The A1 direction is one of the directions orthogonal to the Z direction and is the same as the A1 direction in FIG. 9. As above, the height of each sensor 14 is S1 and the distance between a sensor 14 and an intermediate point is S2, so the distance between the centers of adjacent sensors 14 is 2 × S2.
If the head-mounted device 1 of FIG. 13 were photographed by the imaging device 3, the distance between the centers of the sensors 141 and 142 obtained from the image would be 2 × S2, and likewise the distance between the centers of the sensors 142 and 143 would be 2 × S2. The Z coordinates of the sensors 142 and 143 would also be the same as that of the sensor 141.
Here, when the head-mounted device 1 is mounted on the head, it is assumed to bend and deform at each intermediate point so as to follow the head.
FIG. 14 is a diagram showing an example of a cross section of the head-mounted device when mounted on the head. Like FIG. 13, FIG. 14 shows a cross section through the centers of the sensors 141, 142, and 143 of the head-mounted device 1 of FIG. 9. The left side of FIG. 14 is the head side, the right side is the outside (the side on which the imaging device 3 is located), and the Z direction in FIG. 14 is the direction of the optical axis of the imaging device 3. Here, the surface of the sensor 141 is assumed to face the imaging device 3 squarely, that is, to be orthogonal to the optical axis of the imaging device 3. Let L1 be the distance between the centers of the sensors 141 and 142 obtained from the captured head image, and let L2 be the distance between the centers of the sensors 142 and 143. The sensor 142 is assumed to be bent by an angle θ relative to the sensor 141 at the intermediate point between them, and the sensor 143 is assumed to be bent by a further angle φ at the intermediate point between the sensors 142 and 143.
The CPU 21 of the user terminal 2 may obtain the center of the surface of a sensor 14 as the center of the hexagon computed from the positions of its sides; alternatively, the output part 15 may be provided in advance at the center of the surface of the sensor 14 and its position used as the center of the hexagon.
From the geometry of FIG. 14, the CPU 21 can obtain the angle θ from the distance L1; specifically, L1 = S2 + S1 sin θ + S2 cos θ. It can then obtain the angle φ from the distances L1 and L2; specifically, L1 + L2 = S2 + 2S2 cos θ + S1 sin(θ + φ) + S2 cos(θ + φ). Using the angles θ and φ, the CPU 21 can determine the positions of the light emitting or receiving parts of the sensors 142 and 143 relative to that of the sensor 141. Specifically, the light emitting or receiving part of the sensor 142 is displaced from that of the sensor 141 by S2 sin θ in the Z direction and by S2 + S2 cos θ in the A1 direction. Similarly, the light emitting or receiving part of the sensor 143 is displaced by 2 × S2 sin θ + S2 sin(θ + φ) in the Z direction and by S2 + 2 × S2 cos θ + S2 cos(θ + φ) in the A1 direction. Based on the head image, the CPU 21 can obtain in the same way the positions, relative to the light emitting or receiving part of the sensor 141, of the sensors 14 in the A2 and A3 directions as viewed from the sensor 141 in FIG. 9, and likewise the positions of the remaining sensors 14.
In this way, the CPU 21 can calculate the relative positions of all the sensors 14 (their positions relative to the light emitting or receiving part of the sensor 141). Furthermore, by comparing the position of the subject's head with the positions of the sensors 14, the CPU 21 can recognize at which position on the head each sensor 14 is mounted.
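The relations above can be inverted in closed form: writing S1 sin t + S2 cos t = R sin(t + ψ), with R = √(S1² + S2²) and ψ = atan2(S2, S1), lets the fold angles be recovered from the projected distances. The sketch below assumes the geometry of FIG. 14 and gentle curvature (t + ψ < π/2, selecting the small-angle root); the function names are illustrative, not from the patent.

```python
import math

def fold_angle(proj, S1, S2):
    """Solve proj = S1*sin(t) + S2*cos(t) for the fold angle t, using
    S1*sin(t) + S2*cos(t) = R*sin(t + psi) with R = hypot(S1, S2) and
    psi = atan2(S2, S1). Returns the gentle-curvature root, valid while
    t + psi < pi/2."""
    R = math.hypot(S1, S2)
    psi = math.atan2(S2, S1)
    return math.asin(proj / R) - psi

def sensor_positions(L1, L2, S1, S2):
    """(A1, Z) displacements of the light emitting/receiving parts of
    sensors 142 and 143 relative to sensor 141, computed from the
    projected centre-to-centre distances L1 and L2 in the image."""
    # L1 = S2 + S1*sin(theta) + S2*cos(theta)
    theta = fold_angle(L1 - S2, S1, S2)
    # L1 + L2 = S2 + 2*S2*cos(theta) + S1*sin(theta+phi) + S2*cos(theta+phi)
    theta_phi = fold_angle(L1 + L2 - S2 - 2 * S2 * math.cos(theta), S1, S2)
    phi = theta_phi - theta
    p142 = (S2 + S2 * math.cos(theta), S2 * math.sin(theta))
    p143 = (S2 + 2 * S2 * math.cos(theta) + S2 * math.cos(theta_phi),
            2 * S2 * math.sin(theta) + S2 * math.sin(theta_phi))
    return theta, phi, p142, p143
```

Round-tripping the forward formulas for L1 and L1 + L2 through this inverse recovers θ and φ, which checks the sign conventions of the derivation.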
FIG. 15 is a diagram showing another example of the appearance of the head-mounted device. The head-mounted device 1 of FIG. 15 has fifteen sensors 14 that measure blood flow in the subject's head, arranged at the lattice points of an orthogonal lattice on the surface of a sheet-like member. When the sensors 14 are arranged as in FIG. 15, the relative position of each sensor 14 can be obtained in the same manner as above; by obtaining the relative positions along the B1 and B2 directions, the relative positions of all the sensors 14 can be obtained.
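The step-by-step displacements generalize to any row of sensors, whether along A1–A3 of FIG. 9 or B1/B2 of FIG. 15: each hop goes S2 along the previous sensor's tilt to the intermediate point, then S2 along the next sensor's tilt. A sketch under the same assumptions (the function name and the cumulative-tilt input convention are illustrative, not from the patent):

```python
import math

def chain_positions(tilts, S2):
    """Accumulate (A, Z) centre positions along one row of sensors.
    tilts[i] is the cumulative tilt of sensor i relative to sensor 0,
    so tilts[0] == 0; e.g. [0, theta, theta + phi] for FIG. 14."""
    positions = [(0.0, 0.0)]
    a, z = 0.0, 0.0
    for prev, cur in zip(tilts, tilts[1:]):
        # S2 along the previous sensor to the hinge, S2 along the next one.
        a += S2 * math.cos(prev) + S2 * math.cos(cur)
        z += S2 * math.sin(prev) + S2 * math.sin(cur)
        positions.append((a, z))
    return positions
```

With tilts [0, θ, θ + φ] this reproduces the displacements derived for sensors 142 and 143 above, and it extends unchanged to rows with more than three sensors.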
The above embodiments, modifications, and the like can be implemented with their configurations combined as appropriate, to the extent possible.
(Operation and Effects of the Embodiment)
The imaging device 3 of the biometric measurement device 5 images the subject's head including the head-mounted device 1 mounted on it. Based on the captured head image, the user terminal 2 specifies the position on the head of the head-mounted device 1, and also specifies the position of each sensor 14 of the head-mounted device 1. By photographing the head with the head-mounted device 1 mounted, the biometric measurement device 5 can thus specify where the head-mounted device 1 is mounted and where each of its sensors 14 is attached. Because the attachment position of each sensor 14 can be specified, the biometric measurement device 5 can easily associate the cerebral blood flow measured by a sensor 14 with the mounting position of that sensor 14.
Description of Reference Signs
1 head-mounted device
2 user terminal
3 imaging device
5 biometric measurement device
11 control unit
13 wireless communication unit
14 sensor
15 output unit
21 CPU
22 memory
23 wireless communication unit
24 public line communication unit
25 display unit
26 operation unit
27 output unit
28 imaging unit
29 positioning unit
2A physical sensor unit

Claims (7)

1. A biometric measurement device comprising:
    a sensor device mounted on a subject and having a plurality of sensors that measure physical quantities of the subject;
    an imaging device that images the sensor device and the subject; and
    a user terminal that calculates, from an image captured by the imaging device, the mounting positions of the sensors of the sensor device on the subject.
2. The biometric measurement device according to claim 1, wherein the user terminal has a display unit that displays the positions of the sensors of the sensor device mounted on the subject superimposed on an image of the subject.
3. The biometric measurement device according to claim 1 or 2, wherein
    the sensor device has a reference point serving as a reference for the mounting position of the sensor device, and
    the user terminal calculates the mounting positions of the sensors of the sensor device on the subject based on the reference point.
4. The biometric measurement device according to claim 1, wherein
    the sensor device measures blood flow in the subject's head with the plurality of sensors, and
    the user terminal has a display unit that displays the measurement results of the sensors superimposed on an image of the subject at the position of the sensor device in that image.
5. The biometric measurement device according to any one of claims 1 to 4, wherein the user terminal calculates the three-dimensional position of each sensor based on the size of the sensors and the distances between adjacent sensors in the image captured by the imaging device.
6. An information processing program for causing a computer to execute:
    imaging a subject and a sensor device that is mounted on the subject and has a plurality of sensors that measure physical quantities of the subject; and
    calculating, from the captured image, the mounting positions of the sensors of the sensor device on the subject.
7. A biometric measurement method in which a computer executes:
    imaging a subject and a sensor device that is mounted on the subject and has a plurality of sensors that measure physical quantities of the subject; and
    calculating, from the captured image, the mounting positions of the sensors of the sensor device on the subject.
PCT/JP2017/013125 2016-03-30 2017-03-29 Biometric measurement apparatus, information processing program, and biometric measurement method WO2017170804A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018509401A JPWO2017170804A1 (en) 2016-03-30 2017-03-29 Biological measuring device, information processing program, and biological measuring method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-067461 2016-03-30
JP2016067461 2016-03-30

Publications (1)

Publication Number Publication Date
WO2017170804A1 true WO2017170804A1 (en) 2017-10-05

Family

ID=59965851

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/013125 WO2017170804A1 (en) 2016-03-30 2017-03-29 Biometric measurement apparatus, information processing program, and biometric measurement method

Country Status (2)

Country Link
JP (1) JPWO2017170804A1 (en)
WO (1) WO2017170804A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020071443A1 (en) * 2018-10-02 2020-04-09 株式会社NeU Blood flow volume measurement device
WO2020174842A1 (en) * 2019-02-27 2020-09-03 国立研究開発法人産業技術総合研究所 Brain function measurement device, brain function measurement method, and probe

Citations (3)

Publication number Priority date Publication date Assignee Title
JP2006122086A (en) * 2004-10-26 2006-05-18 Hitachi Ltd Optical measuring instrument for organism
JP2010119660A (en) * 2008-11-20 2010-06-03 Shimadzu Corp Organism measuring instrument
JP2014030621A (en) * 2012-08-03 2014-02-20 Sony Corp Information processor, program, and biological signal measuring set

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
JP5717064B2 (en) * 2011-02-03 2015-05-13 国立大学法人 筑波大学 Blood flow measuring device and brain activity measuring device using blood flow measuring device


Cited By (5)

Publication number Priority date Publication date Assignee Title
WO2020071443A1 (en) * 2018-10-02 2020-04-09 株式会社NeU Blood flow volume measurement device
WO2020174842A1 (en) * 2019-02-27 2020-09-03 国立研究開発法人産業技術総合研究所 Brain function measurement device, brain function measurement method, and probe
JPWO2020174842A1 (en) * 2019-02-27 2021-10-14 国立研究開発法人産業技術総合研究所 Brain function measuring device and brain function measuring method and probe
JP7227652B2 (en) 2019-02-27 2023-02-22 国立研究開発法人産業技術総合研究所 Brain function measuring device, brain function measuring method and probe
US11911130B2 (en) 2019-02-27 2024-02-27 National Institute Of Advanced Industrial Science And Technology Brain function measurement device, brain function measurement method, and probe

Also Published As

Publication number Publication date
JPWO2017170804A1 (en) 2019-02-14

Similar Documents

Publication Publication Date Title
US11172817B2 (en) Fundus image capture system
WO2018072598A1 (en) Human body height measurement method and device, and smart mirror
CN104127302B (en) A kind of visually impaired people&#39;s walking along the street safety navigation method
US20150335239A1 (en) Head mountable device for measuring eye movement
RU2018136770A (en) SYSTEMS AND METHODS OF SCANNING FACES
US20150098636A1 (en) Integrated tracking with fiducial-based modeling
JP2014136137A (en) Medical information processing apparatus, and program
WO2012083349A4 (en) System for integrated wound analysis
US20150097935A1 (en) Integrated tracking with world modeling
WO2017170804A1 (en) Biometric measurement apparatus, information processing program, and biometric measurement method
US10402996B2 (en) Distance measuring device for human body features and method thereof
WO2019176491A1 (en) Gaze detector, method for controlling gaze detector, method for detecting corneal reflection image position, computer program, and storage medium
CN104720814A (en) Non-contact height automatic measuring system and non-contact height automatic measuring method
JP2017191546A (en) Medical use head-mounted display, program of medical use head-mounted display, and control method of medical use head-mounted display
JP6609737B2 (en) Biological light measurement device, information processing program, and information processing method
JP2016095584A (en) Pupil detection device, pupil detection method, and pupil detection program
JP7078568B2 (en) Display device, display control method, and display system
CN204275306U (en) A kind of network intelligence eye physiotherapy table
WO2018062410A1 (en) Selection system, selection method, and selection program
KR102348663B1 (en) System and method for measurement pulse and respiration using image and line laser
JP7228153B2 (en) Head-mounted device, brain activity recording system
JP7019139B2 (en) Information processing equipment, information processing method, information processing program
WO2022091507A1 (en) Determination device and determination method
JP6823845B2 (en) Information processing equipment, information processing method, information processing program
TWI710353B (en) Information processing device, information processing method, information processing program

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 2018509401

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17775338

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17775338

Country of ref document: EP

Kind code of ref document: A1