WO2007125794A1 - Data measurement device and data measurement method - Google Patents

Data measurement device and data measurement method

Info

Publication number
WO2007125794A1
WO2007125794A1 (PCT/JP2007/058430; JP2007058430W)
Authority
WO
WIPO (PCT)
Prior art keywords
data
pupil
image
template
unit
Prior art date
Application number
PCT/JP2007/058430
Other languages
English (en)
Japanese (ja)
Inventor
Shin-Ichiroh Kitoh
Original Assignee
Konica Minolta Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Konica Minolta Holdings, Inc.
Publication of WO2007125794A1

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/11Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
    • A61B3/112Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils

Definitions

  • the present invention relates to a data measuring device and a data measuring method, and more particularly to a data measuring device and a data measuring method for measuring data related to a pupil of a living body.
  • an apparatus for measuring data related to a pupil of a living body has been proposed for diagnostic purposes such as medical diagnosis.
  • As such a data measuring device, measuring devices provided with various measuring means and diagnostic means that use data relating to the pupil of a living body have been proposed.
  • Patent Document 1 describes a diagnostic device that detects the amount of change in a body part, such as eye blinks and pupil diameter, from a moving image of a living body and performs diagnosis based on the detected change.
  • Patent Document 2 describes a person-state detection device in which a person's face is photographed using a CCD camera and infrared light, and the person's state is determined by extracting the position or shape of the pupil from the photographed image.
  • Patent Document 3 describes a pupillary light reflex measuring instrument for relaxation evaluation, in which a change in the pupil area (pupil diameter) of a living body when a stimulus is applied is detected by a goggle-shaped measuring tool equipped with an infrared light source and an infrared CCD camera, and the relaxation of the living body is evaluated from the detection result.
  • Patent Document 1 Japanese Patent Laid-Open No. 7-124126
  • Patent Document 2 Japanese Patent Laid-Open No. 7-249197
  • Patent Document 3 Japanese Patent Laid-Open No. 2005-143684
  • However, the measuring instrument of Patent Document 3 needs to be provided with a special device combining an infrared light source and an infrared CCD camera.
  • On the other hand, a template matching method is known as one method for extracting parts such as eyes, pupils, and mouths from biological images such as human face images.
  • In template matching, a representative image called a template is prepared, the template is compared with the image data to be examined while its position is scanned over that image, and the position where the two best match is detected as the location of the part to be extracted.
  • However, because the template is typically a fixed, representative image and its size and content (shape, color, etc.) are not changed during scanning, the detection accuracy falls when the template does not match the image being inspected (for example, when a bare-skin eye is used as the template and a made-up eye is to be inspected) or when the corresponding feature does not exist (for example, when a template of a dark eye is used to extract a blue eye).
  • Further, when the inspection target is a moving image, the positional relationship between the subject and the camera changes three-dimensionally from frame to frame, so the size of the detection target in the moving image data changes and a fixed template cannot follow the change.
  • In addition, the detection target itself changes constantly due to the amount of light, physiological changes, and the subject's own movement. Therefore, when high-accuracy template matching is to be performed on an object such as a pupil, which changes by itself, it is difficult to use the conventional method of fixing the template.
  • The present invention therefore aims to provide a data measurement apparatus and a data measurement method that improve detection accuracy for moving image data by updating the template information using information obtained from the detection result of the target in each frame image.
  • An image capturing unit for capturing a first image by capturing a living body having an eye area
  • the image capturing unit captures the living body and inputs a second image
  • The invention according to claim 2 is the data measuring device according to claim 1, wherein the data relating to the pupil is a pixel value at the center of the pupil and diameter data of the pupil.
  • The invention according to claim 3 is the data measuring device according to claim 1 or claim 2, wherein the data processing/analysis unit is provided for each frame.
  • The invention according to claim 4 is the data measuring device according to any one of claims 1 to 3, wherein the image photographing unit is provided with a distance sensor or a displacement sensor for measuring the distance or displacement from the living body, and the data analysis unit calculates the absolute value of the data relating to the pupil using the distance or displacement from the living body as a parameter.
  • The invention according to claim 5 is the data measuring device according to claim 4, wherein the distance sensor or the displacement sensor includes a plurality of imaging elements and measures the shape of the living body.
  • The invention described in claim 6 is the data measuring device described in claim 5, wherein the image photographing unit is a visible camera.
  • The invention described in claim 7 relates to a data measurement method in which data relating to the pupil is also detected from the second eye region using the updated template.
  • The invention according to claim 8 is the data measurement method according to claim 7, wherein the data relating to the pupil is a pixel value of the pupil center and pupil diameter data.
  • FIG. 1 is a block diagram showing an overall structure of a data measurement device according to the present embodiment.
  • FIG. 2 is a conceptual diagram of processing for extracting an eye region from face image data.
  • FIG. 3 is a conceptual diagram showing pupil template creation processing.
  • FIG. 4 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 5 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 6 is a graph showing transition of pupil pixel values and iris pixel values in each frame.
  • FIG. 7 is a graph showing the transition of the minimum pixel value in each frame.
  • FIG. 8 is a conceptual diagram showing extraction processing of pupil diameter data using pixel values.
  • FIG. 9 is a graph showing the transition of diameter data in each frame.
  • FIG. 11 is a flowchart showing a method for extracting pupil and diameter data by template matching.
  • FIG. 12 is a flowchart showing a method for updating a pupil template.
  • FIG. 1 is a block configuration diagram of the data measuring apparatus 1 according to the present embodiment.
  • An external device 2 is connected to the data measuring device 1 via a network through which they can communicate with each other, and the measurement results of the data measuring device 1 can be transmitted to the external device 2.
  • The network in the present embodiment is not particularly limited as long as it is a communication network capable of data communication.
  • Examples include the Internet, a LAN (Local Area Network), a WAN (Wide Area Network), a telephone line network, an ISDN (Integrated Services Digital Network) network, a CATV (Cable Television) line, and an optical communication line.
  • the external device 2 is constituted by a personal computer or the like, and is preferably installed in a place where some kind of consulting and diagnosis can be received.
  • The external device 2 may be configured as an Internet site from which consulting information can be obtained, or as a mobile terminal of a consultant, doctor, or clerk.
  • the external device 2 may be configured as a data server for a home health management system.
  • The data measuring device 1 includes a control unit 3, an external communication unit 4, an image capturing unit 5, an illumination unit 6, a user interface unit 7, an I/O unit 8, a memory unit 9, a data processing/analysis unit 10, a parameter setting/management unit 11, a data storage unit 12, and a display unit 13.
  • the control unit 3 includes a CPU and a RAM, and drives and controls each component of the data measurement device 1. Since the data measuring device 1 of the present embodiment also handles moving images, it is desirable that the control unit 3 be configured with a chip that can control the operation as fast as possible.
  • the external communication unit 4 is configured to be able to perform information communication with the external device 2 by wired or wireless communication means.
  • the data measuring apparatus 1 of the present embodiment handles moving image data, and therefore preferably has a communication mode capable of high-speed transmission as much as possible.
  • The image capturing unit 5 captures a moving image of the area around the subject's eyes and is composed of, for example, a CCD camera, a digital still camera, a CMOS camera, a video camera, or a camera module attached to a mobile phone or the like. The image capturing unit 5 may perform either color or monochrome shooting; the following description assumes monochrome shooting.
  • Although an infrared camera can be used as the image photographing unit 5, it is desirable in the present embodiment to photograph with a visible camera.
  • As the visible camera, it is desirable to use a camera with high sensitivity in the red region, in view of the contrast of the captured image.
  • The sensitivity in the red region can be relatively increased by installing an optical filter with high transmittance in that band. Alternatively, a camera provided with an infrared cut filter may be used with that filter removed.
  • The image capturing unit 5 of the present embodiment is provided with a distance sensor or a displacement sensor for detecting the distance or displacement between the image capturing unit 5 and the subject.
  • An existing distance sensor or displacement sensor can be used for this purpose.
  • The image capturing unit 5 associates the distance and displacement between the image capturing unit 5 and the subject with each captured frame and stores them in the data storage unit 12 or the parameter setting/management unit 11.
  • The distance and displacement from the subject can also be obtained by configuring the camera serving as the image capturing unit 5 with two imaging elements and measuring the three-dimensional shape of regions such as the area between the eyebrows, the nose, and the other eye from the images captured by each camera.
  • In this case, the absolute distance per pixel can be obtained without using a special sensor, which makes it possible to detect the absolute value of data relating to the pupil without providing a distance sensor or a displacement sensor.
  • The illumination unit 6 is an optional component of the data measuring device 1, and irradiates the area around the subject's eyes with light from a light source when the surrounding environment is dark at the time of photographing.
  • As the light source, visible light ranging from white to incandescent (color temperature around 3000 K) can be used. An infrared light source can also be used. Further, by diffusing the illumination light, direct reflection of light on the surface of the subject's eyes can be mitigated, and the accuracy of image processing can be improved. It is also desirable that the illumination unit 6 have a mechanism that can vary the light intensity of the light source.
  • In addition, the illumination unit 6 has a mechanism capable of irradiating illumination light for applying a stimulus.
  • As the illumination light for applying the stimulus, it is preferable to use strobe (flash) light, which is visible light and is stable in intensity and irradiation time; in this case, the intensity of the irradiated light is kept constant over time. As a result, the pupil response can be measured by applying a stimulus to the subject's eye at the time of photographing by the image photographing unit 5. It is also desirable that the emission timing and emission duration of the strobe (flash) light be adjustable.
  • the user interface unit 7 is composed of a keyboard, a mouse, a trackball, and the like.
  • the user interface unit 7 allows a user to input an instruction, and can transmit the status and request of the data measurement device 1 to the user.
  • the touch panel may be configured integrally with the display unit 13.
  • The I/O unit 8 is configured so that portable media such as a CF card, an SD card, or a USB memory can be connected.
  • In addition, a port for connecting to the external device 2, such as an Ethernet (registered trademark) port, can be provided.
  • The memory unit 9 is composed of RAM, ROM, DIMM, and the like, and temporarily holds the data required by the data processing/analysis unit 10, the data storage unit 12, and other components so that the data measuring device 1 operates at high speed and with stability.
  • the memory unit 9 of this embodiment needs to have a capacity that can execute moving image processing in real time without dropping frames.
  • The data processing/analysis unit 10 measures a time-series change of data relating to the pupil of the living body by analyzing the moving image captured by the image capturing unit 5.
  • the data processing / analysis unit 10 extracts the eye region from the face image data.
  • For eye region extraction, conventional methods using eye template matching or the positional relationship of features around the eye can be applied.
  • Alternatively, the eye area in an arbitrary frame may be set manually and used as the eye area in the other frames. In this case, the eye area should be set somewhat wider so that the eye and iris do not protrude from it in other frames.
  • Next, the data processing/analysis unit 10 creates a pupil template.
  • the pupil template is created using data such as the size of the pupil, the average pixel value of the pupil, and the average pixel value of the iris, which are obtained by analyzing moving images of a plurality of people's eye regions in advance.
  • the size of the template is set so that the pupil can be accommodated, and a circle corresponding to the pupil is arranged at the center.
  • The pupil template can be created by setting the pixel values on and inside the circle to the average of the pupil pixel values of the plurality of people, and the pixel values outside the circle to the average of their iris pixel values.
  • the pupil template may be obtained from image data when the pupil is contracted. As a result, the detection accuracy of the pupil center coordinates is increased even for image data in which the pupil is contracted.
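  • As a rough illustration only, and not the patent's implementation, the following Python sketch assembles such a template; the template size, pupil radius, and pixel values are hypothetical placeholders for the averages obtained from a plurality of people.
```python
import numpy as np

def make_pupil_template(size: int, pupil_radius_px: int,
                        pupil_value: float, iris_value: float) -> np.ndarray:
    """Square template that accommodates the pupil: a centred disc filled with
    the average pupil pixel value, surrounded by the average iris pixel value."""
    yy, xx = np.mgrid[0:size, 0:size]
    centre = size // 2
    disc = (yy - centre) ** 2 + (xx - centre) ** 2 <= pupil_radius_px ** 2
    template = np.full((size, size), iris_value, dtype=np.float32)
    template[disc] = pupil_value   # the circle and its interior take the pupil value
    return template

# Example with assumed values: 31 x 31 template, 12 px pupil radius,
# dark pupil (30) surrounded by a brighter iris (90) on an 8-bit scale.
template = make_pupil_template(31, 12, pupil_value=30.0, iris_value=90.0)
```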
  • Next, the data processing/analysis unit 10 extracts the pupil by performing pupil template matching on the eye region extracted from the face image data, for each frame of the moving image. At this time, the position at which the difference between the image data of the eye region of each frame and the pupil template is minimized is determined to be the center of the pupil, so that the pupil center coordinates can be detected.
  • At this time, the search area for the next frame may be narrowed down using the pupil center coordinates extracted in the previous frame, under the assumption that the movement of the subject (pupil) between frames is small.
  • The size of the search area for the next frame can be set separately, and the range "pupil center coordinates of the previous frame ± set range" is then searched in that frame; a sketch of this matching step follows below.
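  • A minimal sketch of the matching step, assuming monochrome images stored as NumPy arrays and a sum-of-absolute-differences criterion (the text above only requires minimising "the difference"); the search-window size is a hypothetical parameter.
```python
import numpy as np

def match_pupil(eye_region: np.ndarray, template: np.ndarray,
                prev_center=None, search_radius: int = 20):
    """Scan the template over the eye region and return the (y, x) position of
    the template centre where the sum of absolute differences is smallest,
    together with that minimum difference. If prev_center is given, only a
    window around it is searched (narrowed search area from the previous frame)."""
    th, tw = template.shape
    H, W = eye_region.shape
    ys, xs = range(H - th + 1), range(W - tw + 1)
    if prev_center is not None:
        cy, cx = prev_center
        ys = range(max(0, cy - th // 2 - search_radius),
                   min(H - th + 1, cy - th // 2 + search_radius + 1))
        xs = range(max(0, cx - tw // 2 - search_radius),
                   min(W - tw + 1, cx - tw // 2 + search_radius + 1))
    best, best_center = np.inf, None
    for y in ys:
        for x in xs:
            diff = np.abs(eye_region[y:y + th, x:x + tw].astype(float) - template).sum()
            if diff < best:
                best, best_center = diff, (y + th // 2, x + tw // 2)
    return best_center, best
```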
  • The feature amount of the difference between the image data of each frame and the template is as shown in the graphs of the figures.
  • FIG. 7 is a graph showing the transition of the minimum value of the difference from the template in each frame when light stimulation is applied to the pupil three times at a fixed interval using the illumination light for stimulus addition.
  • As shown in FIG. 7, the minimum value appears as a discontinuous value when the subject blinks.
  • Similarly, in a graph showing the transition of the pupil center coordinates, the pupil center coordinates appear as discontinuous values when the subject blinks or closes the eyes.
  • Therefore, the data processing/analysis unit 10 extracts from the moving image data the image data of the portions in which transitions such as pixel values and pupil center coordinates are discontinuous, and excludes them as image data captured when the subject blinks or the eyes are closed.
  • Specifically, a predetermined threshold value is set separately for the differential amount (amount of change) in the graph of FIG. 7, and when the differential amount is equal to or greater than the predetermined threshold value, a "blink" can be determined.
  • Alternatively, the graph showing the transition of the minimum value in FIG. 7 can be smoothed several times, the difference between the minimum value before and after smoothing taken for each frame, and the image data in which this difference becomes large excluded as image data captured when blinking or with the eyes closed.
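  • For illustration, the following sketch (with hypothetical window size, number of smoothing passes, and threshold) flags frames whose minimum difference value departs strongly from its smoothed transition, following the idea above.
```python
import numpy as np

def blink_frames(min_diff: np.ndarray, window: int = 5, passes: int = 3,
                 threshold: float = 50.0) -> np.ndarray:
    """Boolean mask of frames regarded as blinks / closed eyes."""
    smoothed = min_diff.astype(float)
    kernel = np.ones(window) / window
    for _ in range(passes):   # smooth the transition of the minimum value several times
        smoothed = np.convolve(smoothed, kernel, mode="same")
    # a large gap between the raw and smoothed values indicates a discontinuity
    return np.abs(min_diff.astype(float) - smoothed) >= threshold

# Frames where the mask is True are excluded from pupil measurement and can
# instead be counted as "blink count" data.
```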
  • Although a light stimulus is used here as the stimulus applied to the subject, the stimulus may instead be sound, smell, or something that touches the subject.
  • For example, plosive or explosive sounds, incense, and aromas can be used.
  • The stimulus may be one that relieves the subject's stress or one that applies stress to the subject; for example, hot springs, footbaths, hot air, and cold air are conceivable.
  • The "blink" data obtained when extracting image data captured when the subject blinks or closes the eyes may be diverted to other analyses as "blink count" data, or the data may be transmitted to the external device 2.
  • It is also possible to extract the change at the time of a stimulus response as data by searching for the start point, apex, and end point of the change in pixel values and pupil center coordinates at the time of the stimulus response.
  • the start position of the pupil response due to light stimulation can be detected by, for example, storing the light emission timing of stimulation light in association with moving image data and using the light emission timing data during image processing.
  • the search of the start point, the apex point, and the end point position of the change at the time of stimulus response can be performed using other known methods.
  • Next, the data processing/analysis unit 10 searches for the edge of the pupil in at least one of the left, right, up, down, and diagonal directions for each frame of the moving image, starting from the pupil center coordinates obtained by the pupil template matching, and thereby extracts pupil diameter data.
  • In this search, the pixel value at the pupil center coordinates and a separately set threshold value are used, and if the pixel value at the search position is outside the range from 0 to "pixel value at the pupil center coordinates + threshold value", the position can be determined to be outside the pupil.
  • Pupil diameter data can then be obtained from the pupil end points found in this way.
  • Alternatively, when searching for the edge of the pupil, the average value and standard deviation of the pixel values around the pupil center may be calculated, a range such as "average value + 3 times the standard deviation" may be set, and pixels within that range may be regarded as being inside the pupil.
  • The diameter data may be averaged into a single feature value as shown in FIG. 9, or the diameter data in the horizontal direction and the diameter data in the vertical direction may be treated as separate feature values.
  • Fig. 9 shows the pupil diameter data in each frame. The pupil diameter data changes depending on the stimulus response.
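  • The radial search can be sketched as below; this is an illustration only, assuming a dark pupil in a monochrome image, and the threshold and the choice of search directions are hypothetical parameters.
```python
import numpy as np

def pupil_diameters(eye_region: np.ndarray, center, threshold: float = 25.0):
    """Walk outward from the pupil centre until the pixel value leaves the range
    [0, value at centre + threshold]; opposite reaches added together give
    horizontal and vertical diameter data (in pixels)."""
    cy, cx = center
    limit = float(eye_region[cy, cx]) + threshold
    H, W = eye_region.shape
    reach = {}
    for dy, dx in [(0, 1), (0, -1), (1, 0), (-1, 0)]:   # diagonal directions could be added
        y, x, steps = cy, cx, 0
        while 0 <= y + dy < H and 0 <= x + dx < W and float(eye_region[y + dy, x + dx]) <= limit:
            y, x, steps = y + dy, x + dx, steps + 1
        reach[(dy, dx)] = steps
    horizontal = reach[(0, 1)] + reach[(0, -1)]
    vertical = reach[(1, 0)] + reach[(-1, 0)]
    return horizontal, vertical, (horizontal + vertical) / 2.0   # separate or averaged feature values
```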
  • Next, the data processing/analysis unit 10 updates the pupil template based on the pixel value at the center of the pupil and the pupil diameter data obtained by processing the immediately preceding frame, and applies the updated template to pupil extraction in the next frame.
  • the data used for updating the pupil template may be data obtained by averaging not only the data obtained in the immediately preceding frame but also the data obtained in all the previous frames.
  • the pupil template is customized for each subject, so that the pupil extraction accuracy is improved.
  • When the difference between the templates before and after the update becomes equal to or less than a separately set threshold value, the data processing/analysis unit 10 considers that a template suitable for the subject has been created, and the template update can be terminated thereafter. As a result, unnecessary template update steps can be omitted and the processing can be performed quickly.
  • Note that when updating the pupil template, the data processing/analysis unit 10 does not use the data of the immediately preceding frame if that frame is image data captured when blinking or with the eyes closed. Likewise, when averaging the data of all frames, image data captured when blinking or with the eyes closed is not used.
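  • A sketch of this per-frame update, reusing the make_pupil_template helper from the earlier sketch; the measured values are assumed to come from the preceding frame (or an average over previous frames), and blink frames are skipped as described above.
```python
import numpy as np
# make_pupil_template is the helper defined in the earlier template-creation sketch.

def update_template(prev_template: np.ndarray, measured_radius_px: float,
                    measured_pupil_value: float, measured_iris_value: float,
                    is_blink_frame: bool) -> np.ndarray:
    """Rebuild the pupil template from the latest measurements; keep the previous
    template unchanged if the preceding frame was a blink / closed-eye frame."""
    if is_blink_frame:
        return prev_template
    size = prev_template.shape[0]          # keep the template size fixed between updates
    return make_pupil_template(size, int(round(measured_radius_px)),
                               measured_pupil_value, measured_iris_value)
```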
  • Next, the data processing/analysis unit 10 performs a correction to make the data relating to the pupil of each frame relative to a predetermined reference value.
  • That is, the pupil diameter data obtained by the above processing is a relative value in pixel units within one frame and is not a value that is uniform between frames. Therefore, the data processing/analysis unit 10 obtains, in each frame, the distance between a plurality of points whose positional relationship does not change over time, such as the outer corner and inner corner of the eye, and corrects the pupil diameter data according to the relative change of this distance between frames so that the pupil diameter data is unified between the frames.
  • For example, as illustrated in the figures, if the distance between the outer and inner corners of the eye in a reference frame is D and the corresponding distance obtained in the n-th frame is Dn, the correction factor Dn/D is obtained.
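  • A minimal sketch of this correction, assuming the eye-corner distance has been measured in pixels for every frame; the diameters are divided by the factor Dn/D so that apparent scale changes between frames cancel out.
```python
import numpy as np

def correct_diameters(diameters_px: np.ndarray, corner_dist_px: np.ndarray,
                      reference_frame: int = 0) -> np.ndarray:
    """diameters_px[n] and corner_dist_px[n] are per-frame pixel measurements;
    the returned diameters are comparable between frames."""
    D = corner_dist_px[reference_frame]    # eye-corner distance in the reference frame
    ratio = corner_dist_px / D             # Dn / D for each frame n
    return diameters_px / ratio            # undo the apparent scale change of each frame
```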
  • Next, the data processing/analysis unit 10 measures the absolute value of the pupil diameter data. That is, the pupil diameter data obtained by the above processing is a relative value in pixel units, and its absolute value varies depending on the distance between the image capturing unit 5 and the subject. Therefore, the data processing/analysis unit 10 calculates the absolute distance per pixel using, as parameters, the distance and displacement between the image capturing unit 5 and the subject measured and stored in each frame by the distance sensor or displacement sensor of the image capturing unit 5.
  • Alternatively, a reference object of known size or length (a patch, a sticker, or the like) may be placed near the subject, and the absolute length per pixel may be calculated from the size of that object in the image. It is also possible to photograph the subject together with an object whose length can be measured, such as a sticker with a scale, and calculate the absolute length per pixel in the same way.
  • In addition, the length and size of features around the eyes, such as the subject's eyelashes and nose and the distance from the outer corner to the inner corner of the eye, may be measured separately and given as reference parameters.
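  • As a simple illustration with made-up numbers, the absolute length per pixel can be derived from such a reference object and then applied to the corrected pixel diameters.
```python
def mm_per_pixel(reference_length_mm: float, reference_length_px: float) -> float:
    """Absolute length represented by one pixel, from a reference object of known size."""
    return reference_length_mm / reference_length_px

# Example: a 10 mm sticker spanning 80 pixels gives 0.125 mm per pixel,
# so a pupil measured as 36 pixels corresponds to about 4.5 mm.
scale = mm_per_pixel(10.0, 80.0)
pupil_diameter_mm = 36.0 * scale
```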
  • The parameter setting/management unit 11 is configured to be able to set parameters necessary for the processing and control of the data measuring device 1, and manages the set parameters.
  • The data storage unit 12 is configured by an HDD or the like, and manages and holds image data input from the outside, moving image data captured by the image capturing unit 5, moving image data that has undergone image processing by the data processing/analysis unit 10, and temporary data generated during image processing.
  • the display unit 13 may be a CRT, liquid crystal, organic EL, plasma, or projection display.
  • The display unit 13 also displays information on the state of each component of the data measuring device 1, including the data processing/analysis unit 10, as well as information given from the external device 2. A configuration in which the display unit 13 also functions as the user interface unit 7, such as a touch panel, can also be adopted.
  • The control unit 3, external communication unit 4, I/O unit 8, memory unit 9, data processing/analysis unit 10, parameter setting/management unit 11, data storage unit 12, and display unit 13 can be configured as a general personal computer, and the data measuring device 1 can be configured by attaching the image capturing unit 5, illumination unit 6, and user interface unit 7 to it.
  • First, the illumination unit 6 illuminates the subject's eyes, and the image photographing unit 5 captures a moving image of the subject's eyes. At this time, the illumination unit 6 may irradiate illumination light for stimulus addition.
  • the data processing / analysis unit 10 extracts an eye area from the face image data photographed by the image photographing unit 5.
  • the data processing/analysis unit 10 creates a pupil template.
  • the pupil template is created using data such as pupil size, pupil pixel value average, and iris pixel value average obtained in advance by analyzing moving images of a plurality of human eye regions.
  • Next, the data processing/analysis unit 10 extracts the pupil by performing pupil template matching for each frame of the moving image on the eye region extracted from the face image data.
  • FIG. 11 shows a flowchart of the template matching and pupil extraction method.
  • First, when face image data is input to the data processing/analysis unit 10 (step S1), the data processing/analysis unit 10 extracts the eye region (step S2) and performs template matching using the created pupil template (step S3). The position at which the difference between the image data of the eye region of each frame and the pupil template is minimized is determined to be the center of the pupil, and the pupil center coordinates are thereby detected (step S4). At this time, the data processing/analysis unit 10 extracts the image data of the portions of the moving image data in which transitions such as pixel values and pupil center coordinates are discontinuous, and excludes them as image data captured when the subject blinks or the eyes are closed.
  • Next, starting from the pupil center coordinates obtained by the pupil template matching, the edge of the pupil is searched for in at least one of the left, right, up, down, and diagonal directions for each frame of the moving image, and the pupil diameter data is extracted (step S5). Subsequently, it is determined whether or not there is a next frame (step S6); if there is a next frame, steps S1 to S5 are repeated, and if there is no next frame, the process is terminated. A rough sketch of this per-frame flow is given below.
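  • The per-frame flow of steps S1 to S6 can be sketched roughly as follows, wiring together the helpers from the earlier sketches (match_pupil, pupil_diameters); the eye-region extraction is left as a placeholder, and all names are assumptions rather than the patent's terminology.
```python
def extract_eye_region(frame):
    # Placeholder: assume the frame has already been cropped to the eye region (step S2).
    return frame

def measure_sequence(frames, template):
    """Steps S1-S6: detect the pupil centre and diameter data in every frame."""
    results, prev_center = [], None
    for frame in frames:                                          # S1 / S6: loop over frames
        eye = extract_eye_region(frame)                           # S2: eye-region extraction
        center, min_diff = match_pupil(eye, template, prev_center)  # S3, S4: matching and centre
        h, v, avg = pupil_diameters(eye, center)                  # S5: diameter data
        results.append({"center": center, "diameter_px": avg, "min_diff": min_diff})
        prev_center = center                                      # narrows the next search area
    return results
```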
  • FIG. 12 shows a flowchart of a method for updating the pupil template.
  • First, when the first frame of the moving image is input to the data processing/analysis unit 10 (step S11), the data processing/analysis unit 10 extracts the eye region (step S12) and performs template matching using the created pupil template (step S13). The pupil center coordinates are thereby detected (step S14), and the pupil diameter data is extracted (step S15). Next, the data processing/analysis unit 10 updates the pupil template based on the data obtained by processing the immediately preceding frame or on the average data of all frames (step S16).
  • At this time, the data processing/analysis unit 10 determines whether or not the difference between the templates before and after the update is equal to or less than a separately set predetermined threshold value; once the template has been updated and the difference falls below the predetermined threshold, template matching can thereafter be performed without updating the template (see the sketch below).
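  • The stopping test can be expressed, for example, as a mean absolute pixel difference between the templates before and after the update, compared with a separately set threshold; the value used here is purely hypothetical.
```python
import numpy as np

def template_converged(old_template: np.ndarray, new_template: np.ndarray,
                       threshold: float = 2.0) -> bool:
    """True once consecutive templates differ by no more than the threshold,
    after which further template updates can be skipped."""
    diff = np.abs(new_template.astype(float) - old_template.astype(float)).mean()
    return float(diff) <= threshold
```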
  • the update of the template will be described in detail.
  • For updating the template, the pupil diameter data obtained by pupil extraction, the averaged pixel values around the pupil center, and the pixel values outside the pupil are used.
  • The pixel value outside the pupil is calculated by averaging the pixel values in a predetermined area outside the area determined to be inside the pupil, that is, outside the pupil area calculated from the pupil diameter data.
  • The area to be averaged is, for example, the ring-shaped region bounded by concentric circles lying a predetermined number of pixels or a predetermined distance outside the pupil radius.
  • The predetermined number of pixels or distance is set by the parameter setting unit of the image processing apparatus.
  • Alternatively, the region obtained by removing the area inside the pupil from the part of the eye other than the white (sclera) area can be used as the extra-pupil region (that is, the iris region), and the pixel values in that region can be averaged (a sketch of the ring-average computation follows below).
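  • For illustration, the ring-shaped extra-pupil average can be computed as below; the ring width stands in for the predetermined number of pixels or distance set by the parameter setting unit.
```python
import numpy as np

def iris_pixel_average(eye_region: np.ndarray, center, pupil_radius_px: float,
                       ring_width_px: float = 4.0) -> float:
    """Average pixel value in the concentric ring just outside the measured pupil,
    used as the extra-pupil (iris) value when updating the template."""
    cy, cx = center
    yy, xx = np.mgrid[0:eye_region.shape[0], 0:eye_region.shape[1]]
    r = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    ring = (r > pupil_radius_px) & (r <= pupil_radius_px + ring_width_px)
    return float(eye_region[ring].mean())
```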
  • Next, when the next frame is input to the data processing/analysis unit 10 (step S17), the data processing/analysis unit 10 extracts the eye region (step S18) and performs template matching again using the updated pupil template (step S19). The pupil center coordinates are then detected (step S20), and the pupil diameter data is extracted (step S21). Next, it is determined whether or not there is a next frame (step S22); if there is a next frame, the pupil template is updated again (step S16), and if there is no next frame, the process ends.
  • Next, the data processing/analysis unit 10 performs correction so that the pupil diameter data is unified between the frames, since the pupil diameter data obtained in each frame is only a relative value.
  • Next, the data processing/analysis unit 10 measures the absolute value of the pupil diameter data. That is, the absolute distance per pixel is calculated using, as parameters, the distance and displacement between the image capturing unit 5 and the subject in each frame, which are measured and stored by the distance sensor or displacement sensor of the image capturing unit 5.
  • the data processing / analyzing unit 10 transmits the measurement result to the external device 2.
  • As described above, the pupil template is updated using data from frames for which data measurement has already been performed, so the pupil template can be customized for each living body and the pupil extraction accuracy is improved.
  • Further, although the data relating to the pupil is only a relative value in pixel units within one frame, calculating the absolute distance per pixel using the distance or displacement from the living body as a parameter makes it possible to obtain the absolute value of the data relating to the pupil.
  • In addition, the apparatus configuration can be simplified.
  • The data measuring device 1 of the present embodiment can be used, for example, to determine the degree of concentration and fatigue of subjects during personal computer work. It can also be used to determine the subject's psychological stress level by applying stimuli other than light stimuli (sound, aroma, incense, footbath, etc.). Furthermore, it can be used to measure the degree of emotion and attention of a subject who is watching a movie, a TV program, or news.
  • According to the data measuring device and the data measuring method of the present invention, it is possible to measure data relating to the pupil of a living body with high accuracy without contacting the living body.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a data measurement system and a data measurement method capable of accurately measuring data relating to the pupil of a human body without contact with the body. The data measurement device (1) comprises: an image capturing unit (5) for capturing a moving image of the area around an eye of the human body; and a data analysis unit (10) for extracting the pupil by template matching in each frame of the moving image captured by the image capturing unit (5) and measuring the data relating to the pupil from its change at that time. The data analysis unit (10) updates the template for each frame using the pupil data of frames for which data measurement has already been performed. The data relating to the pupil consist of a pixel value around the pupil and pupil diameter data.
PCT/JP2007/058430 2006-04-27 2007-04-18 Dispositif de mesure de donnees et procede de mesure de donnees WO2007125794A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-123776 2006-04-27
JP2006123776 2006-04-27

Publications (1)

Publication Number Publication Date
WO2007125794A1 true WO2007125794A1 (fr) 2007-11-08

Family

ID=38655323

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/058430 WO2007125794A1 (fr) 2006-04-27 2007-04-18 Dispositif de mesure de donnees et procede de mesure de donnees

Country Status (1)

Country Link
WO (1) WO2007125794A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08105724A (ja) * 1994-10-05 1996-04-23 Fujitsu Ltd 3次元形状計測装置および3次元形状計測方法
JP2002056394A (ja) * 2000-08-09 2002-02-20 Matsushita Electric Ind Co Ltd 眼位置検出方法および眼位置検出装置
WO2004012142A1 (fr) * 2002-07-26 2004-02-05 Mitsubishi Denki Kabushiki Kaisha Appareil de traitement d'images
JP2005078311A (ja) * 2003-08-29 2005-03-24 Fujitsu Ltd 顔部位の追跡装置、眼の状態判定装置及びコンピュータプログラム
JP2005348832A (ja) * 2004-06-08 2005-12-22 National Univ Corp Shizuoka Univ 実時間瞳孔位置検出システム
WO2006013803A1 (fr) * 2004-08-03 2006-02-09 Matsushita Electric Industrial Co., Ltd. Dispositif d’imagerie et procede d’imagerie
JP2006099718A (ja) * 2004-08-30 2006-04-13 Toyama Prefecture 個人認証装置

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009106382A (ja) * 2007-10-26 2009-05-21 Naoji Kitajima 音響性瞳孔反応検査システム
JP2014515291A (ja) * 2011-05-20 2014-06-30 アイフルエンス,インコーポレイテッド 頭部、眼、眼瞼および瞳孔の反応を測定するためのシステムおよび方法
US9931069B2 (en) 2011-05-20 2018-04-03 Google Llc Systems and methods for measuring reactions of head, eyes, eyelids and pupils
CN104114079A (zh) * 2011-10-24 2014-10-22 Iriss医疗科技有限公司 用于识别眼部健康状况的系统和方法
JP2014530730A (ja) * 2011-10-24 2014-11-20 アイリス・メディカル・テクノロジーズ・リミテッド 目の症状を特定するためのシステムおよび方法
WO2015075894A1 (fr) * 2013-11-19 2015-05-28 日本電気株式会社 Dispositif d'imagerie, dispositif d'imagerie de pupille, dispositif de mesure de diamètre de pupille, dispositif de détection d'état de pupille, et procédé d'imagerie de pupille
CN105310703B (zh) * 2014-07-02 2018-01-19 北京邮电大学 一种基于用户瞳孔直径数据获取主观满意度的方法
CN105310703A (zh) * 2014-07-02 2016-02-10 北京邮电大学 一种基于用户瞳孔直径数据获取主观满意度的方法
CN104173063B (zh) * 2014-09-01 2015-08-12 北京工业大学 一种视觉注意的检测方法及系统
WO2016033950A1 (fr) * 2014-09-01 2016-03-10 北京工业大学 Procédé et système de détection d'une attention visuelle
CN104173063A (zh) * 2014-09-01 2014-12-03 北京工业大学 一种视觉注意的检测方法及系统
JP2016159050A (ja) * 2015-03-04 2016-09-05 富士通株式会社 瞳孔径測定装置、瞳孔径測定方法及びそのプログラム
CN112331003A (zh) * 2021-01-06 2021-02-05 湖南贝尔安亲云教育有限公司 一种基于差异化教学的习题生成方法和系统
CN112331003B (zh) * 2021-01-06 2021-03-23 湖南贝尔安亲云教育有限公司 一种基于差异化教学的习题生成方法和系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07741866

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07741866

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP