WO2014054293A1 - Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transitory recording medium - Google Patents
Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transitory recording medium
- Publication number
- WO2014054293A1 (PCT/JP2013/005924)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- temperature
- visible light
- image data
- sleepiness
- infrared
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—By temperature mapping of body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B60K28/06—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
Definitions
- The present invention belongs to the technical field of analyzing a user's biological rhythm.
- Biological-rhythm analysis is a technique for determining, from image data obtained by photographing the user, whether the user can no longer resist sleepiness.
- One known application of biological-rhythm analysis is a wakefulness detection device that detects the awake state.
- Wakefulness detection judges whether the user is awake or dozing by detecting the eye region in an image of the user and measuring the degree of eyelid droop and the number of blinks.
- Conventional wakefulness detection through image processing aims to detect whether a person is dozing, based on the number of blinks and the degree of eyelid droop.
- However, the dozing state is one in which the person can no longer endure drowsiness; even if such a state is detected, it may already be too late to take effective measures to prevent an accident.
- An object of the present invention is to detect with high accuracy the arrival of a transition period when transitioning from an awake state to a drowsiness state.
- image processing means for specifying the center region of the subject's eye by performing image processing on visible light image data;
- correction means for detecting the temperature of the center region of the eye from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting the temperature parameter for sleepiness estimation using that center-region temperature.
- The estimation means estimates whether the user is drowsy using the temperature parameter for sleepiness estimation. Because this parameter is corrected with the temperature of the center region of the eye, the influence of the surrounding environment on the parameter is largely cancelled out. The temperature change inherent to the user's body during the transition from the awake state to the drowsy state is thus reflected in the parameter, so the parameter gives a sound basis for judging how drowsy the user is.
- FIG. 1(a) shows the external appearance of the sleepiness estimation apparatus 100.
- FIG. 1(b) shows an installation example in which the drowsiness estimation apparatus 100 is installed inside a car. Another figure shows the internal structure of the sleepiness estimation apparatus 100.
- (A) shows visible light image data obtained by photographing a user who holds the steering wheel at the driver's seat.
- (B) shows infrared image data.
- (C) shows the correspondence between the pixel value of the pixel in the infrared image data and the surface temperature.
- (d) shows the bit assignment of the pixel data of an infrared image.
- (a) shows the detection result of the face region, the detection result of the eye region, and the detection result of the forehead region.
- (B) is an enlarged view of the eye area detected by the eye area detection.
- (C) shows where in the infrared image data the part to be converted is.
- (d) shows the temporal transitions of Tf(t) and Tc(t). The procedure for deriving the difference of Td is also shown, as is the relationship between biological rhythm and temperature.
- A flowchart shows the main routine of the processing sequence of the drowsiness estimation apparatus; another flowchart shows the subroutine for calculating the deep body temperature.
- The internal structure of the sleepiness estimation apparatus according to the second embodiment is shown.
- A derived configuration of the sleepiness estimation apparatus according to the second embodiment is shown.
- FIG. 1 shows a drowsiness estimation device installed indoors for controlling indoor equipment.
- (a) is an external view of the sleepiness estimation apparatus 100.
- (b) shows a visible light image and an infrared image side by side.
- (c) shows a conversion formula for obtaining (Xi, Yi) from (Xv, Yv).
- According to Non-Patent Document 1, the biological rhythm of drowsiness and arousal is governed by a biological clock.
- The biological clock determines a target temperature value (set point) for maintaining the biological rhythm of sleepiness and arousal.
- The heat production and heat release mechanisms adjust heat production and release so that the body temperature approaches the set point. Since the biological clock resides in the suprachiasmatic nucleus, the deep body temperature must be measured in order to obtain the set point that maintains the biological rhythm. Deep body temperature is the temperature of deep parts of the body such as the heart and brain. Because the deep body temperature is difficult to measure directly, a site that is easy to measure and close to the deep temperature is selected instead.
- Patent Document 1 describes a prior-art technique that detects the transition period from the arousal state to a dozing state by measuring the body temperature at such an easy-to-measure site close to the deep temperature.
- The prior art described in Patent Document 1 calculates, for each fixed time interval, moving averages of the nasal skin temperature change and the masseter-muscle skin temperature change; from the calculated moving averages, it detects the skin temperature change caused by the decrease in deep body temperature, and determines signs of dozing or an actual dozing state.
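The moving-average step of this prior-art technique can be sketched as follows. This is a minimal illustration, assuming a window of 3 samples and illustrative temperature values; neither is specified in Patent Document 1.

```python
# Sketch of the prior-art moving-average step: skin temperature samples are
# averaged over a fixed window so that a sustained drop, rather than
# measurement noise, signals a fall in deep body temperature.
# The window length of 3 samples is an assumed illustration.

def moving_average(samples, window=3):
    """Moving average over the most recent `window` samples, rounded for display."""
    return [round(sum(samples[i - window + 1:i + 1]) / window, 2)
            for i in range(window - 1, len(samples))]

# A slowly falling skin-temperature series (assumed values, in deg C).
print(moving_average([36.5, 36.4, 36.3, 36.2, 36.1]))  # → [36.4, 36.3, 36.2]
```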
- In Patent Document 1, the influence of the ambient temperature around the user must be taken into account when detecting the skin temperature, so a separate sensor for measuring the air temperature is required.
- The sleepiness estimation device of the present invention detects sleepiness using a temperature parameter for sleepiness estimation obtained from a subject, and comprises: acquisition means for obtaining visible light image data obtained by photographing the subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface; image processing means for specifying the center region of the subject's eye by performing image processing on the visible light image data; and correction means for detecting the temperature of the center region of the eye from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting the temperature parameter for sleepiness estimation using that center-region temperature.
- The temperature of the center region of the eye is the temperature of the eyeball surface and represents the temperature of the cornea, which is not under blood-flow control.
- The eyeball surface temperature changes greatly with changes in the environmental temperature, but its changes also differ from changes in the eardrum temperature, esophageal temperature, mean skin temperature, and forehead skin temperature.
- The detection of the center of the eye region in aspect 1 can be developed into a more specific form based on the image content of the visible light image. That is, the image processing means may specify the center region of the eye by detecting the contour shape of the cornea from the image of the subject's face region represented in the visible light image data and specifying the center of the arc indicating that contour shape.
- The contour of the cornea forms an arc within the eye region. Therefore, using this arc region as a clue, the center of the eye region can be identified relatively easily.
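As an illustration of how the arc can serve as such a clue, the sketch below fits a circle to corneal contour points by a least-squares (Kasa) fit and reads off its center. The specification does not name a particular fitting method, so this choice and the sample coordinates are assumptions.

```python
# Hedged sketch: estimate the eye-center coordinate from corneal contour
# points by fitting the circle x^2 + y^2 = a*x + b*y + c (Kasa fit);
# the center is then (a/2, b/2). The fitting method is an assumption.
import math

def fit_circle(points):
    """Least-squares circle fit; returns the center (a/2, b/2)."""
    # Accumulate the 3x3 normal equations A^T A p = A^T d.
    ata = [[0.0] * 3 for _ in range(3)]
    atd = [0.0] * 3
    for x, y in points:
        row = (x, y, 1.0)
        d = x * x + y * y
        for i in range(3):
            atd[i] += row[i] * d
            for j in range(3):
                ata[i][j] += row[i] * row[j]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    m = [ata[i] + [atd[i]] for i in range(3)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    p = [0.0] * 3
    for i in (2, 1, 0):
        p[i] = (m[i][3] - sum(m[i][j] * p[j] for j in range(i + 1, 3))) / m[i][i]
    a, b, _ = p
    return a / 2.0, b / 2.0

# Points on only the upper arc of a circle centered at (120, 80), radius 15,
# mimicking a cornea whose lower part is hidden by the eyelid (assumed data).
arc = [(120 + 15 * math.cos(t), 80 + 15 * math.sin(t))
       for t in [i * 0.1 for i in range(10, 22)]]
xc, yc = fit_circle(arc)
print(round(xc), round(yc))  # → 120 80
```

Even though only a partial arc is visible, the fit recovers the full circle's center, which is why the hidden lower cornea does not prevent locating (xec, yec).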
- The source from which the deep body temperature for sleepiness estimation is acquired can also be made more specific. Deep body temperature appears most accurately in the axilla, oral cavity, tympanic membrane, and rectum.
- However, attaching a sensor to the axilla, oral cavity, tympanic membrane, or rectum can only be done by a medical worker and is not feasible for general users. Even attachment to a hand or foot is not acceptable in a product if the accuracy depends on how the sensor is attached.
- In one aspect, the temperature parameter for estimating sleepiness is the body temperature of the forehead portion of the face region, and the correction means may obtain this temperature parameter by detecting the temperature of the forehead portion from the body surface temperature distribution shown in the body surface temperature distribution data.
- The forehead reflects the deep body temperature, although the influence of the environmental temperature must be removed.
- Since the face region and the eye region are detected, acquiring the temperature of the forehead region located above the eye region within the face region easily yields a deep body temperature that can substitute for measurement at the axilla, oral cavity, tympanic membrane, or rectum.
- The sleepiness estimation apparatus can thus detect a temperature change of the human body without attaching a temperature sensor directly to the body, and can estimate sleepiness from that change. Since no sensor needs to be attached to the human body, the apparatus is well suited to practical use.
- The source from which the parameter for sleepiness estimation is acquired can also be made more specific. That is, in any one of aspects 1 to 3, the temperature parameter for estimating sleepiness is the temperature of a part of the face region excluding the area around the mouth and the hair, and the correction means may obtain the temperature parameter by detecting the temperature of any such part from the facial temperature distribution shown in the body surface temperature distribution data.
- The correction of the temperature parameter for sleepiness estimation may be performed as follows: the temperature parameter for sleepiness estimation and the center-region temperature of the eye are multiplied by first and second weighting factors, respectively, and the center-region temperature multiplied by the second weighting factor is subtracted from the temperature parameter multiplied by the first weighting factor. Multiplying the two temperatures by weighting factors converts them into quantities suited to sleepiness estimation, and such conversion increases the accuracy of the estimation.
- The body surface temperature distribution data may be infrared image data consisting of a plurality of pixels at a predetermined resolution, and the luminance of the color components of each pixel in the infrared image data may indicate the amount of infrared radiation emitted from the corresponding part of the human body surface shown in the visible light image data.
- Thereby the deep body temperature can be calculated in an appropriate numerical range regardless of how the human body image appears in the infrared image data, so the detection accuracy of the deep body temperature can be improved.
- Coordinate conversion can be introduced as a bridging process between eye-center detection and temperature conversion. That is, in aspect 6 above, the visible light image and the infrared image have different resolutions; the image processing means specifies the center region of the eye using the X and Y coordinates of the coordinate system of the visible light image data; and the temperature correction means converts the X and Y coordinates of the center region of the eye, then converts into a temperature the pixel value of the pixel located at the converted coordinates in the infrared image. The coordinate conversion by the temperature correction means may be performed by multiplying the X coordinate of the center region of the eye by the ratio of the horizontal pixel counts of the visible light image and the infrared image (and the Y coordinate by the ratio of the vertical pixel counts), and adding the horizontal or vertical offset resulting from the difference in imaging position.
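The coordinate conversion described in this aspect can be sketched as follows; the image resolutions and mounting offsets below are illustrative assumptions, not values from the specification.

```python
# Hedged sketch of the coordinate conversion: an eye-center coordinate found
# in the visible-light image is mapped into the lower-resolution infrared
# image by scaling with the pixel-count ratio and adding an offset for the
# misalignment between the two imaging positions.

VIS_W, VIS_H = 640, 480   # visible-light image resolution (assumed)
IR_W, IR_H = 160, 120     # infrared image resolution (assumed)
OFF_X, OFF_Y = 2, -1      # horizontal/vertical offsets from imaging (assumed)

def vis_to_ir(xv, yv):
    """Map (Xv, Yv) in the visible image to (Xi, Yi) in the infrared image."""
    xi = int(xv * IR_W / VIS_W) + OFF_X
    yi = int(yv * IR_H / VIS_H) + OFF_Y
    return xi, yi

print(vis_to_ir(320, 240))  # → (82, 59)
```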
- What state is defined as feeling sleepy can also be specified more concretely. That is, in aspect 6 or 7, the visible light image data and the infrared image data are obtained by photographing the subject at each of a plurality of time points in a measurement time zone; the specification of the center region of the subject's eye by the image processing means, the acquisition of infrared image data by the acquisition means, and the correction of the temperature parameter for sleepiness estimation by the correction means are performed at each of those time points; and the sleepiness estimation may be performed by determining whether the corrected temperature parameter obtained by photographing the subject at a given measurement point shows a decreasing tendency compared with the corrected temperature parameter obtained at a past measurement point, and whether the decrease exceeds a predetermined threshold.
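The decision rule in this aspect can be sketched as a comparison of the corrected deep body temperature at two measurement points; the threshold value below is an assumption, since the specification leaves it as a predetermined constant.

```python
# Sketch of the transition-period decision: report a transition from
# wakefulness to drowsiness when the deep body temperature is falling and
# the drop from the previous measurement exceeds a threshold.

THRESHOLD = 0.3  # required drop in deep body temperature, in deg C (assumed)

def entering_drowsiness(td_prev, td_now, threshold=THRESHOLD):
    """True when Td is decreasing and the decrease exceeds the threshold."""
    delta = td_prev - td_now
    return delta > threshold

print(entering_drowsiness(36.8, 36.4))  # → True
print(entering_drowsiness(36.8, 36.7))  # → False
```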
- The change in deep body temperature typically appears in the transition period from the awake state to the drowsy state, so measures such as issuing an alarm can be taken while the user still feels only mild sleepiness. Moreover, since whether the user is in the transition period is judged from the difference between deep body temperatures obtained at several measurement times, the accuracy of sleepiness estimation can be improved.
- Imaging means can be added as an optional component. That is, in any of aspects 6 to 8, imaging means that can be switched between a first mode that transmits visible light and blocks infrared light and a second mode that transmits infrared light and blocks visible light may obtain the visible light image data and the infrared image data by switching between the first and second modes.
- Since a single imaging means obtains both the visible light image for the visible wavelength region and the infrared image data for the infrared wavelength region, the correspondence between pixel coordinates in the visible light image and pixel coordinates in the infrared image data is precise. Thereby, the temperature of the part found by eye region detection on the visible light image data can be accurately derived from the infrared image data.
- Alternatively, the temperature parameter for estimating sleepiness may be obtained from a contact sensor attached to the back of the arm, the instep of the foot, or the collarbone.
- In that case, a highly reliable contact-sensor measurement serves as the temperature parameter on which sleepiness estimation is based. Since the deep body temperature can then be measured by the method described in Non-Patent Document 3, the reliability of sleepiness estimation can be improved.
- The method of this aspect is a sleepiness estimation method that performs sleepiness estimation using a temperature parameter obtained from a subject: acquire visible light image data obtained by photographing the subject in the visible light wavelength range and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface; identify the center region of the subject's eye by performing image processing on the visible light image data; and thereafter detect the temperature of that center region from the body surface temperature distribution indicated by the body surface temperature distribution data, and correct the temperature parameter for sleepiness estimation using the center-region temperature.
- This method aspect can be improved as described in aspects 2 to 10 above.
- Such a drowsiness estimation method can be used in places used by enterprises or end users, which broadens the application of the method invention within the technical scope of the present application.
- The recording medium of this aspect records program code that causes a computer to perform sleepiness estimation using a temperature parameter obtained from a subject.
- It is a computer-readable non-transitory recording medium on which one or more program codes are recorded that cause the computer, upon obtaining visible light image data obtained by photographing the subject in the visible light wavelength range and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface, to identify the center region of the subject's eye by performing image processing on the visible light image data, and thereafter to detect the temperature of that center region from the body surface temperature distribution indicated by the body surface temperature distribution data and correct the temperature parameter for sleepiness estimation using the center-region temperature.
- The improvements of aspects 2 to 10 described above can also be applied to this recording-medium aspect. Since the program can be distributed through network provider servers and various recording media, the application of the present invention extends to the general computer software and online service industries.
- The first embodiment concerns the realization of a sleepiness estimation apparatus that derives an appropriate deep body temperature from the center temperature of the eye region and the temperature of the forehead region, and estimates sleepiness based on it.
- The drowsiness estimation device differs from the wakefulness detection device described in the background art in that it determines the transition period from the awake state to the drowsy state; it is thus a higher-function version of the wakefulness detection device.
- The sleepiness estimation device is installed in a car and monitors the driver's biological rhythm. In addition to determining whether the user is dozing from the number of blinks and the degree of eyelid droop, it determines whether the user is in the transition period from the awake state to the drowsy state.
- FIG. 1A shows the appearance of the drowsiness estimation apparatus 100.
- the light receiving unit of the camera 101 exists on the surface of the sleepiness estimation apparatus 100.
- the camera 101 is a visible light-infrared camera, and can capture a visible light image and an infrared image with a single photographing system.
- FIG. 1(b) shows an installation example in which the drowsiness estimation apparatus 100 is installed inside a motor vehicle.
- the sleepiness estimation device 100 is installed such that the front faces a user seated in a driver's seat.
- FIG. 2 shows the internal configuration of the drowsiness estimation apparatus 100.
- The sleepiness estimation apparatus 100 includes an imaging unit 1, a switching unit 2, frame memories 3a and 3b, an image processing unit 4, a temperature correction unit 5, a site-specific temperature calculation unit 6, a weight subtraction unit 7, a history memory 8, and a drowsiness estimation unit 9.
- the imaging unit 1 uses the visible light / infrared common camera 101 to perform imaging with a wavelength in the visible light region and imaging with a wavelength in the infrared region at predetermined time intervals, and outputs pixel data for one screen.
- the imaging unit 1 includes an image sensor configured by a photoelectric conversion element such as a CCD sensor or a CMOS sensor.
- In the infrared imaging mode, the pixel value output from the image sensor for each pixel represents the temperature information of the part corresponding to that pixel.
- The imaging unit 1 creates the temperature distribution of an image by arranging, in a plane, the temperature information of the parts corresponding to the pixels, thereby obtaining infrared image data.
- The switching unit 2 stores a mode setting indicating whether the apparatus is in the visible light imaging mode or the infrared imaging mode, and, according to that setting, switches the storage destination of the image data captured by the imaging unit 1 between the frame memory 3a and the frame memory 3b.
- the frame memory 3a stores pixel data for one screen transferred from the imaging unit 1 as visible light image data.
- the visible light image data stored in the frame memory 3a is subject to face area detection and eye detection.
- the frame memory 3b stores the pixel data for one screen transferred from the imaging unit 1 as infrared image data in the infrared imaging mode.
- Visible light image data is subject to face area detection and eye detection, whereas infrared image data is subject to temperature calculation.
- Visible light image data undergoes face region and eye region detection, so the details of the subject must appear in the image, whereas infrared image data does not require the expression of such details.
- The image processing unit 4 consists of a signal processing circuit such as a digital signal processor; it performs region detection on the visible light image stored in the frame memory 3a and obtains a series of coordinates indicating the position of the desired region in the visible light image data.
- The image processing unit 4 includes a face region detection unit 4a that detects a face region, an eye region detection unit 4b that detects an eye region, and a forehead region detection unit 4c that detects a forehead region. The coordinates (xfa1, yfa1), (xfa2, yfa2), (xfa3, yfa3), (xfa4, yfa4), ... in the figure are obtained by the detection of the face region detection unit 4a.
- The coordinates (xe1, ye1), (xe2, ye2), (xe3, ye3), (xe4, ye4), ... in the figure are obtained by detection by the eye region detection unit 4b; based on these coordinates, blink determination and eyelid-droop determination for detecting the dozing state are performed.
- The coordinates (xfh1, yfh1), (xfh2, yfh2), (xfh3, yfh3), (xfh4, yfh4), ... are obtained by detection by the forehead region detection unit 4c.
- The point (xec, yec) in the figure indicates the center of the eye region, that is, the center of the cornea.
- The eye region detection unit 4b detects the eye region, detects an arc existing inside it, and obtains the coordinates of the center of the circular region of which the arc forms a part.
- The circular region is the cornea, part of which is hidden by the eyelid in the visible light image data.
- the eye area detection unit 4b detects the contour line of the circular area and derives the coordinates of the center of the circular area.
- The temperature correction unit 5 obtains the deep body temperature by correcting the sleepiness estimation parameter using the temperature of the part located at specific coordinates in the body image represented in the infrared image data.
- The site-specific temperature calculation unit 6 is a component of the temperature correction unit 5; it receives from the image processing unit 4 a coordinate group specifying a given site in the visible light image data, maps that coordinate group onto the infrared image data, and converts into temperatures the pixel values of the pixels within the specified site. This temperature conversion is performed by converting into a temperature the pixel value of the pixel located, in the infrared image data, at the coordinates acquired from the image processing unit 4.
- the imaging unit 1 obtains infrared image data by converting the amount of infrared radiation into pixel values in the infrared imaging mode.
- The site-specific temperature calculation unit 6 holds a lookup table indicating the correspondence between the amount of infrared radiation and the pixel value; using this table, it obtains the temperature of the user's body surface from the pixel values of the infrared image data. Since the coordinate group output from the image processing unit 4 includes the coordinates that define the eye region and the forehead region, the temperatures of the eye region and the forehead region are acquired through this conversion.
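The lookup-table conversion can be sketched as a piecewise-linear mapping from pixel value to surface temperature. The table entries below are assumptions, loosely following the 22 °C to 37 °C color scale described for FIG. 3(c); the actual table depends on the sensor's calibration.

```python
# Hedged sketch of the per-site temperature conversion: a lookup table
# relates infrared pixel values (0-255) to body-surface temperatures, with
# linear interpolation between entries. Table values are assumed.

LUT = [(0, 22.0), (64, 25.0), (128, 27.0), (192, 32.0), (255, 37.0)]

def pixel_to_temperature(value):
    """Convert an infrared pixel value (0-255) to a surface temperature."""
    for (v0, t0), (v1, t1) in zip(LUT, LUT[1:]):
        if v0 <= value <= v1:
            return t0 + (t1 - t0) * (value - v0) / (v1 - v0)
    raise ValueError("pixel value out of range")

print(pixel_to_temperature(128))  # → 27.0
```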
- The weight subtraction unit 7 is a component of the temperature correction unit 5. From the value obtained by multiplying the forehead region temperature Tf(t), acquired from the infrared image data at an arbitrary measurement time t, by a coefficient α, it subtracts the value obtained by multiplying the eye region temperature Tc(t) by a coefficient β, and outputs the result as the deep body temperature Td(t).
- Tf(t): forehead region temperature
- Tc(t): eye region temperature
- Td(t): deep body temperature
- The conversion equation for correcting Tf(t) using Tc(t) to obtain Td(t) is shown below.
- α is the temperature coefficient of the forehead.
- β is the temperature coefficient of the cornea.
- (Formula 1)  Td(t) = α · Tf(t) − β · Tc(t)
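Formula 1 can be sketched directly in code. The coefficient values below are illustrative placeholders, since the specification defines α and β only as the temperature coefficients of the forehead and cornea without giving concrete values.

```python
# Minimal sketch of the weighted-subtraction correction (Formula 1):
# Td(t) = alpha * Tf(t) - beta * Tc(t).

ALPHA = 1.0  # forehead temperature coefficient (assumed placeholder)
BETA = 0.1   # corneal temperature coefficient (assumed placeholder)

def deep_body_temperature(tf, tc, alpha=ALPHA, beta=BETA):
    """Correct the forehead temperature Tf with the eye-center temperature Tc."""
    return alpha * tf - beta * tc

print(round(deep_body_temperature(34.0, 33.0), 2))  # → 30.7
```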
- the history memory 8 is a memory that stores past deep body temperatures in a list format in association with the measurement time.
- the drowsiness estimation unit 9 determines whether or not the transition from the awake state to the drowsiness state is made based on the amount of change in the deep body temperature Td (t) between a plurality of time points.
- The sleepiness estimation unit 9 includes a unit that obtains the change amount ΔTd by subtracting the deep body temperature Td(t−1) at the previous measurement point from the deep body temperature Td(t) at the current measurement point t, a threshold value storage unit 9b that stores the threshold for determining whether the user is in the transition period from the awake state to the drowsy state, and a comparison unit 9c that compares the change amount ΔTd(t) with the threshold.
- the sleepiness estimation apparatus 100 is placed on the front panel of an automobile.
- the imaging unit 1 captures an image of the user sitting in the driver's seat in the visible light imaging mode and the infrared imaging mode.
- The switching unit 2 switches the output destination between the frame memory 3a and the frame memory 3b according to the mode setting, so that the visible light image data of FIG. 3(a) is obtained in the frame memory 3a and the infrared image data of FIG. 3(b) is obtained in the frame memory 3b.
- FIGS. 3(a) and 3(b) show, in association with each other, the visible light image data captured by the imaging unit in the visible light imaging mode and the infrared image data captured in the infrared imaging mode.
- In FIG. 3A, a user holding the steering wheel in the driver's seat appears.
- FIG. 3B shows infrared image data.
- the resolution of the infrared image data is lower than that of the visible light image data, and multiple pixels of the visible light image data correspond to one pixel of the infrared image data.
- portions that radiate the same amount of infrared rays are shown in the same color in the image data.
- In FIG. 3B, the part corresponding to the user's skin is displayed in the color indicating the normal temperature range of the human body.
- FIG. 3C is a graph showing the correspondence between the pixel value of the pixel in the infrared image data and the surface temperature.
- the color of the pixel in the infrared image data changes, for example, as "black → purple → red → yellow → white" in accordance with the amount of infrared radiation from the target part of the user's body.
- Such a color change from black to purple shows, for example, a temperature change from 22°C to 25°C,
- and a color change from red to white shows a temperature change from 27°C to 37°C.
- the pixel value of a pixel indicating such a temperature change is data represented as a combination of a red component, a green component, and a blue component (an RGB value), or as a combination of luminance, red difference, and blue difference (Y, Cr, Cb).
- the word length of the storage element of the frame memory 3b is 32 bits, and the pixel value data of the infrared image data is stored in the frame memory 3b with the bit assignment of FIG. 3D.
- FIG. 3D shows the bit assignment of pixel data indicating the pixel value in the infrared image data.
- Such pixel data is 32 bits long, and the four 8-bit fields from the most significant bit to the least significant bit hold the transparency, the red component, the green component, and the blue component, each in the numerical range of 0 to 255.
- The transparency exists in the pixel data of the infrared image in order to adjust whether or not a GUI or wallpaper image is displayed as a background image when the infrared image is displayed.
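- The 32-bit bit assignment and a pixel-value-to-temperature conversion can be sketched as follows. The linear black-to-white mapping onto the 22°C-37°C range is a simplifying assumption; the real correspondence is the camera-specific curve of FIG. 3C:

```python
def unpack_argb(pixel):
    """Split a 32-bit pixel word into its four 8-bit fields, from the most
    significant bit to the least significant bit: transparency (A), red (R),
    green (G) and blue (B), each in the range 0 to 255."""
    a = (pixel >> 24) & 0xFF
    r = (pixel >> 16) & 0xFF
    g = (pixel >> 8) & 0xFF
    b = pixel & 0xFF
    return a, r, g, b

def pixel_to_temperature(pixel):
    """Assumed linear mapping of the black-to-white colour scale onto the
    22°C-37°C surface temperature range of FIG. 3C."""
    _, r, g, b = unpack_argb(pixel)
    luminance = (r + g + b) / (3 * 255)       # 0.0 (black) .. 1.0 (white)
    return 22.0 + luminance * (37.0 - 22.0)

print(unpack_argb(0xFF80C020))           # (255, 128, 192, 32)
print(pixel_to_temperature(0xFFFFFFFF))  # 37.0 (white = hottest)
```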
- These visible light image data and infrared image data are processed by the face region detection unit 4a, the eye region detection unit 4b, and the forehead region detection unit 4c in the image processing unit 4. When these units perform face region detection, eye region detection, and forehead region detection, the coordinates of the detected parts in the visible light image data, as shown in FIG. 4A, are output.
- FIG. 4A shows the detection results in the visible light image data.
- FIG. 4B shows an enlarged view of the eye area detected by the eye area detection.
- (Xec, yec) in the figure indicates the center of the eye region, that is, the center of the cornea.
- Since the visible light image data is captured at a high resolution such as 1920 × 1080 or 7680 × 4320, the detection results of the face region detection unit 4a, the eye region detection unit 4b, and the forehead region detection unit 4c accurately reproduce the shapes of the user's face, eyes, and forehead and the center coordinates of the eye area. These coordinates are sent to the site-specific temperature calculation unit 6.
- The part-specific temperature calculation unit 6 converts into temperatures, among the pixel values in the infrared image data, the pixel value of the pixel located at the center of the eye region output from the eye region detection unit 4b and the pixel values surrounded by the coordinate group output from the forehead region detection unit 4c.
- FIG. 4C shows which parts of the infrared image data are subjected to temperature conversion.
- (Rec, Gec, Bec) is the pixel value of the pixel existing at (xec, yec) in the infrared image data (referred to as an RGB value in this embodiment). Since (xec, yec) is the center of the eye region, the temperature Tc of the eye region is calculated by converting the RGB value at these coordinates into a temperature.
- FIG. 4D shows temporal transitions of Tf (t) and Tc (t).
- The horizontal axis of FIG. 4D is a time axis, and the vertical axis shows temperature. In this coordinate system, the change curves of Tf(t) and Tc(t) are drawn.
- The weighted subtraction of Tf(t) and Tc(t) at an arbitrary time point t on the time axis yields Td(t).
- FIG. 5 shows the repetition of the above-described region detection and temperature conversion over the five measurement time points "t-2", "t-1", "t", "t+1", and "t+2".
- Fig. 5 shows the procedure for deriving the temporal transition of Td.
- the first tier shows the visible light image data measured at the plurality of measurement time points "t-2", "t-1", "t", "t+1", and "t+2".
- the second tier shows the infrared image data measured at the measurement time points "t-2", "t-1", "t", "t+1", and "t+2".
- the third tier shows the temperatures of the forehead region (Tf(t-2), Tf(t-1), Tf(t), Tf(t+1), Tf(t+2)) acquired from the infrared image data taken at each time point.
- the fourth tier shows the temperatures of the eye region (Tc(t-2), Tc(t-1), Tc(t), Tc(t+1), Tc(t+2)) acquired from the infrared image data taken at each time point.
- the weight subtracting unit 7 performs weighted subtraction on the forehead temperature Tf(t) and the eye center temperature Tc(t) obtained at each measurement time point to obtain the deep body temperature Td(t).
- the sleepiness estimation unit 9 calculates a difference from the deep body temperature Td (t-1) obtained at the immediately previous measurement time.
- the difference ΔTd in the deep body temperature Td is obtained for each measurement time point. From the change tendency of ΔTd, it is detected which of the awake state, the drowsiness state, and the dozing state constituting the biological rhythm the user is currently in.
- the fifth tier shows Td(t-2), Td(t-1), Td(t), Td(t+1), Td(t+2) obtained from the Tc and Tf calculated at each time point.
- the sixth tier shows the differences of the deep body temperature, ΔTd(t-1), ΔTd(t), ΔTd(t+1), at the respective measurement time points.
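- The tiers of FIG. 5 can be traced in code: per-time-point Tf and Tc values are combined into Td by the weighted subtraction, and consecutive Td values are differenced into ΔTd. All numeric readings and the coefficients α and β are illustrative assumptions:

```python
# Tf(t-2)..Tf(t+2) and Tc(t-2)..Tc(t+2), as in the third and fourth tiers
# (illustrative readings, not values from this description).
tf_series = [34.2, 34.1, 34.3, 33.8, 33.5]
tc_series = [33.0, 33.0, 33.1, 33.0, 32.9]
alpha, beta = 2.0, 1.0                      # assumed temperature coefficients

# Fifth tier: Td(t) = alpha*Tf(t) - beta*Tc(t) at each measurement time point.
td_series = [round(alpha * tf - beta * tc, 2)
             for tf, tc in zip(tf_series, tc_series)]

# Sixth tier: the difference ΔTd between consecutive measurement time points.
delta_td = [round(later - earlier, 2)
            for earlier, later in zip(td_series, td_series[1:])]

print(td_series)  # [35.4, 35.2, 35.5, 34.6, 34.1]
print(delta_td)   # [-0.2, 0.3, -0.9, -0.5]
```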
- FIG. 6 shows the relationship between biological rhythm and temperature.
- the second level shows a biological rhythm that periodically repeats the awake state, the drowsiness state, and the dozing state.
- the first level shows a change curve of Td. This curve shows how Td changes in the time zones of the awake state, the drowsiness state, and the dozing state.
- Td is almost constant in the wakefulness state, and the difference in Td between measurement time points is almost zero.
- In the transition period from the awake state to the drowsiness state, Td shows a descending line.
- In this period the difference ΔTd of Td tends to be negative, and its change width is large.
- That the difference ΔTd in the deep body temperature is negative and that its change width is large is peculiar to the transition period from the awake state to the sleepy state, and is not found in the awake state or the sleepy state. Therefore, by determining whether the difference ΔTd is negative and whether its absolute value exceeds a predetermined threshold, it can be correctly determined whether the user is in the transition period from the awake state to the sleepy state. In FIG. 6, if the change width of ΔTd is slight, a determination result that the user is dozing is given. If ΔTd indicates a sign of sleepiness, the user is entering the sleepiness state, and an alarm is issued or an external notification is made.
- the lower part of the second stage in FIG. 6 shows changes in the facial expression of the user.
- the facial expression of the user does not change between the awake state and the transition from the awake state to the sleepy state.
- In the sleepy state, the eyelids are greatly lowered, and in the dozing state, the eyelids are completely closed.
- According to the sleepiness estimation device, drowsiness can be detected at the initial stage of the sleepiness state, when the user's facial expression does not yet differ from that in the awake state.
- If the user is suddenly attacked by drowsiness, the amount of such descent becomes extremely large. By detecting the transition from the awake state to the sleepy state through this sudden drop in the deep body temperature, the sleepiness estimation apparatus 100 can detect dangerous signs as early as possible.
- The above is the principle of sleepiness estimation by the sleepiness estimation apparatus 100.
- A variable (t) in these flowcharts is a control variable that specifies the information element to be processed among the plurality of information elements existing in the data structure.
- The flowchart of FIG. 7 corresponds to the highest-level process, that is, the main routine, and the flowchart shown in FIG. 8 exists as its subordinate flowchart.
- a processing procedure in the main routine will be described.
- The loop consisting of the processing from step S1 to step S12 is repeated.
- The end condition of this loop is that the determination result in step S5 or step S11 becomes Yes, and steps S2 to S10 are repeated until this end condition is satisfied. Since the control variable is incremented every time this loop completes, the visible light image data and the infrared image data indicated by this variable are used in the processing of the loop.
- Step S1 is a determination as to whether or not the measurement time point (t) has arrived. If it has arrived, the processing of steps S2 to S5 is performed. In this processing, the optical system is switched to the visible light photographing mode and the visible light image data is obtained in the frame memory (step S2), the face area is detected from the visible light image data (t) (step S3), and an eye area is detected from the inside of the face area (step S4). In step S5, it is determined whether or not the user is in a dozing state by referring to the number of blinks in the eye region and the degree of eyelid fall. If the user is in a dozing state, the process leaves the loop, measures such as external notification and emergency stop are taken, and this flowchart ends.
- If the user is not in a dozing state, the forehead area is detected from the visible light image data (t) (step S7), the imaging unit 1 is switched to the infrared imaging mode, and the infrared image data is obtained in the frame memory (step S8).
- Td(t) is calculated from the forehead region and the eye region of the infrared image data (step S9), and the difference ΔTd(t) between the deep body temperature Td(t) measured this time and the deep body temperature Td(t-1) measured last time is calculated (step S10). In step S11, it is determined whether the sign of the difference ΔTd(t) is negative and whether the absolute value of ΔTd(t) exceeds the threshold value. If not, the measurement time t is incremented in step S12 and the process returns to step S1. If so, the loop is exited, it is determined that the transition from the awake state to the sleepy state has occurred, and an alarm is issued in step S13.
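- The main routine of FIG. 7 can be sketched as a loop. The camera and estimator helpers below are hypothetical stand-ins for the photographing unit and the detection/temperature-conversion stages, and step S1's wait for the measurement time point is elided:

```python
class FakeCamera:
    """Hypothetical stand-in for the photographing unit 1."""
    def capture_visible(self):  return "visible-frame"   # visible light mode
    def capture_infrared(self): return "infrared-frame"  # infrared mode

class FakeEstimator:
    """Hypothetical stand-in for the detection units and the temperature
    conversion/correction stages; Td readings are canned."""
    def __init__(self, td_values):
        self.td_values = iter(td_values)
        self.alarmed = False
    def detect_face(self, frame):    return "face-region"      # S3
    def detect_eyes(self, face):     return "eye-region"       # S4
    def detect_forehead(self, face): return "forehead-region"  # S7
    def is_dozing(self, eyes):       return False              # S5 (always No here)
    def deep_body_temperature(self, ir, forehead, eyes):
        return next(self.td_values)                            # S9
    def raise_alarm(self):    self.alarmed = True              # S13
    def emergency_stop(self): pass

def main_loop(camera, estimator, threshold):
    td_prev = None
    t = 0
    while True:                                          # loop S1..S12
        visible = camera.capture_visible()               # S2
        face = estimator.detect_face(visible)            # S3
        eyes = estimator.detect_eyes(face)               # S4
        if estimator.is_dozing(eyes):                    # S5
            estimator.emergency_stop()
            return
        forehead = estimator.detect_forehead(face)       # S7
        infrared = camera.capture_infrared()             # S8
        td = estimator.deep_body_temperature(infrared, forehead, eyes)  # S9
        if td_prev is not None:
            delta = td - td_prev                         # S10
            if delta < 0 and abs(delta) > threshold:     # S11
                estimator.raise_alarm()                  # S13
                return
        td_prev = td
        t += 1                                           # S12

est = FakeEstimator([36.9, 36.8, 36.1])
main_loop(FakeCamera(), est, threshold=0.3)
print(est.alarmed)  # True: the drop from 36.8 to 36.1 exceeds the threshold
```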
- FIG. 8 shows the procedure for calculating the deep body temperature.
- the processing procedure shown in this flowchart is a subroutine and represents details of step S9 in FIG.
- In this subroutine, first, an arc of the circle forming the outline of the cornea is detected from the inside of the eye area, and the coordinates of the center of this arc are set as the center coordinates (xec, yec) of the eye area (step S21). Then, the center coordinates (xec, yec) of the eye area are converted into the coordinate system of the infrared image data (step S22), and the pixel located at the converted center coordinates is selected from among the pixels of the infrared image data and its pixel value is converted into the temperature of the eye region.
- As described above, according to the present embodiment, the temperature of the eye region and the temperature of the forehead region are detected from temperature information acquired in a non-contact manner, the influence of the environmental temperature is canceled using the temperature change of the eye region, the deep body temperature change is calculated, and sleepiness is estimated from the temperature transition. For this reason, there is no burden of attaching a sensor to the user, sleepiness can be estimated at an early stage, and various services can be provided to the user. For example, in an application for preventing drowsy driving in a vehicle, drowsiness can be estimated and the user can be alerted at an early stage before dozing begins.
- In the first embodiment, the deep body temperature is determined by correcting the temperature Tf(t) of the forehead region with Tc(t).
- the part having a large causal relationship with the deep body temperature is not limited to the forehead region, and the temperature of various parts may have a causal relationship with the deep body temperature.
- For example, the skin temperature on the nasal bone and the skin temperature on the masseter muscle, as described in Patent Document 1, have a causal relationship with the deep body temperature. Therefore, in the present embodiment, unlike the first embodiment, the base part for calculating the deep body temperature is not limited to a specific part (the forehead region); candidates for deep body temperature calculation are extracted from various parts of the face region, and an appropriate one is selected from the temperatures of the plurality of candidate sites as the basis for calculating the deep body temperature.
- To that end, the sleepiness estimation apparatus 100 is provided with a component for detecting parts that cannot have a causal relationship with the deep body temperature, and the parts so detected are excluded from the deep body temperature calculation.
- FIG. 9 shows an internal configuration of the drowsiness estimation apparatus according to the second embodiment.
- The internal configuration of this figure is drawn based on the internal configuration diagram of the first embodiment and differs from that base in that some of the existing components are replaced with other components. Specifically, the forehead area detection unit 4c is replaced with a mouth and hair area detection unit 4d, and the temperature correction unit 5 is replaced with a temperature correction unit 5a.
- the mouth and hair region detection unit 4d detects, within the face region detected by the face region detection unit, a region below the eye region as the mouth region, and defines as the hair region a region that is above the eye region and shows video properties different from the rest of the detected face region.
- the part-specific temperature calculation unit 6a receives a coordinate group that specifies the face region and coordinate groups that specify the mouth and the hair region. Based on these coordinate groups, the eye region, the mouth, and the hair region are defined as prohibited regions. It then selects, from the region surrounded by the coordinate group specifying the face region and excluding the prohibited regions, a region that can be a candidate for deep body temperature calculation, and converts the pixel values of the pixels existing in that region into temperatures. It also receives coordinates specifying the center of the eye region and converts the pixel value of the pixel located at the center of the eye region into a temperature.
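- The prohibited-region selection can be sketched with axis-aligned boxes standing in for the coordinate groups (the box representation and the sample values are assumptions for illustration):

```python
def candidate_pixels(face, prohibited):
    """Enumerate the coordinates inside the face region that may serve as a
    basis for deep body temperature calculation, skipping the prohibited
    regions (eye, mouth and hair).  Each region is an (x0, y0, x1, y1) box
    here; the actual apparatus uses arbitrary coordinate groups."""
    def inside(x, y, box):
        x0, y0, x1, y1 = box
        return x0 <= x < x1 and y0 <= y < y1

    fx0, fy0, fx1, fy1 = face
    return [(x, y)
            for y in range(fy0, fy1)
            for x in range(fx0, fx1)
            if not any(inside(x, y, box) for box in prohibited)]

face = (0, 0, 4, 4)             # a 4x4 toy face region
eye = (1, 1, 3, 2)              # prohibited box covering (1,1) and (2,1)
pixels = candidate_pixels(face, [eye])
print(len(pixels))  # 14 of the 16 face pixels remain as candidates
```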
- the temperature correction unit 5a corrects the temperature of a region that is a candidate for deep body temperature calculation using the temperature at the center of the eye region, as in the first embodiment.
- the drowsiness estimation unit 9 calculates, as a difference, how the temperature of the candidate region for deep body temperature calculation, corrected using the temperature of the eye region, changes with time. If the difference tends to decrease and exceeds a predetermined threshold, it determines that sleepiness has occurred and outputs the result of sleepiness estimation.
- the eye area, which does not affect the deep body temperature, is defined as a prohibited area together with the mouth and the hair, so that various parts in the face area can be selected as deep body temperature calculation targets as long as the prohibited areas are avoided.
- since various parts other than the eye area, mouth, and hair can serve as the basis for deep body temperature calculation, the credibility of sleepiness estimation can be increased compared with the case where the basis of the deep body temperature is limited to the forehead alone.
- both the forehead region detection unit 4c described in the first embodiment and the mouth and hair region detection unit 4d described in the second embodiment are provided in the sleepiness estimation apparatus.
- FIG. 10 shows a derived configuration of the sleepiness estimation apparatus according to the second embodiment.
- the image processing unit 4 includes both a forehead region detection unit 4c and a mouth and hair region detection unit 4d.
- the forehead area detection unit 4c detects the forehead area
- the mouth and hair area detection unit 4d detects the mouth and hair.
- the site-specific temperature calculation unit 6b selects whether to subject to temperature conversion the forehead detected by the forehead region detection unit 4c or the region of the entire face excluding the eye region, the mouth, and the hair.
- the temperature correction unit 5 and the site-specific temperature calculation unit 6 in FIG. 2 are replaced with a temperature correction unit 5b and a site-specific temperature calculation unit 6b.
- the temperature correction unit 5b includes the part-specific temperature calculation unit 6b and obtains two types of deep body temperature: deep body temperature 1, obtained by correcting the forehead region temperature Tf with the temperature Tc at the center of the eye region, and deep body temperature 2, obtained by correcting the temperature of the part excluding the eye region, mouth, and hair with the temperature at the center of the eye region.
- the sleepiness estimation unit 9 performs sleepiness estimation for each of the two deep body temperatures 1 and 2, and if either of them indicates that the user is in a sleepy state, it issues a warning and makes an external report.
- the temperature detected from the forehead area is used as a temperature parameter for estimating sleepiness.
- The present embodiment relates to an improvement that uses the measured values of contact sensors attached to the user's body as the basis of sleepiness estimation.
- FIG. 11 shows an internal configuration of the drowsiness estimation apparatus according to the third embodiment.
- The internal configuration in this figure is drawn based on the internal configuration diagram of the first embodiment and differs in that the forehead region detection unit 4c and the mouth and hair region detection unit 4d do not exist. Instead, a contact sensor interface 10 exists.
- the contact sensor interface 10 receives an input of a body temperature as a measurement value from a contact sensor provided on the user's body, and outputs it to the weight subtraction unit 7.
- the contact type sensor includes a sensor provided on the clavicle, a sensor provided on the back of the hand, and a sensor provided on the back of the foot.
- a measurement value of a sensor provided in the clavicle portion is defined as a proximal skin temperature (proximal).
- the average value of the measured values of sensors provided on the back of the hand and the back of the foot is defined as the distal skin temperature (distal).
- the proximal skin temperature and the distal skin temperature have a strong correlation with the biological rhythm.
- an increase in the difference between the proximal skin temperature and the distal skin temperature means sleepiness. Therefore, if the difference between the proximal skin temperature and the distal skin temperature exceeds a predetermined threshold value, the user is determined to be in a sleepy state.
- the weight subtraction unit 7 corrects the difference between the proximal skin temperature and the distal skin temperature, which are sensor measurement values input from the contact sensor interface 10, using the temperature detected from the central region of the eye region, thereby canceling the effect of the environmental temperature. In this way, more accurate measurement of the proximal skin temperature and the distal skin temperature becomes possible.
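- The criterion of this embodiment can be sketched as follows. The direction of the difference (distal minus proximal) and the threshold value are assumptions; the description only states that a growing proximal-distal difference indicates sleepiness:

```python
def is_sleepy(clavicle, hand_back, foot_back, threshold):
    """Sleepiness test from contact sensor readings: the proximal skin
    temperature is the clavicle sensor value, the distal skin temperature is
    the average of the back-of-hand and back-of-foot sensor values, and a
    difference exceeding the threshold is taken to indicate sleepiness."""
    proximal = clavicle
    distal = (hand_back + foot_back) / 2
    return (distal - proximal) > threshold

# Before sleep onset the distal skin warms relative to the proximal skin.
print(is_sleepy(clavicle=33.0, hand_back=34.5, foot_back=34.1, threshold=0.5))  # True
```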
- since the measurement values of the contact sensors attached to the user's body are used as the basis for sleepiness estimation, and the temperature calculated from the infrared image data is used only to correct those measurement values, the accuracy of sleepiness estimation can be improved compared with the case where all parameters necessary for sleepiness estimation are acquired from the infrared image data.
- the vehicle-mounted drowsiness estimation device has been described.
- the present embodiment discloses a drowsiness estimation device used for controlling indoor devices.
- FIG. 12 shows a drowsiness estimation apparatus installed indoors for control of indoor equipment.
- the sleepiness estimation apparatus 100 in this figure is installed on the television 200 and can photograph a user who views the television 200.
- the sleepiness estimation apparatus 100 calculates the deep body temperature from the captured visible light image and infrared image, and performs sleepiness estimation according to the difference of the calculated deep body temperature. When it is judged that the user is in the above-mentioned transition period, the television 200, the lighting device 300, and the air conditioner 400 are notified.
- When the television 200 receives a notification from the drowsiness estimation device 100 that the user is in the transition period, it lowers the volume or switches the power off.
- When the lighting device 300 receives a notification from the drowsiness estimation device 100 that the user is in the transition period, it reduces the amount of illumination light.
- When the air conditioner 400 receives a notification from the sleepiness estimation device 100 that the user is in the transition period from the awake state to the sleepiness state, it raises the indoor set temperature, decreases the air volume, or sets the sleep mode.
- the TV volume, the brightness of the illumination, and the room temperature are switched when the user is in a sleepy state, so that a comfortable environment can be provided to the user.
- the visible light-infrared common camera 101 capable of capturing a visible light image and an infrared image with one photographing system is used.
- the present embodiment relates to an improvement in which the drowsiness estimation apparatus 100 includes a visible light camera 102 for capturing a visible light image and an infrared camera 103 for capturing an infrared image, instead of the visible light / infrared common camera 101.
- the imaging unit 1 uses the visible light camera 102 for capturing a visible light image and the infrared camera 103 for capturing an infrared image instead of the visible light/infrared shared camera 101, and, at predetermined time intervals, performs both imaging at wavelengths in the visible light region and imaging at wavelengths in the infrared region to obtain visible light image data and infrared image data.
- the appearance of the body image in the image varies depending on the positional relationship between these cameras. Specifically, when the visible light camera 102 and the infrared camera 103 are side by side, the position of the body image is shifted in the horizontal direction. When the visible light camera 102 and the infrared camera 103 are vertically aligned, the position of the body image is shifted in the vertical direction.
- the infrared image data often has a lower resolution than the visible light image data.
- Therefore, when the part-specific temperature calculation unit 6b in the temperature correction unit 5b receives the coordinate groups delivered from the eye region detection unit 4b, the forehead region detection unit 4c, and the mouth and hair region detection unit 4d, it must convert them.
- The coordinates in the infrared image data are obtained by performing this coordinate conversion on the coordinates in the visible light image data.
- FIG. 13 shows coordinate conversion when the visible light camera 102 and the infrared camera 103 are arranged side by side in the sleepiness estimation apparatus 100.
- FIG. 13A is an external view of the drowsiness estimation apparatus 100.
- FIG. 13B is a diagram showing the visible light image and the infrared image side by side. Since the resolution of the infrared image is lower, the infrared image is drawn smaller. Further, since the visible light camera 102 and the infrared camera 103 are aligned in the horizontal direction, there is a horizontal shift OffsetX between the visible light image and the infrared image.
- An arbitrary coordinate of the coordinate group delivered from the eye region detection unit 4b, the forehead region detection unit 4c, and the mouth and hair region detection unit 4d, that is, a coordinate in the coordinate system of the visible light image data, is represented by (Xv, Yv).
- an arbitrary coordinate of the coordinate group in the coordinate system of the infrared image data is defined as (Xi, Yi).
- OffsetX is the horizontal offset between the position of the visible light camera and the position of the infrared camera, and shows how far the X coordinate of the origin of the coordinate system of the infrared image data is shifted to the right or left of the origin of the coordinate system of the visible light image data.
- FIG. 14 shows the coordinate conversion in the case where the visible light camera 102 and the infrared camera 103 are arranged one above the other in the drowsiness estimation apparatus 100.
- FIG. 14A is an external view of the drowsiness estimation apparatus 100.
- FIG. 14B shows the visible light image and the infrared image side by side. Since the resolution of the infrared image is lower, the infrared image is drawn smaller. Further, since the visible light camera 102 and the infrared camera 103 are aligned in the vertical direction, there is a vertical shift OffsetY between the visible light image and the infrared image.
- an arbitrary coordinate of the coordinate group in the coordinate system of the visible light image data is (Xv, Yv).
- an arbitrary coordinate of the coordinate group in the coordinate system of the infrared image data is defined as (Xi, Yi).
- OffsetY is the vertical offset between the position of the visible light camera and the position of the infrared camera, and shows how far the Y coordinate of the origin of the coordinate system of the infrared image data is shifted upward or downward from the origin of the coordinate system of the visible light image data.
- OffsetX and OffsetY depend on how the visible light camera and the infrared camera are attached. It is desirable to select appropriate values by performing calibration at the manufacturing stage of the drowsiness estimation apparatus 100. It is likewise desirable to select, by calibration, appropriate values for the vertical and horizontal pixel-count ratios by which Xv and Yv are multiplied.
- the eye region detection unit 4b detects the center region of the eye from the visible light image to obtain (Xv, Yv).
- the eye area detection unit 4b also detects the center area of the eye in the infrared image, obtaining (Xi, Yi). If (Xv, Yv) and (Xi, Yi) are obtained in this way, they are applied to the above formula to derive the horizontal and vertical pixel-count ratios by which (Xv, Yv) are multiplied, together with OffsetX and OffsetY.
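- The conversion referred to as "the above formula" can be written as a linear mapping consistent with the surrounding description (pixel-count-ratio scaling plus the OffsetX/OffsetY shift); the exact form and the sample numbers are assumptions:

```python
def visible_to_infrared(xv, yv, scale_x, scale_y, offset_x=0.0, offset_y=0.0):
    """Map a coordinate (Xv, Yv) in the visible light image into the
    coordinate system of the lower-resolution infrared image.  scale_x and
    scale_y are the horizontal and vertical pixel-count ratios between the
    two images; offset_x and offset_y absorb the shift caused by mounting
    the two cameras side by side or one above the other."""
    xi = xv * scale_x + offset_x
    yi = yv * scale_y + offset_y
    return xi, yi

# Example: a 1920x1080 visible image paired with a 480x270 infrared image
# (ratio 0.25) and a horizontal shift of 8 infrared pixels.
print(visible_to_infrared(960, 540, 0.25, 0.25, offset_x=8))  # (248.0, 135.0)
```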
- When visible light imaging and infrared imaging are shared by one camera as in the first embodiment, OffsetX and OffsetY can be set to zero, because the body image cannot shift due to the mounting positions of two separate cameras.
- the infrared image data captured in the infrared imaging mode serves as the body surface temperature distribution data, but this is only because such data can be captured with the same imaging system as the visible light image data. As long as the correspondence between the temperature distribution of the user's body and coordinates can be specified, data other than an infrared image may be used as the body surface temperature distribution data.
- the imaging system and the frame memories may be excluded from the configuration requirements of the drowsiness estimation apparatus 100, and the apparatus may be configured using only the image processing unit 4 through the drowsiness estimation unit 9. This is because the imaging unit 1 merely obtains the visible light image data and the infrared image data, and it suffices for it to be provided in another device used in combination with the drowsiness estimation apparatus 100.
- the sleepiness estimation apparatus has been described as a single device, but it may be incorporated into a vehicle as part of the in-car equipment, or incorporated into a car navigation device or a car AV device. As in the fourth embodiment, the drowsiness estimation device may also be incorporated as part of a television, a lighting device, or an air conditioner.
- the sleepiness estimation apparatus may be commercialized as a television, personal computer, smartphone, or tablet terminal equipped with a videophone function. This is because, with a videophone function, video of the user must be transmitted to the other party, so the user photographs himself or herself in the process.
- a visible light image measured at each measurement time point may be encoded to obtain a video stream, and the deep body temperature obtained at each measurement time point may be associated with the video stream, thereby recording the deep body temperature at each measurement time point on a recording medium.
- the drowsiness estimation device functions as a biological rhythm recorder.
- the imaging device and the sleepiness estimation device may be connected via a network.
- the sleepiness estimation apparatus receives measurement images (visible light image data and infrared image data) from the camera of the display device via the network and performs sleepiness estimation. Then, the estimation result is output to an external device, and processing corresponding to the presence or absence of drowsiness is executed.
- The drowsiness estimation apparatus 100 may perform deep body temperature calculation and drowsiness estimation for a plurality of visible light image-infrared image pairs obtained by photographing a plurality of users, and may issue a warning to any of the photographed users who is in a dozing state. In this way, for example, a company with many employees can monitor whether its employees are becoming drowsy by using visible light images and infrared images captured by cameras mounted on personal computers or by in-vehicle cameras. In this case, since many visible light and infrared images must be processed, it is desirable to operate the drowsiness estimation apparatus 100 on a cloud network server (cloud server) capable of processing the visible light and infrared images as big data. When drowsiness estimation is ordered on the cloud network, the hypervisor starts the operating system on the cloud server. With the operating system running, application programs that perform the processing of the image processing unit 4, the temperature correction unit 5, and the drowsiness estimation unit 9 are loaded into the cloud server from the intranet within the company. Through this loading, the processing described in the first through fifth embodiments is executed on the big data.
- The visible light image may be an individual visible light image obtained by moving image capture or by still image capture. When acquiring visible light images by moving image capture, the measurement cycle of the deep body temperature is the display cycle used when the moving image is played back. For example, if the display frequency is 23.976 Hz, the measurement cycle is 1/23.976 seconds; if the display frequency is 59.94 Hz in field units, it is 1/59.94 seconds; and if it is 50 Hz in field units, it is 1/50 second.
- When acquiring visible light images by still image capture, the measurement cycle of the deep body temperature follows the timer setting of the drowsiness estimation apparatus. For example, if the timer of the drowsiness estimation apparatus is set to 5 seconds, the apparatus captures an image every 5 seconds, thereby obtaining a deep body temperature at 5-second intervals.
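The relation between display frequency and measurement cycle described above is a simple reciprocal; a minimal arithmetic sketch (the function name is illustrative, not from the patent):

```python
def measurement_cycle_seconds(display_frequency_hz: float) -> float:
    """The measurement cycle is the reciprocal of the display frequency,
    e.g. 23.976 Hz -> 1/23.976 s, 59.94 Hz -> 1/59.94 s, 50 Hz -> 1/50 s."""
    return 1.0 / display_frequency_hz
```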
- The eye region detection unit may register eyelid shape patterns representing various eyelid shapes in a detection dictionary and detect the eye region using that dictionary. Specifically, the eye region detection unit performs feature extraction on the visible light image and detects, as edge pixel groups, portions of the visible light image where the pixel value changes greatly compared with the surroundings. Among these edge pixel groups, the region of any group that matches a shape pattern registered in the detection dictionary is detected as an eye region. Similarly, by registering a face contour pattern and a forehead contour pattern in the detection dictionary, the face region and the forehead region can be detected in the same way.
- The eye regions may also be detected by detecting eye-socket regions that are surrounded by the skin-colored area of the face region but do not themselves contain a skin-colored area. Then, for the pairs of eye-socket regions, the vertical positional relationship within the face, the placement rule that a pair of eye-socket regions is arranged on the left and right of the face, the area difference between the pair, and the color variance within the pair's regions are evaluated. Based on these evaluation values, the pair of eye-socket regions judged most qualified as both eyes among the candidate pairs is estimated as the eye regions of both eyes.
- In the first embodiment, the coordinate conversion formula differs between the case where the visible light camera 102 and the infrared camera 103 are arranged horizontally, the case where they are arranged vertically, and the case where the visible light/infrared common camera 101 is used. However, coordinate conversion may instead be performed with a unified formula. That is, whether the cameras are arranged horizontally or vertically, or the visible light/infrared common camera 101 is used, coordinate conversion may be performed using the following formulas:
Xi = (horizontal pixel count of infrared image / horizontal pixel count of visible light image) × Xv + OffsetX
Yi = (vertical pixel count of infrared image / vertical pixel count of visible light image) × Yv + OffsetY
This is because subtle horizontal and vertical shifts occur between the visible light image and the infrared image. In particular, considering the difference in resolution between the two images, the above formulas should be used regardless of the camera arrangement.
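The unified conversion can be sketched as follows; the parameter names are illustrative assumptions, and the offsets model the displacement caused by the differing imaging systems:

```python
def visible_to_infrared(xv: int, yv: int,
                        vis_w: int, vis_h: int,
                        ir_w: int, ir_h: int,
                        offset_x: float = 0.0, offset_y: float = 0.0):
    """Map an eye-center coordinate in the visible light image to the
    corresponding pixel coordinate in the (typically lower-resolution)
    infrared image:
    Xi = (ir_w / vis_w) * Xv + OffsetX,  Yi = (ir_h / vis_h) * Yv + OffsetY."""
    xi = (ir_w / vis_w) * xv + offset_x
    yi = (ir_h / vis_h) * yv + offset_y
    return int(round(xi)), int(round(yi))
```

With zero offsets, a 640×480 visible image and a 320×240 infrared image, the visible-image point (320, 240) maps to the infrared pixel (160, 120).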
- The imaging sensor may include a focus lens movable in the optical axis direction and an imaging optical system that adjusts the amount of light under control of automatic aperture driving, thereby performing follow-up shooting of the user's face region. In this case, video including the user's face is input from the imaging sensor to the image processing unit, and all of the video data captured by the imaging unit is input to the face region detection unit.
- the infrared camera may be configured by housing an infrared lens group and an infrared imaging element located on the imaging plane of the infrared lens group in a housing.
- a window portion is formed in the housing at a portion facing the subject side of the infrared lens group.
- the infrared imaging element is composed of a plurality of thermal resistors arranged one-dimensionally or two-dimensionally, and generates infrared image data by detecting the amount of received infrared rays in each thermal resistor.
- The optical filter of a typical imaging device is configured to block infrared wavelengths. The visible light/infrared common camera 101 may switch between the visible light imaging mode and the infrared imaging mode by controlling whether the infrared wavelengths are blocked, thereby obtaining visible light image data and infrared image data, respectively. First, visible light image data is obtained by switching the filter to block infrared rays in the visible light imaging mode. Next, by switching the filter to pass both infrared and visible light in the infrared imaging mode, image data in which each pixel is composed of visible and infrared components is obtained. After this image data is obtained, the per-pixel difference between it and the visible light image data obtained in the visible light imaging mode is taken, and the image data whose pixel values are these difference values serves as the infrared image data. Through this processing, visible light image data and infrared image data can be obtained using the filter of a typical imaging device.
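The per-pixel subtraction described above can be sketched with NumPy; the array names are assumptions, and 8-bit pixel values are assumed:

```python
import numpy as np

def infrared_from_difference(vis_plus_ir: np.ndarray,
                             vis_only: np.ndarray) -> np.ndarray:
    """Subtract the visible-only frame from the visible+infrared frame per
    pixel; the difference values become the infrared image's pixel values.
    Widen to int32 before subtracting to avoid uint8 wrap-around."""
    diff = vis_plus_ir.astype(np.int32) - vis_only.astype(np.int32)
    return np.clip(diff, 0, 255).astype(np.uint8)
```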
- The visible light/infrared common camera 101 may switch between the visible light imaging mode and the infrared imaging mode by improving the color filter arranged in front of the image sensor, thereby capturing visible light image data and infrared image data. The color filter has multiple types of specific-wavelength transmission filters, each transmitting electromagnetic waves of a different wavelength. These include at least an R filter that transmits electromagnetic waves of red-component wavelengths, a G filter that transmits green-component wavelengths, a B filter that transmits blue-component wavelengths, and an I filter that transmits infrared rays. The camera has a plurality of filter groups in which the R, G, B, and I filters are arranged in a matrix of a predetermined number of rows and columns, the filter groups themselves are arranged in a matrix, and each I filter is placed adjacent to an R filter and a B filter. In this configuration, visible light image data may be obtained in the visible light imaging mode by enabling the R, G, and B filters, and infrared image data may be obtained in the infrared imaging mode by enabling the I filter.
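One way to read the two modes off such a mosaic can be sketched as follows, assuming a hypothetical 2×2 filter group (row 0: R, G; row 1: I, B, so that the I filter is adjacent to the R and B filters); the exact row/column layout is not fixed by the text:

```python
import numpy as np

def split_rgbi(raw: np.ndarray):
    """Separate a raw mosaic into its R, G, B and I sample planes for the
    assumed 2x2 group: raw[0,0]=R, raw[0,1]=G, raw[1,0]=I, raw[1,1]=B."""
    r = raw[0::2, 0::2]
    g = raw[0::2, 1::2]
    i = raw[1::2, 0::2]
    b = raw[1::2, 1::2]
    return r, g, b, i
```

In the visible light imaging mode only the R, G, and B planes would be used; in the infrared imaging mode only the I plane.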
- The formulas for determining the deep body temperature and the formulas for deriving the deep body temperature from the pixel values of the infrared image data do not denote mathematical concepts; they denote only numerical operations executed on a computer. Naturally, the modifications necessary for realization on a computer are applied; for example, saturation operations and positive-value operations for handling numbers as integer, fixed-point, or floating-point types may of course be performed. Furthermore, the arithmetic and calculation processing based on the formulas shown in each embodiment can be realized by a ROM multiplier using a constant ROM. In a constant ROM, the products of the multiplicand and the constant are computed and stored in advance. For example, if the multiplicand is 16 bits long, it is divided into four 4-bit segments, and the products of each 4-bit segment and the constant, that is, multiples 0 through 15 of the constant, are stored in the constant ROM. The product of one 4-bit segment and a 16-bit constant is 20 bits long, and since the four constants are stored at the same address, one word is 20 × 4 = 80 bits long. Since realization by a ROM multiplier is thus possible, "calculation processing" and "arithmetic processing" in this specification do not mean only pure arithmetic operations; they also encompass reading from a recording medium such as a ROM, in which an operation result stored in the recording medium is read out according to the value of the operand.
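The constant-ROM multiplication described above can be sketched as follows: the table holds the constant's multiples 0 through 15, and a 16-bit multiplicand is processed as four 4-bit segments whose shifted partial products are summed (a software model of the hardware scheme, not the patent's circuit):

```python
def make_constant_rom(constant: int) -> list:
    """Precompute constant * 0 .. constant * 15 for one 4-bit segment."""
    return [constant * n for n in range(16)]

def rom_multiply(multiplicand: int, rom: list) -> int:
    """Multiply a 16-bit multiplicand by the ROM's constant by looking up
    each 4-bit segment and summing the shifted partial products."""
    total = 0
    for segment in range(4):
        nibble = (multiplicand >> (4 * segment)) & 0xF
        total += rom[nibble] << (4 * segment)
    return total
```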
- Since the image processing unit 4 and the temperature correction unit 5 can be implemented as semiconductor integrated circuits to be built into a device, a system LSI may be configured by packaging them on a high-density substrate. A system LSI is a multi-chip module in which a plurality of bare chips are mounted on a high-density substrate and packaged so that the assembly has the external structure of a single LSI. The integrated circuit architecture consists of: a front-end processing circuit (1), composed of a pre-programmed DMA master circuit and the like, which performs general stream processing; a signal processing circuit (2), composed of a SIMD processor and the like, which performs general signal processing; a back-end circuit (3), which performs pixel processing, image superposition, resizing, image format conversion, and general AV output processing; a media interface circuit (4), which interfaces with drives and networks; and a memory controller circuit (5), a slave circuit for memory access, which reads and writes packets and data in response to requests from the front-end, signal processing, and back-end units.
- Focusing on package types, system LSIs include the QFP (Quad Flat Package) and PGA (Pin Grid Array) types. A QFP is a system LSI with pins attached to the four sides of the package; a PGA is a system LSI with many pins attached across its entire bottom surface.
- The following can also be added: a conversion circuit that converts pixel groups of visible light image data or infrared image data into a desired format; a cache memory that temporarily stores pixel groups of visible light image data or infrared image data; a buffer memory that adjusts the speed of data transfer; an initialization circuit that, at power-on, reads the necessary programs from ROM into RAM and performs initialization; a power control circuit that performs power control according to the state of the histogram; an MPU that manages a plurality of programs as task applications and schedules them according to their priorities; and an interrupt handler unit that generates interrupt signals in response to external events such as reset occurrences and power supply abnormalities.
- The present invention may be configured as a program module that accepts the designation of visible light image data and the designation of infrared image data as arguments, determines whether the user is in a drowsy state by calling a desired application programming interface, and returns the determination result as a return value.
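Such a module's interface might look as follows; the function and parameter names are illustrative assumptions rather than the patent's actual API, and the estimation itself is delegated to a caller-supplied routine:

```python
def estimate_drowsiness(visible_light_image, infrared_image, estimation_api) -> bool:
    """Accept the designations of the visible light and infrared image data
    as arguments, call the estimation API with them, and return the
    drowsiness judgment as the return value."""
    return bool(estimation_api(visible_light_image, infrared_image))
```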
- The program code constituting such a program module, that is, the program code for causing a computer to perform the processing of the main routine of FIG. 7 and the subroutine of FIG. 8, can be created as follows. First, a software developer uses a programming language to write a source program that implements each flowchart and the functional components. In this writing, the software developer describes, in accordance with the syntax of the programming language, a source program embodying each flowchart and the functional components, using class structures, variables, array variables, and calls to external functions. The written source program is given to a compiler as a file, and the compiler translates the source program to generate an object program. Compiler translation consists of processes such as syntax analysis, optimization, resource allocation, and code generation.
- In syntax analysis, lexical analysis, parsing, and semantic analysis of the source program are performed, and the source program is converted into an intermediate program.
- In optimization, operations such as basic block formation, control flow analysis, and data flow analysis are performed on the intermediate program.
- In resource allocation, to adapt to the instruction set of the target processor, variables in the intermediate program are allocated to the registers or memory of the target processor.
- In code generation, each intermediate instruction in the intermediate program is converted into program code, yielding the object program.
- The object program generated here consists of one or more program codes that cause a computer to execute the steps of the flowcharts shown in the embodiments and the individual procedures of the functional components. There are various kinds of program code, such as processor-native code and JAVA (registered trademark) bytecode. When a step is realized by an external function, the call statement that invokes the external function becomes the program code. Program codes realizing a single step may also belong to different object programs. Each step of a flowchart may also be realized by combining arithmetic operation instructions, logical operation instructions, branch instructions, and the like.
- When the object programs are generated, the programmer activates a linker for them. The linker allocates these object programs and related library programs to a memory space and combines them into one to generate a load module. The load module thus generated is premised on being read by a computer, and causes the computer to execute the processing procedures shown in each flowchart and the processing procedures of the functional components.
- Such a computer program may be recorded on a non-transitory computer-readable recording medium and provided to the user.
- The drowsiness estimation apparatus according to the present invention is useful for preventing drowsy driving in a vehicle. It can also be applied to such uses as controlling the brightness of lighting and the volume of a television or the like.
Abstract
Description
The problem is solved by: acquisition means for acquiring visible light image data obtained by photographing a subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface;
image processing means for identifying the center region of the subject's eye by performing image processing on the visible light image data; and
correction means for detecting the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting a temperature parameter for drowsiness estimation using that center region temperature.
A drowsiness estimation apparatus that detects a temperature parameter for drowsiness estimation from a subject and performs drowsiness estimation, comprising:
acquisition means for acquiring visible light image data obtained by photographing the subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface;
image processing means for identifying the center region of the subject's eye by performing image processing on the visible light image data; and
correction means for detecting the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting a temperature parameter for drowsiness estimation using that center region temperature.
The detection of the eye region center in aspect 1 can be developed into something concrete that follows the image content of the visible light image. That is, the image processing means may identify the eye's center region by detecting the contour shape of the cornea from the image of the subject's face region represented in the visible light image data and identifying the center of the arc formed by that corneal contour. Since the corneal contour forms an arc within the eye region, using this arc as a clue makes it relatively easy to identify the center of the eye region.
The source from which the deep body temperature for drowsiness estimation is obtained can also be developed concretely. Deep body temperature appears most accurately in the axilla, oral cavity, eardrum, and rectum. However, attaching sensors to the axilla, oral cavity, eardrum, or rectum is impossible without medical personnel and is far beyond what a general user can accomplish. Moreover, even with attachment to the hands or feet, if accuracy depends on how the sensor is attached, the product lacks fitness as a product.
The correction means may therefore obtain the temperature parameter for drowsiness estimation by detecting the temperature of the forehead portion from the body surface temperature distribution indicated by the body surface temperature distribution data.
The source from which the parameter for drowsiness estimation is obtained can likewise be developed concretely. That is, in any of aspects 1 to 3, the temperature parameter for drowsiness estimation may be a temperature parameter of a portion of the face region excluding the mouth periphery and the hair, and
the correction means may obtain the temperature parameter for drowsiness estimation by detecting the temperature of any part of the face region's temperature distribution, indicated by the body surface temperature distribution data, excluding the mouth periphery and the hair.
How the drowsiness estimation parameter is corrected can be developed in more detail. That is, in any of aspects 1 to 4, the correction of the temperature parameter for drowsiness estimation may be made by multiplying the temperature parameter for drowsiness estimation and the eye center region temperature by first and second weighting coefficients, respectively, and subtracting the eye center region temperature multiplied by the second weighting coefficient from the temperature parameter multiplied by the first weighting coefficient. By applying weighting coefficients to the temperature parameter and the eye center region temperature, both can be converted into values appropriate for drowsiness estimation, and this conversion raises the accuracy of the estimation.
The composition of the body surface temperature distribution data, that is, what data serves as it, can be developed more concretely. That is, in any of aspects 1 to 5, the body surface temperature distribution data may be infrared image data composed of a plurality of pixels at a predetermined resolution, each pixel of the infrared image data corresponding to a pixel of the visible light image data, and
the luminance of each pixel's color component in the infrared image data may indicate how much infrared radiation is emitted from the corresponding part of the human body surface shown in the visible light image data. Since each pixel value of the infrared image data indicates the infrared radiation from various parts of the body surface, the deep body temperature can be calculated as long as the user's body appears somewhere in the infrared image data. Therefore, however the human figure appears in the infrared image data, a deep body temperature within a reasonable numerical range is obtained, improving the detection accuracy of the deep body temperature.
Coordinate conversion can be introduced as a bridging process between eye center region detection and temperature conversion. That is, in aspect 6, the visible light image and the infrared image may differ in resolution;
the image processing means may identify the eye's center region using X or Y coordinates in the coordinate system of the visible light image data;
the temperature correction means may convert the X or Y coordinate of the eye's center region and convert into a temperature the pixel value of the pixel located at the converted X or Y coordinate in the infrared image; and
the coordinate conversion by the temperature correction means may be performed by
multiplying the X or Y coordinate of the eye's center region by the ratio of horizontal pixel counts, or of vertical pixel counts, between the visible light image and the infrared image, and adding a horizontal or vertical offset arising from the difference between the imaging system of the visible light image and that of the infrared image.
The criterion defining what state counts as feeling drowsy can be developed into more concrete content. That is, in aspect 6 or 7, the visible light image data and the infrared image data may be obtained by photographing the subject at each of a plurality of time points within a measurement period;
the identification of the subject's eye center region by the image processing means, the acquisition of the infrared image data by the acquisition means, and the correction of the temperature parameter for drowsiness estimation by the correction means may be performed at each of the plurality of time points; and
the drowsiness estimation may be performed by judging whether the corrected temperature parameter obtained by photographing the subject at a given measurement time point shows a decreasing trend compared with corrected temperature parameters obtained by photographing the subject at past measurement time points, and whether the amount of decrease exceeds a predetermined threshold.
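A minimal sketch of this decision rule; the names and the scalar representation of the corrected temperature parameter are assumptions:

```python
def is_transition_to_drowsiness(current: float, past: float,
                                threshold: float) -> bool:
    """Drowsiness is estimated when the corrected temperature parameter is
    decreasing relative to a past measurement and the amount of decrease
    exceeds the predetermined threshold."""
    decrease = past - current
    return decrease > threshold
```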
Imaging means can be added as an optional component. That is, in any of aspects 6 to 8, the drowsiness estimation apparatus may comprise
imaging means switchable between a first mode that passes visible light and blocks infrared rays, and a second mode that passes infrared rays and blocks visible light,
and visible light image data and infrared image data may each be obtained by switching between the first and second modes.
Further improvement is possible regarding acquisition of the temperature parameter. That is, in aspect 1 or 2, or in any of aspects 4 to 9, the temperature parameter for drowsiness estimation may be obtained from contact sensors attached to the back of the arm, the dorsum of the foot, or the clavicle region. Measurement values from highly reliable contact sensors can thus serve as the temperature parameter underlying drowsiness estimation. This enables measurement of the deep body temperature by the method described in Non-Patent Document 3 and raises the reliability of drowsiness estimation.
When overcoming the implementation barrier in the aspect of a method invention, the method in that aspect is a drowsiness estimation method that performs drowsiness estimation using a temperature parameter for drowsiness estimation from a subject, comprising:
acquiring visible light image data obtained by photographing the subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface;
identifying the center region of the subject's eye by performing image processing on the visible light image data; and
thereafter detecting the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting the temperature parameter for drowsiness estimation using that center region temperature. This method aspect can incorporate the improvements of aspects 2 to 10 described above. Since such a drowsiness estimation method can be used wherever corporate users or end users operate, the applications of the method invention within the technical scope of this application can be broadened.
When overcoming the implementation barrier in the aspect of a computer-readable recording medium, the recording medium in that aspect is a computer-readable non-transitory recording medium on which program code is recorded that causes a computer to perform drowsiness estimation using a temperature parameter for drowsiness estimation from a subject, wherein,
when the computer acquires visible light image data obtained by photographing the subject in the visible light wavelength range and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface,
one or more recorded program codes cause the computer to identify the center region of the subject's eye by performing image processing on the visible light image data, and
thereafter to detect the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data and to correct the temperature parameter for drowsiness estimation using that center region temperature. This recording medium aspect can incorporate the improvements of aspects 2 to 10 described above. Since programs can be distributed through network provider servers and various recording media, the applications of the present invention extend to the general computer software and online service industries.
The first embodiment concerns realizing a drowsiness estimation apparatus that derives a valid deep body temperature from the center temperature of the eye region and the temperature of the forehead region, and performs drowsiness estimation on that basis. This apparatus differs from the wakefulness state detection device described in the background art in that it judges the user's transition period from a wakeful state to a drowsy state, making it a higher-functionality version of that device. Specifically, the drowsiness estimation apparatus is installed inside an automobile and monitors the driver's biological rhythm; it judges whether the user is dozing from the user's blink count and the degree to which the eyelids droop, and also judges whether the user is in the transition period from wakefulness to drowsiness.
{Formula 1}
Td(t)=α×Tf(t)-β×Tc(t)
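Formula 1 can be computed directly, with Tf(t) the forehead region temperature and Tc(t) the eye center region temperature; the specification gives no concrete values for the weighting coefficients α and β here, so any sample values are assumptions:

```python
def deep_body_temperature(tf: float, tc: float,
                          alpha: float, beta: float) -> float:
    """Formula 1: Td(t) = alpha * Tf(t) - beta * Tc(t), where Tf is the
    forehead region temperature and Tc is the eye center region temperature."""
    return alpha * tf - beta * tc
```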
The history memory 8 is a memory that stores past deep body temperatures in list form, each associated with its measurement time.
This concludes the description of the components of the drowsiness estimation apparatus. The principle of drowsiness estimation by the drowsiness estimation apparatus 100 is explained next, with reference to FIGS. 3 to 6. The drowsiness estimation apparatus 100 is placed on the front panel of an automobile as shown in FIG. 1. When the apparatus starts in this state, the imaging unit 1 photographs the user seated in the driver's seat in the visible light imaging mode and the infrared imaging mode. When the imaging unit 1 performs imaging and the switching unit 2 switches the output destination of the visible light image data and the infrared image data to frame memory 3a and frame memory 3b according to the mode setting, the visible light image data of FIG. 3(a) and the infrared image data of FIG. 3(b) are obtained in frame memories 3a and 3b. FIGS. 3(a) and 3(b) show, in correspondence, the visible light image data captured by the imaging unit in the visible light imaging mode and the infrared image data captured in the infrared imaging mode. FIG. 3(a) shows the user gripping the steering wheel in the driver's seat. FIG. 3(b) shows the infrared image data. Normally, the resolution of the infrared image data is lower than that of the visible light image data, so a plurality of pixels of the visible light image data correspond to one pixel of the distribution image data, which represents the amount of infrared radiation from the body surface of the subject appearing in the visible light image data. Parts shown in the same color in the image data radiate the same amount of infrared rays. In FIG. 3(b), the parts corresponding to the user's skin are displayed in colors indicating the normal range of human body temperature.
In the first embodiment, since the temperature of the forehead region has a strong causal relationship with the deep body temperature, the deep body temperature was obtained by correcting the forehead region temperature Tf(t) with Tc(t). However, sites with a strong causal relationship to the deep body temperature are not limited to the forehead region; the temperatures of various sites can be causally related to it. Specifically, the skin temperature over the nasal bone and the skin temperature over the masseter muscle, as described in Patent Document 1, are also causally related to the deep body temperature. In this embodiment, therefore, the site underlying deep body temperature calculation is not restricted to one specific site (the forehead region) as in the first embodiment; instead, candidate sites for deep body temperature calculation are extracted from various parts of the face region, and from the temperatures of the selected candidate sites, one that is valid as the basis for deep body temperature calculation is chosen.
This application example provides the drowsiness estimation apparatus with both the forehead region detection unit 4c described in the first embodiment and the mouth and hair region detection unit 4d described in the second embodiment.
In the first embodiment, the temperature detected from the forehead region was used as the temperature parameter for drowsiness estimation. In contrast, this embodiment concerns an improvement in which measurement values from contact sensors attached to the user's body serve as the basis of drowsiness estimation.
(Fourth Embodiment)
The first embodiment described an in-vehicle drowsiness estimation apparatus; this embodiment discloses a drowsiness estimation apparatus used to control indoor equipment. FIG. 12 shows a drowsiness estimation apparatus installed in a room for controlling indoor equipment. The drowsiness estimation apparatus 100 in this figure is installed on top of the television 200 and can photograph a user watching the television 200. The apparatus calculates the deep body temperature from the captured visible light image and infrared image, and performs drowsiness estimation according to the difference in the calculated deep body temperatures. If the above-described transition period is determined, it notifies the television 200, the lighting device 300, and the air conditioner 400 accordingly.
The first embodiment used the visible light/infrared common camera 101, which can capture both visible light images and infrared images with a single imaging system. This embodiment concerns an improvement in which, instead of the common camera 101, the drowsiness estimation apparatus 100 is provided with a visible light camera 102 for visible light imaging and an infrared camera 103 for infrared imaging.
The best embodiments known to the applicant at the time of filing have been described above, but further improvements and modifications may be made to the technical topics shown below. Note that whether to implement the embodiments as shown or to apply these improvements and modifications is entirely optional and left to the discretion of the implementer.
In the embodiments so far, infrared image data captured in the infrared imaging mode served as the body surface temperature distribution data, but this merely reflects the intent of capturing the infrared image data with the same imaging system as the visible light image data. Anything other than an infrared image may be used as the body surface temperature distribution data, as long as it can define a correspondence between the temperature distribution of the user's body and coordinates.
2 Acquisition unit
3a, 3b Frame memory
4 Image processing unit
4a Face region detection unit
4b Eye region detection unit
4c Forehead region detection unit
4d Mouth and hair region detection unit
5 Temperature correction unit
5a, 5b Temperature correction unit
6 Region-specific temperature calculation unit
6a, 6b Region-specific temperature calculation unit
7 Weighted subtraction unit
8 History memory
9 Drowsiness estimation unit
10 Contact sensor I/F
9a Difference calculation unit
9b Threshold storage unit
9c Comparison unit
100 Drowsiness estimation apparatus
101 Camera (visible light/infrared common camera)
102 Visible light camera
103 Infrared camera
200 Television
300 Lighting device
400 Air conditioner
Claims (12)
- A drowsiness estimation apparatus that detects a temperature parameter for drowsiness estimation from a subject and performs drowsiness estimation, the apparatus comprising:
acquisition means for acquiring visible light image data obtained by photographing the subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface;
image processing means for identifying the center region of the subject's eye by performing image processing on the visible light image data; and
correction means for detecting the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting a temperature parameter for drowsiness estimation using that center region temperature.
- The drowsiness estimation apparatus according to claim 1, wherein the image processing means identifies the eye's center region by detecting the contour shape of the cornea from the image of the subject's face region represented in the visible light image data and identifying the center of the arc formed by that corneal contour.
- The drowsiness estimation apparatus according to claim 1, wherein the temperature parameter for drowsiness estimation is the body temperature of the forehead portion of the face region, and the correction means obtains the temperature parameter for drowsiness estimation by detecting the temperature of the forehead portion from the body surface temperature distribution indicated by the body surface temperature distribution data.
- The drowsiness estimation apparatus according to claim 1, wherein the temperature parameter for drowsiness estimation is a temperature parameter of a portion of the face region excluding the mouth periphery and the hair, and the correction means obtains the temperature parameter for drowsiness estimation by detecting the temperature of any part of the face region's temperature distribution, indicated by the body surface temperature distribution data, excluding the mouth periphery and the hair.
- The drowsiness estimation apparatus according to claim 1, wherein the correction of the temperature parameter for drowsiness estimation is made by multiplying the temperature parameter for drowsiness estimation and the eye center region temperature by first and second weighting coefficients, respectively, and subtracting the eye center region temperature multiplied by the second weighting coefficient from the temperature parameter multiplied by the first weighting coefficient.
- The drowsiness estimation apparatus according to claim 1, wherein the body surface temperature distribution data is infrared image data composed of a plurality of pixels at a predetermined resolution, each pixel of the infrared image data corresponds to a pixel of the visible light image data, and the luminance of each pixel's color component in the infrared image data indicates how much infrared radiation is emitted from the corresponding part of the human body surface shown in the visible light image data.
- The drowsiness estimation apparatus according to claim 6, wherein the visible light image and the infrared image differ in resolution; the image processing means identifies the eye's center region using X or Y coordinates in the coordinate system of the visible light image data; the temperature correction means converts the X or Y coordinate of the eye's center region and converts into a temperature the pixel value of the pixel located at the converted X or Y coordinate in the infrared image; and the coordinate conversion by the temperature correction means is performed by multiplying the X or Y coordinate of the eye's center region by the ratio of horizontal pixel counts, or of vertical pixel counts, between the visible light image and the infrared image, and adding a horizontal or vertical offset arising from the difference between the imaging system of the visible light image and that of the infrared image.
- The drowsiness estimation apparatus according to claim 7, wherein the visible light image data and the infrared image data are obtained by photographing the subject at each of a plurality of time points within a measurement period; the identification of the subject's eye center region by the image processing means, the acquisition of the infrared image data by the acquisition means, and the correction of the temperature parameter for drowsiness estimation by the correction means are performed at each of the plurality of time points; and the drowsiness estimation is performed by judging whether the corrected temperature parameter obtained by photographing the subject at a given measurement time point shows a decreasing trend compared with corrected temperature parameters obtained by photographing the subject at past measurement time points, and whether the amount of decrease exceeds a predetermined threshold.
- The drowsiness estimation apparatus according to claim 6, comprising imaging means switchable between a first mode that passes visible light and blocks infrared rays and a second mode that passes infrared rays and blocks visible light, wherein visible light image data and infrared image data are each obtained by switching between the first and second modes.
- The drowsiness estimation apparatus according to claim 1, wherein the temperature parameter for drowsiness estimation is obtained from contact sensors attached to the back of the arm, the dorsum of the foot, or the clavicle region.
- A drowsiness estimation method that performs drowsiness estimation using a temperature parameter for drowsiness estimation from a subject, the method comprising: acquiring visible light image data obtained by photographing the subject in the visible light wavelength range, and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface; identifying the center region of the subject's eye by performing image processing on the visible light image data; and thereafter detecting the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data, and correcting the temperature parameter for drowsiness estimation using that center region temperature.
- A computer-readable non-transitory recording medium on which are recorded one or more program codes that cause a computer to perform drowsiness estimation using a temperature parameter for drowsiness estimation from a subject, the program codes causing the computer, upon acquiring visible light image data obtained by photographing the subject in the visible light wavelength range and body surface temperature distribution data obtained by measuring the temperature distribution of the subject's body surface, to identify the center region of the subject's eye by performing image processing on the visible light image data, and thereafter to detect the temperature of the eye's center region from the subject's body surface temperature distribution indicated by the body surface temperature distribution data and to correct the temperature parameter for drowsiness estimation using that center region temperature.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/355,078 US9501704B2 (en) | 2012-10-05 | 2013-10-04 | Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transient recording medium |
JP2014539619A JP6110396B2 (ja) | 2012-10-05 | 2013-10-04 | 眠気推定装置、眠気推定方法、コンピュータ読み取り可能な非一時的な記録媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-222830 | 2012-10-05 | ||
JP2012222830 | 2012-10-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014054293A1 true WO2014054293A1 (ja) | 2014-04-10 |
Family
ID=50434636
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/005924 WO2014054293A1 (ja) | 2012-10-05 | 2013-10-04 | 眠気推定装置、眠気推定方法、コンピュータ読み取り可能な非一時的な記録媒体 |
Country Status (3)
Country | Link |
---|---|
US (1) | US9501704B2 (ja) |
JP (1) | JP6110396B2 (ja) |
WO (1) | WO2014054293A1 (ja) |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017012730A (ja) * | 2015-06-29 | 2017-01-19 | パナソニックIpマネジメント株式会社 | 人状態推定方法、及び、人状態推定システム |
JP2017100039A (ja) * | 2015-12-01 | 2017-06-08 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | 体調推定装置、体調推定システム及びプロセッサ |
JP2017153963A (ja) * | 2016-02-29 | 2017-09-07 | ダイキン工業株式会社 | 判定結果出力装置、判定結果提供装置、及び判定結果出力システム |
JP2018139070A (ja) * | 2017-02-24 | 2018-09-06 | 株式会社デンソー | 車両用表示制御装置 |
JP2018183564A (ja) * | 2017-04-26 | 2018-11-22 | パナソニックIpマネジメント株式会社 | 深部体温測定装置、深部体温測定システム及び深部体温測定方法 |
JP2019126657A (ja) * | 2018-01-26 | 2019-08-01 | 富士ゼロックス株式会社 | 検出装置、及び検出プログラム |
JP2019531825A (ja) * | 2016-11-01 | 2019-11-07 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | 対象の深部体温を決定するデバイス、システム及び方法 |
WO2020129426A1 (ja) * | 2018-12-20 | 2020-06-25 | パナソニックIpマネジメント株式会社 | 生体計測装置、生体計測方法、コンピュータ読み取り可能な記録媒体、およびプログラム |
JP2021089636A (ja) * | 2019-12-05 | 2021-06-10 | 富士フイルムビジネスイノベーション株式会社 | 情報処理装置及びプログラム |
WO2021140583A1 (ja) * | 2020-01-08 | 2021-07-15 | 三菱電機株式会社 | 眠気推定装置および眠気推定方法 |
US11123021B2 (en) | 2015-12-01 | 2021-09-21 | Panasonic Intellectual Property Corporation Of America | Method for estimating physical condition, physical condition estimation apparatus, and non-transitory computer-readable recording medium |
US20210321876A1 (en) * | 2020-04-21 | 2021-10-21 | Ehsan Zare Bidaki | System and method for imaging, segmentation, temporal and spatial tracking, and analysis of visible and infrared images of ocular surface and eye adnexa |
WO2021234783A1 (ja) * | 2020-05-18 | 2021-11-25 | 日本電信電話株式会社 | 情報処理装置、方法およびプログラム |
CN114061761A (zh) * | 2021-11-17 | 2022-02-18 | 重庆大学 | 基于单目红外立体视觉矫正的远距离目标温度精确测量方法 |
RU211713U1 (ru) * | 2022-02-07 | 2022-06-20 | Общество с ограниченной ответственностью "КСОР" | Устройство мониторинга состояния водителя |
JP7386438B2 (ja) | 2018-12-20 | 2023-11-27 | パナソニックIpマネジメント株式会社 | 生体計測装置、生体計測方法、コンピュータ読み取り可能な記録媒体、およびプログラム |
Families Citing this family (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10227063B2 (en) * | 2004-02-26 | 2019-03-12 | Geelux Holdings, Ltd. | Method and apparatus for biological evaluation |
US9311544B2 (en) * | 2012-08-24 | 2016-04-12 | Jeffrey T Haley | Teleproctor reports use of a vehicle and restricts functions of drivers phone |
US20160310007A1 (en) * | 2013-12-18 | 2016-10-27 | Shimadzu Corporation | Infrared light imaging apparatus |
EP3280323A1 (en) * | 2015-04-09 | 2018-02-14 | Marcio Marc Abreu | Device configured to be supported on a human body, to measure a biological parameter of the human body, and to control a characteristic of the human body |
CN106264449B (zh) * | 2015-06-29 | 2022-01-28 | Panasonic Intellectual Property Management Co., Ltd. | Human state estimation method and human state estimation system |
US10821805B2 (en) | 2016-04-01 | 2020-11-03 | Gentherm Incorporated | Occupant thermal state detection and comfort adjustment system and method |
WO2018155267A1 (ja) * | 2017-02-23 | 2018-08-30 | Panasonic Intellectual Property Management Co., Ltd. | Image display device, image display method, and program |
JP6930247B2 (ja) * | 2017-06-29 | 2021-09-01 | Aisin Corporation | Wakefulness support device, wakefulness support method, and wakefulness support program |
US20190108318A1 (en) * | 2017-10-05 | 2019-04-11 | Kenneth J. Bagan | Safety Center and Associated Equipment |
US10943092B2 (en) | 2018-05-23 | 2021-03-09 | ClairLabs Ltd. | Monitoring system |
KR102218526B1 (ko) * | 2019-07-26 | 2021-02-19 | LG Electronics Inc. | Method, system, and vehicle for preventing drowsy driving |
EP3821793A1 (en) * | 2019-11-12 | 2021-05-19 | Koninklijke Philips N.V. | A method for determining the risk of a user waking up in an undesirable state |
WO2021102325A1 (en) * | 2019-11-22 | 2021-05-27 | Xtemp Llc | Infrared based core body temperature sensing system and method |
US20210251568A1 (en) * | 2020-02-14 | 2021-08-19 | Objectvideo Labs, Llc | Infrared sleep monitoring |
CN111369560B (zh) * | 2020-04-26 | 2023-04-28 | Chengdu Research Base of Giant Panda Breeding | Rapid body temperature measurement method for captive giant pandas |
US20230004745A1 (en) * | 2021-06-30 | 2023-01-05 | Fotonation Limited | Vehicle occupant monitoring system and method |
FR3129070A1 (fr) * | 2021-11-16 | 2023-05-19 | Valeo Systemes Thermiques | System for determining a body temperature |
DE102022206390A1 (de) * | 2022-06-24 | 2024-01-04 | Continental Engineering Services Gmbh | Method and vehicle control system for setting comfortable temperatures |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH053876A (ja) * | 1991-06-25 | 1993-01-14 | Matsushita Electric Works Ltd | Biological rhythm curve measuring device |
JP2007068620A (ja) * | 2005-09-05 | 2007-03-22 | Konica Minolta Holdings Inc | Psychological state measuring device |
JP2007516018A (ja) * | 2003-05-27 | 2007-06-21 | Cardiowave Inc. | Apparatus and method for remote, non-invasive detection of a subject's core body temperature from infrared images |
JP2010133692A (ja) * | 2008-10-31 | 2010-06-17 | Mitsubishi Electric Corp | Air conditioner |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5689241A (en) * | 1995-04-24 | 1997-11-18 | Clarke, Sr.; James Russell | Sleep detection and driver alert apparatus |
JPH09154835A (ja) | 1995-12-07 | 1997-06-17 | Matsushita Electric Ind Co Ltd | Dozing detection device |
US7202792B2 (en) * | 2002-11-11 | 2007-04-10 | Delphi Technologies, Inc. | Drowsiness detection system and method |
US9024764B2 (en) * | 2007-01-25 | 2015-05-05 | Honda Motor Co., Ltd. | Method and apparatus for manipulating driver core temperature to enhance driver alertness |
US8446470B2 (en) * | 2007-10-04 | 2013-05-21 | Magna Electronics, Inc. | Combined RGB and IR imaging sensor |
2013
- 2013-10-04 WO PCT/JP2013/005924 patent/WO2014054293A1/ja active Application Filing
- 2013-10-04 US US14/355,078 patent/US9501704B2/en active Active
- 2013-10-04 JP JP2014539619A patent/JP6110396B2/ja not_active Expired - Fee Related
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017012730A (ja) * | 2015-06-29 | 2017-01-19 | Panasonic Intellectual Property Management Co., Ltd. | Human state estimation method and human state estimation system |
JP2017100039A (ja) * | 2015-12-01 | 2017-06-08 | Panasonic Intellectual Property Corporation of America | Physical condition estimation device, physical condition estimation system, and processor |
US11883210B2 (en) | 2015-12-01 | 2024-01-30 | Panasonic Intellectual Property Corporation Of America | Method for estimating physical condition, physical condition estimation apparatus, and non-transitory computer-readable recording medium |
JP2020168554A (ja) * | 2015-12-01 | 2020-10-15 | Panasonic Intellectual Property Corporation of America | Control method, control device, and program |
JP2022123075A (ja) * | 2015-12-01 | 2022-08-23 | Panasonic Intellectual Property Corporation of America | Control method of a physical condition estimation device and control device of a physical condition estimation device |
JP7093814B2 (ja) | 2015-12-01 | 2022-06-30 | Panasonic Intellectual Property Corporation of America | Device control method, device control apparatus, and program |
US11123021B2 (en) | 2015-12-01 | 2021-09-21 | Panasonic Intellectual Property Corporation Of America | Method for estimating physical condition, physical condition estimation apparatus, and non-transitory computer-readable recording medium |
JP2017153963A (ja) * | 2016-02-29 | 2017-09-07 | Daikin Industries, Ltd. | Determination result output device, determination result providing device, and determination result output system |
JP2019531825A (ja) * | 2016-11-01 | 2019-11-07 | Koninklijke Philips N.V. | Device, system, and method for determining a subject's core body temperature |
JP2018139070A (ja) * | 2017-02-24 | 2018-09-06 | Denso Corporation | Vehicle display control device |
JP7054800B2 (ja) | 2017-04-26 | 2022-04-15 | Panasonic Intellectual Property Management Co., Ltd. | Core body temperature measuring device, core body temperature measuring system, and core body temperature measuring method |
JP2018183564A (ja) * | 2017-04-26 | 2018-11-22 | Panasonic Intellectual Property Management Co., Ltd. | Core body temperature measuring device, core body temperature measuring system, and core body temperature measuring method |
JP2019126657A (ja) * | 2018-01-26 | 2019-08-01 | Fuji Xerox Co., Ltd. | Detection device and detection program |
JP7386438B2 (ja) | 2018-12-20 | 2023-11-27 | Panasonic Intellectual Property Management Co., Ltd. | Biometric measurement device, biometric measurement method, computer-readable recording medium, and program |
WO2020129426A1 (ja) * | 2018-12-20 | 2020-06-25 | Panasonic Intellectual Property Management Co., Ltd. | Biometric measurement device, biometric measurement method, computer-readable recording medium, and program |
JP2021089636A (ja) * | 2019-12-05 | 2021-06-10 | Fujifilm Business Innovation Corp. | Information processing device and program |
JP7429416B2 (ja) | 2019-12-05 | 2024-02-08 | Agama-X Co., Ltd. | Information processing device and program |
WO2021140583A1 (ja) * | 2020-01-08 | 2021-07-15 | Mitsubishi Electric Corporation | Drowsiness estimation device and drowsiness estimation method |
US20210321876A1 (en) * | 2020-04-21 | 2021-10-21 | Ehsan Zare Bidaki | System and method for imaging, segmentation, temporal and spatial tracking, and analysis of visible and infrared images of ocular surface and eye adnexa |
WO2021234783A1 (ja) * | 2020-05-18 | 2021-11-25 | Nippon Telegraph and Telephone Corporation | Information processing device, method, and program |
CN114061761A (zh) * | 2021-11-17 | 2022-02-18 | Chongqing University | Accurate long-range target temperature measurement method based on monocular infrared stereo vision correction |
CN114061761B (zh) * | 2021-11-17 | 2023-12-08 | Chongqing University | Accurate long-range target temperature measurement method based on monocular infrared stereo vision correction |
RU211713U1 (ru) * | 2022-02-07 | 2022-06-20 | KSOR LLC | Driver condition monitoring device |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014054293A1 (ja) | 2016-08-25 |
JP6110396B2 (ja) | 2017-04-05 |
US9501704B2 (en) | 2016-11-22 |
US20140313309A1 (en) | 2014-10-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6110396B2 (ja) | Drowsiness estimation device, drowsiness estimation method, and computer-readable non-transitory recording medium | |
Villarroel et al. | Non-contact physiological monitoring of preterm infants in the neonatal intensive care unit | |
US11744475B2 (en) | Remote heart rate monitoring based on imaging for moving subjects | |
US9843743B2 (en) | Infant monitoring systems and methods using thermal imaging | |
Ru et al. | A detailed research on human health monitoring system based on internet of things | |
KR102296396B1 (ko) | Apparatus and method for improving accuracy in non-contact body temperature measurement | |
EP2713871B1 (en) | Method and system for monitoring the skin color of a user | |
JP6306022B2 (ja) | Apparatus and method for processing data derivable from remotely detected electromagnetic radiation | |
Tsouri et al. | On the benefits of alternative color spaces for noncontact heart rate measurements using standard red-green-blue cameras | |
CN103955272B (zh) | User posture detection system for a terminal device | |
US9232912B2 (en) | System for evaluating infant movement using gesture recognition | |
WO2014012070A1 (en) | Infant monitoring systems and methods using thermal imaging | |
CN106551679A (zh) | Mobile terminal and mobile-terminal-based temperature measurement method | |
WO2019039698A1 (ko) | Method for processing an image based on external light, and electronic device supporting the same | |
JP2019005553A (ja) | Information processing method, information processing device, and information processing system | |
CN114283494A (zh) | User fall early-warning method, apparatus, device, and storage medium | |
US10838492B1 (en) | Gaze tracking system for use in head mounted displays | |
Gupta et al. | Remote photoplethysmography‐based human vital sign prediction using cyclical algorithm | |
WO2022072838A1 (en) | Contactless monitoring system | |
JPH08252226A (ja) | Stress measuring device | |
KR102564483B1 (ko) | Electronic device, server, system, and operating method thereof for providing highly accurate biosignals based on information acquired in a non-contact manner | |
US20240122486A1 (en) | Physiological monitoring soundbar | |
TWI795028B (zh) | Physiological monitoring system with improved heartbeat and respiration detection accuracy | |
US20230195046A1 (en) | Gradual Pixel Aperture Design for Improved Visualization at a Sensor Location of an Electronics Display | |
WO2021182129A1 (ja) | Telemedicine system, telemedicine method, information processing device, and program | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 14355078; Country of ref document: US |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13843184; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2014539619; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 13843184; Country of ref document: EP; Kind code of ref document: A1 |