WO2017159215A1 - Information processing device and information processing method

Info

Publication number
WO2017159215A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinates
eye
end point
eyelid
approximate
Prior art date
Application number
PCT/JP2017/006046
Other languages
French (fr)
Japanese (ja)
Inventor
Yudai Nakamura
Masahiro Naito
Original Assignee
Mitsubishi Electric Corporation
Priority date
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation
Publication of WO2017159215A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113: Objective types for determining or recording eye movement
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/60: Analysis of geometric attributes

Definitions

  • The present invention relates to an information processing apparatus and an information processing method, and more particularly to an information processing apparatus and an information processing method for determining the shape of an eyelid from a face image.
  • The degree of eye opening indicates the degree to which the driver's eyes are open.
  • The apparatus described in Patent Document 1 extracts horizontal and vertical edges from a face image, estimates the eyelid position from their positional relationship, and determines the degree of eye opening.
  • The apparatus described in Patent Document 2 extracts edges from a face image and approximates the eyelid with a curve, so that the degree of eye opening can be determined accurately even when the driver wears glasses.
  • The apparatus described in Patent Document 3 extracts the white-eye region from a face image, realizing an eye opening degree measurement that is robust to makeup such as eye shadow.
  • Near-infrared light, which cannot be seen by the human eye, is used as auxiliary light in a dark environment. In an image captured with near-infrared light, however, it is difficult to distinguish between the white-eye region and the black-eye region.
  • When the image quality is low, errors occur in the detected eye positions. For example, in a low-quality image the approximate eye position can be detected from changes in brightness between the eyebrow and the eye, but such detection only indicates the region in which the eye is present; points such as the outer and inner corners of the eyes are difficult to detect accurately.
  • An object of the present invention is therefore to make it possible to detect the eyelid shape with high accuracy even from a low-quality face image or a face image captured with near-infrared light.
  • An information processing apparatus according to one aspect of the present invention includes: an eye presence area specifying unit that specifies, from a face image of a person, an eye presence area in which an eye exists; a light/dark change detecting unit that, from changes in brightness between pixels in the vertical direction of the eye presence area, estimates the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid; an approximate curve generation unit that generates a plurality of first approximate curves; and a shape determining unit that selects, from the plurality of first approximate curves, the first approximate curve most suitable as the first boundary line and determines the shape of the upper eyelid from the selected curve. The light/dark change detecting unit specifies, based on the changes in brightness between the pixels, a first selection area that lies on a straight line passing through the first end point and the second end point in the eye presence area and includes the first end point, and the approximate curve generation unit selects the plurality of coordinates from the first selection area.
  • An information processing apparatus according to another aspect includes: an eye presence area specifying unit that specifies, from a face image of a person, an eye presence area in which an eye exists; a light/dark change detecting unit that estimates end point coordinates from changes in brightness between pixels in the vertical direction of the eye presence area; an approximate curve generation unit; an approximate curve selection unit that selects, from the plurality of first approximate curves, the first approximate curve most suitable as the first boundary line and thereby determines the coordinates of a determined end point, the highest point on the first boundary line; a correction curve generation unit that selects, in the eye presence region, a plurality of first selection coordinates from a vertical straight line passing through the coordinates of the inner corner of the eye and a plurality of second selection coordinates from a vertical straight line passing through the coordinates of the outer corner of the eye, and generates a plurality of correction curves, which are approximate curves, from each of the plurality of first selection coordinates, each of the plurality of second selection coordinates, and the coordinates of the determined end point; and a shape determining unit that selects, from the plurality of correction curves, the correction curve most suitable as the first boundary line and determines the shape of the upper eyelid from the selected correction curve.
  • An information processing method according to one aspect specifies, from a face image of a person, an eye presence area in which an eye exists; estimates end point coordinates from changes in brightness between pixels in the vertical direction of the eye presence area; specifies, based on the changes in brightness between the pixels, a first selection area including the first end point; selects a plurality of coordinates from the first selection area; generates a plurality of first approximate curves from each of the selected coordinates together with the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye; selects, from the plurality of first approximate curves, the first approximate curve most suitable as the first boundary line; and determines the shape of the upper eyelid from the selected first approximate curve.
  • Another information processing method specifies, from a face image of a person, an eye presence area in which an eye exists; estimates end point coordinates from changes in brightness between pixels in the vertical direction of the eye presence area; generates a plurality of first approximate curves from each of a plurality of selected coordinates together with the coordinates of the inner and outer corners of the eye; selects, from the plurality of first approximate curves, the curve most suitable as the first boundary line and thereby determines the coordinates of the determined end point, the highest point of the first boundary line; selects, in the eye presence area, a plurality of first selection coordinates from a vertical straight line passing through the coordinates of the inner corner of the eye and a plurality of second selection coordinates from a vertical straight line passing through the coordinates of the outer corner of the eye; generates a plurality of correction curves, which are approximate curves, from each of the plurality of first selection coordinates, each of the plurality of second selection coordinates, and the coordinates of the determined end point; selects, from the plurality of correction curves, the correction curve most suitable as the first boundary line; and determines the shape of the upper eyelid from the selected correction curve.
  • According to the present invention, the eyelid shape can be detected with high accuracy.
  • FIG. 1 is a block diagram schematically showing the configuration of the eye opening degree detection apparatus according to Embodiments 1 to 3.
  • FIGS. 2(A) and 2(B) are schematic diagrams for explaining a face image captured by a low-resolution camera.
  • FIG. 3 is a schematic diagram for explaining the eye presence area in the first embodiment.
  • FIGS. 4(A) and 4(B) are schematic diagrams showing the filters used by the light/dark change detection unit in Embodiment 1.
  • FIG. 5 is a schematic diagram showing the positional relationship of the estimated points in the first embodiment.
  • FIG. 6 is a schematic diagram illustrating the method of generating a plurality of approximate curves for the upper eyelid.
  • FIG. 7 is a flowchart showing processing in the eye opening degree detection apparatus according to the first embodiment.
  • FIGS. 8(A) and 8(B) are schematic diagrams illustrating the method of calculating the region Rnt in the second embodiment.
  • FIG. 9 is a block diagram schematically showing the configuration of the eye opening degree detection apparatus according to the fourth embodiment.
  • FIG. 10 is a schematic diagram illustrating an example of an image near the eye when the driver faces sideways in the fourth embodiment.
  • FIG. 11 is a block diagram schematically showing the configuration of the eye opening degree detection apparatus according to the fifth embodiment.
  • It is a schematic diagram for explaining generation processing in the fifth embodiment.
  • FIGS. (A) and (B) are schematic diagrams illustrating examples of the hardware configuration of the eye opening degree detection apparatuses according to Embodiments 1 to 5.
  • FIG. 1 is a block diagram schematically showing a configuration of an eye opening degree detection apparatus 100 as an information processing apparatus according to the first embodiment.
  • the eye opening degree detection device 100 includes an eyelid shape detection device 110 and an eye opening degree calculation unit 130.
  • FIG. 2A shows a face image FI captured by a low-resolution camera.
  • the face image FI includes an upper eyelid elu, a lower eyelid elb, a double eyelid els, and eyebrows ebr.
  • FIG. 2B is a schematic diagram showing the positions of the upper eyelid elu, the lower eyelid elb, the double eyelid els, and the eyebrows ebr in the face image FI.
  • the eyelid shape detection device 110 detects the eyelid shape with high accuracy even for a low-quality face image FI.
  • The eyelid shape detection device 110 illustrated in FIG. 1 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 113, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, and a shape determination unit 116.
  • The eye presence area specifying unit 111 is given low-resolution face image data Im representing the face image of the person who is the target of eye opening degree detection (hereinafter referred to as the driver for simplicity), and specifies, from the face image represented by the face image data Im, an eye presence area, which is an area where the driver's eyes exist. The eye presence region specifying unit 111 then generates eye presence region data Dm indicating the specified eye presence region and provides it to the normalization processing unit 112 together with the face image data Im.
  • In other words, the eye presence area specifying unit 111 is given face image data Im representing the driver's face image, specifies from the face image an eye presence area for the driver's left eye, right eye, or both eyes, and generates eye presence area data Dm indicating the specified eye presence area.
  • the face image data Im is, for example, color image data or gray scale image data captured by an RGB camera, a gray scale camera, or an infrared light camera.
  • The upper part of the driver's face (head, forehead) appears toward the top of the face image, and the lower part (mouth, chin) toward the bottom.
  • The upper left point of the image is the origin of the image coordinate system; the rightward direction from the origin is the x coordinate direction, and the downward direction is the y coordinate direction.
  • The driver's left eye is the eye located on the left side of the face when facing the face image, and the driver's right eye is the eye located on the right side of the face when facing the face image.
  • In the following description, the eye presence area data Dm indicates the presence area of the driver's right eye.
  • The eye presence area is an area in which each eye in the face image is considered to exist.
  • As shown in FIG. 3, the eye presence region Dr is a region including the outer corner of the eye eout, the inner corner of the eye ein, the end point etop of the upper eyelid, and the end point ebot of the lower eyelid.
  • Specifically, the eye presence region is a region that satisfies the following first and second conditions.
  • The first condition is that, of the face parts (both eyes, eyebrows, nose, and mouth), only one eye is included and the other parts are not included. However, the eyebrows may be partially included.
  • The second condition is that, when the eye presence area is a rectangular area, the center of gravity Mg of the rectangular area is included in the area surrounded by the outer corner of the eye eout, the inner corner of the eye ein, the end point etop of the upper eyelid, and the end point ebot of the lower eyelid.
  • FIG. 3 shows an example in which the eye presence area Dr is expressed as a rectangular area.
  • The eye presence region data Dm is represented, for example, by the coordinates of the center of gravity Mg of the rectangular region together with the width Mw and height Mh of the rectangle.
  • Alternatively, the eye presence area data Dm may be a combination of the coordinates of the upper left and lower right points of the rectangular area, or, for a circular area, a combination of the coordinates of the center point and the radius.
  • In the following description, the eye presence area Dr is a rectangular area, and the coordinates (Mgx, Mgy) of its center of gravity Mg together with the width Mw and height Mh of the rectangle constitute the eye presence area data Dm.
  • The eye presence region Dr may be set based on the position of the nostril, which is a facial feature that is easy to detect, as shown for example in Patent Document 1. Alternatively, the eye presence area may be set by detecting the approximate positions of the outer and inner corners of the eyes by statistical processing using an AAM (Active Appearance Model). In the present embodiment, the method for setting the eye presence region Dr is not particularly limited.
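  • To make these encodings concrete, the following is a minimal sketch (names and structure are ours, not the patent's) of the rectangular form of the eye presence area data Dm, with a conversion to the corner-point form mentioned above.

```python
from dataclasses import dataclass

@dataclass
class EyeRegion:
    """Rectangular eye presence region Dr (hypothetical helper).

    Encoded as the centre of gravity Mg plus width Mw and height Mh,
    as in the eye presence area data Dm described above."""
    mgx: float  # x coordinate of the centre of gravity Mg
    mgy: float  # y coordinate of the centre of gravity Mg
    mw: float   # width of the rectangle
    mh: float   # height of the rectangle

    def corners(self):
        """Equivalent encoding: upper-left and lower-right corner points."""
        half_w, half_h = self.mw / 2, self.mh / 2
        return ((self.mgx - half_w, self.mgy - half_h),
                (self.mgx + half_w, self.mgy + half_h))
```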
  • The normalization processing unit 112 shown in FIG. 1 is given the eye presence area data Dm and the face image data Im, and cuts out the image of the eye presence area indicated by the eye presence area data Dm from the face image represented by the face image data Im. The normalization processing unit 112 then converts the clipped image into a normalized image of a predetermined size, generates normalized image data In representing the normalized image, and provides the normalized image data In, together with normalized eye presence region data Dn indicating the eye presence region in the normalized image, to the light/dark change detection unit 113 and the eyelid reference position estimation unit 114.
  • To convert the image size, the normalization processing unit 112 first cuts out the image of the eye presence area and then enlarges or reduces it. For example, if the rectangle of the eye presence region has width Mw and height Mh and the normalized image has width Nw and height Nh, the magnification is Nw/Mw in the x-axis direction and Nh/Mh in the y-axis direction. If the ratio of Mw to Mh and the ratio of Nw to Nh are set equal in advance, the aspect ratio does not change between the image before and after normalization.
  • The ratio of the width to the height is, for example, 2.5.
  • For the interpolation, for example, bilinear interpolation, bicubic interpolation, or nearest-neighbor interpolation may be used; the method is not particularly limited in the present embodiment.
  • In the following description, enlargement or reduction is performed using bilinear interpolation, as in the sketch below.
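  • As a concrete illustration of the crop-and-resize step, the following is a minimal sketch assuming OpenCV; the function name, default output size, and rounding are our assumptions.

```python
import cv2
import numpy as np

def normalize_eye_region(face_img: np.ndarray,
                         mgx: float, mgy: float, mw: float, mh: float,
                         nw: int = 100, nh: int = 40) -> np.ndarray:
    """Cut out the eye presence region Dr and resize it to Nw x Nh.

    The magnification is Nw/Mw in x and Nh/Mh in y; bilinear interpolation
    is used, as in Embodiment 1. The output size (nw, nh) is an assumption."""
    x0 = int(round(mgx - mw / 2)); y0 = int(round(mgy - mh / 2))
    x1 = int(round(mgx + mw / 2)); y1 = int(round(mgy + mh / 2))
    crop = face_img[max(y0, 0):y1, max(x0, 0):x1]
    return cv2.resize(crop, (nw, nh), interpolation=cv2.INTER_LINEAR)
```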
  • The normalized eye presence area data Dn is defined by the center coordinates Ng of the normalized image, the width Nw, and the height Nh.
  • The light/dark change detection unit 113 generates light/dark change data En by calculating the magnitude of the light/dark change in the vertical direction of the face image for the normalized image represented by the normalized image data In, and supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
  • Specifically, the light/dark change detection unit 113 detects the magnitude of the change in brightness between pixels in the y-axis direction of the normalized image represented by the normalized image data In, and thereby estimates the coordinates of a first end point on the boundary line (first boundary line) between the driver's eye and the upper eyelid and the coordinates of a second end point on the boundary line (second boundary line) between the driver's eye and the lower eyelid.
  • In the normalized image, the light/dark change detection unit 113 takes the coordinates of the pixel at which the brightness decreases most sharply from the pixel above as the coordinates of the first end point, and the coordinates of the pixel at which the brightness decreases most sharply from the pixel below as the coordinates of the second end point.
  • The light/dark change detection unit 113 then generates light/dark change data En indicating the estimated coordinates of the first and second end points and gives the data to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
  • To detect the light/dark changes, the light/dark change detection unit 113 uses, for example, the filters shown in FIGS. 4(A) and 4(B).
  • The 5 × 5 filter Ebf shown in FIG. 4(A) is an example of a filter that extracts a point where the brightness changes from dark to light in the y-axis direction.
  • The filter Ebf is a second filter that extracts the change in brightness near the lower eyelid.
  • The filter Ebf is set with a point (nx, ny) on the normalized image as its center position.
  • The evaluation value Ebv of the filter Ebf is calculated by equation (1). Here, In(x, y) denotes the luminance value at coordinates (x, y) in the normalized image data In.
  • The 5 × 5 filter Etf shown in FIG. 4(B) is an example of a filter that extracts a point where the brightness changes from light to dark in the y-axis direction.
  • The filter Etf is a first filter that extracts the change in brightness near the upper eyelid. If the center position (first pixel of interest) in the filter Etf is the position of "23" shown in FIG. 4(B), the filter Etf is set with a point (nx, ny) on the normalized image as its center position.
  • The evaluation value Etv of the filter Etf is calculated by equation (2).
  • The filter Etf is obtained by inverting the filter Ebf vertically, and is designed so that the region of the first term (second region) on the right-hand side of equation (2) is large and the region of the second term (first region) is small. As in the case of the lower eyelid described above, when observing the change in brightness around the upper eyelid, the region above the upper eyelid (on the side opposite to the y-axis direction) is relatively bright, its light/dark change is small, and the change caused by opening and closing of the eye is small. On the other hand, the region below the upper eyelid (in the y-axis direction) is a region where the brightness changes greatly under the influence of pupil movement and eye opening and closing. The filter Etf is therefore suitable for detecting the brightness change at the upper eyelid stably regardless of the eye state.
  • The light/dark change detection unit 113 applies the two filters described above to each pixel of the normalized image data In, thereby obtaining, for each pixel, the two evaluation values Etv and Ebv indicating the degree of light/dark change of the luminance values.
  • The evaluation value Etv indicates the magnitude of the change from light to dark in the y-axis direction, that is, the degree of decrease in brightness in the y-axis direction.
  • The evaluation value Ebv indicates the magnitude of the change from dark to light in the y-axis direction, that is, the degree of decrease in brightness in the direction opposite to the y-axis direction.
  • The coordinates of the point with the maximum evaluation value Etv are denoted Ent (Entx, Enty), and the coordinates of the point with the maximum evaluation value Ebv are denoted Enb (Enbx, Enby).
  • The light/dark change detection unit 113 estimates the coordinates Ent as the coordinates of the first end point, the uppermost point of the boundary line between the driver's eye and the upper eyelid, and the coordinates Enb as the coordinates of the second end point, the lowermost point of the boundary line between the driver's eye and the lower eyelid, and gives light/dark change data En indicating these two points to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
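  • Since equations (1) and (2) are not reproduced in this text, the following sketch assumes a simple form of the 5 × 5 vertically asymmetric filters: the evaluation value at each pixel is the difference between the mean luminance of the rows above and the rows below the pixel of interest. Only the structure, not the exact coefficients, is taken from the description above.

```python
import numpy as np

def endpoint_candidates(img: np.ndarray):
    """Estimate the end points Ent and Enb from vertical light/dark changes.

    `img` is a 2-D grayscale normalized image. Etv is large where brightness
    falls from light to dark going down (upper eyelid); Ebv is large where it
    rises again (lower eyelid)."""
    h, w = img.shape
    etv = np.full((h, w), -np.inf)
    ebv = np.full((h, w), -np.inf)
    for y in range(2, h - 2):
        for x in range(2, w - 2):
            above = img[y - 2:y, x - 2:x + 3].mean()      # 2x5 block above
            below = img[y + 1:y + 3, x - 2:x + 3].mean()  # 2x5 block below
            etv[y, x] = above - below  # light -> dark going down
            ebv[y, x] = below - above  # dark -> light going down
    ent = np.unravel_index(np.argmax(etv), etv.shape)  # (Enty, Entx)
    enb = np.unravel_index(np.argmax(ebv), ebv.shape)  # (Enby, Enbx)
    return ent, enb, etv, ebv
```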
  • The coordinates Ent and Enb indicate the most plausible positions for the end point of the upper eyelid and the end point of the lower eyelid among the brightness changes included in the eye presence region Dr.
  • However, the coordinates Ent may deviate upward from the upper eyelid under the influence of a double eyelid, makeup such as false eyelashes or eye shadow, or bangs.
  • FIG. 5 is a schematic diagram illustrating an example in which the coordinates Ent are located on the double eyelid els.
  • In such a case, the evaluation value Etv of the filter may take its maximum value at a point on the double eyelid els instead of a point on the upper eyelid elu. The processing for this case is handled by the shape determination unit 116 and will be described later. In the following description, it is assumed that the coordinates Ent are a point on the double eyelid els.
  • The eyelid reference position estimation unit 114 shown in FIG. 1 is given the normalized image data In, the normalized eye presence area data Dn, and the light/dark change data En, and calculates at least two eyelid reference positions, which are positions of reference points existing on the eyelid. For example, the eyelid reference position estimation unit 114 estimates the coordinates of the inner and outer corners of the eye in the normalized image represented by the normalized image data In. It then generates coordinate data Fn indicating the coordinates of the reference points on the normalized image and gives the coordinate data Fn to the approximate curve generation unit 115.
  • The points serving as eyelid reference positions are preferably, for example, points corresponding to the outer and inner corners of the eye.
  • If the face image captures the driver's face from the front, the y-axis positions of the outer and inner corners of the eye can be estimated to be substantially the same as the lower end of the lower eyelid. The y coordinates of the outer and inner corners can therefore be estimated from the y coordinate (Enby) of the coordinates Enb in the light/dark change data En.
  • FIG. 5 shows an example of the positional relationship of the points when the y coordinates of the outer and inner corners of the eye are determined from the y coordinate of the coordinates Enb.
  • The coordinates of the outer corner of the eye are denoted Eno (Enox, Enoy), and the coordinates of the inner corner of the eye are denoted Eni (Enix, Eniy).
  • The eyelid reference position estimation unit 114 takes the estimated outer corner coordinates Eno and inner corner coordinates Eni as the coordinates of the eyelid reference positions, and generates coordinate data Fn indicating these coordinates.
  • The approximate curve generation unit 115 shown in FIG. 1 generates a plurality of approximate curves for the eyelid from the normalized image data In, the light/dark change data En, and the eyelid reference position data Fn, and gives parameter data Gn, which includes a set of parameters describing each curve, to the shape determination unit 116.
  • Specifically, in the normalized image represented by the normalized image data In, the approximate curve generation unit 115 selects a plurality of coordinates from the straight line passing through the coordinates Ent of the first end point and the coordinates Enb of the second end point indicated by the light/dark change data En, and generates a plurality of approximate curves (first approximate curves) from each of the selected coordinates together with the outer corner coordinates Eno and the inner corner coordinates Eni indicated by the eyelid reference position data Fn. In the first embodiment, the approximate curve generation unit 115 selects the plurality of coordinates from the line segment between the coordinates Ent and the coordinates Enb.
  • The approximate curve generation unit 115 gives parameter data Gn including the set of parameters describing each of the plurality of approximate curves to the shape determination unit 116.
  • The curve used for the approximation is, for example, a curve of second or higher order.
  • In the first embodiment, the y coordinates of the estimated outer corner coordinates Eno and inner corner coordinates Eni are equal to the y coordinate of the coordinates Enb, so the curve Cb of the lower eyelid degenerates to a straight line. The following description therefore concentrates on the generation of the approximate curves for the upper eyelid.
  • FIG. 6 is a schematic diagram illustrating a method of generating a plurality of approximate curves for the upper eyelid.
  • The approximate curve of the upper eyelid is a curve that passes through the eyelid reference points (the coordinates of the outer and inner corners of the eye) located on the eyelid and through the end point etop of the upper eyelid. In other words, if the eyelid reference points and the upper eyelid end point etop are estimated correctly, the approximate curve can be obtained accurately.
  • However, the coordinates Ent specified from the change in brightness in order to detect the upper eyelid are not necessarily located on the upper eyelid; they may be located on eye shadow or on a double eyelid. Here, for the sake of explanation, it is assumed that the coordinates Ent are located on the double eyelid.
  • The approximate curve generation unit 115 generates the plurality of approximate curves Cu(i) shown in FIG. 6 as candidate curves passing through the eyelid reference points and the end point etop of the upper eyelid.
  • Here, i = 1, 2, 3, ..., N, where N is the number of candidate approximate curves and is an integer of 2 or more.
  • Each approximate curve is defined using the three points Cupa(i), Cupb, and Cupc.
  • The coordinates Cupa(i) are selected separately for each generated curve.
  • The plurality of coordinates Cupa(i) are selected from points on the straight line l passing through the coordinates Enb and Ent.
  • Specifically, N points on the straight line l satisfying Enty ≤ y ≤ Enby are selected and form a point group P.
  • Each point P(i) (i = 1, 2, ..., N) included in the point group P is taken as the coordinates Cupa(i).
  • The points P(i) may be selected at equal intervals, for example.
  • For the coordinates Cupb and Cupc, coordinates common to all the generated curves are selected.
  • The coordinates Cupb and Cupc are set based on the eyelid reference position data Fn.
  • The x and y coordinates of Cupa(i) are written (Cupa(i)x, Cupa(i)y), those of Cupb are written (Cupbx, Cupby), and those of Cupc are written (Cupcx, Cupcy). Note that Cupbx < Cupa(i)x < Cupcx.
  • Each approximate curve consists of a curve Cul(i) for the portion whose x coordinate is smaller than Cupa(i)x and a separately set curve Cur(i) for the portion whose x coordinate is larger; the two curves together form one curve Cu(i).
  • In the first embodiment, a quadratic function is used as the approximate curve.
  • To define a quadratic function, it suffices to specify, for example, the point that is its vertex and one other point through which it passes.
  • In the following, the approximate curves are described as quadratic functions.
  • The curve Cul(i) has the coordinates Cupa(i) as its vertex and passes through the coordinates Cupb.
  • Its parameters can be obtained from equation (4).
  • The curve Cur(i) has the coordinates Cupa(i) as its vertex and passes through the coordinates Cupc.
  • Its parameters can be obtained from equation (6); a sketch of this vertex-form construction follows.
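  • Equations (4) and (6) are not reproduced in this text; the sketch below derives the coefficients from the standard vertex form y = a(x − vx)² + vy, which matches the construction described above (helper names are ours).

```python
def quadratic_through_vertex(vx, vy, px, py):
    """Coefficients (a, b, c) of y = a*x**2 + b*x + c with vertex (vx, vy),
    passing through (px, py). Derived from y = a*(x - vx)**2 + vy."""
    a = (py - vy) / (px - vx) ** 2
    b = -2.0 * a * vx
    c = a * vx ** 2 + vy
    return a, b, c

def upper_eyelid_curve(cupa, cupb, cupc):
    """Left/right halves Cul(i), Cur(i) of one candidate curve Cu(i).

    Both halves share the vertex Cupa(i); Cul(i) passes through Cupb
    (used where x < Cupa(i)x) and Cur(i) through Cupc (used elsewhere)."""
    left = quadratic_through_vertex(*cupa, *cupb)
    right = quadratic_through_vertex(*cupa, *cupc)
    return left, right
```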
  • The parameters describing the plurality of curves Cu(i) and the curve Cb generated as described above are given to the shape determination unit 116 as parameter data Gn.
  • The parameter data Gn includes a set of parameters for each curve.
  • Specifically, the parameter data Gn includes the curve type ("quadratic function"), the number of curves ("N upper eyelid curves and one lower eyelid curve"), the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni), the coordinates Enb, and a set of parameters for each curve.
  • The set of parameters consists of the parameters abl(i), bbl(i), cbl(i), acr(i), bcr(i), and ccr(i) of the upper eyelid approximate curves, together with the straight-line parameter of the lower eyelid.
  • The straight-line parameter of the lower eyelid is the y coordinate value of any one of the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni) and the coordinates Enb. Since the y coordinates of Eno and Eni are assumed to be equal to the y coordinate of Enb, the straight line of the lower eyelid is parallel to the x axis, and it suffices to include the y coordinate value of any one of these points as the straight-line parameter.
  • The shape determination unit 116 is given the parameter data Gn including the parameters of the plurality of approximate curves, evaluates the pixel values on the approximate curve represented by each set of parameters, and calculates a shape parameter Hn representing the shapes of the upper and lower eyelids, which it gives to the eye opening degree calculation unit 130. For example, the shape determination unit 116 selects, from the plurality of approximate curves indicated by the parameter data Gn, the approximate curve most suitable as the boundary line between the driver's eye and the upper eyelid, and determines the shape of the upper eyelid from the selected curve. In the first embodiment, the selected approximate curve itself is taken as the shape of the upper eyelid.
  • The shape determination unit 116 determines the shape of the lower eyelid based on the straight-line parameter of the lower eyelid included in the parameter data Gn, that is, the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni) and the second end point coordinates Enb.
  • Specifically, the line segment between the outer corner coordinates Eno and the inner corner coordinates Eni is determined to be the shape of the lower eyelid.
  • The shape determination unit 116 then calculates a parameter indicating the shape of the driver's eye from the determined shapes of the upper and lower eyelids, and gives the shape parameter Hn indicating the calculated parameter to the eye opening degree calculation unit 130.
  • The shape parameter Hn may specify one set of approximate curve parameters for each of the upper and lower eyelids.
  • Alternatively, instead of approximate curve parameters, the shape parameter Hn may include the coordinates of the eyelid reference points, the coordinates of the upper eyelid end point etop, and the coordinates of the lower eyelid end point ebot.
  • The shape parameter Hn may also represent the curvatures of the approximate curves of the upper and lower eyelids.
  • In the first embodiment, the shape parameter Hn is the ratio ep = eh/ew of the distance eh between the end point etop of the upper eyelid and the end point ebot of the lower eyelid to the distance ew between the outer and inner corners of the eye.
  • The shape determination unit 116 obtains the average luminance value of the pixels on each of the plurality of approximate curves generated for the upper eyelid and selects the curve with the lowest average; this curve is the one most suitable as the boundary line between the eye and the upper eyelid, since the eyelid boundary is darker than the surrounding skin. In this way, the shape determination unit 116 selects the curve Cu(iu) most likely to be the upper eyelid, sets the coordinates Cupa(iu) as the end point etop of the upper eyelid, and sets the end point ebot of the lower eyelid to the end point of the lower eyelid curve Cb, that is, the coordinates Enb (see the sketch below).
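  • A minimal sketch of this darkest-curve selection follows; the candidate structure (two coefficient triples plus the vertex Cupa(i)) matches the curve-construction sketch above, and all names are ours.

```python
import numpy as np

def select_upper_eyelid(img: np.ndarray, curves, x_range):
    """Pick the candidate curve Cu(i) whose pixels are darkest on average.

    `curves` is a list of (left, right, cupa) tuples, where left/right are
    (a, b, c) coefficient triples and cupa is the vertex Cupa(i)."""
    def y_on(left, right, cupa, x):
        a, b, c = left if x < cupa[0] else right
        return a * x * x + b * x + c

    best_i, best_mean = None, np.inf
    for i, (left, right, cupa) in enumerate(curves):
        ys = [int(round(y_on(left, right, cupa, x))) for x in x_range]
        vals = [img[y, x] for x, y in zip(x_range, ys)
                if 0 <= y < img.shape[0]]
        if not vals:
            continue
        mean = float(np.mean(vals))
        if mean < best_mean:  # darkest curve = eyelid boundary
            best_i, best_mean = i, mean
    return best_i
```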
  • the value Nd can be obtained from the following equation (7).
  • the eye opening degree calculation unit 130 compares the given shape parameter Hn with the reference shape parameter Hs stored in the memory 131 in advance, and calculates the eye opening degree Kn of the driver.
  • the degree of eye opening Kn indicates the degree of opening of the driver's eyes.
  • The degree of eye opening Kn is preferably expressed as a value from 0% to 100%, with the degree of opening of the eyes in the driver's steady state taken as 100%.
  • The memory 131 holds in advance, as the reference shape parameter Hs, the ratio eps = ehs/ews between the distance ehs between the end point etop of the upper eyelid and the end point ebot of the lower eyelid in the driver's steady state and the distance ews between the outer and inner corners of the eye.
  • The eye opening degree calculation unit 130 compares the reference shape parameter Hs with the given shape parameter Hn and calculates the eye opening degree Kn by the following equation (8).
  • Kn = 100 × ep / eps (8)
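  • For example, with illustrative numbers not taken from the patent: if the steady-state reference is eps = ehs/ews = 0.30 and the current frame yields ep = eh/ew = 0.12, then Kn = 100 × 0.12 / 0.30 = 40%, meaning the eyes are open to 40% of their normal extent.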
  • Although the eyelid shape detection apparatus 110 in Embodiment 1 assumes that the boundary line between the eye and the lower eyelid is a straight line, the boundary may instead be treated as a curve.
  • In that case, the eyelid reference position estimation unit 114 calculates the coordinates of the outer and inner corners of the eye using, for example, the y coordinate Ngy of the center coordinates Ng of the normalized image instead of the coordinates Enb, and uses the calculated coordinates as the coordinates of the outer and inner corners.
  • When the approximate curve generation unit 115 selects the points through which the approximate curves pass from the straight line l, points for the upper eyelid are selected in the range Enty ≤ y ≤ Ngy and points for the lower eyelid in the range Ngy ≤ y ≤ Enby.
  • The shape determination unit 116 then determines, from the plurality of approximate curves for the upper eyelid, the curve most suitable as the boundary line between the eye and the upper eyelid in the same manner as described above, and, from the plurality of approximate curves for the lower eyelid, the curve most suitable as the boundary line between the eye and the lower eyelid.
  • Alternatively, the shape determination unit 116 may select a curve passing through the coordinates Eni, Eno, and Enb as the approximate curve of the lower eyelid.
  • The light/dark change detection unit 113 need not perform the filter processing for extracting the light/dark changes on all the pixels included in the eye presence region.
  • The upper and lower end points etop and ebot of the eyelid are basically located on the straight line that passes through the midpoint between the outer and inner corners of the eye and is parallel to the y axis.
  • The light/dark change detection unit 113 can therefore obtain the estimated coordinates Ent and Enb of the upper and lower eyelid end points simply by searching the points on that straight line. Limiting the number of pixels on which the filter processing is performed in this way shortens the processing time.
  • The pixels to be filtered are not limited to the straight line through the midpoint between the outer and inner corners of the eye: a plurality of straight lines parallel to the y axis may be selected within the range Ngx − Nw/4 ≤ x ≤ Ngx + Nw/4, and the light/dark change detection unit 113 may perform the filter processing on the pixels on those straight lines.
  • FIG. 7 is a flowchart showing processing in the eye opening degree detection apparatus 100 according to the first embodiment.
  • the flowchart illustrated in FIG. 7 is started, for example, when the face image data Im is input to the eye opening degree detection device 100.
  • the eye presence region specifying unit 111 specifies the eye presence region from the face image represented by the face image data Im, and generates eye presence region data Dm indicating the specified eye presence region (S10). Then, the eye presence region specifying unit 111 provides the generated eye presence region data Dm to the normalization processing unit 112 together with the face image data Im.
  • the normalization processing unit 112 cuts out the eye presence area from the face image represented by the face image data Im based on the eye presence area data Dm, and converts the cut out image into a predetermined image size. Normalized image data In representing the normalized image after conversion is generated, and normalized eye presence region data Dn indicating the eye presence region in the normalized image data In is generated (S11). Then, the normalization processing unit 112 gives the normalized image data In and the normalized eye presence region data Dn to the light / dark change detection unit 113.
  • the light / dark change detection unit 113 calculates the magnitude of the light / dark change in the vertical direction (y-coordinate direction) of the normalized image data In, and generates light / dark change data En (S12). Then, the light / dark change detection unit 113 gives the light / dark change data En to the eyelid reference position estimation unit 114, and gives the light / dark change data En and the normalized image data In to the approximate curve generation unit 115.
  • the eyelid reference position estimation unit 114 calculates at least two reference point positions existing on the eyelid based on the light / dark change data En, and obtains coordinate data Fn indicating coordinates of the calculated reference points on the normalized image. Generate (S13). Then, the eyelid reference position estimation unit 114 gives the coordinate data Fn to the approximate curve generation unit 115.
  • Next, the approximate curve generation unit 115 generates a plurality of approximate curves for the eyelid from the normalized image data In, the light/dark change data En, and the eyelid reference position data Fn, and generates parameter data Gn including the parameters of each of the plurality of approximate curves (S14), which it outputs to the shape determination unit 116.
  • Next, the shape determination unit 116 calculates feature amounts of the pixel values on the plurality of approximate curves indicated by the parameters included in the parameter data Gn (S15).
  • The shape determination unit 116 compares and evaluates the feature amounts of the approximate curves and selects the approximate curve most likely to be the eyelid (S16).
  • the shape determining unit 116 calculates a shape parameter Hn representing the shape of the upper eyelid and the lower eyelid from the selected approximate curve (S17). Then, the shape determination unit 116 gives the calculated shape parameter Hn to the eye opening degree calculation unit 130.
  • the eye opening degree calculation unit 130 compares the shape parameter Hn with the reference shape parameter Hs stored in advance in the memory 131, and calculates the driver's eye opening degree Kn (S18).
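  • As an end-to-end illustration of steps S11 to S18, the sketch below wires together the helper sketches introduced earlier (the eye presence region of S10 is taken as input). The corner placement, the number of candidate curves, and all names are our assumptions, not the patent's.

```python
import numpy as np

def eye_opening_degree(face_img: np.ndarray, region: "EyeRegion",
                       eps_reference: float) -> float:
    """Illustrative wiring of S11-S18 using the earlier sketches."""
    # S11: normalize the eye presence region Dr.
    norm = normalize_eye_region(face_img, region.mgx, region.mgy,
                                region.mw, region.mh)
    # S12: end points Ent/Enb from vertical light/dark changes.
    (enty, entx), (enby, enbx), _, _ = endpoint_candidates(norm)
    # S13: eyelid reference points. For a frontal face their y coordinate
    # is assumed equal to Enby (Embodiment 1); the x positions used here
    # are placeholders near the image borders, purely for illustration.
    eni, eno = (2, enby), (norm.shape[1] - 3, enby)
    # S14: N candidate curves Cu(i) with vertices on the segment Ent-Enb.
    n = 8
    curves = []
    for y in np.linspace(enty, enby, n):
        cupa = (entx, float(y))
        left, right = upper_eyelid_curve(cupa, eni, eno)
        curves.append((left, right, cupa))
    # S15-S16: the darkest candidate is the upper eyelid boundary.
    xs = range(eni[0], eno[0] + 1)
    iu = select_upper_eyelid(norm, curves, xs)
    # S17-S18: shape parameter ep = eh/ew, then Kn = 100 * ep / eps.
    etop_y = curves[iu][2][1]
    eh, ew = enby - etop_y, eno[0] - eni[0]
    return 100.0 * (eh / ew) / eps_reference
```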
  • As described above, the eyelid shape detection apparatus 110 according to the first embodiment estimates the eyelid reference positions from points with large changes in brightness, obtains a plurality of approximate curves from points included in the area defined by those points and the estimated reference positions, and compares and evaluates the feature amounts of the pixel values on the approximate curves. The shape of the eyelid can thereby be detected accurately even when edges cannot be detected in a low-quality face image.
  • In addition, because the eyelid shape detection apparatus 110 uses a vertically asymmetric filter to extract the changes in brightness at the eyelids, the position of the eyelid can be estimated even from a low-quality face image regardless of the eye state, such as whether the eye is open or closed. For this reason, the shape of the eyelid can be detected with high accuracy.
  • The eyelid shape detection apparatus 110 according to the first embodiment also sets the eyelid approximate curves separately on the left and right of the reference point, so the eyelid shape can be detected accurately even when it is asymmetric between left and right.
  • Furthermore, the eyelid shape detection apparatus 110 compares and evaluates the average luminance values of the pixels on the generated approximate curves and selects the curve with the smallest average as the approximate curve passing through the upper eyelid, so the eyelid shape can be detected accurately even if the person in the face image wears eye shadow or has a double eyelid.
  • The eye opening degree detection apparatus 100 configured as described above detects the eyelid shape after converting a high-quality face image into a low-quality one, so the eyelid shape can be detected accurately with a small amount of processing.
  • Because the eye opening degree detection apparatus 100 configured as described above can accurately detect the eyelid shape in a low-quality face image, there is a high degree of freedom in where the camera is installed. For example, even if the camera is placed away from the driver, such as near the display of a car navigation system, and the face image therefore has low image quality, the eye opening degree detection apparatus 100 can detect the eyelid shape with high accuracy.
  • Likewise, even when a camera equipped with a wide-angle lens (for example, 90° to 150°) is used and the face image has low image quality, the eye opening degree detection apparatus 100 can detect the eyelid shape with high accuracy.
  • For a high-quality face image, the eye opening degree detection apparatus 100 detects the eyelid shape after converting it to a low-quality face image, so the eyelid shape can be detected accurately with a small amount of processing.
  • Furthermore, the eye opening degree detection apparatus 100 can accurately detect the degree of eye opening in a low-quality face image by comparing the detected eyelid shape with a shape registered in advance.
  • In the first embodiment, a straight line connecting the first end point (the highest point of the first boundary line) and the second end point (the lowest point of the second boundary line) is used.
  • The straight line may be shifted to some extent in the x-axis direction. This allowable "some extent" refers to a range of about one fifth of the distance between the outer and inner corners of the eye, centered on the midpoint between them.
  • the eye opening degree detection apparatus 200 according to Embodiment 2 includes an eyelid shape detection apparatus 210 and an eye opening degree calculation unit 130.
  • the eye opening degree detection apparatus 200 according to the second embodiment is configured in the same manner as the eye opening degree detection apparatus 100 according to the first embodiment except for the eyelid shape detection apparatus 210.
  • The eyelid shape detection apparatus 210 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 213, an eyelid reference position estimation unit 114, an approximate curve generation unit 215, and a shape determination unit 216.
  • The eyelid shape detection apparatus 210 according to the second embodiment is configured in the same manner as the eyelid shape detection apparatus 110 according to the first embodiment, except for the light/dark change detection unit 213, the approximate curve generation unit 215, and the shape determination unit 216.
  • When generating the approximate curves, the eye opening degree detection apparatus 100 according to Embodiment 1 selects a plurality of points from the points satisfying Enty ≤ y ≤ Enby on the straight line l connecting the coordinates Ent and Enb, and generates an approximate curve for each.
  • the eye-opening degree detection apparatus 200 according to Embodiment 2 reduces the processing time by limiting the range of points used for the approximate curve.
  • the light / dark change detection unit 213 generates the light / dark change data En by calculating the magnitude of the change in the vertical brightness of the face image with respect to the normalized image data In. Then, the light / dark change detection unit 213 supplies the light / dark change data En to the eyelid reference position estimation unit 114 and the approximate curve generation unit 215 together with the normalized image data In.
  • The light/dark change data En includes the coordinates Ent and Enb, a region Rnt where the light-to-dark change is large (first selection region), and a region Rnb where the dark-to-light change is large (second selection region).
  • The regions Rnt and Rnb are determined from the filter evaluation values. Specifically, using a threshold THtf for the filter Etf and a threshold THbf for the filter Ebf, the light/dark change detection unit 213 sets the region where the evaluation value of the filter Etf is equal to or greater than THtf as the region Rnt, and the region where the evaluation value of the filter Ebf is equal to or greater than THbf as the region Rnb.
  • The regions Rnt and Rnb are obtained from the pixels on the straight line l connecting the coordinates Ent, where the light-to-dark change is maximum, and the coordinates Enb, where the dark-to-light change is maximum.
  • FIGS. 8(A) and 8(B) are schematic diagrams illustrating the method of calculating the region Rnt.
  • FIG. 8(A) shows the evaluation value of the filter Etf and the straight line l for the pixels near the coordinates Ent.
  • In this example the threshold THtf is 100, and the pixels belonging to the region Rnt, where the evaluation value of the filter Etf is equal to or greater than THtf, are hatched or cross-hatched.
  • The light/dark change detection unit 213 starts from the coordinates Ent and first scans along the straight line l in the y-axis direction (downward in the figure). As illustrated in FIG. 8(B), the pixel scanned immediately before the first pixel whose evaluation value of the filter Etf falls below THtf is taken as the pixel Rntb at the lower end of the region Rnt. In the example shown in FIG. 8(B), scanning downward along the straight line l from the coordinates Ent yields the evaluation values 150, 124, and 46 in that order.
  • The value first falls below the threshold at 46, so the light/dark change detection unit 213 takes the pixel with evaluation value 124 as the pixel Rntb at the lower end of the region Rnt.
  • Next, the light/dark change detection unit 213 scans along the straight line l in the direction opposite to the y axis (upward in the figure), and takes the pixel scanned immediately before the first pixel whose evaluation value of the filter Etf falls below THtf as the pixel Rntt at the upper end of the region Rnt.
  • In this example, scanning upward yields the evaluation values 100, 90, and 31. The value first falls below THtf in moving from 100 to 90, so the pixel with evaluation value 100 is taken as the pixel Rntt at the upper end of the region Rnt.
  • Similarly, the light/dark change detection unit 213 obtains the pixel Rnbt at the upper end and the pixel Rnbb at the lower end of the region Rnb based on the coordinates Enb and the threshold THbf.
  • The region Rnt is indicated, for example, by the y coordinates of the pixels Rntt and Rntb, and the region Rnb by the y coordinates of the pixels Rnbt and Rnbb.
  • The threshold THtf and the threshold THbf that define the regions Rnt and Rnb may be defined in advance as constants, or may be obtained from the maximum and minimum evaluation values of the filters Etf and Ebf.
  • For example, the threshold THtf and the threshold THbf may be obtained from the following equations (9) and (10).
  • THtf = α × (FntMax − FntMin) + FntMin (9)
  • THbf = α × (FnbMax − FnbMin) + FnbMin (10)
  • Here, FntMax and FntMin are the maximum and minimum evaluation values obtained when the pixels in the eye presence region are processed with the filter Etf, and FnbMax and FnbMin are the maximum and minimum evaluation values obtained with the filter Ebf.
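  • The sketch below combines equation (9) with the scanning procedure described above; the coefficient α = 0.5 and the restriction of the scan to the single vertical line x = Entx are our assumptions.

```python
import numpy as np

def selection_region(etv: np.ndarray, ent: tuple, alpha: float = 0.5):
    """Compute THtf via equation (9) and scan along the line through Ent
    to find the first selection region Rnt.

    `etv` is the Etv evaluation-value map from the earlier sketch; `ent`
    is (Enty, Entx)."""
    finite = etv[np.isfinite(etv)]
    thtf = alpha * (finite.max() - finite.min()) + finite.min()  # eq. (9)
    enty, entx = ent
    # Scan downward: keep going while the evaluation value stays >= THtf.
    y_bottom = enty
    while y_bottom + 1 < etv.shape[0] and etv[y_bottom + 1, entx] >= thtf:
        y_bottom += 1
    # Scan upward in the same way.
    y_top = enty
    while y_top - 1 >= 0 and etv[y_top - 1, entx] >= thtf:
        y_top -= 1
    return y_top, y_bottom  # pixels Rntt and Rntb bounding the region Rnt
```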
  • the approximate curve generation unit 215 generates a plurality of approximate curves for the eyelid from the normalized image data In, the light / dark change data En, and the eyelid reference position data Fn, and sets parameters indicating each of the plurality of approximate curves. Parameter data Gn including the set is given to the shape determining unit 216.
  • When selecting the coordinates Cupa(i), the approximate curve generation unit 215 selects from the region Rnt for the upper eyelid and from the region Rnb for the lower eyelid. By generating the approximate curves so that they pass through regions where the change in brightness is large, the approximate curve generation unit 215 can reduce the number of candidate curves compared with the first embodiment and thereby shorten the processing time.
  • From the plurality of approximate curves (first approximate curves) corresponding to the upper eyelid, the shape determination unit 216 selects the curve most suitable as the boundary line between the eye and the upper eyelid. Likewise, from the plurality of approximate curves (second approximate curves) corresponding to the lower eyelid, the shape determination unit 216 selects the curve most suitable as the boundary line between the eye and the lower eyelid; the curves are generated and selected in the same manner as for the upper eyelid.
  • Since the eyelid shape detection apparatus 210 according to the second embodiment generates approximate curves that pass through regions with a large change in brightness, the shape of the eyelid can be detected in a short processing time.
  • an eye opening degree detection apparatus 300 according to Embodiment 3 includes an eyelid shape detection apparatus 310 and an eye opening degree calculation unit 130.
  • the eye opening degree detection apparatus 300 according to the third embodiment is configured in the same manner as the eye opening degree detection apparatus 100 according to the first embodiment, except for the eyelid shape detection apparatus 310.
  • The eyelid shape detection apparatus 310 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 313, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, and a shape determination unit 116.
  • The eyelid shape detection apparatus 310 according to the third embodiment is configured in the same manner as the eyelid shape detection apparatus 110 according to the first embodiment, except for the light/dark change detection unit 313.
  • In the first embodiment, the eye opening degree detection apparatus 100 finds the two points with the largest light/dark changes (light to dark, and dark to light) in the eye presence region and generates the light/dark change data En. In that case the filter processing is required for all the pixels in the eye presence region, so the processing load is large.
  • In contrast, the eye opening degree detection apparatus 300 according to Embodiment 3 sets a plurality of straight lines in the eye presence region and finds, on each straight line, the two points with large light/dark changes. The eye opening degree detection apparatus 300 then calculates the distance between the two points on each straight line and generates light/dark change data En indicating the coordinates of the pair with the largest distance. The pixels to be filtered can thus be limited to pixels on the straight lines, and the eyelid shape detection becomes robust to changes in eyelid shape caused by changes in face orientation and by individual differences.
  • The light/dark change detection unit 313 generates the light/dark change data En by calculating the magnitude of the change in brightness in the vertical direction of the normalized image represented by the normalized image data In. For example, the light/dark change detection unit 313 sets a plurality of predetermined straight lines extending in the vertical direction in the normalized image and, on each of these straight lines, calculates the distance between the coordinates of a first candidate pixel, at which the decrease in brightness from the pixel above is largest, and the coordinates of a second candidate pixel, at which the decrease in brightness from the pixel below is largest.
  • The light/dark change detection unit 313 identifies the first and second candidate pixels with the largest calculated distance, takes the coordinates of the identified first candidate pixel as the coordinates Ent of the first end point and the coordinates of the identified second candidate pixel as the coordinates Enb of the second end point, and supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
Specifically, the light/dark change detection unit 313 sets a plurality of straight lines on the normalized image represented by the normalized image data In and executes the processing of the filter Etf and the filter Ebf along each straight line, obtaining for each straight line the point at which each filter's evaluation value becomes maximum. On the straight line l(k) (k ∈ {1, 2, 3}), the coordinates of the point at which the evaluation value of the filter Etf is maximum are taken as the coordinates Ent(k) of the first candidate pixel, and the coordinates of the point at which the evaluation value of the filter Ebf is maximum are taken as the coordinates Enb(k) of the second candidate pixel. The light/dark change detection unit 313 then calculates the distance d(k) between the coordinates Ent(k) and Enb(k) on each straight line as

d(k) = Enby(k) − Enty(k)

where Enby(k) is the y coordinate of Enb(k) and Enty(k) is the y coordinate of Ent(k). Finally, the light/dark change detection unit 313 selects the maximum of the obtained distances d(k) (the k corresponding to the selected distance is denoted K) and generates the light/dark change data En indicating the coordinates Ent(K) and Enb(K).
In this example the number of straight lines is set to three, but the number is not particularly limited. Further, although straight lines parallel to the y axis are selected here, the present invention is not limited to this, and arbitrary straight lines may be set.
Since the eyelid shape detection device 310 according to Embodiment 3 can limit the pixels to be filtered to pixels on the straight lines, the processing time can be shortened. Further, by selecting the pair of points with the largest distance between the two kinds of points with a large change in brightness, two points close to the upper and lower end points of the eyelid can be selected, so that the generated approximate curves have shapes close to the original eyelid shape. For this reason, the eyelid shape can be detected with higher accuracy.
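For illustration only (this is not code from the patent), the per-line search of Embodiment 3 can be sketched in Python as follows. The filter evaluation functions etf_value and ebf_value are assumed to be supplied, corresponding to the filters Etf and Ebf of Embodiment 1, and xs lists the x positions of the straight lines l(k):

```python
import numpy as np

def detect_endpoints_on_lines(img, xs, etf_value, ebf_value):
    """For each vertical line x in xs, find the pixel maximizing the
    bright-to-dark response Etf (first candidate Ent(k)) and the pixel
    maximizing the dark-to-bright response Ebf (second candidate Enb(k)),
    then keep the pair whose distance d(k) = Enby(k) - Enty(k) is largest."""
    h, _ = img.shape
    best = None
    for x in xs:
        etv = [etf_value(img, x, y) for y in range(h)]
        ebv = [ebf_value(img, x, y) for y in range(h)]
        ent_y = int(np.argmax(etv))   # first candidate pixel Ent(k)
        enb_y = int(np.argmax(ebv))   # second candidate pixel Enb(k)
        d = enb_y - ent_y             # d(k) = Enby(k) - Enty(k)
        if best is None or d > best[0]:
            best = (d, (x, ent_y), (x, enb_y))
    _, ent, enb = best
    return ent, enb                   # coordinates Ent(K) and Enb(K)
```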
Embodiment 4.
FIG. 9 is a block diagram schematically showing the configuration of an eye opening degree detection apparatus 400 according to Embodiment 4. The eye opening degree detection apparatus 400 includes an eyelid shape detection device 410 and an eye opening degree calculation unit 130, and is configured in the same manner as the eye opening degree detection apparatus 100 according to Embodiment 1, except for the eyelid shape detection device 410. The eyelid shape detection device 410 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 413, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, and a shape determination unit 116, and is configured in the same manner as the eyelid shape detection device 110 in Embodiment 1, except for the light/dark change detection unit 413. Furthermore, in Embodiment 4, face direction information Hd indicating the direction of the driver's face is given from the outside to the light/dark change detection unit 413.
In Embodiment 3, the eye opening degree detection apparatus 300 calculates the distance between the two types of points with a large change in brightness on each of a plurality of straight lines on the normalized image, and limits the pixels to which the filter processing is applied by using the two points on the straight line with the longest distance. In contrast, the eye opening degree detection apparatus 400 according to Embodiment 4 further reduces the processing time by using the face orientation information Hd to identify, from the plurality of straight lines, the straight line that should be used for extracting the light/dark change.
FIG. 10 is a schematic diagram illustrating an example of an image near the eye when the driver is facing sideways. Unlike the example of the image near the eye shown in FIG. 2, where the driver faces the front, when the driver faces sideways the coordinates Ent of the upper end point and the coordinates Enb of the lower end point of the eyelid to be detected tend to be biased toward the left side of the eye presence region. Such a lateral (x-axis direction) bias of the coordinates Ent and Enb of the upper and lower end points of the eyelid is caused mainly by a horizontal change in the face direction.
Since the light/dark change detection unit 413 determines the straight line from which the light/dark change is extracted based on the face orientation information Hd, it does not need to perform the filter processing on a plurality of straight lines as the light/dark change detection unit 313 in Embodiment 3 does.
The light/dark change detection unit 413 generates the light/dark change data En by calculating the magnitude of the change in brightness in the vertical direction of the normalized image represented by the normalized image data In. For example, the light/dark change detection unit 413 sets a plurality of predetermined straight lines extending in the vertical direction in the normalized image and selects one of them based on the orientation of the person's face. On the selected straight line, the light/dark change detection unit 413 takes the coordinates of the pixel with the greatest decrease in brightness from the pixel above it as the coordinates Ent of the first end point, and the coordinates of the pixel with the greatest decrease in brightness from the pixel below it as the coordinates Enb of the second end point. The light/dark change detection unit 413 then supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
Specifically, the light/dark change detection unit 413 sets a plurality of straight lines in the normalized image represented by the normalized image data In and selects the straight line to be filtered based on the horizontal face orientation information Hd.
Here, the face direction information Hd indicates the direction of the driver's face as a rotation angle, with the front direction being 0 degrees, the right direction a positive rotation angle, and the left direction a negative rotation angle. FIG. 10 shows a state in which the face is turned rightward, that is, in the positive direction.
The light/dark change detection unit 413 provides a threshold THfdp and a threshold THfdn (THfdp > THfdn) for the face direction (rotation angle) indicated by the face direction information Hd: if Hd ≥ THfdp, the straight line l(1) is selected; if THfdp > Hd > THfdn, the straight line l(2) is selected; and if Hd ≤ THfdn, the straight line l(3) is selected. The light/dark change detection unit 413 then performs the filter processing on the selected straight line and generates the light/dark change data En indicating the coordinates of the two types of points with a large light/dark change (bright to dark, dark to bright).
Since the eyelid shape detection device 410 according to Embodiment 4 can select the straight line to be used for extracting the light/dark change from the plurality of straight lines by using the face orientation information Hd, the number of filter operations is reduced and the processing time can be shortened.
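A minimal sketch of this line-selection rule, assuming only that the thresholds THfdp and THfdn and the three line positions are known configuration values:

```python
def select_line(hd, th_fdp, th_fdn, lines):
    """Select the vertical line to filter from the horizontal face
    direction hd (degrees; 0 = front, positive = right, negative = left).
    lines = (l1, l2, l3) are the x positions of the candidate lines."""
    assert th_fdp > th_fdn
    if hd >= th_fdp:       # face turned right: end points shift left
        return lines[0]    # straight line l(1)
    elif hd > th_fdn:      # roughly frontal
        return lines[1]    # straight line l(2)
    else:                  # face turned left
        return lines[2]    # straight line l(3)
```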
Embodiment 5.
FIG. 11 is a block diagram schematically showing the configuration of an eye opening degree detection apparatus 500 according to Embodiment 5. The eye opening degree detection apparatus 500 includes an eyelid shape detection device 510 and an eye opening degree calculation unit 130, and is configured in the same manner as the eye opening degree detection apparatus 100 according to Embodiment 1, except for the eyelid shape detection device 510. The eyelid shape detection device 510 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 113, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, an approximate curve selection unit 517, a correction curve generation unit 518, and a shape determination unit 516. The eyelid shape detection device 510 according to Embodiment 5 is configured in the same manner as the eyelid shape detection device 110 in Embodiment 1, except that the processing by the shape determination unit 516 differs and the approximate curve selection unit 517 and the correction curve generation unit 518 are additionally provided.
The approximate curve selection unit 517 is given the parameter data Gn including the parameters of the plurality of approximate curves and, by the same processing as the shape determination unit 116 in Embodiment 1, evaluates the pixel values on the approximate curve represented by each set of parameters, thereby specifying the coordinates of the end point etop of the upper eyelid and the end point ebot of the lower eyelid. The approximate curve selection unit 517 then generates shape parameter data On indicating the coordinates Cupa(iu) of the end point etop of the upper eyelid, the coordinates Enb of the end point ebot of the lower eyelid, the coordinates Eno of the outer eye corner, and the coordinates Eni of the inner eye corner, and gives it to the correction curve generation unit 518. Hereinafter, the end point etop of the upper eyelid is also referred to as a fixed end point.
The correction curve generation unit 518 generates, from the given shape parameter data On, a plurality of correction curves, which are approximate curves for correcting the positions of the outer and inner eye corners, and provides correction curve parameter data Pn, which includes the parameters of each of the generated correction curves, to the shape determination unit 516.
FIG. 12 is a schematic diagram for explaining the generation of a correction curve for the upper eyelid.
While the approximate curve selection unit 517 calculates the coordinates Cupa(iu) of the end point etop of the upper eyelid and the coordinates Enb of the end point ebot of the lower eyelid, the coordinates Eni of the inner eye corner and the coordinates Eno of the outer eye corner are only estimated values and lack accuracy. Therefore, the correction curve generation unit 518 generates correction curves, which are approximate curves for correcting the coordinates Eni and Eno.
Each correction curve is defined by three points through which it passes: the coordinates Pupa, Pupb(i), and Pupc(i). Hereinafter, the coordinates Pupb(i) are also referred to as first selected coordinates and the coordinates Pupc(i) as second selected coordinates. The coordinate Pupa is a point common to the plurality of generated curves; the coordinates Cupa(iu) of the end point etop of the upper eyelid are used as Pupa. The coordinates Pupb(i) and Pupc(i) are selected separately for each of the generated curves. The x and y coordinates of Pupb(i) are written (Pupb(i)x, Pupb(i)y), and the x and y coordinates of Pupc(i) are written (Pupc(i)x, Pupc(i)y).
A plurality of coordinates Pupb(i) are selected from points on the straight line l(1) passing through the coordinates Eni. Specifically, M points (M is an integer of 2 or more) satisfying Eniy − mgin ≤ y ≤ Eniy + mgin are selected on the straight line l(1), and the selected points form a point group Pin; each point Pin(i) (i ∈ {1, 2, ..., M}) included in the point group Pin is set as Pupb(i). The value mgin is a predetermined constant. Similarly, a plurality of coordinates Pupc(i) are selected from points on the straight line l(3) passing through the coordinates Eno: M points satisfying Enoy − mgout ≤ y ≤ Enoy + mgout are selected on the straight line l(3), the selected points form a point group Pout, and each point Pout(i) (i ∈ {1, 2, ..., M}) included in the point group Pout is set as Pupc(i). The value mgout is a predetermined constant. The points Pin(i) and Pout(i) may simply be selected at equal intervals.
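For illustration, the equally spaced candidate points could be generated as follows (a sketch; numpy is used only for convenience):

```python
import numpy as np

def candidate_points(x_line, center_y, margin, m):
    """Return M points on the vertical line x = x_line whose y coordinates
    are equally spaced in [center_y - margin, center_y + margin]."""
    ys = np.linspace(center_y - margin, center_y + margin, m)
    return [(x_line, float(y)) for y in ys]

# point group Pin on l(1) through Eni, and Pout on l(3) through Eno:
# pin = candidate_points(eni_x, eni_y, mgin, M)
# pout = candidate_points(eno_x, eno_y, mgout, M)
```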
Depending on the face orientation in the face image input to the device, the eyelid shape may be asymmetrical. Therefore, each correction curve Pu(i) consists of two curves: a curve Pul(i) on the side where the x coordinate is smaller than the x coordinate of the coordinate Pupa, and a curve Pur(i) on the side where it is larger. In the present embodiment, a quadratic function is used as the correction curve; a quadratic function can be uniquely defined by specifying its vertex and one point through which it passes, and in the following description the correction curve is described as a quadratic function.
The curve Pul(i) has the coordinate Pupa as its vertex and the coordinate Pupb(i) as a point it passes through other than the vertex; its parameters can be obtained by equation (12). The curve Pur(i) has the coordinate Pupa as its vertex and the coordinate Pupc(i) as a point it passes through other than the vertex; its parameters can be obtained by equation (14).
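Since equations (12) and (14) are not reproduced in this text, the following sketch assumes the usual vertex form y = a(x − b)^2 + c, in which the vertex (b, c) is Pupa and the remaining parameter a is fixed by one additional point on the curve:

```python
def quadratic_through_vertex(vertex, point):
    """Parameters (a, b, c) of y = a * (x - b)**2 + c whose vertex is
    `vertex` and which passes through `point`; in the notation above,
    Pul(i) uses point = Pupb(i) and Pur(i) uses point = Pupc(i)."""
    (bx, cy), (px, py) = vertex, point
    if px == bx:
        raise ValueError("the passing point must differ from the vertex in x")
    a = (py - cy) / (px - bx) ** 2
    return a, bx, cy
```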
The parameters indicating the plurality of curves Pu(i) and the curve Cb generated as described above are given to the shape determination unit 516 as the correction curve parameter data Pn. The correction curve parameter data Pn includes the type of curve (quadratic function), the number of curves (M for the upper eyelid and one for the lower eyelid), the eyelid reference points (the coordinates Eno of the outer corner and the coordinates Eni of the inner corner) together with the coordinates Enb, and a set of parameters for each curve. Each set of parameters consists of the upper eyelid correction curve parameters abl(i), bbl(i), cbl(i), acr(i), bcr(i), and ccr(i), and the linear parameter of the lower eyelid. The linear parameter of the lower eyelid is the y coordinate value of one of the eyelid reference points (the outer-corner coordinates Eno and the inner-corner coordinates Eni) or of the coordinates Enb.
The shape determination unit 516 shown in FIG. 11 is given the correction curve parameter data Pn including the parameters of the plurality of correction curves, evaluates the pixel values on the correction curve represented by each set of parameters, and calculates a shape parameter Hn representing the shapes of the upper and lower eyelids. The shape determination unit 516 then gives the shape parameter Hn to the eye opening degree calculation unit 130.
Specifically, the shape determination unit 516 obtains, for each of the plurality of correction curves, the average of the luminance values of the pixels on the curve and selects the curve with the lowest average; this selects the correction curve that passes along the upper eyelid. In this way, the shape determination unit 516 selects the curve Pu(iu) most likely to be the upper eyelid from among the correction curves and takes the coordinates Pupb(iu) as the coordinates of the inner corner and Pupc(iu) as the coordinates of the outer corner.
The x coordinate value of the inner-corner coordinates Pupb(iu) is the x coordinate value of the estimated inner-corner coordinates Eni, and the y coordinate value of Pupb(iu) can be obtained by substituting that x value into the correction curve Pu(iu). Likewise, the x coordinate value of the outer-corner coordinates Pupc(iu) is the x coordinate value of the estimated outer-corner coordinates Eno, and the y coordinate value of Pupc(iu) can be obtained by substituting the x coordinate value of the estimated coordinates Eno into the correction curve Pu(iu). Note that the x coordinate value of the coordinate Pupa is the parameter bbl(i) or the parameter bcr(i), and the y coordinate value of Pupa is the parameter cbl(i) or the parameter ccr(i).
The value Lw can be obtained by equation (15), and the value Ld by equation (16). In this way, a plurality of correction curves are generated again from the end point of the upper eyelid and the end point of the lower eyelid calculated using the approximate curves, and the positions of the outer and inner eye corners are corrected; this makes it possible to detect the eyelid shape with high accuracy.
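As an illustrative sketch of the selection step performed by the shape determination unit 516 (the names and the sampling strategy below are assumptions, not taken from the patent): each candidate quadratic is sampled between the inner-corner and outer-corner x coordinates, and the curve with the lowest mean luminance is kept, on the premise that the eyelid boundary is the darkest path.

```python
import numpy as np

def select_darkest_curve(img, curves, x_start, x_end):
    """curves: list of (a, b, c) parameter triples of y = a*(x-b)**2 + c.
    Returns the index of the curve whose sampled pixels are darkest on
    average, i.e. the curve most likely to lie on the eyelid boundary."""
    h, w = img.shape
    best_i, best_mean = -1, np.inf
    xs = np.arange(max(0, x_start), min(w, x_end + 1))
    for i, (a, b, c) in enumerate(curves):
        ys = np.clip((a * (xs - b) ** 2 + c).round().astype(int), 0, h - 1)
        mean = img[ys, xs].mean()      # average luminance on the curve
        if mean < best_mean:
            best_i, best_mean = i, mean
    return best_i
```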
Embodiment 6.
FIG. 13 is a block diagram schematically showing the configuration of a dozing level detection apparatus 600 according to Embodiment 6. The dozing level detection apparatus 600 includes one of the eye opening degree detection apparatuses 100 to 500 and a dozing level calculation unit 601. The dozing level detection apparatus 600 receives face image data Im continuously at regular intervals and outputs the driver's dozing level Qn. The face image data Im is given, for example, at 30 frames per second, and the dozing level Qn is updated and output each time new face image data Im is input. The eye opening degree detection apparatus may be any one of the eye opening degree detection apparatuses 100 to 500 described in Embodiments 1 to 5, and the eye opening degree Kn calculated by it is given to the dozing level calculation unit 601.
The dozing level is a value indicating the degree of sleepiness. For example, the Karolinska Sleepiness Scale (KSS) indicates the degree of sleepiness by a numerical value on 9 levels from 1 (very alert) to 9 (very sleepy); the degree of sleepiness may also be indicated on 10 levels obtained by adding a level 10 (sleeping) to these 9 levels. Here, the dozing level is considered on the basis of the 10-level value and is defined as a continuous rather than a stepwise value, so that values expressed with decimal points, such as 1.1 and 5.4, are also included. Note that the numerical value of the dozing level is not limited to the above 9 or 10 levels; it may, for example, be indicated by a numerical value of 1 to 5.
The dozing level calculation unit 601 is given the eye opening degree Kn and calculates the driver's dozing level Qn from the eye opening degrees Kn of the past Nq frames. For example, the dozing level calculation unit 601 calculates PERCLOS (Percentage of Eyelid Closure) over the past Nq frames and computes the dozing level from it. Here, the closed-eye state is, for example, a state in which frames whose eye opening degree Kn is equal to or less than a threshold Knh are observed continuously for Nqch frames.
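One possible PERCLOS computation consistent with this definition is sketched below (hedged: the exact counting rule is not spelled out in this text, so runs shorter than Nqch frames are simply not counted here):

```python
def perclos(kn_history, knh, nqch):
    """Fraction of the frames in kn_history belonging to a closed-eye
    state, where a closed-eye state is a run of at least nqch consecutive
    frames with eye opening degree Kn <= knh."""
    counted, run = 0, 0
    for kn in kn_history:
        if kn <= knh:
            run += 1
            if run == nqch:
                counted += nqch    # the whole run so far now qualifies
            elif run > nqch:
                counted += 1
        else:
            run = 0
    return counted / len(kn_history) if kn_history else 0.0
```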
The dozing level Qn is calculated by the following equation (17):

Qn = fq(PERCLOS)   (17)

Here, fq is an arbitrary function taking PERCLOS as a variable; for example, a linear function, a quadratic function, or a nonlinear function using a neural network is used.
For example, a function represented by equation (18) or (19) can be used as a function fq that yields a numerical value of 1 to 10. Each parameter in equation (18) or (19) is statistically calculated from past experimental data. For example, in an experiment measuring the degree of drowsiness, by evaluating the dozing level using the Karolinska Sleepiness Scale while simultaneously measuring PERCLOS, the relationship between the degree of drowsiness and PERCLOS can be acquired; based on the data acquired in this way, the parameters for calculating the dozing level from PERCLOS are identified and applied to equation (18) or (19).
A neural network is a technique for modeling the correspondence between input variables and results as a function. If a neural network is used, the correspondence between PERCLOS and the dozing level obtained from experimental data can be acquired as a function; that is, a function is obtained that outputs a plausible dozing level for the PERCLOS given as input. However, data identical to the past data is not always obtained in actual use, and in that case an output outside the range 1 to 10 may be produced. As shown in equation (18), this can be handled by setting the output to 10 when a value larger than 10 is obtained and to 0 when a value smaller than 0 is obtained.
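For instance, a clipped linear fq could look as follows; the coefficients are illustrative placeholders, not the statistically fitted parameters of equations (18) and (19):

```python
def dozing_level(perclos_value, alpha=12.0, beta=1.0):
    """Linear example of fq in Qn = fq(PERCLOS), with the output clamped
    in the same way as described for equation (18).
    alpha and beta are placeholder coefficients."""
    qn = alpha * perclos_value + beta
    return max(0.0, min(10.0, qn))
```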
The dozing level may also be calculated from a plurality of PERCLOS values: with PERCLOS over the past Nq frames denoted PERCLOS(1), PERCLOS over the past 2×Nq to Nq frames denoted PERCLOS(2), and so on, the dozing level Qn may be calculated by equation (21). The PERCLOS values in equation (21) are as follows. Let the current frame be F(k), Nq be 900, and NN = 10. Then the PERCLOS calculated for F(k) to F(k−899) is PERCLOS(1), the PERCLOS calculated for F(k−900) to F(k−1799) is PERCLOS(2), the PERCLOS calculated for F(k−1800) to F(k−2699) is PERCLOS(3), and so on, up to the PERCLOS calculated for F(k−8100) to F(k−8999), which is PERCLOS(10). Each PERCLOS(n) is used as a variable of fq, and the dozing level Qn is calculated.
In addition to PERCLOS, arbitrary features such as the interval at which the closed-eye state occurs, the blink frequency, and the blink interval may be extracted and used as variables of fq. The driver may then be alerted or warned using the dozing level output by the dozing level detection apparatus 600 configured as described above.
Part or all of the eye opening degree detection apparatuses 100 to 500 and the dozing level detection apparatus 600 described above can be configured, for example as shown in FIG. 14(A), by a memory 150 and a processor 151 such as a CPU (Central Processing Unit) that executes a program stored in the memory 150. Such a program may be provided through a network, or may be provided recorded on a recording medium. Alternatively, part or all of the eye opening degree detection apparatuses 100 to 500 and the dozing level detection apparatus 600 can be configured by a processing circuit 152 such as a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array).
In the above description, the eye opening degree Kn is used as the output of the eye opening degree detection apparatuses 100 to 500, but the present invention is not limited to such an example; for example, the shape parameter Hn may be used as the output. In that case, the eyelid shape detection devices 110 to 510 function as information processing devices. The dozing level detection apparatus 600 that outputs the dozing level Qn likewise functions as an information processing device.

Abstract

The present invention is provided with: a brightness-change detecting unit (113) that estimates the coordinates of a first endpoint existing on a first boundary between an eye and an upper eyelid and the coordinates of a second endpoint existing on a second boundary between the eye and a lower eyelid by detecting the magnitudes of variations in brightness between pixels along the vertical direction in an eye presence region identified from a face image; an eyelid-reference-position estimating unit (114) that estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region; an approximate-curve generating unit (115) that selects a plurality of coordinates from a first selected region formed on a straight line passing the first endpoint and the second endpoint in the eye presence region and that generates a plurality of approximate curves from the individual coordinates in the plurality of selected coordinates, the coordinates of the inner corner of the eye, and the coordinates of the outer corner of the eye; and a shape determining unit (116) that selects an approximate curve that is most suitable as the first boundary from among the plurality of generated approximate curves.

Description

Patent Document 1: Japanese Patent No. 4137969
Patent Document 2: Japanese Patent No. 4825737
Patent Document 3: Japanese Patent No. 4623044
An information processing apparatus according to a first aspect of the present invention includes: an eye presence region specifying unit that specifies, from a person's face image, an eye presence region, which is a region where an eye exists; a light/dark change detection unit that estimates the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid by detecting the magnitude of changes in brightness between pixels in the vertical direction of the eye presence region; an eyelid reference position estimation unit that estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region; an approximate curve generation unit that selects a plurality of coordinates from a straight line passing through the first end point and the second end point in the eye presence region and generates a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; and a shape determination unit that selects the first approximate curve most suitable as the first boundary line from the plurality of first approximate curves and determines the shape of the upper eyelid from the selected first approximate curve. The light/dark change detection unit specifies, based on the changes in brightness between the pixels, a first selection region that is formed on the straight line passing through the first end point and the second end point in the eye presence region and includes the first end point, and the approximate curve generation unit selects the plurality of coordinates from the first selection region.
An information processing apparatus according to a second aspect of the present invention includes: an eye presence region specifying unit that specifies, from a person's face image, an eye presence region, which is a region where an eye exists; a light/dark change detection unit that estimates the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid by detecting the magnitude of changes in brightness between pixels in the vertical direction of the eye presence region; an eyelid reference position estimation unit that estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region; an approximate curve generation unit that selects a plurality of coordinates from a straight line passing through the first end point and the second end point in the eye presence region and generates a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; an approximate curve selection unit that determines the coordinates of a fixed end point, which fixes the uppermost point of the first boundary line, by selecting the first approximate curve most suitable as the first boundary line from the plurality of first approximate curves; a correction curve generation unit that selects a plurality of first selection coordinates from a straight line extending in the vertical direction through the coordinates of the inner corner and a plurality of second selection coordinates from a straight line extending in the vertical direction through the coordinates of the outer corner in the eye presence region, and generates a plurality of correction curves, which are approximate curves, from each of the first selection coordinates, each of the second selection coordinates, and the coordinates of the fixed end point; and a shape determination unit that selects the correction curve most suitable as the first boundary line from the plurality of correction curves and determines the shape of the upper eyelid from the selected correction curve.
An information processing method according to the first aspect of the present invention specifies, from a person's face image, an eye presence region, which is a region where an eye exists; estimates the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid by detecting the magnitude of changes in brightness between pixels in the vertical direction of the eye presence region; estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region; specifies, based on the changes in brightness between the pixels, a first selection region that is formed on a straight line passing through the first end point and the second end point in the eye presence region and includes the first end point; selects a plurality of coordinates from the first selection region; generates a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; selects the first approximate curve most suitable as the first boundary line from the plurality of first approximate curves; and determines the shape of the upper eyelid from the selected first approximate curve.
An information processing method according to the second aspect of the present invention specifies, from a person's face image, an eye presence region, which is a region where an eye exists; estimates the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid by detecting the magnitude of changes in brightness between pixels in the vertical direction of the eye presence region; estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region; selects a plurality of coordinates from a straight line passing through the first end point and the second end point in the eye presence region; generates a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; determines the coordinates of a fixed end point, which fixes the uppermost point of the first boundary line, by selecting the first approximate curve most suitable as the first boundary line from the plurality of first approximate curves; selects a plurality of first selection coordinates from a straight line extending in the vertical direction through the coordinates of the inner corner in the eye presence region; selects a plurality of second selection coordinates from a straight line extending in the vertical direction through the coordinates of the outer corner in the eye presence region; generates a plurality of correction curves, which are approximate curves, from each of the first selection coordinates, each of the second selection coordinates, and the coordinates of the fixed end point; selects the correction curve most suitable as the first boundary line from the plurality of correction curves; and determines the shape of the upper eyelid from the selected correction curve.
According to one aspect of the present invention, a plurality of approximate curves are generated based on points with a large change in brightness, the curves are compared and evaluated, and the boundary line between the eye and the eyelid is determined; therefore, the eyelid shape can be detected with high accuracy even for a low-quality face image or a face image using near-infrared light.
FIG. 1 is a block diagram schematically showing the configuration of the eye opening degree detection apparatuses according to Embodiments 1 to 3.
FIGS. 2(A) and 2(B) are schematic diagrams for explaining a face image captured by a low-resolution camera.
FIG. 3 is a schematic diagram for explaining the eye presence region in Embodiment 1.
FIGS. 4(A) and 4(B) are schematic diagrams showing the filters used by the light/dark change detection unit in Embodiment 1.
FIG. 5 is a schematic diagram showing an example in which coordinates are specified on a double eyelid in Embodiment 1.
FIG. 6 is a schematic diagram showing a method of generating a plurality of approximate curves for the upper eyelid in Embodiment 1.
FIG. 7 is a flowchart showing the processing in the eye opening degree detection apparatus according to Embodiment 1.
FIGS. 8(A) and 8(B) are schematic diagrams showing the method of calculating the region in Embodiment 2.
FIG. 9 is a block diagram schematically showing the configuration of the eye opening degree detection apparatus according to Embodiment 4.
FIG. 10 is a schematic diagram showing an example of an image near the eye when the driver faces sideways in Embodiment 4.
FIG. 11 is a block diagram schematically showing the configuration of the eye opening degree detection apparatus according to Embodiment 5.
FIG. 12 is a schematic diagram for explaining the generation of a correction curve for the upper eyelid in Embodiment 5.
FIG. 13 is a block diagram schematically showing the configuration of the dozing level detection apparatus according to Embodiment 6.
FIGS. 14(A) and 14(B) are schematic diagrams showing an example of the hardware configuration of the eye opening degree detection apparatuses according to Embodiments 1 to 5.
Embodiment 1.
FIG. 1 is a block diagram schematically showing the configuration of an eye opening degree detection apparatus 100 as an information processing apparatus according to Embodiment 1. The eye opening degree detection apparatus 100 includes an eyelid shape detection device 110 and an eye opening degree calculation unit 130.
FIG. 2(A) shows a face image FI captured by a low-resolution camera. The face image FI includes an upper eyelid elu, a lower eyelid elb, a double eyelid els, and eyebrows ebr. FIG. 2(B) is a schematic diagram showing the positions of the upper eyelid elu, the lower eyelid elb, the double eyelid els, and the eyebrows ebr in the face image FI.
As shown in FIG. 2(A), in a face image FI captured by a low-resolution camera, changes in brightness can be confirmed, but it is difficult to recognize them as edges. The eyelid shape detection device 110 detects the eyelid shape with high accuracy even for such a low-quality face image FI.
The eyelid shape detection device 110 shown in FIG. 1 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 113, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, and a shape determination unit 116.
The eye presence region specifying unit 111 is given low-resolution face image data Im representing a face image of the person subject to eye opening degree detection (hereinafter, for simplicity of description, the driver), and specifies, from the face image represented by the face image data Im, an eye presence region, which is a region where the driver's eye exists. The eye presence region specifying unit 111 then generates eye presence region data Dm indicating the specified eye presence region and provides the eye presence region data Dm, together with the face image data Im, to the normalization processing unit 112.
Specifically, the eye presence region specifying unit 111 is given the face image data Im representing the driver's face image, specifies the eye presence region of the driver's left eye, right eye, or both eyes from the face image, and generates the eye presence region data Dm indicating the specified eye presence region.
The face image data Im is, for example, color or grayscale image data captured by an RGB camera, a grayscale camera, or an infrared camera. In the face image data Im, the upper part of the driver's face (head, forehead) appears at the top of the face image and the lower part (mouth, chin) at the bottom. In the face image data Im, the upper-left point of the image is the origin of the image coordinate system, with the x coordinate direction to the right of the origin and the y coordinate direction downward. The driver's left eye is the eye located on the left side of the face when facing the face image, and the driver's right eye is the eye located on the right side of the face when facing the face image. Hereinafter, to simplify the description, the eye presence region data Dm is assumed to indicate the presence region of the driver's right eye.
The eye presence region will be described with reference to FIG. 3. The eye presence region is a region in which each eye in the face image is considered to exist. Specifically, as shown in FIG. 3, the eye presence region Dr is a region including the outer eye corner eout, the inner eye corner ein, the end point etop of the upper eyelid, and the end point ebot of the lower eyelid.
Here, the eye presence region is a region that satisfies the following first and second conditions. The first condition is that, taking the two eyes, the eyebrows, the nose, and the mouth as face parts, the region includes only one eye and no other parts; however, an eyebrow may be partially included. The second condition is that, when the eye presence region is a rectangular region, the center of gravity Mg of the rectangular region is included in the area surrounded by the outer corner eout, the inner corner ein, the upper eyelid end point etop, and the lower eyelid end point ebot.
FIG. 3 shows an example in which the eye presence region Dr is expressed as a rectangular region. When the eye presence region Dr is expressed as a rectangular region, for example, the eye presence region data Dm is data represented by the coordinates of the center of gravity Mg of the rectangular region and the width Mw and height Mh of the rectangular region. Alternatively, the eye presence region data Dm may be a combination of the coordinates of the upper-left point and the lower-right point of the rectangular region; or, when the eye presence region is a circular region, it may be a combination of the coordinates of the center point of the circular region and its radius. Hereinafter, to simplify the description, the eye presence region Dr is a rectangular region, and the coordinates (Mgx, Mgy) of its center of gravity Mg together with its width Mw and height Mh constitute the eye presence region data Dm.
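For reference only, the rectangular representation of the eye presence region data Dm described above can be captured by a small structure such as the following sketch; the corner computation is a direct consequence of the center-width-height encoding:

```python
from dataclasses import dataclass

@dataclass
class EyeRegion:
    """Eye presence region Dm: center of gravity Mg plus width and height."""
    mgx: float
    mgy: float
    mw: float
    mh: float

    def corners(self):
        """Top-left and bottom-right corners of the rectangle."""
        half_w, half_h = self.mw / 2, self.mh / 2
        return ((self.mgx - half_w, self.mgy - half_h),
                (self.mgx + half_w, self.mgy + half_h))
```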
The eye presence region Dr may be set, for example, as shown in Patent Document 1, by detecting a nostril, which is a characteristic part of the face that is easy to detect, and setting the region based on the position of the nostril. Alternatively, the approximate positions of the outer and inner eye corners may be detected by statistical processing using an AAM (Active Appearance Model), and the eye presence region may be set accordingly. In the present embodiment, the method for setting the eye presence region Dr is not particularly limited.
The normalization processing unit 112 shown in FIG. 1 is given the eye presence region data Dm and the face image data Im, and cuts out the image of the eye presence region indicated by the eye presence region data Dm from the face image represented by the face image data Im. The normalization processing unit 112 then converts the cut-out image into a normalized image of a predetermined size and generates normalized image data In representing the normalized image. Furthermore, the normalization processing unit 112 provides the normalized image data In, together with normalized eye presence region data Dn indicating the eye presence region in the normalized image data In, to the light/dark change detection unit 113 and the eyelid reference position estimation unit 114.
When the normalization processing unit 112 converts the image size, it first cuts out the image of the eye presence region and then performs enlargement or reduction. For example, if the width of the rectangular eye presence region is Mw and its height Mh, and the width of the normalized image is Nw and its height Nh, the enlargement or reduction factor is Nw/Mw in the x-axis direction and Nh/Mh in the y-axis direction. If the ratio of the width Mw to the height Mh and the ratio of the width Nw to the height Nh are matched in advance, the aspect ratio does not change between the image before and after normalization; it is therefore desirable to set Mw = αMh (α is a constant) and Nw = αNh. The value of α is preferably, for example, α = 2.5, and the width and height of the normalized image are preferably, for example, Nw = 20 and Nh = 50.
For the enlargement or reduction of the image, for example, bilinear interpolation, bilateral interpolation, or nearest neighbor interpolation may be used; the method is not particularly limited in the present embodiment. Here, it is assumed that the enlargement or reduction is performed using bilinear interpolation.
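A sketch of the normalization step, assuming OpenCV's cv2.resize is used for the bilinear interpolation (the embodiment does not prescribe a particular library):

```python
import cv2

def normalize_eye_region(img, mgx, mgy, mw, mh, nw=20, nh=50):
    """Crop the eye presence region (center (Mgx, Mgy), size Mw x Mh) from
    the face image and resize it to the fixed size Nw x Nh by bilinear
    interpolation; the scale factors are Nw/Mw in x and Nh/Mh in y."""
    x0, y0 = int(mgx - mw / 2), int(mgy - mh / 2)
    crop = img[y0:y0 + int(mh), x0:x0 + int(mw)]
    return cv2.resize(crop, (nw, nh), interpolation=cv2.INTER_LINEAR)
```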
Since the normalized image indicated by the normalized image data In is obtained by cutting out the eye presence region of the original image and converting the image size, the normalized eye presence region data Dn is defined by the center coordinates Ng of the normalized image, the width Nw, and the height Nh. The center coordinates Ng are the center coordinates on the normalized image, defined by Ng(Ngx, Ngy) = (Nw/2, Nh/2).
The light/dark change detection unit 113 calculates the magnitude of the vertical light/dark change of the face image in the normalized image represented by the normalized image data In and generates light/dark change data En. The light/dark change detection unit 113 then supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
For example, the light/dark change detection unit 113 detects the magnitude of changes in brightness between pixels in the y-axis direction of the normalized image represented by the normalized image data In, thereby estimating the coordinates of a first end point on the boundary line (first boundary line) between the driver's eye and the upper eyelid and the coordinates of a second end point on the boundary line (second boundary line) between the driver's eye and the lower eyelid. In Embodiment 1, the light/dark change detection unit 113 takes, in the normalized image, the coordinates of the pixel with the greatest decrease in brightness from the pixel above it as the coordinates of the first end point, and the coordinates of the pixel with the greatest decrease in brightness from the pixel below it as the coordinates of the second end point. The light/dark change detection unit 113 then generates light/dark change data En indicating the estimated coordinates of the first and second end points and gives this data to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
To extract the light/dark change, the light/dark change detection unit 113 uses, for example, the filters shown in FIGS. 4(A) and 4(B).
The 5×5 filter Ebf shown in FIG. 4(A) is an example of a filter that extracts a point where brightness changes from dark to bright in the y-axis direction; it is a second filter that extracts the light/dark change near the lower eyelid. Taking the center position (second target pixel) of the filter Ebf to be the position "3" shown in FIG. 4(A), when the filter Ebf is applied with a point (nx, ny) on the normalized image as its center position, its evaluation value Ebv is calculated by equation (1), where In(x, y) denotes the luminance value at coordinates (x, y) in the normalized image data In. (Equation (1) is given only as an image in the source publication and is not reproduced here.)
The filter Ebf is designed so that the region of the first term on the right side of equation (1) (the fourth region) is large (5 × 4 = 20 pixels) and the region of the second term (the third region) is small (5 × 1 = 5 pixels). When the light/dark change around the lower eyelid is observed, the region below the lower eyelid (in the y-axis direction) shows a relatively small change in luminance and changes little as the eye opens and closes, whereas the region above the lower eyelid (opposite to the y-axis direction) is affected by the presence of the pupil and by the opening and closing of the eye, so its luminance changes greatly. That is, the filter Ebf is well suited to detecting the light/dark change of the lower eyelid stably, regardless of the eye state.
The 5×5 filter Etf shown in FIG. 4(B) is an example of a filter that extracts points where brightness changes from bright to dark in the y-axis direction. The filter Etf is a first filter that extracts changes in brightness near the upper eyelid.
If the center position (first target pixel) of the filter Etf is the position labeled "23" in FIG. 4(B), then applying the filter Etf with a point (nx, ny) on the normalized image as its center yields the evaluation value Etv of equation (2) below:

  Etv(nx, ny) = (1/20)×Σ_{i=−2..2} Σ_{j=1..4} In(nx+i, ny−j) − (1/5)×Σ_{i=−2..2} In(nx+i, ny)   (2)
The filter Etf is the filter Ebf flipped vertically: it is designed so that the region of the first term on the right-hand side of equation (2) (the second region) is large and the region of the second term (the first region) is small. As with the lower eyelid described above, when the brightness changes around the upper eyelid are observed, the region above the upper eyelid (on the side opposite to the y-axis direction) shows relatively small changes in luminance and is little affected by the opening and closing of the eye, whereas the region below the upper eyelid (on the y-axis direction side) is affected by the movement of the pupil and by the opening and closing of the eye, so its luminance varies greatly. The filter Etf is therefore well suited to detecting the brightness change at the upper eyelid stably, regardless of the state of the eye.
By applying the two filters described above to each pixel of the normalized image data In, the light/dark change detection unit 113 obtains, for each pixel, two evaluation values, Etv and Ebv, indicating the degree of change in luminance. Here, the evaluation value Etv indicates the magnitude of the change from bright to dark in the y-axis direction, that is, the degree of decrease in brightness in the y-axis direction, and the evaluation value Ebv indicates the magnitude of the change from dark to bright in the y-axis direction, that is, the degree of decrease in brightness in the direction opposite to the y-axis.
Let Ent (Entx, Enty) be the coordinates of the point at which the evaluation value Etv is maximal, and Enb (Enbx, Enby) the coordinates of the point at which the evaluation value Ebv is maximal. The light/dark change detection unit 113 estimates the coordinates Ent to be the coordinates of the first end point, the uppermost point of the boundary line between the driver's eye and the upper eyelid, and the coordinates Enb to be the coordinates of the second end point, the lowest point of the boundary line between the driver's eye and the lower eyelid, and gives light/dark change data En indicating the coordinates of these two points to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
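As an illustration only, the filter responses and the end point estimation described above can be sketched in Python as follows; the function names, the use of NumPy, and the mean-difference reading of equations (1) and (2) are assumptions of this sketch, not part of the disclosure.

    import numpy as np

    def filter_responses(img):
        # img: 2-D array of the normalized image In (y increases downward).
        # Returns (Etv, Ebv) response maps; borders where the 5x5 window
        # does not fit are left at -inf so they are never selected.
        h, w = img.shape
        f = img.astype(float)
        etv = np.full((h, w), -np.inf)
        ebv = np.full((h, w), -np.inf)
        for y in range(4, h - 4):
            for x in range(2, w - 2):
                row = f[y, x - 2:x + 3].mean()              # small 5x1 region (center row)
                below = f[y + 1:y + 5, x - 2:x + 3].mean()  # large 5x4 region below
                above = f[y - 4:y, x - 2:x + 3].mean()      # large 5x4 region above
                ebv[y, x] = below - row   # dark-to-bright downward (lower eyelid), eq. (1)
                etv[y, x] = above - row   # bright-to-dark downward (upper eyelid), eq. (2)
        return etv, ebv

    def estimate_end_points(img):
        # Coordinates Ent (upper) and Enb (lower) as the argmax of each map.
        etv, ebv = filter_responses(img)
        ent = np.unravel_index(np.argmax(etv), etv.shape)  # (Enty, Entx)
        enb = np.unravel_index(np.argmax(ebv), ebv.shape)  # (Enby, Enbx)
        return ent, enb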
Here, the coordinates Ent and Enb are the coordinates of the most plausible positions for the end point of the upper eyelid and the end point of the lower eyelid among the brightness changes contained in the eye presence region Dr. For the coordinates Ent, however, the coordinate point may deviate upward from the upper eyelid along the y-axis because of a double eyelid above the upper eyelid, makeup such as false eyelashes or eye shadow, or bangs.
FIG. 5 is a schematic diagram showing an example in which the coordinates Ent lie on a double eyelid els.
For example, in FIG. 5, the evaluation value Etv of the filter may take its maximum not at a point on the upper eyelid elu but at a point on the double eyelid els. The handling of this case is the responsibility of the shape determination unit 116 and is described later.
In the following description, the coordinates Ent are assumed to be a point on the double eyelid els.
The eyelid reference position estimation unit 114 shown in FIG. 1 receives the normalized image data In, the normalized eye presence region data Dn, and the light/dark change data En, and calculates at least two eyelid reference positions, that is, positions of reference points lying on the eyelids. For example, the eyelid reference position estimation unit 114 estimates the coordinates of the inner corner of the eye and the outer corner of the eye in the normalized image represented by the normalized image data In. It then generates coordinate data Fn indicating the coordinates of these reference points on the normalized image and gives the coordinate data Fn to the approximate curve generation unit 115.
The points used as eyelid reference positions are preferably, for example, points corresponding to the outer and inner corners of the eye. When the face image captures the driver's face from the front, the positions of the outer and inner corners of the eye on the y-axis can be assumed to lie at substantially the same height as the lower end of the lower eyelid. The y coordinates of the outer and inner corners of the eye can therefore be estimated from the y coordinate (Enby) of the coordinates Enb in the light/dark change data En.
FIG. 5 shows an example of the positional relationship of the points when the y coordinates of the outer and inner corners of the eye are determined from the y coordinate of the coordinates Enb.
For example, let the coordinates of the outer corner of the eye be Eno (Enox, Enoy), the coordinates of the inner corner of the eye be Eni (Enix, Eniy), and the x coordinate of the right edge of the eye presence region in the normalized image indicated by the normalized eye presence region data Dn be Endx. Then the inner corner coordinates are Eni = (0, Enby) and the outer corner coordinates are Eno = (Endx, Enby). Since the width of the normalized image is Nw, the outer corner coordinates become Eno = (Nw, Enby).
The eyelid reference position estimation unit 114 then takes the estimated outer corner coordinates Eno and the estimated inner corner coordinates Eni as the coordinates of the eyelid reference positions and generates coordinate data Fn indicating these coordinates.
The approximate curve generation unit 115 shown in FIG. 1 generates a plurality of approximate curves for the eyelids from the normalized image data In, the light/dark change data En, and the eyelid reference position data Fn, and gives the shape determination unit 116 parameter data Gn containing a set of parameters representing each of the approximate curves.
For example, in the normalized image represented by the normalized image data In, the approximate curve generation unit 115 generates a plurality of approximate curves (first approximate curves) from each of a plurality of coordinates selected on the straight line passing through the coordinates Ent of the first end point and the coordinates Enb of the second end point indicated by the light/dark change data En, together with the outer corner coordinates Eno and the inner corner coordinates Eni indicated by the eyelid reference position data Fn. In the first embodiment, the approximate curve generation unit 115 selects the plurality of coordinates from the line segment between the coordinates Ent of the first end point and the coordinates Enb of the second end point. The approximate curve generation unit 115 then gives the shape determination unit 116 parameter data Gn containing a set of parameters representing each of these approximate curves. The curves used for the approximation are, for example, curves of second or higher order.
In this embodiment, the y coordinates of the estimated outer corner coordinates Eno and the estimated inner corner coordinates Eni are assumed to be equal to the y coordinate of the second end point coordinates Enb, so the lower eyelid curve Cb is a straight line. The following description therefore deals with the generation of approximate curves for the upper eyelid.
FIG. 6 is a schematic diagram showing a method of generating a plurality of approximate curves for the upper eyelid.
In general, the approximate curve of the upper eyelid is a curve that passes through the eyelid reference points (the coordinates of the outer and inner corners of the eye) located on the eyelid and through the end point etop of the upper eyelid. That is, if the eyelid reference points and the end point etop of the upper eyelid are estimated correctly, the approximate curve can be obtained accurately.
As noted above, however, the coordinates Ent identified from the brightness change in order to detect the upper eyelid do not necessarily lie on the upper eyelid; they may lie on eye shadow or on a double eyelid. Here, for the sake of explanation, the coordinates Ent are assumed to lie on a double eyelid.
The approximate curve generation unit 115 therefore generates the plurality of approximate curves Cu(i) shown in FIG. 6 as candidate curves passing through the eyelid reference points and the end point etop of the upper eyelid. Here, i = 1, 2, 3, ..., N, where N, the number of candidate approximate curves, is an integer of 2 or more.
Each approximate curve is defined by three points that it passes through: the coordinates Cupa(i), Cupb, and Cupc.
The coordinates Cupa(i) are selected separately for each of the generated curves. For example, a plurality of coordinates Cupa(i) are selected from points on the straight line l passing through the coordinates Enb and the coordinates Ent. Specifically, N points satisfying Enty ≤ y ≤ Enby are selected on the straight line l to form a point group P, and each point P(i) (i ∈ 1, 2, ..., N) in the point group P is set as the coordinates Cupa(i). The points P(i) may be selected, for example, at equal intervals.
For the coordinates Cupb and Cupc, coordinates common to all the generated curves are selected. For example, the coordinates Cupb and Cupc are set on the basis of the eyelid reference position data Fn. In this case, the approximate curve generation unit 115 sets Cupb = Eni and Cupc = Eno.
The x and y coordinates of Cupa(i) are written (Cupa(i)x, Cupa(i)y), those of Cupb are written (Cupbx, Cupby), and those of Cupc are written (Cupcx, Cupcy), where Cupbx < Cupa(i)x < Cupcx.
Because the eyelid shape may be laterally asymmetric depending on the orientation of the face in the face image input to the device, the approximate curve is set separately as a curve Cul(i) for the portion whose x coordinates are smaller than Cupa(i)x and a curve Cur(i) for the portion whose x coordinates are larger, and the pair of curves is treated as one curve Cu(i).
As the approximate curve, for example, a quadratic function is used. A quadratic function can be defined by specifying, for example, the point that is its vertex and one other point through which it passes. In the following description, a quadratic function is used as the approximate curve.
The curve Cul(i) has the coordinates Cupa(i) as its vertex and passes through the coordinates Cupb. Writing the quadratic function representing the curve Cul(i) as equation (3) below, its parameters can be obtained from equation (4):

  y = abl(i)×x² + bbl(i)×x + cbl(i)   (3)

  abl(i) = (Cupby − Cupa(i)y) / (Cupbx − Cupa(i)x)²
  bbl(i) = −2×abl(i)×Cupa(i)x
  cbl(i) = Cupa(i)y + abl(i)×Cupa(i)x²   (4)
Similarly, the curve Cur(i) has the coordinates Cupa(i) as its vertex and passes through the coordinates Cupc. Writing the quadratic function representing the curve Cur(i) as equation (5) below, its parameters can be obtained from equation (6):

  y = acr(i)×x² + bcr(i)×x + ccr(i)   (5)

  acr(i) = (Cupcy − Cupa(i)y) / (Cupcx − Cupa(i)x)²
  bcr(i) = −2×acr(i)×Cupa(i)x
  ccr(i) = Cupa(i)y + acr(i)×Cupa(i)x²   (6)
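For illustration, the candidate generation step can be sketched as follows; the expanded vertex-form coefficients follow equations (3) to (6), while the function and variable names and the equal-interval sampling along the segment Ent-Enb are assumptions of this sketch.

    import numpy as np

    def vertex_quadratic(vx, vy, px, py):
        # Coefficients (a, b, c) of y = a*x**2 + b*x + c with vertex (vx, vy)
        # and passing through (px, py); cf. equations (3)-(6).
        # Assumes px != vx, i.e. Cupbx < Cupa(i)x < Cupcx as in the text.
        a = (py - vy) / (px - vx) ** 2
        b = -2.0 * a * vx
        c = vy + a * vx ** 2
        return a, b, c

    def candidate_curves(ent, enb, eni, eno, n):
        # ent, enb: (x, y) of the first/second end points; eni, eno: eyelid
        # reference points Eni and Eno; n: number of candidates N. Each
        # candidate Cu(i) pairs a left curve Cul(i) with a right curve Cur(i).
        curves = []
        for t in np.linspace(0.0, 1.0, n):      # equal intervals on segment Ent-Enb
            cupa = (ent[0] + t * (enb[0] - ent[0]),
                    ent[1] + t * (enb[1] - ent[1]))
            left = vertex_quadratic(cupa[0], cupa[1], eni[0], eni[1])   # Cul(i)
            right = vertex_quadratic(cupa[0], cupa[1], eno[0], eno[1])  # Cur(i)
            curves.append((cupa, left, right))
        return curves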
The parameters representing the plurality of curves Cu(i) and the curve Cb generated as described above are given to the shape determination unit 116 as parameter data Gn.
The parameter data Gn contains a set of parameters for each curve. For example, the parameter data Gn contains the curve type ("quadratic function"), the number of curves ("N for the upper eyelid, one for the lower eyelid"), the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni) and the coordinates Enb, and the set of parameters of each curve. In the above example, the parameter set consists of the parameters abl(i), bbl(i), cbl(i), acr(i), bcr(i), and ccr(i) of the upper eyelid approximate curves and the straight line parameter of the lower eyelid. The straight line parameter of the lower eyelid is the y-coordinate value of any one of the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni) and the coordinates Enb. Since the y coordinates of the estimated outer corner coordinates Eno and the estimated inner corner coordinates Eni are assumed to be equal to the y coordinate of the coordinates Enb, the straight line of the lower eyelid is parallel to the x-axis, so it suffices for the y-coordinate value of any one of these points to be included as the straight line parameter.
The shape determination unit 116 receives the parameter data Gn containing the parameters of the plurality of approximate curves, evaluates the pixel values on the approximate curve represented by each set of parameters, and calculates a shape parameter Hn representing the shapes of the upper and lower eyelids. The shape determination unit 116 then gives the shape parameter Hn to the eye opening degree calculation unit 130.
For example, the shape determination unit 116 determines the shape of the upper eyelid by selecting, from the plurality of approximate curves indicated by the parameter data Gn, the approximate curve most suitable as the boundary line between the driver's eye and the upper eyelid. In the first embodiment, the selected approximate curve is adopted as the shape of the upper eyelid. The shape determination unit 116 also determines the shape of the lower eyelid from the straight line parameter of the lower eyelid contained in the parameter data Gn, that is, from the eyelid reference points (the outer corner coordinates Eno and the inner corner coordinates Eni) and the second end point coordinates Enb. In the first embodiment, the shape of the lower eyelid is determined from the line segment between the outer corner coordinates Eno and the inner corner coordinates Eni; here, that line segment is adopted as the shape of the lower eyelid. The shape determination unit 116 then calculates a parameter indicating the shape of the driver's eye from the determined shapes of the upper and lower eyelids and gives the eye opening degree calculation unit 130 the shape parameter Hn indicating the calculated parameter.
The shape parameter Hn is preferably, for example, the ratio ep (= eh/ew) of the distance eh between the end point etop of the upper eyelid and the end point ebot of the lower eyelid to the distance ew between the outer and inner corners of the eye in the normalized image. Alternatively, the shape parameter Hn may specify one approximate curve parameter for each of the upper and lower eyelids. The shape parameter Hn may also contain, instead of approximate curve parameters, the coordinates of the eyelid reference points, the coordinates of the end point etop of the upper eyelid, and the coordinates of the end point ebot of the lower eyelid. As a further alternative, the shape parameter Hn may represent one curvature for each of the approximate curves of the upper and lower eyelids.
In the following description, the shape parameter Hn is the ratio ep of the distance eh between the end point etop of the upper eyelid and the end point ebot of the lower eyelid to the distance ew between the outer and inner corners of the eye.
To specify the eyelid shapes, it is necessary to select, from the plurality of approximate curves, the most eyelid-like approximate curve for each of the upper and lower eyelids. In this embodiment, the curve of the boundary line between the eye and the lower eyelid has already been set uniquely, so one curve is selected from the plurality of approximate curves generated for the upper eyelid.
Before explaining how one approximate curve is selected from the plurality generated for the upper eyelid, consider the difference between an approximate curve that runs along the boundary line between the eye and the upper eyelid and one that runs over eye shadow or a double eyelid.
Near the upper eyelid there are a step relative to the eyeball and the roots of the eyelashes, so the luminance values of pixels on the boundary line between the eye and the upper eyelid tend to be smaller than those of pixels on eye shadow or on a double eyelid. The same tendency holds when the upper and lower eyelids are compared: the luminance values of pixels on the boundary line between the eye and the upper eyelid tend to be smaller than those of pixels on the boundary line between the eye and the lower eyelid.
Exploiting this tendency, the shape determination unit 116 computes, for each of the approximate curves generated for the upper eyelid, the average of the luminance values of the pixels on that curve and selects the curve with the lowest average; in this way it can select the approximate curve most suitable as the boundary line between the eye and the upper eyelid.
As described above, the shape determination unit 116 selects the most plausible upper eyelid curve Cu(iu) from the approximate curves and sets the coordinates Cupa(iu) as the end point etop of the upper eyelid. The end point ebot of the lower eyelid is set to the end point of the lower eyelid curve Cb, that is, to the coordinates Enb.
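The selection rule can be sketched as follows; sampling each candidate at integer x positions and scoring it by mean luminance is one plausible realization, and the helper names are assumptions of this sketch (candidate_curves is the sketch shown earlier).

    import numpy as np

    def curve_y(coeffs, x):
        # Evaluate y = a*x**2 + b*x + c (works elementwise on arrays).
        a, b, c = coeffs
        return a * x * x + b * x + c

    def select_upper_eyelid(img, curves):
        # curves: output of candidate_curves(); returns the candidate whose
        # on-curve mean luminance is lowest (most eyelid-like).
        h, w = img.shape
        best, best_mean = None, np.inf
        for cupa, left, right in curves:
            xs = np.arange(w)
            ys = np.where(xs < cupa[0], curve_y(left, xs), curve_y(right, xs))
            ys = np.clip(np.round(ys).astype(int), 0, h - 1)
            mean = img[ys, xs].mean()
            if mean < best_mean:
                best, best_mean = (cupa, left, right), mean
        return best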
The shape determination unit 116 gives the eye opening degree calculation unit 130, as the shape parameter Hn, the ratio ep = eh/ew of the distance eh = Nd between the end points of the upper and lower eyelids to the distance ew = Nw between the outer and inner corners of the eye. The value Nd can be obtained from equation (7) below:

  Nd = √((Enbx − Cupa(iu)x)² + (Enby − Cupa(iu)y)²)   (7)
The eye opening degree calculation unit 130 compares the given shape parameter Hn with a reference shape parameter Hs stored in advance in the memory 131 and calculates the driver's eye opening degree Kn.
Here, the eye opening degree Kn indicates how far the driver's eyes are open; it is preferably expressed, for example, as a value from 0% to 100%, with the degree of eye opening in the steady state taken as 100%.
The memory 131 holds, as the reference shape parameter Hs, the ratio eps = ehs/ews of the distance ehs between the end point etop of the upper eyelid and the end point ebot of the lower eyelid to the distance ews between the outer and inner corners of the driver's eye in the steady state.
The eye opening degree calculation unit 130 compares this reference shape parameter Hs with the given shape parameter Hn and calculates the eye opening degree Kn by equation (8) below:

  Kn = 100 × ep/eps   (8)
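As a small illustration, equations (7) and (8) can be combined as follows; reading Nd as the Euclidean distance between etop and ebot follows equation (7) as reconstructed above and is an assumption of this sketch.

    import math

    def eye_opening_degree(etop, ebot, nw, eps):
        # etop: (x, y) of the upper eyelid end point Cupa(iu);
        # ebot: (x, y) of the lower eyelid end point Enb;
        # nw:   width of the normalized image (= ew, corner-to-corner distance);
        # eps:  reference ratio ehs/ews held in the memory 131.
        nd = math.hypot(ebot[0] - etop[0], ebot[1] - etop[1])  # eq. (7)
        ep = nd / nw                                           # shape parameter Hn
        return 100.0 * ep / eps                                # eq. (8)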
Although the eyelid shape detection device 110 of the first embodiment assumes that the boundary line between the eye and the lower eyelid is a straight line, it may instead be assumed to be a curve. In that case, the eyelid reference position estimation unit 114 calculates the coordinates of the outer and inner corners of the eye using, for example, the y coordinate Ngy of the center coordinates Ng of the normalized image instead of the coordinates Enb. Alternatively, when the eye presence region specifying unit 111 has used the AAM described above to compute the eye presence region, the coordinates of the feature points of the outer and inner corners of the eye are available, and the eyelid reference position estimation unit 114 uses them as the corner coordinates.
In addition, when the approximate curve generation unit 115 selects, on the straight line l, the points through which the approximate curves pass, it selects points in the range Enty ≤ y ≤ Eniy for the upper eyelid and in the range Eniy ≤ y ≤ Enby for the lower eyelid, thereby generating a plurality of approximate curves for the upper eyelid and a plurality for the lower eyelid.
The shape determination unit 116 may then, in the same manner as above, determine from the plurality of upper eyelid approximate curves the curve most suitable as the boundary line between the eye and the upper eyelid, and from the plurality of lower eyelid approximate curves the curve most suitable as the boundary line between the eye and the lower eyelid.
However, because for the lower eyelid the luminance evaluation values on the approximate curves differ less between curves than for the upper eyelid, the shape determination unit 116 may simply select the curve passing through the coordinates Eni, Eno, and Enb as the approximate curve.
In the eyelid shape detection device 110 of the first embodiment, the light/dark change detection unit 113 applies the brightness change filters to all the pixels contained in the eye presence region, but this is not strictly necessary.
For example, the light/dark change detection unit 113 may apply the filters only to those pixels of the normalized image that satisfy x = Ngx and Ngy − Nh/4 ≤ y ≤ Ngy + Nh/4. When the face image given to the eyelid shape detection device 110 is a frontal face, the upper and lower eyelid end points etop and ebot can basically be assumed to lie on a straight line that passes through the midpoint between the outer and inner corners of the eye and is parallel to the y-axis. The light/dark change detection unit 113 can therefore obtain the estimated coordinates Ent and Enb of the upper and lower eyelid end points simply by searching the points satisfying the above conditions. Limiting the number of pixels subjected to the filtering in this way shortens the processing time.
Moreover, rather than restricting the filtered pixels to the single straight line that passes through the midpoint between the outer and inner corners of the eye and is parallel to the y-axis, a plurality of straight lines parallel to the y-axis may be selected within the range Ngx − Nw/4 ≤ x ≤ Ngx + Nw/4, and the light/dark change detection unit 113 may apply the filters to the pixels on those straight lines. By filtering along a plurality of straight lines, the processing time can be shortened while the coordinates Ent and Enb are still calculated accurately even when the input face image is not a frontal image.
FIG. 7 is a flowchart showing the processing in the eye opening degree detection device 100 according to the first embodiment.
The flowchart of FIG. 7 is started, for example, when the face image data Im is input to the eye opening degree detection device 100.
The eye presence region specifying unit 111 specifies the eye presence region in the face image represented by the face image data Im and generates eye presence region data Dm indicating the specified eye presence region (S10). The eye presence region specifying unit 111 then gives the generated eye presence region data Dm, together with the face image data Im, to the normalization processing unit 112.
Based on the eye presence region data Dm, the normalization processing unit 112 cuts the eye presence region out of the face image represented by the face image data Im and converts the cut-out image to a predetermined image size, thereby generating normalized image data In representing the normalized image after conversion, as well as normalized eye presence region data Dn indicating the eye presence region in the normalized image data In (S11). The normalization processing unit 112 then gives the normalized image data In and the normalized eye presence region data Dn to the light/dark change detection unit 113.
The light/dark change detection unit 113 calculates the magnitude of the brightness changes in the vertical direction (y-coordinate direction) of the normalized image data In and generates the light/dark change data En (S12). The light/dark change detection unit 113 then gives the light/dark change data En to the eyelid reference position estimation unit 114, and gives the light/dark change data En and the normalized image data In to the approximate curve generation unit 115.
Based on the light/dark change data En, the eyelid reference position estimation unit 114 calculates the positions of at least two reference points lying on the eyelids and generates coordinate data Fn indicating the coordinates of the calculated reference points on the normalized image (S13). The eyelid reference position estimation unit 114 then gives the coordinate data Fn to the approximate curve generation unit 115.
The approximate curve generation unit 115 generates a plurality of approximate curves for the eyelids from the normalized image data In, the light/dark change data En, and the eyelid reference position data Fn, and generates parameter data Gn containing the parameters of each of the approximate curves (S14). It then outputs the parameter data Gn to the shape determination unit 116.
The shape determination unit 116 calculates feature quantities of the pixel values on the plurality of approximate curves indicated by the plurality of parameters contained in the parameter data Gn (S15).
The shape determination unit 116 compares and evaluates the feature quantities of the approximate curves and selects the most eyelid-like approximate curve (S16).
The shape determination unit 116 calculates the shape parameter Hn representing the shapes of the upper and lower eyelids from the selected approximate curve (S17). The shape determination unit 116 then gives the calculated shape parameter Hn to the eye opening degree calculation unit 130.
The eye opening degree calculation unit 130 compares the shape parameter Hn with the reference shape parameter Hs stored in advance in the memory 131 and calculates the driver's eye opening degree Kn (S18).
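The order of steps S12 to S18 can be summarized by the following sketch, which reuses the functions defined in the earlier sketches; the end-to-end wiring itself is a hypothetical illustration over an already-normalized eye image, not an API defined by this disclosure.

    def detect_eye_opening_degree(norm_img, eps, n=10):
        # Hypothetical flow mirroring FIG. 7 from S12 onward.
        w = norm_img.shape[1]
        (enty, entx), (enby, enbx) = estimate_end_points(norm_img)  # S12
        ent, enb = (entx, enty), (enbx, enby)                       # to (x, y)
        eni, eno = (0, enby), (w - 1, enby)                         # S13 (Eni, Eno)
        curves = candidate_curves(ent, enb, eni, eno, n)            # S14
        cupa, _, _ = select_upper_eyelid(norm_img, curves)          # S15, S16
        return eye_opening_degree(cupa, enb, float(w), eps)         # S17, S18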
Configured as described above, the eyelid shape detection device 110 of the first embodiment estimates the eyelid reference positions from the points with large brightness changes, obtains a plurality of approximate curves from points contained in the region defined by the points with large brightness changes and the estimated reference positions, and compares and evaluates the feature quantities of the pixel values on the approximate curves. It can therefore detect the eyelid shape accurately even when edges cannot be detected in a low-quality face image.
Configured as described above, the eyelid shape detection device 110 of the first embodiment uses vertically asymmetric filters to extract the brightness changes of the eyelids, and can therefore estimate the eyelid positions even in a low-quality face image, regardless of whether the eye is open or closed. The eyelid shape can thus be detected accurately.
Configured as described above, the eyelid shape detection device 110 of the first embodiment sets the approximate curves of the eyelid shape separately on the left and right of the reference point, and can therefore detect the eyelid shape accurately even when it is laterally asymmetric.
Configured as described above, the eyelid shape detection device 110 of the first embodiment compares, across the plurality of generated approximate curves, the average luminance values of the pixels on each curve and selects the curve with the smallest average luminance as the approximate curve passing through the upper eyelid. It can therefore detect the eyelid shape accurately even when the person in the face image wears eye shadow or has double eyelids.
Configured as described above, the eye opening degree detection device 100 of the first embodiment converts a high-quality face image into a low-quality face image before detecting the eyelid shape, and can therefore detect the eyelid shape accurately with a small amount of processing even for high-quality face images.
Because the eye opening degree detection device 100 of the first embodiment configured as described above can detect the eyelid shape accurately in low-quality face images, it allows great freedom in where the camera is installed. For example, even if the camera is placed at a position away from the driver, such as near the display of a car navigation system, and the face image is of low quality, the eye opening degree detection device 100 can detect the eyelid shape accurately. Likewise, even if a camera with a wide-angle lens (for example, 90 to 150 degrees) is used to capture the faces of the driver and the front passenger simultaneously and their face images are of low quality, the eye opening degree detection device 100 can detect the eyelid shape accurately. Furthermore, even if the camera is placed near the driver, such as on the A-pillar or in the instrument cluster, and a high-quality face image is acquired, the eye opening degree detection device 100 converts it to a low-quality face image before detecting the eyelid shape, and can therefore detect the eyelid shape accurately with a small amount of processing.
Configured as described above, the eye opening degree detection device 100 of the first embodiment estimates the eyelid reference positions from the points with large brightness changes, obtains a plurality of approximate curves from points contained in the region defined by the points with large brightness changes and the estimated reference positions, and compares and evaluates the feature quantities of the pixel values on the approximate curves; it can therefore detect the eyelid shape accurately even when edges cannot be detected in a low-quality face image. Furthermore, by comparing the detected eyelid shape with a shape registered in advance, the eye opening degree detection device 100 can accurately detect the degree of eye opening from a low-quality face image.
In the embodiment above, the straight line connecting the first end point, the uppermost point of the first boundary line, and the second end point, the lowest point of the second boundary line, was used, but the straight line may be shifted to some extent in the x-axis direction within the extent of the eye. This permissible "some extent" refers, for example, to a range of about one fifth of the distance between the outer and inner corners of the eye, centered on the midpoint between them.
Embodiment 2.
As shown in FIG. 1, an eye opening degree detection device 200 according to Embodiment 2 includes an eyelid shape detection device 210 and the eye opening degree calculation unit 130.
The eye opening degree detection device 200 according to Embodiment 2 is configured in the same way as the eye opening degree detection device 100 according to Embodiment 1, except for the eyelid shape detection device 210.
The eyelid shape detection device 210 of Embodiment 2 includes the eye presence region specifying unit 111, the normalization processing unit 112, a light/dark change detection unit 213, the eyelid reference position estimation unit 114, an approximate curve generation unit 215, and a shape determination unit 216.
The eyelid shape detection device 210 of Embodiment 2 is configured in the same way as the eyelid shape detection device 110 of Embodiment 1, except for the light/dark change detection unit 213, the approximate curve generation unit 215, and the shape determination unit 216.
When generating the approximate curves, the eye opening degree detection device 100 according to Embodiment 1 selects a plurality of points from among the points on the straight line l connecting the coordinates Ent and Enb that satisfy Enty ≤ y ≤ Enby. The eye opening degree detection device 200 according to Embodiment 2 reduces the processing time by limiting the range of points used for the approximate curves.
The light/dark change detection unit 213 generates light/dark change data En by calculating the magnitude of the vertical brightness changes of the face image in the normalized image data In. The light/dark change detection unit 213 then supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 215.
The light/dark change data En indicates the coordinate data of the coordinates Ent and Enb, a region Rnt (first selection region) where the bright-to-dark change is large, and a region Rnb (second selection region) where the dark-to-bright change is large.
The regions Rnt and Rnb are determined according to the filter evaluation values. Specifically, using a threshold THtf for the filter Etf and a threshold THbf for the filter Ebf, the light/dark change detection unit 213 takes the region where the evaluation value of the filter Etf is at least the threshold THtf as the region Rnt, and the region where the evaluation value of the filter Ebf is at least the threshold THbf as the region Rnb. The regions Rnt and Rnb are determined over the pixels on the straight line l connecting the coordinates Ent of the point where the bright-to-dark change is maximal and the coordinates Enb of the point where the dark-to-bright change is maximal.
FIGS. 8(A) and 8(B) are schematic diagrams showing how the region Rnt is calculated.
FIG. 8(A) shows the evaluation values of the filter Etf at the pixels near the coordinates Ent, together with the straight line l.
Here, the threshold is THtf = 100, and the pixels belonging to the region Rnt, where the evaluation value of the filter Etf is at least the threshold THtf, are shown hatched or cross-hatched.
Starting from the coordinates Ent, the light/dark change detection unit 213 first scans along the straight line l in the y-axis direction (downward in the figure). As shown in FIG. 8(B), the light/dark change detection unit 213 takes the pixel scanned immediately before the first pixel whose filter Etf evaluation value falls below the threshold THtf as the pixel Rntb at the lower end of the region Rnt. In the example shown in FIG. 8(B), scanning from the coordinates Ent along the straight line l in the y-axis direction yields filter Etf evaluation values of 150, 124, and 46 in that order. The value first falls below the threshold THtf at the transition from 124 to 46, so the light/dark change detection unit 213 takes the pixel with the filter Etf evaluation value of 124 as the pixel Rntb at the lower end of the region Rnt.
Next, the light/dark change detection unit 213 scans along the straight line l in the direction opposite to the y-axis (upward in the figure), and takes the pixel scanned immediately before the first pixel whose filter Etf evaluation value falls below the threshold THtf as the pixel Rntt at the upper end of the region Rnt. In the example of FIG. 8, scanning from the coordinates Ent along the straight line l in the direction opposite to the y-axis yields filter Etf evaluation values of 100, 90, and 31. The value first falls below the threshold THtf at the transition from 100 to 90, so the pixel with the filter Etf evaluation value of 100 is taken as the pixel Rntt at the upper end of the region Rnt.
Similarly, the light/dark change detection unit 213 obtains the pixel Rnbt at the upper end and the pixel Rnbb at the lower end of the region Rnb on the basis of the coordinates Enb and the threshold THbf.
The region Rnt is represented, for example, by the y coordinates of the coordinates corresponding to the pixels Rntt and Rntb. Similarly, the region Rnb is represented, for example, by the y coordinates of the coordinates corresponding to the pixels Rnbt and Rnbb.
The thresholds THtf and THbf that define the regions Rnt and Rnb may be defined in advance as constants, or may be obtained from the maximum and minimum evaluation values of the filters Etf and Ebf.
For example, the thresholds THtf and THbf may be obtained from equations (9) and (10) below:

  THtf = α × (FntMax − FntMin) + FntMin   (9)
  THbf = β × (FnbMax − FnbMin) + FnbMin   (10)

Here, the value FntMax is the maximum evaluation value obtained when the pixels of the eye presence region are processed with the filter Etf, and the value FntMin is the corresponding minimum. Similarly, the value FnbMax is the maximum evaluation value obtained when the pixels of the eye presence region are processed with the filter Ebf, and the value FnbMin is the corresponding minimum.
The coefficients α and β determine what fraction of each filter's evaluation value range counts as a large brightness change, and are set in advance; for example, α = β = 0.5.
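A possible realization of the threshold and region computation of Embodiment 2 is sketched below; the scanning loop and the names are assumptions of this sketch, and for simplicity the maxima and minima of equations (9) and (10) are taken along the line l rather than over the whole eye presence region as in the text.

    import numpy as np

    def selection_region(values, start, threshold):
        # values: 1-D filter evaluation values along the straight line l;
        # start: index of the maximum (Ent or Enb). Scan downward, then
        # upward, and stop just before the value first falls below the
        # threshold, as in FIG. 8(B).
        bottom = start
        while bottom + 1 < len(values) and values[bottom + 1] >= threshold:
            bottom += 1
        top = start
        while top - 1 >= 0 and values[top - 1] >= threshold:
            top -= 1
        return top, bottom   # e.g. (Rntt, Rntb) for the region Rnt

    def regions_on_line(etv_line, ebv_line, alpha=0.5, beta=0.5):
        # Thresholds per equations (9) and (10), then the regions Rnt and Rnb.
        thtf = alpha * (etv_line.max() - etv_line.min()) + etv_line.min()
        thbf = beta * (ebv_line.max() - ebv_line.min()) + ebv_line.min()
        rnt = selection_region(etv_line, int(np.argmax(etv_line)), thtf)
        rnb = selection_region(ebv_line, int(np.argmax(ebv_line)), thbf)
        return rnt, rnb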
The approximate curve generation unit 215 generates a plurality of approximate curves for the eyelids from the normalized image data In, the light/dark change data En, and the eyelid reference position data Fn, and gives the shape determination unit 216 parameter data Gn containing a set of parameters representing each of the approximate curves.
When selecting the coordinates Cupa(i), the approximate curve generation unit 215 selects them from the region Rnt for the upper eyelid and from the region Rnb for the lower eyelid, and generates the approximate curves accordingly. By generating the approximate curves so that they pass through the regions where the brightness change is large, the approximate curve generation unit 215 can reduce the number of approximate curves compared with Embodiment 1 and thereby shorten the processing time.
As in Embodiment 1, the shape determination unit 216 selects, from the plurality of approximate curves corresponding to the upper eyelid (first approximate curves), the approximate curve most suitable as the boundary line between the eye and the upper eyelid.
The shape determination unit 216 also selects, from the plurality of approximate curves corresponding to the lower eyelid (second approximate curves), the approximate curve most suitable as the boundary line between the eye and the lower eyelid. The approximate curves are generated and selected in the same way as for the upper eyelid.
From the determined shapes of the upper and lower eyelids, the shape determination unit 216 then gives the eye opening degree calculation unit 130, as in Embodiment 1, the ratio ep = eh/ew of the distance eh = Nd between the end points of the upper and lower eyelids to the distance ew = Nw between the outer and inner corners of the eye, as the shape parameter Hn.
Because the eyelid shape detection device 210 according to Embodiment 2 generates the approximate curves so that they pass through the regions with large brightness changes, it can detect the eyelid shape in less processing time.
Embodiment 3.
As shown in FIG. 1, an eye opening degree detection device 300 according to Embodiment 3 includes an eyelid shape detection device 310 and the eye opening degree calculation unit 130.
The eye opening degree detection device 300 according to Embodiment 3 is configured in the same way as the eye opening degree detection device 100 according to Embodiment 1, except for the eyelid shape detection device 310.
The eyelid shape detection device 310 of Embodiment 3 includes the eye presence region specifying unit 111, the normalization processing unit 112, a light/dark change detection unit 313, the eyelid reference position estimation unit 114, the approximate curve generation unit 115, and the shape determination unit 116.
The eyelid shape detection device 310 of Embodiment 3 is configured in the same way as the eyelid shape detection device 110 of Embodiment 1, except for the light/dark change detection unit 313.
The eye opening degree detection device 100 according to Embodiment 1 obtains the two points with the largest brightness changes (bright to dark, and dark to bright) in the eye presence region and generates the light/dark change data En; in this case the filtering must be applied to every pixel in the eye presence region, so the processing load is large. The eye opening degree detection device 300 according to Embodiment 3 sets a plurality of straight lines within the eye presence region and obtains, on each straight line, the two points with the largest brightness changes. The eye opening degree detection device 300 then calculates the distance between the two points obtained on each straight line and generates light/dark change data En indicating the coordinates of the two points with the largest distance. This not only limits the filtered pixels to those on the straight lines, but also makes the eyelid shape detection robust to changes in eyelid shape caused by changes in face orientation and to individual differences.
The light/dark change detection unit 313 generates the light/dark change data En by calculating the magnitude of the change in brightness in the vertical direction of the normalized image represented by the normalized image data In. For example, the light/dark change detection unit 313 sets a plurality of predetermined straight lines extending in the vertical direction in the normalized image and, on each of the straight lines, calculates the distance between the coordinates of the first candidate pixel, where the decrease in brightness from the pixel above is largest, and the coordinates of the second candidate pixel, where the decrease in brightness from the pixel below is largest. The light/dark change detection unit 313 then identifies the pair of first and second candidate pixels with the largest calculated distance, and takes the coordinates of the identified first candidate pixel as the coordinates Ent of the first end point and the coordinates of the identified second candidate pixel as the coordinates Enb of the second end point.
The light/dark change detection unit 313 then supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
The light/dark change detection unit 313 sets a plurality of straight lines on the normalized image represented by the normalized image data In. Here, to simplify the description, a case where three straight lines parallel to the y-axis are set is used as an example: straight line l(1) is x = Ngx - Nw/4, straight line l(2) is x = Ngx, and straight line l(3) is x = Ngx + Nw/4.
The light/dark change detection unit 313 first applies the filters Etf and Ebf along each straight line. Next, for each straight line, it finds the point where each filter's evaluation value is maximal. Here, on the straight line l(k) (k ∈ {1, 2, 3}), the coordinates of the point where the evaluation value of the filter Etf is maximal are taken as the first candidate pixel coordinates Ent(k), and the coordinates of the point where the evaluation value of the filter Ebf is maximal are taken as the second candidate pixel coordinates Enb(k).
Next, the light/dark change detection unit 313 calculates, for each straight line, the distance d(k) between the coordinates Ent(k) and the coordinates Enb(k). For the distance, for example, the Euclidean distance is used. Here, since each straight line is parallel to the y-axis, d(k) = Enby(k) - Enty(k), where Enby(k) is the y coordinate of Enb(k) and Enty(k) is the y coordinate of Ent(k).
The light/dark change detection unit 313 selects the largest of the obtained distances d(k) (letting K denote the k corresponding to the selected distance) and generates light/dark change data En indicating the coordinates Ent(K) and Enb(K).
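As an illustration, the line-limited detection described above might be sketched as follows in Python. This is a minimal sketch, not the patented implementation: the callables etf and ebf stand in for the evaluation values of the filters Etf and Ebf, and all function and variable names are assumptions introduced here.

```python
import numpy as np

def detect_light_dark_points(norm_img, etf, ebf, ngx, nw):
    """Evaluate the bright-to-dark filter (etf) and the dark-to-bright filter
    (ebf) only along a few vertical lines, then keep the line whose two
    response maxima are farthest apart.

    norm_img : 2-D grayscale array of the normalized eye region.
    etf, ebf : callables (image, x, y) -> filter evaluation value.
    ngx, nw  : horizontal eye center and eye-region width used to place lines.
    """
    h = norm_img.shape[0]
    xs = [ngx - nw // 4, ngx, ngx + nw // 4]  # straight lines l(1), l(2), l(3)

    best = None
    for x in xs:
        # Filter responses are computed only on this vertical line.
        top_scores = [etf(norm_img, x, y) for y in range(h)]
        bot_scores = [ebf(norm_img, x, y) for y in range(h)]
        ent_y = int(np.argmax(top_scores))  # first candidate pixel Ent(k)
        enb_y = int(np.argmax(bot_scores))  # second candidate pixel Enb(k)
        d = enb_y - ent_y  # Euclidean distance reduces to a y difference here
        if best is None or d > best[0]:
            best = (d, (x, ent_y), (x, enb_y))

    _, ent, enb = best
    return ent, enb  # coordinates Ent(K) and Enb(K) of the light/dark data En
```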
In Embodiment 3, the number of straight lines is set to three, but the number is not particularly limited. Also, although straight lines parallel to the y-axis are selected here, the straight lines are not limited to these, and arbitrary straight lines may be set.
Since the eyelid shape detection apparatus 310 according to Embodiment 3 configured as described above can limit the pixels to be filtered to pixels on the straight lines, the processing time can be shortened. Furthermore, by selecting the pair of points for which the distance between the two kinds of large light/dark changes is largest, two points close to the upper end point and the lower end point of the eyelid can be selected, so the shape of the approximate curve generated by the approximate curve generation unit 115 becomes close to the true eyelid shape. The eyelid shape can therefore be detected with higher accuracy.
Embodiment 4.
FIG. 9 is a block diagram schematically showing the configuration of an eye opening degree detection apparatus 400 according to Embodiment 4.
The eye opening degree detection apparatus 400 includes an eyelid shape detection apparatus 410 and an eye opening degree calculation unit 130. The eye opening degree detection apparatus 400 according to Embodiment 4 is configured in the same manner as the eye opening degree detection apparatus 100 according to Embodiment 1, except for the eyelid shape detection apparatus 410.
The eyelid shape detection apparatus 410 in Embodiment 4 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 413, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, and a shape determination unit 116.
The eyelid shape detection apparatus 410 in Embodiment 4 is configured in the same manner as the eyelid shape detection apparatus 110 in Embodiment 1, except for the light/dark change detection unit 413. In addition, face direction information Hd indicating the direction of the driver's face is supplied to the light/dark change detection unit 413 in Embodiment 4 from the outside.
The eye opening degree detection apparatus 300 according to Embodiment 3 limits the pixels to which the filter processing is applied by calculating, on a plurality of straight lines on the normalized image, the distance between the two kinds of points with large light/dark changes and using the two points on the straight line with the largest distance. The eye opening degree detection apparatus 400 according to Embodiment 4 shortens the processing time further by using the face direction information Hd to identify, from among the plurality of straight lines, the straight line that should be used for extracting the light/dark changes.
FIG. 10 is a schematic diagram illustrating an example of an image near the eye when the driver is facing sideways.
Unlike the example of an image near the eye when the driver is facing the front shown in FIG. 2, when the driver is facing sideways, the coordinates Ent of the upper end point and the coordinates Enb of the lower end point of the eyelid to be detected tend to shift toward the left side of the eye presence region. Such a lateral (x-axis direction) bias of the coordinates Ent of the upper end point and the coordinates Enb of the lower end point of the eyelid is caused mainly by changes in the horizontal face direction. Therefore, by determining from the face direction information Hd the straight line from which the light/dark changes are extracted, the light/dark change detection unit 413 does not need to apply the filter processing to a plurality of straight lines as the light/dark change detection unit 313 in Embodiment 3 does.
The light/dark change detection unit 413 generates the light/dark change data En by calculating the magnitude of the change in brightness in the vertical direction of the normalized image represented by the normalized image data In. For example, the light/dark change detection unit 413 sets a plurality of predetermined straight lines extending in the vertical direction in the normalized image and selects one straight line from them based on the direction of the person's face. On the selected straight line, the light/dark change detection unit 413 then takes the coordinates of the pixel where the decrease in brightness from the pixel above is largest as the coordinates Ent of the first end point, and the coordinates of the pixel where the decrease in brightness from the pixel below is largest as the coordinates Enb of the second end point.
The light/dark change detection unit 413 then supplies the light/dark change data En, together with the normalized image data In, to the eyelid reference position estimation unit 114 and the approximate curve generation unit 115.
The light/dark change detection unit 413 sets a plurality of straight lines in the normalized image represented by the normalized image data In. Here, to simplify the description, a case where three straight lines parallel to the y-axis are set is used as an example: straight line l(1) is x = Ngx - Nw/4, straight line l(2) is x = Ngx, and straight line l(3) is x = Ngx + Nw/4.
The light/dark change detection unit 413 then selects the straight line to be filtered based on the horizontal face direction information Hd. Here, the face direction information Hd expresses the direction of the driver's face as a rotation angle, with the front direction being 0 degrees, facing right being a positive rotation angle, and facing left being a negative rotation angle. FIG. 10 shows a state of facing right, that is, facing in the positive direction.
For example, the light/dark change detection unit 413 sets a threshold THfdp and a threshold THfdn (THfdp > THfdn) for the face direction (rotation angle) indicated by the face direction information Hd, and selects the straight line l(1) when Hd ≥ THfdp, the straight line l(2) when THfdn < Hd < THfdp, and the straight line l(3) when Hd ≤ THfdn.
The light/dark change detection unit 413 then applies the filter processing to the selected straight line and generates light/dark change data En indicating the coordinates of the points with the two kinds of large light/dark changes (bright-to-dark and dark-to-bright).
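As an illustration, the threshold-based line selection might look like the following sketch; the function name and the 10-degree threshold values are assumptions introduced for illustration, not values from the specification.

```python
def select_line_by_face_direction(hd, ngx, nw, th_fdp=10.0, th_fdn=-10.0):
    """Pick the single vertical line to filter from the face direction Hd.

    hd      : horizontal face angle in degrees (0 = front, >0 right, <0 left).
    ngx, nw : horizontal eye center and eye-region width used to place lines.
    th_fdp, th_fdn : thresholds THfdp > THfdn (illustrative values).
    """
    if hd >= th_fdp:       # facing right: eyelid points shift, use l(1)
        return ngx - nw // 4
    if hd <= th_fdn:       # facing left: use l(3)
        return ngx + nw // 4
    return ngx             # roughly frontal: use the central line l(2)
```

Only the returned line is then scanned with the filters Etf and Ebf, as in the Embodiment 3 sketch above.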
Since the eyelid shape detection apparatus 410 in Embodiment 4 can use the face direction information Hd to select, from among the plurality of straight lines, the straight line that should be used for extracting the light/dark changes, the number of filter operations can be reduced and the processing time shortened.
Embodiment 5.
FIG. 11 is a block diagram schematically showing the configuration of an eye opening degree detection apparatus 500 according to Embodiment 5.
The eye opening degree detection apparatus 500 includes an eyelid shape detection apparatus 510 and an eye opening degree calculation unit 130. The eye opening degree detection apparatus 500 according to Embodiment 5 is configured in the same manner as the eye opening degree detection apparatus 100 according to Embodiment 1, except for the eyelid shape detection apparatus 510.
The eyelid shape detection apparatus 510 in Embodiment 5 includes an eye presence region specifying unit 111, a normalization processing unit 112, a light/dark change detection unit 113, an eyelid reference position estimation unit 114, an approximate curve generation unit 115, an approximate curve selection unit 517, a correction curve generation unit 518, and a shape determination unit 516.
The eyelid shape detection apparatus 510 in Embodiment 5 is configured in the same manner as the eyelid shape detection apparatus 110 in Embodiment 1, except for the processing in the shape determination unit 516 and the additional approximate curve selection unit 517 and correction curve generation unit 518.
The approximate curve selection unit 517, by the same processing as the shape determination unit 116 in Embodiment 1, evaluates the pixel values on the approximate curve represented by each set of parameters in the parameter data Gn, which contains the parameters of a plurality of approximate curves, and specifies the coordinates of the end point etop of the upper eyelid and of the end point ebot of the lower eyelid. The approximate curve selection unit 517 then generates shape parameter data On indicating the coordinates Cupa(iu) of the upper-eyelid end point etop, the coordinates Enb of the lower-eyelid end point ebot, the coordinates Eno of the outer corner of the eye, and the coordinates Eni of the inner corner of the eye, and gives it to the correction curve generation unit 518. Here, the end point etop of the upper eyelid is also referred to as the fixed end point.
The correction curve generation unit 518 generates, from the given shape parameter data On, a plurality of correction curves, which are approximate curves for correcting the positions of the outer and inner corners of the eye, and gives correction curve parameter data Pn containing the parameters of each of the generated correction curves to the shape determination unit 516.
The operation of the correction curve generation unit 518 will be described in detail.
FIG. 12 is a schematic diagram for explaining the generation of the correction curves for the upper eyelid.
Although the approximate curve selection unit 517 has calculated the coordinates Cupa(iu) of the upper-eyelid end point etop and the coordinates Enb of the lower-eyelid end point ebot, the coordinates Eni of the inner corner and Eno of the outer corner of the eye are values estimated from the coordinates Enb and lack accuracy. The correction curve generation unit 518 therefore generates correction curves, which are approximate curves for correcting the coordinates Eni and Eno.
Each correction curve is defined using the coordinates of three points through which the curve passes: Pupa, Pupb(i), and Pupc(i). The coordinates Pupb(i) are also referred to as the first selected coordinates, and the coordinates Pupc(i) as the second selected coordinates.
The coordinates Pupa are a point common to all of the generated curves. Here, the coordinates Cupa(iu) of the upper-eyelid end point etop are used as the coordinates Pupa.
The coordinates Pupb(i) and Pupc(i) are selected per curve for the plurality of generated curves. The x and y coordinates of Pupa are (Pupax, Pupay) = (Cupa(iu)x, Cupa(iu)y); those of Pupb(i) are (Pupb(i)x, Pupb(i)y); and those of Pupc(i) are (Pupc(i)x, Pupc(i)y). Note that Pupb(i)x < Pupax < Pupc(i)x.
A plurality of coordinates Pupb(i) are selected from points on the straight line l(1) passing through the coordinates Eni. Specifically, M points (M is an integer of 2 or more) satisfying Eniy - mgin ≤ y ≤ Eniy + mgin are selected on the straight line l(1), and the selected points form a point group Pin. Each point Pin(i) (i ∈ {1, 2, ..., M}) in the point group Pin is then set as Pupb(i). The value mgin is a predetermined constant, and the straight line l(1) is, for example, the straight line x = Enix. The points Pin(i) may be selected so as to be equally spaced.
A plurality of coordinates Pupc(i) are selected from points on the straight line l(3) passing through the coordinates Eno. Specifically, M points satisfying Enoy - mgout ≤ y ≤ Enoy + mgout are selected on the straight line l(3), and the selected points form a point group Pout. Each point Pout(i) (i ∈ {1, 2, ..., M}) in the point group Pout is then set as Pupc(i). The value mgout is a predetermined constant, and the straight line l(3) is, for example, the straight line x = Enox. The points Pout(i) may be selected so as to be equally spaced.
The coordinates Pupa are set using the coordinate values of the end point etop: the x coordinate Pupax = etopx and the y coordinate Pupay = etopy.
The eyelid shape may be left-right asymmetric depending on the face orientation in the face image input to the apparatus. For this reason, one correction curve Pu(i) is composed of two curves: a curve Pul(i) for the portion whose x coordinates are smaller than the x coordinate of Pupa, and a curve Pur(i) for the portion whose x coordinates are larger.
As the correction curve, for example, a quadratic function is used. A quadratic function can be uniquely defined by, for example, specifying the point that is its vertex and one other point through which it passes. In the following description, a quadratic function is used as the correction curve.
The curve Pul(i) has the coordinates Pupa as its vertex and passes through the coordinates Pupb(i) as a point other than the vertex.
Writing the quadratic function representing the curve Pul(i) as equation (11), each parameter can be obtained from equation (12):
  y = abl(i)(x - bbl(i))^2 + cbl(i)   (11)
  abl(i) = (Pupb(i)y - Pupay)/(Pupb(i)x - Pupax)^2,  bbl(i) = Pupax,  cbl(i) = Pupay   (12)
Similarly, the curve Pur(i) has the coordinates Pupa as its vertex and passes through the coordinates Pupc(i) as a point other than the vertex.
Writing the quadratic function representing the curve Pur(i) as equation (13), each parameter can be obtained from equation (14):
  y = acr(i)(x - bcr(i))^2 + ccr(i)   (13)
  acr(i) = (Pupc(i)y - Pupay)/(Pupc(i)x - Pupax)^2,  bcr(i) = Pupax,  ccr(i) = Pupay   (14)
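Under the vertex-form reading of equations (11) to (14), the generation of the M upper-eyelid correction curves might be sketched as follows. The equally spaced sampling of the point groups Pin and Pout, the pairing of the i-th points of the two groups into one curve Pu(i), and all names are assumptions introduced here.

```python
import numpy as np

def make_correction_curves(etop, eni, eno, mgin, mgout, m):
    """Build M upper-eyelid correction curves Pu(i) in vertex form.

    etop     : (x, y) of the fixed end point Pupa (upper-eyelid top point).
    eni, eno : (x, y) of the estimated inner corner Eni and outer corner Eno.
    mgin, mgout : vertical search margins around Eni and Eno.
    m        : number M (>= 2) of candidate points per corner.
    Returns a list of ((abl, bbl, cbl), (acr, bcr, ccr)) parameter pairs.
    """
    px, py = etop
    # Equally spaced candidates on l(1): x = Enix, and on l(3): x = Enox.
    ys_in = np.linspace(eni[1] - mgin, eni[1] + mgin, m)     # point group Pin
    ys_out = np.linspace(eno[1] - mgout, eno[1] + mgout, m)  # point group Pout

    curves = []
    for yb, yc in zip(ys_in, ys_out):
        # Left branch Pul(i): vertex Pupa, passing through Pupb(i) = (Enix, yb).
        abl = (yb - py) / (eni[0] - px) ** 2
        # Right branch Pur(i): vertex Pupa, passing through Pupc(i) = (Enox, yc).
        acr = (yc - py) / (eno[0] - px) ** 2
        curves.append(((abl, px, py), (acr, px, py)))
    return curves

def eval_curve(params, x):
    """Evaluate a vertex-form quadratic y = a*(x - b)^2 + c."""
    a, b, c = params
    return a * (x - b) ** 2 + c
```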
The parameters representing the plurality of curves Pu(i) and the curve Cb generated as described above are given to the shape determination unit 516 as the correction curve parameter data Pn.
The correction curve parameter data Pn contains the set of parameters for each curve. For example, the correction curve parameter data Pn contains the type of curve ("quadratic function"), the number of curves ("M for the upper eyelid, one for the lower eyelid"), the eyelid reference points (the coordinates Eno of the outer corner and Eni of the inner corner of the eye) and the coordinates Enb, and the set of parameters of each curve. In the above example, the set of parameters consists of the upper-eyelid correction curve parameters abl(i), bbl(i), cbl(i), acr(i), bcr(i), and ccr(i), and the straight-line parameter of the lower eyelid. The straight-line parameter of the lower eyelid is the y coordinate value of any one of the eyelid reference points (the coordinates Eno of the outer corner and Eni of the inner corner) and the coordinates Enb.
The shape determination unit 516 shown in FIG. 11 receives the correction curve parameter data Pn containing the parameters of the plurality of correction curves, evaluates the pixel values on the correction curve represented by each set of parameters, and calculates the shape parameter Hn representing the shapes of the upper and lower eyelids. The shape determination unit 516 then gives the shape parameter Hn to the eye opening degree calculation unit 130.
To specify the eyelid shape, it is necessary to select, from the plurality of correction curves given as input, one correction curve that is plausible as an eyelid for the upper eyelid and one for the lower eyelid. In this embodiment, the lower-eyelid curve has already been uniquely set, so one correction curve that is most suitable as the boundary line between the eye and the upper eyelid is selected from the plurality of correction curves generated for the upper eyelid.
Like the shape determination unit 116 in Embodiment 1, the shape determination unit 516 obtains, for each of the plurality of correction curves, the average of the luminance values of the pixels on the correction curve, and selects the curve with the lowest average, thereby selecting the correction curve that passes along the upper eyelid.
As described above, the shape determination unit 516 selects from the correction curves the curve Pu(iu) that is most plausible as the upper eyelid, and takes the coordinates Pupb(iu) as the coordinates of the inner corner of the eye and Pupc(iu) as the coordinates of the outer corner of the eye.
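A sketch of this lowest-mean-luminance selection, reusing eval_curve from the sketch above, might be as follows; sampling each curve at integer x positions between the two corners is an assumption made for illustration.

```python
def select_best_curve(norm_img, curves, eni_x, eno_x):
    """Pick the correction curve with the lowest mean luminance along it.

    norm_img : 2-D grayscale array; the eyelid is assumed to appear dark.
    curves   : list of ((abl, bbl, cbl), (acr, bcr, ccr)) parameter pairs.
    eni_x, eno_x : x range to sample, inner corner to outer corner.
    """
    h, w = norm_img.shape
    best_i, best_mean = None, float("inf")
    for i, (left, right) in enumerate(curves):
        vals = []
        for x in range(int(eni_x), int(eno_x) + 1):
            # Left branch up to the vertex x (= bbl), right branch after it.
            params = left if x < left[1] else right
            y = int(round(eval_curve(params, x)))
            if 0 <= y < h and 0 <= x < w:
                vals.append(float(norm_img[y, x]))
        if vals:
            mean = sum(vals) / len(vals)
            if mean < best_mean:
                best_i, best_mean = i, mean
    return best_i  # index iu of the curve Pu(iu) taken as the upper eyelid
```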
Here, the x coordinate of the inner-corner coordinates Pupb(iu) is the x coordinate of the estimated inner-corner coordinates Eni, and the y coordinate of Pupb(iu) can be obtained by substituting the x coordinate of Eni into the correction curve Pu(iu).
Likewise, the x coordinate of the outer-corner coordinates Pupc(iu) is the x coordinate of the estimated outer-corner coordinates Eno, and the y coordinate of Pupc(iu) can be obtained by substituting the x coordinate of Eno into the correction curve Pu(iu).
Furthermore, the x coordinate of the coordinates Pupa is the parameter bcr(i) or the parameter bbl(i), and the y coordinate of the coordinates Pupa is the parameter cbl(i) or the parameter ccr(i).
The shape determination unit 516 then gives the ratio ep = eh/ew between the distance ew = Lw from the outer corner to the inner corner of the eye and the distance eh = Ld between the end point of the upper eyelid and the end point of the lower eyelid to the eye opening degree calculation unit 130 as the shape parameter Hn. The value Lw can be obtained from equation (15), and the value Ld from equation (16), here written as Euclidean distances:
  Lw = sqrt((Pupc(iu)x - Pupb(iu)x)^2 + (Pupc(iu)y - Pupb(iu)y)^2)   (15)
  Ld = sqrt((Pupax - Enbx)^2 + (Pupay - Enby)^2)   (16)
As described above, in the eyelid shape detection apparatus 510 in Embodiment 5, a plurality of correction curves are generated again from the upper-eyelid end point and the lower-eyelid end point calculated using the approximate curves, and the positions of the outer and inner corners of the eye are corrected, so the eyelid shape can be detected with high accuracy.
Embodiment 6.
FIG. 13 is a block diagram schematically showing the configuration of a dozing level detection apparatus 600 according to Embodiment 6.
The dozing level detection apparatus 600 includes one of the eye opening degree detection apparatuses 100 to 500 and a dozing level calculation unit 601.
The dozing level detection apparatus 600 receives face image data Im continuously at regular intervals and outputs the driver's dozing level Qn. The face image data Im is given, for example, at 30 frames per second, and the dozing level Qn is updated and output every time new face image data Im is input.
The eye opening degree detection apparatus may be any one of the eye opening degree detection apparatuses 100 to 500 described in Embodiments 1 to 5. The eye opening degree Kn calculated by the chosen eye opening degree detection apparatus 100 to 500 is given to the dozing level calculation unit 601.
The dozing level is a value indicating the degree of sleepiness. For example, the dozing level expressed by the Karolinska Sleepiness Scale (KSS) indicates the degree of sleepiness with a numerical value on nine levels, from 1 (extremely alert) to 9 (very sleepy). Alternatively, the degree of sleepiness may be indicated by a numerical value on ten levels, adding 10 (sleeping) to the nine levels. In the following description, as an example, the dozing level is considered on the basis of the ten-level numerical value and is defined as a continuous rather than stepwise value; for example, values expressed with decimal points, such as 1.1 and 5.4, are also included. The numerical value of the dozing level is not limited to the above nine or ten levels; for example, in the facial expression determination method, the dozing level is indicated by a numerical value from 1 to 5.
The dozing level calculation unit 601 receives the eye opening degree Kn and calculates the driver's dozing level Qn from the eye opening degrees Kn of the past Nq frames.
For example, the dozing level calculation unit 601 calculates PERCLOS (Percentage of Eyelid Closure) over the past Nq frames and calculates the dozing level from it. PERCLOS is expressed, for example, as the proportion of the number Nqc of closed-eye frames Cf contained in the past Nq frames, and is calculated as PERCLOS = Nqc/Nq.
Here, the closed-eye state is, for example, a state in which frames whose eye opening degree Kn is at most the threshold Knh have been observed for Nqch consecutive frames. For example, Nq = 5,400, Knh = 50, and Nqch = 5 are used.
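A minimal sketch of this PERCLOS computation, under the reading that frames count as closed-eye once Kn ≤ Knh has persisted for Nqch consecutive frames, might be:

```python
def perclos(kn_history, knh=50, nqch=5):
    """Fraction of closed-eye frames in a window of eye opening degrees Kn.

    kn_history : sequence of eye opening degrees, one per frame (length Nq).
    knh        : threshold at or below which the eye counts as closed.
    nqch       : consecutive low-Kn frames required before frames start
                 counting as the closed-eye state.
    """
    closed = 0  # Nqc: number of closed-eye frames
    run = 0     # current run length of frames with Kn <= Knh
    for kn in kn_history:
        if kn <= knh:
            run += 1
            if run >= nqch:
                closed += 1  # count frames once the run is long enough
        else:
            run = 0
    return closed / len(kn_history)  # PERCLOS = Nqc / Nq
```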
The dozing level Qn is then calculated by the following equation (17):
  Qn = fq(PERCLOS)   (17)
In equation (17), fq is an arbitrary function with PERCLOS as its variable; for example, a linear function, a quadratic function, or a nonlinear function using a neural network is used. Since the dozing level is assumed here to be indicated by a numerical value from 1 to 10, any function that can produce numerical values from 1 to 10 can be used as fq. For example, the functions expressed by the following equations (18) and (19) can be used:
  [Equation (18)]
  [Equation (19)]
Here, Sigmoid(a, x) is a sigmoid function, defined by the following equation (20):
  Sigmoid(a, x) = 1/(1 + exp(-ax))   (20)
The parameters (α1 to αNN, β, etc.) in equations (18) and (19) are calculated statistically from past experimental data. For example, in an experiment measuring the dozing level, the dozing level is evaluated using the Karolinska Sleepiness Scale while PERCLOS is measured at the same time, so the relationship between the dozing level and PERCLOS can be obtained. Based on the data obtained in this way, parameters that plausibly calculate the dozing level from PERCLOS are identified, and the identified parameters are applied to equation (18) or (19).
A neural network is a technique for modeling the correspondence between input variables and results as a function. Using a neural network, the correspondence between PERCLOS and the dozing level obtained from the experimental data can be acquired as a function; that is, a function is obtained that outputs a plausible dozing level for the PERCLOS given as input.
However, the same data as the past data is not always obtained in actual use, in which case an output outside the range 1 to 10 may be produced. This can be handled, as shown in equation (18), by setting the output to 10 when a value larger than 10 is obtained and to 0 when a value smaller than 0 is obtained.
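Since equations (18) and (19) appear only as images in the source, the following is a hypothetical illustration of two such mappings consistent with the surrounding text: a clipped linear form and a sigmoid form. All coefficients here are invented for illustration; in practice they would be fitted to experimental KSS/PERCLOS data as described above.

```python
import math

def sigmoid(a, x):
    """Sigmoid function of equation (20): Sigmoid(a, x) = 1/(1 + exp(-a*x))."""
    return 1.0 / (1.0 + math.exp(-a * x))

def fq_linear_clipped(perclos_value, alpha=30.0, beta=1.0):
    """Hypothetical clipped linear fq; alpha and beta are invented here."""
    qn = alpha * perclos_value + beta
    return max(0.0, min(10.0, qn))  # clip, as discussed for equation (18)

def fq_sigmoid(perclos_value, a=20.0, center=0.3):
    """Hypothetical sigmoid fq mapping PERCLOS into the 1-10 range."""
    return 1.0 + 9.0 * sigmoid(a, perclos_value - center)
```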
Alternatively, letting the PERCLOS over the past Nq frames be PERCLOS(1), the PERCLOS over the past 2×Nq to Nq frames be PERCLOS(2), ..., and the PERCLOS over the past NN×Nq to (NN-1)×Nq frames be PERCLOS(NN), the dozing level Qn may be calculated by the following equation (21):
  [Equation (21)]
Here, PERCLOS in equation (21) will be explained. For example, let the current frame be F(k), Nq = 900, and NN = 10.
In this case, the PERCLOS calculated over F(k) to F(k-899) is PERCLOS(1), the PERCLOS calculated over F(k-900) to F(k-1799) is PERCLOS(2), and the PERCLOS calculated over F(k-1800) to F(k-2699) is PERCLOS(3). Continuing in the same way, the PERCLOS calculated over F(k-8100) to F(k-8999) is PERCLOS(10).
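Reusing the perclos sketch above, the windowing used by equation (21) might be expressed as follows; how the NN window values are then combined is not reproduced here, since equation (21) itself appears only as an image in the source.

```python
def windowed_perclos(kn_history, nq=900, nn=10, knh=50, nqch=5):
    """Compute PERCLOS(1) .. PERCLOS(NN) over consecutive windows of Nq frames.

    kn_history : eye opening degrees Kn, oldest first and newest last,
                 of length at least nn * nq.
    Returns [PERCLOS(1), ..., PERCLOS(NN)], where PERCLOS(1) covers the most
    recent nq frames, e.g. F(k) .. F(k-899) for nq = 900.
    """
    values = []
    for j in range(nn):
        end = len(kn_history) - j * nq
        start = end - nq
        values.append(perclos(kn_history[start:end], knh, nqch))
    return values
```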
In the above description, PERCLOS is used as the variable of fq to calculate the dozing level Qn, but arbitrary features, such as the interval at which the closed-eye state occurs, the blink frequency, and the blink interval, may be extracted from the eye opening degree Kn and used as the variables of fq.
The dozing level output by the dozing level detection apparatus 600 configured as described above may be used to warn or alert the driver.
With the configuration described above, the degree of the driver's sleepiness can be estimated accurately using highly accurate eye opening degree detection results.
Part or all of the eye opening degree detection apparatuses 100 to 500 and the dozing level detection apparatus 600 described above can be configured, for example, from a memory 150 and a processor 151, such as a CPU (Central Processing Unit), that executes a program stored in the memory 150, as shown in FIG. 14(A). Such a program may be provided through a network, or may be provided recorded on a recording medium.
Part or all of the eye opening degree detection apparatuses 100 to 500 and the dozing level detection apparatus 600 can also be configured from a processing circuit 152, such as a single circuit, a decoding circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), or an FPGA (Field Programmable Gate Array), as shown in FIG. 14(B).
In Embodiments 1 to 5 described above, the eye opening degree Kn is the output of the eye opening degree detection apparatuses 100 to 500, but the output is not limited to this example; for example, the shape parameter Hn may be the output. In such a case, the eyelid shape detection apparatuses 110 to 510 function as the information processing apparatus.
In Embodiment 6, the dozing level detection apparatus 600, which outputs the dozing level Qn, functions as the information processing apparatus.
100, 200, 300, 400, 500 eye opening degree detection apparatus; 600 dozing level detection apparatus; 601 dozing level calculation unit; 110, 210, 310, 410, 510 eyelid shape detection apparatus; 130 eye opening degree calculation unit; 111 eye presence region specifying unit; 112 normalization processing unit; 113, 213, 313, 413 light/dark change detection unit; 114 eyelid reference position estimation unit; 115, 215 approximate curve generation unit; 116, 216, 516 shape determination unit; 517 approximate curve selection unit; 518 correction curve generation unit; 150 memory; 151 processor; 152 processing circuit.

Claims (17)

1. An information processing apparatus comprising:
an eye presence region specifying unit that specifies, from a face image of a person, an eye presence region, which is a region where an eye exists;
a light/dark change detection unit that estimates the coordinates of a first end point on a first boundary line between the eye and an upper eyelid and the coordinates of a second end point on a second boundary line between the eye and a lower eyelid by detecting the magnitude of change in brightness between pixels in the vertical direction of the eye presence region;
an eyelid reference position estimation unit that estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region;
an approximate curve generation unit that selects a plurality of coordinates from a straight line passing through the first end point and the second end point in the eye presence region and generates a plurality of first approximate curves from each of the selected plurality of coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; and
a shape determination unit that selects, from the plurality of first approximate curves, the first approximate curve most suitable as the first boundary line and determines the shape of the upper eyelid from the selected first approximate curve,
wherein the light/dark change detection unit specifies, based on the change in brightness between the pixels, a first selection region that is formed on the straight line passing through the first end point and the second end point in the eye presence region and includes the first end point, and
the approximate curve generation unit selects the plurality of coordinates from the first selection region.
2. The information processing apparatus according to claim 1, wherein the light/dark change detection unit takes, in the eye presence region, the coordinates of the pixel where the degree of decrease in brightness from the pixel above is largest as the coordinates of the first end point, and the coordinates of the pixel where the degree of decrease in brightness from the pixel below is largest as the coordinates of the second end point.
3. The information processing apparatus according to claim 1, wherein the light/dark change detection unit sets a plurality of predetermined straight lines extending in the vertical direction in the eye presence region, calculates, on each straight line included in the plurality of straight lines, the distance between the coordinates of a first candidate pixel where the degree of decrease in brightness from the pixel above is largest and the coordinates of a second candidate pixel where the degree of decrease in brightness from the pixel below is largest, and takes the coordinates of the first candidate pixel and the coordinates of the second candidate pixel with the largest calculated distance as the coordinates of the first end point and the coordinates of the second end point, respectively.
4. The information processing apparatus according to claim 1, wherein the light/dark change detection unit sets a plurality of predetermined straight lines extending in the vertical direction in the eye presence region, selects one straight line from the plurality of straight lines based on the direction of the face of the person, and takes, on the selected straight line, the coordinates of the pixel where the degree of decrease in brightness from the pixel above is largest as the coordinates of the first end point and the coordinates of the pixel where the degree of decrease in brightness from the pixel below is largest as the coordinates of the second end point.
5. The information processing apparatus according to any one of claims 2 to 4, wherein the light/dark change detection unit uses a first filter for estimating the coordinates of the first end point and a second filter for estimating the coordinates of the second end point,
the first filter calculates, as the evaluation value of a first pixel of interest in the eye presence region, the value obtained by subtracting the average of the luminance values of the pixels included in a first region, which is centered on the first pixel of interest and includes a plurality of pixels in the lateral direction of the first pixel of interest, from the average of the luminance values of the pixels included in a second region, which adjoins the top of the first region, has the same number of pixels in the lateral direction as the first region, and has more pixels in the vertical direction than the first region, and takes the calculated evaluation value as the degree of decrease in brightness from the pixel above, and
the second filter calculates, as the evaluation value of a second pixel of interest in the eye presence region, the value obtained by subtracting the average of the luminance values of the pixels included in a third region, which is centered on the second pixel of interest and includes a plurality of pixels in the lateral direction of the second pixel of interest, from the average of the luminance values of the pixels included in a fourth region, which adjoins the bottom of the third region, has the same number of pixels in the lateral direction as the third region, and has more pixels in the vertical direction than the third region, and takes the calculated evaluation value as the degree of decrease in brightness from the pixel below.
6. The information processing apparatus according to any one of claims 1 to 5, wherein the eyelid reference position estimation unit takes the vertical-axis value of the coordinates of the second end point and the lateral-axis value of one lateral end of the eye presence region as the coordinates of the inner corner of the eye, and the vertical-axis value of the coordinates of the second end point and the lateral-axis value of the other lateral end of the eye presence region as the coordinates of the outer corner of the eye, and
the shape determination unit determines the shape of the lower eyelid from the line segment between the coordinates of the inner corner and the coordinates of the outer corner of the eye.
7. The information processing apparatus according to any one of claims 1 to 6, wherein the approximate curve generation unit selects the plurality of coordinates from the line segment between the first end point and the second end point in the eye presence region.
8. The information processing apparatus according to claim 5, wherein the light/dark change detection unit specifies, among the coordinates on the straight line passing through the first end point and the second end point in the eye presence region, the first selection region consisting of the coordinates that include the first end point and at which the evaluation value of the first filter is at least a predetermined value.
9. The information processing apparatus according to claim 8, wherein the light/dark change detection unit specifies, among the coordinates on the straight line passing through the first end point and the second end point in the eye presence region, a second selection region consisting of the coordinates that include the second end point and at which the evaluation value of the second filter is at least a predetermined value,
the approximate curve generation unit selects a plurality of coordinates from the second selection region and generates a plurality of second approximate curves from each of the plurality of coordinates selected from the second selection region, the coordinates of the inner corner, and the coordinates of the outer corner, and
the shape determination unit selects, from the plurality of second approximate curves, the second approximate curve most suitable as the second boundary line and determines the shape of the lower eyelid from the selected second approximate curve.
10. The information processing apparatus according to claim 9, wherein the shape determination unit judges, among the plurality of second approximate curves, the second approximate curve for which the average of the luminance values of the pixels on the second approximate curve in the eye presence region is smallest to be most suitable as the second boundary line.
11. The information processing apparatus according to any one of claims 1 to 10, wherein the first approximate curve is specified by a quadratic function that passes through the coordinates of the inner corner of the eye and has one of the selected plurality of coordinates as its vertex, and a quadratic function that passes through the coordinates of the outer corner of the eye and has that one coordinate as its vertex.
12. The information processing apparatus according to any one of claims 1 to 11, wherein the shape determination unit judges, among the plurality of first approximate curves, the first approximate curve for which the average of the luminance values of the pixels on the first approximate curve in the eye presence region is smallest to be most suitable as the first boundary line.
13. An information processing apparatus comprising:
an eye presence region specifying unit that specifies, from a face image of a person, an eye presence region, which is a region where an eye exists;
a light/dark change detection unit that estimates the coordinates of a first end point on a first boundary line between the eye and an upper eyelid and the coordinates of a second end point on a second boundary line between the eye and a lower eyelid by detecting the magnitude of change in brightness between pixels in the vertical direction of the eye presence region;
an eyelid reference position estimation unit that estimates the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye in the eye presence region;
an approximate curve generation unit that selects a plurality of coordinates from a straight line passing through the first end point and the second end point in the eye presence region and generates a plurality of first approximate curves from each of the selected plurality of coordinates, the coordinates of the inner corner, and the coordinates of the outer corner;
an approximate curve selection unit that determines the coordinates of a fixed end point, which fixes the uppermost point on the first boundary line, by selecting, from the plurality of first approximate curves, the first approximate curve most suitable as the first boundary line;
a correction curve generation unit that selects, in the eye presence region, a plurality of first selected coordinates from a straight line in the vertical direction passing through the coordinates of the inner corner and a plurality of second selected coordinates from a straight line in the vertical direction passing through the coordinates of the outer corner, and generates a plurality of correction curves, which are approximate curves, from each of the plurality of first selected coordinates, each of the plurality of second selected coordinates, and the coordinates of the fixed end point; and
a shape determination unit that selects, from the plurality of correction curves, the correction curve most suitable as the first boundary line and determines the shape of the upper eyelid from the selected correction curve.
14. The information processing apparatus according to any one of claims 1 to 13, further comprising an eye opening degree calculation unit,
wherein the shape determination unit calculates a shape parameter indicating the shape of the eye from the determined shape of the upper eyelid, the coordinates of the inner corner, the coordinates of the outer corner, and the second end point, and
the eye opening degree calculation unit calculates, based on the shape parameter, an eye opening degree, which is the degree of opening of the eye.
15. The information processing apparatus according to claim 14, further comprising a dozing level calculation unit that calculates, based on the eye opening degree, a dozing level, which is the degree of sleepiness of the person.
  16.  An information processing method comprising:
    identifying, from a face image of a person, an eye presence region that is a region in which an eye exists;
    estimating the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid, by detecting the magnitude of the change in brightness between pixels in the vertical direction of the eye presence region;
    estimating, in the eye presence region, the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye;
    identifying, in the eye presence region and based on the change in brightness between the pixels, a first selection region that lies on a straight line passing through the first end point and the second end point and that includes the first end point;
    selecting a plurality of coordinates from the first selection region;
    generating a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner; and
    selecting, from the plurality of first approximate curves, the first approximate curve that best fits the first boundary line, and determining the shape of the upper eyelid from the selected first approximate curve.
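The end-point estimation step of this method can be pictured with the following sketch: scanning one pixel column top to bottom and taking the strongest darkening as the skin-to-eye transition and the strongest brightening as the eye-to-skin transition. It is a simplified stand-in for the claimed brightness-change detection, assuming a grayscale image with bright skin and a darker eye; `estimate_end_points` is a hypothetical name.

```python
import numpy as np

def estimate_end_points(eye_region, x_column):
    """Estimate upper/lower eyelid boundary points on one pixel column."""
    column = eye_region[:, x_column].astype(float)
    diff = np.diff(column)           # change between vertically adjacent pixels
    first_y = int(np.argmin(diff))   # largest drop: skin -> eye (upper boundary)
    second_y = int(np.argmax(diff))  # largest rise: eye -> skin (lower boundary)
    return (x_column, first_y), (x_column, second_y)
```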
  17.  An information processing method comprising:
    identifying, from a face image of a person, an eye presence region that is a region in which an eye exists;
    estimating the coordinates of a first end point on a first boundary line between the eye and the upper eyelid and the coordinates of a second end point on a second boundary line between the eye and the lower eyelid, by detecting the magnitude of the change in brightness between pixels in the vertical direction of the eye presence region;
    estimating, in the eye presence region, the coordinates of the inner corner of the eye and the coordinates of the outer corner of the eye;
    selecting, in the eye presence region, a plurality of coordinates on a straight line passing through the first end point and the second end point;
    generating a plurality of first approximate curves from each of the selected coordinates, the coordinates of the inner corner, and the coordinates of the outer corner;
    determining the coordinates of a fixed end point, which fixes the highest point on the first boundary line, by selecting from the plurality of first approximate curves the first approximate curve that best fits the first boundary line;
    selecting, in the eye presence region, a plurality of first selection coordinates on a vertical straight line passing through the coordinates of the inner corner;
    selecting, in the eye presence region, a plurality of second selection coordinates on a vertical straight line passing through the coordinates of the outer corner;
    generating a plurality of correction curves, each being an approximate curve, from each of the first selection coordinates, each of the second selection coordinates, and the coordinates of the fixed end point; and
    selecting, from the plurality of correction curves, the correction curve that best fits the first boundary line, and determining the shape of the upper eyelid from the selected correction curve.
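The correction-curve stage of this method enumerates one curve per pairing of candidate corner heights, with every curve pinned to the fixed end point. A sketch follows, again assuming quadratic fits; the claim itself requires only "approximate curves", and `generate_correction_curves` is a hypothetical name. The best of these curves would then be chosen with the same kind of brightness-change score as in the first stage.

```python
import numpy as np
from itertools import product

def generate_correction_curves(inner_x, outer_x, inner_y_candidates,
                               outer_y_candidates, fixed_end_point):
    """Enumerate one quadratic per pairing of candidate corner heights.

    Every curve passes through the fixed end point (the confirmed highest
    point of the upper-eyelid boundary); the corner heights are swept along
    the vertical lines through the inner and outer corners.
    """
    fx, fy = fixed_end_point
    curves = []
    for y_in, y_out in product(inner_y_candidates, outer_y_candidates):
        curves.append(np.polyfit([inner_x, fx, outer_x], [y_in, fy, y_out], 2))
    return curves
```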
PCT/JP2017/006046 2016-03-18 2017-02-20 Information processing device and information processing method WO2017159215A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016055591A JP2019082743A (en) 2016-03-18 2016-03-18 Information processing apparatus and information processing method
JP2016-055591 2016-03-18

Publications (1)

Publication Number Publication Date
WO2017159215A1 (en) 2017-09-21

Family

ID=59850893

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/006046 WO2017159215A1 (en) 2016-03-18 2017-02-20 Information processing device and information processing method

Country Status (2)

Country Link
JP (1) JP2019082743A (en)
WO (1) WO2017159215A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7315006B2 (en) * 2019-08-08 2023-07-26 オムロン株式会社 monitoring equipment

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008257575A (en) * 2007-04-06 2008-10-23 Fujitsu Ltd Sleeping state detecting method, sleeping state detector, sleeping state detecting system and computer program
JP2010244178A (en) * 2009-04-02 2010-10-28 Toyota Central R&D Labs Inc Face feature point detection device and program
JP2012022531A (en) * 2010-07-14 2012-02-02 Toyota Central R&D Labs Inc Eyelid detecting device and program

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109544721A (en) * 2018-12-04 2019-03-29 北京诺士诚国际工程项目管理有限公司 A kind of long-range punch card method and system
CN114052667A (en) * 2021-12-18 2022-02-18 郑州大学 Sleep state monitoring method and sleep monitoring device
CN114052667B (en) * 2021-12-18 2024-03-05 郑州大学 Sleep state monitoring method and sleep monitoring device

Also Published As

Publication number Publication date
JP2019082743A (en) 2019-05-30

Similar Documents

Publication Publication Date Title
Vezhnevets et al. Robust and accurate eye contour extraction
JP4307496B2 (en) Facial part detection device and program
EP2923306B1 (en) Method and apparatus for facial image processing
US8891819B2 (en) Line-of-sight detection apparatus and method thereof
JP4895847B2 (en) Eyelid detection device and program
JP4912206B2 (en) Image processing method, image processing apparatus, image processing system, and computer program
CN104200192A (en) Driver gaze detection system
US7362885B2 (en) Object tracking and eye state identification method
CN109840565A (en) A kind of blink detection method based on eye contour feature point aspect ratio
CN101261677A New feature-layer fusion method for face and iris
CN104008364B (en) Face identification method
Darshana et al. Efficient PERCLOS and gaze measurement methodologies to estimate driver attention in real time
CN111291701B (en) Sight tracking method based on image gradient and ellipse fitting algorithm
WO2017159215A1 (en) Information processing device and information processing method
JP4757787B2 (en) Emotion estimation device
Luo et al. The driver fatigue monitoring system based on face recognition technology
JP4107087B2 (en) Open / close eye determination device
JP4082203B2 (en) Open / close eye determination device
JP4788319B2 (en) Opening and closing eye determination device and method
JP2004192552A (en) Eye opening/closing determining apparatus
CN106446859B Method for automatically identifying stains and bloodshot traces in the human eye using a mobile phone front-facing camera
JP7240910B2 (en) Passenger observation device
JP5493676B2 (en) Eye position recognition device
KR100338805B1 (en) Method for detecting drowsiness level
Horak et al. Eyes detection and tracking for monitoring driver vigilance

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17766222

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 17766222

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP