WO2022123751A1 - Determination Method, Determination Program, and Information Processing Device - Google Patents
Determination Method, Determination Program, and Information Processing Device
- Publication number
- WO2022123751A1 (application PCT/JP2020/046151, JP2020046151W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- face image
- color component
- component
- color
- determined
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to an image determination technique.
- A technique is known that determines the presence or absence of a concealing object on an input target image by comparing the feature amounts at corresponding sample points between a reference target image and the input target image obtained by imaging the target.
- A technique is also known that determines whether the face in a face image is wearing an attachment.
- A technique is also known that sets the center line between the uppermost part (top of the head) and the lowermost part (chin) of a human face as the vertical center, compares the image information of a predetermined upper area and a predetermined lower area sandwiching this center, and determines whether or not the photographed person is wearing a mask.
- In this way, several techniques are known for determining, from a person's face image, the presence or absence of an attachment such as a mask or sunglasses that partially shields the person's face (see, for example, Patent Documents 1 to 5).
- Face recognition confirms the identity of a person to be authenticated by collating a captured face image of that person with a face image of the person registered in advance.
- The person to be authenticated may be wearing an accessory that partially shields the face, such as a mask or sunglasses.
- When face recognition is performed on a captured face image of such a person, the similarity between the captured face image and the registered face image decreases, which may cause a false rejection.
- the accuracy of detecting the orientation and posture of the face may decrease.
- In this way, an attachment that covers part of the face of the person to be authenticated can affect face recognition. It is therefore desirable to accurately determine whether such an attachment is present on the subject's face in the face image.
- In one aspect, a computer acquires a face image taken by a camera. For each color component contained in the face image, the computer obtains an index indicating the likelihood that an attachment partially shielding the face is present on the face image, based on the distribution of the brightness of that color component in the face image. The computer then determines one of the color components contained in the face image based on the color information of the face image, and determines whether or not an attachment is present on the face image based on the determined color component and the per-component indexes.
- the image on the left side in each of FIGS. 1 and 2 is an example of a face image, and is an image of the face of the same person.
- The face shown in the face image of FIG. 1 is not wearing a mask, which is an example of an attachment, whereas the face shown in the face image of FIG. 2 is wearing a mask, so the part of the face including the mouth is shielded.
- Although spectacles are worn on both faces shown in these face images, their lenses are highly transparent and do not block the eyes.
- The curves on the right side of FIGS. 1 and 2 each represent the distribution of brightness for one of the color components contained in the face image on the left.
- Each distribution plots, for every horizontal pixel row of the image frame, the total (or average) of the pixel values of the color component over that row, arranged vertically in the order of the pixel rows.
- The horizontal direction of these plots indicates the magnitude of the per-row value; values increase toward the right, that is, rightward means higher brightness of the corresponding color component in that pixel row.
- These distributions are examples of the distribution of the brightness of the color components contained in the facial image.
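The per-row distribution described above can be sketched as follows. The function name and the toy array are illustrative; the choice between row totals and row averages follows the two options the text mentions.

```python
import numpy as np

def row_brightness_distribution(channel, use_mean=True):
    """Collapse one color component of a face image into a vertical
    brightness distribution: one value per horizontal pixel row,
    kept in the top-to-bottom order of the image frame.
    `channel` is an H x W array of that component's pixel values."""
    channel = np.asarray(channel, dtype=float)
    return channel.mean(axis=1) if use_mean else channel.sum(axis=1)

# Toy 4x3 channel: a bright band over a dark band.
toy = np.array([[200, 200, 200],
                [200, 200, 200],
                [10, 10, 10],
                [10, 10, 10]])
print(row_brightness_distribution(toy))  # [200. 200.  10.  10.]
```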
- In the distribution of FIG. 1, the brightness takes local minima at the positions corresponding to the pupils, the nostrils, and the mouth fissure (between the upper and lower lips), and a local maximum at the position corresponding to the tip of the nose; these characteristics appear clearly.
- In the distribution of FIG. 2, by contrast, the maximum/minimum characteristics at the positions corresponding to the nostrils and the mouth fissure are unclear. This difference arises because the mask worn on the face shields the nose and mouth.
- Taking advantage of this, an index indicating the likelihood that an attachment partially shielding the face is present on the face image is obtained based on the distribution of brightness for each color component in the face image, as described above. Whether or not the attachment is present on the face image is then determined based on this index.
- To obtain this index, a model is used that, given the above-mentioned brightness distribution as input, outputs the index.
- This model is generated by machine learning using a well-known machine learning algorithm such as a neural network.
- As the teacher data, a data group representing, for each color component, the distribution of brightness for face images with an attachment and a data group representing the distribution of brightness for face images without an attachment are used.
- The model generated in this way is used to acquire the index indicating the likelihood that an attachment is present on the face image.
- However, depending on the color of the mask, the brightness distribution for a face image in which a mask shields the nose and mouth may resemble the distribution for a face image in which no mask is worn.
- A method for accurately determining the presence or absence of an attachment even in such a case is therefore described below.
- FIGS. 3 and 4 both show examples of the brightness distribution for a face image without a mask: FIG. 3 is for the reddish component and FIG. 4 for the bluish component.
- The maximum/minimum characteristics of brightness at the positions of the facial parts, clearly visible in the distribution of FIG. 1, appear clearly in both of the distributions in FIGS. 3 and 4.
- FIGS. 5 and 6 show the brightness distribution for a face image in which a mask is worn: FIG. 5 is for the reddish component and FIG. 6 for the bluish component.
- The distributions in FIGS. 5 and 6 are for the case where the mask worn on the face is red.
- In the distribution of FIG. 6, which shows the bluish component, the brightness is low over the part where the red mask is worn, and the maxima and minima there are unclear. It is therefore easy to determine the presence or absence of the mask by comparison with the distribution of FIG. 4, in which no mask is present.
- Moreover, because the brightness of the bluish component is low across the entire region of the red mask, even if wrinkles of the mask produce dark portions in the mask region of the face image, spurious maxima and minima at the mask position, like those in the reddish-component distribution of FIG. 5, do not clearly appear.
- For this reason, one of the color components contained in the face image is determined as the color component suitable for use in determining whether an attachment is present in the face image. This determination is made based on the color information of the face image; more specifically, it is based on, for example, whether the face image has a small reddish component or a small bluish component. If the face image contains a red attachment, its reddish component is large and its bluish component is therefore relatively small; conversely, if it contains a blue attachment, its bluish component is large and its reddish component is relatively small.
- Whether or not an attachment is present on the face image is then determined based on the color component determined as described above and the per-color-component indexes indicating the likelihood that an attachment is present. More specifically, the indexes are not used with equal weight: a weight is assigned to each index so that the weight for the determined color component is heavier than the weights for the other color components, and the determination uses the weighted indexes. In this way, the presence or absence of an attachment in the face image can be determined accurately.
- FIG. 7 shows the configuration of an exemplary information processing apparatus 1.
- a camera 2 is connected to this information processing device 1.
- the camera 2 is used for photographing a person, and the photographed image including the image of the face of the person obtained by the photographing is input to the information processing apparatus 1.
- the information processing device 1 includes a face image acquisition unit 11, an index acquisition unit 12, a color component determination unit 13, and a determination unit 14.
- the face image acquisition unit 11 acquires an image (face image) of a person's face region from the captured image.
- the face image acquisition unit 11 may apply any of these well-known techniques to acquire a face image from a captured image.
- In the present embodiment, the face image acquisition unit 11 adopts, as its method for acquiring a face image, the following method described in the above-mentioned Cited Document 4.
- First, luminance values are added sequentially, and the pixels in the range over which the values were accumulated are determined to be included in the face region. This determination exploits the fact that the face region tends to be imaged relatively brightly compared with the hair region and the background region.
- Next, the amount of change in the horizontal direction is calculated for the luminance values, and positions where the calculated amount of change exceeds a threshold value are identified as the horizontal contour of the face.
- This exploits the fact that the amount of change of the luminance value in the horizontal direction tends to change significantly at the boundary between the background region and the face region compared with other parts.
- The face region is specified by identifying its vertical and horizontal extent in the captured image in this way.
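The brightness-based region estimate described above might be sketched roughly as follows. The thresholds, the use of row and column means, and the function name are all assumptions for illustration; the text only describes the general idea of bright face rows and large horizontal brightness changes at the contour.

```python
import numpy as np

def estimate_face_region(gray, row_thresh=100.0, change_thresh=30.0):
    """Rough face-region estimate following the idea in the text:
    rows bright enough to be face rather than hair or background give
    the vertical extent, and columns where the horizontal change in
    brightness is large mark the contour. Thresholds are illustrative."""
    gray = np.asarray(gray, dtype=float)
    rows = np.where(gray.mean(axis=1) >= row_thresh)[0]  # candidate face rows
    change = np.abs(np.diff(gray.mean(axis=0)))          # horizontal change amount
    cols = np.where(change >= change_thresh)[0]          # contour candidates
    if len(rows) == 0 or len(cols) == 0:
        return None
    # top, bottom, left, right of the estimated face region
    return int(rows[0]), int(rows[-1]), int(cols[0]), int(cols[-1] + 1)

# Toy frame: dark background with a bright "face" block in the middle.
frame = np.full((8, 8), 20.0)
frame[2:6, 2:6] = 220.0
print(estimate_face_region(frame))  # (2, 5, 1, 6)
```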
- The index acquisition unit 12 creates a brightness distribution for each color component contained in the face image acquired by the face image acquisition unit 11, and, based on the distribution for each color component, acquires for each color component an index indicating the likelihood that an attachment is present on the face image.
- In the present embodiment, the index acquisition unit 12 creates the above-mentioned distribution for each color component and acquires the index using a model that, when a created distribution is input, outputs the index for that color component.
- This model is, for example, a neural network.
- This model is generated by executing a well-known machine learning algorithm using a pre-prepared data group as teacher data.
- As the teacher data, a data group representing the brightness distribution for each color component of face images in which an attachment is present and a data group representing the brightness distribution for each color component of face images in which no attachment is present are used. For example, the value "1", indicating that an attachment exists, is associated as the output with the input of a distribution for a face image with an attachment, and the value "0", indicating that no attachment exists, is associated as the output with the input of a distribution for a face image without an attachment. Machine learning is then executed using these data groups as teacher data. A model that outputs the above-mentioned index, with a range of possible values from 0 to 1, is generated in this way.
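As a rough illustration of this training setup, the sketch below builds synthetic "distributions", labels them 1 (attachment present) or 0 (absent), and fits a minimal logistic model in place of the neural network the text mentions. The data generator and the model are toy assumptions, not the patent's actual training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_distribution(with_mask, n_rows=32):
    """Synthetic stand-in for a per-row brightness distribution.
    Without a mask, the lower face shows sharp minima (nostrils,
    mouth fissure); with a mask it stays flat. Toy data only."""
    d = np.full(n_rows, 0.6) + rng.normal(0.0, 0.02, n_rows)
    if not with_mask:
        d[20] = 0.1  # nostril-like minimum
        d[26] = 0.1  # mouth-fissure-like minimum
    return d

# Teacher data: output 1 = attachment present, 0 = absent.
X = np.array([make_distribution(flag) for flag in [True, False] * 50])
y = np.array([1.0, 0.0] * 50)

# Minimal logistic model as a stand-in for the neural network.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(500):  # plain gradient descent on the log loss
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.5 * (X.T @ g) / len(y)
    b -= 0.5 * g.mean()

def index_for(dist):
    """Model output: an index in [0, 1] for 'attachment present'."""
    return float(1.0 / (1.0 + np.exp(-(dist @ w + b))))

print(index_for(make_distribution(True)) > index_for(make_distribution(False)))  # True
```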
- the index acquisition unit 12 may acquire the above-mentioned index without using such a machine learning model.
- For example, a reference brightness distribution for each color component of a reference image, for a face on which an attachment is present, may be prepared in advance, and the degree of similarity between the brightness distribution for each color component of the face image and the corresponding reference distribution may be calculated as the above-mentioned index.
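A minimal sketch of this similarity-based alternative, assuming cosine similarity mapped to [0, 1] as the "degree of similarity" (the text does not specify the measure):

```python
import numpy as np

def similarity_index(dist, ref_dist):
    """Index from the similarity between a brightness distribution and
    a reference distribution for a face with the attachment present.
    Cosine similarity mapped onto [0, 1]; the measure is an assumption,
    the text only says 'degree of similarity'."""
    a, b = np.asarray(dist, float), np.asarray(ref_dist, float)
    cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return (cos + 1.0) / 2.0

ref = np.array([0.6, 0.6, 0.6, 0.6])       # flat, mask-like reference
masked = np.array([0.6, 0.59, 0.61, 0.6])  # near-flat: high index
bare = np.array([0.6, 0.1, 0.6, 0.1])      # sharp minima: lower index
print(similarity_index(masked, ref) > similarity_index(bare, ref))  # True
```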
- The color component determination unit 13 determines one of the color components contained in the face image based on the color information of the face image acquired by the face image acquisition unit 11. In the present embodiment, the color component is determined as follows.
- In the present embodiment, the image data of the face image is expressed in the YCbCr space.
- The Cr component represents the reddish component, and the Cb component represents the bluish component.
- When the image data of the face image is expressed in another color space, it may first be converted into image data in the YCbCr space, after which each process described below is performed. Alternatively, for example, when the image data is represented in the RGB color space, the R component may be regarded as the reddish component and the B component as the bluish component, and each process may be performed with the Cr and Cb components in the following description replaced by the R and B components, respectively.
- First, the Cr component value and the Cb component value of each pixel constituting the face image are normalized so that each component value lies in the range from 0 to 1. In the following description, the Cr and Cb component values are assumed to have been normalized in this way.
- Next, the average of the Cr component values and the average of the Cb component values are calculated over all the pixels constituting the face image.
- The averages of the Cr and Cb component values calculated here are denoted MI_Cr and MI_Cb, respectively.
- MI_Cr and MI_Cb indicate the brightness levels of the reddish and bluish components contained in the face image, and are an example of the color information of the face image.
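Computing MI_Cr and MI_Cb might look as follows, assuming 8-bit Cr/Cb planes and normalization to [0, 1] by division by 255 (one straightforward reading of the normalization step):

```python
import numpy as np

def color_means(cr_plane, cb_plane):
    """MI_Cr and MI_Cb: averages of the normalized Cr and Cb component
    values over all pixels of the face image. Assumes 8-bit planes
    normalized by division by 255."""
    mi_cr = float((np.asarray(cr_plane, float) / 255.0).mean())
    mi_cb = float((np.asarray(cb_plane, float) / 255.0).mean())
    return mi_cr, mi_cb

# Toy face image with a strong reddish cast (e.g. a red mask in view).
cr = np.full((4, 4), 204)
cb = np.full((4, 4), 51)
mi_cr, mi_cb = color_means(cr, cb)
print(round(mi_cr, 2), round(mi_cb, 2))  # 0.8 0.2
```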
- Here, the color component determination unit 13 could determine the color component simply by comparing the magnitudes of MI_Cr and MI_Cb calculated in this way: when MI_Cb is smaller than MI_Cr, the Cb component would be the determination result, and when MI_Cr is smaller than MI_Cb, the Cr component would be the result.
- However, even in a face image in which no attachment is present, the brightness of the bluish component and that of the reddish component stand in a characteristic relationship. In the present embodiment, the color component is therefore determined after correcting one of the compared component values based on this tendency.
- More specifically, the color component determination unit 13 first corrects the brightness of the reddish or bluish component based on the relationship between the brightness of the reddish and bluish components contained in a reference face image in which no attachment is present, and then determines the color component based on the corrected brightness.
- In the present embodiment, a face image in which no attachment is present is prepared as the reference face image, and the averages of the normalized Cr and Cb component values over all pixels constituting this reference face image are calculated in advance. These averages of the Cr and Cb component values of the reference face image are denoted MT_Cr and MT_Cb, respectively.
- The color component determination unit 13 substitutes MI_Cb, of the MI_Cr and MI_Cb calculated as described above, into the following equation [Equation 1] to calculate the correction value MI_Cb'.
- The color component determination unit 13 then compares the magnitudes of the correction value MI_Cb' and MI_Cr: if MI_Cb' is smaller than MI_Cr, the Cb component is the result of the color component determination, and if MI_Cr is smaller than MI_Cb', the Cr component is the result. That is, the color component determination unit 13 determines the bluish component when, in the relationship between the brightness of the bluish and reddish components contained in the face image, the bluish component is lower than the reddish component, and determines the reddish component when the reddish component is lower than the bluish component.
- In other words, the color component determination unit 13 determines the color component based on the result of a magnitude comparison between the correction value MI_Cb', obtained by correcting the brightness MI_Cb of the bluish component contained in the face image, and the brightness MI_Cr of the reddish component contained in the face image.
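The determination logic can be sketched as below. Since [Equation 1] is not reproduced in the text, the correction shown, rescaling MI_Cb by the reference ratio MT_Cr / MT_Cb, is only an assumed form that cancels the reference tendency; the comparison and the tie-breaking follow the text.

```python
def determine_color_component(mi_cr, mi_cb, mt_cr, mt_cb):
    """Pick the color component to emphasize ('Cb' or 'Cr').
    The correction below, rescaling MI_Cb by MT_Cr / MT_Cb, is an
    ASSUMED form of [Equation 1]; the text only says MI_Cb is
    corrected using the reference averages MT_Cr and MT_Cb."""
    mi_cb_corrected = mi_cb * (mt_cr / mt_cb)  # correction value MI_Cb'
    # Choose Cb when it is the relatively dark component; otherwise
    # (including a tie) choose Cr, following S109/S110 of the text.
    return "Cb" if mi_cb_corrected < mi_cr else "Cr"

# Reference face (no attachment): Cb typically sits below Cr.
MT_CR, MT_CB = 0.55, 0.45
# A red mask pushes Cr up, so even the corrected Cb stays lower.
print(determine_color_component(0.80, 0.30, MT_CR, MT_CB))  # Cb
# A blue mask pushes Cb up, so the corrected Cb exceeds Cr.
print(determine_color_component(0.35, 0.70, MT_CR, MT_CB))  # Cr
```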
- The determination unit 14 determines whether or not an attachment is present on the face image based on the color component determined by the color component determination unit 13 and the per-color-component indexes acquired by the index acquisition unit 12.
- More specifically, the determination unit 14 first acquires a weight for each color component contained in the face image. These weights are set so that the weight for the color component determined by the color component determination unit 13 is heavier than the weights for the other color components.
- The determination unit 14 then applies the weight acquired for each color component to the corresponding index acquired by the index acquisition unit 12.
- Finally, the determination unit 14 determines whether or not an attachment is present on the face image based on the indexes weighted in this way.
- In the present embodiment, the determination unit 14 calculates the weight W_Cr for the Cr component and the weight W_Cb for the Cb component by evaluating the following equation [Equation 2].
- Next, the determination unit 14 calculates the comprehensive index P_t, an index indicating the likelihood that an attachment is present on the face image, by evaluating the following equation [Equation 3].
- The equation [Equation 3] calculates, as the comprehensive index P_t, the average of the weighted index for the Cr component and the weighted index for the Cb component.
- The determination unit 14 then compares the magnitude of the calculated comprehensive index P_t with a predetermined threshold value (for example, 0.5). Based on the result of this comparison, the determination unit 14 determines that an attachment is present on the face image when P_t is equal to or greater than the threshold value, determines that no attachment is present when P_t is smaller than the threshold value, and outputs the determination result.
- The output of the determination unit 14 is the output of the information processing device 1. This output is used, for example, in face recognition processing to perform collation against a separately registered face image depending on whether an attachment is present on the face image.
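Since [Equation 2] and [Equation 3] are likewise not reproduced in the text, the sketch below assumes weights inversely proportional to each component's brightness (so the darker, determined component weighs more) and a weighted average as the comprehensive index; only the 0.5 threshold comes from the text.

```python
def attachment_decision(p_cr, p_cb, mi_cr, mi_cb_corr, threshold=0.5):
    """Weighted presence decision. The weights below, inversely
    proportional to each component's brightness so that the darker
    (determined) component dominates, are an ASSUMED form of
    [Equation 2]; P_t as their weighted average stands in for
    [Equation 3]. The 0.5 threshold follows the text."""
    total = mi_cr + mi_cb_corr
    w_cr = mi_cb_corr / total  # darker Cr -> heavier weight on its index
    w_cb = mi_cr / total       # darker Cb -> heavier weight on its index
    p_t = w_cr * p_cr + w_cb * p_cb  # comprehensive index P_t
    return p_t, p_t >= threshold

# Red mask: the Cb index (0.9) is reliable, the Cr index (0.4) is
# confused by wrinkles; the darker Cb component gets weight 0.8.
p_t, present = attachment_decision(p_cr=0.4, p_cb=0.9, mi_cr=0.8, mi_cb_corr=0.2)
print(round(p_t, 2), present)  # 0.8 True
```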
- the information processing device 1 of FIG. 7 includes each of the above-mentioned components.
- the information processing device 1 may be configured by a combination of a computer and software.
- FIG. 8 shows an example of the hardware configuration of the computer 20.
- the computer 20 includes, for example, a processor 21, a memory 22, a storage device 23, a reading device 24, a communication interface 26, and an input / output interface 27 as components. These components are connected via the bus 28, and data can be exchanged between the components.
- The processor 21 may be, for example, a single processor, a multiprocessor, or a multicore processor.
- the processor 21 uses the memory 22 to execute, for example, an attachment presence / absence determination processing program that describes a procedure for attachment presence / absence determination processing described later.
- the memory 22 is, for example, a semiconductor memory, and may include a RAM area and a ROM area.
- The storage device 23 is, for example, a hard disk, a semiconductor memory such as a flash memory, or an external storage device.
- RAM is an abbreviation for Random Access Memory.
- ROM is an abbreviation for Read Only Memory.
- the reading device 24 accesses the removable storage medium 25 according to the instructions of the processor 21.
- The removable storage medium 25 is realized by, for example, a semiconductor device (a USB memory or the like), a medium to or from which information is input or output by magnetic action (a magnetic disk or the like), or a medium to or from which information is input or output by optical action (a CD-ROM, a DVD, or the like).
- USB is an abbreviation for Universal Serial Bus.
- CD is an abbreviation for Compact Disc.
- DVD is an abbreviation for Digital Versatile Disc.
- the communication interface 26 transmits / receives data via a communication network (not shown) according to the instructions of the processor 21, for example.
- the input / output interface 27 acquires various data such as image data of a captured image sent from the camera 2. Further, the input / output interface 27 outputs the result of the attachment presence / absence determination process described later, which is output from the processor 21.
- The attachment presence/absence determination program executed by the processor 21 of the computer 20 is provided, for example, in the following forms: (1) pre-installed in the storage device 23; (2) provided by the removable storage medium 25; (3) provided to the communication interface 26 from a server, such as a program server, via a communication network.
- the hardware configuration of the computer 20 is an example, and the embodiment is not limited to this.
- some or all the functions of the above-mentioned functional parts may be implemented as hardware by FPGA, SoC, or the like.
- FPGA is an abbreviation for Field Programmable Gate Array.
- SoC is an abbreviation for System-on-a-chip.
- Next, the attachment presence/absence determination process described in the attachment presence/absence determination program executed by the processor 21 will be described.
- FIGS. 9 and 10 are flowcharts showing the processing contents of the attachment presence/absence determination process.
- First, the processor 21 prepares a model (for example, a neural network) that, in response to the input of the brightness distribution for each color component of a face image, outputs for each color component an index indicating the likelihood that an attachment is present on the face image.
- The above-mentioned model is generated in advance by executing a well-known machine learning algorithm using a data group prepared beforehand as teacher data.
- As the teacher data, a data group representing the brightness distribution for each color component of face images in which an attachment is present and a data group representing the brightness distribution for each color component of face images in which no attachment is present are used.
- In S102, a process of acquiring a face image from the captured image acquired in S101 is performed.
- The face image is acquired using the same method as described above for the face image acquisition unit 11 in the information processing apparatus 1 of FIG. 7.
- the processor 21 provides the above-mentioned function of the face image acquisition unit 11 by executing the above processes of S101 and S102.
- Next, in S103, a process of creating brightness distributions for the reddish component and the bluish component of the face image acquired in S102 is performed.
- the distribution created by this process is the distribution described above as created by the index acquisition unit 12 in the information processing apparatus 1 of FIG. 7.
- Next, in S104, a process of acquiring, for the reddish component and the bluish component, an index indicating the likelihood that an attachment is present in the face image acquired in S102 is performed.
- In this process, the brightness distributions of the reddish and bluish components created in S103 are input to the above-mentioned model, and the indexes output from the model are acquired.
- the processor 21 provides the above-mentioned function of the index acquisition unit 12 by executing the above processes of S103 and S104.
- Next, a process of calculating the brightness levels of the reddish component and the bluish component of the acquired face image is performed.
- This process is the process described above as performed by the color component determination unit 13 in the information processing apparatus 1 of FIG. 7. That is, from the Cr and Cb component values of each pixel constituting the face image, the averages MI_Cr and MI_Cb over all the pixels constituting the face image are calculated.
- Next, in S107, a process of comparing the magnitude of the brightness of the reddish component with that of the bluish component is performed.
- This is the process, described above as performed by the color component determination unit 13, of comparing the magnitudes of MI_Cb' and MI_Cr.
- A determination process is then performed based on the result of the magnitude comparison of S107; that is, it is determined whether or not the brightness of the bluish component is lower than the brightness of the reddish component.
- If so, the process proceeds to S109, where the bluish component is selected as the result of the color component determination.
- Otherwise, the process proceeds to S110, where the reddish component is selected as the result of the color component determination. Consequently, when the two are determined to be equal, the reddish component is selected here; instead, the bluish component may be selected as the determination result in the case of equality.
- the processor 21 provides the above-mentioned function of the color component determination unit 13 by executing the above processes from S105 to S110.
- a process of acquiring weights for the reddish component and the bluish component is performed.
- as the process performed by the determination unit 14 in the information processing apparatus 1 of FIG. 7, the process of calculating the weights W Cr and W Cb from MI Cr and MI Cb' using the above-mentioned equation [Equation 2] is performed.
- the weight of the color component selected by executing the process of S109 or S110 of FIG. 9 is set to be heavier than the weight of the other color component.
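The exact form of [Equation 2] is not quoted in this excerpt, so the following is only a hypothetical stand-in that preserves the stated property: the determined color component receives the heavier weight, and the weights are normalized. The 2:1 ratio is an assumption for illustration.

```python
def component_weights(selected):
    """Hypothetical stand-in for [Equation 2] (not quoted in this excerpt):
    the determined component gets the heavier weight; weights sum to 1."""
    raw = {"reddish": 1.0, "bluish": 1.0}
    raw[selected] = 2.0  # the selected component is weighted more heavily
    total = raw["reddish"] + raw["bluish"]
    return raw["reddish"] / total, raw["bluish"] / total  # (W_Cr, W_Cb)
```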
- a process of calculating a comprehensive index indicating the likelihood that an attached object is present in the face image acquired by the process of S102 of FIG. 9 is performed.
- this is the process described above as performed by the determination unit 14, namely the process of calculating the comprehensive index Pt from MI Cr and MI Cb' and the weights W Cr and W Cb using the equation [Equation 3].
- a process of comparing the magnitude of the comprehensive index calculated in the process of S112 with a predetermined threshold value is performed.
- this is the process of comparing the magnitudes of the comprehensive index Pt and the threshold value described above as performed by the determination unit 14.
- a determination process is performed based on the result of the magnitude comparison of S113. That is, a process of determining whether or not the comprehensive index is a value equal to or higher than the threshold value is performed.
- the process proceeds to S115. Then, in S115, as the final determination result of the presence or absence of the attached object, a process of determining that the attached object is present in the face image acquired by the process of S102 of FIG. 9 is performed.
- the process proceeds to S116. Then, in S116, as the final determination result of the presence or absence of the attached object, a process of determining that the attached object is not present in the face image acquired by the process of S102 is performed.
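Taken together, S112 through S116 combine the per-component indices into Pt and compare it to a threshold. Since [Equation 3] is not quoted in this excerpt, the weighted-average form below follows claim 9; the inputs p_cr and p_cb stand for the per-component indices from the model, and the threshold value 0.5 is an assumption.

```python
def judge_attachment(p_cr, p_cb, w_cr, w_cb, threshold=0.5):
    """Comprehensive index Pt as the weighted average of the per-component
    indices (cf. claim 9), compared against a threshold as in S113-S116.
    Illustrative sketch; the actual [Equation 3] is not quoted here."""
    pt = (w_cr * p_cr + w_cb * p_cb) / (w_cr + w_cb)
    present = pt >= threshold  # at or above the threshold: object present
    return present, pt
```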
- the processor 21 provides the above-mentioned function of the determination unit 14 by executing the above processes from S111 to S117.
- the above completes the process of determining the presence or absence of an attached object.
- by the above, the computer 20 of FIG. 8 operates as the information processing apparatus 1 of FIG. 7, and it becomes possible to accurately determine the presence or absence of an attached object in the face image.
- an index indicating the likelihood that the attached object is present in the face image is calculated by a model, and the presence or absence of the attached object is determined based on the calculated index and the weight.
- the present invention is not limited to this. Needless to say, any implementation may be adopted as long as it first determines one of the reddish component and the bluish component as the color component and then determines the presence or absence of an attached object based on the brightness distribution of the determined color component. That is, an implementation that uses neither a model nor an index is also possible.
- 1 Information processing apparatus; 2 Camera; 11 Face image acquisition unit; 12 Index acquisition unit; 13 Color component determination unit; 14 Determination unit; 20 Computer; 21 Processor; 22 Memory; 23 Storage device; 24 Reader; 25 Removable storage medium; 26 Communication interface; 27 Input/output interface; 28 Bus
Abstract
Description
(1) It is installed in the storage device 23 in advance.
(2) It is provided via the removable storage medium 25.
(3) It is provided from a server such as a program server to the communication interface 26 via a communication network.
Claims (13)
1. A determination method characterized in that a computer executes processing of: acquiring a face image captured by a camera; acquiring, for each color component included in the face image, an index representing the likelihood that an attached object partially covering a face is present in the face image, based on a brightness distribution of the respective color component in the face image; determining one of the color components included in the face image based on color information of the face image; and determining whether the attached object is present in the face image based on the determined color component and the index for each color component.
2. The determination method according to claim 1, wherein the color information is information representing a brightness level of each color component included in the face image, and in the determination of the color component, a color component whose brightness level is lower than that of the other color components among the color components included in the face image is determined.
3. The determination method according to claim 2, wherein the color components included in the face image include a reddish component and a bluish component, and in the determination of the color component, the bluish component is determined when the brightness level of the bluish component included in the face image is lower than the brightness level of the reddish component included in the face image, and the reddish component is determined when the brightness level of the reddish component included in the face image is lower than the brightness level of the bluish component included in the face image.
4. The determination method according to claim 3, wherein the brightness level of the reddish component or the brightness level of the bluish component is corrected based on a relationship between the brightness levels of the reddish component and the bluish component in a reference face image in which the attached object is not present, and the determination of the color component is performed based on the corrected brightness.
5. The determination method according to claim 3 or 4, wherein the reddish component and the bluish component are the Cr component and the Cb component in the YCrCb color space, respectively.
6. The determination method according to any one of claims 1 to 5, wherein the brightness distribution of each color component included in the face image is a distribution obtained by arranging the per-row sum or per-row average of the pixel values of the respective color component, taken over the pixels aligned in each horizontal pixel row of the image frame of the face image, in the order in which the pixel rows are arranged in the vertical direction of the image frame.
7. The determination method according to any one of claims 1 to 6, wherein, for each color component included in the face image, a model that outputs the index for the respective color component in response to input of the distribution for that color component is generated in advance by machine learning using, as training data, a data group in which each item represents the distribution for the respective color component of a face image in which the attached object is present and a data group in which each item represents the distribution for the respective color component of a face image in which the attached object is not present, and the output of the generated model when the distribution for the respective color component included in the face image is input thereto is acquired as the index for that color component.
8. The determination method according to any one of claims 1 to 7, wherein weights for the color components included in the face image, in which the weight for the determined color component is heavier than the weights for the color components other than the determined color component, are acquired for the respective color components, the weight for each color component is applied to the index for that color component, and the determination of whether the attached object is present is performed based on the weighted indices for the color components.
9. The determination method according to claim 8, wherein whether the attached object is present in the face image is determined by a magnitude comparison between a threshold value and the average of the weighted indices over the color components.
10. The determination method according to claim 8 or 9, wherein the weight for the determined color component is made heavier as the brightness level of the determined color component included in the face image is higher.
11. A determination program for causing a computer to execute processing of: acquiring a face image captured by a camera; acquiring, for each color component included in the face image, an index representing the likelihood that an attached object partially covering a face is present in the face image, based on a brightness distribution of the respective color component in the face image; determining one of the color components included in the face image based on color information of the face image; and determining whether the attached object is present in the face image based on the determined color component and the index for each color component.
12. An information processing apparatus comprising: a face image acquisition unit that acquires a face image captured by a camera; an index acquisition unit that acquires, for each color component included in the face image, an index representing the likelihood that an attached object partially covering a face is present in the face image, based on a brightness distribution of the respective color component in the face image; a color component determination unit that determines one of the color components included in the face image based on color information of the face image; and a determination unit that determines whether the attached object is present in the face image based on the determined color component and the index for each color component.
13. A determination method characterized in that a computer executes processing of: acquiring a face image captured by a camera; determining one of the color components based on color information of the acquired face image; and determining whether an attached object is present in the face image based on a brightness distribution of the determined color component among the color components included in the acquired face image.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20965133.0A EP4261773A1 (en) | 2020-12-10 | 2020-12-10 | Determination method, determination program, and information processing device |
CN202080106991.5A CN116472555A (zh) | 2020-12-10 | 2020-12-10 | 判定方法、判定程序以及信息处理装置 |
JP2022567992A JPWO2022123751A1 (ja) | 2020-12-10 | 2020-12-10 | |
PCT/JP2020/046151 WO2022123751A1 (ja) | 2020-12-10 | 2020-12-10 | 判定方法、判定プログラム、及び情報処理装置 |
US18/305,859 US20230260322A1 (en) | 2020-12-10 | 2023-04-24 | Determination method, storage medium, and information processing device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2020/046151 WO2022123751A1 (ja) | 2020-12-10 | 2020-12-10 | 判定方法、判定プログラム、及び情報処理装置 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/305,859 Continuation US20230260322A1 (en) | 2020-12-10 | 2023-04-24 | Determination method, storage medium, and information processing device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022123751A1 true WO2022123751A1 (ja) | 2022-06-16 |
Family
ID=81973473
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2020/046151 WO2022123751A1 (ja) | 2020-12-10 | 2020-12-10 | 判定方法、判定プログラム、及び情報処理装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230260322A1 (ja) |
EP (1) | EP4261773A1 (ja) |
JP (1) | JPWO2022123751A1 (ja) |
CN (1) | CN116472555A (ja) |
WO (1) | WO2022123751A1 (ja) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000082147A (ja) * | 1998-09-05 | 2000-03-21 | Sharp Corp | ヒトの顔を検出する方法および装置、ならびに観察者トラッキングディスプレイ |
JP2004310397A (ja) | 2003-04-07 | 2004-11-04 | Toyota Central Res & Dev Lab Inc | マスク着用判定装置 |
JP2005018466A (ja) * | 2003-06-26 | 2005-01-20 | Canon Inc | 画像領域の抽出及び画像の再生 |
JP2010157073A (ja) | 2008-12-26 | 2010-07-15 | Fujitsu Ltd | 顔認識装置、顔認識方法及び顔認識プログラム |
WO2010126120A1 (ja) | 2009-04-30 | 2010-11-04 | グローリー株式会社 | 画像処理装置、画像処理方法、及び同方法をコンピュータに実行させるプログラム |
WO2019102619A1 (ja) | 2017-11-27 | 2019-05-31 | 三菱電機株式会社 | 表情認識装置 |
US20190259174A1 (en) * | 2018-02-22 | 2019-08-22 | Innodem Neurosciences | Eye tracking method and system |
JP2019194888A (ja) | 2014-02-12 | 2019-11-07 | 日本電気株式会社 | 情報処理装置、情報処理方法及びプログラム |
- 2020
- 2020-12-10 EP EP20965133.0A patent/EP4261773A1/en not_active Withdrawn
- 2020-12-10 JP JP2022567992A patent/JPWO2022123751A1/ja active Pending
- 2020-12-10 CN CN202080106991.5A patent/CN116472555A/zh active Pending
- 2020-12-10 WO PCT/JP2020/046151 patent/WO2022123751A1/ja active Application Filing
- 2023
- 2023-04-24 US US18/305,859 patent/US20230260322A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
EP4261773A1 (en) | 2023-10-18 |
CN116472555A (zh) | 2023-07-21 |
US20230260322A1 (en) | 2023-08-17 |
JPWO2022123751A1 (ja) | 2022-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4345622B2 (ja) | 瞳色推定装置 | |
JP4214420B2 (ja) | 瞳色補正装置およびプログラム | |
CN110377385B (zh) | 一种屏幕显示方法、装置及终端设备 | |
KR101301821B1 (ko) | 안색 정보 생성 장치 및 그 방법, 안색 정보를 이용한 건강 상태 판단 장치 및 그 방법, 건강 분류 함수 생성 장치 및 그 방법 | |
CN109086723B (zh) | 一种基于迁移学习的人脸检测的方法、装置以及设备 | |
US12067095B2 (en) | Biometric authentication system, biometric authentication method, and storage medium | |
KR100922653B1 (ko) | 눈동자색 보정 장치 및 기록 매체 | |
CN112818901B (zh) | 一种基于眼部注意力机制的戴口罩人脸识别方法 | |
JP5730044B2 (ja) | 顔画像認証装置 | |
KR20150072463A (ko) | 안면 영상을 이용하는 건강 상태 판단 장치 및 건강 상태 판단 방법 | |
CN106570447B (zh) | 基于灰度直方图匹配的人脸照片太阳镜自动去除方法 | |
JP5480532B2 (ja) | 画像処理装置、画像処理方法、及び同方法をコンピュータに実行させるプログラム | |
CN111291701A (zh) | 一种基于图像梯度和椭圆拟合算法的视线追踪方法 | |
JP6396357B2 (ja) | 顔画像認証装置 | |
CN114894337A (zh) | 一种用于室外人脸识别测温方法及装置 | |
KR20210136092A (ko) | 화상 처리 장치, 화상 처리 방법 및 화상 처리 프로그램 | |
US20230020160A1 (en) | Method for determining a value of at least one geometrico-morphological parameter of a subject wearing an eyewear | |
WO2022123751A1 (ja) | 判定方法、判定プログラム、及び情報処理装置 | |
JP4602688B2 (ja) | 画像処理方法、画像処理装置およびそのプログラム | |
CN107847136B (zh) | 弹力感评价装置、弹力感评价方法以及弹力感评价程序 | |
KR20200049936A (ko) | 생체 인식 장치 및 방법 | |
CN115393695A (zh) | 人脸图像质量评估方法、装置、电子设备及存储介质 | |
Tang et al. | Visualizing vein patterns from color skin images based on image mapping for forensics analysis | |
Kabbani et al. | Robust Sclera Segmentation for Skin-Tone Agnostic Face Image Quality Assessment | |
KR102669584B1 (ko) | 반려동물 생체정보 검출 방법 및 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20965133 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2022567992 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 202080106991.5 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2020965133 Country of ref document: EP Effective date: 20230710 |