JP2003178304A - Face image retrieving device, face image retrieving method and program for executing method on computer - Google Patents

Face image retrieving device, face image retrieving method and program for executing method on computer

Info

Publication number
JP2003178304A
JP2003178304A (Application JP2001379136A)
Authority
JP
Japan
Prior art keywords
image
feature amount
extracting
area
captured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2001379136A
Other languages
Japanese (ja)
Inventor
Yasushi Kage
Takaaki Sakaguchi
Masahiro Takeda
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp
Priority to JP2001379136A priority Critical patent/JP2003178304A/en
Publication of JP2003178304A publication Critical patent/JP2003178304A/en
Pending legal-status Critical Current


Abstract

(57) [Summary] [Problem] To achieve a simple configuration and high-speed processing, by suppressing an enormous amount of calculation, in a face image search device that compares a captured face image with registered face images and retrieves the registered face image having the highest correlation. [Solution] The face area of a person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, and feature amounts of the face are extracted based on the extraction result. The degree of similarity is then evaluated against a database of registered face image information.

Description

Detailed Description of the Invention

[0001]

BACKGROUND OF THE INVENTION 1. Field of the Invention The present invention relates to a face image search device, a face image search method, and a program for causing a computer to execute the method, which retrieve the registered face image having the highest correlation with a captured face image by comparing features of the captured face image with features of registered face images.

[0002]

2. Description of the Related Art Conventionally, entry of a personal identification number or password has been used to identify individuals, for example when using ATM cards at banks or when shopping online over the Internet. In recent years, however, as computerization has advanced and devices have become more sophisticated, techniques that capture a face image from an installed camera and perform person authentication and search have attracted attention, and various methods have been proposed.

For example, Japanese Unexamined Patent Publication No. 9-251534 (Prior Art 1) discloses a person authentication device and method in which feature points such as the eyes and nose are extracted using a separability filter and collated with templates of the eyes, nose, and so on registered in advance. Japanese Patent Laid-Open No. 2001-92963 (Prior Art 2) discloses a method and device in which a face area is cut out from the face image and images are compared by evaluating the distance between the input data and model data in a feature space.

[0004]

However, because the above conventional techniques rely on extraction processing with a separability filter or on comparison processing in a feature space, the amount of calculation becomes enormous, which poses a problem for real-time processing. Special hardware could be used to speed up the processing, but then the system can no longer be constructed easily.

The present invention has been made in view of the above, and its object is to obtain a face image search device, a face image search method, and a program for causing a computer to execute the method, which enable a simple configuration and high-speed processing by suppressing the enormous amount of calculation.

[0006]

In order to achieve the above object, a face image search device according to the present invention is a device that retrieves the registered face image having the highest correlation with a captured face image by comparing the captured face image with registered face images, and comprises: an image capturing means for capturing a face image; a face area detecting means for detecting the face area of a person from the image captured by the image capturing means; a face part extracting means for extracting face parts such as the eyes and mouth from the result of the face area detecting means; a feature amount extracting means for extracting feature amounts of the face based on the result of the face part extracting means; a database holding information based on a plurality of face images; and an evaluation processing means for obtaining, based on the feature amounts obtained by the feature amount extracting means, the degree of similarity between the captured face image and the registered face images held in the database.

According to this invention, the face area of a person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, feature amounts of the face are extracted based on the extraction result, and the degree of similarity with the registered face images stored in the database is evaluated, so that a face image search device with a suppressed amount of calculation can be realized.

A face image search device according to the next invention is characterized in that, in the above invention, it further comprises an image display means for displaying the image captured by the image capturing means, and displays a face alignment position at the time of capturing.

According to this invention, by displaying the face alignment position at the time of capturing in addition to the captured image, the positions of face parts such as the eyes and mouth can be determined easily.

A face image search device according to the next invention is characterized in that, in the above invention, the feature amount extracting means extracts, as a feature amount, the waveforms obtained by projecting the hair area in the face area detected by the face area detecting means in each of the vertical and horizontal directions.

According to this invention, by extracting as a feature amount the waveforms obtained by projecting the hair area in the detected face area in each of the vertical and horizontal directions, a face image search device with a reduced amount of calculation can be realized.

A face image search device according to the next invention is characterized in that, in the above invention, the feature amount extracting means extracts, as a feature amount, images obtained by cutting out the face parts extracted by the face part extracting means.

According to this invention, by using images obtained by cutting out the extracted face parts as feature amounts, a face image search device with a suppressed amount of calculation can be realized.

A face image search device according to the next invention is characterized in that, in the above invention, the feature amount extracting means extracts the positional relationship of the face parts extracted by the face part extracting means as a feature amount.

According to this invention, by using the positional relationship of the extracted face parts as a feature amount, a face image search device with a suppressed amount of calculation can be realized.

A face image search method according to the next invention is a method that retrieves the registered face image having the highest correlation with a captured face image by comparing the captured face image with registered face images, and comprises: an image capturing step of capturing a face image; a face area detecting step of detecting the face area of a person from the image captured in the image capturing step; a face part extracting step of extracting face parts such as the eyes and mouth from the result of the face area detecting step; a feature amount extracting step of extracting feature amounts of the face based on the result of the face part extracting step; a step of holding information based on a plurality of face images in a database; and an evaluation processing step of obtaining, based on the feature amounts obtained in the feature amount extracting step, the degree of similarity between the captured face image and the registered face images held in the database.

According to this invention, the face area of a person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, feature amounts of the face are extracted based on the extraction result, and the degree of similarity with the registered face images stored in the database is evaluated, so that a face image search method with a suppressed amount of calculation can be realized.

A face image search method according to the next invention is characterized in that, in the above invention, it further includes an image display step of displaying the image captured in the image capturing step, and displays a face alignment position at the time of capturing.

According to this invention, by displaying the face alignment position at the time of capturing in addition to the captured image, the positions of face parts such as the eyes and mouth can be determined easily.

A face image search method according to the next invention is characterized in that, in the above invention, the feature amount extracting step extracts, as a feature amount, the waveforms obtained by projecting the hair area in the face area detected in the face area detecting step in each of the vertical and horizontal directions.

According to this invention, by extracting as a feature amount the waveforms obtained by projecting the hair area in the detected face area in each of the vertical and horizontal directions, a face image search method with a reduced amount of calculation can be realized.

A face image search method according to the next invention is characterized in that, in the above invention, the feature amount extracting step extracts, as a feature amount, images obtained by cutting out the face parts extracted in the face part extracting step.

According to this invention, by using images obtained by cutting out the extracted face parts as feature amounts, a face image search method with a suppressed amount of calculation can be realized.

A face image search method according to the next invention is characterized in that, in the above invention, the feature amount extracting step extracts the positional relationship of the face parts extracted in the face part extracting step as a feature amount.

According to this invention, by using the positional relationship of the extracted face parts as a feature amount, a face image search method with a suppressed amount of calculation can be realized.

A program according to the next invention causes a computer to execute the method described in any one of the above inventions; by making the program computer-readable, the operation of any one of the above inventions can be carried out by a computer.

[0027]

BEST MODE FOR CARRYING OUT THE INVENTION Embodiments of a face image search apparatus, a face image search method and a program for causing a computer to execute the method according to the present invention will be described in detail below with reference to the accompanying drawings.

Embodiment 1. First, the first embodiment of the present invention will be described. FIG. 1 is a functional block diagram showing the configuration of a face image search device according to this embodiment. In the figure, 11 is an image capturing unit that captures an image; 12 is a face area detecting unit that detects a hair area and a skin color area from the captured face image; 13 is a face part extracting unit that extracts the positions of the eyes, mouth, nose, and so on from the skin color area; 14 is a feature amount extracting unit that extracts feature amounts such as the hairstyle and the shapes of the face parts; 15 is a database that registers and retains the information extracted by the face part extracting unit 13 and the feature amount extracting unit 14; and 16 is an evaluation processing unit that, when a search request is made, reads information from the database 15 and performs comparison and collation with the search target.

For example, the image capturing unit 11 is composed of a CCD camera, a video camera, or the like for capturing an image of the face. The face area detecting unit 12 is composed of an image input/output board, a workstation or personal computer, image processing software, and so on. The face part extracting unit 13 is composed of a workstation or personal computer, image processing software, and so on. The feature amount extracting unit 14 is composed of a workstation or personal computer, image processing software, and so on. The database 15 is composed of a recording device such as a magnetic disk device. The evaluation processing unit 16 is composed of a workstation or personal computer, database processing software, and so on. Depending on the processing amount of each unit and the capacity of the hardware, the processing units may be integrated and processed on the same hardware, or each may be processed on separate hardware.

Next, the two processing modes of the face image search device 10 of FIG. 1 will be described. The face image search device 10 has a registration mode, in which image information and feature amounts are extracted from a captured image and registered in the database, and an evaluation mode, in which the captured image is compared and collated with the registered information. In the registration mode, the image information and feature amounts are registered in the database 15 via the image capturing unit 11, the face area detecting unit 12, the face part extracting unit 13, and the feature amount extracting unit 14. In the evaluation mode, the image information and feature amounts to be evaluated are taken into the evaluation processing unit 16 via the image capturing unit 11, the face area detecting unit 12, the face part extracting unit 13, and the feature amount extracting unit 14, and are compared and collated with the registered information to evaluate the degree of similarity.

First, the processing procedure in the registration mode will be described. FIG. 2 is a flowchart showing the processing in the registration mode. Hereinafter, description will be given with reference to the drawings.

As shown in FIG. 2, a face image of a person is captured by the image capturing unit 11 to obtain the face image 31 shown in FIG. 6 (step S101). Next, the face area detecting unit 12 extracts pixels whose color is close to the color specified as skin color from the face image 31 to obtain the skin color area 32 (step S102). This process determines the range of skin color in a color space that separates skin color from other colors, using skin color information sampled in advance, and checks whether the color of each pixel falls within the determined range. Similarly, pixels whose color is close to the color specified for hair are extracted from the face image 31 to obtain the hair area 33 (step S103); this can be realized by the same kind of processing as the skin color extraction. These two processes yield the face area image shown in FIG. 7.
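The skin-color and hair extraction of steps S102 and S103 amounts to a per-pixel range test in a color space. A minimal sketch in Python/NumPy, assuming simple per-channel RGB ranges; the patent does not specify the color space or the thresholds, so the ranges below are illustrative placeholders only:

```python
import numpy as np

def extract_color_region(image, lo, hi):
    """Boolean mask of pixels whose color lies inside the per-channel
    range [lo, hi] sampled in advance for that region.

    image: (H, W, 3) uint8 array; lo, hi: length-3 sequences.
    """
    img = np.asarray(image)
    lo = np.asarray(lo)
    hi = np.asarray(hi)
    # A pixel belongs to the region only if every channel is in range.
    return np.all((img >= lo) & (img <= hi), axis=-1)

# Hypothetical ranges for a skin-color region and a hair region.
SKIN_LO, SKIN_HI = (120, 70, 60), (255, 180, 170)
HAIR_LO, HAIR_HI = (0, 0, 0), (80, 80, 80)

# Toy 4x4 frame: dark "hair" pixels in the top row, "skin" elsewhere.
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[0, :] = (40, 40, 40)
frame[1:, :] = (200, 120, 100)

skin_mask = extract_color_region(frame, SKIN_LO, SKIN_HI)
hair_mask = extract_color_region(frame, HAIR_LO, HAIR_HI)
```

The two masks together correspond to the face area image of FIG. 7, with each pixel classified as skin, hair, or neither.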

Next, the face part extracting unit 13 extracts the position information of the eyes and mouth from the skin color area 32 of the face area image shown in FIG. 7 obtained by the face area detecting unit 12, and extracts the position information of the nose from the region between the positions of both eyes and the mouth (step S104). Then, from the face image of FIG. 6 obtained by the image capturing unit 11 and the face area image of FIG. 7 obtained by the face area detecting unit 12, the feature amount extracting unit 14 extracts the vertical and horizontal luminance value histograms of the hair area, the horizontal luminance value histogram of the face part area, and the cut-out images of the nose and mouth (step S105).

Finally, the information on the skin color area extracted in step S102, the information on the hair area extracted in step S103, the information on the positions of the eyes, mouth, and nose extracted in step S104, and the information extracted in step S105 (the vertical and horizontal luminance value histograms of the hair area, the horizontal luminance value histogram of the face part area, and the images of the nose and mouth) are registered in the database 15 in association with the captured face image (step S106). This completes the series of processes for extracting the desired information from the captured face image and registering it in the database 15.

Next, the processing procedure in the evaluation mode will be described. FIG. 5 is a flowchart showing the processing in the evaluation mode. Hereinafter, description will be given with reference to the drawings.

As shown in FIG. 5, the steps of capturing a face image (step S401), detecting the skin color area (step S402), detecting the hair area (step S403), extracting the face parts (step S404), and extracting the feature amounts (step S405) are the same as the corresponding processes in the registration mode. That is, the image capturing unit 11 acquires the face image 31 shown in FIG. 6 (step S401); the face area detecting unit 12 extracts the skin color area 32 and the hair area 33 shown in FIG. 7 from the face image 31 (steps S402 and S403); the face part extracting unit 13 extracts the position information of the eyes, mouth, and nose from the skin color area 32 obtained by the face area detecting unit 12 (step S404); and the feature amount extracting unit 14 extracts, from the face image 31 of FIG. 6 obtained by the image capturing unit 11 or the face area image of FIG. 7 obtained by the face area detecting unit 12, the vertical and horizontal luminance value histograms of the hair area, the horizontal luminance value histogram of the face part area, and the cut-out images of the nose and mouth (step S405).

Next, the evaluation processing unit 16 performs correlation processing between the information obtained by this series of processes and the registered information read from the database 15, and calculates a correlation value. If other registered information exists, the correlation calculation is repeated for that information as well. From the set of correlation values thus obtained, the face image with the largest total correlation value is output. Alternatively, a sequence of candidate face images may be output according to some rule, for example in descending order of correlation value. This completes the series of processes for comparing and collating the information extracted from the captured face image with the information registered in the database 15.
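The evaluation step can be sketched as a normalized correlation between the captured feature vector and each registered one, with candidates ordered by descending correlation value. This is only an illustration: the patent does not specify the correlation formula, and the `rank_candidates` helper and the toy database entries below are hypothetical:

```python
import numpy as np

def correlation(a, b):
    """Normalized correlation coefficient between two feature vectors
    (e.g. concatenated hair and face-part histograms)."""
    a = np.asarray(a, dtype=float) 
    b = np.asarray(b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom else 0.0

def rank_candidates(query, database):
    """Return (name, score) pairs sorted by descending correlation."""
    scores = [(name, correlation(query, feat))
              for name, feat in database.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)

# Hypothetical registered feature vectors.
db = {
    "person_a": [1.0, 2.0, 3.0, 4.0],
    "person_b": [4.0, 3.0, 2.0, 1.0],
}
query = [1.1, 2.0, 2.9, 4.2]   # features from the captured image

ranking = rank_candidates(query, db)
best = ranking[0][0]
```

Outputting the full `ranking` list rather than only `best` corresponds to the candidate-sequence variant mentioned above.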

The overall processing flow thus consists of the two processes described above: the registration-mode process of extracting the desired information from a captured image and registering it in the database, and the evaluation-mode process of comparing and collating the information extracted from a captured image with the registered information.

Next, the processing common to the registration mode and the evaluation mode of the face image search device 10 shown in FIG. 1 will be described: the face part extraction process shown in FIG. 3 and the feature amount extraction process shown in FIG. 4.

FIG. 3 is a flowchart showing the face part extraction process, which corresponds to step S104 of FIG. 2 and step S404 of FIG. 5; it extracts the positions of the eyes, nose, and mouth from the face image obtained by the image capturing unit 11. First, within the skin color area 32 of FIG. 7, regions with luminance values lower than the skin color area 32 are searched for, and the positions 37 of both eyes are determined as shown in FIG. 10 (step S201). Next, a region with a luminance value lower than the skin color area 32 is searched for within a range horizontally centered between the eyes and vertically below them, and the mouth position 38 is determined as shown in FIG. 10 (step S202). Finally, the nose position is determined by the following three steps. (A) The nose area 34 shown in FIG. 8, which lies between both eyes and the mouth and roughly contains the entire nose, is specified (step S203). (B) Lateral edge enhancement is applied to the nose area 34 (step S204) to obtain the nose area lateral edge enhancement image 35 shown in FIG. 8. (C) The luminance values of the lateral edge enhancement image 35 are projected horizontally to obtain the nose area luminance value lateral projection waveform 36 shown in FIG. 9 (step S205), and the position with the smallest luminance value in this waveform is determined to be the nose position 39, as shown in FIG. 10 (step S206).
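The three-step nose search (steps S203 to S206) can be illustrated on a small synthetic grayscale patch. The vertical-difference filter used here for lateral edge enhancement and the minimum-of-projection criterion are one plausible reading of the description, not the patent's exact processing:

```python
import numpy as np

def nose_row(nose_region):
    """Locate the nose line inside a grayscale nose region.

    Enhance horizontal (lateral) edges with a simple vertical
    difference, project the result horizontally (sum each row), and
    take the row with the minimum projected value.
    """
    region = np.asarray(nose_region, dtype=float)
    # Vertical difference emphasizes horizontal edges such as the
    # dark shadow line under the nose.
    edges = region[1:, :] - region[:-1, :]
    projection = edges.sum(axis=1)          # one value per row
    # The most negative projection marks the bright-to-dark transition;
    # +1 maps the difference index back to the darker row.
    return int(np.argmin(projection)) + 1

# Synthetic 6x5 region: uniform brightness with a dark band at row 3,
# mimicking the shadow under the nostrils.
region = np.full((6, 5), 200.0)
region[3, :] = 80.0

row = nose_row(region)
```

On this toy patch the detected row coincides with the dark band, which plays the role of the nose position 39 in FIG. 10.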

FIG. 4 is a flowchart of the feature amount extraction process, which corresponds to step S105 of FIG. 2 and step S405 of FIG. 5. First, to obtain the hairstyle feature amounts, the luminance values of the hair area 33 detected in step S103 are added in the vertical direction (step S301) and in the horizontal direction (step S302), yielding the hair vertical histogram 40 and the hair horizontal histogram 41 shown in FIG. 11, respectively. Next, to obtain the feature amount of the arrangement of the face parts, the face part area 42 shown in FIG. 12, which includes the nose, is determined from the positions of both eyes and the mouth (step S303), and the luminance values of this area are added in the horizontal direction to obtain the face part horizontal histogram 43 (step S304). Finally, the nose image 44 (step S305) and the mouth image 45 (step S306) are obtained as the feature amounts of the shapes of the face parts shown in FIG. 13.

This process extracts, from the face image obtained by the image capturing unit 11 and the face area image obtained by the face area detecting unit 12, the vertical and horizontal histograms of the luminance values of the hair area, the histogram of the face part area, and the images of the nose and mouth.
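The histogram feature amounts of steps S301 to S304 are projections: luminance values summed along the rows and columns, restricted to the detected region. A small sketch of this idea; the masking approach and the toy data are assumptions for illustration:

```python
import numpy as np

def projection_histograms(luminance, mask=None):
    """Vertical and horizontal projections of a luminance image.

    Sums luminance down each column (vertical histogram) and across
    each row (horizontal histogram), optionally restricted to a
    boolean region mask such as the detected hair area.
    """
    lum = np.asarray(luminance, dtype=float)
    if mask is not None:
        lum = np.where(mask, lum, 0.0)  # zero out pixels outside the region
    vertical = lum.sum(axis=0)    # one value per column
    horizontal = lum.sum(axis=1)  # one value per row
    return vertical, horizontal

lum = np.arange(12, dtype=float).reshape(3, 4)  # toy 3x4 "image"
mask = np.zeros((3, 4), dtype=bool)
mask[0, :] = True                               # "hair" in the top row only

v, h = projection_histograms(lum, mask)
```

The pair `(v, h)` plays the role of the hair vertical histogram 40 and hair horizontal histogram 41; applying the same routine to the face part area without the mask gives the face part horizontal histogram 43.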

As described above, according to the first embodiment, the face image search device is configured as shown in FIG. 1: the face area of the person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, feature amounts of the face are extracted based on the extraction result, and the degree of similarity with the registered face images stored in the database is evaluated. Because the enormous amount of calculation is suppressed, face image search processing with a simple configuration and high speed can be realized.

Embodiment 2. FIG. 14 is a block diagram showing a face image search device 20 according to the second embodiment, in which an image display unit 21 is added to the face image search device 10 of FIG. 1. Using this image display unit 21, the user captures the image shown in FIG. 15 while aligning the eyes and mouth with the alignment positions 46. The face area detecting unit 12 can then sample skin color information from the area 47 between the eyes and mouth and use it to extract the skin color area, and the face part extracting unit 13 can determine the positions of the eyes and mouth by searching only around the alignment positions 46.

As described above, according to the second embodiment, the image display unit shown in FIG. 14 is added to the image search device of FIG. 1, and the face alignment position is displayed at the time of capturing in addition to the captured image, so that the positions of face parts such as the eyes and mouth can be determined easily.

[0046]

As described above, according to the present invention, the face area of a person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, feature amounts of the face are extracted based on the extraction result, and the degree of similarity with the registered face image information database is evaluated. Since the enormous amount of calculation can be suppressed, a face image search device that enables a simple configuration and high-speed processing is obtained.

According to the next invention, the face alignment position is displayed at the time of capturing in addition to the captured image, so a face image search device in which the positions of face parts such as the eyes and mouth can be determined easily is obtained.

According to the next invention, the histograms obtained by projecting the hair area in the detected face area in each of the vertical and horizontal directions are extracted as feature amounts; since the enormous amount of calculation is suppressed, a face image search device that enables a simple configuration and high-speed processing is obtained.

According to the next invention, images obtained by cutting out the extracted face parts are used as feature amounts; since the enormous amount of calculation is suppressed, a face image search device that enables a simple configuration and high-speed processing is obtained.

According to the next invention, the positional relationship of the extracted face parts is used as a feature amount; since the enormous amount of calculation is suppressed, a face image search device that enables a simple configuration and high-speed processing is obtained.

According to the next invention, the face area of a person is detected from the captured face image, face parts such as the eyes and mouth are extracted based on the detection result, feature amounts of the face are extracted based on the extraction result, and the degree of similarity with the registered face image information database is evaluated. Since the enormous amount of calculation can be suppressed, a face image search method that enables a simple configuration and high-speed processing is obtained.

According to the next invention, the face alignment position is displayed at the time of capturing in addition to the captured image, so a face image search method in which the positions of face parts such as the eyes and mouth can be determined easily is obtained.

According to the next invention, the histograms obtained by projecting the hair area in the detected face area in each of the vertical and horizontal directions are extracted as feature amounts; since the enormous amount of calculation is suppressed, a face image search method that enables a simple configuration and high-speed processing is obtained.

According to the next invention, images obtained by cutting out the extracted face parts are used as feature amounts; since the enormous amount of calculation is suppressed, a face image search method that enables a simple configuration and high-speed processing is obtained.

According to the next invention, the positional relationship of the extracted face parts is used as a feature amount; since the enormous amount of calculation is suppressed, a face image search method that enables a simple configuration and high-speed processing is obtained.

The program according to the next invention causes a computer to execute the method described in any one of the above inventions, so that the method can be carried out by a computer.

[Brief description of drawings]

FIG. 1 is a functional block diagram showing a configuration of a first embodiment of a face image search device according to the present invention.

FIG. 2 is a flowchart showing processing in a registration mode according to the first and second embodiments.

FIG. 3 is a flowchart showing face part extraction processing according to the first and second embodiments.

FIG. 4 is a flowchart showing a feature amount extraction process in the first and second embodiments.

FIG. 5 is a flowchart showing processing in an evaluation mode according to the first and second embodiments.

FIG. 6 is a diagram showing a face image captured by the image capturing section of FIG. 1.

FIG. 7 is a diagram showing a face area image in which a skin color area and a hair area have been detected from the face image shown in FIG. 6.

FIG. 8 is a diagram showing an image cut out from the image of FIG. 6 for extracting the nose position from the face area.

FIG. 9 is a diagram showing an image in which lateral edges are emphasized in a nose region and a result of horizontally projecting luminance values of the image.

FIG. 10 is a diagram showing the result of specifying the positions of the eyes, nose, and mouth from the image shown in FIG. 6.

FIG. 11 is a diagram showing the result of adding the luminance values of the hair area of the image shown in FIG. 6 in the vertical and horizontal directions.

FIG. 12 is a diagram showing a region extracted from the image shown in FIG. 6 for obtaining a feature amount of the arrangement of face parts and a result of horizontally adding luminance values of the region.

FIG. 13 is a diagram showing images of a nose and a mouth extracted from the image shown in FIG.

FIG. 14 is a functional block diagram showing the configuration of the second embodiment of the face image search device according to the present invention.

FIG. 15 is a diagram showing eye and mouth alignment positions and skin color information extraction positions during image capturing.
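FIGS. 9, 11, and 12 above all rely on luminance projection: emphasizing lateral (horizontal) edges and summing values row by row so that peaks mark rows containing facial structure. As a hedged sketch (not the patent's actual filter), a simple vertical-difference edge operator followed by row-wise summation could look like this in Python; the grayscale image is assumed to be a list of rows of integer luminance values.

```python
def horizontal_edge_projection(gray):
    """Emphasize lateral (horizontal) edges in a grayscale image with a
    simple vertical difference filter, then project the edge magnitudes
    horizontally (one sum per row of the filtered image).

    Peaks in the resulting waveform indicate rows with strong horizontal
    structure, such as the eyes, nostrils, and the mouth line.
    """
    projection = []
    for y in range(1, len(gray)):
        # Absolute vertical difference approximates a lateral-edge response.
        row = [abs(gray[y][x] - gray[y - 1][x]) for x in range(len(gray[y]))]
        projection.append(sum(row))
    return projection
```

The row with the largest projected value is a candidate for a horizontally oriented face part, which is the role the projection waveform plays in FIG. 9.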

[Explanation of symbols]

10 face image search device, 11 image capturing unit, 12 face region detecting unit, 13 face part extracting unit, 14 feature amount extracting unit, 15 database, 16 evaluation processing unit, 21 image display unit, 31 face image, 32 skin color region, 33 hair region, 34 nose region, 35 nose region horizontal edge enhancement image, 36 nose region luminance value horizontal projection waveform, 37 both eye positions, 38 mouth position, 39 nose position, 40 hair vertical histogram, 41 hair horizontal histogram, 42 face parts region, 43 face parts horizontal histogram, 44 nose image, 45 mouth image, 46 eye/mouth alignment position, 47 skin color information extraction position.

Continued from front page. (72) Inventor: Hiroshi Kage, c/o Mitsubishi Electric Corporation, 2-3, Marunouchi 2-chome, Chiyoda-ku, Tokyo. F-terms (reference): 5B057 DA11 DB02 DB06 DC16 DC19 DC22 DC25 DC34 DC36 DC39; 5B075 ND08 NK07 NK37 UU08; 5L096 AA02 AA06 CA02 FA06 FA34 FA36 FA69 GA38 HA09 JA03 JA11 KA15.

Claims (11)

[Claims]
1. A face image retrieving apparatus for retrieving, by comparing a captured face image with registered face images, the registered face image having the highest correlation with the captured face image, the apparatus comprising: image capturing means for capturing a face image; face area detecting means for detecting a face area of a person from the face image captured by the image capturing means; face part extracting means for extracting face parts such as eyes and a mouth from the face area detected by the face area detecting means; feature amount extracting means for extracting a face feature amount based on the result of the face part extracting means; information holding means for holding information based on a plurality of face images; and evaluation processing means for obtaining, based on the feature amount obtained by the feature amount extracting means, a degree of similarity between the captured face image and the registered face images held by the information holding means.
2. The face image retrieving apparatus according to claim 1, further comprising image display means for displaying the image captured by the image capturing means and for displaying a face alignment position at the time of image capture.
3. The face image retrieving apparatus according to claim 1 or 2, wherein the feature amount extracting means extracts, as a feature amount, waveforms obtained by projecting a hair area in the face area detected by the face area detecting means in each of the vertical and horizontal directions.
4. The face image retrieving apparatus wherein the feature amount extracting means extracts, as a feature amount, an image obtained by cutting out the face parts extracted by the face part extracting means.
5. The face image retrieving apparatus according to any one of claims 1 to 4, wherein the feature amount extracting means extracts, as a feature amount, the positional relationship of the face parts extracted by the face part extracting means.
6. A face image retrieving method for retrieving, by comparing a captured face image with registered face images, the registered face image having the highest correlation with the captured face image, the method comprising: an image capturing step of capturing a face image; a face area detecting step of detecting a face area of a person from the image captured in the image capturing step; a face part extracting step of extracting face parts such as eyes and a mouth from the face area detected in the face area detecting step; a feature amount extracting step of extracting a face feature amount based on the result of the face part extracting step; an information holding step of holding information based on a plurality of face images; and an evaluation processing step of obtaining, based on the feature amount obtained in the feature amount extracting step, a degree of similarity between the captured face image and the registered face images held in the information holding step.
7. The face image retrieving method according to claim 6, further comprising an image display step of displaying the image captured in the image capturing step and of displaying a face alignment position at the time of image capture.
8. The face image retrieving method according to claim 6 or 7, wherein the feature amount extracting step extracts, as a feature amount, waveforms obtained by projecting a hair area in the face area detected in the face area detecting step in each of the vertical and horizontal directions.
9. The face image retrieving method wherein the feature amount extracting step extracts, as a feature amount, an image obtained by cutting out the face parts extracted in the face part extracting step.
10. The face image retrieving method wherein the feature amount extracting step extracts, as a feature amount, the positional relationship of the face parts extracted in the face part extracting step.
11. A program for causing a computer to execute the method according to any one of claims 6 to 10.
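Claims 3 and 8 describe projecting the hair area in the vertical and horizontal directions and using the resulting waveforms as feature amounts. A minimal Python sketch of such projection histograms follows, assuming the hair area is available as a binary mask (list of rows of 0/1); the function names and the distance measure are illustrative assumptions, not taken from the specification.

```python
def projection_waveforms(mask):
    """Project a binary region mask (list of rows of 0/1 values) onto
    the vertical and horizontal axes.

    The horizontal waveform has one value per row (sum across columns);
    the vertical waveform has one value per column (sum across rows).
    Together they summarize the region's shape, e.g. a hairstyle.
    """
    horizontal = [sum(row) for row in mask]      # one value per row
    vertical = [sum(col) for col in zip(*mask)]  # one value per column
    return vertical, horizontal

def waveform_distance(w1, w2):
    """Sum of absolute differences between two equal-length waveforms;
    smaller values mean more similar region shapes."""
    return sum(abs(a - b) for a, b in zip(w1, w2))
```

Matching two such waveforms is linear in the image side length, which illustrates how waveform features keep the evaluation step lightweight compared with full-image correlation.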
JP2001379136A 2001-12-12 2001-12-12 Face image retrieving device, face image retrieving method and program for executing method on computer Pending JP2003178304A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2001379136A JP2003178304A (en) 2001-12-12 2001-12-12 Face image retrieving device, face image retrieving method and program for executing method on computer

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2001379136A JP2003178304A (en) 2001-12-12 2001-12-12 Face image retrieving device, face image retrieving method and program for executing method on computer

Publications (1)

Publication Number Publication Date
JP2003178304A true JP2003178304A (en) 2003-06-27

Family

ID=19186636

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2001379136A Pending JP2003178304A (en) 2001-12-12 2001-12-12 Face image retrieving device, face image retrieving method and program for executing method on computer

Country Status (1)

Country Link
JP (1) JP2003178304A (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7545983B2 * 2004-06-10 2009-06-09 Fujifilm Corporation Person image retrieval apparatus
US7869633B2 * 2004-06-10 2011-01-11 Fujifilm Corporation Person image retrieval apparatus
WO2006092957A1 * 2005-03-01 2006-09-08 Osaka Prefecture University Public Corporation Document/image searching method and program, and document/image recording and searching device
US8036497B2 2005-03-01 2011-10-11 Osaka Prefecture University Public Corporation Method, program and apparatus for storing document and/or image using invariant values calculated from feature points and method, program and apparatus for retrieving document based on stored document and/or image
US8314948B2 2007-10-03 2012-11-20 Canon Kabushiki Kaisha Image forming system utilizing network camera
JP2009169518A * 2008-01-11 2009-07-30 Kddi Corp Area identification apparatus and content identification apparatus
KR100922693B1 * 2008-03-12 2009-10-20 NHN Corporation System and method for searching person
JP2009237618A * 2008-03-25 2009-10-15 Seiko Epson Corp Detection of face area in image
US9117111B2 2009-06-16 2015-08-25 Canon Kabushiki Kaisha Pattern processing apparatus and method, and program
US9449215B2 2012-04-27 2016-09-20 Rakuten, Inc. Information processing apparatus, information processing method and information processing program
JP2013055668A * 2012-10-10 2013-03-21 Olympus Imaging Corp Image reproduction device and image reproduction method

Similar Documents

Publication Publication Date Title
US8457406B2 (en) Identifying descriptor for person and object in an image
Goswami et al. RGB-D face recognition with texture and attribute features
US20150023596A1 (en) Person clothing feature extraction device, person search device, and processing method thereof
Ma et al. Bicov: a novel image representation for person re-identification and face verification
Jain et al. Facial marks: Soft biometric for face recognition
Kose et al. Countermeasure for the protection of face recognition systems against mask attacks
Park et al. Face matching and retrieval using soft biometrics
US8903123B2 (en) Image processing device and image processing method for processing an image
US7003135B2 (en) System and method for rapidly tracking multiple faces
US7110581B2 (en) Wavelet-enhanced automated fingerprint identification system
Burge et al. Ear biometrics
DE60317053T2 (en) Method and device for displaying a group of pictures
Miao et al. A hierarchical multiscale and multiangle system for human face detection in a complex background using gravity-center template
JP4246154B2 (en) Biometric authentication method
US6404903B2 (en) System for identifying individuals
JP4168940B2 (en) Video display system
JP4505362B2 (en) Red-eye detection apparatus and method, and program
US5450504A (en) Method for finding a most likely matching of a target facial image in a data base of facial images
US7460705B2 (en) Head-top detecting method, head-top detecting system and a head-top detecting program for a human face
KR100731937B1 (en) Face meta-data creation
US9530045B2 (en) Method, system and non-transitory computer storage medium for face detection
JP5008269B2 (en) Information processing apparatus and information processing method
JP5202148B2 (en) Image processing apparatus, image processing method, and computer program
JP4166143B2 (en) Face position extraction method, program for causing computer to execute face position extraction method, and face position extraction apparatus
ES2348248T3 (en) Multibiometric system and procedure based on a single image.

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20041210

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070625

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070807

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20080422