WO2004081872A1 - Data analysis device and data recognition device - Google Patents
Data analysis device and data recognition device
- Publication number
- WO2004081872A1 (PCT/JP2004/002526)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frequency distribution
- difference
- arbitrary point
- frequency
- point
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/37—Determination of transform parameters for the alignment of images, i.e. image registration using transform domain methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/42—Global feature extraction by analysis of the whole pattern, e.g. using frequency domain transformations or autocorrelation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/169—Holistic features and representations, i.e. based on the facial image taken as a whole
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to a data analyzer for analyzing image data.
- the present invention also relates to a data recognition device using the data analysis device, and more particularly to a data recognition device effective as an image recognition device for a face or the like.
- a face recognition method based on a very simple and highly reliable vector quantization (VQ) algorithm has been proposed (see Non-Patent Document 1 below).
- Patent Documents 1 and 2 below disclose similar data recognition devices.
- Non-Patent Document 1: K. Kotani, C. Qiu, and T. Ohmi, "Face Recognition Using Vector Quantization Histogram Method," Proc. 2002 Int. Conf. on Image Processing, Vol. II of III, pp. II-105–II-108, 2002.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2000-101437.
- Patent Document 2 Japanese Patent Application Laid-Open No. 2002-203241
- in the face recognition method described above, a histogram generated from the usage frequency of each code vector obtained by VQ processing of a face image proved to be an extremely effective feature for identifying individuals. By applying appropriate filtering and VQ processing to face images, useful features for face recognition can be extracted. Experiments on the AT&T face database showed a recognition rate of 95.6%. On a 1.1 GHz personal computer, the processing time for one image is 194 ms.
- the VQ histogram method is much simpler and faster than earlier face recognition methods, but it is still not fast enough for high-speed data recognition such as recognition at video rate (moving images run at 30 frames per second, i.e., 33 milliseconds per frame). Disclosure of the invention
- the data analysis device and the data recognition device according to the present invention are as follows.
- a difference in luminance value between an arbitrary point on the image and a nearby point in a first direction is defined as the first luminance difference of the arbitrary point, and a difference in luminance value between the arbitrary point and a nearby point in a second direction different from the first direction is defined as the second luminance difference of the arbitrary point;
- difference calculating means for performing this calculation for a plurality of points on the image;
- a data analysis apparatus comprising frequency distribution generating means for generating the frequency distribution of the plurality of regions, using the number of vectors allocated to each of the plurality of regions as the frequency of that region.
- characteristic data may be generated by extracting the frequency distribution for at least a part of the regions from the frequency distribution of the plurality of regions generated by the frequency distribution generating means.
- the image processing apparatus performs a filtering process on the image before the difference calculation unit performs the calculation on the image.
- Data analyzer
- a difference in luminance value between an arbitrary point on the image and a nearby point in a first direction is defined as the first luminance difference of the arbitrary point, and a difference in luminance value between the arbitrary point and a nearby point in a second direction different from the first direction is defined as the second luminance difference of the arbitrary point; these differences are calculated for a plurality of points on the image.
- Frequency distribution generating means for generating the frequency distribution of the plurality of regions, with the number of vectors allocated to each of the plurality of regions as the frequency of the region,
- a data recognizing device comprising: comparing means for comparing a frequency distribution of one or more images generated by the frequency distribution generating means with a frequency distribution in the frequency distribution storage means.
- the comparing means may compare the frequency distribution relating to one or more of the images generated by the frequency distribution generating means with the frequency distributions in the frequency distribution storage means, and select the frequency distribution specified by a predetermined comparison function from among the frequency distributions stored in the frequency distribution storage means.
- the frequency distribution relating to one or more of the images generated by the frequency distribution generating means may be a frequency distribution for a partial region extracted from the frequency distribution of the plurality of regions generated by the frequency distribution generating means.
- a filtering process is performed on the image before the difference calculation unit performs the calculation on the image.
- filter means for performing filter processing on input image data; and difference calculating means for calculating, for a plurality of points on the screen of the image data filtered by the filter means, the luminance difference dlx in the X direction and the luminance difference dly in the y direction at each point;
- a vector composed of a luminance difference in the X direction and a luminance difference in the y direction obtained for each of the plurality of points on the screen by the difference calculation means is assigned to one region of the plurality of regions divided by a predetermined region dividing method.
- a data recognizing device comprising: frequency distribution storage means for storing information of at least one or more frequency distributions; and comparing means for comparing the frequency distribution relating to the input image data generated by the frequency distribution generating means with each frequency distribution stored in the frequency distribution storage means and selecting the frequency distribution specified by a predetermined comparison function.
- the luminance difference dlx in the X direction at any one point on the screen is calculated as the difference between the luminance value of that point and the luminance value of the point to its right (or left), and the luminance difference dly in the y direction is calculated as the difference between the luminance value of that point and the luminance value of the point below (or above) it.
- the vector composed of the luminance difference in the X direction and the luminance difference in the y direction obtained for each of the plurality of points on the screen by the difference calculation means is converted into one region of a plurality of regions divided by a predetermined region dividing method.
- Frequency distribution generating means for generating the frequency distribution of the plurality of regions, with the number of vectors allocated to each of the plurality of regions as the frequency of the region,
- a data recognizing device comprising: frequency distribution storage means for storing a plurality of sets of frequency distribution information of the plurality of regions; and comparing means for comparing the frequency distribution relating to the input image data generated by the frequency distribution generation means with the frequency distributions of each set stored in the frequency distribution storage means and selecting the set of frequency distributions specified by a predetermined comparison function.
- if, as a result of the comparison by the comparing means, no frequency distribution is selected by the predetermined comparison function from the frequency distribution storage means, the data recognition apparatus may further comprise frequency distribution registration means for registering the frequency distribution generated by the frequency distribution generation means in the frequency distribution storage means.
- a data analyzer that achieves a high processing speed and a data recognition device using the data analyzer can be obtained, and high-speed data recognition and instantaneous data recognition can be performed.
- FIG. 1 is a flowchart showing the recognition processing steps of the adjacent pixel luminance difference quantization (APIDQ) histogram method used in a data recognition device according to an embodiment of the present invention.
- FIG. 2 is a diagram used to explain the calculation of a difference in brightness between adjacent pixels according to an embodiment of the present invention.
- FIG. 3 is a diagram used to explain the operation of one embodiment of the present invention, and is a diagram showing a typical example of a (dlx, dly) vector distribution.
- FIG. 4 is a diagram used to explain the operation of one embodiment of the present invention, and is a diagram showing the r-θ plane and a quantization table.
- FIG. 5 is a diagram showing a sample of a typical face image from the AT&T database used in one embodiment of the present invention.
- FIG. 6 is a diagram used to explain the operation of one embodiment of the present invention, and is a diagram showing a typical example of a histogram.
- FIG. 7 is a diagram used to explain the effect of one embodiment of the present invention, and is a diagram showing the recognition success rate as a function of the filter size.
- FIG. 8 is a diagram used to explain the effect of one embodiment of the present invention, and is a diagram showing recognition results obtained by using a large number of filters.
- the present inventors have developed a new and very simple method called the Adjacent Pixel Intensity Difference Quantization (APIDQ) histogram method that enables high-speed data recognition (for example, video-rate face recognition).
- FIG. 1 shows the processing steps of the adjacent pixel luminance difference quantization (APIDQ) histogram method used in a data recognition apparatus according to an embodiment of the present invention.
- the face image first undergoes low-pass filtering (step S1), which will be described later, and then the adjacent pixel luminance differences are calculated (step S2).
- for each pixel position in the input image, a two-dimensional vector (that is, a luminance change vector composed of the luminance difference (dlx) from the horizontally adjacent pixel and the luminance difference (dly) from the vertically adjacent pixel) is calculated.
- the two-dimensional vector (luminance change vector composed of dlx and dly) at each pixel position in the input image contains information on the angle (θ) and amount (r) of the luminance change.
- coordinate transformation to the r-θ system is then performed (step S3).
- each vector is quantized with respect to its θ and r values (step S4).
- a histogram can be generated by counting the number of elements included in each quantized region on the r-θ plane (step S5). This histogram obtained by APIDQ for the face image is used as a very effective personal feature.
- the Adjacent Pixel Intensity Difference Quantization (APIDQ) histogram method shown in Figure 1 is very similar to the VQ histogram method except for the feature extraction procedure.
- the former (VQ histogram) method uses VQ processing for that purpose.
- the VQ histogram method uses a very basic codebook composed of 33 regular code vectors, and applies VQ processing to luminance-change image blocks from which the DC component has been removed. These are the essence of the VQ histogram method, but the process simply detects and quantizes the direction and amount of luminance change in each block.
- APIDQ can perform the same processing more easily.
- the luminance difference (dlx) from the horizontally adjacent pixel and the luminance difference (dly) from the vertically adjacent pixel are first calculated using the following simple subtraction operations.
- dlx(i, j) = I(i+1, j) - I(i, j)
- dly(i, j) = I(i, j+1) - I(i, j)
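As an illustrative sketch (not the patent's implementation), this subtraction can be vectorized with numpy; the function name and array layout are assumptions:

```python
import numpy as np

def adjacent_pixel_differences(image):
    """Compute adjacent-pixel luminance differences dlx and dly.

    image: 2D array of luminance values I, indexed as I[row, col].
    dlx is the difference with the horizontally adjacent pixel,
    dly is the difference with the vertically adjacent pixel.
    """
    I = np.asarray(image, dtype=np.int32)
    dlx = I[:, 1:] - I[:, :-1]  # I(i+1, j) - I(i, j) along each row
    dly = I[1:, :] - I[:-1, :]  # I(i, j+1) - I(i, j) along each column
    return dlx, dly
```

Each resulting (dlx, dly) pair forms one luminance change vector on the dlx-dly plane.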
- the calculated dlx, dly pair represents a single vector (luminance change vector) starting at (0,0) in the dlx-dly plane.
- the end point of the luminance change vector is distributed on the dlx-dly plane as shown in Fig.3.
- the distribution (density and shape) of the end point of the luminance change vector represents the characteristics of the input face image.
- each luminance change vector is quantized in the r-θ plane.
- an example of the quantization table is shown in the lower part of Figure 4. Numbers 0 to 49 in the quantization table represent the index numbers of the 0th to 49th quantization regions, respectively.
- θ of the luminance change vector shown in the upper part of FIG. 4 lies in the region between π/8 and 3π/8.
- r of the luminance change vector lies in the third region from the inside (in FIG. 4, corresponding to r between 2 and 4). Therefore, the luminance change vector shown in the upper part of FIG. 4 is quantized into the quantization region with index 10, based on the quantization table in the lower part of FIG. 4.
- the number of vectors quantized in each quantization area is counted.
- the counts are displayed as bars in a histogram (shown later in FIG. 6) whose horizontal axis is the quantization-region index 0 to 49.
- the vector shown in the upper part of FIG. 4 constitutes one frequency of the index 10 in the histogram.
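Steps S3 to S5 can be sketched as follows. The sector and ring boundaries below are illustrative assumptions: this layout yields 48 regions (8 angular sectors × 6 rings), whereas the actual quantization table in FIG. 4 defines 50 regions whose exact boundaries are not given in this text:

```python
import math
from collections import Counter

R_BOUNDS = [1, 2, 4, 8, 16]  # assumed ring boundaries for r
N_SECTORS = 8                # assumed number of angular sectors for theta

def quantize_vector(dlx, dly):
    """Map one luminance change vector (dlx, dly) to a region index (steps S3-S4)."""
    r = math.hypot(dlx, dly)
    theta = math.atan2(dly, dlx)  # -pi .. pi
    sector = int((theta + math.pi) / (2 * math.pi) * N_SECTORS) % N_SECTORS
    ring = sum(r >= b for b in R_BOUNDS)  # 0 .. len(R_BOUNDS)
    return ring * N_SECTORS + sector

def apidq_histogram(vectors):
    """Count the vectors falling in each quantization region (step S5)."""
    counts = Counter(quantize_vector(dlx, dly) for dlx, dly in vectors)
    return [counts[i] for i in range((len(R_BOUNDS) + 1) * N_SECTORS)]
```

In the actual embodiment the mapping is given by the lookup table of FIG. 4 rather than computed boundaries, but the structure — quantize θ and r, then count per region — is the same.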
- This histogram becomes the feature vector of the human face.
- this histogram is stored in the database 10 as personal identification information.
- a histogram is created from the unknown input face image and compared with the registered personal histograms, and the best match is output as the recognition result of the database matching (step S8).
- the Manhattan distance (MD) between the histograms is used as an example of the degree of matching.
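The database matching of step S8 can be sketched as follows; the function and variable names are illustrative:

```python
def manhattan_distance(h1, h2):
    """Manhattan distance (MD): sum of absolute bin-wise histogram differences."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def best_match(query_hist, registered):
    """Return the registered identity whose histogram is closest to the query.

    registered: dict mapping identity -> histogram (equal-length sequences).
    """
    return min(registered,
               key=lambda name: manhattan_distance(query_hist, registered[name]))
```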
- note that in step S1 of FIG. 1, low-pass filtering is performed first, before APIDQ, using a simple two-dimensional moving-average filter. This low-pass filtering is essential for reducing high-frequency noise and extracting the low-frequency components most effective for recognition. Since the recognition algorithm is very simple and the developed facial feature extraction method is completely different from conventional recognition methods, it is not only usable on its own but can also be combined with conventional methods easily and very effectively, at minimal additional cost and with increased recognition accuracy. Next, the results of a face recognition experiment using the present invention will be described.
- a publicly available AT&T face database was used for the recognition experiments.
- each image has a resolution of 92×112 pixels.
- Figure 6 shows a typical example of a histogram.
- histograms of different persons are clearly different.
- the histograms of different images of the same person are often similar, with only minor differences in detail. It can be said that the histogram obtained by APIDQ is a very effective individual feature for identifying a person.
- Figure 7 shows the recognition results.
- the recognition rate is shown as a function of the filter size.
- the filter size indicates the size of the averaging filter core.
- F3, for example, represents a 3×3 filter core.
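A minimal sketch of the two-dimensional moving-average (box) filter used in step S1; border handling by edge replication is an assumption, since this text does not specify the border treatment:

```python
import numpy as np

def moving_average_filter(image, size=3):
    """Apply a size x size moving-average low-pass filter (size should be odd)."""
    I = np.asarray(image, dtype=np.float64)
    pad = size // 2
    padded = np.pad(I, pad, mode="edge")  # replicate border pixels
    out = np.zeros_like(I)
    # Sum the size*size shifted windows, then normalize.
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + I.shape[0], dx:dx + I.shape[1]]
    return out / (size * size)
```

Under this naming, F3 corresponds to `moving_average_filter(image, 3)`.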
- the recognition rate is almost constant for the filter sizes F3 to F19, giving the highest average recognition rate of 95.7%. This is almost the same as the VQ histogram method (95.6%) under the same conditions.
- detailed facial features that degrade recognition performance are excluded by applying the low-pass filter.
- the APIDQ process can effectively exclude the DC component of pixel luminance, which changes depending on the lighting conditions. By combining these two effects, the information most important for face recognition can be effectively extracted.
- low-pass filtering is very effective in extracting facial features using APIDQ. It can be expected that different features can be extracted by using filters of different sizes. Therefore, stronger personal feature information can be obtained by combining multiple recognition results obtained with multiple filter sizes.
- FIG. 8 shows the recognition results obtained by using multiple filters.
- F3, F5, F17, and F23 represent the filter sizes of 3×3, 5×5, 17×17, and 23×23, respectively.
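One simple way to combine results from multiple filter sizes is to sum the histogram distances per candidate and pick the smallest total. This combination rule is an illustrative assumption: this text does not state how the results in FIG. 8 were actually combined.

```python
def combined_match(query_hists, registered):
    """Pick the identity with the smallest Manhattan distance summed over filter sizes.

    query_hists: dict mapping filter size -> histogram of the query image.
    registered: dict mapping identity -> {filter size -> histogram}.
    The summation rule is an assumption, not necessarily the patent's method.
    """
    def total_distance(name):
        return sum(
            sum(abs(a - b) for a, b in zip(query_hists[size], registered[name][size]))
            for size in query_hists
        )
    return min(registered, key=total_distance)
```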
- the recognition algorithm was programmed in ANSI C and executed on a PC (AMD Athlon, 1.1 GHz). Quantization in the r-θ coordinates is performed via simple conditional branches ("if" statements). The processing time for one image in the AT&T database is 37 ms (15 ms for low-pass filtering, 7 ms for APIDQ processing, and 15 ms for database matching). Compared with the VQ processing time of the VQ histogram method, the processing time of facial feature extraction was reduced from 164 ms (VQ) to 7 ms (APIDQ). A significant reduction in processing time has been achieved.
- the present invention was able to provide a very fast and highly reliable face recognition method called the APIDQ histogram method.
- the face recognition method is based on appropriate filtering, quantization of the direction and amount of luminance change, and histogram generation and analysis. Significant face recognition performance, with a recognition rate of 95.7%, was confirmed using the publicly available AT&T face database.
- the total recognition processing time is only 31 ms, enabling face recognition at video rates.
- the data recognition device includes the following data analyzer 100.
- a difference in luminance value between an arbitrary point on the image and a nearby point in a first direction is defined as the first luminance difference (dlx) of the arbitrary point, and a difference in luminance value between the arbitrary point and a nearby point in a second direction different from the first direction (for example, orthogonal to the first direction) is defined as the second luminance difference (dly) of the arbitrary point.
- a vector (luminance change vector) composed of the first luminance difference and the second luminance difference, obtained for each of the plurality of points on the image by the difference calculating means, is assigned to one of a plurality of areas divided by a predetermined area dividing method (the areas represented by indices 0 to 49 of the quantization table in FIG. 4), and frequency distribution generating means (S3 to S5 in FIG. 1) generates the frequency distribution of the plurality of areas, using the number of vectors assigned to each area as the frequency of that area.
- the predetermined area dividing method is not limited to the method used in the above embodiment, in which the luminance change vector is assigned to one of a plurality of areas on the r-θ plane by coordinate conversion to the r-θ system.
- another area division method may be used as the predetermined area division method.
- the difference calculating means may calculate, for a plurality of points on the screen, the luminance difference dlx of an arbitrary point in the X direction as the difference between the luminance value of that point and the luminance value of the point to its right (or left), and the luminance difference dly in the y direction as the difference between the luminance value of that point and the luminance value of the point below (or above) it. Note that, in the data analyzer 100, feature data may be generated by extracting the frequency distribution for at least a part of the regions from the frequency distribution of the plurality of regions generated by the frequency distribution generating unit.
- the data recognition device can be considered to include the following means in addition to the data analysis device 100 described above.
- Frequency distribution storage means (data base 10 in FIG. 1) for storing information of one or more frequency distributions for at least one or more images;
- a comparison unit (S8 in FIG. 1) for comparing the frequency distribution of one or more of the images generated by the frequency distribution generation unit with the frequency distribution in the frequency distribution storage unit is provided.
- the comparing means compares the frequency distribution of the one or more images generated by the frequency distribution generating means with the frequency distributions in the frequency distribution storage means, and selects the frequency distribution specified by a predetermined comparison function from the frequency distributions stored there.
- the frequency distribution for one or more of the images generated by the frequency distribution generating means may be a frequency distribution for a partial area extracted from the frequency distribution of the plurality of areas generated by the frequency distribution generating means.
- the data recognition device may include frequency distribution registration means (S6 in FIG. 1) for registering the frequency distribution generated by the frequency distribution generation means in the frequency distribution storage means.
- the data recognition device may further include filter means (S1 in FIG. 1) for performing a filtering process on the input image data, and the difference calculation means may calculate the differences for the image data filtered by the filter means.
- this filter means (S1 in FIG. 1) is not limited to the low-pass filter used in the above embodiment; another filter may be used. Alternatively, a plurality of filter means for performing a plurality of filtering processes on the input image data may be provided, and the difference calculating means may calculate the differences for the image data filtered by each of the plurality of filter means.
- the data recognition device may include frequency distribution storage means for storing a plurality of sets of frequency distribution information for the plurality of regions, and comparing means for comparing the frequency distribution of the input image data generated by the frequency distribution generating means with the frequency distributions of each set in the frequency distribution storage means and selecting the set of frequency distributions specified by a predetermined comparison function.
- the present invention is not limited to the application to face recognition described in the above embodiment, and may be applied to high-speed data recognition of general images and other large amounts of data.
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
Abstract
Description
Claims
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/548,760 US7436999B2 (en) | 2003-03-11 | 2004-03-02 | Data analysis device and data recognition device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003-065517 | 2003-03-11 | ||
JP2003065517A JP4439829B2 (ja) | 2003-03-11 | 2003-03-11 | Data analysis device and data recognition device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004081872A1 true WO2004081872A1 (ja) | 2004-09-23 |
Family
ID=32984497
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/002526 WO2004081872A1 (ja) | Data analysis device and data recognition device | 2003-03-11 | 2004-03-02 |
Country Status (3)
Country | Link |
---|---|
US (1) | US7436999B2 (ja) |
JP (1) | JP4439829B2 (ja) |
WO (1) | WO2004081872A1 (ja) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN100573541C (zh) * | 2005-04-15 | 2009-12-23 | Intelligent Virus Imaging Inc. | Method of analyzing cell structures and their components |
US8712140B2 (en) * | 2005-04-15 | 2014-04-29 | Intelligent Virus Imaging Inc. | Method of analyzing cell structures and their components |
US8238677B2 (en) * | 2008-03-07 | 2012-08-07 | International Business Machines Corporation | Adaptive lossless data compression method for compression of color image data |
US8260076B1 (en) * | 2009-03-31 | 2012-09-04 | Hewlett-Packard Development Company, L.P. | Constant time filtering |
TWI384406B (zh) * | 2009-05-26 | 2013-02-01 | Univ Nat Chiao Tung | 人臉辨識與合成方法 |
US8818714B2 (en) * | 2010-12-10 | 2014-08-26 | Sony Corporation | Portable navigation device and method with active elements |
JP2013029953A (ja) * | 2011-07-28 | 2013-02-07 | Sony Corp | Image processing apparatus and method |
DE102015120967A1 * | 2015-12-02 | 2017-06-08 | Carl Zeiss Ag | Method and device for image correction |
DE102017112484A1 | 2017-06-07 | 2018-12-13 | Carl Zeiss Ag | Method and device for image correction |
EP3798592A1 (en) * | 2019-09-27 | 2021-03-31 | Koninklijke Philips N.V. | Multi-/hyperspectral two-dimensional image processing |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0799581A (ja) * | 1992-09-25 | 1995-04-11 | Olympus Optical Co Ltd | Image processing device |
US5668898A (en) * | 1993-07-23 | 1997-09-16 | Olympus Optical Co., Ltd. | Device for detecting the inclination of image |
JP2000101437A (ja) * | 1998-04-17 | 2000-04-07 | Tadahiro Omi | Data analysis device and method using a codebook system, data recognition device and method, and recording medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3373008B2 (ja) * | 1993-10-20 | 2003-02-04 | Olympus Optical Co., Ltd. | Image area separation device |
EP0657839B1 (en) * | 1993-12-10 | 2002-02-20 | Ricoh Company, Ltd | Methods and apparatus for recognizing a specific image from an input image signal |
JP2002203241A (ja) | 2000-12-28 | 2002-07-19 | Tadahiro Omi | Data recognition device, data recognition method, and recording medium using a codebook system |
US6865295B2 (en) * | 2001-05-11 | 2005-03-08 | Koninklijke Philips Electronics N.V. | Palette-based histogram matching with recursive histogram vector generation |
US7162076B2 (en) * | 2003-02-11 | 2007-01-09 | New Jersey Institute Of Technology | Face detection method and apparatus |
-
2003
- 2003-03-11 JP JP2003065517A patent/JP4439829B2/ja not_active Expired - Fee Related
-
2004
- 2004-03-02 US US10/548,760 patent/US7436999B2/en not_active Expired - Fee Related
- 2004-03-02 WO PCT/JP2004/002526 patent/WO2004081872A1/ja active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JP2004272801A (ja) | 2004-09-30 |
JP4439829B2 (ja) | 2010-03-24 |
US7436999B2 (en) | 2008-10-14 |
US20060088212A1 (en) | 2006-04-27 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
ENP | Entry into the national phase |
Ref document number: 2006088212 Country of ref document: US Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10548760 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10548760 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |