WO2005070301A1 - Gender identification method - Google Patents
Gender identification method
- Publication number
- WO2005070301A1 (PCT/JP2005/001035)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gender
- temperature
- person
- discrimination
- extracted
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/01—Measuring temperature of body parts ; Diagnostic temperature sensing, e.g. for malignant or inflamed tissue
- A61B5/015—By temperature mapping of body part
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
- A61B5/1171—Identification of persons based on the shapes or appearances of their bodies or parts thereof
- A61B5/1176—Recognition of faces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/70—Multimodal biometrics, e.g. combining information from different biometric modalities
Definitions
- The present invention relates to a method for distinguishing between male and female humans.
- Japanese Patent Laid-Open Publication No. 2001-218020 discloses an image processing method in which gender is discriminated from the appearance of a person photographed by a camera, and the image data is then processed to improve the subject's appearance according to the discrimination result.
- It also discloses systems in which gender discrimination devices are installed at the entrances of halls and stores, and the number of visitors of each gender is counted to acquire marketing data.
- In these conventional methods, a person is photographed with a television camera or a digital camera, and portions of interest, such as the face, hands, shoes, and belongings, are extracted from the resulting image data.
- Gender discrimination is performed by analyzing their content. Specifically, the criteria used include the amount of hair, the duration of arm-crossing, the size of the back, the length of time the hands are kept in the pockets, the state of facial makeup, and the use of lipstick.
- However, since these conventional gender discrimination methods rely on features that can be changed at will, such as the appearance and gestures of the person being discriminated, the discrimination may be uncertain.
- A gender discrimination method based on voice data is also known, but there the discrimination can be made uncertain by a disguised voice; moreover, the subject must be made to speak, so the range of use is severely restricted.
- The present invention therefore utilizes biological characteristics that cannot be changed by human will; specifically, it utilizes the fact that the heat generation temperature of the human face and hands and/or its distribution differ between men and women.
- The present invention was obtained by examining these gender differences in the heat generation temperature of the face and hands and in its distribution. Medically, it is based on the following findings: the basal metabolic rate is higher in males and lower in females; because women secrete the female hormone estrogen, they generally have a higher percentage of body fat than men, with more fat near the body surface; and since fat has a lower thermal conductivity than muscle, it blocks the radiation of body heat, so that women tend to have a lower body surface temperature than men. Since this feature appears in exposed areas that are not covered by clothing, it is convenient to discriminate between males and females based on the heat generation temperature of the face and hands and its distribution.
- The inventor first quantitatively examined the influence of the ambient temperature on the face temperature. Three men and three women of the same age were selected as subjects. In a room kept at a constant ambient temperature, the subjects' cheeks and chins were subjected to temperature changes using a pocket body warmer and a cooling pack, and data on the temporal changes in face temperature were collected. Specifically, the warmer was first applied to each subject's cheek and chin for a fixed period (5 minutes); the warmer was then removed, and the changes over time in the cheek and chin temperatures were measured, first after 5 seconds and thereafter every 15 seconds until 125 seconds had elapsed. The same measurements were then performed using the cooling pack instead of the warmer.
- In Fig. 2, the vertical axis represents the value obtained by dividing the average cheek temperature of the extracted part of each subject by the average chin temperature (hereinafter referred to as [cheek/chin]).
- The horizontal axis in Fig. 2 represents the elapsed time.
- Fig. 2 thus shows how this value changes with the passage of time, that is, with a change in temperature. The chin and the cheek are exposed to the same ambient temperature and therefore undergo the same tendency of temperature change; as a result, the influence of the ambient temperature can be eliminated by focusing on the ratio of cheek data to chin data.
- Fig. 2 shows that the value of this ratio is almost constant as time elapses after warming, that is, even as the temperature changes, so the ratio is not affected by the ambient temperature. In effect, Fig. 2 shows the cheek temperature normalized by the chin temperature. The following can be seen from the results shown in Fig. 2:
- This value is not affected by changes in ambient temperature, yet it differs between men and women. It is therefore a feature for gender discrimination that is robust to changes in the ambient temperature.
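- The normalization above can be written down compactly. The following is a minimal sketch, not taken from the patent itself: it assumes the infrared frame is available as a 2-D NumPy array of per-pixel temperatures (or gradation values) and that the cheek and chin extraction templates are boolean masks of the same shape; the function and variable names are illustrative.

```python
import numpy as np

def cheek_chin_ratio(thermal_img: np.ndarray,
                     cheek_mask: np.ndarray,
                     chin_mask: np.ndarray) -> float:
    """Return the [cheek/chin] value of Fig. 2.

    Both template regions see the same ambient temperature, so
    dividing the average cheek temperature by the average chin
    temperature cancels the ambient influence.
    """
    cheek_avg = float(thermal_img[cheek_mask].mean())
    chin_avg = float(thermal_img[chin_mask].mean())
    return cheek_avg / chin_avg
```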
- Next, the temperature of each subject's cheek was extracted using the cheek temperature extraction template shown in Fig. 3, and how the temperature of the extracted part is affected by the ambient temperature was examined.
- The two men and two women selected as subjects were allowed to stand in a room at an ambient temperature of 24 °C for about 10 minutes; the air-conditioning temperature was then raised in 1 °C steps to 27 °C, and the cheek temperature of each subject was collected at each of these temperatures.
- Figure 4 shows the data of one male among the data collected in this way.
- Figure 5 shows an example of the cheek temperature distributions of a male and a female. From Fig. 5 it can be seen that the average cheek temperature is higher for the male, and that the temperature (gradation value) distribution is narrow for the male and wide, extending into the lower range, for the female. This difference in variance between men and women is considered to appear because fat is distributed over a wide area in women, especially in the cheeks.
- The following equation (1) gives the average value, equation (2) the ordinary variance, and equation (3) the enhanced variance value:

$$\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i \tag{1}$$

$$\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^2 \tag{2}$$

$$V_e = \frac{1}{n}\sum_{i=1}^{n}\left(X_i - \bar{X}\right)^4 \tag{3}$$

- Equation (3), used to highlight the difference, squares once more the squared term of the generally defined variance equation (2).
- Here, n is the number of pixels in the extracted portion, and $X_i$ is the gradation value of each pixel in the extracted portion.
- The value obtained by equation (3) is referred to as the enhanced variance value. Comparison of Figs. 5, 6, and 7 shows that this enhanced variance value increases the distance between men and women and thus facilitates gender discrimination.
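- For concreteness, equations (1) to (3) can be sketched as follows, assuming the extracted portion is given as a 1-D array of the n pixel gradation values; the function names are illustrative and the 1/n averaging in equation (3) is assumed to follow the form of equation (2).

```python
import numpy as np

def variance(x: np.ndarray) -> float:
    """Equation (2): ordinary variance of the pixel gradation values."""
    x_bar = x.mean()                        # equation (1), the average value
    return float(np.mean((x - x_bar) ** 2))

def enhanced_variance(x: np.ndarray) -> float:
    """Equation (3): the squared deviation of equation (2) is squared
    once more (a fourth power), widening the male/female separation."""
    x_bar = x.mean()
    return float(np.mean((x - x_bar) ** 4))
```

- Because the deviation enters at the fourth power, the wide low-temperature tail of the female histogram in Fig. 5 contributes far more strongly than the concentrated male distribution, which is why the male/female distance grows from Fig. 6 to Fig. 7.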
- Fig. 5 shows the original histograms.
- Figs. 6 and 7 show, for each $X_i$, the corresponding squared terms $(X_i - \bar{X})^2$ of the variance and fourth-power terms $(X_i - \bar{X})^4$ of the enhanced variance, respectively. Comparing Fig. 6 with Fig. 7, it can be seen that the distance between males and females is clearly larger with the enhanced variance value in Fig. 7 than with the variance value in Fig. 6.
- In short, the temperature distribution range is narrow for men and wide for women, because male fat is concentrated whereas female fat is distributed over a wide area. Since this distribution state is a biological characteristic, it is hardly affected by the ambient temperature. Gender discrimination can therefore be performed using this difference in distribution, for example via the enhanced variance value described above.
- As described above, the present invention rests on medical grounds and utilizes biological characteristics that cannot be changed by human will, so it can realize a gender discrimination method more advantageous than conventional methods. If only the enhanced variance value of the cheek temperature were used, however, the identification might occasionally be erroneous. To enhance reliability, it is therefore preferable to form three combinations, each combining two of the features 1 to 5, to perform statistical discrimination (for example, by Mahalanobis distance) on each combination, and to take the majority of the three results as the identification result.
- FIG. 1 is a diagram showing a chin temperature extraction template and a cheek temperature extraction template.
- FIG. 2 is a graph showing the influence of the ambient temperature on the value obtained by dividing the average temperature in the cheek temperature extraction template by the average temperature in the chin temperature extraction template, i.e., on the normalized cheek temperature.
- FIG. 3 is a diagram showing a cheek temperature extraction template for the enhanced variance value.
- FIG. 4 is a graph showing the influence of the ambient temperature on the variance value of the temperature in the cheek temperature extraction template for the enhanced variance value, for a male subject.
- FIG. 5 is a diagram showing male and female histograms of the temperature in the cheek temperature extraction template.
- FIG. 6 is a graph showing the difference between the male and female variance values of the cheek temperature.
- FIG. 7 is a graph showing the difference between the male and female enhanced variance values of the cheek temperature.
- FIG. 8 is a block diagram showing an embodiment of the present invention.
- FIG. 9A is a partial flowchart showing an image data processing program executed in the image data processing device of FIG. 8.
- FIG. 9B is a partial flowchart showing an image data processing program executed in the image data processing device of FIG. 8.
- FIG. 10 is a diagram showing a cheek temperature extraction template from which the influence of eyeglasses has been removed.
- FIG. 11 is a flowchart showing the main part of an image data processing program that combines the temperature of the hand with the temperature of the face.
- FIG. 12 is a diagram showing a sensor for extracting the temperature of the hand.
- FIG. 13 is a flowchart showing the main part of an image data processing program using only the palm temperature.
- FIG. 14 is a flowchart showing the main part of an image data processing program combining the black level of the beard under the nose with the temperature of the face.
- FIG. 15 is a diagram showing a template for extracting the temperature of the hand.
- FIG. 8 shows an embodiment of the gender discriminating apparatus according to the present invention.
- The gender identification device 1 shown in Fig. 8 is installed at the entrance of a building (not shown); it identifies whether a person P entering the building is a man or a woman and presents a display corresponding to the identification result to the person P who is about to enter.
- The gender identification device 1 has a television camera 2A for taking infrared images, installed at the entrance so that at least the face of the person P about to enter can be photographed.
- The image pickup signal VDA from the television camera 2A is sent to the image data processing device 3, where it is subjected to the male/female discrimination processing described later; it is also sent to the monitor device 4, so that the situation at the entrance can be grasped at another location (for example, a security room).
- Here the imaging signal VDA is sent to the monitor device 4 via the image data processing device 3, but it may instead be sent directly to the monitor device 4.
- Reference numeral 5 denotes a display comprising a liquid crystal panel for presenting messages to the person P.
- Reference numeral 6 denotes a push button switch that must be operated when the person P tries to enter.
- The image data processing device 3 is configured as a microcomputer system of known configuration, comprising a central processing unit (CPU) 3A, a memory 3B, and an input/output interface (I/F) 3C connected by a bus 3D.
- The central processing unit (CPU) 3A contains a memory 3E in which an image data processing program with the gender discrimination function described later is stored, and the imaging signal VDA from the television camera 2A is processed according to this image processing program.
- FIGS. 9A and 9B are flowcharts showing the image data processing program stored in the memory 3E.
- The image data processing program 10 is executed each time the push button switch 6 is pressed.
- In step 11, an infrared person image capturing process is performed to capture the imaging signal VDA.
- In step 12, infrared face image extraction processing is performed to extract the face image of the person P from the data captured in step 11.
- In step 13, it is determined whether or not the person P is wearing eyeglasses, based on the infrared face image extracted in step 12. If it is determined that eyeglasses are worn, the result of step 13 is YES and the process proceeds to step 14, where the cheek temperature extraction template for eyeglass wearers shown in FIG. 10 (from which the influence of the eyeglasses has been removed) is prepared, and the process then proceeds to step 15. If it is determined in step 13 that the person is not wearing eyeglasses, the result of step 13 is NO and the process proceeds directly to step 15; the normal templates, used when no eyeglasses are worn, are prepared for this case in advance. In step 15, one of the templates is selected according to the presence or absence of eyeglasses, and the temperatures of the cheek and the chin of the person P are extracted.
- In step 16, the average temperatures of the cheek and the chin extracted in step 15 are calculated, and in the next step 17, the ratio of the average cheek temperature to the average chin temperature (cheek average temperature / chin average temperature) is calculated.
- In step 18, the enhanced variance value of the cheek temperature extracted in step 15 is calculated according to equation (3).
- In step 19, the values calculated in steps 17 and 18 are mapped as a point on a two-dimensional plane.
- In step 20, gender is determined by applying a discrimination method based on the Mahalanobis distance, using a classification curve obtained in advance that separates the plane into the two classes (male and female); this gives classification result (a).
- In step 21, the cheek average temperature / chin average temperature ratio is compared with a gender classification reference value obtained in advance to determine gender, giving classification result (b).
- In step 22, gender classification (c) is performed by comparing the enhanced variance value of the cheek calculated in step 18 with a previously determined gender classification reference value. For the input infrared face image of the person P to be identified, the three classification results (a), (b), and (c) obtained so far are collected in step 23.
- In step 24, gender discrimination is performed according to whether two or more of the three classification results (a), (b), and (c) match, and the outcome is evaluated in step 25. If the result of step 24 is male, the determination in step 25 is YES and the process proceeds to step 26, where the message "Please go to 2F" is displayed on the display 5, and execution of the image data processing program 10 ends. If the result of step 24 is female, the determination in step 25 is NO and the process proceeds to step 27, where the message "Please go to 3F" is displayed on the display 5, and execution of the image data processing program 10 ends.
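- The flow of steps 19 to 24 can be illustrated with a short sketch. This is a hedged reading, not the patent's implementation: it assumes the male and female class means, a covariance matrix for the two-feature plane (standing in for the pre-computed classification curve), and the scalar reference values have all been estimated in advance from labeled samples; every name below is illustrative.

```python
import numpy as np

def mahalanobis_label(feat, mean_m, mean_f, cov):
    """Steps 19-20 sketch: place the point (cheek/chin ratio, enhanced
    variance) on the 2-D plane and choose the class whose pre-computed
    mean is nearer in Mahalanobis distance."""
    inv_cov = np.linalg.inv(cov)
    def d2(mean):
        diff = np.asarray(feat, dtype=float) - np.asarray(mean, dtype=float)
        return float(diff @ inv_cov @ diff)
    return "male" if d2(mean_m) < d2(mean_f) else "female"

def threshold_label(value, ref, male_side_high=True):
    """Steps 21-22 sketch: compare a scalar feature with a gender
    classification reference value obtained in advance."""
    return "male" if (value >= ref) == male_side_high else "female"

def majority(a, b, c):
    """Steps 23-24: adopt the gender on which at least two of the
    three classification results (a), (b), and (c) agree."""
    return "male" if (a, b, c).count("male") >= 2 else "female"
```

- With this structure, any single unreliable feature is outvoted by the other two, which is the reliability argument the description makes for combining the three classifications.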
- Since the gender identification device 1 is configured as described above, when the person P about to enter presses the push button switch 6, the device operates as described above, identifies whether the person is male or female, and, based on the result, guides a male to the second floor and a female to the third floor. Guidance by sound may be provided instead of, or in addition to, the display 5.
- In the embodiment described above, gender discrimination is performed based only on the temperature of the face of the person P; the temperature of the palm may also be used in combination.
- FIG. 11 shows a flowchart of the main part of the image data processing program in this case.
- In this case, a sensor 7 shown in FIG. 12 is used instead of the push button switch 6 shown in FIG. 8.
- The sensor 7 also serves the role of the push button switch 6 in Fig. 8.
- As shown at 7A in Fig. 12, an infrared temperature sensor 7A is installed in, for example, the opening and closing bar of an automatic door, so that the temperature of the palm is detected there.
- A plate 7B is installed near the sensor 7A, and a notice reading "Press with the palm of your hand" is displayed on the plate 7B; this allows the palm temperature to be detected reliably.
- The data processing for gender discrimination combining the palm temperature is basically the same as that shown in Figs. 9A and 9B. The difference is that a newly introduced enhanced variance value of the palm is also used; the processing is the same as in the previous embodiment except that steps 31 to 37 shown in Fig. 11 are performed instead of steps 17 to 22 shown in Figs. 9A and 9B.
- The enhanced variance value of the hand is calculated in steps 31 and 32, taking the place of the cheek enhanced variance value of step 18.
- Classification result (a) is obtained by combining the enhanced variance value of the hand with the cheek average / chin average value.
- Classification result (b) is obtained, in place of steps 19 and 20, by combining the enhanced variance value of the hand with the enhanced variance value of the cheek.
- Classification result (c) is obtained in step 37 by using the enhanced variance value of the hand.
- Gender discrimination can also be performed using only the palm temperature. In this case as well, the image data processing device 3, the monitor device 4, and the display 5 are used, and the sensor 7 shown in FIG. 12 is used in place of the push button switch 6.
- The data processing in this case executes the processing shown in FIG. 13 instead of steps 11 to 24 shown in Figs. 9A and 9B.
- Steps 41A and 42A, constituting step 41, perform the same processing as steps 31 and 32 in Fig. 11: the palm temperature is captured and its enhanced variance value is calculated.
- In step 42, the calculated enhanced variance value of the palm is compared with a previously determined reference value to classify the gender. Subsequent processing is the same as that from step 25 onward in Fig. 9B.
- In a further embodiment, men and women are identified by using a color image of the beard area under the nose in addition to the cheek temperature and the chin temperature.
- That is, a color image of the area below the nose is obtained from a visible image and combined with the cheek temperature and the chin temperature.
- For this purpose, a television camera 2B for taking visible images is installed in addition to the infrared camera 2A, and the visible image signal VDB from the camera 2B is also sent to the image data processing device 3 for processing.
- In step 51 in FIG. 14, the infrared face image is extracted.
- In step 52, a color image of the beard area under the nose is extracted from the visible image signal VDB of the television camera 2B.
- In step 53, the extracted color image of the beard under the nose is compared with a black-level reference for the male beard area obtained in advance; if the level is at or above the reference, the person is classified as male, and if below it, as female, giving result (a).
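- The comparison in step 53 can be sketched as follows. The patent does not specify how the black level is computed, so the mean darkness of the region is assumed here as one simple measure; the names are illustrative.

```python
import numpy as np

def beard_label(roi_rgb: np.ndarray, black_ref: float) -> str:
    """Classify gender from the color image of the area under the nose.

    roi_rgb: H x W x 3 uint8 array of the extracted region.
    black_ref: male beard black-level reference obtained in advance.
    """
    darkness = 255.0 - float(roi_rgb.mean())  # darker region -> larger value
    return "male" if darkness >= black_ref else "female"
```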
- Steps 54A to 54E, constituting step 54, are the same as steps 15 to 19 in FIG. 9A.
- In step 55, the second classification (b) is performed, and in step 56 the third classification (c) is performed. Subsequent steps are the same as step 24 onward in FIG. 9A.
- Classification based on other colors in the visible face image, such as the eye shadow around the eyes, the eye line, the mascara, the eyebrows and their surroundings, the red of lipstick painted on the mouth, and the skin color of the face, can also be carried out in place of the black level of the beard.
- In the embodiments described above, the gender discriminating device is configured to function as an unmanned guide device that discriminates between men and women and provides different guidance according to the discrimination result, but the present invention is not limited to this. For example, if the messages of steps 26 and 27 are changed to "You cannot enter" and "Welcome", only women can be allowed to enter. Alternatively, a buzzer that sounds only when a male is detected may be provided in place of the display 5 and applied, for example, to the entrance door of a women's changing room or a women's public toilet to prevent the entry of men; the device then works as a security device.
- In the embodiments described above, gender discrimination is performed based on the heat generation temperature of the face of the person P, the temperature of the palm, and the like.
- The accuracy of gender discrimination can be further increased by combining these with discrimination methods based on color information in the visible face image: the eye shadow around the eyes, the eye line, the mascara, the eyebrows and their surroundings, lipstick painted on the mouth, the skin color of the face, the dark shading of the chin and of the area under the nose due to beard stubble, and the color of the beard itself.
- Other application examples include the following: the security system can be further strengthened.
- The present invention provides the following effects. Men and women can be distinguished by biological characteristics, independent of appearance features that can be changed at will, such as clothes, hairstyle, makeup, and manner of walking, and independent of voice. If only biometric features are used, gender can be distinguished regardless of race. Men and women can be identified without contact with the person, regardless of the ambient temperature, and regardless of whether eyeglasses are worn. Reliable gender discrimination can be achieved by performing discrimination based on a combination of the features. For persons whose gender cannot be determined from biological characteristics alone, the color information of each part of the face can additionally be used.
Industrial applicability
- The gender discrimination method according to the present invention enables gender discrimination based on biological characteristics, independent of appearance features such as clothes, hairstyle, makeup, and manner of walking, which can be changed at will, and independent of voice. It is useful for building a more accurate gender identification system.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005517318A JP4359652B2 (ja) | 2004-01-22 | 2005-01-20 | Gender identification method |
US10/586,057 US7899218B2 (en) | 2004-01-22 | 2005-01-20 | Gender identification method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-045440 | 2004-01-22 | ||
JP2004045440 | 2004-01-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2005070301A1 (ja) | 2005-08-04 |
Family
ID=34805960
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2005/001035 WO2005070301A1 (ja) | Gender identification method | 2005-01-20 |
Country Status (3)
Country | Link |
---|---|
US (1) | US7899218B2 (ja) |
JP (1) | JP4359652B2 (ja) |
WO (1) | WO2005070301A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008027242A (ja) * | 2006-07-21 | 2008-02-07 | Toyota Central Res & Dev Lab Inc | Object part discrimination device and gender determination device |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7904461B2 (en) * | 2007-05-01 | 2011-03-08 | Google Inc. | Advertiser and user association |
US8055664B2 (en) | 2007-05-01 | 2011-11-08 | Google Inc. | Inferring user interests |
US8041082B1 (en) | 2007-11-02 | 2011-10-18 | Google Inc. | Inferring the gender of a face in an image |
US8027521B1 (en) * | 2008-03-25 | 2011-09-27 | Videomining Corporation | Method and system for robust human gender recognition using facial feature localization |
DE102011011767A1 (de) * | 2011-02-18 | 2012-08-23 | Fresenius Medical Care Deutschland Gmbh | Medizintechnisches Gerät mit Mehrfunktionsdisplay |
US20120182427A1 (en) * | 2011-06-06 | 2012-07-19 | Aaron Marshall | System and method for providing thermal gender recognition |
FR2997528B1 (fr) * | 2012-10-26 | 2021-10-15 | Oberthur Technologies | Identification biometrique |
US9852324B2 (en) | 2015-12-08 | 2017-12-26 | Intel Corporation | Infrared image based facial analysis |
KR101664940B1 (ko) * | 2016-05-12 | 2016-10-12 | (주)엘리비젼 | 가상현실을 이용한 헤어 스마트 미러 시스템 |
US20180070850A1 (en) * | 2016-09-15 | 2018-03-15 | Karen S. Stafford | Apparatus and method for detecting body composition and correlating it with cognitive efficiency |
US11185235B2 (en) * | 2017-03-27 | 2021-11-30 | Panasonic Intellectual Property Management Co., Ltd. | Information processing method, information processing device, and recording medium |
US11450151B2 (en) * | 2019-07-18 | 2022-09-20 | Capital One Services, Llc | Detecting attempts to defeat facial recognition |
US11984222B2 (en) | 2021-09-02 | 2024-05-14 | Safety Shield Products, LLC | System and method for sharing health data |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5163094A (en) * | 1991-03-20 | 1992-11-10 | Francine J. Prokoski | Method for identifying individuals from analysis of elemental shapes derived from biosensor data |
US6173068B1 (en) * | 1996-07-29 | 2001-01-09 | Mikos, Ltd. | Method and apparatus for recognizing and classifying individuals based on minutiae |
- 2005-01-20 WO PCT/JP2005/001035 patent/WO2005070301A1/ja active Application Filing
- 2005-01-20 US US10/586,057 patent/US7899218B2/en not_active Expired - Fee Related
- 2005-01-20 JP JP2005517318A patent/JP4359652B2/ja not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08249444A (ja) * | 1995-03-09 | 1996-09-27 | Nippon Telegr & Teleph Corp <Ntt> | Object attribute detection method |
JPH10293837A (ja) * | 1997-04-21 | 1998-11-04 | Oki Electric Ind Co Ltd | Binary data identification method |
JP2001216515A (ja) * | 2000-02-01 | 2001-08-10 | Matsushita Electric Ind Co Ltd | Method and apparatus for detecting a human face |
Non-Patent Citations (6)
Title |
---|
IRIKI M. ET AL: "Kenjo Nipponjin Kokuon no Tokeichi ni Tsuite -Kokuon Sokutei no Mondaiten Oyobi Ekikaon tono Hikaku", NAIKA, vol. 62, no. 1, 1 July 1988 (1988-07-01), pages 162 - 165, XP002990290 * |
KIHARA M. ET AL: "Seijo Seijin Sokkonbu Hifuon no Shinkei Seirigakuteki Kento -Seisa, Sayusa, Oyobi Kareiteki Henka ni Tsuite", JAPANESE JOURNAL OF GERIATRICS, vol. 21, no. 4, July 1984 (1984-07-01), pages 381 - 382, XP002990292 * |
NISHINO S. ET AL: "Sekigaisen Gazo o Mochiita Danjo Shikibetsu no Shiko", 2004 NEN THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS SOGO TAIKAI KOEN RONBUNSHU JOHO SYSTEM 2, 8 March 2004 (2004-03-08), pages 181, XP002990288 * |
OKAMOTO N. ET AL: "Danjo Taion no Hikaku Kenkyu -Shisatsuho Oyobi Cosinor-ho o Mochitte", RINSHO NOHA, vol. 33, no. 7, 1 July 1991 (1991-07-01), pages 485 - 488, XP002990289 * |
YAMAMOTO K. ET AL: "Shitsuon no Chigai ni yoru Haibu Seishiki ga Hifuon, Komakuon Oyobi Onnetsu Kankaku ni Oyobosu Eikyo", JAPANESE JOURNL OF PHYSIOLOGICAL ANTHROPOLOGY, vol. 8, no. 4, November 2003 (2003-11-01), pages 217 - 223, XP002990287 * |
YOSHIUE S. ET AL: "Seijochi . Ijochi Taion", SOGO RINSHO, vol. 34, 10 August 1985 (1985-08-10), pages 1599 - 1606, XP002990291 * |
Also Published As
Publication number | Publication date |
---|---|
US20070274572A1 (en) | 2007-11-29 |
US7899218B2 (en) | 2011-03-01 |
JPWO2005070301A1 (ja) | 2007-09-06 |
JP4359652B2 (ja) | 2009-11-04 |
Similar Documents
Publication | Title |
---|---|
JP4359652B2 (ja) | Gender identification method |
Silva et al. | A new database for breast research with infrared image |
Wang et al. | Infrared imaging of hand vein patterns for biometric purposes |
WO2019090769A1 (zh) | Face shape recognition method and apparatus, and intelligent terminal |
KR20190051256A (ko) | Method and program for evaluating skin condition based on image analysis results and a trained skin recognizer |
AU2017217944B2 (en) | Systems and methods for evaluating pigmented tissue lesions |
CN106241584A (zh) | Intelligent video surveillance system and method based on escalator safety |
JP2004529703A (ja) | Thermal image analysis for polygraph testing |
JP2004529702A (ja) | Thermal image analysis for polygraph testing |
KR20160078208A (ko) | Biometric authentication apparatus and method |
JPH06500177A (ja) | Method for identifying individuals from analysis of elemental shapes derived from biosensor data |
KR101301821B1 (ko) | Apparatus and method for generating complexion information, and apparatus and method for judging health status using complexion information |
JP2004527313A (ja) | Analysis of a subject's behavior |
JP2007068620A (ja) | Psychological state measuring device |
Beveridge et al. | Focus on quality, predicting FRVT 2006 performance |
Marzec et al. | Automatic method for detection of characteristic areas in thermal face images |
Mohd et al. | A non-invasive facial visual-infrared stereo vision based measurement as an alternative for physiological measurement |
KR20130141285A (ko) | Method and apparatus for diagnosing skin condition, and method for providing cosmetic information suited to the skin condition |
Boudouane et al. | Wearable camera for fall detection embedded system |
Buddharaju et al. | Face recognition beyond the visible spectrum |
JP2001061817A (ja) | Personal identification method and recording medium storing a personal identification program |
CN110399786A (zh) | Non-perception identification method and system |
Gupta et al. | Detection of cancer in breast thermograms using mathematical threshold based segmentation and morphology technique |
Sharma et al. | Lip print recognition for security systems: an up-coming biometric solution |
CN112950242A (zh) | Information pushing method and apparatus, and electronic device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 2005517318 Country of ref document: JP |
|
122 | Ep: pct application non-entry in european phase | ||
WWE | Wipo information: entry into national phase |
Ref document number: 10586057 Country of ref document: US |
|
WWP | Wipo information: published in national office |
Ref document number: 10586057 Country of ref document: US |