EP0756426A2 - Method and device for extracting specific image information for producing video information


Info

Publication number
EP0756426A2
EP0756426A2 (application EP19960305514 / EP96305514A)
Authority
EP
European Patent Office
Prior art keywords
image
area
specified
specified area
color component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
EP19960305514
Other languages
English (en)
French (fr)
Other versions
EP0756426A3 (de)
EP0756426B1 (de)
Inventor
Naoki Tomizawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Publication of EP0756426A2
Publication of EP0756426A3
Application granted
Publication of EP0756426B1
Anticipated expiration
Legal status: Expired - Lifetime

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/646 Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 Detection; Localisation; Normalisation
    • G06V 40/162 Detection; Localisation; Normalisation using pixel segmentation or colour matching
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/10 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N 19/169 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N 19/17 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, the unit being an image region, e.g. an object
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/643 Hue control means, e.g. flesh tone control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/20 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using video object coding
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 19/00 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N 19/30 Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using hierarchical techniques, e.g. scalability

Definitions

  • The present invention relates to a specified image-area extracting method and device, and more particularly to a method and device for extracting, from an input image, a remarkable portion, e.g., a person's skin-color portion including the face, arms and the like, as a specified image-area. The method and device are usable in video processing, e.g., for producing video information suitable for use in video telephones and video conferences.
  • Japanese Laid-Open Patent Publication No. 5-165120, referred to here as a first example of a conventional face-area extracting method, discloses the sequential steps of a face-area extracting process according to a featured image-data extracting method.
  • First, noise components are removed from the input R, G and B data.
  • The R, G and B data are then converted to H (hue), S (saturation) and L (luminance) values.
  • In a third step, a two-dimensional histogram of hue value and saturation value is prepared in a coordinate system with orthogonal axes for hue value (H), saturation value (S) and the number of pixels.
  • The histogram is then clustered by cutting small peaks out of it with a plane parallel to the coordinate plane and detecting those peaks.
  • A large number of pixels are clustered on the basis of the detected small peaks, and surrounding pixels are merged to form an integrated area.
  • The input image scene (frame) is divided into areas according to the integrated area.
  • Prospective areas of a person's face are extracted from the divided image.
  • Face areas are estimated from the extracted prospective face areas, and the R, G and B data sets for the estimated face areas are outputted.
  • In a second conventional example, a two-dimensional histogram is plotted for the color-difference values U and V, and the area in which a face is included is extracted and outputted as a face area.
  • The above-mentioned processing is conducted on every video frame.
  • The first example of a conventional face-area extraction method thus divides an input image into areas by first preparing a two-dimensional histogram of hue and luminance and then extracting peaks of the histogram frequencies (numbers of pixels).
  • This method has the problem that it is rather difficult to decide which peak corresponds to a skin-color area: in practice, the white race and the black race have different hue values, so erroneous face-area extraction may occur depending on the race.
  • The second example of a conventional face-area extracting method involves the following problem:
  • Face areas of all races cannot be covered by only two kinds of color-difference distribution; at least three kinds are needed (for the white, black and yellow races), because color difference is closely correlated with luminance (brightness). Accordingly, this method cannot extract a face area correctly for every race.
  • Moreover, the color space of every pixel in each frame must be transformed, which requires a large number of time-consuming operations.
  • The present invention was therefore made to provide a skin-color-area extracting method capable of correctly extracting a skin-color area without error and with a minimized amount of processing.
  • the present invention provides the following methods and devices:
  • Figure 1 is a flow chart for explaining a conventional face-area extracting method.
  • Figure 2 is a graph showing a two-dimensional histogram of hue and luminance of an input image according to a conventional face-area extracting method.
  • Figure 3 is a graph showing how to divide an input image into areas according to a conventional face-area extracting method.
  • Figure 4 is a view for explaining a conventional image processing.
  • Figure 5 is a histogram of hue values for the white race and the yellow race.
  • Figure 6 is a histogram of hue values for the black race.
  • Figure 7 is a block diagram showing an embodiment of the present invention, which embodiment relates to means for realizing a specified image area extracting method.
  • Figure 8 is a flow chart showing sequential operations of a control unit of an embodiment shown in Fig. 7.
  • Figure 9 is a block diagram showing hardware realizing a method of the present invention.
  • Figure 10 is a block diagram showing an image processing device embodying the present invention.
  • Figure 11 is a view for explaining an embodiment of Fig. 10.
  • Figure 12 is a view for explaining an embodiment of Fig. 10.
  • Detecting the location of a person's face in an input image and processing the detected area with priority has been proposed as an effective way to improve the quality of a video image. A face is clearly the most remarkable part of an image displayed on the screen of, e.g., a video telephone or video conference apparatus in the field of video communication. It is therefore preferable to extract the face area from an image and suitably encode and quantize it to improve the image quality.
  • Fig. 1 is a flow chart showing sequential steps of a face-area extracting process according to a featured image-data extracting method described in Japanese Laid-Open Patent Publication No. 5-165120, which is referred to as a first example of conventional face-area extracting method.
  • Step S0: noise components are removed from the input R, G and B data.
  • Step S1: the R, G and B data are converted to H (hue), S (saturation) and L (luminance) values.
  • Step S2: a two-dimensional histogram of hue value and saturation value is prepared in a coordinate system with orthogonal axes for hue value (H), saturation value (S) and the number of pixels, as shown in Fig. 2.
  • Step S3: the determined two-dimensional histogram is clustered by cutting small peaks out of it with a plane ST parallel to the coordinate plane and detecting those peaks.
  • Step S4: a large number of pixels are clustered on the basis of the small peaks cut out of the two-dimensional histogram, and surrounding pixels are merged to form an integrated area.
  • The input image scene (frame) is then divided into areas according to the integrated area.
  • Prospective areas R1 and R3 of a person's face are extracted from the divided image.
  • Face areas are estimated from the extracted prospective face areas R1 and R3, and the R, G and B data sets for the estimated face areas are outputted.
  • Fig. 4 shows a two-dimensional histogram plotted for the color-difference values U and V, in which a face area is included in area Rf1 or Rf2. Therefore, the area Rf1 defined by LU1 ≤ U ≤ HU1 and LV1 ≤ V ≤ HV1, or the area Rf2 defined by LU2 ≤ U ≤ HU2 and LV2 ≤ V ≤ HV2, is extracted and outputted as a face area.
  • the above-mentioned processing is conducted on all video frames.
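The box test of this conventional example can be sketched as follows. This is a minimal illustration only; the bound values used in the test below are hypothetical, since the actual LU/HU and LV/HV limits are derived from the U/V histogram.

```python
# Illustrative sketch of the conventional U/V box test. The bound
# values passed in are hypothetical; in the conventional method they
# are derived from the two-dimensional U/V histogram.
def in_box(u, v, lu, hu, lv, hv):
    """True if the (U, V) colour-difference pair lies inside the box."""
    return lu <= u <= hu and lv <= v <= hv

def extract_face_pixels(pixels_uv, box1, box2):
    """Keep pixels whose (U, V) falls in either area Rf1 or Rf2."""
    return [(u, v) for (u, v) in pixels_uv
            if in_box(u, v, *box1) or in_box(u, v, *box2)]
```

A pixel is kept if it falls in either box, mirroring the "Rf1 or Rf2" extraction described above.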
  • Hue is used as the color component because it has little correlation with the luminance component.
  • This embodiment adopts a hue-deriving equation according to the standard HSV color representation system (H = hue, S = saturation, V = luminance).
  • The following equation (1) is an example of a hue (H-value) deriving equation for an RGB-signal input format.
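Equation (1) is not reproduced in this excerpt. As a sketch, the standard inverse-cosine hue formula of the HSV family, which is consistent with the later remark that Equation (1) is defined by an inverse cosine, can be written as follows; it is not necessarily the patent's exact Equation (1).

```python
import math

def hue_rad(r, g, b):
    """Standard arccos-form hue in radians, in [0, 2*pi).

    A textbook sketch consistent with the description; the patent's
    exact Equation (1) may differ in detail.
    """
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0.0:
        return 0.0          # achromatic pixel: hue is undefined, use 0
    theta = math.acos(max(-1.0, min(1.0, num / den)))
    # Mirror the angle into the upper half when blue dominates green.
    return theta if b <= g else 2.0 * math.pi - theta
```

With this convention, pure red maps to 0 rad, green to 2π/3 and blue to 4π/3, so a light skin color (reddish hues) lands near the small hue values used for the range W1.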
  • Figs. 5 and 6 show histograms of hue values derived from an image including a face area according to Equation (1).
  • Fig. 5 shows a histogram obtained from an image including a person with relatively light-colored skin, as the white race or the yellow race has.
  • Fig. 6 shows a histogram obtained from an image including a person with relatively dark-colored skin, as the black race has.
  • In Fig. 5, the pixels of a skin-color area are usually distributed in a ridge (peak) M1 of the histogram.
  • These pixels representing a skin color lie in a range W1 in which the hue value takes 0.15 to 0.45 rad.
  • The histogram of Fig. 6, by contrast, shows no frequency ridge in the range W1; that is, no skin-color area exists within the hue-value range W1.
  • Instead, the skin-color area is distributed in a ridge M2 of the histogram, which lies within a range W2 of hue values.
  • Fig. 7 is a block diagram showing a technical means for realizing an area extracting method according to the present invention. The operation of the means is as follows:
  • a hue value calculating portion 1 determines a hue value from an input video signal according to Equation (1). Derived hue data together with the input video signal is recorded into a frame memory 6 and then is outputted to a primary discriminating portion 2.
  • The primary discriminating portion 2 determines a specified range W1, defined by an upper limit value h2 and a lower limit value h1 outputted from a control portion 5, and extracts the pixels whose hue values, as determined by the hue value calculating portion 1, lie within the range W1.
  • the upper limit h2 and the lower limit h1 may take constant (fixed) values.
  • h1 is set at 0.15 rad and h2 at 0.45 rad.
  • the extraction result is outputted to a pixel counting portion 3.
  • the pixel counting portion 3 counts the number of pixels extracted by the primary discriminating portion 2. The counted value is outputted to a control portion 5.
  • The control portion 5 decides to which group (light or dark) the extracted person's skin color belongs according to the counted pixel value and, on the basis of this discrimination, decides the range of color-component values to be used for further area extraction.
  • a mode representing the decided color range together with a threshold value is outputted to a face-area extracting portion 4.
  • Fig. 8 shows a sequence of operations of the control portion 5 when deciding a color area.
  • First, a count value N is obtained.
  • The proportion of a face area to the entire image is estimated at 3 to 4 % of the total number of pixels.
  • The person is considered to have a light skin color if the counted value exceeds this estimated value.
  • In that case, the color-component area W1, with h1 ≤ H ≤ h2, is decided as the specified area. On the contrary, the person is considered to have a dark skin color if the counted value does not exceed the threshold.
  • In that case, the logical sum of the color areas W2 and W3, defined by h2 and h3 from the control portion 5, is determined as the specified area.
  • Here h3 is set at 6.0 rad.
  • The area W2 is h3 ≤ H ≤ 2π and the area W3 is 0 ≤ H ≤ h2.
  • Since Equation (1) for deriving a hue value is defined by an inverse cosine, the two areas W2 and W3 have hue values that are continuous with each other at 2π and zero, so that they form a substantially single area.
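The decision sequence of Fig. 8 can be sketched as follows. This is an illustrative reading of the steps above; the function name and the choice of the 3 % lower bound of the estimate are assumptions, while h1, h2 and h3 take the values given in the text.

```python
import math

# Hue bounds as given in the text (radians).
H1, H2, H3 = 0.15, 0.45, 6.0
FACE_RATIO = 0.03   # assumed: lower bound of the 3-4 % face-area estimate

def decide_mode(hues, total_pixels):
    """Return the hue range(s) used for face-area extraction.

    Light skin -> single range W1; dark skin -> logical sum of W2 and
    W3, which are continuous across 2*pi / 0.
    """
    count = sum(1 for h in hues if H1 <= h <= H2)
    if count > FACE_RATIO * total_pixels:
        return [(H1, H2)]                       # W1
    return [(H3, 2.0 * math.pi), (0.0, H2)]     # W2 union W3
```

The returned ranges act as the "mode" that the control portion hands to the face-area extracting portion.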
  • A face (skin-color) image area is extracted, according to the threshold value and the mode outputted from the control portion 5, as, e.g., addresses of the hue signal read from the frame memory 6.
  • An embodiment of the present invention corresponding to items (1) to (4) above:
  • Fig. 9 is a block diagram showing another example of technical means for realizing the present invention.
  • the operation of the technical means is as follows:
  • This embodiment differs from the first embodiment in using a histogram preparing portion 7, a noise eliminating portion 8 and a peak detecting portion 9 in place of the primary discriminating portion 2 and the pixel counting portion 3 of the first embodiment.
  • the histogram preparing portion 7 prepares a histogram of hue values.
  • the histogram may include noise and calculation error components.
  • The noise eliminating portion 8 smooths the histogram by using, e.g., a median filter, and then cuts off the histogram parts whose frequency (number of pixels) is not more than a value T, as shown in Fig. 5. The histogram thus obtained is outputted to the peak detecting portion 9.
  • The peak detecting portion 9 examines whether the histogram has a peak in the specified area W1 (the same area as described above with reference to Figs. 5 and 6). The examination result is outputted to the control portion 5.
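The smoothing and peak test can be sketched as follows. A width-3 median filter stands in for the "e.g., a median filter" of the text, and all names, bin layouts and the noise level are illustrative.

```python
# Sketch of the noise-eliminating and peak-detecting portions: smooth
# the hue histogram with a width-3 median filter, discard bins whose
# frequency does not exceed T, then report whether any surviving bin
# falls inside the range W1.
def median3(hist):
    """Width-3 median filter; edge bins are passed through unchanged."""
    out = list(hist)
    for i in range(1, len(hist) - 1):
        out[i] = sorted(hist[i - 1:i + 2])[1]
    return out

def has_peak_in_range(hist, centers, lo, hi, noise_level):
    """True if the smoothed histogram exceeds noise_level inside [lo, hi]."""
    smoothed = median3(hist)
    return any(f > noise_level and lo <= c <= hi
               for f, c in zip(smoothed, centers))
```

A peak found in W1 corresponds to the light-skin decision of the control portion; no peak selects the dark-skin mode.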
  • The control portion 5 associates the histogram with the color area of a person with a light skin color (i.e., of the white or yellow race) if a peak was found in the area W1, or with the color area of a person with a dark skin color (i.e., of the black race) if no peak was found there.
  • the control portion 5 outputs a decided color-area mode and a threshold to a face-area extracting portion 4.
  • the face-area extracting portion 4 extracts a face (skin-color) area from a hue signal of an image read from a frame memory according to the mode and the threshold received from the control portion 5.
  • An embodiment of the present invention corresponding to items (7) to (9) above:
  • The following description relates to an embodiment of the present invention that deals primarily not with edited images (e.g., motion pictures) but with non-edited images taken by a camera. Images of this type allow the respective face areas to be extracted using the same threshold as long as the same person appears in them. This embodiment exploits that feature.
  • Fig. 10 is a block diagram showing technical means for realizing the present invention.
  • the operation of the technical means is as follows:
  • a frame memory 6 stores a frame of input original image.
  • a color component discriminating portion 10 discriminates a face area in the frame of the original image read from the frame memory 6 according to a color component value.
  • the discrimination can be made, for example, by using the face-area extracting method described in the first embodiment of the present invention defined in the above (1) to (4).
  • the discrimination result is outputted to an input color-space discriminating portion 11.
  • The input color-space discriminating portion 11 extracts the data at the same coordinates as the pixels that were judged by the color-component discriminating portion 10 to be face-area pixels of the original image read from the frame memory 6.
  • the portion 11 then prepares a histogram in an input color-space by using the extracted data. For example, with the input format of RGB signals, the portion prepares three histograms for respective primary colors (Red, Green and Blue).
  • a face area of the input original image in the color space is discriminated by detecting peaks (frequency distributions) above a noise level in respective histograms, and a threshold for the face area in the input color space is determined.
  • the obtained threshold is outputted to a face-area extracting portion 4.
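The threshold determination in the input color space can be sketched as follows. The bin width and noise level are assumed values, and the function names are illustrative; the passage above does not fix these details.

```python
# Sketch: build a histogram per input-colour channel from the pixels
# already judged to be face pixels, and read off [low, high] thresholds
# where the frequency rises above a noise level.
def channel_threshold(values, noise_level, bins=16, vmax=256):
    """(low, high) bounds of the histogram mass above noise_level."""
    width = vmax // bins
    hist = [0] * bins
    for v in values:
        hist[min(v // width, bins - 1)] += 1
    above = [i for i, f in enumerate(hist) if f > noise_level]
    if not above:
        return None
    return (min(above) * width, (max(above) + 1) * width - 1)

def face_thresholds(face_pixels, noise_level=2):
    """Per-channel thresholds from a list of (R, G, B) face pixels."""
    return tuple(channel_threshold([p[c] for p in face_pixels], noise_level)
                 for c in range(3))
```

The three resulting (low, high) pairs play the role of the threshold handed to the face-area extracting portion 4.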
  • A control portion 5 instructs the face-area extracting portion 4 to output the original image without processing until it receives a threshold value from the input color-space discriminating portion 11.
  • The control portion 5 then controls the face-area extracting portion 4 to extract a face area from the original image as soon as the threshold value is outputted to it. Obtaining the threshold takes, for example, one second, which corresponds to a duration of 30 video frames.
  • The control portion 5 therefore instructs the face-area extracting portion 4 to output 30 successive frames of the original image without any processing and to extract a face area from the 31st and succeeding frames, as shown in Fig. 11.
  • This method can reduce the processing time to 1/30 of that required when a threshold is determined for every video frame. Consequently, the hardware for determining threshold values can be simplified and controlled by a microprocessor.
  • the face-area extracting portion 4 under the control from the control portion 5 outputs an original image without processing or extracts a face area from the original image according to the threshold given from the input color-space discriminating portion 11.
  • The above-described embodiment determines a threshold for one frame of video and extracts a face area from each successive image by applying the same threshold.
  • In applying this method to video telephones and video-conferencing apparatus, however, it must be considered that one person may be replaced by another in the middle of a communication, whereby erroneous extraction may occur.
  • Therefore, a threshold obtained by the above-mentioned method is used for face-area extraction but is updated at intervals of a specified number of frames (e.g., once every 300 frames) for further face-area extraction.
  • The above-mentioned procedure is repeated periodically, updating the threshold value.
  • That is, the current threshold value may be updated periodically, once every N2 frames (N2 > N1).
  • The control portion 5 instructs the frame memory 6 to record frame No. 0, frame No. N2, frame No. 2×N2, ..., frame No. n×N2 (n an integer), as shown in Fig. 12.
  • A threshold T1 is determined by the above-mentioned method, and a face area is then extracted from frames N1+1 to N2+N1 by using the threshold T1.
  • Frame No. N2 is recorded into the frame memory 6 and processed to obtain a threshold T2.
  • Frames No. N2+N1+1 to 2×N2+N1 are then processed to extract a face area by applying the updated threshold T2. The same cycle of processing is repeated.
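The update schedule above amounts to a simple mapping from frame number to the threshold currently in force, which can be sketched as follows. Frame numbering from 0 and the function name are assumptions; N1 is the measurement latency in frames and N2 the update interval.

```python
# Sketch of the periodic-update schedule: threshold T(k) is measured on
# frame k*N2, becomes available N1 frames later, and stays in force
# through frame (k+1)*N2 + N1.
def threshold_index(frame_no, n1, n2):
    """Index k of the threshold T(k) in force at frame_no.

    Returns None while no threshold is available yet (those frames are
    passed through without processing).
    """
    if frame_no <= n1:
        return None
    return (frame_no - n1 - 1) // n2
```

With n1 = 30 and n2 = 300, frames 0 to 30 are passed through, T(0) covers frames 31 to 330, T(1) covers 331 to 630, and so on, matching the T1/T2 cycle described above.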
  • the present invention offers the following advantages:

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)
EP19960305514 1995-07-28 1996-07-26 Method and device for extracting specific image information for producing video information Expired - Lifetime EP0756426B1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP193870/95 1995-07-28
JP19387095 1995-07-28
JP19387095A JP3461626B2 (ja) 1995-07-28 1995-07-28 特定画像領域抽出方法及び特定画像領域抽出装置

Publications (3)

Publication Number Publication Date
EP0756426A2 true EP0756426A2 (de) 1997-01-29
EP0756426A3 EP0756426A3 (de) 1997-08-06
EP0756426B1 EP0756426B1 (de) 2001-05-02

Family

ID=16315128

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19960305514 Expired - Lifetime EP0756426B1 (de) 1995-07-28 1996-07-26 Method and device for extracting specific image information for producing video information

Country Status (4)

Country Link
US (1) US6088137A (de)
EP (1) EP0756426B1 (de)
JP (1) JP3461626B2 (de)
DE (1) DE69612643T2 (de)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0924648A2 (de) * 1997-12-18 1999-06-23 Fuji Photo Film Co., Ltd. Bildverarbeitungsvorrichtung und -verfahren
EP0996080A2 (de) * 1998-10-08 2000-04-26 Hewlett-Packard Company System und Verfahren zur automatischen Bereichsauswahl
WO2003034729A1 (en) * 2001-10-17 2003-04-24 Qualcomm, Incorporated System and method for maintaining a video image in a wireless communication device
EP1311124A1 (de) * 2001-11-13 2003-05-14 Matsushita Electric Industrial Co., Ltd. Selektives Schutzverfahren für die Bildübertragung
GB2384305A (en) * 2002-01-16 2003-07-23 Autoliv Dev Human position detection by capturing spectral contents of images representative of human skin
US6704448B1 (en) 1999-05-27 2004-03-09 Minolta Co., Ltd. Device and method for extracting specific region from image and computer-readable recording medium storing region extraction program
EP1441497A2 (de) * 2003-01-17 2004-07-28 Omron Corporation Bildaufnahmevorrichtung, Programm und Gerät
EP1619875A1 (de) * 1997-06-17 2006-01-25 Seiko Epson Corporation Verfahren und Vorrichtung zum Einstellen der Farbe
EP1643762A1 (de) * 2004-10-01 2006-04-05 Nikon Corporation Gerät zum Verarbeiten beweglicher Bilder und Verfahren zum Ausführen verschiedener Verarbeitungen bei Bereichen beweglicher und unbeweglicher Objeke
EP1802096A1 (de) * 2004-09-30 2007-06-27 FUJIFILM Corporation Bildverarbeitungseinrichtung, verfahren und bildverarbeitungsprogramm
EP1684520A3 (de) * 2005-01-24 2008-12-31 Kabushiki Kaisha Toshiba Methode und Apparat für die Bildkompression

Families Citing this family (43)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3747589B2 (ja) * 1997-09-17 2006-02-22 コニカミノルタビジネステクノロジーズ株式会社 画像特徴量比較装置および画像特徴量比較プログラムを記録した記録媒体
GB2341231A (en) * 1998-09-05 2000-03-08 Sharp Kk Face detection in an image
JP2000187731A (ja) * 1998-12-21 2000-07-04 Ricoh Co Ltd 画像特徴抽出方法およびその方法の各工程をコンピュータに実行させるためのプログラムを記録したコンピュータ読み取り可能な記録媒体
KR100311952B1 (ko) * 1999-01-11 2001-11-02 구자홍 유효범위 조건 탬플리트 매칭을 이용한 얼굴 영역 추출방법
KR100327485B1 (ko) * 1999-03-17 2002-03-13 윤종용 칼라 이미지로 부터의 얼굴 검출 장치 및 그 방법
US6978042B1 (en) * 1999-04-23 2005-12-20 The Regents Of The University Of California Color image segmentation method
JP2001014457A (ja) * 1999-06-29 2001-01-19 Minolta Co Ltd 画像処理装置
US6577759B1 (en) * 1999-08-17 2003-06-10 Koninklijke Philips Electronics N.V. System and method for performing region-based image retrieval using color-based segmentation
US6940998B2 (en) 2000-02-04 2005-09-06 Cernium, Inc. System for automated screening of security cameras
JP3780810B2 (ja) * 2000-03-27 2006-05-31 コニカミノルタビジネステクノロジーズ株式会社 画像処理回路
US7092122B2 (en) * 2000-07-18 2006-08-15 Fuji Photo Film Co., Ltd. Image processing device and method
JP2002101393A (ja) * 2000-09-22 2002-04-05 Sony Corp 映像表示装置
US7453468B2 (en) * 2000-11-29 2008-11-18 Xerox Corporation Intelligent color to texture converter
US6903782B2 (en) * 2001-03-28 2005-06-07 Koninklijke Philips Electronics N.V. System and method for performing segmentation-based enhancements of a video image
JP2002335501A (ja) * 2001-05-10 2002-11-22 Mitsubishi Electric Corp 携帯型表示装置
KR100422709B1 (ko) * 2001-10-05 2004-03-16 엘지전자 주식회사 영상 의존적인 얼굴 영역 추출방법
KR100439377B1 (ko) * 2002-01-17 2004-07-09 엘지전자 주식회사 이동 통신 환경에서의 사람 영역 추출방법
KR100438303B1 (ko) * 2002-01-17 2004-07-01 엘지전자 주식회사 객체 추출방법
KR100474848B1 (ko) * 2002-07-19 2005-03-10 삼성전자주식회사 영상시각 정보를 결합하여 실시간으로 복수의 얼굴을검출하고 추적하는 얼굴 검출 및 추적 시스템 및 방법
KR100438283B1 (ko) * 2002-08-09 2004-07-02 엘지전자 주식회사 이동통신 단말기 및 이동통신 단말기에서의 얼굴 영역추출 방법
KR20040052142A (ko) * 2002-12-13 2004-06-19 엘지전자 주식회사 이동통신 단말기 및 그 운용방법
WO2005060466A2 (en) * 2003-11-13 2005-07-07 Digitalderm, Inc. Image management system for use in dermatological examinations
US7376270B2 (en) * 2003-12-29 2008-05-20 Canon Kabushiki Kaisha Detecting human faces and detecting red eyes
US7948501B2 (en) * 2004-03-09 2011-05-24 Olympus Corporation Display control apparatus and method under plural different color spaces
JP4373828B2 (ja) * 2004-03-22 2009-11-25 富士フイルム株式会社 特定領域検出方法、特定領域検出装置、およびプログラム
JP4328286B2 (ja) 2004-12-14 2009-09-09 本田技研工業株式会社 顔領域推定装置、顔領域推定方法及び顔領域推定プログラム
JP2006185109A (ja) * 2004-12-27 2006-07-13 Hitachi Ltd 画像計測装置及び画像計測方法
JP4766302B2 (ja) * 2005-03-22 2011-09-07 オムロン株式会社 画像処理装置および方法、記録媒体、並びにプログラム
JP4735170B2 (ja) * 2005-09-30 2011-07-27 オムロン株式会社 画像処理装置および画像処理方法
JP4654874B2 (ja) * 2005-10-17 2011-03-23 株式会社ニコン 被写体解析装置、撮像装置、および画像処理プログラム
KR100691971B1 (ko) * 2006-01-20 2007-03-09 엘지전자 주식회사 이동통신 단말기 및 이미지 데이터 확대 표시방법
US8107762B2 (en) 2006-03-17 2012-01-31 Qualcomm Incorporated Systems, methods, and apparatus for exposure control
WO2008001889A1 (fr) * 2006-06-29 2008-01-03 Fujitsu Limited procédé de classification de couleur, procédé de reconnaissance de couleur, dispositif de classification de couleur, dispositif de reconnaissance de couleur, système de reconnaissance de couleur, programme informatique et support d'enregistrement
US7304740B1 (en) * 2006-09-27 2007-12-04 Weyerhaeuser Company Methods for detecting compression wood in lumber
US7757159B1 (en) * 2007-01-31 2010-07-13 Yazaki North America, Inc. Method of determining the projected area of a 2-D view of a component
WO2010124062A1 (en) 2009-04-22 2010-10-28 Cernium Corporation System and method for motion detection in a surveillance video
US8411970B2 (en) * 2010-03-16 2013-04-02 Pixia Corp. Method and system for determining statistical data for image pixels having a higher bit depth per band
KR101668702B1 (ko) * 2010-07-13 2016-10-24 엘지전자 주식회사 화상회의 장치 및 화상회의 서비스 제공 방법
WO2012049845A1 (ja) * 2010-10-12 2012-04-19 パナソニック株式会社 色信号処理装置
JP5306500B2 (ja) * 2012-02-29 2013-10-02 株式会社東芝 画像処理装置、画像処理方法及びプログラム
JP6039942B2 (ja) * 2012-07-09 2016-12-07 キヤノン株式会社 情報処理装置及びその制御方法及びプログラム
US9361411B2 (en) 2013-03-15 2016-06-07 Honeywell International, Inc. System and method for selecting a respirator
US12026862B2 (en) 2019-10-11 2024-07-02 3M Innovative Properties Company Apparatus and methods for preprocessing images having elements of interest

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH04346333A (ja) 1991-05-23 1992-12-02 Fuji Photo Film Co Ltd 人物の顔のデータの抽出方法及び露光量決定方法
JPH04346334A (ja) 1991-05-23 1992-12-02 Fuji Photo Film Co Ltd 特徴画像データの抽出方法
JPH04346332A (ja) 1991-05-23 1992-12-02 Fuji Photo Film Co Ltd 露光量決定方法
JPH05100328A (ja) 1991-10-09 1993-04-23 Fuji Photo Film Co Ltd 露光量決定方法
JPH05158164A (ja) 1991-12-05 1993-06-25 Fuji Photo Film Co Ltd 特徴画像データの抽出方法
JPH05165120A (ja) 1991-12-12 1993-06-29 Fuji Photo Film Co Ltd 特徴画像データの抽出方法
JPH0667320A (ja) 1991-05-23 1994-03-11 Fuji Photo Film Co Ltd 人物の顔のデータの抽出方法及び露光量決定方法
JPH06160993A (ja) 1992-11-18 1994-06-07 Fuji Photo Film Co Ltd 特徴画像データの抽出方法

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS63223974A (ja) * 1987-03-13 1988-09-19 Toshiba Corp 画像処理装置
US5296945A (en) * 1991-03-13 1994-03-22 Olympus Optical Co., Ltd. Video ID photo printing apparatus and complexion converting apparatus
US5309228A (en) * 1991-05-23 1994-05-03 Fuji Photo Film Co., Ltd. Method of extracting feature image data and method of extracting person's face data
JP3089605B2 (ja) * 1991-06-17 2000-09-18 日本電信電話株式会社 顔基準点抽出方法
JP3387071B2 (ja) * 1993-04-20 2003-03-17 ソニー株式会社 画像識別装置および方法
US5680230A (en) * 1993-09-09 1997-10-21 Canon Kabushiki Kaisha Image processing method and apparatus thereof
US5689575A (en) * 1993-11-22 1997-11-18 Hitachi, Ltd. Method and apparatus for processing images of facial expressions
JP3258840B2 (ja) * 1994-12-27 2002-02-18 シャープ株式会社 動画像符号化装置および領域抽出装置


Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7292371B2 (en) 1997-06-17 2007-11-06 Seiko Epson Corporation Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium
US7286265B2 (en) 1997-06-17 2007-10-23 Seiko Epson Corporation Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium
EP1619875A1 (de) * 1997-06-17 2006-01-25 Seiko Epson Corporation Method and apparatus for color adjustment
US7072074B2 (en) 1997-06-17 2006-07-04 Seiko Epson Corporation Image processing apparatus, image processing method, image processing program recording medium, color adjustment method, color adjustment device, and color adjustment control program recording medium
EP0924648A3 (de) * 1997-12-18 2003-01-15 Fuji Photo Film Co., Ltd. Image processing apparatus and method
US6577760B1 (en) 1997-12-18 2003-06-10 Fuji Photo Film Co., Ltd. Image processing apparatus and method, image synthesizing system and method, image synthesizer and client computer which constitute image synthesizing system, and image separating method
US7301666B2 (en) 1997-12-18 2007-11-27 Fujifilm Corporation Image processing apparatus and method, image synthesizing system and method, image synthesizer and client computer which constitute image synthesizing system, and image separating method
EP0924648A2 (de) * 1997-12-18 1999-06-23 Fuji Photo Film Co., Ltd. Image processing apparatus and method
EP0996080A3 (de) * 1998-10-08 2002-01-09 Hewlett-Packard Company, A Delaware Corporation System and method for automatic region selection
EP0996080A2 (de) * 1998-10-08 2000-04-26 Hewlett-Packard Company System and method for automatic region selection
US6704448B1 (en) 1999-05-27 2004-03-09 Minolta Co., Ltd. Device and method for extracting specific region from image and computer-readable recording medium storing region extraction program
WO2003034729A1 (en) * 2001-10-17 2003-04-24 Qualcomm, Incorporated System and method for maintaining a video image in a wireless communication device
US6970580B2 (en) 2001-10-17 2005-11-29 Qualcomm Incorporated System and method for maintaining a video image in a wireless communication device
EP1311124A1 (de) * 2001-11-13 2003-05-14 Matsushita Electric Industrial Co., Ltd. Selective protection method for image transmission
GB2384305B (en) * 2002-01-16 2005-03-16 Autoliv Dev Improvements in or relating to a camera arrangement
GB2384305A (en) * 2002-01-16 2003-07-23 Autoliv Dev Human position detection by capturing spectral contents of images representative of human skin
EP1441497A2 (de) * 2003-01-17 2004-07-28 Omron Corporation Image capturing device, program and apparatus
EP1802096A1 (de) * 2004-09-30 2007-06-27 FUJIFILM Corporation Image processing device, method, and image processing program
EP1802096A4 (de) * 2004-09-30 2011-04-13 Fujifilm Corp Image processing device, method, and image processing program
EP1643762A1 (de) 2004-10-01 2006-04-05 Nikon Corporation Moving image processing apparatus and method for performing different processings on moving-object regions and still-object regions
US8514293B2 (en) 2004-10-01 2013-08-20 Nikon Corporation Moving image processing device and method for performing different image processings to moving object region and still object region
EP1684520A3 (de) * 2005-01-24 2008-12-31 Kabushiki Kaisha Toshiba Method and apparatus for image compression

Also Published As

Publication number Publication date
DE69612643T2 (de) 2001-09-13
EP0756426A3 (de) 1997-08-06
US6088137A (en) 2000-07-11
JPH0944670A (ja) 1997-02-14
JP3461626B2 (ja) 2003-10-27
EP0756426B1 (de) 2001-05-02
DE69612643D1 (de) 2001-06-07

Similar Documents

Publication Publication Date Title
EP0756426B1 (de) Method and apparatus for extracting specific image information for generating video information
EP0747855B1 (de) Method and apparatus for enhancing a digital image
US6819796B2 (en) Method of and apparatus for segmenting a pixellated image
US6587593B1 (en) Image processing device and image processing method
WO2017131343A1 (en) A device for and method of enhancing quality of an image
US20060203311A1 (en) Automatic white balance method adaptive to digital color images
US20050058341A1 (en) Image quality correction apparatus and image quality correction method
EP2068569A1 (de) Method and apparatus for detecting and adjusting the color values of skin-tone pixels
US20050207643A1 (en) Human skin tone detection in YCbCr space
CN104200431A (zh) 图像灰度化的处理方法及处理装置
JPH07203303A (ja) データ供給方法及びデータ供給装置
EP1428394B1 (de) Bildverarbeitungsvorrichtung und verfahren für die verbesserung von bildern und eine bildausgabe vorrichtung, die die bildverarbeitungsvorrichtung enthält
EP1757083A1 (de) Identifizieren von roten augen in digitalkamerabildern
US8482630B2 (en) Apparatus and method for adjusting automatic white balance by detecting effective area
CN112001853A (zh) 图像处理设备、图像处理方法、摄像设备和存储介质
US8290262B2 (en) Color processing apparatus and method thereof
US20120154545A1 (en) Image processing apparatus and method for human computer interaction
EP0557003A1 (de) Method for film type identification
CN112488933 (zh) Video detail enhancement method and apparatus, mobile terminal, and storage medium
JP4807866 (ja) Color reproduction processing switching device and readable recording medium
JP3387071 (ja) Image identification apparatus and method
US6917704B2 (en) Image processing method, image processing apparatus and image processing system
KR20000059451 (ko) Method for automatically extracting mood colors and adjusting original colors in an image retrieval system
EP1342367B1 (de) Process and apparatus for spatial smoothing of dark regions of an image
JP2002208013 (ja) Image area extraction device and image area extraction method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): DE FR GB

PUAL Search report despatched

Free format text: ORIGINAL CODE: 0009013

AK Designated contracting states

Kind code of ref document: A3

Designated state(s): DE FR GB

17P Request for examination filed

Effective date: 19971006

17Q First examination report despatched

Effective date: 19990527

GRAG Despatch of communication of intention to grant

Free format text: ORIGINAL CODE: EPIDOS AGRA

GRAH Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOS IGRA

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): DE FR GB

REF Corresponds to:

Ref document number: 69612643

Country of ref document: DE

Date of ref document: 20010607

ET Fr: translation filed
REG Reference to a national code

Ref country code: GB

Ref legal event code: IF02

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed
PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20060719

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20060720

Year of fee payment: 11

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: GB

Payment date: 20060726

Year of fee payment: 11

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20070726

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20080201

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070726

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20080331

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20070731