EP0578508B1 - Video camera with colour-based target tracking system - Google Patents
- Publication number
- EP0578508B1 (application EP93305419A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- divided area
- divided
- lightness
- hair
- skin colour
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Lifetime
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
- G01S3/7864—T.V. type tracking systems
Definitions
- This invention relates to a video camera system which can be applied in autonomous target tracking apparatus in which the field of view of a video camera can track the centre of an object such as a human face.
- a first autonomous target tracking method has been proposed in which the peak value of a high frequency element of a brightness signal in the measuring frame is memorized and its movement is automatically tracked as the feature of the target.
- a second autonomous target tracking method has been proposed in which, regarding brightness signals of the front and back fields in the measuring frame, a motion vector is formed by key point matching and it is assumed that the motion vector in the measuring frame represents the movement of the target.
- the first autonomous tracking method has the advantage that its general construction is simple, but, for example, in the case where there is a man in the measuring frame with a tree in the background, the man cannot be tracked automatically since the high frequency element of the tree gives the biggest peak.
- since the first autonomous tracking method utilizes peak signals, it is easily affected by noise. As a result, automatic tracking may not be performed in a photographic environment with low brightness. Also, an object with poor contrast may not be tracked automatically since basically the high frequency element is extracted.
- in the second autonomous tracking method it is difficult to distinguish whether the calculated motion vector is caused by unintentional movement of the video camera or by the movement of the object, and in practice there is a possibility of the system functioning erroneously.
- United States Patent US-A-5,031,049 discloses a video camera system in which a colour difference signal representing a feature of an object is used for detecting a relative shift between said object and said system, the object then being tracked within an image field to provide a reference area for focus measurement and control.
- United States Patent US-A-5,093,869 discloses a missile aiming pattern recognition apparatus in which regions with uniform properties are identified and then grown to include neighbouring areas.
- a video camera system comprising:
- the skin colour divided areas and the lightness divided areas can be extracted properly because they are absorbed into skin colour divided area groups FC1, FC2 and lightness divided area groups HR1, HR2, and simultaneously the measuring frame for the object can be set with certainty because the measuring frame is set based on the adjacent set FC1, HR1 in said skin colour divided area groups FC1, FC2 and lightness divided area groups HR1, HR2.
- Another aspect of the present invention provides an autonomous target tracking method for automatically tracking an object displayed on a display screen based on imaging output signals output from an image pickup device within a lens block unit, said autonomous target tracking method comprising the steps of:
- since the image information of each pixel constituting the display screen is divided into the prescribed number of small divided areas, and the measuring frame is set by judging the position and size of the skin colour area based on the hue signal of each small divided area, a video camera system capable of setting the measuring frame on the object with high accuracy can easily be obtained.
- Embodiments of the invention provide video camera autonomous tracking apparatus which is in general unaffected by the intensity and direction of the light source and, simultaneously, which is in general unaffected by the background.
- ATT generally shows an autonomous target tracking apparatus of the video camera system, and the autonomous target tracker aims at a human face as its target.
- it is applicable as a means of making the lens block of the video camera system constituting the Autonomous Target Tracker with Human Recognition (ATT-HR) system perform the automatic tracking function.
- the autonomous target tracking apparatus ATT receives image light LA coming from the human face model as the object at a charge coupled device (CCD) 4 through a lens 2 of the lens block unit 1 and an iris 3, and delivers an image output signal S1 which represents the target image to a signal separation/automatic gain adjusting circuit 5.
- the signal separation/automatic gain adjusting circuit 5 sample-holds the image output signal S1, which is gain-controlled to the prescribed gain by a control signal from an auto iris (AE) system (not shown); the image output signal S2 thus obtained is supplied to a digital camera processing circuit 7 through an analog-to-digital converter 6.
- the digital camera processing circuit 7 forms brightness signal Y, chrominance signal C and color difference signals R-Y and B-Y according to the image output signal S2, and outputs the brightness signal Y and chrominance signal C through a digital-to-analog converter 8 as video signal S3.
- the digital camera processing circuit 7 supplies brightness signal Y and color difference signals R-Y and B-Y as target tracking detection signal S4 to a tracking control circuit 11, which generates tracking control signal S5 for a panning driving motor 12 and a tilting driving motor 13 which are equipped in the lens block unit 1 in accordance with the target tracking detection signal S4.
- the tracking control circuit 11 supplies colour difference signals R-Y and B-Y to a saturation/hue detection circuit 14, which forms the hue signal HUE and the saturation signal SAT; these are memorized with brightness signal Y, per picture element, in an image memory 15 constituted by, for example, a field memory.
- the saturation/hue detection circuit 14 forms hue signal HUE and saturation signal SAT by converting colour difference signals R-Y and B-Y from rectangular coordinates to polar coordinates, and the human face model is recognized as an object based on the visual stimulus which human beings can perceive, by means of brightness signal Y, hue signal HUE and saturation signal SAT.
- the visual stimulus which human beings can generally perceive can be expressed by an L axis and a colour coordinate system containing an SH plane which perpendicularly intersects the L axis, the so-called "HLS system".
- the L axis shows "lightness" and is equivalent to brightness signal Y.
- the SH plane is expressed by polar coordinates perpendicularly intersecting the L axis.
- S shows "saturation" and is expressed by the distance from the L axis.
- H stands for "hue", and hue is expressed by the angle measured from the direction of colour difference signal B-Y, which is taken as 0 [°].
- the solid of this HLS system is such that, as the light source becomes brighter, the colour coordinate, i.e., the SH plane, moves upward along the L axis and all colours approach white; at this point the saturation S gradually decreases. On the other hand, if the light source becomes darker, the colour coordinate, i.e., the SH plane, moves down along the L axis and all colours approach black; at this point the saturation S also decreases.
- the saturation S and the brightness Y are easily affected by the lightness of the light source in accordance with the characteristic of HLS color coordinate system; and therefore, it is difficult to say that they are the optimal parameters to express the quantity of features of the object.
- the hue H is the quantity which is not easily affected by the light source as a parameter to express the quantity of inherent features of the object.
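The rectangular-to-polar conversion performed by the saturation/hue detection circuit 14 can be sketched as follows. This is a minimal illustration assuming floating-point colour difference values; the convention that hue is measured as an angle from the B-Y direction follows the description above, while the function name and units are illustrative.

```python
import math

def rect_to_polar(r_y: float, b_y: float) -> tuple[float, float]:
    """Convert colour difference signals (R-Y, B-Y) into saturation SAT
    (distance from the L axis) and hue HUE (angle from the B-Y direction)."""
    sat = math.hypot(r_y, b_y)                        # distance from L axis
    hue = math.degrees(math.atan2(r_y, b_y)) % 360.0  # 0 deg along B-Y
    return sat, hue
```

For a pixel lying exactly on the B-Y axis, `rect_to_polar(0.0, 1.0)` yields a hue of 0 [°], matching the convention that the B-Y direction defines 0 [°].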
- the tracking control circuit 11 extracts the feature of the human face model as the object and when the feature changes, the panning driving motor 12 and tilting driving motor 13 will be actuated in order to follow the changes and as a result, video signal which tracks the movement of the object as video signal S3 will be obtained.
- in order to divide the display screen PIC formed in the image memory 15 into divided areas AR of the prescribed size in accordance with xy rectangular coordinates (x, y), as shown in Fig. 3, the microprocessor-based tracking signal processing circuit 16 outputs a block defining signal S6 to an address generation circuit 17, which supplies an address signal S7 to the image memory 15.
- the data of each picture element constituting the display screen PIC of the image memory 15 is thus read out for every divided area AR and evaluated as image information in each divided area AR.
- the skin color detection circuit 19 detects the image part of the skin of the human face model: when the incoming hue signal HUE element is within the prescribed skin colour range, it gives a "+1" count output S8 to a skin color pixel number counter 21, judging that the image of said picture element is skin colour.
- the dynamic range of hue signal HUE takes values in the angular range 0 - 359 [°].
- the hair color detection circuit 20 detects hair areas in the image part of the human face model. In this embodiment, the dynamic range of brightness signal Y is expressed by the values 0 - 255 (8 bits); when the brightness signal of a picture element is below the value 50, the circuit defines it as black, judges that said picture element exists in the hair area, and outputs a "+1" count output S9 to a hair pixel number counter 22.
- the number of pixels having information on skin color and hair color respectively in pixels contained in each divided area AR will be counted in the skin color pixel number counter 21 and the hair pixel number counter 22.
- judgement reference signals S10 and S11 are set to values which determine whether the counting results of the skin color pixel number counter 21 and the hair pixel number counter 22 cause said divided areas AR to be judged as prospective skin colour areas or prospective hair areas; comparators 23 and 24 thus transmit prospective skin color area detection information S12 and prospective hair area detection information S13 to the tracking signal processing circuit 16.
- a hue noise gate signal forming circuit 25, composed of a comparator, is provided for the gate circuit 18; it compares the saturation signal SAT read out pixel by pixel from the image memory 15 with a noise judgement signal S14 output from the tracking signal processing circuit 16, and by giving a gate signal S15 which closes the gate circuit when the saturation signal SAT is under the prescribed level, it is arranged that the hue signal HUE element of said pixel is not input to the skin color detection circuit 19.
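The per-area counting performed by counters 21 and 22, the comparators 23 and 24, and the saturation noise gate can be sketched together. The hair threshold of 50 (on a 0 - 255 brightness scale) comes from the text; the skin hue range, saturation noise floor and count threshold are hypothetical values chosen for illustration.

```python
def classify_area(pixels, skin_hue_range=(110, 150), sat_noise_floor=16,
                  hair_y_max=50, count_threshold=8):
    """Judge one divided area AR as prospective skin colour and/or hair area.

    `pixels` is an iterable of (brightness Y, hue HUE in degrees, saturation SAT).
    Returns (is_prospective_skin_area, is_prospective_hair_area).
    """
    skin_count = hair_count = 0
    for y, hue, sat in pixels:
        # noise gate: ignore the hue of pixels whose saturation is too low
        if sat >= sat_noise_floor and skin_hue_range[0] <= hue <= skin_hue_range[1]:
            skin_count += 1          # "+1" count output S8 to counter 21
        if y < hair_y_max:
            hair_count += 1          # "+1" count output S9 to counter 22
    # comparators 23 and 24 against judgement reference signals S10 / S11
    return skin_count >= count_threshold, hair_count >= count_threshold
```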
- the tracking signal processing circuit 16 judges the feature of an image on the display screen PIC from the prospective skin color area detection information S12 and the prospective hair area detection information S13 obtained based on brightness signal Y, hue signal HUE and saturation signal SAT of each pixel contained in the image memory 15, and delivers tracking control signal S5 to the panning driving motor 12 and the tilting driving motor 13 so that the centre of gravity of the human face model constantly comes to the centre of the measuring frame.
- since the display screen PIC as a whole is used as one measuring frame, it is arranged that the centroid of the face model comes to the centre of the display screen PIC.
- when the procedure of step SP1 has been terminated, the tracking signal processing circuit 16 is able to obtain the distribution of divided areas AR judged as prospective hair divided areas ARA and prospective skin color divided areas ARB, as shown in Fig. 3 or Fig. 6.
- the tracking signal processing circuit 16 absorbs said discontinuous areas according to the "hair area absorbing rule 1" and the "skin color area absorbing rule 2" at the following step SP2.
- the tracking signal processing circuit 16 proceeds to the step SP3 when the process of step SP2 is terminated and defines the determination of hair area and skin color area by the "hair area determination rule 1" and the "skin color area determination rule 2".
- divided area groups HR1 and HR2 in the display screen PIC are combined as the hair area and divided area groups FC1 and FC2 are combined as the skin color area at the steps SP2 and SP3 by absorbing discontinuous divided areas and by combining hair divided area and skin color divided area.
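Steps SP2 and SP3 (absorption of discontinuous divided areas and combination into groups such as FC1, FC2 and HR1, HR2) can be sketched as a two-pass labelling over the grid of divided areas. This is a simplified reading of the absorbing and determination rules, which the text does not spell out in full; 4-neighbour adjacency is an assumption.

```python
from collections import deque

def absorb_and_group(grid, mark):
    """Label connected groups of `mark` cells after absorbing blank
    neighbours (a simplified reading of the absorbing rules of step SP2).

    `grid` is a list of rows of cells, e.g. 'S' (skin), 'H' (hair), '.' (blank).
    Returns a list of groups, each a list of (row, col) cells.
    """
    rows, cols = len(grid), len(grid[0])
    # absorption pass: a blank cell touching a `mark` cell is absorbed
    absorbed = [row[:] for row in grid]
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] == '.' and any(
                    0 <= i + di < rows and 0 <= j + dj < cols
                    and grid[i + di][j + dj] == mark
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                absorbed[i][j] = mark
    # combination pass: 4-connected flood fill over `mark` cells
    seen, groups = set(), []
    for i in range(rows):
        for j in range(cols):
            if absorbed[i][j] == mark and (i, j) not in seen:
                queue, group = deque([(i, j)]), []
                seen.add((i, j))
                while queue:
                    ci, cj = queue.popleft()
                    group.append((ci, cj))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = ci + di, cj + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and absorbed[ni][nj] == mark
                                and (ni, nj) not in seen):
                            seen.add((ni, nj))
                            queue.append((ni, nj))
                groups.append(group)
    return groups
```

Running the two passes separately for hair and skin colour marks would yield the divided area groups HR1, HR2 and FC1, FC2 used by the later determination rules.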
- the tracking signal processing circuit 16 proceeds to the step SP4 of Fig. 5 and calculates the sizes Area_Hair and Area_Skin of combined hair areas HR1 and HR2, and skin color areas FC1 and FC2.
- the tracking signal processing circuit 16 proceeds to the step SP5 and judges the process to determine the face area under the condition to satisfy conditions of the "face area determination rule 1" and the "face area determination rule 2".
- the "face area determination rule 1" identifies one hair divided area group and one skin colour divided area group as a set of areas and, if the size ratio of the size Area_Hair of the hair area group to the size Area_Skin of the skin colour area group satisfies 1/5 < Area_Hair/Area_Skin < 5, determines the set as a prospective face area.
- This "face determination rule 1" judges that this set of divided area group is most possible the face area if the size ratio of the sizes Area_Hair and Area_Skin is less than five times and more than 1/5 times.
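The size-ratio test of the "face area determination rule 1" can be expressed directly; the function name is illustrative.

```python
def face_rule_1(area_hair: float, area_skin: float) -> bool:
    """A hair/skin pair is a prospective face only if the size ratio
    Area_Hair/Area_Skin lies strictly between 1/5 and 5."""
    if area_hair <= 0 or area_skin <= 0:
        return False
    ratio = area_hair / area_skin
    return 1 / 5 < ratio < 5
```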
- the above judgement avoids the possibility of misjudging all dark areas as hair areas, for example when the image has numerous divided areas with a dark brightness level on the display screen PIC (e.g., where there is a dark screen in the background); similarly, where a large quantity of skin colour divided areas exists besides the face skin, the face area might not be defined correctly, and the above judgement thus avoids such erroneous judgements.
- This "face area determination rule 2" judges a set of a hair area group and a skin colour area group as a face if the upper side on the screen is the hair divided area group (i.e., I (i, j)) and the lower side is the skin colour area group (i.e., I (i, j-1)), under the condition that the hair divided area group and skin colour divided area group are in contact in at least three divided areas.
- a pair of divided area groups i.e., a set of hair area group HR1 and skin color area group FC1 is in contact with 8 divided areas and satisfies this rule 2, and therefore, it is judged as a face.
- since a pair of divided area groups, i.e., the set of hair area group HR2 and skin colour area group FC2, is not in contact in any areas and does not satisfy the condition of rule 2, it is excluded from the judgement of face area.
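The adjacency test of the "face area determination rule 2" can be sketched over divided-area coordinates; representing a contact as a hair cell directly above a skin colour cell is an assumption consistent with the I (i, j) / I (i, j-1) notation above.

```python
def face_rule_2(hair_cells, skin_cells, min_contacts=3):
    """A hair group above a skin colour group is judged a face only if they
    are in contact in at least `min_contacts` divided areas.

    Cells are (row, col) pairs with row increasing downward; a contact is a
    hair cell lying directly above a skin colour cell.
    """
    skin = set(skin_cells)
    contacts = sum(1 for (i, j) in hair_cells if (i + 1, j) in skin)
    return contacts >= min_contacts
```

The set HR1/FC1 of Fig. 6, in contact in 8 divided areas, would pass this test; HR2/FC2, in contact in none, would not.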
- the tracking signal processing circuit 16 proceeds to the step SP6 and judges whether the process of steps SP2 - SP5 has been repeated twice or not.
- in the first pass, the process proceeds to step SP7, changes the number of divisions set at step SP1 from 32 x 32 to 16 x 16, extracts prospective hair areas and prospective skin colour areas, and returns to step SP2.
- the tracking signal processing circuit 16 executes steps SP2 - SP3 - SP4 - SP5 on the 16 x 16 divided areas, and thus improves the probability of face recognition by executing the process on these more roughly divided areas in addition to the process on the 32 x 32 divided areas.
- when an affirmative result is obtained at step SP6, it means that the image processing has been finished twice; the tracking signal processing circuit 16 then proceeds to step SP8 and, when more than one face has been extracted from one display screen PIC, selects as the object the face area nearest the centre of the display screen PIC at the time when the record pause button is released; then, at step SP9, it supplies tracking control signal S5 to the panning driving motor 12 and the tilting driving motor 13 so that the centroid of the face becomes the centre of the measuring frame, i.e., of the display screen PIC.
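Step SP9 amounts to steering the pan/tilt motors by the offset between the face centroid and the screen centre. A minimal sketch, assuming the face is given as a list of divided-area coordinates on a 32 x 32 grid; the sign convention of the returned error is illustrative.

```python
def tracking_error(face_cells, grid_size=32):
    """Return (dx, dy) from the screen centre to the face centroid, in
    divided-area units; a tracking controller would drive the panning and
    tilting motors so as to null this error."""
    cx = sum(j for _, j in face_cells) / len(face_cells)
    cy = sum(i for i, _ in face_cells) / len(face_cells)
    centre = (grid_size - 1) / 2
    return cx - centre, cy - centre
```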
- the tracking signal processing circuit 16 terminates a cycle of the automatic tracking process and returns to the step SP1 mentioned above to enter the following automatic tracking cycle.
- since the peak value has not been used as the characteristic quantity, the system is resistant to noise; and since the human face model, which is not affected by changes of coordinates on the screen, is used instead of a motion vector, a system resistant to camera shake from hand-held movement can be constructed.
- Fig. 8 illustrates another embodiment of the present invention, in which parts corresponding to those of Fig. 1 are given identical reference numerals; brightness signal Y and colour difference signals R-Y and B-Y obtained from the digital camera processing circuit 7 are given to a screen reduction circuit 32 via a low pass filter 31.
- the screen reduction circuit 32 executes the process to divide the display screen PIC into divided areas 32 x 32 or 16 x 16 by means of screen reduction control signal S21 outputted from the tracking signal processing circuit 16 and brightness signal Y and color difference signal R-Y and B-Y of each pixel outputted from the digital camera processing circuit 7.
- hue signal HUE and saturation signal SAT will be supplied to the image memory 15 by giving color difference signals R-Y and B-Y corresponding to 32 x 32 areas or 16 x 16 areas from the screen reduction circuit 32 to the saturation/hue detection circuit 14 as target tracking detection signal S4 and simultaneously, brightness signal Y will be supplied directly to the image memory 15 as target tracking detection signal.
- in the case of Fig. 8 the image memory 15 needs only a memory area for 32 x 32 pixels, whereas in the case of Fig. 1 a memory capacity for one field of pixels is necessary; the memory can therefore be simplified further.
- hue signal HUE and saturation signal SAT are output as signals accumulated over the 32 x 32 or 16 x 16 pixels, and thus the hair detection circuit 20 and skin color detection circuit 19 detect whether each of the 32 x 32 or 16 x 16 divided areas is a hair area or a skin colour area respectively and supply the detection results to the tracking signal processing circuit 16 as hair area detection output S22 and skin color area detection output S23.
- since the lens block unit 1 can set the measuring frame on the centroid of the face as it moves within the display screen PIC, and light can always be measured at the centroid of the face, backlight correction by the auto iris adjusting system becomes unnecessary and, simultaneously, the brightness level of the object can be set automatically to the optimum value.
- the auto white balance adjusting system can adjust the colours of the overall screen by adjusting to obtain the optimal skin colour of the face. As a result, the auto white balance adjusting system as a whole acts as an auto skin colour balance (ASB) adjustment, and, compared with the previously proposed auto white balance adjusting apparatus, which adjusts so that the integrated result of the entire screen becomes white, a display screen having excellent colour balance can be obtained even where the environment of the object is over-exposed or under-exposed.
- the embodiment described above has dealt with the case of executing absorption, combination and determination processes on the group of hair divided areas and the group of skin color divided areas by setting the measuring frame on the overall display screen PIC.
- the same effects as those of the embodiment described above can be obtained if the measuring frame is set in the area selected according to demands in the display screen PIC.
- the embodiment described above has dealt with the case of shifting the field of vision of the lens block 1 corresponding to the movement of the object by panning and/or tilting and thus the desired object image will be entered in the measuring frame.
- the same effects as those of the embodiment described above can be obtained by shifting the measuring frame for the image in the display screen PIC without panning or tilting.
Claims (9)
- A video camera system comprising: area dividing means (16) for dividing a display image produced from imaging output signals supplied by an image pickup device within a lens block unit (1) into a prescribed number of divided areas (AR); skin colour area extracting means (16) for detecting skin colour divided areas (ARB) having hue signals within a prescribed range corresponding to a skin colour from said divided areas; lightness area extracting means (16) for detecting lightness divided areas (ARA) having luminance signals within a prescribed luminance range from said divided areas; absorbing means (16) for forming skin colour divided area groups by concatenating blank divided areas, being neither skin colour divided areas nor lightness divided areas, adjacent to a skin colour divided area with said skin colour divided area, and for forming lightness divided area groups by concatenating blank divided areas adjacent to a lightness divided area with said lightness divided area; adjacent divided area extracting means (16) for detecting a skin colour divided area group and a lightness divided area group which are adjacent to each other from said skin colour divided area groups and lightness divided area groups; and measuring frame setting means (16) for setting a measuring frame on said adjacent divided areas in said display image.
- A video camera system according to claim 1, wherein said measuring frame setting means causes said adjacent skin colour divided area group and lightness divided area group to appear within said measuring frame by moving said measuring frame relative to said display image in accordance with the detection result of said adjacent divided area group extracting means.
- A video camera system according to claim 1, wherein said measuring frame setting means causes said adjacent skin colour divided area group and lightness divided area group to appear within said measuring frame by panning and/or tilting said lens block unit in accordance with the detection result of said adjacent divided area group extracting means.
- A video camera system according to claim 1, wherein said lightness area extracting means is hair area extracting means, said prescribed luminance range corresponding to a lightness of hair and said lightness divided area group being a hair divided area group, and wherein said adjacent skin colour divided area group and hair divided area group correspond to a face.
- A video camera system according to claim 4, wherein said adjacent divided area extracting means is responsive to said adjacent skin colour divided area group and hair divided area group to detect a centroid of said face, and said measuring frame setting means (16) tracks said face by panning and/or tilting said lens block unit (1) so that the centroid of said face appears at the centre of said display image.
- A video camera system according to claim 4, wherein said face defining means determines that said hair divided area group and said skin colour divided area group constitute a face when the size of said hair divided area group relative to said skin colour divided area group is within a prescribed relative size range and said hair divided area group is adjacent to the upper side of said skin colour divided area group.
- An autonomous target tracking method for automatically tracking an object displayed on a display screen based on imaging output signals supplied by an image pickup device within a lens block unit (1), said autonomous target tracking method comprising the steps of: dividing a display image produced from said imaging output signals supplied by said image pickup device within said lens block unit into a prescribed number of divided areas (AR); detecting skin colour divided areas (ARB) having a hue signal within a prescribed range corresponding to a skin colour from said divided areas; detecting lightness divided areas having luminance signals within a prescribed luminance range from said divided areas; concatenating blank divided areas, being neither skin colour divided areas nor lightness divided areas, adjacent to said skin colour divided area with said skin colour divided area to form skin colour divided area groups, and concatenating blank divided areas adjacent to said lightness divided area to form lightness divided area groups; detecting a skin colour divided area group and a lightness divided area group which are adjacent to each other from said skin colour divided area groups and lightness divided area groups; and setting a measuring frame in said display image so that said detected skin colour divided area group and lightness divided area group which are adjacent to each other appear within said measuring frame.
- An autonomous target tracking method according to claim 7, wherein said lightness divided areas are hair divided areas, said prescribed luminance range corresponding to a lightness of hair and said lightness divided area group being a hair divided area group; and wherein said adjacent skin colour divided area group and said hair divided area group correspond to a face.
- An autonomous target tracking method according to claim 8, wherein said hair divided area group and said skin colour divided area group are determined to be a face when the size of said hair divided area group relative to said skin colour divided area group is within a prescribed relative range and said hair divided area group is adjacent to the upper side of said skin colour divided area group.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP207107/92 | 1992-07-10 | | |
| JP20710792 | 1992-07-10 | | |
| JP20710792A (JP3298072B2) | 1992-07-10 | 1992-07-10 | ビデオカメラシステム (Video camera system) |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| EP0578508A2 (fr) | 1994-01-12 |
| EP0578508A3 (fr) | 1995-01-04 |
| EP0578508B1 (fr) | 1999-09-15 |
Family
ID=16534320
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| EP93305419A (EP0578508B1, Expired - Lifetime) | Video camera with colour-based target tracking system | 1992-07-10 | 1993-07-09 |
Country Status (5)
| Country | Document |
|---|---|
| US | US5430809A |
| EP | EP0578508B1 |
| JP | JP3298072B2 |
| KR | KR100274999B1 |
| DE | DE69326394T2 |
Families Citing this family (157)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5699442A (en) * | 1992-04-10 | 1997-12-16 | Andrew Welch | System for detecting the location of a reflective object within a video field |
KR100276681B1 (ko) * | 1992-11-07 | 2001-01-15 | 이데이 노부유끼 | 비디오 카메라 시스템 |
JPH0730796A (ja) * | 1993-07-14 | 1995-01-31 | Sony Corp | ビデオ信号追従処理システム |
US5774261A (en) * | 1993-11-19 | 1998-06-30 | Terumo Kabushiki Kaisha | Image display system |
ATE192275T1 (de) * | 1993-12-03 | 2000-05-15 | Terumo Corp | Stereoskopisches bildanzeigesystem |
JPH07218864A (ja) * | 1994-02-07 | 1995-08-18 | Terumo Corp | 立体画像表示装置 |
JPH07222204A (ja) * | 1994-02-07 | 1995-08-18 | Terumo Corp | 立体画像表示装置 |
JPH07222866A (ja) * | 1994-02-09 | 1995-08-22 | Terumo Corp | 立体画像ゲーム装置 |
JPH07226957A (ja) * | 1994-02-09 | 1995-08-22 | Terumo Corp | 立体画像通信装置 |
US6011580A (en) * | 1994-06-07 | 2000-01-04 | Terumo Kabushiki Kaisha | Image display apparatus |
US6714665B1 (en) | 1994-09-02 | 2004-03-30 | Sarnoff Corporation | Fully automated iris recognition system utilizing wide and narrow fields of view |
JPH0896129A (ja) * | 1994-09-21 | 1996-04-12 | Omron Corp | モデル画像登録方法 |
US5535302A (en) * | 1994-12-01 | 1996-07-09 | Tsao; Tien-Ren | Method and apparatus for determining image affine flow using artificial neural system with simple cells and Lie germs |
DE19511713A1 (de) * | 1995-03-30 | 1996-10-10 | C Vis Computer Vision Und Auto | Method and device for automatically capturing images of faces |
US6172706B1 (en) * | 1995-04-19 | 2001-01-09 | Canon Kabushiki Kaisha | Video camera with automatic zoom adjustment based on distance between user's eyes |
US5912980A (en) * | 1995-07-13 | 1999-06-15 | Hunke; H. Martin | Target acquisition and tracking |
WO1997021188A1 (fr) * | 1995-12-04 | 1997-06-12 | David Sarnoff Research Center, Inc. | Wide field of view/narrow field of view recognition system and method |
GB2311182A (en) * | 1996-03-13 | 1997-09-17 | Innovision Plc | Improved gradient based motion estimation |
US5717512A (en) * | 1996-05-15 | 1998-02-10 | Chmielewski, Jr.; Thomas A. | Compact image steering and focusing device |
US6222939B1 (en) | 1996-06-25 | 2001-04-24 | Eyematic Interfaces, Inc. | Labeled bunch graphs for image analysis |
US7650015B2 (en) | 1997-07-22 | 2010-01-19 | Image Processing Technologies. LLC | Image processing method |
GB9619117D0 (en) * | 1996-09-12 | 1996-10-23 | Pandora Int Ltd | Digital image processing |
US6343141B1 (en) | 1996-10-08 | 2002-01-29 | Lucent Technologies Inc. | Skin area detection for video image systems |
US5864630A (en) * | 1996-11-20 | 1999-01-26 | At&T Corp | Multi-modal method for locating objects in images |
US6275258B1 (en) | 1996-12-17 | 2001-08-14 | Nicholas Chim | Voice responsive image tracking system |
US6121953A (en) * | 1997-02-06 | 2000-09-19 | Modern Cartoons, Ltd. | Virtual reality system for sensing facial movements |
US5845009A (en) * | 1997-03-21 | 1998-12-01 | Autodesk, Inc. | Object tracking system using statistical modeling and geometric relationship |
US6151403A (en) * | 1997-08-29 | 2000-11-21 | Eastman Kodak Company | Method for automatic detection of human eyes in digital images |
US6252976B1 (en) * | 1997-08-29 | 2001-06-26 | Eastman Kodak Company | Computer program product for redeye detection |
US6148092A (en) | 1998-01-08 | 2000-11-14 | Sharp Laboratories Of America, Inc | System for detecting skin-tone regions within an image |
US6556708B1 (en) * | 1998-02-06 | 2003-04-29 | Compaq Computer Corporation | Technique for classifying objects within an image |
JP3271750B2 (ja) * | 1998-03-05 | 2002-04-08 | Oki Electric Industry Co., Ltd. | Iris identification code extraction method and device, iris recognition method and device, and data encryption device |
US6272231B1 (en) | 1998-11-06 | 2001-08-07 | Eyematic Interfaces, Inc. | Wavelet-based facial motion capture for avatar animation |
JP3970520B2 (ja) | 1998-04-13 | 2007-09-05 | Eyematic Interfaces, Inc. | Wavelet-based facial motion capture for animating a human figure |
US6301370B1 (en) | 1998-04-13 | 2001-10-09 | Eyematic Interfaces, Inc. | Face recognition from video images |
US6593956B1 (en) | 1998-05-15 | 2003-07-15 | Polycom, Inc. | Locating an audio source |
JP2000048184A (ja) * | 1998-05-29 | 2000-02-18 | Canon Inc | Image processing method, face region extraction method, and apparatus therefor |
AUPP400998A0 (en) | 1998-06-10 | 1998-07-02 | Canon Kabushiki Kaisha | Face detection in digital images |
AU739936B2 (en) * | 1998-06-10 | 2001-10-25 | Canon Kabushiki Kaisha | Face detection in digital images |
US6404900B1 (en) | 1998-06-22 | 2002-06-11 | Sharp Laboratories Of America, Inc. | Method for robust human face tracking in presence of multiple persons |
US7050655B2 (en) * | 1998-11-06 | 2006-05-23 | Nevengineering, Inc. | Method for generating an animated three-dimensional video head |
US6714661B2 (en) | 1998-11-06 | 2004-03-30 | Nevengineering, Inc. | Method and system for customizing facial feature tracking using precise landmark finding on a neutral face image |
GB2343945B (en) * | 1998-11-18 | 2001-02-28 | Sintec Company Ltd | Method and apparatus for photographing/recognizing a face |
US7050624B2 (en) * | 1998-12-04 | 2006-05-23 | Nevengineering, Inc. | System and method for feature location and tracking in multiple dimensions including depth |
US7057636B1 (en) * | 1998-12-22 | 2006-06-06 | Koninklijke Philips Electronics N.V. | Conferencing system and method for the automatic determination of preset positions corresponding to participants in video-mediated communications |
US6393136B1 (en) * | 1999-01-04 | 2002-05-21 | International Business Machines Corporation | Method and apparatus for determining eye contact |
US6539100B1 (en) | 1999-01-27 | 2003-03-25 | International Business Machines Corporation | Method and apparatus for associating pupils with subjects |
JP3357628B2 (ja) | 1999-04-16 | 2002-12-16 | Ikegami Tsushinki Co., Ltd. | Viewfinder control device and television camera |
JP4366758B2 (ja) * | 1999-05-27 | 2009-11-18 | Konica Minolta Holdings, Inc. | Region extraction device, region extraction method, and recording medium storing a region extraction program |
AU739832B2 (en) * | 1999-06-10 | 2001-10-18 | Canon Kabushiki Kaisha | Object camera description |
JP2001014457A (ja) * | 1999-06-29 | 2001-01-19 | Minolta Co Ltd | Image processing device |
US6466695B1 (en) | 1999-08-04 | 2002-10-15 | Eyematic Interfaces, Inc. | Procedure for automatic analysis of images and image sequences based on two-dimensional shape primitives |
US6965693B1 (en) * | 1999-08-19 | 2005-11-15 | Sony Corporation | Image processor, image processing method, and recorded medium |
US6792135B1 (en) * | 1999-10-29 | 2004-09-14 | Microsoft Corporation | System and method for face detection through geometric distribution of a non-intensity image property |
US6940545B1 (en) * | 2000-02-28 | 2005-09-06 | Eastman Kodak Company | Face detecting camera and method |
AUPQ684600A0 (en) * | 2000-04-11 | 2000-05-11 | Safehouse International Limited | An object monitoring system |
JP2004501463A (ja) * | 2000-06-16 | 2004-01-15 | アイウェブ・インコーポレイテッド | Method for measuring and sizing objects from an image of a human face using iris size |
JP2002047694A (ja) * | 2000-07-31 | 2002-02-15 | Komatsu Ltd | Display device for construction machinery |
JP3707371B2 (ja) * | 2000-08-28 | 2005-10-19 | Seiko Epson Corporation | Image display system, image processing method, and information storage medium |
US6774908B2 (en) * | 2000-10-03 | 2004-08-10 | Creative Frontier Inc. | System and method for tracking an object in a video and linking information thereto |
US6917703B1 (en) | 2001-02-28 | 2005-07-12 | Nevengineering, Inc. | Method and apparatus for image analysis of a gabor-wavelet transformed image using a neural network |
US7392287B2 (en) | 2001-03-27 | 2008-06-24 | Hemisphere Ii Investment Lp | Method and apparatus for sharing information using a handheld device |
US6834115B2 (en) | 2001-08-13 | 2004-12-21 | Nevengineering, Inc. | Method for optimizing off-line facial feature tracking |
US6876364B2 (en) | 2001-08-13 | 2005-04-05 | Vidiator Enterprises Inc. | Method for mapping facial animation values to head mesh positions |
US6853379B2 (en) * | 2001-08-13 | 2005-02-08 | Vidiator Enterprises Inc. | Method for mapping facial animation values to head mesh positions |
US7298412B2 (en) * | 2001-09-18 | 2007-11-20 | Ricoh Company, Limited | Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program |
KR100439377B1 (ko) * | 2002-01-17 | 2004-07-09 | LG Electronics Inc. | Method for extracting a human region in a mobile communication environment |
EP1353516A1 (fr) * | 2002-04-08 | 2003-10-15 | Mitsubishi Electric Information Technology Centre Europe B.V. | A method and apparatus for detecting and/or tracking one or more colour regions in an image or sequence of images |
US6975759B2 (en) * | 2002-06-25 | 2005-12-13 | Koninklijke Philips Electronics N.V. | Method and system for white balancing images using facial color as a reference signal |
US7110575B2 (en) * | 2002-08-02 | 2006-09-19 | Eastman Kodak Company | Method for locating faces in digital color images |
US7991192B2 (en) | 2002-11-27 | 2011-08-02 | Lockheed Martin Corporation | Method of tracking a moving object by an emissivity of the moving object |
US7120279B2 (en) * | 2003-01-30 | 2006-10-10 | Eastman Kodak Company | Method for face orientation determination in digital color images |
US7039222B2 (en) * | 2003-02-28 | 2006-05-02 | Eastman Kodak Company | Method and system for enhancing portrait images that are processed in a batch mode |
US7508961B2 (en) * | 2003-03-12 | 2009-03-24 | Eastman Kodak Company | Method and system for face detection in digital images |
US8682097B2 (en) | 2006-02-14 | 2014-03-25 | DigitalOptics Corporation Europe Limited | Digital image enhancement with reference images |
US7471846B2 (en) | 2003-06-26 | 2008-12-30 | Fotonation Vision Limited | Perfecting the effect of flash within an image acquisition devices using face detection |
US8593542B2 (en) | 2005-12-27 | 2013-11-26 | DigitalOptics Corporation Europe Limited | Foreground/background separation using reference images |
US8948468B2 (en) | 2003-06-26 | 2015-02-03 | Fotonation Limited | Modification of viewing parameters for digital images using face detection information |
US9692964B2 (en) | 2003-06-26 | 2017-06-27 | Fotonation Limited | Modification of post-viewing parameters for digital images using image region or feature information |
US8498452B2 (en) | 2003-06-26 | 2013-07-30 | DigitalOptics Corporation Europe Limited | Digital image processing using face detection information |
US7792335B2 (en) * | 2006-02-24 | 2010-09-07 | Fotonation Vision Limited | Method and apparatus for selective disqualification of digital images |
US8989453B2 (en) | 2003-06-26 | 2015-03-24 | Fotonation Limited | Digital image processing using face detection information |
US9129381B2 (en) | 2003-06-26 | 2015-09-08 | Fotonation Limited | Modification of post-viewing parameters for digital images using image region or feature information |
US7792970B2 (en) | 2005-06-17 | 2010-09-07 | Fotonation Vision Limited | Method for establishing a paired connection between media devices |
US7565030B2 (en) | 2003-06-26 | 2009-07-21 | Fotonation Vision Limited | Detecting orientation of digital images using face detection information |
US8330831B2 (en) | 2003-08-05 | 2012-12-11 | DigitalOptics Corporation Europe Limited | Method of gathering visual meta data using a reference image |
US8553949B2 (en) | 2004-01-22 | 2013-10-08 | DigitalOptics Corporation Europe Limited | Classification and organization of consumer digital images using workflow, and face detection and recognition |
US8189927B2 (en) * | 2007-03-05 | 2012-05-29 | DigitalOptics Corporation Europe Limited | Face categorization and annotation of a mobile phone contact list |
US7616233B2 (en) | 2003-06-26 | 2009-11-10 | Fotonation Vision Limited | Perfecting of digital image capture parameters within acquisition devices using face detection |
US8896725B2 (en) | 2007-06-21 | 2014-11-25 | Fotonation Limited | Image capture device with contemporaneous reference image capture mechanism |
US8155397B2 (en) | 2007-09-26 | 2012-04-10 | DigitalOptics Corporation Europe Limited | Face tracking in a camera processor |
US7440593B1 (en) * | 2003-06-26 | 2008-10-21 | Fotonation Vision Limited | Method of improving orientation and color balance of digital images using face detection information |
US8363951B2 (en) | 2007-03-05 | 2013-01-29 | DigitalOptics Corporation Europe Limited | Face recognition training method and apparatus |
US7574016B2 (en) | 2003-06-26 | 2009-08-11 | Fotonation Vision Limited | Digital image processing using face detection information |
US7620218B2 (en) | 2006-08-11 | 2009-11-17 | Fotonation Ireland Limited | Real-time face tracking with reference images |
US7844076B2 (en) | 2003-06-26 | 2010-11-30 | Fotonation Vision Limited | Digital image processing using face detection and skin tone information |
US8494286B2 (en) | 2008-02-05 | 2013-07-23 | DigitalOptics Corporation Europe Limited | Face detection in mid-shot digital images |
US7269292B2 (en) | 2003-06-26 | 2007-09-11 | Fotonation Vision Limited | Digital image adjustable compression and resolution using face detection information |
FR2857481A1 (fr) * | 2003-07-08 | 2005-01-14 | Thomson Licensing Sa | Method and device for detecting faces in a colour image |
JP2005165984A (ja) * | 2003-12-05 | 2005-06-23 | Seiko Epson Corp | Method, system, and program for detecting the crown of a person's head in a face image |
US7564994B1 (en) * | 2004-01-22 | 2009-07-21 | Fotonation Vision Limited | Classification system for consumer digital images using automatic workflow and face detection and recognition |
US20060018522A1 (en) * | 2004-06-14 | 2006-01-26 | Fujifilm Software(California), Inc. | System and method applying image-based face recognition for online profile browsing |
US7623156B2 (en) * | 2004-07-16 | 2009-11-24 | Polycom, Inc. | Natural pan tilt zoom camera motion to preset camera positions |
JP3892454B2 (ja) * | 2004-08-03 | 2007-03-14 | Ricoh Co., Ltd. | Electronic still camera device |
KR100612858B1 (ko) * | 2004-08-23 | 2006-08-14 | Samsung Electronics Co., Ltd. | Method and apparatus for tracking a person using a robot |
JP4677753B2 (ja) * | 2004-10-01 | 2011-04-27 | Nikon Corporation | Moving image processing device and method |
US8320641B2 (en) | 2004-10-28 | 2012-11-27 | DigitalOptics Corporation Europe Limited | Method and apparatus for red-eye detection using preview or other reference images |
EP1672460B1 (fr) * | 2004-12-15 | 2009-11-25 | STMicroelectronics (Research & Development) Limited | Dispositif de détection d'utilisateur d'ordinateur |
JP2006185109A (ja) * | 2004-12-27 | 2006-07-13 | Hitachi Ltd | Image measurement device and image measurement method |
US8488023B2 (en) * | 2009-05-20 | 2013-07-16 | DigitalOptics Corporation Europe Limited | Identifying facial expressions in acquired digital images |
US8503800B2 (en) | 2007-03-05 | 2013-08-06 | DigitalOptics Corporation Europe Limited | Illumination detection using classifier chains |
US7715597B2 (en) | 2004-12-29 | 2010-05-11 | Fotonation Ireland Limited | Method and component for image recognition |
US7315631B1 (en) | 2006-08-11 | 2008-01-01 | Fotonation Vision Limited | Real-time face tracking in a digital image acquisition device |
JP4245185B2 (ja) * | 2005-02-07 | 2009-03-25 | Panasonic Corporation | Imaging device |
JP2006229663A (ja) * | 2005-02-18 | 2006-08-31 | Casio Hitachi Mobile Communications Co Ltd | Imaging device, imaging data display method, and imaging data display program |
KR101303877B1 (ko) * | 2005-08-05 | 2013-09-04 | Samsung Electronics Co., Ltd. | Method and apparatus for performing preferred skin colour conversion using face detection and skin region detection |
JP2007065290A (ja) * | 2005-08-31 | 2007-03-15 | Nikon Corp | Autofocus device |
JP4718952B2 (ja) * | 2005-09-27 | 2011-07-06 | Fujifilm Corporation | Image correction method and image correction system |
JP4844073B2 (ja) * | 2005-10-03 | 2011-12-21 | Nikon Corporation | Imaging device |
GB2432659A (en) * | 2005-11-28 | 2007-05-30 | Pixology Software Ltd | Face detection in digital images |
US7804983B2 (en) | 2006-02-24 | 2010-09-28 | Fotonation Vision Limited | Digital image acquisition control and correction method and apparatus |
JP4127297B2 (ja) * | 2006-06-09 | 2008-07-30 | Sony Corporation | Imaging device, imaging device control method, and computer program |
WO2008023280A2 (fr) | 2006-06-12 | 2008-02-28 | Fotonation Vision Limited | Advances in extending the AAM techniques from grayscale to color images |
WO2008015586A2 (fr) | 2006-08-02 | 2008-02-07 | Fotonation Vision Limited | Face recognition with combined PCA-based datasets |
US7403643B2 (en) | 2006-08-11 | 2008-07-22 | Fotonation Vision Limited | Real-time face tracking in a digital image acquisition device |
US7916897B2 (en) | 2006-08-11 | 2011-03-29 | Tessera Technologies Ireland Limited | Face tracking for controlling imaging parameters |
US20090225213A1 (en) * | 2006-10-05 | 2009-09-10 | Matsushita Electric Industrial Co., Ltd. | Luminescent display device |
DE102006049681B4 (de) * | 2006-10-12 | 2008-12-24 | Caveh Valipour Zonooz | Recording device for creating a multimedia recording of an event and method for providing a multimedia recording |
AU2007221976B2 (en) * | 2006-10-19 | 2009-12-24 | Polycom, Inc. | Ultrasonic camera tracking system and associated methods |
US7847830B2 (en) * | 2006-11-21 | 2010-12-07 | Sony Ericsson Mobile Communications Ab | System and method for camera metering based on flesh tone detection |
US8055067B2 (en) | 2007-01-18 | 2011-11-08 | DigitalOptics Corporation Europe Limited | Color segmentation |
EP2115662B1 (fr) | 2007-02-28 | 2010-06-23 | Fotonation Vision Limited | Separating directional lighting variability in statistical face modelling based on texture space decomposition |
US8649604B2 (en) | 2007-03-05 | 2014-02-11 | DigitalOptics Corporation Europe Limited | Face searching and detection in a digital image acquisition device |
US7916971B2 (en) | 2007-05-24 | 2011-03-29 | Tessera Technologies Ireland Limited | Image processing method and apparatus |
WO2008150936A1 (fr) * | 2007-05-30 | 2008-12-11 | Creatier Interactive, Llc | Method and system for enabling advertising and transactions within user-generated video content |
JP4874913B2 (ja) * | 2007-09-28 | 2012-02-15 | Fujifilm Corporation | Head-top position calculation device, image processing device using the same, head-top position calculation method, and program |
US8750578B2 (en) | 2008-01-29 | 2014-06-10 | DigitalOptics Corporation Europe Limited | Detecting facial expressions in digital images |
US7855737B2 (en) | 2008-03-26 | 2010-12-21 | Fotonation Ireland Limited | Method of making a digital camera image of a scene including the camera user |
KR20090120991A (ko) * | 2008-05-21 | 2009-11-25 | LG Innotek Co., Ltd. | Method for setting an automatic white balance region |
KR101446975B1 (ko) | 2008-07-30 | 2014-10-06 | DigitalOptics Corporation Europe Limited | Automatic face and skin beautification using face detection |
US8270682B2 (en) * | 2008-11-25 | 2012-09-18 | Eastman Kodak Company | Hair segmentation |
WO2010063463A2 (fr) | 2008-12-05 | 2010-06-10 | Fotonation Ireland Limited | Face recognition using face tracker classifier data |
US20100208078A1 (en) * | 2009-02-17 | 2010-08-19 | Cisco Technology, Inc. | Horizontal gaze estimation for video conferencing |
US20100295782A1 (en) | 2009-05-21 | 2010-11-25 | Yehuda Binder | System and method for control based on face or hand gesture detection |
JP5383366B2 (ja) * | 2009-07-28 | 2014-01-08 | Canon Inc. | Image processing apparatus, image processing method, and program therefor |
US8379917B2 (en) | 2009-10-02 | 2013-02-19 | DigitalOptics Corporation Europe Limited | Face recognition performance using additional image features |
JP5424819B2 (ja) * | 2009-11-04 | 2014-02-26 | Canon Inc. | Image processing apparatus and image processing method |
US8914734B2 (en) | 2009-12-23 | 2014-12-16 | 8X8, Inc. | Web-enabled conferencing and meeting implementations with a subscription-based model |
US20110149809A1 (en) | 2009-12-23 | 2011-06-23 | Ramprakash Narayanaswamy | Web-Enabled Conferencing and Meeting Implementations with Flexible User Calling and Content Sharing Features |
US20110149811A1 (en) | 2009-12-23 | 2011-06-23 | Ramprakash Narayanaswamy | Web-Enabled Conferencing and Meeting Implementations with Flexible User Calling Features |
US20110150194A1 (en) | 2009-12-23 | 2011-06-23 | Ramprakash Narayanaswamy | Web-Enabled Conferencing and Meeting Implementations with Flexible User Calling Features |
JP2014062926A (ja) * | 2011-01-18 | 2014-04-10 | Fujifilm Corp | Autofocus system |
US8817801B1 (en) | 2011-07-08 | 2014-08-26 | 8X8, Inc. | Conferencing and meeting implementations with advanced features |
JP6790611B2 (ja) * | 2016-09-02 | 2020-11-25 | Fujitsu Limited | Biometric image processing device, biometric image processing method, and biometric image processing program |
CN109459722A (zh) * | 2018-10-23 | 2019-03-12 | Tongji University | Voice interaction method based on a face tracking device |
KR102170912B1 (ko) * | 2019-05-15 | 2020-10-28 | Korea Advanced Institute of Science and Technology | Colour palette for portrait photography used in determining facial skin colour, and method and apparatus using the same |
CN114708543B (zh) * | 2022-06-06 | 2022-08-30 | Chengdu University of Information Technology | Method for locating examinees in examination room surveillance video images |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60253887A (ja) * | 1984-05-30 | 1985-12-14 | Canon Inc | Automatic tracking device in a camera |
US4975969A (en) * | 1987-10-22 | 1990-12-04 | Peter Tal | Method and apparatus for uniquely identifying individuals by particular physical characteristics and security system utilizing the same |
US4991223A (en) * | 1988-06-30 | 1991-02-05 | American Innovision, Inc. | Apparatus and method for recognizing image features using color elements |
GB8905926D0 (en) * | 1989-03-15 | 1990-04-25 | British Aerospace | Target aim point location |
JP2854359B2 (ja) * | 1990-01-24 | 1999-02-03 | Fujitsu Limited | Image processing system |
JPH0771288B2 (ja) * | 1990-08-24 | 1995-07-31 | Kanda Tsushin Kogyo Co., Ltd. | Automatic field-of-view adjustment method and device |
GB9019538D0 (en) * | 1990-09-07 | 1990-10-24 | Philips Electronic Associated | Tracking a moving object |
JP2921973B2 (ja) * | 1990-11-29 | 1999-07-19 | Hitachi, Ltd. | Image extraction method and image extraction device for specific objects |
US5093869A (en) * | 1990-12-26 | 1992-03-03 | Hughes Aircraft Company | Pattern recognition apparatus utilizing area linking and region growth techniques |
JPH04354490A (ja) * | 1991-05-31 | 1992-12-08 | Mitsubishi Heavy Ind Ltd | Automatic tracking video imaging device |
- 1992
- 1992-07-10 JP JP20710792A patent/JP3298072B2/ja not_active Expired - Lifetime
- 1993
- 1993-07-05 KR KR1019930012564A patent/KR100274999B1/ko not_active IP Right Cessation
- 1993-07-07 US US08/086,868 patent/US5430809A/en not_active Expired - Lifetime
- 1993-07-09 DE DE69326394T patent/DE69326394T2/de not_active Expired - Lifetime
- 1993-07-09 EP EP93305419A patent/EP0578508B1/fr not_active Expired - Lifetime
Also Published As
Publication number | Publication date |
---|---|
EP0578508A3 (fr) | 1995-01-04 |
DE69326394D1 (de) | 1999-10-21 |
JP3298072B2 (ja) | 2002-07-02 |
EP0578508A2 (fr) | 1994-01-12 |
KR940003341A (ko) | 1994-02-21 |
US5430809A (en) | 1995-07-04 |
DE69326394T2 (de) | 2000-03-23 |
KR100274999B1 (ko) | 2000-12-15 |
JPH0630318A (ja) | 1994-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP0578508B1 (fr) | Video camera with colour-based target tracking system | |
US5812193A (en) | Video camera system which automatically follows subject changes | |
US5355163A (en) | Video camera that automatically maintains size and location of an image within a frame | |
EP0634665B1 (fr) | Video signal tracking processing apparatus | |
CN110519489B (zh) | Image acquisition method and device | |
US9196043B2 (en) | Image processing apparatus and method | |
US5031049A (en) | Automatic object image follow-up device | |
US8749637B2 (en) | Image recognition device, focus adjustment device, image-capturing device, and image recognition method | |
US6118484A (en) | Imaging apparatus | |
US8538075B2 (en) | Classifying pixels for target tracking, apparatus and method | |
US5204749A (en) | Automatic follow-up focus detecting device and automatic follow-up device | |
US6362852B2 (en) | Focus control apparatus and method for use with a video camera or the like | |
US7181047B2 (en) | Methods and apparatus for identifying and localizing an area of relative movement in a scene | |
US5218414A (en) | Distance measuring apparatus utilizing two-dimensional image | |
US6707491B1 (en) | Method of correcting white balance in video camera | |
US6124890A (en) | Automatic focus detecting device | |
WO2000011610A1 (fr) | Image processing apparatus and method | |
US8503723B2 (en) | Histogram-based object tracking apparatus and method | |
JP3399995B2 (ja) | Video signal tracking method and video camera system | |
JP3704201B2 (ja) | Automatic photographing camera system and method for recognizing subjects in the system | |
JP3757403B2 (ja) | Subject detection device and subject detection method | |
EP0436511B1 (fr) | Focus detection device | |
JPH07154666A (ja) | Video camera | |
JPH01177789A (ja) | Focus control circuit |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): DE FR GB |
| PUAL | Search report despatched | Free format text: ORIGINAL CODE: 0009013 |
| AK | Designated contracting states | Kind code of ref document: A3; Designated state(s): DE FR GB |
| 17P | Request for examination filed | Effective date: 19950519 |
| 17Q | First examination report despatched | Effective date: 19970912 |
| GRAG | Despatch of communication of intention to grant | Free format text: ORIGINAL CODE: EPIDOS AGRA |
| GRAH | Despatch of communication of intention to grant a patent | Free format text: ORIGINAL CODE: EPIDOS IGRA |
| GRAA | (expected) grant | Free format text: ORIGINAL CODE: 0009210 |
| AK | Designated contracting states | Kind code of ref document: B1; Designated state(s): DE FR GB |
| REF | Corresponds to: | Ref document number: 69326394; Country of ref document: DE; Date of ref document: 19991021 |
| ET | Fr: translation filed | |
| PLBE | No opposition filed within time limit | Free format text: ORIGINAL CODE: 0009261 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
| 26N | No opposition filed | |
| REG | Reference to a national code | Ref country code: GB; Ref legal event code: IF02 |
| PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] | Ref country code: GB; Payment date: 20120719; Year of fee payment: 20 |
| PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] | Ref country code: DE; Payment date: 20120720; Year of fee payment: 20; Ref country code: FR; Payment date: 20120806; Year of fee payment: 20 |
| REG | Reference to a national code | Ref country code: DE; Ref legal event code: R071; Ref document number: 69326394 |
| REG | Reference to a national code | Ref country code: GB; Ref legal event code: PE20; Expiry date: 20130708 |
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: DE; Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION; Effective date: 20130710 |
| PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] | Ref country code: GB; Free format text: LAPSE BECAUSE OF EXPIRATION OF PROTECTION; Effective date: 20130708 |