JP4819001B2 - Imaging apparatus and method, program, image processing apparatus and method, and program - Google Patents

Imaging apparatus and method, program, image processing apparatus and method, and program

Info

Publication number
JP4819001B2
Authority
JP
Japan
Prior art keywords
predetermined object
detection
image
detected
latest
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2007182975A
Other languages
Japanese (ja)
Other versions
JP2008054295A (en)
Inventor
Masahiko Sugimoto (杉本 雅彦)
Original Assignee
FUJIFILM Corporation (富士フイルム株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2006202427 priority Critical
Application filed by FUJIFILM Corporation
Priority to JP2007182975A priority patent/JP4819001B2/en
Publication of JP2008054295A publication Critical patent/JP2008054295A/en
Application granted granted Critical
Publication of JP4819001B2 publication Critical patent/JP4819001B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; details thereof
    • H04N 1/21 Intermediate information storage
    • H04N 1/2104 Intermediate information storage for one or a few pictures
    • H04N 1/2112 Intermediate information storage for one or a few pictures using still video cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00221 Acquiring or recognising human faces, facial parts, facial sketches, facial expressions
    • G06K 9/00228 Detection; Localisation; Normalisation
    • G06K 9/00261 Detection; Localisation; Normalisation using comparisons between temporally consecutive images
    • H04N 5/00 Details of television systems
    • H04N 5/222 Studio circuitry; Studio devices; Studio equipment; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, TV cameras, camcorders, webcams, camera modules for embedding in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/225 Television cameras; Cameras comprising an electronic image sensor, e.g. digital cameras, video cameras, camcorders, webcams, camera modules specially adapted for being embedded in other devices, e.g. mobile phones, computers or vehicles
    • H04N 5/232 Devices for controlling television cameras, e.g. remote control; Control of cameras comprising an electronic image sensor
    • H04N 5/23212 Focusing based on image signals provided by the electronic image sensor
    • H04N 5/23218 Control of camera operation based on recognized objects
    • H04N 5/23219 Control of camera operation based on recognized objects where the recognized objects include parts of the human body, e.g. human faces, facial parts or facial expressions

Description

  The present invention relates to a photographing apparatus, such as a digital camera, that obtains an image by photographing, to a photographing method, and to a program for causing a computer to execute the photographing method.

  In shooting with a digital camera, it is common practice to detect an object such as a face from an image acquired by shooting and, according to the detection result, to change the image processing conditions applied to the image or the shooting conditions used at the time of shooting.

  Various methods for detecting a predetermined object from an image have been proposed. For example, for a target image obtained from the video information of an imaging device such as a monitoring camera, one proposed method stores the feature amounts of the target image, obtains the movement vector of the moving object by a comparison evaluation in the time direction between the stored feature amounts and their temporal change amounts, and discriminates the image of interest at each point in time by using the acquired movement vector together with the features of past images of interest, the changes in the feature amounts, and the reliability evaluation of each feature, while varying the degree to which each of these influences the discrimination result (see Patent Document 1).

A motion recognition system has also been proposed in which, from time-series data of input frame images, a motion detection unit extracts regions having movement, a skin color detection unit extracts skin color regions, and a region integration unit extracts regions that are both moving and skin-colored as target regions; by processing the time-series image data containing the specific target image in this way, the system recognizes the shape and motion of the target (see Patent Document 2).
Patent Document 1: JP 9-322153 A
Patent Document 2: JP 2001-16606 A

  In the methods described in Patent Documents 1 and 2, the object is detected using past characteristics of the object to be detected, but it is desirable to detect the object more efficiently.

  The present invention has been made in view of the above circumstances, and an object thereof is to efficiently detect a predetermined object such as a face from an image.

An imaging apparatus according to the present invention comprises photographing means for acquiring an image by photographing;
object detection means for detecting a predetermined object from the image photographed by the photographing means;
storage means for storing a detection history composed of past detection results and the latest detection result of the predetermined object;
and determination means for referring to the detection history and determining whether or not the predetermined object is to be treated as detected in the most recently acquired image.

  In the photographing apparatus according to the present invention, the determination means may be means for determining that the predetermined object is to be treated as detected when the predetermined object has been detected N or more times (M ≧ N) in the M detection results including past results.

  In this case, even when the predetermined object has not been detected N or more times (M ≧ N) in the M detection results including past results, the determination means may determine that the predetermined object is to be treated as detected when the latest detection result satisfies a predetermined condition.

  Further, in this case, when the predetermined object has been detected N or more times (M ≧ N) in the M detection results including past results, the determination means may determine that the predetermined object is to be treated as detected even when it is undetected in the latest detection result, provided that a certain condition is satisfied.

  The photographing apparatus according to the present invention may further comprise selection holding means for holding the selection state of the predetermined object when the predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.

  Further, in the photographing apparatus according to the present invention, the determination means may be means for smoothing at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected in the detection history, and outputting the smoothed information.

A photographing method according to the present invention acquires an image by photographing,
detects a predetermined object from the photographed image,
saves a detection history consisting of past detection results and the latest detection result of the predetermined object,
and determines, with reference to the detection history, whether or not the predetermined object is to be treated as detected in the most recently acquired image.

  The present invention may also be provided as a program for causing a computer to execute the photographing method according to the present invention.

  According to the present invention, a predetermined object is detected from a new image acquired by photographing, and whether or not the predetermined object is to be treated as detected in the most recently acquired image is determined with reference to a detection history composed of past detection results and the latest detection result of the predetermined object. A predetermined object can therefore be detected efficiently from an image by making use of past detection results.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 is a perspective view centered on the back of a digital camera 1, which is an embodiment of the photographing apparatus of the present invention. As shown in FIG. 1, the back of the digital camera 1 has, as an interface for operation by the photographer, an operation mode switch 11, a menu/OK button 12, a zoom/up-down arrow lever 13, left and right arrow buttons 14, a Back (return) button 15 and a display switching button 16, as well as a finder 17 and a liquid crystal monitor 18. A release button 19 is provided on the top surface. FIG. 2 is a perspective view centered on the front of the digital camera 1 according to the present embodiment. As shown in FIG. 2, the front of the digital camera 1 has a lens 20, a lens cover 21, a power switch 22, a finder window 23, a flash 24 and a self-timer lamp 25.

  The operation mode switch 11 is a slide switch for switching among the operation modes: still image shooting mode, moving image shooting mode, and playback mode.

  When the menu/OK button 12 is pressed, various menus for setting the shooting mode type, flash emission mode, number of recording pixels, sensitivity and the like are displayed on the liquid crystal monitor 18, and the button is also used to confirm a selection or setting based on the displayed menu. With the menu/OK button 12, the shooting mode can be set to a full auto mode, in which exposure and white balance are set automatically, or to a manual mode, in which they are set manually. As the manual mode, a program auto mode, an aperture priority mode, a shutter priority mode and a manual exposure mode can be set. Furthermore, a mode for shooting according to the type of scene, such as landscape, flower, sunset or person, can be set as a manual mode.

  The zoom/up-down arrow lever 13 is tilted up and down to adjust between telephoto and wide angle during shooting, and to move the cursor up and down in the menu screen displayed on the liquid crystal monitor 18 during various settings.

  The left and right arrow buttons 14 are used to move the cursor left and right in the menu screen displayed on the liquid crystal monitor 18 during various settings.

  The Back button 15, when pressed, stops the current setting operation and returns the display to the previous screen.

  The display switching button 16, when pressed, switches the display of the liquid crystal monitor 18 on and off, switches various guide displays, switches character display on and off, and the like.

  The contents set by operating each button and lever can be confirmed from the display on the liquid crystal monitor 18, the lamp in the finder, the position of the slide lever, and the like.

  The liquid crystal monitor 18 functions as an electronic viewfinder by displaying a through image for confirming the subject at the time of shooting, and also displays still images and moving images after shooting as well as the various setting menus.

  FIG. 3 is a schematic block diagram showing the configuration of a digital camera which is an embodiment of the photographing apparatus of the present invention. The digital camera 1 shown in FIG. 3 converts image data acquired by shooting into an Exif format image file and records it on an external recording medium 70 that can be attached to and detached from the main body.

  The operation system of the digital camera 1 includes the above-described operation mode switch 11, menu/OK button 12, zoom/up-down arrow lever 13, left and right arrow buttons 14, Back button 15, display switching button 16, release button 19 and power switch 22, together with an operation system control unit 74, an interface for transmitting the contents of these operations to the CPU 75.

  The optical system includes a focus lens 20a and a zoom lens 20b. Each lens can be moved in the optical axis direction by a focus lens driving unit 51 and a zoom lens driving unit 52, each including a motor and a motor driver. The focus lens driving unit 51 controls the movement of the focus lens based on focus driving amount data output from the AF processing unit 62, and the zoom lens driving unit 52 controls the movement of the zoom lens based on operation amount data of the zoom/up-down arrow lever 13.

  The diaphragm 54 is driven by a diaphragm driving unit 55 including a motor and a motor driver. The diaphragm driving unit 55 adjusts the aperture diameter based on the aperture value data output from the AE/AWB processing unit 63.

  The shutter 56 is a mechanical shutter and is driven by a shutter drive unit 57 including a motor and a motor driver. The shutter drive unit 57 controls the opening / closing of the shutter 56 in accordance with a signal generated by pressing the release button 19 and the shutter speed data output from the AE / AWB processing unit 63.

  A CCD 58, which is an image sensor, is provided behind the optical system. The CCD 58 has a photoelectric surface on which a large number of light receiving elements are arranged two-dimensionally; subject light that has passed through the optical system forms an image on this surface and is photoelectrically converted. In front of the photoelectric surface are arranged a microlens array for condensing light on each pixel and a color filter array in which R, G and B filters are regularly arranged. The CCD 58 outputs the charge accumulated in each pixel line by line as a serial analog photographing signal, in synchronization with the vertical and horizontal transfer clocks supplied from the CCD control unit 59. The time over which charge is accumulated in each pixel, that is, the exposure time, is determined by an electronic shutter drive signal given from the CCD control unit 59. The gain of the CCD 58 is adjusted by the CCD control unit 59 so that an analog photographing signal of predetermined amplitude is obtained.

  The analog photographing signal captured from the CCD 58 is input to the analog signal processing unit 60. The analog signal processing unit 60 consists of a correlated double sampling circuit (CDS) that removes noise from the analog signal, an auto gain controller (AGC) that adjusts the gain of the analog signal, and an A/D converter (ADC) that converts the analog signal into a digital signal. The image data converted into a digital signal is CCD-RAW data having R, G and B density values for each pixel.

  The timing generator 72 generates timing signals. These are supplied to the shutter drive unit 57, the CCD control unit 59 and the analog signal processing unit 60, thereby synchronizing the operation of the release button 19, the opening and closing of the shutter 56, the charge capture of the CCD 58 and the processing of the analog signal processing unit 60.

  The flash control unit 73 causes the flash 24 to emit light during shooting. Specifically, when the flash emission mode is set to flash-on, or when the flash emission mode is auto and a pre-image (described later) does not reach a predetermined brightness, the flash 24 is caused to emit light at the time of shooting. On the other hand, when the flash emission mode is set to flash-off, the flash 24 is prohibited from emitting light during shooting.

  The image input controller 61 writes the CCD-RAW data input from the analog signal processing unit 60 in the frame memory 66.

  The frame memory 66 is a working memory used when the various digital image processing (signal processing) operations described later are performed on the image data. For example, an SDRAM (Synchronous Dynamic Random Access Memory), which transfers data in synchronization with a bus clock signal of fixed period, is used.

  The display control unit 71 displays the image data stored in the frame memory 66 on the liquid crystal monitor 18 as a through image and, in the reproduction mode, displays the image data stored in the external recording medium 70 on the liquid crystal monitor 18. The through image is captured by the CCD 58 at predetermined time intervals while the shooting mode is selected.

  The AF processing unit 62 and the AE/AWB processing unit 63 determine the shooting conditions based on a pre-image. The pre-image is the image represented by the image data stored in the frame memory 66 as a result of pre-photographing performed by the CCD 58 when the CPU 75 detects the half-press signal generated by half-pressing the release button 19.

  The AF processing unit 62 detects the in-focus position based on the pre-image and outputs the focus drive amount data (AF processing). As a method of detecting the in-focus position, for example, a passive method that exploits the fact that image contrast is high in the focused state can be used.

  The AE/AWB processing unit 63 measures the subject brightness based on the pre-image, determines the aperture value, shutter speed and the like from the measured brightness, outputs the aperture value data and shutter speed data as exposure setting values (AE processing), and automatically adjusts the white balance at the time of shooting (AWB processing). When the shooting mode is set to the manual mode, the exposure and white balance can be set manually by the photographer of the digital camera 1. Even when the exposure and white balance are set automatically, the photographer can adjust them manually by giving an instruction from the operation system, such as the menu/OK button 12.

  The image processing unit 64 performs image quality correction processing such as gradation correction, sharpness correction and color correction on the image data of the main image, and performs YC processing that converts the CCD-RAW data into YC data consisting of Y data (a luminance signal), Cb data (a blue color difference signal) and Cr data (a red color difference signal). The main image is the image based on the image data that is captured from the CCD 58 in the main photographing executed when the release button 19 is fully pressed, and stored in the frame memory 66 via the analog signal processing unit 60 and the image input controller 61. The upper limit of the number of pixels of the main image is determined by the number of pixels of the CCD 58, and the number of recorded pixels can be changed by a setting such as fine or normal. The numbers of pixels of the through image and the pre-image, on the other hand, are smaller than that of the main image.

  The compression/decompression processing unit 65 compresses the image data of the main image that has been corrected and converted by the image processing unit 64 into a compression format such as JPEG, and generates an image file. A tag storing incidental information such as the shooting date and time is added to the image file based on the Exif format or the like. In the reproduction mode, the compression/decompression processing unit 65 reads the compressed image file from the external recording medium 70 and decompresses it. The decompressed image data is output to the liquid crystal monitor 18.

  The media control unit 67 accesses the external recording medium 70 and controls writing and reading of the image file.

  The internal memory 68 stores various constants set in the digital camera 1, programs executed by the CPU 75, and the like.

  The face detection unit 80 detects a person's face from each of the through images acquired continuously at predetermined time intervals. Specifically, an area having the features contained in a face (for example, skin color, eyes, or the shape of a face) is detected as a face area, though the invention is not limited to this. The face detection unit 80 also detects and outputs the center position, size, in-plane inclination and orientation (front, right, left) of the face.

  For the face detection, for example, the technique disclosed in Japanese Patent Application Laid-Open No. 2006-202276 (hereinafter, Reference Document 1) can be used. In the method of Reference Document 1, face tracking uses known techniques such as motion vectors and feature point detection, or a machine learning method based on AdaBoost, which creates an integrated learning machine by sequentially updating the weights of resampled learning data and combining weighted weak learners. For example, an average frame model is fitted to the actual face image, and the positions of the landmarks on the average frame model are moved so as to match the positions of the corresponding landmarks detected from the face, thereby constructing a deformed face frame model. The positions of the points indicating the landmarks are detected from the face image using discriminators obtained by machine learning on the luminance profiles at points on a plurality of sample images known to show a given landmark and at points known not to show it, together with discrimination conditions for each discriminator. It is also possible to use the method of Japanese Patent Application Laid-Open No. 2004-334836 (hereinafter, Reference Document 2). The technique of Reference Document 2 cuts image data of a certain size out of the image data, compares each piece of cut-out image data with matching data of feature part image data, and thereby detects whether or not a feature part image exists in the processing target image. In addition to human face areas, an animal face or the like may be detected as the specific subject, as in the technique disclosed in Japanese Patent Application Laid-Open No. 2007-11970 (hereinafter, Reference Document 3).

  Here, the face detection unit 80 outputs predetermined values for the face orientation and the face inclination. FIG. 4 is a table showing the orientation output values, and FIG. 5 is a table showing the inclination output values. As shown in FIG. 4, the face detection unit 80 outputs 0 for a face facing front (front face), 1 for a face facing right (right profile), and -1 for a face facing left (left profile).

  Further, as shown in FIG. 5, the face detection unit 80 outputs 0, 1, 2, 3, 4, 5, 6, -5, -4, -3, -2 and -1 for inclinations in 30-degree increments from 0 degrees to 330 degrees with respect to the vertical direction.

  The face detection unit 80 outputs, as the face size, the length of one side of the rectangle surrounding the detected face area, and outputs the two-dimensional coordinates of the center of the face as the center position.
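  As a rough sketch of these output conventions (the function names and the exact angle-to-code mapping are illustrative assumptions, not identifiers from the patent), the codes of FIGS. 4 and 5 can be expressed as:

    # Hypothetical sketch of the output values of FIGS. 4 and 5.
    def orientation_code(direction: str) -> int:
        # 0 for a front face, 1 for a right profile, -1 for a left profile (FIG. 4)
        return {"front": 0, "right": 1, "left": -1}[direction]

    def inclination_code(angle_deg: int) -> int:
        # Map an in-plane tilt, a multiple of 30 degrees from 0 to 330,
        # to the codes 0..6 and -5..-1 of FIG. 5.
        step = (angle_deg % 360) // 30        # 0..11
        return step if step <= 6 else step - 12

    assert inclination_code(0) == 0 and inclination_code(330) == -1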

  The determination unit 82 determines whether or not a new face, detected from the latest of the through images captured continuously at predetermined time intervals, is to be treated as detected, and updates the detection history. Specifically, it refers to the detection history of faces treated as detected, determines whether or not the new face is to be treated as detected, and updates the detection history. The detection history is composed of past detection results and the latest detection result, and is stored in the internal memory 68.

  The tracking processing unit 84 performs processing for following the face that is the AF target, as described later.

  The CPU 75 controls each part of the main body of the digital camera 1 according to signals from the operation system such as the operation mode switch 11 and various processing units such as the AF processing unit 62.

  The data bus 76 is connected to the image input controller 61, the various processing units 62 to 65, the frame memory 66, the media control unit 67, the internal memory 68, the display control unit 71, the face detection unit 80, the determination unit 82, the tracking processing unit 84 and the CPU 75, and exchanges digital image data and the like.

  Next, the processing performed in the digital camera 1 having the above configuration will be described. First, in the present embodiment, the M past and latest face detection results are stored in the internal memory 68 as a detection history. FIG. 6 shows the data structure of the detection history. The detection history holds M detection results, past and latest. Each detection result holds the number of faces detected at that time and, for each face, information consisting of unique information and link information. Detected faces are numbered from 0.

  The unique information consists of the center position, size, inclination, orientation and detection score of the face detected by the face detection unit 80. Here, the score is a value representing the face-likeness of the detected face: the higher the value, the more likely the detected face is an actual face.

  The link information consists of a link in the past direction and a link in the future direction. The link in the past direction consists of a history difference indicating how many detection results back the corresponding face is found, and the number of that face in the past detection result. The link in the future direction consists of a history difference indicating how many detection results ahead the corresponding face is found, and the number of that face in the future detection result.
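  A minimal sketch of this data structure in Python (the field names are assumptions based on the description above, not identifiers from the patent):

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    @dataclass
    class Link:
        history_diff: int          # how many detection results away the linked face lies
        face_number: int           # number of the linked face in that detection result

    @dataclass
    class Face:
        center: Tuple[float, float]  # two-dimensional center position
        size: float                  # length of one side of the surrounding rectangle
        inclination: int             # in-plane tilt code (FIG. 5)
        orientation: int             # front/right/left code (FIG. 4)
        score: float                 # face-likeness of the detection
        past_link: Optional[Link] = None     # link in the past direction
        future_link: Optional[Link] = None   # link in the future direction

    @dataclass
    class DetectionResult:
        faces: List[Face] = field(default_factory=list)  # faces numbered from 0

    M = 5                            # number of stored detection results
    history: List[DetectionResult] = [DetectionResult() for _ in range(M)]
    # history[0] is the latest detection result, history[M-1] the oldest.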

  The detection history is initialized when the power of the digital camera 1 is turned on, when the face detection function is turned on, when the main shooting is finished, and when the mode of the digital camera 1 is changed.

  FIG. 7 is a flowchart of the processing performed in the present embodiment. This processing is performed every time a through image is acquired in the shooting mode and a face is detected from the acquired through image. When a new through image is acquired and a face is detected from it, the determination unit 82 starts the processing and determines whether or not the conditions for initializing the detection history are satisfied (step ST1). If step ST1 is affirmative, the detection history is initialized (step ST2). If step ST1 is negative, and following step ST2, the detection history stored in the internal memory 68 is read (step ST3).

  FIG. 8 schematically shows the read detection history. In FIG. 8 each detection result is denoted history and, for simplicity, the history contains the five most recent detection results history[0] to history[4]; the smaller the index, the newer the detection result. The numbers written in each history are the detected face numbers. As shown in FIG. 8A, three faces 0 to 2 were detected in history[0], the latest detection result; four faces 0 to 3 in history[1]; two faces 0 and 1 in history[2]; three faces 0 to 2 in history[3]; and two faces 0 and 1 in history[4].

  In FIG. 8A, face 0 in history[0], face 1 in history[1] and face 0 in history[2] are linked; face 1 in history[0], face 0 in history[1] and face 2 in history[3] are linked; faces 2, 3, 1, 1 and 1 in history[0] to history[4] are linked; and face 0 in history[3] and face 0 in history[4] are linked. Each group of linked faces is represented by connecting the numbers of the linked faces with its own line segments.

  Subsequently, in order to add the detection result of the new through image (the latest detection result) to the detection history, the determination unit 82 slides the past detection results in the detection history (step ST4). This slide deletes the oldest of the M past detection results so that the latest detection result can be added to the detection history. Specifically, as shown in FIG. 8B, history[4] is deleted, history[0] to history[3] become the new history[1] to history[4], and a new detection result can be added as history[0].
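  The slide simply drops the oldest entry and shifts the rest; a sketch continuing the structures above:

    def slide_history(history: list, latest: "DetectionResult") -> list:
        # Delete history[M-1] and turn history[0..M-2] into history[1..M-1]
        # (step ST4), then add the latest result as the new history[0] (step ST5).
        history.pop()
        history.insert(0, latest)
        return history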

  Subsequently, the latest detection result is added to the detection history (step ST5). In FIG. 8B, it is assumed that four faces, numbered 0 to 3, are detected as the latest detection result. Next, the latest detection result is linked with the past detection results to update the detection history (step ST6). The update of the detection history is described below. FIGS. 9 and 10 are flowcharts of the detection history update processing.

  First, the determination unit 82 sets the face number j in the latest detection result to the initial value 0 (step ST11), and determines whether or not j is less than or equal to the number of faces included in the latest detection result (step ST12). If step ST12 is negative, the process returns. If step ST12 is affirmative, the target of link determination is set to k = 1, the detection result next to the latest one (step ST13), and it is determined whether or not k is less than or equal to the maximum number M of detection results (step ST14). If step ST14 is negative, the link determination target is set to the face with the next number (j = j + 1: step ST15), and the process returns to step ST12.

  If step ST14 is affirmative, the threshold Th1 used for evaluating the distance between face centers is set to an initial value (step ST16). The initial value is a value for judging whether two faces are close to each other, and, as described later, it is updated so as to become smaller in accordance with the calculated face center distance d1.

  Then, the face number i of the link determination target in the current detection result [k] is set to the initial value 0 (step ST17), and it is determined whether or not i is less than or equal to the number of faces included in detection result [k] (the number of detected faces) (step ST18). If step ST18 is negative, the detection result [k] subject to link determination is set to the next oldest detection result (k = k + 1: step ST19), and the process returns to step ST14.

  If step ST18 is affirmative, it is determined whether or not face [i] of detection result [k] is free of a link destination in the future direction (step ST20). In FIG. 8B, none of the faces in history[1] has a link destination in the future direction, while faces 0, 1 and 3 in history[2] have link destinations in the future direction but face 2 does not. When step ST20 is affirmative, face [j] of the latest detection result and face [i] of detection result [k] are set as the link determination targets (step ST21).

  Next, it is determined whether or not the difference in orientation between face [j] and face [i] is less than or equal to a predetermined threshold Th2 (step ST22). If the threshold Th2 is set to 0, only faces with the same orientation are judged the same; if Th2 is set to 1, faces differing by up to 90 degrees are judged the same; and if Th2 is set to 2, faces of any orientation are judged the same.

  If step ST22 is affirmative, it is determined whether or not the difference in inclination between face [j] and face [i] is less than or equal to a predetermined threshold Th3 (step ST23). As the threshold Th3, it is preferable to use a value such that faces whose inclinations differ by up to 30 degrees are judged the same.

  If step ST23 is positive, it is determined whether or not the difference in size between the face [j] and the face [i] is equal to or smaller than a predetermined threshold Th4 (step ST24).

  If step ST24 is affirmed, it is determined whether or not the center distance d1 between the face [j] and the face [i] is equal to or less than a threshold value Th1 (step ST25). Instead of the center distance, the square of the center distance may be used.

  If step ST25 is affirmative, the value of threshold Th1 is updated to the center-to-center distance d1 (step ST26), and it is further determined whether or not the center-to-center distance d1 is less than or equal to a predetermined threshold Th5 (step ST27).

  If step ST27 is affirmative, the link destination in the past direction of face [j] is set to face [i] of detection result [k], and the link destination in the future direction of face [i] is set to face [j] of the latest detection result. That is, the past-direction link information of face [j] is set to history difference k and corresponding face number i, and the future-direction link information of face [i] is set to history difference k and corresponding face number j (step ST28). Then, the link determination target in detection result [k] is set to the next face (i = i + 1: step ST29), and the process returns to step ST18.

  If any of steps ST20, ST22, ST23, ST24, ST25 and ST27 is negative, the link determination target in detection result [k] is likewise set to the next face (i = i + 1: step ST29), and the process returns to step ST18.

  Thus, for example, as shown in FIG. 8C, faces 0 and 2 of history[0], the latest detection result, are linked to faces 1 and 0 of history[1], and face 1 of history[0] is linked to face 2 of history[2].
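  The link determination of steps ST11 to ST29 can be sketched as follows, continuing the structures above. The threshold values are placeholders; because Th1 shrinks to each accepted center distance, the nearest candidate that passes all the other thresholds ends up holding the link, and, following the flowchart, all M past results are scanned.

    import math

    TH1_INIT, TH2, TH3, TH4, TH5 = float("inf"), 1, 1, 16, 40.0  # assumed values

    def link_latest(history):
        latest = history[0]
        for j, fj in enumerate(latest.faces):              # ST11-ST12, ST15
            for k in range(1, len(history)):               # ST13-ST14, ST19
                th1 = TH1_INIT                             # ST16
                for i, fi in enumerate(history[k].faces):  # ST17-ST18, ST29
                    if fi.future_link is not None:                    # ST20
                        continue
                    if abs(fj.orientation - fi.orientation) > TH2:    # ST22
                        continue
                    if abs(fj.inclination - fi.inclination) > TH3:    # ST23
                        continue
                    if abs(fj.size - fi.size) > TH4:                  # ST24
                        continue
                    d1 = math.dist(fj.center, fi.center)
                    if d1 > th1:                                      # ST25
                        continue
                    th1 = d1                                          # ST26
                    if d1 <= TH5:                                     # ST27
                        if fj.past_link is not None:  # release a farther earlier match
                            old = history[fj.past_link.history_diff]
                            old.faces[fj.past_link.face_number].future_link = None
                        fj.past_link = Link(k, i)                     # ST28
                        fi.future_link = Link(k, j)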

  Returning to FIG. 7, following step ST6, the faces to be treated as detected are determined (step ST7). This determination is made as follows. First, the determination unit 82 determines that a face detected in N of the M detection results included in the detection history is to be treated as detected (first condition). A face satisfying the first condition is determined to be treated as detected even if it is not detected in the latest detection result (second condition). Even if the first condition is not satisfied, a face detected in the latest detection result whose score is greater than or equal to a predetermined threshold Th10 is determined to be treated as detected (third condition).
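  A sketch of these three conditions (N and Th10 are configuration values chosen here for illustration; the helper counts how many stored results contain the face by walking its past-direction links, whose history differences are relative):

    def times_detected(history, face) -> int:
        # Count the detection results in which this face appears,
        # following past-direction links through the history.
        count, k, f = 1, 0, face
        while f.past_link is not None:
            k += f.past_link.history_diff
            f = history[k].faces[f.past_link.face_number]
            count += 1
        return count

    def treat_as_detected(history, face, in_latest: bool, N=3, TH10=0.5) -> bool:
        if times_detected(history, face) >= N:
            return True    # first condition; holds even if the face is absent
                           # from the latest result (second condition)
        return in_latest and face.score >= TH10   # third condition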

  Whether or not a face is determined to be treated as detected even when it is not detected in the latest detection result can be switched. When this behavior is enabled, the face may nevertheless be determined not to be treated as detected when the center of the face is in the peripheral part of the image.

  In addition, when determining that a face is to be treated as detected even though it is not detected in the latest detection result, there are cases where, even if the face center lay in the peripheral part of the image before the latest detection result, the motion vector of the subject indicates a high probability of movement toward the central part of the image. In such cases the detection frame may be controlled, for example, as shown in FIGS. 15A, 15B and 15C. The person drawn with a dotted line is the subject image in the most recent frame for which a detection result was obtained, and the person drawn with a solid line is the subject image that would exist at present, before a new detection result is obtained.

  In FIG. 15A, the frame is displayed using the latest detected face position as it is, among the faces obtained from the history, so the frame lags behind the subject. By detecting the face of the subject image that would currently exist, however, the frame is displayed following the subject, as indicated by the arrow in FIG. 15A.

  In FIG. 15B, the subject image that would currently exist is likewise estimated from the motion vector, and the frame is displayed at the estimated face position. The motion vector may be estimated from the transition of the center position of the face over the detection results up to the latest one, or it may be calculated from the transition of motion vectors derived by comparison with a specific whole frame image used as a reference at the latest detection result.
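  A sketch of the first of these two estimation approaches, linear extrapolation of the face center from its most recent linked detection (the names and the one-result look-ahead are assumptions):

    def estimate_current_center(history, face):
        # Extrapolate the face center one detection interval ahead from the
        # motion vector implied by the last two linked detections.
        if face.past_link is None:
            return face.center                  # no motion information available
        prev = history[face.past_link.history_diff].faces[face.past_link.face_number]
        n = face.past_link.history_diff         # results between the two detections
        vx = (face.center[0] - prev.center[0]) / n
        vy = (face.center[1] - prev.center[1]) / n
        return (face.center[0] + vx, face.center[1] + vy)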

  Further, in FIG. 15C, a large frame based on both the latest detected face position and the face position estimated from the motion vector is displayed for each face for which detection results were obtained from the detection history. In this way, even if the subject cannot be detected at the time of the latest detection, the user sees the face as if it were still being detected.

In the above-described embodiment, not only may the detection frame be displayed, but the position information of the detected face may also be recorded in the external recording medium 70, the frame memory 66, or the like.

  The recorded face position information can also be used, for example, for AF processing in the AF processing unit 62 and for AE processing or AWB processing in the AE/AWB processing unit 63.

  Here, if a face is determined to be treated as detected when it is detected in 3 of the 5 results in FIG. 8, then faces 0 and 2 of history[0], the latest detection result, and face 2 of history[1] are determined to be treated as detected, as shown in FIG. 8D. In FIG. 8D, the numbers of the faces determined to be treated as detected are circled.

  If the number of faces treated as detected is too large, the through image becomes difficult to see when, as described later, a detection frame is displayed around each face determined to be treated as detected. For this reason, in the present embodiment, the number of faces determined to be treated as detected is limited, and when the number exceeds the limit value, processing is performed to bring it within the limit value. This processing is described below. FIG. 11 is a flowchart of the processing performed when the number of faces to be treated as detected exceeds the limit value.

  The determination unit 82 determines whether or not the number of faces to be treated as detected exceeds the limit value (step ST31), and ends the processing when step ST31 is negative. If step ST31 is affirmative, the faces determined to be treated as detected are sorted in order of history (step ST32). Next, it is determined whether or not the digital camera 1 is set to give priority to size for faces having the same history (step ST33). If step ST33 is affirmative, faces having the same history are sorted in order of size (step ST34), and faces having the same history and the same size are then sorted in order of proximity to the center of the through image (step ST35).

  On the other hand, if step ST33 is negative, faces having the same history are sorted in order of proximity to the center (step ST36), and faces having the same history and the same distance from the center are then sorted in order of size (step ST37).

  Following step ST35 or step ST37, faces having the same history, the same size and the same distance from the center are sorted in order of score (step ST38). Then, the faces from the top of the sorted result down to the limit value are finally determined to be the faces treated as detected (step ST39), and the processing ends.
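  Because a stable sort with a composite key orders by each criterion in turn, the whole cascade of steps ST32 to ST39 reduces to one multi-key sort; a sketch (the helpers hist_len and dist_center are assumptions):

    def limit_detected_faces(faces, limit, size_priority, hist_len, dist_center):
        # hist_len(f): length of the face's detection history (ST32)
        # dist_center(f): distance of the face from the image center
        if len(faces) <= limit:                        # ST31
            return faces
        if size_priority:                              # ST33 affirmative: ST34, ST35
            key = lambda f: (-hist_len(f), -f.size, dist_center(f), -f.score)
        else:                                          # ST33 negative: ST36, ST37
            key = lambda f: (-hist_len(f), dist_center(f), -f.size, -f.score)
        return sorted(faces, key=key)[:limit]          # ST38, ST39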

  Returning to FIG. 7, following step ST7, the AF target face tracking processing is performed (step ST8), and the process returns. The AF target face tracking processing, performed by the tracking processing unit 84, prevents the AF target face from changing when the photographer slightly shifts the angle of view of the digital camera 1 after half-pressing the release button 19 to set the AF target face. As a result, even when there are a plurality of faces, the AF target face does not change, and the same face can be kept in focus.

  FIG. 12 is a flowchart of the AF target face tracking processing. The AF target face tracking processing is performed repeatedly at predetermined time intervals. This time interval may be the same as or different from the through image acquisition interval.

  First, the tracking processing unit 84 determines whether or not the digital camera 1 satisfies the above-described conditions for initializing the detection history (step ST41). If step ST41 is affirmative, check_result is set to 0 (step ST42). check_result represents the number of times the AF target face tracking processing has been performed.

  If step ST41 is negative, and following step ST42, it is determined whether or not the digital camera 1 is set not to perform the AF target face tracking processing (step ST43). If step ST43 is affirmative, that is, if the AF target face tracking processing is not to be performed, check_result is set to 0 (step ST44), the information of the face selected as the AF target is cleared (AF selected face information clear: step ST45), the AF priority face selection processing is performed (step ST46), the selected face is set as the AF target face of the current AF target tracking processing (step ST47), and the process proceeds to step ST55, increments check_result and returns.

  Here, since the information of the face selected as the AF target is added to the detection history, the AF target face can be identified by referring to the detection history. The AF priority face selection processing sets, among the plurality of detected faces, the priority order of the faces on which AF is to be performed.

  On the other hand, if step ST43 is negative, it is determined whether or not check_result is less than a predetermined number Th12 (step ST48). If step ST48 is negative, the process proceeds to step ST44 to redo the AF target tracking processing. If step ST48 is affirmative, the AF target information of the previous AF target face tracking processing is acquired with reference to the detection history (step ST49). The AF target information is the detection result in the detection history that includes the AF target face, together with the information of that face. Next, based on the acquired AF target information, it is determined whether or not a face linked in the future direction exists for the face that was the AF target in the previous AF target tracking processing (step ST50).

  Here, as shown in FIG. 8C, when face 1 in history[1] is the AF target, the corresponding face 0 exists in the future-direction detection result history[0], so step ST50 is affirmative. On the other hand, when face 2 in history[1] is the AF target, no corresponding face exists in the future-direction detection result history[0], so step ST50 is negative.

  If step ST50 is negative, it is determined whether or not the face that was the AF target in the previous AF target tracking processing has been determined to be treated as detected in the current AF target tracking processing (step ST51). If not, the process proceeds to step ST44 to redo the AF target tracking processing. If step ST51 is affirmative, the face that was the AF target in the previous AF target tracking processing is set as the AF target face of the current AF target tracking processing (step ST52), and the process proceeds to step ST55, increments check_result and returns.

  On the other hand, if step ST50 is affirmative, it is determined with reference to the detection history whether or not the face linked in the future direction from the face that was the AF target in the previous AF target tracking processing is treated as detected (step ST53). If step ST53 is negative, the process proceeds to step ST44 to redo the AF target tracking processing. If step ST53 is affirmative, the face linked in the future direction from the previous AF target face is set as the AF target face of the current AF target tracking processing (step ST54), and the process proceeds to step ST55, increments check_result and returns.

  Here, when face 1 in history[1] is the AF target as shown in FIG. 8C, the corresponding face 0 exists in the future-direction detection result history[0] and is treated as detected. Therefore, face 0 in history[0] becomes the AF target face of the current AF target tracking processing.

  In this way, the AF target face can be followed.
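  Condensed into code, the decision logic of FIG. 12 might look as follows; check_result, the retry bound Th12, the hook select_priority_face and the treated_as_detected flag are assumptions standing in for steps ST44 to ST47 and the detection-treatment determination of step ST7:

    TH12 = 10   # assumed bound on consecutive tracking passes (ST48)

    def select_priority_face(history):
        # Assumed stand-in for the AF priority face selection processing (ST46).
        return max(history[0].faces, key=lambda f: f.size, default=None)

    def track_af_target(state, history, tracking_enabled, init_condition):
        if init_condition:                                   # ST41
            state.check_result = 0                           # ST42
        if not tracking_enabled or state.check_result >= TH12:   # ST43, ST48
            state.check_result = 0                           # ST44, ST45
            state.af_face = select_priority_face(history)    # ST46, ST47
        else:
            prev = state.af_face                             # ST49
            if prev is None:
                state.check_result = 0                       # redo via ST44
                state.af_face = select_priority_face(history)
            elif prev.future_link is not None:               # ST50 affirmative
                cand = history[0].faces[prev.future_link.face_number]
                if cand.treated_as_detected:                 # ST53
                    state.af_face = cand                     # ST54
                else:
                    state.check_result = 0                   # redo via ST44
                    state.af_face = select_priority_face(history)
            elif prev.treated_as_detected:                   # ST50 negative, ST51
                state.af_face = prev                         # ST52
            else:
                state.check_result = 0                       # redo via ST44
                state.af_face = select_priority_face(history)
        state.check_result += 1                              # ST55
        return state.af_face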

  In the above embodiment, while the through image is displayed on the liquid crystal monitor 18, it is preferable to surround the AF target face with a detection frame and to make the detection frame follow the face when the face is followed by the AF target tracking processing. For example, as shown in FIGS. 13A to 13C, when the face F1 that is the AF target moves from left to right within the angle of view while the composition for shooting is being decided, it is preferable to make the detection frame A1 surrounding the face F1 follow the movement of the face F1.

  Further, not only the face set as the AF target but all the faces F1 to F4 determined to be treated as detected may be surrounded by detection frames A1 to A4, as shown in FIG. 14. In this case, when the detection history is updated, the sizes and positions of the detection frames are also updated.

  When a face that is the AF target or is determined to be treated as detected is surrounded by a detection frame in this way, the size and position of the face change as the person moves or the angle of view changes, and so the size and position of the detection frame change each time the detection history is updated. However, if the size and position of the detection frame are changed to follow every change in the size and position of the face in conjunction with the updates of the detection history, the through image becomes difficult to see.

  In such a case, it is preferable to smooth the face size and position with reference to the detection history, and display the detection frame according to the smoothed face size and position.

  This smoothing process determines the size and position of the detection frame using the detection history and the information of the AF target face or of the faces determined to be treated as detected. Specifically, for the AF target face or a face determined to be treated as detected, the past-direction link destinations are traced back through the detection results up to a predetermined number of times, and the unique information of the corresponding face (orientation, inclination, size and center position) in each detection result is acquired. From these corresponding faces, those whose unique information differs within a predetermined range are then extracted. If the number of extracted faces at this stage is greater than or equal to a predetermined number, the largest and smallest faces are removed and smoothing is performed using the remaining faces; if the number of extracted faces is less than the predetermined number, smoothing is performed using all the faces.
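  A sketch of this smoothing for the face size (the look-back depth, the similarity tolerance and the minimum sample count are assumed values; position smoothing is analogous):

    def smoothed_size(history, face, depth=5, min_count=4, tolerance=8):
        # Gather the sizes of the corresponding faces by tracing past-direction
        # links back through up to `depth` detection results.
        sizes, k, f = [face.size], 0, face
        while f.past_link is not None and len(sizes) < depth:
            k += f.past_link.history_diff
            f = history[k].faces[f.past_link.face_number]
            if abs(f.size - face.size) <= tolerance:   # unique-information gate
                sizes.append(f.size)
        if len(sizes) >= min_count:                    # enough samples: trim extremes
            sizes.remove(max(sizes))
            sizes.remove(min(sizes))
        return sum(sizes) / len(sizes)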

  By displaying the detection frame using the smoothed face size and position in this way, it is possible to prevent the size and position of the detection frame from fluctuating greatly each time the detection history is updated and making the through image difficult to see.

  In the above embodiment, both the size and position of the face are smoothed, but only one of the size and position of the face may be smoothed.

  In addition, the smoothed face size and position information may be stored in the external recording medium 70 together with the image data of the main image acquired by the main shooting, and when the image is reproduced in the reproduction mode, the detection frame may be displayed on the reproduced image using the smoothed face size and position information.

  In the above-described embodiment, the predetermined object is a face, but the present invention is not limited to this, and a subject other than the face may be used.

  Although the digital camera according to the embodiment of the present invention has been described above, a program that causes a computer to function as means corresponding to the face detection unit 80, the determination unit 82 and the tracking processing unit 84 and to perform the processing shown in FIGS. 7 and 9 to 12 is also one embodiment of the present invention. A computer-readable recording medium on which such a program is recorded is also one embodiment of the present invention.

FIG. 1 A perspective view centered on the back of a digital camera that is an embodiment of the photographing apparatus of the present invention
FIG. 2 A perspective view centered on the front of the digital camera that is an embodiment of the photographing apparatus of the present invention
FIG. 3 A schematic block diagram showing the configuration of the digital camera that is an embodiment of the photographing apparatus of the present invention
FIG. 4 A table showing the output values of the face orientation
FIG. 5 A table showing the output values of the face inclination
FIG. 6 A diagram showing the data structure of the detection history
FIG. 7 A flowchart showing the processing performed in the present embodiment
FIG. 8 A diagram showing the detection history
FIG. 9 A flowchart of the detection history update processing (part 1)
FIG. 10 A flowchart of the detection history update processing (part 2)
FIG. 11 A flowchart showing the processing when the number of faces to be treated as detected exceeds the limit value
FIG. 12 A flowchart of the AF target face tracking processing
FIG. 13 A diagram showing a through image during the AF target tracking processing
FIG. 14 A diagram showing a through image with detection frames attached to the faces determined to be treated as detected
FIG. 15 Diagrams showing detection frames displayed on the liquid crystal monitor using the detection history and motion vectors (parts 1 to 3)

Explanation of symbols

1 digital camera
18 liquid crystal monitor
24 flash
62 AF processing unit
63 AE/AWB processing unit
64 image processing unit
71 display control unit
73 flash control unit
75 CPU
80 face detection unit
82 determination unit
84 tracking processing unit

Claims (24)

  1. Photographing means for acquiring an image by photographing;
    Object detection means for detecting a predetermined object from the image photographed by the photographing means;
    Storage means for storing a detection history composed of past detection results and latest detection results of the predetermined object;
    determination means for referring to the detection history and determining whether or not the predetermined object is to be treated as detected in the most recently acquired image;
    wherein the determination means is means for determining that the predetermined object is to be treated as detected when the predetermined object has been detected N or more times (M ≧ N) in the M detection results including past results. A photographing apparatus characterized by the above.
  2. The photographing apparatus according to claim 1, wherein the determination means is means for determining that the predetermined object is to be treated as detected when the latest detection result satisfies a predetermined condition, even if the predetermined object has not been detected N or more times (M ≧ N) in the M detection results including past results.
  3. The photographing apparatus according to claim 1 or 2, wherein the determination means is means that does not determine that the predetermined object is to be treated as detected when the predetermined object has been detected N or more times (M ≧ N) in the M detection results including past results but is undetected in the latest detection result and the center of the predetermined object is in the peripheral part of the image.
  4. The photographing apparatus according to claim 3, wherein the determination means is means for determining that the predetermined object is to be treated as detected when, even though the center of the predetermined object is in the peripheral part of the image, it is determined from the motion vector of the predetermined object that there is a high probability that the predetermined object will move toward the central part of the image.
  5. Photographing means for acquiring an image by photographing;
    Object detection means for detecting a predetermined object from the image photographed by the photographing means;
    Storage means for storing a detection history composed of past detection results and latest detection results of the predetermined object;
    determination means for referring to the detection history and determining whether or not the predetermined object is to be treated as detected in the most recently acquired image; and
    selection holding means for holding the selection state of the predetermined object when the predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image. A photographing apparatus characterized by comprising the above means.
  6. A photographing apparatus comprising:
    photographing means for acquiring an image by photographing;
    object detection means for detecting a predetermined object from the image photographed by the photographing means;
    storage means for storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    determination means for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein the determination means smooths at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputs the smoothed information.
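    The smoothing of claim 6 stabilizes the displayed detection frame: averaging position and size over the corresponding entries in the detection history keeps the frame from jittering even when individual detections wobble. A simple moving average is one possible smoothing, assumed here for illustration; the (x, y, size) tuple format is likewise invented for this sketch.

    def smooth_track(entries):
        # entries: mutually corresponding (x, y, size) detections of one
        # object drawn from the detection history, oldest first.
        if not entries:
            return None
        count = len(entries)
        avg_x = sum(e[0] for e in entries) / count
        avg_y = sum(e[1] for e in entries) / count
        avg_size = sum(e[2] for e in entries) / count
        return (avg_x, avg_y, avg_size)  # steadier detection-frame placement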
  7. A photographing method comprising:
    acquiring an image by photographing means;
    detecting a predetermined object from the image photographed by the photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein, in the determination, the predetermined object is determined to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results.
  8. A photographing method comprising:
    acquiring an image by photographing means;
    detecting a predetermined object from the image photographed by the photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    maintaining the selection state of the predetermined object when a predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.
  9. A photographing method comprising:
    acquiring an image by photographing means;
    detecting a predetermined object from the image photographed by the photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    smoothing at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputting the smoothed information.
  10. A program causing a computer to execute a photographing method comprising:
    a procedure for acquiring an image by photographing means;
    a procedure for detecting a predetermined object from the image photographed by the photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein, in the determination procedure, the predetermined object is determined to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results.
  11. A program causing a computer to execute a photographing method comprising:
    a procedure for acquiring an image by photographing means;
    a procedure for detecting a predetermined object from the image photographed by the photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    a procedure for maintaining the selection state of the predetermined object when a predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.
  12. A program causing a computer to execute a photographing method comprising:
    a procedure for acquiring an image by photographing means;
    a procedure for detecting a predetermined object from the image photographed by the photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    a procedure for smoothing at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputting the smoothed information.
  13. An image processing apparatus comprising:
    object detection means for detecting a predetermined object from an image photographed by photographing means;
    storage means for storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    determination means for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein the determination means determines that the predetermined object is to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results.
  14. The image processing apparatus according to claim 13, wherein the determination means determines that the predetermined object is to be treated as detected when the latest detection result satisfies a predetermined condition, even if the predetermined object has not been detected N or more times (M ≥ N) among the M detection results including the past results.
  15. The image processing apparatus according to claim 13 or 14, wherein the determination means does not determine that the predetermined object is to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results but is undetected in the latest detection result and the center of the predetermined object is in a peripheral portion of the image.
  16. The image processing apparatus according to claim 15, wherein the determination means determines that the predetermined object is to be treated as detected when it is determined from a motion vector of the predetermined object that there is a high probability that the predetermined object will move toward the center of the image, even if the center of the predetermined object is in the peripheral portion of the image.
  17. An image processing apparatus comprising:
    object detection means for detecting a predetermined object from an image photographed by photographing means;
    storage means for storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    determination means for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    selection holding means for maintaining the selection state of the predetermined object when a predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.
  18. An image processing apparatus comprising:
    object detection means for detecting a predetermined object from an image photographed by photographing means;
    storage means for storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    determination means for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein the determination means smooths at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputs the smoothed information.
  19. An image processing method comprising:
    detecting a predetermined object from an image photographed by photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein, in the determination, the predetermined object is determined to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results.
  20. An image processing method comprising:
    detecting a predetermined object from an image photographed by photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    maintaining the selection state of the predetermined object when a predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.
  21. An image processing method comprising:
    detecting a predetermined object from an image photographed by photographing means;
    storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    smoothing at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputting the smoothed information.
  22. A program causing a computer to execute an image processing method comprising:
    a procedure for detecting a predetermined object from an image photographed by photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object; and
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image,
    wherein, in the determination procedure, the predetermined object is determined to be treated as detected when the predetermined object has been detected N or more times (M ≥ N) among the M detection results including the past results.
  23. A program causing a computer to execute an image processing method comprising:
    a procedure for detecting a predetermined object from an image photographed by photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    a procedure for maintaining the selection state of the predetermined object when a predetermined object treated as detected has been selected and a predetermined object corresponding to the selected predetermined object is treated as detected in a new image.
  24. A program causing a computer to execute an image processing method comprising:
    a procedure for detecting a predetermined object from an image photographed by photographing means;
    a procedure for storing a detection history composed of past detection results and a latest detection result of the predetermined object;
    a procedure for referring to the detection history to determine whether or not the predetermined object is to be treated as detected in the latest acquired image; and
    a procedure for smoothing at least one of the positions and the sizes of a plurality of mutually corresponding predetermined objects treated as detected that are included in the detection history, and outputting the smoothed information.
JP2007182975A 2006-07-25 2007-07-12 Imaging apparatus and method, program, image processing apparatus and method, and program Active JP4819001B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2006202427 2006-07-25
JP2007182975A JP4819001B2 (en) 2006-07-25 2007-07-12 Imaging apparatus and method, program, image processing apparatus and method, and program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007182975A JP4819001B2 (en) 2006-07-25 2007-07-12 Imaging apparatus and method, program, image processing apparatus and method, and program
US11/878,457 US7973833B2 (en) 2006-07-25 2007-07-24 System for and method of taking image and computer program

Publications (2)

Publication Number Publication Date
JP2008054295A JP2008054295A (en) 2008-03-06
JP4819001B2 true JP4819001B2 (en) 2011-11-16

Family

ID=39160157

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007182975A Active JP4819001B2 (en) 2006-07-25 2007-07-12 Imaging apparatus and method, program, image processing apparatus and method, and program

Country Status (3)

Country Link
US (2) US7973833B2 (en)
JP (1) JP4819001B2 (en)
CN (1) CN101136066B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4819001B2 (en) * 2006-07-25 2011-11-16 富士フイルム株式会社 Imaging apparatus and method, program, image processing apparatus and method, and program
JP4529094B2 (en) * 2007-12-28 2010-08-25 ソニー株式会社 Imaging apparatus, function control method, and function control program
JP2009223580A (en) * 2008-03-14 2009-10-01 Omron Corp Priority target determination device, electronic apparatus, priority target determination method, program, and recording medium
JP5224955B2 (en) * 2008-07-17 2013-07-03 キヤノン株式会社 Imaging device, imaging device control method, program, and recording medium
EP2148499B1 (en) 2008-07-25 2018-01-17 FUJIFILM Corporation Imaging apparatus and method
KR101542436B1 (en) * 2008-07-29 2015-08-06 후지필름 가부시키가이샤 Imaging apparatus and imaging method
JP5219697B2 (en) * 2008-08-25 2013-06-26 キヤノン株式会社 Image processing apparatus, imaging apparatus, control method for image processing apparatus, and program
JP2010068030A (en) * 2008-09-08 2010-03-25 Panasonic Corp Image processing apparatus, image processing method, image processing program and imaging apparatus
JP4702418B2 (en) * 2008-09-09 2011-06-15 カシオ計算機株式会社 Imaging apparatus, image region existence determination method and program
JP2010113130A (en) * 2008-11-06 2010-05-20 Nikon Corp Focus detecting device, imaging apparatus, focus detecting method
JP2010117487A (en) * 2008-11-12 2010-05-27 Fujinon Corp Autofocus system
JP5407380B2 (en) * 2009-02-03 2014-02-05 富士通モバイルコミュニケーションズ株式会社 Mobile terminal with imaging function
JP5434339B2 (en) * 2009-07-29 2014-03-05 ソニー株式会社 Imaging control apparatus, imaging system, imaging method, program
KR101116593B1 (en) 2009-09-16 2012-03-16 (주) 인텍플러스 System for measurement optical shape of an object and method for measurement thereof
JP2011077694A (en) * 2009-09-29 2011-04-14 Aiphone Co Ltd Intercom device
CN102209196B (en) * 2010-03-30 2016-08-03 株式会社尼康 Image processing apparatus and image evaluation method
JP5935581B2 (en) * 2012-08-03 2016-06-15 株式会社ニコン Imaging apparatus, image processing apparatus, image processing method, and image processing program
JP5888614B2 (en) * 2013-03-21 2016-03-22 カシオ計算機株式会社 Imaging device, video content generation method, and program
DE102013109348A1 (en) 2013-08-29 2015-03-05 C. & E. Fein Gmbh Accumulator with monitoring device
JP5720767B2 (en) * 2013-12-17 2015-05-20 株式会社ニコン Focus detection device
JP6024728B2 (en) * 2014-08-08 2016-11-16 カシオ計算機株式会社 Detection apparatus, detection method, and program

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3729933B2 (en) 1996-05-31 2005-12-21 松下電器産業株式会社 Automatic monitoring device
US6160903A (en) * 1998-04-24 2000-12-12 Dew Engineering And Development Limited Method of providing secure user access
JP3657463B2 (en) 1999-06-29 2005-06-08 シャープ株式会社 Motion recognition system and recording medium on which motion recognition program is recorded
US7298412B2 (en) * 2001-09-18 2007-11-20 Ricoh Company, Limited Image pickup device, automatic focusing method, automatic exposure method, electronic flash control method and computer program
JP4013684B2 (en) * 2002-07-23 2007-11-28 オムロン株式会社 Unauthorized registration prevention device in personal authentication system
KR100492153B1 (en) * 2002-10-30 2005-06-02 삼성전자주식회사 method and apparatus for analyzing a sample using fast Fourier transformation
JP4314016B2 (en) * 2002-11-01 2009-08-12 株式会社東芝 Person recognition device and traffic control device
JP2005215750A (en) * 2004-01-27 2005-08-11 Canon Inc Face detecting device and face detecting method
US8194173B2 (en) * 2004-07-16 2012-06-05 Nikon Corporation Auto-focusing electronic camera that focuses on a characterized portion of an object
JP4650669B2 (en) * 2004-11-04 2011-03-16 富士ゼロックス株式会社 Motion recognition device
US20060182433A1 (en) * 2005-02-15 2006-08-17 Nikon Corporation Electronic camera
JP4577113B2 (en) * 2005-06-22 2010-11-10 オムロン株式会社 Object determining device, imaging device, and monitoring device
JP4819001B2 (en) * 2006-07-25 2011-11-16 富士フイルム株式会社 Imaging apparatus and method, program, image processing apparatus and method, and program

Also Published As

Publication number Publication date
US8525903B2 (en) 2013-09-03
CN101136066A (en) 2008-03-05
US20080024621A1 (en) 2008-01-31
US7973833B2 (en) 2011-07-05
US20110228136A1 (en) 2011-09-22
CN101136066B (en) 2011-11-16
JP2008054295A (en) 2008-03-06

Similar Documents

Publication Publication Date Title
US9681040B2 (en) Face tracking for controlling imaging parameters
TWI549501B (en) An imaging device, and a control method thereof
US8346073B2 (en) Image taking apparatus
CN101854484B (en) Image selection device and method for selecting image
JP5791336B2 (en) Image processing apparatus and control method thereof
JP5234119B2 (en) Imaging apparatus, imaging processing method, and program
JP3541820B2 (en) Imaging device and imaging method
JP6106921B2 (en) Imaging apparatus, imaging method, and imaging program
JP4898532B2 (en) Image processing apparatus, photographing system, blink state detection method, blink state detection program, and recording medium on which the program is recorded
US7453506B2 (en) Digital camera having a specified portion preview section
JP4518131B2 (en) Imaging method and apparatus
US8462228B2 (en) Image processing method, apparatus and computer program product, and imaging apparatus, method and computer program product
JP4930302B2 (en) Imaging apparatus, control method thereof, and program
WO2018201809A1 (en) Double cameras-based image processing device and method
US8736689B2 (en) Imaging apparatus and image processing method
KR101510098B1 (en) Apparatus and method for blurring an image background in digital image processing device
JP4656331B2 (en) Imaging apparatus and imaging method
JP4674471B2 (en) Digital camera
JP4720810B2 (en) Image processing apparatus, imaging apparatus, image processing method, and image processing program
CN101621624B (en) Focus adjustment apparatus and control method therefor
US8786760B2 (en) Digital photographing apparatus and method using face recognition function
CN101567976B (en) Image capturing apparatus
JP4626493B2 (en) Image processing apparatus, image processing method, program for image processing method, and recording medium recording program for image processing method
EP1522952B1 (en) Digital camera
JP5423305B2 (en) Image evaluation apparatus and camera

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20100225

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110421

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110426

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110624

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110809

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110831

R150 Certificate of patent or registration of utility model

Ref document number: 4819001

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140909

Year of fee payment: 3

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
