US10547774B2 - Image processing device, image processing method, and program - Google Patents


Info

Publication number
US10547774B2
Authority
US
United States
Prior art keywords
subject
image
main subject
distance
determination
Prior art date
Legal status
Active
Application number
US14/655,621
Other languages
English (en)
Other versions
US20150350523A1 (en)
Inventor
Masaya Kinoshita
Current Assignee
Sony Corp
Original Assignee
Sony Corp
Priority date
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Assigned to SONY CORPORATION. Assignors: KINOSHITA, MASAYA
Publication of US20150350523A1 publication Critical patent/US20150350523A1/en
Application granted granted Critical
Publication of US10547774B2 publication Critical patent/US10547774B2/en


Classifications

    • H04N5/23212
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/14 Picture signal circuitry for video frequency region
    • H04N5/144 Movement detection
    • G06K9/00228
    • G06T5/002
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161 Detection; Localisation; Normalisation
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/61 Control of cameras or camera modules based on recognised objects
    • H04N23/611 Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/69 Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N5/23219
    • H04N5/23296
    • H04N5/2353
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00 Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28 Systems for automatic generation of focusing signals
    • G02B7/36 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/365 Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals by analysis of the spatial frequency components of the image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Definitions

  • the present disclosure relates to an image processing device, an image processing method, and a program for performing a process of determining a main subject in an image.
  • Patent Literature 1 JP 2011-166305A
  • Patent Literature 2 JP 2011-146826A
  • at present, a desired area that is subject to tracking or focusing, i.e., a “main subject,” is decided by the photographer directly selecting one candidate from “a plurality of candidate areas” obtained from various detectors.
  • a main subject may also be chosen through an action of selecting, on a touch panel, an arbitrary face from among a plurality of faces appearing in a through image displayed on a screen (a monitoring image of the subject displayed at times other than shutter manipulation).
  • a subject present in a predetermined area is set to be a main subject at a time designated by a user (half-pressing of a shutter or the like).
  • hand-held cameras have a problem in that the action of selecting a main subject is itself difficult in many of the use cases that require the function, which is stressful for photographers.
  • the present disclosure therefore aims to realize a technology for determining a target subject desired by a user such as a photographer and setting that subject as the main subject, without the user having to intentionally select the subject.
  • an image processing device includes a subject distance change determination unit configured to detect a temporal change of the distance from an imaging position to each subject present in an image and to determine a tendency toward approach or recession of each subject with respect to the imaging position on the basis of the detection, and a main subject determination unit configured to determine a main subject on the basis of the tendency toward approach or recession of each subject determined by the subject distance change determination unit.
  • similarly, an image processing method includes detecting a temporal change of the distance from an imaging position to each subject present in an image, determining a tendency toward approach or recession of each subject with respect to the imaging position on the basis of the detection, and determining a main subject on the basis of the determined tendency toward approach or recession of each subject.
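  • as a rough orientation, the cooperation of the two units described above can be sketched as follows. This is a minimal illustrative Python sketch under assumed names and data structures, not the implementation claimed in the patent.

```python
# Illustrative sketch only: all class names, method names, and the
# decision rule are assumptions, not taken from the patent.

class SubjectDistanceChangeDeterminationUnit:
    """Tracks per-subject distance over frames and labels each subject
    with a tendency toward approach or recession."""

    def __init__(self):
        self.history = {}  # subject_id -> list of distance estimates

    def observe(self, subject_id, distance):
        self.history.setdefault(subject_id, []).append(distance)

    def tendency(self, subject_id):
        d = self.history.get(subject_id, [])
        if len(d) < 2:
            return None
        net = d[-1] - d[0]  # net distance change over the observed span
        if net < 0:
            return "approach"
        if net > 0:
            return "recession"
        return None

class MainSubjectDeterminationUnit:
    def determine(self, change_unit, subject_ids, wanted="approach"):
        # Pick the first subject whose tendency matches the wanted one.
        for sid in subject_ids:
            if change_unit.tendency(sid) == wanted:
                return sid
        return None
```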
  • a subject that the user considers to be a principal or main figure can be estimated, and thus automatic main subject determination can be performed accordingly.
  • a main subject is automatically determined in a captured image, and thus it is not necessary for a user such as a photographer to perform an action of selecting the main subject. Accordingly, product value is enhanced: operability is improved when imaging is performed holding in a hand an imaging apparatus in which the image processing device of the present disclosure is mounted, stress on users is reduced, and various further functions enabled by automatic main subject determination can be realized.
  • FIG. 1 is a block diagram of a configuration example of an image processing device of an embodiment of the present disclosure.
  • FIG. 2 is a flowchart of a main subject determination process of the image processing device of the embodiment.
  • FIG. 10 is a flowchart of the recession determination and the main subject determination process of the third embodiment.
  • FIG. 15 is a flowchart of a time correspondence process in each block of the fourth embodiment.
  • FIG. 16 is an explanatory diagram of the time correspondence process in each block of the fourth embodiment.
  • FIG. 17 is a flowchart of a main subject setting process of the fourth embodiment.
  • FIG. 18 is an explanatory diagram of the main subject setting process of the fourth embodiment.
  • FIG. 1 shows a configuration example of an image processing device according to an embodiment.
  • the image processing device 1 including the main subject determination unit 2 and the subject distance change determination unit 3 described above can be realized by a central processing unit (CPU) or a digital signal processor (DSP) serving as an arithmetic processing device.
  • for example, the function of the main subject determination unit 2 is realized by a CPU or the like, and the function of the subject distance change determination unit 3 is realized by an image processing DSP or the like connected to the CPU, or as a process performed in cooperation between them.
  • in Step F3, the main subject determination unit 2 outputs the main subject information Dm, which is the determination result of the main subject, so that it can be transmitted to and received from an application program or the like.
  • the process example of FIG. 2B is a process of detecting an image size of the subject in the image data in each frame and obtaining its size change amount to determine the tendency toward approach or recession of each subject. That is, a temporal change of a distance of the subject is detected as a size change on an image.
  • for convenience of description, this idea is referred to as the “size determination scheme.”
  • the first to third embodiments described below are examples in which the idea of the size determination scheme is used.
  • the subject distance change determination unit 3 detects a candidate image which can become the main subject in the image data.
  • the candidate image is, for example, a human face image, a human body image, a dog image, or a cat image.
  • the subject distance change determination unit 3 sets one candidate image or a plurality of candidate images such as face images as the subjects present in an image through an image analysis process on the image data.
  • the subject distance change determination unit 3 determines the tendency toward approach or recession of each candidate image. For example, when the size change is observed over a span of some time and the subject approaches as the candidate image, the size of the subject gradually increases; that is, a size change amount in the expansion direction is observed on average, cumulatively, or continuously to some extent. In this case, the candidate image can be determined to have the tendency toward approach with respect to the imaging position.
  • conversely, when a size change amount in the contraction direction is observed, the candidate image can be determined to have the tendency toward recession with respect to the imaging position.
  • in Step F14, the main subject determination unit 2 selects a candidate image having the tendency toward approach or recession and sets the subject of that candidate image as the main subject (a sketch of this scheme follows below).
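  • the following is an illustrative Python sketch of the size determination scheme: a candidate whose frame size grows on average over the observed span is treated as approaching, and one whose size shrinks as receding. The relative-change threshold is an assumption for illustration.

```python
# Sketch of the "size determination scheme" (assumed threshold values).

def size_tendency(sizes, rel_thresh=0.02):
    """sizes: per-frame areas of one candidate image frame over time.
    Returns 'approach' if the size grows on average, 'recession' if it
    shrinks on average, else None."""
    if len(sizes) < 2:
        return None
    changes = [b - a for a, b in zip(sizes, sizes[1:])]
    mean_change = sum(changes) / len(changes)
    mean_size = sum(sizes) / len(sizes)
    if mean_change > rel_thresh * mean_size:
        return "approach"   # frame area expanding: subject coming closer
    if mean_change < -rel_thresh * mean_size:
        return "recession"  # frame area contracting: subject moving away
    return None

print(size_tendency([100, 110, 125, 140]))  # -> 'approach'
```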
  • Steps F20, F21, and F22 are performed as Step F1 of FIG. 2A, and the processes of Steps F23 and F24 are performed as Step F2 of FIG. 2A.
  • in Step F21, the subject distance change determination unit 3 calculates the distance change for each division region. For example, the difference in subject distance is calculated between each division region of the current processing target frame in continuous frame image data and the same division region in the frame image data one unit time earlier (for example, one frame before). The distance change of the subject in each division region is thus obtained.
  • in Step F22, the subject distance change determination unit 3 determines the tendency toward approach or recession of each division region. For example, when the distance change is observed over a span of some time and the subject of the division region approaches, the value of the distance gradually decreases, and a distance change amount in the direction in which the distance shortens is observed on average, cumulatively, or continuously to some extent. In this case, the subject of the division region can be determined to have the tendency toward approach with respect to the imaging position.
  • the main subject determination unit 2 then determines a subject image region containing the division regions that show the tendency toward approach or recession.
  • the division regions are divided from the image region and do not correspond to regions of a subject image in a one-to-one manner; for example, one subject image often extends over a plurality of division regions.
  • the main subject determination unit 2 determines the region range of one subject image under conditions such as a region (an adjacent region, or a region also adjacent to that adjacent region) that has substantially the same subject distance value as the division region having the tendency toward approach or recession and is continuous with that division region.
  • in Step F24, the subject image in the determined region range is determined to be the main subject (a sketch of the per-region distance differencing follows below).
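  • a minimal Python sketch of the per-region distance differencing follows. The distance values and grid layout are assumed example values; grouping of regions into a subject range is sketched later, with the fourth embodiment.

```python
# Sketch of the "block determination scheme": per-division-region subject
# distances are differenced against the previous frame; a negative change
# means the region's subject came closer. Values are assumptions.

def region_distance_changes(dist_now, dist_prev):
    """dist_now, dist_prev: per-region subject distances (meters) for the
    current frame and the frame one unit time earlier."""
    return [now - prev for now, prev in zip(dist_now, dist_prev)]

changes = region_distance_changes([9.5, 20.0, 4.8], [10.0, 20.0, 5.0])
approaching = [i for i, c in enumerate(changes) if c < 0]
print(approaching)  # -> [0, 2]: regions whose distance shortened
```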
  • when the main subject determination is performed as in each of the above-described examples, the subject intended as a target by the user can be estimated from the motion (approaching/receding) of the subject.
  • the main subject determination can be performed automatically without depending on a manual manipulation by the user.
  • the image processing device 1 in FIG. 1 can be mounted on any of various electronic apparatuses that perform an operation according to the setting of the main subject, thereby considerably improving operability for the user.
  • a configuration example of the imaging apparatus 10 according to the embodiment is shown in FIG. 3.
  • this configuration example of the imaging apparatus 10 is appropriate for the first embodiment.
  • configuration examples of the imaging apparatus 10 according to the second to fourth embodiments will be described in the respective sections.
  • the imaging apparatus 10 is assumed to be a so-called digital still camera or digital video camera, that is, an apparatus that captures and records still images or moving images and includes the image processing device described in the claims.
  • the imaging apparatus 10 shown in FIG. 3 has an optical system 11, an imager 12, an optical system drive unit 13, a sensor unit 14, a recording unit 15, a communication unit 16, a digital signal processing unit 20, a control unit 30, a user interface controller (hereinafter, “UI controller”) 32, and a user interface 33.
  • the optical system 11 has lenses such as a cover lens, a zoom lens, and a focus lens, and an aperture mechanism. Light from a subject is collected on the imager 12 by this optical system 11.
  • the imager 12 has, for example, a CCD (Charge Coupled Device) type or CMOS (Complementary Metal Oxide Semiconductor) type image sensor.
  • the imager 12, for example, performs a CDS (Correlated Double Sampling) process, an AGC (Automatic Gain Control) process, and the like on the electrical signal obtained through photoelectric conversion in the image sensor, and further performs an A-D (Analog-Digital) conversion process. The imager 12 then outputs the imaging signal as digital data to the digital signal processing unit 20 in the subsequent stage.
  • the optical system drive unit 13 drives the focus lens of the optical system 11 and performs a focus operation under the control of the control unit 30. Furthermore, the optical system drive unit 13 drives the aperture mechanism of the optical system 11 and performs exposure adjustment under the control of the control unit 30. Moreover, the optical system drive unit 13 drives the zoom lens of the optical system 11 and performs a zoom operation under the control of the control unit 30.
  • the digital signal processing unit 20 is configured as an image processor, for example, by a DSP or the like.
  • the digital signal processing unit 20 performs various types of signal processes for a digital signal (captured image signal) from the imager 12 .
  • the digital signal processing unit 20 includes a pre-processing unit 21 , a synchronization unit 22 , a YC generation unit 23 , a resolution conversion unit 24 , a codec unit 25 , and a candidate detection unit 27 .
  • the pre-processing unit 21 performs a clamping process of clamping a black level of R, G, and B to a predetermined level, or a correction process among color channels of R, G, and B with respect to the captured image signal from the imager 12 .
  • the synchronization unit 22 performs a demosaicing process such that image data for each pixel has color components of all of R, G, and B.
  • the candidate detection unit 27 performs an image analysis process in units of frames (or per intermittent frame) for a captured image signal (luminance signal and color signal) obtained by, for example, the YC generation unit 23 , and then extracts a candidate image.
  • face image detection, human body detection, and the like are performed for image data continuously input on a time axis, and then images serving as candidates for a main subject are extracted.
  • detecting a moving body and setting the moving body to be a candidate image using a technique of moving body detection based on a frame difference can also be considered, and a technique of extracting an area of interest, called saliency, may be used.
  • in the candidate detection unit 27, for example, a face image is detected, and the area in which the face image is present is extracted as a candidate image frame.
  • a functional configuration in which the candidate detection unit 27 is implemented in the digital signal processing unit 20 is set in the example of FIG. 3 , but this is an example, and the control unit 30 may execute the process of the candidate detection unit 27 .
  • the RAM serving as a work area when the CPU performs various kinds of data processes is used for temporarily storing data, programs, and the like.
  • the ROM and the flash memory are used for storing an OS (Operating System) necessary for control of each unit by the CPU, content files such as image files, application programs for various operations, firmware, and the like.
  • the control unit 30 described above controls operations of necessary units relating to instruction of various signal processes in the digital signal processing unit 20 , imaging operations and recording operations according to a user manipulation, a reproducing operation of recorded image files, camera operations such as zooming, focusing, and exposure adjustment, user interface operations, and the like.
  • the distance change calculation unit 30b calculates the image size of each candidate image set by the candidate detection unit 27, calculates the change in the image size at each unit time, and determines the tendency toward approach or recession from the calculation result.
  • the main subject determination unit 30a performs a process of setting the main subject among the candidate images based on the determination result of the distance change calculation unit 30b.
  • This display unit 34 includes the display device and a display driver that allows the display device to perform display.
  • the display driver allows various types of display to be performed on the display device based on the instruction of the control unit 30 .
  • the display driver reproduces and displays a still image or a dynamic image captured and recorded in a recording medium, or displays a through image (subject monitoring image) as a dynamic image based on captured image data of each frame, which is captured during release (a shutter manipulation) standby, on a screen of the display device.
  • the display driver allows various manipulation menus, icons, messages and the like, that is, a GUI (Graphical User Interface), to be displayed on the screen.
  • An operation of the display unit 34 of the user interface 33 and the like is controlled by the UI controller 32 according to instructions of the control unit 30 .
  • information of manipulation by the manipulation unit 35 is transmitted to the control unit 30 by the UI controller 32 .
  • the recording unit 15 includes, for example, a non-volatile memory, and serves as a storage area for storing image files (content files) such as still image data or dynamic image data, attribute information of the image files, thumbnail images and the like.
  • the communication unit 16 performs data communication or network communication with an external device in a wired or wireless manner.
  • a luminance sensor that detects external luminance for exposure adjustment and the like and a distance measuring sensor that measures subject distances may be provided.
  • the various sensors of the sensor unit 14 each transmit detected information to the control unit 30 .
  • the control unit 30 can perform various kinds of control using the information detected by the sensor unit 14 .
  • the configuration of the image processing device 1 in FIG. 1 corresponding to the main subject determination unit 2 is implemented as the main subject determination unit 30a in the control unit 30 of the imaging apparatus 10 by software.
  • the configuration corresponding to the subject distance change determination unit 3 is implemented as the distance change calculation unit 30b in the candidate detection unit 27 and the control unit 30 of the imaging apparatus 10 by software.
  • the control unit 30 controls the execution of an operation as an image processing method described in the claims by executing a process based on a program described in the claims.
  • Main subject determination is executed when, for example, a user (photographer) aims at a shutter timing (release timing), but the control unit 30 can perform the following process after a main subject is automatically determined.
  • a main subject set in each captured frame is tracked.
  • a main subject is specified on a through image display for the user, and provided for adjusting an angle of view performed by the user (for example, for decision of a subject in a state in which a camera is held in a hand).
  • Auto focus is controlled for a main subject.
  • focus is adjusted to track the main subject even when the main subject moves around.
  • Auto zoom is controlled with respect to a main subject.
  • the zoom lens is automatically driven so that the main subject always appears in the captured image at a predetermined size or greater.
  • an angle of view may be set to be adjusted using zoom according to a change in a distance to the main subject.
  • Main subject determination may be set to trigger a start of dynamic image capturing. For example, dynamic image capturing and recording are started according to decision of a main subject.
  • Image processes including image quality adjustment, noise reduction, skin color adjustment, and the like are performed only on the area of a main subject in each captured frame.
  • a process of cropping, enlarging, or the like of a partial area within a frame in which a main subject is included can be performed.
  • cutting of image peripheral portions of captured image data or the like can be performed so that a main subject is disposed at the center of the image, and composition adjustment can be performed.
  • the main subject determination process may be performed at an appropriate timing; for example, it may be performed again when tracking of a determined main subject is lost, or it may be set to start through user manipulation.
  • the imaging apparatus 10 that is carried and used by a user, such as a digital still camera, a camera included in a mobile telephone, or the like used by general users, has the display unit 34 of a small size, and thus it is difficult for the user to perform an accurate manipulation of designating a main subject on a screen.
  • the problem of erroneous designation is resolved by performing automatic determination as described in the present embodiment.
  • FIG. 4A schematically shows a candidate image frame extraction operation performed by the candidate detection unit 27 .
  • FIG. 4A shows frames FR1, FR2, FR3, … of captured image signals input to the digital signal processing unit 20 through an operation of the optical system 11 and the imager 12 of the imaging apparatus 10.
  • the candidate detection unit 27 performs detection of candidate images from the continuous frames (or intermittent frames) sequentially input as above.
  • the control unit 30 calculates a frame area (h × w) as the size of each candidate image frame whenever the candidate image frame information for each frame is taken, and further detects the difference from the size of this candidate image in the previous frame as the change amount of the frame area. The control unit 30 then determines whether each candidate image has the tendency toward approach by observing the change in this difference on a time axis.
  • a determination result of each area change amount is first obtained using a distance determination threshold value Thd for determining whether the subject is approaching based on the area change amount.
  • in the candidate image frame E1, the value of the area change amount is normally high, and the E1 determination result is continuously set to “1.”
  • the E2 determination result is set to “1” only during a certain period, since the value of the area change amount is high only in some cases.
  • the subject that is approaching on average, cumulatively, or continuously for some time is determined to be a subject with the tendency toward approach.
  • the subject for which the period in which the determination result in FIG. 5C is “1” is long is determined to be a subject with the tendency toward approach. For example, the tendency toward approach can be determined by counting the length of a continuous period, a cumulative period, or the like in which the determination result is “1.”
  • the candidate image frame E1 can be determined to have the tendency toward approach during this determination period, since its determination result is “1” and the period is long.
  • the candidate image frame E2 can be said to be, for example, a subject temporarily approaching or receding.
  • the candidate image frame E3 is a subject remaining at a relatively distant position.
  • a period from determination start to determination end differs according to the specific process example.
  • in one example, when the counted period length in which the determination result is “1” reaches a predetermined time, the candidate image frame is determined to have the tendency toward approach.
  • in that case, the earlier such a subject is found, the more the timing of the determination end is quickened; that is, the determination period length changes depending on the situation of the determination process.
  • in another example, the period from the determination start to the determination end is set as a fixed period length.
  • a process to be described below is a process performed by the functions of the distance change calculation unit 30 b and the main subject determination unit 30 a of the control unit 30 .
  • the count value Cnt(n) is the value of a counter for determining a time length in regard to a determination result obtained by comparing the above-described area change amount and the distance determination threshold value Thd.
  • a frame area Area(n) to be described in FIG. 6 similarly indicates the frame area of each candidate image frame.
  • a process for the frame area Area(n) is used to mean a process on each of the frame areas Area1, Area2, Area3, … of the candidate image frames E1, E2, E3, …, for example.
  • an area change amount Diff(n) similarly indicates the area change amount of each candidate image frame.
  • the process for the area change amount Diff(n) is used to mean a process on each of the area change amounts Diff1, Diff2, Diff3, … of the candidate image frames E1, E2, E3, …, for example.
  • a candidate image frame E(n) indicates each of the candidate image frames E1, E2, E3, …; it is preferable that E(n) be distinguished for each subject over a plurality of frames.
  • for example, when the candidate detection unit 27 extracts faces, the face image portion of person A is set to be the candidate image frame E1, the face image portion of person B to be the candidate image frame E2, and the face image portion of person C to be the candidate image frame E3, in common across the frames.
  • in Step F102, the control unit 30 calculates the frame area Area(n) of each candidate image frame E(n).
  • here, the frame is assumed to be rectangular.
  • the candidate image frame E(n) need not be rectangular, however, and can also be considered to be circular, elliptical, or amorphous.
  • in such cases, the frame area Area(n) may be taken as the number of pixels contained in the candidate image frame E(n).
  • in Step F103, the control unit 30 obtains the area change amount Diff(n) of each candidate image frame E(n), i.e., Diff(n) = Area(n) − Area(n)pre.
  • Area(n)pre is the frame area Area(n) of the previous frame of the candidate image frame E(n). For example, the frame area Area(n) obtained in Step F102 when the image one frame before was the target serves as Area(n)pre at the processing time point of the current frame.
  • Step F104 may be unnecessary in some cases.
  • in Steps F105, F106, and F107, the control unit 30 confirms whether the area change amount Diff(n) of each candidate image frame E(n) indicates approach.
  • in Step F105, the control unit 30 compares the area change amount Diff(n) of each candidate image frame E(n) to the distance determination threshold value Thd.
  • the determination flag Flg(n) corresponds to the determination result of “1” or “0” described in FIG. 5C.
  • in Steps F108, F109, and F110, the control unit 30 sets the offset value OFS(n) for the counting process depending on whether the determination flag Flg(n) is “1” or “0.”
  • the process from Step F101 onward is then performed, as described above, based on the candidate image frame information input for a subsequent frame (the whole loop is sketched below).
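  • a condensed Python sketch of this per-frame loop (Steps F102 to F114) follows, using the variable names from the text: frame area Area(n), area change amount Diff(n), distance determination threshold Thd, determination flag Flg(n), offset OFS(n), count value Cnt(n), and time threshold Tht. The numeric thresholds and the count-down on non-approach frames are assumptions; the text leaves the exact offset values open.

```python
# Sketch of the approach-determination loop of FIG. 6 (assumed constants).

ALPHA, BETA = 1, 1   # count-up / count-down offsets (assumed)
THD = 50             # distance determination threshold Thd (assumed, area units)
THT = 30             # time threshold Tht (assumed, in processed frames)

def process_frame(frames, state):
    """frames: {n: (w, h)} candidate image frames of the current frame.
    state: {n: {"area_pre": float | None, "cnt": int}} carried across
    frames. Returns the id of the first frame whose counter reaches Tht
    (the main subject), or None while determination continues."""
    for n, (w, h) in frames.items():
        s = state.setdefault(n, {"area_pre": None, "cnt": 0})
        area = w * h                              # Step F102: Area(n)
        if s["area_pre"] is not None:
            diff = area - s["area_pre"]           # Step F103: Diff(n)
            flg = 1 if diff > THD else 0          # Step F105: Flg(n)
            ofs = ALPHA if flg else -BETA         # Steps F108-F110: OFS(n)
            s["cnt"] = max(0, s["cnt"] + ofs)     # Step F111: Cnt(n)
        s["area_pre"] = area
        if s["cnt"] >= THT:                       # Step F112
            return n                              # Step F114: main subject
    return None
```

  • with this structure, a subject whose frame area keeps expanding accumulates count faster than one that only approaches briefly, matching the behavior of the candidate image frames E1 and E2 described above.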
  • the user may be allowed to select the main subject through a manipulation of touching the main subject on the screen of the display unit 34 or a manipulation of half pressing the shutter button according to a predetermined position of a subject on the screen, in parallel to the automatic main subject determination of this example.
  • when the user performs such a designation manipulation during execution of the process of FIG. 6, it is desirable to give priority to the user's manipulation.
  • the main subject determination performed through the approach determination takes place over a certain time length. Therefore, until the process has handled the candidate image frame information for some time (some number of frames), the determination does not end in Step F116; as described above, the process returns to Step F101 and is repeated.
  • detection for a certain candidate image frame E1 may be discontinuous, but there can be a situation in which approach is detected in a plurality of frames. Then, as time passes there are many opportunities for the count value Cnt1 of the candidate image frame E1 to be incremented in Step F111, so the count value Cnt1 advances faster than the count values Cnt2 and Cnt3.
  • then, the count value Cnt1 first reaches the time threshold value Tht.
  • in this case, the control unit 30 causes the process to proceed from Step F112 to Step F114.
  • in Step F115, the main subject information is output so that it can be transmitted to and received from, for example, an application program or a control program that uses the main subject information.
  • the tendency toward approach is thus determined over a plurality of frames for the extracted candidate images.
  • a subject at which a photographer holding a camera is considered, with high certainty, to aim as a target has the tendency toward approach in many cases.
  • for example, a subject such as a child moving toward the photographer is observed to approach in many cases.
  • the process therefore suits the photographer's intention, considerably improving operability at the time of imaging.
  • Even a user unaccustomed to an imaging manipulation can capture a still image or a moving image with high quality, for example, when focus control or exposure control is automatically performed on the main subject through the main subject determination.
  • the count value Cnt(n) corresponds to a cumulative count of the number of times the approach is detected.
  • the count value Cnt(n) can also be regarded as a value reflecting whether the approach detection occurs on average.
  • that is, the approach determination of Step F112 is a determination that the subject is approaching cumulatively or on average.
  • a stopping or temporarily receding subject can be prevented from being determined to be the main subject. Accordingly, this determination is appropriate for the main subject determination to be performed prudently. Conversely, when the determination of the main subject is desired to be performed in a short time or the main subject is desired to be set as easily as possible, the subject that is approaching on average or cumulatively is appropriately determined to have the tendency toward approach.
  • the configuration of an imaging apparatus 10 is shown in FIG. 7.
  • the same reference numerals are given to the same portions as those of FIG. 3 and the description thereof will be omitted.
  • the attribute identification unit 28 can be functionally configured to be executed by the digital signal processing unit 20.
  • An example in which the attribute identification unit 28 is realized as a processing function in the control unit 30 can also be considered.
  • the control unit 30 performs the processes of Steps F101 to F116 at the timing of every frame.
  • in Step F101A, the control unit 30 takes the candidate image frame information of a certain frame from the candidate detection unit 27.
  • for example, an x value and a y value of two-dimensional (x-y) coordinate values on the image data, and the width w and height h of the candidate image frame, are acquired as position and size information.
  • the control unit 30 further acquires attribute information AT(n) of each candidate image frame E(n) from the attribute identification unit 28 .
  • the attribute information is, for example, identification information for discriminating an adult from a child or for discriminating a male from a female.
  • in Step F112A, for each candidate image frame E(n), the count value Cnt(n) is compared to the time threshold value Tht(ATn), and whether the subject has the tendency toward approach is thereby determined. The candidate image having the tendency toward approach is then determined to be the main subject in Step F114.
  • the ease of determining a candidate image to be the main subject thus differs according to the attribute information.
  • since Tht(ATn) is a small value in the case of a child, as described above, it is easy to determine the child to be the main subject.
  • since Tht(ATn) is smaller in the case of a female than in the case of a male, it is easier to determine the female to be the main subject.
  • the user may set a preference order according to various use cases. For example, when a child is imaged, setting is performed so that the child is preferred. When a male is imaged, setting is performed so that the male is preferred.
  • when the user can select the time threshold value according to the attribute, the main subject determination is realized quickly and with high accuracy according to the user's imaging purpose.
  • a configuration in which the time threshold value is lowered in such a case, so that the corresponding candidate image is easily determined to be the main subject, can also be considered.
  • for example, the faces of the user's children, family, or the like are registered in advance and their characteristic data obtained in advance.
  • in this case, attribute information is generated as information indicating a close relative, and the control unit 30 sets the time threshold value Tht(ATn) to a small value.
  • the attribute information is not limited to persons; attribute information of an animal such as a dog or a cat, or of a type of animal, may be generated and the time threshold value changed accordingly (see the sketch below).
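  • one way to realize the attribute-dependent threshold is a simple lookup table, as in the following Python sketch. The attribute keys and numeric values are assumptions; the text only states that Tht(ATn) is made smaller for preferred attributes such as a child or registered faces.

```python
# Sketch of the attribute-dependent time threshold Tht(ATn) (assumed values).

THT_BY_ATTRIBUTE = {
    "child": 20,            # preferred: determined as main subject sooner
    "female": 25,
    "male": 30,
    "registered_face": 15,  # e.g., family faces registered in advance
    "dog": 35,
    "cat": 35,
}
DEFAULT_THT = 30

def tht_for(attribute):
    return THT_BY_ATTRIBUTE.get(attribute, DEFAULT_THT)

# In the loop sketched earlier, Step F112A would then compare
# s["cnt"] >= tht_for(attributes[n]) instead of a single fixed Tht.
```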
  • next, a main subject determination process according to the third embodiment will be described.
  • this embodiment is an example in which a receding subject is determined to be the main subject with the idea of the size determination scheme described in FIG. 2B .
  • FIGS. 9A, 9B, and 9C show the change in the calculated frame area, the area change amount, and the determination result using the distance determination threshold Thd when the candidate image frames E1, E2, and E3 are assumed to be continuously present in the frames (FR1, FR2, …) during a certain period.
  • when the candidate image frame E1 is focused on, the frame area gradually decreases.
  • here, the area change amount is taken to be a large value when the frame area decreases.
  • the area change amount according to the frame area change in FIG. 9A is shown in FIG. 9B.
  • the determination results for the candidate image frames E1, E2, and E3 are shown in FIG. 9C.
  • the E3 determination result is continuously “0,” since its area change amount is normally a low value.
  • the candidate image frame E1 is determined to be a subject with the tendency toward recession, since the subject is receding on average, cumulatively, or continuously for some time.
  • Steps F104 to F116 are performed as in FIG. 6.
  • in Step F112, the count value Cnt(n) of the candidate image frame E(n) of a subject that is receding on average or cumulatively reaches the time threshold Tht at a certain time point.
  • in that case, the subject of the candidate image frame E(n) is deemed to have the tendency toward recession and is selected as the main subject in Step F114.
  • Even a user unaccustomed to an imaging manipulation can capture a still image or a moving image with high quality, for example, when focus control or exposure control is automatically performed on the main subject through the main subject determination.
  • the subject that is receding on average or cumulatively is appropriately determined to have the tendency toward recession.
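  • the recession variant can reuse the loop sketched for the first embodiment with only the sign of the area change flipped, as in this Python sketch; this formulation, in which a shrinking frame area yields a large positive change amount, is one plausible reading of FIG. 9B.

```python
# Sketch of the recession determination (third embodiment, assumed form).

def recession_flag(area, area_pre, thd):
    """Returns 1 when the candidate frame shrank by more than the
    distance determination threshold, i.e., the subject receded."""
    diff = area_pre - area   # positive when the frame area decreases
    return 1 if diff > thd else 0
```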
  • Another condition may be considered to be added as an AND condition to the determination of the tendency toward recession in order to determine a subject as the main subject.
  • the fact that a subject distance is equal to or greater than a predetermined distance
  • the fact that the image is of a specific type
  • the user may be configured to select the additional conditions.
  • the idea of the second embodiment may also be applied to the determination of the tendency toward recession, with a different time threshold value Tht(ATn) used according to the attribute information.
  • next, a main subject determination process according to the fourth embodiment will be described.
  • This embodiment is an example in which an approaching subject is determined to be the main subject with the idea of the block determination scheme described in FIG. 2C .
  • the distance calculation unit 29 calculates the subject distance for each of the division regions (blocks) divided from a captured image, using values detected by the distance sensor 17.
  • the phase difference sensor scheme obtains the distance of a target subject from the separation (in number of pixels) between the pixel positions at which the same target subject is detected in the captured images of imagers disposed on the right and left sides.
  • the Time-of-Flight scheme is a scheme in which the distance sensor 17 emits infrared light and receives its reflection, and obtains the distance from the time taken for the emitted infrared light to be reflected from the target subject and received, together with the speed of the infrared light (a worked example follows below).
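  • dimensionally, the Time-of-Flight relation amounts to multiplying the round-trip time of the light by its speed and halving the result for the out-and-back path. A small worked example in Python (the sample time value is an assumption):

```python
# Worked Time-of-Flight example: distance = speed x round-trip time / 2.

C = 299_792_458.0  # speed of light in m/s (infrared light in air: ~c)

def tof_distance(round_trip_seconds):
    return C * round_trip_seconds / 2.0

print(tof_distance(66.7e-9))  # ~10 m for a 66.7 ns round trip
```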
  • FIG. 12 shows a distance detection operation for each block.
  • FIG. 12A shows frames FR1, FR2, FR3, … of captured image signals input to the digital signal processing unit 20 through an operation of the optical system 11 and the imager 12 of the imaging apparatus 10.
  • the distance sensor 17 operates to measure the distance of a subject and the detected information is input to the distance calculation unit 29 .
  • the functions of the main subject determination unit 30 a and the distance change calculation unit 30 b are provided in the control unit 30 .
  • the configuration portion of the image processing device 1 described in FIG. 1 is as follows.
  • a configuration corresponding to the main subject determination unit 2 of the image processing device 1 in FIG. 1 is implemented as the main subject determination unit 30 a on the control unit 30 of the imaging apparatus 10 by software.
  • a configuration corresponding to the subject distance change determination unit 3 is implemented as the distance change calculation unit 30 b and the distance calculation unit 29 by hardware or software.
  • the distance calculation unit 29 is functionally configured to be executed by the digital signal processing unit 20 , but this is merely an example.
  • the function of the distance calculation unit 29 can be considered to be implemented on the control unit 30 by software.
  • the distance calculation unit 29 obtains the subject distance for each of the blocks BK (BK1, BK2, …, BK(M)).
  • the subject distances of the blocks BK are exemplified (numerical values in meters, or infinity ∞). For example, 20 m is exemplified for the block BK4, and infinity ∞ for the block BK3.
  • the distance calculation unit 29 obtains the subject distance for each block BK in this way in each frame, and then transmits or receives distance information of each block BK to and from the control unit 30 .
  • FIG. 13A shows a change in the subject distance calculated in each frame in regard to each block BK.
  • the blocks BK1, BK2, and BK(x) are exemplified.
  • the distance of the subject pictured in the block BK(x) gradually shortens.
  • the distances of the subjects pictured in the blocks BK1 and BK2 change slightly, but do not change considerably on average.
  • a period from the start of the determination to the end of the determination differs according to a specific process example.
  • a period length in which the determination result is “1” is counted and a subject of the block BK is determined to have the tendency toward approach when the period length reaches a predetermined time.
  • in that case, the timing of the end of the determination is quickened; that is, the determination period length changes depending on the situation of the determination process.
  • in another example, the period from the determination start to the determination end is set as a fixed period length.
  • in Step F201, the control unit 30 takes pieces of distance information Db1 to Db(M) for the respective blocks BK1 to BK(M) in a certain frame from the distance calculation unit 29.
  • the information is, for example, the distance values shown in the lower part of FIG. 12B.
  • the control unit 30 performs the time matching process on each block BK in Step F202 and obtains the distance change amount bDiff(m) for each block BK in Step F203. That is, the distance change amount bDiff(m) is the difference between a distance value of the current frame and a distance value of the previous frame.
  • the time matching process of Step F202 will be described with reference to FIGS. 15 and 16.
  • as time passes, the same subjects are not necessarily located at the same blocks on the captured image data of each frame.
  • for example, the subject located at the block BK4 in the immediately previous frame may be located at the block BK5 in the current frame.
  • if a distance difference of the block BK5 from the previous frame is simply output, this difference is not the distance change amount of the subject. Accordingly, matching (tracking) of a subject with a block is performed in Step F202, as follows (see also the sketch below).
  • in Step F231, the control unit 30 calculates the absolute distance differences bDiffX1 to bDiffX9 between the target block BK(m) of the current frame FRc and the respective corresponding blocks BKx1 to BKx9 of the previous frame FRpre.
  • the distances Db(BKx1) to Db(BKx9) are the distance values of the nine blocks BKx1 to BKx9 and are the above-described values “10,” “10,” “9,” “10,” “7,” “6,” “9,” “7,” and “5” in the example of FIG. 16.
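  • the matching step can be sketched as a search over the nine spatially corresponding blocks, as in the following Python sketch; the grid representation and helper names are assumptions.

```python
# Sketch of the time matching process (Step F202, FIGS. 15-16): the block
# of the previous frame whose distance value is closest to the target
# block's current distance, among the nine corresponding blocks, is taken
# as the same subject's previous position.

def match_previous_block(db_now, prev_grid, row, col):
    """db_now: distance of the target block BK(m) in the current frame.
    prev_grid: 2-D list of per-block distances of the previous frame.
    Returns the previous-frame distance of the best-matching block."""
    best_diff, best_db = None, None
    rows, cols = len(prev_grid), len(prev_grid[0])
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            r, c = row + dr, col + dc
            if 0 <= r < rows and 0 <= c < cols:
                diff = abs(db_now - prev_grid[r][c])  # bDiffX1..bDiffX9
                if best_diff is None or diff < best_diff:
                    best_diff, best_db = diff, prev_grid[r][c]
    return best_db

# The distance change amount of the block is then:
# bDiff(m) = db_now - match_previous_block(db_now, prev_grid, row, col)
```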
  • whether the distance change amount bDiff(m) is a change amount indicating approach of a predetermined amount or more is determined using the distance determination threshold value Thd, as described in FIG. 13B.
  • the determination flag Flg(m) corresponds to the determination result of “1” or “0” described in FIG. 13C.
  • in Steps F208, F209, and F210, the control unit 30 sets the offset value OFS(m) for the counting process depending on whether the determination flag Flg(m) is “1” or “0.”
  • in Step F211, the control unit 30 updates the count value bCNT(m) of the counter that counts the time length in which approach is observed.
  • the count value bCNT(m) is incremented when approach is detected. The count value is therefore a value corresponding to the length of the period in which the subject of the block BK(m) is detected to approach.
  • the count value bCNT(m) is a cumulative value of the approach detection; therefore, when approach is detected many times, the count value bCNT(m) increases.
  • the count value bCNT(m) also increases according to average approach.
  • in Step F212, the control unit 30 determines whether each block BK(m) has the tendency toward approach. Specifically, the control unit 30 determines whether the count value bCNT(m), indicating the time length of the approach detection, has become equal to or greater than the predetermined time threshold value Tht.
  • the process from Step F201 onward is then performed, as described above, based on the distance information Db(m) of each block BK(m) input for a subsequent frame.
  • this step is the same as Step F116 of FIG. 6.
  • in some cases, the process may end at Step F217 (an interrupted end).
  • the main subject determination performed through the approach determination takes place over a certain time length. Therefore, until the process has handled each block BK(m) for some time (some number of frames), the determination does not end in Step F217; the process returns to Step F201 and is repeated.
  • the control unit 30 then causes the process to proceed from Step F212 to Step F214.
  • in Step F216, the main subject information is output so that it can be transmitted to and received from, for example, an application program or a control program that uses the main subject information.
  • the determination is then considered to end in Step F217.
  • the setting of the main subject in Step F214 will now be described.
  • the determination of the tendency toward approach in Step F212 is performed in units of the blocks BK and not necessarily in units of subjects.
  • therefore, in Step F214, a subject range is searched for based on the blocks, and that subject range is determined to be the main subject.
  • FIG. 18A shows a part of the image data of one frame. Dotted lines indicate the blocks BK.
  • it is assumed that a subject P is approaching and that a certain block BK(p) is determined to have the tendency toward approach at a certain time point in Step F212.
  • the block BK(p) is only a part of the subject P, and it is not appropriate to set only the portion of the block BK(p) as the main subject. Accordingly, to set the range of the subject P as the main subject, the control unit 30 performs the process shown in FIG. 17 in Step F214.
  • in Step F240, the distance difference between the block BK(m) determined to have the tendency toward approach in Step F212 and each of the other blocks BK is calculated. That is, the differences of the distance values between, for example, the block BK(p) in FIG. 18A, which is the block BK(m), and all of the other blocks BK1 to BK(M) (here, excluding the block BK(p)) in the frame are calculated. Then, in Step F241, the blocks BK for which the distance difference is within a predetermined value are extracted.
  • the blocks BK indicated by the diagonal lines in FIG. 18B are the blocks for which the distance difference from the block BK(p) is within the predetermined value.
  • in Step F242, the control unit 30 groups the blocks that are continuous with the block BK(m) (the block BK(p) in FIG. 18) determined to have the tendency toward approach, from among the blocks BK extracted in the above-described manner.
  • the continuous blocks are blocks with a physically continuous relation on the image plane, such as the blocks adjacent to the block BK(p) and the blocks also adjacent to those adjacent blocks. For example, the block group MA shown in FIG. 18C is formed.
  • in Step F243, the control unit 30 sets the formed block group MA as the region forming the main subject image, sets this image range as the main subject, and generates the main subject information.
  • in some cases, the grouping of Step F242 yields only one block, irrespective of the other blocks. In this case, only the range of the block BK(m) determined to have the tendency toward approach may be determined to be the main subject.
  • in this manner, the image region of a certain subject can be appropriately set as the main subject image based on the blocks. This is because differences of the distance values mostly do not occur within the same subject, and thus the range of the subject can be appropriately determined by extracting the blocks for which the distance differences are small and grouping the continuous blocks, as described above (a sketch of this grouping follows below).
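  • the grouping of Steps F240 to F243 behaves like a flood fill seeded at the approaching block, as in this Python sketch; the tolerance value and grid representation are assumptions.

```python
# Sketch of forming the block group MA: starting from the block with the
# tendency toward approach, adjacent blocks whose subject distance is
# within a tolerance of the seed block's distance are merged.

from collections import deque

def grow_block_group(dist_grid, seed, tol=0.5):
    """dist_grid: 2-D per-block subject distances; seed: (row, col) of
    the approaching block BK(p). Returns the set of grouped positions."""
    rows, cols = len(dist_grid), len(dist_grid[0])
    seed_d = dist_grid[seed[0]][seed[1]]
    group, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in group
                    and abs(dist_grid[nr][nc] - seed_d) <= tol):
                group.add((nr, nc))
                queue.append((nr, nc))
    return group
```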
  • image analysis may also be performed within the range of the formed block group, a contour determined through face detection, body detection, or the like, and that contour determined to be the main subject.
  • another process may be performed to determine the main subject in Step F214 in addition to the above-described process.
  • for example, the candidate detection unit 27 may be provided in the digital signal processing unit 20 or the control unit 30, and candidate images may be extracted in parallel to the process of FIG. 14.
  • then, a candidate image frame containing the block BK(m) determined to have the tendency toward approach in Step F212 may be searched for, and the subject of that candidate image frame may be set as the main subject.
  • the tendency toward approach is determined over a plurality of frames for each of the blocks divided from an image. The setting of the main subject is then performed based on the blocks determined to have the tendency toward approach.
  • operability at the time of imaging is considerably improved as in the first embodiment.
  • a still image or a moving image with high quality can be captured, for example, when focus control or exposure control is automatically performed on the main subject through the main subject determination.
  • the tendency toward approach is determined in the case where a situation of cumulative approach is detected (Step F210).
  • the advantages described in the first embodiment can be obtained.
  • when approach is detected in the continuous, cumulative, or average sense under a predetermined time condition, the subject is determined to have the tendency toward approach, but the weight of the determination may be changed over time. Specifically, the value α substituted into the offset value OFS(m) set in Step F209 may be increased, or conversely decreased, step by step.
  • a process example can also be considered in which a given determination period is set and, within that determination period, the block BK(m) with the largest count value bCNT(m), or one or more blocks BK(m) whose count value bCNT(m) is equal to or greater than a predetermined value, is set as the target of Step F214 and determined to be the main subject.
  • in the above description, all of the blocks BK1 to BK(M) have been set as targets of the process. However, only some of the blocks BK, such as the blocks BK near the middle portion of the image data, may be set as targets of the process of FIG. 14. When a photographer can be assumed to adjust the imaging direction so as to catch the subject intended as the main subject in the middle portion as much as possible, performing the process of FIG. 14 only on the blocks BK near the middle portion can improve the accuracy of the setting of the main subject, and the processing load of the control unit 30 can also be reduced.
  • the blocks BK and a subject correspond with each other continuously during a determination period only when a certain subject comes straight from the front, approaching in the direction of the imaging apparatus 10.
  • otherwise, a situation occurs in which the subject comes to be contained in other blocks BK as the frames progress.
  • accordingly, when the tendency toward approach from straight ahead is set as a determination condition of the main subject, a process example in which Step F202 of FIG. 14 is not performed may be used.
  • the blocks BK into which the image is divided do not have to have the same size or shape. It can also be considered that, for example, a block BK at the edge portions of an image is set as a wide range while the center part of the screen is divided into fine blocks BK. Considering that a main subject is highly likely to be positioned near the center of the screen, dividing the center of the screen into small areas is appropriate.
  • the program is a program causing an arithmetic processing device to perform a process of detecting a temporal change of a distance from an imaging position in regard to a subject present in an image and determining the tendency toward approach or recession of a subject with respect to the imaging position based on the detection and a process of determining the main subject based on the determined tendency toward approach or recession of each subject.
  • the program of the embodiment may be a program that causes the arithmetic processing device to execute the process shown in FIG. 2, FIG. 6, FIG. 8, FIG. 10, or FIG. 14.
  • a device that executes the above-described main subject determination can be realized using the arithmetic processing device.
  • such a program can be recorded in advance on an HDD serving as a recording medium embedded in an appliance such as a computer device, in a ROM in a microcomputer having a CPU, or the like.
  • the program can be temporarily or permanently stored (recorded) in a removable recording medium such as a flexible disk, a CD-ROM (Compact Disc Read Only Memory), a MO (Magnet optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disk, a semiconductor memory, or a memory card.
  • a removable recording medium can be provided as so-called package software.
  • Such a program can be downloaded from a download site through a network such as a LAN (Local Area Network) or the Internet, in addition to the installation from the removable recording medium to a personal computer and the like.
  • such a program is suitable for extensively providing the image processing device of the embodiment.
  • by downloading the program to a personal computer, a portable information processing apparatus, a cellular phone, a game device, a video player, a PDA (Personal Digital Assistant), and the like, such devices can be made available as the image processing device according to an embodiment of the present disclosure.
  • in a computer device, the same process as the main subject determination process in the image processing device 1 of FIG. 1 and the imaging apparatus 10 can be performed.
  • a CPU 71 of a computer device 70 performs various processes according to a program stored in a ROM 72 or a program loaded from a storage unit 78 to a RAM 73 . Furthermore, the RAM 73 appropriately stores data and the like which are necessary when the CPU 71 performs the various processes.
  • the CPU 71 , the ROM 72 , and the RAM 73 are connected to one another through a bus 74 . Furthermore, an input and output interface 75 is also connected to the bus 74 .
  • the input and output interface 75 is connected to an input unit 76 including a keyboard, a mouse and the like, an output unit 77 including a display such as a CRT (Cathode Ray Tube), an LCD, or an organic EL panel, and a speaker, the storage unit 78 including a hard disk, and a communication unit 79 including a modem and the like.
  • the communication unit 79 performs a communication process through a network including the Internet.
  • a program constituting the software is installed from a network or a recording medium.
  • the recording medium, for example, as illustrated in FIG. 19 , is configured by the removable medium 81 including a magnetic disk (including a flexible disk), an optical disc (including a Blu-ray disc, a CD-ROM, and a DVD), a magneto-optical disc (including an MD (Mini Disc)), a semiconductor memory, and the like, which is distributed to deliver a program to a user separately from the apparatus body with the program recorded therein.
  • the recording medium is also configured by the ROM 72 , a hard disk included in the storage unit 78 , and the like, which are delivered to a user in the state of being incorporated in advance into the apparatus body with the program recorded therein.
  • the settings of the time threshold value Tht and the distance determination threshold value Thd described in each embodiment can be appropriately modified according to the product serving as the imaging apparatus 10 or the image processing device 1 , its use form, or the like.
  • alternatively, the device may be configured so that the user can set any desired values.
  • thereby, a criterion for determining the main subject image (the candidate image frame E(n) or the block BK(m)) can be modified.
  • by setting the time threshold value Tht, either swift determination or determination with high certainty can be selected and prioritized.
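  • illustrative presets for this trade-off; the concrete numbers are assumptions, not values taken from the embodiments:

      DETERMINATION_PRESETS = {
          "swift":   {"Tht_frames": 15, "Thd": 1.01},  # decide quickly
          "certain": {"Tht_frames": 90, "Thd": 1.05},  # decide with high certainty
      }

      def thresholds(mode="certain"):
          p = DETERMINATION_PRESETS[mode]
          return p["Tht_frames"], p["Thd"]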
  • main subject information has been described as also being used in an image effect process and an image editing process; to this end, it is also preferable to perform the main subject determination process targeting a reproduced image.
  • a result of the main subject determination process may be added as metadata to still image data or dynamic image data imaged and recorded thereafter.
  • information representing a main subject is added to a still image file, or the like.
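  • as one illustration, the result could be recorded as a JSON sidecar next to the image file; the container format and the field name are assumptions, since the embodiments do not prescribe a particular one:

      import json
      from pathlib import Path

      def save_main_subject_metadata(image_path, frame_xywh):
          """Store the main subject region (x, y, w, h) next to the image."""
          sidecar = Path(image_path).with_suffix(".mainsubject.json")
          sidecar.write_text(json.dumps({"main_subject_frame": frame_xywh}))

      save_main_subject_metadata("DSC0001.JPG", (640, 360, 320, 480))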
  • designating a main subject through manipulation by a photographer may be made possible while a through image is displayed and, at the same time, the main subject determination process is performed.
  • the process of determining a main subject has been described mainly on the assumption of capturing still images in the embodiments; however, the processes of the embodiments described above can also be applied to performing main subject determination on a plurality of captured frames during standby for capturing a dynamic image, or during capturing and recording of a dynamic image.
  • additionally, the present technology may also be configured as below.
  • (1) An image processing device including: a subject distance change determination unit configured to detect a temporal change of a distance from an imaging position to each subject present in an image and determine a tendency toward approach or recession of each subject with respect to the imaging position on the basis of the detection; and a main subject determination unit configured to determine a main subject on the basis of the determined tendency toward approach or recession of each subject.
  • (2) The image processing device according to (1), wherein the subject distance change determination unit determines the tendency toward approach of each subject to the imaging position on the basis of information regarding the temporal change of the distance of each subject, and wherein the main subject determination unit determines the main subject on the basis of a determination result of the tendency toward approach.
  • (3) The image processing device according to (1) or (2), wherein the subject distance change determination unit detects a temporal change of a size of a subject image in the image as the temporal change of the distance.
  • (4) The image processing device according to any one of (1) to (3), wherein the subject distance change determination unit detects, as the temporal change of the distance, a temporal change of a size of a subject image in the image, the subject image being one of one or more candidate images extracted in the image.
  • (5) The image processing device according to any one of (1) to (4), wherein the subject distance change determination unit determines that a subject that is approaching on average, cumulatively, or continuously, as a detection result of the temporal change of the distance, is a subject with the tendency toward approach, and wherein the main subject determination unit determines a part or all of the subjects determined to have the tendency toward approach as the main subject.
  • (6) The image processing device according to any one of (1) to (5), further including: an attribute identification unit configured to identify an attribute of the subject and output attribute information, wherein the subject distance change determination unit changes a determination condition of the tendency toward approach of the subject according to the attribute information.
  • (7) The image processing device according to (1) or (2), wherein the subject distance change determination unit detects the temporal change of the distance of the subject in each division region in the image.
  • (8) The image processing device according to (7), wherein the subject distance change determination unit determines a division region in which the subject of a division region of a current processing target image was located in an image one unit time before, and detects the temporal change of the distance of the subject as a difference between a subject distance of the determined division region and a subject distance of the division region of the current processing target image (a minimal sketch of this per-region processing follows this enumeration).
  • (9) The image processing device according to (7) or (8), wherein the main subject determination unit determines the main subject on the basis of information regarding the division region in which the subject is determined to have the tendency toward approach.
  • (10) The image processing device according to (9), wherein the main subject determination unit groups a different division region containing the same subject as the subject contained in one division region in which the subject is determined to have the tendency toward approach, and sets an image range serving as the main subject on the basis of a range of the grouped one or more division regions.
  • (11) The image processing device according to (10), wherein the main subject determination unit groups the one division region and a different division region for which a distance difference of the subject distance is within a predetermined value and which is a region continuous with the one division region.
  • (12) The image processing device according to (1), wherein the subject distance change determination unit determines a tendency toward recession of each subject from the imaging position, and wherein the main subject determination unit determines the main subject on the basis of a determination result of the tendency toward recession.
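  • a minimal sketch of the division-region variant in (7) to (11); the grid data layout, the motion lookup, and both thresholds are assumptions made for illustration:

      import numpy as np

      def approaching_regions(dist_prev, dist_cur, motion, thd=0.1):
          """dist_*: per-region subject distances (H x W grids). motion[r, c]
          gives the (row, col) the subject of region (r, c) occupied one unit
          time before. A region approaches when its distance shrank by more
          than thd."""
          h, w = dist_cur.shape
          out = np.zeros((h, w), dtype=bool)
          for r in range(h):
              for c in range(w):
                  pr, pc = motion[r, c]
                  out[r, c] = (dist_prev[pr, pc] - dist_cur[r, c]) > thd
          return out

      def group_with(dist_cur, seed, max_diff=0.5):
          """Group the seed region with continuous regions whose subject
          distance differs from the seed's by no more than max_diff."""
          h, w = dist_cur.shape
          stack, group = [seed], {seed}
          while stack:
              r, c = stack.pop()
              for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                  if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in group
                          and abs(dist_cur[nr, nc] - dist_cur[seed]) <= max_diff):
                      group.add((nr, nc))
                      stack.append((nr, nc))
          return group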

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)
US14/655,621 2013-01-09 2013-11-12 Image processing device, image processing method, and program Active US10547774B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2013-001798 2013-01-09
JP2013001798 2013-01-09
PCT/JP2013/080606 WO2014109125A1 (ja) 2013-01-09 2013-11-12 画像処理装置、画像処理方法、プログラム

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/080606 A-371-Of-International WO2014109125A1 (ja) 2013-01-09 2013-11-12 画像処理装置、画像処理方法、プログラム

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/713,766 Continuation US11012614B2 (en) 2013-01-09 2019-12-13 Image processing device, image processing method, and program

Publications (2)

Publication Number Publication Date
US20150350523A1 US20150350523A1 (en) 2015-12-03
US10547774B2 true US10547774B2 (en) 2020-01-28

Family

ID=51166790

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/655,621 Active US10547774B2 (en) 2013-01-09 2013-11-12 Image processing device, image processing method, and program
US16/713,766 Active US11012614B2 (en) 2013-01-09 2019-12-13 Image processing device, image processing method, and program

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/713,766 Active US11012614B2 (en) 2013-01-09 2019-12-13 Image processing device, image processing method, and program

Country Status (5)

Country Link
US (2) US10547774B2 (de)
EP (2) EP2945366B1 (de)
JP (1) JP6319101B2 (de)
CN (1) CN104919791A (de)
WO (1) WO2014109125A1 (de)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106324945A (zh) * 2015-06-30 2017-01-11 中兴通讯股份有限公司 非接触式自动对焦方法和装置
US9535423B1 (en) * 2016-03-29 2017-01-03 Adasworks Kft. Autonomous vehicle with improved visual detection ability
KR102623989B1 (ko) * 2016-08-01 2024-01-11 삼성전자주식회사 영상 처리 방법 및 이를 지원하는 전자 장치
JP6598748B2 (ja) * 2016-08-26 2019-10-30 任天堂株式会社 情報処理システム、情報処理プログラム、情報処理装置、および情報処理方法
KR102289837B1 (ko) 2017-01-06 2021-08-17 삼성전자주식회사 촬영 방법 및 전자 장치
FR3063409B1 (fr) * 2017-02-28 2021-07-16 Thales Sa Procede de pilotage d'une camera ptz, produit programme d'ordinateur et dispositif de pilotage associes
CN108702452B (zh) 2017-06-09 2020-02-14 华为技术有限公司 一种图像拍摄方法及装置
CN110110189A (zh) * 2018-02-01 2019-08-09 北京京东尚科信息技术有限公司 用于生成信息的方法和装置
CN109064776A (zh) * 2018-09-26 2018-12-21 广东省交通规划设计研究院股份有限公司 预警方法、系统、计算机设备和存储介质
CN111432133B (zh) * 2019-01-09 2021-04-23 恒景科技股份有限公司 自动曝光成像系统与方法
JP7406880B2 (ja) 2019-04-10 2023-12-28 キヤノン株式会社 画像処理装置、その制御方法及びプログラム
JPWO2020209097A1 (de) * 2019-04-10 2020-10-15
CN111131717B (zh) * 2019-12-31 2021-10-26 深圳市维海德技术股份有限公司 聚焦方法、装置、设备与计算机可读存储介质
US20230128043A1 (en) * 2020-01-24 2023-04-27 Sony Group Corporation Information processing device, information processing method, and information processing program
KR20210128736A (ko) 2020-04-17 2021-10-27 삼성전자주식회사 멀티 카메라를 포함하는 전자 장치 및 촬영 방법

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8253800B2 (en) * 2006-06-28 2012-08-28 Nikon Corporation Tracking device, automatic focusing device, and camera
JP4274233B2 (ja) * 2006-11-30 2009-06-03 ソニー株式会社 撮影装置、画像処理装置、および、これらにおける画像処理方法ならびに当該方法をコンピュータに実行させるプログラム
JP4964807B2 (ja) * 2008-03-07 2012-07-04 パナソニック株式会社 撮像装置及び撮像方法
US9335610B2 (en) * 2008-05-19 2016-05-10 Canon Kabushiki Kaisha Image pickup system and lens apparatus
US9405970B2 (en) * 2009-02-02 2016-08-02 Eyesight Mobile Technologies Ltd. System and method for object recognition and tracking in a video stream
US9073484B2 (en) * 2010-03-03 2015-07-07 Honda Motor Co., Ltd. Surrounding area monitoring apparatus for vehicle
JP2013013061A (ja) * 2011-05-27 2013-01-17 Sanyo Electric Co Ltd 撮像装置
JP2015111746A (ja) * 2012-04-09 2015-06-18 ソニー株式会社 画像処理装置、画像処理方法、プログラム
JP2014126710A (ja) * 2012-12-26 2014-07-07 Canon Inc 自動焦点検出装置およびその制御方法、撮像装置

Patent Citations (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS4720673Y1 (de) 1968-04-26 1972-07-11
US20010002225A1 (en) * 1988-03-10 2001-05-31 Masayoshi Sekine Image shake detecting device
JP2002228920A (ja) 2001-01-30 2002-08-14 Minolta Co Ltd 主被写体検出装置及び自動焦点カメラ
US20060012681A1 (en) * 2004-07-14 2006-01-19 Matsushita Electric Industrial Co., Ltd. Object tracing device, object tracing system, and object tracing method
US20060066742A1 (en) * 2004-09-24 2006-03-30 Casio Computer Co., Ltd. Photography device and photography method thereof
US20060125919A1 (en) * 2004-09-30 2006-06-15 Joseph Camilleri Vision system for vehicle
US20070195371A1 (en) * 2006-02-20 2007-08-23 Jiro Shimosato Image pickup apparatus and control method therefor
US20080031611A1 (en) 2006-08-01 2008-02-07 Canon Kabushiki Kaisha Focus control apparatus, image sensing apparatus and focus control method
JP4720673B2 (ja) 2006-08-16 2011-07-13 株式会社ニコン 被写体追尾装置およびカメラ
JP2008052225A (ja) 2006-08-28 2008-03-06 Olympus Imaging Corp カメラ、合焦制御方法、プログラム
JP2008281701A (ja) 2007-05-09 2008-11-20 Canon Inc 焦点調節装置、撮像装置、および焦点調節方法
CN101339349A (zh) 2007-07-04 2009-01-07 三洋电机株式会社 摄像装置以及自动聚焦控制方法
US20090109304A1 (en) * 2007-10-29 2009-04-30 Ricoh Company, Limited Image processing device, image processing method, and computer program product
CN101426088A (zh) 2007-11-02 2009-05-06 索尼株式会社 成像设备及其控制方法以及程序
JP2010041076A (ja) 2008-07-31 2010-02-18 Olympus Corp カメラ
JP2010152162A (ja) * 2008-12-25 2010-07-08 Canon Inc 自動焦点検出装置及びその制御方法
US20100265353A1 (en) * 2009-04-16 2010-10-21 Sanyo Electric Co., Ltd. Image Processing Device, Image Sensing Device And Image Reproduction Device
US20110044676A1 (en) 2009-08-24 2011-02-24 Canon Kabushiki Kaisha Image pickup system having ranging function
US20110149120A1 (en) * 2009-12-21 2011-06-23 Canon Kabushiki Kaisha Image-capturing apparatus with automatically adjustable angle of view and control method therefor
JP2011146827A (ja) 2010-01-13 2011-07-28 Sony Corp 画像処理装置および方法、並びにプログラム
JP2011146826A (ja) 2010-01-13 2011-07-28 Sony Corp 画像処理装置および方法、並びにプログラム
JP2011160379A (ja) 2010-02-04 2011-08-18 Sony Corp 画像処理装置および方法、並びにプログラム
JP2011166305A (ja) 2010-02-05 2011-08-25 Sony Corp 画像処理装置および撮像装置
US20110211073A1 (en) * 2010-02-26 2011-09-01 Research In Motion Limited Object detection and selection using gesture recognition
US20120206619A1 (en) 2011-01-25 2012-08-16 Nikon Corporation Image processing apparatus, image capturing apparatus and recording medium
US20120219183A1 (en) * 2011-02-24 2012-08-30 Daishi Mori 3D Object Detecting Apparatus and 3D Object Detecting Method
US9266473B1 (en) * 2012-01-06 2016-02-23 Intuit Inc. Remote hands-free backseat driver
US20130258167A1 (en) * 2012-03-28 2013-10-03 Qualcomm Incorporated Method and apparatus for autofocusing an imaging device

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Chinese Office Action dated Apr. 10, 2018 for corresponding Chinese Application No. 201380069149.9.
Chinese Office Action dated Sep. 6, 2017 for corresponding Chinese Application No. 201380069149.9.
Extended European Search Report dated Aug. 24, 2016 for corresponding European Application No. 13870934.0.
Extended European Search Report dated Sep. 5, 2019 for corresponding European Application No. 19178866.0.
International Search Report; International Application No. PCT/JP2013/080606; International Filing Date: Nov. 12, 2013; Date of completion of the international search: Dec. 3, 2013.
Japanese Office Action dated Oct. 17, 2017 for corresponding Japanese Application No. 2014-556331.
Title: Translation of JP2010152162 Author: Tomosada, Toshihiko Date: Jul. 2010. *
Written Opinion of the International Searching Authority; International Application No. PCT/JP2013/080606; International Filing Date: Nov. 12, 2013; Date of Written Opinion: dated Dec. 10, 2013.

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220337734A1 (en) * 2019-08-23 2022-10-20 Kabushiki Kaisha Tokai Rika Denki Seisakusho Image-capturing control system, control device, and non-transitory computer-readable medium

Also Published As

Publication number Publication date
EP2945366A1 (de) 2015-11-18
EP2945366A4 (de) 2016-09-21
JPWO2014109125A1 (ja) 2017-01-19
US20200120262A1 (en) 2020-04-16
CN104919791A (zh) 2015-09-16
US20150350523A1 (en) 2015-12-03
WO2014109125A1 (ja) 2014-07-17
JP6319101B2 (ja) 2018-05-09
US11012614B2 (en) 2021-05-18
EP2945366B1 (de) 2019-10-16
EP3562143B1 (de) 2021-04-21
EP3562143A1 (de) 2019-10-30

Similar Documents

Publication Publication Date Title
US11012614B2 (en) Image processing device, image processing method, and program
US10848662B2 (en) Image processing device and associated methodology for determining a main subject in an image
US10630891B2 (en) Image processing apparatus, image processing method, and program
JP5159515B2 (ja) 画像処理装置およびその制御方法
JP5251215B2 (ja) デジタルカメラ
US8818055B2 (en) Image processing apparatus, and method, and image capturing apparatus with determination of priority of a detected subject and updating the priority
US9942460B2 (en) Image processing device, image processing method, and program
US10455154B2 (en) Image processing device, image processing method, and program including stable image estimation and main subject determination
JP5623256B2 (ja) 撮像装置、その制御方法及びプログラム
US10270977B2 (en) Imaging apparatus and a method of tracking a subject in the imaging apparatus
JP4807582B2 (ja) 画像処理装置、撮像装置及びそのプログラム
JP6274272B2 (ja) 画像処理装置、画像処理方法、プログラム
JP5832618B2 (ja) 撮像装置、その制御方法及びプログラム
US11595565B2 (en) Image capturing apparatus, method for controlling the same, and recording medium for automatic image capturing of a subject
CN116076083A (zh) 成像装置,信息处理装置,信息处理方法和程序
JP2024033747A (ja) 撮像装置、撮像装置の制御方法、プログラム
JP2018056650A (ja) 光学装置、撮像装置および制御方法
JP2011259212A (ja) 撮像装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KINOSHITA, MASAYA;REEL/FRAME:035908/0184

Effective date: 20150330

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4