JP5278483B2 - Imaging apparatus, imaging method, and imaging program - Google Patents


Info

Publication number
JP5278483B2
Authority
JP
Japan
Prior art keywords
pixel area
comparison
means
pixel region
comparison pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2011094000A
Other languages
Japanese (ja)
Other versions
JP2011193496A (en)
Inventor
Toshiya Kiso (木曽 俊也)
Original Assignee
Casio Computer Co., Ltd. (カシオ計算機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co., Ltd.
Priority to JP2011094000A
Publication of JP2011193496A
Application granted
Publication of JP5278483B2
Status: Active
Anticipated expiration


Abstract

PROBLEM TO BE SOLVED: To improve the accuracy of subject tracking using block matching.

SOLUTION: An imaging apparatus includes: a target pixel region extractor (6d) for extracting a target pixel region (15) from a predetermined frame image; a comparison pixel region extractor (6e) for sequentially extracting comparison pixel regions (20) from the current frame image; a correlation degree deriver (6f) for deriving a degree of correlation between each comparison pixel region and the target pixel region; a correlation degree corrector (6h) for correcting each degree of correlation based on a correlation degree correction map (6g) in which correction coefficients are preset, yielding a corrected degree of correlation for every comparison pixel region; and a comparison pixel region selector (6i) for selecting one comparison pixel region from the candidates based on the corrected degrees of correlation. Each time a new frame image is acquired, the target pixel region extractor adopts the comparison pixel region selected by the comparison pixel region selector as the new target pixel region.

COPYRIGHT: (C) 2011, JPO & INPIT

Description

  The present invention relates to an imaging apparatus, an imaging method, and an imaging program, for example ones suitable for application to a digital camera or the like equipped with an electronic viewfinder.

  Many digital cameras have an electronic viewfinder such as a liquid crystal display. In such a digital camera, a frame image (a so-called through image) periodically output from an image sensor such as a CCD is displayed on the electronic viewfinder, so that the photographer can adjust the composition or the zoom magnification while viewing the through image.

  On the other hand, the pixel counts of the image sensors mounted on today's digital cameras keep increasing, and models with giga-level pixel counts have appeared. For many uses, however, a resolution of about VGA (640 × 480 pixels) or XGA (1024 × 768 pixels) is sufficient. It is therefore possible, if desired, to reduce the full-size image, or to cut out (trim) only the desired part of it, so as to generate an image with a necessary and sufficient number of pixels (and hence a small file size), and then save that image in a storage element such as a semiconductor memory or output it to a personal computer.

  As one such trimming method, it is conceivable that, for example, a trimming reference frame (hereinafter, trimming frame; see the trimming frame 25 in FIG. 2) is superimposed on the through image on the electronic viewfinder, the photographer adjusts the direction of the camera so that the subject enters the trimming frame, and then fully presses the shutter button so that the image inside the trimming frame is recorded and stored in a storage element such as a semiconductor memory.

  However, such a trimming method requires the operation of placing the subject inside the trimming frame, and is particularly inconvenient for a subject that moves around, such as a child. The reason is that the trimming frame is only a fraction of the size of the electronic viewfinder screen, and it is quite difficult to keep a moving subject inside this small trimming frame.

  One conceivable solution is to move the trimming frame in accordance with the movement of the subject, that is, "tracking trimming shooting". For this purpose, the position of the trimming frame on the electronic viewfinder may be controlled, for example, so that the in-focus portion (a part of the in-focus subject) inside the focus index displayed when the shutter button is half-pressed (the so-called focus mark; see the focus mark 22 in FIG. 2) is always positioned at the center of the trimming frame.

  To explain this in an actual shooting situation: the photographer first turns on the camera's "tracking trimming shooting" function and points the camera at the moving subject. At this time, a focus mark is displayed at the center of the electronic viewfinder. Next, when the shutter button is pressed halfway, a trimming frame is displayed around the focus mark and, at the same time, the part of the subject inside the focus mark is brought into focus. When the subject is moving, the "tracking trimming shooting" function makes the focus and the trimming frame follow the subject's movement together, and the photographer may fully press the shutter button at the desired timing. In this way, a trimmed image with an appropriate composition can be obtained regardless of whether the subject is stationary or moving.

  To realize the "tracking trimming shooting" function, the target pixel region (the in-focus portion or the portion corresponding to it) must be tracked automatically across the frame images of the through image (the frame images periodically output from the image sensor). As a conventional technique applicable to this automatic tracking, there is, for example, the template matching (also called block matching) technique described in Patent Document 1 below.

  In this technique, the current frame image is searched for small image areas similar to a template cut out from the previous frame image (corresponding to the target pixel region described above); the small image area with the highest similarity is identified as the tracking target region, and this operation is repeated while the template is sequentially updated between successive frame images. Since the template is a block (a collection of pixels) of about 8 × 8 pixels, template matching is also called block matching.

  In this conventional technique, the search is performed by superimposing the template on the current frame image while shifting it one pixel at a time; for each candidate small image area of the same size as the template, the absolute differences of the luminance values are computed pixel by pixel and summed to obtain a score representing the degree of matching. The higher the degree of matching, the smaller the score, and it is 0 for a perfect match. The conventional technique therefore tracks the small image area whose score is minimal (0, or as close to 0 as possible).
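  As a concrete illustration, the score computation just described can be sketched as follows. This is a minimal sum-of-absolute-differences (SAD) sketch, not the implementation of Patent Document 1; the function names and the use of NumPy are assumptions.

```python
import numpy as np

def sad_score(template: np.ndarray, candidate: np.ndarray) -> int:
    """Sum of absolute luminance differences; 0 means a perfect match."""
    return int(np.abs(template.astype(np.int32) - candidate.astype(np.int32)).sum())

def block_match(template: np.ndarray, frame: np.ndarray):
    """Slide the template over the frame one pixel at a time and return
    the top-left position and score of the minimum-score (best) block."""
    th, tw = template.shape
    fh, fw = frame.shape
    best_pos, best_score = (0, 0), None
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            s = sad_score(template, frame[y:y + th, x:x + tw])
            if best_score is None or s < best_score:
                best_pos, best_score = (y, x), s
    return best_pos, best_score
```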

JP 2001-76156 A

  However, although the conventional technique described above tracks the small image area with the smallest score, its tracking performance is not sufficient, and it often ends up tracking the wrong small image area. For example, when several moving subjects appear in the composition and one of them (hereinafter, the true subject) is being tracked, another subject (hereinafter, the false subject) may be tracked by mistake. This is because, depending on the situation, the relationship "score of the true subject > score of the false subject" can hold, that is, the score of the false subject can be smaller than that of the true subject.

  Therefore, an object of the present invention is to improve the accuracy of subject tracking by block matching, and specifically to provide an imaging apparatus, an imaging method, and an imaging program that can keep accurately tracking the true region set at the beginning even when a similar moving region exists.

The invention according to claim 1 is an imaging apparatus comprising subject tracking means for tracking a subject by block matching based on sequentially acquired frame images, wherein the subject tracking means comprises: target pixel region extraction means for extracting a first target pixel region and a second target pixel region from a predetermined frame image; position derivation means for deriving the position, relative to the frame image, of the first target pixel region extracted by the target pixel region extraction means; search range setting means for automatically setting a search range in the current frame image based on the position derived by the position derivation means; comparison pixel region extraction means for sequentially extracting predetermined comparison pixel regions from within the search range set by the search range setting means; correlation degree derivation means for deriving, for each comparison pixel region extracted by the comparison pixel region extraction means, a degree of correlation representing the similarity of its image content to the second target pixel region extracted by the target pixel region extraction means; and comparison pixel region selection means for selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived by the correlation degree derivation means; and wherein the target pixel region extraction means extracts the comparison pixel region selected by the comparison pixel region selection means as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected by the comparison pixel region selection means as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.
The invention according to claim 2 is an imaging method comprising a subject tracking step of tracking a subject by block matching based on sequentially acquired frame images, wherein the subject tracking step comprises: a target pixel region extraction step of extracting a first target pixel region and a second target pixel region from a predetermined frame image; a position derivation step of deriving the position, relative to the frame image, of the first target pixel region extracted in the target pixel region extraction step; a search range setting step of automatically setting a search range in the current frame image based on the position derived in the position derivation step; a comparison pixel region extraction step of sequentially extracting predetermined comparison pixel regions from within the search range set in the search range setting step; a correlation degree derivation step of deriving, for each comparison pixel region extracted in the comparison pixel region extraction step, a degree of correlation representing the similarity of its image content to the second target pixel region extracted in the target pixel region extraction step; and a comparison pixel region selection step of selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived in the correlation degree derivation step; and wherein the target pixel region extraction step extracts the comparison pixel region selected in the comparison pixel region selection step as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected in the comparison pixel region selection step as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.
The invention according to claim 3 is an imaging program for causing a computer to implement subject tracking means for tracking a subject by block matching based on sequentially acquired frame images, wherein the subject tracking means comprises: target pixel region extraction means for extracting a first target pixel region and a second target pixel region from a predetermined frame image; position derivation means for deriving the position, relative to the frame image, of the first target pixel region extracted by the target pixel region extraction means; search range setting means for automatically setting a search range in the current frame image based on the position derived by the position derivation means; comparison pixel region extraction means for sequentially extracting predetermined comparison pixel regions from within the search range set by the search range setting means; correlation degree derivation means for deriving, for each comparison pixel region extracted by the comparison pixel region extraction means, a degree of correlation representing the similarity of its image content to the second target pixel region extracted by the target pixel region extraction means; and comparison pixel region selection means for selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived by the correlation degree derivation means; and wherein the target pixel region extraction means extracts the comparison pixel region selected by the comparison pixel region selection means as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected by the comparison pixel region selection means as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.

  In the present invention, the comparison pixel region can be tracked with center-weighted priority within the search range. As a result, only the true subject located near the center of the search range is tracked, and in particular the tracking performance for a subject with little movement is improved. Moreover, even when there are multiple tracking candidate blocks whose degrees of correlation differ little and no clear winner emerges, the block at the center of the search range is tracked preferentially, which improves the tracking performance for subjects with few distinctive features.

FIG. 1 is a block diagram of the imaging apparatus according to the embodiment.
FIG. 2 is a usage state diagram of the imaging apparatus 1.
FIG. 3 is a conceptual diagram of automatic subject tracking in the tracking trimming shooting function of the present embodiment.
FIG. 4 is a schematic diagram of block matching in the present embodiment and a schematic diagram of the search range.
FIG. 5 is a schematic diagram of the correlation degree correction map 6g.
FIG. 6 is a tracking schematic diagram of the comparison pixel region.
FIG. 7 is a relationship diagram between the reference frame and the current frame.

  Embodiments of the present invention will be described below with reference to the drawings. The specific details, examples, numerical values, character strings, and other symbols in the following description are only references for clarifying the idea of the invention; obviously, the idea of the invention is not limited to all or part of them. Well-known techniques, procedures, architectures, circuit configurations, and the like (hereinafter, "well-known matters") are not described in detail, but this too is only to simplify the description, not to intentionally exclude all or part of those matters. Such well-known matters were known to those skilled in the art at the time of filing of the present invention and are naturally included in the following description.

  FIG. 1 is a configuration diagram of an imaging apparatus according to the embodiment. In this figure, an imaging apparatus 1 includes an optical system 2, an imaging device 3, an image processing unit 4, an image buffer 5, a control unit 6, an operation unit 7, an electronic viewfinder 8, a storage unit 9, a power supply unit 10, and the like.

  The optical system 2 includes an optical lens composed of a fixed-focus lens or a variable-focus lens (a so-called zoom lens), and forms an optical image of the subject, captured via a variable aperture stop mechanism, on the light receiving surface of the imaging device 3.

  The imaging device 3 is a two-dimensional semiconductor image sensor such as a CCD or CMOS sensor, and photoelectrically converts the subject image formed on its light receiving surface. The imaging device 3 includes a large number of photoelectric conversion elements (also called pixels) arranged in a matrix in the y-axis direction (vertical) and the x-axis direction (horizontal); by reading them out sequentially, it outputs an image signal of y lines × x pixels per frame at a rate of several tens of frames per second (typically 30 frames per second).

  The image processing unit 4 performs image processing such as gamma correction on the image signal output from the imaging device 3, and outputs the result to the image buffer 5 and the control unit 6 as the current frame image.

  The image buffer 5 delays the current frame image output from the image processing unit 4 by one frame period to make a previous frame image, and outputs the previous frame image to the control unit 6.

  The control unit 6 is a one-chip microprocessor including a RAM 6a, a ROM 6b, a CPU 6c, and the like; a control program stored in advance in the ROM 6b is loaded into the RAM 6a and executed by the CPU 6c, whereby the operation of the imaging apparatus 1 is comprehensively controlled.

  The operation unit 7 includes various buttons necessary for the operation of the imaging apparatus 1, such as a shutter button 7a and a trimming shooting button 7b, and generates a signal corresponding to the button operation and outputs the signal to the control unit 6.

  The electronic viewfinder 8 is a flat display device composed of a liquid crystal display panel of several inches or the like. The electronic viewfinder 8 is used as a viewfinder for composition adjustment when the imaging device 1 is used as a digital camera.

  The storage unit 9 is a storage element for recording and storing captured images, and includes, for example, a large-capacity nonvolatile semiconductor storage device or a magnetic storage device.

  The power supply unit 10 includes a primary battery and a secondary battery, and supplies necessary power to each unit of the imaging apparatus 1 including the control unit 6.

  Note that the configuration of the imaging apparatus 1 described above is that of a digital camera, but the invention is not limited to this; it can be applied to any of various portable electronic devices having an imaging function, such as camera-equipped mobile phones and camera-equipped information terminals.

  FIG. 2 is a usage state diagram of the imaging apparatus 1. In this figure, (a) shows the normal shooting state, and (b) and (c) show trimming shooting states; (b) shows trimming shooting when the subject is stationary, and (c) shows trimming shooting when the subject is moving.

  First, normal shooting will be described. Here, normal shooting means shooting a stationary subject, for example ordinary portrait shooting. In this case, as shown in (a), the photographer places the subject 23 on the focus mark 22 at the center of the finder screen 21, presses the shutter button halfway to focus, and then presses it fully. As a result, the image 24 corresponding to the entire finder screen 21 is captured, recorded, and stored in the storage unit 9.

  Next, trimming shooting will be described. Trimming shooting in this embodiment covers two cases: one in which the subject is stationary and one in which the subject is moving. In either case, the trimming shooting function is turned on by pressing the trimming shooting button 7b of the operation unit 7; the photographer need not be aware of the distinction between the two cases.

  In stationary-subject trimming shooting, as shown in (b), the photographer places the subject 23 on the focus mark 22 at the center of the finder screen 21, presses the shutter button halfway to focus on the subject 23, and then presses it fully. The operation itself is no different from normal shooting. The difference is that the image 26 recorded in the storage unit 9 is limited to a trimming frame 25 of a predetermined size rather than the entire finder screen 21; that is, an image 26 obtained by "trimming" a part of the finder screen 21 is recorded and stored in the storage unit 9.

  In such trimming shooting, if the trimming frame 25 is fixed, it is quite difficult to keep the shooting direction pointed at the subject at all times, especially for a subject that moves around such as a child, and camera shake is also a concern. The tracking trimming shooting function of the present embodiment ((c) of FIG. 2) eliminates this inconvenience.

  That is, as long as the subject 23 remains on the finder screen 21, the trimming frame 25 automatically follows its movement without the photographer changing the shooting direction; by fully pressing the shutter button at the desired timing, an image 27 obtained by "trimming" a part of the finder screen 21 is recorded and stored in the storage unit 9.

  FIG. 3 is a conceptual diagram of automatic subject tracking in the tracking trimming shooting function of the present embodiment. It schematically shows a control function executed in software by the control unit 6.

  In this figure, the reference frame image 12 is a predetermined frame image fetched from the image buffer 5, namely the frame image taken into the image buffer 5 at the start of trimming shooting ((b) and (c) of FIG. 2), that is, immediately after the shutter button is half-pressed. A subject 13 appears near the center of the reference frame image 12; this subject 13 is, for example, a moving child.

  On the other hand, the frame images 16, 17, 18, ... drawn below the reference frame image 12 are the frame images sequentially output from the image processing unit 4 after the reference frame image 12. Each of them is the latest current frame image at its output time. When the frame image 17 is output as the latest current frame image, the frame image 16 is updated to serve as the reference frame image, and when the frame image 18 is output as the latest current frame image, the frame image 17 is updated to serve as the reference frame image.

  The frame images 16, 17, 18, ... show the same subject 19 as the subject 13 in the reference frame image 12 (for simplicity of illustration, it is drawn only in the topmost frame image 16). Since the subject 13 of the reference frame image 12 is a moving child, the position of the subject 19 differs from frame image to frame image.

  The reference frame image 12 is input to the target pixel region extraction unit 6d, which extracts a target pixel region 15 containing all or part of the subject 13 shown in the reference frame image 12. The target pixel region 15 corresponds, for example, to the pixel region inside the focus mark 22 in (b) and (c) of FIG. 2, that is, the in-focus pixel region.

On the other hand, the comparison pixel region extraction unit 6e sequentially extracts, from the newly acquired current frame image, comparison pixel regions 20 to be compared with the subject image in the target pixel region 15, within a search range described later in detail.
Each comparison pixel region 20 is a pixel region of the same size as the target pixel region 15; for example, the comparison pixel regions are extracted while being shifted one pixel at a time in the x-axis or y-axis direction within the frame image.

  The pixel values (luminance values) of the target pixel region 15 extracted by the target pixel region extraction unit 6d and of each comparison pixel region 20 extracted by the comparison pixel region extraction unit 6e are input to the correlation degree derivation unit 6f. The correlation degree derivation unit 6f then derives, for each comparison pixel region, the degree of correlation with the subject image in the target pixel region 15, and outputs it to the correlation degree correction unit 6h.

  The correlation degree correction unit 6h corrects each degree of correlation derived by the correlation degree derivation unit 6f according to the predetermined correlation degree correction map 6g, and outputs the corrected degrees of correlation to the comparison pixel region selection unit 6i. Based on the corrected degrees of correlation, the comparison pixel region selection unit 6i selects the comparison pixel region 20 with the highest degree of correlation with the target pixel region 15; the subject image in the selected comparison pixel region 20 is determined to be the tracking target subject image, and the position of the selected comparison pixel region 20 is determined to be the tracking position.

  When a new current frame image is output, the target pixel region extraction unit 6d sets the subject image of the newly selected comparison pixel region as the subject image of the next target pixel region 15, and the above processing is repeated based on the new tracking position. That is, every time a new current frame image is acquired, the subject image to be tracked and the tracking position are updated based on the selected comparison pixel region.

The trimming frame 25 is updated so that the relative relationship between the position of the target pixel region 15 (the tracking position) and the position of the trimming frame 25 is always maintained.
In other words, each time a new current frame image is acquired, the comparison pixel region 20 judged to match the target pixel region is selected from the new current frame image and set as the region to be tracked, so that the subject is kept inside the trimming frame 25 frame after frame.
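A minimal sketch of this bookkeeping, assuming the trimming frame is stored by its top-left corner and keeps a fixed offset from the tracking position (both conventions are assumptions, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class TrimmingFrame:
    x: int  # top-left corner, finder-screen coordinates
    y: int
    w: int
    h: int

def follow(frame: TrimmingFrame, tracking_pos: tuple[int, int],
           offset: tuple[int, int]) -> TrimmingFrame:
    """Re-anchor the trimming frame so it keeps the same relative
    offset from the tracking position in every new current frame."""
    (tx, ty), (ox, oy) = tracking_pos, offset
    return TrimmingFrame(x=tx + ox, y=ty + oy, w=frame.w, h=frame.h)
```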

  Here, in order to selectively extract the comparison pixel region 20 judged to match the target pixel region, it is preferable to set in advance a search range within which the comparison pixel region extraction unit 6e extracts the comparison pixel regions.

Details will be described below.
FIG. 4A is a schematic diagram of block matching in the present embodiment. In this figure, a block 29 of 8 × 8 pixels is the block matching target area, and block matching is performed using the pixel information (luminance information) contained in the block 29 as the target pixel region 15. Each pixel is drawn as a square; to shorten the matching processing time, the figure shows the case where only the black pixels, arranged every other pixel, are used in the block matching processing. That is, it illustrates block matching based on the pixel information of 16 pixels within a target pixel region of 8 × 8 pixels. The pixel information includes not only the luminance information but also pixel position information; the position information is used when setting the search range.

  FIG. 4B is a schematic diagram of the search range in the present embodiment. In this embodiment, block matching determines to which position in the current frame image the subject image corresponding to the target pixel region 15 of the reference frame image has moved. Since the distance a subject can move during one frame period is limited, restricting the search for the subject image to a predetermined search range reduces both the processing time and the CPU load. The search range 30 is set from the center position (tracking position) of the currently extracted target pixel region 15 and a pixel-count range stored in advance; for example, the center of the search range is set to substantially coincide with the position corresponding to the tracking position (the position of the target pixel region). According to various experimental results, when the frame rate is 30 fps, a search range in which the center of each comparison pixel region lies within ±12 pixels horizontally (x direction) and ±8 pixels vertically (y direction) of the position in the current frame image corresponding to the center of the target pixel region 15 allows the subject to be tracked in most cases even when it moves, while achieving the reduction described above.
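  Setting the search range thus reduces the candidate positions from the whole frame to a small window. A sketch, assuming the ±12 × ±8 pixel extent quoted above for 30 fps; the clamping to the frame boundaries is an assumption:

```python
def search_range(center: tuple[int, int], frame_size: tuple[int, int],
                 dx: int = 12, dy: int = 8) -> tuple[int, int, int, int]:
    """Return (x_min, x_max, y_min, y_max) bounding the centers of the
    comparison pixel regions around the tracking position `center`."""
    (cx, cy), (w, h) = center, frame_size
    return (max(0, cx - dx), min(w - 1, cx + dx),
            max(0, cy - dy), min(h - 1, cy + dy))
```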

  FIG. 5 is a schematic diagram of the correlation degree correction map 6g. Note that in this schematic diagram the x-axis and y-axis are swapped relative to FIGS. 4A and 4B for convenience of illustration: the horizontal axis of the correlation degree correction map 6g corresponds to the y-axis (vertical direction) of FIGS. 4A and 4B, and its vertical axis corresponds to the x-axis (horizontal direction).

  The numerical values in the schematic diagram are "weighting factors", the correction coefficients used to correct the degrees of correlation. In this example, the minimum weighting factor, "64", appears at the center of the diagram and the four positions around it, and the maximum weighting factor, "255", appears at the four corners. In the correlation degree correction map 6g, a weighting factor is preset for each comparison region obtained from the search range.

  Each weighting factor is applied to the degree of correlation of the comparison pixel region at the corresponding coordinate position within the search range of FIG. 4B. For example, the weighting factor applied to the degree of correlation of the comparison pixel region whose substantial center coordinates are x = 0, y = 0 in FIG. 4B is the "64" at the center of the diagram. The coordinate position x = 0, y = 0 in FIG. 4B is the position in the current frame corresponding to the center coordinates of the target pixel region 15, that is, the position for the case where the subject image corresponding to the target pixel region 15 of the reference frame image has not moved in the current frame image (the tracking position has not changed).

  The point to note in the correlation degree correction map 6g of FIG. 5 is that the closer a position is to the position in the current frame corresponding to the center of the target pixel region 15, that is, the smaller the absolute values of x and y, the smaller the weighting factor; in other words, the weighting factor there acts to raise the degree of correlation with the subject image in the target pixel region. The essence of this center-weighted setting is that, the closer a comparison pixel region is to the position corresponding to the center of the target pixel region 15, the higher the degree of correlation between its subject image and the subject image in the target pixel region 15 is treated as being.
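  The map can be sketched as follows. The example in the text fixes only the extremes (64 at and immediately around the center, 255 at the four corners); the linear growth with distance used here to fill the intermediate cells is an assumption.

```python
import numpy as np

def correction_map(width: int = 25, height: int = 17,
                   w_min: int = 64, w_max: int = 255) -> np.ndarray:
    """Center-weighted weighting factors over the search range
    (25 x 17 positions for the +/-12 x +/-8 pixel example): w_min at
    the center, growing with distance to w_max at the corners."""
    cy, cx = (height - 1) / 2.0, (width - 1) / 2.0
    ys, xs = np.mgrid[0:height, 0:width]
    dist = np.hypot(ys - cy, xs - cx)
    return np.rint(w_min + (w_max - w_min) * dist / dist.max()).astype(int)
```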

  The correlation degree derivation unit 6f derives the degree of correlation according to a known sum-of-squared-differences formula, shown as equation (1) below: the smaller the resulting value, the higher the degree of correlation between the subject image in the target pixel region and the subject image in the comparison pixel region.
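  Equation (1) is not reproduced in this text; a standard sum-of-squared-differences form consistent with the description (the notation below is an assumption) is:

```latex
S(u, v) = \sum_{i}\sum_{j} \bigl( T(i, j) - C(u + i,\, v + j) \bigr)^{2} \qquad (1)
```

  where T(i, j) is the luminance at pixel (i, j) of the target pixel region, C is the luminance of the current frame image, and (u, v) is the candidate position of the comparison pixel region; S(u, v) = 0 for a perfect match.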

  Then, the correlation degree correction unit 6h applies the weighting factors of the correlation degree correction map 6g of FIG. 5 to the degrees of correlation derived according to equation (1). The comparison pixel region selection unit 6i compares the comparison pixel regions based on the weighted degrees of correlation, and selects as the tracking target the comparison pixel region 20 whose subject image has the highest degree of correlation with the subject image of the target pixel region.
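  How the correction and the selection combine can be sketched as follows. Multiplying the raw score by the map weight is an assumption (the text says only that the weighting factor is "applied"); it preserves the convention that a smaller corrected value means a better candidate, so the small center weights favor candidates near the center of the search range:

```python
import numpy as np

def select_tracking_position(ssd_scores: np.ndarray,
                             weights: np.ndarray) -> tuple[int, int]:
    """Correct each candidate's SSD score with its map weight and
    return the (row, col) of the best candidate in the search range."""
    corrected = ssd_scores * weights
    y, x = np.unravel_index(int(np.argmin(corrected)), corrected.shape)
    return int(y), int(x)
```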

  As described above, because the imaging apparatus 1 of the present embodiment uses center-weighted weighting factors, the comparison pixel region 20 is tracked with priority given, within the search range of FIG. 4B, to the coordinate position corresponding to the center of the target pixel region (that is, priority to comparison pixel regions whose tracking position changes little). As a result, only the true subject located near the center of the search range is tracked, and in particular the tracking performance for a subject with little movement improves. Moreover, even when there are multiple tracking candidate blocks whose degrees of correlation differ little and no clear result can be obtained, the block at the center of the search range is tracked preferentially, which improves the tracking performance for subjects with few distinctive features.

  Here, FIG. 6 is a tracking schematic diagram of the comparison pixel region. In this figure, suppose that the search range 28 (corresponding to the search range 30 in FIG. 4) contains one comparison pixel region 20a that corresponds to the target pixel region 15 and another comparison pixel region 20b that does not. The comparison pixel region 20a is the "true" tracking target, and the comparison pixel region 20b is a "false (virtual image)" tracking target. When the colors and brightness of the subjects contained in these two comparison pixel regions 20a and 20b are similar, the conventional technique described at the beginning often ends up tracking the "false" subject (the comparison pixel region 20b).

  This is because the conventional technique simply sums the absolute values of the luminance differences to obtain a score representing the degree of matching, and tracks the small image area with the smallest score (0 or close to 0); depending on the situation, the relationship "score of the true subject > score of the false subject" can hold, that is, the score of the false subject can be smaller than that of the true subject.

  In the present embodiment, by contrast, the score is not used as it is: a degree of correlation corresponding to the score is derived, corrected according to the predetermined correlation degree correction map 6g, and the corrected degree of correlation is output to the comparison pixel region selection unit 6i.

  The description above covered the configuration in which the comparison pixel region whose subject image has the highest degree of correlation with the subject image in the target pixel region is selected from the search range. Alternatively, a number of comparison pixel regions corresponding to the top few percent (for example, the top 5%) in degree of correlation may be selected, and among them the comparison pixel region closest to the center of the search range may be chosen as the tracking target subject image and tracking position; this allows the subject to be tracked with even higher accuracy. With this configuration, the correlation degree correction described above is not always necessary, and whether to apply it may be set appropriately according to the shooting scene.
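  A sketch of this variant, assuming SSD scores where smaller means higher correlation and the 5% cut from the example above:

```python
import numpy as np

def select_center_nearest(ssd_scores: np.ndarray,
                          top_fraction: float = 0.05) -> tuple[int, int]:
    """Among the top_fraction best-scoring candidates, return the one
    whose position is closest to the center of the search range."""
    h, w = ssd_scores.shape
    k = max(1, int(round(h * w * top_fraction)))
    best = np.argsort(ssd_scores, axis=None)[:k]   # k best (smallest) scores
    ys, xs = np.unravel_index(best, ssd_scores.shape)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    i = int(np.argmin((ys - cy) ** 2 + (xs - cx) ** 2))
    return int(ys[i]), int(xs[i])
```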

  In the description of FIG. 3, the target pixel region 15 is extracted from the reference frame image 12 at the start of trimming shooting; however, the reference frame image 12 may be replaced with the current frame image at an arbitrary time.

  Further, the above description covered the configuration in which, each time a new current frame image is output (acquired), a new target pixel region is extracted from it, the position of the search range is set based on the position (tracking position) of the extracted target pixel region, and a comparison target pixel region is selected based on the degree of correlation with the subject image in the extracted target pixel region. Alternatively, while a new target pixel region is still extracted from each new current frame image and the position of the search range is still set from its position (tracking position), the subject image serving as the target pixel region for the correlation may be updated only each time a predetermined number of new current frame images (for example, 10 frames) has been acquired.

This will be described in detail below.
FIG. 7 is a relationship diagram between the reference frame and the current frame. In this figure, Fi, Fi+1, Fi+2, ..., Fi+10 are frame images periodically output from the image processing unit 4. If Fi is the reference frame image at the start of trimming shooting, the subject image to be tracked is extracted from the target pixel region 15 of Fi, and the comparison pixel region 20 corresponding to the subject image in this target pixel region 15 is extracted in each of Fi+1, Fi+2, ..., Fi+10, the extracted comparison pixel region 20 giving the tracking position at each time point.

  Further, for example, when the comparison pixel region 20 corresponding to the subject image in the target pixel region 15 of Fi is extracted in Fi+10, ten frames after Fi, the subject image in that comparison pixel region 20 becomes the new subject image of the target pixel region 15 (the subject image to be tracked). In this way, changes in the subject over time can be handled flexibly; for example, even when the tracking target is lost, the subject can be captured and tracked again.
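  The two update rates described here (tracking position every frame, template every tenth frame) can be sketched as a loop; `extract_target` and `match_in_search_range` are assumed helpers, not part of the patent:

```python
def track(frames, extract_target, match_in_search_range, update_period=10):
    """Refresh the tracking position on every current frame, but refresh
    the template (the subject image of the target pixel region) only
    once every update_period frames, as in the Fi ... Fi+10 example."""
    template = extract_target(frames[0])          # reference frame Fi
    position = template.position
    for n, frame in enumerate(frames[1:], start=1):
        region = match_in_search_range(frame, template, center=position)
        position = region.position                # updated every frame
        if n % update_period == 0:
            template = region                     # updated every 10th frame
    return position
```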

1 Imaging apparatus
6 Control unit (subject tracking means)
6d Target pixel region extraction unit (target pixel region extraction means, first target image region extraction means, second target image region extraction means)
6e Comparison pixel region extraction unit (comparison pixel region extraction means, position derivation means, first comparison pixel region selection means, second comparison pixel region selection means, search range setting means)
6f Correlation degree derivation unit (correlation degree derivation means)
6g Correlation degree correction map
6h Correlation degree correction unit (correlation degree correction means)
6i Comparison pixel region selection unit (comparison pixel region selection means)
15 Target pixel region
20 Comparison pixel region

Claims (3)

  1. An imaging apparatus comprising subject tracking means for tracking a subject by block matching based on sequentially acquired frame images, wherein
    the subject tracking means comprises:
    target pixel region extraction means for extracting a first target pixel region and a second target pixel region from a predetermined frame image;
    position derivation means for deriving the position, relative to the frame image, of the first target pixel region extracted by the target pixel region extraction means;
    search range setting means for automatically setting a search range in the current frame image based on the position derived by the position derivation means;
    comparison pixel region extraction means for sequentially extracting predetermined comparison pixel regions from within the search range set by the search range setting means;
    correlation degree derivation means for deriving, for each comparison pixel region extracted by the comparison pixel region extraction means, a degree of correlation representing the similarity of its image content to the second target pixel region extracted by the target pixel region extraction means; and
    comparison pixel region selection means for selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived by the correlation degree derivation means,
    wherein the target pixel region extraction means extracts the comparison pixel region selected by the comparison pixel region selection means as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected by the comparison pixel region selection means as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.
  2. An imaging method comprising a subject tracking step of tracking a subject by block matching based on sequentially acquired frame images, wherein
    the subject tracking step comprises:
    a target pixel region extraction step of extracting a first target pixel region and a second target pixel region from a predetermined frame image;
    a position derivation step of deriving the position, relative to the frame image, of the first target pixel region extracted in the target pixel region extraction step;
    a search range setting step of automatically setting a search range in the current frame image based on the position derived in the position derivation step;
    a comparison pixel region extraction step of sequentially extracting predetermined comparison pixel regions from within the search range set in the search range setting step;
    a correlation degree derivation step of deriving, for each comparison pixel region extracted in the comparison pixel region extraction step, a degree of correlation representing the similarity of its image content to the second target pixel region extracted in the target pixel region extraction step; and
    a comparison pixel region selection step of selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived in the correlation degree derivation step,
    wherein the target pixel region extraction step extracts the comparison pixel region selected in the comparison pixel region selection step as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected in the comparison pixel region selection step as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.
  3. An imaging program for causing a computer to implement subject tracking means for tracking a subject by block matching based on sequentially acquired frame images, wherein
    the subject tracking means comprises:
    target pixel region extraction means for extracting a first target pixel region and a second target pixel region from a predetermined frame image;
    position derivation means for deriving the position, relative to the frame image, of the first target pixel region extracted by the target pixel region extraction means;
    search range setting means for automatically setting a search range in the current frame image based on the position derived by the position derivation means;
    comparison pixel region extraction means for sequentially extracting predetermined comparison pixel regions from within the search range set by the search range setting means;
    correlation degree derivation means for deriving, for each comparison pixel region extracted by the comparison pixel region extraction means, a degree of correlation representing the similarity of its image content to the second target pixel region extracted by the target pixel region extraction means; and
    comparison pixel region selection means for selecting a predetermined comparison pixel region from the extracted comparison pixel regions based on the degrees of correlation derived by the correlation degree derivation means,
    wherein the target pixel region extraction means extracts the comparison pixel region selected by the comparison pixel region selection means as a new first target pixel region each time a new current frame image is acquired, and extracts the comparison pixel region selected by the comparison pixel region selection means as a new second target pixel region each time a new current frame image has been acquired a predetermined number of times.
JP2011094000A 2011-04-20 2011-04-20 Imaging apparatus, imaging method, and imaging program Active JP5278483B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2011094000A JP5278483B2 (en) 2011-04-20 2011-04-20 Imaging apparatus, imaging method, and imaging program

Related Child Applications (1)

Application Number Title Priority Date Filing Date
JP2006201682 Division 2006-07-25

Publications (2)

Publication Number Publication Date
JP2011193496A JP2011193496A (en) 2011-09-29
JP5278483B2 true JP5278483B2 (en) 2013-09-04

Family

ID=44797867

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2011094000A Active JP5278483B2 (en) 2011-04-20 2011-04-20 Imaging apparatus, imaging method, and imaging program

Country Status (1)

Country Link
JP (1) JP5278483B2 (en)


Legal Events

Date Code Title

2012-11-01  A977  Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2012-11-27  A131  Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2013-01-25  A521  Written amendment (JAPANESE INTERMEDIATE CODE: A523)
(undated)   TRDD  Decision of grant or rejection written
2013-04-23  A01   Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2013-05-06  A61   First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
(undated)   R150  Certificate of patent or registration of utility model (JAPANESE INTERMEDIATE CODE: R150)