CN101155263A - Image processing apparatus, image processing method, image processing program, and image pickup apparatus - Google Patents
Info
- Publication number
- CN101155263A CNA200710161610XA CN200710161610A
- Authority
- CN
- China
- Prior art keywords
- data
- image
- base image
- view data
- overlapping
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
- H04N23/683—Vibration or motion blur correction performed by a processor, e.g. controlling the readout of an image memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/95—Computational photography systems, e.g. light-field imaging systems
- H04N23/951—Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio
Abstract
An image pickup apparatus includes: a first base image selection unit selecting a piece of image data as a first base image used in superposition; a shift amount calculation unit calculating an amount of shift between the first base image and another, different piece of image data; an image superposition unit detecting a superposing area, that is, an area in which the different piece of image data can be superposed on the first base image after a shift correction is made on the basis of the calculated amount of shift, and performing superposition for the superposing area; a second base image selection unit selecting a piece of image data as a second base image used in a non-superposing area; and an image combination unit combining a superposition result with the non-superposing area in the second base image, and outputting image data as a combination result.
Description
Technical field
The present invention relates to an image processing apparatus and an image pickup apparatus for processing multiple pieces of image data obtained by continuous shooting or similar means.
Background art
When an image is captured with a digital camera (image pickup apparatus) built into a mobile phone or the like, camera shake caused by, for example, trembling of the hand holding the phone during exposure affects the image to be obtained. To suppress the influence of camera shake, camera shake correction is applied to the obtained image.
In the camera shake correction process, multiple (N) pieces of image data obtained by continuous shooting are combined to obtain one combined image.
The following conventional combination methods are known.
(1) The first obtained image is defined as the base image, and the second through N-th images are sequentially combined with the first base image.
(2) The m-th obtained image (for example, m = N/2) is defined as the base image, and the other images are sequentially combined with the m-th base image.
In the case of method (1), camera shake usually proceeds in a certain direction. Therefore, when the first through N-th images are superposed in sequence, a large shift arises between the first image (that is, the base image) and the final target image to be superposed, which weakens the superposition.
In method (2), when the m-th obtained image (m = N/2) is the base image, the amount of shift may be about half of what it would be if the first or N-th image were defined as the base image, which improves the superposition. However, because the m-th image is the base image used for superposition, there is a time lag between the image the user believes was obtained at the moment the shutter was pressed and the image used as the base image when the images are superposed. As a result, the combined image may deviate considerably from the image the user expects.
Various camera shake correction techniques have been disclosed.
For example, Patent Document 1 discloses an electronic camera with two modes, namely a shake correction mode and a non-correction mode. The electronic camera performs shake correction only during CCD (charge-coupled device) exposure, and performs no correction (the non-correction mode) while the camera is in the shooting preparation stage or while pixel data is being output.
Patent Document 2 discloses a camera that, when an anti-shake shooting mode is selected, can be controlled by changing the operating speed of the lens and shutter used in the normal shooting mode. A shooting mode selection unit of the camera selects the camera's shooting mode, such as the anti-shake shooting mode or the normal shooting mode. When exposure begins, a speed changing unit outputs, according to the selection result of the shooting mode selection unit, an instruction to the lens drive unit and the shutter drive unit to change the operating speed of the lens and shutter.
[Patent Document 1] Japanese Laid-Open Patent Application No. 2003-333414, "Electronic camera"
[Patent Document 2] Japanese Laid-Open Patent Application No. H7-28149, "Anti-shake camera apparatus"
Summary of the invention
An object of the present invention is to provide an image data processing apparatus, an image processing method, and an image data processing program capable of obtaining a combined image with higher accuracy from multiple pieces of obtained image data.
Another object of the present invention is to provide an image pickup apparatus that can keep the combined image from deviating from the image the user expects.
An image data processing apparatus according to a first aspect of the present invention is an apparatus for processing multiple pieces of image data, and includes: a first base image selection unit that selects one piece of image data from the multiple pieces of obtained image data as a first base image used in superposition; a shift amount calculation unit that calculates the amount of shift between the first base image selected for superposition and another piece of image data among the obtained pieces; an image superposition unit that, after a shift correction based on the calculated amount of shift, detects a superposing area (an area in which the other piece of image data can be superposed on the selected first base image) and performs superposition on that area; a second base image selection unit that selects one piece of image data from the obtained pieces as a second base image used in the non-superposing area; and an image combination unit that combines the superposition result with the non-superposing area of the second base image and outputs image data as the combination result.
With this configuration, the first and second base image selection units separately select the first base image used in the superposing area and the second base image used in the non-superposing area when the obtained pieces of image data are superposed.
Therefore, by selecting from the obtained pieces, for example, the piece of image data with the smallest shake amount as the second base image used in the non-superposing area, or the piece in which the eyes of a human face in the non-superposing area are open widest, the image quality in the non-superposing area can be improved, and a combined image with higher accuracy can be obtained as the combination result.
An image pickup apparatus according to a second aspect of the present invention is an apparatus for processing multiple pieces of image data obtained by continuous shooting, and includes: an image sensor that performs photoelectric conversion on the light received from a target object and outputs photoelectric conversion data; an image data generation unit that generates image data from the photoelectric conversion data; a shooting control unit that outputs exposure control information to the image sensor in response to a shooting instruction, thereby obtaining multiple pieces of image data; a first base image selection unit that selects one piece of image data from the obtained pieces as a first base image used in superposition; a shift amount calculation unit that calculates the amount of shift between the first base image selected for superposition and another piece of image data among the obtained pieces; an image superposition unit that, after a shift correction based on the calculated amount of shift, detects a superposing area (an area in which the other piece of image data can be superposed on the selected first base image) and performs superposition on that area; a second base image selection unit that selects one piece of image data from the obtained pieces as a second base image used in the non-superposing area; and an image combination unit that combines the superposition result with the non-superposing area of the second base image and outputs image data as the combination result.
With this configuration, the first and second base image selection units separately select the first base image used in the superposing area and the second base image used in the non-superposing area when the obtained pieces of image data are superposed.
Therefore, by selecting from the obtained pieces of image data, for example, the piece corresponding to the first captured image as the second base image used in the non-superposing area, the combined image can be kept from deviating from the image the user expects, namely the image the user believes was obtained when the shutter release button was pressed. Likewise, by selecting from the obtained pieces the piece corresponding to the image captured at the moment the shutter sound is emitted as the second base image used in the non-superposing area, the combined image can be kept from deviating from the image the user believes was obtained at the moment the shutter sounded.
With the image data processing apparatus according to the present invention, a combined image with higher accuracy can be obtained from the multiple pieces of obtained image data.
In addition, with the image pickup apparatus according to the present invention, the combined image can be kept from deviating from the image the user expects.
Description of drawings
Fig. 1 is a schematic diagram of the superposing area detection method according to the present invention;
Fig. 2 is a block diagram of the image pickup apparatus common to all modes for carrying out the present invention;
Fig. 3 shows a configuration illustrating the operating principle of the correction unit according to the present invention, and is a block diagram of the correction unit used to realize the first and second modes of the present invention;
Fig. 4 is a flowchart of the shooting process and image processing according to the present invention;
Fig. 5 is a flowchart of the shooting process and image processing for carrying out the first mode of the present invention;
Fig. 6 is a flowchart of the shooting process and image processing for realizing the second mode of the present invention;
Fig. 7 is a block diagram of the configuration of the second base image selection unit (the base image selection unit for the non-superposing area) for realizing the third mode of the present invention;
Fig. 8 is a flowchart of the shooting process and image processing for realizing the third mode of the present invention;
Fig. 9 is a block diagram of the configuration of the second base image selection unit (the base image selection unit for the non-superposing area) for realizing the fourth mode of the present invention;
Fig. 10 is a flowchart of the shooting process and image processing for realizing the fourth mode of the present invention; and
Fig. 11 illustrates an example of a storage medium.
Embodiment
The following describes camera shake correction realized by combining a plurality of images. However, besides camera shake correction, the present invention can also be applied to other fields in which similar processing is performed.
In the image combination performed during camera shake correction, a specific object is shot continuously, each continuously shot image is aligned (shake-corrected), and the images are combined to produce a single image.
The image obtained as the combination result has two kinds of areas. The first kind (the superposing area) consists of pixels obtained by averaging the pixel values at the same position across the continuously shot images. The second kind (the non-superposing area) consists of pixels obtained, for example, by multiplying the pixel value of a pixel in one of the continuously shot images by an integer. That is, when three images are shot continuously, the image produced by combining them as the combination result contains a superposing area and a non-superposing area (an area generated from the information of only one image).
Fig. 1 is a schematic diagram of the superposing area detection method according to the present invention.
In Fig. 1, when the three images 1 to 3 are simply combined, the resulting image is, for example, image 4.
Next, after shift correction is performed and the images are aligned, the difference between the pixel values at the same position in each image is calculated, and the portions where the difference equals or exceeds a threshold are defined as the changed area (non-superposing area). In image 5, for example, the changed area is shown in white and the unchanged area (superposing area) in black.
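The changed-area detection just described amounts to a per-pixel threshold test. A minimal sketch in Python, assuming the images are already aligned and represented as small grayscale matrices; the threshold value and the data are assumptions for illustration only:

```python
def detect_changed_area(images, threshold=30):
    """Mark a pixel as changed (non-superposing) if any image differs
    from the first by at least `threshold` at that position."""
    h, w = len(images[0]), len(images[0][0])
    changed = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            base = images[0][y][x]
            if any(abs(img[y][x] - base) >= threshold for img in images[1:]):
                changed[y][x] = True
    return changed

# Three aligned 1x4 "images": only the last pixel varies (a moving object).
imgs = [
    [[100, 100, 100, 200]],
    [[102, 101,  99,  50]],
    [[ 99, 100, 101, 120]],
]
mask = detect_changed_area(imgs)
print(mask)  # [[False, False, False, True]]
```

The `True` pixel corresponds to the white changed area in image 5, and the `False` pixels to the black unchanged area.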
The present invention is characterized by selecting separate base images for the changed area and the unchanged area. For example, when image 2 is used in the changed area and the image obtained by combining images 1 to 3 is used in the unchanged area, an image such as image 6 is obtained as the combination result of images 1 to 3.
Next, the structures of the image pickup apparatus used in every mode for carrying out the present invention, and of the correction unit within the image pickup apparatus, are described.
Fig. 2 is a block diagram of the image pickup apparatus common to all modes for carrying out the present invention.
In Fig. 2, the image pickup apparatus 10 includes: a lens 11, an image sensor 12, a shooting control unit 13, an AGC (automatic gain control) circuit 16, an AD converter (analog-to-digital converter, ADC) 17, an image processing unit 18 (also called an image signal processor, ISP), an image holding unit 22, and a correction unit 23.
The image sensor 12 comprises a photosensitive unit (not shown in the drawings) and a signal output unit (not shown in the drawings). The photosensitive unit performs photoelectric conversion, that is, it converts the light received through the lens 11 from the target object into charge, and the signal output unit outputs the accumulated charge as photoelectric conversion data.
In response to a shooting instruction, the shooting control unit 13 outputs to the image sensor 12 a control signal containing the exposure control information calculated for the target object, so that multiple pieces of image data can be obtained by the image sensor 12 in the shooting process (the processing performed by the photosensitive unit and the signal output unit).
The multiple pieces of image data obtained by the image sensor 12 in the shooting process are stored in the image holding unit 22 through the image processing unit 18.
The correction unit 23 reads the multiple pieces of image data stored in the image holding unit 22, generates camera-shake-corrected (combined) image data from the read pieces, and outputs the generated image data to a memory (not shown in the drawings) arranged at the subsequent stage for storing image data. The operation of the correction unit 23 depends on the mode used to carry out the present invention.
When the shooting control unit 13, the image processing unit 18, and the correction unit 23 are implemented as programs, the CPU (central processing unit) that executes the shooting control unit 13 and the image processing unit 18 may be the same CPU that executes the correction unit 23, or a different CPU.
Fig. 3 shows a configuration illustrating the operating principle of the correction unit according to the present invention, and is a block diagram of the correction unit used to realize the first and second modes of the present invention.
In Fig. 3, the correction unit 30 includes: a first base image selection unit 31, a shift amount calculation unit 32, a superposing area detection unit 33, an image superposition unit 34, a second base image selection unit 36, and an image combination unit 38.
The first base image selection unit 31 selects one piece of image data from the multiple pieces stored in the image holding unit 22 at the preceding stage of the correction unit 30, as the first base image used when superposition is performed.
The shift amount calculation unit 32 calculates the amount of shift between the first base image selected for superposition and each of the other pieces of image data stored in the image holding unit 22.
The superposing area detection unit 33 detects the superposing area from the calculated amount of shift, that is, the area in which a piece of image data other than the selected first base image can be shift-corrected and superposed on the first base image.
The image superposition unit 34 performs superposition on the detected superposing area.
The second base image selection unit 36 selects one piece of image data from the multiple pieces stored in the image holding unit 22 as the second base image used in the non-superposing area.
The image combination unit 38 combines the superposition result with the non-superposing area of the second base image, and outputs the image data of the combination result to the subsequent stage.
Fig. 4 is the main flowchart of the shooting process and image processing according to the present invention. The flowchart is carried out by the units shown in Fig. 2 and Fig. 3.
In Fig. 4, images are first shot continuously in step S101. The main flow is described below.
First, in response to a shooting instruction from the user (for example, pressing the shutter release button), the shooting control unit 13 shown in Fig. 2 outputs to the image sensor 12 a control signal containing the exposure control information calculated for the target object. Then, one piece of image data is obtained in the shooting process performed by the image sensor 12 (by the photosensitive unit and the signal output unit). That is, the image sensor 12 converts the light received through the lens 11 from the target object into charge, accumulates the charge, and outputs the accumulated charge as photoelectric conversion data to the image processing unit 18.
In the following step S102, the first base image selection unit 31 shown in Fig. 3 selects one piece of image data from the multiple pieces stored in the image holding unit 22, as the first base image used when superposition is performed.
In step S103, the shift amount calculation unit 32 calculates the amount of shift between the selected first base image and another piece of image data (the target image to be superposed) held by the image holding unit 22. For example, because the continuously shot images shown in Fig. 1 were obtained while the vehicle was moving, the position of the vehicle changes in each image. The background of the vehicle, such as the distant mountains and forest (not necessarily shown in each image of Fig. 1), also shifts slightly because of the influence of camera shake, but compared with the change of the vehicle's position in each image, the background shift is negligible to some extent. In step S103, the shift between the background portions of the images is calculated as the amount of shift. Once the amount of shift between two images is obtained, the corresponding pixels between the two images can be detected.
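The shift amount calculation of step S103 can be sketched as exhaustive block matching: candidate shifts are tried and the one minimizing the mean absolute difference between overlapping pixels is kept. This is one plausible realization under stated assumptions, not necessarily the specific method of the patent:

```python
def estimate_shift(base, target, max_shift=2):
    """Return the (dy, dx) that best aligns `target` to `base`
    by minimizing the mean absolute difference over the overlap."""
    h, w = len(base), len(base[0])
    best, best_score = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            total, count = 0, 0
            for y in range(h):
                for x in range(w):
                    ty, tx = y + dy, x + dx
                    if 0 <= ty < h and 0 <= tx < w:
                        total += abs(base[y][x] - target[ty][tx])
                        count += 1
            if count and total / count < best_score:
                best_score = total / count
                best = (dy, dx)
    return best

base   = [[0, 0, 0, 0],
          [0, 9, 9, 0],
          [0, 9, 9, 0],
          [0, 0, 0, 0]]
# The same pattern shifted one pixel to the right.
target = [[0, 0, 0, 0],
          [0, 0, 9, 9],
          [0, 0, 9, 9],
          [0, 0, 0, 0]]
print(estimate_shift(base, target))  # (0, 1)
```

With the shift known, `target[y + dy][x + dx]` is the pixel corresponding to `base[y][x]`, which is exactly the correspondence used in the following steps.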
In step S104, the superposing area detection unit 33 performs shift correction using the calculated amount of shift, thereby aligning the target image to be superposed with the first base image, and calculates the pixel difference between corresponding pixels. It then determines that superposition can be performed for the pixels whose calculated difference is below the threshold, and detects the superposing area of the first base image and the target image, where the superposing area is defined as the set of pixels whose difference is below the threshold.
In step S105, the image superposition unit 34 performs superposition on the detected superposing area. As a result, the pixels of the corresponding part of the target image are superposed on the detected superposing area of the first base image.
In step S106, it is determined whether the above processing has been completed for the predetermined number of images. For example, when three images are obtained by continuous shooting and the second image is set as the first base image, superposition is performed between the second and first images and between the second and third images; therefore, the predetermined number is two.
If it is determined in step S106 that the processing has not yet been completed for the predetermined number of images, control returns to step S103.
On the other hand, if it is determined in step S106 that the processing has been completed for the predetermined number of images, then in step S107 the second base image selection unit 36 selects one piece of image data from the multiple pieces held in the image holding unit 22, as the second base image used in the non-superposing area. By repeatedly performing steps S103 to S105, the common part between the first base image and the target images is detected as the final superposing area, and the area other than the final superposing area, that is, the non-superposing area (changed area), is obtained.
In step S108, the image combination unit 38 combines the superposition result (for example, the first base image after the superposition processing) with the non-superposing area of the second base image, and outputs the image data of the combination result to the subsequent stage.
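The flow of steps S102 to S108 can be sketched end to end. A simplified illustration in Python, assuming the images are already aligned (zero shift) so that the shift correction can be skipped; the threshold and the pixel values are illustrative assumptions:

```python
def combine(images, first_base_idx, second_base_idx, threshold=30):
    """Average aligned pixels in the superposing area; take the pixels of
    the second base image in the non-superposing (changed) area."""
    h, w = len(images[0]), len(images[0][0])
    out = [[0] * w for _ in range(h)]
    base = images[first_base_idx]
    for y in range(h):
        for x in range(w):
            vals = [img[y][x] for img in images]
            if all(abs(v - base[y][x]) < threshold for v in vals):
                out[y][x] = sum(vals) // len(vals)          # superposing area
            else:
                out[y][x] = images[second_base_idx][y][x]   # non-superposing area
    return out

# Three aligned 1x2 "images": pixel 0 is stable, pixel 1 changes.
imgs = [
    [[100, 200]],
    [[104,  60]],
    [[102, 130]],
]
# First base image: index 1; second base image: the first shot (index 0).
print(combine(imgs, 1, 0))  # [[102, 200]]
```

The stable pixel is averaged across all three shots, while the changed pixel is taken unmodified from the second base image, mirroring the split performed by steps S105, S107, and S108.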
Fig. 5 is a flowchart of the shooting process and image processing for carrying out the first mode of the present invention. The flowchart is carried out by the units shown in Fig. 2 and Fig. 3.
Steps S101 to S106 are similar to those shown in Fig. 4 and are not described again.
In step S102 of this process, the predetermined n-th piece of image data is selected as the first base image, but other methods of selecting the first base image may also be used.
If it is determined in step S106 that the processing has been completed for the predetermined number of images, then in step S201 the second base image selection unit 36 selects, from the multiple pieces of image data held in the image holding unit 22, the piece corresponding to the first captured image, as the second base image used in the non-superposing area.
In the following step S202, the image combination unit 38 combines the superposition result (for example, the first base image after the superposition processing) with the non-superposing area of the second base image, and outputs the image data of the combination result to the subsequent stage.
In the first mode of the present invention, the piece of image data corresponding to the first captured image is selected from the obtained pieces as the second base image used in the non-superposing area, which keeps the combined image from deviating from the image the user expects, namely the image the user believes was obtained when the shutter release button was pressed.
Fig. 6 is a flowchart of the shooting process and image processing for realizing the second mode of the present invention. The flowchart is carried out by the units shown in Fig. 2 and Fig. 3.
Steps S101 to S106 are similar to those shown in Fig. 4 and are not described again.
If it is determined in step S106 that the processing has been completed for the predetermined number of images, then in step S301 the second base image selection unit 36 selects, from the multiple pieces of image data held in the image holding unit 22, the piece corresponding to the image captured at the moment the shutter sound is emitted, as the second base image used in the non-superposing area.
In the following step S302, the image combination unit 38 combines the superposition result (for example, the first base image after the superposition processing) with the non-superposing area of the second base image, and outputs the image data of the combination result to the subsequent stage.
In the second mode of the present invention, the piece of image data corresponding to the image captured at the moment the shutter sound is emitted is selected from the obtained pieces as the second base image used in the non-superposing area, which keeps the combined image from deviating from the image the user believes was obtained at the moment the shutter sounded.
The third mode of the present invention is described below.
Fig. 7 is a block diagram of the configuration of the second base image selection unit (the base image selection unit for the non-superposing area) for realizing the third mode of the present invention.
In Fig. 7, the second base image selection unit 41 includes a shake amount calculation unit 42.
The shake amount calculation unit 42 detects the edges (contours) in the non-superposing area of each piece of image data held by the image holding unit 22. An edge is a density change boundary within a piece of image data, and the non-superposing area can be observed from the second base image selection unit 41. The shake amount calculation unit 42 then tracks the travel direction and travel amount of the edge portions across the continuously shot pieces of image data, and calculates a vector for each piece, namely the vertical, horizontal, and clockwise/counterclockwise shake amounts.
For example, the shake amount calculation unit 42 calculates the resolution (sharpness) of the detected edge portions in each piece of image data held by the image holding unit 22, and notifies the second base image selection unit 41 of information identifying the piece of image data with the highest resolution. In this case, the higher the resolution, the smaller the shake amount, and the lower the resolution, the larger the shake amount. That is, the piece of image data with the highest resolution is the piece with the smallest shake amount.
The second base image selection unit 41 selects the piece of image data specified by the information notified by the shake amount calculation unit 42 as the base image used in the non-superposing area.
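One way to realize the criterion above (highest edge sharpness implies smallest shake) is to score each candidate image by its gradient strength and select the best-scoring one. A minimal sketch in Python; the gradient-sum metric is an assumption for illustration, since the text does not specify the exact resolution measure:

```python
def sharpness(img):
    """Sum of absolute horizontal and vertical pixel gradients; higher
    means crisper edges, hence (by the text's assumption) less shake."""
    h, w = len(img), len(img[0])
    total = 0
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                total += abs(img[y][x + 1] - img[y][x])
            if y + 1 < h:
                total += abs(img[y + 1][x] - img[y][x])
    return total

def select_least_shaken(images):
    """Return the index of the image with the highest sharpness score."""
    return max(range(len(images)), key=lambda i: sharpness(images[i]))

crisp   = [[0, 255], [255, 0]]     # hard edges
blurred = [[96, 160], [160, 96]]   # the same pattern, smeared by shake
print(select_least_shaken([blurred, crisp]))  # 1
```

In a full implementation the score would be computed only over the non-superposing area, as the shake amount calculation unit 42 does.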
Fig. 8 is that basis is in order to the photographing process of realization the present invention three-mode and the flow chart of image processing.Realize this flow chart by each unit shown in Fig. 2, Fig. 3 and Fig. 7.
Because step S101 to S106 is similar to step as shown in Figure 4, so repeat no more.
If it is determined in step S106 that the above operations have been completed for the predetermined number of images, then in step S401 the shake amount calculation unit 42 shown in Fig. 7 detects edges (contours) in the non-overlapping region of each piece of the plural image data held by the image holding unit 22. An edge is a boundary where the density changes within a piece of image data, and the non-overlapping region is as seen from the second base image selection unit 41. Also in step S401, the shake amount calculation unit 42 tracks the travel direction and travel amount of the edge portions in each piece of image data obtained by continuous shooting, and calculates vectors, namely the vertical, horizontal, and clockwise/counterclockwise rotational shake amounts in each piece of image data. For example, the shake amount calculation unit 42 calculates the resolution of the edge portions detected in each piece of the plural image data held by the image holding unit 22, and notifies the second base image selection unit 41 of information identifying the image data having the highest resolution among the plural image data.
In the following step S402, the second base image selection unit 41 shown in Fig. 7 selects the image data specified by the information notified from the shake amount calculation unit 42 as the base image to be used in the non-overlapping region.
In the following step S403, the image combining unit 38 shown in Fig. 3 combines the superimposition result (for example, the first base image after the superimposition processing) with the second base image in the non-overlapping region, and outputs the image data as the combined result to the subsequent stage.
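The combination performed by unit 38 in step S403 amounts to a pixel-wise masked selection. The sketch below uses assumed names (`combine`, `overlap_mask`) and constant-valued arrays purely for illustration; it takes the superimposition result inside the overlapping region and the second base image everywhere else.

```python
import numpy as np

def combine(superimposed, second_base, overlap_mask):
    """Pixel-wise combination: the superimposition result inside the
    overlapping region, the second base image in the non-overlapping region."""
    return np.where(overlap_mask, superimposed, second_base)

# demo: the left half of the frame overlaps across shots, the right half does not
overlap = np.zeros((4, 4), dtype=bool); overlap[:, :2] = True
sup = np.full((4, 4), 5.0)     # stands in for the first base image after superimposition
base2 = np.full((4, 4), 9.0)   # stands in for the selected second base image
out = combine(sup, base2, overlap)
```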
In the description above, the shake amount calculation unit 42 detects edges in the plural image data held by the image holding unit 22, but it may instead detect feature points.
The processing performed by the units shown in Fig. 7 in that case is described below.
The shake amount calculation unit 42 detects edges (contours) in the non-overlapping region of each piece of the plural image data held by the image holding unit 22. An edge is a boundary where the density changes within a piece of image data, and the non-overlapping region is as seen from the second base image selection unit 41. The shake amount calculation unit 42 then extracts, from the edge portions, parts whose features are easy to identify, such as high-luminance points, end points, vertices, branch points, and intersection points, as feature points, and tracks the travel direction and travel amount of the feature point portions in each piece of image data obtained by continuous shooting. From the tracking result, the shake amount calculation unit 42 calculates vectors, namely the vertical, horizontal, and clockwise/counterclockwise rotational shake amounts in each piece of image data.
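The feature point variant can be sketched under strong simplifying assumptions: the "feature" below is just the single brightest pixel, only the translational (vertical/horizontal) component of the shake vector is tracked, and `feature_point` / `jitter_vectors` are illustrative names rather than the patent's method.

```python
import numpy as np

def feature_point(img):
    """Crude feature detector: coordinates of the single brightest pixel."""
    return np.unravel_index(np.argmax(img), img.shape)

def jitter_vectors(frames):
    """Displacement of the tracked feature point in each frame relative to the
    first frame (vertical/horizontal shake only; rotation is ignored here)."""
    y0, x0 = feature_point(frames[0])
    return [(int(y - y0), int(x - x0)) for y, x in map(feature_point, frames)]

# demo: one bright feature point that drifts between two continuous shots
f0 = np.zeros((8, 8)); f0[2, 3] = 1.0
f1 = np.zeros((8, 8)); f1[4, 6] = 1.0
```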
For example, the shake amount calculation unit 42 calculates the resolution of the feature point portions detected in each piece of the plural image data held by the image holding unit 22, and notifies the second base image selection unit 41 of information identifying the image data having the highest resolution among the plural image data. Here too, the image data with the highest resolution is the image data with the smallest shake amount.
The second base image selection unit 41 selects the image data specified by the information notified from the shake amount calculation unit 42 as the base image to be used in the non-overlapping region.
In the third mode for carrying out the present invention, the image data having the smallest shake amount in the non-overlapping region among the plural image data obtained is selected as the second base image to be used in the non-overlapping region, so the image quality in the non-overlapping region of the image data as the combined result is improved, and a combined image with higher accuracy is obtained. The shake amount may also be taken into account in the offset calculation in the first to third modes for carrying out the present invention.
The fourth mode for carrying out the present invention is described below.
Fig. 9 is a block diagram of the configuration of the second base image selection unit (the base image selection unit for the non-overlapping region) according to the fourth mode for carrying out the present invention.
In Fig. 9, the second base image selection unit 51 includes a face recognition unit 52.
The face recognition unit 52 first detects a face search area in the non-overlapping region, as seen from the second base image selection unit 51, of the plural image data held by the image holding unit 22. That is, the face recognition unit 52 extracts skin-color regions from the image data as color information, and when the shape and size of a skin-color region satisfy the conditions for a face region, defines the extracted skin-color region as a face search area.
Next, the face recognition unit 52 extracts candidates for facial feature points (right eyebrow, left eyebrow, right eye, left eye, nostrils, and mouth) as facial elements. For example, regions in the detected face search area corresponding to the following items are extracted as facial feature points:
1. regions extending in the horizontal direction
2. approximately elliptical red regions
3. regions whose luminance values are lower than that of the skin-color region
For example, a facial feature point that has low luminance values with little variation and extends in the horizontal direction can be recognized as an eyebrow or a closed eye. An approximately elliptical facial feature point with a dark central area surrounded by two bright areas can be recognized as an open eye. An approximately elliptical facial feature point whose color information is reddish can be recognized as a mouth.
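The skin-color stage of the face search can be sketched as below. The thresholds are a commonly used rough heuristic, not values from the patent, and `skin_mask` is an assumed name; the shape/size conditions and the feature-point rules above are left out of this sketch.

```python
import numpy as np

def skin_mask(rgb):
    """Very rough skin-color rule on an RGB array in [0, 255]:
    red dominant over green and blue, with moderate channel levels
    (an illustrative heuristic, not the patent's actual conditions)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

# demo: a 2x2 patch of skin-like pixels inside a dark background
img = np.zeros((6, 6, 3))
img[2:4, 2:4] = (200, 140, 110)   # skin-like color
mask = skin_mask(img)
```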
In addition, the face recognition unit 52 notifies the second base image selection unit 51 of information identifying the image data in which the two detected eyes are open widest among the plural image data held by the image holding unit 22.
The second base image selection unit 51 selects the image data specified by the information notified from the face recognition unit 52 as the base image to be used in the non-overlapping region.
The non-overlapping regions of the plural image data held by the image holding unit 22 may not all contain a face. In this case, the face recognition unit 52 notifies the second base image selection unit 51 that no image showing open eyes was detected.
The operation performed by the second base image selection unit 51 upon being notified that no image showing open eyes was detected is not limited; for example, it may be performed according to the operations of the first to third modes for carrying out the present invention.
Fig. 10 is a flowchart of the shooting process and image processing according to the fourth mode for carrying out the present invention. This flowchart is realized by the units shown in Fig. 2, Fig. 3, and Fig. 9.
Steps S101 to S106 are similar to the steps shown in Fig. 4 and are therefore not described again.
If it is determined in step S106 that the above operations have been completed for the predetermined number of images, then in step S501 the face recognition unit 52 shown in Fig. 9 detects a face search area in the non-overlapping region, as seen from the second base image selection unit 51, of the plural image data held by the image holding unit 22, and determines whether the area contains a face.
If the face recognition unit 52 determines in step S501 that a face is contained, then in step S502 it notifies the second base image selection unit 51 of information identifying the image data in which the two detected eyes are open widest among the plural image data held by the image holding unit 22. The second base image selection unit 51 then selects the image data specified by the notified information as the base image to be used in the non-overlapping region. Control then proceeds to step S504.
On the other hand, if the face recognition unit 52 determines in step S501 that no face is contained, then in step S503 the image data corresponding to the first captured image among the plural image data held by the image holding unit 22 is selected as the second base image to be used in the non-overlapping region, and control proceeds to step S504.
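The branch through steps S501 to S503 amounts to a small selection rule. The sketch below assumes a hypothetical scoring interface: `eye_open_scores[i]` is the eye-openness measured for frame `i`, or `None` when no face was found in that frame, in which case the first captured frame (index 0) is used.

```python
def select_second_base(eye_open_scores):
    """Pick the frame whose eyes are open widest; if no face was detected
    in any frame (no usable scores), fall back to the first captured frame."""
    if not eye_open_scores or all(s is None for s in eye_open_scores):
        return 0
    return max(range(len(eye_open_scores)),
               key=lambda i: -1 if eye_open_scores[i] is None else eye_open_scores[i])
```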
In the following step S504, the image combining unit 38 shown in Fig. 3 combines the superimposition result (for example, the first base image after the superimposition processing) with the second base image in the non-overlapping region, and outputs the image data as the combined result to the subsequent stage.
In the description above, the processing performed when it is determined that no face is contained (the processing in step S503) is the same as the operation performed in the first mode for carrying out the present invention, but the operations of the second and third modes, or other methods of selecting the base image, may also be applied to the processing in step S503.
In the fourth mode for carrying out the present invention, the image data in which the eyes of the human face in the non-overlapping region are open widest among the plural image data obtained is selected as the second base image to be used in the non-overlapping region, so the image quality in the non-overlapping region of the image data as the combined result is improved, and a combined image with higher accuracy is obtained.
Fig. 11 illustrates examples of storage media.
The operation of combining plural image data according to the present invention can be realized by an image data processing apparatus 81. A program and data for the processing of the present invention can be loaded into the memory of the image data processing apparatus 81 for execution from a storage device 95 of the apparatus, loaded into the memory from a portable storage medium 83, or loaded into the memory over a network from an external storage device 82.
Claims (18)
1. An image data processing apparatus for processing plural image data, the image data processing apparatus comprising:
a first base image selection unit that selects one piece of image data from the obtained plural image data as a first base image to be used in a superimposition operation;
an offset calculation unit that calculates an offset between the first base image selected for the superimposition operation and another, different piece of image data among the obtained plural image data;
an image superimposing unit that detects an overlapping region after performing offset correction based on the calculated offset, and performs the superimposition operation on the overlapping region, the overlapping region being a region in which the different image data can be superimposed on the selected first base image;
a second base image selection unit that selects one piece of image data from the obtained plural image data as a second base image to be used in a non-overlapping region; and
an image combining unit that combines the result of the superimposition operation with the second base image in the non-overlapping region, and outputs image data as the combined result.
2. The apparatus according to claim 1, wherein the second base image selection unit selects, from the plural image data, the image data having the smallest shake amount in the non-overlapping region as the second base image to be used in the non-overlapping region.
3. The apparatus according to claim 2, wherein the image data having the smallest shake amount is the image data having the highest resolution in the edge portions detected in the image data.
4. The apparatus according to claim 2, wherein the image data having the smallest shake amount is the image data having the highest resolution in the feature point portions extracted from the image data.
5. The apparatus according to claim 1, further comprising:
a face recognition unit that recognizes a degree to which eyes on a human face are open in the non-overlapping region of the obtained image data; wherein
the second base image selection unit selects the image data recognized by the face recognition unit as indicating the largest degree to which the eyes on the human face are open, as the second base image to be used in the non-overlapping region.
6. An image pickup apparatus for processing plural image data obtained by continuous shooting, the image pickup apparatus comprising:
an image sensor that performs photoelectric conversion on light received from an object to be captured, and outputs photoelectric conversion data;
an image data generation unit that generates image data based on the photoelectric conversion data;
a shooting control unit that outputs exposure control information to the image sensor in response to a shooting instruction, and obtains plural image data;
a first base image selection unit that selects one piece of image data from the obtained plural image data as a first base image to be used in a superimposition operation;
an offset calculation unit that calculates an offset between the first base image selected for the superimposition operation and another, different piece of image data among the obtained plural image data;
an image superimposing unit that detects an overlapping region after performing offset correction based on the calculated offset, and performs the superimposition operation on the overlapping region, the overlapping region being a region in which the different image data can be superimposed on the selected first base image;
a second base image selection unit that selects one piece of image data from the obtained plural image data as a second base image to be used in a non-overlapping region; and
an image combining unit that combines the result of the superimposition operation with the second base image in the non-overlapping region, and outputs image data as the combined result.
7. The apparatus according to claim 6, wherein the second base image selection unit selects, from the obtained plural image data, the image data corresponding to the first captured image as the second base image to be used in the non-overlapping region.
8. The apparatus according to claim 6, wherein the second base image selection unit selects, from the obtained plural image data, the image data corresponding to the image captured at the moment the shutter sound is produced as the second base image to be used in the non-overlapping region.
9. An image processing method for processing plural image data, the method comprising the following steps:
a first base image selection step of selecting one piece of image data from the obtained plural image data as a first base image to be used in a superimposition operation;
an offset calculation step of calculating an offset between the first base image selected for the superimposition operation and another, different piece of image data among the obtained plural image data;
an image superimposing step of detecting an overlapping region after performing offset correction based on the calculated offset, and performing the superimposition operation on the overlapping region, the overlapping region being a region in which the different image data can be superimposed on the selected first base image;
a second base image selection step of selecting one piece of image data from the obtained plural image data as a second base image to be used in a non-overlapping region; and
an image combination step of combining the result of the superimposition operation with the second base image in the non-overlapping region, and outputting image data as the combined result.
10. The method according to claim 9, wherein in the second base image selection step, the image data having the smallest shake amount in the non-overlapping region is selected from the plural image data as the second base image to be used in the non-overlapping region.
11. The method according to claim 10, wherein the image data having the smallest shake amount is the image data having the highest resolution in the edge portions detected in the image data.
12. The method according to claim 10, wherein the image data having the smallest shake amount is the image data having the highest resolution in the feature point portions extracted from the image data.
13. The method according to claim 9, further comprising the following steps:
a face recognition step of recognizing a degree to which eyes on a human face are open in the non-overlapping region of the obtained image data; wherein
in the second base image selection step, the image data recognized in the face recognition step as indicating the largest degree to which the eyes on the human face are open is selected as the second base image to be used in the non-overlapping region.
14. A computer-readable storage medium storing an image data processing program for instructing a computer to process plural image data, the image data processing program comprising the following steps:
a first base image selection step of selecting one piece of image data from the obtained plural image data as a first base image to be used in a superimposition operation;
an offset calculation step of calculating an offset between the first base image selected for the superimposition operation and another, different piece of image data among the obtained plural image data;
an image superimposing step of detecting an overlapping region after performing offset correction based on the calculated offset, and performing the superimposition operation on the overlapping region, the overlapping region being a region in which the different image data can be superimposed on the selected first base image;
a second base image selection step of selecting one piece of image data from the obtained plural image data as a second base image to be used in a non-overlapping region; and
an image combination step of combining the result of the superimposition operation with the second base image in the non-overlapping region, and outputting image data as the combined result.
15. The storage medium according to claim 14, wherein in the second base image selection step, the image data processing program instructs the computer to select, from the plural image data, the image data having the smallest shake amount in the non-overlapping region as the second base image to be used in the non-overlapping region.
16. The storage medium according to claim 15, wherein the image data having the smallest shake amount is the image data having the highest resolution in the edge portions detected in the image data.
17. The storage medium according to claim 15, wherein the image data having the smallest shake amount is the image data having the highest resolution in the feature point portions extracted from the image data.
18. The storage medium according to claim 14, wherein the image data processing program further comprises:
a face recognition step of recognizing a degree to which eyes on a human face are open in the non-overlapping region of the obtained image data; wherein
in the second base image selection step, the image data processing program instructs the computer to select the image data recognized in the face recognition step as indicating the largest degree to which the eyes on the human face are open, as the second base image to be used in the non-overlapping region.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006261788A JP4861109B2 (en) | 2006-09-27 | 2006-09-27 | Image data processing apparatus, image data processing method, image data processing program, and imaging apparatus |
JP2006261788 | 2006-09-27 |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101155263A true CN101155263A (en) | 2008-04-02 |
CN100559844C CN100559844C (en) | 2009-11-11 |
Family
ID=39224443
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB200710161610XA Expired - Fee Related CN100559844C (en) | 2006-09-27 | 2007-09-27 | Image processing apparatus, method and image pick-up device |
Country Status (4)
Country | Link |
---|---|
US (1) | US8026932B2 (en) |
JP (1) | JP4861109B2 (en) |
KR (1) | KR100840856B1 (en) |
CN (1) | CN100559844C (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5217270B2 (en) * | 2007-06-26 | 2013-06-19 | ソニー株式会社 | Image processing apparatus and method, and program |
JP5009204B2 (en) * | 2008-03-14 | 2012-08-22 | オリンパスイメージング株式会社 | Image capturing apparatus and image composition method in image capturing apparatus |
TW201002053A (en) * | 2008-06-27 | 2010-01-01 | Altek Corp | Digital image composition method |
JP2010093679A (en) * | 2008-10-10 | 2010-04-22 | Fujifilm Corp | Imaging apparatus, and imaging control method |
US8411321B2 (en) * | 2009-09-09 | 2013-04-02 | Seiko Epson Corporation | Printing apparatus, layout adjustment method, program and recording medium |
JP5424835B2 (en) * | 2009-11-30 | 2014-02-26 | キヤノン株式会社 | Image processing apparatus and image processing method |
JP2011119930A (en) | 2009-12-02 | 2011-06-16 | Seiko Epson Corp | Imaging apparatus, imaging method, and imaging program |
JP2012010212A (en) * | 2010-06-28 | 2012-01-12 | Casio Comput Co Ltd | Image display device and program |
US20120019686A1 (en) * | 2010-07-23 | 2012-01-26 | Casio Computer Co., Ltd. | Image synthesizing device, image synthesizing method and computer readable medium |
CN111415298B (en) * | 2020-03-20 | 2023-06-02 | 北京百度网讯科技有限公司 | Image stitching method and device, electronic equipment and computer readable storage medium |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4553165A (en) * | 1983-08-11 | 1985-11-12 | Eastman Kodak Company | Transform processing method for reducing noise in an image |
US5469536A (en) * | 1992-02-25 | 1995-11-21 | Imageware Software, Inc. | Image editing system including masking capability |
US5448053A (en) * | 1993-03-01 | 1995-09-05 | Rhoads; Geoffrey B. | Method and apparatus for wide field distortion-compensated imaging |
JPH0728149A (en) | 1993-07-13 | 1995-01-31 | Olympus Optical Co Ltd | Camera shake preventive device |
JPH10191136A (en) * | 1996-12-27 | 1998-07-21 | Canon Inc | Image pickup device and image synthesizer |
US6429895B1 (en) * | 1996-12-27 | 2002-08-06 | Canon Kabushiki Kaisha | Image sensing apparatus and method capable of merging function for obtaining high-precision image by synthesizing images and image stabilization function |
JP3634556B2 (en) * | 1997-05-12 | 2005-03-30 | キヤノン株式会社 | Image processing method and system |
US6157747A (en) * | 1997-08-01 | 2000-12-05 | Microsoft Corporation | 3-dimensional image rotation method and apparatus for producing image mosaics |
JP4002655B2 (en) * | 1998-01-06 | 2007-11-07 | 株式会社日立製作所 | Pattern inspection method and apparatus |
JP4110560B2 (en) * | 1998-05-20 | 2008-07-02 | カシオ計算機株式会社 | Image processing method and apparatus |
US6483538B2 (en) * | 1998-11-05 | 2002-11-19 | Tektronix, Inc. | High precision sub-pixel spatial alignment of digital images |
JP3600755B2 (en) * | 1999-05-13 | 2004-12-15 | 三菱電機株式会社 | Face image processing device |
JP2001222046A (en) * | 1999-12-03 | 2001-08-17 | Sharp Corp | Digital camera, digital camera recovery system and picture browsing method |
JP3927353B2 (en) * | 2000-06-15 | 2007-06-06 | 株式会社日立製作所 | Image alignment method, comparison inspection method, and comparison inspection apparatus in comparison inspection |
GB0031016D0 (en) * | 2000-12-20 | 2001-01-31 | Alphafox Systems Ltd | Security systems |
JP4596226B2 (en) * | 2001-06-27 | 2010-12-08 | ソニー株式会社 | Image processing apparatus and method, recording medium, and program |
WO2003003720A1 (en) * | 2001-06-28 | 2003-01-09 | Omnivee Inc. | Method and apparatus for control and processing of video images |
US7650044B2 (en) * | 2001-07-30 | 2010-01-19 | Cedara Software (Usa) Limited | Methods and systems for intensity matching of a plurality of radiographic images |
JP4050498B2 (en) | 2001-11-07 | 2008-02-20 | オリンパス株式会社 | Image synthesizer |
JP3632677B2 (en) | 2002-05-13 | 2005-03-23 | ミノルタ株式会社 | Electronic camera |
US7014604B2 (en) * | 2002-07-19 | 2006-03-21 | Voith Paper Patent Gmbh | Paper machine roll cover |
US7295232B2 (en) * | 2003-01-15 | 2007-11-13 | Canon Kabushiki Kaisha | Camera and program |
JP2004221992A (en) * | 2003-01-15 | 2004-08-05 | Canon Inc | Imaging device and program |
JP2005020607A (en) * | 2003-06-27 | 2005-01-20 | Casio Comput Co Ltd | Composite image output device and composite image output processing program |
DE10337767A1 (en) * | 2003-08-14 | 2005-03-31 | Leica Microsystems Semiconductor Gmbh | Method for measuring the overlay shift |
JP4461937B2 (en) * | 2003-09-30 | 2010-05-12 | セイコーエプソン株式会社 | Generation of high-resolution images based on multiple low-resolution images |
US8654201B2 (en) * | 2005-02-23 | 2014-02-18 | Hewlett-Packard Development Company, L.P. | Method for deblurring an image |
JP4562182B2 (en) * | 2005-03-07 | 2010-10-13 | キヤノン株式会社 | Image processing apparatus, image processing method, and program |
DE102005018939B4 (en) * | 2005-04-22 | 2007-09-20 | Siemens Ag | Improved MRI imaging based on conventional PPA reconstruction techniques |
WO2007032082A1 (en) * | 2005-09-16 | 2007-03-22 | Fujitsu Limited | Image processing method, and image processing device |
CA2654960A1 (en) * | 2006-04-10 | 2008-12-24 | Avaworks Incorporated | Do-it-yourself photo realistic talking head creation system and method |
US20080143969A1 (en) * | 2006-12-15 | 2008-06-19 | Richard Aufranc | Dynamic superposition system and method for multi-projection display |
DE102007009185A1 (en) * | 2007-02-26 | 2008-08-28 | Siemens Ag | Method for planning an angiographic measurement |
- 2006-09-27 JP JP2006261788A patent/JP4861109B2/en not_active Expired - Fee Related
- 2007-09-21 KR KR1020070096888A patent/KR100840856B1/en active IP Right Grant
- 2007-09-25 US US11/860,865 patent/US8026932B2/en not_active Expired - Fee Related
- 2007-09-27 CN CNB200710161610XA patent/CN100559844C/en not_active Expired - Fee Related
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101753778B (en) * | 2008-12-17 | 2012-06-27 | 佳能株式会社 | Image processing apparatus and image processing method |
CN102547332A (en) * | 2010-12-22 | 2012-07-04 | 富士通株式会社 | Image capturing device and image capturing control method |
CN102547332B (en) * | 2010-12-22 | 2014-12-17 | 富士通株式会社 | Image capturing device and image capturing control method |
CN104428815B (en) * | 2012-07-13 | 2017-05-31 | 富士胶片株式会社 | Anamorphose device and its method of controlling operation |
CN104428815A (en) * | 2012-07-13 | 2015-03-18 | 富士胶片株式会社 | Image deformation device and method for controlling actuation of same |
CN107483829A (en) * | 2013-01-30 | 2017-12-15 | 奥林巴斯株式会社 | Camera device, operation device, object confirmation method |
CN104125393A (en) * | 2013-04-25 | 2014-10-29 | 佳能株式会社 | Image capturing apparatus and method of controlling the same |
CN109564376A (en) * | 2016-03-10 | 2019-04-02 | 维斯比特股份有限公司 | Time-multiplexed programmable view field imaging |
CN109564376B (en) * | 2016-03-10 | 2021-10-22 | 维斯比特股份有限公司 | Time multiplexed programmable field of view imaging |
CN109963082A (en) * | 2019-03-26 | 2019-07-02 | Oppo广东移动通信有限公司 | Image capturing method, device, electronic equipment, computer readable storage medium |
CN109963082B (en) * | 2019-03-26 | 2021-01-08 | Oppo广东移动通信有限公司 | Image shooting method and device, electronic equipment and computer readable storage medium |
CN110796664A (en) * | 2019-10-14 | 2020-02-14 | 北京字节跳动网络技术有限公司 | Image processing method, image processing device, electronic equipment and computer readable storage medium |
CN110796664B (en) * | 2019-10-14 | 2023-05-23 | 北京字节跳动网络技术有限公司 | Image processing method, device, electronic equipment and computer readable storage medium |
Also Published As
Publication number | Publication date |
---|---|
US8026932B2 (en) | 2011-09-27 |
JP4861109B2 (en) | 2012-01-25 |
KR20080028814A (en) | 2008-04-01 |
CN100559844C (en) | 2009-11-11 |
JP2008085531A (en) | 2008-04-10 |
US20080074441A1 (en) | 2008-03-27 |
KR100840856B1 (en) | 2008-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100559844C (en) | Image processing apparatus, method and image pick-up device | |
CN100493147C (en) | Image capturing device having a hand shake correction function and hand shake correction method | |
US8509482B2 (en) | Subject tracking apparatus, subject region extraction apparatus, and control methods therefor | |
US8786760B2 (en) | Digital photographing apparatus and method using face recognition function | |
CN101605208B (en) | Image processing apparatus, imaging apparatus, image processing method | |
EP2563006B1 (en) | Method for displaying character information, and image-taking device | |
JP4900014B2 (en) | Imaging apparatus and program thereof | |
US8350918B2 (en) | Image capturing apparatus and control method therefor | |
JP5293206B2 (en) | Image search apparatus, image search method and program | |
US7614559B2 (en) | Apparatus and method for deciding in-focus position of imaging lens | |
JP2005055744A (en) | Imaging apparatus and program | |
JP5395650B2 (en) | Subject area extraction device and control method thereof, subject tracking device, and program | |
JP2008005438A (en) | Imaging apparatus and imaging method | |
JP4807623B2 (en) | Imaging apparatus, imaging method, and imaging program | |
JP5278483B2 (en) | Imaging apparatus, imaging method, and imaging program | |
JP2009252069A (en) | Image processor, imaging device, image processing method and program | |
JP4885079B2 (en) | Digital camera, photographing method and photographing program | |
JP5375943B2 (en) | Imaging apparatus and program thereof | |
CN115037867B (en) | Shooting method, shooting device, computer readable storage medium and electronic equipment | |
JP2008187677A (en) | Digital camera | |
JP6508889B2 (en) | Imaging device, control method for imaging device and program | |
JP4819737B2 (en) | Imaging apparatus and control method | |
JP2014039098A (en) | Image processing apparatus, imaging device, and image processing program | |
JP2012039645A (en) | Image data processing apparatus, image data processing method, image data processing program, and imaging apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20091111 Termination date: 20190927 |