CN101151889A - Image processing device and program - Google Patents
- Publication number: CN101151889A
- Authority
- CN
- China
- Prior art keywords
- frame
- motion vector
- order
- macro block
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N19/00—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
- H04N19/44—Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder
- H04N19/50—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
- H04N19/503—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
- H04N19/51—Motion estimation or motion compensation
- H04N19/513—Processing of motion vectors
- H04N19/521—Processing of motion vectors for estimating the reliability of the determined motion vectors or motion vector field, e.g. for smoothing the motion vector field or for correcting motion vectors
- H04N19/527—Global motion vector estimation
- H04N19/85—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using pre-processing or post-processing specially adapted for video compression
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/681—Motion detection
- H04N23/6811—Motion detection based on the image signal
- H04N23/682—Vibration or motion blur correction
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Compression Or Coding Systems Of Tv Signals (AREA)
- Studio Devices (AREA)
- Image Analysis (AREA)
Abstract
Disclosed is an imaging device comprising: a moving image decoding section (4) that decodes moving image data encoded by an MPEG technique; a total motion vector estimating section (63) that obtains the motion vectors of the macroblocks selected by a macroblock selecting section (62) as suitable for estimating a total motion vector, and estimates the total motion vector of a frame according to those motion vectors; and a total motion vector interpolating section (64) used when performing blur correction at moving image playback according to the estimated total motion vectors.
Description
Technical field
The present invention relates to an image processing device and a program for compensating for blur in a moving image.
Background art
Imaging devices such as digital cameras that compensate, by image processing (so-called blur correction), for blur produced by camera shake during moving image capture and then record the result as a moving image file are well known.
In addition, for moving image files recorded without blur correction at capture time, a device has been developed that performs blur correction by detecting motion at playback time (see, for example, Japanese Patent Laid-Open Publication No. H10-136304, hereinafter referred to as Patent Document 1).
Summary of the invention
In Patent Document 1 and the like, a feature of a frame is searched for at playback time and compared with the frames before and after that frame to detect the motion of each part, and the motion of the whole frame is then detected from the motion of those parts.
However, searching for a feature in a frame and comparing it with the frames before and after requires a large amount of computation, so playback places a very heavy load on the device.
Accordingly, an object of the present invention is to provide an image processing device and a program that reduce the load caused by blur correction processing when replaying a moving image that was recorded without blur correction.
According to a first aspect of the present invention, an image processing device comprises:
a moving image decoding section for decoding encoded moving image information into moving image information, the encoded moving image information having been encoded by a compression technique that performs motion compensation;
a motion compensation information obtaining section for obtaining, according to the decoding of the moving image information by the moving image decoding section, motion compensation information relating to at least one of a plurality of frames constituting the moving image;
a total motion vector estimating section for estimating a total motion vector of the frame according to the motion compensation information obtained by the motion compensation information obtaining section; and
a blur correcting section for performing blur correction, when the moving image is replayed, according to the total motion vector estimated by the total motion vector estimating section.
Preferably, the image processing device further comprises:
a frame identifying section for identifying, among the plurality of frames, a frame having no motion information, from which the motion compensation information obtaining section cannot obtain motion compensation information; and
a total motion vector interpolating section for calculating, by interpolation based on the frames before and after the frame identified by the frame identifying section as having no motion information, the total motion vector relating to the frame having no motion information.
Preferably, the image processing device further comprises:
a frame identifying section for identifying, among the plurality of frames, a frame having no motion information, from which the motion compensation information obtaining section cannot obtain motion compensation information;
a feature point extracting section for extracting a feature point relating to the frame identified by the frame identifying section as having no motion information;
a feature point tracking section for tracking, in one of the two frames before and after the frame having no motion information, the feature point extracted by the feature point extracting section; and
a total motion vector determining section for determining the total motion vector relating to the frame having no motion information according to the result of tracking the feature point by the feature point tracking section.
Preferably, the image processing device further comprises:
a reliability judging section for judging the reliability of the motion vectors of a plurality of macroblocks constituting the frame; and
a macroblock selecting section for selecting, among the plurality of macroblocks, a macroblock whose motion vector is judged to have high reliability, as a macroblock used by the total motion vector estimating section for estimating the total motion vector.
Preferably, the reliability judging section judges the reliability of a motion vector according to the flatness of the image portion of the frame that corresponds to the macroblock.
Preferably, the flatness is determined according to at least one of: the Q factor relating to the quantization of the image information of the macroblock, and the amount of encoded data of the macroblock.
Preferably, the reliability judging section determines the degree to which the motion compensation of the macroblock has been achieved, and judges the reliability of the motion vector according to the determined degree.
According to a second aspect of the present invention, an image processing method comprises:
a decoding step of decoding encoded moving image information into moving image information, the encoded moving image information having been encoded by a compression technique that performs motion compensation;
a motion compensation information obtaining step of obtaining, according to the decoding of the moving image information, motion compensation information relating to at least one of a plurality of frames constituting the moving image;
an estimating step of estimating a total motion vector of the frame according to the obtained motion compensation information; and
a blur correcting step of performing blur correction, when the moving image is replayed, according to the estimated total motion vector.
According to a third aspect of the present invention, a software program product embodied in a computer-readable medium causes a computer to execute a method comprising:
a decoding step of decoding encoded moving image information into moving image information, the encoded moving image information having been encoded by a compression technique that performs motion compensation;
a motion compensation information obtaining step of obtaining, according to the decoding of the moving image information, motion compensation information relating to at least one of a plurality of frames constituting the moving image;
an estimating step of estimating a total motion vector of the frame according to the obtained motion compensation information; and
a blur correcting step of performing blur correction, when the moving image is replayed, according to the estimated total motion vector.
Description of drawings
Fig. 1 is a block diagram showing the main structure of an imaging device provided as an example of an embodiment of the image processing device of the present invention;
Fig. 2 is a schematic diagram of a frame constituting a moving image displayed on the display of the imaging device shown in Fig. 1;
Figs. 3A to 3C are schematic diagrams of the frames and display areas relating to the blur correction processing performed at playback time by the imaging device shown in Fig. 1;
Fig. 4 is a flowchart of an operational example of the blur correction processing at playback time according to Figs. 3A to 3C;
Fig. 5 is a flowchart of an operational example of the macroblock selection processing within the blur correction processing at playback time according to Figs. 3A to 3C; and
Fig. 6 is a block diagram showing the main structure of an imaging device according to Modification Example 1.
Embodiment
Hereinafter, specific embodiments of the present invention are described in detail with reference to the accompanying drawings. The scope of the present invention, however, is not limited to the illustrated examples.
Fig. 1 is a block diagram showing the main structure of an imaging device 100, provided as an example of an embodiment of the image processing device of the present invention.
The imaging device 100 is applied, for example, to a digital camera that encodes and records captured moving images. Encoding uses a compression technique such as MPEG4, a standard coding technique for recording captured moving images.
Specifically, as shown in Fig. 1, the imaging device 100 comprises an image acquisition device 1, an encoder 2, a recorder 3, a decoder 4, a display 51, a blur correction processor 6, and so on. The image acquisition device 1 captures still images or moving images of a subject; the encoder 2 encodes the images captured by the image acquisition device 1 using a predetermined compression technique; the recorder 3 records the image data encoded by the encoder 2; the decoder 4 decodes the encoded image data recorded by the recorder 3; the display 51 displays the images captured by the image acquisition device 1; and the blur correction processor 6 compensates for blur in the moving image displayed on the display 51.
Under the control of the imaging controller, image-processed image data is output from the signal processor to the encoder 2.
The encoder 2 also performs, for example, quantization processing, in which the DCT coefficients calculated by DCT are divided by a predetermined Q factor chosen in view of visual characteristics.
Motion compensation is a form of inter-frame prediction: the input image data is divided into macroblock units, square areas of 16 × 16 pixels; a predetermined search area in a reference image is searched to detect the block with the smallest difference from the macroblock; and the amount of motion is then compensated. The vector expressing the horizontal and vertical displacement to the detected minimum-difference block in the reference image is called a motion vector.
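The search described above can be sketched as follows. This is an illustrative example, not text from the patent: an exhaustive block-matching search for one 16 × 16 macroblock that minimizes the sum of absolute differences (SAD) over a small search window in the reference frame. All function and parameter names are hypothetical, and real MPEG encoders refine this basic principle in many ways (half-pixel accuracy, fast search patterns, and so on).

```python
# Sketch of block matching for one macroblock: find the displacement
# (dy, dx) into the reference frame whose 16x16 block differs least
# (in SAD) from the current macroblock at (top, left).

def find_motion_vector(ref, cur, top, left, search=4, block=16):
    """Return (dy, dx) displacing the best-matching block in `ref`."""
    h, w = len(ref), len(ref[0])

    def sad(dy, dx):
        total = 0
        for y in range(block):
            for x in range(block):
                total += abs(cur[top + y][left + x]
                             - ref[top + dy + y][left + dx + x])
        return total

    best_cost, best_mv = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # only consider candidate blocks fully inside the reference frame
            if (0 <= top + dy and top + dy + block <= h
                    and 0 <= left + dx and left + dx + block <= w):
                cost = sad(dy, dx)
                if best_cost is None or cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
    return best_mv
```

For example, if the current frame is simply the reference frame shifted down one pixel and right two pixels, the search recovers the displacement (-1, -2) back into the reference frame.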
By motion compensation, P frames, which are forward predictive-coded pictures, and B frames, which are bidirectionally (forward and backward) predictive-coded pictures, are produced in addition to intra-coded I frames.
DCT separates an image, by a two-dimensional frequency transform, into low-frequency components, which the human eye notices easily, and inconspicuous high-frequency components. Specifically, DCT is calculated in block units of 8 × 8 pixels on the difference between the motion-compensated image and the input image.
Because an I frame is an intra-coded picture, DCT is calculated directly on the input image without taking a difference.
When the moving image is replayed, the decoder (moving image decoding section) 4 produces a bit stream (moving image data) by decoding the encoded image data recorded in the recorder 3. Specifically, the decoder 4 calculates quantization coefficients and motion vectors by decoding the encoded moving image data, and then performs inverse quantization to convert the quantization coefficients into DCT coefficients. Inverse DCT is then applied in block units of 8 × 8 pixels to calculate pixel values (pixel-value differences) from the DCT coefficients. P frames and B frames are decoded by adding the blocks that have been compensated using the pixel-value differences and the motion vectors.
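The reconstruction of one predicted block can be sketched as below, under stated simplifications: inverse quantization recovers approximate residual values, and the block the motion vector points to in the reference frame is added back. The 8 × 8 inverse DCT that a real decoder applies between these two steps is omitted here, with the residual treated as if it were already in the pixel domain; all names are illustrative.

```python
# Hypothetical sketch of decoding one predicted block:
# decoded block = motion-compensated prediction + dequantized residual.

def reconstruct_block(ref, mv, top, left, qcoefs, q_factor, block=8):
    """Return the decoded block for the macroblock at (top, left)."""
    dy, dx = mv
    out = []
    for y in range(block):
        row = []
        for x in range(block):
            residual = qcoefs[y][x] * q_factor             # inverse quantization
            prediction = ref[top + dy + y][left + dx + x]  # block the MV points to
            row.append(prediction + residual)
        out.append(row)
    return out
```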
An external output terminal 52 can be used to connect a display device (for example, a television, not shown) and display images. The display processor 53 cuts out the required portion of the image data decoded by the decoder 4 according to a display area 5R, which is set during playback blur correction processing by a display area setter 65 (described later) of the blur correction processor 6. The cut-out portion is then processed into a form that can be displayed on the display 51 or on a display device connected to the external output terminal 52.
The display processor 53 includes a buffer (not shown) for synchronization with the delay caused by the total motion vector interpolation calculation of the interpolation processor 64.
When a moving image recorded without blur correction processing is replayed (displayed) on the display 51, the blur correction processor (blur correcting section) 6 performs blur correction by detecting the amount of blur, that is, the amount by which a frame has moved. Specifically, the blur correction processor 6 comprises a frame processor 61, a macroblock selector 62, a blur estimator 63, an interpolation processor 64, a display area setter 65, and so on.
The frame processor 61 judges, for each of the plurality of frames, whether it is an I frame (a frame having no motion information), from which no motion vector (motion compensation information) can be obtained, and outputs the judgment result to the macroblock selector 62 and the interpolation processor 64. That is, the frame processor 61 constitutes a frame identifying section that identifies the I frames occurring periodically among the plurality of frames.
For frames judged by the frame processor 61 not to be I frames (for example, P frames and B frames), the macroblock selector 62 selects the macroblocks suitable for calculating the total motion vector of the frame.
For example, in the case of a VGA-size MPEG4 stream, each frame has 1200 macroblocks (see Fig. 2), comprising inter macroblocks, which use inter-frame compensation, and intra macroblocks, which do not. Among these macroblocks, the total motion vector (blur) of a whole frame is assumed to be calculated by averaging the motion vectors of the inter macroblocks.
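The averaging just described can be written as a minimal sketch: the frame's total motion vector is taken as the component-wise mean of the motion vectors of the usable inter macroblocks. The function name and the fallback for a frame with no usable vectors are illustrative assumptions.

```python
# Sketch: average the selected macroblock motion vectors into one
# total motion vector for the frame.

def total_motion_vector(vectors):
    """Average a list of (dx, dy) macroblock motion vectors."""
    if not vectors:
        return (0.0, 0.0)  # assumed fallback: no usable inter macroblocks
    n = len(vectors)
    return (sum(v[0] for v in vectors) / n,
            sum(v[1] for v in vectors) / n)
```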
However, erroneous motion vectors may also be produced when a frame is encoded, so not all motion vectors are suitable for calculating the total motion vector. Since running motion detection again to judge whether a motion vector was erroneously detected is impractical, whether the possibility of erroneous detection is high is instead judged from various information relating to the macroblock, and macroblocks with high reliability are selected.
The macroblock selector 62 thus constitutes a reliability judging section that judges the reliability of the motion vectors of the plurality of macroblocks constituting a frame. In addition, the macroblock selector 62 selects, among the plurality of macroblocks, macroblocks whose motion vectors are judged to have high reliability, as the macroblocks used by the blur estimator 63 for estimating the total motion vector of the frame.
Because the computational load of judging the motion vector reliability of all macroblocks in a frame would be very heavy, the frame is divided into macroblock groups (groups of 4 × 4 macroblocks) and reliability is judged for one macroblock selected from each group. When the reliability is high, that macroblock is selected as a macroblock for estimating the total motion vector; when the reliability is low, another macroblock in the group (for example, the next macroblock) is processed in the same way.
In addition, the macroblock selector 62 (a section that judges reliability according to flatness) judges the reliability of the motion vector of a macroblock of the frame according to the flatness of the image portion corresponding to the macroblock.
That is, in a frame, erroneous detection of motion vectors occurs easily in portions that are flat and almost featureless. The motion vectors of macroblocks corresponding to such flat, almost featureless image portions therefore have low reliability and are not used for estimating the total motion vector.
Specifically, the Q factor (quantization parameter) used by the encoder 2 for encoding the image data of a macroblock becomes small in flat, almost featureless portions and becomes large in portions with large frequency content. The macroblock selector 62 therefore judges the flatness of the image from the Q factor.
In addition, the macroblock selector 62 determines, from the DC coefficient obtained after the DCT of the encoder, the degree to which the motion compensation of a macroblock of the frame has been achieved, and judges the reliability of the motion vector of the macroblock according to the determined degree.
That is, the DC coefficient is the difference between the DC component after DCT and that of the reference block. When the DC coefficient is large, there is a possibility that motion compensation was not performed at the correct position, which makes the difference from the reference block large. Accordingly, when the DC coefficient is greater than or equal to a predetermined threshold, the reliability of the motion vector of the macroblock is low and it is not used for estimating the total motion vector.
The macroblock selector 62 thus constitutes a motion compensation degree determining section that determines the degree to which the motion compensation of a macroblock has been achieved, and a section that judges the reliability of the motion vector according to the determined degree.
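The two criteria above combine into a simple reliability test, sketched below. An inter macroblock is trusted only if its Q factor indicates enough frequency content (not a flat portion) and its DC coefficient indicates that motion compensation landed at roughly the right position. The threshold values are illustrative assumptions; the description only says "predetermined" values.

```python
# Hedged sketch of the motion vector reliability judgment:
# inter macroblock + large-enough Q factor + small-enough DC coefficient.

def motion_vector_is_reliable(is_inter, q_factor, dc_coef,
                              q_min=8, dc_max=64):
    if not is_inter:            # intra macroblock: no motion vector at all
        return False
    if q_factor < q_min:        # flat, nearly featureless image portion
        return False
    if abs(dc_coef) >= dc_max:  # motion compensation likely at wrong position
        return False
    return True
```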
The blur estimator 63 constitutes a motion compensation information obtaining section that obtains, according to the decoding of the moving image data by the decoder 4, the motion vectors relating to at least one of the plurality of frames constituting the moving image. In addition, the blur estimator 63 constitutes a total motion vector estimating section that estimates the total motion vector of the frame according to the obtained motion vectors.
The interpolation processor 64 constitutes a total motion vector interpolating section that calculates, by interpolation, the total motion vector relating to an I frame (a frame having no motion information).
The display area setter 65 sets the display area 5R on a frame F of the moving image displayed on the display 51 (Fig. 3A). Specifically, when the moving image constituted by the frames F is replayed, the display area setter 65 performs blur correction by moving the display area 5R2 of a frame F2 (Fig. 3B) from the display area 5R1 of the preceding frame F1 (Fig. 3A) by the amount of the total motion vector, using the total motion vectors of the P and B frames calculated (estimated) by the blur estimator 63 and the total motion vectors of the I frames calculated by the interpolation processor 64. The blur-compensated moving image is thus displayed on the display 51 (see Fig. 3C).
Here, Fig. 3A schematically shows the first frame F1 and its display area 5R1, Fig. 3B schematically shows the frame F2 displayed after the initial frame F1 and its display area 5R2, and Fig. 3C schematically shows the frame F2 and display area 5R2 after blur correction processing.
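The display-area correction of Figs. 3A to 3C can be sketched as follows: the crop window of the current frame is shifted from the previous frame's window by the frame's total motion vector. The clamping that keeps the window inside the frame, and all names, are assumptions added for illustration; the patent text does not specify what happens at the frame edge.

```python
# Illustrative sketch of shifting the display area 5R by the total
# motion vector, clamped so the window stays inside the frame.

def shift_display_area(prev_area, total_mv, frame_w, frame_h):
    """prev_area = (left, top, width, height); total_mv = (dx, dy)."""
    left, top, w, h = prev_area
    dx, dy = total_mv
    new_left = min(max(left + dx, 0), frame_w - w)  # clamp horizontally
    new_top = min(max(top + dy, 0), frame_h - h)    # clamp vertically
    return (new_left, new_top, w, h)
```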
Next, the blur correction processing at playback time is described with reference to Figs. 4 and 5.
Fig. 4 is a flowchart of an operational example of the blur correction processing at playback time, and Fig. 5 is a flowchart of an operational example of the macroblock selection processing within that blur correction processing.
As shown in Fig. 4, when the user performs a predetermined operation on the imaging device 100 to instruct playback of a moving image recorded in the recorder 3, the decoder 4 first obtains the MPEG moving image file from the recorder 3 and decodes it to produce a bit stream (step S1).
Then, the frame processor 61 determines the order of the plurality of frames contained in the decoded bit stream and judges, for each frame, whether it is an I frame (step S2). The I frames are marked, and the judgment result of the frame processor 61 is transmitted to the macroblock selector 62 and the interpolation processor 64.
Then, for each frame judged in step S2 not to be an I frame (step S2: NO), the macroblock selector 62 performs the macroblock selection processing (step S3).
The macroblock selection processing is described below with reference to Fig. 5.
As shown in Fig. 5, the macroblock selector 62 divides the macroblocks of the input frame into macroblock groups (groups of 4 × 4 macroblocks) (see Fig. 2; step S301).
Then, the macroblock selector 62 selects one of the divided macroblock groups (step S302) and selects one macroblock in that group (for example, the upper-left macroblock filled black in Fig. 2) (step S303).
Then, the macroblock selector 62 extracts (obtains) from the decoded bit stream the Q factor relating to the macroblock, together with various other parameters relating to the macroblock, such as the DC coefficient (step S304).
Then, the macroblock selector 62 judges, according to the intra/inter flag, whether the selected macroblock is an inter macroblock (step S305).
If the macroblock is judged to be an inter macroblock (step S305: YES), the macroblock selector 62 judges whether the Q factor of the macroblock is greater than or equal to a predetermined value (step S306).
If the Q factor is judged to be at or above the predetermined value, that is, if the macroblock is judged to contain a large amount of frequency content (step S306: YES), the macroblock selector 62 judges whether the DC coefficient of the macroblock after DCT is less than a predetermined value (step S307).
If the DC coefficient is judged to be less than the predetermined value (step S307: YES), the macroblock selector 62 adopts the macroblock as one used for estimating the total motion vector (step S308).
Meanwhile, if the macroblock is judged not to be an inter macroblock, that is, if it is judged in step S305 to be an independent intra macroblock having no motion vector (step S305: NO); if the Q factor is judged in step S306 to be less than the predetermined value (step S306: NO); or if the DC coefficient is at or above the predetermined value, that is, if it is determined in step S307 that the possibility that motion compensation was not performed at the correct position is high (step S307: NO), then the macroblock selector 62 judges whether the judgment of the various parameters has been performed for all macroblocks in the group containing that macroblock (step S309).
If not all macroblocks have been judged yet (step S309: NO), the macroblock selector 62 selects another macroblock in the group (step S310), moves to step S304, and performs the subsequent processing.
If it is judged in step S309 that all macroblocks have been judged, or if the macroblock was adopted in step S308 as one used for estimation, the macroblock selector 62 judges whether the macroblock evaluation relating to the estimation of the total motion vector has been performed for all macroblock groups (step S311).
If the evaluation has not yet been performed for all macroblock groups (step S311: NO), the macroblock selector 62 moves to step S302 and performs the subsequent processing.
If it is judged in step S311 that the evaluation has been performed for all macroblock groups (step S311: YES), the macroblock selection processing ends.
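The whole loop of Fig. 5 (steps S301 to S311) can be compacted into one sketch: the frame's macroblocks are scanned group by group, and in each group the first macroblock that passes the inter/Q-factor/DC-coefficient tests is adopted. The macroblock record layout and the threshold values are assumptions made for illustration only.

```python
# Compact sketch of the Fig. 5 selection loop. Each macroblock is a dict
# with keys 'inter', 'q', 'dc', 'mv'; `groups` is the list of 4x4 groups.

def select_macroblocks(groups, q_min=8, dc_max=64):
    """Return the motion vectors of the adopted macroblocks."""
    adopted = []
    for group in groups:                  # S302 / S311: next group
        for mb in group:                  # S303 / S310: next macroblock
            if not mb['inter']:           # S305: intra, no motion vector
                continue
            if mb['q'] < q_min:           # S306: flat image portion
                continue
            if abs(mb['dc']) >= dc_max:   # S307: poor motion compensation
                continue
            adopted.append(mb['mv'])      # S308: adopt, move to next group
            break
    return adopted
```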
When the macroblock selection processing is completed, as shown in Fig. 4, the blur estimator 63 obtains the motion vectors of the plurality of macroblocks of the frame adopted by the macroblock selector 62, and calculates the total motion vector of the frame by averaging those motion vectors (step S4).
Next, for a frame judged in step S2 to be an I frame (step S2: YES), the interpolation processor 64 obtains from the blur estimator 63 the total motion vector of the frame following the I frame, and performs interpolation between that total motion vector and the total motion vector of the frame preceding the I frame, thereby obtaining the total motion vector of the I frame (step S5).
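Step S5 can be sketched minimally as below. Since an I frame carries no motion vectors, its total motion vector is interpolated from the total motion vectors of the frames before and after it; simple linear interpolation (the midpoint) is assumed here, as the text does not fix the exact interpolation formula.

```python
# Assumed interpolation for an I frame's total motion vector:
# the midpoint of the neighbouring frames' total motion vectors.

def interpolate_i_frame_vector(prev_mv, next_mv):
    """prev_mv, next_mv: (dx, dy) of the frames before and after."""
    return ((prev_mv[0] + next_mv[0]) / 2.0,
            (prev_mv[1] + next_mv[1]) / 2.0)
```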
Then, for the plurality of frames F whose total motion vectors have been obtained, the display area setter 65 performs blur correction of the moving image displayed on the display 51 by moving the display area 5R of each frame from the display area 5R of the preceding frame F by the amount of the total motion vector (step S6; see Figs. 3A to 3C).
As mentioned above, according to imaging device 100 of the present invention, owing to adopt the MPEG technology that moving image data is encoded, therefore can estimate total motion vector of this frame according to the motion vector of a plurality of macro blocks of the frame that constitutes live image.
Particularly, in a plurality of macro blocks that constitute this frame, can select the high macro block of motion vector reliability as the macro block that is used to estimate total motion vector.Therefore, eliminate the macro block with low reliability from the estimation of total motion vector, it uses wrong motion vector to produce in cataloged procedure.Therefore, can more suitably carry out the estimation of total motion vector.
Here, the reliability of the motion vector of a macroblock in a frame can be judged according to the flatness of the image portion corresponding to the macroblock and according to the determined degree of realization of motion compensation for the macroblock. The reliability of the motion vectors can therefore be judged more appropriately, and the macroblocks to be excluded from the estimation of the total motion vector can be determined appropriately.
In addition, since the flatness of the image portion corresponding to a macroblock in a frame can be judged from the Q factor used for quantizing the image data of the macroblock, the flatness can be judged more appropriately, and hence the reliability of the motion vector of the macroblock can be judged appropriately.
Therefore, even for a moving image recorded without blur correction, blur correction can be performed appropriately at playback time according to the estimated total motion vectors of the frames. Accordingly, the blur correction process does not require, as in conventional practice, searching for a feature portion within a frame and comparing it with the preceding and following frames. The load caused by blur correction during moving-image playback can thus be suitably suppressed.
In addition, for an I frame, which has no motion vectors, the total motion vector of the I frame can be suitably obtained by an interpolation calculation from the frames preceding and following the I frame. A total motion vector is therefore obtained even for an I frame, and blur correction can be performed appropriately when the moving image is played back.
Here, the present invention is not limited to the embodiment described above; various modifications and alternative designs may be applied without departing from the scope of the invention.
For example, the Q factor of a macroblock is used as the parameter with which the macroblock selector 62 judges the flatness of the image portion corresponding to the macroblock, but the invention is not limited to this. In a frame, an image portion that is flat and has almost no features has a very high compression ratio, so its amount of coded data is very small. Therefore, the macroblock selector 62, acting as a coded-data-amount judging section, may judge the flatness of the image portion using the amount of coded data of the macroblock as the flatness parameter.
Thus, according to the amount of coded data of a macroblock, the flatness of the image portion corresponding to the macroblock can be judged more appropriately, and the reliability of the motion vector of the macroblock can be judged appropriately.
In addition, in the embodiment described above, the macroblocks used for estimating the total motion vector are determined according to the Q factor of the macroblock and the DC coefficient after the DCT; however, the invention is not limited to this. For example, the judgment may use only at least one of the Q factor and the DC coefficient, or may additionally use the amount of coded data of the macroblock as a judgment parameter alongside either of them.
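One way to combine these judgment parameters is a simple threshold test per macroblock. The thresholds and the direction of each comparison below are illustrative assumptions only; the patent names the parameters but not concrete values.

```python
def motion_vector_is_reliable(q_factor, dc_coefficient, coded_bytes,
                              q_max=16, dc_min=8, bytes_min=64):
    """Judge a macroblock's motion vector as reliable only when none of
    the flatness indicators suggests a flat, featureless image portion.
    All threshold values here are hypothetical, for illustration only."""
    looks_flat = (q_factor > q_max
                  or abs(dc_coefficient) < dc_min
                  or coded_bytes < bytes_min)
    return not looks_flat
```

A macroblock failing this test would be excluded from the averaging that produces the total motion vector.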
In addition, in the embodiment described above, the total motion vector of an I frame, which has no motion vectors, is obtained by interpolation from the total motion vectors of the frames preceding and following the I frame; however, the method for obtaining the total motion vector of an I frame is not limited to this. For example, an imaging device 200 may be provided with an image-signal processor 164 (see Fig. 6). Here, the image-signal processor 164, acting as a feature point extracting section, extracts predetermined feature points from the I frame. Then, acting as a feature point tracking section, the image-signal processor 164 tracks, in either of the two frames preceding or following the I frame (for example, the preceding frame), the feature points corresponding to those extracted, using the KLT (Kanade-Lucas-Tomasi) feature tracking algorithm or the like. Then, acting as a total motion vector determining section, the image-signal processor 164 determines the total motion vector of the I frame from the tracking result (the amount of movement of the feature points).
Thus, even for an I frame with no motion vectors, its feature points can be tracked in either of the two frames preceding or following the I frame, and the total motion vector of the I frame can be determined from the tracking result. Blur correction during moving-image playback can therefore be performed appropriately.
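Assuming a KLT tracker (for example, OpenCV's pyramidal Lucas-Kanade implementation) has already supplied corresponding point positions, the final determining step can be sketched as below. The patent speaks only of "the amount of movement of the feature points"; aggregating the per-point displacements with a median, for robustness to mistracked points, is an assumption made here.

```python
from statistics import median

def total_vector_from_tracking(points_before, points_after):
    """Determine an I frame's total motion vector from KLT tracking
    results: the median per-point displacement between corresponding
    feature points. Median aggregation is an illustrative choice."""
    dxs = [a[0] - b[0] for b, a in zip(points_before, points_after)]
    dys = [a[1] - b[1] for b, a in zip(points_before, points_after)]
    return (median(dxs), median(dys))

print(total_vector_from_tracking([(0, 0), (10, 10), (20, 5)],
                                 [(2, 1), (12, 11), (22, 6)]))  # → (2, 1)
```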
In addition, in the embodiment described above, the imaging devices 100 and 200, which capture images and display (play back) the captured still images and moving images, are given as examples of the image processing equipment related to the present invention. However, the invention is not limited to this; any equipment capable of at least playing back moving images may be used. For example, a moving image file may be loaded onto a personal computer (PC, not shown) or the like serving as the image processing equipment, and the PC may, according to a program installed in advance, perform the decoding of the moving image file, the acquisition of motion vectors, the computation of total motion vectors, and the blur correction when the moving image is played back.
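The playback pipeline described above, stripped of actual decoding, can be sketched end to end. Real bitstream decoding is replaced here by in-memory stubs; all names are illustrative assumptions, not the patent's implementation.

```python
def playback_with_blur_correction(frames, initial_area):
    """Hypothetical pipeline sketch: for each decoded frame, average its
    macroblock motion vectors into a total motion vector, then shift the
    display area by that vector. `frames` is a list of per-frame lists of
    (dx, dy) macroblock motion vectors; frames are assumed non-empty."""
    area = initial_area
    corrected_areas = []
    for vectors in frames:
        n = len(vectors)
        tmv = (sum(v[0] for v in vectors) / n,
               sum(v[1] for v in vectors) / n)
        x, y, w, h = area
        area = (x + tmv[0], y + tmv[1], w, h)  # cumulative shift per frame
        corrected_areas.append(area)
    return corrected_areas
```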
This application is based on Japanese Patent Application No. 2005-311041 filed on October 26, 2005, claims priority therefrom, and incorporates it herein in its entirety by reference.
Industrial applicability
As described above, the image processing device and program according to the present invention can be applied to imaging equipment, and can desirably reduce the load caused by blur correction processing when playing back a moving image that was recorded without blur correction.
Claims (21)
1. An image processing device, comprising:
a moving-image decoding section (4) for decoding encoded moving image information into moving image information, the encoded moving image information having been encoded with a compression technique that performs motion compensation;
a motion compensation information acquiring section (61) for acquiring motion compensation information relating to at least one of a plurality of frames constituting the moving image, from the decoding of the moving image information by the moving-image decoding section;
a total motion vector estimating section (63) for estimating a total motion vector of the frame from the motion compensation information acquired by the motion compensation information acquiring section; and
a blur correcting section (65) for performing blur correction, when the moving image is played back, according to the total motion vector estimated by the total motion vector estimating section.
2. The image processing device according to claim 1, further comprising:
a frame identifying section (61) for identifying, among the plurality of frames, a frame without motion information, from which the motion compensation information acquiring section cannot acquire motion compensation information; and
a total motion vector interpolating section (64) for performing an interpolation calculation of the total motion vector of the frame without motion information, from the frames preceding and following the frame without motion information identified by the frame identifying section.
3. The image processing device according to claim 1, further comprising:
a frame identifying section for identifying, among the plurality of frames, a frame without motion information, from which the motion compensation information acquiring section cannot acquire motion compensation information;
a feature point extracting section (164) for extracting feature points relating to the frame without motion information identified by the frame identifying section;
a feature point tracking section (164) for tracking, in either of the two frames preceding or following the frame without motion information, the feature points extracted by the feature point extracting section; and
a total motion vector determining section (63) for determining the total motion vector of the frame without motion information from the result of tracking the feature points by the feature point tracking section.
4. The image processing device according to claim 1, further comprising:
a reliability judging section (62) for judging the reliability of the motion vectors of the plurality of macroblocks constituting the frame; and
a macroblock selecting section (62) for selecting, among the plurality of macroblocks, macroblocks whose motion vectors are judged to be of high reliability, as the macroblocks used by the total motion vector estimating section to estimate the total motion vector.
5. The image processing device according to claim 4, wherein the reliability judging section judges the reliability of a motion vector according to the flatness of the image portion corresponding to the macroblock in the frame.
6. The image processing device according to claim 5, wherein the flatness is judged according to at least one of: the Q factor used for quantizing the image information of the macroblock, and the amount of coded data of the macroblock.
7. The image processing device according to claim 4, wherein the reliability judging section determines the degree of realization of motion compensation for a macroblock, and judges the reliability of the motion vector according to the determined degree of realization.
8. An image processing method, comprising:
a decoding step (S1) of decoding encoded moving image information into moving image information, the encoded moving image information having been encoded with a compression technique that performs motion compensation;
a motion compensation information acquiring step (S1) of acquiring motion compensation information relating to at least one of a plurality of frames constituting the moving image, from the decoding of the moving image information;
an estimating step (S4) of estimating a total motion vector of the frame from the acquired motion compensation information; and
a blur correcting step (S6) of performing blur correction, when the moving image is played back, according to the estimated total motion vector.
9. The image processing method according to claim 8, further comprising:
a frame identifying step (S2) of identifying, among the plurality of frames, a frame without motion information; and
a total motion vector interpolating step (S5) of performing an interpolation calculation of the total motion vector of the frame without motion information, from the frames preceding and following the identified frame without motion information.
10. The image processing method according to claim 8, further comprising:
a frame identifying step (S2) of identifying, among the plurality of frames, a frame without motion information;
a feature point extracting step of extracting feature points relating to the identified frame without motion information;
a feature point tracking step of tracking, in either of the two frames preceding or following the frame without motion information, the extracted feature points; and
a total motion vector determining step of determining the total motion vector of the frame without motion information from the result of tracking the feature points.
11. The image processing method according to claim 8, further comprising:
a reliability judging step (S3) of judging the reliability of the motion vectors of the plurality of macroblocks constituting the frame; and
a macroblock selecting step (S3) of selecting, among the plurality of macroblocks, macroblocks whose motion vectors are judged to be of high reliability, as the macroblocks used for estimating the total motion vector.
12. The image processing method according to claim 11, wherein, in the reliability judging step, the reliability of a motion vector is judged according to the flatness of the image portion corresponding to the macroblock in the frame.
13. The image processing method according to claim 12, wherein the flatness is judged according to at least one of: the Q factor used for quantizing the image information of the macroblock, and the amount of coded data of the macroblock.
14. The image processing method according to claim 11, wherein, in the reliability judging step, the reliability of a motion vector is judged according to the degree of realization of motion compensation for the macroblock.
15. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 8.
16. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 9.
17. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 10.
18. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 11.
19. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 12.
20. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 13.
21. A software program product embodied in a computer-readable medium, for carrying out the method according to claim 14.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP311041/2005 | 2005-10-26 | ||
JP2005311041A JP2007122232A (en) | 2005-10-26 | 2005-10-26 | Image processor and program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101151889A true CN101151889A (en) | 2008-03-26 |
CN100521744C CN100521744C (en) | 2009-07-29 |
Family
ID=37649276
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CNB2006800105562A Active CN100521744C (en) | 2005-10-26 | 2006-10-19 | Image processing device and program |
Country Status (7)
Country | Link |
---|---|
US (1) | US7742647B2 (en) |
EP (1) | EP1941720B1 (en) |
JP (1) | JP2007122232A (en) |
KR (1) | KR100941285B1 (en) |
CN (1) | CN100521744C (en) |
TW (1) | TWI327026B (en) |
WO (1) | WO2007049662A1 (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007122232A (en) * | 2005-10-26 | 2007-05-17 | Casio Comput Co Ltd | Image processor and program |
JP4686795B2 (en) * | 2006-12-27 | 2011-05-25 | 富士フイルム株式会社 | Image generating apparatus and image reproducing apparatus |
US8780990B2 (en) * | 2008-12-16 | 2014-07-15 | Panasonic Intellectual Property Corporation Of America | Imaging device for motion vector estimation using images captured at a high frame rate with blur detection and method and integrated circuit performing the same |
JP2010278931A (en) * | 2009-05-29 | 2010-12-09 | Toshiba Corp | Image processing apparatus |
JP5430234B2 (en) * | 2009-06-04 | 2014-02-26 | パナソニック株式会社 | Image processing apparatus, image processing method, program, recording medium, and integrated circuit |
KR101103350B1 (en) * | 2010-02-09 | 2012-01-05 | 원창주식회사 | Apparatus for diffusing light in light lamp |
JP2011254223A (en) * | 2010-06-01 | 2011-12-15 | Panasonic Corp | Image processing device and electronic device equipped with the same |
JP5317023B2 (en) * | 2010-09-16 | 2013-10-16 | カシオ計算機株式会社 | Camera shake correction apparatus, camera shake correction method, and program |
KR20120090101A (en) * | 2010-12-23 | 2012-08-17 | 한국전자통신연구원 | Digital video fast matching system using key-frame index method |
WO2015111840A1 (en) * | 2014-01-24 | 2015-07-30 | 에스케이플래닛 주식회사 | Device and method for inserting advertisement by using frame clustering |
Family Cites Families (32)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1993002529A1 (en) * | 1991-07-23 | 1993-02-04 | British Telecommunications Public Limited Company | Method and device for frame interpolation of a moving image |
JPH0575913A (en) * | 1991-09-13 | 1993-03-26 | Matsushita Electric Ind Co Ltd | Motion vector detecting circuit and jiggling correcting circuit |
JP3164121B2 (en) * | 1992-02-21 | 2001-05-08 | キヤノン株式会社 | Motion vector detection device |
JP3168492B2 (en) * | 1992-10-14 | 2001-05-21 | 三菱電機株式会社 | Imaging device |
EP0701760B1 (en) * | 1993-06-01 | 1997-09-17 | THOMSON multimedia | Method and apparatus for motion compensated interpolation |
JP2940762B2 (en) * | 1993-06-28 | 1999-08-25 | 三洋電機株式会社 | Video camera with image stabilization device |
KR100292475B1 (en) * | 1993-12-08 | 2001-06-01 | 구자홍 | Device for compensating digital image shake |
JPH08163565A (en) * | 1994-12-05 | 1996-06-21 | Canon Inc | Motion vector detection method and device therefor |
GB2297450B (en) * | 1995-01-18 | 1999-03-10 | Sony Uk Ltd | Video processing method and apparatus |
JPH10136304A (en) | 1996-10-30 | 1998-05-22 | Nagano Nippon Denki Software Kk | Camera-shake correction device for recorded dynamic image file |
JPH10210473A (en) * | 1997-01-16 | 1998-08-07 | Toshiba Corp | Motion vector detector |
US6809758B1 (en) * | 1999-12-29 | 2004-10-26 | Eastman Kodak Company | Automated stabilization method for digital image sequences |
JP3948596B2 (en) * | 2000-03-06 | 2007-07-25 | Kddi株式会社 | Moving object detection and tracking device in moving images |
KR100683849B1 (en) * | 2000-06-28 | 2007-02-15 | 삼성전자주식회사 | Decoder having digital image stabilization function and digital image stabilization method |
JP4596202B2 (en) * | 2001-02-05 | 2010-12-08 | ソニー株式会社 | Image processing apparatus and method, and recording medium |
JP4366023B2 (en) * | 2001-03-16 | 2009-11-18 | インターナショナル・ビジネス・マシーンズ・コーポレーション | Partial image region extraction method of video image, partial image region extraction system, program for extracting partial image region, distribution method of extracted video image, and content creation method |
JP3886769B2 (en) * | 2001-10-26 | 2007-02-28 | 富士通株式会社 | Correction image generation apparatus and correction image generation program |
EP1377040A1 (en) * | 2002-06-19 | 2004-01-02 | STMicroelectronics S.r.l. | Method of stabilizing an image sequence |
JP2005006275A (en) * | 2002-11-22 | 2005-01-06 | Matsushita Electric Ind Co Ltd | Device, method, and program for generating interpolation frame |
US7639741B1 (en) * | 2002-12-06 | 2009-12-29 | Altera Corporation | Temporal filtering using object motion estimation |
JP4003128B2 (en) * | 2002-12-24 | 2007-11-07 | ソニー株式会社 | Image data processing apparatus and method, recording medium, and program |
US8085850B2 (en) * | 2003-04-24 | 2011-12-27 | Zador Andrew M | Methods and apparatus for efficient encoding of image edges, motion, velocity, and detail |
US7440634B2 (en) * | 2003-06-17 | 2008-10-21 | The Trustees Of Columbia University In The City Of New York | Method for de-blurring images of moving objects |
JP2005027046A (en) * | 2003-07-02 | 2005-01-27 | Sony Corp | Image processor and image processing method |
JP4164424B2 (en) * | 2003-08-29 | 2008-10-15 | キヤノン株式会社 | Imaging apparatus and method |
GB2407226B (en) * | 2003-10-18 | 2008-01-23 | Hewlett Packard Development Co | Image processing scheme |
JP4258383B2 (en) * | 2004-01-14 | 2009-04-30 | パナソニック株式会社 | Imaging device |
JP2005244780A (en) * | 2004-02-27 | 2005-09-08 | Matsushita Electric Ind Co Ltd | Imaging apparatus |
JP2005260481A (en) * | 2004-03-10 | 2005-09-22 | Olympus Corp | Device and method for detecting motion vector and camera |
WO2005093654A2 (en) * | 2004-03-25 | 2005-10-06 | Fatih Ozluturk | Method and apparatus to correct digital image blur due to motion of subject or imaging device |
WO2006047727A2 (en) * | 2004-10-27 | 2006-05-04 | Eg Technology, Inc. | Optimal rate allocation for a group of channels |
JP2007122232A (en) * | 2005-10-26 | 2007-05-17 | Casio Comput Co Ltd | Image processor and program |
-
2005
- 2005-10-26 JP JP2005311041A patent/JP2007122232A/en active Pending
-
2006
- 2006-10-19 KR KR1020077022807A patent/KR100941285B1/en active IP Right Grant
- 2006-10-19 CN CNB2006800105562A patent/CN100521744C/en active Active
- 2006-10-19 EP EP06822283.5A patent/EP1941720B1/en active Active
- 2006-10-19 WO PCT/JP2006/321306 patent/WO2007049662A1/en active Application Filing
- 2006-10-25 TW TW95139313A patent/TWI327026B/en not_active IP Right Cessation
- 2006-10-25 US US11/586,047 patent/US7742647B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
KR20080000580A (en) | 2008-01-02 |
CN100521744C (en) | 2009-07-29 |
JP2007122232A (en) | 2007-05-17 |
EP1941720B1 (en) | 2017-05-03 |
EP1941720A1 (en) | 2008-07-09 |
TW200723861A (en) | 2007-06-16 |
TWI327026B (en) | 2010-07-01 |
WO2007049662A1 (en) | 2007-05-03 |
US20070092145A1 (en) | 2007-04-26 |
US7742647B2 (en) | 2010-06-22 |
KR100941285B1 (en) | 2010-02-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN100521744C (en) | Image processing device and program | |
JP5580453B2 (en) | Direct mode encoding and decoding apparatus | |
KR0181069B1 (en) | Motion estimation apparatus | |
US8559515B2 (en) | Apparatus and method for encoding and decoding multi-view video | |
US8000393B2 (en) | Video encoding apparatus and video encoding method | |
TWI621351B (en) | Image prediction decoding device, image prediction decoding method and image prediction decoding program | |
KR100843418B1 (en) | Apparatus and method for image coding | |
CN102202221A (en) | Video coding method and video coding apparatus | |
TWI511531B (en) | Video coding apparatus, video coding method, and video coding program | |
JPH07112284B2 (en) | Predictive encoding device and decoding device | |
JP2013532926A (en) | Method and system for encoding video frames using multiple processors | |
CN102362499A (en) | Image encoding apparatus and image encoding method | |
JP5748225B2 (en) | Moving picture coding method, moving picture coding apparatus, and moving picture coding program | |
CN101822058A (en) | Video encoding using pixel decimation | |
JP4898415B2 (en) | Moving picture coding apparatus and moving picture coding method | |
JP5180887B2 (en) | Encoding apparatus and method thereof | |
JP3812808B2 (en) | Skip region detection type moving image encoding apparatus and recording medium | |
JP2007174202A (en) | Motion vector detecting device and method therefor | |
JP3711022B2 (en) | Method and apparatus for recognizing specific object in moving image | |
JP2000261809A (en) | Image coder coping with feature of picture | |
CN103155566A (en) | Movie image encoding method and movie image encoding device | |
JP2003032691A (en) | Picture coding device corresponding to picture feature | |
JPH1042300A (en) | Motion vector detection device | |
JP2008072608A (en) | Apparatus and method for encoding image | |
CN109587496B (en) | Skip block distinguishing method, encoder, electronic device and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C41 | Transfer of patent application or patent right or utility model | ||
TR01 | Transfer of patent right |
Effective date of registration: 20160815 Address after: 100080, Haidian District, 68, Qinghe street, Huarun colorful city shopping center, two, 9, 01, room Patentee after: BEIJING XIAOMI MOBILE SOFTWARE Co.,Ltd. Address before: Tokyo, Japan Patentee before: CASIO Computer Co., Ltd. |