CN103139562A - Motion estimation method and motion estimation device - Google Patents

Motion estimation method and motion estimation device

Info

Publication number
CN103139562A
Authority
CN
China
Prior art keywords
pixel
macro block
frame
present frame
block
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201110400975XA
Other languages
Chinese (zh)
Other versions
CN103139562B (en)
Inventor
付轩
郑艳
朱建清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to CN201110400975.XA
Publication of CN103139562A
Application granted
Publication of CN103139562B
Expired - Fee Related
Anticipated expiration

Landscapes

  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention discloses a motion estimation method and a motion estimation device. The motion estimation method comprises the following steps: respectively conducting sampling on pixels of a current frame and a reference frame, and obtaining a current image corresponding to the current frame and a reference image corresponding to the reference frame; conducting full search on the reference image, finding out an optimal movement vector of each fundamental pixel unit of the current image, and using the optimal movement vector as the optimal movement vector of multi-pixel accuracy of each macro block of the current frame, wherein each fundamental pixel unit of the current image corresponds to each macro block of the current frame, and each fundamental pixel unit of the reference image corresponds to each macro block of the reference frame.

Description

Motion estimation method and device
Technical field
The present invention relates to the field of image processing, and more specifically to a motion estimation method and device.
Background art
Motion estimation is a technique widely used in video processing (for example, video coding and video de-interlacing). The basic idea of motion estimation is to divide each frame of an image sequence into many non-overlapping macro blocks and to assume that all pixels within a macro block share the same displacement; the block most similar to the current macro block (that is, the matching block of the current macro block) is then found within a certain search range of a reference frame according to a certain matching criterion, and the relative displacement between the matching block and the current macro block is the motion vector of the current macro block.
During video compression, only the motion vector and the residual data of the current macro block relative to its matching block need to be stored, and the current macro block can then be fully reconstructed during decompression. In video processing, a motion search usually has to be carried out over a very large search range of the reference frame to find the optimal motion vector of the current macro block (that is, to find the matching block most similar to the current macro block). Obtaining the optimal motion vector of the current macro block requires a full search over the search range; a full search examines every possible motion vector candidate and provides the most accurate result, but its processing time is very long.
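By way of illustration only, the following is a minimal sketch of such a full search using a sum-of-absolute-differences (SAD) matching criterion; the block size, search range, and function names are illustrative assumptions rather than anything specified by this patent.

```python
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two equally sized pixel blocks."""
    return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

def full_search(cur, ref, bx, by, block=16, search=32):
    """Exhaustively test every displacement within +/- `search` pixels of the
    block at (bx, by) in `cur` and return the best motion vector and its SAD."""
    h, w = ref.shape
    cur_blk = cur[by:by + block, bx:bx + block]
    best_mv, best_cost = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + block > w or y + block > h:
                continue  # candidate block would fall outside the reference frame
            cost = sad(cur_blk, ref[y:y + block, x:x + block])
            if cost < best_cost:
                best_mv, best_cost = (dx, dy), cost
    return best_mv, best_cost
```

The two nested loops over candidate displacements are what make the exhaustive search slow, which is the problem the sampling scheme described below addresses.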
Summary of the invention
In view of the above problems, the present invention provides a novel motion estimation method and device.
A motion estimation method according to an embodiment of the present invention comprises: sampling the pixels in a current frame and the pixels in a reference frame, respectively, to obtain a current picture corresponding to the current frame and a reference picture corresponding to the reference frame; and performing a full search in the reference picture to find the optimal motion vector of each basic pixel unit in the current picture, which serves as the multi-pixel-precision optimal motion vector of each macro block in the current frame, wherein each basic pixel unit in the current picture corresponds to a macro block in the current frame, and each basic pixel unit in the reference picture corresponds to a macro block in the reference frame.
A motion estimation device according to an embodiment of the present invention comprises: a pixel sampling unit configured to sample the pixels in a current frame and the pixels in a reference frame, respectively, to obtain a current picture corresponding to the current frame and a reference picture corresponding to the reference frame; and a first search unit configured to perform a full search in the reference picture to find the optimal motion vector of each basic pixel unit in the current picture, which serves as the multi-pixel-precision optimal motion vector of each macro block in the current frame, wherein each basic pixel unit in the current picture corresponds to a macro block in the current frame, and each basic pixel unit in the reference picture corresponds to a macro block in the reference frame.
With the present invention, the multi-pixel-precision optimal motion vector of each macro block in the current frame can be found within a very short time.
Brief description of the drawings
The present invention may be better understood from the following description of specific embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 shows an example of sampling a reference frame / current frame;
Fig. 2 shows, under the 16 × 16 inter coding mode, the reference region used to search for the full-pixel-precision optimal motion vector of any macro block in the current frame;
Fig. 3 shows a block diagram of a motion estimation device according to an embodiment of the present invention; and
Fig. 4 shows a flowchart of a motion estimation method according to an embodiment of the present invention.
Detailed description of the embodiments
The features and exemplary embodiments of various aspects of the present invention are described in detail below. The following description contains many specific details in order to provide a thorough understanding of the present invention. It will be apparent to those skilled in the art, however, that the present invention can be implemented without some of these details. The following description of the embodiments is only intended to provide a clearer understanding of the present invention by way of example. The present invention is in no way limited to any specific configuration or algorithm set forth below, but covers any modification, replacement, or improvement of the relevant elements, components, and algorithms without departing from the spirit of the present invention.
In order to find the optimal motion vector of the current macro block by performing a full search on the reference frame within a limited time, the present invention proposes a fast full-search method.
According to an embodiment of the present invention, the reference frame and the current frame first need to be sampled in order to carry out the full search quickly. Fig. 1 shows an example of sampling a reference frame / current frame; the pixel frame shown in Fig. 1 may be regarded either as the reference frame or as the current frame.
As shown in Fig. 1, the pixels of each 4 × 4 block in the reference frame / current frame are sampled to obtain the reference picture / current picture. Specifically, the pixel at a specific position within each 4 × 4 block of the reference frame / current frame is sampled, and the sampling position is the same for all 4 × 4 blocks in the reference frame / current frame. For example, Fig. 1 illustrates sampling the pixel in the first row and first column of each 4 × 4 block of the reference frame / current frame. Those skilled in the art will understand, however, that the sampling position within each 4 × 4 block is not limited to the position shown in Fig. 1 and can be any predefined position. In addition, the pixel sampling position in the reference frame is the same as the pixel sampling position in the current frame.
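As a rough illustration, the fixed-position sampling of Fig. 1 can be sketched as follows; the sampling offset, the toy frame sizes, and the function name are illustrative assumptions.

```python
import numpy as np

def sample_picture(frame, offset=(0, 0), step=4):
    """Keep one pixel per step x step block, at the same fixed offset in every
    block, giving a picture 1/4 the width and 1/4 the height of the input frame."""
    oy, ox = offset
    return frame[oy::step, ox::step].copy()

# The same sampling position must be used for the current frame and the reference frame.
current_frame = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
reference_frame = np.random.randint(0, 256, (128, 128), dtype=np.uint8)
cur_pic = sample_picture(current_frame, offset=(0, 0))
ref_pic = sample_picture(reference_frame, offset=(0, 0))
```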
The reference picture and current picture obtained by the sampling process shown in Fig. 1 are 1/16 the size (1/4 width × 1/4 height) of the reference frame and current frame, respectively. Therefore, performing a full search in the reference picture to obtain the optimal motion vector of each 4 × 4 block in the current picture (each 4 × 4 block in the current picture corresponds to a macro block in the current frame) requires only 1/16 of the processing time that would be needed to obtain the full-pixel-precision optimal motion vector of each macro block in the current frame by performing a full search on the reference frame.
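A minimal sketch of this coarse search, reusing the full_search helper sketched earlier, might look as follows; the search range in the sampled picture is an illustrative assumption, and positions and vectors are scaled by 4 so they refer to the corresponding macro blocks of the current frame.

```python
def coarse_motion_vectors(cur_pic, ref_pic, unit=4, search=8):
    """Full search for every 4x4 basic pixel unit of the sampled current picture;
    one unit of displacement here corresponds to 4 pixels in the full frame."""
    mvs = {}
    h, w = cur_pic.shape
    for by in range(0, h - unit + 1, unit):
        for bx in range(0, w - unit + 1, unit):
            (dx, dy), _ = full_search(cur_pic, ref_pic, bx, by,
                                      block=unit, search=search)
            # The unit at (bx, by) corresponds to the macro block at (4*bx, 4*by)
            # in the current frame; the vector is scaled to full-frame pixels.
            mvs[(bx * 4, by * 4)] = (dx * 4, dy * 4)
    return mvs
```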
Obviously, the optimal motion vector of each 4 × 4 block in the current picture obtained by the full search based on the reference picture and current picture derived from the sampling process shown in Fig. 1 is a motion vector of 4-pixel precision. In practical image processing, however, a motion search of full-pixel precision should be carried out in order to guarantee the quality of the motion estimation. Therefore, a full search needs to be performed in a larger reference region centered on the reference region located according to the aforementioned 4-pixel-precision motion vector.
For example, for the 16 × 16 inter coding mode, for any macro block B in the current frame, the reference region R corresponding to macro block B needs to be located in the reference frame according to the 4-pixel-precision motion vector of the 4 × 4 block b corresponding to macro block B in the current picture, and a full search is then carried out within a 24 × 24 region centered on the reference region R to obtain the full-pixel-precision optimal motion vector of macro block B. Fig. 2 shows, under the 16 × 16 inter coding mode, the reference region used to search for the full-pixel-precision optimal motion vector of any macro block B in the current frame. As shown in Fig. 2, after the 16 × 16 reference region R corresponding to macro block B is located in the reference frame according to the 4-pixel-precision motion vector of the 4 × 4 block b corresponding to macro block B in the current picture, the reference region R is expanded by 4 pixels in each of the upward, downward, leftward, and rightward directions, yielding the 24 × 24 region corresponding to macro block B that is used for the full search.
Similarly, for the 16 × 8 inter coding mode, for a 16 × 8 block B' partitioned from any macro block B in the current frame, the 16 × 8 reference region R corresponding to B' needs to be located in the reference frame according to the 4-pixel-precision motion vector of the 4 × 4 block b corresponding to macro block B in the current picture, and a full search is then carried out within a 24 × 16 region centered on the reference region R to obtain the full-pixel-precision optimal motion vector of the 16 × 8 block B'. Similar to the process shown in Fig. 2, after the 16 × 8 reference region R corresponding to the 16 × 8 block B' is located in the reference frame according to the 4-pixel-precision motion vector of the 4 × 4 block b corresponding to macro block B in the current picture, the reference region R is expanded by 4 pixels in each of the upward, downward, leftward, and rightward directions, yielding the 24 × 16 region corresponding to the 16 × 8 block B' that is used for the full search.
The same applies to the other inter coding modes. That is, for a given inter coding mode, after the reference region R of a sub-block B' of the corresponding size partitioned from macro block B according to that mode is located in the reference frame according to the 4-pixel-precision motion vector of the 4 × 4 block b corresponding to macro block B in the current picture, the reference region R is expanded by 4 pixels in each of the upward, downward, leftward, and rightward directions, yielding the region corresponding to sub-block B' that is used for the full search.
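A minimal sketch of this refinement, reusing the sad helper from the earlier sketch, is given below; expanding the reference region by 4 pixels on each side is treated as testing displacements within ±4 pixels of the coarse vector, and the names and the border-clipping policy are illustrative assumptions.

```python
def refine_motion_vector(cur_frame, ref_frame, sub_x, sub_y, sub_w, sub_h,
                         coarse_mv, margin=4):
    """Full-pixel full search for the sub-block at (sub_x, sub_y) of size
    sub_w x sub_h, inside the reference region located by `coarse_mv` and
    expanded by `margin` pixels in every direction."""
    h, w = ref_frame.shape
    cdx, cdy = coarse_mv
    cur_blk = cur_frame[sub_y:sub_y + sub_h, sub_x:sub_x + sub_w]
    best_mv, best_cost = coarse_mv, float("inf")
    for dy in range(cdy - margin, cdy + margin + 1):
        for dx in range(cdx - margin, cdx + margin + 1):
            x, y = sub_x + dx, sub_y + dy
            if x < 0 or y < 0 or x + sub_w > w or y + sub_h > h:
                continue  # skip candidates that leave the reference frame
            cost = sad(cur_blk, ref_frame[y:y + sub_h, x:x + sub_w])
            if cost < best_cost:
                best_mv, best_cost = (dx, dy), cost
    return best_mv, best_cost
```

For a 16 × 16 sub-block and margin = 4 the tested candidates cover exactly the 24 × 24 region of Fig. 2; for a 16 × 8 sub-block they cover a 24 × 16 region.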
After the full-pixel-precision optimal motion vector of each sub-block under each inter coding mode has been obtained, the coding cost of macro block B under each inter coding mode is computed, and the inter coding mode with the lowest coding cost is selected as the best inter coding mode of macro block B.
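The patent does not specify the cost model, so the sketch below assumes, purely for illustration, that the coding cost of a mode is the sum of the SAD costs of its sub-blocks plus a fixed per-sub-block penalty standing in for motion-vector overhead; the mode table and function names are likewise assumptions.

```python
def best_inter_mode(cur_frame, ref_frame, mb_x, mb_y, coarse_mv, mv_penalty=16):
    """Refine every sub-block of each candidate partition of the macro block at
    (mb_x, mb_y) and return the partition with the lowest assumed coding cost."""
    modes = {  # (x offset, y offset, width, height) of each sub-block
        "16x16": [(0, 0, 16, 16)],
        "16x8":  [(0, 0, 16, 8), (0, 8, 16, 8)],
        "8x16":  [(0, 0, 8, 16), (8, 0, 8, 16)],
    }
    best = (None, None, float("inf"))  # (mode name, motion vectors, cost)
    for name, parts in modes.items():
        cost, mvs = 0, []
        for ox, oy, w, h in parts:
            mv, c = refine_motion_vector(cur_frame, ref_frame,
                                         mb_x + ox, mb_y + oy, w, h, coarse_mv)
            cost += c + mv_penalty
            mvs.append(mv)
        if cost < best[2]:
            best = (name, mvs, cost)
    return best
```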
Fig. 3 shows a block diagram of a motion estimation device according to an embodiment of the present invention, and Fig. 4 shows a flowchart of a motion estimation method according to an embodiment of the present invention. The motion estimation device and method according to the embodiments of the present invention are described in detail below with reference to Fig. 3 and Fig. 4.
As shown in Fig. 3 and Fig. 4, the motion estimation device comprises a pixel sampling unit 302 and a first search unit 304. The pixel sampling unit 302 samples the pixels in the current frame and the pixels in the reference frame, respectively, to obtain the current picture corresponding to the current frame and the reference picture corresponding to the reference frame (S402); the first search unit 304 performs a full search in the reference picture to find the optimal motion vector of each basic pixel unit in the current picture, which serves as the multi-pixel-precision optimal motion vector of each macro block in the current frame (S404). Each basic pixel unit in the current picture corresponds to a macro block in the current frame, and each basic pixel unit in the reference picture corresponds to a macro block in the reference frame.
In one embodiment, the pixel sampling unit 302 can sample the pixels in the current frame and the reference frame using the sampling process described in conjunction with Fig. 1. That is, the pixel sampling unit 302 can sample the pixel at a specific position within each 4 × 4 block contained in each macro block of the reference frame and the current frame, to obtain the reference picture corresponding to the reference frame and the current picture corresponding to the current frame. The size of the reference picture is 1/16 of the reference frame, and the size of the current picture is 1/16 of the current frame; each 4 × 4 block (namely, basic pixel unit) in the reference picture corresponds to a macro block in the reference frame, and each 4 × 4 block (namely, basic pixel unit) in the current picture corresponds to a macro block in the current frame.
Then, the first search unit 304 can perform a full search in the reference picture to find the optimal motion vector of each 4 × 4 block in the current picture. Here, the first search unit 304 can regard the optimal motion vector of each 4 × 4 block in the current picture as the 4-pixel-precision optimal motion vector of the corresponding macro block in the current frame.
Further, in order to obtain, under a specific inter coding mode, the full-pixel-precision optimal motion vector of any sub-block partitioned according to that mode from any macro block in the current frame, the motion estimation device according to the embodiment of the present invention further comprises a region search unit 306 and a second search unit 308.
For any macro block B in the current frame, the region search unit 306 locates in the reference frame, according to the optimal motion vector of the basic pixel unit b corresponding to macro block B in the current picture, the reference region corresponding to any sub-block B' partitioned from macro block B according to the specific inter coding mode; the second search unit 308 expands the reference region corresponding to sub-block B' and performs a full search in the expanded reference region to find the full-pixel-precision optimal motion vector of sub-block B'. The region search unit 306 can use the process described in conjunction with Fig. 2.

In summary, the present invention finds the optimal motion vector of each basic pixel unit in the current picture based on the reference picture and current picture obtained by sampling, and then uses the optimal motion vector of each basic pixel unit in the current picture to locate, in the reference frame, the reference region in which a full search is performed to obtain the full-pixel-precision optimal motion vector of each macro block in the current frame. This accelerates processing, reduces the processing time, and thereby saves hardware resources.
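Putting the sketches above together, an end-to-end toy run under the same illustrative assumptions (reusing the toy frames and helper functions defined earlier) might look as follows.

```python
# Step S402: sample both frames into 1/4 width x 1/4 height pictures.
cur_pic = sample_picture(current_frame)
ref_pic = sample_picture(reference_frame)

# Step S404: coarse full search in the sampled pictures (4-pixel precision).
coarse = coarse_motion_vectors(cur_pic, ref_pic)

# Refinement and mode decision for every 16x16 macro block of the current frame.
for (mb_x, mb_y), mv4 in coarse.items():
    mode, mvs, cost = best_inter_mode(current_frame, reference_frame,
                                      mb_x, mb_y, mv4)
    print(mb_x, mb_y, mode, mvs, cost)
```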
The present invention has been described above with reference to specific embodiments. Those skilled in the art will understand, however, that various modifications, combinations, and changes may be made to these specific embodiments without departing from the spirit and scope of the present invention as defined by the appended claims or their equivalents.
The steps may be executed in hardware or software as required. Note that steps may be added to, removed from, or modified in the flowcharts given in this specification without departing from the scope of the present invention. In general, a flowchart is only used to indicate one possible sequence of basic operations for realizing a function.
Embodiments of the present invention may be realized using a programmed general-purpose digital computer, application-specific integrated circuits, programmable logic devices, field-programmable gate arrays, or optical, chemical, biological, quantum, or nano-engineered systems, components, and mechanisms. In general, the functions of the present invention can be realized by any means known in the art. Distributed or networked systems, components, and circuits may be used, and data may be communicated or transferred by wire, wirelessly, or by any other means.
It will also be appreciated that, according to the needs of a particular application, one or more of the elements shown in the accompanying drawings may be implemented in a more separated or more integrated manner, or may even be removed or disabled in certain cases. Implementations in which a program or code stored in a machine-readable medium allows a computer to perform any of the methods described above also fall within the spirit and scope of the present invention.
In addition, any signal arrows in the accompanying drawings should be considered merely exemplary rather than limiting, unless specifically indicated otherwise. Moreover, combinations of components or steps will also be considered as having been described where terminology would render the ability to separate or combine them unclear.

Claims (10)

1. A motion estimation method, comprising:
sampling the pixels in a current frame and the pixels in a reference frame, respectively, to obtain a current picture corresponding to the current frame and a reference picture corresponding to the reference frame; and
performing a full search in the reference picture to find the optimal motion vector of each basic pixel unit in the current picture, as the multi-pixel-precision optimal motion vector of each macro block in the current frame, wherein
each basic pixel unit in the current picture corresponds to a macro block in the current frame, and each basic pixel unit in the reference picture corresponds to a macro block in the reference frame.
2. The motion estimation method according to claim 1, wherein the pixel at a fixed position in each macro block in the current frame is sampled, and the pixel at a fixed position in each macro block in the reference frame is sampled.
3. The motion estimation method according to claim 2, wherein the pixel at a fixed position in each 4 × 4 block contained in each macro block in the current frame is sampled, and the pixel at a fixed position in each 4 × 4 block contained in each macro block in the reference frame is sampled.
4. The motion estimation method according to any one of claims 1 to 3, wherein the pixel sampling position of the current frame is the same as the pixel sampling position of the reference frame.
5. The motion estimation method according to claim 1, further comprising:
for any macro block in the current frame, locating in the reference frame, according to the optimal motion vector of the basic pixel unit corresponding to the macro block in the current picture, the reference region corresponding to each sub-block partitioned from the macro block according to a specific inter coding mode; and
for any sub-block partitioned from the macro block according to the specific coding mode, expanding the reference region corresponding to the sub-block, and performing a full search in the expanded reference region to find the full-pixel-precision optimal motion vector of the sub-block.
6. A motion estimation device, comprising:
a pixel sampling unit configured to sample the pixels in a current frame and the pixels in a reference frame, respectively, to obtain a current picture corresponding to the current frame and a reference picture corresponding to the reference frame; and
a first search unit configured to perform a full search in the reference picture to find the optimal motion vector of each basic pixel unit in the current picture, as the multi-pixel-precision optimal motion vector of each macro block in the current frame, wherein
each basic pixel unit in the current picture corresponds to a macro block in the current frame, and each basic pixel unit in the reference picture corresponds to a macro block in the reference frame.
7. The motion estimation device according to claim 6, wherein the pixel sampling unit samples the pixel at a fixed position in each macro block in the current frame, and samples the pixel at a fixed position in each macro block in the reference frame.
8. The motion estimation device according to claim 7, wherein the pixel sampling unit samples the pixel at a fixed position in each 4 × 4 block contained in each macro block in the current frame, and samples the pixel at a fixed position in each 4 × 4 block contained in each macro block in the reference frame.
9. The motion estimation device according to any one of claims 6 to 8, wherein the pixel sampling position of the current frame is the same as the pixel sampling position of the reference frame.
10. The motion estimation device according to claim 6, further comprising:
a region search unit configured to, for any macro block in the current frame, locate in the reference frame, according to the optimal motion vector of the basic pixel unit corresponding to the macro block in the current picture, the reference region corresponding to each sub-block partitioned from the macro block according to a specific inter coding mode; and
a second search unit configured to, for any sub-block partitioned from the macro block according to the specific coding mode, expand the reference region corresponding to the sub-block, and perform a full search in the expanded reference region to find the full-pixel-precision optimal motion vector of the sub-block.
CN201110400975.XA 2011-11-30 2011-11-30 Method for estimating and device Expired - Fee Related CN103139562B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201110400975.XA CN103139562B (en) 2011-11-30 2011-11-30 Method for estimating and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201110400975.XA CN103139562B (en) 2011-11-30 2011-11-30 Method for estimating and device

Publications (2)

Publication Number Publication Date
CN103139562A true CN103139562A (en) 2013-06-05
CN103139562B CN103139562B (en) 2016-05-04

Family

ID=48498797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201110400975.XA Expired - Fee Related CN103139562B (en) 2011-11-30 2011-11-30 Method for estimating and device

Country Status (1)

Country Link
CN (1) CN103139562B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106664416A (en) * 2014-07-06 2017-05-10 Lg电子株式会社 Method for processing video signal, and apparatus therefor
CN110545428A (en) * 2018-05-28 2019-12-06 深信服科技股份有限公司 motion estimation method and device, server and computer readable storage medium
CN112738517A (en) * 2019-10-14 2021-04-30 珠海格力电器股份有限公司 Motion estimation search method, device, equipment and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1633184A (en) * 2005-01-14 2005-06-29 北京航空航天大学 Multi-reference frame rapid movement estimation method based on effective coverage
CN1750657A (en) * 2004-09-18 2006-03-22 三星电子株式会社 Based on the method for estimating of mixed block coupling with the device of its converting frame rate
CN1756355A (en) * 2004-09-29 2006-04-05 腾讯科技(深圳)有限公司 Motion estimating method in video data compression
CN1852442A (en) * 2005-08-19 2006-10-25 深圳市海思半导体有限公司 Layering motion estimation method and super farge scale integrated circuit
CN101227614A (en) * 2008-01-22 2008-07-23 炬力集成电路设计有限公司 Motion estimation device and method of video coding system
US20080212679A1 (en) * 2007-03-02 2008-09-04 Meng-Chun Lin Motion estimation with dual search windows for high resolution video coding
US20090154564A1 (en) * 2007-12-17 2009-06-18 Electronics And Telecommunications Research Institute Motion estimation apparatus and method for moving picture coding
CN101720039A (en) * 2009-09-08 2010-06-02 广东工业大学 Diamond search-based multi-resolution quick motion estimation method
CN101945284A (en) * 2010-09-29 2011-01-12 无锡中星微电子有限公司 Motion estimation device and method
WO2011142644A2 (en) * 2010-05-14 2011-11-17 삼성전자 주식회사 Method for encoding and decoding video and apparatus for encoding and decoding video using expanded block filtering

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1750657A (en) * 2004-09-18 2006-03-22 三星电子株式会社 Based on the method for estimating of mixed block coupling with the device of its converting frame rate
CN1756355A (en) * 2004-09-29 2006-04-05 腾讯科技(深圳)有限公司 Motion estimating method in video data compression
CN1633184A (en) * 2005-01-14 2005-06-29 北京航空航天大学 Multi-reference frame rapid movement estimation method based on effective coverage
CN1852442A (en) * 2005-08-19 2006-10-25 深圳市海思半导体有限公司 Layering motion estimation method and super farge scale integrated circuit
US20080212679A1 (en) * 2007-03-02 2008-09-04 Meng-Chun Lin Motion estimation with dual search windows for high resolution video coding
US20090154564A1 (en) * 2007-12-17 2009-06-18 Electronics And Telecommunications Research Institute Motion estimation apparatus and method for moving picture coding
CN101227614A (en) * 2008-01-22 2008-07-23 炬力集成电路设计有限公司 Motion estimation device and method of video coding system
CN101720039A (en) * 2009-09-08 2010-06-02 广东工业大学 Diamond search-based multi-resolution quick motion estimation method
WO2011142644A2 (en) * 2010-05-14 2011-11-17 삼성전자 주식회사 Method for encoding and decoding video and apparatus for encoding and decoding video using expanded block filtering
CN101945284A (en) * 2010-09-29 2011-01-12 无锡中星微电子有限公司 Motion estimation device and method

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106664416A (en) * 2014-07-06 2017-05-10 Lg电子株式会社 Method for processing video signal, and apparatus therefor
CN106664416B (en) * 2014-07-06 2019-11-05 Lg电子株式会社 Handle the method and device thereof of vision signal
US10567755B2 (en) 2014-07-06 2020-02-18 Lg Electronics Inc. Method for processing video signal, and apparatus therefor
CN110545428A (en) * 2018-05-28 2019-12-06 深信服科技股份有限公司 motion estimation method and device, server and computer readable storage medium
CN110545428B (en) * 2018-05-28 2024-02-23 深信服科技股份有限公司 Motion estimation method and device, server and computer readable storage medium
CN112738517A (en) * 2019-10-14 2021-04-30 珠海格力电器股份有限公司 Motion estimation search method, device, equipment and storage medium
CN112738517B (en) * 2019-10-14 2022-03-01 珠海格力电器股份有限公司 Motion estimation search method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN103139562B (en) 2016-05-04

Similar Documents

Publication Publication Date Title
US9146299B2 (en) Method and apparatus for position estimation using trajectory
CN102075760B (en) Quick movement estimation method and device
CN111093075A (en) Motion candidate derivation based on spatial neighborhood blocks in sub-block motion vector prediction
RU2017133236A (en) DISPLAYING THE MOTION VECTOR FOR VIDEO CODING
RU2013157151A (en) DEVICE FOR PROCESSING IMAGES AND METHOD FOR PROCESSING IMAGES
CN102811346B (en) coding mode selection method and system
CN103139562A (en) Motion estimation method and motion estimation device
WO2020140834A1 (en) Resolution-adaptive video coding
WO2012131151A1 (en) Methods and apparatuses for generating a panoramic image
US8823820B2 (en) Methods and apparatuses for capturing an image
JP2014183595A5 (en)
JP2005323021A (en) In-vehicle imaging system and imaging method
US10368087B2 (en) Dynamic reload of video encoder motion estimation search window under performance/power constraints
CN100366092C (en) Search method for video frequency encoding based on motion vector prediction
US20140126639A1 (en) Motion Estimation Method
US7852939B2 (en) Motion vector detection method and device of the same
CN102647587B (en) Motion estimation method and motion estimation device
JP5316309B2 (en) Image processing apparatus and image processing method
CN103841427A (en) Method for sliding search window and device thereof
US20190007687A1 (en) Specific Operation Prediction in Video Compression
WO2020258039A1 (en) Processing method for motion compensation, encoder, decoder and storage medium
JP2005294910A (en) Method for motion estimation
US20210203920A1 (en) Methods of video picture coding with sub-block merge simplification and related apparatuses
JP2008085674A (en) Motion detecting apparatus and method
JP6081049B2 (en) Image processing device, portable device, vehicle device, image processing method, and program

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160504

Termination date: 20181130