CN117083854A - Selective motion compensated frame interpolation - Google Patents

Selective motion compensated frame interpolation Download PDF

Info

Publication number
CN117083854A
CN117083854A CN202280025649.1A CN202280025649A CN117083854A CN 117083854 A CN117083854 A CN 117083854A CN 202280025649 A CN202280025649 A CN 202280025649A CN 117083854 A CN117083854 A CN 117083854A
Authority
CN
China
Prior art keywords
frame
motion
interpolation
interpolation factor
metric
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202280025649.1A
Other languages
Chinese (zh)
Inventor
A·肖阿哈萨尼拉什旦
A·埃尔沙迪
D·格纳纳普拉加萨姆
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qualcomm Inc
Original Assignee
Qualcomm Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qualcomm Inc filed Critical Qualcomm Inc
Publication of CN117083854A publication Critical patent/CN117083854A/en
Pending legal-status Critical Current

Links

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/134Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the element, parameter or criterion affecting or controlling the adaptive coding
    • H04N19/136Incoming video signal characteristics or properties
    • H04N19/137Motion inside a coding unit, e.g. average field, frame or block difference
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/17Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object
    • H04N19/172Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being an image region, e.g. an object the region being a picture, frame or field
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/503Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal prediction
    • H04N19/51Motion estimation or motion compensation
    • H04N19/513Processing of motion vectors
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/50Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding
    • H04N19/587Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding involving temporal sub-sampling or interpolation, e.g. decimation or subsequent interpolation of pictures in a video sequence
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region
    • H04N5/144Movement detection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0127Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level by changing the field or frame frequency of the incoming video signal, e.g. frame rate converter
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level
    • H04N7/0135Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes
    • H04N7/0137Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level involving interpolation processes dependent on presence/absence of motion, e.g. of motion zones

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Television Systems (AREA)

Abstract

An apparatus includes one or more processors configured to execute instructions to obtain motion data indicative of an estimated motion between a first frame and a second frame of an input sequence of image frames, and identify any frame region of the first frame indicative of motion greater than a motion threshold based on the motion data. The one or more processors are further configured to determine a motion metric associated with the identified frame region based on the motion data, and perform a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and a size metric associated with the identified frame region. The one or more processors are further configured to generate an intermediate frame based on the determination, and to generate an output sequence of image frames including the intermediate frame between the first frame and the second frame.

Description

Selective motion compensated frame interpolation
Cross Reference to Related Applications
The present application claims priority from commonly owned U.S. non-provisional patent application No. 17/219,080 filed on 3/2021, the entire contents of which are expressly incorporated herein by reference.
Technical Field
The present disclosure relates generally to selective motion compensated frame interpolation.
Background
Advances in technology have resulted in smaller and more powerful computing devices. For example, there are a variety of portable personal computing devices currently available, including wireless telephones such as mobile telephones and smart phones, tablets and laptop computers that are small, lightweight, and easily carried by users. These devices may communicate voice and data packets over a wireless network. In addition, many such devices incorporate additional functionality, such as digital still cameras, digital video cameras, digital audio recorders, and audio file players. Also, such devices may process executable instructions, including software applications that may be used to access the internet, such as web browser applications. Thus, these devices may include significant computing power.
Such computing devices often incorporate functionality to play (play) video streams. For example, the video stream may represent video content received (e.g., downloaded) from another device. Reducing the frame rate of video to meet the transmission bandwidth limit may result in poor playback (playback) quality, such as increased jitter. Motion compensated frame interpolation is used at the playback device to increase the frame rate of the video clips for smoother playback. However, frame interpolation is computationally intensive and power consumption can be high. On smaller screens, such as mobile devices, for scenes with relatively little motion, the increase in smoothness of playback from frame interpolation may not be perceived.
Disclosure of Invention
According to one embodiment of the present disclosure, an apparatus includes a memory and one or more processors. The memory is configured to store instructions. The one or more processors are configured to execute the instructions to obtain motion data indicative of estimated motion between a first frame and a second frame of the input sequence of image frames. The one or more processors are further configured to execute the instructions to identify, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. The one or more processors are further configured to execute the instructions to determine a motion metric associated with the identified frame region based on the motion data. The one or more processors are further configured to execute the instructions to perform a determination of whether to use motion compensated frame interpolation to generate the intermediate frame based on the motion metric and the size metric associated with the identified frame region. The one or more processors are further configured to execute the instructions to generate an intermediate frame based on the determination, and to generate an output sequence of image frames including the intermediate frame between the first frame and the second frame.
According to another embodiment of the present disclosure, a method includes obtaining, at a device, motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames. The method also includes identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. The method also includes determining a motion metric associated with the identified frame region based on the motion data. The method also includes performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region. The method also includes generating an intermediate frame at the device based on the determination. The method also includes generating, at the device, an output sequence of image frames including an intermediate frame between the first frame and the second frame.
According to another embodiment of the present disclosure, a non-transitory computer-readable medium includes instructions that, when executed by one or more processors, cause the one or more processors to obtain motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames. The instructions, when executed by the one or more processors, further cause the one or more processors to identify, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. The instructions, when executed by the one or more processors, further cause the one or more processors to determine a motion metric associated with the identified frame region based on the motion data. The instructions, when executed by the one or more processors, further cause the one or more processors to perform a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region. The instructions, when executed by the one or more processors, further cause the one or more processors to generate an intermediate frame based on the determination. The instructions, when executed by the one or more processors, further cause the one or more processors to generate an output sequence of image frames including an intermediate frame between the first frame and the second frame.
According to another embodiment of the present disclosure, an apparatus includes means for obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames. The apparatus further includes means for identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. The apparatus also includes means for determining a motion metric associated with the identified frame region based on the motion data. The apparatus also includes means for performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region. The apparatus further includes means for generating an intermediate frame based on the determination. The apparatus further comprises means for generating an output sequence of image frames comprising an intermediate frame between the first frame and the second frame.
Other aspects, advantages, and features of the present disclosure will become apparent after review of the entire application, including the following sections: the accompanying drawings, detailed description and claims.
Drawings
Fig. 1 is a block diagram of certain illustrative aspects of a system operable to perform selective motion compensated frame interpolation, according to some examples of the present disclosure.
Fig. 2 is a diagram of an illustrative aspect of a frame rate adjuster of the system of fig. 1, according to some examples of the present disclosure.
Fig. 3 is a diagram of an illustrative example of interpolation factor-determination data used by the frame rate adjuster of fig. 2, according to some examples of the present disclosure.
Fig. 4 is a diagram of an illustrative example of a frame generated by the system of fig. 1, according to some examples of the present disclosure.
Fig. 5 is a diagram of an illustrative aspect of a motion compensated frame interpolator of the frame rate adjuster of fig. 2, according to some examples of the present disclosure.
Fig. 6 illustrates an example of an integrated circuit operable to perform selective motion compensated frame interpolation in accordance with some examples of this disclosure.
Fig. 7 is a diagram of a mobile device operable to perform selective motion compensated frame interpolation according to some examples of the present disclosure.
Fig. 8 is a diagram of a wearable electronic device operable to perform selective motion compensated frame interpolation according to some examples of the present disclosure.
Fig. 9 is a diagram of a head-mounted viewer (head set), such as a virtual reality or augmented reality head-mounted viewer, operable to perform selective motion compensated frame interpolation according to some examples of the present disclosure.
Fig. 10 is a diagram of a first example of a vehicle operable to perform selective motion compensated frame interpolation according to some examples of the present disclosure.
Fig. 11 is a diagram of a second example of a vehicle operable to perform selective motion compensated frame interpolation according to some examples of the present disclosure.
Fig. 12 is a diagram of a particular embodiment of a method of selective motion compensated frame interpolation that may be performed by the apparatus of fig. 1, according to some examples of this invention.
Fig. 13 is a block diagram of a particular illustrative example of an apparatus operable to perform selective motion compensated frame interpolation in accordance with some examples of this invention.
Detailed Description
Motion compensated frame interpolation is used to increase the frame rate of video clips for smoother playback. For example, the frame rate of a video stream is increased from 30 frames per second (fps) to 60fps by inserting motion compensated interpolated frames between each pair of original frames. In full interpolation, the motion compensated interpolated frame represents half of the motion depicted between the original frame pair. For example, if the object is shifted 50 pixels to the right between the first original frame and the second original frame, the object is shifted 25 pixels to the right between the first original frame and the interpolated frame, and 25 pixels to the right between the interpolated frame and the second original frame. This results in smoother playback of the scene containing motion. However, frame interpolation is computationally intensive and power consumption can be high. On smaller screens, such as mobile devices, for scenes with relatively little motion, the increase in smoothness of playback from frame interpolation may not be perceived.
Systems and methods for performing selective motion compensated frame interpolation are disclosed. For example, a frame rate adjuster receives an input frame sequence and generates an output frame sequence based on the input frame sequence. The output frame sequence has a higher frame rate than the input frame sequence. For example, the frame rate adjuster adds one or more intermediate frames to the input frame sequence to generate an output frame sequence. To illustrate, the input frame sequence includes a first frame followed by a second frame. The output frame sequence includes an intermediate frame between the first frame and the second frame.
The frame rate adjuster performs motion compensated interpolation or frame copying to generate an intermediate frame. For example, the frame rate adjuster generates a motion vector indicative of motion detected between the first frame and the second frame. The frame rate adjuster determines a motion metric (e.g., average motion) and a size metric (e.g., frame percentage) for regions of the first frame that correspond to greater than a threshold motion. The frame rate adjuster determines whether to perform motion compensated frame interpolation based on the motion metric and the size metric. For example, when a greater percentage of the first frame corresponds to a higher motion, the framer performs motion-compensated frame interpolation to generate an intermediate frame. Alternatively, the framer performs frame copying to generate an intermediate frame when the first frame corresponds to a lower motion, or when a smaller percentage of the first frame corresponds to a higher motion.
Specific aspects of the disclosure are described below with reference to the accompanying drawings. In the specification, common features are designated by common reference numerals. As used herein, various terms are used solely for the purpose of describing particular embodiments and are not intended to limit the embodiments. For example, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Furthermore, some features described herein are singular in some embodiments and plural in other embodiments. For illustration, fig. 1 depicts a device 102 that includes one or more processors (processors 190 of fig. 1), indicating that in some implementations, the device 102 includes a single processor 190, while in other implementations, the device 102 includes multiple processors 190.
As used herein, the terms "include," comprises, "and" including "are used interchangeably with" include, "" comprises, "or" including. Additionally, the term "wherein" may be used interchangeably with "it. As used herein, "exemplary" indicates examples, embodiments, and/or aspects, and should not be construed as limiting or indicating a preference or preferred embodiment. As used herein, ordinal terms (e.g., "first," "second," "third," etc.) for modifying an element (such as a structure, component, operation, etc.) do not by itself indicate any priority or order of the element relative to another element, but rather merely distinguish the element from another element having the same name (although the ordinal term is used). As used herein, the term "set" refers to one or more particular elements, and the term "plurality" refers to a plurality (e.g., two or more) of the particular elements.
As used herein, "coupled" may include "communicatively coupled", "electrically coupled" or "physically coupled" and may also (or alternatively) include any combination thereof. Two devices (or components) may be directly or indirectly coupled (e.g., communicatively coupled, electrically or physically coupled) via one or more other devices, components, wires, buses, networks (e.g., wired networks, wireless networks, or a combination thereof), etc. As an illustrative, non-limiting example, two devices (or components) that are electrically coupled may be included in the same device or different devices, and may be connected via an electronic device, one or more connectors, or inductive coupling. In some implementations, two devices (or components) that are communicatively coupled, such as in electrical communication, may send and receive signals (e.g., digital or analog signals) directly or indirectly via one or more wires, buses, networks, etc. As used herein, "directly coupled" may include two devices coupled (e.g., communicatively coupled, electrically or physically coupled) without intervening components.
In this disclosure, terms such as "determine," "calculate," "estimate," "shift," "adjust," and the like may be used to describe how to perform one or more operations. It should be noted that these terms are not to be construed as limiting and that other techniques may be utilized to perform similar operations. Additionally, as referred to herein, "generating," "computing," "estimating," "using," "selecting," "accessing," and "determining" may be used interchangeably. For example, "generating," "computing," "estimating," or "determining" a parameter (or signal) may refer to actively generating, estimating, computing, or determining the parameter (or signal), or may refer to using, selecting, or accessing an already generated parameter (or signal), such as a parameter (or signal) generated by another component or device.
Referring to fig. 1, a particular illustrative aspect of a system configured to perform selective motion compensated frame interpolation is disclosed and is generally designated 100. The system 100 includes a device 102 coupled to a display device 106. The device 102 is configured to perform selective motion compensated frame interpolation using the frame rate adjuster 140.
The device 102 includes one or more processors 190 coupled to the memory 132 and the modem 170. The one or more processors 190 include a frame rate adjuster 140. Memory 132 is configured to store instructions 196. The one or more processors 190 are configured to execute the instructions 196 to perform one or more operations described herein. The modem 170 is configured to be able to communicate with one or more second devices, such as receiving a frame sequence 180 of one or more frames 101 (e.g., video frames, photo burst (photo burst), or a combination thereof). In a particular aspect, the device 102 is coupled to a display device 106. As an illustrative example, display device 106 is depicted as being external to device 102. In some examples, display device 106 is integrated into device 102.
The frame rate adjuster 140 is configured to receive the frame sequence 180 and output a frame sequence 192. The frame sequence 192 has a higher frame rate (e.g., more frames per second) than the frame sequence 180. For example, frame sequence 192 includes one or more frames 101 of frame sequence 180 and also includes one or more intermediate frames 191 interspersed between one or more frames 101. To illustrate, one or more frames 101 include frame 101A, followed by frame 101B, followed by frame 101C. The frame rate adjuster 140 is configured to generate an intermediate frame 191A based on the frames 101A and 101B, and output the intermediate frame 191A between the frames 101A and 101B as in the frame sequence 192. In a particular embodiment, the frame sequence 192 includes an intermediate frame 191 between each pair of consecutive frames of the one or more frames 101. For example, frame sequence 192 includes an intermediate frame 191A between frame 101A and frame 101B, and an intermediate frame 191B between frame 101B and frame 101C. In an alternative embodiment, frame sequence 192 includes one or more intermediate frames 191 between at least one pair of consecutive frames of one or more frames 101.
In some implementations, the device 102 corresponds to or is included in one or more types of devices. In an illustrative example, the one or more processors 190 are integrated into at least one of a mobile phone or tablet computer device as described with reference to fig. 7, a wearable electronic device as described with reference to fig. 8, or a virtual reality headset or augmented reality headset as described with reference to fig. 9. In another illustrative example, one or more processors 190 are integrated into a vehicle, such as further described with reference to fig. 10 and 11.
During operation, frame rate adjuster 140 receives a frame sequence 180 of one or more frames 101 (e.g., video frames). For example, frame rate adjuster 140 receives frames 101A and 101B of frame sequence 180. In particular embodiments, frame rate adjuster 140 receives frame sequence 180 from modem 170, memory 132, a second device (e.g., a storage device), or a combination thereof.
In a particular aspect, the frame rate adjuster 140 obtains motion data (e.g., motion vectors) indicative of estimated motion between the frames 101A and 101B. In example 150, the motion data indicates: each of the regions A, B and C of the frame 101A corresponds to a first horizontal motion (e.g., 1 pixel block to the right) and a first vertical motion (e.g., 2 pixel blocks down), the region D corresponds to no horizontal motion and a second vertical motion (e.g., 1 pixel block down), and the region X of the frame 101A corresponds to no motion (e.g., the same position in each of the frames 101A and 101B).
In a particular aspect, each region of the frame 101A has the same size. In an alternative aspect, at least one region of one or more frames 101 has a different size than another region of frame 101A. In particular aspects, the dimensions of one or more regions of frame 101A are based on default data, configuration settings, user input, or a combination thereof. In particular aspects, one or more regions of frame 101A are square, rectangular, elliptical, irregular, or a combination thereof. In a particular aspect, the regions of frame 101A do not overlap. In an alternative aspect, the regions of frame 101A at least partially overlap. In particular aspects, the pixel blocks are the same size, the same shape, or both as the regions of frame 101A. In particular aspects, as regions of frame 101A, pixel blocks have different sizes, different shapes, or both. In particular aspects, the size, shape, or both of the pixel blocks are based on default data, configuration settings, user input, or a combination thereof. In a particular aspect, the pixel block includes one or more pixels of frame 101A.
The frame rate adjuster 140 determines one or more Region Motion Metrics (RMMs) 117 for regions of the frame 101A (e.g., frame regions). For example, the frame rate adjuster 140 is based on the horizontal and vertical motion of the region (e.g., region motion metrics) ) To determine a region motion metric 117 for each region of frame 101A.
In example 150, frame rate adjuster 140 determines region motion metric 117A (e.g., 0) indicating a region X with no motion. Frame rate adjustmentThe processor 140 determines a region motion metric 117B for each of the regions A, B and C (e.g.,). The frame rate adjuster 140 determines the region motion metric 117C for region D (e.g., +.>)。
The frame rate adjuster 140 identifies any frame region of the frame 101A that indicates motion greater than the motion threshold 111 (e.g., 2) based on the motion data (e.g., the one or more region motion metrics 117). In particular aspects, the motion threshold 111 corresponds to default data, configuration settings, user input, or a combination thereof. In example 150, frame rate adjuster 140 identifies regions A, B and C as corresponding to motion greater than motion threshold 111 in response to determining that each of regions A, B and C has a region motion metric 117B (e.g., 2.2) greater than motion threshold 111 (e.g., 2). The frame rate adjuster 140 identifies the region X and the region D as corresponding to a motion less than or equal to the motion threshold 111 in response to determining that the region X has a region motion metric 117A (e.g., 0) less than or equal to the motion threshold 111 (e.g., 2) and the region D has a region motion metric 117C (e.g., 1) less than or equal to the motion threshold 111.
In a particular aspect, the frame rate adjuster 140 sets the motion metric 115 to a first motion metric value (e.g., 0) and the size metric 113 to a first size metric value (e.g., 0%) in response to determining that no region of the frame 101A corresponds to motion greater than the motion threshold 111. Alternatively, frame rate adjuster 140, in response to identifying one or more regions of frame 101A that indicate motion greater than motion threshold 111, determines motion metric 115 and size metric 113 based on region motion metric 117 of the identified regions.
The frame rate adjuster 140 determines the motion metric 115 based on the region motion metrics 117 identified as indicating each region (e.g., A, B and C) that is greater than the motion threshold 111, and is independent of the region motion metrics (e.g., region motion metrics 117A and region motion metrics 117C) identified as indicating regions (e.g., regions X and D) that are less than or equal to the motion threshold 111. In particular embodiments, motion metric 115 is indicated by region motion metric 117 identified as indicating a region of motion greater than motion threshold 111 based on an average motion (e.g., mean, median, or mode), a maximum motion, a range of motion, or a combination thereof. In example 150, motion metric 115 corresponds to an average value (e.g., (2.2+2.2+2.2)/3=2.2) of region motion metrics 117 identified as indicating a region (e.g., A, B and C) of motion greater than motion threshold 111 (e.g., 2).
The frame rate adjuster 140 determines the size metric 113 associated with the frame region (e.g., A, B and C) identified as indicating motion greater than the motion threshold 111 (e.g., 2). In particular aspects, the sizing 113 is based on a combined size of the identified frame regions (e.g., A, B and C), a percentage of the frames 101A that include the identified frame regions (e.g., A, B and C), or a combination thereof. In a particular example, the frame rate adjuster 140 determines the sizing 113 (e.g., sizing 113 = first count/second count) (e.g., 3/16 = 18.75%) based on a first count (e.g., 3) of regions (e.g., A, B, C) identified as indicating motion greater than the motion threshold 111 and a second count (e.g., 16) of all regions of the frame 101A.
The frame rate adjuster 140 performs a determination of whether to use motion compensated frame interpolation to generate the intermediate frame 191A based on the size metric 113 and the motion metric 115, as further described with reference to fig. 2-3. For example, the intermediate frame generation option 120 includes performing motion compensated frame interpolation 124 and one or more substitutions of motion compensated frame interpolation 124 (e.g., performing frame copying 122). In a particular aspect, the transition between frames 101A and 101B is predicted to be more significant during playback in the event that a frame region indicating motion greater than motion threshold 111 corresponds to a relatively larger portion of frame 101A (as indicated by size metric 113) or corresponds to a relatively larger motion (as indicated by motion metric 115). In this case, the frame rate adjuster 140 uses the motion compensated frame interpolation 124 to increase playback fluency. Alternatively, in the event that a frame region (if any) indicative of motion greater than motion threshold 111 corresponds to a relatively smaller portion of frame 101A (as indicated by size metric 113) or corresponds to relatively small motion (as indicated by motion metric 115), the transition between frame 101A and frame 101B is predicted to be less significant during playback. In this case, the frame rate adjuster 140 uses the frame copy 122 to save resources.
The frame rate adjuster 140 generates the interpolation frame 123 as the intermediate frame 191A using the motion-compensated frame interpolation 124 in response to a determination that the motion-compensated frame interpolation is to be performed. For example, frame rate adjuster 140 generates interpolated frame 123 such that the second motion between frame 101A and interpolated frame 123 is based on interpolation weight 119 applied to the first motion between frame 101A and frame 101B. In particular embodiments, interpolation weights 119 are based on size metrics 113 and motion metrics 115, as further described with reference to fig. 2-3. In a particular embodiment, interpolation weights 119 are based on predetermined weights. In particular aspects, interpolation weights 119 are based on default data, configuration settings, user input, or a combination thereof. In a particular aspect, a first duplicate value (e.g., 0) of interpolation weight 119 corresponds to a duplicate of frame 101A, and a second duplicate value (e.g., 1) of interpolation weight 119 corresponds to a duplicate of frame 101B. In a particular aspect, the value of interpolation weight 119 between the first copy value and the second copy value corresponds to interpolation. For example, the full interpolation value (e.g., 0.5) of interpolation weight 119 corresponds to full interpolation. As another example, the half interpolation value (e.g., 0.25) corresponds to half interpolation.
In example 150, motion compensated frame interpolation 124 corresponds to "full interpolation" to generate interpolated frame 123. "full interpolation" refers to generating the interpolated frame 123 such that the second motion between the interpolated frame 123 and each of the frames 101A and 101B is a full interpolation value (e.g., 0.5) of the interpolation weight 119 applied to the first motion between the frames 101A and 101B. For example, the first motion between frame 101A and frame 101B indicates a first horizontal motion (e.g., 1 pixel block to the right) and a first vertical motion (e.g., 2 pixel blocks down) for each of regions A, B and C, a no horizontal motion and a second vertical motion (e.g., 1 pixel block down) for region D, and no motion for region X. The frame rate adjuster 140 generates the interpolated frame 123 such that the second motion between the frame 101A and the interpolated frame 123 indicates a first specific horizontal motion (e.g., 0.5×1=0.5 pixel block to the right) and a first specific vertical motion (e.g., 0.5×2=1 pixel block to the bottom) for each of the regions A, B and C, a non-horizontal motion (e.g., 0.5×0=0) and a second specific vertical motion (e.g., 0.5×1=0.5 pixel block to the bottom) for the region D, and a non-motion (e.g., 0.5×0 and 0.5×0) for the region X. Playback of interpolated frame 123 as intermediate frame 191A between frame 101A and frame 101B will smooth the transition (e.g., reduce jitter) between frame 101A and frame 101B.
Alternatively, frame rate adjuster 140 uses an alternative to motion compensated frame interpolation 124 to generate intermediate frame 191A in response to a determination that motion compensated frame interpolation is not to be performed. For example, the frame rate adjuster 140 performs the frame copying 122 to generate the copied frame 121 as the intermediate frame 191A. In example 150, duplicate frame 121 corresponds to a duplicate of frame 101A. In other examples, duplicate frame 121 may correspond to a duplicate of frame 101B. In particular embodiments, generating duplicate frame 121 corresponds to generating additional frames based on frame 101A (or frame 101B). In another embodiment, generating duplicate frame 121 corresponds to increasing (e.g., doubling) the playout time of frame 101A (or frame 101B) independent of generating additional frames. In particular embodiments, generating duplicate frame 121 corresponds to including a reference to frame 101A (or frame 101B) twice in the playlist. The generation of duplicate frames 121 uses less resources (e.g., power, processing cycles, and time) than the generation of interpolated frames 123.
The frame rate adjuster 140 generates a frame sequence 192 comprising an intermediate frame 191A between the frames 101A and 101B. In a particular aspect, the frame rate adjuster 140 provides the frame sequence 192 to the display device 106 for playback, stores the frame sequence 192 in the memory 132 or storage device, provides (e.g., streams) the frame sequence 192 to another device, or a combination thereof.
The system 100 is thus able to perform selective motion compensated frame interpolation based on the size of the higher motion region and the degree of motion indicated by the higher motion region. For example, the frame rate adjuster 140 transitions between performing motion compensated frame interpolation to increase playback smoothness and performing frame copying to conserve resources.
Referring to fig. 2, an illustrative aspect of the frame rate adjuster 140 is shown. The frame rate adjuster 140 includes a motion estimator 204 coupled to an interpolation selector 208 via an interpolation factor generator 206. The interpolation selector 208 is coupled to each of the frame replicator 210 and the motion compensated frame interpolator 214.
The motion estimator 204 receives the sequence of frames 180. For example, motion estimator 204 receives frame 101A and frame 101B. The motion estimator 204 generates Motion Data (MD) 205 indicative of estimated motion between successive pairs of frames of the sequence of frames 180. For example, motion estimator 204 generates motion data 205 (e.g., a set of motion vectors) indicative of estimated motion between frames 101A and 101B. The motion estimator 204 provides motion data 205 to each of the interpolation factor generator 206 and the motion compensated frame interpolator 214.
An Interpolation Factor (IF) generator 206 determines an interpolation factor 207 based on the motion data 205. For example, interpolation factor generator 206 determines size metric 113 and motion metric 115, as described with reference to fig. 1. In a particular aspect, the interpolation factor generator 206 determines that the motion data 205 includes a set of motion vectors (e.g., 16 motion vectors) that indicate estimated motion between the frames 101A and 101B. The interpolation factor generator 206 determines that a subset of motion vectors (e.g., 3 motion vectors) indicates motion that is greater than the motion threshold 111 (e.g., 2 pixel blocks). Interpolation factor generator 206 determines size metric 113 (e.g., 3/16=18.75%) as a percentage of the motion vector of motion data 205 that indicates motion above motion threshold 111. The interpolation factor generator 206 determines the motion metric 115 based on the motion (e.g., average motion or range of motion) indicated by a subset of motion vectors (e.g., 3 motion vectors). Interpolation factor generator 206 generates interpolation factor 207 based on size metric 113 and motion metric 115 compared to interpolation factor determination data 270.
In a particular aspect, the interpolation factor-determination data 270 indicates a plurality of interpolation factor zones defined by a range of size metric values, a range of motion metric values, or a combination thereof. In fig. 3, an example 300 of interpolation factor determination data 270 is shown. The interpolation factor determination data 270 includes a plurality of interpolation factor fields including a full interpolation field 362, a transition field 364, and a duplication field 366. For example, the duplicate region 366 corresponds to little or no motion (e.g., low motion metric) indicated by a small portion (e.g., low size metric) of the frame 101A, where frame interpolation may result in an insignificant increase in playback fluency. The full interpolation region 362 corresponds to a high motion (e.g., a high motion metric) indicated by a majority (e.g., a high size metric) of the frame 101A, where frame interpolation may result in a significant increase in playback fluency. The transition region 364 corresponds to high motion in a small portion of the frame 101A or small motion in a large portion of the frame 101A, where frame interpolation may result in a slightly significant increase in playback smoothness.
In some implementations, the interpolation factor-determination data 270 includes the full interpolation region 362 and the replication region 366, and does not include any transition regions. In some implementations, the interpolation factor-determination data 270 includes a plurality of transition regions. In a particular aspect, the interpolation factor determination data 270 is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof. To illustrate, the example 300 of the interpolation factor-determination data 270 corresponds to a first configuration setting (e.g., a resource-saving setting), a first user input (e.g., a resource-saving input), a first detected context (e.g., a lecture video), a first operating mode (e.g., a low-power mode or a battery-saving mode), a first screen size (e.g., less than a threshold screen size), or a combination thereof, and the example 350 of the interpolation factor-determination data 270 corresponds to a second configuration setting (e.g., a playback fluency setting), a second user input (e.g., a playback fluency input), a second detected context (e.g., a movie), a second operating mode (e.g., a charging mode, a charged mode, or a full-power mode), a second screen size (e.g., greater than or equal to the threshold screen size), or a combination thereof.
In a particular aspect, the detected context includes a type of video content of the frame sequence 180. For example, playback fluency is more relevant for a second type of video content (e.g., a movie) than for a first type of video content (e.g., a presentation). In a particular aspect, the detected context includes a calendar event associated with the frame sequence 180, with a playback time of the frame sequence 180, or both. For example, playback fluency is more relevant for a second type of calendar event (e.g., advertising campaign presentation) than for a first type of calendar event (e.g., workout). In a particular aspect, the detected context includes a movement amount of the display device 106. For example, when the frame sequence 192 is played back while the display device 106 is moving, the smoothness of playback is less relevant (e.g., the display device 106 is integrated into a virtual reality headset that plays back the frame sequence 192 while the user of the virtual reality headset is running).
In a particular aspect, the frame rate adjuster 140 adjusts the boundaries of the region of the interpolation factor-determination data 270 based on the detected condition. For example, the detected condition is based on configuration settings, user input, detected context, mode of operation, screen size, or a combination thereof. To illustrate, the frame rate adjuster 140 shifts (e.g., to the right, top, or both) the interpolation factor determining the boundary of the region of the data 270 (e.g., from example 300 to example 350) to increase the duplicate region 366 and decrease the full interpolation region 362 in response to detecting a lower power mode of operation (e.g., low battery), a smaller screen size, a context indicating a first type of video (e.g., speech), or a combination thereof.
Increasing the copy area 366 and decreasing the full interpolation area 362 increases the motion threshold that the motion metric 115 must meet and increases the size threshold that the size metric 113 must meet to trigger the use of motion compensated interpolation to generate an intermediate frame. Resource consumption is reduced by increasing the likelihood of using frame duplication, while also enabling motion compensated interpolation to increase playback fluency for high motion and large frame portion motion. Alternatively, the frame rate adjuster 140 shifts (e.g., toward the left, the bottom, or both) the interpolation factor to determine the boundary of the region of the data 270 (e.g., from example 350 to example 300) to decrease the duplicate region 366 and increase the full interpolation region 362 in response to detecting a higher power mode of operation (e.g., plugged into a power supply), a larger screen size, a context indicating a second type of video (e.g., a movie), or a combination thereof. Reducing the duplicate area 366 and increasing the full interpolation area 362 reduces the motion threshold that the motion metric 115 must meet and reduces the size threshold that the size metric 113 must meet to trigger the use of motion compensated interpolation to generate an intermediate frame. Playback fluency is increased by increasing the likelihood of using motion compensated interpolation while also enabling frame duplication to conserve resources for low motion or small frame portion motion.
Each of the plurality of regions of interpolation factor-determination data 270 corresponds to a particular interpolation factor value. For example, the duplicate region 366 corresponds to a first interpolation factor value (e.g., 0) and the full interpolation region 362 corresponds to a second interpolation factor value (e.g., 1). In a particular aspect, the transition region 364 corresponds to a third interpolation factor value (e.g., 0.5).
In response to determining that the size measure 113 and the motion measure 115 correspond to (e.g., are within) a particular region of the interpolation factor determination data 270, the interpolation factor generator 206 generates an interpolation factor 207 indicative of a particular interpolation factor value corresponding to the particular region. For example, in response to determining that the size metric 113 and the motion metric 115 correspond to (e.g., are at) the replication region 366, the interpolation factor generator 206 generates an interpolation factor 207 that indicates a first interpolation factor value (e.g., 0) that corresponds to the replication region 366. Alternatively, the interpolation factor generator 206 generates the interpolation factor 207 indicative of the second interpolation factor value (e.g., 1) in response to determining that the size metric 113 and the motion metric 115 correspond to (e.g., are in) the full interpolation region 362. In a particular embodiment, the interpolation factor generator 206 generates the interpolation factor 207 indicative of a third interpolation factor value (e.g., 0.5) in the transition region 364 in response to determining the size metric 113 and the motion metric 115. A first interpolation factor value (e.g., 0) indicates that motion compensated frame interpolation is not to be performed. The second interpolation factor value (e.g., 1) indicates that full interpolation is to be performed. A third interpolation factor value (e.g., 0.5) indicates that partial interpolation is to be performed.
Interpolation factor determination data 270 indicating a plurality of interpolation factor zones defined by a range of size metric values and a range of motion metric values is provided as an illustrative example. In some examples, interpolation factor-determination data 270 indicates a plurality of interpolation factor zones defined by and independent of a range of motion metric values. In these examples, interpolation factor generator 206 generates interpolation factor 207 indicative of a particular interpolation factor value corresponding to a particular region in response to determining that dimension measure 113 corresponds to (e.g., is at) a particular region of interpolation factor determination data 270. In other examples, the interpolation factor-determination data 270 indicates a plurality of interpolation factor zones defined by and independent of a range of motion metric values. In these examples, interpolation factor generator 206 generates interpolation factor 207 indicative of a particular interpolation factor value corresponding to a particular region in response to determining that motion metric 115 corresponds to (e.g., is at) a particular region of interpolation factor determination data 270.
Returning to fig. 2, the interpolation factor generator 206 provides an interpolation factor 207 to each of the interpolation selector 208 and the motion compensated frame interpolator 214. The interpolation selector 208 determines whether to use motion compensated frame interpolation to generate the intermediate frame 191A based on the interpolation factor 207. For example, the determination of whether to use motion compensated frame interpolation is based on whether the interpolation factor 207 meets an interpolation criterion (e.g., whether the interpolation factor 207 is equal to 0). To illustrate, the interpolation selector 208 determines that motion compensated frame interpolation is to be used and sends an activate interpolation command 213 to the motion compensated frame interpolator 214 in response to determining that the interpolation factor 207 meets an interpolation criterion (e.g., the interpolation factor 207 is not equal to 0). Alternatively, the interpolation selector 208 determines that motion compensated frame interpolation is not to be used (e.g., a replacement for motion compensated frame interpolation is to be used) and sends an activate copy command 209 to the frame replicator 210 in response to determining that the interpolation factor 207 fails to meet the interpolation criteria (e.g., the interpolation factor 207 is equal to 0).
The motion-compensated frame interpolator 214 performs motion-compensated frame interpolation on the frames 101A and 101B in response to receiving the activate interpolation command 213 to generate the interpolated frame 123 as an intermediate frame 191A, as further described with reference to fig. 5. For example, the motion compensated frame interpolator 214 performs motion compensated frame interpolation based on the motion data 205, the interpolation factor 207, the frame 101A, the frame 101B, or a combination thereof.
The frame duplicator 210 generates a duplicate frame 121 by duplicating one of the frames 101A or 101B in response to receiving the activate duplicate command 209, as described with reference to fig. 1. The frame rate adjuster 140 outputs the duplicate frame 121 as an intermediate frame 191A. For example, the frame rate adjuster 140 outputs a frame 191 between the frame 101A and the frame 101B in the frame sequence 192.
In a first embodiment, the interpolation factor-determination data 270 does not include any transition regions. In this embodiment, the frame rate adjuster 140 switches between full interpolation and duplication to generate intermediate frames. In a second embodiment, the interpolation factor-determination data 270 includes at least one transition region. In this embodiment, when the movement in the frame sequence 180 transitions via the transition region 364 between the full interpolation region 362 and the replication region 366, the frame rate adjuster 140 transitions via partial interpolation between full interpolation and replication to generate an intermediate frame.
The frame rate adjuster 140 is thus capable of selective motion compensated frame interpolation based on the size metric 113 and the motion metric 115. In a particular aspect, the criteria for selecting motion compensated frame interpolation may be dynamically changed by adjusting the interpolation factor determination data 270 based on detected conditions (e.g., configuration settings, user input, detected context, mode of operation, screen size, or a combination thereof).
In particular embodiments, device 102 includes an always-on (always-on) power domain and a second power domain, such as an on-demand power domain. In some implementations, a first stage of the frame rate adjuster 140 is configured to operate in an always-on mode, and a second stage of the frame rate adjuster 140 is configured to operate in an on-demand mode. In a particular aspect, the motion estimator 204, the interpolation factor generator 206, the interpolation selector 208, the frame replicator 210, or a combination thereof, is included in a first stage of the frame rate adjuster 140, and the motion compensated frame interpolator 214 is included in a second stage of the frame rate adjuster 140.
The first stage is configured to generate an activate interpolation command 213 to initiate one or more operations at the second stage. In an example, the activate interpolation command 213 is configured to transition the second power domain from the low power mode to the active mode to activate one or more components of the second stage. For example, the interpolation selector 208 may include or be coupled to a power management circuit, a clock circuit, a headswitch or footswitch circuit, a buffer control circuit, or any combination thereof. The interpolation selector 208 may be configured to initiate energization of the second stage, such as by selectively applying or boosting the voltage of a power supply of the second stage, the second power domain, or both. As another example, the interpolation selector 208 may be configured to selectively gate or not gate the clock signal to the second stage, such as to prevent or enable (enable) circuit operation without removing power.
The interpolated frame 123 generated by the second stage is provided to the frame rate adjuster 140. The frame rate adjuster 140 is configured to output the interpolated frame 123 as an intermediate frame 191A. By selectively activating the second stage based on the result of processing the frames at the first stage of the frame rate adjuster 140, the overall power consumption associated with performing selective motion compensated frame interpolation may be reduced.
Referring to fig. 4, a diagram 400 of an illustrative example of a frame generated by the system 100 of fig. 1 is shown. Example 402 indicates a constant playback speed of frames of frame sequence 192 (e.g., one or more frames 101 interspersed with one or more intermediate frames 191).
The full interpolation example 404 indicates that each of the one or more intermediate frames 191 has half the similarity between the previous frame 101 and the subsequent frame 101. For example, frame rate adjuster 140 generates intermediate frame 191A such that the second motion between intermediate frame 191A and each of frames 101A and 101B corresponds to half of the first motion between frames 101A and 101B, as described with reference to motion compensated frame interpolation 124 of fig. 1. Full interpolation example 404 represents applying a full interpolation value (e.g., 0.5) of Interpolation Weight (IW) 119 to the first motion to generate intermediate frame 191A.
The semi-interpolation example 406 indicates that each of the one or more intermediate frames 191 is closer in similarity to the previous frame 101 than to the next frame 101. For example, frame rate adjuster 140 generates intermediate frame 191A such that a second motion between intermediate frame 191A and frame 101A corresponds to one-fourth of the first motion and a third motion between intermediate frame 191A and frame 101B corresponds to three-fourths of the first motion. Half interpolation example 406 represents applying a half interpolation value (e.g., 0.25) of Interpolation Weight (IW) 119 to the first motion to generate intermediate frame 191A.
The no interpolation example 408 indicates that each of the one or more intermediate frames 191 is a copy of the previous frame 101. For example, frame rate adjuster 140 generates intermediate frame 191A as a copy of frame 101A. The no interpolation example 408 represents generating the duplicate frame 121 as an intermediate frame 191A.
The transition example 410 indicates that the earlier intermediate frame 191 has half the similarity between the previous frame 101 and the next frame 101, the intermediate frame 191 in the middle is closer to the previous frame 101 than the next frame 101, and the next intermediate frame 191 is a copy of the previous frame 101. For example, the frame rate adjuster 140 transitions via partial interpolation between full interpolation and duplication to generate one or more intermediate frames 191. Transition example 410 represents updating interpolation weight 119 from a fully interpolated value (e.g., 0.5) to a non-interpolated value (e.g., 0) as one or more intermediate frames 191 are generated. The transition between full interpolation to duplication via partial interpolation during playback is less pronounced than the switch between full interpolation and duplication.
Referring to fig. 5, an illustrative aspect of the motion compensated frame interpolator 214 is shown. The motion compensated frame interpolator 214 includes a motion vector processor 502 coupled to a frame renderer 510 via an occlusion detector 504, a motion vector projector 506, a back-off analyzer 508, or a combination thereof.
In a first embodiment, the motion compensated frame interpolator 214 determines interpolation weights 119 based on a predetermined weight (e.g., 0.5). For example, the motion compensated frame interpolator 214 performs interpolation (e.g., full interpolation) corresponding to a predetermined weight (e.g., 0.5) independent of the interpolation factor 207. In a second embodiment, motion compensated frame interpolator 214 determines interpolation weights 119 based at least in part on interpolation factor 207. For example, motion compensated frame interpolator 214 applies a predetermined factor (e.g., 0.5) to interpolation factor 207 to determine interpolation weight 119. To illustrate, the full interpolation value (e.g., 1) of interpolation factor 207 corresponds to the full interpolation value (e.g., 0.5) of interpolation weight 119. The half-interpolation value (e.g., 0.5) of the interpolation factor 207 corresponds to the half-interpolation value (e.g., 0.25) of the interpolation weight 119.
In a particular aspect, when the interpolation factor 207 has a full interpolation value, the interpolation weight 119 has the same value (e.g., 0.5) in both the first embodiment and the second embodiment. When the interpolation factor 207 has a value (for example, a half interpolation value) different from the full interpolation value, the interpolation weight 119 has a value different from that in the first embodiment in the second embodiment.
The motion vector processor 502 generates a motion vector 503 based on the motion data 205, the frame 101A, and the frame 101B. For example, the motion vector processor 502 performs motion vector post-processing and refinement to generate a motion vector 503. The motion vector processor 502 provides motion vectors 503 to an occlusion detector 504, a motion vector projector 506, a back-off analyzer 508, or a combination thereof.
Motion vector projector 506 generates motion vector data 507 based on interpolation weights 119. For example, in response to determining that a first motion vector of motion vector 503 indicates a first pixel shift (e.g., 2 pixel blocks) in a first direction (e.g., downward) between frame 101A and frame 101B for region a, motion vector data 507 is generated that includes a second motion vector indicating a second pixel shift (e.g., 1 pixel block) in the first direction between frame 101A and interpolated frame 123. The second pixel shift is based on applying interpolation weight 119 (e.g., 0.5) to the first pixel shift (e.g., second pixel shift = interpolation weight 119 x first pixel shift). The motion vector projector 506 provides motion vector data 507 to the frame renderer 510.
In example 550, frame 101A includes region E. The movement of region E between frame 101A and frame 101B intersects the path of movement of region a between frame 101A and frame 101B. The motion vector data 507 indicates that region E overlaps region a, and that both region E and region a are visible in the interpolated frame 123. For example, one of region E or region a corresponds to the glass of a window.
Occlusion detector 504 detects possible occlusions in response to determining that motion vector 503, motion vector data 507, or both indicate a path of intersection of movement of region E between frame 101A and frame 101B and movement of region a between frame 101A and frame 101B. Occlusion detector 504 generates occlusion data 505 in response to detecting a possible occlusion. For example, occlusion data 505 includes occlusion data 505A (e.g., a motion vector) corresponding to region E, where occlusion data 505A partially obstructs region A in interpolated frame 123. As another example, occlusion data 505 includes occlusion data 505B (e.g., a motion vector) corresponding to region a, where occlusion data 505B at least partially obstructs region a in interpolated frame 123. Occlusion detector 504 provides occlusion data 505 to frame renderer 510.
The back-off analyzer 508 generates back-off data 509 corresponding to the interpolated frame 123 back to frame 101A in response to determining that the motion vector 503, the motion vector data 507, or both indicate a threshold count greater than the cross-path. For example, the fallback data 509 indicates that the interpolated frame 123 is a copy of the frame 101A. The back-off analyzer 508 provides back-off data 509 to the frame renderer 510.
The frame renderer 510 generates the interpolated frame 123 based on the motion vector data 507, the occlusion data 505, the back-off data 509, the frame 101A, the frame 101B, or a combination thereof. For example, in response to determining that occlusion data 505 indicates that no possible occlusion is detected, frame renderer 510 applies motion vector data 507 (e.g., a motion vector) to frame 101A to generate interpolated frame 123. Alternatively, the frame renderer 510 applies the occlusion data 505 to the frame 101A to generate the interpolated frame 123 in response to determining that the occlusion data 505 indicates that less than a threshold count of occlusions is detected. For example, the frame renderer 510 selects the occlusion data 505A or the occlusion data 505B using various occlusion resolution techniques and applies the selected one of the occlusion data 505A or the occlusion data 505B to the frame 101A to generate the interpolated frame 123. In a particular aspect, the frame renderer 510 copies the frame 101A as the interpolated frame 123 in response to determining that the fallback data 509 indicates that a threshold count of greater than or equal to occlusion is detected.
The motion compensated frame interpolator 214 is thus able to generate the interpolated frame 123 based on the interpolation weights 119. The motion compensated frame interpolator 214 can adjust the interpolated frame 123 based on the detected occlusion, including back to frame 101A.
Fig. 6 depicts an embodiment 600 of the device 102 as an integrated circuit 602 including one or more processors 190. The integrated circuit 602 also includes a signal input 604, such as one or more bus interfaces, to enable the frame sequence 180 to be received for processing. The integrated circuit 602 also includes a signal output 606, such as a bus interface, to enable transmission of an output signal, such as the frame sequence 192. The integrated circuit 602 enables selective motion compensated frame interpolation to be implemented as a component in a system, such as a mobile phone or tablet as depicted in fig. 7, a wearable electronic device as depicted in fig. 8, a virtual reality headset or augmented reality headset as depicted in fig. 9, or a vehicle as depicted in fig. 10 or 11.
Fig. 7 depicts an embodiment 700 in which, as an illustrative, non-limiting example, the device 102 includes a mobile device 702, such as a phone or tablet. The mobile device 702 includes a display device 106 (e.g., a display screen). Components of the one or more processors 190 including the frame rate adjuster 140 are integrated into the mobile device 702 and are shown using dashed lines to indicate internal components that are not generally visible to a user of the mobile device 702. In a particular example, the frame rate adjuster 140 operates to generate a frame sequence 192, which frame sequence 192 is then processed to perform one or more operations at the mobile device 702, such as initiating a graphical user interface or otherwise displaying the frame sequence 192.
Fig. 8 depicts an implementation 800 in which the device 102 includes a wearable electronic device 802, shown as a "smart watch". The frame rate adjuster 140 is integrated into the wearable electronic device 802. In a particular example, the frame rate adjuster 140 operates to generate a frame sequence 192, the frame sequence 192 then being processed to perform one or more operations at the wearable electronic device 802, such as initiating a graphical user interface or otherwise displaying the frame sequence 192 at the display device 106 (e.g., display screen) of the wearable electronic device 802. In a particular example, the wearable electronic device 802 includes a haptic device that provides a haptic notification (e.g., vibration) in response to detecting that the frame sequence 192 is ready for display. For example, the haptic notification may cause the user to look at wearable electronic device 802 to view frame sequence 192. The wearable electronic device 802 may thus alert a user with a hearing impairment or a user wearing a head-mounted viewer to detect video data.
Fig. 9 depicts an implementation 900 in which the device 102 comprises a portable electronic device corresponding to a virtual reality, augmented reality, or mixed reality headset 902. The frame rate adjuster 140 is integrated into the head-mounted viewer 902. A visual interface device (e.g., display device 106) is placed in front of the user's eyes to enable an augmented reality or virtual reality image or scene to be displayed to the user while the head mounted viewer 902 is worn. In a particular example, the visual interface device is configured to display a sequence of frames 192.
Fig. 10 depicts an embodiment 1000 in which the device 102 corresponds to or is integrated into a vehicle 1002, the vehicle 1002 being shown as a manned or unmanned aerial device (e.g., a package delivery drone). The frame rate adjuster 140, the display device 106 (e.g., a display screen), or both are integrated into the vehicle 1002. The sequence of frames 192 may be displayed for the recipient on the display device 106, such as for delivering a communication, an advertisement, an installation instruction, or a combination thereof.
Fig. 11 depicts another embodiment 1100 in which the device 102 corresponds to or is integrated into a vehicle 1102, shown as an automobile. The vehicle 1102 includes one or more processors 190, and the one or more processors 190 include a frame rate adjuster 140. In particular embodiments, frame sequence 192 is displayed via display device 106 (e.g., a display screen) in response to generating frame sequence 192 via operation of frame rate adjuster 140.
Referring to fig. 12, a particular embodiment of a method 1200 of selective motion compensated frame interpolation is shown. In particular aspects, one or more operations of method 1200 are performed by at least one of frame rate adjuster 140, one or more processors 190, device 102, system 100 of fig. 1, motion estimator 204, interpolation factor generator 206, interpolation selector 208, frame replicator 210, motion compensated frame interpolator 214 of fig. 2, motion vector processor 502, occlusion detector 504, motion vector projector 506, back-off analyzer 508, frame renderer 510 of fig. 5, or a combination thereof.
The method 1200 includes obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames at 1202. For example, the motion estimator 204 of fig. 2 obtains motion data 205 (e.g., motion vectors) indicative of estimated motion between frames 101A and 101B of a sequence of frames 180 (e.g., image frames, video frames, photo continuous shots, or a combination thereof), as described with reference to fig. 2.
The method 1200 also includes identifying, at 1204, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. For example, interpolation factor generator 206 of fig. 2 identifies any frame region of frame 101A that indicates motion greater than motion threshold 111 based on motion data 205, as described with reference to fig. 2.
The method 1200 also includes determining a motion metric associated with the identified frame region based on the motion data at 1206. For example, the interpolation factor generator 206 of fig. 2 determines the motion metric 115 associated with the identified frame region based on the motion data 205, as described with reference to fig. 2.
The method 1200 also includes, at 1208, performing a determination of whether to use motion-compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region. For example, interpolation factor generator 206 of fig. 2 generates interpolation factor 207 based on motion threshold 111 and size metric 113, as described with reference to fig. 2. The interpolation selector 208 performs a determination of whether to use motion compensated frame interpolation to generate the intermediate frame 191A based on the interpolation factor 207, as described with reference to fig. 2.
The method 1200 also includes generating an intermediate frame based on the determination at 1210. For example, the interpolation selector 208 of FIG. 2 generates one of the activate copy command 209 or the activate interpolation command 213 based on the interpolation factor 207, as described with reference to FIG. 2. In a particular aspect, frame replicator 210 generates replica frame 121 as intermediate frame 191A in response to receiving activate replica command 209, as described with reference to fig. 2. In an alternative aspect, motion compensated frame interpolator 214 generates interpolated frame 123 as intermediate frame 191A in response to receiving activate interpolation command 213, as described with reference to fig. 2.
The method 1200 also includes generating an output sequence of image frames including an intermediate frame between the first frame and the second frame at 1212. For example, the frame rate adjuster 140 of fig. 1 generates a frame sequence 192 including an intermediate frame 191A between the frames 101A and 101B, as described with reference to fig. 1.
The method 1200 is thus capable of selective motion compensated frame interpolation based on the size metric 113 (e.g., corresponding to the size of the higher motion region) and the motion metric 115 (e.g., corresponding to the degree of motion indicated by the higher motion region). For example, the frame rate adjuster 140 transitions between performing motion compensated frame interpolation to increase playback smoothness and performing frame copying to conserve resources.
The method 1200 of fig. 12 may be implemented by a Field Programmable Gate Array (FPGA) device, an Application Specific Integrated Circuit (ASIC), a processing unit such as a Central Processing Unit (CPU), a DSP, a controller, another hardware device, a firmware device, or any combination thereof. By way of example, the method 1200 of fig. 12 may be performed by a processor executing instructions, such as described with reference to fig. 13.
Referring to fig. 13, a block diagram of a particular illustrative embodiment of a device is depicted and generally designated 1300. In various embodiments, device 1300 may have more or fewer components than shown in fig. 13. In the illustrative implementation, the device 1300 may correspond to the device 102, include the device 102, or be included within the device 102. In an illustrative embodiment, the device 1300 may perform one or more of the operations described with reference to fig. 1-12.
In particular embodiments, device 1300 includes a processor 1306 (e.g., a Central Processing Unit (CPU)). The device 1300 may include one or more additional processors 1310 (e.g., one or more DSPs). In a particular aspect, the one or more processors 190 of fig. 1 correspond to the processor 1306, the processor 1310, or a combination thereof. The processor 1310 may include a voice and music coder-decoder (CODEC) 1308 including a voice coder ("vocoder") encoder 1336, a vocoder decoder 1338, or both. The processor 1310 includes a frame rate adjuster 140.
The device 1300 can include a memory 132 and a CODEC 1334. The memory 132 may include instructions 196, the instructions 196 being executable by one or more additional processors 1310 (or processors 1306) to implement the functions described with reference to the frame rate adjuster 140. Device 1300 may include a modem 170 coupled to an antenna 1352 via a transceiver 1350.
The device 1300 may include a display device 106 coupled to a display controller 1326. One or more speakers 1392 and one or more microphones 1390 can be coupled to the CODEC 1334. The CODEC 1334 may include a digital-to-analog converter (DAC) 1302, an analog-to-digital converter (ADC) 1304, or both. In particular embodiments, CODEC 1334 may receive analog signals from one or more microphones 1390, convert the analog signals to digital signals using analog-to-digital converter 1304, and provide the digital signals to speech and music CODEC 1308. The speech and music codec 1308 may process digital signals. In particular embodiments, the voice and music CODEC 1308 can provide digital signals to a CODEC 1334. The CODEC 1334 can convert digital signals to analog signals using a digital-to-analog converter 1302 and can provide the analog signals to one or more speakers 1392.
In particular embodiments, device 1300 may be included in a system-in-package or system-on-chip device 1322. In particular embodiments, memory 132, processor 1306, processor 1310, display controller 1326, CODEC 1334, and modem 170 are included in a system-in-package or system-on-chip device 1322. In a particular implementation, an input device 1330 and a power supply 1344 are coupled to the system-on-chip device 1322. Also, in particular embodiments, as shown in FIG. 13, display device 106, input device 1330, one or more speakers 1392, one or more microphones 1390, antenna 1352, and power supply 1344 are external to system-on-chip device 1322. In particular embodiments, each of display device 106, input device 1330, one or more speakers 1392, one or more microphones 1390, antenna 1352, and power supply 1344 may be coupled to a component of system-on-chip device 1322, such as an interface or controller.
The device 1300 may include a smart speaker, speaker bar, mobile communication device, smart phone, cellular phone, laptop computer, tablet, personal digital assistant, display device, television, game console, music player, radio, digital video player, digital Video Disc (DVD) player, tuner, camera, navigation device, vehicle, head-mounted viewer, augmented reality head-mounted viewer, virtual reality head-mounted viewer, air vehicle, home automation system, voice controlled device, wireless speaker and voice activated device, portable electronic device, automobile, computing device, communication device, internet of things (IoT) device, virtual Reality (VR) device, base station, mobile device, or any combination thereof.
In connection with the described embodiments, an apparatus includes means for obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames. For example, the means for obtaining motion data may correspond to the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the motion estimator 204, the processor 1306, the processor 1310 of fig. 2, one or more other circuits or components configured to obtain motion data, or any combination thereof.
The apparatus further includes means for identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold. For example, the means for identifying may correspond to the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the interpolation factor generator 206 of fig. 2, the processor 1306, the processor 1310, the one or more other circuits or components configured to identify any frame region indicative of motion greater than a motion threshold, or any combination thereof.
The apparatus also includes means for determining a motion metric associated with the identified frame region based on the motion data. For example, the means for determining a motion metric may correspond to the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the interpolation factor generator 206, the processor 1306, the processor 1310 of fig. 2, one or more other circuits or components configured to determine a motion metric, or any combination thereof.
The apparatus also includes means for performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region. For example, the means for performing the determination of whether to use motion compensated frame interpolation may correspond to the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the interpolation factor generator 206, the interpolation selector 208, the processor 1306, the processor 1310 of fig. 2, one or more other circuits or components configured to perform the determination of whether to use motion compensated frame interpolation, or any combination thereof.
The apparatus further includes means for generating an intermediate frame based on the determination. For example, the means for generating an intermediate frame may correspond to the adjustor 140, the one or more processors 190, the device 102, the system 100, the frame replicator 210, the motion-compensated frame interpolator 214, the processor 1306, the processor 1310 of fig. 2, the one or more other circuits or components configured to generate an intermediate frame, or any combination thereof of fig. 1.
The apparatus further comprises means for generating an output sequence of image frames comprising an intermediate frame between the first frame and the second frame. For example, the means for generating the output sequence may correspond to the frame rate adjuster 140, the one or more processors 190, the apparatus 102, the system 100 of fig. 1, the frame replicator 210, the motion compensated frame interpolator 214, the processor 1306, the processor 1310 of fig. 2, one or more other circuits or components configured to generate the output sequence, or any combination thereof.
In a particular aspect, the apparatus further includes means for generating an interpolation factor based on the size metric and the motion metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion. For example, the means for generating interpolation factors may correspond to the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the interpolation factor generator 206, the processor 1306, the processor 1310 of fig. 2, one or more other circuits or components configured to generate interpolation factors, or any combination thereof.
In a particular aspect, the apparatus further comprises means for receiving an input sequence of image frames. For example, the means for receiving may correspond to the modem 170, the frame rate adjuster 140, the one or more processors 190, the device 102, the system 100 of fig. 1, the motion estimator 204, the frame replicator 210, the motion compensated frame interpolator 214 of fig. 2, the processor 1306, the processor 1310, the transceiver 1350, the antenna 1352, one or more other circuits or components configured to receive an input sequence of image frames, or any combination thereof.
In some implementations, a non-transitory computer-readable medium (e.g., a computer-readable storage device, such as memory 132) includes instructions (e.g., instructions 196) that, when executed by one or more processors (e.g., one or more processors 1310 or 1306), cause the one or more processors to obtain motion data (e.g., motion data 205) indicative of estimated motion between a first frame (e.g., frame 101A) and a second frame (e.g., frame 101B) of an input sequence of image frames (e.g., frame sequence 180). The instructions, when executed by the one or more processors, further cause the one or more processors to identify, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold (e.g., motion threshold 111). The instructions, when executed by the one or more processors, further cause the one or more processors to determine a motion metric (e.g., motion metric 115) associated with the identified frame region based on the motion data. The instructions, when executed by the one or more processors, further cause the one or more processors to determine whether to generate an intermediate frame (e.g., intermediate frame 191A) using motion compensated frame interpolation based on the motion metric and the size metric (e.g., size metric 113) associated with the identified frame region. The instructions, when executed by the one or more processors, further cause the one or more processors to generate an intermediate frame based on the determination. The instructions, when executed by the one or more processors, further cause the one or more processors to generate an output sequence of image frames (e.g., frame sequence 192) including an intermediate frame between the first frame and the second frame.
Certain aspects of the disclosure are described in the following group of various related terms:
according to clause 1, an apparatus comprises: a memory configured to store instructions; and one or more processors configured to execute the instructions to: obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames; based on the motion data, identifying any frame region of the first frame that indicates motion greater than a motion threshold; determining a motion metric associated with the identified frame region based on the motion data; based on the motion metric and the size metric associated with the identified frame region, a determination is performed whether to use motion compensated frame interpolation to generate an intermediate frame; generating an intermediate frame based on the determination; and generating an output sequence of image frames comprising an intermediate frame between the first frame and the second frame.
Clause 2 includes the apparatus of clause 1, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
Clause 3 includes the apparatus of clause 1 or clause 2, wherein the motion metric is based on an average motion associated with the identified frame region.
Clause 4 includes the apparatus of any of clauses 1-3, wherein the motion metric is based on a maximum motion associated with the identified frame region.
Clause 5 includes the apparatus of any of clauses 1-4, wherein the motion metric is based on a range of motion associated with the identified frame region.
Clause 6 includes the apparatus of any of clauses 1-5, wherein the sizing is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frame comprising the identified frame regions, or a combination thereof.
Clause 7 includes the apparatus of any of clauses 1-6, wherein the sizing is based on a combined size of the identified frame regions.
Clause 8 includes the apparatus of any of clauses 1-7, wherein the sizing is based on a count of identified frame regions.
Clause 9 includes the apparatus of any of clauses 1-8, wherein the sizing is based on a percentage of the first frame including the identified frame region.
Clause 10 includes the apparatus of any of clauses 1-9, wherein the one or more processors are configured to generate an interpolation factor based on the size metric and the motion metric, and wherein the determination of whether to generate the intermediate frame using motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
Clause 11 includes the apparatus of clause 10, wherein the one or more processors are further configured to generate the interpolation factor based on the size metric and the comparison of the motion metric with the interpolation factor determination data.
Clause 12 includes the device of clause 11, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 13 includes the apparatus of clause 11 or clause 12, wherein the interpolation factor determination data is based on the configuration settings.
Clause 14 includes the apparatus of any of clauses 11-13, wherein the interpolation factor determination data is based on default data.
Clause 15 includes the apparatus of any of clauses 11-14, wherein the interpolation factor determination data is based on user input.
Clause 16 includes the apparatus of any of clauses 11-15, wherein the interpolation factor determination data is based on the detected context.
Clause 17 includes the apparatus of any of clauses 11-16, wherein the interpolation factor determination data is based on the mode of operation.
Clause 18 includes the apparatus of any of clauses 11-17, wherein the interpolation factor determination data is based on a screen size.
Clause 19 includes the apparatus of any of clauses 11-18, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by a range of size metric values and a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the one or more processors are further configured to generate an interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on determining that the motion metric and the size metric are included in the particular interpolation factor zone of the plurality of interpolation factor zones.
Clause 20 includes the apparatus of any of clauses 1-9, wherein the one or more processors are configured to generate an interpolation factor based on the size metric, and wherein the determination of whether to generate the intermediate frame using motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
Clause 21 includes the apparatus of clause 20, wherein the one or more processors are further configured to generate the interpolation factor based on a comparison of the size metric and the interpolation factor determination data.
Clause 22 includes the device of clause 21, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 23 includes the apparatus of clause 21 or clause 22, wherein the interpolation factor determination data is based on the configuration settings.
Clause 24 includes the apparatus of any of clauses 21-23, wherein the interpolation factor determination data is based on default data.
Clause 25 includes the apparatus of any of clauses 21-24, wherein the interpolation factor determination data is based on user input.
Clause 26 includes the apparatus of any of clauses 21-25, wherein the interpolation factor determination data is based on the detected context.
Clause 27 includes the apparatus of any of clauses 21-26, wherein the interpolation factor determination data is based on the mode of operation.
Clause 28 includes the apparatus of any of clauses 21-27, wherein the interpolation factor determination data is based on a screen size.
Clause 29 includes the apparatus of any of clauses 21 to 28, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of size metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the one or more processors are further configured to generate an interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on the determined size metric being included in the particular interpolation factor zone of the plurality of interpolation factor zones.
Clause 30 includes the apparatus of any of clauses 1-9, wherein the one or more processors are configured to generate an interpolation factor based on the motion metric, and wherein the determination of whether to generate the intermediate frame using motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
Clause 31 includes the apparatus of clause 30, wherein the one or more processors are further configured to generate the interpolation factor based on a comparison of the motion metric and the interpolation factor determination data.
Clause 32 includes the apparatus of clause 31, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 33 includes the apparatus of clause 31 or clause 32, wherein the interpolation factor determination data is based on the configuration settings.
Clause 34 includes the apparatus of any of clauses 31-33, wherein the interpolation factor determination data is based on default data.
Clause 35 includes the apparatus of any of clauses 31-34, wherein the interpolation factor determination data is based on user input.
Clause 36 includes the apparatus of any of clauses 31-35, wherein the interpolation factor determination data is based on the detected context.
Clause 37 includes the apparatus of any of clauses 31-36, wherein the interpolation factor determination data is based on the mode of operation.
Clause 38 includes the apparatus of any of clauses 31-37, wherein the interpolation factor determination data is based on a screen size.
Clause 39 includes the apparatus of any of clauses 31 to 38, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the one or more processors are further configured to generate an interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on determining that the motion metric is included in the particular interpolation factor zone of the plurality of interpolation factor zones.
Clause 40 includes the apparatus of any of clauses 1 to 39, wherein the one or more processors are configured to generate the intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 41 includes the device of any of clauses 1 to 40, wherein the one or more processors are configured to generate the intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 42 includes the device of any of clauses 1 to 41, wherein the one or more processors are configured to generate the intermediate frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, and to indicate a first motion between the first frame and the second frame based on the motion data such that a second motion between the first frame and the intermediate frame is based on the first motion and an interpolation factor.
Clause 43 includes the device of any of clauses 1 to 41, wherein the one or more processors are configured to generate the intermediate frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, and to generate the intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame such that a second motion between the first frame and the intermediate frame is based on a predetermined weight applied to the first motion.
Clause 44 includes the apparatus of any of clauses 1 to 43, wherein the one or more processors are configured to generate the intermediate frame as a copy of one of the first frame or the second frame in response to a determination that motion compensated frame interpolation is not to be used to generate the intermediate frame.
Clause 45 includes the apparatus of any of clauses 1 to 44, further comprising a modem configured to receive the input sequence of image frames.
According to clause 46, a method comprises: obtaining, at a device, motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames; based on the motion data, identifying any frame region of the first frame that indicates motion greater than a motion threshold; determining a motion metric associated with the identified frame region based on the motion data; based on the motion metric and the size metric associated with the identified frame region, a determination is performed whether to use motion compensated frame interpolation to generate an intermediate frame; generating an intermediate frame at the device based on the determination; and generating, at the device, an output sequence of image frames including an intermediate frame between the first frame and the second frame.
Clause 47 includes the method of clause 46, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
Clause 48 includes the method of clause 46 or clause 47, wherein the motion metric is based on an average motion associated with the identified frame region.
Clause 49 includes the method of any of clauses 46 to 48, wherein the motion metric is based on a maximum motion associated with the identified frame region.
Clause 50 includes the method of any of clauses 46 to 49, wherein the motion metric is based on a range of motion associated with the identified frame region.
Clause 51 includes the method of any of clauses 46 to 50, wherein the sizing is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frame comprising the identified frame regions, or a combination thereof.
Clause 52 includes the method of any of clauses 46 to 51, wherein the sizing is based on a combined size of the identified frame regions.
Clause 53 includes the method of any of clauses 46-52, wherein the sizing is based on a count of identified frame regions.
Clause 54 includes the method of any of clauses 46 to 53, wherein the sizing is based on a percentage of the first frame including the identified frame region.
Clause 55 includes the method of any of clauses 46-54, further comprising generating, at the device, an interpolation factor based on the size metric and the motion metric, wherein the determination of whether to generate the intermediate frame using motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
Clause 56 includes the method of clause 55, wherein the interpolation factor is based on a comparison of the size metric and the motion metric with interpolation factor determination data.
Clause 57 includes the method of clause 56, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 58 includes the method of clause 56 or clause 57, wherein the interpolation factor determination data is based on the configuration settings.
Clause 59 includes the method of any of clauses 56-58, wherein the interpolation factor determination data is based on default data.
Clause 60 includes the method of any of clauses 56 to 59, wherein the interpolation factor determination data is based on user input.
Clause 61 includes the method of any of clauses 56-60, wherein the interpolation factor determination data is based on the detected context.
Clause 62 includes the method of any of clauses 56-61, wherein the interpolation factor determination data is based on the mode of operation.
Clause 63 includes the method of any of clauses 56-62, wherein the interpolation factor determination data is based on screen size.
Clause 64 includes the method of any of clauses 56 to 63, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by a range of size metric values and a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, wherein the motion metric and the size metric are included in a particular interpolation factor zone of the plurality of interpolation factor zones, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor zone.
Clause 65 includes the method of any of clauses 55 to 64, further comprising generating the intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 66 includes the method of any of clauses 55 to 65, further comprising generating the intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 67 includes the method of any of clauses 55 to 66, further comprising generating an intermediate frame based on the motion data indicating a determination of a first motion between the first frame and the second frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, such that the second motion between the first frame and the intermediate frame is based on the first motion and an interpolation factor.
Clause 68 includes the method of any of clauses 46 to 54, further comprising generating, at the device, an interpolation factor based on the size metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 69 includes the method of clause 68, wherein the interpolation factor is based on a comparison of the size metric and interpolation factor determination data.
Clause 70 includes the method of clause 69, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 71 includes the method of clause 69 or clause 70, wherein the interpolation factor determination data is based on the configuration settings.
Clause 72 includes the method of any of clauses 69-71, wherein the interpolation factor determination data is based on default data.
Clause 73 includes the method of any of clauses 69-72, wherein the interpolation factor determination data is based on user input.
Clause 74 includes the method of any of clauses 69-73, wherein the interpolation factor determination data is based on the detected context.
Clause 75 includes the method of any of clauses 69-74, wherein the interpolation factor determination data is based on the mode of operation.
Clause 76 includes the method of any of clauses 69-75, wherein the interpolation factor determination data is based on screen size.
Clause 77 includes the method of any of clauses 69 to 76, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of size metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, wherein the size metric is included in a particular interpolation factor zone of the plurality of interpolation factor zones, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor zone.
Clause 78 includes the method of any of clauses 46 to 54, further comprising generating, at the device, an interpolation factor based on the motion metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 79 includes the method of clause 78, wherein the interpolation factor is based on a comparison of the motion metric and interpolation factor determination data.
Clause 80 includes the method of clause 79, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 81 includes the method of clause 79 or clause 80, wherein the interpolation factor determination data is based on the configuration settings.
Clause 82 includes the method of any of clauses 79-81, wherein the interpolation factor determination data is based on default data.
Clause 83 includes the method of any of clauses 79 to 82, wherein the interpolation factor determination data is based on user input.
Clause 84 includes the method of any of clauses 79 to 83, wherein the interpolation factor determination data is based on the detected context.
Clause 85 includes the method of any of clauses 79 to 84, wherein the interpolation factor determination data is based on the mode of operation.
Clause 86 includes the method of any of clauses 79 to 85, wherein the interpolation factor determination data is based on screen size.
Clause 87 includes the method of any of clauses 79 to 86, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, wherein the motion metric is included in a particular interpolation factor zone of the plurality of interpolation factor zones, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor zone.
Clause 88 includes the method of any of clauses 46 to 87, further comprising generating an intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 89 includes the method of any of clauses 46 to 88, further comprising generating an intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 90 includes the method of any of clauses 46 to 89, further comprising generating an intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame in response to the determination that motion compensated frame interpolation is to be used to generate the intermediate frame, such that the second motion between the first frame and the intermediate frame is based on the first motion and the interpolation factor.
Clause 91 includes the method of any of clauses 46 to 89, further comprising generating an intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame in response to the determination that motion compensated frame interpolation is to be used to generate the intermediate frame, such that the second motion between the first frame and the intermediate frame is based on a predetermined weight applied to the first motion.
Clause 92 includes the method of any of clauses 46 to 91, further comprising generating the intermediate frame as a copy of one of the first frame or the second frame in response to a determination that motion compensated frame interpolation will not be used to generate the intermediate frame.
According to clause 93, a non-transitory computer-readable medium stores instructions that, when executed by one or more processors, cause the one or more processors to: obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames; based on the motion data, identifying any frame region of the first frame that indicates motion greater than a motion threshold; determining a motion metric associated with the identified frame region based on the motion data; based on the motion metric and the size metric associated with the identified frame region, a determination is performed whether to use motion compensated frame interpolation to generate an intermediate frame; generating an intermediate frame based on the determination; and generating an output sequence of image frames comprising an intermediate frame between the first frame and the second frame.
Clause 94 includes the non-transitory computer-readable medium of clause 93, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
Clause 95 includes the non-transitory computer-readable medium of clause 93 or 94, wherein the motion metric is based on an average motion associated with the identified frame region.
Clause 96 includes the non-transitory computer-readable medium of any of clauses 93-95, wherein the motion metric is based on a maximum motion associated with the identified frame region.
Clause 97 includes the non-transitory computer-readable medium of any of clauses 93-96, wherein the motion metric is based on a range of motion associated with the identified frame region.
Clause 98 includes the non-transitory computer-readable medium of any of clauses 93-97, wherein the sizing is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frame comprising the identified frame regions, or a combination thereof.
Clause 99 includes the non-transitory computer-readable medium of any of clauses 93-98, wherein the size measure is based on a combined size of the identified frame regions.
Clause 100 includes the non-transitory computer-readable medium of any of clauses 93-99, wherein the size metric is based on a count of identified frame regions.
Clause 101 includes the non-transitory computer-readable medium of any of clauses 93-100, wherein the size metric is based on a percentage of the first frame including the identified frame region.
Clause 102 includes the non-transitory computer-readable medium of any of clauses 93 to 101, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate an interpolation factor based on the size metric and the motion metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 103 includes the non-transitory computer-readable medium of clause 102, wherein the interpolation factor is based on the comparison of the size metric and the motion metric with interpolation factor determination data.
Clause 104 includes the non-transitory computer-readable medium of clause 103, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 105 includes the non-transitory computer-readable medium of clause 103 or clause 104, wherein the interpolation factor determination data is based on the configuration settings.
Clause 106 includes the non-transitory computer-readable medium of any of clauses 103-105, wherein the interpolation factor determination data is based on default data.
Clause 107 includes the non-transitory computer-readable medium of any of clauses 103-106, wherein the interpolation factor determination data is based on user input.
Clause 108 includes the non-transitory computer-readable medium of any of clauses 103-107, wherein the interpolation factor determination data is based on the detected context.
Clause 109 includes the non-transitory computer-readable medium of any of clauses 103-108, wherein the interpolation factor determination data is based on the mode of operation.
Clause 110 includes the non-transitory computer-readable medium of any of clauses 103-109, wherein the interpolation factor determination data is based on a screen size.
Clause 111 includes the non-transitory computer-readable medium of any of clauses 103-110, wherein the interpolation factor-determining data indicates a plurality of interpolation factor-areas defined by a range of size metric values and a range of motion metric values, wherein each of the plurality of interpolation factor-areas corresponds to a particular interpolation factor value, wherein the motion metric and the size metric are included in a particular interpolation factor-area of the plurality of interpolation factor-areas, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor-area.
Clause 112 includes the non-transitory computer-readable medium of any of clauses 102-111, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate the intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 113 includes the non-transitory computer-readable medium of any of clauses 102 to 112, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate the intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 114 includes the non-transitory computer-readable medium of any of clauses 102 to 113, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate the intermediate frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, and to indicate a determination of a first motion between the first frame and the second frame based on the motion data, such that a second motion between the first frame and the intermediate frame is based on the first motion and an interpolation factor.
Clause 115 includes the non-transitory computer-readable medium of any of clauses 93 to 101, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate an interpolation factor based on the size metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 116 includes the non-transitory computer-readable medium of clause 115, wherein the interpolation factor is based on a comparison of the size metric and interpolation factor determination data.
Clause 117 includes the non-transitory computer-readable medium of clause 116, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 118 includes the non-transitory computer-readable medium of clause 116 or clause 117, wherein the interpolation factor determination data is based on the configuration settings.
Clause 119 includes the non-transitory computer-readable medium of any of clauses 116-118, wherein the interpolation factor determination data is based on default data.
Clause 120 includes the non-transitory computer-readable medium of any of clauses 116-119, wherein the interpolation factor determination data is based on user input.
Clause 121 includes the non-transitory computer-readable medium of any of clauses 116-120, wherein the interpolation factor determination data is based on the detected context.
Clause 122 includes the non-transitory computer-readable medium of any of clauses 116-121, wherein the interpolation factor determination data is based on the mode of operation.
Clause 123 includes the non-transitory computer-readable medium of any of clauses 116-122, wherein the interpolation factor determination data is based on a screen size.
Clause 124 includes the non-transitory computer-readable medium of any of clauses 116-123, wherein the interpolation factor-determination data indicates a plurality of interpolation factor-areas defined by at least a range of size-metric values, wherein each of the plurality of interpolation factor-areas corresponds to a particular interpolation factor value, wherein the size-metric is included in a particular interpolation factor-area of the plurality of interpolation factor-areas, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor-area.
Clause 125 includes the non-transitory computer-readable medium of any of clauses 93 to 124, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate an interpolation factor based on the motion metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 126 includes the non-transitory computer-readable medium of clause 125, wherein the interpolation factor is based on a comparison of the motion metric and interpolation factor determination data.
Clause 127 includes the non-transitory computer-readable medium of clause 126, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 128 includes the non-transitory computer-readable medium of clause 126 or clause 127, wherein the interpolation factor determination data is based on the configuration settings.
Clause 129 includes the non-transitory computer-readable medium of any of clauses 126-128, wherein the interpolation factor determination data is based on default data.
Clause 130 includes the non-transitory computer-readable medium of any of clauses 126-129, wherein the interpolation factor determination data is based on user input.
Clause 131 includes the non-transitory computer-readable medium of any of clauses 126-130, wherein the interpolation factor determination data is based on the detected context.
Clause 132 includes the non-transitory computer-readable medium of any of clauses 126-131, wherein the interpolation factor determination data is based on the mode of operation.
Clause 133 includes the non-transitory computer-readable medium of any of clauses 126-132, wherein the interpolation factor determination data is based on a screen size.
Clause 134 includes the non-transitory computer-readable medium of any of clauses 126-133, wherein the interpolation factor-determination data indicates a plurality of interpolation factor-areas defined by at least a range of motion metric values, wherein each of the plurality of interpolation factor-areas corresponds to a particular interpolation factor value, wherein the motion metric is included in a particular interpolation factor-area of the plurality of interpolation factor-areas, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor-area.
Clause 135 includes the non-transitory computer-readable medium of any of clauses 93-134, further comprising generating an intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 136 includes the non-transitory computer-readable medium of any of clauses 93 to 135, further comprising generating the intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 137 includes the non-transitory computer-readable medium of any of clauses 93-136, further comprising generating an intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame in response to the motion compensated frame interpolation being to be used to generate the intermediate frame, such that a second motion between the first frame and the intermediate frame is based on the first motion and the interpolation factor.
Clause 138 includes the non-transitory computer-readable medium of any of clauses 93 to 136, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate the intermediate frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, and to indicate a determination of a first motion between the first frame and the second frame based on the motion data such that a second motion between the first frame and the intermediate frame is based on a predetermined weight applied to the first motion.
Clause 139 includes the non-transitory computer-readable medium of any of clauses 93 to 138, wherein the instructions, when executed by the one or more processors, further cause the one or more processors to generate the intermediate frame as a copy of one of the first frame or the second frame in response to a determination that motion compensated frame interpolation is not to be used to generate the intermediate frame.
According to clause 140, an apparatus comprises: means for obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames; means for identifying, based on the motion data, any frame region of the first frame indicative of motion greater than a motion threshold; means for determining a motion metric associated with the identified frame region based on the motion data; means for performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and the size metric associated with the identified frame region; means for generating an intermediate frame based on the determination; and means for generating an output sequence of image frames comprising an intermediate frame between the first frame and the second frame.
Clause 141 includes the apparatus of clause 140, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
Clause 142 includes the means of clause 140 or clause 141, wherein the motion metric is based on an average motion associated with the identified frame region.
Clause 143 includes the apparatus of any of clauses 140 to 142, wherein the motion metric is based on a maximum motion associated with the identified frame region.
Clause 144 includes the apparatus of any of clauses 140 to 143, wherein the motion metric is based on a range of motion associated with the identified frame region.
Clause 145 includes the apparatus of any of clauses 140 to 144, wherein the sizing is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frame comprising the identified frame regions, or a combination thereof.
Clause 146 includes the apparatus of any of clauses 140 to 145, wherein the sizing is based on a combined size of the identified frame regions.
Clause 147 includes the apparatus of any of clauses 140 to 146, wherein the sizing is based on a count of identified frame regions.
Clause 148 includes the apparatus of any of clauses 140 to 147, wherein the sizing is based on a percentage of the first frame including the identified frame region.
Clause 149 includes the apparatus of any of clauses 140-148, further comprising means for generating an interpolation factor based on the size metric and the motion metric, wherein the determination of whether to generate the intermediate frame using motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
Clause 150 includes the apparatus of clause 149, wherein the interpolation factor is based on a comparison of the size metric and the motion metric with interpolation factor determination data.
Clause 151 includes the apparatus of clause 150, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 152 includes the means of clause 150 or 151, wherein the interpolation factor determination data is based on the configuration settings.
Clause 153 includes the apparatus of any of clauses 150 to 152, wherein the interpolation factor determination data is based on default data.
Clause 154 includes the apparatus of any of clauses 150-153, wherein the interpolation factor determination data is based on user input.
Clause 155 includes the apparatus of any of clauses 150-154, wherein the interpolation factor determination data is based on the detected context.
Clause 156 includes the apparatus of any of clauses 150-155, wherein the interpolation factor determination data is based on the mode of operation.
Clause 157 includes the apparatus of any of clauses 150 to 156, wherein the interpolation factor determination data is based on screen size.
Clause 158 includes the apparatus of any of clauses 150 to 157, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by a range of size metric values and a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the means for generating the interpolation factor generates the interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on determining that the motion metric and the size metric are included in the particular interpolation factor zone of the plurality of interpolation factor zones.
Clause 159 includes the apparatus of any of clauses 149 to 158, wherein the means for generating an intermediate frame uses motion compensated frame interpolation to generate the intermediate frame in response to determining that the interpolation factor meets an interpolation criterion.
Clause 160 includes the apparatus of any of clauses 149 to 159, wherein the means for generating an intermediate frame uses an alternative to motion compensated frame interpolation to generate the intermediate frame in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 161 includes the apparatus of any of clauses 149 to 160, wherein the means for generating an intermediate frame is to be used to generate the determination of the intermediate frame in response to the motion compensated frame interpolation, and to generate the intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame such that a second motion between the first frame and the intermediate frame is based on the first motion and the interpolation factor.
Clause 162 includes the apparatus of any of clauses 140 to 148, further comprising means for generating an interpolation factor based on the size metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 163 includes the apparatus of clause 162, wherein the interpolation factor is based on a comparison of the size metric and interpolation factor determination data.
Clause 164 includes the apparatus of clause 163, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 165 includes the means of clause 163 or clause 164, wherein the interpolation factor determination data is based on the configuration settings.
Clause 166 includes the apparatus of any of clauses 163-165, wherein the interpolation factor determination data is based on default data.
Clause 167 includes the apparatus of any of clauses 163-166, wherein the interpolation factor determination data is based on user input.
Clause 168 includes the apparatus of any of clauses 163-167, wherein the interpolation factor determination data is based on the detected context.
Clause 169 includes the apparatus of any of clauses 163-168, wherein the interpolation factor determination data is based on the mode of operation.
Clause 170 includes the apparatus of any of clauses 163-169, wherein the interpolation factor determination data is based on a screen size.
Clause 171 includes the apparatus of any of clauses 163 to 170, wherein the interpolation factor-determination data indicates a plurality of interpolation factor fields defined by at least a range of size metric values, wherein each of the plurality of interpolation factor fields corresponds to a particular interpolation factor value, wherein the size metric is included in the particular interpolation factor field of the plurality of interpolation factor fields, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor field.
Clause 172 includes the apparatus of any of clauses 140 to 148, further comprising generating, at the device, an interpolation factor based on the motion metric, wherein the determination of whether to use motion compensated frame interpolation to generate the intermediate frame is based on whether the interpolation factor meets an interpolation criterion.
Clause 173 includes the apparatus of clause 172, wherein the interpolation factor is based on a comparison of the motion metric and interpolation factor determination data.
Clause 174 includes the apparatus of clause 173, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
Clause 175 includes the means of clause 173 or clause 174, wherein the interpolation factor determination data is based on the configuration settings.
Clause 176 includes the apparatus of any of clauses 173 to 175, wherein the interpolation factor determination data is based on default data.
Clause 177 includes the apparatus of any of clauses 173-176, wherein the interpolation factor determination data is based on user input.
Clause 178 includes the apparatus of any one of clauses 173 to 177, wherein the interpolation factor determination data is based on the detected context.
Clause 179 includes the apparatus of any of clauses 173 to 178, wherein the interpolation factor determination data is based on the mode of operation.
Clause 180 includes the apparatus of any of clauses 173 to 179, wherein the interpolation factor determination data is based on screen size.
Clause 181 includes the apparatus of any of clauses 173 to 180, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, wherein the motion metric is included in a particular interpolation factor zone of the plurality of interpolation factor zones, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor zone.
Clause 182 includes the apparatus of any of clauses 140 to 181, further comprising generating an intermediate frame using motion compensated frame interpolation in response to determining that the interpolation factor meets an interpolation criterion.
Clause 183 includes the apparatus of any of clauses 140 to 182, further comprising generating the intermediate frame using an alternative to motion compensated frame interpolation in response to determining that the interpolation factor fails to meet the interpolation criteria.
Clause 184 includes the apparatus of any of clauses 140 to 183, further comprising generating an intermediate frame based on the motion data indicating a determination of a first motion between the first frame and the second frame in response to a determination that motion compensated frame interpolation is to be used to generate the intermediate frame, such that the second motion between the first frame and the intermediate frame is based on the first motion and an interpolation factor.
Clause 185 includes the apparatus of any of clauses 140 to 183, wherein the means for generating an intermediate frame is to be used to generate the determination of the intermediate frame in response to the motion compensated frame interpolation, and to generate the intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame such that a second motion between the first frame and the intermediate frame is based on a predetermined weight applied to the first motion.
Clause 186 includes the apparatus of any of clauses 140 to 185, wherein the means for generating an intermediate frame generates the intermediate frame as a copy of one of the first frame or the second frame in response to a determination that motion compensated frame interpolation will not be used to generate the intermediate frame.
Clause 187 includes the apparatus of any of clauses 140 to 186, further comprising means for receiving an input sequence of image frames.
Clause 188 includes the apparatus of any of clauses 140 to 187, wherein the means for obtaining motion data, the means for identifying any frame regions, the means for determining motion metrics, the means for performing the determining, the means for generating intermediate frames, the means for generating output sequences are integrated into at least one of a communication device, a computer, a display device, a television, a game console, a digital video player, a camera, a navigation device, a vehicle, a head-mounted viewer, an augmented reality head-mounted viewer, a virtual reality head-mounted viewer, an air vehicle, a home automation system, a voice control device, an internet of things (IoT) device, a Virtual Reality (VR) device, a base station, or a mobile device.
Those of skill would further appreciate that the various illustrative logical blocks, configurations, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software executed by a processor, or combinations of both. Various illustrative components, blocks, configurations, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or processor-executable instructions depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), flash memory, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), registers, hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of non-transitory storage medium known in the art. An exemplary storage medium is coupled to the processor such the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC). The ASIC may reside in a computing device or user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a computing device or user terminal.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the disclosed aspects. Various modifications to these aspects will be readily apparent to those skilled in the art, and the principles defined herein may be applied to other aspects without departing from the scope of the invention. Thus, the present disclosure is not intended to be limited to the aspects shown herein but is to be accorded the widest scope possible consistent with the principles and novel features as defined by the following claims.

Claims (30)

1. An apparatus, comprising:
a memory configured to store instructions; and
one or more processors configured to execute the instructions to:
obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames;
based on the motion data, identifying any frame regions of the first frame that are indicative of motion greater than a motion threshold;
determining a motion metric associated with the identified frame region based on the motion data;
performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and size metric associated with the identified frame region;
generating the intermediate frame based on the determination; and
an output sequence of image frames including the intermediate frame between the first frame and the second frame is generated.
2. The apparatus of claim 1, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
3. The apparatus of claim 1, wherein the size metric is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frames comprising the identified frame regions, or a combination thereof.
4. The apparatus of claim 1, wherein the one or more processors are configured to generate an interpolation factor based on the size metric, and wherein the determination of whether to generate the intermediate frame using the motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
5. The apparatus of claim 4, wherein the one or more processors are further configured to generate the interpolation factor based on a comparison of the size metric and the interpolation factor determination data.
6. The apparatus of claim 5, wherein the interpolation factor-determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
7. The apparatus of claim 5, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by a range of at least size metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the one or more processors are further configured to generate the interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on determining that the size metric is included in a particular interpolation factor zone of the plurality of interpolation factor zones.
8. The device of claim 1, wherein the one or more processors are configured to generate an interpolation factor based on the motion metric, and wherein the determination of whether to generate the intermediate frame using the motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
9. The apparatus of claim 8, wherein the one or more processors are further configured to generate the interpolation factor based on a comparison of the motion metric and interpolation factor determination data.
10. The apparatus of claim 9, wherein the interpolation factor-determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
11. The apparatus of claim 9, wherein the interpolation factor-determination data indicates a plurality of interpolation factor zones defined by at least a range of motion metric values, wherein each of the plurality of interpolation factor zones corresponds to a particular interpolation factor value, and wherein the one or more processors are further configured to generate the interpolation factor having an interpolation factor value corresponding to the particular interpolation factor zone based on determining that the motion metric is included in a particular interpolation factor zone of the plurality of interpolation factor zones.
12. The apparatus of claim 1, wherein the one or more processors are configured to generate the intermediate frame using the motion compensated frame interpolation in response to determining that an interpolation factor meets an interpolation criterion.
13. The apparatus of claim 1, wherein the one or more processors are configured to generate the intermediate frame using an alternative to the motion compensated frame interpolation in response to determining that an interpolation factor fails to meet an interpolation criterion.
14. The device of claim 1, wherein the one or more processors are configured to generate the intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame in response to the determination that the motion compensated frame interpolation is to be used to generate the intermediate frame such that a second motion between the first frame and the intermediate frame is based on the first motion and an interpolation factor.
15. The device of claim 1, wherein the one or more processors are configured to generate the intermediate frame based on the determination that the motion data indicates a first motion between the first frame and the second frame in response to the determination that the motion compensated frame interpolation is to be used to generate the intermediate frame such that a second motion between the first frame and the intermediate frame is based on a predetermined weight applied to the first motion.
16. The device of claim 1, wherein the one or more processors are configured to generate the intermediate frame as a copy of one of the first frame or the second frame in response to the determination that the motion-compensated frame interpolation is not to be used to generate the intermediate frame.
17. The device of claim 1, further comprising a modem configured to receive an input sequence of the image frames.
18. A method, comprising:
obtaining, at a device, motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames;
identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold;
determining a motion metric associated with the identified frame region based on the motion data;
performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and size metric associated with the identified frame region;
generating, at the device, the intermediate frame based on the determination; and
an output sequence of image frames including the intermediate frame between the first frame and the second frame is generated at the device.
19. The method of claim 18, wherein the motion metric is based on an average motion, a maximum motion, a range of motion, or a combination thereof associated with the identified frame region.
20. The method of claim 18, wherein the size metric is based on a combined size of the identified frame regions, a count of the identified frame regions, a percentage of the first frames comprising the identified frame regions, or a combination thereof.
21. The method of claim 18, further comprising generating, at the device, an interpolation factor based on the size metric and the motion metric, wherein the determination of whether to generate the intermediate frame using the motion compensated frame interpolation is based on whether the interpolation factor meets an interpolation criterion.
22. The method of claim 21, wherein the interpolation factor is based on the size metric and a comparison of the motion metric to interpolation factor determination data.
23. The method of claim 22, wherein the interpolation factor determination data is based on configuration settings, default data, user input, detected context, mode of operation, screen size, or a combination thereof.
24. The method of claim 22, wherein the interpolation factor-determination data indicates a plurality of interpolation factor-areas defined by a range of size measurement values and a range of motion measurement values, wherein each of the plurality of interpolation factor-areas corresponds to a particular interpolation factor value, wherein the motion metric and the size metric are included in a particular interpolation factor-area of the plurality of interpolation factor-areas, and wherein the interpolation factor has an interpolation factor value corresponding to the particular interpolation factor-area.
25. The method of claim 21, further comprising generating the intermediate frame using the motion compensated frame interpolation in response to determining that the interpolation factor meets the interpolation criteria.
26. The method of claim 21, further comprising, in response to determining that the interpolation factor fails to meet the interpolation criteria, generating the intermediate frame using an alternative to the motion compensated frame interpolation.
27. The method of claim 21, further comprising, in response to the determination that the motion compensated frame interpolation is to be used to generate the intermediate frame, and based on the determination that the motion data indicates a first motion between the first frame and the second frame, generating the intermediate frame such that a second motion between the first frame and the intermediate frame is based on the first motion and the interpolation factor.
28. A non-transitory computer-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to:
obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames;
identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold;
Determining a motion metric associated with the identified frame region based on the motion data;
performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and size metric associated with the identified frame region;
generating the intermediate frame based on the determination; and
an output sequence of image frames including the intermediate frame between the first frame and the second frame is generated.
29. An apparatus, comprising:
means for obtaining motion data indicative of estimated motion between a first frame and a second frame of an input sequence of image frames;
means for identifying, based on the motion data, any frame region of the first frame that indicates motion greater than a motion threshold;
means for determining a motion metric associated with the identified frame region based on the motion data;
means for performing a determination of whether to use motion compensated frame interpolation to generate an intermediate frame based on the motion metric and a size metric associated with the identified frame region;
means for generating the intermediate frame based on the determination; and
means for generating an output sequence of image frames comprising the intermediate frame between the first frame and the second frame.
30. The apparatus of claim 29, wherein the means for obtaining the motion data, the means for identifying any frame regions, the means for determining the motion metric, the means for performing the determination, the means for generating the intermediate frames, and the means for generating the output sequence are integrated into at least one of a communication device, a computer, a display device, a television, a game console, a digital video player, a camera, a navigation device, a vehicle, a head-mounted viewer, an augmented reality head-mounted viewer, a virtual reality head-mounted viewer, an air vehicle, a home automation system, a voice control device, an internet of things (IoT) device, a Virtual Reality (VR) device, a base station, or a mobile device.
CN202280025649.1A 2021-03-31 2022-03-02 Selective motion compensated frame interpolation Pending CN117083854A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US17/219,080 2021-03-31
US17/219,080 US11558621B2 (en) 2021-03-31 2021-03-31 Selective motion-compensated frame interpolation
PCT/US2022/070914 WO2022212996A1 (en) 2021-03-31 2022-03-02 Selective motion-compensated frame interpolation

Publications (1)

Publication Number Publication Date
CN117083854A true CN117083854A (en) 2023-11-17

Family

ID=80820315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202280025649.1A Pending CN117083854A (en) 2021-03-31 2022-03-02 Selective motion compensated frame interpolation

Country Status (7)

Country Link
US (1) US11558621B2 (en)
EP (1) EP4315853A1 (en)
KR (1) KR20230138052A (en)
CN (1) CN117083854A (en)
BR (1) BR112023019069A2 (en)
TW (1) TW202240527A (en)
WO (1) WO2022212996A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230088882A1 (en) * 2021-09-22 2023-03-23 Samsung Electronics Co., Ltd. Judder detection for dynamic frame rate conversion

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008126252A1 (en) * 2007-03-30 2008-10-23 Pioneer Corporation Image generating device, image generating method, image generating program, and computer-readable recording medium
CN101919249A (en) * 2007-12-10 2010-12-15 高通股份有限公司 Resource-adaptive video interpolation or extrapolation
US20170187985A1 (en) * 2015-12-24 2017-06-29 Samsung Electronics Co., Ltd. Apparatus and method for frame rate conversion
CN110933497A (en) * 2019-12-10 2020-03-27 Oppo广东移动通信有限公司 Video image data frame insertion processing method and related equipment

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6594313B1 (en) 1998-12-23 2003-07-15 Intel Corporation Increased video playback framerate in low bit-rate video applications
US6442203B1 (en) 1999-11-05 2002-08-27 Demografx System and method for motion compensation and frame rate conversion
US20030202780A1 (en) * 2002-04-25 2003-10-30 Dumm Matthew Brian Method and system for enhancing the playback of video frames
US7408986B2 (en) * 2003-06-13 2008-08-05 Microsoft Corporation Increasing motion smoothness using frame interpolation with motion analysis
US20080025390A1 (en) 2006-07-25 2008-01-31 Fang Shi Adaptive video frame interpolation
US8477848B1 (en) * 2008-04-22 2013-07-02 Marvell International Ltd. Picture rate conversion system architecture
US20090310679A1 (en) * 2008-06-11 2009-12-17 Mediatek Inc. Video processing apparatus and methods
US8619198B1 (en) 2009-04-28 2013-12-31 Lucasfilm Entertainment Company Ltd. Adjusting frame rates for video applications
US8411751B2 (en) * 2009-12-15 2013-04-02 Nvidia Corporation Reducing and correcting motion estimation artifacts during video frame rate conversion
US10713753B2 (en) * 2018-10-12 2020-07-14 Apical Limited Data processing systems

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008126252A1 (en) * 2007-03-30 2008-10-23 Pioneer Corporation Image generating device, image generating method, image generating program, and computer-readable recording medium
CN101919249A (en) * 2007-12-10 2010-12-15 高通股份有限公司 Resource-adaptive video interpolation or extrapolation
US20170187985A1 (en) * 2015-12-24 2017-06-29 Samsung Electronics Co., Ltd. Apparatus and method for frame rate conversion
CN110933497A (en) * 2019-12-10 2020-03-27 Oppo广东移动通信有限公司 Video image data frame insertion processing method and related equipment

Also Published As

Publication number Publication date
US20220321889A1 (en) 2022-10-06
KR20230138052A (en) 2023-10-05
US11558621B2 (en) 2023-01-17
EP4315853A1 (en) 2024-02-07
TW202240527A (en) 2022-10-16
BR112023019069A2 (en) 2023-10-17
WO2022212996A1 (en) 2022-10-06

Similar Documents

Publication Publication Date Title
WO2021175055A1 (en) Video processing method and related device
US7952596B2 (en) Electronic devices that pan/zoom displayed sub-area within video frames in response to movement therein
CN107135422B (en) Information processing apparatus, information processing method, and computer program
US8238420B1 (en) Video content transcoding for mobile devices
WO2021175054A1 (en) Image data processing method, and related apparatus
JP2011035796A (en) Picture processing apparatus and picture processing method
KR20160115020A (en) Device supporting multipath tcp, and method of receiving video data of device by streaming
WO2022161383A1 (en) Filming control method and apparatus, and electronic device
JP7155164B2 (en) Temporal placement of rebuffering events
US20130223813A1 (en) Moving image reproduction apparatus, information processing apparatus, and moving image reproduction method
WO2023160617A9 (en) Video frame interpolation processing method, video frame interpolation processing device, and readable storage medium
WO2016200721A1 (en) Contextual video content adaptation based on target device
CN117083854A (en) Selective motion compensated frame interpolation
JP2008022070A (en) Content distribution system, content distribution server, content reproduction terminal, program, content distribution method
US10747492B2 (en) Signal processing apparatus, signal processing method, and storage medium
JP2011244328A (en) Video reproduction apparatus and video reproduction apparatus control method
JP5683291B2 (en) Movie reproducing apparatus, method, program, and recording medium
KR102411911B1 (en) Apparatus and method for frame rate conversion
JP2009135769A (en) Image processing device
JP2006101063A (en) Transmission of moving image data in consideration of environment of reproduction side
US11930207B2 (en) Display device, signal processing device, and signal processing method
JP4924131B2 (en) Image processing apparatus, image processing method, image processing program, reproduction information generation apparatus, reproduction information generation method, and reproduction information generation program
US20230421998A1 (en) Video decoder with inline downscaler
KR20180013243A (en) Method and Apparatus for Providing and Storing Streaming Contents
JP2005292691A (en) Moving image display device and moving image display method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination