WO2018123202A1 - Moving image processing device, display device, moving image processing method, and control program - Google Patents

Moving image processing device, display device, moving image processing method, and control program

Info

Publication number
WO2018123202A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving image
texture
unit
motion vector
image processing
Prior art date
Application number
PCT/JP2017/036763
Other languages
English (en)
Japanese (ja)
Inventor
直大 北城
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社
Publication of WO2018123202A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/223Analysis of motion using block-matching
    • G06T7/231Analysis of motion using block-matching using full search
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/14Picture signal circuitry for video frequency region

Definitions

  • the following disclosure relates to a moving image processing apparatus and the like.
  • Patent Document 1 discloses a technique for imparting a predetermined texture to a predetermined area in a moving image.
  • Specifically, Patent Document 1 discloses an image texture manipulation method for dynamically manipulating an image area (a second image area) to which a transparent-layer texture is imparted.
  • However, with the technique of Patent Document 1, it is not possible to perform moving image processing according to the texture of the individual objects represented in the moving image.
  • One aspect of the present disclosure has been made in view of the above problem, and an object thereof is to realize a moving image processing apparatus or the like that can enhance the texture of an object represented in a moving image.
  • To solve the above problem, a moving image processing apparatus according to one aspect of the present disclosure includes a texture determination unit that determines the texture of an object represented in a moving image by analyzing a motion vector of the moving image, and a moving image processing unit that processes the moving image according to the determination result of the texture determination unit.
  • To solve the above problem, a moving image processing method according to one aspect of the present disclosure includes a texture determination step of determining the texture of an object represented in a moving image by analyzing a motion vector of the moving image, and a moving image processing step of processing the moving image according to the determination result of the texture determination step.
  • According to the moving image processing apparatus of one aspect of the present disclosure, the texture of an object represented in a moving image can be enhanced.
  • FIG. 1 is a functional block diagram illustrating the configuration of the main part of the display device according to Embodiment 1.
  • FIG. 2 is a schematic diagram for explaining a motion vector.
  • FIG. 3 is a diagram showing an example of a second motion vector set.
  • FIG. 4(a) and FIG. 4(b) are diagrams for explaining the relationship between the viscosity and the texture of a liquid.
  • FIG. 5 is a diagram showing an example of an HMM.
  • FIG. 6(a) and FIG. 6(b) are diagrams for explaining the moving image processing in the display device of FIG. 1.
  • FIG. 7 is a functional block diagram schematically showing the configuration of the signal processing unit according to Embodiment 2 and its periphery.
  • FIG. 1 is a functional block diagram illustrating a configuration of a main part of the display device 1.
  • the display device 1 includes a signal processing unit 10 (moving image processing device), a receiving unit 60, a decoding unit 61, a display unit 70, and a storage unit 90.
  • the display device 1 may be a television or a PC (Personal Computer).
  • the display device 1 may be a portable information terminal such as a multifunction mobile phone (smartphone) or a tablet.
  • the signal processing unit 10 processes a moving image (input moving image) and outputs the processed moving image (output moving image) to the display unit 70.
  • the display unit 70 displays a moving image.
  • the display unit 70 may be, for example, a liquid crystal display or an organic EL (Electro-Luminescence) display.
  • the signal processing unit 10 is provided as a part of a control unit (not shown) that comprehensively controls each unit of the display device 1.
  • the function of the control unit may be realized by a CPU (Central Processing Unit) executing a program stored in the storage unit 90.
  • the function of each part of the signal processing unit 10 will be described in detail later.
  • the storage unit 90 stores various programs executed by the signal processing unit 10 and data used by the programs.
  • the storage unit 90 stores a feature pattern model DB (DataBase) 91.
  • the feature pattern model DB 91 is a DB in which various feature pattern models (described later) are stored.
  • the receiving unit 60 receives broadcast waves (radio waves).
  • the decoding unit 61 acquires compressed moving image data (moving image data compressed by a predetermined encoding method) included in the broadcast wave received by the receiving unit 60. Subsequently, the decoding unit 61 acquires an input moving image (input video signal) by decoding the compressed moving image data. Then, the decoding unit 61 supplies the acquired input moving image to the signal processing unit 10 (more specifically, the first correction unit 11 described later).
  • an input moving image (moving image input to the signal processing unit 10) is also referred to as a moving image A.
  • the moving image A is a moving image to be processed in the signal processing unit 10.
  • the moving image A may have a resolution of 4K2K (resolution of 3840 horizontal pixels ⁇ 2160 vertical pixels).
  • the resolution of each moving image described in the first embodiment is not limited to the above, and may be set as appropriate.
  • the receiving unit 60 and the decoding unit 61 may be provided as an integrated functional unit.
  • a known tuner can be used as the receiving unit 60 and the decoding unit 61.
  • the signal processing unit 10 may acquire the moving image A from the storage unit 90.
  • the signal processing unit 10 may acquire the moving image A from an external device (for example, a digital movie camera) connected to the display device 1.
  • the signal processing unit 10 processes the moving image A supplied from the receiving unit 60 to generate an output moving image (output video signal). Then, the signal processing unit 10 (more specifically, a texture correction unit 14 described later) supplies the output moving image to the display unit 70. According to this configuration, the output moving image can be displayed on the display unit 70.
  • a display control unit (not shown) that controls the operation of the display unit 70 may be provided in the signal processing unit 10 or may be provided in the display unit 70 itself.
  • the signal processing unit 10 includes a first correction unit 11, a frame rate conversion unit 12, a texture detection unit 13 (texture determination unit), and a texture correction unit 14 (moving image processing unit).
  • the texture detection unit 13, the texture correction unit 14, and the feature pattern model DB 91 are main parts of the moving image processing apparatus according to an aspect of the present disclosure.
  • the texture detection unit 13, the texture correction unit 14, and the feature pattern model DB 91 may be collectively referred to as “texture processing unit”.
  • In FIG. 1, the texture processing unit is indicated by a dotted line for convenience of explanation.
  • the first correction unit 11 processes the moving image A described above.
  • the moving image after processing in the first correction unit 11 is also referred to as a moving image B.
  • the process in the first correction unit 11 may be a known image quality correction process.
  • the first correction unit 11 may perform scaling (resolution change) on the moving image A.
  • the resolution of the moving image displayed on the display unit 70 can be converted into a resolution according to the performance specifications of the display unit 70.
  • Note that the first correction unit 11 is not an essential component of the signal processing unit 10. For example, if the resolution of the moving image A already conforms to the performance specifications of the display unit 70, the first correction unit 11 need not generate the moving image B (convert the resolution).
  • The first correction unit 11 may set image quality parameters of the moving image A (e.g., parameters indicating the degree of brightness, contrast, color density, peaking, outline enhancement, etc.) according to the user's operation. In this case, the first correction unit 11 processes the moving image A using the set image quality parameters. For example, when the image quality parameters are arbitrarily selected by the user according to the user's usage mode, the first correction unit 11 may operate as described above.
  • the first correction unit 11 supplies the moving image B to the frame rate conversion unit 12 (more specifically, each of the interpolation image generation unit 121 and the motion vector calculation unit 122 described below).
  • the moving image A may be supplied from the decoding unit 61 to the frame rate conversion unit 12.
  • the frame rate conversion unit 12 includes an interpolated image generation unit 121 and a motion vector calculation unit 122.
  • the interpolated image generation unit 121 performs processing for increasing the frame rate of the moving image B. Specifically, the interpolated image generation unit 121 extracts each of a plurality of frames constituting the moving image B from the moving image B. Each frame extracted by the interpolated image generation unit 121 may be stored, for example, in a frame memory (not shown).
  • the frame memory may be provided in the frame rate conversion unit 12 or may be provided outside the frame rate conversion unit 12.
  • the interpolated image generation unit 121 generates an interpolation frame (intermediate frame) based on the frame using a known algorithm. For example, the interpolated image generation unit 121 may generate an interpolation frame using a motion vector described below. Then, the interpolated image generation unit 121 increases the frame rate of the moving image B by inserting interpolation frames into the moving image B at predetermined frame intervals.
  • the processed moving image in the interpolated image generation unit 121 is also referred to as a moving image C.
  • the frame rate conversion unit 12 may increase the frame rate of the moving image B by a factor of two. For example, when the frame rate of the moving image B is 60 fps (frames per second), the interpolated image generating unit 121 generates a moving image C having a frame rate of 120 fps. Then, the interpolated image generation unit 121 supplies the moving image C to the texture correction unit 14 (more specifically, the second correction unit 142 described below).
  • the conversion rate of the frame rate in the frame rate conversion unit 12 is not limited to the above, and may be set as appropriate. Further, the frame rate of each moving image described in the first embodiment is not limited to the above.
  • By providing the interpolated image generation unit 121, the frame rate of the moving image displayed on the display unit 70 can be converted into one according to the performance specifications of the display unit 70.
  • Note that the interpolated image generation unit 121 is not an essential component of the signal processing unit 10. For example, if the frame rate of the moving image B (moving image A) already conforms to the performance specifications of the display unit 70, the interpolated image generation unit 121 need not generate the moving image C (convert the frame rate).
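  • As a purely illustrative sketch (the disclosure only says that a known algorithm may be used), the following Python/NumPy fragment shows one simple way an interpolation frame could be synthesized from two frames and their block motion vectors. The function name, block size, and half-vector placement scheme are assumptions, and production concerns such as hole filling and overlap blending are omitted.

```python
import numpy as np

def interpolate_frame(prev, curr, mvs, block=16):
    """Synthesize an intermediate frame between prev and curr (grayscale).

    mvs[by, bx] is the (dy, dx) motion vector of block (bx, by), i.e.
    where that block of prev has moved to in curr. Each block is placed
    halfway along its trajectory; hole filling and overlap blending,
    which a production interpolator needs, are omitted."""
    h, w = prev.shape
    out = prev.copy()                      # fallback where no block lands
    for by in range(h // block):
        for bx in range(w // block):
            dy, dx = mvs[by, bx]
            y, x = by * block, bx * block
            y2 = int(np.clip(y + dy, 0, h - block))      # position in curr
            x2 = int(np.clip(x + dx, 0, w - block))
            ym = int(np.clip(y + dy / 2, 0, h - block))  # halfway position
            xm = int(np.clip(x + dx / 2, 0, w - block))
            out[ym:ym + block, xm:xm + block] = (
                prev[y:y + block, x:x + block].astype(np.uint16)
                + curr[y2:y2 + block, x2:x2 + block]
            ) // 2
    return out

# Inserting one such frame between every pair of frames doubles the
# frame rate, e.g. 60 fps -> 120 fps as in the example above.
```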
  • the motion vector calculation unit 122 calculates (detects) a motion vector by analyzing the moving image B (more specifically, each frame of the moving image B stored in the frame memory). A known algorithm may be used to calculate the motion vector in the motion vector calculation unit 122.
  • When the interpolated image generation unit 121 is excluded from the frame rate conversion unit 12, a function of extracting each frame from the moving image B may be given to the motion vector calculation unit 122. Further, as shown in FIG. 8 and the like described later, the motion vector calculation unit 122 can also be excluded from the signal processing unit 10. That is, the frame rate conversion unit 12 itself is not an essential component of the signal processing unit 10.
  • A motion vector is a vector indicating the positional displacement between a block (more specifically, a virtual object located in the block) in one frame (e.g., a reference frame) and the corresponding block in another frame subsequent to that frame (e.g., the next frame).
  • a motion vector is a vector indicating to which position a block in one frame has moved in another subsequent frame.
  • the motion vector is used as an index indicating the movement amount of the block.
  • FIG. 2 is a schematic diagram for explaining a motion vector.
  • each frame included in the moving image B is uniformly divided into blocks having a horizontal length (resolution) a and a vertical length b.
  • the horizontal pixel number of the moving image B is represented as H
  • the vertical pixel number is represented as V.
  • Each frame is divided into (H/a) blocks in the horizontal direction and (V/b) blocks in the vertical direction. That is, each frame is divided into (H/a) × (V/b) blocks. The values of a, b, H, and V may be set arbitrarily.
  • one of the blocks in FIG. 2 is represented as a block (x, y).
  • x and y are indices (numbers) indicating horizontal and vertical positions in each frame, respectively.
  • Let the block located at the upper left in FIG. 2 be block (0, 0).
  • The block number is set so that it increases by 1 from left to right in the horizontal direction and by 1 from top to bottom in the vertical direction. Therefore, 0 ≤ x ≤ H/a − 1 and 0 ≤ y ≤ V/b − 1.
  • the motion vector of the block (x, y) is represented as MV (x, y).
  • The motion vector calculation unit 122 calculates a motion vector for each block in FIG. 2.
  • a set of motion vectors calculated by the motion vector calculation unit 122 for each block of one frame is referred to as a first motion vector set.
  • the motion vector calculation unit 122 supplies the first motion vector set described above to the interpolation image generation unit 121 and the texture detection unit 13 (more specifically, the extraction unit 131 described below).
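  • Since the disclosure leaves the motion vector algorithm open ("a known algorithm may be used") and the classification above cites full-search block matching (G06T7/231), the following Python/NumPy sketch illustrates that textbook approach; the block size, search range, and SAD criterion are assumptions.

```python
import numpy as np

def first_motion_vector_set(ref, nxt, a=16, b=16, search=8):
    """One motion vector MV(x, y) per (a x b) block of the reference
    frame, found by full search: try every displacement in a
    +/-search window of the next frame and keep the one minimizing
    the sum of absolute differences (SAD)."""
    V, H = ref.shape                                      # vertical x horizontal pixels
    mvs = np.zeros((V // b, H // a, 2), dtype=np.int32)   # (dy, dx) per block
    for y in range(0, V - b + 1, b):
        for x in range(0, H - a + 1, a):
            blk = ref[y:y + b, x:x + a].astype(np.int32)
            best, best_dv = None, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= V - b and 0 <= xx <= H - a:
                        sad = int(np.abs(blk - nxt[yy:yy + b, xx:xx + a]).sum())
                        if best is None or sad < best:
                            best, best_dv = sad, (dy, dx)
            mvs[y // b, x // a] = best_dv
    return mvs
```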
  • The texture detection unit 13 includes an extraction unit 131 and a collation unit 132. As described below, the texture detection unit 13 determines the texture of each object represented in a moving image (for example, the moving image B) by analyzing each motion vector of the moving image. Then, the texture detection unit 13 supplies the determination result (a texture ID described later) to the texture correction unit 14. Hereinafter, each part of the texture detection unit 13 will be described concretely.
  • the extraction unit 131 extracts (acquires) a part (subset) of the first motion vector set described above.
  • this subset is referred to as a second motion vector set.
  • FIG. 3 is a diagram illustrating an example of the second motion vector set.
  • The extraction unit 131 may extract, as a partial area, an area composed of blocks (m, n) to (m + A − 1, n + B − 1) in each frame.
  • the values of m, n, A, and B may be arbitrary values as long as the partial area is set so as not to deviate spatially from each frame.
  • the extraction unit 131 acquires a motion vector of each block in the partial area. That is, the extraction unit 131 acquires, as the second motion vector set, the motion vector set of each block in the partial area in the first motion vector set. Then, the extraction unit 131 supplies the second motion vector set to the collation unit 132.
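  • In code, extracting the second motion vector set amounts to a simple slice of the first motion vector set; the helper below is a hypothetical illustration using the (m, n, A, B) notation above.

```python
def second_motion_vector_set(mvs, m, n, A, B):
    """Extract the motion vectors of blocks (m, n) .. (m + A - 1, n + B - 1).
    Rows of `mvs` are indexed by y (so by n and B), columns by x (m and A);
    the caller must keep the partial area inside the frame."""
    assert 0 <= n and n + B <= mvs.shape[0]
    assert 0 <= m and m + A <= mvs.shape[1]
    return mvs[n:n + B, m:m + A]
```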
  • the collation unit 132 compares (collates) the second motion vector set (motion vector) with various feature pattern models included in the feature pattern model DB 91 described above.
  • the “feature pattern model” may be a model representing a motion vector (more specifically, a set of motion vectors) representing the texture of an object represented in a moving image.
  • the feature pattern model may be a model related to a motion vector (a set of motion vectors) representing the texture.
  • For example, the feature pattern model is a set of motion vectors derived (set) in advance by performing learning (automatic learning) on second motion vector sets using a pattern recognition technique.
  • the feature pattern model may be set in advance in the display device 1.
  • the matching unit 132 can determine the texture of an object based on the feature pattern model.
  • In Embodiment 1, a case is illustrated in which the collation unit 132 also has a function as a functional unit (pattern setting unit) for setting the feature pattern model.
  • the pattern setting unit may be provided as a functional unit separate from the matching unit 132.
  • the pattern setting unit performs the above learning using a known algorithm and sets a feature pattern model.
  • the pattern setting unit stores the feature pattern model in the feature pattern model DB 91.
  • a feature pattern model corresponding to the texture of each object can be stored in the feature pattern model DB 91.
  • The "texture" in this specification means a sensation perceived by a person (user, viewer), and, among such sensations, particularly one perceived through dynamic change, such as a feeling of glossiness or a feeling of material.
  • Such sensations are expressed by onomatopoeic or mimetic words such as "sarasara" (smooth), "nebaneba" (sticky), "yurayura" (swaying), "fuwafuwa" (fluffy), and "kirakira" (glittering).
  • the texture in this specification does not necessarily have to directly specify the material of the object (eg, metal or paper).
  • The texture in this specification may be understood as visual texture in a general sense.
  • FIG. 3 described above schematically shows an example of the second motion vector set when the object represented in the moving image is a liquid having high viscosity (eg, oil).
  • the motion vector included in the second motion vector set is one of indices indicating the liquid flow velocity.
  • The flow velocity generally depends on the viscosity of the fluid. Therefore, the magnitude of the viscosity of a liquid can be distinguished from the motion vector, and hence the difference in texture according to viscosity can also be distinguished from the motion vector.
  • FIG. 4A shows a liquid having a low viscosity (for example, water).
  • When a person sees a low-viscosity liquid flowing (moving), the liquid generally flows at a relatively high velocity, and thus the person commonly perceives a smooth ("sarasara") sensation.
  • A liquid having a high viscosity (e.g., oil) is shown in FIG. 4(b).
  • When a person sees such a liquid flowing (moving), the liquid flows at a relatively low velocity, and thus the person commonly perceives a sticky ("nebaneba") sensation.
  • the liquid generally moves smoothly (is easily deformed) in a natural state as compared with a solid. That is, the movement pattern differs greatly between liquid and solid. Even if it is a solid, the movement pattern varies depending on, for example, the difference in rigidity. From this, it can be said that each object has a peculiar movement pattern according to the texture.
  • the surface of the object reflects the light, so that a luminance distribution of reflected light is formed on the surface.
  • a person perceives the glossiness (sensation of “glitter”) of an object by the luminance distribution.
  • the person perceives the glossiness of the object more specifically by the movement (change) of the reflected light (specular reflection component or diffuse reflection component) accompanying the change of the viewpoint or the movement of the object.
  • a motion vector indicating the pattern of reflected light movement can also be used as one index indicating the glossiness of the object.
  • the motion vector of the moving image can be used as one of indices indicating the texture of the object (particularly the surface of the object) expressed in the moving image.
  • The inventor of the present application arrived at the novel technical idea (finding) of "determining (detecting, estimating) the texture of an object represented in a moving image based on the motion vector of the moving image, and processing the moving image according to the determination result."
  • Each component of the texture processing unit described above (for example, the texture detection unit 13 and the texture correction unit 14 included in the signal processing unit 10) is conceived based on the technical idea.
  • the collation unit 132 calculates (evaluates) the relevance (matching degree) of the second motion vector set (motion vector) to the feature pattern model. That is, the collation unit 132 acquires information indicating how much the texture of the object indicated by the second motion vector set matches the texture of the object indicated by the feature pattern model.
  • This information is hereinafter referred to as a texture ID (texture discrimination information).
  • the texture ID may be understood as information indicating a result of determining the texture of each object expressed in the moving image (eg, moving image B) by the matching unit 132.
  • the texture detection unit 13 supplies the texture ID to the texture correction unit 14 (more specifically, the parameter setting unit 141 described below).
  • the collation unit 132 may acquire the texture ID using a known pattern recognition method for a two-dimensional vector set (vector sequence). Specific examples of the pattern recognition method include the following three.
  • In the first method, a correlation function between the second motion vector set and the feature pattern model is calculated.
  • Specifically, the collation unit 132 calculates the correlation function φ(x′, y′) between MV_in and MV_database.
  • the matching unit 132 may calculate the value of the correlation function as the texture ID.
  • Here, MV_in represents the second motion vector set (the observation series), and MV_database represents the feature pattern model.
  • In the first method, the texture ID can be calculated by a simpler computation than in the second and third methods described below. For this reason, the texture ID can be calculated even when hardware resources are relatively limited (for example, when the processing performance of the processor is relatively low). Therefore, the signal processing unit 10 can be realized with a simple configuration.
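  • The exact form of the correlation function is not fixed by the disclosure; the sketch below uses a normalized inner product over the two vector fields as one plausible choice, with hypothetical helper names.

```python
import numpy as np

def correlation_score(mv_in, mv_db):
    """Degree of match between the observation series MV_in and one
    feature pattern model MV_database, both of shape (B, A, 2).
    A normalized inner product is used here as one plausible
    correlation function (the disclosure does not fix the form)."""
    u = mv_in.astype(np.float64).ravel()
    v = mv_db.astype(np.float64).ravel()
    denom = np.linalg.norm(u) * np.linalg.norm(v)
    return float(u @ v / denom) if denom else 0.0

def texture_id_by_correlation(mv_in, models):
    """`models` maps texture ID -> feature pattern model; the best
    scoring ID (or the score itself) can serve as the texture ID."""
    return max(models, key=lambda tid: correlation_score(mv_in, models[tid]))
```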
  • In the second method, an HMM (Hidden Markov Model) is used. FIG. 5 is a diagram illustrating an example of the HMM.
  • t is a symbol representing time.
  • a_ij is a state transition probability, and b_ij is an output probability.
  • In the second method, the matching unit 132 is made to calculate the probability P(Y) that the HMM outputs the observation series Y (the second motion vector set).
  • the collation unit 132 may calculate the value of the probability P (Y) as the texture ID.
  • the texture ID can be calculated with higher accuracy than in the first method described above.
  • the texture ID can be appropriately calculated even when non-local deformation (that is, deformation with continuous expansion / contraction) occurs in the object (when the motion vector distribution includes expansion / contraction).
  • In such a case, the second method may be adopted.
  • On the other hand, in the second method, an appropriate probability model needs to be set in advance by the designer of the display device 1.
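  • For concreteness, P(Y) can be computed with the standard forward algorithm; the sketch below assumes a discrete-output HMM, and quantizing motion vectors into symbols is an assumed preprocessing step not specified in the disclosure.

```python
import numpy as np

def forward_probability(obs, pi, A, B):
    """P(Y): likelihood that an HMM generates the observation series Y.

    obs : discrete symbols (e.g. quantized motion vectors), length T
    pi  : initial state distribution, shape (S,)
    A   : state transition probabilities a_ij, shape (S, S)
    B   : output probabilities per state and symbol, shape (S, K)

    Standard forward algorithm; for long series a log-space or scaled
    variant would be needed to avoid numerical underflow."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return float(alpha.sum())
```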
  • In the third method, deep learning technology is adopted. For example, using a neural network such as a CNN (Convolutional Neural Network), the pattern setting unit is made to learn the feature pattern model in advance. Then, the collation unit 132 is made to compare the above-described observation series MV_in with the feature pattern model. In this case, the collation unit 132 may output the comparison result as the texture ID.
  • the texture ID can be calculated with higher accuracy than in the second method described above.
  • If the feature pattern model is learned by the pattern setting unit using sufficient hardware resources, the texture ID can be expected to be calculated with particularly high accuracy.
  • In the third method, a feature pattern model can be obtained by appropriate learning even if the designer of the display device 1 does not specify in advance the individual features of the motion vector corresponding to each type of texture (for example, glossiness). Therefore, it is expected that texture IDs corresponding to a wide range of textures can be acquired by a more flexible method.
  • On the other hand, the third method requires particularly abundant hardware resources.
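  • The disclosure names CNNs but no architecture; the following minimal PyTorch sketch (an assumption, not the patent's design) treats the motion vector field as a two-channel image and outputs logits over texture IDs.

```python
import torch
import torch.nn as nn

class TexturePatternCNN(nn.Module):
    """Treats a motion vector field as a 2-channel image (dy, dx)
    and scores it against learned texture classes."""
    def __init__(self, num_textures):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_textures)

    def forward(self, mv_field):              # mv_field: (N, 2, B, A)
        z = self.features(mv_field).flatten(1)
        return self.classifier(z)             # logits over texture IDs

# e.g. (after training): texture_id = model(mv_batch).argmax(dim=1)
```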
  • the texture correction unit 14 includes a parameter setting unit 141 and a second correction unit 142 (correction unit).
  • the parameter setting unit 141 selects (sets) an image quality parameter based on the texture ID acquired from the matching unit 132.
  • In the parameter setting unit 141, a table of image quality parameters associated with texture IDs is set in advance.
  • the parameter setting unit 141 refers to the table and selects an image quality parameter corresponding to the texture ID. Then, the parameter setting unit 141 supplies the selected image quality parameter to the second correction unit 142.
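  • A minimal sketch of such a table lookup is shown below; the texture IDs and parameter values are invented for illustration only.

```python
# Hypothetical texture-ID -> image quality parameter table; the IDs
# and values are invented for this example.
PARAM_TABLE = {
    "glossy": {"contrast": 1.20, "brightness": 1.05, "peaking": 0.8},
    "sticky": {"contrast": 1.05, "brightness": 1.00, "peaking": 0.2},
    "fluffy": {"contrast": 0.95, "brightness": 1.10, "peaking": 0.0},
}
DEFAULT_PARAMS = {"contrast": 1.0, "brightness": 1.0, "peaking": 0.0}

def select_parameters(texture_id):
    """Parameter setting unit: look up the parameters for a texture ID,
    falling back to neutral values for unknown IDs."""
    return PARAM_TABLE.get(texture_id, DEFAULT_PARAMS)
```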
  • the second correction unit 142 may be a functional unit that performs a known image quality correction process, similarly to the first correction unit 11 described above. However, the second correction unit 142 may perform image quality correction processing (image quality correction processing mainly intended for texture reproduction) by a new method that is not known.
  • the moving image C is supplied from the interpolated image generation unit 121 to the second correction unit 142.
  • the second correction unit 142 processes the moving image C using the image quality parameter selected by the parameter setting unit 141.
  • the moving image after processing in the second correction unit 142 is also referred to as a moving image D (output moving image).
  • the second correction unit 142 supplies the moving image D to the display unit 70.
  • the second correction unit 142 can perform processing according to the texture of the object expressed in the moving image C using the image quality parameter selected by the parameter setting unit 141.
  • By the image quality correction processing, it is possible, for example, to enhance the glossiness or the material feeling of an object.
  • As a result, the second correction unit 142 can generate the moving image D with enhanced texture. Therefore, the moving image D in which the texture is more effectively expressed (more emphasized) can be provided to the user.
  • the second correction unit 142 may process the moving image C using an image quality parameter corresponding to the texture ID of the object that occupies the most part in the moving image C. That is, the second correction unit 142 may process the moving image C based on one texture ID (main texture ID).
  • the second correction unit 142 may process the moving image C using image quality parameters corresponding to the texture IDs of the plurality of objects.
  • For example, the second correction unit 142 may use, for each of the partial areas in FIG. 3, the image quality parameter corresponding to the texture ID of the object that occupies the largest part of that area. In this case, the second correction unit 142 processes each area of the moving image C corresponding to each partial area.
  • the parameter setting unit 141 may also supply the selected image quality parameter to the first correction unit 11 described above.
  • the first correction unit 11 can perform the same image quality correction process as that of the second correction unit 142 on the moving image A described above.
  • each of the first correction unit 11 and the second correction unit 142 can perform processing according to the texture of the object, so that the texture can be more effectively emphasized.
  • the second correction unit 142 may further perform processing for enhancing the texture pattern of the object expressed in the moving image C.
  • the second correction unit 142 may store in advance a table indicating the correspondence relationship between the texture ID and the texture pattern.
  • For example, the second correction unit 142 may impart a texture pattern to an object according to the texture ID.
  • the second correction unit 142 may enhance the texture pattern of the object by performing a filter process.
  • the second correction unit 142 may perform special processing such as HDR (High Dynamic Range) expansion on the object according to the texture ID.
  • the brightness of the object can be partially enhanced, so that a predetermined texture (eg, glossiness) of the object can be enhanced.
  • the second correction unit 142 may perform a process of distorting the object and the area around the object according to the texture ID. Also by this processing, the texture of the object can be enhanced. As described above, the second correction unit 142 may be configured to perform an arbitrary process for enhancing the texture of an object according to the texture ID.
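  • As one hedged example of such texture-dependent correction, the sketch below applies a contrast/brightness gain only inside an object region; the mask and parameter names are assumptions, and real implementations (HDR expansion, filtering, distortion) would be more elaborate.

```python
import numpy as np

def enhance_region(frame, mask, params):
    """Apply a contrast/brightness gain only inside the object region
    given by the boolean `mask`; partially boosting luminance in this
    way can strengthen a perceived texture such as glossiness."""
    out = frame.astype(np.float32)
    region = out[mask]
    region = (region - 128.0) * params["contrast"] + 128.0  # contrast about mid-gray
    region *= params["brightness"]
    out[mask] = region
    return np.clip(out, 0, 255).astype(np.uint8)
```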
  • the signal processing unit 10 in the display device 1 includes a texture detection unit 13 and a texture correction unit 14 (components of the above-described texture processing unit).
  • the texture detecting unit 13 determines the texture of each object expressed in the moving image by analyzing the motion vector of the moving image. Then, the texture detection unit 13 supplies the texture correction unit 14 with the texture ID indicating the determination result.
  • the texture correction unit 14 processes the moving image based on the texture ID (according to the determination result of the texture detection unit 13). That is, the moving image can be processed by the texture correction unit 14 so as to more effectively express the texture of the object. Therefore, according to the signal processing unit 10, it is possible to improve the texture of the object expressed in the moving image.
  • According to the signal processing unit 10, even if (i) the resolution of the moving image is not necessarily sufficiently high, or (ii) the moving image deteriorates during decoding in the decoding unit 61, the texture of the object can be expressed effectively. That is, a moving image that sufficiently expresses the texture of an object can be provided with a simpler configuration than before.
  • FIGS. 6(a) and 6(b) are diagrams for explaining the moving image processing in the texture correction unit 14. Specifically, each of FIGS. 6(a) and 6(b) shows an object before and after the moving image processing is performed.
  • FIG. 6(a) illustrates a case where moving image processing for enhancing the glossiness of an object (e.g., contrast adjustment, brightness adjustment, HDR expansion) is performed by the texture correction unit 14.
  • In the case of FIG. 6(a), it is understood that the motion vector is used in the processing of the texture detection unit 13 as one of the indices indicating the glossiness of the object.
  • FIG. 6(b) illustrates a case where moving image processing (e.g., contour correction) is performed by the texture correction unit 14 to enhance the "fluffy" feeling (a material feeling representing lightness) of an object. In the case of FIG. 6(b), it is understood that the motion vector is used in the processing of the texture detection unit 13 as one of the indices indicating the "fluffy" feeling of the object.
  • As described above, the moving image processing method according to one aspect of the present disclosure includes (i) a texture determination step of determining the texture of an object represented in a moving image by analyzing a motion vector of the moving image, and (ii) a moving image processing step of processing the moving image according to the determination result in the texture determination step.
  • FIG. 7 is a functional block diagram schematically showing the configuration of the signal processing unit 20 (moving image processing apparatus) and its periphery according to the second embodiment.
  • the display device of Embodiment 2 is referred to as a display device 2.
  • portions not shown are the same as those of the display device 1 of FIG. This also applies to each embodiment described below.
  • the signal processing unit 20 has a configuration in which the first correction unit 11 and the interpolated image generation unit 121 are excluded from the signal processing unit 10 of the first embodiment.
  • the above-described moving image A (input moving image) is supplied from the decoding unit 61 to each of the motion vector calculation unit 122 and the texture correction unit 14.
  • the motion vector calculation unit 122 supplies the set of motion vectors of the moving image A to the extraction unit 131 of the texture detection unit 13 as the first motion vector set. Similar to the first embodiment, the extraction unit 131 extracts a second motion vector set from the first motion vector set. As in the first embodiment, the collation unit 132 compares the second motion vector set with various feature pattern models included in the feature pattern model DB 91. Since the specific processing of the texture detection unit 13 is the same, the description thereof is omitted.
  • the texture correction unit 14 processes the moving image A based on the texture ID acquired from the matching unit 132 of the texture detection unit 13. That is, the texture correction unit 14 processes the moving image A to generate a moving image D (output moving image) and supplies the moving image D to the display unit 70.
  • That is, in the signal processing unit 20, the first correction unit 11 (image quality correction processing before the texture is determined) and the interpolated image generation unit 121 (frame interpolation) are omitted.
  • Consequently, the configuration of the moving image processing apparatus can be simplified as compared with Embodiment 1.
  • FIG. 8 is a functional block diagram schematically showing the configuration of the signal processing unit 30 (moving image processing apparatus) and its periphery according to the third embodiment. Note that the display device of Embodiment 3 is referred to as a display device 3.
  • the decoding unit 61 acquires the compressed moving image data from the receiving unit 60.
  • The compressed moving image data may include information indicating motion vectors (hereinafter, motion vector information).
  • MPEG-4 is one example of a format of compressed moving image data that includes such motion vector information.
  • the signal processing unit 30 has a configuration in which the motion vector calculation unit 122 is excluded from the signal processing unit 20 of the second embodiment. That is, in the signal processing unit 30, the configuration of the moving image processing apparatus is further simplified as compared with the second embodiment described above.
  • the moving image A (input moving image) is supplied from the decoding unit 61 to the texture correction unit 14.
  • the extraction unit 131 acquires motion vector information included in the above-described compressed moving image data from the decoding unit 61.
  • The extraction unit 131 acquires the set of motion vectors indicated in the motion vector information as the first motion vector set, and extracts a second motion vector set from it. Then, as in each of the above-described embodiments, the collation unit 132 compares the second motion vector set with the various feature pattern models contained in the feature pattern model DB 91.
  • the texture detection unit 13 in the third embodiment performs the same processing as in each of the above-described embodiments using the motion vector information included in the compressed moving image data. That is, the texture detection unit 13 according to the third embodiment analyzes a motion vector included in advance in the compressed moving image data.
  • Thus, when motion vector information is included in the compressed moving image data, the moving image processing apparatus according to one aspect of the present disclosure can omit the process of calculating motion vectors. Therefore, the configuration of the moving image processing apparatus is further simplified.
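  • One possible realization, which is an assumption and not part of the disclosure, reads the coded stream's motion vectors back at decode time via FFmpeg's export_mvs flag through the third-party PyAV bindings:

```python
import av  # PyAV (FFmpeg bindings), a third-party dependency

container = av.open("input.mp4")
stream = container.streams.video[0]
# Ask the decoder to export the motion vectors carried by the stream.
stream.codec_context.options = {"flags2": "+export_mvs"}

for frame in container.decode(stream):
    mvs = frame.side_data.get("MOTION_VECTORS")
    if mvs is not None:
        # Each entry holds the source/destination coordinates of a block,
        # from which a per-block (dx, dy) field can be assembled.
        vectors = [(mv.dst_x - mv.src_x, mv.dst_y - mv.src_y) for mv in mvs]
```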
  • FIG. 9 is a functional block diagram schematically showing the configuration of the signal processing unit 30v (moving image processing apparatus) and its periphery according to the present modification. Note that the display device of this modification is referred to as a display device 3v.
  • FIG. 9 a signal processing unit 30v as an example of a variation of the signal processing unit 30 of the third embodiment is illustrated for convenience of explanation.
  • the configuration of the present modification may be applied to the above-described first and second embodiments and the later-described fourth embodiment.
  • auxiliary information may be further input to the texture detection unit 13 (more specifically, each of the extraction unit 131 and the collation unit 132).
  • the auxiliary information means information other than the motion vector included in the moving image.
  • Examples of the auxiliary information include information indicating an object boundary, a color, a texture pattern, and the like.
  • The texture detection unit 13 may determine the texture of the object represented in the moving image by using auxiliary information in addition to the motion vector information (motion vectors). According to this configuration, the texture can be determined in further consideration of the shape, color, and the like of the object indicated by the auxiliary information, so the texture is expected to be determined with higher accuracy.
  • FIG. 10 is a functional block diagram schematically illustrating the configuration of the signal processing unit 40 (moving image processing apparatus) and its periphery according to the fourth embodiment. Note that the display device of Embodiment 4 is referred to as a display device 4.
  • In FIG. 10, the signal processing unit 40 is illustrated as an example of a variation of the signal processing unit 30 of Embodiment 3 for convenience of explanation.
  • the configuration of the fourth embodiment may be applied to any of the above-described embodiments and modifications.
  • In each of the above-described embodiments, the texture of an object is determined using the motion vectors calculated for one frame (e.g., a reference frame) included in a moving image. That is, when the current frame is the Nth frame (N is an integer equal to or greater than 2), the texture of the object is determined using only the motion vectors in the Nth frame.
  • However, the texture of the object may be determined by further using the motion vectors in past frames. That is, the texture of the object may be determined using the motion vector history in past frames.
  • the fourth embodiment exemplifies a configuration in which a motion vector history 92 described below is added to the above-described texture processing unit.
  • the signal processing unit 40 of the fourth embodiment shows a configuration in which the motion vector history 92 is further stored in the storage unit 90 in the signal processing unit 30 of the third embodiment.
  • the motion vector history 92 stores a set of motion vectors in the first to N ⁇ 1th frames. That is, the motion vector history 92 stores a history of a set of motion vectors (first motion vector set) in a past frame.
  • the decoding unit 61 stores motion vector information indicating the set of motion vectors described above in the motion vector history 92 for each frame of a moving image. That is, the motion vector history 92 stores motion vector information (hereinafter also referred to as “past motion vector information”) indicating a set of motion vectors in the first to (N ⁇ 1) th frames.
  • the extraction unit 131 acquires the first motion vector set of the current (Nth) frame as in the third embodiment. Specifically, the extraction unit 131 acquires, from the decoding unit 61, motion vector information indicating a set of motion vectors in the current frame as a first motion vector set.
  • the extraction unit 131 further acquires past motion vector information included in the motion vector history 92.
  • The texture detection unit 13 can thus determine the texture of an object by using the motion vector history in past frames in addition to the motion vectors in the current frame.
  • the main elements that characterize the texture of an object include the pattern of movement of the object or the pattern of movement of reflected light. Therefore, by paying attention to the motion vector history in the past frame, the temporal transition of each pattern can be further considered. Therefore, it is expected that the texture can be discriminated with higher accuracy.
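  • A minimal sketch of the motion vector history 92 as a bounded buffer follows; the container choice and class name are assumptions for illustration.

```python
from collections import deque

class MotionVectorHistory:
    """Bounded store of the first motion vector sets of past frames,
    so the texture detection unit can use their temporal transition."""
    def __init__(self, depth):
        self.frames = deque(maxlen=depth)   # keeps only the newest `depth` sets

    def push(self, mv_set):
        self.frames.append(mv_set)

    def series(self):
        """Oldest-to-newest list, e.g. to feed an HMM as a time series."""
        return list(self.frames)
```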
  • The control blocks of the display devices 1 to 4 (particularly the signal processing units 10 to 40) may be realized by a logic circuit (hardware) formed in an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the display devices 1 to 4 include a CPU that executes instructions of a program, which is software realizing each function, a ROM (Read Only Memory) or a storage device (these are referred to as a "recording medium") in which the program and various data are recorded so as to be readable by the computer (or CPU), a RAM (Random Access Memory) into which the program is loaded, and the like. The object of one aspect of the present disclosure is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one aspect of the present disclosure can also be realized in the form of a data signal embedded in a carrier wave in which the program is embodied by electronic transmission.
  • A moving image processing apparatus (signal processing unit 10) according to aspect 1 of the present disclosure includes a texture determination unit (texture detection unit 13) that determines the texture of an object represented in a moving image by analyzing a motion vector of the moving image, and a moving image processing unit (texture correction unit 14) that processes the moving image according to the determination result of the texture determination unit.
  • The inventor of the present application found the novel technical idea of "determining the texture of an object represented in a moving image based on the motion vector of the moving image, and performing moving image processing according to the determination result." The above configuration was conceived based on this technical idea.
  • the texture determination unit can determine the texture of the object expressed in the moving image by analyzing the motion vector of the moving image. Then, the moving image processing unit performs moving image processing according to the determination result. That is, it is possible to cause the moving image processing unit to perform moving image processing so as to more effectively express the texture of the object. Therefore, it is possible to enhance the texture of the object expressed in the moving image.
  • In a moving image processing apparatus according to aspect 2 of the present disclosure, in aspect 1, a feature pattern model, which is a model representing the motion vector representing the texture or a model related to that motion vector, is preferably set in advance, and the texture determination unit preferably determines the texture by comparing the motion vector of the moving image with the feature pattern model.
  • In a moving image processing apparatus according to aspect 3 of the present disclosure, in aspect 2, the texture determination unit may generate, as the determination result, texture discrimination information indicating the degree of coincidence of the motion vector of the moving image with the feature pattern model.
  • In a moving image processing apparatus according to aspect 4 of the present disclosure, in any one of aspects 1 to 3, the texture determination unit preferably analyzes the motion vector included in advance in compressed moving image data, which is data obtained by compressing the moving image.
  • the configuration of the moving image processing apparatus can be simplified.
  • A display device according to aspect 5 of the present disclosure preferably includes the moving image processing apparatus according to any one of aspects 1 to 4.
  • A moving image processing method according to aspect 6 of the present disclosure includes a texture determination step of determining the texture of an object represented in a moving image by analyzing a motion vector of the moving image, and a moving image processing step of processing the moving image according to the determination result in the texture determination step.
  • The moving image processing apparatus according to each aspect of the present disclosure may be realized by a computer. In this case, a control program for the moving image processing apparatus that realizes the moving image processing apparatus by the computer by causing the computer to operate as each unit (software element) included in the moving image processing apparatus, and a computer-readable recording medium on which the control program is recorded, also fall within the scope of one aspect of the present disclosure.
  • [Reference signs] 1, 2, 3, 3v, 4: display device; 10, 20, 30, 30v, 40: signal processing unit (moving image processing apparatus); 13: texture detection unit (texture determination unit); 14: texture correction unit (moving image processing unit)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The texture of an object represented in a moving image is enhanced. A signal processing unit (10) includes a texture detection unit (13) that analyzes the motion vector of a moving image to determine the texture of an object represented in the moving image, and a texture correction unit (14) that performs moving image processing corresponding to the determination result of the texture detection unit (13).
PCT/JP2017/036763 2016-12-28 2017-10-11 Moving image processing device, display device, moving image processing method, and control program WO2018123202A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-256284 2016-12-28
JP2016256284 2016-12-28

Publications (1)

Publication Number Publication Date
WO2018123202A1 true WO2018123202A1 (fr) 2018-07-05

Family

ID=62707956

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036763 WO2018123202A1 (fr) 2016-12-28 2017-10-11 Moving image processing device, display device, moving image processing method, and control program

Country Status (1)

Country Link
WO (1) WO2018123202A1 (fr)



Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004261272A * 2003-02-28 2004-09-24 Oki Electric Ind Co Ltd Bodily sensation device, motion signal generation method, and program
JP2008257382A * 2007-04-03 2008-10-23 Nippon Telegr & Teleph Corp <Ntt> Motion detection device, motion detection method, and motion detection program
JP2012104018A * 2010-11-12 2012-05-31 Hitachi Kokusai Electric Inc Image processing device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
KAWABE, TAKAHIRO ET AL.: "Science and control of texture recognition - Exploration of motion information in images that brings a liquid texture", NTT GIJUTU JOURNAL, vol. 26, no. 9, 1 September 2014 (2014-09-01), pages 27 - 31 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020022362A1 * 2018-07-24 2020-01-30 国立研究開発法人国立精神・神経医療研究センター Motion detection device, feature detection device, fluid detection device, motion detection system, motion detection method, program, and information storage medium
JP2021010109A * 2019-07-01 2021-01-28 日本放送協会 Frame rate conversion model learning device, frame rate conversion device, and programs therefor
JP7274367B2 2019-07-01 2023-05-16 日本放送協会 Frame rate conversion model learning device, frame rate conversion device, and programs therefor

Similar Documents

Publication Publication Date Title
US11544831B2 (en) Utilizing an image exposure transformation neural network to generate a long-exposure image from a single short-exposure image
US10755391B2 (en) Digital image completion by learning generation and patch matching jointly
CN109379550B (zh) Video frame rate up-conversion method and system based on convolutional neural network
US10832069B2 (en) Living body detection method, electronic device and computer readable medium
CN108921225B (zh) Image processing method and apparatus, computer device, and storage medium
US9785865B2 (en) Multi-stage image classification
CN111869220B (zh) Electronic device and control method therefor
US10430694B2 (en) Fast and accurate skin detection using online discriminative modeling
US10742990B2 (en) Data compression system
US8605957B2 (en) Face clustering device, face clustering method, and program
CN111402143A (zh) Image processing method, apparatus, device, and computer-readable storage medium
EP3185176A1 (fr) Method and device for synthesizing an image of a partially occluded face
JP2020518191A (ja) Quantization parameter prediction maintaining visual quality, using a deep neural network
KR20170047167A (ko) Method by which an electronic device transforms the facial impression in a video, and the electronic device
CN114339409B (zh) Video processing method and apparatus, computer device, and storage medium
CN109413510B (zh) Video summary generation method and apparatus, electronic device, and computer storage medium
US20200034617A1 (en) Processing image data to perform object detection
US20220156987A1 (en) Adaptive convolutions in neural networks
US20240126810A1 (en) Using interpolation to generate a video from static images
US20220164934A1 (en) Image processing method and apparatus, device, video processing method and storage medium
CN110503002B (zh) Face detection method and storage medium
WO2018123202A1 (fr) Moving image processing device, display device, moving image processing method, and control program
CN114529785A (zh) Model training method, video generation method and apparatus, device, and medium
US20240169701A1 (en) Affordance-based reposing of an object in a scene
KR20190001444A (ko) Motion prediction method and apparatus for generating an interpolation frame

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17887880

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17887880

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP