WO2015034061A1 - Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system - Google Patents


Info

Publication number
WO2015034061A1
Authority
WO
WIPO (PCT)
Prior art keywords
encoding
area
unit
picture
image
Prior art date
Application number
PCT/JP2014/073532
Other languages
French (fr)
Japanese (ja)
Inventor
Ryoji Hattori (服部 亮史)
Yoshimi Moriya (守屋 芳美)
Akira Minezawa (峯澤 彰)
Kazuyuki Miyazawa (宮澤 一之)
Shunichi Sekiguchi (関口 俊一)
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date
Filing date
Publication date
Priority: JP2013-185196
Application filed by Mitsubishi Electric Corporation
Publication of WO2015034061A1

Classifications

    • H04N19/46 — Embedding additional information in the video signal during the compression process
    • H04N19/463 — Embedding additional information by compressing encoding parameters before transmission
    • H04N19/105 — Selection of the reference unit for prediction within a chosen coding or prediction mode, e.g. adaptive choice of position and number of pixels used for prediction
    • H04N19/107 — Selection of coding mode or of prediction mode between spatial and temporal predictive coding, e.g. picture refresh
    • H04N19/114 — Adapting the group of pictures [GOP] structure, e.g. number of B-frames between two anchor frames
    • H04N19/167 — Position within a video image, e.g. region of interest [ROI]
    • H04N19/174 — Adaptive coding where the coding unit is an image region, the region being a slice, e.g. a line of blocks or a group of blocks
    • H04N19/196 — Adaptive coding specially adapted for the computation of encoding parameters, e.g. by averaging previously computed encoding parameters
    • H04N19/40 — Video transcoding, i.e. partial or full decoding of a coded input stream followed by re-encoding of the decoded output stream
    • H04N19/52 — Processing of motion vectors by predictive encoding
    • H04N19/55 — Motion estimation with spatial constraints, e.g. at image or region borders
    • H04N19/57 — Motion estimation characterised by a search window with variable size or shape
    • H04N19/573 — Motion compensation with multiple frame prediction using two or more reference frames in a given prediction direction
    • H04N19/58 — Motion compensation with long-term prediction, i.e. the reference frame for a current frame not being the temporally closest one
    • H04N19/593 — Predictive coding involving spatial prediction techniques
    • H04N19/61 — Transform coding in combination with predictive coding
    • H04N19/91 — Entropy coding, e.g. variable length coding [VLC] or arithmetic coding
    • H04N21/2343 — Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • H04N21/236 — Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data
    • H04N21/4728 — End-user interface for selecting a Region Of Interest [ROI], e.g. for requesting a higher resolution version of a selected region

Abstract

A variable-length encoding unit (23) multiplexes hint information into the entire area bit stream. The hint information includes motion vector restriction information indicating the maximum searchable range of motion vectors, GOP size restriction information indicating the maximum value of the GOP size, which is the number of pictures belonging to a GOP, and reference structure designation information indicating the pictures referenced when decoding each picture belonging to the GOP. This makes it possible to generate, without reducing the compression efficiency of the entire area bit stream, an entire area bit stream from which an efficient partial area bit stream can be produced with a low amount of computation.

Description

Moving picture coding apparatus, moving picture transcoding apparatus, moving picture coding method, moving picture transcoding method, and moving picture stream transmission system

The present invention relates to an image encoding apparatus and an image encoding method that generate encoded data by compressing and encoding an image, to a moving image transcoding device and a moving image transcoding method that generate, from the encoded data produced by the image encoding apparatus, other encoded data with different properties, and to a moving image stream transmission system that transmits and receives the encoded data generated by the image encoding apparatus.

With advances in imaging equipment, display equipment, compression encoding technology, transmission technology, and so on, distribution services for ultra-high-definition (UHD) video with resolutions exceeding HD (High Definition), for example 4K or 8K, are being studied.
Because the amount of information in ultra-high-resolution video is enormous, video signals are generally compressed with video coding technology when they are transmitted or stored.
Hereinafter, it is assumed that ultra-high-resolution video is transmitted as a bit stream compressed by a predetermined video encoding technique.

When viewing ultra-high-resolution video, the apparent size of the display device is small relative to the number of pixels in the video, so fine structures in the video (for example, character information or human faces) may be difficult to view even though they exist as information in the video.
To solve this problem, a system is conceivable in which the entire area of the transmitted ultra-high-resolution video is displayed on a main display device (for example, a large-screen TV installed in the living room), while an image of a partial area designated by the user is extracted from the entire area and transmitted to a sub display device (for example, a tablet terminal at the user's hand) for viewing.

In the above system, the partial area video is transmitted from the main display device to the sub display device, and it is desirable to transmit it in a bit stream format containing only information on the partial area video.
This is because, if the entire area bit stream of the ultra-high-resolution video is transmitted as it is, without being reduced to a partial area bit stream (a bit stream containing only information on the partial area video), the amount of transmitted information increases, and the sub display device must decode the entire area of the ultra-high-resolution video, increasing its processing load.
Therefore, it is desirable that the main display device in the above system have a transcoding function that generates an arbitrary partial area bit stream from the entire area bit stream of the ultra-high-resolution video.

As a method for generating an arbitrary partial region bit stream from the entire region bit stream, for example, the following method is conceivable.
[Method 1]
In Method 1, the main display device decodes the entire area of the ultra-high-resolution video, cuts out the decoded image of the partial area designated by the user from the decoded image of the entire area, and re-encodes the decoded image of the partial area with a predetermined video encoding technique.
The main display device then generates a partial area bit stream containing the encoded data and encoding parameters that result from this encoding.
However, because Method 1 requires re-encoding the decoded image of the partial area, the processing load of the main display device increases and the image quality deteriorates with re-encoding.
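The decode-crop-re-encode flow of Method 1 can be sketched with a toy codec. The `decode` and `encode` callables below stand in for a real video codec (an assumption of this sketch), and frames are modeled as 2D lists of pixel values:

```python
def method1_transcode(full_bitstream, region, decode, encode):
    """Method 1: fully decode the entire area, crop the user-designated
    partial area, then re-encode it (extra load and quality loss)."""
    frames = decode(full_bitstream)          # decode the whole area (heavy)
    x, y, w, h = region
    cropped = [[row[x:x + w] for row in f[y:y + h]] for f in frames]
    return encode(cropped)                   # re-encode the partial area

# One 8x8 frame with pixel value 10*row + col; identity decode/encode.
frames = [[[10 * r + c for c in range(8)] for r in range(8)]]
out = method1_transcode(frames, (2, 2, 4, 4), lambda b: b, lambda f: f)
# out[0][0][0] is the pixel at (x=2, y=2) of the original frame, i.e. 22
```

Even in this toy form, the structure shows why Method 1 is costly: every frame of the entire area passes through `decode` before a single partial-area pixel can be emitted.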

[Method 2]
Method 2 is disclosed in Patent Document 1 below: when the entire area bit stream is generated, the image is divided into tiles and references between regions of the image are cut off.
That is, the entire area bit stream is generated by dividing the image into rectangular areas called tiles and encoding each rectangular area, with the restriction that the locally decoded images and encoding parameters referred to when encoding each rectangular area make no reference (including inter-frame reference and entropy coding) across tile boundaries.
With this restriction, each tile can be decoded completely independently, so a partial area bit stream containing the encoded data and encoding parameters of the partial area can be generated by extracting, from the entire area bit stream, the encoded data and encoding parameters of the tiles that include the partial area designated by the user.
However, because Method 2 extracts encoded data and encoding parameters in units of tiles, it is inefficient when the partial area designated by the user spans multiple tiles or when the tile size is larger than the partial area: a partial area bit stream containing many areas unnecessary for display is generated.
If the tile size is reduced to improve the efficiency of partial area bit stream generation, references are cut off in many places and the compression efficiency of the entire area bit stream falls.
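The tile-granularity inefficiency of Method 2 can be quantified with a small sketch. It assumes a fixed rectangular tile grid (an assumption of this sketch, not the patent's syntax) and computes which tiles a requested region touches:

```python
def tiles_covering(region, tile_w, tile_h):
    """Return the grid indices of every tile that the requested region
    touches; Method 2 must extract all of them, whole."""
    x, y, w, h = region
    tx0, ty0 = x // tile_w, y // tile_h
    tx1, ty1 = (x + w - 1) // tile_w, (y + h - 1) // tile_h
    return [(tx, ty) for ty in range(ty0, ty1 + 1) for tx in range(tx0, tx1 + 1)]

# A 100x100 region at (50, 50) on a 64x64 tile grid touches a 3x3 block
# of tiles: the extracted stream carries 9 * 64 * 64 = 36864 pixels to
# serve a 10000-pixel request -- the waste the text describes.
tiles = tiles_covering((50, 50, 100, 100), 64, 64)
```

Shrinking `tile_w`/`tile_h` reduces this waste but multiplies the tile boundaries across which references are forbidden, which is exactly the compression-efficiency trade-off noted above.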

Patent Document 1: International Publication No. WO 2012/060459

Since conventional moving image coding apparatuses are configured as described above, dividing the entire area into rectangular areas (tiles) and coding each rectangular area with references across tile boundaries restricted can suppress the increase in processing load and the degradation of image quality. However, when the partial area designated by the user straddles a plurality of tiles, a partial area bit stream containing many areas unnecessary for display is generated, which is inefficient. Conversely, if the tile size is reduced to improve the efficiency of partial area bit stream generation, references are cut off in many places and the compression efficiency of the entire area bit stream falls.

The present invention has been made to solve the above problems, and its object is to obtain a moving picture coding apparatus and a moving picture coding method capable of generating, without reducing the compression efficiency of the entire area bit stream, an entire area bit stream suitable for generating an efficient partial area bit stream with a low amount of computation.
Another object of the present invention is to provide a moving picture transcoding device and a moving picture transcoding method capable of generating an efficient partial area bit stream with a low amount of computation.
Another object of the present invention is to obtain a moving image stream transmission system that transmits and receives encoded data generated by an image encoding device.

A moving picture coding apparatus according to the present invention includes predicted image generation means for determining a coding parameter for a coding target block in a picture belonging to a GOP (Group Of Pictures) and generating a predicted image using the coding parameter, and bit stream generation means for compressing and encoding the difference image between the coding target block and the predicted image generated by the predicted image generation means and multiplexing the resulting encoded data and the coding parameter into a bit stream. The bit stream generation means multiplexes into the bit stream hint information including motion vector restriction information indicating the searchable range of motion vectors, GOP size restriction information indicating the GOP size, which is the number of pictures belonging to the GOP, and reference structure designation information indicating the pictures referred to when decoding each picture belonging to the GOP.

According to the present invention, the bit stream generation means multiplexes into the bit stream hint information including motion vector restriction information indicating the searchable range of motion vectors, GOP size restriction information indicating the GOP size, which is the number of pictures belonging to the GOP, and reference structure designation information indicating the pictures referred to when decoding each picture belonging to the GOP. Therefore, an entire area bit stream suitable for generating an efficient partial area bit stream with a low amount of computation can be generated without reducing the compression efficiency of the entire area bit stream.
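The three kinds of hint information named above can be pictured as a simple record. The field names and list-of-lists layout below are illustrative assumptions for this sketch, not the bitstream syntax defined by the invention:

```python
from dataclasses import dataclass, field

@dataclass
class TranscodingHints:
    """Illustrative container for the multiplexed hint information."""
    max_mv_x: int       # motion vector restriction: horizontal search range (pixels)
    max_mv_y: int       # motion vector restriction: vertical search range (pixels)
    max_gop_size: int   # GOP size restriction: maximum pictures per GOP
    # ref_pictures[i]: GOP-relative indices of the pictures referenced
    # when decoding picture i (reference structure designation)
    ref_pictures: list = field(default_factory=list)

# Example: a 4-picture GOP whose picture 0 is the random access point.
hints = TranscodingHints(max_mv_x=64, max_mv_y=64, max_gop_size=4,
                         ref_pictures=[[], [0], [0, 1], [0, 2]])
```

A transcoder that receives such a record before decoding the GOP can bound both how far pixel information propagates (via the motion vector restriction) and how many pictures it must consider (via the GOP size restriction and reference structure).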

BRIEF DESCRIPTION OF THE DRAWINGS
• A block diagram showing a system to which the moving image encoding apparatus and moving image transcoding apparatus according to Embodiment 1 of the present invention are applied.
• A block diagram showing the moving image encoder 1 according to Embodiment 1 of the present invention.
• A block diagram showing the entire area stream decoding unit 3 of the moving image transcoding apparatus 2 according to Embodiment 1 of the present invention.
• A block diagram showing the partial area transcoding unit 4 of the moving image transcoding apparatus 2 according to Embodiment 1 of the present invention.
• A flowchart showing the processing (moving image encoding method) of the moving image encoder 1 according to Embodiment 1 of the present invention.
• A flowchart showing the processing of the entire area stream decoding unit 3 of the moving image transcoding apparatus 2 according to Embodiment 1 of the present invention.
• A flowchart showing the processing of the partial area transcoding unit 4 of the moving image transcoding apparatus 2 according to Embodiment 1 of the present invention.
• An explanatory drawing showing an example in which a largest coding block is hierarchically divided into multiple coding target blocks; (a) shows the distribution of partitions after division, and (b) shows how a coding mode m(B^n) is assigned by hierarchical division in a quadtree graph.
• An explanatory drawing showing the meaning of the information indicated by the GOP size restriction information and the reference structure designation information.
• A block diagram showing a system to which the moving image encoding apparatus and moving image transcoding apparatus according to Embodiment 2 of the present invention are applied.
• An explanatory drawing showing an example in which the entire area image is divided into six subpictures.
• A block diagram showing the moving image stream transmission system according to Embodiment 3 of the present invention.
• A block diagram showing the moving image stream transmission system according to Embodiment 3 of the present invention.
• A block diagram showing the moving image stream transmission system according to Embodiment 4 of the present invention.

Hereinafter, in order to explain the present invention in more detail, modes for carrying out the present invention will be described with reference to the accompanying drawings.
Embodiment 1.
In the first embodiment, a moving image encoding apparatus will be described that, by limiting the maximum value of the motion vector used in inter-frame prediction and the number of frames between random access points, performs encoding so that the propagation range of pixel value information between random access points fits within a certain range, generates an entire area bit stream, and multiplexes information indicating the maximum value of the motion vector and the limit on the number of frames between random access points into the entire area bit stream as hint information.
Also described is a moving picture transcoding apparatus that decodes the encoded data of the entire area from the entire area bit stream generated by the moving image encoding device, refers to the hint information multiplexed in the entire area bit stream to identify, among the encoded data and encoding parameters of the entire area, those necessary for correctly decoding the display area designated by the user, and reuses them to generate a partial area bit stream with a low amount of computation.
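The propagation-range limit described above has a simple geometric reading: if each inter-predicted picture can move pixel information by at most `mv_max` pixels, then a display area decoded `k` pictures after the random access point must be widened by `k * mv_max` on every side to be decodable on its own. A minimal sketch of that geometry (block-boundary alignment in a real codec is ignored here, and the function name is illustrative):

```python
def essential_region(display_region, mv_max, frames_since_rap):
    """Expand a display region (x, y, w, h) by the worst-case motion
    vector drift accumulated since the last random access point."""
    x, y, w, h = display_region
    m = mv_max * frames_since_rap
    return (x - m, y - m, w + 2 * m, h + 2 * m)
```

Because the hint information bounds both `mv_max` (motion vector restriction) and the maximum `frames_since_rap` (GOP size restriction), the transcoder can bound this expansion without decoding the whole GOP.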

FIG. 1 is a block diagram showing a system to which a moving picture coding apparatus and a moving picture transcoding apparatus according to Embodiment 1 of the present invention are applied.
In FIG. 1, the moving image encoding apparatus 1 is an apparatus that applies, to the pictures of the entire area (entire frames) of the input moving image, encoding processing in a format that can be processed by the subsequent moving image transcoding apparatus 2, thereby generating an entire area bit stream; it multiplexes transcoding hint information (described in detail later) into the entire area bit stream and outputs the entire area bit stream with the hint information multiplexed to the moving image transcoding apparatus 2.

That is, the moving picture coding apparatus 1 determines a coding parameter for a coding target block in a picture belonging to a GOP (Group Of Pictures), generates a predicted image using the coding parameter, compresses and encodes the difference image between the coding target block and the predicted image, and multiplexes the resulting encoded data and the coding parameter to generate an entire area bit stream. It then multiplexes into the entire area bit stream hint information including motion vector restriction information indicating the maximum searchable range of motion vectors, GOP size restriction information indicating the maximum value of the GOP size, which is the number of pictures belonging to the GOP, and reference structure designation information indicating the pictures referred to when decoding each picture belonging to the GOP, and outputs the entire area bit stream to the moving image transcoding apparatus 2.
Note that a GOP means a set consisting of a random access point picture and the pictures that are not random access points and that follow it in decoding order.

FIG. 10 is an explanatory diagram showing the meaning of the information indicated by the GOP size restriction information and the reference structure designation information.
The GOP size restriction information indicates the size of the GOP defined above. In a bitstream created by a conventional moving image encoding apparatus, there is a data structure called GOP, but there is no information indicating the size of the GOP. That is, the number of pictures from decoding a random access point picture to decoding the next random access point picture is the GOP size, and the decoding apparatus cannot know the GOP size until all GOPs are decoded. It was. The GOP size restriction information is multiplexed with, for example, encoded data of the first frame of the GOP, thereby having an effect of transmitting the GOP size to the decoding apparatus side before decoding all the GOPs.
The reference structure designation information describes the reference structure between pictures. A conventional moving picture coding apparatus multiplexes, for each frame, only the reference frame information of that frame. That is, for a decoding apparatus to know with what reference structure a certain GOP was encoded, it must decode the entire GOP. By multiplexing the reference structure designation information with, for example, the encoded data of the first frame of the GOP, the reference structure is conveyed to the decoding side before the entire GOP is decoded.
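As a concrete illustration, the hint information described in these paragraphs can be modeled as a small record carried alongside the first frame of each GOP. The field names below are illustrative assumptions, not bitstream syntax from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HintInfo:
    # All field names are hypothetical; they only mirror the three items of hint information.
    max_motion_vector: int          # motion vector restriction: maximum searchable range, in pixels
    max_gop_size: int               # GOP size restriction: maximum number of pictures per GOP
    reference_structure: List[int]  # reference structure designation: for picture i, the index
                                    # of the picture it references (-1 for the random access point)

# Example: a GOP of up to 4 pictures in which each picture references the
# immediately preceding one.
hint = HintInfo(max_motion_vector=64, max_gop_size=4,
                reference_structure=[-1, 0, 1, 2])
```

Because such a record travels with the first frame of the GOP, a decoder or transcoder knows the GOP size and reference structure before the rest of the GOP arrives.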

The moving image transcoding device 2 includes an entire region stream decoding unit 3 and a partial region transcoding unit 4; it decodes an image of the entire region from the entire region bitstream generated by the moving image encoding device 1 and outputs that image (hereinafter, "entire region decoded image") to the entire region display device 5.
Also, the moving picture transcoding device 2 extracts the hint information from the entire region bitstream generated by the moving picture encoding device 1 and, referring to the motion vector restriction information, GOP size restriction information, and reference structure designation information included in the hint information, specifies the essential encoding area, which is the area necessary for decoding the display area of the picture indicated by display area information given from outside.
Furthermore, the moving picture transcoding device 2 extracts the encoded data and encoding parameters of the encoding target blocks included in the essential encoding area from the entire region bitstream generated by the moving image encoding device 1, and generates from that encoded data and those encoding parameters a partial region bitstream that conforms to a preset encoding codec.

The entire region stream decoding unit 3 extracts the encoded data, encoding parameters, and hint information of the entire region included in the entire region bitstream generated by the moving image encoding device 1, decodes the entire region decoded image from the encoded data and encoding parameters, outputs the entire region decoded image to the partial region transcoding unit 4 and the entire region display device 5, and outputs the encoded data, encoding parameters, and hint information of the entire region to the partial region transcoding unit 4.
The partial region transcoding unit 4 refers to the motion vector restriction information, GOP size restriction information, and reference structure designation information included in the hint information output from the entire region stream decoding unit 3, and specifies the essential encoding area, which is the area necessary for decoding the display area of the picture indicated by display area information given from outside.
The all region stream decoding unit 3 and the partial region transcoding unit 4 constitute essential encoding region specifying means.

The partial region transcoding unit 4 also extracts, from the encoded data and encoding parameters of the entire region output from the entire region stream decoding unit 3, the encoded data and encoding parameters of the encoding target blocks included in the essential encoding area, and generates from them a partial region bitstream that conforms to a preset encoding codec.
The partial area transcoding unit 4 constitutes parameter extraction means and partial area stream generation means.

The whole area display device 5 is a display device that displays the whole area decoded image output from the whole area stream decoding unit 3.
The moving picture decoding device 6 is a device that decodes the partial region image from the partial region bitstream output from the partial region transcoding unit 4 and outputs that image (hereinafter, "partial region decoded image") to the partial region display device 7.
The partial region display device 7 is a display device that displays the partial region decoded image output from the video decoding device 6.

As a specific operation example, a case will be described in which the moving image transcoding device 2 is built in a stationary TV capable of receiving and reproducing ultra-high resolution video.
In this case, the moving image encoding apparatus 1 is an encoder apparatus that exists on the system side that distributes an ultra-high resolution video and generates an all-region bit stream to be distributed.
Therefore, the entire area bit stream generated by the moving image encoding apparatus 1 that is an encoder apparatus is distributed to a stationary TV via a predetermined transmission system.
The moving image transcoding device 2 built into the stationary TV receives the entire region bitstream distributed from the moving image encoding device 1, decodes the entire region decoded image from the entire region bitstream, and displays the entire region decoded image on the entire region display device 5.

The stationary TV here can exchange data with the viewer's tablet terminal. When the user operates the tablet terminal and designates an arbitrary display area, display area information indicating that area is input to the moving picture transcoding device 2 in the stationary TV; the moving picture transcoding device 2 then generates a partial region bitstream containing the encoding parameters necessary for reproducing the display area designated by the user, and transmits the partial region bitstream to the tablet terminal.
The tablet terminal incorporates the moving image decoding device 6, which receives the partial region bitstream transmitted from the moving picture transcoding device 2, decodes the partial region decoded image from the partial region bitstream, and displays the partial region decoded image on the partial region display device 7.
In the tablet terminal, the partial region decoded image can be displayed in an appropriately enlarged manner.
As described above, by using the system shown in FIG. 1, the user can appropriately enlarge and display an arbitrary partial area on the tablet terminal at hand while viewing an ultra-high resolution TV image.

FIG. 2 is a block diagram showing a moving picture coding apparatus 1 according to Embodiment 1 of the present invention.
In FIG. 2, the encoding control unit 11 receives information such as the motion vector restriction information, GOP size restriction information, and reference structure designation information, determines the encoding block size, which is the size of an encoding target block in a picture (input image) belonging to the GOP, and outputs the encoding block size to the block dividing unit 12.
Also, the encoding control unit 11 performs a process of determining an encoding parameter based on GOP size restriction information, reference structure designation information, and the like.
That is, the encoding control unit 11 determines, as encoding parameters, the encoding mode of each encoding target block (intra encoding mode, inter encoding mode, PCM (Pulse Code Modulation) encoding mode), prediction parameters (intra prediction parameters, inter prediction parameters), and PCM encoding parameters.
In addition, the encoding control unit 11 determines, as an encoding parameter, a prediction difference encoding parameter to be referred to during orthogonal transform processing, quantization processing, and the like, and outputs it to the transform/quantization unit 18, the inverse quantization/inverse transform unit 19, and the variable length coding unit 23; it also determines, as an encoding parameter, a loop filter parameter to be referred to during filtering processing, and outputs it to the loop filter unit 21 and the variable length coding unit 23.
Further, the encoding control unit 11 outputs motion vector restriction information to the motion compensation prediction unit 15 and outputs hint information (motion vector restriction information, GOP size restriction information, reference structure designation information) to the variable length coding unit 23. Perform the process.

Each time a picture (input image) belonging to the GOP is input, the block dividing unit 12 divides the picture into blocks of the encoding block size determined by the encoding control unit 11, and outputs each encoding target block, a block serving as the unit of prediction processing, to the changeover switch 13 and the subtraction unit 17.
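A minimal sketch of the division performed by the block dividing unit 12, assuming a single fixed encoding block size and clipping at picture edges (an actual encoder may use hierarchical block sizes):

```python
def divide_into_blocks(width, height, block_size):
    """Yield (x, y, w, h) for each encoding target block in raster order,
    clipping blocks that extend past the picture boundary."""
    for y in range(0, height, block_size):
        for x in range(0, width, block_size):
            yield (x, y, min(block_size, width - x), min(block_size, height - y))

# A 100x70 picture with 64-pixel blocks yields a 2x2 grid whose right and
# bottom blocks are clipped.
blocks = list(divide_into_blocks(100, 70, 64))
```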
If the encoding mode determined by the encoding control unit 11 is the intra encoding mode, the changeover switch 13 outputs the encoding target block output from the block dividing unit 12 to the intra prediction unit 14; if the encoding mode is the inter encoding mode, it outputs the encoding target block to the motion compensation prediction unit 15; and if the encoding mode is the PCM encoding mode, it outputs the encoding target block to the PCM encoding unit 16.

The intra prediction unit 14 uses the intra prediction parameter determined by the encoding control unit 11 to perform an intra prediction process on the encoding target block output from the changeover switch 13 and generate an intra prediction image.
The motion compensation prediction unit 15 compares the encoding target block output from the changeover switch 13 with the locally decoded image after loop filter processing stored in the frame memory 22, searches for a motion vector within the maximum range indicated by the motion vector restriction information output from the encoding control unit 11, and performs an inter prediction process on the encoding target block using that motion vector and the inter prediction parameter determined by the encoding control unit 11, thereby generating an inter prediction image.
Note that the motion compensation prediction unit 15 performs a process of outputting the searched motion vector to the variable length encoding unit 23 as an encoding parameter.
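The restricted search performed by the motion compensation prediction unit 15 can be sketched as a full search whose candidate offsets never exceed the maximum range given by the motion vector restriction information. This is a simplified sketch: SAD cost, integer-pixel accuracy, single reference frame.

```python
import numpy as np

def search_motion_vector(cur_block, ref_frame, bx, by, bsize, max_mv):
    """Return the (dx, dy) within +/-max_mv minimizing the sum of absolute
    differences against ref_frame; max_mv is the motion vector restriction."""
    h, w = ref_frame.shape
    best_mv, best_sad = (0, 0), float("inf")
    for dy in range(-max_mv, max_mv + 1):
        for dx in range(-max_mv, max_mv + 1):
            x, y = bx + dx, by + dy
            if x < 0 or y < 0 or x + bsize > w or y + bsize > h:
                continue  # candidate block would leave the reference picture
            sad = np.abs(cur_block.astype(int)
                         - ref_frame[y:y + bsize, x:x + bsize].astype(int)).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dx, dy)
    return best_mv, best_sad

# Deterministic toy reference frame; the current block is the reference block
# displaced by (dx, dy) = (1, 2), which lies inside the restriction max_mv = 3.
ref = (np.arange(32 * 32) % 251).reshape(32, 32)
cur = ref[10:18, 9:17]
mv, sad = search_motion_vector(cur, ref, 8, 8, 8, 3)
```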

The PCM encoding unit 16 does not perform prediction processing. Using the PCM encoding parameter determined by the encoding control unit 11, it converts the pixel data of the region of the picture (input image) corresponding to the encoding target block output from the changeover switch 13 into a predetermined bit width to generate a PCM signal (encoded data), outputs the PCM signal to the variable length encoding unit 23, and also converts the PCM signal back into the original bit width to generate a PCM image, which it outputs to the loop filter unit 21.
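The bit-width conversion performed by the PCM encoding unit 16 can be sketched as truncation to the PCM bit width followed by re-expansion; the concrete bit depths here (8-bit source, 6-bit PCM) are illustrative assumptions.

```python
def pcm_encode(pixels, src_bits=8, pcm_bits=6):
    """Convert pixel samples to the predetermined PCM bit width by dropping
    the least significant bits (this produces the PCM signal, i.e. encoded data)."""
    shift = src_bits - pcm_bits
    return [p >> shift for p in pixels]

def pcm_to_image(samples, src_bits=8, pcm_bits=6):
    """Convert the PCM signal back to the source bit width to obtain the
    PCM image handed to the loop filter unit."""
    shift = src_bits - pcm_bits
    return [s << shift for s in samples]

signal = pcm_encode([255, 128, 7])   # -> [63, 32, 1]
image = pcm_to_image(signal)         # -> [252, 128, 4]
```

Note that the round trip is lossy only in the discarded low-order bits, which is why PCM-coded blocks need no prediction reference at all.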
The encoding control unit 11, the block division unit 12, the changeover switch 13, the intra prediction unit 14, the motion compensation prediction unit 15, and the PCM encoding unit 16 constitute a predicted image generation unit.

The subtraction unit 17 subtracts the intra prediction image generated by the intra prediction unit 14 or the inter prediction image generated by the motion compensation prediction unit 15 from the encoding target block output from the block dividing unit 12, and outputs the prediction difference signal (difference image) resulting from the subtraction to the transform/quantization unit 18.
The transform/quantization unit 18 refers to the prediction difference encoding parameter determined by the encoding control unit 11, performs orthogonal transform processing (for example, DCT (discrete cosine transform), or an orthogonal transform such as a KL transform whose bases are designed in advance for a specific learning sequence) on the prediction difference signal output from the subtraction unit 17 to calculate transform coefficients, quantizes the transform coefficients with reference to the prediction difference encoding parameter, and outputs the quantized transform coefficients (hereinafter, "quantized coefficients") to the inverse quantization/inverse transform unit 19 and the variable length coding unit 23.
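The transform and quantization steps can be sketched with an orthonormal 2-D DCT built from its separable definition, followed by uniform quantization. In the actual apparatus the transform and quantization step are chosen via the prediction difference encoding parameter; the 4x4 size and step of 4 below are arbitrary.

```python
import numpy as np

def dct2(block):
    """Orthonormal 2-D DCT-II applied separably: C @ block @ C.T."""
    n = block.shape[0]
    basis = np.array([[np.cos(np.pi * (2 * x + 1) * u / (2 * n)) for x in range(n)]
                      for u in range(n)])
    scale = np.full(n, np.sqrt(2.0 / n))
    scale[0] = np.sqrt(1.0 / n)
    C = scale[:, None] * basis
    return C @ block @ C.T

def quantize(coeffs, qstep):
    """Uniform quantization of the transform coefficients."""
    return np.round(coeffs / qstep).astype(int)

# A constant prediction difference signal compacts into the single DC coefficient,
# which is what makes the subsequent entropy coding efficient.
q = quantize(dct2(np.full((4, 4), 8.0)), qstep=4)
```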

The inverse quantization/inverse transform unit 19 refers to the prediction difference encoding parameter determined by the encoding control unit 11 to inversely quantize the quantized coefficients output from the transform/quantization unit 18, performs inverse orthogonal transform processing on the transform coefficients after inverse quantization with reference to the prediction difference encoding parameter, and calculates a local decoded prediction difference signal corresponding to the prediction difference signal output from the subtraction unit 17.
The addition unit 20 adds the difference image indicated by the local decoded prediction difference signal calculated by the inverse quantization/inverse transform unit 19 to the intra prediction image generated by the intra prediction unit 14 or the inter prediction image generated by the motion compensation prediction unit 15, thereby calculating a local decoded image corresponding to the encoding target block output from the block dividing unit 12.

The loop filter unit 21 sequentially performs zero or more types of filtering processing based on the filter parameters output from the encoding control unit 11. However, when the apparatus is configured not to apply loop filter processing to encoding target blocks in the PCM encoding mode, no loop filter processing is performed on those blocks.
The frame memory 22 is a recording medium that stores a locally decoded image after the loop filter processing by the loop filter unit 21.

The variable length encoding unit 23 performs variable length encoding on the quantized coefficients (encoded data) output from the transform/quantization unit 18; the encoding mode (intra encoding mode/inter encoding mode/PCM encoding mode), prediction parameters (intra prediction parameters/inter prediction parameters)/PCM encoding parameters, prediction difference encoding parameters, and filter parameters output from the encoding control unit 11; the motion vector output from the motion compensation prediction unit 15 (when the encoding mode is the inter encoding mode); the PCM signal (encoded data) output from the PCM encoding unit 16; and the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information) output from the encoding control unit 11, and generates an entire region bitstream indicating the encoding results.
The subtracting unit 17, the transform / quantization unit 18, and the variable length coding unit 23 constitute a bit stream generating unit.

In FIG. 2, it is assumed that the components of the moving image encoding device 1, namely the encoding control unit 11, the block division unit 12, the changeover switch 13, the intra prediction unit 14, the motion compensation prediction unit 15, the PCM encoding unit 16, the subtraction unit 17, the transform/quantization unit 18, the inverse quantization/inverse transform unit 19, the addition unit 20, the loop filter unit 21, and the variable length coding unit 23, are each configured by dedicated hardware (for example, a semiconductor integrated circuit on which a CPU is mounted, or a one-chip microcomputer); however, the moving image encoding device 1 may also be configured by a computer.
When the moving image encoding device 1 is configured by a computer, the frame memory 22 may be configured on an internal or external memory of the computer, a program describing the processing contents of the encoding control unit 11, the block division unit 12, the changeover switch 13, the intra prediction unit 14, the motion compensation prediction unit 15, the PCM encoding unit 16, the subtraction unit 17, the transform/quantization unit 18, the inverse quantization/inverse transform unit 19, the addition unit 20, the loop filter unit 21, and the variable length coding unit 23 may be stored in a memory of the computer, and the CPU of the computer may execute the program stored in the memory.
FIG. 5 is a flowchart showing the processing contents (moving image coding method) of the moving image coding apparatus 1 according to the first embodiment of the present invention.

FIG. 3 is a block diagram showing the all-region stream decoding unit 3 of the moving picture transcoding device 2 according to the first embodiment of the present invention.
In FIG. 3, when the variable length code decoding unit 31 receives the entire region bitstream output from the variable length encoding unit 23 of the moving image encoding device 1 of FIG. 2, it variable-length-decodes and outputs, for each block, the encoded data (quantized coefficients/PCM signal), the encoding mode (intra encoding mode/inter encoding mode/PCM encoding mode), the intra prediction parameter (when the encoding mode is the intra encoding mode), the inter prediction parameter and motion vector (when the encoding mode is the inter encoding mode), the PCM encoding parameter (when the encoding mode is the PCM encoding mode), the prediction difference encoding parameter, the loop filter parameter, and the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information).

The changeover switch 32 outputs the intra prediction parameter output from the variable length code decoding unit 31 to the intra prediction unit 33 if the encoding mode variable-length-decoded by the variable length code decoding unit 31 is the intra encoding mode; outputs the inter prediction parameter and the motion vector output from the variable length code decoding unit 31 to the motion compensation unit 34 if the encoding mode is the inter encoding mode; and outputs the PCM encoding parameter and the PCM signal output from the variable length code decoding unit 31 to the PCM decoding unit 35 if the encoding mode is the PCM encoding mode.

The intra prediction unit 33 uses the intra prediction parameter output from the changeover switch 32 to perform an intra prediction process on the decoding target block and generate an intra predicted image.
The motion compensation unit 34 performs an inter prediction process on the decoding target block using the motion vector and the inter prediction parameter output from the changeover switch 32 while referring to the decoded image after loop filter processing stored in the frame memory 39, thereby generating an inter prediction image.
The PCM decoding unit 35 generates a PCM image using the PCM encoding parameter and the PCM signal output from the changeover switch 32, and performs a process of outputting the PCM image to the loop filter unit 38.

The inverse quantization / inverse transform unit 36 refers to the prediction difference encoding parameter output from the variable length code decoding unit 31 and inversely quantizes the quantized coefficient output from the variable length code decoding unit 31. With reference to the prediction difference encoding parameter, an inverse orthogonal transform process is performed on the orthogonal transform coefficient after inverse quantization, and a process of calculating a decoded prediction difference signal is performed.
The addition unit 37 adds the difference image indicated by the decoded prediction difference signal calculated by the inverse quantization/inverse transform unit 36 to the intra prediction image generated by the intra prediction unit 33 or the inter prediction image generated by the motion compensation unit 34, generates a decoded image before loop filter processing, and outputs this decoded image to the loop filter unit 38 and to the outside of the entire region stream decoding unit 3.

The loop filter unit 38 sequentially performs zero or more types of filtering processing based on the filter parameters output from the variable length code decoding unit 31. However, when the apparatus is configured not to apply loop filter processing to decoding target blocks in the PCM encoding mode, no loop filter processing is performed on those blocks.
The frame memory 39 is a recording medium that stores the decoded image after the loop filter processing by the loop filter unit 38.

In FIG. 3, it is assumed that the components of the entire region stream decoding unit 3, namely the variable length code decoding unit 31, the changeover switch 32, the intra prediction unit 33, the motion compensation unit 34, the PCM decoding unit 35, the inverse quantization/inverse transform unit 36, the addition unit 37, and the loop filter unit 38, are each configured by dedicated hardware (for example, a semiconductor integrated circuit on which a CPU is mounted, or a one-chip microcomputer); however, the entire region stream decoding unit 3 may also be configured by a computer.
When the entire region stream decoding unit 3 is configured by a computer, the frame memory 39 may be configured on an internal or external memory of the computer, a program describing the processing contents of the variable length code decoding unit 31, the changeover switch 32, the intra prediction unit 33, the motion compensation unit 34, the PCM decoding unit 35, the inverse quantization/inverse transform unit 36, the addition unit 37, and the loop filter unit 38 may be stored in a memory of the computer, and the CPU of the computer may execute the program stored in the memory.
FIG. 6 is a flowchart showing the processing contents of the all-region stream decoding unit 3 of the moving picture transcoding device 2 according to the first embodiment of the present invention.

FIG. 4 is a block diagram showing a partial area transcoding unit 4 of the moving picture transcoding apparatus 2 according to the first embodiment of the present invention.
In FIG. 4, the transcode control unit 41 refers to the motion vector restriction information, GOP size restriction information, and reference structure designation information included in the hint information output from the entire region stream decoding unit 3, specifies, from the display area of the picture indicated by display area information given from outside, the area to be transcoded (transcoding target area) as well as the essential encoding area, which is the area necessary for decoding the transcoding target area (the area whose encoding parameters need to be reused at the time of transcoding), and outputs transcoding target area information indicating the transcoding target area and essential encoding area information indicating the essential encoding area.
Note that the size of the essential encoding area differs for each picture belonging to the GOP. For example, if the maximum GOP size indicated by the GOP size restriction information is N and the reference structure designation information indicates that each picture (frame) refers to the immediately preceding picture (frame) when decoded, then among the essential encoding areas of the N pictures (frames), that of the Nth picture (frame) is the smallest and that of the first picture (frame) is the largest. The transcode control unit 41 determines, for example, the largest essential encoding area as the transcoding target area; an inclusion relationship therefore holds between each essential encoding area and the transcoding target area.
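Under the example above, namely maximum GOP size N with each picture referencing the one before it, the per-picture essential encoding areas can be computed by growing the display rectangle by the motion vector restriction once per reference hop. This is a sketch under those stated assumptions; the (x0, y0, x1, y1) rectangle layout and the function name are illustrative.

```python
def essential_areas(display_rect, gop_size, max_mv, width, height):
    """Return one rectangle per picture of the GOP. Picture i must cover
    picture i+1's area expanded by max_mv, so the first picture's area is
    the largest and the last picture's the smallest, as described above."""
    x0, y0, x1, y1 = display_rect
    areas = []
    for i in range(gop_size):  # i = 0 is the first picture of the GOP
        grow = (gop_size - 1 - i) * max_mv
        areas.append((max(0, x0 - grow), max(0, y0 - grow),
                      min(width, x1 + grow), min(height, y1 + grow)))
    return areas

# GOP of 4 pictures, |mv| <= 64, 200x200 display area inside a 1920x1080 frame.
areas = essential_areas((400, 400, 600, 600), 4, 64, 1920, 1080)
```

Choosing the largest rectangle, `areas[0]`, as the transcoding target area guarantees that every picture's essential encoding area lies inside it.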
In addition, the transcode control unit 41 generates header information of the partial region bit stream based on the transcoding target region information, and performs processing for outputting the header information to the variable length encoding unit 46.

The encoding parameter extraction unit 42 extracts, from the encoded data and encoding parameters of the entire region output from the entire region stream decoding unit 3, the encoded data (quantized coefficients/PCM signal) and encoding parameters (encoding mode (intra encoding mode/inter encoding mode/PCM encoding mode), prediction parameters (intra prediction parameters/inter prediction parameters)/PCM encoding parameters, motion vector (when the encoding mode is the inter encoding mode), prediction difference encoding parameter, loop filter parameter, and hint information (motion vector restriction information, GOP size restriction information, reference structure designation information)) of the encoding target blocks included in the essential encoding area indicated by the essential encoding area information output from the transcode control unit 41, and outputs the extracted encoded data and encoding parameters of each encoding target block to the external reference block encoding unit 43 and the changeover switch 45.

If an encoding target block included in the essential encoding area indicated by the essential encoding area information output from the transcode control unit 41 (an encoding target block on the boundary of the essential encoding area) is an external reference block, that is, a block that is intra-encoded with reference to pixel values outside the essential encoding area, the external reference block encoding unit 43 encodes the decoded image of that block using an encoding method that does not use pixel values outside the essential encoding area for prediction reference, and outputs the resulting encoded data and the encoding parameters used for encoding the decoded image to the changeover switch 45.

For example, when an intra encoding mode that refers only to pixel values treated as lying at the screen edge of the encoding target block is used as an encoding method that does not use pixel values outside the essential encoding area for prediction reference, an intra prediction image is generated in that intra encoding mode, and the decoded image of the encoding target block (external reference block) is extracted from the entire region decoded image before loop filter processing output from the entire region stream decoding unit 3. The difference image between the decoded image of the encoding target block and the intra prediction image is then compression-encoded, and the resulting encoded data (quantized coefficients) and the intra prediction parameter (encoding parameter) used to generate the intra prediction image are output to the changeover switch 45.
When the PCM encoding mode is used as an encoding method that does not use pixel values outside the essential encoding area for prediction reference, the decoded image of the encoding target block (external reference block) is extracted from the entire region decoded image before loop filter processing output from the entire region stream decoding unit 3. The decoded image is then PCM-encoded, and the resulting PCM signal and the PCM encoding parameter (encoding parameter) used for PCM encoding of the decoded image are output to the changeover switch 45.

The unnecessary block encoding unit 44 encodes an encoding target block (unnecessary block) that is outside the essential encoding area but inside the transcoding target area in, for example, the skip mode of the inter encoding method, and outputs the resulting encoded data and the encoding parameters used for encoding the block to the changeover switch 45.
Here, an example is shown in which encoding is performed in the skip mode of the inter encoding method and the encoding parameters used for skip-mode encoding are output to the changeover switch 45; alternatively, dummy encoded data and dummy encoding parameters prepared in advance may be output to the changeover switch 45.

The changeover switch 45 refers to the transcoding target area information and the essential encoding area information output from the transcode control unit 41. If an encoding target block included in the essential encoding area is not an external reference block, it selects the encoded data and encoding parameters output from the encoding parameter extraction unit 42; if the block is an external reference block, it selects those output from the external reference block encoding unit 43; and if the block is an unnecessary block, it selects those output from the unnecessary block encoding unit 44. The selected encoded data and encoding parameters are output to the variable length encoding unit 46.
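The selection rule applied by the changeover switch 45 can be sketched as a per-block dispatch; representing the areas as sets of block indices is an illustrative simplification.

```python
def select_source(block, essential_blocks, external_ref_blocks, target_blocks):
    """Decide which unit supplies the encoded data for one block: the
    encoding parameter extraction unit 42 for ordinary essential blocks,
    the external reference block encoding unit 43 for external reference
    blocks, and the unnecessary block encoding unit 44 for blocks inside
    the transcoding target area but outside the essential encoding area."""
    if block in essential_blocks:
        return "external_ref_encoder" if block in external_ref_blocks else "extractor"
    if block in target_blocks:
        return "unnecessary_block_encoder"
    return None  # outside the transcoding target area: not emitted at all

# Toy areas: blocks 1-2 are essential, block 2 references outside pixels,
# blocks 1-3 make up the transcoding target area.
essential, external, target = {1, 2}, {2}, {1, 2, 3}
```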

The variable length encoding unit 46 performs variable length encoding on the encoded data and encoding parameters output from the changeover switch 45 to generate a partial region bitstream indicating the encoding results, multiplexes the header information of the partial region bitstream output from the transcode control unit 41 into that bitstream, and outputs the partial region bitstream after header multiplexing (a partial stream conforming to a predetermined encoding codec).

In FIG. 4, it is assumed that the components of the partial region transcoding unit 4, namely the transcode control unit 41, the encoding parameter extraction unit 42, the external reference block encoding unit 43, the unnecessary block encoding unit 44, the changeover switch 45, and the variable length encoding unit 46, are each configured by dedicated hardware (for example, a semiconductor integrated circuit on which a CPU is mounted, or a one-chip microcomputer); however, the partial region transcoding unit 4 may also be configured by a computer.
When the partial region transcoding unit 4 is configured by a computer, a program describing the processing contents of the transcode control unit 41, the encoding parameter extraction unit 42, the external reference block encoding unit 43, the unnecessary block encoding unit 44, the changeover switch 45, and the variable length encoding unit 46 may be stored in a memory of the computer, and the CPU of the computer may execute the program stored in the memory.
FIG. 7 is a flowchart showing the processing contents of the partial area transcoding unit 4 of the moving picture transcoding apparatus 2 according to the first embodiment of the present invention.

Next, the operation will be described.
The moving image encoding device 1 in FIG. 2 employs an encoding method that compression-encodes an input image by intra prediction encoding/inter prediction encoding/PCM encoding, and is characterized in that, when performing inter prediction encoding, it limits the maximum value of the motion vector (limits the motion vector search range) using the motion vector restriction information given from outside.
In addition, when determining the encoding mode (intra encoding mode/inter encoding mode/PCM encoding mode) and the encoding parameters (intra prediction parameters/inter prediction parameters/PCM encoding parameters), the device limits the underlying GOP size and reference structure to specific patterns, and multiplexes hint information (motion vector restriction information, GOP size restriction information, reference structure designation information) indicating these restrictions into the entire region bitstream.

In image encoding processing, compression efficiency is increased by exploiting the fact that, in a general image, image features are highly correlated with regions that are close in space and time. In inter coding (inter-frame reference coding), this property is used to improve compression efficiency by predicting image features with reference to spatially close regions in temporally neighboring encoded frames.
At this time, since an object in the image may move between frames, a region with high correlation is searched for, and the phase difference between the region to be predicted and the highly correlated region is represented by information called a motion vector, thereby absorbing the movement of the object (motion compensation prediction processing).
Therefore, even when it is desired to decode only a specific limited area of a specific frame from the encoded stream, it is necessary to first decode the area indicated by the motion vector in the frame that the frame refers to.

Since a frame decoded by inter-frame reference may itself be referred to by a subsequent frame, information in a certain area of the decoded image of a certain frame may propagate, through inter-frame reference, to a wider area of subsequent frames. For example, when there is no restriction on the maximum value of the motion vector, the propagation range of the decoded image information is practically unlimited.
The moving picture coding apparatus 1 according to the first embodiment is therefore configured to suppress the propagation of decoded image information to a fixed range by placing certain restrictions on the motion vector, the GOP size, and the reference structure, and to transmit the restriction information to the decoding side as hint information.

The video signal format to be processed by the moving image encoding apparatus 1 in FIG. 1 may be a color video signal in an arbitrary color space, such as a YUV signal composed of a luminance signal and two color difference signals or an RGB signal output from a digital image sensor, or any other video signal, such as a monochrome image signal or an infrared image signal, whose video frame is composed of a horizontal/vertical two-dimensional digital sample (pixel) sequence.
The gradation of each pixel may be 8 bits, or it may be 10 bits or 12 bits.
In the following description, for convenience and unless otherwise specified, it is assumed that the video signal of the input image is a YUV signal in 4:2:0 format, in which the two color difference components U and V are subsampled with respect to the luminance component Y.
A processing data unit corresponding to each frame of the video signal is referred to as a “picture”.
In the first embodiment, a “picture” is described as a progressively scanned video frame signal. However, when the video signal is an interlaced signal, a “picture” may be a field image signal, which is a unit constituting a video frame.

Hereinafter, the processing content of the moving image encoding device 1 will be described.
The encoding control unit 11 hierarchically divides each image area of a predetermined maximum coding block (CTU, macroblock) size into coding target blocks of the respective coding block sizes until a predetermined upper limit on the number of division layers is reached, and determines the coding mode for each coding target block (step ST1 in FIG. 5).
Here, FIG. 8 is an explanatory diagram showing an example in which the maximum encoding block is hierarchically divided into a plurality of encoding target blocks.
In FIG. 8, the maximum coding block is the coding target block of the “0th layer”, whose luminance component has a size of (L_0, M_0).
A coding target block is obtained by hierarchically dividing the CTU-size block, in a quadtree structure, down to a predetermined depth that is defined separately.
At depth n, the coding target block is an image area of size (L_n, M_n).
L_n and M_n may be the same or different; FIG. 8 shows the case of L_n = M_n.

Hereinafter, the coding block size determined by the encoding control unit 11 is defined as the size (L_n, M_n) of the luminance component of the coding target block.
Since quadtree partitioning is performed, (L_{n+1}, M_{n+1}) = (L_n/2, M_n/2) always holds.
Note that in a color video signal in which all color components have the same number of samples, such as an RGB signal (4:4:4 format), the size of every color component is (L_n, M_n), whereas when the 4:2:0 format is handled, the coding block size of the corresponding color difference components is (L_n/2, M_n/2).
Hereinafter, the coding target block in the nth layer is denoted as B_n, and the coding mode selectable for the coding target block B_n is denoted as m(B_n).
In the case of a color video signal composed of a plurality of color components, the coding mode m(B_n) may be configured to use an individual mode for each color component, or a mode common to all the color components. Hereinafter, unless otherwise specified, m(B_n) denotes the coding mode for the luminance component of a coding block of a YUV signal in 4:2:0 format.
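As an illustrative aside (not part of the embodiment itself), the size relations above can be sketched in a few lines of Python; the 64×64 maximum coding block size and the function names are assumptions chosen for illustration:

```python
def block_sizes(l0, m0, depth):
    """Luma coding block size (L_n, M_n) at quadtree depth n.

    Quadtree partitioning halves both dimensions per layer, so
    (L_{n+1}, M_{n+1}) = (L_n / 2, M_n / 2).
    """
    return (l0 >> depth, m0 >> depth)

def chroma_sizes(luma_size, chroma_format="4:2:0"):
    """Corresponding chroma block size for the given chroma format."""
    l, m = luma_size
    if chroma_format == "4:4:4":
        return (l, m)            # all components share the same sample count
    if chroma_format == "4:2:0":
        return (l // 2, m // 2)  # chroma subsampled by 2 in both directions
    raise ValueError(chroma_format)

# Example: a 64x64 maximum coding block (0th layer) divided to depth 2.
assert block_sizes(64, 64, 0) == (64, 64)
assert block_sizes(64, 64, 2) == (16, 16)
assert chroma_sizes((16, 16), "4:2:0") == (8, 8)
```

The right shift encodes the halving per layer, so the invariant (L_{n+1}, M_{n+1}) = (L_n/2, M_n/2) holds by construction.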

The coding mode m(B_n) includes one or more intra coding modes (collectively referred to as “INTRA”), one or more inter coding modes (collectively referred to as “INTER”), and one or more PCM coding modes. The encoding control unit 11 selects the coding mode for the coding target block B_n from all the coding modes available for the picture type of the picture, or from a subset thereof.
Furthermore, the coding target block B_n is divided by the block dividing unit 12 into one or a plurality of prediction processing units (partitions), as shown in FIG.
Hereinafter, a partition belonging to the coding target block B_n is denoted as P_i^n (i is the partition number in the nth layer).
How the partitioning of the coding target block B_n is performed is included as information in the coding mode m(B_n).
All partitions P_i^n are subjected to prediction processing according to the coding mode m(B_n), but a prediction parameter is selected for each coding target block B_n or each partition P_i^n.

For example, the encoding control unit 11 generates a block division state as illustrated in FIG. 9 for the maximum coding block, and identifies the coding target blocks.
The shaded portion in FIG. 9A shows the distribution of the partitions after division, and FIG. 9B shows, in a quadtree graph, the situation in which the coding modes m(B_n) are assigned by the hierarchical division.
The nodes enclosed by □ in FIG. 9B are the nodes (coding target blocks) to which coding modes m(B_n) are assigned.

The encoding control unit 11 also outputs the motion vector restriction information given from the outside to the motion compensated prediction unit 15.
The motion vector restriction information is information for restricting, by limiting the maximum length of a motion vector, to which areas of subsequent frames the information of a partial area of the decoded image of a certain frame can propagate through reference relationships. This motion vector restriction information may be a fixed value for all frames, or may take a different value for each frame.
The encoding control unit 11 also outputs the determined coding mode, prediction differential encoding parameter, intra prediction parameter, inter prediction parameter, PCM encoding parameter, and loop filter parameter to the variable length encoding unit 23.
In addition, it outputs the motion vector restriction information, GOP size restriction information, and reference structure designation information given from the outside to the variable length encoding unit 23.

Note that the processing of the encoding control unit 11 is performed for each picture input to the moving image encoding device 1. The picture type, the inter-picture reference structure, and the like are controlled inside the encoding control unit 11 in accordance with the reference structure designation information, the GOP size restriction information, and other encoding control information given from the outside, and the coding mode and coding parameters are determined as described above.
When the picture type is an I picture, the coding mode is limited to the intra coding mode or the PCM coding mode.
When the picture type is a B picture or a P picture, the coding mode is determined to be the intra coding mode, the inter coding mode, or the PCM coding mode.
Further, when the picture type is a B picture or a P picture, it is also possible to restrict, for all pictures, the use of the intra coding mode or the use of the PCM coding mode in accordance with other encoding control information.
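As a non-authoritative sketch of the mode restrictions above (the function name, mode constants, and the handling of the external restriction flags are hypothetical, chosen only for illustration):

```python
# Hypothetical mode-set helper; the names are illustrative, not from the text.
INTRA, INTER, PCM = "INTRA", "INTER", "PCM"

def selectable_modes(picture_type, forbid_intra=False, forbid_pcm=False):
    """Coding modes the encoding control unit may choose for a picture type.

    I pictures cannot use inter coding; B/P pictures may additionally be
    restricted by external encoding control information.
    """
    if picture_type == "I":
        modes = {INTRA, PCM}
    elif picture_type in ("P", "B"):
        modes = {INTRA, INTER, PCM}
    else:
        raise ValueError(picture_type)
    if forbid_intra and picture_type in ("P", "B"):
        modes.discard(INTRA)
    if forbid_pcm:
        modes.discard(PCM)
    return modes

assert selectable_modes("I") == {INTRA, PCM}
assert selectable_modes("B", forbid_intra=True, forbid_pcm=True) == {INTER}
```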

When the coding mode m(B_n) determined by the encoding control unit 11 is an intra coding mode (when m(B_n) ∈ INTRA), the changeover switch 13 outputs the coding target block B_n output from the block dividing unit 12 to the intra prediction unit 14 (step ST2).
When the coding mode m(B_n) determined by the encoding control unit 11 is an inter coding mode (when m(B_n) ∈ INTER), the changeover switch 13 outputs the coding target block B_n output from the block dividing unit 12 to the motion compensated prediction unit 15 (step ST3).
When the coding mode m(B_n) determined by the encoding control unit 11 is the PCM coding mode, the changeover switch 13 outputs the coding target block B_n output from the block dividing unit 12 to the PCM encoding unit 16 (step ST3).

When the coding mode m(B_n) determined by the encoding control unit 11 is an intra coding mode (when m(B_n) ∈ INTRA) and the intra prediction unit 14 receives the coding target block B_n from the changeover switch 13, the intra prediction unit 14 performs the intra prediction process for each partition P_i^n in the coding target block B_n using the intra prediction parameter determined by the encoding control unit 11, and generates an intra prediction image P_INTRAi^n (step ST4).

When the coding mode m(B_n) determined by the encoding control unit 11 is an inter coding mode (when m(B_n) ∈ INTER) and the motion compensated prediction unit 15 receives the coding target block B_n from the changeover switch 13, the motion compensated prediction unit 15 searches for a motion vector by comparing each partition P_i^n in the coding target block B_n with the reference image of another frame on which motion compensation prediction is performed (the locally decoded image after loop filter processing stored in the frame memory 22).
However, when searching for a motion vector, the length of the motion vector is limited so as not to exceed the maximum value indicated by the motion vector restriction information output from the encoding control unit 11 (the motion vector is searched for within the maximum range area indicated by the motion vector restriction information).
Note that the maximum value indicated by the motion vector restriction information may be fixed for all frames or may differ for each frame.
It may also be changed for each combination of the current frame and the reference frame. For example, the maximum value of the motion vector may be specified so as to be proportional to the absolute value of the difference between the POC (Picture Order Count: a counter value that increments by one in time series order) of the current frame and the POC of the reference frame. In general, the greater the POC difference between frames, the greater the amount of motion between them, so it is reasonable to specify the maximum value using such a rule.
In that case, information indicating which rule specifies the maximum value of the motion vector may also be included in the motion vector restriction information.
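The motion vector restriction described above can be sketched as follows; the coefficient value, function names, and clamping behavior are illustrative assumptions, not details of the embodiment:

```python
def mv_max(poc_current, poc_reference, alpha=8, fixed_max=None):
    """Maximum allowed motion-vector component length, in pixels.

    Either a fixed value is used for every frame pair, or the maximum is
    proportional to the absolute POC difference between the current frame
    and the reference frame (larger temporal distance -> larger motion).
    """
    if fixed_max is not None:
        return fixed_max
    return alpha * abs(poc_current - poc_reference)

def clamp_mv(mv, limit):
    """Clip a motion vector (x, y) so neither component exceeds the limit."""
    return (max(-limit, min(limit, mv[0])), max(-limit, min(limit, mv[1])))

# A reference 2 pictures away allows twice the search range of 1 away.
assert mv_max(10, 9) == 8
assert mv_max(10, 8) == 16
assert clamp_mv((20, -3), mv_max(10, 9)) == (8, -3)
```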
When the motion vector has been searched for, the motion compensated prediction unit 15 performs the inter prediction process for each partition P_i^n in the coding target block B_n using the motion vector and the inter prediction parameter determined by the encoding control unit 11, and generates an inter prediction image P_INTERi^n (step ST5).

When the coding mode m(B_n) determined by the encoding control unit 11 is the PCM coding mode and the PCM encoding unit 16 receives the coding target block B_n from the changeover switch 13, the PCM encoding unit 16 performs pixel gradation reduction processing on the pixels included in the coding target block B_n based on the PCM encoding parameter output from the encoding control unit 11, and outputs the pixel values with reduced gradation to the variable length encoding unit 23 as a PCM signal (step ST6).
In addition, the PCM encoding unit 16 outputs the pixel values restored to the original gradation after the reduction to the loop filter unit 21 as a PCM image (the locally decoded image in the PCM coding mode).
When the PCM encoding parameter indicates that gradation reduction is not to be performed, the pixel gradation reduction processing is not performed, so the pixel values of the coding target block B_n can be encoded without degradation.
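A common way to realize the gradation reduction described above is to drop low-order bits by a right shift; the sketch below assumes that realization (the shift-based scheme and the function names are assumptions, not details stated in the text):

```python
def pcm_reduce(pixels, input_bits=8, pcm_bits=8):
    """Reduce pixel gradation for PCM coding by dropping low-order bits."""
    shift = input_bits - pcm_bits
    return [p >> shift for p in pixels]

def pcm_restore(samples, input_bits=8, pcm_bits=8):
    """Restore reduced samples to the original gradation (lossy if shifted)."""
    shift = input_bits - pcm_bits
    return [s << shift for s in samples]

# When pcm_bits == input_bits no reduction occurs, so coding is lossless.
src = [0, 37, 128, 255]
assert pcm_restore(pcm_reduce(src)) == src
# With reduction, the low-order bits are lost.
assert pcm_restore(pcm_reduce(src, 8, 6), 8, 6) == [0, 36, 128, 252]
```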

When the subtracting unit 17 receives the coding target block B_n from the block dividing unit 12, the subtracting unit 17 subtracts the intra prediction image P_INTRAi^n generated by the intra prediction unit 14 or the inter prediction image P_INTERi^n generated by the motion compensated prediction unit 15 from each partition P_i^n in the coding target block B_n, and outputs the prediction difference signal that is the subtraction result to the transform/quantization unit 18 (step ST7).

When the transform/quantization unit 18 receives the prediction difference signal from the subtracting unit 17, the transform/quantization unit 18 refers to the prediction differential encoding parameter determined by the encoding control unit 11 and performs orthogonal transform processing on the prediction difference signal (for example, DCT (discrete cosine transform), or orthogonal transform processing such as a KL transform whose basis is designed in advance for a specific learning sequence), and calculates the transform coefficients.
In addition, the transform/quantization unit 18 refers to the prediction differential encoding parameter, quantizes the transform coefficients, and outputs the quantized coefficients, which are the transform coefficients after quantization, to the inverse quantization/inverse transform unit 19 and the variable length encoding unit 23 (step ST8).

When the inverse quantization/inverse transform unit 19 receives the quantized coefficients from the transform/quantization unit 18, the inverse quantization/inverse transform unit 19 refers to the prediction differential encoding parameter determined by the encoding control unit 11 and inversely quantizes the quantized coefficients.
The inverse quantization/inverse transform unit 19 also refers to the prediction differential encoding parameter, performs inverse orthogonal transform processing (for example, inverse DCT or inverse KL transform) on the transform coefficients after inverse quantization, and calculates a local decoded prediction difference signal corresponding to the prediction difference signal output from the subtracting unit 17 (step ST9).

When the adding unit 20 receives the local decoded prediction difference signal from the inverse quantization/inverse transform unit 19, the adding unit 20 adds the difference image indicated by the local decoded prediction difference signal to the intra prediction image P_INTRAi^n generated by the intra prediction unit 14 or the inter prediction image P_INTERi^n generated by the motion compensated prediction unit 15, and calculates a local decoded partition image, or, as a collection of such local decoded partition images, the locally decoded image corresponding to the coding target block B_n output from the block dividing unit 12 (step ST10).
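The addition step can be sketched as follows; the clipping of the sum to the valid pixel range is an assumed detail, not stated in the text above:

```python
def reconstruct(pred, residual, bit_depth=8):
    """Local decoded image: prediction plus decoded prediction difference,
    clipped to the valid pixel range (clipping is an assumed detail)."""
    hi = (1 << bit_depth) - 1
    return [max(0, min(hi, p + r)) for p, r in zip(pred, residual)]

assert reconstruct([100, 250, 5], [30, 20, -10]) == [130, 255, 0]
```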

When the processing of steps ST2 to ST10 has been completed for all the coding target blocks B_n (steps ST11 and ST12), the loop filter unit 21 performs zero or more types of loop filter processing on the locally decoded image output from the adding unit 20 (the locally decoded image before loop filter processing) based on the loop filter parameters output from the encoding control unit 11, and outputs the locally decoded image after loop filter processing to the frame memory 22 (step ST13).
When the loop filter unit 21 is configured not to apply loop filter processing to coding target blocks B_n in the PCM coding mode, loop filter processing is not performed on those coding target blocks B_n.

The variable length encoding unit 23 variable-length encodes the quantized coefficients (encoded data) output from the transform/quantization unit 18; the coding mode m(B_n), prediction parameters (intra prediction parameter / inter prediction parameter), PCM encoding parameter, prediction differential encoding parameter, and filter parameters output from the encoding control unit 11; the motion vector output from the motion compensated prediction unit 15 (when the coding mode is an inter coding mode); the PCM signal (encoded data) output from the PCM encoding unit 16; and the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information) output from the encoding control unit 11, and generates the entire region bit stream indicating the encoding result (step ST14).

Next, the processing contents of the entire region stream decoding unit 3 of the moving image transcoding apparatus 2 will be described.
When the variable length code decoding unit 31 receives the entire region bit stream generated by the moving image coding device 1, the variable length code decoding unit 31 determines the maximum coding block size and the upper limit on the number of division layers in the same manner as the encoding control unit 11 in FIG. 2.
When information indicating the maximum coding block size and the upper limit on the number of division layers is multiplexed in the entire region bit stream, the maximum coding block size and the upper limit on the number of division layers may be determined by decoding that information.

Next, the variable length code decoding unit 31 decodes the coding mode assigned to each maximum coding block multiplexed in the entire region bit stream, and decodes the information indicating the division state of the maximum coding block (tile division control information) included in the coding mode (step ST21 in FIG. 6).
When the variable length code decoding unit 31 has decoded the information indicating the division state of the maximum coding block, it hierarchically identifies the decoding target blocks (coding target blocks) based on that division state.
Further, the variable length code decoding unit 31 divides each decoding target block into one or more prediction processing units based on the division state of the decoding target block, and decodes the coding parameters assigned to each coding target block unit or prediction processing unit (step ST21).

When the coding mode assigned to a decoding target block (coding target block) is an intra coding mode, the variable length code decoding unit 31 decodes the intra prediction parameter for each of the one or more partitions included in the decoding target block (step ST21).
When the coding mode assigned to the decoding target block is an inter coding mode, the inter prediction parameter is decoded for the decoding target block, or for each of the one or more partitions included in the decoding target block (step ST21).
When the coding mode assigned to the decoding target block is the PCM coding mode, the PCM signal and the PCM encoding parameters assigned to the decoding target block are decoded (step ST21).

In addition, when the coding mode assigned to the decoding target block is an intra coding mode or an inter coding mode, the variable length code decoding unit 31 divides each partition serving as a prediction processing unit into one or more partitions serving as transform processing units, based on the transform block size information included in the prediction differential encoding parameter, and decodes the quantized coefficients for each partition serving as a transform processing unit (step ST21).
Furthermore, the variable length code decoding unit 31 decodes the filter parameters multiplexed in the entire region bit stream and outputs them to the loop filter unit 38 (step ST21).
Note that the variable length code decoding unit 31 outputs all the decoded coding parameters (coding mode, intra prediction parameter, inter prediction parameter, PCM encoding parameter, motion vector, prediction differential encoding parameter, loop filter parameter), the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information), and the encoded data (quantized coefficients, PCM signal) to the partial area transcoding unit 4 in FIG. 3.

When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is an intra coding mode (when m(B_n) ∈ INTRA), the changeover switch 32 outputs the intra prediction parameter variable-length decoded by the variable length code decoding unit 31 to the intra prediction unit 33 (step ST22).
When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is an inter coding mode (when m(B_n) ∈ INTER), the changeover switch 32 outputs the inter prediction parameter and the motion vector variable-length decoded by the variable length code decoding unit 31 to the motion compensation unit 34 (step ST23).
When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is the PCM coding mode, the changeover switch 32 outputs the PCM signal and the PCM encoding parameters variable-length decoded by the variable length code decoding unit 31 to the PCM decoding unit 35 (step ST23).

When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is an intra coding mode (when m(B_n) ∈ INTRA) and the intra prediction unit 33 receives the intra prediction parameter from the changeover switch 32, the intra prediction unit 33, in the same manner as the intra prediction unit 14 of FIG. 2, performs the intra prediction process for each partition P_i^n in the decoding target block B_n using the intra prediction parameter, generates an intra prediction image P_INTRAi^n, and outputs the intra prediction image P_INTRAi^n to the adding unit 37 (step ST24).

When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is an inter coding mode (when m(B_n) ∈ INTER) and the motion compensation unit 34 receives the inter prediction parameter and the motion vector from the changeover switch 32, the motion compensation unit 34 refers to the decoded image after loop filter processing stored in the frame memory 39, performs the inter prediction process for the decoding target block B_n or each partition P_i^n using the motion vector and the inter prediction parameter to generate an inter prediction image P_INTERi^n, and outputs the inter prediction image P_INTERi^n to the adding unit 37 (step ST25).

When the coding mode m(B_n) variable-length decoded by the variable length code decoding unit 31 is the PCM coding mode and the PCM decoding unit 35 receives the PCM signal and the PCM encoding parameter from the changeover switch 32, the PCM decoding unit 35 performs, based on the PCM encoding parameter, a process of restoring the gradation of the PCM signal corresponding to each pixel of the decoding target block B_n to the gradation of the decoded image, and outputs the decoded image before loop filter processing of the restored decoding target block B_n to the loop filter unit 38 (step ST26). The decoded image before loop filter processing is also output to the partial area transcoding unit 4 in FIG. 3.

When the inverse quantization/inverse transform unit 36 receives the quantized coefficients and the prediction differential encoding parameter from the variable length code decoding unit 31, the inverse quantization/inverse transform unit 36, in the same procedure as the inverse quantization/inverse transform unit 19 in FIG. 2, refers to the prediction differential encoding parameter, inversely quantizes the quantized coefficients, performs inverse orthogonal transform processing on the inversely quantized transform coefficients, calculates a decoded prediction difference signal corresponding to the prediction difference signal output from the subtracting unit 17 in FIG. 2, and outputs the decoded prediction difference signal to the adding unit 37 (step ST27).

The adding unit 37 adds the difference image indicated by the decoded prediction difference signal calculated by the inverse quantization/inverse transform unit 36 to the intra prediction image P_INTRAi^n generated by the intra prediction unit 33 or the inter prediction image P_INTERi^n generated by the motion compensation unit 34, and outputs the result to the loop filter unit 38 as a decoded image, that is, as a collection of the one or more decoded partition images included in the decoding target block (step ST28). The decoded image before loop filter processing is also output to the partial area transcoding unit 4 in FIG. 3.

When the processing of steps ST21 to ST28 has been completed for all the coding target blocks B_n in the picture (steps ST29 and ST30), the loop filter unit 38 performs zero or more types of loop filter processing, based on the filter parameters output from the variable length code decoding unit 31, on the decoded image before loop filter processing output from the adding unit 37 or the PCM decoding unit 35, and stores the decoded image after loop filter processing in the frame memory 39 (step ST31).
Note that the decoded image after loop filter processing may also be output to the outside of the moving picture transcoding device 2.

Next, the processing contents of the partial area transcoding unit 4 of the moving image transcoding apparatus 2 will be described.
Upon receiving the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information) from the entire region stream decoding unit 3 in FIG. 3, the transcoding control unit 41 refers to the motion vector restriction information, GOP size restriction information, and reference structure designation information contained in the hint information, specifies the area to be transcoded (transcoding target area) from the display area of the picture indicated by the display area information given from the outside, and specifies the essential encoding area required for decoding the transcoding target area (the area for which decoding results identical or close to the all-area decoded image are guaranteed and whose encoding parameters need to be used during transcoding). The transcoding control unit 41 then outputs transcoding target area information indicating the transcoding target area and essential encoding area information indicating the essential encoding area (step ST41 in FIG. 7).
When the sizes of the essential encoding areas of the pictures belonging to a GOP differ, for example, the essential encoding area having the largest size is determined as the common transcoding target area for each picture.
Hereinafter, the process by which the transcoding control unit 41 specifies the transcoding target area information and the essential encoding area will be described in detail.

In moving image decoding processing, a decoded image is generated by adding a prediction image obtained by motion compensation prediction processing (an image determined based on a motion vector while referring to an already decoded frame) and a prediction difference signal.
Here, let F_1 be the random access point frame at the head of the GOP, and let F_{n+1} be a frame that refers to frame F_n. If the area to be decoded in frame F_n is a partial region P_n, then the decoding target area P_{n-1} of frame F_{n-1} must include all the regions of frame F_{n-1} that the partial region P_n of frame F_n refers to.
At this time, since the inter-frame motion compensation prediction process is executed in multiple stages within the GOP (frame F_n refers to frame F_{n-1}, frame F_{n-1} refers to frame F_{n-2}, and so on, down to frame F_1, which refers to nothing), the dependency between frames propagates from frame F_1 to frame F_n.
For this reason, in order to correctly decode the partial region P_N of the frame F_N belonging to the last stage of the reference structure in the GOP, it is necessary to set the partial regions P_1 to P_N, from frame F_1 to frame F_N, as decoding target areas based on the propagation of the dependency relationship described above.

Suppose, hypothetically, that the moving picture transcoding device 2 were to process a bit stream in which the motion vector, the GOP size, and the reference structure are not limited, instead of the entire region bit stream output from the moving picture coding device 1 of FIG. 2. Then, in order to determine the partial region P_n of frame F_n, it would be necessary to analyze all the motion vectors of frame F_{n+1} and identify the areas these motion vectors point to, so the time required for analysis would become large.
Furthermore, since the GOP size, the maximum value of the motion vector, and the reference structure differ from bit stream to bit stream, the shape and size of the partial area P_n of each frame F_n for the same display area designation information would be indeterminate, making both the transcoding process and the process of decoding the stream after transcoding difficult to handle.
In the first embodiment, however, the motion vector, GOP size, and reference structure of the entire region bit stream output from the moving picture encoding apparatus 1 in FIG. 2 are restricted as described above, and these restrictions are indicated by the hint information. Therefore, the partial area P_n of each frame F_n can be obtained in a fixed manner with a small amount of computation by the following processing.

Here, for simplicity of implementation, when the partial region P n of a frame F n is not a rectangular area, a rectangular region PL n that encloses the partial region P n is assumed to be decoded instead.
Consider first the case where the maximum of the horizontal and vertical component absolute values of the motion vector indicated by the motion vector restriction information included in the hint information is fixed at V [pixels] in all frames, and let PL n be the decoding target rectangular area of a frame F n that refers to the frame F n-1.
In this case, the decoding target rectangular area PL n-1 of the frame F n-1 can be fixed, without analyzing any motion vector values, as the rectangular area obtained by adding V pixels around the decoding target rectangular area PL n. This is because the maximum value of the motion vector is V, so the amount by which a motion vector can point outside the decoding target rectangular area PL n is at most V pixels.
Since the reference dependency propagates from the frame F 1 at the random access point, the decoding target rectangular area PL 1 of the frame F 1 can be obtained as the rectangular area in which V * (n-1) pixels are added around the decoding target rectangular area PL n.
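The derivation above, for a fixed motion-vector bound V, can be sketched as follows. This is only an illustration of the arithmetic; the rectangle representation and the function names are not part of the patent.

```python
def expand_rect(rect, margin):
    """Expand a rectangle (x0, y0, x1, y1) by `margin` pixels on every side."""
    x0, y0, x1, y1 = rect
    return (x0 - margin, y0 - margin, x1 + margin, y1 + margin)

def decoding_target_rects(pl_n, n, v):
    """Return [PL_1, ..., PL_n] for a chain F_1 <- F_2 <- ... <- F_n of
    single-frame references with a fixed motion-vector bound v: PL_k is
    PL_n expanded by v * (n - k) pixels, because each inter-frame
    reference can reach at most v pixels outside the referencing area."""
    return [expand_rect(pl_n, v * (n - k)) for k in range(1, n + 1)]
```

For example, with V = 16 and n = 3, PL 1 is PL 3 expanded by 32 pixels on every side, without any motion vector having to be analyzed.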

In addition, even when the maximum value of the motion vector differs from frame to frame, the decoding target rectangular area PL n of each frame F n can be obtained in the same manner. For example, when the motion vector maximum value of a frame F n is given by V n, then in order to decode the target rectangular area PL N of a frame F N, the decoding target rectangular area PL n of each frame F n (n < N) is obtained as the rectangular area in which V sum pixels are added around the decoding target rectangular area PL N, where V sum is given by the following equation.

V sum = V n+1 + V n+2 + ... + V N
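With per-frame bounds V n, the margin V sum is simply the sum of the bounds of the frames between F n and F N. A minimal sketch (the function name and the dictionary representation are illustrative, not from the patent):

```python
def v_sum(v_max, n, big_n):
    """Margin (in pixels) by which PL_N is expanded to obtain PL_n for a
    chain of single-frame references F_n <- ... <- F_N, where v_max[k]
    is the motion-vector bound used when frame F_k refers to F_{k-1}."""
    return sum(v_max[k] for k in range(n + 1, big_n + 1))
```

For instance, with bounds V 2 = 8, V 3 = 16, and V 4 = 8, decoding PL 4 requires expanding PL 4 by 8 pixels for frame F 3 and by 32 pixels for frame F 1.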

In addition, as described in the explanation of the moving image encoding device 1, when the maximum value of the motion vector is specified in proportion to the absolute difference between the POC of the frame F n and that of the reference frame F n-1, the decoding target rectangular area is obtained as follows.
First, let d(F n - F n-1) denote the absolute difference between the POC of the frame F n and that of the reference destination frame F n-1, and consider the case where the maximum motion vector value when referring to F n-1 from F n is designated as α d(F n - F n-1) (α is a fixed coefficient).
In this case, in order to decode the target rectangular area PL N of the frame F N, the decoding target rectangular area PL n of each frame F n (n < N) is obtained as the rectangular area in which V sum pixels are added around the decoding target rectangular area PL N, where V sum is given by the following equation.

V sum = α d(F n+1 - F n) + α d(F n+2 - F n+1) + ... + α d(F N - F N-1)
Also, the above equation can be simplified as follows.
V sum = α d(F N - F n)
That is, the value of V sum is determined by the absolute difference between the POC of F N and that of F n.
Further, the decoding target rectangular area PL m of a frame F m (m < n) referred to in decoding the decoding target rectangular area PL n of the frame F n is likewise obtained as the rectangular area in which V sum' pixels are added around the decoding target rectangular area PL n. In the case of a reference structure in which a certain frame is referred to from a plurality of frames, letting PDL nm be the dependent rectangular area of the frame F m derived from the target rectangular area PL n of a referring frame F n, the largest among the dependent rectangular areas obtained from the respective reference source frames is set as the decoding target rectangular area of that frame.
PL m = max { PDL nm } (maximum over all frames F n referring to F m)
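One reading of "the largest dependent rectangular area" is the smallest rectangle covering the dependent rectangle contributed by every referencing frame; under that assumption it can be sketched as:

```python
def enclosing_rect(rects):
    """Smallest rectangle (x0, y0, x1, y1) covering every given rectangle;
    used here as the decoding target rectangle of a frame that several
    frames refer to, each contributing one dependent rectangle."""
    return (min(r[0] for r in rects), min(r[1] for r in rects),
            max(r[2] for r in rects), max(r[3] for r in rects))
```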

From the above, in order to generate a partial region bit stream that can be decoded correctly and consistently within a GOP, it is necessary to include in the bit stream at least information on the decoding target rectangular area of each frame. In addition, since it is impossible or difficult to change the frame size (the number of horizontal and vertical pixels) within a GOP, the size of each frame in the GOP must be large enough to include the largest decoding target rectangular area among the decoding target rectangular areas of all the frames in the GOP (often the decoding target rectangular area of the frame F 1).
Based on these, when the display area information is given from the outside, the transcode control unit 41 obtains the essential encoding area and the transcoding target area in the following procedure.

(1) The decoding target rectangular area of each frame that is not itself referred to (non-reference frame) among the frames other than the frame F 1 in the GOP is set as an area including the display area.
(2) From the decoding target rectangular areas of the non-reference frames set in step (1), the sizes of the dependent rectangular areas are obtained for all frames, and the largest dependent rectangular area in each frame is determined as the decoding target rectangular area of that frame.
(3) The decoding target rectangular area of each frame in the GOP is set as the essential encoding area of that frame.
(4) An area including the largest decoding target rectangular area among the frames in the GOP is uniformly set as the transcoding target area of each frame in the GOP.
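For a simple chain-reference GOP with a fixed motion-vector bound, the four steps above can be sketched end to end. This is a hypothetical illustration: real reference structures, block-grid alignment, and the hint-information parsing are omitted, and the function name is not from the patent.

```python
def essential_and_transcode_areas(display_rect, gop_size, v):
    """Steps (1)-(4) for a GOP F_1 <- F_2 <- ... <- F_gop_size with a
    fixed motion-vector bound v: the last frame is the only non-reference
    frame, so its decoding target rectangle is the display area (step 1);
    earlier frames grow by v per reference hop (step 2).  These rectangles
    become the essential encoding areas (step 3); the largest one is used
    as the uniform transcoding target area (step 4)."""
    x0, y0, x1, y1 = display_rect
    essential = []
    for k in range(1, gop_size + 1):
        m = v * (gop_size - k)
        essential.append((x0 - m, y0 - m, x1 + m, y1 + m))
    transcode_target = essential[0]  # F_1 carries the largest rectangle
    return essential, transcode_target
```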

When the encoding parameter extraction unit 42 receives the essential encoding area information from the transcode control unit 41, it extracts, from the encoded data and encoding parameters of all regions output from the all region stream decoding unit 3, the encoded data (quantized coefficients / PCM signal) and the encoding parameters of each encoding target block included in the essential encoding area indicated by the essential encoding area information (a block counts as included in the essential encoding area even if only part of it lies inside the area). The encoding parameters include the coding mode (intra coding mode / inter coding mode / PCM coding mode), the prediction parameter (intra prediction parameter / inter prediction parameter / PCM coding parameter), the motion vector (when the coding mode is the inter coding mode), the predictive differential coding parameter, the loop filter parameter, and the hint information (motion vector restriction information, GOP size restriction information, reference structure designation information). The extracted encoded data and encoding parameters of the encoding target block are output to the external reference block encoding unit 43 and the changeover switch 45 (step ST42).

When the external reference block encoding unit 43 receives the essential encoding area information from the transcoding control unit 41, it confirms, for each encoding target block included in the essential encoding area indicated by the essential encoding area information (in particular, the encoding target blocks belonging to the boundary of the essential encoding area), whether or not the block is an external reference block on which intra encoding is performed with reference to a pixel value outside the essential encoding area (step ST43).
Whether or not an encoding target block included in the essential encoding area is an external reference block can be determined by extracting the encoding parameters of that block from the encoded data and encoding parameters of all regions output from the all region stream decoding unit 3, and confirming the coding mode and the prediction parameter included in those encoding parameters.
When an encoding target block included in the essential encoding area is an external reference block, the external reference block encoding unit 43 encodes the decoded image of the block by an encoding method that does not use pixel values outside the essential encoding area for prediction reference, and outputs the encoded data as the encoding result and the encoding parameters used for encoding the decoded image to the changeover switch 45 (step ST44).

Thus, when the encoding target block is an external reference block, the encoded data and encoding parameters of the block are newly re-determined instead of being extracted from the encoded data and encoding parameters of the entire region. The reason is that the area outside the essential encoding area is not guaranteed a decoding result close to the decoded image before transcoding, so an intra-coded prediction that refers to that area would produce a prediction result different from the original one.
Examples of the method for re-determining the encoded data and the encoding parameter include the following methods.

(1) The decoded image of the external reference block is extracted from the decoded image of all regions (before the loop filter processing) output from the all region stream decoding unit 3, and the decoded image of the external reference block is encoded in the PCM encoding mode by a procedure equivalent to that of the PCM encoding unit 16 of the moving image encoding device 1. The PCM signal (encoded data) that is the encoding result and the PCM encoding parameters used for the encoding are output to the changeover switch 45.
With this method, depending on the accuracy of the PCM encoding, the same result as the input decoded image can be decoded.
(2) Among the encoding target blocks outside the essential encoding area that the external reference block refers to during intra prediction, only the pixel portion used for reference is PCM-encoded based on the input decoded image, and the PCM signal (encoded data) that is the encoding result is output to the changeover switch 45. As for the encoding parameters such as the intra prediction parameters, the encoding parameters of the external reference block are extracted from the encoding parameters of the entire region and output to the changeover switch 45.
With this method, depending on the accuracy of the PCM encoding, the same result as the input decoded image can be decoded. Depending on the size of the external reference block, the code amount can be made smaller than with method (1).

(3) Since the decoded image outside the essential encoding area is determined based on the result of the unnecessary block encoding unit 44 described later, that decoded image is used as it is, and encoding parameters are determined by intra-coded prediction or inter-coded prediction so that the result is close to the decoded image.
The encoding parameter determination in method (3) can be performed by a method equivalent to the encoding method in the moving image encoding device 1 in FIG. 2.
(4) The essential encoding area is enlarged so that it coincides with the decoded area of the frame, and the external reference block is encoded by a technique equivalent to intra-coded prediction that refers to the outside of the screen.

The unnecessary block encoding unit 44 encodes each encoding target block (unnecessary block) outside the essential encoding area and inside the transcoding target area, for example in the skip mode of the inter encoding scheme, and outputs the encoded data as the encoding result and the encoding parameters used for encoding the block to the changeover switch 45 (step ST45).
Here, the unnecessary blocks are needed to unify the frame size within the GOP, but they are encoding target blocks belonging to an image area that is used neither for display nor for reference by subsequent frames, so their decoding result may be anything. Therefore, it is desirable to use, for the unnecessary blocks, encoding parameters that make the code amount as small as possible.
For this reason, methods such as encoding the unnecessary blocks, with as little block division as possible, in the skip mode of inter-coded prediction (a mode in which no vector information (except possibly predicted vector information) and no quantized coefficients are encoded) are conceivable.
When the above method (2) is used in the external reference block encoding unit 43 as the method for determining the encoding parameters of an external reference block, an encoding target block that is not included in the essential encoding area but is referenced from an external reference block also needs to have its encoding parameters determined by the above method (2).

The changeover switch 45 refers to the transcoding target area information and the essential encoding area information output from the transcoding control unit 41 and confirms whether or not the encoding target block is an unnecessary block; if the block is included in the essential encoding area, it also confirms whether or not the block is an external reference block (steps ST46 and ST47).
When the encoding target block is included in the essential encoding area and is not an external reference block, the changeover switch 45 outputs the encoded data and the encoding parameters output from the encoding parameter extraction unit 42 to the variable length encoding unit 46 (step ST48).
If the encoding target block is an external reference block, the changeover switch 45 outputs the encoded data and the encoding parameter output from the external reference block encoding unit 43 to the variable length encoding unit 46 (step ST49).
Moreover, if the encoding target block is an unnecessary block, the changeover switch 45 outputs the encoded data and the encoding parameter output from the unnecessary block encoding unit 44 to the variable length encoding unit 46 (step ST50).
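The branching in steps ST46 to ST50 amounts to a three-way classification of each encoding target block. A minimal sketch, with the region tests simplified to rectangle membership and all names being illustrative rather than from the patent:

```python
def classify_block(block, essential, is_external_ref):
    """Return which unit's output the changeover switch forwards:
    'extract'  - parameters copied from the all-region stream (ST48),
    'reencode' - external reference block, re-encoded (ST49),
    'dummy'    - unnecessary block, skip-mode parameters (ST50).
    `block` is an (x, y) position, `essential` a rectangle
    (x0, y0, x1, y1); blocks are assumed to lie inside the
    transcoding target area."""
    x, y = block
    x0, y0, x1, y1 = essential
    if not (x0 <= x <= x1 and y0 <= y <= y1):
        return 'dummy'      # outside the essential encoding area
    if is_external_ref(block):
        return 'reencode'   # refers to pixels outside the area
    return 'extract'        # ordinary block inside the area
```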

When the variable length encoding unit 46 receives the encoded data and the encoding parameters of an encoding target block from the changeover switch 45, it entropy-encodes the encoded data and the encoding parameters, and generates a partial area bit stream indicating the encoding result (step ST51).
Since the encoding target blocks included in the transcoding target area are cut out, an adjacent encoding target block that existed in the original bit stream may no longer exist. For this reason, it is necessary to redo the prediction processes for encoding parameters that use information on adjacent encoding target blocks, such as motion vectors and filter parameters. The processing of the variable length encoding unit 46, including such prediction processes, is performed by a method equivalent to that of the variable length encoding unit 23 of the moving image encoding device 1.
Further, since the partial area bit stream covers a range wider than the display area indicated by the display area information, the variable length encoding unit 46 generates, based on the determination by the transcoding control unit 41, header information of the partial area bit stream indicating which area is the display area, multiplexes the header information into the partial area bit stream (in accordance with a predetermined encoding codec), and outputs the partial area bit stream after multiplexing the header information to the video decoding device 6.

The partial area bit stream generated by the variable length encoding unit 46 is configured to be decodable by the video decoding device 6.
The video decoding device 6 has the same function as the all region stream decoding unit 3. However, the moving picture decoding device 6 need not have the function, which the all region stream decoding unit 3 has, of outputting the hint information, the encoding parameters, and the decoded image before the loop filter processing to the outside.
Further, the moving picture decoding device 6 may be configured to decode by means different from the all region stream decoding unit 3. In that case, the variable length encoding unit 46 of the partial region transcoding unit 4 performs variable length encoding of the encoded data and the encoding parameters so as to correspond to the decoding means (the encoding codec for the partial area bit stream) of the moving picture decoding device 6.

As is apparent from the above, according to the first embodiment, the variable length encoding unit 23 of the moving picture encoding device 1 multiplexes, into the entire region bit stream, hint information including motion vector restriction information indicating the maximum range in which a motion vector can be searched, GOP size restriction information indicating the maximum value of the GOP size, which is the number of pictures belonging to a GOP, and reference structure designation information indicating the picture to be referred to when decoding each picture belonging to the GOP. This makes it possible to generate an all-region bit stream suitable for generating an efficient partial-region bit stream with a low amount of computation, without causing a reduction in the compression efficiency of the entire region bit stream.
That is, according to the first embodiment, the moving picture encoding device 1 encodes the input picture such that the maximum value of the motion vector, the GOP size, and the reference structure conform to the motion vector restriction information, the GOP size restriction information, and the reference structure designation information. Therefore, in motion compensation prediction processing using inter-frame reference, the range over which the information of a specific area of the decoded image of a certain frame propagates to subsequent frames via inter-frame reference can be suppressed to a specific range. Moreover, since the variable length encoding unit 23 multiplexes the motion vector restriction information, the GOP size restriction information, and the reference structure designation information into the entire region bit stream as hint information, the propagation range of the decoded image information can be explicitly transmitted to the moving picture transcoding device 2.

On the other hand, in the moving picture transcoding device 2, when the entire region bit stream output from the moving image encoding device 1 is received, the all region stream decoding unit 3 decodes the encoded data of the entire region, the encoding parameters, the hint information, and the decoded image before the loop filter processing, and outputs them to the partial region transcoding unit 4. The partial region transcoding unit 4 specifies, based on the input display area information and hint information, the essential encoding area, which is the area that the decoded image of each frame needs to belong to, and the transcoding target area, which determines the image size of the partial area bit stream. An encoding target block that belongs to the essential encoding area and does not need to refer to information outside the area is assigned the encoding parameters output from the all region stream decoding unit 3; an encoding target block that belongs to the essential encoding area but needs to refer to information outside the area has its encoded data and encoding parameters regenerated; and an encoding target block outside the essential encoding area but inside the transcoding target area is assigned dummy encoding parameters with a small code amount. The encoded data and encoding parameters of the encoding target blocks in the transcoding target area thus allocated are multiplexed, together with appropriate header information, as a partial area bit stream. Consequently, of the decoded image of the entire region bit stream input to the video transcoding device 2, an image that is the same as or close to the partial area corresponding to the display area information can be decoded, and a partial area bit stream whose size is smaller than that of the entire region bit stream can be obtained with a low amount of computation.

Embodiment 2.
In the second embodiment, an example in which the moving picture coding apparatus and the moving picture transcoding apparatus shown in the first embodiment are applied to a system different from the first embodiment will be described.

FIG. 11 is a block diagram showing a system to which a moving picture coding apparatus and a moving picture transcoding apparatus according to Embodiment 2 of the present invention are applied.
In FIG. 11, the moving image encoding device 51 has the same function as the moving image encoding device 1 of FIG. 2, and outputs the created entire area stream to the moving image distribution device 53 or the storage 52.

The moving image distribution device 53 includes an all region stream decoding unit 54, a partial region transcoding unit 55, and a distribution control unit 56, and has a function of generating a partial area stream based on the entire region bit stream generated by the moving image encoding device 51 and the display area designation information input from the moving image decoding devices 50-1 to 50-N, and outputting the generated partial area stream to the moving image decoding devices 50-1 to 50-N.
The all region stream decoding unit 54 has the same function as the all region stream decoding unit 3 described in Embodiment 1, and also has a function of outputting the generated entire area decoded image to the entire area display device 57.
The partial region transcoding unit 55 has a function equivalent to that of the partial region transcoding unit 4 described in Embodiment 1.
The distribution control unit 56 has a function of receiving the display area information output from the video decoding devices 50-1 to 50-N and outputting the display area information to the partial region transcoding unit 55. It also has a function of, when the partial region bit stream output from the partial region transcoding unit 55 is received, outputting the partial region bit stream to the moving picture decoding device that output the display area information used when generating that partial region bit stream.

The whole area display device 57 is a display device that displays the whole area decoded image output from the whole area stream decoding unit 54.
The moving picture decoding devices 50-1 to 50-N are devices that output display area information to the moving picture distribution device 53 and, based on the display area information, decode a partial region image from the partial area bit stream output from the moving picture distribution device 53 to generate a partial region decoded image.
The partial region display devices 51-1 to 51-N are display devices that display partial region decoded images corresponding to the moving image decoding devices 50-1 to 50-N, respectively.

As a specific operation example, an example will be described in which the moving image distribution device 53 is incorporated in a monitoring camera recorder that stores high-resolution monitoring video.
In this case, the moving image encoding device 51 is an encoder device on the side of the monitoring camera, which can acquire the high-resolution video and supplies the monitoring video data to the camera recorder, and it generates the entire region bit stream to be distributed. The entire region bit stream generated by the moving image encoding device 51, which is an encoder device, is stored in the storage 52 built into the camera recorder.

The camera recorder can decode the entire region bit stream stored in the storage 52 with the all region stream decoding unit 54 and display the generated entire area decoded image on the entire area display device 57 directly connected to the camera recorder.
Moreover, the camera recorder here can distribute the monitoring video data to the display terminals (tablet terminals, smartphones, PCs, etc.) of a plurality of users in remote locations. The monitoring video data is distributed to a remote user's display terminal via a predetermined transmission system, but depending on the transmission capacity of the transmission system, it may be difficult to transmit the entire region bit stream. In this system, when a user operates a display terminal to specify an arbitrary display area and requests monitoring video data, the display area information indicating the display area is input to the moving image distribution device 53 via the predetermined transmission system, a partial area bit stream including the encoding parameters and the like necessary for reproduction of the display area specified by the user is generated, and the partial area bit stream is sent to the requesting display terminal via the predetermined transmission system.

By designating only the necessary area of the monitoring video, the amount of data to be transmitted can be controlled and the monitoring video can be viewed at a remote place. Further, since the display area can be specified individually for each user, for example, a user who can use a transmission path with a large transmission capacity can specify a larger display area.
Each of the moving image decoding devices 50-1 to 50-N is built into the display terminal of the corresponding user; each moving image decoding device 50-1 to 50-N receives the partial area bit stream transmitted from the moving image distribution device 53 and decodes the partial region decoded image from the partial area bit stream, whereby the partial region decoded image is displayed on each display terminal.

As described above, by using the system shown in FIG. 11, a user can view the high-resolution monitoring video on the display device directly connected to the camera recorder and, at a remote location via a predetermined transmission system, can view the monitoring video while suppressing the amount of data to be transmitted by designating only the necessary area. The designated display area can also be changed for each user.

Embodiment 3.
In the third embodiment, a moving picture stream transmission system for more efficiently operating the moving picture coding apparatus and the moving picture transcoding apparatus shown in the first and second embodiments will be described.
In the third embodiment, it is assumed that the entire area image is divided into sub-picture units such as slices and tiles.
FIG. 12 is an explanatory diagram showing an example in which the entire region image is divided into six sub-pictures (Sub-pic).

FIG. 13 is a block diagram showing a moving picture stream transmission system according to Embodiment 3 of the present invention. In FIG. 13, the same reference numerals as in the preceding figures denote the same or corresponding parts.
The video encoding device 1 is the video encoding device 1 shown in the first embodiment (or the video encoding device 51 shown in the second embodiment); it generates, for each sub-picture, a bit stream in which the hint information is multiplexed (a sub-picture unit bit stream), combines the sub-picture unit bit streams for the entire area image, and outputs an all-region bit stream (all-region stream) as the bit stream of the entire area image.
In the example of FIG. 12, since the entire area image is divided into six sub-pictures, an all-area bit stream in which the bit streams of the six sub-pictures are collected is output.

The MUXER 61 is a multiplexing transmission device that multiplexes the entire region bit stream output from the moving picture encoding device 1 and sub-picture information, which indicates the sub-picture division state in the entire area image and the data positions of the sub-picture unit bit streams included in the entire region bit stream, into a multiplexed signal of a preset transmission format, and transmits the multiplexed signal.
The DEMUXER 62 is a demultiplexer that receives the multiplexed signal transmitted by the MUXER 61, separates the entire region bit stream and the sub-picture information included in the multiplexed signal, and, referring to the sub-picture information and the display area information indicating the sub-picture to be decoded, extracts the bit stream of the decoding target sub-picture from the entire region bit stream.

Next, the operation will be described.
For example, when the entire area image is divided into six sub-pictures (Sub-pic) as shown in FIG. 12, an all-region bit stream in which the bit streams of the six sub-pictures are collected is output from the moving image encoding device 1.
At this time, when the moving picture decoding device 6 decodes only a part of the entire area image, display area information indicating the decoding target sub-pictures is input to the DEMUXER 62 and the moving picture transcoding device 2.
In the example of FIG. 12, the area surrounded by the dotted line (the essential encoding area) is the area to be decoded, and the display area information indicates that the sub-pictures to which the area surrounded by the dotted line belongs are Sub-pic1 and Sub-pic4.
On the decoding device side, each sub-picture needs to be encoded in an encoding unit that can be decoded independently (for example, a NAL unit in HEVC or H.264). In Embodiment 3, since it is assumed that the video decoding device 6 can decode in units of NAL units, the moving picture decoding device 6 can decode the decoding target sub-pictures even if the video transcoding device 2 generates a bit stream using only the NAL units corresponding to Sub-pic1 and Sub-pic4.

The MUXER 61 receives, from the outside, input of sub-picture information indicating the sub-picture division state in the entire area image and the data positions of the sub-picture unit bit streams included in the entire region bit stream.
In the example of FIG. 12, the sub-picture information indicates where Sub-pic1 to Sub-pic6 exist in the entire area image, and at which positions in the entire region bit stream the bit streams corresponding to Sub-pic1 to Sub-pic6 exist.
Here, an example is shown in which the sub-picture information includes information indicating the division state and data position information, but the sub-picture information may include other information.

When the MUXER 61 receives the entire region bit stream from the moving image encoding device 1, it multiplexes the entire region bit stream and the sub-picture information into a multiplexed signal having a preset transmission format and transmits the multiplexed signal.
As the transmission format here, for example, a transmission format defined by MPEG-2 TS, MMT (MPEG Media Transport), or the like is conceivable; the above sub-picture information is multiplexed with the entire region bit stream as a descriptor of these transmission systems. However, since this transmission format is only an example, other transmission formats may be used.

When the DEMUXER 62 receives the multiplexed signal transmitted by the MUXER 61, the DEMUXER 62 separates the entire area bit stream and the sub-picture information included in the multiplexed signal.
Further, the DEMUXER 62 identifies the decoding target sub-pictures by referring to the display area information, given from the outside, indicating the sub-pictures to be decoded. In the example of FIG. 12, Sub-pic1 and Sub-pic4 are identified as the sub-pictures to be decoded.
When the DEMUXER 62 has identified the decoding target sub-pictures, it identifies the bit streams of the decoding target sub-pictures included in the all-region bit stream with reference to the sub-picture information separated from the multiplexed signal, and extracts the bit streams of the decoding target sub-pictures from the all-region bit stream.
In the example of FIG. 12, bit streams (VCL-NAL1, VCL-NAL4) corresponding to Sub-pic1 and Sub-pic4 are extracted.
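The extraction step reduces to a byte-range lookup: the sub-picture information maps each sub-picture to the position of its bit stream inside the all-region stream. A minimal sketch under that assumption (the data layout is hypothetical; a real implementation would parse NAL units):

```python
def extract_subpicture_streams(all_region_stream, subpic_info, targets):
    """Cut the bit streams of the decoding target sub-pictures out of the
    all-region bit stream.  `subpic_info` maps a sub-picture id to an
    (offset, length) pair within the stream; `targets` lists the ids
    named by the display area information (e.g. Sub-pic1 and Sub-pic4)."""
    return {sp: all_region_stream[off:off + ln]
            for sp, (off, ln) in subpic_info.items() if sp in targets}
```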

When the DEMUXER 62 has extracted the bit streams of the sub-pictures to be decoded, the moving picture transcoding device 2 generates a partial area bit stream from these bit streams, as in the first embodiment, and outputs the partial area bit stream to the video decoding device 6.

As is apparent from the above, according to the third embodiment, there are provided the MUXER 61, which multiplexes the entire region bit stream output from the moving image encoding device 1 and the sub-picture information indicating the sub-picture division state in the entire area image and the data positions of the sub-picture unit bit streams included in the entire region bit stream into a multiplexed signal of a preset transmission format and transmits the multiplexed signal, and the DEMUXER 62, which receives the multiplexed signal transmitted by the MUXER 61, separates the entire region bit stream and the sub-picture information included in the multiplexed signal, and extracts the bit stream of the decoding target sub-picture from the entire region bit stream with reference to the sub-picture information and the display area information indicating the sub-picture to be decoded. Therefore, among the all-region bit stream generated by the moving image encoding device 1, only the bit streams necessary for decoding need be transmitted, so that the bit stream transmission amount can be reduced.

Embodiment 3 has been described with the DEMUXER 62 outputting the bit streams extracted from the all-region bit stream to the video transcoding device 2; however, the video transcoding device 2 may be omitted as shown in FIG., and the bit streams extracted from the all-region bit stream output directly to the video decoding device 6.
In this case, the bit stream input to the video decoding device 6 is larger than in the configuration of FIG. 13, but since no transcoding is performed by the video transcoding device 2, the processing can be carried out at higher speed.

Embodiment 4.
In Embodiment 3, the DEMUXER 62 extracts the bit streams of the sub-pictures to be decoded from the all-region bit stream with reference to the sub-picture information and the display area information. Instead, as shown in FIG., the MUXER 61 may extract the bit streams of the sub-pictures to be decoded from the all-region bit stream with reference to the sub-picture information and the display area information.

In this case, the MUXER 61 multiplexes the bit streams of the decoding target sub-pictures extracted from the all-region bit stream into a multiplexed signal of a preset transmission format, and transmits the multiplexed signal to the DEMUXER 62.
The DEMUXER 62 receives the multiplexed signal transmitted by the MUXER 61, separates the bit streams of the sub-pictures to be decoded from the multiplexed signal, and outputs those bit streams to the video transcoding device 2 or the video decoding device 6.
According to the fourth embodiment, the bit stream transmission amount can be further reduced as compared with the third embodiment.
Note that the display area information acquired by the MUXER 61 may be obtained, for example, from the video decoding device 6 on the decoding side, from the video encoding device 1 on the transmission side, or from a user's input.
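The difference between Embodiments 3 and 4 can be illustrated with toy numbers: Embodiment 3 transmits every sub-picture bit stream and discards the unneeded ones at the DEMUXER 62, while Embodiment 4 filters at the MUXER 61 so that only the decoding targets travel over the link. The sizes below are made-up values for illustration only.

```python
# Assumed per-sub-picture compressed sizes in bytes (toy values)
subpic_sizes = {"Sub-pic1": 40_000, "Sub-pic2": 55_000,
                "Sub-pic3": 38_000, "Sub-pic4": 47_000}
targets = ["Sub-pic1", "Sub-pic4"]   # sub-pictures to be decoded

# Embodiment 3: the whole all-region stream goes over the link
sent_embodiment3 = sum(subpic_sizes.values())
# Embodiment 4: the MUXER filters first, so only the targets are sent
sent_embodiment4 = sum(subpic_sizes[t] for t in targets)

saving = 1 - sent_embodiment4 / sent_embodiment3   # fraction of traffic avoided
```

With these toy values roughly half of the transmission amount is avoided; the actual saving depends entirely on how much of the all-region image falls outside the display area.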

Within the scope of the invention, the embodiments may be freely combined, and any component of any embodiment may be modified or omitted.

The video encoding device according to the present invention is suitable for applications that need to generate an efficient partial area bit stream with a small amount of computation, without reducing the compression efficiency of the all-region bit stream.

1 video encoding device, 2 video transcoding device, 3 all-region stream decoding unit (essential coding area specifying means), 4 partial-region transcoding unit (essential coding area specifying means, parameter extraction means, partial area stream generation means), 5 all-region display device, 6 video decoding device, 7 partial-region display device, 11 encoding control unit (predicted image generation means), 12 block division unit (predicted image generation means), 13 changeover switch (predicted image generation means), 14 intra prediction unit (predicted image generation means), 15 motion compensation prediction unit (predicted image generation means), 16 PCM encoding unit, 17 subtraction unit (bit stream generation means), 18 transform/quantization unit (bit stream generation means), 19 inverse quantization/inverse transform unit, 20 addition unit, 21 loop filter unit, 22 frame memory, 23 variable length encoding unit (bit stream generation means), 31 variable length code decoding unit, 32 changeover switch, 33 intra prediction unit, 34 motion compensation unit, 35 PCM decoding unit, 36 inverse quantization/inverse transform unit, 37 addition unit, 38 loop filter unit, 39 frame memory, 41 transcode control unit, 42 encoding parameter extraction unit, 43 external reference block encoding unit, 44 unnecessary block encoding unit, 45 changeover switch, 46 variable length encoding unit, 51 video encoding device, 52 storage, 53 video distribution device, 54 all-region stream decoding unit, 55 partial-region transcoding unit, 56 distribution control unit, 57 all-region display device, 50-1 to 50-N video decoding devices, 51-1 to 51-N partial-region display devices, 61 MUXER (multiplexing transmission device), 62 DEMUXER (demultiplexing device).

Claims (13)

  1. A moving picture encoding device comprising:
    predicted image generation means for determining an encoding parameter for an encoding target block in a picture belonging to a GOP (Group Of Pictures), and generating a predicted image using the encoding parameter; and
    bit stream generation means for compression-encoding a difference image between the encoding target block and the predicted image generated by the predicted image generation means, and multiplexing the resulting encoded data and the encoding parameter to generate a bit stream,
    wherein the bit stream generation means multiplexes into the bit stream hint information including motion vector restriction information indicating a searchable range of motion vectors, GOP size restriction information indicating a GOP size that is the number of pictures belonging to the GOP, and reference structure designation information indicating the pictures referred to when decoding each picture belonging to the GOP.
  2. The moving picture encoding device according to claim 1, wherein, when the encoding mode for the encoding target block is an inter encoding mode, the predicted image generation means searches for a motion vector within the area indicated by the motion vector restriction information and generates a predicted image by performing prediction processing on the encoding target block using the motion vector and the encoding parameter.
  3. A moving picture transcoding device comprising:
    essential coding area specifying means for extracting hint information from the bit stream generated by the moving picture encoding device according to claim 1 and, with reference to the motion vector restriction information, GOP size restriction information, and reference structure designation information included in the hint information, specifying an essential coding area that is the area necessary for decoding the display area of a picture indicated by display area information given from outside;
    parameter extraction means for extracting, from the bit stream generated by the moving picture encoding device, the encoded data and encoding parameters of the encoding target blocks included in the essential coding area specified by the essential coding area specifying means; and
    partial area stream generation means for generating, from the encoded data and encoding parameters extracted by the parameter extraction means, a partial area stream that conforms to a preset encoding codec.
  4. The moving picture transcoding device according to claim 3, wherein the parameter extraction means comprises:
    an encoding parameter extraction unit that, when an encoding target block included in the essential coding area specified by the essential coding area specifying means is not an external reference block, that is, a block intra-encoded with reference to pixel values outside the essential coding area, extracts the encoded data and encoding parameters of the encoding target block from the bit stream generated by the moving picture encoding device and outputs them;
    an external reference block encoding unit that, when an encoding target block included in the essential coding area is such an external reference block, encodes the decoded image of the encoding target block by an encoding method that does not use pixel values outside the essential coding area for prediction reference, and outputs the resulting encoded data and the encoding parameters used to encode the decoded image; and
    a changeover switch that selects either the encoded data and encoding parameters output from the encoding parameter extraction unit or the encoded data and encoding parameters output from the external reference block encoding unit, and outputs the selected encoded data and encoding parameters to the partial area stream generation means.
  5. The moving picture transcoding device according to claim 4, wherein the external reference block encoding unit generates an intra predicted image by an intra encoding scheme that refers to pixel values at the screen edge of the encoding target block, compression-encodes the difference image between the decoded image of the encoding target block and the intra predicted image, and outputs the resulting encoded data and the encoding parameters used to generate the intra predicted image.
  6. The moving picture transcoding device according to claim 4, wherein the external reference block encoding unit encodes the decoded image of the encoding target block by PCM (Pulse Code Modulation) and outputs the resulting encoded data and PCM encoding parameters.
  7. The moving picture transcoding device according to claim 4, wherein:
    the parameter extraction means further comprises an unnecessary block encoding unit that, when the essential coding areas of the pictures belonging to the GOP differ in size, designates a transcoding target area from among those essential coding areas on the basis of their sizes and, in each picture, encodes in the skip mode of the inter encoding scheme each encoding target block that is outside the designated essential coding area but inside the transcoding target area, outputting the resulting encoded data and the encoding parameters used to encode the block; and
    the changeover switch selects the encoded data and encoding parameters output from the encoding parameter extraction unit, the encoded data and encoding parameters output from the external reference block encoding unit, or the encoded data and encoding parameters output from the unnecessary block encoding unit, and outputs the selected encoded data and encoding parameters to the partial area stream generation means.
  8. A moving picture encoding method comprising:
    a step in which predicted image generation means determines an encoding parameter for an encoding target block in a picture belonging to a GOP and generates a predicted image using the encoding parameter; and
    a step in which bit stream generation means compression-encodes a difference image between the encoding target block and the predicted image, and multiplexes the resulting encoded data and the encoding parameter to generate a bit stream,
    wherein the bit stream generation means multiplexes into the bit stream hint information including motion vector restriction information indicating a searchable range of motion vectors, GOP size restriction information indicating a GOP size that is the number of pictures belonging to the GOP, and reference structure designation information indicating the pictures referred to when decoding each picture belonging to the GOP.
  9. A moving picture transcoding method comprising:
    a step in which essential coding area specifying means extracts hint information from the bit stream generated by the moving picture encoding method according to claim 8 and, with reference to the motion vector restriction information, GOP size restriction information, and reference structure designation information included in the hint information, specifies an essential coding area that is the area necessary for decoding the display area of a picture indicated by display area information given from outside;
    a step in which parameter extraction means extracts, from the bit stream generated by the moving picture encoding method, the encoded data and encoding parameters of the encoding target blocks included in the essential coding area; and
    a step in which partial area stream generation means generates, from the encoded data and encoding parameters extracted by the parameter extraction means, a partial area stream that conforms to a preset encoding codec.
  10. The moving picture encoding device according to claim 1, wherein, when the encoding target block is a block of a sub-picture obtained by dividing an all-region image in units of sub-pictures, the bit stream generation means generates a sub-picture-unit bit stream into which the hint information is multiplexed, collects the sub-picture-unit bit streams for the whole all-region image, and outputs an all-region stream that is the bit stream of the all-region image.
  11. A moving picture stream transmission system comprising:
    the moving picture encoding device according to claim 10;
    a multiplexing transmission device that multiplexes the all-region stream output from the moving picture encoding device, together with sub-picture information indicating the sub-picture division state of the all-region image and the data positions of the sub-picture-unit bit streams included in the all-region stream, into a multiplexed signal of a preset transmission format, and transmits the multiplexed signal; and
    a demultiplexing device that receives the multiplexed signal transmitted by the multiplexing transmission device, separates the all-region stream and the sub-picture information from the multiplexed signal, and extracts the bit streams of the sub-pictures to be decoded from the all-region stream with reference to the sub-picture information and display area information indicating the sub-pictures to be decoded.
  12. A moving picture stream transmission system comprising:
    the moving picture encoding device according to claim 10;
    a multiplexing transmission device that extracts the bit streams of the sub-pictures to be decoded from the all-region stream output from the moving picture encoding device, with reference to sub-picture information indicating the sub-picture division state of the all-region image and the data positions of the sub-picture-unit bit streams included in the all-region stream, multiplexes the extracted bit streams into a multiplexed signal of a preset transmission format, and transmits the multiplexed signal; and
    a demultiplexing device that receives the multiplexed signal transmitted by the multiplexing transmission device and separates the bit streams of the sub-pictures to be decoded from the multiplexed signal.
  13. The moving picture stream transmission system according to claim 12, wherein the multiplexing transmission device acquires the display area information from a moving picture decoding device that decodes the bit streams of the sub-pictures to be decoded.
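The hint information recited in claims 1, 8, and the claims depending on them can be pictured as a small data structure carried alongside the bit stream, which a transcoder uses to bound how far the essential coding area must extend beyond the requested display area. The field names and the simple per-hop margin rule below are illustrative assumptions, not the patent's syntax or exact procedure.

```python
from dataclasses import dataclass, field

@dataclass
class HintInfo:
    mv_search_range: int   # motion vector restriction: max search range in pixels
    gop_size: int          # GOP size restriction: number of pictures per GOP
    # reference structure designation: picture index -> referenced picture indices
    ref_structure: dict = field(default_factory=dict)

def essential_margin(hint: HintInfo, ref_depth: int) -> int:
    """Worst-case growth (in pixels) of the area needed for decoding when
    motion vectors may chain through `ref_depth` reference pictures."""
    return hint.mv_search_range * ref_depth

hint = HintInfo(mv_search_range=64, gop_size=8, ref_structure={1: [0], 2: [0, 1]})
margin = essential_margin(hint, ref_depth=3)   # 64 px per reference hop, 3 hops
```

Because the motion vector range and the reference structure are both bounded by the hint information, the essential coding area can be computed without decoding the stream, which is what allows the lightweight partial-area transcoding described in the claims.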
PCT/JP2014/073532 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system WO2015034061A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2013-185196 2013-09-06
JP2013185196 2013-09-06

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
EP14842955.8A EP3043560A4 (en) 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system
US14/916,914 US20160234523A1 (en) 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method, and video stream transmission system
JP2014073532A JPWO2015034061A1 (en) 2013-09-06 2014-09-05 Moving picture coding apparatus, moving picture transcoding apparatus, moving picture coding method, moving picture transcoding method, and moving picture stream transmission system
KR1020167008909A KR20160054530A (en) 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system
CN201480048963.7A CN105519117A (en) 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system

Publications (1)

Publication Number Publication Date
WO2015034061A1 true WO2015034061A1 (en) 2015-03-12

Family

ID=52628522

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/073532 WO2015034061A1 (en) 2013-09-06 2014-09-05 Video encoding device, video transcoding device, video encoding method, video transcoding method and video stream transmission system

Country Status (6)

Country Link
US (1) US20160234523A1 (en)
EP (1) EP3043560A4 (en)
JP (1) JPWO2015034061A1 (en)
KR (1) KR20160054530A (en)
CN (1) CN105519117A (en)
WO (1) WO2015034061A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106162167A (en) * 2015-03-26 2016-11-23 中国科学院深圳先进技术研究院 Efficient video coding method based on study

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6421422B2 (en) * 2014-03-05 2018-11-14 日本電気株式会社 Video analysis device, monitoring device, monitoring system, and video analysis method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11225339A (en) * 1997-10-24 1999-08-17 Matsushita Electric Ind Co Ltd Method for elegant degeneracy about calculation in audio visual compression system
JP2002044622A (en) * 2000-03-13 2002-02-08 Sony Corp Method and device for supplying contents, recording medium, method and device for generating signal, method and device for conversion, and play-back terminal and its method
JP2002152749A (en) * 2000-11-09 2002-05-24 Matsushita Electric Ind Co Ltd Image re-coder
JP2003527005A (en) * 2000-03-13 2003-09-09 ソニー株式会社 Apparatus and method for generating a reduced transcoding hints metadata are
JP2005341093A (en) * 2004-05-26 2005-12-08 Mitsubishi Electric Corp Contents adaptating apparatus, contents adaptation system, and contents adaptation method
JP2007104231A (en) * 2005-10-04 2007-04-19 Hitachi Ltd Transcoder, recording apparatus, transcode method
WO2012060459A1 (en) 2010-11-01 2012-05-10 日本電気株式会社 Dynamic image distribution system, dynamic image distribution method, and dynamic image distribution program

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5055078B2 (en) * 2007-10-01 2012-10-24 キヤノン株式会社 Image processing apparatus and method
CN101453642B (en) * 2007-11-30 2012-12-26 华为技术有限公司 Method, apparatus and system for image encoding/decoding
US8270473B2 (en) * 2009-06-12 2012-09-18 Microsoft Corporation Motion based dynamic resolution multiple bit rate video encoding
US9060174B2 (en) * 2010-12-28 2015-06-16 Fish Dive, Inc. Method and system for selectively breaking prediction in video coding
US10200689B2 (en) * 2011-03-04 2019-02-05 Qualcomm Incorporated Quantized pulse code modulation in video coding
US9414086B2 (en) * 2011-06-04 2016-08-09 Apple Inc. Partial frame utilization in video codecs
JP5678807B2 (en) * 2011-06-09 2015-03-04 富士通セミコンダクター株式会社 Video / audio data processing apparatus and data multiplexing method
WO2013001730A1 (en) * 2011-06-30 2013-01-03 三菱電機株式会社 Image encoding apparatus, image decoding apparatus, image encoding method and image decoding method
US9131245B2 (en) * 2011-09-23 2015-09-08 Qualcomm Incorporated Reference picture list construction for video coding
US10244246B2 (en) * 2012-02-02 2019-03-26 Texas Instruments Incorporated Sub-pictures for pixel rate balancing on multi-core platforms
JP5197864B2 (en) * 2012-04-12 2013-05-15 株式会社東芝 Image decoding method and apparatus
KR101968070B1 (en) * 2012-10-12 2019-04-10 캐논 가부시끼가이샤 Method for streaming data, method for providing data, method for obtaining data, computer-readable storage medium, server device, and client device
US10021414B2 (en) * 2013-01-04 2018-07-10 Qualcomm Incorporated Bitstream constraints and motion vector restriction for inter-view or inter-layer reference pictures
US20160165309A1 (en) * 2013-07-29 2016-06-09 Koninklijke Kpn N.V. Providing tile video streams to a client

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH11225339A (en) * 1997-10-24 1999-08-17 Matsushita Electric Ind Co Ltd Method for elegant degeneracy about calculation in audio visual compression system
JP2002044622A (en) * 2000-03-13 2002-02-08 Sony Corp Method and device for supplying contents, recording medium, method and device for generating signal, method and device for conversion, and play-back terminal and its method
JP2003527005A (en) * 2000-03-13 2003-09-09 ソニー株式会社 Apparatus and method for generating a reduced transcoding hints metadata are
JP2002152749A (en) * 2000-11-09 2002-05-24 Matsushita Electric Ind Co Ltd Image re-coder
JP2005341093A (en) * 2004-05-26 2005-12-08 Mitsubishi Electric Corp Contents adaptating apparatus, contents adaptation system, and contents adaptation method
JP2007104231A (en) * 2005-10-04 2007-04-19 Hitachi Ltd Transcoder, recording apparatus, transcode method
WO2012060459A1 (en) 2010-11-01 2012-05-10 日本電気株式会社 Dynamic image distribution system, dynamic image distribution method, and dynamic image distribution program

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106162167A (en) * 2015-03-26 2016-11-23 中国科学院深圳先进技术研究院 Efficient video coding method based on study

Also Published As

Publication number Publication date
CN105519117A (en) 2016-04-20
US20160234523A1 (en) 2016-08-11
EP3043560A1 (en) 2016-07-13
EP3043560A4 (en) 2017-03-01
KR20160054530A (en) 2016-05-16
JPWO2015034061A1 (en) 2017-03-02

Similar Documents

Publication Publication Date Title
US7426308B2 (en) Intraframe and interframe interlace coding and decoding
KR100943912B1 (en) Method and apparatus for processing multiview video
KR101108661B1 (en) Method for coding motion in a video sequence
TWI526055B (en) Motion vector prediction in video coding
US6421465B2 (en) Method for computational graceful degradation in an audiovisual compression system
JP4193406B2 (en) Video data conversion apparatus and video data conversion method
KR100956478B1 (en) Moving image decoding device and moving image decoding mehtod
US6427027B1 (en) Picture encoding and/or decoding apparatus and method for providing scalability of a video object whose position changes with time and a recording medium having the same recorded thereon
KR101053628B1 (en) Scalable encoding and decoding method of video signal
JP5047995B2 (en) Video intra prediction encoding and decoding method and apparatus
KR101177031B1 (en) Method and apparatus for minimizing number of reference pictures used for inter-coding
JP2006262004A (en) Dynamic image encoding/decoding method and device
EP1429564A1 (en) Moving picture encoding/transmission system, moving picture encoding/transmission method, and encoding apparatus, decoding apparatus, encoding method, decoding method, and program usable for the same
JP4542447B2 (en) Image encoding / decoding device, encoding / decoding program, and encoding / decoding method
KR101228651B1 (en) Method and apparatus for performing motion estimation
EP1618744B1 (en) Video transcoding
KR20130020697A (en) Dynamic image encoding device and dynamic image decoding device
JP3413720B2 (en) Picture coding method and apparatus, and image decoding method and apparatus
US20100118945A1 (en) Method and apparatus for video encoding and decoding
JP2009303264A (en) Image encoding device, image decoding device, image encoding method, and image decoding method
US20070098067A1 (en) Method and apparatus for video encoding/decoding
EP2437499A1 (en) Video encoder, video decoder, video encoding method, and video decoding method
JP5289440B2 (en) Image encoding device, image decoding device, image encoding method, and image decoding method
JPH10257502A (en) Hierarchical image encoding method, hierarchical image multiplexing method, hierarchical image decoding method and device therefor
JP2010135864A (en) Image encoding method, device, image decoding method, and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14842955

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase in:

Ref document number: 2015535533

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 14916914

Country of ref document: US

NENP Non-entry into the national phase in:

Ref country code: DE

REEP

Ref document number: 2014842955

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2014842955

Country of ref document: EP

ENP Entry into the national phase in:

Ref document number: 20167008909

Country of ref document: KR

Kind code of ref document: A