WO2021002303A1 - Information processing device, information processing method, reproduction processing device, and reproduction processing method - Google Patents


Info

Publication number: WO2021002303A1
Authority: WO (WIPO (PCT))
Prior art keywords: information, gra, area, file, display
Application number: PCT/JP2020/025379
Other languages: English (en), Japanese (ja)
Inventors: 平林 光浩 (Mitsuhiro Hirabayashi), 遼平 高橋 (Ryohei Takahashi), 優 池田 (Yu Ikeda), 勇司 藤本 (Yuji Fujimoto), 矢ケ崎 陽一 (Yoichi Yagasaki)
Original Assignee: ソニー株式会社 (Sony Corporation)
Application filed by ソニー株式会社 (Sony Corporation)
Publication of WO2021002303A1 publication Critical patent/WO2021002303A1/fr

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/70: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by syntax aspects related to video coding, e.g. related to compression standards
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20: Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23: Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/236: Assembling of a multiplex stream, e.g. transport stream, by combining a video stream with other content or additional data, e.g. inserting a URL [Uniform Resource Locator] into a video stream, multiplexing software data into a video stream; Remultiplexing of multiplex streams; Insertion of stuffing bits into the multiplex stream, e.g. to obtain a constant bit-rate; Assembling of a packetised elementary stream

Definitions

  • the present invention relates to an information processing device, an information processing method, a reproduction processing device, and a reproduction processing method.
  • H.265/HEVC, which is one of the standard specifications for image coding methods, stipulates the following.
  • a sequence that corresponds to the entire compressed moving image contains a plurality of images, and each image is called a picture.
  • Each picture is divided into one or more slices.
  • a slice is the smallest decoding unit. Then, each slice is classified into one of I slice (Intra Slice), P slice (Predictive Slice) and B slice (Bipredictive Slice).
  • the I slice is a slice that is independently decoded without referring to other images.
  • a P-slice is a slice that is decoded by referencing a single other image.
  • a B slice is a slice that is decoded by referencing a plurality of other images.
  • the picture at the beginning of the sequence consisting of only I slices is called an IDR (Instantaneous Decoding Refresh) picture.
  • the IDR picture is identified by the value of the NAL (Network Abstraction Layer) unit type.
  • the pictures in the same sequence that follow the IDR picture do not refer to pictures that precede the IDR picture in decoding order; they are located after the IDR picture in both decoding order and display order (presentation order).
  • when attempting to randomly access a time point in the middle of the video of a certain coded stream, the video can be appropriately decoded from an IDR picture in the vicinity of the specified time point.
  • the random access is not a decoding process from the beginning of the stream, but a process of decoding and reproducing the stream from the middle of the stream.
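The random access behavior described above can be sketched in a small Python model: a player seeking to an arbitrary picture must first back up to the nearest preceding random access point (here an IDR picture) and start decoding there. All names in this sketch are illustrative assumptions, not taken from any codec API.

```python
from dataclasses import dataclass

@dataclass
class Picture:
    decode_order: int   # position in decoding order
    is_idr: bool        # True if this is an IDR (random access) picture

def find_random_access_point(pictures, target_index):
    """Return the index of the closest IDR picture at or before
    target_index, i.e. the picture from which decoding can safely start."""
    for i in range(target_index, -1, -1):
        if pictures[i].is_idr:
            return i
    return 0  # fall back to the start of the stream

# hypothetical stream with an IDR picture every 8 pictures
stream = [Picture(i, is_idr=(i % 8 == 0)) for i in range(20)]
start = find_random_access_point(stream, 13)  # seek near picture 13
```

A real player would then decode forward from `start` and discard output until the requested picture is reached.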
  • GRA: Gradual Random Access
  • a sync sample is stored in the sync sample box.
  • VVC: Versatile Video Coding
  • SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS.
  • the present disclosure provides an information processing device, an information processing method, a reproduction processing device, and a reproduction processing method that provide a user with a high-quality viewing experience.
  • the coding unit encodes an image in an image sequence to generate a coded stream.
  • the determination unit determines one or more decoding start images in the image sequence that can be used as the image to start decoding at the time of Gradual Random Access (GRA).
  • GRA: Gradual Random Access
  • the file generation unit inserts GRA information regarding the decoding start image determined by the determination unit into the header area of the file format including the header area and the data area, and inserts the coded stream into the data area.
  • Non-Patent Document 1 (above)
  • Non-Patent Document 2: ITU-T H.264, SERIES H: AUDIOVISUAL AND MULTIMEDIA SYSTEMS.
  • Non-Patent Document 3: m48053, Versatile Video Coding (Draft 5), B. Bross, J. Chen, S. Liu, Joint Video Experts Team (JVET) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 14th Meeting: Geneva, CH, 19-27 Mar. 2019
  • Non-Patent Document 4: m48054, Algorithm description for Versatile Video Coding and Test Model 5 (VTM 5), J. Chen, Y. Ye, S. Kim, Joint Video Experts Team (JVET) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 14th Meeting: Geneva, CH, 19-27 Mar. 2019
  • Non-Patent Document 5: m47100, AHG12: Loop filter disabled across virtual boundaries, S.-Y. Lin, L. Liu, J.-L. Lin, Y.-C. Chang, C.-C. Ju (MediaTek), P. Hanhart, Y.
  • Non-Patent Document 6: m47986, Gradual Random Access, S. Deshpande (Sharp), Y.-K. Wang, Hendry (Huawei), R. Sjoberg, M. Pettersson (Ericsson), L. Chen (MediaTek), Joint Video Experts Team (JVET) of ITU-T SG 16 WP 3 and ISO/IEC JTC 1/SC 29/WG 11, 14th Meeting: Geneva, CH, 19-27 Mar. 2019
  • Non-Patent Document 7: ISO/IEC 14496-12:2015, Information technology. Coding of audio-visual objects. Part 12: ISO base media file format
  • Non-Patent Document 8: ISO/IEC 14496-15:2017, Information technology. Coding of audio-visual objects. Part 15: Carriage of network abstraction layer (NAL) unit structured video in the ISO base media file format, 2017-02
  • CRA: Clean Random Access
  • BLA: Broken Link Access
  • a picture equivalent to IDR / CRA / BLA of HEVC can be stored in the sync sample box of the ISOBMFF file.
  • the GRA picture, however, displays only a part of the entire display image, so it is not appropriate to treat it in the same manner as an IDR picture, and it is difficult to store it in the sync sample box.
  • recovery_poc_cnt, which is the number of frames until recovery is completed, can be stored in the "roll" sample group of the ISOBMFF file and used as a roll-distance representing the period until the complete image can be displayed.
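As a hedged illustration of the mapping just described, the sketch below packs a recovery_poc_cnt value into the 16-bit signed roll_distance field used by the ISOBMFF 'roll' sample group entry. The helper name and the range check are assumptions, and the bytes produced are only the entry payload, not a complete box.

```python
import struct

def roll_recovery_entry(recovery_poc_cnt: int) -> bytes:
    """Serialize recovery_poc_cnt as the 16-bit signed roll_distance field
    of a 'roll' sample group entry (simplified; not a full ISOBMFF box)."""
    # A positive roll_distance means this many samples after the sync
    # point must be decoded before the output is complete.
    if not (0 < recovery_poc_cnt <= 0x7FFF):
        raise ValueError("recovery_poc_cnt out of range for 16-bit roll_distance")
    return struct.pack(">h", recovery_poc_cnt)

# e.g. recovery completes 5 frames after the random access point
payload = roll_recovery_entry(5)
```

A real writer would wrap this payload in a SampleGroupDescriptionBox together with its grouping type.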
  • the processing method of slices other than the I slice is implementation-dependent, and in that case the appearance of the content becomes device-dependent; the quality of the user's viewing experience may therefore be impaired.
  • FIG. 1 is a system configuration diagram of an example of a distribution system.
  • the distribution system 100 includes a file generation device 1 which is an information processing device, a client device 2 which is a reproduction processing device, and a Web server 3.
  • the file generation device 1, the client device 2, and the Web server 3 are connected to the network 4. Then, the file generation device 1, the client device 2, and the Web server 3 can communicate with each other via the network 4.
  • the distribution system 100 may include a plurality of file generation devices 1 and a plurality of client devices 2, respectively.
  • the file generation device 1 generates video content which is data for providing video.
  • the file generation device 1 uploads the generated video content to the Web server 3.
  • the Web server 3 provides the video content to the client device 2
  • the distribution system 100 can adopt another configuration.
  • the file generation device 1 may include the functions of the Web server 3, store the generated video content in its own device, and provide it to the client device 2.
  • the Web server 3 holds the video content uploaded from the file generation device 1. Then, the Web server 3 provides the designated video content according to the request from the client device 2.
  • the client device 2 transmits a video content transmission request to the Web server 3. Then, the client device 2 acquires the video content specified in the transmission request from the Web server 3. Then, the client device 2 decodes the video content to generate a video, and displays the video on a display device such as a monitor.
  • FIG. 2 is a block diagram of the file generator.
  • the file generation device 1 which is an information processing device has a file generation processing unit 10, a control unit 11, and a transmission unit 12.
  • the control unit 11 executes a process related to the control of the file generation processing unit 10.
  • the control unit 11 performs integrated control such as the operation timing of each unit of the file generation processing unit 10.
  • the file generation processing unit 10 includes a data acquisition unit 101, an encoding unit 102, a metadata generation unit 103, a determination unit 104, and a file generation unit 105.
  • the data acquisition unit 101 accepts the input of the original data of the video content for displaying the video.
  • the original data of the video content includes image data and control information of each image included in an image sequence which is a series of images.
  • the control information includes, for example, time information of each image data.
  • the data acquisition unit 101 outputs the image data included in the image sequence of the acquired video content to the coding unit 102. Further, the data acquisition unit 101 outputs the control information included in the original data of the acquired video content to the metadata generation unit 103.
  • the coding unit 102 receives input of the image data of each image included in the image sequence. Then, the coding unit 102 encodes the image data of each image in the image sequence to generate a coded stream. At this time, the coding unit 102 encodes the data so that pictures 111 to 116, which realize playback recovery of the picture by GRA as shown in FIG. 3, are formed.
  • FIG. 3 is a diagram for explaining the picture display process at the time of GRA.
  • the pictures 111 to 116 are images that enable playback recovery of the picture by using intra-stripe refresh areas.
  • the refresh area is an area that can be reproduced without referring to other images.
  • the clean area is an area that can be accurately reproduced by GRA.
  • the dirty area is an area that refers to a picture before the start of GRA, and is an area in which it is difficult to accurately reproduce the picture after the start of GRA.
  • Pictures 111 to 116 have refresh areas 121 to 126, respectively.
  • Refresh areas 121-126 include one or more slices. When all the refresh areas 121 to 126 are combined, an image that covers the entire area of one picture is obtained.
  • the picture 111 is the decoding start image from which decoding starts when playback of the picture is recovered by GRA; it is referred to here as a "GRA picture", and its playback start point is the random access point in GRA.
  • the picture 116 is the picture at which the entire screen of the picture has been recovered by the GRA started from the picture 111, and the playback start point of the picture 116 is called the recovery point.
  • Picture 111 has an intra-stripe refresh area 121.
  • the refresh area 121 directly corresponds to the clean area 131.
  • the area other than the refresh area 121 is the dirty area 141.
  • Picture 112 has an intra-stripe refresh area 122.
  • the picture 112 refers to the refresh area 121 of the picture 111, and together with the refresh area 122 this forms the clean area 132.
  • the area other than the clean area 132 is the dirty area 142.
  • the dirty area 142 is smaller than the dirty area 141 by the newly added refresh area 122.
  • Picture 113 has an intra-stripe refresh area 123.
  • the picture 113 refers to the refresh area 121 of the picture 111 and the refresh area 122 of the picture 112, and together with the refresh area 123 these form the clean area 133.
  • the area other than the clean area 133 is the dirty area 143.
  • the dirty area 143 is smaller than the dirty area 142 by the amount of the newly added refresh area 123.
  • the picture 114 has a clean area 134 including a refresh area 124 and a dirty area 144. Further, the picture 115 has a clean area 135 including a refresh area 125 and a dirty area 145.
  • the picture 116 serving as a recovery point has an intra-stripe refresh area 126.
  • the picture 116 refers to the refresh areas 121 to 125 of the pictures 111 to 115, and together with the refresh area 126 these form the clean area 136.
  • the clean area 136 covers the entire screen of the picture, and there is no dirty area. As a result, playback recovery of the picture is completed at the picture 116. In this way, in GRA, the screen of the picture gradually recovers for playback.
  • in FIG. 3, the refresh area is an area extending in the horizontal direction of the picture, and the clean area increases from the bottom to the top.
  • the shape and position of the refresh area are not particularly limited as long as they are continuous areas, and the order in which the areas in the clean area increase is not particularly limited.
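The progression of FIG. 3 can be modeled numerically: each picture from the GRA picture onward contributes one more refresh stripe, so the clean area is the union of the stripes decoded so far. The sketch below assumes equal-height stripes and is purely illustrative.

```python
def clean_area_fraction(num_stripes: int, pictures_decoded: int) -> float:
    """Fraction of the picture that is clean after decoding the given
    number of pictures from the GRA picture, assuming each picture adds
    one equal-height refresh stripe."""
    refreshed = min(pictures_decoded, num_stripes)
    return refreshed / num_stripes

# pictures 111-116 of FIG. 3: six stripes, clean area grows to the full screen
progression = [clean_area_fraction(6, n) for n in range(1, 7)]
# the dirty area is the complement and vanishes at the recovery point
dirty = [1.0 - c for c in progression]
```

The monotonic growth of `progression` mirrors how the dirty area shrinks picture by picture until the recovery point.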
  • FIG. 4 is a diagram showing a GRA picture standard adopted in JVET-N0865.
  • recovery_poc_cnt in line 152 of FIG. 4 is a value indicating the number of frames, counted from the random access point, at which playback recovery of the picture is completed, and this value can be used as the roll.
  • the coding unit 102 outputs a coded stream containing image data encoded so that GRA can be executed to the file generation unit 105. More specifically, a VCL buffer and a non-VCL buffer are provided between the encoding unit 102 and the file generation unit 105.
  • the image data includes visual data (video) and audio data (audio). The visual-side data output from the coding unit 102 is sent to the file generation unit 105 via the VCL buffer, and the audio-side data is sent to the file generation unit 105 via the non-VCL buffer.
  • the determination unit 104 confirms the encoding result of the coding unit 102. Then, the determination unit 104 identifies the GRA picture, which is the decoding start image in GRA, from the pictures included in the coded stream. Further, the determination unit 104 identifies the random access point and the recovery point of the GRA executed from the identified GRA picture. Then, the determination unit 104 obtains, as the roll, the number of frames from the frame next to the random access point to the frame of the recovery point. This number of frames corresponds to recovery_poc_cnt specified in JVET-N0865. After that, the determination unit 104 outputs the GRA picture information and the roll information to the file generation unit 105.
  • the metadata generation unit 103 receives input of control information from the data acquisition unit 101. Then, the metadata generation unit 103 generates metadata for image reproduction using the control information.
  • the metadata includes control information related to image generation and reproduction such as what kind of codec is used for compression.
  • the metadata generation unit 103 outputs the generated metadata to the file generation unit 105.
  • the file generation unit 105 receives from the coding unit 102 the input of the coded stream including the image data encoded so that GRA can be executed. Further, the file generation unit 105 receives the input of the metadata from the metadata generation unit 103. Further, the file generation unit 105 receives the input of the GRA picture information and the roll information from the determination unit 104.
  • FIG. 5 is a diagram showing an example of a sample group of GRA pictures.
  • the file generation unit 105 generates GRA information representing information about the GRA picture. For example, the file generation unit 105 generates a sample group of GRA pictures as the GRA information. In that case, the file generation unit 105 generates GraSyncSampleGroupEntry(), which is a new group of VisualSampleGroup, as the sample group of GRA pictures. Then, the file generation unit 105 sets the information about GRA in GraSyncSampleGroupEntry().
  • the file generation unit 105 sets the GRA picture in GraSyncSampleGroupEntry(), and roll_distance in GraSyncSampleGroupEntry() represents the roll in GRA. For example, the file generation unit 105 sets, as the value of roll_distance, the number of frames from the random access point of the GRA to the recovery point by using recovery_poc_cnt defined in JVET-N0865.
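As a rough sketch of the entry being described, the dataclass below models GraSyncSampleGroupEntry() with the fields the text mentions. The exact syntax is given in the patent's figures, which are not reproduced here, so every field name and the four-character code "gras" are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class GraSyncSampleGroupEntry:
    grouping_type: str            # four-character code of the sample group
    roll_distance: int            # frames from random access point to recovery point
    gradual_output_allowed: bool  # gradual display permission information
    gradual_output_type: int      # gradual display type (refresh-area transition)
    interpolation_mode: int       # dirty area interpolation information

def entry_from_recovery_poc_cnt(recovery_poc_cnt: int) -> GraSyncSampleGroupEntry:
    """Build the entry with roll_distance taken from recovery_poc_cnt,
    as the text describes for JVET-N0865-style GRA pictures."""
    return GraSyncSampleGroupEntry(
        grouping_type="gras",      # hypothetical 4CC
        roll_distance=recovery_poc_cnt,
        gradual_output_allowed=True,
        gradual_output_type=0,
        interpolation_mode=0,
    )

entry = entry_from_recovery_poc_cnt(5)
```

A file writer would serialize such an entry into the sample group description carried in the ISOBMFF header area.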
  • the file generation unit 105 provides gradual display permission information, which is information indicating whether or not to execute the gradual display (gradual output) that displays the clean area so that it gradually expands.
  • whether or not to permit the gradual display that displays the clean area so that it gradually expands may be preset in the file generation device 1 by the user, or the file generation unit 105 may receive an input from the user when setting the gradual display permission information.
  • the file generation unit 105 acquires information on how the refresh area transitions in the picture by using the image data included in the acquired coded stream. Then, the file generation unit 105 generates gradual display type information representing the transition of the display of the refresh area from the acquired information. Then, the file generation unit 105 sets the display control information regarding the clean area when executing GRA as GradualOutputInformationStruct() in GraSyncSampleGroupEntry().
  • the file generation unit 105 sets dirty area interpolation information indicating how to interpolate the dirty area as InterpolationStruct() in GraSyncSampleGroupEntry().
  • the information on how to interpolate the dirty area may be preset in the file generation device 1 by the user, for example, or the file generation unit 105 may receive an input from the user when setting the dirty area interpolation information.
  • the file generation unit 105 creates a file by storing the generated GRA picture sample group for each segment in the ISOBMFF file together with the image data and metadata included in the coded stream, and generates a segment file of the video content. Specifically, the file generation unit 105 generates an ISOBMFF file including video information (mdat) and management information (moov). mdat is a data area in the ISOBMFF file. Further, moov is a header area in ISOBMFF.
  • the file generation unit 105 stores GRA information, which is information about the GRA picture, in the moov of ISOBMFF. Specifically, the file generation unit 105 sets the GraSyncSampleGroupBox that stores the GraSyncSampleGroupEntry () in the moov of the ISOBMFF. For example, the file generation unit 105 sets the GraSyncSampleGroupBox in the BOX 161 in the moov indicated by the BOX 160, as shown in FIG. FIG. 6 is a diagram showing a storage example of the GraSyncSampleGroupBox.
  • FIG. 7 is a diagram showing a storage state of the GraSyncSampleGroupBox according to the presence or absence of a movie fragment.
  • when movie fragments are not used, the file generation unit 105 stores one GraSyncSampleGroupBox, indicated by BOX 171, in the moov.
  • when movie fragments are used, the file generation unit 105 stores one GraSyncSampleGroupBox in each moof, as indicated by BOX 181 to 183.
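The placement rule of FIG. 7 can be stated as a tiny helper: without movie fragments the single GraSyncSampleGroupBox goes in the moov, and with fragments one copy goes in each moof. The box containers here are plain strings standing in for real boxes, not an actual ISOBMFF writer.

```python
def place_gra_boxes(uses_fragments: bool, num_fragments: int = 0):
    """Return the list of container boxes that receive a
    GraSyncSampleGroupBox (illustrative stand-in for FIG. 7)."""
    if not uses_fragments:
        return ["moov"]           # one box in the movie header
    return ["moof"] * num_fragments  # one box per movie fragment

unfragmented = place_gra_boxes(False)
fragmented = place_gra_boxes(True, num_fragments=3)
```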
  • the file generation unit 105 outputs the segment file of the video content including the sample group of the GRA picture to the transmission unit 12.
  • the transmission unit 12 receives the input of the video data segment file from the file generation unit 105. Then, the transmission unit 12 uploads the acquired video data segment file to the Web server 3.
  • FIG. 8 is a block diagram of the client device. As shown in FIG. 8, the client device 2 has a reproduction processing unit 20 and a control unit 21.
  • the control unit 21 controls the operation of each unit of the reproduction processing unit 20.
  • the control unit 21 comprehensively controls the operation timing of each unit of the reproduction processing unit 20.
  • the control unit 21 receives an input of a command from the operator. Then, the control unit 21 controls the reproduction processing unit 20 according to a command input from the user using an input device (not shown).
  • control unit 21 receives an input of a random access instruction. Then, the control unit 21 causes the reproduction processing unit 20 to execute the random access. At that time, the control unit 21 causes the file processing unit 202 to determine whether or not the random access sample is GRA, and determines whether to execute GRA as random access or normal decoding processing. In this normal decoding process, random access using the IDR picture is executed.
  • the reproduction processing unit 20 decodes and displays the image data. Further, when the operator instructs the random access, the reproduction processing unit 20 receives the control from the control unit 21 and executes the random access. The details of the reproduction processing unit 20 will be described below.
  • the reproduction processing unit 20 includes a file acquisition unit 201, a file processing unit 202, a GRA information acquisition unit 203, a decoding processing unit 204, a display information generation unit 205, and a display unit 206.
  • the file acquisition unit 201 acquires the segment file of the video content to be reproduced from the Web server 3 according to the video reproduction instruction input from the user. Then, the file acquisition unit 201 outputs the segment file of the acquired video content to the file processing unit 202.
  • the file processing unit 202 receives the input of the segment file in which the data of the video content to be played is stored from the file acquisition unit 201.
  • the file processing unit 202 parses the acquired segment file. Then, the file processing unit 202 acquires image data and metadata. After that, the file processing unit 202 outputs the image data to the decoding processing unit 204. Further, the file processing unit 202 outputs the metadata to the display information generation unit 205.
  • the file processing unit 202 receives an instruction from the control unit 21 to confirm the random access sample. Then, the file processing unit 202 confirms whether or not there is a sample group of the GRA picture represented by the GraSyncSampleGroupEntryBox, confirms whether or not the random access sample is GRA, and whether or not to use GRA as the random access. To judge.
  • when GRA is not used, the file processing unit 202 executes the normal decoding process. In this case, the file processing unit 202 identifies the IDR picture that is the random access point corresponding to the random access specified by the user. Then, the file processing unit 202 outputs the image data following the identified IDR picture to the decoding processing unit 204, and causes random access using the IDR picture to be executed.
  • when GRA is used, the file processing unit 202 outputs the GraSyncSampleGroupEntryBox, which is the sample group of GRA pictures, to the GRA information acquisition unit 203. Further, the file processing unit 202 identifies the GRA picture that is the random access point corresponding to the random access specified by the user. Then, the file processing unit 202 outputs the image data following the identified GRA picture, and instructs the decoding processing unit 204 to execute the GRA.
  • the GRA information acquisition unit 203 acquires the GraSyncSampleGroupEntryBox, which is the sample group of GRA pictures, from the file processing unit 202 when a random access instruction is input by the user and GRA is used. Then, the GRA information acquisition unit 203 acquires the information of GradualOutputStruct(), the information of GradualOutputInformationStruct(), the information of InterpolationStruct(), and the information of roll_distance from GraSyncSampleGroupEntry(), which is the sample group of GRA pictures.
  • the GRA information acquisition unit 203 determines whether or not gradual display is permitted from the value of GradualOutputStruct(). When gradual display is permitted, the GRA information acquisition unit 203 acquires the gradual display type information from the value of GradualOutputInformationStruct(). Further, the GRA information acquisition unit 203 acquires the dirty area interpolation information from the value of InterpolationStruct(). Further, the GRA information acquisition unit 203 acquires, from the value of roll_distance, the roll, which is the number of frames from the picture next to the GRA picture to the picture serving as the recovery point. Then, the GRA information acquisition unit 203 outputs the gradual display type information, the dirty area interpolation information, and the roll information to the decoding processing unit 204 together with the instruction for gradual display.
  • when gradual display is not permitted, the GRA information acquisition unit 203 acquires, from the value of roll_distance, the roll, which is the number of frames from the frame next to the GRA picture to the picture serving as the recovery point. Then, the GRA information acquisition unit 203 outputs to the decoding processing unit 204, together with the roll information, an instruction for display after full-screen decoding, in which display starts after the entire screen of the picture can be decoded as the clean area.
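The two playback paths just described differ only in when the first frame is shown, which can be sketched as follows; the function and parameter names are illustrative assumptions, not from any real player API.

```python
def frames_before_first_output(gradual_output_allowed: bool,
                               roll_distance: int) -> int:
    """Frames that must be decoded before anything is displayed: zero for
    gradual display (the growing clean area is shown immediately), or the
    full roll for display after full-screen decoding."""
    if gradual_output_allowed:
        return 0                 # show the growing clean area right away
    return roll_distance         # wait until the full picture is recovered

start_gradual = frames_before_first_output(True, 5)
start_full = frames_before_first_output(False, 5)
```

Either way, decoding itself always starts at the GRA picture; only the moment of first display differs.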
  • the decoding processing unit 204 receives the input of the image data from the file processing unit 202. Then, the decoding processing unit 204 performs the decoding process on the acquired image data. After that, the decoding processing unit 204 outputs the decoded image data to the display information generation unit 205.
  • at the time of random access by the normal decoding process, the decoding processing unit 204 receives from the file processing unit 202 the input of the image data continuing from the IDR picture serving as the random access point. Then, the decoding processing unit 204 decodes the image data following the IDR picture and outputs the decoded image data to the display information generation unit 205. As a result, the image is reproduced from the IDR picture, which is the random access point designated by the user, and the random access is executed.
  • at the time of GRA, the decoding processing unit 204 receives from the file processing unit 202 the input of the image data following the GRA picture serving as the random access point.
  • the decoding processing unit 204 receives, together with the instruction for gradual display, the input of the gradual display type information, the dirty area interpolation information, and the roll information, which is the number of frames from the frame next to the GRA picture to the picture serving as the recovery point, from the GRA information acquisition unit 203.
  • the decoding processing unit 204 identifies what kind of gradual display is to be performed from the roll information and the gradual display type information. Then, the decoding processing unit 204 starts decoding from the image data of the GRA picture. After that, the decoding processing unit 204 decodes each frame so that the gradual display is executed while interpolating the dirty area according to the dirty area interpolation information until the last frame of the gradual display.
  • the decoding processing unit 204 sequentially outputs the image data from the decoded GRA picture to the last frame of the subsequent gradual display to the display information generation unit 205. After the output of the last frame of the gradual display is completed, the decoding processing unit 204 returns to the normal decoding processing. Then, the decoding processing unit 204 continues to output the image data obtained by normal decoding to the display information generation unit 205.
  • the decoding processing unit 204 receives the input of the instruction for display after full-screen decoding from the GRA information acquisition unit 203 together with the roll information. Then, the decoding processing unit 204 starts decoding from the GRA picture. Next, the decoding processing unit 204 identifies the last picture in the GRA from the number of frames specified as the roll. Then, the decoding processing unit 204 executes decoding up to the identified last picture, and generates decoded image data in which the entire screen of the picture is a clean area. After that, the decoding processing unit 204 outputs the decoded image data, in which the entire screen of the picture is a clean area, to the display information generation unit 205.
  • the display information generation unit 205 receives the input of the decoded image data from the decoding processing unit 204. Further, the display information generation unit 205 receives the input of metadata from the file processing unit 202. Then, the display information generation unit 205 generates a display image from the image data by using the information at the time specified in the metadata. After that, the display information generation unit 205 provides the generated display image to the display unit 206 for display.
  • the display information generation unit 205 receives input of image data following from the GRA picture that performs gradual display from the decoding processing unit 204. Then, the display information generation unit 205 generates a display image for gradual display according to the display method of the gradual display in which the clean area is designated and the processing method of the dirty area. Then, the display information generation unit 205 outputs the generated display image to the display unit 206 for display, thereby performing gradual display.
  • the display information generation unit 205 presents the user with information indicating that the display is gradual display.
  • the display information generation unit 205 may display information indicating that the gradual display is being displayed on the display unit 206.
  • the display information generation unit 205 receives from the decoding processing unit 204 the input of the image data decoded with the entire screen of the picture as a clean area. Then, the display information generation unit 205 generates a display image in which playback recovery of the entire screen is completed. Then, the display information generation unit 205 outputs the generated display image to the display unit 206 for display, so that the video content is displayed from the state in which playback recovery of the entire screen is completed.
  • the display unit 206 has a display device such as a monitor.
  • the display unit 206 receives the input of the display image generated by the display information generation unit 205. Then, the display unit 206 causes the display device to display the acquired display image.
  • FIG. 9 is a flowchart of a file generation process by the file generation device.
  • the data acquisition unit 101 acquires the original data of the video content from the Web server 3.
  • the original data includes image data and control information of a plurality of images.
  • the data acquisition unit 101 outputs the image data included in the acquired original data to the coding unit 102.
  • the data acquisition unit 101 outputs the control information included in the acquired original data to the metadata generation unit 103.
  • the coding unit 102 receives an input of image data from the data acquisition unit 101.
  • the coding unit 102 executes the coding of the image data so that the GRA can be executed (step S101).
  • the coding unit 102 outputs the coded image data to the file generation unit 105.
  • the metadata generation unit 103 generates metadata from the control information input from the data acquisition unit 101 and outputs the metadata to the file generation unit 105.
  • the determination unit 104 identifies, from the image data encoded by the coding unit 102, the GRA picture of the GRA and the roll, which is the number of frames from the frame following the GRA picture to the frame of the recovery point. After that, the determination unit 104 outputs the GRA picture information and the roll information to the file generation unit 105.
  • the file generation unit 105 receives the input of image data from the encoding unit 102. Further, the file generation unit 105 receives the input of the GRA picture information and the roll information from the determination unit 104. Then, the file generation unit 105 newly defines the GraSyncSampleGroupEntryBox as a sample group of the GRA picture. Next, the file generation unit 105 sets the roll information in the roll_distance of GraSyncSampleGroupEntry () (step S102).
  • the file generation unit 105 acquires the transition information of the refresh area using the image data (step S103).
  • the file generation unit 105 generates the gradual display type information from the transition information of the refresh area and sets it as the GradualOutputInformationStruct () of GraSyncSampleGroupEntry () (step S104).
  • the file generation unit 105 sets the gradual display permission information in the GradualOutputStruct () of GraSyncSampleGroupEntry (). Further, the file generation unit 105 sets the dirty area interpolation information as InterpolationStruct () of GraSyncSampleGroupEntry () (step S105).
  • the file generation unit 105 sets the GraSyncSampleGroupEntryBox in the moov including other management information in the ISOBMFF file (step S106).
  • the file generation unit 105 generates a segment file of the video content including mdat, which is the video information, and moov, which is the management information, or a segment file of the video content including mdat, which is the video information, and moov and moof, which are the management information (step S107).
  • the transmission unit 108 uploads the segment file of the video content generated by the file generation unit 105 to the Web server 3.
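The sample-group information written in steps S102 through S105 can be sketched as follows. This is an illustrative sketch only: the function name is hypothetical and a plain dict stands in for real ISOBMFF box serialization; only the field names (roll_distance, gradual_output_flag, gradual_output_type, interpolation_type) come from the text.

```python
def build_gra_sync_sample_group_entry(roll, gradual_output_flag,
                                      gradual_output_type, interpolation_type):
    """Assemble the GRA sample-group information that is set in the moov box."""
    return {
        "GraSyncSampleGroupEntry": {
            "roll_distance": roll,                                 # step S102
            "GradualOutputInformationStruct": {
                "gradual_output_type": gradual_output_type,        # step S104
            },
            "GradualOutputStruct": {
                "gradual_output_flag": gradual_output_flag,        # step S105
            },
            "InterpolationStruct": {
                "interpolation_type": interpolation_type,          # step S105
            },
        }
    }
```

A real implementation would serialize these fields into the GraSyncSampleGroupEntryBox before writing the ISOBMFF file in step S106.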
  • FIG. 10 is a flowchart of the reproduction process executed by the client device.
  • the file acquisition unit 201 acquires the segment file of the video content to be played back from the Web server 3.
  • the file processing unit 202 parses the segment file of the video content acquired by the file acquisition unit 201. Then, the file processing unit 202 outputs the image data to the decoding processing unit 204. Further, the file processing unit 202 outputs the metadata to the display information generation unit 205.
  • the decoding processing unit 204 decodes the acquired image data and outputs it to the display information generation unit 205.
  • the display information generation unit 205 generates a display image using image data and metadata and outputs the display image to the display unit 206 to display the display image.
  • the control unit 21 determines whether or not a random access instruction has been detected (step S201). When the random access instruction is not detected (step S201: negation), the control unit 21 causes the file acquisition unit 201 to continue the process as it is. Then, the reproduction process proceeds to step S208.
  • step S201 when the random access instruction is detected (step S201: affirmative), the control unit 21 instructs the file processing unit 202 to execute the random access.
  • the file processing unit 202 determines whether or not the random access sample is GRA (step S202).
  • the file processing unit 202 transmits the GraSyncSampleGroupEntryBox to the GRA information acquisition unit 203.
  • the GRA information acquisition unit 203 acquires the information of the GraSyncSampleGroupEntryBox (step S203). Specifically, the GRA information acquisition unit 203 acquires GRA picture information, gradual display permission information, gradual display type information, dirty area interpolation information, and roll information.
  • the GRA information acquisition unit 203 determines whether or not the gradual display is permitted by using the gradual display permission information (step S204).
  • When the gradual display is permitted (step S204: affirmative), the GRA information acquisition unit 203 outputs the GRA picture information, the gradual display type information, the dirty area interpolation information, and the roll information to the decoding processing unit 204.
  • the decoding processing unit 204 decodes the image data following the GRA picture while interpolating the dirty area according to the dirty area interpolation information, so that the images are displayed in the display order indicated by the gradual display type information, and generates the display information.
  • the display information generation unit 205 generates a display image for gradual display using the image data acquired from the decoding processing unit 204, and provides it to the display unit 206 for display. At that time, the display information generation unit 205 presents the user with information indicating that the display is gradual display (step S205).
  • the decoding processing unit 204 determines whether or not the gradual display is completed by using the roll information and the like (step S206). If the gradual display is not completed (step S206: negative), the video reproduction process returns to step S205. On the other hand, when the gradual display is completed (step S206: affirmative), the video reproduction process returns to step S201.
  • When the gradual display is not permitted, the GRA information acquisition unit 203 outputs the GRA picture information and the roll information to the decoding processing unit 204 together with the display instruction after full-screen decoding.
  • the decoding processing unit 204 decodes from the GRA picture in response to the display instruction after full-screen decoding, and confirms, by using the roll information, that the entire screen of the picture has been decoded as a clean area. Then, after the entire screen of the picture has been decoded as a clean area, the decoding processing unit 204 outputs the image data in which the entire screen of the picture is a clean area to the display information generation unit 205.
  • the display information generation unit 205 generates a display image obtained by decoding the entire screen of the picture as a clean area and provides it to the display unit 206 for display (step S207). After that, the video reproduction process returns to step S201.
  • When it is determined in step S202 that the random access sample is not GRA (step S202: negative), the video playback process proceeds to step S208.
  • the file processing unit 202, the decoding processing unit 204, the display information generation unit 205, and the display unit 206 execute normal decoding and display on the input image (step S208).
  • In normal decoding, random access is performed using an IDR picture.
  • the file processing unit 202, the decoding processing unit 204, and the display information generation unit 205 determine whether or not all the image data of the video content has been decoded (step S209). If the image data to be decoded remains (step S209: negative), the video reproduction process returns to step S201. On the other hand, when the decoding of all the image data of the video content is completed (step S209: affirmative), the file processing unit 202, the decoding processing unit 204, and the display information generation unit 205 end the video reproduction processing.
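The branch taken at random access in the flow above (steps S201 through S208) can be summarized as a small decision function. The function name and the string labels are illustrative, not from the patent; the gradual_output_flag semantics (0 = gradual display permitted) follow FIG. 12.

```python
def random_access_mode(is_gra_sample: bool, gradual_output_flag: int) -> str:
    """Select the playback path for a random-access sample."""
    if not is_gra_sample:
        return "normal"                       # step S208: IDR-based access
    if gradual_output_flag == 0:              # gradual display permitted
        return "gradual"                      # steps S205-S206
    return "after-full-screen-decoding"       # step S207
```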
  • As described above, the file generation device according to the present embodiment encodes the image data so that GRA can be executed, identifies the GRA picture from the encoded image data, newly defines a sample group for the GRA picture, and stores the roll information in it. As a result, the maximum coding amount is suppressed, the coding delay in the coding process and the transmission process is reduced, image distortion due to reproduction of the dirty area is prevented, and GRA can be executed properly with the same content reproduced in the same way by any reproduction device. That is, the file generation device according to the present embodiment can provide the user with a high-quality viewing experience.
  • FIG. 11 is a diagram showing an example of the syntax of GradualOutputStruct ().
  • the file generation unit 105 generates a GradualOutputStruct () that stores the gradual display permission information in the GRA sample group by using the syntax shown in FIG.
  • the file generation unit 105 stores the gradual_output_flag in the GradualOutputStruct () as shown in FIG. 11. Then, the file generation unit 105 defines the value of the gradual_output_flag as shown in FIG. 12. FIG. 12 is a diagram of an example of the contents indicated by each value of gradual_output_flag. For example, the file generation unit 105 defines that the gradual display is valid when the value of gradual_output_flag is 0, and that the gradual display is invalid when the value of gradual_output_flag is 1. The gradual display being valid means that execution of the gradual display at the time of random access is permitted. On the other hand, the gradual display being invalid means that execution of the gradual display at the time of random access is prohibited. Then, the file generation unit 105 sets the generated GradualOutputStruct () in the GraSyncSampleGroupEntryBox to generate an ISOBMFF file.
  • a flag called gradual_output_flag is newly defined to indicate permission or prohibition of gradual display, but the setting method of this gradual permission information is not limited to this.
  • the file generation unit 105 sets the picture as the recovery point in GraSyncSampleGroupEntry (). Then, the file generation unit 105 may explicitly prohibit the gradual display by setting, as the roll, the number of frames from the GRA picture to the picture one frame before the picture serving as the recovery point.
  • the file generation device stores information indicating either permission or prohibition of gradual display by using the flag set in GradualOutputStruct ().
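The gradual_output_flag semantics above can be sketched as a small helper on the client side. The helper name is hypothetical; only the value semantics (0 = valid, 1 = invalid) come from FIG. 12.

```python
def gradual_display_permitted(gradual_output_flag: int) -> bool:
    """True when gradual display may be executed at random access (FIG. 12)."""
    if gradual_output_flag == 0:
        return True   # gradual display valid: execution permitted
    if gradual_output_flag == 1:
        return False  # gradual display invalid: execution prohibited
    raise ValueError(f"undefined gradual_output_flag: {gradual_output_flag}")
```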
  • GRA is premised on gradual display because it aims to reduce the coding delay in the coding process and the transmission process by suppressing the maximum coding amount; however, by making it possible to suppress image distortion according to the user's request, a viewing experience that meets the needs of users can be provided.
  • FIG. 13 is a diagram showing an example of the syntax of GradualOutputInformationStruct ().
  • the file generation unit 105 generates a GradualOutputInformationStruct () that stores the gradual display type information in the GRA sample group by using the syntax shown in FIG.
  • the file generation unit 105 stores the gradual_output_type in the GradualOutputInformationStruct () as shown in FIG. Then, the file generation unit 105 defines the value of gradual_output_type as shown in FIG.
  • FIG. 14 is a diagram of an example of the contents indicated by each value of gradual_output_type.
  • the file generation unit 105 defines that when the value of gradual_output_type is 0, it indicates that the refresh area moves from left to right or right to left on the picture display screen. In this case, the image is gradually displayed from left to right or from right to left on the screen. Further, the file generation unit 105 defines that when the value of gradual_output_type is 1, it indicates that the refresh area moves from the top to the bottom or from the bottom to the top of the picture display screen. In this case, the image is gradually displayed from the top to the bottom of the screen or from the bottom to the top. Further, the file generation unit 105 defines that when the value of gradual_output_type is 2, it means that the refresh area moves from the center to the edge of the picture display screen.
  • the image is gradually displayed from the center of the screen toward the outer edge of the screen.
  • the file generation unit 105 defines that when the value of gradual_output_type is 3, the refresh area moves in the order of the raster scan of the picture display screen. In this case, the images are gradually displayed in the order of raster scan. Further, the file generation unit 105 defines that when the value of gradual_output_type is 4, it means that the refresh area moves randomly on the picture display screen. In this case, the images are displayed randomly and gradually. Further, the file generation unit 105 sets the value of gradual_output_type to 5 when the transition order of the refresh area is not specified.
  • Here, the six gradual display patterns shown in FIG. 14, including the unspecified one, have been described; however, the patterns are not limited to these, and if a pattern can be expressed by a value of gradual_output_type, the file generation unit 105 may define other patterns.
  • the file generator stores the gradual display type information indicating how the gradual display is performed using the flag set in the GradualOutputInformationStruct ().
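The gradual_output_type values described above (FIG. 14) can be represented as a lookup table on the client side. The table and the short labels are illustrative names, not from the patent:

```python
# Hypothetical mapping of gradual_output_type (FIG. 14) to the refresh-area
# transition it describes.
GRADUAL_OUTPUT_TYPES = {
    0: "horizontal",   # left-to-right or right-to-left
    1: "vertical",     # top-to-bottom or bottom-to-top
    2: "center-out",   # from the center toward the edges
    3: "raster-scan",  # raster-scan order
    4: "random",       # refresh area moves randomly
    5: "unspecified",  # transition order not specified
}

def describe_gradual_output_type(value: int) -> str:
    """Return a label for the transition pattern, or 'reserved' if undefined."""
    return GRADUAL_OUTPUT_TYPES.get(value, "reserved")
```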
  • By this information, the client device can figure out in advance, before decoding and without analyzing parameter_set and slice_header, what kind of gradual display will be performed. This makes it easy to distinguish between the accurately decoded area and the other areas, and the client device can easily specify the area suitable for displaying other information, such as information indicating that the gradual display is being executed.
  • the file generation unit 105 can also define GradualOutputInformationStruct () using information other than gradual_output_type.
  • FIG. 15 is a diagram showing a first example of GradualOutputInformationStruct () using other definitions. Here, the case where the gradual display is gradually displayed at a constant ratio and linearly will be described.
  • the file generation unit 105 stores, in GradualOutputInformationStruct (), the information of the display area of the clean area that is output first, as shown in FIG. 15.
  • first_output_clean_region_x, first_output_clean_region_y, first_output_clean_region_width, and first_output_clean_region_height in FIG. 15 represent, respectively, the x-coordinate and y-coordinate of the reference point of the display area of the clean area that is output first, and the width and height of that display area.
  • the file generation unit 105 can also set the gradual display type information in the GradualOutputInformationStruct () by using the syntax shown in FIG. 15.
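Under the stated assumption that the gradual display proceeds linearly at a constant ratio, a client could extrapolate the clean area of every frame from the first clean region and the roll. The sketch below additionally assumes, purely for illustration, a full-height clean area that grows from left to right and covers the whole picture at the recovery point:

```python
def clean_region_width(first_width: int, picture_width: int,
                       roll: int, frame: int) -> int:
    """Width of the clean area at `frame` (0 = GRA picture), assuming the
    area grows linearly and covers the whole picture at the recovery point
    (frame == roll)."""
    if frame >= roll:
        return picture_width
    # linear interpolation between the first clean width and the full width
    return first_width + (picture_width - first_width) * frame // roll
```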
  • FIG. 16 is a diagram showing a second example of GradualOutputInformationStruct () using another definition.
  • Here, the case where the gradual display proceeds linearly at a constant ratio will be described.
  • the file generation unit 105 stores the information of the display area of the first and last refresh areas in GradualOutputInformationStruct () as shown in FIG.
  • first_output_refresh_region_x, first_output_refresh_region_y, first_output_refresh_region_width, and first_output_refresh_region_height in FIG. 16 represent, respectively, the x-coordinate and y-coordinate of the reference point of the display area of the refresh area that is output first, and the width and height of that display area.
  • last_output_refresh_region_x, last_output_refresh_region_y, last_output_refresh_region_width, and last_output_refresh_region_height in FIG. 16 represent, respectively, the x-coordinate and y-coordinate of the reference point of the display area of the refresh area that is output last, and the width and height of that display area.
  • When the gradual display is performed linearly at a constant ratio, the client device 2 can identify how the gradual display is performed from the display areas of the first and last refresh areas together with the roll information. Therefore, the file generation unit 105 can also set the gradual display type information in the GradualOutputInformationStruct () by using the syntax shown in FIG. 16.
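With the first and last refresh regions known, the intermediate positions follow by linear interpolation. A minimal sketch, assuming integer coordinates and that the refresh area reaches the last region exactly at the recovery point (frame == roll); the function name is illustrative:

```python
def refresh_region_x(first_x: int, last_x: int, roll: int, frame: int) -> int:
    """x-coordinate of the refresh area at `frame`, interpolated linearly
    between the first region (frame 0) and the last region (frame == roll)."""
    if roll == 0:
        return last_x
    frame = min(frame, roll)  # clamp: no movement past the recovery point
    return first_x + (last_x - first_x) * frame // roll
```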
  • FIG. 17 is a diagram showing a third example of GradualOutputInformationStruct () using other definitions.
  • Here, the case where the amount of information for each frame increases monotonically during the GRA but the amount of increase is not a constant ratio will be described.
  • the file generation unit 105 stores the information of all the clean areas of each frame used in GRA in the GradualOutputInformationStruct () as table information.
  • first_output_clean_region_x, first_output_clean_region_y, first_output_clean_region_width, and first_output_clean_region_height in FIG. 17 represent, respectively, the x-coordinate and y-coordinate of the reference point of the display area of the clean area in the i-th frame, and the width and height of that display area, with the GRA picture as the 0th frame.
  • When the amount of information for each frame increases monotonically during the GRA but the amount of increase is not a constant ratio, the client device 2 can identify how the gradual display is performed as long as it can grasp the clean area of each frame. Therefore, the file generation unit 105 can also set the gradual display type information in the GradualOutputInformationStruct () by using the syntax shown in FIG. 17. By using such a definition for GradualOutputInformationStruct (), it is possible to notify the client device 2 of how the gradual display is performed even when the transition of the refresh area is complicated.
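The table form can also be checked for consistency on the client side, for instance that the clean area never shrinks from one frame to the next. A sketch using a hypothetical in-memory representation, one (x, y, width, height) tuple per frame with frame 0 being the GRA picture:

```python
def clean_area_monotonic(regions) -> bool:
    """True when the clean-area size never shrinks from frame to frame.

    `regions` is a list of (x, y, width, height) tuples, one per frame.
    """
    sizes = [w * h for (_x, _y, w, h) in regions]
    return all(a <= b for a, b in zip(sizes, sizes[1:]))
```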
  • FIG. 18 is a diagram showing a fourth example of GradualOutputInformationStruct () using other definitions.
  • Here, the case where the amount of information for each frame increases monotonically during the GRA but the amount of increase is not a constant ratio will be described.
  • the file generation unit 105 stores the information of all the refresh areas of each frame used in GRA in the GradualOutputInformationStruct () as table information as shown in FIG.
  • first_output_clean_region_x, first_output_clean_region_y, first_output_clean_region_width, and first_output_clean_region_height in FIG. 18 represent, respectively, the x-coordinate and y-coordinate of the reference point of the display area of the refresh area in the i-th frame, and the width and height of that display area, with the GRA picture as the 0th frame.
  • When the amount of information for each frame increases monotonically during the GRA but the amount of increase is not a constant ratio, the client device 2 can identify how the gradual display is performed as long as it can grasp the refresh area of each frame. Therefore, the file generation unit 105 can also set the gradual display type information in the GradualOutputInformationStruct () by using the syntax shown in FIG. 18. By using such a definition for GradualOutputInformationStruct (), it is possible to notify the client device 2 of how the gradual display is performed even when the transition of the refresh area is complicated.
  • By this information, the client device 2 can distinguish the clean area from the dirty area in the display processing after decoding without referring to the values of parameter_set and slice_header of VVC. Further, the client device 2 can also utilize the information identifying the clean area and the dirty area for the interpolation processing.
  • the information on how the gradual display is performed at the time of GRA can be identified by the client device before decoding.
  • the client device can use the identified information for UX (User Experience) such as notification to the user at the time of random access.
  • since the client device can identify, without using parameter_set, which area is gradually displayed by GRA, the information can be used for the interpolation processing of the dirty area.
  • FIG. 19 is a diagram showing an example of the syntax of InterpolationStruct ().
  • the file generation unit 105 generates InterpolationStruct (), which stores dirty region interpolation information in the GRA sample group, using the syntax shown in FIG.
  • the file generation unit 105 stores interpolation_type in InterpolationStruct () as information indicating how to interpolate the dirty area as shown in FIG. Then, the file generation unit 105 defines the value of interpolation_type as shown in FIG. FIG. 20 is a diagram of an example of the contents indicated by each value of interpolation_type.
  • the file generation unit 105 defines that when the value of interpolation_type is 0, the dirty area is interpolated with a set color. In this case, the user determines the color with which to interpolate the dirty area. By interpolating the dirty area with an appropriate color in this way, the image is gradually displayed like a frame-in at the time of GRA random access. Further, the file generation unit 105 defines that when the value of interpolation_type is 1, the image of the frame before the start of random access is displayed as a still image in the dirty area. By interpolating the dirty area with the image before the start of random access in this way, the image is gradually displayed like a crossfade at the time of GRA random access. Further, the file generation unit 105 sets the value of interpolation_type to 2 when the interpolation method of the dirty area is not determined. In this case, the method of interpolating the dirty area depends on the implementation of the video reproduction function in the client device 2.
  • the file generation device stores dirty area interpolation information indicating how to perform dirty area interpolation using the flag set in InterpolationStruct ().
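The two defined interpolation_type behaviours above can be illustrated on a toy one-dimensional "picture"; real interpolation acts on two-dimensional regions, and the helper name and list representation are hypothetical:

```python
def interpolate_dirty(decoded, clean_len, interpolation_type,
                      fill_value=0, previous_frame=None):
    """Keep the clean prefix of `decoded`; fill the dirty remainder with a
    set colour (type 0) or with the frame shown before random access
    started (type 1). Type 2 leaves handling to the player implementation."""
    clean = decoded[:clean_len]
    if interpolation_type == 0:
        # frame-in style: dirty area painted with a configured colour
        return clean + [fill_value] * (len(decoded) - clean_len)
    if interpolation_type == 1:
        # crossfade style: dirty area shows the pre-random-access still image
        return clean + previous_frame[clean_len:]
    return list(decoded)  # type 2: implementation-dependent
```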
  • By the dirty area interpolation information, interpolation of the dirty area is performed by the same method regardless of the client device, so that image distortion at the time of random access is suppressed and the appearance can be unified.
  • the content creator can set the optimum display method for the dirty area, and the gradual display can be used as a UX effect such as a fade-in or a crossfade, so that the same content reproduction can be realized regardless of the playback device and a high-quality viewing experience can be provided.
  • FIG. 21 is a diagram showing the format of the Matroska Media Container.
  • the file generation unit 105 stores the transition identification information, the transition execution area information, and the transition trigger information in the element newly defined in the Track Entry element.
  • FIG. 22 is a hardware configuration diagram of the computer.
  • the file generation device 1 and the client device 2 can be realized by the computer 90 shown in FIG.
  • the processor 91, the memory 92, the network interface 93, the non-volatile storage 94, the input / output interface 95, and the display interface 96 are connected to each other via a bus.
  • External devices such as an input device, an output device, a storage device, and a drive are connected to the input / output interface 95.
  • the input device is, for example, a keyboard, a mouse, a microphone, a touch panel, an input terminal, or the like.
  • the output device is, for example, a speaker, an output terminal, or the like.
  • the storage device is, for example, a hard disk, a RAM (Random Access Memory) disk, or the like.
  • the drive drives removable media such as magnetic disks, optical disks, magneto-optical disks, or semiconductor memories.
  • a display 98 which is a display device, is connected to the display interface 96.
  • the network interface 93 is connected to an external network.
  • the file generation device 1 and the client device 2 are connected to each other via the network interface 93. Further, the file generation device 1 and the client device 2 are connected to the Web server 3 via the network interface 93.
  • the non-volatile storage 94 is a built-in auxiliary storage device such as a hard disk or SSD (Solid State Drive).
  • In the computer 90, for example, the processor 91 loads the program stored in the non-volatile storage 94 into the memory 92 via the bus and executes it, whereby the series of processing described above is performed.
  • the memory 92 also appropriately stores data and the like necessary for the processor 91 to execute various processes.
  • the program executed by the processor 91 can be provided by being recorded on removable media such as package media, for example.
  • the program can be installed in the non-volatile storage 94 via the input / output interface 95 by mounting the removable media in the drive which is the external device 97.
  • This program can also be provided via wired or wireless transmission media such as local area networks, the Internet, and digital satellite broadcasting. In that case, the program can be received at the network interface 93 and installed in the non-volatile storage 94.
  • this program can be installed in advance in the non-volatile storage 94.
  • (1) An information processing device comprising: a coding unit that encodes images in an image sequence to generate a coded stream; a determination unit that determines one or more decoding start images in the image sequence that can be used as an image from which to start decoding during Gradual Random Access (GRA); and
  • a file generation unit that inserts, into the header area of a file format including a header area and a data area, GRA information regarding the decoding start image determined by the determination unit, and inserts the coded stream into the data area.
  • (2) The information processing device according to the appendix (1), wherein the file generation unit includes, in the GRA information, gradual display permission information indicating permission or prohibition of gradual display.
  • the file generation unit includes, in the GRA information, gradual display type information indicating how the gradual display is performed.
  • the information processing device sets the position and area information of a clean area in each of the images displayed at the time of executing the gradual display as the gradual display type information.
  • the information processing device sets the position of a refresh area in each of the images displayed when the gradual display is executed as the gradual display type information.
  • the information processing apparatus according to the appendix (2), wherein the file generation unit includes, in the GRA information, dirty area interpolation information indicating area information of the dirty area and a display method.
  • the images in the image sequence are encoded to generate a coded stream, and one or more decoding start images that can be used as an image from which to start decoding during gradual random access are determined in the image sequence.
  • a GRA information acquisition unit that acquires GRA information for identifying the decoding start image
  • a reproduction processing apparatus including a decoding processing unit that decodes the coded stream based on the GRA information acquired by the GRA information acquisition unit.
  • (9) A file generated according to a file format including a header area and a data area containing a coded stream containing data of a series of encoded images is acquired, and from the header area of the acquired file, GRA information for identifying one or more decoding start images that can be used as an image from which to start decoding at the time of gradual random access in the series of images is acquired.
  • a reproduction processing method in which a computer executes a process of decoding the coded stream based on the acquired GRA information.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention provides an information processing device, an information processing method, a playback processing device, and a playback processing method that give a user a high-quality viewing experience. A coding unit encodes images in an image sequence to generate a coded stream. A determination unit determines, in the image sequence, one or more decoding start images that can be used as images from which to start decoding during gradual random access (GRA). A file generation unit inserts, into a header area, GRA information associated with the decoding start image determined by the determination unit, and inserts the coded stream into a data area of a file format containing the header area and the data area.
PCT/JP2020/025379 2019-07-03 2020-06-26 Information processing device, information processing method, playback processing device, and playback processing method WO2021002303A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962870367P 2019-07-03 2019-07-03
US62/870,367 2019-07-03

Publications (1)

Publication Number Publication Date
WO2021002303A1 true WO2021002303A1 (fr) 2021-01-07

Family

ID=74101077

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/025379 WO2021002303A1 (fr) 2019-07-03 2020-06-26 Information processing device, information processing method, playback processing device, and playback processing method

Country Status (1)

Country Link
WO (1) WO2021002303A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006527927A (ja) * 2003-06-19 2006-12-07 Nokia Corporation Stream switching based on gradual decoder refresh



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20834532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20834532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP