WO2013136373A1 - Three-dimensional image processing device and three-dimensional image processing method - Google Patents

Three-dimensional image processing device and three-dimensional image processing method

Info

Publication number
WO2013136373A1
WO2013136373A1 (PCT/JP2012/001846)
Authority
WO
WIPO (PCT)
Prior art keywords
image
viewpoint image
file
viewpoint
unit
Prior art date
Application number
PCT/JP2012/001846
Other languages
English (en)
Japanese (ja)
Inventor
佑貴 原
Original Assignee
パナソニック株式会社
Priority date
Filing date
Publication date
Application filed by パナソニック株式会社 filed Critical パナソニック株式会社
Priority to PCT/JP2012/001846 priority Critical patent/WO2013136373A1/fr
Publication of WO2013136373A1 publication Critical patent/WO2013136373A1/fr

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 Details of television systems
    • H04N 5/76 Television signal recording
    • H04N 5/765 Interface circuits between an apparatus for recording and another apparatus
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106 Processing image signals
    • H04N 13/172 Processing image signals comprising non-image signal components, e.g. headers or format information
    • H04N 13/178 Metadata, e.g. disparity information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/189 Recording image signals; Reproducing recorded image signals
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/804 Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components
    • H04N 9/8042 Transformation of the television signal for recording involving pulse code modulation of the colour picture signal components involving data reduction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/79 Processing of colour television signals in connection with recording
    • H04N 9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N 9/82 Transformation of the television signal for recording, the individual colour picture signal components being recorded simultaneously only
    • H04N 9/8205 Transformation of the television signal for recording involving the multiplexing of an additional signal and the colour video signal
    • H04N 9/8227 Transformation of the television signal for recording, the additional signal being at least another television signal

Definitions

  • the present invention relates to a 3D image processing apparatus and 3D image processing method, and more particularly to a 3D image processing apparatus and 3D image processing method for extracting a 3D still image from a 3D image.
  • Patent Document 1 discloses a technique for capturing a screen image of a two-dimensional video and storing the captured screen image as a JPEG file.
  • Patent Document 2 discloses a technique for storing a stereoscopic video as a stereoscopic still image file.
  • however, the techniques of Patent Documents 1 and 2 cannot capture a screen image of a stereoscopic video and store it in a form that can later be viewed as a stereoscopic still image.
  • the present invention has been made in view of the above problem, and aims to provide a three-dimensional video processing apparatus and a three-dimensional video processing method capable of storing a screen image of a three-dimensional video so that it can later be viewed as a stereoscopic still image.
  • a stereoscopic video processing apparatus according to the present invention includes: a display unit that alternately displays a first viewpoint image and a second viewpoint image forming a stereoscopic video; an input reception unit that receives an input of a capture request from a user; and a file generation unit that, at the timing when the capture request is received by the input reception unit, generates a multi-picture-format-compliant file storing the first viewpoint image displayed on the display unit and the second viewpoint image corresponding to that first viewpoint image as a pair of still images.
  • the screen image of the stereoscopic video can thus be stored in a form that can later be viewed as a stereoscopic still image.
  • the file generation unit may store the pair of still images acquired from each of the plurality of stereoscopic videos in the file when a plurality of stereoscopic videos are simultaneously displayed on the display unit.
  • the file generation unit may store in the file a combined first viewpoint image obtained by combining the plurality of first viewpoint images according to the layout displayed on the display unit, and a combined second viewpoint image obtained by combining the plurality of second viewpoint images according to the layout displayed on the display unit.
  • the file generation unit may store, in the file, information specifying a combination of a first viewpoint image and a second viewpoint image constituting the pair of still images.
  • the three-dimensional video processing apparatus may further include a file reproduction unit that reads out the first viewpoint image and the second viewpoint image constituting the pair of still images from the file and alternately displays them on the display unit.
  • the first viewpoint image may be one of a left viewpoint image and a right viewpoint image having parallax.
  • the second viewpoint image may be the other of the left viewpoint image and the right viewpoint image.
  • the present invention can be realized not only as such a 3D image processing apparatus and 3D image processing method, but also as an integrated circuit realizing the functions of the 3D image processing apparatus, or as a program causing a computer to execute each step of the 3D image processing method. Needless to say, such a program can be distributed via a recording medium such as a CD-ROM or a transmission medium such as the Internet.
  • according to the present invention, it is possible to provide a three-dimensional video processing apparatus capable of storing a screen image of a three-dimensional video so that it can later be viewed as a three-dimensional still image.
  • FIG. 1 is a block diagram of a 3D image processing apparatus according to the first embodiment.
  • FIG. 2 is a diagram for explaining the outline of the operation of the 3D image processing apparatus.
  • FIG. 3 is a flowchart of file generation processing according to the first embodiment.
  • FIG. 4 is a flowchart of left viewpoint image generation processing according to the first embodiment.
  • FIG. 5 is a flowchart of right viewpoint image generation processing according to the first embodiment.
  • FIG. 6 is a diagram showing an example of data of parameter 1.
  • FIG. 7 is a view showing an example of data of parameter 2.
  • FIG. 8 is a diagram showing an example of data of parameter 3.
  • FIG. 9 is a diagram showing an example of the data layout of the mpo file according to the first embodiment.
  • FIG. 10 is a diagram for explaining the outline of the operation of the 3D image processing apparatus.
  • FIG. 11 is a flowchart of file generation processing according to the second embodiment.
  • FIG. 12 is a flowchart of left viewpoint image generation processing according to the second embodiment.
  • FIG. 13 is a flowchart of right viewpoint image generation processing according to the second embodiment.
  • FIG. 14 is a flowchart of combined left viewpoint image generation processing according to the second embodiment.
  • FIG. 15 is a flowchart of combined right-viewpoint image generation processing according to the second embodiment.
  • FIG. 16A is a diagram showing an example of data of parameter 4.
  • FIG. 16B is a diagram showing a continuation of the data example of parameter 4 shown in FIG. 16A.
  • FIG. 16C is a diagram showing a further continuation of the data example of parameter 4 shown in FIG. 16B.
  • FIG. 17 is a diagram showing an example of data of parameter 5.
  • FIG. 18 is a diagram showing an example of data of parameter 6.
  • FIG. 19 is a diagram showing an example of data of parameter 7.
  • FIG. 20 is a diagram showing an example of data of parameter 8.
  • FIG. 21 is a diagram showing an example of the data layout of the mpo file according to the second embodiment.
  • FIG. 1 is a block diagram of a 3D image processing apparatus 100 according to the first embodiment.
  • FIG. 2 is a diagram for explaining an outline of the operation of the stereoscopic video processing device 100.
  • as shown in FIG. 1, the stereoscopic video processing apparatus 100 includes a video acquisition unit 110, a video decoding unit 120, a display unit 130, an input reception unit 140, a 2D/3D detection unit 150, a file generation unit 160, a storage unit 170, and a file reproduction unit 180.
  • the video acquisition unit 110 acquires a video signal from the outside of the apparatus, and outputs the acquired video signal to the video decoding unit 120.
  • the acquisition destination of the video signal is not particularly limited, for example, the video acquisition unit 110 may acquire the video signal through a broadcast wave or a communication network, or may read the video signal from the recording medium. That is, the stereoscopic video processing apparatus 100 according to the first embodiment can be applied to a television receiver or the like. Further, the video acquisition unit 110 may include an imaging device (not shown) and may acquire a video signal from the imaging device. That is, the three-dimensional video processing apparatus 100 according to the first embodiment can be applied to a video camera capable of capturing a three-dimensional video.
  • the broadcast wave is not particularly limited; examples include analog broadcasting, terrestrial digital broadcasting, BS (Broadcast Satellite) broadcasting, and CS (Communication Satellite) broadcasting.
  • the recording medium is likewise not particularly limited; examples include a DVD (Digital Versatile Disc), a BD (Blu-ray Disc), and an SD (Secure Digital) card.
  • the video signal acquired by the video acquisition unit 110 may be a signal of a two-dimensional video, or may be a signal of a three-dimensional video (stereoscopic video).
  • a stereoscopic video is composed of a left viewpoint video and a right viewpoint video.
  • the left viewpoint video and the right viewpoint video shown in FIG. 2 are videos obtained by capturing an object from different viewpoints.
  • the left viewpoint video is composed of a plurality of images (left viewpoint images) L1, L2, L3, L4, L5.
  • the right viewpoint video is composed of a plurality of images (right viewpoint images) R1, R2, R3, R4, R5.
  • the left viewpoint image L1 and the right viewpoint image R1 show the object as seen at the same time (captured at the same time); such a pair of images is referred to as "corresponding images".
  • similarly, the images L2 and R2, the images L3 and R3, the images L4 and R4, and the images L5 and R5 are corresponding images. Horizontal parallax is added to each corresponding pair of images.
  • the video decoding unit 120 decodes the video signal acquired from the video acquisition unit 110, and outputs the decoded video to the display unit 130.
  • the video signal acquired from the video acquisition unit 110 is, for example, H.264. H.264 / AVC or the like is coded by a moving picture coding method such as H.264 / AVC. That is, at least a part of the image included in the video is encoded with reference to another image. Therefore, the image forming the video can only be decoded after decoding the reference image.
  • the display unit 130 has a display screen on which the video acquired from the video decoding unit 120 is displayed.
  • the display is not particularly limited; for example, a liquid crystal display, a plasma display, or an organic EL (ElectroLuminescence) display can be employed.
  • the display unit 130 alternately displays the left viewpoint image and the right viewpoint image.
  • the stereoscopic video shown in FIG. 2 is displayed in the order of images L1, R1, L2, R2, L3, R3, L4, R4, L5, R5, as indicated by the arrows.
  • the viewer can perceive the depth of the stereoscopic video through glasses whose left-eye and right-eye shutters open and close in synchronization with the image displayed on the display unit 130.
  • the display order of corresponding left and right viewpoint images is not limited to left → right as described above; it may be right → left.
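As an illustration only (not part of the patent disclosure), the frame-sequential display order described above can be sketched as a simple interleaving of the two image sequences; the function name `interleave` and the string frame labels are hypothetical stand-ins for decoded frames:

```python
def interleave(left_frames, right_frames, left_first=True):
    """Yield corresponding left/right viewpoint images alternately for
    frame-sequential stereoscopic display (L1, R1, L2, R2, ...)."""
    for left, right in zip(left_frames, right_frames):
        first, second = (left, right) if left_first else (right, left)
        yield first
        yield second

# Display order for the example of FIG. 2 (first three pairs):
order = list(interleave(["L1", "L2", "L3"], ["R1", "R2", "R3"]))
# order == ["L1", "R1", "L2", "R2", "L3", "R3"]
```

Setting `left_first=False` gives the right → left order mentioned above.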
  • the input receiving unit 140 is a user interface that receives input of various instructions (requests) from the viewer.
  • the input receiving unit 140 according to the first embodiment receives an input of a capture request requesting acquisition of a screen image of a video displayed on the display unit 130 as a still image, and notifies the file generation unit 160 of the input.
  • the 2D / 3D detection unit 150 detects whether the image displayed on the display unit 130 is a two-dimensional (2D) image or a three-dimensional (3D) image.
  • the detection method is not particularly limited; for example, the 2D/3D detection unit 150 may determine that the video is a stereoscopic video when a flag indicating a stereoscopic video (typically, a flag indicating a side-by-side or top-and-bottom format) is set in the video signal.
  • the file generation unit 160 acquires the image displayed on the display unit 130 as a still image, generates a multi-picture-format-compliant file (an mpo file) storing the acquired still image, and stores the generated mpo file in the storage unit 170.
  • the file generation unit 160 may obtain, for example, a still image from the video decoding unit 120. That is, the video decoding unit 120 needs to hold the decoded image for a predetermined period of time after output to the display unit 130.
  • the file generation unit 160 stores the image displayed on the display unit 130 in the mpo file in a form that can later be viewed as a stereoscopic still image. For example, as shown in FIG. 2, suppose the capture request is received by the input receiving unit 140 at the timing when the left viewpoint image L3 is displayed on the display unit 130. In this case, the file generation unit 160 stores the image L3 displayed on the display unit 130 and the corresponding right viewpoint image R3 in the mpo file.
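The pairing at capture time can be sketched as follows (an illustration under the assumption that decoded left and right frames are index-aligned; `capture_pair` is a hypothetical name, not from the patent):

```python
def capture_pair(left_frames, right_frames, displayed_index):
    """When a capture request arrives while left viewpoint image L[i]
    is on screen, select the corresponding pair (L[i], R[i]) to be
    stored in the mpo file."""
    return left_frames[displayed_index], right_frames[displayed_index]

# Capture request while L3 (index 2) is displayed -> the pair (L3, R3).
pair = capture_pair(["L1", "L2", "L3"], ["R1", "R2", "R3"], 2)
```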
  • the storage unit 170 stores the file generated by the file generation unit 160.
  • the specific configuration of the storage unit is not particularly limited; any means capable of storing data can be used, for example a dynamic random access memory (DRAM), a synchronous dynamic random access memory (SDRAM), a flash memory, a ferroelectric memory, or a hard disk drive (HDD).
  • the file reproduction unit 180 reads the file stored in the storage unit 170 and reproduces a still image included in the read file.
  • the file reproducing unit 180 reads the mpo file shown in FIG. 2 and causes the display unit 130 to alternately display the images L3 and R3 included in the read mpo file.
  • the viewer can view the stereoscopic still image.
  • when the video acquisition unit 110 acquires a video that has already been decoded, the video decoding unit 120 can be omitted. Similarly, when only stereoscopic videos are input to the video acquisition unit 110, the 2D/3D detection unit 150 can be omitted.
  • the storage unit 170 may be an external storage device provided separately from the stereoscopic video processing device 100.
  • the file playback unit 180 may be a component not of the 3D image processing apparatus 100 but of a separate playback apparatus that reads an mpo file stored in the storage unit 170 and plays back the 3D still image.
  • FIG. 3 is a flowchart of file generation processing according to the first embodiment.
  • FIG. 4 is a flowchart of left viewpoint image generation processing according to the first embodiment.
  • FIG. 5 is a flowchart of right viewpoint image generation processing according to the first embodiment.
  • FIGS. 6 to 8 show examples of data stored in the header of the mpo file.
  • FIG. 9 is a diagram showing an example of the data layout of the mpo file.
  • the following description uses the example shown in FIG. 2.
  • the stereoscopic video processing apparatus 100 displays the stereoscopic video acquired by the video acquisition unit 110 and decoded by the video decoding unit 120 on the display unit 130.
  • the input receiving unit 140 accepts a capture request from a user (S11).
  • the 2D / 3D detection unit 150 detects whether the video displayed on the display unit 130 is a stereoscopic video (S12).
  • when a stereoscopic video is displayed on the display unit 130 (YES in S12), the file generation unit 160 executes left viewpoint image generation processing (S13) and right viewpoint image generation processing (S14).
  • in the left viewpoint image generation processing, the file generation unit 160 first compresses the left viewpoint image L3 in JPEG format (S21).
  • the compression format (encoding format) is not limited to JPEG, but, unlike the case of video, it must be a format in which a single image can be compressed and expanded (encoded and decoded) on its own.
  • the file generation unit 160 adds an APP2 marker to the image data compressed in step S21 (S22).
  • the APP2 marker indicates MP format attached information defined by the multi-picture format of the Camera & Imaging Products Industry Association standard.
  • the file generation unit 160 stores the parameter 1 shown in FIG. 6 in the APP2 marker added in step S22 (S23). Details of data set in parameter 1 will be described later.
  • the file generation unit 160 stores the parameter 2 shown in FIG. 7 in the APP2 marker added in step S22 (S24). Details of data set in parameter 2 will be described later.
  • in the right viewpoint image generation processing, the file generation unit 160 first compresses the right viewpoint image R3 in JPEG format (S31).
  • the compression format (coding format) is not limited to JPEG, but it is desirable to be in the same format as the left viewpoint image.
  • the file generation unit 160 adds an APP2 marker to the image data compressed in step S31 (S32).
  • the APP2 marker indicates MP format attached information defined by the multi-picture format of the Camera & Imaging Products Industry Association standard.
  • the file generation unit 160 stores the parameter 3 shown in FIG. 8 in the APP2 marker added in step S32 (S33). Details of the data set in the parameter 3 will be described later.
  • the file generation unit 160 links the compressed image data generated in the left viewpoint image generation processing (S13) and the right viewpoint image generation processing (S14) (S15). Then, the file generation unit 160 stores the data obtained in step S15 in the storage unit 170 as an mpo file (S16). The data layout of the mpo file generated in step S16 will be described later.
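Structurally, an mpo file is two complete JPEG streams stored back to back, each carrying its MP-format metadata in an APP2 marker. A minimal sketch of the linking step (S15), under the assumption that `left_jpeg` and `right_jpeg` are already complete JPEG streams with their markers written (the byte contents here are dummy stand-ins):

```python
def link_mpo(left_jpeg: bytes, right_jpeg: bytes) -> bytes:
    """Concatenate two compressed viewpoint images into one mpo byte
    stream. Each argument must be a complete JPEG stream (SOI ... EOI)."""
    SOI = b"\xff\xd8"  # JPEG start-of-image marker
    EOI = b"\xff\xd9"  # JPEG end-of-image marker
    for stream in (left_jpeg, right_jpeg):
        assert stream.startswith(SOI) and stream.endswith(EOI)
    return left_jpeg + right_jpeg

# Dummy stand-ins for the compressed images L3 and R3:
mpo = link_mpo(b"\xff\xd8LEFT\xff\xd9", b"\xff\xd8RIGHT\xff\xd9")
```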
  • when a two-dimensional video is displayed on the display unit 130 (NO in S12), the file generation unit 160 compresses the image L3 displayed on the display unit 130 in JPEG format (S18) and stores the compressed image data in the storage unit 170 as a JPEG file (S19).
  • the processes in steps S18 to S19 are the same as those in the related art, and thus detailed description will be omitted.
  • the mpo file generated in step S16 has, for example, the data layout shown in FIG. 9. Specifically, in the mpo file of FIG. 9, an APP1 marker, an APP2 marker storing parameters 1 and 2, the compressed image L3, an APP1 marker, an APP2 marker storing parameter 3, and the compressed image R3 are stored in this order.
  • the APP1 marker indicates exif attached information defined in a multi-picture format of the Camera & Imaging Products Industry Association standard.
  • Parameter 1 shown in FIG. 6 includes common items and information (a plurality of MP entries) for each image.
  • the number of MP entries corresponds to the number of images (two in the example of FIG. 6) stored in the mpo file.
  • the common items are: the MP format version ("0100" in the example of FIG. 6), which indicates the version of the mpo file; the number of recorded images ("2" in the example of FIG. 6), which indicates the number of images stored in the mpo file; and an offset specifying the position of the first MP entry (MP entry 1) in the mpo file.
  • MP entry 1 stores information about the left viewpoint image L3. Specifically, MP entry 1 includes individual image type management information (a dependent parent image flag, a dependent child image flag, and an MP type), an individual image data offset, a dependent image 1 entry number, and a dependent image 2 entry number.
  • in the MP type of MP entry 1, the value ("0x020002") indicating that the image L3 is stored as a three-dimensional still image is set.
  • in the individual image data offset, an offset value indicating the position of the image L3 in the mpo file is set ("0x0", since the image L3 is stored at the head of the file).
  • "0x” indicates that the subsequent numerical value is a hexadecimal number.
  • the values set in the other tags of MP entry 1 do not differ from conventional ones, so their description is omitted.
  • MP entry 2 stores information about the right viewpoint image R3. The data format of MP entry 2 is the same as that of MP entry 1; except that an offset value indicating the position of the image R3 in the mpo file is set in the individual image data offset and that "0x0" is set in the dependent image 1 entry number, the same values as in MP entry 1 are set.
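For illustration, under the multi-picture format each MP entry is a fixed 16-byte record. The sketch below packs one such record (the field widths follow the MP format as the author understands it; the helper name `mp_entry` and the zero image size are hypothetical, not from the patent):

```python
import struct

def mp_entry(attr, size, offset, dep1=0, dep2=0, little_endian=True):
    """Pack one 16-byte MP entry: individual image attribute (dependent
    flags + MP type), individual image size, individual image data
    offset, and two dependent image entry numbers. Byte order follows
    the endianness declared in the MP header (little-endian here)."""
    fmt = "<IIIHH" if little_endian else ">IIIHH"
    return struct.pack(fmt, attr, size, offset, dep1, dep2)

# MP entry 1 of the example: MP type 0x020002 (stereoscopic still image),
# data offset 0x0 because L3 is stored at the head of the file.
# The image size is left 0 here; a real writer would set the JPEG length.
entry1 = mp_entry(attr=0x020002, size=0, offset=0x0)
```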
  • Parameter 2 shown in FIG. 7 includes an individual image number and a reference viewpoint number.
  • the individual image number is a number specifying, in the mpo file, the image to which the APP2 marker containing parameter 2 is added (the image L3 in the example of FIG. 9); "0x1" is set in this example.
  • the reference viewpoint number is a number identifying the viewpoint serving as the reference for the pair of still images (images L3 and R3) stored in the mpo file; here, the individual image number "0x1" of the image L3 is set. That is, in this mpo file, the left viewpoint is the reference viewpoint.
  • Parameter 3 shown in FIG. 8 includes an MP format version, an individual image number, and a reference viewpoint number.
  • in the MP format version, the same value "0100" as the MP format version included in the common items of FIG. 6 is set.
  • the individual image number is a number specifying, in the mpo file, the image to which the APP2 marker containing parameter 3 is added (the image R3 in the example of FIG. 9); "0x2" is set in this example.
  • in the reference viewpoint number, the individual image number "0x1" of the image L3 is set.
  • in the above example, the left viewpoint image is the first viewpoint image (reference viewpoint image) and the right viewpoint image is the second viewpoint image; however, the right viewpoint image may be used as the first viewpoint image and the left viewpoint image as the second viewpoint image.
  • as described above, the file generation unit 160 stores in one mpo file a pair of still images (images L3 and R3) captured at the same time (typically, images given the same time stamp), sets in the mpo file the value indicating that the pair of still images is stored as a stereoscopic still image ("0x020002" for the MP type) and the same reference viewpoint number ("0x1") for both images, and stores the file in the storage unit 170.
  • the file reproduction unit 180 therefore recognizes that the pair of still images included in the mpo file read from the storage unit 170 is a three-dimensional still image, and causes the display unit 130 to alternately display the images L3 and R3. As a result, the user can view a stereoscopic still image.
  • in the above description, an example was given in which the left viewpoint image is displayed on the display unit 130 at the timing when the capture request is input, and the file generation unit 160 acquires the left viewpoint image being displayed and the right viewpoint image displayed immediately after it. Conversely, the file generation unit 160 may acquire the right viewpoint image being displayed and the left viewpoint image displayed immediately before it.
  • next, an operation according to Embodiment 2 of the stereoscopic video processing device 100 will be described.
  • the configuration of the three-dimensional video processing apparatus 100 is the same as that shown in FIG. 1. Detailed description of the operations common to the first embodiment is omitted; the differences are mainly described.
  • the first three-dimensional video includes a left viewpoint video composed of images L11, L12, L13, ... and a right viewpoint video composed of images R11, R12, R13, ....
  • the second three-dimensional video includes a left viewpoint video composed of images L21, L22, L23, ... and a right viewpoint video composed of images R21, R22, R23, ....
  • the stereoscopic video processing apparatus 100 acquires the first and second stereoscopic videos with the video acquisition unit 110, decodes them with the video decoding unit 120, and displays them simultaneously on the display unit 130. That is, the display unit 130 can display a plurality of sub-screens on its display screen and simultaneously display a different video on each sub-screen. For example, on the display unit 130 shown in FIG. 10, the left viewpoint image L12 of the first stereoscopic video is displayed on a relatively small sub-screen on the left, and the left viewpoint image L22 of the second stereoscopic video is displayed on a relatively large sub-screen on the right.
  • the file generation unit 160 acquires a pair of still images from each of the first and second stereoscopic videos displayed on the display unit 130, and stores in the storage unit 170 an mpo file containing the acquired still images together with a combined left viewpoint image obtained by combining the left viewpoint images of the acquired still images and a combined right viewpoint image obtained by combining the right viewpoint images of the acquired still images. That is, if the number of stereoscopic videos displayed on the display unit 130 is n, 2(n + 1) images are stored in the mpo file according to the second embodiment.
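To make the count concrete (an illustration; the function name is hypothetical): each of the n displayed stereoscopic videos contributes one left/right pair, and the combined left and combined right images add one more pair:

```python
def mpo_image_count(n_stereoscopic_videos: int) -> int:
    """Number of images stored in the Embodiment 2 mpo file:
    one left/right pair per displayed video (2n) plus the combined
    left/right pair (2), i.e. 2(n + 1) images in total."""
    return 2 * (n_stereoscopic_videos + 1)

# Two videos displayed simultaneously, as in FIG. 10:
count = mpo_image_count(2)
# count == 6 (L12, R12, L22, R22, combined left, combined right)
```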
  • FIG. 11 is a flowchart of file generation processing according to the second embodiment.
  • FIG. 12 is a flowchart of left viewpoint image generation processing according to the second embodiment.
  • FIG. 13 is a flowchart of right viewpoint image generation processing according to the second embodiment.
  • FIG. 14 is a flowchart of combined left viewpoint image generation processing according to the second embodiment.
  • FIG. 15 is a flowchart of combined right-viewpoint image generation processing according to the second embodiment.
  • FIGS. 16A to 20 show examples of data stored in the header of the mpo file.
  • FIG. 21 is a diagram showing an example of the data layout of the mpo file.
  • the following description uses the example shown in FIG. 10.
  • the flowchart of FIG. 11 corresponds to the processing of steps S13 to S15 of FIG. 3.
  • the file generation unit 160 first compresses the left viewpoint image L12 in JPEG format (S51). Next, the file generation unit 160 adds an APP2 marker to the image data compressed in step S51 (S52). Steps S51 to S52 correspond to steps S21 to S22 in FIG. 4.
  • when processing the first stereoscopic video, that is, when n = 1 (YES in S53), the file generation unit 160 stores the parameter 4 shown in FIGS. 16A to 16C in the APP2 marker added in step S52 (S54). Details of the data set in parameter 4 will be described later. Otherwise (NO in S53), the file generation unit 160 skips step S54.
  • the file generation unit 160 stores the parameter 5 in the APP2 marker added in step S52 (S55).
  • the data layout and setting values of the parameter 5 added to the APP2 marker of the left viewpoint image of the first three-dimensional video are the same as those of parameter 2 shown in FIG. 7. On the other hand, the data layout and setting values of the parameter 5 added to the APP2 marker of the left viewpoint image of the second three-dimensional video will be described later with reference to FIG. 17.
  • the file generation unit 160 first compresses the right viewpoint image R12 in JPEG format (S61). Next, the file generation unit 160 adds an APP2 marker to the image data compressed in step S61 (S62). Steps S61 to S62 correspond to steps S31 to S32 in FIG. 5.
  • the file generation unit 160 stores the parameter 6 in the APP2 marker added in step S62 (S63). Note that the data layout and setting values of the parameter 6 added to the APP2 marker of the right viewpoint image of the first stereoscopic video are the same as those of parameter 3 shown in FIG. 8. On the other hand, the data layout and setting values of the parameter 6 added to the APP2 marker of the right viewpoint image of the second stereoscopic video will be described later with reference to FIG. 18.
  • The file generation unit 160 uses the still images generated in steps S42 to S45 to perform combined left-viewpoint image generation processing (S46) and combined right-viewpoint image generation processing (S47).
  • The file generation unit 160 obtains the position and size, on the display unit 130, of each image displayed on the display unit 130 (S71). In other words, the file generation unit 160 obtains the coordinates and size of the child screen on which the first three-dimensional video is displayed on the display unit 130, and the coordinates and size of the child screen on which the second three-dimensional video is displayed on the display unit 130.
  • For example, in a two-dimensional coordinate system in which the upper-left corner of the display unit 130 is the origin (0, 0), the rightward horizontal direction of the display unit 130 is the positive x direction, and the downward vertical direction of the display unit 130 is the positive y direction, the file generation unit 160 may acquire the coordinates of the upper-left corner of each child screen as the coordinates of the child screen, and may acquire the height and width of each child screen as the size of the child screen.
  • the information specifying the layout of each child screen is not limited to the above.
  • the file generation unit 160 may acquire the coordinates of the upper left end and the coordinates of the lower right end of each child screen instead of the above information.
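As a minimal sketch of the coordinate convention above — origin (0, 0) at the upper-left corner of the display unit, x increasing rightward and y increasing downward — a child screen can be described either by its upper-left corner plus width and height, or equivalently by its upper-left and lower-right corners. The class and field names are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class ChildScreen:
    x: int       # upper-left corner; origin (0, 0) is the display's top-left
    y: int       # y grows downward, x grows rightward
    width: int
    height: int

    def lower_right(self) -> tuple:
        """Alternative representation mentioned in the text: the coordinates
        of the lower-right corner instead of a width and height."""
        return (self.x + self.width, self.y + self.height)
```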
  • The file generation unit 160 generates a combined left-viewpoint image in which image 1 (the left-viewpoint image L12 of the first three-dimensional video) and image 3 (the left-viewpoint image L22 of the second three-dimensional video) are combined according to the child-screen coordinates and sizes acquired in step S71.
  • the file generation unit 160 compresses the generated combined left viewpoint image in JPEG format (S72).
  • the combined left viewpoint image is a screen image actually displayed on the display unit 130.
  • the file generation unit 160 adds the APP2 marker to the image data compressed in step S72 (S73). Then, the file generation unit 160 stores the parameter 7 shown in FIG. 19 in the APP2 marker added in step S73 (S74). Details of the data set in the parameter 7 will be described later.
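The combining performed in steps S71 to S72 — pasting each viewpoint image onto the screen at its child-screen position — can be sketched on a toy single-channel pixel grid. The function name and the clipping behavior are assumptions for illustration, not the patented implementation:

```python
def combine_viewpoint_images(canvas_w, canvas_h, images):
    """Paste each (pixels, x, y) image onto a blank canvas.

    `images` is a list of (2-D pixel list, x, y) tuples, where (x, y) is the
    child-screen upper-left corner in the coordinate system described above.
    Pixels falling outside the canvas are clipped.
    """
    canvas = [[0] * canvas_w for _ in range(canvas_h)]
    for pixels, x0, y0 in images:
        for dy, row in enumerate(pixels):
            for dx, p in enumerate(row):
                x, y = x0 + dx, y0 + dy
                if 0 <= x < canvas_w and 0 <= y < canvas_h:
                    canvas[y][x] = p
    return canvas
```

The same routine would serve for the combined right-viewpoint image of steps S81 to S82, with images 2 and 4 in place of images 1 and 3.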
  • The file generation unit 160 acquires the position and size, on the display unit 130, of each image displayed on the display unit 130 (S81). In step S81, the information acquired in step S71 described above may be reused.
  • Similarly, the file generation unit 160 generates a combined right-viewpoint image in which image 2 (the right-viewpoint image R12 of the first three-dimensional video) and image 4 (the right-viewpoint image R22 of the second three-dimensional video) are combined according to the child-screen coordinates and sizes. Then, the file generation unit 160 compresses the generated combined right-viewpoint image in JPEG format (S82).
  • the combined right viewpoint image is a screen image that is actually displayed on the display unit 130.
  • Next, the file generation unit 160 adds an APP2 marker to the image data compressed in step S82 (S83). Then, the file generation unit 160 stores the parameter 8 shown in FIG. 20 in the APP2 marker added in step S83 (S84). Details of the data set in the parameter 8 will be described later.
  • The file generation unit 160 links the compressed image data generated in the left-viewpoint image generation processing (S42), the right-viewpoint image generation processing (S43), the combined left-viewpoint image generation processing (S46), and the combined right-viewpoint image generation processing (S47) (S48).
  • the file generation unit 160 stores the data obtained in step S48 in the storage unit 170 as an mpo file.
  • The mpo file generated by the process of FIG. 11 has, for example, the data layout shown in FIG. 21. Specifically, the mpo file of FIG. 21 stores, in this order: an APP1 marker, an APP2 marker storing the parameters 4 and 2, and the compressed image L12; an APP1 marker, an APP2 marker storing the parameter 3, and the compressed image R12; an APP1 marker, an APP2 marker storing the parameter 5, and the compressed image L22; an APP1 marker, an APP2 marker storing the parameter 6, and the compressed image R22; an APP1 marker, an APP2 marker storing the parameter 7, and the compressed combined left-viewpoint image; and an APP1 marker, an APP2 marker storing the parameter 8, and the compressed combined right-viewpoint image.
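A simplified sketch of the linking of step S48 under this ordering: each per-image block carries its own markers, and the resulting .mpo stream is their plain concatenation. The helper names are hypothetical, and the APP1/Exif segment content is omitted for brevity:

```python
import struct

SOI = b"\xff\xd8"  # start-of-image marker that opens each block

def image_block(app2_payload: bytes, compressed: bytes) -> bytes:
    """One per-image unit: SOI, an APP2 segment holding a parameter,
    then the compressed image data (APP1/Exif content omitted here)."""
    seg = b"\xff\xe2" + struct.pack(">H", 2 + len(app2_payload)) + app2_payload
    return SOI + seg + compressed

def link_into_mpo(blocks) -> bytes:
    """Sketch of step S48: concatenate the blocks in the FIG. 21 order
    (L12, R12, L22, R22, combined left, combined right)."""
    return b"".join(blocks)
```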
  • The parameter 4 shown in FIGS. 16A, 16B, and 16C includes common items and per-image information (a plurality of MP entries).
  • the data layout of the common items in FIG. 16A is the same as that in FIG. 6 except that the number of recording images is set to “6”.
  • In the MP entry 1 shown in FIG. 16A, information about the left-viewpoint image L12 of the first stereoscopic video is stored.
  • In the MP entry 2, information about the right-viewpoint image R12 of the first stereoscopic video is stored.
  • the data layout and setting values of the MP entries 1 and 2 are almost the same as those in FIG. However, the same value (“0x1”) is set to the subordinate parent image flag of the MP entry 1 and the subordinate child image flag of the MP entry 2. That is, two images in which the same value is set to these flags constitute a pair of still images.
  • In the MP entry 3 shown in FIG. 16B, information about the left-viewpoint image L22 of the second stereoscopic video is stored.
  • The data layout of MP entry 3 is common to that of MP entry 1, and the same values as in MP entry 1 are set, except that "0x2" is set to the subordinate parent image flag, the offset to image 3 is set to the individual image data offset, and "0x4" (pointing to MP entry 4) is set to the subordinate image 1 entry number.
  • In the MP entry 4 shown in FIG. 16B, information about the right-viewpoint image R22 of the second stereoscopic video is stored.
  • The data layout of MP entry 4 is common to that of MP entry 2, and the same values as in MP entry 2 are set, except that "0x2" is set to the subordinate child image flag and the offset to image 4 is set to the individual image data offset.
  • In MP entry 5, information about the combined left-viewpoint image is stored.
  • The data layout of MP entry 5 is the same as that of MP entry 1, and the same values as in MP entry 1 are set, except that "0x3" is set to the subordinate parent image flag, the offset to image 5 is set to the individual image data offset, and "0x6" (pointing to MP entry 6) is set to the subordinate image 1 entry number.
  • In the MP entry 6 shown in FIG. 16C, information about the combined right-viewpoint image is stored.
  • The data layout of MP entry 6 is common to that of MP entry 2, and the same values as in MP entry 2 are set, except that "0x3" is set to the subordinate child image flag and the offset to image 6 is set to the individual image data offset.
  • The parameter 5 shown in FIG. 17 is similar to the parameter 3 shown in FIG. 8: the same values as in the parameter 3 are set, except that the number "0x3" specifying image 3 (the left-viewpoint image L22 of the second stereoscopic video) is set to the individual image number, and the individual image number "0x3" of the image L22 is set to the reference viewpoint number.
  • The parameter 6 shown in FIG. 18 is similar to the parameter 3 shown in FIG. 8: the same values as in the parameter 3 are set, except that the number "0x4" specifying image 4 (the right-viewpoint image R22 of the second stereoscopic video) is set to the individual image number, and the individual image number "0x3" of the image L22 is set to the reference viewpoint number.
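The relationship just stated — the parameters 5 and 6 carrying different individual image numbers but the same reference viewpoint number, which points at the left-viewpoint image — can be written out as a small sketch. The dictionary keys are hypothetical names modeled on the text, not the recorded field names:

```python
# Values as described for FIGS. 17 and 18; key names are illustrative.
PARAMETER_5 = {"individual_image_number": 0x3, "reference_viewpoint_number": 0x3}
PARAMETER_6 = {"individual_image_number": 0x4, "reference_viewpoint_number": 0x3}

def shares_reference_viewpoint(p_left: dict, p_right: dict) -> bool:
    """Both viewpoint images of one stereoscopic video reference the same
    viewpoint number (here 0x3, the left-viewpoint image L22)."""
    return (p_left["reference_viewpoint_number"]
            == p_right["reference_viewpoint_number"])
```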
  • As described above, the file generation unit 160 extends the first embodiment: it acquires a pair of still images from each of the plurality of stereoscopic videos simultaneously displayed on the display unit 130, and stores the acquired plurality of pairs of still images in an mpo file. The file generation unit 160 also stores, in the mpo file, information (the subordinate parent image flag and the subordinate child image flag of each MP entry) specifying the combination of the left-viewpoint image and the right-viewpoint image constituting each pair of still images.
  • Thereby, the file reproducing unit 180 reads each pair of still images in which the same value is set to the subordinate parent image flag and the subordinate child image flag of the MP entries, and alternately displays the read pair of images on the display unit 130. As a result, it is possible to allow the user to view a stereoscopic still image.
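An illustrative sketch (with simplified, hypothetical field names) of how a reproducing unit could pair images by matching the subordinate parent/child image flags and then alternate them for display:

```python
from dataclasses import dataclass
from itertools import cycle, islice

@dataclass
class MPEntry:
    image_number: int  # simplified stand-ins for the MP entry fields:
    parent_flag: int   # "subordinate parent image flag" (0 if unset)
    child_flag: int    # "subordinate child image flag" (0 if unset)

def read_still_pairs(entries):
    """Pair each left-viewpoint image (non-zero parent flag) with the
    right-viewpoint image whose child flag holds the same value."""
    by_child = {e.child_flag: e for e in entries if e.child_flag}
    return [(e.image_number, by_child[e.parent_flag].image_number)
            for e in entries if e.parent_flag and e.parent_flag in by_child]

def display_sequence(pair, frames=4):
    """Alternate the two images of a pair, as the file reproducing unit
    does on the display unit 130."""
    return list(islice(cycle(pair), frames))
```

With the six MP entries described above (flag values 0x1, 0x2, and 0x3), this sketch would recover the three pairs (image 1, image 2), (image 3, image 4), and (image 5, image 6).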
  • Furthermore, the file generation unit 160 stores, in the mpo file, a combined left-viewpoint image in which the plurality of left-viewpoint images stored in the mpo file are combined according to the layout in which they are displayed on the display unit 130, and a combined right-viewpoint image in which the plurality of right-viewpoint images stored in the mpo file are combined according to the layout in which they are displayed on the display unit 130.
  • Thereby, the file reproducing unit 180 reads out the combined left-viewpoint image and the combined right-viewpoint image from the mpo file and alternately displays them on the display unit 130, which makes it possible to reproduce, as a three-dimensional still image, the display contents of the display unit 130 at the time when the input reception unit 140 received the capture request.
  • Note that the file generation unit 160 may execute the processing for the case of NO in step S12 of FIG. 3 for a two-dimensional video, and the processing for the case of YES in step S12 of FIG. 3 for a three-dimensional video.
  • each of the above-described devices is a computer system including a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse and the like.
  • a computer program is stored in the RAM or the hard disk unit.
  • Each device achieves its function by the microprocessor operating according to the computer program.
  • the computer program is configured by combining a plurality of instruction codes indicating instructions to the computer in order to achieve a predetermined function.
  • The system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on one chip; more specifically, it is a computer system configured to include a microprocessor, a ROM, a RAM, and the like.
  • the RAM stores a computer program.
  • the system LSI achieves its functions by the microprocessor operating according to the computer program.
  • the IC card or module is a computer system including a microprocessor, a ROM, a RAM, and the like.
  • the IC card or module may include the above-described ultra-multifunctional LSI.
  • the IC card or module achieves its function by the microprocessor operating according to the computer program. This IC card or this module may have tamper resistance.
  • The present invention may be the methods described above.
  • it may be a computer program that realizes these methods by a computer, or may be a digital signal composed of a computer program.
  • The present invention may also be a computer-readable recording medium, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc), or a semiconductor memory, on which the computer program or the digital signal is recorded. In addition, the present invention may be the digital signal recorded on these recording media.
  • the present invention may transmit a computer program or a digital signal via a telecommunication line, a wireless or wired communication line, a network typified by the Internet, data broadcasting, and the like.
  • The present invention may be a computer system including a microprocessor and a memory, in which the memory stores the computer program and the microprocessor operates according to the computer program.
  • the present invention is advantageously used in a stereoscopic video processing apparatus and a stereoscopic video processing method.
  • Reference Signs List: 100 stereoscopic video processing device; 110 video acquisition unit; 120 video decoding unit; 130 display unit; 140 input reception unit; 150 2D/3D detection unit; 160 file generation unit; 170 storage unit; 180 file reproduction unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Library & Information Science (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present invention relates to a three-dimensional image processing device (100). The three-dimensional image processing device according to the invention comprises: a display unit (130) used to alternately display a first viewpoint image and a second viewpoint image, which constitute a three-dimensional image; an input reception unit (140) used to receive, from the user, an image capture request input; and a file generation unit (160) used to generate a file conforming to a multi-picture format. According to the present invention: a first viewpoint image and a second viewpoint image corresponding to the first viewpoint image are stored as a pair of still images; and said first and second viewpoint images are displayed on the display unit (130) at the time when the image capture request input is received by the input reception unit (140).
PCT/JP2012/001846 2012-03-16 2012-03-16 Dispositif de traitement d'image en trois dimensions et procédé de traitement d'image en trois dimensions WO2013136373A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/001846 WO2013136373A1 (fr) 2012-03-16 2012-03-16 Dispositif de traitement d'image en trois dimensions et procédé de traitement d'image en trois dimensions

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2012/001846 WO2013136373A1 (fr) 2012-03-16 2012-03-16 Dispositif de traitement d'image en trois dimensions et procédé de traitement d'image en trois dimensions

Publications (1)

Publication Number Publication Date
WO2013136373A1 true WO2013136373A1 (fr) 2013-09-19

Family

ID=49160348

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/001846 WO2013136373A1 (fr) 2012-03-16 2012-03-16 Dispositif de traitement d'image en trois dimensions et procédé de traitement d'image en trois dimensions

Country Status (1)

Country Link
WO (1) WO2013136373A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003092303A1 (fr) * 2002-04-25 2003-11-06 Sharp Kabushiki Kaisha Procede de generation d'informations multimedia, et dispositif de reproduction d'informations multimedia
JP2005184377A (ja) * 2003-12-18 2005-07-07 Sharp Corp 画像変換装置及びそれを用いた画像記録装置
WO2011062015A1 (fr) * 2009-11-17 2011-05-26 ソニー株式会社 Procédé de transmission d'image, procédé de réception d'image, dispositif de transmission d'image, dispositif de réception d'image et système de transmission d'image

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003092303A1 (fr) * 2002-04-25 2003-11-06 Sharp Kabushiki Kaisha Procede de generation d'informations multimedia, et dispositif de reproduction d'informations multimedia
JP2005184377A (ja) * 2003-12-18 2005-07-07 Sharp Corp 画像変換装置及びそれを用いた画像記録装置
WO2011062015A1 (fr) * 2009-11-17 2011-05-26 ソニー株式会社 Procédé de transmission d'image, procédé de réception d'image, dispositif de transmission d'image, dispositif de réception d'image et système de transmission d'image

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CAMERA EIZO KIKI KOGYOKAI KIKAKU, CIPA DC-007- 2009 MULTI PICTURE FORMAT, 4 February 2009 (2009-02-04) *

Similar Documents

Publication Publication Date Title
JP6960528B2 (ja) メディアコンテンツを生成および処理するための方法、装置、およびコンピュータプログラム
JP5519647B2 (ja) カメラ・パラメータを利用したステレオスコピック映像データ・ストリーム生成方法及びその装置、
US8259162B2 (en) Method and apparatus for generating stereoscopic image data stream for temporally partial three-dimensional (3D) data, and method and apparatus for displaying temporally partial 3D data of stereoscopic image
US8878836B2 (en) Method and apparatus for encoding datastream including additional information on multiview image and method and apparatus for decoding datastream by using the same
US8538134B2 (en) Method and apparatus for receiving and generating image data stream including parameters for displaying local three dimensional image
JP7399224B2 (ja) メディアコンテンツを送信するための方法、装置及びコンピュータプログラム
JP5022443B2 (ja) 立体映像コンテンツ再生に利用されるメタデータの復号化方法
KR101863767B1 (ko) 의사-3d 인위적 원근법 및 장치
JP5429034B2 (ja) 立体画像データ送信装置、立体画像データ送信方法、立体画像データ受信装置および立体画像データ受信方法
WO2012017643A1 (fr) Procédé de codage, dispositif d'affichage et procédé de décodage
WO2008054100A1 (fr) Procédé et appareil pour décoder des métadonnées utilisées pour lire un contenu stéréoscopique
JP2012186781A (ja) 画像処理装置および画像処理方法
WO2012111757A1 (fr) Dispositif de traitement d'images et procédé de traitement d'images
US10205927B2 (en) Encoding device and encoding method, and decoding device and decoding method
US20130070052A1 (en) Video procesing device, system, video processing method, and video processing program capable of changing depth of stereoscopic video images
WO2011083625A1 (fr) Dispositif de traitement d'image, support d'enregistrement d'informations, procédé de traitement d'image et programme
JPWO2012128069A1 (ja) 画像処理装置および画像処理方法
WO2013136373A1 (fr) Dispositif de traitement d'image en trois dimensions et procédé de traitement d'image en trois dimensions
KR20220101169A (ko) 멀티뷰 비디오 프로세싱 방법 및 장치
EP2685730A1 (fr) Dispositif de lecture, procédé de lecture et programme
TW201249168A (en) Operating method of display chip for three-dimensional display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12871586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12871586

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP