WO2010064774A1 - Method of transmitting a three-dimensional image signal, three-dimensional image display device, and signal processing method therein - Google Patents
- Publication number
- WO2010064774A1 (application PCT/KR2009/004619)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- floating window
- images
- data
- image signal
- Prior art date
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/194—Transmission of image signals
- H04N13/144—Processing image signals for flicker reduction
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
- H04N13/178—Metadata, e.g. disparity information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
- H04N19/597—Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using predictive coding specially adapted for multi-view video sequence encoding
- H04N21/23614—Multiplexing of additional data and video streams
- H04N21/4348—Demultiplexing of additional data and video streams
- H04N21/84—Generation or processing of descriptive data, e.g. content descriptors
- H04N7/08—Systems for the simultaneous or sequential transmission of more than one television signal
- H04N7/24—Systems for the transmission of television signals using pulse code modulation
Definitions
- the present invention relates to a video signal encoding method, and more particularly, to a method for encoding a stereoscopic video signal for implementing a three-dimensional image.
- the present invention relates to an image display device and a signal processing method thereof, and more particularly, to a stereoscopic display device for implementing a three-dimensional image, and a stereoscopic image signal processing method in such a device.
- Physiological depth cues include Accommodation, Convergence or Vergence, Binocular Disparity and Motion Parallax.
- Psychological depth cues include linear perspectives, shadows, shades of perspective, occlusion by other objects, texture gradients, and colors.
- accommodation refers to changing the focal length of the lens when the eye tries to focus on a specific area of the three-dimensional scene. Changes in the thickness of the lens are caused by changes in the tension of the ciliary muscles.
- accommodation normally operates together with convergence.
- Convergence or Vergence means that when the observer looks at an object at a finite distance, both eyes rotate inward so that the lines of sight of the two eyes intersect at the gaze point.
- Binocular disparity arises because the left and right eyes, which are about 65 millimeters apart, receive different images; it is the difference between the images projected onto the left and right retinas when viewing a 3D scene.
- Such binocular parallax becomes the critical depth cue that the visual system uses in depth sense or stereopsis.
- Motion parallax refers to the difference in relative displacement of each point in the 3D scene (ie, the near point moves more than the far point) when relative motion exists between the observer and the 3D scene.
- binocular parallax is simulated by capturing different images with two image sensors spaced apart by about 65 millimeters, like the human eyes, and presenting the two images separately to the left and right eyes of the viewer on the display device, so that the viewer obtains depth perception, that is, stereoscopic vision.
- the two image sensors are aligned in the horizontal direction with the same optical characteristics, focal length and zoom ratio.
- stereoscopic three-dimensional images differ in some respects from scenes that humans actually perceive in the real world.
- a different situation occurs when watching a stereoscopic image.
- the camera that captures the image focuses on a specific object and thus two stereoscopic image pairs are focused on the virtual stereoscopic window plane on which the object is located.
- the stereoscopic image pair is focused on the physical image display surface (hereinafter referred to as a "stereoscopic screen"). Accordingly, the convergence stimulus naturally varies with depth, while the focusing stimulus tends to remain fixed on the stereoscopic screen.
- the human eye always focuses on the stereoscopic screen, while the gaze point is in front of or behind the stereoscopic screen, depending on the position of the object being viewed, so that both eyes converge in a different depth plane than the stereoscopic screen.
- at the same time, the visual system attempts to refocus on the object located in front of or behind the stereoscopic screen, even though such an object cannot actually be brought into focus there.
- a more serious problem due to negative parallax is a conflict of cues that can occur when an object with a negative parallax value is partially occluded near the left or right edge of a stereoscopic image pair.
- FIG. 4 depicts a situation in which a first camera having a lens 10 and an image sensor 12 captures a left image projected onto a first stereoscopic window 14, and a second camera having a lens 20 and an image sensor 22 captures a right image projected onto a second stereoscopic window 24.
- the first and second stereoscopic windows 14 and 24 include first to third objects 30, 32, and 34.
- FIG. 5 shows an example of left and right images 14a and 24a displayed on a stereoscopic surface.
- the first and second objects 30 and 32, having zero parallax and a positive parallax value respectively, can be fused and displayed with a stereoscopic depth effect by binocular parallax.
- for the third object 34, which has negative parallax, the two images are fused by stereoscopic cues to present a stereoscopic image to the viewer. However, because the third object 34 is cut off by the left edge of the right image, another depth cue called 'occlusion' is activated, so the viewer may perceive the object as if it were located behind the display surface.
- such a conflict of cues is called an "edge violation"; it causes discomfort and confusion for the viewer and can significantly degrade the quality of the 3D image.
- the cue conflict caused by partial occlusion of an object can be attributed to the fact that the second camera cannot cover part of the left side of the first camera's field of view, so that blind spots arise.
- the same problem appears when the object is covered by the edge of the right image.
- the present invention addresses the above problems; one technical object is to provide a three-dimensional image signal transmission method that allows a remote receiver, when displaying a plurality of images for implementing a three-dimensional image, to reduce the conflict of depth cues that can occur near the left and right edges of the three-dimensional image.
- another technical object of the present invention is to provide an image display device that can reduce the conflict of depth cues occurring near the left and right edges of a three-dimensional image when displaying a plurality of images for implementing the three-dimensional image.
- yet another technical object of the present invention is to provide an image signal processing method that can reduce such a conflict of depth cues in displaying a plurality of images for implementing a three-dimensional image.
- a plurality of video signals respectively representing a plurality of images having different viewpoints are prepared.
- floating window information including at least one floating window application field, position and size, transparency, and color of each floating window is generated for each of the plurality of images.
- the floating window information is inserted into a video picture header region to encode and transmit the video signal.
- the floating window is a region set inside the left or right edge of each of the plurality of images, in which the display of the image obtained from the broadcast signal is suppressed.
- 'suppression' is used here to mean not only the case where the image of the corresponding area is completely invalidated and replaced with a specific color, but also the case where the image is blurred by alpha blending it with a specific color at a specific transparency value.
- the size of each floating window is determined based on the camera parameters of the image sensors that captured the images, the displacement between corresponding points in the plurality of images, or a combination of the camera parameters and the displacement.
- the present invention defines a syntax and data structure for including floating window information as metadata in the picture header of an encoded image signal and transmitting it, and defines how the receiver interprets the floating window information and reflects it on the screen.
- a data structure is generated by including the floating window information and a second field having a value corresponding to the floating window information.
- the user data structure is generated by including the data structure and a first field having a predetermined value.
- the second field is 'user_data_type_code', and a value of '0x09' in this field indicates floating window information.
- the first field is the 'user_data_identifier' field, and this field may have the value '0x4741 3934' for the data structure containing floating window information.
- the user data structure thus generated is inserted into the picture header.
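As an illustrative sketch only, a user data structure of this shape might be assembled as follows. The 'GA94' identifier (0x47413934) and type code 0x09 come from the description above; the byte layout of the per-window fields is an assumption for illustration, not the patent's actual syntax.

```python
import struct

def build_fw_user_data(windows):
    """Assemble an illustrative picture-level user data blob carrying
    floating window information. The per-window field layout here is
    hypothetical; only the identifier and type code follow the text."""
    payload = bytearray()
    payload += struct.pack(">I", 0x47413934)  # user_data_identifier: "GA94"
    payload += bytes([0x09])                  # user_data_type_code: floating window
    payload += bytes([len(windows)])          # number of floating windows
    for w in windows:
        payload += struct.pack(">HHBB",
                               w["x"], w["width"],   # position and size in pixels
                               w["transparency"],    # 0 = opaque .. 255 = clear
                               0)                    # reserved / padding
        payload += bytes(w["color"])                 # RGB background color
    return bytes(payload)

blob = build_fw_user_data([{"x": 0, "width": 32,
                            "transparency": 0, "color": (0, 0, 0)}])
```

Such a blob would then be carried as user data at the picture level, as the surrounding text describes.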
- a video display device including a decoder and a composite display.
- the decoder decodes the encoded video signal, reconstructs a plurality of video signals, and extracts floating window information from the picture header region of the encoded video signal.
- the composite display unit suppresses, for each of the plurality of images corresponding to the plurality of image signals, the image in a region inside the left and/or right edge according to the floating window information, and displays the locally suppressed images stereoscopically.
- the floating window information preferably includes the position and size of each floating window, transparency, and color data.
- the plurality of image signals include a left image signal for a left image and a right image signal for a right image.
- the composite display includes an on-screen display generator, first and second mixing units, and a formatter.
- the on-screen display generator generates a left floating window image signal and a right floating window image signal according to position, size data, and color data.
- the first and second mixing units synthesize the left image signal and the left floating window image signal, respectively, and synthesize the right image signal and the right floating window image signal.
- the formatter formats the output signals of the first and second mixing units according to the stereoscopic output scheme.
- the composite display unit includes a first formatter, an on-screen display generator, a second formatter, and a mixing unit.
- the first formatter formats the left video signal and the right video signal according to the stereoscopic output method.
- the on-screen display generator generates a left floating window image signal and a right floating window image signal according to position, size data, and color data.
- the second formatter formats the left floating window image signal and the right floating window image signal according to a stereoscopic output method.
- the mixing section synthesizes the output signals of the first and second formatters.
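As a minimal sketch of one stereoscopic output scheme such a formatter might apply (side-by-side packing is assumed here purely for illustration; the patent does not fix a single format):

```python
import numpy as np

def format_side_by_side(left, right):
    """Illustrative stereoscopic formatter: decimate each view to half
    width and place them side by side in one output frame."""
    half = lambda img: img[:, ::2, :]  # naive 2:1 horizontal decimation
    return np.concatenate([half(left), half(right)], axis=1)

# Tiny synthetic left/right frames (H x W x RGB).
left = np.zeros((4, 8, 3), dtype=np.uint8)
right = np.full((4, 8, 3), 255, dtype=np.uint8)
frame = format_side_by_side(left, right)  # same overall width as one input
```

A real formatter would interpolate rather than drop columns and would support other schemes (top-bottom, frame-sequential), but the data flow is the same: the mixed left and right signals enter, one display-ready frame leaves.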
- the video display device of the present invention is implemented in the form of a 3D television receiver.
- the tuner / demodulator receives and demodulates a broadcast signal through a predetermined channel and outputs a channel-coded transport stream.
- the channel decoding unit receives the channel encoded transport stream, performs error correction decoding, and outputs the decoded transport stream.
- the demultiplexing / depacketizing unit demultiplexes the decoded transport stream and depacketizes it to output the encoded video signal.
- first, the encoded video signal is obtained.
- the encoded video signal is decoded to reconstruct a plurality of video signals, and the floating window information of each floating window is extracted from the picture header region of the encoded video signal.
- for each of the plurality of images, the image is suppressed in a region inside the left and/or right edge according to the floating window information, and the locally suppressed images are displayed stereoscopically.
- a picture header is extracted from the encoded video signal.
- a user data structure in which the first field has a predetermined value is extracted from the picture header.
- a data structure having a value representing a floating window in a second field of the user data structure is extracted as the floating window information.
- the first field is the 'user_data_identifier' field, and a user data structure in which this field has the value '0x4741 3934' is extracted as a data structure containing floating window information.
- the second field is 'user_data_type_code', and a data structure in which this field has a value of "0x09" is extracted as floating window information.
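A minimal sketch of this extraction step, assuming the picture user data has already been split into raw byte blocks (the block layout is an assumption for illustration; only the 'GA94' identifier and type code 0x09 follow the text):

```python
def extract_fw_user_data(user_data_blocks):
    """Return the payload of the first user data block that carries
    floating window information (identifier "GA94", type code 0x09)."""
    for blob in user_data_blocks:
        if len(blob) < 5:
            continue
        identifier = int.from_bytes(blob[:4], "big")
        type_code = blob[4]
        if identifier == 0x47413934 and type_code == 0x09:
            return blob[5:]  # remaining bytes: floating window data
    return None

# One unrelated block and one floating-window block (contents invented).
blocks = [b"\x00\x00\x00\x01junk", b"GA94\x09\x01payload"]
fw = extract_fw_user_data(blocks)
```

Blocks with other identifiers or type codes are simply skipped, mirroring the two-stage filtering the text describes.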
- in processing the floating window, a floating window is set inside at least one of the left and right edges of each of the plurality of images according to the position and size data, and the image portion corresponding to the floating window is invalidated.
- the image portion corresponding to the floating window position may be filled with a specific color.
- the image portion corresponding to the floating window position may be filled with the color specified in the floating window information.
- an on-screen display image may be generated for the image portion corresponding to the floating window position, and the image portion may be invalidated by superimposing the on-screen display image on it.
- alternatively, in processing the floating window, a floating window is set inside at least one of the left and right edges of each of the plurality of images according to the position and size data, and the image portion corresponding to the floating window is suppressed by alpha blending it with the color specified in the floating window information according to the transparency data.
- the floating window information includes color and transparency data in addition to the position and size data for each floating window.
- the floating window may be set inside at least one of the left and right edges of each of the plurality of images, and the image portion corresponding to the floating window position may be suppressed based on the color and transparency data specified in the floating window information.
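A minimal sketch of this alpha-blending suppression for a left-edge window, assuming RGB images as NumPy arrays and a transparency convention (0.0 fully replaces the image, 1.0 leaves it untouched) adopted here only for illustration:

```python
import numpy as np

def suppress_edge(image, width, color, transparency):
    """Alpha-blend a vertical strip of `width` pixels at the left edge
    with `color`. transparency=0.0 fully replaces the image (complete
    invalidation); intermediate values blur it, as the text describes."""
    out = image.astype(np.float32).copy()
    strip = out[:, :width, :]
    out[:, :width, :] = (transparency * strip
                         + (1.0 - transparency) * np.array(color, np.float32))
    return out.astype(np.uint8)

img = np.full((2, 4, 3), 200, dtype=np.uint8)   # uniform gray test image
masked = suppress_edge(img, width=2, color=(0, 0, 0), transparency=0.0)
```

A right-edge window is the mirror image of the same operation, applied to the last `width` columns instead of the first.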
- suppressing the image at the floating window position may be performed according to a user operation command. For example, it is preferable to activate the floating window only when a user operation command, entered through the remote controller now or in the past, does not prohibit such asymmetric suppression. It is also desirable that the size of the floating window can be adjusted in response to a user operation command.
- the present invention asymmetrically suppresses a region inside the left edge, the right edge, or both edges of the left and right images of a stereoscopic 3D image by means of a floating window. Accordingly, near the left and right edges of the stereoscopic image pair, it is possible to prevent or reduce the cue conflict, that is, the edge violation problem, that can occur when an object having a negative parallax value is partially cut off. It thus becomes possible to realize the illusion of an object with negative parallax reliably emerging in front of the display surface. Therefore, the present invention can reduce the discomfort and confusion caused by edge violations, and improve the quality of the 3D image by enhancing the sense of depth and realism.
- FIG. 1 is a diagram illustrating a viewing point having a horizontal parallax of zero in a stereoscopic 3D image
- FIG. 3 exemplarily illustrates a gaze point having a negative parallax value in a stereoscopic 3D image
- FIG. 4 shows an example of an arrangement form of a camera for generating a stereoscopic image pair
- FIG. 5 is a diagram showing examples of left and right images to illustrate collisions of depth cues that may occur near left and right edges of a stereoscopic image pair;
- FIG. 6 is a block diagram of an embodiment of a broadcast program production and delivery system according to the present invention.
- FIG. 7 shows one embodiment of a method for calculating the size of a floating window
- FIG. 11 shows the syntax of a SEI RBSP Payload bitstream suitable for transmitting floating window information in a modified embodiment where stereoscopic image pairs are encoded by the H.264 / AVC standard;
- FIG. 12 is a block diagram of one embodiment of a television receiver in accordance with the present invention.
- FIG. 13 is a flowchart illustrating a process of extracting floating window information and synthesizing the left and right images by the television receiver of FIG. 12.
- FIG. 14 is a flowchart illustrating a process of extracting floating window information in FIG. 12 in detail
- FIGS. 15 to 21 are screen shots for explaining a process of synthesizing a floating window into left and right images, and FIGS. 15 and 16 show left and right images before synthesis of floating windows, respectively.
- FIGS. 19 to 21 are views illustrating examples of a screen in which a floating window is synthesized and displayed with the left and right images overlapping each other;
- FIG. 22 is a block diagram of another embodiment of a television receiver according to the present invention.
- FIG. 23 is a block diagram of another embodiment of a television receiver according to the present invention.
- a broadcast program production and delivery system includes a binocular camera 100, a preprocessor 102, an image encoder 104, a user data inserter 106, a floating window information generator 108, a plurality of microphones 110a to 110f, a voice encoder 112, a packet generator 114, a transmission multiplexer 116, a PSI/PSIP generator 118, a channel encoder 120, a modulator 122, and a transmitter 124.
- the binocular camera 100 includes two lenses and corresponding image pickup devices, and captures a pair of two-dimensional images of a front scene. Like the human eye, the two lenses are disposed to have a distance of 65 millimeters (mm). Accordingly, the camera 100 acquires two two-dimensional images having binocular parallax.
- the image acquired by the left lens of the two two-dimensional images constituting the stereoscopic image pair will be referred to as the left image and the image obtained by the right lens will be referred to as a right image.
- the preprocessor 102 removes noise that may exist in the left and right original images, corrects the images, and resolves any imbalance of luminance components. The images may, of course, be stored or edited in a storage device before or after the preprocessing by the preprocessor 102, so that there may be a considerable time difference between capture by the binocular camera 100 and encoding by the image encoder 104.
- the image encoder 104 compresses the video signal by removing temporal and spatial redundancy from the preprocessed left and right images, and creates a video elementary stream (ES) according to the MPEG-2 standard of ISO/IEC 13818-2 and the ATSC digital television standard of A/53 Part 4.
- the image encoder 104 encodes the image used as the base view among the left and right images, for example the left image, on a frame basis according to the MPEG-2 standard; for the right image, exploiting the high spatial correlation between the left and right images, it computes the difference image between the left and right images, estimates motion vectors, and encodes the difference image.
- the image encoder 104 may encode the right image in the same manner as the left image.
- the image encoder 104 may also use other coding schemes that are not block-based, such as a pixel-based scheme, a feature-based scheme, or an object-based scheme.
- alternatively, the image encoder 104 may encode the plurality of images according to the H.264/AVC standard drafted by the Joint Video Team (JVT) of ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 Q.6.
- extended data and user data can be inserted at a sequence level, a GOP level, or a picture level.
- the user data inserter 106 provides the image encoder 104 with extended data and user data to be inserted at the sequence level, the GOP level, or the picture level when the image encoder 104 encodes the image signal.
- the user data inserting unit 106 provides the video encoding unit 104 with floating window information from the floating window information generating unit 108 as one type of user data.
- in particular, the user data inserter 106 causes the floating window information to be included in the picture header of the encoded image signal.
- the floating window information generator 108 receives the corrected left and right images from the preprocessor 102 and compares the left and right images to calculate the size of the floating window.
- the floating window refers to a top-to-bottom strip region of a specific background color that is set near the left edge of the left image or the right edge of the right image to replace part of the image, in order to reduce conflicts of depth cues that may occur near the left and right edges when the stereoscopic image pair is displayed at the receiver side.
- the floating window may be formed at the right edge of the left image or the left edge of the right image.
- the floating window information generator 108 provides information on the application and width of each floating window, together with transparency and color information, to the image encoder 104 through the user data inserter 106 as floating window information.
- the floating window information generator 108 selects an object 154b among the leftmost objects in the right image 150b and searches for the position of the corresponding object 154a in the left image 150a. The floating window information generator 108 then calculates the displacement, that is, the position difference DIST, between the objects 154a and 154b, and sets this displacement as the width of the floating window. Whether a floating window is applied to the right image, and its size, are determined in the same manner as for the left image.
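The displacement-based width computation can be sketched as follows, using a simple template match in place of whatever object-matching procedure the encoder actually uses (the band width, images, and matching criterion are all illustrative assumptions):

```python
import numpy as np

def floating_window_width(left, right, band=16):
    """Estimate the left-edge floating window width as the horizontal
    displacement DIST that best aligns the leftmost band of the right
    image within the left image (sum-of-squares template match)."""
    template = right[:, :band].astype(np.float32)
    best, best_err = 0, np.inf
    for d in range(left.shape[1] - band):
        err = np.mean((left[:, d:d + band].astype(np.float32) - template) ** 2)
        if err < best_err:
            best, best_err = d, err
    return best  # DIST: displacement of the corresponding object

# Synthetic grayscale pair: the same object sits at x=10 in the left
# image and x=4 in the right image, so DIST should come out as 6.
left = np.zeros((4, 32), dtype=np.uint8)
left[:, 10:14] = 255
right = np.zeros((4, 32), dtype=np.uint8)
right[:, 4:8] = 255
```

A production encoder would match identified objects rather than a fixed edge band, but the principle (displacement between corresponding points sets the window width) is the one the text states.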
- the floating window information generator 108 may also receive camera parameters such as zoom, focal length, viewing angle, gaze point, and lens distortion from the camera 100, and adjust the size of the floating windows in the left and right images based on these parameters. For example, when the focal length of the camera 100 is longer than a predetermined reference, the width of the floating window may be reduced, and when it is shorter than the reference, the width may be increased.
- whether the floating window is applied or the adjustable range may vary depending on the property of the program image or the location of the photographing.
- the cameraman may directly determine whether or not the floating window is applied or the adjustable range, or the editor may determine this during the image editing process. Of course, the work of these cameramen or editors may be performed by a computer program.
- the plurality of microphones 110a to 110f respectively installed at appropriate positions in the photographing scene acquire sound at the photographing site and convert the sound into an electrical voice signal.
- the speech encoder 112 encodes the speech signal received from each of the microphones 110a to 110f according to a predetermined standard, for example, the AC-3 standard, to generate an audio elementary stream ES.
- the packet generator 114 receives the video ES and the audio ES from the video encoder 104 and the audio encoder 112, respectively, and packetizes each stream to generate a packetized elementary stream (PES).
- the PSI/PSIP generation unit 118 generates program specific information (PSI) and program and system information protocol (PSIP) information.
- the multiplexer 118 adds a header to the PES and PSI / PSIP information to generate a transport stream (TS).
- while the system of FIG. 6 transmits a single channel over terrestrial waves, in a system that transmits the broadcast signal over a cable or satellite network, for example, a separate transmission multiplexer may multiplex the broadcast signals of multiple channels to generate a multi-program TS.
- the channel encoder 120 performs error-correction coding on the TS so that the receiver can detect and correct errors that may be caused by noise in the transmission channel.
- the modulator 122 modulates the channel-coded TS by a modulation scheme adopted by the system, for example, an 8-VSB modulation scheme.
- the transmitter 124 transmits the modulated broadcast signal according to channel characteristics, for example, through the antenna 126.
- an extension_and_user_data() structure for defining extension data or user data can be inserted into the header.
- the floating window information is included and transmitted as picture user data at the picture level, that is, in the extension_and_user_data() structure that can be placed after the picture header in the video ES.
- the '2' in the syntax name extension_and_user_data(2) indicates that the syntax is at the picture level.
- when the bits following the picture header are extension_start_code or user_data_start_code, the subsequent bit strings follow the extension_and_user_data(2) structure. Accordingly, the receiver recognizes the bit strings following extension_start_code or user_data_start_code as extension_data(2) or user_data(), respectively.
- the picture user data user_data() includes the 'user_data_start_code' and 'user_data_identifier' fields, which are followed by user_structure().
- the value of 'user_data_start_code' is set to "0x0000 01B2" by the ISO / IEC 13818-2 standard.
- the 'user_data_identifier' field is a 32-bit code representing the syntax and meaning of user_structure (), and is defined as the value of 'format_identifier' as defined in the ISO / IEC 13818-1 standard.
- to indicate ATSC_user_data() as in the present invention, it is set to the value "0x4741 3934".
- user_structure() is a variable-length data structure defined by the 'user_data_identifier' field and includes 'user_data_type_code' and user_data_type_structure(), as shown in the lower part of FIG. 9.
- 'user_data_type_code' is an 8-bit value indicating the type of ATSC user data, and is set to a value of "0x09" when indicating floating window data.
- the floating window data syntax is a variable-length data structure that includes four flags indicating the presence or absence of a floating window inside the left and right edges of the left and right images, together with the size, transparency, and color of each flagged window.
- the 'number_pixels_of_LL_window' field has an unsigned integer value of 14 bits and indicates the number or coordinates of the last luminance sample in the left floating window of the left image, thereby indicating the width of the left floating window of the left image.
- the 'transparency_LL_window' field has a 24-bit unsigned integer value and specifies the transparency of the left floating window of the left image.
- the 'color_LL_window' field specifies the color of the left floating window of the left image.
- marker bits having the value '11' indicate that the information for each window begins. Meanwhile, the width of the floating window may be indicated directly instead of through the 'number_pixels_of_LL_window' field. The same applies to the other windows.
- the 'number_pixels_of_LR_window' field has an unsigned integer value of 14 bits and represents the number or coordinate of the first luminance sample in the right floating window of the left image, thereby indicating the width of the right floating window of the left image.
- the 'transparency_LR_window' field has a 24-bit unsigned integer value and specifies the transparency of the right floating window of the left image.
- the 'color_LR_window' field specifies the color of the right floating window of the left image.
- the 'number_pixels_of_RL_window' field has an unsigned integer value of 14 bits and indicates the number or coordinate of the last luminance sample in the left floating window of the right image, thereby indicating the width of the left floating window of the right image.
- the 'transparency_RL_window' field has a 24-bit unsigned integer value and specifies the transparency of the left floating window of the right image.
- the 'color_RL_window' field specifies the color of the left floating window of the right image.
- the 'number_pixels_of_RR_window' field has an unsigned integer value of 14 bits and indicates the number or coordinate of the first luminance sample in the right floating window of the right image, thereby indicating the width of the right floating window of the right image.
- the 'transparency_RR_window' field has a 24-bit unsigned integer value and specifies the transparency of the right floating window of the right image.
- the 'color_RR_window' field specifies the color of the right floating window of the right image.
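The field descriptions above can be exercised with a small bit-level parser. This is only a sketch: the exact bit ordering (LL, LR, RL, RR), the placement of the '11' marker bits, and the 14/24/24-bit widths are an interpretation of the descriptions here, not the normative fw_data() syntax of FIG. 10.

```python
class BitReader:
    """Minimal MSB-first bit reader over a bytes payload."""
    def __init__(self, data: bytes):
        self.bits = "".join(f"{b:08b}" for b in data)
        self.pos = 0

    def read(self, n: int) -> int:
        val = int(self.bits[self.pos:self.pos + n], 2)
        self.pos += n
        return val

def parse_fw_data(payload: bytes) -> dict:
    """Parse a floating-window structure: four presence flags, then for
    each set flag: 2 marker bits ('11'), a 14-bit pixel count, a 24-bit
    transparency value, and a 24-bit color value."""
    r = BitReader(payload)
    names = ["LL", "LR", "RL", "RR"]
    flags = {n: r.read(1) for n in names}
    windows = {}
    for n in names:
        if flags[n]:
            assert r.read(2) == 0b11      # marker bits
            windows[n] = {
                "number_pixels": r.read(14),
                "transparency": r.read(24),
                "color": r.read(24),
            }
    return windows

# Build an example payload: LL and RR windows present, LR and RL absent.
bits = "1" + "0" + "0" + "1"
bits += "11" + f"{2:014b}" + f"{0:024b}" + f"{0x505050:024b}"   # LL window
bits += "11" + f"{3:014b}" + f"{0:024b}" + f"{0x000000:024b}"   # RR window
bits += "0" * (-len(bits) % 8)                                   # byte-align
payload = int(bits, 2).to_bytes(len(bits) // 8, "big")

wins = parse_fw_data(payload)
assert wins["LL"]["number_pixels"] == 2 and wins["LL"]["color"] == 0x505050
assert wins["RR"]["number_pixels"] == 3
```

The values match the worked example later in the description: a 2-pixel left window in color 0x505050 for the left image and a 3-pixel right window in color 0x000000 for the right image.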
- two two-dimensional images constituting a stereoscopic image pair are encoded by the MPEG-2 standard, and the floating window information is included in the picture header in the video ES as user data.
- the left and right images may instead be encoded by a different coding technique, for example H.264/AVC, drafted by the Joint Video Team (JVT) of ISO/IEC JTC1/SC29/WG11 and ITU-T SG16 Q.6.
- in this case, the floating window information may be included in the Supplemental Enhancement Information (SEI) region and transmitted.
- FIG. 11 shows the syntax of an SEI RBSP payload bitstream suitable for transmitting floating window information in this modified embodiment, in which the stereoscopic image pair is encoded by the H.264/AVC standard.
- 'itu_t_t35_country_code' is an 8-bit country code defined in Annex A of ITU-T T.35 and has the value "0x61" for Korea.
- 'itu_t_t35_provider_code' is a 16-bit code with a value of 0x0031.
- 'user_identifier' is a 32-bit code and can indicate that the syntax of user_structure () is defined in ATSC A / 53 using the value "0x4741 3934".
- user_structure() can be used in the same way as defined in the ATSC digital television standard, that is, A/53 Part 4 Section 6.2.3; thus the user data syntax at the bottom of FIG. 9 and the floating window data syntax of FIG. 10 can be used to convey the floating window information.
- the television receiver according to the present embodiment is suitable for receiving an over-the-air broadcast signal and playing back an image.
- the tuner 202 selects and outputs a broadcast signal of one channel selected by a user from among a plurality of broadcast signals input through an antenna.
- the demodulator 204 demodulates the broadcast signal from the tuner 202 and outputs a demodulated transport stream.
- the channel decoding unit 206 performs error correction decoding on the demodulated signal.
- the demultiplexer 208 demultiplexes the error correction-decoded TS, separates the video PES and the audio PES, and extracts PSI / PSIP information.
- the PSI / PSIP processing unit 210 stores the PSI / PSIP information in a memory (not shown) or provides the main control unit 200 so that the broadcast is reproduced according to this information.
- the packet release unit 212 restores the video ES and the audio ES by releasing packets for the video PES and the audio PES.
- the audio decoder 214 decodes the audio ES and outputs an audio bitstream.
- the audio bitstream is converted into an analog voice signal by a digital-to-analog converter (not shown), amplified by an amplifier (not shown), and then output through a speaker (not shown).
- the image decoding unit 216 decodes the video ES and outputs a video bitstream, that is, picture data.
- the left and right image separator 218 separates the left image signal and the right image signal from the picture data. Meanwhile, during decoding, the image decoder 216 extracts the header and extension/user data from the video ES and provides them to the main controller 200, so that the main controller 200 can extract the floating window data fw_data() and restore the floating window information.
- the main controller 200 determines whether a left or right floating window exists in the left or right image based on the four flags 'left_view_left_float_window_flag', 'left_view_right_float_window_flag', 'right_view_left_float_window_flag', and 'right_view_right_float_window_flag' among the floating window data of FIG. 10.
- the main controller 200 checks the width of each floating window whose flag is set to "1" through the 'number_pixels_of_LL_window', 'number_pixels_of_LR_window', 'number_pixels_of_RL_window', and 'number_pixels_of_RR_window' fields.
- the main controller 200 checks a color to be used when outputting each floating window based on the fields 'color_LL_window', 'color_LR_window', 'color_RL_window', and 'color_RR_window'.
- the main control unit 200 checks the alpha blending value for each floating window based on the 'transparency_LL_window', 'transparency_LR_window', 'transparency_RL_window', and 'transparency_RR_window' fields.
- the main controller 200 provides the floating window information to the graphics engine 220 in a suitable form, and controls the graphics engine 220, the first and second OSD generators 222 and 224, the first and second mixers 226 and 228, and the formatter 230 so that image processing is performed based on the floating window information.
- the graphics engine 220 receives floating window information such as window size, transparency, color, etc. in a suitable form, and generates OSD data for the floating window in the left and right images based on this.
- the first OSD generator 222 generates an OSD signal for the left image based on the OSD data for the left image floating window.
- the first mixer 226 mixes the left-image OSD signal from the first OSD generator 222 with the left image signal from the left and right image separator 218, so that in the floating window region of the left image the OSD signal either replaces the left image or is alpha-blended into the left image signal.
- the second OSD generator 224 generates an OSD signal for the right image based on the OSD data for the right image floating window.
- the second mixer 228 mixes the right-image OSD signal from the second OSD generator 224 with the right image signal from the left and right image separator 218, so that in the floating window region of the right image the OSD signal either replaces the right image or is alpha-blended into the right image signal.
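The replace-or-blend behavior of the mixers can be sketched per pixel in Python. The data layout (an image as a list of rows of (R, G, B) tuples) and the convention that transparency 0 means full replacement are illustrative assumptions, not the receiver's actual pixel pipeline:

```python
def apply_floating_window(image, width, color, transparency):
    """Overlay a left-edge floating window of `width` columns.

    transparency == 0.0 replaces the image with the window color;
    0 < transparency < 1 alpha-blends the window color into the image.
    `image` is a list of rows, each a list of (R, G, B) tuples.
    """
    t = transparency
    out = []
    for row in image:
        new_row = list(row)
        for x in range(min(width, len(row))):
            r, g, b = row[x]
            wr, wg, wb = color
            new_row[x] = (round((1 - t) * wr + t * r),
                          round((1 - t) * wg + t * g),
                          round((1 - t) * wb + t * b))
        out.append(new_row)
    return out

img = [[(200, 200, 200)] * 4]          # one row, four gray pixels
# Opaque 2-pixel window in color 0x505050 -> first two pixels replaced
res = apply_floating_window(img, 2, (0x50, 0x50, 0x50), 0.0)
assert res[0][0] == (0x50, 0x50, 0x50) and res[0][2] == (200, 200, 200)
```

A right-edge window would work identically over the last `width` columns; the second mixer applies the same operation to the right image.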
- the formatter 230 aligns the frame timing of the OSD-superimposed left and right image signals and formats them so that the pair is displayed together on the display surface 232, thereby reconstructing the 3D image.
- image synthesis by the first and second mixers 226 and 228 may be performed selectively in response to a user's command. That is, the viewer may issue a command through a remote controller (not shown) so that the floating window is not applied.
- the main control unit 200 may store the command in a memory and control the composition of the floating window with reference to the command.
- the generation of the floating window by the graphics engine 220 may also be modified according to a user's command. For example, the user may command that the width of each floating window be reduced arbitrarily within the range of the window size specified in the floating window information received through the broadcast signal; accordingly, the main controller 200 adjusts the floating window information before providing it to the graphics engine 220.
- FIG. 13 is a flowchart illustrating a process of extracting floating window information and compositing the left and right images.
- the main controller 200 extracts the floating window data fw_data () from the header and extension / user data extracted by the image decoder 216 during the decoding of the video ES (step 250).
- the floating window data may be extracted from the SEI (Supplemental Enhancement Information) region.
- whether a floating window that invalidates part of the left or right image is applied may be determined in response to a user command. Accordingly, the main controller 200 continuously checks whether a command concerning application of the floating window is received from the remote controller (not shown), stores it in a memory when received, and controls the synthesis of the floating window with reference to it. In operation 252, the main controller 200 checks whether such a user command is stored in the memory or newly received from the remote controller, and in operation 254 it determines whether the floating window should be activated.
- if the floating window should be activated, the main controller 200 controls the graphics engine 220, the first and second OSD generators 222 and 224, and the first and second mixers 226 and 228 so that the floating window is applied to the left and right images (step 256).
- the formatter 230 formats the OSD-superimposed left and right images according to the stereoscopic display method of the receiver to display the 3D image (step 258).
- FIG. 14 illustrates in detail the extraction of the floating window information (operation 250) of FIG. 13.
- the image decoder 216 decodes the video ES to output picture data, that is, the video coding layer (VCL), and extracts the header and extension/user data from the video ES (step 270). The main controller 200 then decodes extension_and_user_data(2) in the picture header to extract the picture user data user_data() (step 272).
- meanwhile, when the video ES is encoded by the H.264/AVC standard, the main controller 200 parses the AVC NAL units, extracts the SEI data having a 'nal_unit_type' value of '6', and reads the user_data_registered_itu_t_t35() message having a 'payloadType' value of '4'.
- the main controller 200 detects ATSC_user_data() whose 'user_data_identifier' has the value "0x4741 3934" in the picture user data user_data().
- the main control unit 200 detects user_data_type_structure () having 'user_data_type_code' of “0x09” from the ATSC_user_data () (step 276).
- the main controller 200 reads the floating window data fw_data() from the user_data_type_structure() and extracts the size, transparency, and color of each window whose presence flag is set (step 278).
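The detection steps above amount to matching identifier and type-code values before handing the remaining bytes to the fw_data() parser. The sketch below assumes a simplified, byte-aligned layout (a 32-bit user_data_identifier followed directly by an 8-bit user_data_type_code); the real user_data() syntax carries additional fields:

```python
ATSC_IDENTIFIER = 0x47413934   # "GA94", the ATSC user_data_identifier
FW_TYPE_CODE = 0x09            # user_data_type_code for floating window data

def extract_fw_payload(user_data):
    """Return the fw_data() bytes if this user_data() payload carries
    ATSC floating window data, else None (simplified layout)."""
    if len(user_data) < 5:
        return None
    if int.from_bytes(user_data[:4], "big") != ATSC_IDENTIFIER:
        return None
    if user_data[4] != FW_TYPE_CODE:
        return None
    return user_data[5:]

sample = b"GA94" + bytes([FW_TYPE_CODE, 0xAB, 0xCD])
assert extract_fw_payload(sample) == b"\xab\xcd"
assert extract_fw_payload(b"GA94" + bytes([0x03])) is None  # other user data type
```

The same check applies whether user_data() came from extension_and_user_data(2) in an MPEG-2 picture header or from an H.264/AVC user_data_registered_itu_t_t35() SEI message.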
- FIGS. 15 to 21 are screenshots for explaining a process of synthesizing the floating window to the left and right images.
- the left and right images before the floating window synthesis are as shown in FIGS. 15 and 16, respectively, and the floating window information is as follows.
- in this example, since the 'left_view_left_float_window_flag' and 'right_view_right_float_window_flag' flags are set to "1", the main controller 200 determines that the left floating window for the left image and the right floating window for the right image should be activated. Meanwhile, since the 'left_view_right_float_window_flag' and 'right_view_left_float_window_flag' flags are set to "0", the main controller 200 determines that the right floating window for the left image and the left floating window for the right image should not be activated.
- the graphics engine 220 generates OSD data for the left floating window of the left image from the 'number_pixels_of_LL_window', 'transparency_LL_window', and 'color_LL_window' fields, which give the width and color of the floating window and the transparency value for alpha blending. Likewise, the graphics engine 220 generates OSD data for the right floating window of the right image from the 'number_pixels_of_RR_window', 'transparency_RR_window', and 'color_RR_window' fields.
- the left floating window for the left image is a vertical strip whose width is 2 pixels, whose transparency is 0, and whose color is "0x505050".
- the right floating window for the right image is a vertical strip whose width is 3 pixels, whose transparency is 0, and whose color is "0x000000".
- the first mixer 226 superimposes the OSD image on the left image of FIG. 15, and generates a left image signal representing the left image in which the floating window is overlapped as shown in FIG. 17.
- the second mixer 228 superimposes the OSD image on the right image of FIG. 16, and generates a right image signal representing the right image in which the floating window is overlapped, as shown in FIG. 18.
- the formatter 230 synthesizes left and right images in which an OSD image is superimposed.
- FIGS. 19 to 21 illustrate examples of screens on which the left and right images with the floating windows superimposed are displayed.
- FIG. 19 illustrates an example of synthesis using a horizontal interleaving method, in which left and right images are alternately displayed while changing lines in a horizontal direction.
- FIG. 20 shows an example in which a left image and a right image are synthesized by a vertical interleaving method. Accordingly, the left image and the right image are alternately displayed by changing the vertical lines one by one.
- FIG. 21 shows an example in which the left image and the right image are synthesized in a checkerboard pattern.
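The three synthesis patterns of FIGS. 19 to 21 can be mimicked on toy nested-list "images" (a sketch only; real formatters operate on equally sized pixel buffers):

```python
def horizontal_interleave(left, right):
    """Alternate whole rows: even rows from the left image, odd from the right."""
    return [left[y] if y % 2 == 0 else right[y] for y in range(len(left))]

def vertical_interleave(left, right):
    """Alternate columns within each row."""
    return [[l if x % 2 == 0 else r for x, (l, r) in enumerate(zip(lrow, rrow))]
            for lrow, rrow in zip(left, right)]

def checkerboard(left, right):
    """Alternate per pixel in a checkerboard pattern."""
    return [[l if (x + y) % 2 == 0 else r
             for x, (l, r) in enumerate(zip(lrow, rrow))]
            for y, (lrow, rrow) in enumerate(zip(left, right))]

L = [["L"] * 4 for _ in range(4)]
R = [["R"] * 4 for _ in range(4)]
assert horizontal_interleave(L, R)[1] == ["R"] * 4
assert vertical_interleave(L, R)[0] == ["L", "R", "L", "R"]
assert checkerboard(L, R)[1][0] == "R"
```

In each pattern, a pixel taken from a floating-window region carries the window color (or blend) rather than the original image content, so the suppressed edge region survives formatting.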
- the left image or the right image replaced by the floating window is not displayed, and only the left image or the right image that is not affected by the floating window is displayed.
- the displayed left and right images are input to the viewer's left eye and right eye through polarized glasses.
- FIGS. 19 to 21 are suitable for receiver systems using polarized glasses.
- the present invention can also be applied to a receiver system using shutter glasses.
- the images of FIGS. 17 and 18 may be alternately displayed on the display surface.
- the shutter glasses synchronized with the switching of the displayed image deliver the left image and the right image only to the viewer's left eye and right eye, respectively.
- in the embodiments above, the graphics engine 220 generates the OSD data for the floating window based on the floating window information, and the first and second OSD generators 222 and 224 generate the OSD image signals. However, the graphics engine 220 and the first and second OSD generators 222 and 224 may be integrated into one unit; FIG. 22 shows such an embodiment.
- the graphics engine 320 receives the floating window information from the main controller 200 in a suitable form, generates OSD signals for implementing the left and right floating windows based on the information, and outputs them to the first and second mixers 226 and 228. Since the other features of this embodiment are similar to those of FIG. 12, a detailed description is omitted.
- FIG. 23 shows another modified embodiment of a television receiver according to the present invention.
- in this embodiment, the image decoder 216 outputs the broadcast payload data, that is, the left and right picture data of the video coding layer (VCL).
- the first formatter 410 formats the left image signal and the right image signal from the left and right image separators 218 according to the stereoscopic output method.
- the graphics engine 420 receives the floating window information from the main controller 200 in a suitable form, and generates OSD data for the floating window in the left and right images based on the information.
- the first OSD generator 422 generates an OSD signal for the left image based on the OSD data for the left image floating window.
- the second OSD generator 424 generates an OSD signal for the right image based on the OSD data for the right image floating window.
- the second formatter 426 formats the left image OSD signal and the right image OSD signal according to the stereoscopic output method.
- the mixer 428 synthesizes the output signals of the first and second formatters 410 and 426 so that the synthesized image signals are displayed on the display surface 232.
- in the embodiments above, OSD images are generated based on the floating window information and synthesized with the broadcast images; alternatively, the broadcast images may be manipulated directly in accordance with the floating window information.
- the above description focuses on embodiments in which the broadcast video portion corresponding to the position of the floating window is erased or invalidated by replacing it with the color specified in the floating window information. However, the broadcast video portion corresponding to the position of the floating window need not be completely invalidated; it may instead be suppressed by alpha blending.
- the present invention has been described with reference to over-the-air digital broadcasting, but the present invention can be equally applied to broadcasting transmitted through a cable network.
- the same may be applied to the storage of the image and the reproduction of the stored image through a storage medium such as a DVD, a Blu-ray disc, or a personal video recorder (PVR).
- the present invention can be applied to video transmission on a network.
- the present invention asymmetrically suppresses the regions inside the left, right, or both edges of the left and right images of a stereoscopic 3D image by means of the floating window. Accordingly, near the left and right edges of the stereoscopic image pair, it is possible to prevent or reduce the conflict of depth cues that can occur when an object having a negative parallax value is partially cut off at the edge, that is, the edge violation problem. As a result, the optical illusion in which an object having negative parallax reliably appears in front of the display surface can be realized. Therefore, the present invention can reduce the discomfort and confusion caused by edge violations and improve the quality of the 3D image by increasing the sense of depth and realism.
- the present invention can be applied to a stereoscopic 3D TV as well as to a multiview 3D TV in a similar manner.
- the present invention can be applied to lenticular, integral imaging, holography or other three-dimensional TV.
Claims (18)
- A three-dimensional image signal transmission method comprising: preparing a plurality of video signals respectively representing a plurality of images having different viewpoints; generating floating window information including, for each of the plurality of images, at least one field indicating whether a floating window is applied, and the position and size, transparency, and color of each floating window; and encoding and transmitting the video signals with the floating window information inserted into a video picture header region.
- The method of claim 1, wherein the floating windows can be set inside the left and right edges of each of the plurality of images, and the size of each floating window is determined individually.
- The method of claim 2, wherein the size of each floating window is determined based on one selected from a reference group consisting of: a camera parameter of the image sensor that captured the image corresponding to the floating window; a displacement between corresponding points within the plurality of images; and a combination of the camera parameter and the displacement.
- A three-dimensional image display apparatus comprising: a decoder which decodes an encoded video signal to restore a plurality of image signals and extracts, from a picture header region of the encoded video signal, floating window information including the position and size, transparency, and color data of each floating window; and a synthesis display unit which, for each of a plurality of images corresponding to the plurality of image signals, controls image display in a region inside the left or right edge according to the floating window information and displays the plurality of images in a stereoscopic manner.
- The apparatus of claim 4, wherein the synthesis display unit suppresses the image in the region according to the floating window information and displays the locally suppressed images in a stereoscopic manner.
- The apparatus of claim 5, wherein the plurality of image signals include a left image signal for a left image and a right image signal for a right image, and the synthesis display unit comprises: an on-screen display generator which generates a left floating window image signal and a right floating window image signal according to the position and size data and the color data; first and second mixers which respectively synthesize the left image signal with the left floating window image signal, and the right image signal with the right floating window image signal; and a formatter which formats the output signals of the first and second mixers according to the stereoscopic output method.
- The apparatus of claim 5, wherein the plurality of image signals include a left image signal for a left image and a right image signal for a right image, and the synthesis display unit comprises: a first formatter which formats the left image signal and the right image signal according to the stereoscopic output method; an on-screen display generator which generates a left floating window image signal and a right floating window image signal according to the position and size data and the color data; a second formatter which formats the left floating window image signal and the right floating window image signal according to the stereoscopic output method; and a mixer which synthesizes the output signals of the first and second formatters.
- The apparatus of claim 5, wherein the apparatus is a television receiver further comprising: a tuner/demodulator which receives and demodulates a broadcast signal through a channel and outputs a channel-coded transport stream; a channel decoder which receives the channel-coded transport stream and performs error-correction decoding to output a decoded transport stream; and a demultiplexer/depacketizer which demultiplexes the decoded transport stream and releases packets to output the encoded video signal.
- A three-dimensional image signal processing method in a three-dimensional image display apparatus, the method comprising: (a) acquiring an encoded video signal; (b) decoding the encoded video signal to restore a plurality of image signals, and extracting, from a picture header region of the encoded video signal, floating window information including the position and size, transparency, and color data of each floating window; and (c) for each of a plurality of images corresponding to the plurality of image signals, controlling image display in a region inside the left or right edge according to the floating window information and displaying the plurality of images in a stereoscopic manner.
- The method of claim 9, wherein in step (c), the image is suppressed in the region according to the floating window information, and the locally suppressed images are displayed in a stereoscopic manner.
- The method of claim 10, wherein step (b) comprises: extracting a picture header from the encoded video signal; extracting, from the picture header, a user data structure in which a first field has a predetermined value; and extracting, as the floating window information, a data structure in which a second field of the user data structure has a value indicating a floating window.
- The method of claim 10, wherein step (c) comprises: setting a floating window according to the position and size data in a region inside at least one of the left and right edges of each of the plurality of images, and invalidating the image portion corresponding to the floating window.
- The method of claim 12, wherein the image portion corresponding to the floating window is invalidated by filling the image portion with a specific color.
- The method of claim 13, wherein the image portion corresponding to the floating window is filled with the color specified in the floating window information.
- The method of claim 13, wherein step (c) comprises: generating an on-screen display image for the image portion corresponding to the floating window; and invalidating the image portion by superimposing the on-screen display image on the image portion.
- The method of claim 10, wherein step (c) comprises: setting a floating window according to the position and size data in a region inside at least one of the left and right edges of each of the plurality of images, and suppressing the image by alpha-blending the color specified in the floating window information into the image portion corresponding to the floating window according to the transparency data.
- The method of claim 10, further comprising checking a user manipulation command, wherein step (c) is performed only when the user manipulation command does not prohibit the suppression.
- The method of claim 10, wherein step (c) further comprises adjusting the size of the floating window in response to the user manipulation command.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020117012716A KR101667723B1 (ko) | 2008-12-02 | 2009-08-19 | 3D image signal transmission method, 3D image display apparatus and signal processing method therein |
EP09830512.1A EP2357823A4 (en) | 2008-12-02 | 2009-08-19 | 3d image signal transmission method, 3d image display apparatus and signal processing method therein |
US13/132,239 US9288470B2 (en) | 2008-12-02 | 2009-08-19 | 3D image signal transmission method, 3D image display apparatus and signal processing method therein |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11935908P | 2008-12-02 | 2008-12-02 | |
US61/119,359 | 2008-12-02 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010064774A1 true WO2010064774A1 (ko) | 2010-06-10 |
Family
ID=42233408
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/KR2009/004619 WO2010064774A1 (ko) | 3D image signal transmission method, 3D image display apparatus and signal processing method therein | 2008-12-02 | 2009-08-19 |
Country Status (4)
Country | Link |
---|---|
US (1) | US9288470B2 (ko) |
EP (1) | EP2357823A4 (ko) |
KR (1) | KR101667723B1 (ko) |
WO (1) | WO2010064774A1 (ko) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102572457A (zh) * | 2010-12-31 | 2012-07-11 | 财团法人工业技术研究院 | 前景深度地图产生模块及其方法 |
EP2688303A1 (en) * | 2011-05-30 | 2014-01-22 | Sony Corporation | Recording device, recording method, playback device, playback method, program, and recording/playback device |
TWI469088B (zh) * | 2010-12-31 | 2015-01-11 | Ind Tech Res Inst | 前景深度地圖產生模組及其方法 |
US9251564B2 (en) | 2011-02-15 | 2016-02-02 | Thomson Licensing | Method for processing a stereoscopic image comprising a black band and corresponding device |
US9392249B2 (en) | 2011-01-25 | 2016-07-12 | Lg Electronics Inc. | Method and apparatus for transmitting/receiving a digital broadcasting signal |
CN105933779A (zh) * | 2016-06-27 | 2016-09-07 | 北京奇虎科技有限公司 | 利用寄生工具包实现的视频播放方法及装置 |
CN108508616A (zh) * | 2018-05-17 | 2018-09-07 | 成都工业学院 | 一种3d显示系统及3d显示装置 |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100045779A1 (en) * | 2008-08-20 | 2010-02-25 | Samsung Electronics Co., Ltd. | Three-dimensional video apparatus and method of providing on screen display applied thereto |
CN102293001B (zh) * | 2009-01-21 | 2014-05-14 | 株式会社尼康 | 图像处理装置、图像处理方法及记录方法 |
JP5267886B2 (ja) * | 2009-04-08 | 2013-08-21 | ソニー株式会社 | 再生装置、記録媒体、および情報処理方法 |
KR20100128233A (ko) * | 2009-05-27 | 2010-12-07 | 삼성전자주식회사 | 영상 처리 방법 및 장치 |
KR101719980B1 (ko) * | 2010-06-22 | 2017-03-27 | 엘지전자 주식회사 | 3차원 컨텐츠를 출력하는 디스플레이 기기의 영상 처리 방법 및 그 방법을 채용한 디스플레이 기기 |
WO2011123509A1 (en) * | 2010-03-31 | 2011-10-06 | Design & Test Technology, Inc. | 3d video processing unit |
EP2553930B1 (en) * | 2010-04-01 | 2024-01-17 | InterDigital Madison Patent Holdings, SAS | Method and system of using floating window in three-dimensional (3d) presentation |
JP2011228862A (ja) * | 2010-04-16 | 2011-11-10 | Sony Corp | データ構造、画像処理装置、画像処理方法、およびプログラム |
KR20110124161A (ko) * | 2010-05-10 | 2011-11-16 | 삼성전자주식회사 | 계층 부호화 영상의 송수신 방법 및 장치 |
KR101435594B1 (ko) * | 2010-05-31 | 2014-08-29 | 삼성전자주식회사 | 디스플레이 장치 및 그 디스플레이 방법 |
KR20120088100A (ko) * | 2011-01-31 | 2012-08-08 | 삼성전자주식회사 | 디스플레이 컨트롤러 및 디스플레이 시스템 |
US8665304B2 (en) * | 2011-03-21 | 2014-03-04 | Sony Corporation | Establishing 3D video conference presentation on 2D display |
GB2489931A (en) * | 2011-04-08 | 2012-10-17 | Sony Corp | Analysis of 3D video to detect frame violation within cropped images |
US9883172B2 (en) * | 2011-10-03 | 2018-01-30 | Echostar Technologies L.L.C. | Active 3D to passive 3D conversion |
AU2012318854B2 (en) * | 2011-10-05 | 2016-01-28 | Bitanimate, Inc. | Resolution enhanced 3D video rendering systems and methods |
US9791922B2 (en) * | 2011-10-13 | 2017-10-17 | Panasonic Intellectual Property Corporation Of America | User interface control device, user interface control method, computer program and integrated circuit |
CN104054333A (zh) * | 2011-12-19 | 2014-09-17 | 富士胶片株式会社 | 图像处理装置、方法以及程序及其记录介质 |
WO2013122385A1 (en) | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method |
WO2013122386A1 (en) | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transreceiving system, data transmitting method, data receiving method and data transreceiving method |
WO2013122387A1 (en) | 2012-02-15 | 2013-08-22 | Samsung Electronics Co., Ltd. | Data transmitting apparatus, data receiving apparatus, data transceiving system, data transmitting method, and data receiving method |
US20140055564A1 (en) * | 2012-08-23 | 2014-02-27 | Eunhyung Cho | Apparatus and method for processing digital signal |
JP2014027448A (ja) * | 2012-07-26 | 2014-02-06 | Sony Corp | 情報処理装置、情報処理方法、及びプログラム |
WO2014065635A1 (ko) * | 2012-10-25 | 2014-05-01 | 엘지전자 주식회사 | 다시점 3dtv 서비스에서 에지 방해 현상을 처리하는 방법 및 장치 |
WO2014089798A1 (en) | 2012-12-13 | 2014-06-19 | Thomson Licensing | Method and apparatus for error control in 3d video transmission |
US9986259B2 (en) | 2013-07-18 | 2018-05-29 | Lg Electronics Inc. | Method and apparatus for processing video signal |
US9591290B2 (en) * | 2014-06-10 | 2017-03-07 | Bitanimate, Inc. | Stereoscopic video generation |
CN106792089A (zh) * | 2016-12-15 | 2017-05-31 | Tencent Technology (Shenzhen) Co., Ltd. | Video playback method and device |
RU2686576C1 (ru) | 2017-11-30 | 2019-04-29 | Samsung Electronics Co., Ltd. | Compact holographic display device |
US10523912B2 (en) | 2018-02-01 | 2019-12-31 | Microsoft Technology Licensing, Llc | Displaying modified stereo visual content |
CN108600334B (zh) * | 2018-04-03 | 2022-03-04 | Tencent Technology (Shenzhen) Co., Ltd. | Method and device for displaying wireless signal access quality of an electronic device |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060133493A1 (en) * | 2002-12-27 | 2006-06-22 | Suk-Hee Cho | Method and apparatus for encoding and decoding stereoscopic video |
KR20080039797A (ko) * | 2006-11-01 | 2008-05-07 | Electronics and Telecommunications Research Institute | Method and apparatus for decoding metadata used for reproducing stereoscopic content |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6359658B1 (en) * | 2000-03-06 | 2002-03-19 | Philips Electronics North America Corporation | Subjective noise measurement on active video signal |
WO2002071764A1 (en) * | 2001-01-24 | 2002-09-12 | Vrex, Inc. | Method and system for adjusting stereoscopic image to optimize viewing for image zooming |
US20080177994A1 (en) * | 2003-01-12 | 2008-07-24 | Yaron Mayer | System and method for improving the efficiency, comfort, and/or reliability in Operating Systems, such as for example Windows |
TWI384413B (zh) * | 2006-04-24 | 2013-02-01 | Sony Corp | An image processing apparatus, an image processing method, an image processing program, and a program storage medium |
JP4737003B2 (ja) * | 2006-08-10 | 2011-07-27 | Sony Corporation | Editing apparatus, editing method, editing program, and editing system |
EP2067360A1 (en) * | 2006-09-25 | 2009-06-10 | Nokia Corporation | Supporting a 3d presentation |
EP2153669B1 (en) * | 2007-05-11 | 2012-02-01 | Koninklijke Philips Electronics N.V. | Method, apparatus and system for processing depth-related information |
- 2009-08-19 EP EP09830512.1A patent/EP2357823A4/en not_active Withdrawn
- 2009-08-19 KR KR1020117012716A patent/KR101667723B1/ko active IP Right Grant
- 2009-08-19 WO PCT/KR2009/004619 patent/WO2010064774A1/ko active Application Filing
- 2009-08-19 US US13/132,239 patent/US9288470B2/en not_active Expired - Fee Related
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102572457A (zh) * | 2010-12-31 | 2012-07-11 | Industrial Technology Research Institute | Foreground depth map generation module and method thereof |
TWI469088B (zh) * | 2010-12-31 | 2015-01-11 | Ind Tech Res Inst | Foreground depth map generation module and method thereof |
US9392249B2 (en) | 2011-01-25 | 2016-07-12 | Lg Electronics Inc. | Method and apparatus for transmitting/receiving a digital broadcasting signal |
DE112012000563B4 (de) * | 2011-01-25 | 2021-06-24 | Lg Electronics Inc. | Method and apparatus for transmitting/receiving a digital broadcasting signal |
GB2501035B (en) * | 2011-01-25 | 2017-02-22 | Lg Electronics Inc | Method and apparatus for transmitting/receiving a digital broadcasting signal |
US9251564B2 (en) | 2011-02-15 | 2016-02-02 | Thomson Licensing | Method for processing a stereoscopic image comprising a black band and corresponding device |
EP2688303A4 (en) * | 2011-05-30 | 2014-11-05 | Sony Corp | Recording device, recording method, playback device, playback method, program, and recording/playback device |
CN103548345A (zh) * | 2011-05-30 | 2014-01-29 | Sony Corporation | Recording device and method, reproduction device and method, program, and recording/reproduction device |
EP2688303A1 (en) * | 2011-05-30 | 2014-01-22 | Sony Corporation | Recording device, recording method, playback device, playback method, program, and recording/playback device |
CN105933779A (zh) * | 2016-06-27 | 2016-09-07 | Beijing Qihoo Technology Co., Ltd. | Video playback method and device implemented using a parasitic toolkit |
WO2018001218A1 (zh) * | 2016-06-27 | 2018-01-04 | Beijing Qihoo Technology Co., Ltd. | Video playback method, device, program, and medium |
CN108508616A (zh) * | 2018-05-17 | 2018-09-07 | Chengdu Technological University | 3D display system and 3D display device |
CN108508616B (zh) * | 2018-05-17 | 2024-04-16 | Chengdu Technological University | 3D display system and 3D display device |
Also Published As
Publication number | Publication date |
---|---|
KR101667723B1 (ko) | 2016-10-19 |
KR20110106289A (ko) | 2011-09-28 |
EP2357823A1 (en) | 2011-08-17 |
US9288470B2 (en) | 2016-03-15 |
EP2357823A4 (en) | 2017-03-01 |
US20110234760A1 (en) | 2011-09-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2010064774A1 (ko) | 3D image signal transmission method, 3D image display apparatus, and signal processing method therein | |
US9491434B2 (en) | Method for processing three dimensional (3D) video signal and digital broadcast receiver for performing the method | |
WO2010071291A1 (ko) | 3D image signal processing method and image display apparatus for implementing the same | |
KR101659026B1 (ko) | 3D caption display method and 3D display apparatus for implementing the same | |
CN104618708B (zh) | Broadcast receiver and video data processing method thereof | |
WO2010085074A2 (en) | Three-dimensional subtitle display method and three-dimensional display device for implementing the same | |
US5661518A (en) | Methods and apparatus for the creation and transmission of 3-dimensional images | |
KR101622688B1 (ko) | 3D caption display method and 3D display apparatus for implementing the same | |
WO2010079880A1 (ko) | 3D caption signal transmission method and 3D caption display method | |
WO2011084021A2 (ko) | Broadcast receiver and 3D image display method | |
WO2010117129A2 (en) | Broadcast transmitter, broadcast receiver and 3d video data processing method thereof | |
WO2011001857A1 (ja) | Stereoscopic image data transmission device, stereoscopic image data transmission method, stereoscopic image data reception device, and stereoscopic image data reception method | |
WO2010093115A2 (en) | Broadcast receiver and 3d subtitle data processing method thereof | |
WO2011005025A2 (en) | Signal processing method and apparatus therefor using screen size of display device | |
JP2013502804A (ja) | Signal processing method and apparatus for three-dimensional reproduction of additional data | |
WO2011046271A1 (en) | Broadcast receiver and 3d video data processing method thereof | |
WO2011118215A1 (ja) | Video processing device | |
WO2014065635A1 (ko) | Method and apparatus for processing edge violation phenomenon in multi-view 3DTV service | |
WO2013022315A2 (ko) | Image providing apparatus and method, and image reproducing apparatus and method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09830512 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13132239 Country of ref document: US |
|
ENP | Entry into the national phase |
Ref document number: 20117012716 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2009830512 Country of ref document: EP |