KR20130076674A - Signal processing device for processing a plurality of 3d contents, display device for displaying them and methods thereof - Google Patents
- Publication number
- KR20130076674A (applications KR1020120054864A, KR20120054864A)
- Authority
- KR
- South Korea
- Prior art keywords
- content
- frame
- contents
- frame rate
- image
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/139—Format conversion, e.g. of frame-rate or size
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/161—Encoding, multiplexing or demultiplexing different image signal components
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/194—Transmission of image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
Abstract
A display device is disclosed. The display apparatus includes a plurality of receivers that receive a plurality of contents, a storage unit, a plurality of scalers that reduce the data size of the received contents, store them in the storage unit, and read each content from the storage unit according to an output timing, a plurality of frame rate converters that convert the frame rates of the read contents, and an image output unit that combines and displays the contents output from the plurality of frame rate converters. Accordingly, the required resources can be minimized.
Description
The present invention relates to a signal processing apparatus, a display apparatus and methods thereof, and more particularly, to a signal processing apparatus for processing a plurality of contents, a display apparatus for displaying the same and methods thereof.
Various types of electronic products are being developed and distributed with the development of electronic technology. In particular, various display devices such as TVs, mobile phones, PCs, notebook PCs, and PDAs are used in most households.
As the use of display devices has increased, user needs for more diverse functions have also increased. As a result, each manufacturer's efforts to meet those needs have increased, and products with new functions that were not previously available are emerging.
Accordingly, various contents processed by the display apparatus are also provided. In particular, recently, contents having large data sizes such as high resolution content or 3D content have been provided.
In addition, in recent years, development efforts have been made toward display devices that simultaneously provide a plurality of contents so that a plurality of users can view different contents. Such a display device requires more resources, such as memory and bus bandwidth, than one that processes and displays a single content, so image processing may not be performed smoothly.
In particular, combining and displaying a plurality of contents having large data sizes, such as 3D contents, is difficult to implement because it consumes far more resources.
Accordingly, there is a need for a technology capable of effectively displaying a multi-view by processing a plurality of contents.
SUMMARY OF THE INVENTION The present invention has been made to solve the above-described problem, and an object of the present invention is to provide a signal processing apparatus capable of processing a plurality of contents, a display apparatus for displaying the same, and methods thereof.
According to an exemplary embodiment of the present invention, a display apparatus includes a plurality of receivers that receive a plurality of contents, a storage unit, a plurality of scalers that reduce the data size of the plurality of contents, store them in the storage unit, and read each content stored in the storage unit according to an output timing, a plurality of frame rate converters that convert the frame rates of the read contents, and an image output unit that combines and displays the contents output from the plurality of frame rate converters.
The plurality of contents may be 3D contents each including a left eye image and a right eye image, and each of the plurality of scalers may downscale the plurality of 3D contents, reduce their frame rates, and store them in the storage unit.
Alternatively, each of the plurality of scalers may downscale the plurality of 3D contents and store them in the storage unit, and then reduce the frame rate of each content read from the storage unit before providing it to the plurality of frame rate converters.
Alternatively, when the 3D content is 3:2 pull-down film image content, at least one of the plurality of scalers may downscale the film image content, extract only the key frames, and store them in the storage unit. When the key frames are read from the storage unit, the frame rate converters may convert the frame rate of each 3D content into a multi-content display rate by interpolating frames based on the read key frames.
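As a rough illustration of the key-frame extraction step described above, the following Python sketch collapses a 3:2 pulled-down sequence back to its film key frames by keeping one frame per run of repeats. The frame representation and function name are illustrative, not from the patent:

```python
def extract_key_frames(telecined):
    """Collapse a 3:2 pulled-down sequence back to its film key frames.

    In 3:2 pull-down, each pair of 24 fps film frames is expanded into
    3 + 2 repeated frames at 60 Hz, so identical frames appear in runs
    with a 3, 2, 3, 2, ... cadence. Keeping one frame per run recovers
    the key frames that need to be stored.
    """
    key_frames = []
    prev = None
    for frame in telecined:
        if frame != prev:          # start of a new run -> a key frame
            key_frames.append(frame)
            prev = frame
    return key_frames

# A 60 Hz sequence built from film frames F0..F3 with a 3:2 cadence:
telecined = ["F0"] * 3 + ["F1"] * 2 + ["F2"] * 3 + ["F3"] * 2
assert extract_key_frames(telecined) == ["F0", "F1", "F2", "F3"]
```

Storing only the key frames cuts the stored data to roughly 40% of the telecined stream; the frame rate converter later re-interpolates from these key frames, as the passage above describes.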
The image output unit may multiplex each content provided by the plurality of frame rate converters in order according to a predetermined arrangement order, and upscale and display the multiplexed data according to a screen size.
Meanwhile, according to an embodiment of the present invention, a multi-content display method of a display device includes receiving a plurality of contents each including a left eye image and a right eye image, reducing the data sizes of the plurality of contents and storing them, converting the frame rates of the plurality of stored contents, and combining and displaying the contents having the converted frame rates.
The plurality of contents may be 3D contents including a left eye image and a right eye image, respectively.
In this case, the reducing and storing of the data sizes of the plurality of contents may include downscaling the plurality of 3D contents, reducing the frame rate of each downscaled 3D content, and storing each 3D content having the reduced frame rate, and the converting of the frame rates may convert the frame rate of each 3D content into a multi-content display rate.
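The downscaling and frame-rate reduction performed before storage can be sketched as follows, modelling a frame only by its resolution. The function name and parameter values are illustrative assumptions, not the patent's:

```python
def reduce_for_storage(frames, scale=0.5, rate_divisor=2):
    """Sketch of the data-size reduction done before frames reach the
    storage unit: down-scale each frame, then drop the frame rate.
    A frame is modelled as a (width, height) tuple for illustration.
    """
    downscaled = [(int(w * scale), int(h * scale)) for (w, h) in frames]
    rate_reduced = downscaled[::rate_divisor]   # keep every Nth frame
    return rate_reduced

# Four 1920x1080 frames become two 960x540 frames: with half the
# resolution in each dimension and half the rate, stored data shrinks
# to 1/8 of the original, easing memory and bus load.
frames = [(1920, 1080)] * 4
assert reduce_for_storage(frames) == [(960, 540), (960, 540)]
```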
Alternatively, the reducing and storing of the data sizes of the plurality of contents may include, when the 3D content is 3:2 pull-down film image content, downscaling the film image content and extracting and storing only the key frames, and the converting of the frame rates may convert the frame rate of each 3D content by interpolating frames based on the stored key frames.
In addition, the displaying may include multiplexing the contents so as to be sequentially arranged according to a predetermined arrangement order, upscaling the multiplexed data to a screen size, and displaying the upscaled data.
Meanwhile, according to an embodiment of the present invention, a signal processing apparatus includes a plurality of scalers configured to reduce the data sizes of a plurality of 3D contents each including a left eye image and a right eye image, a storage unit for storing the plurality of 3D contents processed by the plurality of scalers, and a plurality of frame rate converters for converting the frame rates of the plurality of 3D contents stored in the storage unit into a multi-content display rate.
The plurality of scalers may downscale the plurality of 3D contents and store the downscaled 3D contents in the storage unit. When the downscaled 3D contents are read from the storage unit, the plurality of scalers may convert the read 3D contents into a format that can be processed by the plurality of frame rate converters.
The apparatus may further include an image processor configured to construct multi-content frame data using the plurality of 3D contents whose frame rates have been converted by the plurality of frame rate converters, and an interface unit configured to transmit the multi-content frame data to a display device.
According to an embodiment of the present disclosure, a signal processing method includes downscaling a plurality of 3D contents each including a left eye image and a right eye image, converting the frame rates of the 3D contents using a plurality of frame rate converters, constructing multi-content frame data using the plurality of 3D contents having the converted frame rates, and transmitting the multi-content frame data to a display device.
The method may further include converting the plurality of down-scaled 3D contents into a format that may be processed by the plurality of frame rate converters.
Meanwhile, according to another embodiment of the present disclosure, a multi-view display method may include receiving a plurality of contents having different frame rates, matching the frame rates of the plurality of contents, and displaying a multi-view frame using the contents having the matched frame rates.
The matching of the frame rates may include storing the plurality of contents, generating image frames by processing the plurality of contents, and interpolating the image frames of the content having the relatively smaller frame rate.
The interpolating may include comparing the reception timings of the plurality of contents to confirm, at the point when one image frame of one content is stored, the storage ratio of the corresponding frame of the other content, and generating an interpolation frame by combining the corresponding frame and its next frame according to the confirmed storage ratio.
The generating of the interpolation frame may include comparing the corresponding frame with its next frame to estimate the motion of an object displayed in the frames, and applying the confirmed ratio to the estimated motion to generate the interpolation frame.
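A minimal sketch of this ratio-based interpolation, with motion estimation reduced to linearly tracked object positions (a real frame rate converter would use block-based motion estimation over pixel data; all names and the frame representation here are hypothetical):

```python
def interpolate_frame(frame_a, frame_b, ratio):
    """Generate an interpolation frame between the corresponding frame
    (frame_a) and its next frame (frame_b) of the slower content.

    `ratio` is the storage ratio confirmed at the sync point
    (0.0 = exactly frame_a, 1.0 = exactly frame_b). Frames are
    modelled as {object_name: (x, y)} position dicts; the estimated
    motion of each object is applied in proportion to the ratio.
    """
    interpolated = {}
    for obj, (x0, y0) in frame_a.items():
        x1, y1 = frame_b[obj]          # estimated motion endpoint
        interpolated[obj] = (x0 + (x1 - x0) * ratio,
                             y0 + (y1 - y0) * ratio)
    return interpolated

ball_a = {"ball": (0.0, 10.0)}
ball_b = {"ball": (8.0, 10.0)}
# Halfway between the corresponding frame and its next frame:
assert interpolate_frame(ball_a, ball_b, 0.5) == {"ball": (4.0, 10.0)}
```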
The matching of the frame rate may include detecting a key frame of each of the plurality of contents, and integrating the detected key frames.
In the integrating of the key frames, if the numbers of key frames of the respective contents differ, the numbers of key frames may be matched by repeating or skipping frames, and the corresponding key frames of the respective contents may then be integrated.
The matching of the frame rates may further include removing motion judder by performing interpolation on the integrated key frames.
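The repeat-or-skip matching of key frame counts can be sketched as follows. The even index spacing is an assumed policy, since the patent does not specify which frames are repeated or skipped:

```python
def match_key_frame_counts(frames, target_count):
    """Repeat or skip key frames so each content ends up with the same
    number of key frames before corresponding frames are integrated.

    Maps each target slot onto a source index with even spacing:
    frames are repeated when growing the list and skipped when
    shrinking it. A hypothetical helper, not the patent's algorithm.
    """
    n = len(frames)
    return [frames[i * n // target_count] for i in range(target_count)]

a = ["A0", "A1", "A2", "A3"]               # 4 key frames
b = ["B0", "B1", "B2", "B3", "B4", "B5"]   # 6 key frames
# Grow content A to 6 key frames by repeating; B is already matched.
assert match_key_frame_counts(a, 6) == ["A0", "A0", "A1", "A2", "A2", "A3"]
assert match_key_frame_counts(b, 6) == b
```

After the counts match, frame i of each content can be integrated into one multi-view key frame, and interpolation over the integrated key frames removes the resulting motion judder.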
According to another embodiment of the present invention, a display apparatus includes a plurality of receivers for receiving a plurality of 3D contents, a plurality of systems on chip (SoCs) each including a display processor for processing 3D content, and an output unit for combining the image frames of the respective 3D contents processed by the plurality of SoCs and outputting a plurality of content views.
Here, one SoC of the plurality of SoCs may include a mux for muxing data processed by a display processor mounted in the SoC and data output from another SoC.
Alternatively, the apparatus may further include a SoC equipped with a mux for muxing data output from the plurality of SoCs and a frame rate converter for converting a frame rate of the muxed data from the mux.
According to various embodiments of the present disclosure as described above, a plurality of users may view different contents on one display device, respectively.
1 is a block diagram showing a configuration of a display device according to an embodiment of the present invention,
2 is a view for explaining a method of providing different 3D content to a plurality of users;
3 and 4 are views for explaining various examples of a method of reducing and processing data sizes of a plurality of 3D contents;
5 is a view for explaining a process of converting a frame rate for one 3D content;
6 is a diagram illustrating an example of a method of composing multi-content frame data by combining a plurality of 3D contents;
7 and 8 are block diagrams showing the configuration of a signal processing apparatus according to an embodiment of the present invention;
9 is a flowchart illustrating a 3D multi content display method according to an embodiment of the present invention;
10 is a flowchart illustrating a 3D multi content display method according to another embodiment of the present invention;
11 is an exemplary diagram illustrating a system for providing a plurality of contents to a plurality of users according to another embodiment of the present invention;
12 is a block diagram of a display device used in the system of FIG. 11;
13 is a block diagram of a signal processor used in the display device of FIG. 12;
14 is an exemplary diagram illustrating the relative arrangement positions of image frames of a first content and a second content based on an output sync;
15 is a flowchart illustrating a multi-view display method according to another embodiment of the present invention;
16 and 17 are schematic views showing the configuration of a content providing system according to another embodiment of the present invention;
FIG. 18 is a block diagram illustrating a configuration of a display apparatus used in the system illustrated in FIGS. 16 and 17.
19 to 22 are diagrams for explaining a method of incorporating key frames of respective contents having different frame rates;
FIG. 23 is a block diagram illustrating a detailed configuration of the display device of FIG. 18.
24 is a block diagram for explaining a configuration of a spectacle device used in the system shown in FIGS. 16 and 17;
25 is a flowchart for explaining a content providing method of a display apparatus according to another exemplary embodiment;
FIG. 26 is a block diagram illustrating a configuration of a display apparatus according to another exemplary embodiment.
27 is a diagram illustrating a 3D multi view mode for displaying a plurality of 3D contents;
28 is a diagram illustrating a 2D multi view mode of displaying a plurality of 2D contents;
29 is a block diagram illustrating an example of a configuration of one SoC;
30 is a block diagram illustrating an example of a detailed configuration of a display apparatus;
31 is a block diagram illustrating a configuration of a display apparatus according to another embodiment of the present invention.
32 is a flowchart for explaining a display method according to another exemplary embodiment.
Hereinafter, the present invention will be described in detail with reference to the accompanying drawings.
1 is a block diagram showing a configuration of a display device according to an embodiment of the present invention. Referring to FIG. 1, the
The
In an embodiment of receiving content from a broadcasting station, the
In addition, the
The
The operation of reducing the data size may be performed in various ways according to the embodiment. For example, the
Alternatively, the
Alternatively, the
In particular, when the content is 3:2 pull-down film image content, the
In addition, the
The data format conversion operation may be performed before storing the down-scaled content in the
As described above, when the
The
The
For example, in the case of a shutter glass display device, the
As another example, in the case of a glassesless display device, the
Although FIG. 1 illustrates a configuration for receiving and processing two pieces of content, an embodiment of receiving and processing three or more pieces of content may be implemented. In this case, three or more receivers, scalers, and frame rate converters may be provided.
As described above, the mode for configuring and displaying the multi-content frame data may be referred to as a multi view mode (or a dual view mode). When the
Meanwhile, the above-described content may be 2D content or 3D content. 3D content refers to content that allows a user to perceive a three-dimensional effect by using multi-view images in which the same object is represented from different viewpoints.
In order to compose a multi-content frame using a plurality of 3D content, the
Accordingly, the left eye image and right eye image of the first content and the left eye image and right eye image of the second content are sequentially arranged and displayed according to a preset arrangement order, and each pair of images is recognized as one content through the corresponding glasses device.
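The arrangement order just described can be sketched as a simple multiplexer; the content and frame names are illustrative:

```python
def mux_multi_content(content1, content2):
    """Arrange frames in the preset order described above: left and
    right eye images of the first content, then left and right eye
    images of the second content, repeated for each frame period.
    Each content is a list of (left, right) frame pairs.
    """
    muxed = []
    for (l1, r1), (l2, r2) in zip(content1, content2):
        muxed += [l1, r1, l2, r2]
    return muxed

c1 = [("1L0", "1R0"), ("1L1", "1R1")]
c2 = [("2L0", "2R0"), ("2L1", "2R1")]
assert mux_multi_content(c1, c2) == [
    "1L0", "1R0", "2L0", "2R0", "1L1", "1R1", "2L1", "2R1"]
```

With the frames in this order, the shutter glasses for each viewer open only on the two slots belonging to that viewer's content.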
Meanwhile, although not shown in FIG. 1, the
FIG. 2 is a diagram illustrating an operation of a shutter glass display apparatus that displays a multi-content frame using a plurality of 3D contents.
According to FIG. 2, the
The synchronization signal may be generated and transmitted in various forms. For example, the
Alternatively, the
In FIG. 2, the
When the synchronization signal is received, each of the
Each
Accordingly, a user wearing the
3 is a diagram illustrating an example of a process of reducing and storing a data size. Referring to FIG. 3, when the
4 is a diagram illustrating another example of a process of reducing and storing a data size. According to FIG. 4, when the
5 is a diagram illustrating a frame rate conversion process for one 3D content. According to FIG. 5, the main content includes left eye images ML0, ML1, ML2, ..., and right eye images MR0, MR1, MR2, ..., in a vertical synchronization signal period (b). If
When the timing of processing and outputting the 3D content in this state arrives, the
FIG. 6 illustrates a process of configuring a multi-content frame using main content and sub content processed by the
Meanwhile, the
Meanwhile, the above-described embodiments may be applied to a signal processing device in addition to the display device. The signal processing device refers to a device that receives and processes content and provides the same to a display device, such as a set-top box, a recording medium reproducing device, an image processing chip, and the like.
7 is a diagram illustrating a configuration of a
The scalers 310-1 and 310-2 receive a plurality of contents and reduce the data size. The content may be various contents such as 2D and 3D. Hereinafter, a description will be given based on a case where 3D content is received.
As described in the above-described embodiments, the scalers 310-1 and 310-2 may reduce the data size by performing various processes such as down scaling, frame rate down, and data format conversion. This data size reduction operation may be performed before the content is stored in the
The
Each 3D content whose frame rate is converted is provided to a display device connected to the
8 is a diagram illustrating a configuration of a
A plurality of scalers 310-1, 310-2, ..., 310-n,
The
The
The signal processing apparatuses of FIGS. 7 and 8 may be connected to a display device to support a multi-view function.
9 is a flowchart illustrating a multi-content display method of a display apparatus according to an exemplary embodiment. According to FIG. 9, when a plurality of contents is received (S910), the data size of each content is reduced (S920) and stored (S930).
When the timing of outputting each content arrives, the stored contents are read out, the frame rate is converted (S940), and the combination is displayed to display the multi-content frame (S950). Since the method for reducing the data size has been described in detail in the above-described exemplary embodiments, redundant description thereof will be omitted.
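The read-and-convert flow above (S940) can be sketched for the frame rate conversion step as follows; frame repetition stands in for full motion-compensated interpolation, and the integer-ratio restriction and all names are simplifying assumptions:

```python
def convert_frame_rate(frames, in_hz, out_hz):
    """Convert a content's frame rate to the multi-content display
    rate by repeating frames. Interpolation (as in the key-frame
    embodiment) would be used instead for smoother motion; this
    sketch only handles integer rate ratios.
    """
    assert out_hz % in_hz == 0, "sketch handles integer ratios only"
    repeat = out_hz // in_hz
    return [f for f in frames for _ in range(repeat)]

main = ["ML0/MR0", "ML1/MR1"]          # 30 Hz left/right frame pairs
# Converted to a 60 Hz multi-content display rate:
assert convert_frame_rate(main, 30, 60) == [
    "ML0/MR0", "ML0/MR0", "ML1/MR1", "ML1/MR1"]
```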
10 is a flowchart illustrating a multi-content display method of a display apparatus according to another exemplary embodiment. According to FIG. 10, when a plurality of contents are received (S1010), downscaling is performed (S1020) and the downscaled contents are stored (S1030). Thereafter, when the stored content needs to be read (S1040), the data is read and at least one of data format conversion and frame rate reduction is performed (S1050). Then, after the frame rate is converted to the target frame rate level (S1060), the contents are combined to display a multi-content frame.
Although not shown in FIG. 9 and FIG. 10, it is obvious that an audio data processing step or a synchronization signal transmission step may be further included in the multi-content display method. In addition, the content processed in FIGS. 9 and 10 may be 2D content or 3D content.
In addition, a signal processing method according to an embodiment of the present invention may include downscaling a plurality of 3D contents each including a left eye image and a right eye image, converting the frame rates of the 3D contents using a plurality of frame rate converters, constructing a multi-content frame using the plurality of 3D contents having the converted frame rates, and transmitting the multi-content frame to a display device.
The method may further include converting the plurality of down-scaled 3D contents into a format that may be processed by the plurality of frame rate converters.
Since each step of the signal processing method is the same as described in the above-described various embodiments, illustration and overlapping description are omitted.
As described above, according to various embodiments of the present disclosure, resources required for signal processing and display may be reduced. Accordingly, a technology of simultaneously providing a plurality of contents, in particular, a plurality of 3D contents to a plurality of users in one display apparatus can be effectively implemented.
As described above, the display apparatus may receive a plurality of contents to provide a multi-view. The contents may be of various kinds provided from various sources, and therefore their frame rates may differ from each other. In this case, the multi-view display method may include receiving a plurality of contents having different frame rates, matching the frame rates of the plurality of contents, and displaying a multi-view frame using the contents having the matched frame rates. The frame rates can be matched in various ways; that is, frames may be interpolated, repeated, or skipped. Hereinafter, various embodiments of configurations and methods for providing a multi-view from contents having different frame rates will be described.
[Processing for Different Frame Rates]
First, a first embodiment for the case where the frame rates differ from each other is a method of interpolating the content having the relatively smaller frame rate.
According to the present embodiment, even when the frame rates are different from each other, the multi-view can be provided by effectively processing the content.
11 is an exemplary diagram illustrating a system for providing a plurality of contents to a plurality of users according to the present embodiment.
As shown in FIG. 11, the system includes a
The
The spectacles 1210 and 1220 control opening timing of the left eye and right eye shutter glasses according to the synchronization signal received from the
According to an embodiment, the first glasses device 1210 opens the left eye and right eye shutter glasses at the time when the
Meanwhile, the
On the other hand, the second eyeglasses 1220 may open the left and right eye shutter glasses at the time when the
Up to now, a brief description has been made of a system including a
In the present embodiment, in order to perform multi-view display of a plurality of contents in the
Hereinafter, the configuration of the
12 is a block diagram of a display apparatus according to an exemplary embodiment.
As shown in FIG. 12, the display apparatus includes a
The
The second receiver 1112 may receive the second content from a source device such as a web server or a playback device such as a DVD device through at least one of SCART, AV, HDMI, COMPONENT, and USB interfaces. Meanwhile, the second receiver 1112 may receive second content transmitted from another external broadcast channel like the
The
The
13 is a block diagram of a signal processor according to an exemplary embodiment of the present invention.
As shown in FIG. 13, the first
As illustrated, the
When the first content is received from the
Meanwhile, the first audio processor 1122-1 detects audio data included in the received content and performs signal processing. In detail, when content is received from the
Meanwhile, the first additional data processor 1123-1 determines whether additional data such as an electronic program guide (EPG) or subtitles is included in the received content and, if additional data is included, separates the additional data from the received content. Thereafter, the first additional data processor 1123-1 may add the separated additional data to the corresponding image frame.
As such, data about the first content and the second content signal-processed by the
The
According to an embodiment, in the case of a shutter glass type display device, the
As such, when the multi-view frame in which the image frames for the first content and the second content are combined is displayed, each of the plurality of users may view video images of different contents through the glasses device worn by the plurality of users. Specifically, the spectacle device includes a left eye shutter glass and a right eye shutter glass. When the multi-view frame is output through the
In this way, by turning on and off the left eye and right eye shutter glasses collectively, a user wearing the spectacle device can view a video image of the content separate from other users. However, the present invention is not limited thereto, and the display device may display a multi-view frame for the first content and the second content in a polarized glass method or another method.
The controller 1150 may interpolate the image frame for the second content according to a difference in the frame rate of the image frame for the first content and the second content stored in the
The controller 1150 controls the
Accordingly, the second
14 is an exemplary diagram illustrating the relative arrangement positions of the image frames of the first content and the second content on the basis of an output sync according to an embodiment of the present invention.
As illustrated in FIG. 14, an image frame of the first content may be set to a frame rate of 30 Hz, and an image frame of the second content may be set to a frame rate of 24 Hz. The output sync for the image frames of the first content and the second content may be set to 60 Hz.
As such, when the output sync is set to 60 Hz, the relative arrangement position of each image frame of the first content advances in units of 0.5 per sync period. That is, relative to the 60 Hz output sync, the image frame of the first content corresponding to the
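The 0.5-unit positions described here follow directly from the ratio of the content frame rate to the output sync rate; a small sketch with illustrative names:

```python
def relative_positions(content_hz, sync_hz, n_ticks):
    """Relative arrangement position of a content's image frames at
    each output sync tick, in frame units.

    With a 60 Hz output sync, a 30 Hz content advances 0.5 frames per
    tick and a 24 Hz content 0.4 frames per tick; the fractional part
    is the ratio used when interpolating the slower content.
    """
    step = content_hz / sync_hz
    return [round(t * step, 2) for t in range(n_ticks)]

# 60 Hz output sync:
assert relative_positions(30, 60, 5) == [0.0, 0.5, 1.0, 1.5, 2.0]
assert relative_positions(24, 60, 5) == [0.0, 0.4, 0.8, 1.2, 1.6]
```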
Meanwhile, the
Meanwhile, like the
As such, when the relative arrangement positions between the image frames of the first content and the second content are obtained based on an output sync according to a preset condition, the controller 1150 acquires each image frame of the first content and the second content. The
According to the control command of the controller 1150, the second
However, the present invention is not limited thereto, and the controller 1150 may control the
According to the control command, the
For example, as described with reference to FIG. 14, if the image frame A-0 of the image frames A-0,1,2,3,4,5 of the first content is stored in the
Up to now, each configuration of the display device according to the present invention has been described in detail. Hereinafter, a method of performing a multi-view display on a plurality of contents in the display apparatus according to the present invention will be described in detail.
15 is a flowchart of a multi-view display method in a display device according to an embodiment of the present invention.
As shown in FIG. 15, the display apparatus receives first content and second content having a smaller frame rate than the first content (S1510). Thereafter, the display apparatus stores the received first content and second content in the storage unit and generates image frames for the first content and the second content by processing them through the first signal processor and the second signal processor (S1520 and S1530). Thereafter, the display apparatus compares the frame rate of the first content with the frame rate of the second content through the second signal processor and interpolates the image frames of the second content according to the comparison result (S1540).
Thereafter, the display apparatus alternately arranges and displays the image frames of the first content generated by the first signal processor and the image frames of the second content generated through interpolation (S1550). Accordingly, the display device according to the present invention can perform a multi-view display of a plurality of contents.
In detail, the display apparatus receives the first content and the second content through the first and second receivers. Here, the first content and the second content may be transmitted from an external broadcast channel or provided from a source device such as a web server or a playback device such as a DVD device. One of the first content and the second content may have a smaller frame rate than the other. In the present invention, the description assumes that the frame rate of the second content is smaller than the frame rate of the first content.
When the first content and the second content are received through the first and second receivers, the display apparatus stores the received first content and the second content in the storage unit. When the first content and the second content are stored in the storage unit, the display apparatus generates an image frame for the first content and the second content stored in the storage unit through the first signal processor and the second signal processor. An operation of generating an image frame for the first content and the second content by the first signal processor and the second signal processor has been described in detail with reference to FIG. 13, and thus, a detailed description thereof will be omitted.
When an image frame for the first content and the second content is generated through the first signal processor and the second signal processor, the display device stores the generated image frame for the first content and the second content in the storage unit. . Subsequently, the display apparatus compares the frame rate of the image frame of the first content and the frame rate of the image frame of the second content stored in the storage to interpolate the image frame of the second content to interpolate the image of the second content. Create a frame.
In detail, the display apparatus may interpolate an image frame of the second content by comparing a relative arrangement position between each image frame of the first content and the second content based on the output sink. Here, the output sink refers to a synchronized signal for an image frame in which image frames of the first content and the second content are alternately displayed. Such an output sink may be set to a frame rate of the first content having a frame rate larger than that of the second content or may be set according to information input from the outside.
Accordingly, the display device can determine the relative arrangement position of each image frame of the first content and the second content with respect to the output sync set according to such a condition. As described with reference to FIG. 14, the image frames of the first content may have a frame rate of 30 Hz, and the image frames of the second content may have a frame rate of 24 Hz. The output sync for the image frames of the first content and the second content may be set to 60 Hz.
As such, when the output sync is set to 60 Hz, the relative arrangement positions of the image frames of the first content may be determined in units of 0.5. That is, the position where the output sync is set to 60 Hz relative to the image frame of the first content corresponding to the
Meanwhile, the display apparatus may determine the relative arrangement position at the output sync by referring to the number of lines of the image frames of the first content and the second content, or to the image frame information of the first content and the second content. For example, an image frame of the second content may consist of 1125 input lines in total, and the 112th line of that image frame may currently be stored in the storage unit. When an output sync occurs at the time point at which the 112th of the input lines of the image frame of the second content is stored in the storage unit, the display apparatus divides the number of lines currently stored by the total number of input lines of the image frame, yielding a ratio of about 0.1. From this result, the relative arrangement position of the image frame of the second content with respect to the time point at which the output sync occurred can be known.
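The line-counting step above can be sketched as follows. This is only an illustrative sketch, not the patent's implementation; the function name and interface are assumptions.

```python
# Sketch (assumption): estimating the relative arrangement position of a
# content's image frame at the moment an output sync occurs, from the
# fraction of the frame's input lines already written to the storage unit.

def relative_position(lines_stored: int, total_lines: int) -> float:
    """Fraction of the current image frame stored when the sync fires."""
    if total_lines <= 0:
        raise ValueError("total_lines must be positive")
    return lines_stored / total_lines

# Example from the description: 112 of 1125 input lines are stored,
# so roughly 10% of the frame has arrived at the sync point.
ratio = relative_position(112, 1125)
print(round(ratio, 3))  # 0.1
```
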
In the same manner, the display device can determine the relative arrangement position of the image frames of the first content with respect to the time point at which the output sync occurs. When the relative arrangement positions of the image frames of the first content and the second content with respect to the output sync set according to the preset condition are obtained in this way, the display apparatus may interpolate the image frames of the second content by comparing the acquired relative arrangement positions.
The display device performs interpolation to generate an image frame of the second content at the point corresponding to the relative arrangement position of the first content, with reference to the preceding and following image frames. However, the present invention is not limited thereto, and the display apparatus may instead interpolate an image frame of the second content according to the reception times of the first content and the second content. In detail, when the first content and the second content are received, the display apparatus compares the reception time points of the received first content and second content, and checks the storage ratio of the corresponding frame of the second content at the time point when one image frame of the first content is stored in the storage unit. Thereafter, the display apparatus generates an interpolation frame by combining the corresponding frame of the second content and the frame following it according to the identified storage ratio.
That is, the display device estimates the motion of an object displayed in the frames by comparing the corresponding frame of the second content, at the time point when one image frame of the first content is stored in the storage unit, with the following frame, and may generate the interpolation frame by applying the storage ratio to the estimated motion.
For example, as described with reference to FIG. 14, if the image frame A-0 of the image frames A-0,1,2,3,4,5 of the first content is stored in the
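The storage-ratio interpolation described above can be sketched minimally as a weighted blend of two frames. This is not the patent's implementation (which estimates object motion); the simple pixel blend and all names here are illustrative assumptions.

```python
# Minimal sketch (assumption): generating an interpolated frame of the
# slower content by blending the corresponding frame and its next frame
# according to the storage ratio identified when a frame of the faster
# content is stored. Real motion estimation is omitted.

def interpolate_frame(frame_a, frame_b, storage_ratio: float):
    """Blend two frames pixel by pixel; storage_ratio in [0, 1]
    weights toward frame_b as more of the next frame has arrived."""
    if not 0.0 <= storage_ratio <= 1.0:
        raise ValueError("storage_ratio must be in [0, 1]")
    return [
        (1.0 - storage_ratio) * pa + storage_ratio * pb
        for pa, pb in zip(frame_a, frame_b)
    ]

# Toy 4-pixel frames; a ratio of 0.25 sits a quarter of the way to B.
a = [0.0, 0.0, 100.0, 100.0]
b = [100.0, 0.0, 100.0, 0.0]
print(interpolate_frame(a, b, 0.25))  # [25.0, 0.0, 100.0, 75.0]
```
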
As described above, an effective multi-view display using a plurality of contents may be implemented by interpolating a frame rate.
Unlike the above-described embodiment, the frame rates may instead be matched by repeating or skipping frames. That is, according to another embodiment for the case where the frame rates are different, the frame rates may be matched by integrating key frames while repeating or skipping frames. Hereinafter, embodiments of integrating frame rates and processing the result will be described.
FIGS. 16 and 17 are schematic views for explaining the configuration and operation of the content providing system according to the present embodiment. As shown in FIGS. 16 and 17, the present content providing system includes a
According to FIG. 16, the
The first spectacle apparatus 2200-1 may operate to open both the left shutter glass and the right shutter glass when one content A is displayed according to the synchronization signal, and to close both the left shutter glass and the right shutter glass when the other content B is displayed. Accordingly,
FIG. 17 is a diagram for describing a method of providing a plurality of 3D contents according to an embodiment of the present invention.
As shown in the drawing, when the plurality of contents (contents A and B) are 3D contents, the
For example, the left eye and right eye images AL and AR of the 3D content A and the left eye and right eye images BL and BR of the 3D content B may be alternately displayed. In this case, the first eyeglass device 2200-1 opens its left eye and right eye glasses at the display time points of the left eye and right eye images AL and AR of the 3D content A, and the second eyeglass device 2200-2 opens its left eye and right eye glasses at the display time points of the left eye and right eye images BL and BR of the 3D content B.
Accordingly,
However, the above is a description assuming the shutter glass method; in the case of the polarization method, it will be apparent to those skilled in the art that the polarization directions of the plurality of content images and the polarization directions of the first and second eyeglasses may be implemented so as to support the multi-view mode.
FIG. 18 is a block diagram illustrating a configuration of a display apparatus according to an exemplary embodiment. According to FIG. 18, the
The plurality of receivers 2110-1, 2110-2, ..., 2110-n respectively receive a plurality of contents. Specifically, each receiver 2110-1, 2110-2, ..., 2110-n receives content from a broadcasting station transmitting broadcast program content over a broadcast network, or from a web server transmitting content files over the Internet. In addition, content may be received from various recording medium reproducing apparatuses provided in the
In an embodiment of receiving content from a broadcasting station, the plurality of receivers 2110-1, 2110-2, ..., 2110-n may be implemented in a form including components such as a tuner (not shown), a demodulator (not shown), and an equalizer (not shown). On the other hand, in an embodiment of receiving content from a source such as a web server, the plurality of receivers 2110-1, 2110-2, ..., 2110-n may be implemented as network interface cards (not shown). Alternatively, in the case of receiving content from the various recording medium reproducing apparatuses described above, the plurality of receivers 2110-1, 2110-2, ..., 2110-n may be implemented as interfaces connected to the recording medium reproducing apparatus. As such, the plurality of receivers 2110-1, 2110-2, ..., 2110-n may be implemented in various forms according to embodiments.
In addition, the plurality of receivers 2110-1, 2110-2, ..., 2110-n do not necessarily have to receive content from sources of the same type; the plurality of receivers 2110-1, 2110-2, ..., 2110-n may receive content from sources of different types. For example, the
Meanwhile, the plurality of receivers 2110-1, 2110-2,..., And 2110-n may each receive a plurality of contents having different frame rates. Specifically, each receiver 2110-1, 2110-2,..., 2110-n may receive content generated at 24 frames per second or 30 frames per second.
In addition, the content received by the plurality of receivers 2110-1, 2110-2, ..., 2110-n may be 2D content or 3D content. 3D content refers to content that allows a user to perceive a three-dimensional effect by using multi-view images in which the same object is expressed from different viewpoints.
3D content can be in a variety of formats; in particular, it may be in a format according to one of the general top-bottom method, side-by-side method, horizontal interleave method, vertical interleave method, checker board method, and sequential frame method.
The plurality of detection units 2120-1, 2120-2,..., And 2120-n detect a key frame of each of the plurality of contents. The plurality of detection units 2120-1, 2120-2,..., And 2120-n may detect key frames constituting content input in various ways.
For example, when the image frames constituting content are input at a frame rate of 24 frames per second or 30 frames per second, each detector 2120-1, 2120-2, ..., 2120-n can detect each frame as a key frame.
On the other hand, when the image frames constituting the content are input at a frame rate of 60 frames per second, each detector 2120-1, 2120-2, ..., 2120-n can detect the key frames by extracting the pull-down pattern of the input frames. For example, if the current frame is repeated three times and the next frame is repeated two times, the detection units 2120-1, 2120-2, ..., 2120-n determine that the input content has been converted by a 3:2 pull-down method, and detect one frame among the three repeated frames and one frame among the two repeated frames as key frames.
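The cadence detection just described can be sketched as follows. This is an illustrative sketch only: real detectors compare frame contents to find repeats, whereas here the repeat pattern is assumed to be given as a list of frame identifiers.

```python
# Hedged sketch: detecting a 3:2 pull-down cadence from runs of repeated
# input frames, and keeping one frame per run as a key frame, in the
# spirit of the detectors described above. Names are illustrative.

from itertools import groupby

def extract_key_frames(frame_ids):
    """Collapse runs of repeated frame ids to one key frame per run."""
    return [fid for fid, _ in groupby(frame_ids)]

def looks_like_3_2_pulldown(frame_ids) -> bool:
    run_lengths = [len(list(g)) for _, g in groupby(frame_ids)]
    # A 3:2 cadence alternates runs of 3 and 2 repeated frames.
    return all(n in (2, 3) for n in run_lengths) and 3 in run_lengths

# 24 fps film carried at 60 fps: A A A B B C C C D D ...
stream = ["A", "A", "A", "B", "B", "C", "C", "C", "D", "D"]
print(extract_key_frames(stream))        # ['A', 'B', 'C', 'D']
print(looks_like_3_2_pulldown(stream))   # True
```
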
The
FIGS. 19 to 22 are diagrams for describing a method of integrating key frames of contents having different frame rates, according to an exemplary embodiment. In particular, in each of the figures, content A has a frame rate of 24 frames per second in a 3:2 pull-down manner, and content B has a frame rate of 30 frames per second in a 2:2 pull-down manner.
As shown in FIG. 19, when the key frames (Aa, Ab, Ac, ...) detected in the content A and the key frames (Ba, Bb, Bc, ...) detected in the content B are input, the
For example, as shown in FIG. 20, the
Here, the skipped key frames may be key frames that do not coincide with each other in time according to the pull-down method of each content. That is, as shown in FIG. 20, the first key frame Aa, the fourth key frame Ad, the fifth key frame Ae, and so on of the content A, which has a 3:2 pull-down method, respectively correspond in time to the second key frame Ba, the fifth key frame Be, the sixth key frame Bf, and so on of the content B, which has a 2:2 pull-down method. Accordingly, the
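One way to picture this skip-based matching is to pair key frames by their time stamps and drop the frames of the higher-rate content that have no temporal counterpart. The sketch below is an assumption about how such pairing could be done; the tolerance value (half of content A's frame period) is illustrative, not from the patent.

```python
# Illustrative sketch: matching key-frame counts by skipping frames of
# the higher-rate content that do not coincide in time with frames of
# the lower-rate content. The tolerance is an assumed parameter.

def match_key_frames(times_a, times_b, tol):
    """Pair each frame of A with the nearest frame of B in time;
    B frames farther than tol from every A frame end up skipped."""
    pairs = []
    for ta in times_a:
        tb = min(times_b, key=lambda t: abs(t - ta))
        if abs(tb - ta) <= tol:
            pairs.append((ta, tb))
    return pairs

# Content A at 24 fps, content B at 30 fps, over roughly 1/6 second.
a = [i / 24 for i in range(4)]   # 4 key frames
b = [i / 30 for i in range(5)]   # 5 key frames
pairs = match_key_frames(a, b, tol=1 / 48)
print(len(pairs))  # 4: every A frame is paired, one B frame is skipped
```
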
Meanwhile, the
Meanwhile, in the above-described embodiment, the
For example, in FIG. 20, the
Here, the
The
For example, the
On the other hand, the
On the other hand, the
Meanwhile, in addition to the above-described formats, an interlaced format, in which the key frame for one content and the key frame for another content are each half-subsampled in the vertical direction and the lines of each key frame are positioned alternately, line by line, can also be used to integrate the key frames. As such, the
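As a concrete illustration of one of the integration formats above, the side-by-side case can be sketched by halving each key frame horizontally and placing the halves next to each other. The list-of-rows frame representation is an assumption made for the example.

```python
# Sketch (assumed list-of-rows frames): integrating two key frames into
# one side-by-side frame by subsampling every other column of each.

def side_by_side(frame_a, frame_b):
    """Keep every other column of each frame; A occupies the left half
    of the combined frame and B the right half."""
    return [row_a[::2] + row_b[::2] for row_a, row_b in zip(frame_a, frame_b)]

# 2x4 toy frames of single-character "pixels".
A = [list("aaaa"), list("aaaa")]
B = [list("bbbb"), list("bbbb")]
print(side_by_side(A, B))  # [['a', 'a', 'b', 'b'], ['a', 'a', 'b', 'b']]
```

A top-bottom integration would be analogous, subsampling rows instead of columns and stacking A above B.
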
Meanwhile, when a key frame for 3D content is received, the
For example, when the format of the 3D image is one of the top-bottom method, side-by-side method, horizontal interleave method, vertical interleave method, checker board method, and sequential frame method, the
In addition, when the format of the 3D image is a general frame sequence method, the
The
The
In this case, the
In addition, the
The
For example, in the case of a shutter glass display device, the
When the frame rate displayable by the
Specifically, the eyeglass unit is provided with a left eye shutter glass and a right eye shutter glass. The left eye shutter glass and the right eye shutter glass are alternately opened and closed when watching 3D content; however, as described above, when the image frames of each content are alternately arranged and displayed, they are opened and closed collectively according to the output timing of the content synchronized with the eyeglass unit. Accordingly, the user can watch the content separately from other users.
As described above, a mode of alternately arranging image frames of each content and displaying the same may be referred to as a multi view mode (or a dual view mode). When the
On the other hand, when using a plurality of 3D content, the
In detail, when the
Meanwhile, although not shown in FIG. 18, the
In some cases, if the content includes additional information such as an electronic program guide (EPG) and subtitles, the demultiplexer may further separate the additional data from the content and transmit the additional data to the
Meanwhile, in the normal mode (particularly, displaying 3D content), the 3D content is stored through the activated receiver 1 (2120-1) among the plurality of receivers 2120-1, 2120-2, ..., 2120-n. When received, the
In detail, the
FIG. 23 is a block diagram illustrating a detailed configuration of a display apparatus. According to FIG. 23, the
The
The
The
The
For example, the
The transport packet includes time information for opening and closing the shutter glasses of the spectacle device in synchronization with the display timing of the content. Specifically, the time information includes a left shutter open offset for opening the left eye shutter glass of the spectacle device, a left shutter close offset for closing the left eye shutter glass, a right shutter open offset for opening the right eye shutter glass, and a right shutter close offset for closing the right eye shutter glass.
The offset time is delay information from the reference time point set for each content to the opening or closing of the shutter glass. That is, the spectacle apparatus opens or closes the left eye shutter glass and the right eye shutter glass when the offset time has elapsed from the reference time point.
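The reference-plus-offset timing can be sketched as below. The field names and millisecond values are illustrative assumptions; the patent only specifies that each offset is a delay from the reference time point.

```python
# Minimal sketch: deriving absolute shutter open/close times from the
# reference time point (frame sync) plus the four offsets the transport
# packet is described as carrying. All names/values are illustrative.

from dataclasses import dataclass

@dataclass
class ShutterOffsets:          # all values in milliseconds
    left_open: float
    left_close: float
    right_open: float
    right_close: float

def shutter_schedule(frame_sync_ms: float, off: ShutterOffsets):
    """Absolute open/close times: reference time plus each offset."""
    return {
        "left":  (frame_sync_ms + off.left_open,  frame_sync_ms + off.left_close),
        "right": (frame_sync_ms + off.right_open, frame_sync_ms + off.right_close),
    }

off = ShutterOffsets(left_open=0.5, left_close=8.0, right_open=8.5, right_close=16.0)
print(shutter_schedule(100.0, off)["left"])  # (100.5, 108.0)
```
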
For example, the reference time point may be a time point at which a vertical sync signal (ie, frame sync) is generated in an image frame, and information about the reference time point may be included in a transport packet. In addition, the transport packet may include information about a clock signal used in the
In addition, the transport packet may further include information on the period of the frame sync, and information indicating the fractional part when the period of the frame sync has a fractional value.
Meanwhile, the
In addition, when pairing with the plurality of eyeglasses is completed, the
Although the above-described embodiment has been described as the
In addition, in the above-described embodiment, the configuration for generating the synchronization signal and the configuration for transmitting the synchronization signal have been described as separate, but this is for convenience of description. That is, the
In addition, in the above-described embodiment, the
That is, the
In this case, the
FIG. 24 is a block diagram illustrating a configuration of an eyeglass device according to an embodiment of the present invention. Since the first and second eyeglasses 2200-1 and 2200-2 in FIGS. 16 and 17 have the same configuration, the configuration of any one
The
The synchronization signal is a signal for synchronizing the content view output time point of the display device with the spectacle device. As described above, the synchronization signal may be received in the form of a transport packet according to various communication standards. The transport packet may include time information for indicating the display timing of the content. Since the information included in the transport packet has been described above with reference to FIG. 23, duplicate description thereof will be omitted.
The
The
The first
Specifically, the first
Meanwhile, in the case of 3D content, the first
Meanwhile, in the above-described exemplary embodiment, the display apparatus generates the synchronization signals corresponding to the display timing of the contents, and transmits them to the
When the synchronization signal is received, the
In addition, in the above-described embodiment, the display apparatus and the eyeglass apparatus communicate according to various short-range wireless communication methods that form a communication channel for transmitting and receiving signals, but this is merely an example. That is, the display device may provide IR (Infra-Red) synchronization signals having different frequencies to the glasses devices, and each glasses device may of course receive the synchronization signal having its specific frequency and open or close its shutter glasses in accordance with the display timing of the corresponding content.
FIG. 25 is a flowchart illustrating a content providing method of a display apparatus according to another exemplary embodiment.
First, a plurality of contents are received (S2310). In detail, a plurality of contents having different frame rates may be received.
Thereafter, a key frame of each of the plurality of contents is detected (S2320).
For example, when an image frame constituting the content is input at a frame rate of 24 frames per second or 30 frames per second, each frame may be detected as a key frame.
On the other hand, when the image frames constituting the content are input at a frame rate of 60 frames per second, the key frames may be detected by extracting the pull-down pattern of the input frames. For example, if the current frame is repeated three times and the next frame is repeated two times, it is determined that the input content has been converted by a 3:2 pull-down method, and one frame among the three repeated frames and one frame among the two repeated frames are detected as key frames.
Then, the detected key frame is integrated (S2330). In detail, when the number of key frames of each of the plurality of contents is different, the number of key frames may be matched by performing a frame skip operation, and frames corresponding to each other of the contents may be integrated.
In this case, a key frame of each of the plurality of contents may be integrated into a top to bottom, side by side format, or checker board format. Since each of these embodiments has been described above, a redundant description thereof will be omitted.
Then, signal processing for the integrated key frames is performed (S2340). That is, motion judder removal may be performed by interpolating the integrated key frames. Specifically, frame rate conversion (FRC) is performed to convert the frame rate of the integrated key frames into a frame rate displayable on the display device. For example, in the case of the National Television System Committee (NTSC) scheme, the frame rate displayable on the
In this case, an interpolated frame is generated by estimating the motion of an object included in the current frame and the next frame among the integrated key frames, and the generated interpolation frame is inserted between the current frame and the next frame, whereby the frame rate of the integrated key frames can be converted into a frame rate displayable on the display device.
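The FRC timing arithmetic can be sketched by mapping each output instant onto the input key-frame timeline; frames that land between two key frames are the ones that would be interpolated, with the fractional phase shown. This is an assumed sketch of the timing only; motion estimation itself is omitted.

```python
# Sketch: frame rate conversion (FRC) from a 24 Hz key-frame stream to a
# 60 Hz display rate. For each output frame we compute the source key
# frame index and the fractional phase; a nonzero phase marks an output
# frame that would be generated by motion-compensated interpolation.

def frc_positions(in_rate: int, out_rate: int, n_out: int):
    """For each output frame, the source index and fractional phase."""
    positions = []
    for k in range(n_out):
        t = k * in_rate / out_rate      # position on the input timeline
        idx = int(t)
        positions.append((idx, round(t - idx, 2)))
    return positions

# First five 60 Hz output frames drawn from a 24 Hz input.
print(frc_positions(24, 60, 5))
# [(0, 0.0), (0, 0.4), (0, 0.8), (1, 0.2), (1, 0.6)]
```
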
Meanwhile, a frame in which the frame rate is converted is divided for each content, and each frame may be up or down scaled according to the screen size of the display apparatus using a scaler.
In operation S2350, the multi-view frame is displayed using the processed key frame. In detail, the multi-view frame may be displayed by multiplexing the image frames for each content to be arranged alternately at least one.
For example, in the case of a shutter glass type display apparatus, the image frame of the first content, the image frame of the second content, ..., and the image frame of the n-th content are alternately arranged, at least one at a time, and displayed. In this case, when the processed frame rate is 60 Hz, each content is displayed at n × 60 Hz, and the user can watch a desired content by wearing an eyeglass device (not shown) that operates in synchronization with the timing at which that content is displayed.
Meanwhile, when using a plurality of 3D contents, the left eye image and the right eye image included in each 3D content may be multiplexed in a predetermined arrangement, and alternately arranged with an image frame of another content.
In detail, when the display apparatus operates at 60 Hz, the left eye image and right eye image of the first content, the left eye image and right eye image of the second content, ..., and the left eye image and right eye image of the n-th content are sequentially arranged and displayed at a driving frequency of 2 × n × 60 Hz. The user recognizes the left eye image and right eye image of one 3D content through the glasses device.
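The driving-frequency arithmetic above is simple enough to write down directly; the helper below is an illustrative restatement, not part of the patent.

```python
# Quick sketch of the driving-frequency arithmetic: with n 3D contents,
# each contributing a left eye and a right eye image, a 60 Hz
# per-content rate implies a 2 x n x 60 Hz panel driving frequency.

def driving_frequency_hz(n_contents: int, per_content_hz: int = 60,
                         stereo: bool = True) -> int:
    views_per_content = 2 if stereo else 1
    return views_per_content * n_contents * per_content_hz

print(driving_frequency_hz(2))                 # 240: two 3D contents
print(driving_frequency_hz(2, stereo=False))   # 120: two 2D contents
```
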
In addition, the content providing method according to the present embodiment may further include generating a synchronization signal for synchronizing the glasses device corresponding to each content according to the display timing of that content, and transmitting the synchronization signal to the glasses device.
Specifically, in the multi-view mode, a synchronization signal for synchronizing the spectacle apparatus with the display timing of the image frames of one of the plurality of contents is generated, and in the normal mode, a synchronization signal for synchronizing the spectacle apparatus with the display timing of the left eye image frames and right eye image frames of the 3D content is generated.
In addition, communication with the eyeglasses device may be performed according to various wireless communication methods to transmit a corresponding synchronization signal. Since the case in which the synchronization signal is transmitted through the Bluetooth communication method has been described above, redundant description will be omitted.
[Example using multiple SoCs]
As described above, in order to process a plurality of contents, a large number of components should be provided in comparison with the processing of one content. In particular, in order to effectively provide a multi-view, a plurality of display processors may be provided. In this case, designing an SoC with a plurality of display processors requires a lot of effort and cost. In view of this, the following provides a display apparatus and method according to another embodiment of the present invention for displaying a plurality of content views using a plurality of SoCs.
FIG. 26 is a block diagram illustrating a configuration of a display apparatus according to another exemplary embodiment. The
According to FIG. 26, the
The
The
Data processed by the
The
As another example, in the case of a polarization type display device, the video output unit may display frames in which the image frames of each content are separated and arranged alternately line by line. In the polarization method, the spectacle apparatus for viewing 3D content and the spectacle apparatus for using the multi-view mode are different from each other. That is, the spectacle apparatus for viewing 3D content has different polarization directions for the left eye and the right eye, while the spectacle apparatus for using the multi-view mode has the same polarization direction for the left eye and the right eye.
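The line-by-line arrangement for the polarization method can be sketched as follows; the list-of-rows frame representation is an assumption for the example.

```python
# Sketch (assumed row-list frames): the polarization-type arrangement in
# which image frames of two contents are separated and arranged
# alternately, line by line, in a single output frame.

def interleave_lines(frame_a, frame_b):
    """Even lines from content A, odd lines from content B."""
    out = []
    for i, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        out.append(row_a if i % 2 == 0 else row_b)
    return out

A = ["A0", "A1", "A2", "A3"]
B = ["B0", "B1", "B2", "B3"]
print(interleave_lines(A, B))  # ['A0', 'B1', 'A2', 'B3']
```

With matching polarizing glasses, one viewer's eyewear passes only the even lines and the other's only the odd lines, so each viewer sees a single content.
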
The
The display apparatus may perform a multi-view mode in which a plurality of 2D contents or a plurality of 3D contents are combined.
FIG. 27 is a diagram illustrating an operation of a shutter glass type display apparatus that receives and displays a plurality of 3D contents.
Referring to FIG. 27, the
FIG. 28 is a diagram illustrating an operation of a shutter glass display apparatus that receives and displays a plurality of 2D contents. According to FIG. 28, image frames of different contents are displayed in the content views 1 and 2. The
The
FIG. 29 illustrates an example of a configuration of an
The
The
SoC 1 (3130) receives the 3D content through the HDMI port. The
As described above, the
FIG. 30 is a block diagram showing a detailed configuration of a display apparatus. According to FIG. 30, the display apparatus includes the
The
The interface unit 3180 communicates with the spectacle apparatuses. In detail, the interface unit 3180 may transmit an audio signal or a synchronization signal to the eyeglasses according to various wireless communication standards such as Bluetooth, Wi-Fi, Zigbee, and IEEE standards. Alternatively, the interface unit 3180 may be implemented as an IR lamp that emits an IR synchronization signal or an RF transmitter that outputs an RF synchronization signal. When the interface unit 3180 is implemented as an IR lamp or an RF transmitter, it may be provided on the exterior as shown in the
The
The
The
FIG. 31 is a block diagram illustrating a configuration of a display apparatus according to another exemplary embodiment. According to FIG. 31, the display apparatus includes
The
The
The
Meanwhile, the spectacles shown in FIGS. 27 and 28 may have a configuration as shown in FIG. 24. That is, the first or
FIG. 32 is a flowchart for explaining a display method according to another exemplary embodiment. Referring to FIG. 32, when a 3D multi-view mode for receiving and outputting a plurality of 3D contents is started (S3810), a plurality of 3D contents are received (S3820), and each 3D content is processed using a plurality of SoCs (S3830).
In the processing of each content using a plurality of SoCs, the data processed in each SoC may be muxed using a mux mounted on one of the SoCs, and the frame rate of the muxed data may be converted.
Alternatively, after processing each 3D content in a plurality of SoCs, the data may be muxed using muxes mounted on separate SoCs, and the frame rate of the muxed data may be converted.
Accordingly, a plurality of content views are displayed by combining image frames of each 3D content (S3840), and a synchronization signal is transmitted (S3850).
Although not shown in FIG. 32, the present invention may further include performing pairing with a plurality of glasses devices, and sequentially matching the plurality of glasses devices and the plurality of content views according to the pairing order.
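The sequential matching step just mentioned can be sketched as an assignment of glasses devices to content views in pairing order. The device names and the mapping interface are assumptions made for illustration.

```python
# Illustrative sketch: glasses devices are matched to content views
# sequentially, in the order in which pairing completed. Device names
# are hypothetical; views are numbered 1..n_views.

def match_views(paired_devices, n_views: int):
    """Map each device, in pairing order, to content view 1..n_views."""
    return {dev: (i % n_views) + 1 for i, dev in enumerate(paired_devices)}

print(match_views(["glasses-A", "glasses-B"], n_views=2))
# {'glasses-A': 1, 'glasses-B': 2}
```
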
As described above, according to various embodiments of the present disclosure, a plurality of contents may be received to effectively provide a multi-view.
The method according to the various embodiments described above may be programmed as an application and provided to a display device and an eyeglass device.
Specifically, a non-transitory computer readable medium storing a program for providing a multi-view display by sequentially performing: receiving a plurality of contents each comprising a left eye image and a right eye image; reducing the data sizes of the plurality of contents and storing them; converting the frame rates of the plurality of stored contents; and combining and displaying each content whose frame rate has been converted, may be embedded in the display device or connected to the display device for use. Such programs may also be downloaded from various sources, such as web servers or management servers.
Alternatively, a program for performing signal processing by sequentially performing: downscaling a plurality of 3D contents each including a left eye image and a right eye image; converting the frame rates of the 3D contents using the plurality of frame rate converters; constructing multi-content frame data using the plurality of 3D contents having the converted frame rates; and transmitting the 3D multi-content frame data to the display apparatus, may be provided to the display apparatus through a non-transitory readable medium or a network.
Alternatively, a program for performing: sequentially receiving a plurality of different contents having different frame rates; matching the frame rates of the plurality of contents; and displaying a multi-view frame using each content having the matched frame rate, may be provided to the display device via a non-transitory readable medium or network.
The non-transitory readable medium described above is not a medium that stores data for a short time, such as a register, a cache, or a memory, but a medium that stores data semi-permanently and can be read by a device. Specifically, the various applications or programs described above may be stored and provided in a non-transitory readable medium such as a CD, a DVD, a hard disk, a Blu-ray disc, a USB memory, a memory card, or a ROM.
While the present invention has been particularly shown and described with reference to exemplary embodiments thereof, it is to be understood that the invention is not limited to the disclosed exemplary embodiments, but, on the contrary, It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the present invention.
110, 120:
150: storage unit 180: image output unit
160, 170:
Claims (24)
A storage unit;
A plurality of scalers configured to reduce the data size of the plurality of contents and store the contents in the storage unit, and read out each of the contents stored in the storage unit according to an output timing;
A plurality of frame rate converters for converting frame rates of the read contents; And
And a video output unit configured to combine and display the respective contents output from the plurality of frame rate converters.
The plurality of contents are 3D contents each including a left eye image and a right eye image,
Each of the plurality of scalers,
And down-scaling the plurality of 3D contents and reducing a frame rate and storing the plurality of 3D contents in the storage unit.
The plurality of contents are 3D contents each including a left eye image and a right eye image,
Each of the plurality of scalers,
Downscaling the plurality of 3D contents and storing the 3D content in the storage;
And reading out each 3D content stored in the storage unit according to an output timing, and providing the read 3D content to the plurality of frame rate converters while lowering its frame rate.
The plurality of contents are 3D contents each including a left eye image and a right eye image,
At least one of the plurality of scalers,
If the 3D content is 3:2 pull-down film image content, down-scaling the film image content, extracting only key frames, and storing the key frames in the storage unit,
And the frame rate converter converts the frame rate of each 3D content into a multi-content display rate by interpolating frames based on the read key frames when the key frames are read from the storage unit.
The image output unit,
And multiplexing each content provided by the plurality of frame rate converters in order according to a predetermined arrangement order, and up-scaling the multiplexed data to fit the screen size.
Receiving a plurality of contents each including a left eye image and a right eye image;
Reducing and storing data sizes of the plurality of contents;
Converting frame rates of the plurality of stored contents, respectively; And
And combining and displaying each content of which the frame rate is converted.
The plurality of contents are 3D contents each including a left eye image and a right eye image,
Reducing and storing the data size of the plurality of contents,
Downscaling the plurality of 3D content;
Reducing the frame rate of each of the downscaled 3D content;
Storing each 3D content having the reduced frame rate;
Converting the frame rate,
And converting the frame rate of each 3D content into a multi content display rate.
The plurality of contents are 3D contents each including a left eye image and a right eye image,
Reducing and storing the data size of the plurality of contents,
If the 3D content is 3: 2 pull down film image content, downscaling the film image content;
And extracting and storing only a key frame of the downscaled film image content.
Converting the frame rate,
And converting the frame rate of each 3D content by interpolating the frame based on the stored key frame.
Wherein the displaying comprises:
Multiplexing each content so that the contents are sequentially arranged according to a predetermined arrangement order;
Upscaling the multiplexed data to fit the screen size;
And displaying the upscaled data.
A plurality of scalers configured to reduce a data size of a plurality of 3D contents each including a left eye image and a right eye image;
A storage unit which stores a plurality of 3D contents processed by the plurality of scalers;
And a plurality of frame rate converters for converting the frame rates of the plurality of 3D contents stored in the storage unit into a multi-content display rate.
The plurality of scalers,
Downscaling the plurality of 3D contents and storing the downscaled 3D contents in the storage unit;
And when the down-scaled 3D content is read from the storage unit, converting the read 3D content into a format that can be processed by the plurality of frame rate converters.
An image processor configured to construct multi-content frame data using a plurality of 3D contents having a frame rate converted by the plurality of frame rate converters; And
And an interface unit configured to transmit the multi-content frame data to a display device.
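A frame rate converter of the kind recited above must bring each content to the multi-content display rate; besides interpolation, the simplest conversion repeats or drops frames. Below is a minimal sketch under that assumption (the name `convert_rate` and the nearest-frame policy are illustrative, not from the claims).

```python
def convert_rate(frames, src_fps, dst_fps):
    """Nearest-frame rate conversion: repeats frames when dst_fps > src_fps,
    drops frames when dst_fps < src_fps."""
    n_out = round(len(frames) * dst_fps / src_fps)
    return [frames[min(int(i * src_fps / dst_fps), len(frames) - 1)]
            for i in range(n_out)]

# 24 fps content raised to a 60 Hz per-content slot rate: note the
# resulting 3-2-3-2 repeat pattern, i.e. a 3:2 pull-down cadence
print(convert_rate(list(range(4)), 24, 60))  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

Going the other way, `convert_rate([0, 1, 2, 3, 4, 5], 60, 30)` keeps every second frame.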
Downscaling a plurality of 3D contents each including a left eye image and a right eye image;
Converting the frame rate of the 3D content using a plurality of frame rate converters;
Constructing multi-content frame data using a plurality of 3D contents having the converted frame rate; And
And transmitting the multi-content frame data to a display device.
And converting the plurality of down-scaled 3D contents into a format that can be processed by the plurality of frame rate converters.
Matching frame rates of the plurality of contents;
And displaying a multi-view frame using each content having the matched frame rate.
Matching the frame rate,
Storing the plurality of contents;
Generating a plurality of image frames by processing the plurality of contents, respectively;
And interpolating the image frames of the content having a relatively low frame rate among the plurality of contents.
The interpolation process,
Comparing the reception times of the plurality of contents, and checking the storage ratio of a corresponding frame of the other contents at the time when one image frame of one of the plurality of contents has been completely stored;
And generating an interpolation frame by combining the corresponding frame and a next frame of the corresponding frame according to the identified storage ratio.
Generating the interpolation frame,
And comparing the corresponding frame with the next frame to estimate a motion of an object displayed in the frames, and applying the identified storage ratio to the estimated motion to generate the interpolation frame.
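The interpolation-frame generation described above can be sketched as: estimate the motion between the corresponding frame and its next frame, then place the content at the fraction of that motion given by the storage ratio. The 1-D pixel rows, the global-shift SAD search, and both function names below are simplifications assumed purely for illustration.

```python
def estimate_shift(cur, nxt, max_shift=3):
    """Global motion estimate: the integer shift minimising the mean
    absolute difference between the two frames (1-D rows for brevity)."""
    def cost(s):
        lo, hi = max(0, -s), len(cur) - max(0, s)
        sad = sum(abs(cur[i] - nxt[i + s]) for i in range(lo, hi))
        return (sad / (hi - lo), abs(s))      # prefer small shifts on ties
    return min(range(-max_shift, max_shift + 1), key=cost)

def interpolate_frame(cur, nxt, ratio):
    """Place the moving content at fraction `ratio` (the storage ratio)
    of the estimated motion between `cur` and `nxt`."""
    step = round(estimate_shift(cur, nxt) * ratio)
    # shift cur by `step` pixels, clamping at the row edges
    return [cur[min(max(i - step, 0), len(cur) - 1)] for i in range(len(cur))]

cur = [1, 1, 9, 1, 1, 1]                  # object at position 2
nxt = [1, 1, 1, 1, 9, 1]                  # object moved to position 4
print(interpolate_frame(cur, nxt, 0.5))   # [1, 1, 1, 9, 1, 1] -> halfway
```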
Matching the frame rate,
Detecting a key frame of each of the plurality of contents;
Integrating the detected key frames.
Integrating the key frame,
And, if the numbers of key frames of the plurality of contents differ, performing a frame repeat or skip operation to match the numbers of key frames, and integrating the key frames corresponding to each content.
Matching the frame rate,
And performing motion judder removal by performing interpolation on the integrated key frames.
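Matching differing key-frame counts by repeat or skip, followed by integration, can be sketched as below. Repeating toward the larger count is shown; skipping toward the smaller count would simply use `min` instead of `max`. The function name and data layout are illustrative assumptions.

```python
def match_key_frame_counts(per_content_keys):
    """If contents carry different numbers of key frames, repeat frames so
    every content contributes the same count, then integrate the key frames
    position by position."""
    target = max(len(k) for k in per_content_keys)
    aligned = [[keys[i * len(keys) // target] for i in range(target)]
               for keys in per_content_keys]
    return list(zip(*aligned))   # one integrated tuple per key-frame slot

a = ["A1", "A2", "A3", "A4"]     # four key frames
b = ["B1", "B2"]                 # two key frames: each is repeated once
print(match_key_frame_counts([a, b]))
# [('A1', 'B1'), ('A2', 'B1'), ('A3', 'B2'), ('A4', 'B2')]
```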
A plurality of system-on-chips (SoCs), each having a display processor for processing 3D content;
And an output unit for outputting a plurality of content views by combining image frames of each of the 3D contents processed by the plurality of SoCs.
One SoC of the plurality of SoCs includes a mux for muxing data processed by the display processor mounted on that SoC and data output from another SoC.
A SoC equipped with a mux for muxing data output from the plurality of SoCs; And
And a frame rate converter for converting the frame rate of the data muxed in the mux.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/614,277 US20130169755A1 (en) | 2011-12-28 | 2012-09-13 | Signal processing device for processing plurality of 3d content, display device for displaying the content, and methods thereof |
EP12184610.9A EP2611161B1 (en) | 2011-12-28 | 2012-09-17 | Signal processing device for processing plurality of 3D content, display device for displaying the content, and methods thereof |
CN2012105899170A CN103188509A (en) | 2011-12-28 | 2012-12-28 | Signal processing device for processing a plurality of 3d content, display device, and methods thereof |
Applications Claiming Priority (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR20110144365 | 2011-12-28 | ||
KR1020110145280 | 2011-12-28 | ||
KR20110145280 | 2011-12-28 | ||
KR1020110144365 | 2011-12-28 | ||
KR1020110147502 | 2011-12-30 | ||
KR20110147291 | 2011-12-30 | ||
KR20110147502 | 2011-12-30 | ||
KR1020110147291 | 2011-12-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20130076674A true KR20130076674A (en) | 2013-07-08 |
Family
ID=48990210
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020120054864A KR20130076674A (en) | 2011-12-28 | 2012-05-23 | Signal processing device for processing a plurality of 3d contents, display device for displaying them and methods thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20130076674A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150058809A (en) * | 2013-11-21 | 2015-05-29 | 삼성전자주식회사 | Apparatus and method for reproducing multi image |
WO2016036073A1 (en) * | 2014-09-02 | 2016-03-10 | 삼성전자 주식회사 | Display device, system and controlling method therefor |
US10140685B2 (en) | 2014-09-02 | 2018-11-27 | Samsung Electronics Co., Ltd. | Display device, system and controlling method therefor |
US10878532B2 (en) | 2014-09-02 | 2020-12-29 | Samsung Electronics Co., Ltd. | Display device, system and controlling method therefor |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2611161B1 (en) | Signal processing device for processing plurality of 3D content, display device for displaying the content, and methods thereof | |
EP2375767A1 (en) | Stereoscopic video player, stereoscopic video playback system, stereoscopic video playback method, and semiconductor device for stereoscopic video playback | |
US8760468B2 (en) | Image processing apparatus and image processing method | |
EP2320669B1 (en) | Stereoscopic image reproduction method in case of pause mode and stereoscopic image reproduction apparatus using same | |
US20120113113A1 (en) | Method of processing data for 3d images and audio/video system | |
US8994787B2 (en) | Video signal processing device and video signal processing method | |
US20110063422A1 (en) | Video processing system and video processing method | |
US9438895B2 (en) | Receiving apparatus, transmitting apparatus, communication system, control method of the receiving apparatus and program | |
US20150035958A1 (en) | Apparatus and method for concurrently displaying multiple views | |
US20130141534A1 (en) | Image processing device and method | |
KR20130028098A (en) | Method and apparatus for displaying images | |
JP2009296144A (en) | Digital video data transmission apparatus, digital video data reception apparatus, digital video data transport system, digital video data transmission method, digital video data reception method, and digital video data transport method | |
US20150289015A1 (en) | Broadcast receiving apparatus, upgrade device for upgrading the apparatus, broadcast signal processing system, and methods thereof | |
US20140015941A1 (en) | Image display apparatus, method for displaying image and glasses apparatus | |
JP2013090020A (en) | Image output device and image output method | |
JP5412404B2 (en) | Information integration device, information display device, information recording device | |
KR101885215B1 (en) | Display apparatus and display method using the same | |
US20130169698A1 (en) | Backlight providing apparatus, display apparatus and controlling method thereof | |
KR20130076674A (en) | Signal processing device for processing a plurality of 3d contents, display device for displaying them and methods thereof | |
US20130266287A1 (en) | Reproduction device and reproduction method | |
CN103188513A (en) | Device and method for displaying video | |
US20110310222A1 (en) | Image distributing apparatus, display apparatus, and image distributing method thereof | |
CN204697147U (en) | For the device of display video | |
KR20120062428A (en) | Image display apparatus, and method for operating the same | |
JP2013090019A (en) | Image output device and image output method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WITN | Withdrawal due to no request for examination |